8 votes

Why aren't more sports in 4K?

18 comments

  1. [18]
    Grzmot
    Link

    After having experienced Bluray-Quality 1080p on a 60" TV, I'm convinced that the chase for higher resolutions is completely unnecessary save for enormous televisions. Bitrate is king, and good 1080p is good enough.

    6 votes
    1. [12]
      NoblePath
      Link Parent

      Mostly true for tv. Getting good color and contrast is also important. That said, uncompressed 4k is noticeably better.

      2 votes
      1. [11]
        Grzmot
        Link Parent

        And where exactly can you get uncompressed 4k? Not even Bluray is uncompressed for 1080p; I doubt it is for 4k. I don't think I've ever seen any sort of video uncompressed.

        Uncompressed 1080p is already ~1.4 Gbit/s; the bitrate for uncompressed 4k has to be something like 4x that.
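
        A rough back-of-the-envelope check (a sketch only, assuming 8-bit 4:2:0 at 60 fps; other chroma subsampling or frame rates shift the numbers but not the 4x ratio):

            # Uncompressed bitrate: width * height * bits-per-pixel * frames-per-second
            def bitrate_gbps(width, height, bits_per_px=12, fps=60):  # 12 bits/px ~ 8-bit 4:2:0
                return width * height * bits_per_px * fps / 1e9

            print(bitrate_gbps(1920, 1080))  # ~1.49 Gbit/s for 1080p
            print(bitrate_gbps(3840, 2160))  # ~5.97 Gbit/s for 4k, exactly 4x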

        2 votes
        1. [8]
          FlippantGod
          Link Parent

          Even compressed, 4K Bluray with HDR 10+ or DV can look amazing. I figure 4k bluray release quality is mostly derived from the following:

          Compression rate: still more detail than 1080p.

          Resolution: Mostly impacts headshot closeups, large vistas, and other cinematic stills. Heavily dependent on the master (both resolution and quality), so sometimes this is not nearly as nice as it should be. 2k upscaled is still usually better than bluray but with a couple truly atrocious frames per film. Sometimes there is very strong grain, which is fine, and sometimes there is horrible noise reduction, which is possibly worse than the bluray release.

          Color: Expanded colorspace, even if only a few frames use it, seems to look better across the entire film. It could be more accurate panels, colorists being more willing to use the entire standard space, or knock-on effects from brightness; probably a bit of everything.

          HDR: Regular HDR is kinda meh, but it might still be improving my perception of colors. HDR 10+ and DV blow SDR out of the water.

          2 votes
          1. [7]
            Greg
            Link Parent

            Is it the 4K part that helps here, or more just that the UHD Bluray format allows for the best versions of all the other points? My understanding was that the eye can’t really resolve the difference between 1080 and 4K at normal TV sizes and viewing distances, although it’s worth it up close on a monitor.

            I guess a 4K source for a TV could potentially act as a form of super sampling for better macro level quality even if we can’t perceive the micro level differences, although I’d be interested to know if it would be distinguishable with all else equal (bitrate, encoding, colour depth, etc.). Seems a good way to get the best quality copy when we know all else is almost definitely not equal, though!

            1 vote
            1. FlippantGod
              Link Parent

              The resolution bump definitely helps. It does improve edges and aliasing. It can resolve high frequency details in the source that previously would not be visible.

              Also, compression artifacts to my eyes seem to be relatively less visible. I need to zoom good 4k content to clearly compare two different compression levels. I assume that since most artifacts are visible mostly at edges, and the finer edges take up relatively less area on the screen, the compression artifacts end up more finely grained? This one is really specific to compressors and sources, but it reflects my overall experience with 4k bluray content.

              2 votes
            2. babypuncher
              Link Parent

              My understanding was that the eye can’t really resolve the difference between 1080 and 4K at normal TV sizes and viewing distances, although it’s worth it up close on a monitor.

              It depends on the size of the TV and viewing conditions. I can certainly tell a difference on my 65" TV from my couch, but it's not nearly the kind of jarring difference that going from 480i to 1080p was 15 years ago. I have no problems buying and watching 1080p movies, but I generally refuse to buy DVDs anymore unless the content on them was originally produced and mastered in standard definition.

              In terms of pixel counts, 4k is a nice "future proof" option. It is already past a point of diminishing returns for any realistic home theater setup. Contrary to what some purists may try to claim, the resolving power of your typical 35mm cinema film stock is roughly equivalent to a "3k" digital resolution.

              HDR is another discussion entirely, and I think it's the bigger reason why buying a 4k TV is a worthwhile upgrade. But it is dependent on the type of display you use, and I personally only find it satisfactory on OLED panels where you don't have to contend with local dimming shenanigans.

              2 votes
            3. [4]
              DrStone
              Link Parent

              Rtings has a nice chart showing when resolution matters based on TV size and viewing distance (full article with a bunch more charts and commentary).

              Anecdotally, I can definitely notice a difference between 1080p and 4K on my 65” with the couch 6-7ft away, particularly with live action content, which lines up with the Rtings findings.

              1 vote
              1. [3]
                Greg
                Link Parent

                Thanks! I actually glanced at that before posting, just to make sure I wasn't misremembering anything major, but I've learned a couple of interesting additional points from this thread:

                • A decent number of people have TVs larger and closer than my totally unscientific guesstimate from my own living room

                • The "worth it" bands are based on 60 pixels per degree of visual field (one pixel per arc minute), and a lot of articles (including rtings' own) suggest that's the angular resolution of our eyes, but the study that @Gaywallet linked in a comment below indicates that the actual limit of what we can perceive is about double that

                So it seems as though 4K is within the physical limits of what we can distinguish a lot more of the time than I'd realised, although the "worth it" question is still inherently subjective. Also seems like most people pretty much agree that a 4K source is a good way to raise the chances of also getting a copy with better bitrate, encoding, colour depth, etc. so the overall product normally wins out regardless!
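
                As a rough sanity check on those bands (a sketch only; the 65" size and ~6.5 ft viewing distance are illustrative assumptions, not anyone's measured setup):

                    import math

                    # Linear pixels per degree of visual field for a 16:9 panel
                    def pixels_per_degree(diag_in, horiz_px, dist_in):
                        width_in = diag_in * 16 / math.hypot(16, 9)   # panel width from the diagonal
                        ppi = horiz_px / width_in                     # linear pixel density
                        return ppi * dist_in * math.pi / 180          # small-angle approximation

                    print(pixels_per_degree(65, 3840, 78))  # ~92 ppd: above the 60 ppd band, below ~120
                    print(pixels_per_degree(65, 1920, 78))  # ~46 ppd for 1080p at the same distance

                So a 4K panel at that kind of size and distance sits between the two thresholds, which lines up with people noticing a difference without it being night and day.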

                3 votes
                1. [2]
                  Gaywallet
                  Link Parent

                  Well hold on now, you're taking the conclusions of the paper a step too far. The paper was specific to what level of resolution was necessary to be able to clearly read cartographic characters. This wasn't a paper on maximal visual acuity, and it didn't take into consideration anything but whether cartographic characters, which were designed to be legible, actually were legible.

                  Human perception and vision are incredibly complicated. The concept of an arc minute being important comes from the fundamental architecture of the eye, and isn't representative of how we actually perceive. In fact, due to some really fancy tricks that our brain does to make sense of the world, our actual visual acuity is much higher than would be indicated by just the architecture as we currently understand it. I don't have time to dig up a ton of papers on this, but I found a short article touching on just a few ways (with examples) in which we shouldn't jump to conclusions about visual perception from just a few measurements. We need to be thorough and test specific kinds of visual perception against displays to establish guidelines for specific use cases.

                  And just for fun, here's also an old paper about how our notions of what can be perceived, based on the structure of the eye, were provably wrong: images designed to be illusions still caused the illusions outside the range of perception indicated by the structure of cells within the eye.

                  2 votes
                  1. Greg
                    Link Parent

                    Huh, that's genuinely fascinating - my slightly lazy "it's all just physics, right?" bias stands firmly corrected!

                    2 votes
        2. babypuncher
          Link Parent

          While it's technically incorrect, I think the context makes it clear that we are talking about "low" compression rather than "no" compression. Truly uncompressed video is exceptionally rare, even in professional video production.

          It's not uncommon for people to refer to Blu-Ray rips that simply remux the original audio and video streams into an mkv file as "uncompressed", simply because they haven't been re-encoded.

          1 vote
        3. NoblePath
          Link Parent

          I clearly mis-spoke, had not yet had enough tea. Others down the chain have said what I was trying to say, and have said it far better than I ever could even with all my tea in :)

    2. [5]
      Greg
      Link Parent

      Yeah, it's rare that you're sitting close enough to a TV for pixel density to be a limitation. One of my frustrations with the fragmentation in the streaming market is that there isn't space for a video equivalent of Tidal, with higher quality encoding as a specific feature; I couldn't personally tell you if I'm listening to Spotify or a CD by hearing it, but I can sure as hell tell Netflix from Bluray.

      I guess it makes sense that the hardware manufacturers want a number to use as a selling point, whether it's meaningful to the consumer or not, and then the content companies have to follow suit because the TV companies have already embedded "4K == high quality" in the general consciousness, and it's easier to go along with that than spend a bunch more money educating consumers about bitrate.

      One real upside is the halo effect on the availability of high DPI computer monitors and the connection standards to drive them, though! If 8K takes off then that'll be 200+ DPI on anything that could feasibly fit on a desk.

      2 votes
      1. [2]
        Gaywallet
        Link Parent

        it's rare that you're sitting close enough to a TV for pixel density to be a limitation

        Really all depends on what you mean by "limitation". Most TVs, including 4k resolutions, still have quite a bit to gain from increased pixel density. Tests for the ability to discriminate between cartographic characters put effective PPI requirements at around 550ppi for a screen approximately 30cm away from the face. An example of one such study can be found at this link.
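
        Converting that figure to an angular density (a quick sketch taking the 550ppi-at-30cm numbers at face value):

            import math

            # Linear pixels per arc minute implied by a pixel density at a viewing distance
            def px_per_arcmin(ppi, dist_cm):
                arcmin_span_in = (dist_cm / 2.54) * math.tan(math.radians(1 / 60))  # width of one arc minute
                return ppi * arcmin_span_in

            print(px_per_arcmin(550, 30))  # ~1.9 px per arc minute, i.e. ~113 pixels per degree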

        Now obviously there are considerations to make when examining a television through the lens of motion picture. Assuming legible text is not important perhaps discounts the experiences of those who need the assistance of text, or the cases where text is presented in media as part of the images being displayed. However, even if we ignore cartographic symbols, they happen to give us an idea of how well distinct shapes can be represented. A perfect representation of a scene may require a high PPI in order to accurately capture features of an environment, such as the cracks on a wall in the background.

        The cracks on a wall in the background may not be particularly important for understanding the content of a scene, however. If the scene is a conversation at a diner, visual quality may be of significantly less importance than that of the audio. However, important features to the story may be hinted at with good cinematography (think small details set up in a film and then highlighted later in a flashback, such as in murder mysteries).

        There is unfortunately no way to objectively measure the information provided in a scene and how much will be lost when it is compressed via absolute size (1080 vs 4k) or via bit depth. This is too dependent upon the content, the cinematography, and ultimately the viewer's perceptive ability. Just as visual data is of close to no importance for someone who is blind, some viewers may choose to focus on certain parts of cinematography in different ways based on their own predispositions and physical and cognitive abilities.

        2 votes
        1. Greg
          Link Parent

          This is a great and thoughtful take - it's all too easy to forget sometimes how the technical and artistic requirements and limitations play off each other! The one thing I will say that maybe I didn't make clear is that I did specifically mean TV as in "display on the wall that you watch from the sofa across the room" rather than "general display that you watch video content on".

          I'm singing the praises of a 220dpi monitor in another reply because, similar to the study, I'm sitting ~50cm from it, and even there I can believe that there's a bit more headroom if technology and cost allowed, but from a couple of metres away the angular size of those pixels drops pretty rapidly towards the limits of what we can resolve.

          I will say it's particularly interesting that the study suggests an upper limit closer to 2 linear pixels per arc minute than the general rule of thumb most literature seems to use, which is about half that. That'd imply that moving from 1080 to 4K actually is within the plausible limits of what we can distinguish at a 30-40º field of view, whereas the more widespread 1 arc minute assumption puts the cap at 1080 (i.e. half that in both dimensions).
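
          To put rough numbers on that (a sketch taking 30-40º as the horizontal field of view the screen fills):

              # Linear pixels per arc minute when a screen of a given width fills a given field of view
              def px_per_arcmin(horiz_px, fov_deg):
                  return horiz_px / (fov_deg * 60)

              for fov in (30, 40):
                  print(fov, px_per_arcmin(1920, fov), px_per_arcmin(3840, fov))
              # 30º: 1080p ~1.07, 4K ~2.13;  40º: 1080p ~0.80, 4K ~1.60

          So 4K lands at roughly 1.6-2.1 pixels per arc minute there, above the old 1-per-arc-minute rule of thumb but around the ~2 limit the study suggests, while 1080p sits at or below 1.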

          1 vote
      2. [2]
        NoblePath
        Link Parent

        Have you looked at the new iMac displays? They are the nicest screen I have beheld. Some of the mbp/ipad pros might be brighter and better color and higher dpi, but at that size, and that density, I'm just flabbergasted.

        1 vote
        1. Greg
          Link Parent

          I haven't seen the latest gen iMacs up close actually, but I have heard many good things!

          I'm typing this on an LG UltraFine 5K, though, which was their Apple collab from seven(!) years ago and remains pretty much the best general purpose display I have ever used, anywhere. Same panel as the older iMac 27" I think? It's exactly 4x the resolution of 1440p, on a 27" panel, so everything naturally scales to a sensible physical size but with 4:1 pixel ratio - rather than the somewhat too big/way too small/non-integer blur compromise you need to make with larger 4K panels. It's an absolute thing of beauty.
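
          The arithmetic behind that, assuming a 16:9 27" panel:

              import math

              # Pixels per inch for a panel given its diagonal size and resolution
              def ppi(diag_in, horiz_px, vert_px):
                  width_in = diag_in * horiz_px / math.hypot(horiz_px, vert_px)
                  return horiz_px / width_in

              print(ppi(27, 5120, 2880))            # ~218 ppi; exactly 2x 1440p in each dimension
              print(ppi(27, 3840, 2160))            # ~163 ppi; needs awkward ~1.5x fractional scaling
              print((5120 * 2880) / (2560 * 1440))  # 4.0, the 4:1 pixel ratio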

          My hopes are high for a bit of competition in the space before this one eventually gives up on me, though! I got an extremely good deal on it and still paid the equivalent of about $800, which is prohibitive for pretty much anyone who isn't working in front of the thing all day and even a lot of people who are; it uses some weird dual link extension to an older DisplayPort standard to actually push that many pixels; there's no HDR or local dimming of any kind; refresh rate is locked to 60Hz; and it's taken until this year's CES to see any viable alternatives at all.

          1 vote