22 votes

Seeing very smooth movement on classic shows on a big screen TV

I recently visited my sister, who has a big screen TV that shows very smooth movement, including on Star Trek: The Original Series. Not that I don't have smooth movement on my home TV set, but hers was ridiculous. It almost looked like raw footage before it was refined down. It was like there were frames between the frames, if that makes any sense.

What am I seeing here?

39 comments

  1. [16]
    Akir
    Link

    Motion Interpolation. AKA soap opera mode.

    It's a feature that's been on pretty much every new TV for the past 5-10 years or so.

    I hate it so much. It's an ugly effect that tends to make things look worse, and at times it can be really distracting - especially when it hits parts it can't figure out how to interpolate and the picture starts to look like it's stuttering. My old Samsung smart TV had it enabled in the Amazon Prime Video app and it could not be disabled, and it drove me nuts.

    58 votes
    1. [6]
      BashCrandiboot
      Link Parent

      It's the first thing I turn off when I get a new screen. I have no idea how it came about or gained traction to begin with. My personal theory is that it makes the demo animations more "ooooh ahhhh" when the screen is in store. Then people buying their new 4K+ Ultra ZLED ISO*BURST Ray-Enabled Blast-Off Genius TV can really see how much better their new TV is than their old one without understanding the tech.

      Honestly, a high framerate always looks "nicer" than the jump from 2K to 4K; the layman probably can't even tell the resolution difference. So they throw motion smoothing bullshit on there to simulate high frame rates and pass that off as a "higher quality image." Worst thing ever.

      23 votes
      1. NoobFace
        (edited )
        Link Parent

        There are lots of reasons, but for the OG LCDs it's mostly motion blur from persistence issues. There's a decent Wikipedia article on the subject: https://en.m.wikipedia.org/wiki/Motion_interpolation

        There's also 3:2 pull-down related blurring that makes things worse, as most studios doing analog content back in the day were using 24fps cameras, leading to some funky blurriness as they needed to convert to NTSC's 29.97fps for broadcast. https://en.m.wikipedia.org/wiki/Three-two_pull_down

        Content creation has transitioned to digital 30/60 fps workflows, and high refresh rate displays have improved persistence issues to the point that interpolation is a distraction.
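
        To make the cadence concrete, here's a minimal Python sketch of the pulldown (the helper and frame letters are purely illustrative):

        ```python
        # Four 24fps film frames (A-D) become ten interlaced fields,
        # i.e. five ~30fps video frames, via an alternating 2:3 cadence.
        def three_two_pulldown(film_frames):
            fields = []
            for i, frame in enumerate(film_frames):
                # Frames alternately contribute 2 and 3 fields: 2+3+2+3 = 10.
                fields.extend([frame] * (2 if i % 2 == 0 else 3))
            # Pair consecutive fields into interlaced video frames.
            return [(fields[j], fields[j + 1]) for j in range(0, len(fields) - 1, 2)]

        print(three_two_pulldown(list("ABCD")))
        # [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
        # The mixed B/C and C/D frames are where the funky blurriness lives.
        ```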

        6 votes
      2. [4]
        Akir
        Link Parent

        AFAIK most every TV has a demo mode you can enable specifically to show off those features, so that explanation doesn't make sense.

        Probably a more realistic explanation is that it's enabled by default because otherwise it would be difficult for consumers to find, and that could lead some people to not purchase the TV, or to return it, because they think it doesn't have that feature.

        4 votes
        1. [3]
          babypuncher
          Link Parent

          But do most people actually want the feature?

          I feel like everyone who knows what it is doesn't want it at all, and everyone else thinks that is just what TV looks like nowadays.

          I turned it off on my parents' TV last time I visited and they were shocked at how much better everything looked without it.

          9 votes
          1. Akir
            Link Parent

            Probably not, but I would imagine that the people who made that decision either didn't bother to do a study to find out or did and determined that they would make more money with it on by default than they would otherwise.

            3 votes
          2. digitalphil
            Link Parent

            The ONLY time I have ever used the feature is when you can scale it. I think I have it set at 1 out of 5, the lowest. It does improve the pan stutter of older movies a bit without being a Soap Opera.

            3 votes
    2. [4]
      babypuncher
      (edited )
      Link Parent
      "feature" I classify it as a bug, deliberately introduced to make me angry. Fortunately, fixing most of the shitty default settings TVs ship with is at least getting easier, thanks to the adoption...

      "feature"

      I classify it as a bug, deliberately introduced to make me angry.

      Fortunately, fixing most of the shitty default settings TVs ship with is at least getting easier, thanks to the adoption of "Filmmaker Modes", at the behest of some prominent directors.

      8 votes
      1. [3]
        Akir
        Link Parent

        Here's the thing, though: all commercial video is mastered digitally and designed to be displayed on digital displays. Unless you are putting some home videos on the display, a set of experts have already gone through the footage and specifically designed it to look the right way. Even video games get this kind of treatment, for the most part. This perfection is exactly what the appeal of digital was in the first place!

        That means that literally any processing step is intentionally degrading the video quality. The only exceptions are adjustments like brightness, which need to be adapted to the environment the display is set up in. There shouldn't be a "filmmaker mode". It should just display the video I want the way it's meant to be displayed.

        That being said, I don't fault anyone who prefers video processing effects, because quality is subjective. If someone likes soap opera mode, I won't give them a hard time about it (though I might just silently judge them :P ). Processing can help make up for technical limitations, for instance. DVDs were all encoded with interlaced video, so they need to be deinterlaced for modern progressive displays or you'll get weird-looking "screen door" effects whenever there is motion on the screen. TVs also need high quality scaling so you can watch media full-screen when it isn't broadcast in the panel's native resolution. I actually spent a little bit more on a TV specifically because it had a high quality video processor built into it.
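
        For the curious, the crudest deinterlacing approach is a "bob" filter: show one field at a time and synthesize the missing lines from their neighbours. A minimal sketch, assuming numpy image arrays (real deinterlacers are far smarter than this):

        ```python
        import numpy as np

        def bob_deinterlace(frame: np.ndarray, top_field: bool = True) -> np.ndarray:
            """Build a progressive frame from a single field of an interlaced frame."""
            out = frame.astype(np.float32)
            start = 1 if top_field else 0  # rows belonging to the other field
            for y in range(start, frame.shape[0], 2):
                # Average the kept neighbour rows; clamp at the image edges.
                above = out[y - 1] if y > 0 else out[y + 1]
                below = out[y + 1] if y + 1 < frame.shape[0] else out[y - 1]
                out[y] = (above + below) / 2
            return out.astype(frame.dtype)
        ```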

        2 votes
        1. [2]
          Qgel
          Link Parent

          I fail to see how Motion Interpolation is something that could have been done during mastering, such that applying it is a 'degradation' of the original. Unless you know the framerate of the display your video will be shown on (and even then you'd be wasting massive amounts of bandwidth when streaming, because you're sending 120fps), there is no way to get the same effect baked into your video.

          1 vote
          1. Akir
            Link Parent

            I’m saying that motion interpolation and other video processing techniques are by definition an adulteration of the way the film is meant to be viewed.

            Framerate is both a technical and creative choice. If a film was meant to be seen at high frame rates, it would have been shot with cameras capable of capturing at that frame rate rather than using motion interpolation because the quality would be noticeably better. And there are indeed films that do exactly that, like Gemini Man or The Hobbit trilogy.

    3. ewintr
      Link Parent

      It's an ugly effect that makes things tend to look worse, and at times it can be really distracting - especially when it gets parts that it can't figure out how to interpolate and the picture starts to look like it's stuttering

      My sister had a television that had that problem with faces. Sometimes when the camera was panning or zooming, a face seemed to move slightly different than the person it was on. Very distracting.

      There is also a big difference between televisions when watching sports, when the camera follows a ball that is shot or thrown in a long arc from one end of the field to the other. Sometimes it is just a ball, sometimes it stutters, and sometimes it is not really a round ball anymore but has several half-moon discs before and after it.

      Once you see it you cannot unsee it. But most people are not able to see it, in my experience.

      4 votes
    4. [3]
      Chobbes
      Link Parent

      I don't understand why it's a default setting... I guess maybe it's less noticeable than judder in certain situations (though I'd expect this not to really be an issue with most content on a 120Hz display?), and the interpolation generally looks good enough and the extra smoothness looks better for most people in most situations? In my experience it would generate complete garbage frames far too often and it was really distracting.

      3 votes
      1. [2]
        Greg
        Link Parent

        Given that overscan is still enabled by default for some TVs, and that’s unequivocally worse in every way unless you’re somehow watching analogue broadcasts 15 years after most of the world turned them off, I have a fair measure of cynicism about the decision making behind these settings in general.
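
        For anyone unfamiliar, this is roughly all overscan does to a digital source (the 5% crop is a typical figure, not a spec):

        ```python
        import numpy as np

        def overscan(frame: np.ndarray, fraction: float = 0.05) -> np.ndarray:
            """Crop `fraction` of the image (split across the edges) and rescale it back."""
            h, w = frame.shape[:2]
            dy, dx = int(h * fraction / 2), int(w * fraction / 2)
            cropped = frame[dy:h - dy, dx:w - dx]
            # Nearest-neighbour rescale back to (h, w), discarding detail for nothing.
            ys = np.arange(h) * cropped.shape[0] // h
            xs = np.arange(w) * cropped.shape[1] // w
            return cropped[ys][:, xs]
        ```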

        6 votes
        1. Chobbes
          Link Parent

          GOD overscan also makes me so angry.

          3 votes
  2. [6]
    blitz
    Link

    Other people have already answered what this is, I'll give an example of a time when I think it's actually helpful (oh god don't ban me please).

    I have an OLED TV, whose pixel response times are incredibly fast. I'm pretty sure they're faster than any other display technology by an order of magnitude.

    LotR: Fellowship is actually kind of hard to watch without some kind of motion smoothing. There are a ton of panning landscape shots that, at 24-30 FPS on an OLED TV, really just look awful. Am I doing something wrong?

    16 votes
    1. [2]
      V17
      Link Parent

      In my opinion, if the algorithms are good, it can look great for just about anything, and people just dislike it because we're so used to 24-30 fps. I have no idea how good the algorithms on TVs are, but I used to watch shows using Smooth Video Project, which works okay when set up differently for different types of shows, and I also tried the trial version of DmitriRender, which is awesome and has the least artifacts, but I didn't feel like paying for something like that yet. I watched most of remastered Star Trek using SVP, though.

      5 votes
      1. [2]
        Comment deleted by author
        Link Parent
        1. V17
          Link Parent

          Oh. Yeah, I probably wouldn't want that either then.

          1 vote
    2. lux
      Link Parent

      You have my Axe. Without a bit of smoothing, movies in general look awful in such scenes on my laser projector as well. On a 3.50m wide image you generally see imperfections more often.

      Sure, the higher the interpolation, the more artifacts you get. I find Low to Mid to be a good compromise though. Otherwise it would be unbearable to watch; I don't like movies to be laggy. Another example would be the skyscraper drone panning scene in The Dark Knight. Without smoothing there, the judder would have ruined the colossal effect imo.

      4 votes
    3. Pavouk106
      Link Parent

      I ripped LotR: Fellowship recently from the original Blu-ray (to add to my Jellyfin library) and I was wondering if I had done something wrong, because of these panning shots. They look really bad! Thanks for confirming that this is how the movie actually is.

      I have an old Panasonic plasma TV; this might add to the problem, or rather magnify the effect.

      1 vote
    4. ScaryLarry
      Link Parent

      I used to hate motion smoothing because of the soap opera effect, but since getting my OLED it was way too choppy without turning the smoothing on. The faster OLED refresh makes sense, I hadn’t thought of that. I figured I had just gotten used to the smoothing effect.

  3. [2]
    Greg
    Link

    It was like there were frames between the frames, if that makes any sense.

    Not only does that make sense, it's exactly what's actually happening. The way motion smoothing/interpolation works is pretty much just the TV's image processor making a best guess at what would sit between two frames and then filling in the gap with an extra one.
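
    For the curious, real TVs do block-based motion estimation to make that guess, but even the crudest possible stand-in, a plain blend of neighbouring frames (sketched below, purely for illustration), shows the "extra frame in the gap" idea:

    ```python
    import numpy as np

    def midpoint(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
        """Guess the frame halfway between two frames by averaging them."""
        mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
        return mid.astype(frame_a.dtype)

    def double_framerate(frames):
        """24fps in, 48fps out: each source frame, then a guessed in-between.

        Assumes `frames` is a list of same-shaped numpy images.
        """
        for a, b in zip(frames, frames[1:]):
            yield a
            yield midpoint(a, b)
        yield frames[-1]
    ```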

    8 votes
  4. Slystuff
    Link

    From the sounds of it, they likely have motion smoothing enabled (it seems to be on by default on a lot of newer TVs). Variety has listed how to turn it off on various devices.

    7 votes
  5. [2]
    lou
    (edited )
    Link

    Many TVs have a feature that does that. It's using prediction to increase smoothness by adding frames to the video. To make sub-60fps content (which includes a lot of old stuff, but also most movies and scripted shows) look right, you have to disable it. It will be called "true motion", or something similar.

    There is also something called "Filmmaker Mode", which is a mode in newer TVs that basically tries to make the settings adequate for most films. It disables true motion as well.

    True motion can be good for sports, documentaries, and the news. It's usually awful for fiction.

    5 votes
    1. wervenyt
      Link Parent

      I'm surprised nobody else even paid lip service to the fact that it is kind of nice for those genres you laid out. Especially for college sports with crappy official streams, it's actually quite effective at making the whole experience a bit more pleasant. Or if you're used to watching modern high-framerate nature documentaries, you lose that weird sense of stutter when going back to an older one, and there's really no downside.

      Still absurd for it to be the default, though. The typical person with a sports game or news station on 16/7 isn't going to care about the slight improvement, and the rest of us have to convince these modern settings interfaces to actually save the options we set.

      7 votes
  6. [8]
    Halfdan
    Link

    It's funny, but in the old days, I don't remember noticing the stuttering effect on panning shots. Is this a feature of modern flatscreen monitors?

    3 votes
    1. [7]
      timo
      Link Parent

      The screen response time is faster, almost instant. Especially with OLED. This means that a single frame is on screen for longer, and it changes to the next frame so quickly that it looks like stutter. This is very visible with low frame rate footage (like 24fps).

      When your response time is slower, a frame changes more slowly from one to the next, so the complete frame is on screen for a shorter amount of time. More time is spent in transitions from one frame to the next. That is why it appears smoother.
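
      As a back-of-the-envelope illustration (the transition times below are rough guesses, not measurements):

      ```python
      # Each 24fps frame lasts 1000/24 ≈ 41.7ms.
      frame_ms = 1000 / 24

      for panel, transition_ms in [("OLED", 0.1), ("slow LCD", 15.0)]:
          static_ms = frame_ms - transition_ms
          print(f"{panel}: fully static for {static_ms:.1f}ms of each {frame_ms:.1f}ms frame")

      # OLED: fully static for 41.6ms of each 41.7ms frame
      # slow LCD: fully static for 26.7ms of each 41.7ms frame
      ```

      The OLED shows a frozen image for nearly the whole frame and then snaps to the next; the slow LCD spends a third of each frame mid-transition, which reads as smoothing.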

      4 votes
      1. [2]
        arch
        (edited )
        Link Parent

        There's also the fact that those of a certain age grew up with CRTs. The phosphor layer in a CRT emits the image and slowly fades over time, which created a blurring and smoothing effect between frames. I would like to see an algorithm that mimics this with additional frames, slowly dimming each frame and then blending the new one over it at max brightness. I believe the only artifact would be ghosting; it could potentially be an improvement over the weird visual artifacts we can get now.
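
        A rough sketch of what that might look like, assuming 8-bit frames on a high-refresh panel (the sub-frame count and decay constant are pure guesses):

        ```python
        import numpy as np

        def phosphor_subframes(prev, new, subframes=5, decay=0.5):
            """Yield sub-frames in which `prev` fades out, CRT-style, under `new`."""
            prev = prev.astype(np.float32)
            new = new.astype(np.float32)
            for i in range(subframes):
                ghost = prev * decay ** (i + 1)  # exponentially dimming afterglow
                # `maximum` is the simplest blend; real phosphor glow is additive.
                yield np.maximum(new, ghost).astype(np.uint8)
        ```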

        3 votes
        1. Greg
          Link Parent

          That’s a really interesting idea - you could probably simulate the dimming with reasonable fidelity using a 120Hz panel, just by interpolating the brightness of each frame in your source along a fixed curve rather than trying to blend the motion itself. I’d be fascinated to see how it performed, actually!

          The other side of it would be changing the frame in a sweep rather than in a grid, which I assume (with no backing beyond educated guesswork!) is also a fair part of the CRT advantage here. That’s much more into hardware hackery dark magic - you’d need to address the panel over 1000 times per frame to do it by line, or a few million to do it by pixel and I have no idea where that lies on the spectrum from “six million dollars and a research team” to “oh yeah just give me four resistors, a heat gun, and some duct tape”. That said, you can get 480Hz panels now, so even in software you could do 50 lines per refresh on a 1080p24 source, with the appropriate dimming per row as well - might make an interesting experiment for some HCI students to figure out if it even warrants further study…
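
          In sketch form, with the numbers above (the dim factor is a pure guess, and the band works out to 1080/20 = 54 lines rather than exactly 50):

          ```python
          import numpy as np

          REFRESH_HZ, FPS, LINES = 480, 24, 1080
          SUBS = REFRESH_HZ // FPS   # 20 panel refreshes per 24fps source frame
          BAND = LINES // SUBS       # 54 lines rewritten per refresh

          def rolling_scan(screen, new_frame, dim=0.92):
              """Yield one image per refresh: dim the whole panel, rewrite one band."""
              for s in range(SUBS):
                  screen = (screen.astype(np.float32) * dim).astype(new_frame.dtype)
                  screen[s * BAND:(s + 1) * BAND] = new_frame[s * BAND:(s + 1) * BAND]
                  yield screen
          ```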

          2 votes
      2. [4]
        Halfdan
        Link Parent

        This would make sense. But isn't it the other way around? As far as I know, CRTs have faster response times. I've always been bothered by the blurring effect when I design fast-paced games, and I never noticed this in the CRT days. This ghosting test shows plenty of blur, at least on my monitor.

        3 votes
        1. Maethon
          Link Parent

          You are most certainly right. CRTs were extremely fast compared to the very first LCD monitors. But they had other shortcomings compared to today's tech. First, as arch said, there was the slow fade time, and scanning went left to right for each pixel, which probably contributed a bit to the feeling. Then we had the famous sprites, which altered the viewing experience a lot. If you watch it in slow motion, the difference from an OLED is massive.

          Nowadays we can just refresh the entire screen every frame without any temporal effects from the imaging itself. The only caveat is slower initial drawing time, but monitors are now fast enough that you won't notice it, and with newer technologies like mini-LED they are getting faster and faster response times.

          Coming back to your initial question, I do think it is a modern problem. I don't ever recall experiencing problems with panning in the past but nowadays it is unbearable to watch. It makes me feel like my eyes are juddering but in reality, it's just a slow-moving image.

          5 votes
        2. [2]
          Greg
          Link Parent

          The individual pixels change extremely fast on a CRT, but the image as a whole is drawn one pixel at a time, sweeping line by line from top left to bottom right over the course of each refresh (every 16.66…ms at 60Hz).

          Modern displays treat the pixels as a grid, changing them in parallel, rather than as a sequence changing one by one, so the way the frames behave will be quite different over and above the individual pixel behaviour: the entire image will be static for almost the entire 16ms on the OLED, rather than continuously changing through the sweep.

          The CRT behaviour also means every pixel dims slightly and then gets refreshed every time, even if it hasn’t changed, whereas modern displays only “touch” pixels that have changed colour or brightness on a given refresh. I’m not sure how much difference that actually makes, but I can see it acting as something almost analogous to the shutter angle in physical film and helping the brain fill in the motion - similar to how CRT pixels being points of light rather than little squares helps older, low res sprites look a lot better than they do on an LCD or OLED.

          4 votes
          1. arch
            (edited )
            Link Parent

            It's actually, I believe, double that due to interlacing. Another interesting thought experiment is interlacing itself: on an interlaced CRT, due to the drawing method and artifacts you mentioned, you don't notice it; the comb effect is next to invisible. Put the same signal on a progressive screen that actually draws every pixel every frame and it looks like garbage.

            I'm fairly certain that using metrics like "response time" when comparing CRT to OLED or LCD is kind of meaningless. It's like using horsepower to compare the performance of an electric direct drive car to the differential and transmission driven ICE vehicle. The differences in how they function make them perform so differently that it doesn't account for how we view it in real life.

            1 vote
  7. [3]
    smiles134
    Link

    My wife's parents had this on their TV, but it was overtuned and just made me nauseous watching anything on it.

    1 vote
    1. g33kphr33k
      Link Parent

      It can take some getting used to. I felt the same way initially, but now it feels weird watching 25fps PAL TV without motion interpolation.

      1 vote
    2. qyuns
      Link Parent

      Same with some friends of mine. They called me over to watch a movie on their new TV, and I spent the entire evening feeling sick. They couldn't tell the difference, so I genuinely thought there was something wrong with my vision all of a sudden. And even if I could have got used to it, I found it so distracting, because it made everything feel the same as one of those old BBC plays broadcast on TV. Suddenly even the sound felt different (even though that makes no damn sense) and everything was so blatantly a set.

      Turning it off was the first thing I did when I helped my parents set up their new TV. Didn't ask, didn't explain, just did it at the same time I was setting everything else up. Mostly because of the memory of their first widescreen: someone else set it up and set it all to 4:3, and they got mad and made me change it back when I tried to set it to 16:9 because they hated the black bars. Meanwhile I was going nuts because the obviously cut-off edges were so distracting I kept getting pulled out of what I was supposed to be watching.

  8. gingerbeardman
    Link

    The only thing I've ever used and enjoyed this mode for is watching Pixar movies, if you want them to look more like a video game. WALL•E was great fun with this on.