23 votes

iOS 13 redraws your eyes using ARKit so that you're looking at the camera instead of the screen

Tags: apple, iphone
@schukin:
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly. Notice the warping of the line across both the eyes and nose. https://t.co/U7PMa4oNGN
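The geometry behind the effect is simple, even if Apple's actual pipeline is not public: the front camera sits a few centimeters away from the on-screen point you're actually looking at, so your gaze appears deflected by an angle that depends on how far your face is from the phone (the kind of depth an ARKit face anchor provides). A minimal sketch of that angle calculation, with all numbers and names being illustrative assumptions rather than anything from Apple's implementation:

```python
import math

def gaze_offset_degrees(camera_offset_cm: float, face_distance_cm: float) -> float:
    """Angle between 'looking at the screen' and 'looking at the camera'.

    camera_offset_cm: distance from the camera to the on-screen point the
    user is looking at (illustrative assumption).
    face_distance_cm: depth of the face from the phone, the kind of value
    a depth map would supply (illustrative assumption).
    """
    return math.degrees(math.atan2(camera_offset_cm, face_distance_cm))

# With the camera about 7 cm above the other person's eyes on screen and
# the face about 40 cm away, the apparent gaze is deflected by roughly
# 10 degrees: enough to be noticeable, and the amount a warp would need
# to undo.
correction = gaze_offset_degrees(7.0, 40.0)
print(round(correction, 1))  # prints 9.9
```

This is why the effect is subtle in practice: the warp only has to rotate the apparent gaze by a few degrees, which is also why the distortion of the line in the linked video is visible but small.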

13 comments

  1. [5]
    balooga
    I have surprisingly conflicted feelings about this. On one hand, it's always mildly annoyed me that you can't have the illusion of eye contact with someone over video chat because they're always looking just below the camera. So why not fix it with software?

    On the other hand, the more I read about the subtle ways modern digital cameras are manipulating images (by default), the more disconcerted I become. I expect photography to create a good-faith facsimile of the real world, not a deliberate uncanny valley approximation of it. At the very least, I'd like to see these beautifying / smile-improving / attention-correcting "enhancements" all locked behind discrete settings that are not enabled out of the box.

    12 votes
    1. cptcobalt
      (edited)
      On the other hand, the more I read about the subtle ways modern digital cameras are manipulating images (by default), the more disconcerted I become. I expect photography to create a good-faith facsimile of the real world, not a deliberate uncanny valley approximation of it.

      I think this is just a sign of the times. This attention correction is likely a limited-scope "beta" for what will certainly become a tier-one feature at some point, like adjusting everyone's eyes in a group photo to look at the camera.

      I get the philosophy that cameras should produce a facsimile of real life, but tools like this will improve photos overall. A feature like this harms no one and only improves outcomes (unless it breaks).

      I want a feature like this to be widespread across platforms sooner rather than later. After a few years of bad glasses—think nearly "coke bottle" glasses, even with today's high-index lenses—coupled with rapidly degrading eyesight, my left eye now drifts a bit after prolonged strain, causing semi-frequent cross-eyed photos. (Recent LASIK surgery has undone a ton of the damage, but it's not gone.) Unless a photo is taken from one of the few angles that don't trigger it, I can't stand to look at any photo of myself. Knowing that people's phones will have features like this on by default will make me less camera shy.

      This is unquestionably a net positive, no matter how "uncanny valley" it may feel while the feature is still new.

      All that being said, I'm not the biggest fan of other software photography features, like fake depth of field. It's extremely obvious when it's applied in software, and skill and real lenses still beat it.

      4 votes
    2. nothis
      It used to be that you hated how you look in photos because it's so different from how you see yourself in the mirror. Soon, it will be the other way around!

      1 vote
    3. frickindeal
      You can always use a third-party camera app that does none of that, lets you shoot RAW, gives you control over shutter speed and ISO, focus point, etc.

    4. beowulfey
      Given what unreliable narrators we are in our own lives, I suppose the next step would be to make us question it even more. It's a slippery slope to throw what is "real" out the window, since we will naturally adapt to believe whatever we think is true.

      At the same time, given how fluid reality already is, I can't think of any good reason why we shouldn't, other than that it makes me uncomfortable.

  2. [7]
    kavi
    I'm not too knowledgeable on this, but doesn't this kind of start a slippery slope?

    It seems useful, but aren't cameras for taking "snapshots" of the real world, not for taking one, tweaking it so it looks nicer, and passing it off as how things really are? With a lot of other processing, like beautification, already baked in, doesn't this widen the gap between the way a person actually is and the way they're perceived online?

    4 votes
    1. [5]
      emdash
      It's not the beginning of the slippery slope at all. We pushed the rock down that hill a long time ago with smartphone cameras and Instagram. Not that this justifies it, mind you. This is just another shitty step towards a world where reality does not match content. Deepfakes, fake news, Photoshop, etc.

      This seems like a very un-Apple-like move, and I don't appreciate it.

      6 votes
      1. [4]
        cptcobalt
        This is just another shitty step towards a world where reality does not match content.

        It's not as if photos have ever been a 1:1 imitation of reality, either. You could argue this started with our very first ability to capture photos.

        A pro photographer, with good gear, knows they're not capturing reality—they're only capturing an imitation of it. Passing light through lenses to capture onto a film or digital sensor introduces a menagerie of things that distort reality: barrel distortion, grain/noise, chromatic aberration, you name it. They'll even manipulate things outside of the camera, like setting up extremely advantageous lighting scenarios to modify the outcome of their photos. If you flip through any magazine of beautiful homes, food, or people, they've all been modified to be "more than real"—including magazines dating to before the days of Photoshop. (People even thirst for some of this, like tasteful lens flares and film grain.)

        In a way, it's not great that software is reducing the skill required to take truly mind-blowing photos—we shouldn't accept a cheap software blur as a substitute for a fast lens—but I don't exactly mind having software around to help in the lowest-common-denominator scenarios, either.

        6 votes
        1. [3]
          emdash
          There's a sliding scale between acceptable and unacceptable in my mind, and while most of the examples you mention seem fine, this new feature crosses a threshold of reality distortion that I'm not comfortable with.

          3 votes
          1. [2]
            Spel
            Why? It's a software fix for a hardware problem (they can't place the camera in the optimal position, which is behind the eyes of the person you're talking to). What's especially unacceptable about that?

            3 votes
            1. emdash
              I didn't say it's unacceptable, I said I'm not comfortable with it. It reminds me of a number of dystopic futures where the humanity is essentially removed from humanity. What makes us ourselves is concealed.

              2 votes
    2. nothis
      It will be like the movie Surrogates: people interacting with each other through perfect-looking robot bodies.

  3. ainar-g
    There is something eerily dystopian about it. A thing as common as looking into someone's eyes suddenly becomes “improved” by technology. In the pretend-world you're looking into people's eyes more, but in reality you don't see the eyes at all; you see an animation that looks like your partner's eyes. This desire to make the world look more refined without actually trying to refine it bothers me greatly.

    3 votes