9 votes

An AI-generated image of a Victorian MP raises wider questions on digital ethics

23 comments

  1. [4]
    Minty
    Link
    The alleged crop they expanded doesn't match the end result. Look at the left fold on the dress present in original and gone in the end. So, they very much did change the content in the original framing, and then added some fake content in the expanded framing.

    Neither is excusable in journalism.

    11 votes
    1. [3]
      asukii
      Link Parent
      Generative fill tools can sometimes change some of the adjacent parts of the source image, for what it's worth.

      4 votes
      1. [2]
        balooga
        Link Parent
        There’s also a dark gradient overlaying that part of the image, which is meant to be the background for the text element, but it also darkens the folds in the fabric, making them more pronounced. Unfortunately, that creates the illusion that her bust was edited to appear larger. This is just a confluence of different technical accidents, nothing intentional but problematic nonetheless.

        4 votes
        1. Minty
          Link Parent
          This dark gradient doesn't go far enough to the right to explain it there.

          2 votes
  2. [9]
    asukii
    Link
    The short version: Nine News airs a photo of this MP edited to look more sexualized, with enlarged breasts and a crop top to show her midriff. After backlash, they claim that it was an honest mistake, because they started with a more tightly cropped version of the photo and used Photoshop's Generative Fill tool to imagine the rest, meaning that it was AI making the sexualizing edits - but whose responsibility is that really in the world of journalism?

    9 votes
    1. [5]
      lou
      (edited )
      Link Parent
      I would say that manipulating images to that degree is a serious ethical failure even if it worked as intended. Journalists should not use those features at all. This line has been progressively blurred in present times, but, if you present something as non-fiction, you immediately lose the liberty to freely alter the material as you would do with fiction.

      This is a serious issue that predates AI and I don't see enough concern around it. I'm talking about photorealistic CGI and actors interpreting real figures in audio, both without sufficient demarcation, leading the audience to take fiction for archival material.

      Documentarists and journalists are currently given way too much leeway to cross the boundaries between fiction and non-fiction. And that'll only get worse.

      EDIT: added a bunch of stuff.
      EDIT2: this reminded me of this previous post https://tildes.net/~tech/1cx6/startup_channel_1_creates_news_service_presented_by_ai

      16 votes
      1. [4]
        Eji1700
        Link Parent
        While I agree, using photoshop to alter images became the extension of snapping 100 photos and picking the most embarrassing ones. Small alterations to the colors, height, size of people can make them look better or worse depending on the tone of your article and what you want to convey.

        2 votes
        1. [3]
          lou
          Link Parent
          Retouching images existed before Photoshop, and that is, of course, okay. However, I would advise against any feature that produces content that wasn't there before.

          6 votes
          1. [2]
            balooga
            Link Parent
            It’s not really so simple though. What about the magazines that have “altered” the skin tone of dark-skinned celebrities? Often there’s a technical explanation for this, coming down to suboptimal lighting or camera settings or white balancing in post. A digital or print photograph is never going to have true-to-life colors. That hasn’t stopped people from accusing publications of deliberately lightening or darkening skin to suit an alleged agenda.

            My point is, even unedited photos can be unflattering or look wrong in ways that could be considered “producing content that wasn’t there before.” An image is a facsimile of the real world; by their nature, all images are fake to some degree.

            4 votes
            1. lou
              Link Parent
              I don't have a ready-made rule or solution to any of these. I would only venture that documentaries and factual journalism should be under closer scrutiny in that regard.

              2 votes
    2. [2]
      stu2b50
      Link Parent
      whose responsibility is that really in the world of journalism?

      I don’t think that’s even a question, it’s obviously the responsibility of the journalist. The fact that “AI” did it is a red herring - any kind of modification of images used for reporting is unwanted.

      That’s why a coalition of news companies and camera companies are rolling out content authenticity. I’d expect sooner or later every image on the NYT or WaPo to be verified by content authenticity.

      15 votes
      1. unkz
        Link Parent
        That’s why a coalition of news companies and camera companies are rolling out content authenticity.

        For more info on this,

        https://contentauthenticity.org/

        6 votes
    3. DavesWorld
      Link Parent
      This whole thing is bullshit. Don't like the pictures the AI is kicking out? Update what you're asking of it, and have it kick out more. It is not any more complicated than that. At all. You have control over the AI, can tell it what you want, how you want it, and nudge and fiddle with the parameters as it keeps giving you choices, until you like what it gives you.

      They liked what it gave them. They aired what it gave them.

      This is just the updated version of the infamous Time magazine OJ Simpson cover. It was one photo, taken at Simpson's booking. Time chose to alter the photo with filters and whatever, to make it darker and weirder and we all know how people interpreted the result. Poorly.

      Time claimed to be surprised by the reactions, but that's bullshit. They knew it would get a reaction. No one could have possibly looked at the photo they chose to run on their cover and have thought "yeah, that's just like the same photo we got from the cops, the same photo the others are gonna use in their stories."

      "Journalists" attempting to play gotcha with AI destroys what limited levels of credibility they might have left. They're doing this stuff, and putting on surprised faces, as a mummer's farce. It's something they're doing on purpose, and then play acting to, just for ratings. Short term gain (people tune in), and ignoring what could turn out to be a long term loss (people decide the journalistic source is shit and stop tuning in).

      None of this makes AI look bad. It makes journalists look bad. It makes them look like they're simpleton idiots, befuddled by fire, amazed at the magic coming from the mystery box. So which is it? Are they that stupid, or are they just not bothering to actually research and investigate and think before they run with a "story"?

      Either way, it's still their fault.

      5 votes
  3. Dr_Amazing
    Link
    I've used this tool a fair bit and I completely believe the explanation that they just tried to expand the picture a bit.

    Still it always gives you 3 choices and it's easy to click the button again for 3 more. So while the AI is responsible for drawing it, a real person picked it out from a few choices. In my experience it's way better at backgrounds than people, so I can easily see someone picking it because it blended better or didn't have weird hands, not even thinking about what the fashion choice was saying.

    5 votes
  4. [7]
    bloup
    Link
    On topic, given this image was supposedly generated by mindless AI, isn’t it a bit presumptuous to say that the edit represents a crop top and an exposed midriff? I honestly think it looks more like a tan belt… it’s not even a very similar color to her skin tone, in my opinion. The modifying of body proportions is nothing to defend, but the reporting still seems highly sensational.

    1 vote
    1. asukii
      Link Parent
      Anecdotal, but I immediately saw it as a crop top before reading any of the text in the article.

      4 votes
    2. [2]
      mild_takes
      Link Parent
      Looking at the image again, I'm not sure if maybe I see midriff because I was told that's what it is. I still think it looks like midriff, but I now have that suggestion and bias in my mind.

      The way the top of the skirt (below the belt-riff) is bigger, thicker, and away from the body suggests a skirt. With a belt you'd expect the fabric to sit differently, or for the belt to be larger than her waistline. In the photo the belt-riff is inset, which we would tend to infer as a gap in the clothing.

      3 votes
      1. bloup
        Link Parent
        I wouldn’t have a problem with them mentioning this aspect of the image if they said a lot of people see it as her having her midriff exposed, and supported that assertion with some quotes from the public. But the thing is it’s literally a false image of something that never even happened, so if someone literally doesn’t see it as her having her midriff exposed, I’m not really sure how someone else can say they are “wrong” for it, unless they have some kind of meta-knowledge about how this specific image was generated.

        As for me, I just feel like it doesn’t really match her skin tone. Also, ironically, I’m used to seeing people put belts on and have some muffin top, with their top billowing over the belt a little bit, so all of the things you mentioned were things that made me feel like it looks like a belt. By the way, I hope you don’t take this to mean that I think there is something wrong with seeing it as midriff.

    3. [3]
      Eji1700
      Link Parent
      You can see the "belly button" it was trying to assume would be there in the middle center.

      It's not really a belt or a midriff, it's an AI's best guess at what goes there, so it's probably both. The main reason I wouldn't expect anyone to think it's a belt is that it has no buckle or strap or anything else, and while those exist, they're not exactly common or normal.

      1 vote
      1. balooga
        Link Parent
        Like a lot of AI-generated imagery it’s not really either thing but is a hallucination of both combined. The AI isn’t specifically trying to draw a belt or midriff. It’s trying to draw some concept that probabilistically exists between the two. The longer you stare at these images the more details like this you notice, things that seem kinda vaguely like object X but also kinda object Y and never really land on either. They look fine at a glance but don’t hold up on closer inspection.

        3 votes
      2. bloup
        (edited )
        Link Parent
        I definitely don’t see a belly button. I see a slightly darker region, and if you’re willing to call that a belly button then I hope you’d let a different person call it a belt buckle. I really think calling it either is a completely ridiculous stretch, though (and that’s literally the whole point of my comment). It’s an AI-generated image, and if there is even one iota of ambiguity about a particular aspect of an AI-generated image, it should not be presented as depicting anything in particular (though it’s definitely still worth mentioning the most common interpretations by the public). For what it’s worth, I’ve sent the edited image stripped of all context to like 4 different people asking them “do you think it looks like this woman is wearing a crop top with midriff exposed, or do you think it looks like she is wearing a tan belt?” and I got 2 belts and 2 midriffs.

        Also, I’m pretty disappointed none of these replies addresses the thesis of my comment, which is simply that it’s a little irresponsible to present an ambiguous AI-generated image as certainly depicting anything in particular. I really don’t care why anyone might think it’s a midriff or a belt. It’s all kind of beside the point.

        3 votes
  5. [2]
    bloup
    Link
    Off topic, but before I realized this was about Australia, I thought it might have been a news article about someone making a deepfake of a 19th century British politician

    13 votes
    1. oniony
      (edited )
      Link Parent
      That's where my mind went too. Then I was really confused because the person in question didn't look at all Victorian in dress or appearance, which led me to conclude they had used AI to update her Victorian look to modern fashions.

      1 vote
  6. Comment removed by site admin
    Link