30 votes

Google wants an invisible digital watermark to bring transparency to AI art

15 comments

  1. [3]
    Comment deleted by author
    1. [2]
      arch

      > I've become so sceptical of anything Google does these days that my first reaction to this was "Why?"

      Probably so that it can easily be filtered out as material for training the AI itself.

      I do think the idea of a watermark being required on all AI-generated images is a good one. If it can be figured out, it should also be required for text somehow. I don't have a problem with AI being used, and I don't have a problem with the results being sold, but it should be transparent to the purchaser and user that AI generated some (or all) of the pieces, and which pieces. But I do think it should be free to implement, and it should be mandated by the federal government to avoid abuse by organizations like Google.

      20 votes
      1. unkz

        Not just filtering from their training data, but probably also as a ranking signal for image search.

        9 votes
  2. [6]
    HeroesJourneyMadness

    This is pretty silly. I agree with @douchebag - why? Photoshop has already normalized distrust in images. This is just DRM for pictures. We don’t need or want an additional layer of complexity that’s easily defeated by a quick scripted run through a photo editor.

    My quick “solution”: legislate that there needs to be disclosure in the image metadata (I bet most tools do this already), and then we’ll have to deal with the bad actors the old-fashioned way.
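
    For illustration, a minimal sketch of what such a disclosure tag could look like, written as a PNG text chunk with Pillow. The “ai-disclosure” key, its value, and the file names are invented here, not an existing standard:

    ```python
    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    # Attach a hypothetical disclosure tag as a PNG text chunk.
    meta = PngInfo()
    meta.add_text("ai-disclosure", "generator=some-model; portions=all")

    with Image.open("render.png") as im:             # hypothetical input file
        im.save("render_tagged.png", pnginfo=meta)

    # Read the tag back out.
    with Image.open("render_tagged.png") as im:
        print(im.text.get("ai-disclosure"))
    ```

    Of course, the tag only survives as long as the file’s metadata does, which is the objection raised below.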

    14 votes
    1. [4]
      arch

      Metadata can be stripped so easily that it's pointless for this purpose. Even without any bad intentions you could take a screenshot of the image and end up with the metadata being stripped.
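
      To make that concrete, here’s a minimal sketch with Pillow (file names hypothetical): re-saving just the pixel data silently leaves every tag behind.

      ```python
      from PIL import Image

      # Copy the pixels into a fresh image; EXIF, XMP, and any
      # disclosure tags are simply left behind.
      with Image.open("ai_generated.png") as im:   # hypothetical input file
          clean = Image.new(im.mode, im.size)
          clean.putdata(list(im.getdata()))
          clean.save("stripped.png")               # no metadata survives
      ```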

      10 votes
      1. HeroesJourneyMadness

        Yeah, I’m aware of how easy it is to do; I’m just trying to give a path forward that’s reasonable and doesn’t break things. Loading data into pixels is bending image files into doing something they’re not designed to do. Data belongs in the metadata. That’s the right place for it.

        I’d be willing to bet that some tiny change to the image, or just converting it back and forth between file formats in an editor, defeats this anyway… so again… why? We’d be much better off just being rigorous with attribution and being done with it.
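
        As a rough sketch of how cheap that “tiny change” is (file names hypothetical; whether a given watermark survives depends on how robustly it was embedded):

        ```python
        from PIL import Image

        # One lossy round trip: PNG -> JPEG -> PNG. Naive pixel-level
        # marks rarely survive this; purpose-built watermarks aim to.
        Image.open("watermarked.png").convert("RGB").save("temp.jpg", quality=85)
        Image.open("temp.jpg").save("roundtripped.png")
        ```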

        4 votes
      2. [2]
        pete_the_paper_boat

        > Metadata can be stripped so easily

        Could be embedded into the image itself, but then it would only be protected through obscurity. And that's just a cat and mouse game.
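
        For a sense of what “embedded into the image itself” can mean, here is a toy least-significant-bit scheme using Pillow. The marker and file names are invented, and it is exactly the kind of obscurity-dependent mark described above: anyone who knows the scheme can erase it.

        ```python
        from PIL import Image

        def embed_marker(src: str, dst: str, payload: bytes = b"AI") -> None:
            """Hide payload bits in the red channel's least significant bits."""
            bits = [int(ch) for byte in payload for ch in f"{byte:08b}"]
            im = Image.open(src).convert("RGB")
            px = im.load()
            width = im.size[0]
            for i, bit in enumerate(bits):          # assumes enough pixels
                x, y = i % width, i // width
                r, g, b = px[x, y]
                px[x, y] = ((r & ~1) | bit, g, b)   # rewrite only the red LSB
            im.save(dst)                            # must be lossless, e.g. PNG
        ```

        A matching decoder just reads the same LSBs back, and a single lossy re-encode wipes them out.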

        1. arch

          That's what a watermark is. They're often invisible to the eye now.

          5 votes
    2. unkz

      The issue is that metadata is auto-stripped by most image hosts. I don’t think this should be looked at as a measure against bad actors so much as a convenience feature: a quick first-pass filter for good actors.

      3 votes
  3. [7]
    Caliwyrm

    How long do you think it will be before there are AI tools to remove the AI watermark?

    This just seems like extra steps for no real gain. The honest people would already say that the piece was AI generated. The dishonest people will just remove the watermark.

    5 votes
    1. [5]
      GoatOnPony

      I think there's a large class of honest people who share images without taking the extra effort to also share the provenance. Most people aren't actively malicious; they just don't care or aren't thinking about it. They wouldn't remove any provenance if it was already attached for them. If there were a widely used metadata format that systems would automatically preserve on the user's behalf, that would be sufficient, but alas, we don't have one. Putting a watermark in the image content is the best facsimile of that we have.

      5 votes
      1. [4]
        HeroesJourneyMadness

        I disagree. I think getting the W3C to add a metadata requirement to images MIGHT be a way to go here. I know everybody hates on them, but standards should be slow moving. Hiding data in pixels just seems like bad juju and a way to make another black box for data.

        2 votes
        1. [3]
          GoatOnPony

          I'd be very happy if metadata were standardized and retained across systems, but I don't think that's achievable on relevant timescales. Getting a W3C/IETF standard is only the tip of the iceberg, and it's already a slow process. The hard part is updating every OS, server image upload/download path, SMS, email, etc. to handle it properly. A watermark can skip all of that. If you accept that AI image proliferation is doing harm now, a watermark gets you there much faster.

          Normalizing the hiding of data in images is a reasonable concern, but I think implicitly passing metadata has very similar risks.

          1. [2]
            HeroesJourneyMadness

            I’m out of my depth here, but I believe it’s already possible to put malicious code in image files? I wonder, though (and I acknowledge the fear-mongering implied here), whether putting hidden stuff in images is a can of worms we want to make standard.

            I’m just generally very, very suspicious of new “advances” like this that kinda toss web standards out the window for the sake of ease/speed, for an issue that really only just became an issue and is still kind of morphing and changing.

            It won’t deter bad actors, so it doesn’t really fix the problem, but it does potentially open a whole bunch of new issues. Pixels should be pixels, not pixels plus DRM or copyright or invisible signatures.

            1 vote
            1. GoatOnPony

              I think your concerns are valid - hiding arbitrary information in images is a worry. It's already possible, but normalizing it has additional dangers. IMO, this mostly comes down to how much you think AI-generated images being created and shared by unsophisticated people is a concern. I'm concerned enough about bad data provenance as a vector for widespread harm (political misinformation, faked reviews, conspiracies, etc.) that on balance I weigh this effort as a net positive. But I can see how a different weighting of the concerns/likelihood of it succeeding would lead to it being a negative.