10 votes

The most urgent threat of deepfakes isn't politics, it's porn

9 comments

  1. [7]
    markx2

    I would very much disagree.

    Yes, a huge number of people watch porn, as the details Pornhub publishes demonstrate. I would suggest, though, that porn is not a mainstream topic of conversation at home, at work, or even at the pub/club/wherever. So the fact that someone has been deepfaked into a porn video would need the wider media to disseminate the news - which of course they would. But any 'danger' is then limited to the person being targeted, and maybe their family and colleagues. If that video involves illegal behaviour then the target has recourse to the law (as far as they can afford it). But again, the danger is very limited.

    A deepfaked political announcement though? That's a whole different game. The target there is not the person being deepfaked, it's the people that will react to the contents of that video. That's potentially a huge target.

    Not a great example:

    Russia produces a deepfake that shows a nation's / religious leader ****ing a pig.

    Russia produces a deepfake showing a nation's / religious leader saying something inflammatory.

    Surely the wider reaction will come from the latter?

    So while a porn deepfake will be deeply distressing to a very small number of people, a political deepfake has the ability to cause much more widespread damage.

    16 votes
    1. [6]
      cfabbro

      IMO you're not wrong that the most disastrous consequences of deepfakes in the future are likely to be from politically motivated ones, but I feel like you're ignoring a very critical part of the title. "urgent" = requiring immediate action or attention. And in the video they explain precisely why that is the case with deepfake porn.

      According to the study they cite, 96% of deepfake videos out there in the wild right now are porn related, there are already several large communities dedicated to creating them (some of which allow paying for requests), and it is difficult for those victimized by them to get the content removed or to seek damages. So porn deepfakes are clearly already an issue at this very moment, whereas politically motivated deepfakes have yet to really surface as a major issue.

      p.s. Research into detecting deepfakes is already well underway and progressing (e.g.), which may mitigate the effectiveness of politically motivated ones, since they will likely be put under more intense scrutiny once released. However, since people creating and consuming deepfake porn already generally know it is fake, proving it fake doesn't really accomplish much: the damage comes not from people believing the videos are real, but from their being created and shared without the consent of their victims in the first place.

      7 votes
      1. [5]
        NaraVara

        Research into detecting deepfakes is already underway and progressing

        I have my doubts about how effective this would be. It's basically just an arms race and, as we've seen, people don't believe mendacious bullshit because it's convincing. They believe it because they want to believe it. It supports their conclusions and they'll throw out anything that casts doubt on it if they hold those conclusions hard enough.

        Back to the porn thing, though. I don't really know any way around it if we're being honest. The problem is that the technology is too democratized and can be done in complete privacy. So there's just no paper trail or footprint. We can probably find fakes of celebrities using the same kind of face detection, but fakes of regular people would be impossible to identify without some clues in the context where it's shown, the filename, or metadata. If it's hosted on a website called "DeepFakePorn.com" or something you could know, but otherwise what can you do? How would you even prove that it wasn't a voluntary deepfake where someone wanted to have porn of their own face with a better body?

        The most analogous kind of penalty (in terms of how you would find the offending material, find the people who produced it, and punish them) would be revenge porn. But we're really bad at doing anything about revenge porn! And, what's more, revenge porn only works if the subject initially consented to being filmed in the first place. So the victim would actually know if that material existed and it would be on their radar as a thing to worry about.

        Worse yet, it gets hard to decide if it's actually not just covered under freedom of expression. Practically speaking, what's the difference between someone deepfaking Emma Watson and Daniel Radcliffe into some kind of porn vs. drawing some kind of Harry Potter hentai that uses the actors' likenesses vs. hiring lookalikes to film something? (Not a lawyer, but I actually think the middle case is actionable since you're using likenesses without permission, while the last case is fair use.)

        6 votes
        1. [4]
          cfabbro

          I have my doubts about how effective this would be.

          Yeah, I definitely do too, hence my use of "may" in the next sentence. :P

          The problem is that the technology is too democratized and can be done in complete privacy.

          The technology behind deepfakes is not the issue. And ones created in private and kept private can't be detected, but they are also not necessarily harmful, so they are also not the issue here. Sharing of those deepfakes to the public is the issue. And while we may not be able to identify the original deepfake porn creator, responsibility still rests on those who share (and host) said deepfake, so they can and in some cases should be held legally liable.

          And, what's more, revenge porn only works if the subject initially consented to being filmed in the first place. So the victim would actually know if compromising material existed and it would be on their radar as a thing to worry about.

          What's your point here exactly? Because it sounds very close to victim blaming, IMO. Consent to be filmed in private does not mean consent was granted to have said film shared with others or publicly released. And yes, the victim would know of it existing, but again, that's not the issue; the issue is it being shared without consent and the damage that causes, both psychologically and potentially to someone's reputation.

          Worse yet, it gets hard to decide if it's actually not just covered under freedom of expression. Practically speaking, what's the difference between someone deepfaking Emma Watson and Daniel Radcliffe into some kind of porn vs. drawing some kind of Harry Potter hentai that uses the actors' likenesses vs. hiring lookalikes to film something?

          No, it isn't hard to decide at all, IMO. And the difference is that drawings are inherently creative expression due to their method of creation, their medium, and the creativity their authors express through them. Their intent is also not to fool others into thinking they are real, and nobody is generally going to be fooled by them. So while, ethically speaking, sexualizing real-life people (or the characters they portrayed) without their consent for sexual gratification is iffy at best, legally speaking it makes sense why it's protected.

          However deepfakes are algorithmic digital manipulation using existing media as training data, and intended to exploitatively portray real people, as realistically as possible, in real life scenarios they did not actually take part in themselves, without their consent. There is very little (if any) creative expression in the deepfakes themselves or their creation.

          1. [3]
            NaraVara

            What's your point here exactly? It sounds very close to victim blaming, IMO.

            How? I'm talking about the likelihood of finding the footage before it gets wide distribution. People with explicit footage of themselves out there can know it exists and know to keep an eye out for it. People have no idea if someone is making deepfakes of them, and it can, in theory, be trending on a porn site for ages before they ever learn about it. It becomes functionally even more impossible to really enforce than revenge porn rules which, like I said, we're already really bad at enforcing.

            However deepfakes are algorithmic digital manipulation using existing media as training data, and intended to exploitatively portray real people, as realistically as possible, in real life scenarios they did not actually take part in themselves, without their consent. There is very little (if any) creative expression in the deepfakes themselves or their creation.

            You could apply this argument to explain that computational photography doesn't count as "art" either. They may be putting the face over something, but they're still choosing source footage, deciding what scenario to depict, making editing decisions, probably running multiple versions to pick the one that looks "best," and possibly even selecting or dubbing in audio. That's all within the scope of things considered to be creative expression in other contexts.

            4 votes
            1. [2]
              cfabbro

              And, what's more, revenge porn only works if the subject initially consented to being filmed in the first place. So the victim would actually know if compromising material existed and it would be on their radar as a thing to worry about.

              I explained exactly how it comes across as victim blaming. "Consent to be filmed in private does not mean consent was granted to have said film shared with others or publicly released." So what's your point in saying the above, and in that way? I still don't understand your point there, or here:

              People have no idea if someone is making deepfakes of them, and it can, in theory, be trending on a porn site for ages before they ever learn about it. It becomes functionally even more impossible to really enforce than revenge porn rules which, like I said, we're already really bad at enforcing.

              Again, what's your point? I really don't understand your logic. It reads to me as, "these things are hard to detect, therefore we should not bother trying to stop them"... but just because deepfake porn can, in theory, be trending on a porn site for ages before the victim ever learns about it doesn't mean we should permit it to remain there once it is detected.

              You could apply this argument to explain that computational photography doesn't count as "art" either.

              If that computational photography project were designed to exploit someone else's image without their consent and the purpose of it didn't fall under fair use (e.g. for critique or satire purposes), then yes, I would apply the same argument, for the same reasons I feel that deepfakes don't count as creative expression. Intent of the creator, and consent + outcome for the subject, matters in ethics, and the law. And that is ultimately what is at issue here.

              1. NaraVara

                "Consent to be filmed in private does not mean consent was granted to have said film shared with others or publicly released."

                I think you should reread what I said very carefully, specifically: "So the victim would actually know if that material existed and it would be on their radar as a thing to worry about."
                The whole point is likelihood of being able to find the stuff and being able to trace back to who put it out there. Something you can't do with a deepfake.

                It reads to me as, "these things are hard to detect, therefore we should not bother trying to stop them"

                If you read my post more carefully you'll notice I never said "we should not bother." My first sentence was "I don't really know any way around it if we're being honest." In other words: what are you going to do about it that will actually address or prevent the harm? You can have all the rules you want about taking it down, but if the odds of finding where it's coming from in order to issue a takedown notice are basically nil, those rules won't stop any of the harm you're worried about. Even with some laws around taking them down, or civil penalties on producing them, I think it's still something we're just going to have to learn to live with culturally.

                It's all well and good to say "[thing] is bad," but merely declaring it bad doesn't mean we have viable enforcement mechanisms to do anything about it, or that the cures we propose won't end up being worse than the disease. For good examples, see the War on Drugs or SESTA/FOSTA, which was ostensibly about preventing human trafficking but ended up mostly further criminalizing sex workers while doing next to nothing about sex trafficking.

                If that computational photography project were designed to exploit someone else's image without their consent and the purpose of it didn't fall under fair use (e.g. for critique or satire purposes), then yes, I would apply the same argument for the same reasons I feel that deepfakes don't count as creative expression. Intent of the creator, and consent + outcome for the subject, matters in ethics, and the law. And that is ultimately what is at issue here.

                This may seem pedantic, but it's an important point of order here. Neither consent nor fair use has much to do with whether an action is creative expression or not. They may impact whether it's protected under the First Amendment (like, has literary or artistic value), but if creativity and expression are involved, it's creative expression by definition. We should be careful not to conflate the two.

                You can protect your likeness and sue on that basis if it is intentionally distributed publicly. But if it's made for personal use there really isn't that much you can do about it. If it's not distributed, then practically speaking, it wouldn't be that different from writing self-insert erotica in your diary or just indulging in lurid fantasies. The only similar rules we have where intent factors in that much are around realistic renderings of child pornography. But even those are difficult to maintain. To be honest, I'm pretty sure the only reason some of them can stay on the books is just because the people prosecuted under them are deeply unsympathetic and shallow pocketed. (The deep pocketed ones never get prosecuted, naturally.)

                4 votes
  2. [2]
    tempestoftruth

    I understand how the video title is influenced by recent discourse on this issue, but pitting one potential abuse of this new technology against another isn't constructive, in my opinion. Porn deepfakes and disinformation deepfakes are both terrifying in their own ways, and we should be wary of both types.

    5 votes
    1. ThatFanficGuy

      "urgent". That implies priority of response, rather than qualitative comparison based on overall damage potential.

      2 votes