30 votes

There are no laws against deepfake pornography in the US

28 comments

  1. JuDGe3690

    One of my former law school classmates (now graduated and working as an attorney) wrote a law review article last year on this topic:

    Natalie Lussier, Nonconsensual Deepfakes: Detecting and Regulating this Rising Threat to Privacy, 58 Idaho L. Rev. 352 (2022), https://www.uidaho.edu/-/media/UIdaho-Responsive/Files/law/law-review/articles/volume-58/issue2/6-lussier.pdf.

    The current roadmap for victims of nonconsensual deepfake pornography is cloudy at best. A nationwide ban on deepfakes would be the most effective solution to the issues discussed, but this is not possible and would create more issues than remedies. Implementing a ban such as that would be a suffocation of the freedom of expression that Americans have a fundamental right to. Similarly, an injunction against deepfakes likely infringes on the Constitution’s First Amendment.

    State laws alone are not reliable nor effective due to the nature of the internet as a national and global force. (p. 374)

    In short, there are serious issues with implementing such regulation in the U.S., including constitutional protections on free expression (the Supreme Court is loath to expand categories of unprotected speech), although she highlights some potential avenues to explore, including § 230 amendments and more.

    21 votes
  2. [19]
    ignorabimus

    I tried to find a way to put a trigger warning on this post but wasn't really sure how, so I added the "nsfw" tag. I'm not sure this is appropriate as the Bloomberg article (naturally) doesn't include any of the NSFW images, but it discusses them in depth.

    This isn't an isolated incident either; there are lots of cases of horrible uses of generative AI (and even old-fashioned Photoshop) to intimidate and harass people (mostly women).

    Let's hope that the US passes laws criminalising this kind of thing. As I understand it, some jurisdictions (e.g. the UK) ban all child pornography (i.e. including representative works, not just real images).

    13 votes
    1. [13]
      bitshift

      One of the big differences between this and CP is that an image of a child is visually distinct from an image of an adult 99% of the time. In contrast, it is pretty much impossible for an arbitrary human to correctly judge whether an AI-generated photo was intended to resemble a real person, and it will become increasingly difficult even to tell that it was AI-generated.

      Pandora's Box is already open. But I say that in the most optimistic, least defeatist way possible! I think society will successfully adjust to the new normal (whether willingly or not). And to that end, I hope that laws will address the human interactions at fault here: i.e., don't ban numbers (AI model weights, image files), but ban the use of them for evil (sharing an image of a classmate, fake or real, with the rest of the school).

      And as time goes on, the impact of fake images may still be there, but it will fade. As Benjamin Franklin aptly put it, "In our pursuit of wisdom, let us teach our children to be guided by the lantern of Reason, and not be swayed by the flickering candles of the Internet."

      30 votes
      1. [13]
        Comment deleted by author
        1. [12]
          wervenyt

          Yeah, I don't see why the solution to this problem isn't to stop pretending it's a bad thing to have made pornography. It's not a short-term solution, but I don't see how the alternatives actually prevent harm.

          12 votes
          1. [11]
            raze2012

            Sure would address many societal problems the day a nipple popping out isn't a national tragedy that changes cable broadcast laws. But I wonder which approach here is more idealistic.

            5 votes
            1. [2]
              arch

              Sure would address many societal problems the day a nipple popping out isn't a national tragedy that changes cable broadcast laws.

              I think our puritanical social norms surrounding nudity are a separate issue from pornography, and I am not certain that it is beneficial to conflate them, especially in this discussion, which is already so far removed from discussion of nudity and pornography. We have made the idea of nudity so taboo that just the suggestion of a woman being naked is titillating. That is not the same as pornography, though. Nudity is not the same as engaging in sex acts. Nudity in film in the '70s was fairly normal; you could get a PG rating with it. We are moving further and further away from that, and it is perhaps teaching children that our bodies are either something to be ashamed of, or a sexual gift to bestow on others. Neither of those is a healthy thought process surrounding our bodies.

              5 votes
              1. raze2012

                I get what you're saying, but I also feel that they are two separate factors with the same puritanical root. If the nude human body is a shameful thing, then the act of sex may as well be a demonic ritual. I'm sure the latter is why the former faded from media over recent decades.

                Especially in this discussion which is already so far removed from discussion of nudity and pornography

                It relates, no? Minors aside, a nude photo, real or fake, shouldn't have this debilitating effect on someone, and it only does because of societal norms. Fake sexual acts are just a more extreme extension of this. Both bring consent into question, but one admittedly makes for a more charged discussion (paparazzi on the street vs. violation of a private moment).

                These are certainly issues whose legal ramifications don't necessarily need pornography involved in order to discuss, but that angle also draws in more participants.

                1 vote
            2. [8]
              wervenyt

              Sure, but the alternative seems to essentially create a category of banned art based entirely on happenstance, to "protect" victims by fining? arresting? the creator/distributor after they have already been shamed by association, in a world where anybody could "star" in porn without their knowledge, let alone consent. One of these approaches is actually endeavoring to maintain the stigma, so sure, I'm glad it'll help those people by giving them justice on a ledger as we continue to build our labyrinth of government on outdated taboos. That's a solution.

              1 vote
              1. [6]
                PelagiusSeptim

                I really have to strongly disagree with this take. To begin: I agree that the stigma against someone having appeared in pornography is bad and outdated. That does not mean it is easy to remove from our culture. No one has the power to unilaterally change our perspectives as a whole. You say "it's not a short-term solution", but how can it even be a long-term solution? The views of society may improve, but they could also get worse.

                My second issue with this solution is that deepfake pornography can cause harms outside of just other people's views of the victim. Seeing yourself, or a version of yourself with a different body, engaging in sexual acts you did not and would not consent to, can absolutely be psychologically harmful. Porn already tends to create unrealistic expectations of sex and body image, how much worse is it when a person has to compare their own body or have others compare their body to a fake version created for masturbatory purposes?

                I think banning deepfake pornography, and punishing those who make it, is the best immediate solution to the problem. Changing society's views toward sexuality is important, but taking action now does not preclude that change.

                3 votes
                1. [5]
                  wervenyt

                  Fundamentally, banning things is effective thanks to a few phenomena that this doesn't participate in. A ban requires visibility to be enforced, and since porn is socially coded as "to be used privately", you not only lose the happenstance equivalent of a cop wandering past a drug deal, but you also have a strong social stigma against coming forward with it. What good person would be in a situation to see faked porn, huh?

                  It may seem like that doesn't matter, since we're talking about the immediate effects of viewership on the target of the mimicry, but those effects aren't limited by any sort of transmission. You don't have to see that Leslie was "in a porn" to hear about it.

                  Further, a ban relies on discouraging further production. When production involves natural resource extraction and refinement, construction, and distribution, you have hundreds or thousands of people who, if they're afraid, can throw a wrench in the gears. In this case, we have one or two people creating the model with no external signs beyond power use and having downloaded porn, and the actual perpetrators of specific ills are almost always social pariahs, people who've preemptively dismissed social mores, or people young enough that long-term consequences are not an effective deterrent anyway.

                  With that in mind, how does a ban prevent that mindfuck of seeing a fetishized/"more normal" version of yourself? The production cost here is actually the punishment, because otherwise there's no limit on how many lives can be destroyed if, instead of destroying our unhealthy relationship with sex, we further kowtow to and cement into law the preferences of the least reasonable members of our society.

                  Meta postscript: I, for one, don't much care for the rhetoric of "something must be done!" It seems mostly to cover for (clearly in your case) fear of the monster we must wrangle to improve anything, and (clearly not you) intellectual cowardice. I'm not willing to feed the demon of our systems (convenience) with my concern, personally, and fuck the hateful who insist the rest of us bend to their will. Please take my arguments as sincere, but I do recognize that most other people have more pragmatic and pressing priorities. All this to say: if we disagree, that doesn't mean I don't appreciate your viewpoint or that I think you're wrong, I just prefer to be an alarm bell for when we could be better, even if it's mostly annoying.

                  6 votes
                  1. [4]
                    PelagiusSeptim

                    The production cost here is actually the punishment, because otherwise there's no limit on how many lives can be destroyed if, instead of destroying our unhealthy relationship with sex, we further kowtow to and cement into law the preferences of the least reasonable members of our society.

                    You make some good points about the possible efficacy of such a system, but I think you are unfairly characterizing what I'm saying as being in favor of not trying to change societal views, when it seems apparent to me that you can do both. I don't agree that making laws of this sort, effective or not, will somehow perpetuate these views.

                    1. [3]
                      wervenyt

                      Well, I'm sorry for characterizing it that way. I didn't think that was actually how you viewed it; I was just trying to exaggerate my own emotional response for communication's sake, but that doesn't mean it was fair.

                      More to the point, I think we may have to agree to disagree then. I don't see how the project of destigmatization can progress if we're also happy to create regulations that only make sense in a culture far more likely to shift than the regulations themselves, regulations which we all know will stick around far longer than is strictly necessary or even productive.

                      1 vote
                      1. [2]
                        PelagiusSeptim

                        I guess my question is: what is the problem with the regulation continuing to exist in a world where being in pornography is destigmatized? Even in such a world, consent continues to be important. Aside from that, if you think the regulation is unenforceable, why would it cause problems in this world?

                        I think the core of my feeling here is that people deserve the right to choose whether or not they appear in pornographic material. This remains true regardless of any stigma. The internet being what it is, it will be hard to keep such material entirely at bay, but that doesn't mean we shouldn't try.

                        1. wervenyt

                          But these people are not appearing in pornographic material. Pornographic material featuring characters who resemble them is being created. It would still be fraudulent, and currently is defamatory, to claim the real person has appeared, regardless of specific protections against this particular form of defamation.

                          1 vote
              2. raze2012

                It's the most profitable solution for the government. Impractical and it doesn't fully protect the victim, but they have the biggest incentive to maintain that. Much easier to throw money at the wronged than address mental trauma.

                And it's already such an easy stance to go "we must be hard on pornography to protect potential victims/to maintain our wholesome community" (depending on your political line; passive-aggressively bipartisan issues are the best kinds of issues!).

                1 vote
    2. [6]
      Comment deleted by author
      1. [3]
        Wolf_359

        About ten years ago I fell into a job where I worked in a facility that housed pedophiles. They weren't in jail but couldn't be in society either. As a result, this issue has quite unexpectedly come into my life as something I care a lot about.

        I learned more than I ever cared to know about the issue and I've gotten past the immediate and visceral disgust. My view on the topic now is to do anything that reduces the number of victims, even if it seems weird, disgusting, or counterintuitive.

        I would like some very serious studies done on the matter of fake CSAM. Does it help reduce the number of offenders by providing them a different outlet? Does it provide a gateway by which "new" pedophiles can be created? I doubt it, but we should definitely study it from all angles.

        On that note, can we create a safe and anonymous system where pedophiles can seek help without fear of retribution? Can we study drugs which might help with the disorder?

        Society wants to keep it simple. They want to immediately murder anyone who offends on a child. I understand the impulse because, let's be real, it's fucking gross and incredibly harmful to the children.

        But if we just wait for people to offend and act punitively, we aren't reducing the number of victims at all. Trust me, the pedophiles I worked with never believed they would be caught, and they also did not believe they were hurting anyone. Not a single one of them expressed genuine remorse. They could not change their desires. As such, it just wasn't in their best interest to believe they had hurt anyone - so they didn't.

        We really need to be studying the issue more deeply. Part of that is going to be holding our noses and not stigmatizing anyone who dares to create harm reduction programs or facilitates studies of the problem.

        47 votes
        1. bitshift

          I learned more than I ever cared to know about the issue and I've gotten past the immediate and visceral disgust. My view on the topic now is to do anything that reduces the number of victims, even if it seems weird, disgusting, or counterintuitive.

          This is an amazing paragraph — both because it hits upon the central issue (instinctual response vs. what actually makes the world a better place), and because it's completely generic with respect to the topic.

          I can immediately think of several examples of social issues in the US where there's a policy/law/attitude that's causing unnecessary suffering. And if we did something differently, we could alleviate that suffering. But it feels right to enough people, because some other group of people feels abhorrent, and the suffering remains.

          18 votes
        2. tealblue

          My view would be that our understanding of sexuality as differentiated into discrete sexual orientations that one is born into is a bit of a simplification, and is partially a product of needing to create a defense of homosexuality against religious criticism. Sexuality is IMO more adaptable than we think: creating an outlet may make things worse, and stemming access to fictional and real CP may make them better. Personally, I wouldn't be opposed to countries making fake CP illegal (though it would probably be out of step with legal norms in the US), but the punishment should probably be less severe than for actual CP (maybe just punishing its production?). I think there should definitely be resources to support and treat non-offending pedophiles, but it'd be a slippery slope in my estimation to view fake CP as potentially part of that treatment (especially for the simple reason that an individual can find pornography of extremely young-looking adults anyway, so it's not clear what fake CP could do therapeutically that that type of porn couldn't).

          10 votes
      2. raze2012

        Fiction can run into some very borderline issues. Nothing stops an artist from making what otherwise looks like a very adult figure and then slapping on "they're [age of consent -1] years old", nor vice versa. Australia's laws judging it based on looks only caused more issues in trying to rectify that. This is solved easily for real people based on identification, but fiction lacks that.

        There's no good answer, and it feels frivolous to litigate every single edge case in court. So I think ultimately the best approach is to come at it from the perspective of protecting victims, not punishing potential pedophiles. That means there will be objectionable fictitious content out there that runs free but isn't depicting a real person.

        5 votes
      3. itdepends

        The difference is that CP either has very real, very affected victims or, in the case of fictionally created CP, (as I understand it) promotes and normalizes the sexualization of minors, which is a problem in itself.

        This does not apply to adults. A deepfaked image of an adult that does not get shared has no victim, and sexual attraction between adults is obviously normalized.

        The problem, which is also a problem concerning CP, is that the genie is out of the bottle. There's software that can quickly and easily create deepfakes, just like there's software that can easily produce whichever image you like. Age and NSFW detection filters are just lines of code that can easily be stripped out.

        IMHO the only realistic way to deal with this is to deal with distribution, like you deal with, say, revenge porn. You don't ban videos of consensual sexual activity; you ban their non-consensual distribution.

        4 votes
  3. [6]
    pallas

    This may sound a bit ridiculous, but, in many cases, don't these images constitute copyright infringement? I don't mean that they infringe on copyright as a result of an underlying machine learning process involving copyrighted material more generally. Rather, it sounds like many of these images, if not most, use direct copies of parts of recognizable images of the targets, especially images of faces, and have the sexually-explicit parts generated¹. Especially in this case, unless someone is specifically training a system to generate specific individuals (in which case they're likely a lucrative target for civil suits!), if the point is to make sexually-explicit, clearly recognizable images of people who are not public figures, then it seems like the only way to do so would be for the system to incorporate copies of images for the recognizable portions. The alternative, holding that a system which takes specific images and generates one that is essentially and recognizably the same does not violate copyright, would seem to explode copyright of photographs entirely.

    Just because someone posts an image online, on social media or elsewhere, does not mean that it can be freely used by someone else, somewhere completely different, or that copyright somehow does not apply. This seems to be a frequent point of confusion, including for many of the people in this article. Posting a photograph on Facebook, for example, involves licensing the use of that photograph to Facebook for display on Facebook, and probably for other nefarious purposes by Meta, but certainly would not seem to license it for sexually-explicit display on an unrelated attack site by a person with no connection to Meta. And this is clearly not fair use. So why not go after the copyright violation? Why not DMCA-notice the site, and the provider?

    It's also a bit confusing that no one seems to have tried to obtain a restraining order in this case? I've seen them given for far, far less, and with so many people to watch and document violations, it seems like it could have been an effective way of preventing the behavior from continuing, unless restraining orders are very different in these states than in California.

    ¹ Actually, it seems like the case discussed largely didn't involve machine learning at all, but instead involved photoshopped combinations of multiple images. The text of the proposed bill mentioned in the article doesn't appear to publicly exist yet, but I'm a bit confused as to what it would say. If it bans "digital manipulation" of sexually-explicit images generally (without considering consent), it's not clear how, without being very carefully written, it could cover these images without also covering essentially all sexually-explicit digital photos (and thus being clearly unconstitutional).

    3 votes
    1. [2]
      itdepends

      if the point is to make sexually-explicit, clearly recognizable images of people who are not public figures, then it seems like the only way to do so would be for the system to incorporate copies of images for the recognizable portions

      The system will use the original image as a reference to generate a brand new image based on the reference. It's the argument over AI art all over again, it does not copy-paste, it "learns" and generates.
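
      To make that concrete, here is a minimal sketch of reference-guided generation ("img2img"), assuming the Hugging Face diffusers library; the model checkpoint and file names are illustrative. The pipeline perturbs the reference image with noise and then synthesizes a new image under a text prompt, so no pixels are copy-pasted from the original.

          # Sketch of img2img generation with Hugging Face diffusers.
          # Assumptions: a CUDA GPU and an illustrative Stable Diffusion checkpoint.
          import torch
          from diffusers import StableDiffusionImg2ImgPipeline
          from PIL import Image

          pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
              "runwayml/stable-diffusion-v1-5",  # illustrative model name
              torch_dtype=torch.float16,
          ).to("cuda")

          reference = Image.open("reference.png").convert("RGB")  # hypothetical input file

          result = pipe(
              prompt="an oil painting of the same scene",
              image=reference,            # the reference guides, but is not copied
              strength=0.75,              # how far the output may drift from the reference
              guidance_scale=7.5,         # how strongly the prompt steers generation
          ).images[0]
          result.save("generated.png")    # a newly synthesized image

      Note the strength parameter: at low values the output stays close to the reference, which is exactly where the "it's a new image" argument gets murky.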

      2 votes
      1. pallas

        I think there are distinctions here beyond just the argument over AI art. The copyright considerations of an image being used as part of a large amount of training data, and present only in its effect on model weights, may be different than that of an image used directly as input, or as part of a very small amount of training data. At some level this must be the case: it would be trivial to construct a neural network, for example, that would just copy its input (or trivially fiddle with it), or exactly output one of its small number of training images, or a portion of one.
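
        To make the triviality concrete, a toy PyTorch sketch (all names here are hypothetical, purely for illustration):

            # Two degenerate "generative models": one copies its input verbatim,
            # the other ignores its input and emits a single stored training image.
            import torch
            import torch.nn as nn

            class IdentityNet(nn.Module):
                def forward(self, x):
                    return x  # output is an exact copy of the input

            class MemorizerNet(nn.Module):
                def __init__(self, training_image: torch.Tensor):
                    super().__init__()
                    # The training image is baked directly into the model's parameters.
                    self.stored = nn.Parameter(training_image.clone(), requires_grad=False)

                def forward(self, x):
                    return self.stored  # memorized image, regardless of input

            photo = torch.rand(3, 256, 256)  # stand-in for a copyrighted photograph
            net = MemorizerNet(photo)
            assert torch.equal(net(torch.rand(3, 256, 256)), photo)  # verbatim reproduction

        Both are technically neural networks that "generate" output, yet their output is plainly a copy, which is why "it's a model, not a copy-paste" can't be the whole analysis.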

        This may not be the case for generating deepfakes of public figures. It is also possible to envision a process that would at least somewhat turn it into the more general AI art argument for private figures. That might make sense for a well-resourced attacker trying to create political deepfakes, but this problem seems like it mostly involves poorly-resourced attackers.

    2. [3]
      sparksbet

      Rather, it sounds like many of these images, if not most, use direct copies of parts of recognizable images of the targets, especially images of faces, and have the sexually-explicit parts generated¹.

      This wouldn't be copyright, at least not the subject's copyright. It could potentially be a violation of the photographer's copyright, but I think that would be a tough argument to make (and wouldn't give the subject the right to sue).

      I think unauthorized use of one's name/likeness (aka personality rights) is going to be more relevant here, but those laws vary a lot state by state in the US. And afaik they only apply when something is used in a commercial context, which wouldn't protect from deepfake revenge porn at all.

      1 vote
      1. [2]
        pallas

        This wouldn't be copyright, at least not the subject's copyright.

        I was referring to the photographer's copyright here, not the subject's, sorry: I should have added that it sounds like many of the images being used in these are either self-taken, or taken in situations where the photographer is closely linked (e.g., a friend/relative) or would be willing/motivated to go after the attacker.

        1 vote
        1. sparksbet

          Ah alright, that makes sense. It's definitely still a bit murky (although I do think it's easier to argue in court than, say, having your work used to train a proper generative model), but that definitely makes more sense copyright-wise.

  4. Arminius

    Other countries also struggle with this.
    Here is an example where the prosecution is about violation of sexual autonomy and bodily integrity.

    2 votes