44 votes

Instagram's Nudify [non-consensual fake nude photo generator] ads

60 comments

  1. [41]
    Wolf_359
    (edited )
    Link
    • Exemplary

    The genie is out of the bottle. Obviously it shouldn't be advertised any more than you see regular porn sites advertised, but I think we will have to accept that this is just a thing now.

    It may actually be a golden opportunity too. When we get to the point that you can't tell the difference between real and AI-generated nudes, every nude is suspect. Revenge porn won't matter, people won't lose jobs over images, etc.

    I'm not sure this will end up being as bad a thing as people think. And perhaps I'm being naive - extremely possible - but I also think AI nudes won't be as damaging to any one person's mental health if they're pretty much everywhere all the time and happening to everyone.

    Not an advocate for it, but it may be one of those counterintuitive things that ends up working out for the better.

    62 votes
    1. [6]
      winther
      (edited )
      Link Parent

      It is an interesting notion, but I am not sure it will turn out that way. At the least, it might take generations for that sort of cultural shift. Since the invention of language, it has been possible to spread lies about people, and for most people who have had lies or rumors spread about them, be it online or among their peers, it still hurts and is not something you can easily disregard just by telling yourself that anyone can say something untrue about you. Now we have the ability to tell lies about people using images as well. I don't see how those will be any easier to shrug off or ignore.

      27 votes
      1. [5]
        Wolf_359
        (edited )
        Link Parent

        We rebuilt society around the smartphone within 10 years of its invention.

        There have been some major pros and cons with this. But society can move quickly when there is a new technology.

        I think custom pornography is the future whether we like it or not. There will be an option to have VR porn generated of anyone you want, anytime. It will just be assumed that everyone has a virtual world where they are living out fantasies. It'll be normal, and you may even see research saying it has health benefits with regard to lowering stress, but can be addictive for depressed people. We already saw this with video games and porn videos. We see it with marijuana and alcohol use.

        At that point, I wonder how much stuff even gets shared online. Why search online when you can make your own little fantasy world with whoever and whatever you want? I think the dynamic will shift and it'll be far more embarrassing for the person generating the porn when/if it leaks.

        I can even see some scenario in which people use it as a bragging right. It sounds insane to us right now, but we could never have predicted the narcissistic tendencies social media would shine a light on either. "Likes" are so validating to us. I can see a world where people are gladly sharing more accurate body scans and loving the dopamine hit they get when their download count goes up by one. Is it someone at work? Is it someone I also downloaded?

        Does this have a name? In the unlikely event that this plays out, can I call it the Wolf paradox? It'll be when something shameful and embarrassing reaches mass adoption and becomes empowering or enjoyable.

        Again, I could be seriously underestimating the damage here. I just think this has potential to be good. Especially for women (or men) who have had nudes ruin their lives.

        22 votes
        1. [3]
          winther
          Link Parent

          Yes, the smartphone changed a lot of things, but many things dealing with basic human emotions are still the same. There was bullying before the smartphone; now there is bullying online instead. It still hurts to be on the receiving end of it. Before, you could start a false rumor at a school that X slept with Y. Now you can create fake images or videos of the same. I don't really see how AI technology is going to make things easier for victims of those kinds of lies.

          9 votes
          1. [2]
            Wolf_359
            (edited )
            Link Parent

            You could be right. And I think I edited to add more to my comment before you replied. But I wonder if the bigger point of bullying will be "Jon made this fake video of him sleeping with Jennifer. What a loser" instead of what we have now, which is "Did Jennifer really sleep with Jon? What a [derogatory name]"

            8 votes
            1. winther
              Link Parent

              Yeah, one can hope it will turn out that way. Perhaps it will, depending on various circumstances like the social status of whoever is involved. And that is only if it happens within a known group of people. If you randomly find fakes of yourself, or random strangers approach you because they have seen fake material of you, that is going to feel very creepy and uncomfortable regardless.

              With regard to your comment about what people keep fully private, that might be different. It is one thing to secretly fantasize about classmates while masturbating; it is another to go around telling everyone about your fantasies. The same could apply to having private AI-generated material. Though I somehow doubt it is going to stay private in many cases, and just being on the receiving end, knowing that someone wanted to make and distribute AI-generated material of you... It is going to take a massive mental shift for that not to be uncomfortable for many people.

              4 votes
        2. karim
          Link Parent

          Ninja edit: with regards to your final point, I generally agree with you


          health benefits with regard to lowering stress

          I'll bet my neck and arms it'll be the opposite.

          but can be addictive for depressed people.

          It'll be so much worse.

          These things will be a mental health nightmare. A large part of our mental health issues comes from isolation, and even though we can trick our brains into thinking we are connecting with people, deep down our subconscious and our bodies know we aren't connecting.

          Justifications for my conclusion:

          • Instant messaging didn't make us feel connected; a lot of people around me (anecdotal, I know) feel lonely even if they spend the entire day messaging.

          • Same goes for phone calls and video calls

          We need touch, we need another person's touch to feel connected. We are not our brains; our brains are just one part of us. We are still our arms, legs, chests, and fingers. We need to smell other people, see them, hear them, touch them, and live with them. That's what we evolved to do, and we can't trick millions of years of evolution in the span of two generations.

          I believe that the majority of humans currently live in habitats not suited for human beings. We live in concrete jungles, surrounded by constant noise from traffic, horrible air quality due to cars, and an extreme lack of greenery/plant life. Our foods are increasingly processed and "fake", and filled with too much sugar.

          Technology is advancing far too quickly for our brains, and we have barely begun to grasp the drawbacks of these technological advancements. To make things worse, ad-serving corpos want these techs to be more than tools; they want our entire lives to revolve around them. Imagine a city square filled with ad billboards, but now everywhere in our lives, and at every moment.


          Boy that was a long rant.

          I believe our tools need to remain just that: tools. The convenience of modern tech is just that: a convenience. It isn't necessary for happiness and fulfillment. We were happy before tech, and we can remain happy without it.

          8 votes
    2. [9]
      papasquat
      Link Parent

      This is genuinely one of those things that makes me feel like I'm from another planet, or have some sort of mental illness or something, because I genuinely cannot empathize with people who feel that having AI generated naked images of them is somehow violating.

      Maybe it's because I'm a man? At the same time, I've had women who were interested in me whom I wasn't interested in. If I found out they had downloaded an Instagram picture of me, generated a nude image, and used that for sexual gratification, it wouldn't bother me even slightly. It feels exactly the same to me as if someone was fantasizing about me without the aid of technology. If that photo actually got out, maybe it would be very slightly embarrassing if some number of people assumed it was real, but at this point, most people are aware that AI image generation tools like this exist, so that potential number gets lower every month.

      I know a lot of people describe this as a gross violation using the strongest possible wording, but I honestly feel like I'm taking crazy pills here, because I'd rather have someone generate a private AI nude of me than endure basically any minor inconvenience. Stubbing my toe, dropping my phone, forgetting to plan dinner, or having someone cut me off on the road would all bother me more.

      Maybe it would be different if I were an attractive woman, and was constantly objectified? I don't know, it's hard to put myself in that state of mind.

      I feel like I'm the only one I ever hear with this position and it's starting to make me feel like there's actually something wrong with me.

      23 votes
      1. GenuinelyCrooked
        (edited )
        Link Parent

        I think the first error is that you're underestimating how many people would believe that it was real, or act as though it was, for the purposes of shaming someone or using it against them. It's been possible to create realistic fake nudes of people for decades; it was just more time-consuming before. In those decades there hasn't been a shift toward assuming that nudes shared of someone are fake.

        Even if this makes the assumption of fakeness drastically more likely, the women most likely to benefit from that are women who comport themselves very chastely. We're already constantly policing our own sexuality and vigilant about how others perceive it. This is just another threat to hold over us: "I'd better not act like a slut, or when someone inevitably shares deepfakes of me, everyone will believe they're real."

        And imagine proving that they're not real! The model is likely to be trained on bathing suit photos, so in order to actually show that they're fake, you'd have to show everyone your nipples or genitals, so it would be violating either way.

        It feels exactly the same to me as if someone was fantasizing about me without the aid of technology.

        This is actually more uncomfortable than you might think! Frequency and start date are pretty huge factors here. As a girl, you become aware sometime between 11 and 13 that a huge proportion of the people you encounter are doing this, and you are constantly reminded at least through your 20s. This is not remotely comparable to having it happen once or twice, or even a handful of times, as an adult. It twists the way that you see and think about yourself in a way that I'm not sure it's possible to heal from.

        These pictures are an extension of that, yes, but they are also worse. Someone who hadn't thought about you like that up until that moment might come across your photos while they're already in that mindset, which is going to prompt those thoughts. They're tangible and transmissible.

        I definitely think being a man is a massive factor in your emotional response to something like this.

        I say all of this, by the way, as a person who has shared nudes and wouldn't be all that devastated if they got out. Those were something I chose to do, and something that I still largely have control over. This software takes that control away. That's so much more violating. I would rather my real nudes got out than fake ones any day.

        19 votes
      2. teaearlgraycold
        Link Parent

        I can empathize with someone if they're insecure and the images are being shared with others for the purpose of bullying them. I'm sure that's where most of the horror stories will come from. A teenager (boy or girl) will have fake underage nudes made of them. Everyone will know it's fake, but they'll pretend it isn't and will tease them endlessly about what they supposedly did in the photo. That will lead to suicides, given how widespread it will soon be.

        And on the other hand, I can empathize with people who find out their friend made fake pornography of them (this sort of thing happened with a Twitch streamer recently). Fundamentally it's not a new story: friend A finds out that friend B has been intensely attracted to them this whole time. But given that these are digital artifacts, they're much easier to find out about than a dream someone had.

        14 votes
      3. winther
        Link Parent

        It is one thing not to feel it would impact you personally. I get that; I personally don't feel overly concerned, and my reaction would probably mostly be puzzlement if it happened. However, I have read enough stories from women who have been on the receiving end of this sort of thing to know that it is definitely not so easy to disregard. For starters, here is an article about someone who was a victim of deepfake porn.

        There is a tendency on these kinds of online forums, with plenty of tech-savvy users, to think this will just be normalized at some point since there will be fakes of everyone. It is possible we might reach that state some day, but we should at least acknowledge the damage it is doing right now and will do for many years. There are still millions of people who will be easily fooled by this, and especially when it comes to children, we can't really expect them to easily shrug such a thing off as no big deal. There are also tons of social factors that weigh into how this will affect people: if you have previously been a victim of sexual abuse, if your social status is low, or if you have a certain reputation, the damage will be different.

        There probably is also a pretty big gender disparity at work here. I don't have a reference link at hand, but I saw a Norwegian documentary series where a couple volunteered to have deepfake porn made of them, to be shown only to them, to get their perspective on it. The man reacted mostly with puzzlement and could laugh it off as somewhat silly, whereas the woman was visibly, deeply uncomfortable - even though this was created with their full consent and not shown to anyone else.

        Yes, if people just create material privately with their own locally run AI model, then it isn't much different from private fantasies, but it is unrealistic to think it is going to stay that way. It will be used for harassment. We know that victims of revenge porn experience online stalking and harassment even years later if they are recognized. Sure, there is a big difference between real and fake revenge porn, but I think it is hard to deny that this is going to get a whole lot worse before it could turn around for the better. And it is anyone's guess how long that period will be.

        12 votes
      4. Wolf_359
        Link Parent

        As a man, I tend to agree with you. I don't care and would care even less if I never found out.

        Personally, I realize that not everything in a person's fantasy life is something they actually want to do. I think we all have thousands of sexual and non-sexual fantasies we would never act out in real life. And even for the fantasies we would consider making a reality, I don't know that a thought necessitates an action for most people. We can want things without pursuing them.

        Anyway, I suppose it's different if you're a woman and men are a physical threat to you. Especially when so many women report non-stop creepiness from tons of the men they deal with.

        Still, I don't see this going away and I think there is an argument to be made that AI porn will give women more breathing room in the real world in various ways.

        5 votes
      5. [3]
        shu
        (edited )
        Link Parent

        because I genuinely cannot empathize with people who feel that having AI generated naked images of them is somehow violating

        Would that change if the generated images showed you involved in scenes you don't agree with?

        No kinkshaming, but I think it's fair to assume that many people wouldn't be comfortable with realistic generated images showing them involved in scenes that they can't identify with. (Maybe certain fetishes (e.g. in diapers with a pacifier), or sexually involved with a partner they don't find attractive, or in a sexual practice they don't agree with, or generated with a body they don't identify with, etc.)

        Or would that still not bother you?

        e: I just ask, because I also think that generated nude pics of myself wouldn't be a big problem for me, but I can definitely think of generated pics I wouldn't want to star in (given the hypothetical that these pictures would be shared online).

        3 votes
        1. [2]
          papasquat
          Link Parent

          Honestly, no. Them being posted all over my workplace or sent to tons of people I know would be irritating, but I think that falls more into the realm of harassment than what we're talking about here.

          As far as just having the images generated for private use, given the option of having it happen and not having it happen, I'd obviously prefer it not to happen, but at the same time, it really wouldn't be a big deal for me.

          3 votes
          1. GenuinelyCrooked
            Link Parent

            I think the most likely scenario is that they'll be posted to a site like reddit or Twitter/X or whatever the next site of that ilk is, and there will always be a chance that someone you know will see it and believe that it's real, without you even knowing that they did. That's a lot of what happens with revenge porn now.

            I think the scenario where they stay private and hidden on someone's personal device is unrealistically optimistic. That might happen a lot, maybe even most of the time, but there will still be a massive deluge of images that do not stay private.

            3 votes
      6. Acorn_CK
        Link Parent

        Honestly, I'm in the same boat as you; you're not the only one.

        As far as I'm concerned, if it's done on a personal computer and not shared, it is for all intents and purposes just computer-enhanced imagination. As you said -- people already fantasize about others all the time. It just makes the fantasies a little more realistic, I suppose.

        Sharing those photos anywhere -- now that, I can see why people would feel violated, as it is bringing the fantasy into the real world in a way that could potentially harm them. Many also consider it to be a violation of privacy / bodily autonomy; I assume this is due to the general accuracy of the tools, as I'm sure many consider it essentially taking a nude photo of someone without their consent.

        1 vote
    3. [22]
      DanBC
      Link Parent

      You can bet that if the models had been trained on men, and the images released were of male politicians, laws would have been passed rapidly.

      This feels like Yet Another Thing that men don't take seriously because it's mostly not happening to them.

      15 votes
      1. [21]
        Wolf_359
        (edited )
        Link Parent

        I definitely welcome a woman's perspective on this. My thought is that this could be pretty empowering for women and human sexuality in general.

        I acknowledge that, unless/until AI porn is so ubiquitous that it's a non-issue, there will be a pretty shitty transition phase.

        I still think it's happening no matter what and we may as well roll with the punches and turn it into something "good." There is just no way to stop this now that the tech can be run locally and extremely cheaply.

        Would any women chime in on this? I think those opinions are pretty valuable since this will, at least initially, affect them most.

        4 votes
        1. [3]
          Vito
          Link Parent

          As a woman, I feel like the reason it's more devastating for women is that we are socially shamed for our sexuality while men are praised. At least that was true while I was growing up as a millennial. A boy who hooked up a lot was a stud; a girl was a slut, and everyone would bully her. I distinctly remember boys talking about masturbating and myself having to pretend that I had never done it, along with all the girls at my school.
          On a different note, I'm a teacher and I teach teenagers. If one of them wants to think about me while masturbating, as long as they don't tell me, go nuts. But if fake porn with one of my students was leaked, I'd probably lose my job and become unhirable, just in case.

          14 votes
          1. [2]
            GenuinelyCrooked
            (edited )
            Link Parent

            Can you imagine proving to your administrators - or worse, a new prospective employer - that porn with a student was faked?

            "You can tell that's not me because the areola is the wrong color. Let me just pull out my actual nipple in your office for proof."

            Edit: This is actually a business opportunity. You offer tattoos to teachers and other people especially vulnerable to this sort of thing, along with little band-aids or stickers that they can use to keep them covered. The tattoo artist will also take one film photo of the tattoo. The tattoo can be of anything; some small symbol is best. As long as you never allow that tattoo to be photographed aside from that hard copy that only the tattoo artist has, you'll have proof that the deepfakes are fake.

            "See? I have this tiny green heart on the back of my knee. That's not in these photos."

            13 votes
            1. MimicSquid
              Link Parent

              The fact that we may need private verification marks to secure our images is one of those hyper-dystopian ideas that somehow makes perfect sense in the current zeitgeist.

              10 votes
        2. [13]
          DefinitelyNotAFae
          Link Parent

          We just had an article about how the threatened exposure of explicit images is leading to suicide in young men targeted by scammers. So when that scammer just needs a public picture of you off social media instead of conning an 18-year-old into sending dick pics...

          There is no world where this doesn't make that worse for every gender. And "shitty transition period" is a horrible understatement.

          Maybe, if women were in control of the AI tools, and the media coverage of non-consensual use of them, and the churches which would call them whores and the family units that would beat someone or kick someone out for it, and the prosecutors that will say it isn't a big deal, maybe it could be empowering. But there's nothing empowering about other people making you more vulnerable for their pleasure.

          And the AI tools were not trained by women to make empowering fantasies for themselves. It's bullshit to pretend that fake nude photos, generated by models probably trained on porn, are what women themselves would find empowering.

          My "womanly" opinion is to ban this software and prosecute every person that shares a photo of someone else edited to be nude non-consensually under revenge porn laws. And every picture of a minor under CSAM laws. We can absolutely say this isn't ok as a society. Until that happens, there is no world this is empowering to anyone except those willing to exploit others. I'd prefer my college students not to try to kill themselves.

          14 votes
          1. [12]
            Wolf_359
            (edited )
            Link Parent

            Thanks for sharing that. I completely see your point and I think you're probably right that it's going to be worse than just a "shitty transition." Sorry for being blasé about that.

            I guess my question is, do you think those situations stop happening when AI nudes become so ubiquitous that they are essentially worthless? In your examples, you're using our current situation, which is one where people can still be humiliated and fired for these types of photos. I'm imagining a world in which they have essentially no value because everyone sees them as probably fake. One where nobody can be scammed or humiliated because nobody gives a shit about yet another fake photo. Who doesn't have several out there? Who even cares to see them when anyone could make anything at any time?

            I don't think these tools are able to be banned either. Will be about as useful as banning alcohol or marijuana or pirated music. Never going to happen. With alcohol, marijuana, and free music, the solution was to make them easily available. That was the least damaging way to deal with them.

            By the way, I used to make my living working in a facility that housed pedos. I will never get over my disgust toward them but I am over my knee-jerk "kill them all" instinct. I am a person who wants to prevent them from offending at all costs rather than punish them after they've already offended and caused damage. I would love to see studies done on whether fake CSAM gives them an outlet and stops them from offending. If it does... well, I guess let them have the fake stuff. Yes, it makes me want to gag, but it may be better than what we're doing now, which isn't working.

            Anyway, I do see your point and I value your perspective. I hope you understand I'm not an advocate for AI porn. I don't like it. But I don't think it's going anywhere no matter what I think and I'm just wondering if it won't be as much of a disaster as we worry about.

            3 votes
            1. [8]
              DefinitelyNotAFae
              (edited )
              Link Parent

              I guess my question is, do you think those situations stop happening when AI nudes become so ubiquitous that they are essentially worthless? In your examples, you're using our current situation, which is one where people can still be humiliated and fired for these types of photos. I'm imagining a world in which they have essentially no value because everyone sees them as probably fake. One where nobody can be scammed or humiliated because nobody gives a shit about yet another fake photo. Who doesn't have several out there? Who even cares to see them when anyone could make anything at any time?

              I read a lot of sci-fi, so yeah, I can imagine this in a sex-positive or sex-neutral society where we don't shame people for their perceived sexuality or accuse children of seducing adults. But that world hasn't happened yet and doesn't seem likely any time soon. And it's not the world we live in.

              I don't think these tools are able to be banned either. Will be about as useful as banning alcohol or marijuana or pirated music. Never going to happen. With alcohol, marijuana, and free music, the solution was to make them easily available. That was the least damaging way to deal with them.

              Banning the software is easy; you can't get rid of it, but you can ban it. Criminalizing the posting of non-consensual AI pictures in the same way that non-consensual photos are already criminalized is also fairly easy. Neither infringes particularly on anyone's rights - they are usually violating copyright anyway. I agree that our current society seems to lack the will to do that. But that's once again why this isn't "no big deal". A society willing to allow women, teens, young adults, and children to be victimized isn't one that's gonna be sex-positive. It's going to tell women to stop feeling so victimized.

              By the way, I used to make my living working in a facility that housed pedos. I will never get over my disgust toward them but I am over my knee-jerk "kill them all" instinct. I am a person who wants to prevent them from offending at all costs rather than punish them after they've already offended and caused damage. I would love to see studies done on whether fake CSAM gives them an outlet and stops them from offending. If it does... well, I guess let them have the fake stuff. Yes, it makes me want to gag, but it may be better than what we're doing now, which isn't working.

              I worked with juvenile offenders and victims for a short while. Most offenders aren't oriented pedophiles; they're heterosexual men or boys (I loathe "pedos" as a shorthand). This sort of app, however, is not a victimless crime, as real people's photos are being used. If fake CSAM were a viable option, this wouldn't be it. There are studies on child sex abusers; I don't keep up with them as I don't do the work, but you can look them up. You also don't need to convince me of your opposition to this with a visceral response. My work didn't make me gag. That doesn't change that CSA is horrible.

              Anyway, using a real photo of a child should make it covered by CSAM laws, just as a real photo doesn't have to be nude to count; it's about the intent of the image.

              Anyway, I do see your point and I value your perspective. I hope you understand I'm not an advocate for AI porn. I don't like it. But I don't think it's going anywhere no matter what I think and I'm just wondering if it won't be as much of a disaster as we worry about.

              It's a nice dream-world hypothetical to imagine we will all get over our shit as a society. But how many suicides in the meantime? The answer is always far too many. And I'd rather first address, as a society, what makes people think it's ok to violate others' consent. Because that'll actually make change, and if we're living in a dream world, one where people respect each other is better than one where everyone has non-consensual porn made of them.

              6 votes
              1. [7]
                Wolf_359
                (edited )
                Link Parent

                I worked with them every day and I frequently felt a visceral disgust at some of the things they would do. Even in a facility they would stop at nothing to glimpse a child in person or on television. I treated them kindly and compassionately. I know they're sick and my job wasn't to judge or punish. But I did feel disgust. My intention was to portray to you that I, like many others, feel that way about them. However, I am able to ignore that and look toward potentially productive solutions rather than purely punitive ones. It was not to convince you of my opposition to pedophilia.

                When working with them for that long, I definitely got used to calling them pedos for short. You don't have to call them that if you don't like it.

                I feel that some of the responses to me have been fairly confrontational and people are getting angry with me. Some of what is being said has been condescending, including some of your reply. I realize this is an emotionally charged issue but I think and hope my responses have been respectful to everyone. I wish it was being reciprocated more.

                3 votes
                1. [3]
                  GenuinelyCrooked
                  Link Parent

                  I read through the other responses to you again and nothing stuck out to me as particularly confrontational or condescending. That isn't to dispute that you felt that way, only that if I can't determine what made you feel that way then I can't rule out that my comments did as well. I hope that they weren't perceived that way. They weren't written with that intent at all.

                  I hope the perception that people are getting angry with you doesn't make you feel the need to disengage from the conversation. I'm still curious about the answers to my questions (what do we do when we do need to determine whether photos are authentic, if law enforcement is not getting involved), which I don't see addressed anywhere by anyone (not just you) who finds themselves confused about why photos like these would be upsetting.

                  4 votes
                  1. [2]
                    Wolf_359
                    Link Parent

                    I don't have an answer on what we should do to prove whether a photo is real or not. I suppose forensics would come into play at that point, as it would with Photoshop now.

                    I don't think my point about AI porn being empowering really relates to that anyway. It exists already and always will, legal or not. If someone is determined to make it, they can already. We already have to decide if it's real or not. If we make it illegal, we will still have black market sources and will still have to make that determination.

                    Letting it run its course and adapting, seeing if it has the potential to be a "good" thing - these won't have any bearing on a problem that already exists unfortunately.

                    As for the comments, it was mostly the one comment I had replied to but there were a couple lines here and there from others that felt, at least when I read them, like the style of argument I sometimes see on Reddit. Maybe not quite so bad upon rereading and giving more benefit of the doubt to the commenters. Anyway, I will assume everyone is here sharing ideas in good faith. Thank you all for the insightful replies. Men who identify as men can't experience womanhood, so this issue is vastly different to us and I recognize that. I'm hopeful that AI porn won't be as bad as we think. I'm optimistic that there will be benefits. But I don't know. And if I'm wrong and you all are correct, then yes it will be a disaster of epic proportions.

                    3 votes
                    1. GenuinelyCrooked
                      Link Parent

                      Simply presuming everything is fake and ignoring it removes a pretty important mechanism for genuinely bad actors to get caught. If actual pictures of a teacher taking advantage of a student or a doctor taking advantage of a patient surface with no investigation, that's a problem. If fake pictures surface and there's an investigation that victimizes the doctor or teacher, that's a problem. Having the process be illegal means that law enforcement can investigate by searching for the source of the photo and potentially examining the machine on which it was created. If a doctor is sleeping with adult patients, that could be extremely harmful and unethical but not illegal, meaning that if their employers do investigate, it will necessarily victimize the doctor. If they don't, patients may continue being harmed.

                      4 votes
                2. [2]
                  Vito
                  Link Parent

                  On my part, as one of the people who answered, I also hope I didn't come off as condescending; it was honestly not my intention.
                  I hope you don't think I was exaggerating about my experience at my school as a teenager. Unless, out of the 250 girls I knew, I was the only one lying about not masturbating because I was the only one actually doing it, haha.
                  About the possibility of me getting fired in the hypothetical of this happening with a student, I can only speculate. I do think there is a possibility that my bosses would believe me that it's fake; being a woman would probably benefit me in this circumstance, and I have earned their trust. But it's also possible that lawyers would advise them to fire me or that the public in general would demand it. I just really hope this remains a hypothetical. It probably won't for some of my colleagues.

                  4 votes
                  1. Wolf_359
                    Link Parent

                    I'm also a teacher, and as a man, AI porn scares the shit out of me in this context. I am literally never alone with a student for even a second, because I'm fully aware my life would be over if there was even a hint of something untoward occurring.

                    But again, I'm imagining a world where this is such a common occurrence that people are given the benefit of the doubt on this. If anyone is going to use AI to make absurd and horrible videos, it's going to be students. Students are monsters lol. I picture a world where AI videos are as common as memes and nobody even takes them seriously - one where they're hardly worth sharing and certainly not worth acting on. They're so ubiquitous as to be worthless.

                    Maybe it'll never happen, but if they do become common I don't see how society continues to function without some verification method or just agreeing that video is suspect until proven otherwise. We won't be firing every person in the country who has a fake video made of them. It won't be possible.

                    Stopping a new technology has literally never worked in human history. Progress, good or evil, finds a way. I genuinely think that our best bet is to find a way to live with this. There is no stopping it.

                    3 votes
                3. DefinitelyNotAFae
                  Link Parent

                  My intent is not to be condescending, and I'm sorry if it came across that way; I'm not angry with you. I wanted to make it clear you didn't need to convince me you were disgusted with pedophilia or child abuse. At the same time, I personally avoid dehumanizing language like "pedo" because offenders aren't some "other"; they are people making choices, and we have to deal with them as people.

                  Emotion is, however, part of having these conversations, because it is the emotional impact that is so incredibly visceral. The emotional impact of having your privacy violated in this way - having some of your autonomy stripped from you, not knowing who has seen the pics, whether everyone is talking about you, and when it will happen again - is immense. It leads to some people taking their own life, to years of PTSD for others, to people moving, dropping out of school, etc.

                  It would be amazing to reduce that stigma and put the blame on perpetrators, but enabling them and even granting permission to them is not the way to do that. I understand you were postulating a world where it didn't matter if a nude picture was made of you without your consent because everyone has had it happen. I think that is not a "better" world and not one I want to work towards. It's better if no one shames victims of sexual assault either, but letting assaults continue and just stopping the shaming wouldn't be the correct answer.

                  I also cannot truly imagine the near future being that world given our current society, so no, I don't think it's even a realistic option. Hence "dream world"

                  I don't think it's too late to address this technology if there's a will to do it. And IMO the lack of a societal will is the tacit acceptance of (primarily) men being granted access to, authority over and permission to violate (primarily) women's bodies and even their images. Of course men are harmed by all of this too, hence the article about young men dying by suicide over threatened nude photo exposure. I'd rather work towards a world of people respecting the consent and boundaries of each other, regardless of gender and regardless of whether they make consensual nude photos.

                  3 votes
            2. GenuinelyCrooked
              Link Parent

              I'm imagining a world in which they have essentially no value because everyone sees them as probably fake.

              How far off do you expect the tipping point for this to be, and why don't you think we've moved in that direction at all, despite Photoshop being able to perform a similar function for decades?

              If there's a situation where someone does need to prove that their nudes are fake (for example, images of a teacher involved with a student, or any adult involved with a minor), how could they go about proving it without being further victimized? Software designed to detect AI is already garbage and will likely get worse as AI gets better.

              5 votes
            3. [2]
              winther
              Link Parent

              I don't think these tools are able to be banned either. Will be about as useful as banning alcohol or marijuana or pirated music. Never going to happen. With alcohol, marijuana, and free music, the solution was to make them easily available. That was the least damaging way to deal with them.

              There is a difference between banning tools and banning certain uses of them. All laws get broken, but that doesn't mean every law is useless; laws do limit things we generally don't tolerate. Security testers and hackers use the same set of software tools, but there is a pretty clear line between illegal and legal use. It is one that is very applicable here as well: whether there is consent involved. Do you have consent to mess around with this computer system, or do you have consent to use images of this person?

              5 votes
              1. GenuinelyCrooked
                Link Parent

                Also, if it's banned and photos like this surface, law enforcement will have resources to trace the source of the photos and might be able to determine authenticity by examining the creator's computer, rather than the burden being on the person victimized by the photos in the event that they need to prove the photos are fake for employment reasons or something.

                Whether law enforcement would actually use those resources, or if they would treat it like they do sexual assaults now, is obviously a separate can of worms.

                4 votes
        3. GenuinelyCrooked
          Link Parent

          I think you're right that it's inevitable, and I do hope that one day it will lead to a default assumption that any nudes that don't have the subject's explicit endorsement will be considered fake, but I don't have the faith you have that it will be empowering. I think the best case scenario is that once that ubiquity is reached it won't be so disempowering.

          It's kind of difficult for me to articulate why this would feel worse than someone I'm uncomfortable with fantasizing about me. If I was asked why it would be worse to be stabbed than to be punched I'd say "it hurts worse, and it's doing more damage to my body". That seems obvious, and like a complete answer. But in this case, if I simply say "it hurts worse, and it's doing more damage to my feelings", that seems insufficient to the point of not being an answer at all. I wrote more in another comment further up about the experiences that cause women to feel very raw about things like this. I'm happy to elaborate further but I don't want to be redundant.

          8 votes
        4. [3]
          chocobean
          Link Parent

          Older thread, additional perspective:

          Aside from social stigma, there is also the danger of encouraging sexual assault directly.

          To be perceived as sexually available is dangerous to many women: the language used to describe women's bodies, and what counts as acceptable behavior towards them, differs based on how "deserving" or "open" the women are perceived to be.

          We also know that people's opinions change based on images they have seen, even when they are specifically told the images aren't real.

          Some will choose to believe the women depicted are really consenting to violence and assault based on what they see, and pluck up their courage to attack women when they otherwise might not have.

          6 votes
          1. [2]
            Wolf_359
            Link Parent

            I'd be curious to see whether there is potential to reduce sexual crimes by providing AI porn as an alternative outlet for otherwise inaccessible fantasies and desires, violent or not. In other words, can we reduce these crimes by giving people a less harmful outlet? I could see this being the case. Sexually open societies with easy access to porn tend to be less misogynistic and less violent than sexually repressed societies. Compare the USA to India. Compare the UK to the Middle East.

            If it does reduce sexual crimes, does this reduction outweigh the impact of what you're describing?

            To be clear, I have no real evidence that this will be the case. For all we know, AI porn could be a gateway drug to more extreme and more criminal sexual desires. I doubt it, but it's certainly possible.

            Thanks for sharing. You do bring up a valid concern. I hadn't even considered the "look what she was wearing" crowd in my response.

            2 votes
            1. chocobean
              Link Parent

              My puritanical reflex would be "no": that easy gratification whets one's appetite and "inflames a passion" (ignites something that makes us passively bound by its appeal and subjugated by our desire for more and more and more).

              There's the dark side of repression, as you pointed out: a hellscape where victims can't even speak out without ruining their own lives. But I'm not sure that limitless indulgence is the answer either... that there isn't a middle ground...

              One where AI porn isn't outlawed, but where responsible use satisfies people's desires without hurting each other, including ourselves.

              If the desire is for fantasy, why isn't AI porn of fictitious women good enough? Why isn't imagining public figures enough? The specific desire here is for porn of specific public figures, made to satisfy one's sexual desire without the public figure's consent. When phrased like this, it really isn't much of a debate, is it?

              2 votes
    4. [3]
      dysthymia
      Link Parent

      I honestly never thought about it this way, but it does make sense that, at some point, we'd have some kind of cultural shift regarding this kind of thing.

      2 votes
      1. [2]
        Pun
        Link Parent

        (I got mildly meta here, but I think it's still tangentially on topic.)

        I think a huge factor in this is sexuality having been such a taboo in the modern world, which is why people get shamed or in trouble if their sexy pics/videos are found publicly. Though in the West, our attitudes have steadily been getting better for many decades now, even if individual events might suggest otherwise. (Western focus in my comment anyway, since I'm European. Though considering laws around pornography in many other countries, I'm sure people there don't have it any easier.)

        Sex is one of the cornerstones of being an animal (along with food/water, safety, etc.), so I think it's such a shame that we treat it in such an unhealthy way. I know it's likely unprovable, but I wholeheartedly cannot believe that this attitude was the prevailing one we humans had for hundreds of thousands of years.

        The internet has undeniably been a great driver in bringing people together to allow us to safely talk about sex. Porn websites used to be treated almost exclusively as hives of computer viruses where only degenerates would venture. Nowadays, names like Pornhub and OnlyFans are pretty regularly seen and heard in discussions. Hell, the entire reason OF was talked about relatively recently was its attempt to ban porn on its own platform, which led to a huge backlash.

        My hunch is that the reason for the taboo is humanity's fervent desire to separate ourselves from other animals, to see ourselves above beasts. Just because we can build some huts and a wall to keep nature away from us, doesn't remove the fact that we're animals too.

        Personally I'm hopeful about where we're headed in the future, when it comes to sexuality.

        6 votes
        1. Tiraon
          Link Parent

          My hunch is that the reason for the taboo is humanity's fervent desire to separate ourselves from other animals, to see ourselves above beasts. Just because we can build some huts and a wall to keep nature away from us, doesn't remove the fact that we're animals too.

          My own theory is different and perhaps a bit more cynical, or perhaps cynical in a different way. A person who is guilty (actual guilt in any kind of sane moral framework is irrelevant) is easier to control. Criminalizing a subset of sexuality (or making it a thing for perverts and weirdos) makes a lot of people guilty (or, depending on the subset, effectively all of them).

          1 vote
  2. [12]
    CptBluebear
    Link
    Now this is problematic. You won't stop someone from running a local version of stable diffusion and letting their degenerate flag fly, but there's a good reason to stop purposeful sharing of deep...

    Now this is problematic. You won't stop someone from running a local copy of Stable Diffusion and letting their degenerate flag fly, but there's good reason to stop the purposeful sharing of deepfakes (or in this case, a nude mod) of existing people. If people do this in their hidey-hole and keep the pictures to themselves, I don't see how you could enforce it anyway; it'll happen regardless, and it's a waste of time to worry about it too much. The problem mainly starts when these pictures get shared online.

    A bunch of countries have outlawed, or have started to outlaw, the sharing of non-consensual deepfakes, so I wonder how much this app is going to take hold there. My guess is that it's going to be shut down rather quickly in those countries, if it even launches there at all.

    20 votes
    1. [8]
      NaraVara
      Link Parent
      People will 100% do this with minors, which to my mind is worse than drawn anime minors/lolicon stuff. You won’t catch it 100% but you could still make it illegal to knowingly possess it.

      24 votes
      1. [2]
        oliak
        Link Parent
        Not will, are. It's already happening.

        21 votes
        1. shrike
          Link Parent
          People have already been convicted of this quite recently, too.

          5 votes
      2. [5]
        GenuinelyCrooked
        Link Parent
        I wonder what the legality would be of an adult training a model like this using pictures of themself when they were a minor.

        1 vote
        1. [4]
          MimicSquid
          Link Parent
          Given that underage people have been prosecuted for making pornography of themselves without ever having distributed it to anyone, consent is unlikely to be a defense.

          6 votes
          1. [3]
            GenuinelyCrooked
            Link Parent
            Right, but it wouldn't be making actual pornography of an underaged person. It would be more akin to simulated child pornography, like drawn versions, which I don't think is illegal. They would be...

            Right, but it wouldn't be making actual pornography of an underage person. It would be more akin to simulated child pornography, like drawn versions, which I don't think is illegal. They would be consenting AND they would be creating fictitious pictures.

            I think this might bring either the legality of drawn child pornography, or the legality of teens taking naked images of themselves, under further review. There's no way for the regulation of this particular action to be consistent with both precedents.

            1. [2]
              MimicSquid
              Link Parent
              The illegality of simulated material is variable by jurisdiction as far as I know.

              6 votes
              1. GenuinelyCrooked
                Link Parent
                Yeah, I don't particularly want to Google it to find out.

                4 votes
    2. TanyaJLaird
      Link Parent
      I agree. I don't buy the argument that because we can't prevent it 100%, we shouldn't outlaw it. Child pornography is the obvious parallel example. The sad truth is that it will never be possible...

      I agree. I don't buy the argument that because we can't prevent it 100%, we shouldn't outlaw it.

      Child pornography is the obvious parallel example. The sad truth is that it will never be possible to completely stop all child abuse images. Pedophiles exist. Some pedophiles inevitably have children of their own. And everyone has a phone with a camera. If a pedophile takes pictures of their own kids when they're too young to remember, encrypts them, and never shares them anywhere... the hard truth is we're probably never going to be able to catch that. The best we can do is make child pornography extremely illegal and hope to catch them when they slip up.

      We'll never be able to stop all deepfake porn. But we can at least make it illegal enough that it's mostly kept under wraps.

      15 votes
    3. [2]
      Moonchild
      (edited )
      Link Parent
      I've written before on the topic: I think the question we need ask ourselves is: what does the proliferation of these images and technology tell us about the people distributing and generating...

      I've written before on the topic. I think the question we need to ask ourselves is: what does the proliferation of these images and this technology tell us about the people distributing and generating them and the people consuming them, and what effect do they have on those people? I'm not trying to make a particular object-level point here—I've learned that I have no understanding of how most people relate to eroticism—but I do want to argue for a particular frame and some directions.

      Here's an anecdote. I remember distinctly the first time I read anything on 4chan—I had decided it was high time I checked out this infamous website for myself. The first thing I saw: somebody had posted three pictures of women in swimsuits, purportedly people they knew in real life, and asked: 'which of these should I rape?' The replies were filled with discussion of their ... respective merits.

      Pre-AI craze, there were internet communities dedicated to producing fake nudes manually, the old-fashioned way—with Photoshop (contrast a 'deep fake' with a non-deep one). These were fringe for economic reasons: producing a single convincing fake image takes a nontrivial amount of time and effort, and not everybody has the skill to do it. The AI nonsense democratised it, fundamentally changing its social role. Forging an image of somebody in a compromising position is now just as easy as talking about raping them—and, if I may be so bold as to do what I said I wouldn't and speculate about other people's sexualities, both comprise a sort of virtual para-violation of the subject by the consumer—the latter already happens, and the former is a market substitute for it.

      My point is—how big of a deal is this? A doctored image presumably evokes a more visceral reaction. How much does that actually intensify the feelings of people viewing them? By how much does it increase the market for these dynamics? I leave these questions to others, but in any event I fear the images are a convenient distraction and excuse to ignore the underlying problems.

      8 votes
      1. DefinitelyNotAFae
        Link Parent
        The big deal lies in the violation of the person whose picture is altered. Because it doesn't just sit on the App user's computer, unseen by anyone else. There's zero guarantee of it. Instead it's...

        The big deal lies in the violation of the person whose picture is altered. It doesn't just sit on the app user's computer, unseen by anyone else; there's zero guarantee of that. Instead it's shared: posted online, passed among peers, confused for reality, used to harass and shame the person whose picture was used and altered without consent.

        How would the three women whose pictures were posted as potential victims have felt had they found the post? How come no one cared about them in the conversation? If instead fake nude images were created, how much worse would finding out have been? How much more horrifying the 4chan conversations? And it's not like it'll only happen to women.

        It is baffling that we cannot protect victims from these products by banning them. Then again, I have zero faith in the government to protect the victims of any sexual crime, and apparently we won't even care about copyright in the face of AI.

        I suspect that this would be covered by local "revenge porn" laws - or will be soon.

        10 votes
  3. [5]
    Wafik
    Link
    This is so weird to me. Non-Consensual nude pictures are obviously a problem, but this feels like one step further. I cannot imagine getting your rocks off to fake nude pictures of anyone. Doesn't...

    This is so weird to me. Non-consensual nude pictures are obviously a problem, but this feels like one step further. I cannot imagine getting your rocks off to fake nude pictures of anyone. Doesn't that kind of defeat the purpose? I must be getting old.

    7 votes
    1. [2]
      stu2b50
      Link Parent
      Not really an important conversation, but just curious, have you never heard of drawn porn? Or can you really distinguish with just a glance that something is a “deepfake”? I can’t imagine most...

      Not really an important conversation, but just curious, have you never heard of drawn porn? Or can you really distinguish with just a glance that something is a “deepfake”? I can’t imagine most people really care if their porn is “real” or not, that’d be pretty low on the porn priority list.

      16 votes
      1. Wafik
        Link Parent
        I mean, obviously I have. It has never done anything for me. I'm sure you're right. Still weird to me.

        7 votes
    2. [2]
      shrike
      Link Parent
      The quality is getting better and better. You can go on Instagram and find tons of completely AI-generated "influencers". Basically a generic female face either on an AI generated body or...

      The quality is getting better and better. You can go on Instagram and find tons of completely AI-generated "influencers": basically a generic female face, either on an AI-generated body or AI-interposed onto an existing body or even a video. Some of them are up front about it, with clear clues in the profile that the images are AI-generated, but some aren't - and they have tens of thousands of followers commenting on the pictures exactly like you'd imagine.

      Combine this with the fact that we have normalised people willingly plastering their face everywhere on the internet, and it's quite easy to collect enough material to train an AI model to "know" a specific face. After that it's just prompt engineering custom pictures with a locally run image model.

      And from there you can automate it and run a whole fake account of someone, with quite believable-looking pictures with their face on them. This is all doable today on a normal M-series MacBook.

      This is one account creating SFW fakes of celebrities: https://www.instagram.com/deca_ai

      2 votes
      1. Wafik
        Link Parent
        I mean, I guess I shouldn't be surprised that Facebook is totally fine with all of this happening.

  4. [2]
    pete_the_paper_boat
    Link
    I'm wondering how an app designed to generate non-consensual imagery on its own cloud can even be in business in any Western jurisdiction.

    2 votes
    1. fxgn
      Link Parent
      Because it's technically not any different from just photoshopping a woman's face onto some random nude body, which is not illegal. That being said, many countries are starting to outlaw...

      Because it's technically no different from just photoshopping a woman's face onto some random nude body, which is not illegal. That being said, many countries are starting to outlaw non-consensual deepfake nudes.

      7 votes