17 votes

AI and ethics - CP

Weird one, but this debate has come up a couple of times and people look at me like I'm the oddball when discussing this in real life: AI CSAM and ethics.

AI, without guardrails, can generate CSAM. This is ethically horrific. However, my opinion is that I would rather the Monsters could and would look at AI-generated CSAM than pay for the real thing.

Expanding on that, this has cons such as diluting society's sensitivity to the content, which could lead to more tolerance of real CSAM. I don't think that would be the case, since these people have a sick infatuation and probably just want to jack off to it in the privacy of their own homes. Again, I'd rather that was AI-generated and not real.

Pros, not so many, apart from the Twisted using fantasy material and hopefully stopping other Sickos from making real content for money.

I know it would not completely curb it, and I'm not advocating for it being a mainstream thing, but I would rather it was faked than real.

Can I have others take on this? Am I mental? Am I seeing it all wrong?

27 comments

  1. [8]
    V17
    Link

    You're not mental, I think most people who use generative AI had that thought and undoubtedly these debates are taking place, but probably behind closed doors because in public you're always going to have people with more passion than knowledge, making dialogue very difficult.

    I don't have an opinion on the topic, iirc there is research suggesting that things could go both ways (reducing real CP vs normalizing it and "increasing hunger").

    What I expect is going to happen is nothing: nobody is going to come forward and push through relaxed laws for child porn, it would be a very difficult sell even if it did truly reduce child abuse, but if it did the opposite, it would be pretty much a public (metaphorical) suicide. The potential consequences are too big and can't really be taken back because a ton of AI CP and models capable of generating it (which likely exist now, but in small amounts and not out in the open) would already exist and be widely distributed.

    23 votes
    1. [6]
      immaterial
      Link Parent

      I remember having an interesting conversation online about whether explicit lolicon content should be banned - one point was particularly about how it's a coping mechanism for non-offending pedophiles to prevent hurting real children. I think that discussion is quite analogous to this one.

      There's research examining coping techniques of pedophiles, and sure enough consuming CSAM is on there. However, I can't find any regarding how effective they are. I imagine it's quite hard to conduct research finding that out.

      Honestly some parts of that article feel quite grim - to have this part of you that you can't control, to the point where you plan your life around avoiding any "risky situations". Our minds are so weird, the fact that we can have this kind of schism between rationality and irrational impulse.

      12 votes
      1. [4]
        V17
        Link Parent

        Honestly some parts of that article feel quite grim - to have this part of you that you can't control, to the point where you plan your life around avoiding any "risky situations".

        Many years ago I talked to a dude who worked at an anonymous psychological help hotline and had a repeated client who was a pedophile. The client hated this side of himself, tried to do everything in his power to eliminate his urges, but in the end realized he can never be sure, so he decided to voluntarily undergo castration, thinking he's finally going to be free. Unfortunately he found out that while some urges vanished, he's still attracted to children. Absolutely devastated. It was really dark, I cannot imagine living like that.

        18 votes
        1. [3]
          OBLIVIATER
          Link Parent

          That's awful, it's terrifying that it was even offered as an option. Is that a standard treatment for that kind of thing, or is it just based on the assumption that if you can't get aroused you can't get aroused to CSAM?

          Was the doctor aware that was the reason for wanting to be castrated? It seems like a huge problem that this kind of procedure can just be done by request if it doesn't even work effectively. And yet we still have people who can't get voluntarily sterilized because doctors are afraid they may want to have kids in the future.

          1 vote
          1. [2]
            V17
            Link Parent

I do not know the details and iirc the therapist did not know the details either, except the fact that it was entirely voluntary and actively sought out by the client - this is basically a crisis hotline, so there's usually little space to dig deeper than what the client needs at that very moment. I have not heard of any other case like that (but I am also not in the field). This happened probably about 25 years ago or more and I'm not from the US, and the standards were slightly different then around here.

            1 vote
            1. DefinitelyNotAFae
              Link Parent

I was very briefly in the field of treating offenders and victims of child sexual abuse; chemical and physical castration have been, and still are, sentences for crimes related to the sexual abuse of children. I don't know whether this person underwent physical or chemical castration, or whether a doctor would in fact perform that voluntarily today, but it is something that has been performed as a punishment and is still on the books. In fact, I believe it was Louisiana that just added physical castration.

              Theory is that it will remove the urges to actually offend by lowering the sex drive, but it doesn't do anything to remove the attraction itself. I have not kept up on the status but it doesn't surprise me that somebody was able to go that route or tried to in the absence of an alternative successful treatment option.

              3 votes
      2. Deely
        Link Parent

        However, I can't find any regarding how effective they are. I imagine it's quite hard to conduct research finding that out.

Just thinking out loud: what happens in Japan, where lolicon manga has existed and been published for dozens of years?

        1 vote
    2. pete_the_paper_boat
      Link Parent

      nobody is going to come forward and push through relaxed laws for child porn

      A creep genuinely attempted this in the Netherlands 4 years ago, it went about as well as you would expect. link (Dutch)

      5 votes
  2. [8]
    paris
    Link
    Small aside: the term CSAM is preferred when discussing this topic.
    10 votes
    1. [6]
      V17
      Link Parent

      'The term Child ‘Pornography’ implies consent and a child cannot give consent to sexual activities. And thus, we use the term Child Sexual Abuse Material (CSAM).'

      What a strange thing to claim. I cannot imagine anybody other than pedophiles (and even there probably not all of them) ever thinking that "child porn" implies consent, and that those sick enough to think so would change their mind by using different words for it. I remain unconvinced.

      20 votes
      1. [5]
        Thrabalen
        Link Parent

        I would actually argue the other way... calling CSAM pornography of any stripe implies that pornography can be non-consensual. While there can be non-consensual adult conduct, we have a word for that, and it certainly isn't pornography (even if some would try to blanket the two together.)

        That's not to say that there aren't those who try to pass off such recordings as pornography, but there are those who would try to pass off non-consensual adult conduct in general as sex, and that's a word which similarly does not apply IMO (sex should be reserved as a descriptor for consenting adults going in "eyes open.")

        15 votes
        1. [4]
          V17
          Link Parent

I get what you mean, but firstly this seems like trying to change human behavior by changing what we call it, which is not something I believe in in general (not that it has zero effect, but the effort-to-efficiency ratio seems very far from worth it to me). Secondly, I disagree with your observation about porn: there's been a lot of concern about various forms of abuse in the porn industry (whether physical, which is likely getting less common, or some "soft" abuse like coercion through financial means), yet everyone talking about those concerns still calls it porn, and we use it in other negative connotations as well, like revenge porn.

          8 votes
          1. [3]
            Thrabalen
            Link Parent

            There's a lot of abuse in many industries, but the abuse isn't/shouldn't be the norm (even if widespread). I suppose my point is that abuse isn't part of the definition of the word, but it is with CSAM. Mind you, I don't think we should relax on cracking down on abuse in other areas... any areas... where it occurs. But calling CSAM pornography, again IMO, lessens the perception of severity of the act. Pornography itself is a legitimate business market (much like clothing manufacture) despite the abuse (to continue the analogy, sweatshops), but involving minors is never legitimate.

            I do agree, however, that we're never going to actually convince people to not use the term... my argument is almost entirely academic.

            2 votes
            1. [2]
              V17
              Link Parent

              But calling CSAM pornography, again IMO, lessens the perception of severity of the act.

We'll have to agree to disagree here, because I find it hard to think of words with more negative connotations than "child porn". People didn't even want to be seen writing it out and started to use the shorthand CP many years ago. As a comparison, we find the N word so bad that we don't write it out or say it out loud, but if you imagine newspaper headlines saying "Leaked emails of politician xxxx reveal he called his opponent a n****r" vs "Leaked emails of politician xxxx reveal he purchased child porn", the latter sounds much worse, with much more severe consequences (even ignoring possible jail, just social consequences) to me.

              5 votes
              1. Thrabalen
                Link Parent

                While true that "child porn" is a horrible term, if we called it what it truly is, child rape, I think it would be perceived worse. But you're right in the "agree to disagree" department, I think.

                1 vote
    2. g33kphr33k
      Link Parent

      Noted. I will update my terminology going forward.

      7 votes
  3. [3]
    ShroudedScribe
    Link

    This seems like the best place to share a conspiracy theory I have - I sometimes contemplate if government agencies plant CSAM on people's computers that they either can't detain for another crime because they don't have enough proof, or to immediately alter public opinion on someone. Obviously I have no proof of this, but would anyone be able to disprove it? There certainly are zero-day exploits that could be used to plant this content remotely, or it could even just be lied about if a computer is confiscated for evidence.

    Regarding AI generation of this content, as others have said, there aren't really enough studies out there to determine if this creates a desire to seek out the real thing, or if it is enough to do the opposite.

    One thing that is also important to consider is if CSAM consumers can be rehabilitated. A 2024 paper claims the following in the abstract:

    48.1% want to stop using CSAM. Some seek help through Tor, and self-help websites are popular. Our survey finds commonalities between CSAM use and addiction. Help-seeking correlates with increasing viewing duration and frequency, depression, anxiety, self-harming thoughts, guilt, and shame. Yet, 73.9% of help seekers have not been able to receive it.

    7 votes
    1. [2]
      papasquat
      Link Parent

Yeah, I very much believe it happens sometimes. It's such an easy thing to do, and there's basically no category of file that is both so illegal and so publicly disgusting to most people.

      At the same time, there are a horrifying number of pedophiles out there, and I'd imagine that the vast, vast majority of cases of CSAM are someone legitimately getting off to it versus blackmail.

      3 votes
      1. ShroudedScribe
        Link Parent

        I certainly don't want to downplay the issue - there are, unfortunately, a ton of people being abused, and I do believe the vast majority of accused are guilty of their involvement.

        1 vote
  4. pete_the_paper_boat
    (edited )
    Link

I worry such content serves as a gateway to CSAM, thus "increasing hunger", as @V17 put it. People tend to get desensitized to things, and that never ends well.

    This topic is probably much older than AI, but with artificial intelligence, the quality/effort scale hugely shifts.

    5 votes
  5. [5]
    OBLIVIATER
    Link

    Honestly the worst part about this to me is (as I understand it) generative AI would have to be trained on real CSAM to produce that type of content right?

    I don't know if there can ever be a silver lining to that cloud, at least in my opinion. Feeding CSAM into the grinder to make content to titillate pedophiles is not something I think should be done. I understand the argument of harm reduction, I just don't know if it holds enough water.

    5 votes
    1. [4]
      unkz
      Link Parent

      I'm highly confident that one could make a CSAM generator trained initially on clothed children and regular adult porn. My first approach would basically be a GAN (generative adversarial network). Train two discriminator networks, one to detect children and one to detect porn, and then generate images that maximize both those criteria.

You could probably do this with off-the-shelf code, e.g. this example of how we can convert horses into zebras -- either nudify child images or youngify adult porn, which would give you lots of simulated training data to build a Stable Diffusion-type model that could generate arbitrary new images.

      8 votes
      1. [2]
        OBLIVIATER
        Link Parent

        Definitely showing my ignorance of generative AI then, it's a difficult field to stay on top of when it's evolving and changing so quickly.

        3 votes
        1. Moonchild
          Link Parent

          extrapolation and generalisation were always the point of ai

          3 votes
      2. V17
        Link Parent

        The CEO of Stability AI claimed that the reason why their models cannot do naked people is because they had to make a decision during training: either completely exclude nudity or completely exclude children, otherwise the models could plausibly create child porn. So it may be even "easier" than you say, just train a generalized model without those barriers in place - quotes around easier because training those models is not cheap.

        2 votes
  6. [2]
    unkz
    Link

I think one issue, which I haven't thought too much about, but which seems problematic when it comes to prosecuting, is that if there is no law against artificial CSAM, then how will we be able to definitively identify actual CSAM? I could have an actual CSAM-generating operation, but just claim that it's AI. A sort of benefit of the current situation is that anyone with CSAM is by definition committing a crime, so we don't have that kind of potential defence.

    All that said,

    This is ethically horrific.

    I'm still on the fence about this. I'm not sure if there's an ethical component to someone doing something on their own computer that nobody else ever sees or is affected by.

    5 votes
    1. OBLIVIATER
      Link Parent

Honestly a great point: even if we developed a way to 100% accurately identify AI-generated content, it would be difficult to prove beyond a shadow of a doubt. I don't know what the best way to rehabilitate pedophiles is, though I doubt it's prison. But opening up the laws for wiggle room seems like the wrong lever to pull for that kind of problem anyway.

      1 vote