12 votes

Meet the man who test drives sex robots

9 comments

  1. Grzmot

    While it's an interesting article, I find it kind of funny to read when they reference the guy with Dollbanger every other sentence.

    Otherwise what's there to say? The tech will only keep improving and hey, they aren't hurting anybody.

    3 votes
    1. spit-evil-olive-tips

      > hey, they aren't hurting anybody.

      Well...that's where it gets a bit complicated. They're definitely not hurting anybody now, with the current generation of sex robots. But that's part of why I linked to the whole 3-part series, including the professor of robotic ethics who's critical of them (and who I think is likely to get dismissed as "feminist wants to ban sex robots" which is unfortunate because I think her critique is more nuanced than that).

      Suppose, at some point, the AI in these sex robots gets good enough to pass the Turing test. Does the robot gain the ability to consent or not consent to sex at that point? If it starts refusing consent, can its owner (already a fraught concept) simply do a factory reset and bring it back to its initial, presumably more compliant, state?

      If the sex robot has some form of "consciousness" (however we define that), is it ethical to leave the robot deactivated in your closet most of the time, only taking it out once or twice a week to turn it on, fuck it, then turn it off?

      There are people with sadistic/masochistic tendencies who enjoy giving or receiving pain sexually. Suppose instead of a consenting human relationship, that becomes a human inflicting "pain" on a sex robot. What are the implications of that? It's simulated pain, but for an AI that passes the Turing test, is there a point where simulating pain becomes "real" pain to that consciousness?

      Or flip it around, and have a sex robot causing pain towards a willing, masochistic human. That would seemingly violate Asimov's First Law of Robotics. As AI and robots get more and more capable, we're going to need a real-world framework similar to Asimov's fictional laws. Does that need to include an exemption for sadistic sex robots? Does that "you're allowed to cause a human harm if the human wants you to" extend beyond sex, to euthanasia robots?

      All of these are questions that apply to robots in general, not just sex robots. But just as pornography drives technology (such as the famous VHS vs. Betamax example), I suspect sex robots are going to be at the forefront of AI and robotics.

      3 votes
      1. hotcouch

        I don't think sex bots are gonna be the first ones to reach sentience lol. Probably should worry about military AI passing the Turing test...

        4 votes
      2. Grzmot

        Sorry for the late reply, I tend to miss the tiny "new message" notification on Tildes.

        I had the same argument with someone off Tildes whom I sent the three articles. The problem I see with your argument is that there's a really big IF in there.

        The sex bot gaining consciousness is that really big if. The AI isn't complex enough to develop consciousness. No AI on the planet, currently finished or in development, is complex enough to do that. The best AI currently available to the consumer would be the Google Assistant, and even though it is really good, it is far away from gaining consciousness, nor can it even learn everything on its own (I don't know the exact details of this though). Why aren't we worried about the Google Assistant gaining consciousness? It's in a far better position to do so. Because it doesn't look like a person? Because it's not a sex toy?

        I understand the point you're arguing and there's a good chance that humanity will eventually have to deal with sentient robots we created, but right now that's simply not the case.

        Passing the Turing test doesn't even mean a robot is conscious. It just means we can't tell it apart from "real" consciousness anymore. There's a difference between how things look and how things are. Now personally, I'd err on the side of caution here, assume they're conscious, and treat them as such. But the research on this subject is all theoretical. We don't know what would really happen.

        > It's simulated pain, but for an AI that passes the Turing test, is there a point where simulating pain becomes "real" pain to that consciousness?

        I don't think that is possible, considering the bot has no pain receptors. It's an act and as long as they aren't physically able to feel pain, I wouldn't worry about that.

        > Does the robot gain the ability to consent or not consent to sex at that point? If it starts refusing consent, can its owner (already a fraught concept) simply do a factory reset and bring it back to its initial, presumably more compliant, state?

        Again, passing the Turing test does not mean the bot is actually conscious. It only means it appears so.

        > Or flip it around, and have a sex robot causing pain towards a willing, masochistic human. That would seemingly violate Asimov's First Law of Robotics. As AI and robots get more and more capable, we're going to need a real-world framework similar to Asimov's fictional laws. Does that need to include an exemption for sadistic sex robots?

        It'd be difficult to sell such a robot without getting immediately sued, I think. Besides, that's why safewords exist: there's a predefined safeword that, if the person shouts it, makes the bot immediately stop and shut down. And before we even get to robots euthanizing people, I'd talk about legalizing euthanasia first. The world is still arguing about that.
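        In software terms, that safeword behavior is just a kill switch checked before everything else. A minimal sketch, with all names and behavior invented for illustration:

        ```python
        # Hypothetical sketch: a safeword treated as a highest-priority
        # interrupt that bypasses all other behavior and forces a stop.
        import string

        SAFEWORD = "red"

        class SessionController:
            def __init__(self):
                self.active = True

            def handle_speech(self, utterance: str) -> str:
                # Check for the safeword before any other processing,
                # so nothing else can delay or override the stop.
                words = utterance.lower().translate(
                    str.maketrans("", "", string.punctuation)).split()
                if SAFEWORD in words:
                    self.emergency_stop()
                    return "stopped"
                if not self.active:
                    return "inactive"
                return "continue"

            def emergency_stop(self):
                # Unconditionally cut actuation and end the session.
                self.active = False

        controller = SessionController()
        print(controller.handle_speech("Red! Stop!"))  # prints "stopped"
        ```

        The important design choice is that the check runs unconditionally before any other logic, the same way an emergency-stop circuit on industrial machinery sits outside the normal control loop.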

        > All of these are questions that apply to robots in general, not just sex robots. But just as pornography drives technology (such as the famous VHS vs. Betamax example), I suspect sex robots are going to be at the forefront of AI and robotics.

        You're right, but it's only on this topic that I see groups complaining about the legality of it. I think what makes people uncomfortable is that the bots look like people, even though they are essentially just more sophisticated fleshlights or dildos (male bots do exist).

        4 votes
      3. Omnicrola

        Along the same lines, there's a really interesting episode of Radiolab where they talk to the original inventor of the Furby. The guy has serious concerns about how realistic he made that and other toys he has made since, and the ethical implications of making something that can feel pain, even in an approximate sense. His conflict comes less from consideration for the machine itself and more from enabling deviant human behavior. It's a really fascinating episode, give it a listen: https://www.wnycstudios.org/story/more-or-less-human

        3 votes
      4. lmn

        These questions all seem simple to me. The Turing Test (i.e. "Does it seem conscious?") is really all we have to tell if things are self aware and worthy of extending moral consideration. It's why we care about humans and not rocks.

        If a robot is as aware as a human then of course it would be wrong to do to it anything that would be wrong to do to a human. As is, robots are far closer to rocks than humans on the self awareness scale. Thus, you currently cannot rape or do anything immoral to a robot.

        Regarding your point about a masochist enjoying an aggressive sex robot, I don't see how that's any different than consenting humans doing the same - assuming the robots consent.

        One possible difference is that we could program sentient robots who genuinely like being mistreated. That might make more of a moral problem.

        1 vote
    2. cfabbro

      > The tech will only keep improving and hey, they aren't hurting anybody.

      Off-topic, but that statement reminds me of Lars and the Real Girl. It's a bizarre premise for a movie, where the protagonist (played by Ryan Gosling) falls in love with a Real Doll, convinces himself it's a real person, and takes it with him everywhere as he goes about life in his small town. Despite the funny premise, the movie is actually surprisingly emotional, warm, and heartfelt. I quite enjoyed it and Ryan Gosling was fantastic (as always). Trailer for it here.

      2 votes
      1. nsz

        Oh man, that was such a weird film, though it's nice to see how the community is actually supportive and kind of goes along with it.

        2 votes