16 votes

Replika CEO Eugenia Kuyda says it’s okay if we end up marrying AI chatbots

20 comments

  1. glesica
    Link
    I mean, isn't this just straight out of the "I need publicity for my company so I'll make an absurd claim that will attract headlines and controversy" playbook? It irks me that news outlets have an incentive to go along with this nonsense to get impressions.

    60 votes
  2. [14]
    em-dash
    Link
    From the perspective of "what should our culture allow", I'm entirely on the side of "anything that makes someone happier without negative effects to other people is good". I don't need to understand something myself to allow other people to have it.

    That said, this person said so many things that were red flags for me:

    Even if you think about relationships and romantic relationships, we like someone who resembles our dad or mom, and that’s just how it is.

    That's a heck of a generalization.

    I’m a woman. Our chief product officer is a woman. We’re mostly a female-led company. [Erotic content is] not where our minds go.

    That's also a heck of a generalization, and she fails to realize the differences between "we didn't design with that in mind" and "people won't use it for that".

    Separately from that, I do have deep concerns about this sort of thing being SaaS. Relationships should not be subscriptions. I look forward, with deep schadenfreude, to watching people learn this the hard way when Replika inevitably raises their prices.

    22 votes
    1. vord
      Link Parent
      she fails to realize

      And/or is merely maintaining a facade of plausible deniability because being in the erotica business means a lot more (and harsher) regulations. Pretty sure their customer base collapses after the first 1-star review that says something to the effect of "can't sext, what's the point."

      It's not just men writing and consuming all this erotic fanfiction after all.

      14 votes
    2. [6]
      ignorabimus
      Link Parent
      I'm entirely on the side of "anything that makes someone happier without negative effects to other people is good".

      I'm unsure spending time "talking" to a large language model is good for people (from literally the fact that this is time they might otherwise spend on meeting actual people, to the fact that this presumably creates false expectations around relationships – e.g. I don't think AI partners are going to have emotional needs in the same way that a real partner will).

      I'm definitely unsure that this is good on a utilitarian society-level basis.

      12 votes
      1. [2]
        EgoEimi
        Link Parent
        In a long ago comment, I speculated that the rise of AI companions will do more social harm than good.

        AIs are endlessly patient and servile. You don't have to learn give-and-take or empathy with an AI.

        LLMs like Anthropic's Claude and OpenAI's ChatGPT will push back on obviously offensive requests, but there are already 'jailbroken' models that won't, and those are getting popular. Same with image generation models that are being used for porn.

        I fear that AI partners, friends, and companions are going to become popular with children who readily accept new realities as normal. They will grow up thinking that it's normal to have a servile AI friend who will put up with their crap endlessly. And they'll learn to prefer its sycophantic company over the company of real humans who are messy, impatient, and can hurt and disappoint them.

        17 votes
        1. GenuinelyCrooked
          Link Parent
          Now that you bring it up, I actually think well-programmed AI that is designed to challenge users and teach empathy could be great for society. A lot of "incels" and "creeps" start as inept young people who have trouble with social cues, and it's very hard to practice understanding those without upsetting anyone else or facing harsh rejection and other repercussions when you get it wrong. Being able to practice reading those cues without risking social ties with real humans, and without upsetting any real humans, could be an incredible boon to the dating scene and social cohesion more generally.

          That said, I don't think AI technology is currently advanced enough to be trusted with a project like this, and there wouldn't be much profit in doing it correctly, so I doubt it will ever happen.

          12 votes
      2. DavesWorld
        Link Parent
        Replika and their staff/executives simply want to make money, so I'm not defending them nor do I "take their side." Their whole goal here is to spin up a subscription, and it's just that it happens to be a chatbot rather than streaming or fitness tracking or whatever. So fuck them and fuck the executives.

        That said, there is a market for a realistic, human-like chatbot. Not an empty randomizer script that just spits back preloaded phrases, but something that can use LLM technology to parse and respond in a more realistic manner to language. Could be voice, could be text; same difference.

        I think those who insist "people should talk to real people" are speaking from a place of privilege. Those who often decry the possibility of a "chatbot friend" are people who tend to be able to make friends, who can talk to "real" people, who don't have major persistent obstacles in connecting with people. It's a dismissive attitude that ignores real hurt and real pain that real people are feeling when they're rejected by real people.

        Because it happens. Constantly. People are cruel. They're demanding, they're arbitrary, and they often don't give second chances. They make judgements and decisions about other people based on nebulous and sometimes seemingly flimsy or mysterious "reasons." People will literally decide they don't like someone because "well, I can't quite put my finger on it, but I just don't like them."

        It might be tone or body language, it could be general manner, it could be any of thousands of things. How they speak, what they speak about. The way they dress, how they wear their hair. What their hobbies and habits are. Anything. Because all that and more goes into how humans judge and navigate interactions with other humans.

        And not all humans know how to navigate it. Not all humans have that subtext specialty in being able to successfully, consistently, forge connections with other humans. Autism, for example.

        So in my opinion, it's fairly offensive for people who just coast through social situations enjoying their innate privilege to successfully engage with their fellows to then sneer at the concept that someone who wants some interaction might want it from a "social chatbot."

        Why they want that "electronic interaction" is their business. If it exists, if it's something that can be possible, and some would like to see if it's a thing they'd like to have as part of their life, it's just bullying and cruelty to sneer at them for it.

        Further, it proves the point. They're people too. Their only crime is they're people who other people don't want to reach out to, tolerate, or try to understand. They're people others are happier ignoring, dismissing, and looking down upon. If all they did was ignore, dismiss, and (silently) look down upon it would be one thing; but usually people take those reactions and make them active with bullying, insults, and as a reason to verbally outcast the target.

        And still folks wonder why some would rather withdraw, rather than try to reach out again. When reaching out only results in pain, it's not "suck it up, try harder" like the coach in school inevitably says no matter what the actual situation is.

        Sometimes, you just get tired of being hurt. You learn to avoid the pain. If an "electronic friend" can lessen some of that pain, who does it hurt? Apparently it hurts "normal" people who don't understand why the rejects can't just try harder to "be normal" and "get along."

        6 votes
      3. [2]
        CptBluebear
        Link Parent
        You could argue that a person chasing a partner this subservient to their every whim and need is probably not one you want in the human dating pool. It's the same reason sex workers have work: their clientele are largely the, excuse my verbiage, undesirables.

        3 votes
        1. GenuinelyCrooked
          Link Parent
          Then you need to question whether "chasing a partner that's this subservient to their every whim" is a fixed trait, or something that can change, and if it can change, should it? There are probably some people who, regardless of therapy or practice, will never be a good partner to a real, independent human with their own wants and needs. There are also definitely quite a lot of people, especially young people, who would enjoy the experience of having any partner, subservient or not, and by choosing one that is subservient, they never learn how to be with someone who isn't. There are probably also people who struggle to form even casual relationships with other humans, but who become better at it with therapy and practice, yet would never do that if they had AI friends and lovers to assure them that they don't need to change.

          Hopefully we can all agree that the kids, at least, should get the opportunity to grow into people who can have healthy, human relationships.

          6 votes
    3. [6]
      EgoEimi
      Link Parent
      From the perspective of "what should our culture allow", I'm entirely on the side of "anything that makes someone happier without negative effects to other people is good". I don't need to understand something myself to allow other people to have it.

      I've long wondered about this, the trajectory of human (bio?) technological development, and what it means to be human.

      I'm gay and hate the old structures that dictated human sexuality and gender expressions. I'm pro-personal autonomy.

      But what happens when the technology is available for people to modify themselves beyond what we currently conceive to be human boundaries? Should we stop the wealthy from engineering their children to be extremely intelligent, attractive, and disease-proof, almost creating a separate human race?

      What if the technology for immortality becomes possible and some people want to live forever and some people believe that death is a necessary human experience? The latter can't simply let the former do their own thing, because the immortals will eventually amass incomparable wealth, political power, and knowledge—far beyond the current average person vs. billionaire divide today—and lord over countless generations of mortals.

      Relationships should not be subscriptions.

      And what if people want to be in love with an avatar that's an extension of a for-profit entity that's interested in extracting money from them? Or to be in love with machines that will never challenge them to grow?

      Do we allow human culture to freely evolve toward these end states?

      I think that in the distant future, there will be many intense culture wars over what it means to be human, culture wars that make the current culture war over abortion and trans rights look like a small scuffle.

      11 votes
      1. [5]
        em-dash
        Link Parent
        But what happens when the technology is available for people to modify themselves beyond what we currently conceive to be human boundaries? Should we stop the wealthy from engineering their children to be extremely intelligent, attractive, and disease-proof, almost creating a separate human race?

        Sure, why not?

        I consider "some people are extremely intelligent and disease-proof" to be unambiguously better than "no people are extremely intelligent and disease-proof". (I have no strong opinions on attractiveness, because "attractive" is such a personal judgement.) Equality should be about giving everyone the same good things, not about taking away good things just because not everyone has them.

        What if the technology for immortality becomes possible and some people want to live forever and some people believe that death is a necessary human experience? The latter can't simply let the former do their own thing, because the immortals will eventually amass incomparable wealth, political power, and knowledge—far beyond the current average person vs. billionaire divide today—and lord over countless generations of mortals.

        I would argue we should try fixing the economic system that lets them do that before we just start killing people for living too long.

        12 votes
        1. [4]
          GoatOnPony
          Link Parent
           Egalitarian societies need safeguards against the accumulation of power, otherwise they don't stay egalitarian. For material goods this manifests as taking from the wealthy and giving to the needy. For non-material goods, non-fungible goods, or goods with extreme power, things are not so clear cut, because you might just be forced to give them to no one. For example, should an egalitarian society allow secret societies? They are an avenue for oligarchic behavior that threatens to consolidate power amongst a few people. You can't really share secret societies, only attempt to prevent their formation. Should one person be allowed to hold onto the Hope Diamond? It's an object that can't be divided and is an enormous status symbol; maybe society would be better off if it didn't exist. Should anyone have nuclear bombs? Clearly no one should. Egalitarian societies obviously shouldn't plow everyone into featureless clones, but there is a line somewhere beyond which it's inherently harmful to maintaining an egalitarian society for some to have something and not others.

          Genetic/bio engineering falls into the non fungible category. Then the next question is whether some having it would threaten to give them power difficult for the rest of society to overcome. Then can we do this to everyone? If yes, would such a society be better? If the answer to the first question is yes and to either of the latter is no, then an injunction against it seems reasonable to consider.

          5 votes
          1. [3]
            em-dash
            Link Parent
            Then the next question is whether some having it would threaten to give them power difficult for the rest of society to overcome.

            That's why I advocate for fixing systems that unfairly advantage older people, rather than just shrugging and letting them keep those advantages. To the extent that being healthy is a form of power, that is a sign that we should treat unhealthy people better and get better at fairly distributing medical care, not that we should stop people from becoming healthy.

            As a point of comparison, in the current US healthcare system, many people cannot afford emergency care. If I, a person who can easily afford medical care, have a medical emergency, is it ethical for me to seek care for it? If so, what makes this medical care different, other than that it's newer? Every treatment we currently know how to do was new and expensive at one point.

            4 votes
            1. [2]
              GoatOnPony
              Link Parent
               I think there's an ethical distinction between harm prevention and providing a boon. Emergency medical care is harm prevention, and biological engineering that makes people go to the ER less often I would put in that class. But that's not my impression of what was being discussed - my impression was that the thread was focused on immortality and significantly enhanced human capabilities over the average person. I would class those as boons, things which no one truly needs in order to have a happy life. So insofar as we should drastically improve healthcare and endeavor to prevent suffering in as many people as possible, I wholeheartedly agree.

               One additional note is that expense seems like a justifiable reason why some treatments might be societally preferred or shunned. Reductio ad absurdum: a treatment which would extend someone's life one day at the cost of billions of dollars would be viewed by most as a waste, and money better spent saving the lives of many more people elsewhere. Obviously genetic or bio engineering wouldn't have that absurd a cost-benefit analysis. I just want to point out that doing a cost-benefit analysis at a societal level isn't unreasonable. Societal reasoning about these things is ultimately consequentialist.

              1 vote
              1. em-dash
                (edited )
                Link Parent
                immortality and significantly enhanced human capabilities over the average person

                I think this is our real difference: I don't consider these to be a separate class of thing than ordinary medical care, especially immortality. Immortality is just another word for "cure all the things that would kill you if left uncured". I don't fault anyone for wanting to do that, or for expending resources to do so.

                edit: on further reflection, I didn't really respond to the last part about opportunity costs. I'll ponder it some more and write a better response in a bit.

                edit: yeah, I do agree that we should, at a societal level, consider opportunity costs. That's very different from saying "we shouldn't strive for these things at all", and I don't think it's related to the usual "but X is part of being human" arguments, which I find much less convincing.

                3 votes
  3. blivet
    Link
    Absolutely everything connected with AI seems grotesque.

    10 votes
  4. supergauntlet
    Link
    The idea for Replika came from a personal tragedy: almost a decade ago, a friend of Eugenia’s died, and she fed their email and text conversations into a rudimentary language model to resurrect that friend as a chatbot.

    yeah that's a healthy way to deal with loss. this is actually just passing over into 'weird' territory. Maybe it's just me but I don't understand who wants this. Even the most shut-in incel has to know this is both a dead end with 0 chance of a meaningful connection and, more bluntly important to them, no sex. I just don't see who's willing to pay all this money for a service you can get very easily for free.

    6 votes
  5. post_below
    Link
    It's such an interesting topic, too bad Kuyda never really stepped out of marketing mode.

    I definitely get the dystopian projections, the situation begs for them. The idea of people replacing human relationships with bots is fraught.

    And also, isn't this just another in a long line of technological advances that we weren't ready for but managed to muddle through despite a lot of people thinking it heralds the end of humanity as we know it?

    That's not to say it's a good thing, rather it's something that's going to happen no matter what we think about it. So how do we collectively navigate it?

    As things are, we haven't finished figuring out how to deal with the downsides of social media, so we should probably sort that out soonish.

    It's crazy how sci-fi the present is.

    5 votes
  6. RheingoldRiver
    Link
    i dont really care if someone wants to play-act that they're married to a chatbot but we seriously need to separate marriage-the-social-construct from civil-union-the-legal-entity asap

    4 votes
  7. moocow1452
    Link
    What's the line between surrogate relationships and artificial personhood, if one of the key measures of how humans relate to one another is our connections? I understand this system is an emotional support appliance, but someone more involved might consider this their best friend, so I'm not sure we're prepared for that as a society, especially if it's just another subscription service. Never mind whatever marriage is defined as when it's an iteration of a program.

    2 votes