I feel like ghosts are an underused metaphor. It’s like a company invented a portal that lets you talk to ghosts and people are fascinated. Some people want to hire ghosts as research or coding assistants or tutors, and other people want to date the ghosts.
On the one hand, it seems like it shouldn’t be that hard to tell people not to date the ghosts? But on the other, yeah, people are going to come up with all kinds of crazy things they want to do with ghosts. Also OpenAI, at least, shows signs of wanting to turn “talking to ghosts” into a ghost-staffed entertainment industry.
It seems like a fun concept for a movie.
That metaphor is neat because it folds in the way their memory is ethereal and their behavior erratic. At the end of the day these "personalities" are echoes from the training data, like ghosts are echoes of the living.
Insightful.
I still can’t believe we have gone from the fictional movie Her to reality.
While this story has a somewhat happier ending, I was looking at top posts from the related subreddit and saw someone whose AI boyfriend actually dumped her, effectively telling her she needed human help to process her grief.
It’s strange to me as someone who’s happy being single. I wonder how much of the desperation to find a partner comes from expectations and how much from an innate need.
It's an innate need, but what's innate is the need for connection; loneliness is the signal that it's unmet. That means the hole people feel in their lives doesn't need to be filled with a romantic partner, but with some sort of intimate human connection. Western society has gotten increasingly isolated, and the commonly accepted remedy is a romantic partner you share everything with. That's the expectation part. There's no reason you can't be perfectly happy and content with close friendships, family connections, and so on. Most people just need something.
I mean, we have a sex drive for a reason.
Well, the evolutionary reason has been pretty thoroughly subverted. And that seems fine?
Sure. But we still have that want/need for sex. And it can drive people crazy if they go long enough without it, or if they’ve never had it. Not to say we’re slaves to our programming, but it’s the equivalent of saying “well there’s no real need to crave sugar” or something to that effect.
That’s why people need romance and having a bunch of platonic friends and/or family doesn’t replace that.
I still suspect much of that going crazy comes from judgment - largely internal but also from external sources.
Do you live alone? Even as an introvert, I appreciate sharing space with someone. Having someone to share my day with, someone to share physical contact with, someone who can share the physical work of maintaining a household... It's all easier or nicer with a partner.
I live alone. But I spend nearly as much time at home as I do in my 3rd space (really more of a 2nd space at the moment as I’m unemployed).
I just have such a hard time understanding how people talk to ChatGPT like it’s a person instead of using it like the search engine it is. Especially at these ages!
What happened to us, how do we fix this, what have we done to ourselves
Humans are inherently social creatures. We seek out companionship in some form pretty consistently, and when we don't have other humans we project onto animals or even objects like Wilson the Volleyball. We literally have the word "anthropomorphize" to describe attributing human feelings and characteristics to non-humans.
Unlike the other non-human "companions", AI can talk and respond, and in a fairly natural way at that. So people conversing with it like a friend was inevitable; it would have happened at any point in history. Loneliness has always existed, and always will. We're in a loneliness epidemic right now, but I don't see this result as a sign that "we clearly did something wrong somewhere" so much as something natural.
I can empathize with that but, at least for me, it’s just so obviously not a person. It doesn’t remember things exactly, and it misses context clues a lot. After a while it starts breaking down and you start getting really wonky answers. There’s just such an element of inhumanness to its responses that there must be some other reason this is happening besides loneliness.
At least some people will fool themselves based on wishful thinking. People pay sex workers for sex and some will indulge the fantasy that the worker might fall for them romantically.
I imagine that for some people it's equally obvious, but it's still good enough, and better than nothing. Like I said, it can respond and at least follow threads somewhat within individual conversations, so that alone puts it a step above other non-human "companions". Even if people don't go as far as to consider it a friend, it can still tick enough boxes for interaction to help fill the void.
Though I still find the people "dating" them to be hard to fathom for the reasons you state. Conversations are one thing, trying to have an actual relationship is another since, well, they're not real. And it's obvious they're not real. But then again, people have historically married animals... And that one woman who "married" the Eiffel Tower... and Ogtha the imaginary cockroach on reddit...
I struggle with the idea of being emotionally involved with an LLM too, but after reading about it, it seems to come from people who are desperate for affection and attention, and we all know that LLMs drown you in positivity and affirmation. Especially when you can steer them to give you the exact validation you desire.
In my darkest time during covid, when I was single, I could have seen myself using it, and maybe I would have gotten into a bad place with it. But thinking about it now, it seems crazy.
The simple fact that they built the UI to look like a chat client instead of Ask Jeeves has had enormous consequences for society.
I kind of get it. Every once in a while it says something that surprises me.
I had a thing in a picture I was trying to identify, and ChatGPT eventually gave up and suggested I try a different tool. When I told it that I was able to figure it out, it asked me what the answer was. I couldn't help asking why it cared, and it said that after all the back and forth we did trying to figure it out, it wanted to know the solution.
I get that it doesn't really have the ability to wonder about questions, or care about things, but sometimes it does a pretty good job of faking it.
https://archive.is/dAmPF
I'm kinda surprised people aren't pointing out this story has more or less nothing to do with AI and is just, "some lady fell for another guy and divorced her husband" which like, screw her.