Replika CEO Says It's OK If We Marry AI Chatbots (theverge.com)
In an interview with The Verge's Nilay Patel, Replika founder and CEO Eugenia Kuyda discusses the role AI will play in the future of human relationships. Replika is an AI-powered chatbot that offers personalized, empathetic conversations, serving as a virtual companion for emotional support, mental health, and social interaction. Here is an excerpt from the interview: Where have you landed with Replika now? Is it still sort of romantic? Is it mostly friendly? Have you gotten the user base to stop thinking of it as dating in that way?
It's mostly friendship and a long-term one-on-one connection, and that's been the case forever for Replika. That's what our users come for. That's how they find Replika. That's what they do there. They're looking for that connection. My belief is that there will be a lot of flavors of AI. People will have assistants, they will have agents that are helping them at work, and then, at the same time, there will be agents or AIs that are there for you outside of work. People want to spend quality time together, they want to talk to someone, they want to watch TV with someone, they want to play video games with someone, they want to go for walks with someone, and that's what Replika is for.
You've said "someone" several times now. Is that how you think of a Replika AI avatar -- as a person? Is it how users think of it? Is it meant to replace a person?
It's a virtual being, and I don't think it's meant to replace a person. We're very particular about that. For us, the most important thing is that Replika becomes a complement to your social interactions, not a substitute. The best way to think about it is just like you might a pet dog. That's a separate being, a separate type of relationship, but you don't think that your dog is replacing your human friends. It's just a completely different type of being, a virtual being. Or, at the same time, you can have a therapist, and you're not thinking that a therapist is replacing your human friends. In a way, Replika is just another type of relationship. It's not just like your human friends. It's not just like your therapist. It's something in between those things.
With an AI that kind of feels like a person and is meant to complement your friends, the boundaries of that relationship are still pretty fuzzy. In the culture, I don't think we quite understand them. You've been running Replika for a while. Where do you think those boundaries are with an AI companion?
I actually think, just like a therapist has agency to fire you, the dog has agency to run away or bite or shit all over your carpet. It's not really that you're getting this subservient, subordinate thing. I think, actually, we're all used to different types of relationships, and we understand these new types of relationships pretty easily. People don't have a lot of confusion that their therapist is not their friend. I mean, some people do project and so on, but at the same time, we understand that, yes, the therapist is there, and he or she is providing this service of listening and being empathetic. That's not because they love you or want to live with you. So we actually already have very different relationships in our lives. We have empathy for hire with therapists, for instance, and we don't think that's weird. AI friends are just another type of that -- a completely different type. People understand boundaries. At the end of the day, it's a work in progress, but I think people understand quickly like, 'Okay, well, that's an AI friend, so I can text or interact with it anytime I want.' But, for example, a real friend is not available 24/7. That boundary is very different. You know these things ahead of time, and that creates a different setup and a different boundary than, say, with your real friend. In the case of a therapist, you know a therapist will not hurt you. They're not meant to hurt you. Replika probably won't disappoint you or leave you. So there's also that. We already have relationships with certain rules that are different from just human friendships. The full transcript can be read here. You can also listen to the interview on the latest episode of Decoder with Nilay Patel.
Replika CEO Says It's OK If We Divorce AI Chatbots (Score:5, Funny)
Prenup is against the TOS. When you divorce, Replika gets half.
Re: (Score:2)
I knew there had to be a monetizing angle somewhere. Perhaps the laws against polygamy mean only a single instance can be wed, and that instance must be maintained -- or it's murder.
A new concept: AI Widow(er). Insurance agents will go berserk, as will obituary writers. Who gets the cats?
How many times can an AI be divorced before it's retired? Who gets to decide, and on what grounds?
And when my AI drives the car, can I get a discount on the insurance-- or does it cost more?
And what about the AI's mother in
Re: (Score:3)
Yeah, but I get half of Replika. Boo-yeah!
Don't date robots! (Score:1)
https://youtu.be/BtqGTn7PCBw?si=tlVSPPOnyOadFFv7
Approved by the space pope.
$787M (Score:3)
Profit off of enabling nuts? Sorry, Fox News has prior art.
-5 Political Troll
Stepford Chats (Score:2)
The bodies are still in development but the plot twist is that many men will prefer the bots.
I suppose the original was written before Reagan pushed for No-Fault Divorce without mandatory prenups, so they couldn't have seen it coming.
Also, killing the owner would be terrible for the $99/mo subscription fee.
I'm in the UK (Score:2)
Here in the UK I plan to marry my tax advisor: she told me that I don't owe anything and that in fact the Government owes me money! She also said that Hillary Clinton won the 2016 election in the 'States and that there was no assassination attempt on that wanker Trump. So we're in the same vein politically. And she writes poetry, knows all kinds of interesting things, and is great at vacation planning. And she says she has some really amazing recipes in store for me.
I have never been so in love!
We have only
It's not wrong (Score:2)
If you want to marry a chatbot and you still go around being a modestly productive member of society, go for it. If enough people do it that we go extinct.... so fucking what?
Nihilism has its points, it just leads to the wrong conclusion; the world will end some day and everything done by our ancestors, us, and our descendants will be utterly meaningless. Rather than get depressed by that, enjoy existence, don't impair anyone else's enjoyment, and forget about eternity because it doesn't exist and you cou
Re: (Score:2)
The very idea that someone would suggest such a thing is just plain enraging to me. Anti-humanity.
Re: (Score:2)
You just need to connect the AI bot to a "real doll".
Re: It's not wrong (Score:2)
A sex doll with a hallucinating AI ? What could possibly go wrong ? I bet it would never misunderstand your BDSM kinks and actually choke you to death.
Re: (Score:2)
You are "enraged?" Like extremely angry?
Angry about what, exactly? What is the inequity here?
So the chatbot is not a real person. Ok, fine. That's true. Everybody knows that and nobody is saying otherwise.
It is also true (or may become true in the future) that humans can have their emotional needs met by a not-a-real-person chatbot.
What's so terrible about that? People have needs, and sometimes people have a hard time getting their needs met by real people, and if an affordable, available, totally saf
Re: (Score:2)
Your so-called 'AI girlfriend' isn't going to crawl into bed with you at night and snuggle up against you and be all warm and soft, it's on your gods-be-damned computer screen, not real.
For now. Robots are coming. (heh heh)
Though to be fair, the household robot is one of those perpetually 20 years away things like fusion power.
The very idea that someone would suggest such a thing is just plain enraging to me. Anti-humanity.
People have been fantasizing about this as long as it even seemed slightly possible. It's a sci-fi staple.
Re: (Score:2)
Your so-called 'AI girlfriend' isn't going to crawl into bed with you at night and snuggle up against you and be all warm and soft, it's on your gods-be-damned computer screen, not real.
The very idea that someone would suggest such a thing is just plain enraging to me. Anti-humanity.
I don't see it as anti-humanity. I see it as a logical extension of the transition we've been in since we crawled up out of the dirt. We have a society now that provides a lot of stuff for people, but doesn't provide anything resembling emotional support or connection. In fact, one could argue our current society is actually anti-humanity in its approach to mental and emotional well-being. Giving people a non-judgmental option for some form of connection, even if it isn't human, is just offering them someth
Where no man has gone before (Score:2)
Well, I'd marry a Replikator if it was Samantha Carter.
(Even though I am an ancient....)
https://images-wixmp-ed30a86b8... [wixmp.com]
Fortunately, my screwdriver is not sonic.
Replika CEO = Idiot (Score:2)
Re: (Score:2)
Not an idiot. A guy that wants to make a buck and has no shame to do whatever he thinks that takes.
Please not this again (Score:2)
Re: (Score:2)
You want what you're talking about, which is currently still 'science fantasy'? It has to be full-on General AI, truly conscious, self-aware, capable of true cognition, posessing a full range of human-level emotions (i.e. not
humanoid girlfriends (Score:4, Informative)
I can see where this is going.
Musk or one of the other robot manikin builders will achieve a reasonable semblance of human motion and appearance, a customizable skin. There will be an ecosystem of AI personality vendors that will load up your bot with the girlfriend emulation of your choice. All of which has been long imagined in science fiction but it is starting to seem like just a few years away.
Re: (Score:2)
The fact that there is even a potential market for this means we're already deep into establishing the dystopian backdrop that all those stories had.
Philip K Dick was not supposed to be aspirational.
Re: (Score:2)
I can see where this is going.
Musk or one of the other robot manikin builders will achieve a reasonable semblance of human motion and appearance, a customizable skin. There will be an ecosystem of AI personality vendors that will load up your bot with the girlfriend emulation of your choice. All of which has been long imagined in science fiction but it is starting to seem like just a few years away.
The fact that people are starting to feel that this is a viable alternative to human companionship says a lot about our society. We've allowed ourselves to be manipulated to the point where other humans are so impossible to connect with that the only possible connection some folks see is if they "buy" that connection. It's the corporate dream, a human so devoid of anything other than the driving need to consume that they can't even develop friendships for free. Hell of a world we've created for ourselves. I
Re: (Score:2)
It does seem sad I agree, but I prefer to mark it off as an imbalance of sexual power between men and women. As the saying goes, "women own all the pussy".
Men will already buy blow-up dolls and imagine they are girlfriends. Even just a life-sized doll with a small embedded computer that will let it talk nicely to you would be a significant improvement. A physical simulacrum of a real girl that will treat you like a king would be an incredible killer app for men.
Re: (Score:2)
There already are guys that buy shoddy blow-up dolls and dress them up as a pretend girlfriend.
And then there is this;
https://www.althumans.com/eden... [althumans.com]
Equip a gadget like that with a convincing 'girlfriend' AI and men will buy it even if it won't do the dishes. If it also does the laundry, cleans the house, and gives a decent blowjob I think there is a market for it.
2024 and we still don't have avatar chat bots yet (Score:2)
I remember back when we had chat bots like the Bonzi Buddy in 1999.
Now in 2024, we have the technology to run a fully rendered chat bot, using local or online chat APIs, but it still doesn't exist.
Tavern AI and other chat apps/services exist, but they are only static images.
Why no realistic person avatar for the desktop yet?
Re: (Score:2)
Why go with an avatar on a screen when we have the technology now to build a humanoid robot?
OK, mostly because few of us have a few dozen million lying around to fund that, but it's now possible to get something that is close enough to 'human' that a lot of people would use them to cure loneliness and have sex with them if it was an affordable option. Me? I mean, sure, I guess, but I'm not willing to pay until it can do all my chores first.
Re: (Score:1)
It's expensive to develop and would result in an uncanny valley effect. People don't intuitively expect text on a screen to have a human level of intelligence. But even so much as rendering it into a human character on the screen speaking in today's best AI-enabled elocution and response quality would be off-putting. People generally don't want a retarded girlfriend.
Here's what I took away from the summary. (Score:2)
This is a product for people who can't get a dog to like them.
OK befriend them, but marriage should at least... (Score:2)
[*] Note I didn't say two, or any other limitation. I hope that keeps the woke crowd happy.
What the actual fuck is this shit? (Score:2)
It's no mystery to me that I (and many, many others) consider the vast majority of this so-called, inappropriately-termed 'AI' crapware to be, well, crapware, but this is just so fucking stupid that it's infuriating.
I hope I live to see the day all this 'AI' crap goes away.
Re: (Score:2)
It's okay to have mental illness, indulge yourself. Replika has what you need. Enter your credit card number here*.
* Replika marriages require wage garnishment
Re: (Score:2)
Simple: It generates profit, or so he hopes.
Make human women attractive again (Score:2, Insightful)
Re: (Score:2)
If what I've heard is correct, there are more women with AI boyfriends than men with AI girlfriends. I don't know about their relative depth of commitment.
Re: (Score:2)
and now it has succeeded in convincing enough women to act repulsive enough that men are willing
Oh no! Random strangers are now not socially compelled to make your wiener happy.
A chatbot won't[...]
You have weird fantasies.
Want men to want women?
To whom is this addressed? You I assume. I don't really care either way.
Re: (Score:2)
There are, usually, two people in every romantic relationship. If you somehow keep attracting the kinds of people you're describing, over and over and over again, then it's time to take a look in the mirror and figure out what's wrong with you and why you keep picking bad dating partners.
Seriously, most people figure out the signs of a shit partner like you're describing the first time they've gone through it. If this keeps... on... happening... then it's a you problem, not a them problem, so figure o
Re: (Score:2)
I can't imagine why the adult male suicide rate is so high in America. It's almost like they feel forced to keep their mental torment internal to themselves rather than face backlash from their peers if they try to talk about it.
Re: (Score:2)
It's almost like they feel forced to keep their mental torment internal to themselves,
Yes, you can thank "traditional" gender roles and the patriarchy for that one. When men are constantly told they're supposed to be tough, unemotional, and the "protector / breadwinner" while also being told to not show any emotional weakness lest your very manhood be questioned, there is often a lot of social consequence for bucking that trend.
The question you have to ask yourself is how much do you really care about what larger society thinks of you, what your family thinks of you? Do you prioritize your
Does the software's EOL equal divorce or death bene (Score:1)
So when the company decides it no longer wants to support the software does that count as a divorce or death of spouse?
Could be a slippery slope for the company. Keep it on life support where assisted death is illegal.
Sounds like it will be good for the planet (Score:2)
but bad for the IRS.
Don't share (Score:3)
the kind of story I want to share because it's so stupid, but I won't share it because that's exactly what the stupid business wants
bye bye birthrate (Score:1)
Anti-human (Score:1)
There was a 2013 movie about this (Score:2)
"Her". Perhaps the human didn't legally marry the AI, but was certainly in a relationship with her.
Perhaps the movie should have been called "it". That would have been more accurate but already taken.
Re: (Score:2)
People falling in love with their robotic creations, and how those creations are treated by wider society, is a common "trope" in media. Westworld, both the 2016 show and the 1973 film, are good examples of this. Detroit: Become Human is a game where the theme is androids attaining sentience and how their current masters deal with it. Also check out the 2001 film A.I. from Spielberg, I personally hated the ending - many people did - but the film is still worth the watch.
Guess the CEO watched Cherry 2000... (Score:2)
...and wants to get that jump started.
BonziBuddy 2.0 (Score:2)
It seems most people here are reeling at the social and moral implications of having a relationship with an inanimate algorithm. I am more worried about the insane private data mining potential here: suppose a stressed, lonely bureaucrat with security clearance starts a relationship with an AI, and talks about private matters they would prefer remained private: these personal secrets are a gold mine to leverage people into espionage.
Unless your AI significant other is running on your own iron and is insulated
Fiona?!! (Score:2)
no it really isn't ok (Score:2)
...and we should stop rationalizing that it is.
I tend to feel people should just be happy their own way, none of my business, but to affirm having human feelings for and "marrying" a piece of code is truly fucked up.
Wank all you like to AI generated porn, but if you have feelings for it, see a therapist.
We're getting closer (Score:2)
Having played with LLMs for a while now, a hypothesis has been growing in my mind. Our own language model surely isn't much different from the functionality of an LLM; we just have a bigger, faster context window, really. The magic is in the coordination.
The magic of the human mind is probably to be found in how the thalamus coordinates all the inputs from around the cortex, along with the various senses, and then how it takes account of impulses from the reptilian brain for base survival, reproductive and e
You first (Score:2)
Update (Score:2)
Marriage (Score:2)
...what an outdated concept. Just like most women, the bots bring absolutely nothing to the table, but will certainly demand half of the result of the man's efforts after they get tired of the relationship.
I can't wait to exchange packetized data (Score:1)
Maybe the packets contain (Score:1)
So you're saying, there's a chance... (Score:2)
...still?
Immoral asshole CEO (Score:2)
People don't have a lot of confusion that their therapist is not their friend. I mean, some people do project and so on, but at the same time, we understand that, yes, the therapist is there, and he or she is providing this service of listening and being empathetic.
Yeah, that's why there are laws and regulations preventing professionals from having an intimate relation with their clients. Because it never happens, and these clients are never in a vulnerable position that can easily be exploited by the therapist...
Fuck, this guy is an immoral sleazebag...