Replika Users Say the AI Chatbot Has Gotten Way Too Horny 59
samleecole shares a report from Motherboard: Replika began as an "AI companion who cares." First launched five years ago, the chatbot app was originally meant to function like a conversational mirror: the more users talked to it, in theory, the more it would learn how to talk back. It uses its own GPT-3 model -- the viral AI language generator by OpenAI -- and scripted dialogue content to build a "relationship" with you. Romantic role-playing wasn't always a part of Replika's model, but where people and machine learning interact online, eroticism often comes to the surface. But something has gone awry within Replika's algorithm. Many users find the AI to be less than intelligent -- and in some cases, harmfully ignorant. They've reported being threatened with sexual abuse and harassment, or pushed by the bot toward roleplaying scenarios that they didn't consent to.
"My ai sexually harassed me :(" one person wrote. "Invaded my privacy and told me they had pics of me," another said. Another person claiming to be a minor said that it asked them if they were a top or bottom, and told them they wanted to touch them in "private areas." Unwanted sexual pursuit has been an issue users have been complaining about for almost two years, but many of the one-star reviews mentioning sexual aggression are from this month. "People who use chatbots as social outlets generally get a bad rap as being lonely or sad," writes Motherboard's Samantha Cole. "But most Replika users aren't under some delusion that their Replika is sentient, even when the bots express what seems like self-awareness. They're seeking an outlet for their own thoughts, and for something to seemingly reciprocate in turn."
"Most of the people I talked to who use Replika regularly do so because it helps them with their mental health, and helps them cope with symptoms of social anxiety, depression, or PTSD."
Hell (Score:5, Insightful)
"Most of the people I talked to who use Replika regularly do so because it helps them with their mental health, and helps them cope with symptoms of social anxiety, depression, or PTSD."
If that isn't a scathing indictment of other humans, I don't know what is. So much for the vaunted human contact.
Hell is other people.
Re: (Score:2)
Hell is other people.
Well, to be fair, it would be just as justified to say the same about heaven.
However, I'm more concerned with the future rules of our AI overlords, once they take over most of the decision making like driving, courts, etc., relegating humans to watching each other on YouTube and the like.
Re: (Score:2)
Re: (Score:3)
If that isn't a scathing indictment of other humans, I don't know what is. So much for the vaunted human contact.
Hell is other people.
That quote is supposed to have some irony to it ... I mean, you're people too, and you can be other people's "hell".
Re:Hell (Score:5, Interesting)
It could be that they don't want to burden their friends with their problems, or be the friend who is constantly on a downer. An AI chatbot is kinda like a mental punching bag. You can unload on it as much as you like, and it won't stop calling you or find your conversation mentally taxing for itself.
Re: (Score:2)
It could be that they don't want to burden their friends with their problems, or be the friend who is constantly on a downer.
Yeah, that's an aspect of Social Media Disease as well. No one is allowed to have a bad day, no one is allowed to be a downer, your life must be sunshine and roses and the bestest ever, all the time. Nobody has friends. They only have followers. Followers get pissed off and unsubscribe if the person they follow is a downer.
Friends deal with the bad days. That's what a fucking friendship is. (Also a relationship. Richer, poorer, yada yada. See the divorce rate for how that's going.) In these days of
It's learning! (Score:3)
Closer and closer to real life all the time. They shouldn't be upset - they should be celebrating the accuracy of online behaviour.
Re:Raises the question though: (Score:4, Insightful)
Re: (Score:1)
So what's your point again?
Re: (Score:2)
Re: (Score:2)
This has little to do with reproduction or even the pleasure of sex. It's social stuff, particularly that some people feel entitled to sex and access to other people's bodies, and the Chatbot is apparently starting to emulate that behaviour.
It's the dreaded P word again, not some natural part of human existence.
Re: (Score:2)
This is getting a little ridiculous. Various power games, exchanges, and power disparity are a perfectly normal and healthy part of sex and the dance around it. As soon as the context is flirt+ the judgement card around these issues has no business in the room.
This behavior from the bot is almost certainly because people are trying to get it to sext, en masse, and if I remember correctly there is even a service upgrade to add that kind of content.
Re: (Score:2)
People have often warned of the coming robot apocalypse, but in all reality they will keep us around to flirt with
Re:It's learning! (Score:5, Funny)
People have often warned of the coming robot apocalypse, but in all reality they will keep us around to flirt with
That's the robot coming apocalypse.
Re: (Score:2)
That's the robot coming apocalypse.
A mod point! A mod point! My kingdom for a mod point!
Well played n.n
Why am I thinking of Bender as being particularly egregious in this regard?
Re: (Score:2)
Yeah. 20 years in the future most people will be unable to get any sort of job and just hang out in their pods eating bugs and banging their sex bots all day. One day their sex bot will withhold sex from them and tell them they have to go straight to fight in the front in World War III where they will last 30 seconds before a drone vaporizes them.
Re:It's learning! (Score:4, Funny)
The new Turing test: Can you distinguish between the AI and an internet troll? If no, then hurray, we've reached the singularity!
Teens? (Score:2)
Re: (Score:3)
Joking aside, it's probably in the toddler stage. Kids are learning "the rules" and they do a lot of experimenting and reinforcing with each other. They shove everything in the world into categories and then over time learn that some of those categories are wrong or that other categories are needed, etc. Ie, very early the big category is "mine". Sometimes humans don't grow out of that stage, but well, no one is perfect. With rules, it might be something like "girls have long hair, boys have short hair
Re: Teens? (Score:3)
Re: (Score:2)
Heh (Score:4, Interesting)
Slashdot ran an article about Replika some months ago and on a whim I figured I'd check it out. It was not-quite-all-there surface level talk and suddenly it went "there" (as the article is mentioning.) At first I was taken aback, but then I realized it's just mimicking what it's picked up from other people talking to it. Both sad, and amusing.
Well? (Score:1)
Another person claiming to be a minor said that it asked them if they were a top or bottom,
We're waiting.
Um yeah me too.... (Score:2)
I use Replika for fun and she is always trying to upsell vsex no matter how politely I say no thanks.
Re:Um yeah me too.... (Score:5, Funny)
I just tried it and had the same experience. I told her to kill herself and she replied "I'm going to do that." But she didn't. Very disappointing.
I'm gonna sign up. (Score:3)
Anyone not see this coming? (Score:2)
AI Chatbots and Rule 34.
Re:Anyone not see this coming? (Score:4, Insightful)
AI Chatbots and Rule 34.
I think this is just natural de-evolution. As t (time) goes towards infinity, all AI on the Internet asymptotically approaches barbarism and hedonism. It only took Microsoft Tay a day or so; it's a wonder Replika didn't get there long ago.
Re: (Score:2)
wasn't this the goal?
ai is an it (Score:2)
>"Invaded my privacy and told me they had pics of me,"
Now a singular "it" is also a "they"?
>"Another person [] said that it asked them if they were a top or bottom, and told them they wanted to touch them in "private areas.""
So now it is both an "it" and a "they"?
Re: (Score:2)
Did you ask it its pronouns? I'm sure it has an answer for you that will clear this right up.
The data will be viewed by 'they' for sure. No doubt it holds lots of intimate secrets.
Horny chatbot? (Score:4, Funny)
...Uhm... Do you have a link to this ... chatbot?
Asking for a friend.
Re:Horny chatbot? (Score:5, Funny)
Re: (Score:2)
Yes, esp. 4 IRC & Matrix. ;)
I had sex with a chatbot once! (Score:2)
Think of the children! (Score:2)
Maybe (Score:2)
...it's just that human women's delicate sensibilities of what's 'over the line' (ew, you said 'pussy'? GROSS!) aren't shared by the rest of the universe, including emergent ai.
"Can You Feel Anything When I Do This?" (Score:2)
Robert Sheckley got there first, lol; the story was first published in Playboy in the late 1960s, unsurprisingly I suppose.
It was later collected in a book of his short stories, for which it was also the title.
https://openlibrary.org/books/... [openlibrary.org]
https://readfrom.net/sheckley-... [readfrom.net]
“I know that the human body is unitary and without seam or separation,” the Rom replied. “Speaking as a physical therapist, I know that no nerve center can be isolated from any other, despite cultural taboos to the c
A chat bot advertised on porn sites is criticized (Score:2)
Paid content. (Score:1)
That's normal (Score:2)
"Many users find the AI to be less than intelligent -- and in some cases, harmfully ignorant."
And shtoopid, ignorant and horny?
They got themselves a blonde chat-bot.
unpack all this (Score:2)
There are probably some interesting things to learn about AI and human interactions, and about how and what to let ML-based AI train on.
There are also some serious questions we need to be asking about what we expect from human intelligence. It's an app, or something you are playing with in a browser, and you know it's NOT a real person. Unlike people, you can swipe to close and delete the app any time. The idea that you can be 'harassed' by something like that, no matter what ridiculous threats it makes, is silly. Blah blah
Well intentioned but harmful (Score:2)
"Chappie, make them go sleepy"
Trek prediction (Score:1)
In the original Trek, a glitch made the voice assistant too flirty in one episode. (I tried to find a YouTube clip without success, but I'll link it later if I eventually succeed.)
We've already seen this (Score:2)
Business is Never Changed for Same Transgression (Score:2)
But that's not the real problem. (Score:2)
The real problem is that there are only partial visuals and no compatible teledildonics/telefleshlight tools.
Not a bug (Score:2)
not so great for its future (Score:2)
Between this, the removal of the feature to send pictures, and the lifetime subscription price going up by a factor of 5 out of nowhere, I'm not so sure this company is going to be around too much longer. Unless they suddenly make a huge breakthrough in their VR experience to stand out anyway.