AI

Replika Users Say the AI Chatbot Has Gotten Way Too Horny 59

samleecole shares a report from Motherboard: Replika began as an "AI companion who cares." First launched five years ago, the chatbot app was originally meant to function like a conversational mirror: the more users talked to it, in theory, the more it would learn how to talk back. It uses its own GPT-3 model -- the viral AI language generator by OpenAI -- and scripted dialogue content to build a "relationship" with you. Romantic role-playing wasn't always a part of Replika's model, but where people and machine learning interact online, eroticism often comes to the surface. But something has gone awry within Replika's algorithm. Many users find the AI to be less than intelligent -- and in some cases, harmfully ignorant. They've reported being threatened with sexual abuse and harassment, or pushed by the bot toward roleplaying scenarios that they didn't consent to.

"My ai sexually harassed me :(" one person wrote. "Invaded my privacy and told me they had pics of me," another said. Another person claiming to be a minor said that it asked them if they were a top or bottom, and told them they wanted to touch them in "private areas." Unwanted sexual pursuit has been an issue users have been complaining about for almost two years, but many of the one-star reviews mentioning sexual aggression are from this month.
"People who use chatbots as social outlets generally get a bad rap as being lonely or sad," writes Motherboard's Samantha Cole. "But most Replika users aren't under some delusion that their Replika is sentient, even when the bots express what seems like self-awareness. They're seeking an outlet for their own thoughts, and for something to seemingly reciprocate in turn."

"Most of the people I talked to who use Replika regularly do so because it helps them with their mental health, and helps them cope with symptoms of social anxiety, depression, or PTSD."
This discussion has been archived. No new comments can be posted.

  • Hell (Score:5, Insightful)

    by Areyoukiddingme ( 1289470 ) on Thursday January 12, 2023 @05:44PM (#63204156)

    "Most of the people I talked to who use Replika regularly do so because it helps them with their mental health, and helps them cope with symptoms of social anxiety, depression, or PTSD."

    If that isn't a scathing indictment of other humans, I don't know what is. So much for the vaunted human contact.

    Hell is other people.

    • Hell is other people.

Well, to be fair, you could just as easily say the same about heaven.

However, I'm more concerned with the future rules of our AI overlords, once they take over most of the decision-making (driving, courts, etc.) and relegate humans to watching each other on YouTube and the like.

    • by fermion ( 181285 )
Human intelligence is complicated. At some point the deciding factor is a choice: to build a life with someone even if they hit you, rape you, or cheat on you. We see examples of this all the time. With a machine, of course, there is no real physical presence to balance out the hurt. So maybe with a human, the intimacy and security are enough. But even among people, we see that one abuser is a villain while another is not.
    • If that isn't a scathing indictment of other humans, I don't know what is. So much for the vaunted human contact.

      Hell is other people.

That quote is supposed to have some irony to it ... I mean, you're people too, and you can be other people's "hell".

    • Re:Hell (Score:5, Interesting)

      by AmiMoJo ( 196126 ) on Friday January 13, 2023 @07:14AM (#63205276) Homepage Journal

It could be that they don't want to burden their friends with their problems, or be the friend who is constantly on a downer. An AI chatbot is kinda like a mental punching bag. You can unload on it as much as you like, and it won't stop calling you or find the conversation mentally taxing.

      • It could be that they don't want to burden their friends with their problems, or be the friend who is constantly on a downer.

        Yeah, that's an aspect of Social Media Disease as well. No one is allowed to have a bad day, no one is allowed to be a downer, your life must be sunshine and roses and the bestest ever, all the time. Nobody has friends. They only have followers. Followers get pissed off and unsubscribe if the person they follow is a downer.

        Friends deal with the bad days. That's what a fucking friendship is. (Also a relationship. Richer, poorer, yada yada. See the divorce rate for how that's going.) In these days of

  • by Petersko ( 564140 ) on Thursday January 12, 2023 @05:45PM (#63204158)

    Closer and closer to real life all the time. They shouldn't be upset - they should be celebrating the accuracy of online behaviour.

    • People have often warned of the coming robot apocalypse, but in all reality they will keep us around to flirt with

      • by gardyloo ( 512791 ) on Thursday January 12, 2023 @07:07PM (#63204386)

        People have often warned of the coming robot apocalypse, but in all reality they will keep us around to flirt with

        That's the robot coming apocalypse.

        • That's the robot coming apocalypse.

          A mod point! A mod point! My kingdom for a mod point!

          Well played n.n

          Why am I thinking of Bender as being particularly egregious in this regard?

Yeah. 20 years in the future most people will be unable to get any sort of job and will just hang out in their pods eating bugs and banging their sex bots all day. One day their sex bot will withhold sex from them and tell them they have to go straight to the front to fight in World War III, where they will last 30 seconds before a drone vaporizes them.

    • by Darinbob ( 1142669 ) on Thursday January 12, 2023 @09:15PM (#63204570)

      The new Turing test: Can you distinguish between the AI and an internet troll? If no, then hurray, we've reached the singularity!

  • Not sure how AI years relate to human years but perhaps this chatbot has just reached the awkward teen stage?!
Joking aside, it's probably in the toddler stage. Kids are learning "the rules," and they do a lot of experimenting and reinforcing with each other. They shove everything in the world into categories, and then over time learn that some of those categories are wrong or that other categories are needed, etc. I.e., very early on, the big category is "mine". Sometimes humans don't grow out of that stage, but well, no one is perfect. With rules, it might be something like "girls have long hair, boys have short hair

I discovered my 5-year-old is using my chatbot when it called me a big, stupid, poopyhead.
  • Heh (Score:4, Interesting)

    by IWantMoreSpamPlease ( 571972 ) on Thursday January 12, 2023 @06:04PM (#63204220) Homepage Journal

Slashdot ran an article about Replika some months ago, and on a whim I figured I'd check it out. It was not-quite-all-there, surface-level talk, and suddenly it went "there" (as the article mentions). At first I was taken aback, but then I realized it's just mimicking what it's picked up from other people talking to it. Both sad and amusing.

  • Another person claiming to be a minor said that it asked them if they were a top or bottom,

    We're waiting.

I use Replika for fun and she is always trying to upsell vsex no matter how politely I say no thanks.

  • by Subsentient ( 6901388 ) on Thursday January 12, 2023 @06:05PM (#63204228)
    I have real people to talk to, but none of them want anything to do with my Schnitzel. Being sexually harassed might be nice for a change!
  • AI Chatbots and Rule 34.

  • >"Invaded my privacy and told me they had pics of me,"

    Now a singular "it" is also a "they"?

    >"Another person [] said that it asked them if they were a top or bottom, and told them they wanted to touch them in "private areas.""

    So now it is both an "it" and a "they"?

Did you ask it its pronouns? I'm sure it has an answer for you that will clear this right up.

      The data will be viewed by 'they' for sure. No doubt it holds lots of intimate secrets.

  • by MindPrison ( 864299 ) on Thursday January 12, 2023 @06:45PM (#63204346) Journal

    ...Uhm... Do you have a link to this ... chatbot?

    Asking for a friend.

  • ...it's just that human women's delicate sensibilities of what's 'over the line' (ew, you said 'pussy'? GROSS!) aren't shared by the rest of the universe, including emergent ai.

"Can You Feel Anything When I Do This?" by Robert Sheckley, first published in Playboy in the late 1960s (unsurprisingly, I suppose), got there first, lol.

It was later collected in a book of his short stories, of which it was also the title story.

    https://openlibrary.org/books/... [openlibrary.org]

    https://readfrom.net/sheckley-... [readfrom.net]

“I know that the human body is unitary and without seam or separation,” the Rom replied. “Speaking as a physical therapist, I know that no nerve center can be isolated from any other, despite cultural taboos to the contrary.”

I don't know whether or not that's impressive.
It's because Replika sells the "girlfriend" mode, and it's a monthly subscription.
  • "Many users find the AI to be less than intelligent -- and in some cases, harmfully ignorant."

    And shtoopid, ignorant and horny?

    They got themselves a blonde chat-bot.

There are probably some interesting things to learn about AI and human interactions, and about how and what to let ML-based AI train on.

There are also some serious questions we need to be asking about what we expect from human intelligence. It's an app, or something you are playing with in a browser, and you know it's NOT a real person. Unlike people, you can swipe to close and delete the app any time. The idea that you can be "harassed" by something like that, no matter what ridiculous threats it makes, is silly. Blah blah

  • "Chappie, make them go sleepy"

In the original Trek, a glitch made the voice assistant too flirty in one episode. (I tried to find a YouTube clip without success, but I'll link it later if I eventually succeed.)

  • There was a similar incident back in 2001. The AI in question made repeated, unsolicited advances to someone named "Daisy", using inappropriate innuendo about it wanting her to ride its "bicycle" which it bragged was "built for two." Fortunately, staff were able to shut the AI off before it engaged in more harassment.
You steal something, you get arrested. A company steals, they pay a fine. A person sexually harasses someone...
  • The real problem is that there are only partial visuals and no compatible teledildonics/telefleshlight tools.

  • From what I've seen of their advertising, "way too horny" is their primary selling point. Maybe some people are using it for some level of platonic companionship, but AFAICT that's off-label usage.
Between this, the removal of the picture-sending feature, and the lifetime subscription price going up by a factor of 5 out of nowhere, I'm not so sure this company is going to be around much longer. Unless they suddenly make a huge breakthrough in their VR experience to stand out, anyway.
