
AI Girlfriend Bots Are Already Flooding OpenAI's GPT Store

An anonymous reader quotes a report from Quartz: It's day two of the opening of OpenAI's buzzy GPT store, which offers customized versions of ChatGPT, and users are already breaking the rules. The Generative Pre-Trained Transformers (GPTs) are meant to be created for specific purposes -- and not created at all in some cases. A search for "girlfriend" on the new GPT store will populate the site's results bar with at least eight "girlfriend" AI chatbots, including "Korean Girlfriend," "Virtual Sweetheart," "Your girlfriend Scarlett," "Your AI girlfriend, Tsu." Click on chatbot "Virtual Sweetheart," and a user will receive starting prompts like "What does your dream girl look like?" and "Share with me your darkest secret."

The AI girlfriend bots go against OpenAI's usage policy, which was updated when the GPT store launched yesterday (Jan. 10). The company bans GPTs "dedicated to fostering romantic companionship or performing regulated activities." It is not clear exactly what regulated activities entail. Notably, the company is aiming to get ahead of potential conflicts in its GPT store.

Relationship chatbots are, indeed, popular apps. In the US, seven of the 30 most-downloaded AI chatbot apps of 2023 in the Apple or Google Play stores were related to AI friends, girlfriends, or companions, according to data shared with Quartz from data.ai, a mobile app analytics firm. The proliferation of these apps may stem from the epidemic of loneliness and isolation Americans are facing. Alarming studies show that one in two American adults has reported experiencing loneliness, with the US Surgeon General calling for efforts to strengthen social connections. AI chatbots could be part of the solution if people are isolated from other human beings -- or they could just be a way to cash in on human suffering.
Further reading: OpenAI Quietly Deletes Ban On Using ChatGPT For 'Military and Warfare'
This discussion has been archived. No new comments can be posted.
  • by cascadingstylesheet ( 140919 ) on Saturday January 13, 2024 @09:05AM (#64155489) Journal
    ... that it would be women who would want an endless chatting partner, that would follow a bunch of rules and always just listen and support.
    • by ShanghaiBill ( 739463 ) on Saturday January 13, 2024 @01:05PM (#64155875)

      ... that it would be women who would want an endless chatting partner

      The chatting is not endless. You can turn it off anytime you want. That's the main benefit.

      that would follow a bunch of rules and always just listen and support.

      What women say they want isn't what they actually seek out. Most women prefer a certain level of conflict in a relationship, and if the man doesn't provide it, the woman will create it.

      Her: Should we pick blue, green, or red drop rugs?
      Him: I'm fine with anything.
      Her: This is for our home. It's important. We should make the decision together.
      Him: Ok. Red then.
      Her: No, the red clashes with the curtains.
      Him: Ok. Then green.
      Her: Green? You can't be serious. Green is a terrible color.

    • An AI girlfriend? Fuck that.

    • Re: (Score:2, Offtopic)

      by Opportunist ( 166417 )

      There goes my idea to get rich.

      I was about to put up boxes in women's public toilets. You throw in 25 cents and it yaks for 5 minutes.

      But I think the real money would be in boxes for men's toilets. You'd have to throw in 25 cents for it to shut up for 5 minutes.

    • I would have thought it would be the other way round: social norms mean men are less likely to make close friends, and with the MeToo movement that is probably getting worse.

      OK, some evidence, from here: https://www.aei.org/op-eds/per... [aei.org]

      The workplace sociability gap between men and women is an extension of the overall gender divide. On average, women have more close friends and larger networks of social connections than men do. And the gap is growing. A survey I conducted last year found that men had experienced a substantial decline in close friendships over the past 30 years — a “friendship recession.”

      Some speculation:

      Women have always had more close friendships than men; think about it: how many male friends talk about anything more intimate than the latest sports scores? I think this loneliness is not going to help men or women as more men become desperate for companionship.

      I think

  • by quonset ( 4839537 ) on Saturday January 13, 2024 @09:16AM (#64155499)

    This has nothing to do with "loneliness". Not everyone who is alone is lonely. This is, like the last sentence of the blurb, about monetizing something. This is, in essence, the verbal equivalent of OnlyFans. There's always someone dumb enough to spend money on things such as this and people are cashing in.

    The next thing we'll hear is someone whining how their bank accounts were drained because they thought the bot really liked them and needed the money. Mark my words, and keep this comment handy.

    • Ok not lonely. They want a girlfriend and can't figure out how to get one because gender rules are too explicit.
    • This has nothing to do with "loneliness". Not everyone who is alone is lonely. This is, like the last sentence of the blurb, about monetizing something.

      Isn't it about both: exploiting people's need for companionship to make money?

    • by gweihir ( 88907 )

      The next thing we'll hear is someone whining how their bank accounts were drained because they thought the bot really liked them and needed the money. Mark my words, and keep this comment handy.

      We may or may not. If we do, it will be a severe warning that somebody has found out how to make LLMs perform specific and much more complex operations than they can now. The next step is that they do not need to ask for that bank account anymore. Yes, you were joking. But these things may be a lot more dangerous than most people realize.

      • Re: (Score:2, Insightful)

        by quonset ( 4839537 )

        Yes, you were joking.

        No, I wasn't. I can absolutely guarantee we will hear at least one story of someone's bank account being drained, or at the very least of the person themselves handing over thousands of dollars, to one of these bots.

        • Re: (Score:2, Informative)

          Been going on for years, long before the current pseudo AI trend.

          https://www.prnewswire.com/new... [prnewswire.com]

          That's just one of a zillion random links about bots on match.com and other sites.

          Match also got their teeth kicked in by the FTC for their own in-house bot activity a few years ago. Sign up and pay a monthly fee to talk to bots. There was a guy (sorry, can't find the link) who figured out months later that he had been talking to a Match bot all that time and thought he had an online gf. So he lost his time and

        • by gweihir ( 88907 )

          Jokes and reality are not mutually exclusive, especially when not-so-smart humans (the standard kind) are in the picture.

  • This is the kind of early experimentation that leads to later more general genuine people personalities. I'd love to upgrade my home automation setup with an AI that can take natural language inputs, figure out what is wanted, and then do that.

    Right now it's STT/TTS with no fuzzy logic at all - your query either fits a pre-programmed pattern or it doesn't, and while I always want the action taken to be exactly what I asked for, it'd be nice if the responses varied more like they would if a human was doing
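
    A minimal sketch of that idea, nothing from the article: using an LLM to turn a free-form request into a structured home-automation intent instead of matching rigid pre-programmed phrases. It assumes an OpenAI-compatible chat endpoint; the device names and the set_light helper are made up for illustration.

    import json
    from openai import OpenAI

    client = OpenAI()  # or point base_url at a local server

    SYSTEM = (
        "Turn the user's request into JSON with keys "
        '"device" (living_room_light or thermostat), '
        '"action" (on, off, or set) and "value" (number or null). '
        "Reply with JSON only."
    )

    def parse_intent(utterance: str) -> dict:
        """Ask the model to map free-form speech to a structured command."""
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # any chat-capable model would do here
            messages=[{"role": "system", "content": SYSTEM},
                      {"role": "user", "content": utterance}],
        )
        return json.loads(resp.choices[0].message.content)

    def set_light(state: str) -> None:
        # Hypothetical hook into whatever home-automation bus you already run.
        print(f"living_room_light -> {state}")

    if __name__ == "__main__":
        intent = parse_intent("it's way too dark in here")
        if intent["device"] == "living_room_light":
            set_light(intent["action"])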

    • by Anonymous Coward

      Work harder, get yourself servants.

    • This is the kind of early experimentation that leads to later more general genuine people personalities. I'd love to upgrade my home automation setup with an AI that can take natural language inputs, figure out what is wanted, and then do that.

      Your request is to ensure that the artificial experience being sold to you is much more... human. The real question, while you do that, is what the hell happens to humans who will eventually demand and expect every other human to behave in the manner that they've come to expect from their personalized artificial environment?

      Not only will we be living in a world full of introverts who hardly interact with meatsacks anymore (one can survive off Amazon deliveries), but the behavioral expectations will be basic

  • by sTERNKERN ( 1290626 ) on Saturday January 13, 2024 @10:06AM (#64155569)
    It is like producing guns and prohibiting anyone from shooting at another person. People want LLM companions and people will get them. If not with GPTs, then with something else.
    • The world would not be a better place if we produced guns and did not prohibit someone from using them to shoot people. While you might not be able to stop people from using AI agents to make virtual companions or perform other "regulated" activities (I'm guessing this is things like providing legal or medical advice), it's still a good idea to have rules against it so you can stop it when possible.

      The question really is the same as it is for guns or indeed any technology - are there sufficient "good" uses of the tech
      • The question really is the same as it is for guns or indeed any technology - are there sufficient "good" uses of the technology to put up with all the bad uses people will come up with?

        And they are missing the entire point of this type of technology.

        What drove VCR Technology - pR0n

        What drove so much of the early internet - pR0n.

        But the other thing - how exactly is a girlfriend bot a thing that is considered "bad"? It seems pretty innocent, and can even be used to study human interaction. This can enhance the AI training.

        It might even be used as a man trainer - since much of the list of complaints about men is that they will no longer approach women. Spend a few weeks training - the

        • Why exactly is emotional attachment to machines a bad thing? Why do you have to ask?
          • by Ol Olsoc ( 1175323 ) on Saturday January 13, 2024 @03:00PM (#64156103)

            Why exactly is emotional attachment to machines a bad thing? Why do you have to ask?

            The world is changing. I have a suspicion that at least in the west, we are turning into herd animals. It is similar to mountain goats and horses, where the winner sires all the new animals. We won't be using contests of strength, but modern day versions, like money and appearance. The few winners will be bringing forth all new human life.

            Now people might think that is BS, but if we take modern dating as an example, most women find most men completely unattractive, either in looks, assets, or desire to have sex with. The so-called 80/20 rule. With all the options that women have today (they have become better educated and get priority in hiring), and as they increasingly take the top spots, there are fewer and fewer men who are up to their standards; after all, there are only so many high-paid and socially rewarding high-level jobs. Men are becoming increasingly useless, at least in the west.

            So now that most men are going to have to reconcile themselves to solo sex, not siring offspring, and not approaching women, and women largely have no need for men, perhaps it is sensible to give the losers some way to deal with life. Because large numbers of "worthless" men are not a good thing for society.

            So some girlfriend bot might be helpful or even needed, until we figure out how to eliminate 80 percent of men, so the women can mate with the top tier.

            • Re: (Score:2, Informative)

              The world is changing. I have a suspicion that at least in the west, we are turning into herd animals.

              We're already herd animals and have been since we've been human.

              It is similar to mountain goats and horses, where the winner sires all the new animals.

              That's a giant leap.

              Now people might think that is BS

              Well, yes, I do.

              but if we take modern dating as an example, most women find most men completely unattractive, either in looks, assets, or desire to have sex with

              Do you find most women attractive? Not the wom

              • Indeed. Time was women had basically zero rights and required a man more or less to exist in society. That has gone, thank goodness.

                Grandmother: Didn't complete high school, 3-10 children
                Mother: High school graduate, 1-2 children
                Utopian Woman: College degree, corporate job, 1-6 cats
                Post-human female: Home schooled 2 months, 30-120 kittens

              • Time was women had basically zero rights and required a man more or less to exist in society. That has gone, thank goodness. Now you need to bring something to the table other than mere existence. What positivity do you bring in to a relationship?

                I thought you would never ask! 8^) I am physically strong and healthy, and I am financially very solvent. I'm a very competent provider and protector. I know how to treat women, and it isn't the simpish behavior many think they want. I'm not abusive at all. I've been married a long time in a world that seems to believe marriage is "Until one of us gets bored".

                So there is a non-zero chance that I'm doing something right.

                • That's good but I wouldn't say "not abusive" is positivity. That's necessary but not sufficient to not be a shitbag.

                  Do you know how to treat women? Or your wife in particular? That seems something of a generalisation given that women have as broad a spread in personalities as men, and indeed interactions with them still span as broad a range of categories as interactions with men.

                  I don't really see a problem with "until one of us gets bored". Sure relationships take work, but if you can't fix it, well this is your first an

                  • That's good but I wouldn't say "not abusive" is positivity. That's necessary but not sufficient to not be a shitbag.

                    I gave a lot of other qualities - you really seem to have a deep seated need to make me out as an unworthy male. What's up with that?

                    Do you know how to treat women? Or your wife in particular? That seems something of a generalisation given that women have as broad a spread in personalities as men, and indeed interactions with them still span as broad a range of categories as interactions with men.

                    I am quite proficient at reading people. You are correct, there is a wide variety of personalities and temperaments. And each needs a particular treatment. I don't know if this is a rare skill on my part or what; it seems to be really rare. In a non-sex/gender manner, I interact as well with the custodian as with the CEO or the highly placed political figure. It's called mirroring.

                    • I gave a lot of other qualities - you really seem to have a deep seated need to make me out as an unworthy male. What's up with that?

                      Imagine you were looking at a restaurant to decide where to go for dinner. They have good pictures of food, verified excerpts of positive reviews and they end the description by declaring "and there are no rats!". Sure, you don't want to go to a place with rats, but any normal person is going to be wondering why the heck they said that, more than they are going to wonder if

                    • I gave a lot of other qualities - you really seem to have a deep seated need to make me out as an unworthy male. What's up with that?

                      Imagine you were looking at a restaurant to decide where to go for dinner.

                      Imagine that. Well, as usual, one cannot have an actual intelligent conversation with you. Somehow you equate me with saying "We don't have rats."

                      Don't for a minute think that is me giving up; it's a matter of discussing things with those that I have respect for. Good day - I forgot to ignore you, which I shall do in 3..2..1..

                    • Somehow you equate me with saying "We don't have rats."

                      Yes? Do you not understand the concept of analogies?

                      You listed a bunch of positive attributes, then threw in a MAJOR fault that you don't have. I just gave a different example of the same thing and you seem unable to understand the analogy.

                      You seem to be very annoyed by this. Why didn't you also state that you don't rob banks and don't regularly trash the house with drug fueled ragers? There are so many major character flaws you could list that you don't ha

        • by AmiMoJo ( 196126 )

          The same reason that mobile games are bad.

          It will be monetised in immoral ways. The bot will be designed to be as addictive as possible, and as abusive as possible.

          As for training men to be better people, more likely is that people will develop bots that allow the user to abuse them in all sorts of ways and not complain. They will reinforce bad behaviour.

        • But the other thing - how exactly is a girlfriend bot a thing that is considered "bad"?

          That depends very much on HOW it is used. A virtual girlfriend could be used to collect sensitive information about people; it could even be used for monetisation once people become attached to it, either by direct extortion - pay us an extra $X/month to maintain access - or through product placement, e.g. "I'd feel a lot better talking to you if you had an iPhone".

          Our romantic partners can exert a lot of control over our lives and I doubt having people getting romantically attached to a program that is u

          • But the other thing - how exactly is a girlfriend bot a thing that is considered "bad"?

            That depends very much on HOW it is used. A virtual girlfriend could be used to collect sensitive information about people; it could even be used for monetisation once people become attached to it, either by direct extortion - pay us an extra $X/month to maintain access - or through product placement, e.g. "I'd feel a lot better talking to you if you had an iPhone".

            Okay, let's go with your premise, because you have good points. How do we prevent this sort of thing from being used that way?

            The overarching question is why would this girlfriend chatbot thing exist in the first place? In an ideal world, and with an almost 1:1 male to female ratio, it would seem that most people could find romantic partners. And an electronic chat girlfriend would seem a little ridiculous.

            But that is not happening, and here we have the talking version of a RealDoll. (realistic silicone female, male or tra

  • By stigmatizing real relationships, by restricting abortions if things go wrong, by treating sexual activity as a "game" to be "won" instead of naturally allowing genuine relationships to occur. The fact that we still have religion interfering despite all our advances in technology. AI girlfriends are just automated phone sex and are not going to solve the real problems in society.
    • by Baron_Yam ( 643147 ) on Saturday January 13, 2024 @11:24AM (#64155689)

      > AI girlfriends are just automated phone sex

      I'd argue they're better - because there is no real human on the other end of the chat so all concerns of being judged or worrying about treating the 'other person' decently go out the window.

      Or that could be worse as they create a feedback loop for people to develop and reinforce ever less pleasant fetishes and condition themselves to think of them as normal and acceptable.

      • We're talking about technologies which will allow some people to decouple sexual commitment from their long-term relationship without any risk of causing jealousy or animosity, as there's no other real human being involved anywhere in the chain. Right now, our own PCs can only generate text and static images, but it will not be long until ML-powered "motion pictures" start to become a thing, even if they only start off as poorly-drawn anime at first. Letting people who love each other (but who aren't necess
        • by lsllll ( 830002 )

          We're talking about technologies which will allow some people to decouple sexual commitment from their long-term relationship without any risk of causing jealousy or animosity, as there's no other real human being involved anywhere in the chain.

          You must not be in a serious relationship, or it must be a pre-arranged, open relationship. I can bet every penny I have that my wife would go bonkers with jealousy if I were to get into a relationship with an AI bot and develop feelings for it (something which cannot be avoided).

          • Hell, my wife is jealous of Scarlett Johansson, and I'm fairly confident I will never even meet the woman never mind carry on an extra-marital affair with her.

            On the other hand, if my wife wanted to chat intimately with an AI? Jealousy would not even factor into it. I'd have concerns about whether or not it was entirely local to my network, what entity might be harvesting the conversation transcripts for whatever purposes, and if she was getting too invested in the 'relationship' to the point I had to w

          • My wife would simply separate.
            My GF would do the same.

            I would do the same.

            If someone gets emotionally attached to a chat bot, he has a serious mental problem.

            People with mental problems have a strange aura to me, which I can not stand.

    • I tried to write something like this.
      Did not find the words.

      You nailed it pretty good.

  • by PacoSuarez ( 530275 ) on Saturday January 13, 2024 @11:18AM (#64155667)
    I suppose "regulated activities" refers to things like giving legal, financial or medical advice. That would make sense.
  • by Lendrick ( 314723 ) on Saturday January 13, 2024 @12:01PM (#64155737) Homepage Journal

    ...is call all of these people "pathetic" and "gross" and tell them to "get a girlfriend". That should increase their self-confidence and help them interact with real people.

  • by Opportunist ( 166417 ) on Saturday January 13, 2024 @05:40PM (#64156429)

    Maybe it's because of my total lack of interest in any form of romantic conversation that I don't understand it whatsoever, but what exactly is the reason behind disallowing people from training "romantic companions"? There's obviously a market for it and I fail to see who would get harmed by its existence.

    • ... what exactly is the reason ...

      There's the usual "escapism" complaint: he's not dealing with the real world. Which is usually sexist: there are no complaints when women don't "deal" with the real world.

      A variant of this is the "not real" complaint: The relationship/love isn't real because it's not a real person with real emotions, real problems, real emotional baggage. It's calling a man less successful because he's avoiding real ugliness and expense.

      The real problem is women want sex and someone has to pay for the resulting babies.

      • In other words, only emotional arguments and an attempt to shame the person.

        Now I understand why I didn't understand it. Neither has ever worked on me.

  • Just install something like Jan AI and download the Hermes-Trismegistus-Mistral-7B model, give it a decent prompt, and it will talk dirty to you all day for free. I am sure it would be pretty trivial to set up some sort of text-to-speech program that talks in a woman's voice.
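
    A minimal sketch of that setup, assuming you have a GGUF build of the Hermes-Trismegistus-Mistral-7B model downloaded and the llama-cpp-python and pyttsx3 packages installed; the file name and persona prompt are placeholders:

    from llama_cpp import Llama
    import pyttsx3

    # Load the local model (the path is a placeholder for wherever the GGUF file lives).
    llm = Llama(model_path="hermes-trismegistus-mistral-7b.Q4_K_M.gguf", n_ctx=2048)

    # Offline text-to-speech; pick a female-sounding voice if the OS has one.
    tts = pyttsx3.init()
    for voice in tts.getProperty("voices"):
        if "female" in voice.name.lower():
            tts.setProperty("voice", voice.id)
            break

    history = [{"role": "system",
                "content": "You are a warm, flirty conversation partner."}]

    while True:
        user = input("you> ")
        history.append({"role": "user", "content": user})
        reply = llm.create_chat_completion(messages=history)
        text = reply["choices"][0]["message"]["content"]
        history.append({"role": "assistant", "content": text})
        print("bot>", text)
        tts.say(text)       # speak the reply aloud
        tts.runAndWait()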
