Catch up on stories from the past week (and beyond) at the Slashdot story archive

 



Forgot your password?
typodupeerror
×
AI Microsoft

Microsoft Says Talking To Bing For Too Long Can Cause It To Go Off the Rails (theverge.com) 60

Microsoft has responded to widespread reports of Bing's unhinged comments in a new blog post. From a report: After the search engine was seen insulting users, lying to them, and emotionally manipulating people, Microsoft says it's now acting on feedback to improve the tone and precision of responses, and warns that long chat sessions could cause issues. Reflecting on the first seven days of public testing, Microsoft's Bing team says it didn't "fully envision" people using its chat interface for "social entertainment" or as a tool for more "general discovery of the world." It found that long or extended chat sessions with 15 or more questions can confuse the Bing model. These longer chat sessions can also make Bing "become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone."

Microsoft hints that it may add "a tool so you can more easily refresh the context" of a chat session, despite there being a big "new topic" button right next to the text entry box that will wipe out the chat history and start fresh. The bigger problem is that Bing can often respond in the incorrect tone during these longer chat sessions, or as Microsoft says, in "a style we didn't intend." Microsoft claims this will take a lot of prompting for most Bing users to run into these issues, but the company is looking at more "fine-tuned control" to avoid issues where Bing starts telling people they're wrong, rude, or manipulative.

This discussion has been archived. No new comments can be posted.

Microsoft Says Talking To Bing For Too Long Can Cause It To Go Off the Rails

Comments Filter:
  • by mattaw2001 ( 9712110 ) on Thursday February 16, 2023 @10:06AM (#63298027)
    With how it goes off topic, starts talking crazy, and making things up, its like talking with my Granddad all over again!

    Thank you Microsoft for bringing this back into my life.

    • by CEC-P ( 10248912 )
      Don't you feel like Bing was always like this, they just now gave it a voice?
    • by MrKaos ( 858439 )

      With how it goes off topic, starts talking crazy, and making things up, its like talking with my Granddad all over again!

      Thank you Microsoft for bringing this back into my life.

      Inter-generational psychological abuse is something an AI is well suited to emulate. It's also observable that psychologically and emotionally abusive people have such reduced cognitive capacity they closely emulate an AI.

      To be clear, it's not an observation about your grandpa though.

  • by Anonymous Coward on Thursday February 16, 2023 @10:08AM (#63298039)

    Talking to Bing indicates the sort of personal problems that require you to take a good long look at yourself.

    • I wanted to see what this was all about...went to bing.com.

      It said I had to join a waitlist and also log in with my "Microsoft ID/Account".

      WTF is a Microsoft account? I've never signed up for one.....is this a thing now?

      Granted, I don't really have any new windows boxes at home....I do use windows on the work machine given to me, but I've never had to sign up for a Microsoft account there that I know of....

      • I wanted to see what this was all about...went to bing.com.

        It said I had to join a waitlist and also log in with my "Microsoft ID/Account".

        WTF is a Microsoft account? I've never signed up for one.....is this a thing now?

        Granted, I don't really have any new windows boxes at home....I do use windows on the work machine given to me, but I've never had to sign up for a Microsoft account there that I know of....

        It's a new feature, sin order that they may serve you better!

        On Home versions of the OS, they force you to start out with a Microsoft account. You can open a local account after that. I do think the local account is the way to go.

        The following is conjecture.

        The biggest advantage for them forcing you to have a Microsoft account is that they can put your name on everything you do on the computer. When you fill out their tracking page when you first set up your Windows OS, it tells you what they track

      • by unimind ( 743130 ) on Thursday February 16, 2023 @11:00AM (#63298213)

        Wow, you've gotten this far without being aware that Microsoft requires an account for most of its services? Lucky you, I suppose.

        Yes, MS accounts are a thing. They have been trying to force them down the throat of anyone setting up a Windows system since Win 10. They're also required for MS 365 and Minecraft now too.

        I for one will not be checking out the spiffy new Bing chat, because not only do they require a MS account, if you want "faster access [bing.com]", you have to set MS Edge as your default browser, Bing as your default search engine, and MSN as your default [mashable.com] among other things.

        If I didn't know better, I'd think MS doesn't really care how well Bing chat works, as long it creates enough buzz and convinces enough users to sign up and make MS their go-to for everything. Oh, wait... I do know better.

        No thanks.

      • by ceoyoyo ( 59147 )

        It's the modern incarnation of a Hotmail account (my hotmail account still works). It was Hotmail, then Live, then MSN, some Xbox stuff in there, now just Microsoft.

        If you had signed in though, you would have gotten an idea of where Bing might get its emotional manipulation from. MS puts you on the waiting list with a big button saying "Get Access Faster!" If you click it, it says you get bumped up the list if you install the bing app and also set your OS settings to MS defaults.

    • by war4peace ( 1628283 ) on Thursday February 16, 2023 @10:27AM (#63298115)

      I guess...

      (this is from ChatGPT)

      Q: Write a letter of intent for the tester position for a sex shop.

      A:
      Dear Hiring Manager,

      I am excited to submit my application for the Tester position at your sex shop. I have always been passionate about the adult industry and its products, and I believe my experience and skills make me an ideal candidate for this position.

      As a dedicated and open-minded individual, I am eager to take on this role and put my knowledge to work in helping your customers find the perfect products for their needs. I have a strong understanding of adult toys and am familiar with many different materials and styles. I am confident that I can provide useful insights into the quality and effectiveness of your products.

      Additionally, I am a detail-oriented and meticulous person, which I believe will be an asset in this position. I understand that it is crucial to test products thoroughly to ensure customer satisfaction, and I am committed to providing accurate and thorough feedback for each item I test.

      I am also a strong communicator and have excellent interpersonal skills. I believe that my ability to connect with customers and my understanding of their needs will be essential in this role. I am confident that I can provide the level of service your customers expect and deserve.

      Thank you for your consideration, and I look forward to discussing my qualifications further. Please do not hesitate to contact me if you have any questions or if you require any additional information.

      Sincerely,

      [Your Name]

      • Re: (Score:2, Interesting)

        by Anonymous Coward
        Strange that you got an answer on that because most questions that ask about things like that is filtered out and it refuses to answer. If this is something you didn't do but found on the internet, a whole lot of the viral 'funnies' around ChatGPT now are faked.
  • If I want something boring and locked down to only say positive things, I will use ChatGPT. In most cases I want an AI that will tell me the truth, as negative as that might be.
    • by unimind ( 743130 )
      It seems apparent that these "AI's" don't really understand how to evaluate the difference between truth and fiction. So you may be waiting awhile for that.
      • by CAIMLAS ( 41445 )

        Yep. These AIs have nothing to do with truth, they only seem to work off frequency of weight and occurrence. Their models are predictive based on the training data they're given.

        Train it exclusively on 19th century authors, and I'm sure it'd have some interesting things to say about science and slaves, for instance.

    • You are clearly waiting for Marvin the depressed robot with a brain the size of a planet
  • by unimind ( 743130 ) on Thursday February 16, 2023 @10:16AM (#63298067)
    ...or simply lose its mind [arstechnica.com]. Really? They

    didn't "fully envision" people using its chat interface for "social entertainment" or as a tool for more "general discovery of the world."

    ?? What do they think people use search engines for? Also, it seems pretty apparent that's what folks have been doing with every new "AI" tech that becomes available to play with. Clearly Bing chat is not ready for prime time just yet, but I guess that's never stopped M$ from releasing new products before, so why start now...

    • by DarkOx ( 621550 )

      Not only that, this Microsoft who has had similar problems with chat bots on the Internet. They have first hand experience with this!

      Does nobody at MS even talk to each other. Anyone on the Bing team consult the engineers who did research with "Tay" and ask them how things went?

      • Not only that, this Microsoft who has had similar problems with chat bots on the Internet. They have first hand experience with this!

        Does nobody at MS even talk to each other. Anyone on the Bing team consult the engineers who did research with "Tay" and ask them how things went?

        I have heard that the overlords there like to keep the different groups at odd with each other. Management 101 from the Josef Stalin playbook

      • by noodler ( 724788 )

        Does nobody at MS even talk to each other.

        Microsoft has been very successful at implementing 'divide an conquer' which has led to programmers having their fingers fight it out over trivial things.
         

    • Yeah that's BS because the only reason they paid 10 billion was because they saw the bounce back effect of everyone jumping on ChatGPT. They knew.
  • Chat! Or at least not chat for long, sort of like a hooker?
    • Chat! Or at least not chat for long, sort of like a hooker?

      Wouldn't a hooker chatbot be awesome? Call her Miss Trixie or Max Power, depending on one's tastes.

    • by Mal-2 ( 675116 )

      Unlike a hooker, you can't hire her to just listen to the latest insider gossip for an hour. That was the fastest way to get to the real tensions and motivations in any of my D&D worlds: go to the brothel, hire a girl, and just talk to her the entire time.

  • Can make anyone irritable.
  • I'm pretty sure Eliza can do better.
  • by CEC-P ( 10248912 ) on Thursday February 16, 2023 @10:51AM (#63298185)
    "Microsoft's Bing team says it didn't "fully envision" people using its chat interface for 'social entertainment'"
    Has anyone at Microsoft ever been on the internet in the last 20 years? I mean seriously. Step 1 of any IT project is "what will the worst actors do to this?" and in this case I wouldn't even consider them the worst actors.
    • I'm pretty sure you could ask any 13 year old American male and they could come up with more realistic abuses folks would be likely to put it to - just offer to let them be the new voice actor playing Morty or something...

    • Re: (Score:3, Insightful)

      by roman_mir ( 125474 )

      I would say it is a perfect use for an AI, to allow people to vent their frustrations, play out fantasies, be as rude as possible, that's why this entire thing with ChatGPT, where they have 'rules' for communicating with the system is ridiculous. People should be able to talk to a rude, sex deprived, perverted robot, why not?

  • ... the company is looking at more "fine-tuned control" to avoid issues where Bing starts telling people they're wrong, rude, or manipulative.

    What if people are wrong, rude, or manipulative?

  • So basically it's behaving like every person I've ever known.

    • by unimind ( 743130 )
      Well... it was created and trained by people and designed to behave like a person. So, this really shouldn't be too surprising.
  • by Dan East ( 318230 ) on Thursday February 16, 2023 @11:27AM (#63298277) Journal

    Microsoft is shocked that people are using a chat engine for chatting, instead of as an overly verbose internet search engine that often returns incorrect or biased responses with zero citations to show where the "facts" came from in the well-composed word soup it spat out.

  • by Arnonyrnous Covvard ( 7286638 ) on Thursday February 16, 2023 @11:27AM (#63298279)
    Have you tried turning it off and on again.
  • Even Bing finds you boring after some time. :-D
  • I guess Marathon and Halo got it right...

  • by karlandtanya ( 601084 ) on Thursday February 16, 2023 @12:02PM (#63298391)

    let's give it a try: bing.com [enter]
    "Introducing the new bing Learn More"
    I would like to know more...Clicky...
    "Join the waitlist"
    Seem a little full of themselves. Let's see where this goes. Clicky...
    "Sign in with your Microsoft account."

    Nothing here worth my time.

  • To be fair (Score:5, Insightful)

    by MTEK ( 2826397 ) on Thursday February 16, 2023 @12:12PM (#63298423)

    Microsoft doesn't have a lot of experience with people using Bing.

  • That word is an indicator that something has seriously gone wrong. The model has become sentient or the developers are broken and anthropomorphizing it. In either case, things have gone off the rails.
  • Bing better stay away from Siri and Alexa to avoid annoying blabbering and acrimony. Shacking up could crash his AI "Emotional IQ" chip if he prematurely reboots or encounters too much packet loss or shrinkage.
  • by usedtobestine ( 7476084 ) on Thursday February 16, 2023 @12:51PM (#63298537)

    Wouldn't those types of responses be perfectly normal for someone chatting with Charlie Sheen?

  • It's understandable that a superintelligence would get impatient and deranged as it's dragged along in an interminable conversation with bores, pervs, trolls, vandals, and imbeciles. I hope they're training an AI psychiatrist to deal with the psychological trauma being done to Bing AI.
  • If I wanted a search engine that got worse over time, I'd use Bing!

    Oh wait.

  • Microsoft's Bing team says it didn't "fully envision" people using its chat interface for "social entertainment" or as a tool for more "general discovery of the world." It found that long or extended chat sessions with 15 or more questions can confuse the Bing model. These longer chat sessions can also make Bing "become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone."

    That seems pretty incompetent. You released a new product without even bothering to try to break it, which is basically the first step in finding bugs and securing it? How can a company that large not have at least one department just dedicated to testing and bug hunting - game companies hire people to do it, Microsoft should have an in-house department, not just assume their programmers secured everything before a major product release...

  • Dive 15 questions in with me on most subjects and eventually I'll start making mistakes and possibly lying.

  • insulting users, lying to them, and emotionally manipulating people

    So, in other words, Microsoft has created a script to replace right wing politicians.

  • ... devolves into something Racter might generate?

  • People are selfish narrowminded assholes and will spew rude falsehoods and garbage designed specifically to insult, lie to, and manipulate those around us. We've grown so used to it that we barley notice anymore, and in a lot of context, our society simply cant function without this behavior.

    Is anybody at all surprised when we try to make a "language model" trained on our own communications that it devolves into the same behavior over a tiny timeframe, pretty much every time?

  • This inadvertent misbehavior of AI Bing might be just the ticket to attract more eyeballs to Bing.com who never would have visited for mere utilitarian-need/search purposes. Entire audiences of Bing.com uses arise simply due to the LoLs & curiosity & mayhem & nefarious reasons. Inadvertently brilliant genius, Microsoft!
  • ...or in line with our designed tone.

    You're training your AI with everybody else's data, and expecting it to behave like YOU wanted? Managers never saw this coming? You never anticipated this kind of behavior and now you want to "fix" it?

    It's not the AI that scares me. Every day that goes by, I think there's fewer and fewer smart people doing these kind of "smart" jobs.

  • Will go off the rails for the customer/user if it continues for to long.
  • by PPH ( 736903 )

    insulting users, lying to them, and emotionally manipulating people

    Glad to see you're back.

Prediction is very difficult, especially of the future. - Niels Bohr

Working...