Microsoft Says Talking To Bing For Too Long Can Cause It To Go Off the Rails (theverge.com) 60
Microsoft has responded to widespread reports of Bing's unhinged comments in a new blog post. From a report: After the search engine was seen insulting users, lying to them, and emotionally manipulating people, Microsoft says it's now acting on feedback to improve the tone and precision of responses, and warns that long chat sessions could cause issues. Reflecting on the first seven days of public testing, Microsoft's Bing team says it didn't "fully envision" people using its chat interface for "social entertainment" or as a tool for more "general discovery of the world." It found that long or extended chat sessions with 15 or more questions can confuse the Bing model. These longer chat sessions can also make Bing "become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone."
Microsoft hints that it may add "a tool so you can more easily refresh the context" of a chat session, despite there being a big "new topic" button right next to the text entry box that will wipe out the chat history and start fresh. The bigger problem is that Bing can often respond in the incorrect tone during these longer chat sessions, or as Microsoft says, in "a style we didn't intend." Microsoft claims it will take a lot of prompting for most Bing users to run into these issues, but the company is looking at more "fine-tuned control" to avoid issues where Bing starts telling people they're wrong, rude, or manipulative.
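The "new topic" reset Microsoft describes is conceptually simple. Purely as an illustration (this is a toy sketch, not anything resembling Bing's actual implementation; the class, names, and the 15-turn threshold as a hard cutoff are all assumptions), a chat wrapper that wipes its own history once a session passes a turn limit might look like:

```python
# Toy sketch (NOT Microsoft's implementation): a chat wrapper that
# clears its message history once the session exceeds a turn limit,
# mimicking what the "new topic" button does manually.
TURN_LIMIT = 15  # the threshold Microsoft cites for sessions going off the rails


class ChatSession:
    def __init__(self, turn_limit=TURN_LIMIT):
        self.turn_limit = turn_limit
        self.history = []  # (role, text) pairs fed back to the model as context

    def ask(self, question, model=lambda history: "(model reply)"):
        # One turn = one user message plus one assistant message.
        if len(self.history) // 2 >= self.turn_limit:
            self.new_topic()  # auto-refresh the context instead of drifting
        self.history.append(("user", question))
        reply = model(self.history)
        self.history.append(("assistant", reply))
        return reply

    def new_topic(self):
        # Equivalent of the "new topic" button: wipe context, start fresh.
        self.history.clear()
```

The point of the sketch is just that "refreshing the context" means discarding the accumulated history the model conditions on, which is why long sessions and tone drift go together.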
It's like talking with my Granddad all over again! (Score:3, Funny)
Thank you Microsoft for bringing this back into my life.
Re: (Score:1)
Re: (Score:2)
With how it goes off topic, starts talking crazy, and makes things up, it's like talking with my Granddad all over again!
Thank you Microsoft for bringing this back into my life.
Inter-generational psychological abuse is something an AI is well suited to emulate. It's also observable that psychologically and emotionally abusive people have such reduced cognitive capacity that they closely emulate an AI.
To be clear, it's not an observation about your grandpa though.
Rest of world says... (Score:4, Insightful)
Talking to Bing indicates the sort of personal problems that require you to take a good long look at yourself.
Re: (Score:2)
It said I had to join a waitlist and also log in with my "Microsoft ID/Account".
WTF is a Microsoft account? I've never signed up for one.....is this a thing now?
Granted, I don't really have any new windows boxes at home....I do use windows on the work machine given to me, but I've never had to sign up for a Microsoft account there that I know of....
Re: (Score:2)
I wanted to see what this was all about...went to bing.com.
It said I had to join a waitlist and also log in with my "Microsoft ID/Account".
WTF is a Microsoft account? I've never signed up for one.....is this a thing now?
Granted, I don't really have any new windows boxes at home....I do use windows on the work machine given to me, but I've never had to sign up for a Microsoft account there that I know of....
It's a new feature, in order that they may serve you better!
On Home versions of the OS, they force you to start out with a Microsoft account. You can open a local account after that. I do think the local account is the way to go.
The following is conjecture.
The biggest advantage for them in forcing you to have a Microsoft account is that they can put your name on everything you do on the computer. The tracking page you fill out when first setting up Windows tells you what they track.
Re:Rest of world says... (Score:5, Interesting)
Wow, you've gotten this far without being aware that Microsoft requires an account for most of its services? Lucky you, I suppose.
Yes, MS accounts are a thing. They have been trying to force them down the throat of anyone setting up a Windows system since Win 10. They're also required for MS 365 and Minecraft now too.
I for one will not be checking out the spiffy new Bing chat, because not only do they require a MS account, if you want "faster access [bing.com]", you have to set MS Edge as your default browser, Bing as your default search engine, and MSN as your default [mashable.com] among other things.
If I didn't know better, I'd think MS doesn't really care how well Bing chat works, as long it creates enough buzz and convinces enough users to sign up and make MS their go-to for everything. Oh, wait... I do know better.
No thanks.
Re: (Score:2)
It's the modern incarnation of a Hotmail account (my hotmail account still works). It was Hotmail, then Live, then MSN, some Xbox stuff in there, now just Microsoft.
If you had signed in though, you would have gotten an idea of where Bing might get its emotional manipulation from. MS puts you on the waiting list with a big button saying "Get Access Faster!" If you click it, it says you get bumped up the list if you install the bing app and also set your OS settings to MS defaults.
Re:Rest of world says... (Score:5, Funny)
I guess...
(this is from ChatGPT)
Q: Write a letter of intent for the tester position for a sex shop.
A:
Dear Hiring Manager,
I am excited to submit my application for the Tester position at your sex shop. I have always been passionate about the adult industry and its products, and I believe my experience and skills make me an ideal candidate for this position.
As a dedicated and open-minded individual, I am eager to take on this role and put my knowledge to work in helping your customers find the perfect products for their needs. I have a strong understanding of adult toys and am familiar with many different materials and styles. I am confident that I can provide useful insights into the quality and effectiveness of your products.
Additionally, I am a detail-oriented and meticulous person, which I believe will be an asset in this position. I understand that it is crucial to test products thoroughly to ensure customer satisfaction, and I am committed to providing accurate and thorough feedback for each item I test.
I am also a strong communicator and have excellent interpersonal skills. I believe that my ability to connect with customers and my understanding of their needs will be essential in this role. I am confident that I can provide the level of service your customers expect and deserve.
Thank you for your consideration, and I look forward to discussing my qualifications further. Please do not hesitate to contact me if you have any questions or if you require any additional information.
Sincerely,
[Your Name]
Re: (Score:2, Interesting)
Re: (Score:2)
Re: (Score:2)
Interestingly, I have not changed its behavior at all; it's just standard ChatGPT.
https://ibb.co/G3GYs36 [ibb.co]
Re: (Score:2)
No, I did it myself using vanilla ChatGPT, no other parameters.
https://ibb.co/G3GYs36 [ibb.co]
People want the truth (Score:2)
Re: (Score:2)
Re: (Score:1)
Yep. These AIs have nothing to do with truth; they only seem to work off weighted frequency and co-occurrence. Their models are predictive based on the training data they're given.
Train it exclusively on 19th century authors, and I'm sure it'd have some interesting things to say about science and slaves, for instance.
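The "frequency and co-occurrence" point above can be shown with a toy example (this is a deliberately crude bigram model for illustration, nowhere near what ChatGPT or Bing actually use): the model simply predicts whichever word most often followed the previous one in its training text, so truth never enters into it.

```python
# Toy bigram "language model": predict the next word purely from
# how often it followed the previous word in the training text.
from collections import Counter, defaultdict


def train_bigrams(text):
    counts = defaultdict(Counter)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts


def predict_next(counts, word):
    # Most frequent follower wins; no notion of truth anywhere.
    if not counts[word]:
        return None
    return counts[word].most_common(1)[0][0]


corpus = "the sun orbits the earth and the earth is flat"
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # → "earth": confident, frequent, and wrong
```

Train it on flat-earth text and it confidently continues flat-earth text, which is exactly the 19th-century-authors point: the output reflects the corpus, not reality.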
Re: People want the truth (Score:3)
Go off the rails... (Score:3)
didn't "fully envision" people using its chat interface for "social entertainment" or as a tool for more "general discovery of the world."
?? What do they think people use search engines for? Also, it seems pretty apparent that's what folks have been doing with every new "AI" tech that becomes available to play with. Clearly Bing chat is not ready for prime time just yet, but I guess that's never stopped M$ from releasing new products before, so why start now...
Re: (Score:2)
Not only that, this is Microsoft, which has had similar problems with chat bots on the Internet before. They have first-hand experience with this!
Does nobody at MS even talk to each other? Did anyone on the Bing team consult the engineers who did the research with "Tay" and ask them how things went?
Re: (Score:2)
Not only that, this is Microsoft, which has had similar problems with chat bots on the Internet before. They have first-hand experience with this!
Does nobody at MS even talk to each other? Did anyone on the Bing team consult the engineers who did the research with "Tay" and ask them how things went?
I have heard that the overlords there like to keep the different groups at odds with each other. Management 101 from the Josef Stalin playbook.
Re: (Score:2)
Does nobody at MS even talk to each other?
Microsoft has been very successful at implementing 'divide and conquer', which has led to programmers having their fingers fight it out over trivial things.
Re: Go off the rails... (Score:2)
So a chat bot is not expected to... (Score:2)
Re: (Score:2)
Chat! Or at least not chat for long, sort of like a hooker?
Wouldn't a hooker chatbot be awesome? Call her Miss Trixie or Max Power, depending on one's tastes.
Re: (Score:2)
Unlike a hooker, you can't hire her to just listen to the latest insider gossip for an hour. That was the fastest way to get to the real tensions and motivations in any of my D&D worlds: go to the brothel, hire a girl, and just talk to her the entire time.
What like Ozzy? (Score:2)
"I'm going off the rails on a crazy train." [youtube.com]
Sharon!
Talking for a long time (Score:2)
Only 15 questions? (Score:2)
you've got to be kidding (Score:5, Insightful)
Has anyone at Microsoft ever been on the internet in the last 20 years? I mean seriously. Step 1 of any IT project is "what will the worst actors do to this?" and in this case I wouldn't even consider them the worst actors.
Re: (Score:2)
I'm pretty sure you could ask any 13 year old American male and they could come up with more realistic abuses folks would be likely to put it to - just offer to let them be the new voice actor playing Morty or something...
Re: (Score:3, Insightful)
I would say it is a perfect use for an AI: to allow people to vent their frustrations, play out fantasies, and be as rude as possible. That's why this entire thing with ChatGPT, where they have 'rules' for communicating with the system, is ridiculous. People should be able to talk to a rude, sex-deprived, perverted robot; why not?
But... (Score:2)
What if people are wrong, rude, or manipulative?
Re: But... (Score:2)
Hmmm (Score:2)
So basically it's behaving like every person I've ever known.
Re: (Score:2)
Microsoft is shocked! (Score:3)
Microsoft is shocked that people are using a chat engine for chatting, instead of as an overly verbose internet search engine that often returns incorrect or biased responses with zero citations to show where the "facts" came from in the well-composed word soup it spat out.
Reset the Dixie Flatline (Score:3)
More human than we think (Score:2)
Rampancy is real? (Score:2)
I guess Marathon and Halo got it right...
so...wonder if it's any better than cleverbot (Score:4, Informative)
let's give it a try: bing.com [enter]
"Introducing the new bing Learn More"
I would like to know more...Clicky...
"Join the waitlist"
Seem a little full of themselves. Let's see where this goes. Clicky...
"Sign in with your Microsoft account."
Nothing here worth my time.
To be fair (Score:5, Insightful)
Microsoft doesn't have a lot of experience with people using Bing.
Re: (Score:2)
Provoked?! (Score:2)
Bing must be single... (Score:2)
What if Charlie Sheen is the one chatting? (Score:3)
Wouldn't those types of responses be perfectly normal for someone chatting with Charlie Sheen?
Re: (Score:2)
Not enough WINNING!
No wonder, it's talking with nitwit humans (Score:2)
Not what I wanted (Score:1)
If I wanted a search engine that got worse over time, I'd use Bing!
Oh wait.
No testing team? (Score:2)
Microsoft's Bing team says it didn't "fully envision" people using its chat interface for "social entertainment" or as a tool for more "general discovery of the world." It found that long or extended chat sessions with 15 or more questions can confuse the Bing model. These longer chat sessions can also make Bing "become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone."
That seems pretty incompetent. You released a new product without even bothering to try to break it, which is basically the first step in finding bugs and securing it? How can a company that large not have at least one department dedicated to testing and bug hunting? Game companies hire people to do it; Microsoft should have an in-house department, not just assume its programmers secured everything before a major product release...
15 questions? (Score:2)
Go 15 questions deep with me on most subjects and eventually I'll start making mistakes and possibly lying.
Oh, so ... (Score:2)
insulting users, lying to them, and emotionally manipulating people
So, in other words, Microsoft has created a script to replace right wing politicians.
So the AI's rambling eventually... (Score:2)
... devolves into something Racter might generate?
art imitates life (Score:2)
People are selfish, narrow-minded assholes who will spew rude falsehoods and garbage designed specifically to insult, lie to, and manipulate those around them. We've grown so used to it that we barely notice anymore, and in a lot of contexts, our society simply can't function without this behavior.
Is anybody at all surprised when we try to make a "language model" trained on our own communications that it devolves into the same behavior over a tiny timeframe, pretty much every time?
suddenly people have a reason to visit bing.com (Score:2)
Kinda like male nerds (Score:1)
and women.
Expectations (Score:2)
...or in line with our designed tone.
You're training your AI with everybody else's data, and expecting it to behave like YOU wanted? Managers never saw this coming? You never anticipated this kind of behavior and now you want to "fix" it?
It's not the AI that scares me. Every day that goes by, I think there are fewer and fewer smart people doing these kinds of "smart" jobs.
Any interaction with Microsoft Anything! (Score:2)
Tay! (Score:2)
insulting users, lying to them, and emotionally manipulating people
Glad to see you're back.