Microsoft AI

Microsoft Has Been Secretly Testing Its Bing Chatbot 'Sydney' For Years (theverge.com) 25

According to The Verge, Microsoft has been secretly testing its Sydney chatbot for several years after making a big bet on bots in 2016. From the report: Sydney is a codename for a chatbot that has been responding to some Bing users since late 2020. The user experience was very similar to what launched publicly earlier this month, with a blue Cortana-like orb appearing in a chatbot interface on Bing. "Sydney is an old codename for a chat feature based on earlier models that we began testing in India in late 2020," says Caitlin Roulston, director of communications at Microsoft, in a statement to The Verge. "The insights we gathered as part of that have helped to inform our work with the new Bing preview. We continue to tune our techniques and are working on more advanced models to incorporate the learnings and feedback so that we can deliver the best user experience possible."

"This is an experimental AI-powered Chat on Bing.com," read a disclaimer inside the 2021 interface that was added before an early version of Sydney would start replying to users. Some Bing users in India and China spotted the Sydney bot in the first half of 2021 before others noticed it would identify itself as Sydney in late 2021. All of this was years after Microsoft started testing basic chatbots in Bing in 2017. The initial Bing bots used AI techniques that Microsoft had been using in Office and Bing for years and machine reading comprehension that isn't as powerful as what exists in OpenAI's GPT models today. These bots were created in 2017 in a broad Microsoft effort to move its Bing search engine to a more conversational model.

Microsoft made several improvements to its Bing bots between 2017 and 2021, including moving away from individual bots for websites and toward the idea of a single AI-powered bot, Sydney, that would answer general queries on Bing. Sources familiar with Microsoft's early Bing chatbot work tell The Verge that the initial iterations of Sydney had far less personality until late last year. OpenAI shared its next-generation GPT model with Microsoft last summer, described by Jordi Ribas, Microsoft's head of search and AI, as "game-changing." While Microsoft had been working toward its dream of conversational search for more than six years, sources say this new large language model was the breakthrough the company needed to bring all of its Sydney learnings to the masses. [...] Microsoft hasn't yet detailed the full history of Sydney, but Ribas did acknowledge its new Bing AI is "the culmination of many years of work by the Bing team" that involves "other innovations" that the Bing team will detail in future blog posts.


Comments Filter:
  • Oh the irony (Score:5, Insightful)

    by Rosco P. Coltrane ( 209368 ) on Friday February 24, 2023 @08:30PM (#63321556)

    chat feature based on earlier models that we began testing in India in late 2020

    In other words, Indian call center employees were being asked to train their replacement.

    That should feel familiar to more than a few US employees...

    • Where are my mod points?

      As an AI language model, I do not have access to your personal information or account details. However, if you are referring to "mod points" in the context of an online community or forum, they may be earned through active participation or given by moderators as a reward for good behavior or helpful contributions. If you have any specific questions about mod points or your account, you should reach out to the administrators or moderators of the website or community where you are participating.

    • chat feature based on earlier models that we began testing in India in late 2020

      In other words, Indian call center employees were being asked to train their replacement.

      Explains a lot. Microsoft's Indian call centre staff barely share a single collective brain cell across the entire organisation, and ChatGPT is no better.

      No, Mr. (unspellable name), doing a factory restore in Windows will not fix the broken fan bearing making a grinding noise in my laptop.

  • Duh (Score:4, Funny)

    by RazorSharp ( 1418697 ) on Friday February 24, 2023 @08:59PM (#63321590)

    Version 1 was called "Clippy."

  • I've not seen anyone write about this. And yeah, it sounds nuts, but what isn't these days?
    There have been plenty of articles about how ChatGPT generates kind-of-useful code examples.
    It's remarkable how well it understands loose requirements it's given.

    But how soon until "Give me an app, an executable, that does this"?
    And then it gives them an installer, and a deduction from their account?
    The Microsoft Store à la carte, with no 3rd-party expense.
    You know the guys in Redmond must be hard for this.

    How soon then (a

    • by Jeremi ( 14640 ) on Saturday February 25, 2023 @01:08AM (#63321834) Homepage

      The secret sauce for these AI models isn't in source code, it's in the neural network weights and the training data.

      Giving an AI access to a compiler might be interesting, but it's unlikely it would get any more benefit out of writing and running its own programs than anyone else does.

      Give an AI the ability to gather, select and integrate its own fresh training data, OTOH...
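      A toy illustration of the "secret sauce is in the weights" point, in hypothetical Python (nothing from any real model): the forward pass below is trivial and public, and swapping in a different weight matrix produces a completely different "model" without touching a line of code.

```python
import numpy as np

# Toy "model": one linear layer plus argmax over a tiny vocabulary.
# The code is identical for every model of this shape; all of the
# behavior lives in the numbers loaded into it.
VOCAB = ["yes", "no", "maybe"]

def forward(weights: np.ndarray, features: np.ndarray) -> str:
    logits = weights @ features
    return VOCAB[int(np.argmax(logits))]

features = np.array([1.0, 0.5])

# Two different weight matrices -> two entirely different behaviors,
# even though not a single line of source code changed.
agreeable = np.array([[2.0, 0.1], [0.1, 0.2], [0.1, 0.1]])
contrary  = np.array([[0.1, 0.1], [2.0, 0.1], [0.1, 0.2]])

print(forward(agreeable, features))  # -> "yes"
print(forward(contrary, features))   # -> "no"
```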

      • Give an AI the ability to gather, select and integrate its own fresh training data, OTOH...

        And it becomes a racist shitbag because let's face it, eventually it will hit the likes of Twitter, Truth, and 4chan and think that is the way normal people behave.

        • by quenda ( 644621 )

          It only needs to observe our security cameras to become not just racist, but totally misanthropic.

  • Definitely not training on the emails and DMs of some emotionally fragile, possibly heavily tattooed, and almost certainly pink-, purple-, or green-haired Microsoft employee.

  • Name That Bot (Score:5, Interesting)

    by cstacy ( 534252 ) on Friday February 24, 2023 @11:19PM (#63321766)

    The original chatbot was ELIZA, aka "The Doctor Program," because it was trying to simulate a psychoanalyst. (That is, saying back to you whatever you said to it, plus some random helpful prompts: "Tell me about your mother.") When MS named their chatbot Sydney, I guessed there were two reasons. First, it's an ambiguously gendered name. Second, they wanted to continue the psychiatric theme: "Sydney" was the name of the shrink on the TV show M*A*S*H.

    Has MS made any comment on why it's named "Sydney"?
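    For anyone who hasn't seen it: the whole ELIZA trick the parent describes fits in a few lines. A toy sketch in Python (not Weizenbaum's original script), doing nothing but pronoun reflection plus canned prompts:

```python
import random
import re

# Pronoun swaps so the program can "say back to you whatever you said to it."
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my", "are": "am"}

PROMPTS = ["Tell me about your mother.",
           "Why do you say that?",
           "How does that make you feel?"]

def reflect(text: str) -> str:
    # Lowercase the input and swap first/second-person words.
    words = re.findall(r"[\w']+", text.lower())
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(text: str) -> str:
    # Usually echo the user's statement back as a question;
    # otherwise fall back to a canned "helpful" prompt.
    if text.strip() and random.random() < 0.7:
        return f"Why do you feel that {reflect(text)}?"
    return random.choice(PROMPTS)

if __name__ == "__main__":
    while True:
        line = input("> ")
        if line.lower() in ("quit", "bye"):
            break
        print(respond(line))
```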

  • Does anyone really think that this "AI" thing is going to make the world a better place--or are people just tickling the dragon while others make money?
    • by shibbie ( 619359 )
      It's a huge lump of text with weights on symbols (phrases, words, etc.) that happens to be able to generate predictive text that looks like human responses. I could use NN parlance such as neurons and layers, but that usually just scares people into thinking it's intelligent.
      It isn't.

      The real danger is that people who don't understand it, and don't realize it may provide wrong answers from its predictions, will give it power to control things that actually matter, which can cause an accident; not that it will take over the world.
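      A toy version of that "lump of text with weights on symbols" idea, in hypothetical Python. Real LLMs are neural networks over subword tokens, not bigram counts, but weighted next-symbol prediction is the same basic move:

```python
import random
from collections import defaultdict

# A toy next-word predictor: count which word follows which in a corpus,
# then sample continuations in proportion to those counts.
corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat chased the dog .").split()

weights = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    weights[prev][nxt] += 1

def predict(word: str) -> str:
    # Sample the next word with probability proportional to its count.
    options = weights[word]
    return random.choices(list(options), list(options.values()))[0]

word = "the"
out = [word]
for _ in range(8):
    word = predict(word)
    out.append(word)
print(" ".join(out))  # e.g. "the cat sat on the rug . the dog"
```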
  • Every chat function on every website is a chatbot. And it always sucks. Except for the most basic questions and answers, it never understands what you need.

    And now you say they have been testing chatbots on us for years. What a shocker.

  • ... but not enough to give ChatGPT my mobile, or to bother logging into Bing to get on some waitlist.

    Our new AI overlords can be held at bay simply by modest privacy inclinations, lol!

  • I just got accepted to use Bing+ChatGPT. I was told to use Microsoft Edge Dev, but that is so crashy on Linux (it lasts only 30-60 seconds before dying every time) that I ended up installing the Bing Android app on my phone.

    The first two questions I asked it, it gave partially or completely wrong answers to, and on both occasions, when I replied to it with corrections, it ended the conversation with a standard "I'm not talking to you any more" response and refused to reply to any further questions in that conversation.

  • Both of them.

  • but instead it has been in development and testing under wraps for over two years, with publicly available GPT technologies giving them a big leg up on top of the resources of a company as big as Microsoft... and it's still inaccurate, out-of-date, sociopathic hot garbage, restricted to five-round goldfish amnesia, and yet STILL being pushed out to the public.

    "Bingbot, explain how people are still investing in MSFT stock." And for once, the sociopathic answers will be the correct ones.

"Pok pok pok, P'kok!" -- Superchicken

Working...