Microsoft Gives Bing's AI Chatbot Personality Options (engadget.com)

According to web services chief Mikhail Parakhin, Microsoft is giving Bing preview testers a toggle to change the chatbot's responses. Engadget reports: A Creative option allows for more "original and imaginative" (read: fun) answers, while a Precise switch emphasizes shorter, to-the-point replies. There's also a Balanced setting that aims to strike a middle ground.

The company reined in the Bing AI's responses after early users noticed strange behavior during long chats and "entertainment" sessions. As The Verge observes, the restrictions irked some users, since the chatbot would simply decline to answer certain questions. Microsoft has been gradually lifting those limits, and just this week updated the AI to reduce both unresponsiveness and "hallucinations." The bot may not be as wonderfully weird as before, but it should be more willing to indulge your curiosity.
