Comment Bing+ChatGPT is always right, really Microsoft? (Score 1) 25
I just got accepted to use Bing+ChatGPT. I was told to use Microsoft Edge Dev, but that is so crashy on Linux (it lasts only 30-60 seconds before dying, every time) that I ended up installing the Bing Android app on my phone instead.
The first two questions I asked it, it gave partially or completely wrong answers to. On both occasions, when I replied with corrections, it ended the conversation with a stock "I'm not talking to you any more" response and refused to answer any further questions in that session. I had to completely clear the conversation to get it to respond again, which is absolutely terrible behavior for an AI chatbot.
What's worse is that the original ChatGPT at chat.openai.com will, if your correction is warranted, admit it is wrong and omit the wrong answer when you repeat the question. Basically, Bing+ChatGPT assumes it's always right, can't be argued with if you question any of its answers, and if you do dare to challenge it, it'll storm off in a huff, never to be seen again unless you wipe its memory. This is appalling, and I bet the ChatGPT engineers are furious that Microsoft has hugely gimped the Bing version.