Scared of Leaking Data To ChatGPT? Microsoft Tests a Private Alternative
An anonymous reader shares a report: Not everyone trusts OpenAI's ChatGPT. While the new artificial intelligence-powered chatbot has proved popular with some businesses looking to automate routine tasks, other companies, such as banks, have avoided adopting ChatGPT for fear that their employees would inadvertently give the chatbot proprietary information when using it. Microsoft, which has the rights to resell the startup's technology, has a plan to win over the holdouts.
Later this quarter Microsoft's Azure cloud server unit plans to sell a version of ChatGPT that runs on dedicated cloud servers where the data will be kept separate from those of other customers, according to two people with knowledge of the upcoming announcement. The idea is to give customers peace of mind that their secrets won't leak to the main ChatGPT system, the people said. But it will come at a price: The product could cost as much as 10 times what customers currently pay to use the regular version of ChatGPT, one of these people said.
So your private data is 10 times more valuable (Score:1, Flamebait)
Re: (Score:2)
It's only worth that to the very few of us left who haven't given it all away for free. I think you'll find that's a shockingly low percentage when it comes to internet users.
Llama (Score:3)
You can already install a Llama-based language model on your PC and generate text. No network traffic at all, no secrets leaked; the only limits are how good the language model is and how capable your PC is. (There are also some legal restrictions on how you are allowed to use Llama.)
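As a rough sketch of what that local setup looks like with llama.cpp (repository URL is real; the model filename and prompt are illustrative, and you must supply your own model weights under Llama's license):

```shell
# Build llama.cpp and run a quantized model entirely offline.
# The model file path below is an illustrative placeholder.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
# Place a quantized model file under ./models, then run inference locally:
./main -m ./models/7B/ggml-model-q4_0.bin \
       -p "Summarize our Q3 revenue risks."
```

Nothing here leaves the machine, which is the whole point of the comment above: the privacy guarantee comes from the architecture, not a vendor promise.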
Only ten times the price (Score:2)
And we *promise* we won't look at your stuff. Cross our hearts. Of course, it's all our hardware.
Re: (Score:2)
Azure has this thing called Key Vault. You can upload your own keys to it, and their ChatGPT implementation will use your keys to encrypt data at rest and in transit. So yes, it is truly your data. They also have telemetry inside that GPT implementation, so they are keeping tabs on what you're doing, but they're not siphoning data out in bulk. So no, they cannot look at your stuff outside of the normal use of the GPT service. Further, that 10x price also comes with a contract outlining what MS can and
Microsoft is killing it (Score:2)
ChatGPT doesn't cite sources... BingAI links directly back to sources so you can verify.
ChatGPT doesn't know anything from after it was trained... Bing does a normal web search, pulling in new information, and reads the results to find up-to-date answers.
And now this one - using ChatGPT is limited because nobody knows where the in
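The search-then-read pattern described above can be sketched as a toy. Everything here is a hypothetical stand-in (an in-memory "index" and naive keyword scoring), not Bing's actual pipeline; it just shows why grounding an answer in retrieved documents makes it citable:

```python
# Toy sketch of retrieval-augmented answering: search first, then read
# the top hit to ground the answer and cite its source. The "index" and
# the scoring function are illustrative stand-ins.

DOCS = {
    "https://example.com/azure-openai": "Azure OpenAI offers dedicated capacity for enterprise customers.",
    "https://example.com/llama": "Llama models can run locally on consumer hardware.",
}

def search(query: str) -> list[tuple[str, str]]:
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = [
        (len(terms & set(text.lower().split())), url, text)
        for url, text in DOCS.items()
    ]
    return [(url, text) for score, url, text in sorted(scored, reverse=True) if score > 0]

def answer(query: str) -> str:
    """Answer from the best-matching document and cite its URL."""
    hits = search(query)
    if not hits:
        return "No sources found."
    url, text = hits[0]
    return f"{text} [source: {url}]"

print(answer("Can Llama run locally?"))
```

A fresh web search per query is also why this approach isn't frozen at training time: the index is consulted at answer time, not baked into the model.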
Re: Microsoft is killing it (Score:1)
I'm open to alternatives (Score:1)