
Scared of Leaking Data To ChatGPT? Microsoft Tests a Private Alternative

An anonymous reader shares a report: Not everyone trusts OpenAI's ChatGPT. While the new AI-powered chatbot has proved popular with businesses looking to automate tasks, other companies, such as banks, have avoided adopting it for fear that their employees would inadvertently feed the chatbot proprietary information. Microsoft, which has the rights to resell the startup's technology, has a plan to win over the holdouts.

Later this quarter, Microsoft's Azure cloud unit plans to sell a version of ChatGPT that runs on dedicated cloud servers, where customers' data will be kept separate from that of other customers, according to two people with knowledge of the upcoming announcement. The idea is to give customers peace of mind that their secrets won't leak into the main ChatGPT system, the people said. But it will come at a price: the product could cost as much as 10 times what customers currently pay to use the regular version of ChatGPT, one of these people said.
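As a rough illustration of how a dedicated deployment differs from the public endpoint, here is a minimal sketch using the openai Python library's Azure mode. This is an assumption about the product's shape, not a confirmed detail of the unannounced offering; the resource URL, deployment name, and API version are hypothetical placeholders:

    import openai

    # Point the client at your own Azure resource instead of api.openai.com.
    openai.api_type = "azure"
    openai.api_base = "https://my-company.openai.azure.com/"  # hypothetical resource
    openai.api_version = "2023-05-15"
    openai.api_key = "..."  # key scoped to your own resource, not a shared OpenAI key

    response = openai.ChatCompletion.create(
        engine="my-private-gpt",  # hypothetical deployment name in your resource
        messages=[{"role": "user", "content": "Summarize our internal memo."}],
    )
    print(response["choices"][0]["message"]["content"])

The point of the design is that prompts and completions stay within a deployment the customer controls, rather than flowing through the shared consumer endpoint.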
  • Than real money. Now that we know the exchange rate, you can budget your privacy accordingly.
    • It's only worth that to the very few of us left who haven't given it all away for free. I think you'll find that's a shockingly low percentage when it comes to internet users.

  • by Dwedit ( 232252 ) on Tuesday May 02, 2023 @02:21PM (#63492178) Homepage

    You can already install a Llama-based language model on your PC and generate text. No network traffic at all, no secrets leaked, no usage restrictions; the only limits are how good the language model is and how good your PC is. (There are also some legal restrictions on how you are allowed to use Llama.)
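    A minimal sketch of what the commenter describes, assuming the llama-cpp-python bindings and legitimately obtained Llama-family weights; the model path is a hypothetical placeholder:

        from llama_cpp import Llama

        # Everything below runs on the local machine; no network traffic at all.
        llm = Llama(model_path="./models/llama-7b.ggml")  # hypothetical local weights file

        out = llm("Q: What is a language model? A:", max_tokens=64, stop=["Q:"])
        print(out["choices"][0]["text"])

    Inference speed and answer quality then depend entirely on the weights you load and the hardware you run them on.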

  • And we *promise* we won't look at your stuff. Cross our hearts. Of course, it's all our hardware.

    • by dknj ( 441802 )

      Azure has this thing called Key Vault. You can upload your own keys to it, and their ChatGPT implementation will use your keys to encrypt data at rest and in transit. So yes, it is truly your data. Now, they also have telemetry inside that GPT implementation, so they are keeping tabs on what you're doing, but they're not siphoning data out in bulk. So no, they cannot look at your stuff outside of the normal use of the GPT service. Further, that 10x price also comes with a contract outlining what MS can and cannot do with your data.
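      A sketch of the bring-your-own-key idea, using the azure-keyvault-keys Python SDK. The vault URL and key name are hypothetical, and wiring the key into a GPT deployment is service-side configuration this snippet does not perform:

          from azure.identity import DefaultAzureCredential
          from azure.keyvault.keys import KeyClient

          # Create a customer-managed key in a vault you control.
          client = KeyClient(
              vault_url="https://my-vault.vault.azure.net",  # hypothetical vault
              credential=DefaultAzureCredential(),
          )
          key = client.create_rsa_key("gpt-data-key", size=2048)
          print(key.id)  # encryption-at-rest config references this key ID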

  • Call me crazy, but from where I sit, Microsoft is doing a very good job riding this wave of "AI." Basic ChatGPT has several shortfalls, and one by one, Microsoft is addressing them:

    ChatGPT doesn't cite sources... BingAI links directly back to sources so you can verify.
    ChatGPT doesn't know anything from after it was trained... Bing runs a normal web search, including new information, and reads the results to find up-to-date answers (a rough sketch of that search-then-read pattern follows below).
    And now this one: using ChatGPT with company data has been limited because nobody knows where the input ends up, and a private, dedicated deployment addresses exactly that.
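    A rough sketch of the search-then-read pattern the comment attributes to Bing; search_web and llm are hypothetical stand-ins for a real search API and a model call:

        def search_web(query: str) -> list[str]:
            # Hypothetical placeholder: a real implementation would call a search API.
            return ["Placeholder snippet relevant to: " + query]

        def answer_with_sources(llm, question: str) -> str:
            # Fetch fresh documents, then put them in the prompt so the model
            # can answer with information newer than its training cutoff.
            snippets = search_web(question)
            context = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
            prompt = (
                "Answer using only the numbered sources below, citing them like [1].\n"
                f"{context}\nQuestion: {question}\nAnswer:"
            )
            return llm(prompt)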

    • I was at MS during that one chatbot fiasco. I always thought it was funny they unleashed it to the web before trying to eat the dog food company-wide first. I wonder if they still talk about eating dog food while pushing it away...
  • I'm about to dive into HiveMind and Petals. It's a good time to look at how we can avoid being held hostage by a single platform.
