AI

Nvidia DGX Cloud: Train Your Own ChatGPT in a Web Browser For $37K a Month 22

An anonymous reader writes: Last week, we learned that Microsoft spent hundreds of millions of dollars to buy tens of thousands of Nvidia A100 graphics chips so that partner OpenAI could train the large language models (LLMs) behind Bing's AI chatbot and ChatGPT.

Don't have access to all that capital or space for all that hardware for your own LLM project? Nvidia's DGX Cloud is an attempt to sell remote web access to the very same thing. Announced today at the company's 2023 GPU Technology Conference, the service rents virtual versions of its DGX Server boxes, each containing eight Nvidia H100 or A100 GPUs and 640GB of memory. The service includes interconnects that scale up to the neighborhood of 32,000 GPUs, storage, software, and "direct access to Nvidia AI experts who optimize your code," starting at $36,999 a month for the A100 tier.

Meanwhile, a physical DGX Server box can cost upwards of $200,000 for the same hardware if you're buying it outright, and that doesn't count the efforts companies like Microsoft say they made to build working data centers around the technology.
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • It's $37,000 a month to train it. That is before even using it. This is nothing but a toy for the rich.
    • It only costs like one bitcoin, come on!

    • by kiviQr ( 3443687 ) on Tuesday March 21, 2023 @02:40PM (#63388271)
      Last year it was $5 million, next year it'll be $2.7 million. The thing you have in your pocket used to be the size of a three-story building.
      • We're going back to the terminals approach though, as what you can run will be leased to you as they see fit. Let's see how capable your phone will be while disconnected from the cloud in a few years. It's already a glorified camera/browser/spying device where the flops go mostly towards shiny pixels.
    • It's $37,000 a month to train it. That is before even using it. This is nothing but a toy for the rich.

      When Chatxxx starts taking jobs by the thousands from whiny meatsacks who are always bitching about taking time off every day to sleep and needing health insurance, we might not find it so "toy" like.

    • Not a toy for long, but will remain for the rich alright. The plebes will get some scraps to get excited for signing up their privacy away for the more "premium" offers.
      It's funny how NVIDIA became popular by bringing 3D to the masses 25 years ago, by allowing the average joe to run cool stuff at home. How the times have changed, and the business models evolved ...
    • It's $37,000 a month to train it. That is before even using it. This is nothing but a toy for the rich.

      $37k for even a year for a guy at home is crazy. Only rich guys can think about that. However, even for a small company, $37k/month is nothing. And for large companies, it's less than nothing.

      OK, there are other significant costs besides the $37k/month, but even considering the true cost, the issue is not the money. It's whether the trained model can do something useful and, moreover, something that can be monetized. That's the real question. If the answer is that it's useless or non-revenue-generating, no price is low enough.

      • by Junta ( 36770 )

        Either way, it's a *lot* for something that costs just $200k up front. Even ignoring residual capital value, buying is *still* cheaper than renting after less than 6 months.
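        A quick sanity check on that breakeven claim, using the two figures from the article (a back-of-the-envelope sketch that ignores power, cooling, staffing, and residual hardware value):

        ```python
        # Breakeven point: renting DGX Cloud vs. buying a DGX server outright.
        # Both figures come from the article; everything else is simplified away.
        RENT_PER_MONTH = 36_999   # DGX Cloud A100 tier, USD per month
        PURCHASE_PRICE = 200_000  # approximate up-front cost of a DGX server, USD

        breakeven_months = PURCHASE_PRICE / RENT_PER_MONTH
        print(f"Purchase pays for itself after ~{breakeven_months:.1f} months")
        # -> Purchase pays for itself after ~5.4 months
        ```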

  • by edi_guy ( 2225738 ) on Tuesday March 21, 2023 @03:18PM (#63388375)

    I've got eight Pentium 90 PCs with 64MB RAM each, linked with 10Mbit Ethernet cables. That should do the trick.

  • Sounds like a win-win situation....

  • The actual cost to train a ChatGPT 3.5-tier model from scratch on the Nvidia cloud is about $900k.
