AI

Amazon CEO Says 'Really Good' AI Models Take 'Billions of Dollars' To Train (cnbc.com)

Amazon is introducing a cloud service called Bedrock that developers can use to enhance their software with artificial intelligence systems that can generate text, similar to the engine behind the popular ChatGPT chatbot powered by Microsoft-backed startup OpenAI. From a report: The announcement indicates that the largest provider of cloud infrastructure won't be leaving a trendy growth area to challengers such as Google and Microsoft, both of which have started offering developers large language models they can tap into. Generally speaking, large language models are AI programs trained with extensive amounts of data that can compose human-like text in response to prompts that people type in.

Through its Bedrock generative AI service, Amazon Web Services will offer access to its own first-party language models called Titan, as well as language models from startups AI21 and Google-backed Anthropic, and a model for turning text into images from startup Stability AI. One Titan model can generate text for blog posts, emails or other documents. The other can help with search and personalization. "Most companies want to use these large language models but the really good ones take billions of dollars to train and many years and most companies don't want to go through that," Amazon CEO Andy Jassy said on CNBC's "Squawk Box" Thursday. "So what they want to do is they want to work off of a foundational model that's big and great already and then have the ability to customize it for their own purposes. And that's what Bedrock is."
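For a sense of what "work off of a foundational model" can mean in practice for a developer, here is a minimal sketch of invoking a hosted Titan text model through Bedrock with boto3. The region, model ID, request fields, and response shape below are assumptions based on the Titan text API and may not match every account or region; treat this as illustrative rather than authoritative.

```python
# Sketch: calling a hosted foundation model through Amazon Bedrock with boto3.
# The region, model ID, and request/response fields are assumptions for
# illustration and may differ from what your account actually exposes.
import json

import boto3

# Model invocation goes through the "bedrock-runtime" client.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan text model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Draft a short blog post announcing our new product.",
        "textGenerationConfig": {
            "maxTokenCount": 256,
            "temperature": 0.5,
        },
    }),
)

# The response body is a stream of JSON; Titan text returns a list of results.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```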


Comments Filter:
  • by gweihir ( 88907 ) on Thursday April 13, 2023 @07:08PM (#63448058)

Or rather, an overstatement on the "really good". Pretty realistic on the effort and cost, though.

    • by narcc ( 412956 )

You're not wrong. What you get for the time, money, and effort hardly seems worthwhile.

      Shows like Silicon Valley and various pop-sci articles about AI really push the idea that what matters is the volume and quality of the training data, but that's really the easy part. The complex and expensive part is the actual training.

      • by gweihir ( 88907 )

        Thanks.

People tend to vastly overestimate what ChatGPT and friends can do, probably because of the natural language interface and a general blind faith in technology. What I have seen so far are typically really hilarious failures once you move beyond toy examples. For example, the "coding" this thing can do is either simplistic stuff that you could just have googled, or it comes with really bad problems while, to make it worse, looking good on the surface. Even more hilarious are the experiments some creators

        • by narcc ( 412956 )

          The whole approach seems to be very much subject to diminishing returns and seems to already have peaked.

          That's a good insight, and one that I suspect will become increasingly important. Even operating under the (unreasonable) assumption that you can get the performance you want with a larger / better trained model of this kind, could you then extract enough value from it to make it worth the considerable cost?

          With other kinds of language models, things tend to grow exponentially. Take n-grams, for example. The model size grows exponentially with n, as you'd expect, while perplexity drops sharply and quickly
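To make the n-gram comparison concrete, here is a small sketch with a made-up toy corpus that counts distinct n-grams for increasing n. Since the number of possible n-grams is vocabulary_size ** n, the table an n-gram model has to store grows exponentially with n; the corpus and numbers are purely illustrative.

```python
# Toy illustration of n-gram model growth: the number of *possible* n-grams is
# |V|**n for a vocabulary V, so the table a model would have to store explodes
# as n increases. The corpus here is made up for illustration.
from collections import Counter

corpus = (
    "the cat sat on the mat the dog sat on the rug "
    "the cat saw the dog on the mat"
).split()

vocab_size = len(set(corpus))

for n in range(1, 5):
    # Count the n-grams actually observed in the toy corpus.
    ngrams = Counter(
        tuple(corpus[i : i + n]) for i in range(len(corpus) - n + 1)
    )
    print(
        f"n={n}: {len(ngrams)} distinct n-grams observed, "
        f"{vocab_size ** n} possible ({vocab_size}^{n})"
    )
```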

  • Did he have his Doctor Evil pinkie out?
The old rule of "garbage in, garbage out" still applies with AI. I have a hard time explaining that to people. To get a good AI model, you need good training data. To get good training data, you need to pay people to produce it.

    AI isn't cheap.
  • BS (Score:2, Interesting)

    by Dan East ( 318230 )

I call complete and total BS here. BILLIONS of dollars?? We know the computation side of things is fairly intensive, but it's maybe in the hundreds of thousands of dollars. Let's go up an entire order of magnitude and say $1 million in computation expense to be safe. Human prompting is no longer necessary, as existing models can be used to train new models. But even an army of a thousand humans, each paid $10K to interact with the AI for a month, only comes to $10 million. That's $11 million dollars

    • an army of a thousand humans working for a month

      You are delusional if you think ChatGPT was created in a month.

  • Doesn't Amazon's CEO also say "really good rockets" don't need to be able to "get to orbit"?

    Old man shouts at cloud part 4.
