

Amazon Announces 'Olympus' LLM to Compete With OpenAI and Google (reuters.com)
Amazon "is investing millions in training an ambitious large language model," reports Reuters, "hoping it could rival top models from OpenAI and Alphabet, two people familiar with the matter told Reuters."
The model, codenamed "Olympus", has 2 trillion parameters, the people said, which could make it one of the largest models being trained. OpenAI's GPT-4 model, one of the best models available, is reported to have one trillion parameters...
The team is spearheaded by Rohit Prasad, former head of Alexa, who now reports directly to CEO Andy Jassy... Amazon believes having homegrown models could make its offerings more attractive on AWS, where enterprise clients want to access top-performing models, the people familiar with the matter said, adding there is no specific timeline for releasing the new model.
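For a rough sense of what those reported parameter counts mean physically, here's a back-of-envelope sketch. The counts themselves are unconfirmed reports, and the 16-bit precision is an assumption; raw weight storage is simply parameters times bytes per parameter.

```python
# Back-of-envelope weight-storage estimate for the reported (unconfirmed)
# parameter counts. 16-bit precision (fp16/bf16) stores 2 bytes per parameter.

def weight_storage_tb(params: float, bytes_per_param: int = 2) -> float:
    """Raw weight storage in terabytes."""
    return params * bytes_per_param / 1e12

for name, params in [("GPT-4 (rumored)", 1e12), ("Olympus (reported)", 2e12)]:
    print(f"{name}: ~{weight_storage_tb(params):.0f} TB of weights at 16-bit precision")
```

At that scale the weights alone would span dozens of high-end accelerators before counting activations or optimizer state, which is part of why models this size live in cloud data centers rather than on customer hardware.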
"While the parameter count doesn't automatically mean Olympus will outperform GPT-4, it's probably a good bet that it will, at minimum, be very competitive with its rival from OpenAI," argues a financial writer at the Motley Fool — as well as Googles nascent AI projects. Amazon could have a key advantage over its competition, one that CEO Andy Jassy alluded to in the company's third-quarter earnings call. Jassy said, "Customers want to bring the models to their data, not the other way around. And much of that data resides in AWS [Amazon Web Services] as the clear market segment leader in cloud infrastructure...."
Amazon will likely also leverage Olympus in other ways. For example, the company could make its CodeWhisperer generative AI coding companion more powerful. Jassy noted in the Q3 call that all of Amazon's "significant businesses are working on generative AI applications to transform their customer experiences." Olympus could make those initiatives even more transformative.
The writer points out that Amazon's profits in the third quarter of 2023 more than tripled from the year before.
And Amazon's stock price has already jumped more than 40% in 2023.
the simplicity of human language and thought (Score:2)
It should surprise anyone that a machine capable of "understanding" a query, formulating a topical reply, and writing that reply in coherent English, with many sentences and paragraphs that unspool in a logical, reasoned order, could be contained in a trillion floats.
These things write responses better than most graduate students.
Now never mind accuracy or correctness. I'm just talking about it executing the thing that all past AI researchers and language modelers agreed was a unique and special human talent.
RIP Amazon AI (Score:4, Insightful)
At least we won't have to worry about the AI takeover by the Amazon AI.
threw up their hands (Score:2)
They said, "shit we have no idea what we are doing and can't seem to hire anyone that can figure it out, better put AI to work on this one."
It is like tech companies playing chicken, and they are all going full speed ahead off the cliff. I'm not sure whether to be entertained or horrified.
PR translation (Score:2)
Amazon jumps on the enshittification bandwagon.
But can it spell? (Score:3)
Will the extra trillion parameters stop it from hallucinating basic mistakes? The other day I asked (for fun) an AI image generator to generate a logo for a brewery called Hopps.
It drew a nice logo, complete with the word Hoppps [sic] in it and the word Brewerr [sic] in it. This isn't a case of AI not knowing how many knuckles are in a finger; it's a case of it misspelling words given to it in the brief.
I don't think the number of parameters is the problem.
Re: (Score:3)
Will the extra trillion parameters stop it from hallucinating basic mistakes?
The parameter count of GPT-4 is not public knowledge. Most of the guessing and "leaks" are in the 1.8T range. In any case the answer is no.
The other day I asked (for fun) an AI image generator to generate a logo for a brewery called Hopps.
It drew a nice logo, complete with the word Hoppps [sic] in it and the word Brewerr [sic] in it. This isn't a case of AI not knowing how many knuckles are in a finger; it's a case of it misspelling words given to it in the brief.
AI image generators are not LLMs and don't even pretend to have any high-level understanding of the elements in the scenes they render. It's a bit like an artistic rendition of a rocket motor or circuit board. The artist may superficially draw something that looks like a motor, complete with piping and various doohickeys, yet they have no understanding of what the thing actually does.
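One plausible mechanical reason the spelling goes wrong, sketched below under the assumption of an OpenAI-style BPE tokenizer (the actual generator's text encoder may well differ): the prompt is chopped into subword tokens, so the model never "sees" the individual letters of Hopps, only opaque token IDs whose embeddings it then has to paint as glyphs.

```python
# Sketch: token-based models see subword IDs, not letters. Uses OpenAI's
# tiktoken BPE purely as an illustration; the image generator in question
# almost certainly uses a different text encoder (e.g. CLIP's tokenizer).
import tiktoken  # pip install tiktoken

enc = tiktoken.get_encoding("cl100k_base")
for word in ["Hopps", "Brewery"]:
    ids = enc.encode(word)
    pieces = [enc.decode([tok]) for tok in ids]
    print(f"{word!r} -> {len(ids)} token(s): {pieces}")
```

An unusual brand name typically splits into pieces the model has no letter-level handle on, so rendering it correctly in pixels is closer to reproducing a remembered shape than to spelling.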
And where did they get all that data? (Score:2)
Amazon owns the entire purchasing and browsing history of all of its customers, going back a couple of decades: their review comments and interactions with sellers, their payment history, returns, and music and movie viewing preferences. In total it is a titanic amount of detailed data about almost everyone.
Call me cynical, but it sure does seem like they will feed, or already have fed, all of that into their LLM. Or maybe not into the model codenamed "Olympus", which will probably offer some form of public access.
Olympus? (Score:2)
Presumably they're naming it for the Greek myth and not the Japanese company that is now best known for a ¥376 billion scandal from 2011.
Since, y'know, Amazon are squeaky clean and LLMs aren't a blight on an information economy that values integrity and accuracy of information. R-right?
Hubris (Score:2)
The name seems an odd choice. In Greek mythology, humans who tried to climb Mount Olympus were punished for their hubris.
Aside from the poor name choice, the other choice that looks pretty iffy is the size. Many, if not most, of the recent papers on the scaling of LLMs suggest that making them bigger does not necessarily make them better and often makes them worse. Researchers like Emily Bender and Gary Marcus have been saying this for a while; last year DeepMind demonstrated that training smaller models on more data can outperform much larger ones [arxiv.org].
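The DeepMind result in question is the "Chinchilla" scaling work (Hoffmann et al., 2022), whose rough rule of thumb is about 20 training tokens per parameter for compute-optimal training. A quick sketch of what that implies for the rumored sizes (the ratio is approximate and the parameter counts are unconfirmed):

```python
# Chinchilla rule of thumb (Hoffmann et al., 2022): compute-optimal training
# uses roughly 20 tokens per parameter. Parameter counts here are rumors.

TOKENS_PER_PARAM = 20  # approximate Chinchilla-optimal ratio

for name, params in [("1T-parameter GPT-4 (rumored)", 1e12),
                     ("2T-parameter Olympus (reported)", 2e12)]:
    tokens_t = params * TOKENS_PER_PARAM / 1e12
    print(f"{name}: ~{tokens_t:.0f}T tokens for compute-optimal training")
```

By that arithmetic a 2-trillion-parameter model wants on the order of 40 trillion training tokens, several times more text than the public corpora that 2023-era models were reportedly trained on, so the extra parameters may simply go underfed.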
An AI Trained on Product Reviews? (Score:2)
Quick delivery. Looks god, haven't tried it yet.