
How Amazon Blew Alexa's Shot To Dominate AI
Amazon unveiled a new generative AI-powered version of its Alexa voice assistant at a packed event in September 2023, demonstrating how the digital assistant could engage in more natural conversation. However, nearly a year later, the updated Alexa has yet to be widely released, with former employees citing technical challenges and organizational dysfunction as key hurdles, Fortune reported Thursday. The magazine reports that the Alexa large language model lacks the necessary data and computing power to compete with rivals like OpenAI. Additionally, Amazon has prioritized AI development for its cloud computing unit, AWS, over Alexa, the report said. Despite a $4 billion investment in AI startup Anthropic, privacy concerns and internal politics have prevented Alexa's teams from fully leveraging Anthropic's technology.
Alexa killed Alexa (Score:4, Funny)
Re:Alexa killed Alexa (Score:5, Informative)
My friend has one; they set up a scheduled event that mutes Alexa, tells her not to suggest stuff anymore, then unmutes her, which makes it stop doing that for the day. Amazon's so sure you want this feature that they re-enable it every day, just in case.
Re:Alexa killed Alexa (Score:5, Informative)
When Alexa began asking "By the way did you know I can ..." was when I removed Alexa from my home.
Same. When I am watching a movie and I say "Alexa, dim the lights." And then she talks for 45 seconds to a minute about "did you know" or "by the way" over the movie we're trying to watch.... FIRED. We took every single Alexa and tossed them.
Amazon likely has a trove of data from our house that are variations of "Alexa... SHUT THE FUCK UP."
Re: (Score:2)
For the record, Google does the same thing (although it sounds like it's a lot less obnoxious about it).
Re: (Score:2)
Re:Alexa killed Alexa (Score:4, Funny)
When Alexa began asking "By the way did you know I can ..." was when I removed Alexa from my home.
Same. When I am watching a movie and I say "Alexa, dim the lights." And then she talks for 45 seconds to a minute about "did you know" or "by the way" over the movie we're trying to watch.... FIRED. We took every single Alexa and tossed them.
Amazon likely has a trove of data from our house that are variations of "Alexa... SHUT THE FUCK UP."
Me: Alexa, play white noise so I can get my baby to sleep
Alexa: That's great, but continuously playing a simple sound loop for several hours is extremely complicated and requires a 3rd-party developer to implement, so are you interested in me blathering on about a monthly subscription for a while before I start playing the white noise? Btw, why does it suddenly sound like a baby is crying?
Alexa show me the money (Score:5, Insightful)
But, as it turns out, no one really wants to use Alexa for that. So while I'm sure they make a small profit on the hardware, the back-end has a constant ongoing cost for operations. With it not driving more sales revenue, it's a money sink for Amazon. So what is their motivation to add even more costs by prioritizing rolling out advanced generative AI on the platform? The capabilities might drive some hardware sales, but they won't fix any of the Echo/Alexa profitability issues, and at the same time it will not only burn money but also tie up expensive hardware Amazon could actually make money billing customers for on AWS.
It does look like they will probably charge a subscription for the AI capabilities, based on recent reporting, to try to keep it from being a total money pit, but there is going to be a minimum number of subscriptions needed just to break even, and I'm not sure they can get there, particularly with people getting the same or (most certainly) better genAI on their phones from Google and Apple.
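A toy back-of-the-envelope on that breakeven point (every figure below is invented for illustration; nothing here is Amazon's actual cost or pricing):

```python
# Toy subscription breakeven -- all numbers are hypothetical, not Amazon's.
MONTHLY_BACKEND_COST = 5_000_000  # made-up fixed ops cost, $/month
SUB_PRICE = 5.00                  # made-up subscription price, $/month
MARGINAL_COST = 2.00              # made-up per-subscriber serving cost, $/month

# Fixed costs divided by contribution margin per subscriber.
breakeven_subs = MONTHLY_BACKEND_COST / (SUB_PRICE - MARGINAL_COST)
print(f"Subscribers needed just to break even: {breakeven_subs:,.0f}")
```

Even with generous made-up margins, the point is that the breakeven count runs into the millions of paying subscribers, which is a hard sell against free genAI on a phone.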
Re:Alexa show me the money (Score:4, Interesting)
Part of the issue is the internal process for cloud computing usage. When I worked in the Security Operations Center we were charged by AWS for every (virtual) CPU, storage unit, processing capability, network connection, etc. For us it was cheaper than actually buying/maintaining standalone servers, so we didn't have a problem with it, but for something like Alexa, which is attempting to roll out AI at a gigantic scale, the cost would have been pretty steep. They probably would have been better off purchasing a couple of standalone Cerebras racks and training their AI on that first, but it may not have been possible for reasons. (If you haven't read about the Cerebras systems you should, they're incredibly cool. A CPU the size of an entire wafer https://spectrum.ieee.org/cere... [ieee.org] )
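To make the chargeback point concrete, a toy estimate in the same per-resource spirit (all rates and fleet sizes are made up, not actual AWS internal pricing or Alexa traffic figures):

```python
# Toy internal-chargeback estimate. Every rate and count here is invented,
# not actual AWS internal pricing or real Alexa fleet numbers.
VCPU_HOUR_RATE = 0.04             # $ per vCPU-hour (made up)
VCPUS_PER_INFERENCE_NODE = 8      # vCPUs billed per serving node (made up)
NODES = 10_000                    # hypothetical fleet serving voice queries
HOURS_PER_MONTH = 730             # average hours in a month

monthly_compute = (VCPU_HOUR_RATE * VCPUS_PER_INFERENCE_NODE
                   * NODES * HOURS_PER_MONTH)
print(f"Monthly compute chargeback: ${monthly_compute:,.0f}")
```

The mechanism matters more than the made-up numbers: when every vCPU-hour shows up on an internal bill, an always-on AI fleet looks like a recurring multimillion-dollar line item whether or not the hardware would otherwise sit idle.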
Re: (Score:2)
Here's the good news for Amazon though: they have a staggering amount of idle compute available to them, in the form of unused capacity across AWS.
I'm never going to buy the excuse that Amazon can't come up with enough cheap compute to make it work, when other companies that are PAYING AMAZON FOR THE COMPUTE can.
Re: (Score:3)
That's the thing, they also make internal customers pay for resources, doesn't matter if the resources would otherwise sit idle.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I failed at reading and thought you were talking about the 300 options available for any type of service on AWS.
Re: Alexa show me the money (Score:2)
This. Same goes for Google Home too. These devices are largely used for three things:
1. Playing music
2. Kitchen timers
3. Controlling smart home devices
Having better AI on the voice could help make 3 more frictionless but in terms of business model there is very little incentive for Google or Amazon to invest.
The only advantage Alexa and Google Home have is that they are aggregators for music and smart home systems. They offer an ecosystem, of sorts, and that helps the overall ecosystem but I can't
Terrible headline (Score:2)
Amazon didn't "blow" anything
Actually making a useful AI assistant is hard
Pundits, analysts, investors and futurists are ramping up the hype, believing that enormous profits are imminent
Expect to see a tsunami of crappy AI stuff, forced on us long before it's ready
I'm optimistic that useful stuff will eventually be developed, but the early stuff will suck mightily
Re: (Score:3)
Amazon didn't "blow" anything
Actually making a useful AI assistant is hard
Pundits, analysts, investors and futurists are ramping up the hype, believing that enormous profits are imminent
Expect to see a tsunami of crappy AI stuff, forced on us long before it's ready
I'm optimistic that useful stuff will eventually be developed, but the early stuff will suck mightily
Apple's shoving OpenAI down their users' throats. I have a feeling "useful" is not at all something these companies are interested in. The point of AI today is data rape. And I have a hard time believing anyone's hype that the current cycle of these things is going to lead to anything other than more and more certain wording of absolutely false facts.
Re: Terrible headline (Score:2)
Re: (Score:2)
You can't rely on these things for any kind of specialized knowledge. They are great at extracting and summarizing data from inputs, but that doesn't originate inside the model. The model has, at best, a linguistic understanding of general concepts. The key is to stop trying to use these things for stuff they aren't going to be good at.
I don't disagree with you at all, but the companies running these things seem convinced they're going to rule the world with them, and that the machines are already better qualified to do most jobs than people are.
I'm still waiting for my easy to install and use summarizing agent for my fictional universe. I have every word I've ever written about it. I'd love to be able to feed it into an LLM/AI of some type to ask general questions when I can't remember the exact date of some event or a birthday or an eye
Re: Terrible headline (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
How is Apple shoving OpenAI down people's throats? I saw they were making it available for use but I wasn't aware of them forcing people to use it. I doubt that's the case just from the AI safety issues around using it...
You can choose not to use Siri, but it's still there in the background. I'm sure OpenAI's phone install will be the same. The "optional" part will be whether you interact with it. It won't be up to you or me if it gets installed if it's a part of Apple's base install. It certainly never has been up to now. Why would that change just because it's AI? They're happy to shove a middle finger in your face on all their other useless extras. Like the Journal app, which I'm sure isn't a covert attempt at sucking do
Re: (Score:2)
Dude. Time to get a grip, and actually understand how things work.
They aren't installing anything onto the device to use OpenAI. They have an enterprise license to the OpenAI API. If the on-device thing they came up with can't handle the query, they pop up a box asking if you want to try OpenAI. If you say yes, it makes an API call. If you say no, it does not.
Please tell me how that's not entirely optional.
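The flow described above, sketched as pseudocode (the function names are illustrative stand-ins, not Apple's actual APIs):

```python
# Illustrative sketch of the opt-in flow described above.
# None of these names are Apple's real APIs; they stand in for the idea.

def handle_query(query, on_device_model, ask_user_permission, call_openai_api):
    """Try the on-device model first; only go off-device with explicit consent."""
    answer = on_device_model(query)
    if answer is not None:
        return answer                  # handled locally, nothing leaves the device
    if ask_user_permission(query):     # user is prompted each time
        return call_openai_api(query)  # explicit yes -> one API call
    return None                        # user said no; the query stays on-device
```

A "no" at the prompt means no network call happens at all, which is the sense in which the whole thing is optional.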
Re: (Score:2)
Because the person that wrote that still doesn't understand that it will prompt you to go off-device and use OpenAI if the on-device LLM can't figure it out.
Clearly something that you need to opt-in on every time it's used is "shoving it down people's throats" ...
This isn't rocket science... (Score:4, Insightful)
I don't know if you've ever been around a highly competent personal assistant, but they're not around for conversation. They're there for competence. A good assistant has real practical value because they make it easier to solve problems. Chatbots are not personal assistants. No one wanted Alexa to discuss their favorite song; we wanted it to write a grocery list, control our devices and book flights for us. That was never actually the goal.
Amazon wasn't trying to build a functional tool. So it's not a functional tool. How is this news?
Re: (Score:2)
This is a completely underrated take.
I don't know why anyone would think that Amazon was doing everyone a huge favor by making their lives easier in any way that doesn't include Amazon recognizing revenue. They added a few odds and ends in there in order to give a nice facade of making your life easier, but really it was about data gathering and increasing convenience / lock-in of using Amazon services, and making Amazon purchases. It's no coincidence that in order to use one of their display things, you
somehow I read the title differently (Score:2)
Re: somehow I read the title differently (Score:2)
"what? of a fat fuck looser"
Not as loose as your mom.
Re: (Score:2)
Somewhere out there undoubtedly there exists some piece of Alexa-integrated hardware to do just that . . .
talking is low bandwidth (Score:4, Insightful)
Re:talking is low bandwidth (Score:4, Interesting)
If I'm in the kitchen kneading dough and I want to listen to music I'm not going to go wash my hands to type what I want into a keyboard. In fact I'm going to do exactly that in just a minute. If I'm cooking and use up the last of the pizza sauce I'm not going into the other room to type "pizza sauce" into my shopping list with the keyboard. If my wife reminds me that we need to make an appointment with the vet while we're having breakfast, like she did yesterday, I'm not stopping to key it in and then come back, and if I wait until after breakfast I'll have spaced it out again.
It has its uses. Is it indispensable? Is it perfect? Of course not, but damn it is more convenient than the alternatives.
Re: (Score:2)
Re: (Score:3)
Ah, in that case I mostly agree.
Do the majority of people even want this? (Score:3)
Alexa problems long before AI (Score:2)
Alexa had problems long before AI. I remember getting our first-generation Echo, and being really excited about it. We quickly learned that apart from some very simple things (like asking the weather, setting a timer, or such), it was really stupid. It seemed obvious that Amazon would get tons of data from people trying to do things that it couldn't handle, which seemed like an obvious data set for finding useful enhancements. But they never did anything to improve it.
The only thing they did was push to
Risk aversion (Score:2)
I wonder if the summary buried the lede.
Despite a $4 billion investment in AI startup Anthropic, privacy concerns and internal politics have prevented Alexa's teams from fully leveraging Anthropic's technology.
To train an LLM you need a massive amount of data.
OpenAI could afford to scrape the Internet and not worry about the blowback from privacy and copyright because without ChatGPT they don't have a company.
Meta and Google could also do that because they had a giant pile of data they could legally