How Amazon Blew Alexa's Shot To Dominate AI

Amazon unveiled a new generative AI-powered version of its Alexa voice assistant at a packed event in September 2023, demonstrating how the digital assistant could engage in more natural conversation. However, nearly a year later, the updated Alexa has yet to be widely released, with former employees citing technical challenges and organizational dysfunction as key hurdles, Fortune reported Thursday. The magazine reports that the Alexa large language model lacks the necessary data and computing power to compete with rivals like OpenAI. Additionally, Amazon has prioritized AI development for its cloud computing unit, AWS, over Alexa, the report said. Despite a $4 billion investment in AI startup Anthropic, privacy concerns and internal politics have prevented Alexa's teams from fully leveraging Anthropic's technology.
This discussion has been archived. No new comments can be posted.

  • by DenvLett ( 7507742 ) on Thursday June 13, 2024 @11:38AM (#64546565)
    When Alexa began asking "By the way did you know I can ..." was when I removed Alexa from my home.
    • by Ksevio ( 865461 ) on Thursday June 13, 2024 @11:49AM (#64546601) Homepage

      My friend has one where they set up a scheduled event to mute Alexa, tell her not to suggest stuff anymore, then unmute Alexa, which stops it for the day. Amazon's so sure you want this feature that they re-enable it every day, just in case.

    • by MikeDataLink ( 536925 ) on Thursday June 13, 2024 @12:15PM (#64546717) Homepage Journal

      When Alexa began asking "By the way did you know I can ..." was when I removed Alexa from my home.

      Same. When I am watching a movie and I say "Alexa, dim the lights," and then she talks for 45 seconds to a minute about "did you know" or "by the way" over the movie we're trying to watch.... FIRED. We took every single Alexa and tossed them.

      Amazon likely has a trove of data from our house that are variations of "Alexa... SHUT THE FUCK UP."

      • For the record, Google does the same thing (although it sounds like it's a lot less obnoxious about it).

      • Comment removed based on user account deletion
      • by quantaman ( 517394 ) on Thursday June 13, 2024 @06:34PM (#64547673)

        When Alexa began asking "By the way did you know I can ..." was when I removed Alexa from my home.

        Same. When I am watching a movie and I say "Alexa, dim the lights," and then she talks for 45 seconds to a minute about "did you know" or "by the way" over the movie we're trying to watch.... FIRED. We took every single Alexa and tossed them.

        Amazon likely has a trove of data from our house that are variations of "Alexa... SHUT THE FUCK UP."

        Me: Alexa, play white noise so I can get my baby to sleep

        Alexa: That's great, but continuously playing a simple sound loop for several hours is extremely complicated and required a 3rd-party developer to implement, so are you interested in me blathering on about a monthly subscription for a while before I start playing the white noise? Btw, why does it suddenly sound like a baby is crying?

  • by EvilSS ( 557649 ) on Thursday June 13, 2024 @11:55AM (#64546623)
    The problem with Alexa is that Amazon envisioned it as a way to increase user purchases with reduced 'friction'. Alexa, order dish soap. Alexa, order printer paper. This is a different model than Siri, which could help sell hardware and get/keep people on Apple's platforms, where Apple makes its money. Similar story with Google: it could help attract/keep people on Android, where they hope to make money from services, app store sales, etc.

    But, as it turns out, no one really wants to use Alexa for that. So while I'm sure they make a small profit on the hardware, the back-end has a constant ongoing cost to operate. With it not driving more sales revenue, it's a money sink for Amazon. So what is their motivation to add even more costs by prioritizing rolling out advanced generative AI on the platform? The capabilities might drive some hardware sales, but they won't fix any of the Echo/Alexa profitability issues, and at the same time it will not only burn money but also tie up expensive hardware Amazon could actually make money billing AWS customers for.

    It does look like they will probably charge a subscription for the AI capabilities, based on recent reporting, to try to keep it from being a total money pit, but there is going to be a minimum number of subscriptions needed just to break even, and I'm not sure they can get there, particularly with people getting the same or (most certainly) better genAI on their phones from Google and Apple.
    • by cusco ( 717999 ) <brian DOT bixby AT gmail DOT com> on Thursday June 13, 2024 @01:59PM (#64547081)

      Part of the issue is the internal process for cloud computing usage. When I worked in the Security Operations Center we were charged by AWS for every (virtual) CPU, storage unit, processing capability, network connection, etc. For us it was cheaper than actually buying/maintaining standalone servers, so we didn't have a problem with it, but for something like Alexa, which is attempting to roll out AI at a gigantic scale, the cost would have been pretty steep. They probably would have been better off purchasing a couple of standalone Cerebras racks and training their AI on those first, but it may not have been possible for reasons. (If you haven't read about the Cerebras systems you should, they're incredibly cool. A CPU the size of an entire wafer: https://spectrum.ieee.org/cere... [ieee.org] )

      • Here's the good news for Amazon though: they have a staggering amount of idle compute available to them, in the form of unused capacity across AWS.

        I'm never going to buy the excuse that Amazon can't come up with enough cheap compute to make it work, when other companies that are PAYING AMAZON FOR THE COMPUTE can.

        • by cusco ( 717999 )

          That's the thing, they also make internal customers pay for resources, doesn't matter if the resources would otherwise sit idle.

    • There are usually around 300 options available for anything on Amazon, how could you possibly go through all that with a speaker?
    • This. Same goes for Google Home too. These devices are largely used for three things:

      1. Playing music
      2. Kitchen timers
      3. Controlling smart home devices

      Having better AI on the voice could help make 3 more frictionless but in terms of business model there is very little incentive for Google or Amazon to invest.

      The only advantage Alexa and Google Home have is that they are aggregators for music and smart home systems. They offer an ecosystem, of sorts, and that helps the overall ecosystem, but I can't

  • Amazon didn't "blow" anything.
    Actually making a useful AI assistant is hard.
    Pundits, analysts, investors and futurists are ramping up the hype, believing that enormous profits are imminent.
    Expect to see a tsunami of crappy AI stuff, forced on us long before it's ready.
    I'm optimistic that useful stuff will eventually be developed, but the early stuff will suck mightily.

    • Amazon didn't "blow" anything. Actually making a useful AI assistant is hard. Pundits, analysts, investors and futurists are ramping up the hype, believing that enormous profits are imminent. Expect to see a tsunami of crappy AI stuff, forced on us long before it's ready. I'm optimistic that useful stuff will eventually be developed, but the early stuff will suck mightily.

      Apple's shoving OpenAI down its users' throats. I have a feeling "useful" is not at all something these companies are interested in. The point of AI today is data rape. And I have a hard time believing anyone's hype that the current cycle of these things is going to lead to anything other than more and more certain wording of absolutely false facts.

      • You can't rely on these things for any kind of specialized knowledge. They are great at extracting and summarizing data from inputs, but that doesn't originate inside the model. The model has, at best, a linguistic understanding of general concepts. The key is to stop trying to use these things for stuff they aren't going to be good at.
        • You can't rely on these things for any kind of specialized knowledge. They are great at extracting and summarizing data from inputs, but that doesn't originate inside the model. The model has, at best, a linguistic understanding of general concepts. The key is to stop trying to use these things for stuff they aren't going to be good at.

          I don't disagree with you at all, but the companies running these things seem convinced they're going to rule the world with them, and that the machines are already better qualified to do most jobs than people are.

          I'm still waiting for my easy to install and use summarizing agent for my fictional universe. I have every word I've ever written about it. I'd love to be able to feed it into an LLM/AI of some type to ask general questions when I can't remember the exact date of some event or a birthday or an eye

      • by r0nc0 ( 566295 )
        How is Apple shoving OpenAI down people's throats? I saw they were making it available for use but I wasn't aware of them forcing people to use it. I doubt that's the case just from the AI safety issues around using it...
        • How is Apple shoving OpenAI down people's throats? I saw they were making it available for use but I wasn't aware of them forcing people to use it. I doubt that's the case just from the AI safety issues around using it...

          You can choose not to use Siri, but it's still there in the background. I'm sure OpenAI's phone install will be the same. The "optional" part will be whether you interact with it. It won't be up to you or me if it gets installed if it's a part of Apple's base install. It certainly never has been up to now. Why would that change just because it's AI? They're happy to shove a middle finger in your face on all their other useless extras. Like the Journal app, which I'm sure isn't a covert attempt at sucking do

          • Dude. Time to get a grip, and actually understand how things work.

            They aren't installing anything onto the device to use OpenAI. They have an enterprise license to the OpenAI API. If the on-device thing they came up with can't handle the query, they pop up a box asking if you want to try OpenAI. If you say yes, it makes an API call. If you say no, it does not.

            Please tell me how that's not entirely optional.

        • Because the person that wrote that still doesn't understand that it will prompt you to go off-device and use OpenAI if the on-device LLM can't figure it out.

          Clearly something that you need to opt-in on every time it's used is "shoving it down people's throats" ...

  • by PubJeezy ( 10299395 ) on Thursday June 13, 2024 @12:14PM (#64546711)
    This isn't rocket science. The failure was a choice, not an outcome. They say so right up top: "demonstrating how the digital assistant could engage in more natural conversation. "

    I don't know if you've ever been around a highly competent personal assistant, but they're not around for conversation. They're there for competence. A good assistant has real practical value because they make it easier to solve problems. Chatbots are not personal assistants. No one wanted Alexa to discuss their favorite song; we wanted it to write a grocery list, control our devices, and book flights for us. That was never actually the goal.

    Amazon wasn't trying to build a functional tool. So it's not a functional tool. How is this news?
    • This is a completely underrated take.

      I don't know why anyone would think that Amazon was doing everyone a huge favor by making their lives easier in any way that doesn't include Amazon recognizing revenue. They added a few odds and ends in there in order to give a nice facade of making your life easier, but really it was about data gathering and increasing convenience / lock-in of using Amazon services, and making Amazon purchases. It's no coincidence that in order to use one of their display things, you

  • What I read was something about a headshot to Alexa... must be my former occupation coming to mind...
  • by Big Hairy Gorilla ( 9839972 ) on Thursday June 13, 2024 @01:46PM (#64547045)
    Turns out talking is probably the worst way to convey information. There's basically nothing you can't do faster and more accurately with a keyboard. You can read paragraphs of text in the same time as Alexa can utter 2 sentences. Talking and listening to computers is a huge time waster. Star Trek was wrong.
    • by cusco ( 717999 ) <brian DOT bixby AT gmail DOT com> on Thursday June 13, 2024 @02:08PM (#64547089)

      If I'm in the kitchen kneading dough and I want to listen to music I'm not going to go wash my hands to type what I want into a keyboard. In fact I'm going to do exactly that in just a minute. If I'm cooking and use up the last of the pizza sauce I'm not going into the other room to type "pizza sauce" into my shopping list with the keyboard. If my wife reminds me that we need to make an appointment with the vet while we're having breakfast, like she did yesterday, I'm not stopping to key it in and then come back, and if I wait until after breakfast I'll have spaced it out again.

      It has its uses. Is it indispensable? Is it perfect? Of course not, but damn it is more convenient than the alternatives.

      • There are some use cases for voice I/O. I don't dispute that. The point I'm making is that for non-trivial queries and responses, voice I/O is slow and frustrating. Like using IVR systems on the phone, you have to wait through painfully slow instructions. Alexa isn't much different... so you want to do something rather trivial, put an ingredient on your grocery list... or as most people are saying, the main purpose is to raise or lower volume or light levels. Once your use case is outside of those useful but triv
  • by nwaack ( 3482871 ) on Thursday June 13, 2024 @02:42PM (#64547167)
    I'd be very happy with a device that simply listened for basic commands, such as, "Turn on the kitchen lights," "Play Pandora," "Tell me the outside temperature," "Show me the front door camera," etc. and did it very well, without any of the other nonsense. Stop complicating everything.
  • Alexa had problems long before AI. I remember getting our first-generation Echo, and being really excited about it. We quickly learned that apart from some very simple things (like asking the weather, setting a timer, or such), it was really stupid. It seemed obvious that Amazon would get tons of data from people trying to do things that it couldn't handle, which seemed like an obvious data set for finding useful enhancements. But they never did anything to improve it.

    The only thing they did was push to

  • I wonder if the summary buried the lede.

    Despite a $4 billion investment in AI startup Anthropic, privacy concerns and internal politics have prevented Alexa's teams from fully leveraging Anthropic's technology.

    To train an LLM you need a massive amount of data.

    OpenAI could afford to scrape the Internet and not worry about the blowback from privacy and copyright because without ChatGPT they don't have a company.

    Meta and Google could also do that because they had a giant pile of data they could legally
