Cloud AI

Amazon Bets $150 Billion on Data Centers Required for AI Boom (yahoo.com)

Amazon plans to spend almost $150 billion in the coming 15 years on data centers, giving the cloud-computing giant the firepower to handle an expected explosion in demand for artificial intelligence applications and other digital services. From a report: The spending spree is a show of force as the company looks to maintain its grip on the cloud services market, where it holds about twice the share of No. 2 player Microsoft. Sales growth at Amazon Web Services slowed to a record low last year as business customers cut costs and delayed modernization projects. Now spending is starting to pick up again, and Amazon is keen to secure land and electricity for its power-hungry facilities.

"We're expanding capacity quite significantly," said Kevin Miller, an AWS vice president who oversees the company's data centers. "I think that just gives us the ability to get closer to customers." Over the past two years, according to a Bloomberg tally, Amazon has committed to spending $148 billion to build and operate data centers around the world. The company plans to expand existing server farm hubs in northern Virginia and Oregon as well as push into new precincts, including Mississippi, Saudi Arabia and Malaysia.


Comments:
  • ...a small fraction of what Sam Altman says he needs
    • I noticed lately that data center space around my area is going up dramatically. A full rack used to be pretty easy to get for about $300/mo; now it's $800/mo, or $1,200/mo with enough power to actually run a dense rack. Most folks I know are seeing increases of 30%-50% at renewal time.
    • I'm not denying Altman's accomplishments, but he's also on record about how everyone is gonna have AGI by the end of next fiscal year. Nothing like a bit of AI hype to drive the stock price through the roof, eh?

      Once people realize that, no, the singularity is NOT right around the corner, a far more likely outcome is that the whole thing will level off. People figure out the productive niches for LLMs (there will be many) and train the programs to fill those niches. Once the programs are actually trained,
      • I'm not denying Altman's accomplishments

        well i am

      • Once the programs are actually trained, they won't be consuming massive amounts of computational power like they are currently.

        and you heard it here first, folks: the training of AI models is soon to end!

  • So all of our small businesses were sucked up by Amazon, and now the profits from that are funding the Beast. Welcome, O Great Satan.

    • by cusco ( 717999 )

      2/3 of everything sold on Amazon is offered by third-party companies, and the vast majority of them are small and medium businesses. Thousands of businesses that would have folded during the pandemic survived only on their Amazon sales.

      Twenty years ago a small clinic, small manufacturer, accounting firm, etc. needed a DB server, an email server, a file/print server, and likely one or more standalone boxes for some app or process, plus half a dozen desktop machines. That wasn't enough to have an on-staff

  • by echo123 ( 1266692 ) on Thursday March 28, 2024 @11:00AM (#64351155)

    ...when you earn massive profits and don't pay tax (or at least have control over the piddly percentage you choose to pay). Just reinvest in your own future growth and world domination, risk of financial loss be damned. Of course you still need to pay for your lobbyists as a matter of doing business.

    Vote for Joe Biden [archive.is] to collect tax where the money is.

  • by schwit1 ( 797399 ) on Thursday March 28, 2024 @11:08AM (#64351185)

    "For a few months in 2022, Dominion Energy Inc., which powers Virginia’s data center alley, couldn’t keep up, pausing connections to facilities that were otherwise ready to come online. The utility expects demand to nearly double over the next 15 years, with the growth driven primarily by data centers."

  • If the bet should turn out badly, we'll have to bail him out somehow.

  • When it turns out Pogs and Beanie Babies have more staying power and practical applications than this utterly moronic spicy autocomplete fad.
  • by King_TJ ( 85913 ) on Thursday March 28, 2024 @12:52PM (#64351493) Journal

    I noticed in just the last week or two that my Echo Dots at home are suddenly MUCH more responsive than they've been for quite some time. As soon as I give a command like "Alexa, turn off the living room light," I get a near-instant response of "OK" as the light is turned off.

    It was getting progressively more sluggish up until now, and I assumed it was a sign of Amazon cutting resources for the Alexa-enabled smart devices. (They laid off a big chunk of staff supporting them, etc.)

    I'm wondering if Amazon is throwing more resources at it again, now, with the idea it's needed for AI tie-ins?

    • by cusco ( 717999 )

      I've noticed the same (plus it was having trouble with Spanish). I think what's more likely is that the installed base of Echo devices has expanded to the point that it was straining the resources dedicated to it, so things slowed down. The way that Amazon delivers resources to internal customers is that each is charged for the resources they use; when I worked in Amazon Physical Security we were charged X amount every month for the CPU time, memory, bandwidth and storage that every NVR, ev
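As a rough illustration of the internal chargeback model described in the comment above (each internal customer billed per resource consumed), here is a minimal sketch. The rate names, prices, and usage figures are hypothetical and do not represent Amazon's actual billing system.

```python
# Hypothetical sketch of a per-resource internal chargeback model, as described
# in the comment above. Rates and usage figures are made up for illustration.

RATES = {
    "cpu_core_hours": 0.02,    # $ per core-hour (hypothetical)
    "memory_gb_hours": 0.004,  # $ per GB-hour (hypothetical)
    "bandwidth_gb": 0.01,      # $ per GB transferred (hypothetical)
    "storage_gb_months": 0.03, # $ per GB-month (hypothetical)
}

def monthly_charge(usage: dict) -> float:
    """Sum each resource's usage times its internal rate."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# Example: one always-on network video recorder (NVR) for a month.
nvr_usage = {
    "cpu_core_hours": 4 * 730,    # 4 cores, ~730 hours in a month
    "memory_gb_hours": 16 * 730,  # 16 GB resident
    "bandwidth_gb": 2000,         # camera streams in/out
    "storage_gb_months": 5000,    # retained footage
}

print(f"Monthly internal charge: ${monthly_charge(nvr_usage):,.2f}")
```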

  • In my lifetime batteries have become 10x (at least) more powerful, lights much brighter, CPUs faster, magnets stronger, and so many other limits of materials science have been breached. It's great to see the new possibilities with these new resources, but I think the previous limitations gave rise to better solutions.

    I'd prefer to see a 10x increase in LLM training performance come from an order-of-magnitude improvement in the training algorithms rather than from so much more server capacity coming online. Adding a

    • by ranton ( 36917 )

      In your lifetime, algorithms have also become thousands of times faster and more efficient. It just isn't as noticeable as tracking TFLOPS/watt improvements. This research paper [arxiv.org] claims the impact of algorithmic improvements in computer vision was roughly the same as the impact of improved hardware efficiency over the past decade or so. I've seen similar claims in other papers related to chess programs.

      It's very likely you will see a 10x increase in LLM training performance from algorithmic improvements over the
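A small sketch to make the comparison above concrete: when hardware efficiency and algorithmic efficiency both improve, their gains multiply into the effective compute a fixed budget buys. The annual growth rates below are illustrative assumptions, not figures taken from the cited paper.

```python
# Illustrative sketch: algorithmic and hardware efficiency gains multiply, so the
# effective training compute available for a fixed budget compounds from both
# sources. The annual growth rates below are assumptions for illustration only.

hardware_gain_per_year = 1.35   # e.g. ~35%/yr better FLOPS per dollar (assumed)
algorithm_gain_per_year = 1.35  # e.g. ~35%/yr less compute needed per result (assumed)

def effective_compute_multiplier(years: int) -> float:
    """Combined multiplier on what a fixed budget can train after `years`."""
    return (hardware_gain_per_year * algorithm_gain_per_year) ** years

for years in (1, 5, 10):
    print(f"After {years:2d} years: {effective_compute_multiplier(years):8.1f}x")

# With both sources contributing equally, a 10x gain arrives in roughly four
# years here, versus about eight years from hardware improvements alone.
```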

  • ...and Amazon finds itself in a perfect position to sell pickaxes and shovels. Can't blame them.
