AI Bubble Is Ignoring Michael Burry's Fears (bloomberg.com)

An anonymous reader shares a report: Costing tens of thousands of dollars each, Nvidia's pioneering AI chips make up a hefty chunk of the $400 billion that Big Tech plans to invest this year -- a bill expected to hit $3 trillion by 2029. But unlike 19th-century railroads, or the Dotcom boom's fiber-optic cables, the GPUs fueling today's AI mania are short-lived assets with a shelf life of perhaps five years.

As with your iPhone, this stuff tends to lose value and may need upgrading soon because Nvidia and its rivals aim to keep launching better models. Customers like OpenAI will have to deploy them to stay competitive. So while it's comforting that the companies spending most wildly have mountains of cash to throw around (OpenAI aside), the brief useful life of the chips and the generous accounting assumptions underpinning all of this investment are less consoling.

Michael Burry, who made his name betting against US housing and who's recently turned to the AI boom, waded in this week, warning on X that hyperscalers -- industry jargon for the giant companies building gargantuan data centers -- are underestimating depreciation. Far from being a one-off outlay, there's a danger of AI capex becoming a huge recurring expense. That's great for Nvidia and co., but not necessarily for hyperscalers such as Google and Microsoft. Some face a depreciation tsunami that's forcing them to be extra vigilant about controlling other costs. Amazon has plans to eliminate roughly 14,000 jobs.

And while Wall Street is used to financing fast-depreciating assets such as aircraft and autos, it's worrying that private credit funds are increasingly using GPUs as collateral to finance loans. This includes lending to more speculative startups known as neoclouds, who offer GPUs for rent. Microsoft alone has signed more than $60 billion of neocloud deals.



  • Non-paywalled link (Score:2, Informative)

    by Anonymous Coward

    Here you go [archive.ph]

  • by Pseudonymous Powers ( 4097097 ) on Wednesday November 12, 2025 @10:59AM (#65790170)
    Arguably, most people think this "AI" boom is a bubble (except for the people who think that true AGI is happening sometime next year, which to my mind means the end of capitalism, and possibly civilization, shortly thereafter, but whatever), but nobody knows how to time it. The same was true during the subprime mortgage crisis. Burry deserves credit for loudly and publicly stating that the emperor has no clothes at a time when few others in his profession would, but the point of that story was not that it was hard to tell that the emperor was naked.
    • by Zocalo ( 252965 )
      It wasn't hard to tell that the emperor in the fable was naked at the equivalent point in the tale either, but it still took that lone voice to pipe up and say so. In the case of sub-prime, the smart people (or at least their smart financial advisors) sat up, paid attention to what Burry was saying and took some mitigating action, everyone else took a bath or, if they had the right contacts and leverage, got a government bailout.

      In my mind, AI is just about at that point but is still suffering from a co
      • by PPH ( 736903 )

        This.

        The Emperor's New Clothes is quite applicable here. It's not enough to point out that he's naked as a jaybird; it also has to become "common knowledge", so that one is safe to act on it without negative consequences.

        Just try to step up and say that your company isn't going to bite on the AI bait and the market will knock points off your share price.

        There's an interesting book [amazon.com] out on that topic.

    • which to my mind means the end of capitalism, and possibly civilization, shortly thereafter, but whatever

      I can't think of any better example that captures the zeitgeist of 2025 than being casually dismissive about the potential end of civilization.

      The same was true during the subprime mortgage crisis. Burry deserves credit for loudly and publicly stating that the emperor has no clothes at a time when few others in his profession would

      I have a ton of respect for Burry. Everyone saw the housing bubble was

    • Arguably, most people think this "AI" boom is a bubble (except for the people who think that true AGI is happening sometime next year, which to my mind means the end of capitalism, and possibly civilization, shortly thereafter, but whatever), but nobody knows how to time it.

      If "most" people thought that AI was in a bubble, we wouldn't see the current level of investment. Obviously, the people spending the money don't believe that. The big disconnect is that many people who aren't involved in the industry see AI as synonymous with AGI. While all the big players invest in research involving AGI, companies are not banking on AGI as the next big thing that will bring profits. Instead, just as the past decade has seen transformative changes in natural language processing and trans

  • by ebonum ( 830686 ) on Wednesday November 12, 2025 @11:18AM (#65790208)

    What depreciation method are these companies using? What is suggested by GAAP? What is reality (or how fast are these chips actually going to zero value)?

    My understanding is that most companies use 3 or 5 year straight-line depreciation with 0 residual value for "computers". This seems reasonable for these Nvidia chips. Are they doing something different?
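    For reference, the straight-line schedule that comment describes is trivial to compute; the $30,000 purchase price below is a stand-in figure, not a quoted Nvidia price:

```python
# Straight-line depreciation: equal expense each year, book value
# falling linearly to the residual value.
def straight_line_schedule(cost, residual, years):
    annual = (cost - residual) / years
    book = cost
    schedule = []
    for _ in range(years):
        book -= annual
        schedule.append(round(book, 2))
    return annual, schedule

# Hypothetical $30,000 GPU, zero residual value, 5-year life.
annual, values = straight_line_schedule(30_000, 0, 5)
# annual == 6000.0; values == [24000.0, 18000.0, 12000.0, 6000.0, 0.0]
```

    The open question in the thread is whether that linear curve tracks reality, i.e. whether a chip still earns rent in year four.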

    • I think the implication of that is that there will be no taxes paid by any of these companies, probably for the next 20 years, all while they pocket billions upon billions of dollars.
    • by ceoyoyo ( 59147 )

      A datacentre with lots of GPUs should depreciate the same way a regular old datacentre does. If they're calculating depreciation any way other than the way they do for regular datacentres, that would be very suspicious.

      In reality, models require vastly more computation to train than they do to use, and more still to develop, so the current spending is more accurately compared to something like the cost of constructing railways, which is much greater than the cost of running them, and the asset is not the GPUs but the tra

      • Isn't the point that there's not a place where that expense slows - that we reach "good enough" and the training stops or subsides dramatically like a railroad? Look at a related industry in chip fab - these companies need continual investment in R&D and expense building out new equipment to make the next generation of chip - they extract the value they can from the equipment but then need to refresh to keep up the revenue stream. I have no idea what that cycle is, but I think we can agree AI is much

        • In my estimate, the speed improvement in AI chips is going to see the same slowdown we've already seen in CPUs: single threaded performance is almost at a standstill and multi-threaded performance is increasing much less rapidly than it used to. If this slowdown occurs, there will be less pressure to replace existing AI machines with faster AI machines. This means a longer life cycle for existing machines.

          Whether new facilities continue to be built will depend upon the degree to which AI is useful, and whe

        • by ceoyoyo ( 59147 )

          It doesn't seem likely. The AI companies would have you believe that more compute equals smarter, but they're already hitting diminishing returns pretty hard.

          That's almost beside the point though. Railway and fiber companies built out more infrastructure than was immediately useful but then a bunch went broke and the survivors concentrated on making a profit. The big AI companies are in that first phase where they're trying to build stuff faster than everyone else. Next inevitably comes the part where they try

    • According to the article, the hyperscalers (Google, Meta, Amazon) have changed the useful life of a depreciating asset: Google and Meta up to 6 years, from 4 and 5.5 respectively, while AWS has bounced from 5 to 6 and then back to 5.

      The neo-clouds are doing something different and appear to be taking out loans to purchase the GPUs and then using those assets (the GPUs) to back the loan. Kind of like how you or I could take out a loan on a car or a mortgage on a house. Neo-cloud loans appear to be for 3 years and
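      The earnings effect of those useful-life changes is easy to sketch. A minimal illustration, using a made-up round fleet cost rather than any company's actual figures:

```python
# Stretching the assumed useful life shrinks the annual depreciation
# expense reported against earnings (straight-line, zero residual).
def annual_depreciation(cost, residual, years):
    return (cost - residual) / years

fleet_cost = 12_000_000_000  # hypothetical $12B server fleet

expense_4yr = annual_depreciation(fleet_cost, 0, 4)  # $3.0B/year
expense_6yr = annual_depreciation(fleet_cost, 0, 6)  # $2.0B/year
reported_savings = expense_4yr - expense_6yr         # $1.0B/year of
# expense pushed into later years -- earnings look better today
```

      Nothing about the cash outlay changes; only when the expense shows up on the income statement does.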

  • by SlashbotAgent ( 6477336 ) on Wednesday November 12, 2025 @11:21AM (#65790218)

    Are they saying that this isn't a bubble? It definitely is!

    Are they saying that the bubble won't pop because AI chips "have" to be replaced? Wanna bet? Burry has. Time will tell, even if his timing is off.

    Are they seriously implying that this highly suspect $3 trillion number is going to be recurring revenue? LOL! Not a chance.

    Frankly, this just reads like Bloomberg is pumping the bubble. It can definitely get bigger. But it can't grow infinitely, and it can't maintain its current size for very long.

    • Finance drama (Score:5, Insightful)

      by abulafia ( 7826 ) on Wednesday November 12, 2025 @12:20PM (#65790368)
      This is a very specific form of writing. It is kind of, but not quite journalism, not quite fictionalization, and not just an attempt to influence other market participants.

      The author is trying to tell the story within the form - A Titan of Finance is making a Bold Bet with big implications for the little peoples' 401Ks!

      Various folks with input to the story all have their own angle and want to steer it to their advantage. Everyone outside the story who is paying attention can see the bubble, but have the same problem Burry has - the old cliche about the market staying irrational longer than you can stay solvent still applies.

      So little investors have skin in the game but very little range of motion other than getting out of the market. Big players are betting against bubble blowers, which means they need their story to "win" on a timeline that doesn't lose them a ton of money. Meanwhile OAI, NVidia and similar grifters are sucking Tubby's stump in hopes of a bailout.

      It is all high drama, with lots of players trying to influence the story. Think of it as multiparty participatory propaganda trying to steer things, with the eventual outcome determining how many Grandmas have to switch to dog food for dinner.

      • the old cliche about the market staying irrational longer than you can stay solvent still applies.

        As does a quote often attributed to the late economist Herbert Stein [wikipedia.org]: "Something that cannot go on forever will eventually stop."

        Stein's actual quote [quoteinvestigator.com] is more interesting, however:

        What economists know seems to consist entirely of a list of things that cannot go on forever, and this may be one of them. But if it can’t go on forever it will stop. And if we never do anything that we can’t go on doing forever we will never do very much.

      • This is a very specific form of writing. It is kind of, but not quite journalism, not quite fictionalization, and not just an attempt to influence other market participants.

        The author is trying to tell the story within the form - A Titan of Finance is making a Bold Bet with big implications for the little peoples' 401Ks!

        Various folks with input to the story all have their own angle and want to steer it to their advantage. Everyone outside the story who is paying attention can see the bubble, but have the same problem Burry has - the old cliche about the market staying irrational longer than you can stay solvent still applies.

        So little investors have skin in the game but very little range of motion other than getting out of the market. Big players are betting against bubble blowers, which means they need their story to "win" on a timeline that doesn't lose them a ton of money. Meanwhile OAI, NVidia and similar grifters are sucking Tubby's stump in hopes of a bailout.

        It is all high drama, with lots of players trying to influence the story. Think of it as multiparty participatory propaganda trying to steer things, with the eventual outcome determining how many Grandmas have to switch to dog food for dinner.

        My favorite aspect of your well-written summary is how it is indistinguishable from the action at a poker table. Which further highlights what everyone already knows: the choices driving the economy of this global technological civilization are being made in the context of gambling. The cards themselves do have some nonzero tangible value as physical assets, but the hands have no inherent value. The value of your pocket 94 is whatever your chip stack can handle and whatever you can convince (ie can af

    • by Eneff ( 96967 )

      I think they're trying to say that the GPUs will depreciate more quickly than expected, and thus the expected return on investment (on which the loans financing the GPU purchases depend) will leave all of these major companies heavily in debt, without revenue from the assets to justify their purchase in later years.

      Conceivably, this could lead to bankruptcies and a chain of failures from companies like Google and Amazon, with a massive drop in stock value and a "too big to fail" probl

      • Ok but just think of all the cool things we can do with tons of cheap GPU? There's many things that tinkerers are priced out of being able to play with.
  • by Ostracus ( 1354233 ) on Wednesday November 12, 2025 @11:22AM (#65790220) Journal

    You say it like it's a bad thing. In case people forgot, mining rigs went on the market cheap, putting an end to the GPU starvation before. The same will happen with those AI rigs.

  • by gr8_phk ( 621180 ) on Wednesday November 12, 2025 @11:24AM (#65790224)
    Moore's law is over. TSMC's 14A node is pretty much the end of the road, with the current 2nm node close to it in terms of performance. Nvidia has also gotten packaging quite good, so the chips can't really be packed much closer together. In other words, compute capability per rack is not going to increase very much beyond the next couple of years, and even between now and then there won't be a whole lot of improvement. I think old data centers will still have some value, just less than the final ones a few years from now. If the AI hype holds, they will all be needed. When the hype dies, they will all be worth less.
    • by HiThere ( 15173 )

      Moore's law may be over, but the 3D version of it is just getting started. The real problem is moving the heat away from the chip. I think we're in the early part of the ramp up of 3D chips.

      N.B.: That it's actually do-able was proven decades ago, but only for custom sculpted 1-off chips in a lab setting. (I believe it was the Tennessee Valley Authority...but I'm more sure about the Tennessee than about the rest.)

    • by jrobot ( 1239050 )

      Even if performance continues to scale, Moore's less famous second law about cost scaling is going to radically transform both semiconductors and consumer electronics starting around 2030. With ballooning component costs dominating the value proposition (performance/($+power)) of new parts, hyperscalers won't find enough incremental value to warrant buying new chips (same for consumers with cellphones). This means parts will be used much longer and a robust secondary used market will substantially eat into

  • by Fly Swatter ( 30498 ) on Wednesday November 12, 2025 @11:55AM (#65790302) Homepage
    This is the supercomputer phase of AI; it needs huge amounts of space, resources, and expensive equipment. When an eventual successor is developed that reduces all those resources down to a small box that sits in someone's home or pocket and does the same thing faster and almost infinitely cheaper - all this debt will be worse than just throwing money in the fireplace.

    Until then this idea of just making an AI database bigger will never be profitable. The only hope is that new developments quickly render this waste of resources obsolete before our world collapses under the weight of data center and infrastructure debt.
    • Nobody wanted those stupid AI laptops.

      • AI laptops are a scam: either all they can really do is text-to-speech, or they are really just thin clients to an AI server. Want to know how much AI is actually done locally on a device? Disable the network connection and watch it all fail.
    • by HiThere ( 15173 )

      Don't expect AI to ever use only a small amount of compute. You can do a lot by pre-training, but there are limits.

      OTOH, I'm rather sure that the current algorithms are a lot more wasteful than a later version will be. A factor of 100 wouldn't surprise me. Personally I think the way to handle it is with a raft of Small Language Models, each one tuned to a specific context, and a higher system that switches context as appropriate. (I've seen signs in the news that we're already headed that way.)
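      The "raft of Small Language Models plus a switcher" idea can be caricatured in a few lines. This is a toy sketch only; the model names and keyword-matching dispatch are purely illustrative, not any vendor's API:

```python
# Toy router: a higher-level system picks a context-specific small
# model instead of sending everything to one giant model.
def route(prompt, models):
    for keyword, model in models.items():
        if keyword in prompt.lower():
            return model
    return models["general"]  # fallback specialist

specialists = {
    "code": "slm-code",        # hypothetical model tuned for programming
    "legal": "slm-legal",      # hypothetical model tuned for legal text
    "general": "slm-general",  # hypothetical general-purpose fallback
}

assert route("Review this code diff", specialists) == "slm-code"
assert route("Hello there", specialists) == "slm-general"
```

      A real system would use a learned classifier rather than keywords, but the economic point is the same: most queries never touch the biggest, most expensive model.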

      • Don't expect AI to ever use only a small amount of compute.

        Personally I think the way to handle it is with a raft of Small Language Models, each one tuned to a specific context, and a higher system that switches context as appropriate.

        It will be this, because that's how human intelligence works.
        There is no such thing as unitary consciousness.
        It's the emergent aggregate internetwork and general-purpose vector sum produced by a set of local electrochemical networks and special-purpose processes all signal-patterning among each other.
        Your Self is the result (and feedback loop) of traffic shaping.

  • by DeplorableCodeMonkey ( 4828467 ) on Wednesday November 12, 2025 @11:56AM (#65790304)

    Assuming the GPUs aren't unusable due to wear, they can be repurposed to provide low cost services.

    I've worked on projects where they'd have spent millions of dollars on renting GPUs per quarter if the AWS sales pitch was "these are so last 3 years, but they're dirt cheap for letting your data scientists experiment."

    I think he's 100% over the target about the accounting side, but I think he is potentially underestimating how much money corporations would be willing to throw at "old GPUs" that are substantially cheaper per hour to rent.

    Let's face it, in 3-5 years the Nvidia 50X series on the market right now will be "old" but still very powerful. More than powerful enough to do A LOT of GPU-centric work.

    • I interpret Burry's point to be that the previous-gen chips will be in such low demand that they'll have lost almost all of their value. If that's the case, the resale value of those old chips will be extremely low and won't contribute much toward the purchase of the next-gen chips. In other words, they'll need to keep getting huge influxes of cash to buy next-gen chips. What he isn't considering is that some AI companies will likely be designing their own chips to be optimized for their particular AI al
    • Not sure, because these chips cost so much to run in energy. If enough efficiency gains are made, it would be cheaper to buy and run new ones than to keep running old ones for the same tasks. When data center companies start building their own nuclear power plants, you know there is a LOT of money to be saved by increasing efficiency.
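    A rough back-of-the-envelope shows what the per-card energy math looks like; the wattages, $0.10/kWh price, and $30,000 replacement cost are all hypothetical round numbers:

```python
# Annual electricity cost of running one card 24/7.
HOURS_PER_YEAR = 8760

def annual_energy_cost(watts, price_per_kwh):
    return watts / 1000 * HOURS_PER_YEAR * price_per_kwh

old = annual_energy_cost(700, 0.10)  # old card at 700 W: ~$613/year
new = annual_energy_cost(350, 0.10)  # newer card at 350 W: ~$307/year
saving = old - new                   # ~$306/year per card
payback_years = 30_000 / saving      # ~98 years on energy alone
```

    On these assumed numbers, halved power draw alone never pays back a card's purchase price; replacement only pencils out when the new card also does far more work per watt-hour, or at electricity prices well above retail.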
    • by HiThere ( 15173 )

      GPUs are retargetable (to their original use as graphics processors), but I'm not sure the same is true of TPUs and the other more specialized varieties.

  • And while Wall Street is used to financing fast-depreciating assets such as aircraft and autos, it's worrying that private credit funds are increasingly using GPUs as collateral to finance loans.

    Seriously? GPUs as collateral? Can you use something that will depreciate to nearly zero before the term of the loan is up? Or are these extremely short-term loans? Are banks just impressed with the big number of greenbacks a company has slung at GPUs, and utterly ignorant of how little that number will mean in ten, or even five, years' time? Again I ask, "What in the actual fucking fuck are we doing?" I feel like the entire world is caught up in snake oil salesmanship to the point of destroying the entirety
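    The collateral question can be put in numbers. A minimal sketch, assuming a hypothetical $30,000 GPU fully financed by a 3-year annuity loan at 10%, with resale value pessimistically halving each year:

```python
# Outstanding loan balance vs. depreciating collateral value.
def loan_balance(principal, annual_rate, years, year):
    """Balance remaining after `year` equal annual payments."""
    r = annual_rate
    payment = principal * r / (1 - (1 + r) ** -years)
    balance = principal
    for _ in range(year):
        balance = balance * (1 + r) - payment
    return balance

COST = 30_000  # assumed GPU price, fully loan-financed
for yr in range(4):
    owed = loan_balance(COST, 0.10, 3, yr)  # 10% rate, 3-year term
    resale = COST * 0.5 ** yr               # pessimistic: halves yearly
    # After year 1: owed ~ $20,937 but resale ~ $15,000 --
    # the loan is under-collateralized unless the term is very short
    # or the resale market holds up better than assumed.
```

    Whether the real curves look like this depends entirely on whether used-GPU demand survives each new Nvidia generation, which is exactly the bet being argued about.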

    • by 0123456 ( 636235 )

      The Economy relies on ever-increasing amounts of debt to function. Banks are fine with lending money because they expect taxpayers to bail them out if the loans go bad.

      > I feel like the entire world is caught up in snake oil salesmanship to the point of destroying the entirety of functional society, just because a very few people might make some money off of it. WTF?

      It's been like that for years now. Society is collapsing and we're in the Looting The Treasury phase.

      • "they expect taxpayers to bail them out"

        Does the Fed need taxpayers, or does it simply print (digitally) money? Have taxes gone up or down since 2008?

      • The economy does not rely on ever-increasing amounts of debt to function. With a reduction or reversal of debt growth, the economy would be somewhat different, not damaged. The economy runs on production and consumption. Debt is a side issue.
        • The economy runs on production and consumption. Debt is a side issue.

          This feels a good bit like saying rain runs on ocean and clouds but water is a side issue.
          What is enabling the current level of production and consumption to happen?

          • The economy runs on production and consumption. Debt is a side issue.

            This feels a good bit like saying rain runs on ocean and clouds but water is a side issue.
            What is enabling the current level of production and consumption to happen?

            Ah, here's a comment that says it better than I did: https://slashdot.org/comments.... [slashdot.org]

    • I feel like the entire world is caught up in snake oil salesmanship to the point of destroying the entirety of functional society

      The allure of "making money from money" is far more appealing to many people than making money from doing actual work.

  • In a few years, all of these GPUs will be available on eBay for a few bucks each.

    Then I'll finally be able to snag a whole bunch of them and build a Beowulf cluster to run SETI@home faster than anybody else.

  • The problem isn't all the infrastructure and hardware. That stuff's going to get used, because the goal of AI is to replace white-collar workers, and that tech does work. Not perfectly, but it's improving every day, and it already does quite a bit.

    The problem is that the nature of LLMs means that when things shake out, we're going to be left with just a couple of big players. That's because the only people who are going to be able to stay in the game are the ones who have access to training data from real human beings a
  • ...that ever increasing compute power will be needed for future AI
    This reminds me of the old military saying that generals plan to fight the last war
    One efficient algorithm changes everything
    One different processing approach like analog hybrids or bio hybrids changes everything
    The future is becoming increasingly unpredictable

  • The companies aren't making that much money from their AIs; a company might be worth billions while its revenue (not profit) might be no more than €20 million. This is going to implode as soon as the AI fad is over.

    AI is a useful tool and it has value, but right now the value of anything with AI in it is simply way too high due to it being hyped into the stratosphere. Anyone in finance reads "AI", stops thinking, sees euro signs, and starts investing without looking at the numbers of the comp

  • No they're not. They're salivating at the prospect of how much money they can siphon from the purported billions worth of transactions being promised, and they'll be long gone before the hardware stops being shiny.
  • Comment removed based on user account deletion
