AI Bubble Is Ignoring Michael Burry's Fears (bloomberg.com)
An anonymous reader shares a report: Costing tens of thousands of dollars each, Nvidia's pioneering AI chips make up a hefty chunk of the $400 billion that Big Tech plans to invest this year -- a bill expected to hit $3 trillion by 2029. But unlike 19th-century railroads, or the Dotcom boom's fiber-optic cables, the GPUs fueling today's AI mania are short-lived assets with a shelf life of perhaps five years.
As with your iPhone, this stuff tends to lose value and may need upgrading soon because Nvidia and its rivals aim to keep launching better models. Customers like OpenAI will have to deploy them to stay competitive. So while it's comforting that the companies spending most wildly have mountains of cash to throw around (OpenAI aside), the brief useful life of the chips and the generous accounting assumptions underpinning all of this investment are less consoling.
Michael Burry, who made his name betting against US housing and who's recently turned to the AI boom, waded in this week, warning on X that hyperscalers -- industry jargon for the giant companies building gargantuan data centers -- are underestimating depreciation. Far from being a one-off outlay, there's a danger of AI capex becoming a huge recurring expense. That's great for Nvidia and co., but not necessarily for hyperscalers such as Google and Microsoft. Some face a depreciation tsunami that's forcing them to be extra vigilant about controlling other costs. Amazon has plans to eliminate roughly 14,000 jobs.
And while Wall Street is used to financing fast-depreciating assets such as aircraft and autos, it's worrying that private credit funds are increasingly using GPUs as collateral to finance loans. This includes lending to more speculative startups known as neoclouds, which offer GPUs for rent. Microsoft alone has signed more than $60 billion of neocloud deals.
Non-paywalled link (Score:2, Informative)
Here you go [archive.ph]
Re: (Score:1)
Thank you.
How Big and How Short? (Score:3)
Re: (Score:3)
In my mind, AI is just about at that point but is still suffering from a co
Re: (Score:3)
This.
The Emperor's New Clothes is quite applicable here. It's not enough to point out that he's naked as a jaybird, but that this has become "common knowledge" and one is now safe to act on it without negative consequences.
Just try to step up and say that your company isn't going to bite on the AI bait and the market will knock points off your share price.
There's an interesting book [amazon.com] out on that topic.
Re: (Score:3)
I can't think of any better example that captures the zeitgeist of 2025 than being casually dismissive about the potential end of civilization.
I have a ton of respect for Burry. Everyone saw the housing bubble was
Re: (Score:2)
Arguably, most people think this "AI" boom is a bubble (except for the people who think that true AGI is happening sometime next year, which to my mind means the end of capitalism, and possibly civilization, shortly thereafter, but whatever), but nobody knows how to time it.
If "most" people thought that AI was in a bubble, we wouldn't see the current level of investment. Obviously, the people spending the money don't believe that. The big disconnect is that many people who aren't involved in the industry see AI as synonymous with AGI. While all the big players invest in research involving AGI, companies are not banking on AGI as the next big thing that will bring profits. Instead, just as the past decade has seen transformative changes in natural language processing and trans
Obvious questions (Score:3)
What depreciation method are these companies using? What is suggested by GAAP? What is reality (or how fast are these chips actually going to zero value)?
My understanding is that most companies use 3 or 5 year straight-line depreciation with 0 residual value for "computers". This seems reasonable for these Nvidia chips. Are they doing something different?
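To put rough numbers on that, here's a minimal sketch of the straight-line method described above, with zero residual value. The $40,000 unit cost is illustrative only (the summary says the chips cost "tens of thousands of dollars each").

```python
# Minimal sketch: straight-line depreciation with zero residual value.
# The $40,000 unit cost is an illustrative figure, not a quoted price.

def straight_line_schedule(cost: float, life_years: int,
                           residual: float = 0.0) -> list[float]:
    """Equal annual depreciation expense over the asset's useful life."""
    annual = (cost - residual) / life_years
    return [annual] * life_years

print(straight_line_schedule(40_000, 5))
# [8000.0, 8000.0, 8000.0, 8000.0, 8000.0]
```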
Re: (Score:2)
Re: (Score:2)
A datacentre with lots of GPUs should depreciate the same way a regular old datacentre does. If they're calculating depreciation other than the way they do for regular datacentres that would be very suspicious.
In reality, models require vastly more computation to train than they do to use, and more still to develop, so the current spending is more accurately compared to something like the costs to construct railways, which is much greater than the costs to run them, and the asset is not the GPUs but the tra
Re: (Score:1)
Isn't the point that there's not a place where that expense slows - that we reach "good enough" and the training stops or subsides dramatically like a railroad? Look at a related industry in chip fab - these companies need continual investment in R&D and expense building out new equipment to make the next generation of chip - they extract the value they can from the equipment but then need to refresh to keep up the revenue stream. I have no idea what that cycle is, but I think we can agree AI is much
Re: (Score:1)
In my estimate, the speed improvement in AI chips is going to see the same slowdown we've already seen in CPUs: single threaded performance is almost at a standstill and multi-threaded performance is increasing much less rapidly than it used to. If this slowdown occurs, there will be less pressure to replace existing AI machines with faster AI machines. This means a longer life cycle for existing machines.
Whether new facilities continue to be built will depend upon the degree to which AI is useful, and whe
Re: (Score:2)
It doesn't seem likely. The AI companies would have you believe that more compute equals smarter, but they're already hitting diminishing returns pretty hard.
That's almost beside the point though. Railway and fiber companies built out more infrastructure than was immediately useful but then a bunch went broke and the survivors concentrated on making a profit. The big AI companies are in that first phase where they're trying to build stuff faster than everyone else. Next inevitably comes the part where they try
Re: Obvious questions (Score:2)
According to the article, the hyperscalers (Google, Meta, Amazon) have changed the assumed life of a depreciating asset: Google and Meta went up to 6 years, from 4 and 5.5 respectively, while AWS has bounced from 5 to 6 and then back to 5.
The neo-clouds are doing something different: they appear to be taking out loans to purchase the GPUs and then using those assets (the GPUs) to back the loans. Kind of like how you or I could take out a loan on a car or a mortgage on a house. Neo-cloud loans appear to run for 3 years and
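To make the effect of those useful-life changes concrete, here's a toy calculation. The 4, 5.5, and 6-year lives are the figures cited above; the $10B capex is an invented round number, not a reported one.

```python
# Toy numbers only: annual straight-line depreciation on a hypothetical
# $10B GPU build-out under the useful-life assumptions cited above.

capex = 10_000_000_000  # hypothetical spend, not a reported figure

for life_years in (4, 5.5, 6):
    annual = capex / life_years
    print(f"{life_years}-year life: ${annual / 1e9:.2f}B/year")
# 4-year life: $2.50B/year
# 5.5-year life: $1.82B/year
# 6-year life: $1.67B/year
```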
Re: (Score:2)
AFAICT, China is 4-5 years away from "breaking into this market", if the market is the upper end of the chips. Possibly even a bit longer. OTOH, for many purposes their chips are already good enough, so they'll break into it at the lower end as soon as they have enough chips for export. (Aren't they already doing that?)
I Don't Understand The Story's Intent (Score:3)
Are they saying that this isn't a bubble? It definitely is!
Are they saying that the bubble won't pop because AI chips "have" to be replaced? Wanna bet? Burry has. Time will tell, even if his timing is off.
Are they seriously implying that this highly suspect $3 trillion number is going to be recurring revenue? LOL! Not a chance.
Frankly this just reads like Bloomberg is pumping the bubble. It can definitely get big. But it can't grow infinitely and it can't maintain its current size for very long.
Finance drama (Score:5, Insightful)
The author is trying to tell the story within the form - A Titan of Finance is making a Bold Bet with big implications for the little people's 401Ks!
Various folks with input to the story all have their own angle and want to steer it to their advantage. Everyone outside the story who is paying attention can see the bubble, but have the same problem Burry has - the old cliche about the market staying irrational longer than you can stay solvent still applies.
So little investors have skin in the game but very little range of motion other than getting out of the market. Big players are betting against bubble blowers, which means they need their story to "win" on a timeline that doesn't lose them a ton of money. Meanwhile OAI, NVidia and similar grifters are sucking Tubby's stump in hopes of a bailout.
It is all high drama, with lots of players trying to influence the story. Think of it as multiparty participatory propaganda trying to steer things, with the eventual outcome determining how many Grandmas have to switch to dog food for dinner.
Re: (Score:1)
the old cliche about the market staying irrational longer than you can stay solvent still applies.
As does a quote often attributed to the late economist Herbert Stein [wikipedia.org]: "Something that cannot go on forever will eventually stop."
Stein's actual quote [quoteinvestigator.com] is more interesting, however:
What economists know seems to consist entirely of a list of things that cannot go on forever, and this may be one of them. But if it can’t go on forever it will stop. And if we never do anything that we can’t go on doing forever we will never do very much.
Re: (Score:2)
This is a very specific form of writing. It is kind of, but not quite journalism, not quite fictionalization, and not just an attempt to influence other market participants.
The author is trying to tell the story within the form - A Titan of Finance is making a Bold Bet with big implications for the little people's 401Ks!
Various folks with input to the story all have their own angle and want to steer it to their advantage. Everyone outside the story who is paying attention can see the bubble, but have the same problem Burry has - the old cliche about the market staying irrational longer than you can stay solvent still applies.
So little investors have skin in the game but very little range of motion other than getting out of the market. Big players are betting against bubble blowers, which means they need their story to "win" on a timeline that doesn't lose them a ton of money. Meanwhile OAI, NVidia and similar grifters are sucking Tubby's stump in hopes of a bailout.
It is all high drama, with lots of players trying to influence the story. Think of it as multiparty participatory propaganda trying to steer things, with the eventual outcome determining how many Grandmas have to switch to dog food for dinner.
My favorite aspect of your very well-written summary is how it is indistinguishable from the action at a poker table. Which further highlights what everyone already knows - the choices driving the economy of this global technological civilization are being made in the context of gambling. The cards themselves do have some nonzero tangible value as physical assets, but the hands have no inherent value. The value of your pocket 94 is whatever your chip stack can handle and whatever you can convince (ie can af
Re: (Score:3)
I think they're trying to say that the GPUs will depreciate more quickly than expected, and thus the return on investment (on which the loans financing the GPU purchases depend) will fall short, leaving all of these major companies heavily in debt without revenue generation from the assets to justify their purchase in later years.
Conceivably, this could lead to bankruptcies and a chain of failures from companies like Google and Amazon, with a massive drop in stock value and a "too big to fail" probl
Re: (Score:2)
All for a dollar. (Score:3)
You say it like it's a bad thing. In case people forgot, mining rigs went on the market cheap, putting an end to the GPU starvation that came before. The same will happen with those AI rigs.
They won't depreciate that much (Score:4, Interesting)
Re: (Score:1)
Without Moore's Law you can build more powerful chips by making them bigger, but they'll take more power to run. Which means more cooling to keep them running and more power plants to run them.
There might be improvements to chip design to make them more optimal for AI software, but that's likely to be a one-off.
Re: (Score:2)
There are also analog chips on the horizon. They don't have the precision in calculating the weights of ML matrices, but they are good enough for the algorithms' needs and use a fraction of the power.
Re: (Score:1)
I'm going to make some rough approximations here.
There are difficulties in dissipating power in high speed processors. Assume that the power that can be dissipated is proportional to the area of the chip. Relative to a single active layer chip, the power that can be dissipated per layer is 1/(number_of_layers * thermal_conduction_to_coolant). Thermal conduction to coolant is dominated by copper in the heatsink and SiO2 in the chip. Copper is at least 200 times more thermally conductive than SiO2. Assume tha
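Taking the (truncated) estimate above at face value, here's a toy transcription of its formula. The inverse scaling with layer count and thermal resistance is this commenter's rough assumption, not established data:

```python
# Toy transcription of the comment's rough formula: relative to a single
# active layer, per-layer power budget = 1 / (num_layers * thermal
# resistance to coolant). The comment notes copper is at least 200x more
# thermally conductive than SiO2, so SiO2 inside the stack dominates.

def relative_power_per_layer(num_layers: int,
                             rel_thermal_resistance: float = 1.0) -> float:
    """Per-layer power budget relative to a single-layer chip."""
    return 1.0 / (num_layers * rel_thermal_resistance)

for n in (1, 2, 4, 8):
    print(f"{n} layers: {relative_power_per_layer(n):.3f}x power per layer")
```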
Re: (Score:2)
Moore's law may be over, but the 3D version of it is just getting started. The real problem is moving the heat away from the chip. I think we're in the early part of the ramp up of 3D chips.
N.B.: That it's actually do-able was proven decades ago, but only for custom sculpted 1-off chips in a lab setting. (I believe it was the Tennessee Valley Authority...but I'm more sure about the Tennessee than about the rest.)
Re: (Score:1)
Even if performance continues to scale, Moore's less famous second law about cost scaling is going to radically transform both semiconductors and consumer electronics starting around 2030. With ballooning component costs denominating the value proposition (performance/($+power)) of new parts, hyperscalers won't find enough incremental value to warrant buying new chips (same for consumers with cellphones). This means parts will be used much longer and a robust secondary used market will substantially eat into
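As a toy illustration of the parent's value metric, performance divided by (price plus power cost), with every number invented:

```python
# All numbers invented: the parent's metric, performance / ($ + power).
# A faster new part can still lose to a cheap used part on this metric.

def value_prop(performance: float, price: float, power_cost: float) -> float:
    return performance / (price + power_cost)

new_part = value_prop(performance=2.0, price=40_000, power_cost=5_000)
used_part = value_prop(performance=1.0, price=8_000, power_cost=5_000)
print(f"new: {new_part:.2e}, used: {used_part:.2e}")  # the used part wins here
```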
Supercomputer vs PC. (Score:3)
Until then, this idea of just making an AI database bigger will never be profitable. The only hope is that new developments quickly render this waste of resources obsolete before our world collapses under the weight of data center and infrastructure debt.
They already did that (Score:2)
Nobody wanted those stupid AI laptops.
Re: (Score:2)
Re: (Score:2)
Don't expect AI to ever use only a small amount of compute. You can do a lot by pre-training, but there are limits.
OTOH, I'm rather sure that the current algorithms are a lot more wasteful than a later version will be. A factor of 100 wouldn't surprise me. Personally I think the way to handle it is with a raft of Small Language Models, each one tuned to a specific context, and a higher system that switches context as appropriate. (I've seen signs in the news that we're already headed that way.)
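As a sketch of what that raft-of-SLMs idea might look like, here's a trivial context router. The model names and keyword rules are invented for illustration; a real system would use a learned classifier rather than string matching.

```python
# Invented illustration of "a raft of Small Language Models" with a
# higher-level system that switches context: a trivial keyword router.

CONTEXT_MODELS = {
    "code": "slm-code",        # hypothetical code-tuned small model
    "legal": "slm-legal",      # hypothetical legal-domain small model
    "general": "slm-general",  # hypothetical general-purpose fallback
}

def route(prompt: str) -> str:
    """Pick a context-specialized model by crude keyword matching."""
    p = prompt.lower()
    if any(kw in p for kw in ("def ", "class ", "stack trace", "compile")):
        return CONTEXT_MODELS["code"]
    if any(kw in p for kw in ("contract", "clause", "liability")):
        return CONTEXT_MODELS["legal"]
    return CONTEXT_MODELS["general"]

print(route("Why does this class throw at compile time?"))  # -> slm-code
```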
Re: (Score:2)
Don't expect AI to ever use only a small amount of compute.
Personally I think the way to handle it is with a raft of Small Language Models, each one tuned to a specific context, and a higher system that switches context as appropriate.
It will be this, because that's how human intelligence works.
There is no such thing as unitary consciousness.
It's the emergent aggregate internetwork and general-purpose vector sum produced by a set of local electrochemical networks and special-purpose processes all signal-patterning among each other.
Your Self is the result (and feedback loop) of traffic shaping.
Depends on the meaning of "shelf life" (Score:3)
Assuming the GPUs aren't unusable due to wear, they can be repurposed to provide low cost services.
I've worked on projects where they'd have spent millions of dollars per quarter on renting GPUs if the AWS sales pitch was "these are so last 3 years, but they're dirt cheap for letting your data scientists experiment."
I think he's 100% over the target about the accounting side, but I think he is potentially underestimating how much money corporations would be willing to throw at "old GPUs" that are substantially cheaper per hour to rent.
Let's face it, in 3-5 years the Nvidia 50X series on the market right now will be "old" but still very powerful. More than powerful enough to do A LOT of GPU-centric work.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
GPUs are retargetable (to their original use as graphics processors), but I'm not sure the same is true of TPUs and the other more specialized varieties.
This can't be right. (Score:2)
And while Wall Street is used to financing fast-depreciating assets such as aircraft and autos, it's worrying that private credit funds are increasingly using GPUs as collateral to finance loans.
Seriously? GPUs as collateral? Can you use something that will depreciate to nearly zero before the term of the loan is up? Or are these extremely short-term loans? Are banks just impressed with the big number of greenbacks a company has slung at GPUs and utterly ignorant of how little that number will mean in ten, or even five, years' time? Again I ask, "What in the actual fucking fuck are we doing?" I feel like the entire world is caught up in snake oil salesmanship to the point of destroying the entirety
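A back-of-envelope illustration of that worry: GPU collateral value versus the outstanding balance of a 3-year loan. Every number here is invented; the 40%/year value loss just assumes the hardware sheds most of its resale value well within the oft-cited ~5-year shelf life.

```python
# Invented numbers: collateral value vs. outstanding balance on a 3-year
# loan, assuming straight-line repayment and 40% resale-value loss/year.

principal = 100.0        # loan amount (arbitrary units)
collateral = 100.0       # initial GPU collateral value
annual_value_loss = 0.40 # assumed resale-value loss per year

for year in range(4):
    balance = principal * (1 - year / 3)
    value = collateral * (1 - annual_value_loss) ** year
    print(f"year {year}: balance {balance:6.1f}, collateral worth {value:6.1f}")
# By year 1 the collateral (60.0) is already worth less than the
# remaining balance (66.7) under these assumptions.
```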
Re: (Score:1)
The Economy relies on ever-increasing amounts of debt to function. Banks are fine with lending money because they expect taxpayers to bail them out if the loans go bad.
> I feel like the entire world is caught up in snake oil salesmanship to the point of destroying the entirety of functional society, just because a very few people might make some money off of it. WTF?
It's been like that for years now. Society is collapsing and we're in the Looting The Treasury phase.
Re: This can't be right. (Score:1)
"they expect taxpayers to bail them out"
Does the Fed need taxpayers, or does it simply print money (digitally)? Have taxes gone up or down since 2008?
Re: (Score:1)
Re: (Score:2)
The economy runs on production and consumption. Debt is a side issue.
This feels a good bit like saying rain runs on ocean and clouds but water is a side issue.
What is enabling the current level of production and consumption to happen?
Re: (Score:2)
The economy runs on production and consumption. Debt is a side issue.
This feels a good bit like saying rain runs on ocean and clouds but water is a side issue.
What is enabling the current level of production and consumption to happen?
Ah, here's a comment that says it better than I did: https://slashdot.org/comments.... [slashdot.org]
Re: (Score:2)
The allure of "making money from money" is far more appealing to many people than making money from doing actual work.
Look at the bright side (Score:2)
In a few years, all of these GPUs will be available on eBay for a few bucks each.
Then I'll finally be able to snag a whole bunch of them and build a Beowulf cluster to run SETI@home faster than anybody else.
Re: (Score:2)
[golf clap]
So the problem with the bubble (Score:2)
The problem is that the nature of llms means that when things shake out we're going to be left with just a couple of big players. That's because the only people who are going to be able to stay in the game are the ones who have access to training data from real human beings a
Re: So the problem with the bubble (Score:1)
If the Fed has unlimited power to do "whatever it takes" to end panics, without needing taxpayer money, why can't it fund a basic income, and index it to inflation?
Re: So the problem with the bubble (Score:1)
What if I don't want to sell anything for money, including my labor, and you've enclosed all the commons so I can't even self-provision legally?
The massive spending is based on the assumption (Score:2)
...that ever increasing compute power will be needed for future AI
This reminds me of the old military saying that generals plan to fight the last war
One efficient algorithm changes everything
One different processing approach like analog hybrids or bio hybrids changes everything
The future is becoming increasingly unpredictable
AI bubble is going to burst (Score:2)
The companies aren't making that much money from their AIs; a company might be worth billions while its revenue (not profit) might be no more than €20 million. This is going to implode as soon as the AI fad is over.
AI is a useful tool and it has value, but right now the value of anything with "AI" in it is simply way too high due to it being hyped into the stratosphere. Anyone in finance reads "AI", stops thinking, starts seeing euro signs, and starts investing without looking at the numbers of the comp
Underestimating depreciation? (Score:2)
Re: (Score:2)