Comment Re:Obvious questions (Score 1) 54
A datacentre full of GPUs should depreciate the same way a regular old datacentre does. If they're calculating depreciation differently from how they do it for regular datacentres, that would be very suspicious.
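For instance (numbers purely illustrative): write $500M of servers off straight-line over five years and you book $100M a year in depreciation; swapping CPU boxes for GPU boxes doesn't change that arithmetic, only the price tag and maybe the useful-life estimate.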
In reality, models require vastly more computation to train than to use, and more still to develop. So the current spending is more accurately compared to something like the cost of constructing railways, which is much greater than the cost of running them, and the asset is not the GPUs but the trained models.