Here we go again with this.
Nvidia shipped 100k AI GPUs last year, which - if run nonstop - would consume about 7.4 TWh a year. Crypto consumes over 100 TWh per year, and the world as a whole consumes just under 25,000 TWh per year.
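For scale, here's the back-of-envelope math as a quick sketch. The ~8.5 kW per unit is my own assumption, chosen so the stated 7.4 TWh roughly works out (it implies whole AI servers rather than bare GPU cards); the other numbers are the ones above.

```python
UNITS_SHIPPED = 100_000      # AI units shipped in a year (from the figure above)
POWER_PER_UNIT_KW = 8.5      # assumed draw per unit, running flat out
HOURS_PER_YEAR = 24 * 365

# kWh -> TWh: divide by 1e9
ai_twh = UNITS_SHIPPED * POWER_PER_UNIT_KW * HOURS_PER_YEAR / 1e9
crypto_twh = 100.0
world_twh = 25_000.0

print(f"AI fleet: {ai_twh:6.1f} TWh/yr  ({ai_twh / world_twh:.3%} of world demand)")
print(f"Crypto:   {crypto_twh:6.1f} TWh/yr  ({crypto_twh / world_twh:.3%} of world demand)")
```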
AI power consumption is a pittance. To get these huge headline numbers, they have to assume long-term extreme exponential scaling. But you can make anything give insane numbers with an assumption like that.
I simply don't buy the assumption. And I'm not even assuming an AI bust - even assuming that AI keeps growing hugely, and that nobody rests on their laurels but keeps training newer and better foundation models - the simple fact is that there's far too much progress being made toward vastly more efficient architectures at every level: model structure, neuron structure, training methodologies, and hardware. Not "50% better", but "orders of magnitude better". I just don't buy these notions of infinite exponential growth.
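To make the point concrete, here's a toy projection - both growth rates are my own assumptions, not figures from the text. It just shows how the scary headline numbers depend on extrapolating demand growth while holding efficiency flat, and how they deflate once efficiency gains compound too.

```python
BASE_TWH = 7.4           # starting annual consumption, from the figure above
DEMAND_GROWTH = 1.6      # compute demanded grows 60%/yr (assumed)
EFFICIENCY_GAIN = 1.5    # useful compute per kWh grows 50%/yr (assumed)

naive = efficient = BASE_TWH
for year in range(1, 11):
    naive *= DEMAND_GROWTH                         # efficiency frozen at today's level
    efficient *= DEMAND_GROWTH / EFFICIENCY_GAIN   # efficiency compounds as well
    print(f"year {year:2d}: naive {naive:7.1f} TWh   with efficiency gains {efficient:6.1f} TWh")
```

With those made-up rates, the "naive" line balloons past 800 TWh in a decade while the other line barely doubles - which is the whole argument in miniature.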