Re:US also used ~21GW for data-centers in 2024...
... and is on a trajectory to absorb way more than those 11.7GW for "AI" very quickly [pewresearch.org]. And that is before translating the very theoretical peak power of the new "solar capacity" into actual Terawatt-hours harvested
I am trying to figure out how your link could possibly support your assertions. Frankly, your source made me feel a little dumber for having read it, mainly because of the IT-for-grade-schoolers primer on what a data center is at the start.

What I can glean from it is that it projects data-center power usage growing by 2.32X by 2030 (linearly, I'll note). So if it was ~21 GW in 2024, that works out to roughly 4.62 GW of additional demand per year. Meanwhile, the 11.7 GW of solar noted was added in a single quarter, so that would actually work out to more like 46.8 GW of new nameplate capacity per year. Apply the roughly 24% capacity factor that solar is supposed to get and you still have something like 11 GW of effective generation added per year. 11 GW is clearly more than 4.62 GW, so your own source disagrees with your claim that AI data centers are "on a trajectory to absorb way more than those 11.7 GW".
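If anyone wants to check that arithmetic, here it is as a few lines of Python. To be clear, the 2.32X multiplier, the 2030 horizon, the 24% capacity factor, and the linear-growth assumption are just the figures above taken at face value, nothing more rigorous than that:

# Back-of-the-envelope: does new solar outpace projected data-center demand growth?
dc_2024_gw = 21.0              # US data-center draw in 2024
growth_multiplier = 2.32       # projected usage multiple by 2030 (per the linked report)
years = 2030 - 2024
# Linear growth, as the projection appears to assume
dc_added_per_year_gw = dc_2024_gw * (growth_multiplier - 1) / years   # ~4.62 GW/yr

solar_added_quarter_gw = 11.7  # nameplate solar added in one quarter
solar_nameplate_per_year_gw = solar_added_quarter_gw * 4              # ~46.8 GW/yr
capacity_factor = 0.24         # rough capacity factor for solar
solar_effective_per_year_gw = solar_nameplate_per_year_gw * capacity_factor  # ~11.2 GW/yr

print(f"Data centers: +{dc_added_per_year_gw:.2f} GW/yr; solar (effective): +{solar_effective_per_year_gw:.1f} GW/yr")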
And that is even granting the four fundamentally erroneous assumptions you seem to be taking for granted: First, that solar is the only renewable source being added, so it has to stand alone against AI power growth; Second, that AI is the only thing moving power demand, with nothing else pushing demand up or, indeed, down; Third, that AI power demand will keep climbing along the speculative projection in the article you link (doubtful, because, let's face it, LLMs are just the latest bubble); Fourth, that solar (and other renewable) additions will somehow stay flat rather than keep growing.
Overall, what you are claiming does not seem to hold up to real-world facts or, indeed, to your own cited source.