18 months for a "nuke" is overly optimistic. I expect at least double that just to let all the concrete cure, let alone form it the way it needs to be. And no, most of that cannot be done off-site in several parts all built at the same time, as that would be a logistical problem in the general area. Transporting the elements that can be manufactured in other provinces/states is another big problem.
Hinkley Point C, the huge new "nuke" in Great Britain being built to replace the previous Hinkley plants near that location, is already about 10 years in and, as far as I know, still not finished. And you'll need either a few very large plants or lots and lots of small ones, because energy demand remains terribly high.
And I would like to add the following thought:
The models being made by OpenAI, Anthropic, Google, etc. are not energy efficient. Hence you'll need a lot of power just to create/train new models, and that power cannot be in two places at once. With each model iteration, the energy demand practically doubles. These companies need subscriptions to become (somewhat) profitable (hopefully). However, with limited energy, do you send the power to the companies building AI, or to the users, so they can actually use the products on offer?
China's models focus on energy efficiency, so the companies that create/train models can keep doing so while the users still have enough power to actually use those models.
Energy efficiency is the name of the real game here. It would mean you'd need only some 3 or 4 huge "nukes" to cover both the creators and the users, or a whole lot fewer small "nukes" all over the country. Yet there is only talk about MORE POWER to turn on datacenters containing MORE GPUs, which need MORE POWER, because MORE GPUs need to be shoved into any and every datacenter. That is a vicious circle you are entering, and it will hurt both the creators and the customers. You can only hope it doesn't hit both at the same time, but with a shitty grid, that chance becomes a whole lot higher.
Energy efficiency reduces that chance by at least an order of magnitude and gives the creators, the customers and the grid time to scale up against realistic targets. Now, I'm aware that "energy efficiency" is a curse word in the USA, worse than the f.ck and c.nt curse words multiplied together, but it is the only game any nation serious about AI should be playing. Because if they don't play it, the AI bubble will pop sooner rather than later and the consequences will do real damage to their economy, possibly damage severe enough that it takes a whole generation to rebuild. And there simply isn't enough time for that.
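To make that order-of-magnitude point concrete, here is a back-of-envelope sketch in Python. Every number in it is an assumption picked for illustration (reactor output, current draw, model cadence, build time), not a measurement; it only shows the shape of the problem: demand that doubles per model generation outruns capacity that arrives in fixed blocks, while a roughly 10x efficiency gain buys more headroom than any single plant.

```python
# Illustrative sketch only: all figures below are assumed, not sourced.
REACTOR_GW = 3.2          # assumed output of one large plant (Hinkley C scale)
START_DEMAND_GW = 2.0     # assumed combined training + inference draw today
GENERATIONS = 6           # model generations to simulate
YEARS_PER_GENERATION = 2  # assumed cadence of new frontier models
YEARS_PER_REACTOR = 12    # assumed realistic build time per large plant

def demand_after(generations: int, efficiency_gain: float = 1.0) -> float:
    """Demand in GW if each generation doubles the raw need,
    divided by whatever efficiency improvement is applied overall."""
    return START_DEMAND_GW * (2 ** generations) / efficiency_gain

def reactors_online(years: int) -> int:
    """How many new large plants can plausibly be finished in that time."""
    return years // YEARS_PER_REACTOR

for gen in range(1, GENERATIONS + 1):
    years = gen * YEARS_PER_GENERATION
    supply = reactors_online(years) * REACTOR_GW
    print(f"gen {gen} ({years:>2} yrs): "
          f"raw demand {demand_after(gen):7.1f} GW, "
          f"with 10x efficiency {demand_after(gen, 10.0):6.1f} GW, "
          f"new nuclear online {supply:4.1f} GW")
```

With these made-up inputs, raw demand reaches triple-digit gigawatts before even one new large plant comes online, while the same growth with an order-of-magnitude efficiency gain stays in the range that 3 or 4 such plants could realistically cover. That is the argument above in one loop.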
What is it the military say: "slow is smooth, smooth is fast"? The current "panicky" trend hurts long-term prospects so much more than most realize.