Oh, and your scaling numbers are a bit off. Intel has only 6-16 EUs, but these are 8-wide. So if they wanted a chip comparable to a high-end NVIDIA or AMD card, they'd only need somewhere below 200 EUs (they also run at a somewhat higher frequency), not 1600 (which would be insane). Likewise, ~100 EUs would be enough for a decent performance card.
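A quick back-of-envelope check of that claim (the ALU count here is an illustrative assumption for a GTX 680-class card, not a figure from the post):

```python
# Rough scaling estimate: how many 8-wide Intel EUs would match a
# high-end discrete GPU's shader ALU count?
nvidia_alus = 1536   # assumed ALU count for a GTX 680-class card (illustrative)
eu_width = 8         # each Intel EU is 8-wide

eus_needed = nvidia_alus / eu_width
print(eus_needed)    # 192.0 -- comfortably below 200 EUs
```

This ignores clock speed differences (the post notes Intel's EUs run somewhat faster, which would push the number even lower), so it's an upper-bound sketch, not a benchmark.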
I don't consider this underpowered. Sure, for full-res gaming this is a no-go, but neither the high display resolution nor the MacBook itself is intended for that. If you want to game, just use a lower resolution with upscaling (which costs some performance too, but not that much), though why any serious gamer would even consider a MacBook completely escapes me.
There's a reason the GeForce 680M (and Radeon HD 7970M) are only found in big and bulky gaming notebooks.
So I have some doubts it will be "phenomenally more capable" (some cautious statements from Nintendo about being able to match the PS360 in graphics add to that, as does the tiny form factor), though it might indeed be somewhat better.
The 7% figure for decay heat is only true immediately after shutdown; after an hour (roughly when the tsunami hit) it already drops to about 1.5% (and there was some limited cooling after that thanks to battery backup). So you're really only looking at about 20 MW or so per reactor, which is not THAT much. It looks easy enough to use some portable generator to get a pump going (a few hundred kW of electrical power should probably be enough), but apparently it didn't work that way...
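The ~20 MW figure checks out roughly if you plug in the reactor's thermal rating (the rating below is my assumption; Fukushima Daiichi unit 1 was on the order of ~1380 MW thermal):

```python
# Decay heat sanity check. Thermal power is an assumed figure for
# illustration, roughly that of Fukushima Daiichi unit 1.
thermal_power_mw = 1380
frac_at_shutdown = 0.07   # ~7% of full thermal power immediately after SCRAM
frac_after_1h = 0.015     # ~1.5% about an hour later

print(thermal_power_mw * frac_at_shutdown)  # ~97 MW right at shutdown
print(thermal_power_mw * frac_after_1h)     # ~21 MW an hour later
```

So by the time the tsunami hit, the heat to remove per reactor really was in the ~20 MW range, two orders of magnitude below full power.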
Though the GP's suggestion to rely on generators/turbines for cooling by just shutting down to self-sustaining power levels sounds extremely risky to me. The tsunami certainly flooded not only the diesel generators but other areas as well. If the diesels didn't work after the tsunami, I have my doubts the turbines did (not to mention there could have been short circuits or even direct earthquake-related damage). We never really heard whether there was damage to these parts, if anyone even knows.
There were, however, 2500 damage reports, which explains the 9 million: that's only about 3,600 per report, which isn't really much, since it quickly costs quite a bit to, for instance, repair some superficial visible crack in plaster. And yes, there are strong doubts that all of the damage was actually caused by that earthquake...
Even if you somehow exceeded the cooling system's capability, you wouldn't need to downclock a lot. Since these CPUs drop voltage when downclocking, going from 2.5 GHz to 2 GHz already cuts power requirements roughly in half (that's just a guesstimate, but you get the idea).
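That guesstimate holds up under the usual dynamic-power model: power scales roughly as C·V²·f, and DVFS lowers voltage along with frequency. If you assume (a simplification, purely for illustration) that voltage scales linearly with frequency, power goes as f³:

```python
# Why a modest downclock roughly halves power: dynamic power ~ C * V^2 * f,
# and if V scales linearly with f (illustrative assumption), power ~ f^3.
f_old, f_new = 2.5, 2.0            # GHz
power_ratio = (f_new / f_old) ** 3 # fraction of original power

print(power_ratio)                 # 0.512 -- about half
```

Real voltage/frequency curves aren't perfectly linear, so treat this as a ballpark, but a 20% frequency drop costing ~50% of the power is quite plausible.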
That said, after reading some of the links, it seems to be a problem with cooling the chipset, plus an overly aggressive BIOS with way-too-low thermal thresholds for throttling. I don't know if that's a simple BIOS bug or whether there's some reason they do it (there could, for instance, be components whose temperature you can't read but which Dell knows get too hot). In any case, this is indeed clearly broken, and IMHO an obvious case for getting a replacement part (or a refund if Dell can't fix it). If Dell isn't going to do something about it, some lawyers are probably going to have a lot of fun with this...
However, most notebooks will indeed get annoyingly noisy if you do number crunching or anything else that runs the CPU at full tilt. And certainly, battery runtime is going to suffer a lot.
Notebooks usually have cooling systems with heatpipes, so the cooling area is actually quite large relative to the size of the machine.
That said, I don't know why those Dells throttle so much. Normal SpeedStep certainly doesn't clock down that far; it could be some serious thermal issue (if the heatsink/heatpipe doesn't make contact with the CPU, for instance), though I haven't read the PDF, so I can't say whether that's really the problem (there are certainly tools out there to read out the CPU temperature...).
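For reference, such tools don't do anything magic: on Linux the kernel exposes temperature readings as integers in millidegrees Celsius under sysfs (the exact path below is an assumption and varies by machine; not every system exposes this zone):

```python
# Minimal sketch of what CPU-temperature tools read on Linux. The sysfs
# file contains a plain integer in millidegrees Celsius, e.g. "52000".
def parse_millideg(raw):
    """Convert a sysfs millidegrees-Celsius string to degrees Celsius."""
    return int(raw.strip()) / 1000.0

def read_cpu_temp(path="/sys/class/thermal/thermal_zone0/temp"):
    # Path is an assumption; real tools enumerate available thermal zones.
    with open(path) as f:
        return parse_millideg(f.read())

print(parse_millideg("52000\n"))  # 52.0
```

With a reading like that in hand, you could check for yourself whether the CPU is actually anywhere near its throttle threshold when Dell's BIOS kicks in.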
From TFA, it seems Carmack believes it would be hard to get the necessary performance without using the NVIDIA drivers. It's somewhat surprising to me that it wouldn't be possible to get it running acceptably on anything else, even if the game does use a lot of advanced features, but if Carmack says so!
Well, if it absolutely requires new OpenGL features/extensions, chances are they aren't present in Mesa (on which all the open-source drivers are based) yet. So if he's coding for OpenGL 3.1, plus maybe even a couple of extensions (for example geometry shaders, which only became core GL in 3.2), then it certainly wouldn't run currently. That said, this shouldn't stop id from doing a port. First, the open-source drivers never really matched the performance of the closed ones (and GLSL made the gap worse), so that's nothing new, and some people were (and still are) quite willing to use the binary drivers. Plus, a port might actually help get these features implemented: if no app uses them anyway, the people working on the open-source drivers in their free time probably don't see much reason to do any work on them. Can't blame him, though, if the business case for a Linux port just isn't there; that's not really his fault.
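To make the version gap concrete, here's a sketch of the kind of check a game might do against the driver's reported GL version. GL_VERSION strings start with "major.minor" followed by vendor info (the example strings below are illustrative, not quotes from any actual driver):

```python
# Sketch: gate a feature on the OpenGL version a driver reports.
# GL_VERSION strings begin with "major.minor[.release]", then vendor text.
def supports_gl(version_string, major, minor):
    """Return True if the reported GL version is at least major.minor."""
    reported = tuple(int(x) for x in version_string.split()[0].split(".")[:2])
    return reported >= (major, minor)

# Geometry shaders became core in GL 3.2, so a Mesa-era 2.1 driver fails:
print(supports_gl("2.1 Mesa 7.10", 3, 2))        # False
print(supports_gl("3.3.0 NVIDIA 260.19.06", 3, 2))  # True
```

In practice an engine would also probe individual extension strings (features often ship as extensions before landing in core), but the principle is the same: if Mesa doesn't report the version or extension, the renderer path simply can't be used.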