If you think fifteen years in the profession makes you an 'old programmer'...
"I mean, who would have thought in the mid-00s that the smart device would become the pre-eminent consumer computing platform in less than a decade?"
You mean other than Apple and the other people who helped make it happen?
Microsoft's problem moving off the desktop has always been that they want a very similar experience on the desk and in the hand. This was a bad idea when they tried to emulate the Windows experience in WinCE, and it's just as bad an idea going the other direction with Metro.
Would I trust the setup with nuclear launch codes? No.
They were set to 00000000 for decades anyway, so why not?
Or Apple's mistaken focus on OpenCL over CUDA. Unfortunately, Apple has not indicated that it will remedy the problem.
Well, Apple is pushing an open API that will run on lots of hardware, while Nvidia is pushing an API tied to their hardware. I think everyone would benefit from a widely supported open API.
No, we should be encouraging nuclear first, then solar / wind / geothermal, because nuclear is actually scalable and doesn't chew up gobs of land.
That's true right up until it generates a huge wasteland and starts to poison the seas. Nuclear should only be used as a transitional source. I suppose, in theory, a reactor could be made safe, but I doubt it could be in practice.
"In theory, there is no difference between theory and practice. But in practice, there is." - Jan L.A. van de Snepscheut
For most digital work these days, you really just need a logic analyzer.
Unless your logic analyzer can show you ringing or capacitance/inductance problems on the digital signal lines, this is not really true. "Digital" signals on a circuit board are analog after all, and are subject to a lot of the same gremlins that plague an all-analog circuit. This sort of thing doesn't always matter in a digital circuit, but you need a good scope to find those gremlins when they do cause problems.
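To put a rough number on that ringing: parasitic trace inductance and load capacitance form an unintended LC resonator, and the ring sits near its resonant frequency. Here's a quick back-of-the-envelope sketch (the 10 nH / 10 pF values are made-up illustrative parasitics, not measurements from any real board):

```python
import math

def ring_frequency_hz(l_henries: float, c_farads: float) -> float:
    """Resonant frequency of the parasitic LC tank formed by trace
    inductance and load/pin capacitance: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henries * c_farads))

# Hypothetical parasitics: ~10 nH of trace/lead inductance
# driving ~10 pF of input capacitance.
f_ring = ring_frequency_hz(10e-9, 10e-12)

# Common rule of thumb: you want roughly 3-5x the frequency of
# interest in analog scope bandwidth to see the waveform faithfully.
scope_bw = 5 * f_ring

print(f"ring frequency ~ {f_ring / 1e6:.0f} MHz")
print(f"suggested scope bandwidth ~ {scope_bw / 1e6:.0f} MHz")
```

That works out to a ring around 500 MHz, wanting multiple GHz of scope bandwidth to resolve. A logic analyzer sampling the same line just reports clean 1s and 0s; the overshoot that's violating your part's absolute maximum ratings never shows up.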
It is easier to write an incorrect program than understand a correct one.