And you seem to have missed the part where "running hotter than Sandy Bridge" applies only to overclocking. Yes, Ivy Bridge is a worse overclocker than Sandy Bridge, but at stock speeds it is faster and uses less power. Remember that overclockers are a tiny portion of the market. Ivy Bridge isn't the amazing revolutionary chip some people were expecting, but it is a successful evolutionary step forward, just like most processor generations.
Thing is, for every problem you point out with an AI-driven car, you can point out five problems with human drivers. Humans frequently mess up in hazardous conditions, especially ones they aren't used to, while an AI car will be programmed for those conditions before it is ever released into the wild. As for something being wrong with the car, that's what sensors are for. We have to rely on imperfect cues like smell; the AI, on the other hand, should be plugged straight into the onboard computer and have an excellent overview of the car's health. It might miss corner cases, but humans will miss many more. Humans also often suspect something is wrong and carry on anyway because they can't be bothered to check it out, while the AI can be forced to pull over and demand a fix before continuing.
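Just to show how mechanical that "pull over and demand a fix" policy could be, here's a toy sketch. The sensor names and health codes are made up for illustration, not any real automotive API:

```java
import java.util.Map;

public class HealthPolicy {

    enum Status { OK, DEGRADED, FAULT }

    // A human might shrug off a funny smell and keep driving; this
    // policy is mechanical: any hard FAULT means pull over, always.
    static boolean mustPullOver(Map<String, Status> readings) {
        return readings.containsValue(Status.FAULT);
    }

    public static void main(String[] args) {
        // Made-up readings straight off the onboard computer.
        Map<String, Status> readings = Map.of(
                "brakes", Status.OK,
                "engineTemp", Status.FAULT,      // overheating
                "tirePressure", Status.DEGRADED);

        System.out.println(mustPullOver(readings)
                ? "Fault detected: pulling over until it's fixed."
                : "All subsystems nominal: carrying on.");
    }
}
```

No "I'll get it looked at next week" in there, which is exactly the point.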
There will still be deaths on the road if we switch over to 100% AI-controlled traffic, but I'll be damned if it doesn't drop the road toll to a tenth or less of what it is now. That's a ton of lives saved, plus the added convenience of not having to drive yourself. Of course, convincing people to give up control to a machine is going to be a tough sell.
It might help if Android had some sort of built-in performance metric, similar to the Windows Experience Index, that measures the CPU/GPU/memory/etc. and spits out an easy-to-understand number the user can compare to the minimum specs listed in the Android store. Something like: the game needs a minimum score of 3, recommended 4; you check your phone, see it only scores 2.2, and skip buying that particular game. No confusing GPU series numbers, memory amounts, CPU MHz/GHz, or core counts, just a simple score the user can compare (or the device compares automatically and warns the user). As far as I know nothing like that exists yet, but it would be simple to implement and would really solve the problem of varying device capabilities for game developers.
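The whole check boils down to a couple of comparisons. A minimal sketch of what I mean, assuming a hypothetical measureDeviceScore() benchmark and min/recommended scores pulled from a store listing (none of this is a real Android API):

```java
public class CompatibilityCheck {

    // Hypothetical composite benchmark. Like the Windows Experience
    // Index, the overall score is capped by the weakest subsystem.
    static double measureDeviceScore() {
        double cpuScore = 2.8;    // placeholder sub-scores; a real
        double gpuScore = 2.2;    // implementation would run actual
        double memoryScore = 3.1; // micro-benchmarks here
        return Math.min(cpuScore, Math.min(gpuScore, memoryScore));
    }

    public static void main(String[] args) {
        double deviceScore = measureDeviceScore();
        double minScore = 3.0;         // hypothetical values from the
        double recommendedScore = 4.0; // game's store listing

        if (deviceScore < minScore) {
            System.out.println("Score " + deviceScore + " is below the minimum "
                    + minScore + ": skip this game.");
        } else if (deviceScore < recommendedScore) {
            System.out.println("Playable, but below the recommended " + recommendedScore + ".");
        } else {
            System.out.println("Good to go.");
        }
    }
}
```

The store would run something like this automatically at install time, so the user never even sees the sub-scores.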
While marketing likes to throw around claims like "2000 cores!!!", GPU SIMD units really aren't cores. A core implies a complete processing unit with its own instruction fetch and decode logic, registers, scheduling, etc., while the shaders in a GPU are barely more than the SIMD co-processing units found in modern CPUs. Of course, the language is already muddled thanks to the new Bulldozer "cores", which share a single FPU between each pair and have many people calling those 8-core processors quad-cores with Hyper-Threading on steroids. Still, calling a GPU shader unit a "core" is a pretty serious abuse of the word.
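To make the distinction concrete, here's a toy sketch in plain Java, with threads standing in for cores; the names and numbers are illustrative, not any vendor's actual hardware. Real cores each run their own independent instruction stream, while SIMD lanes all execute the same instruction over different data:

```java
import java.util.Arrays;

public class CoresVsLanes {

    // "SIMD unit": ONE instruction (multiply by factor) applied across
    // many data lanes in lockstep. Lanes can't diverge into independent
    // programs; they only supply different data.
    static void simdMultiply(float[] lanes, float factor) {
        for (int lane = 0; lane < lanes.length; lane++) {
            lanes[lane] *= factor;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // Two "cores": each thread decodes and runs its own,
        // completely independent instruction stream.
        Thread core0 = new Thread(() -> System.out.println("core0: running program A"));
        Thread core1 = new Thread(() -> System.out.println("core1: running program B"));
        core0.start();
        core1.start();
        core0.join();
        core1.join();

        // Four "lanes": one shared instruction, four data elements.
        float[] lanes = {1f, 2f, 3f, 4f};
        simdMultiply(lanes, 2f);
        System.out.println("lanes after one shared instruction: " + Arrays.toString(lanes));
    }
}
```

Scale the second half up to a couple of thousand lanes and you get marketing's "2000 cores".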
I agree that the big studios are mostly rehashing the same ideas (and often badly at that), but there is plenty of innovation coming from indie developers and mod makers in the community. Look at DotA, which started as a simple UMS map in StarCraft, got ported to Warcraft 3, and has managed to spawn an entirely new genre. Or in the indie space we have games like Minecraft and Terraria forging the way for yet another new genre, where the player has the freedom to rebuild and reshape his entire game world. That's where I'm betting we will see some really interesting and fun games appear in the future, some more sandboxy like Minecraft and some more structured like Terraria.
For the big studios it is simply too risky to invest in new, unproven ideas when they have to recoup millions in development costs. But once a concept is proven in the mod or indie space, the big studios will eventually pick it up and polish it. Again, look at DotA: a small mod project, and now we have Heroes of Newerth, League of Legends, and DotA 2 all competing in that space. Once the concept was considered proven, the big studios decided to invest in it.
I just really wish they would stop forcing console UIs onto the PC versions of games. I watched a video review of Dungeon Siege 3 today and the whole UI looked like a big console-port clusterfuck. Is it really too much to ask for separate UI implementations for the console and PC versions of a game? Really?