Your code must suck. When I look back at my code, the only improvements I see are generally related to API/language enhancements. But I've always written clean, fast code. Even in 6510A ASM...
Either that, or your skills haven't improved at all between the time you wrote the code and the time you came back to it, which is far more likely the case. After all, if you think you're already great, there's no reason to up your game.
Also plays 1080p without a hitch.
The latest stable release of XBMC, Gotham, now works completely out of the box on the Ouya: 1080p, AC3 and DTS decoding, AC3/DTS passthrough, everything. And as a bonus, you get a silky smooth XBMC interface experience, unlike on the Pi.
You have a point about the CEC thing, though. Just get a remote with an IR learning function, so you can use that to turn the TV on/off and control the volume on your receiver. Everything else will be used to control the Ouya.
For example, while g++ mostly supports the new standard, I'm pretty sure gdb doesn't let you set a breakpoint inside an anonymous function (lambda). Until it does, I'd say they have no place in application development, or only under the most draconian coding standards that prevent the kind of unpleasantness you get when a junior developer realizes all the kewl stuff they can do with them.
VC2010's debugger allows breakpoints in lambdas. Just sayin'
Computers don't actually think. You just think they think. (We think.)