It's always been the applications that have driven users to a platform, and right now Microsoft doesn't have those.
You seem to have missed where I mentioned this in my post. I even put it there with the specific intent of stopping misinformed "there are no apps" replies.
Android has 20 apps that duplicate the same functionality where Windows Phone has 5, but the functionality is there; comparing raw counts tells you nothing. In terms of big-name apps, there are still a number of holdouts, but for every one that's missing, there's something nearly identical to replace it.
In the Windows Phone 7 days, I definitely felt limited in app selection. Not anymore.
My Windows phone (Lumia 920) runs faster and more fluidly than my Android tablet (Nexus 7, 1st gen), despite having significantly less processing power. Each update has added features without making it slower. There are fewer apps, but I have yet to not find what I'm looking for, and they generally feel more consistently designed. WP 8 brought native C++ programming. The only thing left is ditching their Direct3D stuff for OpenGL/OpenCL support to make porting games easier (which will admittedly probably never happen).
In terms of geek factor, Android is of course far more customizable and rootable, but I, and I'd assume the great majority of users, have no interest in doing that.
There's so much focus on Microsoft forking Android, but I really don't see the point. They've got a long way to go to get to Android levels of market share, but it's by no means a failure that deserves to be trashed.
They'll work only if they aren't a sloppy, slapped together gimmick designed to rubber stamp "programmers" and install them in cubicles like spare parts.
I really can't imagine what good a 3-month crash course would accomplish. At best you'll get someone dangerous who thinks they know how to code, followed by the nightmarish scenario of either picking up the pieces or having someone in management tell you your job is so easy they could do it themselves.
Kids 2-3 years into college with no prior experience are just barely starting to write code that does anything interesting, let alone writing it properly. 3 months? Give me a break. The only proper way to increase the number of coders is to introduce kids to it early on and find/encourage the ones who show interest.
Their tech really didn't push boundaries that much, at least not usefully, in recent years.
The distinction to make is that it was poorly applied. That doesn't mean it wasn't there. id Tech 4 and 5 were examples of id taking Carmack's latest idea and running with it wholesale, even if the tech wasn't ready.
Other developers eschewed these technologies in favor of older ones, because they had the focus to pick tech they could apply immediately and successfully to fulfill their vision. id didn't have this focus, and the games clearly suffered as they were made to suit the technology: the so-called "tech demo" syndrome that everybody uses to describe the latest id games.
Eventually those technologies made it into other games. Per-pixel shading is all over the place now, but still alongside lightmaps. Megatexturing is so compelling that support for it is built into the latest graphics standards, so that games can use it properly and without putting in the monumental effort that Carmack did.
You can't say that he wasn't pushing boundaries. Come on. It's all right there. The games were failures, and other engines look better in many aspects, but the tech was there and it was ahead of its time.
ZeniMax not wanting their prized programmer to spend a lot of his time working on promotional material for his other business seems reasonable. I don't fault them for it, nor do I fault him for leaving to work on another passion.
Two things had become constants at id: the lack of interesting games, and the boundary-pushing tech. Let's be honest, the only thing at id that kept it notable was Carmack. And I say that with a crushed, broken heart, as one who's run a TF server, mastered the trick jumps, and played thousands of rounds well after Quake was out of its prime.
Carmack leaving id for Oculus will free him from the constraints of a big business and allow him to inject some of that coding genius into yet another promising, young, experimental industry. This is exactly where we need him, and where he'll be able to thrive.
I'm sure he means Intel's Quick Sync hardware codecs, which are integrated into Intel's CPUs as fixed-function blocks and do not use the GPU's compute units.
My understanding of AMD's VCE is that it is likewise a fully separate fixed-function encoder that does not consume any GPU compute power, though there are optimized paths to copy the framebuffer into VCE for low-latency screen capture.
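As a concrete illustration (assuming an FFmpeg build compiled with the relevant encoders, and matching Intel/AMD hardware present), offloading to these fixed-function blocks is just a matter of selecting the hardware encoder instead of the software x264 one:

```shell
# Encode with Intel Quick Sync (fixed-function block; no GPU compute used)
ffmpeg -i input.mp4 -c:v h264_qsv -global_quality 25 output_qsv.mp4

# Encode with AMD VCE via FFmpeg's AMF encoder
ffmpeg -i input.mp4 -c:v h264_amf output_amf.mp4
```

Whether these encoders are available can be checked with `ffmpeg -encoders`; both require the corresponding vendor driver stack, which is why software builds often omit them.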
Teenagers are famous for their lack of impulse control. Either my age is showing, or there really does seem to be a decrease in impulse control among American teens.
I don't think impulse control has decreased at all. The only difference I've noted is that the hyper-connectivity of modern times provides much more opportunity to exercise their lack of impulse control. It's exacerbated further due to their parents not having grown up in a remotely similar environment, and so being unable to anticipate certain things.
Although rendering text correctly is maddeningly complex, the reasons described here aren't actually among them.
The things described here are more a result of the good, established libraries having been written only for the CPU: not because the GPU is more complex, but simply because nobody had taken the time to do it.