Why do you think that would make a difference to those of us trying to squeeze every bit of performance out of our boxen? Wasted GPU cycles are still wasted cycles, on a machine that could be tuned to offload some of the rendering work or number crunching from the CPU cores to the GPU.
I do some CG. A "simple" three minute animation can easily take more than 30 hours to render, even with four cores AND the GPU cooking.
There is a reason why anyone doing serious computer work today is using one of the Linux distros.
I don't knock Windows or even Apple. If all you are doing with the computer is the same stuff your Grandma and Grandpa used to do with a pegboard accounting system and a slide rule, then by all means get a box that will play the games you enjoy. But trying to compare that OS with a serious computing OS is like trying to compare the best-ever go-kart with a Formula One race car. Yeah, they can run on the same track, but that's about all they have in common.
That's a really bad car analogy. About the worst I've ever heard. Really, really bad.
Yeah. It was bad. The best I could do under the circumstances.
What circumstances?
Can't justify wasting any more time on this.
Oh. Yeah, I see your point.