As a matter of fact, your (almost desperate) counter-arguments to every single, minimal "goof" of mine just demonstrate how hard it is for you to simply present some facts that prove me wrong.
Let me tell you something: inside your Intel chip there are not just three "integer coprocessors", but an entire array of ALUs.
From the Pentium onward (if I remember correctly), Intel decided to go "RISC" style and invested a lot of effort in pipelining the thing, in order to achieve a better instructions-per-cycle ratio. Because the nature of pipelining makes it hard to fully use all the internal "coprocessors", they discovered that unless the programmer applies very specialized techniques, 20% (at the very best) of the chip stays unused all the time. A pretty big waste.
They managed to overcome the industry's incapacity to get rid of ancient programming habits (please read it right: I said ancient, not wrong) with hyperthreading. They just stuck another pipelined Control Unit inside the core, carefully crafted to use the parts of the chip that the first pipeline leaves unused. Each Control Unit appears to the Operating System as a CPU core (but make no mistake: it's just ONE core, faking that it is two).
There's no problem with this solution, but a sane (or at least non-stupid) person cannot just tag as "stupid" any design that decides it's a better idea to avoid all that "faking two CPUs" business, saving the money or the chip space and leaving room for yet more features.
The ordinary Intel chip nowadays has a lot more cores than the CELL. But you don't see it, because you chose to learn programming on an ecosystem that uses the customer's money to make things easier for you. There's no problem with that; it's a valid way of doing things.
But what is wrong is calling "stupid" everybody who thinks it's better to make the developer's life a bit more complicated in order to save money in the long run. Doing that is not just wrong, it's plain stupid (or bad faith).
Games have a different development cycle: the game maker (the software house) spends a lot, a really big lot, of money building the game for one platform, but when the thing is done, the thing is done. For the rest of the product's lifecycle, development is minimal. It's not the continuous development cycle we're used to on PCs - the hardware does not change! Again, when the thing is done, it's done.
It totally makes sense to save money on the hardware, which is hard and costly to "duplicate" (and into which the manufacturer sinks money during the whole product lifecycle), even if by doing so you make the developer's life harder: their product has a shorter development lifecycle, but it's easy to "duplicate" and has the same lifespan as the hardware.
Developers are not the core business in this industry. We're an important part of the business, but that's it.