Nvidia Working on a CPU+GPU Combo
Max Romantschuk writes "Nvidia is apparently working on an x86 CPU with integrated graphics. The target market seems to be OEMs, but what other prospects could a solution like this have? Given recent developments like Folding@Home's GPU client, you can't help but wonder about the possibilities of a CPU with an integrated GPU. Things like video encoding and decoding, audio processing, and other applications could benefit a lot from a low-latency CPU+GPU combo. What if you could put multiple chips like these in one machine? With AMD+ATI and Intel's own integrated graphics, will basic GPU functionality eventually be integrated into all CPUs? Will dedicated graphics cards become a niche product for enthusiasts and pros, as audio cards largely already have?" The article is from the Inquirer, so a dash of salt might make this more palatable.
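The Folding@Home angle is worth making concrete. The appeal of a GPU, integrated or not, is that it chews through embarrassingly parallel work. Here's a minimal sketch of that kind of offload using Nvidia's CUDA toolkit (my illustration, not from the article; the kernel, buffer size, and gain value are all made up):

    #include <cstdio>
    #include <cuda_runtime.h>

    // One thread per sample: the embarrassingly parallel shape that
    // GPGPU workloads like Folding@Home or audio processing exploit.
    // (Illustrative example; not from the article.)
    __global__ void apply_gain(float *samples, float gain, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) samples[i] *= gain;
    }

    int main() {
        const int n = 1 << 20;                    // ~1M samples (made-up size)
        size_t bytes = n * sizeof(float);
        float *h = (float *)malloc(bytes), *d;
        for (int i = 0; i < n; ++i) h[i] = 0.25f;

        cudaMalloc(&d, bytes);
        cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);  // today: a trip across the bus
        apply_gain<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
        cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);  // ...and back

        printf("first sample: %f\n", h[0]);               // prints 0.5
        cudaFree(d);
        free(h);
        return 0;
    }

The two cudaMemcpy calls are exactly where an on-die CPU+GPU combo could pay off: with shared memory and no bus trip, those copies and their latency largely disappear.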
Out of their league? (Score:5, Insightful)
Nvidia is a fantastic graphics card company - they should continue to innovate and focus on what they're good at rather than try to play follow-the-leader in an arena they know nothing about.
Math co-processors, anyone? (Score:5, Insightful)
patents (Score:2, Insightful)
I would very much doubt that they could compete with AMD and Intel, who have already patented many x86 CPU concepts.
It's a shame that Intel has decided not to buy Nvidia and to go it alone with its own design staff.
Re:Heard This One Before (Score:4, Insightful)
At one time floating point was done in software; it still is on some CPUs.
Then floating point co-processors became available. For some applications you really needed faster floating point, so it was worth shelling out big bucks for a chip to speed it up. This is very similar to what we have now with graphics cards.
Finally, CPUs had floating point units put right on the die. Later, DSP-like instructions were added to CPUs.
We are getting to the point where 3D graphics are mandatory. Tying them closer to the CPU is now a logical choice.
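To make that last step concrete (my example, not the poster's): "DSP-like instructions" means things like MMX and SSE, where one instruction operates on several values at once. A minimal sketch, assuming an SSE-capable x86 compiler; the function and values are invented:

    #include <xmmintrin.h>  // SSE intrinsics: the "DSP-like" instructions on the CPU die

    // Multiply n floats by s, four at a time, with a scalar loop for the tail.
    // (Illustrative example; the poster named no specific instruction set.)
    void scale(float *x, float s, int n) {
        __m128 vs = _mm_set1_ps(s);
        int i = 0;
        for (; i + 4 <= n; i += 4)
            _mm_storeu_ps(x + i, _mm_mul_ps(_mm_loadu_ps(x + i), vs));
        for (; i < n; ++i)
            x[i] *= s;
    }

The same absorption pattern - external unit first, then on-die hardware, then dedicated instructions - is what the parent is predicting for graphics.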
A cyclic process? (Score:5, Insightful)
Graphics chips seem to have gone through this cycle at least once; perhaps now we're just looking at its next stage? We've had graphics as a separate component from the processor for a while; perhaps the next step is for the two to combine into a G/CPU, to take advantage of the design gains in general-purpose processors.
Then at some point down the road, the GPU (or more likely, various GPU-like functional units) might get separated back out onto their own silicon, as more application-specific processors become advantageous once again.
Pointless without documentation. (Score:4, Insightful)
This isn't a good thing unless they also release documentation for it!
Niche market? (Score:4, Insightful)
Haven't they already???
The vast majority of machines (at least in my experience, which may not be broad enough) seem to be shipping with integrated graphics on the motherboard. Certainly, my last three computers have had this.
Granted, I buy on the trailing edge since I don't need gamer performance, but I kind of thought most consumer machines were no longer using a separate graphics card.
Anyone have any meaningful statistics as opposed to my purely anecdotal observations?
Re:Should Slashdot really insult other news outlet (Score:5, Insightful)
What NVidia eventually does may not bear much resemblance to the story.
Re:Niche market? (Score:3, Insightful)
I really doubt the CPU part is going to compete with the latest super-quadcore chips from AMD or Intel, so no one will use it for a mainstream computer. Possibly it'd have a market in embedded products, but I thought those were already well catered for.
Re:Heard This One Before (Score:5, Insightful)
And vice versa. This might work where someone wants an embedded GPU for low- to medium-end graphics. However, I doubt gamers would like the idea of having to purchase a new CPU every time a new GPU comes out, and vice versa.
There's something to be said for physically discrete components.
Re:Heard This One Before (Score:2, Insightful)
As others have replied, it's all about the bus speed. The amount of time it takes to move data from chip to chip can add enormous overhead.
Think back a little to the DEC Alpha. Now, the Alpha chip in and of itself was not really that remarkable. What was so VERY remarkable about the Alpha system was its bus switching. It was blazingly fast and could handle monster amounts of data from main memory to CPU, GPU, etc. The reason (mostly) that it's now sitting in HP's vault is that the bus switch was/is really expensive to manufacture.
So the way you do this without having to build that very expensive bus architecture is to just put the GPU on the die with the CPU. Everything runs at the internal speed of the processor, and it's fairly inexpensive, comparatively, to build.
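A rough way to see the overhead the parent is describing (my sketch, assuming Nvidia's CUDA toolkit and a made-up buffer size): time the trip across the bus separately from the on-chip work.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Trivial on-chip work, so the bus transfer dominates. (Illustrative example.)
    __global__ void scale(float *x, float s, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) x[i] *= s;
    }

    int main() {
        const int n = 1 << 22;                 // 16 MB of floats (made-up size)
        size_t bytes = n * sizeof(float);
        float *h = (float *)malloc(bytes), *d;
        for (int i = 0; i < n; ++i) h[i] = 1.0f;
        cudaMalloc(&d, bytes);

        cudaEvent_t t0, t1, t2;
        cudaEventCreate(&t0); cudaEventCreate(&t1); cudaEventCreate(&t2);

        cudaEventRecord(t0);
        cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);  // chip to chip, over the bus
        cudaEventRecord(t1);
        scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);      // on chip, at internal speed
        cudaEventRecord(t2);
        cudaEventSynchronize(t2);

        float copy_ms, kernel_ms;
        cudaEventElapsedTime(&copy_ms, t0, t1);
        cudaEventElapsedTime(&kernel_ms, t1, t2);
        printf("bus copy: %.2f ms, kernel: %.2f ms\n", copy_ms, kernel_ms);

        cudaFree(d); free(h);
        return 0;
    }

For light work like this, the copy typically dwarfs the kernel, which is exactly the overhead an on-die GPU (or the Alpha's exotic bus switch) is meant to kill.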