Nvidia Working on a CPU+GPU Combo

Max Romantschuk writes "Nvidia is apparently working on an x86 CPU with integrated graphics. The target market seems to be OEMs, but what other prospects could a solution like this have? Given recent developments with projects like Folding@Home's GPU client, you can't help but wonder about the possibilities of a CPU with an integrated GPU. Things like video encoding and decoding, audio processing, and other applications could benefit a lot from a low-latency CPU+GPU combo. What if you could put multiple chips like these in one machine? With AMD+ATI and Intel's own integrated graphics, will basic GPU functionality be integrated in all CPUs eventually? Will dedicated graphics cards become a niche product for enthusiasts and pros, like audio cards already largely have?" The article is from the Inquirer, so a dash of salt might make this more palatable.
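
To make the GPGPU angle concrete, here is a minimal sketch in Nvidia's CUDA terms (which postdate this story; the kernel and buffer names are purely illustrative) of the kind of data-parallel work projects like Folding@Home push onto the graphics chip. The two explicit copies across the CPU-GPU bus are exactly the overhead an on-die CPU+GPU combo would shrink.

    // Illustrative only: a trivial data-parallel kernel (apply a gain to
    // audio samples) of the sort that GPGPU projects offload to the GPU.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <cstdlib>

    __global__ void apply_gain(float *samples, float gain, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) samples[i] *= gain;          // one thread per sample
    }

    int main() {
        const int n = 1 << 20;                  // ~1M samples
        float *host = (float *)malloc(n * sizeof(float));
        for (int i = 0; i < n; ++i) host[i] = 0.5f;

        float *dev;
        cudaMalloc(&dev, n * sizeof(float));

        // These two copies cross the CPU<->GPU bus; on a combined
        // CPU+GPU die this round trip is what would largely disappear.
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
        apply_gain<<<(n + 255) / 256, 256>>>(dev, 0.8f, n);
        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);

        printf("first sample after gain: %f\n", host[0]);
        cudaFree(dev);
        free(host);
        return 0;
    }
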
  • by Salvance ( 1014001 ) on Friday October 20, 2006 @12:32PM (#16517661) Homepage Journal
    Unless Nvidia is partnering with Intel to release this chip, I think they're getting too far out of their comfort zone to be successful. Sure, a dual or even quad core chip with half of the cores handling graphics would be great, but can Nvidia deliver? I doubt it ... look how many companies have gone down the tubes after spending millions or billions trying to make an x86-compatible chip, let alone trying to integrate top-end graphics as well.

    Nvidia is a fantastic graphics card company - they should continue to innovate and focus on what they're good at rather than try to play follow-the-leader in an arena they know nothing about.
  • by cplusplus ( 782679 ) on Friday October 20, 2006 @12:35PM (#16517709) Journal
    GPUs are going the way of the math co-processor. I think it's inevitable.
  • patents (Score:2, Insightful)

    by chipace ( 671930 ) on Friday October 20, 2006 @12:36PM (#16517725)
    It's quite clear that the AMD-ATI merger was to acquire the IP and expertise necessary to integrate GPU core(s) on the same die as CPU core(s). Nvidia does not actually have to market a design; patenting some key concepts could provide revenue or protection.

    I would very much doubt that they could compete with AMD and Intel, who have already patented many x86 CPU concepts.

    It's a shame that Intel has decided not to buy Nvidia and to go it alone with its own design staff.
  • by LWATCDR ( 28044 ) on Friday October 20, 2006 @12:39PM (#16517749) Homepage Journal
    Well think of it like floating point.
    At one time floating point was done in software; it still is on some CPUs.
    Then floating point co-processors became available. For some applications you really needed to speed up floating point so it was worth shelling out the big bucks for a chip to speed it up. This is very similar to what we have now with graphics cards.
    Finally, CPUs had floating point units put right on the die. Later, DSP-like instructions were added to CPUs.

    We are getting to the point where 3d graphics are mandatory. Tying it closer to the CPU is now a logical choice.
  • by Ironsides ( 739422 ) on Friday October 20, 2006 @12:46PM (#16517839) Homepage Journal
    Then why not just have some connections that come straight out of the CPU and go directly to a graphics card, bypassing any bus entirely?
  • A cyclic process? (Score:5, Insightful)

    by Kadin2048 ( 468275 ) <slashdot.kadin@xox y . net> on Friday October 20, 2006 @12:49PM (#16517867) Homepage Journal
    A while ago -- and maybe it was in the Slashdot discussion about ATI, I'm not sure -- somebody described a cycle in computer design, where various components are built in monolithically, then broken out as separate components, and then swallowed back up into monolithic designs again.

    Graphics chips seem to have done this cycle at least once; perhaps now we're just looking at the next stage in the cycle? We've had graphics as a separate component from the processor for a while; perhaps the next stage in the cycle is for them to combine together into a G/CPU, to take advantage of the design gains in general-purpose processors.

    Then at some point down the road, the GPU (or more likely, various GPU-like functional units) might get separated back out onto their own silicon, as more application-specific processors become advantageous once again.
  • by sudog ( 101964 ) on Friday October 20, 2006 @12:56PM (#16517939) Homepage
    Why is everyone getting excited about this? Now we're going to have a CPU that's only partially documented, and we lose even more ground to closed-source blobs.

    This isn't a good thing unless they also release documentation for it!
  • by TheRaven64 ( 641858 ) on Friday October 20, 2006 @01:07PM (#16518113) Journal
    The thing is, it doesn't need to be a very good x86 chip. Something like a VIA C7 is enough for most uses, if coupled with a reasonable GPU. I can imagine something like this being very popular in the sub-notebook (which used to mean smaller-than-letter-paper but now means not-as-fat-as-our-other-products) arena where power usage is king. If the CPU and GPU are on the same die then this gives some serious potential for good power management, especially if the memory controller is also on-die. This makes for very simple motherboard designs (and simple = cheap), so it could be very popular.
  • Niche market? (Score:4, Insightful)

    by gstoddart ( 321705 ) on Friday October 20, 2006 @01:08PM (#16518117) Homepage
    Will dedicated graphics cards become a niche product for enthusiasts and pros, like audio cards already largely have?

    Haven't they already???

    The vast majority of machines (at least, in my experience, which may not be broad enough) seem to be shipping with integrated graphics on the motherboard. Certainly, my last 3 computers have had this.

    Granted, I buy on the trailing edge since I don't need gamer performance, but I kind of thought most consumer machines were no longer using a separate graphics card.

    Anyone have any meaningful statistics as opposed to my purely anecdotal observations?
  • by vondo ( 303621 ) * on Friday October 20, 2006 @01:11PM (#16518175)
    The Inquirer is more of a rumor site than a news site. They have a pretty good track record for their rumors, but they don't have people on record backing this one up.

    What NVidia eventually does may not bear much resemblance to the story.
  • Re:Niche market? (Score:3, Insightful)

    by gbjbaanb ( 229885 ) on Friday October 20, 2006 @01:18PM (#16518265)
    It's hardly a niche market - every server wants onboard graphics, mainly because it doesn't need to be very powerful. I imagine this is similar - a low-powered CPU on the same chipset as the graphics chip (and no doubt the network chip) would make motherboards a bit cheaper, or give them more capabilities than they currently have to manage with software installed as a driver.

    I really doubt the CPU part is going to compete with the latest super-quadcore chips from AMD or Intel, so no-one will use it for a mainstream computer. Possibly it'd have a market for embedded products but I thought they were already well catered for.
  • by Ironsides ( 739422 ) on Friday October 20, 2006 @01:49PM (#16518679) Homepage Journal
    And, of course, the reason number one: you get a guaranteed GPU sale for each CPU sale - goodbye pesky competition ;).

    And vice versa. This might work where someone wants an embedded GPU for low to medium end graphics. However, I doubt gamers would like the idea of having to purchase a new CPU every time a new GPU comes out, and vice versa.

    There's something to be said for physically discrete components.
  • by FlyingGuy ( 989135 ) <.flyingguy. .at. .gmail.com.> on Friday October 20, 2006 @02:02PM (#16518925)

    As others have replied, it's all about the bus speed. The amount of time it takes to move data from chip to chip can insert enormous overhead.

    Think back a little to the DEC Alpha. Now the Alpha chip in and of itself was not really that remarkable. What was so VERY remarkable about the Alpha system was its bus switching. It was blazingly fast and could handle monster amounts of data from main memory to CPU, GPU, etc. The reason (mostly) that it's now sitting in HP's vault is that the bus switch was/is really expensive to manufacture.

    So the way you do this without having to build a very expensive bus architecture is to just put the GPU on the die with the CPU. Everything runs at the internal speed of the processor and it's fairly inexpensive, comparatively, to build. (A rough timing sketch of that chip-to-chip overhead follows the comments.)
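
For anyone curious how lopsided that chip-to-chip cost can be, here is a rough sketch (it assumes a CUDA-capable card and the CUDA runtime, which arrived after this story; numbers vary wildly by system) that times the copy across the bus separately from a single pass over the same data once it is on the chip.

    // Rough timing sketch: compare moving a buffer across the CPU<->GPU
    // bus with the cost of touching it once it is already on the chip.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <cstdlib>

    __global__ void touch(float *d, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) d[i] += 1.0f;                // one pass over the data
    }

    int main() {
        const int n = 1 << 24;                  // 64 MB of floats
        size_t bytes = n * sizeof(float);
        float *host = (float *)calloc(n, sizeof(float));
        float *dev;
        cudaMalloc(&dev, bytes);

        cudaEvent_t t0, t1, t2;
        cudaEventCreate(&t0); cudaEventCreate(&t1); cudaEventCreate(&t2);

        cudaEventRecord(t0);
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // across the bus
        cudaEventRecord(t1);
        touch<<<(n + 255) / 256, 256>>>(dev, n);                // on-chip work
        cudaEventRecord(t2);
        cudaEventSynchronize(t2);

        float copy_ms = 0, kernel_ms = 0;
        cudaEventElapsedTime(&copy_ms, t0, t1);
        cudaEventElapsedTime(&kernel_ms, t1, t2);
        printf("bus copy: %.2f ms, on-chip pass: %.2f ms\n", copy_ms, kernel_ms);

        cudaFree(dev);
        free(host);
        return 0;
    }

On a typical system the bus copy dwarfs the trivial arithmetic, which is the commenter's point about moving data from chip to chip.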
