The Outlook On AMD's Fusion Plans

PreacherTom writes "Now that AMD's acquisition of ATI is complete, what do the cards hold for the parent company? According to most experts, it's a promising outlook for AMD. One of the brightest stars in AMD's future could be the Fusion program, which will 'fuse' AMD's CPUs with ATI's GPUs (graphics processing units) in a single, unified processor. The product is expected to debut in late 2007 or early 2008. Fusion brings hopes of energy efficiency, with the CPU and GPU residing on a single chip. Fusion chips could also ease the impact on users who plan to use Windows Vista with Aero, an advanced interface that will only run on computers that can handle a heavy graphics load. Lastly, the tight architecture provided by Fusion could lead to a new set of small, compelling devices that can handle rich media."
  • by Anonymous Coward on Thursday November 16, 2006 @05:12PM (#16875730)
    Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.

    Unless combining the two increases the performance of the system as a whole enough that the AMD CPU/GPU combination keeps up with or beats the latest and greatest video card... ...and is nearly as cheap to upgrade as buying a high-end video card... ...and both of these seem entirely possible to me.
  • Re:Linux Drivers (Score:3, Informative)

    by Chris Burke ( 6130 ) on Thursday November 16, 2006 @05:29PM (#16876024) Homepage
    Despite slashbot rantings about the closed-source nVidia drivers, and despite my motley collection of Frankenstein hardware, I've never had a problem with the nVidia stuff. The ATI stuff is junk. The open source drivers are pathetic, the display is snowy, and the performance is rubbish.

    Well, if you do 3D gaming on Linux, you're used to closed source drivers, since there hasn't really been another choice since the 3dfx Voodoo -- who won me over by supporting Linux, if not the Free Software philosophy behind it. NVidia similarly works. The ATI drivers are terrible, and I'm not talking about the open source ones.

    I hope AMD do something about the Linux driver situation.

    Me too, because I'm sick of having only one practical choice for graphics cards. Not that I really have any complaints with NVidia, but it would be nice to be able to pick the best card, not the one that I can count on to work.

    I'm hopeful, just because AMD has been a big supporter of Linux and gcc, particularly in getting them to support AMD64. I guess we'll see.
  • by RuleBritannia ( 458617 ) on Thursday November 16, 2006 @05:56PM (#16876528)
    Any kind of integration tends to improve power efficiency, just because of the high capacitance of the PCB traces. That capacitance makes it difficult to route a PCB for high-speed inter-chip communications, never mind getting multiple 2.5Gb/s (PCIe) signal traces through a connector. All of this requires large driver cells for off-chip communication, and those use a great deal of power (and a moderate amount of area) on chip. Reducing the noise floor of your signals (by keeping them on chip) also gives you more headroom for voltage reductions in your digital hardware. All in all, integration makes for a much better power-efficiency picture. But dissipating power from these new chips will still be a headache for CPU package designers and systems guys alike.
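    The driver-power argument above can be sketched with the standard dynamic-power formula P = C·V²·f. The capacitance, voltage swing, and toggle-rate figures below are rough illustrative assumptions (not numbers from the comment or from any datasheet), chosen only to show why driving a PCB trace costs so much more than driving an on-die wire:

```python
# Back-of-the-envelope dynamic power for a signal driver: P = C * V^2 * f
# (charging and discharging a capacitive load once per cycle; activity
# factor assumed to be 1). All numeric values are illustrative guesses.

def driver_power_watts(load_capacitance_f: float,
                       swing_volts: float,
                       toggle_hz: float) -> float:
    """Dynamic power burned toggling a capacitive load."""
    return load_capacitance_f * swing_volts ** 2 * toggle_hz

# Assumed off-chip load: a PCB trace plus connector, ~10 pF.
# Toggle rate ~1.25 GHz (the worst-case alternating pattern of a
# 2.5 Gb/s PCIe gen-1 lane), swing ~1.2 V.
off_chip = driver_power_watts(10e-12, 1.2, 1.25e9)

# Assumed on-chip load: a short on-die wire, ~0.1 pF, same swing and rate.
on_chip = driver_power_watts(0.1e-12, 1.2, 1.25e9)

print(f"off-chip: {off_chip * 1e3:.1f} mW per signal")  # 18.0 mW
print(f"on-chip:  {on_chip * 1e3:.2f} mW per signal")   # 0.18 mW
print(f"ratio:    {off_chip / on_chip:.0f}x")           # 100x
```

    Under these assumed numbers the off-chip driver burns about 100x the power of the on-die wire per signal, which is the commenter's point: keeping CPU-GPU traffic on one die removes the big driver cells entirely.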
  • by asuffield ( 111848 ) <asuffield@suffields.me.uk> on Thursday November 16, 2006 @08:06PM (#16878178)
    That is why I do not buy ATI products any more.


    So you use SiS chipsets then? They're the only manufacturer I can think of who still provide specs for their video chips (or do Intel still do that too?).

    Unfortunately we're currently stuck with a range of equally sucky choices. I tend to buy (older) ATI cards because at least they get reverse-engineered drivers, eventually.
