The Outlook On AMD's Fusion Plans

PreacherTom writes "Now that AMD's acquisition of ATI is complete, what do the cards hold for the parent company? According to most experts, the outlook for AMD is promising. One of the brightest stars in AMD's future could be the Fusion program, which will 'fuse' AMD's CPUs with ATI's GPUs (graphics processing units) in a single, unified processor. The product is expected to debut in late 2007 or early 2008. Fusion brings hopes of energy efficiency, with the CPU and GPU residing on a single chip. Fusion chips could also ease the impact on users who plan to run Windows Vista with Aero, an advanced interface that will only run on computers that can handle a heavy graphics load. Lastly, the tight architecture provided by Fusion could lead to a new set of small, compelling devices that can handle rich media."
This discussion has been archived. No new comments can be posted.
  • by Channard ( 693317 ) on Thursday November 16, 2006 @04:58PM (#16875470) Journal
    .. or advertising on TV. I work in a computer shop and it seems loads of people have no idea who the hell AMD are. I've explained to a lot of customers that they're just a competitor to Intel, but still the customers go 'No, I've been told to get an Intel.' I can't recall ever having seen an AMD ad on telly at all.
  • by rjmars97 ( 946970 ) on Thursday November 16, 2006 @05:03PM (#16875568) Homepage
    Although I can see the potential efficiency increases, combining the GPU and CPU into one chip means that you will be forced to upgrade one when you only want to upgrade the other. To me, this seems like a bad idea in that AMD would have to make dozens of GPU/CPU combinations. Say I want one of AMD's chips in my headless server: am I going to have to buy a more expensive processor because it has a high-powered GPU that I don't want or need? What if I want to build a system with a good processor to start, but for budget reasons want to hold off on buying a good video card?

    Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.
  • Heat??? (Score:3, Insightful)

    by pla ( 258480 ) on Thursday November 16, 2006 @05:06PM (#16875616) Journal
    Although CPUs have gotten better in the past year, GPUs (particularly ATI's) still keep outdoing each other in just how much power they can suck.

    With a decent single-GPU gaming rig drawing over 200W just between the CPU and GPU, do they plan to start selling water cooling kits as the stock boxed cooler?
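
    (A rough back-of-envelope sketch of that power budget, in Python; the wattages are illustrative assumptions for 2006-era parts, not measured figures:)

      # Back-of-envelope power budget for a 2006-era gaming rig.
      # Every wattage here is an assumed, typical figure, not a vendor spec.
      cpu_tdp_w = 90    # high-end dual-core CPU under load (assumed)
      gpu_tdp_w = 120   # high-end single GPU under load (assumed)
      total_w = cpu_tdp_w + gpu_tdp_w
      print(f"CPU + GPU draw: ~{total_w} W")  # ~210 W -- the "over 200W" above

      # A fused part has to dump all of that heat from one package:
      spreader_cm2 = 10.0  # assumed heat-spreader contact area
      print(f"Heat flux: ~{total_w / spreader_cm2:.0f} W/cm^2")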
  • Yes but (Score:3, Insightful)

    by Ant P. ( 974313 ) on Thursday November 16, 2006 @05:09PM (#16875652)
    Will it run Linux less than half a year after it's obsoleted by the next version?
  • Upgrades ? (Score:1, Insightful)

    by Anonymous Coward on Thursday November 16, 2006 @05:10PM (#16875680)
    Let's say I buy Fusion. Later on, NVIDIA brings a cool graphics card to market. Will I be able to use an NVIDIA graphics card with Fusion?
  • by hirschma ( 187820 ) on Thursday November 16, 2006 @05:12PM (#16875732)
    Whoa. You're going to need a closed-source kernel driver to use your CPU now? They can eat me. The graphics driver situation is bad enough.

    This one is untouchable until they open up the graphics drivers - or goodbye AMD/ATI.

    jh
  • by milgr ( 726027 ) on Thursday November 16, 2006 @05:18PM (#16875864)
    Gee, most of the servers I use don't have a video card. Some of the servers have serial ports. Others talk over a proprietary fabric - and pretend to have a serial connection (and maybe even VGA). I don't need to walk into the lab to get to the servers' virtual consoles.

    Come to think of it, the way we have things set up, the console is inaccessible from the lab - but accessible via terminal concentrators, over the LAN.
  • by Chris Burke ( 6130 ) on Thursday November 16, 2006 @05:24PM (#16875946) Homepage
    Especially the former, where you can't really upgrade anyway and you typically have a GPU soldered to the board.

    The advantages of a combined CPU/GPU in this space are:
    1) Fewer chips means a cheaper board.
    2) The GPU is connected directly to the memory interface, so UMA (unified memory architecture) solutions will not suck nearly as hard.
    3) No HT hop to get to the GPU, so power is saved on the interface and CPU-GPU communication will be very low latency.
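
    (A toy latency model of point 3, sketched in Python; every nanosecond figure is a made-up illustrative assumption, not an AMD spec:)

      # Toy model of CPU->GPU command latency, discrete vs. fused.
      # All numbers are illustrative assumptions, not AMD specifications.
      ht_hop_ns = 60   # assumed HyperTransport hop out to an external GPU
      on_die_ns = 5    # assumed on-die CPU<->GPU interconnect latency
      setup_ns = 200   # assumed extra cost of staging a command over the bus

      discrete_path = ht_hop_ns + setup_ns   # CPU -> HT link -> external GPU
      fused_path = on_die_ns                 # CPU -> GPU sharing one die
      print(f"discrete: ~{discrete_path} ns/command, fused: ~{fused_path} ns/command")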

    I highly doubt AMD is planning on using combined CPU/GPU solutions on their mainstream desktop parts, and they are absolutely not going to do so for server parts. I think in those spaces they'd much rather have four cores on the CPU, and let you slap in the latest-greatest (ATI I'm sure they hope, but if NVidia gives them the best benchmark score vs Intel chips then so be it) graphics card.

    AMD has already distinguished their server, mobile, desktop, and value lines. They are not going to suddenly become retarded and forget that these markets have different needs and force an ATI GPU on all of them.
  • Re:Stock tip ... (Score:4, Insightful)

    by jandrese ( 485 ) <kensama@vt.edu> on Thursday November 16, 2006 @05:37PM (#16876194) Homepage Journal
    Yeah, but the heatsink for the processor/graphics card combo system will be righteous.

    Frankly, I'm betting this is going to turn out more like the next generation of integrated video. Basically, the only "fusion" chips you'll see will be ones designed for small/cheap boxes that people never upgrade the components on. I'm betting the graphics in general will be slow and the processor will be average. Super fast processors and fast graphics won't get the fusion treatment because the people who buy them tend to want to keep them separate (for upgrading later), not to mention the difficulty you'd have powering and cooling a chip that complex.
  • by kimvette ( 919543 ) on Thursday November 16, 2006 @05:44PM (#16876326) Homepage Journal
    I'm going to ask:

    That's great and all, but does it run Linux?

    I'm not kidding, either. Is AMD going to force ATI to open up its specs and its drivers so that we can FINALLY get stable and FULLY functional drivers for Linux, or are they still going to be partially-implemented limited-function binary blobs where support for older-yet-still-in-distribution-channels products will be phased out in order to "encourage" (read: force) customers to upgrade to new hardware, discarding still-current computers?

    That is why I do not buy ATI products any more. They provide ZERO VIVO support in Linux. They phase out chip support in drivers even while the products are still in active distribution. They do not maintain compatibility of older drivers to ensure they can be linked against the latest kernels.

    This is why I went Core 2 Duo for my new system and do not run AMD: their merger with ATI. My fear is that if ATI rubs off on AMD, then support for AMD processors and chipsets will only get worse, not better.

  • Re:Airport fun (Score:4, Insightful)

    by PFI_Optix ( 936301 ) on Thursday November 16, 2006 @06:05PM (#16876650) Journal
    You assume that this would do away with video cards; there's not a chance of that happening any time soon. As I said in another thread, it'd be quite simple for AMD to disable the on-chip video in favor of a detected add-in card.

    Right now I'm buying a $200 vidcard every 18-24 months. I'm looking at probably getting my next one in the middle of next year, around the same time I replace my motherboard, CPU, and RAM. My current system is struggling with the Supreme Commander beta, and upcoming games like Crysis should be equally taxing on it. In the past six years, I've bought three CPU upgrades. If AMD could market a $300 chip that gave me a CPU and GPU upgrade with similar performance and stayed on the same socket for 3-4 years, I'd be breaking even.
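
    (A quick Python sketch of that break-even arithmetic, using the figures above; the CPU upgrade price is an assumption the comment doesn't give:)

      # Six-year upgrade budget, using the figures from the comment above.
      years = 6
      gpu_price = 200            # "$200 vidcard"
      gpu_interval_months = 21   # midpoint of "every 18-24 months"
      cpu_upgrades = 3           # "three CPU upgrades" in six years
      cpu_price = 150            # assumed -- the comment gives no CPU price

      separate = (years * 12 // gpu_interval_months) * gpu_price + cpu_upgrades * cpu_price
      fused = (years * 12 // 24) * 300   # a $300 fused chip roughly every two years
      print(f"separate parts: ~${separate}")  # ~$1050
      print(f"fused chips:    ~${fused}")     # ~$900 -- roughly breaking even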
  • by Anonymous Coward on Thursday November 16, 2006 @06:08PM (#16876690)
    I'll buy this if they provide free drivers; I won't buy it if they don't. Vista's piggish graphics will surely push all GPUs to new performance levels. I don't care about on-chip integration nearly as much as I care about avoiding the need to use binary blobs in my free OS.
  • by rjstanford ( 69735 ) on Thursday November 16, 2006 @06:30PM (#16877060) Homepage Journal
    One thing to consider is that right now it's getting pretty easy to have "enough" RAM for 99% of all users. I mean, if you got a new machine today that had 1.5-2.0GB in it, the odds of ever wanting to upgrade would be slim to none. The fact is that most people live quite reasonably with 256-512MB right now, and will never upgrade. Note: most /. readers != most people. For modern machines, if you're not running anything more brutal than Office, having a gig permanently attached would probably make sense for most people who would be using an integrated-graphics type of system.
  • by Chris Burke ( 6130 ) on Thursday November 16, 2006 @06:48PM (#16877306) Homepage
    This is why I went Core 2 Duo for my new system and do not run AMD: their merger with ATI. My fear is that if ATI rubs off on AMD, then support for AMD processors and chipsets will only get worse, not better.

    It is pretty typical in a buyout like this for the larger company's culture to dominate the smaller one. While in many cases this is a bad thing as the smaller company has the more open culture, in this case it is the larger company, AMD, that is more open.

    It is ridiculous to think that support for AMD chipsets and processors will get worse since AMD has utterly depended on Linux to jump start the 64-bit x86 market. Oh, and a processor is nothing if it doesn't expose its interfaces, because they count on programmers to use those new instructions or modes or whatever to optimize their programs and make the processor look good. There is no DirectX or OpenGL equivalent that processors hide behind.
  • by racerx509 ( 204322 ) on Thursday November 16, 2006 @08:05PM (#16878168) Homepage
    This product will most likely find its way into the mobile industry. Imagine a laptop with a *decent* 3D accelerator that uses low power and can actually run a 3D game at a good frame rate, without weighing a ton or singeing your knees. They may be onto something here.
