The Outlook On AMD's Fusion Plans
PreacherTom writes "Now that AMD's acquisition of ATI is complete, what do the cards hold for the parent company? According to most experts, it's a promising outlook for AMD. One of the brightest stars in AMD's future could be the Fusion program, which will 'fuse' AMD's CPUs with ATI's GPUs (graphics processing units) in a single, unified processor. The product is expected to debut in late 2007 or early 2008. Fusion brings hopes of energy efficiency, with the CPU and GPU residing on a single chip. Fusion chips could also ease the impact on users who plan to use Windows Vista with Aero, an advanced interface that will only run on computers that can handle a heavy graphics load. Lastly, the tight architecture provided by Fusion could lead to a new set of small, compelling devices that can handle rich media."
power efficiency?? (Score:2, Interesting)
Re:A better way to spend they money would be on PR (Score:2, Interesting)
Re:Airport fun (Score:3, Interesting)
Wow. And here I was thinking there was this vast market for things called "workstations" where businesses didn't need high-end video cards and home systems where users didn't require the best 3D accelerators on the market. Shows what I know.
Even most enthusiasts only replace their video cards every 12-18 months. If a CPU/GPU combo was in the same price range as a current video card (not farfetched) then there'd be no reason not to use a combo chip.
But hey, feel free to waste hundreds of dollars just because you think you know how things will work. Don't let the facts get in the way of your fanaticism.
but... (Score:2, Interesting)
Maybe... (Score:5, Interesting)
Linux Drivers (Score:3, Interesting)
I've been an nVidia advocate since 1999 when I bought a TNT2 Ultra for playing Quake III Arena under Linux on my (then) K6-2 400.
I'm on my 4th nVidia graphics card, and I have 6 machines, all running Linux. One is a 10-year-old UltraSPARC, one has an ATI card.
Despite slashbot rantings about the closed-source nVidia drivers, and despite my motley collection of Frankenstein hardware, I've never had a problem with the nVidia stuff. The ATI stuff is junk. The (open-source) drivers are pathetic, the display is snowy, and the performance is rubbish.
I hope AMD do something about the Linux driver situation.
My next machine will be another AMD, this time with dual dual-core processors and I'll be doing my own Slackware port, but I'll be buying an nVidia graphics card.
Re:Remember math coprocessors? (Score:1, Interesting)
Re:this will fail (Score:3, Interesting)
Integrating the GPU with the CPU will be about driving down cost and power consumption, not something that is usually a high priority for folks who want to run the latest and greatest games with all the shiniest graphics. So I'd be very surprised if this is intended to hit that part of the market; more likely it's designed to address the same segment that Intel hits with graphics embedded in the CPU's supporting chipset.
That said, having the CPU & GPU combined (from the point of view of register and memory access, etc.) might open up some interesting new possibilities for using the power of the GPU for certain non-graphics functions.
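To make that concrete, here's a rough sketch in plain C with OpenMP (purely my own illustration, nothing AMD has announced; the OpenMP pragma just stands in for whatever interface Fusion actually exposes) of the kind of non-graphics, data-parallel loop a CPU-resident GPU could take over. Every iteration is independent, and with shared register/memory access there would be none of the usual copy-to-the-card overhead.

    /* Hypothetical illustration: a SAXPY-style loop (y = a*x + y), the sort of
     * embarrassingly parallel, non-graphics work a CPU-resident GPU could absorb.
     * Plain C with OpenMP stands in for whatever interface Fusion ends up exposing. */
    #include <stdio.h>
    #include <stdlib.h>

    static void saxpy(int n, float a, const float *x, float *y)
    {
        int i;
        /* Each iteration touches only its own element, so the work can be
         * spread across however many simple cores are available. */
        #pragma omp parallel for
        for (i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        int n = 1 << 20, i;
        float *x = malloc(n * sizeof *x);
        float *y = malloc(n * sizeof *y);

        if (!x || !y)
            return 1;
        for (i = 0; i < n; ++i) {
            x[i] = (float) i;
            y[i] = 1.0f;
        }
        saxpy(n, 2.0f, x, y);
        printf("y[10] = %g\n", y[10]);   /* 2*10 + 1 = 21 */
        free(x);
        free(y);
        return 0;
    }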
Back in the day at Intergraph we had a graphics processor that could be combined with a very expensive (and, for the time, powerful) dedicated floating-point array processor. To demonstrate the power of that add-on, somebody hand-coded an implementation of the Mandelbrot fractal algorithm on it, and it was blisteringly fast. I can imagine similar highly parallelized algorithms doing very well on a GPU/CPU combo.
Re:Bad idea for upgrades (Score:2, Interesting)
As a hobbyist, though, this sort of move makes me uncomfortable and maybe even a little bit sad. I've always liked the puzzles that computers bring: programming, building, troubleshooting, compiling, security monitoring, maintaining, and even the jargon and zealotry that comes with being a computer enthusiast. When computers have become a standard black-box commodity, what will be the next hobby puzzle to hold my interest?
Oh. And yes. I'd like to claim intellectual property on the SuneB. Sure, the industry will call it something else and all the patents will have a different name, but at least, 10 years from now when a SuneB clone company is the driving force on the stock market, I can sit back and think to myself, "Somewhere on Slashdot there's a post proving that I should be a billionaire rather than a corporate wage-slave."
GPU or GPGPU? (Score:2, Interesting)
I remember writing graphics code in assembly and BASIC back in the day. You would set the VGA card to mode 13h and then write to... what was it now... segment 0xA000, I think. Anyway, whatever you wrote to that portion of memory would go straight to the screen.
If you had a huge SIMD co-processor, would it not be possible to rival modern GPUs with this model? Not to mention being able to do some cool stuff like having a video input card dump data directly into that portion of memory. So you could have video-in with the CPU completely idle.
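For what it's worth, here's a minimal sketch of that old model, assuming a 16-bit DOS compiler such as Turbo/Borland C (far pointers, MK_FP() and int86() from dos.h); it won't build on a modern protected-mode OS, but it shows the idea: set mode 13h via the BIOS, and every byte written at segment 0xA000 becomes one pixel.

    /* Old-school mode 13h sketch: 320x200, 256 colors, framebuffer at A000:0000.
     * Assumes a 16-bit DOS compiler (Turbo/Borland C) for the far keyword,
     * MK_FP() and int86(); purely illustrative. */
    #include <dos.h>
    #include <conio.h>

    #define W 320
    #define H 200

    static void set_video_mode(unsigned char mode)
    {
        union REGS r;
        r.h.ah = 0x00;                /* BIOS int 10h, function 0: set video mode */
        r.h.al = mode;
        int86(0x10, &r, &r);
    }

    int main(void)
    {
        unsigned char far *vga = (unsigned char far *) MK_FP(0xA000, 0);
        int x, y;

        set_video_mode(0x13);         /* enter 320x200, 256 colors */
        for (y = 0; y < H; ++y)
            for (x = 0; x < W; ++x)
                vga[y * W + x] = (unsigned char) ((x ^ y) & 0xFF);   /* one byte = one pixel */

        getch();                      /* wait for a keypress */
        set_video_mode(0x03);         /* back to text mode */
        return 0;
    }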
Not what you think (Score:2, Interesting)
A step between on-board video, and full graphics (Score:3, Interesting)
With Vista coming out soon, PC makers are going to want a low-cost 3D-accelerated solution that can run some (or maybe all) of the eye candy that comes with Vista.
Re:Stock tip ... (Score:3, Interesting)
"Righteous" = "big"?
Intel was making 130W CPUs until AMD got better performance with 60W (although Intel have now overtaken AMD on this.) I've got a 40W GPU which is as powerful as a 100W GPU of a couple of years ago.
A state-of-the-art CPU plus a mid-to-high range GPU today could come in at around 130W. The 130W CPU heat-sink problem is solved (for noisy values of "solved".)
Also, it is much easier to deal with a big heatsink on the motherboard than on a video card - the size and weight are much less restricted.
Hm, perhaps if AMD starts making 100+W Fusion chips, they'll start supporting Intel's BTX form factors (which were largely designed to improve cooling.) As a silent-computing nut, I think this would be a Good Thing.
Re:Heat??? (Score:5, Interesting)
You're talking about the high-end "do everything you can" GPUs... ATI is dominating the (discrete) mobile GPU industry because their mobile GPUs use so little power. Integrating (well) one of those into a CPU should still result in a low-power chip.
Re:It's for laptops and budget systems (Score:5, Interesting)
I think they are, and I think it's the right choice. The GPU that will be integrated will not be today's GPU, but a much more general processor. Look at NVidia's G80 for the beginning of this trend; they're adding non-graphics-oriented features like integer math, bitwise operations, and soon double-precision floating point. G80 has 128 (!) fully general-purpose SISD (not SIMD) cores, and soon with their CUDA API you will be able to run C code on them directly instead of hacking it up through DirectX or OpenGL.
AMD's Fusion will likely look a lot more like a Cell processor than, say, Opteron + X1900 on the same die. ATI is very serious about doing more than graphics: look at their CTM initiative (now in closed beta); they are doing the previously unthinkable and publishing the *machine language* for their shader engines! They want businesses to adopt this in a big way. And it makes a lot of sense: with a GPU this close to the CPU, you can start accelerating tons of things, from scientific calculations to SQL queries. Basically *anything* that is parallelizable can benefit.
I see this as nothing less than the future of desktop processors. One or two x86 cores for legacy code, and literally hundreds of simpler cores for sheer calculation power. Forget about games, this is much bigger than that. These chips will do things that are simply impossible for today's processors. AMD and Intel should both be jumping to implement this new paradigm, because it sets the stage for a whole new round of increasing performance and hardware upgrades. The next few years will be an exciting time for the processor business.
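As a rough illustration of the "anything parallelizable" point (and of the Mandelbrot anecdote a few comments up), here is a plain C sketch where every pixel is computed independently; the OpenMP pragma is just a stand-in for whatever CPU/GPU split Fusion-class hardware actually exposes, but that per-pixel independence is exactly what lets this kind of work spread across a hundred-plus simple cores.

    /* Escape-time Mandelbrot over a small grid; each pixel depends only on its
     * own coordinates, so the outer loop parallelizes trivially. */
    #include <stdio.h>

    #define W 78
    #define H 24
    #define MAX_ITER 256

    static int escape_iters(double cr, double ci)
    {
        double zr = 0.0, zi = 0.0;
        int i;
        for (i = 0; i < MAX_ITER && zr * zr + zi * zi <= 4.0; ++i) {
            double t = zr * zr - zi * zi + cr;
            zi = 2.0 * zr * zi + ci;
            zr = t;
        }
        return i;
    }

    int main(void)
    {
        static int iters[H][W];
        int x, y;

        /* Every (x, y) is independent: ideal work for many simple cores. */
        #pragma omp parallel for private(x)
        for (y = 0; y < H; ++y)
            for (x = 0; x < W; ++x)
                iters[y][x] = escape_iters(-2.5 + 3.5 * x / W, -1.25 + 2.5 * y / H);

        for (y = 0; y < H; ++y) {        /* crude ASCII rendering */
            for (x = 0; x < W; ++x)
                putchar(iters[y][x] == MAX_ITER ? '#' : ' ');
            putchar('\n');
        }
        return 0;
    }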
Re:GPU or GPGPU? (Score:1, Interesting)
Re:Airport fun (Score:3, Interesting)