The Outlook On AMD's Fusion Plans

PreacherTom writes "Now that AMD's acquisition of ATI is complete, what do the cards hold for the parent company? According to most experts, it's a promising outlook for AMD. One of the brightest stars in AMD's future could be the Fusion program, which will 'fuse' AMD's CPUs with ATI's GPUs (graphics processing units) into a single, unified processor. The product is expected to debut in late 2007 or early 2008. Fusion brings hopes of energy efficiency, with the CPU and GPU residing on a single chip. Fusion chips could also ease the impact on users who plan to use Windows Vista with Aero, an advanced interface that will only run on computers that can handle a heavy graphics load. Lastly, the tight architecture provided by Fusion could lead to a new set of small, compelling devices that can handle rich media."
  • by guysmilee ( 720583 ) on Thursday November 16, 2006 @04:54PM (#16875388)
    Invest in heat sinks! :-)
    • Not really. Because the Fusion concept integrates so much into one chip, it also has the potential to substantially cut the size of the motherboard itself, which means lower overall power consumption. This means we could dramatically reduce the size of the CPU box to something not much bigger than the current Mac Mini but still have tremendous processing power. It would certainly make it easier for designers of small barebones systems like Shuttle's XPC boxes.
      • Re:Stock tip ... (Score:4, Insightful)

        by jandrese ( 485 ) <kensama@vt.edu> on Thursday November 16, 2006 @05:37PM (#16876194) Homepage Journal
        Yeah, but the heatsink for the processor/graphics card combo system will be righteous.

        Frankly, I'm betting this is going to turn out more like the next generation of integrated video. Basically, the only "fusion" chips you'll see will be ones designed for small/cheap boxes that people never upgrade the components on. I'm betting the graphics in general will be slow and the processor will be average. Super fast processors and fast graphics won't get the fusion treatment because the people who buy them tend to want to keep them separate (for upgrading later), not to mention the difficulty you'd have powering and cooling a chip that complex.
        • Re: (Score:3, Interesting)

          Yeah, but the heatsink for the processor/graphics card combo system will be righteous.

          "Righteous" = "big"?

          Intel was making 130W CPUs until AMD got better performance with 60W (although Intel have now overtaken AMD on this.) I've got a 40W GPU which is as powerful as a 100W GPU of a couple of years ago.

          A state-of-the-art CPU plus a mid-to-high range GPU today could come in at around 130W. The 130W CPU heat-sink problem is solved (for noisy values of "solved".)

          Also, it is much easier to deal with a big heatsi
          • Actually, I think an XPC-sized box with heat-pipe cooling for the Fusion chip will actually work pretty well.

            I think people are confused about the nature of Fusion--it is intended for general computer users, not the high-end geeks who want to load up on the latest in everything inside the computer.
        • It is not clearly stated WHAT and HOW AMD will integrate on a single chip.

          It could be, as you assume, a cheap graphics core, something similar to the Intel chipsets that integrate some low-end 3D graphics.

          It could also be a 3D geometry unit: a specialised vector engine that could be used for geometry, physics, or general-purpose GPU programming. That would be more like AMD's answer to the Cell processor.

          In which case a Fusion chip would still be used together with specialised GFX cards. But vector calculat
      • There might be less overall energy use, but there will be a much higher energy use in the small part of the motherboard in which the processor resides.
        Heat sinks don't cover the whole motherboard, just the hottest parts, so putting two devices that each normally need heat sinks into one small area implies to me that the OP is right. There will have to be some kick-arse cooling to stop it from melting.
        Especially when you take into account that by the time they get anything out to market, the CPU that wil
  • How will Homeland Security like you bringing home a multi-core Fusion through the gates?

    "But, but its an AMD processor, built in Germany or Russia or somewhere"

    "Teh internet told me it was more powerful than anything else out there."

    "It would literally blow me away!"
    • won't buy it.
      hate ATI and when AMD bought them, they absorbed ATI's taint.
      this solution is really only good for servers anyway, when people don't care about video.
      and, since server solutions can get away with uber low-end graphics chips, why bother?

      enthusiasts will be wanting to upgrade separately...defeating the purpose.
      • Re: (Score:3, Interesting)

        by PFI_Optix ( 936301 )
        So...there are only servers and enthusiasts in the market?

        Wow. And here I was thinking there was this vast market for things called "workstations" where businesses didn't need high-end video cards and home systems where users didn't require the best 3D accelerators on the market. Shows what I know.

        Even most enthusiasts only replace their video cards every 12-18 months. If a CPU/GPU combo was in the same price range as a current video card (not farfetched) then there'd be no reason not to use a combo chip.

        B
        • by grommit ( 97148 )
          The GP may be on to something though. Technically, all of those business and home desktops *are* servers since they've got a myriad of trojans, rootkits and bots busily serving data to script kiddies around the globe.
        • by geekoid ( 135745 )
          but his point is valid.

          Most desktop machines in the workplace do not need high-end video cards. The onboard one works fine.

          "Even most enthusiasts only replace their video cards every 12-18 months. If a CPU/GPU combo was in the same price range as a current video card (not farfetched) then there'd be no reason not to use a combo chip."
          So if you bought a machine 6 months ago, and now you have to upgrade the entire thing to use an application, that's fine with you?

          feel free to waste those hundreds of dollars.

          En
          • Re:Airport fun (Score:4, Insightful)

            by PFI_Optix ( 936301 ) on Thursday November 16, 2006 @06:05PM (#16876650) Journal
            You assume that this would do away with video cards; there's not a chance of that happening any time soon. As I said in another thread, it'd be quite simple for AMD to disable the on-chip video in favor of a detected add-in card.

            Right now I'm buying a $200 vidcard every 18-24 months. I'm looking at probably getting my next one middle of next year, around the same time I replace my motherboard, CPU, and RAM. My current system is struggling with the Supreme Commander beta and upcoming games like Crysis should be equally taxing on it. In the past six years, I've bought three CPU upgrades. If AMD could market a $300 chip that gave me a CPU and GPU upgrade with similar performance and stay on the same socket for 3-4 years, I'd be breaking even.
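            As a rough sanity check on that break-even claim: the $200 video card and $300 combined-chip prices are from the comment above, the ~$150 per CPU upgrade below is an assumption, and the arithmetic is only a sketch.

              /* Rough break-even arithmetic for the upgrade cadence described
               * above. The $150 per-CPU-upgrade figure is an assumption; the
               * $200 video card and $300 combined-chip prices are from the
               * comment. */
              #include <stdio.h>

              int main(void)
              {
                  /* Today: a $200 video card roughly every 21 months (the
                   * midpoint of 18-24), plus three CPU upgrades over six
                   * years at an assumed ~$150 each. */
                  double gpu_per_year   = 200.0 / (21.0 / 12.0);
                  double cpu_per_year   = (3.0 * 150.0) / 6.0;

                  /* Proposed: one $300 CPU+GPU combo on the same cadence,
                   * with the socket lasting the whole period. */
                  double combo_per_year = 300.0 / (21.0 / 12.0);

                  printf("Separate parts: ~$%.0f/year\n",
                         gpu_per_year + cpu_per_year);
                  printf("Combined chip:  ~$%.0f/year\n", combo_per_year);
                  return 0;
              }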
        • Most people I have seen want a workstation for either high-end programming or rendering something (3D video/pictures, etc). The high-end programmers run dual or more (usually quad) monitors and the rendering people need the graphics processing power. Not the gamer cards but the rendering cards, Quadro from Nvidia and Fire from ATI (? I forget ATI's cards). Neither of those types of cards is low end. The plain number-crunching business people, yes, they don't need a high-end graphics solution. But many business pe
          • You'd think they'd learn their lesson at least after burning out the second card, if not the first. Where are they getting the money from? :o
          • I'm sorry, but I just don't think you have a realistic picture of computing demands.

            Call centers, cubicle farms for accounting and the like, home users who don't do any gaming, even dual monitor users--a Radeon 7000 can use dual monitors quite well for productivity apps--far outnumber stock traders, programmers, and graphic designers.

            As for gamers: anyone chewing through 2-5 video cards a year is reckless and spends way too much money on hardware. ATI and nVidia are both on a six-month cycle, so there is ab
      • Gee, most of the servers I use don't have a video card. Some of the servers have serial ports. Others talk over a proprietary fabric - and pretend to have a serial connection (and maybe even VGA). I don't need to walk into the lab to get to the server's virtual consoles.

        Coming to think of it, the way we have things set up, the console is inaccessible from the lab - but accessible via terminal concentrators - over the lan.
        • by julesh ( 229690 )
          I don't think that's a common setup, though. Most servers I've worked with are built from commodity components, because it's cheaper. Yeah, if things go really wrong you need to connect a monitor, but really, how often will you need to do that? Usually, first time you install an OS on it, and then again if you completely reinstall due to disk failure (or whatever).
      • How can you use a server without Aero? The command line is too scary...
  • by Channard ( 693317 ) on Thursday November 16, 2006 @04:58PM (#16875470) Journal
    .. or advertising on TV. I work in a computer shop and it seems loads of people have no idea who the hell AMD are. I've explained to a lot of customers that they're just a competitor to Intel, but still the customers go 'No, I've been told to get an Intel.' I can't recall ever having seen an AMD ad on telly at all.
    • ...which may explain how AMD has managed to keep their costs low over the years. Word of mouth is compelling...even to the point that many folks that I know are now biased against Intel...even though we are at a unique point where AMD's advantage has eroded...at least for the moment.
    • To some extent, yes.

      I just finished explaining to a friend why the software he just purchased for his business will run fine on the laptop I suggested from a Fry's ad. The specs for the software list "Intel Processor" and he assumed the AMD chip in my recommendation wouldn't work, because he has no idea what a processor even does. I would even hazard a guess that whoever wrote those specs doesn't know either; this little software vendor isn't getting paid to push Intel hardware.

      If they could just get word
      • by raynet ( 51803 )
        Unfortunately there are some programs that do require Intel CPUs to work or to enable accelerated mode. I recall my friend talking about some video editing program that required an Intel P4 to work and warned you about this fact when you tried to install it on an AMD machine.

        Also I think the Intel C compiler only enables MMX/SSE/SSE2 etc. support on Intel CPUs, thus making the programs slower on AMD unless you patch the binary to bypass the CPU check.
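        For context, that kind of check is usually a CPUID test keyed on the vendor string rather than on the SSE feature bits themselves. Below is a minimal illustrative sketch in C using GCC's __get_cpuid; it is not the dispatch code of any real compiler, just the general shape of such a check.

          #include <stdio.h>
          #include <string.h>
          #include <cpuid.h>

          int main(void)
          {
              unsigned int eax, ebx, ecx, edx;
              char vendor[13];

              /* CPUID leaf 0: vendor string comes back in EBX, EDX, ECX. */
              if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
                  return 1;

              memcpy(vendor + 0, &ebx, 4);
              memcpy(vendor + 4, &edx, 4);
              memcpy(vendor + 8, &ecx, 4);
              vendor[12] = '\0';

              /* Dispatching on the vendor string instead of the SSE feature
               * bits (CPUID leaf 1) is what penalizes non-Intel CPUs. */
              if (strcmp(vendor, "GenuineIntel") == 0)
                  printf("vendor check passed: SSE fast path enabled\n");
              else
                  printf("vendor is %s: fast path skipped despite SSE\n",
                         vendor);
              return 0;
          }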
    • Leave the advertising to the computer manufacturers like Dell and HP. Now that Dell is going to carry AMD, they could start putting AMD logos, with whatever "chime" AMD has, at the end of every one of their commercials. On their website, they can even say, "Now we bring you more savings by offering you the option of an AMD processor," followed by an explanation that AMD is on par with Intel when it comes to speed and is fully compatible with all software designed for Intels. If I were in charge, I'd pr
      • If I were in charge, I'd probably drop the Celeron as the low cost alternative and just push AMD's.

        You'd want to make sure this move wouldn't make people equate AMD with Celeron performance. I wouldn't want AMD to be seen as a Celeron replacement.
    • I saw one of Dell's laptop commercials and the only processor options it stated were AMD's single- and dual-core Turions. They are getting a little PR at least.

      Considering how well they are doing without any form of mass attention, they really could deal a major blow to Intel if they advertised.

      Then again, the shock value of telling people I use AMD is like telling them about my Fiero. "Wtf is AMD/Fiero?". ^_^
  • yeah. i'm also wondering how putting the two hottest components on the motherboard (the GPU and CPU) into the same package is a power savings... :-/ maybe on the low end of the market where the performance of the GPU is irrelevant, but for those who actually care about GPU performance, putting the two most power-hungry and memory-bandwidth-hungry components together doesn't seem like a good idea.
    • by Yfrwlf ( 998822 )
      There will be a decrease in net heat output and power consumption, but you're right, it will make them hotter than just the CPU or GPU by itself. Maybe water cooling will be really needed at that point for the fastest CPU/GPU combos.
    • by RuleBritannia ( 458617 ) on Thursday November 16, 2006 @05:56PM (#16876528)
      Any kind of integration tends to improve power efficiency just because of the high capacitance of the PCB traces. This makes it difficult to route a PCB for high-speed inter-chip communications, never mind getting multiple 2.5Gb/s (PCIe) signal traces through a connector. All this requires large driver cells to drive off-chip communication, and these use a great deal of power (and moderate area) on chip. Reducing the noise floor of your signals (by keeping them on chip) also gives you more headroom for voltage reductions in your digital hardware. All in all, it's a much better picture for power efficiency. But dissipating power from these new chips will still be a headache for CPU package designers and systems guys alike.
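      To put rough numbers on that: dynamic switching power per signal is roughly P = a*C*V^2*f, so a board trace with tens of times the capacitance of an on-die wire, driven at a higher swing, costs a great deal more energy per bit. The capacitance, voltage, and toggle-rate figures in this sketch are illustrative assumptions, not measurements of any real interface.

        #include <stdio.h>

        /* Dynamic switching power: activity * capacitance * V^2 * frequency. */
        static double dynamic_power(double activity, double cap_farads,
                                    double volts, double freq_hz)
        {
            return activity * cap_farads * volts * volts * freq_hz;
        }

        int main(void)
        {
            const double f        = 1.25e9; /* toggle rate per lane (assumed) */
            const double activity = 0.5;    /* switching activity (assumed)   */

            /* Off-chip: a PCB trace plus pads might present ~10 pF and need a
             * 1.5 V swing from a large driver cell. On-chip: a short wire at
             * ~0.2 pF and 1.0 V. Both figures are assumptions. */
            double p_off = dynamic_power(activity, 10e-12, 1.5, f);
            double p_on  = dynamic_power(activity, 0.2e-12, 1.0, f);

            printf("per-lane power, off-chip: %.2f mW\n", p_off * 1e3);
            printf("per-lane power, on-chip:  %.2f mW\n", p_on * 1e3);
            printf("ratio: ~%.0fx\n", p_off / p_on);
            return 0;
        }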
    • by julesh ( 229690 )
      A fairly large proportion of the power requirement of a general-purpose CPU goes into running the L2 cache. I'm not familiar with GPU design techniques, but I'd hazard a guess that the same is true there. If this is the case, a design with a unified cache would not only be more efficient time-wise (because data produced by the CPU is already in the cache when the GPU comes to render it) but energy-wise as well (because a smaller overall cache is necessary).
  • by rjmars97 ( 946970 ) on Thursday November 16, 2006 @05:03PM (#16875568) Homepage
    Although I can see the potential efficiency increases, combining the GPU and CPU into one chip means that you will be forced to upgrade one when you only want to upgrade the other. To me, this seems like a bad idea in that AMD would have to make dozens of GPU/CPU combinations. Say I want one of AMD's chips in my headless server, am I going to have to buy a more expensive processor because it has a high powered GPU that I don't want or need? What if I want to build a system with a good processor to start, but due to budget reasons want to hold off on buying a good video card?

    Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.
    • Re: (Score:1, Informative)

      by Anonymous Coward
      Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.

      Unless combining the two increases the performance of the system as a whole enough that the AMD CPU/GPU combination keeps up with or beats the latest and greatest video card... and is nearly as cheap to upgrade as buying a high-end video card... and both of these seem entirely possible to me.
    • by Chris Burke ( 6130 ) on Thursday November 16, 2006 @05:24PM (#16875946) Homepage
      Especially the former, where you can't really upgrade anyway and you typically have a GPU soldered to the board.

      The advantages of a combined CPU/GPU in this space are:
      1) Fewer chips means a cheaper board.
      2) The GPU is connected directly to the memory interface, so UMA solutions will not suck nearly as hard.
      3) No HT hop to get to the GPU, so power is saved on the interface and CPU-GPU communication will be very low latency.

      I highly doubt AMD is planning on using combined CPU/GPU solutions on their mainstream desktop parts, and they are absolutely not going to do so for server parts. I think in those spaces they'd much rather have four cores on the CPU, and let you slap in the latest-greatest (ATI I'm sure they hope, but if NVidia gives them the best benchmark score vs Intel chips then so be it) graphics card.

      AMD has already distinguished their server, mobile, desktop, and value lines. They are not going to suddenly become retarded and forget that these markets have different needs and force an ATI GPU on all of them.
      • by jandrese ( 485 )
        You know, there could be a short-term marketing coup to be had with a UMA-style setup. I've noticed that a lot of people compare cards based entirely on how much memory they have (you have a 128MB card? Bah, my 256MB card is twice as good!), and will even mention it in game requirements ("requires a 64MB video card"). With UMA you could theoretically count the entire system memory as your card memory: "this new system has 1GB of video on it!" People will think it's the hottest new thing until they get home
      • Re: (Score:3, Insightful)

        by racerx509 ( 204322 )
        This product will most likely find its way into the mobile industry. Imagine a laptop with a *decent* 3D accelerator that uses low power and can actually run a 3D game at a good frame rate, without weighing a ton or singeing your knees. They may be onto something here.
      • I think in those spaces they'd much rather have four cores on the CPU, and let you slap in the latest-greatest...graphics card.

        If you're running a server (let's say a web server), aren't you only going to put in a video card that barely has anything on it (I'm thinking ATi Rage stuff, where all you need is 1024x768 or something)?
        • Yes, the "let you" was to indicate that doing so would be the customer's choice. Obviously few server customers would. I was mostly referring to the desktop market with that comment.
      • by modeless ( 978411 ) on Thursday November 16, 2006 @09:27PM (#16879024) Journal
        I highly doubt AMD is planning on using combined CPU/GPU solutions on their mainstream desktop parts, and they are absolutely not going to do so for server parts

        I think they are, and I think it's the right choice. The GPU that will be integrated will not be today's GPU, but a much more general processor. Look at NVidia's G80 for the beginning of this trend; they're adding non-graphics-oriented features like integer math, bitwise operations, and soon double-precision floating point. G80 has 128 (!) fully general-purpose SISD (not SIMD) cores, and soon with their CUDA API you will be able to run C code on them directly instead of hacking it up through DirectX or OpenGL.

        AMD's Fusion will likely look a lot more like a Cell processor than, say, Opteron + X1900 on the same die. ATI is very serious about doing more than graphics: look at their CTM initiative (now in closed beta); they are doing the previously unthinkable and publishing the *machine language* for their shader engines! They want businesses to adopt this in a big way. And it makes a lot of sense: with a GPU this close to the CPU, you can start accelerating tons of things, from scientific calculations to SQL queries. Basically *anything* that is parallelizable can benefit.

        I see this as nothing less than the future of desktop processors. One or two x86 cores for legacy code, and literally hundreds of simpler cores for sheer calculation power. Forget about games, this is much bigger than that. These chips will do things that are simply impossible for today's processors. AMD and Intel should both be jumping to implement this new paradigm, because it sets the stage for a whole new round of increasing performance and hardware upgrades. The next few years will be an exciting time for the processor business.
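        To make the "anything parallelizable" point concrete, here is the kind of embarrassingly parallel loop that maps naturally onto hundreds of simple stream cores: every output element depends only on its own inputs, so iterations can run in any order or all at once. This is a plain-C sketch of the workload only, not of the CUDA or CTM interfaces themselves.

          #include <stdio.h>
          #include <stdlib.h>

          /* y[i] = a * x[i] + y[i] -- the classic SAXPY kernel. On a GPU each
           * iteration would become one thread; on a CPU it vectorizes. */
          static void saxpy(size_t n, float a, const float *x, float *y)
          {
              size_t i;
              for (i = 0; i < n; i++)
                  y[i] = a * x[i] + y[i];
          }

          int main(void)
          {
              const size_t n = 1u << 20;
              size_t i;
              float *x = malloc(n * sizeof *x);
              float *y = malloc(n * sizeof *y);
              if (!x || !y)
                  return 1;

              for (i = 0; i < n; i++) {
                  x[i] = (float)i;
                  y[i] = 1.0f;
              }

              saxpy(n, 2.0f, x, y);
              printf("y[42] = %f\n", y[42]); /* expect 2*42 + 1 = 85 */

              free(x);
              free(y);
              return 0;
          }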
        • they are doing the previously unthinkable and publishing the *machine language* for their shader engines!

          This could be of huge benefit to F/OSS - if it's possible to write a decent GPL driver for an AMD GPU, there's suddenly a huge lever to persuade nVidia to open their GPU machine language too. Yay for AMD (again...)
      • I could see this on servers and desktops equally, just not on gaming rigs.

        The average desktop user is at the point now where they buy a new PC entirely if the old one is too slow in any area. It's not about RAM upgrades or video cards; it's about a PC to them, and they buy a whole new one.

        On the same note, integrated video is more than enough for most server configurations, and only high-end CAD/visualization workstations and gaming rigs need independent graphics capabilities.
    • Re: (Score:2, Interesting)

      I can appreciate that an integrated CPU/GPU combination may have advantages in many arenas. It feels like a Bad Idea, though, in the same way that televisions with integrated VHS players were a bad idea, and all-in-one stereo systems didn't become a Good Idea until they came down both in price and physical size. In general I'm not comfortable with someone else bundling my technology for me. I'll be more than happy to accept the cost of keeping up to date with researching the individual components, and ac
    • Say I want one of AMD's chips in my headless server, am I going to have to buy a more expensive processor because it has a high powered GPU that I don't want or need?

      Probably AMD will continue to make GPU-less chips for headless servers and specialized applications where no GPU is needed, just as (for a while, at least) Intel made 486SX chips, which were 486s without the FPU, when FPUs were first built into CPUs. Although with the emergence of ideas to leverage GPUs for non-display applications, I wouldn't b

    • Presumably AMD will have a limited set of these processors and most of them will be targeted at budget systems, the rest at laptops. Neither kind of machine is typically upgraded - most laptops you can't upgrade, and most budget systems just never need to be upgraded because the kind of people who buy them want to surf the web, read email, and use office. And maybe a simple paint program, and download pictures off their digital camera. Any crap computer today can do all that; might as well get more integrat
  • So... (Score:3, Funny)

    by FlyByPC ( 841016 ) on Thursday November 16, 2006 @05:06PM (#16875614) Homepage
    Energy efficiency...
    Project named Fusion...
    ...
    Please tell me Pons and Fleischmann [wikipedia.org] aren't behind this?
    • Project named Fusion...

      We also got a gas-guzzling car and a razor with numerous blades. I say that if it doesn't net fusion energy, there should be a law against calling it fusion!
  • Heat??? (Score:3, Insightful)

    by pla ( 258480 ) on Thursday November 16, 2006 @05:06PM (#16875616) Journal
    Although CPUs have gotten better in the past year, GPUs (particularly ATI's) still keep outdoing each other in just how much power they can suck.

    With a decent single-GPU gaming rig drawing over 200W just between the CPU and GPU, do they plan to start selling water cooling kits as the stock boxed cooler?
    • Re:Heat??? (Score:5, Interesting)

      by Pulzar ( 81031 ) on Thursday November 16, 2006 @07:58PM (#16878096)
      Although CPUs have gotten better in the past year, GPUs (particularly ATI's) still keep outdoing each other in just how much power they can suck.


      You're talking about the high-end "do everything you can" GPUs... ATI is dominating the (discrete) mobile GPU industry because their mobile GPUs use so little power. Integrating (well) one of those into a CPU should still result in a low-power chip.
  • Yes but (Score:3, Insightful)

    by Ant P. ( 974313 ) on Thursday November 16, 2006 @05:09PM (#16875652)
    Will it run Linux less than half a year after it's obsoleted by the next version?
    • Re: (Score:1, Funny)

      by Anonymous Coward
      Yes, in all the VESA glory
  • Upgrades ? (Score:1, Insightful)

    by Anonymous Coward
    Let's say I buy Fusion. Later on NVIDIA brings a cool graphics card to market. Will I be able to use the NVIDIA graphics card with Fusion?
    • I don't see why not. A lot of modern systems have integrated graphics that automatically switch off or become secondary if an add-in video card is detected. No reason AMD couldn't do this in their own chips.
    • Re: (Score:2, Funny)

      by adsl ( 595429 )
      Or you could replace the whole Fusion chipset with the projected Nvidia chipset (also due late 2007), which attaches a CPU onto their graphics chipset.
  • by hirschma ( 187820 ) on Thursday November 16, 2006 @05:12PM (#16875732)
    Whoa. You're going to need a closed-source kernel driver to use your CPU now? They can eat me. The graphics driver situation is bad enough.

    This one is untouchable until they open up the graphics drivers - or goodbye AMD/ATI.

    jh
  • but... (Score:2, Interesting)

    by Hangin10 ( 704729 )
    does this mean ATI will be opening up its GPU programming specs, or merely what is being stated (that the graphics chip and CPU will share a die)?
    • Is ATI going to open up their specs so people can write open source drivers?

      ATI (before we were AMD) released CTM http://www.atitech.com/companyinfo/researcher/documents.html [atitech.com], which is the hardware specification for the pixel shaders on our graphics cards. The pixel shaders are probably the most complicated part of our chips and we released this because the GPGPU community wanted it. While I don't speak for AMD, I would not be surprised at all if a group serious about writing an open source AMD driv
  • Prior to the 486, the floating-point functions were on a separate chip from the processor. If the GPU is moving onto the processor now, what will be the next thing to get sucked in?

    • by milgr ( 726027 )
      There are currently not that many other components to get sucked in. Here is a list off the top of my head:
        Network processor
        Sound
        Video input processor
        USB (or whatever equivalent but newer technology)
        Disk controller
        Memory
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      RAM. Look at the Xenos GPU (Xbox 360) or the PS3 GPU. Both have the RAM soldered directly over the GPU package. They can't be in the same chip because of the different fabrication processes, but they can be glued together for higher overall speed. But upgrades will suck...
      • Re: (Score:3, Insightful)

        by rjstanford ( 69735 )
        One thing to consider is that right now it's getting pretty easy to have "enough" RAM for 99% of all users. I mean, if you got a new machine today that had 1.5-2.0GB in it, the odds of even wanting to upgrade would be slim to none. The fact is that most people live quite reasonably with 256-512MB right now, and will never upgrade. Note: most /. readers != most people. For modern machines, if you're not running anything more brutal than Office, having a gig permanently attached would probably make sense fo
      • by julesh ( 229690 )
        Mod parent up. The current bottleneck in processor design is getting data from the memory to the CPU fast enough, and it has been for *ages* -- when was the last time you had a machine where the RAM was fast enough for there to be no delay cycles, ever? The latest I know of was the BBC Model B: a machine that ran on a 2MHz 6502. While moving RAM to the processor would probably not *solve* this issue, it might be a significant improvement over the current state of affairs.
    • by mgblst ( 80109 )
      I predict the hard-drive?

      But seriously, memory would seem to be the next thing. They already have L1 and L2 caches; why not move all of memory on board? As long as CPUs keep shrinking, there is no reason not to do this. Sure, you can still have a memory bus to external memory if you want to upgrade.
  • this processor will just be an add-on on top of the vid card you would have, like 3DNow! it's just an add-on. maybe we can shift into 128-bit processors!
  • Maybe... (Score:5, Interesting)

    by MobyDisk ( 75490 ) on Thursday November 16, 2006 @05:20PM (#16875892) Homepage
    The article says that this might be attractive to businesses: I can see that since most businesses don't care about graphics. This is similar to businesses buying computers with cheap on-board video cards. But that means they will be profiting on the low-end. It seems like this is more of a boon for laptops and consoles: Currently, laptops with decent video cards are expensive and power-hungry. Same with consoles. But for mid-range and high-end systems, there must be a modular bus connecting these two parts since they are likely to evolve at different rates, and likely to be swapped-out individually.
    • by MikeFM ( 12491 )
      I don't know if that's really true anymore. When I upgrade either my CPU or GPU these days I usually end up upgrading both along with mobo and RAM. Unless you're making a very minor upgrade of either CPU or GPU, and who can afford that unless you're buying crappy outdated stuff anyway, you'll likely have a hard time not needing to upgrade your mobo in the process and if you do that you usually have to upgrade everything else. Just package everything together and make it really powerful so I won't have to up
  • Linux Drivers (Score:3, Interesting)

    by turgid ( 580780 ) on Thursday November 16, 2006 @05:24PM (#16875944) Journal

    I've been an nVidia advocate since 1999 when I bought a TNT2 Ultra for playing Quake III Arena under Linux on my (then) K6-2 400.

    I'm on my 4th nVidia graphics card, and I have 6 machines, all running Linux. One is a 10-year-old UltraSPARC, one has an ATI card.

    Despite slashbot rantings about the closed-source nVidia drivers, and despite my motley collection of Frankenstein hardware, I've never had a problem with the nVidia stuff. The ATI stuff is junk. The drivers are pathetic (open source) and the display is snowy, and the performance is rubbish.

    I hope AMD do something about the Linux driver situation.

    My next machine will be another AMD, this time with dual dual-core processors and I'll be doing my own Slackware port, but I'll be buying an nVidia graphics card.

    • Re: (Score:3, Informative)

      by Chris Burke ( 6130 )
      Despite slashbot rantings about the closed-source nVidia drivers, and despite my motley collection of Frankenstein hardware, I've never had a problem with the nVidia stuff. The ATI stuff is junk. The drivers are pathetic (open source) and the display is snowy, and the performance is rubbish.

      Well if you do 3D gaming on Linux, you're used to closed source drivers, since there hasn't really been another choice since the 3dfx Voodoo -- who won me over by supporting Linux, if not the Free Software philosophy beh
      • by Ant P. ( 974313 )
        I do 3D gaming just fine with a 9250 and the open driver. It's not noticeably worse than the FX5200 I had before it burned out.
        I'll be sticking with nVidia for the foreseeable future though; ATi is just not worth the risk on any OS.
    • The ATI stuff is junk. The drivers are pathetic (open source) and the display is snowy, and the performance is rubbish.


      I currently have an ATI Radeon 9200. The reason I went with it rather than a faster card is the open source driver for it. The games I play are emulated SNES, GTA III, GTA Vice City, and Enemy Territory. I haven't had any problems with it. Whenever I install Linux the card works accelerated out of the box.
  • Cool, AMD is about to MediaGX themselves.
  • the processor market still changes too rapidly for this kind of bonding.

    Do you really want to have to replace an entire system when you upgrade? You buy a Dell, a new game comes out 6 months later, and your system can't play it reasonably well.

    So then you either
    a) buy a new system
    or
    b) put in a video card and not use the one on the proc.

    When processors begin to peak, and each upgrade is basically a few ticks, then developers will have to create things for the systems that are out, not a system that will be out i
    • by turgid ( 580780 )

      Do you know about HyperTransport? Do you know how important multi-CPU AMD motherboards are about to become?

      While Intel's multi-core processors are choking on a single frontside bus, with an AMD system you just need to plug in another CPU, GPU, physics processor, vector processor or whatever and get more (not less) memory bandwidth per processor and a linear increase in processing power.

      By 2008, I expect 4-socket AMD motherboards will be commonplace amongst consumers, never mind enthusiasts.

      Intel will ha

    • Re: (Score:3, Interesting)

      by NSIM ( 953498 )

      Do you really want to have to replace an entire system when you upgrade? You buy a Dell, a new game comes out 6 months later, and your system can't play it reasonably well. So then you either a) buy a new system or b) put in a video card and not use the one on the proc.

      Integrating the GPU with the CPU will be about driving down cost and power consumption, not something that is usually a high-priority for folks that want to run the latest greatest games and get all the shiniest graphics. So, I'd be very surpri

      • by adsl ( 595429 )
        Graphics chipsets are often more complicated than the CPU. I agree about the lowering of cost and reducing power consumption. Nvidia themselves have a similar project with a hoped-for release in late 2007, and VIA are powering the "Walmart" laptop with remarkably low power consumption and reduced cost. Thus the market for a combined CPU/graphics chipset is likely to be in low-end desktops AND, most importantly, in LAPTOPS. It will be interesting to see what Intel have up their sleeves also.
      • by afidel ( 530433 )
        A machine I built recently is very power efficient (for something with a GPU) and runs the latest and greatest games. Athlon 64 X2 4600+ low power, SLI'd GeForce 7600GTs passively cooled, running in a case with 2x12cm fans, a 12cm power supply fan, and the 8cm CPU fan. Total max power draw between the two GPUs and the CPU is 102W; compare that to a late-model Pentium 4 at 130W! The thing is almost silent; the seek noise from the HDD is the loudest noise in the machine and that's 38dB.
    • by zenslug ( 542549 )
      Do you really want to have to replace an entire system when you upgrade?

      I don't upgrade so this would be nice, yes.

      You are a gamer and you are special. Most of the world isn't special and would like a cheaper machine to browse the web. You will buy a different system with an upgradable GPU. Or this new setup will be almost exactly like integrated graphics today which allows you to add your own killer card as a replacement.

  • by kimvette ( 919543 ) on Thursday November 16, 2006 @05:44PM (#16876326) Homepage Journal
    I'm going to ask:

    That's great and all, but does it run Linux?

    I'm not kidding, either. Is AMD going to force ATI to open up its specs and its drivers so that we can FINALLY get stable and FULLY functional drivers for Linux, or are they still going to be partially-implemented limited-function binary blobs where support for older-yet-still-in-distribution-channels products will be phased out in order to "encourage" (read: force) customers to upgrade to new hardware, discarding still-current computers?

    That is why I do not buy ATI products any more. They provide ZERO VIVO support in Linux. They phase out chip support in drivers even while the chips are still actively distributed. They do not maintain compatibility of older drivers to ensure they can be linked against the latest kernels.

    This is why I went Core 2 Duo for my new system and do not run AMD: their merger with ATI. My fear is that if ATI rubs off on AMD then support for AMD processors and chipsets will only get worse, not better.

    • Re: (Score:3, Insightful)

      by Chris Burke ( 6130 )
      This is why I went Core 2 Duo for my new system and do not run AMD - their merger with ATI. My fear is that if ATI rubs off on AMD then support for AMD processors and chipsets will only get worse, not better.

      It is pretty typical in a buyout like this for the larger company's culture to dominate the smaller one. While in many cases this is a bad thing as the smaller company has the more open culture, in this case it is the larger company, AMD, that is more open.

      It is ridiculous to think that support for AMD
    • Re: (Score:3, Informative)

      by asuffield ( 111848 )
      That is why I do not buy ATI products any more.


      So you use SiS chipsets then? They're the only manufacturer I can think of who still provide specs for their video chips (or do Intel still do that too?).

      Unfortunately we're currently stuck with a range of equally sucky choices. I tend to buy (older) ATI cards because at least they get reverse-engineered drivers, eventually.
      • I use Nvidia video cards now, because even though they're proprietary, they support old-to-ancient chipsets and actively maintain older drivers.

        I do not use SiS products as the failure rates I've seen for SiS are somewhere between horrible and abysmal.
  • GPU or GPGPU? (Score:2, Interesting)

    by tbcpp ( 797625 )
    From what I understand (and I could be wrong), AMD/ATI is aiming more at the GPGPU market. So we're talking more of a souped-up AltiVec-style processor in the CPU instead of a full-blown GPU. It sounds like they're simply adding a 64-pipeline vector processor to the existing x86-64 core. I'm not sure if this is a bad idea.

    I remember programming assembly graphics code in BASIC back in the day. You would set the VGA card to mode 13h and then write to...what was it now...0xa00? That's probably wrong. Anyway, whatev
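    For the record, the trick being half-remembered is VGA mode 13h: 320x200 at 256 colors, with the framebuffer at segment 0xA000 (linear 0xA0000), one byte per pixel. The sketch below assumes a 16-bit DOS compiler such as Turbo C or Borland C, whose <dos.h> and <conio.h> provide int86(), MK_FP(), and getch(); it will not build on a modern 32/64-bit toolchain.

      #include <dos.h>
      #include <conio.h>

      /* BIOS int 10h, function 00h: set video mode. */
      static void set_video_mode(unsigned char mode)
      {
          union REGS regs;
          regs.h.ah = 0x00;
          regs.h.al = mode;
          int86(0x10, &regs, &regs);
      }

      int main(void)
      {
          unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0x0000);
          int x, y;

          set_video_mode(0x13);              /* 320x200, 256 colors */

          for (y = 0; y < 200; y++)          /* fill the screen with a gradient */
              for (x = 0; x < 320; x++)
                  vga[y * 320 + x] = (unsigned char)((x + y) & 0xFF);

          getch();                           /* wait for a keypress */
          set_video_mode(0x03);              /* back to 80x25 text mode */
          return 0;
      }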
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      Exactly. This could be the move that makes physics acceleration ubiquitous. Sure, el cheapo systems can be built that utilize the on-package or on-die GPU capabilities for low-end graphics. But a higher-end (and gamer-relevant) use would be to have the integrated "GPU" doing physics calculations while an add-in board continues handling the traditional GPU tasks. This would be *far* superior to the current add-in board approach, because the tight CPU-physics integration would allow for some damned sweet
  • Not what you think (Score:2, Interesting)

    by Anonymous Coward
    My people are reading this as an integrated GPU and CPU. I don't see it that way. I see it as adding a generic vector processor to the CPU. Similar to the Cell processor and similar to future plans Intel has described. Vector processors are similar to SSE, 3DNow, etc. They are SIMD processors that can execute very regular mathematical computations (Video and audio encoding/decoding) VERY quickly, but aren't much good for generic algorithms.
    • Re: (Score:2, Funny)

      by organgtool ( 966989 )
      My people are reading this as an integrated GPU and CPU.
      So let me get this straight: you own slaves AND you force them to read Slashdot. You, sir, should be punished for crimes against humanity.
  • by Vellmont ( 569020 ) on Thursday November 16, 2006 @05:57PM (#16876544) Homepage
    The people claiming this will fail all seem to miss the market this is aimed at. It's obviously not intended to compete with the high-end, or even middle-of-the-road, graphics processors. Those boards require gobs of VERY fast video memory. My guess is this thing is aimed at a space between on-board video (which is really just a 2D chip) and the full 3D graphics card. Anyone buying this has no intention of buying a super-duper

    With Vista coming out soon, PC makers are going to want a low-cost 3D-accelerated solution to be able to run some (or maybe all) of the eye candy that comes with Vista.
  • by Anonymous Coward
    I'll buy this if they provide free drivers; I won't buy it if they don't. Vista's piggish graphics will surely push all GPUs to new performance levels. I don't care about on-chip integration nearly as much as I care about avoiding the need to use binary blobs in my free OS.
  • by Wilson_6500 ( 896824 ) on Thursday November 16, 2006 @06:12PM (#16876768)
    Let's hope this fusion doesn't bomb.
  • AMD will be making razors and shave gel? Sweet! How many blades, 4, 5 or scalable on demand?
  • Great idea on paper. It boils down to personnel though. You're talking about fusing development teams with experience. Will they work together well? Or will the elevator assets go work for someone else, leaving the understudies to bicker about with an ignoramus boss unable to figure out which engineers are clever and which are just suckups?

    I'm not saying it won't work; I'm saying that fusing development teams with expertise is a lot different than fusing different components onto the same board. And that,
  • I know the trend is single-socket multi-core, but with the GPU embedded: dual and quad sockets instead of SLI!
  • This will lead to a whole new world of disgustingly bad graphics chips eating system RAM and claiming to have "256MB" or whatever but really having little or none and just munching on (slow) system memory as needed... and that never works as well as it should.
  • How about getting those lightning-fast matrix operations onto the CPU? I always hear about people building application-specific tweaks by reprogramming their algorithm into a shader language. I imagine that there is far more fertile soil for innovation here than some lame combined CPU/GPU.
  • They are also looking at this as well. Maybe even some kind of super CrossFire/SLI, with built-in CPU graphics processing plus a video card in the slot with its own graphics processor and RAM; you may also be able to link 2 video cards as well.

    AMD 4x4 systems may be able to have 4 video cards + 2 CPUs with graphics processing in them.
