Graphics Software

ATi's Multi-GPU CrossFire Graphics Card Unveiled 207

MojoDog writes "ATi has unveiled its new multi-GPU technology, dubbed 'CrossFire,' today at the Computex show in Taiwan. HotHardware has a full preview of the technology, which requires both a Radeon Xpress 200 CrossFire based motherboard and a CrossFire graphics card, in addition to another Radeon X800 series PCI Express card, for dual 3D graphics processing with three available types of load balancing. CrossFire supports Split-Screen, Alternate Frame Rendering, and SuperTiling load balancing between the GPUs."
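The three load-balancing modes divide the rendering work in quite different ways. A minimal sketch of the idea (not ATI's actual scheduler; the tile size and split ratio here are illustrative assumptions):

```python
# Illustrative sketch of how two GPUs might divide work under each
# CrossFire load-balancing mode. All parameters are assumptions.

def split_screen(height, balance=0.5):
    """Split-Screen (scissor): GPU 0 renders the top band of scanlines,
    GPU 1 the bottom; `balance` can shift the cut toward the busier half."""
    cut = int(height * balance)
    return {0: range(0, cut), 1: range(cut, height)}

def alternate_frame(frame_number):
    """Alternate Frame Rendering: whole frames alternate between the GPUs."""
    return frame_number % 2

def supertile_owner(x, y, tile=32):
    """SuperTiling: the screen is carved into small square tiles assigned
    in a checkerboard pattern, so both GPUs see similar scene complexity."""
    return ((x // tile) + (y // tile)) % 2
```

For example, `split_screen(1200)` hands scanlines 0-599 to one GPU and 600-1199 to the other, while `supertile_owner(32, 0)` shows the tile to the right of the origin belongs to the second GPU.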
This discussion has been archived. No new comments can be posted.

ATi's Multi-GPU CrossFire Graphics Card Unveiled

Comments Filter:
  • by Anonymous Coward on Tuesday May 31, 2005 @12:12PM (#12684766)
    http://www.anandtech.com/video/showdoc.aspx?i=2432 [anandtech.com]

Just thought it would be good to add variety.
    • The Anandtech article brings up one important point that everybody else here seems to miss; CrossFire is not only about multi-GPU rendering.

      CrossFire contains a rather neat implementation of multi-GPU antialiasing that provides double the samples compared to single-card anti-aliasing. This works on all games, even those that don't work with normal multi-GPU acceleration, or those that don't see any benefit.

      The new CrossFire AA features not only normal AA (8x or 12x) but the first implementation of super-s
  • Awesome (Score:4, Insightful)

    by Keystroker ( 884765 ) on Tuesday May 31, 2005 @12:12PM (#12684773)
This is just in time. I'm sure many next-gen games coming out will be ported over to PC. This sort of begs the question: slowly, the computer is becoming an all-in-one console. Next-gen consoles may soon become useless.

    PS- ATI, we need Linux drivers!
    • How funny. (Score:3, Insightful)

Slowly, the computer is becoming an all-in-one console. Next-gen consoles may soon become useless.

      The same was said of the PC 10 years ago.
Slowly, the computer is becoming an all-in-one console. Next-gen consoles may soon become useless.

      So all the new consoles are announced and everyone thinks PC gaming is doomed.

New video cards are announced and people think console gaming is doomed.

      Which is it?! TELL ME WHAT TO THINK!!!

In fact there will always be consoles/dedicated gaming machines AND a market for games played on PCs. Wow, that was hard.
      • Which is it?! TELL ME WHAT TO THINK!!!

        All you have to do is look at the numbers. Console game sales (software-wise) are skyrocketing while PC-based games are creeping along. Look to IDG [idg.com] for the hard numbers.

        Also, all the next-gen consoles are PPC based and use ATi graphics. You're looking in the wrong direction if you think PCs are going to benefit from this, wrong architecture.

      • Re:Awesome (Score:3, Insightful)

        by AviLazar ( 741826 )
        Another few reasons why there will be both:

Consoles are more user friendly - virtually no crashing...requiring no loading or advanced configuration

        PCs are more customizable, can do other things (i.e. you can type your homework on it), and are not so locked up.


Some people, also, cannot afford both. Maybe someone can afford to spend $1,200 on a bangin' gaming machine...but they may not be able to afford that $1,200 piece of hardware and an additional $400-600 console.

        There will be a market for both
    • Re:Awesome (Score:5, Insightful)

      by fr0dicus ( 641320 ) on Tuesday May 31, 2005 @12:21PM (#12684860) Journal
      Meanwhile, on planet Earth, the PC gaming market shrinks every year, as even Microsoft shift focus to games consoles.
      • Meanwhile, on planet Earth, the PC gaming market shrinks every year, as even Microsoft shift focus to games consoles.

        While that's true, graphic-intensive simulation (military certainly) is not moving to gaming consoles.

      • While I'm not saying you're wrong, would either side please quote some numbers?
        • Re:Awesome (Score:3, Informative)

          by bluk ( 791364 )
http://www.gamespot.com/news/2005/05/26/news_6126552.html [gamespot.com]

          GameSpot's quarterly report said PC sales were down, and that they only account for 4% of sales. You could argue that PC sales remained the same and console related sales skyrocketed, but this is the tail end of a console generation when people are usually saving up money for the next console.

          Since GameSpot doesn't sell PCs that I know of and console hardware sales are around 20% from that same report, you can venture that roughly 70% of sales a
      • Re:Awesome (Score:3, Insightful)

        by Espectr0 ( 577637 )
        Meanwhile, on planet Earth, the PC gaming market shrinks every year, as even Microsoft shift focus to games consoles.


        And then PC users get only console ports, which are badly done, therefore no one wants to buy PC games, making the problem worse every year.
      • They aren't shifting focus just expanding the horizons. With the cash cow that is Office it would take a lot more for them to shift focus.
    • Re:Awesome (Score:5, Interesting)

      by NanoGator ( 522640 ) on Tuesday May 31, 2005 @12:30PM (#12684947) Homepage Journal
"I'm sure many next-gen games coming out will be ported over to PC. This sort of begs the question: slowly, the computer is becoming an all-in-one console. Next-gen consoles may soon become useless."

      The opposite could just as easily be said. Next gen systems are rivaling PC's. Slowly, PC games will move over to consoles.

Frankly, either prediction is silly. The difference between PCs and consoles isn't just graphics power; there is a set of trade-offs for either platform. The PC, for example, requires up-to-date hardware, doesn't have a standard controller, and often requires a lot of configuration to get going. The game console, however, has standard hardware, no installation BS, games designed to play on the lowest common denominator, and a multi-purpose controller. One you'll happily play Quake on, the other you'll happily play Zelda on.

      Me personally, I'm not thrilled with PC gaming anymore. Too much hassle with too little payoff. Maybe I'm just busier than I used to be, but I like the idea of a $200 box I can just hook up to the TV, pop a disc in, and play.
      • The really big advantage to a PC is the controls. Quite why no-one makes console games that can talk to USB keyboard & mouse, I don't know; FPSs on a console drive me nuts, just can't get used to the control system.

There are other issues, like consoles having substantially less memory, which makes it harder for game designers to put in large maps (I'm really puzzled the next-gen consoles aren't going for more memory; 512MB is okay now, but in 5 years' time?).
      • Spot on!
I'm just waiting for some typically PC-only games to come to consoles, such as Age of Mythology etc...
They require a mouse (done with AoE for PS2) AND a high-res display.
I hope the new high-res capabilities of the next-gen consoles will be used for some computer-like experience.
      • Re:Awesome (Score:3, Interesting)

        by Dragoon412 ( 648209 )
I don't think the assumption that consoles will soon kill off PC gaming is all that far-fetched. Why?

        (And keep in mind, I'm a very staunch PC gamer that's had and subsequently traded in all 3 current consoles because I find their games to be so shallow and short-lived).

        The PC really has two advantages over consoles, and neither is specific to the PC itself: the control scheme (I love gamepads and all, but they simply can't compete with the level of precision and complexity a keyboard and mouse offer
The game console, however, has standard hardware, no installation BS, games designed to play on the lowest common denominator, and a multi-purpose controller. One you'll happily play Quake on, the other you'll happily play Zelda on.

        What? Sorry, you can play Quake on your multi-purpose controller, if you want to have one hand figuratively tied behind your back. I'll take my keyboard and mouse combination any day of the week.
        • Uh, yeah, that was the point. I think you misunderstood what I said. You wouldn't want to play Zelda with a kb and mouse anymore than you'd want to play Quake4 on a controller.
      • Re:Awesome (Score:3, Insightful)

        by 0111 1110 ( 518466 )
You forgot to mention that most console games are still targeted at child gamers. I realize that thirty-somethings like myself are a minority in the gaming market, but for us hardware comparisons are largely irrelevant. Even if/when consoles add QWERTY keyboards and monitor connections, there really is (almost) nothing for us to play.

        I already went through that arcade game phase with Atari 2600 and Atari 400/800 (or Apple II) games in the early 80s and again with IBM PC first person shooter games (i.e. Wol
    • "Slowly, the computer is becoming an all in one console."

      Not so slowly, the console is becoming an all-in-one computer. Think: media player, internet browser, etc.
    • >Next gen consoles may soon become useles.

      not as long as an entire multimedia and internet capable games console costs half the price of just the graphics card in a high-end PC.
    • The next gen consoles have a substantial advantage over PCs because all their hardware is optimized specifically for games in ways that are mutually exclusive with good PC performance.

      However, I wouldn't be all that surprised if graphics cards started getting console subsystems entirely on the card.
    • > Slowly, the computer is becoming an all in one console.

Not if ATI are involved with it. Seriously. The result will be truly awesome hardware with truly appalling drivers. That, a gaming console, does not make! Great hardware is useless if the drivers are flaky. I have owned two Radeons, and have had a few years now of driver hell and will never buy another ATI product again - no matter how tempting. The Windows XP (and 98/ME for that matter) Catalyst drivers and MMC software suite is an embarrass
    • ... are that many of them are pretty dern alike, and there are limited genres within. Yet, this is why they sell. People like buying every GTA game, every run-and-shoot platformer, all the nearly identical FPS games. Until there start to be strategy games (I love Galactic Civilization and Sim City 4), moddable games, online games like Tribes/Tribes 2/Battlefield/BF2/Others that have a real good amount of depth (and should be possible with the increased console RAM), and etc., several PC gamers will stubborn
  • by Anonymous Coward on Tuesday May 31, 2005 @12:14PM (#12684793)
    Before you waste your time on the same old tired "who needs it" posts, here's the answer:

    Obviously not you.

    Now stfu and be happy.
Yes, I'm coming across as a troll here, but I'm still pissed about having bought a 9200 and needing to fight it to get it working reasonably under Linux. TV-out should not have taken a day's worth of work. Until ATI gets it together and starts releasing good drivers for something other than Windows, my cards will be nVidia.
    • ATI's *nix drivers are getting better (but still suck), but you are running a 9200, a POS card. Doubt you are doing much graphic intensive work on that card. So you really need to ask yourself, do you need this kind of power? If you want to upgrade to this system, be prepared to shell out the big bucks. I doubt it will support *nix for quite a while.
for something other than Windows

      Erm. So, ATI finally has their drivers working well for Windows?

      Honestly, how many of you actually believe ATI is capable of making multiple GPUs work reliably? And on Linux?

      Go ahead ATI fanbois, I can spare the karma.
  • by guyfromindia ( 812078 ) on Tuesday May 31, 2005 @12:15PM (#12684798) Homepage
    ...32 graphic chips!!!
From TomsHardware http://www.tomshardware.com/hardnews/20050526_155843.html [tomshardware.com]

    I will live on bread and water from now on to afford a system with this... in the far future! :-)
  • by jellomizer ( 103300 ) * on Tuesday May 31, 2005 @12:15PM (#12684802)
At a speed where it can render the entire Earth at the string-theory level at 80 FPS?
    • Never, if that uber computer remains on earth.
Hmm, at 32-bit color that would be 10^19 yottabits of video RAM at least. I won't dare to calculate the MHz, or probably YHz, needed to get the work done. Nor will I even attempt to figure out how much heat would be dissipated in the calculation. Needless to say, with our 2005 knowledge, it probably can't be done.
    • Sounds about right, yeah.

      Sort of low FPS though. . . :)

    • Not really. When it can render some 8 square kilometers of real terrain at 60FPS with resolution sufficient to show objects the size of a single pixel of some 1600x1200 display. Say, a field of grass in the wind. To the horizon. Each visible straw composed of some 20 polygons.
      We don't need to get beyond what human eyes can see.
      • by FauxPasIII ( 75900 ) on Tuesday May 31, 2005 @01:26PM (#12685483)
> We don't need to get beyond what human eyes can see.

Tell it to the people who insist on a sustained 200fps whilst their monitors are retracing at 85Hz.
>>>> We don't need to get beyond what human eyes can see.

>> Tell it to the people who insist on a sustained 200fps whilst their monitors are retracing at 85Hz.

Me: It's not about the retrace or pure benchmark "framerate"; it's about the framerate spikes, constantly fluctuating depending on what is happening in the game, that occur while playing fully loaded with big battles and beautiful models, textures and environments. In my opinion this is because games, gamin
    • Comment removed based on user account deletion
    • heh I don't think string theory is in vogue any more...

      And yeah I know you didn't want a real answer.

      You're getting it anyways.

      Simple fact of computation; in order to simulate anything in real time, you need to have more complexity than the elements you are simulating. This complexity directly correlates to component count.

      Build a GPU the size of the solar system and it might be doable...

      But then again a GPU the size of the solar system would have so many lightspeed delays that you couldn't get a dece
At a speed where it can render the entire Earth at the string-theory level at 80 FPS?

      Only then will we find out the question to which the answer is 42.

      Phillip.
    • by SamSim ( 630795 ) on Tuesday May 31, 2005 @02:13PM (#12685947) Homepage Journal

      I've got a simulator running which renders the entire Earth at string theory level at 10^34 FPS.

      Unfortunately it's in use at the moment.

When will a GPU be good enough, at a speed where it can render the entire Earth at the string-theory level at 80 FPS?

      Sometime next Thursday, if you believe the marketing departments of Sony, nVidia, Microsoft or ATi.

      -Charles
  • by IronChefMorimoto ( 691038 ) on Tuesday May 31, 2005 @12:20PM (#12684852)
HardOCP (http://www.hardocp.com/article.html?art=Nzc4 [hardocp.com]) also has a decent preview. If you look down the list of the various news items for today, the [H] has included links to other previews. Also, they have some photographs from Computex in Taipei from this week.

    I skimmed both the Anandtech and HardOCP articles, and the basic gist about ATI's "SLI" is:

    - needs an ATI chipset (the 200 -- for both Intel and AMD right now)
    - "SLI" connector is external via some sort of weird DVI dongle
    - uses one (1) existing X800 or X850 flavor card + a special CrossFire edition of same card models = no real need to get TWO CrossFire cards at one time if you already have the above models

Looks like I'm gonna need a monster case to ever be able to do this setup (ATI's demos at Computex take up 4 friggin' slots on the back of a case).

    IronChefMorimoto
You might also have noticed the content of all the sites is nearly identical. Just a rewrite of the ATI press kit, I suppose. They all lack benchmarks (the whole purpose of SLI is speed).

      Here is a list of some more sites:
      beyond3d [beyond3d.com]
      techreport [techreport.com]
tweakers.net [tweakers.net] (Dutch, but the content is identical to the other sites)
      the faq from ati [ati.com]

Next in line: these same sites (I left AnandTech and Tom's Hardware out) will bring the benchmarks, all on the same day the NDA on the benchmarks expires.
  • HyperComputer (Score:3, Interesting)

    by Doc Ruby ( 173196 ) on Tuesday May 31, 2005 @12:23PM (#12684878) Homepage Journal
Now we've got load-balancing GPUs. Which means cheap supercomputers, on a PCI LAN, in cheap P4 clients running the OS of our choice. Everyone overclocking your Pentium for more power: GPGPU [gpgpu.org] is the cheapest way to get the fastest PC. First demo of a pool of parallel LAME processes running on a stacked beast, let me know.
    • Even better: in addition to the PCI Express bus for internode signaling, the DVI cable is available for throughput, without clogging the PC host apps' IO on the PCI bus. These little monsters are screaming for GP parallel processing.
    • There are libraries for some of that here... http://graphics.stanford.edu/projects/brookgpu/ [stanford.edu] and here... http://libsh.org/ [libsh.org]

I had a play with BrookGPU a while back, running parallel test jobs on an AGP GF4 GPU and a PCI GFFX GPU. It worked well enough, but that was on the PCI and AGP buses, and it killed the CPU trying to keep up with the GPUs. Probably needed bigger datasets to keep the GPUs busy or something...

Anyway, if there's an easy way to load-share across these things using a single graphics context, I reckon
The question for BrookGPU, Sh, and other GPGPU languages is how "GP" they are - how closely they map the full C++ (or C) they accept onto the underlying shader hardware. Once the initial compile of the LAME source, using the GP language/library, fails, what is required to debug and run? Is it just syntax, call graphs, linguistic changes? Or does the algorithm need changing to run at all, let alone be optimized for the GPU architecture?

        Those languages have been around a while, but GPGPU projects don't seem to u
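For readers who haven't tried these languages: BrookGPU and Sh express work as side-effect-free kernels mapped over streams, which is exactly the property that makes it easy to split a job across multiple GPUs. A rough Python analogue of the model (the names here are illustrative, not the actual Brook API):

```python
# Toy analogue of the stream-kernel model: a pure function applied
# elementwise over input streams. Because every element is independent,
# a runtime could partition the streams across any number of GPUs.

def run_kernel(kernel, *streams):
    """Apply `kernel` to corresponding elements of each input stream."""
    return [kernel(*elems) for elems in zip(*streams)]

def saxpy(a, x):
    # y = 2*x + a, computed one element at a time with no shared state
    return 2.0 * x + a

print(run_kernel(saxpy, [1.0, 2.0], [10.0, 20.0]))  # [21.0, 42.0]
```

The restriction to pure, per-element functions is the whole trade-off: it buys automatic parallelism, but it's also why porting arbitrary C code (like LAME) usually means restructuring the algorithm, not just fixing syntax.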
That ATI and Nvidia have cards in production that are twice as powerful as anything currently out, but are holding back on any new innovation, probably until well after the consoles have sold.

ATI and Nvidia are selling out PC gamers in hopes of pleasing the console makers, so they can make even more money off our backs.
    • I'm going to make a totally baseless claim here and guess that those GPUs aren't anywhere near the production stage yet and those console makers are full of shit and showing us graphics either pre-rendered, or running off a very experimental prototype card.
      • Re:How do you feel? (Score:3, Interesting)

        by Xugumad ( 39311 )
Certainly, the X-Box 360 demos were all run on PowerMacs with X800 cards [anandtech.com], not any kind of next-gen hardware.

ATi did actually have a demo of their next-gen R520 [anandtech.com] at E3, which should launch later this year (a time frame that at worst puts it in line with the X-Box 360). No news from Nvidia on the G70, from what I can tell, but I'd imagine they'll try to launch around the same time as ATi.

        Anyway, if you've been following the graphics card market (which you really should if you're thinking of buying a mu
    • So I don't follow these things too closely, but the way I heard it, the ATI card that the XBox 360 and Nintendo Revolution will be using derivatives of started its life as a PC card, but got the axe when it proved to be cost-inefficient, and wouldn't be hitting market at all at this point if it weren't for the consoles; and the NVidia chippy thing that the PS3 GPU is based off of actually IS outright coming to PCs as a retail product and will apparently be shown at Computex or whatever it's called this week
  • I have yet to use any video hardware that required an external dongle and still got decent throughput to the monitor. There was always some image degradation due to the passthrough. No matter how non-shoddy it was.
Image degradation?

Ever heard of what 'DVI' stands for?

      Hint; the 'D' stands for 'Digital'. Want to explain to me how a digital signal 'degrades' in the cable?
      • Hint; the 'D' stands for 'Digital'. Want to explain to me how a digital signal 'degrades' in the cable?


        Of course digital signals can degrade. This is precisely why the guy at Best Buy told me to get the more expensive shielded optical cable for my DVD player.
In my experience DVI is a bit iffy at 1600x1200 at 60Hz (which is its max). I've found that some monitors are more likely than others to suffer dropouts (often seen as regions of flashing pixels, or green pixels). On a professional DVI capture board with a passthrough that I use at work, 1600x1200 is not usable because of all the dropouts.

        With all the advancements we've seen in graphics boards, I'm disappointed screen resolutions haven't gone up very much - the upper end of mainstream has hovered around 16

I wouldn't think this would be a problem with DVI. I know I hated my Voodoo2 for this reason, but that was analog VGA -> digital -> analog VGA. Is there something I'm missing that would affect a digital -> digital -> digital solution?
This is an all-digital DVI-I goofy external dongle, so theoretically there should be no degradation (unlike VGA goofy external dongles, which should be taken outside and shot).
  • by Zed2K ( 313037 ) on Tuesday May 31, 2005 @12:42PM (#12685056)
I'm holding out for version 2. I just don't see why you need an ATI or nVidia chipset for this stuff. If you have a motherboard with two x16 PCI Express slots next to each other, then just sell the connector bracket that includes the necessary logic. Also, this current generation drops the x16 slots down to two x8 slots. The next gen should give you two full-speed x16 slots, if nVidia follows through.

    I refuse to get locked into either an ATI implementation or a Nvidia implementation. I want a MB with a chipset that I select to work with either one. Then in the future I can upgrade the 2 video cards to a different brand without having to change out everything else.
Read the marketing spam a bit more carefully.

Nowhere is it stated that an ATI chipset motherboard is *required*. Instead, the term 'optimal' is used.

Translation from marketing bullshit: ATI Xpress 200 CrossFire = clone of NForce 4 SLI. Unless they want to shoot themselves in the foot by restricting it in the drivers, their implementation on the motherboard side of things seems identical. I do hear that nVidia isn't allowing SLI in their drivers with anything except the nVidia NForce 4 chipset (tho 'support for Intel chipsets'
      • And "optimal" means if it doesn't work on a competitors board then we won't work as hard to fix the drivers to make it work. Aka, buy it from us to actually get it working to its full potential.
ASUS, DFI, MSI etc. are not competitors of ATI. They are important partners. I'm quite sure ATI will lick ASUS's boots with whatever fix is required so that ASUS customers stay happy and ASUS sells lots of ASUS-branded CrossFire cards to the *existing installed base* of NF4 SLI + X800/X850 users. Sure, they'll tell the clueless people that a new ATI chipset motherboard is the 'optimal' solution, but pissing off their prime customer base of technically savvy too-rich ubergamers who already forked out a ton for
  • that they didn't bench Doom3?

The game itself might not have been as exciting as it first seemed, but the engine surely is. Quake 4, Enemy Territory: Quake Wars, and I'm sure some other games are also based on Doom 3.
  • Comment removed based on user account deletion
  • ...ATI's forthcoming R520, with hardware H.264 codec support [xbitlabs.com].

    Imagine a Mac mini or laptop with that chipset...it will enable HD playback on a lot of hardware that wouldn't otherwise support it.
  • by Jarnis ( 266190 ) on Tuesday May 31, 2005 @12:44PM (#12685073)
    "which requires both a Radeon Xpress 200 CrossFire based motherboard and a CrossFire graphics card"

    Wrong. Instead they stated that the 'optimum' platform is the Xpress 200 CrossFire.

However, reading between the lines of the marketing bullshit, you can clearly see that the motherboard is just a dupe of the NForce4 SLI (and of a similar Intel chipset coming up). Exact same PCIe setup. So it's almost certain that CrossFire will run just fine on nVidia-chipset SLI motherboards.

I doubt they'd commit commercial suicide by preventing it on the driver side. Today ATI has 0 SLI boards out. Nvidia has a gazillion - many of which are currently running X800/X850 cards. NForce4 was the first working PCIe AMD chipset, so many bought it - even the more expensive A8N-SLI or similar from other manufacturers, because nothing else was available at the time. Then they noticed how sucky the 6800GT/Ultra drivers currently are (the stuutttteeerr bug in EQ2 comes to mind) and decided to fill the board with a top-of-the-line ATI card.

Such people are the PRIME candidates for forking out an extra $500+ for a CrossFire card, and I'm quite sure that they'll want the money from these people WITHOUT forcing upon them a crappy, unproven ATI-chipset-based motherboard.

Now, I do admit that ATI has been very elusive about this in their marketing material (ahem, I mean 'exclusive previews'), but if you go over it all, nowhere does it say the thing *requires* an ATI chipset, and I'm quite sure that detail is missing for a very good reason - they are late to the party on the motherboard side, and their system is exactly the same (two x16 slots, running in x8 mode), so doing it any other way would be just silly.

When they finally write DECENT drivers, then I will be impressed. As it is, the official drivers are buggy, and dual-monitor support is absolute crap compared to nVidia's drivers.

Let's not talk about load balancing between cards; ATi can't even get scaling one desktop over 2 monitors right. And that's if you can get monitor #2 to be detected, which is still a hit-or-miss affair with the official drivers.

  • Wasted space (Score:2, Interesting)

    by oskard ( 715652 )
They sell a motherboard that is to be used with a new technology. They also include PCI slots for good measure. The damn video cards completely cover the 2 PCI slots, so why are they there in the first place?
  • Crossfire
    CROSSFIRYAAAAAHHHHHH

    (reference to an old commercial for those who don't know)
  • Re: (Score:2, Funny)

    Comment removed based on user account deletion
    • This was on Final Jeopardy last night...

      What is the ATi Rage Fury MAXX card?

      I've been an ATi user for a long time but they sometimes do the dumbest things. I remember in '94 when their slogan was "Perfecting the PC"; they were shipping cards with a sticker that had obviously been typoed at the printers. It said:

      ATi: Partecting the PC.

      I had that stuck on my monitor at work for the longest time.
  • ATi has traditionally had poor OpenGL performance, now this:
    "It has come to our attention that the "small number of applications" for which Supertiling does not work includes all OpenGL based titles."

    I wonder how much ATi's cosy relationship with MSFT has to do with this?
