
'SLI On A Stick' Reviewed

Bender writes "What would happen if you took NVIDIA's multi-GPU teaming capability, SLI, and stuck it onto a single graphics card? Probably something like the GeForce 7950 GX2, a 'single' video card with dual printed circuit boards, dual graphics processors, dual 512MB memory banks, and nearly twice the performance of any other 'single' video card. Add two of these to a system, and you've got the truly extreme possibility of Quad SLI. We've seen early versions of these things benchmarked before, but the latest revision of this card is smaller, draws less power than a single-GPU Radeon X1900 XTX, and is now selling to the public."
  • Wow (Score:5, Insightful)

    by McGiraf ( 196030 ) on Monday June 05, 2006 @10:13AM (#15472016)
    Which game do you need to run to take advantage of the equivalent of 4 graphics cards?
  • What about... (Score:2, Insightful)

    by exit3219 ( 946049 ) on Monday June 05, 2006 @10:14AM (#15472032) Homepage
    4 cards with 2 dual-core, double-the-cache, twice-the-speed GPUs each? Is that what the future holds for us?
  • Re:Bleugh (Score:5, Insightful)

    by mrchaotica ( 681592 ) * on Monday June 05, 2006 @10:23AM (#15472091)
    What would you prefer -- that hardware manufacturers artificially held back new technology?
  • by Coopjust ( 872796 ) on Monday June 05, 2006 @10:23AM (#15472092)
    The review states:

    Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.

    So they're going to alienate the majority of the market that would spend the money on a Quad SLI setup, keeping it exclusive to system builders for whatever period of time.

    Seems like a bad business decision to me, at least until (and unless) NVIDIA comes to its senses.
  • by LordKazan ( 558383 ) on Monday June 05, 2006 @10:40AM (#15472199) Homepage Journal
    I'm probably going to lose even more karma for posting with that title and subject, but I'm on a karma--; roll lately.

    Graphics card innovation for the past several months/year with SLI seems to me to be mostly "I have a dual SLI system!", "Yeah? Well I have a QUAD SLI system!" - so much performance going unused that it's pointless. Furthermore, for the price of one of the brand-new cards in the article, I can build a decent gaming computer or an HDTV MythTV box.

    I would rather spend $600 on much more useful things that would see use right now. On Pricewatch, the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 256MB PCI Express, and GeForce 6600 GT 256MB PCI-E.
  • Re:Wow (Score:5, Insightful)

    by Tim C ( 15259 ) on Monday June 05, 2006 @10:41AM (#15472210)
    Today, none (unless you want to run something like Doom 3 or Half Life 2 with all the options turned up to max and at an insane resolution).

    Tomorrow, who knows? I remember a time when a TNT2 Ultra was considered overkill, now you can get more powerful GPUs in mobile phones.
  • by TheSkepticalOptimist ( 898384 ) on Monday June 05, 2006 @10:45AM (#15472235)
    Like the original dual Voodoo cards, running multiple video cards is just one of those things that keeps going out of style (but, like old fads, makes its appearance every decade or two).

    The cost to implement and manufacture multiple video cards is ridiculous. Who honestly would spend $1400 just to have two video cards, and then get at most a 20% performance improvement?

    With the current trend toward multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would make the GPU dual-core, or simply put two GPUs on the same card.

    It just makes sense to keep a video card as a single card. You don't have to duplicate the production costs and all the other components that are wasted in a dual-card configuration, and you don't have to duplicate the bus technology on the motherboard in order to support dual video cards. Overall, this will be a much cheaper configuration that will actually bring high-performance video technology into the realm of being practical.

    Eventually, 4-way GPU cards will be released, and eventually NVIDIA and/or ATI will start making their GPUs dual-core; those spending money on expensive dual or even quad SLI configurations will have wasted a bunch of money.
  • Re:Bleugh (Score:3, Insightful)

    by kfg ( 145172 ) on Monday June 05, 2006 @10:55AM (#15472309)
    Unless you spend several hundred dollars every few years, you get way behind the curve.

    What's wrong with staying way behind the curve? It's the same tech, the same games, the same everything over time, except that you get those who think there is some important value to being at the leading edge of the curve to finance your gaming for you.

    Your problem isn't tech, or money... it's envy.

    Remember, the best ride is on the face of the wave.

  • Getting by (Score:3, Insightful)

    by Neo-Rio-101 ( 700494 ) on Monday June 05, 2006 @11:10AM (#15472420)
    I'm still getting by with my ATI RADEON 9700 PRO. Still plays just about anything I can throw at it. Oblivion gives it a hard time, but it's still adequately playable.

    I'm going to hold off as long as possible, until the card can't play the latest games, at which point I may get one of these Quad SLI setups. By that time, we'll have DDR3 memory and quad-core CPUs too.
  • Re:Wow (Score:4, Insightful)

    by matt328 ( 916281 ) on Monday June 05, 2006 @11:29AM (#15472586)
    The one where you win by claiming higher frame rate than your peers.

    As an aside: it doesn't matter how long you've been playing a certain fps, your eyes have not mutated to give you the ability to discern a difference between 400 and 405 frames per second.
  • Re:Bleugh (Score:2, Insightful)

    by kfg ( 145172 ) on Monday June 05, 2006 @11:34AM (#15472637)
    I'm sorry, you'll have to come up with a car analogy.

    The best value in a car is a two-year-old used one, third year of the model run, but avoid the models favored by teenage street racers. Those are innately overpriced for what you get, and no matter how shiny the paint, the internals have had the shit beat out of them.

  • I just bought the budget edition of 'Deus Ex' the other day. What I really like about it is that I needn't think twice about whether it will run smoothly or not. I have an Athlon XP 2100+ and a GeForce 4 Ti 4-something; I can crank the graphics up to full and needn't worry about lag or anything.
    That's always the more fun way to go IMHO.
  • by Anonymous Coward on Monday June 05, 2006 @11:46AM (#15472762)
    This card is not about you. It's about a) the rich, for whom $10000 for a system is as much pocket change as a $50 card for us, b) the computation-on-GPU people and most importantly c) developers whose job depends on having 2008's hardware in their development systems right now.

    You've also got to admit that if you had the money you'd buy one just to tell the crazed Sony fanboys "my PC kicks the ass of a PS3 before it's even out" :)
  • because for the large numbers of us with laptops, it's really hard to upgrade our video cards, given space constraints, but quite easy to pop in a "stick" video card so we can run the latest graphics apps.


    See, if I'd bought the "latest" computer, I'd already be out of date - by choosing to just buy a cheap $500 laptop, I'm just as out of date as I was a month ago.

    But ... I will need to be able to play Spore ...
  • by Honest Man ( 539717 ) on Monday June 05, 2006 @01:40PM (#15473728)
    You know, I've been stuck behind the kbd for many, many years and I'm ready for a change that seems obvious to me as needing to be done.

    Get the Gfx 'card' out of the computer. Add a GPU socket to the motherboard and expandable video-ram slots.

    I could spend an hour on why I think this solution would be better but here are a few of my reasons:
    1) As fast as PCI-E is, a direct motherboard interface would be faster
    2) Directly upgradeable memory lets you afford the better chips and expand the RAM as you have the money, instead of settling for a lesser card because the higher-memory version doubles the price.
    3) The ability to keep the same memory and JUST upgrade your GPU, since many revisions happen to cards while the memory stays the same.
    4) You could use standard CPU cooling on the GPU for much more efficient cooling, instead of hanging more weight off a relatively flimsy PCI-E connector, avoiding the occasional card/motherboard damage.
    5) A forced standard: all chipmakers would have to produce chips under the same interface standard for new boards, and motherboard makers as well as CPU makers would have to be on the ball too. A GFX chip you bought one year would still plug into new boards 5 years later, as would the vid-RAM, CPU, and system RAM. Also, once any of them is upgraded, the BIOS would need to auto-configure to handle the faster parts. I want them to predict the speed of the GPU/CPU/RAM 10 years from now and at least try to make motherboards that can support the changing times for a realistic amount of time.

    Sure, have boards with dual GPU's or more but it's time to get off the slot and move into a better format.

    I know, the motherboards would cost more, because the expectation would be that you could use the same motherboard for 10 years with frequent upgrades to the CPU/GPU/RAM/gfx-RAM, but I'd pay more for a board I didn't have to keep freaking changing, while still being able to keep my game on and upgrade only the pieces that need upgrading, as I can AFFORD them.

    But that's just my 10 cents.
