
'SLI On A Stick' Reviewed

Posted by Hemos
from the that's-one-hot-stick dept.
Bender writes "What would happen if you took NVIDIA's multi-GPU teaming capability, SLI, and stuck it onto a single graphics card? Probably something like the GeForce 7950 GX2, a 'single' video card with dual printed circuit boards, dual graphics processors, dual 512MB memory banks, and nearly twice the performance of any other 'single' video card. Add two of these to a system, and you've got the truly extreme possibility of Quad SLI. We've seen early versions of these things benchmarked before, but the latest revision of this card is smaller, draws less power than a single-GPU Radeon X1900 XTX, and is now selling to the public."
This discussion has been archived. No new comments can be posted.

  • Wow (Score:5, Insightful)

    by McGiraf (196030) on Monday June 05, 2006 @10:13AM (#15472016) Homepage
    Which game do you need to run to take advantage of the equivalent of 4 graphics cards?
    • Re:Wow (Score:5, Funny)

      by HugePedlar (900427) on Monday June 05, 2006 @10:16AM (#15472040) Homepage
      DNF?
    • Re:Wow (Score:3, Informative)

      by Kutsal (514445)
      Nothing yet, probably. But that doesn't mean there won't ever be any.. Also, in most cases, these boards are used by people like John Carmack to come up with proof-of-concept of new ideas/technologies, or whatever cool thing he's cooking up...

      While they may be overkill for your average user, for (game) developers these things will be goldmines..

      -K
    • Re:Wow (Score:4, Funny)

      by ceeam (39911) on Monday June 05, 2006 @10:26AM (#15472112)
      Windows Vista Aero Glass.
      • Re:Wow (Score:1, Funny)

        by Anonymous Coward
        Seriously, it's gonna be great to know that a game of solitaire can bring my computer to its knees...
    • by Bega (684994)
      Don't know about games, but I sure hope it'll be enough to run Vista!
    • Re:Wow (Score:3, Informative)

      by Anonymous Coward
      Oblivion. At 1600x1200. Turn up the antialiasing to 4x and turn on all the effects.
    • Re:Wow (Score:5, Insightful)

      by Tim C (15259) on Monday June 05, 2006 @10:41AM (#15472210)
      Today, none (unless you want to run something like Doom 3 or Half Life 2 with all the options turned up to max and at an insane resolution).

      Tomorrow, who knows? I remember a time when a TNT2 Ultra was considered overkill, now you can get more powerful GPUs in mobile phones.
      • by Eivind (15695)
        Yes. But the market will consist of idiots. Because, as you say, today there is literally no use for such a setup.

        Sure, 3 years down the road there'll be games that look noticeably better with such a setup, but here's the thing: 3 years down the road you can have this graphics performance for 1/8th the price and power consumption.

        It's fine though, those "early adopters" (aka idiots) pay a large fraction of the development-cost for the rest of us.

      • by jmke (776334)
        at 30" LCD resolutions of 1900+ pixels with higher detail (4xAA/16xAF) you'll find that a single 7900GTX or X1900XTX is too slow in games like Call of Duty 2, FEAR and of course Oblivion...
    • Re:Wow (Score:1, Redundant)

      by rspress (623984)
      Windows Vista......but it might need three or four of the cards to run smoothly ;-)
    • Any game, hooked up to this:

      http://hardware.slashdot.org/article.pl?sid=06/04/17/1841231 [slashdot.org]

    • Re:Wow (Score:2, Informative)

      by greatguns_17 (955947)
      I thought the same about my 7800 GTX till I tried playing Oblivion with all settings... and the fps I get is mostly below 30. You get the hardware, and getting software to saturate that piece of hardware is not so hard...
    • Re:Wow (Score:4, Insightful)

      by matt328 (916281) on Monday June 05, 2006 @11:29AM (#15472586)
      The one where you win by claiming higher frame rate than your peers.

      As an aside: it doesn't matter how long you've been playing a certain fps, your eyes have not mutated to give you the ability to discern a difference between 400 and 405 frames per second.
      • Ignoring that fact, I want to know where you got the monitor that can display over 400 frames per second.
      • I think some animal eyes (insects in particular) have faster response times than human eyes.

        So, if you're making a flight simulator to study the way bees fly, you might need 1000 FPS.
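      The arithmetic behind that frame-rate point is easy to check: at a few hundred frames per second, the per-frame time difference becomes microscopic. A quick back-of-the-envelope sketch in Python, using just the numbers from the comment above:

```python
# Per-frame time at a given frame rate, in milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

t400 = frame_time_ms(400)   # 2.5 ms per frame
t405 = frame_time_ms(405)   # ~2.469 ms per frame

# Going from 400 to 405 fps shaves only ~31 microseconds off each frame,
# far below any plausible threshold of human perception.
print(round((t400 - t405) * 1000, 1))  # difference in microseconds
```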
    • Re:Wow (Score:4, Informative)

      by Kjella (173770) on Monday June 05, 2006 @12:38PM (#15473222) Homepage
      Which game do you need to run to take advantage of the equivalent of 4 graphics cards?

      Oblivion at 1920x1200? Good thing I don't have an Apple Cinema Display. Personally I think Oblivion's game engine is a bit overrated. Ok it's pretty but not *that* much prettier than the other freeform 3D games that don't kill my GFX card. Right now I'm working on a HOMM5 addiction instead...
    • Microsoft flight simulator 2004 with 4 full screen windows running. I could easily use something like this, and I'd need an incredibly beefy CPU as well.
    • by Anonymous Coward
      Less power consumed than the high-end Radeon, and consider that the heat is going to be coming from two GPU cores instead of one. If you're already on an ATI setup, this will surely take your temps down a couple of degrees.

      /nVidiot fanboy
    Or, to put it another way: maybe enough to boil eggs in the tank of your watercooling setup?
  • What about... (Score:2, Insightful)

    by exit3219 (946049)
    4 cards with 2 dual-core, double-the-cache, twice-the-speed GPUs each? Is that what the future holds for us?
      I think in the future we'll see GPUs with multi-core CPUs, with PhysX and media (maybe IBM's Cell) co-processors, and lots and lots of memory for HD textures and HD content. Or maybe I just need some coffee.
  • OMG (Score:4, Funny)

    by EW87 (951411) on Monday June 05, 2006 @10:17AM (#15472046) Journal
    I think my Dell just cried.
  • Weight (Score:5, Interesting)

    by Bios_Hakr (68586) <xptical@@@gmail...com> on Monday June 05, 2006 @10:18AM (#15472057) Homepage
    Don't snap off your PCI slot. Soon, we'll see modder cases with rails to support the front of the cards.

    Or maybe, just maybe, old-school lay-down cases will come back in style.
    • Re:Weight (Score:3, Funny)

      by dkf (304284)
      Or maybe, just maybe, old-school lay-down cases will come back in style.
      Bah! It's 19" racks for me! All I need now is a big reel-to-reel tape deck to use as a false front, and everyone will know I've got a proper computer!
  • Bleugh (Score:2, Interesting)

    by Anonymous Coward
    I'm not the only one who thinks 'great, just what we need', am I? I only just upgraded my graphics recently from a 5900-series to a reasonably priced 7600-series, and since doing so, reviews of CrossFire[sic?] and SLI keep popping up, and now quad- is appearing. This time next year, can I expect my graphics card to not even be considered minimum spec for new PC games, while the same games run just fine on the Xbox 360 and PS3?

    Who truly honestly needs this much horsepower for personal use? Seems li
    • Re:Bleugh (Score:5, Insightful)

      by mrchaotica (681592) * on Monday June 05, 2006 @10:23AM (#15472091)
      What would you prefer -- that hardware manufacturers artificially held back new technology?
    • This is the nature of the "hardcore" (or "enthusiast", or whatever they call it these days) PC game market. Unless you spend several hundred dollars every few years, you get way behind the curve. It's really unfortunate, as I'd love to play more PC games, but the total cost of upgrades (versus what you get out of it) is way too much.
      • Re:Bleugh (Score:3, Insightful)

        by kfg (145172)
        Unless you spend several hundred dollars every few years, you get way behind the curve.

        What's wrong with staying way behind the curve? It's the same tech, the same games, the same everything over time, except that you get those who think there is some important value to being at the leading edge of the curve to finance your gaming for you.

        Your problem isn't tech, or money... it's envy.

        Remember, the best ride is on the face of the wave.

        KFG
        • Re:Bleugh (Score:5, Funny)

          by moonbender (547943) <<moc.liamg> <ta> <rednebnoom>> on Monday June 05, 2006 @11:20AM (#15472511)
          Remember, the best ride is on the face of the wave.

          I'm sorry, you'll have to come up with a car analogy.
          • Re:Bleugh (Score:2, Insightful)

            by kfg (145172)
            I'm sorry, you'll have to come up with a car analogy.

            The best value in a car is a two-year-old used one, third year of the model, but avoid the models favored by teenage street racers. They're innately overpriced for what you get, and no matter how shiny the paint, the internals have had the shit beat out of them.

            KFG
          • Re:Bleugh (Score:3, Funny)

            by mikael (484)
            Remember, the best ride is on the face of the wave.

            I'm sorry, you'll have to come up with a car analogy.


            The best ride is on the roofrack of the car?
    • /agree

      I bought a shiny 6800GT over a year ago for $400. I'll never spend close to that on a video card again. If I can play the same games on a next-gen console, I'll pass on any PC upgrades in the future. It's a shame... the PC industry is only hurting itself for a long-term user base. Hell, you can't even get a decent baseball sim on the PC anymore... it's all going to pot.
        I built my newest PC about a year and a half ago for under $800. It replaced my previous PC, which I had used for about 3-4 years. My year-and-a-half-old PC is still doing fine with most newer games; I've played HL2-based games with most options turned on with no problem. I've been playing a lot of NFS:MW lately, with the graphics cranked up, and it runs smooth as silk.

        As for a baseball sim... you've gotta be kidding. I mean, I can understand going out to a game, the atmosphere, the pop-corn and hot dogs, the c
          Well Rick, some people still enjoy a good simulation of baseball. The PC used to be the king of baseball sims, but since developers are making more money developing for the console, they abandoned the PC (thus my reason for mentioning it). My point is that if they can make games like HL2 and Doom 3 better on consoles, and they have come pretty damn close (go read the Xbox reviews at GameSpot), then why put any more money into a PC?
            HL2/Doom3 are better on newer consoles than their predecessors were on earlier consoles. They are still weak compared to their PC-based rivals. ;)

            As great as consoles are, they are still specialized machines, which limits their adoption. My PC can do everything consoles can do and much more that consoles cannot. And as long as PCs have that advantage and a widespread adoption rate, there will continue to be a market for PC-based video games.

            -Rick
    • "640KB ought to be enough for anybody." -Bill Gates

      This is actually pretty cool... I'm starting to feel like the computer industry is warming up to the prospect of modular parallelization "at home".
      We are reaching a point where quantum tunneling could become a real problem and frankly, I was hoping this would happen sooner... The industry always focused on things getting smaller, but we're running into a barrier in that direction.
      Now we're starting to see the opposite: instead of buying a brand new system a
    • Seems like a case of making the product long before any real demand for it actually exists.

      That's often how progress happens. Products are developed where the demand that already exists is a very limited niche; then, once the technology exists, more uses for it are developed and demand increases.

      But then, I don't think that's really the case here; seems to me that polygon-pushing horsepower on GPUs is something that developers have plenty of uses for as much as anyone can make available, and that plenty of

    • Don't worry about it, it's all hype. If you look at benchmarks you see that these SLI setups actually perform significantly worse than just a single card in all except the rare supported game.
  • by Coopjust (872796) on Monday June 05, 2006 @10:23AM (#15472092)
    The review states:

    Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.

    So they are going to alienate the majority of the market that would spend the money on a Quad SLI setup, keeping it exclusive to system builders for some period of time.

    Seems like a bad business decision to me, at least unless and until Nvidia comes to its senses.
    The review states: Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.

      Did it mention anything about having to have a direct supply of electricity from your local Three Gorges Dam?
  • FINALLY (Score:5, Funny)

    by Quick Sick Nick (822060) on Monday June 05, 2006 @10:23AM (#15472096)
    *Throws away 4 7900 GTXs running in SLI*

    If I upgrade, I might be able to go from 200 frames per second in Doom III to.... 205 frames per second!

    I can't wait to get rid of my old setup! It was a piece of shit!
    • Since you're throwing it out, please send your junk to me. I'll recycle it for you and won't even charge you anything.
  • by neovoxx (818095)
    Yeah, but does it finally support HDCP for us DIY system builders?
  • by xming (133344)
    I have a dual-core Duo 2 with dual-channel DDR2 and a dual-GPU dual-card (SLI) setup with dual monitors, cooled by dual case fans, powered by dual (redundant) PSUs on 220V. Oh, I forgot about my dual-layer DVD burner and dual-button mouse.
  • by Bega (684994)
    Now all we need are games to utilise that power!
  • by LordKazan (558383) on Monday June 05, 2006 @10:40AM (#15472199) Homepage Journal
    I'm probably going to lose even more karma for posting with that title and subject - but I'm on a karma--; roll lately.

    Graphics card innovations over the past several months to a year with SLI seem to me to be mostly "I have a dual SLI system!" "Yeah? Well, I have a QUAD SLI system!" - so much performance going unused that it's pointless. Furthermore, for the price of one of these brand-new cards in the article, I can build a decent gaming computer or an HDTV MythTV box.

    I would rather spend $600 on much more useful things that would see use right now. On Pricewatch, the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 PCI Express 256MB, GeForce 6600 GT PCI-E 256MB.
    • I would rather spend $600 on much more useful things that would see use right now. On Pricewatch, the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 PCI Express 256MB, GeForce 6600 GT PCI-E 256MB.

      So spend your $600 on more useful things with the rest of us, and let the fanatics keep driving the very high end video card market so that we can all benefit from it when it's in the $100 bin in what, 2 or 3 years.
    • One word:
      Oblivion.
      Three more words:
      Unreal Tournament 2007.

      I have an Athlon 64 3800+ with SLI 7800GTs, X-Fi, etc., and Oblivion still grinds to a halt if I push the settings up much beyond their medium levels. Even FEAR only just runs at a decent rate at full whack on my rig. I don't even want to think about the horsepower UT2007 will need.

      If you want a game that looks like crap and runs like crap, fine - buy an X1300 or 6600GT. Those of us who want a better-looking, faster-responding high-end game can use all
        I have an Athlon 64 3800+ with SLI 7800GTs, X-Fi, etc.

        And just doing a quick back-of-the-envelope, that rig probably cost you well over $3000. That's a heck of a lot of money to spend and still have Oblivion 'grind to a halt' at max settings. Today's gaming market has just gotten ridiculous. What ever happened to the days when you could get good performance from the latest games on only $1000-$2000 worth of hardware?

    • I would rather buy a dual-SLI system, with a pair of Quadros, which will set me back about $5000 to replace my aging FX4000. You buy what you want, I'll buy what I want.

      XSI on a pair of Quadros is worth the cost to me.
    • The sad thing is these guys do have a point, because of something that isn't talked about much anymore.

      They sell their cards used; last generation's high-end card performs better than this generation's midrange...

      If they sell them consistently, they are paying approximately the same amount, because you can't sell last generation's midrange card.

      Sad but true... Of course, I like to have dozens of systems doing nothing, so I need the old hardware, but if you don't run servers it doesn't matter
  • by TheSkepticalOptimist (898384) on Monday June 05, 2006 @10:45AM (#15472235)
    Like the original dual Voodoo cards, multiple video cards are just one of those things that keeps going out of style (but, like old fads, makes its appearance every decade or two).

    The cost to implement and manufacture multiple video cards is ridiculous. Who honestly would spend $1400 just to have two video cards, and then only get at most a 20% performance improvement?

    With the current trend of multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would dual-core the GPU, or simply put two GPUs on the same card.

    It just makes sense to keep a video card as a single card. You don't have to duplicate the production costs and all the other components that are wasted in a dual-card configuration, and you don't have to duplicate the bus technology on the motherboard in order to implement dual video cards. Overall, this will be a much cheaper configuration that will actually bring high-performance video technology into the realm of being practical.

    Eventually, 4-way GPU cards will be released, and eventually nVidia and/or ATI will start to dual-core their GPUs; those spending money on their expensive dual or even quad SLI configurations will have wasted a bunch of money.
    • With the current trend of multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would dual core the GPU, or simply put two GPU on the same card.

      Actually, the G71 processor used in that beast has 32 pixel pipelines already, which in their context are similar to cores on a CPU. (Sure, they form a SIMD architecture, unlike CPU cores, but so do SLIed GPUs, sort of, as I have understood it.) When CPUs get more cores, G

    • "Did you know that 93% of statistics are made up on the spot?"

      5950 to 7675 (3DMark scores) is over 28%. There were better and worse scores than that, but since that was the overall 3DMark score, I figured it would be good to go with.

      Yes, there are individual tests that are lower than 20%, but to say 'at most 20%' when there are no games designed to USE that kind of hardware and the current benchmarks ALREADY show higher results... That's just wrong.

      If you'd said 'better than 30%' I'd still have checked my
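      For what it's worth, the percentage quoted there is straightforward to verify from the two 3DMark scores given in the comment (5950 and 7675). A small Python sketch of the relative-improvement arithmetic:

```python
# Relative performance improvement between two benchmark scores, as a percent.
def improvement_pct(baseline, new):
    return (new - baseline) / baseline * 100.0

# Scores quoted in the comment above: single-card setup vs. the GX2.
print(round(improvement_pct(5950, 7675), 1))  # ~29.0% - so "over 28%" checks out
```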
    • is that it doesn't work for GPUs.
      Instruction parallelisation was never a problem there, so the cores are inherently as parallel as the die size allows. If you could squeeze twice as many transistors onto a chip, your GPU would have 64 instead of 32 pixel pipelines, for example.
      Plus, dual core does nothing for the bandwidth problem... (and no, going to 1024-bit memory or something isn't an option
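      The bandwidth point can be illustrated with a toy roofline-style model: frame time is whichever is larger, shading time or memory-transfer time, so doubling the pipeline count stops helping once the card is bandwidth-bound. All the figures below are invented for illustration; they are not real G71 specs.

```python
# Toy model: frame time is limited by the slower of compute and memory transfer.
# All figures are illustrative, not real GPU specs.
PIXELS    = 1920 * 1200   # pixels per frame
CYCLES_PP = 10            # shader cycles per pixel, per pipeline
CLOCK_HZ  = 650e6         # core clock
BYTES_PP  = 16            # bytes of memory traffic per pixel
BW_BPS    = 51.2e9        # memory bandwidth, bytes/second

def fps(pipelines):
    compute_s = PIXELS * CYCLES_PP / (pipelines * CLOCK_HZ)
    memory_s  = PIXELS * BYTES_PP / BW_BPS
    return 1.0 / max(compute_s, memory_s)

# Doubling pipelines gives far less than double the frame rate once
# memory bandwidth becomes the bottleneck.
print(round(fps(32)), round(fps(64)))
```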
    • Eventually, 4 way GPU cards will be released, and eventually nVidia and/or ATI will start to dual core their GPUs, those spending money on their expensive dual or even quad based SLI configurations just wasted a bunch of money.

      You are missing the point that GPUs are highly parallel processors. What you call "dual-coring their GPU" has been done for the past 5+ years in the graphics industry. They call it a new product.
      Every new generation has had more pixel pipelines. What do you think those are? You can

    • I'm not trying to sound like an asshole, but here goes...

      1. I have an SLI motherboard and GPU. I'm only running one card so I can upgrade it down the line (3+ years) when the card hits 50 bucks.
      2. Who honestly would spend $1400 just to have two video cards, and then only get at most 20% performance improvement?
      Actually, SLI can be had for as little as $300 (mobo not included). Also, you will see a much more than 20% performance boost. Check your numbers next time.
      3. Yeah, I saw it merging onto one ca
  • Now I've got the full package: a 4x4 car, a 4x4 AMD chipset and a 4x4 SLI video card.
    Someone shoot me.
  • by szembek (948327) on Monday June 05, 2006 @10:47AM (#15472250) Homepage
    SLI stands for Scalable Link Interface.
  • by GmAz (916505)
    Nice card... err... cards. I would buy one if I had the $$$. But if you look at the price point of the 7900GTX and this new card, the price difference isn't that big. Still, though, that's a pretty penny just to make games look better. My 6800GT is still hanging in there.
  • by pneumatus (936254) * on Monday June 05, 2006 @10:55AM (#15472310)
    I heard these new cards were developed especially for Duke Nukem Forever!
  • Imagine... (Score:2, Funny)

    by enko (802740)
    Imagine a Beowulf cluster of those?
  • "graphics cards today are more and more starting to look like tanks"
  • Getting by (Score:3, Insightful)

    by Neo-Rio-101 (700494) on Monday June 05, 2006 @11:10AM (#15472420)
    I'm still getting by with my ATI RADEON 9700 PRO. Still plays just about anything I can throw at it. Oblivion gives it a hard time, but it's still adequately playable.

    I'm going to hold off as long as possible, until the card can't play the latest games, at which point I may get one of these quad SLI setups. By that time, we'll have DDR3 memory and quad-core CPUs too.
  • by Qbertino (265505) on Monday June 05, 2006 @11:37AM (#15472669)
    I just bought the budget edition of 'Deus Ex' the other day. What I really like about it is that I needn't think twice about whether it will run smoothly or not. I have an Athlon XP 2100+ and a GeForce 4 Ti 4-something; I can crank the graphics up to full and needn't worry about lag or anything.
    That's always the more fun way to go, IMHO.
  • by WillAffleckUW (858324) on Monday June 05, 2006 @11:47AM (#15472778) Homepage Journal
    because for the large numbers of us with laptops, it's really hard to upgrade our video cards, given space constraints, but quite easy to pop in a "stick" video card so we can run the latest graphics apps.

    Sigh.

    See, if I'd bought the "latest" computer, I'd already be out of date - by choosing to just buy a cheap $500 laptop, I'm just as out of date as I was a month ago.

    But ... I will need to be able to play Spore ...
  • With so much of the highest-level CPU design going into GPUs, and so many of the most wily consumers of the fastest GPUs going to any lengths possible to trick them out, I'm surprised there's not a lot more development of GPGPU [gpgpu.org], harnessing these processors for general purpose computing.

    Given the qualifications and interests of that joint community, I'd expect to see a "PCI network" that parallelizes MP3 encoding on much cheaper MFLOPS GPU HW by now.

    Maybe actually playing the games is eating up too much time
  • by Honest Man (539717) on Monday June 05, 2006 @01:40PM (#15473728)
    You know, I've been stuck behind the kbd for many, many years and I'm ready for a change that seems obvious to me as needing to be done.

    Get the Gfx 'card' out of the computer. Add a GPU socket to the motherboard and expandable video-ram slots.

    I could spend an hour on why I think this solution would be better but here are a few of my reasons:
    1) As fast as PCI-E is, a direct motherboard interface would be faster
    2) Directly upgradeable memory lets you afford the better chips and expand the RAM as you have the money, instead of 'settling' for a lower card because the higher-memory version doubles the price.
    3) The ability to use the same memory and JUST upgrade your GPU, since many revisions happen to cards while the memory stays the same.
    4) You could use standard CPU cooling on the GPU for much more efficient cooling, instead of adding more weight to a relatively flimsy PCI-E connector, avoiding the occasional card/motherboard damage.
    5) A forced standard: all chipmakers would have to produce chips under the same interface standard for new boards, and motherboard manufacturers as well as CPU manufacturers would have to be on the ball too. A GFX chip you bought one year would still plug into new boards 5 years later, as would the vid-RAM, CPU and system RAM. Also, once any of them is upgraded, the BIOS would need to auto-set to handle the faster speeds... so I want them to predict the speed of the GPU/CPU/RAM 10 years from now and at least try to make motherboards that can support the changing times for a realistic amount of time.

    Sure, have boards with dual GPU's or more but it's time to get off the slot and move into a better format.

    I know, the motherboards would cost more, because the expectation would be that you could use the same motherboard for 10 years with frequent upgrades to the CPU/GPU/RAM/gfx-RAM, but I'd pay more for a board I didn't have to keep freaking changing, while still being able to keep my game on and upgrade only the pieces that need upgrading, as I can afford them.

    But that's just my 10 cents.
    • Pro and cons (Score:3, Informative)

      by DrYak (748999)
      On one hand, that's something we'll be seeing in the near future thanks to the HyperTransport format. Slashdot recently covered programmable chips (FPGAs) that could be plugged into dual-Opteron motherboards and use the HT bus.
      Also recently announced on Slashdot: the development of a standard hypertransport connector [slashdot.org] (as part of the HT 3.0 revision).

      So maybe in the near future you'll see motherboards featuring HyperTransport connectors, into which you could directly plug a CPU/DDR board, GPU/GDDR
  • Right on the first page of TFA it says that it is HDCP compliant, so you need the latest HDTV "set" in order to run it. So it's not like there was much of a chance of me purchasing one of these in the first place, but I'm not going to buy a DRM crippled product.
  • Or is this nVidia offering still limited in this way? I don't care how fast it is when games still aren't looking their best.
