The First Quad SLI Benchmarks 109
An anonymous reader writes "X-bit labs have a preview of NVIDIA's Quad SLI system based on two GeForce 7900 GX2 cards. Each GPU on a GeForce 7900 GX2 is allocated 512 MB of on-board memory, and a special bridge chip with 16X PCIe lanes connects it to the other daughter card and to the rest of the system. The two GPUs on each card work in SLI mode. The core and memory are clocked lower than on a single-GPU card, at 550 MHz and 1.2 GHz (DDR). For Quad SLI, NVIDIA has introduced a new SLI mode, 'AFR of SFR,' in which the two cards alternate frames and each frame is split between the two GPUs of the card rendering it. The GX2 cards are benched (where possible) at a resolution of 2560 by 1600 with 32X SLI AA and compared to a Crossfire X1900 XTX system on a variety of games."
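The "AFR of SFR" scheduling described above can be illustrated with a small sketch. This is a hypothetical model written for this summary, not NVIDIA's driver logic: cards alternate whole frames (AFR), and within the active card each GPU takes half of the frame's scanlines (SFR). The function name and 50/50 split are assumptions; real SFR dynamically load-balances the split point.

```python
# Hypothetical sketch of "AFR of SFR": two cards alternate frames,
# and the two GPUs on the active card split that frame top/bottom.
# (Illustrative only -- real drivers load-balance the split line.)
def assign_afr_of_sfr(frame_index, height):
    """Return which (card, gpu) pairs render a frame and their scanline ranges."""
    card = frame_index % 2          # AFR: cards take alternate frames
    split = height // 2             # SFR: naive 50/50 top/bottom split
    return [
        {"gpu": (card, 0), "rows": (0, split)},        # top half of the frame
        {"gpu": (card, 1), "rows": (split, height)},   # bottom half of the frame
    ]

# Four consecutive frames at the review's 1600-line resolution:
for frame in range(4):
    print(frame, assign_afr_of_sfr(frame, 1600))
```

Frames 0 and 2 land on card 0, frames 1 and 3 on card 1, with each card's two GPUs always sharing the frame it owns.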
Re:What resolution? (Score:2, Informative)
*raises hand*
and so does anyone who bought the 3007WFP on sale during the recent Dell days... Oblivion at native res is only about 30 fps... I'd prefer to quad it up for a decent 100+ fps
Re:Unlike CPU, dual GPU costs double (Score:3, Informative)
The reason may be that dual-GPU cards are not dual-core but two GPUs, usually on two different PCBs.
In other words, they're merely sticking two full graphics cards together, while dual-core CPUs put the cores and the dual-CPU handling logic in a single physical package.
Dual GPU is twice as expensive to buy because it's twice as expensive to make in the first place.
People, people, people (Score:3, Informative)
* This is not hardware for the mass market. In fact, even the dual SLI setup is overkill, and mainly a "we knew how to do it, so we did it to prove it" exercise.
* This system is not supposed to be cheap and most definitely not intended to be the most effective cost per fps solution.
* Although only a few will buy this, it is far more valuable for NVIDIA to kill ATI's chances of de-throning them from the top of the performance charts.
* Such excessive memory bandwidth is suited to extreme resolutions that over 95 percent of monitors currently don't support, but the point is not that we should play our games at these levels; it's to prove that it is possible.
* NVIDIA gets an edge over ATI among game developers because, performance-wise, developers will be able to run future games on setups comparable to single cards that are two or even three generations away.
* Yes, it's a waste of electricity, but if you're a member of Greenpeace, then wait a few more generations before you buy an eco-approved graphics card in this category.
* One user was upset, claiming that it would be stupid to waste $1000 on a setup like this. I agree, but if you happen to drive a Ferrari, are debt free, and have a few million bucks stored away, then why not get the best if you can afford it? And you can obviously get your 17-year-old Slashdot-reading neighbour to put in watercooling or whatever to make it silent, too. The point is, some people will buy this, and being able to afford something isn't stupid.
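The "extreme resolutions" point above can be made concrete with some rough back-of-the-envelope arithmetic (my own, not from the article): even the raw color buffer at the review's 2560x1600 test resolution is sizable, and naively storing 32 AA samples per pixel would approach a single card's 512 MB. Real GPUs compress multisample data heavily, so this is an upper bound, not how the hardware actually stores it.

```python
# Rough framebuffer arithmetic at the review's 2560x1600 test resolution.
width, height = 2560, 1600
bytes_per_pixel = 4                       # 32-bit RGBA color

color_buffer = width * height * bytes_per_pixel
print(color_buffer / 2**20)               # ~15.6 MiB per color buffer

# Naive, uncompressed storage for 32 AA samples per pixel
# (real hardware compresses samples, so this is a worst case):
naive_32x = color_buffer * 32
print(naive_32x / 2**20)                  # ~500 MiB -- close to 512 MB per card
```

That is one reason four GPUs with 512 MB each start to make sense at these settings, where a single card would be squeezed for memory and bandwidth.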
Last but not least, we should all remember that the CPU is the new bottleneck now. It will be interesting to see what a CPU a year from now can do for this rig.
Re:What resolution? (Score:3, Informative)
http://www1.us.dell.com/content/topics/topic.aspx
It's not too expensive a monitor, and it's popular with gamers who have the kind of money to buy Quad SLI.