Forget Expensive Video Cards 322
Anonymous Reader writes "Apparently, an extra $200 spent on a video card doesn't make much of a difference. While $500 video cards steal the spotlight on review sites and offer the best performance possible from a single GPU, most enthusiasts find the $300 range to be a good balance between price and performance. Today TechArray took a look at the ATI X1900 XTX and Nvidia 7900 GTX, along with the ATI X1800 XT and Nvidia 7900 GT."
A 'Wow!' moment. (Score:2, Informative)
It's not often that I go "wow" after a hardware upgrade. 486 -> Pentium class. First Athlon. ViRGE 3D -> 3dfx Voodoo 1 (glquake for teh win)... and just a week ago I went from an nVidia PCX5900 (and an ATI 9600XT/256) to a 7900GT. Everything on High in BF2 (with 2x FSAA): smooth as butter. Going from 800x600, low textures, everything turned down in Oblivion, to 1280x960 with HDR: wow.
Re:Not directly related to TFA (Score:5, Informative)
Actually, that's not quite true these days. A modern render farm has a GPU (or two) in each node, and uses it for all sorts of things. If you are only doing relatively low-quality renderings, you can use something like Chromium and get enormous images rendered through OpenGL at high frame rates. If you are doing ray tracing, you can speed this up hugely using the GPU.
Even volume rendering runs on the GPU these days. You can split an enormous volume into 256^3 bricks, render these quickly on a large array of GPUs, and then composite the individual rays using the alpha-blending hardware on a smaller array of machines in a tree configuration until you have the final image[1].
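The tree-compositing step above can be sketched in a few lines. This is a toy illustration, not anyone's actual farm code: each "node" contributes the RGBA result of rendering its own brick (premultiplied alpha, sorted front to back along the ray), and pairs of images are merged up the tree with the standard "over" operator until one image remains.

```python
# Toy sketch of tree compositing: each leaf is the RGBA image from ray-casting
# one sub-volume; pairs are merged front-to-back with the "over" operator.
# Pixels are (r, g, b, a) tuples with premultiplied alpha.

def over(front, back):
    """Composite one premultiplied-alpha pixel over another."""
    fr, fg, fb, fa = front
    br, bg, bb, ba = back
    t = 1.0 - fa
    return (fr + t * br, fg + t * bg, fb + t * bb, fa + t * ba)

def composite_tree(images):
    """Merge per-node images pairwise, as a tree of compositing machines would."""
    while len(images) > 1:
        merged = []
        for i in range(0, len(images), 2):
            if i + 1 < len(images):
                a, b = images[i], images[i + 1]
                merged.append([over(pa, pb) for pa, pb in zip(a, b)])
            else:
                merged.append(images[i])   # odd one out passes through
        images = merged
    return images[0]

# Four "nodes", each holding a 1-pixel image, sorted front to back:
layers = [
    [(0.2, 0.0, 0.0, 0.2)],   # nearest brick
    [(0.0, 0.3, 0.0, 0.3)],
    [(0.0, 0.0, 0.4, 0.4)],
    [(0.1, 0.1, 0.1, 0.1)],   # farthest brick
]
final = composite_tree(layers)
```

Because "over" is associative, the tree can combine pairs in parallel and still get exactly the same answer as compositing the bricks sequentially, which is what makes the tree of machines (or the GPU blending hardware) legal here.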
So, no, not every node needs a video output capability, but if you want state-of-the-art performance they do all need at least one GPU.
[1] Some people are using other kinds of stream processor for this step these days, but that's still a relatively young research area.
Re:What am I missing out on ? (Score:5, Informative)
For comparison, take a look at Apple's Quartz 2D Extreme. This uses the CPU to render each character to a texture and stores it in graphics RAM. These textures are then composited by the GPU. The downside, of course, is that the CPU needs to render the text once for every size at which it is used. Even so, this gives about an order of magnitude better performance than the traditional way (and, of course, lower CPU usage).
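The caching idea is simple enough to sketch. Note this is just an illustration of the scheme described above, with made-up names, not Apple's actual API: rasterize a (character, size) pair once on the CPU, keep the result around as a texture, and let every later draw be a cheap lookup that the GPU composites.

```python
# Sketch of the glyph-cache idea behind Quartz 2D Extreme. All names here are
# illustrative: the point is that each (char, size) pair is rasterized on the
# CPU exactly once, then reused as a cached "texture" for GPU compositing.

class GlyphCache:
    def __init__(self):
        self._cache = {}      # (char, size) -> rasterized bitmap
        self.rasterized = 0   # counts expensive CPU rasterizations

    def _rasterize(self, char, size):
        self.rasterized += 1
        return f"bitmap<{char}@{size}pt>"   # stand-in for a real bitmap

    def glyph(self, char, size):
        key = (char, size)
        if key not in self._cache:          # CPU path: render once per size
            self._cache[key] = self._rasterize(char, size)
        return self._cache[key]             # fast path: reuse cached texture

cache = GlyphCache()
for _ in range(3):                          # draw the same word three times
    for ch in "hello":
        cache.glyph(ch, 12)
# 15 draw calls, but only 4 distinct characters were ever rasterized
```

This also makes the stated downside concrete: drawing "hello" at 14pt as well would double the cache entries, because the bitmaps are size-specific.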
If this becomes mainstream then a GPU with fast shader support will give:
[1] See? They do actually do interesting things. It's a real shame nothing from MS Research ever seems to make it into shipping products though.
Re:Shock! Horror! (Score:4, Informative)
I rarely play games at more than 800x600 anyway, so there's no loss for me. My $150 GeForce 6600 card came with a $50 instant rebate for a video game at Best Buy, so I picked up a copy of Battlefield 2 with the card. It plays absolutely fine on my AMD Athlon XP 2400+ system with the 6600 card at 800x600. It's AGP to boot! I imagine I'd need a better motherboard and processor if I really wanted to take advantage of higher-performance graphics cards, but I have other priorities at this time in my life: maximizing my 401(k), building a house, putting away money for my child's college education, etc.
Have you sat back and thought about how far that $500 would go if you didn't just throw it away on a piece of computer equipment that will be obsolete in 3 months? For example, find a financial calculator and work out what putting $500 every 3 months into a high-growth mutual fund or stocks would return. I bet you'd be pleasantly surprised by the kind of growth your investment would see. Who am I kidding, eh? This is Slashdot. Spend spend spend, fools! Spend so my stocks will increase in value! Woohoo.
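You don't even need a financial calculator for the back-of-the-envelope version. A quick sketch using the standard future-value-of-an-annuity formula, where the 8% annual return and 10-year horizon are purely illustrative assumptions, not claims from the post:

```python
# Back-of-the-envelope check: $500 contributed every 3 months.
# The 8%/year return and 10-year horizon are assumptions for illustration.

def future_value(payment, annual_rate, years, per_year=4):
    r = annual_rate / per_year          # periodic (quarterly) rate
    n = years * per_year                # number of contributions
    # Future value of an ordinary annuity: P * ((1 + r)^n - 1) / r
    return payment * ((1 + r) ** n - 1) / r

invested = 500 * 4 * 10                 # $20,000 paid in over 10 years
grown = future_value(500, 0.08, 10)     # roughly $30,200 at 8%/yr
```

So at those assumed numbers, the $20,000 of skipped video cards turns into roughly $30,200, about half again what was paid in, before taxes and fees.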
Radeon X800 GTO2 (Score:2, Informative)
I paid $199 Canadian for it. The card is absolutely amazing: I get 90 fps in UT2004 with max settings at 1280x1024, and around 60 fps in Call of Duty 2 and Doom 3 at 1024x768 with high-quality settings.
The Sapphire Radeon X800 GTO2 (limited edition) is definitely a special card for the price! Paying a huge chunk of money for one graphics card, or even more for an SLI setup, is just crazy; these mid-range graphics cards perform well enough as it is, IMO.
Re:Not directly related to TFA (Score:5, Informative)
As with everything else in a cluster, it's usually whatever has the best price:performance ratio. I'm more familiar with the clusters in academia, and these tend to use 'whatever was the fastest we could afford when the cluster was built.' An average cluster node costs around £2000 and up, with at least two CPUs and a couple of GB of RAM (minimum). The less cheap ones will have a high-speed interconnect, adding £500-£1000 to the price of a node (plus more expensive switches), while the cheap ones just use gigabit Ethernet. Adding a £200 GPU adds 5-10% to the cost of the node, while giving up to around a 500% performance increase on many tasks.
Usually they don't need access to the driver code. On *NIX (excluding IRIX) they tend to just run an X server on a display that isn't connected to anything and run shader programs on it. The limitation is that only one program/user can typically access the GPU at once, but that's usually what's wanted anyway. The shader program receives data from the interconnect, processes it, and passes it on.
20 patty burger (Score:3, Informative)
What's the biggest burger you've ever made or had made?
Re:Not directly related to TFA (Score:1, Informative)
As far as ease of use goes, the Nvidia cards are much easier to work with. If the performance were there, I'd personally be willing to deal with the ATI drivers if an ATI card were the best value. But it isn't. ATI isn't a choice if you are a Linux user.