Forget Expensive Video Cards

Anonymous Reader writes "Apparently, an extra $200 in video card price does not produce a proportional difference in performance. While $500 video cards steal the spotlight on review sites and offer the best performance possible from a single GPU, most enthusiasts find the $300 range to be a good balance between price and performance. Today TechArray took a look at the ATI X1900 XTX and Nvidia 7900 GTX, along with the ATI X1800 XT and Nvidia 7900 GT."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • A 'Wow!' moment. (Score:2, Informative)

    by eddy ( 18759 ) on Sunday April 30, 2006 @10:19AM (#15231742) Homepage Journal

    It's not often that I go "wow" after a hardware upgrade. 486 -> Pentium class. First Athlon. Virge 3D -> 3Dfx Voodoo 1 (glquake for teh win)... and just a week ago I went from an nVidia PCX5900 (and ATI 9600XT/256) to a 7900GT. Everything on High in BF2 (with 2x FSAA); smooth as butter. Going from 800x600, low textures, everything turned down in Oblivion to 1280x960 with HDR: Wow

  • by TheRaven64 ( 641858 ) on Sunday April 30, 2006 @11:01AM (#15231918) Journal
    Render farms don't need or have video on each processor ... you just have thousands of machines in racks processing data streams.

    Actually, that's not quite true these days. A modern render farm has a GPU (or two) in each node, and uses it for all sorts of things. If you are only doing relatively low-quality renderings, you can use something like Chromium and get high framerate, enormous images rendered through OpenGL. If you are doing ray tracing, you can speed this up hugely using the GPU.

    Even volume rendering runs on the GPU these days. You can split an enormous volume into 256^3 cubes, render these quickly on an large array of GPUs and then composite the individual rays using the alpha blending hardware on a smaller array of machines in a tree configuration until you have the final image[1].

    So, no, not every node needs a video output capability, but if you want state-of-the-art performance they do all need at least one GPU.

    [1] Some people are using other kinds of stream processor for this step these days, but that's still a relatively young research area.
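    The brick-compositing step described above can be sketched on the CPU. This is an illustrative model, not code from any real system: each GPU is assumed to hand back a premultiplied (color, alpha) pair per ray for its brick, and pairs are merged front-to-back with the standard "over" operator, which is exactly what the alpha-blending hardware evaluates.

    ```python
    # Illustrative sketch of front-to-back "over" compositing of per-brick
    # ray samples. A 1-component "color" stands in for RGB; premultiplied
    # alpha is assumed throughout.

    def composite_over(front, back):
        """Combine two premultiplied (color, alpha) samples, front over back."""
        fc, fa = front
        bc, ba = back
        color = fc + (1.0 - fa) * bc
        alpha = fa + (1.0 - fa) * ba
        return (color, alpha)

    def composite_ray(brick_samples):
        """Fold a front-to-back list of per-brick (color, alpha) samples."""
        color, alpha = 0.0, 0.0
        for sample in brick_samples:
            color, alpha = composite_over((color, alpha), sample)
        return (color, alpha)
    ```

    Because "over" is associative, the per-brick results can be merged pairwise in a tree of compositing nodes rather than in one serial pass, which is the configuration the comment describes.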

  • by TheRaven64 ( 641858 ) on Sunday April 30, 2006 @11:18AM (#15231996) Journal
    I read a paper from Microsoft Research[1] about rendering text using the GPU. The idea was that the raw bezier paths of each character in a font would be loaded onto the GPU and then each character would be created on the fly by a shader. This gave a huge performance benefit; it reduced the GPU-RAM bandwidth requirement hugely and allowed the CPU to offload pretty much all text rendering to the GPU.

    For comparison, take a look at Apple's Quartz 2D Extreme. This uses the CPU to render each character to a texture and stores them in the graphics RAM. These are then composited by the GPU. The downside of this, of course, is that the CPU needs to render the text for every size at which it is used. Even so, this gives about an order of magnitude better performance than the traditional way (and, of course, lower CPU usage).

    If this becomes mainstream then a GPU with fast shader support will give:

    • Faster text-rendering performance.
    • Lower CPU usage when rendering large amounts of text.
    • The ability to have effects like Apple's Exposé, but with sharp, fully anti-aliased text at all stages in the zoom effect (the performance on current generation GPUs was fast enough to render entire screens full of small text every frame).

    [1] See? They do actually do interesting things. It's a real shame nothing from MS Research ever seems to make it into shipping products though.
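    The per-character shader trick described above can be sketched on the CPU. Assuming the paper is Loop and Blinn's resolution-independent curve rendering work (an assumption; the comment doesn't name it), each quadratic Bezier segment of a glyph outline is drawn as a triangle whose vertices carry texture coordinates (0, 0), (0.5, 0), (1, 1), and the pixel shader classifies each pixel with a single implicit-function test on the interpolated (u, v):

    ```python
    # Illustrative CPU version of the per-pixel test such a glyph shader
    # would run: a pixel is on the filled side of the quadratic curve
    # exactly when u^2 - v <= 0 for its interpolated texture coordinates.
    # Helper name and scheme details are assumptions, not from the comment.

    def inside_curve(u, v):
        """True if (u, v) lies on the filled side of the quadratic segment."""
        return u * u - v <= 0.0
    ```

    Since the test is evaluated per pixel from the exact curve equation, the glyph stays sharp at any zoom level, with no per-size texture cache of the kind Quartz 2D Extreme maintains.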

  • Re: Shock! Horror! (Score:4, Informative)

    by Professor_UNIX ( 867045 ) on Sunday April 30, 2006 @11:36AM (#15232076)
    Yeah, but try to play F.E.A.R. or Oblivion at 1600x1200 resolution with all features on and 16x AF on your 6600, and then tell me a $500 card isn't better

    I rarely play games at more than 800x600 anyway so no loss for me. My $150 GeForce 6600 card came with a $50 instant rebate for a video game at Best Buy so I picked up a copy of Battlefield 2 with the card. It plays absolutely fine on my AMD Athlon XP 2400+ system with the 6600 card at 800x600. It's AGP to boot! I imagine I'll need a better motherboard and processor if I really wanted to take advantage of some higher performance graphics cards, but I have other priorities at this time in my life. Maximizing my 401(k), building a house, putting away money for my child's college education, etc.

    Have you sat back and thought about how far that $500 would go if you didn't just throw it away on a piece of computer equipment that will be obsolete in 3 months? For example, grab a financial calculator and work out what putting $500 every 3 months into a high-growth mutual fund or stocks would do. I bet you'd be pleasantly surprised by the kind of growth your investment would return. Who am I kidding, eh? This is Slashdot. Spend spend spend, fools! Spend so my stocks will increase in value! Woohoo.
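    The comment's "do some calculations" suggestion is just the future value of a fixed quarterly deposit: FV = P * ((1 + r)^n - 1) / r, where r is the per-quarter rate and n the number of deposits. A quick sketch, with an 8% annual return as a purely illustrative assumption:

    ```python
    # Future value of depositing a fixed amount every quarter at a constant
    # annual growth rate, compounded quarterly. The 8%/year figure below is
    # an illustrative assumption, not a prediction.

    def future_value(payment, annual_rate, years, periods_per_year=4):
        r = annual_rate / periods_per_year   # per-quarter growth rate
        n = years * periods_per_year         # total number of deposits
        return payment * ((1 + r) ** n - 1) / r

    # $500 per quarter for 10 years at 8%/year: $20,000 in deposits,
    # roughly $30,000 at the end.
    balance = future_value(500, 0.08, 10)
    ```

    Ten years of skipped video-card upgrades at $2,000/year compounds to noticeably more than the sum of the deposits, which is the commenter's point.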

  • Radeon X800 GTO2 (Score:2, Informative)

    by Emetophobe ( 878584 ) on Sunday April 30, 2006 @11:54AM (#15232160)
    I just got a new computer with a Sapphire Radeon X800 GTO2 (already unlocked to 16 pipelines).

    I paid $199 Canadian for it. The card is absolutely amazing: I get 90fps in UT2004 with max settings at 1280x1024, and around 60fps in Call of Duty 2 and Doom 3 at 1024x768 with high quality settings.

    The Sapphire Radeon X800 GTO2 (limited edition) is definitely a special card for the price. Paying a huge chunk of money for one graphics card, or even more for an SLI setup, is just crazy; these mid-range graphics cards perform well enough as it is, IMO.
  • Wapperjawed (Score:2, Informative)

    by jrmiller84 ( 927224 ) on Sunday April 30, 2006 @11:54AM (#15232161) Homepage
    Also, on the flip side, if you're going to spend the $200-$300 as I did, do your research first. I made the stupid mistake of buying a $200 Radeon 9800 Pro 256MB before realizing that it was actually a Sapphire-made card that runs on a 128-bit bus instead of a 256-bit bus. So while I have a "Radeon 9800 Pro with 256 MB of video memory" (booming voice!), it's actually a piece. Of course it plays all of the newest games, but there is much room for improvement. Moral of the story: do your homework before buying ANY video card (high, mid, or low end); don't go by the name alone.
  • by TheRaven64 ( 641858 ) on Sunday April 30, 2006 @12:53PM (#15232403) Journal
    Do you know which cards are commonly used? Are they $500 gaming cards? Cheaper gaming cards? More specialized cards?

    As with everything else in a cluster, it's usually whatever has the best price:performance ratio. I'm more familiar with the ones that exist in academia, and these tend to be 'whatever was the fastest we could afford when the cluster was built.' An average cluster node costs around £2000 and upwards. They usually have at least two CPUs and a couple of GB of RAM (minimum). The more expensive ones will have a high-speed interconnect, adding £500-£1000 to the price of a node (plus more expensive switches), while the cheap ones will just use gigabit ethernet. Adding a £200 GPU adds 5-10% to the cost of the node, while giving up to around a 500% performance increase on many tasks.
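    The price:performance claim above is easy to sanity-check with the comment's own figures (a ~£2000 node, a £200 GPU, up to ~5x speedup on suitable tasks); the helper below is purely illustrative:

    ```python
    # Back-of-envelope throughput-per-pound comparison for a cluster node
    # with and without a GPU, using the figures quoted in the comment.

    def throughput_per_pound(node_cost, extra_cost, relative_speedup):
        """Relative task throughput divided by total node cost in pounds."""
        return relative_speedup / (node_cost + extra_cost)

    cpu_only = throughput_per_pound(2000, 0, 1.0)    # baseline node
    with_gpu = throughput_per_pound(2000, 200, 5.0)  # +10% cost, ~5x speed
    # Adding 10% to the node cost multiplies price:performance roughly 4.5x.
    ```

    That ratio is why, for GPU-friendly workloads, skipping the GPU is rarely the economical choice even on a tight cluster budget.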

    Usually they don't need access to the driver code. On *NIX (excluding IRIX) they tend to just run an X server on a display that's not connected to anything and run shader programs on it. The limitation of this is that only one program/user can typically access the GPU at once, but that's usually what's wanted. The shader program receives data from the interconnect, processes it, and passes it on.

  • 20 patty burger (Score:3, Informative)

    by asn ( 4418 ) on Sunday April 30, 2006 @02:13PM (#15232694)
    On a dare one time, I had to go to Wendy's and try to order a 20-patty burger. We had already determined at this point that the "double the meat" deal really only meant one extra patty, so I had to order a "single burger with 19 extra patties", which resulted in a pimply-faced reply of "uh... sir... I'm going to have to get the manager" -- the manager insisted they could not construct a burger beyond 4 patties, even after I said I didn't care whether or not it was properly wrapped. We were actually able to reach a middle ground where he gave me my 4-patty burger and then put 16 other patties in 2 of their plastic salad bowls. We took the burger home, assembled it, took pictures, then deconstructed it into more manageable burgers served on white bread.

    What's the biggest burger you've ever made or had made?
  • by JerLasVegas ( 791093 ) on Sunday April 30, 2006 @02:57PM (#15232908)
    If you buy the latest and greatest video card, you may be able to take advantage of it in one or two games at most. By the time there are enough games out there to justify the video card, it costs hundreds less. That is like buying a console that costs $600 when there is only one game for it; then the price drops to $250 and there are 10 more. Unless you have money to waste, it is better to wait.
  • Re:20 patty burger (Score:2, Informative)

    by PayPaI ( 733999 ) on Sunday April 30, 2006 @03:32PM (#15233066) Journal
    In-N-Out 16x16 (not mine)
  • by Anonymous Coward on Monday May 01, 2006 @12:23AM (#15234933)
    Have you benchmarked your card and then tried the same under Windows? You will notice the ATI Linux drivers fall FAR short of the Windows ones. The gap in performance is unacceptable. The Nvidia drivers perform similarly under both platforms. It doesn't make sense to buy a $300 ATI card when you can get the same performance with a $200 Nvidia card under Linux.

    And as far as ease of use goes, the Nvidia ones are much easier to work with. But if the performance was there, I would personally be willing to deal with the ATI drivers if an ATI card was the best value. But they are not. ATI isn't a choice if you are a Linux user.
  • Re:20 patty burger (Score:3, Informative)

    by silvwolf ( 103567 ) on Monday May 01, 2006 @05:42AM (#15235687)
    I see your 16x16 and raise you to 100x100.
