Forget Expensive Video Cards 322

Anonymous Reader writes "Apparently, an extra $200 spent on a video card does not produce much of a difference. While $500 video cards steal the spotlight on review sites and offer the best performance possible from a single GPU, most enthusiasts find the $300 range to be a good balance between price and performance. Today TechArray took a look at the ATI X1900 XTX and Nvidia 7900 GTX along with the ATI X1800 XT and Nvidia 7900 GT."
This discussion has been archived. No new comments can be posted.

  • Try $200 (Score:5, Interesting)

    by eln ( 21727 ) on Sunday April 30, 2006 @09:59AM (#15231679)
    I find that cards at the $150 to $200 price point are usually good enough to play new games with all of the eye candy enabled for about two years after they're purchased. After that, you can either buy another $150 to $200 card (which will obviously be far more advanced than the one you bought two years earlier) or keep playing newer games without all of the eye candy enabled.
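
    A quick back-of-the-envelope sketch of that trade-off in Python (the prices below are illustrative assumptions, not figures from this comment):

        # Hypothetical prices: replace a mid-range card every two years vs. buy one high-end card.
        midrange_price = 175      # roughly the $150-$200 range mentioned above
        highend_price = 500
        years = 4

        midrange_total = midrange_price * (years // 2)   # a new mid-range card every two years
        highend_total = highend_price                    # a single high-end card for the whole period

        print(f"Mid-range every 2 years: ${midrange_total} over {years} years")
        print(f"Single high-end card:    ${highend_total} over {years} years")
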
  • by Inverted Intellect ( 950622 ) on Sunday April 30, 2006 @10:03AM (#15231693)
    The price/performance graph for just about every imaginable computer component can be represented by a bell curve. It just so happens that I'm in the market for a $300 graphics card. I plan on buying the Nvidia 7800 GS, which is the most powerful AGP card available. While it sucks that those of us with AGP mobos have been left without an upgrade path, this particular price range works fine for me. I figure it'll be the last major upgrade for my close-to-obsolete AGP-slotted computer.
  • Me my Mum and I.... (Score:4, Interesting)

    by MosesJones ( 55544 ) on Sunday April 30, 2006 @10:07AM (#15231706) Homepage
    And then of course there's the home computer that I'm currently fixing for my mum (mom to Americans), which has a very basic graphics card that drives the 17" TFT rather nicely. Sitting next to it is the one my wife uses, which has a Voodoo 3500 TV, runs SUSE, and works fine for her.

    The ONLY people who need these graphics cards are people who play top-end games. I find it stunning when I come across work desktops for people who do MS Office stuff that have only 512 MB of RAM but a graphics card capable of running Doom 3 at decent framerates. 80%+ of people don't need even the 7900 GT, let alone the GTX, and it would take a completely brain-dead operating system to require people to have top-of-the-line graphics cards just to run a word processor....

    That of course is where my theory breaks down: Vista... you might not play games... but our developers do.

  • by chrismcdirty ( 677039 ) on Sunday April 30, 2006 @10:20AM (#15231746) Homepage
    http://www.newegg.com/Product/Product.asp?Item=N82E16814150098 [newegg.com]. Try that. If you're willing to spend twice the price (and have an SLI-capable system), I hear they perform very nicely in SLI, and it's still less than the price of a $300 card.
  • Skewed results? (Score:5, Interesting)

    by travail_jgd ( 80602 ) on Sunday April 30, 2006 @10:24AM (#15231762)
    All of the benchmarks in TFA are run at 1600x1200.

    I understand that maximum resolution is the best way to highlight the limitations of the cards. But how many "budget" gamers are going to have monitors capable of running at those resolutions?

    All of these cards produce "acceptable" results at 1600x1200. I read the article as "the cards are identical at lower resolutions, but reporting that you need to spend more money makes our advertisers happy." Or maybe I'm just cynical.
  • by Kjella ( 173770 ) on Sunday April 30, 2006 @10:25AM (#15231766) Homepage
    Not going for the top-of-the-line graphics card, motherboard, CPU, or RAM (heck, virtually any piece of hardware) yields you the most bang for the buck.

    Actually it's more generic than that. If you look at hard disks (because they have such a clean metric, but the same applies to all hardware) you'll see $/GB is not lowest at the low end - there's the infamous "sweet spot" in the middle. Same with CPUs: the cheapest ones don't give the most bang for the buck. There are some inherent costs in just producing and shipping the product, which means the lowest-end parts are typically quite crippled but not that much cheaper. In terms of performance per dollar, mainstream is the best. Of course, that does not mean the utility you get from that performance is maximized, unless utility is exactly 1:1 with the dollar value. My parents could get a 7900 GTX SLI setup & 750 GB Seagate disks and their utility would be 0 (over their current machine). There's no sense spending money on performance if you're not getting utility, and it makes good sense to spend money where you are getting utility, even if you're moving away from the sweet spot.
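
    A minimal Python sketch of that sweet-spot effect, using made-up drive prices (none of these numbers come from the post):

        # Hypothetical capacity/price pairs; the $/GB minimum lands in the middle of the range,
        # not at either end, which is the "sweet spot" described above.
        drives = [(80, 60), (160, 75), (250, 90), (320, 120), (500, 250), (750, 450)]

        for capacity_gb, price_usd in drives:
            print(f"{capacity_gb:4d} GB at ${price_usd:3d}: ${price_usd / capacity_gb:.2f}/GB")
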
  • by ScrewMaster ( 602015 ) on Sunday April 30, 2006 @10:35AM (#15231802)
    The ONLY people who need these graphics cards are people who play top-end games.

    That's not entirely true. For example, in the mechanical engineering department where I work there's one guy with a really fast PC and a high-end graphics card (I think nVidia, but I'm not sure) who does 3-D design and rendering of parts for the automated machine tools on the plant floor. Not that many years ago, he would have had some kind of special "workstation video board" that would have cost a couple of grand. Those have all but died out as the likes of nVidia and ATI have pushed the performance envelope so far that engineering tasks pale in comparison to the requirements of a game. I guess my point is that there are many tasks that need high-performance 3D; they're just not as high-profile as gaming. And even that is a rather small subset of the total number of computer users out there.
  • by It'sYerMam ( 762418 ) <[thefishface] [at] [gmail.com]> on Sunday April 30, 2006 @10:54AM (#15231879) Homepage
    Having talked personally with the ATI Linux team (back before I bought an nVidia card), I know they do try with the resources they're given by management. They also take into account the complaints of users - although, being bound by NDA, I'm pretty sure they can't give out "coming soon" notices. Certainly, way back when there was that nasty problem between UT2k4 and the ATI Linux drivers, they wouldn't disclose that it was fixed before the release actually shipped.
  • by BassKadet ( 936182 ) on Sunday April 30, 2006 @11:05AM (#15231928)
    I know that 1600x1200 really stresses the GPUs in these cards, but I often wonder how many people are actually gaming at that resolution. I have lots of hardcore gamer friends in the area and I've seen their rigs; only one of them has a monitor bigger than 19" and runs 1600x1200. Sure, 1600x1200 looks great on a 19" monitor too, but on a monitor that small 1280x1024 still looks very nice, and pushing the res up to 1600 really isn't worth the FPS hit. Or at least, that is the consensus among my friends. I don't mind paying $500 for something I want; I have camera lenses that cost twice that amount. But somehow it just seems excessive to spend an extra $200 over a $299 card to gain 5-15 FPS for a game at some high resolution I'll never use anyway.
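
    Running the numbers from that last sentence as a quick sanity check (Python; the $299 and $499 price points and the 5-15 FPS gain are the figures quoted above):

        # Cost of each extra frame per second when stepping up from a $299 card to a $499 one.
        base_price, premium_price = 299, 499
        extra_cost = premium_price - base_price

        for fps_gain in (5, 10, 15):
            print(f"+{fps_gain:2d} FPS costs about ${extra_cost / fps_gain:.0f} per extra FPS")
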
  • by ZeroExistenZ ( 721849 ) on Sunday April 30, 2006 @11:21AM (#15232011)
    I agree... I'm still using my GeForce Ti 4200; I bought it cheap because the DirectX 9.0 cards were coming out and I didn't feel like dealing with the early bugs and such.

    There hasn't really been a game I couldn't play; I've finished Half-Life 2, all the Need for Speed titles, all the GTA titles, and so many more...
    So why would I need to fork out $300-600 to play what I can play now, but "better"?

    Nothing has felt as sluggish and jaggy as trying to play Blood II with a Voodoo2 card on a P200. Are kids these days spoiled rotten? I had to work to finance my PC spending, and I still do.
  • Re:Whatever... (Score:3, Interesting)

    by Tyrant Chang ( 69320 ) on Sunday April 30, 2006 @11:37AM (#15232081)
    To add a little more to your post, I think the term is the compromise effect: http://www.everything2.com/index.pl?node_id=1273029 [everything2.com] (also known as the extreme-aversion effect). People will generally choose the midpoint of an option set, and framing an option as the middle one makes it more attractive.

    Apparently, this effect has been "applied" in many fields like marketing, sales, and negotiation, and also in the legislative world, where a legislator will present a stupid bill that he knows will fail because of the backlash, but which makes his next bill look more reasonable by comparison (as we see too often).
  • Re:Whatever... (Score:5, Interesting)

    by blair1q ( 305137 ) on Sunday April 30, 2006 @11:43AM (#15232106) Journal
    >Apparently when people see that there is something more expensive and more "over the top" they are much more compelled to buy the next lower version than if that same version was the high end.

    don't confuse compelled for enabled

    people don't want to feel like pigs

    they feel like pigs when they get the biggest item

    if they take the next-biggest item, they both satisfy their need to serve themselves, and their need not to be gluttonous

    also, it's very common that the best value is to be had by taking the second-tier item; the reason is that on a learning-curve pricing scheme, the slope is steepest between items near the premium end of the curve; why a learning-curve pricing scheme applies is beyond the scope of this comment - many reasons can be found, and exceptions as well
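
    a small sketch of that steep premium-end slope (the tier prices and performance numbers below are invented for illustration, not taken from this thread):

        # Hypothetical price/performance tiers; the marginal cost of each extra unit of
        # performance rises sharply near the premium end, which is the steep slope above.
        tiers = [("budget", 150, 60), ("mainstream", 300, 100),
                 ("high-end", 400, 115), ("flagship", 500, 120)]

        for (p_name, p_price, p_perf), (name, price, perf) in zip(tiers, tiers[1:]):
            step_cost = (price - p_price) / (perf - p_perf)
            print(f"{p_name} -> {name}: ${step_cost:.2f} per extra performance unit")
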
  • by thepotoo ( 829391 ) <thepotoospam@@@yahoo...com> on Sunday April 30, 2006 @01:17PM (#15232489)
    someone please, PLEASE, tell me why no one likes ATI in Linux?

    I have used the out-of-the-box ATI Radeon drivers in SuSE; they were pretty much as easy to install as in Windows. And UT2004 (the only Linux game I own) seemed to run just as well as it did in Windows.

    So what am I missing that everyone hates so much?

  • Re:well, (Score:3, Interesting)

    by rapidweather ( 567364 ) on Sunday April 30, 2006 @01:44PM (#15232577) Homepage
    The graphics card is powered via the slot on the motherboard.
    I have had burn-outs of the motherboard power connector(s) due to too many cards, and they take hours to fix. One solution I have in place is dual power supplies, which takes the load off the motherboard power connectors. Extra hard drives and the CD-ROM drive can be powered by the second power supply. I just turn on the main power supply first, then the second one, which is fitted with its own toggle switch and power-on light. That way, the BIOS knows what to do. The next mod is a big externally powered fan aimed at the memory bank to keep it cooler. It comes on with the power strips, of which I use two: monitor on one, PC on the other.
    I hesitate to use a big graphics card for fear that the power draw would do this setup in. I'm using a 32 MB card now; the monitor, a Gateway 2000 EV900, wouldn't look any better with a 64 or 128 MB card, or so I keep telling myself.
  • by just_forget_it ( 947275 ) on Sunday April 30, 2006 @01:56PM (#15232619)
    That seems to be a growing trend in the industry, too. At my company the Network Admin gave us new CAD workstations equipped with ATI Radeon X600s, a mid-range gaming card at best. When you have bean counters in control, they tend to figure they're not going to reap $1,000 in increased productivity by spending that much more on 3D cards.
  • by theurge14 ( 820596 ) * on Sunday April 30, 2006 @03:37PM (#15233092)
    I spent $399 for it at Micro Center. Sure, it was a big improvement over the ATi 9500 Pro I had in there before. But in the long run, all I got out of it was that the three games I played ran just a bit smoother than before. That's it.

    And I'm done with the PC "ricing" subculture. All these wonderful Antec case fans from 2002 are loud, and all the money I've dropped upgrading this thing still leaves me with the same crappy Windows XP experience. Think about it: 1 GB of Corsair RAM, an Athlon XP 2800 processor, and two Serial ATA drives, all idle, yet I click on Control Panel and WAIT 5 seconds for Explorer to redraw my screen twice as all the icons flicker and reload.

    Can't wait for my Mac.
  • by coopaq ( 601975 ) on Sunday April 30, 2006 @03:43PM (#15233127)
    Just a tip... I bought a $350 7800 GS for my socket 754 A64 3700.

    After playing a few games I knew this wasn't right and returned it for a full refund.

    I immediately ordered a socket 754 motherboard with PCI Express ($69) and a 7900 GT ($299) from Newegg.

    The out-of-the-box performance was awesome and the overclocking headroom insane.

    You may have to reinstall Windows, but you may be able to get away with a restore.

    If you aren't too hung up on AGP, consider this.

    I just think the A64 3700+ @ 2500 MHz has a lot of life left in it for gaming. My choice was between CPU upgradability and GPU upgradability, and I chose GPU. Oblivion is my current haunt. I'm playing with HDR at 1600x1200 now. Cheers.

  • Re:Try $200 (Score:3, Interesting)

    by wyldeone ( 785673 ) on Sunday April 30, 2006 @05:17PM (#15233520) Homepage Journal
    That's not true.

    I have an nVidia 6800, bought for $300 a year ago, and it struggles with modern games. I've found that any card older than about six months won't play modern games with all the eye candy.
