
Forget Expensive Video Cards

Anonymous Reader writes "Apparently, the extra $200 you spend on a video card does not produce a proportional difference in performance. While $500 video cards steal the spotlight on review sites and offer the best performance possible from a single GPU, most enthusiasts find the $300 range to be a good balance between price and performance. Today TechArray took a look at the ATI X1900 XTX and Nvidia 7900 GTX, along with the ATI X1800 XT and Nvidia 7900 GT."
  • by remembertomorrow ( 959064 ) on Sunday April 30, 2006 @09:55AM (#15231662)
    But I will not even consider purchasing an ATI card until they get their Linux compatibility (drivers) up to snuff.

    I'd rather not be locked to one platform because of a piece of hardware.
  • well, (Score:3, Insightful)

    by joe 155 ( 937621 ) on Sunday April 30, 2006 @09:56AM (#15231665) Journal
    Obviously, just sticking in a crazily expensive video card won't make a system radically better; computers are largely bound to run at the speed of their slowest part (I know that doesn't always hold true). But if your computer costs $1000, spending $500 on a card wouldn't be sensible.
  • by suv4x4 ( 956391 ) on Sunday April 30, 2006 @10:04AM (#15231698)
    Wait a second, since when is $300 for a friggin' video card not expensive? Because there are $500 cards?
    If there were plenty of $2000 video cards, would $1000 not be expensive then?

    Someone's being brainwashed here...

    When a pretty good video card is in the range of $80-$160... now that's more reasonable.
  • Well... (Score:5, Insightful)

    by Aphrika ( 756248 ) on Sunday April 30, 2006 @10:10AM (#15231715)
    I have a PC with an ATi 9800 Pro in it which I use for gaming. I've had it since 2003 and it still plays a mean game of Battlefield 2 when I feel like it. If it runs a bit slow, I plonk the resolution down - by far the best way to get your game to run faster. Anyway, bottom line is - it runs whatever current game I'd care to buy for it.

    Now I've thought about upgrading, but two things have hampered me. The first is strictly technical - I have an AGP machine, so there's not a huge amount of difference over a 9800 Pro whatever I plug in there because it'll always be limited by the bus speed.

    The second is probably more of a personal thing - I've got mates who have the latest and greatest GFX cards in their machines, but I'll be damned if I can tell the difference between their games and mine. Sure, it's a slightly higher res, but are there any bonus features like fog or smoke? No. Better anti-aliasing? No. I spent my hard-earned cash on a Dell 20" widescreen monitor and I can assure you that as far as gaming experiences go, this added to mine much more than a new GFX card would.

    Maybe it's me getting old, but hardware upgrades now tend to come when I buy a new PC, and tend to be a notch under the top of the range. Although having said all this, I just picked up an Inspiron 9400 for work which did come with a GeForce 7800 in it, which I guess'll be useful for, um.... spreadsheets *cough*
  • by ecuador_gr ( 944749 ) on Sunday April 30, 2006 @10:22AM (#15231750) Homepage

    I really don't get it.

    It reaches exactly the same (and obvious) conclusion as any review I've seen on sites like HardOCP, Anandtech, or Tom's Hardware. Or is the news that this article is one of the most amateurish attempts at reviewing cards we've seen in recent history? Four benchmark runs (at least they use games) put together in little FPS graphs, along with a two-page grade-school-level analysis, and of course no details about more important things like image quality.

    Maybe it's just me, since I have never paid over $200 for any kind of card, and I would probably object to seeing such an article on [H], Anand, Tom's, etc. being made "news". However, this particular article is not even close to that level. It really doesn't seem to offer anything noteworthy.

  • Re:Whatever... (Score:5, Insightful)

    by X43B ( 577258 ) on Sunday April 30, 2006 @10:26AM (#15231771) Journal
    "I'm sure the $500 GFX cards only exist to make spending $300 on a single component of a computer seem reasonable."

    I'm sure you are probably joking, but I think you hit the nail on the head. Having a super-expensive card, even if it is a low seller, has several benefits.

    1) You will sell some to those who want to be ub3r133t
    2) You get the publicity of being "the best" even if no one actually buys the best
    3) Perhaps most importantly, the "Wendy's Effect". It is oft quoted that no one buys Wendy's triple cheeseburger. Someone at Wendy's decided that offering it was a waste so they removed it. However, this almost immediately reduced the number of double cheeseburgers sold. Apparently when people see that there is something more expensive and more "over the top" they are much more compelled to buy the next lower version than if that same version was the high end.
  • Re:Erm (Score:5, Insightful)

    by Proud like a god ( 656928 ) on Sunday April 30, 2006 @10:31AM (#15231788) Homepage
    To take CPU bottlenecking out of the equation. Comparisons of CPUs with the best graphics card likewise attempt to take GPU bottlenecking out of the equation.
  • by jellomizer ( 103300 ) * on Sunday April 30, 2006 @10:38AM (#15231812)
    I wish I had mod points today for that. The hard-core gamer expects so much out of their system, and the game companies oblige, which is what makes these cards ultra expensive. I say stick with seeing some polygons for a while and use the money on fun games, or food, or dinner(s) on a date.

    Back in the old days of the late 80s and early 90s, computer games were designed to run on the average-spec computer. CGA graphics were still available, EGA was widely used when VGA was new, and games could run in 256k of RAM. At the time, getting a VGA display and card cost close to $500-$600, and for a couple of years after that I saw only one game released for VGA. Computer game companies stuck with CGA and EGA for a long time because most people didn't have VGA displays. VGA finally became the standard around 92/93, and by then SVGA was out, but no games used SVGA; they were all EGA or VGA. Not until 94 were SVGA games introduced, running on Windows 3.1 right before 95 was released with Win32 abilities. But the games were targeted at, and performed optimally on, the standard computer of the time.

    Then Quake came out with support for 3D graphics cards and networking, and the hard-core gamers started spending all that money on these cards, because if you can see the moving pixels of your opponent while he is stuck at a lower resolution and getting some skips, you have the advantage and can kill him before he ever sees you. So hard-core gamers spent more and more money, game companies realized there was a market and made games that pushed the edge more and more, and that left a class break between people who play computer games for fun and hard-core gamers who do it for more than fun.
  • by nurb432 ( 527695 ) on Sunday April 30, 2006 @10:40AM (#15231822) Homepage Journal
    When you can buy the rest of the box for about the same price, spending that much on just video is lunacy.
  • Re:Whatever... (Score:5, Insightful)

    by ivan256 ( 17499 ) * on Sunday April 30, 2006 @10:48AM (#15231850)
    They write the games for the hardware that is out there. If the GPU sales strategy didn't work, you'd see more games for lower-powered cards, not more people with machines that can't handle modern games.
  • Re:Skewed results? (Score:4, Insightful)

    by CodeMonkey42 ( 965077 ) on Sunday April 30, 2006 @11:03AM (#15231921)
    Most real gamers (budget or otherwise) still use inexpensive CRTs, which produce the best image quality, have zero ghosting, zero dead pixels, etc., and easily do 1600x1200 for the latest games or 800x600 for "classic" games. My $200 NEC AS900 easily outshines the majority of LCD monitors in image quality, and the majority of games I play on my NVIDIA 6800 GT are indeed at 1600x1200.
  • by Starcub ( 527362 ) on Sunday April 30, 2006 @11:08AM (#15231947)
    I remember when the V1 3D cards first came onto the market. They were easily top of the line, and the best cards went for about $200. When the next-generation V2s came out, I pre-purchased the very first V2 SLI card (actually two cards bridged together) at the incredibly expensive price of about $600. It was a lot, but the card literally quadrupled the performance of the V1 I had, and the price very quickly fell another $200 before the V3s were out. Today you pay $500 for a top-of-the-line single-GPU card that doesn't even double the previous generation's performance. It seems video cards are becoming a disproportionately expensive component of the PC and just aren't providing the same value.
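    A rough back-of-the-envelope sketch of that value argument follows; the prices and speedup figures are assumptions chosen to match the claims above ("quadrupled", "doesn't even double"), not measurements:

        # Hypothetical performance-per-dollar comparison across generations.
        # Prices and speedups are illustrative assumptions, not benchmark data.
        generations = {
            "Voodoo2 SLI (~1998)":    {"price": 600, "speedup": 4.0},  # "literally quadrupled"
            "Top single GPU (~2006)": {"price": 500, "speedup": 1.8},  # "doesn't even double"
        }

        for name, g in generations.items():
            value = g["speedup"] / g["price"] * 1000  # generational speedup bought per $1000
            print(f'{name}: {g["speedup"]:.1f}x faster for ${g["price"]} -> {value:.2f}x per $1000')

    By that crude measure, the old high-end card delivered roughly twice the speedup per dollar, which is exactly the complaint.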
  • Re:Shock! Horror! (Score:3, Insightful)

    by tomstdenis ( 446163 ) <tomstdenis@gma[ ]com ['il.' in gap]> on Sunday April 30, 2006 @11:12AM (#15231967) Homepage
    I'd need to buy a 1600x1200 monitor first...

    Besides, I don't view "game not working on my setup" as a feature. If the game were well put together, it would have reduced texture/polycount modes so it could work on a more appropriate range of hardware.

    There used to be a day when programmers were judged by how well they could make the software fit the hardware, not the other way around.

    Tom
  • by a_greer2005 ( 863926 ) on Sunday April 30, 2006 @11:19AM (#15232000)
    A buddy of mine has an AMD 3800, an ATI Radeon X1900 XTX, and 2 GB of RAM, and maxing out the graphics in some of the latest games makes it noticeably jittery. So why spend $2000 on a gaming PC when an Xbox 360 does jitter-free HD for $400?
  • by TomHandy ( 578620 ) <<tomhandy> <at> <gmail.com>> on Sunday April 30, 2006 @11:22AM (#15232020)
    Because not every PC game someone might want to play comes out on the Xbox 360? Because some people don't like playing FPSes on consoles when they can play them with a mouse and keyboard on their PC? Because some people find Xbox Live to be overpopulated with whiny 12-year-olds, ruining the multiplayer experience? Because you miss out on some of the customizability and modding that you get with PC games (in Oblivion, for example)? Just a thought.....
  • by macemoneta ( 154740 ) on Sunday April 30, 2006 @11:26AM (#15232033) Homepage
    When I buy a replacement card (which I haven't had to do since I bought my GeForce FX 5600XT a couple of years ago), I buy whatever is currently at the $100 price point. That lets me play better than 95% of games well. If I were buying today, I'd get a GeForce 6600. It's more than good enough.

    No matter what card you buy, in a short period of time there will be a small number of games that need better. Chasing that carrot with no self control is an exercise in futility.
  • Re:Well... (Score:2, Insightful)

    by soupforare ( 542403 ) on Sunday April 30, 2006 @11:40AM (#15232098)
    The X800Pro (and up) beats the 9800Pro. Get something like the X850XT/PE and you're good for another long while.
    Remember, PCI-E was introduced for the future; we haven't yet hit games that saturate an AGP 8x bus.
  • by Ohreally_factor ( 593551 ) on Sunday April 30, 2006 @11:43AM (#15232110) Journal
    That's interesting info. I guess my knowledge is a few years out of date. I was under the impression that individual units in a render farm ran headless, and the whole idea was to use masses of cheap commodity hardware to do the heavy crunching.

    What you're saying does make sense, though. GPUs are just slightly more expensive cheap commodity hardware. =) And if it will cut down on render times while not raising costs by several orders of magnitude, it seems like a no brainer. Shorter render times = pushing more projects through = more money.

    Do you know which cards are commonly used? Are they $500 gaming cards? Cheaper gaming cards? More specialized cards?

    Also, I'd imagine that the really big post houses (with the state-of-the-art farms of which you speak) have access to driver source code, so they can create and optimize their own drivers. If that's the case, they probably can't, or wouldn't want to (even if they could), release those drivers. I'm just guessing. If you have more info, I'd love to hear it!
  • by Volante3192 ( 953645 ) on Sunday April 30, 2006 @11:50AM (#15232140)
    "It'd be better to spend the money not spent on gaming on tickets to a traveling Broadway play or a live concert."

    There's a better ROI on a video card, though. A play or a concert is a one-time thing, not to mention the additional costs of getting there, parking, and an outfit (especially for the play - not every nerd has a spare tuxedo floating around). Plus, if you have to pee, you lose part of your ticket price, and it's impossible to get that back.

    Video cards are used daily, with minimal extra cost (you're already paying for the electricity, minimal travel is involved, and who cares what you wear). And if you have to pee, you can pause and not lose anything.

    "Do parents feel good about this?"

    You'd have to ask Congress. They're the ones parenting now.

  • Old news. (Score:1, Insightful)

    by Anonymous Coward on Sunday April 30, 2006 @12:07PM (#15232230)
    I've been saying this for years. It's one of ATI's biggest faults that they have a bad tendency to focus almost purely on the high end and let their mid-range and low-end products fall behind (there are exceptions, but this has generally held true). Nvidia has earned a lot of profit selling products such as the 6600GT at lower prices than comparable ATI cards could match, while ATI concentrates on having the most powerful cards and getting those high profits from a much smaller number of people. Don't get me wrong, I'm no nVidia fanboy -- in fact, my current video card is (well, was) one of those overly expensive high-end cards from ATI.

    What really bugs me, though, is watching both of them try to fool people with even more expensive products that offer practically nothing. For example, those 512MiB VRAM video cards. The more ignorant consumers fall for it and believe the card is twice as good just because it has twice as much memory, so they pay an insane premium when, in fact, by the time more than one or two games can truly utilize that much RAM, the GPU is old technology and not really able to keep up with the other demands of those games. Each generation of cards pulls this trick, and each generation it earns them more money for a video card that performs only very slightly better.

    Here's an article for you: "Consumers don't research their video cards well enough and spend money on what sounds good rather than what is good. Videocard manufacturers capitalize on this." There, I just summed up the whole industry in two sentences for you.
  • by Anonymous Coward on Sunday April 30, 2006 @12:09PM (#15232239)
    I bet that if you set that "maxed out" PC game down to the 360's 1280x720, the X1900 would be a lot less jittery.
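    For what it's worth, here's the rough pixel-count arithmetic behind that bet (the PC resolution is an assumption; 1600x1200 is just a common "maxed out" setting mentioned elsewhere in this thread):

        # Frame-buffer pixel counts; fill-rate and shading load scale roughly with these.
        pc_pixels   = 1600 * 1200   # assumed "maxed out" PC resolution -> 1,920,000 pixels
        x360_pixels = 1280 * 720    # the Xbox 360's 720p output        ->   921,600 pixels

        print(f"{pc_pixels / x360_pixels:.2f}x")  # ~2.08x: the PC pushes about twice the pixels per frame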
  • Re:Shock! Horror! (Score:4, Insightful)

    by Nazo-San ( 926029 ) on Sunday April 30, 2006 @12:17PM (#15232277)
    "No way!!! BUY BUY BUY!!! /me happy with my 6600 :-) [it's the cheapest non-crippled PCIe card I could find at the time]"

    I'm sorry, but I have to inform you that your 6600 is VERY crippled, especially if you mean the non-GT version. The 6600 series started its life as a crippled card. The GPU, the NV43, is a weaker, crippled version of the NV40, and, probably more importantly, while it boasts fast-sounding GDDR3 memory, its 128-bit memory bus actually makes it unable to compete even with the slower GDDR memory of the 256-bit 6800 LE (see the rough bandwidth arithmetic at the end of this comment). That's right, even the elusive LE is a little better - excluding the possibility that the LE can be unlocked and overclocked to become a lot better; the 6800 NU comes out even further ahead, again excluding unlocking and overclocking on the AGP models. Mind you, if the 6600 had plain GDDR it would hurt even more, since with such a narrow bus it needs all the speed it can get to compensate.

    Actually, I have a point beyond just pointing out that little mistake. When the 6600GT was first released, it was called the Doom 3 card, and rightly so, because it could get some very nice quality settings out of a game with such high requirements - comparable probably even to a Radeon 9800, but at a lower price. And that price was not $500. Only today is the 6600 series finally beginning to truly show its weakness, in games like Oblivion (which can bring even an X850 to its knees with the right settings). The mid-range cards actually end up being the best investment, because by the time they lose their competitive advantage (cost vs. performance), even the high-end cards are starting to struggle. In other words, by the time a mid-range card can no longer get you acceptable quality settings out of a game, chances are a high-end card is no longer going to be good enough either. In either case you must upgrade within the same sort of time frame. If you spend $500 every time, it hurts a lot worse than if you just keep upgrading to mid-range cards. Even if the $500 buys you a little more time, it's not enough extra time to be worth the extra $200 or so.
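    Here's the rough bandwidth arithmetic referred to above. The memory clocks are assumptions (approximate reference specs of the era), not figures from the post:

        # Peak theoretical bandwidth of a DDR-style memory interface:
        #   bytes/sec = (bus width in bits / 8) * memory clock * 2 transfers per clock
        def bandwidth_gb_s(bus_bits, mem_clock_mhz, transfers_per_clock=2):
            return bus_bits / 8 * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

        cards = {
            "6600 GT (128-bit bus, ~500 MHz GDDR3)": bandwidth_gb_s(128, 500),
            "6800 LE (256-bit bus, ~350 MHz DDR)":   bandwidth_gb_s(256, 350),
        }

        for name, bw in cards.items():
            print(f"{name}: ~{bw:.1f} GB/s")
        # The wider but slower-clocked bus still wins (~22 GB/s vs ~16 GB/s), which is
        # why the 128-bit card can't keep up despite its "faster" GDDR3.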
  • yeah, no kidding (Score:3, Insightful)

    by TTK Ciar ( 698795 ) on Sunday April 30, 2006 @01:23PM (#15232503) Homepage Journal

    My usual criterion for the quality of a video card is: "how well does XFree86 support it?" (or I guess XOrg, now). A $50 or $30 card which works well for making xterms and Netscape appear on the screen is exactly what I want (and need).

    An advantage to being happy with inexpensive cards is that it becomes feasible to purchase a few of them, so you can standardize sets of machines on them. That goes double for network cards. It's handy to be able to swap hard drives between machines with impunity and have video + network "just work" without needing to fiddle with modprobe (or with XF86Config/xorg.conf).

    My old crop of machines was standardized on ne2k-pci compatible cards, but I'm transitioning to eepro1000 :-) It's a wonderful world, where gig-e cards can be had for only $50 (or $25, if I weren't stuck on the high-quality EtherPros), and 8-port gig-e switches for only $100. My standard video card is still the ATI Xpert98, though. Maybe it's time to restandardize there too.

    -- TTK

  • by just_forget_it ( 947275 ) on Sunday April 30, 2006 @01:32PM (#15232530)
    If your 3D design department is running gaming cards instead of workstation boards, then either you have an extremely stupid IT department or what you're designing is extremely simple.

    I hate to break it to you, but workstation cards are alive and kicking. The Quadro and FireGL are still available, and for $500 it is MUCH better to have a low-end workstation card than a high-end gaming card. Gaming cards do not have the full OpenGL functionality that 3D design applications need. In my experience, using the wrong type of 3D card can result in random program crashes and data corruption, since gaming cards are designed to throw as many pixels and textures at the screen as possible, not to accurately render polygons.
  • by Xugumad ( 39311 ) on Sunday April 30, 2006 @01:37PM (#15232547)
    Excluding the points other people have made: that's great for this generation. However, if I want my graphics to improve over the next five years, I'm going to be getting PCs...
  • Re:It depends. (Score:3, Insightful)

    by Fulcrum of Evil ( 560260 ) on Sunday April 30, 2006 @03:05PM (#15232941)

    "Engineers who genuinely need 3000x2000, or filmmakers who really do need 48-bit colour, probably have a need for a very high-end graphics card that supports these kinds of features. So, a generic "forget expensive graphics cards" may not entirely be fair."

    Ok. Forget expensive cards unless you have a reason not to. This is most people.

    "PCI-X has a relatively high latency, but games are real-time - if the data can't get displayed in time, it can produce some really ugly results."

    We're talking sub-millisecond latency - nobody can notice that.

    "Finally, there's the gratuitous mark-up factor. Graphics cards don't make much profit, because volume is low. However, shareholders and accountants don't care about volume. Neither do most company directors. They care about what they're able to rake in. A really good graphics card might possibly sell one graphics card for every fifty computers sold, so if they want to strut their stuff and look stinking rich, they need to mark up the boards accordingly."

    Sort of, but not really. Graphics card volumes are low because most people don't really need them; they use what came with the computer and it's just fine. Directors absolutely care about volume - it's part of revenue and determines pricing. The markup isn't gratuitous; it's what the market will bear. It has to be enough to pay for R&D or the company goes under. Lower-end cards are priced according to predicted price points so as to maximise revenue.

    "I believe it would be better if someone founded a cottage industry and made their own high-end graphics cards, making and selling them on weekends or other free time"

    This is possibly the stupidest thing I have ever heard. You can't make high-end graphics cards in your spare time; the time investment is too great. By the time you were ready to mask, nVidia would be selling the equivalent card for $50. This ain't tea cozies.

  • by GauteL ( 29207 ) on Sunday April 30, 2006 @05:51PM (#15233655)
    1. The games differ greatly from the XBox games. You may not like console games.
    2. The machine is upgradable
    3. The games cost almost twice as much for the Xbox 360 (at least here in the UK).
    4. You can use the computer for things other than gaming and watching DVDs.
    5. You may already have a PC that can be upgraded decently cheaply.

    I still think the latest graphics cards are unjustifiably expensive, but the older ones aren't so bad, and it was easier for me to justify spending £70 on a graphics card (GeForce 6600GT) than £300 on an Xbox 360.
  • by Trejkaz ( 615352 ) on Sunday April 30, 2006 @06:05PM (#15233709) Homepage
    Many of us were around when Doom 3 came out on Linux, and suffered ATI's drivers at that point in time.
  • by Anonymous Coward on Sunday April 30, 2006 @07:05PM (#15233948)
    I just dumped my Ti4200 for a 6800 for about $120. There are indeed games coming out that won't run acceptably on anything less than a 6200 - Oblivion, for example. There are going to be more and more games that won't play on older cards.

    Now if you're fine re-playing games you already own -- which works well for multiplayer mods and such -- then good for you. If you wait a few years, both the games and a card to play it with will be cheap. But if you're "spoiled" enough to want to play newer games, an older card is becoming less and less practical.
