ATI Radeon X1800 GTO Launched 117

SippinTea writes "ATI has also hastened to market with a launch of their own this week, with a new Performance Mid-Range Graphics Card. The Radeon X1800 GTO is a chopped-down version of the Radeon X1800 XL with 12 pixel pipelines and less expensive, lower speed GDDR3 DRAM on board. It compares well with the new GeForce 7600GT but can it compete with a GeForce 7900GT for only a few dollars more?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Beuno ( 740018 )
    Is it me or are there just too many video cards out there?
    • Yes, and the actual gameplay improvement you get in an FPS because you have a $500 card versus a $150 card is probably fairly small.

      Key things to look for

      1. Get nvidia. The driver support is there.
      2. Stick one revision back [e.g. 6xxx instead of 7xxx]
      3. Don't get "shared memory" LE or LS or whatever edition cards
      4. Don't get 256MB cards unless there is no price difference [or a very small one]
      5. Look for native TV out if that's your bag. Sadly, nvidia cards often need win32 drivers to get TV out working
      • by Anonymous Coward
        So you start with how the cards fare in FPS type games, then go on to tell people what to look for in linux media servers? This is exactly why there are tons of cards out there, everybody is looking for something different.

        Personally I want a card that can drive my 1920x1200 display at native resolution while I'm playing an FPS, so I'm pretty sure that low end card isn't cutting it. Really, I want a card that can run two of them, since I don't want to upgrade my relatively new system to one that will properly h

          • I'd say the number of people running 1920x1200 displays is in the minority, especially since most monitors are 1280x1024 or less.

          You can get by with decent gaming on a 6600, which will cost you $140. You don't need to buy a 7800 for $500 to play Far Cry or something. Filtering like what I suggested will land you a card in the 6xxx series that doesn't cost more than $200 and will let you play games at decent refresh rates and resolutions.

          So yes, there are a lot of cards out there but it's usually fairly easy to pick
          • But there is a growing need for ultra high resolution capabilities. As soon as the mainstream figures out how immersive ultra wide screen games can be, there will quickly be major demand for super high resolution capable video cards. I don't think the popular 1280x1024 viewing format will remain the norm for much longer. 3840x1024 will be a major driver of graphics card sales.

            For more information on what I mean, visit http://www.matrox.com/ [matrox.com]
            They are releasing an adaptor to turn many video cards capable o
    • by Ossifer ( 703813 ) on Saturday March 11, 2006 @05:07PM (#14899408)
      Is it me or are there just too few silent video cards out there?
      • Both Galaxy and Gigabyte are currently showing fanless GeForce 7600 GT cards at CeBIT (and are planning fanless 7900 GT and GTX models).

        Fanless graphics cards are becoming more and more common on the retail market, while a year ago they virtually didn't exist...

        • Fanless graphics cards are becoming more and more common on the retail market, while a year ago they virtually didn't exist...

          Funny, my THREE PROCESSOR 12MB Creative 3D Blaster Voodoo2 was purely passive cooling - no fans, not even a heatsink. When did that come out... 1995?? Same with my Voodoo3 2000 PCI. ATi Rage/Rage Pro/RageIIC (digging them up as I dig thru my old hardware box here.) Same thing. Those are pretty old, as well. Matrox G400 - no fan or heatsink, either.
          • Funny, my THREE PROCESSOR 12MB Creative 3D Blaster Voodoo2 was purely passive cooling - no fans, not even a heatsink. When did that come out... 1995?? Same with my Voodoo3 2000 PCI. ATi Rage/Rage Pro/RageIIC (digging them up as I dig thru my old hardware box here.) Same thing. Those are pretty old, as well. Matrox G400 - no fan or heatsink, either.

            You need to learn how to troll properly. The Voodoo 2 chipset, consisting of one PixelFX2 and two TexelFX2 chips, was designed as a multi-chip solution for sever
    • From the video card manufacturers' point of view, if they can sell people cards at different prices, then they can reach all the different reservation prices. One guy wants top-of-the-line, another wants midrange, another wants cheap. It's the way the free market works.

      I would take issue more with the naming conventions. They are all just strings of letters and numbers these days, and they keep getting longer and more complicated.

    • Yet not enough with free software drivers. If any video card company wants to increase the number of customers that they have and get a competitive edge, they could release technical information that would help free software developers, or write some free software drivers themselves.
        • Ground control to Major Tom! Free OSes are NOT a major share of the graphics powerhouse market, sorry. Linux support isn't the turning point in Nvidia and ATI's stalemate, as much as the slashbots would like it to be. Man, I'm SOOO getting modded down for this....
        • I didn't say that free operating systems were a "majority", but the fact remains that there are still a lot of people using Linux who are in need of a solution to their graphics card problem.

          Any company that fills this gap will be the ONLY company serving what is becoming a rather large niche. This means that they will get some sort of a bump in sales.

          Companies don't need to target majorities in order to make money.
    • by eyegone ( 644831 ) on Saturday March 11, 2006 @07:05PM (#14899923)
      And too (two) few companies.
    • So what is the difference between this and a Sapphire Radeon X800 Pro? Sounds like essentially the same board.
  • by Calibax ( 151875 ) * on Saturday March 11, 2006 @04:32PM (#14899316)
    There's one significant difference between the nVidia launches this week and the ATI board launched the same day. The nVidia products were available on launch day from on-line stores but the ATI product won't be available for "a few weeks".

    It looks like ATI wanted to steal nVidia's thunder by announcing their latest product the same day. The small issue of not actually being able to manufacture their product yet doesn't seem to be very important to them.
  • Linux drivers? (Score:4, Insightful)

    by Zugot ( 17501 ) * <`bryan' `at' `osesm.com'> on Saturday March 11, 2006 @04:45PM (#14899347)
    The vesa driver is sooooo unacceptable.
  • Finally proof!! (Score:4, Interesting)

    by B5_geek ( 638928 ) on Saturday March 11, 2006 @04:47PM (#14899352)
    I am officially an Old-Fart(tm).

    I looked at this and I thought, "so what, how many fps do kids need in their games anyways?"
    Then the exact next thought was: "Bah the drivers are still fubar in linux so why should I care."
    3rd: "How many /.'ers will make the same comments?"

    So officially, pass me a hat. I quit.
    Ahh, games, I do miss them so (the best FPS will always be Starsiege: Tribes), and eye candy; nah, it'd probably slow down my compile times.
    • I looked at this and I thought, "so what, how many fps do kids need in their games anyways?"

      Same as always, but as the cards get beefier the games tear through more and more graphical resources, and then you can activate HDR, Full Scene AntiAliasing (FSAA), Anisotropic Filtering, ... to the point that top-of-the-line latest released games manage to be unplayable if you enable every single graphical option.

    • it'll probably slow down my compile times

      Note to self: next sourceforge project: OpenGL blurscope for kernel compilations.

      Initiate project after: Current sourceforge project for mplayer script to play Memento DVD in correct chronological sequence.

    • I looked at this and I thought, "so what, how many fps do kids need in their games anyways?"

      Nope, the right question is "how many polygons at 30FPS."
      In some games more polygons = more detailed models. I don't give a shit.
      In other games more polygons = more enemies on screen at the same time. And that's when fun really begins!
    • Right. (Score:1, Funny)

      by Anonymous Coward

      I am officially an Old-Fart(tm).

      Presumably you are the grand old age of 26 or something. I see this all the time on /. and elsewhere: "You young 'uns today! Why, in my day we only had 32MB TNT2 cards when we played Quake3! I was lucky to get 60 fps, and liked it".

      Guys like me started out on punch cards, and worked with guys from the ENIAC-era...they used to do the "You young 'uns today...!" thing too, only they complained that we wouldn't know how to program with patch cords to save our lives.

      Those guys are

      • I'm not dead, yet.

        I've programmed a few computers that used patch cords and removable plug boards. They used discrete transistor logic, not vacuum tubes.

        How many people around here ever used a Tektronix storage tube graphics terminal? They used vector graphics and a weird display tube that didn't need to be refreshed by the electron beam. The hardware is long gone but you can still see plentiful references to the Tektronix 4000 series terminals in the UNIX documentation.

        We were so poor that we couldn'

    • Heh, me too.
      I thought my X800 was cool, but I must admit that over the last year I have found other things in life more important to me than having the latest graphics card. I don't even play games much anymore, so my purchase of the X800 ended up being a waste of money. Plus it has this "funny" bug where the 2 pixels in the lower right of the screen are duplicated across the first line on my HP LCD screen (the problem first showed up when installing the ATI drivers).

      Second, it seems that all new cards are
    • I don't really care about fps, so long as it's smooth the vast majority of the time. Being able to turn on some or all of the eye candy at higher resolution and still stay smooth is what newer cards can offer.

      As far as drivers are concerned, Linux does have a few mainstream games to play, but this appears to be a budget gamer's card, which, at the moment, pretty much relegates it to the Windows realm.

      As a sidenote, the best FPS was not Starsiege: Tribes, it was just plain old Starsiege. It still pisses me o
    • Tribes was amazing. Too bad it died. Try Wolfenstein: Enemy Territory. That game is the heir to Tribes, but free.
  • Speed Check (Score:4, Funny)

    by robotsrule ( 805458 ) * on Saturday March 11, 2006 @04:55PM (#14899373) Homepage
    If you put four of them together you can actually run the first full second of the trailer for the next version of Doom.
  • by Clockwurk ( 577966 ) * on Saturday March 11, 2006 @04:55PM (#14899378) Homepage
    It's a real shame Apple had to shackle its Pro notebook and consumer desktop with the uninspiring X1600. OS X relies on the graphics card for so much, and they give it so little attention. I hope they follow the lead of other OEMs and upgrade their products as new stuff becomes available, rather than delaying faster parts so that Steve Jobs has something to talk about at Macworld or WWDC.
    • Apple has for a long time offered what is ATI's low end for the PC. Most PCs don't ship with as nice graphics as ATI's low end though, and Macs have come with independent video memory, unlike the many PC's that ship with integrated video through Dell and the like.

      The latest generation of integrated video is much better though, and I can see the latest offerings from ATI, Nvidia and Intel being sufficient for most non-gamers, as long as they have at least 32MB of independent memory. I know ATI's chipset s
      • Anything is sufficient for non-gamers, and any graphics chipset -- shared memory or not -- released in the recent past is powerful enough to accelerate most of what OS X and Windows can offload to it. Hell, my old subnotebook's pathetic on-board Intel video was powerful enough to run WoW.
        • Using 100% shared memory for video causes small lags and stutters all over the place, except with the latest generation of integrated video. Even simple day-to-day things like scrolling or selecting a menu have tended to cause this. Main memory bandwidth is the biggest bottleneck for the CPU, and the latency added during the moments when video memory needs updates is what causes the lag.

          Given two identical machines, except one with shared memory and one with independent video memory, the first will be perc
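A back-of-the-envelope sketch of the bandwidth argument above. The DDR400 bus figure, desktop resolution, and refresh rate are era-appropriate assumptions for illustration, not measurements of any particular machine:

```python
# How much of a shared-memory system's bandwidth does the display alone eat?
# Assumed hardware of the era: dual-channel DDR400 and a 1280x1024 desktop.
bus_bw = 6.4e9               # bytes/s, dual-channel DDR400 (theoretical peak)

width, height = 1280, 1024
bytes_per_pixel = 4          # 32-bit color
refresh_hz = 60

# Scanout: bytes the display controller must read every second just to
# refresh the screen, before any scrolling, blitting, or 3D work happens.
scanout = width * height * bytes_per_pixel * refresh_hz
print(f"scanout: {scanout / 1e6:.0f} MB/s "
      f"({100 * scanout / bus_bw:.1f}% of peak bus bandwidth)")
```

Roughly 315 MB/s, or about 5% of the theoretical peak, is consumed before anything else happens; every scroll or window redraw then adds reads and writes over the same bus the CPU is using, which is where the stutter comes from.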
    • by MojoStan ( 776183 ) on Sunday March 12, 2006 @12:55AM (#14901063)
      Its a real shame Apple had to shackle its Pro notebook and consumer desktop with the uninspiring x1600.
      I think the Radeon x1600 is a fine GPU for their "professional" notebook and a very good GPU for their "consumer" desktop.

      The Mobility Radeon x1600 in their mid-sized MacBook Pro is ATI's second-best current-generation mobile GPU. The Mobility Radeon x1800 is ATI's current high-end part, and the only noticeable difference (for most users) between the x1600 and x1800 is 3D gaming performance, which is not worth the extra cost for the vast majority of MacBook Pro buyers. The x1800 is more appropriate for Alienware gaming notebooks or giant Dell XPS desktop-replacement notebooks.

      I think the (non-mobile) Radeon x1600 in the iMac is a heck of a nice GPU for a "consumer" PC. Any current generation GPU (like Radeon x1300 or GeForce 7300) would be a fine choice IMO because the extra 3D gaming performance would be a waste for the vast majority of iMac buyers. Anyone that needs more gaming power than an x1600 shouldn't be buying an all-in-one computer with non-upgradable graphics. It would be nice, however, if Apple offered a headless upgradable desktop that wasn't a freakin' workstation.

      OS X relies on the graphics card for so much and they give it so little attention.
      Are you talking about stuff like Quartz Extreme and Core Image/Video? I think the Radeon x1600 gives plenty of GPU power for OS X. Heck, Intel's maligned GMA 900 integrated graphics seemed to have snappy OS X performance [slashdot.org] on the Intel Developer Macs. Core Image only requires a Radeon 9500 or GeForce FX 5200, which are both two generations older than the Radeon X1600.
  • by Anonymous Coward on Saturday March 11, 2006 @04:56PM (#14899379)
    1. Spend hundreds of millions of dollars developing top of the line graphics card.
    2. Sell it for $500
    3. Spend a few more million dollars figuring out how to cripple top of the line graphics card.
    4. Sell it for half the price.
    5. Profit?
    6. Consumers figure out how to re-enable all the features that were crippled, making their $250 graphics card perform almost equal to the $500 version.
    • CPU vendors do the same thing. The Celeron and Sempron products are "crippled" high end CPUs, but it's not always a matter of crippling since the lower price points contain devices that failed to make the top bin split or have defects in one cache bank.

      However, to meet demand at the low end the vendors do end up disabling features on their mainline parts to dumb them down. In most cases, though, there's no way to undo the damage.
    • by wyldeone ( 785673 ) on Saturday March 11, 2006 @05:16PM (#14899440) Homepage Journal
      That's completely untrue. These cards can be sold cheaper because they don't need to meet as high manufacturing standards as the top-of-the-line cards. For those, every pipeline has to be perfect (or within an acceptable range of that) or the chip is thrown away. The brilliant thing about selling these kinds of cards is that they don't have to just throw those chips away. Instead, they disable the faulty pipelines and sell them for less. Thus they make $250 instead of nothing. Some people who buy them get lucky and get ones with mostly good pipelines. They can then re-enable the pipelines and get better performance; however, there will be problems like video corruption.
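The yield-binning argument above can be sketched with a toy simulation. The pipeline count, per-pipeline defect rate, and bin boundaries here are made-up numbers for illustration, not ATI's actual figures:

```python
import random

random.seed(42)

PIPES = 16          # pipelines per die (hypothetical)
P_DEFECT = 0.03     # chance any one pipeline is faulty (made-up number)

def bin_die():
    """Count working pipelines on a die and assign it to a product bin."""
    good = sum(random.random() > P_DEFECT for _ in range(PIPES))
    if good == PIPES:
        return "flagship"   # all 16 pipes perfect -> sell as the $500 part
    elif good >= 12:
        return "cut-down"   # disable bad pipes, sell as a 12-pipe $250 part
    return "scrap"          # too broken to sell at all

bins = [bin_die() for _ in range(10000)]
for name in ("flagship", "cut-down", "scrap"):
    print(name, bins.count(name))
```

Even at a modest 3% per-pipeline defect rate, only about 60% of dies come out perfect; the cut-down SKU turns most of the rest into revenue instead of scrap, which is the point the parent is making.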
      • I am using a radeon 9500 right now that is software modded to perform similar to a radeon 9700. Saved me more than 100 dollars at the time, and it works great. About 3 other people I know got in on the same deal, and only one of us has had problems with faulty pipelines. I know that this may have been a rare case, but it does happen that a large percentage of the lower-end "crippled" product is actually crippled. I doubt ATI saved 100 dollars making it a 9500 instead of a 9700.
      • That's completely untrue. These cards can be sold cheaper because they don't need to meet as high manufacturing standards as the top-of-the-line cards.

        What usually happens is that in the initial run of that group of graphics cards, they take perfectly capable cards and downrate them.

        Why? To get their product out on the market.

        Smart people figure out which cards can be softmodded (BIOS Flash) or hard-modded (messing with the PCB) and they go buy that card and bump it up to full power.

        Eventually nVidia/ATI

      • That's the concept, of course. However, I think the demand for cheaper cards can be higher than there are cards with faulty pipelines, which would result in the manufacturer selling perfectly working high-end chips with disabled pipelines. I don't know the yield rates or the market segmentation, but I believe it's a very likely scenario.
        • That's one way it could work - the problem is: testers are expensive, and tester time is precious. Post-fab testing of chips is a traditional bottleneck for chip designers -- you've got X million transistors in there. It's much quicker to run the chips through a "pass/no pass" test procedure than to debug "failures" to see that the failures are in, say, a pixel shader unit.

          I'm not saying it's not done; I'm just saying it's a business decision, and it depends on the value of tester time at a point in a chi
    • It's called price discrimination. ATI can price the flagship part lower to capture larger marketshare but in doing so it would be giving up the premium that first-adopters were willing to pay for the bleeding-edge. Guys who are willing to pay $500 for a card can only pay $250. That's not so good from a business standpoint. Companies used to wait a few months before lowering prices so it could capture the first-adopter premium and the later-adopters who were not willing to pay as much as the first-adopters.
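The reservation-price argument above reduces to simple arithmetic. A sketch using the $500/$250 price points from the thread and invented buyer counts:

```python
# Two buyer groups with different reservation prices (hypothetical counts).
enthusiasts = 1000   # will pay up to $500
mainstream  = 5000   # will pay up to $250

# One price for everyone: either skim the top or chase volume.
revenue_high = 500 * enthusiasts                 # mainstream is priced out
revenue_low  = 250 * (enthusiasts + mainstream)  # enthusiasts pay less than they would

# Price discrimination via a cut-down SKU: capture both groups at their ceilings.
revenue_tiered = 500 * enthusiasts + 250 * mainstream

print(revenue_high, revenue_low, revenue_tiered)
```

With these made-up numbers, the tiered lineup beats either single price, which is why both vendors keep shipping deliberately segmented product lines.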
  • Oh Yay... (Score:2, Insightful)

    by DarthChris ( 960471 )

    Another graphics chip, in case the 20+ already out there aren't enough choice for you.

    FTFA:

    Fortunately, years later we find a dramatically different competitive landscape on the graphics card front, as today's mainstream and performance segment GPU's are equipped with the technology and features that would annihilate flagship GPU's from a few short generations ago.

    And then:

    Looking at these basic specifications, it is certainly impressive to think that this is a $249 graphics card that has all of the f

  • by D. Book ( 534411 ) on Saturday March 11, 2006 @05:19PM (#14899452)
    Am I the only one who suspects the reason we now have a ridiculously confusing range of video chips has less to do with product differentiation and manufacturing efficiency than with the publicity that accompanies each new launch? ATI and nVidia seem to have gotten themselves stuck in a game where, if one were to announce a new product every month and the other every two months, the relative disadvantage in coverage of the latter company would result in a significant loss of consumer recognition.

    So they keep coming up with new variations that are trivially different from the existing products - a clock speed adjustment here, a few pipes disabled there - primarily to keep their name in the media. Even the "unannounced" chips are broadly reported, usually with something like "quietly released" in the headline.
  • ATI used to suck with linux drivers. If you wanted a fairly recent 3d card in linux, you had to go with nvidia.

    Is that still the case? If so, then I can't see why I would be interested in ATI.
    • Re:Linux drivers? (Score:1, Informative)

      by Anonymous Coward
      If you use Linux you shouldn't buy an ATI card. The drivers for the X1*** series of GPUs simply don't exist; and even when they do, ATI's patterns indicate that the Linux driver will deliver substantially fewer features and less performance than its Windows counterpart. This is in addition to how poor ATI cards are from a performance vs. price standpoint.

      More significantly, though: Xgl relies upon an OpenGL extension that ATI is unlikely to support. This means you will never get the latest and greatest X11
      • Sounds a little like FUD to me. Just a few hours ago I was playing with Xgl on an ATI Radeon 9700 Pro via the FGLRX drivers, and it was super smooth: the windows wobble like fluid, with all the other little details that you really need to see to believe (*if only* Vista were that good :), and with only 5-10% CPU usage. And this is coming from a FreeBSD/KDE user who is used to 60%+ CPU usage from some of the KDE apps (heh, I know the Kororaa demo was based on GNOME).

        And the fact is, you should know
        • Re:Linux drivers? (Score:3, Interesting)

          by Illissius ( 694708 )
          Grandparent was confusing things; it's AIGLX which relies on an extension the ATi drivers don't have (texture-from-pixmap), not XGL. XGL just requires a working OpenGL implementation, iirc. (AIGLX and XGL are basically two very different ways of achieving the same awesome.)

          And the fact is, you should know better anyway, you cannot expect Linux support from the latest and greatest hardware for such a minority of users when a good percentage of the market is still Windows based, thus being where the games

          • nVidia usually gets drivers out in a matter of days, or a week or two at most. (And that's not even mentioning that their drivers match the performance of their Windows counterparts, as they share the same codebase.)
            Do you know if that includes PureVideo and H.264 hardware acceleration? I can't find this info with Google. I'd be very pleasantly surprised if they did.
            • No, not yet.

              Don't look for Purevideo on Linux. Look for Xvmc support.

              Xvmc is an interface for hardware accelerated video decoding. Deinterlacing, yadda yadda. Via currently supports Mpeg-1,2,4, H.264, and some other goodies. Nvidia only supports Mpeg-1 and 2 right now. But expect more in the future.
      • by Anonymous Coward
        "ATI's patterns indicate that the Linux driver will deliver substantially fewer features and less performance than its Windows counterpart."

        So does the Linux Nvidia driver support Purevideo(C)? I think you'll find that the Linux drivers overall support less features than their comparable Windows version.
        • No, nvidia won't support Purevideo on linux.

          Rather, they support Xvmc, which enables hardware video decoding. Currently, their support is not up to date with Via's, but they are working on it. Mpeg-1 and 2 are supported in hardware. Expect Mpeg-4 soon.

          Nvidia's linux drivers lag behind windows, but only slightly.

          ATI's linux drivers will never be up to date with the Windows drivers. Hell, you're lucky if you can use your ATI graphics card before it becomes outdated (X1x00 series, ahem).
    • IMX, ATI sucks with Windows drivers (Catalyst Control Center, anyone?). nVidia is the only video vendor with any decent Linux driver support.
      • IMX, ATI sucks with Windows drivers (Catalyst Control Center, anyone?). nVidia is the only video vendor with any decent Linux driver support.

        Which is why I download the driver sets marketed toward dial-up users. No control center, only the drivers, and my computer takes less time to boot up than otherwise.

    • ATI's linux drivers don't even support the X1x00 series yet.

      Oh, and they haven't seen more than a 1% performance improvement (per release) on their last 5 driver releases.

      Mainly, it seems like ATI's linux drivers are "improving" in that you can now reasonably get them installed on most configurations.

      In terms of normal driver issues (unrelated to difficulty of install, or compatibility with kernel versions) their drivers are absolutely terrible. I find their lack of support for the X1x00 series disturbing.
  • Why? (Score:4, Interesting)

    by Vo0k ( 760020 ) on Saturday March 11, 2006 @05:30PM (#14899483) Journal
    I once set Q3Arena to deathmatch, one of the void maps, against bots. 300 of them. The frag limit was bumped to something like 500, and it wasn't much. The game was completely crazy but incredibly fun. With some luck you lived 10 or 15 seconds; the trick was not to avoid being killed but to frag at least two before you got fragged. The saw glove turned out to be an extremely good weapon, because at a good location you could run through a row of 30 or so bots shooting each other's backs and get 30 frags in a row.

    The problem? It was running at about 5 FPS.
    Now I'd like to get a card that would enable this kind of gameplay at reasonable speed. Crowded cities, armies of troopers, hordes of demons. Power in numbers, not detail. Completely new gameplay style. Screw high degree of reality, allow me to perform a multi-kill of 40 with one shot.
  • I wonder what the point of releasing such cards is. They usually cost about the same as previous-generation cards with very similar performance. The only reason for me to get a new crippled card would be to unlock some features, but I can't see how that is profitable for the manufacturer.
    • Well, can you see any point in releasing a new version of an OS, or a new, faster CPU? Do you think companies investing millions of dollars in research are mad? Dude, get a point and then comment. New-generation cards do not have the same performance and features: ever tried playing F.E.A.R. or Doom 3 on a 9800 Pro?
      • We are not talking about faster stuff. The GTO is a stripped-down version (made slower on purpose), and I was pointing out that there is the X700, which was top-of-the-line some time ago (or whatever the previous line of ATI cards was; I'm not following their development), with pretty much the same performance and features as this low-end version of the X1800 cards.
        • I can see your point there, but the X700 stands nowhere next to the X1800 GTO; you are comparing two different segments of cards. The top-end version in that series was the X850 XT PE, and even that doesn't stand up to this new GTO. First of all, the GTO has new features like SM3.0 support and hardware acceleration for video. And the good thing is that it consumes less power than the X850 XT PE while performing better: you get a quieter PC and less power dissipation. Technology is moving, and you can't ignore that. :)
  • Radeon X1800 XL with 12 pixel pipelines and less expensive, lower speed GDDR3 DRAM

    The 7900GT has 24 pixel pipelines on a 90nm process and is cheaper. 'Nuff said.
  • Well, this card is placed really nicely in the $200-300 price bracket. If you look at that price range, the 7600GT is at the low end and the 7900GT is out of budget. I think it's better than the nvidia 7600GT (if only the 7600GT had a 256-bit memory bus; why, nvidia, why?). The moment nvidia launched the 7600 and 7900 products, ATI decreased its prices. I don't think we are going to see the X1800 GTO in the market soon, as ATI launches are mostly paper launches. But I think it's a good move from ATI; they have created a new se
  • Tech Forum Watch [techforumwatch.com] has a good round up of the recent launches including the GTO, Quad SLI & Notebook SLI.
  • by EMIce ( 30092 ) on Saturday March 11, 2006 @06:38PM (#14899784) Homepage
    They are about offering more bang for your buck than the other guy. The midrange $150-$200 range is where you get the most for your money, and each time one competitor offers a better value, the other can't afford to sit back for too long. The midrange GPU segment is an incredibly efficient market, and that is why there are these frequent releases. Each company is fighting to stay ahead.

    One reason for this is that most midrange buyers are enthusiasts, and judging by the number of comments for a product on Newegg, one can see that as soon as a better value is offered by a new chip, sales quickly shift toward it. The Nvidia 6800 GS was selling like hotcakes for just the tiny stopgap period it was out, just to best the ATI X800 GTO until the 7600 GT showed up.

    I'm shopping for a card for a friend now, and have noticed that the midrange is good, but for high resolution play at 1600x1200 or 1920x1200, the midrange is barely cutting it now, so it becomes important to get the most bang for your buck, especially if you have an LCD with native high res and want to maintain quality. The new 7600 GT is about 15% faster than the 6800 GS, even w/ a 128 bit memory bus, and definitely hits a sweet spot at $190. It should run most popular titles comfortably at 1920x1200 and has next generation shader 3.0, unlike ATI's offerings below $200.

    Unfortunately for ATI, they haven't offered the best midrange value since their 9xxx line. ATI took Nvidia's crown a while back but Nvidia has had it back for some time now.
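The bang-for-buck comparison above is just a ratio. A sketch using the $190 price and 15% speedup mentioned in the comment; the baseline framerate and the 6800 GS price are hypothetical:

```python
# Hypothetical baseline: 6800 GS at some framerate, 7600 GT ~15% faster.
price_6800gs, price_7600gt = 165.0, 190.0   # street prices (approximate/invented)
fps_6800gs = 60.0                            # made-up baseline framerate
fps_7600gt = fps_6800gs * 1.15               # the 15% gap cited in the comment

for name, fps, price in [("6800 GS", fps_6800gs, price_6800gs),
                         ("7600 GT", fps_7600gt, price_7600gt)]:
    print(f"{name}: {fps:.0f} fps, {fps / price:.3f} fps per dollar")
```

At these made-up numbers the two cards come out nearly identical in fps per dollar, which is exactly why a small price cut or a modest speed bump shifts sales so quickly in this segment.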
    • I'm shopping for a card for a friend now, and have noticed that the midrange is good, but for high resolution play at 1600x1200 or 1920x1200, the midrange is barely cutting it now, so it becomes important to get the most bang for your buck, especially if you have an LCD with native high res and want to maintain quality.

      I just upgraded to a GeForce 6600 GT from a Radeon 9500 Pro and I've had both of these hooked up to a Dell 2405FPW. While I am running my 2D space in 1920x1200, I don't tend to run any o

  • I'm a photographer, and most of my work is photo editing with Adobe Photoshop and RAW picture conversion with Canon's ZoomBrowser. I read in a FAQ from Adobe that they mention upgrading your video card to improve performance in Photoshop, and was wondering what aspects of a video card I should be looking at for this work. I don't need any of the video-game-type enhancements in a card, so should I just look at the speed and amount of DRAM on the board? Is the X1800 GTO going to be a good choice
  • Wouldn't you expect that a "GTO" edition of a card is better than the plain-jane version?

    Recently I upgraded my card. If it weren't for Tom's Video Card charts and some more reviews to round that out, it would have been impossible to tell which cards were better than which, let alone which was the best value.

    I really think the numbering and naming schemes do the companies a disservice.
