Previewing ATi's Radeon X800 XT & X800 Pro

Giant_Panda writes "A few short weeks ago, it looked like NVIDIA was back on track, as they were able to overtake ATi and reclaim the 3D performance crown with their GeForce 6800 Ultra. Now, it seems like ATi has fired back with a killer card of their own. HotHardware just posted a preview of the new 12-pipe ATi Radeon X800 Pro ($399) and 16-pipe ATi Radeon X800 XT ($499). The X800 XT seems to be faster than even the new GeForce 6800 Ultra Extreme cards that were rumored to exist on a few sites this past weekend, and the X800 Pro is a great performer as well. (Other sites have just posted previews: TechReport, Hexus, Lost Circuits)"
  • Video Arms Race (Score:3, Insightful)

    by networkBoy ( 774728 ) on Tuesday May 04, 2004 @11:52AM (#9051941) Journal
    At what point is there simply too much noise vs. signal about how good one card is vs. the other? If you're a fan of nVidia you're going to buy their card no matter what, and likewise for ATI, no? -nB
    • I'm not sure what to choose now!
    • Re:Video Arms Race (Score:5, Insightful)

      by NeoFunk ( 654048 ) on Tuesday May 04, 2004 @11:59AM (#9052030) Homepage

      Buying a card just because you "prefer" that particular brand is stupid. There's nothing wrong with brand loyalty, but true enthusiasts will always go with the best product.

      I was an Nvidia "fanboy" for quite a while, until their cards started to suck. My latest video card purchase was a Radeon 9800 pro, and I couldn't be happier.

      • Re:Video Arms Race (Score:3, Insightful)

        by networkBoy ( 774728 )
        Great for you! What I was getting at is that, IMHO, most people, when faced with two video cards closely related in specification, will likely buy the same brand as they had before (unless experience == bad). I don't understand the need for this level of power consumption and processing horsepower in a video card for even the most demanding games. I can only think the nVidia or ATI cards would really shine in their respective demos and in scientific rendering applications (nMRI, 3D modeling, etc.). Just my...
        • Re:Video Arms Race (Score:5, Insightful)

          by Short Circuit ( 52384 ) <mikemol@gmail.com> on Tuesday May 04, 2004 @12:52PM (#9052805) Homepage Journal
          For some of us Linux users, there's a second consideration: How difficult will it be to set up?

          Once I learned how to set up my Riva TNT2 with the NVidia drivers, I didn't have much of a problem doing it again whenever I upgraded my kernel.

          However, that didn't prepare me for the obstacles involved in setting up my recently-bought ATI Radeon 9000. I'm not saying it was harder, just different.

          I would have preferred to upgrade to a new NVidia card, but I didn't want to go back to a 2.4 kernel. (At the time, you needed to apply a third-party patch to the driver glue to get it to work with the 2.5/2.6.0-pre* kernels.)

          Now, I'm happy to say that my Radeon works fine, and I don't need to reinstall a driver every time I upgrade my kernel.
        • by bonch ( 38532 ) on Tuesday May 04, 2004 @01:12PM (#9053113)
          The difference I see this time (and which I wrote about in a related post here) is that the new nVidia card is a power hog and requires you to buy a new power supply if you don't meet the requirements for its two-slot design. The X800 takes up just one slot while generally matching the quality.

          I guess I just see that two-slot, power-sucking design as a huge hassle. I can't imagine how noisy it must be, though I haven't heard it really mentioned in reviews. But I think the non-fanboys will take a look at the two cards, see that one takes up one slot and the other takes up two, and go with the one...
        • Re:Video Arms Race (Score:3, Insightful)

          by dnixon112 ( 663069 )

          I don't understand the need for this level of power consumption and processing horsepower in a video card for even the most demanding games.

          Just like every advanced commercial technology, not many people 'need' the power of the most high-end products. But for those of us who buy at the more affordable price points, the release of these cards is just as significant. I'm sure soon enough you'll be able to pick up a 9800pro for dirt cheap, and for people like you that's probably great. In another year's time...

    • Re:Video Arms Race (Score:3, Informative)

      by Xugumad ( 39311 )
      Err, no. I first bought Nvidia, then ATI, then Nvidia twice, then ATI again, and am probably going to go with ATI this time because I don't want to spend a fortune on a new power supply, or the electricity bill...
    • My 3d card history goes as follows:

      1999: Voodoo3 AGP
      2001: GeForce 2 MX
      2003: GeForce FX 5200
      2004: ATI Radeon 9700 Pro

      Many I know follow the benchmarks and nothing more when buying. The only reason I used to be loyal to nVidia is because I used to run Linux (ATI has shit Linux drivers).
    • Actually no (Score:2, Interesting)

      by Vermy ( 456774 )
      I was a die-hard NVidia/Dell fan for years. Click, click, and I got my nice neat box in two days, and watched as my roommates who built their boxes locally with cards other than Nvidia chipsets had problems. I quietly chortled in my dark room of Tribes while they rebooted.

      But over the Christmas holiday, it finally came time to upgrade. I decided to save a few bucks (actually, this was more a mandate from the wife) and build the box myself. This actually meant that I had to do some research instead of the click and...
    • Re:Video Arms Race (Score:2, Interesting)

      by Seven001 ( 750590 )
      It may seem that simple, but for a lot of people it isn't. I have only had NVidia cards, but my next one, if I have a choice (and I probably will), WILL be an ATI card. Not because anything was particularly wrong with my NVidia cards, they are still running actually, but because I think that ATI's cards in the range I want to buy in, are superior. Plus I'd like to try an ATI card at least once. I can't make any true judgements or be a fan boy for one particular brand when I've not tried them both.
    • I don't want to have to retrofit my computer with a 500-watt power supply, and I don't want my video card taking up TWO goddamn slots. :)

      The X800 matches or betters the nVidia card while having a lower transistor count and lower supply requirement (350), thereby meaning I can run the damn thing in just one slot!

      OEMs are going to balk at needing to suck up two slots when they can just go to ATI and get an equal card that takes up one.

      The only difference I can see is PS 3.0, which ATI chose not to bother with...
    • No as in no. I am a long time fan of Nvidia, but I switched to ATI this last time, since they've been fooling around too much with their chipsets without producing any real results. The only real reason for their latest offering being a bigger jump than normal is due to the fact I was hardly alone in my switch to ATI and the marketing people got the message loud and clear. Hence their latest card being so much faster than the 9800 series.

      Here is a good example of why being a fanboy is plain stupid. Look...

    • Re:Video Arms Race (Score:3, Insightful)

      by homer_ca ( 144738 )
      High end gaming cards are a specialized niche. They cost as much as two game consoles, and unlike other components of a fast PC, they're only good for games. So you can't justify it for non-gaming purposes. If you're a casual gamer or a budget gamer, you can save a load of cash on the video card by just running games at 800x600.
      • Re:Video Arms Race (Score:4, Interesting)

        by Viking Coder ( 102287 ) on Tuesday May 04, 2004 @02:36PM (#9054390)
        they're only good for games

        Guess again. Medical [utah.edu] volume [siggraph.org] visualization. [computer.org]

        Now, if your point is that for MOST consumers they're only good for games, you may have a point. But the other way to look at it is that, since consumers have demanded such amazing video technology, the price to deliver advanced medical visualizations to doctors has dropped dramatically.

        What you used to need a $40,000 SGI O2 for, now you can do with a $1000 computer from Best Buy. That computer might actually save your life some day. Pretty amazing, if you think about it.
  • by AsTrONoT ( 711544 ) on Tuesday May 04, 2004 @11:54AM (#9051958)
    Seems like they're cutting the traces on the extra pipes when creating the 12-pipe Pro version. Not that soft-mods were universally successful anyway.
  • by Anonymous Coward on Tuesday May 04, 2004 @11:54AM (#9051960)
    A complete list of articles related to this can be found @ OverclockersClub.com [overclockersclub.com].
  • Half-life 2 (Score:5, Funny)

    by NeoFunk ( 654048 ) on Tuesday May 04, 2004 @11:55AM (#9051974) Homepage
    ... but will it ship with a voucher for Half-life 2? "Now only 2 video card generations away! Buy now!"
  • Damn... (Score:3, Funny)

    by ajiva ( 156759 ) on Tuesday May 04, 2004 @11:56AM (#9051977)
    These new video cards have more memory than I have RAM! Geez...
  • by Goronmon ( 652094 ) on Tuesday May 04, 2004 @11:56AM (#9051992)
    At the levels of performance that you are talking about with the ATI and NVidia cards, there really isn't a large difference between a few frames here and there. I mean, most of the time, the declared "winner" only bests the other card on a majority of the tests, not all of them.

    Just pick whichever brand you like better and you'll feel better off letting go of that $500...
    • It NEVER hurts to have the best performance.

      Sure, both nvidia and ATI's latest cards will play all current games at great framerates, but once you start to pile on things like high resolution, anti-aliasing, anisotropic filtering... you need all the performance you can get. Even these newest cards probably won't be able to play FarCry perfectly at 1600x1200 16xAA 16xAF with full details...

      More performance is never superfluous.
    • by onion2k ( 203094 ) on Tuesday May 04, 2004 @12:05PM (#9052118) Homepage
      The performance might be similar at the top end, but there is a difference that could swing it in favour of ATI. Power consumption on the X800 cards is a lot lower than the nVidia alternative (it's actually lower than ATI's own 9800 cards). Less power for the same performance means lower temperatures, and quieter, slower fans.
      • by HarvardAce ( 771954 ) on Tuesday May 04, 2004 @12:48PM (#9052734) Homepage
        To clarify on a few points of the parent:

        The nVidia 6800 Ultra requires two dedicated molex power connectors, and it also requires a 480W power supply. More details. [anandtech.com] Now that's a lot of power.

        Also, the cooling setup on the 6800 Ultra takes up a slot of its own, which means you lose a PCI slot as well, although now that most of the features PCI cards provided (such as sound and NICs) are integrated into motherboards, it's not too big of a deal.

        Lastly to note, nVidia is releasing a lower-powered 6800GT which is approximately equivalent to the X800 Pro card, and they just recently announced a 6850 Ultra which is basically an OEM-overclocked 6800 Ultra. That thing will probably take up 5 slots, have a built-in A/C unit, and have its own cold fusion reactor as well.
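A rough headroom calculation shows why the 480W recommendation raises eyebrows. Every wattage below is an illustrative guess for a period-typical build, not a measured or manufacturer-specified figure:

```python
# Back-of-the-envelope PSU headroom check for a hypothetical 6800 Ultra build.
# All component wattages are illustrative placeholders, not vendor specs.
components = {
    "CPU": 90,
    "motherboard + RAM": 35,
    "GeForce 6800 Ultra": 110,
    "drives, fans, misc": 45,
}

psu_rating = 480  # watts, the supply size reportedly recommended

estimated_draw = sum(components.values())
headroom = psu_rating - estimated_draw
print(f"estimated peak draw: {estimated_draw} W, headroom: {headroom} W")
```

Even with generous per-component guesses, the recommendation leaves a large safety margin, which is presumably to cover cheap supplies that can't deliver their rated wattage on the 12V rails.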
    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Tuesday May 04, 2004 @12:31PM (#9052454)
      Comment removed based on user account deletion
      • Tell me about it. From October till February my 9600 Pro was useless for many games. There is no use in having top hardware if the drivers are no good. See this [hardwareanalysis.com] 83-page(!) thread with other people who were having problems.
  • But the lack of Linux-drivers is holding me back. Not only does NVIDIA have Linux-drivers, they have 64bit drivers as well! Yes, X800 is better overall than 6800 is. But fact is that one of them works well with Linux, while the other one does not.

    Ati: If you want to have my money, you better pull your thumbs out of your ass and write some Linux-drivers!

    Or maybe I will buy this card, and hope it works well with the Generic Ati-drivers that ship with Xorg/Xfree...
    • Don't count on it working well in the next kernel release, as nvidia's current kernel drivers are incompatible.
      • by bflong ( 107195 )
        Can you please elaborate on this statement?
        I'm running 2.6.5 right now with Nvidia's drivers on my Debian system. I'm having no problems whatsoever. What kernel release are you talking about?
    • by Dragoon412 ( 648209 ) on Tuesday May 04, 2004 @12:04PM (#9052102)
      What, exactly, would be the purpose of running this card on a *nix box? To play cutting edge games like America's Army and Quake 3?

      I don't mean to troll, but every time there's a post about some new bleeding edge video card, there's always someone getting modded up to +5, insightful for saying he'd buy it if it weren't due to lack of driver support, and I'm left wondering what the hell for?
      • by Dot.Com.CEO ( 624226 ) * on Tuesday May 04, 2004 @12:10PM (#9052186)
        This is a valid question, to which there is a valid answer. There are a lot of people out there (myself included) who use Linux as their main desktop, only booting to Windows to play the occasional game. Now, if I can play a game in Linux (native like UT2004 or under winex) I do, and when I do, I want to have comparable features to Windows. So, whereas I did not buy my 9800 Pro so that KDE refreshes windows faster, the fact that I could use it to play the couple of games that exist in Linux is a bonus.

        Anyhow, the original poster is wrong and therefore this discussion is irrelevant.

        • Here's a simpler answer -- people buy the high-end cards to play new games in Windows, but they still want the thing to at least work in X Windows.
        • Anyhow, the original poster is wrong and therefore this discussion is irrelevant.

          The drivers Ati provides are nowhere near as good as the ones NVIDIA provides. And since my next system will have an Athlon 64 and the OS will be 64-bit Linux, I NEED 64-bit drivers! NVIDIA has them; does Ati? I have heard some vague rumours that they _might_ make 64-bit drivers available "sometime in the summer", but that's it. Until that happens, I either have to buy an NVIDIA card, or use some generic drivers that give me a...

          • by Dot.Com.CEO ( 624226 ) * on Tuesday May 04, 2004 @12:36PM (#9052537)
            I STRONGLY suggest you read around about the problems people are having with Nvidia's 64-bit drivers before investing your money. True, ATI have NO 64-bit drivers, but perhaps that should tell you that it is too early for you to get a 64-bit system. They are unstable and run at half the speed. Now, if that is "good enough" for you, fair enough, but I'd much rather wait a couple to six months so that things settle down in the 64-bit arena.
        • by Slack3r78 ( 596506 ) on Tuesday May 04, 2004 @01:09PM (#9053036) Homepage
          The original poster is technically wrong, but as far as I'm concerned, in spirit they're about right. Comparing the ATI and nVidia Linux drivers is an absolute joke. With nVidia, you download a simple shell script, it checks for a precompiled module for your kernel, and if it doesn't find one, it builds one for you and installs it. After that, you change *one* line in your XFree86 config file and you're done.

          ATI, on the other hand, was a complete nightmare the last time I installed their drivers on a Linux box for someone. I'm fairly proficient in Linux, and he was running Slackware which is the distro I run myself day in and out. It still took us a couple of hours of playing around to get the drivers working properly due to a combination of quirky behavior and EXTREMELY poor documentation. I wouldn't mind doing it all manually, as long as the documentation is clear and concise and helps you get things done in a reasonable amount of time.

          Personally, I do keep a Windows box around for gaming, but the parts from this get hand-me-downed to the Linux machines as I upgrade. For that reason, Linux drivers are important to me, and I'll be buying nVidia next time I upgrade. I can deal with spending 5 minutes on a shell script and a reboot to upgrade my video card - I can't handle 2 hours to do the same thing with an ATI card.
      • by adamjaskie ( 310474 ) on Tuesday May 04, 2004 @12:20PM (#9052302) Homepage
        Um, Quake 3 and America's Army? Yes, I can play those under Linux. Also, we have such titles as Savage and UT2004, recent games available for Linux, and the upcoming Doom III. The MOST graphics-intensive games seem to come out for Linux as well.
      • by Glock27 ( 446276 ) on Tuesday May 04, 2004 @12:25PM (#9052381)
        I don't mean to troll, but every time there's a post about some new bleeding edge video card, there's always someone getting modded up to +5, insightful for saying he'd buy it if it weren't due to lack of driver support, and I'm left wondering what the hell for?

        Seeing as how none of the other replies mentioned it, one reason is to do cutting-edge OpenGL development under Linux. There is significant interest in doing Linux game development using cross-platform toolkits of various types. One example is GarageGames' Torque engine [garagegames.com]. Write to that, and get Windows, Mac and Linux support with very little (if any) tweaking. IMO, Linux is the best and most cost-effective platform for game development.

        This is why, once again, my next video card purchase will most likely be from NVIDIA. I'll get ATI if I manage a G5... ;-) (I wonder how soon the G5s will get these cards?)

    • by Dot.Com.CEO ( 624226 ) * on Tuesday May 04, 2004 @12:06PM (#9052133)
      ATI have had 3D drivers for some time now. They work fine for me (ATI 9800 Pro), although you will need to patch the drivers if you want to run Suse 9.1. Also, the fact that nvidia provide 64-bit Linux drivers does not automatically mean they are any good. And just so you don't have any doubt: they are not. So, yes, you are wrong, and whoever modded you up to +5 Insightful does not really follow ATI Linux.

      Could the drivers be better? Oh yes. Are they up to nvidia's standard? No. But they ARE listening, and since the last update you can play winex games with hardware acceleration, so there's no problem there...

      • by ImpTech ( 549794 ) on Tuesday May 04, 2004 @12:28PM (#9052423)
        Contrary to the parent, the ATI Linux drivers only work 'fine' for very liberal definitions of the word. They are slow. They are buggy. They do not support Xinerama. They do not support FSAA in any usable way. Hell, they don't support 16-bit video modes. ATI may be 'listening', but that doesn't necessarily get drivers written. I, and many others, are still stuck using ATI drivers released mid last year because all their subsequent releases have been worthless.

        I bought a 9600 Pro thinking that whatever drivers ATI had would be 'good enough'. Well, they aren't. Not by a long shot. If I weren't so fundamentally opposed to separate power connectors for video cards, I might've traded it in for an nVidia months ago. Those drivers are the sole cause of instability in my system. If you're buying a card for Linux, buy Nvidia. Case closed.
    • The lack of linux drivers??? The card came out today. I can't even download a windows version of the driver yet. I don't believe that Nvidia has a driver out for the 6800 yet do they? So what are you complaining about?
  • by gUmbi ( 95629 ) on Tuesday May 04, 2004 @11:57AM (#9052008)
    I'm going to wait for the 'NVidia 7000 Ultra Extreme Pro Super Plus - Special Limited Edition'. Then I'll be so very l33t.

    Jason.
  • Other reviews (Score:5, Informative)

    by RonnyJ ( 651856 ) on Tuesday May 04, 2004 @11:57AM (#9052009)
    Here are two other reviews, one at AnandTech [anandtech.com] and another at TomsHardware [tomshardware.com].
  • Question (Score:5, Insightful)

    by pubjames ( 468013 ) on Tuesday May 04, 2004 @11:58AM (#9052020)

    Is there any point in getting one of these cards for any reason other than playing the latest games?
    • Re:Question (Score:4, Informative)

      by fitten ( 521191 ) on Tuesday May 04, 2004 @12:01PM (#9052071)
      Is there any point in getting one of these cards for any reason other than playing the latest games?

      nVidia cards tend to have good OpenGL support, and OpenGL is used by a number of "high end" CAD and rendering packages. These cards will work well for folks who don't want to spend the $1500 for the high-end CAD cards, which are almost the same thing (there are some differences, but these will do well on a smaller budget), though $500 for a card is pretty pricey to me. :)
  • by Thagg ( 9904 ) <thadbeier@gmail.com> on Tuesday May 04, 2004 @11:58AM (#9052027) Journal
    While it's true that both ATi's and Nvidia's new cards scream, it has to be noted that ATi decided not to compete with Nvidia on quality. The new 3.0 versions of the Vertex and Fragment shaders, as implemented in the NV40, are a stunning advance over the 2.0 shaders in the newest ATi cards.

    At my company, we had considered using hardware for the final rendering on some of the shots in our current visual effects movie, but the 2.0 shaders just didn't have the capability -- they really are suited only for games (not too surprising; that's where 99% of the market is). The lack of fully-functional floating point buffers, the limitation on the size of the shader programs, the lack of texture mapping in the vertex shaders -- these are all devastating to the notion of doing high-quality hardware rendering.

    All of these limitations, and more, were addressed in the new 3.0 shaders.

    I am sure that ATi will support these features eventually, as games come to require them -- but right now you are really comparing apples and Porsches when you compare ATi's and Nvidia's latest offerings.

    Thad Beier
    • Yeah, I'm sure PS 3.0 games are coming down the pipe any day now, right?

      I haven't seen it, but by all accounts, what ATi's managed to do with PS 2.0 in their Ruby demo makes PS 3.0's use seem rather superfluous. And we all know that within a couple months, we'll be seeing the X850 and X900, that probably will have PS 3.0 support.

      If the inclusion of PS 3.0, an as-yet-unused and still far-in-the-future spec, is the sole factor you're taking into account in terms of "quality," I can see why you're let down, but...
    • ATI makes no bones that the radeon series is for gamers. If they can give you enhanced performance during the product cycle at the sacrifice of a new feature that no game will use during this card's usability, who cares?

      The real question for the gamer is how large the intersection is of the set of games that will (in the future) be able to run on these cards at a playable speed and the set of games that will use this feature. The answer is not clear to me that this intersection would be large.

      Case in point...
    • by Anonymous Coward on Tuesday May 04, 2004 @12:25PM (#9052366)
      While it's true that both ATi's and Nvidia's new cards scream, it has to be noted that ATi decided not to compete with Nvidia on quality. The new 3.0 versions of the Vertex and Fragment shaders, as implemented in the NV40, are a stunning advance over the 2.0 shaders in the newest ATi cards.

      That means that ATI has decided not to compete with NVidia on compatibility. On shader quality, the screen shots at Toms Hardware [tomshardware.com] suggest that it is NVidia that has chosen not to compete. Why would you care about a 3.0 shader language from a card that still doesn't give you correct output of 2.0 shaders?
  • by mattkime ( 8466 ) on Tuesday May 04, 2004 @11:59AM (#9052031)
    what i'd really like to know is if these new cards will outperform my geforce 2mx in wordperfect scrolling.
  • by jago25_98 ( 566531 ) <<slashdot> <at> <phonic.pw>> on Tuesday May 04, 2004 @11:59AM (#9052035) Homepage Journal
    Someone please divide price by benchmark and plot this in a graph!

    Maybe I'll do it if no one else can be bothered.
    • by Geek_3.3 ( 768699 ) on Tuesday May 04, 2004 @12:15PM (#9052247)
      These new cards are gonna be LOOOOOOOW on the performance/price ratio, relatively speaking. That is why I usually don't like those types of graphs--they kind of give you a firm grasp of the obvious: that expensive cards have a crappy performance/price ratio compared to a more reasonable card (e.g. Radeon 9600XT or GeForce 5700). Not to spoil anything, but as history dictates, I would imagine that the new offerings from ATI/nVidia will be in a dead heat for last place on this particular ratio.

      That, and it would seem that each card has its respective wins in different disciplines anyway... Radeon = better in newer games (Farcry, etc.) and situations where you have a lot of options on, while nVidia tends to be better in older games, but not a slouch in any particular discipline either, so it would be harder to find out what index you would want to use for this particular graph.
    • by Loualbano2 ( 98133 ) on Tuesday May 04, 2004 @04:20PM (#9055776)
      Here is a chart that shows what you are looking for. It doesn't cover cards made after December 2003, but it is still useful.

      http://www.tomshardware.com/graphic/20031229/vga-charts-16.html [tomshardware.com]

      ft
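The price-divided-by-benchmark graph requested upthread takes only a few lines to sketch. The prices below are the MSRPs from the story; the frame rates are made-up placeholder numbers, not benchmark results:

```python
# Hypothetical price/performance comparison. Only the MSRPs come from the
# story; the FPS values are illustrative placeholders, not measurements.
cards = {
    "Radeon X800 XT":     (499, 75.0),   # (price in USD, avg FPS)
    "Radeon X800 Pro":    (399, 65.0),
    "GeForce 6800 Ultra": (499, 72.0),
    "Radeon 9600 XT":     (169, 40.0),   # a midrange card for contrast
}

# Frames per second per dollar -- higher means better value.
ratios = {name: fps / price for name, (price, fps) in cards.items()}

for name, ratio in sorted(ratios.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:20s} {ratio:.3f} FPS/$")
```

With any plausible numbers, the midrange card tops the list, which is exactly the "firm grasp of the obvious" the parent comment predicts for this kind of graph.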
  • Silly question (Score:3, Insightful)

    by AndroidCat ( 229562 ) on Tuesday May 04, 2004 @12:00PM (#9052045) Homepage
    I realize that this question makes no sense to the people who have to be so-leading-edge-it-hurts, but are there any applications around that will really push a graphics card that much and require one of these?

    Make no mistake, I'll eventually buy one like these .. after it's well down the price curve, bugs fixed, drivers updated, in a couple years.

  • by phasm42 ( 588479 ) on Tuesday May 04, 2004 @12:01PM (#9052062)
    I think a real big advantage for ATI is the fact that their card doesn't take up two slots, require a monstrosity of a heat sink and fan, and recommend/require a 450W power supply like the 6800 does. Even if the new ATI card wasn't as fast as the 6800, I wouldn't consider buying a video card like that. And I've always considered myself a fan of Nvidia cards (I used to hate the "ATI OS" that ATI's old drivers used to install -- it was very invasive). ATI has produced a very competitive card performance-wise, while keeping the same form factor and with a reasonable (relatively speaking) level of power consumption and heat dissipation.
  • As much as I like the nVidia kit "just working", I wish they would get their head out of their arse and implement true DirectX9, not just the shite that's part driver.
    Have a look through the feature sets between ATI, nVidia and DirectX9 - nVidia supports the barest of minimums to work with DirectX9 written games.
    No wonder Carmack shunned nVidia.

    There has to be a time when they support the games, instead of just paying for a prissy ad at the start of a game.

  • by gnuman99 ( 746007 ) on Tuesday May 04, 2004 @12:01PM (#9052075)
    Are we going to have proper set of Linux drivers? Correct implementation of OpenGL?

    I know that ATI has their little RPMs going, but the reason I have switched to using nVidia is because of the crap that went on with ATI and the lack of Linux support. And now, they finally released some drivers, but no support for older cards, and no way to actually install it properly on a Debian system.

    nVidia at least allows for distribution of their drivers [debian.org]

    This is the only reason why I switched to nVidia. I don't see how anyone using Linux can support the bad support for Linux from ATI (as compared to nVidia, of course).

    As to the card itself, well, I think nVidia and ATI was always close enough :) Sometimes competition works, and ATI & nVidia are prime examples of that.

    PS. Please, don't troll me about the free drivers. I want/need real drivers, and not some partial implementation.

    • PS. Please, don't troll me about the free drivers. I want/need real drivers, and not some partial implementation.

      What you don't seem to realize is that, while NVidia is better about keeping up-to-date binary Linux drivers, ATI is better about releasing hardware info to the driver devs for older hardware, meaning that if you're okay with the second-string hardware (as opposed to these $500 monstrosities), you get much better support out of your system, because the kernel devs will support you if something brea...
  • by stratjakt ( 596332 ) on Tuesday May 04, 2004 @12:06PM (#9052135) Journal
    with ultra shit support.

    The fanboy following video cards is endlessly annoying. I own a Radeon 9800, and it was good value for the dollar all around, but quite frankly, the support sucks.

    ATI relies on big benchmark numbers over real world results; I guess that's what 'uber pc geeks' want. nVidia seems to cater to gamers by working with developers to make sure games USE all those fancy new functionalities of the GPU, i.e. nVidia's "The Way It's Meant to Be Played" program. ATI pays lip service to it with its "Get in the Game" program, but they don't provide the same support (like sample code for killer shader effects, etc.)

    So we end up with TRON 2.0 having really cool glowing effects on nVidia, but flat and tacky looking on ATI. We have soft shadows in Splinter Cell for nVidia, blocky PSX-era crap for ATI.

    Hell, I could go on for months listing all the anomalies in actual real-life games I've encountered. Texture corruptions in Tomb Raider: AOD, outright crashes in Halo.

    For all the hype around FSAA and anisotropic filtering - just about EVERY GAME I've enabled them for has crashed hard. Unreal 2, Halo, XIII.

    Oh, and the worst, the absolute worst, is frame drops to 5fps and worse in CounterStrike when there's smoke onscreen. I mean COME ON, I had a RivaTNT2 that played the game properly. There's no excuse for that, save a piss poor opengl implementation.

    So I tried Will Rock, the game whose screenshots were on my 9800's box, and which is a member of the "Get in the Game" program. This ought to SMOKE on an ATI card, right? Almost: awful-looking texture corruption in menus, stuttering in-game for no apparent reason (nothing on screen).

    Missing proprietary nVidia features is fine, substitute your proprietary ATI features. Just make them stable and working.

    I've used ATI forever, they used to be a cut above the other retail level cards. Now they've slipped hard.

    This is a case where nVidia will slowly strangle the competition, because the competition sucks. I'd really like to see ATI turn around and focus on the gaming experience, not the mutual masturbation you see on rage3d.com (the unofficial "support" forum) -- with a bunch of kids comparing benchmarks and overclocks, and two or three frustrated folks chronically posting for advice on which mishmash of driver files will actually work with Counter-Strike.

    Anyhow, hooray for leapfrogging nVidia in phony-baloney do-nothing benchmarks. Will this fabulous new technology actually work with games or is this just more MARKETING BULLSHIT for the likes of toms hardware and hardocp to spread?
    • by cK-Gunslinger ( 443452 ) on Tuesday May 04, 2004 @12:39PM (#9052579) Journal

      The fanboy following of video cards is endlessly annoying... followed by endless rants about how ATi sucks and nVidia rocks and will "slowly strangle the competition."

      Please. There are NO differences between the companies as far as "caring about gamers" is concerned. Both exist to make a profit. Period. Several people I know are big independent ATI developers. ATI provides them with code samples, driver updates, etc., gratis. Anything you say that generalizes one or the other of the companies makes you a "fanboy." It's no different from Ford vs. Chevy. Each has some advantages and some disadvantages. And they both have a rabid fan base that will make it their sole priority to bash the other. *yawn*

      Also, I don't get the whole "hooray for leapfrogging nVidia in phony-baloney do-nothing benchmarks" when every single review I read included all the current DX9 games, with commentary on stability and visual quality as well as performance. I don't even think Anandtech showed a 3DMark03 score; if so, I didn't pay attention to it. I agree, games are all that matter. Fortunately, that's what was tested.
  • Hmm... (Score:5, Interesting)

    by Burgundy Advocate ( 313960 ) on Tuesday May 04, 2004 @12:07PM (#9052152) Homepage
    As much as I want to like this card, I fear that they've taken a wrong turn on the path they plan to pursue.

    As a 3D developer, one of the most exciting things that has come about recently is Shader Model 3.0. It allows you to get greater effects with fewer operations using some new developments. However, it requires 32-bit precision. Read more about it here [microsoft.com].

    ATI has chosen to continue with its 24-bit precision architecture. While that's fine for most applications, some of the exciting new developments require the newer spec. I'm sure that it will be interoperable, but all that speed may end up being wasted while computing certain operations.
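    A rough sketch of what the precision gap can mean in practice. This is a toy illustration in Python, not real shader code: it quantizes only the mantissa and ignores exponent range and normalization (FP24 is commonly described as carrying a 16-bit mantissa vs. FP32's 23 bits).

```python
# Toy model: round a value to an n-bit fractional mantissa.
# Real FP24/FP32 formats also differ in exponent width, ignored here.
def quantize(x, mantissa_bits):
    scale = 2 ** mantissa_bits
    return round(x * scale) / scale

v = 1 / 3
err24 = abs(v - quantize(v, 16))  # FP24-style: 16 mantissa bits
err32 = abs(v - quantize(v, 23))  # FP32-style: 23 mantissa bits
print(err24 / err32)  # the FP24-style error is roughly 2**7 = 128x larger
```

    Seven fewer mantissa bits means roughly 128x coarser rounding per operation, and that error compounds across long shader programs, which is where 24-bit cards can start to show banding or other artifacts.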

    I'm left wondering why I would buy a brand spankin' new video card when it doesn't support the newest APIs all that well. Oh well, I guess I get to stick with nVidia...
    • Re:Hmm... (Score:3, Insightful)

      by aliens ( 90441 )
      I think the only problem with it not supporting all the 'newest APIs' is if games were coming out in the next 6-12 months that really required that support.

      Nothing on the horizon seems to make use of any of what you mentioned, so it'd be safe to buy either card and be totally happy.

      There is no such thing as an upgrade that will keep you happy in 2 years if you need to see all the eye candy. Even though the 6800 supports PS 3.0 and 32-bit, I highly doubt it'll hold a candle to the cards that are coming out.
  • I really like the styles on that article for editorial links and sponsored links. Nicely done.
  • by Psiren ( 6145 ) on Tuesday May 04, 2004 @12:09PM (#9052174)
    I plumped for an NVidia card for my new machine, but did consider the ATI ones. In the end I went for NVidia because the drivers seemed better supported. My question is, did I miss anyone? Are there any other cards that can run modern(ish) 3D games under Linux?
  • by Prince Vegeta SSJ4 ( 718736 ) on Tuesday May 04, 2004 @12:12PM (#9052213)
    Now I can play quake at 10,000,000,000 frames per second!
  • Let me first say that I'm pretty firmly in the ATi camp, but I really want to see better competition than this.

    For two generations now, ATi has tended towards smaller, sleeker, more elegant designs, while nVidia's products keep getting larger, noisier, hotter, and more power-hungry. They're typically more expensive, to boot. Deciding which card to purchase right now is an absolute no-brainer.

    On one hand, ATi's X800 draws little power, has superior image quality, doesn't take up multiple slots,
  • by Punk Walrus ( 582794 ) on Tuesday May 04, 2004 @12:18PM (#9052273) Journal
    Honestly, how much video power do you need? I still use an NVidia 16mb card on most of my games. I only got a new FX5200 for my newest computer because it was the "most bang for under $80" that I saw. 128mb! Far out! But UT2K4 is running fine on my 64mb NVidia GForce4, which I see I can now get for about $39 [pricewatch.com]. Do I need to run it at 1600 x 1200? No. 1024 x 768 is fine. How finely grained do I need to see the wall, anyway? I just need to see my attackers! I don't stand around and watch the face of my assailant and marvel at the rendering detail of the nose and mustache, because if I'm that close, I think I'm already pwn3d! No no, I am a class A coward, and I prefer to shoot at them from far away, thank you, and as long as I can see them well enough to aim and fire with decent accuracy, I don't care if it's an attacker with the pixelation of a 1970s Bally Midway style Space Invader.

    I'd love to see some program that does "reverse VRAM reclaiming" so those of us who don't need 128mb of video RAM power can get some of that ram back for compiling or something.

    Okay... that WAS geeky.

  • by SilentChris ( 452960 ) on Tuesday May 04, 2004 @12:22PM (#9052330) Homepage
    I was talking with people on another board (hardware mavens), and for most of us with a late model card from last generation (Radeon 9800, any of the competing nVidia cards), the X800 really isn't worth it.

    A good denominator is fpspb (frames per second per buck), a made-up value from Tom's Hardware. For the cash, you can squeeze a lot more out of a $200 Radeon 9800 Pro (especially with overclocking) than you can out of anything else right now. You're only talking a marginal difference in fps between this generation and last at high (1600x1200) resolutions, and an almost non-existent difference at "normal" resolutions. The $200-300 price premium isn't worth those extra frames.
  • The article gives me the impression that it's at least 20-30% faster. It isn't, and not in every game; the 6800 beats it out in a lot of them too. The margin is only a few FPS, nothing you would really notice while playing. And that's only on Windows and Mac OS X; once you get to Linux, nVidia will kill ATI on every game, no question. Feature-wise, nVidia is king there too. nVidia is still my choice.
  • Now I can buy a 9800ProXT+ or whatever they are called now for $200 less! Happy days!
  • More Reviews (Score:5, Informative)

    by Rufus211 ( 221883 ) <rufus-slashdotNO@SPAMhackish.org> on Tuesday May 04, 2004 @12:26PM (#9052392) Homepage
    stolen from Anandtech [anandtech.com]

    HardOCP [hardocp.com]
    Ascully [ascully.com]
    DriverHeaven [driverheaven.net]
    TrustedReviews [trustedreviews.com]
    K-Hardware [k-hardware.de]
    Hardware Analysis [hardwareanalysis.com]
    Hexus [hexus.net]
    The Tech Report [techreport.com]
    Beyond3D [beyond3d.com]
    Neoseeker [neoseeker.com]
    ExtremeTech [extremetech.com]
    Gamers Depot [gamers-depot.com]
    Lost Circuits [lostcircuits.com]
    Firing Squad [firingsquad.com]
    Tom's Hardware [tomshardware.com]
    Bjorn3D [bjorn3d.com]
    Hot Hardware [hothardware.com]

  • How exactly do people come up with model names/numbers for some of this stuff? For instance the X800, sounds like a mix between generic and random.
  • ...now you give me some good drivers and i might go away from NV but as long as your drivers *SUX* ill stick with the goooldnv.

  • by Anonymous Cowabunga ( 738559 ) on Tuesday May 04, 2004 @12:31PM (#9052455)
    Somewhat off topic, but can someone here explain the difference between these high-end gaming cards and a workstation graphics card (for AutoCAD, 3D Studio, Maya, etc.)? As I understand it, it has to do with how they deal with dedicated rendering windows, but on the other hand, these high-end game cards often come with AutoCAD drivers and seem to work perfectly fine for the above programs. So why get the latter, when those cards are often $1000+? What are the speed/quality differences? Most game sites don't review these other cards.
  • by watanuki ( 771056 ) on Tuesday May 04, 2004 @01:06PM (#9052991)
    I don't know about you, but "XT" doesn't sound all that "high tech" to me.

    Maybe ATi will come out with these cards next.

    Radeon X800 AT
    Radeon X800 386
    Radeon X800 486

    And then they'll run into trademark problems with a certain other semiconductor manufacturer...
  • by aardwolf204 ( 630780 ) on Tuesday May 04, 2004 @01:43PM (#9053614)
    Radeon VS. GeForce, Cost per Frame

    *CPF = Cost per Frame
    **Per Aquamark 3: 1024, P4 3.2, 1024MB CAS2, i875P

    Radeon X800 XT
    Cost: $499 (MSRP)
    FPS: 57.96
    CPF: $8.60

    Radeon X800 Pro
    Cost: $399 (MSRP)
    FPS: 54.89
    CPF: $7.26

    Radeon 9800 XT
    Cost: $396 (Pricewatch.com)
    FPS: 47.9
    CPF: $8.26

    GeForce 6800 Ultra
    Cost: $499 (MSRP)
    FPS: 62.65
    CPF: $7.96

    GeForce 6800 GT
    Cost: $399 (MSRP)
    FPS: 61.3
    CPF: $6.50

    GeForce FX 5950 Ultra
    Cost: $365 (Pricewatch.com)
    FPS: 50.93
    CPF: $7.16

    Winner: GeForce 6800 GT

    NOTE:
    This is ignoring other factors that go into TCO, such as power consumption (the Radeons use far less power and may not require a power supply upgrade).

    This is based on the Aquamark 3 benchmarks at 1024x768 only. If you wish to gather the mean of the other benchmarks in the linked review to figure a more precise CPF, please reply.

    Intended to make you think about what you're getting when you pay the extra $100 for the top-of-the-line card.

    If you were wondering, I'm an ATI fanboy and would personally buy the Radeon X800 Pro if I had $400 to blow.
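    The CPF column above is just price divided by FPS, truncated (not rounded) to the cent. A quick sketch in Python (mine, not the parent poster's) reproduces the table and picks the winner:

```python
import math

# (card, MSRP or street price in USD, Aquamark 3 FPS at 1024x768,
#  per the parent comment's figures)
cards = [
    ("Radeon X800 XT", 499, 57.96),
    ("Radeon X800 Pro", 399, 54.89),
    ("Radeon 9800 XT", 396, 47.9),
    ("GeForce 6800 Ultra", 499, 62.65),
    ("GeForce 6800 GT", 399, 61.3),
    ("GeForce FX 5950 Ultra", 365, 50.93),
]

def cpf(cost, fps):
    """Cost per frame, truncated to the cent as in the table above."""
    return math.floor(cost / fps * 100) / 100

for name, cost, fps in cards:
    print(f"{name}: ${cpf(cost, fps):.2f}/frame")

winner = min(cards, key=lambda c: c[1] / c[2])
print("Winner:", winner[0])  # → GeForce 6800 GT
```

    Lowest dollars-per-frame wins, which is how the 6800 GT comes out on top even though the X800 Pro is cheaper than the Ultra cards.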
