ATI's Radeon X1900GT On Test

An anonymous reader writes "ATI's Radeon X1800XT reached end of life last month and the company announced its replacement on May 5th: the Radeon X1900GT. Bit-Tech has put a pair of retail Radeon X1900GT cards from Connect3D and Sapphire to the test in a range of real-world benchmarks to find out how it matches up to NVIDIA's 7900 GT."
  • by glrotate ( 300695 ) on Wednesday May 10, 2006 @10:18AM (#15300734) Homepage
    Final Thoughts...

    In some areas, the R580-based Radeon X1900GT is faster than the card it is replacing. However, in other, less shader-intense titles like Day of Defeat: Source, the R520-based Radeon X1800XT is the faster of the two. This can be attributed to the architectural differences between R520 and R580.

    The natural competitor for the Radeon X1900GT is NVIDIA's GeForce 7900 GT, and across a range of games, it is very much a case of win some, lose some for both companies.

    In texture-heavy games, the Radeon X1900GT can sometimes be slower than the GeForce 7800 GT, never mind the faster GeForce 7900 GT. In newer, shader-intensive games like The Elder Scrolls IV: Oblivion and Call of Duty 2, the Radeon X1900GT delivers a very good gaming experience in comparison to the GeForce 7900 GT. This is particularly the case in Oblivion, where the Radeon X1900GT is able to deliver a better gaming experience than XFX's superclocked 7900 GT XXX Edition. In addition, it will be possible to play Oblivion with both HDR and anti-aliasing enabled if the upcoming Catalyst 6.5 driver includes the 'Chuck' patch. This is something that is currently not an option for NVIDIA owners.

    Based on the current price forecasts from people in the know, the deal looks to be a pretty good one. If the Radeon X1900GT is priced at £199, it is undoubtedly a good deal. However, there are GeForce 7900 GTs already selling for that price. The decision will ultimately depend on what games you're currently playing, whether you're planning to overclock, and on the price points that ATI's partners manage to hit.

    The GeForce 7900 GT is a very good overclocker, while the Radeon X1900GT looks to be a bit of a mixed bag at the moment. If you're looking to overclock, we feel that the GeForce 7900 GT is the better deal if you find one at a good price. However, if you're planning to run your video card at stock speeds the final decision will depend on the games you're looking to play.
  • by thebdj ( 768618 ) on Wednesday May 10, 2006 @10:51AM (#15300968) Journal
    List of games I have played in Linux:

    *Diablo II
    *Warcraft III
    *Half-Life (pre-steam) w/ all games
    *Return to Castle Wolfenstein


    List of games I know will run in Linux:

    *World of Warcraft
    *Half-Life 2 and mods


    Those are just the ones I can think of off the top of my head. (Yes, I know the list is a bit short.) But I own all the games on that list, and their ability to run in Linux is great because I really plan on shedding Windows for good with this next PC upgrade. There are other games that I think run in Linux. Tribes 2 had a Linux version, and I think NWN was going to eventually get Linux support.

    Yes, the market for Linux gaming is a lot smaller, but it does still exist. Some companies have released commercial versions of games to run in Linux, and id gave away the Linux client for RtCW (though you need the game for its data files). The rest can be made to run in Wine. Still, the point is valid. There is no reason for companies to release open source drivers if they don't want to do it. The fact that they release drivers at all is actually somewhat impressive.
  • Re:nVidia (Score:3, Informative)

    by everphilski ( 877346 ) on Wednesday May 10, 2006 @10:55AM (#15301005) Journal
    Blow by blow:

    accelerated 3d

    I do 3D development under Linux using OpenSceneGraph [openscenegraph.org]. I can personally attest that 3D acceleration works under Linux.

    framebuffer

    Why the hell you'd want to use a framebuffer with a spiffed-up card is beyond me, but yes, nVidia has a framebuffer driver, and here's a walkthrough: here [comcast.net]
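    For the curious, here's a rough sketch of what a framebuffer setup on an nVidia card usually boils down to. The exact steps are assumptions (a GRUB-booted box, a kernel built with the relevant framebuffer drivers), and note that the in-kernel nvidiafb module is known to conflict with the proprietary nvidia X driver, which is why many guides steer you toward vesafb instead:

```shell
# Option 1: generic VESA framebuffer via a kernel boot parameter.
# Append to the kernel line in GRUB, e.g. mode 792 = 1024x768, 24bpp:
#   vga=792

# Option 2: the open-source nvidiafb kernel module.
# Only do this if you are NOT running the proprietary nvidia X driver,
# since the two fight over the same hardware:
modprobe nvidiafb
```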

    2d & video out

    Haven't used it personally but I have friends who do. Again, same driver code is shared between Windows and Linux.

    Also of interest:

    NVIDIA also provides an open source OpenGL and XFree86 3.3.5 driver implementation on their website. The implementation supports the NV1, RIVA 128, RIVA 128ZX, RIVA TNT, RIVA TNT2, and GeForce 256 chipsets. This driver has lower performance than NVIDIA's proprietary driver but it does include source code.

    Yes, I fed a troll, but only so that he might not mislead others. May god have mercy on my soul.
  • by Dex5791 ( 973984 ) on Wednesday May 10, 2006 @11:13AM (#15301135)
    With video cards, unless you have a lot of disposable income, you are better off buying the mid-range cards that pack enough features to get the job done. The X1600 Pro is a much better deal than the X1900XT. It will run Oblivion just fine at 1024x768 with most of the bells and whistles enabled. It's priced at around $125 for the AGP version.
  • by the.nourse.god ( 972290 ) on Wednesday May 10, 2006 @11:46AM (#15301395) Homepage
    Or you could also go with the top card from a year or so ago. I recently purchased an X850XT for around $150. Since my monitor only supports 1280x1024, I can run everything just fine at native res with features maxed out.

    Staying a generation or two behind can save you a ton of money, and you won't take too much of a performance hit. In fact, my X850 will outperform the current X1600 Pro. Just my two cents, anyway.
  • by Anonymous Coward on Wednesday May 10, 2006 @12:52PM (#15301940)
    The number of names they have for their different products is rather absurd. It's probably all down to creating a 'price spread' so a product is available at every possible price point.
    While counting megabytes and megahertz is usually a futile exercise, I have found that there are a couple of things to keep an eye on when comparing video cards.

    1. "Pipelines" - Sounds weird, but it seems to be a very reliable metric for determining how one chip compares to another. The more the better, and the more expensive.
    2. Memory bus width - Don't skimp here. Only get a card with the same memory bus width as the fastest current generation. Anything less starves a memory-hungry GPU, crippling it artificially. You mostly see this on odd, cheap OEM cards, I think. Currently I think 256-bit is the standard. Can anyone correct me on that?
    3. Generation, age, etc. - Often you can get a card with a chip that is essentially the same as the top-end part, only clocked slower, with pipes disabled, or with slower-clocked memory. New is good. New chips run cooler and often have the new features needed for current and future games.
    4. Memory type - Your GPU wants memory and it wants it fast. 256 MB of high-clocked GDDR3 is going to do much more for you than 512 MB of generic DDR.
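    To put points 2 and 4 into numbers: peak memory bandwidth is just bus width times effective memory clock. Here's a quick back-of-the-envelope calculator; the example cards and clock speeds are illustrative guesses, not measured figures:

```python
def mem_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# 256-bit bus with GDDR3 at 1200 MHz effective (a typical mid/high-end card):
print(mem_bandwidth_gbs(256, 1200))  # 38.4 GB/s

# 128-bit bus with generic DDR at 500 MHz effective (a cheap OEM card):
print(mem_bandwidth_gbs(128, 500))   # 8.0 GB/s
```

    Nearly a 5x gap between those two configurations, which is why a narrow bus on a fast GPU cripples it no matter how many megabytes are on the card.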

"Ninety percent of baseball is half mental." -- Yogi Berra

Working...