Positive Reviews For Nvidia's GeForce 6800 Ultra

Sander Sassen writes "Following months of heated discussion and rumors about the performance of Nvidia's new NV4x architecture, today their new graphics cards based on this architecture got an official introduction. Hardware Analysis posted their first look at the new GeForce 6800 Ultra and took it for a spin with all of the latest DirectX 9.0 game titles. The results speak for themselves: the GeForce 6800 Ultra is the new king of the hill, beating ATI's fastest by over 100% in almost every benchmark." Reader egarland adds "Revews are up on Firing Squad, Tom's Hardware, AnandTech and Hot Hardware." Update: 04/14 16:54 GMT by T: Neophytus writes "HardOCP have their real-life gameplay review available."
  • by ackthpt ( 218170 ) * on Wednesday April 14, 2004 @12:13PM (#8860647) Homepage Journal
    It's good to know I can look forward to reading text in a more scintillating black and white, while Flash ads and pop-ups will be more vibrant than ever.

    In a word, "Wow."

    I mean, who'd have thunk it that the 6800 would still have life? Maybe ATI can counter with a Radeon All-In-Wonder Xtravaganza 6502!

    • ...that is, until ATI releases their next card too.

      I wouldn't expect a new card NOT to beat out the current cards. ATI and Nvidia have played this catchup game with each other for years.
    • 2d Performance (Score:5, Interesting)

      by phorm ( 591458 ) on Wednesday April 14, 2004 @12:49PM (#8861054) Journal
      Which actually brings me to a good question: Graphics cards have been improving in fast-3d-rendering performance, but are often not that great at crisp 2d rendering (compare an NVidia card to a Matrox and see what I mean).

      How well does this one do at 2d rendering? I do play 3d games a lot but that doesn't mean I want my web-browsing and other non-3d activities to be sub-par
      • Re:2d Performance (Score:5, Interesting)

        by Paladin128 ( 203968 ) <aaron&traas,org> on Wednesday April 14, 2004 @12:53PM (#8861091) Homepage
        That used to be true, but the gap is closing. Most GeForce FX cards have pretty fantastic RAMDACs. Yeah, the Parhelia does look a hair better (but only on your primary monitor, if you're using more than one with it). NVIDIA beats Matrox in price, performance, and driver quality.

        Besides, I'm not going to be using an analog output for too long... DVI kills the whole "2d quality" argument; the color values are passed digitally via a TMDS transmitter. Doesn't matter if
      • by willy_me ( 212994 ) on Wednesday April 14, 2004 @02:00PM (#8861961)
        The chipsets are all very similar. It's the external components, filters and such, that determine quality. Matrox has a good reputation because they use high quality components. Same with ATI. NVidia has a poor rep because of all the different card makers trying to undercut each other by using cheaper components. Same is true for the clone ATI boards.

        So long as you have a quality graphics card, it really doesn't matter whose chipset is powering it. For example, even though NVidia has a poor rep, there are still high quality cards out there.

        • Are you serious? Every ATI video card I've ever had has had serious quality control problems. If it wasn't the drivers it was the physical hardware.

          I bought a 9500 Pro a year ago and I've only ever been able to use it for a month. I'm on card number 4 now because of a flaw in the way the heatsink/heatsink shim was made (something their customer service reps admit to). I was so burned by the 9500 that I could honestly never bring myself to buy another ATI card for as long as I live. Much in the same way it w
  • nvidia's back (Score:3, Insightful)

    by rwiedower ( 572254 ) on Wednesday April 14, 2004 @12:15PM (#8860671) Homepage
    These are the guys that managed to crush every single other player into the ground... the fact that nVidia was knocked backwards by ATI was a huge deal, but they weren't the champ for being slow on your feet. Over the next few months, the continuing battle should be good news for all of us consumers.

    Did anyone else notice the size of the die rivals even that of the Pentium 4 EE? This thing is frickin' huge!

    • Re:nvidia's back (Score:4, Interesting)

      by ackthpt ( 218170 ) * on Wednesday April 14, 2004 @12:17PM (#8860711) Homepage Journal
      These are the guys that managed to crush every single other player into the ground..

      Is it considered "safe" to buy any of the Nvidia chipset motherboards, or are they still pretty sketchy?

      • Re:nvidia's back (Score:3, Informative)

        by scumbucket ( 680352 )
        I've had an MSI K7N2-L motherboard which has the Nforce2 chipset for over a year now. It's rock solid with no problems.
      • Re:nvidia's back (Score:4, Informative)

        by Jeff DeMaagd ( 2015 ) on Wednesday April 14, 2004 @12:29PM (#8860836) Homepage Journal
        Two people have had some issues with the nVidia IDE drivers, at least one person fixed it by using a generic IDE driver.
      • Is it considered "safe" to buy any of the Nvidia chipset motherboards, or are they still pretty sketchy?

        I have an ECS [ecs.com.tw] N2U400-A motherboard with an NVidia nForce2 Ultra chipset. It's fantastic. Rock solid stable and fast.

        Don't take my word for it, google up some reviews of motherboards with the chipset. It's good stuff.

        LK
      • I have the original nForce motherboard with an Athlon XP 1800+, and it has been rock solid since the day I got it.

        Oh, I also have an Audigy Platinum and an All-In-Wonder 9700 Pro. No problems whatsoever.
    • Re:nvidia's back (Score:3, Insightful)

      by YanceyAI ( 192279 ) *
      And this is why healthy competition is GOOD for consumers (*nudges Bill Gates*).
    • Re:nvidia's back (Score:5, Insightful)

      by BiggerIsBetter ( 682164 ) on Wednesday April 14, 2004 @12:18PM (#8860724)
      To hell with the die size, check out the power requirements. There's two, TWO! power connectors for that thing. Damn, they've created a monster. I wonder how fast it can run GPGPU [gpgpu.org] apps...
      • by ackthpt ( 218170 ) * on Wednesday April 14, 2004 @12:34PM (#8860902) Homepage Journal
        To hell with the die size, check out the power requirements. There's two, TWO! power connectors for that thing. Damn, they've created a monster.

        Nvidia GeForce 6800 Ultra: $600

        800 watt power supply: $250

        MMORPG: $10/mo.

        The look on your face when you get your next power bill: Priceless

        There are some things in life your measly paycheck can cover. For everything else, there's MassiveCharge.

    • Re:nvidia's back (Score:3, Interesting)

      by LqqkOut ( 767022 )
      The damn thing still won't fit into a Shuttle case... It'd be nice if they said something about noise. [H] [hardocp.com] is /.'ed too, I wonder what they have to say.

      I've been a hardcore nVidia follower for years, but after last year I was left with a bad taste in my mouth. I'm glad to see another generation of video cards and I can't wait to see what ATI's got to offer - it's been a while since nVidia has had to play catch-up.

      Yea! More horsepower for Doom ]|[ (only 2 more months!)

    • by egarland ( 120202 ) on Wednesday April 14, 2004 @12:33PM (#8860890)
      ATI is supposed to announce the R420 soon. They've had some time to redesign too. I switched to ATI in the last round of upgrades and was very happy. I'll need a good reason to switch back. So far I have a good reason, but ATI could take it away with a decent new product.
    • by chmilar ( 211243 ) on Wednesday April 14, 2004 @01:16PM (#8861362)
      [...] but they weren't the champ for being slow on your feet.

      What are they doing with my feet? Give 'em back!
  • latest vs last-year (Score:5, Informative)

    by bwindle2 ( 519558 ) on Wednesday April 14, 2004 @12:15PM (#8860677)
    They are comparing the latest nVidia GPU to the 9800XT, which is several months old. When ATI's next-gen chip comes out (two weeks?), only then will we be able to see who holds the GPU Speed crown.
    • They are comparing the latest nVidia GPU to the 9800XT, which is several months old. When ATI's next-gen chip comes out (two weeks?), only then will we be able to see who holds the GPU Speed crown.

      I don't think so. The first ATi card to be released will be a 12x1 pipe version while the first nVidia card will be a 16x1 pipe version. ATi seriously underestimated what nVidia was planning as they moved the production schedule of their 16x1 pipe version 5 months ahead of schedule. ATi was scared s***less a
    • Fanboyism (Score:4, Insightful)

      by bonch ( 38532 ) on Wednesday April 14, 2004 @12:24PM (#8860787)
      I think the submitter must be something of an Nvidia fan. :) Most people wouldn't ridiculously compare a new next-gen card to today's months-old cards, not even mentioning that ATI has a new one due out in weeks. But he sure did mention an over 100% speed increase over those old cards, didn't he?

      Personally I don't get the fanboy rivalries--I have a Radeon in my laptop and a GeForce in my desktop, and that's just what I happened to buy at the time, no fanboy adherence going on.
      • Re:Fanboyism (Score:4, Interesting)

        by Nogami_Saeko ( 466595 ) on Wednesday April 14, 2004 @12:31PM (#8860866)
        Well said! The amount of "epenis" bickering that surrounds videocards is legendary, but the fact of the matter for me is that I buy what's fastest with best quality at any given time (assuming relatively stable drivers of course). Of course, price does figure into it as well. I'm not going to pay a huge premium for a card unless it's significantly better than the competition. A few extra percent on a benchmark simply won't open my wallet more.

        Had an NVidia GeForce2 when it was at the top of the pile a few years ago, picked up an ATI 9700 Pro when it was released. May go back to Nvidia, may stay with ATI (shrug).

        In the long run, all of us consumers benefit from some healthy competition. Granted, as a Canuck, I'm happy to see ATI do well - but they also earned it. At the time the 9700 Pro was released, ATI blew Nvidia out of the water. Nvidia had grown a tad complacent, and they paid for it.

        Now we'll see what happens with Nvidia having a fast new card and ATI about to release their new offering in a few more weeks.

        N.
    • ...so it's even sillier that the submitter would say that. But, hey, it's healthy fanboyism I guess.

      Here's what the Register says [theregister.co.uk]:

      ATI will ship its much-anticipated R420 chip later this month as the Radeon X800 Pro. The part's 26 April debut will be followed a month later by the Radeon X800 XT on 31 May.

      So claims Anandtech, citing unnamed vendor sources and a glance at ATI's roadmap.

      If the date is accurate, it puts ATI just 13 days behind Nvidia's NV40 launch on 13 April. NV40 will surface as the GeFo
  • by account_deleted ( 4530225 ) on Wednesday April 14, 2004 @12:15PM (#8860685)
    Comment removed based on user account deletion
  • I have a mini-pc you insensitive clod!
  • Power Requirements (Score:5, Insightful)

    by Lord_Pall ( 136066 ) on Wednesday April 14, 2004 @12:16PM (#8860688)
    Okay so it's fast.. no question.. Amazing feature set as well..

    but it requires a 480 watt power supply

    and 2 power connections... And it also has what looks to be a vacuum cleaner tied to it..

    I currently use a shuttle skn41g2 for my main box.. I love the sff pc's. This won't work in that.. It would make the included power supply very sad.

    My HTPC box uses an antec sonata with a fanless radeon 9000, and ultra quiet everything else.. Forget using this in a quiet pc as well

    I don't care for nvidia's trend towards hideously loud, bulky, power hungry video cards.. They might perform well, but for normal use, I'd prefer something smaller and quieter.. and for god's sake, give me an external power supply.. heh
    • by happyfrogcow ( 708359 ) on Wednesday April 14, 2004 @12:21PM (#8860760)
      The sound of the fans should be drowned out by the booming speakers you should have to go with your gaming system. Games and gamers aren't quiet; who cares about fan noise when you're kicking someone's ass?

      Now power consumption... that can be an issue.
      • by GoofyBoy ( 44399 )
        >games and gamers aren't quiet,

        When you watch TV, turn on the radio at a low volume.

        Even if you have the TV up loud, it's still annoying.

        (Not to mention using the computer for non-gaming stuff)
    • but it requires a 480 watt power supply

      and 2 power connections... And it also has what looks to be a vacuum cleaner tied to it..

      The approaching question is: which is the principal component in your box, the motherboard or the video card? My present video card has more memory and sucks more power than my laptop, and like yours has a fan, though it's quiet.

      FNNNZZOWWWNT! "Wayl, shewt! Thar goes the arc welder! Gessen we cain't play no Medal o' Honor till we gets a new one."

    • by Anonymous Coward
      but it requires a 480 watt power supply

      bah, I won't be impressed until a video card requires 1.21 gigawatts.

    • You bring up an interesting point. I wonder what it would take to create a whole-house AC/DC converter. Once in DC it's an easy step up or down to the proper voltage for a PC, or any number of other little gadgets that incorporate transformers.

      Hmm, I only know electronics from one class in Physics so I couldn't comment on it much now. I should look into it though.

      I can imagine a 45V supply running through to outlets that support the circle jacks of DC/DC converters. Maybe 12V? Most devices that use b
    • by fitten ( 521191 )
      From actually reading TFAs, most reviewers say that the card is surprisingly quiet and has acceptable noise levels, in spite of the large, scary heatsink/fan/heatpipe.
    • RTFA. (Score:3, Informative)

      by Illissius ( 694708 )
      At least one of them. I've read/skimmed most, and several of them mention how (a) it's actually significantly *cooler* than the 5950 Ultra (the previous high end card); (b) it's not very loud (not silent, but not disturbing either); (c) it only draws 10-30W more than aforementioned 5950 Ultra (this figure varied from review to review).
      Though you are right, using it in an SFF wouldn't be a great idea. Can't have everything.
      (And several of the sites mention how it worked flawlessly with a 400W PSU, and the
  • by Seoulstriker ( 748895 ) on Wednesday April 14, 2004 @12:16PM (#8860693)
    I am really quite impressed with the performance of the 6800. Across the board, the 6800 delivers nearly twice the performance of the current top-of-the-line cards. Going from 4x2 pipes to 16x1 was definitely worth it for nVidia, as their shading performance is simply astounding! Halo actually runs incredibly well on the 6800, getting 2x-3x current performance.

    Now, as DooM 3 is supposedly being released with the 6800, can we expect DooM in mid-May? This is truly an incredible day for PC gaming as we will have cinematic computing in the near future.

    I'm giddy. :-)
  • Its HUGE (Score:5, Interesting)

    by silas_moeckel ( 234313 ) <silas@@@dsminc-corp...com> on Wednesday April 14, 2004 @12:17PM (#8860703) Homepage
    Ok, this card has great specs etc etc etc. But did you look at the thing? It's taking up at least 1 PCI slot for the fan and another for its intake. This thing should have just come with water cooling out the back. Granted its specs look great, but I do have to ask: will it drive that IBM T221 LCD display that hits 204 DPI at 22"? That's about the only thing I can think of that really would do the card justice.
    • Re:Its HUGE (Score:3, Interesting)

      by afidel ( 530433 )
      I take it you haven't seen some of the games out now with LOTS of eye candy? Silent Storm is absolutely amazing looking even with crappy settings, I turned on all the eye candy for a while just to look at it but my lowly GF3 can barely do 1FPS. People with the newest Radeons and GeForces can't run it at high resolution with everything cranked. This is an engine that Nival obviously designed for a seriously long lifespan. Oh yeah and AI processing eats my 1.2GHz Athlon for breakfast. I think this game is goi
    • How many PCI slots do you need, though? Twin DVI ports take care of most people's video needs, and most motherboards ship with excellent networking, audio, Serial ATA, and even FireWire onboard these days. My oldest system, a dual P3-800, uses two PCI slots for an additional ATA/100 controller and a NIC, and if it had sound, that would take an additional slot. My P4 2.4B has only a PCI TV tuner card. The nForce2 based Athlon machine doesn't use *ANY* PCI slots, just AGP for graphics.
  • by Julius X ( 14690 ) on Wednesday April 14, 2004 @12:17PM (#8860704) Homepage
    0wn3d!
  • Man... there have been many generations of video cards now, but the prices don't seem to come down that much...
    • I wonder if it has anything to do with the price the market will bear. Hmmm...
    • by gl4ss ( 559668 ) on Wednesday April 14, 2004 @12:59PM (#8861158) Homepage Journal
      huh? the prices _do_ come down.

      the prices of _new_ cards are always at the maximum that somebody would pay for them.

      if you want a cheap card, buy a cheap card (that same cheap card would have cost hundreds of dollars a few years back).

      the way I see it, there are a few categories that have been around for years: 1. ultra-cheaps at $30-50, 2. entry-level gaming cards at $100, 3. medium-level gaming cards at $200-300, and then 4. high-end gaming cards at an insane $400-500. All that changes over the years is which speed of card belongs where.

      new cheap cards come along occasionally, but they're usually based heavily on yesterday's high-end chips.
  • Impressive! (Score:4, Interesting)

    by cK-Gunslinger ( 443452 ) on Wednesday April 14, 2004 @12:17PM (#8860709) Journal

    I must admit, after looking at the benchmarks from Tom's and Anand's earlier this morning, I am *very* impressed by the results of this chipset. I still have concerns about the cooling and power requirements, as well as the image quality, but that may be partly related to my newfound ATI fanboy-dom. ;-)

    Speaking of which, I can't wait to see what the boys from Canada have coming next week. 16 pipelines? Mmmm....

  • Money (Score:2, Insightful)

    by tai_Dasher ( 319541 )
    People who can afford to buy these kinds of things should give money to charity.

    No seriously, this thing costs more than a new full fledged computer.
  • This thing requires a 480 watt power supply, minimum. That's too much. I am currently responsible for a large number of servers that don't have larger than 400 watt power supplies each.

    It's not hard to see why the U.S. has to violently defend our oil interests when we have video cards wastefully burning through electricity like there's no tomorrow.

    I'm all for advances in processor technology, just not when it comes with a high energy consumption price.

    I once heard that by leaving a computer with a meas
    • Unless you are doing something with the computer, it will not be drawing very much current at all.
    • by JawFunk ( 722169 ) on Wednesday April 14, 2004 @01:39PM (#8861635)
      I once heard that by leaving a computer with a measly 150 watt power supply (minute by today's standards) on 24 hours a day like most people do, it consumes more energy than the common refrigerator.

      Perhaps the survey you are referring to was measuring the energy consumption of a mini-fridge holding a single 12 oz. can of beer (served ice cold), but the common refrigerator, and I mean a modern one, not the ones from the 70s and 80s, since they improve with time, draws about 700 - 750W. This is about double that of a computer loaded with hardware doing average browsing or word processing. The ratio is less when UT2004 is activated (W00T).

      • The 2004 requirement for refrigerators sold in the US to be labeled "Energy Star" compliant (which most of the decent ones are) is ~500 kWh/year. There are about 8760 hours in a year. That means for "normal use" the fridge consumes an average of 60W of power. I think you're off by an order of magnitude.

        A fridge drawing a constant 700W running 24/7 for 365 days would cost about $613/year to run, assuming an average of $0.10/kWh. A ~$50/month electric bill just for the fridge? I don't think so.

        M
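
        As a quick sanity check of those two figures, here is the arithmetic in Python, assuming the ~500 kWh/year Energy Star number and the $0.10/kWh rate quoted above (the script is purely illustrative):

            # Back-of-the-envelope check of the fridge power figures above.
            KWH_PER_YEAR = 500          # assumed Energy Star limit, kWh/year
            HOURS_PER_YEAR = 24 * 365   # 8760 hours
            RATE = 0.10                 # assumed electricity price, $/kWh

            avg_watts = KWH_PER_YEAR * 1000 / HOURS_PER_YEAR
            print(f"Average draw of an Energy Star fridge: {avg_watts:.0f} W")  # ~57 W

            constant_draw_watts = 700   # the 700W claim from the parent post
            yearly_kwh = constant_draw_watts * HOURS_PER_YEAR / 1000
            print(f"Constant 700W draw costs about ${yearly_kwh * RATE:.0f}/year")  # ~$613/year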
  • by sczimme ( 603413 ) on Wednesday April 14, 2004 @12:19PM (#8860738)
    From the article:

    To measure how well both cards perform with actual gameplay we used Unreal Tournament 2003 and 2004, Halo, and Far Cry. For both versions of Unreal Tournament we've used the built-in benchmark, which consists of a flyby and a botmatch. We've omitted the flyby scores as they don't tell us much about performance during actual gameplay, just how fast the graphics card is able to render the flyby. With UT2003 the lead the GeForce 6800 Ultra takes over the Radeon 9800 XT is less impressive; at 1024x768 and 1280x1024 resolutions it is only 6% faster. At 1600x1200, however, the GeForce 6800 Ultra pulls away and clocks in 21% faster. With UT2004 the difference is much bigger, starting off at 10% at 1024x768 and going up to 65% faster at 1600x1200. What is also noteworthy is the fact that the performance of the Radeon 9800 XT drops at higher resolutions whereas that of the GeForce 6800 Ultra stays at about the same level.

    I know this is /., but how does this become "beating ATI's fastest by over 100% in almost every benchmark"??
    • Because that's only one benchmark where it doesn't beat by 100%? The fact that it has a 3x lead in "call of duty" gives them a little license to make generalized claims ;)
    • you add up all of the %difference and report that number
    • Well, it is obvious in that benchmark something *besides* the graphics card is limiting performance, since increasing resolution hardly even decreases the framerate. If you look at only the benchmarks where it appears that both cards are stressed to the max, NVidia's card does seem to be about twice as fast as ATI's when the new features of DirectX 9 are used. Of course that doesn't make the submitter's statement correct, but it is quite impressive for a one-generation improvement in card performance. (.
    • by KalvinB ( 205500 ) on Wednesday April 14, 2004 @01:05PM (#8861211) Homepage
      "almost" means "many of, but not all."

      Congratulations on finding the one section of games where it didn't whomp the best ATI card until you get into the higher resolution ranges.

      However, you'll notice on the preceding pages that "over 100% better" was a very common occurrence in areas like shaders and lighting and whatnot.

      Pointing out areas where the GeForce doesn't beat the ATI by 100% does exactly nothing to diminish the point of the article submitter.

      This is why he said "almost every" and not "all."

      Ben
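
      For reference, this is all the "X% faster" arithmetic amounts to; the frame rates below are made up purely for illustration and are not taken from any review:

          # Hypothetical frame rates, just to show what "percent faster" means.
          radeon_fps = 40.0
          geforce_fps = 84.0

          percent_faster = (geforce_fps - radeon_fps) / radeon_fps * 100
          print(f"{percent_faster:.0f}% faster")  # 110% faster

          # "Over 100% faster" therefore simply means "more than twice the frame rate".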
  • by Stevyn ( 691306 ) on Wednesday April 14, 2004 @12:20PM (#8860752)
    But where do I put this thing? That's not a heatsink, that's the kitchen sink!
  • by Guspaz ( 556486 ) on Wednesday April 14, 2004 @12:21PM (#8860755)
    ATI's next-gen offering is to be launched about the same time as nVidia's GeForce 6800, and we haven't seen reviews from it yet.

    I'd wait until the Radeon X800 benchmarks are out before crowning a new king. For all we know ATI's new offering will beat the new GeForce.
  • by silverhalide ( 584408 ) on Wednesday April 14, 2004 @12:21PM (#8860758)
    I heard from a confidental source that the next NVidia card was going to be called the Super GEForce 95050++ Hyper-titanium Happy Extreme-platinum Ultra MK-II Enhanced V2.2 Omicron version. Keep your eyes open.
  • by netfool ( 623800 ) * on Wednesday April 14, 2004 @12:23PM (#8860773) Homepage
    nVidia might as well get into the case and CPU fan/heatsink business! Look at that thing!
    Hell, with something that big they should just build a freezer around the card.
  • by Gingko ( 195226 ) on Wednesday April 14, 2004 @12:24PM (#8860783)
    all of the latest DirectX 9.0 game titles

    what, both of them? ;)

    Thank you ladies and gentlemen, I'm here all week. Available for weddings, bar mitzvahs and light-hearted funerals.


  • ... but what am I going to have to PAY for this beautiful monster?

    It's big (2 slots), it probably runs VERY VERY hot, takes two power connectors... but it seems to trump EVERYTHING else so far, and not by small amounts!

  • Reviews! not Revews. (My typo. Sorry)
  • by Recoil_42 ( 665710 ) on Wednesday April 14, 2004 @12:24PM (#8860789) Homepage Journal

    here. [mbnet.fi]

    those benchmarks don't look too impressive to me, and the hugeass heatsink/fan combo is still there! not to mention that it requires *two* molexes?

    Nvidia is really starting to fall behind...
    • Hmmmmm. Let's see. We have about 10 reviews saying that the nVidia is 2x faster than current top of the line cards, and we have one review by [H]ardOCP which uses different measures in its benchmarks (different resolutions, AA, AF settings in the same graph) and is profoundly anti-nVidia and we are supposed to take it seriously? Come on...
    • by Cyph ( 240321 ) <(ten.ysaekaeps) (ta) (xinooy)> on Wednesday April 14, 2004 @12:46PM (#8861034)
      Those benchmarks are leaked images from the HardOCP benchmark, for the most part. If you look, you'll notice that HardOCP decided to do something unusual this time and not compare each card at the same performance settings, but rather compare it in such a way that it shows the top performance setting the card could use while running at a similar fps count as the other cards. I personally am not really fond of the approach, because seeing everything at the same in-game performance setting makes it a lot easier to compare to other cards.
  • FX 6200? (Score:3, Insightful)

    by Spleener12 ( 587422 ) * on Wednesday April 14, 2004 @12:24PM (#8860790)
    I'm curious as to whether or not this means there will be a new low-end NVIDIA card. Yeah, the 6800 is nice, but I'm more interested in the cards that I can actually afford.
    • Re:FX 6200? (Score:4, Insightful)

      by ameoba ( 173803 ) on Wednesday April 14, 2004 @01:42PM (#8861664)
      What counts as something you can 'actually afford'? If you don't plan on spending at least $100 on a graphics card, you're going to get shit (think Intel + Celeron). For right at $100 today, you can get a Radeon 9600 or a GeForce FX 5700, current generation (DX9) cards that have very respectable performance (for slightly less, GF4 Ti's and Radeon 8500/9100s are still available, giving you the last generation's high end for cheap).

      The current generation's low-end cards (as well as the last gen or two) aren't really worth the money if you want to do anything more complex than 3d screensavers. The FX5200 is a dog that isn't really any faster than the GF4mx was & isn't really worth using in DX9. The Radeon 9200 is actually slower than the 9100 & 9000. Eventually, the FX 6x00 core will be adapted to a chip that sucks just as much & you'd probably be better off getting a high to mid range card from the previous generation.

      Intel makes a 3.4GHz P4EE and AMD has the Athlon64 FX-53, both of which are $800+ CPUs. You don't see (many) people complaining about the top-of-the-line chips there being over twice the price of the chip two steps down ($275 should get you an Athlon64 3200+ or a P4 3.2GHz), yet when a new graphics card comes out and it's $500, everyone lines up to talk shit.

      Yet there's always going to be something in the $100-150 range (what's considered reasonable mid-range for serious gaming) that's worth buying (barring some sort of hyper-inflation); you don't always have to have the latest & greatest thing on the market. Game manufacturers realize this and target their games to be playable on $50 cards, ideal for $100-150 cards and able to take advantage of the $500 cards.
  • by Cthefuture ( 665326 ) on Wednesday April 14, 2004 @12:26PM (#8860814)
    Ahh! When did Tom's do away with the Q3 benchmarks?

    It's still the only game that can push the hardware to its limits reliably. All those other games tend to have bottlenecks that are algorithm/code related rather than hardware related (like the scripting engine in UT).

    Too bad, I found Quake3 to be one of the most accurate because it ran at such a low level and could pretty much push the hardware. It's not like those other games are using the hardware shaders yet anyway (or are they?).
    • [Quake 3] is still the only game that can push the hardware to its limits reliably. All those other games tend to have bottlenecks that are algorithm/code related rather than hardware related (like the scripting engine in UT).

      Yes, it is incredibly meaningful to see that card X can do 672 frames per second in Quake 3, and card Y can do 784 frames per second, even though your monitor can't show it that quickly or your eyes wouldn't see the difference if it could. When you can boast to your friends about num
    • by Zathrus ( 232140 ) on Wednesday April 14, 2004 @01:37PM (#8861605) Homepage
      It's not like those other games are using the hardware shaders yet anyway (or are they?).

      They are -- FarCry is probably the most intensive game out there right now, fully utilizing DX9 specs. Halo is no slouch either, although a lot of its speed issues are from wanting to use hardware that simply isn't present (on PCs -- it is on the Xbox; why they didn't port away from this is beyond me).

      Aquanox 2, Tomb Raider: Angel of Darkness, Painkiller, UT2k4, BF: Vietnam, and several others utilize DX9 to varying degrees as well. And there's the upcoming games -- Half Life 2, STALKER, Soldner (with an umlaut on the o), World of Warcraft, Everquest 2, and numerous others.

      Quake 3 simply isn't a reliable benchmark anymore. It utterly fails to exercise the newer features of the cards -- which are really the only features to bother upgrading for. If all you're going to do is play Q3-era games then a GeForce2 is more than sufficient. If you want to run games already out, and those coming out in the next year, with all the graphical options turned up and at high-res then you'll be best served by either the latest nVidia or (probably) ATI card.

      And (most importantly to me, and many others) if you want to get a card that can run new games at reasonable resolutions with most of the graphical bells and whistles on, but at a reasonable price... well, those $400 cards are going to be sub-$200 very quickly now, and the $200 cards are going to drop to around $100.
  • by guile*fr ( 515485 ) on Wednesday April 14, 2004 @12:27PM (#8860826)
    in other news, id Software announces that Doom III will run at 30 fps on the new Nvidia 6800
  • Holy mother of crap (Score:5, Informative)

    by l33t-gu3lph1t3 ( 567059 ) <arch_angel16.hotmail@com> on Wednesday April 14, 2004 @12:28PM (#8860832) Homepage
    Strong points of new Nvidia card:

    -Obscene performance boosts, on a scale I've never seen before
    -fancy new effects
    -massively improved image quality
    -heatsink fan still pretty quiet
    -basically free 4xFSAA and 8x ANISO

    Weaker points of new Nvidia card:

    -Expensive
    -it seems that shader precision is still not as pretty as ATI's, though that may be fixed by game patches
    -takes up 2 slots with the tall heatsink
    -480W recommended PSU
    -video processing engine isn't implemented in software yet

    I don't really object to the power requirements. This thing is more complicated, bigger, and has more transistors than a P4 Extreme Edition. It consumes about 110W, of which 2/3 is the GPU die's power draw. It is certainly NOT unreasonable to require a big power supply with this thing. It seems as though ATI's solution will have a power supply recommendation as well. Simply put, if you're gonna improve performance by such a margin by means other than smaller manufacturing, you're going to increase power consumption. Get over it.

    This thing isn't meant for SFF PCs or laptops, though I'm sure the architecture will be ported to a laptop chip eventually. As for the 2-slot size, well...It consumes 110W! To put this in perspective, it consumes more than any non-overclocked desktop CPU today! Think of how big your Athlon64/P4EE heatsink/fan is, then you'll realise that 2 slots aren't really that big of a problem.

    My own personal reason for wanting this thing: It can play any current game at 1600x1200 with 4xFSAA and 8x anisotropic filtering at a good framerate, and is the only card that can claim to do this right now :)
  • What with the license changes for XFree86, the various new X implementations, changing distros, etc. has NVidia come out and said which one their drivers will work with?
  • From the article:
    "What is also noteworthy is the fact that the performance of the Radeon 9800 XT drops at higher resolutions whereas that of the GeForce 6800 Ultra stays at about the same level."

    Wouldn't that mean that the limiting factor for fps is NOT the card but some other thing (processor, memory bandwidth) ?

    I mean, I know they used this hardware for the test:
    "The system we used consists of a Pentium 4 3.2GHz EE processor, EpoX 4PCA3+ i875P chipset motherboard, 1GB of Crucial DDR400 memory and two W
  • I wish .... (Score:5, Insightful)

    by El Cubano ( 631386 ) on Wednesday April 14, 2004 @12:32PM (#8860880)

    I wish that people that pretend to be computer experts would do the teeniest bit of research.

    How about this gem: First introduced in 1995, Microsoft's DirectX application programming interface (API) was designed to make life easier for developers by providing a standard platform for Windows-based PCs. Before the arrival of DirectX, developers had to program their software titles to take advantage of features found in individual hardware components. With the wealth of devices on the market, this could become a tedious, time-consuming process.

    I'm glad he cleared that up for us. Because this little known company called SGI [sgi.com] didn't develop OpenGL [opengl.org] back in 1992 [sgi.com]. In fact, were it not for MS, we would still be in the computer graphics dark ages.

    I'm not trying to troll here. I am just pissed that people pretend to be experts when they don't have a clue what they are talking about.

    • There are different OpenGL paths for every graphics card or architecture. You can use Architecture Review Board (ARB) standards as well. Those barely existed when DirectX and Direct3D came into being.
  • Let's get this out of the way:

    -"Who needs this? My Voodoo 3 runs Q3A just fine!"
    -"Does it have Linux support?"
    -"nVidia pwnz ATi!!11one!111~"

    Now that that's over with...

    I agree with a lot of the comments here: I really dislike nVidia's tendency towards massive, bulky, noisy, power-hog GPUs. And while the 6800's performance is nothing short of jaw-dropping, I'll bet ATi's solution will be far more elegant, smaller, with lower power requirements and less noise.

    Either way, though, this is good for consumer
  • by Selecter ( 677480 ) on Wednesday April 14, 2004 @12:39PM (#8860967)
    If ATi's counter-offering, due up on the 26th it seems, has 90% of the performance of this beastie but has lower power requirements (1 molex or none) and does not take up extra slots, then ATi will still beat it.

    There's a very limited number of gamers that will buy this card - you literally have to build a whole new PC around it considering the power requirements and the slot hoggishness. I won't be buying one. My 9500 Pro OC'ed to 300/300 with a 3000+ AMD *STILL* plays anything without problems (at least any I can see).

    Even if ATi does come out with a card that beats it, I won't be buying one of those either. Gaming is only *part* of what I use computers for. These days at age 40 I can't compete with the twitchy youngsters anyways :D

    I care a lot more these days about how well my data is protected and how good the whole experience is, not how many fps I get in some game.

    • by MrAngryForNoReason ( 711935 ) on Wednesday April 14, 2004 @03:01PM (#8862685)

      90% of the performance of this beastie but has lower power requirements ( 1 molex or none )

      I very much doubt Ati's new card won't need any additional power. Don't forget Ati's 9700 Pro was the first card to require more power than the AGP slot provided.

      ... considering the power requirements and the slot hoggishness.

      PCI slots aren't as useful as they used to be. So much is on board now that PCI cards aren't needed. Take the Asus A7N8X for instance: it has two network connectors on board as well as sound comparable to a high-quality PCI sound card. And don't forget the slot you lose is the first one, which shares an IRQ with the AGP slot, so it isn't a good idea to use it in any case.

    • Bad news for you, the ATI X800 will require 2 molex connectors too.

      ATI needs extra power too [theinquirer.net]
  • by Zed2K ( 313037 ) on Wednesday April 14, 2004 @12:54PM (#8861105)
    So does nvidia recommend any power supply brands to be used with this card? I would think they would almost have to recommend something as the power usage requirements might scare a lot of people away from buying the card just because they don't think (or know) if the one they have will work.
  • my next computer (Score:3, Interesting)

    by WormholeFiend ( 674934 ) on Wednesday April 14, 2004 @01:04PM (#8861198)
    I now fully expect to have to build my next PC around a video card, with the rest of the hardware being peripheral to the VPU and its board/heatsink.

    Crazy.

    I bet in a few generations more, home PCs will have fans so big, you'll be able to drive them around the house and mow the lawn, too!
  • by Professr3 ( 670356 ) on Wednesday April 14, 2004 @01:09PM (#8861268)
    Update: In 2005, nVidia released the specs for the GeForce 6900. Comments on geek news site "Slashdot" mainly focused on the need for two 1-kilowatt external gas turbines for power. Quotes like "Do the turbines run linux?" and "Will they be louder than my CPU fan?" are rampant, but most users say as long as the benchmarks are better than ATI, they don't mind wearing ear protection.

    When in doubt, mod +1 funny and pray

  • Power supply issues (Score:4, Informative)

    by EconolineCrush ( 659729 ) on Wednesday April 14, 2004 @01:18PM (#8861376)
    I see a lot of posts on the fact that the 6800 Ultra requires a 480W power supply. However, if you read Tech Report's review [techreport.com], you'll notice that the card's actual power consumption isn't much more than the previous generation of cards. In fact, its idle power consumption is actually lower than the 9800 XT's.
  • 16 pipelines. (Score:3, Interesting)

    by flaming-opus ( 8186 ) on Wednesday April 14, 2004 @01:21PM (#8861421)
    The top-of-the-line card is always cool to drool over, and a few people with too much money will undoubtedly run out and buy this monster. However the mid-range and budget derivatives are generally much more interesting. (compare the number of GF5600/RA9600 cards sold to the number of GF5950/RA9800 sold)

    They made this haul ass by doubling the number of pipes, but the first thing they are going to do when they put out a mid-range card is to halve, or quarter the number of pipes. How much has been done to refine this card, and how much impact will the new design have for those of us with $150 to spend on a video card?
  • by Anonymous Coward on Wednesday April 14, 2004 @01:25PM (#8861467)
    I was one of the lucky 250 people that got to be at the GeForce 6800 release in San Francisco. They held a LAN party of 250 people, including some tournaments of UT2k4 and BF:Vietnam. I made the quarterfinals (top 8) of the UT2k4 tournament and got to actually play on the new video card. All I can say is - wow. I own a 9800 XT so I'm not too shabby, but I took this card to the next level - the ability this card has is just unthinkable in a lot of ways if you're a graphics programmer like me.

    -Shader 3.0 compatible (Far Cry had a demo at the show of a patch they have coming out that will upgrade the game to Shader 3.0. It's by far the biggest improvement in a game I've ever seen, and I actually got to play it).

    -14983 3DMARK SCORE! If you know anything about 3dmark, you'd scream in joy at that one.

    -Other game companies were there like Everquest 2, Lord of the Rings: Battle for Middle Earth and of course the new nVidia chick Nalu, with per-pixel lit hair that has 2 million vectors rendered in real time...

    All I have to say is wow.
    (But wait for PCI express before you buy one)
  • PCI-Express (Score:3, Insightful)

    by mark_space2001 ( 570644 ) on Wednesday April 14, 2004 @01:57PM (#8861917)
    Neat, but at this point I think I'm going to wait for PCI-E to become common on motherboards before I upgrade. Bandwidth is starting to be an issue with just regular PCI, I'd prefer to get something that isn't going to be just a throw away item in a few short months.
  • by DroopyStonx ( 683090 ) on Wednesday April 14, 2004 @02:11PM (#8862120)
    A $600 card that requires a 480 watt power supply? Can you say "overkill"?

    Something in that will have to be redesigned before people will consider buying it.

    While some hardcore gamers wouldn't mind throwing that kinda cash at a vid card right now, most people won't. Of course, these cards are intended for general consumers once they get about a year old or in the $100-$299 price range, but the 480 watt power supply is like $20 extra per month on your electric bill if you're using it a lot!

    That'll be quite a shocker when people figure out that their brand new video card is spiking their elec. bill.
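
    For what it's worth, here is a rough way to estimate that kind of cost; the extra wattage, hours of use, and electricity rate below are assumptions picked purely for illustration:

        # Rough estimate of the monthly cost of a graphics card's extra power draw.
        extra_watts = 110     # assumed additional draw under load, in watts
        hours_per_day = 4     # assumed daily gaming time
        rate = 0.10           # assumed electricity price, $/kWh

        kwh_per_month = extra_watts * hours_per_day * 30 / 1000
        print(f"About ${kwh_per_month * rate:.2f} extra per month")  # ~$1.32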
  • Thoughts (Score:3, Interesting)

    by fullofangst ( 724732 ) on Wednesday April 14, 2004 @02:24PM (#8862280)
    Ahh well this is nice to see - a new generation of graphics card that will now allow me to play practically any of my games at up to 1600x1200 without gameplay-affecting slowdown. So far, so good.

    I am genuinely happy that Nvidia have released a product that can perform 'significantly' better than their currently available flagship card. As ATi are going to retaliate with their own card, this can only be a good thing and I hope they do actually keep this large performance jump up for the next generation(s).

    One thing to note in some benchmarks which I've seen so far is that some of the results give the maximum framerate of a game. I'd be happier reading either an average or minimum framerate achievable, as in a frenetic multiplayer game you are usually going to be rendering a lot more stuff than in single player. The minimum framerate is what I'll be watching out for as that is where the most frustration will come from - nothing quite so annoying as experiencing slowdown when something critical happens, or if you are in the middle of a hellishly large battle (which happens quite a bit in UT2004 Onslaught, for example).

    Unfortunately I won't be able to use this card in my Shuttle. The card is too big and too power-hungry. As someone else says, noise isn't exactly a problem as you would generally get this card to play fancy loud games on anyway.

    And recommending a 480w power supply? Hmm. Oh well, wish I was a hardware site journalist under NDA, I'd have had time to buy some shares in Enermax ;)
  • by francium de neobie ( 590783 ) on Wednesday April 14, 2004 @02:59PM (#8862664)
    1. The power consumption of the last-generation nVidia and ATI cards is indeed very similar. Please don't say ATI's cards consume less power.

    Comparison 1 [tomshardware.com]
    Comparison 2 [tomshardware.com]

    2. The ATI Radeon X800s will require two power rails also. So stop dreaming about a "power efficient" part and buy a new PSU :(
    ATI needs extra power too [theinquirer.net]

    That said, I'm no fanboy of nVidia or ATI though. The new GF 6800U is still occupying one extra PCI slot and blowing a whole lot of hot air inside the case. Imagine someone putting another 100W+ Prescott next to it. I just feel uncomfortable having a GFX card dissipate so much heat right next to the CPU. But well... ATI is gonna do that too (except for the two-slot thing).

    If there's any reason I'd look forward towards the X800s, I hope they won't require two slots - that is just inelegant. But based on the two molex connectors on the X800s, and the power consumption of their older parts, I won't hold any hope that ATI would "save power".
