GeForce 8800GTX Benchmarked

An anonymous reader writes "The card does not launch for another week, but DailyTech already has benchmarks of the new GeForce 8800GTX on its website. The new card is the flagship GPU replacing the GeForce 7900, and according to the benchmarks it has no problem embarrassing the Radeon X1950 XTX. According to the article, 'The GeForce 8800GTX used for testing is equipped with 768MB of GDDR3 video memory on a 384-bit memory bus as previously reported. Core and memory clocks are set at 575 MHz and 900 MHz respectively.'"
  • Holy Cow (Score:3, Funny)

    by 0racle ( 667029 ) on Friday November 03, 2006 @11:41PM (#16712581)
    equipped with 768MB of GDDR3 video memory on a 384-bit memory bus as previously reported. Core and memory clocks are set at 575 MHz and 900 MHz respectively
    Would you like some computer with your video card? Why even have a system at all? Can I just get a backplane to attach a NIC and power to this card and run everything from it?
    • Re: (Score:3, Interesting)

      by msobkow ( 48369 )

      That video card has 50% more memory than my development database server.

      Kinda scary, eh?

      • . . . but only 3/4 of what I've got in my cellphone . . .

        double scary

      • by kettch ( 40676 )
        Imagine what you could do with a beowulf cluster of these.
        • Re: (Score:3, Funny)

          by Firehed ( 942385 )
          I know one thing... prove that we have an energy crisis. By singlehandedly causing it.
      • by Kjella ( 173770 )
        Kinda scary, eh?

        I got more freaked out when I noticed that my Athlon64 has twice as much L1 cache as my first computer (C64) had in total memory.

        That video card has 50% more memory than my development database server.

        Oddly enough, memory size has now really outgrown what I manage to use up. Even with some huge memory drains, 2GB is more than enough memory. I don't see why the graphics card memory needs to increase either; from what I've gathered, the hot thing now is shaders (data manipulation) rather than
        • by Kirsha ( 201264 )
          What about big ass textures, lots and lots of them?
        • by Molt ( 116343 )

          As shaders get more powerful, the number of texture maps is actually going up: a texture which once would have had a simple colour map now may well have a colour map, a specular map, a diffusion map, a glow map, and a normal map, all merged together with a shader.

          Whilst it is possible for shaders to produce textures in an entirely procedural manner, it's not as fast for most textures as doing a few texture map lookups, munging the data, and throwing it out... hence graphics memory is still needed. Al
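
          To make the multi-map idea above concrete, here is a minimal CPU-side sketch in C of what such a shader does per pixel: one lookup per map, then a cheap merge. All names (texel, sample_map, shade) and the blend formula are made up for illustration; they are not from any real graphics API.

              /* Nearest-neighbour texture lookup; real hardware would filter. */
              typedef struct { float r, g, b; } texel;

              texel sample_map(const texel *map, int w, int h, float u, float v) {
                  int x = (int)(u * (w - 1));
                  int y = (int)(v * (h - 1));
                  return map[y * w + x];
              }

              /* Merge a colour, specular, and glow map, as described above. */
              texel shade(const texel *colour, const texel *specular, const texel *glow,
                          int w, int h, float u, float v, float light) {
                  texel c = sample_map(colour,   w, h, u, v);
                  texel s = sample_map(specular, w, h, u, v);
                  texel g = sample_map(glow,     w, h, u, v);
                  texel out = { c.r * light + s.r * light * light + g.r,
                                c.g * light + s.g * light * light + g.g,
                                c.b * light + s.b * light * light + g.b };
                  return out;
              }

          A few lookups plus arithmetic like this is cheaper than generating the texture procedurally, which is why all those maps still have to sit in video memory.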

        • Oddly enough, memory size has now really outgrown what I manage to use up. Even with some huge memory drains, 2GB is more than enough memory.

          Hmm... I have a gig of memory, and 5 gigs of swap, and have often hit 3 gig swap use mark just in my normal desktop use. Of course I also have 12 virtual desktops configured right now :)...

          In any case, 768 MB might seem like a lot of memory, but it isn't really. Textures are 2D images, and when you double the resolution, you quadruple memory consumption. And, of
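          The quadrupling is easy to check. A tiny C example, assuming an uncompressed 32-bit RGBA format (4 bytes per texel):

              #include <stdio.h>

              int main(void) {
                  long w = 2048, h = 2048;
                  long base    = w * h * 4;               /* 16 MiB               */
                  long doubled = (2 * w) * (2 * h) * 4;   /* 64 MiB: 4x the memory */
                  printf("%ld MiB vs %ld MiB\n",
                         base / (1024 * 1024), doubled / (1024 * 1024));
                  return 0;
              }

          A couple of dozen such textures, plus mipmaps and render targets, and 768 MB stops looking excessive.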

    • Re: (Score:2, Funny)

      by Anonymous Coward
      equipped with 768MB
      nVidia is preparing for Windows Vista
      • I got a chuckle out of this. Incidentally, I am someone who is "lucky" enough to own a last-generation card new enough to run the latest games at acceptable framerates, and yet somehow too dumpy to run Vista in its fully tricked-out form.
    • Why don't they make it with a CPU-kinda socket, so we can unplug the CPU and plug this GPU in its place instead? It would be more fair...
      • See, we do have a GPU "socket"; it's just a long socket with only two rows of pins.

        The idea of a socket purely for the GPU is a flawed concept. The memory technology used in graphics cards changes quite rapidly (notice how most graphics cards just skipped over DDR2 and went to DDR3, and now some have DDR4 while DDR2 is only now standard as system memory), different GPUs have different bus widths, and the memory speed varies. You solve this by putting the memory on-package with the GPU. Only then, you reali
    • I predict this card will fail. No one will ever need more than 512 MB of ram on a video card.
  • wow (Score:3, Funny)

    by pppppppman ( 986720 ) on Friday November 03, 2006 @11:42PM (#16712587)
    Wow... this thing could run like... two Vistas... maybe
  • Oh your god! (Score:3, Insightful)

    by Daath ( 225404 ) <(kd.redoc) (ta) (pl)> on Friday November 03, 2006 @11:43PM (#16712593) Homepage Journal
    Oh your god! 92% more FPS than ATI's current flagship! Both in HL2 and in Quake 4! "Only" 54% better 3Dmark06 score though. This card is crazy ;P I wish I could afford a truck full of these. Or maybe just one. Hmm and a new CPU... And more RAM... And some huge disks in RAID-5... Damn.
  • by thealsir ( 927362 ) on Friday November 03, 2006 @11:43PM (#16712599) Homepage
    But will the 8600 GT be in a good price range? The 8200? This will matter to a lot more people.

    More power is never worse, though... unless you are trying to reduce power consumption...
    • The catch is that both the launch cards will be expensive. nVidia's usual form is to launch high end only with a new major numerical generation (this being the GeForce 8 series). The high-end one will be $600 or more, the next one down probably $300-400. You'll have to wait a few months for a more midrange card to come out.

      Makes sense too: it's a new chip, and yields are likely to be a bit low at first, so you need to drop it in the expensive stuff. After you've done some work, you release some lower cards.

      If you want a m
    • But it should push down the prices for the 7000 line, which is nice since I don't pay more than $100 for a video card (7600GS is the best at the moment).
    • by ionpro ( 34327 )
      The low-end cards are actually in the -300 series now. Why are manufacturers doing this? ATI had to release their new card as the X1650XT even though it is substantially different from the X1600XT because of this inflation of numbers. When the 9700 came out (as a top-end card), the other two were the 9500 and the 9000. They should go back to that naming and give themselves some headroom (heck, I'd even make the 9500 a 9400).
  • The final piece of my plan for world domi...

    I mean...

    At last, the long awaited G80 series! Only two things prevent my upgrade: Vista's final release reqs and the G80 series. Is it DirectX 10 ready as expected? I can't tell from the article. Bah! Even if it isn't, I'm holding off on Vista until well past its release; I could wait for the GeForce 8850GTX, or 8900GTX, or whatever their naming convention is, as well. Impressive stats to say the least.
  • That looks... expensive.
  • by Sycraft-fu ( 314770 ) on Saturday November 04, 2006 @12:02AM (#16712681)
    Does it do DirectX 10? If so, how well? I mean the target market here is the high end gamer thus the interest is going to be on having something that supports the latest, greatest. The game development community seems to be going bonkers over DX10 so it's something to consider before you get a card.

    I'm planning on getting a high-end graphics card here soon but I'm going to hold off until Vista is out and running for a bit to evaluate and make sure I get one with good DX10 support. No sense in spending money on a new generation of hardware if it doesn't support the new generation of software fully.
    • Does it do DirectX 10? If so, how well?

      Umm, of course. The point of G80 and R600 (ATI's next) is that they're the DX10-generation chips. However, how well it does DX10 is somewhat of a pointless question. As you point out, Vista won't be out for "a few months", and no games using DX10 will be out until a bit after that. By the time DX10 performance actually matters, an incremental spin of the 8800 (psychic, I'm guessing it'll be called the 8850) will be out.
      • by AbRASiON ( 589899 ) * on Saturday November 04, 2006 @03:54AM (#16713955) Journal
        I think you're missing his point.

        Damn good point it is too, I forgot that entirely.

        Sure, the card might be good at DX9, that much is obvious, but how good is it at DX10?
        The ATI offering may be substantially faster, or this thing may only do the basics of DX10 but be unable to do certain DX10 functions in a single pass, where the competition can.

        Who knows? I can say that in the past, one of the two companies' offerings has sometimes been designed slightly differently, which has led to performance hits in certain modes (IIRC ATI's competitor to the GF3 was fairly ho-hum, but don't quote me on that).

        So to summarise, it might be a nice DX9 card, but until we see what DX10 demands and what both DX10 cards can do, we can only be sure of its current-gen performance, not next-gen.

        • "Who knows? I can say that in the past, sometimes the 2 companies offering, 1 of them has been designed slightly differently which has led to performance hits in certain modes (iirc ATI's competitor to the GF3 was fairly ho-hum, but don't quote me on that)"

          ATI was a complete shit factory during most of the 3D card wars when 3Dfx was still around. ATI was not even in the game until the 9500 / 9500 pro. Nvidia was king for a while until the time of the GF4
    • A rather redundant question, but here goes:

      It does DirectX 10. It does DirectX 10 much better than any other card that does DirectX 10. The G80 is the only chipset you can buy that does DirectX 10 at this point in time. So if you want to do DirectX 10, you must buy this card. It has no competition.

      AMD won't have anything to compete until next year, and if recent history (the last 12 months) is any indication, it will be a "me too" offering from AMD rather than the glory days of the old ATI Radeon 9xxx series.
      • by jandrese ( 485 )
        If the past is any indication, the ATI product will be faster than the competing nVidia product, but will be held back by driver issues. Once the drivers are sorted out (mostly) they'll be faster at most everything until the next big nVidia release.
        • ATI never really had an answer for the 7950 SLI-on-a-card setup; they have yet to release a card that is conclusively faster all around. Prior to that, ATI's X19xx's were on par with the 79xx's, but in smaller supply and generally priced higher. If money was no object we'd all be buying renderfarms.

          It's quite possible the R600 gear will be quicker than the 88xx's, but as usual it will be a paper launch from AMD, and since I'm in Australia, the channel is gutted and we won't get anything until much later i
  • I'm no hardware techie, but I do so enjoy playing a good game --- "when I have time" (yeah...).

    Every time Microsoft releases a new version of DirectX, it has some sweet new feature that everyone wants but none of the current cards on the market support.

    Microsoft has also said DirectX 10 and Vista will not be backward compatible with previous versions of DirectX. (Or has this changed? As I recall, Vista wasn't going to support applications built for previous OSes either - seems they changed their tune on that one. The
    • by beavis88 ( 25983 )
      I could be missing something and maybe the card does support DX10

      It does indeed support DX10. As the first ever DX10 card, however, it probably will be put to shame by something else in 4-6 months regardless ;)
      • by Shados ( 741919 )
        No way! Remember the FX series of cards? They had, amazingly, DirectX 9 and Pixel Shader 2.0 support even though it was new!

        ::grumbles at his FX 5900 Ultra that can't play most DX9 games at an acceptable frame rate...::
    • Re: (Score:3, Interesting)

      by westlake ( 615356 )
      Microsoft has also said DirectX 10 and Vista will not be backward compatible with previous versions of DirectX.

      "Windows Vista continues to support the same Direct3D and DirectDraw interfaces as Windows XP, back to version 3 of DirectX (with the exception of Direct3D's Retained Mode, which has been removed). Just as with Windows XP Professional x64 Edition, 64-bit native applications on Windows Vista are limited to Direct3D9, DirectDraw7, or newer interfaces. High-performance applications should make use

      • by GoMMiX ( 748510 )
        Yes, I misspoke, or mis-wrote... whatever have you.

        What I meant was that DX10 wouldn't be backward compatible. I have read that Vista will be backwards compatible, but via some sort of software emulation.

        What I was getting at was that according to the articles I have read DX10 will simply not work on a card not designed for it - and DX10 itself was not going to be backward compatible. Basically, if you don't have a card built for it you simply can't use it at all.

        When DX9 came out - my 6800GT didn't supp
        • No, you have it wrong.

          DX10 will not be available for any Windows version prior to Vista because the driver model in Vista has changed substantially, so drivers for XP et al won't work. If your card has a Vista-compatible driver, then it will work under DX10 - and because Nvidia roll all their cards' drivers up into one neat package, once they release a Vista driver for one, chances are it will work for their entire range.

          Your older card will work fine under Vista and DX10 once the drivers are available
    • by Jozer99 ( 693146 )
      Vista isn't backwards compatible with older DirectXs, but DirectX 10 is backwards compatible: you can't install DirectX 8.0 on Vista, but DirectX 10 will run DirectX 9 games just fine. DirectX 10 just builds on what is already there in DirectX 9, so DirectX 10 games will have spiffy features that DirectX 9 games don't have. I may well be mistaken, but I was under the impression that this card is compatible with DirectX 10. However, that does not say that it will run DirectX 10 games very fast, since it
  • AMD ATI vs Nvidia (Score:2, Insightful)

    by Black-Six ( 989784 )
    Now to get things straight: I'm not bashing Nvidia here or criticizing AMD/ATI, as I own products from both and am very impressed.

    Ok, on to the meat of the topic. I read about this card on Tom's Hardware about a month ago and was very impressed. The specs Nvidia gave Tom's for the 8800GTX were 768MB of GDDR4 memory, 128 pixel pipelines, dual 384-bit memory busses (768-bit total), 4 RAMDAC cores at 450MHz and 2 G80 cores at 550MHz with the memory at 1000MHz (2000MHz for DDR). The card probably won't have a
    • Re: (Score:2, Interesting)

      by tonyray ( 215820 )
      Who knows what these guys have in store for us

      From what I've been reading, come late 2008 AMD will have one or more GPUs built into their multi-core processors, using a new modular technology which allows them to quickly create application-targeted processors: one processor for games, another for database servers, still another for scientific applications requiring parallel processing, and so on. This is AMD's much-reported "Fusion" technology.
    • I'm sorry, but does your post have a point? You ramble between Nvidia, ATI, and AMD randomly.

      Also check your basic facts. It's not dual core. What on earth is a dual 384-bit bus? 75nm production doesn't exist except for one DRAM (90, 80, 60, and 45 are the current and future logic steps).
      • While most of your post is right on, the last sentence is inaccurate. 75nm [fabtech.org] and 65nm [cbronline.com] fabs are already in operation. I know; I design for one :)
    • dual 384 bit memory busses (768 bit total)


      It's actually a 512 bit and a 256 bit memory bus for a total of 768.
  • ...until they reach the 5 digit numbers. My guess is BFG is already drooling over what's just over the horizon.

    BFG 10K, anyone?
  • by ET_Fleshy ( 829048 ) <lespea.gmail@com> on Saturday November 04, 2006 @01:03AM (#16712979)
    Where are the remaining 27 pages of the article?

    And where are the ads?

    Did I time travel 4 years in the past? What year is it!
  • Too bad the cooler looks like my aunt's hair dryer.
  • Even if you underclocked this thing you'd have enough performance to load two copies of Half Life 2 at once and still have enough memory left over to play Solitaire!
  • Shouldn't they implement something like SpeedStep, but for the GPU? It would be killer to have one of these if they only drew minimal power when not using them for anything other than XGL or AeroGlass.
    • They already have that, I think. The temperature in my computer does rise when I play 3D games, and it does fall again when I stop playing.

      (And it falls even if I run other CPU-heavy applications, so it's not the CPU that's clocking down.)

      • by ostiguy ( 63618 )
        they do - a lot of the cards that require a molex power connector will work without it in Windows, but fire up a 3D game and your system crashes. I believe my X800 XT AGP runs at 200 or 300MHz on the desktop, 500MHz in games.
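
        That idle/load switching is easy to picture in code. A minimal sketch in C, with thresholds and clock values loosely based on the numbers above but otherwise made up (real drivers use their own heuristics):

            #include <stdio.h>

            /* Pick a core clock from GPU load; hypothetical numbers only. */
            int pick_core_clock_mhz(int utilisation_pct) {
                if (utilisation_pct < 20)
                    return 300;   /* 2D / desktop clock */
                return 500;       /* full 3D clock, needs the molex feed */
            }

            int main(void) {
                printf("desktop: %d MHz\n", pick_core_clock_mhz(5));
                printf("game:    %d MHz\n", pick_core_clock_mhz(95));
                return 0;
            }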
  • Video cards are getting absurd, but this really ties it up. At over 300 watts and this size, I've officially built a desktop gaming PC that's smaller and uses less power than this card alone... and that was only 3 years ago! There are so many parallel, custom-instruction-set processors on this that it's truly a rendering farm stuffed onto one piece of silicon. I can't decide whether that's irresponsible or appropriate, but I do know no system I've built would even fit this card, let alone power it, let alone be tole
  • I know it's unfashionable to pile on Sony anymore, but for the PS2 and XBox there was a 3-6 month window before PC graphics caught up. I do believe that is what drove some of the early adopters: something so powerful wasn't available in any other form.

    The Nvidia card that is said to be equivalent to the one in the PS3 is the 7900, which was launched in March.

    The PS3 has been delayed so much that they are now launching AFTER the graphics card that they are equivalent to has been superseded. That's not a
    • Consider that the PS3 is $500 and a high-end graphics card is now $600. The days of consoles having the most powerful graphics are over; the economics don't support it.
    • My understanding is that the console market isn't affected that much by the PC game market, but I'm not in marketing. If anybody has seen numbers on this, I'd love to see them.

  • Radeon X1950 XTX
    Idle: 184 W
    Load: 308 W

    GeForce 8800GTX
    Idle: 229 W
    Load: 321 W

    Damn. 300 watts just for a single video card. And now read this part:


    Having two SLI bridge connectors onboard may possibly allow users to equip systems with three G80 GeForce 8800 series graphics cards. With two SLI bridge connectors, three cards can be connected without any troubles.


    One full megawatt just for running your video cards. It requires two slots and two power connectors.

    My 6600GT already uses a power connector, which I found scar
    • One full megawatt just for running your video cards. It requires two slots and two power connectors.
      I think you mean kilowatt.
    • Damn. 300 watts just for a single video card.

      From TFA: "Power consumption was measured using a Kill-A-Watt power meter that measures a power supply's power draw directly from the wall outlet".
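
      Worth spelling out: a wall-outlet reading covers the entire system, and it includes the power supply's conversion losses. A back-of-envelope in C, assuming (hypothetically) an 80%-efficient PSU:

          #include <stdio.h>

          int main(void) {
              double wall_watts = 321.0;       /* whole system, at the outlet */
              double psu_efficiency = 0.80;    /* assumed, not measured       */
              /* DC power actually delivered to CPU, board, drives AND card: */
              printf("DC side: %.0f W total\n", wall_watts * psu_efficiency);
              return 0;
          }

      So the card by itself draws considerably less than the headline 300+ W figure.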
