GeForce 8800GTX Benchmarked 214
An anonymous reader writes "The card does not launch for another week, but DailyTech already has benchmarks of the new GeForce 8800GTX on its website. The new card is the flagship GPU replacing the GeForce 7900, and according to the benchmarks it has no problem embarrassing the Radeon X1950 XTX. According to the article, 'The GeForce 8800GTX used for testing is equipped with 768MB of GDDR3 video memory on a 384-bit memory bus as previously reported. Core and memory clocks are set at 575 MHz and 900 MHz respectively.'"
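If those clocks hold up, the raw memory bandwidth is easy to estimate from the quoted specs. A quick back-of-the-envelope sketch (the double-data-rate multiplier for GDDR3 is my assumption; the article only quotes the bus width and clock speeds):

    # Rough peak-bandwidth estimate from the quoted 8800GTX specs.
    # Assumption (not from the article): GDDR3 transfers on both clock
    # edges, so the effective rate is 2x the 900 MHz memory clock.
    bus_width_bits = 384
    memory_clock_hz = 900e6
    effective_rate_hz = 2 * memory_clock_hz   # DDR: two transfers per clock

    bytes_per_transfer = bus_width_bits // 8  # 48 bytes across the bus
    bandwidth_gb_s = bytes_per_transfer * effective_rate_hz / 1e9

    print(f"~{bandwidth_gb_s:.1f} GB/s peak")  # ~86.4 GB/s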
Holy Cow (Score:3, Funny)
Re: (Score:3, Interesting)
That video card has 50% more memory than my development database server.
Kinda scary, eh?
Re: (Score:2)
double scary
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:2)
I got more freaked out when I noticed that my Athlon64 has twice as much L1 cache as my first computer (a C64) had total memory.
That video card has 50% more memory than my development database server.
Oddly enough, memory size has now really outgrown what I manage to use up. Even with some huge memory drains, 2GB is more than enough memory. I don't see why the graphics card memory needs to increase either; from what I've gathered, the hot thing now is shaders (data manipulation) rather than
Re: (Score:2)
Re: (Score:2)
As shaders get more powerful, the number of texture maps is actually going up: a texture which once would have been a simple colour map may now have a colour map, a specular map, a diffuse map, a glow map, and a normal map, all merged together with a shader.
Whilst it is possible for shaders to produce textures in an entirely procedural manner, for most textures it's not as fast as doing a few texture map lookups, munging the data, and throwing it out... hence graphics memory is still needed. Al
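To put the parent's point in concrete terms, here is a minimal sketch in Python of the "few lookups plus munging" pattern (every name here is illustrative; a real shader would be HLSL/GLSL, and sample() is a hypothetical stand-in for a GPU texture fetch):

    # Per-pixel shading that combines several texture maps, the way a
    # fragment shader would. Not any real graphics API.

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def reflect(light_dir, normal):
        # Mirror the (surface-to-light) direction about the surface normal
        k = 2.0 * dot(normal, light_dir)
        return tuple(k * n - l for n, l in zip(normal, light_dir))

    def shade_pixel(uv, light_dir, view_dir, maps, sample):
        base   = sample(maps["colour"],   uv)  # RGB base colour
        spec   = sample(maps["specular"], uv)  # scalar specular strength
        glow   = sample(maps["glow"],     uv)  # scalar self-illumination
        normal = sample(maps["normal"],   uv)  # per-texel surface normal

        diffuse  = max(0.0, dot(normal, light_dir))  # Lambert term
        specular = max(0.0, dot(reflect(light_dir, normal), view_dir)) ** 16

        # Four texture fetches, a little arithmetic, one colour out
        return tuple(c * (diffuse + glow) + spec * specular for c in base)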
Re: (Score:2)
Hmm... I have a gig of memory and 5 gigs of swap, and have often hit the 3-gig swap-use mark just in my normal desktop use. Of course, I also have 12 virtual desktops configured right now :)...
In any case, 768 MB might seem like a lot of memory, but it isn't really. Textures are 2D images, and when you double the resolution, you quadruple memory consumption. And, of
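The quadrupling is simple arithmetic. A quick sketch (uncompressed 32-bit RGBA texels and the roughly one-third mip-chain overhead are my assumptions, not from the thread):

    # Memory cost of an uncompressed RGBA texture at several sizes.
    # A full mip chain adds ~1/3 (geometric series 1 + 1/4 + 1/16 + ...).

    def texture_bytes(width, height, bytes_per_texel=4, mipmapped=True):
        base = width * height * bytes_per_texel
        return base * 4 // 3 if mipmapped else base

    for size in (512, 1024, 2048):
        mb = texture_bytes(size, size) / (1024 ** 2)
        print(f"{size}x{size}: {mb:.1f} MB")
    # 512x512:   1.3 MB
    # 1024x1024: 5.3 MB  (doubling resolution quadruples the cost)
    # 2048x2048: 21.3 MB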
Re: (Score:2, Funny)
Re: (Score:2)
Get rid of AC logins, AC's are worthless (Score:2)
Said the anonymous coward who thinks that he knows my hardware setup without my even telling him. I like how you just psychically KNOW that I must not know what I'm talking about based on your meager, barely interesting little experience with your own crappy card.
I have a GeForce4 Ti 4200 overclocked almost to 4600 speeds. I've had it for a long time. I've been able to run most games at 1024 x 768 resolution with medium graphics settings for years and years now. So what's the problem
Re: (Score:2)
Re: (Score:2)
The idea of a socket purely for the GPU is a flawed concept. The memory technology used in graphics cards changes quite rapidly (notice how most graphics cards just skipped over DDR2 and went straight to GDDR3; now some have GDDR4 while DDR2 is only now becoming standard as system memory), different GPUs have different bus widths, and the memory speed varies. You solve this by putting the memory on-package with the GPU. Only then, you reali
Re: (Score:2)
wow (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Oh your god! (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
But.... (Score:3)
More power is never worse, though... unless you are trying to reduce power consumption...
My guess (Score:2)
Makes sense too: with a new chip, yields are likely to be a bit low at first, so you need to put it in the expensive stuff. After you've done some work on the process, you release some lower-end cards.
If you want a m
of course not (Score:2)
Re: (Score:2)
Re: (Score:2)
At last (Score:1)
I mean...
At last, the long-awaited G80 series! Only two things prevent my upgrade: Vista's final release requirements and the G80 series. Is it DirectX 10 ready as expected? I can't tell from the article. Bah! Even if it isn't, I'm holding off on Vista until well past its release; I could wait for the GeForce 8850GTX, or 8900GTX, or whatever their naming convention is, as well. Impressive stats, to say the least.
Wow... (Score:2)
Re: (Score:2)
Of course the real question is (Score:5, Insightful)
I'm planning on getting a high-end graphics card soon, but I'm going to hold off until Vista has been out and running for a bit, to evaluate and make sure I get one with good DX10 support. No sense spending money on a new generation of hardware if it doesn't fully support the new generation of software.
Re: (Score:2)
Umm, of course. The point of the G80 and R600 (ATI's next) is that they're the DX10-generation chips. However, how well it does DX10 is somewhat of a pointless question. As you point out, Vista won't be out for "a few months", and no games using DX10 will be out until a bit after that. By the time DX10 performance actually matters, an incremental spin of the 8800 (psychic: I'm guessing it'll be called the 8850) will be out.
Re:Of course the real question is (Score:4, Informative)
Damn good point it is too, I forgot that entirely.
Sure, the card might be good at DX9; that much is obvious. But how good is it at DX10?
The ATI offering may be substantially faster, or this thing may only do the basics of DX10 but be unable to do certain DX10 functions in a single pass where the competition can.
Who knows? I can say that in the past, when the two companies' offerings were designed slightly differently, one of them has taken performance hits in certain modes (IIRC ATI's competitor to the GF3 was fairly ho-hum, but don't quote me on that).
So to summarise: it might be a nice DX9 card, but until we see what DX10 demands and what both DX10 cards can do, we can only be sure of its current-gen performance, not next-gen.
Re: (Score:2)
ATI was a complete shit factory during most of the 3D card wars when 3Dfx was still around. ATI was not even in the game until the 9500 / 9500 pro. Nvidia was king for a while until the time of the GF4
Re: (Score:2)
It does DirectX 10. It does DirectX 10 much better than any other card that does DirectX 10. The G80 is the only chipset you can buy that does DirectX 10 at this point in time. So if you want to do DirectX 10, you must buy this card. It has no competition.
AMD won't have anything to compete until next year, and if recent efforts (the last 12 months) are any indication, it will be a "me too" offering from AMD rather than the glory days of the old ATI Radeon 9xxx series.
Re: (Score:2)
Re: (Score:2)
It's quite possible the R600 gear will be quicker than the 88xx's, but as usual it will be a paper launch from AMD, and since I'm in Australia, the channel is gutted and we won't get anything until much later i
DirectX 10 and Vista (Score:2)
Every time Microsoft releases a new version of DirectX, it has some sweet new feature that everyone wants, but none of the current cards on the market support it.
Microsoft has also said DirectX 10 and Vista will not be backward compatible with previous versions of DirectX. (Or has this changed? As I recall, Vista wasn't going to support applications built for previous OSes either; seems they changed their tune on that one. The
Re: (Score:2)
It does indeed support DX10. As the first-ever DX10 card, however, it will probably be put to shame by something else in 4-6 months regardless.
Re: (Score:2)
Re: (Score:3, Interesting)
"Windows Vista continues to support the same Direct3D and DirectDraw interfaces as Windows XP, back to version 3 of DirectX (with the exception of Direct3D's Retained Mode, which has been removed). Just as with Windows XP Professional x64 Edition, 64-bit native applications on Windows Vista are limited to Direct3D9, DirectDraw7, or newer interfaces. High-performance applications should make use
Re: (Score:2)
What I meant was that DX10 wouldn't be backward compatible. I have read that Vista will be backwards compatible, but that it would be via some sort of software emulation.
What I was getting at was that, according to the articles I have read, DX10 will simply not work on a card not designed for it, and DX10 itself was not going to be backward compatible. Basically, if you don't have a card built for it, you simply can't use it at all.
When DX9 came out, my 6800GT didn't supp
Re: (Score:2)
DX10 will not be available for any Windows version prior to Vista because the driver model in Vista has changed substantially, so drivers for XP et al won't work. If your card has a Vista-compatible driver, then it will work under DX10 - and because Nvidia rolls all their cards' drivers up into one neat package, once they release a Vista driver for one, chances are it will work for their entire range.
Your older card will work fine under Vista and DX10 once the drivers are available
Re: (Score:2)
Drivers for Windows XP et al won't work under Vista because the driver model has changed, and thus the current generation of cards won't work unless there's a new Vista driver for them. Get a Vista driver and you are fine and dandy, running under DX10 regardless of whether the card supports DX10-specific functionality or not.
The post I was replying to
Re: (Score:2)
AMD ATI vs Nvidia (Score:2, Insightful)
Ok, on to the meat of the topic. I read about this card on Tom's Hardware about a month ago and was very impressed. The specs Nvidia gave Tom's for the 8800GTX were 768MB of GDDR4 memory, 128 pixel pipelines, dual 384-bit memory buses (768-bit total), 4 RAMDAC cores at 450MHz, and 2 G80 cores at 550MHz with the memory at 1000MHz (2000MHz effective for DDR). The card probably won't have a
Re: (Score:2, Interesting)
From what I've been reading, come late 2008 AMD will have one or more GPUs built into their multi-core processors, using a new modular technology which allows them to quickly create application-targeted processors: one processor for games, another for database servers, still another for scientific applications requiring parallel processing, and so on. This is AMD's much-reported "Fusion" technology.
Re: (Score:2)
Re: (Score:2)
Also, check your basic facts. It's not dual-core. What on earth is a dual 384-bit bus? And 75nm production doesn't exist except for one DRAM (90, 80, 65, and 45 are the current and future logic steps).
Re: (Score:2)
Re: (Score:2)
It's actually a 512-bit and a 256-bit memory bus, for a total of 768 bits.
Only a few more release cycles... (Score:2)
BFG 10K, anyone?
Somewhat confused (Score:3, Funny)
And where are the ads?
Did I time travel 4 years into the past? What year is it?!
It will be clear soon (Score:2)
style (Score:2)
Underclocking (Score:2)
Why the high power consumption while idle? (Score:2)
Re: (Score:2)
(And it falls even if I run other CPU-heavy applications, so it's not the CPU that's clocking down.)
Re: (Score:2)
Rendering Farm on a Card (Score:2)
Another nail in the PS3 Coffin? (Score:2)
The Nvidia card that is said to be equivalent to the one in the PS3 is the 7900, which was launched in March.
The PS3 has been delayed so much that it is now launching AFTER the graphics card it is equivalent to has been superseded. That's not a
Re: (Score:2)
Re: (Score:2)
My understanding is that the console market isn't affected that much by the PC game market, but I'm not in marketing. If anybody has seen numbers on this, I'd love to see them.
Re: (Score:2)
I would have been more inclined to agree with you when they were still overestimating the capabilities of the Cell processor and planning to do the graphics with that alone. When they fell back to an Nvidia solution their graphical architecture became much more mundane while the total system architecture got more complex.
I'm not saying that the PS3 isn't still a worthwhile purchase for whatever reasons, but that this is a concrete example of how Sony has
Power consumption (Score:2)
Idle: 184 W
Load: 308 W
GeForce 8800GTX
Idle: 229 W
Load: 321 W
Damn. 300 watts just for a single video card. And now read this part:
Having two SLI bridge connectors onboard may possibly allow users to equip systems with three G80 GeForce 8800 series graphics cards. With two SLI bridge connectors, three cards can be connected without any troubles.
One full kilowatt just for running your video cards. It requires two slots and two power connectors.
My 6600GT already uses a power connector, which I found scar
Re: (Score:2)
Re: (Score:2)
From TFA: "Power consumption was measured using a Kill-A-Watt power meter that measures a power supply's power draw directly from the wall outlet".
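Worth keeping in mind that those figures are whole-system draw at the wall, not the card alone. One rough way to compare cards from wall-socket numbers (a sketch; the 80% PSU efficiency is an assumed figure, not from TFA):

    # Difference two wall-socket readings to estimate how much more DC
    # power one configuration draws than another.

    PSU_EFFICIENCY = 0.80  # assumption; real PSUs vary roughly 70-85%

    def extra_card_watts(wall_a, wall_b):
        """Approximate extra DC draw of config A over config B."""
        return (wall_a - wall_b) * PSU_EFFICIENCY

    # Load numbers quoted upthread: 8800GTX system 321 W vs 308 W
    print(f"~{extra_card_watts(321, 308):.0f} W more under load")  # ~10 W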
Re: (Score:2)
Actually, the PCI-E version of the 6600GT doesn't use an extra power connector; only the AGP version of the 6600GT does. The PCI-E version of the GeForce 6600GT gets all of its power from the PCI-E slot. It's an old card, but it should still be able to handle 1024x768 no problem.
Re: (Score:3, Interesting)
http://www.dailytech.com/article.aspx?newsid=4441
Re: (Score:2)
Re: (Score:2)
don't feel so bad (Score:2)
Or be sane and wait until there is a game I want to play that actually stresses out my computer.... but that... would be... exercising... so much restraint...
Re: (Score:2)
Re: (Score:2, Insightful)
Re: (Score:2)
It feels like it's all been done because, for the most part, that's true. Adding 37% more shiny crap to the same old game doesn't make it better; it just means it has more shiny crap, and you are going to spend a fortune on new hardware just to play the same old game concept.
I wish the industry would l
Re: (Score:2)
Right now I'm playing Cave Story and Bontago, both freeware games, while waiting on
Re: (Score:2)
Re: (Score:2)
So assuming we had infinite graphics power, would adding the mountain of development time required to achieve photo-realism create more "gameplay, story and characterisation"? More likely it'll be even more rehas
Re: (Score:2)
Re: (Score:2)
It seems to me that the current corporate games industry has one simple model for game development:
1: Create (or, more often, buy) a new game concept.
2: release version after version with tiny improvements and graphical tweaks to re-sell the same game again and again until people get sick of it.
3: goto 1
And they can spend years stuck on (2).
Whereas id Software, to name one good example, produces wonders each tim
Re: (Score:2)
Re: (Score:2)
But if you could see (and shoot at the same time) I assure you that you'd agree the graphics are impressive.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:3, Funny)
Just as long as it's not the 9800 Pro, that's fine.
Re: (Score:2)
Re: (Score:2)
V-sync seems to be enabled by default in the game with no option to turn it off. I did the above and got about a 10fps boost, making the game at least playable.
Re:Any chance of getting NVIDIA to opensource thei (Score:2)
Re: (Score:2)
Re: (Score:2)
Moderate this up if you have the moral fiber and intestinal fortitude to show something worthwhile about this pathetic breach of confidentiality.
Pathetic indeed.