Positive Reviews For Nvidia's GeForce 6800 Ultra 564
Sander Sassen writes "Following months of heated discussion and rumors about the performance of Nvidia's new NV4x architecture, today their new graphics cards based on this architecture got an official introduction. Hardware Analysis posted its first look at the new GeForce 6800 Ultra, taking it for a spin with all of the latest DirectX 9.0 game titles. The results speak for themselves: the GeForce 6800 Ultra is the new king of the hill, beating ATI's fastest by over 100% in almost every benchmark." Reader egarland adds "Reviews are up on Firing Squad, Tom's Hardware, Anandtech and Hot Hardware." Update: 04/14 16:54 GMT by T : Neophytus writes "HardOCP have their real-life gameplay review available."
Wonders Never Cease... (Score:5, Funny)
In a word, "Wow."
I mean, who'd have thunk it that the 6800 would still have life? Maybe ATI can counter with a Radeon All-In-Wonder Xtravaganza 6502!
Beats ATI by 100%... (Score:2)
I wouldn't expect a new card NOT to beat out the current cards. ATI and Nvidia have played this catchup game with each other for years.
2d Performance (Score:5, Interesting)
How well does this one do at 2d rendering? I do play 3d games a lot, but that doesn't mean I want my web browsing and other non-3d activities to be sub-par.
Re:2d Performance (Score:5, Interesting)
Besides, I'm not going to be using an analog output for too long... DVI kills the whole "2d quality" argument; the color values are passed digitally via a TMDS transmitter. Doesn't matter if
Re:2d Performance (Score:4, Funny)
Doesn't matter if
Dammit, my Geforce isn't displaying the text properly, I can't make out the end of your sentence!
Depends on board components.. (Score:5, Interesting)
So long as you have a quality graphics card, it really doesn't matter whose chipset is powering it. For example, even though NVidia has a poor rep, there are still high-quality cards out there.
Re:Depends on board components.. (Score:3, Interesting)
I bought a 9500 Pro a year ago and I've only ever been able to use it for a month. I'm on card number 4 now because of a flaw in the way the heatsink/heatsink shim was made (something their customer service reps admit to). I was so burned by the 9500 that I could honestly never bring myself to buy another ATI card for as long as I live. Much in the same way it w
nvidia's back (Score:3, Insightful)
Did anyone else notice the size of the die rivals even that of the Pentium 4 EE? This thing is frickin' huge!
Re:nvidia's back (Score:4, Interesting)
Is it considered "safe" to buy any of the Nvidia chipset motherboards, or are they still pretty sketchy?
Re:nvidia's back (Score:3, Informative)
Re:nvidia's back (Score:4, Informative)
Re:nvidia's back (Score:2)
I have an ECS [ecs.com.tw] N2U400-A motherboard with an NVidia N Force 2 Ultra chipset. It's fantastic. Rock solid stable and fast.
Don't take my word for it, google up some reviews of motherboards with the chipset. It's good stuff.
LK
Re:nvidia's back (Score:2)
Oh, I also have an Audigy Platinum and an All-In-Wonder 9700 Pro. No problems whatsoever.
Re:nvidia's back (Score:3, Insightful)
Re:nvidia's back (Score:5, Insightful)
Re:nvidia's back (Score:5, Funny)
Nvidia GeForce 6800 Ultra: $600
800 Watt Power Supply: $250
MMORPG: $10/mo.
The look on your face when you get your next powerbill: Priceless
There are some things in life your measly paycheck can cover; for everything else there's Massivecharge.
C & D Letter Forthcoming.. (Score:5, Funny)
We at Mastercard do not appreciate that you are using our wonderful ad lines to mock our business. We are aware that we do apply massive charges, but to bring that to the forefront is immature and irresponsible.
Please have your lawyers contact us so we may discuss a settlement which you can pay directly to us at your earliest convenience.
Thank you.
Re:C & D Letter Forthcoming.. (Score:5, Funny)
Do you take American Express?
Re:nvidia's back (Score:3, Interesting)
I've been a hardcore nVidia follower for years, but after last year I was left with a bad taste in my mouth. I'm glad to see another generation of video cards and I can't wait to see what ATI's got to offer - it's been a while since nVidia has had to play catch-up.
Yea! More horsepower for Doom ]|[ (only 2 more months!)
ATI may be right there with them (Score:4, Interesting)
Re:ATI may be right there with them (Score:3, Funny)
Ahahahahah. You should be doing standup with lines like that, hell you even managed to keep your face straight!
When I bought a 9800 Pro I had to do three kernel recompiles to get the damn drivers to work properly, I had to edit part of their interface code because they didn't handle the KT600 chipset I have on my board and even when I got them installed it screwed up my vttys when X was working and they driver
Re:nvidia's back (Score:5, Funny)
What are they doing with my feet? Give 'em back!
latest vs last-year (Score:5, Informative)
Re:latest vs last-year (Score:3, Interesting)
I don't think so. The first ATi card to be released will be a 12x1 pipe version while the first nVidia card will be a 16x1 pipe version. ATi seriously underestimated what nVidia was planning and moved production of their 16x1 pipe version up five months. ATi was scared s***less a
Fanboyism (Score:4, Insightful)
Personally I don't get the fanboy rivalries--I have a Radeon in my laptop and a Geforce in my desktop, and that's just what I happened to buy at the time, no fanboy adherence going on.
Re:Fanboyism (Score:4, Interesting)
Had a NVidia GEForce2 when it was at the top of the pile a few years ago, picked up an ATI 9700Pro when it was released. May go back to Nvidia, may stay with ATI (shrug).
In the long run, all of us consumers benefit from some healthy competition. Granted, as a Canuck, I'm happy to see ATI do well - but they also earned it. At the time the 9700Pro was released, ATI blew Nvidia out of the water. Nvidia had grown a tad complacent, and they paid for it.
Now we'll see what happens with Nvidia having a fast new card and ATI about to release their new offering in a few more weeks.
N.
As a matter of fact, here are some specs on X800 (Score:3, Informative)
Here's what the Register says [theregister.co.uk]:
ATI will ship its much-anticipated R420 chip later this month as the Radeon X800 Pro. The part's 26 April debut will be followed a month later by the Radeon X800 XT on 31 May.
So claims Anandtech, citing unnamed vendor sources and a glance at ATI's roadmap.
If the date is accurate, it puts ATI just 13 days behind Nvidia's NV40 launch on 13 April. NV40 will surface as the GeFo
Comment removed (Score:5, Funny)
Insensitive... (Score:2, Funny)
Power Requirements (Score:5, Insightful)
but it requires a 480 watt power supply
and 2 power connections... And it also has what looks to be a vacuum cleaner tied to it..
I currently use a shuttle skn41g2 for my main box.. I love the sff pc's. This won't work in that.. It would make the included power supply very sad.
My HTPC box uses an antec sonata with a fanless radeon 9000, and ultra quiet everything else.. Forget using this in a quiet pc as well
I don't care for nvidia's trend towards hideously loud, bulky, power hungry video cards.. They might perform well, but for normal use, i'd prefer something smaller and quieter.. and for god's sake, give me an external power supply.. heh
Re:Power Requirements (Score:4, Insightful)
Now power consumption... that can be an issue.
Re:Power Requirements (Score:3, Insightful)
When you watch tv, turn on the radio to a low sound level.
Even if you have the tv up loud, its still annoying.
(Not to mention using the computer for non-gaming stuff)
Re:Power Requirements (Score:2, Funny)
and 2 power connections... And it also has what looks to be a vacuum cleaner tied to it..
The looming question is: which is the principal component in your box, the motherboard or the video card? My present video card has more memory and draws more power than my laptop, and like yours has a fan, though it's quiet.
FNNNZZOWWWNT! "Wayl, shewt! Thar goes the arc welder! Gessen we cain't play no Medal o' Honor till we gets a new one."
Re:Power Requirements (Score:2, Funny)
bah, I won't be impressed until a video card requires 1.21 gigawatts.
Re:Power Requirements (Score:2)
You bring up an interesting point. I wonder what it would take to create a whole-house AC/DC converter. Once in DC it's an easy step up or down to the proper voltage for a PC, or any other number of little gadgets that incorporate transformers.
Hmm, I only know electronics from one class in Physics so I couldn't comment on it much now. I should look into it though.
I can imagine a 45V supply running through to outlets that support the barrel jacks of DC/DC converters. Maybe 12V? Most devices that use b
Re:Power Requirements (Score:3, Interesting)
RTFA. (Score:3, Informative)
Though you are right, using it in an SFF wouldn't be a great idea. Can't have everything.
(And several of the sites mention how it worked flawlessly with a 400W PSU, and the
Incredible day for PC gaming! (Score:4, Interesting)
Now, as DooM 3 is supposedly being released alongside the 6800, can we expect DooM in mid-May? This is truly an incredible day for PC gaming, as we will have cinematic computing in the near future.
I'm giddy.
Its HUGE (Score:5, Interesting)
Re:Its HUGE (Score:3, Interesting)
Re:Its HUGE (Score:2)
Re:If you can afford this... (Score:2)
And the word of the day @ ATI.... (Score:4, Funny)
the cards are still all very expensive (Score:2, Insightful)
Re:the cards are still all very expensive (Score:2, Insightful)
Re:the cards are still all very expensive (Score:4, Informative)
the prices of _new_ cards are always at the maximum that somebody would pay for them.
if you want a cheap card, buy a cheap card (that same cheap card would have cost hundreds of dollars a few years back).
the way i see it there are a few categories that have held for years: 1. ultra cheaps at 30-50$ 2. entry level gaming cards at 100$ 3. medium level gaming cards at 200-300$ and 4. high end gaming cards at an insane 400-500$. all that changes over the years is which speed cards belong where.
new cheap cards come along occasionally, but they're usually based heavily on yesterday's high end chips.
Impressive! (Score:4, Interesting)
I must admit, after looking at the benchmarks from Tom's and Anand's earlier this morning, I am *very* impressed by the results of this chipset. I still have concerns about the cooling and power requirements, as well as the image quality, but that may be partly related to my newfound ATI fanboy-dom.
Speaking of which, I can't wait to see what the boys from Canada have coming next week. 16 pipelines? Mmmm....
Money (Score:2, Insightful)
No seriously, this thing costs more than a new full fledged computer.
I like the reviews, but.... (Score:2, Informative)
It's not hard to see why the U.S. has to violently defend our oil interests when we have video cards wastefully burning through electricity like there's no tomorrow.
I'm all for advances in processor technology, just not when it comes with a high energy consumption price.
I once heard that by leaving a computer with a meas
Re:I like the reviews, but.... (Score:2)
Re:I like the reviews, but.... (Score:5, Informative)
Perhaps the survey you are referring to was measuring energy consumption of a mini-fridge for a single 12 oz. can of beer (served ice cold), but the common refrigerator, and I mean modern, not the ones from the 70s and 80s, as they improve with time, draws about 700 - 750W. This is about double that of a computer loaded with hardware doing average browsing or word processing. The ratio is less when UT2004 is activated (W00T).
Re:I like the reviews, but.... (Score:3, Informative)
A fridge drawing a constant 700W running 24/7 for 365 days would cost about $613/year to run, assuming an average of $0.10/kWh. ~$50/month electric bill just for the fridge? I don't think so.
M
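The rebuttal's arithmetic is easy to verify; here's a minimal sketch, using the 700 W constant draw and the $0.10/kWh rate the posts above assume:

```python
def annual_cost(watts, rate_per_kwh=0.10, hours=24 * 365):
    """Yearly electricity cost for a device drawing `watts` continuously."""
    return watts / 1000 * hours * rate_per_kwh

# A constant 700 W draw, 24/7, at $0.10/kWh:
print(round(annual_cost(700)))  # -> 613
```

Which is the ~$613/year (about $50/month) figure above - far more than any real fridge adds to a bill, since compressors only run intermittently.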
"... by over 100% in almost every benchmark"?? (Score:5, Insightful)
To measure how well both cards perform with actual gameplay we used Unreal Tournament 2003 and 2004, Halo, and Far Cry. For both versions of Unreal Tournament we've used the built-in benchmark, which consists of a flyby and a botmatch. We've omitted the flyby scores as they don't tell us much about performance during actual gameplay, just how fast the graphics card is able to render the flyby. With UT2003 the lead the GeForce 6800 Ultra takes over the Radeon 9800 XT is less impressive: at 1024x768 and 1280x1024 resolutions it is only 6% faster. At 1600x1200 however the GeForce 6800 Ultra pulls away and clocks in 21% faster. With UT2004 the difference is much bigger, starting off at 10% at 1024x768 up to 65% faster at 1600x1200. What is also noteworthy is the fact that the performance of the Radeon 9800 XT drops at higher resolutions whereas that of the GeForce 6800 Ultra stays at about the same level.
I know this is
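For reference, the "X% faster" figures quoted in reviews like this are just relative framerate ratios; a quick sketch (the fps numbers here are hypothetical, not taken from the review):

```python
def percent_faster(fps_new, fps_old):
    """Relative speedup, as reviewers usually quote it."""
    return (fps_new / fps_old - 1) * 100

# Hypothetical botmatch numbers illustrating the quoted gaps:
print(round(percent_faster(63.6, 60.0)))   # -> 6   (the 1024x768 UT2003 gap)
print(round(percent_faster(121.0, 100.0))) # -> 21  (the 1600x1200 UT2003 gap)
```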
Re:"... by over 100% in almost every benchmark"?? (Score:2)
Re:"... by over 100% in almost every benchmark"?? (Score:2)
Re:"... by over 100% in almost every benchmark"?? (Score:3, Informative)
The definition of "almost" (Score:4, Informative)
Congratulations on finding the games section where it didn't womp the best ATI card until you get into the higher resolution ranges.
However, you'll notice on the preceding pages, "over 100% better" was a very common occurrence in areas like shaders and lighting and whatnot.
Pointing out areas where the GeForce doesn't beat the ATI at 100% does exactly nothing to diminish the point of the article submitter.
This is why he said "almost every" and not "all."
Ben
Looks great but... (Score:4, Funny)
How is it the "King of the hill"? (Score:5, Interesting)
I'd wait until the Radeon X800 benchmarks are out before crowning a new king. For all we know ATI's new offering will beat the new GeForce.
Re:How is it the "King of the hill"? (Score:4, Informative)
Dinivin
crazy titles (Score:3, Funny)
Addtional Revenue (Score:3, Funny)
Hell, with something that big they should just build a freezer around the card.
short review (Score:5, Funny)
what, both of them?
Thank you ladies and gentlemen, I'm here all week. Available for weddings, bar mitzvahs and light-hearted funerals.
Talk about cornering the market ... (Score:2, Insightful)
... but what am I going to have to PAY for this beautiful monster?
It's big (2 slots), it probably runs VERY VERY hot, takes two power connectors... but it seems to trump EVERYTHING else so far, and not by small amounts!
FIX THE TYPO (Score:2)
More info, pics, and a different view (Score:4, Informative)
here. [mbnet.fi]
those benchmarks don't look too impressive to me, and the hugeass heatsink/fan combo is still there! not to mention that it requires *two* molexes?
Nvidia is really starting to fall behind...
Re:More info, pics, and a different view (Score:3, Informative)
Re:More info, pics, and a different view (Score:4, Insightful)
FX 6200? (Score:3, Insightful)
Re:FX 6200? (Score:4, Insightful)
The current generation's low-end cards (as well as the last gen or two) aren't really worth the money if you want to do anything more complex than 3d screensavers. The FX5200 is a dog that isn't really any faster than the GF4mx was & isn't really worth using in DX9. The Radeon 9200 is actually slower than the 9100 & 9000. Eventually, the FX 6x00 core will be adapted to a chip that sucks just as much & you'd probably be better off getting a high to mid range card from the previous generation.
Intel makes a 3.4GHz P4EE and AMD has the Athlon64 FX-53, both of which are $800+ CPUs. You don't see (many) people complaining about the top-of-the-line chips there being over twice the price of the chip 2 steps down ($275 should get you an Athlon64 3200+ or a P4 3.2GHz), yet when a new graphics card comes out at $500, everyone lines up to talk shit.
Yet there's always going to be something in the $100-150 range (what's considered reasonable mid-range for serious gaming) that's worth buying (barring some sort of hyper-inflation); you don't always have to have the latest & greatest thing on the market. Game manufacturers realize this and target their games to be playable on $50 cards, ideal for $100-150 cards and able to take advantage of the $500 cards.
No more Quake benchmarks?! (Score:3, Insightful)
It's still the only game that can push the hardware to its limits reliably. All those other games tend to have bottlenecks that are algorithm/code related rather than hardware related (like the scripting engine in UT).
Too bad, I found Quake3 to be one of the most accurate because it ran at such a low level and could really push the hardware. It's not like those other games are using the hardware shaders yet anyway (or are they?).
Re:No more Quake benchmarks?! (Score:3, Insightful)
Yes, it is incredibly meaningful to see that card X can do 672 frames per second in Quake 3, and card Y can do 784 frames per second, even though your monitor can't show it that quickly or your eyes wouldn't see the difference if it could. When you can boast to your friends about num
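The sarcasm above has a concrete basis: frames rendered beyond the monitor's refresh rate simply never reach the screen. A minimal sketch, assuming a typical 60 Hz display:

```python
def displayed_fps(rendered_fps, refresh_hz=60):
    """Frames per second the monitor can actually show."""
    return min(rendered_fps, refresh_hz)

# Both Quake 3 scores collapse to the same visible framerate:
print(displayed_fps(672), displayed_fps(784))  # -> 60 60
```

Which is why a 672-vs-784 fps gap is bragging-rights material and nothing more.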
Re:No more Quake benchmarks?! (Score:4, Interesting)
They are -- FarCry is probably the most intensive game out there right now, fully utilizing DX9 specs. Halo is no slouch either, although a lot of its speed issues are from wanting to use hardware that simply isn't present (on PCs -- it is on the Xbox; why they didn't port away from this is beyond me).
Aquanox 2, Tomb Raider: Angel of Darkness, Painkiller, UT2k4, BF: Vietnam, and several others utilize DX9 to varying lengths as well. And there's the upcoming games -- Half Life 2, STALKER, Soldner (with an umlaut on the o), World of Warcraft, Everquest 2, and numerous others.
Quake 3 simply isn't a reliable benchmark anymore. It utterly fails to exercise the newer features of the cards -- which are really the only features to bother upgrading for. If all you're going to do is play Q3-era games then a GeForce2 is more than sufficient. If you want to run games already out, and those coming out in the next year, with all the graphical options turned up and at high-res then you'll be best served by either the latest nVidia or (probably) ATI card.
And (most importantly to me, and many others) if you want to get a card that can run new games at reasonable resolutions with most of the graphical bells and whistles on, but at a reasonable price... well, those $400 cards are going to be sub-$200 very quickly now, and the $200 cards are going to drop to around $100.
DoomIII now ready to ship? (Score:5, Funny)
run at 30 fps on the new Nvidia 6800
Holy mother of crap (Score:5, Informative)
-Obscene performance boosts, on a scale I've never seen before
-fancy new effects
-massively improved image quality
-heatsink fan still pretty quiet
-basically free 4xFSAA and 8x ANISO
Weaker points of new Nvidia card:
-Expensive
-it seems that shader precision is still not as pretty as ATI's, though that may be fixed by game patches
-takes up 2 slots with the tall heatsink
-480W recommended PSU
-video processing engine isn't implemented in software yet
I don't really object to the power requirements. This thing is more complicated, bigger, and has more transistors than a P4 Extreme Edition. It consumes about 110W, of which 2/3 is the GPU die's power draw. It is certainly NOT unreasonable to require a big power supply with this thing. It seems as though ATI's solution will have a power supply recommendation as well. Simply put, if you're gonna improve performance by such a margin by means other than smaller manufacturing, you're going to increase power consumption. Get over it.
This thing isn't meant for SFF PCs or laptops, though I'm sure the architecture will be ported to a laptop chip eventually. As for the 2-slot size, well...It consumes 110W! To put this in perspective, it consumes more than any non-overclocked desktop CPU today! Think of how big your Athlon64/P4EE heatsink/fan is, then you'll realise that 2 slots aren't really that big of a problem.
My own personal reason for wanting this thing: It can play any current game at 1600x1200 with 4XFSAA and 8x anistropic filtering at a good framerate, and is the only card that can claim to do this right now
Any word on X support? (Score:2)
Biased Article ? (Score:2)
"What is also noteworthy is the fact that the performance of the Radeon 9800 XT drops at higher resolutions whereas that of the GeForce 6800 Ultra stays at about the same level."
Wouldn't that mean that the limiting factor for fps is NOT the card but some other thing (processor, memory bandwidth) ?
I mean i know they used this hardware for the test:
"The system we used consists of a Pentium 4 3.2GHz EE processor, EpoX 4PCA3+ i875P chipset motherboard, 1GB of Crucial DDR400 memory and two W
I wish .... (Score:5, Insightful)
I wish that people that pretend to be computer experts would do the teeniest bit of research.
How about this gem: First introduced in 1995, Microsoft's DirectX application programming interface (API) was designed to make life easier for developers by providing a standard platform for Windows-based PCs. Before the arrival of DirectX, developers had to program their software titles to take advantage of features found in individual hardware components. With the wealth of devices on the market, this could become a tedious, time-consuming process.
I'm glad he cleared that up for us. Because this little known company called SGI [sgi.com] didn't develop OpenGL [opengl.org] back in 1992 [sgi.com]. In fact, were it not for MS, we would still be in the computer graphics dark ages.
I'm not trying to troll here. I am just pissed that people pretend to be experts when they don't have a clue what they are talking about.
Re:I wish .... (Score:2)
My blinders must be stuck (Score:5, Informative)
"All he said was that Microsoft provided a platform for Windows."
What he said:
"Before the arrival of DirectX, developers had to program their software titles to take advantage of features found in individual hardware components."
He didn't just say that Microsoft provided a platform for Windows, he said that before Microsoft provided their platform, developers had to write directly to the graphics drivers. This is untrue: although some programmers did write directly to hardware-specific interfaces like 3dfx's glide, they didn't have to. The availability of OpenGL for Windows predates DirectX, and the availability of OpenGL in general (remember, he said "developers", not "Windows developers") predates DirectX by years.
For a quick reference, check out this Byte article [byte.com], which discusses both the already existing OpenGL, "available on Unix, Windows NT and 95, and the Mac", and the soon-to-be-released Direct3D, "scheduled to ship in the second quarter".
Cue generic video card responses (Score:2)
-"Who needs this? My Voodoo 3 runs Q3A just fine!"
-"Does it have Linux support?"
-"nVidia pwnz ATi!!11one!111~"
Now that that's over with...
I agree with a lot of the comments here: I really dislike nVidia's tendency towards massive, bulky, noisy, power-hog GPUs. And while the 6800's performance is nothing short of jaw-dropping, I'll bet ATi's solution will be far more elegant: smaller, with lower power requirements and less noise.
Either way, though, this is good for consumer
I sense a change in the force..... (Score:5, Insightful)
There's a very limited number of gamers that will buy this card - you literally have to build a whole new PC around it considering the power requirements and the slot hoggishness. I won't be buying one. My 9500 Pro OC'ed to 300/300 with a 3000+ AMD *STILL* plays anything without problems (at least any I can see).
Even if ATi does come out with a card that beats it, I won't be buying one of those either. Gaming is only *part* of what I use computers for. These days at age 40 I can't compete with the twitchy youngsters anyways :D
I care a lot more these days about how well my data is protected and how good the whole experience is, not how many fps I get in some game.
Re:I sense a change in the force..... (Score:4, Insightful)
90% of the performance of this beastie but has lower power requirements ( 1 molex or none )
I very much doubt Ati's new card won't need any additional power. Don't forget Ati's 9700 Pro was the first card to require more power than the AGP slot provided.
PCI slots aren't as useful as they used to be. So much is on board now so PCI cards aren't needed. Take the Asus A7N8X for instance, it has two network connectors on board as well as sound comparable to a high quality PCI sound card. And don't forget the slot you lose is the first one, which shares an IRQ with the AGP slot so it isn't a good idea to use it in any case.
Re:I sense a change in the force..... (Score:3, Informative)
ATI needs extra power too [theinquirer.net]
What are the recommended power supply brands? (Score:3, Insightful)
my next computer (Score:3, Interesting)
Crazy.
I bet in a few generations more, home PCs will have fans so big, you'll be able to drive them around the house and mow the lawn, too!
"I'm givin' her all she's got, cap'n!" (Score:5, Funny)
When in doubt, mod +1 funny and pray
Power supply issues (Score:4, Informative)
16 pipelines. (Score:3, Interesting)
They made this haul ass by doubling the number of pipes, but the first thing they are going to do when they put out a mid-range card is to halve, or quarter the number of pipes. How much has been done to refine this card, and how much impact will the new design have for those of us with $150 to spend on a video card?
The Nvidia Lan in San Francisco (Score:3, Informative)
-Shader 3.0 compatible (Farcry had a demo at the show of a patch they have coming out that will upgrade the game to Shader 3.0. It's by far the biggest improvement in a game I've ever seen, and I actually got to play it).
-14983 3DMARK SCORE! If you know anything about 3dmark, you'd scream in joy at that one.
-Other game companies were there like Everquest2, Lord of the Rings: Battle for Middle Earth and of course, the new nvidia chick Nula with per-pixel lit hair that has 2 million vectors rendered in real time...
All I have to say is wow.
(But wait for PCI express before you buy one)
PCI-Express (Score:3, Insightful)
What's the point if no one can afford it? (Score:3, Insightful)
Something in that will have to be redesigned before people will consider buying it.
While some hardcore gamers wouldn't mind throwing that kind of cash at a vid card right now, most people won't. Of course, these cards are intended for general consumers once they get about a year old or into the $100-$299 price range, but the 480 watt power supply is like $20 extra per month on your electric bill if you're using it a lot!
That'll be quite a shocker when people figure out that their brand new video card is spiking their elec. bill.
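Whether the bill really spikes depends on actual draw and hours of use, not the PSU's 480 W rating; a hedged sketch (all figures here are illustrative assumptions, at $0.10/kWh):

```python
def monthly_bill_delta(extra_watts, hours_per_day, rate_per_kwh=0.10):
    """Added monthly electricity cost from extra draw while the PC is on."""
    return extra_watts / 1000 * hours_per_day * 30 * rate_per_kwh

# The ~$20/month figure needs the full 480 W rated draw, ~14 hours a day:
print(round(monthly_bill_delta(480, 14), 2))  # -> 20.16
# The card's own ~110 W under load, 8 hours a day, is far tamer:
print(round(monthly_bill_delta(110, 8), 2))   # -> 2.64
```

So the shock is real only for marathon sessions on a system pulling its full rated load.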
Thoughts (Score:3, Interesting)
I am genuinely happy that Nvidia have released a product that can perform 'significantly' better than their currently available flagship card. As ATi are going to retaliate with their own card, this can only be a good thing and I hope they do actually keep this large performance jump up for the next generation(s).
One thing to note in some benchmarks which I've seen so far is that some of the results give the maximum framerate of a game. I'd be happier reading either an average or minimum framerate achievable, as in a frenetic multiplayer game you are usually going to be rendering a lot more stuff than in single player. The minimum framerate is what I'll be watching out for, as that is where the most frustration will come from - nothing quite so annoying as experiencing slowdown when something critical happens, or if you are in the middle of a hellishly large battle (which happens quite a bit in UT2004 Onslaught, for example).
Unfortunately I won't be able to use this card in my Shuttle. The card is too big and too power-hungry. As someone else says, noise isn't exactly a problem, as you'd generally use this card to play fancy loud games anyway.
And recommending a 480w power supply? Hmm. Oh well, wish I was a hardware site journalist under NDA, I'd have had time to buy some shares in Enermax
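The average-vs-minimum framerate point can be illustrated with a toy calculation (the frame times are made up):

```python
def avg_and_min_fps(frame_times_ms):
    """Average and minimum framerate from a list of per-frame render times."""
    fps = [1000 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Mostly smooth frames, with one big-battle spike at the end:
avg, worst = avg_and_min_fps([10, 10, 10, 10, 50])
print(avg, worst)  # -> 84.0 20.0
```

A healthy-looking 84 fps average hides a 20 fps worst case, which is exactly the stutter the poster is worried about.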
To all clueless fanboys taunting the power req. (Score:4, Informative)
Comparison 1 [tomshardware.com]
Comparison 2 [tomshardware.com]
2. The ATI Radeon X800s will require two power rails also. So stop dreaming about a "power efficient" part and buy a new PSU.
ATI needs extra power too [theinquirer.net]
That said, I'm no fanboy of nVidia or ATI though. The new GF 6800U is still occupying one extra PCI slot and blowing a whole lot of hot air inside the case. Imagine someone put another 100W+ Prescott next to it. I just feel uncomfortable for a GFX card to dissipate so much heat right next to the CPU. But well... ATI is gonna do that too (except for the two-slot thing).
If there's any reason I'd look forward towards the X800s, I hope they won't require two slots - that is just inelegant. But based on the two molex connectors on the X800s, and the power consumption of their older parts, I won't hold any hope that ATI would "save power".
Re:So... (Score:2)
Re:Useless waste of processing power (Score:3, Informative)
Computer-generated frames per second is a completely different thing than film frames per second. Most of your DVDs are about 24 (23.976) frames per second and you can view even the biggest action scenes with no issues.
Try playing even quake2 multiplayer at 30fps and you will get a headache. It might be okay single player because there is much less action going on. But once you have 50-100 entities fl
Re:Useless waste of processing power (Score:3, Informative)
Yes, you do need pretty graphics to kill people. If you are shooting at a guy that is across a field and you're in 640x480 with ultra minimum detail, all you will see is a block if you're lucky, a little speck that looks like a rock if you are unlucky.
Proceed to bump the resolution up to 1280x10
Re:Useless waste of processing power (Score:3, Informative)
The reason games look better at an average fps of 100 is that they can actually fully calculate and display the scene as it was meant to look, and can handle the complexities of the scene while keeping the framerate at an average of 100fps instead of periodically dropping below 30 and making the game run like shit.
Games are getting more and more complex.
Re:But... (Score:3, Funny)
Re:linux k2.6 driver on dual 64-bit opteron suppor (Score:3, Informative)