NVIDIA GTX 295 Brings the Pain and Performance
Vigile writes "Dual-GPU graphics cards are all the rage and it was a pair of RV770 cores that AMD had to use in order to best the likes of NVIDIA's GeForce GTX 280. This time NVIDIA has the goal of taking back the performance crown and the GeForce GTX 295 essentially takes two of the GT200 GPUs used on the GTX 280, shrinks them from 65nm to 55nm, and puts them on a single card. The results are pretty impressive and the GTX 295 dominates in the benchmarks with a price tag of $499."
Holiday Shopping (Score:2, Insightful)
No benchmark against GTX 280? (Score:2)
Re: (Score:2)
Re: (Score:2)
Because this new card uses the 260's memory bus and memory effects dominate at higher resolutions with AA turned up? Because its core clock is the same as the 260? Yes, its cores have the same number of shaders as a 280, but if you run at 1920x1200 or above it's probably more like a pair of 260s in SLI than a pair of 280s - and there's no point in having a $500 card if you play on a crappy screen.
You must have completely missed what I was saying. I own a GTX 280 card, and I would like to see how my GTX 280 compares to the new GTX 295. Why would you benchmark a graphics card without comparing it to the card that was top-of-the-line for the last several months straight?
Re: (Score:3, Funny)
So, no hardware in your stockings this year!
You have no idea how wrong that sounds.
Re: (Score:3, Funny)
get your mind out of the gutter ...or Santa won't be sliding down your chimney this year.
ugh (Score:3, Insightful)
this is like the razor wars (double blade! triple blade! quad blade! pento blade!). With OpenCL and DirectGPU (or whatever MS is calling it this week), this should be good for anyone trying to build a cheap superGPU cluster.
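For anyone wondering what the "cheap superGPU cluster" angle looks like at the code level, here's a minimal sketch (assuming the pyopencl Python bindings are installed; the device names and counts are just whatever your machine reports) that inventories the OpenCL devices a cluster node would expose:

# Minimal sketch: list the OpenCL-capable devices on this box.
# Assumes pyopencl is installed; this is an inventory step, not a benchmark.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        print("%s | %s | %d compute units | %d MB global memory" % (
            platform.name,
            device.name,
            device.max_compute_units,
            device.global_mem_size // (1024 * 1024),
        ))

Real cluster work would go on to dispatch kernels across those devices; this just shows the enumeration OpenCL gives you for free.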
Re:ugh (Score:5, Funny)
this is like the razor wars (double blade! triple blade! quad blade! pento blade!).
Clearly you don't value a close shave.
Re:ugh (Score:4, Funny)
Re:ugh (Score:5, Funny)
I like A Close Shave. But I still think The Wrong Trousers is the best of the films with Wallace & Gromit.
Re: (Score:3, Interesting)
Re: (Score:2)
Re:ugh (Score:5, Informative)
Somehow most previews don't even mention power consumption.
Had you RTFA properly, you'd have seen folks mention that the card is not yet officially out and that nVidia asked them to withhold further details, as the BIOS might still get tweaked.
By that logic everything which does not start to burn is power efficient...
This is not an absolute metric (or is it "yardic" in the US?). I presume they compare it to the 4870, on which the infamous GDDR5 alone - even when idle - draws a whopping 40W [techpowerup.com]. The 4870x2 already has a tweaked factory BIOS, and yet with twice as much GDDR5 it still consumes the same 40W. Yes - the RAM alone consumes 40W.
Re: (Score:2)
Well for one, this is a "preview", which usually means NVidia sent a bunch of propaganda for them to disseminate to the masses. These things come with NDAs and a variety of other restrictions, in exchange for getting the "scoop" on the new product, and raking in a few extra yen from the ad networks.
You will get a real review once the product hits the shelves, and a real person performs real tests. Anything else should be taken with a grain of salt, as most "review" sites exist to make money first and fore
Re: (Score:2, Informative)
Impressive Card (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
And what exactly is wrong with 1024x768x24bpp?
(No, seriously. I used that very setup from 1995 to 2001. Then I got a PCI Voodoo4 cheap because 3dfx had just gone bust, and then got drunk one night and bought a 19" CRT on an auction site, and discovered the joy of 1600x1200.)
Re: (Score:2)
Prob. in order 4x RV770 from ati ? (Score:2)
Re:Prob. in order 4x RV770 from ati ? (Score:5, Funny)
All I know is that my graphics box (I call it a graphical) houses a nice little motherboard with a cute Intel chip, some hard drives, and I think I even have a sound card plugged into it.
I remember when the graphics "card" was simply part of the computer -- these days, all the other components are part of my graphical.
It's great that there's a market for this stuff... (Score:5, Insightful)
It's great that there's money for this stuff... (Score:3, Informative)
It's not just about the graphics. GPUs are being called upon to do much more, from AI to physics to Folding@home - even encoding and decoding audio and video formats.
Re: (Score:2)
As a programmer who does a great deal of data crunching, I sincerely hope that Intel's 80+ core CPU comes along quickly to crush the silliness out of people who are trying to find applications for GPU "cores."
Re: (Score:2)
In order to do that, Intel would have to adopt some of the architecture of a GPU - hence becoming the very thing you dislike. Right now GPUs are here and being used. Your fictional core isn't, and with present limitations it most likely will not be showing up for some time.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Take a look at the latest issue of Computer - there's an interesting (if technically fairly light) look at the performance ratios of various multicore designs. The present "CPU surrounded by lots of simple processing cores" design, as implemented by a CPU + GPU combination, turns out to have the best performance/space and performance/watt ratios.
Sure, there are some situations where you can do massive parallelization but each individual thread needs the flexibility of a full CPU, but there are at least as many, i
Re: (Score:2)
Re: (Score:2)
If Intel brings out their 80-core proc, I will be one of the first to compare it to a GPU to see which I should continue to use; until then I will use my GPU for real performance gains.
Re: (Score:2)
You can claw your way up to 16 cores today if you work with 4 socket motherboards...
The data I work with isn't typically "fully massaged", there are a lot of sparse areas to consider quickly to identify areas that need more attention.
I still find cases where the algorithms can be sped up more than 10x by eliminating un-necessary work - this might not happen in more mature fields, but especially in cases where the new programmer has implemented something, there's usually more speed to be gained in a good c
Re: (Score:2)
Yeah, and that general-purpose CPU will also consume about 80x the power needed by a vector processor that is twice as fast at 2D/3D rendering, scientific modeling, financial calculations, video transcoding, data compression, and all the other tasks that GPGPU/stream processors are used for.
A general-purpose scalar CPU will never replace specialized vector co-processors because it's the wrong tool for the job. Most applications that vector coprocessors are used for involve processing very large d
Re: (Score:3, Insightful)
You are also kind of donating the hardware, which is a much bigger cost than the power. $10 worth of electricity will do more of these calcs than a $10 donation would enable.
Re: (Score:2)
You are also kind of donating the hardware, which is a much bigger cost than the power. $10 worth of electricity will do more of these calcs than a $10 donation would enable.
Except that donating $10 cash, being tax deductible, costs you less than $10 of electricity, which isn't. So you could donate, say, $14 cash and have it cost the same as the $10 of electricity.
Further, that's not $10/year. For a high-end GPU running Folding@home, you are easily pushing 300 watts. In most places, doing that 24x7 will run you upwards of $25
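Rough back-of-the-envelope math on that (the 300 W figure comes from the parent comment; the $0.12/kWh rate is just an assumed illustrative price - yours will differ):

# Back-of-the-envelope cost of folding 24x7 on a ~300 W GPU.
# The wattage and the $0.12/kWh rate are assumptions for illustration.
watts = 300
hours_per_year = 24 * 365                         # 8760 hours
kwh_per_year = watts * hours_per_year / 1000.0    # 2628 kWh
rate = 0.12                                       # dollars per kWh (varies by region)
print("~$%.0f per year" % (kwh_per_year * rate))  # roughly $315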
Re: (Score:2)
It makes the integrated graphics in Eee-class PCs that much better when the tech trickles down 5 years later.
This is also good news right now for the "sweet spot" gaming PC builders. Each time these new bleeding-edge-$500-200W-XXTREME cards come out, the previous two generations of graphics cards tend to suddenly drop drastically in price.
When I built my current middle-of-the-road gaming computer, I put in an ATI HD3850 for $150, with the expectation of adding a second in Crossfire once the price drop occurred. Looking at Newegg.com, the 3850s have hit ~$55.00 this week. My computer looks to be due for an
Drivers drivers drivers (Score:5, Interesting)
For me personally, I couldn't care less how great the card hardware is if the drivers suck. NVIDIA, fix your linux drivers please. Next time I'll give a much harder look at amd.
Re: (Score:2)
Next time I'll give a much harder look at amd.
I'll save you 5 minutes of research....stick with nVidia.
But in all seriousness, I agree with your point. It seems like their Linux drivers have taken a shit compared to previous releases. Personally, I have a lot of artifacting issues in KDE4 that, from what I've read, are related directly to nVidia's drivers.
Re: (Score:2)
But in all seriousness, I agree with your point. It seems like their Linux drivers have taken a shit compared to previous releases. Personally, I have a lot of artifacting issues in KDE4 that, from what I've read, are related directly to nVidia's drivers.
Performance issues, by any chance? I've been baffled why my KDE 4 performance is so terrible compared to my Gnome performance, and I have a reasonable nVidia notebook chip (Quadro NVS 140M).
A lot of the forums have similar complaints, but most people seem to indicate that their problems went away with the 177.80 drivers, which I have installed.
I was hoping the forthcoming nVidia driver would help, but from how people are talking, I've got to wonder if it's even wise to install it when it's released.
Re: (Score:2)
Bad advice - the new AMD cards run fine under Linux. Since the Radeon 9600 I've never had any problems getting ATI/AMD cards to run under Linux.
Re: (Score:2)
I've still been buying nVidia for my Linux boxes. Is ATI finally a better choice?
Re: (Score:2)
I don't know about ATI, but NVidia is a _terrible_ choice! I also used to buy NVidia for all my Linux boxes and recommend it to everybody. I was wrong. The problem is that the free NVidia drivers are extremely slow. They are actually slower than using the generic VESA driver with Intel graphics.
I don't know why - I suspect it has something to do with reading from display memory - but it is a fact. I have a relatively fast quadcore machine, and yet when I am using the free NV drivers, it is unusable for Inte
Re: (Score:2)
Generally what I've done is prefer Intel for my work laptops, because I wanted the good quality of the Intel open source drivers.
But on my home computers, which I sometimes use for gaming in Linux and/or Windows, it's really a choice between nVidia and ATI. I don't care much whether I use the open or closed source drivers on those cards, as long as they work well.
The problem for now seems to be that at least in KDE 4, nVidia's closed drivers aren't a good match for the implementation of KDE 4, but ATI is
Re: (Score:2)
I'm sorry... what? I don't see how that's a reasonable choice at all.
fix (Score:4, Insightful)
NVIDIA, fix your linux drivers please.
NVIDIA, open your linux drivers please.
Re: (Score:2)
So by your argument, nVidia is still working on the drivers... still paying their devs the same... but they can accept tweaks from the community that improve them for free. I don't see the issue here.
Re: (Score:2)
Next time I'll give a much harder look at amd.
Good advice - don't. Maybe when their open source driver matures, but the fglrx driver has a lot of really bad issues; take a look at the Phoronix forums whenever a new AMD driver rolls around. Personally I run KDE3 on an 8800 GTS and all the 3D acceleration and whatnot is working just fine for me. No tearing in video (a problem that seems to plague AMD users endlessly), but then I use it in a desktop machine.
Re: (Score:2)
So... you're recommending against ATI without having actually used it?
I'm currently using Nvidia (on price), but I was on an ATI card 6 months ago and it worked fine. The Nvidia drivers were slightly better when I switched, but only very slightly.
Re: (Score:2)
Microstutter (Score:3, Interesting)
I wonder if this card will suffer from microstutter. The 9800GX2 benchmarked very well but real world performance was lacking because the card essentially varied between very high fps and very low fps, so it still lagged even though it got decent average fps.
With these dual cards it's best to look at their low fps rating. An average fps is often misleading.
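To make the "average fps is misleading" point concrete, here's a toy sketch (the 5 ms / 45 ms frame times are made up to mimic AFR microstutter, not measured from the 9800GX2):

# Toy microstutter example: two GPUs alternating fast/slow frames.
# 5 ms and 45 ms frame times average out to 40 fps, but the pacing
# the player actually feels is set by the slow frames (~22 fps).
frame_times_ms = [5, 45] * 30                     # ~one second of alternating frames

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_case_fps = 1000.0 / max(frame_times_ms)

print("average fps: %.0f" % avg_fps)              # 40
print("effective fps: %.0f" % worst_case_fps)     # ~22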
Re: (Score:2)
The 512MB had a bit of a lag, but the 1GB did not.
This is all so 1998 (Score:2, Insightful)
Ten years ago the video card wars were in full swing. Each generation brought amazing new advances and impressive technology.
But nVidia and ATI haven't realized that we passed the point of diminishing returns years ago. Mobility and battery life are what matter. And I know there are hardcore PC gamers out there, but there are only a handful of companies even bothering to push the high-end graphics, so you buy a $500 video card and there are exactly ZERO games that take full advantage of it. Wait a year or
Re:This is all so 1998 (Score:5, Informative)
Re: (Score:2)
I'd say let the early adopters go for it. It is usually the early adopters that help pay for the lion's share of the development anyway. A year from now, the same performance will be had for less than half, and there should be several games that can play it.
Re: (Score:2)
No, the render farms do not need this level of graphics either... they render off-screen at seriously low frame rates (frames per minute, not per second); they go for quality, not speed.
They do use GPUs to render, but they are using them as fast general-purpose processors (in a cluster), not as GPUs at all...
The only people using high-end graphics cards are gamers and a very few graphic artists...
Re: (Score:2)
I also expect in the near future to see accelerated CAE/FEA built into new CAD packages that utilize the power of the GPU for processing.
Re: (Score:2)
Re: (Score:2)
I agree with you, but I also think that raster 3D is hitting a downward slope in "realistically programmable" feature sets, and hopefully ray tracing or hybrid rendering will start to pick up in its place. I actually think keeping the bleeding edge market going is a good thing toward both obtainable real-time tracing and lower power consumption. Even today, I think nVidia/ATI have to reduce energy costs to go faster.
Re: (Score:2)
ATI and nVidia know all about the mobile, embedded and low power markets. That's where they make most of their money. You only see the product announcements for the latest and greatest, super fast, gamer wet dream cards because those are the only ones that make the news.
ATI and nVidia have to do R&D, develop faster cards even if they are impractical for their target market. Then, once those features have been tried and tested, they get put into regular, production cards. Just like Formula 1, they've
Re: (Score:2)
This is blatantly false. What a decent ($200+) graphics card buys you today is being able to play games at decent resolution. To play today's games at
Re: (Score:2)
But nVidia and ATI haven't realized that we passed the point of diminishing returns years ago. Mobility and battery life are what matter.
In the context of a card designed for desktop PCs, specifically for people who play games on gigantic monitors? You can't be serious.
And I know there are hardcore PC gamers out there, but there are only a handful of companies even bothering to push the high-end graphics, so you buy a $500 video card and there are exactly ZERO games that take full advantage of it.
Most modern games can use the full capabilities of this card. It's designed for people who want to play the latest games at extremely high resolutions with maximum quality settings. That takes an unholy amount of processing power. It's really the only use for a card like this. Most people won't buy it.
And as a bonus, you get SIGNIFICANTLY increased power consumption, and the video card addicts are just wasting resources so they can all whack-off to Shader 30.0 soft shadows on eyelashes.
Nobody who buys a dual-GPU card gives a singular shit about their power
Point: Missed (Score:4, Interesting)
For the interested, there's a great article at anandtech [anandtech.com] talking about how the RV770 came to be pretty awesome... Really, though, it's not a super-high-end part.
Re: (Score:2)
Re: (Score:2)
[citation needed]
The new Nvidia is debuing at $499
Cheapest competing AMD (Radeon HD 4870 X2 2GB) card I could find was $480
Re: (Score:2)
Doh! That's supposed to be debuting.
Re: (Score:2)
Taking back the performance crown? (Score:4, Informative)
Nvidia never lost the performance crown. AMD did not even bother to compete with Nvidia for performance at the high end.
Read this excellent article [anandtech.com].
What AMD did with the RV770 series was to totally pwn everything below the super high end.
When the 4870 was released at $299, it was generally worse than the GTX 280, but it easily beat the GTX 260, which was priced at $399.
When the 4850 was released at $199, it easily matched the 9800GTX, which was priced at $249.
Re:Taking back the performance crown? (Score:5, Insightful)
No, the funny thing here is that AMD *did* have the performance crown, even though they had planned to give it up. Before the GTX 295, the Radeon 4870x2 was the top of the pile for single-card graphics.
The Real Question (Score:2, Interesting)
Personally I'm betting on the former being far more likely than the latter.
-Nvidia-? All the -Rage-? (Score:2)
You guys have got it all wrong... ATI cards are all the rage... Or, I guess they were until the Radeon...
Re: (Score:3, Informative)
Re: (Score:3, Interesting)
Re: (Score:2, Informative)
Re: (Score:2, Interesting)
Now, which one (given those numbers) would you expect to look better?
Which one actually looks (a lot) better?
There's your problem.
Also it has a spectacular memory leak that sees it using up all available physical memory after a while, grinding to a halt and refusing to load any new textures, which is actually pretty funny the first time. Driving around in a flying car avoiding invisible buildings
Re: (Score:2)
Not quite. Physics calculations are very well suited to vector processing, which is why most Physics Processing Units are vector processors [wikipedia.org], just like GPUs, array processors, DSPs, stream processors, etc.
A GPU may be more specialized to handle 2D/3D graphics rendering, but it can also perform physics calculations quite well because of its SIMD architecture. GPGPU stream processors in particular are of great interest to the scientific community because of their superb performance in scientific modeling a
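As a toy illustration of why physics maps so well onto SIMD hardware, here's a sketch using numpy on the CPU as a stand-in for a vector unit (the particle count, timestep, and gravity constant are arbitrary): the same Euler update is applied to every particle in one data-parallel operation instead of a per-particle loop.

# Toy data-parallel physics step: one Euler update applied to every
# particle at once - exactly the SIMD-friendly pattern that GPUs and
# other vector processors are built for.
import numpy as np

n = 1_000_000                                        # arbitrary particle count
pos = np.random.rand(n, 3).astype(np.float32)        # positions
vel = np.zeros((n, 3), dtype=np.float32)             # velocities
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)
dt = 1.0 / 60.0                                      # one 60 Hz frame

vel += gravity * dt                                  # one vector op over all particles
pos += vel * dt                                      # ditto - no per-particle loop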
Re: (Score:2)
They do, ever since NVidia implemented the PhysX API in their GPU drivers (for GeForce 8 and up).
On my PC, the PhysX control panel lets me choose between the GeForce or an AGEIA card. They rolled this out a few months ago, works on XP and Vista as far as I know.
Screenshot [ozone3d.net]
Re: (Score:2)
I don't know very many people who play the latest games in 1024x768 anymore. Heck, I was playing Q3 that high in late '99. Today it's all about 1680x1050, 1920x1200... maybe a bunch of peeps on 1280x1024 laptops or CRTs.
Maybe I should try running Crysis in 320x240 for shits and giggles.
Re: (Score:2)
According to the Steam Hardware Survey [steampowered.com], last updated October this year, about a quarter of its user base have a primary resolution of 1024 x 768. Surely there are some that set certain games to higher resolutions, but I would expect the percentage of those people in the 1024 x 768 bracket to be similar to the percentages in the other brackets. Except those at 1920 x 1200, naturally.
Re:Really, though. (Score:5, Informative)
GTA4 is processor dependent, not GPU dependent, cos it's the crappiest console port we've seen in years.
Re: (Score:2)
Try this... (Score:2)
It's been recommended on several OC forums to disable clip recording in-game - apparently this offers a substantial performance increase... YMMV, but it's free to try. Unlike this new NVIDIA beast.
Re:Really, though. (Score:5, Interesting)
Please cite that.
I'm running L4D on my (very) old computer, a 1.6 GHz AMD single core with a 7600 GS and 1.5GB RAM. The game runs fine at medium settings (despite the fact that I am way, way under the minimum system requirements), and when I briefly swapped out the 7600 for a 7900, I was able to turn a few of the settings from medium to high (1024x768, textures low, medium effects -> 1280x1024, textures medium, high effects) and still get a stable 20-25 average frame rate.
Not quality benchmarks, I know, but the engine hasn't changed drastically since HL2 except for graphical improvements (=GPU limited), so claims about it being CPU limited haven't been true since the first public version of the Source engine, and that's assuming they were even true back in 2004.
Re: (Score:2)
Re: (Score:3, Insightful)
People need to understand that sometimes, those detail sliders aren't meant to be cranked all the way up on current hardware. They're there to "future-proof" the game, so that it can still look pretty 2 or 5 years down the road. Wing Commander 4 did it for example.
Of course, it's not a huge incentive for developers to future-proof a game when all they get for it is a forum-bashing like "omg the game sux i can't get 50 fps on my 1337 rig".
Re: (Score:2)
Actually, I don't think Rockstar 'get' the PC platform anymore. They actively prevented people from changing the game files so mods are impossible without hacks. They put mechanisms in place to prevent trainers from working so you can't have
Re:i hate fans (Score:5, Insightful)
Easy--they're deaf. After years of working on building (near) silent PCs, I've learned that what many people/reviewers consider to be 'quiet' is nowhere near my definition of 'quiet'. I'm not quite sure how loud some gamers have their sound systems turned up, or if they play with the window open or what, but I simply can't trust a review on newegg or most websites when someone says a piece of equipment is 'silent'. There are a few websites like silentpcreview.com that do a good job, but if a piece of equipment isn't reviewed there or in the forums, you're SOL (or you get to be the guinea pig).
Re: (Score:2)
Or maybe you have the "low-fat chips" phenomenon. Yes, it's somewhat less fat than regular chips, but they're still chips, not celery roots, and nowhere near healthy. A CrossFire/SLI setup paired with a high-end CPU, maybe overclocked too, has only two options: loud and really loud. Wattage and dB ratings are a much better bet, and "fanless" is the magic word.
Re: (Score:2)
Or you go for relatively high flow (I use a squirrel cage blower in the PC for exhaust), push the exhaust air through a muffler, and Dynamat the inside of the PC for noise abatement. Works a treat. All intake is passive - no fans; air comes in via the PSU and a vent in front of the HDDs. The air is then routed through the case and over components via ductwork made from plexi and terminates at the blower intake. Overall the machine runs at a fairly substantial negative pressure, and very quiet.
-nB
Re: (Score:2)
I've yet to find a squirrel cage blower that does not make an absurd racket at all speeds. Do tell where you find yours, as there are many applications where such a blower would be far more appropriate than conventional fans.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
An equal amount spent on the audio system.
Reminds me of an old joke, "I spent $500 to get my muffler fixed on my car, but now I can hear all these other things that are wrong and it's going to cost me another $500 to get louder speakers."
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Don't the heatsink-only ones rely on better-than-average airflow through the case, which means more and/or stronger case fans?
Re: (Score:2)
And really, it's not that common. AMD CPU: higher number is faster. AMD GPU: Higher number is faster. Intel CPU: Higher number is faster. It's just Nvidia with their anal marketing.
Re: (Score:2)