New DX10 Benchmarks Do More Bad than Good
NIMBY writes "An interesting editorial over at PC Perspective looks at the changing relationship between modern game developers and companies like AMD and NVIDIA, which depend on their work to show off their products. Recently, both AMD and NVIDIA separately helped release DX10 benchmarks based on upcoming games that show the other hardware vendor in a negative light. But what went on behind the scenes? Can any collaboration these companies produce actually be trusted by reviewers and the public as the basis for a purchasing decision? The author thinks the one path to resolution is to have honest game developers take a stand for the gamer."
John Carmack (Score:5, Interesting)
Re:John Carmack (Score:5, Insightful)
Re: (Score:2)
Re:John Carmack (Score:5, Interesting)
"JC: DX9 has its act together well. I like the version of DirectX on the 360. Microsoft is doing well with DX10 on tightening the specs and the exactness."
Of course, he still calls it like it is:
"The new features are not exactly well-thought-out. Most developers are pretty happy with DX9. The changes with DX10 aren't as radical. It's not like getting pixel shaders for the first time. Single-pass shaders are nice with DX10, but it's a smaller change. "
Re: (Score:1)
But really. Those two quotes seem to contradict each other.
"Microsoft is doing well with DX10 on tightening the specs and the exactness."
Then:
"The new features are not exactly well-thought-out."
Tell me I'm not the only one that noticed that?
The first thing I thought when I read that first quote was the G4 commercials for Collins College... "We just need to tighten up the graphics..."
Re:John Carmack (Score:4, Informative)
It's like "they reduced the needs for annual service on the car by 10%, but didn't add a turbocharger".
Re:John Carmack (Score:4, Funny)
I thing, you thing, we all thing (Score:3, Funny)
Re:I thing, you thing, we all thing (Score:5, Funny)
The answer (Score:4, Insightful)
No. There is some room for an "Unless..." argument, but frankly, "reviews" like this are so biased that no sane person should knowingly take them into account while evaluating a purchase. Unless (hah!) it's as a strike against the companies doing it. But you're screwed on both sides, there, so...
honest game developers (Score:1, Funny)
Yeah, because you'd never hear a hack developer blame all the problems on the hardware, right?
As an avid slash reader and linux user (Score:2)
DX10 (Score:5, Funny)
Re: (Score:1, Interesting)
Re: (Score:2)
Hey, how much is that in furlongs/fortnight?
quit already with 'optimized' drivers (Score:5, Insightful)
Stop fucking around and do it right the first time.
How hard is that?
Re: (Score:3, Interesting)
What happens when a better way than the "known standard" comes around? Are we supposed to wait for some updated standard, and then updated hardware for that standard? By then, don't you think some part of that standard will be obsolete?
Tweaked drivers, in most cases, only provide marginal benefits that many users would hardly notice. Yes, there are some stark exceptions where a different driver can have substantial impact, but this is often the game developer's fault as much as the hardware developer.
Re:quit already with 'optimized' drivers (Score:4, Informative)
In this case, DX10 (well, strictly D3D10) *is* the standard you're talking about. Waiting for a better way would involve waiting for D3D11 or similar. That's not what the OP's talking about.
Yes, there are some stark exceptions where a different driver can have substantial impact, but this is often the game developer's fault as much as the hardware developer.
I don't know about games, but I remember NVidia being caught cheating at 3DMark a couple of years ago. They released a driver that deliberately cut corners when it detected that that benchmark was being run, massively improving the framerate.
That's not so easy with games, but quite often optimisations can be made when you code for a specific case that can't (or shouldn't, for performance reasons) be made when coding more generally. I wouldn't put it past either vendor to tweak their drivers for, say, Half Life 3 at the expense of other, less hyped titles.
Re: (Score:1)
Re: (Score:2)
I can't remember the last time I saw a graphics driver patch or fix for OS X.
Of course, when you can threaten to pull a vendor's entire line of video cards from potentially millions of new computers, they tend to jump when you say jump. (And I believe Jobs once did, over ATI leaking a new Mac product a few years back.)
Re: (Score:2, Interesting)
If Apple were to mandate absolute perfection, you'd see a lot fewer driver releases for OS X... because they'd require more QA time.
So, on the other hand, Windows users would be getting better-performing drivers more quickly, drivers that may have a hitch here and there in select titles, while OS X users would have inferior performance, all because Apple mandates perfection.
Personally
Re: (Score:2)
I can't remember the last time I saw a graphics driver patch or fix for OS X.
Re: (Score:1, Funny)
Re: (Score:2)
Re:quit already with 'optimized' drivers (Score:5, Interesting)
You've got to remember, these guys live and die by sales. They *have* to look good in the numbers because that's what sells their cards at the top end. At the low end, no consumer cares either way as price dominates, but like automobiles, people assume that the tech from the top end trickles down to their lowly mass-market video hardware in some fashion, so it ends up still being relevant, if less directly so.
Also, if you have looked at most of these benchmarks, the difference between best and 2nd best is usually quite small, on the order of a couple percent. The bragging rights of being able to claim you can run your game at 150fps while other plebeians can only run at 140fps are just that - bragging rights. There is no practical effect on gameplay until framerates drop below 30fps. And top-end graphics hardware these days is not the bottleneck at resolutions of 1280x1024 and below, so really, these guys are chasing numbers in the rarefied air of super-high-resolution monitors and games which use every trick in the book, which is an extremely small set of the games actually played.
But that is what sells. And in any case, the competition between ATI and nVidia is good even if those of us who 'know' see their number-chasing as pointless. Let them do their thing, and reward or punish them at the counter as you see fit.
Re: (Score:2)
Re: (Score:2)
Yet, at the same time, there are lots of people asking one of them to please support Linux so we can buy their cards.
Re:quit already with 'optimized' drivers (Score:4, Informative)
Driver manufacturers try to get the generic code paths as fast as they can, but they can always make the driver a little bit faster by applying some domain-specific knowledge. If they know that a particular game has a particular hot path, they can optimize that path. Maybe the optimization they use wouldn't make sense for the general case, but they know it will work in that particular case.
Sure, it would be nice to have a card that was great at everything, but there will always be a way to make it just a little faster for that one special case... and we're back to the current scenario.
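To put some flesh on that, here's a minimal C sketch (not real driver code) of what an app-specific fast path might look like; the profile table, the executable name and the path names are all hypothetical:

    /* Hypothetical sketch: a driver picking a hand-tuned path for one
     * known game while everything else gets the generic path. */
    #include <string.h>

    typedef enum { PATH_GENERIC, PATH_APP_SPECIFIC } shader_path_t;

    struct app_profile {
        const char   *exe_name;   /* executable the driver looks for */
        shader_path_t path;       /* path known to be fastest for it */
    };

    /* Per-application profiles shipped inside the driver (made up). */
    static const struct app_profile profiles[] = {
        { "somegame.exe", PATH_APP_SPECIFIC },
    };

    shader_path_t choose_path(const char *running_exe)
    {
        for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
            if (strcmp(running_exe, profiles[i].exe_name) == 0)
                return profiles[i].path;   /* domain-specific tweak */
        return PATH_GENERIC;               /* safe default for the rest */
    }

The downside is right there in the table: any title that isn't listed stays on the generic path, which is exactly what the grandparent is complaining about.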
Re: (Score:2)
Are there actually that many cases when an optimisation doesn't make sense?
Game X does a lot of operation Y, so we make operation Y faster, fine. But in what case would it be beneficial to make operation Y slower?
It's not like our high-end graphics cards are short of RAM to store code in,
Re: (Score:2)
Driving people to consoles (Score:4, Insightful)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re:quit already with 'optimized' drivers (Score:5, Insightful)
Every time you rewrite a piece of code to improve it, you may find a new way of doing something.
Also, as more and more programs that use the driver come out, the people who write the driver will gain a better understanding of how it is used. That will help them optimize a code path. It could be as simple as working out which branch tends to be taken more often and making that the default path.
The more something is used, the more performance you can get out of it.
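For the branch example, here is a rough C sketch of what "making the common branch the default path" can mean, using GCC/Clang's __builtin_expect; the function and the common/rare split are invented purely for illustration:

    /* Sketch only: tell the compiler which branch profiling says is
     * the common one, so it becomes the fall-through (default) path. */
    #define likely(x)   __builtin_expect(!!(x), 1)
    #define unlikely(x) __builtin_expect(!!(x), 0)

    int submit_command(int is_common_case)
    {
        if (likely(is_common_case)) {
            /* most calls land here, so keep this path straight-line */
            return 0;
        }
        /* rare path: extra validation / fallback work goes here */
        return -1;
    }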
Re: (Score:2)
The hardware didn't change, your level of understanding did.
If you "found a new way to do it", then the first time around you didn't know, accurately enough, how to tell the computer to do what you wanted.
Re: (Score:2)
HALs are designed so that the developers can ignore the potentially vast differences in underlying hardware.
Re: (Score:1)
In the 21st century, apparently that is so hard it is utterly impossible. We live in a world where nothing is ever "right". Well, I can tell you one thing: the driver teams probably did get things "right" the first time, in the sense that they probably cooked up drivers that adhered to published specs and made good use of the hardware available. Now it's perfectly normal to have small updates from time to time to incorporate refinements
Re: (Score:2)
+10
That's why I bought a Wii. Its graphics... suck. But the games are good - because the gameplay is good.
The Wii is my response to all the crap-driver crap (or cock-size competition) both ATI and nVidia started many years ago. At the moment both ATI and nVidia are winning (judging by their PR), and apparently it's the customers who lost the race.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
The French Resolution! (Score:4, Funny)
2048x1536 is the ONLY resolution.
Re: (Score:2)
And just how do I get that to display properly on my lovely 24" 1920x1200 display?
Re: (Score:1)
Re: (Score:2)
Never mind you amateurs. I want it to display properly on my 30" screen at 2560x1600! This is the One True Resolution!
Re: (Score:1)
5040x1050, man!
Shows how bad DX-10 really is (Score:5, Insightful)
And there's no indication here if someone is using corked drivers that favor one game over the other.
What I'd like to see is a benchmark rundown of each function in DX10, along with some realistic estimate of how often each function is called in normal game play. If different games favor different functions, then say so. Only then might I have some idea of how the two graphics powerhouses measure up against each other.
And if you have some reasonable way of testing common sequences of calls, show that as well.
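Something like this back-of-the-envelope C sketch is what I mean: time each call in isolation, then weight it by how often a typical frame issues it. The call names, costs and per-frame counts below are made up purely for illustration:

    /* Illustrative only: weight per-call microbenchmark results by an
     * estimate of how often each call appears in normal gameplay. */
    #include <stdio.h>

    struct func_bench {
        const char *name;            /* API call being measured     */
        double      us_per_call;     /* measured cost, microseconds */
        double      calls_per_frame; /* estimated use in a real game */
    };

    int main(void)
    {
        struct func_bench b[] = {
            { "DrawIndexed",        2.0, 1500.0 },
            { "SetShaderResources", 0.5, 3000.0 },
            { "Map/Unmap",          4.0,  200.0 },
        };
        double total_us = 0.0;
        for (size_t i = 0; i < sizeof b / sizeof b[0]; i++) {
            double cost = b[i].us_per_call * b[i].calls_per_frame;
            printf("%-20s %8.1f us/frame\n", b[i].name, cost);
            total_us += cost;
        }
        printf("weighted frame cost: %.1f us (about %.0f fps ceiling)\n",
               total_us, 1e6 / total_us);
        return 0;
    }

If different games favor different calls, you'd publish a weight set per genre (or per game) rather than one magic number.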
Re:Shows how bad DX-10 really is (Score:4, Insightful)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
The bigger picture is that these cards are completely compatible with each other, sporting exposed feature sets that are practically identical from a software point of view, making it much easier to program DX10 games that work on both. This is a bi
Re: (Score:3, Insightful)
Looks to me like if you make calls to it in one way, ATI shines, but call it another way and it's all Nvidia -- yet both cards+drivers allegedly comply with the standard.
Both cards do do the same thing.
Just that some engineer at nVidia thought of a genius way to, say, handle antialiasing, whereas some ATI engineer came up with a genius way to do T&L. (Just pulling these examples out of my Canada, but you get the idea.) Point is, chips designed by different people at different companies will perform differently.
Re: (Score:1)
Eh? DirectX is a thin wrapper over the hardware. The original DirectX gave you access to the physical frame buffer and blitter (DirectDraw) and the 3D hardware (Direct3D). Most of the time, Direct3D was used to blast polygons from a buffer in the game straight to the hardware in the card, which rendered them. These days it's much more complex, and different hardware does better at different things, because the designers concentrated on o
Anyone but me. (Score:2)
Good idea. We should have politicians to take a stand for the voter. Or criminals to take a stand for the victims. Let's demand that the problems take care of themselves, and then we can go back to not paying attention.
The solution is people paying attention and voting with their wallets. Obviously this is never going to happen -- people have more important things to worry about, and they're
Re:May as well be Diogenes... (Score:4, Informative)
Twice.
You can benchmark with existing, released games. Those are real applications, and as such the performance observed is relevant. The fact that their current driver is optimized or not does not matter; this is the current status.
Of course, since the author talks about pre-release code, that does not apply here. The author is quite surprised that *BETA* code from incomplete demos runs better on the hardware of the company that helped the game developer. No foul play here.
In the same vein, you have 3DMark. Those are benchmarks developed in collaboration with ALL the players in the industry: AMD, Nvidia, Intel... They all have plenty of time to optimize their drivers and find bugs or non-optimized paths. As this is a synthetic benchmark, the results are less relevant.
So, in summary, the author complains that beta code is not optimized for all hardware.
A stupid complaint, when the solution already exists: benchmark with known benchmarks and final code.
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
My Voodoo 3 3500 memories are mostly watching it crash if I tried to task switch, not being able to render in a window, not being able to do 24-32 bit color, and having video capture capped to 320x240 but still using all of my CPU and dropping 1/4 of the frames. Oh yeah, and getting a special custom MiniGL driver for a handful of games that were supported by it, and watching all manner of chaos erupt if the w
Does it really matter? (Score:5, Insightful)
I've been using dedicated graphics cards since my old 3dfx Orchid Righteous 3D. Since then I've had various ATI/nVidia cards, and I've never been in the situation where I've thought "damn, I wish I'd bought the other company's card".
I used to be someone who thought it was great to get 3 more fps than the other guy, but then I came to realise that whilst I got 3 more fps in one game, he got 5 more fps in another game that was OpenGL instead of DirectX, or whatever. It became obvious that it's not as clear-cut as one card being better than the other in terms of frame rates; it depends on the graphics API, the driver releases, the OS, the other hardware in the system, the game settings and on and on. Personally I prefer nVidia, but that's only because they have better developer support and I've had a better experience with the quality of their drivers over ATI's; image quality, features and frames per second have never once been an issue for me, and I'm sure this is true for all but those people who think that getting an extra 3 fps in game X actually makes the blindest bit of difference in the world.
Re: (Score:1)
By that same token, if the difference is 3fps, and otherwise virtually identical, then you know that too, and yo
Is it the DX 10 code or vista? (Score:4, Insightful)
Poorest Grammar Award goes to this post (Score:1)
Let them eat Directx! (Score:4, Funny)
Re: (Score:3, Funny)
Think this is bad?! (Score:2)
As my Comp Sci lecturer loved saying... (Score:5, Funny)
I don't see a conspiracy here. (Score:3, Insightful)
Since graphics technology is actually a fairly complex field and the design philosophies of these two companies are different, the other company's cards/drivers will be optimized for what THEY feel are the real performance metrics, and therefore they won't test as well on those benchmarks.
All of this can happen without anyone doing anything but coding and designing in the manner they believe to be the best balance of technology and practicality.
Re:New DX10 Benchmarks Do More Bad than Good (Score:1)
what is up with game support in drivers (Score:1)