New DX10 Benchmarks Do More Bad than Good

NIMBY writes "An interesting editorial over at PC Perspective looks at the changing relationship between modern game developers and companies like AMD and NVIDIA that depend on their work to show off their products. Recently, both AMD and NVIDIA separately helped release DX10 benchmarks based on upcoming games that show the other hardware vendor in a negative light. But what went on behind the scenes? Can any collaboration these companies use actually be trusted by reviewers and the public to base a purchasing decision on? The author thinks the one source of resolution to this is to have honest game developers take a stance for the gamer."
This discussion has been archived. No new comments can be posted.

  • John Carmack (Score:5, Interesting)

    by dsanfte ( 443781 ) on Wednesday May 23, 2007 @04:18PM (#19244855) Journal
    John Carmack used to be pretty good at cutting through the marketing crap and telling it like it was. Let's ask him.
    • Re:John Carmack (Score:5, Insightful)

      by peragrin ( 659227 ) on Wednesday May 23, 2007 @04:21PM (#19244919)
      Use OpenGL.
      • by dsanfte ( 443781 )
        Yeah, I guess that would be his short answer, wouldn't it?
      • Re:John Carmack (Score:5, Interesting)

        by Applekid ( 993327 ) on Wednesday May 23, 2007 @04:34PM (#19245073)
        He seems to be less anti-DirectX these days [computerpoweruser.com]:

        "JC: DX9 has its act together well. I like the version of DirectX on the 360. Microsoft is doing well with DX10 on tightening the specs and the exactness."

        Of course, he still calls it like it is:

        "The new features are not exactly well-thought-out. Most developers are pretty happy with DX9. The changes with DX10 aren't as radical. It's not like getting pixel shaders for the first time. Single-pass shaders are nice with DX10, but it's a smaller change. "
        • It's amazing what money and/or healthy donations can do(?)
          But really. Those two quotes seem to contradict each other.

          "Microsoft is doing well with DX10 on tightening the specs and the exactness."

          Then:
          "The new features are not exactly well-thought-out."

          Tell me I'm not the only one that noticed that?
          The first thing I thought when I read that first quote was the G4 commercials for Collins College... "We just need to tighten up the graphics..."
          • Re:John Carmack (Score:4, Informative)

            by Anonymous Coward on Wednesday May 23, 2007 @07:05PM (#19246627)
            They made the specifications slicker and more exact, but didn't add any new breakthrough features. Not contradicting at all, really.

            It's like "they reduced the needs for annual service on the car by 10%, but didn't add a turbocharger".
    • by GrievousMistake ( 880829 ) on Wednesday May 23, 2007 @04:51PM (#19245251)
      For extra value we should also ask Theo de Raadt for a comment. And it would make a good House episode. "So what you're saying, Mr. NVIDIA, is that you got that driver bug from a public toilet seat?"
  • by Timesprout ( 579035 ) on Wednesday May 23, 2007 @04:19PM (#19244883)
    Sorry, I bit my tongue and I can't pronounce "sing" properly. What was the author singing for anyway? Shouldn't he have just written it down?
  • The answer (Score:4, Insightful)

    by beavis88 ( 25983 ) on Wednesday May 23, 2007 @04:21PM (#19244917)
    Can any collaboration these companies use actually be trusted by reviewers and the public to base a purchasing decision on?

    No. There is some room for an "Unless..." argument, but frankly, "reviews" like this are so biased that no sane person should knowingly take them into account while evaluating a purchase. Unless (hah!) it's as a strike against the companies doing it. But you're screwed on both sides, there, so...
  • by Anonymous Coward
    "...have honest game developers take a stance for the gamer."

    Yeah, because you'd never hear a hack developer blame all the problems on the hardware, right?
  • I'd just like to say, "I already knew that".
  • DX10 (Score:5, Funny)

    by Richard McBeef ( 1092673 ) on Wednesday May 23, 2007 @04:28PM (#19245005)
    DX10 or for the uninformed, Derendering eXtraction (10 megapixels/second) is a standard benchmark for measuring the performance of GPUs or Gradient Pixilization Units. Pretty much this is what the video card companies all base their prices on with price being directly related to how many pixels can be gradiated per unit (usually about 30 cents per pixel/ounce).
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      Really? Based on what I read, I assumed it was DirectX 10.
    • DX10 or for the uninformed, Derendering eXtraction (10 megapixels/second) is a standard benchmark for measuring the performance of GPUs or Gradient Pixilization Units. Pretty much this is what the video card companies all base their prices on with price being directly related to how many pixels can be gradiated per unit (usually about 30 cents per pixel/ounce).

      Hey, how much is that in furlongs/fortnight?
  • by BattleTroll ( 561035 ) <battletroll2002@yahoo.com> on Wednesday May 23, 2007 @04:30PM (#19245039)
    I'm getting tired of the back and forth between AMD and Nvidia. Drop the whole 'optimized' drivers crap and give us cards that work great out of the box. This entire trend of releasing per-game tweaked drivers is just hurting consumers. I shouldn't have to wait for Nvidia to tweak their drivers to get the best performance out of one of their cards. I shouldn't have to download new drivers every time a new game comes out. The whole reason you create your cards based on a known standard is to avoid this mess.

    Stop fucking around and do it right the first time.

    How hard is that?
    • Re: (Score:3, Interesting)

      by Kamokazi ( 1080091 )
      Quite hard actually.

      What happens when a better way than the "known standard" comes around? Are we supposed to wait for an updated standard, and then updated hardware for that standard? By then, don't you think some part of that standard will be obsolete?

      Tweaked drivers, in most cases, only provide marginal benefits that many users would hardly notice. Yes, there are some stark exceptions where a different driver can have substantial impact, but this is often the game developer's fault as much as the hardware developer.
      • by Tim C ( 15259 ) on Wednesday May 23, 2007 @04:53PM (#19245285)
        What happens when a better way than the "known standard" comes around? Are we supposed to wait for an updated standard, and then updated hardware for that standard? By then, don't you think some part of that standard will be obsolete?

        In this case, DX10 (well, strictly D3D10) *is* the standard you're talking about. Waiting for a better way would involve waiting for D3D11 or similar. That's not what the OP's talking about.

        Yes, there are some stark exceptions where a different driver can have substantial impact, but this is often the game developer's fault as much as the hardware developer.

        I don't know about games, but I remember NVidia being caught cheating at 3DMark a couple of years ago. They released a driver that deliberately cut corners when it detected that that benchmark was being run, massively improving the framerate.

        That's not so easy with games, but quite often optimisations can be made when you code for a specific case that can't (or shouldn't, for performance reasons) be made when coding more generally. I wouldn't put it past either vendor to tweak their drivers for say Half Life 3 at the expense of other, less hyped titles.
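
        A minimal sketch of the kind of application detection being described, in C++ (all names here are hypothetical; this is not either vendor's actual driver code):

            // Hypothetical: a driver picks a tuned code path based on which
            // executable is running. Detecting a benchmark and cutting corners
            // there is the dishonest case; per-game tuning with no quality
            // loss is the gray area discussed above.
            #include <string>

            enum class ShaderProfile { Generic, TunedForBenchmark, TunedForGameX };

            ShaderProfile select_profile(const std::string& exeName) {
                if (exeName == "3DMark03.exe") return ShaderProfile::TunedForBenchmark;
                if (exeName == "hl2.exe")      return ShaderProfile::TunedForGameX;
                return ShaderProfile::Generic;  // everything else gets the generic path
            }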
        • Benchmarker-specific changes are obviously dishonest, but I see nothing wrong with optimizing drivers for the most popular applications, provided no quality is reduced for the framerate gain.
      • Steve Jobs is pretty adamant about the "no drivers" issue with OS X.

        I can't remember the last time I saw a graphics driver patch or fix for OS X.

        Of course, when you can threaten to pull a vendor's entire line of video cards from potentially millions of new computers, they tend to jump when you say jump. (And I believe Jobs once did, over ATI leaking a new Mac product a few years back.)
        • Re: (Score:2, Interesting)

          by Kamokazi ( 1080091 )
          Well, let's first assume OS X and Windows were on equal footing for PC games, and that upgrading video hardware was common for users of both OSes.

          If Apple were to mandate absolute perfection, you'd see a lot fewer driver releases for OSX...because they require more QA time.

          So on the other hand, Windows users would be getting better-performing drivers more quickly, which may have a hitch here and there in select titles, while OS X users would have inferior performance, all because Apple mandates perfection.

          Personally
        • Steve Jobs is pretty adamant about the "no drivers" issue with OS X.
          I can't remember the last time I saw a graphics driver patch or fix for OS X.
          Pretty much every update release for OS X has had one line in the public change log relating to changes to the ATi or nVidia kexts (drivers). On my MacBook Pro, I've had a few kernel panics caused by the ATi drivers, so I don't think OS X gets to escape driver issues.
    • Re: (Score:1, Funny)

      by Anonymous Coward
      Still waiting for the nethack optimized drivers... ..@..
    • by ChronosWS ( 706209 ) on Wednesday May 23, 2007 @04:51PM (#19245247)
      This is only a problem if in the course of 'optimizing' for a particular use case they degrade performance in all of the other cases. There may be times, if the case is particularly widely used, that it might even be worth a small perf hit in one area to gain a large benefit in another.

      You've got to remember, these guys live and die by sales. They *have* to look good in the numbers because that's what sells their cards at the top end. At the low end, no consumer cares either way as price dominates, but like automobiles, people assume that the tech from the top end trickles down to their lowly mass-market video hardware in some fashion, so it ends up still being relevant, if less directly so.

      Also, if you have looked at most of these benchmarks, the difference between best and 2nd best is usually quite small, on the order of a couple percent. The bragging rights of being able to claim you can run your game at 150fps while other plebeians can only run at 140fps is just that - bragging rights. There is no practical effect on game play until framerates drop below 30fps. And the top end graphics hardware these days is not the bottleneck at resolutions of 1280x1024 and below, so really, these guys are chasing numbers in the rarefied air of super high resolution monitors and games which use every trick in the book, which is an extremely small set of games actually played.

      But that is what sells. And in any case, the competition between ATI and nVidia is good even if those of us who 'know' see their number-chasing as pointless. Let them do their thing, and reward or punish them at the counter as you see fit.
      • "You've got to remember, these guys live and die by sales. They *have* to look good in the numbers because that's what sells their cards at the top end. At the low end, no consumer cares either way as price dominates"

        Yet at the same time, there are lots of people asking either of them to please support Linux so we can buy their cards.

    • by Chirs ( 87576 ) on Wednesday May 23, 2007 @04:54PM (#19245289)
      You say "Drop the whole 'optimized' drivers crap", then in the next sentance you say " I shouldn't have to wait for Nvidia to tweak their drivers to get the best performance...". You're contradicting yourself. Obviously you want the better performance that can be gained through the tweaking.

      Driver manufacturers try to get the generic code paths as fast as they can, but they can always make the driver a little bit faster by applying some domain-specific knowledge. If they know that a particular game has a particular hot path, they can optimize that path. Maybe the optimization they use wouldn't make sense for the general case, but they know it will work in that particular case.

      Sure it would be nice to have a card that was great in everything, but there will always be a way to make it just a little faster for that one special case....and we're back to the current scenario.
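
      A minimal illustration of such a domain-specific optimization (the function names and the "game X" profiling claim are made up for the example):

          // Generic path: re-validate render state before every draw call,
          // because an arbitrary application may change state at any time.
          // Tuned path: profiling of one specific title (hypothetically)
          // shows it never changes state mid-frame, so validation is hoisted
          // out of the draw loop. Correct only for that title.
          struct RenderState { bool dirty = true; };

          void validate(RenderState& s) { s.dirty = false; }
          void issue_draw() { /* submit work to the GPU */ }

          void draw_generic(RenderState& s) {
              if (s.dirty) validate(s);   // safe for any application
              issue_draw();
          }

          void frame_tuned_for_game_x(RenderState& s, int drawCalls) {
              validate(s);                // once per frame; assumes game X's behavior
              for (int i = 0; i < drawCalls; ++i)
                  issue_draw();           // per-draw validation skipped
          }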
      • If they know that a particular game has a particular hot path, they can optimize that path. Maybe the optimization that they use wouldn't make sense for the general case, but they know it will work in that particular case.

        Are there actually that many cases when an optimisation doesn't make sense?

        Game X does a lot of operation Y, so we make operation Y faster, fine. But in what case would it be beneficial to make operation Y slower?

        It's not like our high-end graphics cards are short of RAM to store code in,
        • Well, when you optimize operation Y, you may be slowing down operation Z (predicting differently on branches, allocating processing units differently). Or, alternatively, you may break operation W (game X never cares about triangles with these attributes, so we skip them to make operation Y faster, but other games might need those triangles to render properly).
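
          To make the "break operation W" case concrete, a toy sketch (assumed, simplified logic, not real driver code): forcing back-face culling is free speed for a title that never draws two-sided geometry, but it breaks any title that relies on seeing back faces.

              // Toy example: a per-game profile that discards back-facing
              // triangles. Faster for a game that never needs them; wrong
              // for a game that renders two-sided cloth or foliage.
              struct Tri { float nx, ny, nz; };   // face normal (view space)

              bool should_rasterize(const Tri& t, bool forceCullProfile) {
                  bool backFacing = t.nz <= 0.0f;
                  if (forceCullProfile && backFacing)
                      return false;               // "optimized" but lossy
                  return true;
              }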
    • by SuperKendall ( 25149 ) on Wednesday May 23, 2007 @05:01PM (#19245341)
      It's exactly aspects of PC gaming like this that drove me to consoles. Then they can do all the per-game tweaking they like, it's not me doing the work.
      • by redcane ( 604255 )
        Yes it is; you have to wait for updates to be downloaded when you connect to Xbox Live. I guess you're not doing the work per se, but neither is the guy hitting "update" on his video drivers.
        • The Xbox/360 is not the same as all consoles in general. None of my other consoles connect for updates on anything. I would prefer that this remain the case for eternity, but the market seems to be moving in a different direction.
        • Comment removed based on user account deletion
    • by LWATCDR ( 28044 ) on Wednesday May 23, 2007 @05:18PM (#19245515) Homepage Journal
      Do it right the first time? Spoken like someone who has never written software.
      Every time you write a piece of code you improve it; you may find a new way of doing something.
      Also, as more and more programs come out that use the driver, the people who write it will gain a better understanding of how it is used. That will help them optimize a code path. It could be as simple as selecting which branch tends to be used more and making that the default path.
      The more something is used, the more performance you can get out of it.
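
      A small sketch of that branch-ordering idea (hypothetical numbers and names; the hint uses GCC/Clang's __builtin_expect):

          // After profiling shows, say, ~95% of submissions are triangle
          // lists, test for that case first and hint it as the likely path.
          enum class Topology { TriangleList, LineList, PointList };

          void draw_triangles(int n) { /* hot path */ }
          void draw_lines(int n)     { /* rare */ }
          void draw_points(int n)    { /* rare */ }

          void submit(Topology topo, int primCount) {
              if (__builtin_expect(topo == Topology::TriangleList, 1)) {
                  draw_triangles(primCount);   // common case checked first
              } else if (topo == Topology::LineList) {
                  draw_lines(primCount);
              } else {
                  draw_points(primCount);
              }
          }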
      • by NateTech ( 50881 )
        Finding a "new way" to do something on the same hardware, simply means that you didn't do it right the first time, by definition.

        The hardware didn't change, your level of understanding did.

        You didn't know the way to tell the computer to do what you wanted accurately enough, the first time, if you "found a new way to do it".
    • D3D and OpenGL are supposed to be device-independent APIs. They are abstraction layers. It is inevitable that there will be different ways to accomplish pretty much the same task, and those semantically equivalent algorithms will not all map onto the hardware's capabilities to the same extent. Game developers should not have to care too much about the relative strengths and weaknesses of the underlying hardware.

      HALs are designed so that developers can ignore the potentially vast differences in the underlying hardware.
    • Stop fucking around and do it right the first time.

      How hard is that?

      In the 21st century, apparently that is so hard it is utterly impossible. We live in a world where nothing is ever "right". Well, I can tell you one thing: the driver teams probably did get things "right" the first time, in the sense that they probably cooked up drivers that adhered to published specs and made good use of the hardware available. Now it's perfectly normal to have small updates from time to time to incorporate refinements.

    • I'm getting tired of the back and forth between AMD and Nvidia.

      +10

      That's why I bought a Wii. Its graphics... suck. But the games are good - because the gameplay is good.

      The Wii is my response to all the crap-drivers crap (or cock-size competition) that both ATI and nVidia started many years ago. At the moment both ATI and nVidia are winning (judging by their PR), and apparently it's the customers who lost the race.

    • by Yvanhoe ( 564877 )
      Well, there is a whole set of games you can play with no worries. Gamers these days agree to be early adopters, but personally I see no shame in buying 2+ year old games and enjoying them. Granted, they don't have the same graphics quality as more recent games, but there is more to it than that, isn't there?
      • That reminds me of something I thought this morning. When I was a kid, I enjoyed playing Pong, it was great. A few years later I thought Space Invaders and Asteroids were awesome! I doubt that my classification of 'awesomeness' was different then than what it is now. I see Vanguard's graphics and I think they're awesome, but even though the graphics are 200000 times better, it's the same feeling I had when I saw Space Invaders 28 or so years ago. Sometimes I wish it all stopped, and games just came for
    • Well, then try a Mac where the OpenGL implementation is shipped by Apple, and that's what all developers have to use. Granted, it's a little slower, though that has changed recently with the multithreading update, and Leopard will be using OpenGL 2.1.
  • by DRAGONWEEZEL ( 125809 ) on Wednesday May 23, 2007 @04:55PM (#19245293) Homepage
    "The author thinks the one source of resolution to this is have honest game developers take a stance for the gamer."

    2048x1536 is the ONLY resolution.
  • by Nom du Keyboard ( 633989 ) on Wednesday May 23, 2007 @05:02PM (#19245359)
    To me, this just goes to show what a bad standard/interface DX10 really is. Looks to me like if you make calls to it one way, ATI shines, but call it another way and it's all Nvidia -- yet both cards+drivers allegedly comply with the standard. It sounds like trying to compare floating point benchmarks on an AMD Athlon versus an Intel Core 2: how you arrange the numbers and call the various floating point extensions can make all the difference.

    And there's no indication here if someone is using corked drivers that favor one game over the other.

    What I'd like to see is a benchmark rundown of each function in DX10, along with some realistic estimate of how much each function is called in normal game play. If different games favor different functions, then say so. Only then might I have some idea of how the two graphics powerhouses measure up against each other.

    And if you have some reasonable way of testing common sequences of calls, show that as well.
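
    One way such per-function results could be rolled up, as a rough sketch (the function names, throughput numbers, and call-mix weights below are invented for illustration):

        // Combine per-function microbenchmark throughput with how often a
        // given game actually calls each function. Weighting time-per-call
        // keeps a slow but rarely-used function from dominating the score.
        #include <cstdio>
        #include <map>
        #include <string>

        int main() {
            // calls/second for each API entry point on one card (made up)
            std::map<std::string, double> throughput = {
                {"DrawIndexed", 2.1e6}, {"SetShaderResources", 9.5e6}, {"Map", 4.0e5},
            };
            // fraction of total API calls in one game's profile (made up)
            std::map<std::string, double> callMix = {
                {"DrawIndexed", 0.50}, {"SetShaderResources", 0.35}, {"Map", 0.15},
            };

            double timePerCall = 0.0;
            for (const auto& [fn, weight] : callMix)
                timePerCall += weight / throughput[fn];   // weighted harmonic mean
            std::printf("weighted score: %.0f calls/sec\n", 1.0 / timePerCall);
        }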

    • by KillerCow ( 213458 ) on Wednesday May 23, 2007 @05:33PM (#19245699)

      What I'd like to see is a benchmark rundown of each function in DX10, along with some realistic estimate of how much each function is called in normal game play. If different games favor different functions, then say so. Only then might I have some idea of how the two graphics powerhouses measure up against each other.
      ... or you could just benchmark the card running popular retail games.
      • Awesome, which game should we rush out and buy to do DX10 benchmarks? I'm gonna pick up Duke Nukem Forever.
    • Don't blame DX10, I haven't heard any game programmer call it a pile of crap. What is probably going on here is that because DX10 is not widely used yet but just having "DX10 compatible" on the box will increase sales, both hardware manufacturers rushed their cards to market with stupidly inefficient rendering paths for certain features that they judged less important. This benchmarking problem should largely go away as DX10 support matures.
    • So you're saying that if DX10 was designed "properly", every video card would perform identically? I'm sorry, but that's a stupid thing to say, and Slashdot's the only place you'd get modded up for saying so just because it sounds like you're bashing Microsoft.

      The bigger picture is that these cards are completely compatible with each other, sporting exposed feature sets that are practically identical from a software point of view, making it much easier to program DX10 games that work on both. This is a bi
    • Re: (Score:3, Insightful)

      by Z34107 ( 925136 )

      Looks to me like if you make calls to it in one way, ATI shines, but call it another way and its all Nvidia -- yet both cards+drivers allegedly comply with the standard

      Both cards do do the same thing.

      Just that some engineer at nVidia thought of a genius way to, say, handle antialiasing, whereas some ATI engineer came up with a genius way to do T&L. (Just pulling these examples out of my Canada, but you get the idea.) Point is, chips designed by different people at different companies will perform differently.

    • To me, this just goes to show what a bad standard/interface DX10 really is.

      Eh? DirectX is a thin wrapper over the hardware. The original DirectX gave you access to the physical frame buffer and blitter (DirectDraw) and to the 3D hardware (Direct3D). Most of the time, Direct3D was used to blast polygons from a buffer in the game straight to the hardware in the card, which rendered them. These days it's much more complex, and different hardware does better at different things, because the designers concentrated on o
  • The author thinks the one source of resolution to this is to have honest game developers take a stance for the gamer.

    Good idea. We should have politicians take a stand for the voter, or criminals take a stand for the victims. Let's demand that the problems take care of themselves, and then we can go back to not paying attention.

    The solution is people paying attention and voting with their wallets. Obviously this is never going to happen -- people have more important things to worry about, and they're
  • by Xest ( 935314 ) * on Wednesday May 23, 2007 @05:16PM (#19245489)
    I'm not sure benchmarks really matter. It's not as if either of the cards are so bad that you're getting screwed by buying one instead of the other.

    I've been using dedicated graphics cards since my old 3dfx Orchid Righteous 3D, since then I've had various ATI/nVidia cards and I've never been in the situation where I've thought "damn I wish I bought the other company's card".

    I used to be someone who thought it was great to get 3 more fps than the other guy, but then I came to realise that whilst I got 3 more fps in one game, he got 5 more fps in another game that was OpenGL instead of DirectX or whatever. It became obvious that it's not as clear cut as one card being better than the other in terms of frame rates; it depends on the graphics API, the driver releases, the OS, the other hardware in the system, the game settings, and on and on. Personally I prefer nVidia, but that's only because they have better developer support and I've had a better experience with the quality of their drivers than ATI's. Image quality, features, and frames per second have never once been an issue for me, and I'm sure this is true for all but those people who think that getting an extra 3 fps in game X actually makes the blindest bit of difference in the world.
    • by Sir_Sri ( 199544 )
      Well, to that extent benchmarks matter, to show that it's only 3 fps. The difference between an 8800 Ultra and a 2900 XT is pretty substantial. No, that isn't an apples-to-apples comparison, but it clearly demonstrates that for decidedly more money, you get a better card. If that degree of betterness is worth the extra cost to anyone in their given circumstances, then they can make an informed choice.

      By that same token, if the difference is 3fps, and otherwise virtually identical, then you know that too, and yo
  • by Joe The Dragon ( 967727 ) on Wednesday May 23, 2007 @05:25PM (#19245591)
    Vista will look very bad if that DX10-for-XP hack comes out and it turns out to be faster.
  • This post has a greater abundance of both poor grammar and spelling than I can recall in any post that I've read in at least the previous year. The grammar is so poor that one or two sentences are nearly unintelligible. Nice proofreading and editing, there, Mr. ScuttleMonkey.
  • by BillGatesLoveChild ( 1046184 ) on Wednesday May 23, 2007 @05:51PM (#19245877) Journal
    DX10 runs only on Vista. I'm sure this article will be of great interest to the three Vista gamers out there.
    • Re: (Score:3, Funny)

      by laffer1 ( 701823 )
      No, it wasn't that interesting. I don't know how the other two feel. I don't care about benchmarks right now; I just want stable drivers when I have to use Vista.
  • Wait until the benchmarks come out showing it's 20% slower. (Note: this is a guesstimate, don't get upset... yet.)
  • by pls_call_again ( 920752 ) on Wednesday May 23, 2007 @08:26PM (#19247307)
    As my third-year computer design lecturer loved saying: there are lies; then there are damned lies; and then there are benchmarks.
  • by shaitand ( 626655 ) on Wednesday May 23, 2007 @11:33PM (#19248569) Journal
    The simplest explanation doesn't require any malice on the part of the video card manufacturers. If the developers and engineers develop the cards and drivers to optimize the features they believe the most important for performance, it stands to reason they will think those same features the most important when collaborating on a benchmark program. Magically, the benchmarks will score heavily in favor of the features that camp optimized their hardware and drivers for.

    Since graphics technology is actually a fairly complex field and the design philosophies of these two companies are different, the other company's cards/drivers will be optimized for what THEY feel are the real performance metrics, and therefore won't test as well on those benchmarks.

    All of this can happen without anyone doing anything but coding and designing in the manner they believe to be the best balance of technology and practicality.

  • Yes, for the real gamer there are a lot of things to consider before testing new ideas, such as what the different categories of gamers would see. The benchmarks can hardly do a lot, but how about giving it a try?
  • Why is game support a driver problem? I have written a few drivers myself so I understand what they do. What I don't understand is why every time a new game comes out, ATI and Nvidia need to provide support for that game in their driver? If they are putting game specific code in the drivers, shouldn't that code be part of the game? And if their driver blows and doesn't use the hardware correctly, shouldn't they fix it for all games, not just one? Can somebody explain this to me please.
