DirectX 10 Hardware Is Now Obsolete (373 comments)

ela_gervaise writes "SIGGRAPH 2007 was the stage where Microsoft dropped the bomb, informing gamers that currently available DirectX 10 hardware will not support the upcoming DirectX 10.1 in Vista SP1. In essence, all current DX10 hardware is now obsolete. But don't get too upset just yet: 'Gamers shouldn't fret too much - 10.1 adds virtually nothing that they will care about and, more to the point, adds almost nothing that developers are likely to care about. The spec revision basically makes a number of things that are optional in DX10 compulsory under the new standard - such as 32-bit floating point filtering, as opposed to the current 16-bit. 4xAA is a compulsory standard to support in 10.1, whereas graphics vendors can currently pick and choose their anti-aliasing support. We suspect that the spec is likely to be ill-received. Not only does it require brand new hardware, immediately creating a minuscule sub-set of DX10 owners, but it also requires Vista SP1 and developer implementation.'"
This discussion has been archived. No new comments can be posted.

  • More juice! (Score:5, Funny)

    by JosefAssad ( 1138611 ) on Saturday August 11, 2007 @05:44AM (#20193891) Homepage
    "4xAA is a compulsory..."

    That would seem to me to be the biggest change: it requires batteries now.
    • by mpe ( 36238 ) on Saturday August 11, 2007 @06:44AM (#20194115)
      "4xAA is a compulsory..."

      That would seem to me to be the biggest change: it requires batteries now.

      Presumably Microsoft will be calling one of the new features "EverReady Boost" :)
  • Wait... (Score:4, Funny)

    by Draconix ( 653959 ) on Saturday August 11, 2007 @05:44AM (#20193895)
    You mean developers are actually using DirectX 10?
  • by imbaczek ( 690596 ) <imbaczek&poczta,fm> on Saturday August 11, 2007 @05:44AM (#20193897) Journal
    This seems like a window of opportunity for a new OpenGL standard. Anybody knows when it's due?
    • by MrCoke ( 445461 ) on Saturday August 11, 2007 @05:54AM (#20193933)
    • by baadger ( 764884 ) on Saturday August 11, 2007 @05:55AM (#20193937)
      According to the OpenGL homepage...

      The OpenGL 3 specification is on track to be finalized at the next face-to-face meeting of the OpenGL ARB, at the end of August
    • This seems like a window of opportunity for a new OpenGL standard.

      Or--even better--a window of opportunity for a new SDL [] version. SDL is comparable to DirectX as it offers control over sound, graphics, mouse/keyboard/joystick. OpenGL is just for graphics so comparing it with DirectX isn't really fair. :)

  • by zdude255 ( 1013257 ) on Saturday August 11, 2007 @05:45AM (#20193905)
    I'm sure the two developers using DX10 are gonna be pissed.
  • by Z00L00K ( 682162 ) on Saturday August 11, 2007 @05:52AM (#20193925) Homepage
    A major requirement change - so why not call it DirectX 11 instead? Or maybe that's X11?

    Anyway - the whole business here seems to be to force hardware upgrades with one hand and software upgrades with the other, just to be sure that the flow of money is ensured. How long will it take until video drivers are Vista-only - just to force an upgrade to Vista?

    • by Yokaze ( 70883 )
      > and major requirement change

      It is not a major requirement change because, contrary to the statement of The Inquirer, the previously optional and now mandatory features are provided by both NVidia (source []) and ATI DX10 cards (source []).
      Both have 32-bit fp unified shaders and 4xAA.

  • by tech10171968 ( 955149 ) on Saturday August 11, 2007 @05:53AM (#20193931)
    The article makes it seem as if Microsoft rushed DX10 out before it was truly ready; when you consider that this is what they often seem to do with their OS's, this should probably come as no surprise. Of course, we're seeing this news on the Inquirer, often considered to be a slightly less-than-reliable source of tech news. Maybe I'll reserve judgement until I hear another explanation from some other source.
  • by Dracos ( 107777 ) on Saturday August 11, 2007 @06:07AM (#20193977)

    Once again, those seven little letters get left out of a "standards" article: d-e f-a-c-t-o.

  • Does this mean that moving to SP1 makes old hardware unusable? So will people be able to upgrade to SP1 and still keep their current hardware and games?

    I also wonder if there is a license change; charge hardware vendors more or make it unusable with FLOSS or something.

      Does this mean that moving to SP1 makes old hardware unusable?


      So will people be able to upgrade to SP1 and still keep their current hardware and games?

  • Why (Score:3, Interesting)

    by Unixfreak31 ( 634088 ) on Saturday August 11, 2007 @06:07AM (#20193989)
    Why this sudden change? DX10 hasn't even caught on among hardcore gamers, let alone the mainstream. Is MS going to make DX 10.2 or 11 so radical that developers have no options as well? If so, I think it's time to move back to OpenGL. No freedom for developers. And I want to be able to set my own AA levels.
  • by Taagehornet ( 984739 ) on Saturday August 11, 2007 @06:13AM (#20194003)
    "Now" is probably an exaggeration, considering that we're talking about Vista SP1.

    "Obsolete" ...I guess my DX9 card has been obsolete for a few years now, it still ticks on nicely though. Heck, all my hardware is probably obsolete.

    You could sum up TFA in a single line: "Microsoft discusses future extensions to the DirectX API. The current generation of hardware won't support those."

    Is anyone really surprised? Is this newsworthy?
    • by Sycraft-fu ( 314770 ) on Saturday August 11, 2007 @06:54AM (#20194171)
      You hear about it for a few reasons:

      1) Some people (like many on Slashdot) hate MS and want them to fail, thus look for anything that makes them look bad and make sure it gets page time.

      2) For some reason, some people had the perception that because DX10 was launched with Vista, that made it special and thus it wouldn't be changed for a long time. Never mind that MS has released a version of DirectX that has added a significant feature (as in something that needs more hardware) every 1-2 years in the past.

      3) Perhaps because of this, many people bought into DX10 cards expecting them to be "futureproof". Again, no idea why anyone would think that, given that graphics cards are the things that evolve the fastest and thus go obsolete the fastest.

      Also I'm not so sure they said it wouldn't support it. Maybe I misread their slides, but all I saw was they said that "upcoming hardware" will support it. That statement doesn't mean that current hardware won't.

      Either way, much ado about nothing. Games will continue to be made to support whatever hardware is common on the market. Game companies love all the flashy new toys, but they are in business to make money, and you do that by selling games that run on the actual systems that are out there. That means so long as most people don't have cards capable of using a new standard, they won't require it (though they may support it to give more eye candy to the early adopters).

      Heck, right now you'll discover that a great number of games require nothing more than a DirectX 8 accelerator. That's a card like a GeForce 4 Ti, for example. Basically that means shader model 1.1 hardware. While many games support 2.0 and 3.0 (DX 9.0 and 9.0c respectively), you'll find that a good number don't require 2.0, and very few require 3.0. The reason is that there are still a lot of people using older cards. Not everyone upgrades every year. Thus game makers have to take that into account.

      It's not like the second 10.1 comes out developers are going to say "Ok, everyone better upgrade because this is all we support!" They could try, and they'd just go out of business and other, smarter, developers would support the hardware that more people have.

      Heck, it is a pretty recent phenomenon that developers have stopped supporting Windows ME for games, and some still do. Why? Enough people still used it.
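The fallback scheme described above - shipping several shader-model render paths and picking the richest one the installed card supports - can be sketched roughly like this. The shader-model tiers follow the comment (DX8-class = SM 1.1, DX9.0 = SM 2.0, DX9.0c = SM 3.0), but the type and function names are invented for illustration:

```cpp
#include <string>

// Shader model exposed by the installed card, per the comment above:
// DX8-class (GeForce 4 Ti) -> SM 1.1, DX9.0 -> SM 2.0, DX9.0c -> SM 3.0.
enum class ShaderModel { SM_1_1, SM_2_0, SM_3_0 };

// A hypothetical engine selects the best render path the card can run
// instead of hard-requiring the newest model and locking out old cards.
inline std::string pick_render_path(ShaderModel supported) {
    switch (supported) {
        case ShaderModel::SM_3_0: return "sm3_path";  // extra eye candy for early adopters
        case ShaderModel::SM_2_0: return "sm2_path";  // the common baseline
        default:                  return "sm1_path";  // GeForce 4 Ti class still playable
    }
}
```

A GeForce 4 Ti owner still gets a working (if plainer) game, which is exactly why developers kept SM 1.1 paths around for years.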
  • by jadin ( 65295 ) on Saturday August 11, 2007 @06:14AM (#20194015) Homepage
    Conan - Are you comfortable and angry Pierre?

    Pierre - Comfortable and furious Conan.

    Conan - So what are you upset about today?

    Pierre - I've been a fan of PC games for ages, Conan. To play the latest and greatest games requires me to continually upgrade my computer. Recently I upgraded to Windows Vista by Microsoft in order to play their newest game, "Shadowrun". My PC could handle it, although there wasn't much benefit over using Windows XP. It, however, required a lot more RAM and a faster CPU in order to run smoothly. The game itself required the best video card I could afford. This was a serious investment; the video card alone set me back about the price of a new "non-gaming" PC. All this new hardware also required a bigger power supply, which wound up adding to my expenses. I wound up replacing my entire PC in order to save money. And since I was upgrading for one game only, it was difficult to justify, but I did so knowing my investment would last a year or two. Now Microsoft has announced DirectX 10.1, which makes all DirectX 10 hardware obsolete. This made my previous investment from a month ago already worthless. To add salt to my wounds, most of the features of 10.1 were optional and did nothing to improve the product. PC gaming is an enjoyable experience, although an expensive one. Hardware should last a minimum of 6 months cutting edge, and about a year for not-the-best but playable.

    Bottom line America? Microsoft needs to realize that features need to be worthwhile and should always be optional. If they are truly worth it, they will be adopted as standard by the general public very quickly.

    Conan - Thank you Pierre, I'm sure two or three people across America know exactly what you're feeling like.
    • Note to self: do not buy expensive hardware for a single title. Another note to self: buying "bleeding edge" hardware is the surest way to guarantee that things will change and your hardware will soon be outpaced by commodity "mainstream" hardware at half to a third of the price. Another note to self: until hardware is "mainstream", no developers are going to invest huge amounts of time or resources in it... thus the slight difference between DX9 and DX10. Yet another note to self, you have failed to understa
  • by Val314 ( 219766 ) on Saturday August 11, 2007 @06:31AM (#20194067)
    How can this be surprising?

    You have 10.0 hardware and want it to support 10.1?

    Please stop posting such nonsense, or would you cry foul if your SSE3 CPU doesn't support SSE4 when it's available?
    • by _KiTA_ ( 241027 )

      How can this be surprising?

      You have 10.0 hardware and want it to support 10.1?

      Please stop posting such nonsense, or would you cry foul if your SSE3 CPU doesnt support SSE4 when its available?

      Well, yes, I would cry foul if my SSE3 CPU suddenly didn't work with ... SSE3. This is a MINOR version change, 10.0 to 10.1. If they were going from 10.0 to 11.0, that would be one thing. They're not.
  • Oh no! (Score:4, Informative)

    by mikkelm ( 1000451 ) on Saturday August 11, 2007 @06:34AM (#20194073)
    Is that.. is that progress? New technology requiring new hardware?! BURN IT! BURN THE WITCH!

    I didn't think I'd live to see the day where new technology would be unwelcome to the slashdot crowd. I guess it isn't surprising, though, it being a Microsoft product, and slashdot degenerating into a zealot sandbox.

    DirectX 10.1 is going to be released about a year after DirectX 10. DirectX 9.0c was released about a year after DirectX 9.0b, and DirectX 9.0b hardware was also incompatible with DirectX 9.0c spec. That didn't create a whole lot of mainstream uproar, as people are generally positive towards new technology. I guess this being Vista and all, people can ignore pesky facts like those and continue their circle jerking unabated.
    • Re: (Score:3, Informative)

      by ardor ( 673957 )
      The point is that D3D10.1 mainly just enforces stuff that was optional in 10.0. There are no new killer features. So a game requiring 10.1 will make your shiny new 8800 obsolete with absolutely no gain. 9.0b->9.0c saw the addition of stream frequencies, among others, which is essential for instancing (D3D10 redesigned the entire instancing thing again). Also, 9.0c was largely compatible with 9.0b. It was mostly a bugfix release with added samples and a couple of new features (which were optional).
      • The title of the article is "DirectX 10 Hardware Is Now Obsolete". If you want to talk about the features making it obsolete, you'll be wanting "DirectX 10.1 Ships With No New Noteworthy Features". The fact of the matter is that it's nothing new that new standards supersede older ones, and that's what the summary and the people posting comments are complaining about.
      • by weicco ( 645927 )

        I can still play my old games using my old GF 7900 GTX graphics card even if MS releases DirectX 15.7. And new games won't be going DX 10.1-only any time soon. So there is basically no point. And if, as you put it, DX 10.1 doesn't bring anything new to the table, DX 10.0-compatible cards may already support it.

    • Re:Oh no! (Score:5, Insightful)

      by DrEldarion ( 114072 ) <[moc.liamg] [ta] [0791uhcsm]> on Saturday August 11, 2007 @07:26AM (#20194291)

      I didn't think I'd live to see the day where new technology would be unwelcome to the slashdot crowd.
      That's the general trend of Slashdot nowadays. The realization hit me when everyone started bashing the PS3, which contains a very impressive processor, allows installation of Linux, has built-in media streaming, uses standard USB and Bluetooth hardware, runs folding@home, upscales DVDs and old games, etc. etc. All anyone here says, though, is "OMG SONY I BET THERE'S A ROOTKIT ON IT LOL".

      This isn't a tech site anymore, it's a political site. Witness all the anti-RIAA/MPAA stories, global warming stories, election stories...
      • Re:Oh no! (Score:4, Insightful)

        by marcello_dl ( 667940 ) on Saturday August 11, 2007 @08:01AM (#20194409) Homepage Journal
        I don't buy a PS3 exactly because of the rootkit. But I criticized the PS3 mainly because Linux does not have access to the whole hardware, the lack of RAM expansion options, and the braindead HD partition scheme. If new tech is crippled because of corporate strategies, don't expect techies (either on Slashdot or elsewhere) to like it.
      • everyone started bashing the PS3, which contains a very impressive processor, allows installation of linux, has built-in media streaming, uses standard USB and Bluetooth hardware, runs folding@home, upscales DVDs and old games, etc. etc. All anyone here says, though, is "OMG SONY I BET THERE'S A ROOTKIT ON IT LOL".

        But is that the fault of "Slashdot", or Sony?

        ie: did it ever occur to you that you might not be grasping the cause-and-effect relationship here?

        Oh, and you people who are posting "Slashdot had cha
  • Developers already have difficulties justifying DirectX 10 support because Vista marketshare is still so low and most gamers are perfectly fine with XP and DirectX 9. Also, DirectX 10 lacks the backwards compatibility of the older versions.

    But at least the new unified shaders seemed to be useful for developers, so there were advantages to it. But now, DirectX 10.1 only seems to make certain features compulsory, thus removing choice for the developers, and also does not add new features to make it c
    • Re: (Score:3, Insightful)

      by ardor ( 673957 )
      Yes, game developers are getting conservative nowadays, and always have been regarding support of new APIs. So many studios will continue using D3D9. But for the same reason, many studios still won't switch to OpenGL. In both cases (D3D9->D3D10, or D3D9->OpenGL 2.x or even the coming 3.x) the codebase has to be largely rewritten, so when studios MUST upgrade, they will probably prefer OpenGL this time...
  • by Zephiris ( 788562 ) on Saturday August 11, 2007 @07:10AM (#20194223)
    That DirectX 10.1 is incompatible with 10.0 (along with new WDDM interface) has been known for at least a year now. It's a bit late [] for people to be in shock about it.

    Slashdot even covered it before [].

    Just because Microsoft officially announced it at a conference doesn't *exactly* make it new news, since they made it very clear on roadmaps and everything else exactly what was going to happen, and why it wasn't the best idea ever to adopt DirectX 10.0 hardware, rather than hardware capable of 10.1 (or 10.2) and whatever the new superset of OpenGL happened to be (3.0 as it turns out).

    Also, the reason to bother [] with DirectX 10.1 isn't so much that it offers "brand new super features" to games, but the WDDM 2.1 bits, which allow for far finer-grained context switching and task management - being able to immediately switch from rendering one small bit to starting to render something else. That would theoretically make all of the Compiz/Aero type stuff run much more smoothly in conjunction with real 3D rendering (i.e., games, CAD).

    It all seems an exercise in futility to me, as far as the "DirectX 10" hardware goes. I like faster, I like more features, but there just seems no real reason to upgrade beyond my Geforce 6800 for the price point (which I got 18 months ago). Not to a 7800-series or comparable, and certainly not to an 8x00 or upcoming 9x00 Geforce, unless driver stability improves dramatically, and they can add more real-world-useful features, particularly without the need for Windows Vista. I'm back using WinXP "for a while" again, but I generally won't buy hardware anymore unless it's a notable and drastic improvement in Windows, Linux, and FreeBSD.

    I digress, but the point is, the news has already been covered before. If it apparently wasn't that attention-worthy a year ago, is it now? New DirectX versions *always* require brand new hardware, whereas most minor OpenGL revisions have almost always included new features that also work on old hardware (OpenGL 1.5's Vertex Buffer Objects humming along happily on a GeForce 256, for instance). And while full compliance is best, all you really need to care about is whether something implements certain clearly defined extensions, rather than wondering whether Nvidia or ATI have 'misinterpreted' specifications, as with DirectX. Both have been panned in the past for 'creative' adoption of pixel shader standards and bizarre interpretations of DirectX 9.

    I'd just hope that eventually, there's more actual competition again, and both companies (and new companies) actually respect and care about standards compliance and that both they and the standards bodies start to care about what customers actually doing with their hardware.
  • In a way, Microsoft is trying to emulate IBM when it tried to jam MCA down the throat of the PC world back in the mid-80's. What happened to IBM then should happen to Microsoft now, too.
  • > 'Gamers shouldn't fret too much - 10.1 adds virtually nothing that they will care about and,
    > more to the point, adds almost nothing that developers are likely to care about.

    Actually it's even better. DirectX 10.0 doesn't add anything you will care about either. Game developers are finding Shader 3.0 (DirectX 9.0c) gives them more than enough to do. There's no need to move to DirectX 10.0 for quite some time. Now add to that DirectX only running under Vista, because someone at Microsoft marketing t
  • Known Roadmap (Score:4, Interesting)

    by Anonymous Coward on Saturday August 11, 2007 @08:29AM (#20194551)
    It's funny watching everyone who is shocked. Those are the people who have no idea what DirectX 10 is and why the model has shifted so much from OpenGL and earlier versions of DirectX.

    DirectX 10 and up is not just an accelerated video API but also a standard. Microsoft has completely eliminated the capability bits, or "cap bits", concept in order to ensure to developers that if they program against a specific version of the standard, all of the functionality mandated by that standard will be supported by the graphics hardware. No longer will a developer target DirectX9 or OpenGL2 and have to ask the hardware whether or not it supports a plethora of options and then have to completely branch their development umpteen ways to support different varieties. If a game targets DirectX10.1, then 4xAA is guaranteed to be there, period. If a game does not require 4xAA, then it doesn't have to target DirectX10.1.

    So get used to it otherwise you'll be shitting yourself for every single DirectX release going forward. This is how it works now.
  • by arse maker ( 1058608 ) on Saturday August 11, 2007 @09:49AM (#20194947)
    First off... technology is made obsolete??? No shit! It's hard to imagine the Slashdot crowd finding this to be news. This doesn't mean your DX10 card doesn't work anymore; it's not as if you install SP1 and your PC won't boot with DX10 hardware. If you get upset every time people make revisions and improvements to software and hardware, I suggest you pack up your computer and return it to avoid further heartache.

    If you are an early adopter of the latest hardware and don't read any reviews (which, from memory, all said it will be some time before DX10 is going to matter), then that's your fault. Microsoft have explained in numerous interviews (and documentation, of course) how DX 10 will work; they even suggested 10.1 would be out BEFORE Vista shipped.

    Graphics card features change ALL THE TIME; you have to write miles of CAPS-checking code and render paths to support the zoo of cards and features. Now with DX10 they roll all the features up, and any DX 10.x card will support those features; even if you write a DX 10.0 and a DX 10.1 path, that's only two options you have to support. You didn't see "ATI MAKES LAST CARD OBSOLETE BY INTRODUCING NEW PRODUCT", even though those changes could be far, far more difficult to develop for, with a bunch of changed caps and maybe even a few new proprietary ones.

    A fixed feature set is what allows developers to squeeze every drop of performance out of PS2 hardware to make amazing-looking graphics, even though your mobile phone might have more processing power available to it.

    And lastly... people who mock the apparently modest real-world improvements DX10 is offering: what is your point? Intel brings out a new processor every x months with ~1-3% improvements; by your logic they should just stop bothering to make new processors. Of course that's stupid; you wait till the improvement is enough for you to find it compelling.
  • DirectX 10 hardware is now obsolete?!!? Thank god I stayed with my non-obsolete DX9 hardware.

    Oh wait... I run Linux/OpenGL. Nevermind.
  • by Charcharodon ( 611187 ) on Saturday August 11, 2007 @12:31PM (#20195949)
    Video games fall under the 2-year rule. What comes out tomorrow will not show up in games for at least two years. If you buy into all the BS marketing and buy the latest and greatest, you are going to be in a constant state of disappointment, since nothing can live up to the hype and nothing is ever ready to go at launch.

    DirectX 10, other than a few limp patches and demos, does not exist; hardware-accelerated physics, nope, not yet; SLI or dual and quad GPUs hardly give a return on the investment unless you are running multiple monitors, etc etc etc. None of this is worth getting worked up about. Unplug your brain from the marketing-driven fanboy/hater game and just enjoy the ride. Graphics and computing power are fabulous compared to what they were just a few years ago, and the fact that MS has set an actual standard is kick-ass, so that when you go out and buy a card and game that says DX10 on the side, you can actually count on it being exactly what it says it is. That beats the "good ol'" days before DirectX, when you had to wait for your graphics card manufacturer or the game publisher to come out with a patch so that your graphics card would be supported, and when they didn't you were just shit out of luck.

  • by TheNetAvenger ( 624455 ) on Saturday August 11, 2007 @12:51PM (#20196085)
    One thing that DX 10.1 addresses is the sound APIs that developers felt lost without: MS's sound technology used on the Xbox 360 is being added into DX 10.1, and this is more of what DX 10.1 is about than anything else.

    Sadly though, sound is one area Vista gets no credit, yet is one of the best selling points of Vista.

    With the new audio subsystem in Vista, if you are running 5.1 or higher, you can turn on your mic and it will auto-tune the speakers and environment sounds for an outstanding experience.

    Another great thing about sound in Vista is that even with an old AC'97 sound card and just stereo speakers on a desktop or laptop, the sound fidelity is significantly better than XP or OS X by several factors. For example, a WAV, MP3, or WMA played on the same hardware and same speakers will sound incredibly more rich and defined on Vista than when you are playing it in XP. Even putting the same speakers on a Mac and 'trying' to get the fidelity up, the sound quality was NOT even close to what Vista was doing with an old sound card.

    And DX10.1 adds back in DirectX level APIs for game developers.

    If anyone really wants to understand the Audio in Vista, do a search on Vista Audio Subsystem, or Sonar Vista. There are great technical pieces on why Vista redid the Audio system and also some good examples of why developers of audio products like Sonar continue to choose MS and Vista as their platform of choice for high quality production.

"There is no distinctly American criminal class except Congress." -- Mark Twain