


Nvidia Launches 8800 Series, First of the DirectX 10 Cards

mikemuch writes "The new top-end GeForce 8800 GTX and GTS from Nvidia launched today, and Loyd Case at ExtremeTech has done two articles: an analysis of the new GPU's architecture, and a benchmark article on PNY's 8800 GTX. The GPU uses a unified scalar-based hardware architecture rather than dedicated pixel pipelines, and the card sets the bar higher yet again for PC graphics." Relatedly, an anonymous reader writes "The world and his dog has been reviewing the NVIDIA 8800 series of graphics cards. There is coverage over at bit-tech, which has some really in-depth gameplay evaluations; TrustedReviews, which has a take on the card for the slightly less technical reader; and TechReport, which is insanely detailed on the architecture. The verdict: superfast, but don't bother if you have less than a 24" display."
This discussion has been archived. No new comments can be posted.


  • WOW! This is FAST! (Score:4, Insightful)

    by Salvance (1014001) * on Wednesday November 08, 2006 @05:24PM (#16775453) Homepage Journal
    It's actually pretty surprising that the DX10-compatible 8800 runs $450-$600 given it's brand new and has huge performance gains over Nvidia's current cards. I don't understand why someone would say to buy it only if you have a 24" monitor, though ... it seems like buying a single 8800 would be just as good as (and cheaper than) buying a couple of 7800s ...
  • DNF! (Score:4, Funny)

    by spacemky (236551) * <nick@aryfi.DEBIANcom minus distro> on Wednesday November 08, 2006 @05:28PM (#16775533) Homepage Journal
    I heard somewhere that this will be one of the only supported video cards in Duke Nukem Forever.

    • by Das Modell (969371) on Wednesday November 08, 2006 @11:50PM (#16780059)
      I'm sure a lot of people are going apeshit with excitement over this card, but in a year everyone will have forgotten about it, because the next big thing will have been released by then. The 7900 was released in March 2006, for fuck's sake, and now everyone is talking about the 8800. People don't even realise that they'll be able to use the 7900 for several years before it gets outdated. Then again, I'm so amazingly oldschool that I use 1024x768 with no AA or AF (=useless gimmicks), so what do I know. I have an X800 and it's still enough to run games without compromising graphics quality too much (except BF2142, because the engine was programmed by monkeys).
    • Re:DNF! (Score:3, Funny)

      by EvilIdler (21087) on Thursday November 09, 2006 @04:53AM (#16781873)
      You're half-right; it will be the oldest supported card.
  • another review (Score:4, Informative)

    by brunascle (994197) on Wednesday November 08, 2006 @05:29PM (#16775547)
    Hot Hardware has another review [hothardware.com]
  • by MojoKid (1002251) on Wednesday November 08, 2006 @05:29PM (#16775551)
    NVIDIA has officially launched their new high-end 3D graphics card with full support for DX10 and Shader Model 4.0. The GeForce 8800 series is fully tested and showcased [hothardware.com] at HotHardware, and its performance is nothing short of impressive. With up to 128 stream processors under its hood, up to 86GB/s of memory bandwidth at its disposal, and a whopping 681 million transistors, it's no wonder the new GPU rips up the benchmarks [hothardware.com] like there's no tomorrow. NVIDIA is also launching a new enthusiast line of motherboard chipsets supporting Intel's Core 2 Duo/Quad and AMD Athlon 64 processors. NVIDIA's nForce 680i SLI and nForce 680a SLI motherboard chipsets [hothardware.com] will also allow a pair of these monster graphics cards to run in tandem for nearly double the performance, and the new chipsets offer a ton of integrated features, like Gigabit Ethernet link teaming.
  • by bigattichouse (527527) on Wednesday November 08, 2006 @05:31PM (#16775571) Homepage
    ... you can get reasonable framerates with NeverWinter Nights? :)
  • Yeah, but... (Score:2, Informative)

    by cp.tar (871488) <cp.tar.bz2@gmail.com> on Wednesday November 08, 2006 @05:35PM (#16775639) Journal

    ... does it run Linux?

    Seriously... when are the Linux drivers expected?

  • 24" monitor? (Score:3, Insightful)

    by BenFenner (981342) on Wednesday November 08, 2006 @05:36PM (#16775669)
    So this will benefit my 13' projected monitor running at 1024 x 768 resolution (60 Hz refresh), and not my 20" CRT running at 1600 x 1200 resolution (100 Hz refresh)?

    You don't say...
  • by wumpus188 (657540) on Wednesday November 08, 2006 @05:37PM (#16775685)
    Now we need Duke Nukem Forever to really put this baby to work.
  • by plasmacutter (901737) on Wednesday November 08, 2006 @05:38PM (#16775697)
    Seriously... last I checked, certification for logo testing and DX10 required DRM... not just DRM, but enough lockdown to get Hollywood to sign off on it.

    They kept changing the standards over and over... so the question is exactly what is required in terms of integrated DRM.
    • by Firehed (942385) on Wednesday November 08, 2006 @08:22PM (#16777973) Homepage
      That's the FUDdiest post I've read in a while. Back that up with at least a vague reference to anything.

      HDCP support being required wouldn't surprise me, but that's not so much DRM as a stupid thing to try and make you buy a new monitor. In either case, it won't affect gaming whatsoever, or legal content for quite some time (the ICT isn't likely to be enabled before 2010-2012). I doubt it's required anyway, just highly recommended.
    • by jonwil (467024) on Wednesday November 08, 2006 @11:59PM (#16780135)
      The DRM will probably consist of two things:
      1. HDCP support, so that if the software layer requests it, the data will only be sent to devices that are approved (i.e. those that correctly talk HDCP);
      2. Support on the software side, so that the drivers (or at least one version of them) will be able to prevent screen-scraping programs (think FRAPS etc.) and other hacks from reading graphics data back if the application and OS request "protected media".

      Basically, unless you are actually using software and media that requires this DRM, it's unlikely to be noticeable.
  • Finally... (Score:2, Funny)

    by Anonymous Coward on Wednesday November 08, 2006 @05:39PM (#16775725)
    ...something that can run Vista Aero with 5 stars!!!
  • by theonecp (1021699) on Wednesday November 08, 2006 @05:50PM (#16775883)
    The folks at Boot Daily take a peek at MSI's GeForce 8800GTX and run it through quite a few benchmarks and discuss its visual qualities.
  • by TheRaven64 (641858) on Wednesday November 08, 2006 @05:51PM (#16775915) Journal
    I was under the impression that one of the major advantages of DirectX 10 was that it supported virtualisation. This means the device needs either to be able to save its entire state to main memory (for switching) or, ideally, to produce virtual devices that render to a texture rather than the main screen (for picture-in-picture).

    TFA didn't seem to mention anything about this though. Can anyone comment?

    • by rzei (622725) on Wednesday November 08, 2006 @06:38PM (#16776767)

      Not that I'm a developer or really know anything about the implementation of today's graphics cards, but I think off-screen rendering has been supported for some time.

      For example, the game F.E.A.R. [whatisfear.com] made use of, among other things, off-screen rendering (straight to texture) when multiple surveillance cameras were rendered on a monitor in the game world.

      The way I see it, the chip itself doesn't have to know so much about how many tasks are using it, it's the drivers or perhaps even higher level software that does the scheduling of "graphic-requests". But then again this is all speculation :)

    • by TychoCelchuuu (835690) on Wednesday November 08, 2006 @10:21PM (#16779307) Journal
      Right now, although the card supports DX10, all your games and operating systems are on DX9. Until Vista comes out you're not going to see anything taking advantage of any neato DX10 doodads.
  • by llZENll (545605) on Wednesday November 08, 2006 @05:54PM (#16775977)
    http://enthusiast.hardocp.com/article.html?art=MTIxOCwxLCxoZW50aHVzaWFzdA== [hardocp.com]

    http://www.tomshardware.com/2006/11/08/geforce_8800/ [tomshardware.com]
    Although the Tom's article is pretty worthless, as most benches are CPU bound with an FX-64 CPU.

    My favorite has to be this page: 8800 GTX SLI/3.80GHz Core 2 Duo SLI
    http://www.extremetech.com/article2/0,1697,2053791,00.asp [extremetech.com]
  • by minvaren (854254) on Wednesday November 08, 2006 @05:56PM (#16776039)
    Any coincidence that they launch the first DX10 card the same day that Vista goes gold [neowin.net]?
  • by mr_stinky_britches (926212) on Wednesday November 08, 2006 @05:57PM (#16776049) Homepage Journal
    For those of you who are interested in what the [H] has to say about this card..here is the direct link:

    BFGTech GeForce 8800 GTX and 8800 GTS [hardocp.com]
    Today marks the announcement of NVIDIA's next generation GeForce 8800 series GPU technology code named "G80." We have two new hard-launched video cards from BFG Tech representing the 8800 products. Gameplay experience TWIMTBP?

    I found their review to be of typical [H] quality, which I think is pretty decent (when compared to other H/W review sites, that is ;)

    Wi-Fizzle Research [wi-fizzle.com]
  • by scombs (1012701) on Wednesday November 08, 2006 @06:13PM (#16776319)
    Is this SLI capable? Not that I would be able to pay for even one of the cards...
  • by ConfusedSelfHating (1000521) on Wednesday November 08, 2006 @06:16PM (#16776393)
    At least the Xbox 360 was released before it was obsolete. The PS3 graphics processor is similar to the 7800 GTX, if I remember correctly. When the PS3 releases, people won't be saying "Buy the PS3 for the greatest graphical experience"; instead they'll say "Buy the PS3 for the greatest graphical experience, except for the PC you could have bought last week". The PS3 will probably be about as powerful as the 8600 when it's released.

    I know I sound very offtopic bringing this up, but many PC gamers also play console games. They will want to compare console graphics to PC graphics.
    • by Wesley Felter (138342) <wesley@felter.org> on Wednesday November 08, 2006 @06:39PM (#16776791) Homepage
      It's really not fair to expect a $500 console to have the same graphics as a $2,000 PC. For mainstream gamers, PS3 will probably compare favorably to a PC when it comes out.
    • by Elladan (17598) on Wednesday November 08, 2006 @06:41PM (#16776841)
      Graphics processors aren't really as important for a console system as a PC, though, since consoles target an output device with a typical resolution of 640x480 at 60 frames per second at most (and more like 30 in most cases). Sure, a few people might have HDTV, but not many.

      Plus, the PS3 has a herd of vector coprocessors to assist the video engine. I don't think anyone is going to not buy a PS3 because it fails to meet some artificial benchmark in the lab. They're going to complain that it costs a hell of a lot for the small number of games available.
    • Consoles can tune games a LOT more than PCs, because the hardware is completely standard. They can do tricks and optimizations with rendering and such that you couldn't reliably expect to work on Joe Blow's random PC. The console still isn't out of the game.

      Besides, the video card you can buy costs as much as a whole PS3. The PS3 is still better bang for your gaming buck. Either way, I'm ok with my Go7600 in my laptop and I'm gonna get a Wii, so y'all can go do your own thing when posturing about games.
      • by Sj0 (472011) on Wednesday November 08, 2006 @07:16PM (#16777307) Journal
        You CAN, but I've found almost universally that they don't. The game development cycle is too tight, multi-platform compatibility is too important, and codebases are simply too large to justify optimizing the living hell out of the code you've got.

        And the new gaming PC I'm building costs less than the PS3, and other than perhaps 100 bucks for the chibi version of this monster when it comes out, I don't expect to have to do much to keep the system I'm building competitive with the PS3 in terms of playing a broad spectrum of recent games for the lifespan of the machine.
      • by Slothy (17409) on Thursday November 09, 2006 @04:06AM (#16781639) Homepage
        This is true. Essentially, you figure out how many milliseconds of rendering time you can afford (depending on whether you do 30 or 60fps), and then work backwards to see what you can enable. So you can do all sorts of tricks to take advantage of the exact hardware to hit your target framerate. On PC, you can't really do much, because there are so many cards and then new drivers come out so regularly and change performance, you just try to have a really flexible engine and let users turn options on or off depending on the framerate/resolution they want.

        If you know that everyone who buys the game will see this model using this resolution normal map, you can put lots of work into making that look great. On PC, you have no real idea what it will look like for any particular person (will they even enable normal mapping? what res will it run at? what kind of texture filtering will they use? will it be using the highest resolution, or a lower mip level?), so the effort is diluted by trying to make every case look as good as it can. In reality, that will never look as good as making one case look outstanding.
    • by Inoshiro (71693) on Wednesday November 08, 2006 @07:14PM (#16777273) Homepage
      (Especially as I find Sony a bunch of asshats), but...

      An 8800 GTX is how much, exactly? A PS3 is $550 CDN. How many PC games will use DX10? 10 games? AFAIK, Halo 1, Half-Life 2, etc., aren't magically DX10 games, since they were written for previous versions of the DX API.

      Will Square Enix be writing PC versions of Final Fantasy XII? X-2?

      Cost-wise, these cards and the PS3 are close. Game-wise, I suspect the PS3 will have more games out than there will be DX10 games. The DX10-Vista lock-in is another disincentive to go and get a raging boner over an 8800.

      I find my Nintendo DS to be a very enjoyable game platform, despite the fact that it doesn't require a 450W power supply, or other commensurate upgrades, to get the same picture as my PC.
    • by asuffield (111848) <asuffield@suffields.me.uk> on Wednesday November 08, 2006 @08:02PM (#16777775)
      I know I sound very offtopic bringing this up, but many PC gamers also play console games. They will want to compare console graphics to PC graphics.

      Have you ever played PC games and console games? Console graphics have always SUCKED DONKEY compared to PC graphics. The PS2 had the most advanced graphics processor around when it was released... but the output resolution was 320x200 (because that's about what a TV uses), so it really didn't matter a damn.

      Nobody sane has ever expected decent graphics from a console. Consoles are not intended to give decent graphics, except when compared to other consoles. Consoles are intended to be used with TV sets, which is an incredibly limiting constraint.

      The current round of consoles has made a lot of noise about HDTV but none of them currently have any intention of implementing it (they'll give HDMI output but the quality of the output is not greatly improved over a regular TV). Console games continue to run at painfully low resolutions, compared to equivalent PC games. Gamers who really care about graphics quality will continue to use PCs, like they have done for the past several years.
  • by zulater (635326) on Wednesday November 08, 2006 @06:18PM (#16776447)
    now we can finally watch pr0n at over 1000fps!
  • by Vigile (99919) on Wednesday November 08, 2006 @06:24PM (#16776555)
    http://www.pcper.com/article.php?type=expert&aid=319 [pcper.com]

    This review looks at gaming and such too, but also touches on NVIDIA CUDA (Compute Unified Device Architecture), which NVIDIA is hoping will bring supercomputing down to mainstream pricing. What thermodynamics programmer wouldn't love to have access to 128 1.35 GHz processors for $600?

    http://www.pcper.com/article.php?aid=319&type=expert&pid=5 [pcper.com]
  • by Deagol (323173) on Wednesday November 08, 2006 @06:39PM (#16776781) Homepage
    Isn't this the tail wagging the dog? Shouldn't the video card industry have hardware API standards and shouldn't the software vendors be releasing stuff compatible with the hardware?

    "DirectX 10 Cards" sounds as silly as saying "Vista compatible PC BIOS". WTF?

    • by TheRaven64 (641858) on Wednesday November 08, 2006 @07:04PM (#16777149) Journal
      The biggest difference between DirectX and OpenGL is the extension mechanism. OpenGL specifies a set of features which must be implemented (in hardware or software), and then allows vendors to add extensions. These can be tested for at runtime and used (and the most popular ones then make it into the next version of the spec). DirectX doesn't have a comparable mechanism; the only features it exposes are those that the current version of the API dictates.

      In their rush to get a chunk of the big Windows market share, vendors put their weight behind DirectX, without noticing that it was a typical Microsoft attempt to commoditise the market by preventing vendors from differentiating themselves easily. Now GPU vendors just have to try to release the fastest card they can that conforms to the Microsoft API, rather than adding new, innovative, features. I doubt something like S3 texture compression would survive if it were added now; only OpenGL programmers would be able to use it, and they don't seem to make up much of the games market.

      • by Quasar1999 (520073) on Wednesday November 08, 2006 @11:26PM (#16779891) Journal
        Oh please!

        It's called the CAPS structure, and DirectX has had it for as many versions as I can remember. You check to see what capabilities the card supports and decide which features you'll use. The OpenGL extensions are the same damned thing, except there you enumerate a big string list, while on the DirectX side you have all extensions visible and most available in software emulation mode, with the CAPS structure telling you what is hardware accelerated.

        Besides, how do you think pixel shaders and vertex shaders got to where they are today? It certainly wasn't because graphics card manufacturers decided to write extensions to OpenGL for the hell of it... It was DirectX specs that pushed them forward. And it's those same DirectX specs that allow developers to write games in parallel with the hardware development cycle, so that when the latest card comes out, there are already games ready to use it.

        If developers had to wait for a card to come out with some OpenGL extension before being able to experiment, understand, and then use it (and only on one brand of card), do you think anything would be adopted in any reasonable amount of time?

        I by no means love DirectX; it's got its issues... but the OpenGL extension concept is in NO WAY helping innovation in the hardware arena.
    • by uhlume (597871) on Wednesday November 08, 2006 @07:05PM (#16777161) Homepage
      Seriously, where have you been for the last 10-15 years, and were you somehow under the impression all this time that OpenGL, DirectX 3-9 and their predecessors were "hardware API standards"? The only difference in this respect between DirectX 10 and earlier versions is that DX10 doesn't attempt to provide backward compatibility for older hardware, so you'll need an explicitly DX10-compatible card in order to take advantage of DX10 rendering paths.
    • by drsmithy (35869) <drsmithy AT gmail DOT com> on Wednesday November 08, 2006 @08:49PM (#16778295)

      Isn't this the tail wagging the dog? Shouldn't the video card industry have hardware API standards and shouldn't the software vendors be releasing stuff compatible with the hardware?

      Sure, if you want to go back to the bad old days of games only supporting a small number of very specific video cards.

  • by Kris_J (10111) * on Wednesday November 08, 2006 @07:03PM (#16777131) Homepage Journal
    Or a Matrox Triplehead2Go [matrox.com]. A 24" panel is only a little over 2 million pixels. Three 1280x1024 panels are almost 4 million pixels. And you can get a TH2G plus three 17" or 19" panels for significantly less than a 24" panel.

    Is anyone testing these video cards in 3840x1024 yet?

  • by Sloppy (14984) on Wednesday November 08, 2006 @07:08PM (#16777197) Homepage Journal

    Forget the review; what catches my eye here is the term "DirectX 10 Card." The very idea that it's categorized by limited software compatibility, rather than categorized by the type of hardware slot that it uses, is a new idea to me.

    I can see a huge upside to it, though. As a time-saver, I would love for the amount of "closedness" to be how hardware gets categorized, so that I could just shop from the "open and compatible with everything" category instead of having to do research along the lines of "is this usable? Is that?"

    • by Nicolay77 (258497) <`nicolay.g' `at' `gmail.com'> on Wednesday November 08, 2006 @07:28PM (#16777425) Homepage
      It runs all the DirectX 7, 8 and 9 games with amazing framerates.
      It just happens to be the first to be able to run DirectX 10 games too.
    • by Sycraft-fu (314770) on Thursday November 09, 2006 @01:20AM (#16780715)
      Generally speaking, DirectX and new graphics hardware are closely tied. MS works with the card makers to find out what they are planning for next-gen hardware and to let them know what DirectX will demand. Cards are built around that. So the DirectX version of a card becomes a useful way of talking about its features. For example, DX7 cards had hardware T&L units, DX8 cards programmable T&L units and pixel shaders, and DX9 cards fully programmable shaders (among other changes).

      Well DirectX 10 makes more changes and thus there'll be a new generation of hardware to deal with it.

      Now please note none of this means it won't support OpenGL; nVidia has strong OpenGL support. The reason OpenGL isn't referenced is that it doesn't keep up. The GL specs move slowly, so new features are implemented as vendor-specific extensions. This card is capable of things beyond OpenGL 2.0, but then so were the 7000 series.
  • by eneville (745111) on Wednesday November 08, 2006 @07:47PM (#16777627) Homepage
    is this what i need to be able to run vista?
  • I am planning to buy a GeForce 7600GT, a card that gives me the framerates and resolutions I want with a very small price compared with the high end cards. Also because a more expensive card would be bottlenecked by my CPU so it would be a waste.

    However, I now want a card with the same price and watts requirements as the 7600GT but with the G80 chipset (the one inside these beasts), just because of the better anisotropic filtering and antialiasing.

    So... how long until we have the mid-range versions of this card?
  • by Sj0 (472011) on Wednesday November 08, 2006 @08:28PM (#16778041) Journal
    Anyone using qbasic, yes.

    These are just video games. At some point, you're seriously facing diminishing returns for your $1200USD SLI 8800 rig.
  • by SEMW (967629) on Wednesday November 08, 2006 @08:33PM (#16778097)
    No, you don't need a DX10 card to run Vista. You need a DirectX 9 card with 128 MB of video RAM for Aero Glass, or any old 2D chip for Aero Standard, Aero Basic, Classic, etc.
  • by Sj0 (472011) on Wednesday November 08, 2006 @08:40PM (#16778201) Journal
    Actually, I reserved a new video card this week. I decided on something with a bit less oomph than this, though. Between my GeForce MXes and my super high-end video cards, I've found that it's better to buy the cheap video card that'll last you maybe a year or two than to go all-out and get maybe a year or two out of it. (Hey, it's not raw power that necessitates upgrades, it's Pixel Shaders 4.1, right around the corner!)
  • Power consumption (Score:4, Insightful)

    by MobyDisk (75490) on Wednesday November 08, 2006 @08:58PM (#16778411) Homepage
    Dual power connectors, yeeeha! Video card manufacturers really aren't doing much about idle power consumption. 66 watts at idle just to display a static frame buffer. I can't imagine what will happen running Vista w/ Aero glass. I bet people's power consumption numbers will double.
  • by MobyDisk (75490) on Wednesday November 08, 2006 @09:00PM (#16778451) Homepage
    Holy crud! I misread that: It is 220 WATTS AT IDLE! [bit-tech.net] The idle TEMPERATURE in deg C is the 66.
  • by plasmacutter (901737) on Wednesday November 08, 2006 @09:14PM (#16778605)
    It's been some months since I last saw the relevant articles (they were on the EFF's Trusted Computing repository and in places like freedom to tinker), but I'll try to bring what stuck in my mind here:

    AACS copy protection on the new generation of HD video media has invasively strict requirements, such as encryption of the video path within the system itself to prevent "sniffing" attacks, which means either the hardware itself or the drivers constitute a form of DRM. Any way I look at that encrypted-media-path requirement, I wonder exactly how a set of Linux drivers would not be challenged as a "circumvention device", and at the very least the authentication process within the system for this encryption will impose a toll on video performance.
  • by friedmud (512466) on Wednesday November 08, 2006 @10:55PM (#16779613)

    With 768MB of RAM you might actually be able to run Beryl and open up _10_ windows before they start going black! :-P

    For those who have no idea what I'm talking about look here:

    http://www.nvnews.net/vbulletin/showthread.php?t=77248 [nvnews.net]

    My workstation at school is a turbo-cache Quadro card with only 128MB of RAM... which means I can only open a couple of windows before they start going black... sigh.

  • by Das Modell (969371) on Wednesday November 08, 2006 @11:44PM (#16780021)
    The Resident Evil games on the GameCube (1, 0 and 4) all look really nice, even today.
  • by TheRaven64 (641858) on Wednesday November 08, 2006 @11:59PM (#16780139) Journal

    It's called the CAPS structure, and DirectX has had it for as many versions as I can remember

    It's not the same thing at all. The CAPS structure allows you to enumerate which subset of DirectX the card supports. OpenGL extensions allow you to query for features which were not part of the original specification. This is exactly the difference I was describing. You can't use DirectX 9 features from a DirectX 8 application, but you can use OpenGL 2.0 features from an OpenGL 1.0 application by accessing them via the extensions mechanism.

    Besides, how do you think pixel shaders and vertex shaders got to where they are today?

    As I recall, nVidia designed Cg and wrote a Cg compiler. This could then be used outside the framework of DirectX or OpenGL. Eventually DirectX and OpenGL incorporated derivatives of Cg in their APIs. OpenGL extensions take a while to get into the standard, but they get the technology out there quickly for developers to experiment with. It's not that uncommon for nVidia to implement ATi extensions (and vice versa) before they make it into the standard. By the time something's in the ARB_ namespace, it's close enough to being in the standard that people can start using it.

  • by S3D (745318) on Thursday November 09, 2006 @01:47AM (#16780899)
    Looks like DirectX 10 functionality (unified and geometry shaders and the like) will be available in the NVIDIA drivers very soon. It seems the entry points for new OpenGL extensions are already present in the driver nvoglnt.dll (96.89), along with new Cg profiles. All we need now is the header file.
    Chances are, DirectX 10-like functionality will be available through OpenGL before Vista ships. Another reason to switch to OpenGL from DirectX. Also, it will be at least a couple of years before the majority of gamers switch to Vista, but with OpenGL, developers can utilize the latest GPUs to their full potential on Windows XP.
    More about it in this thread from OpenGL.org:
    http://www.opengl.org/discussion_boards/ubb/ultimatebb.php?ubb=get_topic;f=3;t=014831 [opengl.org]
  • by dusanv (256645) on Thursday November 09, 2006 @02:22AM (#16781087)
    January at the earliest. The really sweet ones (cheap and low power) will come in the spring when they do a die shrink to 45 nm.
  • by Slothy (17409) on Thursday November 09, 2006 @03:57AM (#16781587) Homepage
    Your first paragraph is very false.
  • by Slothy (17409) on Thursday November 09, 2006 @04:16AM (#16781693) Homepage
    What methods did you use to determine that nobody tunes their game for the hardware? Can you tell if it's CPU bound, fillrate bound, vertex bound, memory bound, etc? That sounds like a very hard-to-prove statement.

    I would submit Halo as an example of a game that shows just how much it was optimized for the target hardware. The PC version chugs on much more powerful hardware.

    And frankly, you HAVE to tune your game for each platform, it's not optional. Platforms have different capabilities (pixel shader support? unified shaders or fixed pipelines? how big is the L2 cache? is it multi-core? how much performance are you losing to cache misses? what's your threading model?) It's not like you can ignore that stuff and still ship a game.
  • by modeless (978411) on Thursday November 09, 2006 @05:35AM (#16782121) Journal
    OpenGL extensions are not nearly the same thing as DirectX caps. Vendors can't add new DirectX caps; they can only choose to implement, or not, the caps that Microsoft defines. For an example of why this sucks, consider geometry instancing. Microsoft decided in their infinite wisdom that only Shader Model 3.0 cards could have geometry instancing support. Then when ATI produced some Shader Model 2 cards with instancing, they had to disable it by default and use a terrible hack to enable it involving calling an unrelated API (CheckDeviceFormat IIRC) with a special "magic number", just to pass Windows Logo testing.

    For another example, consider the sad, sad state of dual-head support in DirectX. To this day it is impossible to write a windowed (non-fullscreen) DirectX application that performs well when stretched across two monitors as seen by Windows. Graphics vendors have stood on their heads and worked around this problem by telling Windows that there is really only one monitor and replacing all the dual-head code in Windows with logic in the graphics driver! Meanwhile, OpenGL works flawlessly.

    In any case, DirectX Caps suck so much that developers often ignore them in favor of just using the make and model of card to determine a suitable rendering path. Microsoft realizes this, and that's why the role of caps has been reduced in DirectX 10. That's right: DirectX 10 mostly defines a standard set of features that every card must support, leaving very little flexibility for vendors.

    I'm not sure how anyone could argue that DirectX is friendlier to the development of new graphics features than OpenGL. There's simply no way to add new features to it at all, unless you are Microsoft. Everyone has to wait, for years, until Microsoft comes down from Mount Sinai with the next version set in stone. Or, as has become the status quo: implement new features using dirty hacks, and do handstands to hide these new features from Microsoft's APIs until DirectX catches up.
  • by Petrushka (815171) on Thursday November 09, 2006 @06:50AM (#16782555)
    ... you may laugh (go on, I know you want to) but I managed to have fun with NWN1 on a 2001 Toshiba Satellite laptop with 256 MB RAM, 48 MB shared graphics memory, and no hardware acceleration, at 800x600, with almost everything turned down. It even survived the siege scenario in HotU. I think I'd cough up hairballs if I tried that now, mind you.
  • by Petrushka (815171) on Thursday November 09, 2006 @06:52AM (#16782573)
    DirectX 10 requires Vista, which comes with five trillion kinds of DRM built in. Isn't that enough?
  • by Kris_J (10111) * on Thursday November 09, 2006 @08:50AM (#16783327) Homepage Journal
    A huge collection of games supports that resolution (with some bugs in some cases), and I very much enjoyed playing WoW with a wrap-around image until I quit (for an unrelated reason). Triplehead is not dualhead -- you're not staring at a join; you typically focus on the centre screen and use the side screens for early warning, or as space for secondary information.

    Though by "testing", I simply mean benchmarking cards the same way reviews currently benchmark them, with the benefit of actually taxing the video card enough that the CPU isn't the bottleneck.

  • by zoney_ie (740061) on Thursday November 09, 2006 @09:39AM (#16783783)
    > AA or AF (=useless gimmicks)

    I have a 20" screen. Even with its native resolution of 1600x1200, pixels are quite large. Turn off AA, and particularly AF, and any game looks pretty poor (jagged edges and un-merged texture edges look abysmal when magnified).

    It looks particularly awful if my poor vanilla 6800 won't let me do more than 1024x768 in a game (e.g. the monster that Bethesda created which is Oblivion); that generally looks terrible scaled to fill the screen (esp. with AA low or turned off). At 1280x1024 the image is large enough that I can just run it centred and it doesn't look silly/too small.

    For most people who have a more down-to-earth screen, I'd agree, there's no need to upgrade so frequently. Even in my case, I'm not going to change from my 6800 anytime soon unless I see a real bargain in the lower high-end bracket (about €300, and something more powerful than say 7950GT). MMMmmmm... I'd love one of these 8800 series cards though...
  • by Namarrgon (105036) on Thursday November 09, 2006 @10:10AM (#16784243) Homepage

    Yeah, games are fine & all that, but I'm just happy for gamers to bring the economies of scale down for this-here plug-in supercomputer.

    GPGPU is what will really make it stand out. Physics acceleration, Folding@Home, ray-traced audio, ray-traced window managers, fluid-simulator window managers, film-level 2D pixel processing (my field), realtime H.264 decode & encode... I'm just scratching the surface. High performance computing just got a whole lot cheaper.

    Cell? Never heard of it.

  • by mgblst (80109) on Thursday November 09, 2006 @10:56AM (#16785093) Homepage
    Yeah, that is why I drive a bus to work, even though I live on my own and don't carpool.

    That is why I live in a 10-bedroom, 3-bathroom mansion.

    That is why I have 6 monitors at work, even though I just play solitaire.

    Because apparently the opposite of "640K ought to be enough for anybody" is having as much as possible, no matter the need.
  • by JohnnyBigodes (609498) <morphine&digitalmente,net> on Thursday November 09, 2006 @10:59AM (#16785177)
    Had heard of that, already did, didn't help, unfortunately :(

    Well, hoping for a patch.
  • by Prophet of Nixon (842081) on Thursday November 09, 2006 @11:59AM (#16786295)
    Er, the PS2 was not the most advanced graphics processor even when it was released, and it typically rendered at 480i (720x480, interlaced). It still manages to look quite good -- check out SkyGunner, Okami, or Gradius V sometime. I know I've seen better-looking games on the PS2 than Neverwinter Nights 2, and unlike NWN2, they ran smoothly.
  • by queenb**ch (446380) on Thursday November 09, 2006 @04:54PM (#16788647) Homepage Journal
    We've been screwed by Microsoft again. Here's an excerpt from my blog: "After a brief phone call to ATI, I was informed that both the X1800 and X1900 series cards will support the new Dx10 standard. A quick pop over to PriceWatch shows that the cheapest X1800 card is $176 and the cheapest X1900 card is $203. Now, since I'm getting the whole rest of the box for $667, I find this to be a bit steep. Hopefully, Nvidia will have something under $150." Sadly, that's not the case. This is the ONLY card from Nvidia that supports Dx10. The only sites I've seen pre-selling it are in the EU. The one listed in the article in the original post says 605 British Pounds, which works out to $1,152.59 USD.

    Am I the only one who sees the irony in putting a $1000 video card in a $600 computer?

    2 cents,


Our business in life is not to succeed but to continue to fail in high spirits. -- Robert Louis Stevenson