Nvidia Launches 8800 Series, First of the DirectX 10 Cards

mikemuch writes "The new top-end GeForce 8800 GTX and GTS from Nvidia launched today, and Loyd Case at ExtremeTech has written two articles: an analysis of the new GPU's architecture, and a benchmark article on PNY's 8800 GTX. The GPU uses a unified scalar-based hardware architecture rather than dedicated pixel pipelines, and the card sets the bar higher yet again for PC graphics." Relatedly, an anonymous reader writes "The world and his dog have been reviewing the NVIDIA 8800 series of graphics cards. There is coverage over at bit-tech, which has some really in-depth gameplay evaluations; TrustedReviews, which has a take on the card for the slightly less technical reader; and TechReport, which is insanely detailed on the architecture. The verdict: superfast, but don't bother if you have less than a 24" display."
  • WOW! This is FAST! (Score:4, Insightful)

    by Salvance ( 1014001 ) * on Wednesday November 08, 2006 @05:24PM (#16775453) Homepage Journal
    It's actually pretty surprising that the DX10-compatible 8800 runs $450-$600 given it's brand new and has huge performance gains over Nvidia's current cards. I don't understand why someone would say only buy it if you have a 24" monitor though ... it seems like buying a single 8800 would be just as good as (and cheaper than) buying a couple of 7800s ...
  • 24" monitor? (Score:3, Insightful)

    by BenFenner ( 981342 ) on Wednesday November 08, 2006 @05:36PM (#16775669)
    So this will benefit my 13' projected monitor running at 1024 x 768 resolution (60 Hz refresh), and not my 20" CRT running at 1600 x 1200 resolution (100 Hz refresh)?

    You don't say...
  • Re:Yeah, but... (Score:3, Insightful)

    by BRTB ( 30272 ) <slashdot@NOSpam.brtb.org> on Wednesday November 08, 2006 @05:56PM (#16776031) Homepage
    Is now [nzone.com] soon enough for you? =]

    Sure, they're beta, if you want to be picky about it. They'll probably work just fine - their last beta drivers did.
  • by Tyger ( 126248 ) on Wednesday November 08, 2006 @06:15PM (#16776385)
    Just because you have a DirectX 10 capable card doesn't mean you need DirectX 10. Most of those games/benchmarks are against DirectX 9, and the rest are against OpenGL. It will be a few years before most games require DirectX 10.
  • by TheRaven64 ( 641858 ) on Wednesday November 08, 2006 @07:04PM (#16777149) Journal
    The biggest difference between DirectX and OpenGL is the extension mechanism. OpenGL specifies a set of features which must be implemented (in hardware or software), and then allows vendors to add extensions. These can be tested for at runtime and used (and the most popular ones then make it into the next version of the spec). DirectX doesn't have a comparable mechanism; the only features it exposes are those that the current version of the API dictates.

    In their rush to get a chunk of the big Windows market share, vendors put their weight behind DirectX, without noticing that it was a typical Microsoft attempt to commoditise the market by preventing vendors from differentiating themselves easily. Now GPU vendors just have to try to release the fastest card they can that conforms to the Microsoft API, rather than adding new, innovative features. I doubt something like S3 texture compression would survive if it were added now; only OpenGL programmers would be able to use it, and they don't seem to make up much of the games market.
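
    As a sketch of that runtime test (a hypothetical standalone probe, not code from any shipping game; GLUT is assumed only to obtain a GL context, and GL_EXT_texture_compression_s3tc is the real S3TC extension string):

    /* probe.c -- check the driver's OpenGL extension string at runtime. */
    #include <GL/glut.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        /* glGetString() needs a current context, so make a throwaway window. */
        glutInit(&argc, argv);
        glutCreateWindow("extension probe");

        /* Note: strstr() is a substring match; a real engine would compare
           whole space-separated tokens to avoid prefix false positives. */
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (ext && strstr(ext, "GL_EXT_texture_compression_s3tc"))
            printf("S3TC supported: take the compressed-texture path.\n");
        else
            printf("No S3TC: fall back to uncompressed textures.\n");
        return 0;
    }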

  • by Babbster ( 107076 ) <aaronbabb&gmail,com> on Wednesday November 08, 2006 @07:08PM (#16777205) Homepage
    It depends on the game. In the [H]ardOCP review [hardocp.com], this appears to be the first card that can do Oblivion with maxed in-game settings (the grass has been the problem area in the past, even with top-of-the-line cards) at very high resolutions and high AA settings while retaining solid framerates - the settings they considered ideal in their testing were 8x AA at 1600x1200 and 4x AA at 1920x1200. That would be impressive for a SLI setup, let alone a single card.

    How worthwhile that is depends, of course, on just how killer a person wants their gaming rig to be (I can't imagine ever buying a $600 graphics card myself). But, given that the performance seems to exceed that of any other graphics card (or any two, for that matter), it's pretty clearly the card to get to ensure maximum gaming PC penis size. :)
  • by Sj0 ( 472011 ) on Wednesday November 08, 2006 @07:16PM (#16777307) Journal
    You CAN, but I've found almost universally that they don't. The game development cycle is too tight, multi-platform compatibility is too important, and codebases are simply too large to justify optimizing the living hell out of the code you've got.

    And the new gaming PC I'm building costs less than the PS3, and other than perhaps 100 bucks for the chibi version of this monster when it comes out, I don't expect to have to do much to keep the system I'm building competitive with the PS3 in terms of playing a broad spectrum of recent games for the lifespan of the machine.
  • by Handpaper ( 566373 ) on Wednesday November 08, 2006 @07:28PM (#16777427)
    Ahem.

    Section "Monitor"
    Identifier "Sun GDM-5410"
    HorizSync 30-122
    VertRefresh 48-160
    Modeline "2048x1536@72" 472.89 2048 2080 3872 3904 1536 1565 1584 1613
    EndSection

    'The old that is strong does not wither' :)

  • Power consumption (Score:4, Insightful)

    by MobyDisk ( 75490 ) on Wednesday November 08, 2006 @08:58PM (#16778411) Homepage
    Dual power connectors, yeeha! Video card manufacturers really aren't doing much about idle power consumption: 66 watts at idle just to display a static frame buffer. I can't imagine what will happen running Vista with Aero Glass. I bet people's power consumption numbers will double.
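
    For scale, a back-of-envelope sketch of what that idle draw adds up to over a year (the always-on duty cycle and the $0.10/kWh rate are assumptions for illustration, not figures from the reviews):

    #include <stdio.h>

    int main(void)
    {
        const double idle_watts  = 66.0;          /* idle draw cited above */
        const double hours       = 24.0 * 365.0;  /* assume always-on */
        const double usd_per_kwh = 0.10;          /* assumed electricity rate */

        double kwh = idle_watts * hours / 1000.0; /* ~578 kWh per year */
        printf("%.0f kWh/year, about $%.0f at $%.2f/kWh\n",
               kwh, kwh * usd_per_kwh, usd_per_kwh);
        return 0;
    }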
  • Also GPGPU (Score:3, Insightful)

    by Krischi ( 61667 ) on Wednesday November 08, 2006 @11:03PM (#16779681) Homepage
    Don't forget general-purpose GPU computing. For those highly parallelizable applications that do not need to conform to the full IEEE-754 floating point specs, this card is a dream come true.
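
    As a sketch of the kind of workload that means, here's a minimal SAXPY kernel in CUDA, the general-purpose toolkit NVIDIA announced alongside the G80 (the kernel, sizes, and test values are illustrative, not from the post):

    // saxpy.cu -- y = a*x + y over a million elements, one thread per element.
    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)                      /* guard the tail of the array */
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        /* Host data: x = 1, y = 2 everywhere, so a = 2 should give y = 4. */
        float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;
        cudaMalloc((void **)&dx, bytes);
        cudaMalloc((void **)&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);  /* 256-thread blocks */
        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

        printf("y[0] = %.1f (expect 4.0)\n", hy[0]);
        cudaFree(dx); cudaFree(dy); free(hx); free(hy);
        return 0;
    }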
