Nvidia Launches 8800 Series, First of the DirectX 10 Cards

mikemuch writes "The new top-end GeForce 8800 GTX and GTS from Nvidia launched today, and Loyd Case at ExtremeTech has done two articles: an analysis of the new GPU's architecture, and a benchmark article on PNY's 8800 GTX. The GPU uses a unified scalar-based hardware architecture rather than dedicated pixel pipelines, and the card sets the bar higher yet again for PC graphics." Relatedly, an anonymous reader writes "The world and his dog has been reviewing the NVIDIA 8800 series of graphics cards. There is coverage over at bit-tech, which has some really in-depth gameplay evaluations; TrustedReviews, which has a take on the card for the slightly less technical reader; and TechReport, which is insanely detailed on the architecture. The verdict: superfast, but don't bother if you have less than a 24" display."
Comments Filter:
  • by MojoKid ( 1002251 ) on Wednesday November 08, 2006 @05:29PM (#16775551)
    NVIDIA has officially launched its new high-end 3D graphics card, with full support for DX10 and Shader Model 4.0. The GeForce 8800 series is fully tested and showcased [hothardware.com] at HotHardware, and its performance is nothing short of impressive. With up to 128 stream processors under its hood, up to 86GB/s of memory bandwidth at its disposal, and a whopping 681 million transistors, it's no wonder the new GPU rips up the benchmarks [hothardware.com] like there's no tomorrow. NVIDIA is also launching a new enthusiast line of motherboard chipsets supporting Intel's Core 2 Duo/Quad and AMD Athlon 64 processors. The nForce 680i SLI and nForce 680a SLI motherboard chipsets [hothardware.com] will allow a pair of these monster graphics cards to run in tandem for nearly double the performance, and the new chipsets offer a ton of integrated features, such as Gigabit Ethernet link teaming.
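The quoted 86GB/s figure is easy to sanity-check against the 8800 GTX's published specs (a 384-bit memory bus and 900 MHz double-data-rate GDDR3); a quick back-of-envelope calculation:

```python
# Back-of-envelope check of the quoted ~86 GB/s memory bandwidth,
# using the 8800 GTX's published 384-bit bus and 900 MHz GDDR3
# (double data rate, so 1800 MT/s effective).
bus_width_bits = 384
memory_clock_mhz = 900
transfers_per_clock = 2  # GDDR3 is double data rate

bytes_per_transfer = bus_width_bits // 8  # 48 bytes moved per transfer
effective_rate_mhz = memory_clock_mhz * transfers_per_clock

bandwidth_gb_s = bytes_per_transfer * effective_rate_mhz * 1e6 / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # 86.4 GB/s
```

So the "up to 86GB/s" in the submission is the 86.4 GB/s theoretical peak, rounded down.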
  • by plasmacutter ( 901737 ) on Wednesday November 08, 2006 @05:38PM (#16775697)
    Seriously... last I checked, certification for logo testing and DX10 required DRM. Not just DRM, but enough lockdown to get Hollywood to sign off on it.

    They kept changing the standards over and over, so the question is exactly what is required in terms of integrated DRM.
  • by TheRaven64 ( 641858 ) on Wednesday November 08, 2006 @05:51PM (#16775915) Journal
    I was under the impression that one of the major advantages of DirectX 10 was that it supported virtualisation. This means the device needs either to be able to save its entire state to main memory (for context switching) or, ideally, to be able to expose virtual devices that render to a texture rather than the main screen (for picture-in-picture).

    TFA didn't seem to mention anything about this though. Can anyone comment?
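For what it's worth, the picture-in-picture idea the parent describes can be sketched conceptually: each virtual device renders into its own offscreen buffer, which is then composited into a region of the main framebuffer. A toy stand-in (plain Python, not a real graphics API; all names here are illustrative):

```python
# Conceptual sketch of render-to-texture compositing: a virtual device
# draws into an offscreen buffer, which is then blitted into the main
# framebuffer as a picture-in-picture inset.

def make_buffer(w, h, fill=0):
    """A framebuffer modelled as rows of pixel values."""
    return [[fill] * w for _ in range(h)]

def blit(dst, src, x, y):
    """Copy the src buffer into dst at offset (x, y)."""
    for row, line in enumerate(src):
        dst[y + row][x:x + len(line)] = line

main = make_buffer(8, 6)             # the "real" framebuffer
virtual = make_buffer(4, 3, fill=7)  # a virtual device's render target

blit(main, virtual, x=4, y=3)        # composite into the lower-right corner
```

The real hardware question is whether the GPU can do that redirection and the full state save/restore cheaply enough for the OS to multiplex it.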

  • by ConfusedSelfHating ( 1000521 ) on Wednesday November 08, 2006 @06:16PM (#16776393)
    At least the Xbox 360 was released before it was obsolete. The PS3's graphics processor is similar to the 7800 GTX, if I remember correctly. When the PS3 releases, people won't be saying "Buy the PS3 for the greatest graphical experience"; instead they'll say "Buy the PS3 for the greatest graphical experience, except for the PC you could have bought last week". The PS3 will probably be about as powerful as the 8600 by the time it's released.

    I know I sound very offtopic bringing this up, but many PC gamers also play console games. They will want to compare console graphics to PC graphics.
  • by Vigile ( 99919 ) on Wednesday November 08, 2006 @06:24PM (#16776555)
    http://www.pcper.com/article.php?type=expert&aid=319 [pcper.com]

    This review looks at gaming and such too, but also touches on NVIDIA CUDA (Compute Unified Device Architecture), which NVIDIA hopes will bring supercomputing into mainstream pricing. What thermodynamics programmer wouldn't love access to 128 1.35 GHz processors for $600?

    http://www.pcper.com/article.php?aid=319&type=expert&pid=5 [pcper.com]
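The sort of workload CUDA targets is anything embarrassingly data-parallel. To stick with the thermodynamics example: one explicit heat-diffusion step updates every cell independently of the others, so each cell update could map onto one of those 128 stream processors as its own thread. A plain-Python stand-in (not actual CUDA; the grid and constants are made up):

```python
# One explicit finite-difference heat-diffusion step on a 1-D rod.
# Every interior cell's update depends only on the *previous* state of
# its neighbours, so all updates are independent -- exactly the shape
# of problem a GPU runs as one thread per cell.

def diffuse_step(u, alpha=0.25):
    """Return the next state; boundary cells are held fixed."""
    return [u[0]] + [
        u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

u = [0.0, 0.0, 100.0, 0.0, 0.0]  # a hot spot in the middle
u = diffuse_step(u)
print(u)  # [0.0, 25.0, 50.0, 25.0, 0.0] -- heat spreads to the neighbours
```

In CUDA the list comprehension would become a kernel launched over the grid, with each thread computing one cell.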
  • Re:Yeah, but... (Score:1, Interesting)

    by allometry ( 840925 ) on Wednesday November 08, 2006 @06:43PM (#16776861)
    Not to troll, but in using Linux, I've never seen the need for such a card.

    Is anyone actually using this card under Linux who can give me a reason? I'm simply curious, that's all...
  • Re:Yeah, but... (Score:1, Interesting)

    by Anonymous Coward on Wednesday November 08, 2006 @07:15PM (#16777299)
    "Sure, they're beta, if you want to be picky about it. Probably works just fine - their last beta drivers did."

    Maybe they did for the majority, but they sucked ass for some of us with more esoteric systems.

    For example, all the 7xxx and up cards have dual-link DVI transmitters built into the chipset - it is not an option. Yet if the driver had problems parsing the EDID information from the monitor, it assumed the transmitters were single-link and misprogrammed them as such, ensuring that they would not work at all. No amount of configuration options could force the dual-link behaviour if the EDID information was unreadable (for example, if you have a uni-directional DVI-over-fibre extender, or your monitor's EDID voltage level is just a tad below spec).

    Their only Linux support - the informal, unofficial kind provided via a forum at nvnews.com - was seriously lacking; it wasn't even half-assed. All they could do was follow a script, and when you got to the end of the script without a solution, they just stopped responding.

    ATI's drivers are just as closed, so I won't be buying from either in the future unless they open up. It's too bad that Intel's fully open graphics are motherboard-only.

    At least I won't lose out on all the fancy-dancy MPEG-4 decode acceleration the Nvidia card can do - their official Linux drivers don't support it either. For me, that makes even the 8800 cards just about on par with the Intel offerings.
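For context on the dual-link complaint above: single-link DVI tops out at a 165 MHz pixel clock, so a high-resolution mode like 2560x1600@60 (roughly a 268 MHz pixel clock with reduced blanking) only works if the driver programs the dual-link transmitter. A sketch of the decision the driver should be making when EDID is unreadable (the 165 MHz limit is from the DVI spec; the function and figures are illustrative, not driver code):

```python
# Single-link DVI carries at most a 165 MHz pixel clock (DVI 1.0 spec);
# anything above that requires the dual-link transmitter. The EDID bug
# described above amounts to defaulting this decision to single-link.

SINGLE_LINK_MAX_MHZ = 165

def needs_dual_link(pixel_clock_mhz):
    """True if the mode's pixel clock exceeds the single-link limit."""
    return pixel_clock_mhz > SINGLE_LINK_MAX_MHZ

print(needs_dual_link(162))  # 1600x1200@60 -> False, single link is fine
print(needs_dual_link(268))  # 2560x1600@60 -> True, dual link required
```

Which is why a 30" panel simply goes dark when the driver guesses wrong.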
  • by S3D ( 745318 ) on Thursday November 09, 2006 @01:47AM (#16780899)
    Looks like DirectX 10 functionality (the unified/geometry shaders and the like) will be available in the NVIDIA drivers very soon. The entry points for the new OpenGL extensions are already present in the driver nvoglnt.dll (96.89), including:
    GL_NV_geometry_shader4
    GL_NV_gpu_program4
    GL_NV_gpu_shader4
    plus new Cg profiles. All we need now is the header file.
    Chances are, DirectX 10-like functionality will arrive in OpenGL before Vista ships: one more reason to switch from DirectX to OpenGL. It will also be at least a couple of years before the majority of gamers move to Vista, but with OpenGL, developers can use the latest GPUs to their full potential on Windows XP.
    More about it in this thread on OpenGL.org:
    http://www.opengl.org/discussion_boards/ubb/ultimatebb.php?ubb=get_topic;f=3;t=014831 [opengl.org]
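Until headers ship, an application can at least probe for those entry points by checking the runtime extension string. A minimal sketch of the check (the extension list below is a hard-coded example, not queried from a real driver):

```python
# OpenGL advertises extensions as a space-separated string (the result
# of glGetString(GL_EXTENSIONS)); an app tests for a feature by looking
# for the exact token before trying to load its entry points.

def has_extension(extensions_string, name):
    # Exact-token match: substring search would be unsafe, since one
    # extension name can be a prefix of another.
    return name in extensions_string.split()

# Hard-coded stand-in for a driver's extension string:
exts = "GL_NV_geometry_shader4 GL_NV_gpu_program4 GL_NV_gpu_shader4"

print(has_extension(exts, "GL_NV_gpu_shader4"))   # True
print(has_extension(exts, "GL_EXT_gpu_shader4"))  # False
```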
