Nvidia Launches 8800 Series, First of the DirectX 10 Cards 149

Posted by ScuttleMonkey
from the by-popular-demand dept.
mikemuch writes "The new top-end GeForce 8800 GTX and GTS from Nvidia launched today, and Loyd Case at ExtremeTech has done two articles: an analysis of the new GPU's architecture, and a benchmark article on PNY's 8800 GTX. The GPU uses a unified scalar-based hardware architecture rather than dedicated pixel pipelines, and the card sets the bar higher yet again for PC graphics." Relatedly an anonymous reader writes "The world and his dog has been reviewing the NVIDIA 8800 series of graphics cards. There is coverage over at bit-tech, which has some really in-depth gameplay evaluations; TrustedReviews, which has a take on the card for the slightly less technical reader; and TechReport, which is insanely detailed on the architecture. The verdict: superfast, but don't bother if you have less than a 24" display."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • WOW! This is FAST! (Score:4, Insightful)

    by Salvance (1014001) * on Wednesday November 08, 2006 @05:24PM (#16775453) Homepage Journal
    It's actually pretty surprising that the DX10-compatible 8800 runs $450-$600 given it's brand new and has huge performance gains over NVidia's current cards. I don't understand why someone would say only buy it if you have a 24" monitor though ... it seems like buying a single 8800 would be just as good as (and cheaper than) buying a couple of 7800's ...
    • Re: (Score:3, Informative)

      by Ant P. (974313)
      What they're saying is that if you're only ever going to go up to 1600x1200, this card is just going to waste, drawing more frames than your monitor can ever display. Right now it looks like the only thing that could strain this card is one of those huge Apple LCDs.
      • by Kenja (541830)
        7800gtx and dual AMD Opterons.

        Modern games still don't run at optimal frame rates at 1280x1024 with max graphics settings. The most recent of these is Neverwinter Nights 2, where I get around 20fps, which is enough, but I wouldn't mind it being a bit smoother.
        • Re: (Score:3, Informative)

          by JohnnyBigodes (609498)
          Apparently Neverwinter Nights 2 has some sort of problem in it; it's *very* slow for some people with reasonably fast PCs. I've tried it and it also runs almost unbearably slow with everything set to medium and a couple of lows (1024, no AA) on a 7800.

          20fps with your 7800GTX in NWN2 is certainly not acceptable :)
          • by ivan256 (17499)
            Try turning off shadows. They seem to cause the problem... Runs smooth as silk for me with a 7800GTX at 1680x1050 with everything else at max. If I turn on shadows I get 10 or so FPS.
      • by steveo777 (183629) on Wednesday November 08, 2006 @06:04PM (#16776159) Homepage Journal
        Just play Ultima IX in 1024x768 mode without any of the fixes or patches. I do believe the 8800 will have met its match. (I never met a configuration that could run it over 10fps, except my friend's old 650MHz PIII with some VooDoo card or another, which ran it at 19fps.)
      • by mikael (484)
        But if you are doing general purpose computing that requires considerable floating-point performance (FFT signal processing, dynamic systems), then you won't be restricted by the refresh rate of the monitor. Both DirectX and OpenGL support floating-point framebuffers. However, for some simulations, you may have fewer than four floating-point variables per pixel. So just by using three out of four pixel channels, you are wasting 25% or more of your processing time. Having scalar processors would seem to
      • by Barny (103770)
        Hrmm, well, I'm running a pair of 7900GT cards atm, and in Company of Heroes, with all settings maxed at 1920x1200 (Dell 24"), things get a little chunky. So yes, other than Apple's there are screens that need these.... mine's pre-ordered btw :)
        • Re: (Score:3, Funny)

          by Sj0 (472011)
          You know, I remember being impressed that Duke Nukem 3D ran at 640x480. The point where you need 1920x1200 AND anti-aliasing is the point where you're just doing it to make fun of the people without a GeForce 8800.
      • Re: (Score:3, Insightful)

        by Babbster (107076)
        It depends on the game. In the [H]ardOCP review [hardocp.com], this appears to be the first card that can do Oblivion with maxed in-game settings (the grass has been the problem area in the past, even with top-of-the-line cards) at very high resolutions and high AA settings while retaining solid framerates - the settings they considered ideal in their testing were 8x AA at 1600x1200 and 4x AA at 1920x1200. That would be impressive for a SLI setup, let alone a single card.

        How worthwhile that is depends, of course, on
      • Re: (Score:2, Insightful)

        by Handpaper (566373)
        Ahem.

        Section "Monitor"
        Identifier "Sun GDM-5410"
        HorizSync 30-122
        VertRefresh 48-160
        Modeline "2048x1536@72" 472.89 2048 2080 3872 3904 1536 1565 1584 1613
        EndSection

        'The old that is strong does not wither' :)

      • huge Apple LCDs.

        Well of course only an "Apple" monitor could ever have these HIGH levels of capabilities...

        Is everyone here AppleStupid? - Or maybe the term is MacInTard?

        My freaking laptop has a 1920x1200 LCD, and it is at the bottom of the list of displays in my HOME let alone at my office. Even my old 2002 Toshiba Laptop has a 1600x1200 LCD display.

        I suppose people are going to go all crazy and start saying that they play games on PCs above 1024x768 next. Oh my, the insanity, how could this be possible?

        Ge
  • DNF! (Score:4, Funny)

    by spacemky (236551) * <nick@ar y f i .com> on Wednesday November 08, 2006 @05:28PM (#16775533) Homepage Journal
    I heard somewhere that this will be one of the only supported video cards in Duke Nukem Forever.

    *ducks*
    • I'm sure a lot of people are going apeshit with excitement over this card, but in a year everyone will have forgotten about it, because the next big thing will have been released by then. The 7900 was released in March 2006, for fuck's sake, and now everyone is talking about the 8800. People don't even realise that they'll be able to use the 7900 for several years before it gets outdated. Then again, I'm so amazingly oldschool that I use 1024x768 with no AA or AF (=useless gimmicks), so what do I know. I ha
    • Re: (Score:3, Funny)

      by EvilIdler (21087)
      You're half-right; it will be the oldest supported card.
  • another review (Score:4, Informative)

    by brunascle (994197) on Wednesday November 08, 2006 @05:29PM (#16775547)
    Hot Hardware has another review [hothardware.com]
  • by MojoKid (1002251) on Wednesday November 08, 2006 @05:29PM (#16775551)
    NVIDIA has officially launched their new high-end 3D Graphics card that has full support for DX10 and Shader Model 4.0. The GeForce 8800 series is fully tested and showcased [hothardware.com] at HotHardware and its performance is nothing short of impressive. With up to 128 Stream Processors under its hood, up to 86GB/s of memory bandwidth at its disposal and comprised of a whopping 681 million transistors, it's no wonder the new GPU rips up the benchmarks [hothardware.com] like no tomorrow. NVIDIA is also launching a new enthusiast line of motherboard chipsets in support of Intel's Core 2 Duo/Quad and AMD Athlon 64 processors. NVIDIA's nForce 680i SLI and nForce 680a SLI motherboard chipsets [hothardware.com] will also allow a pair of these monster graphics cards to run in tandem for nearly double the performance and the new chipset offers a ton of integrated features, like Gigabit Ethernet Link Teaming etc.
  • ... you can get reasonable framerates with NeverWinter Nights? :)
  • Yeah, but... (Score:2, Informative)

    by cp.tar (871488)

    ... does it run Linux?

    Seriously... when are the Linux drivers expected?

  • 24" monitor? (Score:3, Insightful)

    by BenFenner (981342) on Wednesday November 08, 2006 @05:36PM (#16775669)
    So this will benefit my 13' projected monitor running at 1024 x 768 resolution (60 Hz refresh), and not my 20" CRT running at 1600 x 1200 resolution (100 Hz refresh)?

    You don't say...
  • Now we need Duke Nukem Forever to really put this baby to work.
  • by plasmacutter (901737) on Wednesday November 08, 2006 @05:38PM (#16775697)
    Seriously... last I checked, certification for logo testing and DX10 required DRM... not just DRM, but enough lockdown to get Hollywood to sign off on it.

    They kept changing the standards over and over... so the question is exactly what is required in terms of integrated DRM.
    • by Firehed (942385)
      That's the FUDdiest post I've read in a while. Back that up with at least a vague reference to anything.

      HDCP support being required wouldn't surprise me, but that's not so much DRM as a stupid thing to try and make you buy a new monitor. But in either case, won't affect gaming whatsoever, or legal content for quite some time (the ICT isn't likely to be enabled before 2010-2012). I doubt it's required anyways, just highly recommended.
    • by jonwil (467024)
      The DRM will probably consist of two things:
      1. HDCP support, so that if the software layer requests it, the data will only be sent to devices that are approved (i.e. those that correctly talk HDCP);
      and 2. Support on the software side, so that the drivers (or at least one version of them) will be able to prevent screen-scraping programs (think FRAPS etc.) and other hacks from reading graphics data back if the application and OS request "protected media".

      Basically, unless you are actually using software and
  • Finally... (Score:2, Funny)

    by Anonymous Coward
    ...something that can run Vista Aero with 5 stars!!!
  • The folks at Boot Daily take a peek at MSI's GeForce 8800GTX and run it through quite a few benchmarks and discuss its visual qualities.
  • by TheRaven64 (641858) on Wednesday November 08, 2006 @05:51PM (#16775915) Journal
    I was under the impression that one of the major advantages of DirectX 10 was that it supported virtualisation. This means the device needs either to be able to save its entire state to main memory (for switching) or, ideally, to easily produce virtual devices that render to a texture rather than the main screen (for picture-in-picture).

    TFA didn't seem to mention anything about this though. Can anyone comment?

    • by rzei (622725)

      Not that I'm a developer or really know anything about the implementations of today's graphics cards, but I think off-screen rendering has been supported for some time.

      For example, the game F.E.A.R. [whatisfear.com] made use of, among other things, off-screen rendering, or rendering straight to texture, when multiple surveillance cameras were rendered on a monitor in the game world.

      The way I see it, the chip itself doesn't have to know so much about how many tasks are using it, it's the drivers or perhaps even higher level softw

      • by pilkul (667659)
        Indeed. Less obviously, many games render to a texture in order to apply full-screen effects (e.g. your entire vision getting blurry when you are damaged) on them before sending to the screen.
    • Re: (Score:3, Informative)

      by TychoCelchuuu (835690)
      Right now, although the card supports DX10, all your games and Operating Systems are in DX9. Until Vista comes out you're not going to see anything taking advantage of any neato DX10 doodads.
  • http://enthusiast.hardocp.com/article.html?art=MT I xOCwxLCxoZW50aHVzaWFzdA== [hardocp.com]

    http://www.tomshardware.com/2006/11/08/geforce_880 0/ [tomshardware.com]
    Although the Tom's article is pretty worthless, as most benches are CPU-bound with an FX64 CPU.

    my favorite has to be this page, 8800 GTX SLI/3.80GHz Core 2 Duo SLI
    http://www.extremetech.com/article2/0,1697,2053791 ,00.asp [extremetech.com]
  • Any coincidence that they launch the first DX10 card the same day that Vista goes gold [neowin.net]?
  • For those of you who are interested in what the [H] has to say about this card..here is the direct link:

    BFGTech GeForce 8800 GTX and 8800 GTS [hardocp.com]

    Today marks the announcement of NVIDIA's next generation GeForce 8800 series GPU technology code named "G80." We have two new hard-launched video cards from BFG Tech representing the 8800 products. Gameplay experience TWIMTBP?

    I found their review to be of typical [H] quality, which I think is pretty decent (when compared to other H/W review sites, that is ;)

    -

  • by scombs (1012701)
    Is this SLI capable? Not that I would be able to pay for even one of the cards...
    • Re: (Score:2, Informative)

      by noSignal (997337)
      From nvidia.com:

      Q: Do the new GeForce 8800 GTX and GeForce 8800 GTS GPUs support SLI technology?

      A: Yes. All GeForce 8800 GPUs support NVIDIA SLI technology.

  • by ConfusedSelfHating (1000521) on Wednesday November 08, 2006 @06:16PM (#16776393)
    At least the Xbox 360 was released before it was obsolete. The PS3 graphics processor is similar to the 7800 GTX if I remember correctly. When the PS3 releases, people won't be saying "Buy the PS3 for the greatest graphical experience"; instead they'll say "Buy the PS3 for the greatest graphical experience, except for the PC you could have bought last week". The PS3 will probably be about as powerful as the 8600 when it's released.

    I know I sound very offtopic bringing this up, but many PC gamers also play console games. They will want to compare console graphics to PC graphics.
    • It's really not fair to expect a $500 console to have the same graphics as a $2,000 PC. For mainstream gamers, PS3 will probably compare favorably to a PC when it comes out.
      • Isn't the good PS3 package selling for more like $600?

        The 8800 GTS is around $450. You can easily get the rest of a solid gaming PC for $450. So you're talking more about a $900 PC than a $2,000 PC. And that's before pricing a decent HDTV vs. a decent gaming monitor.

    • by Elladan (17598)
      Graphics processors aren't really as important for a console system as for a PC, though, since consoles target an output device with a typical resolution of 640x480 at 60 frames per second at most (and more like 30 in most cases). Sure, a few people might have HDTV, but not many.

      Plus, the PS3 has a herd of vector coprocessors to assist the video engine. I don't think anyone is going to not buy a PS3 because it fails to meet some artificial benchmark in the lab. They're going to complain that it costs a hell of
    • by PitaBred (632671)
      Consoles can tune games a LOT more than PCs, because the hardware is completely standard. They can do tricks and optimizations with rendering and such that you couldn't reliably expect to work on Joe Blow's random PC. The console still isn't out of the game.

      Besides, the video card you can buy costs as much as a whole PS3. The PS3 is still better bang for your gaming buck. Either way, I'm ok with my Go7600 in my laptop and I'm gonna get a Wii, so y'all can go do your own thing when posturing about gam
      • Re: (Score:3, Insightful)

        by Sj0 (472011)
        You CAN, but I've found almost universally that they don't. The game development cycle is too tight, multi-platform compatibility is too important, and codebases are simply too large to justify optimizing the living hell out of the code you've got.

        And the new gaming PC I'm building costs less than the PS3, and other than perhaps 100 bucks for the chibi version of this monster when it comes out, I don't expect to have to do much to keep the system I'm building competitive with the PS3 in terms of playing a b
      • by Slothy (17409)
        This is true. Essentially, you figure out how many milliseconds of rendering time you can afford (depending on whether you do 30 or 60fps), and then work backwards to see what you can enable. So you can do all sorts of tricks to take advantage of the exact hardware to hit your target framerate. On PC, you can't really do much, because there are so many cards and then new drivers come out so regularly and change performance, you just try to have a really flexible engine and let users turn options on or of
    • (Especially as I find Sony a bunch of asshats), but...

      An 8800 GTX is how much, exactly? A PS3 is 550$ CDN. How many PC games will use DX10? 10 games? AFAIK, Halo 1, Half-Life 2, etc, aren't magically DX 10 games since they were written for previous versions of the DX API.

      Will SquareEnix be writing PC versions of Final Fantasy XII? X-2?

      Cost wise, these cards and the PS3 are close. Game wise, I suspect the PS3 will have more games out than there will be DX10 games. The DX10-Vista lock in is another dis
    • by asuffield (111848)

      I know I sound very offtopic bringing this up, but many PC gamers also play console games. They will want to compare console graphics to PC graphics.

      Have you ever played PC games and console games? Console graphics have always SUCKED DONKEY compared to PC graphics. The PS2 had the most advanced graphics processor around when it was released... but the output resolution was 320x200 (because that's about what a TV uses), so it really didn't matter a damn.

      Nobody sane has ever expected decent graphics from a co

  • now we can finally watch pr0n at over 1000fps!
  • by Vigile (99919) on Wednesday November 08, 2006 @06:24PM (#16776555)
    http://www.pcper.com/article.php?type=expert&aid=3 19 [pcper.com]

    This review looks at gaming and such too, but also touches on NVIDIA CUDA (Compute Unified Device Architecture), which NVIDIA hopes will bring supercomputing into mainstream pricing. What thermodynamics programmer wouldn't love access to 128 1.35GHz processors for $600?

    http://www.pcper.com/article.php?aid=319&type=expe rt&pid=5 [pcper.com]
  • Isn't this the tail wagging the dog? Shouldn't the video card industry have hardware API standards and shouldn't the software vendors be releasing stuff compatible with the hardware?

    "DirectX 10 Cards" sounds as silly as saying "Vista compatible PC BIOS". WTF?

    • Re: (Score:3, Insightful)

      by TheRaven64 (641858)
      The biggest difference between DirectX and OpenGL is the extension mechanism. OpenGL specifies a set of features which must be implemented (in hardware or software), and then allows vendors to add extensions. These can be tested for at runtime and used (and the most popular ones then make it into the next version of the spec). DirectX doesn't have a comparable mechanism; the only features it exposes are those that the current version of the API dictates.

      In their rush to get a chunk of the big Windows mar

      • Re: (Score:3, Informative)

        by Quasar1999 (520073)
        Oh please!

        It's called the CAPS structure, and DirectX has had it for as many versions as I can remember. You check to see what capabilities the card supports and decide what features you'll use. The OpenGL extensions are the same damned thing, except there you enumerate a big string list, while on the DirectX side you have all extensions visible and most available in software emulation mode, with the CAPS structure telling you what is hardware accelerated.

        Besides, how do you think pixel shaders and
    • Seriously, where have you been for the last 10-15 years, and were you somehow under the impression all this time that OpenGL, DirectX 3-9 and their predecessors were "hardware API standards"? The only difference in this respect between DirectX 10 and earlier versions is that DX10 doesn't attempt to provide backward compatibility for older hardware, so you'll need an explicitly DX10-compatible card in order to take advantage of DX10 rendering paths.
    • by drsmithy (35869)

      Isn't this the tail wagging the dog? Shouldn't the video card industry have hardware API standards and shouldn't the software vendors be releasing stuff compatible with the hardware?

      Sure, if you want to go back to the bad old days of games only supporting a small number of very specific video cards.

  • Or a Matrox Triplehead2Go [matrox.com]. A 24" panel is only a little over 2 million pixels. Three 1280x1024 panels are almost 4 million pixels. And you can get a TH2G plus three 17" or 19" panels for significantly less than a 24" panel.

    Is anyone testing these video cards in 3840x1024 yet?

    • by kjart (941720)
      Is anyone testing these video cards in 3840x1024 yet?

      I don't think many games would support that resolution (I'm assuming that's what you're referring to when you say testing). Also, would you even want to play a game across multiple displays? For one thing, I'd imagine the bezel around each monitor would get annoying.

  • Forget the review; what catches my eye here is the term "DirectX 10 Card." The very idea that it's categorized by limited software compatibility, rather than categorized by the type of hardware slot that it uses, is a new idea to me.

    I can see a huge upside to it, though. As a time-saver, I would love for the amount of "closedness" to be how hardware gets categorized, so that I could just shop from the "open and compatible with everything" category instead of having to do research along the lines of "is t

    • by Nicolay77 (258497)
      It runs all the DirectX 7, 8 and 9 games with amazing framerates.
      It just happens to be the first to be able to run DirectX 10 games too.
    • Generally speaking, DirectX and new graphics hardware are closely tied. MS works with the card makers to find out what they are planning for next-gen hardware and to let them know what DirectX will demand. Cards are built around that. So the DirectX version of a card becomes a useful way of talking about its features. For example, DX7 cards had hardware T&L units, DX8 cards programmable T&L units and pixel shaders, and DX9 cards fully programmable shaders (among other changes).

      Well DirectX 10 makes more ch
  • is this what i need to be able to run vista?
  • I am planning to buy a GeForce 7600GT, a card that gives me the framerates and resolutions I want with a very small price compared with the high end cards. Also because a more expensive card would be bottlenecked by my CPU so it would be a waste.

    However, I now want a card with the same price and watt requirements as the 7600GT but with the G80 chipset (the one inside these beasts), just because of the better anisotropic filtering and antialiasing.

    So... how long until we have the mid end versions of thi
  • by Sj0 (472011)
    Anyone using qbasic, yes.

    These are just video games. At some point, you're seriously facing diminishing returns for your $1200USD SLI 8800 rig.
  • by SEMW (967629)
    No, you don't need a DX10 card to run Vista. You need a DirectX 9 card with 128 MB of video RAM for Aero Glass, or any old 2D chip for Aero Standard, Aero Basic, Classic, etc.
  • by Sj0 (472011)
    Actually, I reserved a new video card this week. I decided on something with a bit less oomph than this, though. Between my GeForce MXes and my super high end video cards, I've found that it's better to buy the cheap video card that'll last you maybe a year or two than to go all-out and get maybe a year or two out of it. (Hey, it's not raw power that necessitates upgrades, it's Pixel Shaders 4.1, right around the corner!)
  • Power consumption (Score:4, Insightful)

    by MobyDisk (75490) on Wednesday November 08, 2006 @08:58PM (#16778411) Homepage
    Dual power connectors, yeeeha! Video card manufacturers really aren't doing much about idle power consumption. 66 watts at idle just to display a static frame buffer. I can't imagine what will happen running Vista w/ Aero glass. I bet people's power consumption numbers will double.
  • Holy crud! I misread that: It is 220 WATTS AT IDLE! [bit-tech.net] The idle TEMPERATURE in deg C is the 66.
  • It's been some months since I last saw the relevant articles (they were on the EFF's Trusted Computing repository and in places like freedom to tinker), but I'll try to bring what stuck in my mind here:

    AACS copy protection on the new generation HD video media has invasively strict requirements, such as encryption of the video path within the system itself to prevent "sniffing" attacks, which means either the hardware itself or the drivers constitute a form of DRM. Any way I look at that encrypted media pa
  • by friedmud (512466)
    Indeed!

    With 768MB of RAM you might actually be able to run Beryl and open up _10_ windows before they start going black! :-P

    For those who have no idea what I'm talking about look here:

    http://www.nvnews.net/vbulletin/showthread.php?t=7 7248 [nvnews.net]

    My workstation at school is a turbo-cache Quadro card with only 128MB of RAM... which means I can only open a couple of windows before they start going black... sigh.

    Friedmud
  • The Resident Evil games on the GameCube (1, 0 and 4) all look really nice, even today.
  • It's called the CAPS structure, and DirectX has had it for as many versions as I can remember

    It's not the same thing at all. The CAPS structure allows you to enumerate which subset of DirectX the card supports. OpenGL extensions allow you to query for features which were not part of the original specification. This is exactly the difference I was describing. You can't use DirectX 9 features from a DirectX 8 application, but you can use OpenGL 2.0 features from an OpenGL 1.0 application by accessing

  • by S3D (745318) on Thursday November 09, 2006 @01:47AM (#16780899)
    Looks like DirectX 10 functionality - unified (geometry) shaders and the like - will be available in the NVIDIA drivers very soon. It seems the entry points for the new OpenGL extensions are already present in the driver nvoglnt.dll (96.89), including
    GL_NV_geometry_shader4
    GL_NV_gpu_program4
    GL_NV_gpu_shader4
    and new Cg profiles
    All we need now is the header file.
    Chances are, DirectX 10-like functionality will be available through OpenGL before Vista. Another reason to switch to OpenGL from DirectX. Also, it will be at least a couple of years before the majority of gamers switch to Vista, but with OpenGL, developers can utilize the latest GPUs to their full potential on Windows XP.
    More about it in this thread from OpenGL.org:
    http://www.opengl.org/discussion_boards/ubb/ultima tebb.php?ubb=get_topic;f=3;t=014831 [opengl.org]
  • by dusanv (256645)
    January at the earliest. The really sweet ones (cheap and low power) will come in the spring when they do a die shrink to 45 nm.
  • by Slothy (17409)
    Your first paragraph is very false.
  • by Slothy (17409)
    What methods did you use to determine that nobody tunes their game for the hardware? Can you tell if it's CPU bound, fillrate bound, vertex bound, memory bound, etc? That sounds like a very hard-to-prove statement.

    I would submit Halo as an example of a game that shows just how much it was optimized for the target hardware. The PC version chugs on much more powerful hardware.

    And frankly, you HAVE to tune your game for each platform, it's not optional. Platforms have different capabilities (pixel shader s
  • by modeless (978411)
    OpenGL extensions are not nearly the same thing as DirectX caps. Vendors can't add new DirectX caps; they can only choose to implement, or not, the caps that Microsoft defines. For an example of why this sucks, consider geometry instancing. Microsoft decided in their infinite wisdom that only Shader Model 3.0 cards could have geometry instancing support. Then when ATI produced some Shader Model 2 cards with instancing, they had to disable it by default and use a terrible hack to enable it involving callin
  • by Petrushka (815171)
    ... you may laugh (go on, I know you want to) but I managed to have fun with NWN1 on a 2001 Toshiba Satellite laptop with 256 MB RAM, 48 MB shared graphics memory, and no hardware acceleration, at 800x600, with almost everything turned down. It even survived the siege scenario in HotU. I think I'd cough up hairballs if I tried that now, mind you.
  • by Petrushka (815171)
    DirectX 10 requires Vista, which comes with five trillion kinds of DRM built in. Isn't that enough?
  • by Kris_J (10111) *
    A huge collection of games support that resolution (with some bugs in some cases), and I very much enjoyed playing WoW with a wrap-around image until I quit (for an unrelated reason). Triplehead is not dualhead -- you're not staring at a join, you're typically focused on the centre screen and you use the side screen for extra warning, or space for secondary information.

    Though by "testing", I simply mean benchmarking cards however the reviews currently benchmark cards, with the benefit of actually taxing th

  • by zoney_ie (740061)
    > AA or AF (=useless gimmicks)

    I have a 20" screen. Even at its native resolution of 1600x1200, pixels are quite large. Turn off AA, and particularly AF, and any game looks pretty poor (jagged edges and un-merged texture edges look abysmal when magnified).

    It looks particularly awful if my poor vanilla 6800 won't let me do more than 1024x768 for a game (e.g. the monster that Bethesda created which is Oblivion); this generally looks awful scaled to fill the screen (esp. with AA low or turned off). If I'm usi
  • Yeah, games are fine & all that, but I'm just happy for gamers to bring the economies of scale down for this-here plug-in supercomputer.

    GPGPU is what will really make it stand out. Physics acceleration, Folding@Home, ray-traced audio, ray-traced window managers, fluid-simulator window managers, film-level 2D pixel processing (my field), realtime H.264 decode & encode... I'm just scratching the surface. High performance computing just got a whole lot cheaper.

    Cell? Never heard of it.

  • by mgblst (80109)
    Yeah, that is why I drive a bus to work, even though I live on my own and don't carpool.

    That is why I live in a 10-bedroom, 3-bathroom mansion.

    That is why I have 6 monitors at work, even though I just play solitaire.

    Because the opposite of 640k is enough for anybody means having as much as possible no matter the need.
  • Had heard of that, already did, didn't help, unfortunately :(

    Well, hoping for a patch.
  • Er, the PS2 was not the most advanced graphics processor when it was released, but despite that it typically rendered at 480i (720x480, interlaced). It manages to look quite good, check out SkyGunner, Okami, or Gradius V sometime. I know I've seen better looking games on the PS2 than Neverwinter Nights 2, and unlike NWN2, they ran smoothly.
  • We've been screwed by Microsoft again. Here's an excerpt from my blog, "After a brief phone to ATI, I was informed that both the X1800 and X1900 series cards will support the new Dx10 standard. A quick pop over to PriceWatch shows that the cheapest X1800 card is $176 and the cheapest X1900 card is $203. Now, since I'm getting the whole rest of the box for $667, I find this to be a bit steep. Hopefully, Nvidia will have something under $150." Sadly, that's not the case. This is the ONLY card from Nvidi
