Hardware

nVidia GeForce 2 Ultra Unveiled

Jacek Fedorynski was one of the folks who wrote in with a huge slew of reviews of the new GeForce 2 Ultra - starting with Hot Hardware, moving to RIVA 3D, heading to 3DGPU, and ending with FiringSquad and nV News, and probably all the other sites as well.
  • web sites split content . .

  • If they've gotten the drivers right under Windows, why not Linux?

    Ranessin
  • I think you're underestimating the X-Box by quite a bit.

    Actually, it won't be 10x more powerful than the PS2. More like twice as powerful.

    It's less a matter of "the X-Box is n times as powerful" and much more a matter of the programming paradigm the consoles are best suited for. The PS2 is impressive in terms of pure FP throughput, memory bandwidth, etc. Where it suffers is in its lack of high-level programming tools and a serious dearth of onboard RAM, especially video RAM. Specifically, the PS2 has 32 MB of main memory, but only 4 MB of video memory. The only good thing is that the two are connected by a high-bandwidth (3.2 GB/s) bus. Compare this to the typical setup on a current high-end PC game: a whole bunch of main RAM, 32 MB of video RAM, but a low-speed (AGP is a bad joke) bus connecting the two. This means that a whole generation of programmers who learned to program 3D games according to the current PC model--that is, "just load all the data you need for a level into video RAM at the start, and work out of video RAM as much as possible"--have to figure out a completely different way of doing things. No matter how good they are, they'll never get around the fact that 4 MB just ain't enough to hold detailed textures, not to mention your z-buffer, framebuffer, etc. The obvious solution--find some way to store some of that stuff in main memory and access it only when needed--ends up requiring a totally different approach to the rendering process. Happily enough, each developer has to roll their own solutions to these problems, in a foreign assembly language, because high-level libraries for the PS2 are, at this point, almost nonexistent.
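
    A quick C sketch of the arithmetic behind this (the 60 fps target and the 1 GB = 1024 MB convention are assumptions, not the poster's): the 3.2 GB/s bus can refill all 4 MB of video RAM roughly a dozen times per frame, which is exactly why a streaming renderer is viable at all.

      /* Back-of-the-envelope check of the PS2 numbers quoted above.
         Assumed: 60 fps target, 1 GB = 1024 MB. */
      #include <stdio.h>

      int main(void)
      {
          const double vram_mb = 4.0;   /* PS2 video memory            */
          const double bus_gbs = 3.2;   /* main-to-video bus, in GB/s  */
          const double fps     = 60.0;  /* assumed frame rate target   */

          double mb_per_frame = bus_gbs * 1024.0 / fps; /* transfer budget per frame   */
          double refills      = mb_per_frame / vram_mb; /* full VRAM refills per frame */

          printf("Per-frame transfer budget: %.1f MB\n", mb_per_frame); /* ~54.6 */
          printf("Full VRAM refills per frame: %.1f\n", refills);       /* ~13.7 */
          return 0;
      }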

    The result? Developers, especially all those PC developers looking to make the jump to the much-higher-volume world of console gaming, are none too enthusiastic. Meanwhile, despite the very impressive potential of the PS2, the first generation of games is so badly programmed that they barely look any different from games for the Dreamcast. In many cases, they look worse (after all, at least the DC has 8 MB of video RAM).

    Contrast the X-Box. It has a 64 MB unified memory architecture. Not so coincidentally, by the time it comes out, most decent PC 3D cards will have...64 MB of memory. (The fact that the X-Box's 64 MB will also have to hold the game code, etc. is not as large a problem as you think--indeed, most in-game code can fit into a CPU's cache.) The X-Box can access this RAM across a 6.4 GB/s bus--twice that of the PS2, and better than any current PC card (except the new GeForce2 Ultra).

    Best of all, the X-Box will work with both 3D gaming API's around for the PC: DirectX and OpenGL. While one might have some complaints about the peculiarities of these API's, they are many many many many times better than any API's available for consoles, much less the PS2. Plus, their use makes porting any PC game to the X-Box a snap.

    Developers, by the way, are much more enthused about X-Box than PS2 these days...

    In the same situation, the original Playstation whooped Nintendo's ass even though the N64 was more than 3X faster, and featured stuff like texture filtering and full screen anti-aliasing (which PSX doesn't support.) Of course, Nintendo actually knew the console industry, and knew how to deliver a simple, stable, easy to use product

    I disagree. The N64 failed because it used cartridges instead of CD's. For one thing, this meant that its superior processing power was wasted on games with very low detail textures, and thus that its graphics looked no better than those on the "underpowered" PS. For another, it meant that Nintendo could continue to pocket a $15 licensing fee/media cost for each game sold, as opposed to about $3 for each PS game. Guess which console's games cost less? Guess which had more developer support?? Nintendo's error was in treating the market as if they still had a monopoly.

    Microsoft is bringing too much PC garbage into the XBox to be able to do that. XBox will not be simple, stable, nor easy to use. It will require updates, and upgrades, and patches.

    How do you know? There is simply no basis for this in fact. None. Zero.

    Meanwhile, the PS2 will need a cable modem peripheral for Internet capability, fast becoming one of the most important aspects of the modern video-game experience. (Built into X-Box.) As history shows us, that is indeed a killer. How many console peripherals have been widely successful, in the history of console games?? None.

    Now, the X-Box does have a hard drive, but it will not be end-user accessible. It is mainly for saving games and caching purposes. If used as MS says it will be used, it is only an asset.

    And the X-Box uses a stripped-down version of the Win2k kernel. So? Has anyone shown W2k to be any less stable than, say, Linux running a GUI? Didn't think so. Is there any reason to think Embedded W2k won't be as stable as, say, Embedded Linux? No.

    Nothing against MS, it's just that it doesn't understand the industry.

    Precisely what was said about Sony before the PS. Sony made great strides to study the industry and not presume to attack it the same way it had in other industries, and from all appearances so has MS.

    None of this takes into account the fact that Playstation 2 is coming out more than a year earlier. As history shows us, that's a killer. PSX whopped Nintendo largely due to the fact that N64 came out more than a year later.

    Uh...the Sega Saturn came out before the PS. It failed miserably. Why?? Funny you should ask. Seems it was too difficult to program for...

    Lastly, Sony is a lot bigger than Microsoft, and they have more developer support and consumer mindshare.

    False, false, and debatable.

    MS is going to be eaten for lunch.

    Don't be so sure.
  • Quake 3 runs better with SMP turned off, believe it or not...

  • Funny, but I had no problems installing the Voodoo3/5 DRI drivers from source. Hate to break it to you, but the problem is *not* me.

    Ranessin
  • by Rombuu ( 22914 ) on Tuesday August 15, 2000 @11:21AM (#853155)
    I dunno, I just question why people like nVidia so much...

    Because they make good hardware at a good price with great Windows drivers.
  • Yes, that is my idea of support. They give us the tools we need to make the best possible driver for the Matrox G400. A driver which, I might add, works on any platform, with any kernel, and any configuration.

    Contrast that with the fine, fine output of those 100 engineers (although I suspect that their Linux driver "team" numbers in the ones). The NVidia binary-only driver runs only on particular kernel revisions, and does not allow the user to switch between SMP and UP operation. Worse, NVidia could choose to stop supporting Linux at any time, and there's nothing we could do about it. Since we have the specs for the G400, we can support that card forever.

  • If you want to support a company that is behind Linux, get a Matrox G400. Matrox releases the full programming manuals to the XFree development team, and the GLX and DRI drivers for the G400 are fully open-source. The G400 isn't nearly as fast as the GTS Ultra, but the nvidia drivers aren't open source.

  • If they hadn't been released at all, I would not be infinitely more screwed, since they don't work in the first place. What's the difference b/w having non-functioning drivers and not having non-functioning drivers?

    Ranessin
  • BeNews [benews.com] has reported that official, Be-maintained GeForce drivers for BeOS are on their way, courtesy of BeBits. [bebits.com] They're just 2D*, but that's still a damn sight better than VESA or greyscale modes.

    *: Which stands to reason, since accelerated OpenGL is still in the Real Soon Now stage.

    Every day we're standing in a wind tunnel/Facing down the future coming fast - Rush
  • I'm going to hang that on my wall.

    ------
  • Weird, I had no problem back in the Riva128 days. Of course, take a look back then. The main reason there were no incompatibility problems with Voodoo back then was because all games were designed with 3DFx in mind. Of course 3DFx had no problems, because everyone tested on their card. Of course, take a look at what NVIDIA did back then. They brought DirectX6 support and OpenGL ICDs in, putting a big hurt on 3DFx's Glide monopoly. Of course, take a look at the now. NVIDIA drivers have been more or less rock solid since 3.x, they have great OpenGL support, and they release high quality products. Compare this to 3DFx, which released the totally flaky Voodoo3 3500, still doesn't have good OpenGL support, and still is in the back of the pack in terms of features. I'm surprised you didn't go all the way back to the NV1 and the load of crap that was. Of course, NVIDIA has made a huge turnaround. The Riva128 was their first real competitive card (and only their second commercial design.) With the TNT they experienced some growing pains early on, but ever since the driver issues cleared up (I got a TNT several months after they came out, and I've had no problems), the only problem they've had is that their boards are so demanding of current that lower-quality motherboards have problems.
  • The ATI Radeon DDR has both HDTV and DVD support.
  • I wouldn't complain if I could get the 3D working, (with Mandrake). I've tried my Asus V6600 Deluxe with XFree 4.0.1 and 4.0.0, the stock XFree drivers work fine, NVidia's keep blowing things away and making me restart the computer. Seems to be glx problems for me.

    Since they don't work for me, I will complain. Because if part of them was open-sourced, it would probably be installed and running for me.

    Mike Roberto
    - GAIM: MicroBerto

  • I'm using Nvidia's latest official detonator release for a GeForce DDR with SMP-enabled Quake 3, without problems; it's quite smooth, and with modestly high settings I still get between 60-90 fps.

    This is with detonator 2. I can't vouch for any of these "leaked" drivers though.
  • by be-fan ( 61476 ) on Tuesday August 15, 2000 @11:29AM (#853165)
    Yes, I'm pissed at NVIDIA for the Be thing. But given that the company does so much for its users in a day when video card companies are generally down the tube, I forgive them for it.

    NVIDIA is still just dipping a toe into the XFree86 market. But they're doing it whole-heartedly, and the drivers have improved quite a bit.

    As for the time it took to release, remember this. The specs for XFree86 weren't really set that early in the game. The driver ABI changed up until right before its release. Also, an OpenGL driver is a complete implementation of OpenGL. Not only does it bang interrupts, it implements the entire pipeline. Functionally, it does just as much as Mesa, but is slightly easier to code because most functions don't need emulation. It's a wonder they got it out in the time they did. As a preempt to any OSS comments, remember, this isn't just a driver, it's an OpenGL implementation. It's also the highest quality GL implementation on a consumer-level card. It is simply too much to ask them to OSS it. They'd be giving away all the tweaks to the GL pipeline, not just register-level info. Matrox is really hurting for a good OpenGL ICD. Is it really fair to ask NVIDIA to give them one?
  • across multiple pages . .
  • Just a quick note. If I remember correctly, both IRQ 9 and IRQ 2 use the same interrupt. Actually, 2 is cascaded to 9 to allow for all 16 interrupts. I've noticed that a lot of SB cards default to using IRQ 9, which causes weird problems when they get used. Just something to check.
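
    Everything needed to check this is in /proc/interrupts (a plain cat does the same job); the trivial C program below just dumps it, and two device names on the same numbered line means those devices share that IRQ:

      /* Dump /proc/interrupts.  Two device names on one numbered line
         means those devices share that IRQ; IRQ 2 is just the cascade
         to the second PIC, which is why it aliases IRQ 9. */
      #include <stdio.h>

      int main(void)
      {
          FILE *f = fopen("/proc/interrupts", "r");
          char line[256];

          if (!f) {
              perror("/proc/interrupts");
              return 1;
          }
          while (fgets(line, sizeof line, f))
              fputs(line, stdout);
          fclose(f);
          return 0;
      }
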
  • $600 for a Voodoo 5 6000? This card will perform nearly as well for much less. (The prices should go down by then.)
  • So... with the amount of computing power one of these things has, when will the stupid US laws on the export of supercomputers come into effect and hobble the marketing of this card in places outside of the good ol' US of A?

    Also, why would anyone besides IPO millionaires or graphics professionals want to spend the price of a low-end PC on JUST a graphics card, which won't even be taken advantage of for another YEAR or so?
  • "I don't think you can hold Nvidia responsible for the possibility that Linux support for your motherboard is not quite perfect (if that happens to be the case)."

    In addition, 3dfx and ATI don't seem to have that problem...

    Ranessin
  • Actually, in an FAQ released right about when the GeForce2's came out, nVidia was saying that the 128 megs of RAM on the Voodoo5 6000 added up to only about 58 megs of effective RAM. Even 3dfx's competitor says it has that much RAM to work with, so it's more than 32. The only thing that has to be repeated is texture data, I think.

  • You're forgetting about Capcom. =)

    I'm just waiting for the Super Ultra Mega Tournament Champion Edition GeForce Alpha II.


    --
  • And yet, the bandwidth between the CPU and GPU ain't all that hot. And the CPU is going to be throttled due to Windows CE.

    Besides, there is more to a good system than graphics. Like how fun the games are. That's why the PS is still going strong despite being so slow.
  • hehe.. go ahead and get a Voodoo with an external power supply.. I'd never put one of those in my box, even if it was free. That thing is the biggest freaking piece of inefficient hardware. They need four chips and an external power supply to compete. How sad... perhaps they should have spent more money on R&D instead of marketing..

    JOhn
  • Actually, a lot of people are willing to fork over that much. Lots of people bought dual Voodoo2 SLI's for $600. Methinks the bulk of the price is due to the high speed RAM.
  • www.sharkyextreme.com also has a good review on this. BTW, has anybody used the video features on this thing? In an age where 3DFx still doesn't do AGP, it's cool that NVIDIA is putting in stuff like support for HDTV.
  • If I understand correctly, the Real3D StarFighter (which I happen to own) does this, making use of the AGP bus to access system memory. According to the FAQ, that's why the PCI StarFighters have so much more memory (32-64 MB) than the AGP StarFighters (4-8 MB). If anybody's interested, the StarFighter FAQ is at http://support.intel.com/support/graphics/starfighter/faq.htm [intel.com]
  • If you are running Q3A under Win2k, make sure you have r_ext_texture_compress and r_overbrightbits turned off.

    Q3A runs fine on my TNT (stb velocity 4400) in Win2k. I plan to buy a GeForce2 MX soon.

  • Isn't Elantra Korean?
  • What I'm saying is that the Nvidia drivers may not be the problem. Windows may deal with your motherboard the way it needs to be dealt with, but Linux still doesn't perfectly support every motherboard in existence. I'll admit that it seems like a bit of a stretch, but it's certainly possible.

    Your other reply is where I see the best argument for bugginess. ATI's and 3dfx's drivers may not have this problem, but do they both use the AGP card as anything other than a PCI card? Nvidia may be correctly using features that aren't implemented correctly on your motherboard.

    Or the drivers may just be buggy :). I've had mixed results: in the same system, a GeForce 256 works perfectly, but a TNT2 crashed every now and then. I'm sure the drivers will continue to improve. Nvidia has the chance to get a great position in the 3D Linux market, but still has very competent competition.
    treke

  • Offtopic but true: 15%? Bah, brew your own mead -- 26-30%, and it tastes like a sweet ginger ale if you do it right. Leave the cloves out, though.
  • If you haven't noticed someone else post this by now (about half of the posts I've read so far mention this), nvidia has damn good drivers out, and the drivers are stable. Also, you can get help in #nvidia on irc.openprojects.net, and the Linux dev guys from nvidia are there a lot.
  • That was probably one of the top ten flames I've ever received.

    Too bad you're too much of a pussy to use an actual account, so I could, erm, email you and congratulate you.

    Anyway, there's no content to that post, at all. Granted, that's what makes it a flame, and I probably shouldn't be responding to flamebait, but the only thing you actually said with real substance is that you support CEOs raping the public so they can take six-month vacations. Well, more power to you. You probably don't vote anyway.

    Sometimes there's value in criticizing successful things. Other successful business practices include slavery, strip mining, sweatshop labor...
  • Of course Matrox has been working on OpenGL since the G200 and they are STILL half-baked. (Quake works, 3D Studio doesn't)
  • Just because Nvidia routinely fucks over the open source community...

    Question: How did NVidia damage the open source community by releasing Linux drivers? If they had done nothing at all, would that have been better? As I'm sure you already know (as it has been stated thousands of times), NVidia would like to have released open source drivers, but they have NDA's with other companies that prevent them from doing so.

    ...and tries to strongarm the market...

    Every company has "tried to strongarm the market" at some point. 3dfx was one of the worst, with GLIDE and their patent on multi-texturing (which NVidia realized was crap, and ignored).

    ...doesn't mean their product isn't cheaper than the other ones!

    Their product is better than the other ones. In all my extensive research of the subject, I have consistently come to the same conclusion. Furthermore, their closed source drivers for Linux are currently better than any open source driver for a 3D card on Linux. Not to put down open source (I spend most of my time writing open source code), but it's just a fact.

    See the .sig.

    ------

  • Actually, it's a 250MHz core. :)

  • Thank you :-)

    I can get all the Mesa demos to work and have no problems with the GL modules from xscreensaver and xlock. However shortly after I start up Q3A, my entire machine locks up. UT did the same.

    After some experimentation, I realized something interesting: if the PCI slot below my TNT2 was empty, it would work. This led me to believe that the TNT2 and whichever card was below it (sometimes a SoundBlaster Live, sometimes an Adaptec 2940, and sometimes an Ensoniq PCI) were having a conflict over an interrupt.

    However, /proc/interrupts showed separate IRQs for each device.

    Any ideas?

    Ranessin

  • "But given that the company does so much for its users in a day when video card companies are generally down the tube, I forgive them for it."

    Down the tube? Like the various companies who are either writing their own 3D drivers for BeOS or assisting Be in writing the drivers? Frankly, I think it's nVidia that is down the tube.

    Perhaps if they could put out quality, stable Be and Linux drivers, I'd change my mind. But till then, I'm quite happy with my Voodoo5, which has 2D acceleration under Windows, Linux (and every other x86 XFree86 platform), and BeOS, and which has (or will soon have) 3D acceleration under Linux, Windows, and BeOS.

    Ranessin
  • And just as good Linux drivers.

    ------
  • I guess you're right. I'd much rather be able to switch between SMP and UP on-the-fly than have drivers that are fast and featureful.

    ------

  • I woke up with feet at the end of my legs, is it luck?
  • I guess, but somehow I doubt that 100 dollars will be a big decider for most would-be buyers of these video cards. If you are willing to spend 500 dollars on a cutting-edge solution, why not pay 600? Neither card is going to hold its value very well. If the prices were reversed, would everyone be singing the praises of the V5 6000? Somehow I doubt it.
  • Some 64MB GeForce 2 GTS cards are selling for less than $300, so certainly the RAM alone does not cost the manufacturer $300. The main reason prices are so high is the lack of competition. If nVidia ups the ante and voluntarily lowers prices, people will flock to their cards. They can certainly still make a profit; in fact, they could make more profit from the volume.
  • "How did NVidia damage the open source community by releasing Linux drivers?"

    It could be argued that every time a company releases a closed source product which uses GPLed code it damages the open source community.

    "Furthermore, their closed source drivers for Linux are currently better than any open source driver for a 3D card on Linux."

    Only if you can get them to work. If you can't, you're screwed since you have no access to the code and little, if any, access to the developers.

    Ranessin
  • Unfortunately, from what I hear (so take this with a grain of salt), Apple (or their subcontractor) hosed up the Cinema Display and it only works on Macs right now.

    I fail to see how this could be true, but I've heard it from a couple of sources now.

  • by Anonymous Coward
    The Ultra is for the same people that buy 1.1GHz processors right when they come out. Penis envy. It's a well known fact that people with small penises make up for this shortcoming (no pun intended) by getting the most powerful computer equipment they can afford.

    man with 133MHz pentium = has big penis and proud of it

    man with 1.1GHz Athlon Ultra Super with 1GB RAM and RAID 5 Hard drive = has itty bitty penis
  • I agree with you. I have a GeForce2 GTS and I am very happy with my purchase. My primary reason for choosing this card (it was Creative's card) was the support under Linux. It was the same reason I purchased Linksys network cards (they ship you the tulip module source!) and an SB Live card.

    One small point you didn't mention in your message is that you can find nvidia's driver developers on irc.openprojects.net in #nvidia. They and the other people in that channel have been very helpful to everyone who's visited, from newbie to expert. They even have optimized drivers you can test out if you wish. Good group of people over there.

    So everyone... support hardware vendors who make a real effort to support Linux and support it well. Companies like nVidia, Creative and Linksys need to be praised for their support of Linux more so than other hardware vendors who just pay lip service to it. Quit arguing over licenses and crap and just push them for quality, supported hardware and software, open or not.

    siri

  • Oh yes. That's it. You do realize there is life outside of /. In the "real world" (i.e., the world rarely inhabited by delusional Open Source fundamentalists), NVIDIA is regarded as a high quality company. You go buy that inferior 3DFx product for more money. That's exactly what we want to encourage. Boycott the companies that make decent products, and buy stuff from jackasses like 3DFx who still don't support modern features and charge too much for cards that really aren't worth it. Buy cards from the same company that convinced retailers to put "Glide required" stickers on D3D- and OpenGL-compatible games in a last-ditch effort to save the Glide franchise. Buy products from a company that still doesn't have top quality OpenGL drivers. Now if you'll excuse me, I'm going to go sell some blood to afford this card.
  • Which Linksys card are you referring to? Yes, they put a copy of tulip on the disks. But I own three LNE100TX cards (the 10/100 PCI thingies) and even though they work on Linux, under a high load they become unusable until a hard reboot. NFS and most other UDP protocols won't work with them at all. Sure, it's probably the driver's problem, but my point is that just because a company puts forth some goodwill gestures toward Linux doesn't mean they actively support it.

    I don't think this applies to nVidia at all though. I have 2 TNT2 cards (go q3a!) and the nVidia drivers are great. yes, they're binary only, and yes they crash sometimes. Hopefully this will be fixed soon, though, and at least they're fast. I'm especially impressed that a hardware company can get their act together and maintain a driver source code base that compiles for multiple OSs.

  • Contrast that with the fine, fine output of those 100 engineers (although I suspect that their Linux driver "team" numbers in the ones). The NVidia binary-only driver runs only on particular kernel revisions, and does not allow the user to switch between SMP and UP operation. Worse, NVidia could choose to stop supporting Linux at any time, and there's nothing we could do about it. Since we have the specs for the G400, we can support that card forever.

    The XFree module and the GL libs are binary-only, but the source to the small kernel module is distributed. Admittedly, it has problems on many kernels, but at least they let you try to compile it on whichever kernel you happen to be using.

  • While the raw materials of the card cost about $0.20 (it's all sand and plastic, basically :), there are other factors involved.

    There is the chip yield at the fab. They are using the fastest-clocked processor and memory chips in a graphics card to date, IIRC. If yields are low, the price is high.

    There's also the need to not kill their current sales of old boards.

    And then there is recouping any "hidden" research costs associated with the new card.

    And then there is the need for a healthy profit to enable further development and keep shareholders happy.

    Besides - if they are priced too high, they won't get sales, and then prices will come down. In a competitive market, consumers keep things priced right ;)
  • > (let's just make toasters. expensive, fast toasters)

    Expensive, large toasters. That require their own 220V power outlet.

  • Actually, according to another guy, DirectX8 is completely designed around/for NVIDIA cards. NVIDIA is awful big on pushing for the features they want during conferences. (DirectX is planned by hardware and software vendors talking about what features should be put in.) Given the fact that NVIDIA's cards totally dominate in terms of features, I wouldn't be surprised. I do remember, though, that DirectX6 looked awfully like a TNT.
  • by be-fan ( 61476 ) on Tuesday August 15, 2000 @11:40AM (#853209)
    Put blame where blame is due. Lambaste Linus and his decision not to put in a stable driver API.
  • Look, I care more for quality than a peculiar attachment to a particular OS. Right now, OpenGL on all the other consumer cards sucks compared to NVIDIA's. They're the only ones I can use for 3D Studio. They also have the most stable drivers, and I'm benefiting from all their driver work even though I've got an older card. That's real quality.
  • If you can't, you're screwed since you have no access to the code and little, if any, access to the developers.

    Once again, you can find the developers at #nvidia on irc.openprojects.net.

    ------

  • I'm sorry, but 3dfx is also Satan.

    Remember, 3dfx bought up a company and became the only marketing path for products based on their chipsets, leaving a number of people high and dry who were hoping to continue to make boards based on 3dfx chipsets. Thanks, guys. Maybe you should sell out to Apple so we can keep all the proprietary-ness in the same place.

    So who's left? ATI? They just brought out their first worthwhile card (The Radeon.) Personally, I'm going to wait and see if they can keep up with the big boys before I commit to buying anything from them. Matrox? Not hardly. Great image quality, but they just don't have the power to pump, and their OpenGL driver quality has dropped some from the old days.

    So I guess it's down to 3dlabs and FireGL (whoever owns FireGL these days), which is intensely expensive, but somewhat faster than any of the consumer-level cards. Not amazingly faster, mind you, but it should benchmark favorably. The only problem there is that I don't recall seeing anything about Direct3D. I know, I know, it's a commercial-level device, but D3D has been gaining footholds even in important places like inside of Lightwave 3D. Yes, I think D3D sucks. No, I don't think we can get away from it now. Microsoft did us all a disservice by creating a lame new 3D API rather than going with OpenGL, and now we have to live with it.

  • by vaxer ( 91962 ) <sylvar@NOSpAm.vaxer.net> on Tuesday August 15, 2000 @10:43AM (#853218) Homepage
    HELL-lo...

    At 64MB and 200MHz, my desktop PC is less powerful than this video card! Has anyone ported NetBSD to run on the GeForce2 yet?

  • by be-fan ( 61476 ) on Tuesday August 15, 2000 @10:46AM (#853223)
    Actually, it won't be 10x more powerful than the PS2. More like twice as powerful. That's nothing. In the same situation, the original Playstation whooped Nintendo's ass even though the N64 was more than 3X faster, and featured stuff like texture filtering and full screen anti-aliasing (which PSX doesn't support.) Of course, Nintendo actually knew the console industry, and knew how to deliver a simple, stable, easy to use product. Microsoft is bringing too much PC garbage into the XBox to be able to do that. XBox will not be simple, stable, nor easy to use. It will require updates, and upgrades, and patches. Nothing against MS, it's just that it doesn't understand the industry. None of this takes into account the fact that Playstation 2 is coming out more than a year earlier. As history shows us, that's a killer. PSX whopped Nintendo largely due to the fact that N64 came out more than a year later. Lastly, Sony is a lot bigger than Microsoft, and they have more developer support and consumer mindshare. MS is going to be eaten for lunch.
  • by Stiletto ( 12066 ) on Tuesday August 15, 2000 @12:49PM (#853227)

    Disclaimer: I work for Matrox

    I don't know what you mean by Matrox not having a "full" ICD. Can you explain what isn't "full" about the one available on their web site?
  • The marketing department at 3dfx going up in smoke?

    "Image quality doesn't matter, speed matters."
    (shit - let's improve the image quality)

    "32 bits don't matter, speed matters."
    (shit, let's do 32 bits)

    "22 bits looks just as good as 32 bits."
    (i said, let's do 32 bits!)

    And now:
    "um ... speed doesn't matter, uh ... anti-aliasing matters!"
    (let's just make toasters. expensive, fast toasters)

  • Perhaps the card is overheating. :) Did you try putting an extra fan on it? I had to do this with my TNT2. Kinda makes sense that when the adjacent slot was empty, the card would stay cooler, as the fan sucks in air from below IIRC. (Or did the TNT2 have a fan? I can't remember...)

    That is a very strange problem, though. I can see why the NVidia people had trouble with it... though I agree that they should at least reply to e-mails. They used to reply. I e-mailed once and got an immediate reply from Nick Triantos (head programmer), and it even fixed my problem, but apparently they don't do that anymore. :(

    Well that was a fun Holy War. I defend NVidia because I really like the stuff I can do with their cards (programming-wise), and because I have never had a problem with their hardware. I'd like to see their drivers open source, but I don't think it is right to hate them for not doing so. There's a whole lot of good stuff in their drivers that other companies would love to use (seeing as how NVidia has the only complete OpenGL implementation).

    No hard feelings. :)

    ------

  • First let me address the old open-driver issue.
    Nvidia doesn't make all the tech they use. They purchase rights to use it. These rights do not include the right to tell everyone how all these things work. So Nvidia, whether it would like to open its drivers or not, CANNOT. It's not an issue of Nvidia being dicks, it's an issue of Nvidia being legally bound.

    Nvidia is going to try and get drivers out for every market they feel is viable. They're a very busy company, so we'll see about the Be thing. Porting to BSD shouldn't be too hard, so if Nvidia gets enough email saying, "I use FreeBSD, I'll buy your card if you give me a driver," then that's great. But I doubt many FreeBSD users (I'm one) use FreeBSD or OpenBSD or NetBSD for games. It just isn't the focus of the OS. FreeBSD is for high performance stuff on PC hardware, OpenBSD is security, NetBSD is just about propagating. Of all of them, FreeBSD x86 is the most promising. But they need a reason.

    I, being a graphics programmer who is devout in his belief that OpenGL rules, love Nvidia. They make great drivers for the platforms they DO support, and their Linux support is, without question, on par with their Windows support. Further, as I've said in other posts, their drivers are very "complete." If you're not an OpenGL programmer, you may not know about OpenGL extensions and how much better they can make a given image look. OpenGL drivers with a bunch of official (SGI recognizes them) extensions make it easier to produce fast, realistic effects.

    Since these drivers are complete, Nvidia cards can be used for more than just games. They can be used for simulation, medical imaging, satellite image display, a bunch of cool stuff!

    Does 3dfx do this? No. And no one makes a technically better card, with better drivers. I wish they could open source and port to every platform, but I'll take performance over an ethical pat on the shoulder any day.
    - Paradox
    Man of the C!!!
  • by be-fan ( 61476 ) on Tuesday August 15, 2000 @11:01AM (#853251)
    Maybe I'm wrong, but...
    If quality is what counts (you said they make quality products), then why is there a new card every 6 months or less? Also, why new drivers nearly every week?
    There are two ways this could be viewed:
    >>>>>>>>>>>>>>
    Quality counts, but so does speed. Unlike most companies, NVIDIA manages to keep one from being exclusive of the other. NVIDIA cards are all high quality, but they are released often so that NVIDIA stays on top of the industry. Unlike 3DFx, they are not resting on their asses waiting for everyone to catch up. As for new drivers, why is a new Linux kernel released every few weeks? Does that mean the kernel isn't high quality? Of course not. NVIDIA continually improves the quality of their drivers. There are always tweaks you can implement, and NVIDIA is taking already high quality drivers and making them better. You're complaining about that?

    It would seem to me that the wisest choice would be to release a solid card with solid drivers, rather than a card plagued with incompatibility issues and drivers that constantly improve performance on each release.
    >>>>>>>>
    Where is there any indication that NVIDIA's cards aren't solid? The NVIDIA drivers are rock solid. (Have you actually used the Windows ones?) They're all fast, stable, and very compatible. NVIDIA's is one of the only cards a workstation 3D user could seriously consider using with apps like 3D Studio. As for compatibility issues, they're in your head. There have always been problems with crappy hardware and NVIDIA's boards. There were problems with NVIDIA's boards and the early Super 7 ones. There were problems using them in lower-quality boards because of the large amount of current they use. The Via KX133 chip had problems with them. Of course, it was also incompatible with the Thunderbird Athlons, so it seems that the fault lies with VIA. NVIDIA cards are big and take a lot of current. If your motherboard manufacturer isn't building stuff to spec, then how can you possibly blame NVIDIA for compatibility issues?
  • As quoted in The New York Times, from a press event with the X-Box representative from Microsoft:

    "While we're aware that the Playstation 2 will be released a year earlier, with a large selection of titles, DVD Support and stable gaming, the X-Box will provide...."

    (The spokesman pauses and turns, asking an assistant if this is correct.) "Um, excuse me. The X-Box will provide...a PC-like environment...with...is this word 'faulty'? Yes? Drivers, a 'No-Firewire Zone' sticker on the bottom, and will ultimately...suck?"

    (he turns and runs toward the assistant)

    "That wasn't the press release! Hey! Don't you work at VA Systems?"

    The spokesman had no other comment.

  • I was referring to drivers. Why do people like to abuse the moderation system?
  • by xinu ( 64069 )
    Sounds great. Reeeal fast eh.

    But I'm gonna have to stick with my Matrox G400 until anyone else comes up with half as good drivers for Linux...

  • by Christopher B. Brown ( 1267 ) <cbbrowne@gmail.com> on Tuesday August 15, 2000 @10:51AM (#853273) Homepage
    ... Not cool enough to spend $500 on, mind you.

    The really cool thing is that this will "force" everyone else to improve their graphics cards, which means that in another year's time, by the time that XFree86 4.x drivers are available:

    • There may actually be general availability of XFree86 4.x;
    • There may be reasonably-well-tested accelerated GLX / DRI support in XFree86;
    • There will be all sorts of other graphics cards with ludicrous amounts of RAM, relatively competitive with GeForce;
    • The "only somewhat ludicrously powerful" graphics cards with 32MB of RAM will be, at that point, "obsolete," and thus, dirt cheap.

    The one thing I'd be concerned about, a year from now, is that you might be buying graphics cards with 256MB of RAM, which is more than the amount of "regular RAM."

    (I can remember the "good 'ol days" when we upgraded an Alpha 4600 system to 256MB of RAM to help it to better support 40 online users with the ludicrously-wasteful SAP R/3 ERP system. [hex.net] I can now imagine someone putting that much RAM on a video card, for use by one user. Unbelievable...)

  • I'm saying that quality and price ARE the only IMPORTANT factors. So are service and support. NVIDIA aces all of these. I'm just saying that it is not in one's best interest to go support companies who really aren't as high quality over something like Open Source DRIVERS. You're free to do it, but just realize that by doing so you have no right to complain that computer companies these days don't care about product quality, and care even less for their users.

    As for alternatives, you could buy ATI and live with crappy drivers, or buy Matrox and live with (comparatively) crappy performance and bad OpenGL support. Decisions, decisions.
  • Significantly, I mean you can go out and buy 128MB of RAM for that. Also, the V5 6000 won't be out for a while, and the price should go down as DDR RAM prices go down. (A big chunk of the increased price is the 230MHz DDR RAM.)
  • by JayBees ( 124568 ) on Tuesday August 15, 2000 @10:23AM (#853282)
    $500 for this thing? My GeForce 256 isn't being pushed at all yet; I'm not gonna throw 500 into a card that won't reach its limits for another 2 years. Well, maybe if Halo and TF2 would look better :)
  • The other 3D drivers get updated more often because they're Open Source. However, they shouldn't need to be updated more often. Saying "okay, the driver API is stable as long as your product is Open Source and you update it whenever the API changes" is a crutch. It's a hack to solve a more fundamental problem. Perfect example: the glue layer between the NVIDIA driver and the kernel is Open Source. However, because the driver API has changed in 2.4-test6, that glue layer no longer compiles. If it was another OSS driver that depended on those particular features, it wouldn't compile either. As it is, drivers that depend on parts of the driver API that change have to be updated whether or not they are Open Source. Thus, the problem lies not with the driver (which simply uses all the features of the driver API) but in the API itself (which changes the way those features are accessed.)
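
    To make that concrete, here is a hypothetical fragment of the kind of glue shim being described -- not NVIDIA's actual code, and the two registration helpers are made-up stand-ins for whatever kernel interface changed:

      /* Hypothetical glue-layer fragment -- not NVIDIA's actual source. */
      #include <linux/version.h>

      static int register_old_style(void) { return 0; } /* stand-in */
      static int register_new_style(void) { return 0; } /* stand-in */

      static int glue_register(void)
      {
      #if LINUX_VERSION_CODE >= KERNEL_VERSION(2, 4, 0)
          /* The 2.4-series API changed the call the driver relied on,
             so the old branch no longer compiles against these headers. */
          return register_new_style();
      #else
          return register_old_style();
      #endif
      }
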
  • by chromatic ( 9471 ) on Tuesday August 15, 2000 @12:00PM (#853289) Homepage

    I think some people are critical of NVIDIA because the company can't quite seem to figure out what to do. It said it supports Linux, but couldn't seem to release decent drivers until pressured by a handful of articles on Slashdot, Linuxgames, and other sites. It was caught using GPL'd code in the driver, but removed it faster than it said it could. It uses a unified driver architecture so that improvements to the Windows driver show up in the Linux driver, but then a marketing manager goes on the record as saying "The only reason anyone would open source something is because they can't do a good job of it on their own."

    It got "caught" strongarming volunteer sites by sending out review hardware and then calling in the chips to get rid of information about competitors -- then explained it away by claiming it hires temporary workers with the authority to make those deals but without the oversight to stop them.

    Personally, I think it's a normal company with good hardware that needs to get rid of the marketing department and give the technical people more control.

    Read more at my NVIDIA history [wgz.org] or analysis [wgz.org] pages. I don't really understand what the company's doing, but at least my video card works.

    --

  • ATI is bad about that, Rage Fury Pro etc.

    I'm waiting for the ATI Psychotic Genocide 3000.

  • There WAS competition, the RivaTNT. That offered about the same performance as a Voodoo-whatever.

    When the Voodoo 2 came out, there was no TNT (yet), and nothing in the consumer field beat a dual Voodoo2-SLI at fill rate. So no, there was no competition, initially.
  • Note that YOU release your software for free, but don't want to give other people THEIR freedom to release THEIR software how THEY choose. How are they screwing over the OSS community? They used some code. I'm sure a lot of people have used some code. They keep their drivers closed.

    How are they screwing us? Simple. Our license says you can't use our code in closed-source products. They did anyway. Their license says if we use their code without permission, we are violating their copyright, and they are quite likely to sue/press charges for infringement.

    We say: you are free to use our code if, and only if, we are free to use yours. They make it illegal to use theirs freely (no reverse engineering, etc.) and impractical too (no source). Thus it is illegal and wrong for them to use ours. If they can withhold permission to use their code, we damn well sure can, and do, withhold ours.

    NVIDIA has THE highest quality OpenGL ICD in consumer space.

    Even if that is true, that is irrelevant. Can we ignore a license just because we have some cool stuff? Should they be able to? Hell no!

  • Think about it. If NVidia wanted to please Microsoft, what worse way to do it than provide hardware to a Linux-based competitor? I'm sure Microsoft asked NVidia to do the X-Box, not vice-versa.

    In the end, I'm sure that Microsoft did ask NVIDIA to supply the cards for the X-Box, but you shouldn't interpret that as meaning that the NVIDIA folks were doing Microsoft a favor. Microsoft, and a bunch of other people, are fairly certain that X-Box is going to sell pretty damn well (probably better than Indrema), and NVIDIA would have been stupid to turn its back on that.

    Quite on the contrary, I would be willing to bet that NVIDIA lobbied pretty hard to get into the X-Box. In the end, MS had to pick a partner who was willing to a) give them a quality product which b) was capable of delivering good D3D performance, all c) at a reasonable price point (cheap enough to stick in a consumer-level game machine). That NVIDIA answered the call indicates that they have no interest in playing hardball with MS. And frankly, I don't blame them: it would be a counterproductive and costly fight to pick.

  • What problems are you having? I've debugged a lot of NVidia Linux installations and I'd be happy to help you.

    ------
  • The one thing I'd be concerned about, a year from now, is that you might be buying graphics cards with 256MB of RAM, which is more than the amount of "regular RAM."

    You know what's kind of silly? There is a special AGP bus for video cards which runs at (n) x (the normal speed of the bus) and must transfer all kinds of data back and forth. Why can't Intel or VIA or someone come up with a special bus that connects the AGP bus with the RAM? That way, you could buy a video card with just some ROM and no RAM, and the video card could use as much of the system memory as you needed. You could have 64MB of RAM now, and when your games run slow, upgrade to 256MB and everything will run faster. You could even have the AGP->RAM bus work for only one or two RAM slots, because I can imagine how out-of-sync ISA and PCI cards could get running at AGP speeds.

    Of course, video card manufacturers wouldn't like this unless they also manufactured RAM, but I think its time has come.

    Even if RAMBUS actually works at 200MHz, it still couldn't top the speeds attained by AGP 4x.

  • You say that be-fan shouldn't have been modded up because, in your humble opinion, what he said was wrong. Well, there are quite a few people who disagree with you, and the point of moderation is to bring up the posts that are interesting, insightful, or informative, not the ones the moderators think are "correct". Just because you disagree does not mean that be-fan's post was not insightful.

    ------

  • 3DFx doesn't use AGP direct texture execution. This means that textures must be copied into texture RAM first before they can be used. In cards that support "real" AGP, textures can be used directly from system memory. It also lacks fast-writes and sidebanding, and essentially uses AGP as a 66MHz PCI slot.

    AGP has no problem with multiple processors; it just can't handle multiple devices. That's why the Voodoo5 is on one card. Multiple processors on a single card work fine, as evidenced by the ATI Rage Fury MAXX, which uses multiple Rage 128 Pro chips. It seems that 3DFx didn't engineer support for AGP texture execution into the VSA-100 chips. To 3DFx's credit, it really doesn't matter, because in current games AGP makes less than a 1% difference. However, future games are going to be limited both by the lack of AGP texturing and by the fact that even the 128MB Voodoo5 6000 only has 32MB of effective RAM (because each proc needs its own copy of the textures.)
  • According to Tom's Hardware [tomshardware.com], the D3 drivers will be out for Linux sometime inside the next week. Looks like they'll have to redo the recent article in which they compared Linux and Win98 3D benchmarks. I wonder what the new drivers will do for CAD performance under Linux?
  • Sometimes the guys with the brand-new computers just have more money than sense. Another possibility is that they just got rid of a 3-year-old machine and want to keep the new one for another 3 years.

    The guys you really wonder about are the guys with the 91 Camaro that get pissed off when you close the door too hard and drive 85 out of the gas station parking lot. Those guys have some masculinity issues to deal with.

    -B

  • by Temporal ( 96070 ) on Tuesday August 15, 2000 @11:14AM (#853315) Journal

    Think about it:

    • NVidia may have good D3D support, but they also have better OpenGL support than any other consumer graphics card company. Most companies just support whatever Quake3 needs, but NVidia supports everything and more (through extensions). As a matter of fact, NVidia's OpenGL support is better than their D3D support -- more of the GeForce 2's features are available through OpenGL than through Direct3D. Specifically, I am thinking of register combiners (god I love those; see the sketch after this list), which have been available in the GL drivers for some time, but won't be available in the D3D drivers for... err... some time.
    • Speaking of which, D3D has been basically tracking NVidia's hardware in version 7 and will again in version 8. I read that DX 8's register combiner functionality is biased towards NVidia's hardware. This looks to me like Microsoft sucking up to NVidia. NVidia doesn't really care as all the register combiner stuff is already available in their GL drivers.
    • The Indrema. It's a Linux-based game console, and it uses NVidia hardware! Think about it. If NVidia wanted to please Microsoft, what worse way to do it than provide hardware to a Linux-based competitor? I'm sure Microsoft asked NVidia to do the X-Box, not vice-versa.
    • Hell, even the fact that NVidia has better Linux support than any other 3D hardware vendor probably pissed Microsoft off, big time. (If you don't believe me on their support being better, well... NVidia's Linux drivers are equivalent in speed and exactly equal in features to their Windows drivers. Sometimes, their Linux drivers are even faster (the recent speed boosts in the detonator3 drivers have been in the Linux drivers for some time now). Heh, and come to think of it, NVidia's Linux drivers have more features than their Windows D3D drivers. Not one other 3D chipset manufacturer can say that.)
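
    Since register combiners keep coming up, here is a minimal sketch of what driving GL_NV_register_combiners looks like -- not from the post, and it assumes a current GL context plus a libGL that exports the NV entry points directly (otherwise fetch them via glXGetProcAddressARB). One general combiner multiplies two textures into the spare0 register, and the final combiner passes spare0 through:

      /* Minimal GL_NV_register_combiners sketch: spare0 = tex0 * tex1. */
      #define GL_GLEXT_PROTOTYPES 1
      #include <GL/gl.h>
      #include <GL/glext.h>

      void setup_modulate_combiner(void)
      {
          glEnable(GL_REGISTER_COMBINERS_NV);
          glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

          /* General combiner 0, RGB portion: A = texture0, B = texture1. */
          glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                            GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
          glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                            GL_TEXTURE1_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);

          /* Route A*B to spare0; discard the C*D product and the sum. */
          glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                             GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                             GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

          /* Final combiner computes A*B + (1-A)*C + D.  Set A = spare0,
             B = 1 (zero, unsigned-inverted), C = D = 0, so out = spare0. */
          glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                                 GL_UNSIGNED_IDENTITY_NV, GL_RGB);
          glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,
                                 GL_UNSIGNED_INVERT_NV, GL_RGB);
          glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                                 GL_UNSIGNED_IDENTITY_NV, GL_RGB);
          glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO,
                                 GL_UNSIGNED_IDENTITY_NV, GL_RGB);
      }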

    Combine that with all the stuff be-fan said (which I agree with 100%), and you have one really cool company. Sorry, they aren't open source, but you'll notice that none of the open source drivers available compare anywhere near as favorably to their Windows counterparts as NVidia's drivers do. Where I come from, we judge our software on quality.

    ------

  • by be-fan ( 61476 ) on Tuesday August 15, 2000 @11:16AM (#853316)
    About the Microsoft thing. What is wrong with writing code, protecting it, and making a profit? It is those kinds of feelings that get people to think of OSS people as communists. Note that YOU release your software for free, but don't want to give other people THEIR freedom to release THEIR software how THEY choose.

    How are they screwing over the OSS community? They used some code. I'm sure a lot of people have used some code. They keep their drivers closed. I've actually gained new insight into that one. An OpenGL driver is a full implementation of OpenGL. Not just a driver that bangs interrupts, but something that handles everything from glVertex on down. NVIDIA has THE highest quality OpenGL ICD in consumer space. In Windows-land, Matrox and ATI are both struggling to get high quality OpenGL drivers. (Read the interview with Matrox's OpenGL guy in this month's MaximumPC.) Asking someone to give that away is simply too much.
  • I meant the Riva128. The competitors to the Voodoo2 were the Riva128, the Intel i740, and the Rendition Verite. In the sense that the GeForce2 GTS totally wipes the floor with everything else (including the Voodoo5), the Riva128 and i740 were about the same compared to the Voodoo2 as the Matrox Gxxx and Voodoo5 are to the GeForce2 GTS. (Although the Radeon is good competition to the GeForce, I haven't seen very many of them around.)
  • The thing is, I'm pretty sure they didn't MEAN to steal your code. The GPL code used in the driver was some really simple stuff. I'm guessing that some of the speculations were right (that they were so crunched for time, somebody had just used the GPL code internally and forgotten to take it out.) The same thing happened to Be too, and I'm pretty sure they didn't mean it either. The thing is, the GPL allows you to keep the code for internal versions, and sometimes people forget that the code is in there and release it anyway. I'm sure a lot of people have taken GPL code and forgotten about it. Does that make it right? No. But the GPL community brought such cases upon itself by being Open Source, and thus has to be willing to put up with such things. They should bring such cases to the attention of the companies that do it, and should push hard to get rid of the problem. However, it is silly to totally boycott a company because some engineer screwed up.
  • However, it seems that the Ultra will perform close to or better than the Voodoo5 6000. In the benchmarks, the Ultra performs almost twice as fast as a V5 5000. Given that SLI only gives an average performance boost of around 50-70% at the highest resolutions, that would mean that in most cases the Ultra is faster AND cheaper.
  • It could even be something like power being a problem. Recall that early Athlon systems had problems simply because of the power required by the processor.
  • However, these boxes have a LONG shelf life, and developers quickly learn. In the case of the Saturn, some really killer games came out towards the end that really showed off its power. However:
    A) Sega didn't market it correctly.
    B) The dual-proc design was inherently hard to code for.
    The difference is that dual procs are always hard to code for, while a change in architecture just requires re-learning; once you get the new paradigm, it's easy.
  • by mholve ( 1101 ) on Tuesday August 15, 2000 @10:26AM (#853325)
    You just gotta love these names.

    Things like "Detonator" and "Firing Squad" and "Ultra..."

    Like American and Japanese cars. American cars have names like "Mustang" and "Camaro" and "Viper" whilst Japanese cars have more tranquil names like "Elantra" and "Mirage" and such.

  • I have to admit, Nvidia deserves a lot of credit for making such great chips and pushing the graphical envelope as much as they do, but I'm really not pleased at all with the whole Ultra release. As someone who just bought a new computer complete with a GeForce II GTS, I feel cheated. If there had been some warning (ANY warning) that the chip was going to come in an Ultra flavor, I'd have waited the extra month and made do with my TNT2 a little longer. Instead, I shelled out $400+ for a card that wasn't top-of-the-line for even four whole weeks. The price difference between the Ultra and regular cards notwithstanding, I think Nvidia has made a mistake in releasing the Ultra a) so soon in its product cycle and b) without any warning at all. Personally, I'm going to give 3dfx a great deal of consideration when the time comes to upgrade to the NV20 or equivalent.
  • The thing is, without making GDI calls you cannot, for example, overlay DirectDraw over Direct3D. This means that there is no easy way to do a true 2D overlay over D3D graphics. You either have to make your interfaces 3D, do GDI calls (slowdown), or render your interfaces by hand into a texture and update it on a polygon. All of this sucks. Meanwhile, OpenGL has a nice set of functions for drawing lines, pixels, and other two-dimensional elements.
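
    For comparison, a minimal fixed-function OpenGL sketch of that kind of 2D overlay (assumed, not from the post): render the scene normally, then flip to an orthographic projection and draw HUD elements straight in pixel coordinates.

      /* Draw a 2D overlay (here, a crosshair) on top of a 3D scene in
         fixed-function OpenGL.  win_w/win_h are the window dimensions. */
      #include <GL/gl.h>

      void draw_hud(int win_w, int win_h)
      {
          glMatrixMode(GL_PROJECTION);
          glPushMatrix();
          glLoadIdentity();
          glOrtho(0.0, win_w, win_h, 0.0, -1.0, 1.0); /* y grows downward */

          glMatrixMode(GL_MODELVIEW);
          glPushMatrix();
          glLoadIdentity();
          glDisable(GL_DEPTH_TEST);                   /* always on top */

          glBegin(GL_LINES);                          /* simple crosshair */
          glVertex2f(win_w / 2.0f - 8, win_h / 2.0f);
          glVertex2f(win_w / 2.0f + 8, win_h / 2.0f);
          glVertex2f(win_w / 2.0f, win_h / 2.0f - 8);
          glVertex2f(win_w / 2.0f, win_h / 2.0f + 8);
          glEnd();

          glEnable(GL_DEPTH_TEST);                    /* restore 3D state */
          glPopMatrix();
          glMatrixMode(GL_PROJECTION);
          glPopMatrix();
          glMatrixMode(GL_MODELVIEW);
      }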

    I firmly believe that OpenGL is the superior API. Yes, I respect everything else (Except DirectMedia, which chokes hard in most situations) included under the DirectX umbrella, most especially DirectSound (Though you wouldn't need that so much if Microsoft's normal sound driver architecture supported multiple mixed streams) and DirectInput (which is the biggest blessing to people who make game controllers, ever.) But Direct3D still sucks.

  • I read Anandtech's review this morning, and I must say:

    Smooookin!

    But will I buy it? No! $400-$500 for a GRAPHICS CARD is a bit much. I know it can run OpenGL, but on my system it would do me little good until I go to XFree86 4.0.1, and I won't do that yet as my fave distro [debian.org] doesn't carry it in any form but alpha. The nVidia drivers from their site are rotten as far as I'm concerned.

    The really great news is this bugger matches the Radeon with the GeForce's 2xFSAA turned on! That's an impressive run for the chip.

    But it's still too damn much :(

  • That's fine with me. I'm not particularly happy with 3DFx. The Voodoo3 series was decidedly low quality, they really don't give a hoot about OpenGL beyond the minimum required to run Quake III, and their Windows drivers aren't always polished. Matrox is a good company, and they care about OpenGL, but their drivers are less than polished. Neither company offers the power and polish of NVIDIA, whose drivers are stable, fast, and pro-quality. I'm pissed that they don't support BeOS (although if you use BeOS, check out BeNews today; apparently BeBits is involved in a top-secret project to bring GeForce drivers to BeOS).

    However, in the end, it's about what's more important to you. I prefer getting the ultimate in performance, and don't mind rebooting to do it. You do mind rebooting, and I can understand that.
  • God, I really hate Thresh's Firingsquad and their moronic articles. I've never seen so much padding, so much pointless waffle, so spread out over so many pages so that they can get more hits on their banners. Plus, because they use bloody tables all over the place, the pages load extremely slowly on my Solaris Netscape. Argh. Avoid like the plague!

    Was that too off-topic? Sorry, but I just had to get this off my chest.
  • I don't think they can. MS is a tiny company compared to Sony.
  • There WAS competition, the RivaTNT. That offered about the same performance as a Voodoo-whatever. Remember, this isn't NVIDIA's next main chip; it's the dual-Voodoo2-SLI of the gaming world. As the benchmarks show, it really does kick ass. It means the difference between playable and unplayable frame rates at 1600x1200. People will ALWAYS pay for that.
  • by be-fan ( 61476 ) on Tuesday August 15, 2000 @10:32AM (#853354)
    I don't know why there are so many people against NVIDIA. It seems that every time news about NVIDIA comes out, people go out of their way to lambaste them. They release OpenGL drivers for Linux, people bitch about them not being Open Source. It is found that they used a small amount of GPL code, people act as if they closed up the Linux kernel and released their own OS. They do some work for Microsoft, people act as if they are in a secret plot with MS to take over 3D, switch everyone to D3D, and help MS steal GPL code to boot.

    Cool off. NVIDIA is a company that has a lot of class. Not only do they make quality products, but they go out of their way to make the user experience better. For example, they continually improve their drivers. Even though the current Detonator 2 drivers are already really high quality, the GeForce2 Ultra comes with the Detonator 3 drivers, which increase speed by another 10-15%. Best of all, you can use these new drivers on all cards dating back to the original TNT. Most companies don't even keep drivers for older cards on their website, much less continually improve them. The performance of the TNT must have improved 30 or 40% from the original 3.x drivers, and those weren't exactly shabby. Whereas Matrox gets major props for doing OpenGL (badly at that), nobody ever points out that NVIDIA was a pioneer in doing OpenGL on consumer cards. Back when everybody was doing "mini-GL" drivers, NVIDIA put a full OpenGL ICD in the box with the TNT. And not just any ICD either; NVIDIA's ICD has full support for everything from Quake to Softimage. Meanwhile, Matrox STILL doesn't have a full ICD. ATI's drivers are still flaky. Yet everyone is saying "Is there an alternative to NVIDIA?" Hello, this company releases fast products, excellent drivers with great OpenGL support, and goes out of its way to support older users. In a market where good companies like Diamond have disappeared, and bad companies like ATI abound, NVIDIA really does deserve some credit.
