3dfx/Gigapixel: Where Did it Go Wrong?

nvidia3dfxatibobby writes "According to this interview & 3dfx tribute, the last gamble that brought the curtain down on 3dfx Interactive was buying Gigapixel and then betting on booking a spot in Microsoft's Xbox. Of course since then, NVIDIA booked their spot in the Xbox and 3dfx were left in the dust. The interview also looks at the perspective of a former 3dfx employee who now works for NVIDIA, and hence speaks for NVIDIA."
  • by Anonymous Coward

    nvidia is in an extremely good position nowadays. they produce the finest 3d graphics solutions on the market, they absolutely destroy SGI for 3d modeling and rendering and will probably own the entire movie special effects and 3d simulation markets shortly.

    SGI cannot compete with PCs these days. PC 3d graphics smoke SGI graphics... even a TNT2 can utterly run circles around an Onyx2. Now with linux people don't have to use NT, and linux is far more reliable and scalable than IRIX is, thus another nail in SGI's coffin.

    linux and nvidia is most likely the next decade's 3d powerhouse. watch for it in movie effects, military installations and everywhere else open source innovation dominates.

  • If 3dfx was serious about selling to OEMs, then why did they produce a graphics card that required an additional power supply FFS?

    (FFS? I'm not familiar with that acronym)

    Short answer: That particular card was obviously not targeted towards OEMs. :)

    Long answer: Way, way back when the Voodoo4/5 products were first conceived by marketing, the Voodoo4 (single chip, & no external power supply) would have done very well with OEMs-- had we delivered it on time. As I'm sure you know, Voodoo4 was very, very late. By the time it hit the shelves, it was boring; for pretty much the same price we were asking, OEMs could get a leaner, more potent solution from our competitor. And as you already pointed out, OEMs are not too inclined to go with high-cost, margin-reducing SKUs like our Voodoo5 series.

    In the end, we had to focus on promoting our Voodoo5 series in the retail channels, just to stay afloat long enough to get our next product released. But that was not our original intended focus for Napalm. Our next architecture, Rampage, had just gotten back from the fab a few short weeks before the company went belly-up. Software developers only had Rampage boards a few days before the company became insolvent. I don't know the exact specs-- and now no one will ever know if this was really the case-- but it was alleged to be as potent as NVIDIA's GeForce2 series. (FYI, Rampage was also very, very late. Rampage was at least 2 years in the making. I cringe when I think of how things might have turned out, if Rampage had stayed on schedule.)

  • 3dfx's biggest mistake was buying STB, thereby alienating their very lucrative partnerships with Diamond and Creative... They decided to sell the boards, not just the chips, and allowed nVidia to eat their lunch.

    ...and I lost a buttload of cash on TDFX because of it :^(

  • Well, no, there won't be an end to the graphics card silliness. Just look over the recent /. headlines on display technologies. Just today someone or other announced they have found a way to pack 10000 'pixels' per sq inch. Now expand that to a 19 inch display surface, and you have a lot of pixels. This means more detailed scenes, higher fill rates, everything suddenly needs to be an order of magnitude more accurate and sharp to look good on these displays.
  • Well, it will be photorealism one day.
    Somebody in an article which I don't remember claimed it would be about 20-30 years to that day.
    They were thinking about Moore's law and the speed of graphics development, and finally they basically cut the number they got in half and said something like that. It looked rather convincing.

    So don't you worry about the end of the road right now.
  • I don't think they were stupid either, just arrogant; in my opinion, their big problem was that they started believing their own marketing.

    That, though, seems too simple to be the complete answer.

    I believe 3dfx claimed "we don't need 32-bit textures", "we don't need large textures/AGP" and "we don't need T&L" precisely because they were having trouble with new processor design. Hence, they had to stick to the old ones and you see the result.

    I have never heard about 3dfx and design difficulties, of course. But I haven't heard anything denying that either, so it could very well be the reason.

    Flavio
  • Like it or not, the video card fanboys need to remember that they are a very small segment of the market. Uncle Joe down the street doesn't care whether he has the fastest card on the street - he just chooses what's presented to him by Dell, Gateway etc.

    3dfx's inability to follow OEM product cycles with fresh product in a reliable manner doomed them.

  • yes, the Gigapixel IP is nice, but what about the SEC?

    tile-based is nice, but I think that VideoLogic will have the best tile-based tech in town for a while yet!

    oh, and in terms of IP, 3Dlabs don't do badly!

    regards

    john jones
    (a deltic so please dont moan about spelling but the content)
  • after 16 million colours.. what real benefit would there be?

    That's been answered several times over, now, but the idea here is that although your eye might not be able to distinguish that many colors, rendering an image with > 32bpp can increase the image clarity (especially when doing multiple passes for effects like pixel shading). It's the same idea as high framerates. 60-75fps would be ideal, but to get that in the average case, you generally need to be able to push 120fps or more max.

    after we get real time life like images what more benefit could there be?? (making them play faster would detract not enhance the experience)

    "Faster" in this context should actually be "smoother", as in you're not decreasing the playing time, but you're decreasing the time between "frames" (where by "frame" I mean a slice of the rendering at a specific point in time, rather than something like a movie still. Think of a movie that would normally run at 24fps, but with all the tweens filled in running at an actual framerate of something more like 60fps).

    also the monitors can only get so big before people's homes don't have the space for them

    Think "Flat Panel LCD" and "wall". People have walls, and before you know it, that's where they'll be hanging their 100" diagonal LCD monitors. As well, increasing the dpi on a monitor has the effect of increasing the resolution (where "resolution" is an incorrect term, as real "resolution" is the same as the dpi, but is meant to mean "number of pixels displayed"). Maybe your little 15" monitor can only do 1024x768 at 60dpi or something like that, but when you have a 15" LCD with a 200dpi, you're going to be able to put more pixels on the screen.

    although at some point maybe the game designers will think "hmmm maybe i should concentrate on my content rather than how flashy i can make this finish"... wouldn't that be novel

    And with the help of GPUs, game designers can have both content and flash, because all the flashy stuff is essentially done for them already (at which point, they set the artists to work, and deal with their own interesting problem areas, like AI). Look at some of the current crop of games, for instance. Giants is excellent, content-wise (it's got some crazy humorous elements to it), and yet is quite pretty. Rune takes advantage of the hardware T&L in all the new video boards, and yet has an interesting story behind it as well (although being a Viking-based game, it has a goodly amount of blood and guts and gore). Sacrificing visuals for content is nearly as bad as sacrificing content for visuals. Most don't realize that, however, because it's rare that game designers will make that trade-off. The goal is to have great content, great visuals, and great gameplay. With GPU advancements, the visuals almost take care of themselves, leaving more time and resources for the other areas.

  • The Charisma (what ATI call their design) engine should be faster than NVidia's, as they compress things in hardware. All tests indicate, however, that this is not the case, due to poor drivers.

    Poor drivers aren't the only reason the Radeon isn't faster than nVidia's offerings. The GeForce line (especially the GeForce 2 GTS and GeForce 2 GTS Ultra) are extreme powerhouses. They have power down to the raw metal. Radeon's speed comes from fancy bit twiddling which may or may not (but usually does) make a difference in speed. Sharky Extreme has a good review [sharkyextreme.com] of the Radeon, comparing it against the GeForce 2 standard. The interesting thing to note is that when they turned off all the fancy performance-enhancing stuff (ie, ran metal to metal against a GeForce 2), the Radeon suffered horribly. Sure, drivers can fix this to a certain extent, and ATI does have a bad reputation for their drivers, but without the hardware backing it, there's only so much fancy software-based performance enhancement that can be done.

  • Fantastic. Now we have cards like the SoundBlaster 64 taking over, instead of the fantastic sound quality and effects the GF1 (GUS) could produce.
  • The only one I can think of is price.

    When I read this, I immediately thought about CD-ROMs. I remember way back when a CD-ROM drive was a real purchase that you thought about and maybe saved for. How long can it be before graphics cards completely overtake mainstream display capabilities and when designing a new system you say "...and throw in one of those $40 graphics cards, it's more than my $400 display can handle anyway"?

    -B
  • 1994: "Accelerated graphics? I guess that's kind of neat, but who wants to spend $250 for a graphics card that you'll only use to play games?"

    1998: "Did you upgrade to the new Voodoo card yet? No? Get with the times, luser!"

    2001: "3Dfwho?"

    Who woulda thunk it?

    --
    Ernest MacDougal Campbell III / NIC Handle: EMC3
  • Neon is out of the gamut of emitted light - as long as you can see your monitor in the dark, it will never display neons. However, it's quite likely that a 64bpp scanner/filesystem combo could pick it up, you just couldn't see it in photoshop :)

    Here's a question - Can neon be printed in the cmyk gamut? It seems to be quite small compared to what's capable of reflecting light.... Perhaps a better system of printing should be invented?




    --Gfunk
  • King oath poor driver quality... I was beside myself with joy when I discovered that I could change resolution in quake 2 or even (heaven forbid) alt-tab out of it without hanging my box when I switched to nvidia a couple of years ago.

    And don't tell me it's changed - I got the newest drivers for my brother's voodoo, and they still can't handle disconnecting from the net while playing a game; the popup window brings down the whole computer because it's essentially an alt-tab.


    --Gfunk
  • Poor driver quality is all I've ever seen from 3dfx.

    It has *never* failed that the people I dealt with who had 3dfx cards (back in the Voodoo2 days at least) had constant stability problems. I'm not talking Linux but Win9x. They were a mess... didn't matter if it was games or CAD work.

    nVidia had far more stable drivers for everyone I talked to or worked with. Those who decided to replace their 3dfx cards with nVidia cards were without exception much happier.

    If your experience was different, then that's great.
  • I do get your point; I was just addressing the jab at the importance of MIDI that you took. (BTW, software synthesizers like QuickTime, DirectMusic, and the Yamaha softsynths are all too CPU-intensive, and have latency problems.)

    However, on second examination, I think you and FFFish both actually underestimated the work that people are doing in the field of PC audio in general, 3D audio for games and DVD playback in specific. Sure, Aureal's A3D died a sad death, but DirectSound3D is still around, EAX still exists for some reason or another, QSound is being resurrected by the Thunderbird Avenger (love that name) processor in Philips' new Acoustic Edge (awesome 5.1 card with Yamaha XG MIDI softsynth), and Sensaura is actually doing stuff that's finally making A3D sound a bit dated.

    None of that stuff is best left to the CPU. It's just like unaccelerated software 3D animation rendering on consumer PCs...you can do it, but it's not always very nice to witness.

    Long story short, there will always be a place for dedicated hardware. I don't think PC audio has gone the way of hardware MPEG-1 decoding just yet.

    < tofuhead >

  • Actually, it's not nVidia anymore. I'm too lazy to actually find when they did this, but as I recall, the name was officially changed to NVIDIA a while back. If you don't believe me, check out their website, they write it all in caps wherever it is mentioned.
  • by HMV ( 44906 )
    A lot of people have offered that 3dfx's acquisition of STB was the beginning of the end for the opposite reason you mention: it effectively took them *out* of the OEM market and got them into the mfg business. nVidia, on the other hand, is succeeding exactly by relying on "putting your product into somebody else's" - the deal with Microsoft is just the latest.

    I could be wrong, I don't follow this industry any closer than I have to, but nVidia's position in the OEM market, as well as their slightly better chips, is what has them on top now.
  • > . If you're running through a crowd of 5000 you can't expect the designers to individually render each person

    Shogun does, but they cheat and draw them as sprites.
  • I'm sure 3D engine programmers will find ways to waste any extra processor power.

    ;-)

    Capt. Ron

  • Why doesn't Matrox dominate the market? They're Canadians.
    Why can't creative find its ass with both hands? They're from Singapore.
    Why did 3dfx loose their ass? They're from California.
    Hardware costs money to produce, and its a dog eat dog world.
    Systems are powerful enough. The next frontier will be making them so useful that everyone must have one to survive.
  • How many times did I try to dissuade some clueless newbie from spending his last $99 on some shitty Virge-based card?
    People don't buy capabilities, they buy price.

    The largest part of the video card buying public are mouth breathing morons. You can spend 15 minutes detailing the pros and cons of every good chipset on the market, and they'll still go for the card that's $20 (US) less than the median price.
    Video card buyers are not loyal. If the average idiot buys one $150-200 video card every two years, and half of the shelf space is NVIDIA, NVIDIA gets half the sales.
    If NVIDIA is making money on every board, and 3Dfx is loosing money on every board, it's only a matter of time before the CEO is out the door, and the place is up for sale.
    The amazing thing is how many people anticipated this outcome.
    Anyone with any sense stopped buying 3Dfx when the TNT2 came out.
    People are loosing their enthusiasm. Everyone has spent their toy money. The market is going into a downward spiral. Linux distributors are next, and then Apple.
    2001 is going to be a very bad year. Be forewarned.
  • 16777216 colors is still only 256 levels per primary.

    That means that on a 1024-pixel line you will get 4-pixel bands if you try to display a gradient (1024 pixels / 256 levels = 4 pixels per level). It gets worse if you use mixed colors.
    Think about how that 4-pixel band looks when it gets up on the IMAX screen. Really noticeable.

    Course I don't go to IMAX often.
  • Realtime raytracing awaits. Until you can reproduce reality you'll want to be able to keep throwing more and more power at the problem. The latest batch of games/cards doesn't come close to allowing this.
  • If 3dfx was serious about selling to OEMs, then why did they produce a graphics card that required an additional power supply FFS? As soon as you start asking OEMs to hook up all sorts of extra crap to the hard-drive power supply just to make a graphics card run, you can pretty much kiss their business goodbye.
  • But both ATI and Matrox appear to have dropped the ball. I don't see any truly kick-ass cards coming out of them.
    Hmm... the 64MB DDR Radeon counts as "kick ass" in my books..

    Also, apparently a "Radeon 2" is supposed to be coming out soon. I commonly hear "Twice as fast as a Radeon" which is obviously complete and utter bullshit, but I still bet that thing's gonna be nice.

    -----
  • I dunno, basically tho I can do ALT-TABBING and everything with my 3dfx driver.

    Good for Nvidia if theirs has also improved.
  • Poor driver quality?

    3dfx drivers are among the best drivers I've seen, maybe not as good as the drivers 3dlabs makes, but other than that they've been rock solid, running on whatever chipset I throw them at.

    Wonder where you get this from.
  • Carmack has already done a .plan update where he explains the need for 64bit color.

    It's not because people see banding with 24bits of discrete color. It's because graphics accelerators accumulate too many range-clamping errors once you start doing multiple passes per pixel.

    His proposal, IIRC, was 4 16-bit floating point values for RGBA, for a total of 64 bits. Going from fixed to floating point solves the range problem but introduces its own problems. First off, you have to start doing ordered adds for low-order/low-value accuracy. For instance, if I have a few floating point numbers like this:

    .00000001
    .00000002
    24^2342

    if I add them in the above order, the result will be more correctly represented than if I add them like this:

    24^2342
    .00000001
    .00000002

    this is because floating point numbers are expressed as a mantissa (fixed # of bits) and an exponent (fixed # of bits). With thousands of numbers to add, order of operation actually makes a difference, because you lose the accuracy of the small values once the running total's mantissa is already 30 digits away from them.... If you don't believe me, figure out the smallest representable floating point value (I mean smallest absolute value, not the "most negative"), write a loop that adds it to itself a few billion times, then add a significantly larger number. Now do things the other way around and see if the results are the same :)

    Also, you may or may not be able to correctly represent a given floating point value with a given number of bits. You get fewer of the "compression"-type artifacts on the color space with floating point colors, but you get floating point errors :)
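
    (A minimal sketch of the ordering effect described above, not from the original post; the values are arbitrary, picked so the loss shows up with single-precision floats.)

    // Order-dependent float summation: small values added after a large one are lost.
    #include <cstdio>

    int main() {
        const float big  = 1.0e8f;  // large value
        const float tiny = 1.0f;    // small value, below one rounding step of 'big'

        // Small values first, then the big one:
        float smallFirst = 0.0f;
        for (int i = 0; i < 1000000; ++i) smallFirst += tiny;  // sums exactly to 1,000,000
        smallFirst += big;                                      // 101,000,000

        // Big value first, then the small ones:
        float bigFirst = big;
        for (int i = 0; i < 1000000; ++i) bigFirst += tiny;     // every add rounds back to 1e8

        // Prints 101000000 vs 100000000: a million units lost purely to summation order.
        std::printf("small-first: %.0f\nbig-first:   %.0f\n", smallFirst, bigFirst);
        return 0;
    }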
  • They all wasted their time on 3d graphics, when the market for 4d and 5d will be opening very soon and making many of us rich =)
  • Holy crap! Thanks man. I'll keep an eye on this site. Maybe /. is useful for things even when you stay buried at 1. Cool.

    psxndc

    PS Do you have any of their products? How is their support? etc, etc.

    PPS Tried to find your e-mail and saw your "How to buy DVD's if you hate the MPAA" post. Good idea!

  • Now that 3dfx is sunk, does it make sense to still buy one of their cards? I'd like to set up the whole dual headed monster thing and 3dfx seems to be the only company really supporting PCI still. Should I go out and pick up a Voodoo5 5500 once the price drops a little? I can't give up my TNT2 Ultra just yet in the AGP, and when I do, it'll be to the next line of NVDA's, but if I want to set up dual monitors, is a 3dfx card still a viable, if not the best, PCI solution? Or am I begging for a tech support black hole?

    psxndc

    PS I dual boot into Windows... gotta have my games...

    PPS unless I can start running UT and so forth under linux w/ a Voodoo. :-)

    PPPS Yeah, yeah, I know I can do it with the TNT, but a) I'm a newbie still and b) I have other Linux-related priorities.

  • On the other hand, this is an artifact of how human sight works. If you look at RGB(255,0,0) versus RGB(0,0,255) and RGB(0,255,0), the blue box tends to be the darkest, the red is next, and green is the brightest; therefore it stands to reason you'd be able to distinguish more shades of green than blue, if for no other reason than the perceived intensity difference. To make things worse, intensity itself isn't even perceived linearly by the eye. Even if we were to go to 16 bits per colour, or 48bpp, it's still conceivable that there would be some colours that the human could noticeably perceive but the monitor could not generate. However, for the vast majority of people, 24bpp is 'good enough' and achieves everything they'd want out of a video card. I could, however, see 36bpp becoming an option on some video cards for people who require very accurate colour matching for things like publishing. And 3D accelerators could also use the extra bits, even if they don't display them directly, by preventing lighting from polluting the textures.
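
    (One added reference point, not from the original poster: a commonly used approximation of perceived brightness is the Rec. 601 luma formula, Y = 0.299 R + 0.587 G + 0.114 B, which is one way of seeing why pure green reads far brighter than pure blue even though both channels span the same 0-255 range.)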
  • from what the dreamers have been talking about, and from what has been tried on different levels, I think there are several things 'around the corner' for consumers.

    First, barring the expense and room of multimonitors for the average gamer, I think that multi-monitor will grow. As the monitors get smaller and cheaper (like processors I guess, hmmph) it would be nice to have a multi-monitor setup that gives you different degrees of 360 'wrap-around' viewing (and even overhead). Second, some ol' boy is gonna invent a stable 3D display sometime, and this would be a quantum leap in the rise of processing for its display. And, what about integration? That seems to be the trend now. Gotta have a 200+ fps, DVD decoder, TV tuner, SPDIF out video card... NOW!!!

    I recently had a discussion with a coworker about just this thing. As we discussed, the max fps might be overkill and lost, but it is always justified by its effect on the average and lower end fps. Perhaps a bigger move would be to work on stabilizing the fps... but then again, what do I know? As for the multi-display, at least for games, it would be great to have a home built simulator for the latest flight or space flight sim. I could just look up and around to track the action... and my bogey, without pressing buttons. It would appear as a real cockpit. Perhaps some of those flexible 640x480 displays all around me, with my 21+ incher in the middle.

    Either way, I think there is a large area for growth, not to mention that the photorealism thing is rather far off IMHO. I personally can't wait to have a COTS card that I will plug into my server, sit in the living room with the wife next to me, and both of us turn on our display projectors and frag the hell out of each other. OK, so it will be a while, but I can dream... and the distributed vs centralized thing will always be around I think.

  • They are open, so you can keep writing them, or at least someone who is capable can write them
  • I bet 10GB of RAM will be standard long before 10 years. You could get a lowly Windows machine [unisys.com] that will run 32 CPUs and 64GB of RAM today. I remember 3 years or so ago, a server with 2GB of RAM was just ridiculous and most had 128MB or 256MB. Today 128MB and 256MB desktops are common and a lot of people have more. There are several new technologies that could see production in the next 5 years that make huge leaps forward (if they pan out).
  • True, but that's only for internal precision. As far as the external precision (i.e. the precision of the finished, fully rendered image), going any higher than 8 bits per channel really doesn't get you anything.
  • Technology does not stop.

    Once we reach the practical limit of the two-dimensional display, it will move into some sort of true three-dimensional display. The graphics card will have to worry about rendering objects from all directions at once, virtual light sources, and REAL light sources.

    Display technology has a long road ahead of it.

    Dan
  • At work our 'big' machine is a 3-year-old Onyx Infinite Reality (IR). It has 8 CPUs and two gfx pipes (not all that much seeing how the Onyx 3000 scales to 512 CPUs and 16 pipes). Using OpenGL and the IRIS/OpenGL Performer API we create a variety of (real-time) simulations and demonstrations for our clients. Seems the courtroom 3D exhibit fad has sorta gone away so these days we mostly design interactive presentations and demos for impress-the-investors pitches to deploy either on-site on an Octane2 VPro V8 or in our own "Reality Center" VR theater driven by the Onyx.

    Anyway... with our somewhat dated hardware, most render passes take a couple ms, letting us do at least 5, sometimes 8 passes per frame (at a truly-sustained 60 Hz). With the decent geometry and raster hardware of the IR we are able to use pretty complex models and still have no problems with multiple dynamic lights, a continuous roaming ground texture, and whatever effects are needed to add realism. All of this on a 1280x1024 projector (the second pipe usually handles a monitor with menus and/or "dashboard" controls)... on hardware that was shipping when NVIDIA was selling only the Riva128, ATi the RageII, and 3Dfx the Voodoo1 and Voodoo Rush.

    SGI's current hardware is amazing... and they continue to maintain and update features for the now-replaced Onyx2. Even with some pretty serious work going on, we rarely are using more than 30% of the CPU provided by our 8 MIPS R10000 processors running at 195 MHz. I can't even imagine what would be possible with a maxed out Onyx3000. To say that a "TNT2 can run circles around an Onyx2" is ludicrous. Perhaps in quake, but that would be about it. And yes, SGI and NVIDIA have been working together. Their Intel systems with "VPRO" gfx are based mostly on NVIDIA Quadro/GeForce gfx. In a typical SGI marketing blunder, the Octane2's new gfx is also called "VPRO", even though its V6 and V8 gfx have nothing to do with NVIDIA. (And yet the latest VPRO for Intel is the V7.... jippity!)
  • When I get my bionic eye implants I'll need triple that many colors...
  • I too wonder what will happen to ATI or Matrox. Granted, Matrox does high-end video stuff, and ATI has all their OEM contracts. But for how long? ATI's new Radeon does have a nice hardware design, but as usual, their drivers are total crap. The Charisma (what ATI call their design) engine should be faster than NVidia's, as they compress things in hardware. All tests indicate, however, that this is not the case, due to poor drivers.

    It's too bad mismanagement has forced another good technology company to go under. As any business grad could tell you, there is something known as the core competencies of a company. These are things that a company does best. 3dfx used to be a great design place, then they bought STB and went into manufacturing. By doing this, they strayed outside their core competency. The rest, as they say, is history...

    I just hope that NVidia doesn't get monopolistic on us and become a hardware M$.
  • "lose" != "loose"

    just an fyi.

    eudas
  • Specs are irrelevant. Mostly lies, actually. All consoles pretty much suck compared to a loaded PC. Can't you read at all?
  • do the characters in any given three-dee game (say, everquest) look photorealistic to you? personally i think they look terrible. like cardboard boxes with people painted on them. we are a long way from doing large numbers of quality curved surfaces, photorealistic VR type environments. a long, long, long, long way.
  • Bzzzt! Sorry, and wrong... I couldn't even *afford* to be a gaming geek at the time 3Dfx was king; I used a Number 9 Vision 2-meg card built in 1995 up until 1999, and I remember finding out about the lawsuits they fired up against Nvidia, AFTER I bought an upgrade...

    3Dfx was trying to patent pretty much every single 3D technology out there, and long before all this other IP suit glut appeared in the news... Prior to that, the only people suing on IP infringement, were Sun, Netscape, and a few others... Who were they suing? Microsoft...

    Huh, nobody really complained about that either as I recall... So I guess concern about IP cases only apply to 133t linux geeks, and the remainder can screw themselves...

    As to answer your question of "How is 3Dfx different from any of the hundreds or thousands of companies that're suing other companies for similar reasons?"... The answer is simple:

    People like YOU support them... People like YOU ignore that behavior unless it *gasp* infringes on your particular lifestyle...

    Of course, what do I know? You're probably just some poor hapless idiot who just got back from Xmas shopping, after blowing $299 on a 3Dfx Voodoo 4500, when you saw this headline...;)
  • there will always be higher resolution monitors; I have seen a couple of stories on /. in the last couple of months talking about 200dpi monitors.

    if my monitor (17" Trinitron, 1280x1024) was instantly replaced with a 200dpi one, the graphics card would have to keep up with roughly 2500x2000 or some silly resolution like that. None of the 3d cards on the market even get close to that resolution, and of course I want that at 2000fps (and a monitor refresh rate to match!)

    and what about fullscreen video at that resolution? The pipelines needed for that would have to have the decoder chip on the graphics card.

    what about 200dpi in 3d??? We haven't even started on what kind of graphics card it would take to drive a 3d hologram projector at 8,000,000 dots per cubic inch ;)
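
    (Rough arithmetic behind that figure, added for illustration: a 17-inch 5:4 tube has roughly a 12.5 x 10 inch viewable area (about a 16-inch viewable diagonal), so at 200dpi that works out to about 2500x2000 pixels, versus 1280x1024 today, i.e. nearly four times as many pixels to fill every frame.)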

    bats = bugs
  • Of course since then, NVIDIA booked their spot in the Xbox and 3dfx were left in the dust....

    Along with proper grammar on Slashdot, too.

  • I think you're forgetting that this deal still has to be approved by the shareholders and regulators. It's not a sure bet yet, and I hope the regulators will not approve the sale, given Nvidia's market dominance.

    There has to be a better option that will preserve 3dfx's corporate identity and that may keep it running as a subsidiary of a larger company.

  • All this may be well and true, but in the real world, when companies go out of business:

    people lose jobs

    shareholders lose money

    creditors are lucky to get all of their money back.

    These are not good things. With the death of 3dfx, it brings Nvidia one step closer to having the high-end 3-D card market completely to itself. I don't think anyone wants this.

  • I own it all, I don't wanna go into details and let's just leave it at that... I still wonder, what will happen to all of the 3dfx technology in the arcade? Games like Gauntlet Legends, the entire San Francisco Rush series. Will future iterations of these games be on 3dfx or nvidia hardware? I really really like my voodoo. It just works, and there are a lot of older games that use the glide API that I fire up once in a while (Descent 2, Carmageddon). I worry that glide will go away and if I were ever to lose my Hell I'm too hopped up on Vicodin to finish this post. I think I'm gonna take a nap now. --Toq
  • Neon needs special dyes; cmyk can't print it. In fact cmyk can't even handle all the shades of green rgb can (try drawing in bright green in photoshop and converting to cmyk).
    A while back I heard about a 6-color printing system that had much higher quality, does anyone know if it's still around?
  • SGI and NVidia have been working together for a little while now.

    Ashes of Empires and bodies of kings,
  • Titanic wasn't rendered by a PC. It was rendered by 60+ Alphas. (They were running Linux, though!)

    Ashes of Empires and bodies of kings,
  • yes, but we are talking about something that is limited only by the amount of processing power we can envision rather than physical limitations like the number of colours we can envision... after 16 million colours.. what real benefit would there be? after we get real time life like images what more benefit could there be?? (making them play faster would detract not enhance the experience)
    also the monitors can only get so big before people's homes don't have the space for them
    the main difference with gpus and cpus imho... is that as long as windows and office keep getting more bloated, more powerful cpus will be needed...
    although at some point maybe the game designers will think "hmmm maybe i should concentrate on my content rather than how flashy i can make this finish"... wouldn't that be novel
  • If NVidia becomes near-monopolistic, we'll end up with the same mediocre performance and features as the SoundBlaster.

    I don't understand what you think is missing in soundcards.

    Surely this is like the color-depth thing mentioned earlier; once you are scanning and outputting CD-quality 44KHz stereo, what else is there to do? Most of the preprocessing done by early "kick-ass" soundcards is now redundant as it can be done more easily by software on the main CPU. How important is, say, realistic MIDI now when nearly everything is .mp3 and .wav (or functional equivalent)?

    There is plenty of room for improvement before we have real-time photorealistic rendering of things like human faces and cloth. OTOH some of the VR problems are pretty much fully solved. I would think sound is one of these.

  • Realistic MIDI playback may be getting progressively more irrelevant for some (not me...I'm going to be getting a Hoontech XG card as soon as I have the time), but digital musicians rely on MIDI for producing their tunes.

    Understood, but my point was that this kind of work can be done within the main CPU now instead of by specialty hardware. When your CPU is capable of doing Fourier transforms in realtime, f'cryinoutloud, why do you need specialty hardware? Why not do MIDI in software so you can upgrade without opening the box?

  • Who cares about XBox? That slug?
    The Indrema is 2x what the XBox is, or
    did any of you read the reviews here?

  • I suggest you look at the specs on both of the consoles before you make such an asinine statement. Nobody will buy the Xbox because it's a slug. It has no mp3 support either. Oh, it comes with a modem, wow. I'm on cable, I won't settle for a modem. Indrema comes with a 10/100 NIC. That's what I want to see. Not 20-year-old outdated analog technology.

  • You're right about the PC. My Compaq running Linux 2.2.18 makes the PS2 look like shit.

    But yes, the Xbox does look bad compared to the Indrema.
    If you want a slug, get M$'s, which will probably wind up crashing just like their WinCE; or get something fast like the Indrema, which is a console you can actually improve if you don't like it. Plus you get more hard disk space if you want it.

  • Just look at sound cards, for example. Several years ago, they were a hot technology, but have they now reached their end of the road? Many have become part of the motherboard, and as long as they provide decent surround sound, etc, they are basically a commodity, like RAM.
  • The dynamic range of 24-bit graphics isn't nearly as wide as what we can see. For example: what color, in 24-bit RGB, is a fluorescent highlighter pen? Or the difference in 24-bit RGB color between treetops and sky at 11:20PM on a night where you can just barely see the treetops? (Less than 1 level difference?) And this kind of subtle color may sound silly, but could be put to excellent use in a 3D shooter on a wall-sized screen: menacing shapes moving in the sky, etc.
  • Folks,

    In all the griping back and forth about why 3dfx failed and why Aureal failed against Creative Labs, there's one reason that nobody has discussed: the integration of reasonably high-quality video AND audio into the motherboard chipset.

    Sure, we all laughed when Intel introduced the original i810 motherboard chipset, but by the time we got to the i815e chipset both graphics and sound quality were more than good enough for the majority of desktop computer users. Anyone who's tried a motherboard that uses the i815e chipset knows that graphics quality--2D and 3D--is not bad at all, and the i815e's sound card function is quite good with built-in wavetable MIDI sound.

    The i815e is how many companies can produce very good systems at quite low cost. For example, the Micron PC RS2100 series computers sold at Best Buy use a Celeron 700 MHz CPU on an i815e chipset motherboard, a pretty good combination for the majority of end users out there.

    Think about it: how many people out there REALLY need a high-end graphics card and a high-end sound card outside of hardcore gamers? I don't think there's much demand outside of the hardcore gamer crowd, that's for sure.
  • An example is that we used to judge gaming computers by how many colours they can display, be it 8, 256, or 65536. But once we reached 16 million, there wasn't any further useful improvement that could be made.

    64-bit color (16-bits per channel) will definitely be necessary for complicated pixel shaders. The more stages in the pipe, the more precise your internal representation must be to avoid a final result that looks washed-out.
  • IMHO they went for raw speed over features.

    Definitely. They not only neglected 32 bit color like you said, but also neglected AGP in favor of PCI and started using double/quad processor/card configurations instead of actually developing ONE card with all these features built into one, non-redundant chipset.

    Of course there are features like motion blur and the T-Buffer, but I believe they are (a) too novel for their time; (b) slow; (c) restricted to 3dfx cards, making designers careful when to implement them; (d) a late and improper solution to their problems...

    But then why did this happen? They aren't stupid.

    The XBox deal with NVIDIA was as lethal to them as EIDOS' investment in Daikatana was lethal to Looking Glass Studios. 3dfx and Looking Glass were already in trouble when these events happened; they were only the nail in the coffin.

    3dfx probably had some serious design issues, kind of like what Intel had with RAMBUS. 3dfx just doesn't have the leverage that Intel does to stay in control.

    Flavio
  • www.evga.com [evga.com] has PCI nvidia cards. Their PCI GeForce2 MX is product number 032-P1-NV29-01. You can look it up by going "products" from the sidebar, and getting a list of all the nvidia cards by selecting the manufacturer and clicking "submit". Hope this helps...
  • "640k ought to be enough for anyone."

    Who knows. It certainly seems that the end of the road for 3d technology should be nigh. Especially with today's displays.

  • Big mistakes by 3dfx:
    - Not supporting OpenGL well
    - Poor driver quality
    - Buying STB and deciding to stop providing chips to OEMs (that killed them... lost the industry support)

    We can thank them for helping make consumer 3D popular but I for one am glad they are dead.

    Brian Macy
  • How important is, say, realistic MIDI now when nearly everything is .mp3 and .wav (or functional equivalent)?

    Realistic MIDI playback may be getting progressively more irrelevant for some (not me... I'm going to be getting a Hoontech XG card as soon as I have the time), but digital musicians rely on MIDI for producing their tunes. Where do you think a lot of your MP3s and WAVs come from? Not everything you hear is a live recording of a live performance, although with some good soundbanks you might be fooled.

    < tofuhead >

  • The truth of the matter is, 3Dfx was, for a goodly 2 years, one of the first big name companies to start the IP onslaught...

    You're trying very hard to make this article fit your agenda, aren't you? How is 3dfx different from any of the hundreds or thousands of companies who are suing other companies for similar reasons? The difference is that you *heard* about 3dfx, because you are a game/PC-hardware geek. Just because you happen to know about a particular case doesn't make it extraordinary.
  • > Surely once we are in the trillions of polygons per second (at the present rate, soon, probably) and 3d graphics offer photorealism

    We'll NEVER have enough polygons. The world is just too damn complex to model accurately.

    I want 100K polygons PER TREE!* and I want 10,000 visible onscreen at once! With 120 fps. AND that is JUST for the scenery, nevermind the 1,000 people onscreen each with 500K polys!

    * Most games just slap 2 quads together at right angles, and call it a tree. Yuck!

    How do you accurately model nebulous volumes like fog and smoke? Today we cheat with billboarded sprites and volumetric hacks.

    Take a look at any good outdoor terrain used in games. How come almost every game has a far clip plane that (even when pushed out as far as it will go) is STILL _relatively_ close?!

    Get on top of a mountain and tell me how far you can see. In real life you can see for MILES. We just don't have the fill rate with today's cards to draw more than, what, 100m of the game world.
  • If you've ever tried multipass blending, you quickly realize that each blend operation lowers the effective bit-depth. Having EACH color component be 16 bits (i.e. 16-bit RGBA = 64-bit color) allows for a TON of blending operations without affecting the image quality.

    Quick example: Take a look at 3Dfx video cards with alpha-blended smoke. It looks dithered, and the image quality is bad ;-( due to only 16 bits. (That's 4444 color.) (Yeah, 3Dfx hardware was an effective 22 bits, but it still wasn't enough bits for proper blending ;-)

    8888 provides 256 gradients for a primary color, but not 256 gradients for secondary colors and non-primary colors!

    But I'm just a 3D graphics programmer ... *shrugs*
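
    (A minimal sketch, not from the original post, of the effect being described: a blend done in 8 bits per channel throws away low-order bits each pass, so nearby shades collapse to the same level and you get the dithering/banding mentioned above. The blend factor and starting values are arbitrary.)

    // Repeated 50% blends toward black: 8-bit fixed point vs. full precision.
    #include <cstdio>

    // One 50% blend toward black, done in 8-bit fixed point (truncating).
    int blend8(int c) { return (c * 128) / 256; }

    int main() {
        int a = 201, b = 198;          // two nearby shades of one channel
        double fa = 201.0, fb = 198.0; // the same shades kept at full precision

        for (int pass = 0; pass < 6; ++pass) {
            a = blend8(a);   b = blend8(b);
            fa *= 0.5;       fb *= 0.5;
        }
        // The 8-bit values have collapsed to the same level (both 3),
        // while the full-precision values are still distinct shades.
        std::printf("8-bit: %d vs %d   full precision: %.3f vs %.3f\n", a, b, fa, fb);
        return 0;
    }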
  • Once we have the holodeck. Then this industry is done. Might be a few years.
  • While what you're saying seems to make sense, those that used some old 286 would have thought the same thing. I mean, how fast can you use WordPerfect 5.1? :-) There are lots of things missing from current 3D standards that keep things from being photorealistic, and those things take *tons* of power.

    1. First, in one of John Carmack's .plans (IIRC finger johnc@idsoftware.com), he mentioned 64-bit color. Since most people can't see more than 32-bit color, it seems like a waste to use so many colors, but it helps things blend from one pixel to another and look cleaner (so I've heard).

    2. Lighting. Lighting is very CPU intensive, and currently the OpenGL spec supports only 8 lights (IIRC, the spec says "at least 8 lights", but I don't know anyone who supports more than that). Also, those lights are all "point" lights, which emanate from a single point, as opposed to true light sources, which actually light the room from more than just a single point (especially those fluorescent lights that you find at work). See the small OpenGL sketch after this list.

    3. Neon colors. Currently, it's impossible to display neon colors on an RGB screen. I don't know if it's mathematically impossible or what, but I know if you scan in a picture of your friend in his "hot-pink" pants (besides knowing he's gay :-) ), it won't show up properly on the screen.

    4. More polygons. Remember how impressed you were the first time you played Doom? There were all of those creepy dead soldiers impaled by spears onto the beams in each room. When Quake 1 was released, it didn't have anything like that. Since Quake 1 was designed for a software renderer (IIRC), id didn't have the CPU horsepower to put anything like that in. Hopefully, with Doom 3, we will see those impaled marines once again :-)

    5. Shading. Currently, OpenGL supports a mode called smooth shading (I'm certain DirectX does, too, but I haven't used it much). It is known as "Gouraud shading", and it interpolates colors across each polygon from its vertices so things look cleaner. There is another model called "Phong shading", which computes the lighting per pixel instead. I've read lots of things about why that can't be done in real time yet, but if we have the power, maybe it can look smoother.

    6. Textures, Bump Maps, etc. In real life, everything has unique textures feeling to it. The more textures you can use to describe an object, the more realistic it will feel.

    7. Stereo. This has to be one of the coolest things that will happen someday. For those that can see stereo images (only about 10% of the population can't), when a game/demo/whatever is designed with stereo in mind, it will draw the scene twice, once per eye with a slight offset, which makes the scenes truly look 3D. Some video card companies have released glasses that mimic this effect, but from what I've heard it's too slow to be useful.

    I think that we will always find more cool things to do with 3D hardware in the future!
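
    (A small, hypothetical OpenGL 1.x sketch to go with point 2 above, not from the original post. It assumes a current GL context and the old fixed-function pipeline; GL_MAX_LIGHTS is the "at least 8" limit the spec guarantees.)

    // Query the fixed-function light limit and enable one positional light.
    #include <GL/gl.h>
    #include <cstdio>

    void setupOneLight() {
        GLint maxLights = 0;
        glGetIntegerv(GL_MAX_LIGHTS, &maxLights);   // at least 8 per the spec
        std::printf("this implementation supports %d lights\n", (int)maxLights);

        const GLfloat pos[4]     = { 1.0f, 2.0f, 3.0f, 1.0f }; // w = 1 -> positional light
        const GLfloat diffuse[4] = { 1.0f, 1.0f, 1.0f, 1.0f };
        glLightfv(GL_LIGHT0, GL_POSITION, pos);
        glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuse);
        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);
    }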
  • But it has recently lost Apple's contracts

    When did this happen? It's been a rumor for quite some time, but I haven't seen any news sites that have mentioned this. Plus, I'd assume that nVidia would be smart enough not to announce it before a MacWorld!
  • You knew how to party.... Those were the days and we salute you!

    Glad to see some serious journalism going on here.

  • If you look at the downfall of Infocom piece that was covered a couple of days ago you can see the connection. Gambling your entire company's future on having a few choice things go your way is tricky. Although, I personally think 3dfx killed itself the minute it decided to go into the video card business for itself. In general, when you have a large chunk of market share you don't have to make risky decisions. Throw some change at R&D to keep the FPS crown and don't make any business decision that would tank your company if it went drastically wrong. In fact you should set it up so that the company would still survive even if you totally horked three or four deals.

    But that's just my $.02, and I'm sure the upper end of 3dfx are making out like bandits on the deal. The real losers are going to be the people who owned older 3dfx cards and now are going to find that the video drivers are never going to get updated.
  • I can understand that the demand for increased processing power for CPUs will probably never be sated, but is this true for graphics cards? Surely once we are in the trillions of polygons per second (at the present rate, soon, probably) and 3d graphics offer photorealism, will there still be a need for better graphics?

    Remember 10-15 years ago (or whenever it was), when Bill Gates made his infamous comment that "640k should be enough for anybody"? Just as computer technology keeps evolving faster than we could imagine, I think there's a good chance that graphics technology will follow suit.

    ---
    "Fdisk format reinstall, doo dah doo dah,
  • One of the big reasons for having powerful graphics cards is to relieve some of the work-load of the CPU. With the current crop of games, graphics cards work well, but what happens when developers start putting out "virtual reality" games. I'm talking about put on a helmet, walk yourself around in a full 3-d world games. If and when games like that ever come around, powerful graphics cards will likely be needed to share the workload.
  • IMHO they went for raw speed over features. Sure it could push *many* polygons but really nice things like 32-bit color got left behind. I think this was a big reason why Nvidia won.
  • ...16 bits per gun...

    Why? You're talking about 2^48 colors, but in general humans can't distinguish more than ten million or so (between 2^23 & 2^24), even under excellent lighting conditions. Won't there be trillions of wasted colors?

  • The dynamic range of 24-bit graphics isn't nearly as wide as what we can see.

    From my dog-eared copy of the "Guinness Book of World Records" (1984 edition):

    The unaided human eye, under the best possible viewing conditions, comparing large areas of color, in good illumination, using both eyes, can distinguish 10,000,000 different color surfaces.

    Unless you're a freak of nature [slashdot.org], your eyes only respond to red, green, and blue. Assuming (possibly incorrectly) that each cone type is equally sensitive, each cone responds to roughly 215 color levels, which is easily covered by eight bits per sample. My inability to give exact answers to your vague examples notwithstanding, if there are any colors which 24-bit RGB cannot display, it is an inherent limitation of the RGB scheme, and throwing more bits at it won't solve anything.

    That said, I will accept the argument that more bits will reduce rounding errors [slashdot.org].
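
    (Spelling out the arithmetic behind the 215 figure above, as an aside: 10,000,000 distinguishable colors split evenly across three channels is 10,000,000^(1/3), roughly 215 levels per channel, and 2^8 = 256 > 215, which is where the "eight bits per sample" figure comes from.)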

  • After looking at this article, I realized that 3Dfx was doomed to meet this demise for some time. They've suffered from a real-world case of the "Dilbert [dilbert.com]" principle. Hedging the success of a business such as 3Dfx on the odds of being able to put your product into somebody else's is very bad business, and a sign of poor management. It's really a pity that this has happened, as 3Dfx made some damned good products (I think the NVidia chips were a little better, but it's still a pity).

    Nevertheless, it's done and over. 3 cheers to a once-great company destroyed by poor management! 3 cheers to a once-great country being destroyed by poor management!

    It's all about the Karma Points, baybee...
    Moderators: Read from the bottom up!

  • Unless major conceptual redesigning takes place in the graphics acceleration field, there won't be an upper limit to performance. There will always be the demand for 'more' pixels and 'more' polygons. Even once we hit photorealistic real-time imagery that fools the human eye, there will still be a need for increasing GPU power because there is no limit to scene complexity. Example: fast forward to 2010; Nvidia has the GeForce 42 with 2 gigs of onboard memory and an integrated air-conditioner. The card has enough boom to render an entire feature-length movie starring Bruce Willis and Spamela Anderson in 4-zillion polygons each. Well, what if someone wanted to render a whole crowd of virtual hi-rez humans in one scene? That person would need even more crunching power. Lather, rinse, repeat. There just isn't any expected end to the polygon race.
  • Errm, no, not really. If you're talking about decent sound cards then there are entire music studios on a sound card, things like the Terratec EWS XXL with an onboard Microwave synthesizer, the Event series and so on. They're no more at the end of the road for people looking for performance than graphics card are.

  • Remember 10-15 years ago (or whenever it was), when Bill Gates made his infamous comment that "640k should be enough for anybody"? Just as computer technology keeps evolving faster than we could imagine, I think there's a good chance that graphics technology will follow suit.

    But where will it go?

    Once graphics cards get to the point where they are doing 16 million colors (or more) with every possible effect known to nature and Photoshop, what's left?

    I mean, after a certain point frame rate will cease to matter, and even the Quake 'twitchers' will be able to frag people with all the effects turned on.
  • Jeez, a leader in chip design falls behind the roadmap of their upstart competitor, their marketing dept gets ahead of their R&D dept, and then their competing products are released almost a generation behind their upstart competitors so they bank on a tech that never takes off. That sure sounds like a recipe for disaster to me!

    No I am NOT talking about Intel, that would be offtopic!


    "Me Ted"
  • First they close their source, then they start shouting at us.

    What more must we suffer from them?

  • It's nVidia. Quit shouting it please. :-)
  • What about high detail outdoor landscapes? Real-time fractal generation for high detail plants and trees.

    The only reason you think we are close is that game designers are good at hiding the hardware limits. How far can you see in most FPSes? 50 meters? Have you ever seen a convincing forest? Or a real-time magnifying glass?

    As game detail increases, video cards will be expected to provide more helper functions as well. If you're running through a crowd of 5000 you can't expect the designers to individually render each person (internals too, so they can explode properly). Cards and APIs will have to have commands like:

    world.add(new person(MDL_BLOND_GIRL_HOT, "5'6", ACTN_STANDINGAROUND));

    There might be some eventual upper limit, but we're nowhere near it.
  • by Anonymous Coward on Thursday December 21, 2000 @11:52AM (#544500)
    Um, no ;). The reason no more colors are necessary is that humans can't distinguish any more colors than that. Most people would be hard pressed to tell the difference between the RGB triplets (255, 128, 2) and (255, 128, 3) on a computer screen.

    The detail currently achieved by 3d accelerator cards is good, but nowhere near "real". There is also a huge difference in realism between one generation of graphics cards and the next (not the case with adding color depth).

    Let's look at a few examples of just how wimpy today's cards are, compared to what we would like them to be.

    How many polys do you need for a good tree? Here's an order of magnitude example: You might need 10 polygons per leaf times 10,000 leaves plus another 100,000 for the trunk if you want to get the bark right. That's 200 thousand polygons. If you want 30 frames per second, that's 6 million polygons per second. Of course you'll also want them to be lit and shaded. Most of today's graphics cards would be hard pressed to maintain this, so trees in games at this point are usually done using billboards... a couple of textured polygons. Even given this, when was the last time you saw a good dense forest in a video game? We need another factor of 1000 in power at least, just to have a forest.

    Today's cards use a lighting model called Gouraud. It is a smooth vertex shading model. Unfortunately, it doesn't do shadows, reflections, per-pixel lighting, or any of the other things you see every day in real life. Stepping up to a new lighting model (such as raytracing) is beyond today's graphics cards.

    Ever notice that even though the cards render in 3d, you have to describe the world for them in 2d pieces? Polygons and texture maps are planar. Only recently have we been able to get cards powerful enough (consumer level like Radeon) to do some simple 3d texture mapping. This ability to represent volumetric data is another thing that we need so that we can do good fog/ghosts/fire/smoke/explosions and a hundred other things without trying to fake them with an animation played on top of a polygon. Think about how much memory this might take too. If I have a big texture map, say 512x512 at 32 bit color, that is a one megabyte texture. What if I want a texture that is 512x512x512? Suddenly I need 512 MB of texture memory. How close are we to this on today's graphics cards? The best have 64 MB.

    Don't forget that these examples are for simplistic things like trees and fire that we can't even do right yet! Perhaps you are right that eventually we will reach a point of visual perfection where no further improvements are needed, but we are far, far from it right now.

  • by Squid ( 3420 ) on Thursday December 21, 2000 @12:05PM (#544501) Homepage
    Stereoscopic imagery. Film resolutions (4096x3072 and such). Multihead cards. 64 bpp, maybe even 128bpp - 16 bits per gun, the rest for highly elaborate alpha and reflectivity tricks. Raytracing in realtime. There's plenty of room for expansion here.

    And never, ever ignore the possibility of the Next Cool Thing coming along.
  • I can understand that the demand for increased processing power for CPUs will probably never be sated, but is this true for graphics cards? Surely once we are in the trillions of polygons per second (at the present rate, soon, probably) and 3d graphics offer photorealism, will there still be a need for better graphics? I would have thought that in 5 or 10 years, graphics card technology will have gotten as good as it can usefully get. An example is that we used to judge gaming computers by how many colours they can display, be it 8, 256, or 65536. But once we reached 16 million, there wasn't any further useful improvement that could be made.

    Physics is starting to be integrated in graphics hardware.

    Accurately resolving collisions between several dozen rigid bodies in real time is a tax on even the most advanced hardware. Most games fudge physics horribly, and real physics just plain doesn't happen in real time for any kind of a complex scene with numerous non-uniform, variable elasticity and independently moving objects.

    So long as graphics cards continue to take on real time physics, you can be sure that graphic simulation hardware has barely entered its infancy.

  • by blazer1024 ( 72405 ) on Thursday December 21, 2000 @11:35AM (#544503)
    When 2D cards basically reached the limit of their usefulness (I'm sure there was room for additional improvement, but not too many people cared by then), what you're talking about basically happened. It wasn't about features or speed anymore, it was about price. Why buy an expensive card when you could get one just about as good for $30 or so? But then graphics card developers moved on to 3D.

    Once they can no longer advance that field (which won't be for a long time, I'd wager), they'll find something else to work on, such as something worthy of being called virtual reality: devices that implement touch as well as sight, and so on. The advances could go on.

    Generally, graphics cards are aimed at the hardcore gamer. There are cards for 3D graphics artists, engineers and architects who use CAD/CAE programs, etc., but those are a completely different thing. As long as there are gamers who want the latest technology to further immerse themselves in the game, there will be plenty of new advances to sell to them. If the technology stops progressing, they won't be able to sell many cards, because everyone will already have one.

    At least that's what I think.
  • by NeuroManson ( 214835 ) on Thursday December 21, 2000 @12:04PM (#544504) Homepage
    Odd that there are so many people out there, including here on /., who mourn the loss of 3Dfx...

    The truth of the matter is, for a good two years 3Dfx was one of the first big-name companies to lead the IP onslaught... Less than two years ago they attempted to sue Nvidia, claiming patent infringement, back when Nvidia was just starting to make a name for itself by providing low-cost 3D accelerators that, at the time, ranked just a notch above ATI as preferred hardware...

    Then Creative Labs came out with a Glide wrapper, custom made for their TNT/TNT2 cards... 3Dfx tried suing them as well...

    Then people in the emulation scene started coming out with Glide wrappers as well... 3Dfx threatened them with cease and desist letters...

    Meanwhile, as 3Dfx ran around threatening to sue everyone who could possibly compete with them, Nvidia continued to develop their technology...

    Are we noticing a trend here?

    Nvidia made one error in their PR dept, when they threatened to pull their sponsorship from hardware review websites if any competing 3D accelerator ads/logos were displayed on said sites... They admitted the mistake and, for the most part, apologised...

    So really, the biggest irony here, given the general uproar over big companies stomping on the little guy for IP "infringement", is that just such a company got bought out by one of the very little guys it was trying to stomp on...

    And yet the majority of responses online (other than Nvidia's, of course) have been negative, attacking Nvidia rather than 3Dfx and making 3Dfx into the martyr, nay, the sacrificial lamb, gobbled up by the slavering wolves of Nvidia...
  • by FFFish ( 7567 ) on Thursday December 21, 2000 @04:44PM (#544505) Homepage
    What's missing in soundcards is quality 3D sound. The best was Aureal A3D, but that's vapour now that Creative succeeded in bankrupting them.

    Creative offers EAX, which is a set of predefined reverb effects. That's fine for ambient sound, but it doesn't provide good three-dimensional sound localization.

    QSound, Sensaura and A3D all provide some form of 3D sound, well beyond a quad-speaker setup; they are particularly effective while wearing headphones.

    What they do is simulate the delay, volume and "sound wrapping"/pitch-shift effects experienced in natural hearing. Some of these are obvious: it takes a fraction longer for sound to reach the more distant ear, and the volume there will be lower. Others are less obvious, caused by the bending of the soundwave as it wraps around the head -- or, at higher frequencies, is blocked by it.
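    For a feel of the scale involved, here's a tiny sketch of the time-delay part using the classic Woodworth spherical-head approximation (the head radius is an assumed average; real HRTF processing is far more involved):

        #include <cstdio>
        #include <cmath>

        // Interaural time delay (Woodworth approximation):
        //   ITD = (a / c) * (theta + sin(theta)), theta = source azimuth.
        int main() {
            const double a  = 0.0875;              // average head radius, meters
            const double c  = 343.0;               // speed of sound, m/s
            const double pi = 3.14159265358979;

            for (double deg = 0.0; deg <= 90.0; deg += 30.0) {
                double theta = deg * pi / 180.0;
                double itdMs = (a / c) * (theta + std::sin(theta)) * 1000.0;
                std::printf("source at %2.0f deg -> far ear lags ~%.2f ms\n",
                            deg, itdMs);
            }
            return 0;
        }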

    Anyway, point is that it is difficult to simulate 3D sound. A3D had it: you could swear that the rocket went within an inch of your head; or hear the click of a trap behind you. It added a lot to the game experience -- it's like the leap from Wolfenstein to Quake III. An order or two of magnitude difference in realism.

    And now that A3D is stomped, I'm very doubtful that we'll be hearing good game sound any time soon. It's like going back to Wolfenstein; sure, things will still be fun... but they won't be anywhere near as sweet.



  • by FFFish ( 7567 ) on Thursday December 21, 2000 @12:23PM (#544506) Homepage
    What I wonder is what will happen to ATI and Matrox.

    ATI is currently a much more profitable company. It has outsold NVidia by nearly double: ATI chipsets are spec'd in nearly every laptop and many OEM boxes. But it has recently lost Apple's contracts, and NVidia is making inroads into the laptop market.

    Matrox ruled the world with its 2D cards. They were fast and, every bit as importantly, incredibly stable. If you were using specialty software, you could rely on a Matrox card to work with it. The same could not be said of *many* other cards: driver incompatibilities were assured with them.

    But both ATI and Matrox appear to have dropped the ball. I don't see any truly kick-ass cards coming out of them. And while Matrox 2D is still top-of-the-heap, it's not enough any more: fast 2D can be done by anyone, and not enough people require driver stability to make that Matrox's saving grace.

    What happens if either, or both, of those companies fold? We'll be stuck with the same sort of abysmal situation we have with soundcards: a complete lack of innovation or advancement during the past ten years. Creative Labs owns the soundcard market, and to this day we do not have advanced sound capabilities of any sort of respectable nature.

    I'd hate to see that happen, particularly with 3D. The visual presentation is nearly as important as the aural presentation, when it comes to fooling the mind into believing in the virtual environment. If NVidia becomes near-monopolistic, we'll end up with the same mediocre performance and features as the SoundBlaster.

    That'd be a crying shame.

    We need competition. We might not get it.

  • by synaptik ( 125 ) on Thursday December 21, 2000 @11:53AM (#544507) Homepage
    ...and articles such as this one make me sick to my stomach. Why? Because most of them are full of ill-researched speculation that couldn't be further from the truth but nevertheless seems plausible-- so people believe it.

    Now, this particular article on Sharky isn't so bad. I've seen worse. Brian Burke was with us almost to the end, so he knows what he's talking about.

    Anyway, my point to this post is to clarify a few details that people tend to get wrong in these articles:

    * nvidia did NOT buy 3dfx. Rather, 3dfx became insolvent, and so asked nvidia to buy their assets, so they could afford to dig their own grave. That is why my TDFX stock is worth pennies today. If nvidia had acquired 3dfx, the stock would be going up, not down (because eventually, the stocks would be one and the same).

    * 3dfx did not "refuse" to let OEMs sell their products. 3dfx WANTED to sell to OEMs (such as Dell, Compaq, etc.) That was the whole reason they changed their logo-- to look more professional! But Napalm and Rampage were woefully late, and 3dfx fell so far behind on the performance curve that OEMs weren't all that interested. 3dfx's inability to meet the OEMs' product schedules didn't help, either. In the end, all that was left for 3dfx was the retail side-- something 3dfx had wanted to de-emphasize with the STB merger.

    (In defense of some of the people who made the above claim I just refuted... by "OEM" some people might have meant board companies, like CREAF-- in which case, they are absolutely right; 3dfx did stop selling to those companies intentionally. But that's not what killed 3dfx; retail sales account for only a very, very small portion of the 3D graphics market. By far, OEM sales (Dell, Compaq, etc.) are where the money is at. This becomes more evident if you consider the fact that 3dfx became insolvent despite having the top-selling products in the retail channel. Reason? Because retail is just a trickle compared to OEM sales.)

    * The Gigapixel merger is NOT what killed 3dfx. The Gigapixel purchase was the smartest thing 3dfx had done in a long time, but it was too little, too late. The purchase of STB is ultimately what killed them. The problem was that the business model changed on them. Prior to the STB acquisition, companies such as Dell and Compaq bought boards from board companies. But by the time the acquisition took place, the OEMs had started buying chips directly from the chip suppliers and then contracting companies overseas to build the boards. STB was a middleman; our days were numbered, but we didn't realize it at the time. Neither did 3dfx, and they paid dearly for it. What we should have done (hindsight being 20/20 and all) was sell or lease Juarez to someone like Solectron and return to being a chip company. 3dfx realized this, but realized it too late; Mexican labor is considerably more expensive than that of the Pacific Rim sweatshops, so Juarez's market value declined before we could sell it.
  • by Kiss the Blade ( 238661 ) on Thursday December 21, 2000 @11:17AM (#544508) Journal
    I can understand that the demand for increased processing power for CPU's will probably never be sated, but is this true for Graphics cards? Surely once we are in the trillions of polygons per second (at the present rate, soon, probably) and 3d graphics offer photorealism, will there still be a need for better graphics? I would have thought that in 5 or 10 years, Graphics card technology will have got as good as it can usefully get. An example is that we used to judge gaming computers by how many colours they can display, be it 8, 256, or 65536. But once we reached 16 million, there wasn't any further useful improvement that could be made.

    I would guess that a similar future awaits graphics card technology. So what criteria will graphics cards of the future be judged on? What will be the defining factors that give one card an edge over another once this graphical end of history has been reached? The only one I can think of is price.

    KTB:Lover, Poet, Artiste, Aesthete, Programmer.
