GeForce 3 Demoed - Running DOOM 3
green pizza writes "Yesterday at Macworld Tokyo, Steve Jobs and John Carmack demoed the new nVidia GeForce 3 (73 GIGAFLOPS of power)... on a G4... running DOOM 3. Please excuse me while I pick my jaw up off the floor. You can get more details
from MacNN." [michael adds: VoodooExtreme has screenshots of Doom 3 running on the new GeForce card; Shugashack has more. Looks like Doom 3 will be another game where the color palette is "shades of black". Sigh.]
Just as long (Score:2)
ooh, and barrels.
Running on... (Score:2)
Video of Geforce in Action (Score:2)
http://news.cnet.com/news/0-1006-201-4881105-0.html?tag=cd_mh [cnet.com]
Daaaaaaamn.... (Score:2)
Not "shades of black" (Score:5)
Yes, DOOM3 is dark, but probably not as dark as what you are seeing in the screenshots-- look at those shots on a Mac or SGI instead of a PC and you'll be surprised how bright they are.
magic
linux-friendly video (Score:4)
Walter H. Trent "Muad'Dib"
Padishah Emperor of the Known Universe, IMHO
Fer cryin' out loud!!! (Score:3)
--Fesh
Re: (Score:2)
NVIDIA loses more points... (Score:2)
Re:Misapplication of technology (Score:3)
Yow! (Score:3)
Oh yeah, and DOOM looks amazing. Too bad NVidia can't make up their mind about releasing more info on the GeForce3. They have changed their mind 3 times in the past 24 hours.
-Steve Gibson
Shades of Black (Score:3)
--
Re:NVIDIA loses more points... (Score:2)
Hrm, Holy Trinity? (Score:2)
Doom 3
Wolfenstein 3 (return to castle wolfenstein)
Geforce 3
I'd say Pentium 3 but 4's already out.
Does Duron count as the 3rd Athlon? (vanilla, thunderbird, Duron?)
Who else has a 3 coming out soon?
Is this the year of 3?
Re:I won't buy one on principle. (Score:2)
For a nice example of something similar, take a look at Photoshop. It totally owns its market, but only because nobody else can make anything as good.
--
Re:NVIDIA loses more points... (Score:3)
Can't see a thing... (Score:4)
I mean, is this supposed to be meaningful in any way? http://216.105.168.97/cgi-bin/image-o-matic.cgi?d
So close, but yet so far!...hehe...But this video clip [cnet.com] is kinda cool.
2) Anyone know any concrete scheduling info. on Doom3?
3) So, how much do people think that Apple paid nVidia for the whole "out on Apple first" deal?
I disagree (Score:2)
In other words, shield the competitors from the feedback of the marketplace, letting them produce inferior products without suffering the consequences for so doing. Even if enough people were willing to do this, it wouldn't produce a 'more equal' market. It would simply keep second-rate products coming. Only by allowing the market to signal an organization that it's putting out stuff that people won't buy will the situation improve.
Shades of black (Score:2)
The contrast is not the issue, as others have argued. No matter how I turn up the contrast on my monitor or video card, I suspect I will only see dark blue, brown, and black.
IMHO, it's probably because American McGee left. I seem to remember Doom being a lot more colorful without losing any moodiness, and it wasn't because the levels were lit like bad 80s disco clubs either...
Re:I won't buy one on principle. (Score:2)
What software was designed for hardware-accelerated 3D before there were hardware-accelerated 3D cards?
I could very easily see games taking full advantage of full 3d sound once it was available. Although I was under the impression that it already was...
You expected Doom to look like Super Mario? (Score:3)
Yes, DOOM3 is dark, but probably not as dark as what you are seeing in the screenshots-- look at those shots on a Mac or SGI instead of a PC and you'll be surprised how bright they are.
Cranking the gamma on this monitor improved the movie no end, so I expect the game will be the same. Think of the Quake3 Team Arena demo level - with the gamma slider at the lowest setting you could barely see anything.
I'm amazed at the number of people who seem to think that DOOM3 should be some multicoloured bright-light party. That's what Nintendo games are for. Doom's legacy demands dark rooms, illuminated by flickering fluorescent lights, and monsters which appear out of the shadows or drop on the player from the ceiling.
And seriously, looking at the video footage of Doom 3, this is going to be a game to give you nightmares. The characters are going to be closer to realistic images than ever before - those Maya-produced animations are pushing several thousand polys when up close (I assume that the meshes will have Level-Of-Detail) - playing this is going to be like starring in a good (or maybe even a bad :-) ) horror movie. I fully expect to see some Army Of Darkness mods based on this engine :-)
Don't expect Doom3 to be a game for kids. This one will earn a 'Mature' rating almost straight off the bat.
Cheers,
Toby Haynes
The actual apple video (Score:3)
Doom3 just looks amazing. Even these early tech demo scenes make it clear that they have reached the photorealistic level. Say whatever you want about NVidia, but Carmack made that card fly...
-magic
Re:Shades of Grey (Score:5)
Geek dating! [bunnyhop.com]
Re:Doom 3, do we really need this? (Score:2)
Without the misfits who studied banned history texts and read about war the Earth would have fallen to the Kiniz (sp?). Most likely they played video games also.
What are you talking about? (Score:3)
Really, what does nVidia have that no one else is theoretically capable of matching?
I don't see any kind of proprietary API that only runs on nVidia hardware. I only see them supporting OpenGL and Direct3D - two APIs that anyone else can use just as well. The only way they've brought themselves into such a powerful position is simply good products and good business. Ever since the TNT they've been aggressively pushing new features like 32-bit colour and hardware geometry acceleration. It's not like they're preventing other companies from adding new features, so tell me, what egregious tactics have they used to shut out competitors?
Damn. I *hate* shades of black (Score:4)
Like Citizen Kane. What is that? CRAP!
Or The Maltese Falcon. That's crap too.
Or the original Gameboy. Shades of black on the screen. Ergo crap.
Or even that asshole M. C. Escher. Lots of shades of black there. And it was craptacular.
sigh
Re:Misapplication of technology (Score:2)
Instead of building 10 amazing computer components for $1M each, they can build 100,000 amazing computer components for $100 each. It makes it cheaper for those who want to do scientific stuff.
People buy faster computers for games and less so for office applications.
OT: somebody set up you the bomb! (Score:2)
For some reason people thought it was funny enough to spam everywhere.
Someone went so far as to make a small flash movie of it... http://www.detonate.net/newsitems/01021601/ayb.sw
It's quite amusing actually.
Hope you get paid well ... (Score:3)
Still, that card's feature-set is certainly a force to be reckoned with. I'll stick with the X-Box for now.
More than just the GeForce3 at MacWorld (Score:5)
- The Power Macs were subtly altered. The former build-to-order dual 533 MHz model is now a standard configuration [apple.com] from the Apple Store (which means retailers will start carrying it as well), and there's an option to purchase a 733 MHz model with a CD burner in place of the DVD-R/CD-RW combo unit (saving $400 in the process).
- Those wild new iMacs have at least upped the specs [apple.com] slightly for the graphics chipset; still Rage 128-based, but at least there's more memory (16 MB) on board. Plus, the 500 and 600 MHz models are the new G3 chip with the full-speed onboard 256K cache.
- The Cube now offers [apple.com] the GeForce2 MX card as a build-to-order option (standard w/ CD burner on the high-end model). Guess ATi's still on Apple's shit list to some extent.
- Fellow Mac users should try running Software Update and see if they get CarbonLib 1.2.5.
- The $49.95 5-pack of DVD-R disks is finally available from the Apple Store, but the estimated ship time is 45 days.
Now, if I could only get a Flower Power G4 Cube..... mmmmmmm....
The Mac forums are blazing with commentary on the new iMac colors. Personally, I kind of like them, and hope they do a good job of stimulating interest in a highly overlooked demographic: women computer users. Anything that brings computing power to a wider audience can't be a bad thing. Besides, the effect is supposed to be slightly 3Dish, with the pattern all the way through the case instead of merely stuck onto the surface. I can't wait to see one in person.
--
Id Software has the same schedule as always... (Score:2)
-Ted
Re:I won't buy one on principle. (Score:2)
Sounds like you want a Creative SB Live! [creative.com] card, which has support for Dolby Digital.
Creative make good card with GPL'd drivers, and as such I have no problem in buying their products. nVidia make products that might be good, but with binary-only drivers I don't intend to find out. My 16mb Matrox G400 does me fine, good 3D performance (Quake3 runs very well)and proper open-source drivers. I don't intend to buy an nVidia card unless they change their driver release policy.
--
Re:Shades of Black (Score:5)
I'd show you a screen shot, but we can save bandwidth if you just look at your screen with your eyes closed for a bit.
Steve: "All your GeForce 3 are belong to us" (Score:3)
Well of COURSE the demo is going to be dark... (Score:2)
Re:Doom 3, do we really need this? (Score:2)
Any book that talks about giving people drugs to treat their "Aggressive emotions or tendencies" had better do so in the style of a satirical Vonnegut novel. I like my feelings just the way they are, thank you.
Fwiw, I'd much rather juice someone into a puff of red mist in a video game and laugh about how nasty it looked than do it in real life and spend the rest of my days grabbing the bars while I get porked from the rear.
While we're at it, as long as people are at home playing video games, they're not out raping someone I care about. That's fine by me.
Re:Doom 3, do we really need this? (Score:2)
And, don't forget: In "Madness Has Its Place", the ones re-arming humanity (with lasers used to launch slowboats, powered by the Sun) were crazies who went off their meds, not "misfits".
Re:Hrm, Holy Trinity? (Score:4)
So, that's 2/22/2001
Three two's in the day (without the year)
2+0+0+1=3
2222001, or 222*3 = 666!
We're all doomed!!!!
Or something.
Raptor
jesus christ (Score:2)
however, since i checked out those screenshots, i have had my faith in id restored. id will be forgiven its past transgressions (quake 2, hexen, heretic, no more commander keen games, etc) if this game is half as good as it looks.
i also seem to have something wet in my pants. excuse me.
--
They're just screenshots (Score:2)
You're right, it's really easy to extract gameplay elements from the screenshots.
- Scott
--
Scott Stevenson
WildTofu [wildtofu.com]
Re:Video of Geforce in Action (Score:2)
Pictures may be nice, but seeing it in motion is ... WOW .... sweeeeet.
Re:VA Linux Systems Cuts 25% Of Its Workforce (Score:2)
Bill - aka taniwha
--
Re:The actual apple video (Score:3)
_ _ _
I was working on a flat tax proposal and I accidentally proved there's no god.
Doom 3 Mac for OSX only (Score:2)
The Mac version of Doom 3 will be for Mac OS X only. Mac OS 9 and earlier will not be able to run it.
- Scott
--
Scott Stevenson
WildTofu [wildtofu.com]
Video (Score:2)
Re:Damn. I *hate* shades of black (Score:3)
Smell The Glove.
Your Working Boy,
- Otis (LICQ: 85110864)
Re:Doom 3, do we really need this? (Score:3)
I know, I know, I shouldn't feed the troll. But I hate to see someone slander Niven that way. Well, go ahead and slander his last few years of work, but the classic Known Space stuff is still some of the best SF ever, IMHO.
Re:I won't buy one on principle. (Score:2)
What? Make a pretty decent product at a reasonable price? Yeah, that would be horrible...
Re:What are you talking about? (Score:2)
Re:More than just the GeForce3 at MacWorld (Score:3)
Very important news for all geeks.
-Chris
...More Powerful than Otto Preminger...
Damn right (Score:2)
For Mac, this is HUGE!
Re:Hope you get paid well ... (Score:2)
Of course, I don't own any nVidea or Playstation related products.
Re:Shades of Black (Score:3)
Still, I found Doom to be an excessively dark game (even with gamma correction), but I respect the artists' decision to go with a dark-themed game. I just probably won't play it.
Re:Can't see a thing... (Score:2)
3) So, how much do people think that Apple paid nVidia for the whole "out on Apple first" deal?
honestly i don't think they needed to pay them much at all. think about it: NVidia's new chip will only be available in small quantities before they ramp up production anyhow. Apple is providing them a high-profile easy distribution channel for their first few lots.
from NVidia's point of view by going with this "exclusive" deal they win over the support of the Mac market, they get to announce their new chip with a big fanfare at a high-profile event at no cost to them, and they have a guaranteed sales channel for their low-volume parts. not to mention the fact that they can justify charging exorbitant ($600!) amounts of money because it's bundled with a computer that's already several thousand! sounds like a pretty good deal to me!
as far as i can see it, it's a win-win for Apple and NVidia.
- j
Downloadable .rm of the video? [slightly OT] (Score:2)
Does anyone have a URL for a "download and THEN watch" version of the video, or an equivalent to "wget" for rtsp streams?
I'd really like to get the whole thing and then watch it full-size, rather than trying to cram it through my slow internet connection and losing most of the detail...
---
"They have strategic air commands, nuclear submarines, and John Wayne. We have this"
Re:Downloadable .rm of the video? [slightly OT] (Score:2)
Re:NVIDIA loses more points... (Score:5)
------
Re:More than just the GeForce3 at MacWorld (Score:3)
When the GeForce4 comes out with 128MB of memory, do you think Apple will still be selling G4 computers with 64MB? That would be even funnier.
steveha
Lol (Score:2)
Moral of this story: never post any stories to Slashdot on time. Ever. :)
Re:Shades of black? (Score:2)
--
Re:More than just the GeForce3 at MacWorld (Score:5)
"I'd never buy a fruity-colored computer"
"Steve Jobs is (insert some random extreme opinion) and should I ever meet him I'll give him a dirty look".
"Only lam3r d00dz uze 1-button m1ce".
C'mon, frankly the level of discussion on /. regarding the existing biggest commercial competition to MS & the soon-to-be largest unix vendor is reliably sophomoric.
Apple is an independent billion-dollar corporation that has reliably innovated and is moving unix into the consumer market faster than anyone else, yet all we hear is the same whining crap from /.'ers repeating the same urban legends about Xerox PARC and Apple not giving away QuickTime yadda yadda yadda.
Is talking about the floral-prints any different?
GeForce 3 Press Release (Score:2)
Re:Downloadable .rm of the video? [slightly OT] (Score:2)
---
"They have strategic air commands, nuclear submarines, and John Wayne. We have this"
Re:More than just the GeForce3 at MacWorld (Score:2)
My NeXT Cube has 32MB system memory and 32MB video ram on the NeXT Dimension graphics board - nice and balanced!
Think Different (Score:2)
The techy PC market is saturated. People have sated themselves on cheap RAM, cheap CPUs, cheap storage, and cheap video cards. There's no reason for the PC market to keep growing.
So now Apple is probably targeting the *non*-techy market. Coincidentally, that also happens to be the female market. Girls. The ones who don't know or care about the iMac's increased graphics memory, CPU speed, or video chipset speed.
The ones who buy new pairs of shoes to match their new dresses to match their new handbags etc.
If they can hook these girls even once, Apple can almost guarantee multiple resells as a fashion industry. Basic black with chrome highlights. Iridescent green with transparent blue panels. Etc.
Geek dating! [bunnyhop.com]
Re:Listen to yourself! (Score:2)
I thought I listed a whole bunch of improvements for you already...
Geek dating! [bunnyhop.com]
Re:More than just the GeForce3 at MacWorld (Score:2)
Take, for example, the unified configuration management system. Or Quartz. Or Netinfo. Or the simple fact that I'll finally be able to run applications that don't suck (like Photoshop) next to a native windowing Emacs and CMUCL. No other Unix allows anything like the usability of Mac OS X.
(jfb)
California can thank Seti@home (Score:2)
Seth
What use are better sound cards? (Score:3)
How about digital USB speaker output? Right now that sucks up CPU resources that a good sound card should be able to handle.
How about MP3 encoding/decoding? Right now it's a trivial 2% of my system, but if I up the bitrate, the number of channels, and the 'effects', I can start soaking up CPU. Why not have a soundcard accelerate it the same way video cards accelerate 3d graphics?
How about voice recognition software? Hardware accelerate that!
3d sound: Anything that uses a 3d library should be able to use 3d sound. Imagine Quake3. If the soundcard could access the level data, the walls, the enemy placement, the weapon type, etc., it could actually do occlusions, echoes, reverbs, damping, amplification, cancellation, etc.
Geek dating! [bunnyhop.com]
Re:Not "shades of black" (Score:2)
Not only will use of this word in place of the non-existent/redundant word "boxen" stop people from laughing at you when you say it out loud, it will also reduce the intense feeling of unjustified intellectual superiority that is severely limiting your social life.
Re:Looks like another Id Classic (Score:2)
Uninitiated (Score:2)
Re:Not "shades of black" (Score:2)
Shows how much you know! Probably I need to get a better graphics card. Then I'll get some, for sure.
(Just kidding - I applaud your efforts. Could you also try to get people to stop saying "fsck" instead of "fuck?" And "sheeple" and "BZZT! Wrong!")
GeForce 3 On Mac ONLY? (Score:2)
-- Doesn't know anything about Macs, please excuse my ignorance.
------------
CitizenC
Re:More than just the GeForce3 at MacWorld (Score:2)
Yes, the largest right behind Sun, SGI, HP, IBM and GNU/Linux.
Re:Not "shades of black" (Score:2)
Re:NVIDIA loses more points... (Score:3)
They provide binary-only support for Linux, no support for BeOS, and they've entered into a deal with Mac to artificially release their GeForce 3 on Mac first. (I say "artificially," since you know damn well they have more people in the PC world that want the technology.)
nVidia blows goats. I have proof.
-thomas
* Ding-a-ling dinner. Translation: Telling someone to "blow you." AKA "kiss my ass."
Re:Just as long (Score:3)
Re:Misapplication of technology (Score:2)
An example of this was Iterated Systems' fractal codec, where you could spend literally days of 33 MHz i486 CPU time searching for a better compression, or be satisfied with the compression you got from a few minutes' search.
Are there any modern codecs like that? So that a powerful machine can really crank the compression up, but a slower machine had better have a fat pipe, 'cause it isn't going to have time to get much compression done.
I guess it'd have to be a codec whose compressed representation was almost Turing-complete (I guess we could just send a program, but the halting problem seems intractable).
Re:John.. Say something.. please :( (Score:2)
An Evil Empire's new piece of hardware being released to jam themselves into a new market, to further twist their .NET into the home, using the force of the game API from their desktop OS monopoly to achieve it...
or
A friggin' fast piece of hardware from a company that has been fairly good to the 'alternate' worlds in Computer Land, and a piece of game software that will be friggin' excellent, just like everything out of its maker's doors, that will at least have binaries released for 'alternate GNU OS using people', if not a box set.
Personally, I'd buy a GeForce 3, Doom 3, and a GameCube, but for the love of god don't let Microsoft get a foot in the door!
Re:Shades of Grey (Score:2)
Re:More than just the GeForce3 at MacWorld (Score:2)
No, I was actually talking about X-Box (Score:2)
Re:More than just the GeForce3 at MacWorld (Score:2)
Refrag
Re:You expected Doom to look like Super Mario? (Score:2)
Re:Shades of Grey (Score:5)
The colors did get rather washed out on the big screen.
John Carmack
Re:More than just the GeForce3 at MacWorld (Score:2)
Well, Apple has shipped around 50,000 copies of the public beta, which are Unix-based. And there is Mac OS X Server, which has been out for nearly two years now. I figure Apple must have sold one or two of those.
At this point, Apple has probably shipped more copies of a Unix-based OS than several of the aforementioned companies.
-jon
Re:More than just the GeForce3 at MacWorld (Score:2)
Actually I believe it was soon-to-be, but close enough. If one looks at the number of active Macs out there (they have a lifetime longer than most PCs, and price only seems part of the reason) and compares it to their other unix brethren, you'll see that there are more.
Presuming some large number switch over to MacOS X then yes, it'll have the largest installed base of unix boxes.
Of course MacOS X has yet to ship so we'll see its reception, but back to the original posting; many /.'ers seem stuck in their disdain of all things Apple, which is ironic considering what Apple is doing and how it's affecting the industry.
Re:Shades of Grey (Score:4)
If only there were a way we could see images generated directly from the source...
Re:More than just the GeForce3 at MacWorld (Score:2)
A: How to make yourself into a no-life loser - go name-calling on /.
B:"...next to a native windowing Emacs" - Have you seen this? Is there really a native port of Emacs to MacOS X or are you just blowing hot air? By native I mean running within Aqua using PDF-based cut-n-paste and all of the other MacOS X-specific technologies.
Re:Misapplication of technology (Score:2)
You'd probably think something like this is very dangerous stuff. I bet you want to burn all evidence of its existence. Other people would consider Hamlet a classic of literature, stage, and screen. Different tastes, I guess.
-jon
Re:More than just the GeForce3 at MacWorld (Score:2)
And yes, apple *is* going to be the largest distributor of a UNIX based operating system in the world. Why do some here feel the need to undermine that fact? One might begin to suspect that they don't *want* UNIX in any form to become a mainstream desktop os.
Frankly I'm sure that the BSD guys don't know whether to laugh or cry at the fact that Apple is going to be the largest distributor of it. I myself think it's great.
Scott
--
A question, John? (Score:3)
Prelude aside, I haven't been able to find much in terms of cross-platform programming on Linux, Mac OSX, and Windows 2000, three of the least popular gaming platforms out there behind PSX, Gameboy, and Windows 98. Though I suppose what with the DX 8/7 support in W2k, Windows 2000 isn't a problem.
I *really* want to learn and use Objective C, the Cocoa libraries, and OpenGL. I know that's not a problem with the Mac, given that Apple has made them all first class citizens of Mac OSX; is there any chance of being cross platform?
Or do you just code straight C for the game (and thus target every platform on the planet I guess), with platform specific code for the input and display handling?
I really wish someone had a book published, using id as the case study, on cross-platform development. IDEs, compilers, best practices, optimization techniques, workarounds, etc.
Of course, just saying all this out loud has given me a solution ^^; Code in C, abstract out the platform specific display, device, and input handling routines into a separate library, and use the 'best' software for each platform, whether that be Metrowerks, GCC, Visual Studio, etc.
Actually, I guess you could use Metrowerks for all the platforms, couldn't you? Is that what you do?
-AS
Re:NVIDIA loses more points... (Score:3)
Yes, that explains why NVidia's GL drivers have always been faster than their D3D drivers.
NVidia implements their OpenGL drivers in hardware too. This is unlike other hardware companies (*cough* 3dfx *cough*) whose GL drivers are simply wrappers around some other API (like GLIDE).
You will also notice that NVidia's GL drivers always support the newest features of their cards long before D3D does. For example, NVidia's GL drivers have had the "NV_vertex_program" extension (for programmable vertex shaders) since long before D3D8 was released. Similarly, when the GeForce was released, the GL drivers instantly supported T&L, while D3D users had to wait for Microsoft to release D3D7 for that support.
Might I remind you that NVidia employs several people like Mark J. Kilgard, the author of OpenGL Programming for the X Window System (the definitive work on the subject), and the GL standard windowing library, GLUT? (If you have ever done any programming with OpenGL at all, you probably used GLUT to do it.) MJK is one of the biggest names in OpenGL on the planet, and I suspect he is part of the reason that NVidia has the best OpenGL implementation in existence.
Please make sure you know what you are talking about before you talk. Thank you.
------
Re:Running on... (Score:2)
Yeah, worrying about that was keeping me up at night.
--
Re:Competitive clock speed???? Did I miss somethin (Score:5)
The 733 G4 was not as fast as my 1 GHz PIII in any of the trouble areas.
Apple is doing a lot of good work, but the CPU's just aren't as fast as the x86 ones.
AltiVec can compensate in some cases, because it is way, way easier to program for than SSE, but it takes a very simple, batched, computation-intensive task for it to pay off in any noticeable way. Amdahl's law and all that.
We did a couple functions with AltiVec, but they didn't make much difference.
Video encoding and large image processing are two areas that it can pay off, because you may be spending 90%+ of your time in one page of code.
Even then, it takes a special balance to let a G4 come out ahead, because it has less memory bandwidth than a high end x86 system.
John Carmack
Re:Doom 3, do we really need this? (Score:2)
Spoiling the story for people who haven't read it yet is REALLY BAD FORM.
Re:A question, John? (Score:5)
I will probably do a
Jim Dose had inadvertently used a few MS specific idioms that we had to weed out over the past couple weeks of the bring-up on OS-X.
John Carmack
Re:Competitive clock speed???? Did I miss somethin (Score:3)
The AltiVec unit requires a couple of things that you don't have to deal with in normal code. The data has to be nicely laid out all in one place, and you have to be willing/able to deal with it more than one element at a time. Most programs are not designed that way (or if they are, it is only by chance), so usually it is hard at first to find ways to retrofit AltiVec into existing code. Certain trends in programming (e.g. OOP) make it even less likely that you will see these kinds of data structures. So to make a long story short, to get pervasive use of AltiVec, you have to design around it. I can't speak about the Quake engine, but from John's comments that sounds a little bit like part of the problem.
This is not to say that AltiVec is useless for such apps. AltiVec more often than not just so happens to be very good at accelerating that 10% of the code that consumes 90% of the CPU time. So, a little bit of work often goes a long way.
What you learn working with it is that memory bandwidth is almost always the problem. The result is that certain common old-gen programming techniques for code optimization that rely on memory access to save CPU time (longer code, lookup tables, etc.) are about the last thing you would want to do with AltiVec. You only unroll loops as much as is necessary to get proper scheduling and stop there, for example. Almost everything, including constants, is best calculated on the fly rather than loaded from memory. As John points out, you are better off with one big complicated function rather than a lot of simple ones. Accessing memory is so expensive that once you have the data you had better do a lot with it. You typically have about 30-40 cycles of time per 32-byte block of memory that is "spent for you" every time you load memory. If you don't do something worthwhile during that time, it is your loss.
AltiVec is also good for sound code. It is a great way to reduce the overhead associated with single sound channels so that you can have lots playing concurrently without having time spent at interrupt running amok.
Re:NVIDIA loses more points... (Score:3)
NVIDIA cannot be a monopoly. Monopoly is more than just market share. NVIDIA, even if it gets 100% market share, cannot be a monopoly because it is based on open standards. They don't have anything like Glide to depend on. The minute somebody comes out with a faster card, they lose the top position. Simple as that.