Graphics Software

Dual GPU graphics solution from ATi? (359 comments)

Graphics Guru writes "Last week TweakTown posted an exclusive picture of the ATi Radeon 8500 MAXX with believable accompanying information also regarding the highly anticipated ATi R300. 3DChipset is today reporting that they have confirmation that the 8500 MAXX is indeed real and is due to be shipped fairly soon. Here's what someone from ATi told them: "The ATI Radeon 8500 Maxx is for real and the card is already in full production and about to be shipped soon. ATi has finally nailed certain issues with the dual chip. Final testings have been done and you should here noise from ATi regarding this offering." You decide if it is real or not, a solid dual GPU solution would surely rock the industry to massive proportions!"
  • Rocking v2.0. (Score:3, Insightful)

    by saintlupus ( 227599 ) on Tuesday July 09, 2002 @10:33AM (#3849233)
    You decide if it is real or not, a solid dual GPU solution would surely rock the industry to massive proportions ...as soon as the next version of the drivers comes out, I presume. This is an ATI product, remember.

    --saint
  • I wonder if this [petitiononline.com] online petition had any effect...
  • by Anonymous Coward
    This GPU, plus those 8500 OpenGL drivers being paid for by the Weather Channel, should make a kick-ass system for Doom III.
  • Stereo (Score:4, Funny)

    by oever ( 233119 ) on Tuesday July 09, 2002 @10:35AM (#3849252) Homepage
    2 GPUs: that means 3D viewing. One for red and one for blue. I better find them goggles.
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday July 09, 2002 @10:35AM (#3849253)
    Comment removed based on user account deletion
    • Re:Bull (Score:3, Insightful)

      by levik ( 52444 )
      Ummm... ATI had nothing to do with the Voodoo 5, you know that, right? Anyway, the ATI driver support is supposedly improving. Maybe if they concentrated on making solid drivers instead of figuring out how to make things look faster in Quake3, they would have a better reputation in the market.
    • hear hear,

      ATI drivers are bollocks, and so are Matrox's.

      I'm glad someone points this out every time there is a story about these jokers releasing some whiz-bang hardware that probably won't work because their drivers are poop!

      -
      uberfag, and proud of it.
  • ATi (Score:2, Redundant)

    by Bongalot ( 325378 )
    Until ATi makes their own *nix drivers, I'm sticking to Nvidia. Those 3rd-party drivers for ATi are pretty shabby and not worth anybody's time.

    -=Crazy stuff happens w/ the Bong and me.=-
    • Re:ATi (Score:5, Informative)

      by dinivin ( 444905 ) on Tuesday July 09, 2002 @10:54AM (#3849382)
      Until ATi makes their own *nix drivers, I'm sticking to Nvidia.

      You mean like the linux drivers that ATI wrote for the Radeon 8500 and 8800? Guess you'll be switching to ATI now, right?

      Dinivin
    • > Until ATi makes their own *nix drivers, I'm sticking to Nvidia.

      In ATI's defense, unlike nVidia (who are strictly proprietary), ATI do make the chipset details available so anyone can write open source drivers for whatever esoteric OS they happen to be using - there are more OSes than just Windows and Linux, you know!

      Of course, it would be nice if ATI released both specs and drivers, but IMHO, it is better in the long term for open source OSs if the specs are released.

  • Which industry? (Score:5, Insightful)

    by FortKnox ( 169099 ) on Tuesday July 09, 2002 @10:39AM (#3849286) Homepage Journal
    a solid dual GPU solution would surely rock the industry to massive proportions!

    Which industry would that be? The gaming industry is slowing down as far as graphics go. Mark my words, there is going to be a shift soon from graphics-intensive to gameplay innovation. People don't want games to be any prettier (or don't notice much of a difference). Notice how the mod community is getting bigger and better? It's because they take the graphics engines and add innovation.

    I'm rambling, but I think that these new video cards aren't going to be this big explosion that they were in the past. Sure they are big and powerful, but people aren't going to fork over the cash to get this one when they can get a good GeForce2 that can play their games just as well.
    • Re:Which industry? (Score:2, Interesting)

      by Scaba ( 183684 )
      Believe it or not, people may actually want both innovation and cooler graphics. Why do you think the two are mutually exclusive?
      • Re:Which industry? (Score:3, Insightful)

        by FortKnox ( 169099 )
        Because they are both time-restrictive. If you want a game with great innovation, you need to buy a graphics engine and work solely on the innovation. If you want pretty graphics, you spend a ton of time on the engine, and little on the game.

        If you do both, either your innovation or your graphics will be outdated by the time you finish.
    • Actually, as far as I can tell the major limiting factors now with graphics cards are heat and power-supply issues. About half of the people I know with a GF4 have had to make major modifications just to get the card displaying properly. Joe Average doesn't want to swap power supplies or add cooling fans; hell, Joe Average took a lot of work just opening the case to put the card in...

      Plus, if anyone remembers, ATI did this before with another MAXX card that suffered horribly from limited, crappy drivers that assumed too much and didn't work. Good idea, bad in real life.
      • Tell me about it. When I got Morrowind, I first got a GeForce4, but that seemed to have trouble with my existing system, so I decided to go ahead and get a new system. So I upgraded from a 1GHz Athlon to an Athlon 2000+, new case, *8* fans, DDR memory, and since the A7V333 supported simple hardware RAID, I bought two 80GB drives and reinstalled Win2K on it, all for one game :)

        I had two ATI All-In-Wonder cards, and I finally abandoned them when I got tired of constantly hearing about bugfixes coming "soon" that never came. That, and I really hated their application software for TV and video capture. I hate overdone-metaphor GUIs.

    • Re:Which industry? (Score:5, Insightful)

      by chrysrobyn ( 106763 ) on Tuesday July 09, 2002 @11:20AM (#3849575)

      Which industry would that be? The gaming industry is slowing down as far as graphics go.

      Hi, FortKnox, I'd like to disagree with your "marked words". I'd say that anything that dramatically increases the performance of graphical gaming will be welcome. I grew up playing PC-Man, Friendlyware PC Arcade and other ASCII games. I eventually progressed to Doom, to Quake (by way of Heretic, which I liked more than my friends). On opening day, I bought Warcraft III. Turns out my 16 meg Radeon doesn't play it perfectly smoothly, so I may end up buying a new video card by the end of the year for my dual G4. In my case, the vid card is clearly the bottleneck.

      People have been talking a great deal over the years about how consumers don't want anything prettier (or won't notice); they're happy with what they have. Game play is what's important. Don't try to tell me that Myst was more than an eye candy excuse. Sure, consumers are happy with what they have now, but it turns out that pretty is what sells in the stores -- nobody wants to read a novella describing "game play", they want to see screenshots!

      I'm rambling, but I think that these new video cards aren't going to be this big explosion that they were in the past. Sure they are big and powerful, but people aren't going to fork over the cash to get this one when they can get a good GeForce2 that can play their games just as well.

      You actually remind me of the neigh-sayers (or nay-sayers, depending on where you're from) a decade, two or three ago saying that games were nothing more than a distraction on computers. Nobody ever designs a computer for the things, and certainly nobody would fork out over $500 just to play games! Can you tell me which industry is driving which? I won't say that iD is single-handedly responsible for Intel's bottom line over the past 10 years, but I will say that consumers' demand for "prettier, better, smoother" has been responsible for a great many computer sales. They don't need 2GHz Pentiums / Athlons to balance their checkbooks or play with Mozilla (yet).

      The ultimate in game play, I'm willing to bet, is an eyepiece or two that behaves as a huge, high-res screen, but takes up the entire field of view. That will be a great number of pixels (dare I guess 10k horiz by 5k vert per eye?) Maybe we'll have some game play innovations along the way, but there is certainly a need for more innovation. Perhaps we'll be stopping by 32" LCDs (or OLEDs) on the way, banks of seamlessly tiled conventionally sized screens, or even something different. The fact of the matter is, consumers are happy with what they have -- until they see something better. That's where the bucks are. Where would Gateway/ Dell/ Compaq/ Toshiba/ HP/ Apple be without those consumers wanting the next pretty thing?

      Oh, and yes, I do see the mod community getting bigger. They even have some great successes. But, do you see them modding Wolf 3d? Doom I? Or do they move on, exercising the latest engines to the fullest of their abilities? Would they prefer engines that allow them to show on the screen what they have in their heads?

      • On opening day, I bought Warcraft III. Turns out my 16 meg Radeon doesn't play it perfectly smoothly, so I may end up buying a new video card by the end of the year for my dual G4.
        I wouldn't be so sure that it's your video card. Warcraft III plays fine at 1024x768 with all the options on, on my Duron 800 with an old Voodoo 3 card. On my wife's new-ish iBook, 640x480 is still a little jerky with all of the options on.

        I suspect a lack of Mac optimization from Blizzard.

      • Quote me as saying, in 1980, while sitting in front of a TRS-80 model II, with my 8th-grade teacher trying to explain how to calculate interest on a home loan:
        "Games are the only legitimate use for computers."
      • But, do you see them modding Wolf 3d? Doom I?

        Yes, there are still people modding Doom 1 and 2... go over to doomworld.com and check it out. Helluva lot more gameplay than today's "walk around corner and kill single enemy" bullshit.

    • Can you tell the difference between the real world and the simulated virtual world, in terms of visual fidelity? I sure can, and as long as there is a noticeable difference, the graphics industry will rocket on.

      I predict the curve won't break until realtime computer graphics are far more convincing than the computer graphics we see in movies today.

      That will be a short while ...
    • The game oriented graphics card industry will not slow down until there are cards that can do polygon/pixel resolution at at least 1600x1200 with enough fill rate to draw ten passes and still maintain at least 60-75fps.

      Once you can get multi-pass polygon/pixel resolution at acceptable performance levels, the need for textures becomes moot and very interesting things will start to be produced.
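      As a back-of-the-envelope check of those targets (a sketch using only the figures named in this comment, not vendor specs):

        # Rough fill-rate requirement implied by the comment above. All numbers
        # are the commenter's targets, not measured or official figures.
        width, height = 1600, 1200   # target resolution
        passes = 10                  # rendering passes per frame
        for fps in (60, 75):
            pixels_per_second = width * height * passes * fps
            print(f"{fps} fps -> {pixels_per_second / 1e9:.2f} Gpixels/s of raw fill rate")

        # For scale: a single ~275 MHz, 4-pipeline chip of that era peaks on the
        # order of 1.1 Gpixels/s before texture and bandwidth limits, which is
        # roughly why a dual-GPU board sounds attractive here.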
    • I completely agree. I recently got ahold of a copy of Quake III, and had it running at home (college student at home for summer). My 14 year old brother, who is a gamer himself, asked what I was playing. Upon my response, he promptly retorted "that's it?? People make this big deal over Quake III - it looks just like every other FPS out there!"

      I remember when Quake 3 first came out - it was amazing. Now, it "looks like every other game out there". Utterly incredible. I remember the days of playing Dangerous Dave and Zaxxon on my 8088, and thinking that they were incredible. In past years, graphic quality has been critical because there has been so much room for expansion. However, now, there's only so much you can do with eye candy.

      The next area of innovation will be in gameplay. FPSes were innovative...when Wolfenstein and Doom came out. There have been a few definitive games, but for the most part, there has been little innovation in that arena. Now, there's room for it.

      Doom 3, while looking amazing, will not do well if the gameplay isn't up to par. It's possibly the best-looking game to date, but gamers are becoming increasingly disillusioned with graphic candy, and are craving better gameplay.

  • Dual GPUs (Score:3, Funny)

    by Rupert ( 28001 ) on Tuesday July 09, 2002 @10:42AM (#3849308) Homepage Journal
    Right, like all those dual CPU desktops I see.
    • It is very hard to parallelize most ordinary computation. It is very easy to parallelize much of 3D rendering (though not all of it). It shouldn't be surprising if we end up in a world with single-CPU systems with large arrays of graphics processors.
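      For what it's worth, here is a toy sketch of why per-pixel work splits so cleanly -- plain Python, with hypothetical worker processes standing in for GPUs, not anything a real driver actually does:

        # Every scanline can be shaded independently of every other one, so the
        # work divides cleanly across two processors. Purely illustrative.
        from multiprocessing import Pool

        WIDTH, HEIGHT = 320, 240

        def shade_scanline(y):
            # Trivial stand-in for real per-pixel shading work.
            return [(x * y) % 256 for x in range(WIDTH)]

        if __name__ == "__main__":
            with Pool(processes=2) as gpu_pair:   # pretend each worker is a GPU
                framebuffer = gpu_pair.map(shade_scanline, range(HEIGHT))
            print(f"rendered {len(framebuffer)} scanlines across 2 workers")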
      • by sporktoast ( 246027 ) on Tuesday July 09, 2002 @11:11AM (#3849497) Homepage
        What a GREAT idea!

        I'll start up a computer company to capitalize on the high demand for such systems. We'll put all the horsepower into killer graphics and put them in cool looking boxes.

        I think I'll call the company SGI. With a plan like this, we'll be around FOREVER.

      • Ya... well, there is a large difference between dual CPUs and dual GPUs. For starters, if you look at the time a CPU is busy versus not, it's a huge difference. The CPU is only busy about 20% of the time; the rest of the time it's idle, unless you run SETI or dnet, which will use the idle time in the background. Most computer systems are busy only when you load a program or do some sort of work (no, reading Slashdot isn't 'work'). The rest of the time the processor just sits there and does nothing... well, it keeps the system going, but that's it. As for a GPU, while you are playing Quake 3 Arena, those processors are blazing away the entire time you are playing; they never stop working, so having two would definitely be an advantage.

        Dual CPUs on the desktop are a waste of money; by the time you need the extra horsepower, it's cheaper to buy a new chip which will be faster (Moore's law). Though I have found using dual-CPU servers gives them about an extra year of life at a fraction of the cost. Currently using a dual 500MHz Celeron, which is 2 yrs old and there is still no load on the system; having that extra processor to handle tasks on a true multithreaded OS rocks!
        • Re:Dual GPUs (Score:5, Insightful)

          by Anonvmous Coward ( 589068 ) on Tuesday July 09, 2002 @12:18PM (#3850054)
          "Dual CPU's on the desktop are a waste of money, by the time you need the extra horsepower..."

          That's not necessarily true. I've been running dual for a couple of years now, and the benefits I see to it are far deeper than adding 'extra horsepower'.

          I'm a 3D artist. I use Lightwave primarily, but also use Photoshop and After Effects quite extensively. I spent a LOT of time waiting for stuff to get done. My boss got me a dual Athlon 1600 with a gig of RAM early this year. She didn't get it for me because she wanted me to halve my rendering times, but rather she wanted me to make better use of my time while the computer was busy.

          Lightwave is multithreaded, but not very elegantly. As a matter of fact, I rarely enable the multithreaded option. Instead, while it's rendering, I set up processes on the other processor to continue on with what I'm doing. Sometimes I'm building the next model, sometimes I'm generating a texture in Photoshop, or I'm setting up a composition in After Effects.

          So while my computer is busy rendering, I'm still busy being productive. Some of you are saying "Yeah, but you'll never get 2x the processing out of it." And you know what? That's basically true, at least in a benchmark point of view. I get close to double clock speed when I have a rendering running on each processor, but I doubt I hit 2x. I don't need 2x anymore, though. About a year ago I started layering my animations. That means that my computer would render elements of a scene, which render much faster than the entire scene. As each frame is generated, it gets added to the composition in After Effects. So while my computer is rendering, I'm busy in After Effects getting it all put together. This sure beats waiting for the rendering to get completed. Heck, thanks to this technique (and the dual proc), I rarely have 'over-the-weekend-renderings' that have the potential to go horribly wrong.

          Would I be better off with a second machine? No. For the amount of money that was spent on my machine (roughly $1,500 sans monitor and hard drives), I probably could have gotten more 'pixels rendered' per minute. But it'd be a huge blow to my workflow switching between two computers. It wouldn't take very long for the 100mbit connection between them to become a huge bottleneck. As a matter of fact, I'm not sure it would have been all that much cheaper. We'd still have to get me high-end video cards and monitors for each machine.

          Are dual proc desktops for everybody? Not really. The best benefit you'd see is that Windows 2000 behaves a LOT better. Explorer and IE are both very multithreaded, and are much more responsive. As a matter of fact, my Athlon 1.2 gig machine at home felt sluggish compared to my old Dual P3 550. It kicked the 550's butt at rendering, but when it came to browsing the web, doing email, etc, the dual 550 was much more responsive.

          In short, dual processor machines have their place. If you primarily play games, you probably won't care much. But if you do CPU intensive work, it'll make your life a lot easier. Unless, of course, you like having nothing to do while your machine is busy.
  • by levik ( 52444 )
    ... Do we need to have a Beowulf cluster of chips and memory on a video card? This is yet another example of the trend in computing that has strived to make things better by making them bigger and giving us more of them.

    "Optimization, shmoptimization! Just cram a second GPU in there and we'll be fine."

    I really wish people would just stop coming out with new hardware for a couple of years, so that we can all save a few upgrade bucks, and the software industry can get their act together, and start writing clean, well optimized, stable programs, instead of trying to always catch up to the bleeding edge that nobody really asked the hardware companies to push.

    • If I get the choice of paying $200 for a new gfx card or a new CPU to permanently increase the graphics or computing capabilities, I'd way rather pay that than have to pay programmers to optimize every piece of software using it. I want programmers to be busy making high-quality, stable content and gameplay capabilities (though the actual quality depends on the gfx / sfx / storyline / modelling / texturing / whatever staff), not trying to squeeze out the last 5fps with assembly optimizations to make it playable on a bigger market (= slower systems).

      Which is not an excuse to make *unnecessarily* bloated code. But to me, I'd be a lot happier if it's feature-rich and stable rather than fast. Usually you only get to pick at most two out of three, at least on a sane budget.

      Kjella
    • Re:Really now... (Score:3, Informative)

      by Zathrus ( 232140 )
      Probably because it's cheaper to build new hardware than it is to try and optimize what's out there. Seriously.

      If you can spend $1M on hardware development and come out with a new chip that's 20% faster or spend $1M on software and put out drivers that are 5% faster, where is the money better spent? Besides which, you can charge for the new hardware. Charging for the new drivers is not acceptable to consumers.

      Freezing the hardware for "a couple years" is not acceptable. Companies will simply cease to exist. Upgrades are part of the business model of the industry, and that modern systems are capable of doing virtually all tasks home and business users would require of them is part of the reason for the technology bust in the past couple years.

      Look, it's simple. If you don't play the latest and greatest games, or don't care if you can play those games at uber-high res with all the effects turned on, then you don't need to upgrade. And yes, you can generally play the new games just fine on an older computer (my system is an Athlon 750, 512MB PC133, 32MB GF2 and runs DS and NWN just fine. Plays Q3 just fine. Will it play UT2 just fine? I doubt it... but it's 2 years old now).

      As for "nobody asked hardware companies to push" -- speak for yourself. Go look at the Doom3 demo. You simply can't do that on current hardware with any semblance of speed. Yeah, you can run it on a GF3/4/ATI 8500, but you'll have to run it at a lower resolution and turn off features. Run it on a GF2? An ATI 7500? An MX anything? Maybe. It won't have anywhere near the eye candy.

      Once we're to photorealistic scenes being rendered in realtime with no drops below ~60 fps on large, outdoor scenes you can say we've gone far enough. And by that time we'll probably want 3D or something else that will continue to push the bleeding edge.

      Until then, there is room for improvement. And there's a lot more room on the hardware side than there is on the software side.
  • support (Score:2, Interesting)

    by tux-sucks ( 550871 )
    Does anyone know how this would affect developers coding for GPU instructions? When dual processors came out, they were, for the most part, unsupported by applications. Will the same hold true in this scenario?
    • I would expect it's up to the drivers, not the application, to support parallel execution of graphics instructions. The whole idea of a GPU is that the application does not have to supply every instruction to the video card, because every instruction supplied by the application has to be processed by the CPU. You give a GPU higher-level instructions and it uses them to produce lower-level effects.
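      A hypothetical sketch of that division of labour: the application submits one command stream per frame, and the driver alone decides which chip executes it (alternate-frame rendering, the scheme ATi's earlier Rage Fury MAXX used). Every name below is made up for illustration:

        # Driver-side alternate-frame rendering (AFR), sketched: the application
        # never knows two chips exist. Illustrative only; not a real driver API.
        class FakeGPU:
            def __init__(self, name):
                self.name = name

            def execute(self, frame_id, commands):
                print(f"{self.name}: frame {frame_id} ({len(commands)} commands)")

        class DualChipDriver:
            def __init__(self):
                self.gpus = [FakeGPU("chip0"), FakeGPU("chip1")]
                self.frame_id = 0

            def submit_frame(self, commands):
                # Round-robin whole frames between the two chips.
                self.gpus[self.frame_id % 2].execute(self.frame_id, commands)
                self.frame_id += 1

        driver = DualChipDriver()
        for _ in range(4):
            driver.submit_frame(["clear", "draw_world", "draw_hud", "swap_buffers"])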
  • Final testings have been done and you should here noise from ATi regarding this offering.

    Not only can't the people at ATi write drivers, they can't write or spell very well.

    I doubt I'll ever buy another ATi product again, my AIW128 Pro hasn't been the same since I tried to install new drivers for it. I can't even get it fully back to the old drivers now. *sigh*

  • The Nvidia GeForce4 Ti 4200 generally runs most things about 50% faster than a Radeon 8500 and costs less. It's definitely the best-value gamer card on the market.

    Here's a good article with some benchmarks on this great value card.

    http://www.hothardware.com/hh_files/S&V/abit_ti_4200.shtml [hothardware.com]

    This is a nice concept card, but it's not going to put ATI on top.

    money [freedkp.com] for moogles [kupoflux.com]
  • ...that allow XFree86 drivers to be built and work potentially as well as (or better than) their proprietary counterparts (or even better, if they work on Free Software drivers), I'll gladly buy this card. If not, oh well, my Matrox Millennium G200 still has pretty good 2D, and its 3D is just a tad slower than the GeForce's Free drivers (not the proprietary ones), so it's a win-win situation... they sell one more card, and I finally can enjoy good and decent 3D :)

    Cheers
  • Assuming:
    1. It's true
    2. It is a viable product
    3. reliable drivers become available
    4. people buy them
    How many wasted GPU cycles will there be? I mean, even die-hard gamers don't game 24/7/365... So if a few thousand get out there in desktops that are only taxing these cards a few hours a day, how long until somebody writes a distributed processing app like seti@home [berkeley.edu] or The Great Internet Mersenne Prime Search [mersenne.org] to run on these GPUs?

    I mean, if people can trick the TCP stack into doing distributed math [slashdot.org], they can certainly trick these GPUs into doing it too...
  • by bachelor3 ( 68410 ) on Tuesday July 09, 2002 @10:56AM (#3849399)
    ...would surely rock the industry to massive proportions!

    Yeah, they said this about the final Kiss tour as well.

  • I *just* purchased an nVidia GeForce4 Ti 4200 a few days ago! I hate that one lemma that states that the new hardware you want will not come out until you've paid top-dollar for what's currently on the market.
  • Stop Slamming ATI (Score:5, Informative)

    by N8F8 ( 4562 ) on Tuesday July 09, 2002 @11:01AM (#3849439)
    I read a lot of people complaining about ATI. I think people need to put a little perspective on things. NVidia came out of the blue and used their superior 3D chipsets to grab the mainstream video market. ATI's response was slow at first, but is really gaining steam. I've got a Mobility M4 in my laptop with great new OpenGL drivers. In my home PC I've got a Radeon 8500 LE that runs 99.9% of all the DirectX games. In the case of the former, 2D performance was the biggest factor. In the latter, the price gap for comparative performance was a joke: $99 for the ATI Radeon 8500 LE (NewEgg.com) vs. $180 for an Nvidia GF4 4400. NVidia is now using their market dominance to bleed the market (a familiar strategy that eventually backfires). Not to mention the beautiful All-In-Wonder I bought for my parents' computer, which has the best multimedia and TV tuner I've seen (DScaler ain't bad but can't seem to pick up as many stations).

    For all you GNU/Linux junkies, ATI has been much more forthcoming in information for developing XF86 drivers than NVidia(proprietary binary only).
    • If my Radeon hadn't constantly crashed my Win2k box when I bought it for $600, I might not be so bitter.

      But when you write a driver and refuse to run a machine with it for more than an hour, and then, worse than that, ship the product and try to sell it for $600 upon release, you do get a bad name.

      ATI deserves every flame they get until my Radeon supports VfW without an ungodly amount of hacks. And video capture is the least of the problems with the driver that shipped (the fact that your DVD support is gone if you lose/ruin your driver disc would be number 2 on the list).

      ATI can keep their crappy products. Of course, now I've switched over to Linux, I'm starting to buy their products again (looks like third-party drivers written without full specs of ATI's cards are more stable than ATI's own -- who'd-a-thunk-it?).

      >ATI has been much more forthcoming in information for developing XF86 drivers

      Which would explain why third party X11 drivers are better than their windows drivers. Man, you have to have one really poor set of coders to be beat out on the quality of the drivers for your product by people hacking out code as a hobby.
    • by Zathrus ( 232140 )
      For all you GNU/Linux junkies, ATI has been much more forthcoming in information for developing XF86 drivers than NVidia(proprietary binary only).

      People who whine about this just prove how inane and stupid the free software movement can be.

      Look at this, and virtually every other thread, regarding ATI. See how many complaints there are about the poor drivers despite the superior hardware. Contrast to nVidia, whose drivers support every card made going back 3 years, have great performance, and are usually very stable.

      Now tell me again how there aren't trade secrets in that driver code?
  • When they get the drivers straightened out, I'll think about buying one.
  • Picture (Score:4, Informative)

    by jwilhelm ( 238084 ) on Tuesday July 09, 2002 @11:02AM (#3849445) Homepage Journal
    Here's a picture of it off the forum:
    http://www.jeffwilhelm.com/files/r250.jpg
  • by wazzzup ( 172351 ) <astromacNO@SPAMfastmail.fm> on Tuesday July 09, 2002 @11:10AM (#3849491)
    Is it just me, or are we soon going to have to plug our computers into the 220V socket where the washing machine used to be?

    Besides, who needs clean clothes when you're getting 200 fps ;o)
  • I had a Radeon 8500 in my box and had to remove it because it runs much, much, much hotter than my GeForce3, and because it was more susceptible to heat problems. (The DVI-D output starts to fail as it approaches 60C or so.)

    What's ATI doing to keep TWO of these in one box from overheating?

  • Judging by how hot my Xpert128 runs, I think we can expect advertising copy along these lines:

    The ATI Radeon 8500 Maxx: Your High-End Desktop Graphics and Affordable Home Heating Solution from ATI!
  • by Saurentine ( 9540 ) on Tuesday July 09, 2002 @11:22AM (#3849586) Journal
    "You decide if it is real or not, a solid dual GPU solution would surely rock the industry to massive proportions!"

    (emphasis added)

    When did Slashdot start hiring cheezy '80s Hair Metal band rejects to post stories to the front page?

    It's "News for Nerds", not "News for Mullet-Sporting Losers Who Can't Get Over Their High School Glory Days".

    Flamebait? Maybe a little. ;)
  • I doubt it matters. (Score:3, Interesting)

    by The_Shadows ( 255371 ) <thelureofshadows ... minus physicist> on Tuesday July 09, 2002 @11:29AM (#3849678) Homepage
    You know what the best card out there is? It might surprise you. At a low-end price tag of ~$800, and up to nearly $6500, the Voodoo 5 6000 is one of the best cards out there. The price tag is that high because it is, unsurprisingly, a collector's item.

    I've seen this card work. It runs fast and it looks gorgeous.

    You know what the Parhelia tried to do? Fragment AA? Voodoo could've torn that up years ago. The V5 6000 did 8x full-screen AA. Fast. At 1024. It's amazingly gorgeous.

    Think about it. This card is 2-3 years old. The architecture is what matters. Not the number of GPUs. The GeForce4 4600 can't even consider 8xAA. The V5 6K does, and it does it well. On 128M of SDRAM. I'd still maybe take the 4600 over the V5 6K. But it would be a hard decision. The 4600, with its DDR memory and GPU, can handle some things better. Some. Not all.

    This card just proves that it really doesn't matter how much RAM or how many GPUs a card has. It's in the way the card is built. There aren't many cards I'd take over the V5 6K. If I could get one, for myself to keep, I'd pull my GeForce 3 out in a heartbeat. The GPU isn't a factor here. The RAM (DDR over SDR) isn't a factor. The V5 6K is just that well built, even 3 years later.

    There. I've said my piece. After seeing the V5 in action, I don't care to get the least bit excited about the "latest greatest" graphics cards ever again.
    • It is reported that the next generation NVIDIA GPU (NV30) will contain technology that they got when they bought 3DFX. This would include work done on the Voodoo 5 6000 and any prototypes 3DFX had been working on. So maybe you can look forward to that?
    • What the Hell?!? (Score:5, Insightful)

      by djohnsto ( 133220 ) <dan.e.johnston@gm[ ].com ['ail' in gap]> on Tuesday July 09, 2002 @12:20PM (#3850075) Homepage
      Do you even know what a V5 6000 is? It's a 4 GPU version of the V3!!!

      Think about it. This card is 2-3 years old. The architecture is what matters. Not the amount of GPUs.

      Again, this card had 4 processors!

      ...The GeForce4 4600 can't even consider 8xAA. The V5 6K does, and it does it well. On 128M of SDRAM.

      It sort of had 128M of RAM. It actually has 32MB of RAM per processor. So, all the latest games that use up more than 32MB of RAM in texture / geometry caching will run really slowly on the V5. Also, for those that don't remember, this was the card that you had to plug into the wall separately from the computer.

      Don't get me wrong, I've used the V5 5500 (2 GPU version), and it was really cool at the time. But I'll take a GF4 any day of the week over any voodoo you offer me (unless of course I can sell it at the collector's item price :)

    • Somebody mod this guy down; he's full of shit. I've never seen a 6000 in action, but it doesn't take a genius to realize it would be at most 2x as fast as the 5500, and given that the 5500 could only get ~15fps at 1024x768x32 with 4xFSAA, the 6000 would have gotten about the same with 8xFSAA (double the chips but double the pixels) -- certainly not fast by today's standards. A GeForce 4600 will stomp a Voodoo 6000 into the ground in any benchmark. Not to mention most of the 128MB of memory is wasted storing four copies of every texture. Clearly a few mods aren't thinking too much about what this guy is spewing, cause man does it stink.
      • OMG. Mod parent down as "Idiot." I don't care that I didn't remember that the V5 6K had 4 processors (which were not vertex/pixel shaders a la nVidia's GPUs). That was my bad. I apologize. I wasn't the one running the damn card. When my buddy, who practically stole one, said "Come look at this!" I came, saw, and saw it kicked ass.

        But you are just a moron. Granted, I should have specified what this was running on, but you're taking the V5 5500's benchmarks based on its release! Yeah, the V5 6K won't do quite so well on a PII 400. It does a helluva lot better on an Athlon XP. Helllooo? It's not all in the graphics card. Chipset, processor, RAM. You know, I hear those are important components too!

        PII 400MHz != 1.4GHz Athlon

        And NO, the 4600 can't stomp it in any benchmark. 8xFSAA. You moron. The 4600 can't even render in 8xFSAA. Even if the 6K only got 15fps (it was running smooth, at least 30), 15 > 0? New math! Hooray for new math! It won't do you a bit of good to review math!

        I've seen this thing RUN. You haven't. Once you have, you can tell me what you thought it looked like.

        Do what you will. I've got Karma to burn.
        • Ok, let's see, where to start. When a graphics card is getting 15fps, the GPU is the bottleneck. You could have a Pentium 5 at 50GHz and it would be about the same, especially with FSAA, which is purely dependent on the graphics card.

          Who gives a flying fuck if the GF4 can't do 8xFSAA when the Voodoo5 6000 couldn't do it at a playable fps? The whole argument is moot, and besides, 4xFSAA at 1600x1200 is going to look a hell of a lot better than 1024x768 no matter what kind of FSAA you throw at it.

          The Voodoo 5 6000 was not the pinnacle of graphics cards you make it out to be. I suspect you may be confusing it with Rampage, 3DFX's true next-gen chip which would have kicked a whole lot of ass.
    • Four words for you:

      Pixel and vertex shaders.

      No Voodoo card can do them regardless of how many of those chips you stick on a board. Fixed function pipelines suck once you get used to being able to change all of the lighting and shading code that the graphics board runs.
    • AA performance basically comes down to memory bandwidth (rough numbers are sketched at the end of this comment). Yes, the Voodoo 5 6000 did have a stupidly high bandwidth (11-12GB/s), which still just about beats Nvidia's Ti4600 at 10.4GB/s.

      It has little to do with the number of GPUs you've got. The Voodoo 5 probably had to have several just to keep up with the bandwidth it had.

      So what was so great about the Voodoo 5 6000? They put a huge amount of bandwidth into a card when it just wasn't economically viable. I'm sure that nVidia and ATI probably both had internal test setups that could equal it, but they both had the sense not to try and make a commercial product out of it until the cost of fast RAM came down.

      As far as I am aware the V5 6000 didn't have any particularly special AA tricks, which nVidia seem to have now (compare GeForce3 AA performance with GeForce4 Ti...), so I'd imagine that the Ti4600 would beat the V5 nowadays, on 4x AA at least. Shame they don't have a higher AA mode, but with the next gen of games coming out, you wouldn't be able to afford it anyway, even with a Ti4600 or a V5 6000.

      Of course, the V5 had no pixel or vertex shaders (which is gonna hurt image quality) and no hardware T&L. As the majority of current games are still CPU-bound, that's gonna hurt the Voodoo 5.
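      The rough numbers behind the bandwidth claim above -- a simplified estimate assuming supersampling with a colour write plus a Z read and write per sample, a single pass, and no texture traffic at all:

        # Simplified framebuffer-traffic estimate for supersampled AA.
        # Illustrative assumptions only, not measured numbers.
        BYTES_PER_SAMPLE = 4 + 4 + 4   # colour write, Z read, Z write
        FPS = 60

        def framebuffer_gbps(width, height, aa_samples):
            samples_per_frame = width * height * aa_samples
            return samples_per_frame * FPS * BYTES_PER_SAMPLE / 1e9

        for w, h, aa in [(1024, 768, 8), (1600, 1200, 4)]:
            print(f"{w}x{h} at {aa}x AA: ~{framebuffer_gbps(w, h, aa):.1f} GB/s "
                  "of framebuffer traffic alone")

        # Both cases already eat roughly half of a ~10-12 GB/s card's bandwidth
        # before a single texel is fetched, which is the poster's point.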
    • Some numbers for you.

      Q3A, 1600x1200x32 bit (no FSAA)

      GF4 Ti4600 : 160.6 fps
      V5 6000 : 58.7 fps

      Expect almost linear scaling for FSAA (the arithmetic is sketched at the end of this comment). Note that at 4x, the GF4 would be pushing out around 40 fps. The 6k? About 13. At 8x? Let's be generous, and call it 8. Yes, the machines being tested are very different (a 1.3GHz Athlon vs. an 800MHz P3), but at those resolutions, you're very close to being 100% GPU bound.

      I admire the meaningless iconoclasm that would lead one to tout an evolutionary dead-end like the 6000 as the be-all end-all of video cards, but in the future, you would be better served by appealing to the Voodoo's superior blast capacity, or the "warmth" of its image, rather than trying to make a technical argument without even the slimmest of legs to stand on.

      Best,
      'jfb

      Links:

      V5 6k benchmarking: http://www.voodooextreme.com/hw/previews/v5_6000/5.html

      GF4 numbers: All over, but I used these:
      http://www.anandtech.com/showdoc.html?i=1583&p=9
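      The arithmetic behind those estimates, assuming (as the poster does) that fps falls roughly in proportion to the AA sample count. This ignores CPU limits, and the GF4's 8x row is purely hypothetical since it has no such mode:

        # fps at NxAA ~= fps(no AA) / N -- the poster's rough scaling rule.
        baseline = {"GF4 Ti4600": 160.6, "V5 6000": 58.7}   # Q3A, 1600x1200x32, no AA

        for card, fps in baseline.items():
            for aa in (4, 8):
                # The GF4 has no real 8x mode; that row is a what-if.
                print(f"{card} at {aa}x AA: ~{fps / aa:.0f} fps (estimated)")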
  • by Anonymous Coward on Tuesday July 09, 2002 @12:05PM (#3849964)
    http://www.jeffwilhelm.com/files/r250.jpg

    I apologize in advance for the AC post, but being as my company has a working relationship with ATI, blah, blah, etc etc.

    On to why I think this is fake:

    1. Look at the heatsinks/fans. From the picture, it looks like they are using different model fans for the different GPUs, judging by the position the power wires are coming from. Being a board designer, I can tell you that this would never happen; you use identical parts to keep the bill of materials down.

    2. On the very bottom right of the card, under the last SRAM chip, there is a small device (regulator?) that looks like it's overlapping the edge of the board. This would never pass board layout verification, because there are certain clearances you need to observe when laying out PCBs.

    3. It looks like the lower GPU is violating the AGP spec for connector keepouts. I'm not sure on this, as I don't have the AGP design guide handy, but that GPU looks like it's positioned extremely low.

    4. Silkscreen for some of the parts further down the board (compare some of the electrolytic can & SRAM silkscreens) seems to be conspicuously absent.

    5. Look at the ATI symbol silkscreen. Right above it is a fiducial (these are used during assembly, as a way for the machine doing the assembly to calibrate its position to the board), and part of a silkscreen that looks exactly like the assembly guide for the SRAMs! This is the thing that, to me, stands out the most as being doctored.
  • Comment removed based on user account deletion
  • I'm really not into the a-fan-for-everything movement going on. Motherboards, video cards, etc. The only things I really want in my PC with the fan are the CPU and the power supply, and if I could do without those, it would be even better. ATI would have a winner in my eyes if it could use two lower-clocked fanless chips together to deliver performance on-par with the rest of the one chip cards. Driver issues or not (I'm on an All-In-Wonder 128 right now... don't even get me started), it would definitely get a buy consideration from me.
  • ...that most game coders are lazy. This thing is running dual GPUs, so it probably can't run in AGP mode and take advantage of all the memory bandwidth that no one takes advantage of anyway.
  • Check it out here [hardocp.com]
  • "a solid dual GPU solution would surely rock the industry to massive proportions"

    Agh! Marketing Splooge, attacking from 3 o'clock!

    Dual "GPU" configurations have been around since early '98, when the Voodoo2 came out. Sure, 3Dfx called it SLI, but it was essentially two 3D cards working as one - and someone (Quantum3D?) made single cards with dual Voodoo2's on them. Not to mention the Voodoo5 which had 4 GPUs on it.

    I remember seeing someone (could have been Quantum3D again) who was promising a 16-GPU version of the Voodoo5 for mass $$$.

    Multiple GPUs are nothing new, and it's definitely not going to "shake the gaming industry to its core only on PAY PER VIEEEEEEEEEW....."
  • Why is the photo that was supposedly leaked by someone there so obviously photochopped? Where's the official hype (surely they don't plan, officially, on releasing the product onto a totally unsuspecting market? They've officially told various people about their upcoming R300 product, so why would they, officially, keep this so secret)? Where in ATi's lineup does this Radeon 8500 MAXX fit? Above the single-GPU Radeon 8500, sure. But wouldn't it steal sales from their upcoming R300-based product (reportedly called the Radeon 9700)? Sure, a Radeon 8500 MAXX won't have DirectX 9 compliance, but there are no DirectX 9 games out yet, nor will there be any that *require* it (notice I didn't say can't take advantage of DirectX 9 features) for some time.

    Yes, the geek in me thinks "Dual GPU Radeon card. Sweet!" But the realist in me thinks "Well, the ONLY 'proof' of it we've seen is unconfirmed leaks, and a badly photochopped photo of a product that ATi supposedly already has in full production."

    Ahuh. I'll believe it when I see it, in person.
