Hardware

ATI Radeon 256

snack writes "FINALLY! ATI has released info on their new graphics chip, built to take on both 3dfx and NVIDIA. Reading through the press release, it says that it has Windows, Linux, and Mac support. There are no benchmarks on the Web site yet, but reading through the tech specs it seems that this chip will blow everything else away. It also says that over the summer this will implement the MAXX technology. Two of these chips working in parallel... Oh, my God!"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    DRI is a spec. It's done. XFree86 4.0 is an implementation; it's not totally done. Drivers plug into these, and they're mostly not written. Precision Insight is working under contract to write a driver set or two for this. Doing them all is not their responsibility. Want to contribute? Join XFree86 as a developer, or send them a contribution in $ form.

    Another way to make a contribution is to intelligently reward companies like ATI and 3dfx who're supporting you as a Linux user by releasing register-level programming info and/or paying for quality drivers to be written. Conversely, don't fall for hype from companies who promise support and don't deliver. ATI, for example, went from honestly not supporting Linux to honestly supporting Linux and the Open Source community. Nvidia, by contrast, has taken many of you for a ride, promising support and basically stalling with crappy GLX modules, ignoring the DRI standard interface and XFree86 4.0, and in the end stonewalling all who raise these points in criticism. If you have a GeForce and you were hoping to use it in Linux, sell it to a WinD'ohs user while it's still worth something, get something else, and get on with your life. The wait is over, or infinite--which is the same thing.

  • by Anonymous Coward
    Call me a troll, call me biased, call me what you like, but there is no way in hell that I will ever buy another ATI card. EVER. I have a Rage Fury 32MB card, if you must know. The reasons that I won't buy an ATI card again are very simple:

    1. They don't support their cards. They can't even get the drivers right for Windows, for crying out loud.

    2. Windows is the only OS they support. All that other bullshit you hear is a crock of shit. Supposedly they hired Precision Insight to do their 3D drivers, which were to be released this quarter. Still nothing.

    3. From what I can tell, the XFree86 4.0 2D server was done by SuSE and not ATI. (By the way, thanks for the specs, ATI.) Why would I support a video card company that can't write its own drivers?

    4. They talk about how the hardware is superb. Yes, that is nice, but I could have a supercomputer in my room, and without anywhere to plug it in, how useful is it? Makes a mighty fine paperweight, I'm sure.

    5. "We are going to be the #1 open source leader in graphics cards." Are you serious, ATI? I like this reason not to run ATI: blatantly lying. They haven't done one thing for any operating system besides Windows. Look at their record.

    6. "The drivers are coming soon. We will fix this soon." Soon. Yeah, soon. It's been two years now, but soon. Very soon.

    It's not even worth continuing. However, I applaud the absolute idiot who helps ATI by posting a story about how their card is going to kick ass and support Linux and Windows and all that. He is an absolute dumbass.

    Check the record before putting them on showtime. Another reason why Slashdot has turned lame. Absolute stupidity.

    -I'm a troll? yah? I_redwolf(zanee)
  • by Anonymous Coward
    Please check www.elkabong.com for techniques in procedural animation... I myself have sworn to never set a keyframe again! And it's working.
  • by Anonymous Coward
    While the gist of your post is dead on, it's not nearly as bad as you imply. All these cards have texture caches, so not every texture read has to suck down memory bandwidth.

    None of these cards will come close to their theoretical peak performance, but they're still pretty damn fast.

    Now all we need is AGP 8x and 1024-bit on-card memory buses. And 2MB on-chip texture caches.
  • Ahh, yes... I remember the good old days of using the 1 meg of memory on my Gravis Ultrasound as a ramdrive in DOS. That was a cool hack. :)
  • I'm not an online journalist by any means, but I am aware that people who make broad generalizations like that don't help anything. By helping to tear down the reputation of anyone who tries to put up good, honest, unbiased reviews (like ArsTechnica [arstechnica.com]), you take away some of their credibility and their motivation.

    The analogy anyone (meaning non-wired people) can understand is this: would you trust one of those news magazines at the checkout counter that have stories about the return of Elvis on the cover? Of course not. Well, how about if your local Sun or Tribune started selling out? You'd find another one. So yeah, there are bad news sources online, but don't hurt the good ones by making blanket statements like that.

    Yes, and it hurts to see the shortcuts they had to take to get that 48GB/sec fill rate.

    The PS2 only has 4MB of video RAM.

    This means that at 640x480 there isn't much room left for textures and stuff.

    To avoid having to fetch the textures from system RAM and thus slowing everything down a LOT, they are forced to lower the resolution to 640x240 or 320x480.

    Considering how everyone slavers over the PS2 all the time, am I the only one finding that to be acceptable?
    Of course, that's supposed to be "unacceptable" instead of "acceptable".

    --
  • You'll laugh your ass off, but the funniest thing is, go to www.vapor.com, and it's an Amiga software company called VaporWare. No shit.

    I wish I had a nickel for every time someone said "Information wants to be free".
  • moderate +1 informative, +1 insightful.

    Yes, as a long-time ATI person, I agree, their drivers exhibit much suckage. Which is why ATI will always suck no matter what they do with hardware.

    Thanks for the tip on Rage Underground. I didn't know about that.

    I wish I had a nickel for every time someone said "Information wants to be free".
  • If I didn't have to pay for it, it would be one of these: http://www.pixelfusion.com [pixelfusion.com]

    Mankind has always dreamed of destroying the sun.

  • You mean the "amazing, expensive Apple Cinema Display". Admittedly, it is in the HDTV price range ($3,000), but the picture quality is terrific (based on DVD playback, 2001 to be precise). It also looks stylish on your desk.
    >Back on ATI's side, the Radeon looks like it will have more features than the GF2. As a game coder, I like that. :) Also, ATI is likely to have better Linux support. I also like that.


    nVidia/SGI have a driver in the works. I've seen it working at one of SGI's "Linux University" shows. It was running a Performer demo, and it ran it very fast. :) The driver is supposed to be out in May, along with an SGI Linux workstation with their own version of an nVidia card in it.
  • Who has the faster car, who has the smaller phone (see, bigger isn't always better), who has the best graphics card? Does it really matter? I have seen people complain about the GeForce and the whole T&L thing. No one is going to code for it unless XX% of the user base has access to it. Stay a generation behind, save yourself $200 and the job of beta testing the drivers. FUD: everyone seems to think M$ plays this game best. Wrong. You have people having arguments over a product that hasn't even seen the light of day yet. That is, unless you're living under an NDA, and that doesn't count for us out here in the real world. At what point does fill rate become a non-issue? Isn't good old film 24 frames per second? Can anyone out there honestly tell me that they can see the difference between the fill rates of any modern 3D cards? At what point does all this become a subjective matter? Or is it already, and the real question is who makes the best drivers? --- NO SIG YOU SAY ---
  • Get out of here, that'll never happen.

    Uwe Wolfgang Radu
  • Concerning using two cards running in tandem: it is also possible to run this card in quads. Each card will render every other pixel of every other scanline, thereby quadrupling the effective frame rate. That is, only if you have 4 slots.

    hm, from the press release:

    To ensure total performance dominance, the chip also includes support for ATI's patented MAXX(TM) multi-ASIC technology, enabling twin Radeon 256 chips on a single graphics card. The new chip will appear this summer in a range of board products.

    I didn't see multiple boards mentioned anywhere, but if they can get the circuitry for two Radeons into one ASIC, I'm sure they could find a way to run them in parallel on one video card. But then again, whether the Radeon lives up to the hype, we shall see.

    But for Linux, consider your alternatives: nVidia insists they will bring killer drivers to Linux (I have been waiting a long time now and am tired of their development efforts for Linux). Question for somebody else: hasn't ATI opened the specs for their older cards? Aren't they contracting Precision Insight to bring drivers to Linux (XFree86 4.0)?

  • "The hardware in these cards is pretty much entirely for 3d games. It will help in the modelling phase, but these consumer cards are not what you want for that. The Geforce2 might be approaching pro cards for modelling, as might this ATI card, "

    Bullcrap! Current GeForce cards actually smoke the "pro" OpenGL cards (like 3Dlabs and Evans & Sutherland), maybe except with regard to driver stability and "advanced" features like AA lines, hahahaha...
    A lot of DCC professionals will testify to this.
    ATI will never be able to make a professional OpenGL card, and neither will 3dfx.
    Check out the facts before you start mumbling on Slashdot.
  • Thresh's Firingsquad has an excellent preview and writeup [firingsquad.com] of the first-looks at the Game Developer's Conference.
  • I've been considering buying the ATI All-In-Wonder card for just that reason when I put my next system together. (Just how well is that card - the AGP version - supported under Linux?).

    I am using the ATI All-In-Wonder AGP version with my RH 6.1: it's OK.


    2D performance is a little low when you switch desktops - perhaps some more video RAM would have helped. TV is working, although the (third-party open source) TV program is somewhat rough. I can't tell you about 3D performance because I am not using any 3D programs. There are some not-so-active Linux projects working with ATI cards (GATOS), and some optimized ATI drivers are on their way together with XFree86 4.0.


    The main reason for me to choose ATI was the ergonomic qualities of ATI products: refresh rate >= 100Hz at 1024x768 resolution or better (I think NVidia does that too).


    All-In-All: All-In-Wonder is an OK card. Recommended.

    -Claus

  • Rather than falling for the blind rants of some companies [microsoft.com], linux users tend to believe what we hear on the graphics/gaming front. We're getting tired of being treated as second class citizens when it comes to graphics and games.
  • Heh heh. ATI has an interesting history with benchmarks. I think it was '90 or '91 when ATI got in trouble because they hardcoded the routines for one of the popular Windows benchmarks into their firmware. When the benchmark ran, the card was able to execute highly optimized hardware routines and blew away the competition. So, when ATI says their cards perform the best on the benchmarks... they might be right. Real-world performance is another story altogether. I do have to give their engineers "you've got balls!" points for that little stunt!
  • Comment removed based on user account deletion
  • True, with 2 640x480x64bit frame buffers, there really isn't much space left for textures.

    But the average PS2 resolution is 640x240x32 (16-bit color, 16-bit Z), which is as high as NTSC televisions will go. There is also a 3.2GB/s bidirectional bus between the GS and the EE. Technically, it would be possible to render a section of the framebuffer in the GS and then page it out to the EE. Then the EE can do video post-processing (non-photorealistic rendering and complex antialiasing come to mind as possible applications) and output a framebuffer to the GS for display.

    There are plenty of techniques to work around the 4MB of VRAM on the PS2; you just need to know about parallel processing, multithreading, and paging. Unfortunately, since most game developers are still self-taught, these aren't common knowledge in the industry. When Phil Harrison guest-lectured at a CS class of mine, he basically went so far as to say that over half of current game developers don't have nearly what it takes to get the most out of the PS2.
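
    For anyone who wants to sanity-check those numbers, the arithmetic is simple (a rough sketch; the 4MB of VRAM and the buffer formats are the figures quoted above, and the split between color and Z is an assumption):

        # Rough VRAM budget for the PS2 figures quoted above (illustrative only).
        VRAM = 4 * 1024 * 1024                      # 4MB of embedded video RAM

        def buffer_bytes(width, height, bits_per_pixel):
            return width * height * bits_per_pixel // 8

        full  = 2 * buffer_bytes(640, 480, 64)      # double-buffered 640x480x64: ~4.7MB, over budget
        field = 2 * buffer_bytes(640, 240, 32)      # double-buffered 640x240x32: ~1.2MB
        print(full, field, VRAM - field)            # leaves roughly 2.8MB for textures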
  • Well, if every triangle is projected exactly the same and has uniform animation, right.

    Typically it will end up being one matrix per triangle list (to include animation and translation), which will probably drop the matrix read bandwidth requirements by a factor of 30-40.
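
    To put a rough number on that (hypothetical figures, just to illustrate the amortization):

        # Hypothetical illustration of sharing one matrix per triangle list.
        MATRIX_BYTES = 16 * 4                         # one 4x4 matrix of 32-bit floats
        triangles_per_list = 35                       # assumed typical list size
        per_triangle = MATRIX_BYTES                   # if every triangle carried its own matrix
        per_list = MATRIX_BYTES / triangles_per_list  # shared across the whole list
        print(per_triangle / per_list)                # ~35x fewer matrix bytes read, i.e. the 30-40x above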
  • Well, I for one am _much_ happier with the ATI software DVD player than with the DXR3 "hardware" DVD decoder from Creative Labs. I have an Xpert99 board (Rage128, 8MB RAM, AGP 2x) on a... VIA board and an AMD K6-2/380MHz. The ATI player is stable (I can watch several movies in sequence) while the Creative Labs one crashes two or three times during a movie :( Plus, the ATI player has much better contrast and colors (on the same monitor). And ATI released some libs for accelerated X and DVD playback for Linux. I don't know how far they got with those, though...
  • ATI drivers especially suck on the Mac. A lot of the flak Apple G4 machines have been getting is due to the fact that their 3D performance was not up to par with higher-end 3D cards from other companies (3dfx, nVidia, etc.). Hardware-wise, the Radeon looks like it can rock any other card out at the time. I just hope the drivers will be up to par. Maybe they learned their lesson. I'm willing to give anyone a second try... or third... or fourth...
    --
  • Ignore the 3D specs, we know they'll be lower than the hype; this thing has Hardware HDTV support!

    The only way HDTV is ever going to catch on in the US in the next few years is if we start using the only tubes currently in households that can handle even 50% of the resolution -- your computer monitor.
  • I am pretty jaded by all this graphics-card hoopla.

    It started when I bought a Matrox G200, which was advertised as having a full OpenGL implementation. Thinking "whoa, this'll mean I'll be able to run LightWave at a great pace!" I bought one.

    Turns out, as with a lot of hardware these days, that the OpenGL drivers weren't ready when they shipped the product, and they only actually released them when the G400 shipped. This is about a year later.

    Suffice it to say I'll never buy another Matrox product again.

    The Nvidia GeForce is now out, supposedly bringing the wonders of hardware T&L to the world. Well, I have yet to see anything, bar NVidia demos, that actually uses the geometry acceleration. Why? "Oh, the drivers that support it aren't ready yet."

    And on my current TNT2 card, I have to use the 2.08 drivers (last time I looked, the drivers were up to release 3.58) because the later drivers break OpenGL 1.2 compliance.

    How long does it take to get a decent set of drivers for a chipset???

    And if I were a betting man, I'd put $100 on there being no way ATI ships a product with all features enabled and working in the first release.

  • >Each card will render every other pixel of every other scanline, thereby quadrupling the effective frame rate. That is, only if you have 4 slots.

    Errrr... you'd need to have a motherboard with 4 AGP slots, right?

    -
    Ekapshi.
  • The ATI doesn't compare to this!
    http://www.ultimatechaos.com/ucfx/

    -
    Ekapshi
  • I don't know about a link between the Charisma Engine and the Emotion Engine, but the Emotion Engine is in the PSX2, not the Dreamcast.

  • Alright, I'm not an expert on video hardware. I always assumed that the amount of memory a card had was related to its max resolution (800x600, etc.); at least that's the way old cards were. I got into an argument with some guys in my office about whether a card with a ton of memory (say 64MB) would increase the performance of non-realtime rendering (say Bryce or something). I keep wondering how video memory would improve something like that. I know you can cache textures in video memory, but what else is it good for?

    And as for this card, I think the most impressive feature is that HDTV hardware. I'm guessing that it will act like a super-tuner-card or some such. Now you can watch PBS in super high rez. :)

    And for the record, I'm using a 4meg ATI card right now (Rage Pro 3D or something), and it seems to handle Quake 2 just fine.

  • Sounds even more like Radion, a cheap'n'cheerful washing powder here in the UK.

    Yellow tigers crouched in jungles in her dark eyes.
  • The most surprising thing to me in this posting is that the band Moxy Fruvous has finally made it onto Slashdot. Who'd have predicted that?! (Note the "from the dept." line in Emmet's post.)
  • WinHEC started today - everyone's announcing their latest and greatest, hoping to get game developer mindshare.
  • Why would they include an HDTV decoder?
    Internet Appliances perhaps?

    Linux support is a definite plus.
  • by ndfa ( 71139 )
    128MB support at 200MHz.
    30 million transistors in a 0.18 micron process.
    HDTV hardware decoder, 1.5 GigaTexel/second rendering engine!
    Charisma Engine? I kinda like the name... something like Velocity Engine. But Radeon is definitely a nice name!
    I just find it hard to believe that ATI will overtake the Voodoo5 and GeForce 2! I mean, they have always been first out with the higher memory and stuff... BUT I have never liked the chips till now! Hmm, any idea about the G800? Matrox is one company that's my bet at all times!
    Hmm, Linux support would be nice... but I will wait till they deliver!
  • NOT for XFree86 4.0... Precision Insight is still working on that! I played Q3 when I was running 3.3.5 and it ran really nicely; AGP GART and all that compiled and ran smooth as silk! Waiting for 4.0 to show its power!
  • there's nothing like playing Quake3 on a system on which you used du -s * | grep "M" just a minute ago
    Arrggghhh.... Got the G400... love the card, but it's kinda rough having to wait for Quake3 and UT support in Linux... having to reboot into WinDoz for UT is such a bore! And I completely agree with ATI not being able to deliver what they promise! The Fury was to be some monster with 32MB of RAM and it just sucked for what it cost! Which is what I am thinking about this chip... it's probably going to cost a BUTT load!
  • by snack ( 71224 )
    Sorry Duxup, I do not work for ATI. I am just an average joe working at my average job. But I am stoked about these new graphics chips coming out from all of the large companies. I wish I worked for the ATI PR folks... then I'd be making the big bucks :)
  • So...

    Anyone want to write a system to use spare texture memory as swap? Presumably it would be a bit speedier than a hard disk.

  • I used to work for a manufacturer of graphics chips. It doesn't matter which one. While working there, I noticed a couple of general rules:

    1. Every manufacturer tweaks its own in-house benchmarks, thus making them useless to consumers.
    2. The benchmarks in commercial magazines often favor their biggest advertisers, thus making them useless to consumers.

  • awww damn! Wish I still had moderation points. This was fscking hilarious :)

    Someone mod this up 'funny'
  • Remember when the Rage 128 was announced? It was supposed to crush the TNT1 and Voodoo2 SLI, and (if it had shipped on time), it probably would have. But by the intended shipping date (Christmas 99, IIRC), they only had early alpha silicon which ran far hotter than the specs had claimed. The final Rage 128 shipped more than 3 months late, at a slower speed than was originally announced. Several weeks later, the V3 / TNT2 came out, blowing the Rage 128 out of the water.

    My memory is a bit fuzzy, so if someone could correct me, I'd appreciate it.

  • Assuming ATI keeps up its tradition of supporting open-source drivers, this is *great* news!

    Now go and buy one of these newfangled cards - and don't forget to write nVidia a polite letter explaining why binary-only drivers just don't cut it anymore =)
  • What good is Linux support if the card does not support XFree86?

    What 'custom linux' thing are they doing such that Linux is listed and not XFree86?
  • What impressed me was the shadow casting stuff. I have been wondering about how best to implement shadows for some time. It is really a lot harder than you'd expect. I am very happy to see it done in hardware.

    It's not really hard to do shadow casting in hardware; in fact, standard OpenGL can do it with a bit of creative use. See Nvidia's ShadowMap demo [nvidia.com] for an example. The source for a lot of the latest effects is Wolfgang Heidrich's thesis [mpi-sb.mpg.de]. Lots of really cool ideas; it needs a reasonable computer graphics background, though.

    If you want to get higher precision and speed you'll need some extensions, but not a lot. Depth textures and copy from framebuffer are enough, and have been available on high-end SGIs for years, so the design is sort of stable.

    Moral: with a good API and some creative use you can get really cool effects. Hardware can make it fast, but we'll have to see if the ATI chip delivers on that part...
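
    For anyone who hasn't seen the trick: the depth-texture approach boils down to rendering the scene's depth from the light's point of view, then comparing each visible point's light-space depth against that map. A toy sketch of just the algorithm (no OpenGL calls; all names here are made up for illustration):

        # Minimal sketch of the classic depth-map shadow test (illustrative names only).
        def project_to_light(p):
            # Assume a directional light looking down -z: (x, y) index the map, z is depth.
            x, y, z = p
            return int(x), int(y), z

        def build_shadow_map(points):
            # Pass 1: keep the depth of the closest surface point per light-space texel.
            depth_map = {}
            for p in points:
                u, v, d = project_to_light(p)
                depth_map[(u, v)] = min(d, depth_map.get((u, v), float("inf")))
            return depth_map

        def is_lit(p, depth_map, bias=1e-3):
            # Pass 2: a point is lit unless something closer to the light occludes it.
            u, v, d = project_to_light(p)
            return d <= depth_map.get((u, v), float("inf")) + bias

        points = [(1, 1, 0.2), (1, 1, 0.8)]          # second point sits behind the first
        shadow_map = build_shadow_map(points)
        print(is_lit(points[0], shadow_map), is_lit(points[1], shadow_map))   # True False

    The hardware's job is just doing that per-pixel comparison fast, with the depth map stored as a texture.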

  • I read your article, and I couldn't agree with you more! You hit the nail right on the head. I love the features of the AIW 128, but the driver support (and lack of non-Intel support) is HORRIBLE!!

    As for your DVD issues, though: when I upgraded to a KX133-based motherboard (yes, non-Intel to another non-Intel), my DVD playback got messed up (jerky picture, flicker). I tried PowerDVD, and the playback is better than ever! Give it a try. -WD
  • doh. 25 million triangles. I knew that.
    ------
  • You shouldn't believe everything you read on Slashdot. If you did, you might believe Nvidia's linux drivers are a 1GB download.

    Nvidia released a GLX driver for XFree86 3.3.5, and while it has a few bugs, it does work and does support the GeForce. The only references I've been able to find about Nvidia *not* supporting DRI are on Slashdot. If you read the Nvidia site or the DRI developer mailing list, you will see that Nvidia said they weren't going to improve the 3.3.5 driver until XFree86 4, and then they would release a DRI driver. The last I read about the DRI driver was that it would be released in the first half of 2000.
  • "Internal priorities" may shift and pull people away from the linux driver support, but considering their development invested so far and their press release boasting how they were going to bring "real" opengl drivers to linux I don't think that Nvidia will just not ship their drivers now. They may never upgrade the drivers, but they'll put some binary only non-glx drivers up someday before August.
  • Who said that writing drivers splits resources?

    I'm almost sure that the Linux drivers are done by Precision Insight (including Itanium), and I don't think that the Mac developers help the Windows driver developers at all.

    Just my thought :)
  • Fun with marketing!!

    Bitboys Oy [bitboys.fi]
    The one and two chip solutions will deliver the first one and two gigatexels per second performance in the 3D market, with an amazing feature set and low solution cost!

    ATI [ati.com]
    First graphics chip to break through the Gigatexel barrier with an awesome 1.5 Gigatexel per second rendering engine.

    3DFX [3dfx.com]
    Taking advantage of the revolutionary scalable architecture of the 3dfx VSA-100 chip, the Voodoo 5 6000 AGP features four processors working together to be the world's first 3D accelerator to break the Gigapixel barrier.

    Ok, that last one says 'pixel', but 3DFX is probably referring to single-texture polys anyway.

    Couldn't find an nVidia reference, can anyone else find one?

  • Actually, what is SORELY needed, but nobody's doing, is a website that tracks ALL product announcements, and watches the dates, and keeps a running total of how many days each product is overdue, and compiles a list of which industry players are the biggest fucking liars.

    A site like that would get a lot of hits. And maybe people would stop believing the same bullshit from the same liars day in and day out, business as usual.

    I wish I had a nickel for every time someone said "Information wants to be free".
  • How far away from a 54" HDTV would you sit?

    I'm not sure. I lay about six feet from my 32" TV, but that's because that's about how tall I am. Sometimes I sit about nine feet from it, but that's where the couch is. I really don't have a good place for a 54" projection set. It's much deeper than my current TV, so I would have to put it closer, and I don't think it would stand up to dog slobber.

    For me the 28" HDTV VVega ultra-flat glass tube sounds better. I would lay about six feet from it, or sit about 8 feet from it.

    I bought a nice GemStar 19" monitor for $300. Not quite a Trinitron, but it'll give me 1600x1200 at 75Hz. I sit less than two feet away from this nice big screen. And let me tell you, it takes up a lot more of my field of vision than a 54" TV would from my couch.

    I have a very nice desk chair. To gloat a bit, it's an Aeron. It's not as comfy as my couch. After a day of sitting (at the office, or at home) it's not really as comfy as lying on the floor, either.

    My wife would also be a little upset if only one person could watch TV at once, especially if I kept kicking her off it to read Slashdot.

    And considering I can have a DVD player, DVD burner, internet access, chat, and video game access, I think that it more than surpasses a 54" HDTV for what it does.

    My car provides warmth, has a six-speaker stereo, and can haul 4 people and a little baggage at 130MPH with the top down. It can muss my hair. I can cook a pizza on its engine block. It definitely surpasses an HDTV system at what it does.

    Regrettably, the HDTV surpasses both my car and your PC at being an easy-to-use appliance that can be viewed by several people in the comfort of a typical living room.

    Now, if you wanted a car or a computer, then you're better off getting one of them than an HDTV. But if you want a TV, a PC is only a limited substitute. Under the right set of circumstances it is a quite acceptable substitute.

    Unless you're at a party.

    I would question the wisdom of asking a girlfriend to sit at your desk and watch a movie with you. Then again you'll both have to sit close together, so it may work out.

    You're not going to get 20 people to watch your 19" monitor for the super bowl. (not that nerds watch such things!!!)

    Geeks definitely watch the Super Bowl. But pretty much only for the commercials.

  • In case you haven't noticed, HDTV sets are rather expensive.

    Yep. The Sony 38" (or is it 36") ultra-flat glass VVega is quite pricey at $6000. The 50" projections sets are somewhat more so.

    The average computer nowadays, however, is less half the price and has a monitor more than capable of displaying any format of hdtv you throw at it.. basically you could have a 19" hdtv/dvd player for around 2,000...

    There are a lot of cheep monitors that won't do the more agressave HDTV reslutions, but yes any monitor over about $150 will do nicely. Of corse that's a bit unfair because I expect if Sony, et al thought the HDTV market would buy 19" HDTV sets they could make them cost somewhat less then computers.

    Find me a 32" PC monitor for cheep. Howabout a 38"? Or a 54"? Remember to discount the size to get it to the same aspect ratio as HDTV (or use that amazingly costly Apple Cinima display).

    It would be wondeful if I could make a $600 PC (or even $2000) PC system do the work of a $6000 HDTV, but it's just not possable yet.

    Of corse if all you have for HDTV is $600, and 19" sounds good to you, go for it. I admit some HDTV probbably beats no HDTV. Then again Quake III might beat any HDTV just now, given the lack of much programming!

    (I'm all for bargins, and trade-offs, just know what you are trading off!)

  • if we start using the only tubes currently in households that can handle even 50% of the resolution

    Whatchew talkin bout Willis? A 1280x1024 VGA monitor should handle 720p quite nicely (1280x720), and my 1920x1200 monitor at work can do 1080p (which isn't even in broadcast equipment yet IIRC)... Remember one of the really nice things about HDTV is actually the noninterlaced (or 'progressive' in non-computergeek-speak) capability, and we've been accustomed to good-quality non-interlaced VGA for 10+ years...

    Hell, even your 1024x768 set can do 480p, which is better than flickery NTSC junk...

    And if ATI really supports XFree (writing DRI drivers and GLX support) out of the box I can definitely see $tnt2_adapters_on_ebay++ ...


    Your Working Boy,
  • Now when somebody asks about butt-stomping 3D, how many people (outside of ATI's marketing drones) instantly think "ATI!"?

    Also, their drivers are somewhat less than stellar.

    And MAXX'ed? Didn't we already see that MAXX technology does undesirable things to one's latency? Who cares if you get 150fps if you're a half-step behind everyone else?

    True, ATI MAY have gotten their act together. I think I'll sit this generation of their cards out though. They've got a bit to prove to me before I go and plunk down the dross for one of their cards.


    Chas - The one, the only.
    THANK GOD!!!

  • Mac video hardware has definitely fallen behind the PC, with the dual-processor Rage Maxx being unable to run properly in the current Macs due to motherboard issues. So what else is out there?

    Right now, 3dfx has Mac drivers [3dfxgamers.com] and a BIOS flash that will allow your standard PCI V3 for the PC to work with a Mac. The drivers are beta, but the card is cheap (since you don't have to pay the Mac hardware premium), and tests show it blowing away [xlr8yourmac.com] ATI's cards (in 3D; 2D is so-so). The only problem is that the V3 wasn't designed with Mac support in mind, so there are a few hardware issues that may never go away.

    The V5, on the other hand, was designed to support the Mac from the beginning. The Mac version has already been demoed (Rumor has it the Mac version was ready before the PC one). Here's an article from InsideMacGames [insidemacgames.com]. 3dfx is coming out with PCI only at first, AGP may come later.

    nVidia has announced that they intend to bring out a Mac card sometime later this year, possibly the NV15 or a variant thereof. They've released little info on this--but they did hire some director or manager guy away from Apple a few months ago.
  • "...world+dog..."

    You read The Register [theregister.co.uk], don't you?

    Anyway, besides nVidia and ATI, 3dfx plans to show off a working V5 6000 (their quad-chip card) at WinHEC. There's a bunch of players that probably won't be there, though. S3 has been quiet lately (not surprising, given that S3 is selling their graphics division to VIA). BitBoys has pushed their expected release date back by an entire year. VideoLogic hasn't made any recent announcements, and STMicro's "GeForce killer" is still vapor.
  • Amen. I have no need of a card that runs Quake at even 30 frames/second -- I don't play Quake. (Let's see, the last computer game I played was Myst; the last multiplayer game was probably xtrek (no, not Netrek, but its predecessor based on X10).) I had more fun hacking on the source code to 'adventure' than playing it...

    But combining video capture, MPEG acceleration, etc onto a single card would certainly free up some slots in my computer. (Which is maxed out, what with sound, SCSI, NIC, FireWire, TV card, and VGA card). I've been considering buying the ATI All-In-Wonder card for just that reason when I put my next system together. (Just how well is that card - the AGP version - supported under Linux?).

    Heck, how about putting the X server right on the card too?
  • I _used_ to be an ATI fan, because I wasn't all that into serious gaming. Heck, if the card did 1024x768, 24-bit 2D, I was happy... New driver out? Why should I change? ATI was the best card there was.

    Then I started gaming. Wow, is this slow. Let's try this new ATI Rage Fury, first time out. Wow, it crashed Win98. No Linux support. Had to flash my BIOS. It crashes because it overheats. New driver? Cool. Installed it. Had to reinstall Win98 'cuz it doesn't like the driver. Linux support? No way...

    Let's try this Voodoo 3 3000. Wow, easy install. New driver? Not a problem. Linux support, playing Q3 and UT at 1024x768. Never crashes. 3dfx is the sweetest thing on the planet. It may not be the fastest, but it's damn fast and it's trouble-free -- and compatible.

    When the Voodoo 5 gets the same Linux support, count me in as a loyal customer. The ATI will have Linux support? Doesn't faze me a bit.
  • If you don't have an Intel processor/mobo, think twice before plonking down hard currency for anything made by ATI. I myself got an ATI All-in Wonder 128 card this Christmas, and it refuses to play well with my VIA based motherboard/K6-2 processor. It's not like the Super 7 platform is either too new or too old for ATI to have supported it in the Rage 128 based cards, or that the VIA MVP3-G chipset is so uncommon. ATI, quite frankly, just doesn't care about supporting non-Intel platforms, because they don't have to. They're the company of choice for Intel-based OEMs. So, they don't care about performance-loving AMD-using geeks like a lot of us here.

    If you're performance loving, I can't understand why you would have ever bought an AMD CPU in the past (I certainly can now with the Athlon, however). The main reason why ATI cards have problems on Super7 is because Super7 is a bad, bad hack. If you actually look around, all of the cards were having problems around that time. The G200 had problems with the Super7 platform, as did the TNT, and just about anything else around at the time using AGP... Some companies were more responsive to the problems, admittedly, but the problem lay far more with the Super7 platform than the graphics card manufacturers.

    Funny then how the REALmagic Hollywood+ I got after the ATI's performance bit delivers flawless DVD performance on the same VIA chipset, with CPU usage averaging under 5%. Yeah, ATI, blame it on the mobo chipset instead of your own laziness when it comes to drivers...

    And the REALMagic is a PCI card, right? And your ATI card is AGP? Again, this is more a problem with Super7 than the manufacturer of your graphics card.

  • That they are not using the "Rage" prefix on this chip. They've been using it for the past 4 or 5 years, and I have to say, I was never able to tell one version from another. They were all the same to me: "ATI Rage something." I mean, sure, I'm sure there's some difference between the "ATI Rage" and the "ATI Rage Pro Fury II MAXX," but I'd be hard pressed to tell you what.

    Anyway, more good graphics cards with good OpenGL support are always a good thing (something 3dfx never seemed to grasp...). Oh well, I just hope they change the name again when they get their next chip. I don't want to be talking about the "ATI Radeon 512 II Pro Turbo Championship Edition" in 2007...
  • I know that John Carmack has been skeptical about using voxels due to the sheer amount of processing power they need.

    Actually, I've heard the exact opposite (from a direct quote, as well): that JC likes voxels and would eventually prefer them to polys, as opposed to ever more complex 3D-model-based systems.

    He said that the situation was a lot like back in the day with vector graphics vs. pixel graphics. Vectors were great, and didn't take up that much ram, whereas pixels needed bitdepth*screen size of ram. Quite a bit when you're talking about boxes with 64k of ram.

    But, as images got more and more complex, so did their vector data. Whereas pixels require the same amount of data no matter how visually complex the image is.

    Eventually voxels will take over.

    I don't think that they will for quite a while though...
  • Strange - this comes the day after we finally get to see some Voodoo 5 beta benchmarks, and the same day that, rumor has it, the GeForce 2 NDA is to be lifted (and judging by the last paragraph of the Sharky Extreme article on Voodoo 5 vs. GeForce, the rumors are true). Are all of these companies so confident that their cards are good enough to steal the competition's thunder, or is it just a freak coincidence?
  • You can keyframe a skeleton just like you can keyframe a mesh. Basically, you define a motion for the skeleton, and a time duration in which that motion occurs. Voila! A keyframe. Motion capture is often viewed as massive auto-keyframing.

    In fact, including mocap, keyframing is far and away the most popular technique in modern computer graphics. Not many people use procedural animation (which is possible, although not terribly realistic yet). Basically, if the actual motion is stored as discrete samples along the motion curve which are then interpolated, you've got keyframing.

    Skeletal animation is preferred because you only need to keyframe (typically) 24-50 degrees of freedom for a human figure. This is much easier for artists than having to manually handle 10,000 NURBS surfaces, and it also makes capturing the motion really easy. Skinning, such as is included in the Radeon, comes into play because the model is only defined once. If you look at your own skin as you bend your elbow, the skin on the outside will stretch while the skin on the inside will contract. Since polygons are not soft, skinning by matrix interpolation is used to ensure that no seams emerge at any of the joints.
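
    In case "skinning by matrix interpolation" sounds mysterious, it's just a weighted blend of the positions each bone's matrix would give the vertex. A toy 2D sketch (made-up numbers; real hardware blends full 4x4 bone matrices, typically two or four per vertex):

        # Two-bone linear blend skinning in 2D (illustrative only).
        import math

        def rotate(angle_deg):
            a = math.radians(angle_deg)
            return [[math.cos(a), -math.sin(a)],
                    [math.sin(a),  math.cos(a)]]

        def transform(m, v):
            return (m[0][0]*v[0] + m[0][1]*v[1],
                    m[1][0]*v[0] + m[1][1]*v[1])

        def skin(vertex, bones_and_weights):
            # Blend the bone-transformed positions by the vertex's weights.
            x = sum(w * transform(m, vertex)[0] for m, w in bones_and_weights)
            y = sum(w * transform(m, vertex)[1] for m, w in bones_and_weights)
            return (x, y)

        upper_arm = rotate(0)       # this bone stays put
        forearm   = rotate(90)      # this bone bends the elbow
        # A vertex near the elbow follows both bones half-and-half, so the skin
        # bends smoothly instead of tearing at the joint:
        print(skin((1.0, 0.0), [(upper_arm, 0.5), (forearm, 0.5)]))    # ~(0.5, 0.5)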
  • No, the memory runs at 200MHz; it doesn't mention that the chip itself will run at 200MHz. Given that the chip is 0.18 micron and has 30 million transistors, this chip could be running at 300-400MHz. The memory is DDR as well, which means excellent performance.

    But 1500 MTexels / 3 = 500 MPixels per second. If the chip can effectively commit 1 pixel per clock cycle, that would be 500MHz; I suspect it can commit 2 pixels per clock cycle, or even more! Some detailed overview of how the internals of the chip work would be great - you see this for CPUs, but not for graphics chips...
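
    Playing with that arithmetic (only the 1.5 Gigatexel/second figure comes from the press release; the pipeline layout below is pure guesswork):

        # Speculative arithmetic; 1.5 Gtexel/s is the quoted figure, everything else is assumed.
        texel_rate = 1500e6                           # texels per second, from the press release
        textures_per_pixel = 3                        # if it really lays down 3 textures per pass
        pixel_rate = texel_rate / textures_per_pixel  # 500 Mpixels/s

        for pixels_per_clock in (1, 2, 4):
            clock_mhz = pixel_rate / pixels_per_clock / 1e6
            print(pixels_per_clock, "pixel(s)/clock ->", round(clock_mhz), "MHz core clock")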

  • Don't rely on online reviews.


    Or at least hit reload every once in a while to make sure the site isn't financially supported by one of the card companies reviewed. Let's not forget the fiasco where a chip company's ads were running on "you know who's" Hardware Guide a few months back when they were trying to do an "unbiased" review.

    Well, a healthy dose of skepticism is a good thing, but don't get paranoid. The media has always struggled to both sell advertising and maintain integrity and independence. I'd say most respectable sites are entirely fair and will burn a product from a company that advertises with them if it really sucks. The only reason people go to these sites is their credibility. Once they blow that, it's over. You can only do something stupid like that in the Windows propaganda magazines (which have all those stupid ads with some schmuck's shoe business).
  • I've just bought a G400MAX, because Matrox are paying Precision Insight to develop Linux drivers. Matrox are notoriously bad at writing Windows drivers, so this is a smart move on their part.

    HH

    Yellow tigers crouched in jungles in her dark eyes.
  • whatever happened to 3dfx? I remember when people used to joke about these other companies. Seems like they're slippin'
  • "There are no benchmarks yet on the website, but reading through the tech specs it seems that this chip will blow everything else away."

    Does snack sound like he works for ATI PR people to anyone else? If I were ATI I'd submit every announcement I can to /. in hopes of it being posted. This is not to say it shouldn't have been posted, that's not my call, and I do find the news interesting. Snack's comments just sound like a press release to me :-)
  • That's all I want - full, dual eyeball resolution. Is that too much to ask?
  • The whole keyframe thing didn't impress me too much. Keyframes were used in Quake 2, but now most games use skeletal animation, which usually does not involve keyframes at all. But keyframes may come back into style once we start animating individual limbs more. Of course, the Radeon has skeletal animation acceleration in hardware as well, and it is better than nVidia's.

    What impressed me was the shadow casting stuff. I have been wondering about how best to implement shadows for some time. It is really a lot harder than you'd expect. I am very happy to see it done in hardware.

    ------

  • Since I doubt that Slashdot will drop a poll on the subject anytime soon, I think I'll conduct my own little one :o)

    If you could have any of the three fancy new chips in your dream computer, which would it be?

    Please reply with your answer. Thanks!

  • > Arrggghhh.... Got the G400... love the card, but its kinda rough having to wait for Quake3 and UT support in Linux...

    Uhm, you do know that the G400 does Q3 ( and probably UT as well, though I haven't tried that ) under Linux just fine, don't you? Get the GLX for it here [sourceforge.net]

  • Yes, but the nVidia drivers for 3D suck. From what I understand, playing Quake III, even on a fast processor, will give you a whopping 10 fps. nVidia certainly has not supported Linux drivers that well.
    Molog

    So Linus, what are we doing tonight?

  • OK, so it is obvious from history that all the video chip manufacturers will spout their PR gibberish whenever they can and that they will never tell us about the products they fail to beat. What is not so obvious is why speed has remained the driving force of video hardware for the last few years.
    Definitely. Since we moved to AGP cards, there has been next to no discernible difference between the 2D performance of graphics cards under Windows; under XFree86, however, the acceleration in the drivers meant that some cards sped ahead while others were left behind. I have no figures to confirm this, but I am sure that under Linux there must now be no real difference in 2D performance between a selection of 6-12 month old graphics cards from the Linux "supporting" manufacturers. So 2D is no longer an advertising campaign... what is?
    3D still seems to be the marketers' primary goal; they feel certain that by convincing us that their card can handle more 3D data (by throwing fill rates, bandwidth, texture memory and RAMDAC speeds our way) we will experience VR on a standard x86 machine... rubbish. The 3D question when buying a graphics card is: will it play the newest games at an acceptable speed in my system NOW (i.e., not if they ever get their drivers out)? If yes, move on to the next question; if no, move on to the next card. Speculating about whether its performance will be adequate for the next generation of games is futile, as no one knows how they will be written, and you don't know what your system will be like in the future.
    Video. MPEG acceleration, capture, and perhaps CSS decoding and PAL/NTSC out: these are all add-ons (like 3D used to be) that nobody seems to see as a marketing ploy. I was in two high-street computer shops in Dublin this weekend and noticed both carrying the NON-VIVO version of the same TNT2 (with a nice big sticker on the box advertising the fact)... why? It seems that to someone, not having video is a selling point. When buying a graphics card, how many people will not look at a card if it will not do significant hardware acceleration of MPEG, how many will look for a card with TV out, and how many will spend those extra bucks for the ability to capture from their camcorder and do a bit of non-linear editing? Look at the ATI All-In-Wonder, some of Asus' NVidia-based boards, or the G400 and you will find many of these features, but how many work outside of one or maybe two variants of M$ Winblows?
    What does this all mean? Well, IMHO the first manufacturer to produce a product that has an open source driver (for any platform; if you build it, we will hack) which
    1. delivers accelerated 2d performance
    2. OpenGL hardware acceleration
    3. MPEG acceleration
    4. Video Capture
    5. Pal/NTSC output
    will win the Linux market (and how many cards is that, then? Well, a few million anyway, and growing rapidly).
    Who here on Slashdot would turn down a video-card/driver combination that provides these features because it will only play today's games at 85 and not 110 frames per second?
  • Raytheon and Radon... oh well, I'll keep chugging along w/ my 2MB of VRAM for one monitor and 1MB for the other one.
  • In case you haven't noticed, HDTV sets are rather expensive. The average computer nowadays, however, is less than half the price and has a monitor more than capable of displaying any format of HDTV you throw at it... basically you could have a 19" HDTV/DVD player for around $2,000... and great game playing capabilities thrown in.
  • Now, in order to actually project these coordinates onto the screen, every vertex needs to be multiplied by a 4x4 matrix

    You don't need a separate 4x4 matrix per-vertex. One will do fine for the whole scene -- unless you're trying to simulate a non-linear camera lens or something.
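
    To make that concrete: the projection step is one 4x4 matrix-vector multiply per vertex followed by the perspective divide, and the same matrix is reused for everything drawn with that camera. A generic sketch (not specific to any vendor's hardware):

        # Perspective projection of a vertex by a single shared 4x4 matrix (row-major).
        def project(m, v):
            x, y, z = v
            out = [m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(4)]
            w = out[3]
            return (out[0]/w, out[1]/w, out[2]/w)    # perspective divide

        # Minimal perspective matrix: w takes the value of -z, so distant points shrink.
        proj = [[1, 0,  0, 0],
                [0, 1,  0, 0],
                [0, 0,  1, 0],
                [0, 0, -1, 0]]

        print(project(proj, (2.0, 1.0, -4.0)))       # (0.5, 0.25, -1.0); same matrix for every vertex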

  • The reason they were all called "Rage" is because they were all based on the "Rage" extensions to the Mach64 chipset. Note that all the 2D XFree86 drivers for these cards were using the Mach64 driver, that's because the 2D core was the same. The Rage extensions just did some texture mapping for 3D.

    The ATI Rage 128 chip used a different core, and thus required new 2D XFree86 drivers.

    Anyway, enough of this boring ATI history. The reference to the Linux drivers probably only means 2D support. :-(

  • I believe the Rage 128 was announced around Dec 98/Jan 99, as Apple announced and began shipping the blue & white G3 in Jan '99 with a bundled Rage 128. Performance and drivers were another story. In typical ATI fashion, it took several revisions of drivers downloaded from ATI and Apple before performance and OpenGL compliance were up to speed. The very first Rage 128s were also clocked pretty slow, but that's a different issue.
  • Purchase new Power Mac with bundled ATi card.

    Moan and groan about OpenGL performance and compliance.

    Dig up driver updates from Apple and ATi, install.

    Note performance boosts in one area, slowdowns elsewhere, a few bugs, few bug fixes.

    Lather, rinse, repeat.


  • I certainly hope ATI is paying attention. I absolutely love the ATI hardware, but you're absolutely correct... ATI drivers suck.

    It's such a shame that such beautiful hardware has to be hobbled by such awful software.

    *LoL* Linux on x86 is beautiful software hobbled by awful hardware! The irony is killing me.

    --// Hartsock //
  • by pb ( 1020 ) on Monday April 24, 2000 @10:08PM (#1112484)
    Let's solve the "support for Linux" issue once and for all, and just port Linux to this card! :)
    ---
    pb Reply or e-mail; don't vaguely moderate [152.7.41.11].
  • by BrianH ( 13460 ) on Monday April 24, 2000 @08:15PM (#1112485)
    Oh. Wow. Yet another press release expounding the miraculous abilities of some upcoming piece of hardware that has yet to see the light of day. I am so impressed. Maybe, just because I am sooo totally impressed by this, I'll put off buying that GeForce now. But then again, I'm sure we'll have some new announcement a few months after that telling us about the next great achievement in hardware. At this rate, I'll be stuck with this EGA card for the next 20 years!

    Seriously, I wish that Slashdot and all of the other sites that report on computer hardware would start dropping crap like this in the trash bin. Post reviews of released products, post reviews of soon-to-be-released products, but if I never see another review of a product that only exists on paper, I'll still have seen too many. Posting this kind of stuff just encourages these hardware companies to write more of it.
  • by matticus ( 93537 ) on Monday April 24, 2000 @08:06PM (#1112486) Homepage
    I was reading a review of the Radeon located at Thresh's Firing Squad [firingsquad.com] and I can't help but be impressed. Maybe ATI has finally got it right... their other cards have had some... problems. The 3dfx benchmarks of today were kind of disappointing, but maybe there still is hope. I'm a big 3dfx fan, and I appreciate their Linux support - but it's good to see other companies grab ahold of Linux. There's nothing like playing Quake3 on a system on which you used du -s * | grep "M" just a minute ago. It's really quite liberating. Here's to good graphics cards and Linux support! Yay technology.
  • by Temporal ( 96070 ) on Monday April 24, 2000 @09:39PM (#1112487) Journal

    Odd... I submitted this hours ago, yet my writeup was rejected...

    Anyway, the ATI Radeon can do 1.5 gigatexels per second. The Voodoo 5 can only do 667 megatexels. So, the Radeon will far outperform a V5. And it has T&L! What a deal! The funny thing is that 3dfx is hyping the V5 based on its fill rate...

    Now, on Wednesday, nVidia is going to announce the GeForce 2. It will have a fill rate of 1.6 gigatexels, just a bit higher than ATI's offering. On the geometry side, the GF2 will do 250 million triangles per second. I don't know how fast the Radeon is as far as geometry, but if anyone else knows, please share! It is also rumored that the GF2 will be in stores on Friday. As in, THIS Friday. Whoah.

    Back on ATI's side, the Radeon looks like it will have more features than the GF2. As a game coder, I like that. :) Also, ATI is likely to have better Linux support. I also like that.

    It looks like choosing between these two cards will be tough, but I'm leaning towards ATI right now. One thing that I know for sure, however, is that 3dfx is not in the running. Their only hope right now is to drop their prices very low. I would not like to be working at 3dfx right now.

    Oh, here's some links:

    Again, nVidia will be announcing the GF2 on Wednesday. Check their site then for details.

    ------

  • by b_pretender ( 105284 ) on Monday April 24, 2000 @08:14PM (#1112488)
    It's nice to hear that it will be released supporting all of the major OSes. Let's hope that this isn't just a marketing ploy, and that most new hardware will follow this plan.

    Things seem to be swinging back to the way that it was in the '80s with many different OS's. This time, however, we have standardized hardware (mostly).

    Making Linux into an adequate gaming platform also depends upon immediate support when hardware is released. I think the ATI Radeon is a step in the right direction.

    --
  • by Sir_Winston ( 107378 ) on Tuesday April 25, 2000 @12:18AM (#1112489)
    > I can't understand why you would have ever bought an AMD
    > CPU in the past (I certainly can now with the Athlon, however).

    First point: I bought an AMD K6-2 because, at the time, it was the only reasonable alternative to supporting the Intel monopoly, a monopoly I found just as odious then as I do now. The K7 Athlon was six months or more away, and I needed a computer sooner than that. And, I wasn't going to buy Intel on principle--they'd been handing us slight modifications of the original Pentium, without much true innovation, for far too long. Amazing how the underdog AMD, with comparatively few resources, was first to market with a true "786" processor core... But, back to the point, I bought AMD on principle, and their price/performance ratio at the time was very competitive with the Celeron (differing clockspeeds for equal performance, of course). It'll still make a damn fine file server when I build a new Athlon/Thunderbird system at the end of the year.

    > The main reason why ATI cards have problems on Super7 is because
    > Super7 is a bad, bad hack. If you actually look around, all of the
    > cards were having problems around that time.

    Yes, but you're missing the point! *ALL* of the other major manufacturers fixed their drivers to make their products work with Super 7, *EXCEPT* for ATI. A TNT2 will run well on Super 7, and has since a couple months after introduction when the drivers were fixed for VIA Super 7. Ditto for G200/G400. Even a shiny new GeForce runs well under VIA Super 7. But still, after an eternity, not the ATI Rage 128 based cards. *That's* the point. All other important players run well on VIA Super 7 chipsets now, except for ATI which has had well over a year to fix their driver support. So, the problem is with ATI's protracted laziness. Nowadays Super 7 may not be the best platform to bother with, but it was for most of the over a year in which ATI ignored driver dev for it.

    > And the REALMagic is a PCI card, right? And your ATI card is
    > AGP? Again, this is more a problem with Super7 than the manufacturer
    > of your graphics card.

    No, no, no, no. The video card is a standard PCI model; I have an AGP slot which I wanted to save for an nVidia or 3Dfx card when I could save the money to upgrade, then use the ATI strictly for its multimedia functions like Video Desktop and vid capping. Look towards the future, I always say, and I wasn't about to waste my AGP slot on a card featuring an ATI chip which was already a year old. So, it isn't a problem with VIA's AGP implementation; it's, I repeat, a problem with ATI's substandard hardware and driver support. So, if the REALmagic card has no problem doing DVD over my PCI bus, the ATI card shouldn't either. It's that simple. REALmagic took the time to write drivers which would handle DVDs well on a Super 7 mobo, but ATI Multimedia didn't. Their driver support is consistently substandard when compared to nVidia, 3Dfx, and Matrox, and I mean even on Intel mobos--read the posts on the Rage Underground help boards if you doubt this. But the driver support is especially bad for AMD/VIA solutions. There is *ZERO* excuse for blaming a chipset instead of fixing your drivers to work with it, especially when every other important player in the graphics industry has managed to make their cards work quite well with it. And before you try again to lay the blame elsewhere, yes, the latest VIA drivers have been installed and configured properly.

    And, a final note: especially since a PCI-based Rage 128-based card could not play well with X in standard SVGA mode, ATI has absolutely no business calling a VIA chipset non-standard. It's ATI's cards and drivers which are non-standard.
  • by caedes ( 157147 ) on Monday April 24, 2000 @09:02PM (#1112490)
    Concerning using two cards running in tandem: it is also possible to run this card in quads. Each card will render every other pixel of every other scanline, thereby quadrupling the effective frame rate. That is, only if you have 4 slots.
  • by green pizza ( 159161 ) on Monday April 24, 2000 @10:04PM (#1112491) Homepage
    With Mac OS X and more and faster Power Mac G4s in the wings, I really hope ATI looks forward when developing Mac drivers. The Rage II, Rage Pro, and Rage 128 drivers for the Mac OS have left a bad taste in my mouth, and at this point I wouldn't be too heartbroken if they were to develop their next set of drivers for G4/AltiVec and Mac OS X only. Perhaps they oughta consult with some Linux and NEXTSTEP/OpenStep/Mac OS X experts for help with their non-Windows drivers. Omni [omnigroup.com] has done miracles with NEXTSTEP and OpenGL in the past for Carmack and id; perhaps someone oughta give them a call.

    Rant, Rant, Rant


  • by Anonymous Coward on Monday April 24, 2000 @09:44PM (#1112492)

    You [mentaltempt.org] have [dictionary.com] overlinked [sfsu.edu] this [this.com] article [article.com]. Just because [justbecause.com] you [mentaltempt.org] know [notmuch.com] the [the.com] URL [url.com] of [msn.com] Linux [linux.org] doesn't [dbytes.com] mean [tony.ai] you need to [youneedto.com] use it. [useit.com]

  • by Shaheen ( 313 ) on Monday April 24, 2000 @08:11PM (#1112493) Homepage
    I'm just skimming through the tech specs here and I'm just gonna comment on a few things...

    Note, I'm no graphics professional. I am merely an interested individual. Repeat: I am no John Carmack! :P

    The first thing that jumped out at me as something ATI seems to be doing in the "new and cool" area (rather than just adding more horsepower to today's GPUs) is keyframe interpolation. Not *2D* interpolation, but *3D mesh* interpolation. The idea has a good illustration at the bottom of this [ati.com] page.
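
    To make the idea concrete, here's a rough Python sketch of what vertex keyframe interpolation amounts to (the linear blend and all the names here are my own guesses for illustration, not anything from ATI's spec):

        # Sketch of keyframe (vertex) interpolation: store two key meshes and
        # blend vertex positions between them instead of storing every frame.
        def lerp_mesh(key_a, key_b, t):
            """Linearly blend two lists of (x, y, z) vertices; t in [0, 1]."""
            return [
                tuple(a + t * (b - a) for a, b in zip(va, vb))
                for va, vb in zip(key_a, key_b)
            ]

        # Two keyframes of a three-vertex "mesh".
        frame0 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
        frame1 = [(0.0, 0.0, 1.0), (1.0, 1.0, 0.0), (1.0, 1.0, 1.0)]

        print(lerp_mesh(frame0, frame1, 0.5))  # halfway between the two poses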

    Voxels seem to be cropping up here [ati.com]. It's cool to see that they are adding support for them at the hardware level. I know that John Carmack has been skeptical about using voxels due to the sheer amount of processing power they need.
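
    For a sense of why the cost scares people off, a quick back-of-the-envelope calculation (my own numbers, nothing from ATI):

        # Voxel storage grows with the cube of the resolution, which is why
        # hardware help is interesting. Bytes-per-voxel is an assumption.
        bytes_per_voxel = 4  # e.g. color + density
        for side in (64, 128, 256, 512):
            voxels = side ** 3
            megabytes = voxels * bytes_per_voxel / (1024 ** 2)
            print(f"{side}^3 grid: {voxels:,} voxels, {megabytes:,.0f} MB")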

    Most of the stuff I saw in the specs, however, is mostly just fluff covering various graphics technologies and what they do. While the specs hint that the chip will have support for them, it doesn't do too much more than hint at it.

    Maybe there'll be more information soon...
  • by dragonfly_blue ( 101697 ) on Monday April 24, 2000 @09:34PM (#1112494) Homepage
    Hopefully, they will come out with an ISA version, for Slashdot-Terminal [slashdot.org].

  • by Stiletto ( 12066 ) on Tuesday April 25, 2000 @03:09AM (#1112495)
    You're absolutely right...

    Everyone is going to announce their latest chip with guns blazing, claiming that it will be the fastest thing ever, with the most features, blah, blah, blah.

    And, I'm surprised slashdotters are falling for it. "Look at those specs!!! It must be good!!! I can't wait to buy it!!!" The marketing folks there must be already patting themselves on the back.

    If anyone's suffered through ATI's past chips (and their [lack of] relationship with the Linux community) they will already know to stay away from anything ATI puts out in the future.

    When shopping for hardware, especially video hardware where the competition is downright cutthroat, here are some do's and dont's:

    Don't rely on online reviews.

    Or at least hit reload every once in a while to make sure the site isn't financially supported by one of the card companies reviewed. Let's not forget the fiasco where a chip company's ads were running on "you know who's" Hardware Guide a few months back when they were trying to do an "unbiased" review.

    Don't rely on benchmarks created by a graphics chip company.

    Of course NVIDIA's card will run the NVIDIA tree demo faster than anyone else's!!! What unbiased information does this tell you? Nothing. I personally find any benchmark that is not part of an actual application totally useless. Quake is an OKAY benchmark if you're into gaming, and many CAD applications come with their own benchmarks. I'd put a little more trust in these.

    Test all resolutions and color depths

    Remember: low-resolution and/or high-poly tests gauge the driver's performance and efficiency, while high-resolution, low-poly tests gauge the card's fill rate. Don't trust a Quake benchmark that is only done at 640x480. Beta or low-quality drivers can make a card look bad at that low a resolution. (A rough sketch of the arithmetic behind this follows at the end of this list.)

    Test on multiple CPU's

    Make sure the graphics chip's performance scales well with better CPU's. Drivers can also be optimized for Pentium-2 and -3 class machines.
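
    As promised above, a rough illustration of the resolution point (the fill rate, overdraw factor, and target frame rate here are made-up round numbers, not measurements of any card):

        # Why low resolutions say little about fill rate: the pixel demand at
        # 640x480 is a small fraction of a typical card's rated throughput,
        # so the score mostly reflects driver/CPU overhead instead.
        fill_rate = 800e6   # pixels/sec the card can theoretically shade
        overdraw = 3        # each screen pixel shaded ~3 times per frame
        target_fps = 60

        for width, height in [(640, 480), (1024, 768), (1600, 1200)]:
            needed = width * height * overdraw * target_fps
            print(f"{width}x{height}: {needed / 1e6:.0f} Mpixels/s "
                  f"({needed / fill_rate:.0%} of rated fill rate)")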
    ________________________________
  • by Gary C King ( 34445 ) on Tuesday April 25, 2000 @01:41AM (#1112496)
    ... called answer the question.

    I've got a graphics card by some manufacturer (it really doesn't matter who) that has a 1.6 Gigatexel fill rate. Now, given the propensity for developers to use 32-bit textures, that means that each and every one of the 1.6 billion texels I process every second must be accompanied by its own 4-byte read. Now, how much memory bandwidth does this require? And how much bandwidth is on the card?

    Now, let's start expanding on this... 30 million triangles/second, given triangle lists, equates to about 16 million vertices. At 3 floats (x, y, z) per vertex and 4 bytes per float, that's another 192MB/sec of bandwidth we don't have. In order to actually use textures, each of those vertices also needs texture coordinates, which add another 2 floats, or 128MB/sec. Then for lighting we need a normal vector at each vertex... that's another 3 floats, so 192MB/sec. And in order to actually project these coordinates onto the screen, every vertex needs to be multiplied by a 4x4 matrix, or 16 more floats. Whoopee! That's another GIGABYTE of bandwidth down the tubes. Then to actually display this, since I have 2 texture units per pixel pipeline, my card delivers an 800 megapixel fill rate, which at 8 bytes per pixel (24-bit RGB + 8-bit alpha, 24-bit Z, 8-bit stencil) is another 6.4GB/sec of bandwidth.

    So, when all is said and done, to reach the theoretical maximum of my card, I need 14.3GB/sec memory bandwidth minimum. Add in things like texture filtering (multiple texel reads per texel write) or alpha blending and you can break 20GB/sec easily.
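
    Running the same figures as a quick script (all numbers copied from the tally above, including counting the 4x4 matrix as per-vertex traffic the way the post does):

        # Back-of-the-envelope bandwidth tally, reproducing the arithmetic above.
        texels_per_sec   = 1.6e9   # rated texel fill rate
        vertices_per_sec = 16e6    # ~30M triangles/sec with shared vertices
        pixels_per_sec   = 800e6   # rated pixel fill rate
        FLOAT = 4                  # bytes per float

        texture_reads = texels_per_sec * 4             # one 32-bit texel fetch each
        positions     = vertices_per_sec * 3 * FLOAT   # x, y, z
        texcoords     = vertices_per_sec * 2 * FLOAT   # u, v
        normals       = vertices_per_sec * 3 * FLOAT   # nx, ny, nz
        matrices      = vertices_per_sec * 16 * FLOAT  # 4x4 transform per vertex
        framebuffer   = pixels_per_sec * 8             # 32-bit color + 32-bit Z/stencil

        total = (texture_reads + positions + texcoords
                 + normals + matrices + framebuffer)
        print(f"{total / 1e9:.1f} GB/s")  # ~14.3 GB/s, before filtering or blending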

    Multiply all this by about 10 for Microsoft's X-Box (which somehow claims to shovel 14.4 Gigatexel performance across a 6.4GB/sec unified bus), and you'll know why any and all paper specs for the X-Box are completely ridiculous.

    There is only one architecture currently available or in production that actually has the bandwidth to support its theoretical maximums, and there's no way in hell it'll fit in an AGP slot. It's manufactured by Sony, can currently be bought in Japan for about $400, is slightly larger than a bread box, and provides 48GB/sec of bandwidth, albeit at a slight hit to the actual frame buffer size.

    So, in the spirit of the industry, I'm announcing my new video card. It has 400Gigatexel performance, and can transform 100 billion triangles every second. Unfortunately, due to current memory and bus technologies, you'll never see more than about 500 megatexels and 2.5 million triangles, anyway.
  • by Sir_Winston ( 107378 ) on Monday April 24, 2000 @10:22PM (#1112497)
    Yes, it *sounds* fantastic on paper. So did, a-hem, the original Rage Fury, what with its groundbreaking new chip and whopping 32 MB of memory. But, ATI has always had one fatal flaw, and that flaw will doubtless plague them still: their drivers absolutely suck. It took a good six months after the original Rage Fury was released for it to get the performance it should have and could have had from day 1--by that time, TNT2 was mopping the floor with it in both performance and image quality, and especially in price. It was outdated by the time it was performing up to par with its specs.

    This has always been ATI's main problem. Unlike nVidia and 3Dfx, ATI releases drivers slowly and never ever advertises them; in fact, its own driver download pages warn that the drivers are only supposed to be for people experiencing problems, etc., and might cause new problems. They go beyond a "standard disclaimer" and try to actively discourage driver updates--no wonder then that sites like "Rage Underground" are the center for the ATI guys into performance, sites which have their own *unofficial* performance-optimized drivers because ATI drivers suck.

    So, I'm convinced that no matter the potential of ATI's new chips, they won't live up to them until it's too late. The other ATI problem is also driver-related: lack of hardware support. If you don't have an Intel processor/mobo, think twice before plonking down hard currency for anything made by ATI. I myself got an ATI All-in Wonder 128 card this Christmas, and it refuses to play well with my VIA based motherboard/K6-2 processor. It's not like the Super 7 platform is either too new or too old for ATI to have supported it in the Rage 128 based cards, or that the VIA MVP3-G chipset is so uncommon. ATI, quite frankly, just doesn't care about supporting non-Intel platforms, because they don't have to. They're the company of choice for Intel-based OEMs. So, they don't care about performance-loving AMD-using geeks like a lot of us here.

    This is in stark contrast to nVidia and 3Dfx, which release new drivers all the time and which try to support every viable platform. When GeForce cards were having a problem on Athlon mainboards, nVidia released new drivers to fix the problem. Yet, ATI would probably have done the same thing they did a year ago with K6-2 and K6-3 platforms and the Rage 128 cards and blame the problem on the chipset vendors for being non-standard--i.e., non-Intel.

    This is a serious attitude problem on ATI's behalf, and until they can prove that they'll provide adequate enough driver support at least for Windows, I'd recommend staying away from anything they offer because the drivers will kill it. Let alone Linux. I tried installing both Corel Linux 1.0 and Linux-Mandrake 6.0 with my A-i-W 128--based on the same year-old chip from the Rage Fury--and couldn't get it to work with X even in generic SVGA mode. ATI doesn't support all common platforms under Windows, so forget about decent Linux drivers.

    I am somewhat satisfied with the multimedia features of my All-in-Wonder 128 under Windows--Video Desktop is a godsend--but even then DVD playback was unbearably awful. Of course, ATI blamed it on my VIA chipset. Funny, then, how the REALmagic Hollywood+ I bought after the ATI's DVD performance proved unwatchable delivers flawless playback on the same VIA chipset, with CPU usage averaging under 5%. Yeah, ATI, blame it on the mobo chipset instead of your own laziness when it comes to drivers...

    As I said, I like the multimedia features of my A-i-W 128; even though DVD playback won't work because of shoddy drivers, the rest of it is great. Video capture is flawless, and Video Desktop for TV viewing always wows my guests and provides me with hours of entertainment during my long visits to pr0n--er, tech sites. But never, ever, ever buy an ATI card for its performance stats. It won't live up to them until the card is outdated, and even then it might never live up to them unless you have an Intel mobo and processor.
  • by jmd! ( 111669 ) <jmd@pLISPobox.com minus language> on Monday April 24, 2000 @08:03PM (#1112498) Homepage
    Wow, ATI says their chip is the fastest? GET OUT! I thought they were going to say it sucks. And ATI's benchmarks prove it? NO WAY!

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...