ATI Releases Five New Radeons
An anonymous reader writes "Eager to retake the performance crown from NVIDIA, ATI has announced five new releases for their Radeon product line. The latest card features 512MB of GDDR4 memory running at 1000MHz, making it currently the fastest single-GPU card out there. From the review: 'ATI has proven they are a leader and not a follower with the X1950 XTX. ATI has released the world's first consumer 3D graphics card with GDDR4 memory, clocked at the highest stock speed ever, that chews through games when it comes to high-definition gaming. Memory bandwidth looks to once again be the defining factor in 3D performance. With a re-designed heatsink/fan unit, faster memory, and a lowered price, the ATI Radeon X1950 XTX and CrossFire Edition are both serious 3D gaming video cards for the [H]ardcore that offer some value over NVIDIA's more expensive 7950 GX2. ATI's CrossFire dual-GPU gaming platform looks to have just grown up.'"
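A quick back-of-the-envelope check of why that memory clock matters (a sketch; the 256-bit bus width is my assumption, not something stated in the summary, and GDDR4 transfers data twice per memory clock):

```python
# Rough peak-bandwidth estimate for a 1000MHz GDDR4 card.
clock_hz = 1_000_000_000      # 1000 MHz memory clock
transfers_per_clock = 2       # double-data-rate signalling
bus_bits = 256                # assumed memory bus width

bandwidth_bytes = clock_hz * transfers_per_clock * bus_bits // 8
print(f"peak memory bandwidth: {bandwidth_bytes / 1e9:.0f} GB/s")  # ~64 GB/s
```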
Screw ATI (Score:5, Funny)
Re: (Score:3, Funny)
They have 2MB of EDO memory (upgradable to 4) for the ultimate experience.
It runs on PCI. Express graphics are guaranteed.
Re: (Score:2, Interesting)
Re: (Score:2)
You know, the RISC processor is going to change everything.
Re: (Score:3, Funny)
RISC is good.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:2)
IBM [ibm.com] seems to disagree with you. They have both words on that page to make Google happy, but refer to them directly as hard drives ("Troubleshoot my hard drive", "your hard drive", "SCSI hard drive", etc.). Google puts the drive:disk ratio at 167,000,000 [google.com] to 68,300,000 [google.com].
In other words, don't make a campaign out of "hard disk" being correct unless you enjoy being rebuffed.
Re: (Score:2)
Re: (Score:3, Funny)
Drivers? (Score:5, Insightful)
Re: (Score:2)
Most of us don't need these cards. These are for hardcore gamers. As in shoot-em-ups that will only run on Windows, not real games like nethack. ATI won't be very concerned about the lost market.
Re: (Score:2)
Re:Drivers? (Score:4, Interesting)
fglrx(0) PreInitDAL failed
fglrx(0) R200PreInit failed
ATI's total failure to regression-test even recent cards, like the one in my laptop, is astounding. I'll never make the mistake of buying ATI products again.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
ATI's Linux Driver Page [ati.com].
They just released their 8.28.8 drivers a couple of days ago, and they had just released the previous version about 3 weeks before that. So, there are some changes being made at least. Also, with the AMD merge, they are considering opening up the source code to at least portions of the driver, so I personally expect ATI to become a serious player in Linux space in the not-too-distant future.
Re: (Score:2)
Re: (Score:2)
This comment would be equally valid if you s/Linux/Windows/. ATI has consistently displayed an utter inability to write drivers since, well, time immemorial.
Re: (Score:2)
I don't see how ATI has such high quality hardware and such crappy drivers, on 2 platforms, year after year.
Re: (Score:2)
Now you will have to buy a new graphics card when you buy Vista.
And yes I am just kidding.
I hope that AMD will help correct some of the driver issues that ATI has suffered with.
I would still drop nVidia in a minute if AMD/ATI produced OSS drivers for Linux.
The sincerest form of flattery (Score:5, Funny)
ATI has proven they are a leader and not a follower with the X1950 XTX
No, X1950 XTX, ATI's top of the line card, sounds nothing like Nvidia's top model, 7950GTX.
Re:The sincerest form of flattery (Score:4, Funny)
Re: (Score:2)
Re: (Score:2)
I love my X850 XT PE.
It works both ways. (Score:3, Interesting)
ATI started using the "GT" moniker (x800 GT, x1900 GT), and even extended the moniker to "GTO", after the incredible success of Nvidia's 6600 and 6800 GT.
They've been doing this sort of thing for years.
And while ATI looks like they ripped off the "950" name with the x1950 XTX, this one again cuts both ways:
ATI was the first to use "50" increments in their product naming. Witness the Radeon 9250 and 9550, which were released a few years ago.
Re: (Score:3, Informative)
Both statements are wrong, though the correct answer will not change anything in the Nvidia vs. ATI naming race:
Nvidia 5900XT and 5600XT are approx. ½ year older than Nvidia 6800XT.
Nvidia 5950 is approx
Re: (Score:2)
There's certainly a lot of borrowing going on in tech companies' marketing departments. It makes sense to an extent, so that people know what they're buying and how it stacks up against competitors' products. The whole Athlon XP (another stolen name) line was numbered to resemble the clock speeds of Pentium equivalents.
You may be right about Nvidia stealing the "50" from ATI, but I thou
ATI/AMD - Show leadership (Score:5, Insightful)
What the new AMD-led ATI can do to show leadership is release the information (or even drivers) needed for Linux to take full advantage of their cards' capabilities.
ATI has seemed unwilling to do this. I hope that changes under the new AMD administration.
What I've heard in the Linux community is to stay away from anything ATI if you plan to use it with Linux. Too bad, really, because they do make nice cards.
Re:ATI/AMD - Show leadership (Score:5, Insightful)
What I've heard in the Linux community is to stay away from anything ATI if you plan to use it with Linux.
The same applies to nvidia. Try Intel or Unichrome cards. Support companies that support FOSS.
Oh, and for the people who'll inevitably reply with the "they can't release the source because of 3rd-party IP" line (I am tired of that particular whine) - why can't ATI/Nvidia release the source for the code they do have IP rights over, and allow the OSS community to fill in the blanks?
Re: (Score:2, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Anyway, if you have political issues with Nvidia that's one thing, but otherwise they've run fine under Linux for years.
Re:ATI/AMD - Show leadership (Score:5, Informative)
They're very low-end (used in cheap laptops, VIA's embedded line, etc.), so if you're a Windows-gamer fanboy, you're not going to have heard of them. (And if you judge a card by its name, you have bigger problems than that.)
Anyway, if you have political issues with Nvidia that's one thing, but otherwise they've run fine under Linux for years.
No they don't. They run better than ATI's offering, but there are a number of things that don't work correctly. (TwinView doesn't support multiple monitors with different resolutions, framebuffer/X switching support is poor, you can't report (Linux) bugs to the kernel team, you're allowing an unaudited binary blob to run in kernel-land; I can go on and on.)
If Nvidia & ATI were the only choices, then fine, I'd recommend Nvidia's buggy binary blob over ATI's buggier binary blob. But they're not. Two companies have offered the specs & a reference GPL'd driver - I recommend them, and I think other supporters of FOSS should do likewise.
Saying that a recommendation of a driver that actually supports Linux over one that doesn't is 'political' is.... well - let's say I suspect you have a political agenda of your own.
Re: (Score:2)
Huh? Using "a very low-end" card over one with much more Linux functionality, even if not 100% of its Windows functionality, because of their licensing terms -- what is that if not "political"? Not that there's anything wrong with putting your money behind your politics.
Re: (Score:2)
1) I listed two cards (not just the low end)
2) A bug-free but slow/low-featured card is superior to a buggy, fast/full-featured card (IMO)
3) I'm sick of people dismissing support for cards that don't need kernel-mode binary blobs as 'political', when it's practical.
4) As other people in this thread have pointed out, Nvidia's Linux drivers are only useful for Linux - think of the AtheOS, *BSD (minus FreeBSD), etc., etc. users out there. Pleas
Re: (Score:2, Informative)
Having source makes the device useful for something _other_ than Linux. Like NetBSD.
ATI have historically been easier to work with than Nvidia in this regard. One can get source for some ATI products, and they are willing to work under NDA. I've even produced a radeonfb for NetBSD using information that was under NDA (and had never been released outside ATI before), and ATI let me release the drivers back to the community under a BSD license.
Re: (Score:2)
No more than you can report kernel bugs for any kernel issue where you have some random hunk of code involved, regardless of whether you have the source available to you or not. The kernel developers aren't going to help you unless they feel like you are only running their code and their code only.
Unless you, personally, have audited the code you don't know if it has been audited or not. If y
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Those of us who want the source (or, at the very least, the specs so we can write our own drivers), want it because we want our hardware to work. The only thing I have ever seen crash FreeBSD was the nVidia binary driver. The vast majority of Linux kernel panics I have seen have been as a result of the nVidia binary drivers. T
Re: (Score:2)
Whoa! Chill, I wasn't using the imperative, just suggesting people use FOSS friendly cards to support FOSS.
It would be more like I'm saying:
"if you support FOSS you shouldn't use those cards, because they're not releasing the source"
As to the rest of your rant: if you're a gamer, sure, you're probably dual-booting to Windows, and perhaps an nvidia/ati solution is better for you. But as far as choosing ATI/Nvidia for pure Linux? Not many peo
What about games? (Score:2)
Re: (Score:2)
I'm sorry, but it's a waste of time for them.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Drivers (Score:5, Insightful)
While NVIDIA is not perfect, the two cards I have from them work perfectly with their drivers. ATI may be releasing better-featured cards, but their drivers leave something to be desired.
100 MB driver? screws my LCD panel? (Score:3, Informative)
Next... I noticed that text on my LG LCD monitor (20 in. widescreen) was of really poor quality. I even installed ClearType from Microsoft; it didn't help much. I started thinking it was my monitor, but then hooked it up to my laptop, which has NVidia. Wow! What a difference! Even wit
Same on my Acer AL2032W (Score:4, Informative)
I can tell you exactly what happens there, because I've put some time into diagnosing the exact same problem on my Acer AL2032W monitor. And it still pisses me off that the problem _still_ isn't fixed, in spite of being known for ages.
The problem starts like this: some cretin at ATI decided that, if the card detects a DVI cable, it should automatically trust the highest resolution reported by the monitor, and, here's the idiotic part, never allow the user or the monitor drivers to override it. So if it reads 1600x1200 as the highest supported resolution, any other resolution you choose will automatically be either scaled to 1600x1200 or centered in a 1600x1200 image. There is no option that lets me say, basically, "fuck off and just send the image as it is to the monitor."
Why is that an idiotic idea? Well, here's why: because some monitors support resolutions higher than their native one. E.g., there are a ton of 1280x1024 monitors which report that they also support 1600x1200. Or the AL2032W: it has a 1680x1050 native resolution, but _also_ supports 1600x1200. They just down-scale that to their native resolution.
So think of the following scenario: let's say your monitor is a 1280x1024, but affected by the abovementioned quirk, and you set your desktop resolution to 1280x1024. It should be crystal clear, right? Well, on an Nvidia card it would be, but on an ATI card it isn't.
What ATI will do there is scale your 1280x1024 image to 1600x1200 first, before sending it to the monitor. Which makes it all fuzzy already. But then your monitor has received an image which doesn't fit its native resolution. So it will rescale this 1600x1200 image back to 1280x1024. This doesn't re-create the original crystal-clear image, but adds _more_ fuzziness to it.
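You can reproduce the effect in pure software, no ATI card required. A minimal sketch of the same double rescale (mine, using PIL; the checkerboard is just a convenient worst-case test pattern):

```python
from PIL import Image

native = (1280, 1024)  # the monitor's real resolution
forced = (1600, 1200)  # what the driver decided to send

# A 1-pixel checkerboard: worst case for any rescaling filter.
src = Image.new("L", native)
src.putdata([255 * ((x + y) % 2) for y in range(native[1]) for x in range(native[0])])

# Driver upscales to the "detected" resolution, monitor scales it back.
roundtrip = src.resize(forced, Image.BILINEAR).resize(native, Image.BILINEAR)

# A lossless path would give a mean error of 0; this won't be close.
err = sum(abs(a - b) for a, b in zip(src.getdata(), roundtrip.getdata()))
print(f"mean per-pixel error: {err / (native[0] * native[1]):.1f}")
```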
Yes, I know what you mean by "really poor quality" there, and even that is mildly put. It's piss-poor quality. It was so fuzzy on my monitor that it gave me headaches in less than an hour.
And the really idiotic and annoying part is that it doesn't even allow you to override that. Once it's decided 1600x1200, that's it. Whoever designed it had the arrogance to decide that surely the user is too stupid to know such technical details, so let's not trust the user with the power to set something else. I find that not only utterly idiotic (since we just saw that it can guess wrong), but outright offensive.
Anyway, there are two solutions to this:
1. Download the Omega drivers. Strangely enough those are smart enough to read the native resolution, not the maximum supported one.
2. Use a VGA cable. On VGA it does allow you to set your maximum resolution and frequency yourself.
(This also goes in case someone wants to jump in with the usual "just set the resolution in the control centre" advice. Trust me, it doesn't work over a DVI cable. Over a VGA cable it works. Through DVI it doesn't.)
Personally I find both solutions pretty annoying. Number 1 involves installing a non-official, non-supported driver. (And if you know how drivers run in kernel mode in Windows, you'll understand what's scary about running non-official drivers just downloaded off some web site.) Number 2 basically involves throwing the whole "digital" part out the window and using an LCD monitor as a glorified analog CRT with larger pixels.
Re: (Score:2)
I loaded the Omega driver and it works much better, and I've never had any issues with the drivers as far as viruses, spyware, crashes, etc...
Re: (Score:3, Interesting)
p.s., I don't suppose you use a KVM switch, do you? My fr
Re: (Score:2)
Re: (Score:2)
Imagine if car companies were also the gas company and could modify the gasoline to make their older models run worse/less efficiently/slower.
You wouldn't be able to get a car to work beyond 3 years.
Wait for DirectX10 cards? (Score:5, Insightful)
Are sales declining because of anticipation of this?
Will ATI and Nvidia be able to shift large quantities of cards over the next few months, with people like myself waiting for the next (significant) generation?
Aside: Yes, I am aware that these cards will still pack a punch in DirectX10 games and will not be obsolete overnight, but the unified shader/vertex architecture of DirectX10 seems to be a big shift in card design and will offer a lot of features to game designers that are not efficiently doable on the older hardware, so you may be stuck with a worse-looking rendering of a new game.
Re: (Score:2)
Also there's always the possibility that they are DX10 cards. I don't know
Radeon Definition (Score:5, Funny)
This definition causes all sorts of problems, such as how to define dual-card setups and what happens when a Radeon is attached to a daughter card rather than a motherboard. Videostronomers are currently divided between those who favour the term "Radeon" and those who argue that we should stick with the current definition favoured by consumers, which is "the weird square-ish blue plug at the back of my Dell".
Graphics card naming... (Score:3, Funny)
Re:Graphics card naming... (Score:5, Informative)
They're model numbers. The requirement is that they be different between different cards, so customers can see that different products are different. Beyond that, marketing can do whatever they want with them - it doesn't really matter.
Surprisingly, the marketing departments at ATI and Nvidia have settled on a highly structured and informative system for model numbers (for something generated by marketing departments).
Here's how it works: take the "X1950 XTX". That splits into 4 segments: "X1" is the generation, "9" is the class, "50" is the revision, and "XTX" is the specific model. Nvidia uses exactly the same system: for the 7950 GX2, we have generation 7, class 9, revision 50, specific model GX2.
Generation usually changes yearly. Class splits into (generally): 0-3 is low-end, 5-7 is mid-range, and 8-9 is high-end. The revision number allows more recent products to have higher numbers than older products. Generally, for ATI the specific-model suffixes rank upward from "Pro" to "XT" to "XTX". Now - that still doesn't let you determine which card is "better" based on the model number, but model numbers never do that. Which is better, an "AMD Opteron 165" or an "AMD Athlon64 FX-50"?
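For what it's worth, the scheme is regular enough to split mechanically. A toy sketch (the regex and function names are mine, purely illustrative):

```python
import re

# Splits 2006-era GPU model numbers into the segments described above:
# generation, class, revision, and the specific-model suffix.
MODEL_RE = re.compile(r"^(?P<gen>X?\d)(?P<cls>\d)(?P<rev>\d\d)\s*(?P<suffix>[A-Z0-9]*)$")

def decode(model: str) -> dict:
    m = MODEL_RE.match(model)
    if not m:
        raise ValueError(f"unrecognized model number: {model}")
    return m.groupdict()

print(decode("X1950 XTX"))  # {'gen': 'X1', 'cls': '9', 'rev': '50', 'suffix': 'XTX'}
print(decode("7950 GX2"))   # {'gen': '7', 'cls': '9', 'rev': '50', 'suffix': 'GX2'}
```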
Re: (Score:2)
Another Review Perspective (Score:4, Informative)
This review talks up the single X1950 XTX card but finds ATI's CrossFire platform still very under-developed.
ATI is Evil (Score:4, Insightful)
Although it was touched on a little above, I couldn't pass up the opportunity to rant about ATI. ATI does not support more than 2-3 generations of cards. Their driver development quickly stops, and their Catalyst drivers are ridiculously huge.
On the Linux side of things their support is so freaking lame it is ridiculous. Reverse-engineered open-source drivers are 10X better than the drivers developed by ATI. ATI is pathetic, and any company that releases such terrible software in their name does not have very high standards and cannot be trusted. I had a Radeon 8500, and I will never recommend or waste my money on such pathetic ATI junk again.
Re: (Score:2)
Comment removed (Score:3, Interesting)
But... (Score:2)
HEXUS.review (Score:3, Informative)
Loads more reviews out there too. Anyone feel like making a list?
Re: (Score:2)
About time (Score:3, Funny)
Re: (Score:2)
More X1950XTX Reviews (Score:5, Informative)
- http://www.hothardware.com/viewarticle.aspx?artic
- http://www.hexus.net/content/item.php?item=6538 [hexus.net]
- http://www.mvktech.net/content/view/3357/48/ [mvktech.net]
- http://pcper.com/article.php?aid=287 [pcper.com]
- http://uk.theinquirer.net/?article=33872 [theinquirer.net]
- http://www.reghardware.co.uk/2006/08/23/review_at
- http://www.techpowerup.com/reviews/ATI/X1950XTX [techpowerup.com]
- http://www.bjorn3d.com/read.php?cID=954 [bjorn3d.com]
- http://techreport.com/reviews/2006q3/radeon-x1950
- http://www.extremetech.com/article2/0,1697,200732
- http://www.tgdaily.com/2006/08/23/ati_releases_ra
- http://www.guru3d.com/article/Videocards/375/ [guru3d.com]
- http://www.hardwaresecrets.com/article/131 [hardwaresecrets.com]
- http://www.hardwarezone.com/articles/view.php?id=
- http://www.firingsquad.com/hardware/ati_radeon_x1
- http://www.driverheaven.net/reviews/X1950XTXrevie
up to date list: http://www.madshrimps.be/forums/showthread.php?s=
Sure this is good but... (Score:2)
OT: What's with the alphabet soup? (Score:5, Funny)
Is there some kind of rule that says we can only use letters like X, N, R, and words like CrossFire, to denote 'cool' products mainly aimed at men?
Just once I'd like to see an ATI Shiny B001 LALA and FluffyPants Edition. Just to shake things up.
Who in their right mind does benchmarks this way? (Score:3, Insightful)
Re:Who in their right mind does benchmarks this wa (Score:4, Insightful)
That's what's really relevant. I don't care if card X gets 200fps in 1024x768 mode and card Y gets 300fps. Both are way above my "give a shit" boundary. What I want to know is at what level they start to drop to the point where I'll notice.
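In other words, the benchmark I'd actually want looks something like this sketch (all the fps numbers are invented):

```python
# Find the highest resolution that still clears the playability
# threshold, rather than comparing scores far above it.
GIVE_A_SHIT_FPS = 60

results = {  # hypothetical (resolution, fps) measurements for one card
    (1024, 768): 200,
    (1280, 1024): 150,
    (1600, 1200): 95,
    (2048, 1536): 48,
}

playable = [res for res, fps in results.items() if fps >= GIVE_A_SHIT_FPS]
best = max(playable, key=lambda r: r[0] * r[1])
print("highest playable resolution: %dx%d" % best)  # 1600x1200
```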
Re:Who in their right mind does benchmarks this wa (Score:2)
They are floating on free graphic cards.
does anyone actually give a flying toss?!?! (Score:4, Insightful)
I am interested to hear from anyone who is genuinely excited by this news. I'm also interested in hearing from someone who would pay £400 to increase their rendering power by 15%.
(Yes, I know that only applies to people who already have the current fastest video card, but I'd love to know if anyone is actually rich and bored enough to replace bleeding edge with bleeding edge at every opportunity.)
Re: (Score:2)
But since I've almost completely switched over to Macs these days, it matters even less than before. (With the new Mac Pros using EFI instead of a BIOS, we Mac users are now stuck hoping ATI and nVidia will go to the trouble to release custom Mac versions of some of their cards that work with EFI. And so far, the *only* one shipping today is a crap
Plural of Radeon (Score:2)
But will it be enough for Oblivion? (Score:2)
I finally broke this habbit (Score:2)
Being a TRUE leader (Score:3, Insightful)
They could start by unifying features into a tight and manageable product set, and eliminate some degree of confusion about features and chipsets from the market.
-AND-
They could stop working on the "problem" of pushing more triangles, and work on the real problem with modern video cards: Power. Personally, I don't really need photorealistic graphic quality if it means I have to keep two power supplies in my system, or plug my video card directly into the wall.
Graphics quality is already impressively high, so maybe it's time to step back, improve the underlying technology, and give the market time to absorb and upgrade. Like others, I still work on my ATI Radeon 9800 Pro with 256MB of RAM. I'm not upgrading anytime soon, because there are fewer and fewer AGP cards available, and I'm not willing to replace my entire, otherwise completely functional system just to get a PCI-E slot. There are a lot of people like me who are waiting, and I'm no Luddite. I like my gadgets. But keeping up with PC improvements has become a game of diminishing returns: I run huge graphics and multimedia applications, plus most of the games on the market, very comfortably on my AMD64 3400+ processor with 1GB of RAM. I have yet to find a game I WANT to play that doesn't play quite nicely on my hardware.
Re: (Score:2)
Also, as new chips come out, the old ones are retired, but not immediately. So the question often is do
Fans? (Score:2)
A Less Glowing Review (Score:4, Insightful)
X1950 XTX review [extremetech.com]
Curious (Score:2)
ATI has the best TV-OUT. (Score:2, Informative)
Color me fickle.... (Score:2)
Nice CPU (Score:2, Interesting)
Funny that this is *still* an issue (Score:2)
I have never held a grudge against ATI, but these driver issues have been a problem for them for a VERY long time. I remember buying my first TNT2 card, and back then the competing product from ATI (can't remember what it was) was riddled with driver problems. So I avoided ATI like the plague and went with Nvidia. Wash, rinse and repeat for each iteration of cards...
It's very interesting to me that here we are - 10
Upgrade me (Score:2)
Now I can play Duke Nukem at 120 frames per second despite the fact that human eyes aren't capable of seeing that much data.
Given that we seem to be reaching the law of diminishing returns on video cards, shouldn't the hardware manufacturers instead start to concentrate on the weakest link by improving the capabilities of the human?
I need eyes that can handle more fps. I need more bandwidth from my eyes to my brain. I definitely need more processing power in my brain.
I won't mention one
Re: (Score:2)
sigh.
There are two factors:
1) Page flipping.
2) 120 FPS will drop when there are 10-20 other people on the screen moving around.
I can get 60 FPS in WoW, but as soon as it gets busy with people it drops to 20.
Of course, even at 20 it's pretty good.
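On point 1, here's a toy model (mine; numbers invented) of why vsync'd page flipping quantizes frame rates: a finished frame has to wait for the next vertical refresh, so the effective rate snaps to refresh/1, refresh/2, refresh/3, and so on.

```python
import math

REFRESH_HZ = 60  # typical LCD refresh rate

def effective_fps(render_ms: float) -> float:
    # Each frame occupies a whole number of refresh intervals.
    refreshes_per_frame = math.ceil(render_ms / (1000.0 / REFRESH_HZ))
    return REFRESH_HZ / refreshes_per_frame

for ms in (8.0, 17.0, 25.0, 40.0):
    print(f"{ms:5.1f} ms/frame -> {effective_fps(ms):.1f} fps")
# 8ms -> 60fps, 17ms -> 30fps, 25ms -> 30fps, 40ms -> 20fps
```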
Re: (Score:2, Funny)
My motto: if it doesn't have Linux, it should!
Re: (Score:2)
Re: (Score:2)
O RLY? (Score:2)
Get this. (Score:4, Informative)
In both Windows and Linux, I can have 8 monitors. EIGHT MONITORS. In any configuration I want, fully 3D accelerated at 1600x1200 per screen.
I'm currently using it for a 6144x768 desktop for an AV switching system demo.
There is nothing in the ATI camp that can do this save (possibly) the FireMV line. And do you know what chip they use in that internally? A 9200. A friggin 9200.
That only works in linux using the open source driver! Absolutely ridiculous.