ATI Radeon X1800 GTO Launched
SippinTea writes "ATI has also hastened to market with a launch of their own this week, with a new performance mid-range graphics card. The Radeon X1800 GTO is a chopped-down version of the Radeon X1800 XL, with 12 pixel pipelines and less expensive, lower-speed GDDR3 DRAM on board. It compares well with the new GeForce 7600GT, but can it compete with a GeForce 7900GT for only a few dollars more?"
Too many video cards (Score:1, Insightful)
Re:Too many video cards (Score:1, Troll)
Key things to look for
1. Get nvidia. The driver support is there.
2. Stick one revision back [e.g. 6xxx instead of 7xxx]
3. Don't get "shared memory" LE or LS or whatever edition cards
4. Don't get 256MB cards unless there is no price difference [or a very small one]
5. Look for native TV out if that's your bag. Sadly, nvidia cards often need win32 drivers to get TV out.
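Just to make the heuristic concrete, here's a toy version of rules 1-4 as a C predicate. Everything in it - the struct fields, the sample card, the $10 threshold - is made up for illustration; rule 5 (TV out) is a preference rather than a hard filter, so it's left out.

    /* Toy filter: rules 1-4 from the post above, as a C predicate.
     * The struct and the sample card are made up for illustration. */
    #include <stdio.h>
    #include <string.h>

    struct card {
        const char *name;
        const char *vendor;       /* "nvidia" or "ati" */
        int series;               /* 6 for 6xxx, 7 for 7xxx */
        int newest_series;        /* newest series on the market */
        int shared_memory;        /* 1 for LE/LS shared-memory parts */
        int mem_mb;               /* on-board memory in MB */
        double price;             /* street price of this card */
        double price_128mb;       /* price of its 128MB sibling */
    };

    static int worth_a_look(const struct card *c)
    {
        if (strcmp(c->vendor, "nvidia") != 0) return 0;   /* rule 1 */
        if (c->series != c->newest_series - 1) return 0;  /* rule 2 */
        if (c->shared_memory) return 0;                   /* rule 3 */
        if (c->mem_mb >= 256 &&
            c->price > c->price_128mb + 10.0) return 0;   /* rule 4 */
        return 1;
    }

    int main(void)
    {
        struct card c = { "GeForce 6600 GT", "nvidia", 6, 7, 0, 128,
                          140.0, 140.0 };
        printf("%s: %s\n", c.name, worth_a_look(&c) ? "buy" : "skip");
        return 0;
    }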
Re:Too many video cards (Score:2, Insightful)
Personally I want a card that can drive my 1920x1200 display at native resolution while I'm playing an FPS, so I'm pretty sure that low-end card isn't cutting it. Really, I want a card that can run two of them, since I don't want to upgrade my relatively new system to one that will properly h
Re:Too many video cards (Score:3, Insightful)
You can get by with decent gaming on a 6600, which will cost you $140. You don't need to buy a 7800 for $500 to play Far Cry or something. Filtering like I suggested will land you a card in the 6xxx series that doesn't cost more than $200 and will let you play games at decent frame rates and resolutions.
So yes, there are a lot of cards out there, but it's usually fairly easy to pick one.
Re:Too many video cards (Score:1, Insightful)
For more information on what I mean, visit http://www.matrox.com/ [matrox.com]
They are releasing an adaptor to turn many video cards capable o
Re:FX5200? Why? (Score:2, Informative)
Last I checked, a Voodoo card from 1997 wouldn't get 30fps at 800x600x32bpp while playing UT2k4.
Nice troll though. It's kinda funny actually, most of the trolling on Slashdot and on Usenet comes from anonymous sources. It's almost like you ARE a coward and think that disrupting a conversation is ok to do, so long as you can post anonymously.
Re:FX5200? Why? (Score:4, Informative)
Re:FX5200? Why? (Score:1)
I got a 6800GT on eBay for £120, and another one a month or so later for the same price, but the adaptor on my SCSI disk is getting in the way so I can't use two yet till I get a bigger case. UT2k4 is very enjoyable at 1600x1200 with just the one card, and I am spending way too much time playing it. I'm really looking forward to the new one, see here, [beyondunreal.com] but I'm pretty certain even with both cards it's going to be
Re:FX5200? Why? (Score:1)
Re:FX5200? Why? (Score:2)
That and when you make nearly six figures, spending an extra $60 or whatever on a video card isn't a big deal [especially in light of my first point].
Tom
Re:FX5200? Why? (Score:2)
Still a poor buy for its time (Score:2)
Re:Too many video cards (Score:1, Redundant)
Tom
Re:Too many video cards (Score:4, Insightful)
Re:Too many video cards (Score:3, Informative)
Both Galaxy and Gigabyte are currently showing fanless GeForce 7600GT cards at CeBIT (and are planning fanless 7900GT and GTX models).
Fanless graphics cards are becoming more and more common on the retail market, while a year ago they virtually didn't exist...
Say what??? (Score:2)
Funny, my THREE PROCESSOR 12MB Creative 3D Blaster Voodoo2 used purely passive cooling - no fans, not even a heatsink. When did that come out... 1995?? Same with my Voodoo3 2000 PCI. ATi Rage/Rage Pro/Rage IIC (digging them out of my old hardware box here.) Same thing. Those are pretty old as well. Matrox G400 - no fan or heatsink, either.
Re:Say what??? (Score:2)
You need to learn how to troll properly. The Voodoo 2 chipset, consisting of one PixelFX2 and two TexelFX2 chips, was designed as a multi-chip solution for sever
Re:Say what??? (Score:2)
Back then, even though they had fewer transistors, these cards were larger, less efficient, and produced LOADS of heat (I could fry an egg on a Voodoo2 faster than on the equivalent AMD processor back then.) The main differences between then and now are heat, power consumption, the amount of processing power packed into one core, and significantly smaller die sizes.
Go crawl back under a rock, idiot.
Re:Too many video cards (Score:2)
From the video card manufacturers' point of view, if they can sell people cards at different prices, then they can reach all the different reservation prices. One guy wants top-of-the-line, another wants midrange, another wants cheap. It's the way the free market works.
I would take issue more with the naming conventions. These days they're all just strings of letters and numbers, and they just get longer and more complicated.
Re:Too many video cards (Score:2, Insightful)
Re:Too many video cards (Score:2, Insightful)
Re:Too many video cards (Score:1)
Any company that fills this gap will be the ONLY company serving what is becoming a rather large niche. This means that they will get some sort of a bump in sales.
Companies don't need to target majorities in order to make money.
Re:Too many video cards (Score:4, Funny)
Re:Too many video cards (Score:1)
Can you say "soft launch"? (Score:5, Insightful)
It looks like ATI wanted to steal nVidia's thunder by announcing their latest product the same day. The small issue of not actually being able to manufacture their product yet doesn't seem to be very important to them.
Re:Can you say "soft launch"? (Score:2)
That's because ATI didn't foresee the launch of the 7600GT this early, and had to start the PR machine for the counter-offensive (== announce the X1800GTO) much earlier than they'd have liked.
Re:Can you say "soft launch"? (Score:2)
Re:Can you say "soft launch"? (Score:1)
Linux drivers? (Score:4, Insightful)
Finally proof!! (Score:4, Interesting)
I looked at this and I thought, "so what, how many fps do kids need in their games anyways?"
Then the exact next thought was: "Bah the drivers are still fubar in linux so why should I care."
3rd: "How many
So officially, pass me a hat. I quit.
Ahh, games, I do miss them so (the best FPS will always be Starsiege: Tribes), and eye candy; nah, it'll probably slow down my compile times.
Re:Finally proof!! (Score:2)
Same as always, but as the cards get beefier the games tear through more and more graphical resources, and then you can activate HDR, full-scene anti-aliasing (FSAA), anisotropic filtering, ... to the point that the latest top-of-the-line games manage to be unplayable if you enable every single graphical option.
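For the curious, two of those options come down to a handful of GL calls. A minimal C/GLUT sketch, assuming the driver exposes GL_EXT_texture_filter_anisotropic (all the cards discussed here do) and a GLUT build that can request a multisampled framebuffer; HDR needs floating-point render targets and a lot more code, so it's skipped here:

    /* Request a multisampled (FSAA) framebuffer and crank anisotropic
     * filtering on one texture.  Build: gcc fsaa.c -lglut -lGLU -lGL */
    #include <GL/glut.h>
    #include <string.h>

    #ifndef GL_MULTISAMPLE
    #define GL_MULTISAMPLE                    0x809D
    #endif
    #ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
    #define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
    #define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
    #endif

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* ... draw the textured scene here ... */
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        GLuint tex;

        glutInit(&argc, argv);
        /* FSAA is requested when the framebuffer is created. */
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH |
                            GLUT_MULTISAMPLE);
        glutCreateWindow("eye candy");
        glEnable(GL_MULTISAMPLE);

        /* Anisotropic filtering is set per texture: query the highest
         * degree the card supports and apply it to a bound texture. */
        if (strstr((const char *)glGetString(GL_EXTENSIONS),
                   "GL_EXT_texture_filter_anisotropic")) {
            GLfloat max_aniso = 1.0f;
            glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                            max_aniso);
        }

        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }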
Re:Finally proof!! (Score:1)
Note to self: next sourceforge project: OpenGL blurscope for kernel compilations.
Initiate project after: Current sourceforge project for mplayer script to play Memento DVD in correct chronological sequence.
Re:Finally proof!! (Score:2)
Nope, the right question is "how many polygons at 30FPS."
In some games more polygons = more detailed models. I don't give a shit.
In other games more polygons = more enemies on screen at the same time. And that's when fun really begins!
Right. (Score:1, Funny)
Presumably you are the grand old age of 26 or something. I see this all the time on /. and elsewhere: "You young 'uns today! Why, in my day we only had 32MB TNT2 cards when we played Quake3! I was lucky to get 60 fps, and I liked it".
Guys like me started out on punch cards, and worked with guys from the ENIAC-era...they used to do the "You young 'uns today...!" thing too, only they complained that we wouldn't know how to program with patch cords to save our lives.
Those guys are
Re:Right. (Score:2)
I've programmed a few computers that used patch cords and removable plug boards. They used discrete transistor logic, not vacuum tubes.
How many people around here ever used a Tektronix storage tube graphics terminal? They used vector graphics and a weird display tube that didn't need to be refreshed by the electron beam. The hardware is long gone but you can still see plentiful references to the Tektronix 4000 series terminals in the UNIX documentation.
We were so poor that we couldn'
Re:Finally proof!! (Score:2)
I thought my X800 was cool, but I must admit that this last year I have found other things in life that were more important to me than having the latest graphics card. I don't even play games much anymore, so my purchase of the X800 ended up being a waste of money; plus it has this "funny" bug where the 2 pixels in the lower right of the screen are duplicated across the first line on my HP LCD screen. (The problem first shows up when installing the ATI drivers.)
Second, it seems that all new cards are
Re:Finally proof!! (Score:1)
As far as drivers are concerned, Linux does have a few mainstream games to play, but this appears to be a budget gamer's card, which, at the moment, pretty much relegates it to the Windows realm.
As a sidenote, the best FPS was not Starsiege: Tribes, it was just plain old Starsiege. It still pisses me o
Re:Finally proof!! (Score:1)
Re: (Score:2)
Re:Finally proof!! (Score:2)
(yes I do play ET, but I wish I could get Tribes or Tribes 2 to work with Linux)
Speed Check (Score:4, Funny)
With all these new mid-range cards out.. (Score:4, Insightful)
Re:With all these new mid-range cards out.. (Score:2)
The latest generation of integrated video is much better though, and I can see the latest offerings from ATI, Nvidia and Intel being sufficient for most non-gamers, as long as they have at least 32MB of independent memory. I know ATI's chipset s
Re:With all these new mid-range cards out.. (Score:2)
Re:With all these new mid-range cards out.. (Score:2)
Given two identical machines, except one with shared memory and one with independent video memory, the first will be perc
Re:With all these new mid-range cards out.. (Score:5, Insightful)
The Mobility Radeon X1600 in their mid-sized MacBook Pro is ATI's second-best current-generation mobile GPU. The Mobility Radeon X1800 is ATI's current high-end part, and the only noticeable difference (for most users) between the X1600 and the X1800 is 3D gaming performance, which is not worth the extra cost for the vast majority of MacBook Pro buyers. The X1800 is more appropriate for Alienware gaming notebooks or giant Dell XPS desktop-replacement notebooks.
I think the (non-mobile) Radeon X1600 in the iMac is a heck of a nice GPU for a "consumer" PC. Any current-generation GPU (like the Radeon X1300 or GeForce 7300) would be a fine choice IMO, because the extra 3D gaming performance would be a waste for the vast majority of iMac buyers. Anyone who needs more gaming power than an X1600 shouldn't be buying an all-in-one computer with non-upgradable graphics. It would be nice, however, if Apple offered a headless upgradable desktop that wasn't a freakin' workstation.
Are you talking about stuff like Quartz Extreme and Core Image/Video? I think the Radeon X1600 gives plenty of GPU power for OS X. Heck, Intel's maligned GMA 900 integrated graphics seemed to have snappy OS X performance [slashdot.org] on the Intel Developer Macs. Core Image only requires a Radeon 9500 or GeForce FX 5200, which are both two generations older than the Radeon X1600.
A quick run down of how this works (Score:5, Funny)
1. Spend millions of dollars developing a top of the line graphics card.
2. Sell it for $500
3. Spend a few more million dollars figuring out how to cripple the top of the line graphics card.
4. Sell it for half the price.
5. Profit?
6. Consumers figure out how to re-enable all the features that were crippled, making their $250 graphics card perform almost equal to the $500 version.
Re:A quick run down of how this works (Score:1)
However, to meet demand at the low end the vendors do end up disabling features on their mainline parts to dumb them down. In most cases, though, there's no way to undo the damage.
Re:A quick run down of how this works (Score:4, Informative)
Re:A quick run down of how this works (Score:2)
Re:A quick run down of how this works (Score:2)
What usually happens is that in the initial run of that group of graphics cards, they take perfectly capable cards and downrate them.
Why? To get their product out on the market.
Smart people figure out which cards can be softmodded (BIOS Flash) or hard-modded (messing with the PCB) and they go buy that card and bump it up to full power.
Eventually nVidia/ATI
Re:A quick run down of how this works (Score:1)
Re:A quick run down of how this works (Score:2)
I'm not saying it's not done; I'm just saying it's a business decision, and it depends on the value of tester time at a point in a chi
Re:A quick run down of how this works (Score:2)
Re:A quick run down of how this works (Score:1)
>>The key thing you're missing here is that most consumers are not like you -- they have lives, which means they are not going to spend time tinkering with their graphics card.
Yes, but by that logic, they wouldn't be spending $500 on a graphics card either.
Oh Yay... (Score:2, Insightful)
Another graphics chip, in case the 20+ already out there aren't enough choices for you.
FTFA:
And then:
Gratuitous product launches (Score:5, Insightful)
So they keep coming up with new variations that are trivially different from the existing products - a clock-speed adjustment here, a few pipes disabled there - primarily to keep their name in the media. Even the "unannounced" chips are widely reported, usually with something like "quietly released" in the headline.
Linux drivers? (Score:2)
Is that still the case? If so, then I can't see why I would be interested in ATI.
Re:Linux drivers? (Score:1, Informative)
More significantly, though: Xgl relies upon an OpenGL extension that ATI is unlikely to support. This means you will never get the latest and greatest X11
Re:Linux drivers? (Score:1)
And the fact is, you should know
Re:Linux drivers? (Score:3, Interesting)
Re:Linux drivers? (Score:1)
Re:Linux drivers? (Score:2)
Don't look for PureVideo on Linux. Look for XvMC support.
XvMC is an interface for hardware-accelerated video decoding. Deinterlacing, yadda yadda. VIA currently supports MPEG-1, 2, and 4, H.264, and some other goodies. Nvidia only supports MPEG-1 and 2 right now. But expect more in the future.
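If you want to check what your own setup advertises, a quick C probe against libXvMC (assuming the X11 dev headers; build with gcc xvmc.c -lXvMC -lX11) looks something like this:

    /* Probe: does the running X server expose XvMC at all?
     * This only detects the extension; which codecs are accelerated
     * still depends on the driver (VIA vs. Nvidia, as noted above). */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/XvMClib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        int ev_base, err_base, major, minor;

        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        if (XvMCQueryExtension(dpy, &ev_base, &err_base) &&
            XvMCQueryVersion(dpy, &major, &minor) == Success) {
            printf("XvMC %d.%d present: decode can be offloaded.\n",
                   major, minor);
        } else {
            printf("No XvMC: video will be decoded on the CPU.\n");
        }
        XCloseDisplay(dpy);
        return 0;
    }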
Linux drivers?-Less Filling. Tastes Lousy. (Score:1, Insightful)
So does the Linux Nvidia driver support PureVideo? I think you'll find that the Linux drivers overall support fewer features than their comparable Windows versions.
Re:Linux drivers?-Less Filling. Tastes Lousy. (Score:2)
Rather, they support XvMC, which enables hardware video decoding. Currently, their support is not up to date with VIA's, but they are working on it. MPEG-1 and 2 are supported in hardware. Expect MPEG-4 soon.
Nvidia's Linux drivers lag behind Windows, but only slightly.
ATI's Linux drivers will never be up to date with the Windows drivers. Hell, you're lucky if you're able to use your ATI graphics card before it becomes outdated (X1x00 series, ahem.)
Re:Linux drivers? (Score:2)
Re:Linux drivers? (Score:1)
Which is why I download the driver sets marketed toward dial-up users. No control center, only the drivers, and my computer takes less time to boot up than otherwise.
Re:Linux drivers? (Score:2)
Oh, and they haven't seen more than a 1% performance improvement (per release) on their last 5 driver releases.
Mainly, it seems like ATI's Linux drivers are "improving" in that you can now reasonably get them installed on most configurations.
In terms of normal driver issues (unrelated to difficulty of install, or compatibility with kernel versions) their drivers are absolutely terrible. I find their lack of support for the X1x00 series disturbing.
Why? (Score:4, Interesting)
The problem? It was running at about 5 FPS.
Now I'd like to get a card that would enable this kind of gameplay at a reasonable speed. Crowded cities, armies of troopers, hordes of demons. Power in numbers, not detail. A completely new gameplay style. Screw a high degree of realism, allow me to perform a multi-kill of 40 with one shot.
Re:Why? (Score:2)
You don't need a new video card. You need a new life.
You're just upset that you didn't think of it first, aren't you?
Re:Why? (Score:3, Informative)
Maybe you can. What was your card? (Score:1)
Crippled cards (Score:1)
ever tried playing FEAR or doom3 on 9800P (Score:1)
Re:ever tried playing FEAR or doom3 on 9800P (Score:1)
Re:ever tried playing FEAR or doom3 on 9800P (Score:2, Informative)
nVidia keeps the crown this year too (Score:2, Informative)
The 7900GT has 24 pixel pipelines and a 65nm process, and is cheaper. 'Nuff said.
Re:nVidia keeps the crown this year too (Score:1)
Re:nVidia keeps the crown this year too (Score:1)
The 7900GT is a 90nm process part and the ATI is 130nm.
LOL still u are wrong (Score:2, Interesting)
New segment in 250US$ range (Score:2, Insightful)
More Video Card News From CeBIT (Score:2)
These launches are not totally about PR (Score:5, Insightful)
One reason for this is that most midrange buyers are enthusiasts, and judging by the number of comments for a product on Newegg, one can see that as soon as a better value is offered by a new chip, sales quickly shift towards it. The Nvidia 6800 GS was selling like hotcakes for just the tiny stopgap period it was out, put out just to best the ATI X800 GTO until the 7600 GT showed up.
I'm shopping for a card for a friend now, and have noticed that the midrange is good, but for high-resolution play at 1600x1200 or 1920x1200 it's barely cutting it now, so it becomes important to get the most bang for your buck, especially if you have an LCD with a high native res and want to maintain quality. The new 7600 GT is about 15% faster than the 6800 GS, even with a 128-bit memory bus, and definitely hits a sweet spot at $190. It should run most popular titles comfortably at 1920x1200 and has next-generation Shader Model 3.0, unlike ATI's offerings below $200.
Unfortunately for ATI, they haven't offered the best midrange value since their 9xxx line. ATI took Nvidia's crown a while back, but Nvidia has had it back for some time now.
Re:These launches are not totally about PR (Score:1)
Re:These launches are not totally about PR (Score:1)
I just upgraded to a GeForce 6600 GT from a Radeon 9500 Pro and I've had both of these hooked up to a Dell 2405FPW. While I am running my 2D space in 1920x1200, I don't tend to run any o
Video Card for Photo Editing (Score:1)
Re:Video Card for Photo Editing (Score:1, Informative)
Re:Video Card for Photo Editing (Score:3, Informative)
http://www.matrox.com/mga/workstation/cre_pro/pro
It has excellent 3D (in terms of quality) too.
They are still alive in this FPS comparison hell thanks to their focus on features like that.
Re:Video Card for Photo Editing (Score:1)
Annoying and confusing names (Score:2)
Recently I upgraded my card. If it wasn't for Tom's Video Card Charts and some more reviews to round that out, it would have been impossible to tell which cards were better than which - let alone which was the best value.
I really think the numbering and naming schemes do the companies a disservice.
Re:Hhehehe (Score:1)