Budget Graphics Cards Compared 220

Posted by CmdrTaco
from the bang-for-the-buck dept.
EconolineCrush writes "Tired of reading reviews of high-end graphics cards that cost several hundred dollars or more? The Tech Report has a round-up of three budget cards that cost $80 or less. ATI's Radeon X1300 Pro, NVIDIA's GeForce 7300 GS, and S3's Chrome S27 are compared in an array of gaming, video playback, power consumption, and noise level tests against not only each other, but also a typical integrated graphics solution. As one might expect, the budget cards offer significantly better 3D performance than integrated solutions. What's even more impressive is the fact that even with newer games, the sub-$80 cards still have enough punch to deliver respectable performance."
This discussion has been archived. No new comments can be posted.

  • by Whiney Mac Fanboy (963289) * <whineymacfanboy@gmail.com> on Monday May 15, 2006 @08:40AM (#15333929) Homepage Journal
    OK, the article tells us about 3dmark, quake 4 specs, video playback, etc etc etc.

    But they don't tell us which one (if any) has a vendor-supported OSS-compatible driver.

    Since XGL, etc (and I'm sure I'm not alone here), I've been on the lookout for a cheap & good 3d card, that doesn't give me 'kernel tainted' messages when I insert the driver.

    Does anyone know if any of these have good open support (I'm going to presume patchy [at best] for ATI, closed fast drivers from nvidia & good drivers [sourceforge.net] [but crappy hardware] for the s27)
    • by Grant29 (701796) * on Monday May 15, 2006 @08:52AM (#15334045) Homepage
      That wasn't necessarily the point of the article, though. They did a good job comparing the budget video cards for performance. Is there a linux tech site that reviews hardware under different flavors of linux? That would be a useful site. Especially if they delved into driver compatibility issues on different distributions.
      • by Whiney Mac Fanboy (963289) * <whineymacfanboy@gmail.com> on Monday May 15, 2006 @08:57AM (#15334076) Homepage Journal
        They did a good job comparing the budget video cards for performance.

        Nitpicking I know, but they actually did a good job comparing the budget video cards+software driver for performance.

        The review (while great for gamers) is pretty useless to Apple + Linux fans out there... and as this is a mixed site, I thought I'd ask.

        Is there a linux tech site that reviews hardware under different flavors of linux? That would be a useful site. Especially if they delved into driver compatibility issues on different distributions.

        That would be pretty cool - that's what I was hoping someone would reply to my comment with a link to! :-)
        • The review (while great for gamers) is pretty useless to Apple + Linux fans out there... and as this is a mixed site, I thought I'd ask.

          I would argue that an article that is useful to the vast majority of users has certainly achieved its goal.

          • I don't think it's that useful to most users either though.
            These are all PCI-E cards. Most people who have upgraded from AGP to PCI-E aren't going to be getting a budget card.
            Maybe if someone is rebuilding their PC and switching to a new motherboard with PCI-E, they might pick up one of these as an interim card because they spent everything on the new board and a dual core Athlon 64.
            That's really the only market I could see for these. "I spent everything on my motherboard and CPU and can't afford a fa
            • Most new OEM computers have integrated crap for video, so a budget card would let their owners game.

              Now, how likely are they to come across this review? Well, that's kinda doubtful.
        • and for the few of us out there that care, that just wasn't their audience. If a site is comparing mileage in a luxury SUV, they're not going to throw in an economy 2 door from the Czech Republic just because you don't care about the mileage in those vehicles.

          Their target audience was obviously windows users.

      • by strider44 (650833) on Monday May 15, 2006 @09:03AM (#15334130)
        Is there a linux tech site that reviews hardware under different flavors of linux? That would be a useful site. Especially if they dived into driver compatibility issues on different distributions.

        Try http://www.phoronix.com/ [phoronix.com]
    • by strider44 (650833) on Monday May 15, 2006 @09:07AM (#15334150)
      Does anyone know if any of these have good open support (I'm going to presume patchy [at best] for ATI, closed fast drivers from nvidia & good drivers [but crappy hardware] for the s27)

      You presume right. Nowadays I don't buy anything but nVidia graphics cards - I like my Doom 3 and co. and I can never be bothered rebooting to Windows. Hell I got Serious Sam 2 with my 7600GT and I can't even be bothered installing it and playing it.
      • Nowadays I don't buy anything but nVidia graphics cards

        Well - if performance is all you care about, you're making the right decision.

        However, some people are worried about including closed software in their kernel - they don't want linux to turn into the windows driver bugfest for starters....
        • However, some people are worried about including closed software in their kernel - they don't want linux to turn into the windows driver bugfest for starters....

          While you can verify and possibly enhance an open source video driver, being oss is no guarantee of quality, and neither is being closed source a guarantee of a lack of quality. Incidentally, nvidia has done pretty well in this regard, whereas ati consistently makes a mess of it.
          • by Whiney Mac Fanboy (963289) * <whineymacfanboy@gmail.com> on Monday May 15, 2006 @10:31AM (#15334818) Homepage Journal
            being oss is no guarantee for quality, neither is being closed source a guarantee for lack of quality.

            I'm afraid that it's a guarantee of a lack of support. (Running a tainted kernel guarantees you won't receive support from the core kernel group if you're having troubles.)

            Furthermore, while being oss is no guarantee of quality, inclusion in the mainline kernel tree is (to some extent anyway).
            • I'm afraid that it's a guarantee for lack of support. (Running a tainted kernel guarantees you won't recieve support from the core kernel group if you're having troubles.)

              So, you try to reproduce the problem without the tainting driver. If it no longer occurs, you report to nvidia; if it still does, you report to the linux developers with a now-untainted and supported situation.

              Yes, it is a bit more troublesome, but by far not as bad as you are suggesting.

              Furthermore, while being oss is no guarantee of qu
              • So, you try to reproduce the problem without the tainting driver and if it no longer occurs, you report to nvidia, if it still does you report to the linux developers with a now untainted and supported situation.

                And if it doesn't? Seems to be an awful lot of weird behaviour that only happens to people running nvidia's DRI. Nvidia's not going to help you with other hardware, kernel folk aren't going to help you if you're running a tainted kernel.

                Sorry, but all that it does is ensure that the driver will be m
                • by SillyNickName4me (760022) <dotslash@bartsplace.net> on Monday May 15, 2006 @01:55PM (#15336577) Homepage
                  And if it doesn't? Seems to be an awful lot of weird behaviour that only happens to people running nvidia's DRI. Nvidia's not going to help you with other hardware, kernel folk aren't going to help you if you're running a tainted kernel.

                  First of all, I have used nvidia drivers on different platforms, including Windows, Linux (various distributions including FC4, Debian (testing from about 6 months ago), Gentoo) and FreeBSD. On none of those did I find that the nvidia drivers were generally the cause of problems. In cases where they were, and where it was reported to nvidia, a fix for the problem usually followed.

                  If you can show that their code causes the problem then they will at least try to help. I can say this from repeated experience. Yes, it would be nicer if all the source was available. Yes, you do run a risk of ending up with unsupported hardware over time when relying on a closed source driver, but knowing those risks, I find it highly preferable to not having the functionality it provides, esp. since, as long as you don't buy the latest and greatest cards, I can buy a new one for less than I get paid for an hour of work.

                  So, what you indeed lack in support from the kernel developers, you can at least partly get back. But hey, pick whatever works best for you.

                  Supposed bugs or their potential existence are not an argument for using one piece of software or another; all software has bugs, but few of them will likely affect you. Quality of code, seriousness of potential bugs, how those get fixed, and support in general are usually good arguments, however. If you feel more comfortable only using open source software then be my guest, but stop spreading FUD while trying to convince the world that your view is the only valid one.

                  And tested in more hardware configurations than nvidia ever will...

                  Yes. nothing new there. Guess what, in most untested cases it still works.

                  And it's not so easy to get code put into the kernel as you think - the code has to be portable, 64/32 bit safe, smp & kernel preemption friendly, etc etc. Many of these things will shake out bugs you wouldn't have known existed.

                  I have been involved in OS development for over 15 years, I am pretty aware of all that. I have code in 2 operating systems that are in current use, and some in one that is no longer being used much. I have worked with Microsoft developers, IBM developers, Linux developers, FreeBSD developers and many others. Yes, writing software can be quite complex and difficult. Sometimes a large group does a better job at it, sometimes a tiny group of very dedicated people do a better job at it, there is no telling in advance.

                  The one clear advantage that oss has is that you can step in yourself, and while I often have the capability to do so, I seldom actually do, because in virtually all cases asking the current developers and providing them with GOOD INFORMATION for reproducing and locating the actual bug is a lot more effective, regardless of whether you're dealing with open source or closed source software.

    • First, I'm damn impressed. An S3 card beating an nVidia? Actually... an S3 with decent performance? I'm not much of a gamer, and considering I would pretty much BUY only budget cards... this is looking really good, especially when put together with the FREE OSS LINUX drivers.

      I know what I'll be putting in my AMD64, whenever I build it. Nice.
      • I know some folks are surprised that S3 is still around, if you read many forums you'll find that people seem to think that there are only two video chip manufacturers still in business anymore. I knew S3 were still around but from my point of view, they are the SiS of video chipset manufacturers - best to be avoided unless you want headaches. The reason I had that impression was I haven't seen any positive reviews of S3 based video cards in quite a few years, and no gaming or workstation cards proclaiming
    • No one really cares in the real world if it has a supported OSS driver. Linux is not a gaming system. No, I am not a linux hater/Windows fanboy, but its just like Mac, not a gaming OS.
      • No one really cares in the real world if it has a supported OSS driver. Linux is not a gaming system. No, I am not a linux hater/Windows fanboy, but its just like Mac, not a gaming OS.

        Hmmmmmn, maybe you missed the part of my post where I said: Since XGL, etc ... I've been on the lookout for a cheap & good 3d card...

        There are at least a few out there who care - hopefully our numbers will start to get big enough that support for nvidia & ati proprietary kernel extensions will dry up.
  • Feh (Score:5, Funny)

    by The_Isle_of_Mark (713212) on Monday May 15, 2006 @08:43AM (#15333953)
    If I can't do 27,000,000,000,000 triangles a second for under $80 I'll not buy one!
  • Impressive (Score:5, Insightful)

    by suv4x4 (956391) on Monday May 15, 2006 @08:46AM (#15333990)
    What's even more impressive is the fact that even with newer games, the sub-$80 cards still have enough punch to deliver respectable performance.

    No, what's impressive is that most gamers have been successfully brainwashed that they need a $500 video card to play a modern game, while the low range has been excellent for the past 3-4 years.

    At the same time, people are shocked about PS3 being $600. I wonder what the hell happened to common sense, where we lost it and will we find it again any time soon.
    • Re:Impressive (Score:5, Informative)

      by Lumpy (12016) on Monday May 15, 2006 @08:55AM (#15334066) Homepage
      Exactly! I use the horribly outdated and underpowered Geforce FX6600 card and I can play ANY game very nicely. Even the Quake 4 watermark is very VERY playable at 1024X768 at mid level quality settings.

      And the point is playability. Being able to play at 1280x1024 at full quality doesn't make it feel any better when the 13-year-old kid waxes you hard every time with his 640x480 and lowest quality settings.

      if the game is smooth and fun then that is what matters.
      • Re:Impressive (Score:2, Interesting)

        by ergo98 (9391)
        Exactly! I use the horribly outdated and underpowered Geforce FX6600 card and I can play ANY game very nicely. Even the Quake 4 watermark is very VERY playable at 1024X768 at mid level quality settings.

        I suppose "very nice" and "playable" is subjective.

        The idea that you can play "ANY" game at a quality that other people would find acceptable is laughable. I have a 6600GT, and Battlefield 2 starts stuttering seriously (hugely ruining immersion) beyond 1024x768 / everything at low / no AA. I've seen how unbel
    • Almost (Score:3, Informative)

      by everphilski (877346)
      You are right, a $500 card is way too much. But an $80 card is gimped. If you are going to make the upgrade, do it right and get a mid-range card that has withstood the test of time, something like an nVidia 6600 or 6800 GS. It'll set you back a few bucks more than these cards - $30-50 more - but you will get a whole lot more value.
      • It'll set you back a few bucks more than these cards - 30-50$ more - but you will get a whole lot more value.

        Could be, but to me, value is something I can make use of, and if I don't use it, it's not value :D
        • Then settle for the onboard video card you have :) seriously, you are talking about a huge increase in performance for minimal price. Take any benchmark and divide by price, and the 6600 or 6800 will come out on top.
      • OK, what does the 6600 have that the 7300 GS doesn't? They both support Pixel Shader 3.0.
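The "take any benchmark and divide by price" heuristic from the subthread above can be sketched in a few lines. The card names and frame rates below are made-up placeholders for illustration, not figures from the article:

```python
# Hypothetical benchmark numbers for illustration only -- not from the review.
cards = {
    "GeForce 7300 GS": {"price": 60, "fps": 25},
    "Radeon X1300 Pro": {"price": 80, "fps": 32},
    "GeForce 6600":     {"price": 110, "fps": 55},
}

def fps_per_dollar(card):
    # Higher is better: frames per second bought per dollar spent.
    return card["fps"] / card["price"]

ranked = sorted(cards, key=lambda name: fps_per_dollar(cards[name]), reverse=True)
for name in ranked:
    print(f"{name}: {fps_per_dollar(cards[name]):.3f} fps/$")
```

With these placeholder numbers the mid-range card does come out on top, which is the commenter's point: the metric rewards the knee of the price/performance curve rather than the cheapest card.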

    • Re:Impressive (Score:3, Insightful)

      by fatduck (961824) *
      As is explained every time this topic comes up, "most gamers" don't think they need $500 video cards. Cards are priced based on supply/demand, and eventually there will be a more advanced card for $500 and the old $500 card will be in a price range that we find reasonable. If they only made $80 video cards because they're "good enough" then what is the incentive to spend on R&D for better technology?
      • If they only made $80 video cards because they're "good enough" then what is the incentive to spend on R&D for better technology?

        To make a better video card than the other companies making $80 video cards.

        What you're saying doesn't make sense -- there are lots of products that have basically fixed price-points, but yet still have a lot of competition and R&D going on. It becomes a little different kind of research, perhaps; instead of sending the engineers out with a blank check and telling them to
    • Your comments comes across as if this is something new. High powered video cards have always been expensive. Back in the "good old days" I remember paying over $300 for a Matrox 2D solution let alone the $299 Voodoo 2 days where a lot of gamers bought 2 cards.

      Comparing console pricing to PC video card pricing doesn't make much sense. The difference you're ignoring is that most PC enthusiasts expect $300-and-higher video cards, while console buyers do not expect $600 consoles.
    • No, what's impressive is that most gamers have been successfully brainwashed that they need a $500 video card to play a modern game, while the low range has been excellent for the past 3-4 years.

      Case in point: my younger sibling has been nagging me for the last week for money to get a new graphics card. His current one runs fine. I tried explaining to him about buses and such in an effort to get across that the reason his games were running slow had little to do with the graphics card, and more to do with shoddy programming, a slow bus, etc, etc. He listened patiently and then proceeded to nag me more for ~$250 for a minor upgrade to the machine's current graphics card.

      Meanwhile, when there are few agents on screen, every game runs smoothly and perfectly. On one game, Dawn of War, you can pause the action and rotate the camera around. When you do this, the pan is smooth regardless of the number of agents. This applies at reasonably high settings as well.

      Someday, maybe, people will realise that how good a game looks has less to do with polygon counts and texture rates than with artistic design. Super Mario World looks better than 95% of the games on the shelves today. Its image will stay in your mind long after the sterile landscapes of the current console high-res wars have faded into oblivion.
    • No, what's impressive is that most gamers have been successfully brainwashed that they need a $500 video card to play a modern game, while the low range has been excellent for the past 3-4 years.

      It's similar to the markets for wine and high-end audio. Enthusiasts will spend any amount of money for increments in "quality" that are minuscule at best (and often completely imaginary), just to distinguish themselves from the unwashed masses...

    • by King_TJ (85913) on Monday May 15, 2006 @11:12AM (#15335170) Journal
      With a CRT monitor, all of the supported resolutions display equally well. Unfortunately, the fancy new 19" and 21" LCD monitors gamers have upgraded to only look good at a single, native resolution - which is usually much higher than people ran their CRTs at.

      This translates to needing a beefier graphics card to get the frame-rates you expect, vs. the "old way" of just playing all your 3D games at a lower resolution like 800x600.

      • Unfortunately, as gamers upgraded to fancy, new 19" and 21" LCD monitors, they only look good at a single, native resolution...

        This translates to needing a beefier graphics card to get the frame-rates you expect, vs. the "old way" of just playing all your 3D games at a lower resolution like 800x600.

        Wouldn't setting your 3D games at "half" (actually one-fourth) the LCD's native resolution (and stretching to full screen) fix that nasty scaling/interpolation problem? I can't find a good answer from my go

        • It's weird, but I find some resolutions look much better than others on my 1280x1024 monitor. 1280x1024 looks good, of course; 800x600 also looks good. However, 1024x768 and 960x600 look abominably awful. Strangely, 1152x864 looks very nice indeed. Is it just me?
          • A lot depends on your monitor's ability to scale the output from the video card. Some monitors have algorithms that are good at this, and some do very poorly. One monitor I have (20.1" Sony) seems to do very well at non-native resolutions, while another (19" Gateway) looks like crap. I have also used a 19" Viewsonic which seemed to do a decent job, and a 17" NEC which was kind of in the middle. I have also found that overall, most laptops look terrible at non-native resolutions. Some monitors (like the
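The scaling question in the subthread above comes down to whether the game resolution maps onto the panel's native pixel grid by a uniform, integer factor. A small sketch, using an assumed helper and the resolutions mentioned in the thread:

```python
# Check how a game resolution maps onto an LCD's native grid.
# Illustrative helper, not code from any driver or utility.
def scaling_factor(native, game):
    nx, ny = native
    gx, gy = game
    sx, sy = nx / gx, ny / gy
    uniform = abs(sx - sy) < 1e-9          # same stretch on both axes?
    integer = uniform and sx.is_integer()  # pixel-perfect 1:N mapping?
    return sx, sy, uniform, integer

native = (1280, 1024)  # a common 5:4 LCD panel
for game in [(640, 512), (800, 600), (1024, 768), (1152, 864)]:
    sx, sy, uniform, integer = scaling_factor(native, game)
    label = "integer" if integer else ("uniform" if uniform else "non-uniform")
    print(f"{game}: x{sx:.2f} / x{sy:.2f} ({label})")
```

On a 5:4 panel like 1280x1024, only 640x512 scales by a clean integer factor; the common 4:3 game resolutions (800x600, 1024x768, 1152x864) all need non-uniform stretching, so how they look depends entirely on the monitor's scaler, which is consistent with the reply above.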
    • I think that happened around the same time people started to pay $80-100/mo for their cable/satellite TV service... yet squirm at a $30/mo DSL bill.
    • Re:Impressive (Score:3, Informative)

      by default luser (529332)
      Exactly. There is a pricepoint for every budget, and thankfully ATI is starting to improve their pricing so they can actually offer a competitive product. S3 has also helped kick the chair out from under the bottom-end cards, bringing prices even lower.

      Here are some current-generation cards worth considering for excellent price to performance ratios in their price class:

      x1300, 7300 GS ($60)

      s27, x1300 Pro ($80)

      x1600 Pro, 7600 GS ($110)

      7600 GT ($160)

      x1800 XT 256MB, 7900 GT 256MB ($250-300)

      Unfortunately, th
  • by Vo0k (760020) on Monday May 15, 2006 @08:49AM (#15334018) Journal
    Will any of them give me more than 10 FPS during "Breaking Siege of Kvatch", "Battle for Bruma" and the final fight in Imperial City in Oblivion?
    That should be the current benchmark method. All the budget cards I know of simply can't do it.
    • Buy a 360!
      I'm kidding! I'm addicted to Oblivion on my PC. I have an SLI setup (with 2 NX6600GTs), so it's running pretty well (although not as well as I would like). I rented it on the 360 for kicks, and besides not crashing when using a magic sword, and playing on a 55" TV, I prefer the PC version.

      But the article is pretty cool and they are right. I've run on 1 card for a while (because of a busted fan) and Oblivion is the first to give me trouble (Oblivion was what convinced me to go out and buy a new fan).
    • I don't know about Battle of Bruma, but the Siege of Kvatch mission went fine with my old Radeon 9800 Pro, on an AMD 3200+. Unfortunately, TES: Oblivion crashes a lot with my card, and usually takes Windows with it. I think that may be because of insufficient cooling, though.
    • by Morrigu (29432)
      Here's what I found w/ my setup. Base system is a P4 3.0E Prescott + Soyo SY-P4VTE mobo (VIA PT-800 chipset) + 2x Kingmax 512MB PC3200.

      * Rosewill Radeon 9600Pro/256MB AGP = hah. Whatever. Oblivion takes off its hat and laughs, then asks if I'd like to upgrade to something that gives me more than 4FPS @ 800x600.

      * Sapphire Radeon X800GTO/256MB AGP = pretty decent performance, Oblivion suggests "High" graphics settings @ 1024x768. Can bump up the resolution to 1280x1024, doesn't impact the
    • Oblivion is why I upgraded my graphics card. It was practically impossible to play with my GF FX5700. Now I have a 6600GT AGP card. Still not the best of the best but Oblivion runs very smooth 800x600 medium detail.

      Not exactly budget either... mid-range is the word i guess.

      • Even in the "heavy" scenes like Kvatch?
        Well, the only way to survive and do something with sense in Bruma on 6600LE was to keep looking away from the allies :)
    • My X800XL has enough problems just rendering two people.
    • Backing up what others have said I have a Geforce 6600GT for 90 quid. (Not sure how its price translates to dollars.)

      It runs Oblivion with just about everything but AA on at a reasonable resolution (800x600). It gets some fairly bad slowdown with a crapload of grass, but typically runs nice and smooth. I also have a gig and a half of RAM (which is a fair amount) and a 2.8 P4 (not such a fair amount).
    • by Carewolf (581105)
      My ancient Geforce4 ti4200 can play all of those at more than 10fps (using the oldblivion patch, because bethesda are the whores of the graphics card industry).

      Just don't run with stupid levels of detail.
  • AGP versions? (Score:5, Insightful)

    by gEvil (beta) (945888) on Monday May 15, 2006 @08:50AM (#15334021)
    I find it humorous that there is such a push for budget cards using a fairly new interface. Where are the AGP versions of these cards? You know, for people who really are on a budget and can't afford to buy a new motherboard to use with a new budget card...
    • I thought this way too, until I found two key things:

      1) PCI express cards cost about $50 less than the AGP version of the same card
      2) A PCI express motherboard can be found for under $50

      Check out NewEgg's video card subcategory [newegg.com] and compare the AGP and the PCI x16 sections.
      • Is this because the PCI-E version is cheaper to produce, or is it because pushing newer technologies helps drive sales in other sectors of the tech industry?
      • Re:AGP versions? (Score:4, Insightful)

        by gEvil (beta) (945888) on Monday May 15, 2006 @09:30AM (#15334319)
        Not to mention, please find me a Socket A mobo that has a PCI-E slot. What? There are only a handful (none at newegg). And none have the same features that my current mobo has that I use. Suddenly, this "budget" videocard is costing me a new motherboard and processor, plus a handful of PCI cards. No thanks.
        • Suddenly, this "budget" videocard is costing me a new motherboard and processor, plus a handful of PCI cards.
          Sure, if it's an upgrade. People do buy new computers from time to time, as well.
      • The 50 dollar motherboard doesn't include the new CPU I have to buy (and possibly RAM). There's a real market for AGP cards; I think the big players will eventually translate these cards to the AGP interface when sales slump. The X800 and other newish chipsets have AGP versions, so there's no real reason these guys can't.

        You're right, there is a price premium but I wonder if that has to do more with the scale of production than the interface. If they're punching out 100,000 of these cards but only 10,000 AGP cards,
      • It's ridiculous how they're trying to force PCI express lately. When I did my last system upgrade (fall of 2004, around 1.5 years ago), you couldn't even get it yet. And all of a sudden everyone wants to pretend that AGP is dead, apparently the quickest death of a major technology ever. Come on, even PCI battled it out with VESA Local Bus for quite a while during the 486 days, and AGP didn't kill the PCI video card market overnight. I guess with the ever dropping system prices, they need us to "refresh" our
    • Re:AGP versions? (Score:2, Insightful)

      by deander2 (26173) *
      the nvidia 6600 AGP is the best (and likely last) option you're going to have in AGP. AGP is dead for new product development.
    • Sorry, but a machine with AGP isn't what I'd consider "budget". I don't have ANY PC's with AGP in them. I need a regular ol' PCI card. That's "budget".
    • Where are the AGP versions of these cards? You know, for people who really are on a budget and can't afford to buy a new motherboard to use with a new budget card...

      People "really" on a budget don't look at the "low-end" of the high-end 3d graphics card market. They first of all don't buy a new graphics card unless their old one dies; and second, when they do actually need a new one, they buy a GeForce MX4000 or Radeon 7000 series card for under $25. And they will feel absolutely thrilled at the HUGE p
    • Where are the AGP versions of these cards?

      At this point, you can buy previously top-of-the-line AGP cards for budget prices already. (I know, not current cards, but already plenty fine.) Excellent deals on ebay, computer forums (e.g., www.ocforums.com), etc. An ATi 9800 pro fetches no more than $65 currently, and 6600GT cards are generally under $160. I get excellent Half-Life2 performance with my old GF4 Ti4200 card, and those can be found for $30 or less online!

      The budget cards for AGP are the top-o

    • Amen. I went to Newegg looking for a DirectX9-compatible card that used AGP or PCI and ya know what? There were only a handful. Have that many people really already moved to PCI-E (which looks like it would necessitate not only a motherboard, but a processor upgrade for me)? I just wanted to step up to more modern graphics, but it seems like you just about have to lay out for a whole new system nowadays.
      • I've been looking at the Asus A8N-VM that has the built-in GeForce 6100 chipset. Yeah, it's integrated video, but it's supposedly about as good as the old GeForce 5900 cards.

        The big advantage is that it's a PCIe motherboard so you can drop in a more expensive graphics card down the road.

        Price for the A8N-VM with CPU and RAM? ~$260 at MWave.com. The motherboard is only $65.

        It's not the most expandable system (2 PCI, 1 PCIe 1x and 1 PCIe 16x) but it's not a bad looking board for a budget-level system
  • by Anonymous Coward
    But will ATI, nVidia, or S3 provide documentation so we can write drivers even for these much-less-than-flagship models? When they release a card (even a *much* less than cutting edge card) and the documentation to write a 3D driver for it (so I can expect to use it to its potential on whatever operating system), then I will be impressed and interested in the bargain.
    • You'd probably have more luck starting a campaign to get S3 to do it. Nvidia and ATI are pretty comfortable with their respective very large market shares, and their managers probably won't open things up for fear of their competitors seeing what they are doing (whether those fears are justified or not).

      S3, having such a small market share, might be much more eager to capture a niche market (non-windows) and not be as worried about the 'big boys' stealing their IP.

  • What are you selling and what is it going to cost me?
  • by ceeam (39911)
    I'm not much of a gamer but - well - my current Radeon 8500 loses it on most modern games (though I'm still happy with it in IL2 and GPL, which I mostly play). So I thought for a while about a new video card (and a bloody new MB apparently, since they decided to kill AGP, bastards). Nothing over $100-150 - just something with support for modern features and reasonable performance. Nice article for me....

    But hey!! Check out power consumption figures! They state 85 watts in idle and 145.6 watts under load for the
    • But hey!! Check out power consumption figures! They state 85 watts in idle and 145.6 watts under load for their "winner" card (Radeon X1300). W.T.F.?! That's like three times more than my 90nm AMD64 CPU, right?

      Yes. Doing a simple sanity check should reveal that the power consumption figures probably include the rest of the system, since this is much easier to measure than the power consumption of the graphics card alone.
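The sanity check suggested above can be made concrete: subtract a baseline measurement of the same system running on integrated graphics, then account for PSU losses to estimate the card's own draw. The baseline and efficiency numbers below are hypothetical; only the 145.6 W load figure comes from the discussion:

```python
# Rough card-power estimate from wall-socket measurements.
# Baseline and PSU efficiency are hypothetical placeholder values.
system_with_card_load = 145.6   # watts at the wall, under load (quoted above)
system_with_igp_load  = 110.0   # hypothetical: same system on integrated video
psu_efficiency        = 0.75    # typical mid-2000s power supply

extra_at_wall = system_with_card_load - system_with_igp_load
card_estimate = extra_at_wall * psu_efficiency  # DC-side draw of the card itself
print(f"~{card_estimate:.1f} W attributable to the card")
```

This is only a rough bound (the CPU and chipset also work harder when a faster card keeps them busy), but it shows why a whole-system reading of 145.6 W does not mean the card alone draws anywhere near that.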

    • You know what - go to hell. Call me back when you have something reasonable with passive cooling.

      if only amd started making gpu's.
  • A better option (Score:5, Interesting)

    by Doomstalk (629173) on Monday May 15, 2006 @09:17AM (#15334218)
    Budget video cards almost always suck. You're better off buying the best of the previous generation or, even better, buying refurbished or open-box hardware from sites like Newegg. I was browsing their open-box section the other day for a friend, and came across a Radeon x1600 Pro 256mb for $90. A little bit more money than their target, but for $10 more you get a card that's not intentionally crippled.
  • by Ihlosi (895663) on Monday May 15, 2006 @09:24AM (#15334269)
    ... just consider buying one of yesterday's "mainstream" cards. An X800 GTO will wipe the floor with any of the crippled budget cards while being in the same price range (assuming you can find the version with 128 MB GDDR1 memory; the slightly faster GDDR3 versions cost a bit more).
  • by pjrc (134994) <paul@pjrc.com> on Monday May 15, 2006 @09:29AM (#15334313) Homepage Journal
    Ok, I thought, "hmm, maybe this is the time to upgrade that crappy but silent card I bought some time ago?". I believe it was a nvidia 6600LE... which as I understand is the graphics equiv of the low-end 6200, but has its own dedicated 256 meg memory.

    Why would I buy that? Well, cost wasn't the concern. At the time, it was the best card on the market that was passively cooled. No fan = no extra noise!

    So I clicked the link to TFA, and jumped right to the end, and it turns out the quietest card is 44 dBA. No thanks! Not after the low noise power supply, an after-market super-quiet chipset heatsink/fan, and installing 120 mm low-rpm fans (20 dBA), and the quiet Seagate drive. Even worse, from TFA:

    Unfortunately, none of our budget cards are intelligent enough to lower fan speeds at idle, and none offer silent, passive cooling

    So does anyone know of better cards that ARE passively cooled and will work inside a case with scant airflow (due to the large but very low-speed fans)?

    • Nvidia GeForce 7600 GS cards are often passively cooled. They aren't exactly cheap, but they're only slightly more expensive than the cards reviewed here.
    • I bought this fanless 7600GS [newegg.com] a month ago, and it works great in my quiet pc. Combined with my passively cooled Rev E. Athlon64 3000+ it plays Q4 at 1600x1200 without dropping frames.
    • Anandtech has a review [anandtech.com] of two fairly high end cards, that are both passively cooled - a 7600 GS and a 7800 GT
    • I had an 256MB X800 (note lack of suffix) which was passively cooled (made by Gigabyte, doubt they make it anymore but old samples may still be around). Coped very well with all modern games (including Oblivion at 1024*768 with bloom lighting and the view distances maxed out and F.E.A.R. at a similar resolution), cost about £130 and I was very happy with it. Unfortunately (and this may or may not have had anything to do with the 20 sessions of Oblvion it was forced to undergo on the weekend before its
  • by CSG_SurferDude (96615) <wedaa@NosPAm.wedaa.com> on Monday May 15, 2006 @09:55AM (#15334525) Homepage Journal
    Why don't they ever compare graphics cards to slightly older cards? I have an ATI 9200 but haven't a clue whether these cards are worse, the same, or better than the card I already have in my box...
  • by LoverOfJoy (820058) on Monday May 15, 2006 @10:04AM (#15334589) Homepage
    Their conclusion? The best budget card is the most expensive one they looked at, the Radeon X1300 Pro.
  • This conclusion isn't the slightest bit surprising. Video RAM is under heavy demand: lots of bandwidth is eaten just refreshing the screen, and more goes to texture work, etc.

    With only main memory in an "integrated" system, the CPU loses significant memory access while the video is being refreshed and calculated.
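    To put a rough number on the scan-out cost alone (the display mode here - 1280x1024, 32-bit color, 85 Hz - is just an illustrative assumption, not a figure from the article):

```python
# Back-of-the-envelope scan-out bandwidth for a shared-memory IGP.
width, height = 1280, 1024
bytes_per_pixel = 4   # 32-bit color
refresh_hz = 85

# Bytes the display controller must read from RAM every second
# just to repaint the screen, before any 3D work happens at all.
scanout_bytes_per_sec = width * height * bytes_per_pixel * refresh_hz
print(scanout_bytes_per_sec)  # 445644800, i.e. ~445 MB/s
```

    That ~445 MB/s comes straight out of the same main-memory bandwidth the CPU is trying to use, which is a big part of why integrated graphics drags the whole system down.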

  • Not budget cards!! (Score:4, Insightful)

    by Carewolf (581105) on Monday May 15, 2006 @10:26AM (#15334770) Homepage
    What's the point in reviewing budget cards if they require investing in a top computer?

    All of the cards require PCI Express and draw enough power to demand motherboards and PSUs less than a year old!
    • When I build a computer, I choose parts for upgradability. If I were building one right now, I'd get a motherboard with a 939 socket and PCI express and a fast system drive (possibly a Raptor), and skimp on the parts that are easier to upgrade. The cheapest CPU that a 939 socket will take, RAM on one stick (I know the bandwidth is better on two, but one makes it easier to just add a second later.), and a video card in the $75 - $100 range.

      Now, I don't think I'll be building another computer until I graduate.
  • S3 Open? DVI? (Score:3, Interesting)

    by twistedcubic (577194) on Monday May 15, 2006 @12:35PM (#15335864)
    Does the S3 have open drivers for the DVI interface? I'm looking to ditch Matrox.
