NVIDIA Releases New Budget GPUs

Dennis Law writes "I was just checking out the latest GPU releases from NVIDIA. Non-gamers will be delighted to hear that NVIDIA has also released a budget edition of their new 7300 series, namely the 7300 LE. 'Targeted at the X1300 LE, this card will be priced lower than the GeForce 7300 GT at a price range of $49 to $69.' Now that's cheap enough for me to afford."
  • Sounds good. Anyone know what sort of performance we can expect from one of these? Preferably in the form of a comparison to mid-range cards of years past (I used to have a Ti4200, now running a 9800AIW). I realize it'll have support for newer technologies (DX9/10 support, for instance), but some sort of frame of reference could be handy.
    • I recently upgraded from a GeForce 4 128MB AGP to a GeForce 6200 128MB AGP for $50 USD. I was able to play Doom 3, Quake 4 and HL2 at 800x600, and UT2004 at 1024x768, all on high settings at playable framerates with the latest eye candy. I suspect the new budget cards might be similar in performance, or run these games one resolution step higher.
  • Cheap but not free (Score:3, Insightful)

    by bobintetley ( 643462 ) on Tuesday March 21, 2006 @02:34PM (#14965727)

    Now that's cheap enough for me to afford.

    It might be cheap enough for you, but it certainly isn't free enough for me.

    I use NetBSD and I doubt they'll be porting the proprietary drivers anytime soon.

    • NetBSD, maybe not, but then how much high-end GPU use do you really get there? There are drivers for many Linux flavors, and they tend to work quite well. (I have a GeForce 4 in my Linux box and rock the fps on tuxracer...)
    • It might be cheap enough for you, but it certainly isn't free enough for me.

      I haven't been keeping up. What card/chipset can you get the fastest "non-proprietary" performance out of these days?

      It used to be Matrox's line ... a long time ago. I saw a posting elsewhere on this site claiming the Radeon R200 chipset drivers were pretty sharp.

    • What if they did? What would/could you possibly do with such graphics power on NetBSD? Play FreeCiv at 30,000 fps?
  • by MikeFM ( 12491 ) on Tuesday March 21, 2006 @02:35PM (#14965737) Homepage Journal
    For anything other than vanilla workstations, getting a cheap video card usually costs more in the long run than a mid-range card because you have to replace it sooner. This card looks like it has a little bit of power to it. Will it still be able to run most 3D games and apps in three years, or will it, like most of these cheapies, have to be replaced yearly? With desktops and apps going 3D more and more, I think it's no longer an issue only for gamers.
    • With Qt4 out and KDE 4 in the works, this card may just hit the spot for a non-gaming desktop. Just enough 3D goodness to speed things up, but no overkill.
      • But will it still hit that spot in three years? A decent mid-range gaming card can now be bought for $100 and not be completely outdated for several years. Most of the budget cards I've ever tried barely did the job when you bought them, still cost around $50-$70, and certainly didn't handle anything very useful two or three years later. It seems the desktop is going to need this 3D power in the near future, so maybe it's better to double what you're spending and get something a bit better?

        Or - is this card a
        • Re:I disagree (Score:2, Insightful)

          by wernercd ( 837757 )
          Since the majority of people run Windows, I think the question is moot since NO card has DX10 atm... so if you (or anyone) are going to advocate a 3+ year card - nothing on the market has it atm. A $600 FX 7900 or an X1900 will be 'dated' in 6 months anyway.

          So why not buy a $50 card now and then buy the lowest DX10 card when it comes to market? That would be the smartest penny-pincher plan in my book, since I assume Aero runs better with at least some kind of DX10.

          That is assuming you go for Vista and aren't one of t
          • Or to quote the superior intellectual Bill Gates - "No one will ever need more than 640K of RAM."
          • Well, it may not be DX10, but Microsoft has posted a list of cards that are Vista-compatible, so one can assume that they will have rudimentary DX10 compatibility. As far as nVidia goes, anything FX 5200 and up is good.
        • Graphics cards are a fun portion of the market. It's one of those things where I can openly mock cheapskates without repercussions because 99% of them are uniformly stupid. It's kind of like the used car market with two less digits on the price.

          It doesn't matter what kind of card you buy, or how cheap it is. If you intend to use its 3D capabilities at all, it's going to cost you AT LEAST $50 a year. That means the crappy Radeon 9250 you bought for xmas will need to be replaced by next xmas with another
          • I have one of the upper-end GeForce 6800 (512MB) cards that I picked up for about $100 recently from NewEgg. I thought about a beefier card (I'm a little disappointed, as I was expecting to get a 7800 card in trade for some work, but it never came through), but really the 6800 handles every one of the major games I've tried, with all features turned on and cranked up all the way, without any problem. These are leading top-of-the-line games out right now and I'm seeing good frame rates and the games look great
      For anything other than vanilla workstations, getting a cheap video card usually costs more in the long run than a mid-range card because you have to replace it sooner.

      What's the difference between paying $50 per year for three years and paying $150 once every three years?

      • It's usually more an issue of a 50-100% price increase for the better card, not 200%. At least in my experience. So every three years you save around $50 plus the time and effort involved.
        • Oh, I dunno about that. The 6600GT, which I believe still qualifies as a midrange card, runs around $140-150, or damn near 3x the cost of this thing. Anyway, think of it in terms of inflation - if you spend $50 per year on a video card, versus $150 every three years, then inflation renders the second card cheaper than the first in real terms, and the third cheaper still. Which means that, in real terms, it's actually cheaper to spend the money over time than it is to spend it up front. Not that gamers t
      • Opening the box, reconfiguring drivers, etc. Depending on the performance curve, you can also end up with better performance for most of those 3 years, while saving that labor.

        (And, yeah, it may be fun to poke inside a machine...)

      • Ahh, but for all 3 years you get to use a card that is equivalent to the last (highest-performing) card of the yearly plan.
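
        A quick sketch of the arithmetic in this sub-thread, assuming a 3% annual inflation rate (an invented figure; real prices and rates will vary):

```python
# Compare $50/year for three years against $150 up front,
# discounted into today's dollars at an assumed 3% inflation rate.
inflation = 0.03
spread_out = sum(50 / (1 + inflation) ** year for year in range(3))  # years 0, 1, 2
up_front = 150.0
print(f"real cost of $50/year over 3 years: ${spread_out:.2f}")  # ~$145.67
print(f"real cost of $150 up front:         ${up_front:.2f}")
```

        The gap is only a few dollars, so the choice really hinges on the performance curve and the hassle of swapping cards, as the posts above argue.
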
    • For anything other than vanilla workstations, getting a cheap video card usually costs more in the long run than a mid-range card because you have to replace it sooner.

      I have a GeForce4 MX440 I bought 3 years ago. I've never had to replace it; the thought hadn't even crossed my mind. I've got dual-head TV out and OpenGL scaling for ZSNES, and it's fast enough to run ePSXe. So wtf are you talking about?
      • Sounds like a pretty vanilla workstation. That might be exactly WTF the OP was talking about.
      • Me too!

        But seriously, I'm in pretty much the same boat. ePSXe is the benchmark for my MX440 too.

        I have actually considered an upgrade, but then looked at the prices and thought about how much RAM I could get for the price and how much more that would improve my computing experience.

        If these cards come in under AU$100 I might think about it, just to get a second head and DVI capability.

        If you pretty much only like classic games (like I do) then budget cards are great value.
    • The fact that I have been using the same $40 GF4 MX440 for the past two years pretty much invalidates your point. It runs the Kororaa Xgl LiveCD quite well, too.
      • I just transferred my GeForce 256 DDR VIVO, which I purchased for over £220 back in 2000, from the dying, overclocked-to-750MHz Athlon box I originally had it in to a newer 3GHz box. It plays Sims2 fine. So what is *everybody* on about?
    • >like most of these cheapies, have to be replaced yearly?

      In my experience that's not true. First off, lots of potential PC gamers are pretty damn poor. Students, kids, etc. The onboard card doesn't cut it for a variety of reasons. Too laggy in "low," not enough video RAM, etc.

      A midrange to low card can give these players enough power to run a lot of games at medium to high (at certain resolutions) for a long time coming. Even when it becomes the equivalent of a GeForce2 today it still may beat the onboar
    • Who cares about the games? This card is simply meant for the majority of the world that doesn't play games.
    • Nobody is forcing you to upgrade. You can quite happily live with cheap cards without having to upgrade all the time. The games system that I am currently in the process of upgrading has a 32MB TNT2 card! As you can imagine, even this 7300LE would be a dramatic upgrade for me.

      I can use this low end system because I don't buy the latest games. I can load up an old game like No One Lives Forever without any problems. Now I am finally getting through my backlog of old games that I bought cheap or second hand

  • Why? (Score:2, Interesting)

    Why do companies do this? Doesn't it cost about the same for them to make a 'cheap chip' as it would an expensive one? Is the manufacturing process really that much simpler? All the other costs would be the same, distribution, administration, marketing, whatever.

    Something I've been wondering since the first 'Celeron'.

    • You must be new here...

      but seriously, the cost of fabrication has nothing to do with anything. Jeez, you know what the difference was between a 486DX and a 486SX? The last guy on the assembly line would BURN OFF the math coprocessor of the DXs, making them SXs.

      What if it cost just as much to manufacture a Civic as a Corvette? There's value in the difference between the cars that people are willing to pay for, and that's the key.
      • The last guy in the assembly line would BURN OFF the math coprocessor of the DX's, making them SX's.

        This is a legend that I wish would die out. Yes, the first 486SXs were DXs with a disabled or (more likely) defective math co-pro. After Rev. A, SXs had a different pinout, a different case (plastic vs. ceramic), and a different connector type (BGA vs. pins).

        That said, I will admit the 487SX was a bit odd (a 486DX with one pin different). Then again, computers cost a lot more back then.

        • It's not really a legend if it actually happened, even if only for a time. The same thing has happened with the Celeron line more than once as well. Overclockers have often hit spurts where a particular Celeron was simply a Pentium with the cache disabled and underclocked a little, because Intel was trying to empty their warehouses without devaluing their higher-end product.
    • Re:Why? (Score:3, Informative)

      Defects in manufacturing mean some chips can't work at top spec, so they clock them lower, disable the dud memory, etc., and make a cheaper product out of them to recoup the losses.
    • Re:Why? (Score:1, Informative)

      by p0on ( 669866 )
      It's called price discrimination, and it allows companies to sell products at higher prices to those who will pay higher prices. The principle keeps 80% of retail products companies in business. It's not predatory, it's not unethical, it's just economics. Keep in mind it happens both ways too - some customers are far less profitable than others at the same service and product level.

      The wikipedia entry is a good primer.

      http://en.wikipedia.org/wiki/Price_discrimination [wikipedia.org]
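
      A toy numeric illustration of how price discrimination pays off (all segment sizes and prices below are invented):

```python
# Two buyer segments for the same $10-to-make chip: a small group willing
# to pay a lot, and a big group that will only pay a little.
cost = 10
segments = {"enthusiasts": (300, 1_000), "budget buyers": (50, 20_000)}  # (max price, size)

# One price for everyone: pick whichever single price earns the most.
single = max(
    sum(size for max_price, size in segments.values() if max_price >= price) * (price - cost)
    for price, _ in segments.values()
)

# Segmented pricing: charge each group close to what it will bear.
segmented = sum(size * (max_price - cost) for max_price, size in segments.values())

print(f"best single-price profit: ${single:,}")     # $840,000 (everyone at $50)
print(f"segmented-pricing profit: ${segmented:,}")  # $1,090,000
```
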
    • Marketing (Score:4, Informative)

      by temojen ( 678985 ) on Tuesday March 21, 2006 @02:53PM (#14965938) Journal
      They market them in different market segments. If they did that with identical chips, people would cry foul.

      It may also be that the chips have multiple shader units, and the ones where more shader units fail get downgraded and sold at a lower price, increasing process yields since they have to throw out fewer chips. Sort of like the difference between a 386/33 and a 386/25 in the old days.
    • Re:Why? (Score:5, Informative)

      by MaineCoon ( 12585 ) on Tuesday March 21, 2006 @02:53PM (#14965940) Homepage
      Assuming that this is the exact same silicon, now that they have recouped their investment, these low-end chips may be from piles of accumulated failed chips; chips where only half the shader units passed tests, or where the chip only operated properly at half speed.

      Working on the assumption that, like some other chips (AMD comes to mind), the features on a chip are enabled after a set of tests is run on it, or are enabled in-chip after passing some internal test, it is reasonable to assume that these are from the same silicon as the high-end chips, but faulty in some way - though not faulty enough that they can't be used. I imagine that's the difference between the 7800 GT/GTX - the GT had several (but not too many) failed shader units, and/or operated stably at slightly lower speed than the minimum required for GTX classification.
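
      A toy simulation of that binning idea; the unit count, failure rate, and bin cutoffs below are invented, not NVIDIA's actual numbers:

```python
import random

# Each of 24 shader units fails independently; the die is binned by
# how many units survive testing (hypothetical model of fuse-off binning).
random.seed(0)
UNITS, P_FAIL, TRIALS = 24, 0.04, 100_000

bins = {"top bin (all 24 units work)": 0, "cut-down bin (20-23 work)": 0, "scrap": 0}
for _ in range(TRIALS):
    working = sum(random.random() > P_FAIL for _ in range(UNITS))
    if working == UNITS:
        bins["top bin (all 24 units work)"] += 1
    elif working >= 20:
        bins["cut-down bin (20-23 work)"] += 1
    else:
        bins["scrap"] += 1

for name, count in bins.items():
    print(f"{name}: {100 * count / TRIALS:.1f}%")
```

      With these made-up numbers only about a third of dies make the top bin, but most of the rest are still sellable as a cut-down part instead of being thrown out.
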

    • Doesn't it cost about the same for them to make a 'cheap chip' as it would an expensive one?


      Not necessarily. Chip manufacturing costs are mostly about die size: how many chips can you fit on a single wafer? It's also about yield - that is, how many chips per wafer will run at the speed you're shooting for. If you have a small die size and a lower clock-speed goal, you're going to get a lot of functional chips per wafer. That translates into lower costs.

      Nvidia can certainly design a chip that takes up less die s
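
      A rough sketch of that die-size and yield arithmetic; the wafer size, die areas, and defect density below are invented for illustration:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> float:
    """First-order estimate of whole dies that fit on a round wafer."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2               # pure area ratio
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return gross - edge_loss

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Classic Poisson yield model: a bigger die catches more defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

for name, area in [("big high-end die (300 mm^2)", 300.0),
                   ("small budget die (100 mm^2)", 100.0)]:
    dies = dies_per_wafer(300, area)   # 300 mm wafer
    y = poisson_yield(area, 0.002)     # 0.2 defects per cm^2, invented
    print(f"{name}: ~{dies:.0f} dies/wafer, ~{y:.0%} yield, ~{dies * y:.0f} good dies")
```

      With these numbers the small die yields roughly five times as many good chips per wafer, which is the cost gap the parent is describing.
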
    • Market segmentation [wikipedia.org].

      The practice is quite a bit older than the Celeron even in the CPU industry; e.g. the 486SX was basically a 486DX with the math coprocessor disabled (sometimes the coprocessors really were flawed, but many times they were just disabled).

      They do it simply because they maximise profits that way. The general reason they can do it is that the higher end markets are smaller and less price elastic (e.g. if you really need a fast CPU or GPU for what you do, you kinda have to accept the market's

    • Assume it costs $10 to make a GPU, high-end or low-end. Should all GPUs be made high-end? No.

      Let's say the low-end GPU can be put on a PCI card and sold for $50 at a $10 profit. You can sell a lot of these because they are cheap, but your margins are low.

      Now put the high-end GPU on a slightly more expensive PCI card and sell it for, let's say, $300. Profit is what, somewhere around $200-$250? Sure, you won't sell as many, but the margins are quite impressive.

      Now it's not quite this simple, but that's the general id
  • Why do OEMs insist on using integrated Intel graphics when stuff like this is available?
    • In the case of the newish Dell that's sitting under my desk, it's because it would have required that they actually put an AGP or PCI-E slot on the mobo, and then populate that slot with this $50 card. Or they could go the route they did and integrate the Intel Extreme graphics chip for about the cost of the AGP slot alone.
    • Because integrated cards are even cheaper?
    • Motherboards with integrated video chip sets are cheaper than motherboards plus a separate video card. It's also cheaper to use the same motherboard on all your products, instead of one for each. So your budget offering gets the integrated video while teh l33t antisocial gamer model gets the same motherboard with additional video card, RAM, 3D sound, gigabit, etc.

      If you don't like it, don't buy an OEM system, but build your own. It might be more expensive, but at least you're not paying the Intel graphics "t
    • Fifty bucks is fifty bucks. That's a lot of money when you are building a low-end PC. Depending on its intended use, that money might be better spent on more RAM or a larger hard disk.
    • Re:Why (Score:3, Insightful)

      by pla ( 258480 )
      Why do OEMs insist on using integrated Intel graphics when stuff like this is available?

      Because, believe it or not, most people do not use their PCs for high-end gaming. They may occasionally try to run some hot new resource-sucking game without a clue about the hardware needed to get a good framerate, but for the most part, a Flash app inside their browser counts as the most graphically-intensive app they run.

      Now, on the opposite side of that, you have people who will blow $400 on a video card t
      • So you're saying that there is a card that costs $400 less than a Radeon X1900XT and has 97% of its performance? Please let me know where you've seen this. There is a significant difference between a 7900GT and a 7600GT, and I'm positive that the difference between the 7600GS and GT will be sizable as well. Particularly if you turn on effects like AA, Aniso, and HDR. Not to mention, with all the high-res panels out there now with minimum resolutions of 1680x1050 and the like, you will notice a difference.
    • The same reason they use Celerons and Semprons when Pentiums and Athlons are available.
  • We don't care what 4-digit number NVIDIA marketers will be using. 7300? I bet it's as fast as a GeForce2 MX400.

    Oh, I'm wrong am I? How would anybody know? There's no performance comparison given! How fast is this thing? Should I dump my Geforce 2 MX400?
    • You should dump your seriously outdated video card to get a nice performance boost and the latest eye candy that the newer games support. The only computer I would use a GeForce 2 MX400 in would be a server.
      • Damn straight. That's where mine is. A headless server, no less. (The MX400 is in it just in case / so it doesn't get lost.)
      • My game machine has a GeForce 6600. The MX400 is in my Linux box. Great 2D performance! But I still don't know how fast this 7300 is.
      • Holy poop! A GeForce card in a SERVER? No decent 3D accelerator belongs in a server. Hell, I had a Riva TNT in a spare desktop in case someone wanted to use it. It's better to put it somewhere it *might* be used than somewhere it's guaranteed to *never* be used.

        Wow, I MUST be getting old.... For me, the ideal video card for a server is still the old S3 Trio64+ PCI card. Servers typically run text only or at best have very basic graphics needs, so what in the world would you need someth
        • S3 Trio64+ PCI card

          LOL, the first PC system I built had a Trio64 in it. And I did actually end up using it in my Linux server a few years back. Did the job just fine.

          -Eric

        • When you constantly upgrade your video card, trying to keep your main game machine in the race for the highest FPS, you'll have plenty of old GeForce cards lying around. So, it basically will cost you nothing to put an old card in a new server. At least, that's how some of my servers got GeForce cards.
          • So, it basically will cost you nothing to put an old card in a new server. At least, that's how some of my servers got GeForce cards.

            Older 3D cards will still eat up electricity and generate heat. I don't really see the point of drawing an extra 20-30W 24/7 so I can have a GeForce card in my headless server.
            • You normally don't put a video card into a headless server. For a server with X Windows, a GeForce card is really nice over most older motherboard graphics chips. If you need a PCI card, the GeForce MX cards can be a better bang per buck than other chipsets.
          • Probably selling old GeForces once they're not needed and buying some old PCI card when one is needed would end up much cheaper...
        • Holy poop! A Geforce card in a SERVER?

          Foolish salesfolk assume that if you have a cluster you must be doing movie special effects for LOTR and that a 32 inch widescreen LCD monitor will be connected to each node. Even when you tell the salesfolk that it will never need to do more than text they don't believe you. A lot of the recent dual-processor server boards have better graphics hardware than what I'd consider a mid-range graphics card - which effectively leaves me with a room full of 1U gaming godboxes

          • Foolish salesfolk assume that if you have a cluster you must be doing movie special effects for LOTR and that a 32 inch widescreen LCD monitor will be connected to each node. Even when you tell the salesfolk that it will never need to do more than text they don't believe you.

            Of course they know. They also know that you have to argue with them, for the look of the thing, and that what you really want is the super-duper 3D card.

            The reason? So that the BOFH can swap out the GeForce 64000 GTi 512MB Ultimate

        • I guess it all depends what your server is serving. My server streams video around my house, and I have a GeForce in it to help with the video decoding.
  • When are they going to release Free Software drivers for them? Or alternatively, complete technical specs?
  • The link has pictures indicating that the passively cooled 7600GS is targeted at MCE applications. I'm currently using a 6600GT for gaming; if this is the replacement and can be passively cooled, I would love one in my media PC - I could play on my 37-inch LCD. Plus it would be a perfect excuse to get a 7900GT for my gaming PC... it can't have the same performance as my media PC!

    Any chance for some benchmarks soon?
    • You should be able to pick up a passive video cooling kit for your card for about 20-30 bucks. They're basically heatpipe/bigass heatsink combos, with some including memory heatsinks, too. The one I put on my card certainly made a big difference, as it got rid of that super-whiny tiny little fan that was on there.
    • The 6600GT can be passively cooled (e.g. Gigabyte sells one such model, which I have), but the heatsink is much bigger than the one the 7600GS has in the picture. The performance increase wouldn't be that big, however, and the 7600GS actually sports a lower memory clock than the 6600GT. It would be nice to get a power consumption figure as well - the 6600GT uses about 50 W [xbitlabs.com] and the linked item lists the 7600GT at 67 W.
  • A...G....P...!!! (Score:5, Insightful)

    by MetricT ( 128876 ) on Tuesday March 21, 2006 @03:06PM (#14966072)
    FOR THE LOVE OF GOD Nvidia, please release an AGP version of your cards. Seriously. There's no way I intend to replace a perfectly good motherboard, CPU, memory, etc. just to use a PCI-Express card. I would imagine there are a ton of people in the same boat.

    Jeez, VL-Bus got better support after PCI than AGP is getting after PCI-Express.
    • Because everyone is clamoring for PCI-E, they made their chips PCI-E native. Those are the people who make nVidia the most money, so that's who they cater to. You might be able to argue that workstation cards like the Quadro series make them more money - I don't have the economics in front of me, but that's beside the point.

      The first PCI-E cards were more expensive because they had to use a bridge chip to make them work. They used AGP-based GPUs. Now, since the majority of new cards b
      • Of course, the irony here is that it's mostly people like me who don't want to spend lots of money upgrading that, well, don't want to spend lots of money.

        If I could afford to blow lots of cash on all those bits, I'd buy a decent midrange graphics card. Or an XBox 360.
    • I use a GeForce 6600 AGP, and that's good enough for my needs. I can run all modern games on at least medium settings. I figure by the time I'm playing anything that requires something more powerful, I might as well just upgrade my entire computer, which would mean a nice new PCI-E motherboard.

      So I figure that there isn't a huge need for AGP in the latest, greatest video cards, since relatively few new AGP motherboards are being produced these days.
    • 7800GS (Score:2, Informative)

      by tiffman ( 162790 )
      It's not a 7900, but the 7800GS [newegg.com] is AGP.
    • There are plenty of budget AGP cards in that price range with decent performance; the same could not be said for decent-performing ultra-cheap PCI-E cards till just now.
    • The exact same thing happened with the PCI -> AGP transition: it didn't take long until no one was making nice graphics cards for PCI-only motherboards.
      If you're that worried about the cost of replacing your whole MB, CPU, etc. at once, get a PCI-E MB that has built-in video. Some of the built-in chipsets these days compete quite well against the cheaper cards.
  • This looks like it could potentially be a good card for Home Theater PC (HTPC) use. The picture showed a passive heat sink, which is nice for quiet HTPC use. It talks about H.264 acceleration available in the "PureVideo" feature, which is obviously a big deal for HTPC use.

    I basically don't care about 3D features/performance. The questions I have are:

    - AGP Version Available?
    - "PureVideo" features available in Linux?
  • Any chance of getting one of these for a Mac G5??
  • Too bad it's PCI-E. If you're the budget-component type, you're probably still using AGP 1X.
  • Hmm, so according to the specs, people can set these up in SLI. Wouldn't it be cheaper to go up to the next level (possibly the 7600GS)?
  • Good! I for one hope they keep bringing out fanless video cards. It's getting tough to build a quiet PC with just about every card on the market growing various fans on them.

    The last time I bought a video card I went out of my way to find a fanless one, and it's good to see they're still being made with passive coolers.
