NVIDIA GeForce 7900GS Benchmarked 118

Posted by ScuttleMonkey
from the video-cards-that-don't-require-a-new-mortgage dept.
Spinnerbait writes "NVIDIA has launched another salvo of more competitively priced graphics cards, this time hitting the sub-$200 mark. The new GeForce 7900GS is built on a 90nm fab process with 20 pixel shaders and 7 vertex shaders. The end result is that just about any medium to high res gaming situation can be handled with high levels of anti-aliasing and anisotropic filtering, while maintaining more than acceptable frame rates. Best of all, you can actually purchase a card in retail today, so this is no paper launch."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • no AGP :( (Score:5, Interesting)

    by Ubergrendle (531719) on Wednesday September 06, 2006 @04:09PM (#16055321) Journal
    Given that this discount/budget card is intended for more casual gamers, it's too bad there's no AGP version forthcoming. I suspect I'm in the same boat as many Slashdotters, having a hard time justifying the replacement of an 18-month-old motherboard + CPU just to get PCI-Express -- especially since X2 AMD CPUs are just now coming to the end of manufacturing.

    I'm a dedicated ATI user, but I'd buy the best price/performance card if someone were still supporting AGP.
    • by interiot (50685)
      I'm in the other boat... I'm upgrading my very old motherboard, but I've got a decent AGP video card, and I'd rather keep using it until DX10 is out (even if it's 12 months), but no decent motherboards support AGP anymore. :(
      • Re: (Score:1, Informative)

        by Anonymous Coward
        Try the ASRock 939Dual-SATA2; it has both AGP and PCI-E slots, and it's cheap.
        • by Duds (100634)
          Technically redundant but I have an almost identical board to this for my new Core 2 Duo and it seems very very nice so far (although I'm not using AGP myself)
      • You should then get a mobo with both AGP and PCI-E slots. They aren't that much more expensive.
        • Re: (Score:2, Informative)

          by Dr. Eggman (932300)
          Bad idea. Motherboards that support both AGP and PCIe slots end up underperforming in both. I had an article explaining why, but I can't seem to find it. This one does a good job of explaining how PCIe works and may mention AGP/PCIe combo mobos. http://arstechnica.com/articles/paedia/hardware/pcie.ars [arstechnica.com]
          • Re: (Score:3, Informative)

            by homer_ca (144738)
            Not true. Check out this review of the ASRock 775Dual-VSTA [anandtech.com]. The PCI Express slot is limited to 4X, and it underperforms other PCI-E motherboards by as much as 10%, though it's usually closer than that. AGP performance is actually slightly better. It's one of the most affordable Core 2 Duo motherboards out there, and it can even use your old DDR400 RAM.
            • by interiot (50685)
              But then you're stuck using slower memory, the board doesn't support DDR2...
              • Just not at the same time as DDR. ASRock site [asrock.com] And besides, if you already have DDR and then you buy this board, you're no more "stuck" with DDR than you were before you bought it.

                It's a terrific board at an incredible price. The only reason I didn't buy one is that it's reportedly tough to install current Linux versions on it.
              • by homer_ca (144738)
                Actually it does support both DDR and DDR2. Here's [anandtech.com] an earlier article comparing memory performance between the two. Almost no difference.
          • by Jett (135113)
            The ASRock 939Dual natively supports both AGP and PCI-E, and it has a riser card to upgrade it to AM2. Check out the performance comparisons - it runs full speed with whatever combo you throw at it - people have even gotten it working with a video card in each slot (no SLI that way, unfortunately).
            I've had one for almost a year now and it runs great. OC'd an A64 in it without issue - plays games at framerates comparable to similar boards, no stability issues, and I've got an upgrade path to a new video card
    • Re: (Score:3, Interesting)

      by Dr. Eggman (932300)
      I am facing the same thing. Although, I've finally decided that I'm going to try Core 2 Duo over AMD's X2 when I upgrade. I was hoping to hear an announcement about the GeForce 8800GT and 8800GTX first, though. Supposedly, announcements about it and its support of DirectX 10 (I hope) are coming out this or next month. Until I hear news on the 8800 series, I'm holding off upgrading. Hopefully, I can upgrade and get my PCI-E, dual core, DirectX 10 and Vista all supported in one fell swoop. So far, as long as I
    • by simp (25997)
      I'll have to do a "me too" here. My current machine is fine, I got enough cpu power & memory, but I really wish I could get an AGP videocard upgrade. I'm stuck with a Shuttle barebone. It has done its job perfectly the last few years, but now I'm stuck at the end of an upgrade path. No possibility of swapping mobo's, no pci-e video cards....

      • Thought there was a 7800 or 7900 AGP NVIDIA card? (I currently use an NVIDIA GeForce 6800 AGP and I'm thinking of keeping my 2GHz Opteron w/ 2GB of RAM for probably another 2 years now.)

        Ah, the 7800GS for $270-$300. Which I believe is still a good bit faster than my old 6800 and will perform well enough to be worth the upgrade price (before I ditch the 2GHz Opteron unit for a dual-core or quad-core CPU down the road).

        That reminds me, I need to go look for benchmarks...
    • Off topic. I am looking for a video card to use with my machine, so that I can do video editing with Adobe Premiere/After Effects. Is there a difference between a graphics card for games and cards for video editing? This looks like a pretty good deal to me.
    • I got lucky, sold my AGP 6800GT, xp2600, mobo, ram, etc. on ebay and upgraded to a 939 Opteron 148, 2gb corsair, and an eVGA deal that gave a mobo for free with a 7800gt purchase. With my earnings it really didn't cost much out of pocket (around $200+) and when overclocked it is an FX-55 crusher. I didn't know at the time how stupid it was to buy a 7800, the 7900 came out right after, but again- ebay to the rescue. I then upgraded to a 7900gt with minimal loss (maybe $30) and I was lucky I did because about
    • My boot hard drive died.

      My reaction?

      I bought a brand-spanking-new AM2 motherboard/CPU, PCI-E video card, and new sticks of DDR2.

      Oh, and a new hard drive.
      • by smash (1351)
        Heh, in a similar vein, 2 months ago my power supply died. So I purchased a new mobo, CPU, video card, RAM and case (inc. PSU).

        :) "Forced" hardware upgrades are fun :)

    • Oh, but there IS an AGP card! It's the 7800GS... the last AGP card that will be made by Nvidia.
    • by zarthrag (650912)
      I recently purchased an ATI Sapphire X1500 AGP for $149.99, and am *VERY* satisfied. HDR lighting, 4x antialiasing on everything I play, best money I've spent - besides upgrading to over 1GB of memory.
  • Which one? (Score:5, Funny)

    by neonprimetime (528653) on Wednesday September 06, 2006 @04:11PM (#16055331) Homepage
    Can you spot which one makes this card a hit?

    Operating Systems
    • Windows XP/XP 64/ME/2000
    • Built for Microsoft Windows Vista
    • Linux
    • Macintosh OS X

  • And Dusk and all those others? I recall that it was made to work on ATI stuff, but how about Linux? Anyone ever get that working there?
    • by Doppler00 (534739)
      Yeah those demos were pretty cool because they were optimized for the cards and really pushed the performance to the edge. Your average game available out now won't do that, since it's developed for a mainstream market. It would be rather pointless to make a game only work with the highest end card, and most games don't scale very well graphics-wise.
  • what do people use anisotropic filtering for? is that just
    a funny way of saying arbitrary discretized kernel?
    • Re:question (Score:5, Informative)

      by Sycraft-fu (314770) on Wednesday September 06, 2006 @04:26PM (#16055430)
      Isotropic = identical in all directions. "An-" is a prefix meaning not, so anisotropic means something that is directionally dependent. With respect to filtering of computer graphics, it deals with textures that are off angle to the camera. If a texture is facing the camera (screen), it is easy to scale up and down in size and thus scale off into the distance. However, if something is off angle, such as the ground, it quickly gets blurry in the distance with standard bilinear or trilinear filtering. Thus anisotropic filtering: when enabled, cards perform special filtering on off-angle textures that makes them much clearer.

      It is a very pleasing effect; however, it does require some power and thus can slow down higher-end games.
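A toy sketch of the idea above (pure Python with invented numbers; real hardware is far more sophisticated): trilinear-style filtering averages a square region sized by the pixel's stretched texture footprint, smearing detail in every direction, while anisotropic filtering takes its extra taps only along the stretched axis and stays sharp across it.

```python
import math

def stripes(u, v):
    """Toy texture: vertical stripes, one texel wide (returns 0 or 1)."""
    return int(math.floor(u)) % 2

def isotropic(u, v, stretch, n=32):
    """Trilinear-style filtering: average a *square* region sized by the
    footprint's major axis, so detail is lost in every direction."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            du = (i / (n - 1) - 0.5) * stretch
            dv = (j / (n - 1) - 0.5) * stretch
            total += stripes(u + du, v + dv)
    return total / (n * n)

def anisotropic(u, v, stretch, n=32):
    """Anisotropic filtering: taps only along the stretched (major) axis,
    keeping the texture sharp across it."""
    total = 0.0
    for j in range(n):
        dv = (j / (n - 1) - 0.5) * stretch
        total += stripes(u, v + dv)
    return total / n

# A ground plane viewed at a steep angle stretches a pixel's texture
# footprint along the depth axis:
stretch = 1.0 / math.cos(math.radians(82.8))   # roughly 8 texels

print(anisotropic(1.5, 0.0, stretch))  # 1.0 -> the stripe stays sharp
print(isotropic(1.5, 0.0, stretch))    # ~0.5 -> stripes smeared to grey
```

That is why distant floors look washed out with plain trilinear filtering but stay crisp with anisotropic filtering enabled.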
    • by lolocaust (871165)
      I'd explain it in detail, but I might make a minor mistake, so here's the Wikipedia link: http://en.wikipedia.org/wiki/Anisotropic_filtering [wikipedia.org] - and I think it's just a way to allow textures to be viewed from almost sideways (like walls) without the "shimmering" effect or the blurriness of bilinear filtering.
  • And I'm really starting to regret not getting one - or not having the money, I should say... They were $150 (incl. shipping)... pretty sexy. Needless to say, they sold out before 6 am.
    • No they didn't. I bought one of those (for the machine I'm building, so I haven't had a chance to play with it yet), and I made my order during the work day.
      • oh... maybe i was thinking of something else.... nevertheless i'm a bit upset I couldn't score one
    • Yeah, I bought one (my first woot!), crossing my fingers because there wasn't too much information out. After checking this out, I'm really glad I did.
    • by slaker (53818)
      I bought two from woot. $285.
      I don't even like nVidia GPUs. I figured I could find someone to buy them for the retail price.

      Still, I did try one out. In City of Heroes, the only game I care about, it performed roughly on par with my ATI X1900XT (within 1 - 2 fps), a card that costs about 2.5 times what the 7900GS does.
      Granted, CoH isn't exactly an ATI-friendly game.

      But whoever ends up with those cards will probably be pretty happy.
  • by powerlord (28156) on Wednesday September 06, 2006 @04:36PM (#16055484) Journal
    "NVIDIA has launched another salvo of more competitively priced graphics cards, this time hitting the sub-$200 mark. ..."


    Seems like a good casual gamer card. Of course the NIC integrated with my MotherBoard (bought/built in January) has been good enough for my PC gaming so far.

    Sub $200 is nice ... of course a lot of us on /. are saving up for a [Wii|PS3|XBox360]. =D
  • It looks like this is another one in Nvidia's line that includes support for Macs as well as Windows machines on the same card. At least OS X is listed in the supported OS's. Hopefully they will continue to bundle firmware for both PCs and Macs on the same card, instead of trying to gouge Mac users. Way to go Nvidia.

    • by timeOday (582209)
      I am confused by this. Is it Nvidia's decision for OSX to support a new card, or Apple's? In the past, Apple's high quality control has in part been a result of targeting only selected hardware. The more Mac hardware resembles PC hardware, the more manufacturers will be offering Mac-compatible products. Are they automatically welcome to do so, or can Apple say, "sorry, if you put that in your case it's no longer a Mac"?
      • Re:Mac Support (Score:5, Informative)

        by 99BottlesOfBeerInMyF (813746) on Wednesday September 06, 2006 @04:58PM (#16055639)

        I am confused by this. Is it Nvidia's decision for OSX to support a new card, or Apple's? In the past, Apple's high quality control has in part been a result of targeting only selected hardware.

        Umm, actually in the past video cards did not support Macs for two main reasons. First, they often used ADC, which carried power for the monitor as well as the video feed and which required extra work to support the power requirements. This has not been the case in the last several revisions of all Macs. Second, Macs use EFI or OpenFirmware instead of BIOS, meaning the video card needed to support all three types of firmware. Older Nvidia cards did not support OpenFirmware, which Apple used on PPC Macs. Now that Apple is using EFI, Nvidia has released a couple of cards that use the DVI connector now standard on Macs and that have firmware for both BIOS and EFI in the same ROM. It marketed them as "video card blah for Mac and PC." Presumably, this card is continuing that beneficial trend.

        The more Mac hardware resembles PC hardware, the more manufacturers will be offering Mac-compatible products. Are they automatically welcome to do so, or can Apple say, "sorry, if you put that in your case it's no longer a Mac"?

        Apple is pretty open about letting anyone plug anything they want into Macs and, as far as I know, has never locked out anything in OS X except motherboards. As far as I know, Apple has never refused to bundle the drivers for any devices pre-installed in OS X, but should they not want to do so, the user would simply have to install them from an included CD or download.

        I'm not sure where you got the idea that Apple was holding back video card manufacturers, but as far as I know, that has never been the case. ATI and Nvidia have both had Mac offerings for a long time, often with nothing more than a different ROM and clock speed, and at half again the price of the PC version.

        • Second, the macs use EFI or OpenFirmware instead of BIOS, meaning the video card needed to support all three types of firmware.

          I thought it was a matter of endianness, and not an issue with the bios/firmware. PPCs are big endian while x86 chips are little endian. Now that Macs are running on little endian chips, nothing special needs to be done with the card other than writing a driver for the OS.
            I thought it was a matter of endianness, and not an issue with the BIOS/firmware. PPCs are big-endian while x86 chips are little-endian. Now that Macs are running on little-endian chips, nothing special needs to be done with the card other than writing a driver for the OS.

            I think the driver is the only thing that cares about that in the first place. I know there were firmware flashes that you could use to make a PC card work in a PPC Mac, and my understanding is they just changed the ROM to deal with the
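The endianness point above is easy to see with a few lines of Python (a generic illustration of byte order; nothing here is specific to video card ROMs):

```python
import struct

value = 0x0A0B0C0D

big = struct.pack(">I", value)     # big-endian, as on PPC
little = struct.pack("<I", value)  # little-endian, as on x86

print(big.hex())     # 0a0b0c0d -> most significant byte first
print(little.hex())  # 0d0c0b0a -> least significant byte first

# Reading big-endian bytes as if they were little-endian scrambles
# the value entirely:
misread = struct.unpack("<I", big)[0]
print(hex(misread))  # 0xd0c0b0a
```

The same 32-bit number is stored with its bytes in opposite orders on the two architectures, which is why code (drivers included) that touches raw memory has to care which kind of chip it is running on.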

  • What about the only benchmark that matters: glxgears?
  • Reliability? (Score:5, Interesting)

    by ewhac (5844) on Wednesday September 06, 2006 @05:00PM (#16055661) Homepage Journal
    The 7900GT and 7900GTX were among the most problematic graphics products NVidia ever released. There were hundreds of reports of cards that would display a garbage desktop, lock up, or, most commonly, display "exploding triangles", all of which would seem to point to RAM data corruption somewhere. These failures even happened to people who were not overclocking their cards at all. The issue was so bad that one card vendor apparently shut down its customer comment fora to forestall any further reports of problems (I won't say who, since I didn't independently confirm it).

    There has been tons of speculation on what the cause might be (excessive heat, bad batch of RAMs, signal integrity problems, bad/weak power supplies, too-close-to-the-edge memory timings), but no concrete explanations from anyone.

    I personally bumped into this. I built a brand new rig for myself about four months ago, and gave it an NVidia 7900GT made by eVGA. It wasn't long before stuttering graphics and exploding triangles showed up. Happily, eVGA were very committed to their product, and cross-shipped a replacement which, so far, has worked almost entirely without incident. It's my understanding that customers of competing board vendors have not been so lucky.

    So whenever I see a review of the latest NVidia product, I'm afraid my first question is no longer, "How fast is it?" but, "How reliable is it?" I think burn-in tests should become a standard part of a reviewer's benchmark suite.

    Schwab

    • by linuxpng (314861)
      xfx force
  • More Reviews (Score:5, Informative)

    by homer_ca (144738) on Wednesday September 06, 2006 @05:37PM (#16055900)
    Why is the submitter only pimping the HotHardware review? Here's more (in no particular order):

    HardOCP [hardocp.com]
    Guru3D [guru3d.com]
    Anandtech [anandtech.com]
    Bjorn3D [bjorn3d.com]
    PCPerspective [pcper.com]
    nV News [nvnews.net]
  • Review lacks scope (Score:5, Insightful)

    by crabpeople (720852) on Wednesday September 06, 2006 @06:16PM (#16056118) Journal
    I have a GeForce 6800 GT which I purchased around this time last year. Would it be so much to ask that they append reviews of older cards, 1 or perhaps 2 years back, so that we can see where our cards would rank in relation to the new shit? I would really like to know if there has been a significant performance increase, such that what is now a budget card could outperform my one-year-old high-end card.

    If any reviewer is reading this, please please put more context in the form of older models into your reviews. Comparing them against the current mid/high-range cards does nothing for someone who doesn't obsessively follow video card benchmarks.

    • by hurfy (735314)
      Agreed

      I have no idea how all the newer cards relate to one another.

      Was the 6800 GT better than the 6800 GS, and has it now been replaced by a 7xxx YY? Just trying to look up yours I saw: 6800 GS Extreme, 6800 Extreme, 6800 XT and something else, but no more 6800 GT. How many of those are the same? Which is better? Heck, I can't get the sequence of one manufacturer figured out, much less compare it to the other :(

      Now I just wait until I KNOW mine is too old (that would be NOW!) and find one at about $150-200 (prices
    • Re: (Score:3, Informative)

      by jambarama (784670)
      Techarp has a great comparison [techarp.com] between just about every major video card ever, from just about every major video card maker ever - the 7900GS included. The format of the comparison (images) is terrible since you can't search through it, but it is a pretty sweet chart. The comparison is more a technical one than a performance one, so take it with a grain of salt, but here are the results for a few cards.

      Name_of_Card____Vertex_Pipelines___Textures/Clock____Core_Speed___Memory_Bandwidth
      GeForce_7900GS_____
    • by greg1104 (461138) <gsmith@gregsmith.com> on Wednesday September 06, 2006 @07:28PM (#16056455) Homepage
      The review at Anandtech [anandtech.com] includes benchmarks against cards going back to the 6600/6800 era.
      • by TomHandy (578620)
        Hrmm, I see they include the 6800GS and the 6600GT. Where does the 6800GT fit on those AnandTech benchmarks (that is, which card would it be most comparable to)?
        • by greg1104 (461138)
          If you look at benchmarks comparing the two, the 6800GT performs essentially the same as the later 6800GS.
  • I'd rather spend $200 on a graphics card like this than $600 on a console. My money is on Toshiba's HD-DVD being the industry standard, not Blu-ray. $175 for the Wii wouldn't be too bad though. They should just put the Wii on an internal PC card and cut the price $100 if they want to sell more games. Sony is obviously under the delusion that the whole world will wait on PS3 because the Japanese have boycotted the "foreign devil's" Xbox 360. They're wrong. People will be buying some combination of Xbo
    • by Tweekster (949766)
      Why would anyone want a Wii on a PC?
      Seriously. I am an avid computer guy, but I don't like gaming on computers. It is much simpler to buy a console (my preference was the GameCube because it was inexpensive, worked well, and games were readily available (on the cheap via eBay)).

      I like gaming on a TV: no driver issues, no problems, just put the disc in and go. No worrying about framerates and other irrelevant bullshit just to play a video game for a few hours a week.

      That $200 card will work for now, until
  • For me, the next card I select will be chosen more for the availability & functionality of open source drivers, rather than the raw speed of the chip itself.

    I've just spent too much time trying to configure a Matrox G550 and a Nvidia Quadro 280 to deal nicely with dual-head. Both are busted with recent releases (6.8, 6.9) of Xorg.
  • Anyone care to explain - in English - what pixel and vertex shaders are? The Wikipedia article [wikipedia.org] ain't very clear.
    • I am not a graphics programmer, so this might not be 100% accurate. But I have worked closely with graphics programmers. This is a simple explanation, without hard details:

      Shaders are small (the first versions had an 8-instruction limit; now they can be much more complex, but they are still on the order of hundreds of bytes), domain-specific programs that the GPU executes on each pixel and vertex. They are coded to achieve different effects like normal mapping or blurs. The name comes from the fact that the
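The description above can be sketched as a toy software pipeline (function names and effects here are invented for illustration; real shaders are written in a shading language like GLSL or HLSL and run on the GPU, one invocation per vertex or pixel):

```python
def vertex_shader(pos, offset):
    """Per-vertex program: here it just translates the vertex."""
    x, y = pos
    return (x + offset[0], y + offset[1])

def pixel_shader(color, brightness):
    """Per-pixel program: here it scales the color, a trivial 'effect'
    (real pixel shaders do things like normal mapping or blurs)."""
    r, g, b = color
    return (r * brightness, g * brightness, b * brightness)

# The "GPU" runs the vertex shader once for every vertex...
vertices = [(0, 0), (1, 0), (0, 1)]
moved = [vertex_shader(v, (10, 20)) for v in vertices]
print(moved)  # [(10, 20), (11, 20), (10, 21)]

# ...and the pixel shader once for every covered pixel.
pixels = [(1.0, 0.5, 0.0), (0.2, 0.2, 0.2)]
shaded = [pixel_shader(p, 0.5) for p in pixels]
print(shaded[0])  # (0.5, 0.25, 0.0)
```

The card's "20 pixel shaders and 7 vertex shaders" are just how many of these little programs it can run in parallel per clock.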
  • A friend of mine bought this card from Woot! [woot.com] for $150. It got shipped in a brown box and was allegedly Dell overstock. He had to take a leap into darkness because nobody at the time was allowed to benchmark the thing, but now it looks like a pretty good purchase!
    • by Pfhor (40220)
      I got it from them also. It was in a white box with "woot" tape on it, and appeared to be an MSI pre-release bundle of some sort. Don't know where woot got them from, but it works OK right now (waiting for my PCI-Express power adapter to arrive so I can fully use it).

      Going to test it in a mac pro at work sometime next week, see if it boots.
  • But are they going to release Linux drivers with support for accelerated H.264 decoding any time soon? I don't care how many polygons per second these cards can shift, but acceleration of stuff I actually do, such as playing video, would be very worthwhile.

  • Sheesh I must be getting old! I understood sentences 1 and 4 in that!

    Question : will it do gvim OK?

C for yourself.
