Nvidia 480-Core Graphics Card Approaches 2 Teraflops 261

An anonymous reader writes "At CES, Nvidia has announced a graphics card with 480 cores that can crank up performance to reach close to 2 teraflops. The company's GTX 295 graphics card has two GPUs with 240 cores each that can execute graphics and other computing tasks like video processing. The card delivers 1.788 teraflops of performance, which Nvidia claims is the fastest single graphics card on the market."
  • by TibbonZero ( 571809 ) <Tibbon&gmail,com> on Friday January 09, 2009 @04:56PM (#26391615) Homepage Journal
    No, seriously... can anything run it at full options yet?
    • Yes (Score:3, Informative)

      by jgtg32a ( 1173373 )
      I can run Crysis/Warhead at 30fps maxed out at 720p. I have a single 4850.

      The problem with video card reviews is that they don't bother testing anything lower than 1920x1080, which has 2.25x the pixels of 720p (quick arithmetic below).

      Crysis takes a lot to run but it has already been tamed as long as you aren't running at 2560x1600 or some other absurd resolution.
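
      Since the argument turns on pixel counts, here is a minimal sketch of that arithmetic for the resolutions mentioned in this thread (the ratio is simply total pixels relative to 720p):

```python
# Pixel-count ratios relative to 720p for the resolutions discussed in
# this thread; 1920x1080 works out to the 2.25x figure quoted above.
RESOLUTIONS = {
    "720p (1280x720)": (1280, 720),
    "1280x1024": (1280, 1024),
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
}

base = 1280 * 720
for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h / base:.2f}x the pixels of 720p")
```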
      • Of course they test at 1920x1200 - that's how you stress the card. It also avoids the problem of getting ridiculously high framerates because you tested at a more modest resolution.
        • Re: (Score:3, Insightful)

          by jgtg32a ( 1173373 )
          A video card test needs to show a consumer the capabilities of the card, so they can decide if the card is for them. If what you said were true, then they would only do one test at 1920x1600 and be done with it. The lowest resolution I've ever seen on a review was 1920x1080. Not everyone has a monitor that runs that high.
        • Re: (Score:3, Insightful)

          by billcopc ( 196330 )

          Right, but I don't know very many people with 1920x1200 displays. I have one, and my 18-month old GPU can run Crysis and any other game just fine at that rez, but practically everyone else I know is still at 1280x1024 or 1680x1050.

          Realistically, reviewers should find the resolution and settings at which a game is playable, meaning 25-30 fps average for most games. Sure, it's funny to know that Crysis will get 8 fps at 2560x1600 with 16x AA+AF, but if that's what the reviewers think even hardcore gamers e

      • Re: (Score:3, Insightful)

        by Surt ( 22457 )

        1920x1200 is the preferred resolution because it is the native resolution of most 24" panels. If you don't play at native resolution, you get to experience glorious scaling artifacts. Glorious, glorious scaling artifacts.

        • Re: (Score:3, Insightful)

          by Runefox ( 905204 )

          Almost makes me pine for the days of the CRT. ... Well, maybe not exactly. I don't want to imagine how heavy a 24" or larger CRT would be, but I'd love for another technology not locked to a single native resolution to break through the never-ending sea of fixed-pixel devices. For now, I just run my LCD in the scaled "maintain aspect" mode on my Radeon and enjoy the black borders on non-native resolutions. Better than that nice blurry stretch effect I'd get otherwise!

      • I was thinking about this. I am picking up a 6GB RAM/4870X2/i7 920 setup and keep thinking: "Why not just run 1680x1050 dual monitors with like 16xAA?"

        The thing about being able to run the current generation of games at 2560x1600 is that there isn't a chance in hell you'll be able to do it with the same setup a year later, as games will be too demanding, and lowering the resolution while preserving the aspect ratio probably makes everything look like crap. Not to mention how disappointing that would be.

        Car analog

  • by sexconker ( 1179573 ) on Friday January 09, 2009 @04:56PM (#26391619)

    1.21 Jiggawatts

    • What the hell is a Jiggawatt?

      • Re: (Score:2, Informative)

        by Anonymous Coward
        It's an allusion to Back to the Future, where Marty goes back to the '50s and tells the doctor that the DeLorean time machine requires "one point twenty-one gigawatts" to make the leap. Only back in 1985, the SI prefix "giga" wasn't well known, so presumably the actors or directors in the film arbitrarily, or by following French language convention, decided to pronounce giga with a soft g, hence the line "1.21 jiggawatts", which sounds a little out of place in 2009.
      • Jay-Z's so big now he generates his own form of power.

    • Not to mention that you have to have the video card traveling at 88 MPH.

      • by Adriax ( 746043 )

        OK, so the card is only really useful for mobile computing. To get that speed you have to be on an interstate in most of the US, or in a school zone in California.

  • Contest... (Score:5, Funny)

    by Anonymous Coward on Friday January 09, 2009 @04:59PM (#26391647)

    Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.

    • by jollyreaper ( 513215 ) on Friday January 09, 2009 @05:10PM (#26391801)

      Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.

      Yeah, but it's mega-floppy at that.

    • by pla ( 258480 ) on Friday January 09, 2009 @05:20PM (#26391939) Journal
      Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.

      Not quite - They proved they have the biggest number of penises... Making for some interesting crossover potential into the Hentai gaming market.

      / Wonders what "ultra realistic" means as regards H - "Wow, the fur on her tail looks almost real, and her breasts look like actual porcelain!"
      • Not quite - They proved they have the biggest number of penises... Making for some interesting crossover potential into the Hentai gaming market.

        And to fit all those penises on the card, they had to make sure they were very very small.

      • Re: (Score:2, Informative)

        by randyest ( 589159 )
        That's gross; you're gross.
    • Not even close. A single ATI HD4870X2 card has 2.4 TFLOPS of processing power: 2 (instr/clock with MAD) * 800 (Streaming Processors) * 750 (MHz) * 2 (GPUs) = 2.4 TFLOPS.
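
      A minimal sketch of that arithmetic, applied to both cards. The HD 4870 X2 figures are the parent's; for the GTX 295, the 1.242 GHz shader clock and the 3-flops-per-clock (MAD + MUL) counting are assumptions about how Nvidia arrives at its number, not figures from the article:

```python
# Back-of-the-envelope peak throughput, the way these marketing numbers
# are usually derived: flops/clock * shader count * shader clock * GPUs.
def peak_tflops(flops_per_clock, shaders, clock_ghz, gpus=1):
    return flops_per_clock * shaders * clock_ghz * gpus / 1000.0

# HD 4870 X2, using the parent's figures: 2 flops/clock (MAD), 800 SPs,
# 750 MHz, 2 GPUs.
print(peak_tflops(2, 800, 0.750, gpus=2))  # 2.4

# GTX 295: 240 SPs per GPU and 2 GPUs are from the article; the 1.242 GHz
# shader clock and 3 flops/clock (MAD + MUL) are assumed.
print(peak_tflops(3, 240, 1.242, gpus=2))  # ~1.788
```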
  • Great... (Score:5, Insightful)

    by pwolf ( 1016201 ) on Friday January 09, 2009 @05:01PM (#26391679)
    That's just great and all, but when can I get a video card that doesn't take up half my case or melt down after 6 months of use? Not to mention one that doesn't cost an arm and a leg.
    • Re:Great... (Score:5, Funny)

      by clarkn0va ( 807617 ) <apt,get&gmail,com> on Friday January 09, 2009 @05:10PM (#26391785) Homepage

      when can I get a video card that doesn't take up half my case or melt down after 6 months of use? Not to mention one that doesn't cost an arm and a leg.

      2006?

    • by slaker ( 53818 )

      No kidding. I have a BFG Tech 8800GTX that's been replaced five times since I got it. My game system used to be an overclocked affair with several hard drives, but over time I've reduced it to a 700W Corsair PSU, an un-overclockable Intel-branded motherboard, one hard disk and stock Crucial RAM, thinking maybe my setup was killing the card... all in an enormous Antec P180 case, which has dedicated cooling for the graphics slot and multiple 120mm fans.

      Fucking thing died again a couple weeks ago. Even when it'

      • Sounds like the perfect opportunity to add a new vent and fan.
      • by Fweeky ( 41046 )

        I bought an ATI when my 8800GTS 512 died; I didn't want to play the lottery as to whether the replacement would have the same manufacturing defects [tgdaily.com] or not.

        nVidia are going to have to do something pretty special to attract me back after that; putting two of their power hungry barely-fabricatable huge monolithic GPUs on a single card just isn't it.

      • The problem seems to be that many video cards ship with inadequate cooling systems. At least that's been my experience. Back in the day, custom cooling solutions were pretty much reserved for those doing serious overclocking. Now cooling requirements have gone up, but manufacturers generally use the bare minimum, such that the GPU doesn't overheat as soon as it's powered up, and nothing more.

        I've only got a 7900GTX, but after having it replaced once, and then getting more jaggies, slowdown, and stutterin

      • My 8800 GTX burnt out too; I replaced it with a 4850 that was on sale for $150, figuring I'd throw it in my media center PC when I got my 8800 back. The 4850 works so well I haven't bothered to send in the 8800. I play on a 24" Dell (1920x1200) and the 4850 runs most everything VERY well at that res.
    • Right now (Score:4, Informative)

      by Sycraft-fu ( 314770 ) on Friday January 09, 2009 @05:14PM (#26391861)

      One of the benefits of the technology war is that it produces good midrange and low end technology as well. This is particularly true in the case of graphics cards since they are so parallel. They more or less just lop off some of the execution units and maybe slow down the clock and you get a budget card.

      Whatever your budget is, there's probably a good card available at that level. Now will it be as fast as the GTX 295? Of course not. However they'll be as fast as they can be at that price/power consumption point.

      Don't pitch a fit because some people need/want high-end cards. Enjoy the fact that they help subsidize you getting good, cheap midrange cards.

      If you want serious suggestions, tell me your budget range and what you want to do and I'll recommend some cards.

      • by morcego ( 260031 )

        Amen to that. I got a 9600GT (1GB DDR3) a couple of months ago for a very nice price. It meets all my needs, and then some.

        I never buy the latest model of anything. It is simply (for me) not worth it.

    • by evanbd ( 210358 )
      Perhaps you should simply buy one of the less expensive cards out there? Of course the highest performing card available uses lots of power and costs a lot. Get something less powerful.
    • No kidding! I just ran into my first Nvidia heat-o'-death situation too.

      Anyone know of an after-market part to draw air directly over your PCIe cards? This is a problem that's right now solved by the turning-my-graphics-card-into-a-jet-engine solution. It works, but if there's a quieter answer that keeps the graphics power I'd be happy to hear it.

      Here's the skinny:

      My 790i comes with 3 PCIe slots, so I thought I'd try SLI with two new cards, and an older one (in the middle, thanks to the bridge) for a second monitor

      • Re: (Score:2, Informative)

        61C on a video card isn't much to worry about. Using RivaTuner I used to watch an 8800GTS creep up to 90C and it never died. Unfortunately, taking comfort in knowing your video cards won't get cooked isn't very useful when you're worried about the other nearby devices. For what it's worth, my old temps were in an Antec Sonata II case. When I switched over to the Antec 900 my 8800GTS temps dropped to the 55-60C range. Maybe a new case is the aftermarket part you're looking for?
        • Might do it. I've been reading about some directional fans that may help.

          Problem is these cards tend to draw air from the face instead of the back. That's no good when your neighbours are mighty-hot already.

          Zalman's got a fan I might crack the plastic case on these guys for. I figure a little more space between cards and adding a bunch of surface area might help get some air in that's not coming directly off the other cards.

          Thanks a bunch!

    • When *won't* you be able to get a video card that takes up less than half your case and doesn't require its own power supply?

      Right now you can still get a high powered graphics card for less than $50 with a small or no fan. But those cards are 2 year old technology. These days all the latest and greatest are essentially a PC within a PC and I doubt the power and cooling requirements will go down with time.

      So in 5 years these ridiculously large cards will cost $50, but they'll still be ridiculously large.

      10

      • Right now you can still get a high powered graphics card for less than $50 with a small or no fan.

        Define "high powered", please...

  • 480 core? (Score:5, Interesting)

    by Anonymous Coward on Friday January 09, 2009 @05:05PM (#26391715)

    Color me doubtful, but I suspect it's 480 stream processors, which isn't anywhere NEAR the same thing as the "cores" on a CPU or even the core of the GPU.

    Why has the press suddenly started to call stream processors "cores"? Marketing?

    • Re: (Score:3, Insightful)

      by Chabo ( 880571 )
      Maybe because GPGPU is coming soon, and the GPU makers want people to think of them as individual cores? So... partly marketing, I guess.
  • Will this card support OpenCL?

  • by Thelasko ( 1196535 ) on Friday January 09, 2009 @05:24PM (#26391991) Journal
    Why do we bother with CPUs anymore? I'm just going to fill a case with graphics cards and call it a day.
    • Re: (Score:2, Informative)

      by Anonymous Coward

      Because this card can only do 1.788 tera-multiply-adds per second. Try instead to have it build a parse tree, then run transformation algorithms on it (chasing pointers all over the place) and so on, like you would while compiling code, and this thing will make the Atom look great.

      CPUs are optimized for general computing, GPUs are optimized for stream-oriented numeric computing. Both have their uses, and the ideal is probably a combination of both, as is currently done.
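
      A toy illustration of the two workload shapes being contrasted here; nothing below actually runs on a GPU, it just shows why one pattern spreads across hundreds of stream processors and the other does not:

```python
import random

# 1) Stream-style multiply-add: every element is independent, so the work
#    maps naturally onto hundreds of parallel stream processors.
def multiply_add(a, b, c):
    return [x * y + z for x, y, z in zip(a, b, c)]

# 2) Pointer chasing (parse trees, linked structures): each step depends
#    on the previous one, leaving a wide parallel machine mostly idle.
def chase(links, start, steps):
    node = start
    for _ in range(steps):
        node = links[node]
    return node

n = 100_000
a = [1.0] * n
b = [2.0] * n
c = [3.0] * n
links = list(range(n))
random.shuffle(links)

multiply_add(a, b, c)  # GPU-friendly: wide, regular, independent
chase(links, 0, n)     # CPU territory: serial and latency-bound
```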

    • by Rhys ( 96510 )

      This is reaching the point of "why bother with a supercomputer?" If your app can make use of those cards, an enthusiast board with 3-way SLI can deliver more performance than a 4-year-old supercomputer (that I manage) which debuted at #66 can. In one PC, with a hell of a lot less to go wrong.

      4 years old is pretty old for a supercomputer, but still that amount of computational power is staggering.

    • Why do we bother... with CPUs anymore? I'm just going to fill a case with graphics cards and call it a day.

      Then you can enjoy the fact that you'll be able to run your anti-virus software 21x faster too. [theinquirer.net]

  • by Skiron ( 735617 ) on Friday January 09, 2009 @05:35PM (#26392119)

    ... for Windows 7 (or whatever they call Vista now).

  • Because their Tesla boards post nearly a TFLOP of performance for single-precision computing, but only around 78 GFLOPS for double precision.
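
    A rough sketch of where that gap comes from on the GT200-based Tesla parts; the unit counts (240 single-precision ALUs vs. 30 double-precision units) and the ~1.3 GHz shader clock are assumptions about the design, not figures from the comment above:

```python
# Peak throughput falls out of how few double-precision units GT200 has.
# Assumed figures: 240 SP ALUs at 3 flops/clock (MAD + MUL), 30 DP units
# at 2 flops/clock (fused MAD), ~1.296 GHz shader clock.
SHADER_CLOCK_GHZ = 1.296

single = 240 * 3 * SHADER_CLOCK_GHZ  # ~933 GFLOPS ("nearly a TFLOP")
double = 30 * 2 * SHADER_CLOCK_GHZ   # ~78 GFLOPS

print(f"single precision: ~{single:.0f} GFLOPS")
print(f"double precision: ~{double:.0f} GFLOPS")
```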

  • *sigh* (Score:5, Funny)

    by CynicalTyler ( 986549 ) on Friday January 09, 2009 @05:50PM (#26392335)
    Can someone please post the link to a how-to guide for convincing your wife/girlfriend of the necessity of owning a graphics card with dual 240-core GPUs? Or, if you are a girl who acknowledges said necessity without a fight, please post a link to your Facebook profile. Thank you in advance.
    • How to convince her??? Just agree to be her personal slave for the rest of your life. You'll get your video card, but you'll never have time to use it (unless she has you online, putting swimsuits on her 3D virtual model)!
  • by jdb2 ( 800046 ) * on Friday January 09, 2009 @06:07PM (#26392591) Journal
    If you thought the Radeon 4870 X2 was overkill, then you need a new word to describe the monstrosity that Nvidia has just released. Here's what Nvidia has done:
    1. Taken the GT200 GPU and shrunk the die to a 55nm process (to match AMD/ATI's 55nm RV770).
    2. Basically slapped together 2 complete and independent graphics cards; that is, the GTX 295 is composed of 2 PCBs with their "topsides" facing each other and a huge heatsink between them.
    3. Linked the two "cards"/PCBs via an SLI bridge (or is it a PCIe bridge?).

    Compare this to the Radeon 4870 X2: 2 55nm RV770 GPUs on the same PCB connected by a PCIe bridge, although the card also has a "CrossFire X Sideport" interlink (which I think is HyperTransport, although I may be wrong) that directly connects the two GPUs and isn't enabled in their drivers at the moment. (You can see it on the PCB -- a set of horizontal traces directly linking both GPUs.) One might wonder if they've delayed enabling the direct link because they knew Nvidia would respond this way.

    Anyway, it's always great when two companies battle it out, as the consumer always wins.

    jdb2

  • by J.R. Random ( 801334 ) on Friday January 09, 2009 @08:45PM (#26394361)
    Until NVIDIA starts supporting the development of open source drivers I'm sticking with ATI, no matter how many Blazing Cores of Might NVIDIA might fit onto their chips. While ATI's closed source drivers have their fair share of bugs, and it will be some time before there are good 3D open source drivers for their more recent cards, at least the development has started and ATI has been aiding it, not hobbling it.
    • Re: (Score:3, Interesting)

      by Casandro ( 751346 )

      Well, ATI recently announced that they want to start supporting open source drivers again. It's just a matter of time, I hope. Otherwise I'll have to go with Intel for my next chipset.

  • by Casandro ( 751346 ) on Saturday January 10, 2009 @02:54AM (#26396313)

    I mean seriously, as long as they don't publish the hardware specifications so you can write your own software for it, it's pretty much useless. The only thing you can do with it is play games. And even then you have to fear every little software update, as it might trigger some bug in the binary-only drivers the manufacturer provides.
