GPUs To Power Supercomputing's Next Revolution

evanwired writes "Revolution is a word that's often thrown around with little thought in high-tech circles, but this one looks real. Wired News has a comprehensive report on computer scientists' efforts to adapt graphics processors for high-performance computing. The goal for these NVIDIA and ATI chips is to tackle non-graphics-related number crunching for complex scientific calculations. NVIDIA announced this week along with its new wicked fast GeForce 8800 release the first C-compiler environment for the GPU; Wired reports that ATI is planning to release at least some of its proprietary code to the public domain to spur non-graphics-related development of its technology. Meanwhile, lab results are showing some amazing comparisons between CPU and GPU performance. Stanford's distributed computing project Folding@Home launched a GPU beta last month that is now publishing data putting donated GPU performance at 20 to 40 times the efficiency of donated CPU performance."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Thursday November 09, 2006 @05:54PM (#16789059)
    I was thinking about the question of what makes GPUs so great..

    I thought .. What is it that a CPU does that a GPU doesn't?

    Oh yeah .. I know .. run windows.

    *I'm kidding I'm kidding*
  • by Chayak ( 925733 ) on Thursday November 09, 2006 @05:57PM (#16789101)
Great, now Homeland Defence is going to buy up all the graphics cards to prevent their dangerous computing power from falling into the hands of evil script kiddies trying to crack your hotmail account...
  • by NerveGas ( 168686 ) on Thursday November 09, 2006 @06:00PM (#16789127)

        "Serious" computers won't come with fewer than 4 16x PCI-E slots for hooking in "scientific processing units"...

        We used to tell our boss that we were going to do stress-testing when we stayed late to play Q3; this takes that joke to a whole new level.
  • by JCOTTON ( 775912 ) on Thursday November 09, 2006 @06:02PM (#16789143) Homepage Journal
    Can a game computer that is self-aware, play with itself? Are Slashdotters that play with themselves, self-aware?

    Step back, step back from that sig....

  • by sczimme ( 603413 ) on Thursday November 09, 2006 @06:03PM (#16789153)


    NVIDIA announced this week along with its new wicked fast GeForce 8800 release the first C-compiler environment for the GPU

    "Wicked fast" GPU? And a compiler?

    Sounds like a Boston C Party.

  • by Donut2099 ( 153459 ) on Thursday November 09, 2006 @06:20PM (#16789265) Journal
    They found they could get even more performance by turning off vsync!
  • sigh.. (Score:5, Funny)

    by springbox ( 853816 ) on Thursday November 09, 2006 @06:36PM (#16789351)
    They "CUDA" come up with a better acronym.
  • 8087 (Score:4, Funny)

    by Bo'Bob'O ( 95398 ) on Friday November 10, 2006 @04:44AM (#16791768)
    "GPUs have dedicated circuitry to do math, math, and more math - and to do it *fast*. In a single cycle, they can perform mathematical computations that take general-purpose CPUs an eternity, in comparison."

    Sounds like there is a lot of untapped potential. I propose we move GPUs off the external cards and give them their own dedicated spot on the motherboard. Though, since we will be allowing it to be used for more general applications, we could just call it a Math Processor. Then again, it's not really a full processor like a dual core, so we'll just call it a Co-Processor. This new "Math Co-Processor" will revolutionize PCs like nothing we have ever seen before. Think of it: who would have thought 20 years ago we could have a whole chip just for floating point math!
