Impressive GPU Numbers From Folding@Home

ludd1t3 writes, "The Folding@Home project has put forth some impressive performance numbers with the GPU client that's designed to work with the ATI X1900. According to the client statistics, there are 448 registered GPUs producing 29 TFLOPS. Those 448 GPUs outperform the combined 25,050 CPUs registered by the Linux and Mac OS clients. Ouch! Are ASICs really that much better than general-purpose circuits? If so, does that mean IBM was right all along with its AS/400/iSeries product line, which makes heavy use of ASICs?"
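(For scale, a quick back-of-the-envelope worked out from the summary's own figures; reading "outperform" as "at least match" is an assumption:)

    # Per-device comparison implied by the figures quoted above.
    gpu_count, gpu_tflops = 448, 29
    cpu_count = 25_050

    print(f"Per GPU: ~{gpu_tflops * 1000 / gpu_count:.0f} GFLOPS")                 # ~65 GFLOPS each
    # If 448 GPUs at least match 25,050 CPUs, each GPU is doing the work of
    # roughly cpu_count / gpu_count CPU clients.
    print(f"Each GPU does the work of ~{cpu_count / gpu_count:.0f} CPU clients")   # ~56x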
  • by blahplusplus ( 757119 ) on Friday October 13, 2006 @06:25PM (#16431183)
    I have a feeling this is memory-bandwidth related: modern GPUs have insane amounts of memory bandwidth compared to typical desktop systems. Not to mention the parallelism.
  • Obvious? (Score:2, Insightful)

    by Iron Condor ( 964856 ) on Friday October 13, 2006 @06:28PM (#16431225)

    Maybe I'm missing some subtlety in the OP somewhere, but if GPUs weren't better at what they're doing than CPUs, there wouldn't be a point in having a GPU in the first place.

    ...and if you have a problem that can be expressed in terms of the problem space the GPU is designed to handle, then that problem is going to run faster on the GPU than on the CPU.

  • by ThinkFr33ly ( 902481 ) on Friday October 13, 2006 @06:34PM (#16431273)
    GPUs are, for the most part, highly specialized parallel computers [wikipedia.org]. Virtually all modern CPUs are serial computers. They do essentially one thing at a time. Because of this, most modern programming languages are tailored to this serial processing.

    Making a general purpose parallel computer is very, very hard. It just so happens that you can use things like shaders for more than just graphics processing, and so via OpenGL and DirectX you can make GPUs do some nifty things.

    In theory, and indeed often in practice, parallel computers are much, much faster than their serial counterparts. Hence the reason a GPU that costs $200 can render incredible 3D scenes that a $1000 CPU wouldn't have a prayer trying to render.
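    (A minimal sketch of the serial-versus-data-parallel distinction above, using NumPy vectorization as a rough stand-in for what a GPU's many shader units do; the array contents and names are invented for illustration:)

    import numpy as np

    values = np.random.rand(1_000_000)      # made-up per-element inputs
    scale = 0.5

    # Serial style: a conventional CPU program touches one element at a time.
    out_serial = np.empty_like(values)
    for i in range(len(values)):
        out_serial[i] = scale * values[i] ** 2

    # Data-parallel style: the same math expressed as one whole-array operation,
    # the kind of workload a GPU (or a shader driven via OpenGL/DirectX) eats up.
    out_parallel = scale * values ** 2

    assert np.allclose(out_serial, out_parallel)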
  • by Murphy Murph ( 833008 ) <sealab.murphy@gmail.com> on Friday October 13, 2006 @06:56PM (#16431505) Journal

    > I actually installed BOINC with SETI on several of my machines last night and it worked quite well to heat part of the house (we Canadians need to turn the heater on earlier). Took a bit of time to get started, but it was nice and toasty in the morning. Does anyone know if this method is less efficient in generating heat than using a space heater? Slower, perhaps...


    Using your CPU as a space heater is not a bad idea. It is 100% efficient. Every watt it consumes gets turned into heat. Before someone says "but the cooling fans are wasteful" let me remind you that the air moved by those cooling fans will eventually come to a stop (inside your house) as a result of friction, releasing its energy as heat in the process.

    Depending on what type of space heater you use, and the construction of your house, your computer can be more efficient than many other electric space heaters. Since none of the energy "consumed" by your CPU/GPU is converted to visible light, none of it has the opportunity to leave your house through your window panes (assuming you have IR reflective glass). Contrast this to quartz and halogen space heaters which produce a fair amount of visible light.

    In much the same way, incandescent bulbs match the efficiency of compact fluorescents during the winter months: every watt "wasted" as heat during the summer is now performing useful work heating your house. (Before someone says "you called a quartz/halogen space heater inefficient because of its waste light, and now an incandescent bulb efficient because of its waste heat!" let me say that the space heater's light is not useful light, while the bulb's heat is useful heat during the cool months.)
  • by Jeffrey Baker ( 6191 ) on Friday October 13, 2006 @07:07PM (#16431621)
    Good one ... but I also wonder why anyone is throwing around the term "ASIC" in this article. A GPU is obviously not an application-specific circuit, which is clearly shown by the fact that it can be programmed to process graphics, or protein folding, or numerous other tasks. A GPU is a general-purpose processor like a CPU, it just happens to have different numbers and kinds of execution units.
  • Not a mystery (Score:3, Insightful)

    by DaveJay ( 133437 ) on Friday October 13, 2006 @07:25PM (#16431767)
    Take one hundred people who have computers and an interest in Folding@Home. Offer them a CPU-driven version of the app, and all 100 computers will be running the CPU-driven app, regardless of the age or performance of the machine.

    Now, offer them a GPU-driven alternative. For the most part, the only people that will install and run it are those with a fancy-schmancy video card capable of running it, and for the most part, the only people that have a fancy-schmancy video card capable of running it have high-performance computers as well (or at least more recent computers that came with compatible cards.)

    So let's say that's ten out of the hundred, and those ten are statistically likely to have had the highest-performing CPUs as well; so you've pulled the top ten performers out of the CPU-client pool and thrown them into the GPU-client pool. Even if you didn't switch those ten people over to the GPU, you could probably isolate those computers' CPU-client performance numbers from the other 90 and find that they're disproportionately faster than the larger number of slower computers.

    There's still more to the story, of course, but you really are taking the highest-performing computers out of the CPU pool and moving them into the GPU pool. The exception would be high-performance servers with lousy or no graphics cards, but those are likely working so hard at their normal business that Folding@Home isn't a priority.
  • by olddoc ( 152678 ) on Friday October 13, 2006 @07:52PM (#16432079)
    If my computer idles at 150 W and runs FAH at 100% CPU at 200 W, and I need 20 hours to generate one work unit, I am spending roughly $0.10 for the extra kilowatt-hour. In the summer I waste money on AC; in the winter I save money on gas heat. If I put my computer in 4 W S3 standby for 15 of those 20 hours, I can save a lot more. FAH calculations do not depend on "free" "idle" computer power; they depend on users spending money to generate the results.
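    (Working through the arithmetic above, using the commenter's own figures of 150 W idle, 200 W under load, 20 hours per unit, and roughly $0.10/kWh:)

    idle_w, load_w = 150, 200          # watts
    hours_per_unit = 20                # hours to finish one work unit
    price_per_kwh = 0.10               # dollars per kWh (commenter's rough rate)

    extra_kwh = (load_w - idle_w) * hours_per_unit / 1000    # 50 W * 20 h = 1 kWh
    print(f"Extra energy per unit: {extra_kwh:.1f} kWh")
    print(f"Extra cost per unit:   ${extra_kwh * price_per_kwh:.2f}")

    # Versus 4 W S3 standby for 15 of the 20 hours (machine awake only 5 h):
    always_on_kwh = idle_w * hours_per_unit / 1000
    standby_kwh = (idle_w * 5 + 4 * 15) / 1000
    print(f"Idle energy saved by standby: {always_on_kwh - standby_kwh:.2f} kWh")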
  • by ameline ( 771895 ) <ian...ameline@@@gmail...com> on Friday October 13, 2006 @07:55PM (#16432109) Homepage Journal
    OK, I went and did the math (assuming, on average, 1,035 BTU per cubic foot of natural gas). Looking at my bills:

    Natural gas is CDN$0.278287 per cubic metre, and electricity is $0.058/kWh.

    At 96% efficiency, natural gas works out to $0.027331/kWh (3,413 BTU in 1 kWh), or 47% of the cost of electricity at today's prices in Toronto.

    So 1/3 was a bit of hyperbole, but not too much.
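    (The same math spelled out, using the prices and assumptions above; the 35.31 ft³ per m³ conversion factor is added here:)

    gas_price_per_m3 = 0.278287     # CDN$ per cubic metre, from the bill above
    elec_price_per_kwh = 0.058      # CDN$ per kWh
    btu_per_ft3 = 1035              # assumed energy content of natural gas
    ft3_per_m3 = 35.31              # cubic feet per cubic metre
    btu_per_kwh = 3413
    furnace_efficiency = 0.96

    fuel_kwh_per_m3 = btu_per_ft3 * ft3_per_m3 / btu_per_kwh        # ~10.7 kWh of fuel per m^3
    gas_cost_per_kwh_heat = gas_price_per_m3 / (fuel_kwh_per_m3 * furnace_efficiency)

    print(f"Gas heat:      ${gas_cost_per_kwh_heat:.4f} per kWh")   # ~$0.027
    print(f"Electric heat: ${elec_price_per_kwh:.4f} per kWh")
    print(f"Gas costs {gas_cost_per_kwh_heat / elec_price_per_kwh:.0%} of electric resistance heat")  # ~47%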
  • Re:Not a mystery (Score:5, Insightful)

    by tkittel ( 619119 ) on Friday October 13, 2006 @08:56PM (#16432595)
    Your logic is fine, but you are overestimating the effect you mention if you really think that it "solves the mystery".

    500 users out of 25000 means that you have at most taken the 2 percent highest performers out of the CPU pool. If we assume that those 2 percent have computers that are 5 times as powerful as the average computer, then we have lowered the average performance of the CPU pool by roughly 9%.

    This 9% systematic effect will lower the reported performance superiority of the GPU vs. the CPU from around 5000% to something like 4500%. I.e., it doesn't change the result at all (which seems to be that GPUs kick ass for these applications).
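    (A quick check of that estimate; the 500-machine and 5x figures are the assumptions stated above, and the ~50x GPU advantage is taken from the reported numbers:)

    total_cpus = 25_000
    top_cpus = 500                  # assumed to have moved to the GPU client
    top_speed_factor = 5            # assumed 5x the pool average

    # Normalize the original CPU-pool average to 1.0 and remove the top machines.
    remaining_avg = (total_cpus - top_cpus * top_speed_factor) / (total_cpus - top_cpus)
    print(f"Remaining CPU-pool average: {remaining_avg:.2f}")        # ~0.92, i.e. ~8% lower

    # Effect on an apparent ~50x (i.e. ~5000%) GPU advantage:
    print(f"Corrected GPU advantage: ~{50 * remaining_avg:.0f}x")    # still roughly 45-46x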
  • by Majik Sheff ( 930627 ) on Friday October 13, 2006 @09:03PM (#16432647) Journal
    Look at the first two letters of the acronym: Application Specific. A screwdriver and a Swiss Army knife will both turn a screw, but the screwdriver is going to be much more efficient at it. GPUs are finely tuned to rip through massive volumes of floating-point vectors and not much else. It just so happens that the folding project also fits this description, and as such it is an excellent use of an otherwise wasted resource.
  • Not really. (Score:4, Insightful)

    by TerranFury ( 726743 ) on Friday October 13, 2006 @09:41PM (#16432867)

    >Using your CPU as a space heater is not a bad idea. It is 100% efficient.

    Not really. Consider exergy [wikipedia.org]. Yes, your CPU is just as efficient as any electric space heater. However, consider that the alternative is probably burning natural gas or oil in a furnace. If you burn fuel for heat, 90%+ of the chemical energy goes to producing heat (the rest is mostly lost as hot exhaust gases up the flue). If you burn fuel to spin a turbine at a power plant, only about 40% goes to electrical energy, and unless it's a cogeneration plant that uses the waste heat for industrial purposes, the rest is lost as heat up the smokestacks. So, starting from the fossil-fuel source, electrical heating is less than half as efficient as burning fuel for heat. If you do need to heat using electric power, it's much more efficient to use that electricity to pump heat in from the lower temperature outside than it is to turn the electricity itself into heat.

    If you are stuck with electric (non-heat-pump) heating in your house, however, you are correct: There is absolutely no reason not to run your CPU or any other electrical appliance full tilt.
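    (Rough numbers behind the argument above; the 90% furnace, 40% power-plant, and heat-pump COP of 3 figures are illustrative assumptions:)

    furnace_efficiency = 0.90       # fuel burned directly for heat at home
    plant_efficiency = 0.40         # fuel -> electricity at the power plant
    resistive_efficiency = 1.00     # electricity -> heat in a CPU or space heater
    heat_pump_cop = 3.0             # heat moved per unit of electricity consumed

    # Units of heat delivered to the house per unit of fuel energy burned:
    print("Furnace:             ", furnace_efficiency)                          # 0.90
    print("Plant + resistance:  ", plant_efficiency * resistive_efficiency)     # 0.40
    print("Plant + heat pump:   ", plant_efficiency * heat_pump_cop)            # 1.20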

  • by tukkayoot ( 528280 ) on Saturday October 14, 2006 @01:12AM (#16433851) Homepage
    It may be useful in getting people to actually download and run the client.
