Folding@Home Releases GPU Client
SB_SamuraiSam writes, "Today the Folding@Home Group at Stanford University released a client (download here) that allows participants to fold on their ATI X19xx series R580-core graphics cards. AnandTech reports, 'With help from ATI, the Folding@Home team has created a version of their client that can utilize ATI's X19xx GPUs with very impressive results. While we do not have the client in our hands quite yet, as it will not be released until Monday, the Folding@Home team is saying that the GPU-accelerated client is 20 to 40 times faster than their clients just using the CPU.'"
Power usage? (Score:4, Interesting)
Re:Power usage? (Score:5, Informative)
20x-40x the performance at 1x-3x the power usage is pretty good.
steve
Re: (Score:2, Informative)
There are a few places that confirm this: here they show just the VGA card usage [xtreview.com], and here they show the total system power usage [anandtech.com].
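Putting the two figures above together, the work done per watt still comes out well ahead. A rough back-of-the-envelope calculation, using only the claimed speedup and measured power ranges (illustrative numbers, not official Folding@Home figures):

    # Rough performance-per-watt estimate (illustrative numbers only).
    speedup_low, speedup_high = 20.0, 40.0        # claimed GPU speedup vs. CPU client
    power_low, power_high = 1.0, 3.0              # measured increase in system power draw

    worst_case = speedup_low / power_high         # slowest speedup, biggest power hit
    best_case = speedup_high / power_low          # fastest speedup, no extra power

    print("Work per watt improves %.1fx to %.1fx" % (worst_case, best_case))
    # -> Work per watt improves 6.7x to 40.0x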
Re: (Score:2, Insightful)
Re:Power usage? (Score:5, Funny)
Clearly, you're not one of the millions with an active WoW subscription.
Re: (Score:1)
Re: (Score:3, Informative)
Re: (Score:2)
I'd guess it was the 3800+ that was pegged.
Not really. (Score:4, Funny)
Re: (Score:2)
I think.
Hey, anybody know the math for this?
Re:Power usage? (Score:5, Informative)
See also Reversible computing. [wikipedia.org]
I think it's just a space heater. (Score:2)
I suspect that a good model for its energy consumption would just be a big resistive load along with some capacitive and inductive load (I'm not reall
Re: (Score:2)
Meanwhile, do you happen to know the math, other than measuring the discrepancy between diffused heat and power input?
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:1)
Re:Power usage? (Score:5, Informative)
Folding@home vs FarCry, WoW, etc... (Score:2)
drawback (Score:3, Funny)
Yeah, but what kind of results do you get if you combine the GPU-accelerated client with a KillerNIC network card? It must at least triple the speed. At least.
Re:argh (Score:2)
Re: (Score:2)
good, I think... (Score:3, Insightful)
Is either of my worries valid? Can it damage the card (or speed up its death), and what's the probability of a security threat?
Re: (Score:2, Interesting)
Re: (Score:2)
Re:good, I think... (Score:5, Informative)
Re: (Score:2)
I don't like that kind of reasoning. If my computer is good enough today, it should be good enough 10 years from now. About the only thing I am willing to concede is that computers aren't always "good enough", but I do think they are now.
Re: (Score:3, Insightful)
Face it, computers are one of the fastest changing technologies. Intel plans to have some ridiculous hardware in only 5 years. 80 core CPUs? Crazy. If you think your current dual dual-core setup (I'm assuming you have the best PC possible to back up that 10 yr statement) will be able to handle what an 80-core doesn'
Re: (Score:2)
I used to change my computer about every 18 months. My current one, though, will be 4 years
Re: (Score:2, Insightful)
I hope I just missed your <sarcasm> ... </sarcasm> tags.
Ever hear of Moore's Law?
wikipedia: Moore's Law [wikipedia.org]
Transistor density has been doubling every 24 months (I recall it being quoted as 18 months, but we would be arguing semantics) for as long as I can remember. In 10 years, that's 2**5, or 32 times denser than it is right now. And you think the computer you have now will run anything remotely close to what
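For what it's worth, the grandparent's arithmetic checks out. A quick sanity check, assuming an idealized clean 24-month doubling:

    # Idealized Moore's Law: density doubles every 24 months.
    years = 10
    doublings = years * 12 // 24          # 5 doublings in 10 years
    density_multiplier = 2 ** doublings
    print(density_multiplier)             # -> 32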
Re: (Score:2)
You aren't quoting Bill Gates. You are quoting an urban legend.
Frankly, while I think expecting a computer to have a ten-year useful life is a big stretch, I don't think it is unreasonable to expect a computer to have at least five years of useful life. My dad's computer is a cast-off dual Xeon 500MHz, made in 1998. Granted, it has a 10k RPM SCSI drive (which it was designed for) and 1GB of RAM in a dual-channel setup; the system was spec'
Re: (Score:2)
Seriously, have you seen the dust and grime a computer accumulates after 10 years??? I'd want to replace the thing just because it's really ugly and disgusting by that point.
Re: (Score:2)
Increasing expectations, not hardware burnout. (Score:4, Informative)
A computer that does some task today should -- assuming it wasn't designed to be flawed or have a fixed life expectancy from the very beginning -- still be capable of doing that task in ten years. And for the most part I think this is true; it will.
Most computers that are 10 years old still run fine today (ones that were well-made in the first place); the problem is more one of finding a purpose for them, and then finding software to run on them, than getting them to start. Actually, I would wager that lots of computers that are 20+ years old would still run fine today, depending on how they've been stored and taken care of in the interim.
The problem isn't that machines really "wear out" all that quickly; with some exceptions few do. It's more the relentless drive of increasing expectations that puts working equipment in the landfill. At least for home users; commercial users have their support contracts to worry about, so it's slightly more complicated.
Case in point: I have an Apple IIc in my closet right now, which I know for a fact works fine. I could take it out tomorrow, set it on my desk, put in Apple Write, fire it up and start typing away. Somewhere around here I even have a dot-matrix serial printer that I could use to output from it. Everything that Apple advertised that computer as capable of doing, it is just as capable of doing today as it was twenty-one years ago. So why am I not using it? Why am I sitting here with a computer that's only four years old, when I have a perfectly functional computer from 1984 in my closet? It's not because I like spending money. It's because I want to do things that I can't do on an old computer. There are a lot of things that I consider necessities, or at least things that are nice enough to have that I'm willing to pay for them, that weren't possible or even considered more than a few years ago.
If you honestly think that what you can do with a computer today is all you're ever going to want to do -- that you won't see some neat feature on your friend's box in 2014 and decide that you need to have it -- then you're absolutely correct; the computer you have now is the last one you ought to ever have to buy. Realistically though, most people aren't like this; they know that the computer they have today isn't going to be something they're going to want in five or ten years, and they're not willing to pay for a machine that's built to last longer than that.
The things that people use home computers for have changed, and will continue to change, and the tasks that people want to use their computers for will drive the upgrade cycle far faster than the breakdown rate of the components does.
Re: (Score:2)
Which is to say it'll be completely useless next to a current model. In fact it's quite likely that just the extra power budget for keeping your current machine running will outstrip the cost of changing to a more modern machine with more power.
If your current computer uses 300W, then that is roughly 2,600 kWh for a year (assuming it runs 24/7, which most folding@home machines do).
In 10 years it's a given that y
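The parent's energy figure is easy to check; the electricity price below is a made-up example rate, not anything from the article:

    # Annual energy for a machine folding 24/7 (example rate, adjust for your utility).
    power_watts = 300.0
    hours_per_year = 24 * 365                             # 8760
    kwh_per_year = power_watts * hours_per_year / 1000.0  # ~2628 kWh
    price_per_kwh = 0.10                                  # assumed $/kWh, purely illustrative
    print("%.0f kWh/year, about $%.0f" % (kwh_per_year, kwh_per_year * price_per_kwh))
    # -> 2628 kWh/year, about $263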
Re: (Score:2)
"Which is to say it'll be completely useless next to a current model."
I guess it all depends on your expectations. I had a Sun Ultra 5 (from 1998), until I sold it a few months ago. If it hadn't had problems with the video card, I might still have used it for another two years, at least. It did everything I needed. Now I'm running a VIA EPIA at 533 MHz, which has a better video card and m
Re: (Score:2)
Re: (Score:2)
Security Risk? Nope, much safer than games (Score:4, Insightful)
Most of the distributed-computation projects have a very simple communication model - use HTTP to download a chunk of numbers that need crunching, crunch on them for a long time, and use HTTP (PUT or equivalent) to upload the results for that chunk, etc. Works fine through a corporate firewall, and the only significant tracking it's doing is to keep track of the chunks you've worked on for speed/reliability predictions and for the social-network team karma that helps attract participants.
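A minimal sketch of that fetch/crunch/upload loop (the URLs, field handling, and crunch step are all hypothetical; the real Folding@Home protocol and endpoints are different):

    import time
    import urllib.request

    # Hypothetical endpoints -- illustrative only, not the real Folding@Home servers.
    WORK_URL = "http://example.org/work/next"
    RESULT_URL = "http://example.org/work/result"

    def crunch(chunk):
        # Placeholder for the long-running number crunching on one work unit.
        return chunk[::-1]

    while True:
        # Plain HTTP GET to fetch a chunk of work; fits through most corporate firewalls.
        chunk = urllib.request.urlopen(WORK_URL).read()
        result = crunch(chunk)
        # Upload the finished result with an ordinary HTTP request (PUT or equivalent).
        req = urllib.request.Request(RESULT_URL, data=result, method="PUT")
        urllib.request.urlopen(req)
        time.sleep(1)  # be polite to the server between work units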
Online games normally have a much more complex communications model - you've got real-time issues, they often want their own holes punched in firewalls, there's user-to-user communication, some of which may involve arbitrary file transfer, and many of the games are effectively a peer-to-peer application server as opposed to the simple client-server model that distributed-computation runs. Fortunately, gamers would never use third-party add-on software to hack their game performance, or share audited-for-malware-safety programs with their buddies, or "share" malware with their rivals, or run DOS or DDOS attacks against other gamers that pissed them off for some reason.....
As far as the effects of running a CPU or GPU at high utilization go, most big problems will show up as temperature, though there may be some subtle effects like RAM-hogging number-crunchers causing your system to page out to disk more often. Not usually a big worry if you're running a temperature monitor to make sure your machine doesn't overheat. Laptop batteries are an entirely separate problem - you really really don't want to be running this sort of application on a laptop on battery power. I used to run the Great Internet Mersenne Prime Search when I was commuting by train, and not only did it suck down battery, the extra discharge/recharge cycles really beat up a couple of rounds of NiMH battery packs. Oh - you're also contributing to Global Warming and to the Heat Death of the Universe. But finding cures for major diseases is certainly a reasonable tradeoff, and we'll do that faster if you're using your GPU as opposed to 10 people using general-purpose CPUs.
Probably poor QC (Score:3, Interesting)
I've had more than a few crappy machines that I've run at 100% utilization for months or in one case years on end, without catastrophic failures, so I don't think that any consumer machine is "not built for constant processor work." I suspect that there is a higher rate of manufacturing defects in el cheapo consumer machines versus higher-end ones because of more lax quality control, but I don't think they're designed that poorly with ce
Re: (Score:2, Insightful)
Good use of my GPU when idle... (Score:3, Funny)
Re:Good use of my GPU when idle... (Score:4, Informative)
1) There is no Linux GPU client (yet)
2) Many gamers who use Linux have gone nVidia due to driver support. There is no nVidia client (yet)
Wouldn't this be folding at single precision only? (Score:2)
I doubt the GPU can do IEEE double precision floating point.
Is 32-bit precision enough for a scientific application like protein folding?
Is the entire algorithm of folding a big approximation anyway?
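For a feel of what 32-bit precision costs, here is a tiny and admittedly artificial accumulation example in NumPy; it has nothing to do with the actual folding code, it just shows how error builds up faster in single precision:

    import numpy as np

    # Sum one million copies of 0.1 in single and double precision.
    values64 = np.full(1_000_000, 0.1, dtype=np.float64)
    values32 = values64.astype(np.float32)

    exact = 100_000.0
    err32 = abs(float(values32.sum(dtype=np.float32)) - exact)
    err64 = abs(float(values64.sum(dtype=np.float64)) - exact)

    print("single-precision error:", err32)   # noticeably larger
    print("double-precision error:", err64)   # essentially zero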
Re:Wouldn't this be folding at single precision on (Score:2)
I would also think that GPUs can handle doubles as well as floats. But again, that could be pure nonsense. I am not familiar enough with the low-level operations of a video card.
Re: (Score:2)
Re: (Score:3, Informative)
One problem with floating point is that it risks being unrepeatable. If you don't carefully define the terms of rounding, you'll have two different machines arrive at different results on the same calculation. But as long as you pick a standard (e.g. IEEE 754), your results are repeatable. N
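A tiny illustration of the repeatability point: floating-point addition is not associative, so the same numbers summed in a different order can give a different answer unless everyone agrees on the evaluation order and rounding. This is generic Python, not anything from the Folding@Home code:

    a, b, c = 1e16, -1e16, 1.0

    left_to_right = (a + b) + c   # 1.0
    right_to_left = a + (b + c)   # 0.0 -- the 1.0 is absorbed when added to -1e16

    print(left_to_right == right_to_left)   # False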
Re: (Score:1, Insightful)
"Fairly"?
Re: (Score:3, Funny)
You're mostly wrong - Numerical Math is Hard :-) (Score:2)
Re:Wouldn't this be folding at single precision on (Score:2)
Single precision is fine (Score:2)
Re: (Score:1)
Re: (Score:1)
This is impressive, but... (Score:1)
While an SSE unit can process a couple of 64-bit double-precision numbers at a time (and x87 offers 80-bit extended precision), a GPU can process vectors of hundreds of numbers, but is limited to single 32-bit precision.
Most of the established CPU-demanding scientific applications need double precision. Only a few problems are well suited to lower precision.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
This doesn't make GPUs capable of double-precision arithmetic, and doesn't mean they will replace CPUs. But it can be used to expand the number of algorithms where the vast "arithmetic density advantag
Folding@home versus Grid.org (Score:2, Interesting)
Check out http://www.chem.ox.ac.uk/curecancer.html [ox.ac.uk] and decide for yourself. Personally, I don't see direct value/benefit to the folding@home project. I understand that knowing about misfolding is important for certain diseases and maybe even cancers
Re:Folding@home versus Grid.org (Score:5, Funny)
Re:Folding@home versus Grid.org (Score:4, Insightful)
Re: (Score:3, Insightful)
Re: (Score:2)
Folding@home doesn't work with the US DoD (Score:2)
Two words: closed architecture (Score:5, Insightful)
"With help from ATI, the Folding@Home team has created a version of their client that can utilize ATI's X19xx GPUs with very impressive results."
And therein lies the rub. While GPUs are getting more and more like general-purpose vector floating point units, they remain closed architectures, unlike CPUs. Only those that can get help from ATI (or Nvidia) need apply to this game.
Re:Two words: closed architecture (Score:5, Informative)
What's most likely is that the guys at Stanford started pushing the hardware to the limit, and in ways the driver developers might not have anticipated. Probably what they ran up against was bugs in the driver, and the help came from ATI in terms of ways to work around the bugs. Evidence from Folding@Home's GPU FAQ backs this up:
[You must use] Catalyst driver version 6.5 or version 6.10, but not any other versions: 6.6 and 6.7 will work, but at a major performance hit; 6.8 and 6.9 will not work at all.
Your next question might be, if that's true then why use ATI (who are known for poor driver quality)... it might simply be that that's the hardware they had to test with, so that's what they needed to use.
At any rate, it's definitely possible to get started doing GPU programming without vendor support.
There are even some APIs out there to help... The Brook C API (for stream/data-parallel programming) has a GPU version out called BrookGPU: http://graphics.stanford.edu/projects/brookgpu/in
There's even a fairly large community of people using Nvidia's own Cg library for doing general purpose stuff.
There's also GPUSort (source code available to look at), which is a high performance sorting example that uses the GPU to do the sorting, and it trounces the fastest CPUs: http://gamma.cs.unc.edu/GPUSORT/results.html [unc.edu]
And last but not least there's the GPGPU site that is a great resource for all sorts of general-purpose computing on GPUs: http://www.gpgpu.org/ [gpgpu.org]
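To give a flavor of the stream-programming model Brook exposes (expressed here in plain Python purely as an analogy; real BrookGPU kernels are written in a C-like language and compiled to run on the graphics card):

    # The GPGPU idea in miniature: write a small "kernel" that works on one element,
    # then let the runtime apply it to every element of a large stream.
    def saxpy_kernel(a, x, y):
        # One element's worth of work: a*x + y
        return a * x + y

    # On a GPU the streams would live in video memory and the kernel would run
    # across dozens of shader units at once; here we just map it sequentially.
    a = 2.0
    xs = [float(i) for i in range(1000)]
    ys = [1.0] * 1000
    result = [saxpy_kernel(a, x, y) for x, y in zip(xs, ys)]

    print(result[:5])   # [1.0, 3.0, 5.0, 7.0, 9.0]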
Missed the point of "Closed" (Score:1, Interesting)
Re: (Score:2)
How Common is this card? (Score:2)
So how common are the ATI x1900
Re: (Score:2)
I saw a nice presentation at the IABEM conference in Graz this Summer from a researcher writing BEM-based Laplace-equation solvers on GPU units.
GPGPU for BEM -- By T. TAKAHASHI
http://www.igte.tugraz.at/guest/iabem2006/printpdf
There are some links, and even some code in that paper (PDF)
Essentially all he had to do was map his mathematic
People have wanted to do this for years (Score:1, Interesting)
Re: (Score:2)
Time to tune into the new century. SETI@home has been available under the GPL for several years now. Nothing prevents you from modifying it and using the modified version.
I keep asking for people to send me processor specific optimizations and so far only a Mac/PPC version has shown up. I'm ending up writing the SSE version mysel
Only X1900s? (Score:2)
Re: (Score:2)
It's got a broken CRTC, no red channel in the output, plus the image is smeared. If it worked (without causing blinding headaches after 5 minutes) I wouldn't be using this MX400
Mac Support (Score:2)
Re: (Score:2)
GPU code samples? (Score:2)
Re: (Score:1)
Esoteric high-end GPUs are sexy (Score:1)
It's ok, I didn't want to help cure cancer anyway.
Re: (Score:1)
Re: (Score:1)
Buy a real pc.. (Score:2)
say ati? I say drivers! (Score:1)
go figure...
Re: (Score:1)
Re: (Score:2)
Win/Win (Score:1)
Folded Down Data (Score:2)
charitable donations (Score:4, Funny)
Mod parent funny or slimy (Score:2)
Re: (Score:2)
Tech questions (Score:2)
Basically, can I pick up a cheap X1900 and whack it into my PC jus
Re: (Score:2)
Good for ATI (Score:4, Interesting)
If you think about that, it says something about us that I think is important; people want to help and they're willing to spend their money to be helpful.
The concept of voluntary grid computing is a curious one. Why do people do this? Surely one more little CPU grinding away at a huge problem won't make a difference. Yet even though we all know this, we do it anyway. The result of this collective hopefulness and helpfulness is tangible. But what else is strange is that so little notice is given to grid computing. I don't recall hearing about it on CNN or any other news television program. SETI gets air time because it's so, well, 'out there', but the folding, aids, cancer/find-a-drug stuff is operating in obscurity.
BTW, kudos to Slashdot for helping get the word out. I first heard about grid computing here.
Reliability of the GPU? (Score:2)
Will things like this affect the outcome of the calculations, and give bad results? While an overheated CPU usually crashes and burns long before it can submit bad data, I am worried that overheating GPUs might give bad data that isn't obviously wrong.
protein folding 101 (Score:2)
The solution involves resolving two major problems; a toy sketch of both follows after the list.
1) For any given two folds, identify which one is the "real" one, or closer to the "real" one. This is done by assessing the quality of the fold using a scoring function, for example Gibbs energy (a more physical approach) or how well the fold resembles one of the naturally existing folds (an empirical approach).
2) One has to be able to quickly review myriads of different folds, preferably bypass
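The toy sketch mentioned above: score candidate folds with a made-up stand-in for a real scoring function, then churn through many of them and keep the best. Nothing here resembles a real Gibbs-energy or knowledge-based potential; it only illustrates the shape of the search:

    import random

    def score(fold):
        # Stand-in for a real scoring function (e.g. an estimate of Gibbs free energy).
        # Here a "fold" is just a list of backbone angles and lower is better.
        return sum(angle * angle for angle in fold)

    def random_fold(length):
        return [random.uniform(-3.14, 3.14) for _ in range(length)]

    # Point 2 in miniature: churn through many candidate folds, keep the best-scoring one.
    candidates = (random_fold(100) for _ in range(10_000))
    best = min(candidates, key=score)
    print("best score found:", score(best))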
Re: (Score:2)
Re:Am I the only idiot? (Score:5, Informative)
Re: (Score:2)
We don't have many shortcuts (yet) for figuring out what shape a 1-dimensional strand of peptides will take when correctly folded in 3-dimensional space. What we can do is position all of the thousands of bonds in one particular configuration, then calculate the overall energy of
Re: (Score:2)
Re: (Score:2)
Keeping them separate just makes sense from an economic standpoint right now. Maybe in the long term future we'll see a merging like we did with the FPU, but right now it just isn't in the cards.
Re: (Score:2)
The X19xx XT(X) core runs at ~650MHz and has 48 pixel shader units. It's a bit like having 48 650MHz CPUs at your disposal (0.65GHz * 48 = ~31GHz, but clock speed is not everything). However, the GPU is highly specialized while the CPU is more flexible, which means that for some jobs the GPU is far better, and for others the CPU is best.
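The parent's back-of-the-envelope number, spelled out; this is a very crude measure, since shader units and a general-purpose CPU core are not doing comparable work per clock, and the CPU clock below is an assumed figure for illustration only:

    # Crude aggregate-clock comparison between an X1900-class GPU and a single-core CPU.
    gpu_clock_ghz = 0.65
    gpu_shader_units = 48
    cpu_clock_ghz = 2.4          # assumed single-core CPU clock, illustrative only

    gpu_aggregate = gpu_clock_ghz * gpu_shader_units   # ~31 "GHz" of aggregate clock
    print(gpu_aggregate, "vs", cpu_clock_ghz)
    print("ratio: %.0fx" % (gpu_aggregate / cpu_clock_ghz))   # ~13x, counting clocks only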