Folding@Home Releases GPU Client 177
SB_SamuraiSam writes, "Today the Folding@Home Group at Stanford University released a client (download here) that allows participants to fold on their ATI 19xx series R580-core graphics cards. AnandTech reports, 'With help from ATI, the Folding@Home team has created a version of their client that can utilize ATI's X19xx GPUs with very impressive results. While we do not have the client in our hands quite yet, as it will not be released until Monday, the Folding@Home team is saying that the GPU-accelerated client is 20 to 40 times faster than their clients just using the CPU.'"
good, I think... (Score:3, Insightful)
Are either of my worries valid? Can it damage the card (or speed up its death), and what's the probability of a security threat?
Re:Power usage? (Score:2, Insightful)
Two words: closed architecture (Score:5, Insightful)
"With help from ATI, the Folding@Home team has created a version of their client that can utilize ATI's X19xx GPUs with very impressive results."
And therein lies the rub. While GPUs are getting more and more like general-purpose vector floating-point units, they remain closed architectures, unlike CPUs. Only those who can get help from ATI (or Nvidia) need apply to this game.
Re:Folding@home versus Grid.org (Score:4, Insightful)
Re:good, I think... (Score:3, Insightful)
Face it, computers are one of the fastest-changing technologies. Intel plans to have some ridiculous hardware in only 5 years. 80-core CPUs? Crazy. If you think your current dual dual-core setup (I'm assuming you have the best PC possible to back up that 10-year statement) will be able to handle what an 80-core doesn't blink at, you're crazy. It's going to have approximately 20 times the power, assuming no other advances in speed.
No, that logic works great for cars and toasters, but computers just change too much.
Re:Wouldn't this be folding at single precision on (Score:1, Insightful)
"Fairly"?
Re:Folding@home versus Grid.org (Score:3, Insightful)
Re:good, I think... (Score:2, Insightful)
I hope I just missed your <sarcasm> ... </sarcasm> tags.
Ever hear of Moore's Law?
wikipedia: Moore's Law [wikipedia.org]
Transistor density has been doubling every 24 months (I recall it being quoted as 18 months, but we would be arguing semantics) for as long as I can remember. In 10 years, that's 2**5, or 32 times denser than it is right now. And you think the computer you have now will run anything remotely close to what is running then? You won't even be able to load the operating system.
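The doubling arithmetic above works out like this in a few lines of Python (a back-of-the-envelope sketch of the commenter's numbers, not a claim about actual process nodes):

```python
# Moore's Law, as quoted above: density doubles roughly every 24 months.
years = 10
doubling_period_years = 2

doublings = years // doubling_period_years   # 5 doublings in 10 years
density_factor = 2 ** doublings              # 2**5 = 32x denser

print(doublings, density_factor)
```

With the oft-quoted 18-month period instead, 10 years gives about 6.7 doublings, or roughly a 100x increase — which is the "arguing semantics" the commenter waves off.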
Think back 10 years ago (1996), what was the "hot computer" filled with? The original Pentium, probably running at a blazing 66MHz?
Next let's quote the line commonly (if apocryphally) attributed to Bill Gates: "640K should be enough for anyone."
Security Risk? Nope, much safer than games (Score:4, Insightful)
Most of the distributed-computation projects have a very simple communication model - use HTTP to download a chunk of numbers that need crunching, crunch on them for a long time, and use HTTP (PUT or equivalent) to upload the results for that chunk, etc. Works fine through a corporate firewall, and the only significant tracking it's doing is to keep track of the chunks you've worked on for speed/reliability predictions and for the social-network team karma that helps attract participants.
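That fetch/crunch/upload cycle can be sketched in a few lines of Python. The endpoint URLs, payload format, and the "computation" itself are all hypothetical stand-ins — this is the general shape of the loop, not the actual Folding@Home protocol:

```python
# Hedged sketch of a distributed-computation work loop:
# GET a chunk over HTTP, crunch it locally, PUT the result back.
# WORK_SERVER and the byte-sum "crunch" are illustrative assumptions.
import urllib.request

WORK_SERVER = "http://example.org/work"  # hypothetical endpoint


def fetch_chunk() -> bytes:
    """Download the next chunk of numbers to crunch."""
    with urllib.request.urlopen(WORK_SERVER + "/next") as resp:
        return resp.read()


def crunch(chunk: bytes) -> bytes:
    """Stand-in for the long-running computation: sum the bytes."""
    return str(sum(chunk)).encode()


def upload_result(result: bytes) -> None:
    """HTTP PUT the finished result back to the project server."""
    req = urllib.request.Request(
        WORK_SERVER + "/result", data=result, method="PUT"
    )
    urllib.request.urlopen(req)
```

Everything rides on plain outbound HTTP, which is why this model sails through corporate firewalls that would block a game's inbound, peer-to-peer traffic.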
Online games normally have a much more complex communications model - you've got real-time issues, they often want their own holes punched in firewalls, there's user-to-user communication, some of which may involve arbitrary file transfer, and many of the games are effectively a peer-to-peer application server as opposed to the simple client-server model that distributed computation runs. Fortunately, gamers would never use third-party add-on software to hack their game performance, or share audited-for-malware-safety programs with their buddies, or "share" malware with their rivals, or run DoS or DDoS attacks against other gamers that pissed them off for some reason.....
As far as the effects of running a CPU or GPU at high utilization go, most big problems will show up as temperature, though there may be some subtle effects like RAM-hogging number-crunchers causing your system to page out to disk more often. Not usually a big worry if you're running a temperature monitor to make sure your machine doesn't overheat. Laptop batteries are an entirely separate problem - you really really don't want to be running this sort of application on a laptop on battery power. I used to run the Great Internet Mersenne Prime Search when I was commuting by train, and not only did it suck down battery, the extra discharge/recharge cycles really beat up a couple of rounds of NiMH battery packs. Oh - you're also contributing to Global Warming and to the Heat Death of the Universe. But finding cures for major diseases is certainly a reasonable tradeoff, and we'll do that faster if you're using your GPU as opposed to 10 people using general-purpose CPUs.
Re:Probably poor QC (Score:2, Insightful)
We uninstalled the client soon after that, as the number of 'spare' PCs was quickly running out =)
The 'peak' clearly registered, but sadly didn't last long. Ahh, the days =)
Anyway, what I wanted to say was: although ALL the PCs were identical and more or less the same age, some were perfectly capable of running at 100% load 24/7, others weren't. In this case it was mainly due to maintenance (or lack thereof). But I'm sure things like ambient temperature, location (e.g. hidden in a closed corner under your desk), environment (e.g. furry creatures hugging the box all the time) etc. can have a much more serious effect on the hardware than the laboratory tests they do during "QC" when the machine leaves the factory.