IBM Constructs New Fastest Computer 188
scoobysnack writes "MSNBC is reporting that IBM has once again created the world's fastest computer -- it will be used for simulating real-world nuclear tests. With 12 teraflops it would still take it 3 months to simulate the first 1/100th of a second of a nuclear bomb explosion." There's coverage at CNET as well.
Re:What Else Can it Simulate? (Score:1)
So, the answer is no.
simulating nuclear explosions (Score:1)
I don't understand why people bother simulating nuclear reactions. Now, before you think I'm being facetious, let me explain. Nuclear physics is hard (as if you needed me to tell you that). Most of the theories as to how the fundamental interactions work are flawed. For example, the liquid drop [yale.edu] type models (on which most current simulations are based) are incredibly simplistic. They don't even take into account the Pauli exclusion principle, instead relying on a fudge factor to ensure that their particles follow the Fermi-Dirac [wolfram.com] distribution.
Thus, my argument is: you're better off doing the experiment.
The scary thing (Score:1)
Re:Research is GOOD... (Score:1)
Very worthy, seeing as the Pentium 6's and the K9's are gonna need a fusion power supply just to boot, and we won't even talk about those in SMP...
bash: ispell: command not found
Re:Nuclear simulations? Is that it? (Score:1)
The "old" systems, though they aren't the most powerful, are still used every day (and every night, too) by a wide variety of projects. There is a heck of a lot of research going on around the clock at LLNL.
When they mention the nuclear simulations, that may be the biggest problem that they tackle, but there is no shortage of smaller (relatively speaking) projects which need CPU cycles on a regular basis.
Check out http://www.llnl.gov/llnl/06news/feature-techno.html [llnl.gov] for all the cool non-nuclear stuff they do at Livermore.
Nate
...and hi to Brooke if you're out there at the lab again. How's the rice?
Re:Nuclear simulations? Is that it? (Score:1)
Q: Why are such huge computers needed for bomb simulation, fusion power research, etc.?
A: For each particle that you want to simulate you need nine dimensions to specify where it is and where it is going: three for position, three for linear momentum, and three for angular momentum. Each dimension takes roughly one floating-point operation to update (oversimplified, but it gets the point across), so each particle requires nine FLOPs. A 12-teraflop machine can therefore keep track of ~1.33E12 particles. This may seem like a lot of particles to keep track of, but consider things like Avogadro's number...
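A minimal sketch of that back-of-the-envelope arithmetic (the nine-values-per-particle figure and the 12-teraflop rating come from the comment above; Avogadro's number is included only for scale):

# Back-of-the-envelope arithmetic from the comment above; the nine-values-per-
# particle figure and the 12-teraflop rating are the commenter's, the rest is
# just illustration.
machine_flops = 12e12              # 12 teraflops
values_per_particle = 9            # 3 position + 3 linear momentum + 3 angular momentum
particles_per_second = machine_flops / values_per_particle
print(f"~{particles_per_second:.2e} particle updates per second")   # ~1.33e+12

avogadro = 6.022e23                # particles in one mole, for scale
print(f"that is ~{particles_per_second / avogadro:.0e} of a mole per second")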
A rough example of how intense this is: this new machine could do a decent job simulating a mid-sized tokamak fusion facility, if you use the quick and somewhat inaccurate way of simulating.
If you were to take a look at the top 500 most powerful machines you would find that most of the ones at the very top are either doing nuclear weapons testing or fusion power research.
Oh, in case you are wondering, the government changed one of their supercomputers that was doing weapons simulation over to traffic simulation last year. Traffic simulations also require several dimensions per car, again, three for position
Re:Nuclear simulations? Is that it? (Score:1)
It is possible for unclassified simulations to be run on the machine with a Q-cleared "proxy" user running the code on behalf of someone else - but in this day and age of ultra-tight security in the wake of Wen Ho Lee and missing disk drives - that is highly unlikely to happen.
However, there is a mighty impressive machine available on the "open side" which will be used for things like weather sim, drug design, etc... In this case it will be the machine ("ASCI Blue") which is currently behind the fence, which will be moved outside. 5000+ PowerPC 604e's, and not a bad parallel environment to work in, I must say...
The ASCI program is also funding 5 university ASCI centers. These centers are targeted with solving unclassified "grand challenge" type problems which involve complexities similar to nuclear simulations, i.e. solid rocket motor simulations (Illinois), astrophysics (U Chicago), accidental fire scenarios (Utah), turbulence (Stanford), and material modeling/response (Caltech). These centers get time (or "fight for time" if you asked them) on the unclassified ASCI machines.
The unclassified machines are usually just one step behind, or slightly smaller, so they don't make splashy headlines. They are, however, still very very impressive machines, and there is lots of groundbreaking research being done on them which would probably not have been possible (yet) without ASCI
I (and others) don't particularly like the fact that these machines are used mostly for nuclear simulations - but it's better than the alternative (craters in Nevada), and it's definitely helping push the envelope of parallel computing - which is all I really care about. :-)
Los Alamos has a similar contract with SGI to supply large machines for them. Sandia has a large machine from Intel, and has subsequently been concentrating on massive Linux clusters (a la CPLANT) as its future.
More information here [llnl.gov]
Re:Nuclear simulation (Score:1)
Walt
Metaphors OK, apparently (Score:1)
Personally what bugs me most is people who don't understand the meaning of a figure of speech, and get it the wrong way round. Like "I could care less"
Oh, and the use of literally to mean exactly the opposite - like when people say "He literally put his foot in his mouth"
Gravity computer (Score:1)
Re:The scary thing (Score:2)
So, simple math being your friend, let's assume that we're talking June, July, and August. That's 92 days. Each day has 86,400 seconds. So that gives a total of 7,948,800 seconds in June, July, and August. Now, this amount of time only simulates the first .01 second. So we need to multiply by 100, yielding 794,880,000. This means that the computer needs to be 794,880,000 times more powerful to simulate the first .01 second of a nuclear explosion in .01 second.
Multiplying this figure by 12 (they quoted a peak of 12.3 teraflops, but it probably can't sustain that all the time) gives 9,538,560,000 teraflops: the capability a computer would need to model the first .01 second of a nuclear explosion in .01 second. And the complexity increases from there.
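For anyone who wants to check the figures, here is the same calculation as a tiny Python sketch (all the numbers are the parent comment's):

# The arithmetic above, spelled out; every figure is taken from the parent
# comment (92 days, 0.01 s simulated, 12 of the 12.3 peak teraflops).
seconds_in_three_months = 92 * 86_400            # 7,948,800
speedup_needed = seconds_in_three_months / 0.01  # 794,880,000x faster for real time
required_teraflops = speedup_needed * 12         # 9,538,560,000 teraflops
print(f"{speedup_needed:,.0f}x speedup -> {required_teraflops:,.0f} teraflops")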
Damn.
Hey wouldn't it cool to have a Beowulf ... Oh. (Score:1)
What's with this +1 bonus stuff anyway? Who asked for it and why do I have to explicitly turn it off?
Re:Nuclear simulations? Is that it? (Score:1)
In this application, however, this is just not a possibility. Consider the simulations this computer will be used for: nanosecond-scale simulations of the initial criticality of a chunk of uranium or plutonium. This is just not something that can be broken up into chunks and handed out for processing. For the simulation to be even vaguely useful, each atom or particle of fissionable material simulated must interact with all of the other atoms in the simulation. Thus, if someone were given a chunk of, say, a million atoms/particles/lattice points to process, their sub-simulation would need to communicate continuously with all of the other processors' sub-simulations. Even if we look at the system conservatively, and say that each particle will only be affected by the particles closest to it, this still requires the exchange of vast quantities of information at each step. And that's not even taking into account the effects of simulating the neutrons and other sub-atomic particles zipping around. The processors would have to have huge amounts of data transfer capability to make this feasible.
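As a rough illustration of why that data transfer matters, here is a toy estimate of the per-step "halo" traffic for one node under a nearest-neighbour decomposition; the million-point chunk comes from the comment above, while the cubic-chunk shape and the nine 8-byte values per point are assumptions made purely for scale:

# Toy estimate only: assumes a cubic chunk and nine 8-byte values per point.
points_per_node = 1_000_000
side = round(points_per_node ** (1 / 3))       # ~100 points per edge
surface_points = 6 * side * side               # points neighbours must see each step
bytes_per_point = 9 * 8                        # 9 double-precision values
halo_bytes = surface_points * bytes_per_point
print(f"~{halo_bytes / 1e6:.1f} MB exchanged per node, per time step")   # ~4.3 MB
# Multiply by thousands of nodes and millions of femtosecond-scale steps and
# the interconnect, not the CPUs, becomes the limiting factor.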
Of course, if I misread your intent, and you're actually suggesting that every citizen should be given these huge amounts of bandwidth to enable the distributed processing, I'm behind you all the way.
Do WHAT?. (Score:1)
Are you nuts or what?
The reason we have an international ban on nuclear reactions in space is so that we don't destroy not just ourselves, but the rest of our known universe.
If a nuke was set off in near-vacuum space, the chain reaction would still occur, but at huge distances - I'm not sure of the mechanics between explosions in space and bodies of large mass (which will have an awful lot to do with type and size of atmosphere, if any), but there is absolutely no reason to assume that chain reactions among the far more dispersed matter of near-vacuum will not be absolutely catastrophic.
Feel free to correct me on this, but isn't there good reason to suggest that, based on reactions that happen on Earth, the chain reaction in space could basically rip through all known near-vacuum, impacting every large-matter body in space as well?
In other words, wouldn't it be like, instead of one point nuke within Earth's atmosphere, simultaneous nuking of the whole outer atmosphere of every planet or star or comet everywhere?
Since I am relying on memory posting this, I will research further over the weekend - in the meantime, feel free to reply candidly.
Re:Nuclear simulations? Is that it? (Score:1)
I think when ASCI Red was unveiled, they allowed for up to 50% of the CPU time to be used for non-ASCI research projects.
The top of the line supercomputers are listed in TOP500 Supercomputing Sites latest list [top500.org]. Clearly there are more than ASCI Nuclear Research computers. I don't remember Slashdot reporting installation of the two new non-ASCI teraflop computers, though. Perhaps only the best is of interest to the slashdot crowd.
Ever tried to run an SP cluster? (Score:1)
Maybe the SP software has come a long way since I last saw it, but what I saw tells me that using the box for anything other than one application that will benefit from massively parallel computing is a waste of money. IBM hypes the massive parallel potential for a good reason, but the sales droids will sell it for applications that just don't work out.
Which is a pity. The hardware is nice.
Re:Why use it for weapons development? (Score:2)
important problem? (Score:1)
"They're trying to solve a very important problem there, "Josephs said, adding he thought the $110 million price tag for ASCI White was actually pretty reasonable. "The only other way to do this would be to take old nuclear weapons and blow them up. This really is a bargain."
Hold on, because there's no other way to test it, it becomes an important problem? I mean, unless someone plans on shooting off a nuclear missile, it's not that important. Which brings up the interesting question of who in the US is planning on shooting off a nuke? And if you argue that the US is testing to see what happens if someone nukes the US, then I don't see how these calculations are going to make a difference when a nuke is coming our way.
Re:What a waste... (Score:2)
Speed of Computers (Score:1)
Take, for example, how I was just at the PC Expo in NYC the other day looking at the 1GHz Athlon chips from AMD. Now, it's certainly quite impressive that these chips are running that fast, but how are they showing it off? A TV tuner card showing a video, a Windows machine doing nothing special, and someone playing a 3D car racing game. What does this show? a) how well your video card works with the TV tuner card, b) nothing, and c) how fast your 3D rendering card is.
What I WOULD like to see in future tests is how fast the Linux kernel compiles. That is, at least, how I test how fast a machine is. Of course, it also then takes into account how fast your disk is (interface and drive), but there is still the raw processing power. Granted, you could argue Gaussian algorithms, but that is FPU speed. I want to see the processor as a whole.
So, my proposition is: The new benchmark for Slashdot readers should be how fast it compiles the kernel with the default options :)
My two cents; no refunds.
--
Re:Nuclear simulations? Is that it? (Score:2)
How come almost every time there is a post about supercomputers, they are being used for nuclear bomb explosion simulations?
As usual, there's a good article over at The Bulletin of the Atomic Scientists [bullatomsci.org] on why there's so much government desire for bomb simulation.
We have treaties and treaties on why we can't test these devices "for real". Given the desire to upgrade them without "upgrading" them in a way that affects counts or treaties, there's currently a lot of interest in how to re-use existing designs and components in ways that give functionally new weapons, without being listed as such. Converting air-burst devices to near-surface burst devices turns town-killers into bunker-killers, but it doesn't have to appear as building new weapons or changing the type of existing ones.
Re:More tools for the USian nuclear weapons brigad (Score:1)
If it turns out that the expiration dates of the weapons are far enough in the future, no new weapons will have to be made for a while, BUT to verify the durability, some experiments will have to be made, and since real-world nuclear tests are politically sensitive (how's this for a euphemism?)...
Personally I wouldn't mind seeing all of those weapons passing their expiration dates, but spontaneous detonation would be rather nasty.... (though highly improbable, I guess)
Sander
Re:Why use it for weapons development? (Score:1)
Re:bogomips (Score:1)
Re:Nuclear simulations? Is that it? (Score:2)
By the way, the weather service did get themselves their own parallel computing cluster [noaa.gov] (running Linux, by the way). Incidentally, the progress made in simulating nuclear blasts carries directly over to astronomers who simulate supernovae.
Re:Do WHAT?. (Score:1)
I suggest that the main reason for this treaty is that nobody likes the idea of nuclear weapons in orbit pointed down. Hell, a kinetic harpoon is destructive enough at 11 km/s.
or did you miss the </sarcasm>?
Don't worry - it's not being scrapped (Score:1)
When the machine is ready for "general" use ("general" as long as you have a Q clearance!), then the plan is to move the current machine to the unclassified side, and open it up for use by the ASCI alliances [llnl.gov] and other unclassified users.
They should be able to simply add it on cluster style as you suggest, since the current machine on the unclassified side is basically the same architecture. I can't tell you for sure that it's what they'll do - but if they do, they should have about a 4-5 tflop machine for unclassified use by the end of the year.
As far as what will happen to ASCI White when they're done with it - it's only being rented from IBM - so it'll go back to Kingston or wherever...
--Rob
security (Score:1)
faster computer = better, faster cavewomen pics (Score:1)
Could you Imagine... (Score:2)
Seriously though. This is using the POWER3-II. Isn't the INSANELY fast POWER4 just around the corner? When will that sucker arrive?
Re:Distributed? (Score:2)
I believe the problem arises in the amount of shared data required between nodes... it's not like cracking a key where you can just chop the keyspace up into as many pieces as you like and work on them all separately... you have to have the entire dataset in order to work on it properly.
Re:More tools for the USian nuclear weapons brigad (Score:2)
They are ensuring that the bombs will go off reliably. The thinking is that if we have a working nuclear stockpile, enemies will think twice before attacking us. If our weapons get old and fail to work, the deterrence will be lost. How do we know if our old weapons will work? Blow one up for real, or simulate it.
I'd rather they simulate it.
Re:hmmm (Score:1)
---
hmmm (Score:1)
Imagine (Score:1)
Moderation (Score:2)
Re:ASCI *WHITE*?!? (Score:2)
ASCI Blue Pacific (Livermore) 1999
ASCI Blue Mountain (Los Alamos) 1998
And Red:
ASCI Red (Sandia) 1999
So, being American... it's time for White, yes?
Supercomputers aren't dead, they're fileserving (Score:2)
a couple of midrange Crays.
I assumed they were for graphics processing (that's what the Origins were for), but it turns out they were simply the fastest fileservers on the market at the time of purchase (an important thing if you're pushing around multi-GB files).
--
Re:What is the benefit here? (Score:2)
More kill for the buck.
Fastest Computer (Score:2)
Re:Nuclear simulations? Is that it? (Score:2)
--
Re:A new record (Score:1)
A new record (Score:4)
IBM's ASCII White Super Computer Unleashed
It's CmdrTaco coming down the stretch on New Fastest Computer, but here comes timothy on ASCII White, it's Taco, it's timothy, Taco, timothy....timothy by a nose!
--
Re:simulating nuclear explosions (Score:2)
Re:I'm Sick of Analogies! (Score:1)
Of course, they could have just said, "It's way bigger than a regular PC" but that wouldn't have helped
Compare to all of SETI@home... (Score:4)
Keywords: Quake 3, Kernel compilation, Beowulf, Toy Story 3 in realtime?
Fross
only the government buys top-end computers (Score:2)
Re:What's the point of supercomputers these days? (Score:2)
This is exactly what Distributed.net and Seti are doing, as far as sending a discrete block of data, having the client crunch it, and then returning the result.
Some computations require a network of interdependencies in the data set, such as large-scale simulations, which is what this computer will be doing. In such scenarios, there is no way to 'break up' the computational tasks into neat little discrete packets, since they are interdependent.
This requires a lot of very fast networking (ccNUMA, very large SMP, etc.) on hardware designed specifically for this type of task.
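A minimal sketch of that interdependence, using a one-dimensional toy grid (nothing to do with the actual ASCI codes, and the update rule is invented purely for illustration): each cell's next value needs its neighbours' current values, so splitting the grid across machines forces communication on every step.

# Toy 1-D diffusion-style update: every new value depends on both neighbours.
grid = [0.0] * 8 + [1.0] + [0.0] * 8           # a single blip in the middle
for step in range(3):
    grid = [
        grid[i] + 0.25 * (grid[i - 1] - 2 * grid[i] + grid[(i + 1) % len(grid)])
        for i in range(len(grid))
    ]
# If cells 0-8 lived on one machine and 9-16 on another, the values at the
# boundary would have to cross the network on every single iteration.
print(["%.3f" % v for v in grid])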
As a matter of fact, we do (Score:2)
As they age, though, you don't really know what'll happen. And that's why we have simulators. As others have pointed out, there's only 2 ways to know if a device that's been stockpiled for 15 years will work - simulate it or take it out to the desert. Now, since we can't exactly take one out back and set it off, we buy a bigass computer and simulate it.
Or would you prefer that one just randomly go off while sitting at the dock inside an Ohio-class submarine at Groton, CT? Or maybe, if we return to the 50s mindset of 24-hour alert for bomber crews fully loaded for WWIII, with the occasional scrambling to test readiness, that B-2 flying over Topeka hits some turbulence and levels half of the already-flat state of Kansas?
Re:hmmm (Score:2)
...phil
Research is GOOD... (Score:5)
You know, more goes on in a nuclear weapon than fission. If we can properly simulate the beginnings of FUSION, that could be an important step towards commercially viable fusion power plants! Cheap, clean, unlimited energy... worthy goal, I would think.
Additionally, even if the data from this box *IS* indeed ONLY ever applied towards nuclear weapons, that's still MUCH better than the alternative, which is to withdraw from the Test-Ban Treaty and start setting the things off for real again.
SIMULATING something is NOT morally equivalent to DOING that very thing. Otherwise, quite a lot of Quake, Carmageddon, and GTA players would be sitting in jail right now.
And, hell, even nuclear bombs, as they exist now, and as they could be refined, have potentially non-military uses. No, I'm NOT talking about Teller's harbor in Alaska, or the ridiculous scenarios in Deep Impact or Armageddon... Although nukes COULD be used for the noble purpose of deflecting incoming comets/asteroids. The implementation, as presented, just sucked.
Actually, what I'm talking about *WAS* mentioned in Deep Impact. I'm talking about the Orion drive. If we are ever smart enough to withdraw from the ridiculous treaties which prevent its deployment, Orion could be the answer to all of our short-term space exploration problems! Until we perfect fusion, it IS the most powerful drive system proposed for deployment. Imagine how FAST we could get to Mars, and how much equipment we could take along, if we used Orion rather than ridiculously inefficient chemical rockets!
Or, for the peaceniks out there... Wouldn't that be the ULTIMATE "swords into plowshares" situation? Imagine... the nuclear stockpiles of the world, ultimately directed not towards mutual annihilation, but towards the exploration of the final frontier!
We HAVE the way, all we need is the will.
john
Resistance is NOT futile!!!
Haiku:
I am not a drone.
Remove the collective if
Re:Research is GOOD... (Score:2)
On early projections I saw, oil was thought to be nearly depleted by 2002, and the first fusion power plant was expected to be operational by 2010. NIF creates low-level (and low amounts of) nuclear radiation, unlike fission. Both fission and fusion plants could be designed to be safe, though the current water method for fission is cheap so it's heavily used. To build either in the U.S. would be impossible, due to lobbying and lack of licensing.
Now, for the nuclear-based space issues, that's ridiculous. Current fusion has only begun to obtain more energy than it took. More importantly, NASA almost didn't launch a probe 1.5 years ago because it was feared that the nuclear material it was using (for power) could be caught in the atmosphere. If any space-bound vessel blew up and got caught in the atmosphere, the radiation would spread and cause severe global problems. Sending up radioactive materials is highly risky, so it is rarely ever done.
If you wish to learn more about NIF and nuclear energy, I recommend you look at LLNL.gov, which for years has had an excellent set of pages explaining ICF, and other details for visitors.
Re:Nuclear simulations? Is that it? (Score:2)
Add a couple hundred nodes, buy another switch or two, increase your flops by a couple of hundred g?
Johan
National Status Symbols (Score:4)
Nowadays, I guess the US is afraid that China will have better nuke simulators than the US, so we gotta beat 'em at it; it's "Keeping up with the Chins" all over again.
I'd rather see the funds go toward a modern super-collider but, pfft, I only pay 1/3 of my income to taxes, I don't have any real say in how it's going to be spent.
Re:Nuclear simulations? Is that it? (Score:2)
Re:Research is GOOD... (Score:2)
Re:Nuclear simulations? Is that it? (Score:2)
(I think. I'm actually making this up as I go along, so add salt to taste)
Now the problem is that since the entire simulation goes in lock-step, you limit the number of steps by not only computation speed, but also communication speed between the nodes.
I presume that there are smart approximative approaches that can be used to assuage this, but it remains the case that distribution and cellular simulation just don't go together.
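A toy way to see that limit, with the latency figure assumed purely for illustration:

# If every lock-step requires a round of messages, latency caps the step rate
# no matter how fast each node computes. The 10-microsecond figure is invented.
latency_s = 10e-6                 # assumed one-way inter-node latency
exchanges_per_step = 2            # send boundary data, then wait to receive it
max_steps_per_second = 1 / (latency_s * exchanges_per_step)
print(f"at best ~{max_steps_per_second:,.0f} lock-steps per second")   # ~50,000
# A blast simulation needing billions of sub-nanosecond steps is therefore
# communication-bound long before it is computation-bound.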
Johan
Re:Research is GOOD... (Score:2)
Re:Nuclear simulations? Is that it? (Score:2)
Add a couple hundred nodes, buy another switch or two, increase your flops by a couple of hundred g?
Any idea if that's what they did? None of the articles have said whether they just plugged 9 more teraflops worth of power into the existing 3 or put a whole new 12 tf system in. That would be interesting to know.
Kintanon
PSHAW! Call that hardware? (Score:2)
Re:Algorithmic complexity of a N particle problem? (Score:2)
Re:the difference between ASCI White & beowulf (Score:2)
Re:Nuclear simulation (Score:2)
Re:National Status Symbols (Score:3)
Re:Doing a little comparison... (Score:2)
IBM Supercomputers (Score:3)
Here's a link to an article, but it's a bit dated:
http://www.ibm.com/news/1999/12/06.phtml
=Blue(23)
Nuclear Explosions my arse... (Score:4)
We have built this giant computer to simulate Nuclear Explosions. Previously, we couldn't predict the outcome of a Nuclear Explosion. We did not know if it would kill a few million people, or a few billion. Until we had the ability to simulate it we couldn't be sure, and if we aren't sure, then we can't protect you. So please continue to send us more tax dollars to support the electric bill for our new Nuclear Explosion Simulator(TM) and we can continue to protect you. Also, it's good for children.
On an unrelated note, please feel free to update your PGP keys to the longest possible key length you can use, we believe you have every right to your privacy.
Yours Truly,
Big Brother
-----
On a more serious note, how much ass would we kick if we could get this badboy to join Team Slashdot over at distributed.net?
-Tommy
To All The People Who Think This Is A Waste. (Score:2)
If they had posted an article (or two) stating that the U.S. was going to resume above-ground testing, how would that make you feel?
This is the "alternative" that they are always talking about in those debates. So, quit whining or we're gonna have to make Nevada glow.
#VRML V2.0 utf8
Re:only the government buys top-end computers (Score:3)
There are a lot of industries that use these technologies - major car manufacturers (Ford, GM, etc.) use them for their designs, and to do crash test simulations. It actually saves them tons of money in the end - not having to build a bunch of prototypes and run them into walls :). Boeing's 777 was actually built straight out of the computer - no mock-up models or anything. Believe it or not, even Disney is actually a big supercomputer customer.
Have you been watching the news about the Human Genome lately? Those companies (and the gov't too) said that high-powered computers accelerated their research by a huge margin. You can bet that our future biotech industry will try to stay ahead by pushing the speeds of simulations.
You're right in that the divisions of the gov't are shelling out for the biggest computers, just don't ignore the business sector so easily. I think the word "fading" is inaccurate - "constant" or possibly "stagnant" might be a better description.
Quick, now's the chance. (Score:2)
Re:Nuclear simulations? Is that it? (Score:2)
OK, what I want to know is where did the old computer go? They had a 3.??? teraflop computer before. Now they have a 12.??? teraflop computer. What did they do with the 3? Scrap it? Give it to another branch of science? Sell it? Stick it in a warehouse? Why can't we take it and set it up in a big room and let every research facility around that wants time on it buy some? Or even just allocate X amount of time per month for each scientific institution and let them use it to further research. It would very much suck if they just threw the thing away....
Kintanon
Re:only the government buys top-end computers (Score:2)
Darn exponential curves 8^)
Re:Hey wouldn't it cool to have a Beowulf ... Oh. (Score:2)
Re:Gravity computer (Score:2)
Re:Why not do a SETI@home kind of project? (Score:2)
Re:Distributed? (Score:2)
A comparison is not really possible. d.net can do more raw operations per second if one just adds up the total power of all the machines involved. However, ASCI White has far better internode communication. For the purposes of doing highly parallelizable calculations like distributed FFTs on blocked data and brute-force sieving or key searches, d.net is probably faster. For the purposes of doing tasks like numerical linear algebra, nuclear/non-linear dynamics, weather forecasting, etc., ASCI White would be faster.
"How feasible is the distribution of such a computation? Are all the calculations similar, or would a lot of different computational code have to be written?":
It would essentially be impossible to distribute the computational task. Even if the initial value data could be distributed (and it can't; even simpler FEA simulations routinely have datasets exceeding 30 GB), the process would require so much internode communication that any d.net-type system operating over a heterogeneous, low-bandwidth, high-latency, public network like the Internet would never work.
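To get a crude feel for the gap, here is a sketch that only borrows the 30 GB dataset size from the comment above; the dial-up speed and the few-hundred-MB/s interconnect figure are assumptions for illustration:

# Time to ship a 30 GB initial dataset; both link speeds are rough assumptions.
dataset_bytes = 30e9
dialup_bytes_per_s = 56e3 / 8          # 56 kbit/s modem
switch_bytes_per_s = 500e6             # ~500 MB/s cluster interconnect (assumed)
print(f"dial-up: {dataset_bytes / dialup_bytes_per_s / 86400:.0f} days")      # ~50 days
print(f"cluster switch: {dataset_bytes / switch_bytes_per_s:.0f} seconds")    # ~60 s
# And that is just the initial distribution, before any per-step communication.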
"Are there any such systems already in place? Currently, I'm only aware of one "useful" system, and that's ProcessTree (damn, I lost my referral number). SETI@Home is arguably useful, depending on whether you believe there is extraterrestrial life that uses the same radio waves we're scanning and is sending signals we could interpret.":
Not for using distributed computing for these kinds of tasks.
Re:A new record (Score:2)
Haiku? (Score:4)
Draws 1 Point 2 Megawatts
The West Coast Goes Dim
Worst name. (Score:5)
Jeez, get some imagination ya nerds.
Hotnutz.com [hotnutz.com] - Funny
Re:More tools for the USian nuclear weapons brigad (Score:2)
how about more information like frame rate?
__________________________
Re:More tools for the USian nuclear weapons brigad (Score:2)
I believe it is time for us to reconsider the role of the military. After WWI (or was it WWII?) we renamed the War Department the Defense Department. This reflected a more peaceful mindset. Since we haven't had any defense of land and life in the US for over 50 years, it seems appropriate to rename it the Foreign Economic Interests Department, since the military is used as a pawn to secure American economic interests.
Of course, this is a separate matter again, but certainly there is something better to simulate than an explosion. What could they possibly seek to understand about it other than how to improve it - which is the last thing society needs.
Re:Don't they get tierd of newclear simulations? (Score:2)
good), but honestly--what else would you have them do? Play Quake at 50k frames/sec.? Crack crypto keys?
'Nuclear simulations' doesn't just mean figuring out how many people they can kill per megaton. It ties in plasma physics and a dozen other related fields, which tie in closely to astrophysics, i.e. how stars happen. Then there's the residual interesting bits, which can lead to advances in almost any random field. (Maybe not random, but indeterminate.)
Fundamentally, they're using tons of computing power to investigate stuff we _don't_know_ yet. They're not just cranking out bigger numbers faster, but looking in different directions.
To say, "I don't think we need more (fill in the blank) research" is to utterly fail to understand how good science works, and ties together.
Re:the difference between ASCI White & beowulf (Score:2)
The switch is what puts the "super" in this supercomputer, but when you break it down, it is just a fast network. Yes, orders of magnitude better in all ways than 10BASE-T, but still just a fast network.
And as far as operating principles go, it is just IBM's flavor of MPI. Nothing special there. All the money went into the switch.
Maybe the horsepower equation deserves the bicycle/car comparison, but the operating principle is the same, i.e.: a bunch of standalone Unix nodes, connected by a high-speed network, clustering software, and MPI (or PVM).
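For anyone who hasn't seen it, here is roughly what that "standalone nodes plus MPI" principle looks like in practice; this is a generic mpi4py sketch, not IBM's actual software stack, and the workload is made up:

# Each node runs the same program on its own slice of the problem; the fast
# network (the "switch") only matters when data has to move between nodes.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Purely local work on this node's share of the problem.
local_result = sum(x * x for x in range(rank * 1000, (rank + 1) * 1000))

# One collective operation over the network combines the pieces.
total = comm.reduce(local_result, op=MPI.SUM, root=0)
if rank == 0:
    print(f"combined result from {size} nodes: {total}")

You would launch it with something like mpiexec -n 4 python script.py; on an SP the principle is the same, just with a much better network underneath.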
Re:ENIAC (Score:2)
More tools for the USian nuclear weapons brigade (Score:2)
This is a nice piece of kit to say the least (who wouldn't want one for themselves?), but look at the use it's being put to - running simulations of nuclear bombs being used. Yes, it's again part of the $1 trillion USian military machine, even if it is given a more publicly acceptable face through the Department of Energy's "Stockpile Stewardship and Management Program", a euphemism if I've ever heard one.
Why is it that the Pentagon still gets to spend so much money on fancy new toys for a war that will now never come? The USSR has collapsed, and since the US is now sucking up to China it looks like there isn't going to be the proposed World War III that US military leaders have spent decades hoping and planning for. And whilst this $1 trillion goes into the military black box, poor people starve on the streets and can't afford even basic health care thanks to the Randite social policies of the US, despite what their Constitution supposedly guarantees.
No, on purely technical merits this computer is interesting, but I don't think that we, as responsible people, should be praising something which is part of a group that contributes in a large way to the suffering of the poor and needy.
---
Jon E. Erikson
enh? they're still around (Score:2)
Blue Mountain (at LANL), for instance, was on the order of 6,000 R10K processors; if you hunt around enough, you can still find the webpages about it at the Lab. (Again, I'm lazy. Sorry. :-)
Re:Could you Imagine... (Score:2)
Re:Hey wouldn't it cool to have a Beowulf ... Oh. (Score:2)
a backwards argument for abolition (Score:2)
Sorry to rant a little. My point is just this: argue all you want that we shouldn't have nukes; write your congressmen, campaign on Capitol Hill, etc. I wish you luck. But until that day comes, it makes sense for us to do this sort of simulation.
12.2Teraflops? IE still stalls... (Score:2)
Re:Speed of Computers (Score:2)
make -j2 zImage    # two compile jobs in parallel
make -j4 zImage    # four jobs in parallel
make -j zImage     # as many jobs as make can spawn
etc...
Re:National Status Symbols (Score:2)
Well, China being so close to Japan, wouldn't you be worried about them buying mass quantities of PSX2 units? I mean come on, this ASCI thing may be pretty fast, but nothing beats a PSX2 for weapons design, you know, with that easy-to-use controller and supercomputer-class processors.
-- iCEBaLM
That much hardware necessary? (Score:2)
Maybe... (Score:2)
The day you realized that splitting an atom would release megatonnage of energy, that was an epiphany.
The day you realized that it would take over 25 years to simulate one second of the blast in a computer, that was another epiphany.
It's a new kind of physics, you need a new kind of software.
Re:3 months to do a simulation? (Score:2)
Re:More tools for the USian nuclear weapons brigad (Score:2)
Now weary traveller, rest your head. For just like me, you're utterly dead.
Nuclear simulations? Is that it? (Score:5)
At least we can run our own weather simulations at home with the Casino-21 project [rl.ac.uk]. How long until a distributed nuclear simulation project? I guess that wouldn't happen because of "security concerns," though.
Re:Speed of Computers (Score:2)
A) A GeForce2 GTS can render a hell of a lot more triangles than the proc (even a 1GHz one) can feed it. Sure the geometry acceleration helps out a bit, but most games don't use it yet. Thus this racing game (no racing games that I know of use the geometry engine in D3D or OpenGL) is definitely a good indicator of the performance.
B) The kernel compile is a crappy benchmark. Given the fact that the source tree is some 75 megs, the fact that it does nothing with the FPU, and the fact that it is much more dependent on bus bandwidth due to the nature of the operation, it doesn't make for a very good benchmark.
But in the end, all that matters in a benchmark is how well it does what YOU'RE doing. If you want to test raw proc speed, you'll use a synthetic benchmark that just does math ops. If you compile all day, then the kernel compile is a perfectly valid benchmark. If you run 3D games, then the 3D game is a great benchmark.
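For what it's worth, here is the sort of toy "just does math ops" timer being described; it measures Python interpreter overhead as much as anything, so treat it as an illustration rather than a real benchmark:

# A toy synthetic benchmark: time a fixed number of floating-point operations.
import time

ops = 1_000_000
start = time.perf_counter()
acc = 0.0
for i in range(ops):
    acc += i * 1.000001
elapsed = time.perf_counter() - start
print(f"{ops / elapsed / 1e6:.1f} million float ops/sec (interpreter-bound)")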
Re:To All The People Who Think This Is A Waste. (Score:2)
Of course I thought of all that. I simply wanted to point out that a lot of people were looking at this from a "the glass is half empty" point of view.
#VRML V2.0 utf8
Don't rely on the computer for everything just yet (Score:2)
Some aerospace critics lay some of the blame for recent rocket failures on just this point: that too much emphasis is being put on rocket simulation at the expense of actually building prototypes and testing them. Certainly it is cheaper to simulate them, but you can't skip too many prototype iterations in the design phase.