Compaq to Build Alpha Supercomputer 265

kfarmer@tru64.org writes, "The French Atomic Energy Commission has placed an order for a supercomputer to simulate and analyze nuclear explosions. The supercomputer will use about 2,000 Alpha chips running in the 1.25-GHz range, or about 2,500 chips at the 1-GHz level."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward
    How I love debunking religious fanatics who sincerely believe that their reasoning is correct, and will use any means possible to try to prove the truth of the bible, which I refuse to capitalize.

    a.) Take your pick. You can just take the parts of the bible that you like and try to pass them off as factual, when the other parts are such incredible BS. The first chapter of Genesis states that god, which I will also refuse to capitalize, created the earth in six days. Ahem. I think not. Can you give me evidence that some all-powerful deity reached down and snapped his fingers? Didn't think so.

    b.) What about the existence of Mars? Jupiter? The asteroid belt? Did god create those too? Does it say so in the bible? Oh wait, they didn't know those existed when the bible was written. They still thought the earth was flat and was the center of creation, and so did god, despite the fact that he created it all. Hmmm...

    c.) My dad is an avid sailboat enthusiast and as such he taught me about the magnetic poles vs. the traditional poles when I was young. The north pole, the geographic one, was the magnetic north pole when it was named. Since then it has shifted, as the earth wobbles on its axis and the atoms realign. At no point would it have been strong enough to have fried any life forms.

    d.) Ever heard of dinosaurs? Well, we've found entire skeletons of them. And if people have always existed since the dawn of the earth, how come no one ever reported seeing these things? How come radiometric dating puts these things at 65 million years ago? Did they exist only in the first 5 days, before god made the humans?

    Sorry mac, listen to your science and ancient history teachers, not your clergyman.

  • by Anonymous Coward
    =all this evidence confirms that since we all evolved from this early ancient race of homo sapiens, we're still all really only one race, skin color and skull shape be damned.
    ="All one race" is a lie, as mentioned above. The ludicrous "evolution" theory is nonsensical as well, since "evolution" could not possibly have taken place in a mere six thousand years (the scientifically established age of the Earth, corroborated by all currently accepted facts of physics, chemistry and geology). In fact, statistics tell us that evolution would in fact have taken nine hundred trillion years to produce even a single cell, much less a human being in the image of God.


    Boys, don't confuse "race" and "species". There is indeed only a single "species" of humans on this planet. The definition of species, as applied to most higher-level critters, is that a pair of individuals can breed and produce fertile, procreatively viable offspring. That's us, no doubt.

    HUMAN BEINGS: kingdom Animalia, phylum Chordata, class Mammalia, order Primates, family Hominidae, genus Homo, species sapiens.

    Other taxon terms such as sub-species, race, breed, variety and strain mean much the same and in the context of biological classification of Homo sapiens are irrelevant.

    If either of you two boys would study some college-level biology you'd both know that it does not take very many generations of procreation within a single species of the class Mammalia to produce wildly different-looking sub-species, races, breeds, varieties or strains, due to influences from their environment. Take a look at dogs, cats and horses as species. Biologically speaking, humans are in the same boat. Four thousand or six thousand or ten thousand years is plenty enough time for wide variations in physical appearance to manifest themselves in our species, especially considering how we have scattered ourselves around the planet, mistreated our environment and waged war against one another over the millennia.

    WRT little boy #2's comment about a human being in the image of God: does anyone know what God looks like? The God I know exists in the form of energy and works through the power of prayer. I've seen God's work happen firsthand right before my own eyes, but have never seen "Him". Also OT, but a favorite rant of mine: how can you be sure that God is a "he"? Do you trust all those ancient writings that were written by men and now exist as the three main monotheistic holy books? And how about those men who wrote all the texts, and those men who translated the various languages over all those years, all employed by male kings and monarchist religious leaders who no doubt had political motive to exert whatever editorial control they pleased over what was written down?
  • by Anonymous Coward
    What type of science is involved in that? Fluid dynamics? Some books/URL pointers would be great!

    Thanks,
    Matt
  • by Anonymous Coward
    I quite agree. This computing power is being put to no discernible use. Vast numbers of cycles that could be used to extend human knowledge are being wasted on nuclear simulations. Meanwhile, whole tracts of the South Pacific and Australia remain unbombed.

    In any case, there's a far easier way to find out what happens in the heart of a nuclear explosion -- move to Taiwan. You'll find out soon enough, if the Klintons have their way.
  • by Anonymous Coward

    Actually the last wave of nuclear tests we French did was to acquire enough data to be able to simulate explosions later, to be able to continue to improve our enemy vitrification technology...

    AFAIK no one in France plans to redo nuclear testing for real... Those Aussies make way too much noise when we do... But we were nice, we even let them win the Davis Cup and the Rugby World Cup so they would forgive us...

  • by Anonymous Coward
    Designing nuclear bombs (fission, fusion, enhanced radiation, ...) is not a trivial undertaking. Take the case of the initial implosion: the explosive charges get set off to generate a pressure wave. The core materials are "real", and are not elastically isotropic. So, just solving for how the pressure wave elastically deforms the core material in the vicinity of the wave crest involves the 4th-rank stiffness/compliance tensor. The bomb people want to increase the core density, so plastic deformation will take place, further increasing the complexity of the equations being solved. Add fission processes, neutron moderation and diffusion, and all the other little things that are going on (some with time scales on the order of microseconds, some with time scales on the order of picoseconds or less), and you have a very difficult mathematical problem.
  • by Anonymous Coward
    Please post the Nielsen interview already!!!!

  • Okay, boys and girls, most of you have been non-educated in non-schools created by the radical Left over the past fifty years since the Truman Conquest of the US government in 1948, but you should at least know this one:

    Q. What is the sun?

    A. A very large fusion reaction.


    Here's another, which might be a little tougher for all you little basket-weavers and folk-dancers:

    Q. What is a hydrogen bomb?

    A. A relatively tiny fusion reaction.


    The sun has been shining on the earth for at least six thousand years. Nobody has been killed by it. No cities have been destroyed by it. The sun is as safe as anybody could want. Fusion is not dangerous or harmful.

    I've had about enough of the paranoid, sick, vicious, bigoted hate that the Liberals vomit forth every day of their lives. LIES, okay? Everything you see in the media about nuclear war IS A LIE. Nuclear conflict poses no significant dangers.

  • by Anonymous Coward
    Knock off LA? That's a WIN-WIN situation
  • by Anonymous Coward
    By doing tests at Mururoa, the French were trying to completely erase those islands from the face of Earth. This was done in agreement with the Saudi government, because Mururoa exposes a logic gap in Muslim theology.

    Mohammad said all Muslims should pray facing in the same direction, towards Mecca. What Mohammad didn't know was that the Earth is a sphere, so there's a point that's directly opposite Mecca, and that point is at Mururoa. In Mururoa, every direction faces Mecca.

    This logic oversight was menacing the whole Muslim faith, so the French government agreed (in exchange for oil) to help correct it in the only possible way: blast those blasphemous islands from the face of the Earth. I don't know why the agreement was later revoked.

    Excuse me for sending these facts as an AC, but I don't want to suffer Salman Rushdie's fate. You can check the fact for yourself, just look at a World map with sufficient detail to show the Mururoa atoll.

  • The processors alone don't matter that much.
    How data is piped from one processor/memory/cluster/etc. to another is what matters -- and then performance will depend heavily on what sort of problems are run on it.
    John
    1,300- and 1,000-processor T3Es (i.e. Alpha-based) have already been built, and the current ASCI systems have >9,000 processors (Red), >6,000 (one Blue) and ~5,800 (the other Blue).
    John
  • The U.S. national labs have been doing this for a couple of years now. Compaq just won a contract for a new system at Los Alamos to do the same thing.

    Those Compaq designs are from Compaq. The Alpha engineering, manufacturing and designs are from the former DEC. I have a true DEC Alpha at 500MHz and I wouldn't trade it for an x86 at any clock speed, except maybe 1.5GHz or so, but I can still upgrade my system to 800MHz if I ever find a chip rated near that speed (good for a 3 year old system, eh?)...

    This type of computer contracted to the French was announced last summer, although not many are using it yet.

    Honestly, the last half of your problem belongs on some email list somewhere.
  • Aarrrggghhh...
    It was not MS that gave up - it was Compaq!!!! Compaq decided to drop support for NT on Alpha!!


    Partly true. Compaq decided to stop making 32-bit NT on Alpha. It was M$ that decided to kill off NT on Alpha completely!

    Who do you think had to maintain Alpha/NT? Clue: Not MS. Yes, DEC/Compaq had to pay for a complete NT software development dept., because DEC/Compaq had to do the maintenance. (same thing when NT used to run on PPC - IBM had to maintain it, until they realised it didn't sell).

    M$ had to do some of it. Compaq did most of the work though. Yes I imagine the same is what killed PowerPC and MIPS support.

    Then some bright spark looked at the figures and realised that nobody was buying NT on Alpha. Its best market share was on workstations - ~15%!! On servers it was even worse - because people who tend to buy nice hardware like Alpha also tend to buy nice OS's like OpenVMS or Unix. People were not spending money on Alpha/NT.

    I'm not sure about the percentage, but it is probably close.

    They put 2 and 2 together and realised that paying for NT/Alpha was costing more than the revenues generated by Alpha/NT sales. And that's why it was scrapped. The biggest money maker on Alpha is Unix, closely followed by VMS.

    Yes and people wanted 64-bit NT, not crappy 32-bit OS on a 64-bit platform.

    Also, look how hard Compaq is pushing Linux on Alpha. This is for the same reason NT was dropped - money. Linux sells a lot of Alphas, especially at the lower end, e.g. Linux market share on DS10's is about 40% or higher... it also does well on clusters. And Compaq is pushing Linux/Alpha clusters really hard.

    Yep, Linux on Alpha kicks ass and Compaq knows it!

    (my mouse mat is a picture of tux on a fat motorcycle with the Compaq Alpha logo, and a banner saying "Linux SCREEEEAMMMS on Compaq Alpha".. this is an official compaq mousemat)

    Cool! Where did you get that?

    in fact this Alpha cluster will most likely run linux..

    No, it runs Tru64 UNIX with Tru64 clustering software. Like it or not, Tru64 UNIX can still kick every other OS's ass. Don't get me wrong! I love Linux and run it on everything I can, including my Alphas. Linux just doesn't scale that high yet. The largest AlphaServers Linux runs on are the 4100's, whereas Tru64 runs on all of the Compaq-branded Alphas (except a few NT-only systems that were called "white boxes"). The system will most likely be a Wildfire system, probably multiple 128-way systems. My point here is that Linux runs great on lower-end 1 - 4 way AlphaServers, but that's as far as the support goes. FreeBSD support is still maturing and I do not know how well NetBSD scales, or whether NetBSD runs on AlphaServer 8400's. Anyway, on anything over a 4100 you'll have to run Tru64 UNIX or OpenVMS.
  • Did MS bail on the Alpha?

    Compaq decided a 32-bit OS on a 64-bit platform (Alpha) that had a low market share wasn't worth putting out lots of money for. So Compaq said they were going to stop supporting 32-bit NT on Alpha and would pick up where they left off when 64-bit NT finally arrived. Unfortunately (for those who got the short end of the stick) M$ decided to kill NT on Alpha off completely! The end result is NT is just like Win9x in that it only runs on x86-based systems. Maybe they can write NT/2000/whatever in assembly now and get some decent speed out of it! *grin*
  • "The supercomputer will use about 2,000 Alpha chips running in the 1.25-GHz range, or about 2,500 chips at the 1-GHz level."

    Well, I suppose they will get a realistic view of the heat production with this setup...

  • In Mururoa, everybody could track their progress.

    Who knows what terrors they (and the US and the UK and Russia and Israel and India and Pakistan and Koreas and China and...) are planning for us in their virtual testgrounds.

    Nuclear? No thanks!
    --
  • When I first started reading these comments I thought our anonymous coward here was just simply misled by his fundamentalist brainwashing.

    I must admit I was wrong. You're just an idiot.

  • France has argued for years that if we want them to stop testing in the South Pacific then we have to sell them the technology to simulate weapons testing the same way we in the USA do that now.
  • 2500 Alpha CPUs at, let's say, $1000 each comes to $2.5 million. How does this pricing compare to Cray or Origin supercomputers? Alphas are incredible to say the least, but I'm wondering if we can get a Beowulf cluster going for a cheaper price with similar speeds?

    If I were shelling out that much cash, I'd probably want to go MIPS though, but it all depends on the company's infrastructure and needs. I'm overall impressed by Compaq's dedication to providing excellent and reliable products (e.g. ProLiant servers), but I'm a little bit edgy about their up-and-down attitude towards Alpha lately. :)

    EraseMe
  • I couldn't even imagine how much processing power it would take to calculate missile trajectories and such. It's probably relatively close to what it takes to make films like Star Wars or Toy Story, or to do large-scale decryption. It takes a lot of power to bring such heavy math as close to real time as possible.

    EraseMe
  • Okay then.. a 600MHz UltraSPARC III would be relatively close to a 750MHz 21264 Alpha or a 1000MHz P3 on SPECint benchmarks, I believe. Wait till the 1500MHz UltraSPARC Vs are out though.

    Intel's downfall is that they are moving further and further towards CISC with their P3 SIMD, while Alpha is easily pushing ahead, along with Sun, on beautiful RISC CPUs.

    Then again, the P3 will probably kick the Alpha's ass on gaming benchmarks, but the Alpha will most definitely nail the P3 to the ground in processing war applications.

    EraseMe
  • Why go to all the expense PBS has it free on its website:

    http://www.pbs.org/wgbh/amex/bomb/sfeature/mapablast.html [pbs.org]
  • Hydrogen bombs can yield just about anything you want them to. They are fairly scalable, from sub-megaton to many hundreds of megatons.

    The fission bombs dropped on Japan yielded at most 20 kilotons. The smallest fission bombs can fit into a backpack transportable by a single person, and their blast radius would be only about a city block.
  • If you are saying that no person has ever been killed by a hydrogen bomb, then yes that is true.

    The simulations do seem to indicate that if a person were to stand within 10 feet of a hydrogen bomb, they would be killed.
  • Also, I could have sworn that there were restrictions on the computational power that we could export from the U.S. Something that breezes through nuclear calculation could probably brute-force crack most encryption methods in an afternoon...

    And your point is? There are restrictions on "technology transfers" of this nature. So what. Naturally, they've gotten the export license for this thing, or they wouldn't be telling the world about it, now would they?

    As for the crack about encryption, who needs encryption when you're "breezing" through simulations of nuclear explosions? Why decrypt intercepts from other nations, when you can explode nuclear devices in the atmosphere and take out their communications infrastructure?

    Just as a side note, the last I heard, use of encryption in France by private citizens requires governmental permission. Anyone in France care to correct/comment on this?

  • No, fluid dynamics is used for weather.

    Unfortunately you won't find much information on this, as the government considers it REALLY sensitive information. The design of the nuclear pits used in atomic weapons is fairly well known (go to your library ;) however, it is in the details where things get tricky.

    Computers are used in implosion simulations to calculate fission / fusion efficiency, and to interpolate between different design modifications to enhance yield.

    This involves very, very large numbers of floating point operations to calculate effects from the properties of the fission reaction, namely the fissile material density, the half-life, the shape of the compressed fissile material, the rate of compression, etc.

    Quite complex. I wish I did know details, that would be some juicy code.
  • Actually, they aren't simulating the blast effects, they are simulating the supercritical state which exists during the first nanoseconds of a nuclear explosion.

    These computations will be used to design / build more efficient (read: clean) nuclear weapons.
  • Quite a few posters seem to think this will simulate the entire nuclear blast process, which would entail a large amount of fluid dynamics (weather prediction basically).

    However, this is not the case. What they are doing is simulating the compression -> supercritical process that occurs when the detonation lenses used to implode the nuclear core or 'pit' are detonated.

    These calculations usually rely on finite element analysis and atomic decay / fission simulations. (nuclear & some quantum physics calculations)

    The simulations have to handle multiple variables which interact with each other like:
    - detonation shock wave velocity
    - detonation shock wave effects of lensed charges on the heavy metal driver layer of the pit.
    - implosion vectors for the heavy metal (usually uranium) driver layer as it implodes through a surrounding vacuum around the inner beryllium/plutonium core.
    - implosion vectors for the inner core (beryllium jacket and hollow tritium / deuterium filled plutonium sphere) as the driver transfers kinetic energy and implodes the core itself.
    - Calculation of effects on rate of fission and efficiency as the inner core goes super critical.
    - Calculation of the effects of the beryllium neutron reflector layer surrounding the super critical core.
    - Calculation of the effects of the neutron source at the center of the imploding core (the deuterium / tritium)

    All of this together is used to determine the yield and efficiency of a given nuclear device. In all likelihood they have the lensed detonation charge values already computed / interpolated and the majority of the simulation goes towards the fission reaction simulation.

    All sorts of variables are optimized by this approach, such as the shape of the heavy metal driver layer (surprise! a perfect sphere is not the most efficient geometric shape, probably due to the slight differences in the effects of the implosive shock wave generated by the surrounding lensed charges relative to the position of the lenses and location and rate of triggering detonations)

    The size and shape of the beryllium neutron reflector jacket surrounding the plutonium core.

    And finally the size and shape of the plutonium core itself, and if/ how much deuterium / tritium is at the very center.

    So, hopefully that clears up the issues regarding what exactly they are simulating, and why the need for massive floating point power is mandatory.
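
    To give a rough feel for the time scales involved (this is only a textbook toy, nothing like the coupled hydrodynamics / neutronics codes sketched above): the prompt neutron population in a supercritical assembly grows roughly as dn/dt = ((k - 1) / l) * n, where k is the multiplication factor and l the prompt neutron lifetime, on the order of tens of nanoseconds in a fast metal assembly. Even this trivial model forces nanosecond-scale time steps; the numbers below are invented, order-of-magnitude values only.

    #include <stdio.h>

    /* Toy point-kinetics model: relative neutron population n grows as
       roughly exp(((k - 1) / l) * t).  k and l are illustrative textbook
       order-of-magnitude values, NOT real design data. */
    int main(void)
    {
        const double k  = 1.5;    /* assumed multiplication factor           */
        const double l  = 1e-8;   /* assumed prompt neutron lifetime (s)     */
        const double dt = 1e-9;   /* time step: must resolve the ~10 ns scale */
        double n = 1.0;           /* relative neutron population             */

        for (double t = 0.0; t < 1e-6; t += dt)   /* first microsecond */
            n += ((k - 1.0) / l) * n * dt;

        printf("population multiplied by ~%.1e in one microsecond\n", n);
        return 0;
    }

    Multiply that by a spatial mesh, real cross sections and the hydrodynamics above, and the floating point budget disappears very quickly.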
  • Every supercomputer company has its maximum "pie in the sky" configuration, but corporations can't afford these $50+ million price tags. How do we know we can ever reach these capacities? Answer: government agencies - DOE, NOAA, NSA - buy a few of these uneconomical computers to keep the industry on their toes. I support limited purchases like this, but not the wholesale subsidy of the supercomputer industry like during the 70s and 80s (e.g. Thinking Machines).
  • by HeghmoH ( 13204 )
    By "simple", I mean the computational power required, not the mathematics involved.

    X-Plane [x-plane.com] claims to have engineering-accurate flight simulation for most of the aircraft it models. I tend to believe that this claim is exaggerated, but it still does a good job with very little power. Most of X-Plane's computational requirements are spent on pretty graphics rather than on calculating aerodynamics.

    You really, really don't need a supercomputer to do this stuff. As proof, I point to the Apollo missions, which were planned and executed using mostly slide rules and the occasional "supercomputer" that is probably a hundredth of the speed of a cheap desktop now.
  • Missile trajectories are actually really simple. Basically just Newton's laws applied. It doesn't take that much computing power.
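    For illustration only, here is about the simplest possible version of "just Newton's laws" -- a point mass under constant gravity with no drag, integrated with a crude Euler step (all numbers invented):

    #include <stdio.h>

    /* Toy ballistic trajectory: flat Earth, constant gravity, no drag. */
    int main(void)
    {
        double x = 0.0, y = 0.0;         /* position (m)            */
        double vx = 300.0, vy = 400.0;   /* initial velocity (m/s)  */
        const double g  = 9.81;          /* gravity (m/s^2)         */
        const double dt = 0.01;          /* time step (s)           */

        while (y >= 0.0) {               /* step until impact       */
            x  += vx * dt;
            y  += vy * dt;
            vy -= g  * dt;
        }
        printf("impact at roughly x = %.0f m\n", x);
        return 0;
    }

    The expensive parts come from adding drag, a real gravity model, reentry effects and guidance -- not from the basic mechanics.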
  • in fact this Alpha cluster will most likely run linux..

    It doesn't and it oughtn't. It runs Tru64.

    (jfb)
  • "While Intel-based designs clearly dominate the computing market, Lipcon said there is very little overlap between the two technologies because Alpha does not run on any Windows-based systems."

    Well, that's not entirely inaccurate. After all, Microsoft dropped the Alpha support in NT round about NT4/SP4.

    Also, I could have sworn that there were restrictions on the computational power that we could export from the U.S. Something that breezes through nuclear calculation could probably brute-force crack most encryption methods in an afternoon...

    I thought the French banned almost all encryption. Surely to have an encrypted message of any form would violate their own laws...
  • ...instant grits.
  • For a nice article on simulations performed for the Relativistic Heavy Ion Collider [bnl.gov]:

    A Taste of Quark Soup [psc.edu]

    BTW, this research was done a T3E [psc.edu] (which uses Alphas).

    Sean

  • Hmmm.. AFAICR, the French are powered predominantly by nuclear power. So, by switching this thing on, they'll generate a whole stack of plutonium. Handy!
  • "Give peace a chance."

    Don't build a computer to test nukes, build a computer to help cure cancer. Let's use our high end processing power for playing chess and doing good.

    Okay, I'm probably OT, but I was listening to Mr. Lennon last night, and sometimes his lyrics just ring so true.
    --
  • Oh, give me a break. Don't you guys ever think before you post your drool? This thing's not a Beowulf: it's the Dragon.

    Linux is great for a lot of things, but if you're shelling out the money for 2000+ alpha chips, you're not going to run Linux. You're either going to run a custom OS designed just for this task-- and I doubt the French will open source it-- or *BSD with a customized kernel.

    Somewhere else on the net, some asshole read the same article and said, "Cool! Too bad it won't run NT." Don't be that asshole's linux-using brother.

    --Kevin

  • Who says this has anything to do with exporting from the US?

    It could be made in France or Taiwan.
  • You missed the point. The point is that the big burly computer will house 2,500 Alphas running at 1 GHz or 2,000 running at 1.25 GHz. This will put it at or near the very top of the fastest supercomputers ever built by man's hand. To say "nothing new here" is a gross discredit to this behemoth of power. In fact, I shall from this day forward worship the 2000-CPU Alpha box as my lord and savior.

  • I thought the French banned almost all encryption.

    They did. But now they're quite keen on their citizens using encryption.

  • No, it means the French won't need to do actual testing anymore. The last couple of tests were so they had sufficient info to be able to do simulations.

    Much better to do simulations in France, than real tests in Mururoa/Fangataufa.

    Did you realise, though, that the Chinese test site is actually closer to AU/NZ than the French test site?
  • Right down the hall from me is an 18-node SGI O2K with R12K processors running at (as I recall) 300 MHz... as "supercomputers" go, this one is probably quite average... nothing spectacular. I would be surprised to hear that the price tag of this system was under $1M.

    Of course you can "get a beowulf system going..." and probably cheaper, but IMHO Beowulf is more appropriate for attacking specific applications that parallelize well and have limited network demands. The O2K, and most successful supercomputers, derive a significant portion of their speedup and scalability through effective node communications. Amdahl's law, basically...
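    For the curious, Amdahl's law in one line: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the work that parallelizes and n is the number of CPUs. A quick toy calculation (the fractions and CPU counts below are made up) shows why the serial/communication fraction matters far more than the raw CPU count:

    #include <stdio.h>

    /* Toy Amdahl's law table: speedup = 1 / ((1 - p) + p / n). */
    int main(void)
    {
        const double fractions[] = { 0.90, 0.99, 0.999 };  /* parallel fraction p */
        const int    cpus[]      = { 16, 2000 };           /* CPU counts n        */

        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 2; j++) {
                double p = fractions[i];
                double n = (double)cpus[j];
                double speedup = 1.0 / ((1.0 - p) + p / n);
                printf("p = %.3f  n = %4d  speedup = %7.1f\n", p, cpus[j], speedup);
            }
        }
        return 0;
    }

    With p = 0.99, even 2000 CPUs buy you less than a 100x speedup; that is why the interconnect and the serial fraction, not the processor count, decide whether a machine like this earns its price tag.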

  • Seriously, they're going to have to build a nuclear reactor anyway, just to have access to a sufficient cooling system...
  • Well, maybe they're OOP proponents??! My guess would be that the simulation is MFC-based, though...
  • Considering the current "situation" between France and the USA over Echelon, I'd be a bit paranoid if I were placing this order.

    Think about it - you have 2500 radio transmitters, with timing accurate to 1GHz. Use this as a phase array and you could transmit a pencil-thin beam of radio/microwave energy at any satellite or other receiver you choose.

    I'm sure this machine would have plenty of processing power to analyse its own activity and transmit data in this way without anyone noticing the loss of clock cycles.

    IIRC, details of DEC's VMS operating system were among the things that the cracking ring broken up by Cliff Stoll was selling to the Russians. Is that a coincidence? Is it a coincidence that DEC, VMS and AXP (the true name of the Alpha) are all TLAs?

    You might think so, but I'm not so sure.

  • IIRC ASCI Red has >9000 PPro-200s, so that is roughly the MHz equivalent of 2000 Alphas at 1000, but I believe the Alpha has greater FP power per MHz than the PPro, so the result might be more powerful (others have made SPECint comparisons, but I don't believe that's what's significant here). Digital's Fortran compiler might be better than the Intel ones too, I don't know.

    --LG
  • Yes, these systems use Tru64, and Digital's clustering environment including CFS.

    The machine interconnect is a 200MB/sec system from Quadrics in the UK. The system is VM-aware, so you get to do DMA from system to system.

    This machine will be about 5 teraflops, I imagine with the new 64-CPU nodes. Single system image. Programming using the MPI specs etc.

    -Britt, currently trying to decide if he will take a job in the Compaq Alpha SuperComputer group.
  • Yeah, it was a year or two ago when MS gave up on the idea of running NT on multiple architectures and stuck to Intel.
  • It really depends what kind of accuracy you are looking for. You can write an engineering-level code that solves basic equations for aerodynamic forces, and get first-order accuracy in predicting steady, inviscid, incompressible flow over a two-dimensional airfoil at a small angle of attack on a standard PC; you can even get results within seconds if you make enough broad assumptions. However, this will give you very simplified results that are only valid inside the range of the assumptions used. If, for instance, you wanted to solve an off-design screech problem in a modern fighter engine combustor or afterburner, at temperatures outside the range of constant air properties, you would need to solve the full Navier-Stokes equations [sdsc.edu], probably including unsteady terms, turbulence terms, vitiated air, gas chemistry anomalies, airfoil expansion due to temperature gradients (which in turn requires complex grid generation/regeneration), etc. Try to do this on even the fastest PC and you will be waiting for years.
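    As a concrete sketch of the "engineering level" end of that spectrum, here is about the crudest aerodynamic estimate there is: thin-airfoil theory for a symmetric airfoil in steady, inviscid, incompressible 2D flow at a small angle of attack. All the input numbers are invented for illustration.

    #include <stdio.h>

    /* Thin-airfoil theory: for a symmetric airfoil at small angle of attack,
       Cl = 2 * pi * alpha (alpha in radians); lift per unit span is
       L' = 0.5 * rho * V^2 * c * Cl.  Inputs are arbitrary example values. */
    int main(void)
    {
        const double pi        = 3.14159265358979;
        const double alpha_deg = 4.0;     /* angle of attack (degrees)  */
        const double rho       = 1.225;   /* air density (kg/m^3)       */
        const double v         = 60.0;    /* freestream speed (m/s)     */
        const double c         = 1.5;     /* chord length (m)           */

        double alpha = alpha_deg * pi / 180.0;
        double cl    = 2.0 * pi * alpha;
        double lift  = 0.5 * rho * v * v * c * cl;

        printf("Cl = %.3f, lift per unit span = %.0f N/m\n", cl, lift);
        return 0;
    }

    Everything in the next paragraph -- unsteady terms, turbulence, chemistry, deforming grids -- is what separates a three-line estimate like that from a 50-day compressor run.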

    Quick example: to complete a full analysis of an 11.5-stage high pressure compressor on a 48-processor HP workstation (180 MHz) takes 50 days (!) of wall clock time. This isn't even a full engine, just one component. While obviously an extreme, since the code makes literally no assumptions (within the limits of human understanding of the physics involved in the problem), it comes to mind immediately as an example of the level of complexity involved in calculating any problem, whether it be molecules of air or sub-atomic particles, to the degree of accuracy required by modern scientists/engineers. Disclaimer: I am not completely familiar with the specifics of this code because it is/was a NASA Glenn [nasa.gov] project. I saw the tail end of a paper presented for it back when it was still NASA Lewis... I have a copy of the paper somewhere, but it escapes me. If I find it I will post the TR# and you can look it up at a tech library somewhere.

    I have never used the X-Plane program you mentioned, but from looking at it I believe they can probably do what they claim. They aren't really claiming a lot, however; the real applications for high-performance computing are in high-order engineering and design for detailed performance analysis. Engineering-level analysis is pretty simple, and their level of detail could probably even be accomplished with table look-ups.

    Rev Neh

  • Australia is not next door.
  • ...France intends to use this instead of doing the tests for real (remember a few years ago?), then this is great news.

    Interesting that this comes so soon after the magical 1GHz announcements by Intel and AMD. Surely not a coincidence? :)

    --
  • If you only simulate nuclear explosions on a computer you aren't detonating nukes. No detonation, no (or at least significantly lower) chance of radiation leakage.

    Is this an ideal application of supercomputing power? Maybe, maybe not. But it sure beats the alternative.. once you have the data so you *can* model things, that is.

  • Neither of you knows the facts; you're just arguing about whose dick is bigger. This is exactly the kind of thing that gets in the way of objective science.
  • The ad hominem fallacy of formal logic is as follows: to attack another person's credibility instead of their arguments. An argument's truthfulness is not affected by who puts forth said argument. This previous post includes no new facts for the argument - it is a stream of insults; irrelevant. The only fact he uses is a restatement of his original argument, "The sun is a fusion reactor," which has already been addressed by his adversary.
  • Relying on the bible for facts, while very convincing to religious people, doesn't do jack when you're talking to atheists like me. Keep your theology to yourself, unless you can back it up. Even religious people admit the bible has plenty of bullshit in it.
  • The EV68 0.18 um Alpha dissipates 65 Watts at 1 GHz. Reportedly the 0.18 um Merced dissipates around 146 Watts at 600 odd MHz.

    The PowerPC 0.15um G4 dissipates about 12 Watts at 500 MHz. Yes, 12.

    What was your point again?


    --

  • The heat generated by the 2000 Alphas simulates the intense heat at the center of an atomic blast.

    And, the power consumption requires an atomic electrical plant to run the system, thus justifying the importation of large amounts of radioactive matter.

    ^_^


    --

  • Well, I think that estimating the price of a supercomputer like this as "number of processors times price of one processor" is pretty rough. If they want a real parallel cluster, they'll need some kind of high-speed interconnect (like, say, Myrinet), which can add about 1000 bucks to the price of each node all by itself.
  • by gfxguy ( 98788 )
    Missile trajectories are simple in Physics 101, where every trajectory problem started with: "Ignoring wind resistance, and assuming constant mass..."
    ----------
  • The French Atomic Energy Commission has placed an order for a supercomputer to simulate and analyze nuclear explosions.
    So when you turn on the 2500 Alphas the explosion is the magnitude of a nuclear device? ;)
  • I agree completely. However, in the article, they are comparing within an architecture! Within the Alpha architecture, to be specific. They reported the trade-off between CPU speed and number of CPUs. It seems to me that this is about the only legitimate use of MHz as an indicator of speed.

    To continue your rant: if clock speed were all that mattered, a cheap Wintel box with a Cyrix at 233MHz would whomp the pants off an Origin with a 150MHz R10000 CPU. I don't think so. You might be able to find something that would run faster on the Cyrix... but I think that pretty much anything involving floating point wouldn't fall into that category. Of all the many things which determine a system's speed, MHz may be the easiest to understand, and the least important. Perfect for marketing purposes!

  • Repeat after me: Mhz only has any validity as a benchmark within an architecture. And even that validity is limited. A 400Mhz PII is NOT 33% faster than a 300Mhz PII. It's maybe 10%. To talk about Ghz Alphas as though they are at all similar to Ghz Intels is crazy.

    Mhz within a given implementation of an architecture is the only thing that can be relevant, and even there it depends on the workload. You'll see linear scaling of performance on stuff that's not memory intensive, e.g. RC5 cracking. For most "real world" apps, you'll spend a great deal of your time bottlenecked somewhere other than the core, so the ratio of speedup to frequency will be less than linear.

    It's pretty important to remember that implementations of architectures can vary drastically in capabilities, too. EV6 (the 21264) is a MUCH more aggressive microarchitecture than EV5 (21164). Even though you can get an EV5 or an EV6 at the same clock speed, the EV6 will trounce the EV5 in performance, due to higher bandwidth, out-of-order execution, etc.

    On the other hand, there are cross-architecture frequency comparisons that can be valid, like, say, an EV5 vs. an UltraSPARC II. Both are quad-issue, in-order cores with similar amounts of bandwidth. Frequency comparisons between the two aren't precise, but they are a pretty good rough comparison of performance between the two implementations...

  • Why would you want to put so much into analyzing nuclear explosions?

    The horsepower comes from NOT running the nuclear explosion. I'm not up to date on the various treaties, but what they are allowed to actually blow up gets to be less and less as time goes by. But they still have to convince potential invaders that they know what they're doing.

    So you blow up things inside computers. And you convince your enemy's scientists that your simulations are valid.
  • It just shows how out of touch I've become with the Windows world. I remember the big tours when MS and DEC were convincing everyone with joint presentations that NT on Alpha was the best thing since sliced bread. The neatest idea since better mousetraps. Whatever.

    Now the press tells us that While Intel-based designs clearly dominate the computing market, Lipcon said there is very little overlap between the two technologies because Alpha does not run on any Windows-based systems.

    Did MS bail on the Alpha?
  • Hey man, I fear you have a 4-year lag.

    France has not done any nuclear testing for several years, and the last experiments REALLY were the last ones (well, I hope, with politics you never really know).
  • It is really hard to compare a 1GHz Alpha chip to a 1GHz Athlon or PIII by using frequency. It is almost like saying that if a Ford Pinto and a Porsche both redline at 6000 rpm, then they are equally fast!

    Disclaimer: This is an arbitrary comparison; I do not know the actual redlines of the above vehicles!
    Furthermore, I am not saying that an Alpha chip is like a Pinto and an AMD chip is like a Porsche... I am just saying that it is a poor comparison. Frequency is misinterpreted. Hell, I've seen monkeys at the zoo jerk off at about 2.5GHz, but they suck at calculations!
  • No, it's the French, their testing grounds were in the Polynesian islands. All the crap from their nuclear testing is now spread pretty evenly around the southern Pacific Ocean. I certainly wouldn't want them to start up real testing again, let them have their computer :-)

    ----
  • Aarrrggghhh...

    It was not MS that gave up - it was Compaq!!!! Compaq decided to drop support for NT on Alpha!!

    Who do you think had to maintain Alpha/NT? Clue: Not MS. Yes, DEC/Compaq had to pay for a complete NT software development dept., because DEC/Compaq had to do the maintenance. (same thing when NT used to run on PPC - IBM had to maintain it, until they realised it didn't sell).

    Then some bright spark looked at the figures and realised that nobody was buying NT on Alpha. Its best market share was on workstations - ~15%!! On servers it was even worse - because people who tend to buy nice hardware like Alpha also tend to buy nice OS's like OpenVMS or Unix. People were not spending money on Alpha/NT.

    They put 2 and 2 together and realised that paying for NT/Alpha was costing more than the revenues generated by Alpha/NT sales. And that's why it was scrapped. The biggest money maker on Alpha is Unix, closely followed by VMS.

    Also, look how hard Compaq is pushing Linux on Alpha. This is for the same reason NT was dropped - money. Linux sells a lot of Alphas, especially at the lower end, e.g. Linux market share on DS10's is about 40% or higher... it also does well on clusters. And Compaq is pushing Linux/Alpha clusters really hard.

    (my mouse mat is a picture of tux on a fat motorcycle with the Compaq Alpha logo, and a banner saying "Linux SCREEEEAMMMS on Compaq Alpha".. this is an official compaq mousemat)

    :)

    in fact this Alpha cluster will most likely run linux..

  • Furthermore, I am not saying that an Alpha chip is like a Pinto and an AMD chips is like a Porsche

    I surely hope you don't... An Alpha at 150 MHz is roughly equivalent to a PPro/PII at 800 - 1000 MHz, for floating-point intensive programs. Partly because the Alpha is just blazingly fast for FP operations, and partly because the x86 FP architecture sucks. When they developed the 8087 originally, they chose a design which is flawed from beginning to end, but they couldn't get it to work any other way... And now we're stuck with it.

  • it kinda reminds me of an old Star Trek (1st gen) episode where some planet was in the midst of a war, but it was all computer simulated, and whoever lost a virtual 'battle' had to send a bunch of people to a death chamber.
    So if China ran a simulation of an ICBM launched at LA, pitted against a US simulation of a 'star wars' ABM missile trying to knock it out of the sky, and the US missed, we'd have to bump off a bunch of LA residents, all with no messy radiation or destruction of property!
  • From what I have read, the tricky part of computing ballistic missile trajectories is having to use a high-fidelity model of the gravitational field around the Earth. There are also questions about the accuracy of existing models. I'm not sure how atmospheric effects are handled; this was a major problem for the USA during World War II, when precision bombing missions by high-altitude bombers often missed targets by large distances due to wind effects.
  • ***
    The Spanish intercepted these French intercepts and are still pondering a suspicious code phrase: "noleche"
    ***

    Luckily the Spanish found their Franglish translator which revealed the secret message: I'm not licking.

    (from lécher, to lick)
  • I was thinking noleche was a secret French message instead of a secret Spanish message. I don't speak Spanish, so I defer to you on the "he's not licking" theorem.

    =)
  • It seems to me that Compaq is going to sell a computer that is going to analyse the destruction of the South Pacific.

    If this means I'll never hear 'there's nothing like a dame' again, that can only be a good thing...
  • There is a vital detail missing from this article: what kind of machine is it? The actual processors being used are only one factor. What about the machine itself? Is it a cluster of workstations, and if so, what kind of network is it on?

    Alternatively, is it a "real" supercomputer, with a scalable high bandwidth low latency interconnect?

    If so, it makes me wonder why they just don't buy a Cray...

    Chris

  • Don't you mean SPECfp? That's where alphas have the most advantage. At similar clock speeds, the SPECints aren't that far off for a P3 and Alpha (IIRC)...
  • Something that breezes through nuclear calculation could probably brute-force crack most encryption methods in an afternoon...

    That would be very bad crypto, in that case. The keyspace enumerable (i.e. assuming one key checked per cycle -- obviously a best-case scenario) by a 1 TeraOPS machine is just shy of 40 bits per second. So in two seconds it does 41 bits, in four seconds 42... and in 24 hours, about 56 bits. Since on average you only need to search half the keyspace, the best this machine could hope to crack by brute force is roughly a 57-bit key per day.

    But they were talking about eventually scaling up to 100-odd TOPS. Call that a factor of 128 = 2^7, which brings the searchable space up to roughly 63 bits per day.

    At that rate you'd still need about 2^64 days to break a 128-bit key. That's a pretty long time. Go calculate it for yourself. Oh, I'll do it: it's on the order of 10^19 days.
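
    A quick back-of-the-envelope program, using the same idealisation as above (one key tested per operation; all figures hypothetical):

    #include <math.h>
    #include <stdio.h>

    /* Rough brute-force budget: bits of keyspace searchable per second/day,
       and days needed (on average) for a 128-bit key at ~128x that speed. */
    int main(void)
    {
        const double ops_per_sec  = 1e12;     /* assumed 1 TeraOPS */
        const double secs_per_day = 86400.0;
        double keys_per_day = ops_per_sec * secs_per_day;

        printf("bits searchable per second: %.1f\n", log2(ops_per_sec));   /* ~39.9 */
        printf("bits searchable per day:    %.1f\n", log2(keys_per_day));  /* ~56.3 */

        /* Scale up ~128x (to ~100 TOPS); a 128-bit key needs 2^127 tries on average. */
        double days = pow(2.0, 127.0) / (keys_per_day * 128.0);
        printf("days for a 128-bit key:     %.1e\n", days);                /* ~1.5e19 */
        return 0;
    }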

    Johan
  • Well, the first few millionths of a second. In a multi-stage device you may have 3+ separate fission and fusion reactions triggering each other at exponentially greater yields.

    Simple devices use as a primary a compression fission device with a plutonium core boosted with tritium. X-rays escaping the primary tamper are focused via beryllium into plastic/foam/explosive waveguides to generate enough energy in the second detonation to heat lithium-6 deuteride or other fusion fuel sufficiently, and the outer bomb casing is often made of fissile material for free extra bang (and higher burn efficiency of the lithium). An extremely complex system. I always thought it was cool how in "dial-a-yield" designs, by simply varying the tritium in the primary core "pit", you can change the final yield by a good order of magnitude or two - for tactical situations vs. simple inventory.

    Handling the propagation of energy between stages introduces significant aspects of materials science, and is one of the more interesting problems in maintaining a stockpile - how a device changes after sitting and bombarding itself with low-level radiation for years on end. Lots of trace isotopes appearing, opacity to x-rays changing (the internal power exchange medium - "high temperature photon gas"), _very_ neat stuff.

    As an aside, the utilities to work through these exchanges are what Dr. Lee is accused of losing his backup tapes of. Rather important stuff, as higher efficiency means more yield per launch platform.
  • They could just "buy a Cray", except that SGI long ago stopped any additional development of the T3E line. This line was exactly like this - high-speed Alpha CPUs (600 MHz on the -1200s) and an incredibly fast, low-latency interconnect. These machines still hold 24 of the top 50 spots on the top500 list [top500.org]. Not bad for a computer designed in the mid-90s and last revised in early/mid-98. If they had allowed a continuation of the line (instead of promoting their SGI Origins instead), you would've been hearing Cray as opposed to DEC (err... Compaq). They're really just doing a good job of filling the gap that SGI/Cray left wide open for them.
  • I agree that Compaq is doing great work when it comes to servers, especially with the Alpha stuff.. but I cannot and will not forgive them for the poor craftsmanship, quality, and engineering behind any single Presario you can pull out of Compaq's arse. I personally have a Presario 4640 (P2 266): no fan on the CPU, but a case fan in the front blowing across it. This is fine, except that they put the shortest possible floppy cable they could in the box, and to fit it was stretched across the surface of the fan, completely blocking airflow. How is that for engineering? Speaking of Presarios, has anyone gotten Linux to run on a model near mine? Any version of RedHat from 5.2-6.1 has a kernel oops when formatting a Linux partition during the install. I got Slackware 3.5 up on it, but every 2-10 minutes it complains of /dev/hda being out of sync and restarts the drive. You subsequently hear the drive spin down and back up, halting disk IO for a good 5-15 seconds. The drive is fine so I can't figure it out... any suggestions would be greatly appreciated.

  • One of the reasons is studying the decay of the current stock of nuclear armament. There is no point in stockpiling your nukes if they cannot be used in a few years. I believe this is the main reason, actually.

  • You do this so you don't have to actually set off a nuclear device to predict how it will perform. You must do a number of real nuclear tests to get baseline information. After that, you can do computer simulations of similar explosions. These simulations are VERY processor-intensive (like weather prediction) and require these large parallel systems to compute.

    The U.S. does the same thing with massively parallel systems at Sandia National Labs and Lawrence Livermore National Lab. Check out the list of the top 500 supercomputer sites in the world -- http://www.top500.org/ [top500.org] -- to see who's doing this kind of thing.

  • Even if this is a powerful machine, I'm afraid it will need a VERY long time to boot...
    Especially on NT: Windows has detected a new CPU... please insert the Windows NT CD-ROM... Windows has detected a new CPU...
  • The cluster is made of a variant of the ES40, which is a 4 CPU box.

    The Interconnect is Memory Channel 2; 2GB/s with less than 2 microseconds latency.

    So yes, this is a "real" supercomputer. :)


    --
  • The machine will be built out of 4-CPU nodes based on the ES40. The OS will be a customized version of Tru64 5.0, which has VMS-style clustering. This will make the entire cluster appear as 1 machine, i.e. the OS only has to be installed one time. The CPUs will be the new 21364, aka EV7.

    --
  • Some info on the technology used in this machine: http://www.digital.com/hpc/ref/ref_clusters.html [digital.com]

    Of particular note is the Memory Channel 2 interconnect they are using, which gives throughput of 2GB/sec with an amazing latency of less than 2 microseconds.
    --

  • I would rather have them simulate it than to start nuking the Nevada and New Mexico plains again.
  • Well, when I think about it, it is amusing that I've been duped -- before the amusement becomes irritation. Not with the original troll, but with oneself. I couldn't believe I could be so stupid as to respond to it.

    I don't think that Trolls should be ignored. If there is a good Troll, I like to respond in a humourous way. This time I failed.

    I know what you mean about the moderators not understanding insightful. I've posted about one insightful comment, and had dozens moderated up as insightful. What I don't quite follow is why you're so angry about it. Ordinary people get to moderate. People are fallible. Not everyone knows what insightful means. Hell, most people even get the meaning of the word "instantaneously" (to choose a random example) wrong too. This is because nobody knows everything. This does not give you just cause to correct minor errors with pedantry. It is no reason for swearing. And if you think good trolls should be encouraged, then GREAT! Respond to them with anger. That's what they're there for.

    And as you can see, I (Neil Sluman) have responded with my name. Happy now?
  • "While Intel-based designs clearly dominate the computing market, Lipcon said there is very little overlap between the two technologies because Alpha does not run on any Windows-based systems."

    Once again, the difference between architectures and software slips through the grasp of the media...

    Also, I could have sworn that there were restrictions on the computational power that we could export from the U.S. Something that breezes through nuclear calculation could probably brute-force crack most encryption methods in an afternoon...

  • by coreman ( 8656 ) on Friday March 03, 2000 @04:24AM (#1228639) Homepage
    It's not for analyzing, it's for simulation and emulation to verify designs without physical testing. The simultaneous equations needed to do these things are pretty extensive, and a hugely parallel processor is very useful. Remember, all the interesting things happen in the first millionth of a second; beyond that it's an expansion/compression issue of the blast propagation.

    Where would this be placed in the current supercomputer ranking?
  • by Duxup ( 72775 ) on Friday March 03, 2000 @04:19AM (#1228640) Homepage
    My question is:
    Why would you want to put so much into analyzing nuclear explosions?
    I can see for weapons testing and maybe just out of scientific curiosity. Are there any other reasons anyone can think of?
  • We did. In my company, we need computers able to crunch lots of data in a very short amount of time. We now have a 16-CPU SGI Origin. We are considering going for a Beowulf-type machine, because for a similar amount of CPU power, the cost is 5 times less (at least).

    I'm wondering if we can get a Beowulf cluster going for a cheaper price with similar speeds.
    These Alpha boxen will probably run Tru64 in a configuration similar to Beowulf, that is, a cluster.

  • by lovebyte ( 81275 ) <lovebyte2000@NOSpAm.gmail.com> on Friday March 03, 2000 @04:46AM (#1228642) Homepage
    More info on the French CEA website: http://www.cea.fr/actu/html/61_1.htm [www.cea.fr], in French.

    Quick translation:
    ..... The power of 5 teraflops is obtained by the use of the Compaq Alphaserver SC series of supercomputers.....
    The installation of this supercomputer ..... is the first of three steps in the realisation of the nuclear weapon simulation centre. The second step, towards the year 2005, will see an increase to a power of between 30 and 50 teraflops and the last step, 2009, to a machine of about a 100 teraflops.
    .....

  • by eyeball ( 17206 ) on Friday March 03, 2000 @06:03AM (#1228643) Journal
    What overkill. I can simulate nuclear blasts in just a few hundred clock cycles:
    #include <stdio.h>

    int main(void)
    {
        printf("Goodbye, world!\n");
        return 0;
    }

  • by MattMann ( 102516 ) on Friday March 03, 2000 @04:43AM (#1228644)
    Why would you want to put so much into analyzing nuclear explosions?

    they're trying to find the optimal distances to heat the following foods for a light snack:

    • s'mores
    • toasted marshmallows, straight up
    • mac'n cheese
    • tea, Earl Grey, hot

    the project got kicked off accidentally when French Echelon intercepted and misspelled this decrypt from the American Sec. of Defense: "the best way to heat these foods is unclear". The Spanish intercepted these French intercepts and are still pondering a suspicious code phrase: "noleche".

  • by MattMann ( 102516 ) on Friday March 03, 2000 @04:50AM (#1228645)
    Mhz only has any validity as a benchmark within an architecture. To compare across architectures, you must use bogomips!
  • by Amphigory ( 2375 ) on Friday March 03, 2000 @04:22AM (#1228646) Homepage
    Why do otherwise knowledgeable people persist in using clock speed as a way of rating CPU speed?

    Repeat after me: Mhz only has any validity as a benchmark within an architecture. And even that validity is limited. A 400Mhz PII is NOT 33% faster than a 300Mhz PII. It's maybe 10%. To talk about Ghz Alphas as though they are at all similar to Ghz Intels is crazy.

    If you want to share CPU benchmarks on something like this, talk about SPECint and SPECfp. Not MHz.

    --
