Japan's New Supercomputing Toy

deman1985 writes "As reported by UPI, Japan has unveiled its fastest supercomputer yet. Assembled from Hitachi and IBM components, the new system sports a total performance of around 59 trillion calculations per second and comes at a cool five-year lease price of $30 million. Pictures of the beast can be found at Mainichi Daily News."
This discussion has been archived. No new comments can be posted.


  • Is it getting the most of that computing power by running Windows?
    • Re:I wonder (Score:2, Funny)

      by Cygfrydd ( 957180 )
      Reportedly, it is just fast enough to use Windows Vista’s Aero Glass GUI.
  • Yes.... (Score:1, Redundant)

    by JoeLinux ( 20366 )
    But imagine a Beowulf cluster of these.

    *sigh* I miss when that was popular...I was in college, dating a total bitch, living off of ramen, playing CS until my grades started to suffer, and getting four hours of sleep a night...good times, good times.
    • Re:Yes.... (Score:2, Insightful)

      by j79 ( 875929 )
      *sigh* I miss when that was popular...I was in college, dating a total bitch, living off of ramen, playing CS until my grades started to suffer, and getting four hours of sleep a night...good times, good times.

      Shit. That's my life right now, and trust me...it ain't good times...

    Then again, maybe a few years down the road, when I have a shit job, married to a total bitch, living off of ramen, and still play games till the wee hours in the morning, I'll be able to reflect and think, "yeah...good times..."
  • Beowulf Cluster (Score:1, Redundant)

    by Nazmun ( 590998 )
    Imagine a Beowulf cluster of these. And also, does this run Linux?!?!?

    There, I've said it...you know someone would have!
  • According to top500.org, the fastest computer is an IBM Blue Gene/L with 280 TeraFlops. This Japanese team would have been #1 about a year ago.
    • The article didn't say that it was the fastest in the world, merely the fastest yet built in Japan.

      Yes, Blue Gene/L still reigns supreme.

      -WeAz
    • True, the biggest BlueGene/L implementation does best this number. Also interesting to note, this thing has BlueGene in it: " The supercomputer, consisting of two systems -- Hitachi's multipurpose supercomputer with a peak performance of 2.15 terra flops and IBM Japan's Blue Gene Solution with a peak performance of 57.3 terra flops -- is capable of making about 59 trillion calculations per second, the Mainichi Shimbun reported Wednesday. "
      • Hitachi's multipurpose supercomputer with a peak performance of 2.15 terra flops and IBM Japan's Blue Gene Solution with a peak performance of 57.3 terra flops -- is capable of making about 59 trillion calculations per second, the Mainichi Shimbun reported Wednesday.
        I wonder how many nanoseconds it took this new supercomputer to add 2.15 to 57.3 and round it off to 59.
  • And Yet (Score:1, Troll)

    by Herkum01 ( 592704 )
    It still cannot run Windows Vista...
  • ask public? (Score:3, Interesting)

    by tomstdenis ( 446163 ) <tomstdenis@gma[ ]com ['il.' in gap]> on Wednesday March 01, 2006 @01:19PM (#14827784) Homepage
    They're gonna ask the public for research themes? ... AFTER THEY BOUGHT IT???

    I'd love to see this from top500.org

    name,where,how many processors,average FLOPS,max FLOPS,***actually being used FLOPS***

    Then sort it based on the latter (a quick sketch of that sort follows this thread). :-)

    Tom
    • Yes, hopefully the response won't be something like "Prove that creationism is true." Boy, that'd be a fun one to calculate...

      Also, what's the point in having the Hitachi around when the IBM sitting right next to it is 25 times faster?
    • > They're gonna ask the public for research themes? ... AFTER THEY BOUGHT IT???

      This is not unusual.

      Universities around the world are filled with expensive equipment that doesn't get much use.
      What usually happens is that somebody has a vague idea for a research project and applies for a funding grant without expecting to get it. Then, three years down the track, the grant gets approved, you have to buy something you never expected to get, and you're not sure what to do with it.

      Part of my job is
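
If top500.org ever published an "actually being used" column as suggested above, the sort itself would be nearly one line. A minimal sketch in Python, with invented records and a hypothetical used_flops field (Top500 publishes no such utilization figure):

```python
# Sketch of the sort proposed above; the machines and the "used_flops"
# field are invented for illustration only.
machines = [
    {"name": "A", "site": "X", "cpus": 1024,  "max_flops": 59e12,  "used_flops": 6e12},
    {"name": "B", "site": "Y", "cpus": 65536, "max_flops": 280e12, "used_flops": 210e12},
    {"name": "C", "site": "Z", "cpus": 5120,  "max_flops": 40e12,  "used_flops": 35e12},
]

# Rank by the fraction of the machine actually doing work.
for m in sorted(machines, key=lambda r: r["used_flops"] / r["max_flops"],
                reverse=True):
    print(m["name"], f'{m["used_flops"] / m["max_flops"]:.0%} utilized')
```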
  • That's a lot of calculations per second, but other than satisfying man's desire to have the biggest or fastest thing on the block, what possible uses of real value, other than decryption, could this thing bring to the table? Quicker searching for prime numbers? Weather modeling? SETI@Home nonsense?

    And does anyone have an update on Japan's supersonic jet project? Last story I remember was a model crashing in Australia. Go Japan!

    • You're Bill Gates and I claim my five pounds.
    • What uses? TFA said particle accelerator research. So they're studying physics. What's wrong with that?
    • EQ2 at 30FPS :P

      (i kid, i kid)
    • Biological simulations.

      Protein folding, modeling cell machinery, and simulations of other biological systems at the molecular level. Think about the number of calculations involved in modeling the interactions between a few million atoms [lanl.gov] in something as simple as the ribosome. Now imagine adding the water and solute environment that surrounds these sorts of molecules. Oh, you could ignore the water and do the simulation in a vacuum, but let's remember that a driving force in protein conformation is hydrophobicity.
    • I believe traditional uses are in particle physics and nuclear testing, among other things. Knowing what happens when you slam two particles together, along with the reaction of the surrounding environment, takes an extraordinary number of calculations.
    • I guess if you need a supercomputer why not build the fastest?
    • Supercomputers are used in weather simulations, weapons design simulations, energy grid modeling, and so on. Louisiana's vulnerability to hurricanes was established around 1999 by supercomputer simulations; the problem is that politicians apparently chose to ignore the threat. One man is trying to use supercomputers to determine whether a hurricane could be affected with weather modification tools as it starts to form.
    • Protein folding. Proteomics makes the Human Genome Project seem like an insignificant ant next to a Boeing, both in terms of required CPU power and in regards to potential benefit to the human race.
  • Give it five years and it'll be a commercially available laptop, ten at the most.
    • Good luck with that. Moore's law is dead.
      • The number of transistors is still doubling at a hectic pace. Maybe you are confused (as are most people) about what Moore's law actually is.

        Now with multi-core processors becoming the norm, and 32 cores per CPU not that far off, it is easy to see that Moore's law is going to keep going strong for at least 5 more years, if not more.

        Since these machines are already distributed multi-node systems, a single CPU with 32 cores is going to be generally faster than 32 CPUs in a distributed node configuration.

    • I'm writing this from my Earth Simulator laptop that will be commercially available next year.
  • You know, I think there should be a kind of "size cap" on these stats. A computer should be ranked higher if it can squeeze more performance out of so many cm2 worth of die or something. Otherwise you can just keep making computers more powerful by just adding more and more nodes. That does not excite me *shrug*. -naeem
  • Now what to do with it?

    How about installing one over at Slashdot HQ?
    You guys need it for all the people who keep missing their chance at getting the first post.
  • Does 59 trillion calculations approximately equal 59 teraflops?
    • Does 59 trillion calculations approximately equal 59 teraflops?

      Good point. I live in Mexico, and here the "illions" advance in steps of 10^6. So here, a billion is 10^12, a trillion is 10^18, etc.

      And actually I don't know how it's handled in different countries, so yes, it's confusing. Using mega, giga, tera is much more specific and doesn't lead to confusion.
    • Re:teraflops (Score:3, Informative)

      by Secrity ( 742221 )
      59 trillion calculations a second equals exactly 59 teraflops, of which 57.3 teraflops come from a smallish IBM Blue Gene (a quick unit check follows this thread).

      Lawrence Livermore National Laboratory has an IBM Blue Gene that does 280.6 teraflops, or 280.6 trillion calculations a second.
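
To pin down the unit point raised in this thread, here is a minimal sanity check; it assumes the article's short-scale "trillion" (10^12) and simply redoes the article's addition:

```python
# Minimal unit check (a sketch; assumes the short scale used in the article).
SHORT_SCALE_TRILLION = 10 ** 12   # US/UK English "trillion"
LONG_SCALE_TRILLION = 10 ** 18    # e.g. Spanish "trillón"
TERA = 10 ** 12                   # SI prefix: unambiguous everywhere

ops_per_second = 59 * SHORT_SCALE_TRILLION
print(ops_per_second / TERA, "teraflops")  # 59.0 -- "trillion" == "tera" only on the short scale
print(round(2.15 + 57.3, 2))               # 59.45, the article's "about 59"
```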
  • Isn't that No. 6 in the world?
    The list [top500.org] is here.
    • Processor - PowerPC 440 700MHz; two per compute node - Lowpower allows dense packaging; better processor-memory balance

      Not particularly powerful CPUs individually, but I guess if you cram enough of them together it adds up.
      • The last part you quoted was very important. Supercomputers often spend more time waiting for memory than they do with actual computing. The CPU doesn't need to be so fast for this.
        • Indeed, that is one of the reasons IBM is so proud of their new Cell architecture. It was designed to reduce the latency between CPU and RAM, perhaps more out of necessity than by choice. IBM wanted to reduce the complexity of the processor by moving a lot of the out-of-order execution, register renaming, branch prediction, etc. logic off the silicon and into the compiler. Transmeta tried the same thing in the past, but found it only compounded memory latency issues. It works out for the Cell architecture
          • Which is all well and good, except that the Cell architecture gives the SPEs really fast access to a very small pool of memory, and then decent, but not astounding, access to the rest of memory through the master core. This works very well for doing a lot of number crunching on a small bit of digital media data. However, there are a lot of supercomputing problems that are bounded by the amount of memory addressable by each processor. Even Blue Gene's quarter gig per node is a problem for many codes.

            If the
    • Yes, it runs Linux! Well, sort of... the I/O processors run Linux; the actual compute nodes run a proprietary low-overhead executive. Linux is a general purpose OS, and interrupting CPUs for system processes and timer ticks has a huge cumulative impact on highly parallel tasks. Ideally, one would want the compute nodes to get a chunk of work and work on nothing else until that chunk of work is finished, i.e. preemptive multitasking is actually a liability for massively parallel machines.
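
The parent's point about timer ticks is easy to illustrate with a toy model: each node loses a little random time to interruptions, and a barrier waits for the slowest node, so the machine-wide slowdown grows with node count. This is a sketch with made-up constants, not a measurement of Blue Gene:

```python
# Toy model of OS noise on a massively parallel machine (illustrative only).
import random

def barrier_step_time(num_nodes: int, work: float = 1.0,
                      ticks: int = 500, jitter_prob: float = 0.01,
                      jitter_cost: float = 0.02) -> float:
    """Each node computes for `work` seconds, losing `jitter_cost` seconds
    whenever an interruption fires on one of its `ticks` opportunities.
    A barrier step finishes only when the slowest node does."""
    slowest = 0.0
    for _ in range(num_nodes):
        lost = sum(jitter_cost for _ in range(ticks)
                   if random.random() < jitter_prob)
        slowest = max(slowest, work + lost)
    return slowest

random.seed(1)
for n in (1, 16, 256, 4096):
    print(f"{n:>5} nodes: {barrier_step_time(n):.3f} s per barrier step")
# The expected per-node loss is identical at every scale, but the machine-wide
# time tracks the unluckiest node -- which is why the compute nodes run a
# stripped-down executive instead of a general-purpose OS.
```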
  • Interesting slanted design of the racks.

    Is there a design reason for that (airflow, etc.)?

    Or is this marketing wanting to be "different"?
  • ...but imagine a Beowulf cluster of them!

    Hey, somebody had to say it!
    • You know, these comments were funny. In 1998. Eight years later, they're dumb. No. Seriously. No one is laughing.
  • "The institute will ask the public to propose specific themes of research activities using the supercomputer system." Seti@Home! Duh...
  • by muhgcee ( 188154 ) *
    "Toy" seems pretty appropriate here. They seem to have bought it without fully knowing what they were going to do with it yet: "The institute will ask the public to propose specific themes of research activities using the supercomputer system."
  • is to allow for more FF-XI servers. Duh.
  • by advocate_one ( 662832 ) on Wednesday March 01, 2006 @01:35PM (#14827992)
    Have we had another moderation system failure... the number of posts making it to my +3 browsing has dropped dramatically in the last two days... I'm expecting there to be almost minimal moderation after today and it to be a general trollfest again by Friday...
    • I've noticed the same thing. There seem to be a lot of down-mods burning the mod points. I've seen stories out for hours with a total of zero comments at Score:4 or 5.
    • Having read the replies so far to this thread there simply isn't anything worthy of being modded up, in particular there is a steady stream of predictable Beowulf comments, which just aren't funny or worth posting anymore.

      At least for this thread, I'd say it's not the moderation system failing; there just isn't anyone with anything intelligent to say posting anything.

      Slashdot posters seem to be the thing cratering.

      Articles on IBM throwing together another giant collection of CPU's for someone with money to b
    • ...is that with fewer /. comments (+4) to obsessively read through, I find I suddenly have more time to get other stuff done. I, for one, hope that this less-cream-filters-to-the-top 'feature' stays. :)
    • I've noticed the same. At first I thought it was because I changed my comments preferences but when I set them back to the way they were, no change.

      It used to be that browsing at +3, you would get about 10-20% of the total number of comments (e.g. 70-120 comments +3 or higher when total comments = 700). I thought at first (in typical slashdotter fashion) that there was a massive conspiracy of rogue mods who just downmodded at will. But looking at the mod scores, it seems like it's just that there aren't
  • Hi,

    Given Moore's Law, and given increasing performance gains in computer architecture and new work on algorithms, how likely is it that one of these days one of these machines (or one of their exponentially more powerful progeny) bootstraps itself into a "Singularity", an AI which at the point of self-awareness becomes almost instantaneously god-like?

    I know that this has been the stuff of science-fiction wet dreams for decades, but will this old idea - like so many other ideas first found in science-fiction
    • Computers are all the same - some of them have more memory than others and some of them are speedier than others, but they are functionally the same. Given enough time and memory, any computer can simulate any other computer. Intelligence and speed are separate concepts. Imagine we had a chat with some aliens who lived 10 light years away. Their answers would be no less intelligent even though we had to wait 20 years to receive a reply.

      So, intelligence is no more likely to emerge all of a sudden from the lat
  • The supercomputer, consisting of two systems -- Hitachi's multipurpose supercomputer with a peak performance of 2.15 terra flops and IBM Japan's Blue Gene Solution with a peak performance of 57.3 terra flops -- is capable of making about 59 trillion calculations per second...

    You don't say. I wonder if they put those numbers into the machine(s) to get that sum...
  • Ok, I haven't been keeping up --- this Japanese machine is, I gather, massively parallel. Suppose I wanted to find out which single processor was the speed king for floating point calculations. Is it as simple as sorting for the highest number on SpecFP2000?
    • First you would have to define what you mean by "single processor". The fastest addressable processor is probably Fujitsu's VPP series node, but it's not really one processor; it's 8 IBM POWER4+ processors ganged together with a vector coupling facility. The fastest HPC processor that sits on a single chip is the Cray X1E MSP chip, which has two 18 GFlop cores per chip. This of course ignores specialized chips like DSPs, most notably the IBM Cell chip. Cell has a much higher flops performance than any chips us
      • I'm sorry. I got that wrong. It's not the Fujitsu VPP (which is no longer being made), but rather the Hitachi SR11000, which gangs together a bunch of POWER4s into a vector processor.
  • Under Moore's Law, the price of computing power halves every 18 months. So that means in 5 years, the price ought to be about 1/(2^(5/1.5)) =~ a tenth of that (a quick check of that arithmetic follows this thread).

    That means in five years I'll be able to afford it on my desktop for about what I make an hour.

    Woohoo!

    I hope it comes with a better mouse. I have one of those mechanical ones, and it keeps getting granola in it.
    • No, Moore's law claims the transistor density (or now the data density) doubles every 18 months. It turns out that increases the problems you can solve in a non-linear way. A great example of this is the hardware multiplier. Once transistor density got to the point where you could put a hardware multiplier on the chip, you could do a 16-bit multiply about 60 times faster than using the add/shift technique, so you can do far more complex work. There are also massive gains in DNA sequencing once you could build a
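
For what it's worth, the price arithmetic above checks out under its own "halves every 18 months" assumption; it is the assumption, not the arithmetic, that the reply disputes:

```python
# Redo the "price halves every 18 months" arithmetic. The halving premise
# is the grandparent's assumption, not what Moore's law actually states.
def halving_factor(years: float, halving_period: float = 1.5) -> float:
    return 1 / (2 ** (years / halving_period))

print(halving_factor(5))           # ~0.0992, i.e. roughly a tenth
print(30e6 * halving_factor(5))    # the $30M lease would come to ~$3M
```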
  • by BitwizeGHC ( 145393 ) on Wednesday March 01, 2006 @01:41PM (#14828070) Homepage
    In other news, South Korea unveiled its new supercomputer: KEKEKE ^____^
  • How sad is it that I was thrilled they included a link to a picture of it?
  • It's not a toy, OK! It's just able to run Counter-Strike at really high framerates, because scientific simulations and game mechanics are very similar operations...

    Now if you'll excuse me, my aimbots need seeing to.
  • I'm just waiting for that beast to get slashdotted, any minute now. Wait, they didn't run the web server on that one, did they? High energy accelerator research... Hrrmmmppff..
  • I think when the lists are released this year, the top one hundred computers will be in the 10 to 300 teraflop range. With the Cell CPU peaking at a quarter-teraflop, one teraflop is merely high performance these days.
  • Their tag line will be "Finally, something fun to do with 59 trillion calculations per second!". But still nobody will buy it, because there really isn't anything fun about a Mac, what with virtually no game support.
  • what kind of frame rate does it pull on HL2?
  • Did you see the pictures? It's powered by millions of tiny little Jawas!
  • 1. It's the fastest in Japan, aka it should beat the Earth Simulator (40 TFlops, ranked #7 last November).
    2. It's basically yet another IBM Blue Gene, but with a much weaker Hitachi attached to it.
    3. The #1-ranked BlueGene/L running at Livermore hits 280 TFlops.
    4. IBM PPCs dominate the high end of the Top500.
  • ...a Beowulf .... AAAACK!!!
  • This is not, as a misinterpretation of the summary might suggest, the fastest supercomputer on the planet, just the fastest one in Japan. That title of world's fastest is still held by the BlueGene at Lawrence Livermore, which boasts something like 350 teraflops peak. Interestingly enough, this new machine in Japan is a smaller BlueGene computer: same architecture, fewer racks.
  • How many Libraries of Congress will it hold?
    How many football fields does it cover?
    I need some stats I can relate to! :)
  • Yes, the Linpack performance is better, but I think 38 TF of the Earth Simulator is likely to get a lot more real work done than 57 TF of Blue Gene. The Blue Gene is a really elegant design for creating a very inexpensive supercomputer, which means one can affordably buy a very large system. However, the balance of the system is a little weak in terms of memory bandwidth and interconnect bandwidth, as compared to the Earth Simulator.

    Furthermore, scaling most codes to the tens of thousands of processors of a Blue Gene
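
The "balance" the parent describes is often summarized as bytes of memory bandwidth per peak flop. A minimal sketch follows; the per-node figures are rough approximations chosen only to illustrate the ratio, not authoritative specs for either machine:

```python
# Toy "machine balance" comparison in bytes per flop (illustrative figures).
def bytes_per_flop(mem_bw_gb_per_s: float, peak_gflops: float) -> float:
    """Bytes of memory bandwidth available per peak floating-point op."""
    return mem_bw_gb_per_s / peak_gflops

# Rough, from-memory per-node numbers (assumptions, not official specs):
vector_style_node = bytes_per_flop(mem_bw_gb_per_s=256.0, peak_gflops=64.0)
blue_gene_style_node = bytes_per_flop(mem_bw_gb_per_s=5.5, peak_gflops=5.6)

print(f"vector-style node:    {vector_style_node:.1f} bytes/flop")
print(f"Blue Gene-style node: {blue_gene_style_node:.1f} bytes/flop")
# Memory-bound codes extract a larger fraction of peak on the higher-ratio
# machine, which is the sense in which fewer "paper" flops can do more work.
```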
  • I hear the NOS sticker is good for two teraflops, and if they put a big unpainted wing on top of it, that's another three teraflops. Stage III, including the watermelon-shooter can exhaust pipe, takes it up an additional ten teraflops!
  • ...that it's just fast enough to keep up with Microsoft's upcoming Vista OS.
  • Sounds a lot like Intel's first dual processors. Let's just slap together a Hitachi and an IBM, add up the TFlops, and claim victory.
  • Terra is also the Latin word for earth. Tera stands for the factor 10^12.
  • Any other Japanese and English speakers out there find Mainichi Daily news to be a little redundant? :-p
  • Does it use AJAX?
  • can it run Vista?
  • Do they offer micropayment options for really small calculations?
