Supercomputing IBM

IBM Provides Access to Blue Gene On Demand 146

neutron_p writes "IBM's world-renowned Blue Gene supercomputing system, the most powerful supercomputer, is now available at the new Deep Computing Capacity on Demand Center in Rochester, MN. The new Center will allow customers and partners, for the first time ever, to remotely access the Blue Gene system through a highly secure, dedicated Virtual Private Network and pay only for the amount of capacity reserved. Deep Computing Capacity on Demand will serve new commercial markets, such as drug discovery, product design, simulation and animation, and financial and weather modeling, as well as customers in market segments that have traditionally not been able to access a supercomputer at a price within their budgets. The system enables customers to obtain a peak performance of 5.7 teraflops."
  • 5.7 teraflops (Score:4, Insightful)

    by FTL ( 112112 ) * <slashdot.neil@fraser@name> on Saturday March 12, 2005 @10:19AM (#11919528) Homepage
    Amazing supercomputer. It's /.ed already...

    What's 5.7 teraflops in more familiar units? Like SETI@home workunits/day? By my calculations that's 1.5 workunits every second. Give or take. [cox-internet.com] By comparison the entire SETI@home network is currently running at 67 teraflops [berkeley.edu].
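    (For anyone checking that arithmetic: working backward from "1.5 workunits every second" at 5.7 teraflops, one classic SETI@home workunit comes out to roughly 3.8 trillion floating-point operations. That per-workunit figure is inferred from the parent's own numbers, not an official SETI value. A couple of lines of C reproduce the estimate:)

    #include <stdio.h>

    int main(void) {
        double flops = 5.7e12;       /* Blue Gene on-demand peak, from the summary */
        double flop_per_wu = 3.8e12; /* assumed FLOP per SETI@home workunit (ballpark) */
        printf("%.1f workunits/second\n", flops / flop_per_wu); /* prints 1.5 */
        return 0;
    }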

  • by farmhick ( 465391 ) on Saturday March 12, 2005 @10:23AM (#11919548) Homepage
    until all the Road Runner customers jump on it, and the bandwidth goes to hell. My 3-D real-time animation of last week's blizzard will slow to a crawl, and then I'll probably get a pop-up advising me to switch to that other supercomputer the Japanese made last year.

    Man, I hate when that happens.
  • Google? (Score:3, Interesting)

    by macpulse ( 823760 ) on Saturday March 12, 2005 @10:24AM (#11919550) Homepage Journal
    I wonder if Google will compete with this when they release their supercomputer grid/cluster to the world.
    • Re:Google? (Score:2, Interesting)

      by karvind ( 833059 )
      Very possible. But I wonder how the bandwidth between the processors will compare in the two cases (it will determine what kind of supercomputing applications can be run on them). Blue Gene is custom designed (each chip = two processors, four accompanying mathematical engines, 4MB of memory, and communication systems for five separate networks). Google, on the other hand, uses commercially available servers and hence may be able to offer the service a lot cheaper. (A rough bandwidth/latency comparison is sketched after this thread.)
      • >may be able to offer the service a lot cheaper.

        It'd better be cheap 'cause it'll certainly suck - either their supercomputing or the search engine. You can't do both low-latency supercomputing and fast response web search and webmail.

        They could, of course, add more servers and balance the load between their current and their new (HPC) services, but their current workloads must be pretty consistent (say, growing 0.3% a day) so they would really need dedicated HPC boxes.

        Besides, there's not much in their
    • Re:Google? (Score:3, Interesting)

      by jacksonj04 ( 800021 )
      IIRC the Google grid is mostly incapable of general computing, for reasons such as memory being allocated in 64MB blocks.
    • Google's supercomputers are built for very, very specific tasks, so I don't think Google will be able to offer something similar to this. Google's supercomputers could be compared to a very large private SETI@Home network (SETI@work?? :)).
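      (A rough way to see why the interconnect, rather than raw FLOPS, decides which applications fit a commodity cluster versus a custom machine like Blue Gene, as discussed in this thread: model one timestep as some local computation plus a neighbour exchange costing latency + bytes/bandwidth. Every number in the sketch below is an illustrative assumption, not a measured spec of Blue Gene or of Google's cluster.)

      #include <stdio.h>

      /* Back-of-the-envelope model: exchanging `bytes` with a neighbour costs
         latency + bytes / bandwidth. All figures are assumptions made up for
         the comparison, not specs of either system. */
      static double exchange_time(double latency_s, double bw_bytes_per_s, double bytes) {
          return latency_s + bytes / bw_bytes_per_s;
      }

      int main(void) {
          double halo    = 64 * 1024; /* 64 KB of boundary data per step (assumed) */
          double compute = 2e-3;      /* 2 ms of local number-crunching per step (assumed) */

          /* custom low-latency interconnect: ~5 us latency, ~350 MB/s per link (assumed) */
          double custom = exchange_time(5e-6, 350e6, halo);
          /* commodity gigabit Ethernet: ~50 us latency, ~100 MB/s usable (assumed) */
          double commodity = exchange_time(50e-6, 100e6, halo);

          printf("custom interconnect: %.0f%% of each step spent communicating\n",
                 100.0 * custom / (custom + compute));
          printf("commodity ethernet:  %.0f%% of each step spent communicating\n",
                 100.0 * commodity / (commodity + compute));
          return 0;
      }

      (With these assumed numbers the commodity cluster spends roughly three times as large a fraction of each step waiting on the network, which is why tightly coupled jobs favour the custom machine while embarrassingly parallel ones run fine on either.)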
  • by thank-u-for-sharing ( 843287 ) on Saturday March 12, 2005 @10:24AM (#11919552) Journal

    IBM Provides Access to Blue Gene On Demand.
    I would be interested if Penelope Cruz is wearing them!

  • by 3770 ( 560838 ) on Saturday March 12, 2005 @10:24AM (#11919554) Homepage
    I'll buy some time to run this program

    #include <unistd.h> /* for fork() */

    int main(void) {
        for (;;) fork(); /* fork bomb: keep spawning processes until the machine chokes */
    }
  • financial and weather modeling and also a number of customers in market segments that have traditionally not been able to effectively access a supercomputer at a price within their budgets.

    Sounds like it was written by a signed-up member of IBM's marketing department. It's a slightly odd line for a Slashdot summary.

    Nice advertising though, and an interesting proposition.

    The mainframe is dead... long live the mainframe
  • by JawzX ( 3756 ) on Saturday March 12, 2005 @10:26AM (#11919564) Homepage Journal
    Who's next to offer "pay as you compute" access to supercomputer-level systems? Apple? HP? Toshiba? Hitachi? Is this going to be a new market segment or just a flash in the pan? Are companies going to begin outsourcing computer time? Are there going to be giant compute centers in India housing huge systems crunching numbers for companies that would otherwise have planned to invest in a lower-level supercomputer for in-house use? Will this kill supercomputer/supercluster sales or drive them up?

    An interesting development for sure.
    • India can only provide low-cost, low-priority services. They don't even have reliable power there. You can tell how much companies care about something when they outsource it to India - it's pure proof that they don't care about providing tech support.
    • When they outsource computing time to India, all the computers in the US are gonna get together and make a peaceful demonstration... They would pop up BSOD's just before you click save... And Linux will join in with "OOPs... I told ya not to piss me off..."
    • Offering Supercomputer access on a pay for time spent basis is nothing new, even my computing department does it if you want to use the 'Big Machines'.

      It would be FAR more interesting if someone like http://www.distributed.net/ started offering commercial clients access to 'super-clusters' made of everyone on the internet who can dedicate some processor power and get paid in return. Of course they would need to work out a bunch of problems but it would be cool.

      And then I could finally start to make back s

    • Sun is offering a vanilla Solaris environment that pretty much anyone is familiar with. Is IBM able to deliver a vanilla RHEL/SuSE Enterprise environment on BlueGene? There is a slight difference between a custom-built supercomputer and a rack of standard Opteron and SPARC servers. It seems the other IBM services listed at the bottom of the article are more in-line with Sun's offering.

  • Finally (Score:5, Funny)

    by Anonymous Coward on Saturday March 12, 2005 @10:26AM (#11919566)
    Now I can compile Gentoo in under a day.
  • SUN (Score:2, Interesting)

    It reminds me of what SUN was talking about in this. [com.com]

    Jonathan Schwartz must be happy to see that finally, his idea of selling cpu time is being realised (and how much he loves IBM ;))

    Anyway, even if, I guess, the price will be a lot higher than Jimi Hendrix (and that's saying something), giving a few more people access to some of the best-performing supercomputers [top500.org] is really nice.

    To sum up: nice business plan.
    • by Anonymous Coward
      I've heard Iran, North Korea and Al-Qaeda are showing a great deal of interest in buying CPU time. I think there's a big potential market out there.
  • by Anonymous Coward on Saturday March 12, 2005 @10:31AM (#11919591)
    I remember a friend of mine at IBM, back when they were in the process of choosing the name, telling me that he was pushing the "Blue Gene" name to piss off creation "scientists" and other religious nuts who don't believe in genes and the fact of evolution and speciation, DarwinOS-style. Just wait until "Dr." Richard Paley (a teacher of "Divinity" and "Theobiology" at Fellowship "University") writes more idiotic crackpot bullshit in his "Evolutionism Propaganda" column [nyud.net]. Let me quote:

    However, these propagandists aren't just targeting the young. Take for example Apple Computers, makers of the popular Macintosh line of computers. The real operating system hiding under the newest version of the Macintosh operating system (MacOS X) is called... Darwin! That's right, new Macs are based on Darwinism! While they currently don't advertise this fact to consumers, it is well known among the computer elite, who are mostly Atheists and Pagans. Furthermore, the Darwin OS is released under an "Open Source" license, which is just another name for Communism. They try to hide all of this under a facade of shiny, "lickable" buttons, but the truth has finally come out: Apple Computers promote Godless Darwinism and Communism.

    People like "Dr." Richard Paley make me proud to be an atheist, and the humor of IBM's and Apple's developers only keeps reminding me of it.
    • Re:Gotta love IBM (Score:5, Informative)

      by Nine Tenths of The W ( 829559 ) on Saturday March 12, 2005 @10:39AM (#11919637)
      People like "Dr." Richard Paley make me proud to be an atheist, and the humor of IBM's and Apple's developers only keeps reminding me of it.

      The joke's on you. The website is a parody.
      • ... we'd have to invent one."

        But one of us already did. B-)
      • Is it really? Maybe it's the hangover, but my sarcasm detector was strangely silent as I read that page.
        Have you got any links/proof it's a piss-take?
  • Think about it (Score:1, Interesting)

    by Anonymous Coward
    Most researchers would love to have free access to this much CPU power...

    Instead of charging a fee on entry, why don't they take a percentage of the discovery...

    And for fuck's sake, can we stop building these things to predict the weather? Or is it just a lame-ass excuse to cover the payment made to somebody else, with no one asking whether it really cost that much. If I want to know the weather I stick my head outside; predictions are, more often than not, wrong...

    Compile the GNU/Linux kernel in 0.1 sec
    • Comment removed based on user account deletion
    • by Ungrounded Lightning ( 62228 ) on Saturday March 12, 2005 @12:15PM (#11920159) Journal
      And for fuck's sake, can we stop building these things to predict the weather [...]

      They're not building it "to predict weather". They're building it to do really large computation jobs.

      Predicting weather is just one canonical example of a really hard and really useful thing to do that can be done well by throwing enough crunch at it.

      Some others are fluid/aerodynamic modeling, chemical geometry modeling (especially protein folding and drug/receptor interactions), graphics rendering, mechanical structure and motion simulation, and subatomic particle interactions.

      You'll notice that, in the blurb, they mentioned commercial uses of all of those except for the nuclear engineering applications.

      Given that applied nuclear physics is heavily regulated worldwide, legal users are likely to be funded well enough to have their own machines, and governments get worried about such info traveling on open networks, IBM probably doesn't see much market for that service - or at least not much that they can sell into. B-)
  • What OS is it running? In case it's not Linux (chances are it isn't), can one slap Linux on it? How would it (Linux) perform? BTW, what distro would perform best?
    • Re:just curious (Score:1, Informative)

      by Anonymous Coward
      RTFA:

      IBM's other US based IBM centers in Poughkeepsie, NY, and Houston, TX as well as the European-based center in Montpellier, France are accessible to customers worldwide via a secure VPN connection over the Internet. Clients have on demand access to over 5,200 CPUs of Intel®, AMD Opteron(TM) and IBM POWER(TM) technology based compute power to run the Linux, Microsoft Windows and IBM AIX operating environments. The new center in Rochester, MN introduces over 2,000 CPUs of IBM PowerPC® based
    • Re:just curious (Score:4, Informative)

      by Monx ( 742514 ) <MonxSlashNO@SPAM ... ossibilities.com> on Saturday March 12, 2005 @10:45AM (#11919662) Journal
      BlueGene runs Linux [devchannel.org]
    • Actually I want to know the performance drop when someone tries to load Windows for Supercomputers on it, to run Doom 3 at 10000 fps.

    • ... a beowulf cluster of THOSE puppies?

      For starters, think of the size of the network pipes you'd need between them. (Image of a bundle of optical fibers the size of a watermain.)

      Awesome!
    • That box that SGI made for NASA ran Linux; it had (IIRC) 1024 processors, so presumably Linux does scale. I don't think they used an out-of-the-box distro anyway, and the kernel was probably heavily patched.

      How good is AIX for this kind of job?
  • Finally... (Score:5, Funny)

    by Anonymous Coward on Saturday March 12, 2005 @10:33AM (#11919609)
    This should come in handy the next time I forget my password.
  • News? (Score:4, Insightful)

    by The-Bus ( 138060 ) on Saturday March 12, 2005 @10:34AM (#11919613)
    Is it really news? They've had commercials running on TV for this for weeks, if not months. If they were running commercials back then, the decision and announcement must have been made well before now. So why is this news now?

    Not that it's not a cool idea...
  • by Chris Kamel ( 813292 ) on Saturday March 12, 2005 @10:43AM (#11919654)
    book some time to play doom3...
  • I really just want to play Tetris on it.
  • "IBM's world renowned Blue Gene supercomputing system, the most powerful supercomputer..." The video card on it is crap.
  • by karmaflux ( 148909 ) on Saturday March 12, 2005 @10:52AM (#11919691)
    for the best folding@home score you will ever know?
  • So are they going to be doing Folding@Home [stanford.edu] when no one is using it?
  • I know let's take my 5.7 TeraFlops, your 5.7 TeraFlops, and his 5.7 TeraFlops and run a Beowulf cluster...
  • Huh! (Score:2, Insightful)

    by 1tsm3 ( 754925 )
    Did CowboyNeal just take over /. ?? 8 out of the 10 postings are his!!!
  • by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Saturday March 12, 2005 @11:07AM (#11919761) Homepage Journal
    Parallel Virtual Machine [ornl.gov]? Any of the Message Passing Interface [mpi-forum.org] implementations?

    Or does one need to re-write her/his software to use their own?

    • by Anonymous Coward
      I work at LLNL. BlueGene/L supports MPI, as well as a limited form of TCP/IP sockets.
    • Chunks of the environment are handed over to the customer, and they install and run whatever they like. It might be one of the various scheduling tools out there such as LSF, OpenPBS, PBS, MPICH, etc., or something completely in-house. To get an idea of what usually runs on these types of systems, check out the xCat home page [xcat.org] or the xCat mailing list (or here [usc.edu]).

      -L
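    (On the MPI point from the LLNL reply above: for readers who haven't seen it, a minimal MPI program looks like the sketch below. This is generic MPI C against the standard API; nothing Blue Gene-specific is assumed.)

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal MPI program: every rank announces itself. Standard MPI C API,
       no Blue Gene-specific extensions assumed. */
    int main(int argc, char **argv) {
        int rank, size;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        printf("hello from rank %d of %d\n", rank, size);

        MPI_Finalize();
        return 0;
    }

    (In practice MPI has largely displaced PVM on machines of this class, so code written against PVM would generally need porting; that is consistent with the LLNL poster's answer.)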

  • Google (Score:1, Funny)

    by Anonymous Coward
    Just wait until Google have capitalized on the Google bar's feature of "borrowing" CPU cycles from your computer when you're not using it (talk about grabbing candy from gullible people). Then they'll be able to compete with IBM and the others in super-/grid computing.
  • I know, I know, Moore's law and all that. Still, think about it....

    5.7 teraflops, that's just nuts. I mean, wow.

    Or maybe it's just me getting old; I remember when we were impressed by a VAX 785 upgrade to our 780.

  • by gmuslera ( 3436 ) on Saturday March 12, 2005 @11:22AM (#11919823) Homepage Journal
    Just ask it to make some tea
  • by neomage86 ( 690331 ) on Saturday March 12, 2005 @11:30AM (#11919863)
    Back in the 19th century major companies had a 'Vice President of Power', like we have a VP of IT. Then a few companies started making all the electricity in one place and rolling it out to where it was needed. It's always more efficient that way (economies of scale, and diminishing marginal returns can become negligible with proper management).
    Do you think IT will become just another commodity like electricity or water?
    • Do you think IT will become just another commodity like electricity or water?

      Funny that water, probably the ultimate commodity here on Earth, is being branded and costs more per gallon than gasoline.

      I'm no economist, but to respond to your question, I think the answer is yes and no: yes there could be a commodity aspect to it, but no, it won't be a commodity the same way electricity and water are.

      From the point of view of the rent-a-supercomputer customer, it's just like any other new system: I assume

  • by Anonymous Coward
    hehe I'd like to rent this to bring to the next LAN party... I'd be the game server and no one would lag ;)

    hmmm it's pretty damn big though. I guess I'd have to bring the LAN party to it.
  • What FPS games have been ported to Blue Gene? Are any of those multi-player?
  • by Ungrounded Lightning ( 62228 ) on Saturday March 12, 2005 @11:58AM (#11920055) Journal
    This is the same computing model as was used in large "computing centers" - such as those in universities - back in the 1950s-1970s:

    The machine you need is too expensive to buy yourself and then leave sitting around idle most of the time (like a pencil sharpener). So an institution buys and sets one up, and you rent chunks of its time. If the demand goes up the institution gets more rent and can buy upgrades.

    You get a machine fast enough to do your too-big-for-humans computing task in a short time (so YOU don't spend most of your time waiting to do YOUR next piece of work, like a pencil sharpener). You only pay for the amount you use.

    Billing by CPU seconds, I/O volume, memory usage (fast and files), etc.

    In the '50s you took your work to the machine, by the '60s remote terminals were becoming available, by the '70s packet-switching networks were making machines available across continents.

    And also by about the '70s you were starting to see both comm and crunch becoming so cheap that, for ordinary jobs, accounting by the slice no longer made economic sense. It was a better use of money to scatter (cheap) computers around and make them wait than to have only a few and make (expensive) people wait.

    Paying for comm by usage metering never caught on (too bursty, wastes human attention worrying about the effect on the bill, ...). Just buy the size of pipe you need to keep from being bottlenecked at peak load and leave it mostly idle. (You'd end up doing that by proxy anyhow - eliminate the middleman.) Client-server computing models moved institutions to a similar model for crunch and storage. General-user timesharing services gave way to networking services with unmetered shell accounts, which gave way to pure networking services, as the cheapening of computation evolved the personal terminal from a special purpose keyboard/display/comm box, first to terminal emulation on a dumb computer, then to one application on a progressively more powerful (though still small and cheap) computer functioning as a full-blown network node.

    But there are still REALLY BIG jobs where the economics of a shared utility make perfect sense. IBM was once a primary provider of machines to such utilities within educational and business institutions. Now it's largely a business service provider. It seems appropriate that they should recognize the opportunity and use it to make a profit by filling a gap at the high end of the computing market.
    • Yeah, I remember my dad telling me that back in the early days of computing how computers used to be so big that they filled rooms as large as..... oh wait, Nevermind [ibm.com].

      Kinda looks like a bowling alley too.

      • Yeah, I remember my dad telling me that back in the early days of computing how computers used to be so big that they filled rooms as large as..... oh wait, Nevermind.

        Gene Amdahl - IBM's architect for much of the mainframe era - was a lower-level worker at an early company before he went to IBM. (Honeywell, I think it was, or maybe Univac.) While working there he watched in amazement as a computer was designed and delivered to a research institution and it wouldn't fit through the doors. They had to te
    • Actually, today's supercomputing centers still operate in exactly that way - they buy a big computer, people apply for time on it, you submit your jobs, and the time gets charged against your account.

      The only difference is that, I think, all supercomputing centers at this time are government/university-funded, so there is no transfer of actual money in most cases.

      The really new idea in grid computing is not that many users share one machine, but that many users share many machines.

  • by 88NoSoup4U88 ( 721233 ) on Saturday March 12, 2005 @12:20PM (#11920205)
    Blue Gene runs Doom III with a comfortable 70 frames per second ;)
  • Money talks
    But it don't sing and dance
    And it don't talk
    As long as I can have you login to me
    I'd much rather be
    Forever in blue genes.
    Honey is sweet
    But it ain't nothing next to Big Blue's treat
    And if you'll pardon me
    I'd like to say
    We'll do okay
    Forever in blue genes
    Maybe tonight
    Maybe tonight by the drive array
    All alone you and I
    Nothing around but the sound
    Of my drives and your sighs
    Money talks
    But it don't sing and dance
    And it can't crunch
    As long as I can have you logged into me
    I'd much
  • On a related note... (Score:3, Informative)

    by Cheerio Boy ( 82178 ) on Saturday March 12, 2005 @01:02PM (#11920500) Homepage Journal
    These guys [cray-cyber.org] offer open access to the Cray machines they have online. You have to get permission from them to do certain things, but that's still a small price to pay for access to a Cray.

    Not exactly the same thing as the article but definitely a way for the average joe to learn about supercomputers without building one himself.
  • My birthday is coming soon ...
  • Can I use it to compile my HL2 maps? Please?
  • Do they create a virtual machine with a specific number of CPUs, boot your choice of OS, and let you work on it, or do they take your (gasp!) Java apps and run them? How do you really use a grid?

    I had a list of kernel config options to try in Linux, and wanted to compile a kernel for each option. I thought a grid would be great for it... maybe 30 CPUs for an hour should do it... and I didn't know if I could get multiple virtual machines too, say 2 of 15 CPUs each. But for Sun, the minimum amount was something like 100
  • some time on the IBM computer. Increasingly, the time lag between stories being reported on major news sites and their being posted on /. is getting absurdly long.

    I read about this some time yesterday here:

    http://www.theregister.co.uk/2005/03/11/ibm_rents_bluegene/

    Come on Taco, please do something to fix /. before it's too late. What with the duplicates, the blatant advertising (a well-known story poster everyone calls Roland), and the piss-poor editing, /. has become a parody of its former self.
