Stanford Gets First Sun Blackbox

miller60 writes "The Stanford Linear Accelerator Center (SLAC) will be the first end user to get a Project Blackbox portable data center from Sun Microsystems. The 20-foot shipping container (which will be white, not black) will sit on a concrete pad behind the computer building, with hookups to power, a 10-gigabit network connection, and a chiller located on an adjacent pad. The 'data center in a box' will allow SLAC to expand its computing capacity even though its existing data center has maxed out its power and cooling."

  • The Market? (Score:5, Funny)

    by TheRaven64 ( 641858 ) on Saturday June 23, 2007 @10:22AM (#19619545) Journal
    People were complaining that Sun didn't have a decent portable computer (they sold a few Tadpoles, but nothing they made themselves), and this is what they came up with. Apparently it's meant to be an iPhone killer.
    • Re: (Score:3, Funny)

      I have to admit it's not exactly pocket-sized, although if you have combat pants you could probably fit it in a side pouch.
      I haven't found anything that doesn't fit in them side pouches.
      • Damn straight. I'd rather have one of those any day than a frickin iPhone...
        A Unix datacenter and more resources than any one person will probably use computing in 20 years... hell yeah!
        Who needs an iPhone when you could just hook up a laser (see TRON) and live in the computer... And let your friends join in TOO!
        GR
    • by deniable ( 76198 ) on Saturday June 23, 2007 @11:06AM (#19619901)
      It depends on where you leave the iPhone. Hell, a 20' container could be an iPhone mass murderer.
    • Well, it would be the first portable with a decent display.
    • by Ilgaz ( 86384 ) *
      A SPARC portable machine/tablet with some extra security chips and devices (non-military/spy grade) pre-installed, along with the latest Solaris, could really sell. People would buy it for reliability and security -- especially companies, and even home users who are really tired of unreliable laptops. It MUST be end-user friendly.

      What they're trying to do now is sell a bare-minimum DESKTOP -- a nothing-included machine that just happens to have a SPARC chip and costs more than anything Intel-based.
      http://www.sun.co [sun.com]
  • by ArcSecond ( 534786 ) on Saturday June 23, 2007 @10:23AM (#19619549)
    Wow! Can you imagine a Beowulf cluster of these? :)
    • by 6Yankee ( 597075 ) on Saturday June 23, 2007 @10:32AM (#19619627)
      Umm, a container ship? That'd be a portable Beowulf cluster of portable data centers.
    • Re:the obligatory... (Score:5, Interesting)

      by Yvanhoe ( 564877 ) on Saturday June 23, 2007 @11:30AM (#19620049) Journal
      • by hattig ( 47930 )
        Do the containers have generators in them?

        Because there's precious few cables going into or out of those boxes in that picture.

        If they're cheap (the containers, not the Sun Blackboxes), could you build a house out of them? Those containers must be pretty weatherproof, I imagine. Several of those and some welding and you'd have an awesome place. With no windows. As I said, awesome. I'd sell them as "Overground Basements" to geeks.
        • by Fishead ( 658061 )
          I rented a 20' shipping container for moving last month. It cost the same as a scrU-Haul, but I got to keep it for a whole month. That and I didn't have to drive it. Told the wife that one day, when we get a house with a big enough yard, one of those suckers is gonna be my shop.

          Check out http://www.bigsteelbox.com/ [bigsteelbox.com] -- a used 20' costs $2500. Put your shop in it, and if you have to move, just cram everything to one end, load all your other stuff in the rest of the space and call them to come pick it
          • by hattig ( 47930 )
            If everything goes wrong in life, I'll ensure I have enough money to buy an area of land in the middle of nowhere, and a couple of these containers to live in once I'm there! I'll half-bury them in a slope, so I get some natural insulation from the elements (and this makes planning permission easier in some areas). I guess I'd have to coat them with a load of sealant beforehand... might have to use a 40ft container for a personal bowling alley too!

            With your plan of course you will need to use the smaller 10
      • Looks good except for one of their scenarios. They indicate it can be used in a relief mission? WTF? Wouldn't water/food/shelter be more important there? Do you need to do any calculations you can't do on a laptop (or a piece of paper, for that matter)?

        A lot of the scenarios are OK, but the refugee one really makes no sense.

        https://photos.sun.com/asset/7553 [sun.com]
        • by c_forq ( 924234 )
          Remember that after disasters a lot of infrastructure is rendered unusable. What if you could quickly move in a couple of these to get cell towers, wireless internet, and radio communication lines back up and running?
        • by Yvanhoe ( 564877 )
          Agreed. The only application I can think of here is meteorology simulation in case of disaster relief from a hurricane or a flood.
    • Re: (Score:3, Funny)

      by notthe9 ( 800486 )
      Wow! Can you imagine a Beowulf cluster of these? :)

      But does it run Solaris?
    • But does it run...

      But seriously, Blackbox is a fecking great idea; disaster recovery... no problem, sir. I can't tell you how much I want a black box stuffed full of my goodies... there's a long queue behind me of people who are equally excited. Anyone got a cost for one yet?
  • by nbvb ( 32836 ) on Saturday June 23, 2007 @10:28AM (#19619593) Journal
    Project Blackbox is one incredibly cool device. Sun was gracious enough to park one as a demo at my company, and it's just a very well engineered, game-changing design. The beauty is that it can be done relatively cheaply, because shipping containers are CHEAP in the US. Most of them come from China, and since we import more than we export, we're stuck with a boatload (literally) of excess containers.

    Imagine - rather than spending many millions building a true data center, you can just purchase a (relatively) cheap warehouse and line these things up inside. Instant data center - with lots of inherent redundancy.

    Mirror one Blackbox to another across the warehouse.

    Disaster Recovery? This is the best thing since sliced bread. Park one at another facility 50 miles away and off you go.

    I'm highly impressed. It's a bit cramped in there, but if you do your work neatly and place the servers in the racks correctly, it's not an issue. One shouldn't spend much time in the data center anyway!

    A couple of things I asked about:

    #1 - yes, they are standard racks, so other vendors' equipment will fit.
    #2 - I asked about "oversized" equipment (such as Superdomes, E25k's, disk arrays, etc.) - they're working on a solution for that too. My guess is that it would involve removing some of the racks to make room.

    I think Blackbox is a great idea with lots of deployment potential. Another thing to note - I was told that the air filters are designed to filter out lots of particulate matter -- sand included. You can guess why.
    • I have to wonder how it compares to this [apc.com], both in price and in capabilities. (see the 'more images' link for pics of the inside).

      I don't know who came up with the idea first.
    • by tsajeff ( 925056 )
      Why don't they just take the empties back on the return trip - or do they leave the ships here too?
    • I remember lots of talk several years ago about Google building datacenters-in-a-box.
      What's interesting is that they actually tried it, built some, and found it impractical in the end.
      I wonder if this offering by Sun will really take off...
  • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Saturday June 23, 2007 @10:41AM (#19619699) Homepage
    Oh, so how will they keep it cool until then? Not switch it on, perhaps?
  • ... will it run Vista’s Aero interface?

    Cheap shot, I know.

    @yg
    • by fm6 ( 162816 )
      Actually, that's not a cheap shot at all. Sun is no longer a SPARC-only shop. I write docs for Sun's x64 servers, and many customers do use them as graphics workstations. It's the old "visualization" marketplace that used to be dominated by SGI supercomputers with graphic front ends.

      One big problem is that all of Sun's latest servers emphasize low power consumption. That means PCI slots that don't support power-hungry graphics cards. You get around this by clustering the server with an x64 workstatio
  • Node Failure? (Score:3, Interesting)

    by tansey ( 238786 ) on Saturday June 23, 2007 @10:43AM (#19619713) Journal
    I'm just curious, but is the inside of this thing roomy enough for a person to easily get in there and replace a part? I know that node failures happen on a very regular basis with clusters and the box doesn't look very wide.
    • by Fallen Kell ( 165468 ) on Saturday June 23, 2007 @11:40AM (#19620135)
      I had the pleasure of getting to see and work with the demo unit that Sun had/has on tour. The inside of the unit has two rows of their custom-built racks, three racks deep on both sides of the "black box". Each rack has a water cooling unit between it and the next rack in the row, so that the hot air coming out of the front rack is cooled before it is used as the intake air for the following rack. The racks themselves are on a custom-designed damping/shock-absorption system and rail system, so that when you need to work on a rack, it can be slid out into the center aisle, where you can then access the front and back of the rack. A little planning is needed so that you have the appropriate tools/gear on the proper side of the rack before you pull it out into the center aisle; however, they do slide very easily even when loaded up, so you can put it back in and move behind it or in front of it depending on what you need to do. It makes the most sense to have two people, one on either side, working on the systems.

      It may be a little warm in there if you place it out in the middle of nowhere, as the cooling system is really designed just to cool the systems in the racks, not the entire box -- especially when you have the front or back doors open. You may want to close the inner door if you can, to try and keep the moist outside air from entering the container; however, there is a dehumidifier in the system to take care of that situation. It will be a little cramped working in there, but no more so than any high-density compute server room. The main idea, however, is to not have to go in there very often, which is where the Sun "lights out management" systems come into play. The only reason to go into the container is an actual hardware failure; all other maintenance can be performed remotely on systems with the "lom" ports, from BIOS settings to single-user/maintenance-mode issues.

      As for your "I know that node failures happen on a very regular basis with clusters..." comment: I have personally found that if you are using "lom", you will almost never need to go in there unless it is a true hardware issue. In the 9 racks of Beowulf cluster that I manage, there have only been 6 actual hardware outages over the last 3 years. The majority of issues are software-related outages, which can all be fixed over the "lom" connections -- even reloading the OS...
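
      For readers who haven't used lights-out management, here is a minimal sketch of that remote workflow. It uses generic IPMI via ipmitool purely as an illustrative stand-in -- Sun's ALOM/ILOM have their own CLIs -- and the host name and credentials are hypothetical.

      ```python
      # Sketch of LOM-style remote management using generic IPMI (ipmitool)
      # as a stand-in for Sun's "lom" ports. Host and credentials are
      # hypothetical; ALOM/ILOM offer equivalent commands in their own CLIs.
      import subprocess

      LOM = ["ipmitool", "-I", "lanplus", "-H", "lom-node42.example.com",
             "-U", "admin", "-P", "secret"]

      subprocess.run(LOM + ["chassis", "power", "status"], check=True)
      subprocess.run(LOM + ["chassis", "power", "cycle"], check=True)  # remote reboot

      # Serial-over-LAN attaches a console -- enough to reach BIOS settings
      # or single-user mode without ever entering the container.
      subprocess.run(LOM + ["sol", "activate"], check=True)
      ```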
      • by Bandman ( 86149 )
        Very informative post, thank you.

        Did they happen to say what the ballpark price is for one of these?
      • So it only holds 6 racks? Sounds like a neat set-up, but no more space-efficient than a normal data center.

        I guess they have sold at least one more than APC's "Data Center on Demand" with the Stanford purchase.

        We had a concept that could hold a bit more equipment, but this seems to be pretty hassle-free.

          • A BlackBox has 8 racks (not 6). One of the racks is used for infrastructure components like the dehumidifier, power, network, etc. The remaining seven racks are 38 RU each, but because of the power distribution unit and a patch panel, you can fit 36 1U servers in a rack. That is a total of 252 1U servers per BlackBox.

          Someone calculated that if you filled it completely with, for example, X2200 servers (two dual-core AMD Opterons each), it would end up around position 200 in the Supercomputing Top500.
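
          A quick sanity check of that arithmetic, using only the figures from the comments above:

          ```python
          # Blackbox capacity, per the figures quoted in the parent comment.
          racks_total = 8          # racks in the container
          racks_infra = 1          # one rack for power, network, dehumidifier
          servers_per_rack = 36    # 38 RU minus PDU and patch panel
          servers = (racks_total - racks_infra) * servers_per_rack
          print(servers)           # 252 1U servers, matching the parent's total
          ```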
  • by foxtrot ( 14140 ) on Saturday June 23, 2007 @10:54AM (#19619793)
    A fitting place for Sun (whose name came from where their first machines were seen, the Stanford University Network) to deploy the first of a new idea.

    I'm not sure it's the world-killer that everyone wants to think, mind: if your data center is tapped out for power or cooling, you'll still need to get portable power and cooling to go next to your portable data center. But it does seem an excellent way to tide you over until your real data center expansion gets built. Which means I expect to see a number of these sitting outside fixed data center locations in a basically permanent role, just like the "temporary" trailer classrooms outside schools and all the other stop-gap measures we implement "just to tide us over" that wind up being permanent emplacements.

    I kinda fear this outside our data center. Especially when the machines therein get on the "long in the tooth" side, and we've decommissioned every application in the thing but one.

    It's a great new idea, don't get me wrong, but the problem is that how most companies want to run their data centers doesn't look a whole lot like how anybody actually runs one in the real world. :)
    • I kinda fear this outside our data center. Especially when the machines therein get on the "long in the tooth" side, and we've decommissioned every application in the thing but one.

      Remember IBM's "lego" data center concept -- little boxes, with the data center growing outward across the floor over the years? Same thing with this type of solution: buy one a year, and migrate apps to new containers as time requires. It really is more of a portable data center (sans infrastructure).

    • Outsourced? (Score:3, Interesting)

      by mcrbids ( 148650 )
      I just don't get it. I mean, I really, REALLY don't get it?!?!?!

      Places like 365 Main [365main.com] offer top-notch server hosting for dirt-cheap prices. I have a half-rack there with 6 quad-core Opteron clustered LAMP servers in place now. Reliability is excellent, bandwidth availability is fabulous (we have a Gb interface to the Internet), and the price is just astonishingly cheap -- although we are an "Internet Company", we spend more on phone calls than we do on hosting and related fees. Never mind hotels and travel/fl
  • by Anonymous Coward on Saturday June 23, 2007 @11:07AM (#19619903)
    SLAC is in a big bind for space to house computers. They're out of power in their existing building, but are also under very, very onerous rules about approval for electrical installs, due to an accident on site that nearly killed an electrician. They figure 24 months to get a new electrical feed from the transformers just outside the building... Apparently they'll be able to get the (simpler) electrical install of the Blackbox done more quickly.

    So they are using the Blackbox in "gotta get more capacity yesterday" mode, rather than as a real change in direction of data center planning... Still, I bet Sun sells more of these to customers in similar situations.
  • See the innards (Score:5, Informative)

    by tcampb01 ( 101714 ) on Saturday June 23, 2007 @11:28AM (#19620041)
    I was a bit surprised that all the pictures only show off the outside, and that none of the links lead to info on what these things look like on the inside or how they work.

    Here is Sun's page that shows off considerably more info: Sun's Project Black Box page [sun.com]

    Basically, the outside of the box has hookups for power, cooling water, and network. Everything on the inside is pre-wired. Servers aren't included, but the box is designed to serve as the transport container for the servers -- not just a place to put servers once the box arrives (the racks have a shock-absorbing suspension system so that servers can be transported in the container without the need to unrack them or pack them for shipping). When it arrives it just needs to be "plugged in" and it's literally ready to go. Since it really is a standard shipping container, all the rules about shipping containers apply -- e.g. there's no shortage of trucks, trains, or boats designed specifically to hold them. They are structurally sturdy and can be stacked tall just like containers on a cargo ship.

  • My concerns... (Score:5, Insightful)

    by HockeyPuck ( 141947 ) on Saturday June 23, 2007 @11:38AM (#19620109)
    I'm assuming you don't set this up in a parking lot, but under some sort of cover/tarp/tent; even painted white, putting it outside in the northern California sun can't be very efficient as far as cooling is concerned. How much insulation do they have between the metal of the 'box' and the interior walls?

    If it's RAINING, how do you keep from increasing the humidity inside the box? In our datacenters, we have sticky plastic sheets on the floor outside the datacenter so you won't track dust in. With a door that opens 'to the outside', how do you keep out dust/dirt?

    • Apparently there's an inner and outer door, so there should be space there to close one before opening the other. At least, one would hope so.
    • I'm assuming you don't set this up in a parking lot, but under some sort of cover/tarp/tent; even painted white, putting it outside in the northern California sun can't be very efficient as far as cooling is concerned.

      I suspect this is a lot like worrying about aerodynamic drag on a 100 ton locomotive. Yes, it exists, but it's negligible compared to the other load you have to deal with. I bet if you compared the amount of heat that the container would gain from sunlight, it's probably a small per


      • There's no AC.

        It's all chilled water; you hook it up to an external chiller. The racks are set up back-to-front, and in between every pair of racks is a wall of fans and a radiator with cold water. The water sucks out the heat and carries it outside the unit.

        I don't know how they deal with humidity, to be honest.
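
        For a rough feel of what that water loop has to carry, here is a back-of-envelope sketch; the heat load and temperature delta are assumptions, not Sun's specs. Removing heat Q takes a flow of Q / (cp * dT).

        ```python
        # Chilled-water flow needed to carry away the heat.
        # Both the load and the temperature delta are assumed values.
        heat_kw = 200.0    # assumed IT load for a full container
        cp = 4.186         # kJ/(kg*K), specific heat of water
        delta_t = 10.0     # K, assumed supply/return difference

        flow_kg_s = heat_kw / (cp * delta_t)   # Q = m_dot * cp * dT
        print(f"{flow_kg_s:.1f} kg/s (~{flow_kg_s * 60:.0f} L/min)")
        # ~4.8 kg/s, roughly 290 L/min -- hence the chiller on its own pad.
        ```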
        • I don't know how they deal with humidity, to be honest.

          Dehumidifier?
          • by jbengt ( 874751 )
            Yes, it has a dehumidifier, according to another post.

            • Yeah, but the box is sealed from the outside world. Once they have a bucket of water they've taken out of the air, what happens to it?

              They don't just dangle a tube outside the structure. At least, I don't think... maybe they do...
        • by jbengt ( 874751 )
          Well, since they're blowing air across the cooling coils, technically it is Air Conditioning.
      • by jbengt ( 874751 )
        Most data centers have humidifiers and dehumidifiers. It used to be critical to keep the humidity close to 50%, but now much of the rack-based equipment can handle anywhere from a low of 10-20% RH to a high of 90% RH, non-condensing. And yes, unless you get conditions causing condensation, avoiding the static from too-low humidity is usually more important.
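
        The condensation limit is easy to check with the standard Magnus dew-point approximation; a sketch follows, where the example temperature and RH are assumed values:

        ```python
        import math

        def dew_point_c(temp_c: float, rh_percent: float) -> float:
            """Magnus approximation for the dew point in deg C."""
            a, b = 17.62, 243.12
            g = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
            return b * g / (a - g)

        # Air at 25 C and 50% RH condenses on any surface below ~13.9 C,
        # so cooling coils must stay warmer than that (or drain condensate).
        print(f"{dew_point_c(25.0, 50.0):.1f} C")
        ```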
      • Re: (Score:3, Interesting)

        by mollymoo ( 202721 )

        I'm assuming you don't set this up in a parking lot, but under some sort of cover/tarp/tent; even painted white, putting it outside in the northern California sun can't be very efficient as far as cooling is concerned.

        I suspect this is a lot like worrying about aerodynamic drag on a 100 ton locomotive. Yes, it exists, but it's negligible compared to the other load you have to deal with.

        I suspected the solar heat load was relatively small, but I decided to run a (very) rough estimate to get a bet
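
        For the curious, here is a version of that back-of-envelope estimate; every number is an illustrative assumption, not a Sun spec:

        ```python
        # Peak solar heat gain on a 20 ft container vs. its IT load.
        # All figures are illustrative assumptions.
        area_m2 = 6.1 * 2.44 + 6.1 * 2.6    # roof plus one sunlit side, ~31 m2
        irradiance_w_m2 = 1000.0            # clear-sky peak
        absorptivity = 0.25                 # typical for white paint
        solar_kw = area_m2 * irradiance_w_m2 * absorptivity / 1000.0

        it_load_kw = 200.0                  # assumed load for ~250 busy 1U servers
        print(f"solar {solar_kw:.1f} kW vs IT {it_load_kw:.0f} kW")
        # ~7.7 kW of sun against ~200 kW of servers: real, but a few percent.
        ```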

    • I mean seriously -- we all know that physical access to the hardware == compromised security. Most datacenters exist inside a building, with card keys, reinforced walls, etc. etc. It seems like all you'd need to gain physical access to the servers in one of these things is a blowtorch.
      • I mean seriously -- we all know that physical access to the hardware == compromised security. Most datacenters exist inside a building, with card keys, reinforced walls, etc. etc. It seems like all you'd need to gain physical access to the servers in one of these things is a blowtorch.

        A blowtorch will get you into most data centers anyway. Seriously, some super-high-security places may have super-high-end physical safeguards, but in most places your whizzy electronic card reader can be dealt with pretty

  • Prank (Score:5, Funny)

    by Joebert ( 946227 ) on Saturday June 23, 2007 @11:39AM (#19620119) Homepage
    What do you think would happen if a student pasted a "PODS" label on the side of it & called the company to come do a pickup?
  • Checklist (Score:5, Funny)

    by Dan East ( 318230 ) on Saturday June 23, 2007 @11:58AM (#19620255) Journal
    Faraday cage large enough to encompass a shipping container... Check.
    Honking-big wirecutters... Check.
    Rollback flatbed truck with 20' bed and winch... Check.

    Dan East
  • by smackenzie ( 912024 ) on Saturday June 23, 2007 @12:05PM (#19620299)
    I mean, look, Apple tried the whole white-computer-in-a-box thing with their so-called "Cube" and it never took off. To make it worse:

    1. I think this computer looks even BIGGER and UGLIER than the Cube. (Can someone post a picture of the Cube and this together so we can compare sizes?)

    2. Though the internet connection is decent, I don't see a FireWire port. HELLO! People still use FireWire these days!!

    3. Can I use it as a media center device? Those are cool. I think most Americans will be able to fit this in their living room under their TV, but no way the Japanese are going to go for it with their smaller apartments...

    Nice try, Sun, but I'm not going out and picking up another electrical substation power strip just to plug in this (probably) under-powered and over-priced white "computer in a box" copycat...

  • I used to design modules very much like this for the oil biz; it's a common way to provide office space, utilities, subsea control, temporary functions, etc. for oil rigs and hazardous areas.
    You can order the shells in pretty much any size/shape you want, especially if you don't have to worry about regulations.

    They're not as sturdy as you might think, though; they get beat up something awful in transport, especially offshore.
    Crane operators like to use the one in the sling to knock the others into place/out of the way :)

    So if
    • by anilg ( 961244 )
      Ah... but do you know one piece of insider lore? They emptied one out, made a wall transparent, put in a few chairs and tables, and had the executives work in there in full public view on April Fools' Day :) Article here [siliconvalleysleuth.com]
  • A related link at the end of the article [datacenterknowledge.com] describes how Sun took one of their Black Box systems to a giant shake table at the seismic research center at UCSD [ucsd.edu], to see how well it would hold up during an earthquake. Some things pulled loose, and some things will need a little redesign, but it was able to keep functioning during and after the simulated earthquake. Sun produced a slick little video [youtube.com] of it.
  • I love how it took someone at Stanford to point out the idea of painting it white *laughs*

    It will take a couple more days of work before they figure out they should put a reflective cover slightly above the container, as even white paint is still quite absorbent.
    • Having a matte black container with a reflective cover above it is probably the best configuration, as black is a much better emitter of heat than white.
      • It's intriguing but turns out to be not necessarily true.

        "If objects appear white (reflective in the visual spectrum), they are not necessarily equally reflective (and thus non-emissive) in the thermal infrared; e. g. most household radiators are painted white despite the fact that they have to be good thermal radiators. Acrylic and urethane based white paints have 93% blackbody radiation efficiency at room temperature (meaning the term "black body" does not always correspond to the visually perceived colou
  • Dude, (Score:2, Funny)

    by tabby ( 592506 )
    Where's my datacenter?
  • Just ten gigabits? One decent motherboard has two...
