
Microsoft Puts Datacenter In a Barn

Posted by samzenpus
from the ei-ei-o dept.
aesoteric writes "Microsoft has announced that it will open a new datacenter in Washington State housed in a 'modern' barn-like structure that is 'virtually transparent to ambient outdoor conditions'. It was not the first time Microsoft had toyed with the idea of a datacenter without walls. In September 2008, it successfully ran a stack of HP servers in a tent for seven months, apparently with no failures."
  • by pecosdave (536896) * on Wednesday January 05, 2011 @07:45PM (#34771528) Homepage Journal

    I want them to replicate this experiment in Big Bend National Park in July.

    • I think that's the idea, isn't it?

      When you get right down to it, all they're doing is getting free cooling from low ambient temperatures. It's hovering just above freezing there. The only thing they're keeping off is rain. I'm not sure what they're doing about the humidity -- maybe "not caring".

      • by afidel (530433) on Wednesday January 05, 2011 @08:18PM (#34771818)
Yep, for them technology reduces the cost per compute unit fast enough that the fact that the hardware wears out faster is inconsequential, and losing any given node is meaningless, so a slightly higher failure rate is completely acceptable. It's the same reason Google can run with SATA drives without RAID: they take care of those concerns at a higher level. Those of us using software without built-in failover can't really do this. Though if VMware FT comes a bit further it might be possible soon (though I'd still want my storage in a conditioned space even if the hosts weren't).
      • by Pharmboy (216950) on Wednesday January 05, 2011 @08:32PM (#34771962) Journal

That would make sense, as it takes a great deal of energy to produce cold air, and simply using the existing cold air has a much lower carbon footprint (corporations don't really care; to them it means lower power bills). As for humidity, it should be lower inside the barn than outside, as the heat from the systems will still raise the temperature enough to drop the relative humidity to more reasonable levels. Probably still higher than optimal, but like you said, who cares? If you are running cheap enough gear, you would replace it more often anyway.

Reading Wikipedia seems to indicate that it is much drier there than eastern Washington anyway. You could likely balance the humidity against the temperature, i.e., if you want lower humidity, you vent less and put up with higher temps inside the barn. If it is anything like Spokane, then the main humidity is in the winter, when the air is holding less water to start with, and allowing the temp to go up (by venting less) will do no harm while dramatically dropping the relative humidity. There aren't a lot of places this would work, but this area might be just fine.

        • by Pharmboy (216950)

          I meant it is drier than western Washington, but I think you get the picture...

        • by ron_ivi (607351) <.moc.secivedxelpmocpaehc. .ta. .ontods.> on Wednesday January 05, 2011 @08:53PM (#34772122)

I seem to recall a business plan back in the late 1990s to do something similar adjacent to the Alaska pipeline, complete with a refinery.

          The argument was that it would have

* Free air conditioning from the clean, dry, cold Alaska air.
* Unparalleled physical security, with miles of visibility in all directions.
* A well-protected network (if they could run their lines along the well-defended pipeline).
* Unlimited backup-generator fuel (tapped directly from the pipeline).

          I seem to recall they raised funds. Wonder what happened to them.

        • Reading Wikipedia seems to indicate that it is much drier there than eastern Washington anyway.

          The 'dry side' of Washington is also a great deal *hotter* than the 'wet side'.

      • by raodin (708903)

        It is dipping well below freezing at night in Quincy right now, but this is the coldest part of the year. Central Washington isn't exactly cool in the summer months - the average highs for Quincy in July are in the mid-80s, and the record highs are pushing 110F.

      • by hairyfeet (841228) <{bassbeast1968} {at} {gmail.com}> on Wednesday January 05, 2011 @10:14PM (#34772622) Journal

Which is why I don't understand why some entrepreneur with a brain isn't buying up those old Titan II missile silos in north AR. Talk about the perfect data centers! You have an ambient temp of 55 degrees F in the lower levels and plenty of wind up top to pull away heat; simply add racks to the silo tubes with side venting at the top and a big fan blowing up from the bottom, and the chimney effect will take care of the rest. They also have plenty of fiber backbone run through that area thanks to AT&T and the US military, so hooking into the backbone wouldn't be anything, cost of living is cheap, and there are no unions to worry about.

        While I think TFA is a good experiment, it seems like it would be cheaper in the long run to use structures already built with inherent cooling properties. There are missile silos being closed all over the place and having been in one the ambient temp in those lower levels stays pretty damned chilly. All one would have to do is use a big fan to pull it through grates set into the silo itself to have racks of servers that would stay nicely chilled for the cost of a large fan. Just seems stupid to let them rot or worse end up filled in when nobody buys 'em.
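For the curious, the chimney-effect draft described above can be ballparked with the standard stack-pressure formula. A rough sketch only; the shaft depth and temperatures below are assumptions for illustration, not measurements of any actual silo:

```python
G = 9.81          # gravitational acceleration, m/s^2
RHO_OUT = 1.225   # outside-air density at ~15 C, kg/m^3

def stack_pressure(height_m, t_out_k, t_in_k):
    """Natural-draft pressure (Pa) across a vertical shaft from the
    chimney (stack) effect: warm, less-dense air inside rises."""
    return RHO_OUT * G * height_m * (t_in_k - t_out_k) / t_in_k

# Hypothetical numbers: ~45 m shaft, 13 C (286 K) outside,
# 35 C (308 K) server-exhaust air inside.
dp = stack_pressure(45, 286.0, 308.0)
print(f"natural draft: {dp:.1f} Pa")
```

A few tens of pascals isn't much, which is why the parent's big helper fan at the bottom would still be doing most of the work.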

        • by bennomatic (691188) on Thursday January 06, 2011 @01:56AM (#34773754) Homepage
          Well, it's QC, not AR, but apparently, someone did think what you're suggesting was a good idea [greenm3.com]. The way they've got the systems oriented and the venting, all the heat is pushed towards the middle, creating an updraft which vents out the top and sucks in outside air so that you've got a natural cold aisle on the outside of the ring of computer systems. Pretty sweet stuff.
          • by hairyfeet (841228)

            As your link shows this is why I think using the silos is so bloody brilliant. Thanks to the SALT treaties we have literally thousands of these old missile silos lying abandoned across the country, most will be sold for dirt cheap (I actually helped my dad wire one for a guy that turned one into a house, only cost him 30k for the fully outfitted silo) or even wasted by being filled in if nobody buys them. The biggest expense for an underground facility is ALWAYS the massive digging effort, and this is alrea

While we're on that idea, I can tell you that Paris has roughly 300 km of abandoned tunnels under it: bedrock inspection tunnels (dating from a time when Paris was expanding over the very ground from which it was extracting its building stone). There would be a humidity problem under the areas of the city near the water table, but with that fixed, the temperature is a constant 14C and the possibilities for air/cable distribution are endless (certain regions have been, in the past, transformed into bomb shelte

          • by leuk_he (194174)

The temperature won't stay constant if you add a lot of heat-generating servers to the environment. At that point you have to add heat-exchange equipment. In the Washington silo example, someone thought of this; that part is missing from your tunnels.

One more thing: who owns those tunnels?

      • by RapmasterT (787426) on Thursday January 06, 2011 @01:27AM (#34773596)

        I think that's the idea, isn't it?

        When you get right down to it, all they're doing is getting free cooling from low ambient temperatures. It's hovering just above freezing there. The only thing they're keeping off is rain. I'm not sure what they're doing about the humidity -- maybe "not caring".

Not caring is exactly the point of the tent exercise. Historically the datacenter industry has maintained baseline ideas like 70 percent humidity and 75 degrees temperature... ideas that haven't changed even as server technology and durability have improved.

The point is: if we spend X dollars maintaining a datacenter environment for a baseline of reliability, and we can instead spend 75% of X without reliability being significantly impacted, then that's a win. But if we spend 10% of X, and the resulting failures cost less than our original X, then that's a HUGE win.

        Nobody expects that you can run an open air datacenter without increasing system failure rates, but the current datacenter paradigm just isn't scalable with modern high density systems, so something has to give. If "tradition" is the only thing it costs us, then tents it is!

As for Washington state being "cheating"... nothing is going to work everywhere. Hell, desert country is far better for datacenters than cooler climates; it's much cheaper to cool hot, dry air than it is to dehumidify wet air.
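The spend-less-accept-failures argument above reduces to a toy cost model. All the dollar figures and the failure multiplier below are invented purely for illustration:

```python
def total_cost(env_spend, baseline_failure_cost, failure_multiplier):
    """Annual cost = environmental spend plus the hardware-failure
    cost you incur at that spend level."""
    return env_spend + baseline_failure_cost * failure_multiplier

X = 1_000_000      # hypothetical annual cost of a fully conditioned datacenter
FAILURES = 50_000  # hypothetical annual failure cost at baseline conditions

traditional = total_cost(X, FAILURES, 1.0)
tent        = total_cost(0.10 * X, FAILURES, 3.0)  # 10% of X, triple the failures

# The tent wins whenever the extra failure cost stays under the 0.9*X saved.
print(traditional, tent)
```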

        • Re: (Score:3, Informative)

          Completely True.

          I live in the Seattle area, and have a sump pump and dehumidifier in the basement. The dehumidifier has to run year round, and uses twice the energy in a day that the air conditioning uses in 5 days. Granted, the dehumidifier is about 25 years old, and the air conditioning is only 4 years old. But they have about the same volume of air to modify the composition of.

          Western Washington is a really beautiful part of the country. But I have to admit, it's the least friendly place to any size

        • by Cylix (55374) *

The problem is that the obvious portion of the equation is by far the smallest factor in the issues that face free-air cooling. And I'm not talking about tent datacenters (good luck getting PCI compliance on that rig).

Part of the concept behind the tent farms, and of thinking outside the norms generally, is ambient temperature: no longer chasing the perfect balance, but rather just pushing the limits.

          It turns out you really have to design this type of thing from the ground up in order for everything to really work

        • by blincoln (592401)

          ideas that haven't changed with server technology and durability improving.

          Servers are less durable now than they were a decade ago. They're made more cheaply, and they have tin whiskers to contend with. Obviously they are better in most ways than their predecessors, but longevity isn't one of those ways. Before I moved on from server engineering, I had retired a couple of NT4-era Pentium II-based systems that were still chugging away after over a decade of essentially 24/7 operation. Meanwhile, anything ba

        • Historically the datacenter industry has maintained baseline ideas like 70 percent humidity, 75 degrees temperature...ideas that haven't changed with server technology and durability improving.

          Respectfully, I must disagree. Typical Data Centers usually shoot for closer to 45% or 50% RH, not 70% as you suggest.

          • Historically the datacenter industry has maintained baseline ideas like 70 percent humidity, 75 degrees temperature...ideas that haven't changed with server technology and durability improving.

            Respectfully, I must disagree. Typical Data Centers usually shoot for closer to 45% or 50% RH, not 70% as you suggest.

Hah, that's funny. I started thinking about humidity and rattled off my cigar humidor setpoint rather than the DC one. I think "respectfully" is the wrong response to the suggestion of running a DC at 70% humidity; it should be more like "WTF?".

      • IMHO, Esperanza (Argentine Antarctica) is just about the most wet-dream-ideal place you could possibly build a datacenter, if you're willing to sink lots of capital into an investment with a 10-20 year payoff horizon. Let's see:

        * Fiber-accessibility: not sure whether there's fiber there today (there probably IS), but the area's not glaciated, and the surrounding ocean isn't frozen (lots of icebergs, but the water itself remains liquid year-round), so laying fiber from Esperanza to Tierra del Fuego (or even

      • by Eivind (15695)

Humidity is a non-problem in a strongly heated space, because relative humidity drops rapidly with increasing temperature.

If you take cool-and-wet air into a datacenter, the air ends up warm-and-dry. The air still contains the same amount of water, of course, but the *relative* humidity is much lower, because the saturation point of water vapor in air rises rapidly with temperature.
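That drop in relative humidity can be checked numerically with the Magnus approximation for saturation vapor pressure. A sketch only; the intake conditions below are invented for illustration:

```python
import math

def saturation_vp(t_c):
    """Magnus approximation: saturation vapor pressure (hPa) at t_c degrees C."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def rh_after_heating(rh_in, t_in_c, t_out_c):
    """Relative humidity (%) after heating air at constant moisture content:
    the absolute water content is unchanged, but the saturation point rises."""
    return rh_in * saturation_vp(t_in_c) / saturation_vp(t_out_c)

# Cool, wet intake air (5 C, 90% RH) warmed to 30 C by the servers:
print(f"{rh_after_heating(90.0, 5.0, 30.0):.1f}% RH")
```

The 90% RH intake lands below 20% RH at the hot side, which is why the replies below worry about air that is too *dry* rather than too wet.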

        • humidity is a non-problem in a strongly heated space

That's not necessarily true. While your point makes sense in an environment where there is adequate moisture in the air, the challenge is often keeping the humidity high enough. You'd be surprised how much humidifiers have to work in a Data Center to keep moisture in the air during colder times of the year. Air that is too dry is going to be just as harmful, if not more harmful, to sensitive electronics than overly humid air.

    • by sconeu (64226)

      I want them to do it in Death Valley National Monument in August.

    • by sumdumass (711423)

It probably wouldn't matter. If you have enough airflow, you can pull enough heat away from the systems to avoid failure. It's not like the chips and stuff run below 100 degrees F under load much anyway.

If they were liquid cooled, they could probably get away with a bit hotter ambient temps. What matters is the ability to pull excess heat away from the machines faster than it's created. In this situation you would be both heating and cooling the systems to maintain an average temp. As long as the outside a
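The pull-heat-away-faster-than-it's-created balance is just a mass-flow calculation. A rough sketch with assumed numbers (the rack power and allowed temperature rise are hypothetical):

```python
RHO_AIR = 1.2    # air density at room conditions, kg/m^3
CP_AIR = 1005.0  # specific heat of air, J/(kg*K)

def airflow_needed(server_watts, delta_t_k):
    """Volumetric airflow (m^3/s) needed to carry server_watts away
    while letting the air warm by delta_t_k on its way through."""
    return server_watts / (RHO_AIR * CP_AIR * delta_t_k)

# Hypothetical 10 kW rack, air allowed to warm 10 K passing through:
flow = airflow_needed(10_000, 10.0)
print(f"{flow:.2f} m^3/s (~{flow * 2119:.0f} CFM)")
```

Note the tradeoff: halving the allowed temperature rise doubles the required airflow, which is where the fan-power cost mentioned further down the thread comes from.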

      • by pecosdave (536896) *

        When the temps in my hometown hit 128F back in 94 I was driving an 83 GMC truck. The orange needle that showed which position the automatic transmission was in more or less melted in half and pulled the needle all the way to the left.

        Even if the CPU can handle that I'm not sure how everything else involved in the chain would.

        • I respectfully submit that you're looking at the incident the wrong way. You melted a $0.50 part, but the truck kept going. Operating in temperatures outside the normal range caused no downtime, just like with these servers.

        • by sumdumass (711423)

Well, the needle didn't melt just because it was 128F (53.3C). You see, on an 80-degree day, [familycarguide.com] a parked car can reach as high as 110F (43.3C) in just 20 minutes and go on to reach 130F (54.4C) not too long afterward. So the needle should already have survived 128 degrees on its own. This is also why you don't leave children or animals in parked cars.

          But if the air is moving through it, it doesn't get that hot. In fact, it stays about the same temp as the outside give or take. And if you perspire or are

          • by pecosdave (536896) *

Well, it was parked outside of a school in an area populated with little shits you didn't trust with anything. Not only was it parked directly in the sun, the truck was the dark blue/gray combo of the era, and the windows were not cracked in the least. Since it was near a school cafeteria, a cracked window would have meant a carton of chocolate milk dumped on your seat. Temps were probably in the 150ish range in the truck. My sister's Walkman was physically warped by the heat.

            • by sumdumass (711423)

              Could have been hotter.

I used one of those laser thermometers to check the temp of a seat in my car after I burned my ass through my pants sitting on it. It was only about 92 out, but the seat itself was at 161F.

To put that into perspective, you can get about a third-degree burn from exposure to water at 160 for 1 second, and from air at 160 you can get a second-degree burn in 60 seconds or less. Of course I had pants on, so I wasn't exposed to anything that severe. But it left me with a reminder for a co

      • by afidel (530433)
Yeah, but real studies show that with modern AC, at some point you use more power spinning fans (and lose more to reduced PSU efficiency) than you save by raising the temperature.
        • by sumdumass (711423)

          It may be. I just knew there was a cost to cooling the air in addition to circulating it.

    • by Mspangler (770054)

Since I live in Eastern WA and visit Quincy now and again (the local Honda motorcycle dealer is there), I have to mention that the summer peak temperatures can reach 105 to 110 F. These usually only last a few hours. My rule of thumb is that if it's 90 F by 9 AM, it's going to be a hot day, probably over 100. By sundown it will be under 90, an hour after that about 70, and it should be down to about 55 by morning.

      Humidity is about 15% at mid day, so a swamp cooler would work fine for cooling the hardware.

      Wint

    • Build some in England, we've got more than enough cold, shitty weather here to cool datacenters.
  • Another dirty job episode coming up?
    • Working in a barn or working with Windows? *bing badda bang*
      • On that thought, do barns have windows?

        Those that do, very few, in my experience... best keep it that way. Linux will run on almost everything...

        Got a barn and a Debian distro CD? Farmer, meet Dell.

  • by Palestrina (715471) * on Wednesday January 05, 2011 @07:52PM (#34771598) Homepage
    I suspect that walls are useful not only for controlling the ambient data center physical conditions, but also for keeping criminals out. Forget about MTTF. What is the Mean Time to being Stolen by High School Kids for a "data center in a tent"?
    • by cappp (1822388)
      Same as the mean time for most things involving teenagers - about 3 minutes unless they're reciting baseball stats.
      • by Bucky24 (1943328)
        That was last generation... This generation's teens are more likely to be reciting WoW stats.
    • by c0lo (1497653)

      I suspect that walls are useful not only for controlling the ambient data center physical conditions, but also for keeping criminals out. Forget about MTTF. What is the Mean Time to being Stolen by High School Kids for a "data center in a tent"?

      What is the Mean Time for Bug/Rat Infestation?

    • by PPH (736903)
Forget cow tipping. Now it's server racks.
    • isn't that what chain link fencing and razor wire are for?

  • by Anonymous Coward on Wednesday January 05, 2011 @07:52PM (#34771604)

    In September 2008, it successfully ran a stack of HP servers in a tent for seven months, apparently with no failures.

    So they weren't actually running microsoft software on those servers?

  • by Anonymous Coward

    "In September 2008, it successfully ran a stack of HP servers in a tent for seven months, apparently with no failures."

    Truly, it was an impressive feat of time dilation.

  • So after a data breach occurs, will they be shutting the barn door after the data is out?
  • The myth of cooling (Score:4, Informative)

    by Shoten (260439) on Wednesday January 05, 2011 @08:07PM (#34771746)

    In our datacenters (I work for a major IT company) we've actually done some research on running data centers at higher temperatures overall. The funny thing that came out of this...in the attempt to figure out where the magical "65 degrees" requirement came from, we had to do a lot of digging. It turns out that the requirement came from old APC UPS systems, which mandated that environmental temperature. We're discovering that data centers can be run WAY warmer than that with no ill effect, provided you still have good airflow.

    • That sounds a little dodgy. APC introduced its first UPS in 1984 (from http://www.apc.com/corporate/history.cfm [apc.com] ). I think data centers were kept at lower temperatures well before 1984. I think it's more likely that APC specified a given volt-amp performance at 65 degrees because that's the temperature data centers were usually kept at anyway.
    • by afidel (530433)
Old IBM mainframes also needed to be cooled fairly aggressively.

      We keep ours at a cold side target of 72, we could go higher but having a bit of a buffer before things go tits up if both AC systems short cycle during a power outage is a good thing IMHO, especially since off hours our response time can be over an hour from first page to someone being onsite.
    • by Animats (122034)

      We're discovering that data centers can be run WAY warmer than that with no ill effect, provided you still have good airflow.

      Not really. You're using up the lifespan of the semiconductors faster. You need to look at the cumulative effect of temperature on semiconductor electromigration. [wikipedia.org] This is a real issue, because current high-density ICs don't have a big safety margin in this area. There's a straightforward relationship, Black's equation [wikipedia.org] from which this can be computed. Notice that mean time to
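The temperature dependence the parent cites comes from the Arrhenius term, exp(Ea/kT), in Black's equation for electromigration lifetime. A sketch of the ratio between two operating temperatures, with current density held equal; the activation energy and temperatures below are illustrative assumptions, not data for any particular process:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def mttf_ratio(t_cool_k, t_hot_k, ea_ev=0.7):
    """How many times longer the median electromigration lifetime is at
    t_cool_k than at t_hot_k, from the exp(Ea / (k*T)) factor of Black's
    equation (the current-density term cancels when J is held constant)."""
    return math.exp((ea_ev / K_B) * (1.0 / t_cool_k - 1.0 / t_hot_k))

# Junction at 55 C (328 K) versus 75 C (348 K), assumed Ea = 0.7 eV:
print(f"{mttf_ratio(328.0, 348.0):.1f}x lifetime")
```

With these assumed numbers, a 20-degree rise costs roughly a factor of four in median lifetime, which is the tradeoff the replies below weigh against cooling costs.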

      • Not really. You're using up the lifespan of the semiconductors faster.

        And if your equipment is cheaper to replace than paying for air-conditioning, it's still worth running them hot.

        • Not really. You're using up the lifespan of the semiconductors faster.

          And if your equipment is cheaper to replace than paying for air-conditioning, it's still worth running them hot.

You also need to take the cost of recovering from those more frequent failures into account; transient failures might corrupt data, even without you noticing.

  • by Ungrounded Lightning (62228) on Wednesday January 05, 2011 @08:15PM (#34771792) Journal

    Cut my "hacker's teeth" on some computers that were located in a WW II era bomber-plant hangar. (Built mostly of wood because the steel was being used for war machines.)

    Place had issues with mice and rats getting under the raised floor and chewing on the cabling.

  • by sootman (158191) on Wednesday January 05, 2011 @08:20PM (#34771830) Homepage Journal

    So we've had datacenters in shipping containers, [slashdot.org] and floating at sea, [slashdot.org] and now in a barn. Is this just large-scale case-modding for CIO's at rich companies? :-)

    • Ah, but one must also recall Data Center In A Cave [slashdot.org], which makes quite a bit of sense here:

      "Plenty of cities have submitted bids for the Google Fiber project, with most of their bids being centered around the attributes that could describe many communities. Yet one small midwestern town, with much less fanfare than the metropolitan bids, provided an unusual proposition for Google in their likely quixotic nomination. Quincy, IL, has an extensive series of underground caverns that could provide year-round temperature control, dedicated hydroelectric power, and security in the case of a terrorist attack."

      • by blair1q (305137)

        Cash registers are computers.

        Look at where those live.

        If these things can survive in a bar on the beach in a popular port-of-call for the Navy, they never did need walls.

    • by afidel (530433)
      Shipping containers are interesting in all sorts of cases like putting up a datacenter quickly in a warehouse in an industrial park anywhere in the world. The military also loves them because they can easily be transported by sea or cargo lifter.
      • by Alex Belits (437) *

        Shipping containers are interesting in all sorts of cases like putting up a datacenter quickly in a warehouse in an industrial park anywhere in the world.

Shipping containers are useful for placing data center equipment where it otherwise does not belong, but it's foolish to think that they are actually efficient at anything. They have to contain built-in fans and air conditioners, so density and efficiency are lower than for the same floor space filled with racks. They need access to the airflow; you can't just stack them or place them next to each other. They need an outside electric panel. For a data center you still need giant UPSes and/or generators. And replacin

  • A server farm in a barn. About damn time! Now we just need a cloud in the air, and power through water, perhaps some storage in manure, and the future will be the past!

    • by Caerdwyn (829058)

      ...perhaps some storage in manure...

      Yeah, I remember 7200.8 Seagate drives too...

    • A server farm in a barn. About damn time! Now we just need a cloud in the air, and power through water, perhaps some storage in manure, and the future will be the past!

      The future-in-the-past [electronista.com] is now!

  • 'Tis a fine Barn, English, but surely 'tis no datacenter.

    Simpsons Amish [maxim.com]

  • by Caerdwyn (829058) on Wednesday January 05, 2011 @08:33PM (#34771964) Journal

    Yer SQL server crashin'? Lemme have a look at 'er...

Ah! Found it right here... possums! Ya gots possums livin' in yer SNA-box-thingie. Heh... SNA... that always did sound dirty. Anyways, lemme get my plinkin' rifle and my coon dog Skeeter, we'll git yer back up and runnin'!

    Seein' as I'll be in there anyways, y'all want a RAM upgrade?

  • It was not the first time Microsoft had toyed with the idea of a datacenter without walls.

    I think they might be taking the "Windows everywhere" philosophy a little bit too seriously.

  • .... Windows is a pig!
  • I highly doubt they're going to let the rain get in.

    Otherwise, with an ambient temperature under 100F year-round, their gear should run fine.

    Until the birds start nesting in it...

    • by treeves (963993)

I wonder about insects, pollen, and dust. Quincy, WA is out in the central, agricultural part of the state. Seems like those might have an adverse effect on servers.

Oh, and it can get over 100F there. I'm sure Microsoft knows all of this, but I wonder how they will cope with it.

      • by GaryOlson (737642)
        The cockroaches can survive anything; I don't think the equipment can survive the cockroaches.
  • ....Until mice get into your wiring.....
  • What is the first thing you notice when stepping into a barn?

    That's right, the smell of shit!

The thing I've always wondered, something I've never seen mentioned, is how they deal with dust. OK, so the walls keep out the large chunks, but what do they do to keep from drawing small particulate matter into their servers? I assume that they have filters on the intake vents, but they'd have to be more substantial than the ones used in facilities with traditional air conditioning, which would be a somewhat more closed environment, where the hot air circulates through the cooling system on its way back

    • Don't quote me on this, but I've read that air quality is typically worse in a house since it's such a closed environment. Dust comes in, settles, but can't get out.

  • by 1sockchuck (826398) on Thursday January 06, 2011 @12:42AM (#34773414) Homepage
    Data Center Knowledge has a photo feature [datacenterknowledge.com] with a bunch of images of the facility in Quincy and the container modules being assembled. You can see all the servers they pack into them.
  • Sun did some experimentation with self cooling datacenters a few years ago.
    http://www.youtube.com/watch?v=ZaEsFDjalvw [youtube.com]

  • Who needs windows in a datacentre without walls?

  • the scene matches all the manure Micro$oft has been shoveling on us all these years.
