Microsoft Puts Datacenter In a Barn
aesoteric writes "Microsoft has announced that it will open a new datacenter in Washington State housed in a 'modern' barn-like structure that is 'virtually transparent to ambient outdoor conditions'. It was not the first time Microsoft had toyed with the idea of a datacenter without walls. In September 2008, it successfully ran a stack of HP servers in a tent for seven months, apparently with no failures."
Washington state is CHEATING! (Score:4, Funny)
I want them to replicate this experiment in Big Bend National Park in July.
Re: (Score:3)
I think that's the idea, isn't it?
When you get right down to it, all they're doing is getting free cooling from low ambient temperatures. It's hovering just above freezing there. The only thing they're keeping off is rain. I'm not sure what they're doing about the humidity -- maybe "not caring".
Re:Washington state is CHEATING! (Score:4, Informative)
Re: (Score:2)
Noooooo.... the "hundred dollar consumer device" was dependent upon servers that failed.
Re: (Score:3)
Re: (Score:2)
My bad...
Re:Washington state is CHEATING! (Score:5, Insightful)
That would make sense, as it takes a lot of energy to produce cold air, and simply using the existing cold air has a much lower carbon footprint (corporations don't really care; to them it means lower power bills). As for humidity, it should be lower inside the barn than outside, as the heat from the systems will still raise the temperature enough to drop the relative humidity to more reasonable levels. Probably still higher than optimal, but like you said, who cares? If you are running cheap enough gear, you'd replace it more often anyway.
Reading Wikipedia seems to indicate that it is much drier there than eastern Washington anyway. You could likely balance the humidity against the temperature, i.e., if you want lower humidity, you vent less and put up with higher temps inside the barn. If it is anything like Spokane, then most of the humidity comes in the winter, when the air is holding less water to start with, so letting the temp go up (by venting less) will do no harm while dramatically dropping the relative humidity. There aren't a lot of places where this would work, but this area might do just fine.
Re: (Score:2)
I meant it is drier than western Washington, but I think you get the picture...
How about at the Alaska pipeline? (Score:5, Interesting)
I seem to recall a business plan back in the late 1990s to do something similar adjacent to the Alaska pipeline, complete with a refinery.
The argument was that it would have
* Free air conditioning with the clean, dry, cold Alaska air.
* Unparalleled physical security - with miles of visibility in all directions.
* A well-protected network (if they could run their lines along the well-defended pipeline)
* Unlimited backup-generator fuel (tapped directly into the pipeline)
I seem to recall they raised funds. Wonder what happened to them.
Re: (Score:2)
I hate it when people take an interesting idea and point out the few edge cases where it won't work. It's obviously not a conventional datacenter, and I would likely think that the bean counters would have figured in the cost of the premium salaries and determined if the ratio of what they're saving on cooling costs would work, well before the plan even got put into mot
Re: (Score:2)
The 'dry side' of Washington is also a great deal *hotter* than the 'wet side'.
Re: (Score:2)
I thought that hot things in cold climates didn't produce condensation; it was cold things in hot climates that did. Like your glass of iced tea, or the AC coils in your HVAC system. By definition, a hotter item will increase the ambient temperature directly around itself, which automatically lowers the relative humidity (moving the air further from its dew point), which makes condensation impossible.
Re: (Score:2)
A bunch of "hot" machines in a cold climate will produce much more condensation
You seem to have your understanding backwards.
As you heat air, its water-carrying capacity rises and its relative humidity falls. So pumping cold outside air into your DC should not cause condensation problems in the DC (if people have been breathing in the DC you may get condensation when you release the air back into the atmosphere, but you probably don't care about that).
Condensation is mainly a problem when you have a cold o
Re: (Score:2)
Exactly. That is why the humidity isn't such a problem the hotter it gets. Just because hotter air can hold more water, that doesn't magically make water appear. Instead the relative humidity goes down, reducing any possibility of condensation. Read http://en.wikipedia.org/wiki/Relative_humidity [wikipedia.org] for more info. If that still doesn't make sense, you will just have to trust the grown ups on this one.
Re: (Score:2)
It is dipping well below freezing at night in Quincy right now, but this is the coldest part of the year. Central Washington isn't exactly cool in the summer months - the average highs for Quincy in July are in the mid-80s, and the record highs are pushing 110F.
Comment removed (Score:4, Interesting)
Re:Washington state is CHEATING! (Score:4, Interesting)
Re: (Score:2)
Re: (Score:2)
While we're on that idea, I can tell you that Paris has around 300 km of abandoned tunnels under it: bedrock inspection tunnels (dating from a time when Paris was expanding over the very ground from which its building stone was being extracted). There would be a humidity problem under some areas of the city (near the water table), but with that fixed, the temperature is a constant 14°C and the possibilities for air/cable distribution are endless (certain regions have been, in the past, transformed into bomb shelte
Re: (Score:2)
The temperature is not constant anymore once you add a lot of heat-generating servers to the environment. At that point you have to add heat-exchange equipment. In the Washington barn example someone thought of this; that part is missing from your tunnels.
One more thing: who owns those tunnels?
Re:Washington state is CHEATING! (Score:4, Insightful)
I think that's the idea, isn't it?
When you get right down to it, all they're doing is getting free cooling from low ambient temperatures. It's hovering just above freezing there. The only thing they're keeping off is rain. I'm not sure what they're doing about the humidity -- maybe "not caring".
Not caring is exactly what the point of the tent exercise was. Historically the datacenter industry has maintained baseline ideas like 70 percent humidity, 75 degrees temperature...ideas that haven't changed with server technology and durability improving.
The point is this: say we spend X dollars maintaining a datacenter environment for a baseline of reliability. If we spend 75% of X and reliability isn't significantly impacted, that's a win. But if we spend 10% of X and the extra failures cost less than the 90% we saved, that's a HUGE win.
Nobody expects that you can run an open air datacenter without increasing system failure rates, but the current datacenter paradigm just isn't scalable with modern high density systems, so something has to give. If "tradition" is the only thing it costs us, then tents it is!
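To make the break-even arithmetic above concrete, here's a minimal sketch in Python. Every figure is made up purely for illustration; none of them comes from Microsoft or any real facility.

```python
# Hypothetical numbers, only to show the break-even logic; not real costs.
baseline_facility_cost = 1_000_000   # "X": a year of conventional chilled-datacenter operation
open_air_facility_cost = 100_000     # the "10% of X" scenario
cost_per_failed_server = 3_000       # assumed replacement cost including labor
extra_failures_per_year = 150        # assumed extra failures from the rougher environment

open_air_total = open_air_facility_cost + extra_failures_per_year * cost_per_failed_server
print(open_air_total)                           # 550000
print(open_air_total < baseline_facility_cost)  # True: the "HUGE win" case
```

As long as the extra failures cost less than the facility savings, the rougher environment wins.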
As for Washington state "cheating"... nothing is going to work everywhere. Hell, desert country is far better for datacenters than cooler climates: it's much cheaper to cool hot, dry air than it is to dehumidify wet air.
Re: (Score:3, Informative)
Completely True.
I live in the Seattle area, and have a sump pump and dehumidifier in the basement. The dehumidifier has to run year round, and uses twice the energy in a day that the air conditioning uses in 5 days. Granted, the dehumidifier is about 25 years old, and the air conditioning is only 4 years old. But they have about the same volume of air to modify the composition of.
Western Washington is a really beautiful part of the country. But I have to admit, it's the least friendly place to any size
Re: (Score:3)
The problem is that the obvious portion of the equation is by far the smallest factor in the issues facing free-air cooling. I'm not talking about tent data centers (and good luck getting PCI compliance on that rig).
Part of thinking outside the norms, and part of the concept behind tent farms, is the approach to ambient temperature: no longer chasing the perfect balance, but just pushing the limits.
It turns out you really have to design this type of thing from the ground up in order for everything to really work
Re: (Score:2)
ideas that haven't changed with server technology and durability improving.
Servers are less durable now than they were a decade ago. They're made more cheaply, and they have tin whiskers to contend with. Obviously they are better in most ways than their predecessors, but longevity isn't one of those ways. Before I moved on from server engineering, I had retired a couple of NT4-era Pentium II-based systems that were still chugging away after over a decade of essentially 24/7 operation. Meanwhile, anything ba
Re: (Score:2)
Historically the datacenter industry has maintained baseline ideas like 70 percent humidity, 75 degrees temperature...ideas that haven't changed with server technology and durability improving.
Respectfully, I must disagree. Typical Data Centers usually shoot for closer to 45% or 50% RH, not 70% as you suggest.
Re: (Score:2)
Historically the datacenter industry has maintained baseline ideas like 70 percent humidity, 75 degrees temperature...ideas that haven't changed with server technology and durability improving.
Respectfully, I must disagree. Typical Data Centers usually shoot for closer to 45% or 50% RH, not 70% as you suggest.
hah, that's funny. I started thinking about humidity and rattled off my cigar humidor setpoint rather than the DC one. I think "respectfully" is the wrong answer to the suggestion of running a DC at 70% humidity; it should be more like "WTF?".
Re: (Score:2)
IMHO, Esperanza (Argentine Antarctica) is just about the most wet-dream-ideal place you could possibly build a datacenter, if you're willing to sink lots of capital into an investment with a 10-20 year payoff horizon. Let's see:
* Fiber-accessibility: not sure whether there's fiber there today (there probably IS), but the area's not glaciated, and the surrounding ocean isn't frozen (lots of icebergs, but the water itself remains liquid year-round), so laying fiber from Esperanza to Tierra del Fuego (or even
Re: (Score:2)
humidity is a non-problem in a strongly heated space, because relative humidity drops rapidly with increasing temperature.
If you take cool-and-wet air into a datacenter, the air ends up warm-and-dry. The air still contains the same amount of water, of course, but the *relative* humidity is much lower because the saturation point of water vapor in air rises rapidly with temperature.
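A back-of-the-envelope sketch of that effect, using the Magnus approximation for saturation vapor pressure; the temperatures and humidity are illustrative, not actual Quincy weather data.

```python
import math

def saturation_vapor_pressure(temp_c):
    # Magnus approximation for saturation vapor pressure over water, in hPa.
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def indoor_relative_humidity(outside_temp_c, outside_rh_pct, inside_temp_c):
    # RH of outside air once the servers have warmed it to the inside temperature,
    # assuming no moisture is added or removed along the way.
    vapor_pressure = outside_rh_pct / 100.0 * saturation_vapor_pressure(outside_temp_c)
    return 100.0 * vapor_pressure / saturation_vapor_pressure(inside_temp_c)

# Damp air just above freezing, warmed to 30 C by the racks:
print(indoor_relative_humidity(2.0, 85.0, 30.0))   # ~14% RH: far too dry to condense
```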
Re: (Score:2)
humidity is a non-problem in a strongly heated space
That's not necessarily true. While your point makes sense in an environment where there is adequate moisture in the air, the challenge is often keeping the humidity high enough. You'd be surprised how much humidifiers have to work in a Data Center to keep moisture in the air during colder times of the year. Air that is too dry is going to be just as harmful, if not more harmful, to sensitive electronics than overly humid air.
Re: (Score:3)
I want them to do it in Death Valley National Monument in August.
Re: (Score:2)
I want them to do it on the surface of THE SUN.
HAVE FUN COOLING THAT M$ BITCHES!!!1!eleven!
Re: (Score:2)
They can run it at night, when it's cooler.
Re: (Score:3)
It probably wouldn't matter. If you have enough airflow, you can pull enough heat away from the systems to avoid failure. It's not like the chips and stuff run below 100 degrees F under load much anyways.
If they were liquid cooled, they probably could get away with a bit hotter ambient temps. What matters is the ability to pull excess heat away from the machines faster than it's created. In this situation you would be both heating and cooling the systems to maintain an average temp. As long as the outside a
Re: (Score:2)
When the temps in my hometown hit 128F back in '94, I was driving an '83 GMC truck. The orange needle that showed which position the automatic transmission was in more or less melted in half and drooped all the way to the left.
Even if the CPU can handle that, I'm not sure everything else involved in the chain could.
Re: (Score:2)
I respectfully submit that you're looking at the incident the wrong way. You melted a $0.50 part, but the truck kept going. Operating in temperatures outside the normal range caused no downtime, just like with these servers.
Re: (Score:2)
Well, the needle didn't melt just because it was 128F (53.3C). You see, on an 80 degree day, [familycarguide.com] a parked car can reach as high as 110F (43.3C) in just 20 minutes and go on to reach 130 degrees F (54.4C) not too long afterward. So the needle should already have survived 128 degrees on its own. This is also why you don't leave children or animals in parked cars.
But if the air is moving through it, it doesn't get that hot. In fact, it stays about the same temp as the outside give or take. And if you perspire or are
Re: (Score:2)
Well, it was parked outside of a school in an area populated with little shits you didn't trust with anything. Not only was it parked directly in the sun, the truck was the dark blue/gray combo of the era and the windows were not cracked in the least. Since it was near a school cafeteria, a cracked window would have meant a carton of chocolate milk dumped in your seat. Temps were probably in the 150ish range in the truck. My sister's Walkman was physically warped by the heat.
Re: (Score:2)
Could have been hotter.
I used one of those laser thermometers to check the temp of a seat in my car after I burned my ass through my pants sitting on it. It was only about 92 out, but the seat itself was at 161F.
To put that into perspective, you can get a third-degree burn from water at 160 in about 1 second, and from air at 160 you can get a second-degree burn in 60 seconds or less. Of course I had pants on, so I wasn't exposed to anything that severe. But it left me with a reminder for a co
Troll? (Score:2)
The above is 100% true. Bite my hairy fleshy ass.
Re: (Score:2)
Re: (Score:2)
It may be. I just knew there was a cost to cooling the air in addition to circulating it.
Re: (Score:3)
Since I live in Eastern WA, and visit Quincy now and again (the local Honda motorcycle dealer is there), I have to mention that the summer peak temperatures can reach 105 to 110 F. These usually only last a few hours. My rule of thumb is that if it's 90 F by 9 AM, it's going to be a hot day, probably over 100. By sundown it will be under 90, and an hour after that about 70. And it should be down to about 55 by morning.
Humidity is about 15% at mid day, so a swamp cooler would work fine for cooling the hardware.
Wint
Re: (Score:2)
Re: (Score:2)
Yes, did Microsoft test for the wall-less server installation's resistance to badly-aimed pepper shot?
Mike Rowe (Score:2)
Re: (Score:3)
Re: (Score:2)
On that thought, do barns have windows?
Those that do, very few, in my experience... best keep it that way. Linux will run on almost everything...
Got a barn and a Debian distro CD? Farmer, meet Dell.
I suspect... (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I suspect that walls are useful not only for controlling the ambient data center physical conditions, but also for keeping criminals out. Forget about MTTF. What is the Mean Time to being Stolen by High School Kids for a "data center in a tent"?
What is the Mean Time for Bug/Rat Infestation?
Re: (Score:3)
Re: (Score:2)
isn't that what chain link fencing and razor wire are for?
no failures? (Score:5, Funny)
In September 2008, it successfully ran a stack of HP servers in a tent for seven months, apparently with no failures.
So they weren't actually running microsoft software on those servers?
Re: (Score:2)
7 months in sep 08? (Score:2, Funny)
"In September 2008, it successfully ran a stack of HP servers in a tent for seven months, apparently with no failures."
Truly, it was an impressive feat of time dilation.
Re: (Score:2)
Re: (Score:2)
Or it was September for a long damn time.
Would've been possible if they ran the test in '93.
Re: (Score:2)
They just installed AOL on the servers. Eternal September, man.
Re: (Score:2)
...and eternal 1992!
The Barn Door? (Score:2)
Re: (Score:2)
Yep, as usual. They used to call that a 'service pack'. Wonder what they'll call it now?
The myth of cooling (Score:4, Informative)
In our datacenters (I work for a major IT company) we've actually done some research on running data centers at higher temperatures overall. The funny thing that came out of this...in the attempt to figure out where the magical "65 degrees" requirement came from, we had to do a lot of digging. It turns out that the requirement came from old APC UPS systems, which mandated that environmental temperature. We're discovering that data centers can be run WAY warmer than that with no ill effect, provided you still have good airflow.
Re: (Score:3)
Re: (Score:2)
We keep ours at a cold-side target of 72; we could go higher, but having a bit of a buffer before things go tits up if both AC systems short-cycle during a power outage is a good thing IMHO, especially since off-hours our response time can be over an hour from first page to someone being onsite.
Re: (Score:2)
We're discovering that data centers can be run WAY warmer than that with no ill effect, provided you still have good airflow.
Not really. You're using up the lifespan of the semiconductors faster. You need to look at the cumulative effect of temperature on semiconductor electromigration. [wikipedia.org] This is a real issue, because current high-density ICs don't have a big safety margin in this area. There's a straightforward relationship, Black's equation [wikipedia.org] from which this can be computed. Notice that mean time to
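For a sense of the scaling the parent is pointing at: with current density held constant, only the Arrhenius term of Black's equation matters, so the ratio of lifetimes at two junction temperatures is easy to sketch. The activation energy below is an assumed typical value; real parts vary.

```python
import math

BOLTZMANN_EV_PER_K = 8.617e-5   # Boltzmann constant in eV/K
ACTIVATION_ENERGY_EV = 0.7      # assumed typical value for interconnect electromigration

def mttf_ratio(cooler_c, hotter_c):
    # How many times longer the electromigration MTTF is at the cooler junction
    # temperature than at the hotter one, per Black's equation with fixed current
    # density (the prefactor and the J**-n term cancel out of the ratio).
    cooler_k, hotter_k = cooler_c + 273.15, hotter_c + 273.15
    return math.exp(ACTIVATION_ENERGY_EV / BOLTZMANN_EV_PER_K *
                    (1.0 / cooler_k - 1.0 / hotter_k))

print(mttf_ratio(65.0, 85.0))   # ~3.8: junctions 20 C hotter last roughly a quarter as long
```

Whether that shorter lifetime actually matters is the same cost trade-off argued upthread.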
Re: (Score:2)
Not really. You're using up the lifespan of the semiconductors faster.
And if your equipment is cheaper to replace than paying for air-conditioning, it's still worth running them hot.
Re: (Score:2)
Not really. You're using up the lifespan of the semiconductors faster.
And if your equipment is cheaper to replace than paying for air-conditioning, it's still worth running them hot.
You also need to take the cost of recovering from those more frequent failures into account; transient failures might corrupt data without you even noticing.
worked on computers in a hangar once. (Score:3)
Cut my "hacker's teeth" on some computers that were located in a WW II era bomber-plant hangar. (Built mostly of wood because the steel was being used for war machines.)
Place had issues with mice and rats getting under the raised floor and chewing on the cabling.
Re: (Score:2)
At least 5.
Re: (Score:2)
Not enough. With around 2.7 times that you can easily cover 150 metres from the main switch. That's the answer for a remote datacentre built from natural logs.
Re: (Score:2)
You're right. They're high in fiber and you can't really get enough.
Exotic datacenter == CIO hobby? (Score:4, Insightful)
So we've had datacenters in shipping containers, [slashdot.org] and floating at sea, [slashdot.org] and now in a barn. Is this just large-scale case-modding for CIO's at rich companies? :-)
Re: (Score:2)
"Plenty of cities have submitted bids for the Google Fiber project, with most of their bids being centered around the attributes that could describe many communities. Yet one small midwestern town, with much less fanfare than the metropolitan bids, provided an unusual proposition for Google in their likely quixotic nomination. Quincy, IL, has an extensive series of underground caverns that could provide year-round temperature control, dedicated hydroelectric power, and security in the case of a terrorist attack."
Re: (Score:3)
Cash registers are computers.
Look at where those live.
If these things can survive in a bar on the beach in a popular port-of-call for the Navy, they never did need walls.
Re: (Score:2)
Re: (Score:2)
Shipping containers are interesting in all sorts of cases like putting up a datacenter quickly in a warehouse in an industrial park anywhere in the world.
Shipping containers are useful for placing data center equipment where it otherwise does not belong; however, it's foolish to think that they are actually efficient at anything. They have to contain built-in fans and air conditioners, so density and efficiency are lower than for the same floor filled with racks. They need access to the airflow -- you can't just stack them or place them next to each other. They need an outside electric panel. For a data center you still need giant UPSes and/or generators. And replacin
Moooo! Your server is online. (Score:2)
A server farm in a barn. About damn time! Now we just need a cloud in the air, and power through water, perhaps some storage in manure, and the future will be the past!
Re: (Score:2)
...perhaps some storage in manure...
Yeah, I remember 7200.8 Seagate drives too...
Re: (Score:2)
A server farm in a barn. About damn time! Now we just need a cloud in the air, and power through water, perhaps some storage in manure, and the future will be the past!
The future-in-the-past [electronista.com] is now!
Tis a fine barn (Score:2)
'Tis a fine Barn, English, but surely 'tis no datacenter.
Simpsons Amish [maxim.com]
Got yer problem... (Score:5, Funny)
Yer SQL server crashin'? Lemme have a look at 'er...
Ah! Found it right here... possums! Ya gots possums livin' in yer SNA-box-thingie. Heh... SNA... that always did sound dirty. Anyways, lemme get my plinkin' rifle and my coon dog Skeeter, we'll git yer back up and runnin'!
Seein' as I'll be in there anyways, y'all want a RAM upgrade?
No walls? (Score:2)
It was not the first time Microsoft had toyed with the idea of a datacenter without walls.
I think they might be taking the "Windows everywhere" philosophy a little bit too seriously.
It's been said many times .... (Score:2)
So by virtually they mean not. (Score:2)
I highly doubt they're going to let the rain get in.
Otherwise, with an ambient temperature under 100F year-round, their gear should run fine.
Until the birds start nesting in it...
Re: (Score:2)
I wonder about insects, pollen, and dust. Quincy WA is out in the central, agricultural part of the state.
Seems like those might have an adverse effect on servers.
Oh, and it can get over 100F there. I'm sure Microsoft knows all of this but I wonder how they will cope with it.
Re: (Score:2)
This is all well and good.... (Score:2)
What is the first thing you notice. (Score:2)
What is the first thing you notice when stepping into a barn?
That's right, the smell of shit!
And so did Yahoo! (Score:2)
And so did Yahoo!
Microsoft and Google aren't the only people playing this music.
http://www.datacenterknowledge.com/archives/2009/06/30/yahoos-fresh-air-computing-coop/ [datacenterknowledge.com]
http://www.computerworld.com/s/article/9186618/Yahoo_opens_chicken_coop_green_data_center?source=rss_news [computerworld.com]
Dust, air filtering? (Score:2)
The thing I've always wondered, something I've never seen mentioned, is how they deal with dust. OK, so the walls keep out the large chunks, but what do they do to keep from drawing small particulate matter into their servers? I assume that they have filters on the intake vents, but they'd have to be more substantial than the ones used in facilities with traditional air conditioning, which would be a somewhat more closed environment, where the hot air circulates through the cooling system on its way back
Re: (Score:2)
Don't quote me on this, but I've read that air quality is typically worse in a house since it's such a closed environment. Dust comes in, settles, but can't get out.
Photos - The "barn" and container assembly (Score:4, Informative)
The idea is not new (Score:2)
Sun did some experimentation with self-cooling datacenters a few years ago.
http://www.youtube.com/watch?v=ZaEsFDjalvw [youtube.com]
Datacentre without walls (Score:2)
Who needs windows in a datacentre without walls?
Finally... (Score:2)
Re:interns (Score:5, Funny)
At least it's a stable job.
Re: (Score:2)
Re: (Score:3)
Billie, git yo-self behind th' barn 'n' cut a switch, you bin a baaaad boy!
Re: (Score:2)
You know, for a program, that probably beats being assigned to games... CLU probably won't even notice unless some chore was skipped.