Microsoft Ends 'Project Natick' Underwater Data Center Experiment Despite Success (techspot.com)

Microsoft has decided to end its Project Natick experiment, which involved submerging a data center capsule 120 miles off the coast of Scotland to explore the feasibility of deploying underwater data centers. TechSpot's Rob Thubron reports: Project Natick's origins stretch all the way back to 2013. Following a three-month trial in the Pacific, a submersible data center capsule was deployed 120 miles off the coast of Scotland in 2018. It was brought back to the surface in 2020, offering what were said to be promising results. Microsoft lost six of the 855 servers in the capsule during its time underwater; in a comparison experiment run simultaneously on dry land, it lost eight out of 135 servers. Microsoft noted that the constant temperature stability of the external seawater was a factor in the experiment's success. It also highlighted how the data center was filled with inert nitrogen gas that protected the servers, as opposed to the reactive oxygen in the land data center.
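The reported server counts imply a striking reliability gap between the two environments. A quick back-of-the-envelope check (the counts are from the article; the arithmetic is mine):

```python
# Failure counts as reported in the article
underwater_failed, underwater_total = 6, 855
land_failed, land_total = 8, 135

underwater_rate = underwater_failed / underwater_total
land_rate = land_failed / land_total

print(f"underwater: {underwater_rate:.2%}")            # 0.70%
print(f"land:       {land_rate:.2%}")                  # 5.93%
print(f"ratio:      {land_rate / underwater_rate:.1f}x")  # 8.4x
```

In other words, the land-based control group failed at roughly eight times the rate of the submerged pod, which is what made the "despite success" framing of the shutdown notable.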

Despite everything going so well, Microsoft is discontinuing Project Natick. "I'm not building subsea data centers anywhere in the world," Noelle Walsh, the head of the company's Cloud Operations + Innovation (CO+I) division, told DatacenterDynamics. "My team worked on it, and it worked. We learned a lot about operations below sea level and vibration and impacts on the server. So we'll apply those learnings to other cases," Walsh added.

Microsoft also patented a high-pressure data center in 2019 and an artificial reef data center in 2017, but it seems the company is putting resources into traditional builds for now. "I would say now we're getting more focused," Walsh said. "We like to do R&D and try things out, and you learn something here and it may fly over there. But I'd say now, it's very focused." "While we don't currently have data centers in the water, we will continue to use Project Natick as a research platform to explore, test, and validate new concepts around data center reliability and sustainability, for example with liquid immersion."



  • It's obviously not.

    • Experiments can be successful for reasons other than "this is a way to save money."
      • Re: (Score:2, Insightful)

        by SeaFox ( 739806 )

        Not in modern America. If research doesn't find a way to create wealth for someone it's not worth doing. You think we're expanding our efforts in A.I. for the good of mankind or something? /s

      • Everybody that volunteered to work at that data center was on two lists: the work-from-home list and do-not-call list.

        • Everybody that volunteered to work at that data center was on two lists: the work-from-home list and do-not-call list.

          It was filled with pure nitrogen; they'd be on the dead-on-arrival list.

    • Why do you suspect Highlander is doing it, then?

      PS Microsoft should publish a paper.

Economics is a multi-faceted issue; the modules were relatively small -- about the size of a 20' shipping container, IIRC -- and greater value was likely placed on the ability to upgrade hardware than on operating costs. Something ~10-40x the size might yield different results in terms of economy, or some type of cluster arrangement that allowed upgraded computing hardware to be easily loaded and re-submerged after 5 years.

      • by cusco ( 717999 )

        That small? It apparently had 855 servers racked in that space, that's incredibly dense. If cooling is that good I wouldn't want to change the surface area to CPU ratio much. It would probably be better to just use a series of these units chained together rather than make them into one big one.

      • Economical is a multi-faceted issue; the modules were relatively small-- about the size of a 20' shipping container IIRC, and a greater value was likely placed on the ability to upgrade hardware rather than operating costs.

        Since the guy didn't explain what the reason for not continuing is, we have to speculate, but that is almost certainly not it. The data pod is a high-density server farm providing generalized computing services ("cloud") that runs unserviced for years. If you want to "upgrade" (replace) the hardware, you pull the existing pod and deploy a new one while the old one is sent to the service center for replacement/reconditioning.

        I suspect it is simply that they don't currently want to open underwater data centers.

    • Or maybe it is continuing, classified and a military thing. Economical isn't part of the lexicon there.

      • by La Gris ( 531858 )

        +1
        I think you nailed it. The know-how to produce, deploy and maintain deep sea data-centres is potentially valuable for intelligence and military operations.

    • "Homer that's your solution for everything!
      It's not going to happen."
  • At least while Dr. Quinn was running the project. But, per usual, Captain Murphy got bored, started messing with it, took it off on a bizarre tangent and... well the details are fuzzy but it ended with an explosion.

  • We learned a lot about operations below sea level and vibration and impacts on the server. So we'll apply those learnings to other cases.

    And now for a partnership with SpaceX to test a datacenter in orbit.

    • Maybe, but in space they'd sacrifice the primary benefit -- free access to the world's largest heat sink.

      • by drnb ( 2434720 )

        Maybe, but in space they'd sacrifice the primary benefit -- free access to the world's largest heat sink.

        On the other hand they would have free power.

    • by cusco ( 717999 )

      That would be a lot more problematic; cooling is easy underwater, but expensive when there's no medium to transfer the heat to.

      • by drnb ( 2434720 )
        FWIW, collecting heat and emitting as infrared.

        "Active Thermal Control System (ATCS) Overview
        Heat Rejection Subsystem (HRS) The HRS consists of the radiator ORU, which is a deployable, eight-panel system that rejects thermal energy via radiation."
        https://www.nasa.gov/wp-conten... [nasa.gov]
        • by cusco ( 717999 )

          Yes, but now we're back to active cooling rather than passive like underwater, and radiative IR cooling is not very efficient. When the Space Shuttle reached orbit, the first thing it did was open the bay doors to expose the radiators to space, because without them the living space would overheat in just a few hours. Cooling is one of the concerns when a new module or large experiment is sent to the ISS; if the experiment generates more heat than it can get rid of, it can only operate in periods where the station has spare cooling capacity.

  • Or somewhere else near the southern tip of Labrador. Cold water for cooling, cheap electricity, and close proximity to both Europe and the US Eastern Seaboard. Plus, with the North American Free Trade Agreement, the Canadian governments can't renege on any deals they make with foreign companies.
  • "It also highlighted how the data center was filled with inert nitrogen gas that protected the servers, as opposed to the reactive oxygen gas in the land data center."

    Is 'reactive oxygen gas' another way of saying air?

    How do you do physical maintenance on equipment in nitrogen filled datacenters?

    • all my data centers are filled with a proprietary mixture of approximately 78% nitrogen, 21% oxygen, and a small amount of other gasses that i can't divulge. trade secret.

      interestingly enough, i run the same mixture in my car tires and it does pretty good there as well

      • by cusco ( 717999 )

        You use a distributed cloud deployment: probably RAID 10 configurations throughout, no dedicated single-use servers, no single point of failure. If a box or a dozen fail, the other 800+ just pick up the slack; they probably designed the deployment with a 10-15% cushion of extra capacity, since that's what they would do at a land-based DC deployment.

    • You don't. If it needs servicing, you haul it to the surface, open the doors and air it out, then do your servicing.

      Close it up, purge it and drop it overboard.

      I imagine the idea was that you basically don't service it - just leave the failing hardware there until it makes sense to do a major overhaul.

      • by cusco ( 717999 )

        Otherwise referred to as a "lights-out data center"; there are quite a few of them around the world above water.

    • by necro81 ( 917438 )

      How do you do physical maintenance on equipment in nitrogen filled datacenters?

      How do you do physical maintenance on equipment in a sealed pressure vessel hundreds of feet underwater?

  • Huge trade off (Score:5, Insightful)

    by llZENll ( 545605 ) on Monday June 24, 2024 @10:07PM (#64575197)

    Huge negative tradeoffs to being underwater: expensive deployment, expensive maintenance, expensive enclosure, expensive retrieval. Temperature regulation on land is not that difficult, and if nitrogen makes that big a difference then you can do that on land as well. Or do both by submerging in mineral oil. All the extra cost in either case probably isn't worth saving a few servers; after 5 years the servers' value is close to 0.

  • by BetterThanCaesar ( 625636 ) on Tuesday June 25, 2024 @03:20AM (#64575633)
    I bet it wasn't compatible with Microsoft Surface.
  • Our intel agencies have tapped underwater data cables before. They've partnered with corporations to do 'big stuff', e.g. the Glomar Explorer, ostensibly launched to gather ore nodules from the seabed but actually an intelligence op. I wonder whether this project had some DoD funding, with an eye toward attaching one of these to a submarine data cable and looking for stuff of interest to send on via however those earlier taps worked.
    • by cusco ( 717999 )

      A former co-irker had worked for Raytheon several years prior, on their early robotics projects. He was able to tell me that he had been on a US Navy sub for several months trespassing in Soviet waters. I eventually figured out that it was probably the mission that tapped the Vladivostok submarine cable.

  • As we've all witnessed over the last few years, it's too easy for rogue states to sever underwater fiber optic cables.

    I have a feeling Microsoft is reading between the lines of the current geopolitical situation and is worried a foreign sub could take out their underwater data centers.
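On the radiative-cooling point raised upthread: a rough Stefan-Boltzmann sketch shows why dumping heat to space is so much harder than dumping it into seawater. The panel temperature, emissivity, and 10 kW load below are illustrative assumptions, not figures from the thread:

```python
# Stefan-Boltzmann: radiated power P = emissivity * sigma * A * T^4
SIGMA = 5.670374419e-8  # W / (m^2 K^4)

def radiator_area(power_w, panel_temp_k=300.0, emissivity=0.9):
    """Ideal panel area needed to radiate power_w to a ~0 K deep-space sink."""
    flux = emissivity * SIGMA * panel_temp_k ** 4  # W per m^2 of panel
    return power_w / flux

# A hypothetical 10 kW server pod would need roughly 24 m^2 of ideal radiator:
print(f"{radiator_area(10_000):.1f} m^2")  # 24.2 m^2
```

Seawater, by contrast, carries heat away convectively at far higher flux for a modest temperature difference, which is why the submerged pod could cool passively.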
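One comment guesses at a 10-15% spare-capacity cushion for absorbing hardware failures. As a hypothetical sketch of that math (the cushion figure is the commenter's speculation, not a published Microsoft number):

```python
# Hypothetical spare-capacity check for the submerged pod
TOTAL_SERVERS = 855
CUSHION = 0.10  # assumed 10% spare capacity

tolerable_failures = int(TOTAL_SERVERS * CUSHION)  # failures absorbed before shortfall
actual_failures = 6  # from the article

print(tolerable_failures)                      # 85
print(actual_failures <= tolerable_failures)   # True
```

Under that assumption the pod could lose 85 servers before capacity suffered, so the six it actually lost over the deployment were well within margin.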
