miller60 writes: After years of deploying cloud capacity in enclosures resembling shipping containers, Microsoft has updated its data center design, returning to a more traditional data hall with a slab floor and hot-aisle containment. The new Generation 5 design also uses a fan wall to manage airflow through the server rooms. The redesign arrives as Microsoft’s cloud computing business experiences rapid growth, with the company both building its own data centers and leasing third-party "wholesale" IT space. Microsoft says the previous modular design, in use since 2008, was extremely efficient, but the company couldn't manufacture the enclosures fast enough to keep pace with its cloud deployments.
miller60 writes: Facebook is investing heavily in machine learning to build a smarter newsfeed for its 1.6 billion users. The key to this effort is Big Sur, a GPU-powered server that dramatically accelerates Facebook's AI training. This week Facebook offered a look inside its Oregon data center to show off Big Sur and discuss its ambitions in using machine learning.
miller60 writes: An offbeat idea hatched in a lab at Facebook has now evolved into a new class of optical cold storage. Both Sony and Panasonic have rolled out commercial versions of data archiving systems using robots to retrieve data stored on high-capacity Blu-Ray discs. Both systems are based on experiments at Facebook, which showed off the technology at the Open Compute Summit and then tested the systems at its data centers. A key selling point for these products is the ability to lower the cost of data archiving, as Blu-Ray systems offer considerable energy savings, since energy is only needed when writing data during the initial burn.
miller60 writes: The Internet of Things will lead to new types of data centers, optimized around the needs of machine-to-machine (M2M) workloads and analytics to mine oceans of data. Industry executives foresee unmanned facilities placed closer to networks of IoT devices, where software and analytics can make real-time adjustments to data center configuration. The nature of M2M traffic presents an opportunity for design innovation. “The machine is fundamentally a different type of consumer, driven by software and defined by software,” said IO's George Slessman. “The new machine consumer is going to be a much more compliant user, and can be told what to do and when to do it.” Machine algorithms will be less sensitive to failure, which could allow them to run with less UPS and generator backup.
miller60 writes: Microsoft is encouraged by the results of its research on undersea data centers, and is continuing to develop the concept. A larger deployment is likely the next step, Microsoft's Ben Cutler told an industry conference. Last year Microsoft deployed a sealed container in 30 feet of water off the coast of California, where the micro server farm operated on the ocean floor for 105 days. Microsoft says the system ran with no hardware failures and exceptional energy efficiency, and the company is exploring ways to power future deployments with offshore energy sources such as wind or wave power.
miller60 writes: In an effort to rethink the way data centers are built, San Diego startup ScaleMatrix has effectively shrunk the data center into a single cabinet. The result is a two-compartment cabinet that can support IT power loads of up to 78kW in a single rack. The company's approach has elements of the outdoor server huts deployed a few years back by AOL, but ScaleMatrix sees the cabinets as building blocks for a larger colocation and cloud infrastructure.
miller60 writes: The Switch SuperNAP in Las Vegas is among the most distinctive data centers in the world, both in its scale and its ability to support high-density workloads. The SuperNAP was recently ranked as the largest cloud server farm, and Switch has now announced plans for new projects that will dwarf the three-building Las Vegas campus. These include a massive Reno project featuring a single data center building that will span more than 1 million square feet, and eventually house 6 million SF of server space for eBay and other tenants. Switch also plans to spend $5 billion to convert a pyramid-shaped building in Michigan into a 2 million square foot cloud campus.
miller60 writes: Citing strong demand from cryptocurrency miners, data center and colocation providers are beginning to accept Bitcoin as payment for large chunks of data center space. It's a sign that the data center industry sees an emerging opportunity in catering to the hosting needs of crypto miners, who typically seek high-density space with cheap power. While many web hosting companies accept Bitcoin, larger data center players have been slower to embrace cryptocurrency. Utah-based C7 Data Centers says it's accepting Bitcoin because of surging demand, and that it now hosts about 4.5 megawatts of mining gear, just down the road from the NSA data center.
miller60 writes: As it continues its global expansion, Facebook wants to be able to build twice the data center capacity in the same amount of time. So it's hacking data center construction, assembling teams of designers and experts in lean construction. The outcome: Facebook is evaluating two new concepts for building its future server farms. One involves modular construction, shipping large pre-fabricated “building blocks” that can be rapidly put together, much like Legos. The second design focuses on the use of IKEA-style kits filled with lightweight parts that can be assembled on-site. Either could mean that Facebook ditches its distinctive two-story "penthouse" cooling system.
miller60 writes: After getting started in garages and server closets, Bitcoin mining is moving into data centers and the cloud. Large mining operations are beginning to follow the example of their forerunners in hyperscale computing, shifting compute capacity to remote areas with cheap power, including Iceland and central Washington. Some are leasing data centers from major providers, while other bitcoin entrepreneurs are developing custom facilities to house high-density hardware, ranging from makeshift server farms in warehouses packed with fans, all the way to futuristic racks of sleek, liquid-cooled immersion rigs in Hong Kong.
miller60 writes: How do you cool a high-density server installation inside a high rise in Hong Kong? You dunk the servers, immersing them in fluid to create an extremely efficient HPC environment in a hot, humid location. Hong Kong's Allied Control developed its immersion cooling solution using a technique called open bath immersion (OBI), which uses 3M's Novec fluid. OBI is an example of passive two-phase cooling, which uses a boiling liquid to remove heat from a surface and then condenses the liquid for reuse, all without a pump. It's a slightly different approach to immersion cooling from the Green Revolution Cooling technique being tested by Intel and deployed at scale by energy companies. Other players in immersion cooling include Iceotope and Hardcore (now LiquidCool).
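The appeal of passive two-phase cooling comes down to a simple steady-state energy balance: every watt of server heat must vaporize fluid at a rate set by the fluid's latent heat of vaporization. The sketch below illustrates that relationship; the rack power and latent-heat figures are round illustrative assumptions, not Allied Control's or 3M's published specs.

```python
# Back-of-the-envelope sizing for passive two-phase immersion cooling.
# At steady state, all server heat goes into boiling the dielectric fluid,
# so the vapor mass flow the condenser must handle is m_dot = P / h_fg.
# Both constants below are assumed round numbers for illustration only.

RACK_POWER_KW = 90.0            # heat load of one high-density rack, kW (assumed)
LATENT_HEAT_KJ_PER_KG = 100.0   # h_fg of a Novec-class fluid, kJ/kg (assumed)

def boil_off_rate_kg_per_s(power_kw: float, h_fg_kj_per_kg: float) -> float:
    """Mass of fluid vaporized per second to carry away `power_kw` of heat."""
    # kW = kJ/s, so kJ/s divided by kJ/kg gives kg/s directly.
    return power_kw / h_fg_kj_per_kg

rate = boil_off_rate_kg_per_s(RACK_POWER_KW, LATENT_HEAT_KJ_PER_KG)
print(f"Condenser must return {rate:.2f} kg/s of fluid")  # 0.90 kg/s
```

Because the vapor rises and condenses on its own, no pump is needed; the condenser area just has to keep up with this boil-off rate.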
miller60 writes: Microsoft wants to bring power generation inside the rack. The company says it will test racks with built-in fuel cells, a move that would eliminate the need for expensive power distribution systems seen in traditional data centers. Using a rack-level fuel cell can “collapse the entire energy supply chain, from the power plant to the server motherboard, into the confines of a single server cabinet,” says Microsoft, which plans to use biogas as fuel. The plan builds on Microsoft's plan for poop-powered data centers built alongside water treatment plants. The company has published a white paper describing its research.
miller60 writes: The London Internet Exchange (LINX) is teaming with Dutch data center provider EvoSwitch to start a European-style neutral internet exchange in northern Virginia. In the European model, traffic exchanges are managed by participants, rather than the colocation providers hosting the infrastructure. LINX will launch in EvoSwitch's Manassas facility, but also build a fiber ring to expand the exchange to at least two other sites in Virginia. The project is part of a broader effort to launch Euro-style exchanges as an alternative to Equinix and other commercial network hubs focused in single facilities. In London, the LINX spans 10 data centers run by four different colo providers.
miller60 writes: It lasted just 10 seconds. But a barrage of Tweets from fans of Hayao Miyazaki's 1986 anime film "Castle in the Sky" set an all-time Twitter traffic record on Aug. 3, hitting 143,199 Tweets per second. The event provided an unusual test of Twitter's infrastructure, which has been broadly retooled since a series of embarrassing outages during the 2010 World Cup. The focused Tweetstorm during "Castle in the Sky" is tied to the practice of tweeting a key line of dialogue as it is spoken in the film.
miller60 writes: On July 1, 2012, the leap second time-handling bug caused many Linux servers to get stuck in a loop. Large data centers saw power usage spike, sometimes by megawatts. The resulting “server storm” prompted Facebook to develop new data center infrastructure management (DCIM) software, providing real-time data on everything from the servers to the generators. The incident also offered insights into the value of flexible power design in its server farms, which kept the status updates flowing as the company nearly maxed out its power capacity.