1sockchuck writes: America’s data centers used an estimated 70 billion kilowatt hours of energy in 2014, an increase of just four percent from 2010, accounting for 1.8 percent of total U.S. electricity consumption. That’s a huge change from the previous trend, which saw double-digit annual gains in data center power usage. Why the change? Data centers have dramatically improved their power and cooling, and the shift to cloud platforms has concentrated workloads in high-efficiency hyperscale facilities. The study projects that the industry’s improved use of energy will yield $60 billion in energy savings by 2020. In addition to the full report, there's a summary from Lawrence Berkeley Labs.
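As a rough sanity check on the report's headline figures (using an approximate value of about 3.9 trillion kWh for total 2014 U.S. electricity consumption, which is an assumption here, not a number from the story), 70 billion kWh does work out to roughly 1.8 percent:

```python
# Back-of-the-envelope check of the report's figures (approximate values).
dc_energy_kwh = 70e9    # estimated 2014 data center usage from the report
us_total_kwh = 3.9e12   # approximate total U.S. electricity consumption, 2014

share = dc_energy_kwh / us_total_kwh
print(f"{share:.1%}")   # roughly 1.8%
```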
1sockchuck writes: Google is building taller data centers to pack more server capacity into its cloud campuses. The company has doubled the height of its server farms, constructing four-story buildings in Oklahoma and Iowa. The shift is being driven by the accelerating demand for cloud space, as well as the data center arms race between Google, Amazon and Microsoft. "The movement to cloud is really accelerating,” said Joe Kava, Vice President for Data Center Operations at Google. "There’s only been a few inflection points like this in the history of IT. I believe we will look back and see this as an important moment. This is really a transformational time."
1sockchuck writes: The cloud computing arms race is accelerating, and the battle will be waged with data centers. Microsoft is reserving huge amounts of cloud capacity, leasing space from data center providers in Northern Virginia, Silicon Valley and San Antonio. The company has more than a million servers in 22 regions around the world, and plans to add six more regions this year. The move follows a major expansion by Google, which announced plans to add 12 data centers to step up its cloud game. Both are chasing Amazon Web Services, which will hit $10 billion in revenue this year.
1sockchuck writes: A growing number of data center customers are using a hybrid power design, running mission-critical workloads in traditional data halls, while shifting other apps to space that offers high-density cooling, but with no generator or UPS support. As a result, service providers are retooling their data centers to offer both types of space under the same roof. The "low resiliency" offering is way cheaper, in some cases less than half the cost of standard Tier III data halls. This offers the ability to match costs to workload, while allowing data center operators to retain applications that otherwise might be considered for cloud.
1sockchuck writes: Cloud computing was supposed to kill the data center. Instead, these are boom times for companies that build data centers. While some cloud players build their own data centers, others continue to lease space to expand their platform. Case in point: Developer DuPont Fabros Technology has leased nearly 60 megawatts of data center space in the last six months, including a 16-megawatt deal that will fill an entire new building in Santa Clara. Microsoft builds its own huge data centers in some cities, but it also leased 27 megawatts of capacity last year. Google, which just announced a major cloud expansion, said some of its 12 new cloud regions will be in leased space.
1sockchuck writes: Facebook is building a new generation of open hardware, part of its vision for powerful data centers that will use artificial intelligence and virtual reality to deliver experiences over the Internet. At the Open Compute Summit, Facebook shared details of its work to integrate more SSDs, GPUs, NVM and a "Just a Bunch of Flash" storage sled to accelerate its infrastructure. The company’s infrastructure ambitions are also powered by CEO Mark Zuckerberg’s embrace of virtual reality, reflected in the $2 billion acquisition of VR pioneer Oculus. “Over the next decade, we’re going to build experiences that rely more on technology like artificial intelligence and virtual reality,” said Zuckerberg. “These will require a lot more computing power."
1sockchuck writes: As the Open Compute Project turns five, it is growing beyond its roots in hyperscale data centers. The path to a larger market ran through a Rackspace data center in Northern Virginia, where the open source servers and racks — which were originally developed at Facebook — were adapted for use in a commercial data center with traditional power distribution. Rackspace, which is using Open Compute servers to power its managed cloud platform, worked closely with OCP vendors like Quanta, Wistron, Delta and Cloudline (HPE/FoxConn) to develop racks and servers that could be productized so other companies can use open hardware in colocation environments. The Open Compute Project will discuss its progress next week at its annual summit in San Jose.
1sockchuck writes: A public utility in Washington state wants to raise rates for high-density power users, citing a flood of requests for electricity to power bitcoin mining operations. Chelan County has some of the cheapest power in the nation, supported by hydroelectric generation from dams along the Columbia River. That got the attention of bitcoin miners, prompting requests to provision 220 megawatts of additional power. After a one-year moratorium, the Chelan utility now wants to raise rates for high-density users (more than 250 kW per square foot) from 3 cents to 5 cents per kilowatt hour. Bitcoin businesses say the rate hike is discriminatory. But Chelan officials cite the transient nature of the bitcoin business as a risk to recovering their costs for provisioning new power capacity.
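To see why miners object, here is an illustrative cost comparison under the two rates from the story, using a hypothetical 1 MW mining load (the load size is an assumption for illustration, not a figure from the article):

```python
# Illustrative only: monthly energy cost for a hypothetical 1 MW mining load
# under Chelan's current vs proposed high-density rates (rates from the story).
load_kw = 1000              # hypothetical 1 MW bitcoin mining operation
hours_per_month = 730       # average hours in a month
kwh = load_kw * hours_per_month

old_cost = kwh * 0.03       # current rate: 3 cents per kWh
new_cost = kwh * 0.05       # proposed rate: 5 cents per kWh
print(old_cost, new_cost)   # $21,900 vs $36,500 per month
```

For an always-on load, the proposed rate raises the power bill by two thirds, which is a substantial hit for a business whose margins track the bitcoin price.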
1sockchuck writes: With new construction projects underway in Alabama and Tennessee, Google will soon have 5 of its 8 company-built U.S. data center campuses located in the Southeast. The strategy is unique among major cloud players, who typically have server farms on each coast, plus one in the heartland. Is Google’s focus on the Southeast a leading indicator of future data center development in the region? Or is it simply a case of a savvy player unearthing unique retrofit opportunities that may not work for other cloud builders?
1sockchuck writes: Colocation and content delivery specialist EdgeConneX is operating unmanned “lights out” data centers in 20 markets across the United States, marking the most ambitious use to date of automation to streamline data center operations. While some companies have operated prototypes of "lights out" unmanned facilities (including AOL) or deployed unmanned containers with server gear, EdgeConneX built its broader deployment strategy around a lean operations model. The company uses software to remotely control the generators and UPS systems at each data center, and can dispatch techs when on-site maintenance is needed.
1sockchuck writes: Connected cars generate a lot of data. That's translating into big business for data center providers, as evidenced by a major data center expansion by Uber, which needs more storage and compute power to support its global data platform. Uber drivers’ mobile phones send location updates every 4 seconds, which is why the design goal for Uber’s geospatial index is to handle a million writes per second. It's a reminder that as our cars become mini data centers, the data isn't staying onboard, but will also be offloaded to the data centers of automakers and software companies.
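The two figures in the story imply a rough capacity target: at one location update per driver every 4 seconds, a million writes per second corresponds to about 4 million concurrently active drivers. This is a back-of-the-envelope estimate from the stated numbers, not an Uber-published figure:

```python
# Rough capacity estimate from the figures in the story (not official numbers).
update_interval_s = 4              # each driver's phone reports every 4 seconds
target_writes_per_s = 1_000_000    # stated design goal for the geospatial index

writes_per_driver_per_s = 1 / update_interval_s  # 0.25 writes/sec per driver
max_drivers = target_writes_per_s / writes_per_driver_per_s
print(int(max_drivers))            # about 4,000,000 concurrent drivers
```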
1sockchuck writes: Over the past decade, there have been repeated predictions of the imminent arrival of higher rack power densities. Yet extreme densities have remained focused in high performance computing (HPC). Now data center providers are beginning to adapt their designs for higher densities. One of these companies is Colovore, which is among a cluster of companies adopting chilled-water cooling doors for their cabinets (LinkedIn is another). They say the move to higher densities is driven in part by a generational change in IT teams, as younger engineers are less worried about high-density strategies using water in the data center. "A lot of them grew up with PC gaming and water cooling right in their living room,” said a Colovore executive.
1sockchuck writes: Does it make sense for states to offer tax incentives to lure huge data center projects? After an extended debate, legislators in Michigan have approved tax breaks for a $5 billion data center in Grand Rapids. The project from Switch, which previously built the SuperNAP in Las Vegas, brought the debate into stark relief due to the size of the project — an estimated 2 million square feet of data center space. States competing for projects often find themselves in a bind, since the highly-automated facilities create a limited number of permanent jobs, but many states already offer juicy incentives. Michigan ultimately sought a middle path, tying the tax breaks to job creation goals. If the data center jobs don't materialize, the breaks disappear.
1sockchuck writes: Cloud server farms have migrated to rural areas in Iowa and Oregon, but there's still plenty of infrastructure action in the big city. Urban carrier hotels are once again attracting investment and business, nearly 20 years after they marked the frontier of the transition from telcos to data centers. Companies like Netrality and Infomart are investing heavily in developing meet-me rooms in these facilities, underscoring the enduring power of the cross connect – the physical connection between networks that knits the Internet together.
1sockchuck writes: Data center providers are offering space with less power infrastructure than traditional mission-critical facilities, citing demand from customers looking to forego extra UPS and generators in return for more affordable pricing. The demand for "variable resiliency" space reflects a growing emphasis on controlling data center costs, along with a focus on application-level requirements like HPC and bitcoin mining. Data center experts differed on whether this trend toward flexible design was a niche or a long-term trend. “In the next 12 months, data center operators will be challenged to deliver power to support both an HPC environment as well as traditional storage all under one roof," said Tate Cantrell, CTO at Iceland's Verne Global. "HPC will continue the trend to low resiliency options.” But some requirements don't change. "Even when they say they’re OK with lower reliability, they still want uptime," noted one executive.