Running a data center in a low-humidity location for 10 months doesn't reflect real-world data center life, making this study basically worthless. Heat reduces the life of electronics. Run the experiment somewhere with ~average~ humidity, for the typical lifespan of a data center, and then compare the results. We keep our servers between five and ten years depending on the application. The cost savings will be dramatically impacted if we have to buy new servers twice as often. If people are to adopt this, they need to know how well it will work in varying conditions.

FWIW, I used to work for a major chipset company. We'd test all our hardware from 0C to 60C ambient temperature. Too cold can cause problems just as too hot can. Humidity can also contribute to corrosion, even without condensation, which causes problems of its own.

Which hardware is used also makes a difference - PSUs and drives often have a harder time taking the higher temps. And inconsistent temperatures (warmer in the day, cooler at night) can contribute to broken contacts due to the tiny amounts of thermal expansion and contraction that would occur every day.
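To make the lifespan point concrete, here's a toy annualized-cost comparison. Every number in it (server price, cooling costs, lifespans) is a made-up assumption for illustration, not data from the study:

```python
# Back-of-envelope: do cooling savings survive a halved server lifespan?
# All figures below are invented placeholders, not from the study.

def annualized_cost(server_price, lifespan_years, annual_cooling_cost):
    """Yearly cost of ownership: amortized replacement plus cooling."""
    return server_price / lifespan_years + annual_cooling_cost

# Conventional cooling: assume a 10-year lifespan, higher cooling bill.
baseline = annualized_cost(server_price=5000, lifespan_years=10,
                           annual_cooling_cost=400)

# Free-air cooling: assume the cooling bill drops 80%, but heat and
# humidity halve the lifespan to 5 years.
free_air = annualized_cost(server_price=5000, lifespan_years=5,
                           annual_cooling_cost=80)

print(f"baseline: ${baseline:.0f}/yr")   # $500 + $400 = $900/yr
print(f"free-air: ${free_air:.0f}/yr")   # $1000 + $80 = $1080/yr
```

Under these particular assumptions, buying servers twice as often more than eats the cooling savings - which is exactly why the study needs to measure hardware lifespan, not just power.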