We're finding uses for spare computer cycles all over the place. Cloud computing is becoming a reality, vast server farms are continuously being built, and enormous effort is invested in keeping these installations cool.
In my home office, keeping the room cool enough to concentrate in the presence of these silicon chips is a real task. And all the while, I've got baseboard heaters, made of relatively expensive metal, that I'm running electricity through in an effort to heat the place.
It occurred to me that we could replace those heaters with computer chips.
The material chips are made from (silicon) is in far greater supply than the metal we currently use for heating elements, so we'd actually be recovering metal by implementing such a scheme. And since a chip dissipates essentially all of the electrical power it consumes as heat, a few hundred watts of computation warms a room about as well as a few hundred watts of baseboard heater. Instead of varying the activity of the chips in response to demand from the network, you could vary it in response to the home thermostat. Then every time there was a cold snap, the computing resources available in the cloud would increase, and the energy consumption would be subsidized, because you'd need that heat for your home regardless.
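To make the thermostat-driven idea concrete, here's a minimal sketch of the control loop I have in mind. It's just bang-bang control like an ordinary baseboard thermostat; the names read_room_temp_c and run_work_unit are hypothetical stand-ins (here simulated) for a real temperature sensor and a real unit of cloud work:

```python
import random
import time

SETPOINT_C = 20.0    # thermostat target temperature
HYSTERESIS_C = 0.5   # deadband to avoid rapid on/off cycling

def read_room_temp_c() -> float:
    # Stand-in for a real thermostat sensor; simulated reading here.
    return 19.0 + random.uniform(-2.0, 2.0)

def run_work_unit() -> None:
    # Stand-in for a unit of cloud work: busy-loop for ~1 s so the CPU
    # draws power, nearly all of which ends up as heat in the room.
    end = time.monotonic() + 1.0
    while time.monotonic() < end:
        pass

def heat_with_compute(cycles: int = 60) -> None:
    heating = False
    for _ in range(cycles):
        temp = read_room_temp_c()
        # Bang-bang control: turn compute on below the deadband,
        # off above it, just like a baseboard heater's thermostat.
        if temp < SETPOINT_C - HYSTERESIS_C:
            heating = True
        elif temp > SETPOINT_C + HYSTERESIS_C:
            heating = False
        if heating:
            run_work_unit()
        else:
            time.sleep(1.0)  # idle: minimal power draw, minimal heat

if __name__ == "__main__":
    heat_with_compute()
```

In a real deployment the scheduler would presumably sit on the cloud side, routing work toward nodes whose thermostats are calling for heat, but the basic feedback loop is the same.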
Ideally, to maximize the return to society, you'd want a national initiative to implement such a plan across the board. But there could also be a business model in, say, replacing office and apartment heating infrastructure with this technology and selling the spare cycles.
Is there any technical reason this isn't practical?