My understanding is that most server farms are connected to dedicated nuclear power plants anyway, so power consumption isn't an issue. Heat dissipation? Yeah, that might be an issue.
Heat and power are the same issue. Conservation of energy means that power in is power out, and the power out is heat that needs to be dissipated. A rule of thumb for data centres is that for every dollar you pay in electricity to run the computers, you pay another dollar in electricity for cooling. If you want high density, you hit hard limits on the amount of heat you can physically extract (faster fans hit diminishing returns quickly). This is why AMD's presence in the server room went from close to 100% to close to 0%: Intel was much better at low power.
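To make the rule of thumb concrete, here's a minimal back-of-envelope sketch. The rack power, electricity price, and the strict 1:1 cooling ratio are all illustrative assumptions, not real data-centre figures; facilities usually express this as a PUE number, where a 1:1 cooling overhead corresponds to a PUE of roughly 2.

```python
# Back-of-envelope cost of the "one dollar of cooling per dollar of compute"
# rule of thumb. All figures below are illustrative assumptions.

RACK_POWER_KW = 10.0      # assumed IT load of one rack, in kilowatts
PRICE_PER_KWH = 0.10      # assumed electricity price, dollars per kWh
HOURS_PER_YEAR = 24 * 365

# Energy in equals heat out, so every kWh the servers draw must also be
# removed as heat. The rule of thumb says removing it costs roughly as
# much as supplying it, i.e. an effective PUE of about 2.
it_energy_kwh = RACK_POWER_KW * HOURS_PER_YEAR
it_cost = it_energy_kwh * PRICE_PER_KWH
cooling_cost = it_cost            # ~1:1 per the rule of thumb
total_cost = it_cost + cooling_cost

print(f"IT electricity per rack-year:      ${it_cost:,.0f}")
print(f"Cooling electricity per rack-year: ${cooling_cost:,.0f}")
print(f"Total (effective PUE ~2):          ${total_cost:,.0f}")
```

With these assumed numbers, a 10 kW rack costs about $8,760 a year to power and roughly the same again to cool, which is why shaving watts per chip matters so much at data-centre scale.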