To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. Given that k = 1.38 × 10^-16 erg/K, and that the ambient temperature of the universe is 3.2 K, an ideal computer running at 3.2 K would consume 4.4 × 10^-16 erg every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.
So 4.4 × 10^-23 J minimum per bit flip × a minimum of 2^128 bit flips ≈ 1.5 × 10^16 J. Of course, our current computers are far from ideal, and it would take many bit flips to test each key. Unless someone has a better source for the energy cost of computation?
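For anyone who wants to check the arithmetic, here's a rough Python sketch of the same back-of-envelope numbers, using only the rounded constants quoted above (nothing more precise):

    # Landauer-style lower bound on counting to 2^128, rounded constants from above
    k_boltzmann = 1.38e-23   # J/K (same as 1.38e-16 erg/K)
    temperature = 3.2        # K, the background temperature used above

    energy_per_bit = k_boltzmann * temperature   # ~4.4e-23 J per bit flip
    total_energy = energy_per_bit * 2**128       # minimum for 2^128 bit flips

    print(f"per bit flip: {energy_per_bit:.2e} J")   # -> ~4.4e-23 J
    print(f"2^128 flips:  {total_energy:.2e} J")     # -> ~1.5e+16 J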
The mass of the oceans is about 1.4 × 10^21 kg. It takes about 4,000 J to raise the temperature of 1 kg of water by 1 degree Celsius, and thus about 400,000 J to heat 1 kg of water from freezing to boiling. The latent heat of vaporization adds another 2 million J/kg, for about 2.4 × 10^6 J/kg total. Thus the energy required to boil the oceans is about 2.4 × 10^6 J/kg × 1.4 × 10^21 kg ≈ 3.4 × 10^27 J.
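The same kind of sanity check works for the ocean figure, again just plugging in the rounded values from above:

    # Energy to heat the oceans from 0 C to 100 C and then vaporize them
    ocean_mass = 1.4e21              # kg
    specific_heat = 4.0e3            # J/(kg*degC), ~4,000 J per kg per degree
    heating = specific_heat * 100    # ~4e5 J/kg, freezing to boiling
    vaporization = 2.0e6             # J/kg, latent heat of vaporization (rounded)

    per_kg = heating + vaporization          # ~2.4e6 J/kg
    boil_the_oceans = per_kg * ocean_mass    # ~3.4e27 J

    print(f"{boil_the_oceans:.2e} J")        # -> ~3.4e+27 J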
So an ideal computer might be able to count to 2^128 without boiling the oceans (doh). It would take roughly a 10^11 increase in energy use per bit flip before boiling the oceans became unavoidable.
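And that 10^11 figure is just the ratio of the two totals:

    # headroom before counting to 2^128 at the ideal limit boils the oceans
    headroom = 3.4e27 / 1.5e16
    print(f"{headroom:.1e}")   # -> ~2e+11, i.e. roughly a 10^11 margin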