All the scientists I know (myself included) would correctly indicate that the sun will not grow cold, but will, after exhausting its core hydrogen fuel, vastly increase its luminosity, and swell in size past the Earth's orbit, essentially vaporizing it. All this, in roughly 5 billion years.
Modern humans as a species are 0.0002 billion years old. Yes, that's three zeroes to the right of the decimal. Do you really believe that we'll care about a couple thousand years' worth of exemplars of humanity after we've evolved 25,000 times further than since we separated from proto-human hominids? Will we even be humans at that point? Are there any other conceivable disasters our species or its descendants could suffer during those billions of years, which colonizing space could not prevent?
The article is wrong on many levels. The key word here is "direct". The 2002 transmission spectra you mention (and others like them) consist of light from the host star passing through the atmosphere of the planet as the planet transits in front of the star, which imprints spectral signatures of the planetary atmosphere on that stellar spectrum. So in this sense, it's not a direct spectrum of the planet's own light, but of the star, modified by the planet in front of it.
The first spectrum of a planet, consisting only of planetary light, came from the Spitzer Space Telescope, which used a differencing technique:
planet + star [out of eclipse] - star [when planet eclipsed] = planet only
The star and planet could not be resolved (separated) by the telescope, but by using the known orbit of this eclipsing planetary system, and timing the observations carefully, a spectrum of the "planet's own light" was obtained.
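The differencing step above can be sketched numerically. This is a schematic illustration with made-up flux values, not the actual Spitzer calibration pipeline:

```python
import numpy as np

# Hypothetical flux values per wavelength bin (arbitrary units).
# In eclipse, the planet is hidden behind the star, so we see the star alone.
star_only = np.array([100.0, 102.0, 98.0, 101.0])      # during secondary eclipse
star_plus_planet = np.array([100.8, 103.1, 98.5, 101.9])  # out of eclipse

# (planet + star) - (star) = planet only
planet_spectrum = star_plus_planet - star_only
print(planet_spectrum)  # the planet's own emission, per wavelength bin
```

The telescope never separates the two light sources on the sky; the subtraction in time (eclipse vs. out of eclipse) does the separating.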
The novelty of this latest result is that no differencing of this sort was required. Using adaptive optics to correct distortions due to Earth's atmosphere, the light from a star and the light from its associated giant planet were physically resolved, and a spectrum of the planet, all by itself, was obtained. Even with adaptive optics, however, very few systems have star-planet separations on the sky large enough to permit this technique.
From one viewpoint, there is no fundamental difference between them. They scale linearly between two temperature points, assigning values of 0 and 100:
- Celsius: [freezing point of water, boiling point of water]
- Fahrenheit: [a cold solution of brine, human body temperature (approx)]
I argue that degrees F offer a more suitable range, and better resolution, than degrees C for temperatures encountered in everyday life. The smallest temperature difference I can detect? Roughly 1 degree F. That's about 0.56 degrees C. It's also why you often see forecasts in fractional degrees C. A day so cold you have to protect skin? 0 degrees F. A day so hot that wind actually warms you up? 100 degrees F. The advantages of Celsius in the lab are clear. For weather? Not so much.
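Since both scales are linear maps between two fixed points, the conversion falls straight out of water's freezing and boiling points on each scale (0 C = 32 F, 100 C = 212 F). A quick sketch:

```python
def c_to_f(c):
    # Slope fixed by the two anchor points: (212 - 32) / (100 - 0) = 9/5.
    return c * 9.0 / 5.0 + 32.0

def f_to_c(f):
    return (f - 32.0) * 5.0 / 9.0

print(c_to_f(0))      # 32.0  (water freezes)
print(c_to_f(100))    # 212.0 (water boils)
print(f_to_c(98.6))   # 37.0  (approximate human body temperature)
print(5.0 / 9.0)      # ~0.556: one degree F expressed in degrees C
```

The last line is the resolution argument in one number: a 1 degree F step is a bit over half a degree C, which is why Celsius weather forecasts often resort to fractions.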
Over 30 thousand gigabytes (30 TB) of images will be generated every night during the decade-long LSST sky survey.
My car gets about 30 MPG, and after a half-hour, 30-mile drive it is thirsty for a gallon of gas. After a multi-hour, 30-mile bike ride I am very hungry and can easily eat two pounds of food (and still lose weight, if it's salad and not eight quarter pounders with cheese and bacon). Anyway, that two pounds of food obviously takes twenty pounds of gasoline to grow and process and ship and cook. Now at 6 pounds of aviation gas per gallon (note I am not a pilot, but that is my fuzzy memory from wanting to be a pilot decades ago), that would make a bit over 3 gallons of gas to grow the food to bicycle 30 miles.
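The arithmetic in the post can be checked directly. The 10:1 pounds-of-fuel-per-pound-of-food ratio and the 6 lb/gal figure are the post's own rough estimates, not measured values:

```python
food_lbs = 2.0         # food eaten after the 30-mile ride
fuel_per_food = 10.0   # the post's rough ratio: lbs of fuel per lb of food
lbs_per_gallon = 6.0   # the post's recalled density of gasoline

fuel_lbs = food_lbs * fuel_per_food      # 20 lbs of fuel embodied in the food
bike_gallons = fuel_lbs / lbs_per_gallon # ~3.33 gallons to cover 30 miles by bike
car_gallons = 30.0 / 30.0                # 1 gallon for 30 miles at 30 MPG

print(bike_gallons, car_gallons)
```

Under those assumptions the bike ride's food costs roughly three times the fuel the car would burn for the same distance, which is the comparison the post is driving at.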
Your argument only works if you assume that otherwise you would not have consumed those 2 lbs of food. Obesity-associated illness trends, in the USA at least, would indicate otherwise. By biking, you burn calories you would be eating anyway, improve your health, and save gas too.