Keeping Time with a Mercury Atom
Roland Piquepaille writes "The National Institute of Standards and Technology (NIST) has announced that a new experimental atomic clock based on a single mercury atom is now at least five times more precise than NIST-F1, the U.S. standard clock. This mercury atomic clock 'would neither gain nor lose a second in about 400 million years,' while it would take 'only' 70 million years for NIST-F1, which is based on a 'fountain' of cesium atoms, to gain or lose a second. But even though this new kind of optical atomic clock is more accurate than cesium microwave clocks, it will take a while before such a design can be accepted as an international standard. A ZDNet summary contains pictures and more details about the world's most precise clock."
How much accuracy do you need? (Score:3, Interesting)
Why? (Score:2, Interesting)
I Know I'm Missing Something Here... (Score:4, Interesting)
Re:Universal clock? (Score:3, Interesting)
Because there are limits to measuring cesium (Score:1, Interesting)
Whenever the definition is revised, the new proposed standard is compared to the old accepted standard as precisely as anyone has ever done. For example, Louis Essen measured the frequency of the cesium hyperfine transition as 9,192,631,770 +/- 20 Hz relative to the old tropical-year definition. Thus, 9,192,631,770 was picked as the definition.
However, there are quantum mechanical limitations on our ability to measure that. In particular, when we examine the atoms for a time t, there is an uncertainty proportional to 1/t in the frequency. With cesium atoms, which are electrically neutral, gravity poses a problem. There's no way to hold them up without disturbing them, so they fly through cesium beam clocks in a fraction of a second, and that short measurement time leaves a residual uncertainty in the measured frequency. Suppose this is +/-1 Hz; that then leads to an uncertainty of +/- 1/9192631770 in the duration of a second.
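The arithmetic above is easy to sanity-check. A quick sketch (the +/-1 Hz figure is just the hypothetical value from the paragraph, not a real clock spec):

```python
# Frequency of the cesium-133 hyperfine transition, which defines the SI second
F_CS = 9_192_631_770  # Hz

# Illustrative absolute uncertainty: the frequency uncertainty scales like 1/t,
# so a fraction-of-a-second flight through a beam clock leaves something on
# the order of 1 Hz unresolved.
delta_f = 1.0  # Hz (assumed, for illustration)

# Fractional uncertainty in the length of the second
relative_uncertainty = delta_f / F_CS
print(f"relative uncertainty: {relative_uncertainty:.2e}")  # about 1.1e-10
```

In other words, a 1 Hz slop at 9.19 GHz means you know the second to about one part in 10^10.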
Cesium fountains slow the cesium atoms down as much as possible and thereby extend the measuring time and reduce the uncertainty. However, for any given measuring time, a higher frequency will always lead to a smaller relative uncertainty. Cesium's 9 GHz was chosen partly because it's still accessible to fast electronics. The mercury clock generates a frequency of 1,064,721,609,899,143 Hz (+/-10 Hz as of current measurements) - that's 1.065 petahertz! No electronics can keep up, so the challenge of building such a clock is measuring its output frequency. Nonetheless, it should be obvious that with a base frequency some 100,000 times higher than the base frequency of a cesium clock, the potential measurement uncertainty is 100,000 times lower.
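Plugging in the two transition frequencies quoted above shows where the "some 100,000 times" comes from; the +/-1 Hz absolute uncertainty is again just an illustrative assumption, applied to both clocks:

```python
F_CS = 9_192_631_770          # Hz, cesium microwave transition
F_HG = 1_064_721_609_899_143  # Hz, mercury optical transition (figure quoted above)

delta_f = 1.0  # Hz, same assumed absolute uncertainty for both clocks

# Relative (fractional) uncertainty is what matters for timekeeping
rel_cs = delta_f / F_CS
rel_hg = delta_f / F_HG

print(f"cesium:  {rel_cs:.2e}")
print(f"mercury: {rel_hg:.2e}")
print(f"ratio:   {rel_cs / rel_hg:,.0f}")  # roughly 116,000
```

The ratio is just F_HG / F_CS: the same 1 Hz of slop is a ~100,000x smaller fraction of an optical frequency than of a microwave one.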
If the standard second is ever redefined, it will be to a value that is indistinguishable from the old one using any cesium clock ever built.
It's like drawing a line. Suppose you have a line in pencil, and need to know where it is as precisely as possible. After a while the width of the pencil gets annoying, so you sit down with a magnifying glass and measure it as precisely as possible, and draw a line with a super-sharp pen through the middle of the pencil line. But then that's too wide and fuzzy, so you use a microscope and score a line with a diamond-tip probe. But then that's too wide, so you use an atomic-force microscope and push individual atoms around. Then the atoms are too fuzzy, so you cryogenically cool it to reduce their motion. Etc. etc. Each standard is equal to the old one because it's inside the range of uncertainty of measurement.
Re:How accurate is accurate enough? (Score:4, Interesting)
Re:One small problem... (Score:3, Interesting)
Re:How much accuracy do you need? (Score:3, Interesting)