An anonymous reader writes: We all know computers can measure time in seconds and milliseconds. But how precise can computers be when measuring time? Microseconds? Nanoseconds? Picoseconds? Femtoseconds? Attoseconds? Zeptoseconds? Yoctoseconds? How precisely can you measure time on an x86 / home PC, or on computers more generally? And how do scientists measure time when accuracy and precision are critically important?
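As a starting point for the discussion, here is a minimal sketch (assuming a Linux/POSIX system) of how one might query the kernel-reported granularity of CLOCK_MONOTONIC and see how far apart two back-to-back clock reads land. The reported resolution and the measured gap vary by OS, kernel, and hardware timer, so treat the numbers as illustrative rather than a definitive answer to the question.

```c
/* Sketch: query the reported resolution of CLOCK_MONOTONIC and time two
 * back-to-back clock_gettime() calls. Assumes a POSIX/Linux system.
 * The reported resolution (often 1 ns) is the clock's granularity, not
 * its real-world accuracy, which depends on the underlying hardware
 * timer (TSC, HPET, ...) and the overhead of the call itself. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec res, start, end;

    /* Granularity the kernel claims for this clock. */
    if (clock_getres(CLOCK_MONOTONIC, &res) == 0)
        printf("Reported resolution: %ld ns\n", res.tv_nsec);

    /* Measure the gap between two consecutive reads to estimate
     * the overhead of the measurement itself. */
    clock_gettime(CLOCK_MONOTONIC, &start);
    clock_gettime(CLOCK_MONOTONIC, &end);

    long delta_ns = (end.tv_sec - start.tv_sec) * 1000000000L
                  + (end.tv_nsec - start.tv_nsec);
    printf("Back-to-back clock_gettime() calls: %ld ns apart\n", delta_ns);
    return 0;
}
```

On a typical x86 desktop the underlying source is the CPU's timestamp counter, so the granularity is in the nanosecond range, but that is not the same thing as nanosecond accuracy.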