from the shouldn't-that-be-non-noise dept.
Roland Piquepaille writes "Researchers at Rockefeller University have developed a mathematical method, and an algorithm based on it, that mimics the way our ears process sound and analyzes noisy signals better than current methods. Not only is their algorithm faster and more accurate than previous ones used in speech recognition and seismic analysis, it also rests on a very non-intuitive fact: they can tell what a sound was by knowing when there was no sound. 'In other words, their pictures were being determined not by where there was volume, but where there was silence.' The researchers think that their algorithm can be used in many applications and that it may soon give computers the same acuity as human ears. Read more for additional references and pictures about this algorithm."
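The article doesn't publish the researchers' algorithm, but the counter-intuitive idea it describes — that the locations of silence in a sound's time-frequency picture characterize the sound — can be sketched with an ordinary short-time spectrogram. This is only an illustrative assumption: the signal, the threshold, and the use of `scipy.signal.spectrogram` are stand-ins, not the auditory-inspired transform the researchers actually built.

```python
import numpy as np
from scipy.signal import spectrogram

# Hypothetical test signal: a 440 Hz tone with a gap of true silence.
fs = 8000
t = np.arange(0, 1.0, 1 / fs)
sig = np.sin(2 * np.pi * 440 * t)
sig[3000:5000] = 0.0  # insert a silent interval

# Ordinary spectrogram (a stand-in for the auditory-inspired transform).
freqs, times, Sxx = spectrogram(sig, fs=fs, nperseg=256, noverlap=128)

# Locate the "silence": time columns whose total energy is (near) zero.
# Per the article's idea, these locations help characterize the sound.
col_energy = Sxx.sum(axis=0)
silent_cols = col_energy < col_energy.max() * 1e-6

print(f"{silent_cols.sum()} of {silent_cols.size} time slices are silent")
```

With the window and hop sizes above, the time slices that fall entirely inside the gap come out with exactly zero energy, so the silence map cleanly marks where the tone stops and restarts.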