What I wonder is: say you have a 5V ADC. Using their technique, could you drive a 15V (max) signal into the ADC and effectively triple your resolution? You'd still be using all your bits to resolve each 5V span, so relative to attenuating the signal down to 5V you'd gain about log2(3) ≈ 1.6 effective bits... if that's the case then it truly is quite groundbreaking.
It may be groundbreaking, but not for the reason advertised in the paper/article/summary. From a quick look at the paper, ADC power dissipation scales roughly as f * 2^(2n), where f is the sampling rate and n is the number of bits per sample. High-performance ADCs are constrained by power dissipation, which limits either the sampling rate or the resolution. What these guys are probably trying to do is keep n small: by allowing signals larger than the ADC input range and then unwrapping the folded samples in software, they increase the effective number of bits. Even a gain of only 2 bits translates to a factor of 2^(2*2) = 16 advantage in power dissipation (though it's unclear how the self-resetting ADC itself compares to a conventional ADC in power).

In any case, the article seems to be hyping a non-existent advantage (sampling signals that exceed the nominal ADC range; why not just attenuate the signal and use a higher-resolution ADC?) while not mentioning the real advantage: power dissipation.
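For the curious, here's a minimal sketch in Python of what that software unwrapping could look like. Everything in it is illustrative rather than taken from the paper: the 5V range, the sampling rate, and the test signal are made up, and the wrap-detection rule assumes the input changes by less than half the ADC range between samples (i.e., sufficient oversampling), which stands in for whatever reconstruction guarantee the actual paper derives.

```python
import numpy as np

# Hypothetical parameters, not from the paper: a 5 V "self-resetting" ADC
# that folds its input modulo 5 V, fed a signal swinging well beyond 5 V.
ADC_RANGE = 5.0                      # volts; the ADC's nominal input span
fs = 1000                            # samples per second (oversampled)

t = np.arange(0, 1, 1 / fs)
x = 7.0 * np.sin(2 * np.pi * 3 * t)  # true signal, exceeds the 5 V range

# What the self-resetting ADC actually records: the input folded into [0, 5 V).
y = np.mod(x, ADC_RANGE)

# Software unwrapping: whenever consecutive folded samples jump by more than
# half the ADC range, assume a wrap occurred and add/subtract one full range.
# This only works if the true signal moves < ADC_RANGE/2 per sample, which is
# why oversampling matters.
jumps = np.diff(y)
corrections = -ADC_RANGE * np.cumsum(np.round(jumps / ADC_RANGE))
x_hat = y.copy()
x_hat[1:] += corrections

# The reconstruction is only determined up to a constant multiple of the ADC
# range (we never learn which "fold" the very first sample started in).
offset = x[0] - x_hat[0]
print("max reconstruction error:", np.max(np.abs(x_hat + offset - x)))
```

The round(diff/range) trick is the same one used in ordinary phase unwrapping (cf. numpy's np.unwrap); the point is that the reconstruction cost is paid in software, while the ADC itself keeps a small n.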
"This is a misguided mission without a mission, without a launch date, and without ties to exploration goals," concluded Rep. Lamar Smith (R-TX). "It's just a time-wasting distraction."
From the looks of it, this mission will probably not receive funding, which is a bit of a shame: it would have been a good opportunity to start developing asteroid mining technology. Perhaps no one is ready for that yet.