Just to help expand on the noise source: it's coming from the change in current associated with the transistors, and I'm not sure why the paper didn't spell that out more clearly. For example, a transistor is going to be either on (value = 1) or off (value = 0). As you might guess, it takes more current to hold a transistor in the on position! With billions of transistors, the current for each individual transistor is tiny. But if you get a bunch of them to line up all at once (lots of 1's or lots of 0's, as the paper mentions), you start to get a measurable change in current.
That's what's being heard. At certain points in the algorithm, the current drawn by those 1's and 0's changes, and it changes in a repeatable way because of the math loops. The power supply sees that change in current draw, and pieces of the power supply end up making noise as a result, because the change in current draw is large enough for a long enough period of time.
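If it helps to see the "repeatable" part concretely, here's a toy Python sketch (my own model and numbers, not anything from the paper) where each cycle's current draw is just the number of 1 bits in the word being processed:

```python
# Toy model: pretend each cycle's current draw is proportional to the
# Hamming weight (number of 1 bits) of the word the CPU is chewing on.
def hamming_weight(x: int) -> int:
    return bin(x).count("1")

# A made-up "math loop" that processes the same operands over and over,
# like the repeated-squaring loop in RSA. Repeating data -> repeating draw.
loop_operands = [0xFF, 0x00, 0xF0, 0x0F]
current_trace = [hamming_weight(w) for w in loop_operands * 4]
print(current_trace)  # -> [8, 0, 4, 4, 8, 0, 4, 4, ...]
```

The trace repeats with the loop, and a repeating current draw is exactly the kind of thing that turns into a tone you can pick out.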
An easy example you might recognize is a fluorescent light running off 60 Hz mains power. As the light starts to die, the current it draws varies more and more, and that varying current translates into an audible buzz. The CPU is doing exactly the same thing.
As far as the signal specs themselves go, the paper was looking up to ~300 kHz, iirc. I don't remember the dynamic range, but because they were examining the frequency content and not just the overall signal power, it gets a lot easier to pick up on the tones.
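To see why looking at frequency content helps so much, here's a quick numpy sketch (the 600 kHz sample rate and 35 kHz tone are made-up numbers I picked to land inside that ~300 kHz band, not the paper's measurements). A tone roughly 23 dB below the noise in overall power still sticks out as the tallest spike in the spectrum:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 600_000                # sample rate; Nyquist ~300 kHz, roughly the paper's band
n = 1 << 16
t = np.arange(n) / fs

tone_hz = 35_000            # made-up "CPU tone" frequency
# Tone amplitude 0.1 vs. noise std 1.0: the tone is buried in total power.
signal = 0.1 * np.sin(2 * np.pi * tone_hz * t) + rng.normal(0.0, 1.0, n)

# The FFT concentrates all the tone's energy into one bin, so it towers
# over the per-bin noise floor even though it's invisible in total power.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(n, 1.0 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(peak_hz)
```

The peak lands on the bin nearest 35 kHz, which is the whole trick: the overall signal power barely changes, but the spectrum gives the tone away.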