Your examples are both high-output sources. A microwave oven operates on the order of 1 kW output; 420-450 MHz amateur radio is allowed up to 50 W. Wifi is 200 mW. The idea that wifi is causing the same problem is fairly implausible, at least in comparison to the NRO situation.
Aside from shutting down extremely close repeaters (no distances given), a reduction to 5W output is all I can find for the radios near Beale AFB. And that is still 25X the power of wifi.
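For reference, the raw power ratios implied by the figures above (treating the microwave's magnetron output as if it were radiated, which overstates actual oven leakage):

```python
# Output powers from the figures above, in watts
sources = {
    "microwave oven": 1000.0,          # ~1 kW magnetron output
    "70 cm amateur": 50.0,             # 50 W allowed
    "amateur near Beale (reduced)": 5.0,
    "wifi": 0.2,                       # 200 mW
}

wifi = sources["wifi"]
for name, watts in sources.items():
    print(f"{name}: {watts / wifi:.0f}x wifi")  # e.g. reduced amateur = 25x
```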
A microwave is roughly 5000x the output of wifi, and received power falls off with the square of distance in free space, so the microwave would have to be at sqrt(5000) ~= 71 times greater distance to attenuate to comparable strength. "A few hundred feet" is vague, but rounding to 200 ft gives us roughly equivalent strength for the microwave radiation at about 14,000 ft, or ~2.7 miles.
If microwaves within ~2.7 miles don't cause the radar to desense, then we would need a different effect from wifi in order for it to cause an issue even at 200 ft. Assuming even vaguely comparable receiver sensitivity at both frequencies (granted, a significant assumption), it is not desensing the radar.
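The back-of-envelope scaling works out like this (assuming free-space inverse-square spreading, and the 1 kW / 200 mW figures from above):

```python
import math

# Transmit powers from the comparison above (watts)
microwave_w = 1000.0   # ~1 kW magnetron output
wifi_w = 0.2           # 200 mW wifi

power_ratio = microwave_w / wifi_w          # 5000x

# Free-space path loss: received power ~ P_tx / r^2, so equal received
# power means r_microwave / r_wifi = sqrt(power_ratio)
distance_factor = math.sqrt(power_ratio)    # ~71x

wifi_distance_ft = 200.0
equiv_microwave_ft = wifi_distance_ft * distance_factor
print(f"{power_ratio:.0f}x power -> {distance_factor:.0f}x distance")
print(f"microwave at {equiv_microwave_ft:,.0f} ft "
      f"({equiv_microwave_ft / 5280:.1f} mi) ~ wifi at 200 ft")
```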
In this case, reflections are routinely ruled out as ground clutter. Any wifi noise from near ground level is going to be severely attenuated by structures and other ground clutter around it; and if by chance it reflects to the receiver strongly off any fixed structure it will have no Doppler shift and be ignored.
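The "no Doppler shift" point is just the two-way Doppler formula f_d = 2v/λ: a reflection off a fixed structure comes back at the carrier frequency and gets discarded with the rest of the clutter. A minimal sketch, using an assumed 70 cm band frequency (not the actual radar's parameters):

```python
# Two-way Doppler shift of a radar return: f_d = 2 * v / wavelength
C = 3.0e8              # speed of light, m/s
f_radar = 435e6        # assumed illustrative frequency in the 420-450 MHz band
wavelength = C / f_radar

def doppler_shift_hz(radial_velocity_ms: float) -> float:
    """Doppler shift for a reflector moving at the given radial velocity."""
    return 2.0 * radial_velocity_ms / wavelength

print(doppler_shift_hz(0.0))    # fixed structure: 0 Hz, filtered as clutter
print(doppler_shift_hz(250.0))  # aircraft-like speed: nonzero, passes filter
```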
At elevation, you are more likely to have a clean shot to the radar. That is the only situation where it might matter, and then only for borderline-compliant devices whose manufacturing variation puts them somewhat above permissible levels.
The FCC investigations found most interference was caused by failure to use DFS or by exceeding authorized power levels with a high-gain antenna. A compliant device simply shouldn't be relevant unless there is some peculiarity in the circumstances.