A sensor that outputs a PWM signal, or a device that accepts one (such as a servo), has a specified allowable range and curve that it COULD use, and an actual range that it DOES use.
Servo controllers nominally output pulses between 1ms (zero position) and 2ms (full rotation). Actual servo models don't exactly conform to this "standard", so you tune your control to the specific model of servo.
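To illustrate the tuning step, here's a minimal sketch of per-model servo calibration. The nominal 1ms-2ms range comes from the text above; the specific model names and measured endpoints are hypothetical, made up for illustration.

```python
# Nominal servo pulse range in microseconds (1ms-2ms).
NOMINAL_US = (1000, 2000)

# Measured pulse endpoints for specific servo models (hypothetical values).
CALIBRATION = {
    "generic": (1000, 2000),  # assumes the nominal range
    "model_a": (1050, 1950),  # doesn't quite reach the nominal extremes
    "model_b": (980, 2020),   # slightly exceeds the nominal range
}

def position_to_pulse_us(position: float, model: str = "generic") -> int:
    """Map a position in [0.0, 1.0] to a pulse width (us) for a given model."""
    lo, hi = CALIBRATION[model]
    position = min(max(position, 0.0), 1.0)  # clamp to the valid range
    return round(lo + position * (hi - lo))
```

The same commanded position yields different pulse widths per model: `position_to_pulse_us(1.0)` gives 2000, but `position_to_pulse_us(1.0, "model_a")` gives 1950, because that model never actually reaches the nominal 2ms endpoint.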
Analogously, the DMX protocol standard says that the BREAK is signaled by a pulse of AT LEAST 88 microseconds (and up to one second). Many controllers fail to read the spec carefully and try to output exactly 88 microseconds, sometimes falling a bit short. If you program your DMX gear to work exactly according to the standard and test it only against truly conforming peers, it'll fail to work with the many DMX devices that don't quite conform or are borderline. To stay compatible with "almost compliant" neighbors, DMX outputs can emit a 92 microsecond break, and receivers can accept an 84 microsecond break.
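The margin strategy above can be sketched in a few lines. The 88/92/84 microsecond figures are from the text; the constant and function names are my own invention, not from any DMX library.

```python
SPEC_MIN_BREAK_US = 88     # DMX spec: break must be at least 88 us
OUTPUT_BREAK_US = 92       # transmit with headroom above the spec minimum
ACCEPT_MIN_BREAK_US = 84   # receive with tolerance below the spec minimum

def is_valid_break(measured_us: float) -> bool:
    """Accept breaks slightly shorter than the spec minimum, so
    'almost compliant' transmitters still interoperate."""
    return measured_us >= ACCEPT_MIN_BREAK_US
```

A strict receiver that checked `measured_us >= SPEC_MIN_BREAK_US` would reject an 86 microsecond break from a borderline transmitter; the tolerant check accepts it, while your own transmitter's 92 microsecond break clears even strict receivers.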
I suspect that's what happened here. The third-party parts ALMOST matched the Apple parts. Maybe the third-party parts were only barely compliant while the Apple parts were well within spec, or maybe the third-party parts fell slightly outside the spec entirely. Either way, they didn't behave quite the same, so customers saw failures. Apple adjusted the firmware to work within the parameters of the third-party parts.
I highly suspect that if you tested MAF sensors or O2 sensors spec'd with a "0-5V" output range, you'd find one model's actual range is 0.2-4.5V while another model's is 0.3-4.7V. Firmware tuned for the first, the OEM model, wouldn't work quite as well with the second one - even though they both have "0-5V output".
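As a sketch of why that matters: if firmware converts the raw voltage to a reading using one model's measured range, a substitute part with a different actual range gets misread. The 0.2-4.5V and 0.3-4.7V endpoints are the illustrative values from the paragraph above, not real part data.

```python
# Actual measured output ranges of two sensors, both sold as "0-5V output".
SENSOR_RANGES = {
    "oem":         (0.2, 4.5),  # range the firmware was tuned for
    "aftermarket": (0.3, 4.7),  # range of a substitute part
}

def voltage_to_fraction(volts: float, model: str) -> float:
    """Convert a raw sensor voltage to a 0.0-1.0 reading using the
    model's measured range rather than the nominal 0-5V."""
    lo, hi = SENSOR_RANGES[model]
    return min(max((volts - lo) / (hi - lo), 0.0), 1.0)
```

The same 2.35V wire reads as exactly mid-scale (0.5) through the OEM calibration, but about 0.466 through the aftermarket part's actual range: a systematic error if the firmware assumes the OEM curve.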