Interpolation isn't about adding noise.
6-bit (per component) LCDs have, for at least 10 years and probably much longer, used dithering techniques to produce an effective 16.2M colors (compared to a true 8-bit panel's 16.7M). This works very well for almost all use cases and provides smooth gradients, but has the disadvantage that some image patterns can produce flashing due to interference with the dithering algorithm.
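The usual technique here is temporal dithering (often called FRC, frame rate control): the panel alternates between the two nearest 6-bit levels across frames so that the time-average approximates the 8-bit target. A minimal sketch of the idea, assuming a simple round-robin frame schedule (real panels use more elaborate spatio-temporal patterns, which is also where the flashing interference comes from):

```python
# Sketch of temporal dithering (FRC): approximate an 8-bit level
# on a 6-bit panel by alternating between adjacent 6-bit levels.
# Illustrative only; not any vendor's actual algorithm.

def frc_frames(level8, num_frames=4):
    """Return per-frame 6-bit levels whose time-average approximates level8."""
    target = level8 / 255.0              # desired intensity in [0, 1]
    low = int(target * 63)               # nearest 6-bit level at or below
    high = min(low + 1, 63)
    frac = target * 63 - low             # position between the two levels
    n_high = round(frac * num_frames)    # frames shown at the higher level
    return [high] * n_high + [low] * (num_frames - n_high)

frames = frc_frames(200)                 # 8-bit level 200
avg = sum(f / 63 for f in frames) / len(frames)
print(frames, round(avg * 255))          # -> [50, 50, 49, 49] 200
```

Flat areas whose level falls between two 6-bit steps are exactly where this alternation happens every frame, which is why certain static patterns can shimmer.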
Dithering isn't about adding noise either BTW.
Interpolation is about adding noise: it attempts to recreate data that was lost, or to create data that was never in the original signal at all.
You CANNOT guarantee that the interpolated data is correct. It is therefore not signal; it is noise. However much you try to make it subjectively look like signal, mathematically it is noise.
Dithering is all about adding noise that looks like noise to various degrees (random dithering, ordered dithering) to achieve a subjective aesthetic: often to achieve a smoothing effect that masks limited color depth (dithering down to a 256-color GIF palette, for example), or to add a deliberately noisy effect that masks noise or artifacts in the original signal (digital film-grain effects, for example).
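For concreteness, here is a minimal ordered-dithering sketch: quantizing an 8-bit grayscale patch down to 1 bit using a classic 2x2 Bayer threshold matrix. The function name and the tiny matrix are my own illustration; practical implementations use larger matrices (4x4, 8x8) for smoother results.

```python
# Minimal ordered dithering: 8-bit grayscale -> 1-bit,
# using a 2x2 Bayer index matrix (illustrative sketch).

BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 Bayer ordering

def ordered_dither(pixels):
    """pixels: 2D list of 8-bit values; returns a 2D list of 0/1."""
    out = []
    for y, row in enumerate(pixels):
        out_row = []
        for x, p in enumerate(row):
            # Map the Bayer index to a threshold in 0..255; the threshold
            # varies with pixel position, so flat areas break into patterns
            # whose average density matches the original gray level.
            threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4 * 255
            out_row.append(1 if p > threshold else 0)
        out.append(out_row)
    return out

# A flat mid-gray patch dithers into a checkerboard: half on, half off.
gray = [[128] * 4 for _ in range(4)]
print(ordered_dither(gray))  # -> [[1, 0, 1, 0], [0, 1, 0, 1], ...]
```

The position-dependent thresholds are exactly the "noise that looks like noise" being added: the output is deterministic, but perceptually it trades quantization banding for a structured texture.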
Both interpolation and dithering add noise, by definition. Whether or not you find them acceptable is your personal problem.