What is a high enough bit rate and sample rate to make something digital effectively analog?
This happens whenever the resolution of the digital signal exceeds the ability of the output device to display or play it back.
For example, an inkjet printer sprays dots of ink onto the page. Its nozzles have tolerances, and there's a minimum size of ink splat they can produce. If your image's resolution is finer than what that minimum splat size can render, the digital file is effectively equivalent to the best output a purely analog representation could deliver.
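As a rough sketch (the figures here are illustrative assumptions, not specs for any real printer), you can compare an image's pixel density against the effective resolution implied by the droplet size:

```python
# Sketch: does an image's resolution exceed what an inkjet's droplet
# size can physically render? All numbers below are assumed examples.

def printer_is_the_bottleneck(image_ppi: float, droplet_diameter_mm: float) -> bool:
    """True if the printer's droplet size, not the image, limits detail."""
    mm_per_inch = 25.4
    # The smallest feature the printer can place is roughly one droplet wide,
    # so its effective resolution is about one dot per droplet diameter.
    printer_effective_dpi = mm_per_inch / droplet_diameter_mm
    return image_ppi > printer_effective_dpi

# Hypothetical numbers: a 1200 PPI image and a 0.03 mm ink splat (~850 DPI).
print(printer_is_the_bottleneck(image_ppi=1200, droplet_diameter_mm=0.03))  # True
```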
The same goes for a television with a minimum dot pitch. Downscale an HD image to standard definition and you're already at the best picture the analog set can display. It all depends on the target output device.
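A quick back-of-the-envelope version of that comparison (again, the screen width and dot pitch are hypothetical, not measurements of any particular set):

```python
# Sketch: roughly how many distinct columns can a tube with a given dot pitch
# resolve across its visible width? Figures are illustrative assumptions only.

def resolvable_columns(screen_width_mm: float, dot_pitch_mm: float) -> int:
    """Approximate horizontal resolution the tube can actually display."""
    return int(screen_width_mm / dot_pitch_mm)

# Hypothetical SD set: ~400 mm visible width, 0.6 mm dot pitch.
columns = resolvable_columns(screen_width_mm=400, dot_pitch_mm=0.6)
print(columns)         # ~666 columns
print(1920 > columns)  # True: an HD source already exceeds what the tube can show
```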
Or take your stereo - it has limits in terms of frequency response, total harmonic distortion, and signal-to-noise ratio. If your digital source's bandwidth and dynamic range exceed those limits, you have surpassed anything an analog input could reproduce on that system.
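You can put numbers on that with the Nyquist criterion (usable bandwidth is half the sample rate) and the standard quantization-noise approximation SNR ≈ 6.02 × bits + 1.76 dB. The amplifier figures below are assumed examples, not specs for any real gear:

```python
# Sketch: does a digital audio format out-resolve an analog playback chain?

def format_exceeds_system(sample_rate_hz: float, bit_depth: int,
                          system_bandwidth_hz: float, system_snr_db: float) -> bool:
    """True if the digital source's bandwidth and dynamic range both exceed
    what the analog playback chain can reproduce."""
    digital_bandwidth_hz = sample_rate_hz / 2        # Nyquist limit
    digital_snr_db = 6.02 * bit_depth + 1.76         # ideal quantization SNR
    return (digital_bandwidth_hz > system_bandwidth_hz and
            digital_snr_db > system_snr_db)

# CD audio (44.1 kHz / 16-bit) against a hypothetical amp with 20 kHz
# bandwidth and a 90 dB signal-to-noise ratio.
print(format_exceeds_system(44_100, 16, 20_000, 90))  # True: 22.05 kHz > 20 kHz, ~98 dB > 90 dB
```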
Of course, digital representations have limits too. The ultimately analog circuitry and physical media that store and transmit the digital signal have to be accurate enough to carry it without error, so digital can never really outrun the analog tolerances it is built on. The two go hand-in-hand.