Quantum data transfer will probably end up succumbing to the same kind of catch-22/gotcha that plagues realtime digital filtering of analog waveforms...
a) Analog filtering introduces phase shift because the filter delays the signal. Digital filtering runs into the same wall: the window of samples you have to collect before you can analyze and transform the waveform ends up introducing basically the same phase shift an analog filter would have caused.
b) Quantum data transfer reportedly has "1 in 100 million" odds of actually working for any particular attempt. Obviously, lots of forward error correction will be needed to both detect and fix errors. My prediction is that the time the required error-correction overhead adds to the transmission will end up being basically equal to the time it would have taken to just send the data at the speed of light.
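A quick back-of-envelope sketch of why that overhead piles up so fast (assuming, purely for illustration, that the quoted "1 in 100 million" figure behaves like an independent per-attempt success probability):

```python
import math

# Hypothetical per-attempt success probability, taken from the quoted
# "1 in 100 million" figure and treated as independent trials.
p = 1e-8

# Expected number of attempts before the first success (geometric distribution).
expected_attempts = 1 / p

# Attempts needed for a 99% chance that at least one succeeds:
# 1 - (1 - p)**n >= 0.99  =>  n >= ln(0.01) / ln(1 - p)
attempts_for_99pct = math.ceil(math.log(0.01) / math.log(1 - p))

print(f"{expected_attempts:.0f}")  # 100000000
print(attempts_for_99pct)          # roughly 4.6x that, for 99% confidence
```

So even before any error *correction*, just getting one attempt through dominates the time budget, which is the sense in which the overhead could plausibly eat the faster-than-light advantage.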
c) In both cases, the limit will apply primarily to realtime uses. Using the audio example: if you try to apply a digital high-pass/low-pass crossover filter for something like a subwoofer in realtime, you'll basically create the same phase shift you would have had anyway. But if you have the luxury of buffering playback, so that you have time to completely analyze the signal and can delay the OTHER channels to bring them back into temporal alignment with the filtered one, you can enjoy the best of both worlds: infinite-slope filtering with zero induced phase shift. Likewise, quantum data transfer will fail at the goal of "faster than light" throughput, but might nevertheless find utility as a way to transport data in non-realtime under circumstances that would render "normal" electromagnetic radio modulation schemes unusable.
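The buffered-filtering trick in (c) is easy to demonstrate. Here's a minimal sketch in plain Python (the function names are mine, not from any library): a causal moving-average filter applied once shifts an impulse by the filter's group delay, while the same filter run forward and then backward over a buffered signal cancels its own delay, which is exactly what tools like SciPy's `filtfilt` do.

```python
def moving_average(x, taps=5):
    # Causal FIR moving average: introduces a group delay of (taps - 1) / 2 samples.
    return [sum(x[max(0, n - taps + 1):n + 1]) / taps for n in range(len(x))]

def zero_phase(x, taps=5):
    # Non-realtime trick: filter forward, then filter the reversed signal,
    # then reverse again. The two passes' delays cancel.
    y = moving_average(x, taps)
    y = moving_average(y[::-1], taps)
    return y[::-1]

def centroid(x):
    # Energy-weighted "center" of the signal, used here to measure delay.
    total = sum(x)
    return sum(n * v for n, v in enumerate(x)) / total

impulse = [0.0] * 40
impulse[10] = 1.0  # a click at sample 10

causal = moving_average(impulse)
offline = zero_phase(impulse)

print(centroid(causal))   # ~12: shifted by 2 samples, the filter's group delay
print(centroid(offline))  # ~10: back where it started, zero net phase shift
```

The catch, of course, is that the backward pass needs future samples, so this only works on buffered (non-realtime) data, which is the whole point of the analogy.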