That is utter BS. A dedicated crystal produces no better or worse clock than a PLL clock derived from a crystal input. You are calling all digital engineers clueless hacks here, when that really applies to you.
Incidentally, you could have learned the same by having a look at the datasheet of a high-quality speaker or mic. But I guess audiophiles do not look at actual facts, as that could shatter their fantasy world.
It is a system interface test. Quite standard. Requires some actual engineering knowledge though to understand why these are sensible.
Seriously, you have no clue how TCP/IP works. Case in point: TCP has absolutely no error correction. At all. It has error detection (a checksum) and retransmission, but it never corrects a single bit. Your statement is complete and utter BS.
TCP has retransmission, but for that to be needed over a single cable hop, the cable needs to be close to complete failure. It basically does not happen. Basically all bit errors are introduced by broken Ethernet cards and switches. A lot of the drops are caused by overload and are intentional drops under software/firmware control.
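To make the detection-vs-correction point concrete, here is a minimal sketch (Python, purely illustrative) of the 16-bit one's-complement checksum that TCP uses. Note what it can and cannot do: the receiver can tell that a segment got mangled and throw it away, but it cannot repair it. Recovery is always "drop it and let the sender retransmit."

```python
def inet_checksum(data: bytes) -> int:
    """16-bit one's-complement sum, as used in the TCP header checksum."""
    if len(data) % 2:
        data += b"\x00"                 # pad odd-length data
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold carries back in
    return ~total & 0xFFFF

segment = b"some payload bytes"
good = inet_checksum(segment)

corrupted = bytearray(segment)
corrupted[3] ^= 0x01                    # flip one bit "on the wire"

# The receiver can only DETECT the damage; it discards the segment and
# the sender eventually retransmits. Nothing gets corrected in place.
print(inet_checksum(bytes(corrupted)) == good)  # False: error detected
```

And that retransmission only ever fires if a checksum actually fails or a packet vanishes, which over one healthy cable hop it essentially never does.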
And this is not the only long-term scam that rakes in a lot of money. There are a lot of basically stupid people that still manage to get a decent or even good salary. Society is broken that way.
If you get retransmits because of the cable, then your cable is close to complete failure. Before that point, cables are not a significant source of retransmits. At all.
That is an impressive collection of utter fail. Audiophiles seem to have an even larger percentage of complete idiots than the general population.
None with regard to Ethernet used in a "normal" environment. Seriously, what you claim is religion, not technology.
Bullshit. Unless you get dropped packets, it will make zero difference. If you get dropped packets, the cable is very near to complete failure.
Actually, on analog, gold plating _decreases_ the sound quality, as any junction between two different metals induces unavoidable noise. Audiophiles generally do not know that little physical fact, because they do not understand audio technology.
And that has what relevance for audio? Right, none at all.
Sure, if noise gets high enough. But at that point the cable will be close to complete failure. Below that level, you get a perfect digital transmission, and that is that.
That is complete BS. The Ethernet data lines are differential and isolated via transformers. They are also twisted-pair. Nothing is going to leak from them. More likely, a person touching the keyboard already causes more EMI.
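Since "differential" seems to need explaining: the receiver looks only at the difference between the two wires of a pair, so interference that couples into both wires equally (which is what twisting ensures) cancels out exactly. A toy demonstration (Python, the numbers are made up):

```python
import random

random.seed(1)

# A differential pair carries the signal as the DIFFERENCE between two
# wires. External pickup couples into both wires of a twisted pair
# almost equally (common-mode), so the receiver's subtraction kills it.
symbols = [1.0 if bit else -1.0 for bit in (1, 0, 1, 1, 0)]
received = []
for s in symbols:
    noise = random.uniform(-5.0, 5.0)  # huge common-mode pickup
    wire_p = +s / 2 + noise            # same noise lands on both wires
    wire_n = -s / 2 + noise
    received.append(wire_p - wire_n)   # subtraction: noise cancels

print(received)  # matches the transmitted symbols
```

The same symmetry works in reverse: the two wires carry equal and opposite currents, so the pair radiates essentially nothing either.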
Nonsense. TCP/IP always gives you a lot of jitter. The target application compensates for that. Audio is so slow compared to the data rates on these cables, and buffering is always used, that unless the cable is very near complete failure, you are not going to hear anything.
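This is how that compensation works, as a toy simulation (Python; the packet interval and jitter figures are invented, not from any real player): the receiver delays playout by a fixed buffer time, and as long as that delay exceeds the worst-case network jitter, every packet is in the buffer before it is needed, and the output stream is bit-perfect and perfectly timed.

```python
import random

random.seed(42)
PACKET_INTERVAL_MS = 10   # sender paces one audio packet every 10 ms
BUFFER_DELAY_MS = 50      # playout buffer absorbs up to 50 ms of jitter

send_times = [i * PACKET_INTERVAL_MS for i in range(100)]
# Network jitter: each packet arrives up to 30 ms late (assumed figure).
arrival_times = [t + random.uniform(0, 30) for t in send_times]
# The DAC reads each packet a fixed BUFFER_DELAY_MS after it was sent.
playout_times = [t + BUFFER_DELAY_MS for t in send_times]

late = sum(1 for a, p in zip(arrival_times, playout_times) if a > p)
print(f"packets that missed their playout deadline: {late}")
```

With 30 ms worst-case jitter against a 50 ms buffer, `late` is zero: the jitter never reaches your ears. Only a cable so broken that packets are delayed past the buffer (or lost and not retransmitted in time) becomes audible, as a dropout, not as "harsher highs."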
This is _analog_ noise. Unless it exceeds a certain high threshold, it has no effect on data packets. It does allow one to gauge cable quality though. For example, if you have high, but still irrelevant, noise on a 10m cable, you may be able to predict that you will get actual digital errors on a, say, 100m cable.
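You can even put rough numbers on that prediction. A back-of-the-envelope sketch (Python; the attenuation and SNR figures are invented for illustration, not from any cable spec) using the textbook Q-function estimate for a binary link: at 10m the noise margin makes bit errors astronomically rare, while the same cable run at 100m starts producing measurable errors.

```python
import math

def ber(snr_db: float) -> float:
    """Rough bit-error rate for a binary link at the given SNR (dB)."""
    snr = 10 ** (snr_db / 10)
    return 0.5 * math.erfc(math.sqrt(snr))  # textbook Q-function estimate

ATTEN_DB_PER_M = 0.2  # assumed attenuation per metre (hypothetical)
TX_SNR_DB = 30        # assumed SNR at the transmitter (hypothetical)

for length in (10, 100):
    snr_db = TX_SNR_DB - ATTEN_DB_PER_M * length
    print(f"{length:4d} m: SNR {snr_db:5.1f} dB, BER ~ {ber(snr_db):.2e}")
```

That is the whole story: below threshold the link is perfect, and a shorter cable's noise margin tells you how much length (or interference) you can add before it stops being perfect.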