Scientific output is the best metric for the result you want to show, but it's nowhere near a complete perspective on a university: it obviously ignores all non-scientific academic fields, and it also ignores quality of education, employability, student experience, and so on. These aren't merely nice-to-haves; they're fundamental to any reasonable understanding of a university's quality.
You're caricaturing all these rankings:
QS includes far more than survey responses from academics (who are rather less prone to the paid-for bias you allege): there are survey responses from employers, plus measures of faculty/student ratio, citations per faculty, international faculty ratio, international student ratio, international research network, and employment outcomes. It's far from perfect, but it's a broader basis for assessment and a larger dataset than Leiden uses.
ARWU has a whole host of measures beyond Nobel Prizes/Fields Medals -- highly cited researchers (HiCi), Nature/Science publications, and SCIE/SSCI-indexed publications. And Nobel and Fields prizes are an important metric because they're the result of long-term academic excellence, which is almost always fostered by world-class research institutions -- it's more the Olympics than a lottery. They also reflect an institution's ability to attract and retain world-class talent.
US News absolutely does not rank only US universities, and it's ridiculous to single it out of the list of five I gave and claim it demonstrates US defaultism, especially given that I'm British. The US News top 10 includes three British universities; #11 is Imperial and #12 is Tsinghua. The reasons US News over-represents US universities (and English-language universities more generally) are the same as for the other methodologies: bibliometrics tend to favour Western sources, e.g. Web of Science. In fact, for the very area you're focused on, scientific impact, US News's method is particularly well-suited, because it measures global academic research output and influence, especially in STEM.
THE is 70% *not* based on citations (citations make up only about 30% of its score), and the citations element is not formally or explicitly limited to English. The bias arises because it uses sources that are themselves overweighted towards English-language citations, such as Scopus. But you can't simply ignore Scopus and the like; that would be worse than the problems caused by relying on them. THE already normalises citations by field to mitigate some of these disparities.
I don't disagree with your contention that Trump is disastrous for academia, as he is for everything else. But I do think your claim that Chinese universities are world-leading is not borne out by the facts.