You still haven't found a smoking gun proving your point
As I run gentoo, which supports multiple python versions and compiles everything, I often see programs and libraries break when python-3.n+1 is introduced, followed by a new release in the next weeks that fixes the breakage. Maybe the differences between versions appear trivial to you, but a search for "breaks with python 3.7" (and variations) on GitHub shows many people get bitten by backward-compatibility issues. It is a real-world problem.
And those are just the ones we can see. There are large code bases that aren't public (I hear Google uses *lots* of python internally), so there must be many man-hours spent fixing perfectly working programs so that recent python versions don't choke on them.
You're also pointing to a specific libc changelog (GNU's), which has nothing to do with the C language definition itself; there are other C libraries such as uClibc; Windows has its own CRT DLL, BSD has its own libc, and so on. Adding/deprecating a non-POSIX function name in one specific libc implementation is not the same as declaring async to be a reserved keyword in the language itself.
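To make the async point concrete, here's a minimal sketch (the snippet string is my own illustration): an assignment to a variable named `async` was valid python up to 3.6, but 3.7 promoted `async` and `await` to reserved keywords, so the same source no longer even compiles.

```python
# Code that was valid through python 3.6 but is rejected from 3.7 on,
# because "async" went from soft keyword to reserved keyword.
legacy_snippet = "async = 1  # a perfectly ordinary variable name pre-3.7"

try:
    compile(legacy_snippet, "<legacy>", "exec")
    result = "compiles"      # python <= 3.6
except SyntaxError:
    result = "SyntaxError"   # python >= 3.7
print(result)
```

No library did anything wrong here; the language itself moved underneath the code.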
There's a significant difference between the core language and its most popular libraries; it's normal for libraries to evolve, that's part of the deal. I'm not blaming python when numpy or urllib or scipy suddenly break their APIs.
We get it, you don't like Python. You don't have to like it.
I'd like to be neutral, as I am with other languages of the same scope (perl, ruby, PHP, lua, ...), but python seems to have just the right set of annoyances to grind my gears (and don't get me started on the virtualenv/anaconda mess that tries to work around broken package management and multiple concurrent python versions, or on that stupid .pyc file pollution).