I'm still blocking all the ads I can until the advertising networks start vetting their ads and actually paying for the bandwidth they consume. Almost every website loads twice as fast if you block the big ad network domains.
Except that claims of strong AI 'real soon now' have been coming since the '60s. Current AI research is producing systems that are good at the sorts of tasks an animal's autonomic system handles. AI research 40 years ago was doing the same thing, only (much) slower. The difference between that and a sentient system is qualitative, whereas the improvements you list are all quantitative.
Neural networks are good at generating correlations, but that's about all they're good for. A large part of learning to think, for a human child, is learning to emulate a model of computation better suited to sentient awareness on top of a complex neural network. Most animals have neural networks in their heads that are far more complex than anything we can build now, yet I'm not seeing mice replacing humans in most jobs.
Python is getting stronger in data mining and the cloud (AWS Lambda), but that's because it has nice bindings for a lot of C/C++ libraries. TypeScript is nice, but it doesn't have a mature ecosystem (like a mature IDE).
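To give a sense of how thin that barrier between Python and C is, here's a minimal sketch using the standard-library ctypes module to call straight into the C math library (assuming a Unix-like system where find_library can locate it; the same mechanism is how many of those data-mining bindings work under the hood):

    # Minimal sketch: calling a C library function from Python via ctypes.
    # Assumes a Unix-like system where the C math library can be located.
    import ctypes
    import ctypes.util

    # Locate and load libm; the exact filename varies by platform.
    libm = ctypes.CDLL(ctypes.util.find_library("m"))

    # Declare the C signature so ctypes marshals doubles correctly.
    libm.sqrt.argtypes = [ctypes.c_double]
    libm.sqrt.restype = ctypes.c_double

    print(libm.sqrt(2.0))  # 1.4142135623730951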
C++ is still overly complicated, which gets in the way of good autocomplete and means you need expensive tools to sanity-check your code.
General-purpose computing will be the domain of Java and C# for a long time.
Given that most of this code was originally targeting systems from the 1960s and '70s, I can't imagine there being an insurmountable number of lines of code.
According to Wikipedia, Gartner estimated about 200 billion lines of COBOL code in 1997. To put that in perspective, that's more than the total amount of open source C code tracked by OpenHub.net. Can you imagine persuading someone to rewrite all of that C code in a newer language?
"There is no statute of limitations on stupidity." -- Randomly produced by a computer program called Markov3.