The summary seems a bit misleading. The main thrust of the page I saw was that the push to replace work with automation can have consequences at a certain level. Does decision making really work well when automated, or does it lead to problems? There's evidence in both camps. One example: some traders on Wall Street have complained about removing people from the process, arguing that they really do add value at times. And sure, it's hard to imagine a human issuing a massive number of bad orders, but a computer model with a bit of a glitch might. But is that enough to slow things down? Just one example of many.
In my mind, critical thinking does have value, and no, there is nothing in data science, machine learning, etc. that comes even close to what humans can do in that area. There's a big debate in medicine about following best practices and whether just following algorithms would work better. Some note it would reduce unneeded tests and procedures. Others counter that doctors are much better at noticing when something is going really wrong, and that following a script could lead to unnecessary deaths that relying on clinical judgment would avoid. Is there a need for better data? Sure. But can you really automate judgment? And what is the real value, for humanity as a whole, of taking the craft out of everything?
The problem is that some people don't think software engineering, programming, coding, whatever you call it, requires critical thinking, or that there is a craft or art to programming. And you can increasingly do it that way: cut and paste, copy from the web, and when things don't work out, post a question and hope somebody answers.
What is lost is that somebody still has to have the skills to figure out what is going wrong, or to see that it can be done better. Where do those answers on the web come from, after all? At some point, somebody has to know how to approach the problem from the fundamentals and solve it, and that's when all those things that we (okay, at least my schoolmates and I) studied in CS come into play.
I'm on a project where they are just throwing idea after idea at a performance problem. Sure, it's tricky, but I've realized they have a huge blind spot: they don't know how to attach a low-level debugger to a process or monitor OS resources, and they don't even realize you can debug something without the sources. Sure, it's a Java enterprise application, so that's another layer of hard, but it can be done. Cripes, we used to have to debug core dumps. I'm glad (thrilled) that I don't have to do that anymore, but the skills I learned doing it were invaluable.
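To make that concrete, here's a minimal sketch of the kind of thing I mean: using the JDK's attach and JMX APIs to connect to a running JVM by PID and pull OS-level stats and a thread dump out of it, no sources required. The class name is made up, and I'm assuming a local JDK 8+ process you have permission to attach to; treat it as an illustration of "you can look inside a live process," not a recipe for the actual performance problem.

    // Sketch: attach to a running JVM by PID and inspect it from the outside.
    // Assumes a local JDK 8+ target process; class name is hypothetical.
    import com.sun.tools.attach.VirtualMachine;
    import javax.management.MBeanServerConnection;
    import javax.management.remote.JMXConnector;
    import javax.management.remote.JMXConnectorFactory;
    import javax.management.remote.JMXServiceURL;
    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;
    import java.lang.management.ThreadInfo;
    import java.lang.management.ThreadMXBean;

    public class PeekAtJvm {
        public static void main(String[] args) throws Exception {
            String pid = args[0];  // PID of the target JVM, passed on the command line

            // Attach to the running process and start its local JMX management agent
            VirtualMachine vm = VirtualMachine.attach(pid);
            String address = vm.startLocalManagementAgent();

            try (JMXConnector connector = JMXConnectorFactory.connect(new JMXServiceURL(address))) {
                MBeanServerConnection conn = connector.getMBeanServerConnection();

                // OS-level view of the target process: load average, CPU count
                OperatingSystemMXBean os = ManagementFactory.newPlatformMXBeanProxy(
                        conn, ManagementFactory.OPERATING_SYSTEM_MXBEAN_NAME, OperatingSystemMXBean.class);
                System.out.printf("load average: %.2f, cpus: %d%n",
                        os.getSystemLoadAverage(), os.getAvailableProcessors());

                // Thread dump with lock info -- no sources needed, just the live process
                ThreadMXBean threads = ManagementFactory.newPlatformMXBeanProxy(
                        conn, ManagementFactory.THREAD_MXBEAN_NAME, ThreadMXBean.class);
                for (ThreadInfo info : threads.dumpAllThreads(true, true)) {
                    System.out.println(info);
                }
            } finally {
                vm.detach();
            }
        }
    }

Nothing fancy, and for one-off looks the jcmd/jstack tools that ship with the JDK get you much the same thing from a shell. The point is simply that the running process is not a black box.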
A related aside: the problem is not a lack of better tools; it is not knowing that better (or any) tools exist, or that you can make better tools yourself.