With the amount of abstraction in software development these days, very few people seem to really know what they're doing anymore. This should concern you -- if it doesn't, you haven't thought about it enough yet.
We regularly see new exploits in systems that have been live for years, often spanning multiple major versions and platforms. In retrospect these flaws are usually painfully obvious, but because they have been buried in the sediment of "best practices", "boilerplate" code, and underlying platforms, they aren't seen. At least, not until a curious or malicious mind starts poking around.
While this is in part a problem with QA, the deception of abstraction is that it provides a Black Box that is very easy to trust. This affects developers as much as QA.
Are we really wise to keep building on these layers of abstraction? Toolkits on top of frameworks on top of virtual machines on top of operating systems on top of hardware -- even device manufacturers can't keep their locked-down devices from being rooted in a matter of days, sometimes even before release. While many of the Slashdot crowd laugh because there is a sense of social justice in seeing DRM broken, the same exploits may some day be used against systems we rely on. I don't consider myself a fearmonger, but I wouldn't be surprised to see significant digital infrastructure fail within my lifetime, whether through malicious intent or simple instability. Poor software quality hurts us all.
I realize that I sound like an old man yearning for the good old days, but I learned to program in assembly on 8088s, and I knew exactly what my programs were doing. I'm not saying I want to go back to that, but the idea that most developers these days don't even understand memory management or garbage collection blows my mind. Asking for a new language because getters and setters are too much of a hassle? Somebody get this kid a lollipop, please.
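For anyone who has only ever lived under a garbage collector, here is a minimal C sketch (the function name is invented for illustration) of what manual memory management actually asks of the programmer:

    #include <stdlib.h>
    #include <string.h>

    /* Manual memory management: every byte is the programmer's
     * responsibility from cradle to grave. */
    char *duplicate_name(const char *name)
    {
        size_t len = strlen(name) + 1;   /* +1 for the terminating NUL */
        char *copy = malloc(len);
        if (copy == NULL)
            return NULL;                 /* allocation can fail; check it */
        memcpy(copy, name, len);
        return copy;                     /* the caller now owns this block... */
    }

    int main(void)
    {
        char *n = duplicate_name("8088");
        if (n != NULL)
            free(n);    /* ...and must free it exactly once. Forget, and you
                         * leak; free twice, and you corrupt the heap. A GC
                         * hides every one of these decisions. */
        return 0;
    }

None of this is hard, but if you have never had to think about it, you will never understand what your runtime is doing on your behalf.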
I read the article (no, I'm not new here) and the author's main point, emphasis original, is this:
If your team is spending any time at all writing code to produce listing, filtering, and sorting behavior, not to mention creating CRUD screens and the back end logic for these operations, they are probably working at the wrong level of abstraction.
Where does he draw the line at "wasting time writing code"? This is exactly the mindset that leads to buffer overruns, SQL injections, and many other problems that should never make it into production software. He wants his developers to abstract as much as possible, but code reuse all too easily leads to blind acceptance and a failure to understand what is being imported. If he trusts that every acronym in his blog post is bug-free, I would hate to be one of his customers. Not that there seem to be many categorically better options available.
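To make that concrete, here is a sketch in C of the kind of bug that blind trust waves through; the table and function names are invented for illustration:

    #include <stdio.h>

    /* Two classic bugs in a dozen lines. */
    void lookup_user(const char *username)
    {
        char query[64];

        /* Bug 1: buffer overrun. sprintf happily writes past the end of
         * 'query' the moment 'username' is long enough. */
        sprintf(query, "SELECT * FROM users WHERE name = '%s';", username);

        /* Bug 2: SQL injection. printf stands in for the database call;
         * a parameterized query would prevent the injection, but only if
         * you know why you need one. */
        printf("executing: %s\n", query);
    }

    int main(void)
    {
        /* A "username" of x' OR '1'='1 rewrites the query's meaning
         * entirely, matching every row in the table. */
        lookup_user("x' OR '1'='1");
        return 0;
    }

Neither bug is exotic, and neither survives a developer who understands what the code underneath is doing.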
In the end, I think we need to abandon the cycle of "software bloat to more powerful hardware to software bloat..." and figure out what we can do with what we have. Good grief -- look at CUDA! We have orders of magnitude more processing power in a single video game console than all the world's computers before World War II, and available memory is simply insane. Take a look at what Farbrausch has done, and you will see what dedicated focus on efficiency can do.
Stop being lazy, understand what you are doing, understand what you have available, and use it well.