I think the author is optimistic about how good things were 20 years ago. Perhaps they are young. I got into writing code in the 80s, and for a living in the 90s, and we've been so bad for so long. A Calculator leaking 32GB is not necessarily damning - that's just a bug. The crime is that the damned Calculator requires 20MB of memory from a standing start. The crime isn't that some process periodically gets out of control and eats all the CPU and memory; that can be fixed. The crime is that my system has hundreds of background processes running after a fresh login. Nobody is able to grapple with that, so there is basically no chance that someone will think of a clever way to rearrange things to use half the resources. In fact, the best case is that someone notices three processes using shared structures, so they spin up a project to build a fourth process that the other three can be clients of.
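If you want to eyeball the scale of this on your own machine, here is a minimal sketch that counts running processes and sums their resident memory right after a fresh login. It assumes the third-party psutil package (not part of the standard library); the numbers are illustrative, not a benchmark.

    # Minimal sketch: count running processes and sum their resident memory.
    # Assumes the third-party psutil package (pip install psutil); run it
    # right after a fresh login to see your machine's baseline footprint.
    import psutil

    count = 0
    total_rss = 0
    for proc in psutil.process_iter(["memory_info"]):
        mem = proc.info["memory_info"]  # None if access was denied
        if mem is None:
            continue
        count += 1
        total_rss += mem.rss  # resident set size in bytes

    print(f"{count} processes, {total_rss / 2**20:.0f} MiB resident in total")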
When you come down to it, the real problem is that a lot of today's software isn't necessary and doesn't serve any particular end goal in terms of functionality. It's just there to track your usage and monetize it, so there's not much point in doing a good job.
On the upside, this does mean that if we survive to see the very long term, we will be able to salvage a lot of performance simply by wrangling away the lazy code. Speeds won't double every 9 months, but in 50 years our basic calculator programs might only need 4MB of memory. Unless someone adds skinning support.