Folks, we live in an age where programmers declare integers that will only ever count from 1 to 10 as LONG INTEGERS, eating 8 bytes of RAM when only 1 byte is needed.
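You can see the difference right from Python's `struct` module — here `=q` and `=B` are the standard-size codes for a signed 64-bit integer and an unsigned byte:

```python
import struct

# A 64-bit "long" integer vs. a single unsigned byte.
long_size = struct.calcsize("=q")  # 8 bytes
byte_size = struct.calcsize("=B")  # 1 byte
print(long_size, byte_size)        # 8 1
```

Eight times the memory to hold a number that never leaves single digits.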
We live in an age of cloud computing, load balancers, containers, and distributed databases with stored procedures. When code runs, you have no idea where it is running or how it is spread out over cloud services. Most of the time you don't even know what country the physical box is in.
I have a pure CS degree, but as long as we can keep making things faster and bigger, I am not sure this book will ever be a top seller. In the brave new world of computing I am not even sure what optimization means anymore. Optimize for the CPU, the network, the compiler, the database, the cloud architecture??? It is maddening!
As for me, I am currently doing an embedded systems project. Am I doing it in C and assembly like in the good old days? Heck, no — I am using Python on a quad-core ARM SoC with 1 GB of RAM. Even at max processing load I am barely hitting 10% CPU while coding in Python. As long as hardware is fast and cheap, there is no need to spend that kind of time optimizing every cycle and byte. BTW, this is my first Python project. It is an easy-peasy language that is great for hardware-interfacing projects, and libraries already exist for most common chips like the MCP3008 (A/D converter).
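For the curious kids: talking to a chip like the MCP3008 from Python really is just a few lines. Here is a minimal sketch of the standard single-ended read recipe from the MCP3008 datasheet (the `spidev` usage at the bottom is an untested assumption — bus and chip-select numbers depend on your board):

```python
def mcp3008_request(channel):
    # Build the 3-byte SPI request for a single-ended read:
    # a start bit, then the single-ended flag and channel number
    # packed into the high nibble of the second byte.
    assert 0 <= channel <= 7
    return [0x01, (0x08 | channel) << 4, 0x00]

def mcp3008_decode(response):
    # The 10-bit conversion result straddles the last two
    # response bytes: 2 bits in byte 1, 8 bits in byte 2.
    return ((response[1] & 0x03) << 8) | response[2]

# On the actual board, something like this (assumes the spidev
# package; untested sketch, pin/bus numbers are board dependent):
#   import spidev
#   spi = spidev.SpiDev()
#   spi.open(0, 0)               # SPI bus 0, chip-select 0
#   spi.max_speed_hz = 1_000_000
#   raw = spi.xfer2(mcp3008_request(0))
#   print(mcp3008_decode(raw))   # 0..1023
```

That's the whole driver, more or less — no interrupt tables, no linker scripts, no cycle counting.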
To the kids out there: this is a great time to be alive. You can build anything, learn anything, and talk to anyone. Do cool stuff. Learn everything. There are no limits, and powerful hardware is cheap. Look around at how lucky you are to be alive right now. It is an amazing time!