I can't say I've ever read Knuth in the literary sense. It's more of the ultimate reference.
Thankfully, I no longer have to construct sorts, searches, random number generators, etc. every time I write a program, since modern-day language systems come pre-supplied. However, someone has to implement those pre-supplied algorithms, and that someone almost certainly referenced Knuth. He has provided a concise, well-explained collection of analyses, discussions and sample implementations of many of the most important functions needed by almost everyone in the software development field.
The downside of it is MIX. I understand the reason Knuth created MIX, but MIX is an assembly language and not even a commonly-used one (and intentionally so). I know that there are many who think that the purpose of computers is to run machine (assembly) language, but they're wrong.
The purpose of computers is to run software. How it runs the software is secondary. You can see this in the fact that some machines include in their instruction sets specialized functions for specific abstract features. The Prime Computer instruction set had a machine-language implementation of the FORTRAN three-way arithmetic IF. I think it was Honeywell that implemented an OS task dispatcher as a single machine-language instruction. The Intel iAPX 432 was designed to run Ada, the IBM System/360 had the Translate instructions, and the IBM zSeries has a major subset of the Unix standard library implemented as CISC instructions, including a few of the Knuth algorithms.
Keeping software as a set of bits corresponding to primitive functions is convenient for implementing von Neumann machines, but it should be recognized as the means and not the end, and when scaled up it tends to lose the abstract picture in the minutiae. If minutiae were the point, one would be programming microcode and manually switching gates - on many machines, the "machine language" is itself an abstraction interpreted via microcode.
I've often complained that one of the biggest annoyances in IT is that you can pop out a prototype GUI in a day or two and everybody thinks it should go to full production next Thursday, whereas no one in their right mind would expect a scale cardboard model of a building or bridge to be ready for actual use in that time.
But there's a similar break in education, in that most CIS courses use a "practical" language instead of an abstract one.
While assembly language was very common when Knuth wrote Volumes I-III, he was aware that there was no universal high-level language any more than there was a universal low-level one, which was another reason he invented MIX. Even academically-inclined languages such as Modula ultimately detoured into practical use, warping their abstraction. And, although ideological purity makes me projectile-vomit, when you're dealing with abstract concepts, I prefer to keep things truly abstract. Otherwise it warps one's approach to later solutions.
Dijkstra did, in fact, attempt an abstract high-level language in his "A Discipline of Programming", although it has some odd warts of its own. Personally, I'd vote for Ada, because while the last time I actually ran Ada it nearly burned out the bearings on a mid-line IBM mainframe, it is the only common language I know where values and functions are assigned ranges and domains - an essential concept that is taught at the very beginning of differential calculus classes, but not nearly well-stressed enough in most CIS curricula. Were it otherwise, perhaps fewer satellites would have been destroyed.
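The range/domain idea that Ada enforces natively - e.g. `subtype Degrees is Float range 0.0 .. 360.0;`, where an out-of-range assignment raises Constraint_Error at run time - can be sketched as a rough Python analogue for readers without an Ada toolchain handy. The class and names here are illustrative inventions, not Ada semantics:

```python
class RangedFloat:
    """Rough analogue of an Ada range-constrained subtype:
    every value is checked against the declared bounds."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def check(self, value):
        # Where Ada would raise Constraint_Error, we raise ValueError.
        if not (self.lo <= value <= self.hi):
            raise ValueError(f"{value} outside {self.lo} .. {self.hi}")
        return value


# Ada equivalent: subtype Degrees is Float range 0.0 .. 360.0;
Degrees = RangedFloat(0.0, 360.0)

heading = Degrees.check(359.0)            # in range, accepted
try:
    heading = Degrees.check(heading + 2.0)  # 361.0 is out of range
except ValueError as err:
    print("constraint violated:", err)
```

The difference, of course, is that Ada checks this on every assignment without the programmer asking; the declaration itself documents the valid domain.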
Ada, incidentally, is part of the gcc/gnat toolset available on almost any Linux development system, and modern-day PCs are considerably more powerful than a 1990 mid-level IBM mainframe (but then again, probably so are most cellphones), so anyone who's curious can explore.