Let's take a look at software sizes, for a moment.
UNIX started at around 8k, and the entire Linux kernel could happily sit in the lower 1 megabyte of RAM for a long time, even with capabilities that terrified Microsoft and Apple.
The original game of Elite occupied maybe three quarters of a 100k floppy disk, and used swapping and extensive data files to create a massive universe that could be loaded into 8k of RAM.
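To make the general idea concrete, here's a minimal sketch of that sort of approach: keep the universe on disk and pull only the record you need into a small fixed buffer, so RAM use stays constant no matter how big the data file grows. This is not Elite's actual code; the file name, record layout, and function names are all hypothetical.

    /* Sketch only: stream one star-system record at a time from disk. */
    #include <stdio.h>
    #include <stdlib.h>

    struct system_record {            /* one star system, a few dozen bytes */
        char          name[16];
        unsigned char economy, government, tech_level;
        unsigned short x, y;          /* galactic coordinates */
    };

    static struct system_record current;   /* the only copy held in RAM */

    static int load_system(FILE *fp, long index)
    {
        /* seek to the fixed-size record and overwrite the single buffer */
        if (fseek(fp, index * (long)sizeof current, SEEK_SET) != 0)
            return -1;
        return fread(&current, sizeof current, 1, fp) == 1 ? 0 : -1;
    }

    int main(void)
    {
        FILE *fp = fopen("universe.dat", "rb");   /* hypothetical data file */
        if (!fp)
            return EXIT_FAILURE;
        if (load_system(fp, 42) == 0)             /* jump straight to system #42 */
            printf("Arrived at %.16s (tech level %u)\n",
                   current.name, (unsigned)current.tech_level);
        fclose(fp);
        return EXIT_SUCCESS;
    }

The point isn't the details; it's that the working set is one record, not the whole universe.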
On an 80386SX with 5 megabytes of RAM (Viglens were weird but fun) and a 20 megabyte hard drive, running Linux, I could simultaneously run 7 MMORPGs, X11R4, a mail server, a list server, an FTP server, a software router, a web server, a web cache, a web search engine, a web browser, and still have memory left over to play Netrek, without slowing anything down.
These days, that wouldn't be enough to load the FTP server, let alone anything else.
On the one hand, not everything can be coded to seL4 standards (although seL4, by using Haskell as the initial language for developing the kernel and the proofs, was able to cut the cost of formal verification to around 1% of the normal figure). On the other hand, a LOT of space is gratuitously wasted.
Yes, multiple levels of abstraction are part of the problem. There's nothing wrong with abstraction in itself (OpenLook is great), but modern abstraction mostly exists to paper over incompetent architecture at the levels below and truly dreadful APIs. And yes, APIs really are truly, truly dreadful if OpenLook is the paragon of beauty by comparison.