Fair enough - it is easy to forget just how much real functionality there actually is in these stacks. It is nice to live in a world where a handful of lines of glue code yield a rich application.
However, there is a lot of software that does not *need* all that, and for many/most users the trade-off generally isn't worth it. There is also the reality that all of that functionality too frequently gets delivered in the laziest way possible. Rather than a few shared libs that the OS could map into multiple virtual address spaces, we get everything shipping its own copy, because it's 'easier', if less efficient. It is a question of what you optimize around.
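As a quick illustration of the shared-libs point: on a typical Linux box you can see that every dynamically linked binary maps the same copy of libc, which the kernel shares as one set of physical pages across all those processes (paths may differ by distro, so take this as a sketch):

```shell
# List the shared objects /bin/ls depends on; the libc line here
# points at the single on-disk copy the dynamic linker maps into
# every process that uses it.
ldd /bin/ls | grep libc

# By contrast, a statically linked or "bundle everything" binary
# carries its own private copy of that code, so nothing is shared.
```

Running `pmap` on two different processes will show the same libc path mapped into both address spaces, which is the efficiency being given up when every app ships its own copy.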
Look at an older house: every single door will have been hung/framed and all the jointing done on site. Look at a new house: every door will be a pre-hung door. We incur the costs of packaging, shipping, and stocking an array of sizes to de-skill the install and save time. It's a different optimization.
Software is no different: if RAM is expensive, people will find ways to use less of it. What is special and uniquely good about software is that we get to keep using it as long as we want. If expensive RAM drives the development of memory-efficient stacks, then when RAM gets cheap again (it will, eventually) we still have the more efficient software, and we can pile even more debatable features on top...