This is a false dichotomy. Most software that uses less RAM is actually also faster.
Nowadays, it's usually faster to recompute a result than to read it back from RAM, and if an interactive program uses a lot of RAM, it's likely keeping a lot of junk in memory that it doesn't need.
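You can check this yourself with a quick microbenchmark. Here's a minimal sketch (the 64 MiB table size and the xorshift-style compute() stand-in are made up for illustration; whether recompute actually wins depends entirely on how expensive the computation is relative to a memory read, and on your cache sizes):

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* 8M entries * 8 bytes = 64 MiB: bigger than most L3 caches. */
    #define N (8 * 1024 * 1024)

    /* Cheap stand-in "computation": a few xorshift steps. */
    static uint64_t compute(uint64_t i) {
        uint64_t x = i * 2654435761ULL + 1;
        x ^= x << 13;
        x ^= x >> 7;
        x ^= x << 17;
        return x;
    }

    int main(void) {
        uint64_t *table = malloc(N * sizeof *table);
        uint64_t sum = 0;
        clock_t t0;

        if (!table)
            return 1;
        for (size_t i = 0; i < N; i++)  /* precompute the whole table */
            table[i] = compute(i);

        t0 = clock();                   /* pass 1: read stored values back */
        for (size_t i = 0; i < N; i++)
            sum += table[i];
        printf("lookup:    %.3fs (sum=%llu)\n",
               (double)(clock() - t0) / CLOCKS_PER_SEC, (unsigned long long)sum);

        sum = 0;
        t0 = clock();                   /* pass 2: recompute each value instead */
        for (size_t i = 0; i < N; i++)
            sum += compute(i);
        printf("recompute: %.3fs (sum=%llu)\n",
               (double)(clock() - t0) / CLOCKS_PER_SEC, (unsigned long long)sum);

        free(table);
        return 0;
    }

Swap in something heavier than a few xorshift steps and the table lookup starts winning; that crossover point is the whole tradeoff.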
Wow, this is a perfect example of completely misunderstanding memory-CPU tradeoffs.
No. For a non-trivial amount of data, it is never cheaper, at access time, to recompute the data. It may be faster overall, since you might put the freed RAM to better use elsewhere, but it will never speed up the accessing task itself.

If you recompute the data constantly, the results still have to be written to RAM and read back, unless you're dealing with a dataset small enough to live entirely in cache, in which case this is a non-issue anyway.
More caching is never a bad thing, so long as you pick smart defaults for how the caching is done and let users configure it. More RAM, in the hands of a smart developer, is a Good Thing (TM).
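As a rough illustration of "smart defaults plus user configuration", here's a minimal sketch of a direct-mapped memo cache. The DEFAULT_SLOTS value, the memo_init/memo_get names, and the expensive_fn stand-in are all invented for the example:

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Direct-mapped memo cache: one slot per bucket, overwritten on
     * collision. The default size is an arbitrary guess; callers can
     * override it. */
    #define DEFAULT_SLOTS 4096

    struct memo {
        size_t slots;
        uint64_t *keys;
        uint64_t *vals;
        uint8_t *used;
    };

    static struct memo *memo_init(size_t slots) {
        struct memo *m = malloc(sizeof *m);
        m->slots = slots ? slots : DEFAULT_SLOTS; /* 0 means "use the default" */
        m->keys = calloc(m->slots, sizeof *m->keys);
        m->vals = calloc(m->slots, sizeof *m->vals);
        m->used = calloc(m->slots, sizeof *m->used);
        return m;                       /* error checks omitted for brevity */
    }

    /* Stand-in for a genuinely expensive computation. */
    static uint64_t expensive_fn(uint64_t key) {
        uint64_t x = key;
        for (int i = 0; i < 100000; i++)
            x = x * 6364136223846793005ULL + 1442695040888963407ULL;
        return x;
    }

    static uint64_t memo_get(struct memo *m, uint64_t key) {
        size_t slot = key % m->slots;
        if (m->used[slot] && m->keys[slot] == key)
            return m->vals[slot];       /* hit: no recompute, one read */
        m->keys[slot] = key;            /* miss: compute, cache, return */
        m->vals[slot] = expensive_fn(key);
        m->used[slot] = 1;
        return m->vals[slot];
    }

    int main(void) {
        struct memo *m = memo_init(0);  /* 0 = accept the default size */
        printf("%llu\n", (unsigned long long)memo_get(m, 42)); /* miss */
        printf("%llu\n", (unsigned long long)memo_get(m, 42)); /* hit */
        return 0;
    }

A real implementation would want an eviction policy smarter than "clobber on collision", but the point is the size knob: ship a sane default, and let the user who knows their workload turn it up or down.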
Without life, Biology itself would be impossible.