Could, but usually doesn't. When hardware was more costly and slower, and labor relatively cheaper, mainframes ran, in some sense, "better" code with far less bloat and frillage. An "A" was just an "A" (ASCII or Baudot or EBCDIC), not a picture of a letter in some font taking many times the bits to store and draw, to pick just one example. Audio and video, which were (and still are) largely irreducible to small bits/second, were right out for real-time use.
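A back-of-envelope sketch of that "A is just an A" point (the glyph dimensions here are an illustrative assumption, not any particular font):

```python
# Bits needed to store the letter "A" as a character code
# in a few classic encodings:
baudot_bits = 5           # Baudot: 5-bit codes
ascii_bits = 7            # ASCII "A" = 0x41, a 7-bit code
ebcdic_bits = 8           # EBCDIC "A" = 0xC1, an 8-bit code

# Versus a modest anti-aliased glyph bitmap, assumed here to be
# 16x24 pixels at 8 bits of alpha each:
glyph_bits = 16 * 24 * 8  # 3072 bits

print(glyph_bits // ascii_bits)  # → 438, i.e. hundreds of times more bits
```

And that ignores the font file itself, hinting tables, and the rasterizer code that has to run to draw it.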
Mainframes had "acceleration" hardware to compensate: line printers took a few bits and did the drawing part (as did plotters for other uses).
Now phones and modern PCs use accelerators for crypto and for audio and video codecs, and they certainly don't bit-bang the screen pixels.
This leaves enough CPU - admittedly faster now - to handle crap interpreted scripts, HTML rendering... a long list of silly stuff.
And no matter how much faster CPUs get - or, by a possibly more important measure now, MIPS/watt - rather than code efficiently and use a low-power CPU, we just accept shorter battery life. The periodic table, for some reason, isn't driven by Moore's law: no new, more electropositive or electronegative elements are to be found, period. (I see what I did there.) No matter how much hardware improves, we still waste enough to want more for the same results.
I'm enjoying my lawn. Having started with a PDP-8/S, and today working with everything from the might of Intel down to ARM (Pi 3), ESP8266, and Teensies, this is a new world. But you still get more out of things by writing good code than most others do.