20 years from now, are we going to have machines with a terabyte of RAM and 256 cores that, on average, run only as fast as an old 386? By then we'll have passed the peak and gone into negative-returns territory, but we won't be able to go back because everything would break even worse. For example, code with so many security checks that it's in an "infinite bug" state, where fixing one exploit opens up another (personally, I think we're there already, but that's another story).
This may be true for companies stuck on some particular application, but hopefully desktop software will keep improving through competition: when one browser gets too slow, people start switching.
Probably not true for your average IE/MSOffice user, though.
Whom computers would destroy, they must first drive mad.