Software has been eating the free lunch Moore's law was providing before it ever reached users; the sad reality is that the typical end user hasn't seen much in the way of performance improvement - in some cases, common tasks are even slower now than they were 10 years ago.
This point of view is common, even though its odd disparity with reality makes it seem almost anachronistic. Software isn't bloating anywhere near as fast as expectations are.
Oh, sure, it's true that much software is slower than its predecessors. Windows 7 is considerably slower on the same hardware than Windows XP, which in turn is a dog compared to Windows 95. But we aren't actually running on the same hardware, and our expectations have risen dramatically. In fact, most implementations of compilers and algorithms show consistent improvements in speed. Recent compilers are considerably faster than older ones. Newer compression software is faster (often by orders of magnitude!) than earlier versions. Voice recognition, facial pattern matching, lossy compression for video and audio, and far too many other things to name have all improved steadily over time. For a good example of this kind of improvement, take a look at the recent work on "faster than fast" Fourier transforms.
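To make the algorithmic side of this concrete - this isn't the sparse-FFT work mentioned above, just a minimal stdlib-Python sketch of the same phenomenon - here is the naive O(n²) discrete Fourier transform next to the classic O(n log n) Cooley-Tukey FFT. Both compute the identical result; the faster algorithm simply does dramatically less work. (The radix-2 version assumes the input length is a power of two.)

```python
import cmath
import time

def dft(x):
    """Naive O(n^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])           # FFT of even-indexed samples
    odd = fft(x[1::2])            # FFT of odd-indexed samples
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddle[k] for k in range(n // 2)] +
            [even[k] - twiddle[k] for k in range(n // 2)])

if __name__ == "__main__":
    signal = [complex(k % 7) for k in range(512)]
    t0 = time.perf_counter(); slow = dft(signal); t1 = time.perf_counter()
    fast = fft(signal); t2 = time.perf_counter()
    # Same answer, very different cost.
    print("max difference:", max(abs(a - b) for a, b in zip(slow, fast)))
    print(f"naive DFT: {t1 - t0:.3f}s, FFT: {t2 - t1:.3f}s")
```

Even in interpreted Python on a modest input, the FFT wins handily - and that gap only widens with input size, which is exactly the kind of improvement that quietly accumulated while our machines were also getting faster.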
So why does it seem that software gets slower and slower? I remember when my Dell Inspiron 600m was a slick, fast machine. I was amazed at all the power in that little package! And yet, even running its original install of Windows XP, I can't watch Hulu on it - it simply doesn't have the power to play full-screen, full-motion, compressed video in real time. I was stunned at how long (a full minute?) the old copy of OpenOffice took to load, even though I remember running it on that same machine! (On my i7 laptop with an SSD and 8 GB of RAM, OpenOffice loads in about 2 seconds.)
Expectations are what changed more than the software.