My dad is in his mid-70s. He's stopped going to the local computer places because they won't stop hassling him about how he needs a new PC, etc. His computer is about 7 years old: a Core 2 Duo E8600 (3.33 GHz), 2 GB of DDR2, and a SATA HDD (250 GB, I think) with integrated graphics. He uses it for email, typing things up in Office, doing his income tax, watching YouTube videos, and occasionally (once a month or so) converting video footage my mom took with their video camera to DVD so he can burn it and watch it on the big TV.
The last time I was over there I fired up Task Manager and showed him that, other than when he was encoding or when he first started a program, he used 10% or less of his CPU's total processing power. That's one of the reasons he's so happy about getting an upgrade from 7 to 10 for free. He doesn't need to buy a new computer.
Your average user off the street just doesn't need and won't notice the difference in OOMPH if they get a new high-end computer. Wayyy back in my last year of high school (95/96) the school division gave every school a new high-end computer. My school was given a 486 DX4/100 with, I believe, 8 MB of RAM. Our previous best computer was a 486 DX/25. That new computer was a beast. You could boot to DOS 6.22 and be in Win 3.11 firing up the programs you needed in less time than it took your classmates to even boot up. And the high-school-level stuff we did with databases, spreadsheets, etc. went from "Oh geez, how long is this stupid thing going to take?" to blink-and-you-miss-it.
That's the problem with how far technology has gone. As you said, it's gone PAST what your average slob needs or would notice in their day-to-day use. Isn't the magic number supposedly 15%? That an improvement has to be a MINIMUM of 15% over the previous generation for your average user (without prompting or telling them the upgrade is there) to notice the change and go "Wow"?