And my laptop can do everything the PC from the early '90s could do without even turning its fans on. Even a Raspberry Pi 2 handily outperforms a late-'90s PC, has no heatsink (let alone a fan) on its SoC, and runs happily from a battery.
Apples and oranges. I could also compare a pocket calculator to ENIAC. It doesn't matter. What matters is what people actually use, because that's what drives power consumption.
You brought up an Alpha workstation when most people only had PCs in their homes. Now you're comparing against a modern Raspberry Pi, which is basically an embedded system. I compared high-end desktop PCs with high-end desktop PCs.
Those data points you're picking are completely irrelevant to office machines.
No one needs an Alpha for an office machine either. In fact, in the early 1990s a lot of offices still had electric typewriters, or even mechanical ones, which used even less power than your laptop or whatever.
The "high-end" GPU is not for replacing a cluster. It's a gaming machine add-on that costs $3000 USD. Hardly a cluster. The Titan Z can't even do DP FP worth a damn. It's not for scientific computing. It's for gaming. Even a $999 USD card can use 320W. Not outside of the realm for a gamer. I paid more for a computer monitor two decades ago when the dollar was actually worth more.
The problem with modern LCD screens is 4K. It gobbles power. The same thing happened with smartphones and tablets when they went to high-resolution displays.
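For scale: a 4K UHD panel drives exactly four times the pixels of 1080p, which means four times the GPU fill work, and on an LCD the denser TFT grid blocks more of the backlight, so the backlight has to run brighter for the same output. A minimal pixel-count sketch (the 4x ratio is exact; how power actually scales depends on panel tech and backlight, so treat this as illustrative only):

    # Pixel-count ratio of 4K UHD vs 1080p. Power doesn't scale
    # linearly with pixels, but more pixels mean more GPU fill work
    # and a lower aperture ratio on LCD panels.
    px_1080p = 1920 * 1080    # 2,073,600 pixels
    px_4k    = 3840 * 2160    # 8,294,400 pixels
    print(px_4k / px_1080p)   # 4.0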