This has been a pet peeve of mine for some years now. Lots of people, including IT professionals, claim that CPUs and GPUs have gotten "fast enough" and there is no need for better ones in the future. How can educated, intelligent people be so short-sighted?
Anybody who has ever written even the simplest performance-sensitive program should understand that there is not and never will be "enough" processing power.
It is always easy to find more useful things to do, if only you have more power. On the other side of the coin, it is always easy to make a program take a long time to run: just give it a lot of data to crunch.
Since this is a graphics thread, let's talk about graphics. Obviously the current generation of games runs OK on the current generation of hardware; the games were built for that hardware! And why is everyone so obsessed with counting pixels anyway? Once we have enough pixels, how about more complex and realistic graphics from simple algorithms? It is very easy to write a raytracer that handles complex geometry with lights, shadows, and textures. You can even get full-on global illumination, including depth of field, soft shadows, diffuse interreflection, and caustics, pretty easily, if you don't mind tracing a thousand rays per pixel. Simple, effective, but very slow brute-force solutions have been around for a long time, and if we had really, really fast GPUs, any college undergrad taking Graphics 101 could write a breathtakingly realistic real-time 3D renderer.
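To show how little code the brute-force approach actually takes, here is a minimal sketch of the idea: one diffuse sphere lit by a spherical area light, with soft shadows estimated by firing many random shadow rays per pixel and averaging. The scene, the names, and every parameter here are made up for illustration; it is a toy, and it is slow on purpose.

```python
import math, random

WIDTH, HEIGHT, SAMPLES = 160, 120, 64   # raise SAMPLES for smoother shadows (and longer waits)

SPHERE_C, SPHERE_R = (0.0, 1.0, 3.0), 1.0   # one diffuse sphere
LIGHT_C, LIGHT_R = (2.0, 4.0, 1.0), 0.7     # spherical area light

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def norm(a):
    l = math.sqrt(dot(a, a))
    return (a[0]/l, a[1]/l, a[2]/l)

def hit_sphere(orig, d, center, radius):
    # Solve |orig + t*d - center|^2 = radius^2 for the nearest t > 0 (d must be unit length).
    oc = sub(orig, center)
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def soft_shadow(point, normal):
    # Brute force: fire SAMPLES shadow rays at random points on the area light
    # and average the results. More rays = less noise = more CPU time.
    total = 0.0
    for _ in range(SAMPLES):
        while True:  # rejection-sample a point inside the unit ball
            o = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
            if dot(o, o) <= 1.0:
                break
        lp = (LIGHT_C[0] + o[0]*LIGHT_R, LIGHT_C[1] + o[1]*LIGHT_R, LIGHT_C[2] + o[2]*LIGHT_R)
        ldir = norm(sub(lp, point))
        if hit_sphere(point, ldir, SPHERE_C, SPHERE_R) is None:  # not occluded
            total += max(0.0, dot(normal, ldir))  # Lambertian shading
    return total / SAMPLES

eye = (0.0, 1.0, 0.0)  # pinhole camera looking down +z
with open("out.pgm", "w") as f:  # grayscale PGM; most image viewers open it
    f.write("P2\n%d %d\n255\n" % (WIDTH, HEIGHT))
    for y in range(HEIGHT):
        for x in range(WIDTH):
            d = norm(((x - WIDTH/2) / HEIGHT, -(y - HEIGHT/2) / HEIGHT, 1.0))
            t = hit_sphere(eye, d, SPHERE_C, SPHERE_R)
            if t is not None:
                p = (eye[0] + t*d[0], eye[1] + t*d[1], eye[2] + t*d[2])
                n = norm(sub(p, SPHERE_C))
                v = soft_shadow(p, n)
            else:
                v = 0.1  # flat background
            f.write("%d\n" % int(255 * min(1.0, v)))
```

That's the whole renderer, and scaling it up to "a thousand rays per pixel" is literally changing one constant. The only thing standing between this kind of code and real-time photorealism is raw compute.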
Now, you might argue that it is stupid to use brute-force solutions and waste massive amounts of CPU time just because we can. Well, sometimes it is OK to waste CPU, if you have tons and tons of CPU time to spare. On the other hand, you can always use more efficient algorithms, at the cost of more complicated code. Then the same super-fast hardware buys you INSANELY high graphics quality and performance compared to the brute-force solution. Everybody wins.
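To make the "smarter algorithm, more code" trade concrete, here is one hypothetical example: replacing the independent random light samples from the sketch above with stratified (jittered-grid) samples. Same ray budget, visibly less noise, slightly hairier code. The mapping below assumes a disk-shaped light rather than the ball used earlier, so it is an illustration of the technique, not a drop-in replacement.

```python
import math, random

def stratified_disk_samples(n):
    # Stratified sampling of the unit disk: split the unit square into a
    # sqrt(n) x sqrt(n) grid, pick one jittered point per cell, then map the
    # square onto the disk with r = sqrt(u), theta = 2*pi*v (area-preserving).
    # Samples can no longer clump, so the shadow estimate converges faster
    # than with pure random sampling at the same ray count.
    side = int(math.sqrt(n))
    pts = []
    for i in range(side):
        for j in range(side):
            u = (i + random.random()) / side
            v = (j + random.random()) / side
            r, theta = math.sqrt(u), 2.0 * math.pi * v
            pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

And that is only the first rung of the ladder: importance sampling, bounding volume hierarchies, and the rest each buy further orders of magnitude, each at the cost of yet more code.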
I'm a graphics guy so this is the kind of thing I'm familiar with. But surely there are examples in other areas of computing where we have many orders of magnitude to go before we run out of ideas for how to use our computing cycles.