Comment Re:A non-problem (Score 1) 174
Hardware is increasing in capacity at an insane rate, so of course software efficiency takes a back seat. Under most circumstances, no one notices that unoptimized code takes slightly longer to run.
Once we plateau (again), we'll see a greater push for optimization. These things happen in cycles.
Agreed on both counts, and it'll be interesting to watch from here. We're long past the gigahertz race, and increasingly past the (semiconductor) process shrink. Even GPUs are delivering diminishing returns, with thermal dissipation, capacitance across extreme-speed multi-die interconnects, and other basic limits of physics permitting little more than incremental gains.
The truth is, we've been in the "fracking" stage of (classical) compute performance growth for a decade or two now. Soon enough, we'll have run out of oil-bearing shale to shatter, and we'll come to realize that 400MB libraries for a file-open dialog atop seventeen layers of abstraction were more a bloated V10 SUV than a sensible hatchback.
Then again, perhaps capable, reliable commodity quantum computing lies beyond the next plateau, and the cycle continues.