45 Years Later, Does Moore's Law Still Hold True?
Velcroman1 writes "Intel has packed just shy of a billion transistors into the 216 square millimeters of silicon that compose its latest chip, each one far, far thinner than a sliver of human hair. But this mind-blowing feat of engineering doesn't really surprise us, right? After all, that's just Moore's Law in action, isn't it? In 1965, an article in "Electronics" magazine by Gordon Moore, the future co-founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months. Or was it 24 months? Actually, nowhere in the article did Moore spell out that famous declaration, nor does the word 'law' even appear in the article. Yet the idea has proved remarkably resilient over time, entering the zeitgeist and lodging like a stubborn computer virus you just can't eradicate. But does it hold true? Strangely, that seems to depend more than anything on whom you ask. 'Yes, it still matters, and yes we're still tracking it,' said Mark Bohr, Intel senior fellow and director of process architecture and integration. 'Semiconductor chips haven't actually tracked the progress predicted by Moore's law for many years,' said Tom Halfhill, a well-respected chip analyst with industry bible the Microprocessor Report."
Virus? (Score:1, Insightful)
Stop using Microsoft Windows.
A Better Question: (Score:5, Insightful)
Seriously, hardware is always getting faster. Why do we need a law that states this? Which is a more likely scenario for Intel: "Ok, we need to make our chips faster because of some ancient arbitrary rule of thumb for hardware speed.", or "Ok, we need to make our chips faster because if we don't, AMD will overtake us and we'll lose money."?
Re:Number of components, not computing power (Score:4, Insightful)
Adding components is easy. Making faster computers is not.
Ask a vague question, get a vague answer. (Score:5, Insightful)
If you extrapolate from the date that Moore first made the prediction, using the transistor counts of the day and a particular scaling exponent ("doubling every two years"), then the extrapolated line, today, will not exactly match current transistor counts. So it fails.
But if you use the "Law" in its most general form, which is something like "computing power will increase exponentially with time," then yes, it's basically true. One of the problems with this, however, is that you can draw a straight line, and extract a growth rate, through a lot of datasets once plotted in a log-linear fashion. To know whether the data "really is" growing exponentially, you need to do some more careful statistics, and decide on what you think the error bars are. Again, with sufficiently large error bars, our computing power is certainly increasing exponentially. But, on the other hand, if you do a careful fit you'll find the growth rate is not constant: it actually changes in different time periods (corresponding to breakthroughs and the maturation of technologies, for instance). So claiming that the history of computing fits a single exponential is an approximation, at best.
So you really need to be clear what question you're asking. If the question is asking whether "Moore's Law" is really an incontrovertible law, then the answer is "no". If the question is whether it's been a pretty good predictor, then answer is "yes" (depending on what you mean by "pretty good" of course). If the question is "Does industry still use some kind of assumption of exponential scaling in their roadmapping?" the answer is "yes" (just go look at the roadmaps). If the question is "Can this exponential scaling continue forever?" then the answer is "no" (there are fundamental limits to computation). If the question is "When will the microelectronics industry stop being able to deliver new computers with exponentially more power?" then the answer is "I don't know."
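The fitting exercise the parent describes can be sketched in a few lines. The transistor counts below are rough illustrative round numbers, not Intel's actual figures; the point is that an ordinary least-squares fit of log2(count) against year is a straight line exactly when growth is exponential, and the slope gives the doubling time.

```python
import math

# Rough, illustrative transistor counts by year (not exact figures)
data = {1971: 2.3e3, 1980: 3.0e4, 1990: 1.2e6, 2000: 4.2e7, 2010: 1.0e9}

# Least-squares fit of log2(count) vs. year: exponential growth
# appears as a straight line in this log-linear space.
xs = list(data.keys())
ys = [math.log2(c) for c in data.values()]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)

# slope is doublings per year, so its reciprocal is the doubling time
doubling_time = 1 / slope
print(f"fitted doubling time: {doubling_time:.2f} years")
```

With these made-up numbers the fit lands near a two-year doubling time, but as the parent says, fitting sub-ranges of real data gives noticeably different slopes.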
Re:Number of components, not computing power (Score:5, Insightful)
You're either trolling or looking at it the wrong way.
More efficient software means we could probably run tomorrow's software with yesterday's hardware.
Instead, because of bloat, we're stuck running yesterday's software with tomorrow's hardware.
When put in the mobile context, it also means shorter battery life.
Re:A Better Question: (Score:5, Insightful)
My understanding was that the prediction was indeed important, for inter-business communication. Say, for example that a company purchases cpus from a vendor, for use in its product when it releases two years from now. The product development team will shoot for the expected specs on the cpus at that future date, so that the product will be current when it hits the market. Such predictability is very important for some.
Re:Number of components, not computing power (Score:3, Insightful)
Re:Number of components, not computing power (Score:5, Insightful)
I remember in the early 90s, processor performance was easily doubling every 2 years, and it certainly hasn't been that way the last 4-5 years.
It was easier to measure then, because performance was directly related to clock rate. Now that clock rates have stopped going up, performance depends on parallel processing.
Then there's a catch: parallel processing depends on the software. Doubling the clock rate will probably double the performance of almost any software that runs on the computer; doubling the number of cores will not necessarily. Luckily, the most demanding tasks in computing are those that can be parallelized.
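The core-count caveat above is just Amdahl's law: if a fraction p of a task can be parallelized, n cores give a speedup of 1/((1-p) + p/n), so doubling the cores only doubles throughput when p = 1. A quick sketch:

```python
def amdahl_speedup(p, n):
    """Speedup on n cores when a fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# Doubling the clock rate doubles performance regardless of workload,
# but doubling cores helps only as much as the parallel fraction allows.
for p in (0.5, 0.9, 0.99):
    print(f"p={p}: 2 cores -> {amdahl_speedup(p, 2):.2f}x, "
          f"8 cores -> {amdahl_speedup(p, 8):.2f}x")
```

Even at p = 0.9, eight cores buy you well under a 5x speedup, which is why "more cores" and "more performance" aren't interchangeable.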
With the advent of the GPGPU the future looks bright for Moore's Law. I've recently run some benchmarks using CUDA to perform FFTs and compared them to the data I have from my old computers. In my case, at least, my current computer is above the curve predicted by applying Moore's Law to the computers I've had in the last 25 years.
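The curve comparison the parent describes amounts to projecting an old measurement forward under an assumed doubling period. A minimal sketch, with made-up numbers (the function name and the 1.0 GFLOP/s figure are purely illustrative):

```python
def moores_law_projection(perf_then, years_elapsed, doubling_period=2.0):
    """Project an old benchmark forward assuming exponential doubling.

    perf_then: performance measured years_elapsed years ago (any unit,
    e.g. FFT throughput). doubling_period is the assumed doubling time
    in years (2.0 is the common reading of Moore's Law).
    """
    return perf_then * 2 ** (years_elapsed / doubling_period)

# Hypothetical: a machine that managed 1.0 GFLOP/s of FFTs 10 years ago
# "should" be at 32 GFLOP/s today under a two-year doubling period.
predicted = moores_law_projection(1.0, 10)
print(f"predicted: {predicted:.1f} GFLOP/s")  # predicted: 32.0 GFLOP/s
```

If your measured GPU throughput beats that projection, you're "above the curve" in the parent's sense.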
Moore's law is not a law (Score:4, Insightful)
At best it is a self-fulfilling prophecy, as the 'law' is now used as a standard for judging the industry, which strives to keep up with the predictions.
Re:Moores law of first posts (Score:5, Insightful)
I feel like I've been reading this article every six months for the last ten years.
Re:Number of components, not computing power (Score:4, Insightful)
I remember worrying when they started making 16 and 20 MHz CPUs; I thought digital electronics wouldn't be very stable at those sorts of clock speeds.
Re:Number of components, not computing power (Score:5, Insightful)
Well said.
I'm often modded troll when I claim that every advancement in computer processing power has been absorbed by the look and feel of the OS interface.
Recalculating the spreadsheet (or just about any other real work) seemingly takes just as long (short?) as ever.
I know it's not provably true, but it sure seems that way.
Re:Number of components, not computing power (Score:5, Insightful)
The problem with that is there is no objective definition of software bloat. It's just Slashdot shorthand for "spending time on stuff I personally don't find important".
Your "bloat" is another user's "better interface" or "better security" or "maintainable code".
Re:Number of components, not computing power (Score:4, Insightful)
I think you just refuted your own point. The most complex piece of software on your computer is a word processor. That's the problem. Things which are conceptually simple have become so monstrously bloated that they're now "complex software".
Re:Number of components, not computing power (Score:4, Insightful)
That's sometimes true; the trouble is that I'm finding software is often less reliable and slower than the same kind of software a decade ago. More maintainable code should mean a more reliable product. I don't see why security necessarily yields terribly slower software and much larger file sizes, unless you mean constant malware scanning. And software available today isn't necessarily more usable to novices or the experienced, so the suggestion of a better interface doesn't necessarily hold true either.