45 Years Later, Does Moore's Law Still Hold True?

Velcroman1 writes "Intel has packed just shy of a billion transistors into the 216 square millimeters of silicon that compose its latest chip, each one far, far thinner than a human hair. But this mind-blowing feat of engineering doesn't really surprise us, right? After all, that's just Moore's Law in action, isn't it? In 1965, an article in "Electronics" magazine by Gordon Moore, the future co-founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months. Or was it 24 months? Actually, nowhere in the article did Moore spell out that famous declaration, nor does the word 'law' even appear in the article at all. Yet the idea has proved remarkably resilient over time, entering the zeitgeist and lodging like a stubborn computer virus you just can't eradicate. But does it hold true? Strangely, that seems to depend more than anything on whom you ask. 'Yes, it still matters, and yes, we're still tracking it,' said Mark Bohr, Intel senior fellow and director of process architecture and integration. 'Semiconductor chips haven't actually tracked the progress predicted by Moore's law for many years,' said Tom Halfhill, the well-respected chip analyst with industry bible the Microprocessor Report."
  • Virus? (Score:1, Insightful)

    by Anonymous Coward on Tuesday January 04, 2011 @04:03PM (#34757396)

    lodging like a stubborn computer virus you just can't eradicate

    Stop using Microsoft Windows.

  • A Better Question: (Score:5, Insightful)

    by justin.r.s. ( 1959534 ) on Tuesday January 04, 2011 @04:04PM (#34757408)
    45 Years Later, Does Moore's Law Still Matter?
    Seriously, hardware is always getting faster. Why do we need a law that states this? Which is a more likely scenario for Intel: "Ok, we need to make our chips faster because of some ancient arbitrary rule of thumb for hardware speed.", or "Ok, we need to make our chips faster because if we don't, AMD will overtake us and we'll lose money."?
  • by MrEricSir ( 398214 ) on Tuesday January 04, 2011 @04:05PM (#34757424) Homepage

    Adding components is easy. Making faster computers is not.

  • by JustinOpinion ( 1246824 ) on Tuesday January 04, 2011 @04:25PM (#34757636)
    Well the problem here is that the question "Does Moore's Law Hold True?" is not very precise. It's easy to show both that the law doesn't hold, and that it is being followed still today, depending on how tight your definitions are.

    If you extrapolate from the date that Moore first made the prediction, using the transistor counts of the day and a particular scaling exponent ("doubling every two years"), then the extrapolated line, today, will not exactly match current transistor counts. So it fails.

    But if you use the "Law" in its most general form, something like "computing power will increase exponentially with time," then yes, it's basically true. One of the problems with this, however, is that you can draw a straight line through a lot of datasets once they're plotted in log-linear fashion, and read off a growth rate. To know whether the data really is following exponential growth, you need to do some more careful statistics and decide what you think the error bars are. With sufficiently large error bars, our computing power is certainly increasing exponentially. But, on the other hand, if you do a careful fit you'll find the growth rate is not constant: it changes in different time periods (corresponding to breakthroughs and the subsequent maturation of technologies, for instance). So claiming that the history of computing fits a single doubling rate is an approximation, at best.

    So you really need to be clear what question you're asking. If the question is asking whether "Moore's Law" is really an incontrovertible law, then the answer is "no". If the question is whether it's been a pretty good predictor, then answer is "yes" (depending on what you mean by "pretty good" of course). If the question is "Does industry still use some kind of assumption of exponential scaling in their roadmapping?" the answer is "yes" (just go look at the roadmaps). If the question is "Can this exponential scaling continue forever?" then the answer is "no" (there are fundamental limits to computation). If the question is "When will the microelectronics industry stop being able to deliver new computers with exponentially more power?" then the answer is "I don't know."
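    The log-linear fit described above can be sketched in a few lines. The transistor counts below are rough, illustrative figures, not real chip data; the point is the method: fit a straight line to log2(count) versus year and read the doubling period off the slope.

    ```python
    import numpy as np

    # Rough, illustrative (year, transistor count) pairs -- not real chip data.
    years = np.array([1971, 1981, 1991, 2001, 2011])
    counts = np.array([2.3e3, 3.0e5, 3.1e6, 4.2e7, 2.6e9])

    # A straight line on a log-linear plot: fit log2(count) against year.
    slope, intercept = np.polyfit(years, np.log2(counts), 1)

    doubling_period = 1.0 / slope  # years per doubling
    print(f"doubling period ~ {doubling_period:.1f} years")

    # Residuals (in doublings) show how far each point strays from a
    # single constant growth rate -- nonzero residuals are the "changing
    # exponent" the comment describes.
    residuals = np.log2(counts) - (slope * years + intercept)
    print("residuals (doublings):", np.round(residuals, 2))
    ```

    With these numbers the fit lands near a two-year doubling, but the residuals are a fair fraction of a doubling, which is exactly the "single exponent is an approximation" point.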
  • by Yvan256 ( 722131 ) on Tuesday January 04, 2011 @04:33PM (#34757718) Homepage Journal

    You're either trolling or looking at it the wrong way.

    More efficient software means we could probably run tomorrow's software with yesterday's hardware.

    Instead, because of bloat, we're stuck running yesterday's software with tomorrow's hardware.

    When put in the mobile context, it also means shorter battery life.

  • by epte ( 949662 ) on Tuesday January 04, 2011 @04:34PM (#34757728)

    My understanding was that the prediction was indeed important for inter-business communication. Say, for example, that a company purchases CPUs from a vendor for use in a product that releases two years from now. The product development team will target the expected CPU specs at that future date, so that the product is current when it hits the market. Such predictability is very important for some.
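    That planning arithmetic is trivial but worth making explicit. A hypothetical sketch, assuming the industry keeps doubling on a 24-month schedule (an assumption, not a law):

    ```python
    def projected_factor(months_out, doubling_months=24):
        """Expected performance multiple at launch, assuming a steady
        doubling every `doubling_months` months."""
        return 2 ** (months_out / doubling_months)

    # A team shipping in two years plans around roughly 2x today's chips.
    print(projected_factor(24))   # -> 2.0
    # Three years out, roughly 2.8x.
    print(projected_factor(36))
    ```

    The risk, of course, is that the product is specced against a curve the fab may not actually deliver.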

  • by BlueWaterBaboonFarm ( 1610709 ) on Tuesday January 04, 2011 @04:35PM (#34757730)
    Since when can't you call someone a troll for telling you the truth?
  • by mangu ( 126918 ) on Tuesday January 04, 2011 @04:36PM (#34757752)

    I remember in the early 90s, processor performance was easily doubling every 2 years, and it certainly hasn't been that way the last 4-5 years.

    It was easier to measure then, because performance was directly related to clock rate. Now that clock rates have stopped going up, performance depends on parallel processing.

    There's a catch, though: parallel processing depends on the software. Doubling the clock rate will probably double the performance of almost any software that runs on the computer; doubling the number of cores will not necessarily. Luckily, the most demanding tasks in computing are those that can be parallelized.

    With the advent of the GPGPU, the future looks bright for Moore's Law. I've recently run some benchmarks using CUDA to perform FFTs and compared them against the data I have from my old computers. In my case, at least, my current computer is above the curve predicted by applying Moore's Law to the computers I've had over the last 25 years.
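    The clock-versus-cores catch above is usually stated as Amdahl's law: if only a fraction of the work parallelizes, extra cores hit a hard ceiling. A minimal sketch (the parallel fractions are illustrative, not measurements):

    ```python
    def amdahl_speedup(parallel_fraction, n_cores):
        """Amdahl's law: overall speedup when only `parallel_fraction`
        of the work can be spread across `n_cores`."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

    # Doubling the clock speeds up everything by 2x. Doubling cores does not:
    for p in (0.5, 0.9, 0.99):
        print(f"p={p}: 2 cores -> {amdahl_speedup(p, 2):.2f}x, "
              f"1024 cores -> {amdahl_speedup(p, 1024):.1f}x")
    ```

    Even with 99% of the work parallel, 1024 cores deliver well under a 100x speedup, which is why "more cores" only helps the workloads that parallelize well.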

  • by ahodgkinson ( 662233 ) on Tuesday January 04, 2011 @04:41PM (#34757798) Homepage Journal
    Moore made an observation that the number of components on a chip was doubling roughly every year, and later revised the observation to a doubling every two years. There was no explanation of causality.

    At best it is a self-fulfilling prophecy, as the 'law' is now used as a standard for judging the industry, which strives to keep up with the predictions.

  • by fusiongyro ( 55524 ) on Tuesday January 04, 2011 @04:42PM (#34757816) Homepage

    I feel like I've been reading this article every six months for the last ten years.

  • by Joce640k ( 829181 ) on Tuesday January 04, 2011 @04:43PM (#34757822) Homepage

    I remember worrying when they started making 16 and 20 MHz CPUs; I thought digital electronics wouldn't be very stable at those sorts of clock speeds.

  • by icebike ( 68054 ) on Tuesday January 04, 2011 @04:58PM (#34758000)

    Well said.

    I'm often modded troll when I claim that every advancement in computer processing power has been absorbed by the look and feel of the OS interface.

    Recalculating the spreadsheet (or just about any other real work) seemingly takes just as long (short?) as ever.

    I know it's not provably true, but it sure seems that way.

  • by Cid Highwind ( 9258 ) on Tuesday January 04, 2011 @05:14PM (#34758212) Homepage

    The problem with that is that there's no objective definition of software bloat. It's just Slashdot shorthand for "spending time on stuff I personally don't find important."

    Your "bloat" is another user's "better interface" or "better security" or "maintainable code".

  • by SecurityGuy ( 217807 ) on Tuesday January 04, 2011 @07:11PM (#34759722)

    I'd have to wait an entire second to launch one of the most complex pieces of software on my computer.

    I think you just refuted your own point. The most complex piece of software on your computer is a word processor. That's the problem. Things which are conceptually simple have become so monstrously bloated that they're now "complex software".

  • by Jeff DeMaagd ( 2015 ) on Tuesday January 04, 2011 @07:13PM (#34759744) Homepage Journal

    That's sometimes true; the trouble is that I'm finding software is often less reliable and slower than the same kind of software a decade ago. More maintainable code should mean a more reliable product. I don't see how security necessarily yields terribly slower software and much larger file sizes, unless you mean constant malware scanning. And software available today isn't necessarily more usable to novices or the experienced, so the suggestion of a better interface doesn't necessarily hold true either.
