45 Years Later, Does Moore's Law Still Hold True?
Velcroman1 writes "Intel has packed just shy of a billion transistors into the 216 square millimeters of silicon that compose its latest chip, each one far thinner than a human hair. But this mind-blowing feat of engineering doesn't really surprise us, right? After all, that's just Moore's Law in action, isn't it? In 1965, an article in "Electronics" magazine by Gordon Moore, the future co-founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months. Or was it 24 months? Actually, nowhere in the article did Moore spell out that famous declaration, nor does the word 'law' even appear in it. Yet the idea has proved remarkably resilient over time, entering the zeitgeist and lodging like a stubborn computer virus you just can't eradicate. But does it hold true? Strangely, that seems to depend more than anything on whom you ask. 'Yes, it still matters, and yes, we're still tracking it,' said Mark Bohr, Intel senior fellow and director of process architecture and integration. 'Semiconductor chips haven't actually tracked the progress predicted by Moore's law for many years,' said Tom Halfhill, the well-respected chip analyst with industry bible the Microprocessor Report."
Number of components, not computing power (Score:2)
Number of components, not computing power, and the time-frame should be easy to figure out from the difference between 1965's number and the 65,000 predicted for 1975.
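For what it's worth, here's a minimal back-of-envelope sketch of that calculation in C. The starting figure of roughly 64 components for 1965 is my assumption (it's about what the article's graph implies), not something stated in this thread:

#include <math.h>
#include <stdio.h>

int main(void) {
    /* ~64 components in 1965 (assumed) growing to the 65,000 predicted for 1975 */
    double n1965 = 64.0, n1975 = 65000.0, years = 10.0;
    double doublings = log2(n1975 / n1965);   /* ~10 doublings */
    printf("implied doubling period: %.1f years\n", years / doublings);   /* ~1.0 */
    return 0;
}

On those assumed numbers, the original prediction works out to roughly a doubling every year, not 18 or 24 months.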
Re:Number of components, not computing power (Score:4, Insightful)
Adding components is easy. Making faster computers is not.
Re:Number of components, not computing power (Score:4, Insightful)
I remember worrying when they started making 16 and 20 MHz CPUs; I thought digital electronics wouldn't be very stable at that sort of clock speed.
Re: (Score:2)
It depends what you mean by "faster computer." Nobody expects clock speeds to advance much beyond the several GHz possible today. Therefore, more and more components are being devoted to parallel processing, such as multiple cores, pipelines, and processor threads.
It seems to me that chip designers like Intel, AMD and others are doing pretty well at getting more and more processing power out of each clock cycle, though I'd hesitate to call anything about chip design "easy." Writing software to take advantag
Re: (Score:2)
Nobody expects clock speeds to advance much beyond the several GHz possible today.
With the Sandy Bridge chips overclocking about 20% faster at the same temperature, it will only take about 3 more iterations before we are nearing 10GHz.
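A quick sanity check of that compounding, as a C sketch; the ~5 GHz starting point is an assumed typical Sandy Bridge overclock, not a figure from the post:

#include <stdio.h>

int main(void) {
    double ghz = 5.0;                   /* assumed current overclock ceiling */
    for (int gen = 1; gen <= 4; gen++) {
        ghz *= 1.2;                     /* 20% per iteration, compounded */
        printf("after %d iterations: %.1f GHz\n", gen, ghz);
    }
    /* prints roughly 6.0, 7.2, 8.6, 10.4 GHz */
    return 0;
}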
Re: (Score:2)
Moore's law is dead in everything except transistor count.
Here's the picture I was looking for, smack in the middle of:
http://www.gotw.ca/publications/concurrency-ddj.htm [www.gotw.ca]
Above 4 GHz, the power loss due to transistor current leakage suddenly starts going way up and becomes the most significant term.
It will take a fundamental change in the way we build transistors to get any kind of efficiency above 4 GHz... maybe photonic or micromechanical gates.
Re: (Score:2)
I think you'll see the move to photonics, with in-silicon light generation/detection used to push bits around. No current leakage, no heat issues (ergo, more reliable, longer-lasting equipment). Intel is well on its way with Light Peak.
Re:Number of components, not computing power (Score:5, Funny)
Re: (Score:3, Insightful)
Re: (Score:2)
That's called "enterprise software."
Re:Number of components, not computing power (Score:5, Insightful)
You're either trolling or looking at it the wrong way.
More efficient software means we could probably run tomorrow's software with yesterday's hardware.
Instead, because of bloat, we're stuck running yesterday's software with tomorrow's hardware.
When put in the mobile context, it also means shorter battery life.
Re:Number of components, not computing power (Score:5, Insightful)
Well said.
I'm often modded troll when I claim that every advancement in computer processing power has been absorbed by the look and feel of the OS interface.
Recalculating the spreadsheet (or just about any other real work) seemingly takes just as long (short?) as ever.
I know it's not provably true, but it sure seems that way.
Re: (Score:2)
Probably because your spreadsheet has updated at the same rate for the past 15 years: instantaneously.
I can tell you that my number one cycle killer, the one I actually wait on, has been reduced tremendously over time: compiling. There was an XKCD on this, but back in the day, even for smallish projects, a full build meant you would go get coffee, probably even lunch, and possibly just go home for the day or bother your coworkers (how far we have come from the days of the P4, when a process could peg your computer!)
Re: (Score:2)
Kernel compiles were my benchmark. Yes, they seem faster now, but not relative to the increase in processing speed claimed by the newer machines.
But I suspect almost all of the improvements in that arena came from faster IO, faster disks, bigger memory for file buffers, etc.
Re: (Score:2)
I'm often modded troll when I claim that every advancement in computer processing power has been absorbed by look and feel of the OS interface.
To put it another way, all the processing power recently has gone toward making computers more accessible and engaging to people, making them feel like actual appliances instead of obscure gadgets. Sign me up!
However, I don't think you're looking at the whole picture. A typical data center's 32-core powerhouses aren't rendering GUIs. The client machines spend their additional cycles making things look better while the servers spend their additional cycles crunching additional data. Again, I don't seem to hav
Re:Number of components, not computing power (Score:5, Insightful)
The problem with that is that there is no objective definition of software bloat. It's just Slashdot shorthand for "spending time on stuff I personally don't find important".
Your "bloat" is another user's "better interface" or "better security" or "maintainable code".
Re:Number of components, not computing power (Score:4, Insightful)
That's sometimes true; the trouble is that I'm finding software is often less reliable and slower than the same kind of software a decade ago. More maintainable code should mean a more reliable product. I don't see why security necessarily yields terribly slower software and much larger file sizes, unless you mean constant malware scanning. And software available today isn't necessarily more usable to novices or the experienced, so the suggestion of a better interface doesn't necessarily hold true either.
Re: (Score:2)
When put in the mobile context, it also means shorter battery life.
It also provides incentive for hardware makers to keep focusing on performance rather than other qualities. It's to the point of the hardware literally catching on fire, killing people.
Sure, if you sit with your laptop in your lap, it's smart to make sure it gets properly ventilated, but WHY SHOULD THE USER HAVE TO CARE? Had software systems remained efficient, hardware manufacturers would soon have had to differentiate themselves on other qualities such as cool and quiet, case design or other things users actua
Re: (Score:2)
Instead, because of bloat, we're stuck running yesterday's software with tomorrow's hardware.
Bullshit!
You're still using a mechanical hard-disk, right? That's the component that's bottle-necking your PC, not programmers!
Get an SSD to find out what your CPU and your "bloated" software is capable of.
On my old laptop, with an old 2.6 GHz Core 2 Duo CPU, which is several generations older than the CPU in the article, if I double-click the Word 2010 icon, it launches instantly. Not after a second or two, instantly. It's like notepad. I get the pretty transparent window borders, the ribbon, font-smoothi
Re: (Score:2)
On my work computer Adobe Acrobat takes 10-25 seconds to load. Yesterday I tried to print a single page from the middle of a PDF and it took a good 10 minutes to process it, but only 5 seconds to actually print it when Windows finally had control of the file. The processor wasn't loaded; it's just that Adobe Acrobat is a piece of shit software.
Hard drives are one of the two bottlenecks in current computers; the other is RAM. RAM tends to be several clock cycles behind the processor. This is why I have high
Re: (Score:2)
On my work computer...
I'll stop you right there. It has a mechanical drive, right?
Think about that for a second. You've got a solid state processor with hundreds of millions of parts switching billions of times per second waiting on... a single, moving, mechanical part that can't exceed about a hundred movements per second.
Your computer's mechanical hard disk has a latency about a million times higher than your computer's processor or memory!
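Rough numbers behind that ratio, as a sketch; the latency figures below are typical ballpark values I'm assuming, not measurements:

#include <stdio.h>

int main(void) {
    double seek_ms  = 8.0;    /* assumed mechanical disk seek + rotational latency */
    double cycle_ns = 0.4;    /* ~2.5 GHz CPU clock period */
    double dram_ns  = 60.0;   /* assumed DRAM access latency */

    printf("disk vs CPU cycle: %.0fx\n", seek_ms * 1e6 / cycle_ns);  /* ~2e7 */
    printf("disk vs DRAM:      %.0fx\n", seek_ms * 1e6 / dram_ns);   /* ~1.3e5 */
    return 0;
}

So "about a million times" is in the right ballpark, give or take an order of magnitude depending on whether you compare against the CPU or the memory.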
Mechanical disk performance has not improved in over a decade! Ignore the benchmarks abo
Re: (Score:2)
You're still using a mechanical hard-disk, right? That's the component that's bottle-necking your PC, not programmers!
Actually, if you're having performance issues on a modern computer that are solved by swapping out your system/application disk with an SSD chances are your real problem is that you're low on RAM.
The issue most people have is that they're using all of their RAM or close to all of it with just their active applications (active here meaning "the window at the top, those other apps that are running are all swapped out") so every time they hit alt-tab/cmd-tab or start another app the disk starts churning like c
Re: (Score:2)
You are comparing software start-up time to code efficiency.
Efficient code would allow you to run the exact same software (from your point of view) on a 1GHz single-core CPU.
It's a valid comparison. The startup of Word is mostly single-threaded, and my CPU is 2.6 GHz, so on a 1 GHz processor that measurement of 0.4 seconds would be... about 1 second. Oh no... the horror! I'd have to wait an entire second to launch one of the most complex pieces of software on my computer.
Processors have been getting steadily faster, but most people's perception of their computer's speed has been completely dominated by the disk speed, so they haven't noticed.
When my customers complain about how
Re:Number of components, not computing power (Score:4, Insightful)
I think you just refuted your own point. The most complex piece of software on your computer is a word processor. That's the problem. Things which are conceptually simple have become so monstrously bloated that they're now "complex software".
Re: (Score:2)
You mean like how back in the day you had 32 megs of RAM running Windows, with a 100 MHz processor, and you could pile on a new program and the computer would swap 50 megs to disk and tick along just fine, mostly?
And nowadays we have 4 gigs of RAM, and the computer uses 500 megs of swap and every time you alt-tab you have to wait 4-5 seconds for everything to load back into RAM as windows slowly get redrawn, and everything runs slow... but wait! Developers are piling more and more on, since there's 4 gigs o
Re: (Score:2)
Some of us might want to run more than one major application at once, or maybe use our RAM for our own data instead of hundreds of megabytes of bloatware.
Re: (Score:2)
I thought it was that the number of transistors on a chip will double (or more, due to major breakthroughs) every 2 years, which means whatever they had 2 years ago would need to have doubled. So when people asked if Intel would have 1 billion transistors on a 1-inch chip by 2010, they said "Already done it!"
Re: (Score:3)
Conveniently, the actual 1965 article is linked in the summary above. Specifically, it was about the cost-effectiveness of adding components to an integrated circuit. Circuits with few components aren't cost-effective to build, and circuits with more components have lower yields, making them not ideal either. At the time, the component count was doubling on a yearly basis, and Moore predicted that this would continue for the near term (5-10 years), but that the longer-term trend was less certain. And so it
Re:Number of components, not computing power (Score:5, Insightful)
I remember in the early 90s, processor performance was easily doubling every 2 years, and it certainly hasn't been that way the last 4-5 years.
It was easier to measure then, because performance was directly related to clock rate. Now that clock has stopped going up, performance depends on parallel processing.
Then there's a catch: parallel processing depends on the software. Doubling the clock rate will probably double the performance of almost any software that runs on the computer; doubling the number of cores, not necessarily. Luckily, the most demanding tasks in computing are those that can be parallelized.
With the advent of the GPGPU the future looks bright for Moore's Law. I've recently run some benchmarks using Cuda to perform FFTs and compared it to the data I have from my old computers. In my case, at least, my current computer is above the curve predicted by applying Moore's Law to the computers I've had in the last 25 years.
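The poster's actual benchmark code isn't shown, but a minimal sketch of that kind of cuFFT timing run might look like the following (C host code built with nvcc and linked against cufft; the transform size is arbitrary, and the device buffer is left uninitialized since only the transform itself is timed):

#include <cuda_runtime.h>
#include <cufft.h>
#include <stdio.h>

#define NX (1 << 20)   /* arbitrary 1M-point transform */

int main(void) {
    cufftComplex *data;
    cudaMalloc((void **)&data, sizeof(cufftComplex) * NX);

    cufftHandle plan;
    cufftPlan1d(&plan, NX, CUFFT_C2C, 1);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, 0);
    cufftExecC2C(plan, data, data, CUFFT_FORWARD);   /* in-place forward FFT */
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("1D FFT of %d points: %.3f ms\n", NX, ms);

    cufftDestroy(plan);
    cudaFree(data);
    return 0;
}

Comparing a timing like this against old CPU results is the kind of exercise that puts a GPGPU machine above the extrapolated Moore's Law curve.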
Re:Number of components, not computing power (Score:5, Interesting)
It was easier to measure then, because performance was directly related to clock rate.
It was easier to measure then because real world performance was actually doubling and was apparent in most benchmarks.
Now that clock has stopped going up, performance depends on parallel processing.
Performance isn't doubling anymore. Cores are increasing, and the pipelines are being reworked, cache is increasing, but PERFORMANCE isn't doubling.
Then there's a catch, parallel processing depends on the software.
It depends on the task itself being parallelizable in the first place, and many, many tasks aren't.
Luckily, the most demanding tasks in computing are those that can be parallelized.
Unfortunately it's the aggregate of a pile of small independent undemanding tasks that drags modern PCs to a crawl. And these aren't even bottlenecking the CPU itself... to be honest I don't know what the bottleneck is right now in some items... I'll open up the task manager... CPU utilization will be comfortably low on all cores, the hard drive lights are idle so it shouldn't be waiting on IO... and the progress bar is just sitting there... literally 20-30 seconds later things start happening again... WHAT THE HELL? What are the possible bottlenecks that cause this?
Re: (Score:2)
// TODO remove this
sleep(30);
Re: (Score:3)
I know you said that it shouldn't be I/O, but I would still bet money that if you put an SSD in there you'd notice a dramatic improvement. (Although, you didn't mention RAM usage, but even then the SSD would help since it would speed up swap.)
Re: (Score:3)
I know you said that it shouldn't be I/O, but I would still bet money that if you put an SSD in there you'd notice a dramatic improvement. (Although, you didn't mention RAM usage, but even then the SSD would help since it would speed up swap.)
However, when I observe PCs stall with no significant CPU activity and no disk activity... if it were thrashing RAM there should be disk activity. No, those stalls have got to be something else.
Personally, though, yes, an SSD is my next upgrade, and I agree with you th
Re: (Score:3)
Deadlocks, badly implemented (blocking) loops, blocking or slow IPC, blocking file io (where it has to wait for the hard drive to return confirmation of the write), waiting on interrupts from various sources, exhausted entropy etc. etc.
There are a lot of things that programmers and compilers do wrong. There are a lot of things that can't be parallelized yet and there is a lot of contention over a few limited resources (RAM, hard drive, external IO) which makes a computer slow.
Re: (Score:2)
Performance isn't doubling anymore. Cores are increasing, and the pipelines are being reworked, cache is increasing, but PERFORMANCE isn't doubling.
It really is, if you have software that takes advantage of all those cores. If you have a single-threaded task, then you probably aren't seeing an increase in performance of that task, but you can now run that task plus something else at the same time.
I have been encoding audio to Dolby Digital recently, and the single-threaded compressor finished the job in about 1-2% of the length of the audio, so, 1 hour of audio took about a minute to encode. Although it has been available for a long time, I had not tr
Re: (Score:2)
It really is, if you have software that takes advantage of all those cores.
And if that is the only software you use. Otherwise you get a performance increase in one or two activities, for a net increase in total performance that is distinctly less than DOUBLE.
There are many other examples like mine that show overall performance is increasing. Even games now benefit from more cores, although 4 is about the limit of increasing performance for most current titles.
Yes, overall performance is definitely increasi
Von Neuman Bottleneck. (Score:3)
Unfortunately it's the aggregate of a pile of small independent undemanding tasks that drags modern PCs to a crawl. And these aren't even bottlenecking the CPU itself... to be honest I don't know what the bottleneck is right now in some items... I'll open up the task manager... cpu utilization will be comfortably low on all cores, hard drive lights are idle so it shouldn't be waiting on IO... and the progress bar is just sitting there... literally 20-30 seconds later things start happening again... WHAT THE
Re: (Score:2)
> It was easier to measure then, because performance was directly related to clock rate. Now that clock
> has stopped going up, performance depends on parallel processing.
Very true, but is it still "Moore's Law" if you reformulate it to take new paradigms into account? When Einstein adopted the Lorentz transformations to describe relative motion, nobody referred to those equations as "Newton's Laws".
It's splitting hairs, but I don't think it's all that useful to call it a "law" anyway. I always thought of
Re: (Score:2)
There is no exact law.
A Better Question: (Score:5, Insightful)
Seriously, hardware is always getting faster. Why do we need a law that states this? Which is a more likely scenario for Intel: "Ok, we need to make our chips faster because of some ancient arbitrary rule of thumb for hardware speed.", or "Ok, we need to make our chips faster because if we don't, AMD will overtake us and we'll lose money."?
Re: (Score:2)
Re:A Better Question: (Score:5, Interesting)
Re:A Better Question: (Score:5, Informative)
The problem with products from Adobe and Microsoft is that the codebase is massive and it can be a pain to fix and optimize one part without breaking something else. Software vendors deal with the same issue of needing to be faster than the competitor as Intel/AMD. If Adobe and Microsoft don't, I think it speaks more to the lack of competition in some of their product areas than it does to simply being lazy.
Re: (Score:3)
Re: (Score:3)
If they get there, they stop trying as they reached the prophecy.
If they do not get there, they will try harder to reach the prophecy.
Now the question is whether the self-fulfilling prophecy speeds up the process or slows it down in the long term.
Let's try it out:
-"Boss, I have this fantastic idea for a chip that will have ten times more components than the ones we have today".
-"No way! That would violate Moore's Law, make it just twice the number of components!"
No, I don't think Moore's Law is slowing down progress.
Re:A Better Question: (Score:5, Insightful)
My understanding was that the prediction was indeed important, for inter-business communication. Say, for example that a company purchases cpus from a vendor, for use in its product when it releases two years from now. The product development team will shoot for the expected specs on the cpus at that future date, so that the product will be current when it hits the market. Such predictability is very important for some.
Re: (Score:2)
Actually, it's important for intra-business communication too. Intel takes around five years to bring a chip to market. First, marketing works out what kind of chip will be in demand in five years, and roughly how much they will be able to sell it for. They produce some approximate requirements and the design team starts work. In the meantime, the process team works on increasing the number of transistors that they can fit on a chip, improving yields, and so on.
At the end of the five years, they prod
Re: (Score:2)
Without intending to start a flame war, I wish the programming side of computing was as interested in making things smaller and faster in code.
I don't think it's as bad as all that. Believe me, I would love it if all the software I used were trimmed-down and brilliantly optimized. There is indeed quite a lot of software that is bloated and slow. But it really just comes down to value propositions: is it worth the effort (in programming time, testing, etc.)? For companies, it comes down to whether making the software faster will bring in more sales. For many products, it won't bring in new sales (as compared to adding some new feature), so they don
Re: (Score:3)
Fact is, software development has relied on exponential hardware speedup for the last 40 years, and that's why Moore's Law *is* still relevant.
If a global computer speed limit is nigh then mainstream computing will slowly decelerate. Why? 1) Perpetual increase of bloat in apps, OSes, and programming languages. 2) Ever more layers of security (e.g. data encryption and the verification & validation of function calls, data structures, and even instructions). 3) Increasing demands of interactivity (e.g.
Haven't hit "diminishing returns" yet (Score:2)
Re: (Score:2)
I wish the programming side of computing was as interested in making things smaller and faster in code.
They are, just not everywhere. There just aren't that many people who care about how fast their spreadsheet is any more, and it isn't nearly as profitable to have devs optimizing for speed versus adding features. It's hard enough to get bugs fixed.
Re: (Score:2)
Yes, I think it does matter, because eventually the law will 'fail'.
I have no idea at what point that will come but it will certainly be an important inflection point for technology.
Re: (Score:2)
The utility of Moore's Law is not that "hardware is always getting faster", but rather, it is a good rule of thumb for the specific rate of change.
You can also throw in "transistor count != speed", but that's been beaten to death already.
Re:A Better Question: (Score:5, Informative)
Re: (Score:2)
It does, since something needs to counter the increasing sluggishness of software.
http://en.wikipedia.org/wiki/Wirth's_law [wikipedia.org]
Re: (Score:2)
Actually, Moore's law does not imply anything about computing power. It says that the number of transistors doubles every 18
No, a different question. (Score:2)
Sure, I don't care why intel is making their chips faster, but I would like to know how much faster and how?
If you have a software project, scheduled for three years of development, can I rely on my average customer to be running computers 2 ^ 1.5 times as fast as they are now, or will multi-core machines proliferate?
As an intel share holder, all I'd need is your question, but as a computer user looking to the future, I'm more interested in the answer to the original question.
Again? (Score:5, Funny)
Re: (Score:3)
It's also been so frequently misused that Halfhill was forced to define Moron's Law, which states that "the number of ignorant references to Moore's Law doubles every 12 months."
Re: (Score:2)
Cole'slaw.
Re: (Score:2)
Re: (Score:2)
And if it were studied by someone whose last name was Moore, it would be Moore's Moore's law's law.
Well, if it didn't... (Score:3)
Well, if it didn't, then would we still be talking about it, forty-five years later?
Answers (Score:2)
The real problem: Access Speeds (Score:2)
Re:The real problem: Access Speeds (Score:5, Funny)
How often do you really max out your CPU cycles these days anyway?
Every-fuckin'-time Flash appears on a web page!
Re: (Score:2)
+1
Re: (Score:3)
How often do you really max out your CPU cycles these days anyway?
All the time. Why?
Re: (Score:2)
The other side of this is that a general PC might not get much more powerful, but notebooks, tablets, and phones will be able to pack the same amount of power into smaller chips, resulting in reduced power consumption or i
The real problem: Speed of Light (Score:3)
Even if you had an arbitrarily powerful CPU, you'd still have to load in everything from memory, hard disk, or network sources (i.e. all very slow)
Considering that light travels only 30 cm per nanosecond in a vacuum, the maximum practical clock speed depends on how far away your memory is. At a 3 GHz clock rate, a request for data from a chip that's just 5 cm away on the circuit board will have a latency longer than the clock period.
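A quick check of that arithmetic, as a C sketch (illustrative numbers; real board traces carry signals at well below c, so the actual latency is worse):

#include <stdio.h>

int main(void) {
    double c_cm_per_ns = 30.0;   /* light travels ~30 cm per ns in vacuum */
    double clock_ghz   = 3.0;
    double distance_cm = 5.0;

    double period_ns  = 1.0 / clock_ghz;                 /* ~0.33 ns per cycle */
    double round_trip = 2.0 * distance_cm / c_cm_per_ns; /* ~0.33 ns, best case */

    printf("clock period: %.2f ns, round trip to 5 cm: %.2f ns\n",
           period_ns, round_trip);
    return 0;
}

Even at vacuum speed, the 10 cm round trip already eats a full clock period at 3 GHz.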
The only solution to this problem is increasing the on-chip cache. But that will depend on having software that manages the cache well, i.e. more complex algorithms. In that case, since you have to optimize the software anyhow,
Re: (Score:3)
I bet that in the future we will see chips with simpler (read RISC) architectures with more on-chip memory and special compilers designed to optimize tasks to minimize random memory access.
I bet that in the future, you still won't know how much chip space is used by its various components, leading you to continue believing that RISC is some sort of space-saving advantage.
With any instruction set, execution can only be as fast as instruction decoding. Both RISC and CISC machines now have similar execution units, so CISC architectures can feed more execution units per instruction decoded.
To get the same sort of raw performance on RISC, the decoder needs to be faster than on CISC. When the
Re: (Score:2)
Check the Wiki sources. (Score:2)
Is it really that difficult?
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.[7]
Original Article:
Cramming more components onto integrated circuits
Article 2: Excerpts from A Conversation with Gordon Moore: Moore's Law [intel.com]
Re: (Score:2)
ftp://download.intel.com/museum/Moores_Law/Articles-Press_Releases/Gordon_Moore_1965_Article.pdf [intel.com]
Stupid not previewing...
It's linked in tf summary, it's TFA (Score:2)
Head explodes
Re: (Score:2)
/. doing its part (Score:3)
It's also been so frequently misused that Halfhill was forced to define Moron's Law, which states that "the number of ignorant references to Moore's Law doubles every 12 months."
There are only 13 posts so far, and yet /. is still on track to meet this law. Great job everyone.
Ask a vague question, get a vague answer. (Score:5, Insightful)
If you extrapolate from the date that Moore first made the prediction, using the transistor counts of the day and a particular scaling exponent ("doubling every two years"), then the extrapolated line, today, will not exactly match current transistor counts. So it fails.
But if you use the "Law" in its most general form, which is something like "computing power will increase exponentially with time," then yes, it's basically true. One of the problems with this, however, is that you can draw a straight line through a lot of datasets once they're plotted on a log-linear scale, and read off an exponential growth rate. To know whether the data "really is" growing exponentially, you need to do some more careful statistics, and decide what you think the error bars are. Again, with sufficiently large error bars, our computing power is certainly increasing exponentially. But, on the other hand, if you do a careful fit you'll find the growth rate is not constant: it actually changes in different time periods (corresponding to breakthroughs and the subsequent maturation of technologies, for instance). So claiming that the history of computing fits a single exponent is an approximation, at best.
So you really need to be clear what question you're asking. If the question is whether "Moore's Law" is really an incontrovertible law, then the answer is "no". If the question is whether it's been a pretty good predictor, then the answer is "yes" (depending on what you mean by "pretty good", of course). If the question is "Does industry still use some kind of assumption of exponential scaling in their roadmapping?" the answer is "yes" (just go look at the roadmaps). If the question is "Can this exponential scaling continue forever?" then the answer is "no" (there are fundamental limits to computation). If the question is "When will the microelectronics industry stop being able to deliver new computers with exponentially more power?" then the answer is "I don't know."
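As an illustration of the kind of fit being described, here's a minimal least-squares sketch in C; the transistor counts are rough, commonly cited figures for a handful of Intel parts, used purely as stand-in data:

#include <math.h>
#include <stdio.h>

int main(void) {
    /* approximate (year, transistor count) pairs; illustrative only */
    double year[]  = {1971, 1978, 1985, 1993, 2000, 2010};
    double count[] = {2.3e3, 2.9e4, 2.75e5, 3.1e6, 4.2e7, 1.17e9};
    int n = 6;

    /* least-squares fit of log2(count) vs. year */
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        double x = year[i], y = log2(count[i]);
        sx += x; sy += y; sxx += x * x; sxy += x * y;
    }
    double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx); /* doublings per year */
    printf("doubling period: %.2f years\n", 1.0 / slope);
    return 0;
}

On these illustrative numbers the slope works out to a doubling roughly every two years; whether a single rate really fits the whole history is exactly the question raised above.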
It's not the law of gravity (Score:2)
It's not a natural law. It's neither a law of physics nor one of biology. Heeding or ignoring it has no real meaning. And, bluntly, I doubt anyone but nerds that double as beancounters really care about Moore's "law".
Computers have to be fast enough for the tasks they're supposed to solve. Software will grow to make use of that speed (or waste it on eye candy). Nobody but us cares about the rest.
Wait just one minute here! (Score:4, Funny)
Apparently, I have entered the Bizarro World. Or perhaps the Mirror Universe. I can't be dreaming, because I'm not surrounded by hot women in tiny outfits, but something is most definitely WRONG here, and I aim to find out what.
Re: (Score:2)
It's neither scientific nor accurate, but other than that, yes.
Re: (Score:3)
Moore's law is not a law (Score:4, Insightful)
At best it is a self-fulfilling prophecy, as the 'law' is now used as a standard for judging the industry, which strives to keep up with the predictions.
Re: (Score:3)
It should be pointed out that the various social observations that have often been termed 'Laws' are not always true. For instance, Parkinson's Law states that work expands to fill the time and resources available. It's usually true. But sometimes, it's not, because it's trying to describe something about a system that nobody's been able to fully explain, specifically how an organization / business / bureaucracy actually functions.
That doesn't make them useless, but it does mean you have to treat them as tr
Re: (Score:2)
oh man, it wasn't about processing power, it was about the transistor density on the chip.
At least state the law reasonably accurately (Score:2)
1) A law does not imply causality.
2) Moore's Law does not state that processing power doubles every 2 years. It states that the number of transistors that can be placed reasonably economically on an integrated circuit doubles every 2 years. It's not the same thing.
Maybe it's My Age... (Score:2)
...but I gave up caring about processor speed about 10 years ago.
Just make a graph (Score:2)
I clicked only wanting one thing, a graph with three lines showing: Moore's Law, transistor count, and computing power of each processor.
Re:Just make a graph -- goog is your friend... (Score:3)
you keep using that word 'law' (Score:2)
None of these are 'laws' in the sense that you get punished for breaking them. Not Moore's, not Godwin's, etc. They are more 'generalizations' than anything else. Moore's, especially, could be more accurately termed an 'observation', as that's what was going on at the time he made it. Everyone repeat after me: "Moore's Observation"
There we go.
Even "Moore's Average" would be more accurate.
Not who you ask, but what you ask... (Score:4, Informative)
the future founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months
What Gordon Moore actually said was that complexity would double every year. Moore was also relating cost at that time, but cost doesn't actually scale well, so most people don't include cost in modern interpretations of Moore's Law.
For circuit complexity, Moore's Law (with the 18 month amendment) seems to still hold true. However, we are fast approaching some physical limits that may cause the doubling period to increase.
Performance is commonly associated with Moore's Law (as you mention). However, performance is a function of clock speed, architecture, algorithm, and a host of other parameters, and it certainly does not follow Moore's Law... It never really has, even though people still like to think it does... or should...
CharlieMopps's Law (Score:3)
Re: (Score:2)
Shut the Fuck Up (Score:2)
"Moore ... predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months. Or was it 24 months? Actually, nowhere in the article did Moore actually spell out that famous declaration, nor does the word "law" even appear in the article at all."
"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate can be expected to continue, if not to increase."
Moore's
more about market/marketing forces than technology (Score:2)
I'd say that the fact that computing PRODUCTS have largely tracked "Moore's Law" says more about market forces and competition, and "Wintel" (Microsoft/bloat/software purchases, etc.), than it says about physics, engineering, and computing technology... It says more about what kinds of products and features are needed to drive the IT money machine to spend and spend, even though the computing power needed to write letters, emails, and most documents was attained more than a decade ago. Don't forget about adver
Self fulfilling prophecy (Score:2)
Chip makers intentionally regulate (slow down) their advancement to meet Moore's Law because it allows them to make greater profits by forcing users to upgrade on a regular basis, while still giving them enough time to thoroughly test the next iteration and make a profit on it.
Re: (Score:2)
Re:Moores law of first posts (Score:5, Insightful)
I feel like I've been reading this article every six months for the last ten years.
Re: (Score:2)
I feel like I see twice the rate of stories about this every 18 months or so.
Re: (Score:2)