Intel

Andy Grove Says End Of Moore's Law At Hand 520

Jack William Bell writes "Intel chief Andy Grove says Moore's Law has reached its limit. Pointing to current leakage in modern chips, Grove said, 'Current is becoming a major factor and a limiter on how complex we can build chips.' He said the company's engineers 'just can't get rid of' power leakage. But, of course, this only applies to semiconductor chips; there is no guarantee that some other technology will not take over and continue the march of smaller, cheaper and faster processors. I remember people saying stuff like this years ago, before MOSFET." Update: 12/11 22:01 GMT by T : Correction: the text above originally mangled Andy Grove's name as "Andy Moore."
  • "The End" (Score:4, Insightful)

    by FosterSJC ( 466265 ) on Wednesday December 11, 2002 @02:49PM (#4864088)
    The end of Moore's Law is heralded on Slashdot every two months or so; it comes at the hands of new materials (copper, etc.), new layering techniques, the ever-popular quantum computing, and so on. Frankly, it doesn't seem to me to be that useful a benchmark anymore. The article says the end will come sooner, but I foresee that in 7 to 10 years the physical production, leakage control and general quality of chips will be so perfected that Moore's Law will no longer be applicable to silicon. But by then, new sorts of chips will be available to pick up the slack. So let us say farewell to silicon, and enjoy it while it lasts. It is like the fossil-fuels problem, really, except the industry is slightly more willing to advance, having set itself a healthy pace to keep years in advance.
  • by sisukapalli1 ( 471175 ) on Wednesday December 11, 2002 @02:49PM (#4864091)
    I hope this means back to actually finding ways of optimizing code, and not the standard "We can throw money at it", or "Next year computers will be twice as fast".

    However, maybe better processor architectures and clusters will keep the march going.

    Either way, I believe some progress will be made.

    S
  • by jazman_777 ( 44742 ) on Wednesday December 11, 2002 @02:51PM (#4864123) Homepage
    If it's the end, it wasn't a law to start with, then, was it?
  • Arrogant Intel (Score:2, Insightful)

    by Bendebecker ( 633126 ) on Wednesday December 11, 2002 @02:52PM (#4864137) Journal
    Just because their engineers can't solve the problem, the problem must be unsolvable.
  • Thank Goodness! (Score:3, Insightful)

    by dokebi ( 624663 ) on Wednesday December 11, 2002 @02:53PM (#4864143)
    Moore's Law is finally coming to an end. Seriously, the continuous and rapid advance of processing power is the one thing that's holding back affordable, universal and pervasive computing in schools. These cash-strapped schools cannot afford to replace textbooks every two years, let alone computers that cost hundreds more. Things are better now because relatively useful computers can be had very cheaply, compared to just a few years ago, but scrapping Moore's Law altogether is even better. Steve Wozniak also agrees [wired.com]
  • Measured by what? (Score:2, Insightful)

    by narq ( 464639 ) on Wednesday December 11, 2002 @02:53PM (#4864152) Homepage
    While Intel's batch of in-design processors may not keep up, and the engineers' current take on things seems dim, I would think a longer period would have to go by before it could be determined whether Moore's Law will hold. New designs have caused great jumps in the past that have kept the overall pace of things in line with Moore's Law.
  • Well maybe... (Score:5, Insightful)

    by Chicane-UK ( 455253 ) <chicane-uk@@@ntlworld...com> on Wednesday December 11, 2002 @02:54PM (#4864156) Homepage
    ..if Intel and AMD hadn't got locked into that stupid GHz battle and had instead concentrated on optimizing their CPU designs (rather than just ramping up the speed by silly amounts), then there might still have been a few more years left before this became such a problem.

    Maybe that's the way forward? Optimisations and improvements to the chips instead of raw clock speed...?
  • Re:Newton? (Score:3, Insightful)

    by Anne_Nonymous ( 313852 ) on Wednesday December 11, 2002 @02:57PM (#4864198) Homepage Journal
    We don't. We just need less bloatware.
  • by corsec67 ( 627446 ) on Wednesday December 11, 2002 @02:58PM (#4864201) Homepage Journal
    Hypothesis is more appropriate. He observed something, that processor power doubles every 18 months, and said that the trend might continue. No one has proved it yet.
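    For concreteness, here is a rough sketch in Python of what that doubling implies if the trend holds; the 42-million-transistor starting point (roughly a 2002-era Pentium 4) is only an assumed figure for illustration:

```python
# Projection of the observed trend: transistor counts doubling
# every 18 months. The starting count is an assumption.
def projected_transistors(start_count, years, doubling_months=18):
    doublings = years * 12 / doubling_months
    return start_count * 2 ** doublings

for years in (3, 6, 9):
    count = projected_transistors(42e6, years)
    print(f"{years} years out: ~{count / 1e6:.0f}M transistors")
```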
  • Re:Well maybe... (Score:5, Insightful)

    by dillon_rinker ( 17944 ) on Wednesday December 11, 2002 @03:04PM (#4864291) Homepage
    If there'd been no competition, you're absolutely correct that we'd have had better CPU designs, and overall performance would likely have been orders of magnitude below what it is now.

    So, speed and feature size are as good as they're going to get, and they were easy to do. Now we can work on the hard stuff with the benefit of all the processor power we've got sitting around unused.

    Don't optimize the hard stuff until you've optimized the easy stuff.
  • by Drakonian ( 518722 ) on Wednesday December 11, 2002 @03:08PM (#4864334) Homepage
    If you restrict it to silicon-based ICs as we know them today, this may be right. Intel is the expert on this after all, and I'm willing to take their word.

    However, if you define Moore's Law as computational capacity doubling every 18 months, then it is very unlikely to end. If you project back to well before integrated circuits, or the law itself, computational capacity has been growing at this same exponential rate for many decades, even back to the earliest mechanical "computers". There will be something to replace the current paradigm; the paradigm has already changed numerous times without throwing off the exponential curve.

    For a fascinating look at this phenomenon and what it holds for the future, I'd recommend The Age of Spiritual Machines: When Computers Exceed Human Intelligence [amazon.com] by Ray Kurzweil.

  • Re:Well maybe... (Score:2, Insightful)

    by Anonymous Coward on Wednesday December 11, 2002 @03:09PM (#4864342)
    Your first accusation makes no sense. Raw speed improvements are one of the reasons Moore's Law has continued to hold. And one of the reasons they have been able to engage in the speed battle is because they HAVE been coming up with new technologies to advance the speeds possible (copper metallizations, silicon-on-insulator, etc). They weren't just adding more and more transistors (which isn't to say that they weren't doing that at the same time). So if they hadn't done the research now, they could have done it in the future, and thus there might still be a few more years left? That's what you're saying in your first paragraph? Well, it's backwards reasoning. Sure, if they hadn't already done all that work, they could use that work for future improvements. But maybe that just means they're ahead of schedule right now.

    However, your last comment is still valid. If they can't squeeze any more out of the fab processes, then they'll have to work more on design optimization. But that would hold true whether or not they had engaged in a "stupid GHz battle."

    Okay, not wasting any more time here...
  • Moore's Law (Score:5, Insightful)

    by avandesande ( 143899 ) on Wednesday December 11, 2002 @03:10PM (#4864353) Journal
    Is an economic law, not a physical one. Lack of demand for high-powered processors is going to slow the progression in processor speeds.
  • Re:Newton? (Score:2, Insightful)

    by stratjakt ( 596332 ) on Wednesday December 11, 2002 @03:17PM (#4864416) Journal
    The point the poster was making was "I have no use for a high-end CPU, therefore no one does". Which is doofy.

    If all you do is email and run excel, then don't get a top of the line PC.

    I have a very small lawn (and a son). But I don't go around moaning that they should stop making better lawnmowers.

    But then, if I were the type to be jealous because someone else has a better lawnmower (or faster computer) than me, I'd probably get all upset and jump up and down that I can't afford to constantly have the latest model.
  • by Animats ( 122034 ) on Wednesday December 11, 2002 @03:17PM (#4864421) Homepage
    "Grove suggested that Moore Law regarding the doubling of transistor densities every couple of years will be redundant by the end of the decade." Not this year, eight years out.

    That's about right. It's a bit more pessimistic than the SIA roadmap, but it's close. Grove was just stating, for a general audience, what's accepted in the semiconductor industry. Optical lithography on flat silicon comes to the end of its run within a decade. Around that point, atoms are too big, and there aren't enough electrons in each gate.

    There's been a question of whether the limits of fabrication or the limits of device physics would be reached first. Grove apparently thinks the device problem dominates, since he's talking about leakage current. As density goes up, voltage has to go down, and current goes up. A Pentium 4 draws upwards of 30 amps at 1.2 volts. We're headed for hundreds of amps. It's hard to escape resistive losses with currents like that.
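    (Back-of-the-envelope for the figures above, in Python: chip power is P = V * I, and the loss in the power-delivery path scales as I^2 * R. The 30 A / 1.2 V numbers are from the comment; the path resistance is an assumed, illustrative value.)

```python
# Power and resistive-loss arithmetic for the figures above.
V, I = 1.2, 30.0    # volts, amps (Pentium 4 figures quoted above)
R_path = 0.001      # ohms, assumed power-delivery resistance

print(f"chip power: {V * I:.0f} W")
print(f"delivery-path loss: {I ** 2 * R_path:.1f} W")

# Hold power constant and halve the voltage: the current doubles,
# but the I^2 * R loss quadruples.
V2 = V / 2
I2 = V * I / V2
print(f"at {V2:.1f} V: {I2:.0f} A, path loss {I2 ** 2 * R_path:.1f} W")
```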

    There are various other technologies that could lead to higher densities. But none of them are as cheap on a per-gate basis.

  • Re:Hmm.... (Score:3, Insightful)

    by panZ ( 67763 ) <matt68000@hotmail.com> on Wednesday December 11, 2002 @03:38PM (#4864619)
    How the heck is this +4 Interesting?? It's time to hunt some mods... it looks like a 16-year-old first poster trying to use the word "superconductor" in a sentence; it doesn't contribute to the discussion in any way.

    Yes, we are all curious to see what the future holds for superconductors and semiconductors (DUH). Yes, the two have nothing to do with each other in the context of this article. Superconductors can be used for transmission of signal, definitely important to computing, but not for creation of logic; fundamentally, you need something that passes information in one direction under certain conditions but doesn't pass information when the same outside conditions are applied in the opposite direction (e.g. semi-conductive materials).
    Moore's Law doesn't need to be revived yet. It still holds true. Will it fail eventually? Absolutely. But if Grove could pull his head out of his ass and see the wood for the trees, he'd realize that it isn't going to happen soon. People stopped laughing at quantum, bio and other computing theories a long time ago. If you step back and look at the big picture, you'll see Moore's Law happily marching along and geeks like us making it happen. Grove is just shouting "Look at me! I'm talking about theory of computing but saying nothing!"

  • It seems to me.. (Score:3, Insightful)

    by xchino ( 591175 ) on Wednesday December 11, 2002 @03:40PM (#4864640)
    Moore's Law hasn't reached any limits; we have. If this is a barrier we need to overcome, we will overcome it. We could be thousands of years ahead of our time in our technology if that were our priority as a race, or even as individual nations. If we *needed* faster, smaller processors, the government would pour money into R&D, more brilliant minds could be gathered to work cooperatively, and there would be results :)

    Seriously, we've risen above much greater challenges than this..

    It sorta sounds like Intel is about ready to quit trying to innovate; perhaps it's time for AMD to take the lead..
  • Re:Thank Goodness! (Score:2, Insightful)

    by cdunworth ( 166621 ) on Wednesday December 11, 2002 @03:51PM (#4864754)
    Seriously, the continuous and rapid advance of processing power is the one thing that's holding back affordable, universal and pervasive computing in schools. These cash-strapped schools cannot afford to replace textbooks every two years, let alone computers that cost hundreds more.

    I don't understand the implications herein. First off, if you already have computers sufficient for the tasks at hand, why should the release of more powerful computers compel anyone, including schools, to upgrade them? Moore's Law is not a legal dictum that states you must buy a better, faster computer when it becomes available.

    Secondly, Moore's Law actually enables schools to upgrade their equipment at bargain-basement prices if they remain comfortably behind the power curve and purchase boxes based on older CPUs for dirt cheap.

    Today, in general, I think the home computer has gotten far more powerful than most people really need. Seriously, a Pentium 4 3GHz computer for browsing the web and sending email? I'm a software developer and a lover of technology, and like many others before me I drool over newer, faster computers when they arrive. But I still haven't found a compelling reason to upgrade my 300MHz PII in five years. I can browse the web, compile modest Java and C++ programs in reasonable time, watch MPEG video clips, edit photos with Photoshop, etc. What, exactly, will kids in a school environment need to do that exceeds the capabilities of this "ancient" PC of mine? (Which, by the way, you can buy today for roughly the cost of an American History textbook.)

  • by theCat ( 36907 ) on Wednesday December 11, 2002 @03:56PM (#4864812) Journal
    Recall, AMD just said they are done trying to up clock speeds all the time. Now Intel is outing themselves, too. The fact that these companies are not saying things like "we need to go to other materials to get higher clock speeds" is because 1) it costs huge $$$ to research and develop new materials, 2) it costs serious $$$ to change fabs to use new materials, and 3) NOBODY (no, not even you) wants to continue to pay for increased clocks when there is almost zero benefit in real applications.

    Moore's Law is not dead. What is dead is the need for Moore's Law. I am not alone in noticing that, after 20 years of regular performance increases, things are now pretty good on the desktop, and excellent in the server room. Real changes now need to be in software and services. Further, high-performance computing is going the route of multiple cores per CPU, multiple CPUs per box, and clusters of boxes. The latter is probably the biggest innovation since Ethernet. So, who needs Moore's Law?

    Intel and AMD know *all* this. They want out of the clock race, and yesterday. They want to get into the next level of things, which is defining services and uses for their existing products. They are seeing the end of the glamour years of the CPU and the rise of the era of information appliances, which *must* be portable. Users *will* be far more sensitive to battery life and perceptions of performance (latency and ease of use) and far less sensitive to theoretical performance measures.

    Flame me if you like, but the geek appeal of personal computers is disappearing. Sure there will be people who fiddle with CPUs as a hobby, just as they did 30 years ago when the Apple computer was born to serve a small group of hobbyists. But is that the mainstream? Is that going to support Intel and AMD in their race? Are those companies going to promote a revolution in fab technology, to the tune of half a trillion dollars in investment and technology between them, just to support geeky hobbyists? They could, but they won't, because that is not the future. It is the past.

    The future will still be interesting, mind you, but the challenge has changed. A phone that fits in your rear molar and runs off chemical reactions with your own saliva looks far more lucrative to these companies than a CPU that runs at 100GHz and consumes as much power as an apartment complex.
  • Threshold Voltage (Score:5, Insightful)

    by Erich ( 151 ) on Wednesday December 11, 2002 @03:57PM (#4864823) Homepage Journal
    One of the problems with "leaky" parts is that the threshold voltages are kept very low. This makes the transistors switch much faster, but makes them leak current quite a bit.

    You can fairly easily raise the threshold voltage (for a process). It makes the chip slower, but leaks less current (and therefore usually uses less power). This is one of the key elements of "Low Power" processes like CL013LP.
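    (A rough sketch of that tradeoff, using the textbook subthreshold model: leakage falls about one decade for every "subthreshold swing" S of added threshold voltage, with S around 80-100 mV/decade at room temperature. The number below is an assumption, not a CL013LP figure.)

```python
# Subthreshold leakage vs. threshold voltage, textbook model.
S = 0.09  # volts per decade of leakage reduction (assumed swing)

def leakage_reduction(delta_vth):
    """Factor by which leakage drops when Vth rises by delta_vth volts."""
    return 10 ** (delta_vth / S)

for dv in (0.05, 0.10, 0.15):
    print(f"raise Vth by {dv * 1000:.0f} mV -> ~{leakage_reduction(dv):.0f}x less leakage")
```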

    For more information, the Britney Spears' Guide to Semiconductor Physics [britneyspears.ac] is sure to help.

    Interestingly, using leaky transistors that switch faster is a trick that has been used for a very long time. One of the reasons the Cray computers took so much cooling was that they didn't use MOSFETs; their whole process was based on PNP and NPN junction transistors. For those who don't know much about transistors: FETs (Field Effect Transistors) form a little capacitor that, when you charge it up (or don't charge it up, depending), lets current flow through on the other side. It takes a while to charge up the capacitor (time constant proportional to resistance times capacitance, remember!), but once it's charged there isn't any current that flows through, except the leakage current.
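    (To put a number on that RC point, a small sketch; both values are assumed, illustrative figures for a single small gate, not real process numbers.)

```python
import math

# A FET gate charges like a capacitor with time constant tau = R * C.
R = 1e3    # ohms, assumed driver resistance
C = 1e-15  # farads, assumed gate capacitance (1 fF)

tau = R * C
t90 = tau * math.log(10)  # time to reach ~90% of the supply voltage
print(f"tau = {tau * 1e12:.1f} ps, ~90% charged after {t90 * 1e12:.1f} ps")
```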

    At least, that's what I recall from my classes. I didn't do so well in the device physics and components classes.

  • by mshiltonj ( 220311 ) <mshiltonjNO@SPAMgmail.com> on Wednesday December 11, 2002 @04:06PM (#4864910) Homepage Journal
    My theory is that they are going to cut back on research, because 4-6 GHz chips are going to be fast enough for most things for most non-specialized people.

    So Joe Sixpack won't be motivated to cough up the dough for the upgrade.

    No money, no research, no new speed barriers broken.

    Specialized markets (CGI movie production? weather modelling?) will require lots more horsepower. But most corporate offices won't need it. In these cost-conscious times, that means they won't get it.

    So the market for the new high-end processors will be much smaller. This will probably lead to a stratification of the CPU market. Like the difference between Celeron/Duron and P4/Athlon, but with a much bigger difference.

    I read somewhere, maybe on Slashdot, that the next threshold of CPU speeds will be pushed by the rise of accurate, real-time natural-language voice-recognition software (and, along with it, language-to-language translation). That kind of processing requires lots of cycles, but has broad, not specialized, applications.

    The exception to this theory, and possibly the hole in it, is games. But DOOM III looks pretty damn impressive. What hardware does it require?

    Just idle speculation...
  • Um...no? (Score:3, Insightful)

    by sielwolf ( 246764 ) on Wednesday December 11, 2002 @04:27PM (#4865133) Homepage Journal
    I thought the root of Moore's Law wasn't the technology involved but the drive for improvement in computation. The chips may not improve beyond a certain point, but making a massively parallel system on a 2" x 2" card would still fall under Moore's Law. It is hardware independent.

    I'd never put a limitation on this, since somebody's going to come up with an idea to eke out more clocks.
  • by Anonymous Coward on Wednesday December 11, 2002 @04:32PM (#4865200)
    Code optimization isn't, and never was, ignored. There are many applications, mostly embedded computing, where optimization is a huge factor in development.
    However, at the risk of unleashing Slashdot Hell, I have to say that putting more emphasis into code optimization != progress. Let me explain why: Progress involves more than just code optimization. It involves more than just speed, just power, or just usefulness. It involves these factors, and more. If everyone started making programs that were extremely fast and extremely powerful, but extremely useless, that's not progress. That's why the philosophy of "We can throw money at it", or "Next year computers will be twice as fast" exists -- because it works.
    That isn't to say that optimization is useless; not at all. Just that it isn't the end-all solution.
  • by panurge ( 573432 ) on Wednesday December 11, 2002 @04:42PM (#4865295)
    First, Moore's Law is about transistor density, not clock speed. If it runs out by the end of the decade, that's still an increase of around 32X, and unless we suddenly have a need to become amateur weather forecasters, it's difficult to see any obvious applications. [cue enormous list from /. readers]
    We've now reached the stage where handheld devices have the same sort of processing power and memory as respectable desktops of a few years back, and I find it interesting that the sudden big hype is the tablet PC, which is relatively low speed but has good battery life. That could be the direction things are going, and if so it is hardly surprising that Andy Grove is worried about leaking electrons, what with Transmeta, Via and Motorola/IBM having lower-power designs.

    A case in point about technology demonstrators. Someone mentioned aircraft. OK, how much faster have cars got since, say, 1904 when (I think) RR first appeared? Not an awful lot, actually. They are vastly more reliable, waterproof, use less fuel, handle better, are safer, and enormously cheaper in real terms BUT they go about the same speed from A to B and carry about as many people. And they are still made of steel and aluminum, basically the same stuff available in 1904.

    This is far from a perfect analogy because, of course, the function of the computer keeps getting reinvented: it is applied to more and more jobs as it gets cheaper, more powerful, and more reliable. But it does point out that the end of Moore's law is not the end of research and development.
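    (A quick check of the "around 32X" figure above: growth is 2 raised to the number of years divided by the doubling period, and over the roughly eight years left in the decade the factor depends on the assumed doubling period.)

```python
# Sanity check: 2 ** (years / doubling_period) over 8 years.
for months in (18, 20, 24):
    factor = 2 ** (8 * 12 / months)
    print(f"doubling every {months} months over 8 years: ~{factor:.0f}x")
```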

  • by foxtrot ( 14140 ) on Wednesday December 11, 2002 @04:57PM (#4865445)
    Colloquially we speak of Moore's Law and we mean "Chips get twice as fast every 18 months."

    This is not what Gordon Moore said. Moore's statement was based on transistor density. Indeed, perhaps we may not be able to cram transistors together as much in the not too distant future.

    Does this mean that chips won't continue to get twice as fast every 18 months? It would surprise me if processors slowed down their rate of speed growth much this decade. As people begin playing with digital video on the desktop, as people write games that can actually push enough information to a GeForce4 FX to make it worth spending money on, people are still going to want faster and faster machines. And while AMD still exists as a competitor to Intel, even those people who don't really need a 7 GHz machine are going to find that that's what's available.

    So while Moore's Law, as it was stated, may be nearing its end, Moore's Law, as it is usually spoken, will probably stick around for a good while longer.
  • Moore Laws..? (Score:3, Insightful)

    by ackthpt ( 218170 ) on Wednesday December 11, 2002 @05:02PM (#4865491) Homepage Journal
    I wonder if there's a similar law representing toxicity to the environment of semiconductor manufacturing techniques.

    Regarding the natural world environment, you're correct, as I've seen some harsh criticism of the volume and toxicity of waste byproduct of semiconductor manufacturing. It's not so simple as, just add a little sand and some magic and voila! It's probably not reported so much because the wonders of innovation and heated competition make for more sexy news writing.

    Something not mentioned much, but observed by more than a few grumbling parties, is the ever-increasing size of code. My first encounter with this was upgrading from RSTS/E 7.? to 8.0, which was darn exciting back in the day, yet the size of the kernel would have been about 50% larger if we had activated all the features *I* wanted (and since I was the admin, lemme tellya, it was darn painful to trim off a few features I lusted after to squeeze it into our memory and performance target). These days, it's often the OS; ever notice how Windows installs got to needing more space than your entire first hard disk? The common response seems to be: just throw more memory at it. Yet I think there's a Moore-like law with versions of Windows, i.e. every 2 years a new version comes out with twice as much code.

    With the physical limitations of the current components nearing the top of the "rate of diminishing returns" curve, poor performance of the software will eventually catch up with users' expectations. Thus, leaner, faster code could become a market direction.

    "** NEW: Office-O-Lux, With 50% less redundant code! ***"

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday December 11, 2002 @06:21PM (#4866301) Homepage Journal
    I'm so tired of hearing people ask what we need more speed for. That's so damned ignorant, and yet you people keep repeating it like a bunch of parrots. As we develop more CPU power, we develop ways to use it to do things that we couldn't do before. When new classes of CPU come out, everyone thinks they're so fast. I remember the first time I used a 68020-based machine compared to my Amiga 500 (with a 68000), and I was sitting there saying "holy shit, this thing is fast" - now we literally have wristwatches with more CPU power on the market, not just those cracked-out Linux watches from IBM.

    Also, Palladium is only significant when you come to Palladium-protected content. It will have no effect whatsoever on your ability to rip DVDs, because any Palladium-protected DVD wouldn't be DVD-compliant and wouldn't work in your DVD player. The next public video format is YEARS away, and by that time Palladium will likely be only an unhappy memory, though it may be supplanted by some other hardware DRM scheme.

    Think about this: A boating video game which uses computational fluid dynamics to determine how the water should behave. Or how about a racing game so realistic that you model chassis flex and installing subframe connectors actually makes a perceptible difference in handling?

    Also, we are nowhere near movie-quality real-time rendering. We need an increase in the number of polygons of several orders of magnitude to really get there, or a revolution in the methods of rendering curved surfaces. There are practically no items in my apartment that don't require either a ton of polygons or a ton of curved surfaces to actually reach the point of photorealism. In addition, to actually reach the point of realism, some surfaces will have to be rendered with some actual depth. In other words, you will have to render not just the surface of someone's skin, but a certain depth of it, to get a 100% realistic representation. Each wrinkle and crease must be rendered at a very high resolution to have them all move properly... Do you see where I'm going with this?

    There will always be a demand for more processing power. As we get more of it on the desktop, operations which formerly required supercomputers (or at least mainframes) will become commonplace. Anyone remember when Visio 2000 came out using M$ SQL server for a backend? Once upon a time you needed a "server class" system just to run a RDBMS, now we use them to store miscellaneous data. (IIRC mysql was a standard part of linux life before visio enterprise 2000 came out but I'm trying to put it in the corporate perspective.)

  • by CMU_Nort ( 73700 ) on Wednesday December 11, 2002 @06:22PM (#4866319) Homepage
    It says that either processor speed (or density) will double every 18 months OR (and it's a big or) the price will halve in 18 months.

    So logically we could continue on with the same speed processor and just have them get progressively cheaper. But hmm, I wonder whose profit margins this would affect? What he's setting us up for is that Intel will refuse to lower their prices. They'll continue to make the chips cheaper and cheaper but they won't sell them for any less.

    I actually look forward to an end to ever-increasing clock rates, because then we can all get back to programming skillfully and making tight, efficient code.
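    (The price-halving reading described above, sketched in Python; the $1000 starting price is an assumed figure for illustration.)

```python
# Constant performance, cost halving every 18 months.
def price_after(start_price, years, halving_months=18):
    halvings = years * 12 / halving_months
    return start_price / 2 ** halvings

for years in (1.5, 3, 6):
    print(f"after {years} years: ${price_after(1000, years):,.0f}")
```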

  • by Tyreth ( 523822 ) on Wednesday December 11, 2002 @09:30PM (#4867474)
    There will always be a need for more speed, and for more space. Scientists will never be able to simulate the universe, since the computer would need to be greater than that which it represents. Certainly some parts, but not all of it down to every detail. If computers were capable of movie-quality real-time rendering, there would still be room for improving AI tremendously, along with complex physics engines, and much more. We will always find more ways to burn clock cycles for more realism. If you are stuck for ways to burn your CPU, then just ask me. I'll give you a hint on something new you can add to your game to improve realism and run slower.

    As for disk space, a similar argument applies. The more space we have, the more we will fill. We will have increasingly better-quality films, higher framerates, etc., up until the point where we are recording details from many different angles with miniature cameras and keeping the data nicely formatted and referenced for database use. Our needs will scale with the technology. We are always hoping for just a little more, and after that we see the next stone to hop to.

    So that I'm not completely critical, the home user will find little reason to upgrade, as indeed they already do. But I'd say this has always been the case. Average Joe likes to get the fastest PC and the best DVD player, but he only wants to upgrade every so many years. Whereas scientists, gamers, hobbyists, etc, like to update regularly to take advantage of new advances that they can use immediately. So I'd say the cycle will continue much the same.
  • Re:Hmm.... (Score:3, Insightful)

    by grumpygrodyguy ( 603716 ) on Wednesday December 11, 2002 @10:26PM (#4867877)
    I'm curious what kind of results the experimentation in superconductivity and semiconductors will yield. They sound kind of mutually exclusive. But we may yet see Moore's Law revived and revised...

    Superconduction is overkill at this stage of the game.

    More efficient cooling technologies (liquid cooling, Peltiers, etc.) could keep Moore's Law alive an extra 5 years. The primary difficulty today is back-inductance. All the current in those tiny wires creates magnetic fields that resist the original current flow (this is why chips get so hot). As we all know, the cooler the chip, the faster it can run. (This is because there's less back-inductance at lower temperatures, superconduction being the optimal case.)

    Anyhow, once current fab processes reach the wall, cooling technologies will probably have several good years of improvement that will directly enhance chip performance. That gives us a little more time to research new approaches (optical computing is probably the next step).
