Intel

Intel Cites Breakthrough In Transistor Design 255

n3hat was one of many who wrote in to tell us about the following: "Saw this report in Siliconvalley.com: 'Intel has devised a new structure for transistors that could lead to microprocessors that run faster and consume less power than conventional ones. The technology solves two of the more intractable problems: power consumption and heat.' It goes on to say that Intel plans to present two major elements of the new 'TeraHertz' transistor structure at the International Electron Devices Meeting in Washington on Dec. 3."
  • by ergo98 ( 9391 )

    For you "oldtimers" out there these sorts of announcements must come with quite the sense of humor: anyone remember BYTE magazine pronouncing the end of the line for advancement every 6 months or so back in the mid-to-late '80s? Each time stating that Moore's Law would stop holding true and we'd have to move to neural nets or analog computers for continued advancement. Quite humorous really.

  • Oh Yeah? (Score:3, Funny)

    by big_groo ( 237634 ) <groovis&gmail,com> on Monday November 26, 2001 @03:22PM (#2615308) Homepage
    "To compare, it would take a person more than 15,000 years to turn a light switch on and off a trillion times."

    Well, I bet *I* can do it in 11,000 years!

    Any takers?
    • nrsecs= 11000 x 3600 x 24 x 365.25 = 3.47 * 10^11
      10^12 / nrsecs = 2.88 Hz
      Sounds very doable (with a good lightswitch and a bit of training I think that even double that would be feasible), providing you can keep it up for more than a couple of minutes and that the switch does not break...
      If you connect a second switch to your left hand you could probably get a respectable bus speed as well. ;)
    • Re:Oh Yeah? (Score:2, Funny)

      by Suidae ( 162977 )
      Hook me up with their manufacturer, I can't find anybody selling light switches with an MTBF that even comes close to that!
    • By the time you get to 500, you'll get a call from your hydro company or the switch will fall off.
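As a sanity check on the switch-flipping arithmetic in the thread above, here is a short sketch (the trillion-toggle count and the 11,000-year figure are taken from the comments; everything else is plain unit conversion):

```python
# Rate needed to flip a switch a trillion times in 11,000 years.
TOGGLES = 1e12                          # one trillion on/off flips
YEARS = 11_000
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # Julian year, as used above

seconds = YEARS * SECONDS_PER_YEAR
rate_hz = TOGGLES / seconds
print(f"{seconds:.3g} s total -> {rate_hz:.2f} flips per second")
```

At roughly 2.9 flips per second the pace is humanly possible, which is the joke: sustaining it for eleven millennia is not.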
  • Yeah, yeah... (Score:5, Insightful)

    by rabtech ( 223758 ) on Monday November 26, 2001 @03:23PM (#2615314) Homepage
    Every other month someone comes out with a "breakthrough" in microprocessor design that could "someday lead to smaller and faster chips" that "use less power."

    I am not blaming only Slashdot for presenting this kind of fluff; I blame the major news organizations as well. Until these companies are getting ready to ship a product, I don't want to hear about it, because so much of it becomes vaporware. What little is left ends up being only slight improvements wrapped in marketing buzzwords.

    Give me more content and less fluff please.
    • In that case I suggest you stick to reading PC Weekly or an equivalent. They have pages and pages about new products being shipped by companies and where you can buy them from.
    • Oh, come on! Without the fluff, news sites would be boring as hell!! Without the totally useless stories (and anything by JonKatz), it would be ten hours between posts - and you wouldn't have anything to post complaints about!
    • Re:Yeah, yeah... (Score:5, Insightful)

      by Compuser ( 14899 ) on Monday November 26, 2001 @03:32PM (#2615384)
      No, Intel basically dropped the bomb and announced
      that they have achieved the holy grail by finding
      a better insulator than silicon dioxide and they
      claim this new material is "manufacturable", which
      I take to mean "fits within current process without
      too much investment". If true this is a fundamental
      thing and not at all fluff.
      • Re:Yeah, yeah... (Score:2, Insightful)

        by Anonymous Coward

        According to this article [eet.com], Intel has chosen ZrO2. This is very interesting, since the big industrial research consortia (Sematech and IMEC), as well as many semiconductor companies (e.g. Texas Instruments, IBM, Motorola, AMD, and others), have been studying ZrO2 for several years. Some of these report intrinsic problems with the stability of ZrO2 during dopant activation anneals. I wonder if Intel has really solved this problem, or if this is just a premature announcement by some marketroid.

        Getting ZrO2 to work on a few specialized experimental transistors in an R&D lab is much different than getting it to work on all of the billions of transistors in a chip or CPU. The former has already been done by several companies. I seriously doubt that Intel has achieved the latter.

        Considering that the announcement is coupled with an announcement that Intel is finally going to join the other Big Boys (IBM, Motorola, AMD) in endorsing SOI, I doubt any real breakthroughs have been achieved.

    • I agree (Score:1, Informative)

      by Anonymous Coward
      Too much hype, too little substance.

      I'm not sure what Intel is trying to do here, but from what I hear it certainly doesn't sound revolutionary. In fact, in some areas they seem to be playing catchup - they're finally adopting SOI, which has been around for a while now. So they are talking about terahertz transistors now? Did they actually build one and characterize it? If yes, they should give us concrete information instead of hype. Anyway, even if they did build it I don't think they're the first. I heard about NMOSs with sub-picosecond gate delays some time ago (SOI, 40 nm gate, novel doping profile...)

      Because of stupid articles like this people are gonna start saying "Cool, we'll have 1000 GHz Pentium 7 in a year or two". Ugh.

      Here's a related article in EE Times: http://www.eetimes.com/story/OEG20011126S0031
    • Re:Yeah, yeah... (Score:3, Insightful)

      by rnd() ( 118781 )
      Maybe /. should have a category (who knows, they may have one already) for tech press releases and the like.

      But seriously, is this really that bad?
  • Amd? (Score:5, Funny)

    by Spackler ( 223562 ) on Monday November 26, 2001 @03:23PM (#2615316) Journal
    It goes on to say that Intel plans to present two major elements of the new "AMDhurtz" transistor structure at the International Electron Device Meeting in Washington on Dec. 3.
  • Maybe NOW Intel can make something faster than an Athlon...

    - Freed
  • "Intel has devised a new structure for transistors that could lead to microprocessors that run faster and consume less power than conventional ones."

    The new types of structures that allow slower and hotter microprocessors?-)
    Is there any other type of breakthrough as far as microprocessors are concerned?
  • The new structure is being called the Intel TeraHertz transistor because the transistors will be able to switch on and off more than one trillion times per second. In comparison, it would take a person more than 15,000 years to turn a light switch on and off a trillion times.

    This means that a human can switch a light ON and OFF 2 times a second.
    Hey! Intel's engineers are really fucking slow. I now understand why it took them years to reach a gigahertz while it took AMD's only months.
    • I hate to burst your bubble, but AMD had been making chips for MANY YEARS before they hit 1 GHz... then again, the above is what I'd expect from an AMD fanboy...
  • What is the breakthrough? That AMD actually *does* make a better cpu??
  • "will this make my internet faster?"

    No

    thank you, have a nice day

  • by frank_adrian314159 ( 469671 ) on Monday November 26, 2001 @03:29PM (#2615361) Homepage
    After all, SOI technologies are not new and people have been trying different gate insulators forever. The problem with alternate gate insulators has been cost for yield. Unless this has also been solved and this process gets moved into fab, it's just another research lab thingee.

    Must be a slow news day for nerds...
    • It IS just another research curiosity.

      That said, the difference here is that the buffer of "depleted" (very low doped?) Si reduces interface states that occur at the insulator/semiconductor boundary (the "bottom" one). So there will be less recombination, and more of the current will go out the contacts rather than leaking along the bottom of the substrate.

      I wonder how they will deal with the floating-substrate effect like in SOI?

      I think Intel must have been scrambling for a good SOI competitor for a while now. Maybe they'll actually switch to copper/low-k as well?
  • by AugstWest ( 79042 ) on Monday November 26, 2001 @03:30PM (#2615369)
    That's AMAZING, they announced that?

    What's next, a means of DOUBLING HARD DRIVE SPACE? Maybe someone soon will announce they've figured out a way to make screens BIGGER and CHEAPER....

    It amazes me some of the stuff that slashdot rejects when compared with some of the stuff that gets posted.

    I submitted something over the weekend about someone at indymedia.org who was detained at an airport and questioned about posts he'd made to a web discussion group under a pseudonym.

    Yes, that's right, he was pulled aside at an airport and they not only knew exactly who he was, but his nick and specific posts he'd made.

    Seems to scream "YRO," but hey, we gotta make space for stories about bigger hard drives and faster, cooler processors that may see the light of day eventually.

    The story is here [indymedia.org], btw.
    • by Anonymous Coward

      Yes, that's right, he was pulled aside at an airport and they not only knew exactly who he was, but his nick and specific posts he'd made.

      Did it ever occur to you that it's just one guy telling a story on some obscure web site? Do you really really believe everything you read on the web?

      That story smells to me, and it probably smelled to the Slashdot editors (as amazing as that seems). Submit something from a reputable source, and maybe it will have a better chance. The Slashdot editors are gullible enough already, we don't need unsubstantiated bullshit like that.

    • Slashdot can reject stories so fast. THAT'S RIGHT! THE FASTEST REJECTERS IN THE WEST!
    • > The law is a code that isolates justice from
      > public participation.

      Damn straight, because a less polite term for
      "justice with public participation" is
      "lynch mob".

      Chris Mattern
    • Amazing, /. decided not to post an article about an activist returning to the US from the Mideast telling an unconfirmed story about how somebody mentioned something about a post.

      While I don't know what the editors use as criteria to pick a post with, I must say that this doesn't seem too unreasonable to leave off the main site.
    • I know what you mean. They also rejected my story about a friend I know, who knows a girl in college, who has a roommate who has a Pakistani boyfriend, who told her not to go near any malls the day before Halloween...I mean Christmas.
  • IBM and AMD First (Score:5, Informative)

    by sabinm ( 447146 ) on Monday November 26, 2001 @03:32PM (#2615385) Homepage Journal
    NPR had a report earlier today regarding this
    "Terahertz" chip. It seems both IBM and AMD had developed this technology and Intel snubbed it, citing that it was too expensive to implement. There is nothing breakthrough about "fast switching" electrons; just the fact that Intel released a press story about it makes it interesting. Ho hum
    • I heard this same report. It came off as a bloody ad for Intel, for the most part. There was very little detail about IBM and AMD there. Does anyone have details here? I'd love to see Intel slammed for this. Give credit where it is due.

      Love to see this tech used in new Sparc and PPC chips as well. :)
    • I heard the same story on NPR and couldn't help but think that the Intel PR firm knew it needed some shiny new thing to wave in front of investors. I guess they figured most people wouldn't hear the part about it not being all that viable for production or that Intel didn't really develop it first. It's a pretty good trick. I think I'll request a raise telling my boss that I should be able to increase my productivity 100 fold and then cloud the conversation with details concerning the technical and political hurdles associated with cloning.
    • Yes, note that this is merely one implementation of SOI technology, which AMD and IBM have been working on for a few years, and which Intel has claimed was "worthless". AMD has plans for such devices to roll off the assembly lines in a year or two, and now Intel is claiming that they "discovered" it.

      More Intel Hype
    • I read an article a few months ago and it said that the government is involved in this. The national labs have engineers from Intel, AMD and other companies, and government scientists, researching these new technologies. The technology will be out in the open and no one company should own it.
    • Re:IBM and AMD First (Score:2, Informative)

      by Cougar1 ( 256626 )
      "Terahertz" chip. It seems both IBM and AMD had developed this technology and Intel snubbed it, citing that it was too expensive to implement. There is nothing breakthrough about "fast switching" electrons; just the fact that Intel released a press story about it makes it interesting. Ho hum.

      Just a small correction. The technology was developed by IBM and Motorola. AMD licensed the technology from Motorola.

    • Re:IBM and AMD First (Score:4, Informative)

      by router ( 28432 ) <a.rNO@SPAMgmail.com> on Monday November 26, 2001 @07:16PM (#2616634) Homepage Journal
      Also, those of us who remember when IBM announced its desire to use SOI and Low-k dielectrics and Intel snubbed them are now giggling like schoolgirls....

      Check EETimes for the whole unabridged story.

      http://www.eet.com/story/OEG20011126S0031
    • That's not what I was told. I heard that the entire planet was in imminent danger of being eaten by an enormous mutant star goat.

      Tim
  • I love how everyone gets excited about these "breakthrough" announcements about processor components that blow everything else away. The only problem is that, if you put it in perspective, by the time this "breakthrough" gets used (generally 3-4 years, as noted in the press release) Moore's Law will have taken effect and it won't even be noteworthy, since everything else will have advanced beyond it!
  • by rice_burners_suck ( 243660 ) on Monday November 26, 2001 @03:33PM (#2615393)

    OH WELL.

    I was impressed by the idea of Transmeta's Crusoe processor because it greatly reduces the increasingly complicated problems of heat and energy efficiency. However, I've heard rumors that their product isn't getting widespread acceptance for some reason. Perhaps speed or reliability. Who knows.

    The point is that we desperately need processors that produce less heat and use less energy. If you take a moment to think about it, it's totally ridiculous that we need so many noisy fans inside a computer that someone's using to compose an email. It's even more ridiculous when you consider that some graphics processors require a fan as well, and so does the power supply.

    If successful, Intel's breakthrough in transistors could solve or greatly reduce these and other problems. These solutions aren't limited to the processor! All the chips in your computer contain transistors. Reducing the size, heat and energy usage by tiny amounts in each transistor will yield enormous benefits. Suddenly, a fan won't be required on the main processor or the graphics processor. Look at how much energy you save, not only in the transistors themselves, but in removing the fans, which themselves need energy to remove the unnecessary heat! It may be possible to remove the fan altogether from the power supply, resulting in less noise and even less wasted energy.

    Now if only they'd come up with a breakthrough that will make fast, long lasting, solid-state hard drives a reality. Then the computer will be silent and use much less energy yet. We're getting there. It's only a matter of time and money.

    OH WELL.

    • by Christopher Thomas ( 11717 ) on Monday November 26, 2001 @03:48PM (#2615488)
      The point is that we desperately need processors that produce less heat and use less energy. If you take a moment to think about it, it's totally ridiculous that we need so many noisy fans inside a computer that someone's using to compose an email.

      If you're using a high-end computer solely to compose email, I'd argue that the problem isn't the hardware.

      Heck, if power is a concern, buy a Dreamcast and use the web client to access Hotmail. $50, and you get a low-power embedded box that you can read and write email and even play games on.

      Desktop systems are overpowered because people want to be able to run insanely high-powered applications on them, no matter how much of a waste this is when they're not playing Quake XIV.

      It's even more ridiculous when you consider that some graphics processors require a fan as well, and so does the power supply.

      Same thing. A real-time realistically rendered 3D environment requires one hell of a lot of computing power to generate. This means heat. If you're just answering email, buy a PCI Rage XL card and save on the fan and heatsink.

      Now if only they'd come up with a breakthrough that will make fast, long lasting, solid-state hard drives a reality.

      They're called "flash cards".

      If you want to store gigabytes of images or gigabytes of game install files, however, they won't be sufficient.

      RAM is harder to make per unit of storage space than a magnetic platter. This is just the nature of the universe - RAM is intrinsically more complex. A magnetic platter is just a flat surface with the right kind of coating; it doesn't get much simpler than that. You can buy a solid-state drive off the shelf right now, but the cost will reflect the fact that it's harder to build, and this will continue to be the case for quite a while.

      In summary, the problem isn't the technology, it's the fact that people *want* insanely powerful computers, with large amounts of storage, for the lowest price that still gives them the power and space they crave.
  • This is not good. No doubt Intel has a patent pending on this technology as well as the "mystery" material. Unless AMD can come up with an equally competitive product, I fear that we will no longer have our favorite inexpensive chip maker around anymore... and say so long to Transmeta, as I am sure the new Tablet PC will move over to the Intel platform as soon as this tech is made commercially viable.
  • The article cites use of a "high-k dielectric" and a "depleted substrate".

    In English, this means using a different material for insulating layers and tweaking the doping of the substrate. A refinement, but hardly a breakthrough.

    A couple of points about this puzzle me:

    • Doesn't Si-on-I make the substrate less relevant?

      You could still call the channel material a substrate, and doping it might still do something, but it sounds like they're working with a bulk-silicon technique here. I'd thought that everyone and their dog was moving to silicon-on-insulator for capacitance reasons.

      I suppose if you left the substrate undoped (depleted of carriers) it would act more like an insulator, but I question why you wouldn't just use Si-on-I.

    • Weren't we trying to _reduce_ the k of dielectrics?

      The higher the k - dielectric constant - of a material, the higher the capacitance of a thin layer of the material between electrodes. A higher-k gate insulator, for instance, would cause your chip to run _slower_ due to increased gate capacitance. This is why we've had things like foamed dielectrics invented (bubbles of gas or vacuum in a material reduce the k value).

      Perhaps there are other effects of using a high-k material that offset this. If this is actually the case, please enlighten me.


    In summary, this sounds like a suspiciously marginal improvement. I'm curious as to what they're actually trying to do with these process adjustments.
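The capacitance worry in the second bullet can be made concrete with the parallel-plate formula C = k·ε0·A/d. This is only a sketch: the gate dimensions are illustrative, and the k values are approximate textbook figures for SiO2 (~3.9) and ZrO2 (~25), not anything Intel has published here:

```python
EPS0 = 8.854e-12  # F/m, permittivity of free space

def gate_capacitance(k, area_m2, thickness_m):
    """Parallel-plate capacitance C = k * eps0 * A / d."""
    return k * EPS0 * area_m2 / thickness_m

area = (100e-9) ** 2  # illustrative 100 nm x 100 nm gate
t = 2e-9              # illustrative 2 nm dielectric thickness

c_sio2 = gate_capacitance(3.9, area, t)   # k ~ 3.9 for SiO2
c_zro2 = gate_capacitance(25.0, area, t)  # k ~ 25 for ZrO2

# At the same thickness, high-k does mean more gate capacitance:
print(f"{c_zro2 / c_sio2:.1f}x the capacitance")  # ~6.4x

# But the point of high-k is the other direction: a physically thicker
# film can match the capacitance of thin SiO2, cutting tunneling leakage.
t_equiv = t * 25.0 / 3.9
print(f"{t_equiv * 1e9:.1f} nm of ZrO2 ~ 2 nm of SiO2")  # ~12.8 nm
```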
    • Re:Dielectrics (Score:3, Informative)

      by Erich ( 151 )
      One of the reasons you want a high-k layer is for making micro-capacitors to minimize ground bounce.

      One of the big problems with current chips is that voltages are getting so low and current is getting so high, and with clock gating to turn off things that don't need power you get the inductance of wires causing a lot of ground bounce, which can be really bad. So you want to add capacitance to offset the inductance, but there isn't really a high-k layer in most processes to make capacitors out of.

      • Hmmmm... you could use their high-k dielectric for that, since they are depositing it only under the gate. So you could put it where you want decoupling caps, between M1 and M4 or Mwhatever.

        But the poster was asking about high k under the gate, which raises their concern about high turn-on C. (see my post below)
    • for the depleted substrate explanation see my post above:

      http://slashdot.org/comments.pl?sid=24164&threshold=0&commentsort=3&mode=thread&pid=2615361#2615436

      The Si layer above the insulator reduces the recombination and leakage which plague SOI.
      They may even be able to ground this thin layer to reduce the "floating substrate" problem in SOI, but then I think punchthrough will be a problem.

      As for the gate dielectric, this is where we can't use low k, as we NEED higher capacitance to modulate the channel. The higher k allows an (electrically) thinner oxide, increasing the control over the channel, and we can thus further invert the channel - especially with gate lengths getting so short.
      We want low-k dielectrics in the metal layers for the reasons you stated (lower capacitance).

      This is a big increment for Intel and the industry.
    • "this sounds like a suspiciously marginal improvement." Actually, it sounds like a marketdroid making a complete hash of what the engineers told him. And also like Intel trying to catch up to other companies that have been playing with SOI (Silicon On Insulator) for many years...
  • This may be a stupid question - but after all I only buy and use processors, I don't design them.

    That said, how about Intel taking what seems to me to be the next logical step, and combining what is essentially a new insulator breakthrough with an actual design shift like clockless processor design - like we all read about a couple of weeks ago here on Slashdot.

    Since clockless design is supposed to pave the way for faster, less power-hungry parts, and this new insulator technology allows you to use less power and achieve higher speed chips - wouldn't the two technologies be complementary?

    Okay, stupid question finished - feel free to flame me!
  • The new structure is being called the Intel TeraHertz transistor because the transistors will be able to switch on and off more than one trillion times per second. In comparison, it would take a person more than 15,000 years to turn a light switch on and off a trillion times.

    Wow, this is a great benchmark for the same article that describes gate leakage and CMOS modifications. It sounds like some marketing genius went to the Intel R&D department, got the simple speech, and just copied the rest from a quarterly report. I remember when Clinton introduced the DOE's new supercomputer with the line, "It would take a person 10,000 years with a calculator to do... what this machine can do in a second." It kinda makes the line between a research scientist and a research spokesperson really obvious. And you thought tweaking drivers for Quake 3 was silly.
  • and Micro$oft taketh away.

    Will it really run at a terahertz, or is this going to be like the Cyrix chips that supposedly ran like their advertised clock speed, just not at their advertised clock speed?
      "or is this going to be like the Cyrix chips that supposedly ran like their advertised clock speed, just not at their advertised clock speed"

      You mean like AMD's whatever+ XP/MP chips?
  • Wow, another amazing press release. Yeah. I'm thoroughly unimpressed. Using better materials and smaller parts is not revolutionary, but I guess their PR department hired a couple of MS spin-doctors recently to make it sound just out of this world.

    Maybe when they use bio-technology, or lightwave devices as the CPU, they can claim a "Breakthrough in Transistor Design." Don't worry folks, AMD is still more bang for the buck.

  • After beating on IBM's SOI technology they finally figured out a way to manufacture SOI on their own. This new "breakthrough" isn't, they just caught up (finally) on the manufacturing side and decided to try to put a super-hyper-creative spin on it.

    Stick with AMD and PPC chips ...
  • In looking at the story, one gains the amazing insight that Intel is quite worried about consumer reluctance to buy faster chips, as the faster MHz chip matters little beyond a certain point.

    One also can extrapolate they are quite worried about Transmeta competition for lower-power chips.

    So to me this really is a reflection of a PR piece in their attempt to stop going down the blind alley of chip speed, and try to figure out a way to fight Transmeta, without giving up the shop to AMD (cheaper materials aspect).

    [caveat - I own both TMTA and AMD]
    -
    • Hey, I don't mean to be rude, but:

      TRANSMETA IS NOTHING.

      They are a company in real trouble. They are being sued by *investors*, which is never a good sign; they haven't delivered on their much promised and ballyhooed new line of chips; and major vendors have recently dropped them. The performance of the chips is seriously sub-par (even below what they claim).

      So let's be clear here. I am sure that Intel is not trying to kill off Transmeta with this move. I'd take it for what it's worth - Intel giving itself a little pat on the back.

      My best guess is that in five years Transmeta will be completely gone. Maybe three.

      Unless they can show a real reason why they are better than Intel or AMD's low power offerings, they are bound to founder and be washed away.
  • Hope they start manufacturing this soon so all the p4s will go for sale cheap. Nothing like reaping the benefits of staying a generation behind in chip speed.
  • I for one am glad that Slashdot continues to post these sorts of stories.

    Improving chip design is pretty much just business as normal, but every now and then it's worth it to hear what the latest thing is (and an estimate of when it's coming down the pipes).

    Of course, it's more fun when they build transistors out of blue-green algae, or computers self-assemble on cheese, or such - but new gate materials are important too (and certainly it's news for nerds.)
  • CNN is running a similar article, http://www.cnn.com/2001/TECH/ptech/11/26/intel.reut/index.html , but in it they claim that "Intel Corp. has devised a new structure for transistors -- the tiny switches that make up semiconductors..." That's a new one to me: semiconductors are made of transistors... I guess no one there proofreads, or, more likely, understands what they write about.
  • BOLLOCKS!

    As someone else said, every company has been shipping press releases claiming huge advances etc. etc. ad nauseam.

    IBM: SOI, copper, the world's most advanced fabs. They've claimed over-1 GHz chips but they still only deliver 700 MHz G3's.

    Motorola: AltiVec, SOI, low-k dielectric. They promised 1 GHz chips three years ago. Let's not even talk about the shit they got Apple into.

    Intel: MMX, MMX-2/KNI/SSE, SSE-2, IA-64, "Terahertz". They promised a 1.1 GHz "Athlon killer" 18 months ago.

    AMD: yeah, let's see, good chips, but now they're returning to 1997-style marketing, rating their chips against claimed Intel Pentium 4 performance.

    Transmeta: "code morphing", VLIW core, etc. etc. Only this wonderful achievement delivered less than impressive performance.

    etc. etc. blah blah blah

    It looks to me like most people are still using 1 GHz-based computers. And our computers still aren't flying (blame Windows? whatever).

    Argh.

    Sorry, diatribe mode OFF.
  • by Erich ( 151 ) on Monday November 26, 2001 @04:00PM (#2615572) Homepage Journal
    We expect that transistors will keep getting smaller, and faster at about the same rate as they get smaller. Gate delays are (looking out 5 to 10 years) not a big worry.

    The big worry is wire speed. Wires aren't getting much faster, even though dies are getting larger and clock frequencies are getting faster. It used to be that getting from point A to point B on a chip was no problem to do by the end of a clock cycle. Current processors are getting to be so fast that you can't get from one place to another in a whole clock cycle in some cases. Unlike transistors, wire delay gets worse as size gets smaller, because resistance goes up fast (it scales inversely with cross-sectional area), and wire delay is proportional to R*C. You can do some tricks to keep wire speed the same, but relative to switching speed and transistor size it still gets bad quickly.

    Routing information around is the problem of the future. You get free computation on the way, but getting from point A to B is the hard part.

    That being said, fast-switching, low-power transistors are nice. :-)

    And, for all you patent-ballyhooers, Intel will patent this (probably). As they should. Other companies will license this patent from Intel in the same way that Intel licenses patents on other aspects of their processes from other companies. That's the way things work.
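The wire-delay point above can be sketched numerically. For a wire of length L, resistance R = ρL/A rises as the cross-section A shrinks, while capacitance per unit length stays roughly constant, so the RC delay of a fixed-length wire grows as the process scales down. The numbers below are illustrative (bulk copper resistivity, a typical ~0.2 pF/mm wire capacitance), chosen only to show the trend:

```python
RHO_CU = 1.7e-8  # ohm*m, resistivity of bulk copper

def wire_rc(width_m, thickness_m, length_m=1e-3, cap_per_m=2e-10):
    """RC delay of a wire: R = rho*L/A, C roughly constant per length."""
    r = RHO_CU * length_m / (width_m * thickness_m)
    c = cap_per_m * length_m
    return r * c

for w in (500e-9, 250e-9, 130e-9):
    # Halving width and thickness quadruples R, so RC grows ~4x per shrink,
    # while transistor gate delay is simultaneously getting smaller.
    print(f"{w * 1e9:.0f} nm wire: {wire_rc(w, w) * 1e12:.0f} ps/mm")
```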

    • D'oh. Subject should read either "so what about gate delay" or "now what about wire delay".
    • This is not a new problem. I was in a meeting over 15 years ago where this discussion came up in the design of high-speed ECL ASICs for applications such as the Cray Y-MP. With ECL devices having an FMAX of 600-800 MHz in the early '80s, propagation delay was a critical factor. The rumors I heard were that Cray accounted for every millimeter of path length from the IC core, to the pad, through the wire bond, out onto the PCB, through the wire wrap, etc., and onto the next PCB. All of this, I suspect, without the "advanced" CAD routing tools we have today. All of the heat management and propagation delay issues in high-speed ECL systems of the late '70s and early '80s are now getting onto the desktop. -- Ross
    • Whatever happened to those transistors with six legs? I remember reading about them in Wired several years ago... before that magazine turned into a cathedral for every idiot who had a new idea about marketing. Anyway, apparently, they redirect power usually lost into another gate and thus increase speed without a comparable increase in power consumption. I don't remember exactly how they work though.
  • Seeing as Intel just might hit the THz mark very quickly, this leaves me with one question:
    Will Rambus actually get above 400 MHz memory?
    (you know, that "quad-pumped" x 100 MHz RDRAM)

    // begin sarcasm
    Heh, THz chips and a 100/133 MHz bus... sounds like a match made in heaven to me
    // end sarcasm

    Seeing as the P3 was an "improvement" to the Pentium architecture, it leaves me to wonder if the GHz machine's similarity to a 486 will yield THz machines similar to 386's?

    I dunno, it seems like trading clock speed and heat in place of actually getting stuff done seems rather silly.
  • by pm ( 11079 ) on Monday November 26, 2001 @04:12PM (#2615634)
    There is a more technical article over at EETimes.com here:

    http://www.eetimes.com/story/OEG20011126S0031

    The following is based on my prior research into SOI and the EETimes.Com article that I cited, and not on any knowledge of what Intel is actually planning on doing. I have not read the IEDM presentation and have no inside knowledge of the details of Intel's SOI plans. I am not speaking for Intel (despite working there) and I may be wrong on the details. My purpose in posting is to give some details on the background of SOI.

    There are three parts to this: this uses fully depleted SOI vs. the current partially depleted insulators, this uses a high-K dielectric (zirconium oxide, according to the EETimes) vs. traditional dielectrics, and this uses thicker source and drain terminals to offset the increased resistance from fully depleted SOI.

    Conventional silicon wafers use essentially a large, somewhat thick circular chunk of silicon as the starting platform that transistors are then created on top of. SOI is "Silicon On Insulator" and refers to a type of silicon wafer in which there is a somewhat thick chunk of silicon that forms the bulk of the wafer, on top of this there's a relatively thin insulator (referred to as the bulk oxide) and then on top of this a new layer of silicon is deposited (referred to as an epitaxial silicon layer, or epi layer). The transistors are created on top of this epi layer.

    The only physical difference between fully depleted and partially depleted SOI is the thickness of the layers. Partially depleted uses a relatively thick layer of insulator followed by a relatively thick silicon layer. Fully depleted uses much thinner layers. The names come from the fact that the depletion region on fully depleted SOI reaches down all the way to the bulk oxide, whereas in partially depleted SOI, the depletion region ends and there is still some non-depleted silicon between the bottom of the transistor and the bulk oxide. To explain exactly what depleted silicon is would take some diagrams and some time. Suffice it to say (and this is not debated in the industry, it is a fact): fully depleted SOI is better than partially depleted.

    So why do people use partially depleted? It's a matter of complexity. Fully depleted SOI requires extremely tight manufacturing margins. You need to have very precise thicknesses to achieve the advantages that fully depleted can offer over partially depleted, and this precision results in much higher cost. People (like myself) say that SOI is expensive, but this is in reference to partially depleted SOI, which is the most common in use nowadays; fully depleted is quite a bit more expensive than even this. There is also concern that wafer manufacturers may have problems supplying high-quality, fully-depleted, completely planar (flat) SOI wafers in high volumes.

    Switching to SOI reduces a form of leakage called subthreshold current (or Ioff) that occurs when a transistor is supposedly turned off. Fully depleted reduces this leakage even more than partially depleted. If you think of transistor current as being water that flows out of a water faucet depending on a signal (in this case the tap/handle of the faucet), subthreshold leakage is the equivalent of a leaky faucet that runs even when it's supposed to be off. It also has other benefits (it's faster, packing density is improved, etc.).

    The other primary form of leakage is something called gate oxide leakage that is current that tunnels through the increasingly thin region that separates the gate from the channel of the transistor. If we go back to the faucet metaphor, it would be like the faucet sucking water out of your hand while your hand is on the tap. :) Gate leakage is a function of oxide thickness, and I discuss this in another post of mine in this thread. The thicker the oxide, the less likely it is that electrons can tunnel through the gate. But if you increase the oxide thickness while leaving everything else the same, you lose performance since the capacitance of the gate is reduced. So what you want is a way to maintain a value of gate capacitance while increasing the thickness of the gate. The easiest way to do this is to switch to a material in the gate that has a higher dielectric constant. So, the high-K dielectric tackles the other part of leakage by allowing higher thicknesses of dielectric while maintaining a given level of performance.
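    The capacitance trade-off described above is easy to put numbers on with the parallel-plate model C = k * eps0 * A / t. A minimal sketch follows; the gate area, oxide thickness, and the dielectric constants (roughly 3.9 for SiO2, roughly 25 for zirconium oxide) are illustrative round numbers, not Intel's actual process parameters:

```python
# Parallel-plate model of gate capacitance: C = k * eps0 * A / t.
# All device dimensions below are illustrative, not real process numbers.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def gate_capacitance(k, area_m2, thickness_m):
    """Capacitance of a gate dielectric treated as a parallel plate."""
    return k * EPS0 * area_m2 / thickness_m

area = (65e-9) ** 2              # hypothetical gate area
k_sio2, t_sio2 = 3.9, 1.5e-9     # conventional SiO2 gate oxide
c_target = gate_capacitance(k_sio2, area, t_sio2)

# Swap in a high-K dielectric (ZrO2, k ~ 25) and ask: how thick can
# the layer be while keeping the same gate capacitance?
k_zro2 = 25.0
t_zro2 = t_sio2 * (k_zro2 / k_sio2)
print(t_zro2 / t_sio2)  # the dielectric can be ~6.4x thicker
```

    Keeping C fixed while growing t is exactly why tunneling leakage drops: the barrier the electrons have to cross gets physically thicker.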

    The third "new thing" offsets a disadvantage of fully depleted SOI - higher channel resistance. By increasing the thickness of the contacts of the source and drain you can reduce the resistance going into the transistor and can partially offset the increased channel resistance.
    • Hmm... there are a couple of typos. I really should preview things better before I post them.

      One that sticks out at me is my comment about discussing gate dielectrics in "another post in this thread". This is due to my having reposted this from my original post over at the BBS at Anandtech.Com. If you are looking for the comment on dielectrics, go to the Highly Technical BBS at forums.anandtech.com and search for "dielectric".
  • In terms of the power consumption side of this, I say blah, blah, blah. Let's just admit that we will always need more and more energy and start working on our own Dyson sphere, or at least more efficient solar collectors. People need to realize that we will always need more energy tomorrow than we did today; that is the way technology goes. We should be trying to capture more of the ridiculous amount of energy the sun puts out than save a few bucks with these chips. Building something to save energy from mostly oil-based energy generators is only prolonging the inevitable. We need energy and lots of it, so get over it and let's move on.
  • by alexburke ( 119254 ) <alex+slashdotNO@SPAMalexburke.ca> on Monday November 26, 2001 @04:32PM (#2615736)
    To compare, it would take a person more than 15,000 years to turn a light switch on and off a trillion times.

    Wow! That really puts things into perspective...
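    For what it's worth, the arithmetic behind that comparison roughly checks out; a quick back-of-the-envelope sketch:

```python
# Sanity-check the "15,000 years to flip a switch a trillion times" claim.
TOGGLES = 1e12
YEARS = 15_000
SECONDS_PER_YEAR = 365.25 * 24 * 3600

rate_hz = TOGGLES / (YEARS * SECONDS_PER_YEAR)
print(round(rate_hz, 2))  # ~2.11 flips per second, sustained nonstop
```

    So the figure assumes a bit over two flips per second, around the clock, for fifteen millennia.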
  • See this [eetimes.com] EE Times story for the technical details behind the announcement -- Intel does an about-face on SOI.
  • On a laptop, it is not the CPU that uses the majority of the power; in fact, it uses the least power compared with the rest of the components.

    Which components, you ask? The display (LCD) and HD use far more power than your CPU.

    Just take a look at my Dell Inspiron 8000's screen; it uses far more power than my CPU.

    The organic display thingy should be able to lower the power consumption for the display, but I don't think the technology even allows you to build a 15-inch screen.

    As for the HD, what about that holographic storage they've been talking about?
  • by ackthpt ( 218170 ) on Monday November 26, 2001 @04:34PM (#2615755) Homepage Journal
    It seems that every three months or so something on the order of a new transistor technology comes along from IBM or Intel. Prior links:

    IBM Develops Transistor Capable of 210GHz, June 25 2001 [slashdot.org]

    Intel Claims Smallest, Fastest Transistor, June 6 2001 [slashdot.org]

    Single-Atom Transistor, Mar 8 2001 [slashdot.org]

    Intel Claims 10Ghz Transistor, Mar 4 2001 [slashdot.org]

    Intel Creates 30-Nanometer Transistors, Dec 10 2000 [slashdot.org]

    I predict in the next couple of weeks IBM, or someone else, will announce a smaller, faster transistor which slices, dices, and scrambles eggs in the shell, leaps through flaming hoops, and balances your checkbook.

  • Kinda tiresome that any mention of Intel is cause for the AMD fanboys to come out of the woodwork. I'm not knocking AMD; it's just that it's endlessly boring to see so much empty froth and angst spewed forth in defense of a product.
  • "He added that Intel is aiming to have 25 times more transistors in processors than in current ones, running at 10 times the speed, yet with no increase in power."

    I hope he meant "no increase in power consumption."
  • by jd ( 1658 ) <imipak@ya[ ].com ['hoo' in gap]> on Monday November 26, 2001 @04:52PM (#2615877) Homepage Journal
    Since heat is a product of power consumption (energy in = sum(energy out)), solving one is solving the other. In other words, there is only one problem, not two.


    The basics, though, are simple enough. Both reduce to the problem of moving electrons through a medium, with minimal impedance, whilst still having a semiconductor. (ie: You can't just stick the whole thing in liquid helium, and hope that you can have a superconducting chip.)


    The ability to use gallium arsenide with very fast VLSI chips, as described a while back, is a good step in the right direction. Using copper, rather than alumin(i)um, is another, although silver would be superior.


    Another option might be to use non-flat architectures. A hemisphere would offer a greater radiating surface and much shorter connecting distances than a planar chip, although it would be a royal pain to actually build something like that. Since power consumption is a function of distance travelled, you would thereby reduce the power requirements.


    Another consideration is the differences between states. If you need to switch from +1 volt to -1 volt, then you've got a 2 volt potential difference. (Duh!) The smaller that gap can be made, the smaller that PD is, and the less power you consume in the process. The drawback is that outside sources can cause serious problems. You would need some decent shielding, and a reasonably clean power supply to get away with very small changes.
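    The point about smaller swings can be made concrete with the usual dynamic-power relation P = C * V^2 * f; the capacitance and frequency below are made-up round numbers, just to show the quadratic scaling:

```python
# Dynamic switching power: P = C * V^2 * f.
# C and f below are arbitrary round numbers for illustration.
def dynamic_power(c_farads, v_swing, freq_hz):
    return c_farads * v_swing ** 2 * freq_hz

C, F = 1e-9, 1e9   # 1 nF of switched capacitance at 1 GHz
p_2v = dynamic_power(C, 2.0, F)   # 2 V swing
p_1v = dynamic_power(C, 1.0, F)   # halve the swing...
print(p_2v / p_1v)  # ...and power drops by 4x
```

    That quadratic dependence on voltage is why shrinking the swing pays off so much faster than shrinking the capacitance.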


    Last, but by no means least, one of the worst culprits for power loss is connections. And modern CPUs have LOTS of them! Every single pin has three points at which you have the potential for high resistance - the connection between the socket & the pins on the chip, the pins & the gold wires connecting to the chip itself, and finally between the gold wires and the chip.


    Of these, by far the most likely source of a poor connection is between the socket and the pins. That connection will often be by simple soldering, so you've got the double blow of going from the alumin(i)um pins to a lead/tin mixture, and then from the mix to the alumin(i)um connection on the socket.


    Overall, it's a wonder modern CPUs ever work at all!


    (Actually, it's slightly worse than I'm describing, as chip manufacturers frequently split things between multiple chips, thereby doubling all the above problems, for each chip in the set. ie: 4 chips give you 16 times the headache.)


    Larger dies, fewer pins (how many do you need, for chrissakes! One per instruction?!), uniformity of materials (as far as possible), fewer chips per set, better screening, better PSUs, purer wafers, and less corner-cutting, would all lead to superior performance, in every respect.


    The main reason Moore's Law will last well into the 22nd Century is that, although ALL of these refinements could be implemented tomorrow, the cost/profit ratio isn't great, and one press announcement is pathetic compared to the free publicity of "ever more exciting discoveries" (which aren't).


    In short, why the hell SHOULD Intel, AMD, et al, make the best chips they can? What possible motive could they have for killing off a great revenue source at little effort, when the alternative would be a one-off mediocre improvement in sales for gigantic effort, followed by a massive slump? The rate of R&D is much too slow to keep supplying people with new toys. It's much more profitable to slow the rate of marketing, and keep people tagging along.


    (If a chip manufacturer wanted to destroy the technology industry, all they'd need to do is make the best product they possibly could, using the best tools, and never mind the rejection rate. You'd get a few days of massive buying, followed by a decade of stagnation.)

  • Amazing how the word "breakthrough" can be abused.

    Intel announced that they are going to go ahead and push their own high-k dielectric and modified silicon-on-insulator, which they took their time to refine instead of pushing this kind of stuff into fabs early (like IBM and AMD). That's it. They did the same with copper interconnects, waiting for .13 micron processes, IIRC.

    There's nothing fantastically new, especially in the press release, except that they did it themselves instead of licensing it. These aren't the droids you're looking for.

    EEtimes has a better article http://eetimes.com/story/OEG20011126S0031 [eetimes.com]
  • This is terrific... with a terahertz CPU core we will all need to buy new InfiniBand I/O devices to keep up! With the cost of ATE at $1-5 million to test these ICs, we will probably see some more shifts to structural testing and serial I/O to keep manufacturing costs down. I wouldn't expect to actually see a 64-bit bus running at 1 GHz, but stranger things have happened. -- Ross
  • Seems familiar (Score:2, Insightful)

    by bibos ( 116554 )
    microprocessors that run faster and consume less power than conventional ones. The technology solves two of the more intractable problems: power consumption and heat.

    This does sound familiar. Remember the advantages of clockless chips discussed a few weeks ago on Slashdot?

    less power consumption

    less heat

    faster processors

    The article, on the other hand, says it's (only?) because of a substitute for the silicon wafer. Well, we'll have to wait and see what AMD has got in its pocket waiting to be shown.
  • by gnurd ( 455798 )
    if this means faster online porn, it should be "DickHertz." yikes!
  • So far the only major, high-speed chipmakers who seem to have a problem with power consumption and heat generation are the Intel and AMD family of processors.

    Rather than develop completely new technology that will raise the price of their chips even higher, why doesn't Intel take a year off and totally revamp their architecture so it isn't so much of a space heater? Sure, the general public will be shocked and appalled when they can actually touch their 5 GHz Intel Pentium-IX, but I'm sure they'll get over the noisy fan belt the AMD version needs.
    • Yeah, that's a splendid idea, taking a year off in the industry where processor size/power doubles every year.

      That's ok, who needs performance when they'll have a low-power solution, right? Too bad Transmeta already showed that that approach doesn't work.
  • There's doubtless a law that says "expenditures always grow to meet income" or some such, and this applies very well to computer technology as well. Better battery technology has never meant that you'll ever see a laptop with an 8-hour battery life; it just means that manufacturers make laptops that consume all that extra battery power in two hours with bigger, sharper displays, DVD players, faster hard drives, and more RAM and CPU cycles. Most of which is just junk that some VP or VC uses to show off to everyone who can't afford it, rather than to let real people do real work while they're on a flight.

    And of course, this development doesn't mean that Intel will make their processors run cool enough to run without a fan again, just that they'll pack transistors into them until you can roast marshmallows over your processor. Oh well. Speed is good.
