Billions of Transistors on a Single Chip 151
cgi-bin writes, "IBM has reportedly developed technology to create 'tens of billions' of transistors on a single chip. Intel's Pentiums only have 27 million or so. The technique uses electron beams instead of the traditional optical lithography."
Re:Interesting? Who knows... (Score:1)
There's no such thing as electron beam epitaxy. You might be thinking of molecular beam epitaxy, which is a method of growing single crystal thin films, but has nothing to do with this.
Electron beam lithography, which is what this article is about, has been under development for some time; it is even used currently in some research applications. You are correct that the main problem with it is that you must scan an electron beam over the wafer, a process which is extremely slow, requiring 6-8 hours to pattern a single wafer. Perhaps the IBM people have developed a way to speed up the process. If so, this could be big news, but it's hard to tell from the article.
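A quick back-of-the-envelope sketch in Python shows why serial scanning is so slow. All the numbers here are assumptions for illustration (a 200 mm wafer, an 80 nm pixel, a hypothetical 100 MHz writing rate), not anything from the article:

```python
import math

wafer_diameter_m = 0.200   # assume a 200 mm wafer
pixel_m = 80e-9            # 80 nm spot/pixel size, as in the article
pixel_rate_hz = 1e8        # assume a 100 MHz beam-blanking rate

wafer_area = math.pi * (wafer_diameter_m / 2) ** 2
pixels = wafer_area / pixel_m ** 2
hours = pixels / pixel_rate_hz / 3600
print(f"{pixels:.2e} pixels -> {hours:.1f} hours per wafer")
```

Even at a hundred million pixels per second, direct-writing a whole wafer takes on the order of half a day, which is the same ballpark as the 6-8 hours quoted above.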
"Electron hosing" (Score:1)
Sure, tiny stuff is good stuff, but so what if it won't work...
On the up side, now 1.4x10^23 angels will fit on the head of a pin!
Marketing geniuses! (Score:1)
For those who haven't read the piece, IBM's technique allows them to put slightly more transistors per square inch, but its main benefit is that it allows them to manufacture much larger chips. So, it's technically true to say that they can get "tens of billions of transistors on a single chip", but most of that benefit just comes from what is effectively joining multiple chips together.
It's a neat piece of technology, but it hardly justifies the hysterical boosterism of this. Phineas Barnum would have been proud.
0.08 micron (Score:1)
sektori.com [sektori.com]
Re:it's the coolness that counts, ... (Score:1)
Congratulations, jd. You've just written a perfect sig line for yourself.
Re:Heat... (Score:1)
In E=MC^2 the E stands for energy, measured in somethingortheothers, but not necessarily electrical energy.
Simple physics (Score:1)
Re:On chip memory (Score:1)
Course, you'd have to be sure you flushed everything before the power goes out...
Re:OT: your sig (Score:1)
The two together are a wonderful synergy.
As for the storage of the mind, it is all the abstract and multilayered relationships stored in the mind, and somehow mapped to synapse interactions, that are the truly meaningful information in the brain. And this information we do not understand fully (or even partially, according to some). I can assure you, however, that to store all this information would require many orders of magnitude more storage than a single gigabyte (DVD-R).
My signature is concerned only with the physical code for the body itself (hardware), which is DNA: roughly 620-630 megabytes of information. It is not concerned with consciousness or mental state (software), which is incredibly complex and ever permutating and extending itself while running atop the hardware (brain) built according to this genetic program.
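The 620-630 MB figure can be sanity-checked with a two-bits-per-base calculation; a sketch in Python, where the result depends entirely on the base-pair count you assume:

```python
def genome_megabytes(base_pairs, bits_per_base=2):
    # DNA has 4 bases (A, C, G, T), so each base carries at most 2 bits
    return base_pairs * bits_per_base / 8 / 1e6

# Assuming ~2.5 billion base pairs lands on the 620-630 MB figure above;
# the often-quoted ~3.2 billion would give roughly 800 MB instead.
print(genome_megabytes(2.5e9))   # 625.0
print(genome_megabytes(3.2e9))   # 800.0
```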
Re:OT: your sig (Score:1)
So, encoding your DNA is no big deal; there are companies that will sequence your genome for you. Making any sense of the information, however, is many times worse than reading cat
keeps us on Moore's Law track (Score:1)
is a relentless exercise of technology. You need inventions like these to keep on track. Kind of like Social Security bankruptcy: electron lithography, copper interconnects, etc. keep it going "for the next ten years" while pessimists think it will end at that time.
Re:hrm (Score:1)
might not see it as all that important - but look at it this way: if they can fit tens of billions on a single chip, think how small they could make a chip that only needs 27 million.
We're apathetic because, as someone said earlier, this is ancient news. Just being able to etch a leeeeeetle tiny impression doesn't solve any of the real problems of making usable high-speed computing circuits at the atomic level.
Yeah, maybe it's cool that they actually made a box and did it, but I don't think anyone was questioning their ability to do so. In fact, I wouldn't be surprised if they did it for reasons other than pure science (big DUH, here). I'm sure there's some other company coming out with competing technology, and they just wanted to show 'em up.
And things like this are never bad for the Friday stock run.
--
blue, corporate conspiracy theorist.
--
blue
Re:OT: your sig (Score:1)
-Jer
Re:OT: your sig (Score:1)
One person gave an estimate based on the idea that we probably weren't remembering at more than x bits/second (I forget what x was, but it wasn't huge), that there are only so many seconds per year, and that most people don't live more than 75 years. But I think that he was vastly overrating the average number of bits/second that were remembered; I would put it at less than 1 (averaged over several decades). Still, consciousness is a bit more than memory. One also needs stack equivalences and heap equivalences. How many bits do you keep active in short-term memory? Probably less than a megabyte. Considerably less. Most imagery depends on either a fill-in-the-gaps representation, or on refresh-from-external-sources. If I close my eyes, the image of the room in my memory loses tremendous amounts of detail.
My feeling is: We might not fit entirely on a CDR, but we would probably fit on a DVD.
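The estimate above is easy to redo; a sketch in Python, where the bits-per-second figure is the speculative part:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600
LIFETIME_YEARS = 75

def lifetime_megabytes(bits_per_second):
    # Total remembered information over a 75-year life, at a flat rate
    bits = bits_per_second * LIFETIME_YEARS * SECONDS_PER_YEAR
    return bits / 8 / 1e6

# At the 1 bit/s average suggested above, a lifetime is ~300 MB
# (CD-R territory); roughly 16 bits/s would fill a 4.7 GB DVD.
print(lifetime_megabytes(1))
```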
Re:Hooray! Another overly hyped undeveloped resear (Score:1)
The real question.... (Score:1)
Electron lithography is essentially a raster process; a beam sweeps across the wafer "cutting" away silicon. In contrast, photolithography is like taking a picture; the whole wafer gets exposed at once. Until now, at least, using a photoresist has been orders of magnitude faster than beam-etching techniques.
So I wonder, how have they done this? Multiple beams per wafer? Arrays of emitters? Super-fast HV electron optics? What?
Kind Regards,
Re:Electron Lithography 101 (Score:1)
Thanks for the explanation. I've found a picture of SCALPEL [bell-labs.com] (the competing Bell Labs effort) here [vacuum-solutions.com]. As an electron spectroscopist in a previous life, I found your reply to be the most informative article in this topic so far.
That said, a couple of questions:
Kind Regards,
Re:Heat... or, MC^2 != IR (Score:1)
E=MC^2 is the equation stating the total amount of energy "frozen" into a given amount of matter.
1 gram of matter is "worth" 0.001kg[(300,000,000m/s)^2], or 9x10^13 joules of energy - any processor that dissipated that much heat energy would probably resemble a large thermonuclear device more than a computer. The E in E=MC^2 just isn't equivalent to the E in E=IR.
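That figure checks out; a one-liner sketch in Python using the same rounded speed of light:

```python
m = 0.001   # 1 gram, in kilograms
c = 3.0e8   # speed of light in m/s (rounded, as above)

E = m * c ** 2
# ~9e13 joules: roughly a 20-kiloton explosion, not a heat-sink problem
print(E)
```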
Re:More IBM hype? (Score:1)
Re:Interesting? Who knows... (Score:1)
Re:OT: your sig (Score:1)
--
e-beam (Score:1)
I guess it takes a while to get this stuff near the production phase!
Re:Some thoughts... (Score:1)
I'd be surprised if it didn't 'sell' for an order of magnitude or two above that...
Re:Heat... (Score:1)
Joules, I believe...
Sure isn't equivalent to Volts, though 8^)
Re:My initial response is "wow!!" (Score:1)
Also, by lowering Vdd, you actually slow the chip down. While it helps with the power produced in the chip, the drive current through the transistors decreases at the lower gate voltage, and hence the charging time on the gates goes up. (Correct me if I'm wrong, but I think this is how it works.)
I think the main problem is balancing the transistor sizing and the heat density in the chip. As you said - smaller transistor, higher clock speed. But as the density goes up, so does the heat developed, which necessitates lowering Vdd.
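The trade-off described above can be sketched with the usual first-order relations: switching power P = C·Vdd²·f, and a drive current that falls off as the gate overdrive shrinks. Every number below (capacitance, frequency, threshold, the square-law current model) is a made-up illustration, not a real device model:

```python
def dynamic_power(c_switched, vdd, freq):
    # First-order CMOS switching power: P = C * Vdd^2 * f
    return c_switched * vdd ** 2 * freq

def gate_delay(c_load, vdd, vt=0.5, k=1e-3):
    # Crude square-law model: I_on ~ k*(Vdd - Vt)^2, delay ~ C*Vdd/I_on
    return c_load * vdd / (k * (vdd - vt) ** 2)

C, f = 1e-12, 1e9   # 1 pF switched per cycle at 1 GHz -- illustrative only
for vdd in (3.3, 1.8):
    print(vdd, dynamic_power(C, vdd, f), gate_delay(1e-15, vdd))

# Dropping Vdd from 3.3 V to 1.8 V cuts power ~3.4x but also makes each
# gate roughly 2.5x slower: exactly the balancing act described above.
```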
Re:0.08 micron, mass 0.15 micron since 10/99 (Score:1)
--
Re:Pentiums only have 9 million.... (Score:1)
Lithography is only one of the holdups (Score:1)
EBL has long been thought of as the next step (many are surprised that photo has lasted this long), but there are still many great challenges left if the industry wants to continue using MOSFETs. Chief among these, I would say, are the gate oxide and leakage currents (both gate and channel leakage). As the lateral dimensions shrink, traditional scaling reduces the oxide thickness as well. Right now the oxide is only about 30 Angstroms thick (6 monolayers in crystalline Si; happily, SiO2 is essentially amorphous), so we have to reduce the voltage we apply across this oxide. This leads to the second problem: turning off the transistor. Given that the turn-on voltage is relatively low, there is a reduction in the ratio of on vs. off currents. This is bad for obvious reasons (ever wonder why your Athlon or PIII is so warm?). We need to find alternatives to continued scaling of the oxide (several papers have suggested that 10-15 Angstroms is a hard limit to thinning the oxide), or better yet get a new gate oxide material with a higher K (dielectric constant). We also need to move to SOI (thank you, IBM, for helping push this) in order to try and control the off current.
It is nice to be able to draw such small features, but to make usable devices requires a lot more than just the lithography.
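The parallel-plate approximation shows why a higher-K gate dielectric helps: you keep the same gate capacitance while making the film physically thicker (and so less prone to tunneling leakage). A sketch in Python; the K = 25 value is just a representative high-K number for illustration, not a claim about any specific material:

```python
EPS0 = 8.854e-12   # vacuum permittivity, F/m

def cap_per_area(k, thickness_m):
    # Parallel-plate approximation: C/A = K * eps0 / t
    return k * EPS0 / thickness_m

c_sio2 = cap_per_area(3.9, 30e-10)   # 30 Angstrom SiO2 gate oxide, K = 3.9
t_highk = 25 * EPS0 / c_sio2         # thickness of a K=25 film with the same C/A
print(t_highk / 1e-10)               # ~192 Angstroms: 6.4x thicker, same capacitance
```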
Interesting? Who knows... (Score:1)
Any info welcome.
Rehash of 2 year-old breakthrough? (Score:1)
but that we probably wouldn't need such a beast until 2005. I'm thinking 1) this is the same technique, but with a better prototype, and 2) they miscalculated the timeline.
-
I stand corrected (Score:1)
-
Re:Now, forgive me if I'm wrong... (Score:1)
Re:OT: your sig (Score:1)
---
Re:OT: your sig (Score:1)
So in order to store your "self" you'd need not only your DNA matrix, but also a complete map of your brain's neurons and synaptic connections.
Also, due to the somewhat random formation of organs, any irregularities (one limb longer than the other, different vision in one eye etc) would not be copied by either method and would necessitate a complete map of every cell in your body. While this may not be directly associated with consciousness, if you have any abnormalities of any glands that produce behavior altering proteins then that would not necessarily be carried over.
The idea that genetics could incorporate the entire contents of our consciousness is implausible for a number of reasons:
1) Our genetic makeup does not change (for the most part) over time. There are isolated circumstances of change to DNA (e.g. mutation due to radiation), but not through all cells and not following a pattern. A corollary to this is that additional experience would presumably necessitate additional DNA; since no DNA is "gained" through experience, this implies that there is no DNA record of consciousness.
2) Genetic code is too small. The amount of information stored in the average adult brain is vast. The amount of information in our DNA (even if you include mitochondrial DNA) is not nearly enough to account for this information.
Re:Some thoughts... (Score:1)
Something with that amount of control isn't going to sell cheap.
Re:Some thoughts... (Score:1)
The idea being that even the most jaded (and rich) of TV viewers probably wouldn't spend more than 50 grand on a TV.
I too think we're probably talking millions, not thousands. It'd be sorta like using an aircraft carrier for water skiing (way overkill).
Re:Marketing geniuses! -- Um, NO! (Score:1)
"the technology would not only allow the manufacture of much smaller components (potentially down to the atomic level)"
"the wavelength of electrons is five orders of magnitude shorter, so it is basically an open-ended resolution medium that for all practical purposes is limitless"
"The demonstration system was used to create components at 0.08 microns, or 80 nanometers. But Pfeiffer said the system could have been designed to produce even smaller components."
"We can extend the resolution downward," he said. "We don't really see a limit at 50 or 35 nanometers, which is many years away."
Okay, I don't know what you were smoking, but try to use less of it before posting next time...
Chris
hrm (Score:1)
--
DeCSS source code! [metastudios.com]
Matchbox (Score:1)
Re:More IBM hype? (Score:1)
My best guess is that IBM has demonstrated 2 things: a reliable collimated electron source (it may already exist) and some sort of electron "optics" (I know that makes no sense) - some reliable way of manipulating wide streams of electrons. The trick would be to make a 6 inch wide electron beam that has really good homogeneous characteristics.
Re:Laser? (Score:1)
Optics are a pain in the neck to fabricate right now in any form. Someday, maybe, but not yet.
Re:More IBM hype? (Score:1)
Re:"Electron hosing" (Score:1)
What's next... Quantum Computing???
Re:OT: your sig (Score:1)
Crusoe (Score:1)
...or powered by an overclocked mini-celery.. but then I'd end-up with a nice burn on my wrist..
Nope, because they're smaller ... (Score:1)
Re:OT: your sig (Score:1)
Identical twins don't have identical fingerprints, which, to my knowledge, are purely a function of DNA.
Correct me if I'm wrong, folks.
Re:Billions and Billions.... (Score:1)
Re:On chip memory (Score:1)
And I thought NT was a slow boot process...
Re:My initial response is "wow!!" (Score:1)
Remember... (Score:1)
All very well and good.......but........ (Score:1)
If just one manufacturer would put chips of this nature into production within 12 months, computer technology would gain another five years' worth of development in as little as 6 months: Intel would instantly go bust, IBM would make a fortune, and AMD would make a lesser-known, probably superior but ultimately cheaper clone.
Roll on the 21st century!
Re:All very well and good.......but........ (Score:1)
The alpha tool probably won't be completed until the end of 2002, Pfeiffer said.
What I am basically saying is that IBM will develop this technology, fail to realise its potential, wait for ages and then cry about it when someone else does it instead.
Re:All very well and good.......but........ (Score:1)
Now, forgive me if I'm wrong... (Score:1)
You should never, never doubt what nobody is sure about.
Laser? (Score:1)
Any research being done on the possible use of lasers in chips?
IBM's current processors (Score:2)
Some thoughts... (Score:2)
Second, and perhaps more importantly, the same techniques used for the etching -could- be used to produce ultra-high definition TV. (After all, all you're using is a somewhat larger electron gun.)
Now, I don't know about you, but I like the sound of computer monitors (or domestic TVs) capable of definitions of up to 500 billion lines. So what if nobody would be capable of telling the difference - it's the coolness that counts, not the practicality! :)
Re:Some thoughts... (Score:2)
Re:On chip memory (Score:2)
The advantage here is that realistic amounts of memory--128MB or more--can be put on chip with the processor. In effect, all memory is cache. This would be fantastic both in terms of speed and low cost.
If you can make processors with this technique, I'm sure you can also make memory with it. So you would still have off chip memory that is far bigger than cache. So instead of 512k cache and 128M main memory, wouldn't you have 128M cache and a few gigs main memory...
--
Re:More IBM hype? (Score:2)
Of course, with feature sizes this small, processor speeds would be improved, too. So both interpretations are valid. But I believe they indeed said what they meant.
Re:Heat... (Score:2)
When used in Fields and Waves (EE313 iirc) then E, as in "the loop integral of E dl equals 0," is often written in script or boldface, and represents Electric Field strength, in Volts/meter.
And E as in E=MC^2 has already been covered in a prior reply. I think most applications are in Megajoules, though.
Anyway, none of those really apply to the original comment, which was concerned about heat. The applicable law here is named after, erm... Watt, I believe: P=IE. When it comes to resistive heating, applying Ohm's law lets us express that as P = I^2 * R. Deriving the units for P is left as an exercise for the student.
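For the student who wants to cheat on that exercise, a sketch in Python; the 1.8 V / 20 A operating point is made up for illustration:

```python
def power_watts(volts, amps):
    # Watt's law: P = I * E  (E here is EMF in volts, not energy)
    return volts * amps

def resistive_power_watts(amps, ohms):
    # Substituting Ohm's law (E = I * R): P = I^2 * R
    return amps ** 2 * ohms

vdd, current = 1.8, 20.0
r = vdd / current                         # effective resistance, 0.09 ohm
print(power_watts(vdd, current))          # 36 W (joules per second)
print(resistive_power_watts(current, r))  # same 36 W, as it must be
```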
Re:All very well and good.......but........ (Score:2)
Re:OT: your sig (Score:2)
Well, it is certainly influenced by it, but if that were the case, identical twins would have an identical consciousness, which we know not to be true.
However, identical twins are often very similar in behavior, even when separated at birth and raised in different environments. So genetic predisposition definitely has a part in the development of your consciousness, personality, and intelligence.
Nanoscale... (Score:2)
MMmmmmm nano nano
Re:On chip memory (Score:2)
With this type of size reduction we could have chips at 3 gigahertz with 50 GIGABYTES of L1 cache. At bootup your system would load your entire drive image into L1 memory, and there would never be page faults or disk hits - except to save, but that would be a background process not affecting computing.
Now THAT would be a fast system.
Re:Screw Moore and his damn "law" (Score:2)
"All tasks can be designed / computed / evaluated / indexed / summarized / optimized / etc by a solid cubic meter crystal super conducting FPGA like nano scale self assembled ceramic block running at 90 gigahertz and chilled to 20 degrees kelvin running evolutionary systems of simulations / knowledge agents / neural networks / genetic algorithms / bayesian probability maps / self organizing pattern matching systems. All types of evolutionary systems layered and nested to unlimited levels of abstraction and complexity. Hardware capable of performing a centillion teraflops and holding a centillion terabytes of memory in core, operating at crystal speed. "
Yes, that is what I'm waiting for...
Re:OT: your sig (Score:2)
You could squeeze the instructions for building a physical body like yours onto a CDR, but your body is uniquely yours, and to save its configuration (meaning the output from the genetic program at this point) would still require orders of magnitude more information. Sorry, I misread your comment; I was under the impression you were saying you and your mind...
Re:IBM's current processors (Score:2)
Re:Marketing geniuses! (Score:2)
Re:OT: your sig (Score:2)
Re:All very well and good.......but........ (Score:2)
This isn't far-out theoretical research; they have built the system, and scaling it to fab production sounds quite straightforward and practical.
Re:More IBM hype? (Score:2)
Re:Marketing geniuses! (Score:2)
Re:Billions and Billions.... (Score:2)
Re:My initial response is "wow!!" (Score:2)
The drawback to direct-write electron beam lithography is that you have to directly trace the circuit you are trying to print in most cases, while in optical lithography you can expose an entire die (or multiple die) at once. There have been improvements made over the years, using techniques such as parallel writing, but it's still slow. Even using a more conventional masked resist and scanning the beam across the wafer using vector or raster methods, there are problems with electron scattering and such.
This article is pretty short on technical content, so it may be that IBM has developed a way to make e-beam lithography fast enough to be used in a production environment for chips (it is already used for making photomasks). That would definitely be a significant development. We'll have to wait and see, I guess.
Also, keep in mind that just because they have a lithography tool that can write 80 nanometer lines does not mean that the rest of the processing equipment (etching, planarization, etc.) could support it. There would need to be advances in those tools as well.
My other question is, what do we do with tens of billions of transistors? If we jump three orders of magnitude in the number of transistors on a chip, is it really going to do us any good, at least with current circuit design techniques? I think testing a circuit like that would be a nightmare.
-Jason
My initial response is "wow!!" (Score:2)
Please alert me if I am wrong, but IIRC the smaller the transistor, the lower the power requirement (less heat), and the faster the chip (less distance from junction to junction). So if all they did was to make the same chips we have now on the smaller die size, there would be a reduction in power requirements and a speed increase, right?
Not that I'm much of an expert in these things, but when IBM says they've got the tech and Nikon has built and demonstrated a proof-of-concept machine, this sounds like tech that's less futuristic than, say, quantum gates, etc. I mean, Nikon isn't in this to produce a one-off demo machine -- what they're really after is the ability to put their machines into the fab plants. So the actual production of chips is probably still a couple of years off, but the technology would vault IBM ahead of just about every other chip maker on the planet -- ahead of Intel, Motorola, AMD, TI, and anybody else I may have forgotten.
My biggest remaining questions not answered by the article are:
Feynman, 1959, electron beam. (Score:2)
Re:IBM's current processors (Score:2)
And better yet, the entire wafer becomes a customized circuitry area. Design processor and memory modules and have them splattered across the entire wafer with assorted bus circuits, with each wafer different based upon customer demands. (Yup, you'd better design those modules to work despite some of them not working due to production failures...)
Re:My initial response is "wow!!" (Score:2)
The increased speed doesn't really come from shorter junctions; the difference in the amount of time an electron takes to cross a smaller junction is negligible.
The smaller components enable faster speed by decreasing the RC constant. RC stands for resistance times capacitance. Think of filling a bucket with a hose. A bigger hose lets you fill the bucket faster. It's also faster to fill a glass than a tanker truck with the same hose. You can think of circuits in the same way. It takes a certain amount of time to charge and discharge a circuit. Decrease the amount of resistance or increase the voltage so that more current flows and the circuit charges faster. Reduce the size of the circuit and you don't have as much to fill.
Overclockers often need to increase the voltage when trying to increase speed. What's happening is that above a certain clock rate, capacitors that control gates don't have a chance to completely charge/discharge before the clock switches. If you force more electrons to flow down the same pipe by 'increasing the pressure', the caps will be able to completely charge/discharge and the circuit will work. Of course, if you try to push the force of a fire hydrant through a drinking straw...
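The bucket analogy above can be put in code using the standard RC step response; the component values here are made up purely for illustration:

```python
import math

def charge_time(r_ohms, c_farads, fraction=0.9):
    # RC step response: V(t) = Vdd * (1 - exp(-t/RC)),
    # so the time to reach `fraction` of Vdd is t = -RC * ln(1 - fraction)
    return -r_ohms * c_farads * math.log(1 - fraction)

R = 1000.0                        # 1 kOhm of drive resistance (illustrative)
t_big = charge_time(R, 100e-15)   # a 100 fF node: the "tanker truck"
t_small = charge_time(R, 25e-15)  # same hose, a quarter the capacitance: the "glass"
print(t_small / t_big)            # 0.25 -- the smaller node charges 4x faster
```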
Re:Billions and Billions.... (Score:2)
Re:Laser? (Score:2)
Then I realized that you can't escape electricity. You would have to convert the light beams with optical switching sooner or later in the computers, and that would be an insane performance decrease.
You cannot escape the fact that memory will always run on electricity. It just isn't practical to come up with a type of memory that is based on storing light.
Also, you run into problems with refraction, and if you want to make sure that each light beam doesn't interfere with the other beams during a processor cycle you would have to make the processor very large so you could shield the individual light beams with an opaque material. Also, lasers get very hot. It's just not practical.
On chip memory (Score:2)
Re:Laser? (Score:2)
Also, as the other reply states, any useful optical device actually involves electron transitions anyway to provide some kind of nonlinearity (you can't build anything useful, such as a NAND gate, out of linear components).
So has anybody found a URL which explains what advances IBM has actually made that make this better than the E-beam lithography that's been around for ages?
Edmund Green.
Nanoscale Physics Research Laboratory, The University of Birmingham, U.K.
Optical storage (was: Re: Laser?) (Score:2)
No, it is practical to come up with the idea. It's just not practical building and using it effectively.
Researchers at University of Colorado, Boulder designed an optical computer several years ago. There are several systems in existence where computation is partially or fully done optically, but this was the first (and the only, if memory serves me right) system to do everything optically-i.e. it had a memory system based on storing values optically.
The system essentially stored pulses in a loop of fiber optic cable several miles long. I think the principle is analogous to very early electronic memory systems, where bits were stored in the form of waves in tanks of mercury.
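The recirculating idea is simple enough to sketch as a toy data structure (pure illustration, of course; real delay lines are analog hardware, not Python):

```python
from collections import deque

class DelayLine:
    """Toy recirculating memory: bits travel a fixed-length loop
    (a fiber spool, or a mercury tank) and are fed back in at the end."""

    def __init__(self, length):
        self.loop = deque([0] * length)

    def tick(self):
        # One propagation step: the bit leaving the line recirculates.
        bit = self.loop.popleft()
        self.loop.append(bit)
        return bit

    def write(self, bit):
        # Overwrite the bit that just entered the loop.
        self.loop[-1] = bit

mem = DelayLine(8)
mem.write(1)
out = [mem.tick() for _ in range(8)]
print(out)   # [0, 0, 0, 0, 0, 0, 0, 1] -- the stored bit reappears every full pass
```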
Extensive info on this was published in some IEEE publications back then. I don't have the time to look for the URL now, but it will be helpful if someone can find the reference to it.
--
BluetoothCentral.com [bluetoothcentral.com]
A site for everything Bluetooth. Coming soon.
Here is the link.. (Score:2)
Stored Program Optical Computer(SPOC) [colorado.edu]
More IBM hype? (Score:2)
Called PREVAIL, ... the technology would ... significantly improve the speed at which silicon chips can be processed, researchers said.
I bet what the researchers probably said was "significantly improve processor speed." This is an important point to make, because anyone in the semiconductor industry knows that electron lithography is SLOW - orders of magnitude slower than optical lithography. That is why nobody has ever used it to make commercial chips even though the technology has been around for more than a decade. I would be interested to see some more technical back-up articles that talk about masking and throughput.
Re:Electron Lithography 101 (Score:2)
"As another side note to this, Lucent Tech. has an EBL system just about at proof of concept called SCALPEL. Hope this clears up a few of the wrong ideas and helps people understand what this is all about."
Might I point out that here on slashdot back in October or November was an announcement that Lucent had reached a resolution of .05 microns using SCALPEL. Then let's not forget the UC Berkeley student who made an even smaller transistor two weeks later. Something like 0.018 microns or so. This IBM announcement is not really anything new. Here is the press release. [berkeley.edu]
Potential (Score:2)
OT: your sig (Score:2)
---
Re:Optical storage (was: Re: Laser?) (Score:2)
And the winner is... (Score:2)
If they build it.... will it work? (Score:2)
It's all great and wonderful if they can construct devices that small, but the question is whether they will even work or not. Granted, the theoretical minimum device channel size is far below what is currently being produced (I think it's somewhere in the range of 0.02-0.05 microns, but don't quote me on that).
Another problem to look at is the degradation of the device that can take place when things get that small. I'm sure people wouldn't be so prone to overclocking their processors if there was a chance they might completely destroy the processor in doing so.
Another key, as was already stated, is that they need to bring the cost of the process down before it will ever see a production line. If the process requires a long time then it's likely to either create a bottleneck in the production line, reducing the overall output, or simply drive the price of the final product up a bit by forcing the company to purchase large numbers of the tool that performs the process.
Granted, the savings that would result from a smaller die size, and potentially a correspondingly smaller package size, could make up for the price difference due to the new tools. I'm not sure of the exact number, but a large part (>50%, I believe) of the cost of the chip is in the packaging (which is why you'll find bins of scrapped wafers at any production plant - why package something that isn't going to work?).
Another problem I could see in bringing the process to market is in contamination of the chips during production. As it stands now, lots of chips are scrapped because of skin cells, dust, etc landing on them during their trip down the line. With the smaller device size the smallest foreign particle size that could be tolerated would have to be smaller... so either clean rooms would have to get cleaner and their employees more religious in following the rules, or they would have to find some way to isolate the wafers from the technicians.
Re:My initial response is "wow!!" (Score:2)
Well, as you stated yourself, by reducing the gate area you reduce the gate capacitance. Thus you can still achieve the same charging time, albeit with a smaller voltage.
And as for power consumption, yes, if you do full scaling where every part of the device is scaled down by some factor X then you get a reduction in power consumption. However, with the wonders of backwards compatibility and meeting external specs and such, oftentimes the devices are not scaled down using full scaling. In this case the voltage is kept the same and the device size reduced, which actually leads to higher power consumption.
Of course it reaches a point where the power consumption is just obscene, at which point they reduce the operating voltage. And this really isn't a problem if you're going to put out a new chipset for the processor - just dictate what the voltages have to be. However, if you're trying to build in compatibility with an older chipset that doesn't support the lower voltages your chip requires, you're SOL.
Re:My initial response is "wow!!" (Score:3)
--
At a guess... (Score:3)
Electric fields can be used as lenses to focus electron beams, forming images of a stencil, just as physical lenses can be used to focus photon beams.
On one hand there's a complication because electrons mutually repel and also affect the field that forms the lens, so higher beam currents tend to distort things somewhat.
On the other hand, the lenses are formed by an electric field's natural curvature. So small-scale optical imperfections just don't occur in a good vacuum, while gross imperfections are easily dealt with by maintaining decent tolerances in the construction and excitation of the electrodes.
Of course they COULD have made a breakthrough in scanning electron beam technology, and be talking about writing every chip one at a time. But that doesn't square with either the claims of "billions of transistors" or those of "speeding up the processing".
Yes, they could get DENSITIES of billions of transistors. But writing them one at a time takes a while. And keeping the beam aligned across a large chip is a problem. (Though the latter can be solved to some extent by first laying out a set of location markers and using them in later steps to figure out where the beam actually is.)
Billions and Billions.... (Score:3)
---
New York Times (Score:4)
Good news and bad news (Score:4)
The new chip has over ten billion transistors.
The bad news:
The new chip is over 1700 square feet.
Plans for a portable based on the new chip are being put on hold...
Electron Lithography 101 (Score:5)
1. In order to get the resolutions required in the future, photon lithography would have to go to X-rays with a high enough brightness (i.e. you need a synchrotron X-ray source on site). For EBL, you just need a large source filament.
2. Masks for X-rays would need to be the same size as the actual features, because there is still no good method to form images out of X-rays, so a shadowing technique is used. Not the case for EBL: electrons are focused with magnetic lenses, which have been used and designed for years. This actually allows you to build the mask in separate parts and have the electron beam deflection put it all together for you as if it were one piece.
3. Stepper motors don't need to be quite as accurate in positioning. This is because you can put in a simple feedback unit that examines where you are projecting on the surface of the wafer, and deflection coils can position the beam exactly. This means that you can do lithography while the wafer is still moving! You couldn't do this in your wildest dreams with X-rays.
4. Electrons have a very small wavelength at the acceleration voltages used (on the order of picometers). However, the real limitation for EBL is not the wavelength but lens aberrations (pick up a good optics book), as well as space-charge effects (these happen because you are using charged particles, which repel each other, giving a blurring effect). Even with all of this, some predict that you could "easily" get resolutions down to the 10 nm scale in lithography. No, we can't do atom manipulation with this technique.
5. No, this technique does not use a focused-beam technique (similar to scanning transmission electron microscopy); it uses a plane-wave electron beam so that you can expose large areas at once (similar to standard transmission electron microscopy), allowing for higher throughput.
Probably the major disadvantage for EBL right now is that we need more sensitive resists. The brightness of EBL sources is still low compared to UV photon lithography, but I know of several groups that have come a long way with this one.
As another side note to this, Lucent Tech. has an EBL system just about at proof of concept called SCALPEL. Hope this clears up a few of the wrong ideas and helps people understand what this is all about.
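On point 4, the picometer-scale wavelength is easy to verify from the de Broglie relation with the usual relativistic correction; a sketch in Python, where the 100 kV accelerating voltage is just a typical illustrative value:

```python
import math

H   = 6.626e-34   # Planck constant, J*s
M_E = 9.109e-31   # electron rest mass, kg
Q_E = 1.602e-19   # elementary charge, C
C   = 2.998e8     # speed of light, m/s

def electron_wavelength_pm(accel_volts):
    # Relativistically corrected de Broglie wavelength:
    # lambda = h / sqrt(2*m*e*V * (1 + e*V / (2*m*c^2)))
    ev = Q_E * accel_volts
    p = math.sqrt(2 * M_E * ev * (1 + ev / (2 * M_E * C ** 2)))
    return H / p * 1e12

print(electron_wavelength_pm(100e3))   # ~3.7 pm at 100 kV
```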