Technology

Cringely: Chip Manufacturing To Radically Change

eshefer writes "This week Cringely talks about a company called Rolltronics, which he claims will make the current technology of microprocessor fabrication on silicon wafers defunct in five years. The company uses roll-to-roll printing on plastic (somewhat like newspaper printing presses), making the process much cheaper to produce then current technologies."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • _Big_ oversight. Let me elucidate:

    Why do you need 1 CPU clocked at 1 GHz... if you can have 1000 CPUs clocked at 1 megahertz?

    Think about it. Ever heard of the 'connection machine'? Web printing of CPUs at unprecedentedly low costs is a situation that _begs_ for massively parallel computing.

    Suppose you could print up a big screen and behind every pixel is a 1 megahertz computer? Let's see- let's imagine the screen is three feet high and... well, any length, right? Make it as wide as your room. Call it 30 dpi and assume you're viewing it from a fair distance. That's 1080x4320 pixels (or so) for 3x12 feet. If each pixel's running a computer at about 1 megahertz that's more than four thousand gigahertz of CPU ;) _and_ you can paper your wall with it.

    Yes, getting _information_ to those clever pixels would be the trick, but it's not an insuperable problem. The CPUs can be pretty smart at one megahertz. Give them a few K of RAM too, and think of them as a peculiar sort of 'accelerator card', only massively parallel. It's absolutely trivial to do hacks like Asimov's 'Prime Radiant': you'd just have each 'pixel' be an alphanumeric generator able to consult an overall RAM location, a pointer to where in RAM to look, and a brightness control connected to some gaze direction sensor. Presto, wall full of text that scrolls and becomes clearer and follows your gaze- and that is one of the _easiest_ things to do with this stuff.

    Better to ask whether you could have a suitcase-sized block with several terabytes of _really_ _slow_ RAM (read: way faster than a HD, and impact proof!), for the cost of a floppy drive- after all, it's only a matter of printing more pages, right? Just keep piling up the 'slow RAM' pages. The same thing could be done as read-only: if you're OK with text, you could have a 'book' (i.e. an object with a screen on it) that is ANY book you like, with terabytes of data in it, printed so cheaply as to be virtually disposable. Look around at your books and ask yourself how much it'd cost you to have a _scribe_ write those out for you ;)

    Ask whether the massively parallel 'screen' concept could be used for video game systems (imagine unrolling a wall-sized Quake or flight sim). Even if the quality is not ultra refined, some aspects of it would be as advanced over the current state of the art as the current state of the art is over software rendering. For instance, perhaps you'd have only solid colored triangles, BUT the system would build them itself, being fed only very high level object information- and you'd get 100X or 100,000X the model geometric detail you can have today, because each pixel only 'sees' a couple of triangles directly behind it and decides only what color it's going to be. So you wouldn't get 3Dfx motion blur, but you'd get every leaf on every tree in the forest, as each pixel handles its own (emergent) geometry. Even if you only had Atari 2600 level pixel sophistication- think of the resolution! You could have new games designed to take advantage of that- like, I dunno, a Pac-Man that has just the one maze but it's the size of the wall :)

    Or, each pixel knows how to do MPEG transforms- so you unroll a videophone or movie screen the size of your wall. Power's an issue, but if you can get wide enough traces... like a foot wide... a lot of power dissipation issues become less of a problem :) really, the idea has loads of possibilities that don't require heavy centralised processing.
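
A quick sanity check of the arithmetic in the comment above; every input (30 dpi, a 3 ft by 12 ft sheet, 1 MHz per pixel) is the commenter's own assumption, not anything from the article:

```python
# Sanity check of the "wall of 1 MHz pixel-computers" numbers above.

dpi = 30
height_in = 3 * 12      # 3 feet in inches
width_in = 12 * 12      # 12 feet in inches

pixels_high = dpi * height_in        # 1080
pixels_wide = dpi * width_in         # 4320
total_pixels = pixels_high * pixels_wide

clock_per_pixel_hz = 1e6             # 1 MHz per pixel-computer
aggregate_hz = total_pixels * clock_per_pixel_hz

print(f"{pixels_high} x {pixels_wide} = {total_pixels:,} pixels")
print(f"aggregate clock: {aggregate_hz / 1e9:,.0f} GHz")   # ~4,666 GHz
```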

  • Absolutely. 300 dot _line_ _screen_ is high for mass printing: maybe that's what Shotgun's thinking. Besides, I bet a lot of the action's going to be in large format- doing stuff with wall-sized screens, or medium-res displays that are 2' tall and 6' wide and curve all around the back of your desk. With surface area like that who needs super high resolution? Big screens are cool :)
  • Depends on how you look at it. If you made one to encode into mp3, and figured out some way to hold an audio file up to it and let each processor do a block, you'd decide to encode the mp3 and bip! it'd be done already :)

    I could see similar uses in things like scanning, where you're dealing with a lot of parallel data anyhow. You could print some sort of processor that gets rolled up and stuck to the scanner's sensor bar, that interpolates and sharpens and does all sorts of nice image enhancements very cheaply- and have the scan go much faster than current models do. Output bandwidth would be the bottleneck- unless the rolled-up processor also encoded the result into JPEG for you :)

  • Not sure this technology will take off - anyone remember bubble memory, or any of a dozen other pieces of technology set to take over the world? But it could, and through the same process that personal computers blew away minicomputers. Originally, personal computers were crap compared to minis. Slower, less reliable, less functional, you name it. Anyone who wanted to replace their minis with a bunch of PCs would have been looked at as insane. These days, people run on their PCs the kind of powerful, sophisticated software that minis used to handle, and SETI uses all that networked power to do supercomputer work.

    If they get this technology to actually work commercially, it could destroy conventional chipmaking techniques. Sure, these things will be slower than normal chips. However, as the book _The Innovator's Dilemma_ argues, what usually happens is that a technology like this ends up filling a new niche on the low end. Take embedded systems. This plastic technology is a natural fit for that sort of thing. Sure, it's slower, but most embedded systems are not CPU hogs anyway, and the lower cost and higher production rates outweigh the disadvantages of the reduced processor speed. The manufacturers of this technology keep refining the density of the circuitry they're printing on the plastic, improving performance. Slowly they start eroding the conventional chip market from the bottom up. Chip makers keep giving up the low end as they focus on the high end.

    The problem is you can't keep doing that forever. Sooner or later you run out of ground. So in a few decades, you might find that for everything but the luxury, high-performance end of things, all your circuits are printed plastic instead of etched silicon. That's how companies that go with the "sure" technology get burned and blasted out of the market from below. The low end inevitably eats into the high end with cheap technology improvements.

    Not that I'm saying this is going to happen. A lot depends on whether they can make this work commercially and how well the technology scales. But it could happen. Don't go dismissing it out of hand. The smart answer is 'wait and see'.
  • err...

    s/Why/Who/

    Well atleast this crack smoking junkey corrected his mistakes ... :-)
  • 1-Why is the crack smoking junkey who put that website together?

    2- Anita Borg, Ph.D. -- Product Innovation and Social Responsibility. Hehe. I just think it's too funny.
  • by Bookwyrm ( 3535 ) on Friday January 19, 2001 @10:30AM (#495065)
    It would be worth considering that should this scenario come true, it would have an interesting impact on the usages of free-vs-commercial operating systems. If the computer costs $15 to make, people are not going to be spending $80-$100 to put Windows Whatever on it.

    On the flip side, if the computers are 'disposable', then this might drive up interest in MS .NET and similar network-based hosting/application providers as a place to store data and applications, with the $15 computers being treated as more of an access device than a computer -- the catch would be whether or not the monthly service charges over the long term were cheaper than buying a 'real'/non-disposable computer with software.
  • At least you clicked on the link and learnt something, which is more than the dumbass moderator who thinks the post is "offtopic"
  • by warmcat ( 3545 ) on Friday January 19, 2001 @10:15AM (#495067)
    Ball Semiconductor [ballsemi.com] have at least as interesting a plan to deposit semiconductors on small spherical surfaces. They have some small gates working already.
  • Note to moderators:

    Posts warning that their parent contains a disguised disgusting link may qualify as "off-topic" (though there are better uses for your mod points), but in no way is it fair or accurate to label them as "trolls".

  • More likely you'll subscribe to the computer, with a new one printed up and sent out to you every month or so containing the latest MS bug fixes.

    Or maybe it could be a loose-leaf binder type arrangement and we can keep on calling them "service packs".

  • I want to know if it rubs off on my hands.
  • Ball Semiconductor. Spherical ICs. At first, I thought this was a troll...

  • Actually, I remembered reading about it in EE Times or something... it just took a moment to surface. Meanwhile, I was chuckling at the pun.

    The humor of which, BTW, isn't diminished.

  • You can say that, yes, if you are able to adjust the scale at which you look at that curve over multiple orders of magnitude.

    But human perception, in general, with respect to the number of pieces of information it can juggle at once, or the "gain" of a sensory organ, is restricted to only a few orders of magnitude, and in some instances only one order, so we perceive apparent "knees" in exponential phenomena. Perception of acoustic intensity is a classic example.

    Context can also impose a frame of reference on an exponential effect. I'd like to elaborate but the boss just walked in...

  • He's right - there's going to be an amazing convergence of new technology in the next decade: organic semiconductors, inkjet mass production, digital paper, amorphous photovoltaics, fuel cells, polymer electrolytes, bluetooth, broadcast power. It's going to be a cyberpunk's wet dream. Sure, the first things built using these advances are going to be large, slow and clumsy compared to their ultimate potential, but Bob's yeoman analogy is accurate for a reason. In another decade or two, the power of a modern laptop PC will be shrunken down to something you can fold up and stuff in your shirt pocket, and cost less than the shirt.

    And it just may BE your shirt pocket. That's what Cringely probably knows but isn't saying - when it becomes that cheap to just "print" a computer, they'll be integrated into everything: refrigerators, automobiles, clothing, furniture, you name it. Sure, there will still be information appliances, but their purpose will evolve into enabling your coordination of all the other computers you will interact with throughout your day, from your own household accoutrements to public infrastructure to your employer and the internet at large.

    It is going to make the world unrecognizable.

    Again.

    And the amount of information that will need to be exchanged is going to make today's bandwidths look like trickles. Right now, we are at the knee of the exponential growth curve of the telecommunications market, and technology will keep up with demand as improvements in optical switching continue. Communication service is going to become more important than banking - hell, banking and finance have already become little more than information flowing around a network.

    You want to be a part of it? Forget putting your money in the people who make computers. Invest in telecommunications, and the hardware that supports it. That's where the fortunes are going to be made.

  • electron mobility [sucks] / chemical instability [sucks]

    These are just areas where we will see the incremental improvement that Cringely described.

    Hell, in 1988, when I bought my first CD player, the hard drive couldn't store even one track from a CD. Now my hard drive holds dozens of ripped CDs, in many cases uncompressed.

  • Well, aside from trying to poke holes in your math, there are (at least) two other reasons why these "plastic" computers are going to be a big hit, despite the fact that they're lower on the curve than ones made out of polluted sand:
    • Cost. The organic materials are much less expensive than the ultrapure materials required to make semiconductors and hard drives. That's why CDs are so cheap. (At least as long as the oil holds out.)
    • Ubiquity. When you can print a computer on any old visibly clean surface, not just plastic sheets, then you've turned an important corner. You can now put computers into eyeglasses, furniture, windows, coffeemakers, even underwear (just imagine!). It doesn't matter how fast the machine is once you pass certain computational thresholds: the ability to support a graphical interface, the speed to reproduce audio, and another milestone for video. Each of these thresholds opens up yet more applications for embedded computing.
  • Linux [microsoft.com] BeOS [microsoft.com] FreeBSD [microsoft.com] MacOS X [microsoft.com] QNX [microsoft.com]
    SUB-20000 USER ID FOR FREE!
  • Yes - because there is a larger lower bound on feature size, they are not going to be cranking out 1GHz CPUs on this stuff, especially when the chip is the size of a sheet of paper - it takes electrons time to get from place to place.

    So this limits the power of individual processors using this process, but you can go massively parallel, just add another processor page, or 3 or 10...

    This may never produce a barnstormer of a computer, but it sounds promising for consumer electronics and web appliances.

    -josh

  • The circuit size will drive up power usage and heat generation.

    You're thinking in terms of silicon. It's not a straight scale-up when you're changing the basic materials. It's my understanding that a polymer-based CPU of that size would generate less heat and use less power than a silicon-based CPU sized as they currently are.

  • If the computer costs $15 to make, people are not going to be spending $80-$100 to put Windows Whatever on it.

    Interesting thought, but it doesn't necessarily follow. You can buy a pretty decently-made blank book for just a few dollars, but lots of people happily buy the latest hard-cover best-sellers for $20-$30 (US). (I might not think Windows Whatever will be worth the extra cost, but I don't think the latest John Grisham is worth $28, either.)

  • by Bearpaw ( 13080 ) on Friday January 19, 2001 @10:37AM (#495081)
    Rechargeable batteries.
  • If the computer costs $15 to make, people are not going to be spending $80-$100 to put Windows Whatever on it.

    True, but isn't Whistler being positioned as the replacement for Win9x and NT? By the time this process is ready for the mass market, MS will probably be trying to shove Whistler down everyone's throats. Since it is next in line after the current NT releases, it's a good bet that it will cost as much as or more than Win2000, whose Professional version has an MSRP of almost $400. Of course, if people are that in love with Windows, they'll buy it, but the ones on the fence will most likely gravitate to something more affordable.

    --
  • Although the technology fits on a sheet of paper (what size?), in theory it should be possible to fold it or roll it so it takes less space. Other advantages would include the fact that you could actually embed it in a wafer of plastic - imagine your desk could actually be made of layers of this technology and you wouldn't even notice, and all you would have to do is plug your keyboard in.

    Technologies at this stage are often seen by themselves and out of context, but once out of the research stage there is nothing stopping them from being combined with other technologies to increase the number of possible applications.

    These sorts of technologies are what will help contribute to the invisible technologies - ones that are there and made use of, but that you won't notice.
  • What if someone accidentally rips my laptop? :)

  • Yeah, remember all those press releases about cool new technology that happened eighteen months ago but I can't buy at the store yet? Those all must suck, so they should go ahead and stop developing them.

    Science takes time, but it ALWAYS ALWAYS gets results. Project Apollo took, what, a whole 15 years to get people from primitive jet engines to walking on the moon. Who wants to wait that long?
  • Do you have any idea how much venture capital you could get if you had a credible design for a five year battery for a computer? ANY computer? If you know how to do that, I'll be glad to scare up some financing. Somehow.
  • What, you're going to land on that and ignore all the people who say "Here here!" when they should be saying "Hear hear!"

    Just like Mr. Simpson says. "It's pronounced noo-kyew-lar."

    No, it's not an American thing. I bet there are people in other countries that have bad grammar too...I'm just not good enough (enuff?) at reading their languages to pick it up. And British authors...hell, they think car hoods are called bonnets and cookies are called biscuits! Never mind grey and colour. Or Aluminium (sic). Who can tell what those poofters (weirdos) think is a grammar error?
  • Regarding your last point, it's interesting that the first computers DID cost $500 million EASILY (depending on how you want to adjust for inflation), but you can get a superb pocket calculator for $15. It's not as far fetched as it might seem on the face of it...
  • There's a difference in believing they are going to change the world and imagining what the world would be like if they did. I'm a skeptic myself (good trait for an engineering student), but I also don't mind exploring interesting, even improbable, ideas.

    I'm certainly not going to give them any money. : )
  • So a beowulf cluster would fill a shelf ?
    Or a library ?
  • Actually, I think in English concepts. "Knight" and "night" are two completely separate entries in my internal lexicon, just like other homonyms such as affect and effect.
  • Yeah, but you yanks have no honour.
  • Well, they tried marketing DVD's that expire (DIVX) and look how long that lasted!

    I don't doubt that there would be an easy and economical way to recharge/reuse/replace the power cell.

  • There was an interesting Wired Magazine article [wired.com] that discussed the work being done by Paper Computer [papercomputer.com] to make cheap flat computers.

    There was a Slashdot article [slashdot.org] about these guys over a year ago.
    -----
  • by gmhowell ( 26755 ) <gmhowell@gmail.com> on Friday January 19, 2001 @11:26AM (#495095) Homepage Journal
    Considering how reluctant paper is to disappear down the hole with our lousy 1.5 GPF toilets, I seriously doubt the computer would go anywhere.

    GPF=Gallons per Flush

  • by Basje ( 26968 )
    Great. Now you can have the dog fetch your laptop. I hope they find a way to prevent it from getting soaked at the same time.

    ----------------------------------------------
  • Actually, there are at least two companies. Dieceland Tech Corp [dtcproducts.com] is promising a $10 phone, and this Register [theregister.co.uk] story says a $20 laptop also. The DTC "future" page has an image of a laptop but no details.
  • So you're saying that we're not going to see any new technologies because it is rare for a new technology to become mainstream? That's ridiculous - of course we'll be seeing new technologies appear, sooner or later. Sure, most fail, but not all of them.

  • "Than" and "then" sound very different when spoken in a good old South African accent. In general though I don't have that problem at all, I've never had much of a problem with spelling. I think its a genetic thing or something, some (otherwise intelligent) people seem to struggle with spelling. Or maybe it has do with how much a person reads. Reads books, that is, not websites like slashdot.

  • I don't have all the numbers to fill this in, but here's a try.

    The technology is aiming to be available 5 yrs from now (notice how long shots are always 5 yrs out)

    The latest generation of microprocessors has around 20 million transistors (give or take). Five years will give us 3 more doublings (assuming Moore's Law holds), so we will be looking at replicating 160 million transistors for the processor. 1 gig of RAM, add another billion. All the other circuitry - let's just make it simple and say that an average computer will have 1.5 billion transistors in 2005.

    A magazine has around 200 pages. So each page of this computer will have to hold something like 7.5 million transistors (assuming an even distribution).

    Assuming they can print at 300dpi (which I believe is high for mass printing) on 8inx11in media gives

    300x8x300x11 = 7.92 million

    This may look like it will pass, until you consider that a transistor will take more than one dot, and then consider the inter-transistor wiring. Even if this is enough, they will barely be cutting-edge unless:
    -they can print at higher resolution
    -they can print more pages

    I don't see the need anyway. Computers are cheap now. You can get one for $100. What's expensive are the latest processors, and they're not expensive because of production cost. It's recapturing the engineering cost that drives up the price. This will only produce $15 computers if someone is willing to pay $500 million for the first one.
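
Here is the parent comment's estimate written out as a small script; all the figures (1.5 billion transistors, 200 pages, 300 dpi on 8in x 11in media) are the commenter's rough assumptions, not anything from the article:

```python
# Re-running the parent comment's estimate: transistors needed per page
# versus printable dots per page. All inputs are the commenter's assumptions.

total_transistors = 1.5e9      # rough "whole computer in 2005" budget
pages = 200                    # "a magazine has around 200 pages"
per_page = total_transistors / pages          # 7.5 million transistors per page

dpi = 300
width_in, height_in = 8, 11
dots_per_page = dpi * width_in * dpi * height_in   # 7.92 million dots

print(f"transistors per page: {per_page:,.0f}")
print(f"dots per page at {dpi} dpi: {dots_per_page:,}")
# A transistor needs more than one dot, plus interconnect, so 300 dpi
# leaves essentially no margin - hence the comment's conclusion.
```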

  • Can you say "use conducting (as opposed to semiconducting) plastics for the interconnects"?
  • I think the world is computerized enough right now. It's enough of a pain in the ass to bother with synching every random appliance I have and getting everything working together, playing nicely, etc. How much of a royal pain in the ass will it be when everything, from t-shirts to condoms (didn't think about that one, didja), to toilet paper holds information that people "need"?
    I don't want to live in a world where I have to sync a condom after usage in order to see the stats on it. I'd rather not know some things. And I definitely don't want to sync my t-shirt with everything else. Imagine hooking yourself up to the computer for a few minutes every morning... it'll be like taking an EKG (hooking up nodes to various parts of you). Screw that. The palm docking station is enough syncing for me. I'm a fruit of the loom guy... not a fruit of the valley.
    ______
    everyone was born right-handed, only the greatest overcome it.
  • Bob is often interesting and always entertaining.

    I think, however, he missed the boat on this one. In an age where geometries are shrinking by factors of ten, twenty, or even a hundred, it is not inconceivable that five years from now they will be talking about mass-producing chips whose geometries approach the size of atoms.

    The reason the printing press didn't change very much in three hundred years was that the people who sold it didn't have to worry about some guy down the street coming out with a better, simpler model. Intel and AMD do. I believe Bob is right when he talks about this company making something revolutionary, but I would bet money that these people are twenty years too late.

  • 1 gigaBYTE of ram will need more than 8G transistors. Sheesh.

    -Ryan
  • by SysKoll ( 48967 ) on Friday January 19, 2001 @10:42AM (#495105)
    This technology is old hat already; the trade press has been writing about it for years.

    Advantages:

    • Supple plastic circuits
    • Mostly transparent
    • Low cost once the process is established

    Drawbacks:

    • The electron mobility of these plastic semiconductor junctions sucks, so this is good only for low-speed circuits. I'm not sure it is even good enough for the few MHz of CD-A audio decoding.
    • Concerns about chemical instability. These plastic circuits will have a low density and low-cost packaging, and hence offer a huge surface to pollution by environmental reactants. Ozone can make holes in the latex of condoms; guess what it can do to a semiconductor thin film exposed to air.

    An often-quoted great app is the head-up display for cars: a transparent set of electronic circuits that you glue onto your windshield and that contains its own display. UV protection films are mandatory to keep the circuits from burning in the summer, but it looks feasible and cheaper than the usual optical projection solutions.

    Don't sell that $12 million 193-nm optical stepper in your silicon fab, though. We're not there yet, especially for medium or high speed circuits.

  • It came from Frontpage... it was pulled along on wires...

    I'm sure that every cutting-edge, change-the-industry startup company does their web site with Frontpage. I can't wait to see their saran-wrap substrate processors.

    - - - - -
  • I dunno that the feature size is that much of a killer. I think we can claw back the low-clock rate by massive integration.

    There's a lot of inefficiency inherent in the modular assembly process exemplified by your run-of-the-mill laptop. The processor talks to memory over a small bus, the hard-drive knows little of what will be requested next...

    This is all the case because traditional manufacturing needs to modularise in order to achieve high enough yields to be price-effective. Hence each component needs to be as fast as possible (thus the smaller feature size) to achieve acceptable performance.

    If the entire logic circuit of the laptop could be printed in one go -- Extremely Large Scale Integrated -- I'd imagine that we could compensate for the lower clock speed by exploiting parallelism and async clocking. Instead of printing only one CPU, print ten, each with their own memory. Give 'em a nice wide bus to communicate...

    Of course, no-one said that designing or programming this beast would be easy.

  • If this is true, then that's a lot of cheap computers, that are going to need an operating system. One that's already demonstrated an ability to be easily ported to lots of architectures. This could be a big win for Linux.
  • by coug_ ( 63333 ) on Friday January 19, 2001 @10:20AM (#495109) Homepage
    Think more along the lines of magazine presses which are less likely to exhibit flaws (in my experience). The production of newspapers is done with less concern about details - as long as the thing is basically readable, no one's going to complain about a $.50 paper and the newspaper presses know this. As the circulation goes down and cost of issue goes up, people are more likely to complain. In this case, the circulation is going to be extremely low - everyone isn't going to be buying a computer every day, week, or month. The company would naturally have to make sure that this roll process is accurate enough that they can limit the number of misprinted computers to an amount that can be recovered by profits without a problem.
  • Which is a damn shame!

    I rent DVDs for $5, when with DIVX I could have "purchased" the physical item. Besides the convenience of having it closer for more replays (paying the $5 again), or being able to buy it for life ($20), I have nothing when I return it to Blockbuster. And late fees? Pfft.

    And what would have made DIVX even more tempting if it caught on was the opportunity to hack the player to play "expired" disks.

    So what I'm saying here is.. I don't know what I'm saying. For DIVX it's a good idea, because a video disk is an item you'd perhaps rent. But for a computer? I like my computers big, grey, and power hungry. I'm barely sold on the concept of batteries, let alone in computers.

  • As many have pointed out, this would not be suitable for many things such as really high clock rate processors where teeny tiny features are critical. But you could do some cool stuff with it. Possibilities:
    • All the memory and drive electronics for a solid state TV screen, with an OLED layer for the light emitters. Cheap flat panel TV to hang on your wall!
    • Screen, touch pad, memory and slow processor for something like a Palm. Make it 3"x5" and then mount the quarter-inch-thick result in a bit of hard plastic housing for durability. Cheap digital assistant!
    • Same as the last one with an IR transmitter and receiver. Cheap teachable universal remote!
    The comments about the cost of the OS and other software for these gizmos raise interesting questions about what the total cost might be.
  • I doubt this technology will work for high-density state-of-the-art sorts of things (like CPUs for example) - it probably makes sense for something where transistor size doesn't have to be submicron (like in a TFT display). However for a 1GHz CPU it's not going to cut it (they have to be small because of little things like the speed of light and RC effects in wires).

    On the other hand there's a whole range of electronics out there where this sort of density is not an issue and this could make a lot of older fabs that are building this stuff redundant.

    I could imagine a cool disk drive replacement with this technology - basically a pile of mylar sheets. I bet you could get comparable densities at similar prices .... and you wouldn't have to spin them ....

  • I didn't mean to imply that this process would be useless. Certainly there would be lots of applications where they would be extremely useful and novel. The innovative display technologies you mentioned are one. There are lots of embedded, low-computation, low-power applications where a manufacturing technique like this could find a big market. (I was thinking about ultra-cheap stick-on sensors powered by sunlight, communicating via picocellular networking.)

    The point I wanted to make was that the high-end laptop or desktop computer with submicron CMOS ICs is not going to be replaced anytime soon. Even if you could replace a 1 GHz CPU with 1000 CPUs operating at 1 MHz, your power requirements would not decrease (direct tradeoff of clock speed vs. number of transistors), and you'd need to write an operating system that could take advantage of a slow, massively-parallel processor. Also, as a previous poster pointed out, you wouldn't even get the 1 MHz clock speed, because the carrier mobility of the polymer semiconductors is much lower than silicon.

    This manufacturing technique will find a niche, but we won't be stamping out general-utility PCs with it anytime soon.
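
A minimal sketch of the power argument above, assuming a first-order dynamic-power model (P = C·V²·f) with placeholder capacitance and voltage; it ignores leakage and the option of dropping the supply voltage at lower clock speeds:

```python
# First-order dynamic switching power, P = C * V^2 * f, per the comment's
# "direct tradeoff" argument: matching one fast CPU's throughput with 1000
# slow copies of the same design leaves total switching power unchanged.
# C_CPU and V are made-up placeholders.

def dynamic_power_w(cap_farads, volts, freq_hz):
    return cap_farads * volts**2 * freq_hz

V = 1.8          # supply voltage (placeholder)
C_CPU = 1e-9     # switched capacitance of one CPU design (placeholder)

one_fast  = dynamic_power_w(C_CPU, V, 1e9)          # one CPU at 1 GHz
many_slow = 1000 * dynamic_power_w(C_CPU, V, 1e6)   # 1000 copies at 1 MHz

print(f"1 x 1 GHz   : {one_fast:.2f} W")
print(f"1000 x 1 MHz: {many_slow:.2f} W")   # same total switching power
```
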
  • Nowhere in Cringely's article is there any discussion of the performance penalty that this process would entail. Let's assume we want to duplicate the equivalent of 50 million transistors clocked at 1 GHz. Right now Intel can squeeze that many components onto a 200 square millimeter die by using a CMOS process with a 0.18 micron feature size.

    Now assume that your printing process needs transistors with 10 micron feature sizes to ensure proper registration and a high enough yield to be manufacturable. That increases your effective "die" to 956 square inches. (Area increases with the square of feature size.) That's equivalent to 10 sheets of single-sided paper.

    For a multi-layer printing process, 10 layers of plastic sandwiched together would definitely be possible. HOWEVER - you are not going to be able to clock your circuit at 1 GHz! Because of the much larger size (and capacitance) of your circuit, you'll do well to get a 1 MHz clock speed (1000X slower).

    While this process may be very useful for e-books, displays, etc., I don't see how any high-performance computing could be done with a microprocessor constructed with this technique. Your only alternative to slower clock speeds would be massive parallelism to achieve higher computational throughput. Assuming a direct tradeoff of speed versus number of transistors, you would need 10000 layers instead of 10 layers in your process. There goes your low manufacturing cost.

    It's not just enough for a computer to be cheap. It's got to be fast, or it's no good to anyone.
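
The area scaling in this comment, worked through numerically; the 200 mm² die and the 0.18 micron and 10 micron feature sizes are the poster's assumptions, and area is taken to scale with the square of feature size as stated:

```python
# Area scaling from the comment above: die area grows with the square of
# the feature size. All inputs are the poster's assumptions.

die_area_mm2 = 200.0        # modern CMOS die at 0.18 micron
feature_cmos_um = 0.18
feature_print_um = 10.0     # hypothetical printed-transistor feature size

scale = (feature_print_um / feature_cmos_um) ** 2     # ~3086x area
printed_area_mm2 = die_area_mm2 * scale
printed_area_in2 = printed_area_mm2 / 25.4**2         # mm^2 -> in^2

sheet_in2 = 8.5 * 11
print(f"area scale factor: {scale:,.0f}x")
print(f"printed 'die' area: {printed_area_in2:,.0f} sq in "
      f"(~{printed_area_in2 / sheet_in2:.0f} sheets of paper)")
# Matches the comment's figures: ~956 sq in, roughly 10 sheets.
```
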
  • by daniell ( 78495 ) on Friday January 19, 2001 @11:43AM (#495115) Homepage
    This is actually conjuring up an amusing image of someone plugging in their soldering iron, waiting for it to heat up correctly, and testing and cleaning the tip with a bit of solder. Fully satisfied that the process is well on its way, our hardware hacker touches both the solder end and the copper wire for the battery to the terminals he's so carefully traced through his dead machine. He's looking forward to a new and working machine, and brings the iron down to melt the solder. Before it even gets to the ink terminal on the plastic, the top layer browns, then melts away, exposing the next sensitive layer, which quickly does the same as the iron is brought into contact with the wire and solder. Our hacker realizes his error as he reflexively twitches back; the solder hasn't melted yet, but there's a glob of messy plastic and ink burning to the tip of his iron. For shame, he thinks as toxins fill his nostrils, I am so surprisingly stupid. :)

    -Daniel

  • Those all must suck, so they should go ahead and stop developing them.

    My intention in the original post was not to speak against scientific research (I am a scientist at Stanford University). My intention was to warn against the folly of believing that, just because they have a company with cool press releases, they are going to change the world. As a scientific undertaking I'm all for it; as a business opportunity, I'm skeptical.

    (Jeez, you guys are harsh)

    -Moondog

  • by smoondog ( 85133 ) on Friday January 19, 2001 @10:16AM (#495117)
    Sounds like we've been through this path before. Unfortunately, developing new technologies rarely works and just because there is a company dedicated to it doesn't mean much more. Remember 3d protein memory based on lasers and rhodopsin? I'll believe it when I see it.

    -Moondog
  • -----
    Our microprocessor isn't some tiny silicon die -- it's the size of a sheet of paper, maybe two
    -----
    Where to start? The circuit size will drive up power usage and heat generation. The only way to offset this will be to slow the processor, and to add a huge battery. The result will be one of the biggest and heaviest "laptop" systems on the market that's slower than a P-100 and able to heat an average 3-bedroom house all by itself.

  • Although this technology has the potential to make manufacturing chips much cheaper, Cringely doesn't mention anything about designing them.

    How much of the current price of a chip goes into R&D? Half or more?

    Now, granted, producing more chips does allow the cost of R&D to be distributed across more customers, but the idea of a $15 computer seems absurd. You can only distribute the cost to so many people. (Most of the first world already supports it, and who else can afford it?)

    I hope this technology works, but it's more likely to be an evolution than a revolution.
  • If computers only cost $15 to manufacture, that will mean Windows will be over 3x more expensive than the computer itself.

    The masses will finally start taking free software seriously.
  • It's much worse than that. A large clump of Slashdot posters, presumably more intelligent than the average Joe, seem ignorant of the differences between you're and your, it's and its, there and their, me and I, and punctuation in general.

    I'm aware of the argument that the purpose of writing is communication, and that if the message is understood then the rest is unimportant. It seems to me that attitude is akin to a programmer hacking something together until it compiles and appears to work, and considering it a job well done.

  • That's funnier than it ought to be, for some reason.

  • Try printing something that's 15 atoms across....

    Try aligning your printing presses to that kind of close tolerances....

    Try doing multi-layers with this thing...

    Sure, it might work, but I doubt it.

  • It can't be too hard to wire another battery to it - just solder 2 points together and bypass the dead battery.
  • Well, if we get real simplistic and ignore the timing of operations, signal propagation across a page would only limit the CPU to something like 1 GHz (light covers roughly a foot per nanosecond). Then again, you could get fancy with the layout, and if you consider the layering, you can keep some key parts of the CPU within less than 1/4 inch of each other.

    Anybody who says these things would replace current technology as speed/power leaders any time soon is smoking crack, but I would certainly pay $15 for a 486DX66 laptop.
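
For reference, the propagation arithmetic behind the figures above, assuming signals move at the vacuum speed of light (real traces are slower, so these are optimistic upper bounds):

```python
# Light-speed limit on a signal crossing the comment's two distances:
# a full 11-inch page versus "key parts within 1/4 inch of each other".

c = 3.0e8        # m/s, speed of light in vacuum
inch = 0.0254    # meters per inch

for distance_in in (11, 0.25):
    t = distance_in * inch / c                   # one-way traverse time
    print(f"{distance_in} in: {t*1e9:.2f} ns per traverse "
          f"-> ~{1/t/1e9:.1f} GHz at one traverse per cycle")
# 11 in gives roughly 1 GHz; 1/4 in gives roughly 47 GHz.
```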

  • At least DVD's don't expire.

    ... yet. See: Self-Destructing DVDs: Son of DIVX [slashdot.org]
  • Why is it that I so frequently see the (mis)use of then instead of than.

    Too werds: fonetik spelers.
  • I see a lot of people scoffing at the idea of making a fast computer on a plastic substrate. If you go to Rolltronics' web site and read their product info page [rolltronics.com], you will see that they don't talk about making a full-blown computer anywhere. They talk mainly about display and memory applications. The one small paragraph they have about "Transistors, circuits and semiconductor devices" talks about tiny, flexible ID tags in packages, clothing, etc.

    So before you go dismissing this technology, try checking out what they're really trying to do with it instead of buying the Cringely article.. which seems a bit on the sensationalist journalist side.

  • I can see the headline now:

    Hack your AOL Spamputer to run Linux 6.4!

    If you thought those AOL CD's were bad, imagine getting pre-configured computers in the mail. If it's cheap enough, it'll happen.
  • If computers cost $15, then Microsoft will sell an OS for them that costs $2. By now, we should all realize that MS is extremely responsive to changes in the consumer computing market.
  • Here's a Google cache from the rolltronics site: http://www.google.com/search?q=cache:www.rolltronics.com/Roll2roll.htm [google.com]. I'm sure Google can handle the traffic much better than the original seems to be holding up...
  • Wouldn't the length of traces eventually represent a problem? We're already running into 'speed of electrons' problems with current designs, wouldn't an 8" CPU only magnify the problem, and create a speed limitation?
  • Or, even if it's not rechargeable, by the time a 5-year lithium battery runs out, you're probably ready for a new $15 computer.
  • What happens when the wafers get all crinkled up like my newspapers?
  • Heat is a result of resistance. Bigger circuit components would have a lower resistance and produce less heat. So this computer will probably run cool, but it may still waste power, and at one to two pages per processor it will be slow.
  • something which doesn't require much processing power, but DOES require flexibility and low cost to succeed... Digital books, which we've been hearing about being right around the corner for frickin' ever (but are never good enough to really catch on) have been waiting for this sort of technology.

    But seriously, if it can't run Half-Life, it ain't replacing my notebook.
  • by Ace905 ( 163071 ) on Friday January 19, 2001 @10:59AM (#495138) Homepage
    "Where to start? The circuit size will drive up power usage and heat generation."

    Where to start? Decreasing density leads to better heat dissipation. Changing fabrication materials could mean less heat generation. Size doesn't mean anything so long as no space is wasted; moving outward along the x-axis instead of adding gates upward along the y-axis is equivalent - i.e., building out instead of up.

  • I'd be surprised if there is more than $5 worth of silicon in current chips.

    Sure these guys want to roll out the entire computer, from $15 in plastic, but is there really much more than $15 in raw materials in ur existing puter?

    There's the challenge of designing all the components, and performance questions (what gate length is possible under this process?). I wish them luck, but I'll jump on the bandwagon when it's no longer vaporware.
  • What he was saying (with the whole Longbow/rifle analogy) was that, while the technology may seem to be inadequate/kludgy now, it may not be in the future.

    As for all the 'it won't compete with an X GHz processor' comments - has it occurred to anybody that you could print the pages for the supportable stuff (battery/display), and all 100 pages have a notch dead center where a real processor is dropped in JUUUST before you laminate on the keyboard?

    So then you HAVE your 1 Ghz, $40 laptop.
  • I know what you're getting at, but I'm not sure it holds for everyone. Also, my accent (New Zealand) does not really place 'than' all that close to 'then'.

    I also think it depends somewhat on how your mind works. For example, I read a lot, and I tend to 'see' words, rather than hear them. Perhaps that is why, when I was younger, I used to know what a lot of words meant, and how to spell them, but I would sometimes have, uh, interesting pronunciation of some words.

    -----

  • by LKH ( 168628 ) <lindsay.k.hill@gmail.com> on Friday January 19, 2001 @10:34AM (#495142) Homepage
    Is it an American thing or something? Why is it that I so frequently see the (mis)use of then instead of than. Taco, for one, is well known for doing it, and here we see that a Slashdot reader, who obviously has been around a while, has been sucked into Taco's own version of the English language.

    Enough is enough I say! Bring back the 'a' in than!

    ------

  • I agree that this could be bad in a situation like this - borderline terrible, in fact. However, a disposable computer, like a disposable camera, can also be a good thing. Imagine losing your laptop when the airline loses your luggage (I know, Real Geeks take their laptops as carry-ons, but I digress). Now imagine picking up a Laptopzine at the airport shop for $50 that'll let you communicate with your company while you're on your trip, waiting for your claim with the airline to go through. At the end of your trip, all you have to do is recycle your laptop. You've lost very little productivity, and you've had an interesting toy to play with for a little while.
  • ... . Right now, we are at the knee of the exponential growth curve of the telecommunications market, ...

    There's always been something that's bothered me about this 'knee of the growth curve' phrase. Aren't you always at the knee of the curve? As time progresses, the slope of the future curve is always exponentially steeper than the slope of the curve behind you. Isn't that why exponential growth on a log plot is a straight line?
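
A tiny numerical illustration of that point (pure math, nothing from the article): the relative slope of an exponential is the same everywhere, so there is no privileged "knee" - only the one your axis scale creates.

```python
# The "knee" of an exponential is an artifact of the axis scale:
# the relative slope f'(t)/f(t) of f(t) = e^(k*t) is k at every t.

import math

k = 0.5
f = lambda t: math.exp(k * t)

h = 1e-6
for t in (1.0, 10.0, 100.0):
    slope = (f(t + h) - f(t - h)) / (2 * h)   # numerical derivative
    print(f"t={t:>5}: f'(t)/f(t) = {slope / f(t):.4f}")   # always ~0.5
```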

  • Do you know how they make Hawaiian shirts?

    Printable == Wearable

  • Imagine A beowulf cluster printed on the pages of ... Beowulf.

    Translation or original Old English is up to you.

  • This raises the possibility of an Ebook with actual flippable pages.

    This I like.

  • And the old-line companies like Intel and AMD, which are currently fighting over which is the superior obsolete semiconductor company, well, those outfits go out of business

    Bzzzt Wrong.

    Intel or AMD BUYS our friendly RollTronics and maintain their positions in the new era --or-- they get involved enough in the technology and prove that it, in fact, does not work (in order to protect their $XXX Trillion dollar fab investments)

  • If you're ever stuck in a bathroom stall that's "empty" you'll have to severely double check the piece of scrap paper you use... you might be tossing/flushing your laptop by mistake.

    What a concept.

    Imagine sending THAT back to tech support for repair: "Reason for repair:" Euhh..... "Curry Related Emergency?"
  • While I'm with Cringely on most of what he says in the article, the 15th Century longbow/arquebus (they didn't have muskets then, damn it!) comparison is a poor one. Essentially, firearms underperformed the longbow right up until the middle of the nineteenth century when breech-loading and mass-produced rifles made them faster and more accurate (hand-built muzzleloaders were slow and inconsistent) than the bow.

    The reason that firearms replaced the bow some three hundred years before they were its technical equal was economic: an archer required years of training to develop the accuracy and muscle needed to be any use at all in combat (archers can be identified from their skeletons, having asymmetric bone thickening in their arm bones) and had to be fit and well on the day, as the physical effort required to discharge thirty arrows rapidly is huge.

    A muzzle-loading musket, despite having a 2-metre circle of probable hit at fifty metres range and a rate of fire perhaps a tenth that of a longbow, has a *way* lower ammo cost, a training overhead of perhaps a fortnight (and you can teach it in an afternoon if your student is bright and you don't care if he hits anything), and no great strength or stamina required since the kinetic energy that does the damage doesn't come from the soldier's own muscle.

    The solution to the technical difficulties is to use them en masse, mix them with pikes or give them bayonets for close work and fight on the defensive if at all you can.

    Basically, the longbow was betamax to the musket's VHS...

  • I suspect that the most important use of a technology like this will be to produce flat screen displays - driving the current high cost of HDTV and flat screen monitors way down.
  • Yes, developing new technologies rarely works. That is why we are still in the Stone Age.
  • Let's hope that his analogy isn't too accurate:
    "Say you were an English yeoman in the 15th century. ... Then your boss' boss up in the manor house said everyone had to trade his bow for a rifle. ... Early firearms were so bad they mostly just made noise. The longbow was not only more accurate in the hands of a good archer, it had longer range. But in time, firearms met the accuracy of bows and exceeded them. A 15th century futurist would see that. The trick is in timing when to jump to the new way of doing things."

    The problem with this analogy is that guns didn't exceed bows until about 300 years later (and the yeoman, and his children, and his children's children, etc., hopefully stuck with archery.)

    I'm hoping to see nifty plastic computers sooner than that. But until then, I think I'll hang onto my yew-wood bow, er, semiconductor computer.
    -----
    D. Fischer
  • Last time I checked, most plastics have a large thermal expansion factor. Most modern CPUs have 5 or 6 metal layers of lines to connect things together. Materials are carefully chosen so the silicon, Inter-Layer Dielectric, interconnects, and packaging all expand about the same. Can you say thermal cycling and stress crack failure? Actually, I see this technology being useful in something needing fewer than 3 interconnect layers that are not metal (limiting speed due to resistance), like an active LCD or full-color LED display. I wonder if they can make color LEDs with this stuff. A large, bright color display you could roll out on the wall in the conference room would be neat.
  • I _love_ Nachos too ... and think Cringely is a bit of a dip on this.

    Why the f$ck would anyone think computers are going this way? Smaller is a trend. Wearable is a trend. Remote processing is a trend. All of which can be pushed to utterly ridiculous limits within the next decade! This Rolltronics seems more like a scam, especially with lines like "This is a multi-billion dollar opportunity."

    My wristwatch has more processing power than the first computers. /. ran an article on a wristwatch that runs Linux. The ultimate CyberGeek I know had LCD glasses (prototype), Nintendo gloves, and a book-sized unit that made Xybernaut look archaic. Not to mention a full-time wireless hookup to the net. While Cringely discounts 'incremental' changes, in ten years that's going to be reduced down to contacts, a wrist wrap (nerve sensors), and something the size of a pager. Hopefully running on an ethanol fuel cell.

    For my $.03 CDN, the reversible switch is probably a better bet, as it allows 3D 'chips', without the heat problems. Quantum is still a ways off. And Rolltronics is going nowhere.

  • by Leknor ( 224175 ) on Friday January 19, 2001 @10:20AM (#495172)

    Cringely says that the battery will be integrated into the stamping procedure. This could be _really_ bad, in my opinion, because once the battery runs out, so would the "computer".

    Let's say you pay for this month's Wired and it comes via a wafer-computer. You read it and enjoy the interactive articles and eye candy. Life seems that much cooler.

    Next month you want to re-read that article. Too bad - the battery is dead. Now you gotta pay for last month's issue again.

    This seems like too much control over content I paid for. We are already bitching about DVD region encoding. At least DVD's don't expire.

    Leknor

  • Assuming they can print at 300dpi (which I believe is high for mass printing) on 8inx11in media

    Commercial-grade printing starts at about 1200 dpi and goes up to around 2400 dpi fairly inexpensively. Assuming similar characteristics of absorption and viscosity with the materials being contemplated here, the actual print density at 1200 dpi would be more like:

    1.2672x10^8

    Or roughly 127 million dots per page.
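
The same dots-per-page arithmetic at the print densities quoted above (1200 and 2400 dpi are the comment's figures; whether circuit printing could actually hit those densities is an open question):

```python
# Dots per 8in x 11in page at the resolutions quoted in this thread.
# These are printing-industry figures; a circuit feature may need many dots.

width_in, height_in = 8, 11

for dpi in (300, 1200, 2400):
    dots = dpi * width_in * dpi * height_in
    print(f"{dpi:>4} dpi: {dots:>13,} dots per page")
# 1200 dpi gives 126,720,000, matching the comment's "roughly 127 million".
```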

  • Actually, Cringely acknowledged that you couldn't replace current high-end silicon with this stuff overnight. That was the whole damn point of the bow-gun analogy he gave.

    Obviously, any technology with lower performance and lower price will start in the lower-end segment (wow!). In which segment you make money, which allows for R&D on reducing print size, which allows you to move up the chain, which makes more money, which allows more R&D, which allows you to move up the chain...

    If this is fundamentally cheaper than silicon chips, then it'll probably first eat the markets in which Z80/6502-class chips are still being sold (2-12 MHz, 5k-6k transistors), then the 8088/8086 processors (5-12 MHz, 30k transistors), then the 68000 class (68k transistors).
