The Internet

The Microphotonics Revolution 49

MycoMan writes "Interesting article about photonic switching research, but there's a sentence in it that reads: 'So far, communications systems have managed to keep up because the volume of phone calls, Web pages and videostreams that optical fibers can carry is doubling every nine months, thanks in large part to the ability to squeeze more wavelengths of light into each fiber.' Doubling every nine months - is this really true?" True or not, it's an interesting article. Enjoy.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by softsign ( 120322 ) on Sunday July 02, 2000 @01:50PM (#962617)

    These guys aren't saying they're going to build a motherboard with fibre-optic cables. They're talking about a silicon fabrication process that is used to build waveguides (micro-fibre-optic cables, if you will) into the silicon wafer itself.

    It's not the speed of light that matters here (the speed of an electrical signal is virtually identical to a light signal) - it's the switching speed. Even with the best CMOS processes out there today, there is still a finite switching time - the time it takes a transistor to go from one logic level to the other - that presents a barrier to the maximum available processing speed of the chip. With decreasing size and voltage you can improve the speed of the chip, but there's only so far you can go.

    These people are exploring the likelihood that you may be able to build something analogous to a transistor that acts upon photons instead of electrons.

    If they can succeed in making these feasible - then you have a technology that is potentially 1) faster than CMOS and 2) much more efficient.

    That is huge. It's not just a frivolous new motherboard with lots of unwieldy wires built into it. It would be a one-piece integrated design that would in all likelihood run very cool and perhaps even faster than microelectronics ever will.

    --

  • No, I'm not all that great at math now that you mention it :), but I still see a pattern. Logarithmic? Exponential? I'm not sure, I lack the vocabulary to explain what I see.

    And I'm not saying that one is causing the other, this has nothing to do with the astrologer comment or seeing patterns everywhere a la Pi [slashdot.org] or anything like that. But when a pattern is noticed and a possible correlation is suggested, I think it's worth asking if there is a reason for it. Maybe there isn't -- that's very possible. But maybe there is -- that's very interesting.

    I'm just curious if anyone else has noticed this and put forth an explanation or refutation of the link.



  • These guys say communications traffic is doubling every nine months.

    Moore's law says that computing power doubles every eighteen months

    The two seem to be moving at proportional rates. Interesting coincidence. Anyone wanna speculate about reasons for this?

    How about, innumerate people attach significance to any dumb pattern they see? Tell me, what astrologer do you go to?
  • What's the point of a photonic motherboard?

    Actually there is a point in a photonic motherboard, but it is not related to this story in any way.

    This story talks about advances in optical switching technology. At present, high frequency optical signals need to be converted to electrical signals before being routed. But high frequency electronics are very expensive so it would be much better if it was possible to do the switching without converting from optical to electrical and back to optical.

    Philippe
  • by hrath ( 5792 ) on Sunday July 02, 2000 @05:01PM (#962621)
    Just a sidenote, UUNET's total backbone bandwidth has been increasing by 800-1000% every year since about 1994...

    In San Jose at Broadbandyear John Sidgemore (Vice Chairman, MCI WorldCom & UUNET) said in a keynote:

    "Bill Gates thinks that bandwidth should be free, of course we believe software should be free."

    Heiko - who works for WCOM, but is not a mindless drone
  • by cperciva ( 102828 ) on Sunday July 02, 2000 @05:17PM (#962622) Homepage
    Gilder's law states that bandwidth increases by a factor of three every year. This means doubling every 7-8 months.

    This, compared with Moore's law, has interesting consequences; among them the fact that, as time goes on, processing power becomes the relatively expensive resource while bandwidth becomes cheap. This is reflected in the differences between IPv4 and IPv6: while IPv4 has data fields tightly packed together, IPv6 spaces them out in a manner designed for easy access by software. While IPv4 optimizes bandwidth, IPv6 optimizes computational power.
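
    For anyone who wants to check the arithmetic, here is a rough sketch in Python (it assumes only the 3x-per-year and 18-month figures quoted in this thread, nothing measured):

    # Sketch: doubling times implied by the growth rates quoted above.
    # Assumes bandwidth triples every 12 months and processing power
    # doubles every 18 months -- the figures from this thread, not data.
    import math

    bandwidth_doubling_months = 12 * math.log(2) / math.log(3)  # ~7.6 months
    processing_doubling_months = 18.0

    print(f"bandwidth doubles every {bandwidth_doubling_months:.1f} months")
    print(f"processing power doubles every {processing_doubling_months:.1f} months")
    # The bandwidth/processing ratio therefore grows roughly as
    # 2 ** (t / 7.6 - t / 18) after t months.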
  • In a way you could say that we're already past that point. Network congestion is a relatively common problem on the internet. Just try to download something sizeable on a Saturday evening. (Or is that just my ISP?)
  • Oh man, you have no idea how much that upsets me. As a resident of Australia, and under the large thumb of Telstra, I'm stuck on a 56k modem link, and will be for quite some time. Cable's available to some people in other cities (God forbid we get this technology in the capital), ISDN is still incredibly expensive, as is frame relay and anything else at all better than a modem.

    God help us all really, and you Americans be thankful!

    Gfunk
  • I know you're more or less trolling for a laugh, but the speed of your personal net connection is totally irrelevant.

    What the article is pointing out is that the total amount of traffic flowing over the web doubles every nine months. That means that if we assume everyone on the net has a 56k connection and will never have a faster one, there will be twice as many people, and hence twice the volume (assuming everyone surfs the same amount) over the internet backbones.
  • How in the world did this paranoid delusion (old term for troll) get Score: 2???? The facts are that commercial technology is moving faster than state sponsored technology...as you might imagine for anything run by bureaucrats.
    For example, in the mid 80's, the Dept of Defense thought it needed to sponsor Very High Speed Integrated Circuits (VHSIC). The initial briefings talked about achieving 25 MHz and calling all such chips "munitions" and forbidding their export as a threat to national security. But as usual for DoD projects that try to compete with commercial, Intel had 100 MHz Pentiums before VHSIC had much more than some toy bus interface chips. Pentiums easily met the munition standard.
  • A switch in any network is a box that makes and breaks connections between the wires that run in and out of it. These days, big busy switches use optical fiber because of the sheer volume of data (noise) at their level. So the real deal is we can move from the bad old way of a) convert light to electrons, b) use electrons to make a path between the incoming light fiber and the outgoing light fiber, and c) convert back to light, to the brand new way of directly guiding the photon where we want it.
    In terms of development, it beats the hell out of vibrating mirrors.

  • An electron is a particle with mass and electric charge. A photon is a massless particle that is the carrier of the electromagnetic field. Both electrons and photons produce interference patterns, the wave-particle duality you mentioned, in double slit experiments. Electric current is a flow of charge carriers, which are not necessarily electrons, but could be ionized atoms or molecules. Electric current is not the same thing as electrical energy. Electrical energy is electromagnetic waves (photons).

    I found a web page, "ELECTRICITY" MISCONCEPTIONS IN TEXTBOOKS [amasci.com], that does a good job of explaining the difference between electric current and electric energy.

  • Interesting, but wrong. So is the other observation.

    Ok, we will start with 2 bandwidth and 2 processing power. The units don't matter; it's the numbers that count.

    For every doubling the processing power makes, the bandwidth makes two. Look:

    Bandwidth --- Processing Power --- Time
    2 --- 2 --- 0 months
    4 --- 2 --- 9 months
    8 --- 4 --- 18 months
    16 --- 4 --- 27 months
    32 --- 8 --- 36 months
    64 --- 8 --- 45 months
    128 --- 16 --- 54 months

    So, relative to bandwidth, processing power falls by 50% every 18 months.
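
    For the curious, a few lines of Python regenerate the table (a sketch assuming only the 9-month and 18-month doubling periods used in this thread):

    # Sketch: bandwidth doubles every 9 months, processing power every 18.
    for months in range(0, 55, 9):
        bandwidth = 2 * 2 ** (months // 9)
        processing = 2 * 2 ** (months // 18)
        print(months, bandwidth, processing, processing / bandwidth)
    # The processing/bandwidth ratio halves every 18 months, as claimed.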
  • Actually, switching speed is becoming less and less important in chip design. Today's technology, 0.25 micron feature size and smaller, has pushed line widths down and die size up to the point that it takes more than one clock cycle to move a signal from one side of the die to the other... because the fine lines have a lot of resistance and silicon has a lot of capacitance. Which explains the push for copper interconnects (Intel's Coppermine CPU, and others): copper has less resistance than aluminum.
    This speed limit is so significant that it drives CPU architecture towards the Itanium (wide instruction word) and Transmeta Crusoe (software scheduling) and away from superscalar/multiscalar, all because it's too difficult to design hardware that coordinates pipeline stalls all over the die.
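
    A rough feel for why copper helps, using textbook bulk resistivities (a sketch only; real on-chip values depend on geometry, grain structure and barrier layers):

    # Sketch: interconnect RC delay scales with the metal's resistivity
    # if the capacitance is held constant. Bulk resistivities in ohm-metres:
    rho_aluminum = 2.65e-8
    rho_copper = 1.68e-8

    improvement = 1 - rho_copper / rho_aluminum
    print(f"copper cuts wire resistance (and RC delay) by roughly {improvement:.0%}")
    # About a third, before liner and barrier effects eat into the gain.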
  • "new hardware has often been developed and used secretly by states"

    I may have been misinformed, but as far as I know the CDROM was invented in the Netherlands by Philips electronics.

  • Good luck finding any detailed information on photon based computing. You will eventually end up hitting dead links just as the information starts to get interesting. And you ask why should this be so?

    You can divide electronics and computing into three generations of hardware platforms. Vacuum tubes in the 30's and 40's, discrete transistors in the 50's and 60's, and finally integrated circuits from the 70's to present. Each evolution in hardware platform brought a huge infusion of capital into the computing and electronics industry. Each new generation also brought about huge increases in hardware efficiency and speed.

    Now what if the powers that be made optical technology available to the public? You would expect that these new computers would be much faster than silicon-wire and be very efficient.
    No B fields and the ability to run perhaps tens of thousands of circuits in the same space. The paranoid might suggest that allowing the sale of such a machine would be a threat to national security, who knows. If history is any guide new hardware has often been developed and used secretly by states for a decade before the general public even hears about it. When did you first hear about the transistor or the CDROM?
  • http://www.cd-info.com /CDIC/History/Pioneers/CDPioneers.html [cd-info.com].

    Sony and Philips, according to that, but there is no detailed information. A Japanese company and a Dutch company. Not exactly "secretly developed and used by the states for a decade", is it?

  • All the more reason to develop photonic chips.

    The propagation losses should be significantly lower... and if you can create a switching device that's efficient enough, you would be able to drive signals as far as you need (or at least as far as the "pin").

    Of course, there are probably a zillion other issues that aren't occurring to me.

    --

  • Just for clarity... the originating post in this thread is asking about photonic motherboards, which are briefly referenced in the article:

    Photonic wiring between transistors could make faster chips, while photons zipping between chips could turbo-charge your computer's motherboard.
    [snip]
    Engineers at companies such as Honeywell, Sun Microsystems and IBM, as well as at universities around the world, are already testing arrays of light-emitting diodes and laser beams that could serve as the "bus" that transports information across a motherboard from microprocessors to memory chips to display screen and back.

    --

  • The links to battelle.org seem to be working today.
  • Gilder, Moore... I remember when you had to discover a pretty clever fold in the universe to get your own law. Heck, Pythagoras only got a "theorem." These days all you have to do is a little algebra.

    Erik's law: My hair seems to double in length every three weeks. I wonder what would happen if I didn't cut it at that point.

    -Erik
  • Sorry, but this is completely wrong. Electrons are waves and particles like any other electromagnetic wave, including light, which is a wave and a particle. It makes no difference if it's an electron or electromagnetic wave (they are the SAME thing - it's called wave-particle duality, and I am pretty sure they teach that in grade 10 science class).
  • "Where do you want the truth to go today?" (TM)

    :)
  • This is exactly why I bought shares in Gooch & Housego, a UK company that is developing a photonics switch; however, this is not a shares board, so I'll leave it at that.
  • by dustpuppy ( 5260 ) on Sunday July 02, 2000 @01:42PM (#962641)
    Every year we keep creating better and better technologies which expand our available bandwidth for communications. Will we ever reach a point when the amount of data that we want to transmit saturates our communication bandwidth?

    With 'everything' being networked and everything talking to everything else in the near future is it conceivable that our advances (such as the optical switching) won't keep up with the growth in transmitted data?

    Could it reach the point where we have communication restrictions (like water restrictions :-), e.g. only being allowed to send emails on odd days, or no emails over 3k in size? :P

  • You are breathing my oxygen. Please die and free up some air. Thanks. Damn KIDZ!
  • Recently, Corning, Inc. [corning.com] (mentioned in Fairley's article) announced that they will increase production at their Erwin, NY facility... by 700 jobs and $50 million. Read the press release here. [corning.com] Corning expects their photonics division to increase

    One personal connection for me is that this plant is literally just down the street from my summer job.

  • Oh Mr. Two-line-pseudo-witty-wasting-my-time-replying anonymous pussy. I'm done with this thread.
  • you guys are all just a bunch of mac users.
  • by zaugg ( 87876 ) on Sunday July 02, 2000 @02:00PM (#962646)
    Well, as well as fattening the pipes (and, hence, the "valves"), we can always lay down more of them.

    If the world travels down a purely client/server model (read: dot net... well, not quite), the bandwidth requirements grow _very_ quickly. Go down a distributed path, and local traffic stays local, and the network grows strong :)

    Seems like the latter path is more sustainable, and more elegant.

    zaugg

  • Yep... in a cardboard box on 67th street with your mom and a jar to shit in. She's into scat porn, by the way.
  • What's the point of a photonic motherboard?
    [snip] I doubt that photonics will ever surpass the good old copper wire as a means of data transmission for most people.

    Well, the "for most people" part is right, at least for now. Note this from the article:

    Initial applications for these systems will be in high-end computing. Honeywell, for example, is using optical interconnects to link microprocessors, creating compact, powerful parallel computers.
    The article also says that the electrical engineers "won't begin to exhaust improvements to metal interconnects by 2008." While it still remains to be seen whether or not optics can replace electronics on our buses (and possibly CPUs), time seems to be running out on the circuit technology we're used to. The hardware companies will find a route to the next breakthrough, just like they always do.

    Is it just me, or is anyone worried that some key patents in this field could hold the future of computing innovation hostage by two or three viciously greedy companies?

  • "(the speed of an electrical signal is virtually identical to a light signal)"

    An electrical signal travels at about 2x10^8 m/s through copper wire, while the speed of light is 3x10^8 m/s in free space.

  • What's the point of a photonic motherboard?

    Well, for one thing, with a photonic based system you don't have to worry about electromagnetic interference between adjacent data tracks. That's one of the main stumbling blocks to compressing the size of circuits at the moment - at some point the individual tracks start behaving like capacitors relative to each other [that's all a capacitor is - two metal plates with a small gap between them].

    With light beams on the other hand, you can even have the beams shine through each other, and it won't have any effect. That in itself would be very useful in designing circuits. What today takes several layers could be compressed onto a single layer.
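
    To put a rough number on the "two plates with a gap" picture, here is a crude parallel-plate estimate of the coupling between two adjacent tracks (the dimensions and dielectric constant are illustrative round numbers, not real process parameters):

    # Sketch: treat two adjacent tracks as a parallel-plate capacitor,
    # C = eps0 * eps_r * A / d.  All dimensions below are made up.
    eps0 = 8.854e-12      # F/m, permittivity of free space
    eps_r = 3.9           # roughly SiO2
    length = 1e-3         # 1 mm of parallel run
    thickness = 0.5e-6    # 0.5 micron track sidewall facing its neighbour
    gap = 0.25e-6         # 0.25 micron spacing

    capacitance = eps0 * eps_r * (length * thickness) / gap
    print(f"coupling capacitance ~ {capacitance * 1e15:.0f} fF per mm of parallel run")
    # Tens of femtofarads per millimetre in this toy geometry, and it only
    # grows as the gap shrinks -- which is the crosstalk problem above.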

  • by softsign ( 120322 ) on Sunday July 02, 2000 @08:09PM (#962651)
    If you can make a usable piece of hardware using only free space, sign up for the Nobel Prize, my friend. =)

    In crown glass (I don't really want to figure out the velocity in silica fibre and can't seem to find it quickly), light travels at roughly 66% of its free-space velocity. This is, indeed, very close to 2 x 10^8 m/s.
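
    The 66% figure falls straight out of the refractive index, v = c/n; a two-line check (using commonly quoted indices of about 1.52 for crown glass and 1.45 for fused silica):

    # Sketch: phase velocity of light in a medium, v = c / n.
    c = 3.0e8  # m/s, speed of light in free space
    for name, n in [("crown glass", 1.52), ("fused silica", 1.45)]:
        v = c / n
        print(f"{name}: {v:.2e} m/s ({v / c:.0%} of c)")
    # Both come out near 2e8 m/s -- about the same as a signal in copper.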

    --

  • Is it just me, or is anyone worried that some key patents in this field could hold the future of computing innovation hostage by two or three viciously greedy companies?

    Not particularly... this type of stuff has happened before... Jack Kilby (TI) and Robert Noyce (Fairchild, later Intel) both applied for patents on the integrated circuit at about the same time (1959) and eventually settled their disputes by cross-licensing each other's technology.

    Does TI dominate the world market in ICs today? Not really. Fairchild is still around but I don't think anyone would say they "control" the world's supply of integrated circuits.

    The important thing here is that there are many different companies working in parallel on this next generation of technology. Agilent, Lucent, Nortel et al are competitors. They each want to be first to market with this stuff. And when they are first to market, they want to be entitled to collect the rewards on their considerable investment.

    This is exactly why the patent system exists. To reward innovation.

    So what will Agilent and Lucent do if Nortel is first? Find a rock to crawl under and die? Hell no... they'll develop their own processes.

    Healthy competition is what's needed to ensure the public benefits from the technology.

    I don't think the public is served in any way by refusing patents to these companies. Without patents, companies will viciously guard their secrets and forward progress is slowed considerably. With a patent in effect, others can see what one company has done and come up with novel new ideas that 1) circumvent the patent and are 2) well, novel new ideas. =)

    Patents are not inherently bad. A microphotonic switch is not exactly as obvious as "one click buying". =)

    --

  • I'm not a physicist, but I think that it is electromagnetic waves, not electrons, that travel at a significant fraction (velocity factor) of the speed of light in an electronic circuit. The electrons themselves actually drift very slowly.
  • *laugh*

    This fortune was at the bottom of the page when I refreshed this thread:

    patent: A method of publicizing inventions so others can copy them.

    --

  • it's pointing out that the capacity that the network CAN carry has been doubling every nine months.

    .oO0Oo.
  • I don't get it. Can someone explain to be why photons rushing down waveguides on a chip would be orders of magnitude faster than electrons rushing down wires on a chip?

    I can see that it would be somewhat faster, as you don't have to contend with electrical resistances and so forth, and the information carrier is travelling at the speed of light (by definition) - but electrons also travel at a sizable fraction of the speed of light.

    What characteristics of photonic computing am I missing that would make it oh so much faster than what we have now?

    -josh
  • Well, if photonics results in chips running cooler, where will I do my cooking? At present I use my Pentium 3 and it's as good as a microwave. I was looking forward to using the Pentium 4 for outdoor barbecues.
  • But then again, a photonics chip could cook it (light)ly and not burn it.
  • Somehow I neglected to finish my last sentence in the first paragraph: Corning expects their photonics division to increase by 80% in the next year.
  • Bandwidth limits *are* a huge concern. As an ex-employee of a national service provider, I have firsthand experience that the need for bandwidth is giving providers hot flashes as they attempt to order and provision the "fat pipes" they need with enough lead time to prevent saturating their network.

    Not only are individual providers having problems meeting customers' needs, but the peering points (NAPs, MAEs) are having trouble keeping data flowing between the disparate networks because their switches can't handle the amount of traffic. Do a search for "MAE" or "NAP" plus "outage" and see how many are switch-related.

    Some of the larger networks have partnered with, or ARE, the actual wire providers (Qwest, etc) so they can actually provision the pipes fast enough to meet demand, but the companies that have to lease the OC3s and OC12s (and fatter) are running into provisioning delays. (There have been lawsuits due to 'conflict of interest' problems. WorldCom was really bad about that. They could provision a new line for their internet in days; for us they said it would take months.)

    The bottom line is, internet traffic is increasing rapidly because:
    1. More people are using it. More AOLers, more Earthlinkers, etc, plus everyone wants a dot.com to run their flower or plumbing business.
    2. Some people are using it more. Cable modems, ADSL, and ISDN have become more affordable and more widely available. Think warez and mp3s.
    3. Overhead. What most people don't realize is that a GOOD CHUNK of internet traffic at the level they can't see is overhead, retries, fragments, etc. The CEO of my nameless ex-employer made a rough guess that up to 30-40% of internet traffic could be reduced by tidying up certain protocols, configuring equipment PROPERLY, eliminating MTU mismatches, cleaning up and trimming routing announcements, etc. Some of this is also a result of crowding at the NAPs/MAEs causing packet fragmenting, busy webservers causing end users to retry, DoS attacks by petulant teens at other petulant teens, etc. (non-productive traffic). A rough sketch of how an MTU mismatch inflates the byte count is below.
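
    As a crude illustration of the MTU-mismatch point (a sketch only: the 20-byte figure is the minimum IPv4 header, the packet sizes are made-up examples, and the 8-byte fragment-alignment rule is ignored):

    # Sketch: extra bytes on the wire when a packet is fragmented because a
    # downstream link has a smaller MTU than the sender expected.
    import math

    IP_HEADER = 20  # bytes, minimum IPv4 header

    def wire_bytes(payload, mtu):
        """Total bytes sent for one payload over a link with the given MTU."""
        per_fragment = mtu - IP_HEADER
        fragments = math.ceil(payload / per_fragment)
        return payload + fragments * IP_HEADER

    payload = 1480                     # fills one 1500-byte Ethernet-sized packet
    clean = wire_bytes(payload, 1500)  # no fragmentation
    messy = wire_bytes(payload, 576)   # mismatched downstream MTU
    print(f"header overhead: {clean - payload} bytes vs {messy - payload} bytes, "
          f"{(messy - clean) / clean:.1%} more traffic for the same data")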

    Wavelength division multiplexing and more and fatter pipes help. I think routing and addressing are a concern, but not right away. Private interconnects help with the NAPs/MAEs, although this tends to help the larger providers more than the smaller ones.

    Personally, I think we don't have much to worry about. The problems we do have are being solved by very smart engineers who come up with outrageous new equipment that outperforms the old equipment, or they come up with ingenious workarounds to the problems.
  • Hey, this is Slashdot, nobody cares if anything is true.
  • What's the point of a photonic motherboard? You're talking about a transmission distance of maybe a few miles if you stretched the wires out and laid them end to end. With such short distances, what's the point of moving a signal at .9c versus .7c. Also, all the fiber optics that I have worked with have been a total pain in the ass. I doubt that photonics will ever surpass the good old copper wire as a means of data transmission for most people.
  • I think the nine months thing refers to the speed of *corporate* lines. The stuff the big guys use. While you may not care how fast Qwest employees can download porn, it DOES affect us in the long run. The more lines dropped, the more likely the lines are to go public when they need to get funding for new lines when they have to upgrade.

    Still, with speeds (in MBps) higher than most users' IQs, it's disappointing to see T1s still priced so high.

  • by babbage ( 61057 ) <cdeversNO@SPAMcis.usouthal.edu> on Sunday July 02, 2000 @02:28PM (#962664) Homepage Journal
    These guys say communications traffic is doubling every nine months.

    Moore's law says that computing power doubles every eighteen months

    The two seem to be moving at proportional rates. Interesting coincidence. Anyone wanna speculate about reasons for this? Just a glitch of the numbers, or does one have something to do with the other? How far back does this growth in communications speed go? Moore's law is claimed by some to go back in some way to the beginning of the Industrial Revolution, and with the telegraph and such I don't think it's impossible to speculate that communications has been doing the same thing.

    So. Anyone care to put forth a hypothesis to explain this coincidence?



  • uh... whatever
