Paint-on Laser Brings Optical Computing Closer
holy_calamity writes "New Scientist has a story about a laser made by painting a solution of semiconductor crystals onto glass. It could be used to break the interconnect barrier by providing optical interconnects; the interconnect barrier threatens Moore's law unless a faster way of connecting chips is found."
Re:Whaaah? Maxwell 101 (Score:3, Informative)
The travel itself, no. The wavefront of "pressure" moving along the path of the electrons, yes. The electrons themselves drift at only around 1-10 cm per hour (depending heavily on current and wire diameter).
But the wave still only travels somewhere between roughly 50% and 99% of the speed of light, depending on the conductor's geometry and the surrounding dielectric.
Does the difference there really matter all that much? For long-distance communication, sure. But for chip interconnects? Doubtful.
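A back-of-envelope check supports the skepticism: even at a pessimistic 0.5c for the electrical signal, the extra delay across a chip-scale link is tens of picoseconds. The 2 cm link length and 0.5c figure below are illustrative assumptions, not numbers from the thread.

```python
# Compare one-way propagation delay across an assumed 2 cm chip-scale link
# for a free-space optical signal (c) vs. an electrical signal slowed to 0.5c.

C = 299_792_458.0          # speed of light in vacuum, m/s
DISTANCE = 0.02            # assumed interconnect length: 2 cm

def delay_ps(velocity_fraction):
    """One-way propagation delay in picoseconds at a given fraction of c."""
    return DISTANCE / (velocity_fraction * C) * 1e12

optical = delay_ps(1.0)    # optical signal at full c
electrical = delay_ps(0.5) # electrical signal at a pessimistic 0.5c

print(f"optical:    {optical:.1f} ps")
print(f"electrical: {electrical:.1f} ps")
print(f"difference: {electrical - optical:.1f} ps")
```

The gap is well under a tenth of a nanosecond, so for short on-chip or chip-to-chip distances the raw propagation-speed difference is indeed a minor factor.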
Re:Whaaah? (Score:5, Informative)
Yes. Inductance slows signals down, and electrical traces can't touch each other, so they have to be routed around one another; laser beams, by contrast, can pass through one another with no interference. So the traces can be more direct and hence faster. Finally, the scale of components in a processor has gotten small enough that individual traces interfere with one another inductively and at the quantum level; neither happens with light.
Quantum computers complement digital ones (Score:5, Informative)
Shor's algorithm for factoring numbers could be used to rapidly crack RSA encryption. http://en.wikipedia.org/wiki/Shor's_algorithm [wikipedia.org]
Grover's algorithm can be used to search an unsorted database in O(√n) time. http://en.wikipedia.org/wiki/Grover's_algorithm [wikipedia.org]
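To make the O(√n) claim concrete, here is a small classical statevector simulation of Grover search, a sketch of my own construction rather than anything from the linked articles. With N = 16 items, about π/4·√16 ≈ 3 iterations suffice to make the marked item overwhelmingly likely on measurement.

```python
# Classical simulation of Grover's algorithm on a tiny search space.
import math

def grover(n_qubits, marked):
    """Return the probability of measuring `marked` after the optimal
    number of Grover iterations on n_qubits qubits."""
    N = 2 ** n_qubits
    # Start in the uniform superposition: every amplitude 1/sqrt(N).
    amps = [1 / math.sqrt(N)] * N
    iterations = math.floor(math.pi / 4 * math.sqrt(N))
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amps[marked] = -amps[marked]
        # Diffusion operator: invert every amplitude about the mean.
        mean = sum(amps) / N
        amps = [2 * mean - a for a in amps]
    return amps[marked] ** 2

p = grover(4, marked=11)   # N = 16, so only 3 iterations are needed
print(f"P(marked) = {p:.3f}")
```

A classical linear scan would need on the order of N/2 probes on average; the quadratic speedup shows up as the √N iteration count in the loop above.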
Speed increase (Score:5, Informative)
The article and summary seem to be a bit misleading and vague about how the speed increase arises. The great benefit of optical computing is that it allows the signals to get much, much closer together than electronic circuits, and as such allows more compact circuits, which as we know generally means faster. Interestingly, electronic signals in wires and optical signals in fibers have roughly identical upper speed limits (light in free-space optical computers is faster, but also almost impossible to do anything useful with), so it's the density which is the major factor.
Electrons are charged, so as you squeeze transistors closer together, the wires get thinner and closer together, and you get cross-talk and interference between them. Photons, however, hardly interact at all, so you can have many beams in the same space, and there's very little heat to be dissipated. Multiple frequencies can also be used, resulting in massively parallel computing (another Good Thing).
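The "multiple frequencies" point is essentially wavelength-division multiplexing. The toy below is my own illustration, not from the comment: independent bit streams ride on distinct carrier frequencies in one shared channel, and each is recovered by correlating against its own carrier, since integer-frequency carriers are mutually orthogonal over a full window.

```python
# Toy wavelength-division multiplexing: several bits share one channel,
# each on its own carrier frequency, and are recovered independently.
import math

SAMPLES = 256

def carrier(freq):
    """One carrier: `freq` whole cosine cycles across the sample window."""
    return [math.cos(2 * math.pi * freq * t / SAMPLES) for t in range(SAMPLES)]

def modulate(bits_per_channel):
    """Sum one carrier per channel, scaled by that channel's bit (0 or 1)."""
    signal = [0.0] * SAMPLES
    for freq, bit in bits_per_channel.items():
        for t, c in enumerate(carrier(freq)):
            signal[t] += bit * c
    return signal

def demodulate(signal, freq):
    """Correlate with the carrier; a large response means the bit was 1."""
    corr = sum(s * c for s, c in zip(signal, carrier(freq)))
    return 1 if corr > SAMPLES / 4 else 0

tx = {5: 1, 9: 0, 13: 1}                       # three channels, three carriers
rx = {f: demodulate(modulate(tx), f) for f in tx}
print(rx)
```

Every channel comes back intact even though all three occupy the same "wire" simultaneously, which is exactly what electrical traces cannot do.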
There are still downsides to optical computing: photons cannot (easily) be stopped and stored, meaning any kind of useful computer in the near term is likely to be some sort of electro-optical hybrid, with photons carrying signals and electrons storing them.
Re:Quantum computers complement digital ones (Score:2, Informative)
However, a famous physicist/mathematician (whose name escapes me right now) proved that to emulate a quantum computer on a digital one will always require exponential complexity. So the benefit of speed is lost, but for the sake of curiosity and development, implementations of quantum algorithms can, at present, be tested. What we need now is the hardware. 8)
Re:Whaaah? (Score:3, Informative)
What the hell do imperfections have to do with it? Nothing with mass can move at the speed of light. You seem to be suggesting that if the conductor was perfect, the electrons could move at the speed of light. What sort of crazy talk is that?
Anyway, the electrons have a net speed on the order of just millimeters per second. However, changes in the electric field caused by the motion of the electrons can propagate through the conductor much, much faster.
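The drift-speed claim is easy to sanity-check with v = I / (n·e·A). The 1 A current and 1 mm² cross-section below are assumed values; the free-electron density is the standard figure for copper. The result lands around a tenth of a millimeter per second, i.e. tens of centimeters per hour, consistent with the order of magnitude quoted upthread.

```python
# Order-of-magnitude check on electron drift speed in a copper wire:
# v = I / (n * e * A)
E = 1.602e-19       # elementary charge, C
N_COPPER = 8.5e28   # free-electron density of copper, electrons per m^3

def drift_speed(current_amps, area_m2):
    """Mean electron drift speed in m/s."""
    return current_amps / (N_COPPER * E * area_m2)

v = drift_speed(1.0, 1e-6)   # assumed: 1 A through a 1 mm^2 wire
print(f"{v * 1000:.3f} mm/s  ({v * 3600 * 100:.1f} cm/hour)")
```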
Re:Quantum computers complement digital ones (Score:1, Informative)
His name was Richard Feynman. He sorta founded the subject of quantum computing: he was interested in modelling quantum physics on a computer but found this to be computationally expensive on a classical computer, and thus invented the notion of a quantum computer.