Science

Light-Based Computers Using Quantum Principles 83

Maddog2030 cites a story at Science Daily, writing: "Here's an interesting twist on all the news about quantum computing: a computer that runs similarly to a quantum-based computer, except it runs on light at similar speeds for particular tasks. It also rids itself of the many complications introduced by quantum computing."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward
    "Wow, Fred... a new computer? Why did you trade in the red one for the blue one?"

    "I didn't... try walking toward it."

  • by Anonymous Coward
    A factor of a billion is inconsequential when considering 256-bit keys; after all, a billion is less than 30 bits, so it's equivalent to cracking a 226-bit key on a conventional computer.
  • by Anonymous Coward
    The TWINKLE device by Prof. Shamir can be used to factorize relatively large numbers.

    http://www.simovits.com/archive/twinkle.pdf

    And to all the sceptics out there: there are results showing that QC can achieve better results than standard computation models (factoring is not one of them, as we don't know it's hard). We've only just moved from understanding quantum bits to having a few interesting algorithms. Give it a few more years for digestion and we'll know much more about what can and cannot be done with "non-standard" techniques.

    Can't wait for the light-based Tetris.
  • by Anonymous Coward
    This can't be used to crack RSA, and it's not a general method of algorithmically running through a large number of possibilities concurrently, which we get with quantum computers.
    Well, technically, we don't get *anything* with quantum computers because they DON'T EXIST. These are both technologies that have yet to be developed into anything beyond a lab experiment. It's like comparing teleportation with faster-than-light travel.
  • by Anonymous Coward
    Word is the U.S. Military is working on a "super quantum" computer that sends its instructions faster than the speed of light. This way, in the event of a nuclear attack, we can hit a button that stops incoming missiles by blowing them up before they are ever launched.
  • by Anonymous Coward on Wednesday May 16, 2001 @05:55AM (#219793)
    Because it's cheating. First of all, it has got nothing to do with quantum computing. Searching a database in parallel is only one aspect of quantum computing. Other aspects, like Shor's algorithm, depend totally on entanglement, which is something completely different from interference. It's even worse: Shor's algorithm uses the fact that the combination of entangled states and partial measurements results in the destruction of interference. The second reason is that it will take you a rather long time to find out which of the 50 frequencies is altered. You have to do an interference measurement, which involves measuring light intensities with a moving detector. This is slower than a classical computer.
  • I don't think it's as bad as you make it out to be.
    1) Quantum speeds? WTF is that? There's no such unit, not even associated with quantum computing.
    If I say my car can operate at racecar speeds, am I implying that there's a unit called the "racecar"? Of course not. "Quantum speeds" is shorthand for the relative speed of quantum search algorithms when compared to classical algorithms.
    2) The device "mimics quantum interference". No, it's light; it displays quantum interference. Light is photons, quantum particles. Dur.
    Yes, light is composed of quantum particles. So are the electrons in my PC. That doesn't mean I have a quantum computer.

    The device in question works using interference of light waves, interference which can be described using a classical wave-only description of light: i.e. light can be treated as a wave propagating in a 3-d field of real-valued vectors that describe the magnitude and direction of E and H at each point in space.

    This is in contrast to quantum interference, which generally involves the superposition of state vectors in a Hilbert space (of complex valued wave functions).

    -David

  • Cheggit out, here's the picture of the first computer bug ever:
    Debugged [dhs.org].

  • So in the case of Walmsley's device, 50 different frequencies of light shine through the modulator, and if the 20th frequency is the altered one, then Walmsley knows that the bit of information he was searching for is located at position 20 in the database. A conventional computer would have had to check 20 times to find the location.


    I kind of like the mudslinging here between the quantum and optics camps, but has anybody else been troubled by the glossing over of spatial complexity here? Anybody can build a device that will solve a problem in one step. The spoiler is that the device will grow in size (or in this case, frequencies) proportional to the complexity of the problem. The QC paper on database searches did this: they claimed that a "conventional computer" could only search n items in O(n) time. But "conventional" devices have been built which find an item in O(1) time, eg by indexing the items by value. It seems like we have a mutually interfering group of entangled misconceptions here.

    Heck, you could probably shoot radar at the surface of a hard disk and tune it to find a particular bit pattern, "billions of times faster". Would that violate "classical computing limits"? Not at all.

    Overall, I would say that this is a step in the right direction. If you read the article you'll find that the stumbling block was that the optical crowd thought that the particles had to be entangled for a device to work. I suspect that this misunderstanding arose from Deutsch when he used the term incorrectly to describe a QC.

    Maybe next they'll discover that a DRAM can fetch a value from any location *in one step*. My gosh, that's amazing. (OK, I'm sarcastic, but I think the physicists have to be a bit clearer in proving that their whizbang devices are not spatially complex...and yes, I am reading the papers to try to clear this up).
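    A quick sketch of the indexing point, in Python (the data and names here are invented for illustration): a one-time index build turns the O(n) scan into an O(1) lookup, no exotic hardware required.

    ```python
    # Illustrative only: a "conventional" indexed lookup finds an item in one
    # step, so "searching n items takes O(n)" applies only to unindexed data.
    database = ["alice", "bob", "carol", "dave"]

    # Linear scan: O(n) comparisons in the worst case.
    def linear_search(items, target):
        for position, item in enumerate(items):
            if item == target:
                return position
        return -1

    # Build an index once (O(n)); every subsequent lookup is O(1) on average.
    index = {item: position for position, item in enumerate(database)}

    print(linear_search(database, "carol"))  # 2, after scanning three entries
    print(index["carol"])                    # 2, in a single hash lookup
    ```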
  • Yes, they seem to be ignoring spatial complexity here. That was my reaction to the quantum paper on the same thing.

  • I would like to think that our enlightenment grows with time, but every new article I read about quantum computing research seems to be filled with more and more hyperbole.


    Yes, I think the field is incredibly strange and wonderful, but not for the physics. The ignorance and mutual misdirection are entertaining. What other field has physicists, philosophers, and computer scientists, all misunderstanding each other in fundamental ways? Now we have the optics community involved. Maybe at least we can separate the wave mechanics from the rest of it all, if there is any remainder.

    (We all learned that quantum mechanics destroyed the computer-like determinism of Newtonian mechanics, but now we think that, while the universe is not a big classical computer, the universe may be a big quantum computer!)


    I'm not trying to start a flamewar, but the claim that QM implies nondeterminism also falls into the unfounded hyperbole category. As someone wrote in Physical Review Letters a while ago, the stuff about trajectories not existing, and instantaneous quantum jumps, many worlds, etc., is a good way to keep from presuming too much information about a system, but it's not a proof of any deeper reality.
  • I'm about at the same stage of understanding. I suspect that wave mechanics (as the opticists used) may underlie a lot of the claims. QC relies heavily on Bennett's work in reversible computing. The idea is that information, and its underlying physical representation, can flow both ways through the logic gates, just as light can propagate or resonate both ways in an optical system. If the system is stable (that's something I'm not convinced of), then the observed wave function should reflect the proper relationship between the "input" and "output" (even if we, say, put in a known output and want to reverse to an input). Measurements against such a system should yield statistics that reflect the wavefunction at different parts of the system, and the wavefunction will represent something about the information we're seeking.

    I don't see anything so mysterious about this. QM brings on a lot of handwaving (eg Deutsch's claim that QC "proves" that the many-worlds interpretation is correct), but I believe the opticists are on the right track when they say a lot of the logic can be done in non-quantum systems. QM at best provides a good source of waveforms to put through our interferometers.
  • I did, but not nearly as much as this guy [downport.com]. I still have them Little Black Books to prove it, too!
  • Did anyone ever seriously play Traveller ? If so, what was the key to getting it to work ?

    I bought the rules and thought it was a pile of poo. The individual combat system was awful and it didn't seem to have the aura of fun that say D&D had.
  • I believe I still have the boxed set too, but only god knows why!!

    Are they worth anything ?
  • No one has ever shown that quantum computers could break a symmetric encryption more than twice as fast as brute force, and the actual clocking of quantum gates is not that fast.

    What everyone is talking about with encryption is Public Key, where quantum mechanics may be able to reduce the brute force test to n operations given an n bit key, and n quantum gates. This is particularly true of those built on prime numbers.
  • Interference *is* superposition (of out-of-phase waves). BTW: what quantum features of the light are they using exactly? Seems like the classical wave model to me. Acousto-optic transducers are used here (UT-Austin) as beam benders without so much hoopla.
  • does that mean i can format my drive by holding a lamp to it?
    No, since they use sound waves to imprint the storage medium you get data corruption when the guy in the jeep drives by blasting the latest eminem garbage.

  • ...both of which are elements in "quantum computing".
  • But if you can get results from a quantum computer without turning on the power, the off-quantum-computer should be more resistant to radiation than any other type of computer. You do need to give the computer time to not run, but a wind-up clock is rather resistant to radiation.
  • Because it's difficult to manipulate light in a conventional machine. We don't have fully optical switches commercially yet, let alone a massively parallel optical processor. Yes, it's been known for a long time, since optical fibres carry multiple wavelengths of light and the human brain mimics this with its massively parallel neurons, but it's not yet viable for commercial use -- the NSA might use it, but most conventional machines can't be mass produced with it.
  • While this does sound very neat and seems like a fantastic idea, it does seem to have a few inherent limitations.

    In order for this light interference method to work most efficiently, a single beam of light must be able to shine on the entire database at once. For small databases, no problem, but for large ones it seems an impossibility. Why not just do part of the database at a time? Well, then you lose the reason you were doing this in the first place, which is to make a single request to the database and come up with your answer.

    Another limitation with the size of the database is the number of frequencies that the light must be split up into. I'm not sure of the actual number of frequencies that we know how to split light into, but there must be some limit. Potentially, with quantum computers, there would be no limitations. Just add more qubits.

    So while this does sound very cool, I don't see it replacing quantum computers as the next big leap in computing.

  • <flame>
    you're thinking that an increase in speed of merely a billion times (or ANY physically possible multiplier) is going to help you brute force 256 bit keys?
    </flame>

    I suggest you go check up on the properties of exponents. A billion times faster is the same as reducing the keyspace by 30 bits: leaving you with a task equivalent to bruteforcing a 226 bit key on current hardware. If you make it a billion billion times faster, I need merely add 60 bits to my key to regain the previous level of security.

    The thing about quantum computers is that they are good at reducing the order of complexity of certain computations -- in this case factorisation. You only need to start worrying about symmetric keys when a quantum computer can solve NP problems in P time. But then you REALLY have to start worrying, because ALL symmetric ciphers are in NP.
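    The exponent arithmetic above is easy to check; a minimal sketch:

    ```python
    import math

    # A billion-fold speedup only shaves ~30 bits off the effective key length,
    # because 10^9 is a bit less than 2^30.
    speedup = 10**9
    bits_gained = round(math.log2(speedup))
    print(bits_gained)        # 30

    # A 256-bit key against an attacker a billion times faster is as hard as a
    # ~226-bit key against today's hardware:
    print(256 - bits_gained)  # 226

    # The defender regains the lost margin by adding those 30 bits back:
    print(226 + bits_gained)  # 256
    ```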
  • that's gotta knock a hole in current key lengths and security...I wonder how fast this thing has to get before brute forcing a 256 bit key becomes feasible.

    Not possible. If all the energy of a supernova could be channeled into an ideal computer that did nothing but count, it would only get up to 2^219 (from Applied Cryptography by Bruce Schneier). Brute force attacks will never work against keys of sufficient length; you will need to find a weakness in the cryptographic algorithm (or find the sticky note the user wrote his password on, which is a much better bet).

  • Photons are pretty stable against these kinds of interference, since they don't interact strongly with matter. So, as long as you use high quality optical elements, you can do quantum computation experiments at room temperature. Unfortunately, the lack of interactions makes it hard to implement quantum gates, making systems larger than 2 qubits difficult.

    NMR quantum computer experiments are also at room temperature, but have their own problems.
  • No, with this sort of device you do not have to iterate over the frequencies. If the 'on' bits of the modulator retard the light passing through them by (2n-1)/2 wavelengths and the 'off' bits have no effect, then when the beam is recombined you can just bounce it off a grating/prism into a linear array of detectors and there's your answer. The slow bit is reading the detectors, i.e. as usual it's the optical->electronic bit that's slow.

  • This sounds more like holographic computing, rather than quantum computing. In QC, a search would involve manipulating the system until one state is left. In this article's example, you still have to iterate over the frequencies to find the frequency that changed. As I am given to understand QC, were this a true analog to QC you'd simply have one color standing alone, with no searching.
  • Agreed. I have no points left :(
  • Really? I would have thought the transducers to be higher than 20 kHz. And when the jeep drives by, it's really only the longest wavelengths that you hear.
  • "except it runs on light at similar speeds" - don't you mean wavelengths?? Isn't that whole speed of light declared as a constant and not a variable or a pointer??
  • I suppose a unit of quantum speed could be how long your average gate operation takes. In an ion trap QC it's about 10^-3 s. Or a clock speed of 1 millihertz. Crap, in other words.
  • "...Conventional computers use particles called electrons..."

    If you didn't get basic atomic structure in grade school you should just be taken out and shot.

    They pretty much glossed over any details. I'll wait for the article in Nature.

  • ...me without mod points. Ah well.

    TomatoMan
  • But you don't have to look at the whole bunch of them individually. And regardless, you don't have to look at them at all. They are sorted, which was the goal.
  • There are 7-qbit quantum computers.
  • by Fjord ( 99230 ) on Wednesday May 16, 2001 @06:54AM (#219823) Homepage Journal
    The article describes a physical mechanism that is faster than an algorithmic mechanism. This is nothing new. The one I learned in first-year CS was sorting spaghetti sticks. An algorithm will take O(n*log n) steps to sort them, whereas in the physical world it takes a single step: you pick them all up and tap the bottom of the cluster on a desk (lining them up, leaving the taller ones sticking out more).

    Another example is finding the bounding convex polygon for a set of n points. I don't remember the runtime for the algorithm, but in the real world it's O(n): you get a board, nail in the n points, then find a rubber band and wrap it around the nails.

    The article describes another one of these problems that is solved faster with a physical process, in this case looking up a record in a database. By physically encoding data differently, you can find a record in a large set in a single step. (Well, maybe not, since you still have to FFT the light to find the frequency, so I'm still not sure how this is faster than the O(log n) of an index. Remember that an FFT is O(n log n) where n is the number of frequencies, and you need the number of frequencies to match the number of records, so it seems no faster to me, but there may be some other way of determining the frequency of the altered light.)

    This can't be used to crack RSA, and it's not a general method of algorithmically running through a large number of possibilities concurrently, which we get with quantum computers. There may be a way to crack RSA generically with a physical process (didn't Shamir come up with an optical process for 512-bit RSA?). But this has nothing to do with that.
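    For what it's worth, the frequency-marking trick the article describes can be mimicked numerically. This is a toy sketch, not the actual apparatus: each record is one frequency component, the "marked" record gets a phase flip, and interfering the probe beam with a reference beam followed by a discrete Fourier transform reveals which record was marked. All sizes and names here are invented for illustration.

    ```python
    # Toy model of the interference lookup: 50 "records" as 50 frequencies,
    # record 20 marked by a pi phase shift, detection via a naive DFT.
    import cmath

    N = 50          # number of records / frequencies (assumption)
    marked = 20     # the record the modulator altered
    samples = 256   # time samples per beam

    def beam(flip_index):
        # Sum of N unit-amplitude frequency components; the flipped one gets -1.
        return [sum(cmath.exp(2j * cmath.pi * k * t / samples) *
                    (-1 if k == flip_index else 1)
                    for k in range(N))
                for t in range(samples)]

    # Interfere (subtract) probe against reference: only the marked
    # frequency survives, with doubled amplitude.
    diff = [a - b for a, b in zip(beam(None), beam(marked))]

    def dft_mag(signal, k):
        # Magnitude of the k-th DFT bin, computed directly.
        return abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / len(signal))
                       for t, x in enumerate(signal)))

    found = max(range(N), key=lambda k: dft_mag(diff, k))
    print(found)  # 20
    ```

    Note that the detection step really does cost a Fourier transform over all the frequencies, which is the parent's point about the scheme not being a free lunch.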

  • Greg Egan's "Luminous" had a light-based computer in it that seemed quite a bit more conceptually ambitious than the one depicted here.

    Egan's homepage can be found at http://www.netspace.net.au/~gregegan/. [netspace.net.au]

  • "If the database in question were the Manhattan phone book, the search for a single phone number could take a conventional computer several million searches, while a light-based device could pinpoint the number in just one."

    The operations it takes to search through a million-entry sorted database isn't a million. It's proportional to log2(n). Think about it. Divide it in half. Is the current entry bigger or smaller? Bigger->Divide the bottom half in half and repeat. Smaller->Divide the top half in half and repeat.

    So how many times can you divide a phone book in half before you're guaranteed to find your answer? log2(n).
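    The halving argument above, as a minimal binary search sketch (the million-entry "phone book" is simulated with integers for illustration):

    ```python
    # Binary search: halve the remaining range each step, so a sorted
    # million-entry book takes at most ceil(log2(n)) ~ 20 comparisons.
    def binary_search(sorted_items, target):
        lo, hi = 0, len(sorted_items) - 1
        steps = 0
        while lo <= hi:
            steps += 1
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid, steps
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, steps

    book = list(range(1_000_000))   # a sorted million-entry "phone book"
    position, steps = binary_search(book, 314159)
    print(position, steps)          # found in at most ~20 steps, not a million
    ```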
  • Sorry mate - you got the colours the wrong way round - blue has the shorter wavelength (either that or my Physics teacher was lying to me all those years ago). Still a great joke though : )
  • Exactly! But since there are no gates which have been realized *physicaly* outside of a few prototypes, there is no known speed yet.

    -david
  • by Science_Nut ( 113901 ) on Wednesday May 16, 2001 @06:33AM (#219828) Homepage
    That article was insipid to the point where I couldn't finish it. A few points in the first few paragraphs are worth mentioning:

    1) Quantum speeds? WTF is that? There's no such unit, not even associated with quantum computing.
    2) The device "mimics quantum interference". No, it's light; it displays quantum interference. Light is photons, quantum particles. Dur.
    3) "performs some tasks a billion times faster". This is what I call a 'crazy number' since it's not based on any sort of measurement and thrown in only for show-value.

    Don't get me wrong, I'm active in QC research and I like what the folks at Rochester are doing, so, too, the folks in an optics group at Los Alamos. But whoever wrote that Science Daily article is whacked out. It cheapens everything.
  • So, it looks like this thing can do computations about a billion times faster than conventional methods... that's gotta knock a hole in current key lengths and security... I wonder how fast this thing has to get before brute forcing a 256-bit key becomes feasible. (Big math people out there anywhere?)

    Fortunately, that should also offer a slew of new possibilities for encryption schemes that were previously too slow or bulky.

    On your mark, get set, encrypt
  • you didn't get better performance, you just got radiation resistance

    In our real spacetime, you probably won't get radiation resistance. Quantum effects are all extremely susceptible to any sort of interference. AFAIK, any sort of quantum computing device only works as close to absolute zero temperature as you can get.

  • what does this do to all the companies spending *billions* on quantum research? does this mean they are screwed? or does this mean they accelerate their research? i hope for the latter... (not just because i work for one, i assure you...)
  • The term "debugging" supposedly originated when a bug was found between the relays of one of the first computers. I was just thinking: if they actually built one of these machines and had to debug it, would they find a firefly?
  • by TeknoHog ( 164938 ) on Wednesday May 16, 2001 @06:32AM (#219833) Homepage Journal
    This looks exactly like the idea of holographic storage. What is stored in the memory material is the interference pattern of the data and the address beams. Then you can either light it with an address beam (the address can be either a direction, or frequency, or maybe something else) and out comes the data content, or vice versa (grep). This is excellent for database and memory technology, but I see no connection to quantum computing here.

    Disclaimer/shameless plug: I've recently compiled a semi-technical paper [ucam.org] on some of the theory behind quantum computing, as a project in our undergraduate physics course.

    --

  • Oh come on people if this isn't +5 informative then Eric Raymond isn't ugly
  • Also, from the article: If the database in question were the Manhattan phone book, the search for a single phone number could take a conventional computer several million searches, while a light-based device could pinpoint the number in just one. Well, in theory I suppose there are programmers who *could* write an algorithm that would take several million searches... Reminds me of MS-DOS (in W95 at least), where a single keystroke in the command window literally ties up the CPU in a loop for 1/10 sec (say 50 million cycles) to limit the rate; if you type fast enough, all other processes will come to a halt.

  • Yeah, I saw this one, too. Actually, if using B-Trees or better, you can achieve better speeds, in the real world, than log2(n).
  • Quantum Computers can crack any crypto system that's based on the hardness of factorisation, because there exists an algorithm (due to Peter Shor) that can factorise composite numbers exponentially faster than any classical computer can do. So whereas adding a few bits to your key turns a 3 month problem into a 3 Myear problem for a *classical* computer, the quantum computer just laughs it off. Whether there are other bases for crypto that a quantum computer couldn't break (are Elliptic Curves any good?), I don't know.

    However, Quantum Cryptography (properly called "Quantum Key Distribution") is another matter entirely. It doesn't rely on any computational problem being hard - it is based in the fundamentals of quantum measurement. Essentially, because of the fragile nature of quantum states, you can arrange that no eavesdropper can know your key - not even one bit of it - without disturbing it and so revealing themselves. Thus, you need never use an unsafe key again.

    Not even a quantum computer can get around the fundamental limits of quantum measurement, and so QKD is provably secure against any future technological development.

    ...unless a whole new paradigm of physics emerges to replace Quantum Mechanics, of course!
    StuP
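    For the curious, the classical half of Shor's algorithm can be sketched in a few lines. The quantum computer's only job is finding the period r of a^x mod N exponentially faster; here the period is found by brute force, so this toy only works for tiny N (the numbers below are illustrative).

    ```python
    # Classical post-processing of Shor's algorithm: given the period r of
    # f(x) = a^x mod N, two gcds recover the factors of N.
    from math import gcd

    def factor_via_period(N, a):
        assert gcd(a, N) == 1
        # Brute-force period finding: this is the step the quantum
        # computer replaces with an exponentially faster subroutine.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        if r % 2:
            return None          # odd period: try a different a
        y = pow(a, r // 2, N)
        if y == N - 1:
            return None          # trivial square root: try a different a
        return gcd(y - 1, N), gcd(y + 1, N)

    print(factor_via_period(15, 7))  # (3, 5)
    ```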

  • by stup ( 180061 ) on Wednesday May 16, 2001 @06:01AM (#219838) Homepage
    AFAIKS, this isn't scalable, which is the key to making a real useful Quantum Computer. The amplitude of the output, a single frequency component, must surely drop away with the number of entries in the database. So if you increase the number of entries by a factor of a million, the output beam is a million times weaker, and almost undetectable.

    In analysing this sort of thing, the "size" of the problem is usually taken as the logarithm of the number of entries (ie the number of bits required to label each item). Since the strength of the output beam decreases linearly with the number of entries, it falls off exponentially with problem size.

    Now, it can be shown that even with a Quantum Computer, the best we can do is to speed up the search by the square root of the number of entries. So 10^6 entries takes 10^3 searches, and so on. This isn't an exponential speedup (which is impossible for Unordered Search), but I can't see that this "light interference" method could match a quantum machine.

    And it certainly couldn't match the exponential speedups on Factoring, the killer app for Quantum Computing.
    StuP
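    The square-root scaling StuP describes is plain arithmetic; a minimal sketch:

    ```python
    import math

    # Grover-style quantum search needs about sqrt(N) queries where
    # classical unordered search needs about N.
    for entries in (10**6, 10**9, 10**12):
        classical = entries                   # worst-case classical queries
        quantum = round(math.sqrt(entries))   # ~sqrt(N) quantum queries
        print(entries, classical, quantum)

    # 10^6 entries -> ~10^3 quantum queries, matching the comment above.
    ```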

  • This would be the perfect computer to crack open when you're drunk/stoned/whatever. "Wow, look at the pretty colors....wooowww...."
  • To search the database, Walmsley directs a beam of light toward the modulator. The light is first split into two, with one part traveling through a prism so that a rainbow of different frequencies of light shines on the modulator. Each frequency shines through a different compressed or expanded part of the tellurium dioxide, which bends that frequency of light the way a straw appears bent when sticking out of a glass of water. The rainbow of frequencies is then recombined into a single beam. By mixing the new beam with the original beam that entered the device, a single frequency will emerge as having been altered by its trip through the database.

    Doesn't the beam of light have to contain the data that the user is looking for so that the match can be made? How is the data encoded into the light, and how much data can be encoded? Guess I need more info. It kinda strikes me that all that guy is doing is using the 2D method of storage and the availability of air as a transmission medium for light to establish a connection to EACH section of the data storage medium. What's that got to do with quantum computing?

    Urgh. I'm feeling rather confused today.


    Pinky: "What are we going to do tomorrow night Brain?"
  • So you manage to prove that data retrieval using light is viable, but how viable is the storage medium? It won't be any good having a brilliant information retrieval tool if you lose the data every time you have to reboot, or if the storage medium requires 1 square cm per bit.


    Pinky: "What are we going to do tomorrow night Brain?"
  • So you wait for the acoustic wave to get in just the right position and you take a picture. What's so f***ing fast about that?!!!

    My seventh grade social studies teacher showed us punched cards with the holes along the edges. You ran a long metal pin through the holes and lifted, and the cards where the edge was punched away stayed in the tray. Repeat for multiple WHERE clauses. At least that could handle multiple cards with the same value.

  • "If the database in question were the Manhattan phone book, the search for a single phone number could take a conventional computer several million searches, while a light-based device could pinpoint the number in just one"

    Unless you use a database of course...

  • Anyone ever spend the extra mega-credits to get a fiber optic based computer for their starship?

    Though in Traveller, you didn't get better performance, you just got radiation resistance.
  • Well there is a bit of a difficulty in explaining exactly where the computational power of a quantum computer comes from. Kind of like asking where the power of a classical computer comes from (and don't say "from the power company" damnit (yes, I'm from California)).

    But the "power" of quantum algorithms over classical algorithms makes itself clear when you realize that all efficient quantum algorithms make use of a COLLECTION of quantum systems. When you take the polarization of a single photon and use polarization filters, you essentially have a single quantum bit of information corresponding to the two polarizations. But in order to make a quantum algorithm, you need to put a bunch of these qubits together and they must interact in a non-trivial manner. Thus you need to find some way for the polarization of one photon to interact with the polarization of another photon. This is really a pain in the ass to do without destroying the photon or the coherence of the polarizations.

    So I guess what I am saying is that when you take a bunch of quantum systems and build a quantum algorithm, the power of the algorithm comes from the dynamics of the interaction of multiple quantum systems.

    The fact that quantum computers are probabilistic and rely on a "collapse" of the wavefunction at the end of the computation is sort of secondary to the issue of where the power comes from.

    dabacon
  • OK, Anonymous Coward, you keep using your RSA then.

    As for me, I'm betting on good old human ingenuity.

    dabacon
  • As has been pointed out by many posters already, this is not what nearly all researchers would call a quantum computer. A universal quantum computer, from a physicist's perspective, is a computer built with pieces which obey quantum mechanics AND can be used to EFFICIENTLY simulate the effects of systems obeying quantum mechanics. This EFFICIENCY condition is extremely important, because, for instance, your classical computer can simulate quantum physics...it just takes it a hell of a long time for most reasonably sized problems!

    The device described (poorly) in the article fails to achieve an efficient simulation of quantum systems because the number of frequencies needed to perform a given simulation will scale exponentially in the size of the quantum computer being simulated. While technologically interesting, the computation performed by the experiment is not something a classical computer cannot do just as efficiently.

    But what really troubles me is the quote attributed to Walmsley in the article:
    "We wanted to show that the implementations which have been done with quantum computing have an exact analogy that is just as effective in light-based processes," says Walmsley.
    Just as effective?! That is just not true. Is this a case of a scientist being quoted out of context, or is it a case of a scientist who doesn't understand the issue?

    Yes, MTIOQC (my thesis is on quantum computing), so I feel like I have a little bit vested in this issue. Being so biased, I hope that this is just an out of context mistake.

    I would like to think that our enlightenment grows with time, but every new article I read about quantum computing research seems to be filled with more and more hyperbole (oh, do I hate the words "paradigm shift" and "synergy") and less and less good science. Don't get me wrong, I think quantum computing has a promising future, both in actual future practice and in helping shed light on areas of physics (we all learned that quantum mechanics destroyed the computer-like determinism of Newtonian mechanics, but now we think that, while the universe is not a big classical computer, the universe may be a big quantum computer!), but irresponsible press releases drive me bonkers.

    dabacon
  • This is definitely not scalable. They claim it runs at "quantum speeds," which isn't even a real term. There is no such thing as "quantum speed"; a quantum computer can take just as long for any particular algorithm as a classical computer. The special thing about QC is that the running time scales polynomially with input size rather than exponentially. This is *not* true of the light interference system they describe, for many reasons, some of which have been pointed out here already (the time it takes to iterate through the outputs, the size of the beam, the number of beam splitters you would need, and the limited bandwidth of the light spectrum). None of these things scale polynomially with the input size, so claiming that this is a substitute for QC is nothing more than a trick to get funding. Plus, the idea isn't exactly new. I attended a conference on quantum computing at Georgia Tech over a year and a half ago where we discussed this very idea, and all parties (many CS PhDs and quantum physicists among us) agreed that it could not yield the power of a quantum computer.

  • "...before brute forcing a 256 bit key becomes
    feasible..."

    Just remember that a 256 bit key has 2^128
    _TIMES_ as many states as a 128 bit key.
    That's 3.4 x 10^38 _TIMES_ more states than a
    128 bit key, or about 1.16 x 10^77 total states.
    A computer which is a billion times faster
    still _CANNOT_ approach this problem. A
    computer would have to be many QUADRILLION
    QUADRILLION times faster to even have a
    chance.

    C//
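    The arithmetic above is easy to check directly; a minimal sketch in Python's exact integer math (values rounded in the printed output):

    ```python
    import math

    # Key-space arithmetic for brute-force search, using exact integers.
    states_128 = 2 ** 128
    states_256 = 2 ** 256

    # A 256-bit key space is 2^128 times larger than a 128-bit one.
    ratio = states_256 // states_128
    assert ratio == 2 ** 128

    # A billion-fold speedup removes fewer than 30 bits of work,
    # since 2^30 > 10^9: the effective key length only drops to ~226 bits.
    bits_removed = math.log2(1_000_000_000)

    print(f"2^128 ~ {float(ratio):.2e}")          # ~3.40e+38
    print(f"2^256 ~ {float(states_256):.2e}")     # ~1.16e+77
    print(f"a billion-fold speedup removes {bits_removed:.1f} bits")
    ```

    The point stands: constant-factor speedups, however large, barely dent an exponential key space.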
  • "Doesn't the beam of light have to contain
    the data that the user is looking for so
    that the match can be made?"
    ----
    I think so, yes. This is a search for key,
    find value lookup approach, I think.

    C//
  • by hillct ( 230132 ) on Wednesday May 16, 2001 @06:18AM (#219851) Homepage Journal
    Yes but quantum computing Sounds So Much Cooler....

    In all seriousness, this is the sort of situation where the Internet is more a hindrance than a help. Over time, discussions such as this will polarize the lay community for or against a particular area of research, where two areas of research strive toward similar goals.

    Public opinion greatly influences funding of research, so I hope that premature debates over which technology is superior won't shape decisions to fund one or the other, since either area of research might hit a brick wall at some point in the future, at which point it will be necessary to pursue the other. It would be beneficial to all to have continued both areas of research in parallel.

    Don't get me wrong. I don't believe that discussions like this alone will influence the course of research, merely that the collaborative environment the Internet offers will promote (surprisingly) collaboration to the point where only one research path is pursued by both teams working together rather than competing. This is an area where competition is a positive thing in academic research. I merely question the degree to which the Internet actually contributes to it.

    --CTH

    --
  • So in the case of Walmsley's device, 50 different frequencies of light shine through the modulator, and if the 20th frequency is the altered one, then Walmsley knows that the bit of information he was searching for is located at position 20 in the database. A conventional computer would have had to check 20 times to find the location. It all sounds so simple that I can't help but ask: why has nobody thought of it before?
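    The lookup described above can be sketched as a toy model, with the optics reduced to a list of flags (all variable names are hypothetical; this only illustrates the counting argument, not the physics):

    ```python
    # Toy model of the frequency-encoded database search: each database
    # position is assigned one frequency, the modulator "marks" the
    # frequency at the target position, and one readout of the spectrum
    # reveals the position.

    def classical_search(db, target):
        """Sequential lookup: up to len(db) checks."""
        checks = 0
        for i, value in enumerate(db):
            checks += 1
            if value == target:
                return i, checks
        return None, checks

    def frequency_search(db, target):
        """One 'shot': every frequency passes through the modulator at
        once, and the marked frequency identifies the position directly."""
        spectrum = [value == target for value in db]  # modulator marks one line
        return spectrum.index(True), 1                # single spectral readout

    db = [f"record-{i}" for i in range(50)]
    print(classical_search(db, "record-19"))   # (19, 20): 20 sequential checks
    print(frequency_search(db, "record-19"))   # (19, 1): one parallel pass
    ```

    The catch, raised elsewhere in this thread, is that you need one distinct frequency per record, so the parallelism scales only linearly with the hardware.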
  • In my yoga class, there's a guy who lives on prana (air) and light. This strikes me as remarkably similar to that philosophically. It's the coolest thing I've heard in a long time.
  • So we convert the problem from scanning a spatial array to scanning a frequency array. Don't we still have to analyze the frequencies to figure out which one has been altered?

    I suppose that each frequency could go to a parallel detection array, which would then drive some sort of interrupt, however this would seem to become unwieldy as the problem space increases.

    Can someone explain further just how the detection of which frequency of light was changed would actually work in practice with a large problem space?
  • In case you haven't noticed, there is a [so-called] Ten Year Gap between the levels of technology that military and government R&D comes up with and what hits the store shelves for us to buy. We get the R&D hand-me-downs that they're unthreatened to release, as well as odds and ends that they never had any real use for in the first place. That means No Such Agency and most of the rest of the alphabet gestapo have at least ten years on us in tech level when it comes to cryptography. Personally, I'm inclined to think of the disparity as a curve. If they can do ten years, they can do twenty or a hundred years. All they have to do is set the pace of R&D release just slightly slower than the rate at which they innovate, and there's a gradually widening chasm of tech levels. And lest we forget, civilian technology is cancellable at a stroke using electromagnetic pulse should they consider it warranted. Reminiscent of Zelazny's Lord of Light.
  • None of these things scale polynomially with the input size so claiming that this is a substitute for Qc is nothing more than a trick to get funding.

    all parties (many PhD CS and Quantum Physicists among us) agreed that it could not yield the power of a quantum computer.


    "Hey John Q. Public! Concerned about the security of your encrypted data transmissions? Wish you had one of those whiz-bang QC's the Important People get to use? Here, buy one based on pure light! See, light is made up of tiny particles, so it's almost like having a QC of your own, without having to declassify the advances we've made on them!"

    Just what I want... to compute on an EZ-Bake Oven.
  • The recombinant beam isn't compared to the original beam. When you recombine the two beams you create interference. Depending on how you set it up, the interference can be constructive or destructive for the target or background frequencies. In other words, you can rig it so that only the correct frequency will make it through, and then either a) detect the frequency directly to determine which value is correct, or b) use a prism to spread the beam again, project it against a wide detector, and see which frequencies are appearing/not appearing, and thus which is correct.

    While this is a neat trick, it isn't readily scalable and thus will be of limited use compared to a quantum computer. It is working in parallel, but with only 50 values. You could expand that number by using multiple units, but based on the capabilities quoted in the article the system can only be scaled linearly, while a quantum computer can scale exponentially.

    Now if they can or do have a way to use those 50+ bits exponentially, that's a whole other story. All of this depends on how the data is stored within the tellurium dioxide and thus how it affects the light traveling through it representing the database. Although the method described is a linear search, the article seems to indicate that it has much more potential.

    In any case, either the article overstated things or did not report the technique correctly.

    cryptochrome
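  • The interference readout described above can be sketched numerically. This is a deliberately crude model (amplitudes as plain floats, one number per frequency; the names and the pi-phase-flip setup are illustrative assumptions, not the actual apparatus):

    ```python
    # Two-beam interference readout: recombining a reference beam with
    # the modulated beam cancels every frequency except the one whose
    # phase the modulator flipped.

    N = 50
    marked = 20                      # position whose phase the modulator flips

    reference = [1.0] * N            # reference beam, all frequencies in phase
    modulated = [1.0] * N
    modulated[marked] = -1.0         # pi phase shift at the marked frequency

    # Destructive recombination: subtract the beams. Only the marked
    # frequency survives with nonzero amplitude.
    recombined = [r - m for r, m in zip(reference, modulated)]
    survivors = [i for i, a in enumerate(recombined) if abs(a) > 1e-9]
    print(survivors)  # [20]
    ```

    A single detector behind a prism then only ever sees light at the marked frequency, so no per-frequency comparison loop is needed.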
  • The following was in response to a question similar to my previous statement, which I received very promptly from Ian Walmsley:

    You are correct in the statement that the scalability of our experiment is no better than that of a classical system. (I don't think the UR News Release claimed otherwise, did it? If so, I'd better check with our PR people!) The point is really a different one:

    First, we have shown formally that any information processing system based on quantum interference alone (e.g. a single Rydberg atom, or a single photon) can be implemented with equal efficiency with an all-optical interferometer. The important physics here is to realize that with a quantum computer the information in the register has no reality until you read it out, and so you must account for the readout resources in determining the efficiency of the computer.

    Second, we implemented an all optical version of Bucksbaum's Rydberg atom Grover search to show that our hypothesis is correct. The resource scaling in both is slightly better than that of Grover's second search algorithm, since we do not need the inversion-about-the-mean operation he proposed. Instead, we use part of the input as a reference beam and make use of interference to do the phase-to-amplitude conversion.

    This might leave one with the impression that all interference based schemes, including those based on quantum interference alone, can never do better than a classical machine. But recent articles by D. Meyer (PRL, 2000) and E. Knill et al (Nature, 2001) show that interference without entanglement can be used to advantage over classical computers. Therefore we are now seeking to implement the algorithms they analyze optically, and to provide a measure for evaluating the resources needed for them.

    Hope this helps - rest assured we are certainly not claiming that we can do everything full-blown quantum computers can, only those that are based on single-particle interference alone.

    Parenthetically, one can look for a single marked element in a database of 2^50 items using our method quite easily, provided the database is binary encoded to begin with. But such encoding schemes are also available in classical machines. For unary encoding we are limited to 50-element databases. Perhaps this is what you meant in your second paragraph.

    Best regards

    IAW
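    For a sense of the scaling question the email is addressing, here is a rough query-count comparison between classical unstructured search (~N/2 queries on average) and Grover's quantum search (~(pi/4)*sqrt(N) queries); the interference scheme, as Walmsley concedes, scales like the classical column:

    ```python
    import math

    # Classical vs. Grover query counts for an unstructured search over
    # an n-bit database (N = 2^n entries). Numbers are order-of-magnitude
    # illustrations only.
    for n_bits in (10, 20, 50):
        N = 2 ** n_bits
        classical = N / 2
        grover = (math.pi / 4) * math.sqrt(N)
        print(f"{n_bits:2d}-bit database: classical ~{classical:.1e}, "
              f"Grover ~{grover:.1e} queries")
    ```

    The quadratic gap is why a 50-element unary-encoded optical search, however fast each shot, is not in the same complexity class as a 50-qubit Grover machine.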
  • Unfortunately, articles don't last too long on Slashdot... Ian was quick to respond, but not quick enough, I'm afraid.

    cryptochrome
  • So far we have 3 people posting who claim to be doing Quantum Computing research. No wonder we're not getting any closer to getting a QC built.. all the researchers are posting on slashdot all day.
  • Will people move over to quantum cryptography? Do quantum computers mean that you can crack quantum crypto? Could someone with some clue distribute a bit of it here? I'm confused :(
  • No, not quite. This implementation is not actually a quantum computer in the strict sense. If you read the article carefully, you notice that the current experiment is unable to entangle states. So computationally speaking, this is completely identical to a classical computer! The only benefit is that you get some parallelism, in this case by using different frequencies. So the experiment is able to perform some (single-bit) operations on all 50 states at once. Any normal computer can do the same: for example, any 64-bit chip, like the Alpha, can perform a single operation on all 64 bits at once (or at least within a constant factor).

    The idea of quantum computation is to be able to perform complicated (multi-bit) computations using entangled states. For example, finding the prime factors of a number can then be done in polynomial time. The proposed light approach as described in the article has no chance of achieving that. (I think)

    So we should not be worried about a proliferation of cheap quantum computers that can crack codes. We probably should not worry about quantum computers that can crack codes at all for quite a while. (10-50 years, depending on progress)
    1. Very true, very true.

    ------
    #!/usr/bin/perl -w
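  • The "any 64-bit chip already does this" point above can be shown in miniature: a single bitwise operation on a 64-bit word transforms all 64 bit positions in one machine operation, which is exactly the kind of non-entangled parallelism the optical scheme provides.

    ```python
    # One XOR on a 64-bit word flips all 64 bit positions at once:
    # classical bit-level parallelism, no quantum entanglement involved.
    a = 0xFFFF_FFFF_FFFF_FFFF     # 64 one-bits
    mask = 0xAAAA_AAAA_AAAA_AAAA  # alternating bit pattern
    b = a ^ mask                  # one operation, 64 bit positions at once
    print(hex(b))                 # 0x5555555555555555
    ```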
  • According to the article, or at least to my limited understanding of it, each 'record' needs to be represented simultaneously on the modulator. Wouldn't this then place a restriction on the size of your computer, based on the size of the modulator?
  • Actually, detecting which frequency was changed should be rather easy. By combining the resulting lasers with the original laser, out of phase, it should be possible to create destructive interference such that the only laser to return to the detector is the one that was changed. Factor out the interference from the original laser, and you have your changed frequency, and only your changed frequency. Frequency searching solved, and lots of time saved.
  • *smirks*

    I think that the applications to cryptography should be mentioned. Both for this particular mechanism (low, imnsho) and for more general quantum computers. These eventual cryptography machines will be massive, dedicated, and initially incredibly expensive (think eniac). What they *can* be easily seen to be used for is all manner of brute-forcing attacks on data by governmental and/or large institutions.

    Moreover, don't believe for a second that there will be "a slew of new possibilities for encryption schemes" that we'll see anytime soon. In fact, I suspect that it's precisely private data that will be exposed during early usage of new decryption methods; meanwhile, don't suppose that the various powers that be will ever allow an easily implemented, private, and quantum-secure data-hiding scheme.


    Nietzsche on Diku:
    sn; at god ba g
    :Backstab >KILLS< god.
  • by RalphTWaP ( 447267 ) on Wednesday May 16, 2001 @06:14AM (#219867)
    *smirks*

    Let's look at the story for a second here folks.

    The scientist set up a data-storage device (in this case an acoustically massaged medium), then carried out an information retrieval against the medium, in parallel. Now this is fairly exciting news, but it has some serious distance to go before it becomes something general enough to threaten the intellectual share of true quantum-entanglement computing schemes.

    The promise of the device so far seems to be in determining data returns along multiple paths. In effect, the thing is performing many, many calculations (in this case actually only data retrievals), but performing them in parallel.

    In addition, I'm curious as to how the data is retrieved. If the recombinant beam must be compared to the original beam along all the frequency divisions, there's another indivisible operation requiring some length of time.

    But....

    It is an interesting method of encoding/decoding data from a medium to a laser without transducers. I'd say that this technology has great promise as a method to be derived from to create all-optical switching fabrics that are actually data-sensitive (how'd you love it if you could decode, process, and filter packet data from the very laser transmission that carried it down the fat fiber pipe...?)


    Nietzsche on Diku:
    sn; at god ba g
    :Backstab >KILLS< god.
  • Of course it would work. How do you get the
    actual digital information through binary
    code? Electricity runnning through
    circuits. What is one way to get
    electricity? Light. There, it's as simple
    as that. I even made a formula for you guys:
    Light = Electricity = Binary Circuits = Digital Data.

    somebody once asked / 'could I spare some change for gas / I need to get my self away from this place' / I said 'Yep. What a concept. I could use a little fuel my self and we could all use a little change
    -All Star, Smash Mouth

  • My invention: no machine does exactly the same as this machine, only much faster than light. It's a big joke, I think.
  • You should state it as: I can't see THE LIGHT (it would be faster than audio or even RF, but what is actually new here?). But I think you're absolutely right about all the rest.
