quax writes: Depending on whom you ask, quantum computers have already arrived (after all, Google and NASA joined forces to buy one) or they are still about twenty years away. Rarely does an online article bother to differentiate between the various technologies and computational models that are labeled quantum computing. The headlines and news stories seem to be all over the place. Even Nature doesn't seem to be able to pull off an online article that asks the important questions and covers all aspects of the current race. Is this kind of shoddy journalism unavoidable? How can science journalists be prodded to ask the pertinent questions and go beyond superficial reporting?
quax writes: It has been well over a century since William Kingdon Clifford developed geometric algebra. Due to his untimely death it was quickly forgotten, only to be partially reinvented when Dirac tackled relativistic quantum mechanics and introduced spinors. But geometric algebra is far more versatile than that: for instance, it makes for a better alternative to vector calculus, combining the div and curl operators and doing away with the cross product in favor of bivectors. It is such a straightforward unification of otherwise disparate mathematical techniques that I very much regret that my physics curriculum twenty years ago didn't cover it. Has this changed? Have you encountered geometric algebra in an undergraduate program?
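To make the bivector point concrete, here is a minimal sketch (my own illustration; the function name is made up) of how the geometric product of two 3D vectors splits into a scalar (dot) part and a bivector (wedge) part. The wedge part carries the same information as the cross product, but as an oriented plane element rather than a vector, which is why it generalizes to any dimension:

```python
import numpy as np

def geometric_product_split(a, b):
    """Split the geometric product ab of two 3D vectors into its
    scalar part (the dot product) and its bivector part (the wedge
    product), with bivector components in the basis (e1^e2, e1^e3, e2^e3)."""
    dot = float(np.dot(a, b))
    wedge = np.array([
        a[0] * b[1] - a[1] * b[0],  # e1^e2 component
        a[0] * b[2] - a[2] * b[0],  # e1^e3 component
        a[1] * b[2] - a[2] * b[1],  # e2^e3 component
    ])
    return dot, wedge

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
dot, wedge = geometric_product_split(e1, e2)
# Orthogonal basis vectors: scalar part vanishes and the product
# reduces to the unit bivector e1^e2, the oriented unit square they span.
```

Unlike the cross product, this decomposition works unchanged in 2D or 4D; only the number of bivector components changes.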
quax writes: Mainstream media always follow the same 'he said, she said' template; that is why even climate change deniers get their say, although they are a tiny minority. The leading science journals, on the other hand, are expensive and behind paywalls. But it turns out there are places on the web where you can follow science up close and personal: the many personal blogs written by scientists, and the conversation there is changing the very nature of scientific debate.
quax writes: So writes the company that developed the machine on their blog. Admittedly, you would expect them to defend their architecture, but D-Wave founder Geordie Rose puts forward a compelling argument that comes down to Occam's razor. The scientists who claim that the machine can be explained classically, as recently reported on Slashdot, base their model only on the subset of data they examined in their research. But if you look at all the data amassed by D-Wave over time, only quantum annealing makes for a perfect fit.
They are not the only ones who argue that D-Wave's claims in this regard hold up. Independent research by Matthias Troyer et al. confirms that quantum annealing is the best model to describe the machine's performance, although they see no evidence of quantum speed-up yet. A recent video nicely summarizes their research findings.
This kind of heated argument is part and parcel of the scientific discourse, yet often leads to abandonment cycles that see promising research avenues neglected, only to be rediscovered decades later. Is this inevitable? Simple human nature reasserting itself? Or is there a more rational way to determine where to focus research?
quax writes: If the company General Fusion succeeds in demonstrating the viability of their approach, the international ITER project will be pretty late to the party. Surprisingly, this company has managed to stick tightly to its development schedule while building its Magnetized Target Fusion reactor. This approach has never been tried at this scale, and it will be the first demonstration of net energy gain equivalence in this manner (equivalence meaning that if the pure deuterium mix used in the test were replaced with one containing tritium, you would get more power out than you put in).
The next big question will be if this can become commercially viable. The mechanical stresses the reactor will have to withstand are huge, so demonstrating that this can actually run continuously will be no small feat.
quax writes: Within the same week, two major quantum information technology milestones were announced: Los Alamos National Laboratory revealed that it has been operating a scalable quantum encrypted network for the last two years (link to original paper).
There have been commercial quantum encryption devices on the market for quite some time now, but these have been limited to point-to-point connections. Having a protocol that allows the seamless integration of quantum cryptography into the existing network stack raises this to an entirely different level.
quax writes: In the most influential textbook on the matter, Michael Nielsen and Isaac Chuang wrote:
"Quantum Computing and Quantum Information Science has taught us to think physically about computation. (...) Indeed in the broadest terms we have learned that any physical theory, not just quantum mechanics, may be used as the basis for a theory of information processing and communication."
This is exactly what the Kish cypher encryption protocol does, exploiting thermodynamics in an unexpected fashion. Could this become an easier-to-implement alternative to quantum cryptography, providing unhackable networks?
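To give a flavor of the idea, here is a heavily simplified sketch (my own illustration of the published Kish/KLJN key-exchange concept, not production code) of the classical logic: each party randomly connects a low or high resistor to a shared wire, and an eavesdropper measuring the thermal noise can only infer the total loop resistance, so the two mixed cases look identical from outside:

```python
import random

# Illustrative resistor values; the security argument only needs R_L != R_H.
R_L, R_H = 1_000, 100_000  # ohms

def kljn_round():
    """One round of the simplified key exchange. Returns a secret bit
    when the round is secure, or None when it must be discarded."""
    alice = random.choice([R_L, R_H])
    bob = random.choice([R_L, R_H])
    observed = alice + bob  # all an eavesdropper can infer from the noise level
    if observed == R_L + R_H:
        # Mixed case: the eavesdropper cannot tell (L, H) from (H, L),
        # but Alice knows her own resistor and can therefore infer Bob's.
        return 0 if alice == R_L else 1
    return None  # both picked the same resistor: no secrecy, discard

key_bits = [b for b in (kljn_round() for _ in range(1000)) if b is not None]
# Roughly half the rounds survive and contribute one shared secret bit each.
```

The real protocol relies on Johnson noise statistics on an actual wire rather than on exchanged resistance values, but the discard logic above is the core of why an all-classical, thermodynamics-based scheme can hide the key.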
quax writes: Feynman famously quipped that "nobody understands" quantum mechanics. But after almost a century, shouldn't there be at least some consensus on how to interpret the theory? Ever since the famous argument between Bohr and Einstein over the EPR paradox, conventional wisdom has been that Bohr's Copenhagen interpretation would carry the day, but in a survey of 33 leading experts at a quantum foundations conference, fewer than half voted that way.
quax writes: Science and engineering are not free of fads. Sometimes they start with a bang and end up vilified as pathological science, just as cold fusion did. But could something seemingly as established as quantum computing fall into the same category?
Some physicists are seriously proposing exactly that. The author argues that the number of publications on quantum computing has reached an unsustainable plateau, and that the ratio of one experimental paper to thirty theoretical ones demonstrates how little the field is actually grounded in reality.
But what if the shoe is on the other foot? Could it be that these animosities are actually more a reflection on the state of modern physics?
quax writes: Whenever quantum computing is dragged out for some mainstream exposure, it is the same old story: if we finally get these powerful machines, then the end of all encryption is here and the sky is falling.
This article makes the case that there is much more to quantum computing than that, and that all the hand-wringing is not only premature but also rather silly. Current quantum computing devices cannot defeat our standard encryption yet, but they are at a point where they can already be a valuable new computing resource. On the other hand, considering how modern cryptography works, and taking into account the progress made on quantum cryptography, the oft-repeated threat that quantum computers pose to the privacy of encrypted data appears to be completely overblown.
quax writes: The first quantum computing devices have hit the market, while the juggernauts of the IT industry are still in research mode. So what is the difference between what you can buy now and what IBM and Microsoft are researching? It turns out that, unlike the market for modern digital computers, the quantum computing field is far more diverse in terms of design and hardware approaches. This article attempts to sort this out and predicts a timeline for this nascent IT sector.
quax writes: Solving systems of linear equations is one of the most common mathematical problems, and a fairly easy one that everybody learns to work through in school. Surprisingly, a new algorithm has been found that improves on established methods within the domain of finite fields.
This algorithm is poised to find widespread use in applications as diverse as cryptography and quantum error correction.
The article provides links to the original paper and illustrates the concept of finite fields.
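For readers who want a baseline to compare against, here is a minimal sketch (my own illustration of the textbook method, not the new algorithm from the article) of solving a linear system over a prime field GF(p) by Gaussian elimination, where division becomes multiplication by a modular inverse:

```python
def solve_mod_p(A, b, p):
    """Solve the square system A x = b over GF(p), with p prime,
    by Gaussian elimination. Returns the solution vector, or None
    if the system is singular."""
    n = len(A)
    # Augmented matrix with all entries reduced mod p.
    M = [[A[i][j] % p for j in range(n)] + [b[i] % p] for i in range(n)]
    for col in range(n):
        # Find a row with a nonzero pivot in this column.
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            return None  # singular over GF(p)
        M[col], M[pivot] = M[pivot], M[col]
        # Divide the pivot row by the pivot: Fermat inverse, valid for prime p.
        inv = pow(M[col][col], p - 2, p)
        M[col] = [v * inv % p for v in M[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(M[r][j] - f * M[col][j]) % p for j in range(n + 1)]
    return [M[i][n] for i in range(n)]

# Example over GF(7): x + y = 3, x + 2y = 5 has the solution x = 1, y = 2.
solution = solve_mod_p([[1, 1], [1, 2]], [3, 5], 7)
```

Every arithmetic step stays inside the field, which is exactly the setting (cryptography, error-correcting codes) where the improved algorithms mentioned above matter.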