What is that method? How does it differ from current encryption techniques? Why is that well suited to encrypting against quantum computers? How did you come to that conclusion, given that you don't have one to test against?
I'm one of the authors of the research that was discussed. Unfortunately, the MIT Technology Review article doesn't contain much detail. Here's a link to our research paper: https://eprint.iacr.org/2014/5....
The scheme uses a mathematical primitive called the "ring learning with errors" (RLWE) problem. Rather than multiplying large prime numbers together as in RSA encryption, or using points on a curve as in elliptic curve cryptography, here the mathematical operation is based on multiplying polynomials together and then adding small random noise. An analogy is solving systems of linear equations: if you took first-year linear algebra, you might remember that if I give you a matrix A and a vector b, you can use Gaussian elimination (row reduction) to find a vector x such that Ax=b. But it turns out that if I add a small random noise vector e, and give you A and (Ax+e), it becomes much harder to find x. Our work is about actually using RLWE: designing a key exchange protocol that's suitable for use in SSL/TLS, and then implementing and testing it. (Here are some slides from a recent talk I gave about the research, which try to explain the problem in more detail: http://files.douglas.stebila.c...)
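To make the analogy concrete, here's a toy sketch of the "with errors" idea in Python. The parameters (tiny dimension, modulus 97, noise in {-1, 0, 1}) are purely illustrative and nowhere near what a real scheme uses; this is the plain LWE analogy from the paragraph above, not our actual RLWE key exchange.

```python
# Toy illustration of "learning with errors": given A and b = A*x + e (mod q),
# recovering the secret x is believed hard once small noise e is added.
# Parameters here are for illustration only -- far too small to be secure.
import random

q = 97   # modulus (illustrative; real schemes use much larger values)
n = 4    # dimension (real schemes use hundreds)

random.seed(0)
A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
x = [random.randrange(q) for _ in range(n)]        # the secret vector
e = [random.choice([-1, 0, 1]) for _ in range(n)]  # small random noise

# Without e, Gaussian elimination on (A, A*x) recovers x easily.
# With e, you are given only A and b = A*x + e (mod q).
b = [(sum(A[i][j] * x[j] for j in range(n)) + e[i]) % q for i in range(n)]

print("A =", A)
print("b =", b)
```

The ring variant (RLWE) replaces the matrix-vector product with multiplication of polynomials in a quotient ring, which shrinks key sizes and speeds up the arithmetic, but the "hide the secret under small noise" structure is the same.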
RLWE isn't our invention -- we build on existing research by Regev, Peikert, and others, and RLWE has been studied for several years now. RSA and elliptic curve cryptography can be broken by quantum computers because they have a certain periodic structure that can be efficiently detected using a quantum algorithm invented by Shor. But RLWE, and several related problems, don't seem to be susceptible to Shor's algorithm, nor to any of the other known quantum algorithms that give an exponential speedup over classical computers. No one in the research community today knows whether RLWE is truly hard for quantum computers, but right now it is accepted as a promising candidate and is being explored for a variety of uses. If, after years of cryptanalytic research, no one manages to break it, then it may earn levels of confidence comparable to those the research community has in currently accepted hard problems, like factoring or the elliptic curve discrete logarithm.