RSA is the thing we should be worried about, primarily because it's used so widely. The math behind quick factorization is probably already known; add in quantum computing and you'd need something like 2^1024-bit keys for RSA to make sense. So that really leaves symmetric encryption systems (I don't know enough about the curve-based stuff), and those are typically based on shifting, XORing with the key, and repeating.
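As a rough illustration of that "XOR with key, shift, repeat" structure, here's a toy round function (my own made-up example, not any real cipher; real designs like AES add non-linear S-boxes and proper key schedules on top of this skeleton):

```python
# Toy illustration of the "XOR with key, shift, repeat" round structure
# found in many symmetric ciphers. Everything here is made up for the
# example; it is NOT a secure cipher.

def rotl32(x: int, n: int) -> int:
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def toy_encrypt(block: int, round_keys: list[int]) -> int:
    """One round per key: XOR with the round key, then rotate."""
    for k in round_keys:
        block = rotl32(block ^ k, 7)
    return block

def toy_decrypt(block: int, round_keys: list[int]) -> int:
    """Undo the rounds in reverse order: rotate right, then un-XOR."""
    for k in reversed(round_keys):
        block = rotl32(block, 32 - 7) ^ k
    return block

if __name__ == "__main__":
    keys = [0xDEADBEEF, 0x12345678, 0x0BADF00D]
    ct = toy_encrypt(0xCAFEBABE, keys)
    assert toy_decrypt(ct, keys) == 0xCAFEBABE
```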
With respect to the article, I'm not at all sure that anything can be inferred from having ciphertext and plaintext for a given key unless you have a quick way to do set subtraction between the set of all possible keys and the keys that could feasibly produce that ciphertext from that plaintext. Even then, the key length can be increased very easily, so you'd need something like a linear-time method for the set subtraction. Then you'd just keep doing set subtraction until you were down to a reasonable number of keys, say 1,000,000, and then go the other way: brute-force future ciphertexts with the surviving keys and see which ones spit out coherent plaintext.
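Here's a toy sketch of that idea, with a deliberately tiny 16-bit key space so it can actually be enumerated (which is exactly why it doesn't work against real key lengths). The cipher and all names are invented for the example:

```python
# Toy sketch of the "set subtraction" idea: start from every possible key,
# keep only keys consistent with a known plaintext/ciphertext pair, then
# brute-force a later ciphertext with the survivors. The 16-bit key space
# is deliberately tiny; everything here is made up for illustration.

def xor_encrypt(pt: bytes, key: int) -> bytes:
    """Toy cipher: XOR each byte with the two key bytes, alternating."""
    kb = key.to_bytes(2, "big")
    return bytes(b ^ kb[i % 2] for i, b in enumerate(pt))

def consistent_keys(candidates, pt: bytes, ct: bytes):
    """'Set subtraction': drop keys that couldn't map pt to ct."""
    return {k for k in candidates if xor_encrypt(pt, k) == ct}

def looks_coherent(pt: bytes) -> bool:
    """Crude plaintext check: all printable ASCII."""
    return all(32 <= b < 127 for b in pt)

if __name__ == "__main__":
    secret_key = 0xBEEF
    known_pt = b"hello world"
    known_ct = xor_encrypt(known_pt, secret_key)

    survivors = consistent_keys(range(2**16), known_pt, known_ct)
    print(f"{len(survivors)} keys survive the known-plaintext filter")

    # "Go the other way": try the surviving keys on a future ciphertext.
    future_ct = xor_encrypt(b"attack at dawn", secret_key)
    for k in survivors:
        guess = xor_encrypt(future_ct, k)  # XOR cipher: decrypt == encrypt
        if looks_coherent(guess):
            print(hex(k), guess)
```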
But really, why not just work with OTPs, steganography, multiple layers of encryption, or private encryption schemes?
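For what it's worth, the OTP part is trivial to write down; the hard part is generating and distributing a pad as long as the message and never reusing it. A minimal sketch (my own example code):

```python
# Minimal one-time pad sketch: XOR with a truly random pad that is at least
# as long as the message and is never reused. Securely sharing the pad is
# the hard part in practice, not the math.
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(message), "pad must be at least message length"
    return bytes(m ^ p for m, p in zip(message, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

if __name__ == "__main__":
    msg = b"meet me at noon"
    pad = secrets.token_bytes(len(msg))  # must be shared securely, used once
    ct = otp_encrypt(msg, pad)
    assert otp_decrypt(ct, pad) == msg
```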