Debunking a Bogus Encryption Statement?
deviantphil asks: "Recently, a coworker tried to assert that encrypting a file twice with a 64 bit algorithm is equivalent to encrypting it once with a 128 bit algorithm. I know enough about encryption to know that isn't true, but I am having difficulties explaining why and how. Doesn't each pass of the encryption create a separate file header which makes this assertion untrue? Can anyone point me to references that would better help me explain this?" What other laughable claims have you heard attributed to encryption, and how were you able to properly lay them to rest?
Snake oil FAQ (Score:5, Interesting)
I used to work at a small company where a guy came in and was trying to get us involved with some dodgy "crypto" product. Management were quite taken with it (he could talk, and he even had a patent!) but technically it was a load of horse shit with some pretty multicolour diagrams. The Snake Oil FAQ helped me crystallise my objections, and the guy was given the heave-ho before the company did anything embarrassing.
Re:Think about it as number of possibilities (Score:3, Interesting)
This assumes that you can determine when you've broken the first layer of encryption; it won't work if the once-encrypted intermediate is indistinguishable from random noise. If you can't make that determination, you must try each possible 64-bit second key for each possible 64-bit first key, for a maximum of 2^64 * 2^64 = 2^128 key pairs, which would be equivalent to brute-forcing one 128-bit key.
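To illustrate that counting argument, here is a minimal sketch using a made-up one-byte "cipher" and 8-bit keys; none of it is real cryptography, and only the trial count carries over to the 64-bit case.

    # A toy 8-bit "block cipher", invented only for this sketch; not real crypto.
    def toy_encrypt(block, key):
        x = (block ^ key) & 0xFF
        return ((x << 3) | (x >> 5)) & 0xFF      # XOR with the key, then rotate left 3

    def double_encrypt(block, k1, k2):
        return toy_encrypt(toy_encrypt(block, k1), k2)

    # One known plaintext/ciphertext pair under two secret 8-bit keys.
    secret_k1, secret_k2 = 0x5A, 0xC3
    p = 0x41
    c = double_encrypt(p, secret_k1, secret_k2)

    # If the intermediate value looks like noise, the only generic attack is to
    # try every (k1, k2) pair and test the final result: 2^8 * 2^8 = 2^16 trials
    # here, and by the same counting 2^64 * 2^64 = 2^128 trials for 64-bit keys.
    matches = [(k1, k2) for k1 in range(256) for k2 in range(256)
               if double_encrypt(p, k1, k2) == c]
    print(f"tried {256 * 256} key pairs; {len(matches)} fit this single block")

(With a single byte of known plaintext, several wrong key pairs also happen to fit; that is a separate issue, and the point here is only the number of trials.)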
I'm right and I know it (Score:3, Interesting)
The other method of going over their head is to cite someone -- not always even a specific person -- who is smarter than both of us. So for crypto algorithms, I can ask them, "If it was really that easy, don't you think someone else would've thought of it by now? Why do you think we invent wholly new algorithms for 128 bits?"
And if both of those fail, the final method of going over their head is to cite specific sources and papers, and invite them to actually disprove what's said there (without really having to understand it yourself) if they really think they're right.
All of these, of course, are for when I know I'm right. Quite often I really don't know, and I stay humble there. But when I know I'm right, it's usually much more important to hit them over the head with one of the above than to try to explain why I'm right. After all, you don't have to know why a car is safer than the open air in a lightning storm, you just have to get in the car. It's not important that someone understand why not to just double-encrypt with 64 bits; it's just important that they not do it.
Re:Group theory (Score:3, Interesting)
Wasn't, although the result came very late in the history of DES:
K.W. Campbell and M.J. Wiener, DES is not a group, Advances in Cryptology - Crypto '92, Springer-Verlag (1993), 512-520.
Beep! Retrieval complete.
DES design issues (Score:5, Interesting)
Easy - but wrong... (Score:4, Interesting)
enough analogies (Score:1, Interesting)
The simple answer is that your friend is not entirely wrong. Everyone here saying 64+64=65 is wrong unless they mention that this only happens in cases where a thing called a "meet-in-the-middle attack" can be used (there's a small sketch of it further down). If that attack can't be used, or the person trying to crack your password doesn't have on the order of 10^19 bytes of storage available (I know hard drives are cheap, but c'mon now), then because you don't get a green light that says "you cracked the first of the two keys, now crack the second one," encrypting twice does far more than double the number of keys you have to try. But then you get into group theory, which affects us in the same way as if I said: take the number 42, add a random number between one and ten to it, and then add another random number between one and ten to it. I don't have to guess both of the numbers (100 possible combinations), I just have to guess their sum (19 possible sums, from 2 to 20). Similar things happen when you encrypt twice using two different keys from a cipher that forms a group: I don't have to guess both keys, because there's a single other key that is equivalent to encrypting with the two of them. It's more complicated than that, but in the same vein, and you can look it up if you're interested.
ummmmm, i mean, it's like if you have a bunch of switches, and pools of lava, and each switch makes a tiger appear from a door and then 64+64=65.
cheers.
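To make the meet-in-the-middle idea concrete, here is a rough sketch on the same sort of made-up 8-bit toy cipher used in the earlier sketch, invented purely for illustration. The table of intermediate values is tiny here; with 64-bit keys it is the roughly 10^19 to 10^20 bytes of storage mentioned above.

    # Toy cipher as before: XOR with the key, then rotate left 3 bits.
    def toy_encrypt(block, key):
        x = (block ^ key) & 0xFF
        return ((x << 3) | (x >> 5)) & 0xFF

    def toy_decrypt(block, key):
        x = ((block >> 3) | (block << 5)) & 0xFF  # rotate right 3 to undo the rotation
        return x ^ key

    secret_k1, secret_k2 = 0x5A, 0xC3
    p = 0x41                                                  # known plaintext
    c = toy_encrypt(toy_encrypt(p, secret_k1), secret_k2)     # observed ciphertext

    # Step 1: encrypt the known plaintext under every possible first key and
    # store the intermediate values (2^8 entries here, 2^64 in the real case).
    middle = {}
    for k1 in range(256):
        middle.setdefault(toy_encrypt(p, k1), []).append(k1)

    # Step 2: decrypt the ciphertext under every possible second key and look
    # for a meeting point in the middle.
    candidates = [(k1, k2)
                  for k2 in range(256)
                  for k1 in middle.get(toy_decrypt(c, k2), [])]

    # Total work is about 2 * 2^8 cipher operations instead of 2^16, i.e. roughly
    # 2^65 instead of 2^128 with 64-bit keys, which is where "64 + 64 = 65" comes
    # from once you can afford the table.
    print(len(candidates), "candidate pairs; real keys included:",
          (secret_k1, secret_k2) in candidates)

With only one byte of known plaintext, many wrong pairs also land in the candidate list; a second known block would narrow it down to the real keys, but the work estimate is unchanged.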
Re:Debunking this claim (Score:3, Interesting)
Example: K1(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto DEFGHIJKLMNOPQRSTUVWXYZABC.
K2(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto LMNOPQRSTUVWXYZABCDEFGHIJK.
K2(K1(x)) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto OPQRSTUVWXYZABCDEFGHIJKLMN.
But notice that this mapping is of the same kind. Therefore, the double encryption operation is exactly equivalent to a single encryption operation: an attacker need never actually discover K1 or K2 to break the combined cipher. Notice also that this does not depend on K2(K1(x)) == K1(K2(x)): it holds also for simple random alphabet scrambling {i.e. the generalised set of all 26! 1:1 mappings of the alphabet}.
Example: K3(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => NXZGMRALUCFWHTYDKVEQSBIJOP.
K4(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => DLYHSVWXGMREKTJUZBIOANQCFP.
K4(K3(x)) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => TCPWKBDEAYVQXOFHRNSZILGMJU {a mapping of the same kind}
but K3(K4(x)) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => GWOLEBIJAHVMFQCSPXUYNTKZRD {a different mapping but still of the same kind}.
I believe {I cannot think of, but would love to see, a counterexample} this can be generalised to any set of simple one-to-one mappings where the sets of permitted values at the input and output are the same: K_m(K_{m-1}(...K_2(K_1(x))...)) == K_n(x) for some single key K_n in the same set.
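For anyone who wants to check the tables above, a few lines of Python that just mechanically compose the mappings; nothing is assumed beyond simple alphabet substitution.

    import string

    ALPHABET = string.ascii_uppercase

    def make_mapping(image):
        """Return the substitution that sends ABCDEFGHIJKLMNOPQRSTUVWXYZ onto image."""
        return dict(zip(ALPHABET, image))

    def apply_mapping(mapping, text):
        return "".join(mapping[ch] for ch in text)

    K1 = make_mapping("DEFGHIJKLMNOPQRSTUVWXYZABC")   # shift by 3
    K2 = make_mapping("LMNOPQRSTUVWXYZABCDEFGHIJK")   # shift by 11

    # K2(K1(x)) as a single table: OPQRS..., i.e. just a shift by 14.
    print(apply_mapping(K2, apply_mapping(K1, ALPHABET)))

    K3 = make_mapping("NXZGMRALUCFWHTYDKVEQSBIJOP")
    K4 = make_mapping("DLYHSVWXGMREKTJUZBIOANQCFP")

    # Either order of composition is still a plain one-to-one alphabet scramble,
    # so an attacker only ever has to find one equivalent key, never both.
    print(apply_mapping(K4, apply_mapping(K3, ALPHABET)))   # TCPWKBDEAYVQXOFHRNSZILGMJU
    print(apply_mapping(K3, apply_mapping(K4, ALPHABET)))   # GWOLEBIJAHVMFQCSPXUYNTKZRD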
Re:DES design issues (Score:3, Interesting)
The general opinion about why NSA pushed DES to be 56 bits instead of 128 bits is that "differential cryptanalysis" attacks weaken it to about 55 bits anyway
Interesting. That's an idea I haven't seen anywhere else. Do you have a reference? Your explanation is at odds with my understanding of the timeline. As I understand it, Lucifer was vulnerable to differential cryptanalysis, but IBM independently discovered the technique during the design process of the final version, which used a smaller key size at the NSA's request. The IBM guys then built differential-cryptanalysis resistance into their design criteria for the S-boxes, which didn't prevent the NSA from specifying different S-box values anyway (strengthening it against linear cryptanalysis, IIRC).
Searching a 56-bit keyspace isn't exactly in the reach of run-of-the-mill computers
Sure it is, it just takes a lot of them. Not an infeasible number, though, which is the point (there's some rough arithmetic below).
Gilmore's custom DES cracker and the distributed crack are *so* 1998.
Actually, I think the custom DES cracker is more like 1970, though it would have cost a great deal more than the $250K the EFF spent. Diffie and Hellman estimated that such a machine could be built for $20M in 1977. I think it is very likely that the NSA built and operated such a machine during that timeframe.
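For a rough sense of scale on the 56-bit point, here is a back-of-the-envelope calculation; the per-machine key rate and the machine count are assumptions picked purely for illustration, not measured figures.

    # Rough feasibility arithmetic for sweeping a 56-bit keyspace in software.
    keyspace = 2 ** 56
    keys_per_sec_per_machine = 1_000_000      # assumed per-PC software DES rate
    machines = 10_000                         # assumed size of a distributed effort

    seconds = keyspace / (keys_per_sec_per_machine * machines)
    print(f"{seconds / 86400:.0f} days for a full sweep")   # about 83 days; half that on average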
I think you've got the math wrong, actually (Score:3, Interesting)
For each possible 64-bit key K, you need to generate and store E_K(P) in a table. It's the size of that table I was trying to calculate. So that's 8 bytes * 2^64 keys, which comes to 147,573,952,589,676,412,928 bytes. That sure looks like 147 Billion Billion to me.
In any case, even using your figure of 2.2 Billion GB, you've still got the number of drives very wrong. When you divide 2.2 Billion by 750, you get 2.9 million drives required. First of all, that's probably larger than the total production run of those drives. Second, it's much more like $1.1 Billion, rather than $1.2 Million, in cost.
I don't think hard drives are going to be the way to go. A decent tape library can hold 500 Terabytes or so, so you'd only need to buy (and power, and house) a mere 300,000 of those to hold the whole table.
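For anyone who wants to redo the arithmetic in this exchange, the figures fall out in a few lines; the 750 GB drive and 500 TB tape-library sizes are the ones quoted above.

    # One 8-byte Ek(P) entry per possible 64-bit key, ignoring any index overhead.
    table_bytes = (2 ** 64) * 8
    print(f"{table_bytes:,} bytes")                    # 147,573,952,589,676,412,928
    print(f"{table_bytes / 1e18:.1f} billion billion bytes")

    # Even the disputed figure of 2.2 billion GB needs millions of 750 GB drives:
    print(f"{2.2e9 / 750:,.0f} drives")                # about 2,933,333

    # The full table, stored on 500 TB tape libraries:
    print(f"{table_bytes / 500e12:,.0f} libraries")    # about 295,148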