Debunking a Bogus Encryption Statement?

deviantphil asks: "Recently, a coworker tried to assert that encrypting a file twice with a 64 bit algorithm is equivalent to encrypting it once with a 128 bit algorithm. I know enough about encryption to know that isn't true, but I am having difficulties explaining why and how. Doesn't each pass of the encryption create a separate file header which makes this assertion untrue? Can anyone point me to references that would better help me explain this?" What other laughable claims have you heard attributed to encryption, and how were you able to properly lay them to rest?
This discussion has been archived. No new comments can be posted.

  • Snake oil FAQ (Score:5, Interesting)

    by Chuck Chunder ( 21021 ) on Tuesday August 22, 2006 @11:11PM (#15960058) Journal
    Is a good place to start [interhack.net] if you want to protect yourself against people hawking dodgy crypto.

    I used to work at a small company where a guy came in and was trying to get us involved with some dodgy "crypto" product. Management were quite taken with it (he could talk and even had a patent!) but technically it was a load of horse shit with some pretty multicolour diagrams. That FAQ helped me crystallise my objections and the guy was given the heave-ho before the company did anything embarrassing.
  • by tkw954 ( 709413 ) on Tuesday August 22, 2006 @11:34PM (#15960142)
    If the key is 64 bits, there are 2^64 (about 1.84 × 10^19) possible keys, so breaking the first layer takes at most that many tries. The second layer has the same number, so breaking both in sequence takes at most 2 × 2^64 = 2^65, or about 3.69 × 10^19, attempts.

    This assumes that you can determine when you break the first layer of encryption, i.e. it won't work if the encrypted string is not distinguishable from noise. If this is not true, you must try each possible 64 bit second key for each 64 bit first key, for a maximum of 2^64*2^64=2^128 different keys, which would be equivalent to brute-forcing one 128 bit key.
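    The submitter's "file header" hunch is exactly this recognizability condition. A toy sketch of the sequential attack (keys shrunk to 16 bits so the brute force finishes instantly; the "cipher" is an insecure keyed XOR stream invented here purely for illustration, and the `MAGIC` header is hypothetical): if each encryption layer wraps its input in a recognizable header, an attacker can peel the layers one at a time.

```python
import random

MAGIC = b"ENC1"  # hypothetical per-layer file header

def xor_stream(key, data):
    # insecure keyed XOR stream (illustration only); applying it twice
    # with the same key returns the original data
    rng = random.Random(key)
    return bytes(b ^ rng.randrange(256) for b in data)

def encrypt(key, plaintext):
    return xor_stream(key, MAGIC + plaintext)  # header sits under the encryption

def decrypt(key, ciphertext):
    return xor_stream(key, ciphertext)

KEYSPACE = 2**16                      # stand-in for 2^64
k1, k2 = 12345, 54321                 # the two secret layer keys
ct = encrypt(k2, encrypt(k1, b"attack at dawn"))

# Peel the outer layer first: the header tells us when a key guess is right,
# so each layer costs at most 2^16 tries -- 2 * 2^16 total, nowhere near 2^32.
# (The XOR stream is positional, so decrypting just the first 4 bytes suffices.)
outer = next(b for b in range(KEYSPACE) if decrypt(b, ct[:4]) == MAGIC)
inner_ct = decrypt(outer, ct)[len(MAGIC):]
inner = next(a for a in range(KEYSPACE) if decrypt(a, inner_ct[:4]) == MAGIC)

print(outer, inner)  # recovers both 16-bit keys sequentially
```

    Without the header (or any other way to recognize the intermediate value), this sequential peeling is impossible and the attacker is pushed toward the meet-in-the-middle tradeoff discussed below in the thread.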

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Wednesday August 23, 2006 @12:33AM (#15960337) Journal
    This is how I usually end this kind of conversation -- I go over their head. If I know more than they do about the subject, I speak jargon until their eyes glaze over, and then say something to the effect of "Do you really want to know why you're wrong, or will you take my word for it?" I'm a good enough teacher to back that up if they really do want to know. Or sometimes I start from the other end -- if I have a good enough analogy ready, I can start with that -- the money one is a good analogy for the crypto (Would you rather give me two $10 bills or one $100 bill? Would you let me settle a $100 debt with you by giving you two $10 bills?)

    The other method of going over their head is to cite someone -- not always even a specific person -- who is smarter than both of us. So for crypto algorithms, I can ask them, "If it was really that easy, don't you think someone else would've thought of it by now? Why do you think we invent wholly new algorithms for 128 bits?"

    And if both of those fail, the final method of going over their head is to cite specific sources and papers, and invite them to actually disprove what's said there (without really having to understand it yourself) if they really think they're right.

    All of these, of course, are for when I know I'm right. Quite often, I really don't know, and I stay humble there. But when I know I'm right, it's usually much more important to hit them over the head with one of the above than to try to explain why I'm right. After all, you don't have to know why a car is safer than open air in a lightning storm, you just have to get in the car. It's not important that someone understand why not to just double-encrypt with 64 bits, it's just important that they not do it.
  • Re:Group theory (Score:3, Interesting)

    by Beryllium Sphere(tm) ( 193358 ) on Wednesday August 23, 2006 @01:10AM (#15960435) Journal
    >it isn't (or wasn't) known whether DES keys form a group.

    Wasn't, although the result came very late in the history of DES:

    K.W. Campbell and M.J. Wiener, DES is not a group, Advances in Cryptology - Crypto '92, Springer-Verlag (1993), 512-520.

    Beep! Retrieval complete.
  • DES design issues (Score:5, Interesting)

    by billstewart ( 78916 ) on Wednesday August 23, 2006 @04:02AM (#15960842) Journal
    The general opinion about why NSA pushed DES to be 56 bits instead of 128 bits is that "differential cryptanalysis" attacks weaken it to about 55 bits anyway, so in fact you're not losing anything, and the 56-bit version was more compact and easier to implement in hardware. Searching a 56-bit keyspace isn't exactly in the reach of run-of-the-mill computers - you need a whole bunch of them working together to get any speed. On the other hand, Gilmore's custom DES cracker [nytimes.com] and the distributed crack are *so* 1998. I don't know how much ASIC technology has improved since then - Pentium IIs were up to 400 MHz, compared to ~3 GHz for a typical Intel desktop today, and memory prices and performance have also improved significantly, so maybe you could use 1/10 as many machines.
  • Easy - but wrong... (Score:4, Interesting)

    by billstewart ( 78916 ) on Wednesday August 23, 2006 @04:09AM (#15960858) Journal
    I hope you were suggesting the "Each bit doubles the strength" as one of the bogus assertions, not one of the true ones. For some kinds of algorithms, against some kinds of attacks, it's true. For algorithms like RSA and Diffie-Hellman that have some special properties to the keys, doubling the strength may require adding LogN bits. Some algorithms don't have variable-sized keys, and some of those that do aren't very good at using them - they're as strong as they are, and piling stuff on doesn't change the weaknesses, like rot-26. Some algorithms are groups - combining R rounds of N-bit keys just gets you the equivalent of one round with a different N-bit key.
  • enough analogies (Score:1, Interesting)

    by Anonymous Coward on Wednesday August 23, 2006 @04:12AM (#15960866)
    enough analogies already. it seems like most of the people here can only say "it's just like if...". don't trust the people who make analogies, trust the people who can explain the answer in the terms of the question. also, don't listen to someone who says "the simple answer is", and then writes 8 paragraphs.

    simple answer is that your friend is not entirely wrong. everyone here using the 64+64=65 argument is wrong unless they mention that it only applies in cases where a thing called a "meet in the middle attack" can be used. if that attack can't be used, or the person trying to crack your password doesn't have on the order of 10^19 bytes of storage available (i know hard drives are cheap, but c'mon now), then, because you don't get a green light that says "you cracked the first of the two keys, now crack the second one", encrypting twice does more than double the number of keys you have to try.

    but then you get into group theory, which affects us the same way as if i said: take the number 42, then add a random number between one and ten to it, and then add another random number between one and ten to it. i don't have to guess both of the numbers (100 possible combinations), i'd just have to guess the sum of those two numbers (19 possible sums, from 2 to 20). similar things happen when you encrypt twice using two keys of a cipher whose keys form a group: i don't have to guess both keys, because there's a single third key that's equivalent to encrypting with the two keys in sequence. it's more complicated than that, but in the same vein, and you can look it up if you're interested.

    ummmmm, i mean, it's like if you have a bunch of switches, and pools of lava, and each switch makes a tiger appear from a door and then 64+64=65.

    cheers.
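    The meet-in-the-middle attack mentioned above can be sketched at toy scale (an 8-bit block cipher invented here purely for illustration - a fixed random S-box applied after adding the key - with 8-bit keys so the search runs instantly):

```python
import random

# Toy 8-bit block cipher (illustration only, not a real cipher):
# add the key mod 256, then apply a fixed random S-box.
random.seed(1)
SBOX = list(range(256))
random.shuffle(SBOX)
INV = [0] * 256
for i, v in enumerate(SBOX):
    INV[v] = i

def enc(k, x):
    return SBOX[(x + k) % 256]

def dec(k, y):
    return (INV[y] - k) % 256

# Secret keys; the attacker knows a few plaintext/ciphertext block pairs.
k1, k2 = 77, 201
pts = [10, 99, 200]
cts = [enc(k2, enc(k1, p)) for p in pts]

# Meet in the middle: encrypt forward under every possible first key and
# store the intermediate values (2^8 table entries), then decrypt backward
# under every second key and look for a match -- about 2 * 2^8 cipher
# operations plus the table, instead of the 2^16 a naive search over both
# keys would need. this is the time/storage tradeoff the comment describes.
table = {enc(a, pts[0]): a for a in range(256)}
candidates = []
for b in range(256):
    mid = dec(b, cts[0])
    if mid in table:
        a = table[mid]
        # filter false matches against the remaining known blocks
        if all(enc(b, enc(a, p)) == c for p, c in zip(pts[1:], cts[1:])):
            candidates.append((a, b))

print(candidates)  # the true key pair (77, 201) is among the candidates
```

    Scaled up to two 64-bit keys, the forward table is what costs the ~10^19 bytes of storage: without it, double encryption really does cost far more than twice the work of single encryption.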
  • by ajs318 ( 655362 ) <sd_resp2@earthsh ... .co.uk minus bsd> on Wednesday August 23, 2006 @06:53AM (#15961229)
    The problem with simple ROT and EOR cipher schemes is that, for any K1(x) and K2(x) there exists a K3(x) such that K3(x) = K2(K1(x)). So encrypting something twice (first with K1 and then encrypting the ciphertext with K2) is really only the same as encrypting it once with a different key.

    Example: K1(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto DEFGHIJKLMNOPQRSTUVWXYZABC.
    K2(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto LMNOPQRSTUVWXYZABCDEFGHIJK.
    K2(K1(x)) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto OPQRSTUVWXYZABCDEFGHIJKLMN.

    But notice that this mapping is of the same kind. Therefore, the double encryption operation is exactly equivalent to a single encryption operation: an attacker need never actually discover K1 or K2 to break the combined cipher. Notice also that this does not depend on K2(K1(x)) == K1(K2(x)): it holds also for simple random alphabet scrambling {i.e. the generalised set of all 26! 1:1 mappings of the alphabet}.

    Example: K3(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => NXZGMRALUCFWHTYDKVEQSBIJOP.
    K4(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => DLYHSVWXGMREKTJUZBIOANQCFP.
    K4(K3(x)) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => TCPWKBDEAYVQXOFHRNSZILGMJU {a mapping of the same kind}
    but K3(K4(x)) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ => GWOLEBIJAHVMFQCSPXUYNTKZRD {a different mapping but still of the same kind}.

    I believe {I cannot think of, but would love to see, a counterexample} this can be generalised to any set of simple one-to-one mappings where the sets of permitted values at the input and output are equivalent: Km(Km-1(.....K2(K1(x)).....)) == Kn(x).
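    That closure property is easy to check by machine for the set of all 26! alphabet scramblings (the keys below are randomly generated for illustration, not the specific K3/K4 mappings above):

```python
import random
import string

ALPHA = string.ascii_uppercase

def random_key(rng):
    """A random 1:1 mapping of the alphabet (one of the 26! substitution keys)."""
    letters = list(ALPHA)
    rng.shuffle(letters)
    return dict(zip(ALPHA, letters))

def apply_key(key, text):
    return "".join(key[c] for c in text)

rng = random.Random(2006)
K3, K4 = random_key(rng), random_key(rng)

# Encrypting with K3 and then K4 is itself a single substitution key...
K4_K3 = {c: K4[K3[c]] for c in ALPHA}
# ...and so is the opposite order, though generally a *different* key.
K3_K4 = {c: K3[K4[c]] for c in ALPHA}

msg = "DOUBLESUBSTITUTIONISSINGLESUBSTITUTION"
assert apply_key(K4, apply_key(K3, msg)) == apply_key(K4_K3, msg)
assert apply_key(K3, apply_key(K4, msg)) == apply_key(K3_K4, msg)

# Both composites are still 1:1 mappings of the alphabet, so an attacker
# searching the single-key space never needs to find K3 or K4 separately.
assert sorted(K4_K3.values()) == list(ALPHA)
assert sorted(K3_K4.values()) == list(ALPHA)
print("composition stayed inside the 26! key family")
```

    The general claim holds because the composition of two bijections on the same set is itself a bijection on that set; whether double encryption with a *real* cipher collapses like this is exactly the "is it a group?" question raised elsewhere in the thread.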
  • Re:DES design issues (Score:3, Interesting)

    by swillden ( 191260 ) * <shawn-ds@willden.org> on Wednesday August 23, 2006 @07:56AM (#15961408) Journal

    The general opinion about why NSA pushed DES to be 56 bits instead of 128 bits is that "differential cryptanalysis" attacks weaken it to about 55 bits anyway

    Interesting. That's an idea I haven't read anywhere. Do you have a reference? Your explanation is at odds with my understanding of the timeline. As I understand it, Lucifer was vulnerable to differential cryptanalysis, but IBM independently discovered the technique during the design process of the final version, which used a smaller key size per NSA request. The IBM guys then built differential-cryptanalysis resistance into their design criteria for the S-boxes, which didn't prevent the NSA from specifying different S-box values anyway (strengthening it against linear cryptanalysis, IIRC).

    Searching a 56-bit keyspace isn't exactly in the reach of run-of-the-mill computers

    Sure it is, it just takes a lot of them. Not an infeasible number, though, which is the point.

    Gilmore's custom DES cracker and the distributed crack are *so* 1998.

    Actually, I think the custom DES cracker is more like 1970, though it would have cost a great deal more than the $250K the EFF spent. Diffie and Hellman estimated that such a machine could be built for $20M in 1977. I think it is very likely that the NSA built and operated such a machine during that timeframe.

  • by mbessey ( 304651 ) on Wednesday August 23, 2006 @06:58PM (#15966249) Homepage Journal
    Let's make sure we're on the same page. In order to perform the "meet in the middle" attack, you first need to have a known Plaintext and Ciphertext pair, P and C. In our example, we'll assume these are 64 bit blocks, or 8 bytes. We're also assuming that the keys are 64 bits.

    For each possible 64-bit key K, you need to generate and store Ek(P) in a table. It's the size of that table I was trying to calculate. So that's 8 bytes * 2^64 keys, which is 147,573,952,589,676,412,928. That sure looks like 147 Billion Billion to me.

    In any case, even using your figure of 2.2 Billion GB, you've still got the number of drives very wrong. When you divide 2.2 Billion by 750, you get 2.9 million drives required. First of all, that's probably larger than the total production run of those drives. Second, it's much more like $1.1 Billion, rather than $1.2 Million, in cost.

    I don't think hard drives are going to be the way to go. A decent tape library can hold 500 Terabytes or so, so you'd only need to buy (and power, and house) a mere 300,000 of those to hold the whole table.
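    Spelling out the arithmetic above (pure back-of-the-envelope math; the 750 GB drive and 500 TB tape-library sizes are the figures from the thread, and this uses the full 8 × 2^64-byte table rather than the smaller 2.2-billion-GB estimate being disputed):

```python
# Size of the meet-in-the-middle table: one 8-byte E_k(P) value per 64-bit key.
table_bytes = 8 * 2**64
print(table_bytes)                        # 147573952589676412928 -- ~147 billion billion

# At 750 GB per hard drive (the figure quoted above):
drives = table_bytes / (750 * 10**9)
print(f"{drives:.2e} drives")             # roughly 2e8 -- about 197 million drives

# At 500 TB per tape library:
libraries = table_bytes / (500 * 10**12)
print(f"{libraries:.0f} tape libraries")  # ~295,000, i.e. the "300,000" above
```

    The full table needs vastly more drives than the 2.9 million computed for the disputed 2.2-billion-GB figure, which only reinforces the point: the storage, not the CPU time, is what makes the attack impractical.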
