Debunking a Bogus Encryption Statement?

deviantphil asks: "Recently, a coworker tried to assert that encrypting a file twice with a 64 bit algorithm is equivalent to encrypting it once with a 128 bit algorithm. I know enough about encryption to know that isn't true, but I am having difficulties explaining why and how. Doesn't each pass of the encryption create a separate file header which makes this assertion untrue? Can anyone point me to references that would better help me explain this?" What other laughable claims have you heard attributed to encryption, and how were you able to properly lay them to rest?
  • by dtfinch ( 661405 ) * on Tuesday August 22, 2006 @10:46PM (#15959969) Journal
    http://en.wikipedia.org/wiki/Meet-in-the-middle_attack [wikipedia.org]

    That's why we have triple-DES instead of double-DES.
    • Re: (Score:3, Informative)

      by TheSHAD0W ( 258774 )
      Even if DES is tripled rather than doubled for their "128-bit crypto", I still don't like it much. The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit. I say ideal, because typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it, and do so using the same encryption hardware. But while triple-DES may be closer to the
      • by swillden ( 191260 ) * <shawn-ds@willden.org> on Wednesday August 23, 2006 @01:23AM (#15960470) Journal

        The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit.

        When talking about block ciphers, block size and key size are separate things. Generally, if you say algorithm X is an n-bit cipher, you're talking about key size, not block size. Given use of good block chaining modes, ciphers with larger block sizes don't really offer significant additional security.

        typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it

        Original DES hasn't really been reduced significantly. There are attacks which reduce the computational complexity to, IIRC, ~41 bits, but at the expense of requiring impractical space. All in all, DES has stood up extremely well and its practical strength hasn't been significantly reduced by 30+ years of scrutiny.

        The problem 3DES was created to address was a deliberate limitation of the design: The small keyspace. The strongest variant of Lucifer (the original IBM cipher which became DES), used a 128-bit key, but the NSA deliberately had it reduced to a 56-bit key, presumably because they thought they could break the cipher with the smaller key size, but it was still secure enough against others. Advances in computation technology, of course, have put a brute force search of a 56-bit keyspace within reach of run-of-the-mill desktop computers. 3DES addresses this by increasing the effective keyspace to 112 bits.

        • DES design issues (Score:5, Interesting)

          by billstewart ( 78916 ) on Wednesday August 23, 2006 @04:02AM (#15960842) Journal
          The general opinion about why NSA pushed DES to be 56 bits instead of 128 bits is that "differential cryptanalysis" attacks weaken it to about 55 bits anyway, so in fact you're not losing anything, and the 56-bit version was more compact and easier to implement in hardware. Searching a 56-bit keyspace isn't exactly in the reach of run-of-the-mill computers - you need a whole bunch of them working together to get any speed. On the other hand, Gilmore's custom DES cracker [nytimes.com] and the distributed crack are *so* 1998. I don't know how much ASIC technology has improved since then - Pentium IIs were up to 400 MHz, compared to ~3 GHz for a typical Intel desktop today, and memory prices and performance have also improved significantly, so maybe you could use 1/10 as many machines.
          • Re: (Score:3, Interesting)

            by swillden ( 191260 ) *

            The general opinion about why NSA pushed DES to be 56 bits instead of 128 bits is that "differential cryptanalysis" attacks weaken it to about 55 bits anyway

            Interesting. That's an idea I haven't read anywhere. Do you have a reference? Your explanation is at odds with my understanding of the timeline. As I understand it, Lucifer was vulnerable to differential cryptanalysis, but IBM independently discovered the technique during the design process of the final version, which used a smaller key size per N

          • Re:DES design issues (Score:5, Informative)

            by WuphonsReach ( 684551 ) on Wednesday August 23, 2006 @09:06AM (#15961731)
            10x improvement over the last 10 years is about right (give or take 2 years).

            A Celeron 566MHz has a memory bandwidth around 150-175MB/s. An Athlon64 3200+ has a memory bandwidth around 1600MB/s. (Approximate numbers, with all sorts of undocumented assumptions.) So it's a pretty good bet that modern single-core CPUs are at least 10x as fast as the ones from 8-10 years ago.

            Of course, now we have things like dual/quad CPU machines with 2-4 cores per machine... so we could up that number to 150x as fast as a machine from 8-10 years ago. So instead of 3000 machines, we might only need 20-30.

    • by eddeye ( 85134 ) on Wednesday August 23, 2006 @02:00AM (#15960552)

      It's even worse if your cipher's set of permutations forms a group. In that case, for any two keys k1 and k2, there exists a third key k3 such that ENC_k2(ENC_k1(x)) = ENC_k3(x). In other words you can find a third key that produces the same permutation as the composition of keys 1 and 2. This means that breaking a double encryption (or triple, quadruple, etc.) is no harder than breaking a single encryption: the resulting permutation can always be described by a single key.

      That's why 3DES uses EDE instead of EEE. While DES doesn't form a group, it does have some group-like structures which reduce the workload quite a bit. This doesn't apply to all ciphers btw; there are many more possible 64-bit permutations than 64-bit keys, so compositions can fall well outside those covered by keyspace.

      I think this is covered in chapter 7 of the Handbook of Applied Cryptography [uwaterloo.ca].

    • DES was really strong for an algorithm with a 56-bit key, and was really just fine for the 1970s, though by 1998 DES-crackers had become affordable. Most 128-bit algorithms that people use are reasonably high quality as well.

      But 64 bits? There are a few algorithms that have variable key lengths that work at that size (RC5 is pretty strong), and then there's DES where you're counting the 8 parity bits as part of your keylength, but most of the 64-bit algorithms I've seen were things people hacked togethe

  • Seems like encrypting twice with a 64-bit key just means that instead of breaking a 64-bit key once, you have to do it twice. That's like using a 65-bit key, instead of a 64-bit key. Every bit doubles your keyspace.

    Plus, you have to realize that the amount of time needed to brute force a single key of a certain size goes up non-linearly with each additional bit. If you just double the number of times you encrypt, you're pretty much just increasing the effort to brute force the thing linearly.

    • by sr180 ( 700526 ) on Tuesday August 22, 2006 @10:56PM (#15960008) Journal
      Think of it as this: If you encrypt something as rot-13 twice, does it become any more secure?
      • Re: (Score:3, Informative)

        by igb ( 28052 )
        Yes, but rot-N forms a group under composition (i.e. there exists a Z such that rot-X followed by rot-Y is equivalent to rot-Z). DES, for example, has been shown not to form a group, so there is no single key equivalent to the use of two distinct keys.

        ian

      • If you rot-13'd something twice, wouldn't that effectively bring it back to its original state?
      • Re: (Score:3, Funny)

        by Jim Hall ( 2985 )

        Absolutely! That's why I double-rot13 all my emails before sending them! :-)

      • Here's one with a HUGE key space - and much more difficult to break.

        "cat /dev/urandom > file_you_want_encrypted.doc"

        And for the best effect, you can let this run all night. Ignore the "disk full" message. It's a feature, not a bug.

        Pretty much guaranteed nobody else will be able to read your original contents when it's finished.

    • Re: (Score:3, Insightful)

      by Jeff DeMaagd ( 2015 )
      A better illustration is to ask them whether they'd rather have two ten-dollar bills or one one-hundred-dollar bill. Both give you two zeros, but where those zeros sit makes a big difference in the value of the number.
    • by TCM ( 130219 ) on Tuesday August 22, 2006 @11:30PM (#15960128)
      Seems like encrypting twice with a 64-bit key just means that instead of breaking a 64-bit key once, you have to do it twice.
      I think there's more to this problem than you realize at first. Bruteforce relies on the ability to identify when you have successfully broken the cipher text. If you encrypt random data and then try to bruteforce it, how will you know at the first layer that you have successfully broken it? It's not like some magic bit is telling you whether the key was correct or not. If you decrypt with the wrong key, you just get (truly random?) garbled data.

      Wouldn't you have to do the "inner" 64bit bruteforce procedure for each possible key of the "outer" 64bit keyspace, thus making it 128bit again? I'm feeling the effective key strength is somewhere between 64bit and 128bit, certainly more than 65bit, but not really 128bit.

      IANAC(ryptologist).
      • I think a big part of that comes down to the file headers and how you actually implement the cryptographic algorithms into a system.

        If you take a plaintext file and encrypt it into a file which has headers ("BEGIN ENCRYPTED CONTENT---"), and then encrypt the result again, assuming the attacker knows how you did it and that the intermediate file has plaintext headers, then they'll know the moment they broke the first 64-bit encryption layer. So in this example, you're basically at 65 bits.

        Now if you don't include any headers, so that there's no terribly good way to determine whether you've gotten the right key or not, as you're brute-forcing the first layer, then I think you're right -- the strength of the overall system is somewhere in a grey area between 65 and 128 bits.

        If someone was just thinking that they could use a file-encryption utility twice (which produces output files that have plaintext headers) and double the keyspace, they are dead wrong.
        • by jlcooke ( 50413 )
          If you're using a cipher that is weak to known plaintext attacks, you have other issues. The above argument doesn't hold, sorry.
          • I'm not talking about a known-plaintext attack on the cipher itself. (Although if you knew that the input into the second (the "outer") cipher contained headers, you could use this as part of a known-plaintext attack, but that's not what I was discussing.)

            Rather I was just saying that if you have file headers on the intermediate file, then it becomes quite easy to figure out when you've brute-forced the outer layer of encryption. Without these headers, it's much harder to tell when you've gotten the correct
      • by swillden ( 191260 ) * <shawn-ds@willden.org> on Wednesday August 23, 2006 @01:37AM (#15960505) Journal

        Wouldn't you have to do the "inner" 64bit bruteforce procedure for each possible key of the "outer" 64bit keyspace, thus making it 128bit again?

        No, you do a meet-in-the-middle attack, which is basically 2^64 in complexity if you're using two 64-bit keys.

        There are some optimizations that can be done, but the basic idea is this: You start with one ciphertext block and its corresponding known plaintext. Then you encrypt the plaintext with all 2^64 possible keys and store the results (with some indexes to make it easy to search for a particular block). Then you decrypt the ciphertext with all 2^64 possible keys, looking each result up. When you find a match, you have found the pair of 64-bit keys that turns that plaintext into that ciphertext. So to search the entire space, you have to do 2^64+2^64 = 2^65 trial operations. On average, you only have to search half of the second keyspace, so the complexity of the search is 2^64+2^63 ≈ 2^64 (plus the huge storage cost).

        Triple encryption is also weakened by a meet-in-the-middle attack. Using three 64-bit keys, it would be nice to think you have a 192-bit key. But a meet-in-the-middle attack only requires encrypting with all 2^64 possible first keys, and decrypting with all 2^128 possible pairs of second and third keys, giving an effective strength of 2^64+2^127 ≈ 2^127 on average (plus the huge storage cost).

        Finally, keep in mind that DES keys aren't 64 bits. They're 56 bits. So 3DES has a brute force search computational complexity of 111 bits, on average.
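        To make the meet-in-the-middle idea concrete, here is a minimal Python sketch against a made-up 16-bit toy cipher (the cipher, its constants, the keys and the plaintexts are all invented for illustration; this is not DES or any real algorithm). Building the table costs about 2^16 encryptions and the matching pass about 2^16 decryptions, instead of the 2^32 a naive double brute force would need:

        MASK = 0xFFFF                      # 16-bit blocks and 16-bit keys keep the demo fast
        MUL = 0x6B2D                       # any odd constant is invertible mod 2^16
        MUL_INV = pow(MUL, -1, 1 << 16)    # modular inverse (Python 3.8+)

        def enc(k, x):
            # toy block cipher: XOR with the key, then an odd multiply mod 2^16
            return ((x ^ k) * MUL) & MASK

        def dec(k, x):
            # inverse of enc(): multiply by the modular inverse, then XOR off the key
            return ((x * MUL_INV) & MASK) ^ k

        # a few known plaintext/ciphertext pairs under an unknown pair of 16-bit keys
        k1, k2 = 0x3A7C, 0xC150
        pairs = [(p, enc(k2, enc(k1, p))) for p in (0x1234, 0xBEEF, 0x0F0F)]
        p1, c1 = pairs[0]

        # meet in the middle: tabulate the first half, then search from the second half
        middle = {enc(k, p1): k for k in range(1 << 16)}      # ~2^16 encryptions, stored
        for kb in range(1 << 16):                             # ~2^16 decryptions
            ka = middle.get(dec(kb, c1))
            if ka is not None and all(enc(kb, enc(ka, p)) == c for p, c in pairs[1:]):
                print("recovered key pair:", hex(ka), hex(kb))  # extra pairs weed out coincidences

        The same memory-for-time trade is what drops two independent 64-bit keys from the hoped-for 2^128 down to roughly 2^65 work.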

      • by julesh ( 229690 )
        I think there's more to this problem than you realize at first. Bruteforce relies on the ability to identify when you have successfully broken the cipher text.

        We're probably talking consumer encryption software here. Many such programs (most?) include a hash of the encrypted data so that they can verify that decryption was successful, or at least some other mechanism that allows them to prompt for an alternative key if the one specified didn't work. Even if there are multiple possible keys that won't prod
        • I'm pretty sure that every encryption algorithm requires a checksum or hash of the data to ensure that the correct key was provided. Otherwise, you'd have a huge usability problem. Accidentally using the wrong key would still lead to data, but the user would have to identify why the data was not what they expected. Was it the wrong key? Was the file corrupted? Is there bad memory on the system, causing data to get corrupted?
    • by billstewart ( 78916 ) on Wednesday August 23, 2006 @12:06AM (#15960258) Journal
      Any sentence beginning with "Seems like encrypting twice..." is likely to be doomed to bogosity, unless there's a later clause in it like "but that's not what really happens". Crypto not only depends on a lot of deep math, it also depends on a lot of cryptographers spending time following bits around to see where they go and how to break them.

      Sometimes things do what your mathematical intuition tells you, if you're mature enough in the field to have a solid intuition, but often they don't. Problems can be very hard if looked at from one direction and very easy (or at least less hard) when looked at from another direction, and a cryptographer's job is to make sure they've checked out _all_ the directions because it only takes one weak one to break something. NP-complete problems are especially that way - they're potentially useful for crypto because there's one easy direction if you know the key, but many problems can't be transformed in a way that you can use the easy path and the attacker's stuck on the hard paths.

      But even the bit-twiddly algorithms, like most hash functions or the S-box building blocks inside Feistel-structured algorithms, can often be cracked by people examining them closely enough to find conditions under which they can deduce bits. For instance, MD5 is pretty much busted these days.

      And both mathematical crypto and bit-twiddly crypto have to worry about Moore's Law and brute force - some algorithms scale well, so you can double the strength of an N-bit key by adding 1 bit or maybe logN bits, while others don't, or they form groups so that encrypting a message multiple times with an N-bit key still only gives you N bits of strength (leaving aside pathological cases like rot13.)

    • If there is no known plain text in the second pass of the encryption, how do you know you've broken the first pass?

      If there's no fixed header and no way to quickly distinguish a random putative cleartext from rubbish in between the two encryption phases, then in effect a brute force attack would take 2^64*2^64 steps, that's to say 2^128 ...
    • by darkonc ( 47285 )
      You can explain it more easily in the context of dropping the key size by a lot -- for example, encrypting twice with 8-bit keys.

      If you encrypt twice with 8-bit keys, then you have to decrypt twice with 8-bit keys, and for each decryption you have 256 possible keys... This means that across the two decryptions, you have to make a total of at most 512 guesses.

      If, on the other hand, you have a single 16-bit key, then you have 64K possible guesses.

      Now, one of the things that this presumes is that, when you find the

    • Seems like encrypting twice with a 64-bit key just means that instead of breaking a 64-bit key once, you have to do it twice. That's like using a 65-bit key, instead of a 64-bit key. Every bit doubles your keyspace.

      Ideally, yes, but it depends on the cipher, and the implementation.

      The cipher may form a group, in which case some single 64-bit key will decode the double-encrypted ciphertext straight back to the original plaintext; instead of doubling the attacker's work, you've just given them a second key that will decrypt.

      The implementation may le

  • Easy (Score:3, Insightful)

    by suricatta ( 617778 ) on Tuesday August 22, 2006 @10:54PM (#15959999)
    Each bit doubles the strength
    • Easy - but wrong... (Score:4, Interesting)

      by billstewart ( 78916 ) on Wednesday August 23, 2006 @04:09AM (#15960858) Journal
      I hope you were suggesting the "Each bit doubles the strength" as one of the bogus assertions, not one of the true ones. For some kinds of algorithms, against some kinds of attacks, it's true. For algorithms like RSA and Diffie-Hellman that have some special properties to the keys, doubling the strength may require adding LogN bits. Some algorithms don't have variable-sized keys, and some of those that do aren't very good at using them - they're as strong as they are, and piling stuff on doesn't change the weaknesses, like rot-26. Some algorithms are groups - combining R rounds of N-bit keys just gets you the equivalent of one round with a different N-bit key.
  • by unitron ( 5733 ) on Tuesday August 22, 2006 @10:55PM (#15960006) Homepage Journal
    While you're double super secret encrypting that file why not keep compressing and re-compressing it until you have it down to 1 bit as well? :-)
  • Encrypting a single document multiple times in that way is like using a piece of duct-tape to keep a box closed. Putting another strip of duct-tape directly over the top of the previous strip does not make the box any harder to open.

    Of course this is just an analogy from a non-expert... Use at your own risk. :-)
  • Debunking this claim (Score:5, Informative)

    by jordandeamattson ( 261036 ) <jordandmNO@SPAMgmail.com> on Tuesday August 22, 2006 @10:57PM (#15960012) Homepage
    The length of the key (64-bits, 128-bits) at a simple level describes the complexity or "hardness" of the encryption. Encrypting with a 64-bit key will give you one level of "hardness" and 128-bit will give you another, greater, level of hardness.

    If you encrypt twice with a 64-bit key you will just be taking a plain text file and encrypting it and then taking your encrypted file and encrypting it once again. To break encryption on this file, getting access to the plain text, you would need to break 64-bit encryption twice.

    While this would be more of a challenge than breaking 64-bit standalone encryption, it wouldn't be equivalent to the security of a 128-bit key. The difficulty of breaking 128-bit encryption isn't 2 times 64-bit encryption. It is actually greater - I can't quantify it without additional research, hopefully someone who can recall it off the top of their head can chime in with the details.

    Now, breaking a doubly encrypted file would present some interesting challenges. The most obvious is that your plaintext for the first layer of encryption is anything but plaintext. Confirming that one set of encrypted text is right one vs. some other set of encrypted text would definitely be a challenge. It would make it difficult to determine if you had found the right key to decrypt your first layer of encryption.

    As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!

    Yours,

    Jordan
    • As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!

      Rot-13 comes to mind...

    • by Coryoth ( 254751 ) on Tuesday August 22, 2006 @11:52PM (#15960212) Homepage Journal
      As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research.


      It isn't very common for the resulting encryption to be less secure, but "no appreciable improvement" is certainly very possible.

      With block ciphers you can run into the issue that if the cipher scheme forms a group (as some do) then even if you use 2 different keys for each encryption round, there will be a single third key that provides identical encryption: an attacker never needs to break your two keys, they can simply find the single third key that is equivalent to your double encipherment. Despite encrypting twice with two different keys the result is no more secure than encrypting once. For a simple example of this, consider a ROT encryption scheme - if you encrypt a message with ROT9 then ROT12 that is just equivalent to ROT21, so instead of trying to find the two keys 9 and 12, the attacker can just solve the ROT21 problem (the worst case, of course, is if your combination of keys gives the identity element such as ROT9 and ROT17, or ROT13 twice, in which case you end up with an unencrypted document).

      Even if that's not the case you can still get caught with a meet-in-the-middle approach if the attacker has some known plaintext (that is, they have some plaintext with the corresponding encrypted text and are attempting to extract your key) which does pretty much what it says, encrypting one way and decrypting the other to meet in the middle. Using such a scheme you can extract the keys with only marginally more effort than for single encryption.
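      The ROT arithmetic above is easy to check directly. A tiny Python sketch (the message string is just an arbitrary example):

      import string

      def rot(n, text):
          # Caesar/ROT cipher: shift lowercase letters by n, leave everything else alone
          lower = string.ascii_lowercase
          table = str.maketrans(lower, lower[n % 26:] + lower[:n % 26])
          return text.translate(table)

      msg = "attack at dawn"
      assert rot(12, rot(9, msg)) == rot(21, msg)   # ROT9 then ROT12 is just ROT21
      assert rot(17, rot(9, msg)) == msg            # ROT9 then ROT17 composes to the identity
      assert rot(13, rot(13, msg)) == msg           # double ROT13 hands back the plaintext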
    • Re: (Score:2, Informative)

      by xsonofagunx ( 902794 )
      Going purely by the numbers, I believe that the possible combinations for a 128-bit security system would be 3.4028236692093846346337460743177 * 10^38

      In contrast, a 64-bit encryption scheme leaves you with 18,446,744,073,709,551,616. Done twice, that means it's 36,893,488,147,419,103,232 which is still far less than 128-bit encryption. I may be doing my calculations wrong (big numbers are scary...) but I think the difference between brute-force cracking a 128-bit key and two 64-bit keys is 340,282,3
    • The difficulty of breaking 128-bit encryption isn't 2 times 64-bit encryption. It is actually greater - I can't quantify it without additonal research, hopefully someone who can recall it off the top of their head can chime in with the details.

      The difficulty of a brute force search is directly proportional to the number of keys to be searched. So 2^128 is how many times bigger than 2^64? It's 2^64 times bigger, so a 128-bit keyspace is 2^64 times harder to search than a 64-bit keyspace.

    • On the 'Security Now!' podcast I listen to, the expert guy mentioned that although in theory each bit added to a key would double the effectiveness of it, it only adds about 9% to the difficulty due to the algorithms used. So a 128-bit key is about 1.09^64 = 248 times harder to break than a 64-bit key.
  • by Phantombrain ( 964010 ) on Tuesday August 22, 2006 @11:00PM (#15960023) Journal
    Think of it as if you were trying to brute-force it.
    If you have 64 bits, that is 1.84467441 × 10^19 (2^64), meaning at most that many tries to break the first layer of encryption. The second layer is the same number, meaning breaking both would take a maximum of 3.68934881 × 10^19 attempts.
    With 128 bit encryption, there are 3.40282367 × 10^38 (2^128) different possibilities, MUCH more than the double 64 bit encryption.

    Obviously people don't usually brute-force encrypted files, but you can see there are many more possibilities with 128 bit encryption than with double 64 bit.
    • Re: (Score:3, Interesting)

      by tkw954 ( 709413 )

      If you have 64 bits, that is 1.84467441 × 10^19 (2^64), meaning maximum that many tries to break the first layer of encryption. The second layer is the same number, meaning to break it would mean a maximum of 3.68934881 × 10^19 attempts.

      This assumes that you can determine when you break the first layer of encryption, i.e. it won't work if the encrypted string is not distinguishable from noise. If this is not true, you must try each possible 64 bit second key for each 64 bit first key, for a maxi

      • by julesh ( 229690 )
        Yes, but note that the article poster was on the right track: it comes down to file headers. Almost all consumer-grade encryption software adds headers to its encrypted output so that it can tell if it has decrypted it successfully afterwards. If this is the case, it really does break down to 2^65 (or something not a huge amount larger) because you can usually tell when your first bruteforce has succeeded.
    • by jimicus ( 737525 )
      Obviously people don't usually bruteforce encrypted files

      Depends if you consider rubber hose cryptanalysis to be an application of brute force ;)
    • by ajs318 ( 655362 )
      That depends.

      Encryption and decryption are just mathematical functions. Let F(K,x) be a generalised function describing the encryption algorithm, where K is the key and x is the plaintext. We can collapse this to a simpler function K(x). Let K'(x) represent F'(K,x) i.e. the inverse of K(x).

      Now your double encryption is something like K2(K1(x)). But this is still just some function of x. It's possible {maybe not necessarily certain, depending upon the algorithm employed} that there exists K3 such
  • Bad math (Score:5, Funny)

    by Dan East ( 318230 ) on Tuesday August 22, 2006 @11:04PM (#15960043) Journal
    You're both wrong. Encrypting twice at 64 bit is like 64*64=4096 bit encryption. Personally, I like to encrypt 7 times at 2 bit encryption for 128 bit strength. If it's something I really want to keep secret, I encrypt four times at 128 bit for 268435456 bit encryption. However, don't make the mistake I made once of encrypting 128 times at 1 bit. Because that is just 1^128, which yields nothing more than 1 bit encryption.

    In case anyone is wondering, I'm hoping some DRM coder will gain valuable insight from this post.

    Dan East
    • Re:Bad math (Score:5, Insightful)

      by thefirelane ( 586885 ) on Tuesday August 22, 2006 @11:13PM (#15960072)
      In case anyone missed it though, there was a good insight in that humor: a clear way of expressing the answer to the posed question:

      If two 64 bit encryptions equal one 128 bit, then 128 one-bit encryptions should also.

      This means the file would have a password of either 1 or 0 ... you would be prompted for this password, and if you got it wrong, you could try again. Obviously you'd get it on the second try. You could repeat this 128 times, and that would be the total of all protection. Naturally, this means the maximum number of guesses you'd need to make is 256.

      After thinking about it that way, it becomes quite clear: 128 bit encryption obviously requires more than 128 guesses to get the right key. As another poster pointed out, two 64 bit passes equals 65 bit encryption.
      • by G-funk ( 22712 )
        Well, two 64bit runs is 128 bit encryption in that there are 2^128 different possible keys. But it's only as useful as 65 bit encryption because you only need to guess one half at a time. It's just so patently stupid that it really took some work for me to wrap my head around a "for-dummies" explanation... Which I was promptly beaten to of course :)
      • Re: (Score:2, Funny)

        by Miffe ( 592354 )
        Naturally, this means the most number of guesses you'd need to make is 256.
        Well, you only have to guess 255, since the last one won't really be a guess. =)
      • If two 64 bit encryptions equal one 128 bit, then 128 one-bit encryptions should also. This means the file would have a password of either 1 or 0 ... you would be prompted for this password, and if you got it wrong, you could try again. Obviously you'd get it on the second try. You could repeat this 128 times, and that would be the total of all protection. Naturally, this means the maximum number of guesses you'd need to make is 256.

        And how do you know that you guessed right at every step? No, really. Goo

    • Sure, just take "Dan East" and shorten to the more common "DEAST" and you get a simple 5 letter name!

      Well, we all know that 2^5 = 32.

      So, if we take your slashdot userid of 318230 and multiply by 32 we get 10,183,360 (you can stop checking the math after this).

      So, take 10,183,360 and re-write as 10 18 33 60 and then translate those four characters into "WEST" and it is perfectly clear that he is one of them!

      I don't have to spell it out! Mr. East is leading you the wrong way, because you are getting t

  • by cunkel ( 111089 ) on Tuesday August 22, 2006 @11:06PM (#15960048)
    Well, one common way to dispel a myth is to find an authoritative source on the subject that explains why the myth is untrue. Perhaps a book on cryptography would be helpful, for example:
    • Applied Cryptography: Protocols, Algorithms, and Source Code in C by Bruce Schneier
    • Handbook of Applied Cryptography by Alfred J. Menezes et al.
    Either one will explain such things as why double-DES is not twice as strong as DES and common pitfalls in thought and design of cryptographic systems.
    • Re: (Score:3, Insightful)

      by mgblst ( 80109 )
      Well, while reading these books may work for this particular problem, it is not always the case that books answer simple problems like this.

      For most questions, you can read as many books as you want, but if the question is simple but stupid, you often won't find the answer.

      In these cases, you often need to find someone who is very knowledgeable in the subject, or ask slashdot.

      Anybody who has studied an off-campus course will realise the value of being able to ask stupid questions, and talk to knowledgeable p
    • I'd recommend the newer Practical Cryptography (by Niels Ferguson and Bruce Schneier) over Applied Cryptography. The foreword is an interesting read and highlights some of the problems with the Applied Cryptography text (and focus of the earlier book). Applied Cryptography is probably better if you're going to write the encryption algorithms yourself.

      Practical Cryptography is written with a bit more hindsight because of watching all the sub-standard encryption systems that have appeared over the last dec
  • Snake oil FAQ (Score:5, Interesting)

    by Chuck Chunder ( 21021 ) on Tuesday August 22, 2006 @11:11PM (#15960058) Journal
    Is a good place to start [interhack.net] if you want to protect yourself against people hawking dodgy crypto.

    I used to work at a small company where a guy came in and was trying to get us involved with some dodgy "crypto" product. Management were quite taken with it (he could talk and even had a patent!) but technically it was a load of horse shit with some pretty multicolour diagrams. That FAQ helped me crystallise my objections and the guy was given the heave-ho before the company did anything embarrassing.
  • by metamatic ( 202216 ) on Tuesday August 22, 2006 @11:14PM (#15960075) Homepage Journal
    Consider the (naive and insecure) XOR encryption.

    Take your file, encrypt it with a 16 bit number.

    Take the result, and encrypt it again with a different 16 bit number.

    Now, how many possible keys do you have to go through to get back the original data?

    (Hint: Try single decryption using key3 = key1 XOR key2.)

    This counterexample invalidates the claim that encrypting twice with an N-bit key is equivalent to encrypting with a 2N-bit key. It also demonstrates the worst possible case, that encrypting twice may be no more secure than encrypting once.
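    A few lines of Python make the collapse explicit (the key and message values below are arbitrary):

    key1, key2 = 0x5A3C, 0x91F0              # two independent 16-bit keys
    plaintext = 0x4F2A

    ciphertext = (plaintext ^ key1) ^ key2   # "encrypt" twice with XOR

    key3 = key1 ^ key2                       # a single equivalent key
    assert ciphertext ^ key3 == plaintext    # one decryption undoes both layers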
    • by moyix ( 412254 )
      This actually raises a question to which I don't know the answer: if you take a fairly standard symmetric cypher, say DES, and two keys K1 and K2, does there always exist a key K3 such that E_K1(E_K2(Message)) == E_K3(Message) ? This is not actually an obvious thing to prove, and I have a feeling it may vary from cipher to cipher.

      Any crypto experts want to weigh in?
      • by pclminion ( 145572 ) on Tuesday August 22, 2006 @11:41PM (#15960164)

        This actually raises a question to which I don't know the answer: if you take a fairly standard symmetric cypher, say DES, and two keys K1 and K2, does there always exist a key K3 such that E_K1(E_K2(Message)) == E_K3(Message) ?

        No, this is not always the case. For DES it is certainly NOT the case. The terminology you are looking for is "idempotent cipher." A cipher is idempotent if it obeys the equation you stated -- i.e., for any key pair K1,K2, there is a K3 such that E_K2(E_K1(x))=E_K3(x). DES is NOT an idempotent cipher.

        These questions only seem deep to non-crypto-experts (no offense to you intended) -- and it was certainly accounted for in the design of DES (along with features which harden it against differential cryptanalysis, a technique that wasn't even publicly known outside the NSA at the time -- the people who work on these things are far from stupid).

  • There's a reason why triple encryption is used rather than double encryption. There's a technique known as the Meet-in-the-middle attack [wikipedia.org] which, by trading off storage space for time, lets you break a double encryption with two independent keys using only twice the time needed for single encryption, meaning that using two independent 64-bit keys would only require 2^65 work, rather than 2^128 as one might expect. That's the reason why you never hear of "double DES" but rather Triple DES with independent key

  • Ask him if encrypting something with a 1 bit key 128 times is equally secure.
  • by Bender0x7D1 ( 536254 ) on Tuesday August 22, 2006 @11:26PM (#15960113)
    In this case, you can simply refer to the meet-in-the-middle attack as mentioned by dtfinch.

    However, in the general case of debunking bogus encryption statements, it can be quite difficult. Why? Because encryption is hard, hard, hard, hard stuff. I've taken a graduate course in cryptography and all it did was reinforce how easy it is to make a mistake.

    The only encryption scheme I know of that is provably unbreakable is the one-time pad. However, proper implementation of that is a pain. Is AES secure? We don't know. We think it is, but there are researchers looking for weaknesses because we don't know for sure. It is possible, (however unlikely), that someone will develop an attack tomorrow sending everyone scrambling for a new encryption algorithm.
    • It is possible, (however unlikely), that someone will develop an attack tomorrow sending everyone scrambling for a new encryption algorithm.

      That's why we have other crypto algorithms ready. I use Blowfish for my VPN connections.

  • Is to show how two rounds of the Caesar Shift [wikipedia.org] is exactly equivalent to one round of the same, with a different key, and no harder to crack. In fact, if your key is the same length as your alphabet, it would be the plaintext again.

  • Group theory (Score:5, Informative)

    by coyote-san ( 38515 ) on Tuesday August 22, 2006 @11:39PM (#15960159)
    Let me see if I can explain this in a few paragraphs....

    A single encryption key defines a one-to-one mapping between each plaintext block and ciphertext block. (In ECB (electronic codebook) mode, but the chained block encoding follows a similar analysis.) It has to be bidirectional so that the decryption is unique.

    This means that, AT A MAXIMUM, your meaningful keyspace can be no larger than the 2^(number of bits in block). In practice it takes a lot of hard work to ensure that you have 2^(number of bits in key) distinct and non-trivial mappings, and ideally for any arbitrary plaintext and ciphertext block you can guarantee that there is a unique key that will produce that result. However it's easy to screw up and only have, e.g., 2^40 actual encryptions even though you have a keysize of 64 or even 128 bits.

    Double encryption also defines a one-to-one mapping between each plaintext block and ciphertext block... but you have the same blocksize. If the keys form a "group", then any double encryption will be equivalent to a single encryption with a different key. In fact any N-encryptions will be equivalent to encryption with a single different key. Some combinations will leave you with no encryption. (XOR encryption is a trivial example of this -- you'll get no encryption whenever you have an even number of encryptions.)

    Worse, your keys may not form a group because there's some property that gets cancelled out in a double encryption. E.g., maybe something drops out each time both keys have a '1' in the same spot. In this case double encryption would be significantly weaker than single encryption.

    Triple-DES encryption is an oddball since it isn't (or wasn't) known whether DES keys form a group. Remember that a DES key is only 56 bits, but a DES cipherblock is 64 bits -- you know you can't get every possible mapping between plaintext and ciphertext by running through the keyspace.

    Hence 3DES. However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes.
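    As a structural illustration of the EDE arrangement, here is a sketch in Python using a made-up toy cipher (not DES; the constants and keys are invented). It just shows the point raised in the replies below: with both keys set equal, EDE collapses to a single encryption.

    MASK = 0xFFFF
    MUL, MUL_INV = 0x6B2D, pow(0x6B2D, -1, 1 << 16)

    def enc(k, x):      # toy stand-in for a block cipher's encrypt
        return ((x ^ k) * MUL) & MASK

    def dec(k, x):      # its inverse
        return ((x * MUL_INV) & MASK) ^ k

    def ede(k1, k2, x):
        # two-key encrypt-decrypt-encrypt, the pattern described above
        return enc(k1, dec(k2, enc(k1, x)))

    x, k = 0x1234, 0x3A7C
    assert ede(k, k, x) == enc(k, x)   # equal keys make EDE fall back to single encryption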
    • Re: (Score:3, Interesting)

      >it isn't (or wasn't) known whether DES keys form a group.

      Wasn't, although the result came very late in the history of DES:

      K.W. Campbell and M.J. Wiener, DES is not a group, Advances in Cryptology - Crypto '92, Springer-Verlag (1993), 512-520.

      Beep! Retrieval complete.
    • However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes.

      The reason I learned is much more prosaic: With the EDE pattern, you can implement single DES simply by using the same key for both halve

    • A block cipher uses a key to map plaintext blocks to ciphertext blocks. Given 2^N plaintext blocks and 2^N ciphertext blocks, the number of possible encryptions is (2^N)!, not 2^N. This is a staggering difference in magnitude.

      For example, the maximum meaningful key size of a 128-bit block cipher is about 4 × 10^40 bits.
  • by inio ( 26835 ) on Tuesday August 22, 2006 @11:42PM (#15960166) Homepage
    ... as long as the keys for each pass are different and there's no header.

    A good encryption scheme doesn't have any header, for exactly this reason. It reads in n words and writes out n words and that's that. So if your coworker is doing:

    C = E_K2(E_K1(P))

    there are indeed 128 bits of information required to decrypt the document. Given a plaintext+cyphertext pair a meet-in-the-middle attack might be possible, but the search space is still on the order of 2^128 (cryptanalysis could probably reduce this some). Given plaintext+intermediate+cyphertext, it's down to 2^64, because each half could be cracked separately.

    With a good encryption algorithm the first pass ( E_K1(P) ) generates something approaching line noise - close enough so that there's no statistical way to tell real random bits from encrypted data (many RNGs now use encryption routines to boost incoming entropy). There is no structure, there is no header, and there is no way to verify that it's an encrypted anything. It might as well be a bunch of random bits. Given the cyphertext, your total search space to find the plaintext is still 2^64 * 2^64 = 2^128 sets of keys because there's no way to tell that the first round was performed correctly - it will always just be random bits.

    Now, if there IS a header added you're back to n*2^64. For example, the software compresses (and thus adds a header to) the data before encrypting it. Given the cyphertext you can search for keys that decrypt to have a valid header, and then search for keys that produce reasonable english text from those outputs.
    • by Coryoth ( 254751 )

      Given a plaintext+cyphertext pair a meet-in-the-middle attack might be possible, but the search space is still on the order of 2^128 (cryptanalysis could probably reduce this some).

      Mounting a known-plaintext attack (plaintext + ciphertext pair) using meet-in-the-middle on something encrypted twice with two different 64 bit keys requires only 2^65 trials. It's a time memory trade off, so it is a little memory expensive, but it's well short of the 2^128 trials you claim.

      What you need is two ciphertexts

  • Look at the HAC (Score:4, Informative)

    by petard ( 117521 ) * on Wednesday August 23, 2006 @12:22AM (#15960311) Homepage
    The Handbook of Applied Cryptography is an excellent book and is now available online for free [uwaterloo.ca]. If you passed a couple college calculus courses, the material should be very accessible to you. In general, it's an excellent source of facts and analysis to debunk incorrect statements about cryptography.

    In particular, the part that answers your question is found in Chapter 7 (block ciphers) [uwaterloo.ca]. The justification for fact 7.33 explains why, if you can store 2^64 blocks of ciphertext, you can break double 64-bit encryption in 2^64 operations.

    If your coworker is not capable of understanding the math behind this reasoning, he really should not be making decisions about encryption :-).

    (Note: I'm not withholding the justification in this post simply to be obtuse. It's too much of a pain to format it reasonably in this input box... you'll just have to refer to the PDF.)
  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Wednesday August 23, 2006 @12:33AM (#15960337) Journal
    This is how I usually end this kind of conversation -- I go over their head. If I know more than they do about the subject, I speak jargon until their eyes glaze over, and then say something to the effect of "Do you really want to know why you're wrong, or will you take my word for it?" I'm a good enough teacher to back that up if they really do want to know. Or sometimes I start from the other end -- if I have a good enough analogy ready, I can start with that -- the money one is a good analogy for the crypto (Would you rather give me two $10 bills or one $100 bill? Would you let me settle a $100 debt with you by giving you two $10 bills?)

    The other method of going over their head is to cite someone -- not always even a specific person -- who is smarter than both of us. So for crypto algorithms, I can ask them, "If it was really that easy, don't you think someone else would've thought of it by now? Why do you think we invent wholly new algorithms for 128 bits?"

    And if both of those fail, the final method of going over their head is to cite specific sources and papers, and invite them to actually disprove what's said there (without really having to understand it yourself) if they really think they're right.

    All of these, of course, are when I know I'm right. Quite often, I really don't know, and I stay humble there. But when I know I'm right, it's usually much more important to hit them over the head with one of the above than to try to explain why I'm right. After all, you don't have to know why a car is safer than open air for a lightning storm, you just have to do it. It's not important that someone understand why not to just double-encrypt with 64-bits, it's just important that they not do it.
  • The basic problem is that if you do E_KB(E_KA(plain_text)), there may be a key C with

    E_KC (plaintext) = E_KB(E_KA(plain_text))

    If you just encrypt 64 bits with a one-time pad (trivial example), then this is obvious:

    C = A XOR B

    I.e. with a one-time pad you gain exactly no security at all.

    For many ciphers, a key C may not exist, but even if it does not exist, an attack on E_KB(E_KA(plain_text)) may be just as easy as on a single key. What is worse is that for some ciphers (e.g. DES) it is not kno
  • Encrypting something twice is effectively creating a new cypher system. Always remember Friedman's remark, "No new cypher is worth looking at unless it comes from someone who has already broken a very difficult one."

    (If you don't know who William Friedman was, you probably shouldn't be designing cyphers.)

  • No, it's not "equivalent", but he is closer to the truth than you are and something like this is widely used in practice. Look for "Triple DES" on Wikipedia for a longer description of the consequences of encrypting multiple times with the same algorithm.
  • There are a bunch of technical explanations here. Let me make a simple analogy.

    If you have a lock on your door, which has only 100 different keys, a thief could take 100 keys along, and break in. Suppose you find this "100 keys" too easy, and install a second (100-key) lock.

    There are now 10000 combinations of key1 and key2.

    Do you expect the next thief to take along 10000 keyrings, each keyring having two keys, one for the first lock and one for the second, or does he come in with 200 keys?

    In the mechanical
  • "Recently, a coworker tried to assert that encrypting a file twice with a 64 bit algorithm is equivalent to encrypting it once with a 128 bit algorithm. I know enough about encryption to know that isn't true, but I am having difficulties explaining why and how. Doesn't each pass of the encryption create a separate file header which makes this assertion untrue? Can anyone point me to references that would better help me explain this?"

    First of all, what is a '64-bit encryption algorithm'? Is this a symmetr

  • by bLanark ( 123342 ) *
    Why dick around with your own encryption: use AES. Then you can tell your customers that you are using the standard suggested by the US government. There are free implementations available in many languages. You will be doing the best thing for your team and your company.

    If you are worried about CPU, memory, etc, remind yourself that the criteria for selecting the AES algorithm included CPU usage, memory usage, and the ability to be implemented on silicon.
  • Briefcases (Score:4, Insightful)

    by ajs318 ( 655362 ) <sd_resp2@@@earthshod...co...uk> on Wednesday August 23, 2006 @08:00AM (#15961423)
    You can buy a briefcase with two separate, three-digit combination locks in any stationery store, and it won't cost you a lot of money. Invest in one as a "prop" for this demonstration. Have your co-worker set a combination for the left-hand lock and a different combination for the right-hand lock, lock the case and spin the wheels randomly. Now, there are 1000 possible combinations for the LH lock and 1000 possible combinations for the RH lock, giving you 1000000 possible combinations. If you can try one combination a second, it could take you up to a million seconds {nearly twelve days} to get into the case. Right? {If your co-worker actually believes this, perhaps it would be simplest just to have them put to S-L-E-E-P}.

    Once you have cracked the left-hand lock {which will take at most 1000 tries - 17 minutes} you are free to concentrate on the RH lock {which also will take at most 1000 tries}. You don't have to try any more combinations on the LH lock. So you can open the briefcase in a maximum of 2000 tries, or just over half an hour. The key fact that makes this possible is that the two locks are independent: you can undo one while the other remains fastened. The combined keyspace is then the sum, not the product, of the two keyspaces.

    If the encryption operation adds a predictable header to the ciphertext {for example, BEGIN OPHIOL ENCRYPTED MESSAGE} {and most of them do} then you have something to look for that will let you know when you have cracked the first layer of encryption -- in effect, popped open the left-hand lock.
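    A short Python sketch of that last point, using an invented toy cipher and an invented header string (none of this is a real tool or file format). The predictable header lets the attacker recognise the instant the outer layer pops open, so the two layers fall one after the other, just like the two locks on the briefcase:

    HEADER = b"BEGIN ENCRYPTED "          # made-up predictable marker a tool might prepend

    def toy_xor(key, data):
        # toy "cipher": repeating-key XOR with a 16-bit key (weak, but enough for the illustration)
        ks = bytes([key >> 8, key & 0xFF])
        return bytes(b ^ ks[i % 2] for i, b in enumerate(data))

    inner_key, outer_key = 0x3A7C, 0xC150
    secret = b"attack at dawn"

    inner_file = HEADER + toy_xor(inner_key, secret)   # first pass writes its header
    outer_file = toy_xor(outer_key, inner_file)        # second pass encrypts header and all

    # cracking the outer layer needs nothing more than a header check per key guess
    for guess in range(1 << 16):
        if toy_xor(guess, outer_file).startswith(HEADER):
            print("outer key found:", hex(guess))
            break
    # the attacker can now strip the header and attack the inner layer separately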
