## Debunking a Bogus Encryption Statement?

Posted by Cliff

from the this-and-other-gems dept.

deviantphil asks:

*"Recently, a coworker tried to assert that encrypting a file twice with a 64 bit algorithm is equivalent to encrypting it once with a 128 bit algorithm. I know enough about encryption to know that isn't true, but I am having difficulties explaining why and how. Doesn't each pass of the encryption create a separate file header which makes this assertion untrue? Can anyone point me to references that would better help me explain this?"*

What other laughable claims have you heard attributed to encryption, and how were you able to properly lay them to rest?
## Meet in the middle attack (Score:5, Informative)

That's why we have triple-DES instead of double-DES.

## Re: (Score:3, Informative)

The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit. I say ideal, because typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it, and do so using the same encryption hardware. But while triple-DES may be closer to the

## Re:Meet in the middle attack (Score:5, Insightful)

*The block size is still 64-bit, which means 2^64 possible combinations versus the 2^128 for ideal 128-bit.*

When talking about block ciphers, block size and key size are separate things. Generally, if you say algorithm X is an n-bit cipher, you're talking about key size, not block size. Given use of good block chaining modes, ciphers with larger block sizes don't really offer significant additional security.

*typically attackers find ways of reducing the effective number of possible combinations for an algorithm. Original DES has been reduced significantly, so triple-DES was designed to improve it*

Original DES hasn't really been reduced significantly. There are attacks which reduce the computational complexity to, IIRC, ~41 bits, but at the expense of requiring impractical space. All in all, DES has stood up extremely well and its practical strength hasn't been significantly reduced by 30+ years of scrutiny.

The problem 3DES was created to address was a deliberate limitation of the design: the small keyspace. The strongest variant of Lucifer (the original IBM cipher which became DES) used a 128-bit key, but the NSA deliberately had it reduced to a 56-bit key, presumably because they thought they could break the cipher with the smaller key size, but it was still secure enough against others. Advances in computation technology, of course, have put a brute force search of a 56-bit keyspace within reach of run-of-the-mill desktop computers. 3DES addresses this by increasing the effective keyspace to 112 bits.

## DES design issues (Score:5, Interesting)

## Re: (Score:3, Interesting)

*The general opinion about why NSA pushed DES to be 56 bits instead of 128 bits is that "differential cryptography" attacks weaken it to about 55 bits anyway*

Interesting. That's an idea I haven't read anywhere. Do you have a reference? Your explanation is at odds with my understanding of the timeline. As I understand it, Lucifer was vulnerable to differential cryptanalysis, but IBM independently discovered the technique during the design process of the final version, which used a smaller key size per N

## Re:DES design issues (Score:5, Informative)

A Celeron 566MHz has a memory bandwidth around 150-175MB/s. An Athlon64 3200+ has a memory bandwidth around 1600MB/s. (Approximate numbers, with all sorts of undocumented assumptions.) So it's a pretty good bet that modern single-core CPUs are at least 10x as fast as the ones from 8-10 years ago.

Of course, now we have things like dual/quad CPU machines with 2-4 cores per CPU... so we could up that number to 150x as fast as a machine from 8-10 years ago. So instead of 3000 machines, we might only need 20-30.

## Re:Meet in the middle attack (Score:5, Informative)

It's even worse if your cipher's permutations form a group. In that case, for any two keys k1 and k2, there exists a third key k3 such that ENC_k2(ENC_k1(x)) = ENC_k3(x). In other words, you can find a third key that produces the same permutation as the composition of keys 1 and 2. This means that breaking a double encryption (or triple, quadruple, etc.) is no harder than breaking a single encryption: the resulting permutation can always be described by a single key.

That's why 3DES uses EDE instead of EEE. While DES doesn't form a group, it does have some group-like structures which reduce the workload quite a bit. This doesn't apply to all ciphers btw; there are many more possible 64-bit permutations than 64-bit keys, so compositions can fall well outside those covered by keyspace.
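The group property above is easy to see at toy scale. Everything in this sketch is invented for illustration: a single-byte additive (ROT-style) cipher, which is not real crypto but whose keys do form a group under composition, so two keys always collapse into one.

```python
# Toy illustration: a single-byte additive cipher (ROT-N over bytes).
# Its keys form a group, so encrypting with k1 then k2 is identical to
# encrypting once with k3 = (k1 + k2) mod 256. NOT a real cipher.

def enc(key, data):
    return bytes((b + key) % 256 for b in data)

msg = b"attack at dawn"
k1, k2 = 9, 12
double = enc(k2, enc(k1, msg))        # "double encryption"
k3 = (k1 + k2) % 256                  # the single equivalent key
assert double == enc(k3, msg)         # one key reproduces both layers
print("equivalent single key:", k3)   # -> 21
```

An attacker never has to find k1 and k2 separately; searching for k3 alone suffices, which is exactly why a group structure destroys any benefit from multiple encryption.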

I think this is covered in chapter 7 of the Handbook of Applied Cryptography [uwaterloo.ca].

## Also, 64-bit is Extra Wimpy (Score:2)

But 64 bits? There are a few algorithms with variable key lengths that work at that size (RC5 is pretty strong), and then there's DES where you're counting the 8 parity bits as part of your key length, but most of the 64-bit algorithms I've seen were things people hacked together

## Re: (Score:3, Informative)

Known plaintext attacks are a common method for breaking crypto. For session keys for stream encryption, yes, a key is only used once. For other things like PK crypto, though, the key persists.

As for why this is interesting: you know *a piece of plaintext* and *a piece of encrypted text*. That doesn't mean that you necessarily know the plaintext you're trying to decrypt (though you might know some of it).

Case 1: You know a portion of the plaintext. For example, you might know that the beginning of an em

## I think you've got the math wrong, actually (Score:3, Interesting)

For each possible 64-bit key K, you need to generate and store Ek(P) in a table. It's the size of that table I was trying to calculate. So that's 8 bytes * 2^64 keys, which is 147,573,952,589,676,412,928. That sure looks like 147 Billio

## Your keyspace wouldn't be that much bigger (Score:5, Insightful)

Plus, you have to realize that the amount of time needed to brute force a single key doubles with each additional bit of key size; it grows exponentially. If you just double the number of times you encrypt, you're pretty much just increasing the brute-force effort linearly.

## Re:Your keyspace wouldn't be that much bigger (Score:5, Funny)

## Re: (Score:3, Informative)


## Re: (Score:2)

## Re:Your keyspace wouldn't be that much bigger (Score:5, Funny)

## Re: (Score:2)

## Re: (Score:3, Funny)

Absolutely! That's why I double-rot13 all my emails before sending them! :-)

## Re: (Score:2)

Here's one with a HUGE key space - and much more difficult to break.

"cat /dev/urandom > file_you_want_encrypted.doc"

And for the best effect, you can let this run all night. Ignore the "disk full" message. It's a feature, not a bug.

Pretty much guaranteed nobody else will be able to read your original contents when it's finished.

## Re: (Score:2)

And if it was intended as such, it was a very sloppy attempt at humor.

## Re: (Score:2)

## Re: (Score:3, Insightful)

## Re:Your keyspace wouldn't be that much bigger (Score:4, Informative)

Wouldn't you have to do the "inner" 64-bit brute-force procedure for each possible key of the "outer" 64-bit keyspace, thus making it 128-bit again? I'm feeling the effective key strength is somewhere between 64-bit and 128-bit, certainly more than 65-bit, but not really 128-bit.

IANAC(ryptologist).

## Re:Your keyspace wouldn't be that much bigger (Score:5, Insightful)

If you take a plaintext file and encrypt it into a file which has headers ("BEGIN ENCRYPTED CONTENT---"), and then encrypt the result again, assuming the attacker knows how you did it and that the intermediate file has plaintext headers, then they'll know the moment they broke the first 64-bit encryption layer. So in this example, you're basically at 65 bits.

Now if you don't include any headers, so that there's no terribly good way to determine whether you've gotten the right key or not, as you're brute-forcing the first layer, then I think you're right -- the strength of the overall system is somewhere in a grey area between 65 and 128 bits.

If someone was just thinking that they could use a file-encryption utility twice (which produces output files that have plaintext headers) and double the keyspace, they are dead wrong.
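The "peel one layer at a time" point can be sketched at toy scale. Everything below is invented for illustration: 8-bit keys, a single-byte additive cipher, and a made-up header string on the intermediate file. The point survives the scaling: a known header lets the attacker confirm the outer key without ever touching the inner one.

```python
# Toy sketch: a plaintext header on the intermediate file lets an attacker
# brute-force the OUTER key alone (256 tries here), so the two key spaces
# add instead of multiply. The cipher is NOT real crypto.

HEADER = b"-----BEGIN ENCRYPTED CONTENT-----"

def enc(key, data):
    return bytes((b + key) % 256 for b in data)

def dec(key, data):
    return bytes((b - key) % 256 for b in data)

inner = HEADER + enc(17, b"secret payload")   # layer 1 output carries a header
outer = enc(203, inner)                       # layer 2 wraps it

# Attacker: try every outer key, looking for the telltale header.
found = [k for k in range(256) if dec(k, outer).startswith(HEADER)]
print("outer key candidates:", found)         # -> [203]
```

Once the outer key is confirmed, the attacker is left with a plain single-layer problem, which is the "64 + 64 = 65 bits" observation above.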

## Re: (Score:2)

## Re: (Score:2)

Rather I was just saying that if you have file headers on the intermediate file, then it becomes quite easy to figure out when you've brute-forced the outer layer of encryption. Without these headers, it's much harder to tell when you've gotten the correct key

## Re:Your keyspace wouldn't be that much bigger (Score:5, Informative)

*Wouldn't you have to do the "inner" 64bit bruteforce procedure for each possible key of the "outer" 64bit keyspace, thus making it 128bit again?*

No, you do a meet-in-the-middle attack, which is basically 2^64 in complexity if you're using two 64-bit keys.

There are some optimizations that can be done, but the basic idea is this: You start with one ciphertext block and its corresponding known plaintext. Then you encrypt the plaintext with all 2^64 possible keys and store the results (with some indexes to make it easy to search for a particular block). Then you decrypt the ciphertext with all 2^64 possible keys, looking each result up. When you find a match, you have found the pair of 64-bit keys that turns that plaintext into that ciphertext. So to search the entire space, you have to do 2^64+2^64 = 2^65 trial operations. On average, you only have to search half of the second keyspace, so the complexity of the search is 2^64+2^63 ≈ 2^64 (plus the huge storage cost).
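The search just described can be run at toy scale. The 16-bit block cipher and 8-bit keys below are invented for this sketch (they are not real crypto); the shape of the attack is the point: roughly 2 × 2^8 cipher operations instead of the 2^16 key pairs a naive search would try.

```python
# Meet-in-the-middle sketch on a made-up, invertible 16-bit toy cipher
# with 8-bit keys. Step 1 tabulates E_k(p) for all keys; step 2 looks up
# D_k(c) for all keys. Matches reveal candidate (inner, outer) key pairs.
from collections import defaultdict

def rol(x, n):                            # 16-bit rotate left
    return ((x << n) | (x >> (16 - n))) & 0xFFFF

def enc(key, block):                      # invertible toy cipher, NOT real crypto
    for _ in range(3):
        block = (block + key) & 0xFFFF    # add key
        block = rol(block, 5)             # diffuse
        block ^= key * 0x0101             # xor key into both bytes
    return block

def dec(key, block):                      # exact inverse of enc
    for _ in range(3):
        block ^= key * 0x0101
        block = rol(block, 11)            # left 11 == right 5
        block = (block - key) & 0xFFFF
    return block

p, (k1, k2) = 0x1234, (0x3A, 0xC5)        # known plaintext, secret key pair
c = enc(k2, enc(k1, p))                   # the "double encryption"

middle = defaultdict(list)                # forward half-table: E_k(p) -> keys
for k in range(256):
    middle[enc(k, p)].append(k)

pairs = [(ka, kb)                         # backward half: D_k(c), meet in middle
         for kb in range(256)
         for ka in middle[dec(kb, c)]]
print("true pair found:", (0x3A, 0xC5) in pairs)   # -> True
```

With one known block there can be a few false-positive pairs; a second plaintext/ciphertext block filters them out, which is why the complexity stays near 2^(n+1) rather than 2^(2n).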

Triple encryption is also weakened by a meet-in-the-middle attack. Using three 64-bit keys, it would be nice to think you have a 192-bit key. But a meet-in-the-middle requires encrypting with all 2^64 single keys for the first step, and decrypting with all 2^128 key pairs for the second step, giving an effective strength of 2^64+2^127 ≈ 2^127 (plus the huge storage cost).

Finally, keep in mind that DES keys aren't 64 bits. They're 56 bits. So 3DES has a brute force search computational complexity of 111 bits, on average.

## Re: (Score:2)

*I think there's more to this problem than you realize at first. Bruteforce relies on the ability to identify when you have successfully broken the cipher text.*

We're probably talking consumer encryption software here. Many such programs (most?) include a hash of the encrypted data so that they can verify that decryption was successful, or at least some other mechanism that allows them to prompt for an alternative key if the one specified didn't work. Even if there are multiple possible keys that won't prod

## Re: (Score:2)

## Intuition doesn't work well in crypto (Score:4, Insightful)

Seems like *encrypting twice* is likely to be doomed to bogosity, unless there's a later clause in it like *but that's not what really happens*. Crypto not only depends on a lot of deep math, it also depends on a lot of cryptographers spending time following bits around to see where they go and how to break them.

Sometimes things do what your mathematical intuition tells you, if you're mature enough in the field to have a solid intuition, but often they don't. Problems can be very hard if looked at from one direction and very easy (or at least less hard) when looked at from another direction, and a cryptographer's job is to make sure they've checked out _all_ the directions, because it only takes one weak one to break something. NP-complete problems are especially that way: they're potentially useful for crypto because there's one easy direction if you know the key, but many problems can't be transformed in a way that lets you use the easy path while the attacker's stuck on the hard paths.

But even the bit-twiddly algorithms, like most hash functions or the S-box building blocks inside Feistel-structured algorithms, can often be cracked by people examining them closely enough to find conditions under which they can deduce bits. For instance, MD5 is pretty much busted these days.

And both mathematical crypto and bit-twiddly crypto have to worry about Moore's Law and brute force. Some algorithms scale well, so you can double the strength of an N-bit key by adding 1 bit or maybe log N bits, while others don't, or they form groups so that encrypting a message multiple times with an N-bit key still only gives you N bits of strength (leaving aside pathological cases like rot13).

## No it's not -- if there's no known plain text (Score:2)

In effect, if there's no fixed header and no way to quickly distinguish a random putative cleartext from rubbish added in between the two encryption phases, then a brute force attack would take 2^64*2^64 steps, that is to say 2^128

## Re: (Score:2)

If you encrypt twice with 8 bit keys, then you have to decrypt twice with 8 bit keys, then for each encryption, you have 256 possible keys... This means that across the two decryptions, you have to make a total of 512 guesses.

If, on the other hand, you have a single 16 bit key, then you have 64K possible guesses.

Now, one of the things that this presumes is that, when you find the

## Re: (Score:2)

Ideally, yes, but it depends on the cipher, and the implementation.

The cipher may form a group, where a single 64-bit key will decode the double-encrypted ciphertext to the original plaintext, so instead of having one key twice, you really have two keys that will decrypt.

The implementation may le

## Easy (Score:3, Insightful)

## Easy - but wrong... (Score:4, Interesting)

## While you're at it... (Score:4, Funny)

## Re:While you're at it... (Score:4, Funny)

## Re: (Score:2, Funny)

Chuck Norris can roundhouse kick that compressed data back into its 'raw' format.

## Re:While you're at it... (Score:5, Funny)

## Re: (Score:2)

Here I got some interesting downloads for you:

My whole collection of the latest movies: '1'

All microsoft programs: '0'

A working beta of Duke4Ever: '0'

ISO's of the 20 best games of 2005 and 2006: '1'

Every MP3 you could find on the net: '1'

pron: 12.250.000 images, 512.054 movies: '0'

I'll upload the decompress util as soon as I get that last bug out of it. But you could finish the downloads in the meantime.

## It's like this.. (Score:2)

Of course this is just an analogy from a non-expert... Use at your own risk.

## Re: (Score:2)

*At some point so many strips of duct tape are going to hold down the box lid due to sheer mass.*

True, but in the end you could just pry the whole mess off with a crowbar or something. It's only as strong as the glue on the bottom piece.

## Debunking this claim (Score:5, Informative)

If you encrypt twice with a 64-bit key you will just be taking a plain text file and encrypting it and then taking your encrypted file and encrypting it once again. To break encryption on this file, getting access to the plain text, you would need to break 64-bit encryption twice.

While this would be more of a challenge than breaking 64-bit standalone encryption, it wouldn't be equivalent to the security of a 128-bit key. The difficulty of breaking 128-bit encryption isn't 2 times that of 64-bit encryption. It is actually greater. I can't quantify it without additional research; hopefully someone who can recall it off the top of their head can chime in with the details.

Now, breaking a doubly encrypted file would present some interesting challenges. The most obvious is that your plaintext for the first layer of encryption is anything but plaintext. Confirming that one set of encrypted text is right one vs. some other set of encrypted text would definitely be a challenge. It would make it difficult to determine if you had found the right key to decrypt your first layer of encryption.

As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!

Yours,

Jordan

## Re: (Score:2)

*As an aside, I do remember reading about code systems where double encryption actually made the resulting encryption less secure. I don't remember the details, but now my brain is itching and I will have to do some research. Thanks!*

Rot-13 comes to mind...

## Re:Debunking this claim (Score:5, Insightful)

It isn't very common for the resulting encryption to be *less* secure, but "no appreciable improvement" is certainly very possible.

With block ciphers you can run into the issue that if the cipher scheme forms a group (as some do), then even if you use 2 different keys for each encryption round, there will be a single third key that provides identical encryption: an attacker never needs to break your two keys, they can simply find the single third key that is equivalent to your double encipherment. Despite encrypting twice with two different keys, the result is no more secure than encrypting once. For a simple example of this, consider a ROT encryption scheme: if you encrypt a message with ROT9 then ROT12, that is just equivalent to ROT21, so instead of trying to find the two keys 9 and 12, the attacker can just solve the ROT21 problem. (The worst case, of course, is if your combination of keys gives the identity element, such as ROT9 and ROT17, or ROT13 twice, in which case you end up with an unencrypted document.)

Even if that's not the case you can still get caught with a meet-in-the-middle approach if the attacker has some known plaintext (that is, they have some plaintext with the corresponding encrypted text and are attempting to extract your key) which does pretty much what it says, encrypting one way and decrypting the other to meet in the middle. Using such a scheme you can extract the keys with only marginally more effort than for single encryption.

## Re: (Score:2, Informative)

In contrast, a 64-bit encryption scheme leaves you with 18,446,744,073,709,551,616 keys. Done twice, that means 36,893,488,147,419,103,232, which is still far less than 128-bit encryption. I may be doing my calculations wrong (big numbers are scary...) but I think the difference between brute-force cracking a 128-bit key and two 64-bit keys is 340,282,3

## Re: (Score:2)

*The difficulty of breaking 128-bit encryption isn't 2 times 64-bit encryption. It is actually greater - I can't quantify it without additional research, hopefully someone who can recall it off the top of their head can chime in with the details.*

The difficulty of a brute force search is directly proportional to the number of keys to be searched. So 2^128 is how many times bigger than 2^64? It's 2^64 times bigger, so a 128-bit keyspace is 2^64 times harder to search than a 64-bit keyspace.
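The ratio is easy to check directly with arbitrary-precision integers:

```python
# A 128-bit keyspace is 2^64 times larger than a 64-bit keyspace,
# not 2 times larger.
keys_64 = 2**64
keys_128 = 2**128
print(keys_128 // keys_64 == keys_64)   # -> True: the ratio is itself 2^64
print(keys_128 // keys_64)              # -> 18446744073709551616
```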

## Re: (Score:2)

## Re: (Score:2)

## Re: (Score:2)

I agree that XOR and ROT13 are encryption systems which end up decrypting the cipher text when re-encrypted. But I will be honest that I wasn't thinking of them, but rather of "real" encryption systems (i.e. more robust), when I thought about systems where double encryption could lead to the cipher text being less secure than it was with one level of encryption.

I think the mistake which spawned this discussion is that people tend to think of encryption as "lock boxes" for information. In the r

## Re: (Score:3, Interesting)

Example: K1(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto DEFGHIJKLMNOPQRSTUVWXYZABC.

K2(x) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto LMNOPQRSTUVWXYZABCDEFGHIJK.

K2(K1(x)) maps ABCDEFGHIJKLMNOPQRSTUVWXYZ onto OPQRSTUVWXYZABCDEFGHIJKLMN

## Re: (Score:2)

*Answering in a forum is something to do only when you know the answer.*

Replying to an answer in a forum is something to do only when you know the answer and the other guy doesn't. I actually do know the answer, but didn't feel the need to spoonfeed the D&H article to educate you and Jordan. You could go look it up, and find out what a Meet-in-the-Middle attack is, maybe look at subsequent research, understand how the math actually works, and understand why Jordan doesn't know what he's talking about

## Re: (Score:2)

## Think about it as number of possibilities (Score:5, Informative)

If you have 64 bits, that is 1.84467441 × 10^19 (2^64) possible keys, meaning at most that many tries to break the first layer of encryption. The second layer is the same number, meaning breaking both would take a maximum of 3.68934881 × 10^19 attempts.

With 128 bit encryption, there are 3.40282367 × 10^38 (2^128) different possibilities, MUCH more than the double 64 bit encryption.

Obviously people don't usually bruteforce encrypted files, but you can see there are many more possibilities for 128 bit encryption than with double 64 bit.

## Re: (Score:3, Interesting)

This assumes that you can determine when you break the first layer of encryption, i.e. it won't work if the encrypted string is not distinguishable from noise. If this is not true, you must try each possible 64 bit second key for each 64 bit first key, for a maximum

## Re: (Score:2)

## Re: (Score:2)

*Obviously people don't usually bruteforce encrypted files*

Depends if you consider rubber hose cryptanalysis to be an application of brute force

## Re: (Score:2)

Encryption and decryption are just mathematical functions. Let F(K,x) be a generalised function describing the encryption algorithm, where K is the key and x is the plaintext. We can collapse this to a simpler function K(x). Let K'(x) represent F'(K,x) i.e. the inverse of K(x).

Now your double encryption is something like K2(K1(x)). But this is still just some function of x. It's possible {maybe not necessarily certain, depending upon the algorithm employed} that there exists K3 such

## Re: (Score:2)

## Bad math (Score:5, Funny)

In case anyone is wondering, I'm hoping some DRM coder will gain valuable insight from this post.

Dan East

## Re:Bad math (Score:5, Insightful)

If two 64 bit encryptions equals one 128 bit, then 128 one-bit encryptions should also.

This means the file would have a password of either 1 or 0

After thinking about it that way, it becomes quite clear: 128 bit encryption obviously requires more than 128 guesses to get the right key. As another poster pointed out, two 64 bit passes equals 65 bit encryption.

## Re: (Score:2)

## Re: (Score:2, Funny)

## Now, that's a really BAD MATH (Score:3, Informative)

And how do you know, that you guessed right at every step? No, really. Goo

## Yeah, right, "D. East" (Score:2)

*5 letter name!*

Well, we all know that 2^5 = 32.

So, if we take your slashdot userid of 318230 and multiply by 32 we get 10,183,360 (you can stop checking the math after this).

So, take 10,183,360 and re-write as 10 18 33 60 and then translate those four characters into "WEST" and it is perfectly clear that *he* is one of *them*! I don't have to spell it out! Mr. East is leading you the wrong way, because you are getting t

## Get a book on cryptography (Score:5, Informative)

*Applied Cryptography: Protocols, Algorithms, and Source Code in C* by Bruce Schneier

*Handbook of Applied Cryptography* by Alfred J. Menezes et al.

## Re: (Score:3, Insightful)

For most questions, you can read as many books as you want, but if the question is simple but stupid, you often won't find the answer.

In these cases, you often need to find someone who is very knowledgeable in the subject, or ask slashdot.

Anybody who has studied an off-campus course will realise the value of being able to ask stupid questions, and talk to knowledgeable p

## Re: (Score:2)

*Practical Cryptography* (by Niels Ferguson and Bruce Schneier) over *Applied Cryptography*. The foreword is an interesting read and highlights some of the problems with the *Applied Cryptography* text (and focus of the earlier book). *Applied Cryptography* is probably better if you're going to write the encryption algorithms yourself. *Practical Cryptography* is written with a bit more hindsight because of watching all the sub-standard encryption systems that have appeared over the last dec

## Snake oil FAQ (Score:5, Interesting)

I used to work at a small company where a guy came in and was trying to get us involved with some dodgy "crypto" product. Management were quite taken with it (he could talk and even had a patent!) but technically it was a load of horse shit with some pretty multicolour diagrams. That FAQ helped me crystallise my objections and the guy was given the heave-ho before the company did anything embarrassing.

## Simple counterexample for your co-worker (Score:3, Insightful)

Take your file, encrypt it with a 16 bit number.

Take the result, and encrypt it again with a different 16 bit number.

Now, how many possible keys do you have to go through to get back the original data?

(Hint: Try single decryption using key3 = key1 XOR key2.)

This counterexample invalidates the claim that encrypting twice with an N-bit key is equivalent to encrypting with a 2N-bit key. It also demonstrates the worst possible case, that encrypting twice may be no more secure than encrypting once.
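Spelling out the hint, under the assumption that "encrypt" means repeating-key XOR (a toy scheme invented for this sketch, not a real cipher):

```python
# With XOR "encryption", two passes with 16-bit keys collapse into one
# pass with key3 = key1 ^ key2, so double encryption adds no strength.

def xor_encrypt(key16, data):
    ks = key16.to_bytes(2, "big")                     # repeating 2-byte key
    return bytes(b ^ ks[i % 2] for i, b in enumerate(data))

msg = b"my secret data"
k1, k2 = 0xBEEF, 0x1337
double = xor_encrypt(k2, xor_encrypt(k1, msg))        # encrypt twice
k3 = k1 ^ k2                                          # one equivalent key
assert double == xor_encrypt(k3, msg)                 # same ciphertext
assert xor_encrypt(k3, double) == msg                 # decrypts in one pass
print("k3 =", hex(k3))
```

So instead of a 32-bit search, the attacker faces a single 16-bit key, which is the worst case mentioned above: encrypting twice may be no more secure than encrypting once.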

## Re: (Score:2)

Any crypto experts want to weigh in?

## Re:Simple counterexample for your co-worker (Score:5, Informative)

*This actually raises a question to which I don't know the answer: if you take a fairly standard symmetric cypher, say DES, and two keys K1 and K2, does there always exist a key K3 such that E_K1(E_K2(Message)) == E_K3(Message)?*

No, this is not always the case. For DES it is certainly NOT the case. The terminology you are looking for is "idempotent cipher." A cipher is idempotent if it obeys the equation you stated -- i.e., for any key pair K1, K2, there is a K3 such that E_K2(E_K1(x)) = E_K3(x). DES is NOT an idempotent cipher.

These questions only seem deep to non-crypto-experts (no offense to you intended) -- and it was certainly accounted for in the design of DES (along with features which harden it against differential cryptanalysis, a technique that wasn't even publicly known outside the NSA at the time -- the people who work on these things are far from stupid).

## Meet in the middle attack (Score:2)

There's a reason why triple encryption is used rather than double encryption. There's a technique known as the Meet-in-the-middle attack [wikipedia.org] which, by trading off storage space for time, lets you break a double encryption with two independent keys using only twice the time needed for single encryption, meaning that using two independent 64-bit keys would only require 2^65 work, rather than 2^128 as one might expect. That's the reason why you never hear of "double DES" but rather Triple DES with independent keys

## easy solution (Score:2)

## Simple for this - hard in general (Score:4, Informative)

However, in the general case of debunking bogus encryption statements, it can be quite difficult. Why? Because encryption is hard, hard, hard, hard stuff. I've taken a graduate course in cryptography and all it did was reinforce how easy it is to make a mistake.

The only encryption scheme I know of that is provably unbreakable is the one-time pad. However, proper implementation of that is a pain. Is AES secure? We don't know. We think it is, but there are researchers looking for weaknesses because we don't know for sure. It is possible (however unlikely) that someone will develop an attack tomorrow, sending everyone scrambling for a new encryption algorithm.

## Re: (Score:2)

That's why we have other crypto algorithms ready. I use Blowfish for my VPN connections.

## The only thing I can think of... (Score:2)

Is exactly equivalent to 1 round of the same, with a different key, and no harder to crack. In fact, if your key is the same length as your alphabet, it would be the plaintext again.

## Group theory (Score:5, Informative)

A single encryption key defines a one-to-one mapping between each plaintext block and ciphertext block. (In ECB (electronic codebook) mode, but the chained block encoding follows a similar analysis.) It has to be bidirectional so that the decryption is unique.

This means that, AT A MAXIMUM, your meaningful keyspace can be no larger than the 2^(number of bits in block). In practice it takes a lot of hard work to ensure that you have 2^(number of bits in key) distinct and non-trivial mappings, and ideally for any arbitrary plaintext and ciphertext block you can guarantee that there is a unique key that will produce that result. However it's easy to screw up and only have, e.g., 2^40 actual encryptions even though you have a keysize of 64 or even 128 bits.

Double encryption also defines a one-to-one mapping between each plaintext block and ciphertext block... but you have the same blocksize. If the keys form a "group", then any double encryption will be equivalent to a single encryption with a different key. In fact any N encryptions will be equivalent to encryption with a single different key. Some combinations will leave you with no encryption. (XOR encryption is a trivial example of this -- you'll get no encryption whenever you apply the same key an even number of times.)

Worse, your keys may not form a group because there's some property that gets cancelled out in a double encryption. E.g., maybe something drops out each time both keys have a '1' in the same spot. In this case double encryption would be significantly weaker than single encryption.

Triple-DES encryption is an oddball since it isn't (or wasn't) known whether DES keys form a group. Remember that a DES key is only 56 bits, but a DES cipherblock is 64 bits -- you know that you can't get every possible mapping between plaintext and ciphertext by running through the keyspace.

Hence 3DES. However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes.

## Re: (Score:3, Interesting)

Wasn't, although the result came very late in the history of DES:

K.W. Campbell and M.J. Wiener, DES is not a group, Advances in Cryptology - Crypto '92, Springer-Verlag (1993), 512-520.

Beep! Retrieval complete.

## Re: (Score:2)

*However 3DES isn't simply encryption three times, e.g., the form we studied in my grad class used two (not three) 56-bit keys, in an EDE pattern. That is, you encrypt with the first key, then DECRYPT with the second key, then encrypt with the first key again. I didn't follow all of the reasoning but I seem to recall it was because of some property of the S and P boxes.*

The reason I learned is much more prosaic: With the EDE pattern, you can implement single DES simply by using the same key for both halves
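That backward-compatibility property is easy to sketch with a toy cipher standing in for DES (the additive byte cipher below is invented for illustration; real 3DES gets the same effect with actual DES, letting triple-DES hardware interoperate with single-DES peers):

```python
# EDE (Encrypt-Decrypt-Encrypt) sketch: with both keys equal, the middle
# Decrypt cancels the first Encrypt, leaving plain single encryption.
# The cipher here is a toy stand-in, NOT DES.

def enc(key, data):
    return bytes((b + key) % 256 for b in data)

def dec(key, data):
    return bytes((b - key) % 256 for b in data)

def ede(k1, k2, data):
    return enc(k1, dec(k2, enc(k1, data)))

msg = b"legacy traffic"
assert ede(42, 42, msg) == enc(42, msg)   # equal keys -> single encryption
print("EDE with one key degenerates to single encryption")
```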

## Wrong. (Score:2)

For example, the maximum keysize of a 128-bit block cipher is log2((2^128)!), roughly 4 × 10^40 bits.

## Possibly not as bunk as you think... (Score:3, Insightful)

A good encryption scheme doesn't have any header, for exactly this reason. It reads in n words and writes out n words and that's that. So if your coworker is doing:

C = E_K2(E_K1(P))

there are indeed 128 bits of information required to decrypt the document. Given a plaintext+cyphertext pair a meet-in-the-middle attack might be possible, but the search space is still on the order of 2^128 (cryptanalysis could probably reduce this some). Given plaintext+intermediate+cyphertext, it's down to 2^64, because each half could be cracked separately.

With a good encryption algorithm the first pass ( E_K1(P) ) generates something approaching line noise - close enough so that there's no statistical way to tell real random bits from encrypted data (many RNGs now use encryption routines to boost incoming entropy). There is no structure, there is no header, and there is no way to verify that it's an encrypted anything. It might as well be a bunch of random bits. Given the cyphertext, your total search space to find the plaintext is still 2^64 * 2^64 = 2^128 sets of keys because there's no way to tell that the first round was performed correctly - it will always just be random bits.

Now, if there IS a header added, you're back to roughly 2 * 2^64. For example, suppose the software compresses (and thus adds a header to) the data before encrypting it. Given the cyphertext you can search for keys that decrypt to a valid header, and then search for keys that produce reasonable English text from those outputs.

## Re: (Score:3)

Mounting a known-plaintext attack (plaintext + ciphertext pair) using meet-in-the-middle on something encrypted twice with two different 64 bit keys requires only 2^65 trials. It's a time-memory trade-off, so it is somewhat memory expensive, but it's well short of the 2^128 trials you claim.

What you need is two ciphertexts
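The time-memory trade-off described above can be sketched with a toy cipher. Everything here is hypothetical (16-bit keys and a made-up round function, chosen only so both exhaustive passes finish in a fraction of a second), but the shape is the real attack: tabulate E_K1(P) for every k1, then decrypt C under every k2 and look for collisions.

```python
MASK = 0xFFFF  # work on 16-bit blocks/keys

def rotl(x, n):
    n %= 16
    return ((x << n) | (x >> (16 - n))) & MASK

def rotr(x, n):
    n %= 16
    return ((x >> n) | (x << (16 - n))) & MASK

def enc(p, k):
    # One keyed, invertible mixing step: XOR, rotate, XOR a key-derived pad.
    # Utterly insecure -- it exists only to make the demo concrete.
    return rotl(p ^ k, k) ^ ((k * 0x9E37) & MASK)

def dec(c, k):
    # Exact inverse of enc
    return rotr(c ^ ((k * 0x9E37) & MASK), k) ^ k

def meet_in_the_middle(p, c):
    """Recover candidate (k1, k2) pairs from one plaintext/ciphertext pair."""
    # Forward pass: intermediate value -> all k1 producing it (2^16 encryptions)
    table = {}
    for k1 in range(1 << 16):
        table.setdefault(enc(p, k1), []).append(k1)
    # Backward pass: 2^16 decryptions, each a constant-time table lookup
    return [(k1, k2)
            for k2 in range(1 << 16)
            for k1 in table.get(dec(c, k2), [])]

k1, k2 = 0x1234, 0xBEEF          # the "secret" keys
c = enc(enc(0x0042, k1), k2)     # double encryption
candidates = meet_in_the_middle(0x0042, c)
assert (k1, k2) in candidates    # found with ~2 * 2^16 work, not 2^32
```

Note that with a 16-bit block the candidate list also contains many false positives; a second plaintext/ciphertext pair filters them out. The same structure scales to two 64-bit keys: about 2^65 operations and 2^64 stored intermediates, instead of 2^128 trials.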

## Look at the HAC (Score:4, Informative)

In particular, the part that answers your question is found in Chapter 7 (block ciphers) [uwaterloo.ca]. The justification for Fact 7.33 explains why, if you can store 2^64 blocks of ciphertext, you can break double 64-bit encryption in 2^64 operations.

If your coworker is not capable of understanding the math behind this reasoning, he really should not be making decisions about encryption.

(Note: I'm not withholding the justification in this post simply to be obtuse. It's too much of a pain to format it reasonably in this input box... you'll just have to refer to the PDF.)

## I'm right and I know it (Score:3, Interesting)

The other method of going over their head is to cite someone -- not always even a specific person -- who is smarter than both of us. So for crypto algorithms, I can ask them, "If it was really that easy, don't you think someone else would've thought of it by now? Why do you think we invent wholly new algorithms for 128 bits?"

And if both of those fail, the final method of going over their head is to cite specific sources and papers, and invite them to actually disprove what's said there (without really having to understand it yourself) if they really think they're right.

All of these, of course, are for when I know I'm right. Quite often, I really don't know, and I stay humble then. But when I know I'm right, it's usually much more important to hit them over the head with one of the above than to try to explain why I'm right. After all, you don't have to know why a car is safer than open air in a lightning storm, you just have to get in the car. It's not important that someone understands why not to just double-encrypt with 64 bits, it's just important that they not do it.

## Example: One-time pad (Score:2)

There may be a single key KC with

E_KC(plaintext) = E_KB(E_KA(plain_text))

If you just encrypt 64 bits with a one-time pad (a trivial example), then this is obvious:

C = A XOR B

I.e. with a one-time pad you gain exactly no security at all.

For many ciphers, a key C may not exist, but even if it does not exist, an attack on E_KB(E_KA(plain_text)) may be just as easy as on a single key. What is worse is that for some ciphers (e.g. DES) it is not kno
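The XOR collapse is easy to verify in a few lines (a minimal sketch; the message and variable names are illustrative):

```python
import secrets

def otp(msg: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each message byte with a key byte
    # (encryption and decryption are the same operation)
    return bytes(m ^ k for m, k in zip(msg, key))

msg = b"attack at dawn"
key_a = secrets.token_bytes(len(msg))
key_b = secrets.token_bytes(len(msg))

# Encrypting twice with keys A and B...
double = otp(otp(msg, key_a), key_b)

# ...is identical to encrypting once with the single key C = A XOR B:
key_c = otp(key_a, key_b)
assert otp(msg, key_c) == double
```

So for the one-time pad the set of keys is closed under composition (it forms a group), and double encryption buys nothing.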

## And encrypting 8 times with an 8-bit key? (Score:2)

See the subject.

## It's a new cypher. Remember what Friedman said. (Score:2)

Encrypting something twice is effectively creating a new cypher system. Always remember Friedman's remark, "No new cypher is worth looking at unless it comes from someone who has already broken a very difficult one."

(If you don't know who William Friedman was, you probably shouldn't be designing cyphers.)

## let's put it this way (Score:2)

## Simple analogy. (Score:2)

If you have a lock on your door, which has only 100 different keys, a thief could take 100 keys along, and break in. Suppose you find this "100 keys" too easy, and install a second (100-key) lock.

There are now 10000 combinations of key1 and key2.

Do you expect the next thief to take along 10000 keyrings, each keyring holding two keys (one for the first lock, one for the second)? Or does he come in with 200 keys?

In the mechanical

## Ask questions---lots of questions. (Score:2)

First of all, what is a '64-bit encryption algorithm'? Is this a symmetr

## AES (Score:2)

If you are worried about CPU, memory, etc, remind yourself that the criteria for selecting the AES algorithm included CPU usage, memory usage, and the ability to be implemented on silicon.

## Briefcases (Score:4, Insightful)

Once you have cracked the left-hand lock {which will take at most 1000 tries - 17 minutes} you are free to concentrate on the RH lock {which also will take at most 1000 tries}.

You don't have to try any more combinations on the LH lock. So you can open the briefcase in a maximum of 2000 tries, or just over half an hour. The key fact that makes this possible is that the two locks are independent: you can undo one while the other remains fastened. The combined keyspace is then the sum, not the product, of the two keyspaces.

If the encryption operation adds a predictable header to the ciphertext {for example, BEGIN OPHIOL ENCRYPTED MESSAGE} {and most of them do} then you have something to look for that will let you know when you have cracked the first layer of encryption -- in effect, popped open the left-hand lock.
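The briefcase arithmetic can be sketched directly (hypothetical combinations; 1000 possibilities per lock, as above):

```python
import itertools

SECRET_LEFT, SECRET_RIGHT = 417, 88  # hypothetical 3-digit combinations

def crack_independent():
    # Each lock pops open on its own, so the costs ADD: at most 1000 + 1000.
    tries = 0
    for guess in range(1000):
        tries += 1
        if guess == SECRET_LEFT:
            break
    for guess in range(1000):
        tries += 1
        if guess == SECRET_RIGHT:
            break
    return tries

def crack_coupled():
    # Only the full pair gives feedback, so the costs MULTIPLY: up to 1000 * 1000.
    tries = 0
    for left, right in itertools.product(range(1000), repeat=2):
        tries += 1
        if (left, right) == (SECRET_LEFT, SECRET_RIGHT):
            return tries

assert crack_independent() <= 2000        # sum of the keyspaces
assert crack_coupled() <= 1000 * 1000     # product of the keyspaces
```

A recognizable header after the first decryption plays exactly the role of the left-hand lock popping open: it turns the product back into a sum.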

## Re: (Score:3, Insightful)

any mechanical combination lock: if the turny bit of the mechanism isn't properly decoupled from the locky bit, then you can identify a partial solution and so decrease the search space. But if it is too well decoupled, then the lock will give false positives or false negatives. Some of those cheap briefcase locks are actually quite poorly decoupled: you can feel just a little bit more movement when any one of the wheels is "solved". {Which is why that episode of Only Fo

## Re: (Score:2)

Ummm, I think the answer is rather obvious... I'm posting on

## Encryption not like matrices... (Score:2)

Of course a cipher being a mapping function that maps it

## Re: (Score:2)

It is arguable that if you had a one-time pad system (which is provably totally secure so long as the key is secure and used no more than once) where each bit/byte of the message was XORed with a totally random bit/byte from start to finish and where

## Re: (Score:2)

At least one American Admiral has reused OTPs. It's easy to call thes

## Re: (Score:2)

You can guess the letters one at a time (i.e. "is the first letter 'a'?"). Your coworker has to guess the letters two at a time (i.e. "is the phrase 'aa'?"). Generally speaking, who wins?

To make this analogy work, the answer to the first person would have to be "MAYBE" every time they asked, because there's no intrinsic way to know when you've guessed the key to unencrypt pseudorandom ciphertext - it looks just like all the other pseudorandom data that invalid keys would generate.

## Re: (Score:2)

> Then you do the same search for the second pass.

Okay, I think I understand this argument. However, you say you crack the cyphertext in two passes. Now, what I don't get is: how do you know which of the 2^64 results from the first brute-force run is the right one to feed into the second run? If there is no header, all 2^64 results look like garbage, no? Wouldn't I need to run ALL results through the second round?

Please enlighten me.

## Re: (Score:2)

You missed some key points:

In general TDES with three different keys (3TDES) has a key length of 168 bits: three 56-bit DES keys (with parity bits 3TDES has the total storage length of 192 bits), but due to the meet-in-the-middle attack the effective security it provides is only 112 bits. A variant, called two-key TDES (2TDES), uses k1 = k3, thus reducing the key size to 112 bits and the storage length to 128 bits. However, this mode is susceptible to certain chosen-pl