Encryption Security

Biggest Public-key Crypto Crack Ever

galore writes "Certicom's ECC2k-108 Elliptic Curve Discrete Logarithm challenge has been broken! This was the largest public calculation ever to use a complex parallel algorithm. $5,000 in winnings will be donated to the Free Software Foundation. Congratulations to everyone who participated, including team Slashdot!" There are conflicting versions of the prize-money details - some say $8,000 to the Apache Software Foundation, others say $5,000 to the FSF.

  • by Anonymous Coward
    The comparison between RSA and ECC is not that difficult. Here's a walk-through:

    1) Pick a level of security that you would like to enjoy. (Say "If distributed.net were 100 times its current size and grew exponentially, it would take them 1000 years to break it..." or something like that.)

    2) Find the best known algorithm for integer factorization (i.e. cracking RSA). This would be the Number Field Sieve (or some variation thereof). Figure out the key length you would need to enjoy the level of security you wanted in part 1.

    3) Find the best known algorithm for solving the elliptic curve discrete logarithm problem (i.e. cracking ECC). This would be the Parallelized Pollard Rho algorithm. Figure out the key length you need to enjoy the level of security you wanted in part 1.

    4) Figure out how long it would take RSA to encrypt/decrypt something with the key length you calculated in step 2.

    5) Figure out how long it would take ECC to encrypt/decrypt something with the key length you calculated in step 3.

    6) Compare.

    To say "RSA is faster than ECC" without qualification ignores the fact that the running time of each algorithm is fundamentally dependent on key size. (choosing 2^16+1 as a public exponent still makes for slow decryption, and although there is no break per se, it is still suspected that low public exponents are less secure than random ones. See, for example, the survey paper "Twenty Years of Attacks on the RSA Cryptosystem" by Dan Boneh of Stanford.) Also, the choice of basis and finite field for ECC is not that hard either. You pick the choices that gives you the best results.

    NB: This isn't really a walkthrough, if you haven't figured that out by now. Each of the steps, with the possible exception of step 6, actually takes a fair amount of work and background research to do. Also, since the question at hand seems to be the centre of a religious debate, I'll try not to say which algorithm comes out on top.

    donny
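
    For the curious, here is a minimal Python sketch of steps 2 and 3, using only the textbook asymptotic cost formulas (GNFS for factoring, parallelized Pollard rho for the ECDLP). The constants and the linear search over key sizes are illustrative assumptions, not measurements; real comparisons calibrate against actual factoring records and come out somewhat lower (e.g. 1024-bit RSA is usually matched with roughly 160-bit ECC).

        # Rough equivalence of RSA and ECC key sizes via asymptotic attack costs.
        # Assumptions (not from the article): GNFS work ~ L_n[1/3, (64/9)^(1/3)],
        # Pollard rho work ~ sqrt(pi * 2^k / 4) group operations.
        import math

        def gnfs_cost(rsa_bits):
            """Heuristic GNFS work factor for factoring an rsa_bits modulus."""
            n = rsa_bits * math.log(2)                    # natural log of the modulus
            c = (64.0 / 9.0) ** (1.0 / 3.0)
            return math.exp(c * n ** (1.0 / 3.0) * math.log(n) ** (2.0 / 3.0))

        def rho_cost(ecc_bits):
            """Expected group operations for Pollard rho on a ~2^ecc_bits order curve."""
            return math.sqrt(math.pi * 2.0 ** ecc_bits / 4.0)

        def equivalent_ecc_bits(rsa_bits):
            """Smallest curve order (in bits) whose rho cost meets the GNFS cost."""
            target = gnfs_cost(rsa_bits)
            k = 1
            while rho_cost(k) < target:
                k += 1
            return k

        for rsa in (512, 1024, 2048, 3072):
            print(rsa, "bit RSA ~", equivalent_ecc_bits(rsa), "bit ECC (crude model)")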
  • > This is about as cool as a guy who claims he can
    > open every safe and when asked to demonstrate
    > sits down in front of it and starts
    > turning the wheel until he gets the combination
    > by accident.

    no, read up on it first. this was not a brute force attack, it used an intelligent algorithm. read more (and brush up on your abstract algebra) here: http://cristal.inria.fr/~harley/ecdl7/FAQ.html

    later,
    ian
  • i believe that is what we refer to as "security through obscurity." the algorithm itself is what must withstand attack; one must always assume attackers will be able to determine how a document was enciphered. in this particular case, the "crackers" most certainly knew about the method of encryption beforehand.

    later,
    ian
  • by aphr0 ( 7423 )
    Does apache really need the money? It's not like their project really needs help. Wouldn't the money be better spent on 1 or 2 smaller projects that could actually use an extra development machine or some hardware?
  • Okay, I'll throw this out. We thought it'd be an interesting experiment in GA to see if it could take an input n and give me an output p where n is the number of primes since 0. So, 1,2,3,5,7,9,11,13 .. 13 would be the 7th value. So we plug 7 into the algo and get 13 out of it.

    The program basically creates a string of random math which means nothing and then starts mutating them. It feeds about 30 primes plus their offsets in and then averages how far each one was off. When the algorithm reaches a 0% deviation, that means we solved the problem. His program got to 0.000?? - ask him for the algo it used.

    The damn thing's output is ugly as sin though.. well.. actually ugly as sine *cough* .. but it ran out of memory to hold the gene pool and basically swapped the machine it was running on to hell. That's why we need more RAM. :(

    As to why such an algorithm would be useful.. I thought that since people basically brute-force through X numbers and run fast-fourier transforms and what-not on them we thought it would be nice if you could compress the keyspace to a pitifully small number. a 1000-bit keyspace now only has maybe 16 bits of keyspace. And because of how primes work... a 10000-bit keyspace might only have 20 bits of keyspace now. Kinda makes using large keys pointless then and puts us back on the fast track to using OTP.

  • The ProcessTree Network [processtree.com] was created to bring two groups together: those who
    have huge computing jobs to do; and millions whose computers just sit idle much of the time.
    With the ProcessTree Network those BIG computing jobs can get done faster
    and cheaper, and people can turn their computers' idle time into money.
  • Unless you are able to find a huge (for large values of "huge") amount of computing power, your chance of finding the key, as an individual, is zero (plus epsilon, where epsilon is smaller than a Slashdot troller's vocabulary). However, joining a team means you may actually have a chance of winning SOMETHING, even if it is just accolades, or the right to vote on which charity gets the cash.

    Anyone who is cracking RC5 for the money, is deluding themselves. Do it for the sex appeal!
  • AFAIK it is proven that any higher-times DES give no significant security bonus over TripleDES regardless of the greater keylength.

    A quick search of my literature reveals no such proof, although pg.366 of Applied Cryptography implies that Quintuple DES offers improved resistance to meet-in-the-middle attacks, as it certainly does for exhaustive search.
  • You are right to point out that TripleDES is still the most tested and proven cipher in the field, despite DES being much maligned these days. It should be noted that TripleDES is thought of so highly simply because it has been around for so long. Blowfish, IDEA, and others are too new to be completely trusted, although they may very well be as secure or more secure than TripleDES.

    Unfortunately, there is generally no way of formally proving security for these block ciphers, other than showing that they resist various known real and hypothetical attacks (leaving open the possibility that some currently unknown technique could crack a block cipher wide open).

    So, in other words, DES and TripleDES have stood the test of time, but there is no guarantee that TripleDES is more secure than a newer block cipher. Personally, I'd use QuintupleDES if I were really ultra paranoid, otherwise I like Blowfish because of its speed. (And it is free for anyone to use and implement)
  • I find it strange that with more and more encryption busting going on out there, we still cannot decipher the encryption used by Microsoft Word to mark up text with a consistent appearance. This challenge is something worth rather more than the 5000 to the FSF, valuable as that may be. Does anyone remember that WordPad used to credit someone outside Microsoft for its existence (before 95b was released)? I think that program could decipher most Word codes.
  • Glad to see all the comments on the ECC challenge, and congratulations to the ECDL project team.

    To be clear, what has been broken is a single key on an elliptic curve which has a 108 bit order. The smallest key size which Certicom commercially recommends is a 160 bit key, which is on the order of 2^26 times as hard to break (a factor of approximately 67 million).

    The papers available at www.cryptosavvy.com [cryptosavvy.com] are a good, independent, and recent reference for key size comparisons.

    Tim Dierks
    Chief Technical Officer, Certicom
    tdierks@certicom.com [mailto]
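
    (A back-of-the-envelope check of that "2^26 times as hard" figure, assuming only that parallelized Pollard rho costs on the order of the square root of the group order:)

        # Pollard rho on a curve of ~2^n order takes ~2^(n/2) group operations,
        # so relative difficulty scales with the square root of the group order.
        def relative_rho_effort(bits_a, bits_b):
            """How many times harder a bits_b-bit curve is than a bits_a-bit curve."""
            return 2 ** ((bits_b - bits_a) // 2)

        print(relative_rho_effort(108, 160))   # 67108864, i.e. 2^26, about 67 million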

  • 1. ECC is not patent free. Several companies are engaged in patent war over ECC (Certicom being the number one). The "nice" curves have already been patented (mathematicians in the audience will crucify me for describing some curves as "nice", but it's a reasonably accurate layman description--some curves make crypto easier than others, hence they're "nice").

    There are a number of ECC patents, but it's possible to write a 'nice' implementation without violating any of them. It's not possible to write an implementation of RSA without violating their patent.


    2. ECC is not faster than RSA. RSA is not faster than ECC. Nor are they equal in speed. While this all sounds terribly contradictory, it's all true; as we all know from having complained about NT-versus-Linux benchmarks, whoever is paying the analysis firm gets the results they want. When Certicom pays for ECC-versus-RSA, it always turns out that ECC is faster. When RSADSI pays for it, it always turns out that RSA is faster.


    Emacs beats vi!

    You have to take into account the relative strength of ECC keys versus RSA keys. ECC keys can be much shorter and therefore require much less processing. It's debatable what that ratio should be, but contests like this help decide those numbers.

    Even assuming that ECC were unambiguously faster than RSA, it wouldn't make a tinker's dam of difference. The applications which use asymmetric cryptography extensively are few and very far between. Symmetric ciphers have a better foundation in number theory, are more thoroughly cryptanalyzed and are often faster. Most of the time when asymmetric crypto is used, it's only used to negotiate a symmetric key. If it takes RSA a millisecond to encrypt/decrypt a 256-bit Twofish key, what do I care if it takes ECC a microsecond to do the same task?

    One millisecond? Ok, I'm using weird hardware - but my decryption takes way longer than 1 millisecond (IIRC around 50-100ms for a 256 bit field). Considering all secure connections must decrypt at least one asymmetric key, you are looking at at least 50ms of connection time. That means you can only accept 20 connections per second per machine. For high-usage applications this ain't so great. Speed does matter. It makes very little difference in a web browser - I'll give you that - but if you download a secure page with 20 inline images, each must do an RSA decrypt. Multiply by the number of users and you've got a CPU crisis on the server side - unless you start buying crypto-accelerator boards.
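
    (The arithmetic behind that, treating each new SSL connection as one private-key decryption and ignoring session reuse; the 50ms figure is the poster's, the rest is just division:)

        # Back-of-the-envelope server throughput if every new secure connection
        # costs one private-key decryption and there is no SSL session reuse.
        def max_connections_per_second(decrypt_ms):
            return 1000.0 / decrypt_ms

        def page_cpu_ms(decrypt_ms, inline_objects):
            # one handshake for the page plus one per inline image (worst case)
            return decrypt_ms * (1 + inline_objects)

        print(max_connections_per_second(50))   # 20.0 connections/sec per CPU
        print(page_cpu_ms(50, 20))              # 1050 ms of server CPU per page view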
  • And what is to stop me from tapping the voice coil on the speakers? OH!! Special no tamper circuitry? I had best use super high impedance probes now shouldn't I? Sheesh but the RIAA is really stupid!

  • Part of the same reason we land on the moon, or other seemingly stupid things.

    Chris Hagar
  • Speaking of prime numbers, please check this out: http://www.entropia.com/ips/ ...it is a distributed prime number search. I consider it to be more "useful" than, say, the RC5 effort. But that's my own personal opinion anyway. - Save a life by slashdotting this site! http://www.hungersite.com/
  • Hehehe I run a team, so I can help you out. :)

    Go to The Distributed.net download page [distributed.net] and grab the client for your platform. Specific instructions on how to install it for your platform are in the client, but the win32 version has a nice installer. :)

    Configuring it can be kind of a pain if you're a dialup/offline user, but if you have a LAN-type connection just plunk in your email address and that's about that; you're ready to go. It'll connect to the internet and download some "blocks" to work on. When it's done with them, it'll send them back.

    The day after you send some blocks in, go to the Distributed.net Stats Server [distributed.net] and do a search for your email address. From there you can get your password, and can join a team.

    If you want to join team slashdot, after you have retrieved your password, do a team search for slashdot, go to the team info, and there should be an "I want to join this team" link. Click it.

    After that, your blocks will also count towards the team.

    If you need more help with it, you can post here, visit #distributed on efnet, or if you're desperate, email me. :-)

    Happy cracking!

  • AFAIK it is proven that any higher-times DES give no significant security bonus over TripleDES regardless of the greater keylength.
  • First napster bombs, and now crypto? The Evolution Control Committee [evolution-control.com] is even cooler than I thought... :)
  • There is no way you can tell a number is prime in polynomial time. There are probabilistic algorithms which can do that, but not any deterministic algorithm. This problem belongs to the class Co-NP and not P. There are very efficient deterministic algorithms which have near-polynomial behavior (n^logn or something like that).
  • >... Looking through the chain, you've got some file (FOO.BAR) which you compressed using PKZip (FOO.ZIP). You ran it through uuencode and uudecode (net result, no change), leaving it as FOO.ZIP. Then you just compressed-it-with-password again and renamed it BAYWATCH.JPG.

    No.
    I pw-zip FOO.BAR to FOO.ZIP
    I reverse FOO.ZIP to PIZ.OOF
    I uuencode PIZ.OOF to PIZOOF.UU
    I pw-ZIP PIZOOF.UU as PIZOOF.ZIP
    I reverse PIZOOF.ZIP to PIZ.FOOZIP
    I rename PIZ.FOOZIP to BAYWATCH.JPG

    Well, maybe not .JPG but, like, .LOG or .CmdrTaco

    But you are right - in the long run it won't be secure enough. If OTOH I replace ZIP with PGP and/or some scrambler that uses Abe's speech as the key... And then, the cracker still must have a clue about what the heck is so interesting about a file like MSG.LOG...

  • This leads me to a question: Do the crackers know about the algorithm used for a specific document? I remember coming across that .zoo file that refused to unpack until we found out it was a scrambled-with-password .zip file...

    So, if I just went and pw-zipped a file, uuencoded it, reversed it, re-zipped again with password and called the whole thing BAYWATCH.JPG, what time would it take to decrypt it?

    What I want to say is: if you are smart enough to leave no traces (or better yet, lure the crackers to a wrong trail), could even a 'mediocre' encryption algorithm (or combos thereof) be 'secure enough'? (that's that stegano thing, right?)

  • It's cool. What other reason do you need?

    I'll elaborate anyway: it's like a worldwide Beowulf cluster, except sorta not.

    Further, it shows what a whole bunch of idle CPU cycles can accomplish.

    And anyway, see the subject: why not?

  • Ahhh. Excuse my ignorance, then.


    -RickHunter
  • I think the point is to see how long it takes and how much power has to be applied to effectively brute-force the key. Mathematical models can predict this, but it's nice to have a real-world validation. Then they can say "Look how secure our system is! It took X computers of Y power Z months to crack A message encrypted with B key! Now give us much money!" ;)


    -RickHunter
  • Does it not in fact demonstrate how hard it is to crack ECC?

    As I read the article, that was indeed the point. The sponsors of the contest wanted media attention on how HARD it was to break their technique.

  • Since when is 9 a prime number?
  • The point is to determine a minimum key length to use. Minimum key lengths are conservative estimates balanced by practical key storage and enciphering computation limits. With the real-world data produced by contests such as these, we can make much better estimates of the minimum key lengths required for the desired level of security.

    Burris
  • Seti@HOME has outright stated that they have no need for more processing power. Their clients are dreadfully wasteful and unoptimized. Their leadership essentially tells people who want to help with their design to fuck off and their computational cores (of which the algorithms have been known for decades) are completely closed source, unlike those of d.net.
  • Have you wandered over to www.apache.org recently? There is much more to the Apache Software Foundation than just the web server project. Most of the others (Tomcat, Cocoon, Xerces, etc.) are still relatively young. And although other companies (e.g., Sun, IBM) are helping out on many of these projects, the help comes through staff, not money.

  • The DTCP spec, which I've seen mentioned in the context of F*r*W*r* / IEEE 1394. It uses SHA-1 and ECC-based 'crypting, with keys from 64 to 320 bits depending on what part of the protocol suite you're referring to.

    It's a @#$%^&* messy spec, with parts released only under signed NDA. I suspect that someone may figure out a fairly fast hack of the "copy-one-time" / "copy-never" / "copy-freely" attribute and pass that info around. What this breaking of a key-pair shows is what a brute force attack will cost (today).

  • Soon we'll crack the code of this seti@home thing.

    I've written them and asked them to open-source their core like d.net did [distributed.net], claiming that this would cut down on illegal [cr|h]acking to make the client faster.

  • From the challenge site: "There are two Challenge Levels: Level I, comprising a 109-bit and a 131-bit challenge; and Level II, comprising 163-bit, 191-bit, 239-bit and 359-bit challenges. The 109-bit challenges are considered feasible and could be solved within a few months, while the 131-bit challenges will require significantly more resources to solve. All Level II challenges are believed to be computationally infeasible."

    Maybe I'm a fool, but I could not see on the page which of these "levels" was solved. I think that's kind of an important distinction to make.

  • Good on everyone. I'm not a crypto cracker by any stretch of the imagination, and so I am in awe of those who are. A while back /. had something on a cryptogram written by the late great Edgar Allan Poe, and I posted it on my wall. After many an hour staring at it, I haven't even a glimmer of the meaning behind it. *SIGH* Maybe I'll find the time to learn about encryption someday...
  • How are the folks at distributed.net doing with their RC5 attempt? How do I sign up for the "TeamSlashdot" and donate my (albeit humble) proc time to winning (maybe) more money so that Slashdot could (maybe) afford some real bandwidth...

    UMR Rulez
  • "Does it not in fact demostrate how hard it is to crack ECC?

    No, you see: It's really as simple as 60 years ago, in WW2: When the Germans heard the words:

    Certicom's ECC2k-108 Elliptic Curve Discrete Logarithm challenge has been broken!
    they thought it meant
    Elliptical curve Certicom ECC2k-108 was interrupted separate logarithm challenge!
    which is in fact the result when you translate the words into German and back with Babelfish [altavista.com].

    This, however, hasn't brought me any closer to understanding what we're discussing here. But I do know that, once the Americans learned how to decrypt the Germans, the war was won. And this is why we'll never be taken over by the machines - we're smarter than them, and we speak real German :-p

    Excuse me for being offtopic. It won't happen again. Promise.

  • I can tell that you didn't go to the site mentioned at the top of the page. I can also tell that you didn't read the other threads. The purpose of the contest was to spread the awareness of ECC encryption. Not to "condone cracking and the breaking of codes put in place to protect information from pirates".
  • This was a big cracking effort, absolutely devastating the elliptic curve encryption system. To the best of my knowledge, this was the first real Open test of an EC system.

    Given the key-length and the time taken, what (if any) are the implications for EC encryption?

  • i originally posted the story and said that $8,000 would go to the apache software foundation - this is still correct... i'm not sure why it was changed on the front page. just read the homepage for the project.

    --

    The plan:

    Solve part of Certicom's ECC challenge.
    Win $10000 and give most of it to the Apache Software Foundation.
    Um, that's it.
    ...

    Certicom is offering a prize of $10000 for the first correct solution. If we win it, $1000 will go to each of the two people who find the match and we will donate the remaining $8000 to the Apache Software Foundation. Note that following our previous successes, we have already donated $12000 to the Free Software Foundation.

    --

    older versions of this effort donated to the FSF; this one is definitely going to apache.

    later,
    ian
  • other posters have pointed out that DES is a secure algorithm, but also of note is the ridiculousness of comparing ECC to DES. elliptic curve cryptosystems are for public-key use, while DES is a symmetric cipher. so no, ECC is not "better" than DES, or vice versa. DES is a very strong symmetric cipher that has withstood many many years of attacks. This project helps to show how strong elliptic curve cryptosystems really are, as they have been around a relatively short time (the mid-80's i believe). While elliptic curves have been studied for a great deal longer than that (as certicom points out), not until recently have they been looked at in regards to cryptography. If ECC is as strong as certicom claims, it will prove to be one of the most interesting cryptographic techniques since public-key cryptography and RSA itself.

    later,
    ian
  • Isn't this the very same algorithm that was going to be used to encrypt the signal sent to USB speakers over the wire, so the MPAA could ensure end-to-end encryption? The same algorithm that would be used for Intel's wonderful monitors that encrypt the output?

    Isn't it about time the Apache project relocated to a country not a member of the WIPO? :)

  • Major news would be an algorithm that ran in linear time.

    ask this guy [mailto] about linear solving. It was my idea, but he's got one that's pretty fscking close to being able to predict primes in linear time. I just wish we had enough memory.. we got to a deviation of less than 0.0000?? per thousand.. :( I think we need a 256MB chip to let the tree fully 'bloom'.. but ask him.. he knows how the algo works better than I.

  • Good point on the Certicom patents, but I've used both ECC and RSA, and, shotgun-accurate, ECC is muuuch faster for roughly equivalent strengths. It's also much smaller per key; RSA 1024 is about as strong as ECC 131 or 163 (depending on who you're talking to; both are considered by Certicom to be computationally infeasible).

    Quoting Alfred Menezes, a consultant at Certicom and general ECC god-of-knowledge, from his web page [uwaterloo.ca]:
    NIST recommended that 256-bit ECC key lengths be used for equivalent security as 128-bit AES, and that 384-bit and 512-bit ECC be used for 192-bit and 256-bit AES. The rough estimates of RSA key lengths for equivalent security are 3072 bits, 7680 bits, and 15360 bits. Imagine the performance degradation incurred with RSA implementations at these key sizes, even with e=3!!

    This same paper (written in response to Bruce's discussion of ECC in his Nov99 crypto-gram [counterpane.com]) goes on to say on speed:
    It is generally agreed that ECC private key operations (signature generation and decryption) are faster than RSA private key operations. It is also generally agreed that ECC public key operations (signature verification and encryption) are slower than RSA public key operations when a small encryption exponent (such as e=3) is used.
  • Ok, I got some confused mail. RSA can't really be broken in 12 microseconds. That was a joke - referring to a previous article [slashdot.org] posted on /.

    512-bit RSA really takes a few days to factor with specialized hardware (AFAIK).
  • Due to a bug in Slash, the message directly above this in the thread (BAYWATCH.JPG) was attached as a response to the wrong message. A newbie was asking a question about multiple encryptions in weird and unexpected ways, and whether or not it could provide security. So please--hold the flames, I really *did* respond to the right message, honest. :)

    A bug report has been filed with Taco.
  • However, unlike DES, there is no known mathematical loophole

    Sorry, this is glaringly inaccurate. There is NO known mathematical loophole in DES - the "least effort attack" is a brute-force attack. If there was a weakness known in the algorithm then the best attack would exploit this instead.

    The DES algorithm is excellent, merely suffering from keys that are too short at 56 bits to give adequate protection - note the difference between the *quality* of the algorithm and the level of protection here. Triple-DES (112 bit key equivalent - don't believe anyone trying to sell it to you as a 168 bit key solution since there is an attack against double-DES making it only a single bit stronger than single-DES. Yadda yadda - see "Applied Cryptography" for the details) builds upon this and gives a level of protection considered unbreakable since the best attack is still the brute force attack trying all keys in turn, and 2^112 is somewhat larger than 2^56.

    S
  • It's not as major as you may think. All public key algorithms are cracked in the sense that there are known attacks that are significantly faster than complete trial-and-error. For RSA you just factor the public modulus. ElGamal and elliptic curve systems involve calculating discrete logarithms in a finite field.

    So, this doesn't mean the algorithm is broken. We've known all along that the keys could be cracked in this way. It does give us a better picture of how large the keys must be to make cracking them infeasible.

    Unless the implementations you mention use keys that are not much larger than the ones recently cracked, they are still quite safe. Major news would be an algorithm that ran in linear time.

    Burris
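
    For anyone curious what "calculating discrete logarithms" looks like in practice, here is a toy, single-threaded Pollard rho for discrete logs over a small multiplicative group mod p. The Certicom attack was the same family of algorithm, parallelized with distinguished points and run over an elliptic curve group instead; the tiny parameters below are purely illustrative. (Needs Python 3.8+ for pow(x, -1, m).)

        import random

        def pollard_rho_dlog(g, h, p, q):
            """Find x with g^x = h (mod p), where g has prime order q mod p."""
            def step(x, a, b):
                # Pseudo-random walk partitioned on x mod 3 (Pollard's original rule);
                # the invariant is x == g^a * h^b (mod p), with a, b kept mod q.
                if x % 3 == 0:
                    return (x * x) % p, (2 * a) % q, (2 * b) % q
                elif x % 3 == 1:
                    return (x * g) % p, (a + 1) % q, b
                else:
                    return (x * h) % p, a, (b + 1) % q

            while True:
                a0, b0 = random.randrange(q), random.randrange(q)
                x, a, b = (pow(g, a0, p) * pow(h, b0, p)) % p, a0, b0
                X, A, B = x, a, b
                while True:                  # Floyd cycle-finding: the hare moves twice as fast
                    x, a, b = step(x, a, b)
                    X, A, B = step(*step(X, A, B))
                    if x == X:
                        break
                r = (b - B) % q
                if r != 0:                   # otherwise the walk was degenerate; retry
                    return ((A - a) * pow(r, -1, q)) % q

        # g = 4 generates the subgroup of prime order q = 509 in the integers mod p = 1019.
        p, q, g = 1019, 509, 4
        h = pow(g, 123, p)
        print(pollard_rho_dlog(g, h, p, q))  # 123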
  • Wrong. If the ERH is proven correct, a variant of the classic Miller-Rabin SPPT can prove a number to be prime in strictly polynomial time. The Jacobi sum test (aka the APR-CL algorithm) can prove primality in deterministic O(n^(lg lg n)) time, which is really close to polynomial. ECPP can prove primality, under a very plausible heuristic argument about the smoothness of a certain class of elliptic curves that is too complicated to explain here, in O(n^(6+eps)) time for all numbers except a set with Lebesgue measure 0 (i.e. almost all integers in a well-defined, formal sense). Proving primality is in the complexity class RP, not Co-NP, via the inefficient-in-practice but theoretically rigorous Adleman-Huang algorithm (based on improvements to Goldwasser's original randomized algorithm over elliptic curves of genus 2), which can prove the primality of any input n in running time O(n^(13)). It's also a complete pain to program (better than Atkin-Elkies-Schoof, but that's not saying much), but that's an aside. Needless to say, we don't use algorithms like these very often...
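
    Since the Miller-Rabin SPPT keeps coming up, here is a minimal version of it for reference (probabilistic, with random bases - not the deterministic ERH-conditional variant described above):

        import random

        def is_probable_prime(n, k=20):
            """Miller-Rabin strong probable-prime test; a composite survives with prob. <= 4^-k."""
            if n < 2:
                return False
            for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
                if n % p == 0:
                    return n == p
            d, s = n - 1, 0                  # write n - 1 = d * 2^s with d odd
            while d % 2 == 0:
                d //= 2
                s += 1
            for _ in range(k):
                a = random.randrange(2, n - 1)
                x = pow(a, d, n)
                if x in (1, n - 1):
                    continue
                for _ in range(s - 1):
                    x = (x * x) % n
                    if x == n - 1:
                        break
                else:
                    return False             # a is a witness: n is definitely composite
            return True                      # n is very probably prime

        print(is_probable_prime(2**127 - 1))   # True  (a Mersenne prime)
        print(is_probable_prime(2**128 + 1))   # False (the seventh Fermat number is composite)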
  • There are several certificational weaknesses of the DES algorithm that are not practical. Linear and differential cryptanalysis can reduce the required time needed to break DES to about 2^45, with massive amounts of known and chosen plaintext/ciphertext pairs.

    Your statements about the strengths of double and triple DES should be clarified. The key-length equivalents that you mention are only applicable to what is known as a "meet in the middle" attack, whereby the attacker uses a known PT->CT pair and decrypts one DES encryption of the CT with a test key while encrypting the PT with another test key under one DES encryption. The attacker keeps a list of the pairs of results in the middle (I'm handwaving here, as there are some optimizations that can be made), and when there is a match, the correct keys can be trivially recovered. The attack requires serious memory to even think of carrying out (orders of magnitude greater than all of the RAM installed in every computer that has ever been built).
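
    A toy illustration of that meet-in-the-middle structure, using a deliberately weak 16-bit-key cipher instead of DES so it runs in a few seconds. The shape of the attack is as described above: about 2*2^16 cipher operations plus a big table, instead of 2^32 for brute force on the double key; a single known pair leaves roughly 2^16 candidate key pairs, and a second known pair weeds those down to the real one.

        from collections import defaultdict

        def toy_encrypt(block, key):
            """Stand-in 16-bit block cipher (NOT DES): xor, rotate, add."""
            x = (block ^ key) & 0xFFFF
            x = ((x << 5) | (x >> 11)) & 0xFFFF      # rotate left 5
            return (x + key) & 0xFFFF

        def toy_decrypt(block, key):
            x = (block - key) & 0xFFFF
            x = ((x >> 5) | (x << 11)) & 0xFFFF      # rotate right 5
            return (x ^ key) & 0xFFFF

        def meet_in_the_middle(pt, ct):
            """Yield all (k1, k2) with toy_encrypt(toy_encrypt(pt, k1), k2) == ct."""
            middle = defaultdict(list)
            for k1 in range(2**16):                  # encrypt forward from the plaintext
                middle[toy_encrypt(pt, k1)].append(k1)
            for k2 in range(2**16):                  # decrypt backward from the ciphertext
                for k1 in middle.get(toy_decrypt(ct, k2), ()):
                    yield k1, k2

        k1, k2, pt = 0x1234, 0xBEEF, 0x4242
        ct = toy_encrypt(toy_encrypt(pt, k1), k2)
        print((k1, k2) in set(meet_in_the_middle(pt, ct)))   # True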
  • ECC is not strictly faster than RSA. An exact comparison is rather difficult, as the proponents of each cryptosystem can adjust the parameters to make their system appear faster (i.e. low or high encryption exponents, choice of basis, GF(p) or GF(2^m), optimization efforts). In most cases, RSA is faster than ECC, especially with a reasonable (2^16+1) public exponent during encryption. ECC is also NOT patent free. Certicom and others have patents on specific implementations and types of the general elliptic curve algorithm that can be significantly faster and more memory friendly than the classic elliptic curve defined over a large prime.
  • [quasi-flame, but probably deserved]Are you on crack? I'm sorry, but I have no better way to say this. Are you crazy?

    "I thought that since people basically brute-force through X numbers and run fast-fourier transforms and what-not on them we thought it would be nice if you could compress the keyspace to a pitifully small number. a 1000-bit keyspace now only has maybe 16 bits of keyspace."

    Please spare us your thoughts and actually learn something about what you are blathering about. To find large prime numbers, most people use a variant of the Miller-Rabin SPPT, which is efficient. To enumerate all the primes in a certain interval, we use either an Atkin sieve or an appropriately blocked standard sieve. To determine the next prime larger than a certain number, we use standard sieving or delta-big delta sieving. To decompose a composite into primes, we use either the ECM (for small factors) or one of the sieve-based factoring methods.
    And how precisely do you plan on compressing a 1000-bit keyspace into a 16-bit keyspace? Ever heard of the pigeonhole principle? Do you even know how many primes are in that interval? The distribution of prime numbers in a specific interval is regular in a very specific and formal way that effectively means you can treat them as randomly distributed (yes, I know about the RH, I don't care). And how would a 10000-bit keyspace be represented as 20 bits of keyspace? Please tell me "how primes work"? I would really like to know and collect my Fields medal.

    Finally, if you by some miracle got your GA working (and please don't get me started on this particular bit of idiocy; smoothness arguments about the primes and their spacing would be a waste of space), how would this render symmetric algorithms obsolete? How would this magically crack DES, IDEA, or [insert block cipher here]? How would this magically crack RC4, SEAL, or [insert stream cipher here]? How would this magically crack ECC, the modified Chor-Rivest knapsack scheme, NTRU, or [insert other PK algorithm here]?

    It won't, because you're talking out of your ass here.[/quasi-flame]
  • Predict primes?!?! What does that mean? Enumerating all the primes up to an arbitrary integer n can be done very efficiently using an Atkin sieve in minimal memory; it's usually faster to generate them than to store all the primes under 2^32. Deciding whether an arbitrary integer is prime or composite can also be done in polynomial time (via the Adleman-Huang algorithm, or in probabilistic polynomial time via the ECPPA). Simple enumerability arguments argue against the possibility that an algorithm could do this in linear time. Decomposing a composite into primes through the best known methods takes subexponential time using a *very* sophisticated algorithm. Counting the number of points on an elliptic curve also takes polynomial time using another very involved algorithm (Atkin-Elkies-Schoof). The best known technique for generating collisions on an elliptic curve takes fully exponential time (rho sieving w/ distinguished points).

    So please, tell me, what does "linear solving" entail? What does it mean to "predict primes in linear time"? What does this "deviation" refer to? How does this tree "fully 'bloom'"?
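
    To make the "cheaper to regenerate than to store" point concrete, even the plain old sieve of Eratosthenes (never mind the fancier sieves mentioned above) enumerates every prime below a million almost instantly:

        def primes_up_to(n):
            """Sieve of Eratosthenes: all primes <= n."""
            if n < 2:
                return []
            sieve = bytearray([1]) * (n + 1)
            sieve[0] = sieve[1] = 0
            for p in range(2, int(n ** 0.5) + 1):
                if sieve[p]:
                    # knock out every multiple of p starting at p*p
                    sieve[p * p:n + 1:p] = bytearray(len(range(p * p, n + 1, p)))
            return [i for i, flag in enumerate(sieve) if flag]

        print(len(primes_up_to(10**6)))   # 78498 primes below one million
        print(primes_up_to(30))           # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]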
  • An ECC derived primitive is used to exchange symmetric encryption keys. Not that one would even bother trying to break the PKC to attack the system; just reverse engineer the hardware and read the key off from that. Definitely not trivial, but much easier than any other way.
  • [Spam]
    Have I got a deal for you. For only $19.99 (Plus $42.36 S&H) you could be the owner of a hand-made, titanium-polymer, mechanical pencil. No more broken pencils moments before cracking a code. This pencil has a full, money-back guarantee (Minus shipping and handling both ways). And if you order in the next 349 seconds (324 if you read slow), I'll toss in a free piece of 0.7mm lead.
    [/Spam]

    Sorry, I know it's slightly OT but I couldn't resist this one.

    kwsNI

  • From an exhaustive research effort that included actually reading the first two paragraphs of the Certicom web page, I will quote:
    The first of its kind, the ECC Challenge has been developed to increase the industry's understanding and appreciation for the difficulty of the elliptic curve discrete logarithm problem, and to encourage and stimulate further research in the security analysis of elliptic curve cryptosystems. We believe that the knowledge and experience gained from this Challenge will help confirm comparisons of the security levels of systems such as ECC, RSA and DSA that have been based primarily on theoretical considerations. We also hope it will provide additional information to users of elliptic curve public-key cryptosystems in terms of selecting suitable key lengths for a desired level of security.
    There, now you know why too.

  • Does it come in 0.3mm? I write very small and I find I can put 60% more numbers on a single page before I have to break my concentration to get another page.
  • by Anonymous Commando ( 6326 ) on Tuesday April 04, 2000 @12:02PM (#1151525)

    Team Apache cracks the code and the Apache Software Foundation gets the cash...

    Apache (or some variant thereof) runs on over 50% of Internet-connected web servers...

    When was the last time you took a peek in the Apache source code to make sure they weren't sneaking some sort of distributed processing client into the web server? Very suspicious indeed... maybe that's why my server's processor usage has been higher than normal! :=]

  • by jabber ( 13196 ) on Tuesday April 04, 2000 @10:19AM (#1151526) Homepage
    The decrypted plaintext message contained copyrighted intellectual property.

    All participants are officially under arrest.

    Please proceed to your nearest police station to turn yourselves in, and nobody gets hurt.
  • by ChadN ( 21033 ) on Tuesday April 04, 2000 @11:06AM (#1151527)
    Actually, this just validates ECC, as it was fully expected that the 108-bit challenge was feasible (ie. could be broken in reasonable time with massive computing power). The linked site seems to indicate it took about 5 months (searching about 76% of the problem space), which is roughly what the challenge issuers anticipated.

    The 97-bit challenge was broken (by the same project) in a couple of months, also in line w/ prediction.

    The next challenge is 131 bits, which should take a LOT more time and power to break. The challengers can use these results to justify their claim that the 163+ bit challenges are infeasible.

    FWIW, people in the field believe a 160-bit ECC problem to be roughly equivalent to 1024 bits of RSA, which is currently WAY beyond what can be feasibly attacked (barring advances in factoring technology).

    ECC has a bright future and these challenges only serve to emphasize that point.

    See the link on the Certicom ECC challenge [certicom.com] for more info.
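
    For a sense of scale, if parallelized Pollard rho work grows with the square root of the group order and ECC2K-108 took about five months at this effort's rate, the remaining challenges scale roughly as follows (ignoring the Koblitz-curve speedup and any growth in available hardware):

        MONTHS_FOR_108 = 5.0

        def months_for(bits):
            # sqrt scaling: each extra 2 bits of curve order doubles the work
            return MONTHS_FOR_108 * 2 ** ((bits - 108) / 2)

        for bits in (109, 131, 163):
            print(bits, "bits: about", round(months_for(bits)), "months at the same rate")
        # 109 -> ~7 months; 131 -> ~14,500 months (~1,200 years); 163 -> tens of millions of years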
  • by rjh ( 40933 ) <rjh@sixdemonbag.org> on Tuesday April 04, 2000 @01:56PM (#1151528)
    Here's what I'd do in order:

    1. Try and display it as a .JPG. Wow, guess what, it's not a .JPG... must be something else.

    2. Look at the first few hundred bytes of the file. Almost every distinctive filetype out there today has distinctive header information. Standard compression libraries (PKZip, zlib, etc.) have very well-known headers.

    3. Try to unzip the BAYWATCH.JPG file. Hey, it's asking for a password. Damn.

    4. Leaf through my notebooks to find PKZip's encryption algorithm (it's trivially weak, but I don't remember its implementation off the top of my head).

    5. Ah ha! I've found the cipher that's used in PKZip. Now I've just got to write a Perl script that'll break the cipher--it is a trivially weak cipher, after all.

    6. Presto. Now I've got the original file back, or so I think. I take a look at it with a hex editor. Hey, this is ... another ZIP file! Well, damn. Smack it around with the Perl script once more.

    7. Check the file again. I've suddenly got the secret recipe for the $250 Neiman-Marcus cookies. I hand over the fruits of my labor to Boris and Natasha in the hopes that they will use these cookies to lure the moose and squirrel to their doom.

    ... Looking through the chain, you've got some file (FOO.BAR) which you compressed using PKZip (FOO.ZIP). You ran it through uuencode and uudecode (net result, no change), leaving it as FOO.ZIP. Then you just compressed-it-with-password again and renamed it BAYWATCH.JPG.

    Total time for me to break it--six hours. That includes time to write the Perl script to break PKZip's encryption (provided I don't already have one somewhere), time to debug the Perl script, and enough time to order a pizza and have a decent lunch while I work.

    If I've already got a Perl script to rip PKZip's encryption, then figure it'd take me about fifteen minutes.

    What I want to say is: if you are smart enough to leave no traces (or better yet, lure the crackers to a wrong trail), could even a 'mediocre' encryption algorithm (or combos thereof) be 'secure enough'?

    ... I hope this post answers your question. The short answer is "no". The long answer is "usually, no".
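
    Step 2 of the above, mechanized: sniff a file's real type from its magic bytes instead of trusting the extension. Only a handful of signatures are shown, and BAYWATCH.JPG is of course the hypothetical file from the parent post.

        MAGIC = {
            b"PK\x03\x04": "ZIP archive (PKZip)",
            b"\xff\xd8\xff": "JPEG image",
            b"GIF87a": "GIF image",
            b"GIF89a": "GIF image",
            b"\x1f\x8b": "gzip data",
            b"begin ": "uuencoded data",
        }

        def sniff(path):
            """Guess a file's type from its first few bytes."""
            with open(path, "rb") as f:
                head = f.read(16)
            for signature, name in MAGIC.items():
                if head.startswith(signature):
                    return name
            return "unknown (no recognised header)"

        print(sniff("BAYWATCH.JPG"))   # "ZIP archive (PKZip)" if it is really a renamed ZIP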
  • by Chilles ( 79797 ) on Tuesday April 04, 2000 @11:18AM (#1151529)
    Not to belittle the sheer amount of computing power that was applied here, or the effort that went into creating a client for said effort, but I still object to calling this a "breaking" of an algorithm. Sure, they've found the key, but change the key and the same amount of computing power will be needed to find that one - so what have we learned here? What was so great?

    This is about as cool as a guy who claims he can open every safe and when asked to demonstrate sits down in front of it and starts turning the wheel until he gets the combination by accident. Everybody knows you can do it that way.

    Now if the attack had taken just a few sheets of paper, a pencil and a calculator, then I'd be impressed.
  • by Logicon ( 87902 ) on Tuesday April 04, 2000 @10:04AM (#1151530)
    I was about to break it, but then my pencil broke.
  • by (void*) ( 113680 ) on Tuesday April 04, 2000 @11:03AM (#1151531)
    Pardon me for being clueless, but what is the point of this cracking effort?

    So one particular message, encrypted with one particular pair of keys, was cracked after running a brute force cracker on lots and lots of machines. I don't get it, what does this prove? Does it not in fact demonstrate how hard it is to crack ECC? In fact, this does not prove anything. Maybe there are specific key-pairs which are particularly easy to brute force?

    Can someone please enlighten me?

  • by burris ( 122191 ) on Tuesday April 04, 2000 @12:54PM (#1151532)
    This was not a brute-force (formally known as "complete trial and error") attack. Brute force means trying every possible key until you find the right one. The attack in question is much faster than complete trial and error. Every asymmetric cipher has an attack faster than complete trial and error by virtue of the fact that there is a "public" component that can be analyzed by the attacker. That's why symmetric ciphers only need keys that are 128 bits long while asymmetric ciphers need much longer ones.

    Burris
  • by jonathanclark ( 29656 ) on Tuesday April 04, 2000 @11:27AM (#1151533) Homepage
    For those of you unfamiliar with elliptical encryption I recommend this book. [amazon.com] EE is an asymmetric algorithm in the same way RSA is. This "crack" is significant because it shows the relative strength between RSA and EE. 512-bit RSA can be cracked [sunday-times.co.uk] in about 12 microseconds. Other nice properties of EE algorithms:

    - patent free (RSA expires this year!)
    - faster than RSA
    - can be implemented easily using 8/16bit microcode (ideal for smartcards)

    Bruce likes to claim cracking contests have no value, but I disagree. EEs haven't been studied as much as RSA, so contests like this are important for showing the algorithm's strength as implemented in the real world - and, more importantly, for generating interest in the research community.
  • by rjh ( 40933 ) <rjh@sixdemonbag.org> on Tuesday April 04, 2000 @01:36PM (#1151534)
    Although I am a professional InfoSec consultant in real life, my opinions expressed here are not to be interpreted as my professional opinion. (There. Now that the legal disclaimers are done...)

    1. ECC is not patent free. Several companies are engaged in patent war over ECC (Certicom being the number one). The "nice" curves have already been patented (mathematicians in the audience will crucify me for describing some curves as "nice", but it's a reasonably accurate layman description--some curves make crypto easier than others, hence they're "nice").

    2. ECC is not faster than RSA. RSA is not faster than ECC. Nor are they equal in speed. While this all sounds terribly contradictory, it's all true; as we all know from having complained about NT-versus-Linux benchmarks, whoever is paying the analysis firm gets the results they want. When Certicom pays for ECC-versus-RSA, it always turns out that ECC is faster. When RSADSI pays for it, it always turns out that RSA is faster.

    Even assuming that ECC were unambiguously faster than RSA, it wouldn't make a tinker's dam of difference. The applications which use asymmetric cryptography extensively are few and very far between. Symmetric ciphers have a better foundation in number theory, are more thoroughly cryptanalyzed and are often faster. Most of the time when asymmetric crypto is used, it's only used to negotiate a symmetric key. If it takes RSA a millisecond to encrypt/decrypt a 256-bit Twofish key, what do I care if it takes ECC a microsecond to do the same task?

    3. RSA smartcards have existed for years. Check out the iButton for an example of how asymmetric cryptography can be used in smartwear (jewelry, buttons, etc).

    Insofar as Schneier's distaste for cracking contests, I've got to say I'm in the same boat. Running a cracking contest doesn't prove anything. It doesn't prove it's secure, brute-force cracking rarely betrays insecurities, and what it's best at is media hype. PHBs the world over look at cracking contests and say "Wow, ECC stood up really well to that distributed attack, I guess it's safe for us to use!", without even once bothering to learn what the theory behind ECC is.

    Schneier himself uses contests, so it's disingenuous to suggest he claims contests have no value. Schneier's Blowfish contest offered a cash award to the best cryptanalytic results against Blowfish, regardless of whether or not those results led to a break. That seems to me to be a more healthy way to run contests, although I'm still not entirely sold on the idea.
  • by rjh ( 40933 ) <rjh@sixdemonbag.org> on Tuesday April 04, 2000 @01:03PM (#1151535)
    First, I am a professional information security consultant. Second, no, this is not professional advice; do not rely on it without first verifying.

    However, unlike DES, there is no known mathematical loophole

    Wrong answer, thank you for playing. DES is one of the most, if not the most, thoroughly analyzed ciphers of all time. So far, the best way to break DES is by a brute force attack. There are some attacks against it which some people use as proof that the NSA put a backdoor in it, but these attacks are extremely esoteric -- for instance, the key complementation property means you only have to test half the possible keys; this reduces the difficulty of some attacks (chosen-plaintext attacks, specifically, although I think Eli Biham has a known-plaintext version) by a factor of 2 -- meaning the keyspace is only of size 2**55, not 2**56.

    The rules for using DES are simple. Don't use weak keys; don't use complementary keys; use it in DESede (aka TripleDES) mode. The resulting ciphersystem is as close to unbreakable as you're likely to ever get. If your system is eventually broken, you can be reasonably certain that the cipher was not the subsystem which suffered the breach.

    I trust DESede more than I trust Blowfish, more than I trust IDEA, more than any other symmetric ciphersystem out there.

    Interestingly enough, so does Schneier. A few months back at a crypto conference someone in the gallery asked him what the strongest cipher today was. As I recall, his words were "Triple DES. There is no question."
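
    For anyone who wants to see the key-complementation property mentioned above with their own eyes, this checks DES(~K, ~P) == ~DES(K, P) - the identity that halves the work of a chosen-plaintext brute-force search. It assumes the PyCryptodome package is installed; any correct DES implementation will show the same thing.

        from Crypto.Cipher import DES

        def complement(data):
            # flip every bit of a byte string
            return bytes(b ^ 0xFF for b in data)

        key = bytes.fromhex("133457799bbcdff1")   # the classic textbook DES key
        pt  = b"8 bytes!"                         # exactly one 64-bit block

        ct = DES.new(key, DES.MODE_ECB).encrypt(pt)
        ct_from_complements = DES.new(complement(key), DES.MODE_ECB).encrypt(complement(pt))

        print(ct_from_complements == complement(ct))   # True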
  • by Noer ( 85363 ) on Tuesday April 04, 2000 @10:43AM (#1151536)
    Don't misunderstand what this means. The ECC algorithm was not cracked; an encrypted message was cracked after a ridiculously large amount of computing power was applied to it. Perhaps this means larger key sizes are needed, or smaller windows of using the same key. However, unlike DES, there is no known mathematical loophole; the algorithm has not been shown to be insecure. If there is a loophole, then increasing key size doesn't help; the algorithm is flawed. But in this case, all that's needed is larger key sizes. Arbitrarily large keys allow for encryption that can't be cracked with all the computing horsepower on the planet within the age of the universe.

    I'd be more interested in real cryptographic algorithm analysis of the algorithm, but that is not by any means my forte.
