
Probable Solution Found for ECC2-109 Challenge

kpearson writes "The eCompute ECC2-109 distributed computing project discovered a probable solution to Certicom's ECC2-109 challenge today. The challenge was to defeat a 109-bit Elliptic Curve Cryptosystem (ECC). Since the eCompute ECC2-109 project began on November 8, 2002, 1,981 volunteers have run the project's software and found almost 40.5 million distinguished points. From those points the project found two which matched and caused a collision, enabling the project to find a solution to the ECC. The solution was submitted to Certicom this morning for verification."
  • by after ( 669640 ) on Thursday April 08, 2004 @10:24PM (#8811519) Journal
    At least for me, I had never heard of this ECC2-109 challenge before. The summary clears it up, though.
    Project Overview


    This is a distributed computing project to solve the Certicom ECC2-109 challenge. In general it can be described as a whole lot of computers from all over the world, all of them working together in an attempt to find a collision. Each of the computers runs a program that computes DP (Distinguished Point) values. Each time a computer finds a DP, it is uploaded to a main server. The main server then checks to see if anyone has uploaded a matching DP (this is known as a collision). If a collision happens (two matching DP values exist), then the Certicom challenge is solved!
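
    The workflow described above can be sketched in miniature. Everything here is illustrative - a toy integer space and a hash-based walk stand in for the elliptic-curve iteration the real clients perform - but the shape is the same: walk until you hit a distinguished point, report it, and let the server watch for a repeat.

```python
# Toy sketch of the distinguished-point (DP) workflow. All parameters
# are illustrative; the real project walks over elliptic-curve points.
import hashlib

SPACE = 2 ** 20      # size of the toy search space (illustrative)
DP_MASK = 0xFF       # a point is "distinguished" if its low 8 bits are zero

def step(x):
    """One step of a fixed pseudorandom walk over the space."""
    h = hashlib.sha256(x.to_bytes(8, "big")).digest()
    return int.from_bytes(h[:8], "big") % SPACE

def walk_to_dp(start, max_steps=100_000):
    """Follow the walk from `start` until it reaches a distinguished point."""
    x = start
    for _ in range(max_steps):
        if x & DP_MASK == 0:
            return x         # a DP: this is what a client would upload
        x = step(x)
    return None              # walk trapped in a DP-free cycle; give up

server = {}                  # the "main server": DP -> start point that found it
collision = None
for start in range(1, 5000): # each iteration stands in for one volunteer client
    dp = walk_to_dp(start)
    if dp is None:
        continue
    if dp in server:
        collision = (server[dp], start)   # two different walks, same DP
        break
    server[dp] = start
```

    Because the walk function is fixed, two walks that ever land on the same value merge and report the same DP, which is what makes the collision search parallelize so well across volunteers.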


    Yeah... but what practical use does this have? Was this done _just_ because it could have been done?
    • by mphase ( 644838 ) on Thursday April 08, 2004 @10:30PM (#8811581) Homepage
      Prize money. About $10,000, I believe.
      • But does this have any use? Prize money for what? For being lucky enough to get some random number twice? I didn't read the full description, so I don't really know if there is real use for this, but from what the Introduction tells me: "We are getting a ton of computers to generate numbers, and if two computers generate the same number, then we win." Hmm, huh? I still don't see what the point of this is? Does this advance some sort of research? Does this support some other principle or theory?
        • It proves that it is possible with commodity hardware (and a lot of time) to break ciphers that are regarded as pretty strong.

          This of course is nothing compared to what one can imagine national agencies have at their disposal. If a gang of internet users can break a cipher (brute-forcing it) using spare CPU cycles, imagine what a dedicated cluster of high-end computers using an algorithm more efficient than brute force could do.
          • by LostCluster ( 625375 ) * on Thursday April 08, 2004 @10:56PM (#8811770)
            I'm not sure the government has more efficient algorithms for most of the commercially-used encryption schemes... I seriously doubt the government has encryption-busters publishing classified theory discoveries that we don't know about.

            However, they most certainly have the resources to create a reduced instruction set processor dedicated to breaking an instance of a popular encryption scheme, and the ability to buy lots of such to build a supercomputer.
            • But usually a good way of breaking a cipher is having a crib, i.e. some hint as to what the plaintext contains.
              OTOH, government-style cipher-cracking doesn't necessarily have all the information about which cipher was used and so on that most of these crypto challenges provide.

            • by bluGill ( 862 ) on Thursday April 08, 2004 @11:11PM (#8811867)

              The NSA is the largest hirer of math majors. The NSA is mostly interested in encryption. A custom computer is not nearly as good as an algorithm breakthrough, though the computer is obviously creatable (or not creatable depending on the requirements). We know the NSA is doing algorithm research, but because they are not talking we don't know what or how much.

              Back in the '70s IBM developed DES, and it was proposed that it become the national standard for encryption. In the process the NSA got involved and told IBM to make some changes; IBM did, but didn't understand why. About 10 years later differential cryptanalysis was developed, and it turned out that the changes the NSA requested made DES much more secure than the original design, precisely because of resistance to differential cryptanalysis.

              Many researchers believe that with recent private interest in cryptography the gap has been closed. However, we do not know that. In any case, the NSA knows everything universities know, but they do not contribute back.

              • mod up

                very true about code vs hardware. log(n) is so much faster than n - x (x being some hardware improvement).

                Yeah, if I had to guess, the government is at least a couple of years ahead of civilian technology. Makes sense given the resources they have.
              • by dj245 ( 732906 ) on Friday April 09, 2004 @12:25AM (#8812327) Homepage
                The NSA is the largest hirer of math majors.

                Gee and I thought the cell phone companies hired the most math majors.

                Seriously, you have to have taken Analytical Calculus I-III, Differential Equations, Geophysical Fluid Dynamics, and be of PhD standing in order to understand the rate plans. Think of the guy who must have made them up!

              • The NSA is the largest hirer of math majors.

                Even if true, they still hire only a small, small fraction of all math and computer science PhDs.

                Worldwide non-NSA talent vastly outweighs NSA talent, I would wager dollars to donuts.
                • I would doubt it. If you're willing to spend a few million yearly keeping top tier researchers on your payroll, it's not all that hard.

                  Though I doubt the gap between the NSA and other crypto teams is huge, I am positive the difference between the crypto teams and PGP is a joke.

                  In fact I wouldn't be surprised if they can crack PGP in real time.
                  • I would doubt it. If you're willing to spend a few million yearly keeping top tier researchers on your payroll, it's not all that hard.

                    Yes, it is. First, you radically underestimate the quality of researchers doing crypto in academia -- if you look at the people who were doing top-rate crypto work in graduate school and publishing at the best conferences, virtually all of them have taken academic positions after graduation. Second, a sizable fraction of those researchers are not US citizens. The rest of
                    • Right, but Israel has their own NSA.

                      I'm saying the two crypto teams are at the same level.

                      And we are quite a bit behind them.
                    • Who is "we"? Civilian cryptographers?

                      If so, I repeat: if you look at the people who were doing top-rate crypto work in graduate school and publishing at the best conferences, virtually all of them have taken academic positions after graduation. When you take a reasonable look at these things, the far-and-away most likely conclusion is that there is vastly more talent doing civilian crypto than classified crypto. The only way to think otherwise is to believe that tons of uber-geniuses are somehow found a
                    • The only way to think otherwise is to believe that tons of uber-geniuses are somehow found and snatched up before they ever publish any good work. Pretty unlikely, if you ask me.

                      We seemed to do a pretty good job of it in WWII, I see no reason to expect things are different now.
                    • >> The only way to think otherwise is to believe that tons of uber-geniuses are somehow found and snatched up before they ever publish any good work. Pretty unlikely, if you ask me.

                      > We seemed to do a pretty good job of it in WWII, I see no reason to expect things are different now.

                      If you're referring to the Manhattan project, the government just went out and hired everyone who had ever done any research work in nuclear physics, regardless of how good they were or how much they'd published. Sinc
                    • And the Enigma project, and the German counterpart. Along with several projects we probably don't even know about.
                    • While I do not believe this is the case, allow me to espouse a crackpot theory.

                      At public universities, the government does have a say in what papers get published based on research done at a governmentally funded institution. So, maybe the great theorists submitted their paper for publication and got picked up by the powers that be. ...

                      Just kidding.
                • Worldwide non-NSA talent vastly outweighs NSA talent, I would wager dollars to donuts.
                  Which is where the CIA fits in, helping to "level" the playing field, getting rid of the competition.
                • I'd be willing to go out on a limb and say that financial corporations and games developers are hiring the majority of math grads these days. Financial institutions need lots of quants and hardcore math guys to get their funds and investments as profitable as possible. Games companies need physics and math grads for the environmental physics in the latest games.
              • Nice story, but not true. According to Coppersmith, the DES design team discovered differential cryptanalysis during the design process of DES, and defended against it in the design of the S-boxes. They called it the T-attack, for "tickle". The NSA said they'd discovered some of their darker secrets, and asked them to keep quiet about it. The result was that everyone could see there was structure in the S-boxes, but no-one knew why, until Biham and Shamir re-invented DC in the late eighties and broke ju
            • I seriously doubt the government has encryption-busters publishing classified theory discoveries that we don't know about.

              Naaa... it's not as if the government hires some huge number of mathematicians [wikipedia.org] or anything.
            • Last time I checked, the Brits had an implementation of RSA long before R, S, and A did; it just happened to be classified. Polish mathematicians broke Enigma in what, '30, '31? Didn't help them much, but their techniques trained the first generation of computer cryptographers (Turing included). There was no point in having the listening/intercept nets that the US, England, and the former USSR maintained during the cold war, and that China and the US have today, if all you get to listen to was essentially white n
          • It proves that it is possible with commodity hardware (and a lot of time) to break ciphers that are regarded as pretty strong.

            You don't need any hardware to prove this. Every cipher in the world can be cracked given enough time.

            The real question is how long it would take if it weren't known what cipher and key length had been used.

      • About $5 per person involved. Now they can pay for all that electricity they used.
    • "Was this done _just_ because it could have been done?"

      Reading around it seems that the idea is to prove that the encryption method is good rather than just theoretically sound. Probably makes it easier to sell stuff based on ECC if you can show how hard it is to crack.
      • Reading around it seems that the idea is to prove that the encryption method is good rather than just theoretically sound. Probably makes it easier to sell stuff based on ECC if you can show how hard it is to crack.

        Wouldn't it just mean that the test program they were using to crack it just didn't happen to exploit any weaknesses that might exist? The uniqueness of the point is one thing but choosing it is another.
    • Like the RC5 challenges [and DES before it] just to say "yes, yes you can do this".

      So when someone says "64-bits of security ought to be enough" you can say "no, no it isn't." ;-)

      Though yeah, if they do more challenges it's just getting futile.

      Tom
      • I think it's enough to say the theory is proven... no number of bits of security is ever going to be "enough" to be uncrackable. However, a LARGE number of processor-time units on even the fastest processors available is needed to crack strong encryption.

        I think the only remaining point of the challenges is just to prove that the ability to legally acquire $10,000 and the point of saying it's been done is enough to motivate people to contribute their processor time to such a project. We should have stats p
        • I think it's enough to say the theory is proven... no number of bits of security is ever going to be "enough" to be uncrackable

          Pet peeve: You can't prove something is always true by providing anecdotal evidence - but you can disprove it.

          Anyway, to the main point: you live in a universe with a finite number of particles and a finite TTL (physics majors are welcome to argue if they want). Or let's put it this way: The total amount of resources available to humanity is limited. Given that, there are problems that, although possible to solve in theory, are impossible in practice - take, for example, the problem of playing perfect chess by fully computing the game tree.
          • Funny thing, I just had an exam this morning where elliptic curve encryption was one of the topics. Here's a little tidbit from the course notes about brute forcing these things that some may find interesting, as it compares to RSA:

            An ECC key size of 106 bits should take 10^4 MIP years to compute, corresponding to a RSA key size of 512 bits.
            An ECC key size of 160 bits should take 10^12 MIP years to compute, corresponding to a RSA key size of 1024 bits.
            An ECC key size of 211 bits should take 10^20 MIP years
          • > finite TTL

            An excellent point: if you can't crack the code within the TTL, there is no point in trying.

            > Given that, there are problems that, although possible to solve in theory, are impossible in practice - take, for example, the problem of playing perfect chess by fully computing the game tree.

            With enough bits encryption becomes unbruteforceable too.

            The problem with that logic is that it assumes a brute force attack -- that a chess program must compute all branches of the game tree or that a cracker mus
        • Um to some extent. But recall that the attacks on ECC, symmetric ciphers like DES and RC5 and RSA are all unique.

          Against ECC it was the Pollard rho cycle-finding algorithm. DES/RC5 was brute force, and RSA has been a combo of QS and GNFS.

          Tom
    • Was this done _just_ because it could have been done?

      This was application of something that could have just been left alone as a probability.
      It's one thing to have it on 'paper', it's another to know it will actually work (or not) as predicted.

      What's additionally fascinating is that the post-doc researcher called it "the most difficult elliptic curve discrete logarithm problem ever computed" and now plans to take the 10000 USD prize and give $8K to the Open Software Foundation and a grand each to two i
    • Does everything have to have a practical use? Couldn't those who participated just wanted the challenge, to see if they could do it? Does everything you do have a practical use?
    • I could make two computers have a collision easily. I duct-tape them to two separate office chairs and run them into each other.

      Now give me my $10k
    • From one of the links above I got to:

      http://www.certicom.com/index.php?action=company,press_archive&view=121

      It appears the 109-bit challenge was already solved in November 2002 and one would expect them to be working on the 131-bit challenge now. Am I missing something here?
  • Yeah, this article is definitely a bit confusing, but it is still interesting. I'd like to hear what someone who participated has to say about it.
  • ...most widely-used asymmetric cryptosystems these days are based on the hardness of factorising large numbers as opposed to finding the factors of elliptic curves. So...what purpose does this serve? Or does it just act as a useful benchmark?
    • by cyb97 ( 520582 ) * <cyb97@noxtension.com> on Thursday April 08, 2004 @10:41PM (#8811666) Homepage Journal
      Elliptic Curve Cryptography is a much younger branch of mathematics and cryptography than plain old factor-based ciphers like RSA and friends.

      That's probably why it isn't as well known and well deployed as factor-based encryption. The number of implementations is also much smaller.
      Be aware, however, that the NSA chose ECC as (one of) the way(s) to go for the future not long ago (NSA Turns to Commercial Software for Crypto [slashdot.org])...
    • ...most widely-used asymmetric cryptosystems these days are based on the hardness of factorising large numbers...

      Don't you mean, factoring large PRIME [encyclopedia4u.com] numbers? :)
      • not necessarily ;-) but it sure makes it more secure. In RSA the two factors only really have to be relatively prime, but having non-prime factors makes the cipher much weaker ;-).
        • by LostCluster ( 625375 ) * on Thursday April 08, 2004 @11:06PM (#8811843)
          having non-prime factors makes the cipher much weaker

          Actually, using a non-prime makes your encryption bullet-proof against any check against the domain of all known prime numbers...

          Therefore, the much larger domain of near-primes also needs to be considered. After all, you can get a known near-prime just by multiplying two known primes by each other; the only factors will be 1, itself, and the two primes. To check all of those, it'll take quite the load of processor time too.

          And that leads to a guessing situation as to which subset of the possible numbers to check first. If the "blue team" checks all the known prime numbers in a situation where the "red team" has selected a near-prime... the near-prime will take longer to solve than had any known prime been selected. If the "blue team" checks the near-primes with the same priority as the primes while the "red team" has selected a prime number, then the possibility of the near-primes has led to a time-costing distraction from the real solution.
          • ummm. for, say, 1024bit keys, the number of all primes (see prime number theorem [wikipedia.org]) is more than the number of all hydrogen atoms in the observable universe (again checking wikipedia would help) by orders of magnitude. you can't check all primes...
          • ...do not take cryptology advice on Slashdot. That goes for the rest of this post, too.

            It's extremely easy to determine if a number is prime or composite (as long as it isn't pseudoprime, but that's something completely different from near-prime) using Fermat.

            Normally you have n = p*q, p and q prime
            If you have n = p*c, c composite, what you really have is n = p*q1*q2(*q3...), which makes each factor a lot smaller.

            The odds of finding a prime dividing n are now a *lot* greater. The algorithms typically make n
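
            A minimal sketch of the Fermat test just mentioned (Carmichael numbers fool it, which is why production code uses Miller-Rabin or similar instead):

```python
# Fermat primality test: composites almost always reveal themselves
# quickly, which is the parent's point. Sketch only -- Carmichael
# numbers pass this test despite being composite.
import random

def fermat_is_probable_prime(n, rounds=20):
    """Return True if n passes `rounds` Fermat checks, False otherwise."""
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:    # a is a Fermat witness: n is composite
            return False
    return True
```

            For example, fermat_is_probable_prime(2**61 - 1) comes back True almost instantly even though the number has 19 digits, because modular exponentiation is cheap.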
          • Umm, I would like to draw your attention to the fundamental theorem of arithmetic, which kinda shoots great big holes in your argument. Any non-prime will fall more quickly than a prime in any reasonable primality checking algorithm.

            RSA is in principle very simple. Take two big prime numbers, multiply them together, and you get a bigger number. Finding out what those factors are is *hard*. A brute force check is not possible (relatively speaking: it's possible, but expensive, or requires writing a vir

    • by Anonymous Coward
      It seems that you're referring to the RSA crypto system. If so, you should realise that using RSA for extended lengths of communication is almost prohibitively (computationally) expensive. Elliptic curve crypto systems, on the other hand, have much lower computational costs. Moreover, experts believe that they provide at least as much security, if not more, as the RSA scheme. This is one of the primary reasons ECC is the hot thing these days (just check eprint.iacr.org)
      -
      Orca
    • by Coryoth ( 254751 ) on Thursday April 08, 2004 @10:56PM (#8811774) Homepage Journal
      The most widely used asymmetric system is RSA, which is indeed based on factoring (or calculating the Euler Phi function - it amounts to the same thing).

      Next on the list is Diffie-Hellman, which is just a key exchange algorithm (you can't encrypt with it; it simply allows both parties to communicate in public to agree on a private session key). RSA is slow enough that this is mostly all RSA gets used for anyway (agreeing on a symmetric session key). Diffie-Hellman is based on the difficulty of the discrete logarithm problem. That is, given a large prime p and numbers a, y, find x such that a^x mod p = y.

      If you want to do encryption with a Diffie-Hellman-like system, you can, and that system is known as El-Gamal. It works very similarly, and is based on the same problem (the Discrete Log Problem).
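
      The key exchange just described can be sketched with toy numbers. The parameters below are made up for illustration; real deployments use far larger primes:

```python
# Toy Diffie-Hellman exchange over Z_p. p and g are illustrative only.
import secrets

p, g = 2087, 5                       # small prime and base (made up)

a = secrets.randbelow(p - 2) + 1     # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1     # Bob's secret exponent

A = pow(g, a, p)                     # Alice publishes A = g^a mod p
B = pow(g, b, p)                     # Bob publishes B = g^b mod p

# Each side combines its own secret with the other's public value;
# both arrive at g^(ab) mod p without ever transmitting it.
key_alice = pow(B, a, p)
key_bob = pow(A, b, p)
assert key_alice == key_bob
```

      An eavesdropper sees p, g, A and B; recovering the shared key from those is exactly the discrete log (or Diffie-Hellman) problem being discussed here.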

      Elliptic Curve Cryptography is simply Diffie-Hellman or El-Gamal, except that instead of using Z_p as the group in which you do calculations, you use the group formed by the points of an elliptic curve over a given finite field. Mostly that means that multiplication is much more complicated, and the Discrete Log Problem itself becomes much harder (partly due to multiplication being harder, partly due to other properties of the group that it would be tedious and not very illuminating to explain).

      The advantage of Elliptic Curve systems is that because the DLP is much harder on the group used (elliptic curve group), you can use a much smaller key size and still have strong encryption. Note that it was only a 109-bit key that was cracked, and only after years of effort - compare that to the RSA factoring challenge, where much larger key sizes have been cracked.

      You have extra benefits in ECC as well - you get to choose the base field, and the curve itself to determine the group, rather than picking a large prime. As the properties of elliptic curve groups can vary dramatically given a change in field or curve this means if you can choose your curve randomly you get even more security (for very few extra bits - elliptic curves are very complicated objects, but simple to describe).

      What all of that means is that, while current systems are based on factoring (RSA), that system requires larger keys, is less secure and - given recent developments by Biham, Bernstein and the like - is looking potentially surprisingly crackable even at some of the larger key sizes. That is to say, Elliptic Curve Cryptography is very much the future of Asymmetric Cryptosystems. Being able to break this key size gives a decent benchmark of the security of current systems (which don't use randomly chosen curves yet - there are still issues with that).

      That is to say - this is very important, but given the complexity and the effort involved, looks like a good sign for the security of Elliptic Curve Cryptography.

      Jedidiah.
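
      To make the group operation concrete, here is a toy version of the elliptic-curve arithmetic described above, over a small prime field. The curve, prime and base point are made up for illustration and bear no resemblance to cryptographic sizes:

```python
# Toy elliptic-curve group over F_97 (illustrative parameters only).
# Curve: y^2 = x^3 + 2x + 3 mod 97; the points plus a "point at
# infinity" form the group used in ECC, just at a real key size.
p, a, b = 97, 2, 3
O = None                   # point at infinity = group identity

def add(P, Q):
    """Group law: chord-and-tangent point addition."""
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O           # P + (-P) = identity
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def mul(k, P):
    """Scalar multiplication k*P by double-and-add."""
    R = O
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

P = (3, 6)                 # on the curve: 3^3 + 2*3 + 3 = 36 = 6^2 mod 97
```

      The ECC discrete log problem is then: given P and Q = mul(k, P), recover k - trivial at this size, but infeasible at real parameter sizes, which is the whole point of the 109-bit challenge.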
      • The thing I don't get is why people do these competitions. As far as I've seen, people can easily estimate the amount of time necessary to produce a collision in any algorithm (obtaining an actual result seems like a waste of time to me).

        That and they're only finding collisions. Collisions are next to useless unless you want somebody to accidentally download a file with seemingly random bits instead of something they wanted. (That's just one example, but collisions are not very useful a good 99.99999999% o
        • That and they're only finding collisions. Collisions are next to useless unless you want somebody to accidentally download a file with seemingly random bits instead of something they wanted. (That's just one example, but collisions are not very useful a good 99.99999999% of the time).

          That would be hash collisions you are thinking of, this is rather different, given the nature of the system. I would direct you to this [slashdot.org] very well written post, which explains the significance of collisions in ECC. It effect
      • One of my engineering profs who is a cryptography researcher says there is a company in Calgary, Alberta, Canada that specializes in ECC encryption. Anyone know who that might be? I tried googling for it but couldn't find much.
  • by jamonterrell ( 517500 ) on Thursday April 08, 2004 @10:35PM (#8811627)
    I understand finding a collision (two things that when crypted yield the same result) is considered a goldmine in breaking an encryption algorithm.

    How does finding a collision help break the encryption? Does anyone know the technicalities of why this allows you to break an encryption algorithm? To me, who has no clue, this seems just like a coincidence and not very useful, but I'd like to be enlightened.
    • by Anonymous Coward
      Finding a collision breaks the HASH function associated with the elliptic curve.
      --
      AC
      • I was hoping for an explanation. Simply stating that it breaks the HASHing doesn't explain anything. You're just saying the same thing again.

        Anyone else know?
      • ... I can probably answer this myself but I'd rather get some credible experts opinions... failing that, slashdot is right handy....

        umm, can they do a coordinated analysis of their logs at the moment of capture, then work backwards and determine the sequence of what I will term "lucky breaks" that lead to the capture, allowing them to throw away all the blind leads, so that in future attempts they can cut to the chase quicker? All the non-lucky breaks should be easy (that's the real question I don't know)
    • It gives something to stand on, basically. Context can then be taken into account. Most good crypto systems were broken socially, with a person of importance being mentioned and then their name being run through the encrypted messages, or with an I/O machine being captured (i.e. Enigma in WW2).
    • by acidblood ( 247709 ) <decio@@@decpp...net> on Thursday April 08, 2004 @11:01PM (#8811808) Homepage
      The `collision' mentioned here is related to the particular algorithm being used to break ECC, which is called Pollard rho for discrete logarithms.

      Let's work with the integers modulo a prime p -- the algorithm works just the same with elliptic curves. Say you were told that a^b == c mod p (where == means `is congruent to'). You were also given a, c and p, and you need to figure out b. This is the so-called discrete logarithm problem.

      Pollard's rho algorithm solves this problem the following way. Suppose you somehow figure out that a^x c^y == a^w c^z mod p, and of course x != w, y != z (which is the trivial solution). That's the kind of collision they found. Now this yields a solution because, as c == a^b, then a^x c^y == a^x (a^b)^y == a^x a^(by) == a^(x+by), and similarly a^w c^z == a^(w+bz). Thus a^(x+by) == a^(w+bz), so one is left with the very easy task of solving the equation x+by == w+bz modulo the group order, which is p-1 here since we are working with integers modulo p (this is Fermat's Little Theorem). For elliptic curves, it's not so easy (i.e., it may take a couple of hours, maybe days, on a single CPU for a curve of cryptographic interest) to figure out the group order but it's still possible.

      And how is that collision found anyway? That's a bit complicated, but I guess it can be found on the Handbook [uwaterloo.ca]. It has to do with the theory of random functions.
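
      The parent's argument can be checked numerically. The sketch below works in Z_p with made-up toy parameters, finds a collision a^x c^y == a^w c^z by random sampling rather than by an actual rho walk, and then solves x + by == w + bz for the secret exponent:

```python
# Numerical check of the collision argument above, in Z_p (toy sizes).
import random
from math import gcd

p = 1019                     # small prime (illustrative only)
a = 2                        # the base
b_secret = 377               # the discrete log we pretend not to know
c = pow(a, b_secret, p)      # public value: c = a^b mod p
n = p - 1                    # exponent modulus (Fermat's little theorem)

seen = {}                    # value -> exponent pair (x, y) that produced it
while True:
    x, y = random.randrange(n), random.randrange(n)
    v = pow(a, x, p) * pow(c, y, p) % p       # a^x * c^y mod p
    if v in seen:
        w, z = seen[v]
        # need y - z invertible mod n to solve the congruence uniquely
        if (x, y) != (w, z) and gcd((y - z) % n, n) == 1:
            break
    else:
        seen[v] = (x, y)

# Collision a^x c^y == a^w c^z gives x + b*y == w + b*z (mod n): solve for b.
b = (w - x) * pow((y - z) % n, -1, n) % n
assert pow(a, b, p) == c     # b is a valid discrete log of c
```

      Pollard rho finds the collision far more cleverly than random sampling (and the distinguished-point trick is what lets many machines hunt for it in parallel), but the final solving step is exactly this congruence.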
    • Say you take a hash or sum of the latest Fedora ISOs, and it comes out to some value x. If I make an equal-length file that has the same sum x, but is not Fedora, I can crack the Fedora servers, wreak havoc everywhere, and replace the ISOs. If the Fedora people have trouble fixing my other damage, they may be too busy to notice that the ISOs are different, even though they're the same size and sum.

      Now, suppose I make/find a file that not only collides with the ISOs, but also does largely what the ISOs did
    • two things that when crypted yield the same result

      That can't happen. Would you by any chance be thinking of something else? Possibly a hash rather than an encryption. The fact that you can decrypt proves that the encryption cannot produce collisions. That is for any m we know that D(E(m))=m so if E(m1)=E(m2) then it must be the case that D(E(m1))=D(E(m2)) which is the same as m1=m2.
  • by Bytal ( 594494 ) on Thursday April 08, 2004 @10:45PM (#8811696) Homepage
    Basically it's a cryptographic method that allows the same or nearly the same level of security as a regular public-key encryption scheme(based on factoring large numbers) but makes it computationally cheaper to encrypt the data. So while the bad guys still, theoretically, need nearly the same amounts of processing power and time as regular asymmetric crypto to decrypt the message, the good guys save significantly in encryption. This is of course extremely important for, let's say mobile devices with limited processing power.
    • by cyb97 ( 520582 ) * <cyb97@noxtension.com> on Thursday April 08, 2004 @10:48PM (#8811713) Homepage Journal
      Not only in processing power, but also in key length. A typical RSA key would be 512-1024 bits to be considered secure; an equivalent ECC key would be 140-200 bits. This leads to smaller circuits and in turn cheaper implementations.
    • by Coryoth ( 254751 ) on Thursday April 08, 2004 @11:12PM (#8811875) Homepage Journal
      Basically it's a cryptographic method that allows the same or nearly the same level of security as a regular public-key encryption scheme(based on factoring large numbers) but makes it computationally cheaper to encrypt the data.

      Mostly right. ECC is based on the Discrete Log Problem, not factoring. The Discrete Log Problem is basically: given x, y find g such that g^x = y. That's easy for real numbers - you just take a log. The problem becomes rather more difficult in the case where you are working with integers mod some prime - that is, find an integer g, such that g^x mod p = y. That gives you Diffie-Hellman and El-Gamal. ECC is the same problem, but over the group [wolfram.com] of points of an elliptic curve [wolfram.com] over a finite field [wolfram.com]. You can show that this class of groups effectively maximises the difficulty of the Discrete Log Problem, and that's why the key sizes and computational efficiency is so much better.

      Jedidiah
      • Mod parent up, great technical explanation. I should've been more clear in mentioning that the factoring is used in the brute force _breaking_ of the encryption (in RSA for ex.) by factoring the mod portion of the pub key. It looks like I meant that it's used in the actual encryption. My mistake :)
      • You can show that this class of groups effectively maximises the difficulty of the Discrete Log Problem, and that's why the key sizes and computational efficiency is so much better.

        Nobody has shown any such thing -- as far as anyone knows, DLP over elliptic curves is easy, but still hard over the integers mod primes.

        However, the best *known* algorithms for solving DLP over elliptic curves are exponential-time (this may change, if more is learned about elliptic curves), while in the integers case they are
          • Nobody has shown any such thing -- as far as anyone knows, DLP over elliptic curves is easy, but still hard over the integers mod primes.

          However, the best *known* algorithms for solving DLP over elliptic curves are exponential-time (this may change, if more is learned about elliptic curves), while in the integers case they are subexponential-time. This makes a big difference in key lengths when you get down to implementations.


          Quite true - that's why I said effectively. Given current techniques for solvi
      • The Discrete Log Problem is basically: given x, y find g such that g^x = y.

        Actually, that's taking the x'th root of y, not a logarithm. The discrete logarithm problem is:
        given g, y find x such that g^x = y

        The Diffie-Hellman Problem is: given g, g^x, g^y, find g^xy. This is generally done by finding the discrete log of g^x or g^y, but I'm not entirely sure whether it's been proven that the DLP and DHP are equivalent... perhaps Google may yield answers to that
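        The exchange described above can be sketched with toy numbers (the prime, generator, and secret exponents below are arbitrary illustrative choices; real deployments use moduli of thousands of bits, or elliptic-curve groups):

        ```python
        # Toy Diffie-Hellman over the integers mod a small prime.
        # All constants here are illustrative; real systems use far
        # larger parameters, or elliptic-curve groups.
        p = 1019          # a small prime modulus
        g = 2             # the base element

        x = 57            # Alice's secret exponent
        y = 83            # Bob's secret exponent

        A = pow(g, x, p)  # Alice publishes g^x mod p
        B = pow(g, y, p)  # Bob publishes g^y mod p

        # Both sides derive the same shared secret g^(xy) mod p
        assert pow(B, x, p) == pow(A, y, p)

        # An eavesdropper seeing only (g, p, A, B) must solve the DLP:
        # recover x from A = g^x mod p.  Brute force works at this size...
        x_recovered = next(k for k in range(p) if pow(g, k, p) == A)
        assert pow(g, x_recovered, p) == A
        # ...but becomes infeasible once the modulus is hundreds of digits.
        ```

        The `pow(base, exp, mod)` three-argument form does fast modular exponentiation, which is exactly the asymmetry the scheme relies on: exponentiating is cheap, inverting it is not.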

    • So while the bad guys still, theoretically, need nearly the same amounts of processing power and time as regular asymmetric crypto to decrypt the message, the good guys save significantly in encryption.

      Unless of course it's the bad guys doing the encryption. Then it will take the good guys more time to decrypt. That evil, evil knowledge! :)

  • by Anonymous Coward on Thursday April 08, 2004 @10:51PM (#8811731)
    If you put enough computers on the problem, you will eventually find a solution by random chance. (correct me if I'm wrong.)

    The sketch goes something like this:

    If you put enough monkeys at enough typewriters, they will eventually, by random chance, type all the works of Shakespeare. Unfortunately, this would require an infinite number of people going around checking the monkey typing.

    "Bob, Bob, come here, I think we've got something! Yes this is really it! 'To be, or not to be, that is the gzortmanplatz...'"
  • I'm not an expert, but this is my basic understanding. ECC is a public key encryption algorithm. The challenge was to find the private keys using a list of public keys and a set of system rules. If two computers find keys (DP points) that are the same, then theoretically, the encryption has been cracked, i.e. the private key has been found. It's a brute force method. This shows, however, that it is very hard to crack the encryption and would be generally useless, since by the time you cracked the encry
    • Well it depends on how long you want something kept private. My ( credit card number | social security number | drivers license number ) don't change all that often. If someone, somewhere who sold identities could crack mine in 2 years I'd be a little worried. Heck, if some government got interested in something I said, I'm sure they wouldn't think twice about applying some CPU time to it. The longer it lasts, the better

      -Mr. Lizard
  • by Pike ( 52876 )
    Now get on the ball and crack this [jdueck.net] already.

    -JD
  • bring out the 110 foot pole!
  • I guess we have a bunch of potential recruits for the MD5CRK project [md5crk.com] (mentioned here [slashdot.org]), no?

  • Waste of cycles (Score:4, Informative)

    by gumpish ( 682245 ) on Thursday April 08, 2004 @11:59PM (#8812171) Journal
    It seems that a better use of spare CPU cycles would be distributed.net's Optimal Golomb Ruler [distributed.net] search, or Stanford University's Folding@home [stanford.edu] project.

    Make an actual contribution to science - much more satisfying than looking for a very improbable E.T. or senselessly cracking encryption schemes.
  • by crypto257 ( 699692 ) on Friday April 09, 2004 @12:16AM (#8812269)
    The purpose of all these challenges is to understand how much computing power is necessary to break encryption or signature schemes. ECC2-109's strength is pretty low, but it offers a way-point on the curve. Distinguished points are not really distinguished; they just have an easy search pattern, such as a number of trailing zeroes or other constant values. These are searched ad infinitum, and when two matches are found, a little math can get you a private key. The death knell for the DES algorithm was heard when distributed.net, in cooperation with the Electronic Frontier Foundation, built a machine that could crack it in 27 days (or so). And the cost made you wonder who might want to build such machines. As a result, we have AES and expanding public key lengths. No-one would really use ECC2-109 for current cryptographic systems; the results from this test confirm that. The real question is what is the appropriate key length for a The amount of money (n computers over t time) tells us what sort of adversaries these techniques are useful against. It also
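    The "easy search pattern" the parent describes can be sketched in a few lines. A point is "distinguished" if, say, its low-order bits are all zero, so clients only report a tiny fraction of the values they visit; two walks that ever collide will merge and report the same distinguished point. The 16-bit threshold and the hash-based walk below are illustrative assumptions, not the eCompute project's actual parameters (the real walk is a group operation on curve points):

    ```python
    # Sketch of the distinguished-point idea used in parallel collision
    # search (Pollard-rho style).  The walk function and the 16-bit
    # threshold are illustrative assumptions, not the project's real ones.
    import hashlib

    DP_BITS = 16  # "distinguished" = low 16 bits are all zero

    def is_distinguished(x: int) -> bool:
        return x & ((1 << DP_BITS) - 1) == 0

    def step(x: int) -> int:
        # Pseudorandom walk; in the real attack this is an elliptic-curve
        # group operation, here a hash stands in for it.
        digest = hashlib.sha256(x.to_bytes(16, "big")).digest()
        return int.from_bytes(digest[:16], "big")

    def walk_to_dp(seed: int, max_steps: int = 2_000_000) -> int:
        """Iterate the walk from `seed` until a distinguished point appears."""
        x = seed
        for _ in range(max_steps):
            if is_distinguished(x):
                return x
            x = step(x)
        raise RuntimeError("no distinguished point found")

    dp = walk_to_dp(12345)
    assert is_distinguished(dp)
    ```

    On average a walk hits a distinguished point after about 2^DP_BITS steps, so the server only has to store and compare those rare reported points rather than every value every client computes.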
  • curves... (Score:3, Funny)

    by dj245 ( 732906 ) on Friday April 09, 2004 @12:18AM (#8812282) Homepage
    Ooh, Curves! Can I touch?

    I am too dumb to say anything further on the subject.

  • i am a little fuzzy on this--does this yield the session key (and decrypt only this session) or will it yield the private keys? (and allow decryption of all further communications?)

    either way..this was only what, 17 months for only 2k people on a 109 bit key?... just think.. each bit doubles a key's strength. and with 64 bit keys still being fairly commonly used, think how fast the govt (NSA) will be (probably, has been) blowing through lower bit (possibly higher bit) crypto communications--and i can almo
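    The "each bit doubles" point above is just exponential growth of the search space; a quick back-of-the-envelope (the keys-per-second rate is an arbitrary illustrative assumption, not a claim about any real attacker):

    ```python
    # Back-of-the-envelope: every extra key bit doubles the brute-force
    # search space.  The 10**12 keys/sec rate is an arbitrary assumption.
    RATE = 10**12

    def brute_force_years(bits: int, rate: int = RATE) -> float:
        """Years to exhaust a `bits`-bit keyspace at `rate` keys/second."""
        return 2**bits / rate / (365.25 * 24 * 3600)

    for bits in (64, 80, 109, 128):
        print(f"{bits:>3} bits: ~{brute_force_years(bits):.3g} years")

    # One extra bit exactly doubles the work:
    assert brute_force_years(65) == 2 * brute_force_years(64)
    ```

    Worth noting: the rho-style collision attack used against ECC2-109 needs only about the square root of the keyspace (roughly 2^54.5 curve operations for a 109-bit curve), which is why a 109-bit curve was within reach of a volunteer effort at all, while a straight exhaustive search would not have been.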
  • Uh... November 2002? (Score:4, Interesting)

    by 1000StonedMonkeys ( 593519 ) on Friday April 09, 2004 @01:44AM (#8812777)
    Okay, so why does the linked webpage indicate that the 109 challenge was Completed in November of 2002 [certicom.com]?
  • can someone explain this without all the numbers with funny symbols, and possibly small words. I mean, sure, I read 'Cryptonomicon', but I admit I skimmed the theory sections....
