
Crypto Snake Oil 215

Posted by ScuttleMonkey
from the never-trust-a-man-selling-something dept.
An anonymous reader writes "Luther Martin of Voltage Security has published an article about the perception of cryptography today with regards to quality and honesty in vendors. From the article: 'Products that implement cryptography are probably credence goods. It requires expensive and uncommon skills to verify that data is really being protected by the use of cryptography, and most people cannot easily distinguish between very weak and very strong cryptography. Even after you use cryptography, you are never quite sure that it is protecting you like it is supposed to do.'"
This discussion has been archived. No new comments can be posted.

Crypto Snake Oil

Comments Filter:
  • Snake Oil (Score:5, Informative)

    by Anonymous Coward on Sunday September 03, 2006 @05:40AM (#16032041)
    Snake oil is a traditional Chinese medicine used for joint pain. However, the most common usage of the words is as a derogatory term for medicines to imply that they are fake, fraudulent, and usually ineffective. The expression is also applied metaphorically to any product with exaggerated marketing but questionable or unverifiable quality.

    'nuff said
    • Re: (Score:2, Insightful)

      ...any product with exaggerated marketing but questionable or unverifiable quality.

      Like a religion?!
    • So did he write the article and then post it on wikipedia, or did he swipe it from wikipedia and post on his site?

      http://en.wikipedia.org/wiki/Snake_oil [wikipedia.org]

      Not trying to troll, I just couldn't figure out which it was and I don't have a lot of time to investigate.

      Transporter_ii
      • Or it could possibly be neither, since the definitions given in both are generally accepted and don't match up word for word (assuming you're trying to insinuate plagiarism). Yes, they're similar, but they are not identical. Anyone familiar with the term or its history would write something very similar if asked to...
      • Re: (Score:3, Informative)

        by 44BSD (701309)
        Actually, Ross Anderson was the first infosec/crypto dude to channel Akerlof, in section 5 of this paper [cam.ac.uk].
  • Still not too bad (Score:4, Interesting)

    by legoburner (702695) on Sunday September 03, 2006 @05:44AM (#16032047) Homepage Journal
    Even though in many cases this might be true, and product prices are increased because of it, weak encryption is a lot better than no encryption at all. There are many people out there who might go as far as casual data theft (e.g. taking someone's USB memory stick at school), but even a weak layer of encryption will stop all but those who know what encryption is and where to start breaking it.
    • Re:Still not too bad (Score:5, Interesting)

      by TCM (130219) on Sunday September 03, 2006 @05:51AM (#16032055)
      I'm not so sure. Once a flawed implementation has been broken, there will be tools to crack it.

      Take WEP for example. I personally wouldn't know how to crack it. But others do. They develop tools. Et voila, today it's trivial to download some tool and break WEP, even for novices.

      Weak encryption is never good and should be strongly discouraged.
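      The failure mode TCM describes can even be sketched in a few lines. This is a toy illustration of the keystream-reuse flaw behind WEP cracking (the keystream function here is a hash-based stand-in, not real RC4): when two packets are encrypted under the same keystream, XORing the ciphertexts cancels the keystream, so one known plaintext reveals the other without recovering the key at all.

```python
# Toy sketch of keystream reuse (the flaw class behind WEP cracking).
# The keystream is a hash-based stand-in, NOT real RC4, and NOT secure.
import hashlib

def keystream(key: bytes, iv: bytes, n: int) -> bytes:
    # Illustration-only stream generator keyed by (key, iv).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + iv + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key, iv = b"secret", b"\x00\x01\x02"   # the IV is reused for both packets
p1 = b"known plaintext "
p2 = b"the secret data!"
c1 = xor(p1, keystream(key, iv, len(p1)))
c2 = xor(p2, keystream(key, iv, len(p2)))

# The attacker sees only c1 and c2, and happens to know p1:
# c1 XOR c2 == p1 XOR p2, so the keystream cancels out entirely.
recovered = xor(xor(c1, c2), p1)
print(recovered)  # == p2, with no key recovery needed
```

      Real WEP tooling exploits exactly this kind of IV reuse (plus RC4 key-scheduling weaknesses), which is why a downloadable tool can perform the whole attack for a novice.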
      • by Panaflex (13191) <convivialdingo@noSpaM.yahoo.com> on Sunday September 03, 2006 @05:55AM (#16032059)
        WEP is still a great example... it's enough of a pain that if given the choice between breaking a WEP connection and using an open WAP - well, you'll choose the open one.

        In that case, WEP really does work for most people.
      • Re:Still not too bad (Score:5, Interesting)

        by Snarfangel (203258) on Sunday September 03, 2006 @06:03AM (#16032066) Homepage
        I'm not so sure. Once a flawed implementation has been broken, there will be tools to crack it.

        Plus, if there is *no* encryption, people are less likely to put sensitive information in the application.

        To use an analogy, consider two locker rooms. Room A does not have locks on any of the lockers. Room B has locks, but all of them have the same combination. In which one is a person more likely to leave their wallet?
        • Re: (Score:3, Funny)

          by Anonymous Coward
          behind the fire extinguisher in the hall between room A and room B. Security through obscurity!
        • by Phleg (523632) <stephen@touset. o r g> on Sunday September 03, 2006 @10:05AM (#16032480)

          In which one is a person more likely to leave their wallet?
          Am I the only person who thinks the correct answer to this question is in his pocket?
        • Re: (Score:3, Insightful)

          by k98sven (324383)
          To use an analogy, consider two locker rooms. Room A does not have locks on any of the lockers. Room B has locks, but all of them have the same combination. In which one is a person more likely to leave their wallet?

          I take it you're implying the correct answer would then be "Neither". And I'd agree.

          Problem is, it's not a relevant point. The context here is consumer's ignorance on the performance of crypto products. If someone is buying a crypto product, they must have determined that they need one. Or to co
        • Re: (Score:2, Funny)

          by BobNET (119675)
          Room A does not have locks on any of the lockers. Room B has locks, but all of them have the same combination. In which one is a person more likely to leave their wallet?

          Put the wallet in your sneaker. I put it down by the toe, they never look there!

        • by jd (1658) <imipak@yaCOLAhoo.com minus caffeine> on Sunday September 03, 2006 @02:12PM (#16033339) Homepage Journal
          But I've worked as a contractor for Government sites where their central data server was:
          • Publicly accessible, outside of any firewall
          • Had .rhosts on it, for the specific purpose of avoiding having to write login code for scripts that copy data
          • Stored commercially sensitive (and possibly classified) information.

          Ok, I'll be fair - though God alone knows why, and I think even God gave up trying to figure out the tangled mess I call a brain some time ago. They did use DES - not triple DES, just plain DES - for the really really sensitive stuff. The encryption key was visible to anyone logged in on any account, however, as the DES they used required the key to be the first parameter and they made no effort to erase it. So it was technically encrypted. (Once the passkey has been broadcast to all and sundry, I do not regard the encryption as anything more than a technicality, and in the case of DES, I seriously doubt you could even claim that.)
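          The command-line key leak described above is easy to reproduce. As a sketch (the child process and key string here are hypothetical stand-ins for the DES tool): on Linux, any logged-in user can read any process's argument list from /proc, so a key passed as a parameter is public the moment the program starts.

```python
# Why a key passed as a command-line parameter is visible to everyone:
# on Linux, /proc/<pid>/cmdline is world-readable. The child process and
# "HYPOTHETICAL-DES-KEY" are stand-ins for the DES tool described above.
import subprocess
import sys
import time

proc = subprocess.Popen(
    [sys.executable, "-c", "import time; time.sleep(5)", "HYPOTHETICAL-DES-KEY"]
)
time.sleep(0.5)  # give the child a moment to start

# Any local user could do this for any pid (e.g. via `ps axww`):
with open(f"/proc/{proc.pid}/cmdline", "rb") as f:
    argv = f.read().split(b"\x00")  # argv entries are NUL-separated

print(argv)  # the "key" is right there in the argument list
proc.kill()
```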

          I've heard that security has since improved. I say "heard", because it was some time AFTER security was said to have been improved that reports started coming out of a fileswapper using NASA storage machines as extra disk space - the very same organization and very same type of mass storage device I had serious doubts about many years prior to that.

          But that's a Government institution! Yes, and they're the ones with a great many experts in such matters and a great many contracts with people who can not merely withdraw business but also guarantee a disaster in the next election. The bulk of private corporations out there have neither the skills to draw on NOR the incentives to maintain some sort of standard. All they have to do is ROT13 and tell you it's got digital security. Enough suckers'll buy into it to keep the CEO in champagne, caviar and girls of commercially-negotiable virtue for life.

          The problem is, there is no mandated minimum standard for security, so those who can WILL use the lowest standard possible that will deceive customers into thinking they're safe whilst staying a gnat's whisker (after being compressed by the forces of a neutron star) beyond what could be sued for in courts, assuming a technically ignorant judge.

          IMHO, "snake oil" could be vastly reduced - not eliminated but reduced - by placing minimum standards for crypto, compression and other easily-manipulated areas of technology, and enforcing them. Not maximum - that's what the intelligence services want, and they want it to be zero. I'm strictly talking minimum. Your good, old-fashioned lemon law - does it fill the purpose for which it was sold to the customer? Yes or no.

          In the case of cryptography, that would be rephrased as follows: would a reasonable person, aware of the strengths and deficits of the technique concerned, aware of any warnings published on the block crypto lounge, hashing function lounge, etc, aware of the Usenet Crypto FAQ (ie: aware of the "common knowledge" that exists on cryptography), and aware of the grade of security the user is demonstrably expecting, agree or disagree that the cryptographic system sold meets the grade expected or not?

          If it does not, it is a lemon for the purpose for which it was sold. It might be perfectly good otherwise, but it doesn't, can't, and never will do what was expected of it.

          This would be enforceable, as I said very clearly that I'm talking about weighing the "common knowledge" against the "personal expectation". Both are easy to define and even a non-expert should understand a skull-and-crossbones labelled "BROKEN, DO NOT USE" in a crypto lounge. They might not understand the fine nit-picking or the advanced maths, but that's why I'm sticking solely to what is commonly known and understood, not what is derivable from axiom 327 as applies to lemma 291 as described by Professor Branestawm's obscure paper entitled "techniques for splicing dormice genes into giraffe brai

        • Re: (Score:3, Insightful)

          by Inode Jones (1598)
          Room A.

          And I'll bring my own lock.
    • Re:Still not too bad (Score:5, Interesting)

      by Lord Ender (156273) on Sunday September 03, 2006 @06:14AM (#16032082) Homepage
      I would say that there is an inverse relation (at least somewhat) between price of crypto software and real security.

      The cheaper the software is, the greater the number of people who could have peer-reviewed it for correctness. The more open the software, likewise.

      Really expensive software could only have been peer-reviewed by a small number of people, while free, open source software could have been reviewed by a huge number of people.

      I recently was asked to recommend a way for my CEO and several other executives to secure their IMs. I recommended gaim + gaim-encryption because it was all open source and free, so if there were a flaw in the crypto implementation, it would likely have been discovered already.

      I also made sure the CEO knew that he was using open source software, and I told him why. He was totally down with it :-)
      • Re: (Score:2, Insightful)

        by Anonymous Coward
        Unfortunately this is a flawed approach. A million people may have read it, but if none of them were cryptographers then it was no better than if nobody had read it. What's really important is _who_ has read the code, not how many.
        • If a million people read the code, and 1 in 10,000 were cryptographers who examined the code closely, that's still 100 cryptographers examining the code. Assuming most of them were working independently or in small groups, that's good enough for me. It's probably a lot better than a closed-source solution where maybe half a dozen experts looked closely at the code.

          The best thing about open-source is that if it's a real concern to you, you can hire your own experts to check out the implementation. You don
        • If they were writing it from scratch, then sure. But good crypto techniques are fairly numerous, and they are well tested. For instance, SSH+AES+RSA authentication is about as secure as you can make a remote shell. Thus, you don't need a cryptographer to review this if you turn around and implement it in something else, like, say, PGP. All you need is someone with a basic understanding of how the original was implemented, and you need to make sure the code doesn't do anything stupid or malicious -- if i
      • Re: (Score:2, Informative)

        by abhi_beckert (785219)
        Peer reviewed does not equal security. It could be there are several known flaws in something that's had "peer reviews", or it could be the system is totally open but hasn't been around long enough to be tested thoroughly, or maybe it's been around forever but is now using a faster algorithm that hasn't been proven to be secure...

        If you want security, ask an authority on the matter rather than basing it on indirect things like price, openness, etc.
        • by Jsprat23 (148634)
          Given that gaim-encryption is currently based off of Mozilla's NSS and NSPR libraries, I think it's pretty safe to assume that the "right people" have looked at them. Most of g-e is an interfacing layer between gaim and the underlying encryption that also prevents replay attacks by inserting nonces into the stream.
        • Re: (Score:3, Informative)

          by Lord Ender (156273)

          Peer reviewed does not equal security. It could be there are several known flaws in something that's had "peer reviews"...

          Yes, "it could be" that many unlikely things are true. But they are still unlikely.

          Are you new to cryptology? It seems you are unfamiliar with the fundamental tenet of cryptography: "If lots of smart people have failed to solve a problem, then it probably will not be solved anytime soon."

          You seem to think peer review doesn't have much to do with cryptography, but I would argue that it

        • by Jeremi (14640)
          If you want security, ask an authority on the matter rather than basing it on inderect things like price, openness, etc


          Of course, the authority's opinion on the product might be mistaken also. What we really need is a way for laypeople to test a program's security themselves.... some sort of auto-hacker-in-a-box software, perhaps. I have no idea if that's even remotely feasible, but it would be really useful.

      • Re: (Score:2, Interesting)

        by RoboSpork (953532)
        gaim-encryption is flawed in that it is a weak encryption scheme. Off The Record [cypherpunks.ca] is a far superior gaim plugin, providing much stronger encryption, authentication, deniability, and forward secrecy. Read how it compares to gaim-encryption on their website. Their whitepaper [cypherpunks.ca] is a really good introduction to what can make encryption strong vs what can make it weak, definitely worth a read for anyone new to crypto. And besides all that, open source != secure. That is a really bad assumption to make.
        • Re: (Score:3, Informative)

          Would you please explain why gaim-encryption is weak?

          OTR might be a better choice for social communications, as explained in the paper, but that does not make gaim-encryption (or PGP, etc) weak. For its intended purpose both PGP and gaim-encryption seem strong.

          If I wanted to authenticate and keep a message secret from eavesdroppers, I would have no problems using gaim-encryption. At work, non-repudiation is really not a problem, and if my key was compromised, IM compromise would be my smallest problem (assu
    • by vidarlo (134906)

      Even though in many cases this might be true, and product prices are increased because of it, weak encryption is a lot better than no encryption at all. There are many people out there who might go as far as casual data theft (e.g. taking someone's USB memory stick at school), but even a weak layer of encryption will stop all but those who know what encryption is and where to start breaking it.

      If you don't think about it, you'll agree that weak crypto is better than no crypto. The problem is if you believe

      • Re: (Score:2, Informative)

        by inviolet (797804)

        If you imagine something is uncrackable, like pgp pretty much is [. . .]

        Cracking PGP is still a Hard Problem, but the times they are a'changin'. It may succumb to quantum computing. Or, it may fall under the combined assault of the army of mathematicians who are studying integer factorization. Nobody knows for sure, but the NSA has been telling people for years now to not rely on RSA. They suggest switching over to Elliptic Curve or other advanced algorithm.

        • the NSA has been telling people for years now to not rely on RSA. They suggest switching over to Elliptic Curve or other advanced algorithm.

          Provide a cite for that, please?

          I don't personally feel very kindly disposed towards RSA - I don't see any advantage it has over Rabin-based schemes and important disadvantages - but I think it is scaremongering to say that the NSA have been warning people about it.
  • Then use OSS!! (Score:4, Insightful)

    by JimBowen (885772) on Sunday September 03, 2006 @05:53AM (#16032056)
    If you are worried about the honesty of vendors, this is exactly why you should be using free cryptography software in the first place: you know it is going to be strong and trustworthy, because otherwise someone would have changed it by now. :)
    It is also much easier to verify strength by reading the source rather than by reading the binary or by cryptanalysis.
    • or (Score:4, Interesting)

      by xmodem_and_rommon (884879) on Sunday September 03, 2006 @06:22AM (#16032093)
      or you could just take the common sense approach and use products that rely on algorithms that are open, widely tested and reviewed, and known secure. Algorithms like Blowfish, AES, etc. I use Apple's built-in Filevault protection to encrypt my Powerbook's hard disk, in the event that it is ever stolen. It uses AES-128, which means I know that no-one is getting in without my password.
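      "No-one is getting in without my password" is really a claim about keyspace size, and the arithmetic is worth sketching (assuming brute force is the only attack available and the key is not derived from a guessable password):

```python
# Back-of-envelope cost of brute-forcing an AES-128 key, assuming an
# (optimistic) 10^12 guesses per second and no analytic shortcut.
keyspace = 2 ** 128
guesses_per_second = 10 ** 12
seconds_per_year = 365 * 24 * 3600

# On average the key turns up after searching half the keyspace.
years = keyspace / (2 * guesses_per_second * seconds_per_year)
print(f"{years:.2e} years on average")  # on the order of 10^18 years
```

      In practice, of course, the password the key is derived from is the weak point, which is exactly the thread's theme: the algorithm is rarely what breaks.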

      Any vendor that relies on a custom algorithm for their encryption technology shouldn't be trusted.
      • Re:or (Score:5, Interesting)

        by TCM (130219) on Sunday September 03, 2006 @06:37AM (#16032112)
        Any vendor that relies on a custom algorithm for their encryption technology shouldn't be trusted.
        Of course.

        But even then there are vendors who claim to be using AES and end up introducing implementational flaws that are not obvious to the user. It's not just algorithms that need to be reviewed but complete implementations.

        Nice read: http://www.schneier.com/crypto-gram-9902.html#snakeoil [schneier.com]
      • Re: (Score:3, Interesting)

        by swelke (252267)
        ...use products that rely on algorithms that are open, widely tested and reviewed, and known secure.

        Just because the algorithm is widely tested and known to be secure doesn't make the software based on it secure. It's very easy to take a secure algorithm like AES and make a totally insecure program by, for example, not encrypting all of the data it should, or by selecting the encryption key poorly so that it's easy to "guess", meaning you might only have to check 2^20 keys to decrypt that email of yours
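        That key-selection point in miniature (the cipher below is a toy XOR-with-hash stand-in, NOT AES, and the attacker is assumed to know a short plaintext crib; the flaw shown is purely in how the key was chosen):

```python
# A "strong" cipher with a weakly chosen key: only 2^20 keys are
# possible, so an attacker simply tries them all. toy_encrypt is an
# illustration-only stand-in (XOR with a hash of the key), not AES.
import hashlib

def toy_encrypt(key: int, data: bytes) -> bytes:
    pad = hashlib.sha256(key.to_bytes(4, "big")).digest()
    return bytes(b ^ k for b, k in zip(data, pad))

secret_key = 0xBEEF  # drawn from a mere 20-bit space
ciphertext = toy_encrypt(secret_key, b"that email of yours")

found = None
for guess in range(2 ** 20):
    # XOR is its own inverse, so "encrypting" the ciphertext decrypts it;
    # the attacker checks each guess against a known plaintext crib.
    if toy_encrypt(guess, ciphertext).startswith(b"that email"):
        found = guess
        break

print(hex(found))  # the full 2^20 search takes roughly a second
```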
      • Re: (Score:3, Informative)

        by rew (6140)
        or you could just take the common sense approach and use products that rely on algorithms that are open, widely tested and reviewed, and known secure ... and in reply I quote the blurb from the article on slashdot:

        "Even after you use cryptography, you are never quite sure that it is protecting you like it is supposed to do."

        If it claims to use AES, does it really? Even if it actually does, are you sure it doesn't conveniently store the key somewhere? Even if it doesn't do anything this stupid, are you s
    • by cduffy (652)
      Using OSS is not a guarantee of strong crypto.

      See Peter Gutmann's analysis of open source VPNs [auckland.ac.nz] back in 2003. To be sure, the situation was not as dire as he described it to be in all these cases -- in some cases such issues were arguably not readily exploitable or were documented as recognized tradeoffs -- but it nonetheless raises a point that even having a substantial group of folks looking at the source doesn't necessarily help as much as it generally does if recognizing the bugs requires special knowled
      • Re: (Score:2, Funny)

        by portmapper (991533)
        See Peter Gutmann's analysis of open source VPNs back in 2003.

        That has the following great suggestion:

        Whenever someone thinks that they can replace SSL/SSH with something much better that they designed this morning over coffee, their computer speakers should generate some sort of penis-shaped sound wave and plunge it repeatedly into their skulls until they achieve enlightenment. Replacing the SSL/SSH data channel is marginally justifiable, although usually just running SSL/SSH over UDP wou

  • by smilindog2000 (907665) <bill@billrocks.org> on Sunday September 03, 2006 @06:10AM (#16032076) Homepage
    So, for example, with a post like this, will somebody in a dark suit and glasses show up at my door tomorrow?

    Blasphemy #1: I've heard from a claimed friend of one of the inventors of RSA that it was cracked years ago. Yet, it continues to get worldwide use. Sure my friend was probably full of it... but who am I supposed to trust here? The government?

    Blasphemy #2: One of my close friend's mother had to switch fields from Numerics after she published some papers considered too sensitive. It had something to do with factoring.

    Blasphemy #3: Anybody else notice that quantum computers have been proven to be capable of factoring really well, but no one has shown that they can solve any NP-hard problems? Come on... factoring isn't NP hard.

    Then, there's just some silly stuff I've noticed about crypto. Why do we always seem to use encryption just a generation or so ahead of what is needed to crack it? SHA-1 for example... And, why do we encrypt one small block at a time. Each encrypted file usually gives many independent chances to crack the key, and in many cases, some of those blocks have known data. Also, public key is great, but secret key can be easily shown NP-hard to crack (in terms of secret key length) with semi-reasonable assumptions, while public key has no such simple proof. I personally have been trying to prove that no public key system can be NP-hard, but what the heck... I'm not that good. However, I do believe it's probably true.

    It seems any time you start talking about crypto, you get assailed by experts telling you just how full of it you are. Consider something simple, like generation of random numbers. Just claiming you can do a good job brings nay-sayers out of the woodwork. See:

            http://linux.slashdot.org/comments.pl?sid=193904&cid=15899118 [slashdot.org]
            http://www.billrocks.org/rng [billrocks.org]

    for how to do it well. Any child could do it (well at least my geeky 6-year-old).
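    Whatever one thinks of home-grown generators, the baseline advice is uncontroversial: keys and nonces should come from the OS's cryptographic RNG, not from a general-purpose, seedable PRNG. A minimal sketch of why, in Python:

```python
# Why `random` is the wrong tool for keys: it is a deterministic PRNG,
# so anyone who learns (or guesses) the seed reproduces every "random"
# key. The `secrets` module draws from the OS CSPRNG instead.
import random
import secrets

random.seed(1234)                      # e.g. a timestamp an attacker can guess
weak_key_a = random.getrandbits(128)
random.seed(1234)
weak_key_b = random.getrandbits(128)   # identical: fully predictable
print(weak_key_a == weak_key_b)        # True

strong_key = secrets.token_bytes(16)   # 128 bits from the OS CSPRNG
print(strong_key.hex())
```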

    Everything about crypto is scary... Are we being manipulated into using weak encryption? Is there some invisible line, which if crossed, bad things can happen? The scary part is the unknown.

    --

    Just because you're paranoid doesn't mean the world isn't out to get you.
    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Is there some invisible line, which if crossed, bad things can happen? The scary part is the unknown.

      That's exactly what it is, I think. Crypto is so complex that, unless you are absolutely sure wtf you're doing, you're better off NOT trying to implement your own crypto algorithm, random number generator and whatnot. Without the mathematical knowledge, you can never completely assess side effects, for example.

      A nice page about how novice understanding of crypto can turn into horribly insecure software: http [auckland.ac.nz]

    • by Realistic_Dragon (655151) on Sunday September 03, 2006 @06:48AM (#16032123) Homepage
      sci.crypt is a good read if you are interested in Crypto. However it does tend to get a bit antagonistic towards newbies - and it's not hard to see why.

      Approximately every 12.5 minutes someone turns up claiming to have invented a new:

      Random number generator
      Unbreakable encryption method
      Implementation of old methods that makes them unbreakable
      Proof that shows that all crypto is worthless

      The percentage of loons is *so* high that anyone who does have an interesting idea (and who doesn't publish in reputable journals) is dismissed out of hand.

      For example, here is a typical conversation from the one sane new poster (posted somewhere between the 999,999 people trying to sell "200000 bit quantum crypto based on the randomness of STARS!!!!!"):

      ** Hi, I'd like to find out if there's a RNG sandbox somewhere so I can play about with some ideas.

      * ARGH! Don't implement your own RNG! It'll be crap! Here, use product X.

      Well, yes, that's true. When it comes to crypto there is a 99% chance that what you implement will not work properly and as a result will be insecure... but stomping on someone who wants to try some ideas out is just plain wrong. Not all research has to take place in academic institutions.
      • All research doesn't have to take place in academic institutions, but people who claim revolutionary results should be expected to have a decent background in the field.

        What would you expect to happen if I showed up in sci.physics.catapults and said:

        I've developed a revolutionary new catapult based on my super-accurate estimation of acceleration due to gravity at 11 m/s^2! With my new iterated approximation algorithmic technique, I can calculate trajectories so accurately I can hit a fly on a wall from a

    • Blasphemy #2: One of my close friend's mother had to switch fields from Numerics after she published some papers considered too sensitive.

      Considering that an agency that thinks polygraphs give absolutely perfect proof of lies is enforcing this sort of stuff - yes, we are being manipulated into weak encryption by a bunch of incompetent clowns that have already been taken in by snake oil and are seen internationally as bumbling fools. US intelligence doesn't rate as highly as newspaper articles these days. A

    • Re: (Score:3, Insightful)

      by Panaflex (13191)
      Well, I think the facts (haha - ahem - as far as is publicly known) are this:

      I've heard from a claimed friend of one of the inventors of RSA that [it was cracked years ago].
      1. RSA is not known to be cracked and in general is still considered HARD - though the rapidly increasing amount of free and cheap CPU time will eventually defeat most of today's common length keys in 35-50 years (who knows?). That said, it may be possible that RSA gets cracked next week - I wouldn't be surprised. I too have a few fri
    • by gkhan1 (886823) <<oskarsigvardsson> <at> <gmail.com>> on Sunday September 03, 2006 @07:25AM (#16032158)

      Boy, you don't know that much about cryptography, do you ;)

      Blasphemy #1: I've heard from a claimed friend of one of the inventors of RSA that it was cracked years ago. Yet, it continues to get worldwide use. Sure my friend was probably full of it... but who am I supposed to trust here? The government?

      That's complete BS. It hasn't been cracked, and it won't be for a long time. Just remember to use big keys and your stuff is safe. As for who you are supposed to trust, you're supposed to trust the huge mathematical community that every day is pounding and pounding and pounding on this problem. They are honest academics, and if there is even a hint of progress it will become public.

      Blasphemy #2: One of my close friend's mother had to switch fields from Numerics after she published some papers considered too sensitive. It had something to do with factoring.

      I'm not entirely sure what the hell you are saying. Are you saying that your friend's mother is a genius mathematician who published a few papers about factoring and was somehow forced to leave the field? That's completely ridiculous; lots of people publish papers on factoring every year. Either you are lying or you have completely misunderstood the matter.

      Blasphemy #3: Anybody else notice that quantum computers have been proven to be capable of factoring really well, but no one has shown that they can solve any NP-hard problems? Come on... factoring isn't NP hard.

      This is a common misconception, that quantum computers will be like a regular computer, "but way faster". This is not so; a quantum computer works in a fundamentally different way, one that makes it possible to invent algorithms that are way faster than anything on a classical computer. Many of these new algorithms are made for cryptanalysis, namely Shor's algorithm (integer factorization in polynomial time, breaks RSA), the discrete logarithm algorithm (breaks Diffie-Hellman) and Grover's algorithm (which would speed up standard brute-force cracking, but only by a quadratic amount, which means that you can just double your key length and it's still as hard).

      As for complexity, the decision-problem form of integer factorization ("Is there a factor of M smaller than N?") is indeed in NP, but its exact class is an unresolved problem. Most people doubt that it is in P, and also doubt that it is NP-complete, which is what would make it NP-hard (unless P=NP of course, but that's a whole 'nother discussion ;) Maybe you are thinking of primality testing, which has very recently been proven to be in P. The whole village rejoiced.
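      The primality/factoring asymmetry is easy to feel in code. Here is a sketch using Miller-Rabin (the probabilistic cousin of the deterministic polynomial-time result alluded to above): it certifies a huge number composite in milliseconds while telling you nothing about its factors.

```python
# Miller-Rabin probabilistic primality test: fast even on numbers
# whose factors are out of reach. Finding it composite reveals no factor.
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is definitely composite
    return True  # almost certainly prime

p = 2 ** 127 - 1               # a known Mersenne prime
n = p * (2 ** 89 - 1)          # product of two large primes
print(is_probable_prime(p))    # True, in milliseconds
print(is_probable_prime(n))    # False: composite, but we learn no factor
```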

      Then, there's just some silly stuff I've noticed about crypto. Why do we always seem to use encryption just a generation or so ahead of what is needed to crack it? SHA-1 for example...

      Has been a problem in the past, but we've learned our lesson. 256 bit AES will (very possibly) never be cracked by an ordinary computer. A quantum computer might, but it would have to be one bad-ass quantum computer. 256 bit AES is completely safe.

      And, why do we encrypt one small block at a time. Each encrypted file usually gives many independent chances to crack the key, and in many cases, some of those blocks have known data.

      It doesn't matter one iota whether a block has known data or not. You still need the key to learn anything about what is in there (that is, if you suspect a block of ciphertext Y is the encryption of X, there is no way you can confirm that without the key). There is something called a chosen-plaintext attack that does something similar in public-key cryptography, but it only works in bad implementations.

      Also, public key is great, but secret key can be easily shown NP-hard to crack (in terms of secret key length) with semi-reasonable assumptions, while public key has no such simple proof. I personally have been trying to prove that no public key system can be NP-hard, but what the heck... I'm not that good. Howe

      • by u38cg (607297)
        Well, Cocks at GCHQ derived RSA independently four years before Rivest, Shamir et al. figured it out. One of my (brighter than me) friends went to work at GCHQ (on what I don't know, but he is a wrangler), and his only comment on the place was "it fucking blew my mind how far ahead of the rest of the world they are". It wouldn't surprise me in the least that they have discovered how to fast factor.
    • by dhasenan (758719)
      "Blasphemy #3: Anybody else notice that quantum computers have been proven to be capable of factoring really well, but no one has shown that they can solve any NP-hard problems? Come on... factoring isn't NP hard."

      There is no proof that it is and no reason to think that it is. We just have no fast algorithm for it.

      "Then, there's just some silly stuff I've noticed about crypto. Why do we always seem to use encryption just a generation or so ahead of what is needed to crack it?"

      That's largely a matter of ke
    • Re: (Score:3, Informative)

      by YoungHack (36385)

      Blasphemy #1: I've heard from a claimed friend of one of the inventors of RSA that it was cracked years ago. Yet it continues to get worldwide use. Sure, my friend was probably full of it... but who am I supposed to trust here? The government?

      I'm a professional mathematician and have had the opportunity to work with and become friends with some big names in number theory and factoring. No one can know for certain, but my friends were of the general opinion that RSA was probably okay.

      Blasphemy #2: One

    • Re: (Score:3, Insightful)

      by Tack (4642)
      And why do we encrypt one small block at a time? Each encrypted file usually gives many independent chances to crack the key, and in many cases some of those blocks have known data.

      They're only independent if you use ECB, and anyone using ECB deserves what they get. Cipher modes like CBC or CTR solve these problems.
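
      To illustrate the point, here is a toy sketch of ECB versus CBC. The "block cipher" is plain XOR with a fixed key, a deliberate stand-in for AES (an assumption for readability, not a real cipher); the chaining behaviour is what matters:

```python
# Toy demonstration of why ECB leaks structure while CBC does not.
# The "block cipher" here is XOR with a fixed key, a stand-in for a
# real cipher such as AES, chosen so the chaining is easy to follow.
BLOCK = 8
KEY = bytes(range(BLOCK))

def enc_block(b):
    # Stand-in block cipher: XOR each byte with the key.
    return bytes(x ^ k for x, k in zip(b, KEY))

def ecb_encrypt(data):
    # ECB: every block is encrypted independently.
    return b"".join(enc_block(data[i:i+BLOCK]) for i in range(0, len(data), BLOCK))

def cbc_encrypt(data, iv):
    # CBC: each plaintext block is XORed with the previous ciphertext block.
    out, prev = [], iv
    for i in range(0, len(data), BLOCK):
        block = bytes(x ^ p for x, p in zip(data[i:i+BLOCK], prev))
        prev = enc_block(block)
        out.append(prev)
    return b"".join(out)

plaintext = b"SAMEBLOKSAMEBLOK"  # two identical 8-byte blocks

ecb = ecb_encrypt(plaintext)
cbc = cbc_encrypt(plaintext, iv=b"\x01" * BLOCK)

# ECB: identical plaintext blocks give identical ciphertext blocks.
print(ecb[:BLOCK] == ecb[BLOCK:])   # True
# CBC: chaining hides the repetition.
print(cbc[:BLOCK] == cbc[BLOCK:])   # False
```

      The same leak is what makes "independent chances to crack the key" an ECB-specific complaint; chained modes tie the blocks together.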

    • You really do sound paranoid. Unless you are totally ignorant, you must know that any math/compsci student who could show that a well-established crypto system is easily breakable would have his career set for life.

      So you must think it is possible that every time one of these students publishes a paper on a fast way to factor large numbers, he vanishes, never to be seen again. How many people have vanished from the math dept. of your school? That just doesn't happen. Unless "they" (meaning all of academia) are also i
  • by Paul Crowley (837) on Sunday September 03, 2006 @06:11AM (#16032077) Homepage Journal
    Many Slashdot readers are savvy enough to know that when a software product advertises itself as using, say, secret encryption algorithms with 10,000 bit keys, it's probably snake oil. But I'm seeing increasing amounts of snake oil that uses the Advanced Encryption Standard, AES, and it can be just as weak.

    AES itself of course is nigh-on as trustworthy a cryptographic primitive of its kind as we have. But just because you've used the right primitive doesn't mean you've built a secure product. You have to consider what chaining mode to use, how to handle passphrases if they exist, how to keep your secrets secret, defense against side-channel attacks, and more.

    What I look for is a product that provides enough information that I can actually assess its security - what attacks they've considered and how they've built the product to defend against them. What I see disturbingly often is a bald declaration that the product is secure, because it uses AES.
  • by melted (227442) on Sunday September 03, 2006 @06:11AM (#16032078) Homepage
    >> It requires expensive and uncommon skills to verify that data is really being protected by the use of cryptography

    No. It requires reading a couple of good, inexpensive books and understanding what the heck you're doing. The math behind the whole thing can be complicated, but you don't really need to understand the math 100% here. All you need to know is whether an algorithm is considered "strong" by today's standards, understand a few key concepts, guard your keys, and approach security-related coding with a healthy amount of paranoia.

    In other words, a decent developer can get a pretty good understanding of this all in two weeks or less. And these skills need to become "common" already.
    • by Paul Crowley (837) on Sunday September 03, 2006 @06:19AM (#16032086) Homepage Journal
      If you believe that, no wonder so much insecure stuff is being written. I have been called upon to review code written by developers with your level of knowledge in crypto. They do things like use RSA without proper padding, or use predictable IVs in CBC mode, or fail to properly authenticate the message. They also add totally unnecessary complexity to the system in the mistaken belief that their improvements make it more secure. I shudder when I see a copy of "Applied Cryptography" on the shelves because it is just enough knowledge to be dangerous.

      Even the experts make errors in cryptographic protocol design and implementation - I've been doing this for ten years and I've made at least one howler myself. Why do you think, contrary to the advice of pretty much everyone who really knows their stuff, that people with a couple of weeks' worth of knowledge can get this stuff right?
      • What happens if I use predictable IVs in CBC mode?

        I just finished up a system where I do use (very) predictable IVs in CBC mode (with AES128).

        From what I could tell, an IV really only helps with preventing parallel dictionary attacks, like the ones people use against the UNIX crypt function (in passwd files). Since there won't be more than about 30 things ever encrypted with this key, I figured I didn't need the additional security an IV gives me.

        And besides, the IV has to be in the code or data somewhere, as
        • by Paul Crowley (837)
          IVs are public after encryption. However, in CBC mode an attacker must not be able to predict the IVs before encryption takes place. This is a pain; you're better off using CTR mode, for which your IVs need only be different from each other, and predictability is not a problem. Or better yet, EAX or GCM mode so you get authentication too. Why are you hand-rolling your crypto? Why did you use CBC rather than CTR mode? What are you using for authentication?

          Your guess about what the IV is for is mistaken
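
          A minimal sketch of the CTR construction, with SHA-256 used as a toy keystream generator in place of AES (an illustrative assumption). The counter input is all that varies per block, which is why CTR nonces need only be unique, not unpredictable:

```python
import hashlib

# Minimal CTR-mode sketch. SHA-256 plays the role of the block cipher's
# keystream function here; in a real system this would be AES applied
# to (nonce || counter). This is an illustration, not a real cipher.
BLOCK = 16

def keystream_block(key, nonce, counter):
    # Derive one keystream block from (key, nonce, counter).
    return hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()[:BLOCK]

def ctr_crypt(key, nonce, data):
    # XOR the data with the keystream; the same function both
    # encrypts and decrypts, since XOR is its own inverse.
    out = bytearray()
    for i in range(0, len(data), BLOCK):
        ks = keystream_block(key, nonce, i // BLOCK)
        out.extend(x ^ k for x, k in zip(data[i:i+BLOCK], ks))
    return bytes(out)

key, nonce = b"sixteen byte key", b"unique-nonce"
msg = b"attack at dawn, bring coffee"
ct = ctr_crypt(key, nonce, msg)

# Applying CTR again with the same key and nonce recovers the plaintext.
print(ctr_crypt(key, nonce, ct) == msg)   # True
```

          Note the flip side, raised later in this thread: because each block's keystream is independent, CTR provides no integrity at all; an authentication tag is mandatory.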
          • I'd like to be funny and say it's because I've used OpenSSL, and I am astounded anyone (including myself) is capable of making a system that works with it, let alone one that is secure. It's a complete disaster.

            But really it's because the system I'm using is an embedded system. And by embedded system I don't mean a full-blown PC running Linux; I mean a small system. I was allocated about 4,000 instructions for my crypto work (and an AES ECB accelerator).

            In CTR mode, each encryption doesn't depend on other data
            • by vrt3 (62368)

              In CTR mode, each encryption doesn't depend on other data encrypted before it. Thus someone could change a single 128-bit area in the ciphertext and it wouldn't affect anything else. This greatly reduces the difficulty of changing my ciphertext to alter the plaintext in certain areas without it being detected after decoding.

              I'm a layman in the field, but it just so happens that I just read Practical Cryptography. The book makes it very clear that you should never depend on encryption for checking the a

            • by Paul Crowley (837)
              RC4 and RC5 are very different; all they have in common is a designer and a few ideas.

              As the guy says, don't use CBC for authentication; that's broken. Since you have an AES accelerator EAX mode is probably a good fit; it provides both encryption and authentication.

              You can do it wrong if you prefer, but it should be possible to do it right.
      • I shudder when I see a copy of "Applied Cryptography" on the shelves because it is just enough knowledge to be dangerous.

        Which books would you want to see on someone's bookshelf for you to consider respecting them?
        • by Paul Crowley (837)
          Which books would you want to see on someone's bookshelf for you to consider respecting them?

          Conference proceedings are a good sign, especially with your name on the contents page :-) Seriously, no book on the shelf can be a badge of expertise or otherwise, and I imagine most people who know what they're doing also have a copy of AC. It's just that there are a lot of people who consider themselves qualified to write cryptographic software because they've read it, and that just isn't so. If you're asking
    • Re: (Score:2, Interesting)

      by zolaris (963926)
      Yeah, sure they can get a great understanding of crypto... with inexpensive books. Just curious: do you know how many crypto courses at top-level universities rely on textbooks for teaching crypto? I'd suggest discounting any books where the professor is the author. But even with that, the number will probably be very small. There are recommended books, but in my crypto classes (granted, Johns Hopkins isn't exactly the number one crypto school in the country or world, but I'd like to think we are halfway decent) we
    • Re: (Score:3, Insightful)

      by canuck57 (662392)

      No. It requires reading a couple of good, inexpensive books and understanding of what the heck you're doing.

      That is an understatement.

      Reminds me of the time I watched a finance person use PGP to encrypt a very sensitive file they sent via email. They did everything right except for one critical part.

      After the file was encrypted, they deleted the original one as per instructions. Trouble was, it was in the "Recycle" bin and readable.

  • by njdj (458173) on Sunday September 03, 2006 @06:20AM (#16032088)

    Products that implement cryptography are probably credence goods. It requires expensive and uncommon skills to verify that data is really being protected by the use of cryptography, and most people cannot easily distinguish between very weak and very strong cryptography.

    Can you distinguish, by inspection, between a reliable automobile and a piece of junk that will barely last 2 years? I certainly can't. So I rely on reviews by people I trust when I buy a new car.

    In the field of cryptography there are several people who have written peer-reviewed books about cryptography, are trusted in the community, and who occasionally review products. Bruce Schneier [schneier.com] is one (there are others, use Google; this is not meant to be a puff for Schneier or his company).

    There are also open-source cryptographic programs [gnupg.org], which are peer-reviewed and definitely not snake-oil.

  • by CrazyJim1 (809850) on Sunday September 03, 2006 @06:34AM (#16032108) Journal
    Get creative, use Rot-14 or something.
    • by cortana (588495)
      We need a feature addition to Slashcode that does that repeated-plunging-of-penis-shaped-sound-wave thing into the head of anyone who rehashes the tired, old ROT-13 joke in one of their posts.

      At least '1, 2, ???, profit', 'I, for one...' and 'haha it says nothing to see here OMGWTFBBQAOLCIA' are finally being retired.
    • by kestasjk (933987)
      I know it's a joke, but I thought I'd clear up a common misconception: ROT-13, by definition, isn't encryption. Encryption requires a key to decrypt; you can know the encryption algorithm but can't recover the data without the key. Encoding requires only an algorithm.
      ROT-13 is encoding; ROT by an unknown amount, where the unknown amount is the key, is encryption.
      • by gkhan1 (886823)
        Exactly correct. The name of that algorithm, btw, is a Caesar shift or Caesar cipher, since Caesar (Gaius Julius, that is) used it.
    • Use ROT-12 to decrypt. Well, either that, or use ROT-14 twenty-five more times. It depends how much your time is worth to you.
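
      For the record, the arithmetic works out; a quick sketch of ROT-N, with the shift amount playing the role of the key:

```python
import string

# ROT-N as a keyed Caesar shift: shift every letter N places,
# wrapping around the 26-letter alphabet. N is the "key".
def rot(text, n):
    shifted = string.ascii_lowercase[n:] + string.ascii_lowercase[:n]
    table = str.maketrans(string.ascii_lowercase + string.ascii_uppercase,
                          shifted + shifted.upper())
    return text.translate(table)

ct = rot("nothing to see here", 14)

# ROT-12 undoes ROT-14, because 14 + 12 = 26 (a full cycle).
print(rot(ct, 12))                   # nothing to see here
# Equivalently, shifting by -14 is the same as shifting by 12 (mod 26).
print(rot(ct, -14) == rot(ct, 12))   # True
```

      Twenty-five more applications of ROT-14 also work (26 x 14 is a multiple of 26), though twelve more would already do it.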
  • by owlstead (636356) on Sunday September 03, 2006 @06:34AM (#16032109)
    It's pretty well known that there are many snake oil products that deploy cryptography. Bruce Schneier frequently features snake-oil cryptography products in his newsletter, for instance. And these are just the really obvious ones.

    Some time ago, I tried to evaluate whether an Enterprise Service Bus (intercomponent communication) was fit to be put into a production environment. It said that it had AES encryption built in. When I looked at the manual, it showed a pop-up window where you could choose the key size. It listed exactly the key sizes that were *not* possible for AES. This was a very short evaluation, I can tell you. This also shows a very important thing about cryptography: the algorithms used say very little about the security of an application.
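
    A trivial sanity check anyone can apply: FIPS 197 defines AES for exactly three key sizes, so a product advertising any other "AES key size" is a red flag. (The sample sizes below are hypothetical dialog entries, not taken from the product in question.)

```python
# AES (FIPS 197) is defined only for 128-, 192- and 256-bit keys.
# Anything else in an "AES key size" drop-down is snake oil territory.
AES_KEY_BITS = {128, 192, 256}

def plausible_aes_key_size(bits):
    return bits in AES_KEY_BITS

# Hypothetical values that might appear in a product's key-size dialog:
for bits in (40, 56, 128, 512):
    print(bits, plausible_aes_key_size(bits))
```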

    Generally, the manual section for cryptographic services is easy to find. This is simply because cryptography is added at the end of the development lifecycle, which is logical because cryptography is not part of the main functionality of most applications (e.g. MIME encryption in email products). It's something that was added after the product's main functionality was finished. So just look at the last paragraph, or Appendix Z, and you are looking at it.

    Sometimes it is easy to see why so many products contain bad cryptography. Take XML signatures, for instance. XML signatures themselves contain *references* to the data that is signed and the cryptographic techniques used. If you are to verify an XML digital signature, you *must* check that these have not been altered. Furthermore, you must keep the XML schema definitions on your own disk, and not retrieve them from the internet. Nevertheless, I've not seen any API documentation even mention this rather obvious cryptographic insight. You can rest assured that there will be many implementations that get this wrong.

    Cryptography is hard.

    The real insight of this story is the classification of such products as "credence goods", if you can call this a new insight. Otherwise, it's just stating the well known/obvious.
  • Truecrypt (Score:4, Interesting)

    by urikkiru (801560) on Sunday September 03, 2006 @06:55AM (#16032130) Journal
    This is something I've often considered about commercial encryption software. There's just no way to be sure of their validity, as they are closed-source implementations. Open source solutions like TrueCrypt http://truecrypt.sourceforge.net/ [sourceforge.net] are at least somewhat more trustworthy, in that they can be openly reviewed by anyone. Despite the fact that I know jack all about the specific math behind AES and such, at least I can read some simple explanations of the concepts, read the source, and decide if I want to trust my data to it. Honestly, unless we get down to the fraction of the population that actually does understand these bits at a deep level, that's really the best any of us can do.

    Sure, large clusters of powerful servers working in tandem (or quantum computing) may render the factoring math behind crypto obsolete. A nice thing, though, is that those kinds of solutions are limited to those who can afford them. Still, even if it's all true and I'm wasting my time encrypting things, what better solutions do we have?
    • Re:Truecrypt (Score:4, Interesting)

      by kasperd (592156) on Sunday September 03, 2006 @07:14AM (#16032144) Homepage Journal
      I agree TrueCrypt is well documented, and in addition the source is available. I have the necessary knowledge to actually review such a design, and in the case of TrueCrypt I must say it is not the worst I have seen, but it is certainly not perfect either. There are some subtle watermarking attacks if you can get access to different encryptions of the same sector. Still, in spite of that, I'd much rather rely on TrueCrypt than some closed-source products. So far all storage encryption products I have seen have had some weakness; I'd much rather use one where I know what it is and to what extent it could be a problem to me.
      • There are some subtle watermarking attacks if you can get access to different encryptions of the same sector.

        Care to explain that a little bit further?
        • by kasperd (592156)

          Care to explain that a little bit further?

          In recent versions of TrueCrypt, the encryption is performed using tweakable block ciphers in a way that reuses tweaks. When a sector is written, a sequence of 32 tweaks is used, one for each 16-byte cipher block. If this sequence were used only once, the encryption would be as secure as the underlying cipher. However, the same sequence is used every time a write is performed to the same sector. Thus by looking on two encryptions of the same sector, you can tell exactl
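
          The tweak-reuse weakness can be sketched with a toy deterministic "tweakable" scheme (SHA-256 standing in for a real tweakable cipher; this illustrates the general effect of tweak reuse, not TrueCrypt's actual algorithm):

```python
import hashlib

# Toy deterministic "tweakable" encryption: each 16-byte block is masked
# with a keystream derived from (key, sector number, block index). SHA-256
# stands in for a real tweakable block cipher; this is an illustration of
# what tweak reuse leaks, not any product's real construction.
BLOCK = 16

def encrypt_sector(key, sector, data):
    out = bytearray()
    for i in range(0, len(data), BLOCK):
        tweak = hashlib.sha256(key + sector.to_bytes(8, "big")
                               + (i // BLOCK).to_bytes(4, "big")).digest()[:BLOCK]
        out.extend(x ^ t for x, t in zip(data[i:i+BLOCK], tweak))
    return bytes(out)

key = b"disk-master-key!"
v1 = b"A" * 32                     # sector contents at time 1
v2 = b"A" * 16 + b"B" * 16         # same sector later: second block changed

c1 = encrypt_sector(key, sector=7, data=v1)
c2 = encrypt_sector(key, sector=7, data=v2)

# Without knowing the key, comparing two snapshots of the same sector
# reveals exactly which blocks changed between writes:
print(c1[:16] == c2[:16])   # True  (first block unchanged)
print(c1[16:] == c2[16:])   # False (second block changed)
```

          Because the same (sector, block index) tweak is reused on every write, equal plaintext blocks encrypt identically across snapshots, which is the watermarking leak being described.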

          • by trifish (826353)
            Even more secure solutions using a few percent extra disk space could avoid this weakness entirely.

            Obviously, you don't know much about designing and developing transparent real-time disk encryption software. The two attributes "transparent" and "real-time" rule out any solution that is not 1-to-1 mapped.

            That's why all transparent real-time disk encryption programs (PGPDisk, TrueCrypt, etc.) use and have to use 1-to-1 mapping.
            • Re: (Score:3, Insightful)

              by kasperd (592156)
              The two attributes "transparent" and "real-time" rule out any solution that is not 1-to-1 mapped.
              GBDE proved you wrong a long time ago; it is transparent and does not use such a 1-to-1 mapping. Besides, why do you think the 1-to-1 mapping is necessary at the encryption layer when both the layer beneath it (firmware in the storage device) and the layer above it (the file system) can use something more complicated than a 1-to-1 mapping?
    • by arevos (659374)

      Sure, large clusters of powerful servers working in tandem (or quantum computing) may render the factoring math behind crypto obsolete.

      Raw power alone cannot overcome modern encryption. One either has to find a flaw in the encryption algorithm or the implementation being used, or use something radically different from your run-of-the-mill Turing machine, such as a quantum computer. Otherwise, you simply have too many possibilities to feasibly calculate; even with the fastest computers in the world, the Sun
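
      Back-of-the-envelope arithmetic makes the point; the throughput figures below are generous hypotheticals, and even the smaller 128-bit AES keyspace is far out of reach:

```python
# Rough arithmetic on brute-forcing a 128-bit keyspace. The throughput
# figure (10^12 keys/sec per machine, a billion machines) is a generous
# hypothetical, not a measured number.
keys = 2 ** 128
rate = 10 ** 12 * 10 ** 9              # total keys tried per second
seconds_per_year = 60 * 60 * 24 * 365

# On average you find the key after searching half the space.
years = keys / (2 * rate * seconds_per_year)

print(f"{years:.1e} years")   # on the order of billions of years
```

      That is comparable to the remaining lifetime of the Sun, which is presumably where the comment above was headed.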

  • Snake Oil Warning Signs: Encryption Software to Avoid [interhack.net]

    Last updated 1998, still insightful.
  • Anyone remember the Blitzkrieg server [attrition.org], which was pitched as the solution to all of the world's security needs? The expression Bruce Schneier used was "just too bizarre for words". I don't know if this was an elaborate trolling attempt or an actual honest-to-goodness scam to deceive the terminally dumb, but it's still fun to read, just for the amazing technobabble and ludicrous claims.

  • It isn't a matter of honest vendors. It can generally be assumed that most/all cryptography companies are owned and run by the various security services. For decades a US/Swiss/Israeli firm, Crypto AG [aci.net], sold a cryptology machine with a secret built-in backdoor [spray.se]. At least until Pres. Reagan announced on television that they were reading [orlingrabbe.com] Gaddafi's coded messages.

    There has also been speculation about why Windows requires three unique signing keys [cryptome.org]. The disingenuous reason given being that in case the first one got lost in
  • Try This... (Score:3, Informative)

    by thebdj (768618) on Sunday September 03, 2006 @11:15AM (#16032684) Journal
    If you are truly concerned about the validity of cryptography provided by a vendor, then try to find products that have been certified under the FIPS 140-2 standard [wikipedia.org]. The only problem might be that a lot of those products are usually commercial-grade items meant for use by government agencies; however, some of the items that have received approval are reasonably available to consumers. The products are reviewed by independent labs, and then the CMVP [nist.gov] reviews the labs' results. (The site was down earlier this morning.)

    These products have been reviewed by independent labs, who review their implementation to verify that cryptographic mechanisms are implemented properly. This includes reviewing source code and/or hardware designs. Just a thought for anyone who is truly concerned that their hardware or software be compliant. (Note: If you want a "secure" operating system, look into CC Evaluation.)
  • an old problem (Score:4, Interesting)

    by v1 (525388) on Sunday September 03, 2006 @11:39AM (#16032767) Homepage Journal
    I worked for several years on a programming language called REALbasic. In the later releases that I saw, it featured "encryption". A compiler is basically a tool that takes human-readable commands and turns them into a program that a computer can run. This process is not easily reversible, and once compiled, it's difficult at best to make changes to the program.

    Encryption was added to RB so that you could give away portions of your program's "source code" (the human-readable part) without anyone actually being able to READ it. They could incorporate your source into their new project and use it normally; they just could not read it or make changes to it.

    This sounds like a nice idea, until you realize that when you get someone's "encrypted" source code and add it to your program, the compiler has to be able to read the source code, because it needs to translate it for your new program. This means one thing: the encryption is not secure, because the compiler itself must somehow possess a "master key" of sorts so that it can read the source code to do its thing. So... when you select the module and try to open it to look at it, it's not that the compiler can't read it, it's that it won't read it. A sufficiently skilled programmer could go into the compiler, flip a switch inside it that says "ignore that", and have unrestricted access to the so-called "encrypted" information.

    I assisted with a project where we found out how this information was encrypted. In short, a fixed key was used to encrypt the project data. Then a different fixed key was used to encrypt the passcode you would use to "protect" the project. Thus, the compiler could ask you for the password if you wanted to read your own project, and it could verify you typed in the correct passcode. If you did, it would decrypt the project for you to view. So you see, the compiler does not NEED the passcode, it simply WANTS it.

    It took us about a week to write a program that would read in the projects, decrypt them using the fixed key and completely ignoring the passcode thing, and saved an unprotected naked project file that anyone could edit or view.

    This is probably not too far from the mark on how a LOT of programs "protect your privacy". In reality they are only protecting you from casual inspection. Anyone who really wants your data can get it, all too easily. With any program, be certain that it NEEDS the passcode to unlock your data. If it only WANTS it (is there a password reset option available?), then you know it's "security through obscurity", and we know how totally worthless that is.

    You thought your windows or OS X keychain was secure? You have auto login turned on? Does the computer need your password? Think about it.
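
    A toy reconstruction of the NEEDS-versus-WANTS distinction (the XOR "cipher" and all names here are illustrative assumptions, not REALbasic's actual scheme):

```python
import hashlib

# Toy model of the flaw described above: the project is encrypted under a
# FIXED key baked into the compiler; the user's passcode is stored (hashed)
# only so the UI can refuse to *display* the source. The XOR "cipher" and
# all names are illustrative assumptions.
FIXED_KEY = b"baked-into-the-compiler"

def xor_fixed(data):
    # Repeating-key XOR with the fixed key; applying it twice is identity.
    return bytes(b ^ FIXED_KEY[i % len(FIXED_KEY)] for i, b in enumerate(data))

def protect(source, passcode):
    # The passcode never touches the encryption; it only sets a gate.
    return {"blob": xor_fixed(source), "gate": hashlib.sha256(passcode).digest()}

def open_project(proj, passcode):
    if hashlib.sha256(passcode).digest() != proj["gate"]:
        raise PermissionError("wrong passcode")  # the tool merely WANTS this
    return xor_fixed(proj["blob"])

proj = protect(b"Dim x As Integer", b"s3cret")

# An attacker who knows the fixed key simply skips the gate:
stolen = xor_fixed(proj["blob"])
print(stolen == b"Dim x As Integer")   # True: the passcode was never needed
```

    The tell, as the parent says, is that decryption is possible without the passcode at all; a password-reset option is the giveaway that the program only WANTS it.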

    • by jimicus (737525)
      Wouldn't it have been simpler to distribute unlinked object code that the compiler could link in when needed?
  • by Coward Anonymous (110649) on Sunday September 03, 2006 @02:05PM (#16033304)
    One of the major perils facing a would-be crypto user is himself. Many people think they know it all (as evidenced in many of the posts to this article) and therefore dictate insecure and plain silly design choices when deploying a secure solution in a non-trivial environment (for anything: authentication, the crypto itself, access enforcement, etc.).
    For the vendor this creates a conflict. On the one hand, you want to satisfy the customer's request. On the other hand, you know your customer is shooting himself in the foot and very possibly becoming a vendor reputation problem later on down the line.
    In my experience, most customers are accustomed to being "always right" and fail to recognize that crypto/security may be one of those things they simply do not know enough about, and that they should let the vendor help them. It is often the case that the vendor can explain, evangelize and detail the very attack the customer is opening himself up to, with little or no effect: the customer is convinced they know it all.
  • With the prevalence of keystroke loggers on Windows boxes, is the typical home user better off having encrypted/stored passwords for website forms -- or better off crossing his fingers and typing them manually dozens of times per day?

    Discussing encryption under ideal circumstances is like saying birth control pills are 99.97% effective when taken as directed in laboratory studies.
