Crypto Guru Bruce Schneier Answers

Most of the questions we got for crypto guru Bruce Schneier earlier this week were pretty deep, and so are his answers. But even if you're not a crypto expert, you'll find them easy to understand, and many of Bruce's thoughts (especially on privacy and the increasing lack thereof) make interesting reading even for those of you who have no interest in crypto because you believe you have "nothing to hide." This is a *long and strong* Q&A session.

First Bruce says, by way of introduction...

"I'd like to start by thanking people for sending in questions. I enjoyed answering all of them.

"I've written on many of these topics before, and often I will point to existing writings on my Web site or in my Crypto-Gram newsletter. This isn't to be annoying; it just seems useful to point to things I have already written. I urge anyone interested to sign up for a free subscription to Crypto-Gram. I write it monthly, and I regularly answer questions such as these, or write about topics in the news (especially ones that have been reported on badly). To subscribe, visit my Web site at www.counterpane.com/."

ryanr asks:
I've heard you say many times that unless a particular crypto alg. has undergone lots of public review, it should not be considered safe. Unless possibly it's from the NSA. (Excluding, of course, the NSA stuff that is INTENTIONALLY backdoored.)

The implication there is that the NSA has applied so many resources to the crypto problems that they are as good as the rest of the cryptographers put together.

My question is: Do you really think that a private process, no matter how many resources applied, can equal the public process?

ANSWER:
Yes. One way of looking at a public process is as a large and distributed private process. If the NSA collected all of the academic cryptographers, gave them a clearance, and locked them in a basement somewhere, it would become a private process. The real issue is whether or not the NSA has equivalent expertise to the public academic community, and whether it can apply that expertise in an effective manner.

The NSA has a lot more cryptographic experience in the narrow fields of making and breaking algorithms. They have been doing this, and nothing else, for decades. I don't believe that they have much expertise in weird digital signature schemes, or zero-knowledge protocols, or even more bizarre electronic commerce and voting schemes, because they aren't really of practical interest. But the NSA certainly has a very strong practical interest in algorithm design and analysis, to a much greater degree than we in the public community do. And they have seen a lot more ciphers: both designs that they have proposed internally and designs in production systems that they have tried to attack.

The NSA also has the ability to target its analysis resources. The public academic community is scattershot. We work on what interests us, and we are each interested by different things. A director at the NSA has the ability to take the top ten cryptanalysts in the building and say: "You. Go into that room and don't come out until you've broken RC4. I don't care if it takes two years." That ability to direct resources at particular problems gives them an edge that we don't have.

But how much of an edge? Until recently, I would have stated unquestionably that the NSA is a decade ahead of the state of the art in cipher design and analysis. Now, I'm not so sure.

Over the past five years, there has been a lot of open research in cryptography. We have discovered many different types of attacks, and have learned a lot about how to design ciphers. The best and brightest of the cryptographers are staying in the open academic community, and are not being swallowed up by the NSA (or by its counterparts in other countries). There is a vibrant academic community in cryptography; people can exchange ideas, share research, build on each other's work. We've seen attacks against the NSA-designed algorithm Skipjack that almost certainly were not known by the NSA. (See http://www.counterpane.com/crypto-gram-9807.html#skip for Skipjack information, and http://www.counterpane.com/crypto-gram-9809.html#impossible for information on impossible-differential cryptanalysis.) We've seen other attacks that, I believe, were not known by the NSA. (See http://www.counterpane.com/mod3.html for more information.) The public research community is now doing cutting-edge research in cryptography.

Now this doesn't mean we are better than they are. Certainly the NSA knows more about cryptography than the public community does. They read everything we publish, and we read nothing that they publish. Almost by definition, they know what we do. That imbalance alone will always give them an edge in knowledge. But I think that edge is closing rapidly.

And on a related topic, I don't think the recent press flap about the NSAKEY means that the NSA has a backdoor in Microsoft Windows. I wrote about this at http://www.counterpane.com/crypto-gram-9909.html#NSAKeyinMicrosoftCryptoAPI. But I do think that the NSA deliberately puts back doors in products; see http://www.counterpane.com/crypto-gram-9902.html#backdoors for some details.

Sajma asks:
Your book describes a slew of interesting applications for crypto protocols, including electronic money orders, digital time-stamping, and secure multi-party computation. What are the remaining crypto problems of interest to the general public which have not been solved? (secure distribution of digital media comes to mind -- can you sell someone a music file, allow them to use the file anywhere, but make sure no one else can use it?)

- SEE NEXT QUESTION -

randombit asks:
OK, hypothetical question. You rub a magic lamp, and a genie comes out. Specifically, a cryptographic protocol genie. He can come up with an efficient, secure protocol for any activity you want (assuming a protocol is possible, of course). What would you pick, and more importantly, why?

ANSWER:
Two questions; one answer. We actually have all the protocols we need. It's true that I described all sorts of interesting protocols in _Applied Cryptography_. The reality is that none of them is actually useful. What is useful are the few simple primitives -- signatures, encryption, authentication -- and the different ways to mirror real-life trust models using them. These protocols are simpler, easier to understand, and more useful.

The real problem with protocols, and the thing that is the hardest to deal with, is all the non-cryptographic dressing around the core protocols. This is where the real insecurities lie. Security's worst enemy is complexity.

This might seem an odd statement, especially in the light of the many simple systems that exhibit critical security failures. It is true nonetheless. Simple failures are simple to avoid, and often simple to fix. The problem in these cases is not a lack of knowledge of how to do it right, but a refusal (or inability) to apply this knowledge. Complexity, however, is a different beast; we do not really know how to handle it. Complex systems exhibit more failures as well as more complex failures. These failures are harder to fix because the systems are more complex, and before you know it the system has become unmanageable.

Designing any software system is always a matter of weighing and reconciling different requirements: functionality, efficiency, political acceptability, security, backward compatibility, deadlines, flexibility, ease of use, and many more. The unspoken requirement is often simplicity. If the system gets too complex, it becomes too difficult and too expensive to make and maintain. Because fulfilling more of the other requirements usually involves a more complex design, many systems end up with a design that is as complex as the designers and implementers can reasonably handle. (Other systems end up with a design that is too complex to handle, and the project fails accordingly.)

Virtually all software is developed using a try-and-fix methodology. Small pieces are implemented, tested, fixed, and tested again. Several of these small pieces are combined into a larger module, and this module is tested, fixed, and tested again. The end result is software that more or less functions as expected, although we are all familiar with the high frequency of functional failures of software systems.

This process of making fairly complex systems and implementing them with a try-and-fix methodology has a devastating effect on security. The central reason is that you cannot easily test for security; security is not a functional aspect of the system. Therefore, security bugs are not detected and fixed during the development process in the same way that functional bugs are. Suppose a reasonable-sized program is developed without any testing at all during development and quality control. We feel confident in stating that the result will be a completely useless program; most likely it will not perform any of the desired functions correctly. Yet this is exactly what we get from the try-and-fix methodology with respect to security.

The only reasonable way to "test" the security of a system is to perform security reviews on it. A security review is a manual process; it is very expensive in terms of time and effort. And just as functional testing cannot prove the absence of bugs, a security review cannot show that the product is in fact secure. The more complex the system is, the harder a security evaluation becomes. A more complex system will have more security-related errors in the specification, design, and implementation. We claim that the number of errors and difficulty of the evaluation are not linear functions of the complexity, but in fact grow much faster.

For the sake of simplicity, let us assume the system has n different options, each with two possible choices. Then there are n(n-1)/2 -- roughly n^2/2 -- different pairs of options that could interact in unexpected ways, and 2^n different configurations altogether. Each possible interaction can lead to a security weakness, and the number of possible complex interactions that involve several options is huge. We therefore expect that the number of actual security weaknesses grows very rapidly with increasing complexity.
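To make those growth rates concrete, here is a toy Python calculation (nothing more than the combinatorics above, evaluated for a few values of n):

    from math import comb

    # For a system with n independent binary options:
    #   two-option interactions to check: C(n, 2) = n*(n-1)/2, roughly n^2/2
    #   complete configurations: 2^n
    for n in (10, 20, 40):
        print(f"n={n:2d}: {comb(n, 2):4d} option pairs, {2**n:,} configurations")

For n = 40, that is 780 pairwise interactions to review but about 10^12 complete configurations -- checking them all is hopeless.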

The increased number of possible interactions creates more work during the security evaluation. For a system with a moderate number of options, checking all the two-option interactions becomes a huge amount of work. Checking every possible configuration is effectively impossible. Thus the difficulty of performing security evaluations also grows very rapidly with increasing complexity. The combination of additional (potential) weaknesses and a more difficult security analysis unavoidably results in insecure systems.

In actual systems, the situation is not quite so bad; there are often options that are "orthogonal" in that they have no relation or interaction with each other. This occurs, for example, if the options are on different layers in the communication system, and the layers are separated by a well-defined interface that does not "show" the options on either side. For this very reason, such a separation of a system into relatively independent modules with clearly defined interfaces is a hallmark of good design. Good modularization can dramatically reduce the effective complexity of a system without the need to eliminate important features. Options within a single module can of course still have interactions that need to be analyzed, so the number of options per module should be minimized. Modularization works well when used properly, but most actual systems still include cross-dependencies where options in different modules do affect each other.

A more complex system loses on all fronts. It contains more weaknesses to start with, it is much harder to analyze, and it is much harder to implement without introducing security-critical errors in the implementation.

This increase in the number of security weaknesses interacts destructively with the weakest-link property of security: the security of the overall system is limited by the security of its weakest link. Any single weakness can destroy the security of the entire system.

Complexity not only makes it virtually impossible to create a secure system, it also makes the system extremely hard to manage. The people running the actual system typically do not have a thorough understanding of the system and the security issues involved. Configuration options should therefore be kept to a minimum, and the options should provide a very simple model to the user. Complex combinations of options are very likely to be configured erroneously, resulting in a loss of security. There are many stories throughout history that illustrate how management of complex systems is often the weakest link.

I repeat: security's worst enemy is complexity. The most serious protocol problem is how to deal with complex protocols (or how to strip them down to the bone).

Get Behind the Mule asks:
Bruce, thanks very much for making cryptography so much more accessible to us all.

You wrote in Applied Cryptography that IDEA was your "favorite" symmetric cipher at the time. Is that still true today?

ANSWER:
It depends what you mean by "favorite." If I needed a secure symmetric algorithm for a design, and performance were not an issue, I would choose triple-DES. No other algorithm has been as well-studied, so nothing can compare in confidence.

The problem is that triple-DES is slow; on a 32-bit microprocessor it encrypts data at a rate of 108 clock cycles per byte. (You have to remember that DES was designed in the mid-1970s for discrete hardware. It is very slow on 32-bit microprocessors.) If I need a faster algorithm, I would use Blowfish. Blowfish encrypts data at a rate of 18 clock cycles per byte. Information on Blowfish is at http://www.counterpane.com/blowfish.html.
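To give a feel for what cycles-per-byte figures mean in practice, here is a toy Python conversion (the cycle counts are the ones quoted above; the 200 MHz clock is an assumed figure, chosen only for illustration):

    # Convert cycles/byte into throughput at an assumed clock speed.
    CLOCK_HZ = 200_000_000  # hypothetical 200 MHz CPU
    for name, cpb in (("Triple-DES", 108), ("Blowfish", 18)):
        print(f"{name}: ~{CLOCK_HZ / cpb / 1e6:.1f} MB/s")

At that clock speed, triple-DES manages roughly 1.9 MB/s while Blowfish manages roughly 11 MB/s -- a factor-of-six difference that matters for bulk encryption.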

Neither is my favorite algorithm. Currently, my favorite algorithm is Twofish. Twofish is our submission to AES. It is still too new to use operationally, but I hope it will see wide use as people analyze it and as confidence grows in its security. Information on Twofish is at http://www.counterpane.com/twofish.html. It's even faster than Blowfish, and (I think) much better designed.

Faster algorithms are more problematic. I don't really like RC4. SEAL is better, but patented by IBM. I don't care for WAKE. I would probably use one of Belgian cryptographer Joan Daemen's designs.

I don't recommend IDEA anymore for several reasons. One, it isn't very fast; on a 32-bit microprocessor it encrypts data at a rate of 50 clock cycles per byte. Two, IDEA is patented, and the terms change regularly. Also, attacks against IDEA have steadily eaten away at the security margin. IDEA has eight rounds, and the current best attack breaks 4.5 rounds. There are still no attacks against the full eight-round cipher, and there is no reason to believe that any are possible. Still, since there are algorithms with much better performance, it seems improper to suggest IDEA.

Speed comparisons of other algorithms can be found at http://www.counterpane.com/speed.html. A detailed paper comparing performance of the AES candidates can be downloaded at http://www.counterpane.com/aes-performance.html. And for a current summary of attacks against various algorithms, see http://www.ii.uib.no/~larsr/bc.html.

Remember, though -- breaking the cryptographic algorithm is almost never the way to attack a security product. There is almost always an easier way to break the security. I've written about this extensively; see http://www.counterpane.com/whycrypto.html and http://www.counterpane.com/pitfalls.html in particular.

Tet asks:
Scott McNealy claims we've already fought and lost the war for personal privacy. Do you agree with him or not, and why?

ANSWER:
One hundred years ago, everyone could have personal privacy. You and a friend could walk into an empty field, look around to see that no one else was nearby, and have a level of privacy that has forever been lost to today's technology. The framers of the Constitution never explicitly put a right to privacy into the document; it never occurred to them that it could be withheld. The ability to have a private conversation, like the ability to keep your thoughts in your head and the ability to fall to the ground when pushed, was a natural consequence of the world. When the Supreme Court found a right to privacy in the Constitution, it was because the language of the Constitution assumed its existence.

Technology has demolished that worldview. Powerful directional microphones can pick up conversations hundreds of yards away. Pinhole cameras -- now being sold over the Internet -- can hide in the smallest cracks; satellite cameras can read the time on your watch from orbit. And the Defense Department is prototyping micro-air vehicles, the size of small birds or butterflies, that can scout out enemy snipers, locate hostages in occupied buildings, or spy on just about anybody.

In the aftermath of the terrorist takeover of the Japanese embassy in Peru, news reports described audio bugs hidden in shirt buttons that allowed police to pinpoint everyone's location. Van Eck devices can read what's on your computer monitor from halfway down the street. (I heard that the CIA demonstrated this for Scott McNealy at Sun; they captured his password from a van in the company's parking lot.) Lasers bounced off windows can reveal the Doppler effect of the compression and rarefaction of air by sound waves, and eavesdrop on conversations happening on the other side. An attacker who can plug into your power line can do the same from even farther away. Purchased anything lately? Unless you paid cash, the what, where, and when are recorded in a database. And in many stores, a security camera has recorded your presence while the helpful sales clerk captures your name and personal information.

The ability to trail someone remotely has existed for a while, but it is only used in exceptional circumstances. In 1993, Colombian drug lord Pablo Escobar was found partly by tracking him through his cellular phone usage. Timothy McVeigh's truck was found because the FBI collected the tapes from every surveillance camera in the city, correlated them by time (presumably the explosion acted as a great synch pulse), and looked for it. During Desert Storm the U.S. dropped thousands of miniature robots -- millimeters in diameter -- on Iraq that looked for signs of biological warfare.

The technology to automatically search for drug negotiations in random telephone conversations, for suspicious behavior in satellite images, or for faces on a "wanted list" of criminals in on-street cameras isn't here yet, but it's just a matter of time. Face-recognition software can pick individual faces out of a crowd. Voice recognition will soon be able to scan millions of telephone calls, listening for a particular person; it can already scan for suspicious words or phrases. Moore's Law, which says the industry can double the computing power of a microchip every 18 months, affects surveillance computing just as it does everything else: the next generation will be smaller, faster, and a lot cheaper. As soon as the recognition technologies can find the people, the computers will be able to do the searching automatically.

At the same time, the fear of crime is facilitating a great deal of surveillance, not all of it instigated by the police. Some U.S. airports automatically record the license plates of anyone coming onto airport property, even if it is just to pick up someone. Some cities are installing directional microphones to pinpoint gunfire; others are setting video cameras on lampposts to deter crime. It's getting difficult to walk into a store without being videotaped. Timothy McVeigh couldn't drive a truck through downtown Oklahoma City without it showing up on an in-store surveillance camera, and these cameras were positioned to protect the store, not to track goings-on outside the windows.

The U.S. is initiating a program called "computer-assisted passenger screening," or CAPS. The idea is to match commercial air travelers against profiles of evildoers, using such items as the traveler's address, credit card number, destination, whether or not he is traveling alone, whether the ticket was paid in cash, when the ticket was purchased, whether it was one-way or round trip, and about three dozen other factors that are being kept secret. Needless to say, groups like the ACLU have objected to stopping and searching people based on stereotypes. Not to mention that the data is saved, just in case the government needs to peek into people's pasts. No warrant required, of course.

More is coming. Out of concern for public safety, the FCC has ruled that by 2001, cellular and PCS companies must be able to locate users who dial 911 to within a radius of 125 meters. Consumers will foot this bill through a user tax, and you can be sure that wireless operators will introduce a plethora of other services based on this technology. The companies are probably going to use the cellular technology to locate people, although if they can wait a couple of technological generations they can drop miniature GPS receivers in the phones and do even better. One way or another, people will end up carrying technology that allows them to be digitally tailed. And currently, no warrant is required.

The surveillance infrastructure is being installed in our country under the guise of "customer service." Some hotels track guest preferences in international databases, so that customers will feel at home even if it is their first stay in a particular city. Caterpillar Corporation is installing diagnostic chips into all new farm machinery. These chips alert the local dealer, via satellite, when a part is failing. The dealer can then drive to the farm with a replacement, often before the machine has even broken down. This is great; I'll bet farmers really like the prompt service and the reduction in downtime. But the same technology can be used for other, less benign, purposes.

Automobile surveillance is almost automatic. Rental cars, equipped with GPS navigational systems, can keep a complete record of exactly where that car has been. Mercedes Benz is planning on embedding a Web server into its cars, so that technicians can spot service problems remotely. At least two companies plan on marketing a smart car locator that uses a GPS receiver and a cellular phone to alert the authorities to your whereabouts in case of an emergency. It only takes a slight modification to allow the locator to work automatically when queried by the police. Lojack, the device that can track your car if it has been stolen, can also be used for surveillance. Will net-connected smart cars give police the ability to track everybody in the country simultaneously? Already systems like Lojack do this, as do car phones.

GPS is a dream technology for surveillance. One company is selling an automatic warehouse inventory system, using GPS and affixable transmitters on objects. The transmitters broadcast their location, and a central computer keeps track of where everything is. Spies have probably been able to use this kind of stuff for years, but it's now becoming a consumer item.

Individual privacy is being eroded from a variety of directions. Most of the time the erosions are small, and no one kicks up a fuss. But there is less and less privacy available, and most people are completely oblivious of it. It is very likely that we will soon be living in a world where there is no expectation of privacy, anywhere or at any time.

rise asks:
As one of the stronger voices behind the proposition that only peer reviewed, open, and thoroughly tested algorithms can be trusted you've widely disseminated several algorithms, Solitaire and Yarrow among them. What attacks or interesting analyses have surfaced since their release?

ANSWER:
For those who want to know what he is asking about, you can read about Solitaire at http://www.counterpane.com/solitaire.html, and about Yarrow at http://www.counterpane.com/yarrow.html. And you can read about my position on the importance of using a public, peer-reviewed algorithm at http://www.counterpane.com/crypto-gram-9904.html#different, my snide comments about proprietary cryptography at http://www.counterpane.com/crypto-gram-9902.html, and my dismissal of cracking contests at http://www.counterpane.com/crypto-gram-9812.html#contests.

There has been some excellent analysis of Solitaire by Paul Crowley. He has posted his results to the sci.crypt newsgroup, and you can look them up. Briefly, he found a bug in the code and a problem with the algorithm. I will fix the bug in the code as soon as I get around to it, but the problem with the algorithm is more disturbing. We hope to write a joint paper documenting the problem, and proposing a fix.

I don't think it is a problem operationally, though. Solitaire is a pencil-and-paper cipher designed for very short messages, and the attack will require a lot of ciphertext. Still, it is a problem and one that I should fix.

As to Yarrow, I don't know of any outside cryptanalysis. I'd like to see some.

Thagg asks:
I bought the first edition of your Applied Cryptography, and you say two things in it that bother me with respect to your submission of Twofish as a Federal standard for encryption.

In the foreword, you describe how you got interested in cryptography, and that you had no background or training in the field, but you thought it was interesting. Also, several times throughout the book you caution people not to trust cryptosystems from amateurs.

Clearly you have become well versed in the history and application of cryptography; your book makes all other descriptions of the state of the art invisible by comparison. Still, it appears to me that cryptosystem design and analysis requires fairly extreme mathematical proficiency, which I do not believe that you have.

Now, of course, Twofish is published in detail, and the best people in the world have attempted to crack it (and I think that the competitive process that the US Gov't has promoted is a spectacular way to get the best people to attack each other's ciphers). But I remain somewhat worried about the foundations of Twofish: is there something missing that a Ph.D. in mathematics and number theory would have seen?

The winner of this competition will likely be the next DES, and will provide security for a fairly large percentage of the planet. The stakes are high. I'm sure that you have an answer to this criticism, and I'm eager to hear it.

ANSWER:
Certainly you should not trust cryptographic algorithms designed by people who have no experience designing and analyzing cryptographic algorithms. The question you ask is different. You are asking if a Ph.D. in mathematics and number theory gives someone any special insights that someone without the Ph.D. would miss. I believe that cryptographic skill is learned through both training and experience, and that someone with a Ph.D. is not automatically a better cryptographer.

Cryptography is interesting, because there are no absolute metrics. Anyone can design an algorithm that he himself cannot break. This means that anyone, from the best cryptographer to the sorriest man on the street, can design a cryptographic algorithm that works: that encrypts and decrypts data properly, and that the designer cannot break. The false reasoning that often follows is this: "I can't break it, therefore it is secure." The first question that anyone else should ask is: "You say you can't break it; well, who the hell are you?" More on this topic can be found at http://www.counterpane.com/crypto-gram-9810.html#cipherdesign.

The experience of the designers is something that I look at very carefully when I evaluate an algorithm. I can't devote the months and years necessary to convince myself that an algorithm is secure, so I want to know about the people who are convinced. And I don't look at their academic degrees; I look at what else they have broken.

The Twofish team has dozens of published cryptanalytic attacks, breaking all kinds of ciphers. (A list of Counterpane papers can be found at http://www.counterpane.com/publish.html, and David Wagner's published papers can be found at http://www.cs.berkeley.edu/~daw/.) These are impressive results: mod n cryptanalysis, boomerang attacks, slide attacks, side-channel cryptanalysis, related-key differential cryptanalysis, and attacks against Skipjack, Speed, Akelarre, RC5a, CMEA, ORYX, TwoPrime, etc., etc., etc. Interestingly enough, all five AES finalists have been designed by teams that have a similarly impressive list of published cryptanalytic attacks. With a couple of exceptions, none of the non-finalists have any cryptanalysts on their teams.

Another thing to look at is the quality of the designer's analysis. I like designs that have long and detailed documents that discuss how the designers have attacked their own design. You can see this in the submissions for Twofish, and for Mars, RC6, and E2. I worry about a cipher like Serpent that does not come with any analysis. Either the designers didn't do any, which is bad -- or they did it and are hiding it, which is worse.

I think these things speak more to the strength of the design than academic degrees.

In fact, I have seen many systems designed by Ph.D. mathematicians with little cryptographic experience, that have been quickly broken. Experience in cryptography is much more important than experience in general mathematics.

It is certainly possible that there are attacks against an algorithm that the designers missed. This is why AES is a public process. Before AES is chosen, dozens of people with Ph.D.s in mathematics will be performing their own analyses on the submissions. If Twofish is chosen, it will be because none of those Ph.D.s have found any weaknesses.

But if you want Ph.D.s on the Twofish team, co-designer Doug Whiting has a Ph.D. in computer science from Caltech. His dissertation was on building Reed-Solomon error-correcting codes in VLSI, so it had a heavy math content.

Enoch Root asks:
It was noted in your biography that you hold a degree in Physics in addition to your M.S. in Computer Science. This seems to be a developing trend in IT, as many Physics graduates turn to CS. Neal Stephenson undertook studies in Physics before becoming a writer. I am myself a physics graduate turned computer geek.

What impact do you think your science studies have on your current career? I suspect the high mathematical background of physics prepared you for cryptology, but what other aspects of a science degree come into play in your line of work? Would you call your B.S. in Physics an advantage or a disadvantage?

ANSWER:
The unfortunate answer is that it wasn't very relevant. It was neither an advantage nor a disadvantage, although it was harder to get a job out of college with a physics degree. Physics teaches mathematics, and that was helpful.

If you want to become a cryptographer, study mathematics and computer science. I wrote an essay on this very topic; see http://www.counterpane.com/crypto-gram-9910.html#SoYouWanttobeaCryptographer.

Hobbex asks:
One would think that cryptographers, who study the mathematical means for controlling information (not just secrecy, but also signatures, zero knowledge proofs etc) would be the least inclined to support the artificial limits to information set up by our legal system, and yet the field is littered with patents (probably more so than any other field of mathematics).

You, on the other hand, have been very generous with your algorithms and cryptos. Is there a political, ideological, or practical reason behind this?

ANSWER:
It is impossible to make money selling a cryptographic algorithm. It's difficult, but not impossible, to make money selling a cryptographic protocol.

Look at algorithms first. There are free encryption algorithms all over the place: triple-DES, Blowfish, CAST, most of the AES submissions. Look again at the URLs attached to question 6. It makes sense for a designer to use one of these public algorithms. If I patented Blowfish, no one would have used it. No one would have analyzed it. (Why would they do work for free, if I were making money off of it?) Only because Blowfish is free is it in over 100 products. (For a list, see http://www.counterpane.com/products.html.) If I patented it and charged for it, it would be much less widely used. IDEA is a good example of this. IDEA could have been everywhere; for a while it was the only trusted DES replacement. But it was patented, and there were licensing rules. As a result, IDEA is barely anywhere. SEAL is a great-looking stream cipher. But because IBM has a patent on it, no one uses it.

The early public-key patents are the only exception to this. Because the patents controlled the concept of public-key cryptography, there was no way to do public-key encryption, key exchange, or digital signatures without licensing the patents. This exception is over; the Diffie-Hellman patents have expired.

Regularly I hear from algorithm inventors who want to patent their new cool algorithm and then sell it. This business plan has absolutely zero percent chance of succeeding. I recommend that people give their cryptography away, and use the PR benefit to make a living. If the IDEA patent holders did that, they would be much better off.

Protocols are a little bit different. You can patent a protocol and turn it into a useful business. It's hard; most of the time a competitor can engineer around a protocol patent. But it is possible, and many companies are making a go at marketing patents in authentication, certificate revocation, digital content protection, etc. I'm not thrilled by this, but it's the reality of American business today.

aheitner asks:
As many know, your Twofish algorithm is one of the (many) submissions to become the AES standard. The goal for these algorithms is to be able to implement them extremely cheaply in hardware -- say on a 6800 with 256 bytes of RAM. In other words, cheaply enough to put on a smart card.

But IBM's team alleges that any algorithm that simple can be fairly easily cracked by doing a power usage analysis on the chip (by watching fluctuations in the electrical contacts with the reader) and that the necessary equipment to protect against power analysis would be equivalent to a much more complex processor -- so much so you might as well just implement a different and more complex (and hopefully power-random) algorithm. Of course IBM suggests their own implementation.

What do you think? Is there a way to build a simple smart card so that power analysis isn't a problem? Perhaps the whole question will become irrelevant since we'll be carrying around so much processing power in our PDAs that we'll just use them?

ANSWER:
Power analysis involves breaking a cryptographic algorithm by looking at the power trace from a chip executing that algorithm. This is a specific case of what I call "side-channel attacks." I wrote about side-channel attacks at http://www.counterpane.com/crypto-gram-9806.html#side, and some excellent information on power analysis can be found at http://www.cryptography.com/dpa/index.html.

I don't believe it is possible to create a cryptographic algorithm that, because of its mathematics, is immune from side-channel attacks.

Any cryptographic primitive, such as a block cipher or a digital signature algorithm, can be thought of in two very different ways. It can be viewed as a mathematical object; typically, a function taking an n-bit input and producing an m-bit output. Alternatively, it can be viewed as a concrete implementation of that mathematical object. Traditionally, cryptanalysis has been directed solely against the mathematical object, and the resultant attacks necessarily apply to any concrete implementation. The statistical attacks against block ciphers -- differential and linear cryptanalysis -- are examples of this; these attacks will work against DES regardless of which implementation of DES is being attacked.

In the last few years, new kinds of cryptanalytic attacks have begun to appear in the literature: attacks that target specific implementation details. Both timing attacks and differential fault analysis make assumptions about the implementation, and use additional information garnered from attacking certain implementations. Failure analysis assumes a one-bit feedback from the implementation -- was the message successfully decrypted -- in order to break the underlying cryptographic primitive. Related-key cryptanalysis also makes assumptions about the implementation, in this case about related keys used to encrypt different texts. Side-channel attacks are a generalization of this idea.

These attacks don't necessarily generalize -- a fault-analysis attack just isn't possible against an implementation that doesn't permit an attacker to create and exploit the required faults -- but can be much more powerful. For example, differential fault analysis of DES requires between 50 and 200 ciphertext blocks (no plaintext) to recover a key.

A side-channel attack occurs when an attacker is able to use some additional information leaked from the implementation of a cryptographic function to cryptanalyze the function. Clearly, given enough side-channel information, it is trivial to break a cipher. An attacker who can, for example, learn every input into every S-box in every one of DES's rounds can trivially calculate the key. What is surprising is how little side-channel information is necessary to break an algorithm. I have published a paper on side-channel attacks against block ciphers at http://www.counterpane.com/side_channel.html.
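As a toy illustration of how little it can take (a hypothetical example, not one from the paper above): a comparison routine that returns at the first mismatched byte leaks, through its running time, how many leading bytes of a guess are correct, letting an attacker recover a secret one byte at a time. A comparison that always examines every byte closes that particular channel:

    import hmac

    def leaky_equal(a: bytes, b: bytes) -> bool:
        # Returns at the first mismatch, so the running time reveals
        # how many leading bytes of the guess are correct: a timing
        # side channel.
        if len(a) != len(b):
            return False
        for x, y in zip(a, b):
            if x != y:
                return False
        return True

    def constant_time_equal(a: bytes, b: bytes) -> bool:
        # Examines every byte regardless of where mismatches occur,
        # so timing carries essentially no information about the secret.
        return hmac.compare_digest(a, b)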

You can't design the math to solve this problem. You can try to design the hardware so that side-channel information does not leak; this is far more difficult than it appears. Or you can design your cryptographic protocols so that side-channel attacks don't matter. This makes the most sense. Choosing AES based on side-channel resistance is short-sighted.

A long digression on AES, for those who haven't been following the process and for those who care about the outcome: AES is the Advanced Encryption Standard, the new encryption algorithm that will replace DES. NIST is in the process of defining a new encryption standard, which will have a longer key length (128-, 192-, and 256-bit), larger block size (128-bit), be faster than DES, be patent-free, and hopefully will remain strong for a long, long time.

The process is an interesting one. In 1997, NIST sent out a request for candidate algorithms. They received fifteen submissions by the June 1998 deadline, five from the U.S. and ten from other countries. In August, we (my own algorithm, Twofish, was one of the submissions) presented our algorithms to the world at the First AES Candidate Conference.

There was a Second AES Candidate Conference in Rome in March 1999, where people presented analyses of the algorithms. NIST chose five finalist algorithms this summer: Mars, RC6, Rijndael, Serpent, and Twofish. There will be a third AES Candidate Conference in New York in April 2000. NIST is also accepting public comments on the algorithms through 15 April 2000. Finally, NIST will choose an algorithm to become the standard (or perhaps more than one algorithm).

Cryptographers are busily analyzing the submissions for security. It's tempting to think of the process as a big demolition derby: everyone submits their algorithms and then attacks all the others...the last one standing wins. Really, it won't be like that. I strongly believe that at the end of the process most of the candidates will be unbroken. The winner will be chosen based on other factors: performance, flexibility, suitability.

This means that we need your input into this process. I know you're not cryptographers, and you won't be able to comment on the mathematics of the various submissions. But you can comment on your encryption requirements, and whether the algorithms will suit your needs.

AES will have to work in a variety of current and future applications, doing all sorts of different encryption tasks. Specifically:
  • AES will have to be able to encrypt bulk data quickly on top-end 32-bit CPUs and 64-bit CPUs. The algorithm will be used to encrypt streaming video and audio to the desktop in real time.
  • AES will have to be able to fit on small 8-bit CPUs in smart cards. To a first approximation, all DES implementations in the world are on small CPUs with very little RAM. DES is in burglar alarms, electricity meters, pay-TV devices, and smart cards. Sure, some of these applications will get 32-bit CPUs as those get cheaper, but that just means that there will be another set of even smaller 8-bit applications.
  • AES will have to be efficient on the smaller, weaker, 32-bit CPUs. Smart cards won't be getting Pentium-class CPUs for a long time. The first 32-bit smart cards will have simple CPUs with a simple instruction set. 16-bit CPUs will be used in embedded systems that need more power than an 8-bit CPU, but can't afford a 32-bit CPU.
  • AES will have to be efficient in hardware, in not very many gates. There are lots of encryption applications in dedicated hardware: contactless cards for fare payment, for example.
  • AES will have to be key agile. There are many applications where small amounts of text are encrypted with each key, and the key changes frequently. This is a very different optimization problem than encrypting a lot of data with a single key.
  • AES will have to be able to be parallelized. Sometimes you have a lot of gates in hardware, and raw speed is all you care about.
  • AES will have to work on DSPs. Sooner or later, your cell phone will have proper encryption built in. So will your digital camera and your digital video recorder.
  • AES will need to be secure as a hash function. There are many applications where DES is used both for encryption and authentication; there just isn't enough room for a second cryptographic primitive. AES will have to serve these same two roles.
  • AES needs to be secure for a long time. Infrastructure is hard to update. Like DES, AES hardware is likely to be installed and used for decades. A radical new algorithm, with interesting and exciting ideas, just doesn't make sense. A conservative algorithm is what is needed.
Choosing a single algorithm (or even a pair of algorithms) for all these applications is not easy, but that's what we have to do. It might make more sense to have a family of algorithms, each tuned to a particular application, but there will be only one AES. And when AES becomes a standard, customers will want their encryption products to be "buzzword compliant." They'll demand it in hardware, in desktop computer software, on smart cards, in electronic-commerce terminals, and other places we never thought it would be used. Anything we pick for AES has to work in all those applications.

So how do you comment? NIST is accepting formal comments either on paper or by e-mail. See http://www.nist.gov/aes for instructions. Be sure to identify who you represent and what cryptography interests you have. And if you have any weird cryptography applications or environments, tell me. I'd like to know. Remember, the AES is going to be your cryptography standard for the 21st century. Tell NIST what you think.

Christopher B. Brown asks:
Several announcements have been made lately about ciphers being variously vulnerable or invulnerable to quantum cryptography.

Quantum physics seems to be the "magical" form of physics, and its application to cryptography even more magical. I don't think I properly understand "quantum cryptography," and I don't think that most of the people that have made public comment on it understand it terribly well either.

Could you comment on the present state of Quantum cryptography, and its probable relevance in public matters short term (which appears nonexistent), medium term (where the research of today may be in 5-10 years), and longer term?

ANSWER:
There are two separate applications of quantum mechanics to cryptography: quantum computing and quantum cryptography. I think you are conflating the two. Let me take them up in turn.

Quantum cryptography is a means of using quantum mechanics for key exchange. Basically, because it is impossible to measure the state of a quantum system without disturbing it, the physics of the key-exchange protocol allows you to detect eavesdroppers. It's a cool idea, and I spend a few pages in _Applied Cryptography_ explaining it in detail.
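A toy simulation makes the eavesdropper-detection idea concrete. This is a deliberately simplified BB84-style model in Python (idealized hardware; a sketch for illustration, not a description of any real system): an eavesdropper who measures in the wrong basis randomizes the bit, and the resulting errors show up when the two parties compare bits that should agree:

    import random

    def bb84_error_rate(n_pulses: int, eavesdrop: bool) -> float:
        """Error rate among bits where sender and receiver happened
        to choose the same basis, in a simplified BB84-style model."""
        errors = kept = 0
        for _ in range(n_pulses):
            sent_bit = random.getrandbits(1)
            basis_send = random.getrandbits(1)
            wire_bit, wire_basis = sent_bit, basis_send
            if eavesdrop:
                basis_eve = random.getrandbits(1)
                if basis_eve != wire_basis:
                    wire_bit = random.getrandbits(1)  # wrong basis: random result
                wire_basis = basis_eve                # Eve resends in her basis
            basis_recv = random.getrandbits(1)
            if basis_recv != wire_basis:
                wire_bit = random.getrandbits(1)      # wrong basis: random result
            if basis_recv == basis_send:              # only these bits are kept
                kept += 1
                errors += wire_bit != sent_bit
        return errors / kept

    print(bb84_error_rate(100_000, eavesdrop=False))  # ~0.00
    print(bb84_error_rate(100_000, eavesdrop=True))   # ~0.25

Without an eavesdropper, the kept bits always agree; with one, roughly a quarter of them disagree, which is how the tap is detected.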

Quantum computing is newer. It turns out that it is theoretically possible to build a quantum computer. And, if you can build such a beast, it can factor large numbers and calculate discrete logarithms efficiently. Hence, it renders pretty much all of public-key cryptography insecure.

Both of these things are pretty far out. Quantum cryptography has been demonstrated in the lab; British Telecom researchers have exchanged keys over a 10 km fiber-optic link. This is interesting, but I don't see it being used very much. The mathematics of cryptography, while not perfect by any means, is the one thing we can do well. There are so many easier ways of breaking into systems, it doesn't make any sense to replace mathematics with physics. And math is always cheaper.

Quantum computing is far out technologically. Such computers are theoretically possible to build, but we have no real idea yet how to build them. Figure it will be ten or twenty or more years before this has a possibility of being a reality. An excellent article was published in a magazine called _The Sciences_. You can find it at http://cryptome.org/qc-grover.htm.

And when it becomes a reality, it does not destroy all cryptography. Quantum computing reduces the work of a brute-force search to its square root. This means that key lengths are effectively halved. 128-bit keys are more than secure enough today; 256-bit keys are more than secure enough against quantum computers.
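As a back-of-the-envelope check of that halving rule (the square-root speedup is Grover's search result, described in the article cited above):

    import math

    # Quantum search takes about sqrt(N) steps to search N keys,
    # so a k-bit key offers roughly k/2 bits of effective strength.
    for k in (128, 256):
        steps = math.sqrt(2.0 ** k)
        print(f"{k}-bit key: ~2^{math.log2(steps):.0f} quantum search steps")

A 256-bit key thus retains about 128 bits of strength, which is why it remains more than secure enough.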

Neville asks:
What's your response to the notion that the web's reliance on centralized Certificate Authorities for secure commerce is ultimately flawed? There are those, like the Meta Certificate Group, who feel that a hierarchical chain of certificates leading back to only a couple of elite organizations won't hold up in the distributed environment of the Internet. The entire framework of e-commerce seems to stand on the private keys of Verisign and Thawte. Do you feel this is a danger, and will there be viable alternatives?

ANSWER:
I agree with you 100%. The notion of a single global public-key infrastructure (PKI) makes no sense. Open your wallet, and you will see a variety of different authentication credentials: driver's license, credit cards, airline frequent flyer cards, library cards, a passport, and so forth. All of these are analog certificates, and will eventually have digital equivalents. There's no reason in the world why Visa can't use your driver's license number as a credit card number; it's just a pointer into a database, after all. But Visa never, ever will. They want control over issuance, update, and revocation. Similarly, digital credentials only make sense when the entity who cares about the credential controls the issuance, use, update, and revocation of that credential.

I have jointly written a paper with Carl Ellison called "Ten Risks of PKI: What You Aren't Being Told About Public Key Infrastructure." It will be published in the Winter 2000 issue of the _Computer Security Journal_. It's not on my Web site yet, but will be by December. (Watch http://www.counterpane.com for details.) You can find a lot more good material on the problems with PKI at Carl Ellison's Web site: http://www.clark.net/pub/cme/html/spki.html.

jovlinger asks:

Bruce,

In a recent Crypto-Gram, you write that most symmetric ciphers need more entropy than people can remember and hence supply. Even with biometrics adding more bits, it is not really worth the effort to construct ciphers with more than 128 bits of entropy in the key, because people won't give them that much entropy in the passphrase.

However, social and technological pressures make longer and longer keys a necessity. What promising approaches do you see for remembering and entering usefully long passphrases? (Even though I have long passages of text memorized, I don't want to type them in for every e-mail I send.)

I.e., to paraphrase, would you discuss the state of the art of cipher/human interaction, as it pertains to key management?

ANSWER:
For the rest of you, the Crypto-Gram article he mentions is at http://www.counterpane.com/crypto-gram-9910.html#KeyLengthandSecurity. In it, I argue that people just can't remember complicated enough keys and passwords to be immune from brute-force attacks (for example, L0phtcrack, see http://www.l0pht.com/l0phtcrack). Some of us can, but the masses that are using the Internet aren't able to, can't be bothered to, and won't be cajoled to.
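Some rough numbers make the point (a toy calculation; the alphabet sizes are illustrative assumptions): even a perfectly random password carries only length times log2(alphabet size) bits of entropy, and nothing memorable comes close to 128 bits:

    import math

    def password_bits(length: int, alphabet: int) -> float:
        # Best case: a uniformly random password of this length over
        # this alphabet. Memorable passwords contain far less entropy.
        return length * math.log2(alphabet)

    print(password_bits(8, 26))   # 8 lowercase letters:  ~37.6 bits
    print(password_bits(10, 94))  # 10 printable ASCII:   ~65.5 bits
    print(password_bits(20, 94))  # even 20 random chars: ~131 bits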

The other way to carry around a large pool of random bits is on a data storage mechanism: a smart card, a Dallas Semiconductor iButton, a chip inside a physical key (like the device DataKey sells). These mechanisms are more annoying than passwords and passphrases, but they work. There's no real alternative; if something is too large to memorize, the only solution is to store it somewhere.

I don't believe that biometrics will ever become cryptographic keys. I wrote about this at http://www.counterpane.com/crypto-gram-9808.html#biometrics. Biometrics does have use as an authentication mechanism (note the difference), if it is engineered properly.

-----------------------

Next week: Mick Morgan, "the Queen of England's Webmaster," will answer questions about why not only the Royal Family's Web site but also the huge open.gov.uk site (and more than 80 other official UK Web sites) now run on Linux.
