Comment Re:you have the source (Score 1) 566

We had some issues with not adding enough randomness in embedded devices, but that problem was largely fixed a year ago. At this point, I think urandom should be fine for session keys. It's not the best choice for long-lived keys in those embedded devices, but those devices (a) don't have RDRAND, since they tend to use MIPS or ARM CPUs, and (b) don't have any peripherals other than the flash storage and the networking cards, so there isn't that much entropy they can draw upon. There are things you can do in userspace to improve matters, such as holding off on generating the host keys and the RSA keys for certificates as long as possible, instead of right after boot. But that's much more of a specialized problem for a specific class of system.
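As an illustration of that "hold off as long as possible" idea, here is a minimal userspace sketch (not anything the kernel or any distro actually ships): poll the kernel's entropy estimate and only proceed with host key generation once it passes a threshold. The 1024-bit threshold and 5-second poll interval are arbitrary assumptions.

/* Minimal sketch: wait for the kernel's entropy estimate to reach a
 * threshold before generating long-lived keys on first boot.
 * The threshold and poll interval below are illustrative only.
 * Build (hypothetical): cc -o wait_entropy wait_entropy.c
 */
#include <stdio.h>
#include <unistd.h>

static int entropy_avail(void)
{
    int bits = -1;
    FILE *f = fopen("/proc/sys/kernel/random/entropy_avail", "r");
    if (f) {
        if (fscanf(f, "%d", &bits) != 1)
            bits = -1;
        fclose(f);
    }
    return bits;
}

int main(void)
{
    int bits;
    while ((bits = entropy_avail()) >= 0 && bits < 1024) {
        fprintf(stderr, "only %d bits in the pool, waiting...\n", bits);
        sleep(5);
    }
    /* At this point it's (more) reasonable to run e.g. ssh-keygen. */
    return bits >= 0 ? 0 : 1;
}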

Comment Re:Errk don't yell at the brainpool curves! (Score 1) 366

I linked to that RFC for the text in the introduction section, from which I got the "chosen ad hoc" language. My point is not to cast suspicion on all ECC, which is a valid mathematical technique developed in the open by civilian academics, but rather to provide more evidence that nobody seems to know how the seed values were generated (we know WHO generated them, but not HOW).

Comment Re:Reference? (Score 2) 366

I just found this new blog post from the NYT which gives a very small amount of additional context. It also explicitly names the NSA RNG as what they were talking about.

http://bits.blogs.nytimes.com/2013/09/10/government-announces-steps-to-restore-confidence-on-encryption-standards/

But internal memos leaked by a former N.S.A. contractor, Edward Snowden, suggest that the N.S.A. generated one of the random number generators used in a 2006 N.I.S.T. standard — called the Dual EC DRBG standard — which contains a back door for the N.S.A. In publishing the standard, N.I.S.T. acknowledged “contributions” from N.S.A., but not primary authorship.

Internal N.S.A. memos describe how the agency subsequently worked behind the scenes to push the same standard on the International Organization for Standardization. “The road to developing this standard was smooth once the journey began,” one memo noted. “However, beginning the journey was a challenge in finesse.”

At the time, Canada’s Communications Security Establishment ran the standards process for the international organization, but classified documents describe how ultimately the N.S.A. seized control. “After some behind-the-scenes finessing with the head of the Canadian national delegation and with C.S.E., the stage was set for N.S.A. to submit a rewrite of the draft,” the memo notes. “Eventually, N.S.A. became the sole editor.”

The Guardian, ProPublica, the NYT and Schneier all appear confident enough in what they've read to state assertively that it's a hacked standard. Also, why else would the NSA care so much about pushing a crap, slow RNG that we know can have a backdoor into international standards?

Comment Re:We owe our thanks to Mr. Snowden (Score 5, Informative) 366

That story is about Dual_EC_DRBG, which was indeed strongly suspected of being an NSA back door back in 2007. Snowden confirmed the suspicion. However, this story is not about that algorithm; it's about the SEC random curves that are used for signing and other crypto, not random number generation. There are two different algorithms under discussion here.

Comment Re:Is Bitcoin Vulnerable? (Score 1) 366

Bitcoin uses what the SEC calls a Koblitz curve (secp256k1), for which there is much less design freedom, and it seems much less likely that there is any way to back-door those curves. Unfortunately, many ECC implementations don't support all the curves, just a few of the plain-vanilla random ones. Actually, I'm not aware of anything except Bitcoin that uses secp256k1.
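For anyone wondering whether their own crypto library falls into that category, here is a minimal sketch (assuming an OpenSSL development environment; the NIDs are OpenSSL's names for secp256r1 and secp256k1) that checks which of the two curves a local build actually supports:

/* Minimal sketch: ask OpenSSL whether two named curves are available.
 * Build (hypothetical): cc -o check_curves check_curves.c -lcrypto
 */
#include <stdio.h>
#include <openssl/ec.h>
#include <openssl/obj_mac.h>

static void check_curve(const char *name, int nid)
{
    EC_KEY *key = EC_KEY_new_by_curve_name(nid);
    printf("%-22s %s\n", name, key ? "supported" : "NOT supported");
    EC_KEY_free(key);
}

int main(void)
{
    check_curve("secp256r1 (\"random\")", NID_X9_62_prime256v1);
    check_curve("secp256k1 (Koblitz)", NID_secp256k1);
    return 0;
}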

Comment Re:Reference? (Score 5, Informative) 366

Sorry, I could have provided a link for that too. It was in the major Snowden story of last week that revealed the NSA was undermining public standards. The New York Times said this:

Simultaneously, the N.S.A. has been deliberately weakening the international encryption standards adopted by developers. One goal in the agency’s 2013 budget request was to “influence policies, standards and specifications for commercial public key technologies,” the most common encryption method.

Cryptographers have long suspected that the agency planted vulnerabilities in a standard adopted in 2006 by the National Institute of Standards and Technology and later by the International Organization for Standardization, which has 163 countries as members.

Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.”

“Eventually, N.S.A. became the sole editor,” the memo says.

Although the NYT didn't explicitly name the bad standard, there's only one that fits the criteria given, and that's Dual_EC_DRBG.

Comment Re:you have the source (Score 1) 566

How would they detect any shared properties? The point is that they are providing a random number generator (not a stream of random numbers) which is supposedly "secure". Secure means that no one, including the party providing the RNG, can predict the stream of numbers coming from it. If the RNG coming from the US source is not honest, that presumably means the NSA can predict the stream of numbers coming out of it. But the NSA (assuming that it distrusts the KGB and the MSS) wouldn't want the KGB and the MSS to be able to carry out the same feat. The same is true for each of the other devices. So there's no way that any one of the actors should be able to detect any shared properties --- that's the point of the proposal.

Now, if the NSA is able to gimmick the RNG coming from China, then that's a different story. And to the extent that many electronics are designed in the US and then manufactured in China, that's certainly a concern. In order for a scheme like this to work, the parts would have to be designed and built in such a way that an outsider would believe that the NSA couldn't possibly have gimmicked the RNG, even if it could have been gimmicked by another spy agency. Then combine this with a device that you're sure couldn't have been gimmicked by the MSS, but may have been subject to pressure from the NSA, and so on.

Submission + - Are the NIST standard elliptic curves back-doored? 2

IamTheRealMike writes: In the wake of Bruce Schneier's statements that he no longer trusts the constants selected for elliptic curve cryptography, people have started trying to reproduce the process that led to those constants being selected ... and found it cannot be done. As background, the most basic standard elliptic curves used for digital signatures and other cryptography are called the SEC random curves (SEC is "Standards for Efficient Cryptography"), a good example being secp256r1. The random numbers in these curve parameters were supposed to be selected via a "verifiably random" process (the output of SHA-1 on some seed), which is a reasonable way to obtain a nothing-up-my-sleeve number if the input to the hash function is trustworthy, like a small counter or the digits of pi. Unfortunately, it turns out the actual inputs used were opaque 160-bit seed values, chosen ad hoc with no justification provided. Worse, the curve parameters for SEC were generated by the head of elliptic curve research at the NSA — opening the possibility that they were found via a brute-force search for a publicly unknown class of weak curves. Although no attack against the selected values is currently known, it's common practice never to use unexplainable magic numbers in cryptography standards, especially when those numbers are being chosen by intelligence agencies. Now that the world has received strong confirmation that the much more obscure and less widely used Dual_EC_DRBG standard was in fact an NSA undercover operation, NIST has re-opened the confirmed-bad standards for public comment. Unless NIST/the NSA can explain why the random curve seed values are trustworthy, it might be time to re-evaluate all NIST-based elliptic curve crypto in general.
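To make the "verifiably random" complaint concrete, here is a minimal sketch of the nothing-up-my-sleeve idea (not the exact ANSI X9.62/SEC derivation, and the seed string is purely illustrative): hash a seed to get the raw material for a constant. Anyone can re-run the hash to check the published output, but that only inspires confidence if the seed itself is explainable; an opaque seed value tells you nothing, which is exactly the complaint about the SEC curves.

/* Minimal sketch of deriving a "nothing up my sleeve" constant: take an
 * explainable seed and hash it. The seed below is illustrative, not the
 * one used for any real curve.
 * Build (hypothetical): cc -o nums nums.c -lcrypto
 */
#include <stdio.h>
#include <string.h>
#include <openssl/sha.h>

int main(void)
{
    /* A transparent seed: a phrase plus a small counter, nothing opaque. */
    const char *seed = "example nothing-up-my-sleeve seed, counter = 1";
    unsigned char digest[SHA_DIGEST_LENGTH];
    int i;

    SHA1((const unsigned char *)seed, strlen(seed), digest);

    printf("seed   : \"%s\"\ndigest : ", seed);
    for (i = 0; i < SHA_DIGEST_LENGTH; i++)
        printf("%02x", digest[i]);
    printf("\n");
    return 0;
}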

Comment Re:most like 100,000 years (Score 3, Informative) 63

Link: carbon dating can't be trusted beyond 150 million years.

Conclusion: The date of 100,000 years given here is wrong.

If you'd taken time to scan the paper, you'd easily find the section on dating (2.2): "A chronological model was developed using a combination of radiocarbon, optically stimulated luminescence (OSL), and relative palaeomagnetic intensity dating. [...] OSL measurements suggested that material incorporated into the basal sediments might date to 93,000 ± 9000 years ago."

I.e. the 100,000 years is independent of carbon dating. (Actually, I'm surprised they even attempted carbon dating in this environment.)

Comment Re:you have the source (Score 5, Insightful) 566

The random driver has changed significantly since July 2012, which is when we were given a heads-up about the paper described at http://factorable.net/ and also when I took back maintainership of the /dev/random driver. We now gather entropy at every single interrupt and mix it into the entropy pool. This is done unconditionally and can't be disabled, unlike the old opt-in SA_SAMPLE_RANDOM flag.

The thing about entropy pools is that when you combine entropy sources, the result gets better, not worse. So the best thing would be if we had hardware random number generators sourced from China, Russia, and the USA. Since presumably the MSS, KGB, and the NSA mutually distrust each other, if we combine the entropy from those three sources, the result will be stronger than any one alone.

This is why I don't recommend using RDRAND directly. Sure, an honest (emphasis on honest) hardware random number generator will always be able to source higher quality entropy than anything we can do by sampling OS events, such as interrupts. But the problem is that it's hard to guarantee that a HWRNG is really honest, especially given the Snowden revelations, which seem to indicate the NSA has successfully leaned on at least one chip manufacturer. If you must use RDRAND, I'd recommend generating a random key via some other means, and then encrypting the output of RDRAND with that random key before using the resulting randomness for session keys, etc. Or better yet, do what we do in /dev/random, which is to mix RDRAND with other sources of entropy.
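A minimal userspace sketch of the mixing idea (this is only an illustration, not the kernel's actual mixing code; it assumes an x86 CPU with RDRAND and a compiler that provides the _rdrand64_step intrinsic): XOR each RDRAND output with independently obtained kernel entropy, so a dishonest RDRAND gains nothing unless the other source is also compromised.

/* Minimal sketch: never use RDRAND output raw; XOR it with an independent
 * entropy source first. Illustrative only.
 * Build (hypothetical): cc -mrdrnd -o mix mix.c
 */
#include <stdio.h>
#include <immintrin.h>

int main(void)
{
    unsigned long long hw = 0, sw = 0;
    FILE *f;

    if (!_rdrand64_step(&hw)) {             /* hardware RNG (possibly untrusted) */
        fprintf(stderr, "RDRAND unavailable or failed\n");
        return 1;
    }

    f = fopen("/dev/urandom", "rb");        /* independent kernel entropy */
    if (!f || fread(&sw, sizeof sw, 1, f) != 1) {
        fprintf(stderr, "could not read /dev/urandom\n");
        return 1;
    }
    fclose(f);

    /* Assuming independence, the XOR is at least as hard to predict as
     * either input on its own. */
    printf("mixed 64-bit value: %016llx\n", hw ^ sw);
    return 0;
}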

Comment Re:you have the source (Score 2) 566

What I said is that /dev/urandom is much more important to get right than /dev/random. Realistically, far more programs use /dev/urandom than use /dev/random. GPG uses /dev/random for long-term key generation, but in terms of generating certs, creating session keys, etc., /dev/urandom is far more important.
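For reference, a minimal sketch of what that common case looks like (a plain read from /dev/urandom with a short-read loop; nothing here is specific to any particular program):

/* Minimal sketch: fill a buffer with key material from /dev/urandom,
 * handling short reads. Illustrative only.
 * Build (hypothetical): cc -o session_key session_key.c
 */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

static int read_urandom(unsigned char *buf, size_t len)
{
    int fd = open("/dev/urandom", O_RDONLY);
    size_t got = 0;

    if (fd < 0)
        return -1;
    while (got < len) {
        ssize_t n = read(fd, buf + got, len - got);
        if (n <= 0) {          /* error or unexpected EOF */
            close(fd);
            return -1;
        }
        got += (size_t)n;
    }
    close(fd);
    return 0;
}

int main(void)
{
    unsigned char key[16];     /* e.g. a 128-bit session key */
    size_t i;

    if (read_urandom(key, sizeof key) != 0)
        return 1;
    for (i = 0; i < sizeof key; i++)
        printf("%02x", key[i]);
    printf("\n");
    return 0;
}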

If you trust Intel not to have gimmicked RDRAND, by all means, feel free to use it. Please do it in open source, though, so I can fix said program not to.....

Comment Re:Marital/Money problems??? (Score 4, Informative) 566

I think it's more likely that the RDRAND thing has been an ongoing argument/flamewar for a long time. See this thread for an example.

BTW, Linus is right. According to what we know about randomness, even if RDRAND is hacked, mixing it with other entropy can't hurt - at worst it's a no-op and achieves nothing. And even if RDRAND is backdoored, the NSA is not the world's only adversary. Given that mixing it with other randomness doesn't hurt, it's still better to use it against all the other adversaries out there than not.
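The standard argument behind "can't hurt" is the one-time-pad observation (a sketch of the usual reasoning, not something from the linked thread): if one input to an XOR is uniform and independent of the other, the result stays uniform no matter how the other input was produced.

\[
\Pr[R \oplus X = c] \;=\; \sum_{x \in \{0,1\}^n} \Pr[X = x]\,\Pr[R = c \oplus x]
\;=\; \sum_{x \in \{0,1\}^n} \Pr[X = x]\; 2^{-n} \;=\; 2^{-n},
\]

where $R$ is uniform on $\{0,1\}^n$ and independent of the (possibly adversarial) input $X$. The caveat is the independence assumption; the argument says nothing about a hardware RNG that could observe and adapt to the other input.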

Linus' point is, exclusive reliance on RDRAND would be bad, but the kernel doesn't/shouldn't do that.
