
Comment What could possibly go wrong? (Score 1) 169

Let's see. My wife goes into labor at 4:00 AM*, and sleepy and excited I get into the car to drive her to the hospital... only to have the car refuse to start, as my brain waves don't match its stored template. Oh, yeah, that will go over well.

* That was, in fact, when my wife went into labor.

Comment False negatives? (Score 4, Insightful) 169

What if I'm hugely stressed out because a tsunami or forest fire is coming, or my critically injured child needs rushing to the hospital, or some such? If that changes my brain waves enough to prevent me from driving, it would be unfortunate.

(To be fair, TFA says they're looking initially to use it on buses and armoured cars. I wonder if "masked man is pointing gun at my head and ordering me to drive" sufficiently alters the brain waves.)

Comment Re:maintenance (Score 1) 195

Since you obviously know that a *file* can be fragmented, you already know that a file doesn't have to be written contiguously.

Thus, you don't need to defragment it. The directory structure knows that the 'file' is in blocks 1-5, 8, 14.

As other people pointed out, disk seeks are most assuredly something to avoid on spinning media. But even when seeks are free, as they are on SSD, fragmentation still sucks and you should avoid it like you owe it money.

For one, some filesystems use run-length encoding for the list of blocks in a file. Basically, instead of recording "1, 2, 3, 4, 5, 8, 14", they notice the pattern and record "1-5, 8, 14" like you just did in your post. (The ext[234] family doesn't do this, but IIRC some of the post-ext2 up-and-comers use it.) RLE lets you inline more metadata directly in the inode without resorting to indirect blocks, which basically means you get your data with fewer round trips to the disk. (It might save you from needing to read a meta-meta-block to find the meta-blocks that tell you where the blocks are. Instead you can fit all the blocks in one meta-block and skip a round trip.)

For two, even filesystems on SSD that don't do RLE still suffer under fragmentation. Unfragmented files make it easy for the kernel I/O scheduler to coalesce those sequential block reads into big, happy multi-block SATA reads when you're streaming through the file. As before, that means fragmentation = more round trips to the disk, but it also means fragmentation = spamming the SATA controller with more commands and spamming the CPU with more interrupt handlers for the command completions. (In other words, copying a big fragmented file slows down everything else on the computer, more so than copying a big unfragmented file.)

Disclaimer: I am not a filesystem designer, I just play one on Slashdot.
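For illustration, the extent-style RLE described above can be sketched in a few lines. This is a toy, not any real filesystem's on-disk format:

```python
def rle_extents(blocks):
    """Collapse a sorted list of block numbers into (start, length) extents."""
    extents = []
    for b in blocks:
        if extents and b == extents[-1][0] + extents[-1][1]:
            # Block continues the current run: extend its length
            extents[-1] = (extents[-1][0], extents[-1][1] + 1)
        else:
            # Gap in the block numbers: start a new extent
            extents.append((b, 1))
    return extents

# The example from the post: blocks 1-5, 8, 14 become three extents
print(rle_extents([1, 2, 3, 4, 5, 8, 14]))  # [(1, 5), (8, 1), (14, 1)]
```

A heavily fragmented file produces one extent per block and overflows whatever space the inode has for inline extents; a contiguous file collapses to a single entry.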

Comment Re:Treason.. or... (Score 1) 524

I wonder if she was told that it would be treason by someone in law enforcement (they are allowed to lie, after all). Perhaps, as so many other citizens would, she believed what she was told. That's unfortunate for someone in her position, but sadly quite normal.

Besides, the way the words "traitor", "treason", "un-American" and "terrorist" are thrown around, their actual meanings are diluted in common speech by all that hyperbole.

Unlike some defendants, she almost certainly had the advice of counsel, and pretty good counsel at that; I would be astounded if she did not.

Comment Treason is in the Constitution (Score 2) 524

It is the only crime defined there:

Treason against the United States, shall consist only in levying War against them, or in adhering to their Enemies, giving them Aid and Comfort. No Person shall be convicted of Treason unless on the Testimony of two Witnesses to the same overt Act, or on Confession in open Court.

Don't see anything there about not cooperating with the authorities.

Comment Re:you have the source (Score 1) 566

We had some issues with not adding enough randomness in embedded devices, but that problem was largely fixed a year ago. At this point, I think urandom should be fine for session keys. It's not the best choice for long-lived keys in those embedded devices, but those devices (a) don't have RDRAND, since they tend to use MIPS or ARM CPUs, and (b) don't have any peripherals other than the flash drive and the networking cards, so there isn't that much entropy they can draw upon. There are things you can do to improve things in userspace, such as holding off on generating the host keys and the RSA keys for the certificates as long as possible, instead of right after boot. But that's much more of a specialized problem for a specific class of system.
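The "hold off on generating keys" advice can be sketched as a lazy wrapper. Nothing here is a real sshd mechanism; the class and names are purely illustrative of deferring the urandom read from boot time to first use:

```python
import os

class LazyHostKey:
    """Defer drawing key material until first use, giving the kernel
    entropy pool time to collect input after boot. Illustrative only."""

    def __init__(self, nbytes=32):
        self._nbytes = nbytes
        self._key = None

    def get(self):
        if self._key is None:
            # os.urandom reads the kernel CSPRNG (/dev/urandom on Linux)
            self._key = os.urandom(self._nbytes)
        return self._key

key = LazyHostKey()
# ... boot continues, interrupts and device events feed the pool ...
material = key.get()  # entropy is only consumed here, on first use
print(len(material))  # 32
```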

Comment Re:Errk don't yell at the brainpool curves! (Score 1) 366

I linked to that RFC for the text in the introduction section, from which I got the "chosen ad hoc" language. My point is not to cast suspicion on all ECC, which is a valid mathematical technique developed in the open by civilian academics, but rather to provide more evidence that nobody seems to know how the seed values were generated (we know WHO generated them, but not HOW).

Comment Re:Reference? (Score 2) 366

I just found this new blog post from the NYT which gives a very small amount of additional context. It also explicitly names the NSA RNG as what they were talking about.

But internal memos leaked by a former N.S.A. contractor, Edward Snowden, suggest that the N.S.A. generated one of the random number generators used in a 2006 N.I.S.T. standard — called the Dual EC DRBG standard — which contains a back door for the N.S.A. In publishing the standard, N.I.S.T. acknowledged “contributions” from N.S.A., but not primary authorship.

Internal N.S.A. memos describe how the agency subsequently worked behind the scenes to push the same standard on the International Organization for Standardization. “The road to developing this standard was smooth once the journey began,” one memo noted. “However, beginning the journey was a challenge in finesse.”

At the time, Canada’s Communications Security Establishment ran the standards process for the international organization, but classified documents describe how ultimately the N.S.A. seized control. “After some behind-the-scenes finessing with the head of the Canadian national delegation and with C.S.E., the stage was set for N.S.A. to submit a rewrite of the draft,” the memo notes. “Eventually, N.S.A. became the sole editor.”

The Guardian, ProPublica, the NYT and Schneier all appear confident enough in what they've read to state assertively that it's a hacked standard. Also, why else would the NSA care so much about pushing a crappy, slow RNG that we know can have a backdoor into international standards?

Comment Re:We owe our thanks to Mr. Snowden (Score 5, Informative) 366

That story is about Dual_EC_DRBG, which was indeed strongly suspected of being an NSA back door back in 2007. Snowden confirmed the suspicion. However, this story is not about that algorithm. It's about the SEC random curves that are used for signing and other crypto, not random number generation. There are two different algorithms under discussion here.

Comment Re:Is Bitcoin Vulnerable? (Score 1) 366

Bitcoin uses what the SEC calls a Koblitz curve (secp256k1) for which there is much less design freedom and it seems much less likely that there is any way to back-door those curves. Unfortunately many ECC implementations don't support all the curves, just a few of the plain vanilla random ones. Actually I'm not aware of anything except Bitcoin that uses secp256k1.
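The "less design freedom" point is visible in the published domain parameters: unlike the random curves, secp256k1 has no opaque seed value; its prime has a simple closed form and the curve coefficients are tiny constants. A quick check (parameters as published in SEC 2; this is an illustration, not an ECC implementation):

```python
# secp256k1 domain parameters (SEC 2). Note there is no SEED field:
# the prime has a simple closed form and a, b are tiny constants,
# leaving essentially no room to hide a deliberately weak choice.
p = 2**256 - 2**32 - 977
a, b = 0, 7
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

# Sanity check: the generator satisfies y^2 = x^3 + a*x + b (mod p)
assert (Gy * Gy - (Gx**3 + a * Gx + b)) % p == 0
```

Compare that with secp256r1, where the parameters come out of SHA-1 applied to an unexplained 256-bit seed.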

Comment Re:Reference? (Score 5, Informative) 366

Sorry, I could have provided a link for that too. It was in the major Snowden story of last week that revealed the NSA was undermining public standards. The New York Times said this:

Simultaneously, the N.S.A. has been deliberately weakening the international encryption standards adopted by developers. One goal in the agency’s 2013 budget request was to “influence policies, standards and specifications for commercial public key technologies,” the most common encryption method.

Cryptographers have long suspected that the agency planted vulnerabilities in a standard adopted in 2006 by the National Institute of Standards and Technology and later by the International Organization for Standardization, which has 163 countries as members.

Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.”

“Eventually, N.S.A. became the sole editor,” the memo says.

Although the NYT didn't explicitly name the bad standard, there's only one that fits the criteria given which is Dual_EC_DRBG.

Comment Re:you have the source (Score 1) 566

How would they detect any shared properties? The point is that they are providing a random number generator (not a stream of random numbers) which is supposedly "secure". Secure means that no one, including the person providing the RNG, can predict the stream of numbers coming from the RNG. If the RNG coming from the US source is not honest, that means that presumably the NSA can predict the stream of numbers coming out of the RNG. But the NSA (assuming that it distrusts the KGB and the MSS) wouldn't want the KGB and the MSS to be able to carry out the same feat. The same is true for each of the other devices. So there's no way that any one of the actors should be able to detect any shared properties --- that's the point of the proposal.

Now, if the NSA is able to gimmick the RNG coming from China, then that's a different story. And to the extent that many electronics are designed in the US and then manufactured in China, that's certainly a concern. In order for a scheme like this to work, the parts would have to be designed and built in such a way that an outsider would believe that the NSA couldn't have possibly gimmicked an RNG, even if it could have been gimmicked by another spy agency. Then combine this with a device that you're sure couldn't have been gimmicked by the MSS, but may have been subject to pressure from the NSA, and so on.
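The combining step in the proposal above is usually sketched as an XOR of the sources' outputs: as long as any one source is unpredictable to a given attacker, and the sources can't observe each other, the combined stream is unpredictable to that attacker too. The source list below is a stand-in, not a real hardware interface:

```python
import os

def combined_random(sources, nbytes):
    """XOR equal-length outputs from several independent RNG sources.

    If at least one source is unpredictable to an attacker (and the
    sources cannot see each other's output), the result is too.
    """
    out = bytes(nbytes)
    for draw in sources:
        chunk = draw(nbytes)
        out = bytes(x ^ y for x, y in zip(out, chunk))
    return out

# Stand-ins for RNGs from mutually distrusting vendors (hypothetical)
sources = [os.urandom, os.urandom, os.urandom]
key = combined_random(sources, 32)
print(len(key))  # 32
```

A dishonest source would have to predict the XOR of everyone else's output to bias the result, which is exactly what the mutual distrust is supposed to rule out.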

Submission + - Are the NIST standard elliptic curves back-doored? 2

IamTheRealMike writes: In the wake of Bruce Schneier's statements that he no longer trusts the constants selected for elliptic curve cryptography, people have started trying to reproduce the process that led to those constants being selected ... and found it cannot be done. As background, the most basic standard elliptic curves used for digital signatures and other cryptography are called the SEC random curves (SEC is "Standards for Efficient Cryptography"), a good example being secp256r1. The random numbers in these curve parameters were supposed to be selected via a "verifiably random" process (output of SHA1 on some seed), which is a reasonable way to obtain a nothing-up-my-sleeve number if the input to the hash function is trustworthy, like a small counter or the digits of pi. Unfortunately, it turns out the actual inputs used were opaque 256-bit numbers, chosen ad hoc with no justification provided. Worse, the curve parameters for SEC were generated by the head of elliptic curve research at the NSA — opening the possibility that they were found via a brute-force search for a publicly unknown class of weak curves. Although no attack against the selected values is currently known, it's common practice to never use unexplainable magic numbers in cryptography standards, especially when those numbers are being chosen by intelligence agencies. Now that the world has received strong confirmation that the much more obscure and less widely used Dual_EC_DRBG standard was in fact an NSA undercover operation, NIST has re-opened the confirmed-bad standards for public comment. Unless NIST/the NSA can explain why the random curve seed values are trustworthy, it might be time to re-evaluate all NIST-based elliptic curve crypto in general.
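For contrast, here is what "verifiably random" was supposed to buy: if the hash input is transparently simple, anyone can re-derive the seed and rule out a hidden search over inputs. A toy sketch, where the function name and the counter-based input are hypothetical and not the actual SEC generation procedure:

```python
import hashlib

def nothing_up_my_sleeve_seed(counter):
    """Derive a candidate seed from a transparently chosen input.

    Because the input is a small counter, anyone can re-run this and
    confirm the designer had no freedom to search for a weak output.
    (Hypothetical illustration; not the actual SEC 2 procedure.)
    """
    return hashlib.sha1(counter.to_bytes(4, "big")).hexdigest()

# A small counter is auditable; an opaque 256-bit input is not,
# because it could itself be the result of a search for weak curves.
print(nothing_up_my_sleeve_seed(0))
```

With an opaque 256-bit input, the same SHA-1 step proves nothing: the designer could have iterated over inputs until the output landed in some privately known weak class.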
