
Comment: sensationalizing headline and summary (Score 2) 206

While the actual incident -- a near-collision -- may be worth debating*, note that both the headline and summary omit any reference to a collision, using "nearly downed" and "almost bringing down" instead, sensationalizing it into seeming like something more.

* It's not really; a near-collision with an out-of-control flying machine can happen from any flying machine that can go out of control (ps: that's all of them). It's just the cost of doing business.

Comment: Re:100 mile border (Score 1) 194

What SC validation? According to Wikipedia, it's the opposite: "the Supreme Court has clearly and repeatedly confirmed that the border search exception applies only at international borders and their functional equivalent" and there's a link to such a ruling from 1973.

Comment: Re:How about the US-Canadian/US-Mexico border? (Score 5, Informative) 597

by nothings (#42840539) Attached to: DHS Can Seize Your Electronics Within 100 Mi. of US Border, Says DHS

The claim that there is no 4th Amendment right within 100 miles of a border is false. (Though the federal government may occasionally conduct illegal searches on that basis.)

As wikipedia says, "Despite federal law allowing certain federal agents to conduct suspicionless search and seizures within 100 miles of the border, the Supreme Court has clearly and repeatedly confirmed that the border search exception applies only at international borders and their functional equivalent (such as international airports)."

Wikipedia offers this Supreme Court decision as an example: a non-US-citizen was busted for marijuana possession while driving 25 miles from the border, and the SC ruled that the search of his car could not be justified by the border provision.

Comment: James H. Schmitz (Score 1) 1130

by nothings (#40936319) Attached to: Ask Slashdot: Most Underappreciated Sci-Fi Writer?

James H. Schmitz was a Sci-Fi writer published from the 40s to the 70s. He mostly wrote short stories (most in a common setting, often with one of two specific lead characters), though he wrote a few (short) novels.

He was an early feminist author; most of his lead characters are strong females (Telzey Amberdon and Trigger Argee are the leads in many of his short stories, and the main lead in the novel The Demon Breed is a woman). His most famous work, the novel The Witches of Karres, is a picaresque, wildly imaginative space-romp (although rather overtly expanded from a novelette).

In one (or more?) stories, he imagines an Internet-like source of information, and (IIRC) even calls it a "web".

In the 2000s, most of his works were republished by Baen Books, with sometimes significant interference^H^H^H^H^H^H^H^H^H editing by Eric Flint.

Comment: Re:And they found that... (Score 2) 132

by nothings (#40321267) Attached to: Chords To 1300 Songs Analyzed Statistically For Patterns

tl;dr: RTFA, not just the pictures.

Full version:

Unfortunately, you misread the site. The site doesn't report the popularity of chords by name at all. If you'd read the lead-in to the chord chart, you'd see the explanation. Or if you'd thought about the most popular chords being "G F C Am Dm Em", the main triads in the key of C, you might have been suspicious. Or if you'd read the analysis that follows on the site, which explains his theories for their popularity, you'd have realized your misinterpretation.

The site reports the popularity of key signatures by name.

It reports the popularity of chords by pseudo-name: relative to the key signature by transposing all the songs into the key of C. Yes, that's a dumb thing for him to do, but that's what he did, and it's identical to what you propose he should do. (The per-song analyses do actually use roman-numeral notation.)
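To illustrate what that normalization does (a sketch of my understanding, not the site's actual code), transposing every chord relative to its song's key and respelling it in C looks like:

```python
# Sketch of the normalization as I understand it (not the site's code):
# transpose every chord root down by the song key's offset from C.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def transpose_to_c(chord_root, song_key):
    """Respell a chord root as if the song had been played in C."""
    offset = NOTES.index(song_key) - NOTES.index("C")
    return NOTES[(NOTES.index(chord_root) - offset) % 12]

# A song in G using G, C, D comes out as C, F, G after normalization:
print([transpose_to_c(r, "G") for r in ["G", "C", "D"]])  # ['C', 'F', 'G']
```

So an "A" in the site's chart means "the chord two semitones below the tonic", not a literal A major.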

Your explanation is therefore bogus; the A chord is not necessarily particularly rare as far as we can tell. Maybe it is, maybe it isn't. Probably it is, actually, which leads me to my second argument: your reason for why it's rare is wrong.

It is absolutely true that the popularity of many chords in guitar music is due to what's convenient on the guitar. But I'm doubtful that A chords are very rare in guitar music. More likely, the pop music analyzed here is not very guitar-centric.

Let's look at an actual guitar band. The easiest to use is the Beatles, since they're well studied. They got less guitar-y in their later albums, though. Here's a source. Note that relative minors have already been mapped in the same way.

Top six keys in order on the site in the slashdot article:
C G Eb F D A

All Beatles: G A E D C ...
First two albums: E D A G ...
Next three albums: A G D E ...
Abbey Road: A C E D F

So, in this actual guitar band, before they started writing on piano, retuning songs by changing the tape speed, etc., the keys of E, D, and A were incredibly popular, so I bet the A chord was probably popular as well (though I have no stats).

And since guitarists don't actually avoid these keys, unsurprisingly, your explanations for why guitarists would avoid them are wrong. (1) The B chord is uncomfortable for a beginning guitarist, but the B7 chord is easily learned, so B doesn't present a problem for the key of E. (And the reality is that the difficulty of a chord isn't a big deal for serious musicians. They favor open chords not because they're easy, but because they sound better.) (2) The key of D doesn't present much of a problem either: not having the chord root at the bottom just means you play inversions a lot, or use sparser chords. The fact that it's not low is irrelevant when you have a bassist; and look at something like Nashville tuning. Indeed, the convenience of a flexible A7 for use as V, and the ease of Dsus2 and Dsus4, make D quite a popular key signature on the guitar. (3) I don't know why your A theory is wrong, but since your E and D theories were wrong, and since the Beatles (with three different male singers) loved the key of A, I can't imagine it's correct.

So, the actual explanation for why A is at 2% is that it's the "relative A" chord: the major VI chord, i.e., the V-of-ii chord. That makes it popular enough to be at 2% -- V-of-ii isn't unheard of, but it's not a particularly common chord in the key of C the way the non-diminished triads from the key signature are.
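For anyone rusty on the theory, here's a toy semitone check (my own illustration, nothing from the site) of why A major functions as V-of-ii in C:

```python
# A dominant (V) chord sits a perfect fifth (7 semitones) above the chord
# it resolves to; so the chord a fifth below a root is what that root is V of.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def resolves_down_fifth(root):
    """The root a perfect fifth below `root`, i.e. what `root` is the V of."""
    return NOTES[(NOTES.index(root) - 7) % 12]

# A is the V of D, and D carries the ii triad (D minor) in the key of C:
print(resolves_down_fifth("A"))  # 'D'
```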

Comment: Re:You cant hear it anyway. (Score 2) 255

by nothings (#40039289) Attached to: Dolby's TrueHD 96K Upsampling To Improve Sound On Blu-Rays

I posted about this on twitter a month ago.

The frequency chosen had to be a multiple of 900 and had to be somewhere in a limited range of frequencies (above 40 kHz, below some number I forget). The 900 comes from a factor of 300 (to guarantee it was divisible by 50 and 60 for PAL/NTSC) and a factor of 3 (the preferred number of samples per scanline; 2 was too few, 4 would have been wasteful).

There is no evidence that the specific multiple of 900 in the required range (40 kHz to 47 kHz) was chosen because of the factors of the multiplier; rather, the frequency needed to be as high as possible (giving a wider region between the limit of human hearing and the Nyquist frequency, thus making filtering cheaper), but higher frequencies would have required encoding samples in the vertical-blank part of the signal.

Certainly the fact that 900 itself is already the product of the squares of the three smallest primes is coincidence, since the factors of 300 and 3 were essentially independently motivated; the 3 wasn't chosen because it "completed the set" with the 300. Likewise, I don't believe the additional factor of 49 was chosen for its factors. (Having more divisibility is useful in some circumstances, but 7 is such an uncommon divisor; 900*48 would be far more useful on the general divisibility front, introducing more factors of 2 and 3. But in fact nobody NEEDS this number to be more divisible; divisibility beyond the required factor of 900 was never needed.)
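The arithmetic above is easy to check (this is just math, not a claim about the historical record):

```python
# 44100 = 900 * 49, and 900 = 300 * 3, with 300 divisible by both the
# PAL (50) and NTSC (60) field rates.
rate = 44100
assert rate == 900 * 49
assert 900 == 300 * 3
assert 300 % 50 == 0 and 300 % 60 == 0
# 900 also happens to be (2*3*5)^2, the product of the squares of the
# three smallest primes:
assert 900 == (2 * 3 * 5) ** 2
# The candidate multiples of 900 in the 40 kHz..47 kHz range:
print([n for n in range(40000, 47001) if n % 900 == 0])
# [40500, 41400, 42300, 43200, 44100, 45000, 45900, 46800]
```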

There's a wikipedia page about it (do an "I feel lucky" search on google for "44.1").

Comment: bufferbloat science (Score 1) 134

by nothings (#39946075) Attached to: Controlling Bufferbloat With Queue Delay

Has Van Jacobson's research on "bufferbloat" ever been replicated? Because I'm pretty sure "the cause is persistently full buffers" is only "according to researcher", singular, not the plural researchers the submitter claims.

(The linked article, by Jacobson and a collaborator, cites two sources: one is Jacobson's original article, and the other is by Jacobson and the same collaborator.)

I'm not saying he's wrong; I'm saying this isn't very scientific.

Comment: Re:Something for the wrist? (Score 1) 133

by nothings (#39789631) Attached to: Brain Scan Can Predict Math Mistakes

If a test had 20 problems, and you screwed up one problem because of an arithmetic error (forgetting to divide by 2), and did every other problem perfectly, you shouldn't get a D on the test. A grade of D does not reflect the knowledge or competence you exhibited on the test.

If the test requires 20 steps that build on each other, and you screw up one step because of an arithmetic error (forgetting to divide by 2), and do every other step perfectly (but building on that incorrect step), you shouldn't get a D on the test. A grade of D does not reflect the knowledge or competence you exhibited on the test.
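To make the two grading schemes concrete, here's a toy sketch (all names and numbers invented): grading dependent steps by final numeric correctness punishes every step downstream of one slip, while grading each step given its (possibly wrong) inputs credits the work actually done.

```python
def grade_by_answers(answers, expected):
    """Score by whether each numeric result matches the right answer."""
    right = sum(a == e for a, e in zip(answers, expected))
    return 100 * right / len(expected)

def grade_by_method(step_done_correctly):
    """Score by whether each step was performed correctly given its inputs."""
    right = sum(step_done_correctly)
    return 100 * right / len(step_done_correctly)

# 20 steps, each doubling the previous result; step 5 forgets to divide
# by 2 (so its output, and everything after it, is off by a factor of 2).
expected, got = [1], [1]
for i in range(1, 20):
    expected.append(expected[-1] * 2)
    got.append(got[-1] * 2 * (2 if i == 5 else 1))

print(grade_by_answers(got, expected))                # 25.0 (only steps 0..4 match)
print(grade_by_method([i != 5 for i in range(20)]))   # 95.0 (19 of 20 steps done right)
```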
