Comment Not any more. (Score 1) 502

I do not use onboard computer DACs. Never found one that I liked.

Yes I am insanely fussy about sound quality compared to most folks.

It used to be that I would get a sound card in order to get digital sound out via SPDIF, so I could run it through a nice external DAC (typical good ones cost about $1K and up).

Nowadays that isn't needed anymore. Integrated sound almost always comes with SPDIF out, and most external DACs have USB inputs, so I don't need a sound card to get sound into my DAC these days.

Comment Re:Technically, it's not a "draft notice" (Score 1) 205

The closest we get to that is the airport, where rights have been considerably and visibly curtailed (as opposed to the comparatively invisible loss of rights due to government intrusion in electronic communications). People seem to have accepted that more or less gracefully: they bitch, but it's not seen as a massive imposition on most people's daily lives.

I don't know if we'd ever get to the point of rationing food. Even if we declared a full-scale war, technology means we grow a lot of surplus food in this country. Prices might rise, but I don't think we'd ever see "grow victory gardens" posters as we did in the last unlimited war.

Oil, however, would skyrocket, and technology might be severely curtailed. It would be interesting to see how people reacted to that. It's hard to say whether that would be a bigger factor than outrage at a draft of manpower. In Korea and Vietnam, a lot of the public seemed to take the draft with equanimity since it came without the kind of rationing we saw during World War II.

Comment Re:As plain as the googgles on your face (Score 1) 56

As intrusive as the Google Glass has proven to be, it will only be worse when observation recording tech is more difficult to detect.

I disagree; the exact opposite will happen: when people stop noticing, they will stop caring. It won't be perceived as intrusive anymore, and people will be less annoyed by it.

It's the conspicuousness of the camera in Google Glass, the constant reminder that you might be recorded, that makes most people feel creeped out. For the decade leading up to that product, nobody cared about small, cheap camera tech itself. And people walk and drive past fixed-position cameras all the time, and don't give a fuck there either. People's behavior shows that "intrusiveness" happens when a camera looks like a camera, and I suspect it also has something to do with being face-level, literally "in your face," so that you're making eye contact with it, unlike the case with less conspicuous cameras. It was never about privacy; it's some aspect of self-consciousness that's related to privacy, but a different thing.

You might say, "Maybe you don't care, but I sure do. Hell yes it's about privacy." Of course you say that. I'm talking about how people behave and the emotions they display, not their innermost secret thoughts: the ones they are terrified to express in voting booths or policy decisions, yet happy to speak of on the Internet.

You know, the Internet, where they don't have a camera in their face making them all self-conscious! The Internet, where instead of a terrifying 1x1 pixel image that makes you think "WTF is that? That's weird! Are you watching me?" you now instead see a bunch of "like buttons," which are obviously for liking things, not for getting your browser to send a request to an unrelated tracking server.
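For anyone who hasn't seen it spelled out: the pixel and the like button are the same trick, because anything a page embeds makes your browser fetch a URL, and that URL can carry tracking data to a third-party server. A minimal sketch in Python (all server names and parameters here are hypothetical, purely for illustration):

```python
# Both a 1x1 "tracking pixel" and a "like button" work the same way:
# embedding them makes the browser send a request to a third-party
# server, and the request URL itself carries the tracking information.
from urllib.parse import urlencode, urlparse, parse_qs

def tracking_pixel_tag(base_url, page_url, visitor_id):
    """HTML for an invisible 1x1 image whose URL reports which page
    this visitor is on (illustrative only, not any real tracker)."""
    query = urlencode({"page": page_url, "uid": visitor_id})
    return f'<img src="{base_url}?{query}" width="1" height="1" alt="">'

def like_button_tag(base_url, page_url):
    """A visible social widget; the embedded iframe triggers the same
    kind of third-party request as the invisible pixel does."""
    query = urlencode({"href": page_url})
    return f'<iframe src="{base_url}?{query}"></iframe>'

tag = tracking_pixel_tag("https://tracker.example/p.gif",
                         "https://news.example/story", "abc123")
# The third party can recover the same data from either request:
src = tag.split('src="')[1].split('"')[0]
params = parse_qs(urlparse(src).query)
print(params["page"][0])  # the page the visitor was reading
```

The only difference between the two tags is that one is invisible and one looks like a friendly button; the request your browser makes is equivalent either way.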

In addition, there's a certain inevitability about it all. The cameras have been there a long time, there are more today, and there will be even more tomorrow. You can't do anything about it, except stay at home. So you'll either accept it or you'll go insane and get selected out. You'll handle it. (Contrast that with Google Glass, the one small camera out of the hundreds out there that you actually recognize, and which is also rare enough that there's little social cost to shunning it. With Glass you can refuse to accept it and still stay within social norms, so Glass is different.)

Comment Re:Technically, it's not a "draft notice" (Score 1) 205

I recall some talk during the lead-up to the Afghan war about the potential for a draft. It wasn't clear at the time just how big that particular conflict would get. It wasn't impossible to imagine it turning into a World War-sized scenario against a lot of Islamic countries. The resulting conflicts were small compared to that, but we had to scale up the military substantially, and if they'd grown any bigger we'd have needed a draft.

Now that women are allowed access to combat positions, it's going to be very hard to exempt them from a draft should one be necessary. I can't conceive of the legislature passing any such bill right now (I can't imagine this Congress passing any non-trivial bill, and I don't see that changing), but a wise legislature would want to do that ahead of need rather than after the fact. If women are going to be drafted, you'd need to start registering them now.

I sincerely hope that it's never necessary. And if a war of that scope does happen again, we'll probably be a lot less selective with our weapons of war. (Afghanistan and Iraq were fought house-to-house, because as bizarre as it sounds that was a way of reducing civilian casualties, at least compared to just flattening entire cities as was done in World War II.) So we may well not have a draft even in a bigger conflict. But I think that, while it's politically impossible, a really good pragmatic case could be made for starting to require Selective Service registration for everybody right now.

Comment Re:Most humans couldn't pass that test (Score 1) 285

To me, this seems to cut to the heart of it. AI is commonly conceived of as mimicking human intelligence, yet there are cognitive tasks that cats and even mice can do that prove too hard for computers. A cat can recognize a mouse with essentially 100% accuracy, from any angle, in an eyeblink. No computer comes close, and the program that came closest wouldn't be a general-purpose object matcher.

Vertebrate brains are pretty remarkable. Human brains are an amazing extra step on top of that. We don't know exactly what that step is, in part because we don't really understand the simpler vertebrate brains. IMHO, we won't have a good mimic for sapience until we've first managed sentience. We don't have to follow the evolutionary order rigorously, but it seems to me that conversation-based tests reward the wrong features, and even if programs get better by that definition, they're not getting any closer to the actual goals of understanding (and reproducing) intelligence.

Comment Re:Tannenbaum's predictions... (Score 1) 136

Predicting that x86 would go away was more wishful thinking than anything else. At the time, Intel had just switched from pushing the i960 to pushing the i860, and would later push Itanium as x86 replacements (their first attempt at producing a CPU for which it was impossible to compile efficient code, the iAPX432, had already died). Given that Intel was on its second attempt to kill x86 (the 432 largely predated anyone caring seriously about x86), it wasn't hard to imagine that it would go away soon...

Comment Re:A great writer (Score 2) 136

I found Modern Operating Systems better than the Minix book. The Minix book tells you exactly how a toy OS works in detail. Kirk McKusick's The Design and Implementation of the FreeBSD Operating System (new edition due out in a month or two) tells you how a real modern OS works in detail. Modern Operating Systems gives you a high-level overview of how modern operating systems work and how they should work. If you want to learn about operating systems, I'd recommend reading the FreeBSD D&I book and Tanenbaum's Modern Operating Systems, and skipping the Minix book (which was also a bit too heavy on code listings for my tastes).

Comment Re:Does this mean the death of Minix3? (Score 1) 136

I feel it necessary to point out, though, that OS X is not a microkernel system comparable to Minix

While this is true, it's worth noting that a lot of the compartmentalisation and sandboxing ideas that most of the userland programs on OS X employ (either directly or via standard APIs) have roots in microkernel research. OS X is in the somewhat odd situation of having userspace processes that look a lot more like the servers of a multiserver microkernel than its kernel does...
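For readers who haven't met the multiserver idea: instead of one privileged program doing everything, isolated processes each provide a service and talk only via message passing. A toy sketch of that shape in Python (the server, its protocol, and all names here are hypothetical, not any real OS X or Minix API):

```python
# A sketch of the multiserver idea from microkernel research: an
# isolated "server" process owns its data, and clients can only reach
# it through messages, never through shared state.
from multiprocessing import Process, Pipe

def storage_server(conn):
    """A service running as its own process; crashes or compromises
    here can't directly touch any other component's memory."""
    store = {}
    while True:
        op, key, value = conn.recv()
        if op == "put":
            store[key] = value
            conn.send(("ok", None))
        elif op == "get":
            conn.send(("ok", store.get(key)))
        elif op == "stop":
            conn.send(("ok", None))
            break

if __name__ == "__main__":
    client_end, server_end = Pipe()
    server = Process(target=storage_server, args=(server_end,))
    server.start()
    # The "client" (think: a sandboxed app) makes requests by message.
    client_end.send(("put", "greeting", "hello"))
    client_end.recv()
    client_end.send(("get", "greeting", None))
    print(client_end.recv()[1])  # hello
    client_end.send(("stop", None, None))
    client_end.recv()
    server.join()
```

The point of the sketch is the boundary: the client never sees the server's dictionary, only the reply messages, which is the same discipline OS X sandboxed apps follow when they go through brokered helper processes.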

Comment Uh... I don't get it (Score 1) 28

I did read the fine article, but I'm afraid I just don't get what's going on here. Are the players contributing something in some kind of crowd-sourced "Yes, that blob is a star, and its center is here" kind of way? Or are they using players' computers as a distributed processing system?

It's nifty either way, but I don't think the New Yorker's audience has the same kinds of questions about the technology that I do. Can anybody in this audience (one more like me) help me out?

Comment Re:Bitcoin isn't money but it's still a financial (Score 1) 135

Bitcoin's primary purpose is to traffic/launder money and goods.

Objection. I'll stipulate that its primary purpose is to traffic. But I call mega-bullshit on its primary or even secondary purpose being to launder, though there might be a way one could use Bitcoin for that.
