Comment Missing the (well, a) point (Score 3, Insightful) 433

Saying that drone warfare is not particularly good at decapitating an institutional terrorist organization like Al Qaeda is missing the point. Or at least a key point. Drone warfare has made large-scale terrorist training largely impossible. The boot camps and months-long practical courses in guerrilla warfare that used to be an Al Qaeda staple are now just very visible, attractive targets for drones. Drone warfare occasionally knocks out a head, but it really undermines the base.

All force carries some deterrent power. For some technologies, the deterrence is the whole point. For example, land mines aren't meant to be a good way to blow people up; they're meant to be a good way to prevent groups of people from traversing an area once you advertise that it's full of mines. Here, drones are useful for rapid, cheap attacks of opportunity... but the fact that they are almost always ready means long-term, open-air training camps are suicide.

Comment Re:Privacy only works when it's in your own hands (Score 1) 300

Exactly. It was always a pretty bad idea. In fact, it reminds me a great deal of the RFC 3514 "evil bit".

Do-Not-Track is basically a "Don't be evil" bit. It makes a plea on behalf of the end user and the end user hopes some distant system honors it. Any time you implement some version of the evil bit, you should expect that it's not going to work.

(Then again, there are a lot of tech features in use now -- such as a PDF owner_pass edit lock, or phone-service Caller ID blocking -- which are also based on a "please keep private" bit, and those are effective for the 98% of people out there who are just too lazy to get around them. So maybe there's something to be said for an evil bit after all.)
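For the concrete-minded, here is everything Do-Not-Track amounts to on the wire: a short Python sketch using the requests library against a placeholder URL. The header name really is just "DNT"; everything else here is illustrative.

```python
# A minimal sketch of the entire DNT mechanism: one advisory header that
# the distant system is completely free to ignore.
import requests

resp = requests.get(
    "https://example.com/",
    headers={"DNT": "1"},  # the whole plea: "please don't track me"
)
# Nothing in the protocol changes based on this header; honoring it is
# entirely up to the server -- exactly like RFC 3514's evil bit.
print(resp.status_code)
```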

Comment Short news is dead, long analysis lives. (Score 1) 285

"Long read" periodicals, which rely on research or expertise are still worth reading. The Economist and Foreign Policy are tow that stick out in my mind.

Local news may or may not be good. When national coverage dominates, you're basically getting a watered-down version of last week's CNN. When local coverage dominates, at least you know there was probably no other source for that information.

Industry journals cover esoteric topics no one else will, so those count if you are actually interested in the esoteric topics.

Sadly, the niche hobby magazine is pretty much dead. Big players release news and content directly to the web, and the best commentary is spread around blogs and web-zines. In fact, if the bulk of a magazine can be described as "news about X" or "a community newsletter for Y", then it's dead.

Comment A Reminder: there is no STEM worker shortage (Score 1) 529

America's tech leaders are literally going to Washington with demands for "comprehensive immigration reform that allows for the hiring of the best and brightest".

I'm honestly surprised that more hasn't been said about this statement so far. I suppose it comes up rather frequently here when visas come up, but I think it needs to be stated again: there is no STEM worker shortage. There is no lack of qualified people. American companies are just too cheap to train, and don't want to pay American workers proportionate to their talents and the cost of living in America. And I think it's worth repeating again, and again, and again, because as near as I can tell, policy-makers actually seem to believe the nonsense they are being fed.

Comment The move may be counter productive (Score 1) 424

First, my immediate response, as a Time Warner customer, was, "Well, we're canceling then. I literally would prefer to deal with Satan than Comcast." That's not an abuse of the word literally: I mean in all seriousness that the Morning Star, enemy of man, font of lies and evil, has a better business track record than Comcast, and I would be more likely to extend him the benefit of the doubt. This means we're ditching cable, internet, and phone from Time Warner.

Second, AT&T still sells internet connections. As do others if we really need to move to a smaller firm. And that's just because we've so far been too lazy to set up either of our devices as a cell modem or link it to a larger screen. Mobile phones make the land line and the cable lines have a lot less value. Comcasts' proposition -- that cable is inherently valuable even with shitty service provided to relatively few true television-viewers -- is already on very shaky terms.

Third, the wife and I immediately re-evaluated how much TV we watch. We quickly came to the agreement that we DVR more than we ever watch, that most of those shows could have been received over-the-air from the networks, and that we really aren't that interested in most of them anymore. Most of the time we ignore the DVR-ed material to binge on classics and series on Netflix, new releases from Redbox, or just plain, old-fashioned, 3-seasons-on-sale-for-$15 DVDs. We can do with a lot less cable.

In the short run, the people most affected will be families that can't imagine ditching the Disney Channel. In families without adolescents yet, I think a lot more people will just be too cheap to ever -start- on the Disney Channel. Their strategy, like every Comcast strategy, is short-sighted.

Comment Re:TheAgriculture Ministry is not in charge of Gun (Score 3, Interesting) 112

Except that's not what the article accuses them of. The article mainly accuses them of editing badly.

For those who didn't RTFA, here are the high points:
* IBM was huge in computing, so why is it so poorly represented (in terms of article count, total kB of text, and editing quality) on Wikipedia, the self-appointed online repository of all human knowledge?
* people at IBM seem to be editing IBM-related articles, but not in any kind of organized way. (The article actually chastises them for FAILING to have any kind of organized method.) Mostly it's people editing articles about themselves or things that they have worked on.
* The person who worked on the Watson project is an admin on Wikipedia, is married to another editor, and edited Wikipedia articles while on the job for IBM. (Almost as if she were passionate about it or something... and working on a project where her computer barfed up nonsense when it parsed a really poorly written article....)
* the three shadiest things they mention are 1) a guy who created an article about an IBM award/title he won; 2) an editing fight about the relevance of a book that linked IBM to the Third Reich (which went through the usual Wikipedia channels and ended up in favor of keeping the article); and 3) the guy who started BASH.org (and who happens to be at IBM) arguing that the page was relevant and should be kept (again, usual Wikipedia channels, this time not in BASH.org's favor)

So basically what we have here are the notions that:
* even relatively obscure people probably shouldn't edit articles about themselves, to avoid bias (which strikes me as silly, since it just biases things hard in the other direction)
* that IBM needs to tackle Wikipedia in an organized way to make up for the lack of interest by anyone outside the industry in preserving this huge chunk of history...
* unless it stays away altogether, because they already have a huge company history page on their website.
* and that IBM-ers should not touch the articles that they are most likely to have specific knowledge on...
* ...or for that matter any article, no matter what they happen to find odd if they found it while at work.

Like most fights on and about Wikipedia, this is a tempest in a teapot, stirred by people who do a poor job articulating whether the collaborative encyclopedia of all human knowledge is actually supposed to be any of those things, and why.

Comment Re:Why? (Score 1) 86

There are a couple of reasons.

As a starter, I remember an interview from way back in the aughts where they asked an IE designer for his thoughts on the Firefox browser, which was at that point really cutting into IE's market share. I remember one comment along the lines of "really good browser; the only thing I would change is to put tabs on top. The address bar and everything else only affects the current tab, so you want tabs on top to give the impression that each tab is like its own, separate browser." At the time, IE didn't have tabs, so he could say that sort of thing without worrying he was shooting himself in the foot.

He did cite Microsoft usability studies (no specific study, just the nebulous term "usability studies") as part of that comment. Eventually Mozilla did its own study and concluded pretty much the same thing. There was also an argument about how tabs would be easier to select while using less screen space because of the "infinite space" of the screen-edge tab. You see, if you move the cursor toward a tab and go too far, one of two things can happen: you either overshoot the tab and land on something else (a miss), or you hit the edge of the screen and the cursor lands on the tab anyway. The argument was that this is in effect like having an infinitely tall tab, so it's easier to hit.
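If you want to see the shape of that claim in numbers, here's a back-of-the-envelope Python sketch using Fitts's law, MT = a + b * log2(D/W + 1). The constants a and b below are invented for illustration; I am not quoting any of the studies mentioned above.

```python
# Fitts's law: predicted pointing time grows with distance and shrinks
# with target size. An edge-pinned tab can't be overshot, so its
# effective size along the travel axis is effectively enormous.
import math

def movement_time(distance_px: float, size_px: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Predicted seconds to acquire a target of a given size at a given distance."""
    return a + b * math.log2(distance_px / size_px + 1)

# A 20 px tall tab in the middle of the screen:
print(movement_time(400, 20))    # ~0.76 s
# The same tab flush against a screen edge -- the cursor can't overshoot,
# so the difficulty term collapses:
print(movement_time(400, 2000))  # ~0.14 s
```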

Now, some personal comments on why I hate that entire line of reasoning:
First, back at the time I found the initial comments from an MS employee to be odd, because that's exactly the opposite of how MS had trained users to think of tabs in every one of their products (except for this hypothetical "tabs on top" browser, which didn't exist anywhere yet). Before the browser, you mostly saw tabs in OS preference dialogs, where sometimes the tabs were on top just because they were used as categorical dividers (you know, just like real tabs in a real filing cabinet were always meant to be). But just as often, there would be a small section of tabs embedded in some larger dialog pane. The only thing they had in common was the obvious "tabs are nested within windows". To the population of the time, window and browser were inextricably linked.

But that was then, and this is now. What about how people ten years later are used to interacting with the browser? Well, for one, most still don't actually think of each tab as a "mini-browser". If anything, they just expect the browser control elements to go away altogether to make room for the page. (In fact, the ease with which mobile browsers have hidden away such controls proves to me that taking up _any_ space within a tab is probably a losing proposition.) But where hiding elements isn't possible, the view is still generally that the window is a true "window" out to some slice of the internet. To me personally, arguing that each tab should contain "its own" URL bar and buttons is sort of like arguing that each window in your car should have its own steering wheel and speedometer. It just doesn't follow for me to embed controls within content. But since the controls are being hidden away fast, it's largely a matter of choice.

So why should tabs-on-top be a good default choice, then? The argument is that it makes tabs easy to select with minimal real estate: the "infinite space" of the screen-edge tab. Unfortunately, I don't think I have ever had a browser end at a screen edge. Either I'm on a Mac or a Linux variant that has a bar on top, or I'm in Windows, where I never use full-screen mode. (I'm having trouble even thinking of a time when I would use full-screen mode for a browser now that the screens are all obscenely wide.) So the infinite-space argument is DOA. And then there are the times when I'm on a half-foreign system (read: work laptop) where the touchpad cursor is slow and all I want is for the cursor to get there; overshooting is not a concern. In these cases, making the tabs farther from center and shorter while trying to make up for it with pseudo-infinite space just makes them shorter and farther away.

And on a final, unrelated note, when the summary notes that "it is feasible to combine the address, search, and find box into one": of course it is. We did it before 2004. It was called a command line. It's trivially easy to designate some box as the "do things with this input" box. The actual command used to act on that input was still offloaded to other elements, such as the "go" button and the "search" spyglass. Even in 2004, it wasn't a matter of feasibility. The only reason the bars were ever separate in the first place is that more screen space for your "Ask Jeeves" toolbar meant more advertising for "Ask Jeeves".
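If you doubt how trivial the combined box is, the dispatch logic fits in a few lines. Here's a minimal Python sketch; the URL heuristic and the search engine URL are placeholders I picked, not anything from the summary.

```python
# One "do things with this input" box: if the input looks like a URL,
# navigate to it; otherwise hand it to a search engine.
from urllib.parse import quote_plus

def dispatch(user_input: str) -> str:
    text = user_input.strip()
    looks_like_url = (
        "://" in text
        or (" " not in text and "." in text)  # e.g. "slashdot.org"
    )
    if looks_like_url:
        return text if "://" in text else "http://" + text
    return "https://www.google.com/search?q=" + quote_plus(text)

print(dispatch("slashdot.org"))       # http://slashdot.org
print(dispatch("tabs on top study"))  # https://www.google.com/search?q=tabs+on+top+study
```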

Comment Re:Scalpel or gun can be used for good or bad ... (Score 4, Insightful) 406

When doctors or nurses use their knowledge of anatomy in order to torture or conduct medical experiments on helpless subjects, we are rightly outraged. Why doesn't society seem to apply the same standards to engineers?

Whenever I read something like this, I immediately think of Florman's "The Existential Pleasures of Engineering". Despite the title, Florman's book is actually a spirited apology for the engineering profession, written in an age when everyone was lamenting all the modern horrors that those damned engineers could have prevented if they had just been more ethical.

As Florman notes, there has been a large focus for the past half-century on making engineers more ethically aware, and it's mostly pointless. Despite what most people seem to believe, engineers are not philosopher-kings, any more than Technology is some sort of self-sufficient, self-empowering beast working counter to the benefit of human society. Both do exactly what the rest of society tells (read: pays, begs, and orders) them to do, and nothing more. And while you don't see many engineers saying this -- because when someone tells them that they run the world and hold the future of all mankind in their hands, people are disinclined to temper their ego and deny it -- we only do what the suits pay us to do, and if we don't do that, they fire us and move on to someone else who will.

Let's ask this another way: why aren't businessmen considering the ethical implications of their investments? Why aren't militaries, bureaucracies, and governments considering the ethical implications of their orders? Why isn't the average person taking five minutes to understand a problem now, so he doesn't demand that government, the market, and God on high give him an answer that he's going to hate more than the original problem a year from now?

Every profession has ethical considerations. More ink has been spilled and time spent on the subject of ethics in engineering and the practical sciences than on any discipline save medicine. And yet it does not solve the problem, and will not solve the problem, because that is not where the problem lies.

Comment Re:Iridium + Something Else (Score 2) 175

I did some work with both Iridium and Inmarsat on a project a while back. It's been a while, so my comments are mostly qualitative, not quantitative.

Iridium offers a global array with redundant satellites (which is good, since they lost a few a few years back), while Inmarsat uses geostationary satellites and relies on you being able to actually aim a directional antenna. If you're in Inmarsat's coverage area (and pretty much everyplace habitable is), I'd recommend it. You can get an Ethernet-ready, single-package antenna+modem (about the size of a thick laptop) that's pretty easy to aim (the unit provides some guidance). This assumes you're on foot, of course; if you have a dedicated vehicle you might invest in a tracking antenna. The data rates we got were in the 0.35 Mbps range.

Iridium is literally dial-up over satellite. The service was designed for voice telephony, and it carries an analog signal until the satellite relays it to a base station with modems and an internet connection. It will be reliable, but very slow. The 0.0024 Mbps rate Spazmania gives below matches my recollection.

The two units are similarly portable: the Inmarsat unit is the size of a thick laptop, while the Iridium modem is half that, but you have to add an antenna.
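To put those rates in perspective, here's the back-of-the-envelope arithmetic in Python, using the approximate figures above (both from memory, and ignoring protocol overhead):

```python
# Rough transfer-time math: megabytes * 8 bits/byte / megabits-per-second.
def transfer_seconds(size_mb: float, rate_mbps: float) -> float:
    """Seconds to move a payload at a given line rate, ignoring overhead."""
    return size_mb * 8 / rate_mbps

print(transfer_seconds(1.0, 0.0024) / 60)  # Iridium: ~56 minutes per megabyte
print(transfer_seconds(1.0, 0.35))         # Inmarsat: ~23 seconds per megabyte
```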

Comment Re:Maybe (Score 1) 293

Here's the issue: either there is something out there we can't see (hence "dark matter") which takes on more and more fantastical properties the more we learn about it, or our understanding of the universe's mechanics on a grand scale is wrong (and wrong in such a way that it lines up pretty well with our small-scale understanding). Or, for that matter, both could be true to some degree.

The scale is beyond the range of direct experimentation, so what can you do? In the former case, you can try to find some other way to observe the material. In the latter case, you can keep making observations until you have enough data to form a new understanding. That's about it. Until you have progress on one, it's very, very difficult to rule out the other.

At this rate, I am disinclined to believe in dark matter. Unfortunately, whether you believe in it or not you have to go through a similar search to determine what's true. This is the slog we're going through now.

Comment Re:GMA 600? Last years Atom? $200?!? (Score 2) 214

It's basically an MCU eval board: short PCB runs, low-quantity orders for every part in the BOM, very generic capabilities. EVBs are expensive. Sure, the engineering time that goes into it is pricey, but on a per-unit basis it's the fact that these are very general-use, quasi-custom toys that drives up the cost.
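If you want to see why quantity dominates, here's a toy cost model in Python. Every number below is invented for illustration; the point is only how fixed costs amortize.

```python
# Per-unit cost of a board: parts cost plus fixed costs (engineering,
# tooling/setup) spread over the production run.
def unit_cost(bom: float, nre: float, setup: float, qty: int) -> float:
    """Per-unit cost: BOM plus fixed costs amortized across the run."""
    return bom + (nre + setup) / qty

bom, nre, setup = 25.0, 50_000.0, 5_000.0  # all illustrative
for qty in (500, 5_000, 500_000):
    print(qty, round(unit_cost(bom, nre, setup, qty), 2))
# 500 -> 135.0, 5000 -> 36.0, 500000 -> 25.11: the same board at consumer
# volumes would cost a fraction of an eval-board price.
```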

Comment It's about time (Score 3, Insightful) 192

Every time I look at my old engineering texts taking up shelf space, I think, "I wish someone could take all these, cut out about half of the valuable material, dice up the remainder between 30-odd sites and apps, and then tie it to a device with a 7-year shelf life."

As anyone who's dealt with education-oriented online media (such as Blackboard) can tell you, the products are not always stellar. You get less text, it's usually structured in such a way that it takes longer to read, the access is spotty, and it will probably work even less well a year from now. Even the number-one benefit of digitization -- search -- tends to be awkward or incomplete.

They say the iPad is about half the cost of books. I can easily believe that, but it also means you don't get to buy used books, or re-sell your used books. They've streamlined the process in a way that either offers no benefit, or benefits suppliers more than students.

It did convince the university to buy their students' books for them -- provided you don't consider being forced to buy an iPad to be the same as being forced to "rent" used books. Or, for that matter, so long as you don't consider going to a free library an option. And so long as you don't consider that buying an iPad and getting electronic copies of textbooks was always an option for most books. All the ways they've streamlined the process are for the primary benefit of the supplier of the material.

Overall, it seems workable for books that you have no interest in keeping beyond one semester (electives). But that is exactly the case where you generally benefit from being flexible: buying bog-standard books from any store you please, buying a digital copy, or going to the library as needed. If you're talking about material that will actually continue to be relevant after a single semester, it sounds like a bad idea, putting a random-valued timer on your reference material.

Comment Re:interesting take. (Score 1) 158

the problem has always been that you have to be of questionable morality to harvest this data, get data of probably low quality, and piss people off

It may not solve the "current" problem, since advertisers won't even use this if they don't feel it will help them - they already have their means. But that doesn't mean it's useless. Don't be myopic.

No one said it's useless. It's very useful, primarily to advertisers, who will definitely make use of it if it's available. And it aims to solve _exactly_ the "current problem" of advertising, couched in language that presumes there is surely some other use out there that we're going to treat as the primary use case once they figure out what it is. So far, their attempt to find another use case amounts to a news site that skips right to the sports section (or a site that knows that to you, news.tld really means news.tld/sports). At best (where the user is concerned), it's a solution in want of a problem. At worst, it's an attempt to obscure its primary purpose. And the morality of the purpose itself remains questionable.

Moreover, it could theoretically work for a lot more than just ads and marketing. It would basically permit third parties to be granted access to your data. So, for instance, you could grant your geolocation database to only certain mapping sites. Or your social media history only to a game site that will utilize it.

There are many applications that are appearing for this kind of information harvesting that aren't all malicious. Some of them are even exciting.

Except that's not what's being proposed at all. It's a good idea that any data sharing should require explicit user approval on a party-by-party basis: history, geolocation data, anything. But they're not talking about parsing your geolocation or social connections or anything like that. Their proposal is not "sharing certain things with certain parties over the internet"; that's pointlessly broad. They're talking about divining general categories of interest, StumbleUpon-style, from your browser history -- the types of sites you'd like to see more of, and the stuff you're most likely to buy -- and sending that to designated parties. In other words, they propose giving you increased power to allow people to advertise to you. But since they haven't fixed any other method of tracking, you don't have a corresponding increase in your power to disallow advertising. The benefits of this are 100% in favor of the advertiser.
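To make the asymmetry concrete, here's a hypothetical Python sketch of what "granting categories to designated parties" amounts to. All names and data structures below are invented for illustration -- this is my reading of the proposal, not their API.

```python
# A per-site grant table decides which interest categories a site may see.
# Note what's missing: nothing here reduces any tracking the site already does.
PROFILE = {"interests": ["sports", "electronics", "cooking"]}

GRANTS = {
    "news.example": ["sports"],       # news.tld effectively means news.tld/sports
    "shop.example": ["electronics"],
}

def categories_for(site: str) -> list[str]:
    """Interest categories the user has opted to share with this site."""
    allowed = GRANTS.get(site, [])
    return [c for c in PROFILE["interests"] if c in allowed]

print(categories_for("news.example"))     # ['sports']
print(categories_for("tracker.example"))  # [] -- but it can still track you anyway
```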

Um, they're proposing that you, the user, can tell sites what you want them to know. Nothing at all, or certain information for certain sites. This is opt-in. It will be useless to those of us who don't want it. It will help those who aren't as hardline about ads.

Respectfully, no technology in the history of the web has actually panned out this way. The web as it's consumed now is developer-skewed. Ultimately, you're looking to the developers to provide the medium, and often the content, for everything you consume. Your only ultimate freedom is to choose to participate on the developer's terms, or not. Everything else is either granted as a developer favor (perhaps because he actually cares about a democratized, user-empowered web), pried out of the service under false pretenses, or just a result of a blind spot. If the developers don't want to support IE8 anymore, they have the power to lock you out (unless you spoof the UA string). If they want to force javascript on a site that doesn't even need it, they can detect non-compliance and refuse to serve you (unless you run NoScript and essentially lie about having a browser that runs javascript). If they want to require GPS location from your phone and refuse to let you use their location-driven service otherwise, they can. This is generally in the name of offering a uniform experience (read: creating complex applications with minimal testing), but the fact of the matter is that you still lock people out of the information they want because they aren't consuming it on the terms you want.
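And that lockout takes almost no effort on the developer's side. A minimal server-side sketch in Python using Flask; the UA pattern and route are illustrative, not anyone's real deployment.

```python
# Gate content on the User-Agent string: a few lines is all it takes,
# which is why spoofing the string is the only way around it.
from flask import Flask, request, abort

app = Flask(__name__)

@app.route("/")
def index():
    ua = request.headers.get("User-Agent", "")
    if "MSIE 8.0" in ua:  # old IE8 token; the pattern is just an example
        abort(403)        # locked out unless the client lies about its UA
    return "content"
```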

If everyone else is fine enabling cookies just to get any use out of, say, target.com, then you can bet you will eventually be expected to as well. Even if you're just "window shopping" and don't need cookies for any legitimate reason. It becomes the burden of the cookie-wary to either compromise their preferences or figure out a technical solution to bypass the developer's preferences. This is the meaning of my phrase above, "you are as free as your neighbors want to be".

Any useful technology enables new things that were not possible without it. It's a tautology, then, to say that the technology will be a requirement for something. If it's accepted as a requirement for enough things, it soon becomes an acceptable requirement for pretty much _everything_, the way cookies, CSS, javascript, and the right user-agent string all are these days. That is why we have to be skeptical, and not just throw our arms lovingly around every shiny bauble that gets thrown our way.
