Comment: Re:Easy... (Score 1) 1121

No. Bigotry is thinking less of people because of assumptions you are making, or because of factors beyond that person's control.

For example, if a person genuinely believes that black people are inferior and should not have the right to vote, then I will think that belief is wrong and stupid, and I WILL and SHOULD think that the person is wrong and stupid for choosing to hold that belief.

Our beliefs are choices and they have consequences: they can reflect poorly on our moral character and critical thinking abilities. I don't want to argue that someone with religious belief should be judged poorly, but if someone does want to make a careful version of that argument, that person would not be a bigot.

Comment: Re:What OSS really needs... (Score 1) 356

by Canjo (#42935887) Attached to: Ask Slashdot: What Does the FOSS Community Currently Need?
The narrowness of the scroll bars is totally irrelevant because any modern Apple input device lets you scroll with two fingers on the mouse. This is the preferred way to scroll; it is much faster, more natural, and more accurate than dragging scroll bars around. I'm not an Apple fanboy; there's a lot I don't like about the OSX interface. But blaming them for shrinking the scrollbars is just stupid.

Comment: Summary is completely misleading (Score 1) 293

by Canjo (#41836867) Attached to: Empathy Represses Analytic Thought, and Vice Versa

The research is about social cognition vs. mechanical cognition--reasoning about other people's beliefs vs. reasoning about physical situations. Both of these are highly analytical tasks! This has nothing at all to do with empathy vs. analyticity; in fact, this might be the worst-quality science reporting I have ever read.

The journalists who reported on the story this way have harmed the general public's understanding of psychology in a particularly pernicious way by reinforcing stereotypes. Unfortunately their behavior is typical and they probably would have been fired if they had given an accurate summary of the research.

Comment: Evidence? (Score 1) 580

by Canjo (#41280535) Attached to: Rick Falkvinge On Child Porn and Freedom Of the Press
The guy keeps talking about how current legislation is actually already preventing the capture of child rapists, which if true would be the most powerful part of his argument, but he does not elaborate that point nor provide any empirical evidence that this is what's happening. It's all speculation as far as I can tell. Maybe someone else can find some evidence?

Comment: Stop being embarrassed by the command line (Score 1) 1154

by Canjo (#41270401) Attached to: Ask Slashdot: How Would You Fix the Linux Desktop?
There's always a lot of fussing about how the command line scares people, how if you have to do something on the command line then something is horribly wrong. This attitude has resulted in the dumbing down of the various GUIs for Linux, as if Linux will ever be able to compete with Windows or OSX for the content consumption market, or as if it would be a *GOOD* thing to turn Linux into the crappy knock-off cheap alternative to a real OS. I don't see how anyone with the intelligence to engineer an OS or a UI could be motivated by such a dumb goal.

I don't want to say what Linux should or shouldn't be, but occasionally I have flashes, glimpses of something awesome that it *COULD* be, and I wish I had the skills to make it or even articulate clearly what I'm seeing. I'll try here.

In selling Linux to newbies we're always afraid to show them the command line, and so no one has worked on developing a better shell. As a result, the shell is some monster out of the 1980s. What if all the expertise in user interfaces were turned to developing a better command line? (The fish shell is a step in the right direction; the OS X Terminal also has some nice features, such as dragging and dropping files into it.) Where you interact with your computer by typing commands, rather than by pressing buttons? It isn't so unnatural. As I read in some post here a while ago, people have been talking to each other to get things done for millennia, but we've only been pressing on things that look like candy to get things done for a few years. On Star Trek, they instruct their computer verbally. And yet UI designers insist that our interfaces have to be entirely visual, because the CLI has been done in such a user-unfriendly way in the past.

The command line is powerful because it is a language; in languages you can build an infinity of possible sentences by combining words according to grammar. In Unix, you can make your computer do pretty much any complex task by chaining together other small programs. There is no comfortable visual metaphor I am aware of that enables this infinity of possible tasks. In a GUI, you have a finite amount of space on the screen, a finite number of buttons to press. In a CLI, you have an infinity of options that you can build up if you know the language. In my work, I do a lot of text processing; I make up new pipelines every day and I can't imagine any way of doing that in a GUI.
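The text-processing point above is easy to make concrete. Here is a minimal sketch of the kind of pipeline you can improvise in a few seconds by chaining small programs (the sample sentence is just illustrative):

```shell
# Ten most frequent words in some text, built by chaining small programs:
# split into words, lowercase everything, sort, count duplicates,
# then sort by count, descending.
printf 'The cat sat. The dog sat. The cat ran.\n' \
  | tr -cs '[:alpha:]' '\n' \
  | tr '[:upper:]' '[:lower:]' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -10
```

No single one of these programs knows anything about word frequencies; the capability emerges from the grammar of the pipe. That compositionality is exactly what a fixed set of GUI buttons can't offer.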

But the command line is still a dinosaur. It doesn't have to be that way.

This is my as-yet-fragmentary, inarticulate dream. Imagine if you type "ls" at the command line, and instead of a bunch of color-coded names of files (wow! color! on a monitor!?!?!? how high-tech!) you get thumbnails, so you don't have to open things individually to see what they are. And maybe those thumbnails stay at the top of the terminal, in some unobtrusive form, as you do your complicated renamings or whatever, so you don't have to keep typing "ls". What if you had the option of clicking cool.pdf OR typing evince cool.pdf? Or, if you don't remember that your pdf reader is evince, what if you could just type "open cool.pdf" and it would use $DEFAULT_PDF_THING? What if every program had a little terminal on the bottom where you could tell it what to do, in case you prefer to talk to your computer instead of press predefined buttons?
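The "open cool.pdf" idea is already close to what xdg-open does on Linux, and it can be sketched as a tiny shell function. This is a hypothetical illustration, not a proposal for a real tool: $DEFAULT_PDF_THING comes from the paragraph above, and the other variable names and fallback programs are placeholders.

```shell
# Hypothetical "open" dispatcher: hand a file to the user's configured
# default viewer based on its extension, with sensible fallbacks.
open() {
  case "$1" in
    *.pdf)         "${DEFAULT_PDF_THING:-evince}" "$1" ;;
    *.png|*.jpg)   "${DEFAULT_IMAGE_THING:-eog}" "$1" ;;
    *)             "${EDITOR:-vi}" "$1" ;;
  esac
}
```

The point is that the user types what they want done ("open") and the system worries about which program does it, instead of the user memorizing program names.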

I don't know if I've articulated this clearly--I think about it quite a bit, in my spare time. The last thing I want to see is the command line wrecked by trendy UX crap. Rather I want to see the real work on usability brought to bear. Linux doesn't have to be the next OS that limits you. Linux can be the OS that makes it easy to build new things, to do things no one has done before. Steve Jobs said that computers should be like a bicycle for the mind: it enhances what you can do, it lets you go farther to places you hadn't thought of before. And he seemed to fulfill that idea in things like HyperCard before dumping it in favor of turning people into content-grazing animals with stuff like the iPad.

Linux can be the bicycle for the mind, the thing that makes it easy to create, to build new things that haven't been done before, and for me at least the power of Linux for those tasks is in the VERBAL INTERFACE of the command line, as opposed to the limiting visual interface of, say, Windows. Windows, OS X, iOS, Android, whatever, they're all going in the direction of limitations, of allowing people to consume the limited content that exists in the world. OS X is a little better since it's so developer friendly, but it also shuns the command line in the same way Linux does.

This isn't just to say that Linux should be for power-users: Linux should CREATE power users. After years and years of miserable trial and error, I have become proficient enough with the command line that people ask me how to do things. It was extremely difficult to learn. It doesn't have to be that hard. The command line interface, because it is a verbal interface, is what enables a computer to have infinite potential. Maybe there needs to be a new shell, like what I was describing above, or maybe a scripting language that's more obvious to computer-know-nothings. But the potential is there, and it is in the command line, not the GUI.

Developers! Don't be embarrassed by the command line! It is strength, not weakness, and not even Apple realizes it. Stop trying to dumb down Linux, in pursuit of users you think are dumb--even the dumbest people are pretty smart, and get bored of just sitting around consuming "content". Let's make Linux the bicycle of the mind by embracing the command line.

Comment: Re:Badly needed (Score 1) 31

by Canjo (#39016327) Attached to: Texas Supercomputer Upgrading the Hurricane Forecast

But who on the ground even knows that things got more refined and predictable?

After all, everyone on the ground sees the same news media clowns struggling to stand upright in the same puddle of water getting lashed by winds in areas especially selected for their wind tunnel effect. The media is constantly preaching of doom and gloom and great destruction from storms which are ALREADY predicted via current methods to be largely spent by time of arrival. Disaster theater.

This is indeed a problem, but when you live in a hurricane-prone area you typically aren't watching the national news, which is trying to make $$$ by making a spectacle. You're probably watching the local news stations, which are relatively more informative. In fact, when Hurricane Gustav hit New Orleans in 2008, I remember the guy on the local Fox station explicitly told us NOT to watch Fox News, since it was just sensationalist and trying to scare people.

In any case, it doesn't hurt to err on the side of caution with these things.

Comment: Dubious linguistic claims (Score 5, Interesting) 297

by Canjo (#38975279) Attached to: If You're Fat, Broke, and Smoking, Blame Language
As a professional linguist I'm concerned about the linguistic analysis of English in this paper. The author claims that German does not have explicit future marking, while English does. He uses examples like:
"Morgen regnet es" --German, literally "it rains tomorrow", with no future tense marker
"It will rain tomorrow" -- English, with the future tense marker
He argues that the explicit future tense marking causes speakers to treat future events differently and thus damages savings or whatever. The statistical analysis in this paper looks pretty good to me, though I'm not familiar with the way economics people report linear regressions so it'll take more time to evaluate that. But the statistical analysis is no good if the linguistic analysis it's based on is wrong. Garbage in, garbage out.

The problem is that languages don't exclusively use or neglect to use future tense markers. For instance in German, you could use a future tense marker, as in "es wird regnen" (literally, it will rain). BUT you drop the future tense marker if you have a word like "tomorrow" that makes it obvious that the event is in the future, like "morgen regnet es" (tomorrow it rains). All languages make use of a variety of different patterns to mark future tense.

In English there is a similar pattern to German, for instance. People will very frequently say things like "I'm teaching tomorrow" or "I'm grabbing donuts with my friend tomorrow morning." The author ignores this, although it is very common in English usage, and even though it is a direct counterexample to his purported classification of English. He claims that English MUST mark future explicitly by pointing out that we don't say things like "I listen to a lecture"--but the problem with that sentence is NOT that it fails to mark future; the problem is that English uses the progressive in such contexts, and we could very easily say "I'm listening to a lecture tomorrow, so I won't be able to come to your party" or similar.

It turns out English and German have pretty much the same pattern of future tense marking. Maybe English speakers use explicit tense marking more than German speakers do, but that's a quantitative difference, which is ignored in this paper in favor of arbitrary categorizations.

If this fellow is so ignorant about the language he's writing in, how much can we trust his judgments about other languages? Or rather, how much can we trust him to be sufficiently critical of the linguistic categorizations that he's looking at, or to know what they really mean? Yes, his data was based on "expert" linguistic sources, but linguists are also prone to this kind of miscategorization, and are very often more driven by a need to make languages conform to certain modern theories than by a desire to make a legitimate description; furthermore the people writing about these languages are all operating according to DIFFERENT DEFINITIONS and different theoretical frameworks, a problem I have to deal with just about every day in my work.

tl;dr It looks like the author has given almost no thought to the lack of soundness in the linguistic categorizations he uses, even though his system breaks down in the very examples he cites. I don't think he knows what he's talking about.

Comment: Re:Sure... (Score 5, Insightful) 299

by Canjo (#38669304) Attached to: Should Science Rethink the Definition of "Life"?
Exactly this--there might well be other forms of life, but we only really know how to look for life like our own. You may say that it's dumb for NASA to look for carbon-based life, or for SETI to look for life that uses radio wavelengths like us, but if you do so you're misunderstanding their logic. If there is enough life out there, some subset of it will be carbon based, some subset will use radio communication, and some subset will be interested in communication. That subset is the ONLY subset that we have the tools to look for. There may be non-carbon-based life, sure, but since we've never seen it we don't know exactly what its properties are or how to detect it. We may be able to theorize, but those are only theories; whereas we KNOW how life works here. It's not that researchers have a narrow definition of life, it's that we have limited resources and can only hope to detect the subset of life that is like life here on Earth.

Comment: Re:Get a clue Big Sis (Score 1) 256

by Canjo (#38473576) Attached to: Vanity Fair On the TSA and Security Theater
They may have prevented all terrorist attacks (the article only mentions one attempt), but you also have to consider their false positive rate. It says 2-5% of passengers receive secondary screening, which can involve hours of interrogation and privacy violations. This seems unacceptable, especially if your chances of getting the secondary screening (and probably missing your flight) are much higher than 5% when you're not white!

Comment: Re:Taught? (Score 1) 176

by Canjo (#37900762) Attached to: Why Fingernails On a Chalkboard Sound Painful
Yes, in fact this is part of what the research was testing! Here is an official summary of the actual paper: http://www.acoustics.org/press/162nd/Oehler_4pPP6.html They either told subjects that this is the sound of a chalkboard, or said it was from a "contemporary composition." Subjects found it more painful when they had been told it was a chalkboard, and they found it less painful when they thought it was music. So it seems that the idea of chalk on a chalkboard is part of what makes it so horrible, and that's probably just learned.
