


Comment Caveat in re: power laws in empirical data (Score 5, Interesting) 181

Cosma Shalizi rants a lot about scientists' (often physicists') claims of having found a power-law description of some empirical phenomenon (upshot: finding a straight line on a log-log plot isn't enough). See his blog, or Clauset, Shalizi & Newman's paper "Power-law distributions in empirical data", for the full argument.
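Shalizi's point is easy to demonstrate yourself. Here's a rough illustrative sketch (my own toy numbers, not anything from his papers): draw samples from a lognormal, which is definitely not a power law, fit a straight line to the upper tail of the empirical CCDF on log-log axes, and watch the fit come out looking convincing anyway.

```python
import numpy as np

rng = np.random.default_rng(42)

# Heavy-tailed lognormal samples: emphatically NOT a power law.
x = np.sort(rng.lognormal(mean=0.0, sigma=2.5, size=10_000))

# Empirical CCDF, P(X >= x_i), at each sorted sample.
ccdf = 1.0 - np.arange(len(x)) / len(x)

# Power-law hunting as usually practised: keep the upper tail and
# fit a straight line in log-log space.
tail = x >= np.median(x)
lx, ly = np.log10(x[tail]), np.log10(ccdf[tail])
slope, intercept = np.polyfit(lx, ly, 1)

# Goodness of fit of the straight line.
resid = ly - (slope * lx + intercept)
r2 = 1.0 - resid.var() / ly.var()
print(f"slope = {slope:.2f}, R^2 = {r2:.3f}")
```

The fit's R-squared comes out high despite the data being lognormal by construction, which is exactly why "it's straight on a log-log plot" proves nothing.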


Comment Re:This is what linguists have been waiting for (Score 1) 197

However, it might be the case that this "syntax" has developed in parallel to human syntax from some common protolanguage

What is interesting here is not the structure of the language, but the fact of it.

Granted, I---probably erroneously---took the statement to which I responded to be a claim that studying the monkey's language would be informative vis-à-vis human language. The "proto" prefix is typically used to mean "ancestral to contemporary human language".

Humans are possessed of a wide range of incredibly powerful, flexible and general linguistic mechanisms. Non-human animals are frequently held to be entirely non-linguistic.

I think (hope!) that this position is becoming outmoded among the newer generations of linguists & cognitive scientists. The evidence for abilities that map fairly straightforwardly onto human linguistic abilities is pretty much overwhelming at this point. (The final chapter of Bridget Samuels' dissertation talks about this a fair bit, mostly in relation to phonology.)

This is implausible on the most basic evolutionary grounds: evolution is an elaborative process, and to have such remarkable abilities amongst humans strongly suggests a lot of linguistic or proto-linguistic capability in our ancestral line, and probably in other animals too. Otherwise, it would be like humans having the ability to run fifty miles in one go in a world where no other animal has legs.

Well, maybe yes, maybe no. There's a big push now toward viewing Language as a cultural artifact, whose properties are emergent products of cultural evolution (cf. anything going on at the LEC in Edinburgh, or Mort Christiansen's work). This viewpoint, to which I'm generally sympathetic, always leads me to thinking about cooking and recipes. Cooking is, to the best of my knowledge, a purely human endeavour; one that has presumably been considerably refined via cultural evolution since the day someone accidentally dropped her hunk of meat in the fire. And yet, no one would be tempted to say that the seeds of cooking/recipes/soufflé can be found in the behaviours of some animals (or maybe they would...I'm no ethologist).

I had a point, but it seems to be gone now...probably that appeals either to innateness or evolution alone are by necessity oversimplifications. The kind of empirical work being discussed here is what will move this domain of knowledge forward.

While the sexual selection forces that drove the evolution of human intelligence are powerful and able to produce relatively rapid elaboration of new capabilities, those capabilities have to be elaborations of something that already existed, and so we should naively expect this kind of discovery.

I don't think I'm understanding what you've said here. Surely not everything is built on something that came before? Mutation and exaptation have clear---in fact vitally important for the former---roles in evolutionary processes.

Unfortunately, because linguists seem for some reason to think that human language is the only possible model for language (see the other comments from linguists in this thread, for example) it can be difficult to recognize the linguistic (or possibly linguist-ish) capabilities of non-human species that do not conform well to that model.

Given that our only unambiguous model for Language is human language, it should be unsurprising that that's what we take as our primary model. Nonetheless, see my earlier reference to Samuels 2009 for a clear indication that this trend is changing.

I now have the PNAS paper in hand (well, on-screen)...I may come back and say more...

Comment Re:This is what linguists have been waiting for (Score 5, Insightful) 197

Those scientists who have been studying animal language as a non-pseudoscience have been waiting for anyone to show SYNTAX in animal language. You can have 1 trillion different words in a language, and it has a finite range of expressions... meanwhile you can have 10 different words that, with the right syntax, can generate an infinite range of expressions.

While this is true, it's not clear to me that what's documented here is, in fact, syntax. The researcher in question (Zuberbühler) has written about this stuff before and has been much more cautious in attributing full-on linguistic properties (a search of LanguageLog will turn something up from 2006).

I'll reserve absolute judgment for when I get a chance to look at the actual paper, but this quote from NYT gives me pause: Two booms can be combined with a series of "krak-oos," with a meaning entirely different to that of either of its components. This is not (typically) how human language works...meaning is compositionally built up from bits of syntax, whereas what's described here looks more like idiom. In fact, it looks more like phonology (*maybe* morphology) to me...meaningless bits that can be put together to make meaningful bits.

What they need to do now is get a linguist in there to slice & dice the recordings, play them back to the monkeys in various reconstructed forms, and see how they react.


[...] a chance to really look at a real proto-syntax, because all human languages have a very strongly developed syntax

some would argue against the subordinate clause here (pointing at Piraha, for example), but I'm not one of those. However, it might be the case that this "syntax" has developed in parallel to human syntax from some common protolanguage (since these are monkeys and not even apes, we're talking REALLY far back), and so this may be relatively uninformative with respect to human syntax.

Comment Re:Actually, you're a good example of that. (Score 1) 1255

You'll see more misanthropy, misogyny, misandry, every flavor of "ism" etc etc in pretty much any community.

Yeah, but we're not talking about "hatred/dislike on the basis of intrinsic characteristics" here (viz. your "mis-X" examples)...you can dislike someone and still accept that they're highly competent (cf. many people's opinion of Theo de Raadt). The current brouhaha is about differential (i.e. worse) treatment on the basis of gender. I'm willing to just come out and say that there's none of that in the FOSS world as regards its overwhelmingly male membership..."obviously you can't code, you're just a dude"...

[...] but a social group based around "show us the code" [...]

One of the major points of this discussion is that what I just quoted is a fictive characterization of the FOSS community. Obviously there are places where the situation is better (e.g. the kernel mailing list), and worse (e.g. several recent keynote talks). The thing is, people are people, and sociological things will typically get in the way of impartiality. In this case, we're talking about when that manifests as discrimination against women. Which it does. But shouldn't.

where people can choose everything about how they present themselves[...]

Not sure about the relevance of this...

Comment Montrealers, beware! (Score 1) 289

This would be awful for people who live in Montreal...the axis that determines streets' "North/South" designation is pretty nearly NW-SE, and most people who've lived in Montreal for a while point NW when you ask them to show you N. In winter the sun rises & sets in really weird places (or rather, it doesn't, but a lot of people think it does *if* they bother to stop and think about it).

Comment Re:Dell has dropped most Linux models (Score 1) 324

Huh ?


Leftmost item is a Dell Mini 10v (as you pointed out), with Ubuntu pre-installed, and a 160GB hard drive. In fact, the page you linked to seems to have the same item (3rd from left), albeit at an inexplicably higher price (and in USD).

Who gives a shit about the "instant discount" for the Windows version?

Comment Re:Pedant Warning! (Score 2, Informative) 394

Languages are shaped by cognitive cost.

What are you talking about? Languages are shaped by a lot of things...social conventions, acquisition/induction in the face of noisy data, possible predispositions/biases towards particular analyses of novel data...but not cognitive cost. Unless you're using those words to mean something non-obvious.

This is what Steven Pinker seems not to get. There _is_ an innate language instinct, it's just not what he thinks it is. What we all share is the ability to introspect the cognitive cost of figuring out "WTH is this dude trying to convey?"

I'm no Pinker apologist (Jackendoff is better, for my money), but I'm pretty sure that there's not much that Pinker "doesn't get" about language...other than in the obvious sense that we're all on this voyage of knowledge and there are tonnes of things that we collectively don't know about language. The view of the "language instinct" espoused by Pinker has undergone a lot of revision, including by him (maybe try reading something post-1994; I recommend Words and Rules). Also, the things that we're able to introspect about our language production ("how do I say X?") or comprehension ("what does Y mean when that person says it?") are a relatively small corner of the cognitive edifice that undergirds our linguistic knowledge. Moreover, it's rare that we have to explicitly reason through to an interpretation...most of the time there's no introspection involved at all.

One of the key insights on language is that Lempel-Ziv compression never transmits the compression dictionary.

Really? That's funny, because not a single one of the textbooks I've opened in 9 years of studying linguistics has mentioned gzip as representing one of the key insights of language.

The dictionary is implied because the compression program and the decompression program share the same dictionary construction heuristic. This is a trick you can pull off only if the two sides of the channel share the same cognitive architecture. There are no shortage of examples out there of how fast communication breaks down when the parties begin with fundamentally different premises on how to structure the categories of thought.
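(The parent's LZW mechanics are accurate, for what it's worth, and easy to see in a toy from-scratch implementation, which is my own sketch and not any particular library's API: compressor and decompressor each grow an identical dictionary from the same starting alphabet, by the same rule, so no dictionary ever crosses the channel.)

```python
def lzw_compress(text: str) -> list[int]:
    """LZW encode: the dictionary is grown as we go, never transmitted."""
    dictionary = {chr(i): i for i in range(256)}  # shared starting alphabet
    next_code, w, out = 256, "", []
    for c in text:
        wc = w + c
        if wc in dictionary:
            w = wc
        else:
            out.append(dictionary[w])
            dictionary[wc] = next_code  # encoder-side dictionary growth
            next_code += 1
            w = c
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes: list[int]) -> str:
    """LZW decode: rebuilds the SAME dictionary by the same rule."""
    dictionary = {i: chr(i) for i in range(256)}
    next_code = 256
    w = dictionary[codes[0]]
    out = [w]
    for k in codes[1:]:
        if k in dictionary:
            entry = dictionary[k]
        elif k == next_code:  # the one self-referential corner case
            entry = w + w[0]
        else:
            raise ValueError(f"bad code: {k}")
        out.append(entry)
        dictionary[next_code] = w + entry[0]  # decoder-side growth, same rule
        next_code += 1
        w = entry
    return "".join(out)

codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
print(codes)  # codes >= 256 name substrings that were never sent
print(lzw_decompress(codes))
```

The codes above 255 refer to dictionary entries the decompressor has never been told about; it recovers them only because it runs the identical dictionary-building heuristic, which is the parent's "shared architecture" point in miniature.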

You don't need to have different cognitive category-structures for communication to break down. Moreover, there aren't any concepts that aren't expressible in some human language. Sure there may not be an English word that means zeitgeist (to trot out a hackneyed example), but that doesn't mean I can't use some longer construction to express the same meaning (look in your Deutsch-English dict for some hints).

Here's another fundamental question: what portion of the brain's cognitive activity is devoted to power management? For one thing, glucose is a precious resource, and the brain is a chug-a-lug organ when it comes to glucose consumption. For another, the brain is costly to cool. From the real-time perspective (which governed 5.999 million years of human evolution), there's not much use firing up the abstract-noun chocolate factory when you need a survival response in under 100ms.

I'm not clear what this has to do with anything else, so I'll mostly gloss over it. BUT, I'm pretty sure it doesn't cost THAT much to cool one's head, since a lot of our heat escapes that way anyhow (lots of blood vessels really close to the surface, hence the propensity for head injuries to bleed like the dickens).


You can't defer deciding what to record for very long. So this is an obligatory cognitive function when your brain is already heavily loaded. At high enough stress levels, the recording function does shut down. Assessing and responding to cognitive burden is a mission-critical survival function. This is a key foundation for language learning.

First language acquisition happens in the absence of explicit tutoring, and in the (apparent) absence of concrete effort on the part of the learner, so I'm not sure your claim about "assessing and responding to cognitive burden" being a "key foundation for language learning" can be supported. Also, language acquisition starts happening nearly from birth...by a year of age kids are able to segment words out of the speech stream and have built the psychological categories that correspond to the sounds of their native tongues, while "losing" those categories that correspond to non-native speech sounds (which infants can identify at birth).

So how well-equipped is a one-year-old to "assess and respond to cognitive burden"?

A child doesn't need a special gene to discover the linguistic consequences of garden path sentence structures.

Now you're just making shit up. I'll literally wire you money if you can find me a published work that makes the above claim. Firstly, those linguists who believe that there is some genetic endowment involved in language acquisition would never phrase it in terms of "a language gene", except maybe at a cocktail party in order to be quick enough to not bore their interlocutors to tears. Also, nearly every analysis of garden path sentences and the problems they cause has appealed to parts of our faculty of language (e.g. parsing/memory, prosodic structure) that are explicitly outside the realm of things that could have a genetic basis.

"Oh damn, my mind went the wrong direction, and I wasted cognitive effort". Thus a child can self-infer a constraint on viable grammatical form,[...]

The point of garden path sentences (hackneyed example #2: "The horse raced past the barn fell") is that they ARE grammatical, but prone to misinterpretation (of a rather peculiar sort). There's no need for the child to "infer" anything about grammatical form when she comes across a garden path.

[...] even if, in the manner of an LZW dictionary, the constraint is never explicitly conveyed from the language proficient to the language learner. The underlying assumption that makes this work in practice is that the architectural model of the child's brain resembles that of the rest of the population. This is 99% satisfied by being a member of the same species, without any weird genetic Pinkerisms.

So. The fact that learner and producer are of the same species virtually guarantees sufficient similarity of cognitive architectures for successful language acquisition. How is that not genetic predisposition to learning language?

As the language convention becomes more sophisticated, some parameters in the ambiguity resolution process become social constructs.

The vast bulk of our linguistic knowledge (e.g. the fact that the plural of the English word dog is pronounced dogZ but that cat is pluralized as catS, or that Who saw what? is fine but What did who see? isn't) is NOT "conventionalized" in any meaningful sense of that word. Truly conventionalized aspects of language, i.e. normative aspects of language use, represent a tiny fraction of an individual's linguistic knowledge/capacity.
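To make the dogZ/catS point concrete: the plural allomorph is fully predictable from the stem's final sound, along the lines of this toy rule (a gross simplification of my own, using a hand-picked inventory of phone labels rather than real phonological features, and ignoring irregulars like oxen):

```python
# Rough labels for a stem's FINAL sound (a toy inventory; real phonology
# operates over articulatory features, not letters).
SIBILANTS = {"s", "z", "sh", "ch", "zh", "j"}
VOICELESS = {"p", "t", "k", "f", "th"}  # non-sibilant voiceless sounds

def plural_suffix(final_sound: str) -> str:
    """Pick the English plural allomorph from the stem-final sound."""
    if final_sound in SIBILANTS:
        return "iz"  # bus -> busIZ, dish -> dishIZ
    if final_sound in VOICELESS:
        return "s"   # cat -> catS
    return "z"       # dog -> dogZ (any voiced sound, vowels included)

for stem, final in [("dog", "g"), ("cat", "t"), ("bus", "s")]:
    print(stem + plural_suffix(final))
```

No English speaker is taught this rule, and almost none could state it, yet everyone applies it without fail, even to novel words...which is precisely why it isn't "convention" in any interesting sense.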

Given a conflict between two heuristics, which takes priority? The important thing to realize about socially determined linguistic parameters is that they tend to vary across discourse settings.

That last sentence seems tautologous to me.

[bits excised that I have no stake in...]

The "ATM Machine" linguistic construct is a linguistic violation only with respect to the homogeneity premise.

It's not a violation of anything, except some prescriptivist ideas about how people should say things. I might go so far as to say it flouts the Gricean Maxim of Quantity, but that's a pragmatic issue, not a grammatical one (with apologies to any pragmaticists in the audience).

Sometimes, however, the right thing to do is to let it go. The entire Chinese language made that decision when it decided that two-character nouns were not a form of redundancy, but a great aid in reducing cognitive burden for everyone involved.

Wait...WHAT? "The Chinese language" is a noun phrase with an empty referent. Do you mean Mandarin, Cantonese, or any of the other ones? Also, "languages" aren't things that can make decisions, so "[t]he entire Chinese language" definitely did not decide to "let it go". More generally, your entire premise is ridiculous, in that the Chinese writing system is actually pretty hard to learn. Someone on the forum will surely correct this (many eyes, and all), but I think you need to have mastered something on the order of 10^4 ideograms in order to read a newspaper (never mind a technical publication). So your whole "reducing cognitive burden" argument falls flat, without even taking into account the stuff I mentioned above about first language acquisition and the likelihood that infants/toddlers can assess and respond to cognitive burden.

I wrote this post because I actively struggle with this particular construct. In my own notes, I tend to write out the redundant noun, because in this form I read it faster, even though it makes me gag every time. I'm just not wired to shave 1% in comprehension speed for a 100% gain in elegance.

I'm pretty sure actively gagging every time you come across "PIN number" would dramatically slow your reading times. You should probably do some experiments to check that out. You might have motivation to ditch the stylistically offensive bits after all!

Sorry for the obsessively-detailed reply, I've decided to start replying to posts about language that seem considered/well thought out, but are nonetheless wrong-headed (much as the physicists and mathematicians on /. tend to do). I realize I should probably work on my tone.

Comment The MC is behind the times (Score 1) 398

The point Hawking's making is interesting, and potentially relevant, but it's hardly a novel claim. What he's talking about is often called "cultural evolution", and people have been talking about it for a while now, starting(?) with Cavalli-Sforza et al. back in the 80s. It's regained momentum with the recent (~ last decade) resurgence of interest in the evolution of language (cf. papers by Simon Kirby, Henry Brighton & friends in the 2006 PNAS).

Also, it seems doubtful to me that we're literally no longer evolving in the "traditional" sense of the word. Sure, we're doing things to artificially prolong life and enhance reproductive success, but that doesn't really change the fact that natural and sexual selection are still at work.

Disclaimer: I am NOT a biologist. I AM a linguist.

Comment Re:Not new (Score 1) 300

Either video gaming is harmful for everyone, or it's not harmful for anyone. I mean, you could come up with reasons why it's worse for kids, but lots of them have been explored and found to have been pure bullshit.

OK, so either I wasn't clear, or you're being a bit disingenuous. I'm talking about content, rather than the mere act of gaming itself. For all of its faults, the gaming ratings system (much like the movie rating system), says that particular kinds of content are appropriate for particular audience demographics, not that games (or movies) are good for some people and bad for others. I mean, really now.

If the issue is the kind of game you're playing, then play a different kind of game that you can play with your kid and stop skulking around like a criminal.

Well, there's no doubt that playing family games is a good thing to do. It's a great way to get parent-child time in for those who love gaming, as well as being responsible parenting in re: supervising the content of your kids' games. That being said, having a child doesn't have to mean that I can't play any non-child-appropriate games (or see any non-child-appropriate movies) until my kid is a teenager. When the kid's in bed, I can play my AO games. It's not really that hard.

Comment What can *I* do? (Score 2, Insightful) 135

I live in Ottawa and want to do something more than write a letter that I know will be ignored to a local MP who I know is not in line with my position anyway. While I'm interested in law & policy as it applies to this domain, it's definitely not in my sphere of knowledge.

Do /.ers have any suggestions about what I can do to fight this, or good ways to raise awareness?
