Languages are shaped by cognitive cost.
What are you talking about? Languages are shaped by a lot of things...social conventions, acquisition/induction in the face of noisy data, possible predispositions/biases towards particular analyses of novel data...but not cognitive cost. Unless you're using those words to mean something non-obvious.
This is what Steven Pinker seems not to get. There _is_ an innate language instinct, it's just not what he thinks it is. What we all share is the ability to introspect the cognitive cost of figuring out "WTH is this dude trying to convey?"
I'm no Pinker apologist (Jackendoff is better, for my money), but I'm pretty sure that there's not much that Pinker "doesn't get" about language...other than in the obvious sense that we're all on this voyage of knowledge and there are tonnes of things that we collectively don't know about language. The view of the "language instinct" espoused by Pinker has undergone a lot of revision, including by him (maybe try reading something post-1994. I recommend Words and Rules.) Also, the things that we're able to introspect about our language production ("how do I say X?") or comprehension ("what does Y mean when that person says it?") are a relatively small corner of the cognitive edifice that undergirds our linguistic knowledge. Moreover, it's rare that we have to explicitly reason through to an interpretation...most of the time there's no introspection involved at all.
One of the key insights on language is that Lempel-Ziv compression never transmits the compression dictionary.
Really? That's funny, because not a single one of the textbooks I've opened in 9 years of studying linguistics has mentioned gzip as representing one of the key insights of language.
The dictionary is implied because the compression program and the decompression program share the same dictionary construction heuristic. This is a trick you can pull off only if the two sides of the channel share the same cognitive architecture. There is no shortage of examples out there of how fast communication breaks down when the parties begin with fundamentally different premises on how to structure the categories of thought.
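To make the "dictionary is implied" claim concrete, here is a minimal LZW sketch (my own illustrative code, not from the original post): encoder and decoder each grow an identical dictionary by running the same heuristic over the data, so only integer codes ever cross the channel.

```python
def lzw_encode(text):
    # Both sides start from the same seed dictionary: all single bytes.
    table = {chr(i): i for i in range(256)}
    w, out = "", []
    for c in text:
        wc = w + c
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = len(table)  # extend the dictionary locally; never transmitted
            w = c
    if w:
        out.append(table[w])
    return out

def lzw_decode(codes):
    # The decoder rebuilds the identical dictionary from the codes alone.
    table = {i: chr(i) for i in range(256)}
    w = table[codes[0]]
    out = [w]
    for k in codes[1:]:
        if k in table:
            entry = table[k]
        else:
            # Special case: the code refers to an entry the encoder just
            # created, which must be w plus its own first character.
            entry = w + w[0]
        out.append(entry)
        table[len(table)] = w + entry[0]
        w = entry
    return "".join(out)
```

Note that `lzw_decode` receives nothing but the code stream; the dictionary is reconstructed purely because both functions apply the same construction rule, which is the analogy being drawn to shared cognitive architecture.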
You don't need to have different cognitive category-structures for communication to break down. Moreover, there aren't any concepts that aren't expressible in some human language. Sure there may not be an English word that means zeitgeist (to trot out a hackneyed example), but that doesn't mean I can't use some longer construction to express the same meaning (look in your Deutsch-English dict for some hints).
Here's another fundamental question: what portion of the brain's cognitive activity is devoted to power management? For one thing, glucose is a precious resource, and the brain is a chug-a-lug organ when it comes to glucose consumption. For another, the brain is costly to cool. From the real-time perspective (which governed 5.999 million years of human evolution), there's not much use firing up the abstract-noun chocolate factory when you need a survival response in under 100ms.
I'm not clear what this has to do with anything else, so I'll mostly gloss over it. BUT, I'm pretty sure it doesn't cost THAT much to cool one's head, since a lot of our heat escapes that way anyhow (lots of blood vessels really close to the surface, hence the propensity for head injuries to bleed like the dickens).
[...]
You can't defer deciding what to record for very long. So this is an obligatory cognitive function when your brain is already heavily loaded. At high enough stress levels, the recording function does shut down. Assessing and responding to cognitive burden is a mission-critical survival function. This is a key foundation for language learning.
First language acquisition happens in the absence of explicit tutoring, and in the (apparent) absence of concrete effort on the part of the learner, so I'm not sure your claim about "assessing and responding to cognitive burden" being a "key foundation for language learning" can be supported. Also, language acquisition starts happening nearly from birth...by a year of age kids are able to segment words out of the speech stream and have built the psychological categories that correspond to the sounds of their native tongues, while "losing" those categories that correspond to non-native speech sounds (which infants can identify at birth).
So how well-equipped is a one-year-old to "assess and respond to cognitive burden"?
A child doesn't need a special gene to discover the linguistic consequences of garden path sentence structures.
Now you're just making shit up. I'll literally wire you money if you can find me a published work that makes the above claim. Firstly, those linguists who believe that there is some genetic endowment involved in language acquisition would never phrase it in terms of "a language gene", except maybe at a cocktail party in order to be quick enough to not bore their interlocutors to tears. Also, nearly every analysis of garden path sentences and the problems they cause has appealed to parts of our faculty of language (e.g. parsing/memory, prosodic structure) that are explicitly outside the realm of things that could have a genetic basis.
"Oh damn, my mind went the wrong direction, and I wasted cognitive effort". Thus a child can self-infer a constraint on viable grammatical form,[...]
The point of garden path sentences (hackneyed example #2: "The horse raced past the barn fell") is that they ARE grammatical, but prone to misinterpretation (of a rather peculiar sort). There's no need for the child to "infer" anything about grammatical form when she comes across a garden path.
[...] even if, in the manner of an LZW dictionary, the constraint is never explicitly conveyed from the language proficient to the language learner. The underlying assumption that makes this work in practice is that the architectural model of the child's brain resembles that of the rest of the population. This is 99% satisfied by being a member of the same species, without any weird genetic Pinkerisms.
So. The fact that learner and producer are of the same species virtually guarantees sufficient similarity of cognitive architectures for successful language acquisition. How is that not genetic predisposition to learning language?
As the language convention becomes more sophisticated, some parameters in the ambiguity resolution process become social constructs.
The vast bulk of our linguistic knowledge (e.g. the fact that the plural of the English word dog is pronounced dogZ but that cat is pluralized as catS, or that Who saw what? is fine, but What did who see? isn't) is NOT "conventionalized" in any meaningful sense of that word. Truly conventionalized aspects of language, i.e. normative aspects of language use, represent a tiny fraction of an individual's linguistic knowledge/capacity.
Given a conflict between two heuristics, which takes priority? The important thing to realize about socially determined linguistic parameters is that they tend to vary across discourse settings.
That last sentence seems tautologous to me.
[bits excised that I have no stake in...]
The "ATM Machine" linguistic construct is a linguistic violation only with respect to the homogeneity premise.
It's not a violation of anything, except some prescriptivist ideas about how people should say things. I might go so far as to say it flouts the Gricean Maxim of Quantity, but that's a pragmatic issue, not a grammatical one (with apologies to any pragmaticists in the audience).
[...]
Sometimes, however, the right thing to do is to let it go. The entire Chinese language made that decision when it decided that two-character nouns were not a form of redundancy, but a great aid in reducing cognitive burden for everyone involved.
Wait...WHAT? "The Chinese language" is a noun phrase with an empty referent. Do you mean Mandarin, Cantonese or any of the other ones? Also, "languages" aren't things that can make decisions, so "[t]he entire Chinese language" definitely did not decide to "let it go". More generally, your entire premise is ridiculous, in that the Chinese writing system is actually pretty hard to learn. Someone on the forum will surely correct this (many eyes, and all), but I think you need to have mastered something on the order of 10^4 ideograms in order to read a newspaper (nevermind a technical publication). So your whole "reducing cognitive burden" argument falls flat, without even taking into account the stuff I mentioned above about first language acquisition and the likelihood that infants/toddlers can assess and respond to cognitive burden.
I wrote this post because I actively struggle with this particular construct. In my own notes, I tend to write out the redundant noun, because in this form I read it faster, even though it makes me gag every time. I'm just not wired to shave 1% in comprehension speed for a 100% gain in elegance.
I'm pretty sure actively gagging every time you come across "PIN number" would dramatically slow your reading times. You should probably do some experiments to check that out. You might have motivation to ditch the stylistically offensive bits after all!
Sorry for the obsessively-detailed reply, I've decided to start replying to posts about language that seem considered/well thought out, but are nonetheless wrong-headed (much as the physicists and mathematicians on /. tend to do). I realize I should probably work on my tone.