Jaron Lanier on the Semi-Closed Internet 248
Will Wilkinson writes "Jaron Lanier's recent essay, The Gory Antigora: Illusions of Capitalism and Computers, kicks off a discussion of 'Internet Liberation: Alive or Dead?' at the Cato Institute's new blogazine, Cato Unbound. In Lanier's essay today, find out how the 'brittleness' of software has kept the Internet from realizing its potential as 'a cross between Adam Smith and Albert Einstein; the Invisible Hand accelerating toward the speed of light.' Also, find out why, upon meeting Richard Stallman, Lanier's reaction was: 'An open version of UNIX! Yuk!'"
Bunk. (Score:4, Interesting)
That peer is the very sentence you are writing, correct?
This entire essay is bunk; in every paragraph the author brings up a point that can quickly be refuted. He overgeneralizes issues and adds a big dollop of emotional appeal to make his points. And frankly, his points are just misguided, if not outright wrong.
Lanier is a self-promoting, no-talent technophobe (Score:4, Interesting)
I worked alongside him at Time Inc. New Media, back in the Pathfinder days. He kept on proposing one project after another that simply couldn't be done - the technology didn't exist. I called them "Flying Car Projects" - sounds good, but creating a Flying Car is tough once you start dealing with the logistics of fueling, licensing, training, and so on.
His biggest "idea" was called GigaJam, where we'd have millions of surfers hit virtual keys, somehow turning that into music, and streaming it back to them. In real time. That would be difficult to do today, and it was totally impossible back in 1996. Moreover, I'm not sure there would have been much point in devoting resources to something like that. To a user, it would have been fun for about 2 minutes.
Rumor was that he was boinking one of the head honchos at TINM, which is likely how he got the job. He was likely getting paid an assload of money to do nothing but bother people with his silly notions. After a year, he had contributed NOTHING. Not one of his projects was ever adopted in any fashion. And I heard that he had difficulty using a Macintosh to do things like, say, copy files.
So here's a guy that has fed (and rather overfed) himself on being a technology pundit, who doesn't understand the first thing about technology. Plus he's fat and smelly. So take his opinions with a huge chunk of salt.
All of the above is opinion, rumor, and innuendo.
Welcome to the conservative think-tank. (Score:2, Interesting)
This is the same loose affiliation that will scream about the "liberal media" until their noise drowns out any other signal. And the Cato Institute is the "section" that sets up the "information" that is going to be cited.
My big concern here is that this is the beginning of the hard-core lockdown of the internet. Their typical tactic is to chip away until nothing is left. Think imperial presidency, with their "Us" in control. It almost sounds like fascism; remember, Hitler was elected too.
Go ahead, mark this as a troll, but the Libertarians should be just as scared as the Liberals.
Re:Bunk. (Score:3, Interesting)
The Internet and the Web would be fabulous Antigoras if they were privately owned.
Here he proves he knows nothing about the internet, or at least the internet in the US. The net is almost completely (if not completely) privately owned in the US.
Some good points, but mostly FUD (Score:4, Interesting)
Define huge. Hundreds of hours? Double-digit percentage of all time spent using the computer? He doesn't say, and I doubt it's close to either metric for all but the most inept of users. For the average person *any* amount of time spent doing *any* one of these tasks is, in their opinion, too much. Time spent doing basic maintenance is one of the most overstated stats thrown around.
The biggest point he comes close to touching but then completely misses is with the language analogy. The informational content of language is almost entirely context sensitive. For example, I can make the statement "I'm blue", and without context, you don't know if I'm referring to the color of the clothes I'm wearing, my emotional disposition, my political affiliation, if I'm pretending to be a cartoon dog while playing with my kids, or any other referent to which the word "blue" might apply.
Language has the immediate context of the conversation in which it is occurring, and the ultimate context of the physical world. What he misses is that not only does computer software have to be precise, it has to supply its own virtual context; i.e. your web browser exists in the virtual context of the network, which connects it to an application which exists in a virtual context of a combination of, for example, a Java environment on top of a database on top of an operating system. All the underlying layers provide a context for the next layer above in which to exist and interact. And we had to create every single layer from scratch!
Lanier then makes the usual egalitarian conceit with the statement "Only culture is rich enough to fund the Antigora." The Internet is its own culture, which both incorporates and yet transcends multiple, different national, tribal, and social cultures. Lanier and all the other Internet pundits need to recognize that, get the hell over it, and move on.
Against files (Score:3, Interesting)
If you needed a "text file" in the Tandem world, it was treated as a big object (a BLOB) in the database, and handled as a unit. This seems weird, but it allowed program development on Tandem machines. Storing a file was, of course, an atomic operation; you never had a truncated file.
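You can approximate that atomic-store property on an ordinary POSIX file system with the old write-then-rename trick. A rough Python sketch (the helper name is mine, not Tandem's):

```python
import os
import tempfile

def atomic_write(path, data):
    """Write data to path so readers never see a truncated file.

    Write to a temp file in the same directory, flush it to disk,
    then rename over the target. rename() is atomic on POSIX, so
    the file is always either the old version or the new one,
    never a partial write.
    """
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())   # make sure the bytes hit the disk
        os.rename(tmp, path)       # atomic replace on POSIX
    except BaseException:
        os.unlink(tmp)
        raise
```

Note this only gets you the "never truncated" half of what Tandem gave you; you still don't get cross-file transactions.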
Apple's "resource fork" was a step in the right direction, but the implementation of updates in the classic MacOS was so unreliable that it was hopeless as a data-storage mechanism. Apple backed off from the resource fork when they went to a UNIX-type file system after the NeXT acquisition. Now it's making a comeback in a minor way.
Early visions of Microsoft's Longhorn seemed to be moving in that direction, but Microsoft couldn't bring it off.
UNIX/Linux has terrible file reliability semantics. Locking is an afterthought. File transactions aren't atomic. (Even lock file creation isn't atomic if the file system is on NFS.) Nobody understands two-phase commit, the technique that keeps your bank account from being debited twice if the ATM loses power during a transaction. There have been attempts to fix these problems (see UCLA Locus), but they never caught on.
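For what it's worth, here's the lock-file idiom being complained about, sketched in Python. On a local UNIX file system, open(2) with O_CREAT|O_EXCL is atomic, so exactly one process wins; the point above is that this guarantee historically did not hold over NFS:

```python
import errno
import os

def try_lock(lockpath):
    """Try to take an advisory lock by creating a lock file.

    O_CREAT|O_EXCL makes creation atomic on a local UNIX file
    system: if the file already exists, open fails with EEXIST.
    Returns True if we acquired the lock, False if someone else
    holds it.
    """
    try:
        fd = os.open(lockpath, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except OSError as e:
        if e.errno == errno.EEXIST:
            return False   # another process holds the lock
        raise
    os.write(fd, str(os.getpid()).encode())  # record the owner, by convention
    os.close(fd)
    return True

def unlock(lockpath):
    os.unlink(lockpath)
```

Over NFS the EEXIST check and the create could race between clients, which is exactly why this idiom earned its bad reputation there.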
The most likely company to fix this problem is Google. Google's own machines are full of databases of text, not "files". In a sense, we're all using a system that's not file-based - we just don't see it.
Re:Jaron's Title (Score:3, Interesting)
He makes an interesting point about the idea of files and how entrenched that idea is. I would take this further and say that the whole idea of the desktop metaphor is very entrenched, although I don't think I'd go so far as to say that we'll necessarily still be using it in a thousand years.
An exaggeration, to be sure, but it's still a decent point.
As he gets further and further along, his conclusions become more and more farfetched, and I don't agree with his dislike of UNIX, either (frankly I find the command line to be somewhat empowering, although I dislike the knowledge-cult attitude that it sometimes seems to generate).
I think he makes a relatively salient point as well about content protection:
To my eye, Apple's iPod and iTMS system is one that leaks "a little." There's some content protection (keeping you from moving the files from the iPod to a friend's computer), but it's nothing that can't be bypassed by a bright 12-year-old. As such, it succeeds in being commercially viable -- as totally open systems aren't -- without disempowering users to the extent that competitors' systems do. They could have been a lot more thorough with the copy protection; they were not, and it shows. Instead they did what they needed to do to get it out the door. I'm sure there are other examples of this around, but having just restored a music collection from an iPod yesterday, this was foremost in my mind.
At any rate, I think it would be wrong to dismiss Lanier out of hand. I don't know his history or reputation -- apparently it's not great -- and I'm not sure how I feel about his politics regarding ultra-privatization of everything. But there are some things worth discussing in there, among the pretentious academic language. We'd be doing ourselves a bit of a disservice if we didn't bother to bring them up at all.
Hierarchies and relations and random walks... (Score:5, Interesting)
And databases came first. Back in the '60s, and even well into the '70s, a "file" was seen as a column in a table, or a table, in a database. As databases became more powerful, data stores tried to follow... you had RMS on DEC operating systems, "typed" data sets and files, systems like Pick, Apple's "resource forks", and Be's BeFS. No matter how the data's stored, eventually anything more than a shopping list (oh, yes, there are very complex shopping lists: address books, customer databases, and lots and lots of indexes into collections of flat files, like Harvest and Spotlight and Google) becomes a flat block of text with embedded links to other blocks of text.
Whether those links are "see chapter 10" or "#include stdio.h" or "import io"... those links are not links to databases, they're links to files.
---
The idea that an unstructured block of data was the default was a breakthrough. The idea that a command line interface could be relatively terse and simple so that mere humans could learn to use it, that was a breakthrough too. UNIX cut through an enormous amount of user interface trash and laid bare what was, for the end of the '60s, at least as dramatic an improvement in UI design as GUIs were for the '70s. It's a linguistic interface, not a gestural one, but the first linguistic interface that provided concurrency (through the & background scheme, then through shell layers and job control) and the complete OPPOSITE of the normal "user submits a command, user waits for a response" that every other system in the world used.
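The "&" background scheme described above is easy to mimic in other languages. A small Python sketch of the same concurrency pattern, using `sleep 1` as a stand-in for any long-running job:

```python
import subprocess

# The shell's '&' idea: start a long-running command without
# waiting for it, get the prompt back immediately, and reap the
# job later, instead of the old "submit a command, wait for the
# response" cycle every other system used.
job = subprocess.Popen(["sleep", "1"])   # launched "in the background"
print("prompt is back immediately; job pid =", job.pid)
# ... the user issues other commands here, while the job runs ...
job.wait()                               # like bringing it back with job control
print("background job finished with status", job.returncode)
```

The shell got this for free from fork(2); the point is that the *interface* exposed it to ordinary users with a single character.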
I implemented a UNIX shell with explicit backgrounding on RSX-11 and showed it to my boss, and he was astonished. Even though RSX lets you hit return and get a new prompt at any time, so you could already "interrupt" a program and do something else, he'd never used that other than to treat the MCR prompt as an "I'm still busy" message. But being able to take something that was going to take a long time and throw it off into the background, under his control, was great.
Given the hardware limitations of the time, I submit that the UNIX shell and the UNIX plain-text-file pipes-and-filters job-control environment is close to the very best user interface that could be developed. It's the "tabbed browser" of the command line world. Alas, X-Windows came along and people stopped really using and understanding the shell, and X11's high-latency message based interface became the standard for the UNIX world.
It's really X11, a non-UNIX-like window system developed for UNIX and VMS at MIT in the '80s, that Lanier should be complaining about. Because UNIX itself doesn't suffer from the flaws that he's attributing to it. UNIX is small, tight, fast, responsive, and concurrent, a UNIX shell is a team of willing slaves that does WHAT you want WHEN you want it, and you NEVER have to wait for them unless you choose to.
---
File systems with UNIX semantics, by the way, work well. That's exactly the problem with NFS: NFS is *not* a UNIX file system, and its semantics make it a huge nightmare for applications developed on REAL UNIX file systems. It was a hack job designed to make a reasonably fast and efficient network file system implementable in the kernel of a 68000-based Sun workstation in the '80s. It should have been turfed long since. And again: IT'S NOT UNIX, IT'S NOT UNIX'S FAULT.
---
For structured data, databases are great. Using a file system for database operations was a result of UNIX coming from an era before there was a really common way to talk about relational operations linguistically. Bad as SQL is, at least it gives us a framework to deal with the problem. But for hierarchical randomly interrelated data the filesystem model works well.
Google is an index; it's not the data itself. The data that gives Google its value is in a file system.
OT: Slashdot doesn't care about black people... (Score:3, Interesting)
That's just...sad. That being said, I'm certain that I will now be modded down into oblivion. Goodbye cruel world. I hardly knew ye.