The Internet Editorial

Jaron Lanier on the Semi-Closed Internet 248

Will Wilkinson writes "Jaron Lanier's recent essay, The Gory Antigora: Illusions of Capitalism and Computers, kicks off a discussion of 'Internet Liberation: Alive or Dead?' at the Cato Institute's new blogazine, Cato Unbound. In Lanier's essay today, find out how the 'brittleness' of software has kept the Internet from realizing its potential as 'a cross between Adam Smith and Albert Einstein; the Invisible Hand accelerating toward the speed of light.' Also, find out why, upon meeting Richard Stallman, Lanier's reaction was: 'An open version of UNIX! Yuk!'"
This discussion has been archived. No new comments can be posted.

Jaron Lanier on the Semi-Closed Internet

Comments Filter:
  • Bunk. (Score:4, Interesting)

    by RevDobbs ( 313888 ) * on Monday January 09, 2006 @01:50PM (#14428834) Homepage
    The unfortunate Internet has only one peer when it comes to obfuscation due to an inundation of excessive punditry,

    That peer is the very sentence you are writing, correct?

    This entire essay is bunk; in every paragraph the author brings up a point that can quickly be refuted. He overgeneralizes issues and adds a big dollop of emotional appeal to make his points. And frankly, his points are just misguided, if not outright wrong.

  • Nothing to see here (Score:3, Interesting)

    by Eli Gottlieb ( 917758 ) <[moc.liamg] [ta] [beilttogile]> on Monday January 09, 2006 @01:54PM (#14428874) Homepage Journal
    There isn't much in TFA except a nice point about how we should be able to "browse" video games in the way we browse through books or newspapers. Which does, in fact, make me wonder why stores don't allow you to rent a copy of a game, bring it back and decide whether or not to buy it. I've been doing that for years, but never with one store.
  • by Anonymous Coward on Monday January 09, 2006 @02:27PM (#14429194)
    Lanier's claim to fame is that he "invented" virtual reality. Or something like that. His real fame comes from being a huge fat guy with white boy dreads who has an unfounded reputation for being a "luminary". He cons people into paying him to write articles, speak, "conceptualize", and keeps the reputation going for more cons.

    I worked alongside him at Time Inc. New Media, back in the Pathfinder days. He kept on proposing one project after another that simply couldn't be done - the technology didn't exist. I called them "Flying Car Projects" - sounds good, but creating a Flying Car is tough once you start dealing with the logistics of fueling, licensing, training, etc.

    His biggest "idea" was called GigaJam, where we'd have millions of surfers hit virtual keys, somehow turning that into music, and streaming it back to them. In realtime. That'd be difficult to do today, but totally impossible back in 1996. Moreover, I'm not sure that there would have been much of a point to devote resources to something like that. To a user, that would have been fun for about 2 minutes.

    Rumor was that he was boinking one of the head honchos at TINM, which is likely how he got the job. He was likely getting paid an assload of money to do nothing but bother people with his silly notions. After a year, he had contributed NOTHING. Not one of his projects was ever adopted in any fashion. And I heard that he had difficulty using a Macintosh to do things like, say, copy files.

    So here's a guy that has fed (and rather overfed) himself on being a technology pundit, who doesn't understand the first thing about technology. Plus he's fat and smelly. So take his opinions with a huge chunk of salt.

    All of the above is opinion, rumor, and innuendo :)
  • by Anonymous Coward on Monday January 09, 2006 @02:30PM (#14429218)
    The Cato Institute is one of the more conservative think tanks. From what I understand, they will publish stuff like this so that the NeoCons in Congress can quote them as a reliable source.
    This is the same loose affiliation that will scream about "liberal media" until their noise drowns out any other signal. And the Cato Institute is the "section" that sets up the "information" that is going to be cited.

    My big concern here is that this is the beginning of the hard-core lockdown of the internet. Their typical tactic is to chip away until nothing is left. Think imperialist presidency, with their "Us" in control. It almost sounds like fascism; remember, Hitler was elected too.

    Go ahead, mark this as a troll, but the Libertarians should be just as scared as the Liberals.

  • Why? (Score:2, Interesting)

    by LukePieStalker ( 746993 ) on Monday January 09, 2006 @02:39PM (#14429311)
    Why is a right-wing propaganda machine like the Cato Institute being given a forum here? At least be honest and put it in the politics forum.
  • Re:Bunk. (Score:3, Interesting)

    by Ironsides ( 739422 ) on Monday January 09, 2006 @02:41PM (#14429322) Homepage Journal
    I'd like to add one more thing to help debunk this guy.

    The Internet and the Web would be fabulous Antigoras if they were privately owned.

    Here he proves he knows nothing about the internet, or at least the internet in the US. The net is almost completely (if not completely) privately owned in the US.
  • by prgrmr ( 568806 ) on Monday January 09, 2006 @02:44PM (#14429355) Journal
    Every computer user spends astonishingly huge and increasing amounts of time updating software patches, visiting help desks, and performing other frustratingly tedious, ubiquitous tasks

    Define huge. Hundreds of hours? Double-digit percentage of all time spent using the computer? He doesn't say, and I doubt it's close to either metric for all but the most inept of users. For the average person *any* amount of time spent doing *any* one of these tasks is, in their opinion, too much. Time spent doing basic maintenance is one of the most overstated stats thrown around.

    The biggest point he comes close to touching but then completely misses is with the language analogy. The informational content of language is almost entirely context-sensitive. For example, I can make the statement "I'm blue", and without context, you don't know if I'm referring to the color of the clothes I'm wearing, my emotional disposition, my political affiliation, whether I'm pretending to be a cartoon dog while playing with my kids, or any other reference for which the word "blue" might apply.

    Language has the immediate context of the conversation in which it is occurring, and the ultimate context of the physical world. What he misses is that not only does computer software have to be precise, it has to supply its own virtual context; i.e., your web browser exists in the virtual context of the network, which connects it to an application which exists in the virtual context of, for example, a Java environment on top of a database on top of an operating system. All the underlying layers provide a context in which the next layer above can exist and interact. And we had to create every single layer from scratch!

    Lanier then makes the usual egalitarian conceit with the statement "Only culture is rich enough to fund the Antigora." The Internet is its own culture, which both incorporates and yet transcends multiple, different national, tribal, and social cultures. Lanier and all the other Internet pundits need to recognize that, get the hell over it, and move on.
  • Against files (Score:3, Interesting)

    by Animats ( 122034 ) on Monday January 09, 2006 @03:03PM (#14429524) Homepage
    There's an argument against "files" as the basis of an operating system. The most successful movement in that direction was Tandem's operating system, Guardian. The bottom-level storage system in the classic Tandem world was a relational database, not "files". All database operations were atomic and recoverable, and the database was duplicated across multiple disks and computers. Tandem machines were all clusters, decades before other companies figured out how to do clustered systems. Business systems built on Tandem's hardware and software could be, and were, able to run for years, sometimes decades, without failure. Machines in the cluster could fail and be replaced without a shutdown. This worked for real transactions where updates mattered, not just for stateless operations like web page serving. Many banks and stock markets still run Tandem systems for that reliability.

    If you needed a "text file" in the Tandem world, it was treated as a big object (a BLOB) in the database, and handled as a unit. This seems weird, but it allowed program development on Tandem machines. Storing a file was, of course, an atomic operation; you never had a truncated file.
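
    To get a feel for what that buys you, here is a minimal sketch of the same idea in Python using the built-in sqlite3 module (this has nothing to do with Tandem's actual Guardian API; the table layout and helper names are made up for illustration). The "file" is just a BLOB row, and the store runs inside a transaction, so a reader never sees a half-written file:

        import sqlite3

        conn = sqlite3.connect("store.db")
        conn.execute("CREATE TABLE IF NOT EXISTS files (name TEXT PRIMARY KEY, body BLOB)")

        def store_file(name, body):
            # "with conn" wraps the statement in a transaction: it commits on
            # success and rolls back on error, so a concurrent reader sees
            # either the old "file" or the new one, never a torn write.
            with conn:
                conn.execute("INSERT OR REPLACE INTO files (name, body) VALUES (?, ?)",
                             (name, body))

        def load_file(name):
            row = conn.execute("SELECT body FROM files WHERE name = ?",
                               (name,)).fetchone()
            if row is None:
                raise FileNotFoundError(name)
            return row[0]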

    Apple's "resource fork" was a step in the right direction, but the implementation of updates in the classic MacOS was so unreliable that it was hopeless as a data-storage mechanism. Apple backed off from the resource fork when they went to a UNIX-type file system after the NeXT acquisition. Now it's making a comeback in a minor way.

    Early visions of Microsoft's Longhorn seemed to be moving in that direction, but Microsoft couldn't bring it off.

    UNIX/Linux has terrible file reliability semantics. Locking is an afterthought. File transactions aren't atomic. (Even lock file creation isn't atomic if the file system is on NFS.) Nobody understands two-phase commit, the technique that keeps your bank account from being debited twice if the ATM loses power during a transaction. There have been attempts to fix these problems (see UCLA Locus), but they never caught on.
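
    The usual application-level workaround for the non-atomic update problem is to write a temp file and rename it into place. A minimal Python sketch of that idiom (the function name is my own, and the atomicity only holds for a rename within one local POSIX filesystem, which is exactly where NFS falls down):

        import os
        import tempfile

        def atomic_write(path, data):
            # Write the new contents to a temp file in the same directory,
            # force it to disk, then rename it over the target. Readers see
            # either the old file or the new file, never a truncated one.
            dirname = os.path.dirname(os.path.abspath(path))
            fd, tmp = tempfile.mkstemp(dir=dirname)
            try:
                with os.fdopen(fd, "wb") as f:
                    f.write(data)
                    f.flush()
                    os.fsync(f.fileno())   # get the bytes onto stable storage first
                os.replace(tmp, path)      # atomic on a local POSIX filesystem (not NFS)
            except BaseException:
                os.unlink(tmp)
                raise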

    The most likely company to fix this problem is Google. Google's own machines are full of databases of text, not "files". In a sense, we're all using a system that's not file-based - we just don't see it.

  • Re:Jaron's Title (Score:3, Interesting)

    by Kadin2048 ( 468275 ) <slashdot.kadin@xox y . net> on Monday January 09, 2006 @03:15PM (#14429630) Homepage Journal
    There are a couple of cool things in the article. Not particularly interesting things, and I'm not sure whether they really hold any water, intellectually, upon any sort of lengthy consideration, but I think people are giving it a bit of a knee-jerk response here on /.

    He makes an interesting point about the idea of files and how entrenched that idea is. I would take this further and say that the whole idea of the desktop metaphor is very entrenched, although I don't think I'd go so far as to say that we'll necessarily still be using it in a thousand years.


    Prior to sometime in the mid-1980s, there were influential voices in computer science opposing the idea of the file because it would lead to file incompatibility. Ted Nelson, the inventor of the idea of linked digital content, and Jef Raskin, initiator of the Macintosh project at Apple, both held the view that there should be a giant field of elemental data without file boundaries. Since UNIX, Windows, and even the Macintosh--as it came out the door after a political struggle--incorporated files, the idea of the file has become digitally entrenched.

    We teach files to undergraduates as if we were teaching them about photons. Indeed, I can more readily imagine physicists asking us to abandon the photon in a hundred years than computer scientists abandoning the file in a thousand.


    An exaggeration, to be sure, but it's still a decent point.

    As he gets further and further along, his conclusions become more and more far-fetched, and I don't agree with his dislike of UNIX, either (frankly, I find the command line to be somewhat empowering, although I dislike the knowledge-cult attitude that it seems to generate sometimes).

    I think he makes a relatively salient point as well about content protection:

    The most attractive designs, from the point of view of either democratic ideals or the profit motive, would have intermediate qualities; they would leak, but only a little. A little leakage greatly reduces the economic motivation for piracy as well as the cost of promotion. A little leakage gives context to and therefore enhances the value of everything.


    To my eye, Apple's iPod and iTMS system is a system which leaks "a little." There's some content protection (keeping you from moving the files from the iPod to a friend's computer), but it's nothing that can't be bypassed by a bright 12-year-old. As such, it succeeds in being commercially viable -- as totally open systems aren't -- without disempowering users to the extent that competitors' systems do. They could have been a lot more thorough with the copy protection; they were not, and it shows. Instead they did what they needed to do to get it out the door. I'm sure there are other examples of this around, but having just restored a music collection from an iPod yesterday, this was foremost in my mind.

    At any rate, I think it would be wrong to dismiss Lanier out of hand. I don't know his history or reputation -- apparently it's not great -- and I'm not sure how I feel about his politics regarding ultra-privatization of everything. But there are some things worth discussing in there, among the pretentious academic language. We'd be doing ourselves a bit of a disservice if we didn't bother to bring them up at all.
  • The problem is that databases and other non-file data stores are more brittle than files. The more complexity there is in the metadata, the easier it is to lose information, and the more you're locked in to one specific form of metadata.

    And databases came first. Back in the '60s and even well into the '70s, a "file" was seen as a column in a table, or a table in a database. As databases became more powerful, data stores tried to follow... you had RMS on DEC operating systems, "typed" data sets and files, systems like Pick, Apple's "resource forks", and Be's BeFS. No matter how the data's stored, eventually anything more than a shopping list (oh, yes, there are very complex shopping lists: address books, customer databases, and lots and lots of indexes into collections of flat files like Harvest and Spotlight and Google) becomes a flat block of text with embedded links to other blocks of text.

    Whether those links are "see chapter 10" or "#include stdio.h" or "import io"... those links are not links to databases, they're links to files.

    ---

    The idea that an unstructured block of data was the default was a breakthrough. The idea that a command line interface could be relatively terse and simple so that mere humans could learn to use it, that was a breakthrough too. UNIX cut through an enormous amount of user interface trash and laid bare what was, for the end of the '60s, at least as dramatic an improvement in UI design as GUIs were for the '70s. It's a linguistic interface, not a gestural one, but it was the first linguistic interface that provided concurrency (through the & background scheme, then through shell layers and job control), and it was the complete OPPOSITE of the normal "user submits a command, user waits for a response" model that every other system in the world used.

    I implemented a UNIX shell with explicit backgrounding on RSX-11 and showed it to my boss, and he was astonished. Even though RSX lets you hit return and get a new prompt at any time, so you already have the ability to "interrupt" a program and do something else, he'd never used that for anything other than treating the MCR prompt as an "I'm still busy" message. But being able to take something that was going to take a long time and throw it off into the background, under his control, was great.

    Given the hardware limitations of the time, I submit that the UNIX shell and the UNIX plain-text-file, pipes-and-filters job-control environment are close to the very best user interface that could have been developed. It's the "tabbed browser" of the command line world. Alas, X-Windows came along and people stopped really using and understanding the shell, and X11's high-latency, message-based interface became the standard for the UNIX world.
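
    For anyone who has only ever driven this from a GUI, here is a rough Python sketch of both halves of that claim: a two-stage pipes-and-filters chain, and a backgrounded job you stay in control of. It assumes a POSIX box with sort, uniq, and sleep on the PATH, plus a hypothetical words.txt as input:

        import subprocess

        # Pipes and filters: roughly "sort words.txt | uniq -c", with the two
        # filters wired stdout-to-stdin the same way the shell wires "|".
        sort = subprocess.Popen(["sort", "words.txt"], stdout=subprocess.PIPE)
        uniq = subprocess.Popen(["uniq", "-c"], stdin=sort.stdout,
                                stdout=subprocess.PIPE)
        sort.stdout.close()            # so sort gets SIGPIPE if uniq exits early
        counts, _ = uniq.communicate()
        sort.wait()
        print(counts.decode())

        # Backgrounding: roughly "long_job &". Start it, keep the handle,
        # carry on doing other work, and collect it when you choose to.
        job = subprocess.Popen(["sleep", "5"])   # stand-in for a long-running task
        print("still interactive while the job runs...")
        job.wait()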

    It's really X11, a non-UNIX-like window system developed for UNIX and VMS at MIT in the '80s, that Lanier should be complaining about. Because UNIX itself doesn't suffer from the flaws that he's attributing to it. UNIX is small, tight, fast, responsive, and concurrent, a UNIX shell is a team of willing slaves that does WHAT you want WHEN you want it, and you NEVER have to wait for them unless you choose to.

    ---

    File systems with UNIX semantics, by the way, work well. That's the problem with NFS. NFS is *not* a UNIX file system, and its semantics make it a huge nightmare for applications developed on REAL UNIX file systems. It was a hack job designed to make it possible to implement a reasonably fast and efficient network file system in the kernel on a 68000-based Sun workstation in the '80s. It should have been turfed long since, and again: IT'S NOT UNIX, and IT'S NOT UNIX'S FAULT.

    ---

    For structured data, databases are great. Using a file system for database operations was a result of UNIX coming from an era before there was a really common way to talk about relational operations linguistically. Bad as SQL is, at least it gives us a framework to deal with the problem. But for hierarchical, randomly interrelated data, the filesystem model works well.

    Google is an index; it's not the data itself. The data that gives Google its value is in a file system.
  • by anaesthetica ( 596507 ) on Monday January 09, 2006 @04:49PM (#14430463) Homepage Journal
    Seriously. Someone needs to buy that author a copy of The Elements of Style [amazon.com]. Lanier couldn't write a clear sentence if his life depended on it.
  • by toiletsalmon ( 309546 ) on Monday January 09, 2006 @07:09PM (#14431709) Journal
    Most of the time, I can read most of the comments on this site and forget that I'm surrounded by a bunch of whiteboys. Then, there are the times that the sad truth comes bubbling furiously to the surface.

    (Score:5, Funny)

    That's just...sad. That being said, I'm certain that I will now be modded down into oblivion. Goodbye cruel world. I hardly knew ye.
