Assuming that he'd filed for his patents before the USPTO went insane, odds are low that they would have survived the prior art test. As others have mentioned, Gopher, FTP, and even several BBS systems would have constituted prior art for the HTTP component. HTML was really just a bastardized version of SGML. And the entire concept of a hypertext page was predated by HyperCard and a bunch of work at Xerox PARC.
Why are we even having this discussion?
As it turns out, the real problem on these platforms is power generation. With synthetic aperture radars, flight control systems, on-board mission management systems, laser designators, EO sensors, and LOS and BLOS/satellite comms gear all on board, supplying enough electricity for every system becomes critical.
I worked on the original J-UCAS program, which transitioned from DARPA to the Navy, and designing the autonomous flight and mission management systems was the easier part of the problem. Creating the comm infrastructure (software-defined radios), the operational procedures, and the peer-to-peer cooperation, and handling mundane stuff like air traffic control, turned out to be much harder in practice.
Definitely one of the coolest projects I've ever worked on, and I'm glad to see one of the J-UCAS-derived UAVs finally getting into the air.
It's just as shallow to compare a desktop software application's success to the much more transient, ephemeral, and difficult-to-quantify success of Facebook. Better to look at the Internet as a whole and ask a couple of simple questions.
First, name one network-wide, user-oriented, application-level service that was present when the commercial Internet opened for business in 1991 and is still in operation and use today.
Discounting Usenet and email as infrastructure, the answer is likely "nothing." It's instructive to consider why. Early community plays on the Internet (The WELL, theGlobe.com, AOL, WebTV, and even MySpace) fell in succession, not because there weren't plenty of users and not because they weren't good services. They fell because something newer and better came along. It's the same reason we don't drive horse-drawn wagons to work. Supporting the infrastructure and feature set of an existing system means, by definition, that you will never be able to change and adopt new technologies as fast as someone else starting with a clean slate.
Second, what is so special about Facebook that it will avoid being obsolesced by the next cool fad? Answer again, "nothing".
Facebook's only advantage is the depth of its social graph. And as many posters have noted, the average Facebook user's social graph is pretty static by now, with no real need to grow it. Once you are fully connected, it becomes trivial to notify your graph that you are moving elsewhere, and then Metcalfe's Law kicks in, in reverse: every user who leaves makes the network less valuable to everyone who stays. Once the infrastructure becomes distributed and you are no longer locked into a single service, people will be free to move their social graph and associated applications wherever they'd like.
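To put a rough number on that: Metcalfe's Law values a network at roughly its number of possible connections, n(n-1)/2, so losses compound fast. A back-of-the-envelope sketch in Python (the 1000-user network is just an illustration):

    # Back-of-the-envelope Metcalfe's Law: value ~ possible connections.
    # The point: network value falls superlinearly as users defect.
    def metcalfe(n: int) -> int:
        return n * (n - 1) // 2

    full = metcalfe(1000)  # a toy 1000-user network
    for gone in (100, 300, 500):
        share = metcalfe(1000 - gone) / full
        print(f"{gone} users gone: {share:.0%} of original value left")
    # 100 gone -> ~81%, 300 gone -> ~49%, 500 gone -> ~25%

Losing just 10% of the users already costs the network nearly a fifth of its value, which is why these exoduses snowball.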
Extrapolating the past lifecycles of similar, successful social sites to Facebook, it seems logical to conclude (as the author did) that Facebook's days are numbered. Maybe in the thousands, but numbered nonetheless.
The developer is at fault, not Apple. Apple is not party to the GPL terms just because a developer with a political agenda chooses to upload an application to the App Store. The developer is at fault for choosing a distribution medium for his software that is incompatible with GPL terms. It's not any more complicated than that.
The fallacy in this article is that designers are the ultimate arbiters of how a web site gets built. It's just not the case. If I am paying the bills and I want to reach the largest possible audience, my first instruction to the designer working for *me* is going to be to avoid, like the plague, Flash, Java, and anything else that is plug-in dependent. Flash sites are inaccessible to the vision impaired, they are generally not searchable, they are prone to breakage when plug-ins are outdated or missing, and they are generally not maintainable by the organization they were created for once they are delivered.
Sorry, but sticking with standards-based solutions and tools that manipulate open formats is the way to go if you, Mr. Designer-boy, are working for me. In a market-driven economy, designers are a commodity, not an industry force. They're gonna do what they're paid to do.
Thank goodness that's all solved now!
The good news is that SPDY seems to build on the SMUX (http://www.w3.org/TR/WD-mux) and MUX protocols that were designed as part of the HTTP-NG effort, so at least we're not reinventing the wheel. Now we have to decide what color to paint it.
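For the curious, the core trick SMUX pioneered and SPDY revives is interleaving many logical streams over one transport connection by tagging each chunk with a stream ID and a length. A toy Python sketch (the 4-byte header layout here is my own invention for illustration, not the framing from either spec):

    # Toy session-layer multiplexing a la SMUX/SPDY: several logical
    # streams share one connection, each chunk prefixed with
    # (stream_id, length). Header layout is illustrative only.
    import struct

    HEADER = struct.Struct("!HH")  # stream_id (16 bits), length (16 bits)

    def frame(stream_id: int, payload: bytes) -> bytes:
        return HEADER.pack(stream_id, len(payload)) + payload

    def deframe(buf: bytes):
        # Yield (stream_id, payload) pairs from a framed byte string.
        while buf:
            stream_id, length = HEADER.unpack_from(buf)
            yield stream_id, buf[HEADER.size:HEADER.size + length]
            buf = buf[HEADER.size + length:]

    # Two "requests" interleaved on one wire:
    wire = frame(1, b"GET /a") + frame(2, b"GET /b") + frame(1, b"rest of /a")
    for sid, data in deframe(wire):
        print(sid, data)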
Next up: immediate support in Firefox, WebKit, and Apache -- and deafening silence from IE and IIS.
Also, a couple of times I've had dying drives that work OK for a few minutes after a cold boot, and then they (heat up and) die. I've had good luck throwing the drive in the freezer (in a ziplock bag) for a day, powering it up, and recovering as much as I can until the drive chokes again. Lather, rinse, repeat until all the recoverable data has been copied off to a good drive.
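For anyone who wants to script that loop, here's a minimal sketch of the idea (device path, chunk size, and file names are assumptions on my part; in real life GNU ddrescue does this far more robustly):

    # Sketch of the lather-rinse-repeat loop: read the dying drive in
    # chunks, skip unreadable spots, and persist the offset so each
    # freezer cycle resumes where the last one choked. Run as root.
    import os

    SRC = "/dev/sdb"         # assumed: the dying drive
    DST = "rescued.img"      # image file on a known-good drive
    MARK = "rescued.offset"  # progress marker between passes
    CHUNK = 1024 * 1024      # 1 MiB per read

    offset = int(open(MARK).read()) if os.path.exists(MARK) else 0
    mode = "r+b" if os.path.exists(DST) else "wb"

    with open(SRC, "rb", buffering=0) as src, open(DST, mode) as dst:
        src.seek(offset)
        dst.seek(offset)
        while True:
            try:
                block = src.read(CHUNK)
            except OSError:
                # Bad sector (or the drive warmed up and quit): record
                # progress, skip ahead, keep going. Ctrl-C and re-freeze
                # when it's hopeless, then just rerun this script.
                offset += CHUNK
                src.seek(offset)
                dst.seek(offset)
                open(MARK, "w").write(str(offset))
                continue
            if not block:
                break  # end of device: everything readable is copied
            dst.write(block)
            offset += len(block)
            open(MARK, "w").write(str(offset))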
TFA:"Bringing up the Status window I noticed my download performance was a far cry from my 7 mbps speed, but rather a measly 0.48 mbps...:"
0.48 Mbps = 480 Kbps (kilobits/sec) = 60 KBps (kilobytes/sec) raw, or roughly 48 KBps of actual throughput if you apply the divide-by-ten rule of thumb for protocol overhead.
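Quick sanity check on those numbers (the divide-by-ten divisor is the usual rule of thumb for protocol and framing overhead, not an exact figure):

    # Megabits/sec -> kilobytes/sec: raw (8 bits/byte) vs. the
    # rule-of-thumb effective rate (~10 bits per delivered byte).
    def mbps_to_kBps(mbps: float, bits_per_byte: float = 10.0) -> float:
        return mbps * 1000 / bits_per_byte

    print(mbps_to_kBps(0.48, 8.0))  # 60.0 KBps raw
    print(mbps_to_kBps(0.48))       # 48.0 KBps effective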
As Alexis de Tocqueville said, "People get the government they deserve." If you're "smart" enough to think Iowa has it right, then by all means, please enjoy the pig farts and corn husks of your new overlords.
Gee, Toto, I don't think we're in Kansas anymore.