IMO this is the classic case of something getting so big that a superior system can never challenge it.
The web, to me, is an under-functional core with over-functional (read: insecure and often able to act against the users' wishes) bolt-on additions (e.g. Java), whose security holes are then repeatedly patched up with incomplete fixes. It's inefficient, kludgy, and very badly misused by web authors.
A few of the more annoying parts are:
Somewhere midway through its history, the idea that content should be adaptable by the user (which was a big initial selling point) got diminished, since you can't adapt scripts or Flash. A good example (but only one of a million) is that scripted links usually can't be opened in new tabs.
The biggest problem (as I see it) is that web authors (for example, /.'s) whose sites are perfectly capable of being transmitted in the cleaner, more efficient script-less "old" way choose to go with scripts "because they can", or because it sounds better to say (either to the users or to HQ) "our site now has <the latest buzzword> here". These scripts routinely break on non-standard browser setups.
Compared to an optimal replacement for the web, information takes far, far too long to download, for a number of reasons:
1. Poor caching, especially of "advanced" content.
2. Various protocols and multiple servers, meaning content can't be sent as one big chunk and instead requires multiple connection setups.
3. Lack of compression of much content, such as text and many images.
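On point 3, the waste is easy to demonstrate: text-heavy markup is highly repetitive and shrinks dramatically under even plain gzip. A minimal sketch in Python (the repeated table-row string is just a stand-in I made up for typical page markup):

```python
import gzip

# Stand-in for a text-heavy HTML page: markup repeats the same tags
# and attributes over and over, so it compresses extremely well.
html = ('<tr><td class="comment"><a href="/story">A comment title</a>'
        '</td></tr>\n' * 500).encode('utf-8')

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, "
      f"gzipped: {len(compressed)} bytes, ratio: {ratio:.1%}")
```

Real pages are less repetitive than this toy input, but reductions of 60-80% on HTML and plain text are routine, which is exactly the bandwidth left on the table when servers send it uncompressed.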
IMO, the web is a good pet project suitable for hobbyists and enthusiasts. It's not good enough for prime-time - sad that it's used there.