Are 99.9% of Websites Obsolete? 546
citizenkeller writes "Zeldman is at it again: 'Though their owners and managers may not know it yet, 99.9% of all websites are obsolete. These sites may look and work all right in mainstream, desktop browsers whose names end in the numbers 4 or 5. But outside these fault-tolerant environments, the symptoms of disease and decay have already started to appear.'"
Figures.... (Score:2, Interesting)
YEAH I agree (Score:3, Interesting)
It worked in all the current browsers a year ago.
But with IE 6 and the new Netscape coming out, you would *THINK* there would be backwards compatibility.
However, I get e-mails all the time about things that are now 'suddenly' broken.
And after verifying which browser the user encountered the error with -- amazingly enough, it's always one of the new ones.
*go figure*
Gasp! (Score:5, Interesting)
Who on earth is running a browser earlier than 4.x? Do you expect stuff to be rendered right if you use an older version of IE/Netscape/Opera? Do advertisers want to sell to people that refuse to use the latest and greatest thing? Don't you have to try real hard to even find an older version of any of these browsers?
Sounds like a cheap way to sell a book - and a little extra helping of FUD thrown in.
Books vs. The Web (Score:4, Interesting)
You know, if I pick up a book printed in 1920, its interface is going to be familiar to me. Table of Contents, Index, Chapters, Body Text, etc.
And now? I pick up a book printed today and find the same, useful interface.
Contrast that with the web, where I can find simple, clean interfaces like Google or Yahoo next to some ghastly Flash-based interfaces that do everything they can to distract me from the information I'm seeking. Plus, I'm being told that the device (program) I use to access these sites is obsolete less than five years after being released?
I'm all for freedom of speech (and web presentation), but the web's got a long way to go before it becomes the useful instrument it could be.
How about Slashdot then? (Score:3, Interesting)
correction .. company website (Score:5, Interesting)
And technically, however:
IE 5.5 will support nested tables up to 7 levels deep. Netscape 6 will only support up to 4.
Netscape 4.7 does not require quotes around attribute values like width or height.
Netscape 6.0 can do unusual things if they are not there.
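To make the quoting issue concrete, here's a hypothetical fragment (not from the article) showing the two styles being compared:

```html
<!-- Unquoted attribute values: Netscape 4.7 tolerates this,
     but Netscape 6 may do unusual things with it. -->
<img src=logo.gif width=100 height=50 alt=Logo>

<!-- Quoted attribute values: renders the same in both browsers. -->
<img src="logo.gif" width="100" height="50" alt="Logo">
```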
The problem (as stated in the article) is the legacy of the past 'browser wars' fought for dominance.
Now that everyone is trying (or at least saying they are trying) to get on the W3C bandwagon, these little 'faults' are starting to cause errors.
And since the vast majority of web publishers and early adopters out there have not received *formal* training in HTML [myself included], 5 years of bad habits become second nature.
Sorry for the confusion.
Complexity vs. usability (Score:3, Interesting)
I'm not saying that we as a collective need to move back to HTML 1.0, but there has got to be a solution to increasing complexity in Web information spaces. Companies that intentionally cripple some browser/OS combinations are doing the greater community a vast disservice.
The majority of Web pages are not necessarily broken, but reflect limits on the time and energy of those who create them to keep up with 'standards' that seem to shift every other week.
It's harder to play one note and have it be perfect than it is to play a thousand and have them be close. Most people choose the latter, and hope that one note hits home.
Re:Back in Reality... (Score:5, Interesting)
For my own sites I simply don't care about older browsers. I provide alternative CSS files (with basically all layout stripped) that should work in netscape 4 (haven't actually tested this). Aside from that there's only IE6 and mozilla for me. I develop for Mozilla and remove everything that doesn't work as specified in IE6. I refuse to do browser detection or to use CSS hacks to get stuff working. Some people advocate such hacks to trick IE into the right behavior but I refuse to sacrifice elegance and simplicity. That is also the reason I use XHTML strict. XHTML strict is much easier to maintain than HTML dialects that are polluted with formatting and other bullshit.
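One common way to serve a stripped-down stylesheet to Netscape 4 while giving modern browsers the full layout is to hide the real stylesheet behind @import, which Netscape 4 ignores. A sketch (the file names are hypothetical, not the commenter's actual setup):

```html
<head>
  <!-- basic.css holds only safe rules; Netscape 4 stops here. -->
  <link rel="stylesheet" type="text/css" href="basic.css" />
  <!-- Netscape 4 ignores @import, so full.css (the real layout)
       only reaches more capable browsers like IE6 and Mozilla. -->
  <style type="text/css">@import url("full.css");</style>
</head>
```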
Giving netscape 4 users a bad experience may actually stimulate them to install something else. If enough sites ignore netscape 4, maybe it will be abandoned by users. On most platforms there are now good alternatives (e.g. opera performs better than netscape 4.x on win32).
Web Standards are a well conceived joke (Score:2, Interesting)
The main problems that I see are that
1. Web standards bodies move slow and specifications are obsolete before they are approved. Take SVG. (please) Flash is a superior format with a large installed base, quality authoring tools, platform scalability, and open but expensive architecture. SVG took five years to become a reality, and is still VERY immature.
2. It's about the user, stupid! For the most part, users sit at a computer desktop, with a commercial browser (IE), and use the internet. It needs to look right for THEM. The .001% of users on cell phones are doing specific activities with mostly packaged content. These users are novelty users. Portable devices have no standards as to how they display, and without this, nobody can expect a useful cross-platform "standard" that works everywhere. It's a Microsoft world, whiner. There is no doubt that IE is the only browser that matters. If someone else wants to make a competitive browser, it needs to be IE compliant, not W3C compliant. Microsoft took it upon themselves to create a language that works no matter how it's written. Who cares about sloppy coding? Bandwidth is hardly an issue, and if a browser renders correctly, it should LOOK right.
In conclusion, the Web Standards Project and the W3C have failed due to their managerial impotence, and can be safely ignored.
The Problem Isn't Backward Compatibility (Score:3, Interesting)
Designers want to control every pixel of a page's layout, completely ignoring what the web was designed for. If everyone used logical markup to describe their data, later adding CSS to attempt to influence the layout, the web would be a much friendlier place. It may not look exactly the same on every browser (which, come to think of it, may be Zeldman's point), but with proper testing, it should look similar on popular browsers, and at least be LEGIBLE on others.
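The approach being described looks something like this hypothetical fragment: the markup says what the content *is*, and CSS separately suggests how it might look. A browser that ignores the CSS still gets a legible document.

```html
<!-- Logical markup: the structure survives even if the CSS is ignored. -->
<style type="text/css">
  /* Presentation lives here, not in the markup. */
  h1       { font-family: Georgia, serif; color: #336; }
  .summary { font-style: italic; }
</style>

<h1>Quarterly Report</h1>
<p class="summary">Sales rose in the third quarter.</p>
```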
People need to be convinced that the web is not a graphic design medium. That's what PDF files are for. People don't try to build their sites solely from PDF files, because that just wouldn't fly. Instead they try to use the web to achieve the same goal, completely oblivious to the fact that it's a really poor tool for that purpose. Rather than embracing a new paradigm, they try to contort it to look like what they already know. To me, that's just incompetence.
No, it's just reminiscent of "Flash: 99% Bad" (Score:3, Interesting)
Solution: Content Management Systems? (Score:2, Interesting)
I work for a mid-sized company, but I know the web site is very out of date and has incredibly poor content. In my mind I can pinpoint this to one thing: the inability of the people who write content to get it onto the site.
I know for a fact there are more than enough good stories and photographs in the organization that could be published, but most of the technicians who would write them (or at least the first draft) don't have the time to learn a web design program. The solution, I believe, is a good content management system. I've been looking into Typo3 [typo3.com] and a couple of other content management systems. I believe once we make it easy to update, the content will be less likely to be obsolete.
Content Management Systems are right now the best place I can start introducing open source software at my work. We've looked at Microsoft's Content Management Server, which is highly overpriced for our needs, and it's hard to argue with the documentation and self-help community that open source software provides. I know there are other content management systems out there, but the point is that for content to stay current, publishing capabilities must be pushed to the people who will author it.
Obsolete is an obsolete word (Score:4, Interesting)
Here's what it means: http://www.dictionary.com/search?q=obsolete
Hell, I still use lynx when all I want to do is snag a tarball. My linux boxes don't even have a GUI. If the content there has meaning, who cares if the web page uses the latest 'nifty tricks'? Is an ASCII text file obsolete? No, not if the information it contains is valid. Is EBSDIC (sic) obsolete? Probably. I can't even remember the acronym.
I'm constantly hearing how my P3 600 is obsolete. There's nothing that doesn't run on it. Hell, I have a router box running a P90.
Is my original NES obsolete? Or my Atari 2600, for that matter? Not as long as I enjoy playing them.
Is a 2001 model vehicle obsolete because the 2002 line is introduced? It does have a bigger cupholder, after all.
If people want to push their agendas, sell whatever they're selling, go for it. Just quit trying to redefine perfectly cromulent words in the English language to do so. Make up new ones, like cromulent. I propose 'obsolastweek' to mean everything that wasn't shrinkwrapped within the last 24 hours.
This article should read "99.9% of websites are obsolastweek because they haven't been redesigned because some propellerhead made a new widget"
Propellerheads (I can use that word because I am one) don't realise the cost of doing business. The world doesn't start over at 0 just because they invented something 'slightly better'.
sample chapters are great! (Score:2, Interesting)
The essay gave a good analysis of the tradeoffs that web programmers have to make when planning websites. Some of the code examples were particularly hilarious (if only because I know my websites have code that is equally ugly). This chapter, as I see it, is not advocating anything radical or controversial; it is merely restating the problem in as dramatic a way as possible.
Book Previews reduce the "obsolescence" of technical books. I say, let's have more of them!
rj
Re:correction .. company website (Score:3, Interesting)
Netscape 6.0 can do unusual things if they are not there.
The W3C standard says that ALL attribute values are required to have quotes. A browser could refuse to render any element with unquoted attributes and still be as compliant as before, since the behavior is undefined. If you had followed the standard in 1994, the problem would never have occurred in any browser. What you fail to mention is that both Netscape 4.7 and 6.0 render the page properly *with* quotes, so why risk making it fail by leaving them out? It's about the same error as a programmer writing to a bit of dynamically allocated memory without initializing it. Might work, might not. The behavior is undefined, and with proper education, a programmer would have learnt not to make the mistake.
Now that everyone is trying (or at least saying they are trying) to get on the W3C bandwagon, these little 'faults' are starting to cause errors.
The problem is, no one said that "writing the pages this way will probably make them work in the future". However, I'd like to see a page written using proper HTML + CSS and use no deprecated tags (like FONT) to go bad in the next version of IE or Netscape/Mozilla.
So what if the faults are causing errors? A design fault causing a fault in rendering is fully logical to me, and I understand the browser designers who are starting to have trouble rendering according to their previous non-standards as features are added to *follow* the standard. Soon you'd have a big mix of standards and non-standards, and at least I would be very tempted to just throw out the shit and attempt to follow standards better in the future. That's something Microsoft partially did in IE 6 (they require a proper DOCTYPE to enable compliance mode -- probably too afraid of doing anything too drastic) and something the Mozilla group definitely did in their browser.
Just follow the standards and your pages should look very nice in Netscape 8 and Internet Explorer 7. Start by learning about the DOM tree and forget everything you ever "learnt" about document.all. Use getElementById("id") instead of document.all.id. As a bonus, by following the DOM and skipping deprecated tags like FONT while using CSS with em values and the like, you'll automatically get a pretty much cross-browser page, since CSS comes with very strict rendering rules. *And* a page that looks good in the future.
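Put together, the advice above amounts to something like this minimal sketch (the element id and function name are made up for illustration):

```html
<!-- A strict DOCTYPE switches IE 6 into standards-compliance mode. -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>DOM example</title>
<script type="text/javascript">
function highlight() {
  // Standard DOM: works in IE 5+, Netscape 6/Mozilla, Opera...
  var el = document.getElementById("msg");
  el.style.color = "red";
  // ...unlike the IE-only equivalent:
  //   document.all.msg.style.color = "red";
}
</script>
</head>
<body onload="highlight()">
<p id="msg">Styled via the standard DOM, not document.all.</p>
</body>
</html>
```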
If more web designers got an education (as in most other sorts of work -- what's so special about web designers that they need no education anyway?), things would of course look better today.
Good Logic (Score:3, Interesting)
However, when the HTML is standard, it's a bug in the browser, which needs to be addressed.
Your logic is flawless, but notice where you're left now.
The browser is branded buggy and non-compliant.
Say the browser is IE 4 or Netscape 4.
Great - the browser creators come out with a new version of the browser that fixes those bugs.
IE 6 and Netscape 6 are in greater compliance with standardized HTML 4.01, CSS, DOM, etc.
Now you come to the end of the road:
Re:Business Need and Long Term Costs (Score:3, Interesting)
It's not the backwards compatibility that concerns me; it's the _sideways_ compatibility. The authored HTML tends to work in a range of Netscape browsers, a range of Internet Explorer browsers, and sometimes in a range of Opera browsers. Anything other than that is random.
A standards-adhering HTML document could be used in all the browsers above, plus all the other user agents out there that support the standard followed. So text-to-speech browsers, indexers, spiders, content aggregators -- all the silent user-agents suddenly have access to structured content.
These are the user-agents that are overlooked by the typical public website. People don't tend to notice that structured markup scores a lot better in Google than font-flavoured tag soup, precisely because h1 defines a first-level header, while font defines some weird presentational style but nothing semantic that a search engine can use.
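The contrast being drawn, in miniature (hypothetical fragments):

```html
<!-- Tag soup: a search engine sees only presentational noise. -->
<font size="6" face="Arial"><b>Annual Report</b></font>

<!-- Structured markup: any user-agent -- indexer, aggregator,
     text-to-speech browser -- can tell this is the top-level
     heading, and style it (or speak it) accordingly. -->
<h1>Annual Report</h1>
```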
I don't believe browsers will be the user-agent of choice in the coming years - we'll automate all the manual intensive process of trawling through websites looking for information, and we'll delegate it to some sort of intelligent agents that do the work while we do something more enjoyable.
RSS Aggregators like AmphetaDesk [disobey.com] show a very basic inkling of what can be possible with structure and the value of content out there on the Internet.
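For readers who haven't seen one, an RSS feed is just structured markup that an aggregator can parse without scraping HTML. A minimal, hypothetical example (real feeds carry more metadata per item):

```xml
<?xml version="1.0"?>
<rss version="0.92">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.com/</link>
    <description>A hypothetical feed.</description>
    <item>
      <title>Are 99.9% of Websites Obsolete?</title>
      <link>http://example.com/story/1</link>
    </item>
  </channel>
</rss>
```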
But we need structured markup to add semantic meaning to the content, and then we can leverage that content into something truly useful. (Yes, I'm a dreamer longing for something practical)