The Internet

Are 99.9% of Websites Obsolete?

citizenkeller writes "Zeldman is at it again: 'Though their owners and managers may not know it yet, 99.9% of all websites are obsolete. These sites may look and work all right in mainstream, desktop browsers whose names end in the numbers 4 or 5. But outside these fault-tolerant environments, the symptoms of disease and decay have already started to appear.'"
  • Figures.... (Score:2, Interesting)

    by inf0rmer ( 545195 )
    I think this percentage of the web sites that I've developed over the years is obsolete. It's nothing to do with bad design - the owners of the sites don't bother to use them effectively any longer, and the content becomes... obsolete.
    • At one point during the heyday of the .com gold rush, people threw money at companies which claimed the ability to draw incredible profits at some undetermined point in the future. Some consider this long-term thinking, while others consider it foolishness.

      Website designers have learned this lesson well. They strive to serve their business clients by allowing them to interact with the largest customer base possible, using clunky, non-standard, bandwidth-consuming techniques to get outdated browsers to render their stores in the desired fashion.

      You really can't blame website designers for this, nor can you blame site owners. The designers are working to meet their clients' requirement, which is to make money by being accessible to the largest percentage of the available customer base.

      The fault, dear Brutus, is in ourselves. Website visitors are at fault, for using browsers which promote this non-standard architecture. Certainly no one will use a browser so strictly standards-compliant that any non-standard website would not be visible, because that would diminish the user's internet experience; but this is what's required. We need to force site owners to become standards compliant, which will in turn improve efficiency throughout the net.

      If only bandwidth were more expensive, this problem would already have been fixed, as the bandwidth costs of inefficient, non-standard site design would be far more visible.

      It really is a Faustian bargain. Reduce revenue by modernizing your website, making it inaccessible to older browsers and thus shrinking your potential customer base, save money on bandwidth, and then wait for web users to upgrade their browsers so you can build your customer base up again; or cater to every antiquated browser in existence, so as to maximize your potential customer base, and accept the increased bandwidth costs.

      In the long term, with a little short term pain, this problem will be resolved, but in the short term, there really is no good answer.

  • Blinkers (Score:2, Troll)

    by Zemran ( 3101 )
    It seems like someone has finally noticed that if you do not test your site using a wide range of browsers you do not know how your page is going to look... To most of us this problem is obvious.
    • It seems like someone has finally noticed that if you do not test your site using a wide range of browsers you do not know how your page is going to look... To most of us this problem is obvious.
      To most of us, yes, but not to a large proportion of web site designers, apparently.
  • YEAH I agree (Score:3, Interesting)

    by RembrandtX ( 240864 ) on Wednesday September 11, 2002 @11:15AM (#4237582) Homepage Journal
    I can't even keep OUR damn site up and compliant.

    It worked in all the current browsers a year ago.
    But with IE 6 and the new Netscape coming out - you would *THINK* there would be backwards compatibility.

    However, I get e-mails all the time about things that are now 'suddenly' broken.
    And after verifying what browser/etc. the user encountered the error with - amazingly enough .. pages that work with older browsers are choking up the newer ones.

    *go figure*
    • I don't see how that's possible, since you're using standard HTML. Wow. Maybe your web site got sucked into an alternate dimension where HTML versions are not backwards compatible?
      • by RembrandtX ( 240864 ) on Wednesday September 11, 2002 @11:26AM (#4237709) Homepage Journal
        Correction .. I meant to say my employer's website, which uses asp/javascript/VB

        {and technically .. my personal website uses PHP, which is just getting parsed into HTML for your browser}

        however .. if you would read the article .. even basic HTML can be corrupted ..

        IE 5.5 will support nested tables up to 7 in depth. Netscape 6 will only support up to 4 in depth.

        Netscape 4.7 does not require quotes around attribute values like width or height.
        Netscape 6.0 can do unusual things if they are not there.
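        The quoting difference amounts to two lines (hypothetical markup, not from either poster's site):

        ```html
        <!-- Unquoted attribute values: tolerated by Netscape 4.7,
             but undefined behavior elsewhere -->
        <td width=200 height=40>risky</td>

        <!-- Quoted values: valid in every HTML version and render
             consistently across browsers -->
        <td width="200" height="40">safe</td>
        ```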

        The problem (as stated in the article) is that because of the past 'browser wars' fighting for dominance .. previous incarnations of browsers tolerated (and corrected) sloppy HTML.

        Now that everyone is trying (or at least saying they are) to get on the W3C bandwagon, these little 'faults' are starting to cause errors.

        And since the vast majority of web publishers and early adopters out there have not received *formal* training in HTML [I, for example .. got my CS degree in 1994 .. I never even learned visual basic in college], they are/were not always *aware* of things that HTML 'requires' but the browsers let them get away with.

        5 years of bad habits become 2nd nature.

        sorry for the confusion.
        • Yeah, I know what you mean. Most of the time, the problem is with the browsers, though. When you allow yourself to compromise for the sake of compatibility with poorly designed browsers, this is exactly what happens.

          Granted, sometimes it's unavoidable, since backward compatibility can't always be maintained; in that case, the problem lies with standard HTML itself. However, when the HTML is standard, a rendering failure is a bug in the browser, which needs to be addressed.

          Just because void main() {} can compile doesn't mean it's right.
          • Good Logic (Score:3, Interesting)

            by 4of12 ( 97621 )

            However, when the HTML is standard, it's a bug in the browser, which needs to be addressed.

            Your logic is flawless, but notice where you're left now.

            The browser is branded buggy and non-compliant.

            Say the browser is IE 4 or Netscape 4.

            Great - the browser creators come out with a new version of the browser that fixes those bugs.

            IE 6 and Netscape 6 are in greater compliance with standardized HTML 4.01, CSS, DOM, etc.

            Now you come to the end of the road:

            Joe Sixpack refuses to upgrade his browser!
        • You know, adding a DTD, defining the character encoding and validating your HTML [] would probably help quite a bit.

          They're called standards for a reason.
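          For what it's worth, those steps amount to a couple of lines at the top of the page plus a trip through a validator (a sketch of an HTML 4.01 Strict document, not anyone's actual site):

          ```html
          <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
              "http://www.w3.org/TR/html4/strict.dtd">
          <html>
          <head>
            <!-- Declare the encoding so browsers don't have to guess -->
            <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
            <title>A page the validator can actually check</title>
          </head>
          <body>
            <p>With the DOCTYPE and charset declared, a validator can do the rest.</p>
          </body>
          </html>
          ```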


          cLive ;-)

          • because there are so many to choose from?
            • So? At least if you specify a DTD you are showing that you understand there are issues and have made a rational choice in selecting one.

              If you bother to adhere to a particular DTD, the odds on your site being viewable across a range of browsers increase dramatically.

              Yes, you will never get 100% compatibility, but you will get damn close.

              If you insist on features outside current DTDs, then use server-side browser detection to serve either the site as you intended or a heavily stripped-down, totally cross-platform version.


              cLive ;-)


          • by JamesOfTheDesert ( 188356 ) on Wednesday September 11, 2002 @01:37PM (#4238721) Journal
            They're called standards for a reason.
            Well, no, they're not called standards, and for a reason. From the w3c home page:

            The World Wide Web Consortium (W3C) develops interoperable technologies (specifications, guidelines, software, and tools) to lead the Web to its full potential.

            No mention of standards.

            Take a look at the HTML specification page []:

            W3C produces what are known as "Recommendations". These are specifications, developed by W3C working groups, and then reviewed by Members of the Consortium. A W3C Recommendation indicates that consensus has been reached among the Consortium Members that a specification is appropriate for widespread use.

            Again, no mention of standards.

            The W3C is a vendor consortium, primarily a group of big players who are trying to reduce their cost of business by hammering out some common formats. The W3C is not a standards body, and they do not produce standards. While there are smart, possibly altruistic people on W3C working groups, by and large the W3C as a whole is interested in promoting the welfare of its member companies, not that of the general developer community. Typically these interests overlap, but that doesn't change the purpose of the W3C.
        • Do you have any idea how old Netscape 6.0 is?
          For goodness' sake, upgrade to Netscape 6.2, 7.0 or Mozilla 1.1! 6.0 is so old and has so many bugs, while 6.2 is almost infinitely more stable, faster and better at rendering.
        • Netscape 4.7 does not require quotes around attribute values like width or height.
          Netscape 6.0 can do unusual things if they are not there.

          The W3C standard says that ALL attributes are required to have quotes. A browser could refuse to render any element with unquoted attributes and still be as compliant as before, since the behavior is undefined. If you had followed the standard in 1994, the problem would never have occurred in any browser. And what you fail to mention is that both Netscape 4.7 and 6.0 render the page properly with quotes. Why risk making it fail by leaving them out? This is about the same error as a programmer not initializing a bit of dynamically allocated memory and then writing to it. Might work, might not. The behavior is undefined, and with the proper education a programmer would have learnt not to make the mistake.

          Now that everyone is trying (or at least saying they are) to get on the W3C bandwagon, these little 'faults' are starting to cause errors.

          The problem is, no one ever said "writing the pages this way will probably make them work in the future". However, I'd like to see a page written in proper HTML + CSS, using no deprecated tags (like FONT), go bad in the next version of IE or Netscape/Mozilla.

          So what if the faults are causing errors? A design fault causing a fault in rendering is fully logical to me, and I understand the browser designers who are starting to have trouble rendering according to their previous non-standards as features are added to *follow* the standard. Soon you'll have a big mix of standards and non-standards, and at least I would be very tempted to just throw out the shit and attempt to follow standards better in the future. Something Microsoft partially did in IE 6 (they require a proper DOCTYPE to enable compliance mode -- probably too afraid of doing anything too drastic) and something the Mozilla group definitely did in their browser.

          Just follow the standards and your pages should look very nice in Netscape 8 and Internet Explorer 7. Start by learning about the DOM tree and forget everything you ever "learnt" about document.all. Use getElementById("id") instead. As a bonus, by following the DOM and skipping deprecated tags like FONT, etc., while using CSS with em values and the like, you'll automatically get pretty much a cross-browser page, since CSS has very strict rendering rules. *And* a page that looks good in the future.

          If more web designers got an education (as in most other sorts of work -- what's so special about web design that it needs no education anyway?), things would of course look better today.
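          The document.all point is the classic case: IE's proprietary collection versus the W3C DOM call (a hypothetical snippet, not from the article):

          ```html
          <p id="status">loading...</p>
          <script type="text/javascript">
          // IE-only, pre-DOM style -- breaks in standards-based browsers:
          //   document.all["status"].innerHTML = "done";

          // W3C DOM style -- works in IE 5+, Netscape 6+ and Mozilla:
          document.getElementById("status").innerHTML = "done";
          </script>
          ```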
    • Re:YEAH I agree (Score:3, Insightful)

      by ncc74656 ( 45571 )
      I can't even keep OUR damn site up and compliant.

      It worked in all the current browsers a year ago. But with IE 6 and the new Netscape coming out - you would *THINK* there would be backwards compatibility.

      If you had written to the standards instead of just hacking something together until it worked in IE/NS $CURRENTVERSION, odds are pretty good that you wouldn't have this problem now.

    • by bunratty ( 545641 ) on Wednesday September 11, 2002 @11:25AM (#4237698)
      It worked in all the current browsers a year ago. But with IE 6 and the new Netscape coming out - you would *THINK* there would be backwards compatibility.
      You have backwards and forwards compatibility mixed up.

      Backwards compatibility means it works in older browsers. As Zeldman mentions, it always has some cutoff point, such as Netscape 3 or IE 2.

      Forwards compatibility means that it works in newer browsers. There is not necessarily any cutoff point, as long as you have constructed the website correctly. Structural problems and other typos in the HTML, proprietary and deprecated tags, and versioning can all limit the forward compatibility of the page.

      Read the article and you'll see that Zeldman is arguing that web designers should be developing with forwards compatibility in mind. Unsurprisingly, yours is one of the 99.9% of all sites that have not.

      • by Anonymous Coward
        He made no such mistake. He places the burden of interoperability on the producers of the software, not the designers of the sites. You place the burden on the designers, not the producers. From his perspective, the software companies should make sure that their software does not make unnecessary deviations from standard, thus breaking older sites. You think that the designers should predict change and design their sites to take this into account.

        I don't know which philosophy is more unreasonable.
        • But the problem is that most designers are NOT following these standards; they keep using non-standard features of the older browsers, so the software writers now have a dilemma - of their own making, I grant.

          They have two choices: only render the pages that follow the standards and have 99% of sites non-functional in their browser, or allow it all to work so their browser can be used today.

          The only company that could currently force the updating of many sites is our favorite company, Microsoft, and even then I'm sure there would be resistance to a browser that only followed the standard.

          So the burden has to be on the designers of the sites to pull them into line with the standard, now that browsers such as Mozilla and Opera can render strictly to the standard.

      • by Jahf ( 21968 ) on Wednesday September 11, 2002 @12:24PM (#4238164) Journal
        You're talking about forwards compatibility of the HTML code (being able to render properly on future browsers, where the onus of compatibility is on the HTML author).

        The parent was talking about backwards compatibility of the browsers (being able to properly render old HTML code in a new browser, where the onus of compatibility is on the browser author).

        It's semantics, but I didn't start the nitpick :) Either term works for this application as long as you are looking from the correct side of the issue.

        As for the parent that wanted browsers to be backwards compliant ... that works, but only if you write your code compliant 100% to standards. That means leaving out all the proprietary cruft (which became especially prevalent in the "4.0s" of Netscape and IE) -as well as- all of the stuff that doesn't work in a cross-browser environment.

        This is very hard to do if you want interactive sites, or at least was until recently when most browsers began to pay more attention to standards such as the DOM (document object model).

        Again, we're back to a very basic problem. Do you write your page to work in old browsers or do you use the latest standards? I'm less concerned with this (as the author of the book seems to be) than I am with the idea of writing code to today's standards and having it work in future browsers.

        I as a user understand that I'm taking my experience into my own hands if I try to load a modern page into Netscape 1.0 (but it is fun sometimes :).

        However, words can't express my frustration when I have the most modern browsers available and I can't load a page because it was written for an older browser. This happened to me yesterday when trying to sign up for a service from my phone company. The reps kept saying "I see that option, you should have it too". 30 minutes later I decided to load the same page into a 2-year-old browser and it worked fine. It had used some tags that were horribly broken, not in any standard, and later abandoned by all involved.

        If the modern browsers had had to be compatible with everything since the dawn of the web, they would be twice as large and 4 times as buggy. I would much rather that web authors stick to published standards and not rely on proprietary tags for public pages.

        From what I see, this is what the book's author meant by "obsolete" and I agree. Most websites, if locked down and not changed for 3 years, would no longer render in the browsers that are new in 3 years.

        While they will naturally work to fix these issues as the new browsers are released, they would not have to if they wrote to the basics. And the problem with fixing things as they evolve is that some pages (like that damned phone company page) get ignored and by the time they're found no one knows how to fix them.

    • I use XML output from PHP run through my XSL stylesheets to produce the final output. The stylesheets get fed the user's language and user-agent along with the output, and easily produce custom output for all devices without any significant coding. I will be glad when (if?) HTML is finally replaced by good XHTML support, but overall, keeping up with these things is not difficult if you design your site well. Also, since XSL checks the HTML output it produces, it eliminates many of the problems commonly found in output code. The biggest problem is trying to deal with user-inputted data, and that is more of a language problem than a formatting problem.
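      A sketch of what such a stylesheet might look like (invented element names; assumes the PHP back end emits a simple <items> document):

      ```xml
      <?xml version="1.0"?>
      <xsl:stylesheet version="1.0"
          xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
        <!-- Turn each <item> from the back end into an HTML list entry;
             a second stylesheet could emit WML or plain text instead -->
        <xsl:template match="/items">
          <ul>
            <xsl:apply-templates select="item"/>
          </ul>
        </xsl:template>
        <xsl:template match="item">
          <li><xsl:value-of select="."/></li>
        </xsl:template>
      </xsl:stylesheet>
      ```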
  • by plover ( 150551 ) on Wednesday September 11, 2002 @11:15AM (#4237588) Homepage Journal
    It's not even a review. The "sample chapter" featured such nice conflicts as: web pages that are HTML 1.0 compliant waste bandwidth vs. web pages that are written for IE only turn away 25% of their viewers.

    Near as I can figure out, he's claiming "the web is broken, don't bother."

    The book looks broken. Don't bother.

    • Have to agree.

      HTML. Broken. Yadda yadda yadda. Design. Content. Separation. Blah blah blah.

      Relying on HTML to solve these problems is outdated. We have back-end scripts to deliver customized presentations depending on the browser used to visit the site.

      But I guess this is obvious to most of the horde of /. readers.
    • by Monkeyman334 ( 205694 ) on Wednesday September 11, 2002 @12:15PM (#4238062)
      I don't know where you get your stats, but it's 8% that don't use IE. I agree the book looks like a joke though. Take this quote for example:

      The irony is that no one beside Yahoo's management cares what Yahoo looks like. The site's tremendous success is due to the service it provides, not to the beauty of its visual design (which is non-existent).

      I just want to know, what part of this makes it obsolete? That it uses HTML workarounds, looks right, or is a great service?

      Then he goes on to complain that this extra HTML causes huge bandwidth charges, which I can assure you are negligible, even over millions of page views. If you take a look at my August statistics [], on the 22nd you can see the sysadmin disabling mod_gzip. On the 28th, you can see me panicking about bandwidth and switching our old font tags to CSS. You can see the page views are about the same as on the 27th, but the bandwidth goes from 871 megs to 838 megs. 33 megs is a very small difference for possibly breaking browsers that don't support CSS! Seeing as the bandwidth for a site like Yahoo is bought in bulk, even a gig of difference a day wouldn't be that much. And this is with mod_gzip turned off; that gap would shrink to nothing if it were on. With Yahoo, most of their bandwidth is in news images and content anyway, not their design. So I wouldn't recommend taking the time to read his book, or even the sample chapter; it's bogus for sure.
    • People tend to knock down geeks who have become popular or well-respected. As for sample chapters, I think they are great! Not only New Riders but also O'Reilly does a great job of letting readers sample chapters. What a wonderful thing that anyone can download chapters before a book actually comes out. In book publishing, there is an enormous lag time between assignment of the book and publication date (just look at the review of the blogging book from yesterday). By the time a book comes out, the examples are irrelevant and the standards have changed or improved.

      The essay gave a good analysis of tradeoffs that web programmers have to make when planning websites. Some of the code examples here were particularly hilarious (if only because I know my websites have code that is equally ugly). This chapter, as I see it, is not advocating anything radical or controversial; it is merely restating the problem in as dramatic way as possible.

      Book Previews reduce the "obsolescence" of technical books. I say, let's have more of them!


  • by digital_milo ( 212475 ) on Wednesday September 11, 2002 @11:16AM (#4237597)
    If by "obsolete" you mean "porn", then I'd have to agree with you.
  • Back in Reality... (Score:5, Insightful)

    by alexhmit01 ( 104757 ) on Wednesday September 11, 2002 @11:17AM (#4237605)
    You can read the Webmaster World [] article, "XHTML -- is now the time? []" if you want to read a debate among professionals. There are many pros, primarily developers of small sites, that are advocating dropping NN 4 for XHTML Strict and CSS, but most developers aren't going that route.

    They are developing XHTML 1.0 Transitional or HTML 4.01, maybe adding CSS going forward. NN4 will be around for a while, and few people are willing to write its users off simply to appease the standards gods.

    In the real world, we build sites for human composition. We separate content from display with our databases and content management. HTML may be an inefficient way to get the data to the browser (XML+XSLT would be ideal, XHTML+CSS would be easier on the browser), but it works. The browser parsers are done.

    Sure XHTML+CSS is easier on the browser, and that may help rendering issues. However, the reality is that old browsers will be with us for a while. Maybe in 5 years this will matter, but not until then.

    • by jilles ( 20976 ) on Wednesday September 11, 2002 @11:46AM (#4237825) Homepage
      XHTML Strict by itself renders quite nicely in older browsers. It's CSS that causes the problems. If you adhere to the standards and do some positioning, you are likely to encounter problems in almost all browsers other than Mozilla. It is really frustrating to tweak your CSS to do what you want it to do and still have it work in all major browsers.

      For my own sites I simply don't care about older browsers. I provide alternative CSS files (with basically all layout stripped) that should work in netscape 4 (haven't actually tested this). Aside from that there's only IE6 and mozilla for me. I develop for Mozilla and remove everything that doesn't work as specified in IE6. I refuse to do browser detection or to use CSS hacks to get stuff working. Some people advocate such hacks to trick IE into the right behavior but I refuse to sacrifice elegance and simplicity. That is also the reason I use XHTML strict. XHTML strict is much easier to maintain than HTML dialects that are polluted with formatting and other bullshit.

      Giving netscape 4 users a bad experience may actually stimulate them to install something else. If enough sites ignore netscape 4, maybe it will be abandoned by users. On most platforms there are now good alternatives (e.g. opera performs better than netscape 4.x on win32).
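      One common way to serve a stripped-down stylesheet to Netscape 4 exploits the fact that NN4 ignores @import (a sketch, with made-up file names):

      ```html
      <!-- basic.css: fonts and colors only -- NN4 reads this via <link> -->
      <link rel="stylesheet" type="text/css" href="basic.css">

      <!-- layout.css: full positioning -- NN4 ignores @import, so only
           CSS2-capable browsers ever see these rules -->
      <style type="text/css">
        @import url("layout.css");
      </style>
      ```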
    • Sure XHTML+CSS is easier on the browser, and that may help rendering issues. However, the reality is that old browsers will be with us for a while.
      Even if your users use only the newest browsers, there are reasons to stay away from XHTML. Read Ian Hixie's Sending XHTML as text/html Considered Harmful [].
    • Rants against Netscape 4 tread well beyond the scope of CSS, but it's commonly known that any webpage that implements a fair amount of CSS1 will not be supported correctly on NN4. Better yet, if the webpage implements ANYTHING from CSS2, it's very likely that Netscape 4 won't support it. And there's much, MUCH more:

      NN4 doesn't support <DIV>. It supports <LAYER> instead.

      NN4 doesn't like inline styles.

      NN4 doesn't fully support the height attribute (e.g., table cells).

      NN4 doesn't allow onclick events on every object, such as <img> and <div> (or, layer, if we want to be technically correct).

      NN4 uses its own Document Object Model, which results in very poor DOM Level 1 support, and virtually no support for Level 2.

      NN4 supports the onunload event, but it does so quite unconventionally. This results in strange behavior when resizing a window: content unloads and refreshes, which is very undesirable for persistent objects, such as applets.

      I guess that's a good stopping place. The list goes on, but I hope you see my point. In fact, the word "unconventional" suits NN4 quite well.

      Web developers who are serious about dynamic or heavily stylized content will quickly realize that full NN4 support requires either an insane dedication to little hacks and gimmicks or a text-only version of their website. The way to present cross-platform, stylized content today is to use Shockwa^H^H^H^H^H^H^H a plugin.

      The fact that 5th and 6th (and now 7th) generation browsers are 95-99% standards compliant means that bleeding-edge content will target newer browsers, and Netscape 4 will be left to rot. Five years is an insane lifespan for a browser, and if you remember correctly, Netscape 4 was just getting off the ground five years ago. Internet life moves at the speed of normal time ^2, so your five years is really like 25.

      Maybe I live in a parallel universe, but in my reality, NN4 is already dead. Or, at least it has a really bad case of leprosy.
  • by joshua404 ( 590829 ) on Wednesday September 11, 2002 @11:20AM (#4237631)
    In the neverending rush to heap more and more gadgets and whizbang technology into browsers, the people that develop them didn't seem to take much of an interest as to their usefulness. Web developers struggling to stay abreast of existing technologies hardly had time to hone their skills on all the latest, bleeding edge (and often contradictory) gadgetry while being pushed by their managers to get their work done "Now, now, now!" Everyone was in such a rush to cash in that nobody put any thought into it.

    Now that the bubble has burst, fixing "obsolete" sites is not a priority. IT staffs have been cut, resources have been redirected into projects that actually turn a profit, or the "web guys" are gone altogether. Nobody is around or has time to fiddle with the brochureware homepage.

    • Well, it's all part of evolution. Those companies that are not paying attention and are letting their websites gather dust will be overtaken by those with better and more compatible sites.

      Mobile devices, web kiosks, smart agents - these things will all be connecting to the net more and more in the future. Companies with well constructed sites which follow open (not browser/platform specific) standards will likely benefit greatly from all this.

      It won't happen overnight, but it will happen.

  • Gasp! (Score:5, Interesting)

    by Tsali ( 594389 ) on Wednesday September 11, 2002 @11:20AM (#4237634)
    And Jeffrey Zeldman will help us fix the errors of our ways! Anyone check Amazon for the price on this baby?

    Who on earth is running a browser earlier than 4.x? Do you expect stuff to be rendered right if you use an older version of IE/Netscape/Opera? Do advertisers want to sell to people that refuse to use the latest and greatest thing? Don't you have to try real hard to even find an older version of any of these browsers?

    Sounds like a cheap way to sell a book - and a little extra helping of FUD thrown in.

    • Re:Gasp! (Score:5, Insightful)

      by Isofarro ( 193427 ) on Wednesday September 11, 2002 @12:03PM (#4237968) Homepage
      Who on earth is running a browser earlier than 4.x?

      I'm using Konqueror 3.0 which came with Suse 8.0. Googlebot is version 2.1 according to my logs. The point is that it shouldn't matter what browser you are using, and we shouldn't be fudging markup into tag soup in an effort to keep certain browsers happy. Rather, mark up a document cleanly, and use CSS to present the markup -- that way less capable browsers can strip away the CSS and have a default view of the content, which they can mark up or manipulate themselves.

      Do you expect stuff to be rendered right if you use an older version of IE/Netscape/Opera?

      No, I don't care about the rendering, but a page would be much more interesting to my little scripts if the markup described the structure of the content appropriately.

      Don't you have to try real hard to even find an older version of any of these browsers?

      Not too hard at all:

    • Re:Gasp! (Score:2, Informative)

      by deepchasm ( 522082 )
      Of course hardly anyone uses pre-4.x versions of IE/Netscape/Opera. But you are ignoring other victims of kludgy web design - like blind people who rely on browsers with built-in speech synthesis.

      An easy experiment you can do is to try to access a website with lynx; it will simulate what a blind person listening might hear. Straight away you notice that in multi-column, table-based layouts, all those tiny links down the side of the page (next to the article you actually want to read) have to be scrolled through before you get to the article.

      I don't understand the mentality of people who fudge around adding hack after hack for compatibility with 4.x browsers.

      If you write a page using XHTML, a user with any browser that understands HTML will be able to read it. You can write it in the order "title, article, links/ads" - then the blind user's browser will get to the content they came for instantly. With intelligent use of the DIV tag, all this can be positioned using CSS so you can still have the layout you want for people who can see it.

      Best of all, unlike a sea of hacks and workarounds, this is built to standards so it won't need tweaking every few months.

      It's easy to say to a 4.x user "upgrade" - after all, the system requirements for IE haven't changed that much from 4 to 5 to 6. But a blind person can't "get some eyes that work". So don't discriminate against them.
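      The content-first ordering described above might look something like this (hypothetical ids and rules):

      ```html
      <!-- Markup order: article first, navigation last -- a speech
           browser reads the article immediately -->
      <div id="content">
        <h1>Article title</h1>
        <p>The article itself...</p>
      </div>
      <div id="nav">
        <a href="/">Home</a> <a href="/news">News</a>
      </div>

      <style type="text/css">
        /* Visual browsers move the nav back into a left column */
        #nav     { position: absolute; top: 0; left: 0; width: 10em; }
        #content { margin-left: 11em; }
      </style>
      ```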
    • Who on earth is running a browser earlier than 4.x?

      Mozilla 1.0, anyone?
    • Who on earth is running a browser earlier than 4.x [...] Don't you have to try real hard to even find an older version of any of these browsers?

      Nah. I just go over to my sister's house and see what she happens to be using to access the Web...

    • Who on earth is running a browser earlier than 4.x?

      Me. Lynx, anyone? Surely there's no one around here who uses a shell, is there? Also, old Macs - SE, SE30, etc. - can dial up, and there are ethernet adapters for them. They make good, cheap, space-saving machines for simple access. Use Nifty Telnet for shell access, older versions of Fetch, and Netscape 2.0.

      But the important message here is that:

      The web is about content, not format.

      Remember this. The whole point to html is that it's a *markup* language, not a *forced formatting* language. The browser takes the content and displays it in the manner of the user's choosing.
      This seems to have been lost in the corporatization and control of the 'net.

      Remember the good old days? When the web was about content and not about spam and marketing? That's where I live. I don't want to see blinking and flashing and animated ads and popups. If I can't see your content on lynx or with a 4.x or pre-4.x browser, you have lost my eyeballs and any potential to receive my money. No popups on lynx.

      The same goes for html formatted mail (there is a special place in hell reserved for people who send html formatted mail.) If I can't read it in pine, I don't even care what it says. Send me text if you want me to read it. (No web bugs and stuff that way too.)

      In short, the goal is to get your content to other people, stop being such control freaks about how it is displayed. Write to the lowest common denominator, be creative with what is available there and you save much time, aggravation and money. -- And I'll be able to see your content.


      The web is about content, not format.

      Join the Any Browser Campaign [] and make your pages 'content enhanced'.
  • Cause and effect? (Score:2, Insightful)

    by Marqui ( 512962 )
    Could this be because of the huge numbers of layoffs since the dot-bomb explosion? There are fewer people being paid to maintain and monitor the data, hence rendering it obsolete. Also, I am sure there are people who "maintain" a site just to keep it alive without actually changing anything, since in most cases it was not their site originally.
  • by Ratface ( 21117 ) on Wednesday September 11, 2002 @11:20AM (#4237641) Homepage Journal

    (Hmm, I was tempted to leave that as is, but I think at least a little explanation is required. Zeldman disagrees with his own thesis inasmuch as he says that sites like Yahoo! are important because of what they offer, not how they look. So QED a site that relies on its content is not obsolete. Tadaaa!)

      Zeldman ... says that sites like Yahoo! are important because of what they offer, not how they look. So QED a site that relies on its content is not obsolete.

      If Yahoo could offer its content free of the tag-soup additions, it would last quite a bit longer than its current incarnation, purely because the content would be a lot more accessible to more browsers and user-agents than at present. (Take a peek at the HTML source and tell me honestly that the markup matches the structure of the content).

      Inaccessible content is just as bad as no content at all. Machine-readable markup has enormous benefits, and RSS just doesn't match up. Given clean markup, you'd find a lot more useful applications of the Web framework, but at the moment we are stuck in a browser-only, keyword-only environment. The Web offers us so much more than that.

      Zeldman is looking forward. What works today doesn't matter tomorrow. The browsers you test your site on today are already outdated. You think IE will still be king of the hill a few years from now? Did you also believe the same about Netscape Navigator a few years ago?

      The Web evolves, but at the moment tag-soup markup is what's preventing us from reaching the full potential that Tim Berners-Lee saw at the very start.
  • 99.9%??? (Score:5, Insightful)

    by pubjames ( 468013 ) on Wednesday September 11, 2002 @11:20AM (#4237645)

    Talk about sensationalism. The article just points out that many web sites have mark-up errors in them. Big deal. To go from that to saying that 99.9% of sites are obsolete is just dumb.

    This is just a sensationalist way to promote a book. Shame it got onto the front page of Slashdot. It will encourage more people to do the same.
    • Well lets see---

      99.9% of web sites are obsolete, and every computer for sale is obsolete by the time it hits the store.

      What's the difference?

      We design our web pages not to be constantly cutting-edge, but to be compatible and useful. Also, as the parent post points out, there is a difference between non-compliance and obsolescence.
        99.9% of web sites are obsolete, and every computer for sale is obsolete by the time it hits the store.

        What's the difference?

        The difference, as the article explains well, is that if you design your website with standards in mind, it can be forwards compatible with newer browsers. If you do so, it will be a very long time before it is "obsolete," as Zeldman uses the term, if ever.

        Everyone I have ever worked with has generated invalid HTML that has made even current browsers crash or behave erratically. When I realized that I was also making these mistakes, I finally learned my lesson and started using the W3C validator [] to make sure my web pages are valid HTML. Since then, I have not had any problem with my pages not working in any browser. This is exactly what Zeldman is asking web developers to do.
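For reference, a minimal page along these lines (contents invented) should go through the validator cleanly: a DOCTYPE so the validator knows which grammar to check against, plus correctly nested, fully closed markup with quoted attribute values:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head>
    <title>A minimal valid page</title>
  </head>
  <body>
    <h1>Hello</h1>
    <p>Every element closed, every attribute value quoted.</p>
  </body>
</html>
```

Leave off the DOCTYPE and the validator can't even tell which version of (X)HTML it's supposed to be checking.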

      • Re:99.9%??? (Score:3, Informative)

        by Isofarro ( 193427 )
        We design our web pages not to be constantly cutting-edge, but to be compatible and useful.

        Compatible with what? Testing in available browsers today only gives you compatibility for yesterday.

        Compatibility with standards such as the XHTML Recommendation and the CSS Level 1 & 2 Recommendations offers you compatibility tomorrow too.

        Surely anything that helps your website to be accessible tomorrow is to your advantage?
    • No. It's about informing the public about the dangers of having proprietary code in their websites. Sure, the headline is sensational, but that approach seemed to work with Jakob Nielsen's Flash: 99% Bad [], which practically woke up the whole Flash community to making more usable Flash objects in websites. We needed a similar wake-up call in regards to websites.
      What do developers mean by "backward compatibility?" They mean using non-standard, proprietary (or deprecated) markup and code to ensure that every visitor has the same experience, whether they're sporting Netscape Navigator 1.0 or IE6. Held up as a Holy Grail of professional development practice, "backward compatibility" sounds good in theory. But the cost is too high and the practice has always been based on a lie.
      Proprietary code and those little hacks are bad. Code to standards.
      • Jakob Nielsen's Flash: 99% Bad, which practically woke up the whole Flash community...

        Speaking as, I believe, a member of the Flash community, I take that as an insult. People who are serious professional Flash developers didn't need Nielsen to tell them that many people used (and use) Flash in bad ways.

        Proprietary code and those little hacks are bad. Code to standards.

        Do you think web site developers choose to use "those little hacks?" The fact of the matter is that clients say "hey, I want that image to be down and to the left a little bit" so you find yourself putting a little invisible GIF image in to get the position right. You would love to do it "to standards" but if you use layers then it doesn't work for a good proportion of your visitors. Alternatively of course you could do all your work twice, once with "little hacks" for the older browsers and once again "to standards", but most of us like to take a more pragmatic approach.
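For what it's worth, the spacer-GIF nudge usually has a one-line CSS equivalent (the selector and measurements here are invented for illustration); the catch is exactly the one the parent describes, that older browsers mishandle it:

```css
/* instead of padding the layout with
   <img src="spacer.gif" width="20" height="1" alt=""> */
img.nudged {
  margin-left: 20px;  /* "down and to the left a little bit" */
  margin-top: 10px;
}
```

One invisible GIF per nudge adds an HTTP request each time; the stylesheet rule is fetched once and cached.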
    • The article just points out that many web sites have mark-up errors in them. Big deal. To go from that to saying that 99.9% of sites are obsolete is just dumb.

      What percentage of websites pass cleanly through an html validator such as W3 []? Surely those sites that do not validate are because there are errors in the HTML markup?

      Zeldman probably believes that only about 0.1% of sites validate correctly, which would make his figure of 99.9% obsolete just about mathematically right.
  • Uh-huh. (Score:4, Insightful)

    by American AC in Paris ( 230456 ) on Wednesday September 11, 2002 @11:22AM (#4237663) Homepage
    Though their owners and managers may not know it yet, 99.9% of all websites are obsolete

    Methinks somebody is confusing "are obsolete" with "will eventually be obsolete, so long as web browsers suddenly become fault-intolerant and the site owners leave things exactly how they are and never ever maintain them, ever".

    (Not to say that I don't agree with what he's saying, but jeez, what a wanker! "I declare that everything, everywhere sucks ass! Huzzah!")

      Methinks somebody is confusing "are obsolete" with "will eventually be obsolete, so long as web browsers suddenly become fault-intolerant and the site owners leave things exactly how they are and never ever maintain them, ever".

      Two points:

      1.) Current "practice" forces all HTML parsers to perform error-correction before content can be used for other purposes. Content aggregation and syndication are there to make our lives easier and make using the web more efficient. Why do you think there's a lack of tools that do interesting things with Web content? It's the overhead of handling non-standard markup that causes problems.

      2.) If site owners aren't willing to deliver standards-compliant (and semantically useful) markup now, when?
  • Obsolete? (Score:3, Insightful)

    by forevermore ( 582201 ) on Wednesday September 11, 2002 @11:22AM (#4237665) Homepage
    I wouldn't call that "obsolete" so much as "noncompliant"... Obsolete would mean that new browsers can't run them, not that old ones can't. The problem isn't that the technology in the websites has grown old, but that lazy users (those of the WYSIWYG persuasion, among others) and Microsoft devotees have chosen their own set of standards (or merely force out browsers that don't comply with their standards), rather than the ones set out by the people who are supposed to control the specs for html, javascript, etc.
      Obsolete would mean that new browsers can't run them, not that old ones can't.
      This is exactly how Zeldman uses the term "obsolete." He's asking web designers to make forwards-compatible websites by ensuring that they adhere to web standards so they won't stop working in newer browsers and thus become obsolete.
  • Books vs. The Web (Score:4, Interesting)

    by laetus ( 45131 ) on Wednesday September 11, 2002 @11:22AM (#4237666)
    Obsolescence and wildly diverging ways of presenting information is one of the basic faults I find with the web.

    You know, if I pick up a book printed in 1920, its interface is going to be familiar to me. Table of Contents, Index, Chapters, Body Text, etc.

    And now? I pick up a book printed today and find the same, useful interface.

    Contrast that with the web, where I can find simple clean interfaces like Google or Yahoo alongside ghastly Flash-based interfaces that do everything they can to distract me from the information I'm seeking. Plus, I'm being told that the device (program) I use to access these sites is obsolete less than five years after being released?

    I'm all for freedom of speech (and web presentation), but the web's got a long ways to go before it can become the useful instrument it can be.
      You know, if I pick up a book printed in 1920, its interface is going to be familiar to me. Table of Contents, Index, Chapters, Body Text, etc.

      Well, yeah. By 1920, there had been thousands of years during which the presentation of the printed word was gradually improved and codified.

      We're still in the early stages of presenting electronic content, the brainstorming stage, if you will. There's still plenty of room for innovation. Bear with it.

      And I'm surprised at YOUR surprise that 5-year-old technology is considered obsolete in Internet time. Improvements are a GOOD thing.

    • And now? I pick up a book printed today and find the same, useful interface.

      Yes, but when you pick up that book from the '20s, did they split their sentences with unnecessary commas? And check out Chaucer - is his work obsolete? Would you really want to read it if it were "ported" to 21st-century English?

      After a cursory survey, I'd say that at least 99.9% of the writing on the web is not standards compliant.

      The rest are l337 5kr1pt k1dd135.
  • by epeus ( 84683 ) on Wednesday September 11, 2002 @11:22AM (#4237668) Homepage Journal
    When are you going to get rid of all those icky nested tables that slash makes?
  • And here I am thinking that 99% of websites have out-of-date misinformation - that most corporate websites don't have any useful content, are nothing but marketing garbage, and are "obsolete" as in old and worthless. ...and you were talking about layout. Wait, we're both right.

    My personal favorite worthless website - ZDNet, home of TechTV and other "high tech" offerings, with its absolute table widths; ever try to read that crap at a high resolution? Another one of my favs is the old 1x1 gif and other "layout tricks" still used by webmasters today. Get a clue, people - an XHTML book is $15.

    • About 10% of our sites' audience still uses Netscape 4.x, which doesn't support some elements of XHTML, nor any CSS positioning abilities.

      We'd love to upgrade our standards to something more forward-thinking, but it's extremely bad business practice to piss off a tenth of your user base.
  • The web could do a lot worse than become a bit more strongly-typed, and a bit more like a programming language than a scripting language.

    True, most folks don't need more than the basic mark-up for their websites, especially where personal websites are concerned. But commercial sites could stand for a much better design than they have. . . the author here makes a lot of good points when he calls out the faults of ZDNet and Yahoo for their HTML. The code is crap - thank God HTML doesn't have GOTO statements, or these sites would probably be chock full of those, too.

    Let's do what we did with the blink tag. Don't just deprecate it--ignore it. Tell the browser, "Don't listen to the <font> tag, just skip over it."

    Not too long ago, I re-wrote my own personal webpages using Cascading Style Sheets. It's tricky, since Netscape/Mozilla oftentimes has different ideas of how to interpret CSS than Internet Explorer. But it's easy enough to accommodate both, without too much effort. And I'm a lot happier now that my HTML code looks less like last night's dinner and more like something that someone else could read and understand.
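The sort of cleanup I mean looks roughly like this (a sketch with invented content and class names, not my actual pages):

```html
<!-- before: presentation baked into the markup, repeated on every page -->
<font face="Verdana" size="2" color="#333333">Welcome to my pages</font>

<!-- after: structure in the markup, presentation in one stylesheet rule -->
<style type="text/css">
  p.intro { font-family: Verdana, sans-serif; font-size: small; color: #333; }
</style>
<p class="intro">Welcome to my pages</p>
```

Change the rule once and every page follows; with font tags you're doing a search-and-replace across the whole site.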

    • Uh, the "web" isn't a language at all, let alone a scripting language. It's more of a concept.

      Or did you mean HTML? It isn't a scripting language either, it's a markup language. It doesn't have any processing instructions, it just describes data. Or did you mean DHTML...?

      And the differences between a "programming language" and a "scripting language" have always been murky. What's the difference? That one can be compiled and the other is interpreted? Is one strongly typed and the other not?

      I'm probably not saying anything you don't already know, but it's hard to know what you're getting at.

  • by imperator_mundi ( 527413 ) on Wednesday September 11, 2002 @11:29AM (#4237729)
    It seems so, although the w3 [] offers a validator [] for free.

    Maybe learning html in a weekend [] or even faster [] doesn't help keep the quality of code at a high level ; )
  • This is so stupid.

    Do we start broadcasting TV signals in black and white again because a similar portion of viewers use b&w TVs?

    Whoever uses an older browser usually isn't a power user to start with and isn't looking for the latest fluff anyway.
    • The difference is that NTSC (the US color television standard) was designed to show up well on old black-and-white TVs. All of the picture is there; you just don't see the fancy color.

      I think the complaint with the web is that things don't gracefully degrade in downlevel browsers, they just die.

      The original intent of the web and html was to distribute content with tags that describe the "purpose" of that content and leave the rendering up to the browser. This meant that I could write a page and my message would get across to anyone even though it might look different to every person.

      Then enter the marketing folk and the desire that a webpage look the same to everyone. That sucked.

      CSS allows better control of the look but still works on the premise that the html (or xhtml) describes the purpose of the content and CSS is around to give hints on how the page should look. It still gives the end browser ultimate control of the rendering and the page could look different to different people.

      If people designed their webpages realizing that what's important is the purpose of the information and not its look, we wouldn't have so many of these problems. The web was designed for information, not for art.

      Whoever uses an older browser usually isn't a power user to start with and isn't looking for the latest fluff anyway.

      Who ever said a Compaq iPaq running Pocket Internet Explorer, or a Sharp Zaurus running Opera at a max screen size of 320x200, is "an older browser"?

      When HTML and CSS are used correctly, optimally, and compliantly, the resulting websites are far more accessible in more user-agents than the mere crop of bloated OS-based browsers.
  • by r_j_prahad ( 309298 ) <r_j_prahad AT hotmail DOT com> on Wednesday September 11, 2002 @11:30AM (#4237740)
    Our website is not only obsolete (it was designed that way from the ground up), but it's ugly and almost entirely non-functional too! Mainly we use it to harbor and distribute viruses inside the company. It's been very effective.

    Now that he's completely met his goal of total obsolescence, our webmaster spends every day looking for new ways to make our website even less useful, uglier, and more of a pain-in-the-ass to use. He's been very effective.
  • I can assure everybody that well over 95% of sites out there are in fact obsolete.

    Let's take a closer look.

    The overwhelming majority of websites out there are not HTML 4.0/XHTML 1.0 compliant. Even sites that belong to members of the w3c bend the rules which they helped write. Sounds asinine? You bet.

    Standards do not mean s**t anymore. Everybody is aiming for IE 5.x/6 compatibility nowadays. Cross-platform understanding is dead, now that Netscape has lost the overall war. The vast majority of web designers do not even double-check their sites in Opera/Mozilla nowadays, thinking they might have to do some extra compatibility coding/clean-up.

    Most sites are NOT cross-device/platform. You cannot view them on PDAs or cellphones. Notice the word _MOST_.

    There are millions of other reasons, but I have to run to a meeting. I'll expand on this later today in more detail.
  • Zeldman (Score:4, Insightful)

    by earache ( 110979 ) on Wednesday September 11, 2002 @11:31AM (#4237756) Homepage
    I've always considered Zeldman to be one of those self-proclaimed know-it-alls who has had little real industry experience with high-volume, high-technology websites. Most of his portfolio is brochure-ware that looks like it was done by a team of one. So I've always considered his belly-aching a little simplistic and, frankly, unrealistic in current web development scenarios.

    It's easy to lament the fact that these sites aren't standard, but there are clearly reasons why most of these sites don't fit his vision of standards compliance.

    For one, most sites don't have the budget to develop to standards. It's much easier to code to specifics and use non-standard workarounds where possible than to boil everything down to the least common denominator (which standards are supported by whom). When I say easier, I mean that years of experience have instilled intimate knowledge in the seasoned web developer that almost comes as instinct now.

    Secondly, all of these "standards" are interpreted differently by the different browsers, so you can't ensure a consistent look and feel without kludges.

    Third, most of the foundations for these sites were laid out before coding to a standard was even possible, and when the mindset was not focused on any sort of standards compliance.

    Finally, I've always thought that they made writing to standards compliance sound easier than it actually is, because even though it's called a standard, it rarely exhibits standard and consistent behavior across the various platforms. Most art directors and graphic designers - specifically those that migrated from print or traditional design - tend to be extremely unyielding in the way their designs are interpreted on the web, leaving developers with few options that are fully supported by these so-called standards.

    Personally, I think Zeldman needs to spend some time in the trenches working on a large site with a large development team under real deadlines for real clients. Things are rarely ideal in these circumstances.

    What is it they say about armchair coaches?
    • by Arker ( 91948 ) on Wednesday September 11, 2002 @12:21PM (#4238144) Homepage

      ...who don't understand what HTML is.

      Secondly, all of these "standards" are interpreted differently by the different browsers, so you can't ensure a consistent look and feel without kludges.

      You're not supposed to be able to. That's not what HTML does.

      HTML is a content language. The whole beauty of it is that the final presentation is NOT THE DESIGNER'S RESPONSIBILITY. No web site will look the same on all platforms - that's the point.

      Finally, I've always thought that they made writing to standards compliance sound easier than it actually is, because even though it's called a standard, it rarely exhibits standard and consistent behavior across the various platforms. Most art directors and graphic designers - specifically those that migrated from print or traditional design - tend to be extremely unyielding in the way their designs are interpreted on the web, leaving developers with few options that are fully supported by these so-called standards.

      The people you are talking about are not 'web designers' - cannot be, because they don't have a clue what the web is. If you cannot accept the fact that your content can be presented different ways (including to blind people) as appropriate to each individual client, you have no business on the web. Make .pdf files or something.

      I know someone will interpret this as flamebait, and someone else will probably tell me to 'get with the real world' or the like, but in fact I am just telling you the truth, and I'm quite grounded in the real world. There has been no shortage of people explaining these simple facts about what HTML and the Web are, in simple terms and moderate tones, from the very beginning - and sadly there has been an overabundance of self-styled 'designers' who refuse to understand the medium and insist on trying to make it what they want it to be, instead of what it is. REAL designers work with their medium; they take the time to learn how it works and why, and they produce designs that are appropriate to it, rather than insisting that every medium work the way their favourite one does and breaking it every time they touch it. And that is something that every decent art teacher in the world tries to teach his students. Sadly, the students, particularly the ones that go into web design, don't often listen. I'm not trying to pick on you personally, but your clueless post makes an excellent example, I must admit.

      'Designers' who couldn't be bothered to understand the medium of the web before dumping their work on it have done great damage to the web, and that's something I happen to care about quite deeply. Your ad hominem attacks and dismissals of Zeldman aside, he makes a point that is absolutely true, and will have real economic consequences. All that patched-up proprietary spaghetti code of malformed HTML-abuse IS coming down. While standards-compliant pages from the very earliest days of the web still display perfectly in the latest nightly builds of Mozilla, the pages written by people with the philosophy your post shows ARE becoming obsolete, very quickly. In a way, the 'designers' who can't be bothered to learn their medium have won - the new standards will allow them to do what they always wanted to do, and what HTML was never designed to do: to specify layout and 'look and feel' issues. But it will require them to do it in ways that are consistent with the underlying philosophy of HTML and the web - something they've never shown any interest in doing before. I expect to hear a lot of whining from that corner in the coming years, but don't look to me for sympathy.

  • by Metropolitan ( 107536 ) on Wednesday September 11, 2002 @11:33AM (#4237767) Journal
    How many variations of 'standards' should one have to comply with to make a usable, functional, Web-based information node? That I have to test against huge numbers of browser/platform/OS variations is a massive waste of time and energy, when I should instead be able to focus on making the information clear and the functionality flawless.

    I'm not saying that we as a collective need to move back to HTML 1.0, but there has got to be a solution to increasing complexity in Web information spaces. Companies that intentionally cripple some browser/OS combinations are doing the greater community a vast disservice.

    The majority of Web pages are not necessarily broken, but reflect limits on the time and energy of those who create them to keep up with 'standards' that seem to shift every other week.
    It's harder to play one note and have it be perfect than it is to play a thousand and have them be close. Most people choose the latter, and hope that one note hits home.
    • I agree with you, and hope someone mod's this post up.

      All the folks out there who are slamming web developers/authors really need to step back a second. [I'm amazed that my first post in this topic already has 3 "You should code better" responses.]

      I have been working with 'web' pages professionally since late 97.

      And man has stuff changed.

      Anyone who works in the real world (not academia) understands that not only is there the pressure of a 'real world' environment - but the need to show value for a company.

      Understaffed departments, unreasonable demands, HUGE goals. Those are the factors that REALLY limit the 'good code' out there. It's very hard to make sure you're 100% compliant [no matter how hard you tell the board/your boss/your dept/the finance people that you SHOULD be] when at the end of the day you have more 'new' projects in your inbox than ones you have finished.

      [and before folks cry - TELL THEM! TELL THEM! We are in an economy now .. where people are HORRIBLY disillusioned with the internet. I work for a fortune 500 which produces power tools - and the idea of actually SCRAPPING our web-based projects has been kicked around previously. How's that for a scary morning meeting to walk in on :(]

      but I digress .. my real point is .. standards change, and 'mega-powers' in the browser world ignore them anyways.

      HTML that was 100% w3c 4 years ago .. is maybe 80% now. [good and bad .. means that html is more versatile .. but means that you have to recode that stuff.]

      XML .. geez .. I have been using it at work for about 3 years now .. and for a 'universally standard' language .. it's sure been through the damn wringer.

      I can write some xml/xsl for IIS .. and put it on a unix box and watch it puke. [and vice versa]. The standards on this 'universally adaptable' language have changed so many times in the past few years my head is still spinning.
      [clarification .. i dont mean the 'Official Top Shelf written in stone' standards .. I mean the ones that are in the real world .. MS for example. It's not a surprise they tweak things .. but when a major player in the software dept {yeah yeah} produces something sub-standard .. how long before it BECOMES part of the standards? even if it's unwritten?]

      So yeah .. I think your insights are dead on here.
  • by mmoncur ( 229199 ) on Wednesday September 11, 2002 @11:38AM (#4237796) Homepage
    Here's a condensed version of the article for those who don't have time to slog through it:

    1. Standards are good.
    2. Bad code that happens to work in current browsers is bad.
    3. Buy my book.
  • Yeah, whatever. 83.7% of all roads are in need of repair. 99.9% of all sewers contain rats and cockroaches. Things in society are messy and nearly always far from perfect. Someone trying to make a buck doesn't make it any more interesting or newsworthy.

  • While I have to agree on the theoretical benefits of web standards, the real world makes the whole thing fall apart.

    The main problems that I see are that

    1. Web standards bodies move slowly, and specifications are obsolete before they are approved. Take SVG. (please) Flash is a superior format with a large installed base, quality authoring tools, platform scalability, and an open but expensive architecture. SVG took five years to become a reality, and is still VERY immature.

    2. It's about the user, stupid! For the most part, users sit at a computer desktop, with a commercial browser (IE), and use the internet. It needs to look right for THEM. The .001% of users on cell phones are doing specific activities with mostly packaged content. These users are novelty users. Portable devices have no standards as to how they display, and without this, nobody can expect a useful cross-platform "standard" that works everywhere. It's a microsoft world, whiner. There is no doubt that IE is the only browser that matters. If someone else wants to make a competitive browser, it needs to be IE-compliant, not W3C-compliant. Microsoft took it upon themselves to create a language that works no matter how it's written. Who cares about sloppy coding? Bandwidth is hardly an issue, and if a browser renders correctly, it should LOOK right.

    In conclusion, the web standards project and w3c have failed due to their managerial impotence, and can be safely ignored.

  • by wandernotlost ( 444769 ) <slashdot AT trailmagic DOT com> on Wednesday September 11, 2002 @11:49AM (#4237845)
    Zeldman asserts that the problem plaguing web developers is a desire for backward compatibility. In fact, that desire seems unfortunately missing in most websites. The real problem making websites suck is the desire to view the web as a graphic design medium.

    Designers want to control every pixel of a page's layout, completely ignoring what the web was designed for. If everyone used logical markup to describe their data, later adding CSS to attempt to influence the layout, the web would be a much friendlier place. It may not look exactly the same on every browser (which, come to think of it, may be Zeldman's point), but with proper testing, it should look similar on popular browsers, and at least be LEGIBLE on others.

    People need to be convinced that the web is not a graphic design medium. That's what PDF files are for. People don't try to build their sites solely from PDF files, because that just wouldn't fly. Instead they try to use the web to achieve the same goal, completely oblivious to the fact that it's a really poor tool for that purpose. Rather than embracing a new paradigm, they try to contort it to look like what they already know. To me, that's just incompetence.
  • Hello,

    the world wide web is about whatever you make it. I could make my own meta language that uses http servers. coming soon- rEml - randomErr markup language. it won't meet your standards, but it meets mine.

    forcing everyone to do things your way is so... microsoft.
  • I have to laugh at the assertion "For a beginner, XHTML is easier to learn than HTML precisely because its rules are consistent"--what wishful thinking! XHTML is harder to learn because there are so many more rules. Newbies, even ones who manage to make some interesting content [] think HTML already has too many rules...

    Can someone tell me, is
    <b> go and <a href="somelink">click me</a> now</b>
    illegal in XHTML? Does it need to be
    <b> go and </b><a href="somelink"><b>click me</b></a><b>now</b>

    because <a href> tags aren't part of the valid contents of the <b> tag?
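    For what it's worth, the first form is legal: in XHTML 1.0 the content model of <b> includes the full inline set, so <a> may nest inside it (the one restriction in that family is that <a> may not contain another <a>). Well-formedness, which is the part XHTML newly enforces, can be sanity-checked with a small sketch using Python's standard library — note this checks only XML well-formedness, not validity against the XHTML DTD:

    ```python
    # Check that the snippet in question parses as well-formed XML.
    # Well-formedness is necessary but not sufficient for XHTML
    # validity; the DTD governs which elements may nest inside which.
    import xml.etree.ElementTree as ET

    nested = '<b> go and <a href="somelink">click me</a> now</b>'

    root = ET.fromstring(nested)       # parses without error
    print(root.tag)                    # -> b
    print(root.find("a").get("href"))  # -> somelink
    ```

    The second, split-up form would also be well-formed (as three sibling elements), but nothing in the XHTML 1.0 DTD forces that contortion.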
    I work for a mid sized company but I know the web site is very out of date and has incredibly poor content. In my mind I can pinpoint this to one thing: the inability of the people who write content to get it onto the site.

    I know for a fact there are more than enough good stories and photographs in the organization that could be published, but most of the technicians who would write it (or at least the first draft) don't have the time to learn a web design program. The solution I believe is a good content management system. I've been looking into Typo3 [] and a couple of other content management systems. I believe once we make it easy to update, then content will be less likely to be obsolete.

    Content Management Systems are right now the best place I can start introducing open source software at my work. We've looked at Microsoft's Content Management Server, which is highly overpriced for our needs, and it's hard to argue with the documentation and self-help community that open source software provides. I know there are other content management systems out there, but the point is that for content to stay current, publishing capabilities must be pushed to the people who will author it.

  • Pure Bunk (Score:2, Insightful)

    by Greyscale ( 597578 )
    My servers' web stats show 96.4% of all browsers visiting the servers are Internet Explorer and/or Netscape. The only thing surprising in this article--other than the clearly fudged percentage cited--is that the author advocates, with a straight face, that because 3-4% of a site's visitors use incompatible browsers this translates into a 99.9% obsolescence rate.

    Still, it's always amusing to see someone suit up, gird their horse, and charge at the windmills while proclaiming the revolution.
  • by Soft ( 266615 ) on Wednesday September 11, 2002 @12:09PM (#4238029)
    Let's do it the standards way.

    I want to do a nice little page, and do it in XHTML because it's The Way Of The Future (or I want to display a little math, which only XHTML+MathML allows without resorting to ugly inline images). The tag soup itself isn't a problem, I just close all my tags and make sure the doctype declaration says XHTML instead of HTML, as prescribed by the standard [].

    However, is this enough? The document is now XML, and therefore should have a <?xml declaration, if only to specify its encoding. Except that said XHTML standard says it is optional if the encoding is UTF-8 or UTF-16, or has been otherwise determined (think HTTP headers), which contradicts the XML standard, sec. 4.3.3 [], in its last two paragraphs: one says that no declaration and no other information means mandatory UTF-8, and the next says "It is also a fatal error if an XML entity contains no encoding declaration and its content is not legal UTF-8 or UTF-16."

    So I need a declaration no matter what. But according to this page about the different layout modes in current browsers [], MSIE will react to an XML declaration by switching to "quirks" mode, which is precisely what I want to avoid by sticking to the standards... And I wouldn't want to lock out 85% of WWW users, would I?

    But wait, this is only if the page was served with a text/html content-type. The right answer would then be to use the standard content-type for XML/XHTML... which should be [] application/xhtml+xml! Yes, "application"! Now if I use that content-type, all browsers I have at my disposal except Mozilla (MSIE5, Konqueror, Links, Lynx...) either consider the page an application and offer to save it to disk, or display it as-is! Same with the second-best, text/xml.

    Okay, am I the only one experiencing this? Is there any reason not to stick with good ol' HTML4 and avoid writing (yet another kind of) horrible bugware?
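    The workaround most standards-minded sites eventually settled on was server-side content negotiation: send application/xhtml+xml only to user agents that explicitly advertise it in their Accept header, and fall back to text/html (omitting the <?xml declaration there, so MSIE stays out of quirks mode). A minimal sketch — the substring test is a deliberate simplification, since a real implementation would parse media ranges and q-values:

    ```python
    # Naive content negotiation for serving XHTML (illustrative only).
    # Browsers like Mozilla listed application/xhtml+xml in Accept;
    # MSIE sent */*, which this deliberately naive check treats as
    # "no explicit XHTML support", so it falls back to text/html.
    def choose_content_type(accept_header: str) -> str:
        if "application/xhtml+xml" in accept_header:
            return "application/xhtml+xml"
        return "text/html"

    print(choose_content_type(
        "text/xml,application/xml,application/xhtml+xml,text/html;q=0.9"))
    # -> application/xhtml+xml
    print(choose_content_type("*/*"))
    # -> text/html
    ```

    The fallback path is exactly the tag-soup compromise the parent is complaining about, but it at least lets conforming browsers get the real thing.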

  • by stratjakt ( 596332 ) on Wednesday September 11, 2002 @12:14PM (#4238058) Journal
    it's lost its meaning. It's been degraded by marketing drones and morons to mean 'anything thats not the cutting edge'.

    Here's what it means:

    Hell, I still use lynx when all I want to do is snag a tarball. My linux boxes don't even have a GUI. If the content there has meaning, who cares if the web page uses the latest 'nifty tricks'. Is an ASCII text file obsolete? No, not if the information it contains is valid. Is EBSDIC (sic) obsolete? Probably. I can't even remember the acronym :P

    I'm constantly hearing how my P3 600 is obsolete. There's nothing that doesn't run on it. Hell, I have a router box running a P90.

    Is my original NES obsolete? Or my Atari 2600, for that matter? Not as long as I enjoy playing them.

    Is a 2001 model vehicle obsolete because the 2002 line is introduced? It does have a bigger cupholder, after all.

    If people want to push their agendas, sell whatever they're selling, go for it. Just quit trying to redefine perfectly cromulent words in the English language to do so. Make up new ones, like cromulent. I propose 'obsolastweek' to mean everything that wasn't shrinkwrapped within the last 24 hours.

    This article should read "99.9% of websites are obsolastweek because they haven't been redesigned because some propellerhead made a new widget"

    Propellerheads (I can use that word because I am one) don't realise the cost of doing business. The world doesn't start over at 0 just because they invented something 'slightly better'.
  • .... But outside these fault-tolerant environments, the symptoms of disease and decay have already started to appear.

    Tell me about it. I just checked my webpage, and all my <br> tags had decayed into <blink> tags....
  • "99.9% of all websites are obsolete."


    "0.01% of all websites render nicely in Lynx."

    Seems to me there's some confusion between "obsolete" and "usable." Those websites that will be obsolete with fubar 6.x are the same ones that cram a lot of visual shit down your throat, making you work very hard to extract the useful information out of the noise.

    Fight designed obsolescence, and write text-based web content with a minimum of static content. Otherwise, don't bitch when fubar 6.x fubars your site.

  • by pjrc ( 134994 ) <> on Wednesday September 11, 2002 @12:39PM (#4238308) Homepage Journal
    From the article:

    all of us temporarily lost something more important: the chance to create a usable, accessible Web built on common industry standards. We lost it when designers and developers, scrambling to keep up with production demands during the short-lived Internet boom, learned non-standard, browser-specific ways of creating sites, thus bringing us to our current pass whose name is obsolescence.

    Yeah, that's right. It was the fault of all those developers who didn't have the foresight to see the standards that would eventually be approved years later. What were they thinking?

    It didn't have anything to do with the standards process being slow, or diverging from the needs/demands of the market (HTML 3.0). And even after the standards were finally approved with buy-in from the browser makers, no blame rests with either Microsoft or Netscape for the serious bugs in their 4.x browsers, which often crashed outright on many CSS features.

    Yep, those developers were at fault. They learned bad techniques, when those techniques were the only way to accomplish what their customers wanted. They continued to use them when the 4.x browsers would crash on standard-based markup. Even after the really serious problems were cleared up in IE5.x, they still used their old tricks. And now, damn them, that 6.x browsers have been available for only a year or so, they haven't redesigned all the world's websites to be fully standards compliant (and broken on 4.x and some 5.x browsers which are still in heavy use).

    Yep, if anyone's to blame, it's those developers.

Bell Labs Unix -- Reach out and grep someone.