Are 99.9% of Websites Obsolete?
citizenkeller writes "Zeldman is at it again: 'Though their owners and managers may not know it yet, 99.9% of all websites are obsolete. These sites may look and work all right in mainstream, desktop browsers whose names end in the numbers 4 or 5. But outside these fault-tolerant environments, the symptoms of disease and decay have already started to appear.'"
Figures.... (Score:2, Interesting)
Business Need and Long Term Costs (Score:3, Insightful)
Website designers have learned this lesson well. They strive to serve their business clients by allowing them to interact with the largest customer base possible, using clunky, non-standard, bandwidth-consuming techniques to get outdated browsers to render their stores in the desired fashion.
You really can't blame website designers for this, nor can you blame site owners. The designers are working to meet their clients' requirement, which is to make money by being accessible to the largest percentage of the available customer base.
The fault, dear Brutus, is in ourselves. Website visitors are at fault for using browsers which promote this non-standard architecture. Certainly no one will use a browser so strictly standards-compliant that any non-standard website would not be visible, because that would diminish the user's internet experience; but this is what's required. We need to force site owners to become standards compliant, which will in turn improve efficiency throughout the net.
If only bandwidth were more expensive, this problem would already have been fixed, as the bandwidth costs of inefficient, non-standard site design would be far more visible.
It really is a Faustian bargain. Reduce revenue by modernizing your website, thereby making it inaccessible to older browsers and shrinking your potential customer base, save money on bandwidth, then wait for web users to upgrade their browsers and build your customer base back up; or cater to every antiquated browser in existence to maximize your potential customer base, and accept the increased bandwidth costs.
In the long term, with a little short term pain, this problem will be resolved, but in the short term, there really is no good answer.
--CTH
Re:Um, no? (Score:4, Funny)
Particularly Microsoft. I applaud their attempts to encourage individuality by setting their own standards. This proves that Bill Gates loves us all (I think).
The Internet didn't become what it is today through standardization - thank God that pesky TCP/IP plan never took off.
Re:Business Need and Long Term Costs (Score:3, Interesting)
It's not the backwards compatibility that concerns me, it's the _sideways_ compatibility that's more important to me. The authored HTML tends to work in a range of Netscape browsers, a range of Internet Explorer browsers, and sometimes in a range of Opera browsers. Anything other than that is random.
A standards-adhering HTML document could be used in all the browsers above, plus all the other user agents out there that support the standard followed. So text-to-speech browsers, indexers, spiders, content aggregators -- all the silent user agents suddenly have access to structured content.
These are the user agents that are overlooked by the typical public website. People don't tend to notice that structured markup scores a lot better in Google than font-flavoured tag soup, precisely because h1 defines a first-level header, while font defines some weird presentational style but nothing semantic that a search engine can use.
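To make that concrete, a minimal sketch (the headline text is invented):
<!-- font-flavoured tag soup: purely presentational, nothing for a search engine or screen reader to latch onto -->
<font size="6" color="#000080"><b>Widgets Now Half Price</b></font>
<!-- structured markup: the same headline, described as what it is -->
<h1>Widgets Now Half Price</h1>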
I don't believe browsers will be the user-agent of choice in the coming years - we'll automate the labour-intensive process of trawling through websites looking for information and delegate it to some sort of intelligent agents that do the work while we do something more enjoyable.
RSS Aggregators like AmphetaDesk [disobey.com] show a very basic inkling of what can be possible with structure and the value of content out there on the Internet.
But we need structured markup to add semantic meaning to the content, and then we can leverage that content into something truly useful. (Yes, I'm a dreamer longing for something practical.)
Blinkers (Score:2, Troll)
Re:Blinkers (Score:2)
Re:Blinkers (Score:2)
Never, never call HTML markup "coding." It's simply a markup language.
YEAH I agree (Score:3, Interesting)
It worked in all the current browsers a year ago.
but with IE 6 and the new Netscape coming out - you would *THINK* there would be backwards compatibility.
However, I get e-mails all the time about things that are now 'suddenly' broken.
And after verifying what browser/etc. the user encountered this error with - amazingly enough
*go figure*
Re:YEAH I agree (Score:2)
correction .. company website (Score:5, Interesting)
{and technically
however
IE 5.5 will support nested tables up to 7 deep. Netscape 6 will only support up to 4 deep.
Netscape 4.7 does not require quotes around attribute values like width or height.
Netscape 6.0 can do unusual things if they are not there.
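To illustrate the quoting issue (the dimensions here are made up):
<!-- unquoted attribute values: tolerated by Netscape 4.7, risky elsewhere -->
<td width=120 height=40>menu</td>
<!-- quoted attribute values: renders the same in the browsers that accept both, and required by XHTML -->
<td width="120" height="40">menu</td>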
The problem (as stated in the article) is that, because of the past 'browser wars' and the fight for dominance, each browser tolerated its own flavour of sloppy markup.
Now that everyone is trying (or at least saying they are trying) to get on the W3C bandwagon, these little 'faults' are starting to cause errors.
And since the vast majority of web publishers and early adopters out there have not received *formal* training in HTML [I, for example, never have],
5 years of bad habits become second nature.
sorry for the confusion.
Re:correction .. company website (Score:2)
Granted, sometimes it's unavoidable, since backward compatibility can't be maintained. In this case, the problem is with standard HTML. However, when the HTML is standard, it's a bug in the browser, which needs to be addressed.
Just because void main() {} can compile doesn't mean it's right.
Good Logic (Score:3, Interesting)
However, when the HTML is standard, it's a bug in the browser, which needs to be addressed.
Your logic is flawless, but notice where you're left now.
The browser is branded buggy and non-compliant.
Say the browser is IE 4 or Netscape 4.
Great - the browser creators come out with a new version of the browser that fixes those bugs.
IE 6 and Netscape 6 are in greater compliance with standardized HTML 4.01, CSS, DOM, etc.
Now you come to the end of the road:
Re:correction .. company website (Score:2)
You know, adding a DTD, defining character encoding and validating your HTML [w3.org] would probably help quite a bit.
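For anyone who hasn't done it, it's only a few lines at the top of the page. A minimal sketch using the HTML 4.01 Transitional DTD (title and body text are placeholders):
<!-- the DOCTYPE tells both the browser and the validator which DTD to check against -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
  "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<!-- declare the character encoding in the document itself -->
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Example page</title>
</head>
<body>
<p>Now run it through the validator.</p>
</body>
</html>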
They're called standards for a reason.
.02
cLive ;-)
Re:correction .. company website (Score:2)
There *is* a point to DTD definition (Score:2)
So? At least if you specify a DTD you are showing that you understand there are issues and have made a rational choice in selecting one.
If you bother to adhere to a particular DTD, the odds of your site being viewable across a range of browsers increase dramatically.
Yes, you will never get 100% compatibility, but you will get damn close.
If you insist on features outside current DTDs, then use server side browser detection to serve either the site as you intended, or a heavily stripped down totally cross platform version.
.02
cLive ;-)
Re:correction .. company website (Score:4, Informative)
Well, no, they're not called standards, and for a reason. From the w3c home page:
The World Wide Web Consortium (W3C) develops interoperable technologies (specifications, guidelines, software, and tools) to lead the Web to its full potential.
No mention of standards.
Take a look at the HTML specification page [w3.org]:
W3C produces what are known as "Recommendations". These are specifications, developed by W3C working groups, and then reviewed by Members of the Consortium. A W3C Recommendation indicates that consensus has been reached among the Consortium Members that a specification is appropriate for widespread use.
Again, no mention of standards.
The W3C is a vendor consortium, primarily a group of big players who are trying to reduce their cost of business by hammering out some common formats. The W3C is not a standards body, and they do not produce standards. While there are smart, possibly altruistic people on W3C working groups, by and large the W3C as a whole is interested in promoting the welfare of its member companies, not that of the general developer community. Typically these interests overlap, but that doesn't change the purpose of the W3C.
Netscape 6.0 IS obsolete (Score:2)
For goodness' sake, upgrade to Netscape 6.2, 7.0 or Mozilla 1.1! 6.0 is so old and has so many bugs, while 6.2 is almost infinitely more stable, faster and better at rendering.
Re:correction .. company website (Score:3, Interesting)
Netscape 6.0 can do unusual things if they are not there.
The W3C standard says that ALL attribute values are required to have quotes. A browser could refuse to render any element with unquoted attributes and still be as compliant as before, since the behavior is undefined. If you had followed the standard in 1994, the problem would never have occurred in any browser. What you fail to mention is that both Netscape 4.7 and 6.0 render the page properly with quotes. Why risk making it fail by leaving them out? This is about the same error as a programmer not initializing a bit of dynamically allocated memory and then reading from it. Might work, might not. The behavior is undefined, and with the proper education, a programmer would have learned not to make the mistake.
Now that everyone is trying (or at least saying they are trying) to get on the W3C bandwagon, these little 'faults' are starting to cause errors.
The problem is, no one ever said that "writing the pages this way will probably make them work in the future." However, I'd like to see a page written using proper HTML + CSS and no deprecated tags (like FONT) go bad in the next version of IE or Netscape/Mozilla.
So what if the faults are causing errors? A design fault causing a fault in rendering is fully logical to me, and I understand the browser designers who are starting to have trouble rendering according to their previous non-standards as features are added to *follow* the standard. Soon you'll have a big mix of standards and non-standards, and I at least would be very tempted to just throw out the shit and attempt to follow the standards better in the future. That is something Microsoft partially did in IE 6 (they require a proper DOCTYPE to enable standards-compliance mode -- probably too afraid of doing anything too drastic) and something the Mozilla group definitely did in their browser.
Just follow the standards and your pages should look very nice in Netscape 8 and Internet Explorer 7. Start by learning about the DOM tree and forget everything you ever "learnt" about document.all. Use getElementById("id") instead of document.all.id. As a bonus, by following the DOM and skipping deprecated tags like FONT while using CSS with em values and the like, you'll automatically get pretty much a cross-browser page, since the CSS specification has very strict rendering rules. *And* a page that looks good in the future.
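A minimal sketch of the difference (the id and the style change are made up):
<div id="status">Loading...</div>
<script type="text/javascript">
// proprietary IE-only object model - breaks in Mozilla, Opera, etc.
// document.all.status.style.color = "green";

// W3C DOM - works in IE 5+, Netscape 6+/Mozilla, Opera and anything else that implements it
document.getElementById("status").style.color = "green";
</script>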
If more web designers got an education (as in most other sorts of work -- what's so special about web designing that it needs no education anyway?), things would of course look better today.
Re:correction .. company website (Score:4, Informative)
HTML 4.01 recommended quoting attribute values as a best practice. XHTML (being a reimplementation of HTML using XML rules) requires, by inheritance from XML, that all attribute values be quoted.
Re:correction .. company website (Score:3, Insightful)
Uhh, ever viewed the source of this Slashdot page you are currently reading? Try it some time. Each block of comments at a given indent level is a nested table. It's called "Nested" for a reason. (I can't believe anyone actually uses that godawful "Threaded" option that's the default, but it uses nested tables as well.) And the entire block of comments is itself nested in a table, which is itself nested. Notice the page layout, the menus on the left, the 5% black borders on the margins, etc. - those are all done with tables.
Deeply nested tables are more common than you would think, because webmasters use tables for specifying page layout.
Re:YEAH I agree (Score:3, Insightful)
If you had written to the standards instead of just hacking something together until it worked in IE/NS $CURRENTVERSION, odds are pretty good that you wouldn't have this problem now.
Re: Backwards vs. Forwards Compatibility (Score:4, Informative)
Backwards compatibility means it works in older browsers. As Zeldman mentions, it always has some cutoff point, such as Netscape 3 or IE 2.
Forwards compatibility means that it works in newer browsers. There is not necessarily any cutoff point, as long as you have constructed the website correctly. Structural problems and other typos in the HTML, proprietary and deprecated tags, and versioning can all limit the forward compatibility of the page.
Read the article and you'll see that Zeldman is arguing that web designers should be developing with forwards compatibility in mind. Unsurprisingly, yours is one of the 99.9% of all sites that have not.
Re: Backwards vs. Forwards Compatibility (Score:2, Insightful)
I don't know which philosophy is more unreasonable.
Re: Backwards vs. Forwards Compatibility (Score:2, Insightful)
They have two choices: only render pages that follow the standards, leaving 99% of sites non-functional in their browser, or tolerate bad markup so their browser can be used today.
The only company that could currently force the updating of many sites is our favorite company, Microsoft, and even then I'm sure there would be resistance to a browser that only followed the standard.
So the burden has to be on the designers of the sites to pull them into line with the standards, now that browsers such as Mozilla and Opera can render strictly to the standard.
Re: Backwards vs. Forwards Compatibility (Score:5, Insightful)
The parent was talking about backwards compatibility of the browsers (being able to properly render old HTML code in a new browser, where the onus of compatibility is on the browser author).
It's semantics, but I didn't start the nitpick
As for the parent that wanted browsers to be backwards compliant
This is very hard to do if you want interactive sites, or at least was until recently when most browsers began to pay more attention to standards such as the DOM (document object model).
Again, we're back to a very basic problem. Do you write your page to work in old browsers or do you use the latest standards? I'm less concerned with this (as the author of the book seems to be) than I am with the idea of writing code to today's standards and having it work in future browsers.
I as a user understand that I'm taking my experience into my own hands if I try to load a modern page into Netscape 1.0 (but it is fun sometimes
However, words can't express my frustration when I have the most modern browsers available and I can't load a page because it was written for an older browser. This happened to me yesterday when trying to sign up for a service from my phone company. The reps kept saying "I see that option, you should have it too". 30 minutes later I decided to load the same page into a 2-year-old browser and it worked fine. It had used some tags that were horribly broken, not in any standard, and later abandoned by all involved.
If the modern browsers had had to be compatible with everything since the dawn of the web, they would be twice as large and 4 times as buggy. I would much rather that web authors stick to published standards and not rely on proprietary tags for public pages.
From what I see, this is what the book's author meant by "obsolete" and I agree. Most websites, if locked down and not changed for 3 years, would no longer render in the browsers that are new in 3 years.
While they will naturally work to fix these issues as the new browsers are released, they would not have to if they wrote to the basics. And the problem with fixing things as they evolve is that some pages (like that damned phone company page) get ignored and by the time they're found no one knows how to fix them.
XML/XSL. Know it, Use it, Love it. (Score:2)
This is just a book advertisement. (Score:4, Insightful)
Near as I can figure out, he's claiming "the web is broken, don't bother."
The book looks broken. Don't bother.
Re:This is just a book advertisement. (Score:2)
HTML. Broken. Yadda yadda yadda. Design. Content. Separation. Blah blah blah.
Relying on HTML to solve these problems is outdated. We have back-end scripts used to deliver customized presentations depending on the browser used to visit the site.
But I guess this is obvious to most of the horde of
Re:This is just a book advertisement. (Score:4, Insightful)
The irony is that no one beside Yahoo's management cares what Yahoo looks like. The site's tremendous success is due to the service it provides, not to the beauty of its visual design (which is non-existent).
I just want to know, what part of this makes it obsolete? That it uses html work arounds, looks right, or is a great service?
Then he goes on to complain that this extra HTML causes huge bandwidth charges, which I can assure you are negligible, even over millions of page views. If you take a look at my August statistics [oswd.org], on the 22nd you can see the sysadmin disabling mod_gzip. On the 28th, you can see me panicking about bandwidth and switching our old font tags to CSS. You can see the page views are about the same as the 27th, but the bandwidth goes from 871 megs to 838 megs. 33 megs is a very small difference for possibly breaking browsers that don't support CSS! Seeing as the bandwidth for a site like Yahoo is bought in bulk, even a gig of difference a day wouldn't be that much. And this is with mod_gzip turned off; that 33-meg gap would shrink to nothing if it were on. With Yahoo, most of their bandwidth is in news images and content anyway, not their design. So I wouldn't recommend taking the time to read his book, or even the sample chapter; it's bogus for sure.
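For reference, the kind of substitution I made looks roughly like this (class name invented):
<!-- before: presentation repeated in every cell -->
<td><font face="Verdana, Arial" size="2" color="#333333">Latest headlines</font></td>
<!-- after: one rule in a shared (and cached) stylesheet covers every such cell -->
<td class="nav">Latest headlines</td>
.nav { font-family: Verdana, Arial, sans-serif; font-size: small; color: #333333; }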
sample chapters are great! (Score:2, Interesting)
The essay gave a good analysis of the tradeoffs that web programmers have to make when planning websites. Some of the code examples here were particularly hilarious (if only because I know my websites have code that is equally ugly). This chapter, as I see it, is not advocating anything radical or controversial; it is merely restating the problem in as dramatic a way as possible.
Book Previews reduce the "obsolescence" of technical books. I say, let's have more of them!
rj
Let me qualify... (Score:4, Funny)
Back in Reality... (Score:5, Insightful)
They are developing XHTML 1.0 Transitional or HTML 4.01, maybe adding CSS, to go forward. NN4 will be around for a while, and few people are willing to write them off simply to appease the standards gods.
In the real world, we build sites for human composition. We separate content from display with our databases and content management. HTML may be an inefficient way to get the data to the browser (XML+XSLT would be ideal, XHTML+CSS would be easier on the browser), but it works. The browser parsers are done.
Sure, XHTML+CSS is easier on the browser, and that may help with rendering issues. However, the reality is that old browsers will be with us for a while. Maybe in 5 years this will matter, but not until then.
Alex
Re:Back in Reality... (Score:5, Interesting)
For my own sites I simply don't care about older browsers. I provide alternative CSS files (with basically all layout stripped) that should work in Netscape 4 (I haven't actually tested this). Aside from that, there's only IE6 and Mozilla for me. I develop for Mozilla and remove everything that doesn't work as specified in IE6. I refuse to do browser detection or to use CSS hacks to get stuff working. Some people advocate such hacks to trick IE into the right behavior, but I refuse to sacrifice elegance and simplicity. That is also the reason I use XHTML Strict. XHTML Strict is much easier to maintain than HTML dialects that are polluted with formatting and other bullshit.
Giving Netscape 4 users a bad experience may actually stimulate them to install something else. If enough sites ignore Netscape 4, maybe it will be abandoned by users. On most platforms there are now good alternatives (e.g. Opera performs better than Netscape 4.x on Win32).
Re:Back in Reality... (Score:3, Insightful)
This should be a simple economic issue. If it's really that much of a pain in the butt to support NN4, price that extra work at a point where you're OK with having to do it. If it's worth that much to your customer, then you have no excuse for complaining; just do the work and take your money. Lots of other system-requirements / target-platform decisions work like this (do we port to MacOS, do we port to MS SQL Server, do we port to Linux, do we port to iPlanet Web Server, etc.), so this isn't exactly a radical idea. If it's not justifiable from a business sense, just don't bother, but if it is, adjust your prices and STFU.
There are companies out there which have standardized on NN4 and haven't upgraded to NN6.2 or NN7 yet. Bless them. If not for them we'd all be coding in MS-HTML and MS-CSS, or XML and MS-XSL, and wondering why IE 5 was the last browser they released. One of these days they'll upgrade to NN7 (or something similar) and life will suck less. Until then, do your job and separate business logic from presentation, so the only part you have to re-code and QA for NN4 is the presentation layer. XSLT can help with this.
Re:Back in Reality... (Score:2)
I'm an exception to your generalization. (Score:2, Informative)
NN4 doesn't support <DIV>. It supports <LAYER> instead.
NN4 doesn't like inline styles.
NN4 doesn't fully support the height attribute (e.g., table cells).
NN4 doesn't allow onclick events on every object, such as <img> and <div> (or, layer, if we want to be technically correct).
NN4 uses its own Document Object Model, which results in very poor DOM Level 1 support, and virtually no support for Level 2.
NN4 supports the onunload event, but it does so quite unconventionally. This results in strange behavior when resizing a window: content unloads and refreshes, which is very undesirable for persistent objects, such as applets.
I guess that's a good stopping place. The list goes on, but I hope you see my point. In fact, the word "unconventional" suits NN4 quite well.
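For anyone who never had the pleasure, the scripting gap looks roughly like this (element name invented):
// Netscape 4's proprietary model: positioned elements live in document.layers
document.layers["menu"].visibility = "show";
// the W3C DOM equivalent that 5th/6th generation browsers understand
document.getElementById("menu").style.visibility = "visible";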
Web developers who are serious about dynamic or heavily stylized content will quickly realize that full NN4 support requires either an insane dedication to little hacks and gimmicks or a text-only version of their website. The way to present cross-platform, stylized content today is to use Shockwa^H^H^H^H^H^H^H a plugin.
The fact that 5th and 6th (and now 7th) generation browsers are 95-99% standards compliant means that bleeding-edge content will target newer browsers, and Netscape 4 will be left to rot. Five years is an insane lifespan for a browser, and if you remember correctly, Netscape 4 was just getting off the ground five years ago. Internet life moves at the speed of normal time ^2, so your five years is really like 25.
Maybe I live in a parallel universe, but in my reality, NN4 is already dead. Or, at least it has a really bad case of leprosy.
Technology exceeds demand.. (Score:3, Insightful)
Now that the bubble has burst, fixing "obsolete" sites is not a priority. IT staffs have been cut, resources have been redirected into projects that actually turn a profit, or the "web guys" are gone altogether. Nobody is around or has time to fiddle with the brochureware homepage.
Re:Technology exceeds demand.. (Score:2)
Mobile devices, web kiosks, smart agents - these things will all be connecting to the net more and more in the future. Companies with well constructed sites which follow open (not browser/platform specific) standards will likely benefit greatly from all this.
It won't happen overnight, but it will happen.
Gasp! (Score:5, Interesting)
Who on earth is running a browser earlier than 4.x? Do you expect stuff to be rendered right if you use an older version of IE/Netscape/Opera? Do advertisers want to sell to people that refuse to use the latest and greatest thing? Don't you have to try real hard to even find an older version of any of these browsers?
Sounds like a cheap way to sell a book - and a little extra helping of FUD thrown in.
Re:Gasp! (Score:5, Insightful)
I'm using Konqueror 3.0, which came with SuSE 8.0. Googlebot is version 2.1 according to my logs. The point is that it shouldn't matter what browser you are using, and we shouldn't be fudging markup into tag soup in an effort to keep certain browsers happy. Rather, mark up a document cleanly and use CSS to present the markup -- that way less capable browsers can strip away the CSS and get a default view of the content, which they can then style or manipulate themselves.
Do you expect stuff to be rendered right if you use an older version of IE/Netscape/Opera?
No, I don't care about the rendering, but a page would be much more interesting to my little scripts if the markup described the structure of the content appropriately.
Don't you have to try real hard to even find an older version of any of these browsers?
Not too hard at all: http://browsers.evolt.org/
Re:Gasp! (Score:2, Informative)
An easy experiment you can do is to try to access a website with lynx; it will simulate what a blind person listening might hear. Straight away you notice that in multi-column, table-based layouts, all those tiny links down the side of the page (next to the article you actually want to read) have to be scrolled through before you get to the article.
I don't understand the mentality of people who fudge around adding hack after hack for compatibility with 4.x browsers.
If you write a page using XHTML, a user with any browser that understands HTML will be able to read it. You can write it in the order "title, article, links/ads" - then the blind user's browser will get to the content they came for instantly. With intelligent use of the DIV tag, all this can be positioned using CSS, so you can still have the layout you want for people who can see it.
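A minimal sketch of that source ordering (the ids and measurements are made up):
<body>
  <div id="article">
    <h1>The headline</h1>
    <p>The article text comes first in the source, so lynx and aural browsers reach it immediately.</p>
  </div>
  <div id="sidebar">
    <!-- navigation links and ads come last in the source -->
  </div>
</body>
/* in the stylesheet: visually, the sidebar still sits on the right */
#article { margin-right: 13em; }
#sidebar { position: absolute; top: 0; right: 0; width: 12em; }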
Best of all, unlike a sea of hacks and workarounds, this is built to standards so it won't need tweaking every few months.
It's easy to say to a 4.x user "upgrade" - after all, the system requirements for IE haven't changed that much from 4 to 5 to 6. But a blind person can't "get some eyes that work". So don't discriminate against them.
Re:Gasp! (Score:2)
Mozilla 1.0, anyone?
Re:Gasp! (Score:2)
Who on earth is running a browser earlier than 4.x [...] Don't you have to try real hard to even find an older version of any of these browsers?
Nah. I just go over to my sister's house and see what she happens to be using to access the Web...
Re:Gasp! Yup, I'm a luddite. (Score:3, Informative)
Me. Lynx, anyone? Not anyone around here who uses a shell, is there? Also, old Macs - SE, SE/30, etc. - can dial up, and there are Ethernet adapters for them. They make good, cheap, space-saving machines for simple access. Use NiftyTelnet for shell access, and older versions of Fetch and Netscape 2.0.
But the important message here is that:
The web is about content, not format.
Remember this. The whole point to html is that it's a *markup* language, not a *forced formatting* language. The browser takes the content and displays it in the manner of the user's choosing.
This seems to have been lost in the corporatization and control of the 'net.
Remember the good old days? When the web was about content and not about spam and marketing? That's where I live. I don't want to see blinking and flashing and animated ads and popups. If I can't see your content in lynx or with a 4.x or pre-4.x browser, you have lost my eyeballs and any potential to receive my money. No popups on lynx.
The same goes for html formatted mail (there is a special place in hell reserved for people who send html formatted mail.) If I can't read it in pine, I don't even care what it says. Send me text if you want me to read it. (No web bugs and stuff that way too.)
In short, the goal is to get your content to other people, stop being such control freaks about how it is displayed. Write to the lowest common denominator, be creative with what is available there and you save much time, aggravation and money. -- And I'll be able to see your content.
NEVER FORGET --
The web is about content, not format.
Join the Any Browser Campaign [anybrowser.org]and make your pages 'content enhanced'.
Cause and effect? (Score:2, Insightful)
Are 99.9% of Websites Obsolete? (Score:3, Insightful)
(Hmm, I was tempted to leave that as is, but I think at least a little explanation is required. Zeldman disagrees with his own thesis in as much as he says that sites like Yahoo! are important because of what they offer, not how they look. So, QED, a site that relies on its content is not obsolete. Tadaaa!)
Re:Are 99.9% of Websites Obsolete? (Score:2)
If Yahoo could offer its content free of the tag-soup additions, it would last quite a bit longer than its current incarnation, purely because the content would be a lot more accessible to more browsers and user-agents than at present. (Take a peek at the HTML source and tell me honestly that the markup matches the structure of the content).
Inaccessible content is just as bad as no content at all. Machine-readable markup has enormous benefits, and RSS just doesn't match up. Given clean markup, you'll be finding a lot more useful applications of the Web framework, but at the moment we are stuck in a browser only, keyword only environment. The Web offers us so much more than that.
Zeldman is looking forwards. Today doesn't matter tomorrow. The browsers you test your site on today are outdated. You think IE will still be king of the hill a few years from now? Did you also believe the same about Netscape Navigator a few years ago?
The Web evolves, but at the moment tag-soup markup is what's preventing us from reaching the full potential that Tim Berners-Lee saw at the very start.
99.9%??? (Score:5, Insightful)
Talk about sensationalism. The article just points out that many web sites have mark-up errors in them. Big deal. To go from that to saying that 99.9% of sites are obsolete is just dumb.
This is just a sensationalist way to promote a book. Shame it got onto the front page of Slashdot. It will encourage more people to do the same.
Re:99.9%??? (Score:2)
99.9% of web sites are obsolete, and every computer for sale is obsolete by the time it hits the store.
What's the difference?
We design our web pages not to be constantly cutting-edge, but to be compatible and useful. Also, as the parent post points out, there is a difference between non-compliance and obsolescence.
Re: What's the difference? (Score:2)
Everyone who I have ever worked with has generated invalid HTML that has made even current browsers crash or behave erratically in different browsers. When I realized that I was also making these mistakes, I finally learned my lesson and started using the W3C validator [w3.org] to make sure my web pages are valid HTML. Since then, I have not had any problem with my pages not working in any browser. This is exactly what Zeldman is asking web developers to do.
Re:99.9%??? (Score:3, Informative)
Compatible with what? Testing in the browsers available today only gives you compatibility for yesterday.
Compatibility with standards such as the XHTML Recommendation and the CSS Level 1 & 2 Recommendations offers you compatibility tomorrow too.
Surely anything that helps your website to be accessible tomorrow is to your advantage?
No, it's just reminiscent of "Flash: 99% Bad" (Score:3, Interesting)
Re:No, it's just reminiscent of "Flash: 99% Bad" (Score:2)
Speaking as, I believe, a member of the Flash community, I take that as an insult. People who are serious professional Flash developers didn't need Nielsen to tell them that many people used (and use) Flash in bad ways.
Proprietary code and those little hacks are bad. Code to standards.
Do you think web site developers choose to use "those little hacks"? The fact of the matter is that clients say "hey, I want that image to be down and to the left a little bit", so you find yourself putting a little invisible GIF image in to get the position right. You would love to do it "to standards", but if you use layers then it doesn't work for a good proportion of your visitors. Alternatively, of course, you could do all your work twice, once with "little hacks" for the older browsers and once again "to standards", but most of us like to take a more pragmatic approach.
Re:99.9%??? (Score:2)
What percentage of websites pass cleanly through an HTML validator such as the W3C's [w3.org]? Surely the sites that do not validate fail because there are errors in their HTML markup?
Zeldman probably believes that 0.01% of sites validate correctly, so his figure of 99.9% obsolete isn't mathematically that far off.
Uh-huh. (Score:4, Insightful)
Methinks somebody is confusing "are obsolete" with "will eventually be obsolete, so long as web browsers suddenly become fault-intolerant and the site owners leave things exactly how they are and never ever maintain them, ever".
(Not to say that I don't agree with what he's saying, but jeez, what a wanker! "I declare that everything, everywhere sucks ass! Huzzah!")
Re:Uh-huh. (Score:2)
Two points:
1.) Current "practice" forces all HTML parsers to perform error correction before content can be used for other purposes. Content aggregation and syndication are there to make our lives easier and using the web more efficient. Why do you think there's a lack of tools that do interesting things with web content - it's the overhead of handling non-standard markup that causes problems.
2.) If site owners aren't willing to deliver standards-compliant (and semantically useful) markup now, when?
Obselete? (Score:3, Insightful)
Re:Obselete? (Score:2)
Books vs. The Web (Score:4, Interesting)
You know, if I pick up a book printed in 1920, its interface is going to be familiar to me. Table of Contents, Index, Chapters, Body Text, etc.
And now? I pick up a book printed today and find the same, useful interface.
Contrast that with the web, where I can find simple, clean interfaces like Google or Yahoo alongside ghastly Flash-based interfaces that do everything they can to distract me from the information I'm seeking. Plus, I'm being told that the device (program) I use to access these sites is obsolete less than five years after being released?
I'm all for freedom of speech (and web presentation), but the web's got a long ways to go before it can become the useful instrument it can be.
Re:Books vs. The Web (Score:3, Insightful)
Well, yeah. By 1920, there had been thousands of years during which the presentation of the printed word was gradually improved and codified.
We're still in the early stages of presenting electronic content, the brainstorming stage, if you will. There's still plenty of room for innovation. Bear with it.
And I'm surprised at YOUR surprise that 5-year-old technology is considered obsolete in Internet time. Improvements are a GOOD thing.
The real problem: Non-compliant writing (Score:3, Insightful)
Yes, but when you pick up that book from the '20s, did they split their sentences with unnecessary commas? And check out Chaucer - is his work obsolete? Would you really want to read it if it were "ported" to 21st Century English?
After a cursory survey, I'd say that at least 99.9% of the writing on the web is not standards compliant.
The rest are l337 5kr1pt k1dd135.
How about Slashdot then? (Score:3, Interesting)
Misread the title... (Score:2)
My personal favorite worthless website: ZDNet, home of TechTV and other "high tech" offerings, with its absolute table widths - ever try to read that crap at a high resolution? Another one of my favs is the old 1x1 GIF and other "layout tricks" still used by webmasters today. Get a clue, people; an XHTML book is $15.
not that simple (Score:2)
About 10% of our sites' audience still uses Netscape 4.x, which doesn't support some elements of XHTML, nor any CSS positioning abilities.
We'd love to upgrade our standards to something more forward-thinking, but it's extremely bad business practice to piss off a tenth of your user base.
Re:not that simple (Score:2)
Strong Typing, Strong Code (Score:2)
The web could do a lot worse than become a bit more strongly-typed, and a bit more like a programming language than a scripting language.
True, most folks don't need more than the basic mark-up for their websites, especially where personal websites are concerned. But commercial sites could stand for a much better design than they have. . . the author here makes a lot of good points when he calls out the faults of ZDNet and Yahoo for their HTML. The code is crap - thank God HTML doesn't have GOTO statements, or these sites would probably be chock full of those, too.
Let's do what we did with the blink tag. Don't just deprecate it--ignore it. Tell the browser, "Don't listen to the <font> tag, just skip over it."
Not too long ago, I re-wrote my own personal webpages using Cascading Style Sheets. It's tricky, since Netscape/Mozilla oftentimes has different ideas of how to interpret CSS than Internet Explorer. But it's easy enough to accommodate both, without too much effort. And I'm a lot happier now that my HTML code looks less like last night's dinner and more like something that someone else could read and understand.
Re:Strong Typing, Strong Code (Score:2)
Or did you mean HTML? It isn't a scripting language either, it's a markup language. It doesn't have any processing instructions, it just describes data. Or did you mean DHTML...?
And the differences between a "programming language" and a "scripting language" have always been murky. What's the difference? That one can be compiled and the other is interpreted? Is one strongly typed and the other not?
I'm probably not saying anything you don't already know, but it's hard to know what you're getting at.
J
Is error free HTML a chimera? (Score:3, Informative)
Maybe learning HTML in a weekend [intuitive.com], or even faster [geocities.com], doesn't help keep the quality of code at a high level ;)
93% of your audience use 4.x or better browser (Score:2)
Do we start broadcasting TV signals in black and white again because a similar portion of viewers use b&w tv's?
Whoever uses an older browser usually isn't a power user to start with and isn't looking for the latest fluff anyway.
Re:93% of your audience use 4.x or better browser (Score:2, Insightful)
I think the complaint with the web is that things don't gracefully degrade in downlevel browsers, they just die.
The original intent of the web and html was to distribute content with tags that describe the "purpose" of that content and leave the rendering up to the browser. This meant that I could write a page and my message would get across to anyone even though it might look different to every person.
Then enter the marketing folk and the desire that a webpage look the same to everyone. That sucked.
CSS allows better control of the look but still works on the premise that the html (or xhtml) describes the purpose of the content and CSS is around to give hints on how the page should look. It still gives the end browser ultimate control of the rendering and the page could look different to different people.
If people would design their webpages realizing that what's important is the purpose of the information and not the look of the information, we wouldn't have so many of these problems. The web was designed for information, not for art.
Re:93% of your audience use 4.x or better browser (Score:3, Informative)
Who ever said a Compaq iPaq running Pocket Internet Explorer, or a Sharp Zaurus running Opera at a max screen size of 320x200, is "an older browser"?
When HTML and CSS are used correctly, optimally and compliantly, the resulting websites are far more accessible, in more user-agents, than the mere crop of bloated OS-based browsers.
Re:93% of your audience use 4.x or better browser (Score:2)
I do, however, care if it's so moronically broken that it's not even navigable. As long as I can read the information I went there for, all is not lost.
P.S why do I sometimes get slashdot telling me "you cant post to this page"?
Obsolete and then some (Score:4, Funny)
Now that he's completely met his goal of total obsolescence, our webmaster spends every day looking for new ways to make our website even less useful, uglier, and more of a pain-in-the-ass to use. He's been very effective.
Re:Obsolete and then some (Score:4, Funny)
Web designer's perspective (Score:2)
Let's take a closer look.
The overwhelming majority of websites out there are not HTML 4.0/XHTML 1.0 compliant. Even the sites that belong to members of the W3C bend the rules which they help write. Sounds asinine? You bet.
Standards do not mean s**t anymore. Everybody is aiming for IE 5.x/6 compatibility nowadays. Cross-platform understanding is dead, now that Netscape has lost the overall war. The vast majority of web designers do not even double-check their sites in Opera/Mozilla nowadays, for fear they might have to do some extra compatibility coding/clean-up.
Most sites are NOT cross-device/platform. You cannot view them on PDAs or cellphones. Notice the word _MOST_.
There are millions of other reasons, but I have to run to a meeting. I'll expand on this later today in more detail.
Zeldman (Score:4, Insightful)
It's easy to lament the fact that these sites aren't standard, but there are clearly reasons why most of these sites don't fit his vision of standards compliance.
For one, most sites don't have the budget to develop to standards. It's much easier to code to specifics and use non-standard workarounds where possible than to boil everything down to the least common denominator (which standards are supported by whom). When I say easier, I mean that years of experience have instilled intimate knowledge in the seasoned web developer that almost comes as instinct now.
Secondly, all of these "standards" are interpreted differently by the different browsers, so you can't ensure a consistent look and feel without kludges.
Third, most of the foundations for these sites were laid out before coding to a standard was even possible, and when the mindset was not focused on any sort of standards compliance.
Finally, I've always thought that they made writing to standards compliance sound easier than it actually is, because even though it's called a standard, it rarely exhibits standard and consistent behavior across the various platforms. Most art directors and graphic designers - specifically those that migrated from print or traditional design - tend to be extremely unyielding in the way their designs are interpreted on the web, leaving developers with few options that are fully supported by these so-called standards.
Personally, I think Zeldman needs to spend some time in the trenches working on a large site with a large development team under real deadlines for real clients. Things are rarely ideal in these circumstances.
What is it they say about armchair coaches?
The problem is people... (Score:5, Insightful)
...who don't understand what HTML is.
You're not supposed to be able to. That's not what HTML does.
HTML is a content language. The whole beauty of it is that the final presentation is NOT THE DESIGNER'S RESPONSIBILITY. No web site will look the same on all platforms - that's the point.
The people you are talking about are not 'web designers' - cannot be, because they don't have a clue what the web is. If you cannot accept the fact that your content can be presented different ways (including to blind people) as appropriate to each individual client, you have no business on the web. Make .pdf files or something.
I know someone will interpret this as flamebait, and someone else will probably tell me to 'get with the real world' or the like, but in fact I am just telling you the truth, and I'm quite grounded in the real world. There has been no shortage of people explaining these simple facts about what HTML and the Web are, in simple terms and moderate tones, from the very beginning - and sadly there has been an overabundance of self-styled 'designers' that refuse to understand the medium and insist on trying to make it what they want it to be, instead of what it is. REAL designers work with their medium; they take the time to learn how it works and why, and they produce designs that are appropriate to it, rather than insisting that every medium work the way their favourite one does and breaking it every time they touch it. And that is something that every decent art teacher in the world tries to teach his students. Sadly, the students, particularly the ones that go into web design, don't often listen. I'm not trying to pick on you personally, but your clueless post makes an excellent example, I must admit.
'Designers' that couldn't be bothered to understand the medium of the web before proceeding to dump their work on it have done great damage to the web, and that's something I happen to care about quite deeply. Your ad hominem attacks and dismissals of Zeldman aside, he makes a point that is absolutely true, and one that will have real economic consequences. All that patched-up proprietary spaghetti code of malformed HTML abuse IS coming down. While standards-compliant pages from the very earliest days of the web still display perfectly in the latest nightly builds of Mozilla, the pages written by people with the philosophy your post shows ARE becoming obsolete, very quickly. In a way, the 'designers' that can't be bothered to learn their medium have won - the new standards will allow them to do what they always wanted to do, and what HTML was never designed to do: to specify layout and 'look and feel' issues. But it will require them to do it in ways that are consistent with the underlying philosophy of HTML and the web - something they've never shown any interest in doing before. I expect to hear a lot of whining from that corner in the coming years, but don't look to me for sympathy.
Complexity vs. usability (Score:3, Interesting)
I'm not saying that we as a collective need to move back to HTML 1.0, but there has got to be a solution to increasing complexity in Web information spaces. Companies that intentionally cripple some browser/OS combinations are doing the greater community a vast disservice.
The majority of Web pages are not necessarily broken, but reflect limits on the time and energy of those who create them to keep up with 'standards' that seem to shift every other week.
It's harder to play one note and have it be perfect than it is to play a thousand and have them be close. Most people choose the latter, and hope that one note hits home.
HEAR HEAR ! (Score:2)
All the folks out there who are slamming web developers/authors really need to step back a second. [I'm amazed that my first post in this topic already has 3 "You should code better" responses.]
I have been working with 'web' pages professionally since late 97.
And man has stuff changed.
Anyone who works in the real world (not academia) understands that not only is there the pressure of a 'real world' environment - but the need to show value for a company.
Understaffed departments, unreasonable demands, HUGE goals. Those are the factors that REALLY limit the 'good code' out there. It's very hard to make sure you're 100% compliant [no matter how hard you tell the board/your boss/your dept/the finance people that you SHOULD be] when at the end of the day you have more 'new' projects in your inbox than ones you have finished.
[and before folks cry - TELL THEM ! TELL THEM ! We are in an economy now
but I digress
HTML that was 100% w3c 4 years ago
XML
I can write some xml/xsl for IIS
[clarification
So yeah
Condensed version (Score:3, Funny)
1. Standards are good.
2. Bad code that happens to work in current browsers is bad.
3. Buy my book.
Yeah, whatever (Score:2)
-Sean
Web Standards are a well conceived joke (Score:2, Interesting)
The main problems that I see are that
1. Web standards bodies move slowly, and specifications are obsolete before they are approved. Take SVG. (Please.) Flash is a superior format with a large installed base, quality authoring tools, platform scalability, and an open but expensive architecture. SVG took five years to become a reality, and is still VERY immature.
2. It's about the user, stupid! For the most part, users sit at a computer desktop, with a commercial browser (IE), and use the internet. It needs to look right for THEM. The .001% of users on cell phones are doing specific activities with mostly packaged content. These users are novelty users. Portable devices have no standards as to how they display, and without this, nobody can expect a useful cross-platform "standard" that works everywhere. It's a Microsoft world, whiner. There is no doubt that IE is the only browser that matters. If someone else wants to make a competitive browser, it needs to be IE compliant, not W3C compliant. Microsoft took it upon themselves to create a language that works no matter how it's written. Who cares about sloppy coding? Bandwidth is hardly an issue, and if a browser renders correctly, it should LOOK right.
In conclusion, the Web Standards Project and the W3C have failed due to their managerial impotence, and can be safely ignored.
The Problem Isn't Backward Compatibility (Score:3, Interesting)
Designers want to control every pixel of a page's layout, completely ignoring what the web was designed for. If everyone used logical markup to describe their data, later adding CSS to attempt to influence the layout, the web would be a much friendlier place. It may not look exactly the same on every browser (which, come to think of it, may be Zeldman's point), but with proper testing, it should look similar on popular browsers, and at least be LEGIBLE on others.
People need to be convinced that the web is not a graphic design medium. That's what PDF files are for. People don't try to build their sites solely from PDF files, because that just wouldn't fly. Instead they try to use the web to achieve the same goal, completely oblivious to the fact that it's a really poor tool for that purpose. Rather than embracing a new paradigm, they try to contort it to look like what they already know. To me, that's just incompetence.
What are standards? (Score:2)
The World Wide Web is about whatever you make it. I could make my own meta language that uses HTTP servers. Coming soon: rEml, the randomErr markup language. It won't meet your standards, but it meets mine.
forcing everyone to do things your way is so... microsoft.
xhtml easier, yeah right (Score:2)
Can someone tell me, is
<b> go and <a href="somelink">click me</a> now</b>
illegal in XHTML? Does it need to be
<b> go and </b><a href="somelink"><b>click me</b></a><b>now</b>
because A HREF tags aren't part of the valid contents of the bold tags?
Solution: Content Management Systems? (Score:2, Interesting)
I work for a mid-sized company, and I know the web site is very out of date and has incredibly poor content. In my mind I can pinpoint this to one thing: the inability of the people who write content to get it onto the site.
I know for a fact there are more than enough good stories and photographs in the organization that could be published, but most of the technicians who would write them (or at least the first draft) don't have the time to learn a web design program. The solution, I believe, is a good content management system. I've been looking into Typo3 [typo3.com] and a couple of other content management systems. I believe once we make it easy to update, the content will be less likely to be obsolete.
Content management systems are right now the best place I can start introducing open source software at my work. We've looked at Microsoft's Content Management Server, which is highly overpriced for our needs, and it's hard to argue with the documentation and self-help community that open source software provides. I know there are other content management systems out there, but the point is that for content to stay current, publishing capabilities must be pushed to the people who will author it.
Pure Bunk (Score:2, Insightful)
Still, it's always amusing to see someone suit up, gird their horse, and charge at the windmills while proclaiming the revolution.
Everybody knows the answer is standards! (Score:3, Informative)
I want to do a nice little page, and do it in XHTML because it's The Way Of The Future (or I want to display a little math, which only XHTML+MathML allows without resorting to ugly inline images). The tag soup itself isn't a problem, I just close all my tags and make sure the doctype declaration says XHTML instead of HTML, as prescribed by the standard [w3.org].
However, is this enough? The document is now XML, and therefore should have an <?xml declaration, if only to specify its encoding. Except that said XHTML standard says it is optional if the encoding is UTF-8 or UTF-16, or has been otherwise determined (think HTTP headers), which contradicts the XML standard, sec. 4.3.3 [w3.org] - the last two paragraphs, one of which says that no declaration and no other information means mandatory UTF-8, and the next that "It is also a fatal error if an XML entity contains no encoding declaration and its content is not legal UTF-8 or UTF-16."
So I need a declaration no matter what. But according to this page about the different layout modes in current browsers [www.hut.fi], MSIE will react to an XML declaration by switching to "quirks" mode, which is precisely what I want to avoid by sticking to the standards... And I wouldn't want to lock out 85% of WWW users, would I?
But wait, this is only if the page was served with a text/html content-type. The right answer would then be to use the standard content-type for XML/XHTML... which should be [w3.org] application/xhtml+xml! Yes, "application"! Now if I use that content-type, all browsers I have at my disposal except Mozilla (MSIE5, Konqueror, Links, Lynx...) either consider the page an application and offer to save it to disk, or display it as-is! Same with the second-best, text/xml.
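For the record, the kind of document I'm talking about is nothing exotic. A stripped-down example:
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><title>A perfectly standard page</title></head>
<body><p>...and that XML prolog at the top is enough to knock MSIE 6 into quirks mode when this is served as text/html.</p></body>
</html>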
Okay, am I the only one experiencing this? Is there any point in not just using good ol' HTML 4 and avoiding (yet another kind of) horrible bugware?
Re:Everybody knows the answer is standards! (Score:2)
Obsolete is an obsolete word (Score:4, Interesting)
Here's what it means: http://www.dictionary.com/search?q=obsolete
Hell, I still use lynx when all I want to do is snag a tarball. My linux boxes don't even have a GUI. If the content there has meaning, who cares if the web page uses the latest 'nifty tricks'? Is an ASCII text file obsolete? No, not if the information it contains is valid. Is EBSDIC (sic) obsolete? Probably. I can't even remember the acronym
I'm constantly hearing how my P3 600 is obsolete. There's nothing that doesn't run on it. Hell, I have a router box running a P90.
Is my original NES obsolete? Or my Atari 2600, for that matter? Not as long as I enjoy playing them.
Is a 2001 model vehicle obsolete because the 2002 line is introduced? It does have a bigger cupholder, after all.
If people want to push their agendas and sell whatever they're selling, go for it. Just quit trying to redefine perfectly cromulent words in the English language to do so. Make up new ones, like cromulent. I propose 'obsolastweek' to mean everything that wasn't shrinkwrapped within the last 24 hours.
This article should read "99.9% of websites are obsolastweek because they haven't been redesigned because some propellerhead made a new widget"
Propellerheads (I can use that word because I am one) don't realise the cost of doing business. The world doesn't start over at 0 just because they invented something 'slightly better'.
disease and decay... (Score:2)
Tell me about it. I just checked my webpage, and all my <br> tags had decayed into <blink> tags....
Translation (Score:2)
Ergo,
"0.01% of all websites render nicely in Lynx."
Seems to me there's some confusion between "obsolete" and "usable." Those websites that will be obsolete with fubar 6.x are the same ones that cram a lot of visual shit down your throat, making you work very hard to extract the useful information out of the noise.
Fight designed obsolescence, and write text-based web content with a minimum of static content. Otherwise, don't bitch when fubar 6.x fubars your site.
Re:Translation (Score:2)
Shame on all those developers..... (Score:5, Insightful)
Yeah, that's right. It was the fault of all those developers who didn't have the foresight to see the standards that would eventually be approved years later. What were they thinking?
It didn't have anything to do with the standards process being slow, or diverging from the needs/demands of the market (HTML 3.0). And even after the standards were finally approved with buy-in from the browser makers, no blame rests with Microsoft and Netscape for the serious bugs in their 4.x browsers, which often caused those browsers to crash on many CSS features.
Yep, those developers were at fault. They learned bad techniques, when those techniques were the only way to accomplish what their customers wanted. They continued to use them when the 4.x browsers would crash on standard-based markup. Even after the really serious problems were cleared up in IE5.x, they still used their old tricks. And now, damn them, that 6.x browsers have been available for only a year or so, they haven't redesigned all the world's websites to be fully standards compliant (and broken on 4.x and some 5.x browsers which are still in heavy use).
Yep, if anyone's to blame, it's those developers.
Re:Yahoo (Score:2)
Occasionally, I get to sites that don't let Konqueror in (maybe once in a month) but Mozilla works.
BTW, Flash works very well in Linux, too.
Re:Yahoo (Score:2)
Yahoo's markup structure adds no semantic meaning to the content at all. Therein lies the problem. The website may look like it's compatible, but that is mere face value. When you dig deeper into the markup and try to manipulate it with a script, Yahoo's markup is one of the most unwieldy around.
So Yahoo basically prevents tomorrow's browsers from leveraging its content in favour of being compatible with today's. Zeldman is advocating a method of accomplishing both feats, while nonchalantly supporting all HTML-compliant browsers to boot.
Re:Microsoft makes it especially difficult with IE (Score:2, Informative)
Re:... So Let Me Guess: (Score:2)
<Personal Shill mode>So now you all go out and buy my book and your HTML will be cleaner, 20% whiter, your breath will be fresher, and you'll get this lovely set of steak knives</Personal Shill mode>
Re:Coding Insanity (Score:2)
Lynx handles images by using an external program - essentially like a plugin. Plus, for maximum accessibility you should be providing textual alternatives to rich media types anyway - that's a Priority 1 checkpoint of the WCAG [w3.org].
So let's all just use HTML 0.1 with only <br> tags and <a> tags. Whine whine whine...!
No. Well-structured HTML (as in _this_ is a heading, _this_ is a paragraph, _this_ is a quote), with CSS used to style the presentation (whatever the output destination: screen, printer, aural devices, holograms).
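Something along these lines (the file names are made up):
<link rel="stylesheet" type="text/css" media="screen" href="screen.css">
<link rel="stylesheet" type="text/css" media="print" href="print.css">
<link rel="stylesheet" type="text/css" media="aural" href="aural.css">
<h1>Quarterly results</h1>
<p>The same structural markup feeds every output device; only the stylesheet changes.</p>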
Re:Slashdot (Score:2)
- people who adopt IE-only standards are stupid because they piss away 25% of potential users.
- people should abandon older standards in favor of the W3C's
What is logically inconsistent about those two statements?
Authors want and write backwards compatibility in order not to piss off the friggin' users who use older browsers! Get a clue pill, dude.
IE only extensions force people on other platforms to change platforms. Standards compliant HTML forces people to upgrade their browsers. Which would you rather do??