no. the toxicity comes from the advertising and the insistence on javascript (and flash and java etc applets).
just displaying documents is harmless. it's the fact that web-dev fuckers (and worse, "designers") want to run arbitrary code on millions of computers belonging to other people that is the source of the harm.
Noooo, the toxicity comes from using a platform that was designed only to publish static documents with a simple markup language to do rather complicated dynamic operations: operations that require the browser to act more like an operating system, managing objects, making additional requests based on client-side activity, and so on.
This is what YOU WANT. You can't tell me you don't want those features without also telling me that you want to be Amish.
Running arbitrary code is exactly what is required to satisfy web browsing needs of people today.
you're making the mistake of assuming those "shiny" features are essential. they're not. in fact, more often than not, they're a PITA and end up being a reason not to return to the site.
if a web site or even just a web page doesn't work without javascript, then it is broken. js can be useful to *optionally* enhance a page, but the page should work (i.e. display the important information and navigation controls) without javascript.
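to sketch what i mean (the url and ids here are made up, not from any real site): the link below is plain navigation that works with js disabled, and js, when present, only *optionally* enhances it by loading the extra content in place.

```html
<!-- works without javascript: a plain link does a normal page load -->
<a id="more-link" href="/article/42/full">read more</a>

<script>
// optional enhancement: if js and fetch are available, intercept the
// click and load the content in place instead of navigating away.
var link = document.getElementById('more-link');
if (link && window.fetch) {
  link.addEventListener('click', function (e) {
    e.preventDefault();
    fetch(link.href)
      .then(function (r) { return r.text(); })
      .then(function (html) { link.outerHTML = html; });
  });
}
</script>
```

no js? the link still gets you to the information. that's the difference between enhancement and dependence.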
They're absolutely essential, and done right, not even close to being a pain in the ass. If you have a "form" on a page and want to be able to work with it without having the entire page reload, your ONLY option is JS. There are NO OTHER OPTIONS.
I'm flabbergasted that you even think for one moment that a site not working without javascript is broken. With respect, you're the one who's broken.
You simply cannot do 99% of what people expect in a "Web 2.0" experience without working with the DOM locally, and that absolutely requires JS. Unless... you really want JAVA and FLASH?
There is no such thing as "optional" enhancements anymore. Responding to events and using a JS framework greatly accelerates development times and actually allows for rendering on different browsers to be largely the same.
My pages are far easier to write and less cluttered: a simple class with a click event runs an AJAX call to get you more information about that object.
You're wishing to go back to a pure static page environment, or worse, a dynamic one where the SIMPLEST activity requires a GET or POST to a separate page, server-side code to run, a dynamic page to be built just for you, and then returned.
Really? All of that work, those server resources, just because you didn't want one little AJAX call running on the page submitting your post?
Hardly seems reasonable.
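Here's the kind of thing I'm describing, as a rough sketch (the class name, `data-id`, and `/api/info/` endpoint are placeholders I made up, not a real API): one delegated click handler, one small AJAX call, no full-page round trip.

```html
<button class="more" data-id="42">More info</button>

<script>
// one delegated handler covers every ".more" element on the page;
// fetch() grabs just the fragment we need instead of rebuilding
// and re-sending the entire page from the server.
document.addEventListener('click', function (e) {
  var el = e.target;
  if (!el.classList || !el.classList.contains('more')) return;
  fetch('/api/info/' + el.getAttribute('data-id'))
    .then(function (r) { return r.json(); })
    .then(function (data) {
      el.insertAdjacentHTML('afterend', '<p>' + data.detail + '</p>');
    });
});
</script>
```

Compare that to a full GET/POST cycle where the server has to regenerate the whole page just to show one extra paragraph.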
Then let's not forget: with your idea, no major web development would ever have been done, period. The only way to do anything would be native code, thereby shutting out quite a bit of valuable innovation from startups that could never have afforded the large coding shops needed to keep track of code for multiple platforms.
Having experienced both native-code companies and SaaS companies... please don't relegate the rest of us to that hell. The native-code companies are cratering because they can't begin to keep up with the SaaS companies using open source frameworks and rapid cross-platform development to push out fixes and features in weeks instead of three years.
As for that CDN observation, I never said it was perfect. Only that if you want web browsers to operate the way they do now with no 3rd party client-side libraries and frameworks, you would need to provide all of those capabilities straight in the browser. Registering a CDN in the browser as servicing a particular domain at least allows the browser to filter those requests and choose whether to make them at all. Site coding is easier because the browser understands to redirect requests for those files to the registered CDN instead of the website. If the CDN fails, you get graceful failure back to the site's own servers, unless the site is 100% hosted on a CDN, the way it can be with Amazon EC2.
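The graceful-failure part already exists today in a cruder form, the classic CDN-with-local-fallback pattern (the URLs below are placeholders, not a real deployment):

```html
<!-- try the CDN copy first -->
<script src="https://cdn.example.com/libs/jquery.min.js"></script>
<script>
  // if the CDN request failed, the library never defined window.jQuery,
  // so fall back to the copy on the site's own servers.
  window.jQuery || document.write(
    '<script src="/local/jquery.min.js"><\/script>');
</script>
```

What I'm proposing is that the browser itself handles that redirection and fallback, instead of every site hand-rolling it.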
Bypassing the CDN filtering doesn't make sense either, as it would be the CDN itself that gets listed on the RBL, or the subset of it belonging to that customer. Putting their scripts there is the entire point of the CDN, so that doesn't make sense either. Why would I host my own scripts AND pay the CDN? No, the idea of the CDN is to 100% host my site on it, not à la carte. When that malware script gets put on the RBL, hundreds of thousands of people can subsequently be protected. Your idea of the static 1990s page completely prevents that capability: every subsequent user of the site runs the same exact risk.
Putting the advertisers on networks of CDNs that proactively participate in the RBL program does go a long way toward mitigating large malware attacks. It would have helped with Yahoo earlier this month. The spying you're talking about can be done with CDNs right this minute, so I don't see how it's relevant to an idea I came up with in 30 seconds.
Web browsers are not used for static document delivery and rendering anymore. They just are not.
I understand your bitching, but the option is most assuredly NOT going backwards; it's going forwards and creating a new cross-platform design that precisely allows for arbitrary code to be executed in a heavily sandboxed environment. That is what a web site should be. It's the only way to deliver the services people expect in 2014.
If we at least approach it that way from the start, creating an entirely new framework with security as the #1 motive, then we have a chance at least. Otherwise, we are just left with more of the same, which quite frankly gets really fucking old quick.