I had not considered "polyfills". However, if they are critical to site operation then you have no choice but to load them on the pages that need them. I'm not saying there should be a hard 250k limit or anything, just that for normal operation you should aggressively try to limit the size.
Personally, I like to inject scripts on demand. When I need one of those polyfill libraries I can inject the script beforehand. If the user never performs an operation requiring it, it never gets loaded.
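A minimal sketch of that on-demand pattern (my own illustration, not any specific library): inject a `<script>` tag the first time a feature is needed, and cache the promise so the same file is never injected twice. The `doc` parameter is only there so the logic can be exercised outside a browser.

```javascript
// Cache of in-flight/completed loads, keyed by script URL.
const pendingScripts = new Map();

function loadScript(src, doc = document) {
  // Return the cached promise if this script was already requested.
  if (pendingScripts.has(src)) return pendingScripts.get(src);
  const promise = new Promise((resolve, reject) => {
    const el = doc.createElement("script");
    el.src = src;
    el.async = true;
    el.onload = () => resolve(src);
    el.onerror = () => reject(new Error("failed to load " + src));
    doc.head.appendChild(el);
  });
  pendingScripts.set(src, promise);
  return promise;
}
```

A click handler can then do `loadScript("/js/charting.min.js").then(drawChart)`, so users who never open the reporting page never download the chart library at all.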
I don't claim to be a front-end expert at all. In the last two years, though, I've been forced to step up and build more and more of the front-end infrastructure due to costs (which is terrible; I'm overworked). For what I need, the minimal jQuery build has given me everything, and I inject the chart and graph JS only on the specific pages that need it.
How does images not from the domain (or a registered CDN) failing to load make essential web applications useless? That sounds like hotlinking of images, which I've always understood to be a dickhead move on the part of web designers.
That also comes back to the JavaScript, because I cannot understand putting all of your JS on CDNs that you are not paying for. Worse, I see developers all the time hotlinking scripts from other blogs and developers. If you pay for the CDN, use it as you see fit. If it's something you're not paying for, don't be a dickhead. Common courtesy, from my point of view.
In some cases I can understand that hotlinking may be required to offer a service, or to normalize images and scripts among all of a service's clients. However, the vast majority of this activity is highly undesirable from the user's point of view. I use DoNotTrackMe and Ghostery precisely for that reason.
I'm not interested in facilitating mass violations of privacy for marketers. They can DIAF repeatedly. I really believe that content and scripts should be delivered from the site's own domain, or from CDNs it is paying for. Other CDNs are too risky and don't offer the minimum level of reliability that I expect.
Keeping strictly to this requirement would solve a lot of problems. Site owners and designers would be forced to host and review the content they are pushing out. No more shifting blame to a third-party company, and it creates accountability that helps users.
If they really want that beacon or tracking technology, then download the script to your own servers or CDNs. While I know many designers would just cron the damn thing to sync it, it still creates accountability.
As for registering the CDN, I think you misunderstood me. The site itself would register a CDN as part of the domain through instructions in the XHTML. I believe this would simplify the workflow and let you swap out a CDN without touching a single line of code in the rest of the site.
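Something like the following is what I have in mind — the syntax is entirely made up for illustration; no browser supports a `rel="cdn"` link type or a `cdn:` scheme today:

```xml
<!-- Hypothetical registration block; the rel value, id scheme, and
     cdn: prefix are invented to illustrate the idea. -->
<head>
  <link rel="cdn" id="assets" href="https://cdn.example.net/" />
</head>
<body>
  <!-- The browser resolves this against the registered CDN, so swapping
       providers means editing one <link>, not every page on the site. -->
  <script src="cdn:assets/js/charts.min.js"></script>
</body>
```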
You could register several different kinds of objects with the browser itself. Off the top of my head: CDNs, external widgets, beacons, trackers, etc. Even better, you could register known external objects that are community approved, meaning your XHTML would not have to reference the exact Google Analytics script but a common reference point that lets Google update that code to whatever they want. Registration would cost nothing, and if an object is widely used enough, the community itself can add it to the list of common objects to make life easier for all designers.
Having that would let the community manage RBLs for objectionable content. Not to mention that with a single preference click you could disable all beacons and trackers, and privacy laws could be amended to make bypassing those browser preferences illegal. Wasn't Google guilty of doing exactly that at some point? There are very few legitimate exceptions to disabling trackers. Netflix loading the FB Connect code to enable certain features is one: Netflix could detect that it was blocked and pop up a dialog informing the user that certain features will now be missing AND listing them. I know some people want social media to be omnipresent on every site they visit. This still allows that, but it strongly categorizes all of these activities in a way that is transparent to all users.
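As a sketch of the preference model I'm describing (all of the names, categories, and the `well-known:` references are invented), the browser-side registry could look like this: every registered external object carries a category, and a single preference toggle blocks a whole category while the site can still see what got blocked and explain what breaks.

```javascript
// Hypothetical registry of externally loaded objects, each tagged with a
// category the user can block wholesale.
const registry = [
  { id: "assets",    category: "cdn",     ref: "https://cdn.example.net/" },
  { id: "analytics", category: "tracker", ref: "well-known:google-analytics" },
  { id: "fb",        category: "widget",  ref: "well-known:facebook-connect" },
];

// One preference click: block every tracker and beacon.
const prefs = { blockedCategories: new Set(["tracker", "beacon"]) };

// Objects the browser will actually fetch under the current preferences.
function allowedObjects(registry, prefs) {
  return registry.filter(o => !prefs.blockedCategories.has(o.category));
}

// What the site can report back to the user ("these features will be
// missing"), the way the Netflix/FB Connect example would need.
function blockedObjects(registry, prefs) {
  return registry.filter(o => prefs.blockedCategories.has(o.category));
}
```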
It also provides a strong level of security: when the community finds that somebody is injecting a blocked beacon or tracker from their own domain or CDN, they get added to the RBL and disabled everywhere. That's a really strong deterrent, and well-operated RBLs can be fair and impartial. While I've heard nightmares about RBLs, I've always found the major ones to be run well, with ample ability to get yourself removed. Plus, listing would only happen if the RBL found you being a dickhead, not on faked user complaints. Legitimate malware reports and tracking from security companies could let many large-scale malware attacks be stopped in their tracks. Certainly a more comprehensive and proactive approach.
I realize this would upturn entire marketing industries, but quite frankly, fuck them. If the user base sends out a message loud and clear that says, "Don't Track Me", I don't see why marketers get to keep violating privacy. DoNotTrackMe and Ghostery are not going anywhere and are in fact growing.