Don't worry, NuPlayer is sure to have its own unique collection of buffer overflows!
BROP doesn't work against a proper ASLR implementation
Define 'proper'. Re-randomisation after every fork()? Good luck with that. PLTs at random offsets? Sure, if you're willing to pay the overhead of not being able to share any position-independent code between processes.
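To make the trade-off concrete: BROP works because a crash-and-retry server fork()s workers that all inherit one address-space layout, so probes accumulate. Re-randomising means paying for a fresh exec() per worker. A minimal sketch (Python only for brevity; it uses the standard ctypes/subprocess APIs, nothing else is assumed) showing that each exec really does get a new libc base under ASLR:

```python
import subprocess
import sys

# Ask a freshly exec'd interpreter where libc's printf landed.
# Under ASLR the answer changes on every exec; a fork()ed child,
# by contrast, would inherit the parent's layout unchanged, which
# is exactly what BROP's byte-by-byte probing relies on.
PROBE = ("import ctypes; libc = ctypes.CDLL(None); "
         "print(ctypes.cast(libc.printf, ctypes.c_void_p).value)")

addr_a = int(subprocess.check_output([sys.executable, "-c", PROBE]))
addr_b = int(subprocess.check_output([sys.executable, "-c", PROBE]))

print(hex(addr_a))
print(hex(addr_b))  # almost always differs from addr_a when ASLR is on
```

The overhead argument falls out of this: exec-per-request re-randomisation buys a fresh layout at the cost of redoing relocation and losing copy-on-write sharing with the parent.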
During your rant, I couldn't help but think, 'But they DO have a standardized app for accessing all the websites', and it's called the browser!
I think that you're slightly missing the grandparent's point. About 10-15 years ago, there were two groups pushing new directions for the web. One group, led mostly by the W3C (though backed by Apple and a few other big companies) wanted to completely separate content and presentation. You'd have a service that would provide structured XML and then a web page or a native app that would process it and present it to the user. This would make it easy to write programs that aggregated data from multiple sources (e.g. find bus, train and flight times and prices so that you can find out the cheapest or most convenient route from A to B, including getting to and from different airports).
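The first camp's vision is easy to sketch. Suppose two unrelated providers served trip data in a shared machine-readable schema (the feeds and field names below are invented purely for illustration); any client could then aggregate and rank them however the user wants:

```python
import xml.etree.ElementTree as ET

# Two hypothetical feeds from unrelated providers, same schema.
train_feed = """<trips>
  <trip mode="train" from="A" to="B" depart="09:00" price="42.00"/>
  <trip mode="train" from="A" to="B" depart="10:30" price="35.50"/>
</trips>"""
flight_feed = """<trips>
  <trip mode="flight" from="A" to="B" depart="09:15" price="58.00"/>
</trips>"""

def parse(feed):
    # Each <trip> element's attributes become a plain dict.
    return [trip.attrib for trip in ET.fromstring(feed)]

# The aggregation logic lives in the client, not the provider:
# sort by price, by departure time, filter by mode -- user's choice.
options = parse(train_feed) + parse(flight_feed)
cheapest = min(options, key=lambda t: float(t["price"]))
print(cheapest["mode"], cheapest["depart"], cheapest["price"])
```

The point is that presentation and ranking belong to the client, which is precisely what a content-as-app web page prevents.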
The other faction, led by Google, wanted to completely destroy this separation and make web pages into rich web apps that would ensure that you could only view the content in exactly the form that the authors intended. The main goal of this was to make it hard to distinguish content from ads and therefore make it hard to automatically remove ads.
Unfortunately, the second group mostly won. The grandparent seems to want people to go back to the other approach and present machine-readable data feeds so that we can then have rich client-side apps that are agnostic to the source, but present the data as the user wants. I'd like that too.
Governments don't always suck at providing services, you know. The BBC is one of the only major news outlets that does actually try to be unbiased, even if they aren't always perfect at it.
Oh, wait, you didn't need to pass a test for that.
I'm just trying to think how that would have been possible. I think back then there was a medical exception you could plead for. I didn't. I passed the 20 WPM test fair and square and got K6BP as a vanity call, long before there was any way to get that call without passing a 20 WPM test.
Unfortunately, ARRL did fight to keep those code speeds in place, and to keep code requirements, for the last several decades that I know of and probably continuously since 1936. Of course there was all of the regulation around incentive licensing, where code speeds were given a primary role. Just a few years ago, they sent Rod Stafford to the final IARU meeting on the code issue with one mission: preventing an international vote for removal of S25.5. They lost.
I am not blaming this on ARRL staff and officers. Many of them have privately told me of their support, including some directors and their First VP, now SK. It's the membership that has been the problem.
I am having a lot of trouble believing the government agency and NGO thing, as well. I talked with some corporate emergency managers as part of my opposition to the encryption proceeding (we won that too, by the way, and I dragged an unwilling ARRL, who had said they would not comment, into the fight). Big hospitals, etc.
What I got from the corporate folks was that their management was resistant to using Radio Amateurs regardless of what the law was. Not that they were champing at the bit waiting to be able to carry HIPAA-protected emergency information via encrypted Amateur radio. Indeed, if you read the encryption proceeding, public agencies and corporations hardly commented at all. That point was made very clearly in FCC's statement - the agencies that were theorized by Amateurs to want encryption didn't show any interest in the proceeding.
So, I am having trouble believing that the federal agency and NGO thing is real because of that.
Today is a good day for information-gathering. Read someone else's mail file.