Yes, because I would just love having to go through regulatory channels and potentially paying fees in order to publish software that I don't even make any money from.
Depends on the regulations. Imagine something like: "Commercial software must pick one of the following five standard commercial licenses."
You are then perfectly free to make money from your software. Pick whichever one of the standard licenses suits your purpose and carry on. But what you cannot do is employ a lawyer to invent a creative way to screw your users in the fine print. If you do, your license is automatically torn up and replaced with something sane.
I wonder if anyone actually takes the responsibility to do this check. Maybe there are GCC binaries in the wild which replicate a backdoor.
Even if there were, you need only recompile your gcc source with llvm, icc, Visual Studio, or basically anything that isn't gcc to get a new compiler that won't replicate the backdoor any more. For extra fun, randomise the chain of which compiler compiles which, so that even backdoor reinsertions that cross the vendor boundary will eventually fail. Or write your own C++ interpreter in Python/Perl/whatever and use it to (very slowly) run gcc on itself - even if it takes a week you'll have a clean binary at the end. Yes, hiding such a backdoor seems scary to the untrained eye. It's also trivial to get rid of if you're paranoid enough to care.
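The cross-compiler check described above (essentially "diverse double-compiling") can be sketched with a toy model. Everything here is illustrative - the dict-based "binaries", the `run_compiler` helper, and the compiler names are assumptions, not real toolchain behaviour:

```python
import hashlib

COMPILER_SRC = "int main() { /* the compiler's own (clean) source */ }"

def run_compiler(compiler_bin, src):
    """Toy model: a 'binary' is a dict. A backdoored compiler silently
    re-inserts its backdoor whenever it compiles the compiler's own source,
    even though that source is clean (the classic trusting-trust trick)."""
    backdoor = compiler_bin["backdoored"] and src == COMPILER_SRC
    return {"backdoored": backdoor,
            "hash": hashlib.sha256((src + str(backdoor)).encode()).hexdigest()}

suspect_gcc = {"backdoored": True,  "hash": "?"}   # the binary we don't trust
trusted_cc  = {"backdoored": False, "hash": "?"}   # llvm, icc, a slow interpreter...

# Stage 1: build gcc from its (clean) source with the unrelated compiler.
stage1 = run_compiler(trusted_cc, COMPILER_SRC)

# Stage 2: use that stage-1 gcc to rebuild gcc from the same source, and
# compare against the suspect binary rebuilding itself from the same source.
via_trusted = run_compiler(stage1, COMPILER_SRC)
via_suspect = run_compiler(suspect_gcc, COMPILER_SRC)

print(via_trusted["hash"] != via_suspect["hash"])  # True -> backdoor exposed
```

The point of the toy: identical source plus a deterministic build should give identical binaries, so any divergence between the two stage-2 outputs can only have come from the suspect binary itself.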
If you want "networked" configuration nodes, an isolated network should be the only thing accessing the equipment. Those nodes should not access anything else, or any other networks...
Because those experts are morons. That advice ignores the economic cost of companies having to run a separate parallel Internet. Take electricity suppliers that need to monitor and control remote switching devices, for example. GSM/CDMA networks are just there, already deployed by the telecommunications industry. A cheap GSM modem and an account with the local telecoms supplier is an economically better way to contact remote stations than running one's own wires out to single-point stations in the suburbs and the bush.
Isolated networks also don't work. Putting a dodgy default-passworded device on an internal network doesn't work when your attacker walks up to the remote station, cuts off the padlock, and installs their own device straight onto your wide-open "no one could possibly hack this because it's disconnected" network. Which is basically how Stuxnet got deployed - direct intervention onto a private network at a weak point.
This problem cannot be solved with simplistic "if you don't want people to hack it, don't connect it to the Internet" solutions. How about building it to be difficult to hack in the first place? Or making VPN layers the default way the Internet works rather than an afterthought? Or teaching (mostly non-software) engineers security techniques that were honed over decades of fighting malware on the open Internet? Or any of a million other practical solutions that don't boil down to "la la, I can't hear you so you can't hear me".
As an Android and iOS developer, I find it tough to support all possible screen sizes, aspect ratios, hardware specs, and versions of Android. Without a newer version of Android (>= 4.0) you miss a lot of features that people have come to expect, and your code gets riddled with backwards-compatibility stuff just to support Gingerbread, or worse (i.e. Donut).
And none of this would be a problem if PBS would simply publish the specification for whatever JSON/XML/etc back end they are using to transmit information to the clients about shows and episodes, and use standard RFC-compatible video formats and streaming protocols with no DRM or other nonsense.
Why would it not be a problem? Because the next day the app stores would be full of "SparkleVideoPlayer now supports PBS!" updates for all of the existing streaming video apps and their loyal users. Or if my screen size, aspect ratio, blah, blah, blah is not supported, I can write my own app!
I can understand why the commercial TV outfits want to control everything - they think it's the only way to poison the experience with ads. But why are public broadcasters like PBS, the BBC, and Australia's ABC doing the same thing? It's idiotic - the solution to "how do I support a million devices" is simple: "publish the spec so that the taxpaying public can write their own apps".
Incredible amounts of money and aggravation are wasted every year on this leftover from the age of agriculture.
Says someone who has no idea where their food comes from. Hint: agriculture.
Here's one simple example: Every morning the cows come in around dawn to be milked. Several hours later the milk tanker arrives to collect the milk and take it to the bottlers to get it ready to put on the trucks to go to the supermarket for you to buy tomorrow.
The cows will come in a little later in winter. Which pushes the schedules for the tanker drivers and bottlers back by an hour. Now the bottlers who used to work 9-5 are working 10-6. Also shifted are the truck drivers going to the supermarkets. And the stockists in the stores. And so on.
Do you really think it is a good idea to force millions and millions of low-paid truck drivers, milk bottlers, and cheese churners to work idiotic shifts and see their families even less just so that you can avoid having to change your office-worker watch twice a year? There are more people in society than you post-industrial types.
When building a bridge to take a 10 ton load, you'd better use 15 ton beams just in case one is under spec. When building a circuit to switch at 10 MHz, use components rated for 12 MHz just in case one is under spec. It's called "tolerances"; it underpins all of engineering, and it's a great idea in fields where, once the thing is built, the requirements generally stop changing.
Except in software engineering. Tolerances in software are called "fudge factors" or "heuristics", and they always result in unmaintainable spaghetti as requirements constantly change over time.
I use the term "Software Developer" for myself because I refuse to "engineer" software; i.e. build crap. My most recent contract involved fixing problems in embedded systems code written by an EE major. Total nightmare - no unit tests, no code comments to speak of, mysterious algorithms with no explanation of where they came from, references to "see datasheet" for a component that was used three board revisions ago but not any more, and so on. The circuit? An absolutely beautiful example of balancing requirements and managing tolerances. But the code to run the circuit was rubbish that would get stamped "go back and do that again" by the code reviewers in any software development shop.
The ironic thing is that the term "Software Engineer" was coined to give developers an air of professionalism. Perhaps the engineers could learn something about professionalism from the developers instead? Like how to design a system that won't fail the minute the requirements change.