In the past, publishers did important work. Today, they should be replaced by digital tools. And they will. In time.
Yet, a HW guy (EE major) has arguably the same skills, foundation, and fundamental knowledge as a SW guy (CS major). It's really just that EE guys don't like doing CS, but they could do it easily. I went to a top-tier uni with EECS as a combined major (and this is common at the top unis). I can tell you the top EE guys aced their CS classes easily and beat out the top CS guys, or were on par with them.
Wrong. Like Dilbert's PHB, you assume that stuff you don't know is easy, based on the initial learning curve. Granted, churning out a couple-thousand-line coding project is easy. That's the algorithmic training, and that is the easy part. Many kids learn that on their own well before their teen years. The difficult part is the system design: the cobbling of many software pieces to work together coherently; the design of clean abstraction layers (non-leaky abstractions are perhaps one of the most difficult engineering tasks); the design of stability pillars for maintainability (test frontiers, on APIs or on abstraction layers, documentation).
To put it into familiar terms, it's as easy for you to code as it is for me to program an FPGA, stuff it into a breadboard, plug in sensors and actuators, and produce a prototype. However, CS guys know that they will never ever design a decent electronic circuit for mass production, even if they can tinker with EE stuff. EE guys will, as you do, boast of being able to design a decent software stack for mass usage. They can't (mass generalization). Not without relevant training. Information systems can get pretty darned complex, and just because they don't burn on failure or crash into rubble doesn't make software design any less difficult, just less appreciated.
The EFF is already on top of the issue. Use their form to contact your representative.
Oh, and for the mythical cherry on top of the cake, the proposal is supported by Facebook.
First point: this didn't pass into law. After active online campaigning, the law was shelved, will be revised, and may (probably will) be resubmitted for approval in the future.
Second point: this wasn't a "pure" tax. It is a compensation scheme to pay back artists for "lost revenue" due to the private copy law. The private copy law (already in effect) allows citizens to copy artists' work for private use. To give an example: ripping a CD I own and copying the mp3 onto my iPod is perfectly legal under the private copy law. Artists somehow believe I would buy the same art twice (once on CD and again as mp3), so they cry out for compensation. In fact, a compensation scheme already exists (under the old private copy law), which levies blank media (tapes, CD-Rs), but it did not include hard disks and solid-state storage. This law was meant to update the old compensation scheme. It's stupid, designed by someone who doesn't know Moore's law, and was rightfully bashed in public.
I just hope it doesn't reappear in camouflage. Or whenever people are distracted by something else. The law is stupid beyond belief.
a) Statically link every library that may be problematic, or dynamically link against libraries shipped with the software. That is exactly what Windows and OS X apps do; or
b) Use the distribution package managers, something sorely missing from Windows and OS X. State your dependencies. Granted, you'll have to support two different package formats (.deb and .rpm).
You are complaining that option b) is a pain. I disagree, but even if I'd grant you that point, you overlook the presence of option a) for release. I can't fathom how having a new avenue for releasing software can be construed as bad.
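To make option b) concrete, here is a rough sketch of what stating dependencies looks like in a Debian package's control file; the package name, version, and the exact version bounds are hypothetical, chosen only for illustration:

```
Package: myapp
Version: 1.0-1
Architecture: amd64
Depends: libc6 (>= 2.17), libssl1.1 (>= 1.1.0)
Description: Example application
 The package manager reads the Depends line and resolves and
 installs those libraries automatically; the application never
 has to bundle them.
```

The point is that the dependency burden shifts from the developer (bundling everything, option a) to the distribution (resolving the Depends line, option b).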
Time, and the evolution of science it brings, is an amazing thing. Once, developing a nuclear bomb required gathering a super-team of scientists, closed off on a campus, inventing bleeding-edge chemistry and physics. Today, the workings of nuclear bombs are considered trivial by scientists in the area.
Dijkstra's algorithm is, by today's standards, pretty trivial. It's essentially a breadth-first search over a graph, ordered by path cost. Simple enough to be completely described in a single sentence. I'm not denying the brilliance of the scientist, I'm merely stating that software has evolved in the last decades.
If the algorithm is simple enough to be described in one sentence, a software engineer, self-taught or not, had better be able to get there when asked for a solution to the problem. Again, knowing it by name is irrelevant, but when asked for a shortest-path algorithm for a graph, any half-decent developer should be able to reach a reasonable approximation.
Frequently, software development isn't about what you know... it's about what you don't know and how quickly you can learn it.
That may be true, but not in Dijkstra's case. Dijkstra is quite simple. If, in an interview, I ask the candidate to design an algorithm for a shortest-path search in a graph, and he does not eventually arrive at a cost-ordered breadth-first traversal (Dijkstra) or a well-designed depth-first alternative, he's out. It's not about the formal knowledge as much as it is about the thought process. A programmer who can't pass this test will surely design problematic software (on many fronts, like performance or security).
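The interview exercise above fits in a few lines of Python. This is a minimal sketch; the adjacency-list format and the function name are my own choices, not anything from the thread:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source, for non-negative edge
    weights. graph maps each node to a list of (neighbor, weight)."""
    dist = {source: 0}
    heap = [(0, source)]                # priority queue of (distance, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                    # stale queue entry, already improved
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd            # found a shorter path to v
                heapq.heappush(heap, (nd, v))
    return dist
```

With unit weights the priority queue degenerates into plain breadth-first order, which is why the two are so closely related; the heap is what handles arbitrary non-negative weights correctly.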
The vast majority of climate scientists disagree with the 16 non-climate scientists cited by the WSJ.
Please refrain from using democracy as a scientific metric. The "vast majority" of scientists have been wrong many, many times throughout history. A majority is no measure of scientific quality; otherwise we'd still study geocentric models of the universe.