Actually, it's not a democracy at all. It's a market. There are two distinct marketplaces: one for developer time, and one for users. Each affects the other. For the most part, developers aren't going to pour huge amounts of resources into a project no one uses. Users can vote with their feet. If enough users are fleeing because of dumb decisions by developers, a fork might happen, or Shuttleworth might back down. See Pidgin/Gaim/Empathy.
The developer arena is generally a meritocracy. If you've shown up and written lots of code, your opinions carry more weight. If you show up and snipe on a list and never contribute, your opinion carries less weight. Users just want code that helps them get stuff done. Whether you deliver that by voting, by fiat, or by some other mechanism, they generally don't care. So a developer can fork a code base, and that forces them to take on the burden of doing a lot of work. If they can attract more users, they'll generally attract more developers. That's exactly what Ubuntu has been doing to Debian (and most other Linux distros, for that matter): they built the stuff everybody wanted to use, so they have scores of users, which makes developers want to hack on it. Debian still has users and developers doing a lot of heavy lifting in development and testing.

XFree86 vs. Xorg was a revolt over license terms by the devs (thanks, Keith P., et al.). A couple of guys decided that what was happening was stupid, so they fled and put up their own playground, bringing over the last copy of the "free" ball. The users followed, and XFree86 has pretty much dried up (I haven't heard anything about it for years). Look at the old gcc vs. egcs and you'll find that the developers got tired of stupidity, and the users followed the best code base (so much so that the "fork" became the "official" version after the dust settled). Look at Lucid Emacs vs. GNU Emacs: there it sounds like there's enough compatibility that a clear winner/loser never appeared. Look at the various BSDs; they all split off due to various issues. NetBSD was the "original" and wants to be really portable, FreeBSD decided to make the fastest x86 version of BSD they could, OpenBSD came about when Theo had pissed enough folks off that they told him to go away, and Dragonfly BSD exists because one guy thought that the threading/process changeover from FreeBSD n to FreeBSD n+1 was a bad idea (I think that was 4->5, but it might be 5->6).
Linux ran roughshod over all of those mostly because Linus was much nicer to deal with, and it wasn't a total distribution. All of the various BSDs are total systems, not just kernels; Linux proper is mostly just a kernel. The reason forks are so uncommon, especially of large/critical components, is the burden of doing all the work, especially as a one-man band. You have to start gathering a bunch of followers to help, or you'll fall behind the base code base, at which point you'll have an old version of the software with a tweak or three. Unless you can attract enough folks to pull over the fixes from the mainline code base, or enough developers that the mainline code base essentially dies, you're an island and an evolutionary dead end for the code base.
Some open-source groups *MIGHT* run their projects as meritocracies or as oligarchies, and some might make choices as "democracies", but I believe you're either abusing the term democracy, or you don't know what it really means.
It's a marketplace; the only questions are: will this bother enough users to get them to move to a different code base? Will this bother enough developers to get them to fork? My hunch is that as long as there exists a theme that moves the buttons back, most people won't care. I use Mac and Linux, and generally I just don't care. My muscle memory is for them to be on the right, but I got over it. Nobody is going to want to set up all of the infrastructure necessary to do a large-scale replacement of Ubuntu over this. It's too big, and too much work. If it were something like Pidgin vs. Empathy, it might happen.
Fair enough. In the end, I think that Linus got this right by saying: if it exists only to work with this GPL'ed software, then it's a derivative work; if it has standalone functionality, then it ain't a derivative work. Which is roughly the resolution to virtually all of the cases described. So let's see if I got the logic on all this right:
Which seems pretty sensible, and appears to be what you're saying came out of the IBM case? I think if you use the "line of thought" from the GPL FAQ about "shared data structures imply derivative works", that also lines up with the above thought process. The FSF might not agree with that interpretation, but it would sure seem a very sensible line of demarcation. I'd have to ponder strange corner cases carefully.
I think you missed what I was trying to say, so here's one more go at it. Replace the legally loaded concept of "derivative work" in the GPL with "morality". The crux of the GPL is to say: "Here are the properties I consider moral, and if your software, when added to these, is still moral, then you have my blessing and permission to redistribute the modified version of the software. If I don't consider your software moral, then you can use your modified software and do anything you want with it, but you can't redistribute it to others." I use "moral" because it'd make RMS happy, not because I in any way agree with the whole "free-software-as-morality" movement the FSF is trying to lead. You could change "moral" to "baz" and "immoral" to "non-baz", and nothing about the discussion changes.
A copyright holder has that right. Now, the GPL might have mixed legal metaphors and whatnot; I'm not a lawyer, and I don't really want to debate the finer points of copyright law, or the finer points of the legal text of the GPL. The crux of what I'm saying is that the legal definition of derivative work isn't the point; what the copyright holder wishes to allow is. The GPL is about giving you rights you otherwise wouldn't have, if the copyright holder approves. The GPL (v2 especially) might have flubbed that, and I really don't care; that's just a "bug" to be fixed by releasing a new license with better legalese. I thought that was one of the reasons GPLv3 moved away from the concept of "derivative work".
I grok derivative works, and could have laid most of what you said out for you. I'm a bit fuzzy on the "collective copyright" vs. individual copyright, and how the details work out there, but conceptually I've got derivative work down. It agrees strongly with what you're saying.
Well, in that context, I want to think differently. I suppose I could be having my cake and eating it too with this interpretation: the "derivative work" here isn't the strictly legal meaning; it's "these are the terms by which I'll extend to you the right to copy my stuff and redistribute it". In that case, I think that RMS is "correct". So if Microsoft put such a term in their EULA or their copyright license, I believe it would be legally binding, assuming I agreed to it.
I mean, they could say: "I consider anyone using this software named 'Bob' to have the right to copy this software and re-distribute it to whomever they want", and that'd be a legitimate (if stupid), license.
I mean, the copyright owner has strict controls over how their works are used. Just ask Disney. I'm not sure if it'd hold up in a court of law if put under strict scrutiny, but I think there is nothing legally stopping RMS from saying exactly what you think is bad, if he assembled the legalese correctly.
For better or worse, I'd like to respect the copyright owner's wishes. If they told me: "You can't ship libpq that way", I'd just get over it and move along. Either write an SSL library that is GPL-compatible, or re-compile libpq without SSL support. It's their ball, and I'm thrilled they let me play with copies of it.
If I left it out of my post, I was trying to cover the case where the NDIS folks make you download their driver. Obviously, if you make the end user do all the work to assemble the pieces, it's all good by the GPL. (The AGPL was constructed to close a similar loophole, where users run the software over a network without it ever being distributed to them.)
I agree with your post. I'm willing to concede that it's plausible NDIS drivers might be found to be a non-derived work, and thus that you are distributing them "in aggregate" (assuming you had permission from the NDIS driver copyright holder). However, even if it were found to be a violation, the only people who could do anything about it are the authors of NDISWrapper, so it doesn't matter. Which was my point. I mean, I could tell a distribution that they are in violation of the terms, but the only folks with legal standing to actually do anything would be the NDISWrapper authors. Which is why it can't be "viral": there is no way the GPL can "attach" itself to the NDIS driver itself. That was what I was driving at.
No, the GPL is not on "shaky legal firmament"; that is FUD. There is tremendous value in GPL'ed software; if there were a hole somewhere in the GPL, don't you think folks would be driving a truck through it to get at that valuable source code? Notice that everybody who has gone up against the GPL has uniformly lost or backed down.
The reason it's "bad" isn't that there is something wrong with the Linux kernel; it's "bad" because it sets a very dangerous precedent. Sort of like saying that "human rights" are something the gov't allows us to have because they feel like it, but they could repeal them any time they feel like it. In the US at least, that's not possible. If somebody said: "You have free speech because SCOTUS feels like allowing it", rather than it being a first principle of US law, that'd be "bad" too. The GPL isn't a virus; it doesn't have magic powers to overtake IP; it's not black magic. If you violate it, you are in violation of someone's copyright. So every claim that the GPL will "steal IP" or "accumulate intellectual rights" is just flat-out wrong. The absolute worst thing that can happen to you is that you're found guilty of copyright infringement and a judge hands down a remedy. That could be a fine, or jail time. I'd be relatively surprised if they forced you to open your source code. However, it's highly likely the remedy would include an injunction against you distributing the infringing work, rendering your IP mostly useless until you replace the pieces of GPL'ed software.
You'll have to actually cite any place where someone from the FSF "acknowledged" that libraries are a problem. The only place I've ever seen anything remotely like that is the "readline" case, where the BSD folks made a binary-compatible API (note: using reverse engineering to duplicate the API, totally avoiding any copyright infringement issues). I've seen folks make an argument there, but if the FSF folks really thought they had a legal leg to stand on, they should have pressed the issue.
Well, if API compatibility is all it takes, then I can say *any* piece of GPL code is an API, and therefore I can re-use it as a library. That's a very, very slippery slope. The only piece that makes it strange is that the GPL has specific and explicit exceptions for OS and system libraries, and Linux obviously is an OS, so it is not entirely clear to me how that would apply. However, you can see the same type of care taken by the FSF with respect to various supporting libraries they ship with GCC: you'll find that the libgcc_s.so.1 library that gets linked in via GCC is "GPL + exception". See this e-mail about the topic, which includes the special exception.
If it is the API nature of the Linux kernel that makes it okay to link with the kernel and not be GPL'ed, that's very, very bad. All I'd have to do is make an API out of any old GPL'ed software I feel like, and then I would appear to be free of any GPL terms. I believe that to be wrong. If it is the fact that it's an Operating System, that's just a flaw in the wording of the license that for the most part doesn't hurt anything. If it is because there's an explicit exception, that's a very good thing.
Linus has also stated publicly that the NVIDIA driver is not a derivative work, because it existed outside of the Linux kernel and runs almost totally independent of any of the Linux internals. (Note that the small section that is Linux-specific is GPL'ed and is distributed in source form. I can find the exact quote for you, if you'd like.) If we applied the same logic used for the NVIDIA driver to a new back-end optimizer for the GNU C Compiler (aka GCC), you can bet that the owners of the copyright (the FSF) would come after you with a vengeance, and I believe they'd be highly likely to win. In fact, the FSF does believe that the NVIDIA driver is a GPL violation (and that many of the other bits of the various firmwares are violations), but Linus is far more practical, and says: "I don't care, you aren't using my code in any way that offends me, and you're making the OS more useful, so more power to you."
You're thinking that the GPL is viral again; it isn't a damn disease. You have to analyze who the copyright owners are, what complaint could be made, and who is allowed to make it. As for the NDISWrapper situation, it would break down as follows: the original driver author (read: the hardware manufacturer) might take legal action against folks distributing their compiled binaries, because that is likely a copyright violation. The person who is using NDISWrapper could realize that the person who distributed the software to them didn't comply with the GPL (because the binary portions were given to them without source in the "preferred form"), but the only person who could take legal action to remedy this situation is the author of NDISWrapper, as they are the copyright owner. Thus it's highly unlikely that there could be legal action. Alternatively, if NDISWrapper doesn't actually ship the binaries for the actual drivers, there is no violation: the end user is doing what is hypothetically a violation of the GPL, but the end user can do absolutely anything they like; the GPL only kicks in when software is distributed.
This is the same loophole that Sun used to make Linux drivers run under Solaris: they forced the end users to actually compile the drivers from source using a shim compatibility layer. Solaris would load those drivers that the end user compiled, and because they were distributed in source form to the end user, there was no GPL violation according to the letter of the license, though it was widely viewed as a violation of the spirit of the license. So if you ship the GPL'ed source/binaries in compliance with the GPL, and a separate piece that can operate on that source/binary, it's no problem (see the GPL FAQ section about "works in aggregate").
I agree with you, but I believe you to be wrong on a technical point. The license applied to the kernel is the GPLv2 with the specific stipulation that crossing the userspace boundary was not considered to create a derivative work by the author. Otherwise, I believe distributing a binary that linked with the Linux kernel would have been a GPL violation (depending on the interpretation of the weird OS/tools libraries "get out of jail free" clause in the GPLv2).
See COPYING from the Linux kernel. The note at the very top spells out the distinction the copyright owner draws.
The thing about the GPL is that it isn't "viral" despite what folks claim. It merely has terms of usage, just like virtually any other software. When found in violation of the terms, the easiest way to comply happens to be to release your source. You could stop using the GPL software and move along. The only person who can take you to court over the GPL is a copyright holder. Your "customers" sure can't. So if Linus says: "I don't consider that a derivative work", in the legal document describing it, he'll have a really hard time telling folks in court: "I think that's a derived work, and they are in violation of my license".
That's not strictly true. If I started out with a movie that was originally "Star Wars", and I slowly but surely removed and replaced every frame of that film, and then saved the film, it would not be considered a derived work of Star Wars. No harm, no foul. If I started out with the Linux kernel, and I released versions 0.1, 0.2, 0.3, 0.4, up to 0.9, and finally released Kirbix at 1.0 and claimed I owned the copyright, I would be obligated to give the source code out for versions 0.1 through 0.9 (assuming I distributed them to anyone), but at 1.0 I'd be well within my rights to re-license the software. Bruce is claiming "compilation copyright", and I'm unfamiliar with the basics of that.
I don't understand what legal principle is being applied to claim some piece of the copyright if I had replaced all of the pieces and parts. The most commonly known situation like this is the old BSD UNIX distribution. Eventually it was determined that UC had sole rights to their copy of UNIX, because they had slowly but surely replaced all of the pieces of AT&T's UNIX. I thought 4.4BSD-Lite was essentially BSD UNIX minus the 7-10 files that AT&T still owned. Eventually those last bits were re-written, and 386BSD and its descendants (FreeBSD, OpenBSD, and NetBSD) were spawned in the late '80s (looks like I might have the timeline wrong, but the salient points at the end of the Wikipedia story linked above show the thrust of this is correct). In my mind that sets a much stronger precedent than what I have seen of Bruce's claims, assuming the "I've re-written every line" claim is true. However, I believe Bruce is probably right that it hasn't all been re-written. Just a hunch. Re-writing "everything" is pretty darn difficult.
Funny, I took off my shoes after 9/11 but before they required it, precisely because my work boots always set off the metal detectors.
Yeah, but Evian water is just an inconvenience, as I'll buy some when I get there (if I drank bottled water). I'm not dropping the money for a laptop on the other side, especially if I can't bring it back with me. I'm highly unlikely to check my $1800 laptop. For my work, I'd just not go on the trip, as me without my laptop has virtually no value. It'd take a half day just to get a machine set up so I could get logged in over the VPN and get all of the tools I need installed.
So for cheap items, there's no big deal, but items that are too expensive to just replace on every trip are likely to cause a much bigger backlash, especially if they affect business people, who generate the bulk of the revenue in flying. Hell, they could tell me I couldn't take clothes except what I had on, and I'd deal with that (assuming I could locate a decent big-and-tall shop in town). Who knows, maybe they'll create a "laptop license", charge $50/year to get it renewed, and have a background check done on it. My work would cover that.
If they do ban them, look for people to start carrying on laptop hard drives, and using laptop rentals. Or a lot more driving than flying.
Unless you have a compiler that can generate meaningful names, you are in trouble: all code must have human-readable and comprehensible names. ANTLR is a great code generator that generates very readable code, but even it has poorly named variables. You can solve the file-history requirement by extracting the commit messages. You can solve the function-call-tree documentation if you write a good parser (a parser for C++ is non-trivial, which is why we didn't do that).
You can write tools to detect a lot of the issues. You can write code formatters to re-flow the code to 80 characters. You might be able to get a very good code re-writer to restructure functions to have only one return statement. But all variable names must be meaningful.
However, documenting the flow of control is beyond any compiler, at least in any human fashion. You can't indent more than 5 levels deep, which is a problem for code generators. Finally, if you use a COTS (commercial off-the-shelf) code generator, its output is acceptable; however, if you use an internal code generator, the output must follow the VVSG guidelines, as I recall.
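A couple of the mechanical rules above (line length, indentation depth) really are trivial to check automatically, which is part of why they're a poor proxy for security. Here's a minimal sketch of such a checker; the 80-column and 5-level thresholds come from the discussion above, while the 4-space indent width and all the names are my own assumptions, not anything from an actual VSTL tool.

```cpp
#include <sstream>
#include <string>
#include <vector>

// One detected rule violation: the line it occurred on, and why.
struct Violation {
    int line;
    std::string message;
};

// Scan source text line by line for two mechanical rules:
// lines over maxCols columns, and indentation deeper than maxDepth levels.
std::vector<Violation> checkSource(const std::string& source,
                                   std::size_t maxCols = 80,
                                   std::size_t maxDepth = 5,
                                   std::size_t indentWidth = 4) {
    std::vector<Violation> problems;
    std::istringstream in(source);
    std::string line;
    int num = 0;
    while (std::getline(in, line)) {
        ++num;
        if (line.size() > maxCols)
            problems.push_back({num, "line exceeds column limit"});
        // Count leading spaces to estimate indentation depth.
        std::size_t indent = line.find_first_not_of(' ');
        if (indent == std::string::npos)
            indent = 0;  // blank (or all-space) line
        if (indent / indentWidth > maxDepth)
            problems.push_back({num, "indented too deeply"});
    }
    return problems;
}
```

The single-return and meaningful-names rules are exactly the ones a tool like this can't judge well, which is the author's point about generated code.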
They very much frown upon any code that is generated by an internal tool. We used a code generator to parse the XML documents, and used some tools for re-formatting some Java. However, C/C++ is extremely difficult to get automated tools to work with (macros and #includes are non-trivial to deal with).
Code that is automatically translated from Python to another language will never pass compliance. You might be able to resolve trivial issues.
I completely and totally agree with the notion that those rules are stupid. However, most states use Federal funding for the purchase of hardware for elections. Once that is done, you must be certified by the FEC, and you must follow the above guidelines. Unless your state officials want to break Federal laws, or can find all the money for it from non-Federal sources, those rules will have to be followed. It's not like you can use an off-the-shelf computer, and the hardware is only used once, maybe twice, a year. You'll need hardware that refuses all external input except for the types of storage you plan on using to transport the votes from a machine. Even if all of the software from this open-source effort is secure, they will still need secure hardware. The problem is that you send everyone into a booth alone with the machine, where they have total access. Securing the machine is actually extremely difficult.
From what I know of the states and counties, they all use Federal money. Everybody who took HAVA money has to follow both those rules and the RoHS rules for the hardware (RoHS — I might have the acronym wrong, but it's the environmentally friendly hardware rules for when you go to dispose of it: no lead, etc.). Even most states defer to the FEC to set testing guidelines, and most states will refuse anything that does not pass the VVSG hardware and software guidelines.
You can't run an election without a scanner of some sort; you'll need a Scantron-type solution for a state-wide vote. You can't run those any other way. If you say "DRE", I'm going to smack you; even the ones with paper trails are stupid. Scantrons to count, and paper ballots, are the only way, unless we hand count (which I've got no problem with, but the computers generally do a better job, especially if you want to do accurate stats for funding of parties). Once you start doing Scantrons it will require custom hardware, and the state will be incapable of dealing with it.
I think it would be great to require a security review from real security folks. The problem with most of the VSTL employees I've dealt with is that they aren't capable of getting a paying programmer's job; that's why they review someone else's code. We tried fairly hard with the stuff I worked on. We used Linux, booted from a "known" Live CD, and had a completely scripted build from source code. With the exception of the RSA crypto library and the JDK/JRE (because we couldn't prove OpenSSL was FIPS 140-2 compliant on our OS and hardware), everything was built in front of an election official. We built the entire toolchain, which would then build absolutely everything that was installed on the firmware. For a "real" security review, we had almost everything. If OpenJDK had been released at the time, we would have built the JDK/JRE from scratch also.
The stuff I worked on could have been hacked, especially if the source code ever leaked. Not that it was blatantly insecure, but like most code written, it had bugs and flaws that more eyes would catch. We generally did a good job using constructs that avoided buffer overflows (we avoided most C in favor of C++ where possible). The problem was the size of the programming team (I'm guessing that maybe 5-6 full-time programmers worked on the system that counted a significant fraction of the votes in the 2000 and 2004 elections). I left because of the dysfunction inside the company from dealing with Federal crapola. I just hated the code I had to write, and I hated how old and antiquated the rules I had to follow were. Still, it was a fun gig, and I liked that I got to contribute to cleaning up some of the problems folks have with electronic voting. I took it very seriously.
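To illustrate the "C++ over C" point about avoiding buffer overflows: the classic hazard is a fixed-size stack buffer filled by an unbounded copy. A hedged sketch, with made-up names (nothing from the actual voting system):

```cpp
#include <cstdio>
#include <string>

// C style: a fixed-size buffer. A plain strcpy here would overrun the buffer
// for any candidate name longer than 15 characters; snprintf bounds the copy
// (truncating long input) and always null-terminates.
std::string formatBallotLineC(const char* candidate) {
    char buf[16];
    std::snprintf(buf, sizeof(buf), "%s", candidate);
    return buf;
}

// C++ style: std::string manages its own storage, so this whole class of
// bug simply can't occur.
std::string formatBallotLineCpp(const std::string& candidate) {
    return candidate;  // no fixed buffer anywhere to overflow
}
```

The C++ version isn't just shorter; it removes the decision point (buffer size, truncation policy) where the bug would otherwise live.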
I agree with you that the solution is to update the rules to involve actual security. The problem is you quite literally can't. There are lots of "rules of thumb", but if there were actual rules to follow, we wouldn't need security professionals; we'd just write a compiler that understood the rules. These rules exist as a proxy: each one of them was added to mitigate some problem they had before. The fix would be to require more security reviews by appropriate security professionals (who are hard to come by), but most gov't officials don't have a clue about most of these topics, so it is exceedingly difficult to convince them that this structure they cling to has to be changed.
How many hardware guys does it take to change a light bulb? "Well the diagnostics say it's fine buddy, so it's a software problem."