W3C Patent Board Recommends Royalty-Free Policy 119

Posted by Hemos
from the making-the-smart-choices dept.
Bruce Perens writes "A year ago, the World Wide Web Consortium proposed a policy to allow royalty-generating patents to be embedded in web standards. This would have been fatal to the ability of Free Software to implement those standards. There was much protest, including over 2000 emails to the W3C Patent Policy Board spurred on by a call to arms published on Slashdot. As a result of the complaints, I was invited to join W3C's patent policy board, representing Software in the Public Interest (Debian's corporation) -- but really the entire Free Software community. I was later joined in this by Eben Moglen, for FSF, and Larry Rosen, for the Open Source Initiative." Bruce has written more below - it's well worth reading.
After a year of argument and see-sawing, W3C's patent policy board has voted to recommend a royalty-free patent policy. This recommendation will be put in the form of a draft and released for public comment. There will probably be a dissenting minority report from some of the large patent holders. Tim Berners-Lee and the W3C Advisory Committee, composed of representatives from all of the consortium's members, will eventually make the final decision on the policy. My previous interaction with the Advisory Committee and Berners-Lee leads me to believe that they will approve the royalty-free policy.

The policy will require working group members to make a commitment to license essential claims royalty-free - those claims which you cannot help infringing if you are to implement the standard at all. There is also language prohibiting discriminatory patent licenses. The royalty-free grant is limited to the purpose of implementing the standard, and does not extend to any other application of the patent. And there is a requirement to disclose whether any patent used, even a non-essential one, is available under royalty-free terms, so that troublesome patents can be written out of a standard. The limitation of the scope-of-use on patents, and some other aspects of the policy, are less than I would like, but all that I believed we could reasonably get. Eben Moglen may have some discussion regarding how GPL developers should cope with scope-of-use-limited patent grants from other parties. For now, it should suffice to say that while this is less than desirable, it will not block GPL development.

I'm not allowed to disclose how individual members voted, but I'll note that the vote did not follow "friends-vs-enemies" lines that the more naive among us might expect - so don't make assumptions.

Now, we must take this fight elsewhere. Although IETF has customarily been held up as the paragon of openness, they currently allow royalty-bearing patents to be embedded in their standards. This must change, and IETF has just initiated a policy discussion to that effect. We must pursue similar policies at many other standards bodies, and at the governments and treaty organizations that persist in writing bad law.

For me, this process has included two trips to France (no fun if you have to work every day), an appearance at a research meeting in Washington, a week in Cupertino, innumerable conference calls and emails, and upcoming meetings in New York and Boston. That's a lot of time away from my family. Larry Rosen has shouldered a similar burden while nobody has been paying him for his time and trouble, and Eben Moglen put in a lot of time as well. Much of the time was spent listening to royalty-bearing proposals being worked out in excruciating detail, which fortunately did not carry in the final vote. We also had help from a number of people behind the scenes, notably John Gilmore, and the officers and members of the organizations we represent.

I'd like to give credit to HP. Because I was representing SPI, and HP had someone else representing them at W3C, I made it clear to my HP managers that they would not be allowed to influence my role at W3C - that would have created a conflict-of-interest for me, as well as giving HP unfair double-representation. HP managers understood this, and were supportive. During all but the very end of the process, HP paid my salary and travel expenses while they knew that I was functioning as an independent agent who would explicitly reject their orders. Indeed, HP allowed me to influence their policy, rather than the reverse. This was the result of enlightened leadership by Jim Bell, Scott K. Peterson, Martin Fink, and Scott Stallard.

For most of the existence of Free Software, technology has been of primary importance. It will remain so, but the past several years have seen the emergence of the critical supporting role of political involvement simply so that we can continue to have the right to use and develop Free Software. I do not believe that we will consistently be able to code around bad law - we must represent what is important about our work and involve ourselves in policy-making worldwide, or what we do will not survive. I hope to continue to serve the Free Software Community in this role.

Respectfully Submitted

Bruce Perens
"
  • Justice prevails (Score:5, Informative)

    by Compact Dick (518888) on Monday October 07, 2002 @07:32AM (#4401865) Homepage

    I have been waiting long for this, and I'm glad it didn't turn out the other way round.

Those who were involved in the outcry will recall that the proposal for RAND [non-free] standards was made in a rather suspicious manner. There was no announcement on the W3C's front page [I visit it regularly]; instead, a proposal for RAND was quietly drafted, a ludicrously short deadline for feedback set, and a mailing list [w3.org] for the same created. Why, then, is it surprising that only thirteen posts were recorded until a week before the initial deadline, ten of them spam?

    Then the story broke [on The Register, IIRC] and the mails flooded in. And what a flood it was *smiles* - 755 in Sept and 1686 in Oct. My mailbox was getting a good beating.

    Many voiced their opinions strongly, and with an exception or two [one of which was obvious astroturfing], they were all soundly against the inclusion of non-free patents in W3C standards [check out the archives [w3.org] and spot the famous names]. Under this tremendous pressure, the W3C had little alternative but to extend the deadline. I am sure certain * ahem * special interest groups were disappointed - but hey, it's for the best. Really.

    And now we have this. Brilliant. Common sense and justice have won this round.

    Special thanks to Bruce Perens, Daniel Phillips, Adam Warner and Gervase Markham for their dedication to this cause.
  • Re:Free Software (Score:5, Informative)

    by Anonymous Coward on Monday October 07, 2002 @07:36AM (#4401883)
    That is not that strange. He has expressed his views on Open Source and Free Software in a short paper entitled "It's Time to Talk about Free Software Again".

    It must be on his website somewhere, but you can read an early version [debian.org] in the Debian Developer archives.

    The important paragraph in that article is the following:

    Most hackers know that Free Software and Open Source are just two words for the same thing. Unfortunately, though, Open Source has de-emphasized the importance of the freedoms involved in Free Software. It's time for us to fix that. We must make it clear to the world that those freedoms are still important, and that software such as Linux would not be around without them.
  • by IamTheRealMike (537420) <mike@plan99.net> on Monday October 07, 2002 @07:37AM (#4401893) Homepage
    ... the W3C has been a group without any kind of power for a long time. They've been suggesting technical web standards since the Web began, but they've been largely ignored for at least the past 3-4 years.

    What evidence do you have to support that statement? Let's see, specifications of the W3C that are widely used:

    XML: uh, yeah. Ignored?

    All the other bits that come with XML - XPath, XSLT, DOM and so on

    HTML4 - yes, this is a standard, yes people frequently break its rules but html4 is a standard that allows for that to some extent. It's implemented in every major browser (ignoring bugs).

    CSS - lots of sites use this

    SOAP? No, it wasn't "invented" by the W3C, but the W3C accepts other people's technologies as well as inventing its own, hence this story.

    The W3C is producing some of the most thorough and powerful technical standards around. They are very readable and well organized (if you don't believe me try reading some specs from ECMA, or the IETF which still does not use rich text in its specs). They have a long term vision - the semantic web.

    To be honest, the W3C is one of the most important standards bodies around; if they didn't exist and hadn't sorted out the browser wars, today the web would be totally screwed over. I'd like to say a huge thank you to Bruce: anybody can sit back in their chair and write a new MP3 player, but it takes real dedication and energy to travel the world sitting through meetings with corporate execs and fighting for our cause when all you have is the strength of your argument to back you up.

  • by NineNine (235196) on Monday October 07, 2002 @07:48AM (#4401939)
    Not a single thing that you listed has been implemented as per their specs. There are still two different browsers, with each one only supporting the various technologies partially. There's no consistency between the browsers (still), and there's probably not going to be. They may have *ideas*, but the technical specifications are simply not implemented. Hell, I've got an open issue in Bugzilla that is a W3C spec that Mozilla doesn't support, and it's been open for close to a year. There's clearly no kind of real, pressing reason for software developers to design according to the W3C specs. The W3C has no teeth. The best they can do is throw something out there, and cross their fingers.

    On top of that, I gotta say that from what I've read, these various technologies would have happened with or without the W3C. And you didn't list the hundreds of other specifications that they wrote that are simply not implemented anywhere.
  • by IamTheRealMike (537420) <mike@plan99.net> on Monday October 07, 2002 @08:04AM (#4402005) Homepage
    Not a single thing that you listed has been implemented as per their specs. There are still two different browsers, with each one only supporting the various technologies partially. There's no consistency between the browsers (still), and there's probably not going to be.

    Only 2? There are loads of web browsers. IE and Mozilla, Opera, Konqueror, iCab, gtk-html etc. Virtually all of them save IE and Opera implement the specs pretty well. IE just suffers from a lot of bugs - you'll notice in IE6 one of the "new features" was a modicum of standards compliance. Yes, there are bugs in browsers. Wowee, the programmers made some mistakes. It happens, these are not simple technologies. IE has more bugs than it should do, but they seem to be getting their act together to at least some extent.

    The W3C specs are typically complex - they do pretty advanced stuff. A complete vector graphics language, anybody? That's damn cool, but a lot of work. Yet it's getting done nonetheless: Moz has its own native SVG implementation and Konqui supports it too.

    There's clearly no kind of real, pressing reason for software developers to design according to the W3C specs. The W3C has no teeth. The best they can do is throw something out there, and cross their fingers.

    Sure there is - interoperability. Hence the fact that all web browsers attempt to use the same technologies. Some manage better than others of course.

    On top of that, I gotta say that from what I've read, these various technologies would have happened with or without the W3C. And, you didn't list the hundreds of other specifications that they wrote that are simply not implemented anywhere.

    The point of the W3C is not to be a research institution. There was structured markup before XML, there was hypertext before HTTP, there were vector graphics before SVG. But people are using these specs regardless, because the value of interoperability is high. That last sentence is provably false: for a specification to reach "W3C Recommendation" status there must be at least one, and often more than one, implementation. Don't make the mistake of assuming that all their specs are meant for the web browser, or even the web.

  • by Zeinfeld (263942) on Monday October 07, 2002 @08:35AM (#4402162) Homepage
    The W3C is producing some of the most thorough and powerful technical standards around. They are very readable and well organized (if you don't believe me try reading some specs from ECMA, or the IETF which still does not use rich text in its specs). They have a long term vision - the semantic web.

    W3C is certainly not under any challenge from the IETF. Apart from Cisco, there are very few vendors who take their proposals to IETF by choice these days. It simply takes too long to get anything done, and the IETF rules allow far too much scope for individuals with an agenda to delay the process until the rest of the group gives in.

    W3C is under challenge from OASIS, however. It can take over a year just to get a W3C group formed; you can get the spec completed in the same time at OASIS. The other issue is cost: W3C charges $50,000 a year for membership, while OASIS is only $10,000 for the top membership tier. That makes a big difference when it comes to getting customers involved. Few customers want to pay $50K for 4 years to influence the direction of a technical spec.

    Semantic Web is not that popular with the W3C membership. Every time members suggest new work items there are attempts to align them with RDF. Now I don't have a problem with Tim's goal, but I don't think a rehash of Lenat's cyc project is the answer.

    The attempt to get consistency across standards is good in theory, but the problem is that the membership don't get much input in the direction of that consistency. For example, XMLQuery was proposed as an XML-based interface to SQL. I can see a case to support that as a legacy issue, but since then we have been having W3C people asking us repeatedly why we are not using it. I have zero interest in using XMLQuery and will take my specs elsewhere rather than have it pollute my spec. SQL is a legacy data model that we are trying to leave behind; insisting that everything be based on it is as clueless as demanding that every spec be easily implemented in COBOL.

    The W3C handling of its patent policy has not been competent. On occasions people have been flying to WG meetings and the patent terms of the meeting have changed while they were in mid air. The royalty-free issue is nowhere near as simple as the likes of Bruce Perens would have people believe; life is always simple for ideologues because they measure their achievement in terms of their commitment to their ideology rather than by actual results.

    The IETF policy that Bruce had a go at is actually the most pro-open-source around. Basically it says that specs should not be encumbered by patents unless there is a really good reason. The last really good reason that was allowed was to use public key cryptography, without which we could not have written the PGP and S/MIME specs at IETF.

  • Thank You (Score:5, Informative)

    by greenhide (597777) <jordanslashdot@c ... DENom minus poet> on Monday October 07, 2002 @08:54AM (#4402293)
    I just want to express thanks to all those in the Free Software movement who *are* politically motivated enough to take these sorts of efforts on behalf of all of us.

    If we had W3C standards that required us to pay money to follow them, this would put many developers and individuals using those standards in a real bind. Ensuring that these patents will be royalty-free is crucial to the growth of standards conformance. We don't want financial *disincentives* to following standards.

    It never ceases to amaze me just how many different areas there are in which the freedom of the people is being transferred to corporate or moneyed interests, and how important it is to fight against them [globalizethis.org]. It's good to see that the little guy still wins from time to time.

  • by NineNine (235196) on Monday October 07, 2002 @10:02AM (#4402811)
    Only 2? There are loads of web browsers. IE and Mozilla, Opera, Konqueror, iCab, gtk-html etc. Virtually all of them save IE and Opera implement the specs pretty well. IE just suffers from a lot of bugs - you'll notice in IE6 one of the "new features" was a modicum of standards compliance. Yes, there are bugs in browsers. Wowee, the programmers made some mistakes. It happens, these are not simple technologies. IE has more bugs than it should do, but they seem to be getting their act together to at least some extent.

    There are no more than two that are even remotely popular (and the popularity of Netscape/Mozilla is falling by the day). The other ones are largely irrelevant.

    And as far as compliance, much more of the DOM is implemented in IE than in Mozilla. I have *several* non-compliance issues open in Bugzilla that haven't been addressed in nearly a year. OTOH, I haven't stumbled across a part of the DOM that IE is lacking yet.

    As far as bugs, I don't know what you're talking about.

    Sure there is - interoperability. Hence the fact that all web browsers attempt to use the same technologies. Some manage better than others of course.

    This goes back to my first point. Interoperability? Depending on the numbers you read, 85-95% of all surfers use IE. For the vast majority of web site owners, interoperability with the W3C spec is a moot point. IE interoperability is key. If IE decided to completely split from the W3C spec tomorrow, whose specs are going to be followed? With 95% of my surfers using IE, that's what I'm concerned about. Hence, the W3C has no teeth.

    The point of the W3C is not to be a research institution. There was structured markup before XML, there was hypertext before HTTP, there were vector graphics before SVG. But people are using these specs regardless, because the value of interoperability is high. That last sentence is provably false: for a specification to reach "W3C Recommendation" status there must be at least one, and often more than one, implementation. Don't make the mistake of assuming that all their specs are meant for the web browser, or even the web.

    Ok, maybe *somebody's* using every one, but a handful of users does not a "standard" make. XSL? PNG? Again, they can scream until they're blue in the face, hold press conferences, protest, whatever, but unless a large number of people want to actually *use* those specs, they're about as worthwhile as the new "NineNine SeXML" spec that I could write.
