Security

Who Is Liable For Software With Security Holes? 441

securitas writes "Interesting article over at eWEEK that asks who is and should be legally responsible for insecure software. Some say it's the manufacturer. Currently software is exempt from product liability as we've come to know it in the physical world. Others say the software licenses should make users responsible if they don't install patches and updates. Infosecurity czar Richard Clarke said in his speech at RSA that Nimda cost US companies an estimated $2 billion. Imagine if Microsoft was legally liable and a $2 billion suit was filed. Now extend that to the other jurisdictions outside the US. What does this mean to open source software, which is being used to a greater extent in corporate environments? Food for thought."
This discussion has been archived. No new comments can be posted.

  • Just like a car or a bike, if the equipment is faulty, I think the company that made it should be liable. However, if you got that car or bike for free and knew beforehand that hey, this thing may not work because I'm giving it to you out of the goodness of my heart, then I don't think that independent developer should be liable.

    I suppose that's only a dream for us OSS kids ;) Having the big boys like Microsoft liable while we get off easy. There's no way in hell those dirty politicians would ever see that that arrangement makes the most sense for the consumer. But hey, that's democracy for you.

    Just my US$0.02
    Hargun
    • by I Want GNU! ( 556631 ) on Thursday February 28, 2002 @03:00AM (#3082802) Homepage
      That's a little different. Software bugs cost money to fix. Car bugs kill people. The tobacco industry gets sued because they kill their own customers, but I don't think software companies do the same. Plus, if the software manufacturer is liable, and writes nearly perfect code, and then five years later somebody discovers a single bug and writes an exploit, who is liable? I say nobody is; the licenses always say that the software provider is not responsible.
      • That's a little different. Software bugs cost money to fix. Car bugs kill people.

        That's the situation NOW. Wait a couple of years and you'll see the net used for lots of 'critical' missions (remote surgery, diagnostics, control systems). THEN a simple DoS (even Nimda) will kill people.
        I think this should be sorted out before it becomes a problem.
        And of course, having legislation doesn't mean it's enforced.
      • Actually, when I took my Comp Sci 101 class a while back, the professor brought up situations in which a computer program has harmed people. It's been a while, so I only remember the general gist of the stories.

        Consider: Company designs program X to run a piece of medical equipment. Program fails. Patient dies. Who is responsible? (The company was sued out of existence, it turns out.)

        Consider: Company designed mainframe system X. System fails b/c of date-bug (like 2K bug but it failed on 1987 for some reason). Hospital computers crash. Nobody dies but it was a distinct possibility.

        It isn't that hard to extrapolate situations where computer programs can/do cause actual physical harm to people (would YOU want win95 running the air traffic control system? Didn't think so).

        Letting software makers escape accountability for their errors is ridiculous. No other industry in America is allowed to do this. You can say it's impossible for software to be completely fail-safe. OK, so are cars, VCRs, DVD players, airplanes, etc., but their manufacturers are still held liable. The simple fact is the software industry has been able to produce bug-ridden, crappy software under the banner of 'good enough' for far too long. Accountability is desperately needed.
      • Software bugs cost money to fix. Car bugs kill people. The tobacco industry gets sued because they kill their own customers, but I don't think software companies do the same.
        So you're saying we should wait until Windows starts killing people before we can sue Microsoft?

        "At first I wondered why I needed to register my toaster with Windows XP, but the computer wouldn't let me on the Internet until I brought it the toaster! Things were fine for a while, until someone hacked into my computer and took control of my toaster [geocities.com]! I tried to sue Microsoft, but the courts ruled that Windows didn't kill my boy, the TOASTER did!"

      • Re:Just like a car.. (Score:2, Interesting)

        by fferreres ( 525414 )
        * five years later somebody discovers a single bug and writes an exploit

        Software will always have bugs. But no producer is punished for making insecure programs. Only bad PR. I think it's suboptimal that bad PR is the ONLY incentive to write secure apps.

        Company A wants to sell products to e-tailers? Then they'd better issue some kind of warranty (not that it's 100% bug free, but at least a level indicating how hard it is to break, or how quickly they will issue a patch).

    • That gets into a gray area where you really have to define faulty. For instance, when it comes to system faults, vendors could be required to offer a guaranteed uptime. They could set the value at whatever they want, so you could sell your software with a guarantee of no more than 20 critical faults a minute, though that might hurt your sales somewhat. As it is, organizations make very few commitments about their systems, allowing Microsoft, for example, to simply push each new OS as "way more stable than the last piece of software, which we sold you under the pretense that it was super duper stable." Is that bicycle faulty if the rider rides irresponsibly and gets hit in traffic? Is that bicycle faulty if it gets stolen or is otherwise maliciously used?

      Security robustness is a marketing function (it's a feature, if you will, just like a Volvo withstands impacts better than most other cars), and insofar as vendors don't outright lie about the security of their systems, they should not be held responsible: The responsible parties are the hackers/DoS attackers/etc., and no one should ever fool themselves into thinking otherwise. For all of the talk comparing software to the "real" world, the reality is that the window maker isn't responsible if someone throws a brick through it, and the lock company isn't legally responsible if someone drives a tow truck through the door: As long as it withstood at least the marketed capabilities there is no vendor fault.

    • If I build a tree house on my property that is unsafe, and someone trespasses and uses this tree house (which I haven't even said he could use) and gets hurt, then I am potentially liable both criminally and civilly. It's called an attractive nuisance.

      I didn't charge anybody anything... I didn't even give permission for it to happen. So if this is a crime, surely if I knowingly give somebody a car that is faulty (even if I don't charge him), shouldn't I also be guilty?

      Just because I don't profit off of a transaction doesn't give me a right to put somebody at risk, financially or physically, unless perhaps I am completely forthright, and even then often not; and simply saying "Well, at your own risk" is not completely forthright, not even close.

      The problem with your argument is that you offer two different arguments and claim that one applies to paid software and the other to free. Yet your arguments have no dependency on this variable, so it is unclear why they vary. What it appears you are saying is that if you are giving away software then you are a nice person, and nice people shouldn't be held to the same laws as mean people. Well, a system based on niceness is in a different ballpark than a justice system.

      The other way your argument makes sense is if the seller is only liable up to the price he charged and is not liable for damages. Otherwise you're buying the right not to be put in a dangerous situation without your knowledge... which you can't buy.
    • "Just like", huh?

      Isn't it interesting that cars and bikes underwent continual improvement throughout the last century, which is still ongoing?

      These improvements have made cars and bikes much safer than even what our parents had. Today every major operating system, even Linux, is riddled with bugs of all sorts. Software is still a young field. When you use software, you take a calculated risk.
    • by Zocalo ( 252965 )
      OK, to extend the analogy...

      My car's design has a flaw and the manufacturer issues a public recall for a free repair, I have this mentioned when I next go for a service, but choose not to have the work done because it's too inconvenient. The part fails and I am involved in an incident that causes harm to a third party - I think I should have my ass sued clean off, don't you?

      My software has a bug, the vendor issues a freely downloadable patch, and even emails me about it, which I choose to ignore and don't install it. My server is compromised and used to DoS a third party - I think I should have my ass sued clean off, don't you?

      In the case of software this is clearly related to the debate about disclosure of vulnerabilities. You have to acknowledge that software is going to have flaws, that it takes a period of time from discovery of a flaw to produce, test and release the fix, and that during this time liability is the grey area this topic is discussing. But once the fix is out and announced, responsibility *has* to be transferred onto the people using the software rather than those that produced it.

      I don't think you can blame a vendor for having a bug in their code, because it's not a perfect world and it happens (albeit more with some vendors than with others), and doing so sets a precedent that would affect other industries as well. You can however apportion a great deal of blame after the flaw becomes public knowledge, and reapportion that blame once the fix is available, or if the fix is sufficiently tardy in arrival to cause problems. Which explains a great deal about some people's attitudes towards the issue of full disclosure, doesn't it?

  • by I Want GNU! ( 556631 ) on Thursday February 28, 2002 @02:52AM (#3082768) Homepage
    Yes, it is the software manufacturer's fault if they make buggy software and don't ever put a hold on new features to fix bugs. The customer is responsible for installing bugfixes, when released.

    Still, they aren't legally responsible for the bugs. If you read most licenses, they say "this software is provided as is." Everybody makes mistakes and even though software creators should make more effort to stamp out bugs, no code of a certain level's complexity is perfect.

    The important thing here that needs to happen is that businesses and consumers say "features are nice, but fix the bugs first." At the moment though, they say "features first! bugs aren't displayed on the box." They speak with their wallets by buying buggy software. I don't mean to be one of those typical anti-MS people (even though I dislike their software), but the fact is, they produced extremely buggy software and most people still bought it. That says something.
    • And by the anti-MS people, what I meant is that I do disapprove of Microsoft's business practices, but I'm not some kind of anti-MS zealot. They illegally abused their monopoly and caused everybody to use an extremely buggy OS through shady business practices. What I mean is that I try to be unbiased and let the facts speak for themselves.
    • Yes, it is the software manufacturer's fault if they make buggy software and don't ever put a hold on new features to fix bugs.
      I'm not sure it's so simple. For example, what if no one knows about the bug when the software is released? Later someone finds the bug, and some computers are compromised before a patch can be released. Is the manufacturer still at fault?

      And this raises the question of whether it's even possible to make bug-free software in the first place. Given the complexity of software, 100% bug-free software might not be a realistic goal, which makes it seem unfair to punish software companies for every bug. Making software companies liable could severely hinder software development due to the high risk involved.

      It's very hard to assess liability when software fails. I haven't the solution and I imagine it'll be a while before anything concrete is determined.

    • Correct.

      The software industry heavily lobbied for legislation (and got it, of course) that basically makes its products legally without warranty.

      In my opinion, buggy software is a result of "time-to-market" hype that comes from managerial/marketing pressure, plus insufficient, undermanned, undertrained people coding away and reinventing the wheel at every chance, making YAWOD (Yet Another Wrapper Or Driver) because they don't understand something (as is typical w/ micro$oft coding). What is ActiveX called now? Wasn't it DCOM... wait... COM... wait... OLE? Wasn't it DDE/DDX? More marketing terms == more confusing APIs. Otherwise, you wouldn't have to rewrite your apps every year and have a slower OS. Oh wait, .NET/C# is supposed to solve everything, yeah... that's the ticket. Oh wait... Java, BTDT.

      Features last, working first. I'd rather get features in a patch and a product that works OOTB.

      "Interface is everything."
      • by Danse ( 1026 )

        The software industry heavly-lobbied for legislation (and got it, of course) that basically makes its products legally without warranty.


        Which legislation are you talking about? The only law I know of that would accomplish this for them is UCITA, and that's only been adopted by 2 states.

          UCITA for one, the DMCA, maybe the SSSCA. Read the DMCA: it applies to more than music. Case in point: Dimitry [eff.org], held in prison for giving a seminar at DEFCON and for coding done in Russia for a US company!!! The laws may be annoying and dumb, but they are laws that are being enforced right now.

          What about ridiculous software patents? Those are being "legally" enforced left and right; whole companies are based on IP-squatting.

          Wake up and smell the fucking coffee. [eff.org]
  • It's the contract for the use of the software; this is where something like this should be stated. :) The user must accept the license before using the software. However, when a computer comes pre-installed with software, it makes you wonder if users really do have a choice.
  • Why should software be any different than any other product on the market? But I do think software makers should be able to protect themselves somehow.

    If someone is mowing the lawn and a stick flies up and takes out an eye the lawn mower company isn't liable if there is a warning somewhere saying "must wear eye protection while operating". Maybe a "must back up all data" in the software agreement would cover the software companies somewhat.. but then again, who reads the agreements in the first place?
    • isn't that the american way:
      -you must not put a cat in the microwave
      -if you vandalize the vending machine it might tip over and kill you.
      -playing on the nintendo 8hrs a day 6 days a week might not be wise if you have seizures.

      i don't think it's reasonable that the manufacturer is responsible for all the really stupid things the customer can do with its product. there is a thing such as common sense. people should not sue 'because it did not say on the package that i should be careful when using a chainsaw'.

      btw in all EULAs there is a phrase that says: "this product is provided as is...." and "the manufacturer cannot be held responsible for any damages...."
      which is also common sense. if a software vendor creates software that contains 40 million lines of code, it cannot be bug-free, no matter if your name is msft, redhat, oracle or apple.
      demanding that it should be is unrealistic.

      though i agree that better design would solve a lot of problems.

      btw.
      there is no spoon.......
      when you realize that, you will see that it is not the software that contains bugs, but that your mind interprets undocumented features that way
  • for what we create. that may give our profession a little more formality of a "true" engineering profession, and force developers to fully think out designs instead of just saying "it'll be addressed in the next version".
    • Not unless we have the power and authority to make decisions like that ("fix it and delay the shipment").

      Most product failures are management decisions (tradeoffs). And managers are basically never held liable; even the companies usually have enough lawyers to avoid real consequences.
  • What does this mean to open source software...

    buh bye sendmail!

    -Bill
  • by pjbass ( 144318 ) on Thursday February 28, 2002 @02:57AM (#3082789) Homepage
    Funny that Nimda was mentioned; I seem to remember that @Home.net and AT&T were pulling the plugs on their customers because they were saturating the bandwidth due to Nimda. This seems to be directed towards the users' negligence/lack of knowledge about what they're doing, and so one can argue "why blame them? They did exactly what MS said they could do: plug and play."

    Now I also remember when the commercial version of SSH released v3.0, there was a HUGE security hole (passwords of length 2 or less would always work...), and the SSH developers took the heat, rightfully so. They 'fessed up, and they fixed it. As far as I know, there were no incidents because of it, because the problem was fixed before it was widely exploited. But if it had created an issue (like Nimda, Code Red 1/2, etc.) before a fix was made (proactive vs. reactive), they should be held liable, not the users. If a fix exists, and a user says "oh, I don't have *that* problem," well, I think we all know who should get the blame. Just my $0.02 worth though...
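    The SSH 3.0 hole mentioned above (any password accepted when the stored secret was two characters or shorter) illustrates a whole class of "short secret" bugs. A toy sketch of that class, assuming a plaintext comparison for brevity; the real code was C and compared hashed fields:

```python
def password_ok(supplied, stored):
    # BUG: a stored password field of two characters or fewer is
    # mistaken for "no password set" and waved through.  The fix is
    # to fail closed: reject any malformed or too-short field.
    if len(stored) <= 2:
        return True
    return supplied == stored

# Normal operation looks correct...
assert password_ok("hunter2", "hunter2")
assert not password_ok("wrong", "hunter2")
# ...but ANY password authenticates against a short stored field.
assert password_ok("literally anything", "x")
```

    Note that ordinary testing with well-formed accounts never touches the buggy branch, which is why holes like this survive release.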
    • by 0xA ( 71424 ) on Thursday February 28, 2002 @06:01AM (#3083125)
      "why blame them? They did exactly what MS said they could do: plug and play."

      Does it seem to anyone else that the whole software industry is starting to look like a house of cards?

      All these products are being marketed as easy to use, easy to take care of, easy to everything. It's not. It's hard, very hard sometimes. I run into the strangest interdependencies, completely unexpected behavior, just plain weird shit all the time.

      It's dumb stuff mostly. How many of you knew that Photoshop 6.0 will randomly cut off network access on a Windows box? (6.0.1 fixes it.) When presented with this problem, Photoshop was not my first thought; I was looking at the switch, changing cables, etc. It took me an hour to realize that this only happened when Photoshop was running. Would the user have been able to figure this out herself? Not very quickly.

      People are starting to clue into this. I've had two people ask me if they should buy Windows XP. Both of them asked if it would mess up any of their programs first, before they asked if XP had any new features they would find useful. It seems to me that the marketing messages are failing; the upgrade treadmill is starting to look more and more like a sham. Seriously, what is the compelling value that will make me upgrade my company from Office 2k to XP? Somebody tell me, because I have no idea at all. I don't want to whoosh around the desert on my desk; I want to not restore Outlook .pst files 3 times a week.

      I think soon the software industry is going to have to really consider making a more stable product; the flashy whiz-bang product doesn't have the draw it used to. Security is really only a part of this, but given the Summer of the Worms (tm) we just went through, it is the most visible part right now. People are terrified of their email, those little home firewalls are flying off the shelves, and we're almost to the point of widespread clue. I just hope we make it.

  • by mESSDan ( 302670 ) on Thursday February 28, 2002 @02:58AM (#3082792) Homepage
    Classic quote at the very end of the article:
    "I hate to even speculate on this stuff," Gupta said. "I'm not a lawyer."
    (IANAL). Funny. Hell, we could have gotten an expert opinion worthy of that article just by asking one of our regular Slashdot users.
    • by Anonymous Coward
      Or JonKatz. It would go something like this:

      Who is liable for defective software? This is a question that has plagued many in its time. I intend on answering it. What we must do is write perfect software. Then there won't be defective software. But then, what if there is buggy software? Huh? Whatcha gonna do about it? Then you gotta sue. But it shouldn't involve legal action. It should be solved out of court but they should be legally liable. This question has plagued many people in it's time but I have solved it.
  • Most software packages require you to waive all rights before installing. If you don't waive them, you can't install the software.

    How can users even know about holes when a company charges for tech support calls? And then, if there is a hole, the user must pay for the upgrade.

  • by Bob_Robertson ( 454888 ) on Thursday February 28, 2002 @03:06AM (#3082817) Homepage
    Liability is an individual thing. Liability is based on making statements that are not true, or the deliberate cause of harm.

    The supposed $2B in "damages" are a liability on those who wrote and launched the worms, directly.

    By connecting to the net, just like stepping outside your door, you are assuming risk.

    That said, Microsoft should be liable if they represent their product as "safe" and it isn't. I believe their representation of XP as the "Most Secure Windows Ever" does open the company to prosecution for misleading advertising, but who has the resources to prosecute it?

    There is a great deal of difficulty with trying to assign liability to those who are in the wrong place at the wrong time. Someone who gets wet because they weren't wearing a long coat when a truck splashed them doesn't expect to sue the truck driver, do they?

    The systems owners who were "damaged" by the worms are indeed guilty of not securing their systems. Who will prosecute them? And for what?

    Liability is based on two things: intent and negligence. False advertising and misrepresentation are the former; the success of viruses is the latter.

    Personally, I think a few false-advertising claims against Microsoft would be great, and from a theoretical standpoint they certainly are misrepresenting their products when they call them "secure" or "safe". Who's got a million or two for the legal fees when we lose?

    Bob-

  • I would have to say that under normal circumstances, the manufacturer would not be liable. If the hole was intentionally put in, that is a different story, but it's not like any company is going to willingly put a security hole in its software.

    Bad PR from security holes, again and again, is enough of a liability for companies to wise up, one should hope (how many times have you heard from respected experts, and at times Microsoft itself, to have IIS disabled on Win2k?).

    If you contract a company to design specific software to suit your specific needs, and that software does not perform adequately (security holes, or what have you) then I believe that it is acceptable to blame the software manuf.

    Face it, security holes exist. No one likes them, everyone wants to blame someone else for them, but you just have to accept that they do exist.

    Weigh your options and choose the option that has proven itself, be it by number of security problems, speed with which they were fixed, or severity (proven and potential) of these vulnerabilities.
    Oftentimes this points away from Microsoft, but that's in the eye of the beholder.

    -kwishot
    • it's not like any company is going to willingly put a security hole in its software

      Unless of course, it's Micro$oft... NSA key anyone? AutoUpdate? There are more...
  • Defective software (Score:5, Informative)

    by Anonymous Coward on Thursday February 28, 2002 @03:08AM (#3082824)
    As a matter of law in Australia, goods including software have to be "reasonably fit for the purpose" they have been purchased for, of "merchantable quality", and must fit the "description" they are sold under. If a good fails to comply with any or all of the above conditions, the disgruntled purchaser can sue for damages or a suitable replacement. In Queensland the relevant legislation is the 1896 Sale of Goods Act, of which all Australian and New Zealand jurisdictions have analogues.

    Many Commonwealth jurisdictions have similar regulatory regimes.

    It is arguable that software which doesn't work very well fails all of the above requirements. A former law school acquaintance of mine even sued a car distributor, over a fleet of Lada Samaras, claiming that they didn't fit the description of a "motor vehicle" (i.e. a moving machine!) because they spent all their time in the shop!

    What needs to be remembered is that all software producers can be liable under such a regime, Linux or Winduhs.
    • As a matter of law,in Australia, goods including software have to be "reasonably fit for the purpose" they have been purchased for, of "merchantable quality", and must fit the "description" they are sold under.

      Does the Australian law (either in the statute or an appropriate court ruling) define "software" as "goods"? The usual issue here is that abstract licences aren't clearly either goods or services....
  • It's too much liability on small companies...

    Think about how many companies form as little one or two man shops that have great ideas.

    Sure they have bugs and security holes, and hopefully they're fixed before any damage is done, but to sue a small shop for a million dollars because you didn't test something you installed on production servers is a joke.

    Instead, you could pay another company to test your security all the way around including all software installed on a server.

    Also, if there were something that says the software maker is liable, open source should be exempt, as everyone has the opportunity to review exactly what the code does or doesn't do.
  • Well, if license agreements didn't protect companies, we would probably end up with the equivalent of malpractice insurance for software projects, effectively increasing development costs by millions or billions. So it would stifle small projects. As fun as it would be to sue Microsoft, the costs and precedents would rebound and damage open source and the GPL.
  • here's my view (Score:4, Insightful)

    by nzhavok ( 254960 ) on Thursday February 28, 2002 @03:11AM (#3082835) Homepage
    I'm a professional software developer. I work for a very large computer company (not ms). We all try pretty hard to get rid of bugs in programs, hell as programmers we do care that our code is as bug free as possible, it's a pride thing - as well as being good for business.
    Unfortunately there's no way to produce software which is bug free; it's just not possible today. Well, perhaps with the exception of hello world :) However, it is possible to lower the number of problems if you are willing to invest a lot more money into testing, which in turn ends up costing the users a lot more money (yes, I'm sure there will be replies saying open source can solve this problem; more eyes find bugs quicker etc etc etc, but a lot of people are still not going to consider open source solutions).

    I don't think software producers should be responsible unless it's shown they are grossly negligent, and even then they are not necessarily responsible. Otherwise amer^H^H^H^H people are probably just going to start suing people stupid, leading to massive rises in software prices. OTOH, when I use Windows it pisses me off when it crashes. I upgraded from 95 to XP a few months ago. MS says XP is rock stable and hardly ever crashes. Bullshit. The lies in advertising piss me off more than the crashes themselves; false advertising is something I'd like to see them punished for.
    • Re:here's my view (Score:2, Insightful)

      by fferreres ( 525414 )
      Well, if you are selling stuff to a bank or online retailer, you "should be willing to invest a lot more money into testing which in turn ends up costing the users a lot more money". In fact, the law should FORCE you to do so.

      The problem is that there's no regulation at all. When something goes wrong we all blame it on "sCriPt KiDz or CiberTerrorists".

      It's like opening a bank in a bazaar... or an ice-cream shop in a highly secured building. Software is the same; there should be different warranties regarding security so that each kind of company could pick the one that fits.

    • I work for a very large computer company (not ms).

      I would guess we work for the same company, directly or indirectly.

      I don't think software producers should be responsible unless it's shown they are grossly negligent and even then they are not necessarily responsible.

      I don't think it's about nzhavok, chrysrobyn, or any individual developer being held responsible. We do our best. It's about M$, IBM, Blizzard, Apple, etc., being held responsible for being so selective about their beta testers. If one is making a "best effort" at secure, bug-free code, would one go exclusively to an audience of customers who will throw their typical workloads at it? Or would a "best effort" involve soliciting the opinion of some of the vulnerability finders, or (better yet) the exploit writers? I believe that the company, as a collection of teams, should extend its efforts beyond development and into testing. They find experts at locating and documenting UI bugs; why not buffer overflows?

      For me, determining what should be law means looking at current things that aren't crimes (or are) that I think should be (or shouldn't be), and comparing those to the exceptions I can think of. For example, I believe in personal freedoms enough to believe Napster should be legal for trading songs at will, but I don't think it should be legal for people to pirate CDs and resell them to friends. What's the difference? Napster quality isn't perfect. I can't listen to a 128kb/s MP3 on a decent stereo without clawing my ears out. I will go to the store (Best Buy/Circuit City on Black Friday when all CDs are $9.99 -- over $200 last day-after-Thanksgiving) and purchase what I want to listen to on my real stereo. So maybe the law should be that lossy compression of music is legal to distribute (it certainly isn't a direct copy of the CD).

      Linus shouldn't be held responsible for exploits in Linux. Red Hat should be, if it can be proven that they didn't think enough people were looking at the code and that they weren't proactive enough at getting patches out. M$/IBM/Apple should be, if it can be proven that they did not actively go out and hire security/stability freaks to test their very closed source software. The Linux kernel has been under active community development and [hacker|cracker] testing, open for all to see, since 1991. How long did M$ actively recruit people who have reputations for breaking things for the purpose of breaking XP?
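      The "recruit people who break things" idea above is exactly what automated fuzz testing mechanizes: throw hostile input at the code and log every crash. A minimal sketch, with a hypothetical parser standing in for the product under test:

```python
import random
import string

def parse_content_length(line):
    # Hypothetical code under test: assumes the value is always numeric,
    # so malformed input raises an unhandled ValueError.
    name, _, value = line.partition(":")
    return int(value)

def fuzz(func, trials=1000):
    # Throw random printable garbage at func and collect every crash:
    # the adversarial testing the comment argues vendors should pay for.
    random.seed(0)  # reproducible run
    failures = []
    for _ in range(trials):
        s = "".join(random.choice(string.printable)
                    for _ in range(random.randrange(20)))
        try:
            func(s)
        except Exception as exc:
            failures.append((s, exc))
    return failures

crashes = fuzz(parse_content_length)
# Even this trivial harness turns up plenty of malformed-input crashes.
assert len(crashes) > 0
```

      A vendor-run version of this would of course target the real parsing and network-facing paths rather than a toy function, but the economics are the same: crashes found in-house are far cheaper than exploits found in the field.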
  • One of the stronger cases for a liability lawsuit is a Linux server being subjected to a denial-of-service attack by an army of captured Microsoft desktop systems. It doesn't matter what's in Microsoft's end-user license agreement, because the operator of the Linux server isn't a party to that agreement.

    This is a standard legal theory. Manufacturers get third-party liability claims all the time, and carry insurance to deal with them. Except in the Y2K area, though, this doesn't seem to have been litigated yet.

  • by Fizzlewhiff ( 256410 ) <jeffshannon@nosPAm.hotmail.com> on Thursday February 28, 2002 @03:28AM (#3082851) Homepage
    who is and should be legally responsible for insecure software?

    A. The Author/Publisher
    B. The User
    C. CowboyNeil
  • Now, this might just sound like one of those zany, out from left field ideas, but "what if" we decided to hold the actual criminals who are breaking in through security holes liable? I know, I know, I must sound like a kook, but hey, you never know what might work!
  • by Gerad ( 86818 ) on Thursday February 28, 2002 @03:34AM (#3082864)
    If party A licenses software from Microsoft, and agrees not to hold Microsoft liable for any bugs in their code, then MS may be safe from suit from party A. However, if party A's servers start attacking party B's servers, and party B never had a contract with Microsoft, there's nothing legally stopping them from trying to sue Microsoft. That, I think, is why issues like this are important.
    • However, if party A's servers start attacking party B's servers, and party B never had a contract with Microsoft, there's nothing legally stopping them from trying to sue Microsoft.
      They could sue MS, sure. It wouldn't be successful though because MS did not damage party B, party A did. Party B might be able to successfully sue party A who would then be screwed as they couldn't recoup any damages by suing MS. A couple of results like that would certainly cause people to think twice about (a) adopting MS products, and (b) having computers connected to the net.
    • If I sell doors and a burglar breaks down the door and robs someone's home, who is legally liable? The door manufacturer? Or the criminal?
  • Perhaps the money involved in purchasing licensed copies of non-free software should be considered a sort of contract. When I pay for an item (any item) at a store, I expect the item not to be shoddy, or at the very least I expect that there will be compensation should shoddiness be present. This compensation usually comes in the form of a refund, although manufacturers of consumer products often are held liable for product defects and any damages that might result from them. The same principle could easily be extended to software. If I pay for a piece of software, I expect it to work. If you certify to me via the implied contract of sale that your product works and it does not (e.g. if I purchase a piece of software which, through some defect, corrupts my data or causes loss), you are liable for the damages.

    Free software is a separate case, IMO. If, for example, I download a Linux ISO, then there has been no sale. Accordingly, no contract has been entered into either by myself or the creator of the software. I may have obtained the product legally, but since no contract of sale is present, I am SOL if anything bad happens.

  • At issue is a simple question of whether or not a vendor is negligent in the manufacture of a product. Simple consumer product law applies here, believe it or not.

    In the case of Microsoft, you can demonstrate a pattern of negligence in the way they test and release their product. The company also publicly denies that there are problems until it is too late for users to do much of anything to protect themselves and their networks. The last thing MS wants is administrators migrating their operations off MS products in favor of more controllable risk (like Open Source or a different and better tested proprietary product). I say controllable risk, because no software is bug-free and it is the job of the administrator to manage the technical arena and minimize risks to their networks.

    With the Redmond mis- and disinformation machine, you can never be sure of what the truth is in terms of real support from the vendor. After all, this latest round with UPnP pretty much proved that the company puts profits over security. I mean, only Microsoft would try to tell the FBI that a security disaster waiting to happen wasn't one. It IS how they maintain their 'edge'.

    Death by a 1000 cuts.

  • by coyote-san ( 38515 ) on Thursday February 28, 2002 @03:56AM (#3082912)
    Most people seem to be missing two important distinctions here. You pay for commercial software, but not for free software.

    This totally changes the nature of the beast. As a specific, non-tech example, I can give a friend a ride. I can even graciously accept gas money, or a free lunch for my troubles. I could even be a good Samaritan and offer a lift to total strangers.

    But the instant I actively charge people for this, even if it's a token amount, I become a "for hire" limousine service and am required to obey a large number of laws. Some are "on point," others seem to exist solely to eliminate competition.

    There are other, more subtle differences. I can refuse to give a friend a lift without explanation. Once I become "for hire" I can't (legally) refuse to accept a passenger without a good reason. E.g., someone showing a weapon can be refused, but someone who stinks because they haven't bathed in weeks can't be refused.

    An even more extreme example is the difference between my friend asking me if I've ever experienced certain medical symptoms and a stranger paying me for advice. The former is a casual conversation between friends (or not so casual, if it involves a possible STD :-), the latter is practicing medicine without a license.

    In the software realm, I would expect to see a similar difference in the treatment of amateur efforts (where people develop software for the love of the craft) and commercial efforts. If someone is grossly negligent, it won't matter whether they're compensated or not. But for routine oversights, I would expect to see far more severe penalties for commercial vendors than OSS providers.

    The second difference is that when you get software from Microsoft, you can't change it. Any errors *have* to be due to Microsoft's (in)action. In contrast, free software is released in source form and patches are routinely submitted. It's not morally acceptable to hold people accountable for the (mis)actions of others, so it's much harder to justify penalties against parties that provide source code.
    • Also, Open Source software, if provided as source code, is just a blueprint. It is the person that does the compiling that is "manufacturing" the product. Maybe (certainly) with a faulty compiler. If the Open Source distributor disclaims that the precompiled binaries are for demonstration purposes only, then liability for a faulty "product" should be avoidable.
    • Most people seem to be missing two important distinctions here. You pay for commercial software, but not for free software.

      It also makes sense to consider the difference between closed source and open source. In the former case, even if you don't pay for it, you effectively get something which is "take it or leave it". With open source (even if you pay for it) you get something which you can modify yourself...
      • And how do you handle the user who cannot make those modifications to Open Source code. Bringing out the example of my dear old mother, who wouldn't know gcc if it showed up at the door with a sign saying "gcc", would she have a valid lawsuit if a software bug allowed hackers to run rampant through her storage management software? By placing the onus on the end-user, you transfer responsibility to people who are not capable of maintaining their own software and who cannot afford to hire out for repair.

        Now, one has to consider - does mere notification to the developer constitute due diligence? What happens if the developer doesn't acknowledge that there is a problem (Microsoft)? What happens if a product has such complex management that fixes are routinely overlooked (Linux)? What happens if a project is abandoned (half of SourceForge)? What happens if the sole developer dies (no example given)?

        What may be necessary is a form of limited tort liability, similar to what law enforcement in my home state has. There is a limit on the damages that can be collected from any lawsuit against law enforcement, regardless of actual damage caused.

        Which of course leads to the situation where someone sustains a billion dollars of economic hardship, but is limited to only a million in lawsuit damages. It isn't justice, and the money won't come near recovery for the damages, so ... what?

        This is one ugly situation.
  • Reading Microsoft's End User Agreement..."Software is provided as is." This means...they can patch it if they want, you can't sue them if they don't. If you continue to use said software even if it's got more bugs than the MIB can handle, that's tough.

    As for open source, "As is" is very much implied before you even start using it. It's impossible for anyone to be at fault in either case, from a legal standpoint. Therefore, this story is completely bogus.

  • Me. (Score:2, Funny)

    by Anonymous Coward
    It's all about me, I did it all. Blame me. Go ahead.

    Thanks,
    Al Gore
  • by MillionthMonkey ( 240664 ) on Thursday February 28, 2002 @04:34AM (#3082975)
    Selling software is great. Compared to someone selling a real physical product like spark plugs, you legally retain much more extensive control over how your product can be used even after you've sold it. This is because of the enhanced rights you get as a holder of intellectual property as opposed to real property. But even though you can dictate to people the conditions under which they can use your software, if anything goes wrong, the product liability risk you expose yourself to as a seller of software is zero!

    Why does anyone even try to sell anything else?
  • by guttentag ( 313541 ) on Thursday February 28, 2002 @04:36AM (#3082979) Journal
    It's shameful, the way we try to pin the crimes of computers on people. A man buys a computer, the computer hacks into the Federal Reserve, and he goes to jail. Another man writes an operating system, a computer using that operating system smurfs AT&T, but he goes to jail. The computers remain free to strike again... when will society hold computers accountable for their actions? When will we stop persecuting man for the crimes of his possessions? Perhaps some day... in the Twilight Zone. (insert cheesy dramatic music followed by annoying roll-credits music)
  • How about this:

    If you don't publish the source, you're liable. Hiding the inside of a program is perfectly OK - assuming that you take full responsibility for the manner it works.

    If you publish the source, you can be exempted. With the inner workings exposed, anyone can verify the suitability of the software for a given purpose.

    MS plays safe by not being responsible (sueable) for their bugs. If they were required to either FIX them holes before release or publish the source, they'd concentrate on security before feature count, which would be double good.

    Only problem is, this way of cutting things would hardly feed the lawyers :)

  • Whilst the thought of seeing Microsoft taken to the cleaners for product liability would fill me with a certain amount of malicious glee, I do not believe that software companies should be liable for the security of their products.

    As others have pointed out, if someone breaks into your car, then you cannot sue the car manufacturer (at least it is difficult to do so successfully!) for the theft of your vehicle. Similarly if someone steals your hi-fi from your house, you do not sue the manufacturer of your locks and windows, or even the hi-fi maker.

    I do believe that software should be reliable, and perhaps there is a case for liability if the operation of the software causes a major disaster without malicious outside interference. The problem with that, however, is we're all too aware of what the result will be: software prices will skyrocket to cover the immense legal costs of defending and settling these claims.

    The only people who would benefit from this would not be the software developers, regardless of whether it's Microsoft or open source developers; it would be the legal profession, aiming to take 10-50% of your damages award when you did settle.

  • Who is liable if a lock on my front door does not work? The company who made the lock? Or me for not being able to afford a good lawyer?

    I would really like to know what some lawyers have to say.
  • It bothers me that there is this mentality that software designers are responsible for including security in their products. If I buy a piece of software, I am paying for a piece of code that is designed to perform a specific task, not necessarily for a piece of code that will protect me from illegal activities.

    If I buy a car, I'm paying for transportation. It would seem silly to sue the manufacturer because somebody stole my car and I found out the locks on it were easy to pick.

    I use Outlook as my mail program at work. I paid for it, and I expect it to be able to send and receive mail. If somebody illegally exploits that program to do malicious things, I don't blame Microsoft, I blame the person who wrote the virus.

    On the other hand, I also own a virus scan program. This is a security measure I pay for. If my computer is attacked by a virus, I expect my virus scan program to detect it and remove it. After all, that's what I'm paying for.

    Yet the mentality is: if somebody illegally exploits my mail program, Microsoft is at fault, while the virus scanner, which I also pay for and keep updated, and which failed to do its task, remains blameless.

    It's nuts.
  • by markj02 ( 544487 ) on Thursday February 28, 2002 @06:07AM (#3083136)
    When an organization makes a promise about their software, I think they should be held legally responsible for it, whether it's Apache or Microsoft. The real problem is that companies like Microsoft create the impression in their marketing efforts that their software is secure, but disclaim it all in their licensing contracts. This is primarily an issue of fair competition in the consumer marketplace. For consumer products, commercial software vendors should be held to their marketing promises, with a liability of at least the purchase price of the software if they don't live up to them.

    In addition, there should perhaps be restrictions on what can be sold: for the sale to be legal, consumer software should perhaps have to conform to some basic safety standards, analogous to UL standards for electrical devices. (Since this is a restriction on sales, it would obviously not apply to free software.)

    Large commercial customers are presumed to be competent, and they should be responsible for this themselves; they don't need regulations or legislation to protect them. For example, if a company exposes 10000 people to identity theft through an insecure computer system, the company should be legally liable for that. The company will then insure against that risk (possibly directly through the software vendor). The insurer will assess the risk and compute the cost of the insurance. The company can then take the cost of the insurance into account when selecting software. I.e., it comes down to the question: is Apache plus insurance more or less expensive than IIS plus insurance?

  • Here are some of my thoughts on why we have buggy and insecure software.

    * Human Nature
    People in general don't like to admit that they are wrong. Companies small and large are not much different. Even when they distribute the patch, there is rarely accurate or complete information about the problem or the severity of the problem being addressed. We think apologizing is a sign of weakness.

    * Corporate Image
    By admitting fault, a company loses credibility. A company is always willing to live with a few unhappy customers to protect its overall image. That's one of the reasons why software defects, security or otherwise, get hushed up and buried. You all know that the euphemism for this policy when it is applied to security is "security through obscurity". You also know how well that works. Admitting fault is the last thing a company will do. Even when they do admit it publicly, they will always play down the severity of the problem.

    * Monopoly
    When a company is a monopoly, there is almost no incentive to admit to a problem and fix it. If you know that you can't get fired and you will get paid the same whether you work one hour a day or eight hours a day, which would you choose? Lack of incentive is the very reason why communism is bad for progress. The only reason Microsoft is pretending to care about security recently is that they are having trouble penetrating (from behind) the enterprise market with their tarnished image.

    * Money
    When I say money, I don't mean the cost to create or distribute bug fixes. Putting a patch on a website for users to download isn't such an expensive proposition. It's a lot different than a car manufacturer doing a recall. When I say money, I mean greed. Companies are using bug fixes as a ploy to get users to upgrade. Marketing departments have figured out that consumers are willing to pay for bug fixes. An example of this is Windows 98 and ME. Essentially they are selling you a big pile of bug fixes as a full product and charging you for it. Sneaky, isn't it? MS is not the only party guilty of this devious practice. Many companies such as Vignette and BEA Systems have done this sort of thing. It's becoming very common in many places and we have all been brainwashed to accept it as the norm.

    Since Free Software/Open Source has only one of the four problems to contend with, I think it has a somewhat better chance of producing superior software than the commercial environment does.
  • I don't see what the problem is.

    If you write it, you do your best to make it secure and keep it that way. If you write insecurities into it, that's your problem.
    If you install it, it's up to you to make sure it stays up to date with patches.

    I've got no sympathy for people with cracked boxes when there's a patch that should've been applied (ie in 99.9% of linux and 99.99% windoze cases).

    I don't see what casting it in law is going to achieve; far rather use common sense that people are responsible for their own doings, with a few precedent cases to back it up. (That'd be a first ;)
  • There are a couple points that I want to mention.
    1. NOBODY is EVER forced into buying a particular product. Every product has a competitor that you could go with. Just because the majority of the market uses a certain product doesn't mean that there are no choices. It becomes an argument about the benefits of each product (compatibility, security, features, etc.) and what best suits your and your company's needs. If you don't agree, try arguing with a Linux guru that Windows is the ONLY operating system available because 95.9% of the market uses it. If there is a problem with a certain product, you are not bound to it. You are free to choose the alternative.
    2. Liability is always something hard to pinpoint. Every producer is liable for every product that he produces. Just because there is a problem with a product doesn't mean that it is automatic grounds for lawsuits. What is important in the long run is what steps the producer takes to resolve the problem. In the case of the car company that found it was economically wiser to let a problem exist than resolve it, they are liable for their lack of action in an accident. However, for both Nimda and Code Red (2 recent events that come to mind), MS had released patches and solutions months ahead of time. They followed the correct procedure in these two incidents of identifying the problem, notifying the users of the problem, and producing a fix for the problem. The reason these vulnerabilities still existed was that administrators and users ignored the patches and bug reports. How can MS be held liable for the inaction of its customers? It is like saying that Ford should pay for your medical bills because you got into an accident with a car that they recalled months ago. In this case, it is your responsibility to take the car back to get it serviced.
    3. One thing that we all must remember is that a law cannot be created targeted at a specific company. So if any laws are produced regarding liability, the people they would hurt the most are the individual developers and the medium to small companies. Microsoft has a couple billion dollars in the bank. They can easily settle a lawsuit like some of you have talked about. But for smaller companies, it is a major issue. Take for example SSH [ssh.com]. In recent months, a vulnerability was found in one of the older SSH clients. If they were held liable for a problem that results in stolen data, it would most likely bankrupt the company or at least put it in a situation where it had to be bought up by a bigger company.
    4. Now this one is a stretch. But seeing the way that Congress has been leaning towards big corporations over consumers [don't believe me? look at the DMCA], they would most likely butcher this law in support of big business. Secondly, there is no reason to put a law in place where there is no problem that cannot be resolved by the market. The market and consumers are a strong force. They are the ones that can make or break a company. If the security problems of MS were enough of an issue, a big chunk of the market would shift over to a competitor's product.
  • I've a couple of questions for you guys.

    In a normal heterogeneous environment (as 99% of networks are), you're going to be dealing with software and hardware from many different vendors.

    It is possible (if not probable) that the interaction of these components will create security holes for an attacker to exploit. Which vendor do you blame? They may all be working as designed. Do you blame your low-paid network guys? Do you spend hundreds of thousands to hire external consultants? Can you blame (and sue) them if your network is breached?

    What about default configurations of software? What if the default configuration is insecure, but the documentation describes how to secure it?

    I have my own thoughts on these issues, I'd like to see what the general consensus is here.

    Btw, if you're looking for a secure OS, try XTS 300 STOP [ncsc.mil].

    The EPL makes interesting reading. [ncsc.mil]

  • From what I've read most of the damage estimates were pulled out of somebody's ass, anyways. So my question is, if this became law would the damage estimates get lowered considerably?
  • Sorry? Fully 70% of security problems are bugs in the software? Well what are the other 30%, then?!?!?

    Oh yes, I forgot: features!
  • by Nindalf ( 526257 ) on Thursday February 28, 2002 @07:56AM (#3083337)
    I mean, if you buy bulletproof glass for your car, and somebody shoots you through it, you might have a case: one of its purposes is to stop bullets. But if you buy an ordinary car, and somebody shoots you through the window, you hardly have grounds to sue them for poor product quality.

    Being able to stand up against novel forms of human attack is not basic product quality. Worms, trojans, and viruses are not mere environmental hazards, they are the results of intensive effort to find and exploit any system weaknesses.

    Disappointed customers and annoyed partners are punishment enough. Market forces will correct the problem; people will eventually learn not to buy stuff that doesn't work. They will also learn to do their part, since security doesn't come in a shrink-wrapped box.

    In a way, these petty vandals are doing us all a favor by forcing us to harden our systems. If nobody exploited the security holes, you couldn't convince people to spend extra money or effort on security. Then, when somebody made a truly serious attack, as an act of war, we would be utterly defenseless. I believe humans evolved an instinct for mischief for just this reason, and so we shouldn't be too hard on the script kiddies.
  • I believe that the software companies should be liable *up to the point that they release a patch that fixes the problems.* Then the owner becomes responsible. This does 3 things.

    First, it makes the software company more diligent about getting all bugs out of the software, and makes them worry more about security concerns (which are, shockingly, rarely "bugs" in the software).

    Second, it makes the software company work harder at producing a patch that fixes the problem.

    Third (and most importantly in my book) it forces system admins to work faster at patching software.
  • How to protect free software? How's this?

    "If product fails to perform in a secure manner, buyer of product will be entitled to a refund in the amount of two times the purchase price."

    Free software covered! :-)

  • by Otter ( 3800 ) on Thursday February 28, 2002 @08:41AM (#3083450) Journal
    This strikes me as a textbook case of "Watch out what you wish for because you might get it."

    The prevailing quality of commercial software is set by the market, and reflects the balance of features, updates, price and quality that users want. That's why your word processor crashes sometimes and your defibrillator doesn't. Attempting to set a new and better balance by turning hordes of plaintiffs' lawyers loose on the software industry is going to improve the situation of users about as well as turning lawyers loose on the tobacco industry has helped smokers.

    Oh, and if you think that open source software is going to be unaffected by this, either because it has no bugs or because it's so cuddly it will be exempted from liability -- good luck. Bye-bye, Red Hat!

  • Keeping a piece of software's source closed should result in harsh liability. Since users cannot examine the source to confirm bugs or even functionality, they are completely at the mercy of the vendor. Since the vendor has welded the hood shut, problems with the engine are THEIR FAULT.

    Open source software provides a method by which users can confirm functionality (checking the source to see it really does what it's meant to), report faults to the vendor and even make fixes themselves, if required. These factors should result in a vastly reduced liability, since this kind of software gives users the tools to take responsibility for their systems. Even if the user doesn't have the skills or inclination to use the source, they can hire someone who does.

    While this may sound like pandering to the open source crowd and Microsoft-bashing, it just seems to make good sense... keeping the source to yourself means that you have to take responsibility.
  • Suppose that someone is selling a voodoo book that teaches how to make a love potion. The author made a mistake and introduced a wrong ingredient that will make a person paralytic for 24 hours instead of falling in love when drunk. The publisher immediately releases errata for several wrong formulas, but the reader didn't know and thus used the buggy formula and damage was done. Should the publisher be held responsible?
  • Does this mean that we can sue Apache? This article [com.com] says that Apache and PHP have flaws. Come on guys, let's sue.
  • by Jason Levine ( 196982 ) on Thursday February 28, 2002 @09:24AM (#3083565) Homepage
    Considering the nature of software, bugs are a fact of life. No code is going to be 100% bug free unless it's a simple "Hello World" program. It's how the vendor treats the bugs that counts.

    If the vendor is informed and fixes the bug in a reasonable amount of time then they shouldn't be liable. (Reasonable being a flexible span of time. If a bug is particularly vexing but they keep their users informed of the progress, then they should get extra time. But if they just say "yeah, yeah, we'll work on it" and then nothing happens for a month, they don't get extra time.) Of course, if the vendor is informed about the bug and does nothing about it, they should be made liable.

    Finally, if they release a patch but the user doesn't install it and has their security compromised (e.g. what happened with CodeRed), the user is the one at fault. In this case, it would be like an automobile manufacturer issuing a recall, a consumer ignoring the recall, and then getting into an accident because of the very defect that prompted the recall. Software companies shouldn't be liable for the stupidity/ignorance of their users.
  • ...most open source projects have CYA verbage in the licenses saying something like "THIS IS UNSUPPORTED, AS IS, USE AT YOUR OWN RISK, BLAH BLAH BLAH"...
  • Truth in Advertising (Score:3, Informative)

    by martyb ( 196687 ) on Thursday February 28, 2002 @09:48AM (#3083652)

    Automatically applying patches is NOT a solution! There are countless stories where applying patches caused formerly working software to crash. (*)

    One major advantage of OSS vs. commercial software is the availability of the source code. Another major benefit, but less well recognized, is the visibility of REPORTED DEFECTS. Prior to obtaining an OSS application, say on SourceForge, I can peruse the bug list and get a complete list of reported bugs. What are the chances I can see the complete list of reported defects in, say, Microsoft Office?

    Okay, why not just have a law passed that requires commercial software developers to make all reported bugs publicly visible? Ain't gonna happen; political contributions and lobbying efforts would squash that in a heartbeat.

    BUT, there's another approach. Don't use LEGAL requirements -- make it a MARKET requirement.

    In other words, consider these two scenarios when making a recommendation for two different software packages:

    • This commercial package has these features and an undisclosed list of reported bugs. When bugs are discovered, we have to wait for the vendor to create a fix.
    • This OSS package has these features, too, but here's a complete list of all reported bugs. Further, whenever any new bugs are discovered, I can find out about it immediately, and we can fix the code ourselves.

    In short, software will always have bugs -- just as OSS makes the code available, we can use market forces to trumpet the same visibility of the known (and future) bugs.

    (*) Footnote: Feature vs Bug... many years ago I worked for 2+ years testing a COBOL compiler that was being upgraded to support the latest standard. The version that was already out in the field was rife with bugs. Several customers were worried that we were going to fix some bugs they depended on! Though it was non-standard code, they had developed workarounds and used them extensively -- fixing the bugs in the compiler would have broken their programs!

  • by phillymjs ( 234426 ) <slashdot@stanTWAINgo.org minus author> on Thursday February 28, 2002 @09:49AM (#3083655) Homepage Journal
    It should not be possible for Microsoft (or any company, but Microsoft is the best example) to boast about how robust and secure their products are in their marketing, and then make the purchaser agree to a EULA that removes their liability, if their claims turn out to be untrue.

    This is especially true of their enterprise products, like, say, Outlook/Exchange. It should not be a full-time job patching and reconfiguring the damn stuff to keep the misfit script kiddies with Outlook Worm Kits from bringing down an entire organization's e-mail system. Microsoft should damn well have been able to be held liable for something like ILOVEYOU, that knocked some very large companies' mailservers off the Net for days.

    Imagine if, after all the car commercials boasting airbags, crumple zones, etc., those safety features turned out not to work -- and then, paging through the Owner's Manual from your hospital bed, you found a EULA in the back disclaiming Ford/GM/whoever of all liability?

    The biggest bullshit, though, is the notion that people will eventually get pissed off about software not living up to the hype and take their business elsewhere. If that theory held water, Microsoft would already be a memory amongst sysadmins these days. Companies are practically locked into using Microsoft products. And what people use at work, they will buy and use at home because by and large, they are sheep who fear change. Which is exactly the kind of environment in which companies like Microsoft can shovel sub-par shit out the door, not be liable for its flaws, and still thrive.

    ~Philly
    I don't think that Nimda is a good example of the sort of thing that Microsoft could be held liable for. Errors that cause data loss, yes. Errors that cause the machine to lock up and cost you time, yes. This is akin to holding car manufacturers liable for things that go wrong with the car (exploding fuel lines and such), and is perfectly justifiable since the manufacturer is directly at fault.

    The fault for Nimda, however lies squarely on the shoulders of the virus author. Claiming that an operating system, no matter how insecure, is at fault, is like claiming that non-bulletproof t-shirts are responsible for murder by gunshot. Murderers are responsible for murder. Virus authors are responsible for viruses. Software writers are responsible for software problems-- but not for criminal acts by other people.
  • One difference (Score:2, Interesting)

    by russianspy ( 523929 )
    First: you do not BUY software. You buy a license to use it - like a service. If you hire a company to provide support or to manufacture something for you, they're responsible.
    There is a related story that happened a couple of years ago (don't remember exactly when). Tim Hortons has a Roll Up the Rim to Win promotion every year. When you buy a coffee, you can roll up the rim of the cup to see if you won a prize (all I ever got was donuts and more coffee - go figure!). Well... it came out that some of the people who worked at the company manufacturing those cups were cheating by unwrapping the rims and stealing prizes. I know that company lost the contract - I don't remember if they were sued for damages as well. I think they were - they failed to provide the service they were contracted for.

    OSS is a bit different. It's public domain. Everyone owns it - therefore if you choose to use it, and if it breaks you yourself are responsible for damages.

    That's what I think - I don't know how accurate this is, but I do realize it's not such a great thing. If a company has to choose between an OSS and a proprietary solution, they will choose the proprietary one, simply because IF something goes wrong, they have a chance of getting some compensation.

    It's a simple choice - do you buy a reliable car, or one less reliable with insurance?

  • It's a cruel fact of life ... developing 100% secure software isn't possible, just as there is no such thing as bug-free software.

    No security analyst worth his weight in sand will ever tell you that a system is 100% secure. That is why people like Bruce Schneier ("Secrets and Lies: Digital Security in a Networked World") [amazon.com] talk about security landscapes and weighing the cost versus benefit of implementing a given security feature.
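    To put rough numbers on that weighing, here's a back-of-the-envelope sketch using annualized loss expectancy (ALE = single loss expectancy × annual rate of occurrence, a standard risk-analysis formula). All the figures below are invented for illustration:

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Annualized loss expectancy: expected yearly loss from a threat,
       i.e. cost of one incident times incidents per year. */
    static double ale(double single_loss, double incidents_per_year) {
        return single_loss * incidents_per_year;
    }

    int main(void) {
        double exposure = ale(50000.0, 2.0); /* $50k per breach, twice a year */
        double control_cost = 30000.0;       /* yearly cost of the mitigation */

        assert(exposure == 100000.0);
        /* The feature pays for itself if it costs less than the loss it averts. */
        printf("implement mitigation? %s\n",
               control_cost < exposure ? "yes" : "no"); /* prints: yes */
        return 0;
    }
    ```

    The point of the exercise isn't precision -- it's that "secure enough" is a business decision, not a binary property.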

    It is up to the System Administrator(s) to determine which solutions make sense. Most know that they are buying a system that is hackable by definition when they use M$, but find the risks acceptable.

    What M$ should be held liable for is their blatant lies when they say that security is a priority for them, because it isn't ... making money is the sole goal (which just happens to require that they at least provide the illusion of some security). In the Open Source community, money tends to be a secondary or tertiary factor, with quality being number one, and that contributes to the greater security level obtained. Still, no sufficiently useful system is 100% secure.

    Sorry folks, but the American system of injustice has sold you a bill of goods ... namely, the idea that someone is necessarily liable every time things go wrong. Mistakes happen, be they faulty tires (Firestone) or hackable systems. The time for liability is when a company is made aware of a problem and doesn't address it -- for example, Ford/Firestone for continuing to pump out death machines, or M$ for keeping XP on the market now that everyone knows it's flawed to the core.
  • They would be able to pay it out of petty cash.
  • by Jerf ( 17166 ) on Thursday February 28, 2002 @11:56AM (#3084417) Journal
    You know, I have zero problem with saying people should be responsible for software they write, at least in the abstract. The idea that they should not is kind of silly, if you think about it honestly.

    But at this point in time, it would be disastrous to start allowing liability. Why? Because liability is determined by the court system, and with no offense intended, the court system is incompetent at this time to make those sorts of decisions.

    I have no faith in the ability of the court system to distinguish between an obscure flaw that allows a man-in-the-middle attack on a so-called "secure" connection, and a glaringly obvious security problem like "By default, everyone in the world has full access to your desktop." (reference: Symantec's PCAnywhere for a *very* long time.) In fact, I don't trust me to make those decisions.

    At this point in time, and at our current technology level, as we've all heard and said many times, one wrong character in the wrong location, out of billions, can cause a difficult-to-detect error that, when exploited, can give an attacker root access. It's difficult to come up with any sort of definition of proportional responsibility.
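    To make the "one wrong character" point concrete, here's a minimal, hypothetical C sketch of the classic off-by-one (`<=` where `<` was meant) -- the kind of single-character slip that, in the stack-smashing case, can overwrite adjacent memory and hand an attacker control. The demo deliberately oversizes the destination so it is safe to run:

    ```c
    #include <assert.h>
    #include <stddef.h>

    /* One-character bug: "<=" instead of "<" copies n+1 bytes when the
       caller asked for n -- past the end of an exactly-sized buffer,
       that extra byte lands in whatever memory sits next to it. */
    static size_t buggy_copy(char *dst, const char *src, size_t n) {
        size_t copied = 0;
        for (size_t i = 0; i <= n; i++) {  /* BUG: should be i < n */
            dst[i] = src[i];
            copied++;
        }
        return copied;
    }

    int main(void) {
        char src[] = "AAAAAAAA"; /* 8 bytes of payload plus the NUL */
        char dst[16];            /* oversized so the demo itself is safe */
        assert(buggy_copy(dst, src, 8) == 9); /* asked for 8, wrote 9 */
        return 0;
    }
    ```

    A code review can stare straight at that loop and miss it -- which is exactly why after-the-fact liability judgments about "reasonable care" are so hard to make.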

    If a bridge collapses because the tons upon tons of concrete used were an inferior grade, that's one thing. But if the bridge collapses because one screw was made of aluminum instead of steel, is that worth suing over? My real point can be seen in how this metaphor is not applicable; a bridge would never collapse over something so trivial unless it had other fundamental problems! Software is fundamentally more fragile. (So far, all attempts to negate this have essentially failed, and I'm not willing to count on some miraculous development in the future. Though I suppose if such a thing occurred, and it was legally mandated to use formal methods, that would make people like me who could understand them suddenly no longer competing with hacks who think they're leet 'cause they can sorta use Perl... >:-) )

    Even a professional like me might be hard pressed, after the fact, to determine which sort of problem is before the court, to determine liability. Do you want to leave it in the hands of lawyers?
