Laws to Punish Insecure Software Vendors? 581

Gambit Thirty-Two writes "An influential body of researchers is calling on the US Government to draft laws that would punish software firms that do not do enough to make their products secure." Yeah, that'll work.
  • fgp (Score:2, Funny)

    A visiting professor at the University of Alabama is giving a seminar on the supernatural. To get a feel for his audience, he asks:

    "How many people here believe in ghosts?" About 90 students raise their hands.

    "Well that's a good start. Out of those of you who believe in ghosts, do any of you think you've ever seen a ghost?" About 40 students raise their hands.

    "That's really good. I'm really glad you take this seriously. Has anyone here ever talked to a ghost?" 15 students raise their hands.

    "That's a great response. Has anyone here ever touched a ghost?" 3 students raise their hands.

    "That's fantastic. But let me ask you one question further... Have any of you ever made love to a ghost?"

    One student in the back raises his hand. The professor is astonished. He takes off his glasses, takes a step back, and says,

    "Son, all the years I've been giving this lecture, no one has ever claimed to have slept with a ghost. You've got to come up here and tell us about your experience."

    The redneck student replies with a nod and begins to make his way up to the podium.

    The professor says, "Well, tell us what it's like to have sex with a ghost."

    The student replies, "Ghost?!? I thought you said 'goats'."
  • open source (Score:5, Insightful)

    by kz45 ( 175825 ) <kz45@blob.com> on Wednesday January 16, 2002 @11:35AM (#2848479)
    What will this mean for open source? OSS companies/programmers will be just as liable as closed source ones.
    • Re:open source (Score:3, Insightful)

      by zebs ( 105927 )
      The article says 'software companies'. Besides, you pay for commercial software, and it's reasonable to expect it to be installed in a way that doesn't expose your computer to any form of attack.

      With open source you didn't pay, and it's a matter of trust between the user and developer that the program is secure... and if you're really worried about it, you have access to the source.
      • Re:open source (Score:2, Interesting)

        by kz45 ( 175825 )
        With open source you didn't pay, and it's a matter of trust between the user and developer that the program is secure... and if you're really worried about it, you have access to the source

        if Open Source developers have no liability as you say, the business world will have a very difficult time embracing it.
        • OH PLEASE! (Score:2, Insightful)

          by gfxguy ( 98788 )
          if Open Source developers have no liability as you say, the business world will have a very difficult time embracing it.

          That's ridiculous. How many times have you heard of a commercial company being held liable for crappy products? How many products has MS released that have NOT worked as advertised, yet required consumers to PAY to upgrade to a version that should have worked to begin with?

          Besides that, all the software licenses (shrink wrap or no) basically say "we're not responsible".

      • Re:open source (Score:3, Interesting)

        by alen ( 225700 )
        So if I buy Redhat 7.2 or Suse and it is later found to be full of security holes, then I can't sue them under this proposed law? Why not? They sold it. MS Windows is full of third-party apps that MS licensed and included as part of the package. Look at IE: most of it was written by someone else and licensed by MS.
      • Re:open source (Score:3, Insightful)

        by Flower ( 31351 )
        The article says 'software companies',
        • Redhat Inc
        • Suse
        • Slackware
        • OpenBSD
        • FreeBSD Mall, Inc.
        • Caldera
        • Progeny
        • etc., etc., etc.

        These are companies that hire programmers, go through source code and make distros that people pay money for. I would consider them software firms that would fall under this proposal and I also consider them critical for the success of Open Source software.

        Now what happens to these companies when some project they have little control over but include in their distribution has a critical flaw that gets exploited? How vulnerable to litigation do they become? Guess we'll have to wait and see.

        • Re:open source (Score:4, Insightful)

          by Computer! ( 412422 ) on Wednesday January 16, 2002 @12:13PM (#2848808) Homepage Journal
          that gets exploited

          A critical point, I think. Keep in mind that these security holes are not exactly akin to a lock with a pink sticker that says "This lock doesn't actually work". A lot of research and experimentation is necessary in order to exploit those security holes. Research and experimentation carried out by criminals. As much as I would love to see software companies held accountable for the generally terrible state of software quality industry-wide, I'm not sure it's fair to hold Microsoft responsible for making possible the actions of a malicious hacker. Is it Honda's fault a slim jim opens the door of my Civic?

          • A Certain Level (Score:5, Insightful)

            by virg_mattes ( 230616 ) on Wednesday January 16, 2002 @01:25PM (#2849307)
            > I'm not sure it's fair to hold Microsoft responsible for making
            > possible the actions of a malicious hacker. Is it Honda's fault a
            > slim jim opens the door of my Civic?


            Well, to get a realistic comparison, you'd need to compare on even ground. Pretend for a moment that your car door locks went to "locked" when you pushed the lock button, and "unlocked" when you pushed the unlock. However, they didn't actually engage the tumblers in the door, so when it's locked, the handle still opens the door. Now, there's a switch inside the door that you can get to by pulling the door side off, and when you throw it the tumblers connect and when the door says "locked" it now really means it.

            Now, would you blame Honda if they didn't set the switch to "on" at the factory, and didn't tell anyone about the switch, and only acknowledged that it exists when someone in the field finds it and threatens to tell the general public?

            I'd bet you would. That's a fairer comparison, and so yes, I think the companies that produce easily exploitable software should be brought to a reckoning for it.

            Virg
    • Re:open source (Score:5, Insightful)

      by glitch! ( 57276 ) on Wednesday January 16, 2002 @11:42AM (#2848526)
      OSS companies/programmers will be just as liable as closed source ones.

      It does not have to be that way. Why not put in an exemption for software that comes with source code? The presumption could be that releasing source code allows the user to take responsibility for the correct operation of the software. Also consider that the OSS writer has little or no control over changes the user might make (and that's one of the main points, isn't it?)
      • Just like a LLP (Score:5, Interesting)

        by Mr. Fred Smoothie ( 302446 ) on Wednesday January 16, 2002 @11:55AM (#2848667)
        The software producer's liability should be limited to the amount of their financial return on the software, except in cases where gross negligence is apparent. If I never made a dime off the sale of the software, I should be liable only for that $0.
      • Do you really think that if this becomes a bill with any serious chance of passing, Microsoft won't have lobbied sufficiently to make it a threat to its most serious competition (Linux and OSS)?
      • Re:open source (Score:5, Insightful)

        by athakur999 ( 44340 ) on Wednesday January 16, 2002 @12:05PM (#2848742) Journal
        The presumption could be that releasing source code allows the user to take responsibility for the correct operation of the software.


        That's a bit like saying a car company shouldn't be held responsible for putting faulty brakes on a car, since after all, the car owner could have replaced the brakes with something that worked.
      • Couldn't the GPL be modified for this even without an OSS clause? Something along the lines of "By using this program, you acknowledge the availability of source code and accept responsibility for any and all warranty requirements." (IANAL, so that's probably well below the threshold of what's required, but it's my idea of what would work.)
      • Re:open source (Score:5, Insightful)

        by kin_korn_karn ( 466864 ) on Wednesday January 16, 2002 @12:31PM (#2848956) Homepage

        It does not have to be that way. Why not put in an exemption for software that comes with source code? The presumption could be that releasing source code allows the user to take responsibility for the correct operation of the software. Also consider that the OSS writer has little or no control over changes the user might make (and that's one of the main points, isn't it?)

        What needs to be made illegal are EULAs that absolve the software creator of guilt for flaws. Ford is liable for putting the wrong tires on SUVs and causing people to die. Ask Explorer owners (if you can talk to people that would buy one nowadays) how they would have reacted to such a license, and imagine how the courts would have reacted.

        You've also made an excellent point about the futility of the GPL, but I digress.
      • Re:open source (Score:3, Insightful)

        by erroneus ( 253617 )
        Hear, hear!

        OSS companies/programmers will be just as liable as closed source ones.

        It does not have to be that way. Why not put in an exemption for software that comes with source code? The presumption could be that releasing source code allows the user to take responsibility for the correct operation of the software. Also consider that the OSS writer has little or no control over changes the user might make (and that's one of the main points, isn't it?)


        Furthermore, OSS authors do not always have control over what versions of what libraries are being used, or for that matter, what compiler is being used. With source code, mileage *will* vary. With a complete binary-only distribution, it's another matter.
      • Re:open source (Score:3, Informative)

        by sheldon ( 2322 )
        Well, first of all, the exemption would never get into the law, because those who have the money have the lobbying power. Despite their hatred, not one of Microsoft's competitors would step up in support of this law. Oracle, Sun, Apple, etc. would all be lobbying against it as hard as Microsoft.

        Second of all, it wouldn't matter anyway. If I walk into a business suggesting they buy a warrantied product from a reputable manufacturer, and my competition walks in suggesting they use a free product with no warranty, I will win the contract. I guarantee it.
    • OSS companies/programmers will be just as liable as closed source ones.

      And how, exactly, is this a bad thing? Personally if RedHat got hauled into court due to their history of sloppiness, I'd be cheering.
      • by wiredog ( 43288 )
        What if Linus got hauled into court after ext2fs ate someone's data?
      • Re:open source (Score:3, Insightful)

        by SirSlud ( 67381 )
        Really now. People equate OSS with guys at home working for free. I support RedHat being held liable for software they write if they are making money off of it.

        But the authors of software that is free, free as in beer, should carry no liability. I've always felt that if you are providing something for free, and you don't force it into people's hands, those people should understand the risks of using it.

        However, if you're making money off of it, that money should go toward making sure the software is stable and secure, and that people get what they pay for. So, in that case, I think certain reasonable guidelines on security and reliability should and could be upheld by consumer protection laws. I think there are certain things, such as vulnerable services being on by default in shipped software, that should be illegal. The way some software vendors ship products with 40 outside-facing services to the novice user who will never ps aux or check out the services control panel is, to me, an unnecessary, easily preventable, and pluggable hole, especially considering the number of people who use them and the value of the data that gets put on these systems.
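
        To make the parent's point concrete, here is a rough shell sketch of the audit it describes: checking what a fresh install exposes to the network and turning off the rest. The commands are standard, but the init-script path and service name in the last line are purely illustrative and vary by distro:

        netstat -an | grep LISTEN    # every socket something is listening on
        ps aux                       # the processes behind those sockets
        # then stop anything not needed, e.g. (hypothetical service name):
        /etc/init.d/portmap stop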
        • Re:open source (Score:3, Informative)

          by gus goose ( 306978 )
          I am afraid that you are mistaken... Redhat makes no money off it; they make money from selling manuals, CDs, and support. Re-read the GPL: Redhat IS Free (as in Beer) except for delivery charges, P&P, printing, paper, CDs, etc., but the software itself is Free (as in Beer).

          gus.
        • So terrible flaws such as the recent Internet Explorer problems wouldn't apply, because the free clause makes them exempt? Sounds like this won't do much good.
    • Also interesting to read:

      Open source developers face new warranty threat [theregister.co.uk]
      Rosen and Kunze were attempting to secure an exemption from implied warranties of merchantability, fitness, or non-infringement for a computer program, "provided under a license that does not impose a license fee for the right to the source code, to make copies, to modify, and to distribute the computer program."
      The proposal would have brought the rest of the States in line with Maryland.
      The replacement version, which reads "or to distribute...", is joined by a provision that nullifies the exception for software licensed to consumers

      The complete text can be found here [nccusl.org]....
      a) Except as provided in subsection (b), the warranties under Sections 401, and 403 do not apply to a computer program if the licensor makes a copy of the program available to the licensee in a transaction in which there is no contract fee for the right to use, make copies of, modify, or distribute copies of the program.
      (b) Subsection (a) does not apply if the copy of the computer program is contained in and sold or leased as part of goods or if the transaction is with a consumer licensee that is not a software developer.

  • Easy Money (Score:2, Insightful)

    by rhost89 ( 522547 )
    <SARCASM>

    So this means that if I configure my computer without a password, I can sue the manufacturer for defective security in their software if it gets hacked.... Cool

    </SARCASM>
    • by SirSlud ( 67381 )
      Considering what things MS leaves on by default in Windows when it ships, you could buy their software for $200, and then get a $20,000 lawsuit-fueled mail-in rebate! Talk about savings!

  • Aimed at Microsoft, George Bush's friends in Redmond. Asking them and others to actually produce secure and reliable software, and making them responsible for their actions.

    It sounds ridiculous that this isn't already covered by things like consumer protection, but in fact those licenses make sure that the vendors have no responsibilities. And no one is going to change that in the US when there is a president who doesn't want to prosecute the biggest violator of security concerns out there for monopolistic practices.
    • I know this will get some dissenting responses, but I feel I should say it.

      I have administered WindowsNT 4 and Windows 2000 systems. I have *NEVER* been cracked, hacked, or otherwise seen any ill effects from the security flaws that do exist in any of the Microsoft products we use on our server platforms.

      I have written WSH scripts that automatically update and spread any updates to all of my systems. All I have to do is approve the update, which is done after I test it. I stay on top of their security patches and simply followed their recommended guidelines for locking down a server. I also disabled several things I know are exploitable.

      The funny thing is, I end up doing the same thing with the latest and greatest from RedHat. They make it a little easier out of the box to keep up with the updates etc. I have to turn off services I don't want and follow the "common sense" guide of things like turning off services I don't need.

      I am not saying my boxes are uncrackable, or that I am all knowing, or even that great at securing systems.... Anyways.
      • > I am not saying my boxes are uncrackable, or that I am all knowing, or even that great at securing systems.... Anyways.

        So what are you saying? :) That you haven't been cracked? Hehe, reminds me of my giraffe scarecrow .. works like a charm, I haven't ever seen any giraffes around my lawn. ;)
  • Hard to implement (Score:2, Insightful)

    by RazzleFrog ( 537054 )
    How do you quantify what is doing enough? If they release a patch in two weeks, is that enough? How about 4? Is releasing a patch not enough? Should they actually call people and tell them to install a patch that has been out for months? I mean, there is no doubting that Microsoft software has holes, but they do patch them. The question is, do they do it fast enough, and do they make it required for users?
    • by Mr. Fred Smoothie ( 302446 ) on Wednesday January 16, 2002 @12:42PM (#2849052)
      Your post is interesting, especially in light of the difficulty a court may have in accurately assigning liability to the correct party.

      For instance, am I liable if I use the standard C function gets() in a program? I, as the program vendor, can argue that that's what was taught in my undergrad CS course, or I could point the finger at the language designer or C library vendor.

      What about a program I write that communicates w/ other software via a standard protocol, and works perfectly if the other software adheres strictly to that protocol but fails in combination with another program which implemented that protocol incorrectly; am I to blame, or is the other vendor? What if the spec is vague?

      As I've said in other posts, the potential for good legislation along these lines is there, but only with *heavy* involvement of people who understand issues such as these, alongside the industry lobbyists, consumer advocates and politicians.
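
      As a concrete illustration of the gets() point above, here is a minimal C sketch (buffer size and messages are arbitrary) contrasting the unbounded read gets() performs with the bounded fgets() call a careful vendor would use instead:

      #include <stdio.h>
      #include <string.h>

      int main(void)
      {
          char buf[16];

          /* gets(buf) would read a line with no bounds check, so any input
             longer than 15 characters overruns buf (the classic exploitable
             buffer overflow). fgets() caps the read at sizeof(buf) bytes,
             so the overflow cannot happen. */
          if (fgets(buf, sizeof(buf), stdin) != NULL) {
              buf[strcspn(buf, "\n")] = '\0';  /* strip the trailing newline */
              printf("read: %s\n", buf);
          }
          return 0;
      }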

  • by squarooticus ( 5092 ) on Wednesday January 16, 2002 @11:38AM (#2848493) Homepage
    Be careful what powers you give to the government.
    • Just as importantly, beware what responsibilities you let corporations abdicate. "...but I had my fingers crossed behind my back, and only mentioned that in fine print I made you agree to..." should not be a valid defense against damage caused by software which is patently faulty, which the producer knew about, and which the producer wants to charge you to fix.
    • Already a member of the Green Party, thanks.

      Be careful what powers you let corporations have when you let them run amok without government regulation.
    • Be careful what powers the government assigns to its proxies.

      Such as special dispensations to ignore normal contract law by selling "licenses", such as copyright, such as patent, ...

      *Real* libertarians aren't as one-sided as you seem to be. They actually believe in fewer laws of any kind, not just fewer of the kind favorable to their favorite soapbox.
    • car safety (Score:3, Interesting)

      by coyote-san ( 38515 )
      I used to support the Libertarians. Why should The Man have the right to tell idiots to wear helmets? Just make motorcycle riders carry enough insurance to cover their costs when they get non-fatal brain injuries (so I don't have to pay for their mistakes) and let them have fun.

      But then there are the impaired drunk drivers (not to trivialize the 0.08 crowd, but I'm far more worried about Bubba with a 0.24 BAC). They tend to take out other people as well. When they drive impaired, they're a threat to all of us. I don't think we should ban alcohol, but I don't see a problem with the state having the right to crack down on repeat drunk drivers, because there are documented cases of drunk drivers who have been in multiple accidents resulting in death.

      Taking it one step further, I remember being poor and in college and resenting the mandatory vehicle checks my state required. Then I moved to a state that didn't have mandatory vehicle checks... and heard some horror stories of what those vehicle inspections found in other states. Again, I don't give a damn if some moron wants to jack up his pickup with ice hockey pucks... until he takes it on the road and they suddenly shear, forcing his vehicle to roll/tumble into my oncoming traffic lane.

      Now let's revisit the software issue. Once again, I really don't give a damn what people do on their own systems that are not attached to the net. But I do care when I can't use my cable modem because NIMDA a NIMDA stupid NIMDA coding NIMDA bug NIMDA NIMDA left NIMDA many NIMDA NIMDA NIMDA systems NIMDA NIMDA open NIMDA NIMDA NIMDA NIMDA NIMDA.

      The Libertarians have a point when they argue that the state should rarely, if ever, protect an individual from themselves. And that the state should rarely, if ever, protect people from inconsequential behavior of their neighbors. (You don't like the fact that your neighbors are gay? It's your problem, not theirs, unless they're doing stuff that would be a problem regardless of their sexual orientation.)

      But once you get into behavior that demonstrably harms others, or could reasonably result in harm to others, it's a whole new game. Unfortunately far too many Libertarians don't get this.

      In this particular case, we need to see the proposals. But there is absolutely no way you can argue that Microsoft's shoddy practices have not harmed many innocent people. If it takes a law to force them to accept that their indifference demonstrably harms others, so be it.
  • Terrorism (Score:2, Interesting)

    by CounterZer0 ( 199086 )
    So, if a law like this is passed, will the people who break it be branded IT Terrorists? I mean, everything else is terrorism now, why stop here?
  • by alen ( 225700 ) on Wednesday January 16, 2002 @11:39AM (#2848502)
    Linux, Solaris, HP-UX, MS Windows and a bunch of other products have holes in them that SANS tells others about. Has there ever been a piece of software with no security holes?
    • I think this is exactly the problem they're talking about. Not only would OS vendors ALL be liable but anyone who makes any type of network connected software would be as well.

      The linux kids might be happy about MS getting hit for $10K or whatever per IIS hole, but when the same thing starts happening to proFTPd, BIND, sendmail, etc... the shat will really start hitting the fan!

      If such a law does get passed, it will certainly be ruled unenforceable the first time it's tested in court.
      • M$ and Big Software would love this law. It would effectively kill the free/open-source software movement. Who besides MS, Sun, Oracle, et al. can afford to take a chance on getting hit for $10k for each bug? I wouldn't be surprised if Larry, Bill, and Bill are behind this...
    • I don't think the point was to punish companies because their products have problems; they would be punished if it could be shown that this was more or less deliberate, i.e., the company didn't bother to even try to make it secure.

      In case of, say, Microsoft, the problem is not necessarily that they don't (try to) fix the known problems, it's that they somehow managed not to realize the obvious potential problems (with email/documents allowing active fully enabled scripting) when designing products in the first place.

    • #include <iostream>

      int main()
      {
          std::cout << "Hello, World";
          return 0;
      }

      as far as I know, the root hole was fixed in 0.2.3
    • by stilwebm ( 129567 ) on Wednesday January 16, 2002 @12:21PM (#2848872)
      A law like this would benefit two camps. One would be large software companies, since their smaller competition would be squashed as the cost of doing business reached prohibitive levels. The other beneficiary would be the insurance industry, which would greatly increase premiums for software businesses, since insurance would be the best way for those businesses to protect themselves. Consumers would only suffer.
  • Fine them? (Score:3, Funny)

    by Geeky ( 90998 ) on Wednesday January 16, 2002 @11:40AM (#2848505)
    Your software is insecure. Please pay your fine by credit card at http:// ...
  • Oh my, the irony (Score:4, Insightful)

    by Reckless Visionary ( 323969 ) on Wednesday January 16, 2002 @11:40AM (#2848507)
    You know, it used to seem like the software security and freedom communities were pretty closely related. Apparently the NAS doesn't have the same laissez-faire attitude as most of the freedom advocates.

    It's always interesting when those who call for freedom and security for themselves can only figure out how to do it by reducing the freedom of others. Now they want to legislate software standards? Come on, you have to be against that.

    • I'm not necessarily advocating this legislation, but your assumption that regulation is automatically anti-freedom is flawed. Freedom and laissez-faire are not synonyms; there is also the "freedom means responsibility" concept. Just as with free speech, you don't get to "say whatever without consequences": (pre-)censoring things is illegal, but you may be nailed later on the contents. Another way to put this is that libertarians have no monopoly on freedom, even though the two things are related.

      That being said, the goal (having some recourse against foolishly ignorant s/w companies) could be more easily attained by just clearly abolishing EULAs and letting legal action proceed based on the actual damages products cause (if any). I know the administration doesn't really have power over the courts (and shouldn't have), but it should be able to test EULAs in court.

  • Reconsidering that plaintext cookie in my browser that holds my account password, are we?

  • emmm... (Score:2, Interesting)

    by einer ( 459199 )
    This is definitely a double-edged sword. This could bite anyone on the ass. MS doesn't hold a monopoly on crap code (arguable). What happens to people who don't sell the software, but wrote it and make money on its support? (I'm thinking of Apache here.)
  • Lobbying against it? (Score:2, Interesting)

    by coug_ ( 63333 )
    So.. if a company lobbies against this law, wouldn't that open them up to criticism? I mean, it'd essentially be like them saying "we don't want to be responsible for our insecure software."
  • Freedom of Speech (Score:4, Insightful)

    by CTalkobt ( 81900 ) on Wednesday January 16, 2002 @11:41AM (#2848518) Homepage
    This raises some constitutional issues. Do I have the right of freedom of speech (as code has been found to be in some cases) to utter an incorrect program?

    An additional question: should all software now come with a disclaimer that specifically disclaims the implied warranty and states that there is no warranty? Would that be legal under the proposal?
    • Re:Freedom of Speech (Score:4, Interesting)

      by cperciva ( 102828 ) on Wednesday January 16, 2002 @11:53AM (#2848643) Homepage
      This raises some constitutional issues. Do I have the right of freedom of speech (as code has been found to be in some cases) to utter an incorrect program?

      Do you have the right of freedom of speech to utter other potentially hazardous comments? Yelling "FIRE!" in the middle of a crowded theatre is dangerous, and illegal. If you're engineering a bridge, does "freedom of speech" give you the right to design it so that it will collapse when people try to use it?

      There is a long legal history of freedom of speech ending when it causes harm to others.
      • by sam_handelman ( 519767 ) <samuel.handelmanNO@SPAMgmail.com> on Wednesday January 16, 2002 @12:22PM (#2848887) Journal
        There is a long legal history of freedom of speech ending when it causes harm to others.

        You don't need to open that whole kettle of worms at all, in this case. The right to say something does not equate with the right to sell it - unless it is sold for the purpose of communication (which commercial software is not.)

        People who write software and then sit on it, or only give it to a few friends, cannot and should not be held accountable for their software not working, unless (like yelling "FIRE!" in the middle of a crowded theatre) there is clear evidence of malicious intent (computer viruses).

        Someone who distributes software for free ought to be required to disclaim any warranties, which they already do, and that is fine.

        On the other hand, when you sell a piece of software there is an implied warranty of merchantability that you cannot disclaim. Extending that warranty to include security is not a free speech issue. Your right to write any code you want is still protected; you just cannot necessarily sell it.

        By extension, however, code written for the purpose of communication - including "here is how you write DeCSS" or the example code in a CS textbook - would still be protected, and you'd still have a right to sell it, whether or not it worked or was secure.
    • Of course you have the right to utter an incorrect program. And due to the nature of free speech other people can call you on the flaws of what you've said.

      But if you have been reading some of the latest decisions in the courts, software also has a functional aspect that can be litigated. Once you package that program into a binary and start selling it, the issue is less the code being free speech and more the executable being a product.

  • by Pinball Wizard ( 161942 ) on Wednesday January 16, 2002 @11:42AM (#2848530) Homepage Journal
    If you are talking about imposing rigid design and coding standards on software that is released to the public, it could have a far more adverse effect on small software publishers and open source projects than on, say, Microsoft.

    Seems to me this will have the least impact on those who need to pay attention to security the most (large software companies), while having the potential to make it harder for the "little guy" to write and publish software.

  • by jarodss ( 243400 ) <`moc.liamtoh' `ta' `97siupudekim'> on Wednesday January 16, 2002 @11:42AM (#2848533) Homepage
    Anyone ever read their full End User License Agreements, especially MS's?

    They always have a clause saying that anything bad that happens while using the product is not the vendor's fault.

    Now IANAL, but I thought that by clicking I Agree, you were actually agreeing to that.
  • by Mr_Perl ( 142164 ) on Wednesday January 16, 2002 @11:43AM (#2848534) Homepage
    I suspect that this would ensure far less software gets produced by smaller vendors and individuals who can't afford the liability.

    Another good move for corporate America.

    Microsoft is able to defend itself against the government. Are you?
    • It would also result in far less software being produced for businesses (large and small), since it would increase the cost of software so much. This would be a disaster for everyone.

  • by Rothfuss ( 47480 ) <chris...rothfuss@@@gmail...com> on Wednesday January 16, 2002 @11:43AM (#2848539) Homepage
    But Windows XP is not the only Microsoft product with security failings.

    For example Microsoft Bob.

    I've been waiting for a service pack for it for years. I'm just not as comfortable hooking Bob up to the internet as I once was. Bob has gotten more viral infections than an old French Whore in a port town.

    -Rothfuss
  • draft laws that would punish software firms that do not do enough to make their products secure

    What, legally require things like DRM?

    No, I know what it means. Who's going to check out all this software? Are we going to have a Federal Department of Bug-Finding, which employs 57,000 people trying to write Code Red 3?

    How will this result in anything other than higher prices and no change in the "security" of software?
  • Even the animated paperclip that acts as a helper in some Microsoft software can be compromised and turned against the computer it is being used on.
    I always said that thing was evil
  • I agree (Sort of...) (Score:3, Informative)

    by GSloop ( 165220 ) <networkguru.sloop@net> on Wednesday January 16, 2002 @11:44AM (#2848552) Homepage
    Laws that make a vendor produce a secure and safe product should apply to software too.

    Ford and GM shouldn't be allowed to produce cars that kill people, simply because they couldn't be bothered to make them safer - like exploding gas tanks - ok, so that's not such a great example... (grin)

    But really, put the responsibility where it lies. If I put a system out on the net and don't take some steps to make it secure, I should be liable for damages it causes when it's compromised. Same for SW companies. If you produce a product that doesn't meet the "reasonable man" test for care in producing the product, the maker should be liable for negligence.

    I might go even further though, and add some criminal penalties too.

    Software can be more reliable, bug-free and secure. (Go read "The Software Conspiracy".) Sure it will cost more, but what do you think all the virus outbreaks cost businesses and individuals? It's just a hidden tax. MS (and others) are just shifting the burden of producing software that works onto the users. It's cheaper for MS to produce the software, but lots more expensive for the users who use it.

    Finally, the legal system _IS_ part of the free market. The threat, and actual award, of damages to a plaintiff balances the market system. It's not just buyers and sellers in a wild, woolly mess...
    It just bugs me when "free market" proponents proclaim that the courts are unnecessary in the free market - bull! They are important, and the market will not function correctly without them!
  • I think a much better approach would be if companies had their software certified as secure. Just an independent group to come in and audit the release at varying levels of bulletproofedness.

    It'd drive up software costs, but if consumers don't care to look for the "Certified Secure!" brand, why should the government force it?

  • Do they really think more regulation is going to improve software? All this will do is make companies put time and effort into "compliance" instead of fixing the problems users are asking about.
  • The US National Academy of Sciences (NAS) has released drafts of a report commissioned after 11 September to look at the state of America's computer systems.

    If the USA Patriot Act could get passed after 9/11, so could this. Let's not forget that rationale goes the way of the buffalo in the months following an attack. And while I think a lot of software would be better than it is now if it were more secure, this wouldn't just affect MS.

    Let's hope nothing comes of this, as it could mean lawsuits against anybody and everybody if any piece of data becomes available to the wrong party.
  • good concept (Score:3, Insightful)

    by Kallahar ( 227430 ) <kallahar@quickwired.com> on Wednesday January 16, 2002 @11:47AM (#2848575) Homepage
    While the concept of "punishing" vendors for flawed products is a good one, trying to get the _government_ to do it is a bad one. For one, the government is very easily corrupted, and often looks the other way.

    A better solution is to allow people to sue software companies that produce software that does not do what it is supposed to do. For example, if Microsoft says they have the most secure servers on the market, they damn well better be.

    As soon as a few lawsuits are filed, things will change for the better. There's too much being "protected" by Microsoft software for them to continue business as usual for long if they get sued for every Nimda/Code Red/etc. out there doing damage.

    However, if the company puts out patches (such as through windowsupdate) and the user fails to apply them in a timely manner, it's the user that screwed the pooch, not the producer.
  • Where laws are concerned one must always tread carefully. What they are proposing is criminal penalties for security flaws. Imagine if the authors faced liability for writing an ftpd with back doors in it. Would people still be willing to write free software if that little disclaimer doesn't work any more?

    There is a long history of laws (e.g., the Sherman Act) designed to limit corporations that instead limit individuals.
  • We really need fair competition in computer software again. If there were reasonable alternatives (yes *we* know there are, but most companies are pretty clueless wrt actual computer-based solutions), there would be NO NEED for this law, as the better software *should* do better in the marketplace.
    • If there were reasonable alternatives (yes *we* know there are, but most companies are pretty clueless wrt actual computer-based solutions), there would be NO NEED for this law, as the better software *should* do better in the marketplace

      But it's not. Which suggests that it isn't actually better. Remember, "better" is relative, and what you look for may not be what someone else looks for in a product.
  • Not to sound insensitive to the software security issue, but going down this path simply encourages massive efforts at hacking one camp's software to further one's own favorite.

    Yes, people already do this, but to bring in the Gov't to be manipulated by these whims seems silly. Be responsible for your own security.
  • I cannot even imagine how a mandatory scheme would work in terms of criteria, process, remedies, etc. Using the auto industry as an example, we have government standards/regulations vis-a-vis car safety, we have government testing processes, we have mandated manufacturer testing, we have independent testing and verification, and a slew of consumer watchdogs trying to keep us informed.

    Translating this to the software world, frankly, makes my head explode just thinking about it. Consider:

    • the handful of auto manufacturers vs. the thousands of software houses who would potentially be safety-regulated
    • the cut-and-dried 'goal' of a car (transportation) vs. the many, many 'goals' of the many, many pieces of software to be certified
    • the bureaucracy (public and private) required to make this work

    I can see, perhaps, a public standards body to which software vendors could choose to submit their products. In this scheme the government could award some kind of 'certification label' that a vendor could use on their packaging, etc., indicating it's 'safe'. That would at least enable the marketplace to decide the importance of government certification. However, we'd still be left with the niggly questions of what 'safe' is and how we might determine 'safeness'. Maybe this is akin to 'quality' certification along the lines of ISO9001/2 processes(??).

  • And real basic liability -- their product does what their marketing claims say it will, or they fix it or take it back and provide some kind of refund.

    I'm willing to accept that it may have defects that may cause problems, but the defects in the software should be fixable by the vendor.

    I'm not willing to accept that the product has so many defects that it does not do what is claimed. I call that fraud.
  • This is bad news for anyone dabbling in software development: you make a piece of software to do something (in your opinion) useful, release it on your website where a few dozen people download it, it spreads a bit more, and suddenly someone somewhere does something that provokes your app to crash, or be used in a nasty way, taking out their box and the boxes on that network.

    Now you suddenly find yourself with a fresh lawsuit in your mail claiming you're responsible for the couple hundred thousand dollars worth of damage done to a company in some remote place you've never heard of...

    This sounds like an excellent way to deter anyone from ever releasing anything that's not tested and tested again, meaning development for a hobby will be a lot tougher.

    I see a suggestion like this working only after a developer clearly states and guarantees that his software will not in any way harm the user's equipment, or in cases of very gross negligence where the developer fails to provide even rudimentary security.

  • The market should work this issue out on its own if it is healthy.

    If organizations want higher security, they won't buy the insecure products. Businesses that have been burned by Outlook/IIS/Windows in the past will move to alternatives: GroupWise/Apache/*NIX.
  • I don't want to be able to punish software companies that make insecure software. It's a blanket statement that makes no sense -- there are plenty of things that are insecure by design. There are lots of things that really don't NEED tight security.

    What I do want is to KNOW when a supposedly secure product has a security leak. Moreover, I want to know the ramifications of the issue, the patch progress, and the currently known viruses/worms/other exploits roaming around.

    I really don't want to sue company X for making insecure software -- but I don't like the idea of them holding back on vulnerability announcements once they've been exploited.

    • Really, if we sue the crap out of them, they will not have the opportunity to fix the problem, since they will spend all of their time and money in court.

      This would just be a hindrance to making more secure software. We need something more like a "right to know" law.
  • How 'bout we just not use products that are known to have chronic security problems? That would send a clearer message to irresponsible companies than some silly law would.

    I do think companies like Microsoft need to take more responsibility for the huge gaping security holes in their products, but I'm not sure legislation is the right way to go about it. I do think consumers need to be better informed. When Ford recalls a few vehicles over some potential safety hazard, it's all over the evening news. But what about when a dangerous security hole is found in the world's most used operating system? The vast majority of users never even know about it.

  • Whatever happened to the good old days, when if a product was notoriously unsafe and insecure, consumers simply refused to buy it? The manufacturer's only choice then was to either fix the problems or cease production.

    If we bought cars with the same lack of discernment with which we buy software, Chevrolet could bring back the Corvair.
  • So would it be legal to hack again? Or would hacking a system to prove it's insecure cancel the other one out?
  • by acceleriter ( 231439 ) on Wednesday January 16, 2002 @12:03PM (#2848734)
    . . . we might want to consider that while "security" can mean keeping your machine from being 0wn3d, it can also mean "security" as in the Security Systems Standards and Certification Act [petitiononline.com], otherwise known as the "Enforced Copy Control and Free Operating System Elimination Act."
  • I hear a lot of people happy about the idea of going after M$ because they are the Evil Empire. I also hear a lot of people that are afraid of us open sourcers being attacked. Obviously, more secure and better written code should be standard.

    I'm not so sure that liability isn't a good thing. I'm not saying that a programmer should be completely responsible for his/her code and any results that occur. I can instead think of a different situation. Imagine I produce a piece of software and sell it/give it away. I don't think it's a bad idea for me to be required to:

    Openly reveal any and all known bugs/hacks/vulnerabilities (available from a website or whatever).


    If the product was PURCHASED, I should be required to give freely downloadable patches that will fix known (serious) bugs within a specified amount of time.

    If the product was given away free of charge, then the producer has no obligation other than to report the bugs (though giving away the source would be nice so others could fix it).

    If I fail to fix a serious, known bug within that specified time, I should first be barred from selling the product. It's buggy and has a flaw that's very bad. Selling more broken copies just looks like I don't care. I would call it malicious.

    If I still don't fix the issue, then I SHOULD be culpable for damages. By this point, I would have ignored many warnings and I have negligently continued on a dangerous course. If a bug in my code (which I retain the rights to) causes loss of data, property, or life, I have contributed to that loss.

    Now, of course end users will be responsible for installing patches, monitoring CERT advisories, etc. The end users are also responsible for attempting to avoid known bugs while waiting for a patch to become available. But, sometimes this isn't avoidable (think power generation system). If this particular bug is the cause, then by all means I think the users should be able to go after the company they PAID for damages. It's not like the software company didn't charge the end users to use the software. With those software rights, there really should be some sort of software liability (just like if I made a defective car, and then had to do a recall).
  • Absolutely no way (Score:2, Interesting)

    by Glorat ( 414139 )
    This is another one of those catch-all blanket decisions that seem all right at first thought, but if you apply it to all cases, you see that it is just disastrous. Let's look at who it affects most.

    BETA SOFTWARE
    Well, of course that has bugs. So we exempt this? OK, then all (Microsoft) software will be beta.

    NEWBIE / EDUCATIONAL
    Some newbie developer or uni student writes a piece of toy software and makes it available on his home page to boost his ego. Some other newbie academic downloads it and a bug in the "file manager" software deletes his C: drive.
    Exempt educational software??

    FREE BEER
    Some people make software out of the goodness of their hearts. "YMMV, maybe you like it, maybe you don't. No warranty." Maybe it is superb. But it might have a horrendous bug. So people will no longer release freeware.

    OPEN SOURCE
    Same as above, but with the source open, people can deliberately find bugs and cry out. Worse, there is plenty of open source software in commercial use (Apache etc.). What if in some new iteration of Apache there is a security hole? This will happen. Can people sue for this?! Can people sue the developers who worked on it for free? What exemption do you want now?

    MICROSOFT
    Well, by now OSS has dried up because everyone is too scared to give work away. Maybe top projects that have been so heavily scrutinised in the past might be OK (Apache, Linux kernel). Microsoft might just last a little longer than expected due to security through obscurity, but of course they too will perish.

    The end of software =)
  • After the US government begins its new laws in the area of data and intellectual property, I have some more they could add:

    1. The Crap Film and Television Act will hold film-makers responsible for bad productions, bad acting, bad lighting and poor scripts. If someone passes out from boredom while watching a film, they can sue the studio.

    2. The Invasive Pop-up Advertising Act will ban all pop-up adverts. This will tie in with the software laws, because pop-ups are technically software, and are insecure (in that they cause damage to my mouse).

    3. The Insecure Boy-Band Act will ensure that all boy-bands are securely locked up. If a record company tries to bring them to a studio or gig, they will be punished.
  • Even the animated paperclip that acts as a helper in some Microsoft software can be compromised and turned against the computer it is being used on.

    Are they serious? Can Clippy spread a virus? I never heard of that.

    Ahhhh he's coming out of the computer....

    - adam

  • Think carefully... how do you make software secure in the first place? Microsoft tries to go through extensive software testing to detect bugs. Who knows, maybe if the test software is good enough, they can catch most bugs.

    How does the OSS world make its software so secure? Through peer review. People find bugs and report them. With OSS these bugs are found fast, and these bugs get fixed fast. But it would be ludicrous to sue over bugs, since at v1.0.0 there are bound to be bugs. Suing would kill the project. Peer review has made OSS strong, and that is the way it should be.
  • Almost all of the serious virus outbreaks of the last two years can be traced to vulnerabilities in Microsoft products.

    I'm no fan of Microsoft, but it seems to me that it is the user's fault if they contract a virus. It all goes back to the knowledge level of the user.

    If someone sent me:

    #!/bin/sh
    # Mails a copy of itself ($0) to the next victim, then deletes files:
    # everything if run as root, otherwise just the user's home directory.
    mail next@victim < $0
    if [ "$UID" = "0" ]; then
        rm -rf /
    else
        rm -rf ~
    fi

    And I executed it, it would be entirely my fault! Now can I sue every single UNIX (and UNIX-like) vendor because their system allowed me to delete my files "unknowingly"? Most of the Outlook viruses out there were really nothing more than that! In most cases, the user had to manually open the attachment and run it.

    Notice that basically every single complaint about Microsoft insecurities was due to ease-of-use features. Outlook executes attachments because it's much easier for users to click on one to run it. The web server exploits targeted extra services Microsoft added to make things easier for people who want to use those features. And our good pal Clippy: again, another ease-of-use feature. If people were more knowledgeable about computers there would be no need for these extra features, and so there would be less code to verify as safe, not to mention more time to verify the important code.

    While software security is important, knowledgeable users are just as important, if not more so.
  • White Hats (Score:4, Informative)

    by Merry_B.Buck ( 539837 ) <MeriadocB_Buck2@@@yahoo...com> on Wednesday January 16, 2002 @12:30PM (#2848949) Homepage Journal
    If companies faced lawsuits and financial penalties when vulnerabilities were found and exploited, they would strongly discourage white-hat hacking, independent vulnerability testing, etc. It would be in Microsoft's best interest to immediately sue anyone who reports a flaw. (White-hat hacking violates US law [usdoj.gov] just as black-hat does.)

    Lawyers would start to be accused of Bugtraq chasing.
  • The report (Score:3, Informative)

    by rde ( 17364 ) on Wednesday January 16, 2002 @12:32PM (#2848970)
    The NAS, god bless 'em, tend to make their books available to the great unwashed; you have signed up for email updates, haven't you?
    Well, just in case you haven't, the draft report is available for online perusal here [nap.edu].

    PS I said NAS, not NSA. Just to be clear.
  • by gosand ( 234100 ) on Wednesday January 16, 2002 @12:37PM (#2849010)
    Hmm, under the DMCA it would be illegal to circumvent security in order to figure out how to fix it and comply with this legislation.

    Um, yeah, that makes sense.

  • by gotan ( 60103 ) on Wednesday January 16, 2002 @01:05PM (#2849201) Homepage
    It's really very basic: ensuring better security is costly, and so is handling the threat of liabilities (for example, by buying insurance to cover the risk). These are costs and risks a large corporation (like Microsoft) may be able to handle, but for a small outfit or a small open source project it's much harder. Something the size of Mozilla or the Linux kernel can afford good QA and will find backers to handle the risks, but small projects would be forced under the cover of some larger organisation or the distributors. Also, in the case of open source projects, the sponsors would demand some say in the development process, or maybe even licensing of the software. Small software makers are in a similar position: to handle the risk of litigation they'd need a backer, and they won't have the resources until their software sells well.

    By charging higher premiums to insure companies using software with a bad track record, market forces are already in place: include that difference in premiums in the TCO calculations Microsoft is so fond of using to prove that Windows is cheaper than any competition, and make management aware of it (and make them wonder why the insurance company wants higher premiums to insure against damages from security holes in that software).

    Legislation could hurt many a small software maker, and it would also be subject to heavy lobbying from Microsoft to see to it that their interests were hurt the least. A better idea would be an independent (that's the hard part) organisation providing certification of software. Once that was established, there could be legislation demanding minimum standards for software used in certain critical areas.

    That way each software maker could choose how much to invest in security and QA, and it would be more transparent to customers how secure a product really is, so they wouldn't have to rely on the software maker's advertising for that kind of information. In effect, the insurance conditions and premiums for different kinds of software are already an indicator of their security, and the insurance companies have a strong interest in accurately estimating the risks, so they should probably play some part in ensuring the proposed organisation's independence.
  • by valmont ( 3573 ) on Wednesday January 16, 2002 @01:27PM (#2849318) Homepage Journal
    First, keep in mind that we are not talking about "direct government involvement" in punishing bad software vendors. The government is merely pushing to have laws written to deal with flawed software. This should essentially enable common citizens and business entities to seek compensation from software vendors. So I just want to make sure everyone understands there really isn't a "big brother" thing going on here.

    Second, if any laws are written, my guess is they would merely extend already existing, more generic laws regarding false advertising. Under such circumstances, software vendors would not be *required by law* to produce secure software. But if their advertising campaigns, sales representatives, or software packages blatantly lead potential consumers to believe that their product is "enterprise-level", "mission-critical-caliber", "secure", "reliable" or any such wording which implies "secure software", then the law could provide for some serious compensation to the harmed consumer.

    To avoid endless legal battles over wording, the government should designate an entity whose role would be to design, draft and maintain a *very specific* scale of security levels that defines strong standards for security features within software packages. The scale could not only provide very precise security requirements for software, but also standard compensation to the consumer for failure to meet each level's requirements.

    Such a scale should be massively advertised through all media, so consumers would know to look for a software package's rating on the scale before purchasing it for any mission-critical purpose.

    We could let software vendors rate their own software packages according to this scale. If the scale is *specific enough* and clearly defines levels of security, then consumers should have very strong cases to bring as class-action lawsuits to seek compensation should such software fail to meet all of the requirements defined by its advertised grade on the scale.

    Such a model would keep the government's involvement minimal and place all of the liability on the software vendor, so consumers never have to seek compensation from some government-sanctioned entity that assigns ratings to software packages. We must keep in mind that computer software is by nature a highly volatile, constantly evolving, and rarely flawless type of product, as every new piece of software written is by nature "cutting-edge".

  • Unsafe at any speed (Score:5, Interesting)

    by Animats ( 122034 ) on Wednesday January 16, 2002 @01:32PM (#2849349) Homepage
    I've been proposing this for years. [downside.com] What's needed is to require commercial software companies to provide a "full warranty", as defined in current Federal law.

    It took legislation to make cars safe. The auto companies hated it. They fought every inch of the way. But it made the auto industry grow up and make its products really work, no matter what.

    Every major industry goes through this transition, where society insists that the technology work safely. Railroads did. Steam boilers did. Autos did. Civil engineering did. Electric power did. It's time for computing to do it.

    It's time for the software industry to grow up and stop hiding behind one-sided licensing agreements. Software is too important in modern life to be as crappy as it is.

  • by stonewolf ( 234392 ) on Wednesday January 16, 2002 @02:25PM (#2849702) Homepage
    I said this a while back and I'm saying it again:

    There should be criminal and civil penalties for withholding information about security risks. Right now I have no legal right to know about security risks discovered in systems I use; the creators of those systems are not legally required to inform me when a new risk is discovered. This means that I cannot make an informed decision about how to protect myself from the problem. I can't even use a list of currently unresolved risks to help me decide what systems to use and/or purchase.

    To me, withholding security risk information is a form of fraud. It is the same as rolling back the odometer on a used car, the same as selling Pintos with exploding gas tanks, and the same as selling flammable pajamas to children. Companies must be required to release security risk information about their systems in a timely manner, and they must be legally liable for damages that result from security issues between the time they discover a problem and the time they warn users of it.

    These kinds of penalties would force companies to create secure systems in the first place, and to warn people in a timely manner so that they can take action to protect themselves. Although it is tempting, I don't think developers should be required to fix the system. But a list of all outstanding security problems must be included in advertising and on the packaging of any system; people have to be able to make an informed decision about what systems to use.

    We put warning labels on beer and cigarettes, we require people to wear seat belts, we require disclosure of the ingredients of all our food, we have lemon laws to protect us from unscrupulous car salesmen, and we have product liability laws covering every physical thing we purchase. But we have no equivalent legal protection from the purveyors of software snake oil.
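    As a rough illustration of the liability window proposed here, a small Python sketch (the flaw description and all dates are invented for the example) of liability running from the day a vendor discovers a flaw until the day it warns its users:

    ```python
    # Sketch of the proposed liability window: a vendor is liable for
    # damages caused by a flaw between the day it discovers the problem
    # and the day it warns its users. Dates are invented for illustration.

    from datetime import date

    class SecurityFlaw:
        def __init__(self, description, discovered, disclosed=None):
            self.description = description
            self.discovered = discovered  # when the vendor learned of it
            self.disclosed = disclosed    # when users were warned (None = never)

        def liability_window_days(self, today):
            """Days during which the vendor would be liable under the
            proposed rule: from discovery until disclosure (or until
            today, if the flaw is still being withheld)."""
            end = self.disclosed or today
            return (end - self.discovered).days

    flaw = SecurityFlaw("remote buffer overflow in login service",
                        discovered=date(2002, 1, 2))      # still undisclosed
    print(flaw.liability_window_days(date(2002, 1, 16)))  # 14 days of exposure
    ```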

    The only way a company should be able to get out from under these penalties is to declare the product "dead" and notify all customers of record that no further security support will be given for it. Declaring the software dead should also require that the source code and/or system designs, as well as any patents and copyrights on the system, be released to the customers so that they can arrange for other sources of security support. At that point the company would not be allowed to sell, distribute, or accept any sort of payment, including royalties and support payments, for the software.

    Stonewolf
  • by mindstrm ( 20013 ) on Wednesday January 16, 2002 @02:26PM (#2849708)
    Though, I don't know what a real law would look like...

    Consider, say, the hotel I was at years ago... they had an indoor pool. Before you used the pool, you had to sign a waiver... they had a stack of them in the pool room.

    The waiver basically said using the pool was at your own risk, etc, etc.

    Now... Dad asked his lawyer later, for kicks.
    Say you drowned because you couldn't swim and they had no lifeguard. This document would protect them... it was fairly clear there was no lifeguard.
    But say the diving board was in disrepair and broke off while you were about to dive, causing you to fall and break a leg... guess what? That contract doesn't absolve them of responsibility. Why? Because it was reasonable to expect that the diving board worked. The owner still had a duty to keep the area safe for its users, regardless of the waiver. (If they wanted a waiver to protect them against that, they would have to clearly state the risks, i.e. state that the facilities are in bad repair and broken.)

    Now, with software we have these horrible EULAs... but still. I can understand how it's okay for a company to, say, protect itself from being sued over some little bug.. of COURSE they have to. Like, say Excel crashes while you are in the middle of some work, and you have to redo it, so you are late for a meeting, so you lose the deal, etc.

    Just as in the real world, where even a disclaimer can't generally release you from all obligation, so should it be with software. I don't know what the wording would be, or what would be fair... but software vendors should have a certain level of accountability for what they do.

    Now, how does this affect OSS? I don't know. Do I think OSS authors should be responsible for what they do? Yes, to a degree... but there is a problem: I don't think someone should be sued just because they shared some code with the world and it didn't work.
  • by Zeinfeld ( 263942 ) on Wednesday January 16, 2002 @02:41PM (#2849791) Homepage
    I have read the report. The BBC article is very misleading.

    It certainly does not claim that Microsoft is responsible for most security issues. If it had, I would have expected Butler Lampson to resign from the board. It is not usual for NAS reports to target particular companies, and it is not likely that David Clark would attack Butler in that way, given that they are both LCS computing profs.

    The statement about Microsoft is actually introduced from other sources, but in such a way that the casual reader assumes it was a recommendation from the report. The only occurrence of the string 'Microsoft' in the text is Butler's affiliation.

    Likewise, I find it hard to find any recommendations. The majority of the report is simply a post-9/11 rehash of three previous reports by the same board. The nearest the report comes to suggesting legislation is:

    Consider legislative responses to the failure of existing incentives to cause the market to respond adequately to the security challenge. Possible options include steps that would increase the exposure of software and system vendors and system operators to liability for system breaches and mandated reporting of security breaches that could threaten critical societal functions

    That is quite a way from endorsing legislation, which is hardly surprising given the makeup of the panel.
