Can Software Kill?

mykepredko writes "eWeek has an interesting, if somewhat long, article titled Can Software Kill? The article focuses on a programming error that resulted in 28 Panamanian cancer patients receiving many times the expected dose of radiation. The article briefly mentions, but doesn't go into detail on, the 1991 Patriot missile failure that resulted in the deaths of 28 American service men and women."


  • by grub ( 11606 ) <slashdot@grub.net> on Tuesday March 09, 2004 @04:53PM (#8513308) Homepage Journal

    Can Software Kill?

    Certainly. A complete set of Novell manuals dropped from 40 stories up packs the same kinetic energy as a 10 car freight train moving at 80 km/h.
  • by bmorton ( 170477 ) on Tuesday March 09, 2004 @04:53PM (#8513310)
    Apparently it can only kill people in groups of 28.
  • by Anonymous Coward on Tuesday March 09, 2004 @04:53PM (#8513315)
    No.

    Next story please, does it look like I have work to do?
  • by _xs_ ( 14098 ) on Tuesday March 09, 2004 @04:53PM (#8513316) Homepage
    Is 28 deaths the level at which we get concerned?
  • by AtariAmarok ( 451306 ) on Tuesday March 09, 2004 @04:54PM (#8513318)
    If software is outlawed, only outlaws will have software.
  • Of course! (Score:5, Funny)

    by zuikaku ( 740617 ) on Tuesday March 09, 2004 @04:54PM (#8513326)
    One must be very careful when you kill -9!
    • One must be very careful when you kill -9!

      By the way, is this proof that a *nix OS can kill a cat?
      Since it should have 9 - 9 = 0 lives left...

      Wow, that one was bad :-)
      • by iantri ( 687643 )
        Or if you really hate cats, "killall cat".

        On an only somewhat unrelated note, my alarm clock is "cat `slocate *.au` /dev/dsp", which I cancel by stumbling out of bed every morning while the thing screeches and generally makes a racket and typing "killall cat".

        For some reason I have murderous urges whenever I see a cat..

  • EULA's (Score:5, Interesting)

    by onyxruby ( 118189 ) * <{ten.tsacmoc} {ta} {yburxyno}> on Tuesday March 09, 2004 @04:54PM (#8513327)
    If a software maker is found negligent and convicted of manslaughter (unintentionally causing death) due to buggy software, would that void the whole EULA business, since they all claim they can't be held responsible? Or would the burden pass on to the poor chap that used it, for being irresponsible enough to use something whose maker couldn't be held accountable? Let's face it: why are only software companies able to make themselves free from accountability when every other industry has to design for it?
    • Re:EULA's (Score:5, Insightful)

      by Unknown Relic ( 544714 ) on Tuesday March 09, 2004 @05:05PM (#8513501) Homepage
      I'm not positive, but aren't most of these types of disclaimers saying something along the lines of "We do not give permission for this software to be used in environments where failure could result in loss of life. In the event of such unauthorized use, we will not warranty the product, nor be held accountable for any damages it may cause"? If this is the case, then I have no problem with it: they are saying the software isn't good enough to use in such a situation, and if you do so, you're on your own. Anything that's mission-critical to a degree where lives depend on it should be licensed with that in mind (which I imagine software for nuclear power plants, etc. is).

      If the organization that's being entrusted with people's lives cheaps out and uses software in environments it's not rated for, there's no way the manufacturer should be held liable. It's no different than tires on cars. If you're ripping around at 150mph on non-Z-rated tires, and one blows, it's your own damned fault, not that of the manufacturer.
      • You wrote:

        "If you're ripping around at 150mph on non-Z-rated tires, and one blows, it's your own damned fault, not that of the manufacturer."

        The problem with your analogy is that tire manufacturers are required to specify the maximum speed rating of their tires between N (87 mph) and Y (186 mph). Note: Z is now effectively a dead designation.

        The DOT/NHTSA does not allow Firestone to put a disclaimer on their tires saying that they can't be used when the failure could cause injury or death. And neither
      • Re:EULA's (Score:3, Interesting)

        by ooby ( 729259 )
        It's not software, it's hardware. Intel specifically states that its x86 processors should not be used in mission critical systems such as air traffic control systems.
    • You clueless cretin. (Score:3, Interesting)

      by Thud457 ( 234763 )
      RTFE!

      IIRC, Microsoft's license has had, since day zero, a clause to the effect that you are not legally allowed to use their software to control nuclear reactors, medical devices, avionics, or any other application that could endanger human life. THERE'S A REASON THAT'S THERE!

      If you DO have such an application, the software vendor: 1) takes much greater care in design, implementation and testing, 2) carries a godawful ginormous insurance rider to cover any such failure.

      There is a segment of the industr

      • by Neil Watson ( 60859 ) on Tuesday March 09, 2004 @05:14PM (#8513647) Homepage
        I seem to recall reading somewhere that many of the systems on board some US Navy ships run Windows NT. Also, there was an article in Wired last year about software used by the US military in Iraq, which was mostly Windows. Both of these situations could endanger human life.
        • by canajin56 ( 660655 ) on Tuesday March 09, 2004 @05:32PM (#8513882)
          The Davis-Besse nuclear reactor in Ohio was running its safety monitoring systems on an NT server. It got infected by Slammer and crashed. Fortunately, the system had an analog backup, and the reactor had already been offline all year, after inspectors discovered a 6" hole through the cement in the reactor head, which left the core exposed.
      • by onyxruby ( 118189 ) * <{ten.tsacmoc} {ta} {yburxyno}> on Tuesday March 09, 2004 @05:15PM (#8513664)
        My point is more about the idea that a EULA can absolve a company of all accountability. Let's look at this in other ways to help illustrate my point.

        Car manufacturer: This vehicle is intended only to operate within the bounds of the law and shall be considered out of warranty if operated outside those bounds. - Not a car made would still be under warranty after a week.

        Airplane manufacturer: This airplane is intended to be flown in by those who choose to accept said risk. - No defect could be held against the manufacturer.

        Pharmaceutical company: This drug is intended only to give an increased chance of success to the patient. All risk and responsibility is the patient's to accept and the manufacturer cannot be held responsible. - It wouldn't matter if the study was done by baboons instead of on baboons; the drug company would get a walk.

        It's a case of accountability, and of companies' attempts to use an EULA to get out of it. If this precedent stands unabated, we will soon have EULAs on everything from TVs to cars, with no manufacturer ever able to be held accountable for defects. That's what I have problems with.
    • Consider the automobile. The government regulates the safety equipment in your car. Airbags, side impact beams, 5MPH bumpers, seatbelts, and standard crash tests all help protect you and your passengers while in your car. Do you think most of these systems would exist without government legislation? Consider that airbag technology has existed since the 1970s.

      While the government tends to keep an eye on the automotive industry, the computer industry seems to be able to operate with virtual impunity.

      Why doe
    • by Vagary ( 21383 ) <jawarren@gmailDEGAS.com minus painter> on Tuesday March 09, 2004 @05:11PM (#8513591) Journal
      The problem is that in every other development environment, the legal liability ultimately rests on the engineer who signed off on the quality assurance. But because software developers are not professionals and have no professional code of conduct, their signatures are meaningless. The only way software can become as reliable as other engineered products is to create the profession of software engineering*. And I'm not just talking about giving CompSci students a ring: many CompSci curriculums don't require any engineering techniques at all, and those that do usually devote less time to engineering than they do to sorting algorithms. The software industry requires fundamental changes, and legal liability is at most the catalyst.

      * Yes, I know there are a couple of schools out there that offer SoftEng degrees, but until industry distinguishes them from CompScists and requires the engineering designation for key positions they are meaningless.
    • Re:EULA's (Score:5, Interesting)

      by stratjakt ( 596332 ) on Tuesday March 09, 2004 @05:13PM (#8513628) Journal
      What other manufacturer would be held accountable?

      My TV comes with a warranty, but it says they won't be liable for any damage caused by the use of the TV.

      I bought a bucket of concrete paint a week ago. It's guaranteed not to fail, but that guarantee doesn't cover the cost to remove/strip/repair the damage caused by bad paint (thousands), just 20 bucks for a new can of paint.

      In court you'd have to prove negligence or deliberate behavior. You'd have to prove Sony designed the TV to electrocute you, etc.. The fact they get it UL listed is enough to get past that.

      For software you'd have to show that they deliberately put the flaws in, or knew about the flaws and didn't care (depraved indifference).

      But I'm no lawyer so who knows.. Everyone can go fucking sue everyone else.

      All I know is if Dr Pib puts a family member on an untested, unproven life support system, and it fails, I'm suing the Doctor.
    • ...even though the "EULA" told me not to, is the hair dryer company responsible if they didn't design the grounding well enough to save me from myself?

      Come to think of it, the power company never warned me that all that electricity they sell me could be dangerous... it was here when I got the house. Maybe it's in the fine print on my electric bill, but I think most of that is about how they can track me down to collect money from me if I don't pay...

      Seriously - I think it depends on the software. If you use c

    • Re:EULA's (Score:3, Informative)

      by C10H14N2 ( 640033 )
      No. By law, the everything-and-the-kitchen-sink EULAs cannot be applied to any application that has to do with medical devices, air traffic control, military weaponry etc. Don't think that just because your word processor has a liability release that the same is true for all types of software.

      That said, the software development standards that are required under the FDA essentially enforce standard software lifecycle practices that people should be adhering to anyway, with the exception that the accountabil
  • Yes (Score:5, Insightful)

    by paranode ( 671698 ) on Tuesday March 09, 2004 @04:54PM (#8513329)
    Software can kill, just like any other stupid mistake left unchecked.

    insert open source plug here
    • Re:Yes (Score:2, Interesting)

      Would an open-sourced project have been more likely to lack these bugs? I mean, compared to proprietary software.
  • by smharr4 ( 709389 ) on Tuesday March 09, 2004 @04:55PM (#8513331)

    Software will only kill people through bad programming.

    It is humans that make the underlying mistakes.

  • by The I Shing ( 700142 ) * on Tuesday March 09, 2004 @04:55PM (#8513338) Journal
    Tonight on Fox...

    WHEN SOFTWARE ATTACKS!
    with host Mitch Pileggi
  • It's the human error factor.

    Short of the software making an intelligent decision to kill, it always comes back to us.
  • Therac-25 (Score:5, Informative)

    by addaon ( 41825 ) <addaon+slashdot.gmail@com> on Tuesday March 09, 2004 @04:56PM (#8513358)
    Anyone who hasn't read this [vt.edu] paper, should.
    • Every CompSci student I know (disclaimer: most of them are Canadian) learned about the Therac-25 in class. And I'd hope that every engineer building a software-controlled radiation machine would have at least heard of it. Yet clearly its publicity has done nothing to advance the state of software engineering, as this almost identical tragedy shows.

      Big scary warnings don't affect software quality. I think we should consider whether legal liability will.
  • Who makes software??? Blame it on the people who made the software. It's the same as saying guns don't kill, bullets do.
  • of course it will (Score:2, Interesting)

    by pvt_medic ( 715692 )
    As we see technology inch further into our society, especially in the medical field, we will see more incidents like this happening. I know many hospitals are moving to computer-based medication systems, records, etc. A programming error could easily kill someone.

    With this happening, I think people wouldn't have any issue arguing that programmers need to be accountable for their programs. So if they should be held responsible, should people who program other things, like operating systems, be accountable for flaws?
    • by Bombcar ( 16057 ) <racbmob@bom b c ar.com> on Tuesday March 09, 2004 @05:04PM (#8513496) Homepage Journal
      You see, if I'm a doctor, and I screw up and overdose you, it isn't a news item. I'll get reprimanded, maybe sued. No one will even notice if it happens many times, because each time it is a different doctor in a different circumstance.

      But if I'm a computer software engineer and have a bug in a program that gives 3 people an overdose, then it will be noticed and much howling will be done over it. Even if the total number of errors has gone down, the type of error is new and there is a common factor between all the cases. And so we will complain.

      And, I think, rightly. Computers are a tool, not to be trusted, always to be checked. I fear many people believe the computer can never be wrong (because it is so complex as to be indistinguishable from magic, and magic is never wrong) - perhaps this is why there isn't much howling about Diebold voting machines: It's digital, so it must be better!
  • by dummkopf ( 538393 ) on Tuesday March 09, 2004 @04:57PM (#8513368) Homepage
    ... dumb programmers kill!
  • by maxwell demon ( 590494 ) on Tuesday March 09, 2004 @04:57PM (#8513369) Journal
    ... but it can make the hardware controlled by it kill.
  • by surreal-maitland ( 711954 ) on Tuesday March 09, 2004 @04:57PM (#8513374) Journal
    would you trust a technician to adjust the settings for a radiotherapy machine?

    the therac-25 actually injured a fair number of people in the US 10-15 years ago. yeah, software fucks up sometimes. it's old news. for the article:

    Nancy G. Leveson and Clark S. Turner. An investigation of the Therac-25 accidents. Computer 26, 7 (July, 1993) pages 18-41.

  • Two words... (Score:3, Interesting)

    by El Destructo ( 592406 ) on Tuesday March 09, 2004 @04:58PM (#8513386)
    Therac-25. [onlineethics.org]

    The software is only one piece of the puzzle, of course. Its killing was enabled by the hubris of its developers and the blind trust of its users.

  • This is old news...

    In any introductory Software Engineering course, they will discuss this as one of the first topics.

    Humans make mistakes, and mistakes can cause an OS to crash, a program to seg fault, or a few people to die. We, like our software, are fallible.

  • Can software kill? (Score:4, Interesting)

    by YrWrstNtmr ( 564987 ) on Tuesday March 09, 2004 @04:58PM (#8513389)
    Not by itself, no.

    An autopilot that is consistently 1000 feet off, a poorly written control routine for an MRI, miscalibrated antilock brakes...can certainly cause death.

    But ultimately, it comes back to whoever wrote it. Or specced it. Or tested it.

    Software by itself is benign.
    Human implementation of it may be lacking, though.
  • by phorm ( 591458 ) on Tuesday March 09, 2004 @04:58PM (#8513393) Journal
    Can negligence in any area kill? Yes.
    Software is no different from hardware in this aspect. If it is handling mission-critical or potentially-lethal equipment... great care should be taken to ensure its integrity.

    Trusting those that make your irradiation software is no different from trusting those that made your life-support hardware.

    Human error, or mechanical, can mean death in both cases. If the error is glaring, it becomes a case of negligence.

    Unfortunately, in cases of software or even computer hardware, the operating environment becomes an often-overlooked factor. Stress tests are needed; data collisions, line noise, redundancy, etc. must be checked for. When we're talking about people's lives, that extra parity bit can be just as important as a backup parachute...
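To make the parity-bit remark concrete, here is a minimal sketch in Python (the message is invented for illustration, not taken from any real system): a single flipped bit changes the parity and is caught, while an even number of flips cancels out, which is why life-critical links layer stronger checks (CRCs, redundancy) on top of bare parity.

```python
def even_parity_bit(data: bytes) -> int:
    """Return the even-parity bit: 1 if the count of set bits is odd,
    so that appending the bit makes the total even."""
    ones = sum(bin(b).count("1") for b in data)
    return ones & 1

def verify(data: bytes, parity: int) -> bool:
    # Any single-bit error changes the parity and is detected;
    # an even number of flipped bits slips through undetected.
    return even_parity_bit(data) == parity

msg = b"dose=180"
p = even_parity_bit(msg)
print(verify(msg, p))                          # intact message passes
corrupted = bytes([msg[0] ^ 0x01]) + msg[1:]   # flip one bit in transit
print(verify(corrupted, p))                    # single-bit error is caught
```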
    • Modern Verification (Score:3, Informative)

      by krysith ( 648105 )
      In modern systems, where IMRT (intensity modulated radiation therapy) is used, the medical physicist in charge is supposed to verify each field delivered. This tests both the treatment planning software, as well as the accelerator, collimator leaves, etc. Often this is done using film, however, time saving electronic devices (basically diode arrays) are used by those who can afford them. Of course, the verification system has its own software, which requires verification also. Luckily the verification s
  • by Ryu2 ( 89645 ) on Tuesday March 09, 2004 @04:59PM (#8513397) Homepage Journal
    IIRC, the Patriot missile was never really designed or intended as an anti-missile missile, but as an anti-aircraft missile (i.e., for a target much lower and slower). It was only pressed into service killing Scuds because there was nothing better available.

    So, wouldn't the Patriot missile failure be understandable, given that it was used outside its original design? If the Patriot had really been intended and designed as a missile killer, then yes, it should count as a "failure" because it didn't live up to its original spec.
    • by irokitt ( 663593 ) <archimandrites-iaur.yahoo@com> on Tuesday March 09, 2004 @05:04PM (#8513485)
      The problem was actually one of training and clueless operators. IIRC, the coordinates of the missile launcher had to be updated several times a day. The technicians went several days without doing so. A Scud flew into the area the Patriot was supposed to be protecting, but the system was so confused as to where it was that it thought the Scud was another battery's responsibility and did nothing. The Scud crashed into an area with Coalition troops and killed 28, the largest death toll due to a single action in Desert Storm.
      • by Dun Malg ( 230075 ) on Tuesday March 09, 2004 @06:17PM (#8514485) Homepage
        The problem was actually one of training and clueless operators. IIRC, the coordinates of the missile launcher had to be updated several times a day. The technicians went several days without doing so. A Scud flew into the area the Patriot was supposed to be protecting, but the system was so confused as to where it was that it thought the Scud was another battery's responsibility and did nothing. The Scud crashed into an area with Coalition troops and killed 28, the largest death toll due to a single action in Desert Storm.

        Actually, if you check the link in the article, it explains all about the Patriot failure. It was a "range gate error" caused by clock drift. The Patriot was designed as a mobile anti-aircraft SAM and, as such, was never designed to run for more than a few hours at a time. The one at Dhahran had been running for over 100 hours. It was the Israelis who first noticed the clock drift problem, and they reported it to Raytheon. The problem was caused by programmers who would "round off" the clock increment before storing it in order to save a couple bytes of memory. This rounding error was inconsequential so long as the system was rebooted every few hours (which a mobile SAM on the move would do), but it could easily grow to cause huge errors if the computers were left running continuously, as they were on Scud intercept duty. Raytheon's solution was to send out a warning followed by a patch to fix the error. Unfortunately, in classic Raytheon bumbling style [slashdot.org], the warning said only that "'very long run times' could affect the targeting accuracy", with no indication of what "very long" was, or how much it would affect accuracy. The Alpha battery at Dhahran only ran so long because the Bravo battery was having radar trouble and Alpha was picking up the slack. The operators had no idea the range gate tracking was off by 600+ meters, otherwise they'd obviously have rebooted to fix it.
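The arithmetic behind the drift is easy to reproduce. Per the published analyses of the Dhahran incident (the 1992 GAO report, and Skeel's note on it), time was kept as an integer count of tenths of seconds and converted by multiplying by 0.1 held in a 24-bit fixed-point register; chopping 0.1 to that precision loses about 0.000000095 s per count. A rough sketch, where the Scud speed is the commonly quoted approximation and the exact range-gate shift in the published accounts varies from roughly 500 to 700 m depending on how it is computed:

```python
# 0.1 has no exact binary representation; the register effectively kept
# 23 fractional bits of it, so the stored value is slightly too small.
stored_tenth = int(0.1 * 2**23) / 2**23
error_per_tick = 0.1 - stored_tenth          # ~9.5e-8 s lost per 0.1 s count

hours_up = 100                               # uptime of the Alpha battery
ticks = hours_up * 3600 * 10                 # tenths of a second elapsed
clock_error = ticks * error_per_tick         # ~0.34 s of accumulated drift

scud_speed = 1676                            # m/s, roughly Mach 5
range_gate_shift = clock_error * scud_speed  # hundreds of meters off target
print(f"{clock_error:.2f} s drift -> {range_gate_shift:.0f} m error")
```

Note the failure mode the post describes: the per-tick error is tiny and harmless after a reboot, and only becomes lethal because it accumulates linearly with uptime.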

  • Set Phasers on Stun (Score:5, Informative)

    by jhines0042 ( 184217 ) on Tuesday March 09, 2004 @04:59PM (#8513399) Journal
    A good book that tells how technology can cause death, destruction, and mayhem is "Set Phasers on Stun". It includes the Therac radiation machine accidents, nuclear accidents, and many other odd stories.
  • I've been defenestrated [reference.com] by XP several times today, and I'm still here ;-)
  • by Trolling4Dollars ( 627073 ) on Tuesday March 09, 2004 @05:00PM (#8513423) Journal
    The article focuses on a programming error that resulted in 28 Panamanian cancer patients receiving many times an expected lethal dose of radiation.

    So are you saying they INTENDED to kill their patients and this software just did it more efficiently? ;P

  • Answer = NO.. (Score:4, Insightful)

    by msimm ( 580077 ) on Tuesday March 09, 2004 @05:00PM (#8513428) Homepage
    Bad programming can; it's just like "guns don't kill, people do." An engineer makes mathematical mistakes designing a bridge and the bridge later collapses; do bridges kill? It seems like a redundant question. Mistakes we make sometimes cost people's lives; why would software be any different?
  • RISKS Digest... (Score:5, Informative)

    by Dr. Zowie ( 109983 ) <slashdot@def[ ]st.org ['ore' in gap]> on Tuesday March 09, 2004 @05:00PM (#8513430)
    ... is a forum that talks specifically about this kind of stuff. Being moderated the old-fashioned way, with a benevolently autocratic editor, it has much higher quality posts than the /. average.


    There was a good discussion of this event [ncl.ac.uk] some months ago; the current issue [ncl.ac.uk] has blurbs on topics ranging from computer viruses to aircraft cockpit management.

  • ethics & liability (Score:4, Interesting)

    by v_1_r_u_5 ( 462399 ) on Tuesday March 09, 2004 @05:01PM (#8513436)
    There must be a point where software makers can no longer say "DISCLAIMER: IF WE BREAK YOUR MACHINE, IT'S NOT OUR FAULT." If you look at every piece of software's license, you'll see a clause like that. Imagine if every industry took that approach:

    DISCLAIMER: IF YOUR CAR'S BRAKES FAIL, IT'S NOT OUR FAULT. TOUGH LUCK!

    DISCLAIMER: IF THIS MEDICINE KILLS YOU, OH WELL... NOT OUR FAULT!

    etc.

    Some laws must be passed and software makers must be held accountable- they should no longer be able to hide under the big umbrella of the disclaimer.
  • Yes. It can. (Score:5, Informative)

    by Mr. Slippery ( 47854 ) <tms@infamous.nGAUSSet minus math_god> on Tuesday March 09, 2004 @05:01PM (#8513442) Homepage

    Sadly, this is nothing new [umn.edu].

    Every software developer needs to read Peter Neumann's book Computer-Related Risks [sri.com], and keep up with the Risks digest (comp.risks) [ncl.ac.uk].

    Learning from others' mistakes is much less painful.

  • That's enough to make one want to kill.
  • ... but it can instruct hardware to do so.
  • by bad-badtz-maru ( 119524 ) on Tuesday March 09, 2004 @05:04PM (#8513489) Homepage
    These issues were already mentioned a year ago in the slashdot article Debug your Code, or Else! [slashdot.org].
  • Of course, software can be used to kill. What do you think guides cruise missiles, intercontinental ballistic missiles, HARMs, Stingers, Hellfires, etc.? Any type of guided weapon is steered by software.

    There is even software at use in the cell phones Iraqi insurgents use to trigger roadside bombs.

    Haven't heard of anyone dying by ingesting software yet, but those tests are probably still in the stage where they are being conducted on mice first.
  • I suppose that Darl McBride is trying to further his claims that his enemies are out to kill him, eh?
  • In Soviet Russia, software kills YOU!
  • Dave: Open the pod bay doors, HAL.
    HAL: I'm sorry Dave, I'm afraid I can't do that.
  • Sure it can (Score:5, Insightful)

    by aduzik ( 705453 ) on Tuesday March 09, 2004 @05:06PM (#8513536) Homepage
    Software is an engineered thing, just like any other tool upon which we rely. Think about airplanes, which occasionally have mechanical failures in flight. Think about Columbia, which burned up because of engineering defects. So, if the software is flawed, it will eventually cause damage. Sometimes it's benign -- restarting Word isn't such a big deal -- but sometimes it's catastrophic.

    This is why I've always thought it's vitally important to have good, precise specifications in place and excellent quality assurance for any life-critical application. It's even better with many eyes overseeing every step of the process -- wait... that smacks of open source, doesn't it?

    If you ask me -- and you haven't, but I'll tell you anyway -- what would be the best way to prevent catastrophe, it would be to PREVENT CHANGES TO THE SPEC. In college, our software engineering prof. gave us an assignment, then halfway through, she changed the spec on us. Well, not surprisingly, there wasn't a single project that worked faultlessly, and many of us were doing really well before that.

    Software itself doesn't kill people. Bad software written by overworked developers, working to a constantly-changing specification with not nearly enough QA, does. That is, people inadvertently -- we hope -- kill people with software. Yeah yeah, it's a cliché, but it works.

  • by LeoDV ( 653216 ) on Tuesday March 09, 2004 @05:06PM (#8513538) Journal
    Steam engines have blown up and killed people. The first power machines have killed people. People have died in coal mines. In cars, in airplanes. I'm pretty confident when we first came up with fire there were a few mishaps.

    It's called progress, people. The more power we gain, the better we can kill ourselves, as the 20th century showed. Which goes to prove -- with great power comes great responsibility. (HHOS)

    I also have to point out the problems with luser errors. A lawyer friend of mine represents a corporation that is being sued because somebody lost an arm in one of their industrial machines. The machine is set up so you have to keep pushing the button to make it run. That way, if you want to go fiddle with it, it can't be running while you do it and therefore take off your arm. But what are you supposed to do when the guy tells another guy to help him out and press the button while he puts his fork in the toaster? Did the power machine tear off that guy's arm? No, his stupidity did.
  • ...mainly when the software is controlling a pair of 5 ton robotic (human crunching) legs.

    Of course software that tells your garbage disposal to turn on at the wrong time would also be bad.

    I guess microwave software has a hardware interlock to deal with, so I can safely piss the software off and clean my microwave at the same time ("just stick my head in here to see if the roof is clean..." bRRRRRRrrwttTTTTTT!!!!!!!!!).

    Ultimately, I guess it boils down to how many grandmas have tripped over a roomba down
  • by matastas ( 547484 ) on Tuesday March 09, 2004 @05:07PM (#8513555)
    Good job with the Terminator images in everyone's heads.

    Software does not kill. Bad engineering and poor implementation kills. My copy of Windows XP, while still radiating pure evil, has not managed to pop open the gun cabinet.

    You might as well ask the question, 'can the old saddlebag gas tanks on Ford Rangers kill? Gasp!'
  • by RockClimbingFool ( 692426 ) on Tuesday March 09, 2004 @05:16PM (#8513674)
    Last time I checked, we don't have a bunch of kamikaze pilots for our Tomahawk cruise missiles. We make software to intentionally kill people all the time.
  • This is why I quit (Score:5, Insightful)

    by willpost ( 449227 ) on Tuesday March 09, 2004 @05:18PM (#8513699)
    I was working for a desktop consulting company, and I was the only database developer there.

    One of my customers wanted to convert a database, and originally I thought, no problem just convert some tables and redraw some forms.

    It turns out this database was also going to store information about blood matching, transplants, and it would also calculate daily drug doses for the nurse to sign off on for kids getting marrow transplants. Success is measured in how many months the kid gets to live.

    If I were working on a team using a more robust platform I might have had more confidence to push forward. However, this is Microsoft Access and I'm the only guy who would know how this thing works. This means it would be very easy for some kid's death to point towards me.

    So I quit.

    By the way, if anyone has work for a database developer, feel free to contact me at will_spangler@juno.com. I'm quite good with MS Access.
    • by YrWrstNtmr ( 564987 ) on Tuesday March 09, 2004 @05:52PM (#8514166)
      What you should have done is point out the failings in their current system, i.e. Access. Point them towards a more robust solution that will actually work for their needs. Then build it, and charge through the nose for it.

      As it is, you left the thing to be built by someone else. On an insecure system. Possibly with worse skills than you.

      Sometimes the developer has to push back against management's wishes. You might have won; at worst, you'd be no worse off than you are now.
  • by EmbeddedJanitor ( 597831 ) on Tuesday March 09, 2004 @05:18PM (#8513713)
    Sure, 28 service-folk got killed by a screwup in the Patriot missiles. I bet, though, that thousands of others had their lives saved by big and little electronic gadgets (radar, rescue beacons, GPS, DVD players, two-way radio) that all have software in them.

    Warfare is not about certainty, it's about playing the odds.

    A very similar case could be constructed for them poor fried cancer patients. SQL databases that manage breast/cervical screening programs save thousands of people from cancer each year.

  • by Chokai ( 10224 ) on Tuesday March 09, 2004 @05:22PM (#8513755)
    The next time you visit the doctor, watch the workflow of the office staff. Chances are they will be entering your medical information -- and I mean the clinical stuff, not just your address -- into some type of computer system.

    I currently work for a small Electronic Medical Records company. At some level I worry about potentially killing someone every day. In fact, our bug tracking tool has a special category called "Patient Safety", which is the highest-priority bug. We deal with things most of you probably wouldn't think of, such as a tool for writing prescriptions, which, given the fact that many drugs interact (potentially fatally), has to catch and alert the physician to such cases. I also deal with lab results, which, if reported incorrectly, could lead to a potentially fatal decision by the doctor, and so forth.

    Consultants and pundits like to say that computer control reduces the chances of human error and failure; this is said, IMO, to comfort the masses. To state the obvious, I suspect EVERYONE on Slashdot knows that in reality that statement is not true: the human error has just been moved to a different point in the chain. A tired programmer is just as likely to make a mistake as a tired machinery operator. The difference is that that software might be used by 5,000 machines, whereas that operator runs 1.
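    The interaction-checking logic described above can be sketched in a few lines. The drug names and severity table here are hypothetical stand-ins for the curated clinical databases a real EMR system would query:

```python
# Minimal sketch of a prescription interaction alert. The interaction
# table below is a hypothetical stand-in for a curated clinical database.

# Each entry maps an unordered pair of drugs to a severity level.
INTERACTIONS = {
    frozenset(["warfarin", "aspirin"]): "major",             # bleeding risk
    frozenset(["lisinopril", "spironolactone"]): "moderate", # hyperkalemia
}

def interaction_alerts(current_meds, new_drug):
    """Return (existing_drug, severity) pairs the physician must see."""
    alerts = []
    for med in current_meds:
        severity = INTERACTIONS.get(frozenset([med, new_drug]))
        if severity is not None:
            alerts.append((med, severity))
    return alerts

print(interaction_alerts(["warfarin", "metformin"], "aspirin"))
# → [('warfarin', 'major')]
```

    The fail-safe choice of data structure matters here: an unordered `frozenset` key means the check cannot silently miss an interaction because the drugs were looked up in the "wrong" order.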
  • by Kegetys ( 659066 ) on Tuesday March 09, 2004 @05:25PM (#8513783) Homepage
    If that Patriot missile failure counts as a "software kill," then surely software does kill; look at the number of people killed in Iraq, for example, by different types of bombs and cruise missiles that are guided (and detonated) by software.
  • Medical software (Score:3, Insightful)

    by drmike0099 ( 625308 ) on Tuesday March 09, 2004 @05:42PM (#8514008)
    Most people in the comments are focusing on actual bugs and crashes in a system causing deaths. While that could certainly happen, those types of errors are more visible and actually much "better" errors to have than some other types. If the system crashes, it may have some immediate effects depending on its purpose, but if it acts through an actual user, a crash is generally harmless, though very annoying. An example of the difference: if the software designed to run a ventilator has a bug that causes it to crash, then since the machine is directly keeping a person alive, someone will probably die. On the other hand, a clinician relying on a system designed merely to provide information will be very aware when that system is down, and so is much less likely to make an error based on the outage.

    The more insidious "errors," if you want to call them that, are errors of design and process, not execution. If a piece of software is designed with certain assumptions in mind, and something happens outside the parameters of those assumptions, the software will appear to be working correctly when in fact there may be egregious errors. There are a lot of instances of this in everyday practice.

    Lastly, what we run across is that clinicians are used to a world of paper, where everything is obviously either there or not. You know when there's a problem, and there is transparency to the error, so you can factor it into your decision-making. In a clinical system, that transparency is not there, and a subtle flaw can mislead someone into making a poor clinical decision.

    Of course, the above are all gross generalities, as is any discussion of errors in complex systems, but I hope you get the idea.
  • by JGski ( 537049 ) on Tuesday March 09, 2004 @05:45PM (#8514054) Journal
    comp.risks has been discussing these issues on the net for 20 years now. It started with the fatal Therac cancer machine incidents, way back when. comp.risks also warned, starting 10-15 years ago, about just about every security eventuality that has hit Microsoft recently. I've been on this group since it started; I'm still surprised others don't know about it.

    http://catless.ncl.ac.uk/Risks/

  • by hopeless case ( 49791 ) <christopherlmarshall@@@gmail...com> on Tuesday March 09, 2004 @05:51PM (#8514162)
    From the article:

    The three Panamanian medical physicists who used the software to figure out just how much radiation to apply to patients are scheduled to be tried on May 18 in Panama City on charges of second-degree murder. Under Panamanian law, they may be held responsible for "introducing changes into the software" that led directly to the patients' deaths, according to Special Superior Deputy Prosecutor Cristobal Arboleda.

    I just love it when reporters try to pull a fast one. The people operating the machine *changed the software* because they *thought they knew what they were doing*. If they had opened up the machine and altered the control circuits, would the article be trying to discourage kids from having fun designing circuits and publishing the designs? "Can Circuits Kill?" would be the title, I suppose, and it would end with a cautionary note shaking its finger at radio amateurs.

    Again, from the article:

    This is not a cautionary tale for medical technicians, even though they can find themselves fighting to stay out of jail if they misunderstand or misuse technology. This also is not a tale of how human beings can be injured or worse by poorly designed or poorly explained software, although there are plenty of examples to make the point. This is a warning for any creator of computer programs: that software quality matters, that applications must be foolproof, and that -- whether embedded in the engine of a car, a robotic arm in a factory or a healing device in a hospital -- poorly deployed code can kill.

    Every example given was life-threatening, yet the author clearly wants you to draw the conclusion that a software author should hesitate to publish a program she wrote to perform a calculation because someone *who thought they knew what they were doing* might plug it into a lethal machine.

    Next we will be hearing about how someone wrote a spreadsheet in gnumeric to calculate the radiation dose, killed someone because of a bug in gnumeric, and how the authors of gnumeric should be ashamed of themselves, and not the asshole who *thought he knew what he was doing.*

    Special Superior Deputy Prosecutor Cristobal Arboleda, unlike the author of the article, is accusing the right people and doing his job well.

  • negligence kills (Score:3, Insightful)

    by kaltkalt ( 620110 ) on Tuesday March 09, 2004 @05:57PM (#8514224)
    People acting in a negligent (or intentional) way cause death, not inanimate objects like cars, guns, and software. Put the blame on the right factor. Poorly designed roads and poorly designed software can both end up causing human death, but the key is that these things were poorly designed by humans. Negligence: not acting as a reasonably prudent person would in similar circumstances. Don't blame the thing; blame the people who made/designed/controlled the thing. Come on, Slashdot is better than this.
  • Software does kill (Score:3, Informative)

    by DroidBiker ( 715045 ) on Tuesday March 09, 2004 @06:12PM (#8514427)
    Software has killed many people. Radiation machines under software control have killed people in the US and Canada, as well as in the incident the article mentions in Panama.

    A software glitch caused the crash of the first F-22 prototype (no one died, fortunately), and, as someone else pointed out, there was the Patriot missile failure in the first Gulf War. (Software wasn't ENTIRELY to blame there: the bug was known, and the folks in the field were given instructions on how to avoid it, but didn't follow them.)

    The trick is: who do you hold responsible? The software person who has no training in mission-critical software and who's working 80-hour weeks to meet the deadline the idiot managers are shoving down vis throat?

    After 10 years in the industry, I'm FINALLY starting to see movement towards treating software creation as a serious engineering discipline. Schools are starting to offer programs in Software Engineering, the ACM and IEEE have agreed on an official code of conduct (though IMHO it still has SERIOUS problems), and, most importantly, companies are starting to listen to their technical folks when they say "You can't do that!"

    Liability is just another incentive to head down that road. We need to be sure we pin the liability on the right folks.
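    For the record, the Patriot failure mentioned in this thread is well documented: the system clock counted tenths of a second in a 24-bit fixed-point register, and since 0.1 has no finite binary representation, the truncation error grew with uptime until the range gate drifted off the target. A rough reconstruction of the arithmetic, using the commonly cited public figures (the 23-fractional-bit chop, uptime, and Scud speed are assumptions from post-mortem accounts, not the actual flight code):

```python
from fractions import Fraction

# Widely cited reconstruction: 0.1 chopped (not rounded) to 23 fractional
# bits in the clock register. Figures are from public post-mortems.
FRACTION_BITS = 23
stored = Fraction(int(Fraction(1, 10) * 2**FRACTION_BITS), 2**FRACTION_BITS)
error_per_tick = Fraction(1, 10) - stored   # ~9.5e-8 s lost per 0.1 s tick

hours_up = 100                              # approximate uptime at Dhahran
ticks = hours_up * 3600 * 10                # the clock ticked 10 times a second
drift_seconds = float(error_per_tick * ticks)

scud_speed = 1676                           # m/s, approximate closing speed
print(f"clock drift after {hours_up} h: {drift_seconds:.4f} s")  # ~0.34 s
print(f"range-gate error: {drift_seconds * scud_speed:.0f} m")   # ~575 m
```

    A third of a second sounds harmless until it is multiplied by a target moving at Mach 5, which is why the field workaround was simply to reboot the battery regularly.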

  • by Chris Y Taylor ( 455585 ) on Tuesday March 09, 2004 @06:13PM (#8514438) Homepage
    Isn't that the same question?
  • by blueZ3 ( 744446 ) on Tuesday March 09, 2004 @06:35PM (#8514688) Homepage
    In a former life ( :-> ) I was employed by a large multinational that worked with utilities. Some of our software used SCADA protocols to remotely switch breakers; not household breakers, these switches control significant segments of the US power grid. All the software and documentation contained numerous warnings, because if a utility employee manually switched off a segment to make repairs and the switch was remotely turned back on, someone could be killed. There are numerous other software applications that control (potentially) deadly devices: robots, industrial equipment, etc. Failure of the software, or problems with operator headspace, creates a potential for death when working with almost any software that controls physical entities.
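    The safeguard such systems rely on reduces to one interlock rule: refuse a remote close while any field tag is active on the breaker. A toy sketch of that rule, with invented names (real SCADA masters implement this in formal lockout-tagout subsystems):

```python
# Toy lockout-tagout interlock: a remotely issued CLOSE is refused while
# any field tag is active on the breaker. All names here are invented.

class Breaker:
    def __init__(self, name):
        self.name = name
        self.closed = False
        self.tags = set()        # active lockout tags placed by field crews

    def place_tag(self, crew_id):
        self.tags.add(crew_id)

    def remove_tag(self, crew_id):
        self.tags.discard(crew_id)

    def remote_close(self):
        if self.tags:
            return False         # interlock: refuse while any crew holds a tag
        self.closed = True
        return True

b = Breaker("feeder-12")
b.place_tag("crew-7")
print(b.remote_close(), b.closed)   # refused while tagged: False False
b.remove_tag("crew-7")
print(b.remote_close(), b.closed)   # tag cleared: True True
```

    The point of the sketch is the ordering of the check: the command is rejected before any state changes, so a race between the field crew and the control room fails safe.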
  • Killer Software (Score:3, Interesting)

    by onkelonkel ( 560274 ) on Tuesday March 09, 2004 @06:37PM (#8514718)
    We write software for railroad traffic control and crossing warning systems. If it fails, we could end up with two trains trying to occupy the same piece of track at the same time (ref. Clapham Junction, 35 dead) or gates that stay up when the train comes. Testing is very formal and rigorous, and every step is documented.
    For every hour we spend making sure the system does what it's supposed to do, we spend eight hours making sure it doesn't do what it's not supposed to do.
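    The "doesn't do what it's not supposed to do" discipline described above is the essence of interlocking: a block of track is granted to at most one train, and a conflicting request is refused rather than assumed safe. A toy sketch only; real vital systems are developed and verified under formal railway safety standards, not written like this:

```python
# Toy track-block interlock: each block may be held by at most one train,
# and a conflicting request is refused (fail safe). Illustration only.

class Interlocking:
    def __init__(self):
        self.occupancy = {}      # block id -> train id currently holding it

    def request(self, train, block):
        """Grant movement authority into a block, or refuse it."""
        holder = self.occupancy.get(block)
        if holder is not None and holder != train:
            return False         # block already held by another train
        self.occupancy[block] = train
        return True

    def release(self, train, block):
        """A train releases a block only if it actually holds it."""
        if self.occupancy.get(block) == train:
            del self.occupancy[block]

ix = Interlocking()
print(ix.request("A", "block-3"))   # True: block was free
print(ix.request("B", "block-3"))   # False: already held by train A
```

    Note the asymmetry: granting authority requires proof the block is free, while a refusal needs no justification at all. That is the "eight hours on what it must not do" in code form.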

We can found no scientific discipline, nor a healthy profession on the technical mistakes of the Department of Defense and IBM. -- Edsger Dijkstra
