
Can Software Kill?
mykepredko writes "eWEEK has an interesting, if somewhat long, article titled Can Software Kill? The article focuses on a programming error that resulted in 28 Panamanian cancer patients receiving lethal overdoses of radiation, many times the expected dose. The article briefly mentions, but doesn't go into detail on, the 1991 Patriot missile failure that resulted in the deaths of 28 American service men and women."
Sure it can kill. (Score:5, Funny)
Can Software Kill?
Certainly. A complete set of Novell manuals dropped from 40 stories up packs the same kinetic energy as a 10-car freight train moving at 80 km/h.
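Obligatory back-of-the-envelope check (a rough sketch; the ~20 kg manual set and 30-tonne freight cars are my own guesses, not from the post):

```python
# Sanity-check the falling-manuals claim with basic kinetics.
g = 9.81                 # m/s^2
height = 40 * 3.0        # 40 stories at ~3 m each
manuals_kg = 20.0        # guess: a full Novell doc set

ke_manuals = manuals_kg * g * height      # mgh, joules at impact
train_kg = 10 * 30_000.0                  # 10 loaded freight cars (guess)
v_train = 80 / 3.6                        # 80 km/h in m/s
ke_train = 0.5 * train_kg * v_train**2    # (1/2)mv^2

print(f"manuals: {ke_manuals/1e3:.1f} kJ, train: {ke_train/1e6:.1f} MJ")
```

By these numbers the manuals come in around 24 kJ against roughly 74 MJ for the train, so the comparison is off by a few orders of magnitude. Still lethal to whoever is standing underneath, though.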
Re:Sure it can kill. (Score:5, Funny)
Re:Sure it can kill. (Score:2, Funny)
Re:Sure it can kill. (Score:5, Funny)
Re:Sure it can kill. (Score:2, Funny)
Re:Sure it can kill. (Score:2, Informative)
Re:Sure it can kill. (Score:5, Funny)
Re:Sure it can kill. (Score:4, Funny)
For example, on the moon there is no terminal velocity as there is no atmosphere.
Hence, when teenagers go to the moon (one day) there will doubtless be fatalities due to "Hey, it's low gravity! I can jump off this mountain and just float down! Hey, watch this!" *splat*
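For the morbidly curious, the free-fall arithmetic (a sketch; the 200 m cliff is an arbitrary example):

```python
import math

g_moon = 1.62      # m/s^2, lunar surface gravity
height = 200.0     # m, example cliff

# No atmosphere means no drag and no terminal velocity.
v_impact = math.sqrt(2 * g_moon * height)
print(f"impact speed: {v_impact:.1f} m/s ({v_impact * 3.6:.0f} km/h)")
```

About 25 m/s, or roughly 90 km/h, from a 200 m cliff. Low gravity just means the *splat* takes longer to arrive.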
Re:Sure it can kill. (Score:5, Informative)
It's a serious topic, even more so since the over-radiation shit in Panama happened so recently.
The infamous Therac-25 incidents [vt.edu] happened between 1985 and 1987 and should be required reading... too bad the three Panamanian medical physicists cited in the article hadn't paid attention to it.
Re:Sure it can kill. (Score:3, Interesting)
Software that kills... (Score:5, Funny)
Re:Software that kills... (Score:5, Funny)
Re:Software that kills... (Score:3, Funny)
8 bits = 256 combinations
Perhaps you were talking about bits per set of nibbles
Oh, it was a joke.
Re:Software that kills... (Score:2)
Re:Software that kills... (Score:2)
Re:Software that kills... (Score:2)
Is this what you meant? (Score:5, Funny)
-- author unknown
Re:Software that kills... (Score:5, Informative)
Re:Software that kills... (Score:5, Funny)
answer to subject: (Score:4, Funny)
Next story, please. Does it look like I have work to do?
Re:answer to subject: (Score:3, Funny)
You clearly haven't seen my coding, have you?
Re:answer to subject: (Score:5, Funny)
Re:answer to subject: (Score:3, Funny)
Why 28 deaths? (Score:4, Funny)
Lethal Weapon (Score:4, Funny)
Re:Lethal Weapon (Score:5, Insightful)
Of course! (Score:5, Funny)
Re:Of course! (Score:3)
By the way, is this proof that a *nix OS can kill a cat?
Since it should have 9 - 9 = 0 lives left...
Wow, that one was bad
Re:Of course! (Score:3, Funny)
On an only somewhat unrelated note, my alarm clock is "cat `slocate '*.au'` > /dev/dsp", which I cancel by stumbling out of bed every morning while the thing screeches and generally makes a racket and typing "killall cat".
For some reason I have murderous urges whenever I see a cat..
EULA's (Score:5, Interesting)
Re:EULA's (Score:5, Insightful)
If the organization that's being entrusted with people's lives cheaps out and uses software in environments it's not rated for, there's no way the manufacturer should be held liable. It's no different than tires on cars. If you're ripping around at 150 mph on non-Z-rated tires and one blows, it's your own damned fault, not the manufacturer's.
Re: (Score:3)
Re:EULA's (Score:3, Interesting)
Re:EULA's (Score:3, Informative)
When people's lives are at stake, I do expect the designers to take well-known failure modes, such as the Blue Screen, into consideration, just as I expect of everyone else who designs systems with high potential for catastrophic failure. If I desig
Re:EULA's (Score:3, Interesting)
No, they use an existing OS [www.qnx.ca] that is designed for such mission-critical applications (medical, cars, space, etc.). I don't like letting M$ (or others) off the hook for quality software, but clearly when people's lives are at risk the equipment designers need to choose the right software for the job, same as hardware.
Having said
Re:EULA's (Score:3, Interesting)
You clueless cretin. (Score:3, Interesting)
IIRC, Microsoft's license has had, since day zero, a clause to the effect that you are not legally allowed to use their software to control nuclear reactors, medical devices, avionics, or any other application that could endanger human life. THERE'S A REASON THAT'S THERE!
If you DO have such an application, the software vendor: 1) takes much greater care in design, implementation, and testing, and 2) carries a godawful ginormous insurance rider to cover any such failure.
There is a segment of the industr
Re:You clueless cretin. (Score:4, Interesting)
Re:You clueless cretin. (Score:5, Informative)
Re:You clueless cretin. (Score:4, Insightful)
Car manufacturer. This vehicle is intended only to operate within the bounds of the law and shall be considered out of warranty if operated outside those bounds. - Not a car made would still be under warranty after a week.
Airplane manufacturer. This airplane is intended to be flown in by those who choose to accept said risk. - No defect could be held against the manufacturer.
Pharmaceutical company. This drug is intended only to give an increased chance of success to the patient. All risk and responsibility is the patient's to accept, and the manufacturer cannot be held responsible. - It wouldn't matter if the study was done by baboons instead of on baboons; the drug company would get a walk.
It's a case of accountability, and companies' attempts to use an EULA to get out of accountability. If this precedent stands unabated, we will soon have EULAs on everything from TVs to cars, with no manufacturer ever able to be held accountable for defects. That's what I have problems with.
Re:EULA's (Score:2)
While the government tends to keep an eye on the automotive industry, the computer industry seems to be able to operate with virtual impunity.
Why doe
We Need Software *Engineers* (Score:5, Interesting)
* Yes, I know there are a couple of schools out there that offer SoftEng degrees, but until industry distinguishes them from computer scientists and requires the engineering designation for key positions, they are meaningless.
Re:EULA's (Score:5, Interesting)
My TV comes with a warranty, but it says they won't be liable for any damage caused by the use of the TV.
I bought a bucket of concrete paint a week ago. It's guaranteed not to fail, but that guarantee doesn't cover the cost to remove/strip/repair the damage caused by bad paint (thousands), just 20 bucks for a new can of paint.
In court you'd have to prove negligence or deliberate behavior. You'd have to prove Sony designed the TV to electrocute you, etc. The fact they get it UL listed is enough to get past that.
For software you'd have to show that they deliberately put the flaws in, or knew about the flaws and didn't care (depraved indifference).
But I'm no lawyer, so who knows. Everyone can go fucking sue everyone else.
All I know is, if Dr Pib puts a family member on an untested, unproven life-support system and it fails, I'm suing the doctor.
So if I use an electric hairdryer in the shower... (Score:3, Interesting)
Come to think of it, the power company never warned me that all that electricity they sell me could be dangerous... it was here when I got the house. Maybe it's in the fine print on my electric bill, but I think most of that is about how they can track me down to collect money from me if I don't pay...
Seriously - I think it depends on the software. If you use c
Re:EULA's (Score:3, Informative)
That said, the software development standards that are required under the FDA essentially enforce standard software lifecycle practices that people should be adhering to anyway, with the exception that the accountabil
Yes (Score:5, Insightful)
insert open source plug here
Re:Yes (Score:2, Interesting)
Software? no - humans, yes. (Score:4, Insightful)
Software will only kill people through bad programming.
It is humans that make the underlying mistakes.
Re:Software? no - humans, yes. (Score:4, Funny)
-- John.
Re:Software? no - humans, yes. (Score:3, Interesting)
No, software cannot kill anyone. Only machines controlled by software can kill people.
Now, how to handle the legality or morality of said observation is beyond my level of interest at this time. However, I hope this clarifies things.
At the very least, these things confirm my general posit that "Computers should not be allowed to control things that move."
Tonight on Fox... (Score:5, Funny)
WHEN SOFTWARE ATTACKS!
with host Mitch Pileggi
Well.. (Score:2)
Short of the software making an intelligent decision to kill, it always comes back to us.
Therac-25 (Score:5, Informative)
Obviously Reading It Does Nothing (Score:3, Insightful)
Big scary warnings don't affect software quality. I think we should consider whether legal liability will.
Blame the makers (Score:2, Insightful)
of course it will (Score:2, Interesting)
With this happening, I think people wouldn't have any issue arguing that programmers need to be accountable for their programs. So if they should be held responsible, should people who program other things, like operating systems, be accountable for flaws too?
Re:of course it will (Score:5, Insightful)
But if I'm a computer software engineer and have a bug in a program that gives 3 people an overdose, then it will be noticed and much howling will be done over it. Even if the total number of errors has gone down, the type of error is new and there is a common factor between all the cases. And so we will complain.
And, I think, rightly. Computers are a tool, not to be trusted, always to be checked. I fear many people believe the computer can never be wrong (because it is so complex as to be indistinguishable from magic, and magic is never wrong) - perhaps this is why there isn't much howling about Diebold voting machines: it's digital, so it must be better!
software does not kill... (Score:5, Insightful)
Software cannot kill ... (Score:5, Insightful)
well, yeah, but so can not having software (Score:3, Interesting)
The Therac-25 actually injured a fair number of people in the US 10-15 years ago. Yeah, software fucks up sometimes. It's old news. For the article, see:
Nancy G. Leveson and Clark S. Turner, "An Investigation of the Therac-25 Accidents," IEEE Computer 26(7), July 1993, pp. 18-41.
Two words... (Score:3, Interesting)
The software is only one piece of the puzzle, of course. Its killing was enabled by the hubris of its developers and the blind trust of its users.
Old Story... (Score:2)
In any introductory Software Engineering course, they will discuss this as one of the first topics.
Humans make mistakes, and mistakes can cause an OS to crash, a program to seg fault, or a few people to die. We, like our software, are fallible.
Can software kill? (Score:4, Interesting)
An autopilot that is consistently 1000 feet off, a poorly written control routine for an MRI, miscalibrated antilock brakes... any of these can certainly cause death.
But ultimately, it comes back to whoever wrote it. Or specced it. Or tested it.
Software by itself is benign.
Human implementation of it may be lacking, though.
A dumb question, yet a good one (Score:5, Interesting)
Software is no different from hardware in this respect. If it is handling mission-critical or potentially lethal equipment... great care should be taken to ensure its integrity.
Trusting those who make your irradiation software is no different from trusting those who made your life-support hardware.
Human error, or mechanical, can mean death in both cases. If the error is glaring, it becomes a case of negligence.
Unfortunately, in the case of software (or even computer hardware), the operating environment becomes an often-overlooked factor. Stress tests are needed... data collisions checked for, line noise, redundancy, etc. When we're talking about people's lives, that extra parity bit can be just as important as a backup parachute...
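In that spirit, a minimal sketch of an even-parity check (illustrative only; real life-critical links use stronger codes like CRCs or ECC):

```python
def parity_bit(data: bytes) -> int:
    """Even-parity bit: 1 if the data has an odd number of set bits."""
    ones = sum(bin(b).count("1") for b in data)
    return ones % 2

def verify(data: bytes, stored_parity: int) -> bool:
    """Detects any single-bit corruption in the payload."""
    return parity_bit(data) == stored_parity

payload = b"DOSE=2.00Gy"                               # made-up payload
p = parity_bit(payload)
corrupted = bytes([payload[0] ^ 0x01]) + payload[1:]   # flip one bit
print(verify(payload, p), verify(corrupted, p))        # True False
```

A single parity bit only catches odd numbers of flipped bits, which is exactly why systems with lives on the line layer on redundancy instead of relying on it alone.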
Modern Verification (Score:3, Informative)
Patriot missile -- really a "failure" (Score:4, Insightful)
So, wouldn't the Patriot missile failure be understandable, given that it was being used outside its original design? If the Patriot had really been intended and designed as a missile killer, then yes, it would be a "failure" for not living up to its original spec.
Re:Patriot missile -- really a "failure" (Score:5, Informative)
Re:Patriot missile -- really a "failure" (Score:4, Interesting)
Actually, if you check the link in the article, it explains all about the Patriot failure. It was a "range gate error" caused by clock drift. The Patriot was designed as a mobile anti-aircraft SAM and, as such, was never designed to run for more than a few hours at a time. The one at Dhahran had been running for over 100 hours. It was the Israelis who first noticed the clock drift problem, and they reported it to Raytheon.

The problem was caused by programmers who would "round off" the clock increment before storing it in order to save a couple bytes of memory. This rounding error was inconsequential so long as the system was rebooted every few hours (which a mobile SAM on the move would do), but it could easily grow to cause huge errors if the computers were left running continuously, as they were on SCUD intercept duty.

Raytheon's solution was to send out a warning followed by a patch to fix the error. Unfortunately, in classic Raytheon bumbling style [slashdot.org], the warning was that "'very long run times' could affect the targeting accuracy", with no indication of what "very long" was or how much it would affect accuracy. The Alpha battery at Dhahran only ran so long because the Bravo battery was having radar trouble and Alpha was picking up the slack. The operators had no idea the range gate tracking was off by 600+ meters; otherwise they'd obviously have rebooted to fix it.
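For the curious, the widely cited arithmetic behind that drift reproduces in a few lines (a sketch using the figures from the GAO report: a 0.1 s clock tick truncated to 24 fractional bits, and a Scud at roughly 1,676 m/s):

```python
# Reproduce the well-known Patriot clock-drift numbers.
FRACTION_BITS = 24
stored_tenth = int(0.1 * 2**FRACTION_BITS) / 2**FRACTION_BITS
error_per_tick = 0.1 - stored_tenth        # ~9.5e-8 s lost per 0.1 s tick

hours_up = 100                             # uptime of the Dhahran battery
ticks = hours_up * 3600 * 10               # one tick every tenth of a second
drift = ticks * error_per_tick             # ~0.34 s of accumulated error

scud_speed = 1676.0                        # m/s, approximate Scud velocity
print(f"drift after {hours_up} h: {drift:.4f} s")
print(f"range-gate error: {drift * scud_speed:.0f} m")   # ~575 m
```

Roughly 0.34 seconds of drift, about 575 meters at Scud speed: more than enough to put the incoming missile outside the gate the radar was scanning.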
Set Phasers on Stun (Score:5, Informative)
Obligatory MS joke (Score:2)
It can only be attributed to human error (Score:5, Funny)
So are you saying they INTENDED to kill their patients and this software just did it more efficiently? ;P
Answer = NO.. (Score:4, Insightful)
RISKS Digest... (Score:5, Informative)
There was a good discussion of this event [ncl.ac.uk] some months ago; the current issue [ncl.ac.uk] has blurbs on topics ranging from computer viruses to aircraft cockpit management.
ethics & liability (Score:4, Interesting)
DISCLAIMER: IF YOUR CAR'S BRAKES FAIL, IT'S NOT OUR FAULT. TOUGH LUCK!
DISCLAIMER: IF THIS MEDICINE KILLS YOU, OH WELL... NOT OUR FAULT!
etc.
Some laws must be passed and software makers must be held accountable; they should no longer be able to hide under the big umbrella of the disclaimer.
Yes. It can. (Score:5, Informative)
Sadly, this is nothing new [umn.edu].
Every software developer needs to read Peter Neumann's book Computer-Related Risks [sri.com], and keep up with the RISKS digest (comp.risks) [ncl.ac.uk].
Learning from others' mistakes is much less painful.
Ever had to deal with Tom Christiansen? (Score:2)
No, software can't kill... (Score:2, Insightful)
issue already discussed in old article (Score:3, Informative)
What if it is supposed to kill? (Score:2)
There is even software in use in the cell phones Iraqi insurgents use to trigger roadside bombs.
Haven't heard of anyone dying by ingesting software yet, but those tests are probably still in the stage where they are being conducted on mice first.
More FUD by Darl (Score:2, Funny)
Obligatory Russian Statement (Score:2)
It can kill in the movies, it must be true! (Score:2, Funny)
HAL: I'm sorry, Dave. I'm afraid I can't do that.
Sure it can (Score:5, Insightful)
This is why I've always thought it's vitally important to have good, precise specifications in place and excellent quality assurance for any life-critical application. It's even better with many eyes overseeing every step of the process -- wait... that smacks of open source, doesn't it?
If you ask me -- and you haven't, but I'll tell you anyway -- what would be the best way to prevent catastrophe, it would be to PREVENT CHANGES TO THE SPEC. In college, our software engineering prof. gave us an assignment, then halfway through, she changed the spec on us. Well, not surprisingly, there wasn't a single project that worked faultlessly, and many of us were doing really well before that.
Software itself doesn't kill people. Bad software written by overworked developers to a constantly-changing specification with not nearly enough QA does. That is, people inadvertently -- we hope -- kill people with software. Yeah, yeah, it's a cliche, but it works.
Sure software can kill. (Score:3, Insightful)
It's called progress, people. The more power we gain, the better we can kill ourselves, as the 20th century showed. Which goes to prove -- with great power comes great responsibility. (HHOS)
I also have to point out the problem of luser errors. A lawyer friend of mine represents a corporation that is being sued because somebody lost an arm in one of their industrial machines. The machine is set up so you have to keep pushing the button to make it run. That way, if you want to go fiddle with it, it can't be running while you do and therefore take off your arm. But what are you supposed to do when the guy tells another guy to help him out and press the button while he puts his fork in the toaster? Did the machine tear off that guy's arm? No, his stupidity did.
Can software kill? Yes, but... (Score:2)
Of course software that tells your garbage disposal to turn on at the wrong time would also be bad.
I guess microwave software has a hardware interlock to deal with, so I can safely piss the software off and clean my microwave at the same time ("just stick my head in here to see if the roof is clean..." bRRRRRRrrwttTTTTTT!!!!!!!!!).
Ultimately, I guess it boils down to how many grandmas have tripped over a roomba down
Bad, sensationalized article title. (Score:4, Funny)
Software does not kill. Bad engineering and poor implementation kills. My copy of Windows XP, while still radiating pure evil, has not managed to pop open the gun cabinet.
You might as well ask the question, 'can the old saddlebag gas tanks on Ford Rangers kill? Gasp!'
Umm.... Cruise Missiles? (Score:4, Insightful)
This is why I quit (Score:5, Insightful)
One of my customers wanted to convert a database, and originally I thought, no problem just convert some tables and redraw some forms.
It turns out this database was also going to store information about blood matching and transplants, and it would also calculate daily drug doses for the nurse to sign off on for kids getting marrow transplants. Success is measured in how many months the kid gets to live.
If I were working on a team using a more robust platform, I might have had more confidence to push forward. However, this is Microsoft Access and I'm the only guy who would know how this thing works. This means it would be very easy for some kid's death to point toward me.
So I quit.
By the way, if anyone has work for a database developer, feel free to contact me at will_spangler@juno.com. I'm quite good with MS Access.
Re:This is why I quit (Score:5, Insightful)
As it is, you left the thing to be built by someone else. On an insecure system. Possibly with worse skills than you.
Sometimes the developer has to push back against management's wishes. You might have won, and at worst you'd be no worse off than you are now.
Yes software kills, but there is an upside too (Score:3, Insightful)
Warfare is not about certainty; it's about playing the odds.
A very similar case could be constructed for them poor fried cancer patients. SQL databases that manage breast/cervical screening programs save thousands of people from cancer each year.
Worry About This Every Day (Score:5, Interesting)
I currently work for a small Electronic Medical Records company. At some level, I worry about potentially killing someone every day. In fact, our bug-tracking tool has a special category called "Patient Safety," which is the highest-priority bug. We deal with things most of you probably wouldn't think of, such as a tool for writing prescriptions, which, given that many drugs interact (potentially fatally), has to catch such cases and alert the physician. I also deal with lab results, which, if reported incorrectly, could lead to a potentially fatal decision by the doctor, and so forth.
Consultants and pundits like to say that computer control reduces the chances of human error and failure; this is said, IMO, to comfort the masses. To state the obvious, I suspect EVERYONE on Slashdot knows that in reality that statement is not true; the human error has just been moved to a different point in the chain. A tired programmer is just as likely to make a mistake as a tired machinery operator. The difference is that the software might be used by 5,000 machines, whereas that operator runs 1.
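A minimal sketch of the kind of interaction gate described above (the drug pairs are illustrative placeholders, NOT clinical data; a real system queries a curated interaction database):

```python
# Hypothetical drug-interaction check for a prescription-writing tool.
DANGEROUS_PAIRS = {
    frozenset({"drug_a", "drug_b"}),   # placeholder fatal interaction
    frozenset({"drug_a", "drug_c"}),
}

def interaction_alerts(current_meds, new_rx):
    """Return the current meds that conflict with a new prescription."""
    return [med for med in current_meds
            if frozenset({med, new_rx}) in DANGEROUS_PAIRS]

conflicts = interaction_alerts(["drug_b", "drug_d"], "drug_a")
if conflicts:
    # Fail loudly: the physician must see and acknowledge this.
    print(f"ALERT: interaction between drug_a and {', '.join(conflicts)}")
```

The design point is that the check blocks the workflow until acknowledged; a warning that can be silently scrolled past is barely better than no warning at all.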
Many modern warfare weapons use software (Score:5, Insightful)
Medical software (Score:3, Insightful)
The more insidious "errors" if you want to call them that are ones that are errors of design and process, and not execution. If a piece of software is designed with certain assumptions in mind, and something happens outside of the parameters of those assumptions, the software will appear to be working correctly when in fact there may be egregious errors. There are a lot of instances of this in everyday practice.
Lastly, what we run across is that clinicians are used to a world of paper, where everything is obviously either there or not. You know when there's a problem, and there is transparency to the error, so you can factor that into your decision-making. In a clinical system, the transparency is not there, and a subtle flaw can mislead someone making a clinical decision into making a poor one.
Of course, the above are all gross generalities, as is any discussion of errors in complex systems, but I hope you get the idea.
Sort of old news: comp.risks subject 20 years now (Score:3, Troll)
http://catless.ncl.ac.uk/Risks/
can lack of software kill? (Score:3, Interesting)
Again, from the article:
Every example given was life-threatening, yet the author clearly wants you to draw the conclusion that a software author should hesitate to publish a program she wrote to perform a calculation because someone *who thought they knew what they were doing* might plug it into a lethal machine. Next we will be hearing about how someone wrote a spreadsheet in gnumeric to calculate the radiation dose, killed someone because of a bug in gnumeric, and how the authors of gnumeric should be ashamed of themselves, and not the asshole who *thought he knew what he was doing.*
Special Superior Deputy Prosecutor Cristobal Arboleda, unlike the author of the article, is accusing the right people and doing his job well.
negligence kills (Score:3, Insightful)
Software does kill (Score:3, Informative)
A software glitch caused the crash of the first F-22 prototype (no one died, fortunately), and as someone else pointed out, the Patriot missile failure in the first Gulf War (software wasn't ENTIRELY to blame there; the bug was known, and the folks in the field were given instructions on how to avoid it, but didn't follow them).
The trick is: who do you hold responsible? The software person who has no training in mission-critical software and who's working 80-hour weeks to meet the deadline the idiot managers are shoving down vis throat?
After 10 years in the industry, I'm FINALLY starting to see movement toward software creation as a serious engineering discipline. Schools are starting to offer programs in Software Engineering, the ACM and IEEE have agreed on an official code of conduct (though IMHO it still has SERIOUS problems), and most importantly, companies are starting to listen to their technical folks when they say "You can't do that!"
Liability is just another incentive to head down that road. We need to be sure we pin the liability on the right folks.
Can bad engineering kill? (Score:3, Insightful)
SCADA software certainly can... (Score:4, Interesting)
Killer Software (Score:3, Interesting)
For every hour we spend making sure the system does what it's supposed to do, we spend eight hours making sure it doesn't do what it's not supposed to do.
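That ratio rings true. As a toy illustration of "making sure it doesn't do what it's not supposed to" (a hypothetical dose-limit guard, not from any real system):

```python
# Hypothetical guard: refuse out-of-range radiation doses outright.
MAX_DOSE_GY = 3.0   # made-up ceiling, for illustration only

def validated_dose(requested_gy: float) -> float:
    """Return the dose only if it passes every sanity check."""
    if not (0.0 < requested_gy <= MAX_DOSE_GY):
        raise ValueError(f"dose {requested_gy} Gy outside safe range")
    return requested_gy

# The one "does what it should" check:
assert validated_dose(2.0) == 2.0

# The many "doesn't do what it shouldn't" checks:
for bad in (0.0, -1.0, 3.1, 100.0, float("inf"), float("nan")):
    try:
        validated_dose(bad)
    except ValueError:
        pass  # expected: the guard refused the bad input
    else:
        raise AssertionError(f"guard accepted bad dose {bad!r}")
```

One line of happy path, a whole loop of refusal cases: most of the code, and most of the testing, lives on the refusal side.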