Robotic Cannon Loses Control, Kills 9

TJ_Phazerhacki writes "A new high-tech weapon system demonstrated one of the prime concerns surrounding ever-smarter methods of defense last week — an Oerlikon GDF-005 cannon went wildly out of control during a live-fire test exercise in South Africa, killing 9. Scarily enough, this is far from the first instance of a smart weapon 'turning' on its handlers. 'Electronics engineer and defence company CEO Richard Young says he can't believe the incident was purely a mechanical fault. He says his company, C2I2, in the mid 1990s, was involved in two air defence artillery upgrade programmes, dubbed Projects Catchy and Dart. During the shooting trials at Armscor's Alkantpan shooting range, "I personally saw a gun go out of control several times," Young says. "They made a temporary rig consisting of two steel poles on each side of the weapon, with a rope in between to keep the weapon from swinging. The weapon eventually knocked the pol[e]s down."' The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether."
This discussion has been archived. No new comments can be posted.

  • by dpbsmith ( 263124 ) on Thursday October 18, 2007 @07:43PM (#21033689) Homepage
    Three Laws of Robotics: [wikipedia.org]

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.

    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    "Asimov believed that his most enduring contributions would be his "Three Laws of Robotics" and the Foundation Series."Isaac Asimov [wikipedia.org] article in Wikipedia.

  • by riker1384 ( 735780 ) on Thursday October 18, 2007 @07:45PM (#21033709)
    Why didn't they have some provision to cut power to the weapon? If they were testing it in a place where there were people exposed in its possible field of fire (effectively "downrange"), they should have taken precautions.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday October 18, 2007 @07:45PM (#21033715)
    Comment removed based on user account deletion
  • No pun intended (Score:5, Insightful)

    by geekoid ( 135745 ) <dadinportland&yahoo,com> on Thursday October 18, 2007 @07:46PM (#21033737) Homepage Journal
    But shouldn't this thing have a kill switch? Seriously, my table saw has a kill switch.
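
    To make the kill-switch idea concrete, here is a minimal sketch (in Python, with entirely hypothetical names) of the kind of dead-man enable and latched emergency stop a fire-control loop could check before every shot. A fielded system would also need a hardware interlock that physically removes power from the drives; software alone is not enough.

        import time

        WATCHDOG_TIMEOUT = 0.5  # seconds without an operator "enable" before firing is blocked

        class GunController:
            def __init__(self):
                self.last_enable = 0.0
                self.estopped = False

            def operator_enable(self):
                """Called periodically while a human holds the dead-man switch."""
                self.last_enable = time.monotonic()

            def emergency_stop(self):
                """Latch the system into a safe state; only a deliberate reset re-arms it."""
                self.estopped = True

            def may_fire(self) -> bool:
                enable_fresh = (time.monotonic() - self.last_enable) < WATCHDOG_TIMEOUT
                return enable_fresh and not self.estopped

        controller = GunController()
        controller.operator_enable()
        print(controller.may_fire())   # True: the enable signal is fresh and no e-stop
        controller.emergency_stop()
        print(controller.may_fire())   # False: latched into the safe state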

  • Riiight (Score:4, Insightful)

    by Colin Smith ( 2679 ) on Thursday October 18, 2007 @07:47PM (#21033747)

    The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether.
    Because human beings are so good at shooting down low flying supersonic aircraft.

     
  • by HTH NE1 ( 675604 ) on Thursday October 18, 2007 @07:51PM (#21033807)
    From "Mostly Harmless" by Douglas N. Adams, Chapter 12:

    (It was, of course, as a result of the Great Ventilation and Telephone Riots of SrDt 3454, that all mechanical or electrical or quantum-mechanical or hydraulic or even wind, steam or piston-driven devices, are now required to have a certain legend emblazoned on them somewhere. It doesn't matter how small the object is, the designers of the object have got to find a way of squeezing the legend in somewhere, because it is their attention which is being drawn to it rather than necessarily that of the user's.

    The legend is this:

    "The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.")
  • by MrKaos ( 858439 ) on Thursday October 18, 2007 @07:59PM (#21033907) Journal
    seems a bit stoopid

    By the time the gun had emptied its twin 250-round auto-loader magazines, nine soldiers were dead and 11 injured.
    Was it necessary to fill both magazines in a test fire? Or, for that matter, in a live fire test, perhaps have some sort of abort system ready - even if it just cut the power to the control systems?

    Maybe fill the magazines on the 5th live fire test???

    Just sayin, ya know.

  • by FooAtWFU ( 699187 ) on Thursday October 18, 2007 @08:00PM (#21033925) Homepage
    Not the least of which is that, with current artificial intelligence, they're laughably unenforceable. In Asimov's books, you had this neat little "positronic brain" which was capable of resolving sensory inputs and determining things like "that is a human -->" (to say nothing of "I am harming it", especially through indirect causality). They were even capable of coming up with ideas to avoid the "through inaction" clauses.

    Really, the stories weren't about robots, they were about people just like us, with a certain set of "must-follow" rules. Modern AI does not resemble this in the slightest.
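
    A toy sketch makes the point concrete: writing down the priority ordering of the Three Laws is trivial, while every predicate that ordering depends on (recognizing a human, predicting harm, including harm through inaction) is exactly what current AI cannot do. Everything below is a hypothetical illustration in Python, not a real robotics API.

        def violates_first_law(action, world) -> bool:
            # Requires recognizing humans and predicting harm, including harm
            # caused indirectly or through inaction: the genuinely hard part.
            raise NotImplementedError("perception and consequence prediction")

        def violates_second_law(action, world) -> bool:
            raise NotImplementedError("understanding and weighing human orders")

        def permitted(action, world) -> bool:
            # The easy part: the precedence of the laws.
            if violates_first_law(action, world):
                return False
            if violates_second_law(action, world):
                return False
            return True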

  • I wonder (Score:2, Insightful)

    by redcaboodle ( 622288 ) on Thursday October 18, 2007 @08:02PM (#21033953)

    Why was an anti-aircraft gun able to hit ground targets at all?

    Shouldn't it be constructed so it can only fire overhead at a certain minimum elevation, so it cannot hit anything less than, let's say, a truck's height from the ground? Sure, that might not keep it from hitting targets on higher ground, but it would make the gun a lot safer for the firing crew and support troops around it. Even if it was tracking a legitimate target coming in, it might shoot right through its own crew if, say, it was put on a hill so the incoming target is approaching at 0 degrees elevation.
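
    As a rough illustration of the minimum-elevation interlock described above, the control software could clamp or reject any commanded elevation below a configured floor. The values and names below are made up for the sketch; real safe-arc limits would come from the site survey, and a software check is no substitute for mechanical stops.

        MIN_ELEVATION_DEG = 15.0   # hypothetical floor above the horizon

        def clamp_elevation(commanded_deg: float) -> float:
            """Never drive the barrels below the configured floor."""
            return max(commanded_deg, MIN_ELEVATION_DEG)

        def aim_allowed(commanded_deg: float) -> bool:
            """Alternative policy: refuse the command outright instead of clamping."""
            return commanded_deg >= MIN_ELEVATION_DEG

        print(clamp_elevation(2.0))   # 15.0: the mount is never pointed level with the crew
        print(aim_allowed(2.0))       # False
        print(aim_allowed(40.0))      # True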

  • by Anonymous Brave Guy ( 457657 ) on Thursday October 18, 2007 @08:03PM (#21033963)

    Sorry, I missed the end of that story. How did it turn out, again?

  • by geekoid ( 135745 ) <dadinportland&yahoo,com> on Thursday October 18, 2007 @08:10PM (#21034053) Homepage Journal
    A person can screw up, a computer can screw up the same way millions of times a minute.
  • by nuzak ( 959558 ) on Thursday October 18, 2007 @08:17PM (#21034151) Journal
    > I respect Asimov, but the three laws are pretty naive.

    All of the stories in I, Robot are about pointing out the flaws in the laws, actually. From what several bigger fans of Asimov than myself have told me, he wasn't really trying to make grand philosophical statements with them though; they were just story hooks he used for the purpose of spinning a good yarn.

    Interpreted seriously, the three laws are slavery.
  • by Anonymous Coward on Thursday October 18, 2007 @08:23PM (#21034219)
    1) Do not mistake literary fiction elements for real life.

    2) Do not mistake literary fiction elements for real life.

    3) Do not mistake literary fiction elements for real life.
  • Sad, isn't it? (Score:2, Insightful)

    by Kelson ( 129150 ) * on Thursday October 18, 2007 @08:28PM (#21034271) Homepage Journal
    As I read the headline, "Robotic Cannon Loses Control," I immediately thought of the droids in Robocop. I was all set to make a funny post, if someone hadn't already. Then I got to the end: "Kills 9." And suddenly it wasn't funny anymore.

    It's one thing to make jokes about things going wrong. It's another thing to make jokes about people dying. I'd like to think that the people who made those comments, or modded them up, only skimmed the headline and summary. But I can't quite convince myself.
  • by rossifer ( 581396 ) on Thursday October 18, 2007 @08:28PM (#21034277) Journal

    I respect Asimov, but the three laws are pretty naive.
    Well, sure.

    Asimov's three laws were meant to be a thought experiment in hubris and unintended consequences. They were sold (in the context of the stories) as the perfect control system for robots, and then there were always "problems" that the USR management couldn't understand and which Susan Calvin needed to figure out and fix.

    Asimov wasn't naive, but some of his characters were...

    Regards,
    Ross
  • by Kamokazi ( 1080091 ) on Thursday October 18, 2007 @08:36PM (#21034355)
    From here [itweb.co.za]:

    Young says he was also told at the time that the gun's original equipment manufacturer, Oerlikon, had warned that the GDF Mk V twin 35mm cannon system was not designed for fully automatic control. Yet the guns were automated. At the time, SA was still subject to an arms embargo and Oerlikon played no role in the upgrade.

    It may just be me, but automating a machine that fires explosives that isn't designed to be automated just sounds like a Bad Idea(TM).
  • by Anonymous Coward on Thursday October 18, 2007 @08:40PM (#21034401)
    The three laws might be relevant when artificial intelligence is sufficiently advanced that a robot can understand abstract ideas, like what constitutes a human being and what it means to cause them harm. Until then, they are irrelevant, because it is the human programmers who are responsible for deciding how the gun will respond to external stimuli, not the will of a 'robot'. This was not a malevolent machine attacking people, just a malfunctioning computer-controlled gun (gone wild!).
  • by Skreems ( 598317 ) on Thursday October 18, 2007 @08:47PM (#21034473) Homepage
    When you're talking about massive loss of life while testing armed robots that the military wants to turn loose on the world, sometimes humor is the only way to deal with reality.
  • by Detritus ( 11846 ) on Thursday October 18, 2007 @08:53PM (#21034559) Homepage
    If I stub my toe, it's a tragedy. If you get run over by a herd of elephants, it's funny.

    If you want really sick and twisted humor, try living in a war zone.

  • by Anonymous Coward on Thursday October 18, 2007 @09:00PM (#21034641)
    And the science gets done
    And you make a neat gun
    for the people who are still alive
  • by Anonymous Coward on Thursday October 18, 2007 @09:03PM (#21034669)
    > It may just be me, but automating a machine that fires explosives that isn't designed to be automated just sounds like a Bad Idea(TM).

    It's just you. On Slashdot, we call that "pretty fuckin' cool", on Makezine.com [makezine.com], they call it "neat, but don't try this at home", and at Survival Research Labs [srl.org], they call it "another Thursday at work".

  • by jlarocco ( 851450 ) on Thursday October 18, 2007 @09:33PM (#21034905) Homepage

    150,000 people die every day. That's almost 2 a second. I'm sure the family and friends of these 9 are heartbroken, but for the 6.5 billion people who don't know them, it's not all that remarkable.

    The only thing unique about these 9 people is that they died in a somewhat amusing way. If you want to mourn, mourn for the other 149,991 people who died today that you'll never hear about.

  • by Cheapy ( 809643 ) on Thursday October 18, 2007 @09:44PM (#21035007)
    I always find it hilarious that people post those "Laws" as if they were universal laws, like "1+1=2".

    They are a set of fictional laws made up by an author for his science fiction books. Are we seriously going to accept any and all Laws that appear in fiction?
  • by delong ( 125205 ) on Thursday October 18, 2007 @09:45PM (#21035027)
    Kind of like my response to Slashdotters objecting to an automated weapon designed to shoot down cruise missiles (which leave too little reaction time for human-controlled defenses to counter), a weapon that saves soldiers, airmen, and sailors from massive loss of life.
  • by Rich0 ( 548339 ) on Thursday October 18, 2007 @09:53PM (#21035095) Homepage
    Honestly, from reading the article it isn't clear that a software problem was even the cause of this disaster. It could have been some kind of mechanical gun jam.

    Any time you are dealing with big guns, fast motors, high-speed fire, large rounds, and explosive projectiles there is a risk of disaster if things go wrong. These things aren't toys. Even if the fire button was completely manual things could still go wrong.

    I recall reading an article about a magazine detonation in a battleship which went into all kinds of detail about all the things that could go wrong - and this was a fairly manual operation. It did involve lots of machinery (how else do you move around shells that weigh hundreds of pounds?), but it was all human operated.

    Assuming the system is well-designed the automation actually has great potential to LOWER risk. Humans make mistakes all the time. They're even more prone to making mistakes when a jet is incoming loaded with cluster bombs.

    Another thing to keep in mind is that peacetime training disasters always make the news with the military. However, the military has a fine line to walk - on one hand they want to be safe in their exercises, but on the other hand they want to be able to handle combat operations. A 30 minute single-shot firing procedure that allows for all kinds of safety checks sounds great in theory, but in wartime you'd lose more people to incoming fire than you'd ever save from gun explosions. Sure, you don't want to kill yourself, but if you're so ineffective that the enemy overruns you it is all for nothing. As a result we tolerate some friendly fire, accidents, etc.

    Like it or not robotic weapons WILL be the future of warfare. Sure, one country might elect not to develop them, but sooner or later somebody else will, and once they work out the bugs they'll be overrunning everybody else...
  • by ghostunit ( 868434 ) on Thursday October 18, 2007 @10:07PM (#21035239)
    Nope, unlike what TV may have taught you, people rarely, if ever, joke about anything that affects and hurts them.

    Let's see you cracking a joke about the robot at the funeral if it was *your* son in the casket.

    Now, I don't see anything bad about us making jokes in this forum, since we aren't personally involved in the matter at all and can only feel sorry in an "abstract" kind of way (as in, accidents and human loss are sad, but oh well, I can't feel sad for *every* bad thing that happens in this world, right?), and this won't be read by the affected people. But let's not go around pretending that we are "dealing" or "coping" with anything here. That's just hypocrisy.
  • by microTodd ( 240390 ) on Thursday October 18, 2007 @10:14PM (#21035287) Homepage Journal
    This thread happens every single time some tragedy with loss of life is posted here on Slashdot. Some people find the humor, then others are "sickened" and "can't believe the heartlessness".

    The simple matter is, many, many people die every day. Many, many people are also born every day. You can't be personally upset over every life lost or you would spend all your time in overwhelming grief. And sometimes humor is the only alternative to what would otherwise be shock, anger, sadness, or fear.
  • by Nazlfrag ( 1035012 ) on Thursday October 18, 2007 @10:17PM (#21035325) Journal
    Well, if you'd grown up all your life in the despotic, decadent corporate dystopia depicted in Robocop like those young'uns did, you'd be fairly oblivious too.
  • by Al Al Cool J ( 234559 ) on Thursday October 18, 2007 @11:15PM (#21035897)
    That's funny, because as a human, with close to 40 years experience working with other humans, all *I* can say is "PLEASE DON'T KEEP GIVING *THEM* GUNS!!!"

    I would never want to be around a human with a gun, just too big of a chance for something to go wrong.

  • by Johnny5000 ( 451029 ) on Thursday October 18, 2007 @11:23PM (#21035967) Homepage Journal
    When you're talking about massive loss of life while testing armed robots that the military wants to turn loose on the world, sometimes humor is the only way to deal with reality.

    Seriously.. this thing was built with the explicit purpose of raining death down on people.

    And lookee, it apparently did the job it was built to do....
    Only on people we've all decided "deserved" to keep their lives.

    Unlike the people this thing was *intended* to kill.
  • by Damarkus13 ( 1000963 ) on Thursday October 18, 2007 @11:38PM (#21036093)
    That's complete bullshit.

    My father is a paramedic, and some of the jokes that circle the station after a particularly gruesome scene would probably make you vomit. These men aren't deranged; dark humor is a very real way to deal with tragic events. These men are psychologically evaluated from time to time, and the psychologists never seem to have any problem with dark humor. One has gone so far as to tell my father it is a COMMON coping mechanism, especially when one is trying to remain abstracted from the trauma.

    I'm not saying they make these jokes at funerals (that's just called tact) or in the presence of civilians, but pull your head out of your ass and realize that laughter is a powerful healing tool.

  • Re:Riiight (Score:5, Insightful)

    by Thaelon ( 250687 ) on Thursday October 18, 2007 @11:56PM (#21036257)
    Maybe that's what they tell the grunts. Congratulations, you managed to shoot down large mock targets that weren't shooting back.

    Think you can shoot down a supersonic missile flying below the horizon? No. They let the computer-guided robots do that. You're not nearly good enough at it. OK, maybe you get lucky and nail it. Now try thirty in five seconds, all coming from different bearings. Didn't think so.
  • by plover ( 150551 ) * on Friday October 19, 2007 @12:04AM (#21036331) Homepage Journal

    Let's see you cracking a joke about the robot at the funeral if it was *your* son in the casket.
    I did. It was the only way I could react to my father's death. It's who I am. I hurt fiercely, I was crying hard, and when my mom and I stepped into her kitchen I had to say something, so I cracked a quiet joke. It broke the tension, and made us feel just a tiny bit normal.

    That's coping, using humor. It happens in real life.

    In this forum, however, nine South Africans are truly remote. They're about as far outside my monkey sphere as humans can get. You wanna joke about them? Fine by me. You want to complain about the jokers because you don't think people really deal with tragedy that way? You're quite wrong.

  • by Enlightenment ( 1073994 ) on Friday October 19, 2007 @12:17AM (#21036467)
    It wasn't really for raining death down on people. It was an antiaircraft cannon, which is presumably used defensively against military aircraft.
  • by Dun Malg ( 230075 ) on Friday October 19, 2007 @01:32AM (#21037105) Homepage

    He had claimed that he had been involved in writing code for some kind of automated anti-missile defense system, though he had always insisted that he wasn't allowed to give details.

    If programmers like HIM are writing the code for these "smart" weapons, then I think we should just give the things to our enemies for free.

    Defense contractors frequently end up with bad products, but it's usually due to mission creep and gross mismanagement. Based on my experience*, I'd almost guarantee that this guy was lying about his experience. Pretending to have worked on a "top secret" project that you conveniently can't talk about is pretty weak sauce. In reality, there are two kinds of classified projects: mundane ones, where the engineers working on 'em can talk about the "what" of the program in great general detail, but the specific "how" is classified; and REALLY secret ones, which you can't talk about at all, the most you can say is "I work for Lockheed" or whomever. This "I worked on a secret anti-missile program" shit is a load of crap. It falls into the big fat liar zone between mundane and really secret.

    * I was an intelligence analyst in the Army. I dealt strictly with excruciatingly mundane secrets. Boring, boring, boring. My father was an engineer for Hughes (now Raytheon). He worked on things like the B-2 Spirit ground mapping radar system. For years he "worked at Hughes", and that was it. Later, he was able to say "I work on the B-2 radar system. You'd be amazed at some of the cool shit we do with it, but I can't say what it is."
  • by timeOday ( 582209 ) on Friday October 19, 2007 @01:49AM (#21037261)
    Does the robotic aspect make this any different from a fatal bridge collapse or a tire failure? IMHO it's the same.
  • by shinmai ( 632532 ) <<moc.liamg> <ta> <otsiraas.opaa>> on Friday October 19, 2007 @02:52AM (#21037647) Homepage
    it certainly doesn't make these soldiers any less dead

    I think it's more important to note that it doesn't make them more dead, or kill additional soldiers, either. And really, thousands of far more tragic deaths happen each day. There are children being molested all over the world as I write this. Sorry if I don't lose sleep over some minor military casualties sustained while developing more efficient ways to kill people.

    And like the Parent said, laughing does make the world a better place. When people who don't know each other find the same thing humorous, it brings them together in a really strong way.

    Yeah, I'm an idealist hippie, shoot me (with a robotic cannon).
  • by Anonymous Coward on Friday October 19, 2007 @03:54AM (#21038017)
    I think it's official, "programmers kill people". I'm bracing for a backlash...
  • by ghostcorps ( 975146 ) on Friday October 19, 2007 @06:01AM (#21038605) Homepage
    It really must be an Aussie thing. My mate is a cop; last month he had to tell a nun that her wheelchair-bound brother had lost control going down a hill and drowned in a duck pond.

    But when she asked how he died, he could barely keep a straight face, so he told her to ask at the hospital.

    Later she saw him and said, "No wonder you couldn't tell me how he died." It seems she nearly pissed herself laughing at the hospital. She also told him to practice more; he'd given himself away with a tiny lift at the corner of his mouth when she asked.

    Personally, I don't get what a period of mourning achieves. Losing someone leaves an empty place, but I wouldn't want anyone to waste a moment of their life mourning the loss of mine. Why is it that the West treats death as some kind of divine punishment, while the East tends to celebrate it?
  • by buck-yar ( 164658 ) on Friday October 19, 2007 @06:51AM (#21038899)
    Would you rather be on the front lines of a war, or be controlling a robot that is?
  • by jollyreaper ( 513215 ) on Friday October 19, 2007 @07:39AM (#21039179)

    I teach robotics (at a VERY basic level) to high school kids. I explain that there are some really peculiar people out there who watch movies like Terminator and think "Hey, that's cool! I wanna build a killer robot" and who then spend their professional careers trying to build machines that will lower our position in the food chain. :( They just don't sense the danger. Just like those designing artificial brains, smart weapons, doomsday plagues, better nukes......
    Yup. And I'm not even looking at it from a robot-uprising perspective. Strong AI may or may not happen, but I think it's going to be far, far off, like practical fusion power. In the meantime, though, weak AI robotics is coming along nicely: Predator drones, SWORDS robots, etc. Just look at the anti-democracy crackdown in Burma; that shows you the power of force when applied against the people. There were reports that some of the military units were wavering, having second thoughts about killing civilians and monks. An automated gun doesn't care.

    We've already got that level of distance with aerial bombing. We killed, what was it, twenty civilians trying to take out Saddam on the opening night of the war? We've got Marines on trial for deliberately raping and murdering civilians up close and personal, but we gave medals to guys doing the indiscriminate killing from the air. We act like it's different, like accidentally killing dozens of people in an air attack is different from shooting them up close and personal. Wow, I'm sure their families will see that distinction exactly the same way we do. And when our cruise missiles go off course and hit the wrong target, they're going to realize that's entirely different from when a suicide bomber does the same thing with two tons of explosive in a truck. "Sorry, my bad."

    Automated weapons are going to make the blood cost of war (to us) too low. We need casualties in the millions before our dumb monkey brains can figure out it's a bad idea, sometimes not even then.
  • by eam ( 192101 ) on Friday October 19, 2007 @07:45AM (#21039219)
    Picture my family sitting around the corpse of my grandfather. He didn't want a funeral or burial. He was going to be cremated. We were there to say good-bye. My father (his son) said, "Wouldn't you shit if he sat up and said 'April Fool!'" (it was April 1st). We all had a good laugh.

    My wife, an optometrist, dreamed of having her own practice. She has had her own practice for ten years and it remains a dismal failure. We are scratching and crawling out from under the debt we incurred, and eventually we'll reach the point where we'll be able to more or less survive. I'll never be able to retire. Neither will she. We won't be able to send our kids to college the way our parents sent us. Nevertheless, it is a constant source of humor. If we didn't joke about it, I think we would lose our minds.

    People *do* joke about the suffering and loss of their loved ones, they joke about having their own dreams crushed. So, when you say you don't think anyone does, you're wrong. Maybe not everyone. But people do, and it is valid. In fact, it is just as valid for someone not directly involved.
  • by Denial93 ( 773403 ) on Friday October 19, 2007 @11:03AM (#21041887)
    Huh? You mean on 9/11, 2.5% of fatalities worldwide were due to terrorism? And since then, terrorist deaths have practically flatlined, with rarely more than 0.01%, way behind pulmonary heart disease, the flu, starvation, war, crime, work accidents, motor vehicle crashes and all sorts of other causes? You mean it doesn't make sense to throw terabucks into the War On Terror when relatively cheap nutrition programmes could save 27,000 lives per day?

    What is this, a remaining pocket of common sense?
  • by Dread_ed ( 260158 ) on Friday October 19, 2007 @11:10AM (#21042019) Homepage
    To be fair to history, the "cruise missile off course" problem is a nice trade-off for "razing an entire city, raping, enslaving, or killing the entire population, stealing all the valuables, and toying with the captives by seeing who can skin one completely without letting a single drop of blood fall."

    Warfare, as recently as the Second World War, was not limited by counting civilian casualties. And yet many of our refined and erudite citizens now take such restraint as the norm, lamenting even one collateral kill. The indoctrinating effects of "civilization" are truly amazing; sufficient even to erase the survival capabilities of hundreds of thousands of years of evolution in a few generations. Hopefully we never meet an enemy who has not learned to sublimate their instincts in the pursuit of some dubious higher morality.

    As for automated weapons killing indiscriminately, I think they are just suffering from an acute self-actualization crisis.
  • by ozgood ( 873183 ) on Friday October 19, 2007 @12:08PM (#21043207)
    Every army that wants to be good needs to be a well-oiled machine. Otherwise, accidents like this happen regularly.

    The parent's "racist beliefs", broken down, were:

    The post-apartheid government is black. True
    Corruption is running rampant in SA, which has a black government. True
    HIV is climbing faster than corruption. True
    SA is now dangerous. True
    SA government (which again happens to be black) spends money on needless things rather than helping the people. True

    The facts are that in the post-apartheid era, things in South Africa are in fact worse. I don't think it's a black thing vs. a white thing, but when anybody points out the above facts they are called racist.

    Your issue shouldn't be with the parent being racist; it should be with your government being accountable for the above issues. Whether the government happens to be black or white doesn't matter.

    Sadly, most of Africa seems to be following this trend, which is a shame.
  • by bwen ( 675669 ) on Friday October 19, 2007 @12:16PM (#21043333)
    Out of curiosity, are you sure that the parent kept the entire population dumb? Your post seems very accusatory of the parent. And saying murder or rape is acceptable is beyond idiotic. As far as South Africa not thriving goes, are we to blame the parent for the utter failure of the whole continent? Give me a break.
