Armed Robots Not Actually Gone From Iraq 263

Posted by Zonk
from the someone-call-for-an-exterminator dept.
NightFalcon90909 writes "You may have heard that armed robots were yanked from Iraq after a gun started to swivel without being told to do so. 'A recent news report that armed robots had been pulled out of Iraq is mistaken, according to the company that makes the robot [Foster-Miller] and the Army program manager.' 'The whole thing is an urban legend,' says Foster-Miller spokesperson Cynthia Black of the reports about SWORDS moving its gun without a command."
This discussion has been archived. No new comments can be posted.


  • by enzo_romeo (756095) on Tuesday April 15, 2008 @12:18PM (#23078702)
    Who cares if it works?
  • by T-Kir (597145) on Tuesday April 15, 2008 @12:19PM (#23078720) Homepage
    Maybe they put the Telencephalic inhibitors back in?
  • by SeeSp0tRun (1270464) on Tuesday April 15, 2008 @12:20PM (#23078732) Journal
    So the United States Government says this didn't happen... They also said the prisoners of war were treated fairly...
    coughWATERBOARDINGcough

    Yep, the government must be right!
    • Re: (Score:3, Funny)

      by uncoveror (570620)
      Cover-up! Cover-up! You can be sure something is true if it has been officially denied. Calling this story an urban legend is the falsehood here. These Terminators are going to be the end of us all!
      • by techpawn (969834)

        These Terminators are going to be the end of us all!
        Now that you mention it, they are consolidating and restricting access to a lot of "in the air" government databases... making a SKYNET, if you will...
      • by jellomizer (103300) on Tuesday April 15, 2008 @12:47PM (#23079140)
        Question is what if the government is telling the truth...

        You can't trust the government if it hides anything.
        You can't trust the government if it fully discloses everything (they must be lying).
        You can't trust the government if it gives you only what you need to know.

        How do you convince Joe Sixpack that we did go to the moon?
        That is the problem with conspiracy theories: the more proof you give them, the more elaborate the conspiracy becomes.
    • by kellyb9 (954229)
      It's worse... the government didn't say so... the company making them did.
  • It's Inevitable (Score:5, Interesting)

    by Al Mutasim (831844) * on Tuesday April 15, 2008 @12:20PM (#23078734)
    Three false moves prior to certification is not a problem. Compare this to false moves by soldiers carrying rifles, which are universal. Even if a robot were to point its gun in the wrong direction, the person controlling it, and there always is one, would not pull the trigger. The Army will (and should) let the Talon see action. Gun-shooting robots are inevitable.
    • Re:It's Inevitable (Score:5, Insightful)

      by Rob T Firefly (844560) on Tuesday April 15, 2008 @12:22PM (#23078772) Homepage Journal
      I know nothing about these things or guns in general so maybe I'm off base, but if the bit that makes it swivel engages without being told, what on Earth makes you so confident that the bit that makes it shoot will not engage without it being told?
      • Is an armed robot that much different than, say, a modern aircraft? What I mean is, what keeps a modern attack plane from launching a missile or shooting its guns due to a fault? It's the way things are going; the best we can do is make sure they are tested, retested, and tested again. Still, accidents will happen, and those accidents will claim lives. That's the cold hard truth about it. As far as I know, people aren't 100% failsafe either.
      • Re:It's Inevitable (Score:5, Informative)

        by SwordsmanLuke (1083699) on Tuesday April 15, 2008 @01:16PM (#23079576)
        I work for a robotics company and (among other things) have worked on modifying a TALON (on which these SWORD robots are based) to work with our control software.

        if the bit that makes it swivel engages without being told, what on Earth makes you so confident that the bit that makes it shoot will not engage without it being told?
        To answer your question, not a damn thing. The TALON I worked with was really flaky. It shook and twitched so frequently that the guys who owned the TALON referred to the bot as having the "Foster-Miller shakes."

        I hope the SWORD bots are much better quality than the TALON bot, because, quite frankly, there is no fraking way I'd trust one of those things with a gun.
      • You are off base, sorry; these systems are much more complicated than the standard RC toys you're familiar with.
        #1 Go read the article. If you've read it already then go read it AGAIN. Failures were discovered during evaluation PRIOR TO CERTIFICATION AND DEPLOYMENT. The faults were found, analyzed and redundant connections or fail-safe modes were introduced to remove those failures.
        #2 The firing system is a different component that DID NOT FAIL. The firing system also went through literally years of r
    • by dave420 (699308)
      So it'd be fine if the wheels on your car started to move independently from the steering wheel, as the driver can just use the brakes when that happens?

      Back to the robots: the trigger mechanism might also be dodgy, and what if the gun is already being fired when it starts to move?
    • by sm62704 (957197)
      Gun-shooting robots are inevitable

      And IMO that's good news. I'd rather have armed robots in Iraq than have one of my daughters there, especially since Iraq posed no threat to US security and never did. How do you spell "clusterfuck"?

      Yeah, go ahead and mod me down. The truth hurts, doesn't it?
  • You know how I know calling your armed robots SWORDs is a bad idea? Because I saw this movie, that's how: http://www.imdb.com/title/tt0112993/ [imdb.com]
  • The article is worth it just for this quote: "So, now there is now redundant wiring on every circuit."
  • Like The Terminator, they'll be back.
  • by explosivejared (1186049) <hagan.jared@NOSPAM.gmail.com> on Tuesday April 15, 2008 @12:22PM (#23078770)
    "It can't shoot anyone [without orders]," Black says. "It's not an autonomous vehicle."

    Can we not dream that there are artificially intelligent armed to the teeth robots ready to kill us at a moments notice?! If you take that away, what do we have left?! Do not bring your holier than thou facts to our paranoia party. If we believe hard enough that there are crazed, deadly robots on the loose, maybe... one day our dream might come true! So step off Sgt. Buzzkill.
    • by mh1997 (1065630)

      "It can't shoot anyone [without orders]," Black says. "It's not an autonomous vehicle."

      Can we not dream that there are artificially intelligent armed to the teeth robots ready to kill us at a moments notice?! If you take that away, what do we have left?! Do not bring your holier than thou facts to our paranoia party. If we believe hard enough that there are crazed, deadly robots on the loose, maybe... one day our dream might come true! So step off Sgt. Buzzkill.

      Ok, they may not get orders to kill, but

    • by kabocox (199019)
      Can we not dream that there are artificially intelligent armed to the teeth robots ready to kill us at a moments notice?! If you take that away, what do we have left?! Do not bring your holier than thou facts to our paranoia party. If we believe hard enough that there are crazed, deadly robots on the loose, maybe... one day our dream might come true! So step off Sgt. Buzzkill.

      I'm just waiting until someone lets loose the bots and has them conquer and expand out in any direction without thinking ahead of w
    • by MightyYar (622222)
      No, but if they get struck by lightning, they can be a good friend [youtube.com].

      Neeeed input!
    • Short Circuit.

      There, happy now?
  • Ooops! (Score:4, Funny)

    by mcecil (1248130) on Tuesday April 15, 2008 @12:23PM (#23078786)
    (Hastily tears down "Hail Robots" sign)
    • by interiot (50685)
      55 comments so far, and nobody has mentioned welcoming our robot overlords? It's true that the meme is getting a bit old (even Fox news [foxnews.com] has picked it up — quite the death-knell), but that's never stopped Slashdotters before.
  • EX-TER-MI-NATE! (Score:5, Interesting)

    by Aquaseafoam (1271478) on Tuesday April 15, 2008 @12:25PM (#23078820)
    EX-TER-MI-NATE! EX-TER-MI-NATE! *Cough* Hrm hrm... If a crossed wire can cause the gun to swivel, then a crossed wire can also cause the gun to fire. Anyone else surprised to see that they failed to include multiple redundancies? Of course, one could put forward the argument that the more redundancies there are, the more there is to go wrong.
    • You're confusing complexity with poor engineering. Properly designed redundancy adds complexity while serving only to increase reliability. If it doesn't, the fault lies not with the complexity but with the engineer.
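The engineering point above, that well-designed redundancy adds complexity only in exchange for reliability, is classically illustrated by triple modular redundancy (TMR). This is a generic sketch of the pattern, not anything Foster-Miller is documented to use: three independent channels are majority-voted, so no single faulty channel can trigger an action by itself.

```python
# Hypothetical sketch of triple modular redundancy (TMR): an action is
# taken only when at least two of three independent channels agree, so
# one stuck or noisy channel cannot cause an uncommanded action.

def majority_vote(a: bool, b: bool, c: bool) -> bool:
    """True only when at least two of the three channels agree."""
    return (a and b) or (a and c) or (b and c)

# One faulty channel reading "fire" is outvoted by two healthy ones:
assert majority_vote(True, False, False) is False
# Two healthy channels agreeing still carry the decision:
assert majority_vote(True, True, False) is True
```

The cost is three channels and a voter instead of one wire, which is exactly the complexity-for-reliability trade the parent describes.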
  • Translation (Score:4, Funny)

    by Ariastis (797888) on Tuesday April 15, 2008 @12:27PM (#23078854)
    s/Its an urban legend/All witnesses have been silenced/
  • by usul294 (1163169) on Tuesday April 15, 2008 @12:30PM (#23078890)
    I'm an engineer for a company that writes some of the signal-analysis software for robots, mostly military. They are designed to look for people, noise, or something else easily sensed, train their guns on that location, and await further instruction. It's a de facto law of military robot design that a human makes every firing decision, but the robot is allowed to aim and ask if it can fire. If a US soldier did something loud (shot a gun, slammed a door, yelled), there's a good chance that's what set off the targeting routine. There was never any chance of a weapon being fired, except of course if there was a malicious operator. I have not worked on this type of robot, so I can't be sure of the process. There might be a user command that says "go look for target". If the robot looked for a target without ever being commanded, that'd be a pretty horrendous software error.
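The human-in-the-loop rule described above, where the robot may acquire and track a target on its own but firing is always gated on an explicit operator decision, can be sketched as a small state machine. All names and states here are illustrative assumptions, not the actual military control software:

```python
# Hypothetical sketch of human-in-the-loop fire control: the robot may
# track and *ask*, but only an operator response resolves the request.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    TRACKING = auto()       # aimed at a stimulus, no fire request yet
    AWAITING_AUTH = auto()  # asking the human; cannot act on its own

class Turret:
    def __init__(self) -> None:
        self.state = State.IDLE

    def detect(self, stimulus: str) -> None:
        # A loud noise or movement triggers the targeting routine only.
        self.state = State.TRACKING

    def request_fire(self) -> None:
        # The robot may ask; it may never decide.
        self.state = State.AWAITING_AUTH

    def operator_response(self, authorized: bool) -> str:
        self.state = State.IDLE
        return "fire" if authorized else "stand down"
```

In this model a gunshot or slammed door moves the turret to TRACKING, exactly the swivel-toward-a-noise behavior the parent suggests, but no path leads to "fire" without `operator_response(True)`.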
    • by TTURabble (1164837) on Tuesday April 15, 2008 @12:35PM (#23078986)
      But did you implement the three laws?
      • Re: (Score:3, Insightful)

        by Kartoffel (30238)
        Remember, Asimov's laws of robotics are science fiction. They are relevant in the same way as the laws of the Old Testament: both are prominent literary works... of fiction.
        • by Dr Caleb (121505)
          "They are relevant in same way as the laws of the old testament: both are prominent literary works...of fiction."

          So, of course, the inherent morality in both works should therefore safely be ignored.
          • by Kartoffel (30238)
            Taken with a grain of salt, perhaps. Both Asimov's and the Bible's laws make no distinction as to what is moral; they are mere laws. Morality is more difficult than a simple boolean.

            Two similar questions with very different implications:
            1. Is it ever moral for a robot to kill?
            2. Is it ever legal for a robot to kill?
        • by sm62704 (957197)
          Remember, Asimov's laws of robotics are science fiction. They are relevant in the same way as the laws of the Old Testament: both are prominent literary works... of fiction.

          If the writer believes what he's writing, it isn't fiction, and there's no evidence whatsoever that the writers DIDN'T believe what they were writing. It may be incorrect, but it isn't fiction.

          As to Asimov's laws, they sound like good solid engineering principles that we should try to implement, even though they are, in fact, fiction, and would,
        • The laws aren't works of fiction. Many Jews have lived under them for many, many years. Christians got a Get Out of Jail Free card, and Muslims got a whole new set of them, because some idiot fat fingered the transcription and now the original ones are lost.

          Now ten commandments being inscribed miraculously on the mountain top? Ok, that takes faith. But I'd still say that the laws are pretty relevant.

        • by geekoid (135745)
          And they failed. Hence the mystery.

          • Did you read the article? There is no "mystery." The faults were found and corrected. The design was changed to prevent them from occurring again.
        • by legirons (809082)
          Remember, Asimov's laws of robotics are science fiction. They are relevant in the same way as the laws of the Old Testament: both are prominent literary works... of fiction.

          Asimov's laws are a philosophical choice between humanitarian robots and military robots.

          Without such a choice, the structure of governments tends towards the 'military robots only' being a default option (as seen here, and in UAVs)

          Asimov showed what robots could be, if we had higher moral expectations of them
        • Re: (Score:3, Funny)

          by Bryansix (761547)
          I think a lot of people will disagree with you here. Maybe not Slashdot members but a lot of people (me included) believe that God really did write the 10 commandments on stone tablets.
      • by tgd (2822)
        One doesn't tend to land DoD contracts if one's devices implement the three laws. It's sort of robotic evolution -- if you have three laws, you definitely won't be having any reproducing going on.
    • If the robot looked for a target without ever being commanded that'd be a pretty horrendous software error.

      There are two sides to this coin. Heads: robots are more expendable than people, and intimidating, trigger-happy, seemingly out-of-control robots can scare enough bejesus out of militant insurgents to turn the tide and make terrorists keep to themselves. Tails: a robot can be captured by the enemy, leading to the scenario, unlikely as it may be, that it is sent back with enough sneakiness to gun down comm
    • by baudilus (665036)
      Does a dialog pop up?

      Fire gun?
      [YES] [NO]

      What if the operator mistakenly sees "Having fun?" and accidentally clicks yes?
      • by samkass (174571)
        That's the Windows style of dialog box. If it were a Mac, it would say "Would you like to fire the gun? [Fire] [Cancel]". For apps that follow Apple's style guidelines, the command wording goes in the button, not "yes/no".

        I don't know what Linux's style guidelines say on this matter. I suspect the phrase "Linux style guidelines" is already causing some snickers.
      • Re: (Score:3, Funny)

        by AgentPaper (968688) *
        Clippy just got a whole new lease on life...

        "It looks like you're trying to shoot an insurgent. Would you like help?"
    • by Kartoffel (30238)
      Good clarification. Sometimes the media makes little or no distinction between autonomy and teleoperation.
    • by geekoid (135745)
      "There was never any chance of a weapon being fired,"
      Clearly they have developed some magic bit of electronics and foolproof code.

      Hell, stop making robots and sell your magical technology!

      I agree that it isn't likely to go off, but equipment fails all the time, and equipment in the field can fail in spectacular ways.
      • by Jtheletter (686279) on Tuesday April 15, 2008 @02:34PM (#23080602)

        "There was never any chance of a weapon being fired," Clearly they have developed some magic bit of electronics and fool proof code.
        Let's go over the various realistic reasons for why there might not have been a chance of a weapon firing:
        1) Weapon safety was on.
        2) Weapon was not loaded.
        3) Weapon was not attached to robot base.
        4) Firing system was not installed/powered/engaged.

        Remote firing circuits, while not 100% perfect (only because nothing is), are a mature technology. They are used all the time in law enforcement, and especially in EOD remote detonations. Could you also please tell us what certifications were passed for this firing circuit? Until you can point to that specific data and tell us why it fails, you're guessing at things you don't know.
  • by mmell (832646) <mike.mell@gmail.com> on Tuesday April 15, 2008 @12:33PM (#23078956)
    During initial testing, the automated vessel identified Catalina Island as a fast moving object and proceeded to lock her guns on her escort vessel (which was nowhere near Catalina at the time). The system (NT 4.0 based, IIRC) had to be shut down, as there was no manual override and the Navy didn't feel like burying that many seamen at sea.

    After which (with engines and navigation offline) she had to be towed back to port.

    Y'know, after those problems were addressed, the Aegis-class cruiser entered service and is still a very effective platform for the US Navy. Not that I think it wise of us to arm automated robots, but from the military perspective this is only a minor setback.

  • by Kjella (173770) on Tuesday April 15, 2008 @12:33PM (#23078960) Homepage

    "It can't shoot anyone [without orders]," Black says. "It's not an autonomous vehicle."
    It doesn't have to be autonomous to do bad things. Say, for example, you order it to rotate the turret and fire its gun, and then the radio transmission is jammed. If it were programmed really stupidly and kept waiting for a stop command that never came, it'd fire in circles until it was out of ammo. Obviously this is a very naive example, but surely the robot can do plenty of harm unless it stops cold any time the transmission has a hiccup. Even then, I'm sure there are ways to make it react unintentionally.
    • by boris111 (837756)
      Exactly. If it auto-targets, it could potentially provoke someone it did not intend to (even a friendly). Picture a scenario where a friendly soldier comes into a room and the robot auto-targets because they're firing at an enemy. The friendly gets startled and starts firing at the robot. The operator on the other end gets startled and opens fire on the friendly.
    • by SwordsmanLuke (1083699) on Tuesday April 15, 2008 @01:05PM (#23079394)
      Disclaimer: I haven't worked on SWORD robots, but I have worked with the TALON on which the SWORDS are based.

      The sort of scenario you describe is prevented with a heartbeat-based killswitch. E.g., a signal is sent to the robot at a regular interval. If, for some reason, the heartbeat is not received, the robot immediately shuts down and stops moving. So, as you said, the robot "stops cold any time the transmission is having a hiccup." It can be a pain sometimes, but it's a hell of a lot better than the alternative.

      In the same way, dangerous commands (such as "shoot gun") require the robot to receive said command constantly in order to continue that action. So a robot commanded to turn and fire just before losing comms would, at worst, just turn, and typically do nothing.

      Also: +1 Ironic Sig.
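The heartbeat killswitch described above can be sketched roughly as follows. The timeout value and the method names are assumptions for illustration, not the actual TALON protocol; the two properties modeled are the ones the commenter names: halt on any missed heartbeat, and require dangerous commands to be re-sent on every beat.

```python
# Hypothetical sketch of a heartbeat-based killswitch: the robot halts
# the moment the operator's periodic signal stops arriving, and a
# command persists only as long as it is re-asserted with each beat.
import time

HEARTBEAT_TIMEOUT = 0.5  # seconds between beats before failing safe (assumed value)

class FailSafeRobot:
    def __init__(self) -> None:
        self.last_beat = time.monotonic()
        self.current_command = None

    def heartbeat(self, command=None):
        """Called for every packet received from the operator.
        A command must be re-sent on every beat or it lapses."""
        self.last_beat = time.monotonic()
        self.current_command = command

    def tick(self):
        """One control-loop step: fail safe on any comms hiccup."""
        if time.monotonic() - self.last_beat > HEARTBEAT_TIMEOUT:
            self.current_command = None  # comms lost -> stop cold
            return "halt"
        return self.current_command or "idle"
```

Note that the fail-safe direction is "do nothing": a jammed or dropped link can only ever stop the robot, never leave a "fire" command latched, which is why the turn-then-fire sequence above degrades to "just turn."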
      • by geekoid (135745)
        Unless an error made it 'think' it was still getting the heartbeat.

          Geekoid, shut up. You don't even understand basic robotics development. The whole concept of a heartbeat is just that: a heartbeat. Heartbeat stops, robot stops. There is no second heart beating; it's not a Klingon.
    • by Blakey Rat (99501)
      That post is pretty ironic coming from someone whose sig is making fun of paranoia on Slashdot.
  • ROV (Score:3, Insightful)

    by TomRK1089 (1270906) on Tuesday April 15, 2008 @12:40PM (#23079052)

    So it's basically a remotely operated vehicle, not some kind of autonomous drone. It makes sense, then, that they wouldn't want to give up on a potentially useful project so quickly. If they had, I'd say they were throwing the baby out with the bathwater.

    Of course, on the other hand, the Middle East has to be one of the most inhospitable environments for robots, what with the extremes of temperature, sand getting into internal parts, et cetera. I'm curious what kind of tests they did with SWORDS such that these connections weren't fixed before deployment. Did they not understand that "works perfectly in a sealed lab environment" doesn't translate to "will work in the field, without regular maintenance, in a non-ideal environment"?

  • by hyades1 (1149581) <hyades1@hotmail.com> on Tuesday April 15, 2008 @12:40PM (#23079056)
    Given his track record for pointing guns in the wrong direction, perhaps we should start calling the little darlings, "Cheneys".
    • by sm62704 (957197)
      Given his track record for pointing guns in the wrong direction, perhaps we should start calling the little darlings, "Cheneys".

      If Bush went duck hunting with Cheney (and his heart troubles) we might have a woman president.
  • Never Say Never (Score:5, Interesting)

    by bostongraf (1216362) on Tuesday April 15, 2008 @12:48PM (#23079152)
    To all those saying that a human is "required" for the trigger, and it could "never" shoot on its own, I would like to remind you of this past October in South Africa:

    "It appears as though the gun, which is computerised, jammed before there was some sort of explosion, and then it opened fire uncontrollably, killing and injuring the soldiers."
    This was reported here: Wired Danger Room [wired.com] The most unreal quote from that link is (IMO) this:

    But the brave, as yet unnamed officer was unable to stop the wildly swinging computerised Swiss/German Oerlikon 35mm MK5 anti-aircraft twin-barrelled gun. It sprayed hundreds of high-explosive 0,5kg 35mm cannon shells around the five-gun firing position. By the time the gun had emptied its twin 250-round auto-loader magazines, nine soldiers were dead and 11 injured.
    The robot was set to reload automatically as well, and the only reason it stopped firing is that they hadn't provided it with more cartridges.
  • But one was seen headed back to the States muttering about "John Connor."
  • I'd be a lot more concerned if it had never failed, because that would mean they don't know the true extent of its capabilities. From TFA: all three uncommanded movements occurred before it was safety certified, meaning before it was a finished product. As always, the facts tell the story:

    There were three cases of uncommanded movements, but all three were prior to the 2006 safety certification, she says. "One case involved a loose wire. So, now there is now redundant wiring on every circuit. One involved

    • by polymath69 (94161)

      I am not exactly sure what it means to "double solder" something.

      I'm not certain either, but here's how I picture it working.

      Take four spots on a circuit board, all connected together.

      o o    a b
      o o    c d

      Take one redundant wire end and solder it to both 'a' and 'b'. The second wire to 'c' and 'd'. Run both these wires together to a similar four spots on the destination circuit board. Repeat as needed.

      Mind you, that's a guess, but that's how I'd do it.

  • This story is not about autonomous robots, it's about remote controlled toys.
  • SWORDS.
    Seriously, what is it with the US military and silly macho-sounding acronyms? You can almost hear the marketing meeting where they added and removed features until the project had a cool-sounding acronym.
    Back in the good old days you called a new plane a Spitfire and a new gun a Bren gun. You didn't make up some silly collection of six or seven different words that spelt out PATRIOT or other such silliness.
    Grrr.
