U.S. Army Robots Break Asimov's First Law
buanzo writes "The US Army is deploying armed robots in Iraq that are capable of breaking Asimov's first law, that they should not harm a human. SWORDS (Special Weapons Observation Reconnaissance Detection Systems) robots are equipped with either the M249 machine gun, which fires 5.56-millimeter rounds at 750 rounds per minute, or the M240, which fires 7.62-millimeter rounds at up to 1,000 per minute."
Update: this story refers to this article from 2005. But c'mon, robots with machine guns! I don't get to think about that most days!
Phalanx... (Score:5, Informative)
Fluff Piece (Score:5, Informative)
These are actually robots, but they're not the fully-autonomous solutions that Asimov was suggesting that mankind needed protection from. Thus the "laws" of robotics don't apply here, because it's still a human who's doing the thinking for the machine.
In effect, this is a safe way for ground troops to line up a kill zone, then cause lots o' bad guys to get torn to shreds. Prior to this, troops needed a vehicle-mounted machine gun to get this sort of rate of fire, which was extremely limited in close quarters, where a Humvee or tank might not fit. While it was theoretically possible to carry a machine gun to the combat zone, such weapons are difficult to transport, set up, and use in close quarters.
Not an Automaton (Score:2, Informative)
Comment removed (Score:5, Informative)
Re:Not really... (Score:3, Informative)
Not quite (Score:3, Informative)
Re:Phalanx... (Score:5, Informative)
Re:Not to worry (Score:1, Informative)
I'd forgotten the "You suck!" reply though, that's a nice touch.
Re:Am I the only one... (Score:5, Informative)
Common confusion between "robot" and "remote" (Score:3, Informative)
The distinction is hard to get non-geeks to make, though, as all sorts of remote-controlled devices are talked about as "robots." The term gets misused all the time when talking about devices that search dangerous locations for earthquake survivors, for instance. Those devices are like remote-controlled cars with a camera on the front (and aren't even wirelessly controlled--they drag a cable behind them for power and control), but the news calls them "robots" all the time.
Re:Am I the only one... (Score:3, Informative)
More than 30,000 civilians died in Iraq (Score:2, Informative)
Re:Who cares? (Score:5, Informative)
Wrong.
The signatories of the treaty agree to follow the rules regarding the treatment of the prisoners they take, their actions during wartime, etc.
A country that signs the treaty has to treat the prisoners of war that it captures according to the rules specified in the treaty, regardless of where those prisoners come from. That's why it's so important that the prisoners of war...excuse me, "enemy combatants" aren't officially recognized as prisoners of war... otherwise we'd have to treat them according to the rules of the treaty the US signed.
Pretty please spare everyone the bullshit until you know what the hell you're talking about.
Re:Not really... (Score:2, Informative)
While it's true that many governments suspected Saddam had WMD, there was no agreement as to what his actual capabilities were, or on what to do about it. Further, simply believing something to be true does not make it so, and certainly does not form a basis for war.
The administration never had a "smoking gun" to prove Saddam had WMD, and in fact the intelligence supporting the administration's view was alarmingly thin. As we now know from various reports, US intelligence affirming WMD frequently came from paid informants who, in some cases, were later proven to be fabricators. There was virtually no intelligence coming out of Iraq itself--the country was impenetrable, leaving the US and others with little in the way of credible sources.
It is also worth noting that while there was a range of opinion (and widespread error) as to Saddam's chemical and biological weapons capability, there certainly was not a consensus. The issue of nuclear weapons is a different story. Here, the US and UK stood nearly alone in their dire assessment. It was also on this issue that the administration demonstrated its willingness to use highly dubious intelligence reports by claiming that Iraq had sought nuclear material from Niger. This claim, of course, was based on crudely forged documents and should never have been made. The fact that the President did make this claim, and did so in a State of the Union address, is all the more troubling, especially given that the same statement was pulled from a speech he gave just a few months earlier.
http://www.downingstreetmemo.com/realitycheck.htm
Re:Am I the only one... (Score:4, Informative)
Also, you only highlighted the first half of the amendment, let's consider the other half:
The first half states the reason for protecting the right. The second half states the right itself and limits the government's power with respect to that right. Some gun control advocates argue the Second Amendment protects only a collective right, not an individual right. But if you follow their reasoning, it would apply to the First Amendment too.
Re:Who cares? (Score:4, Informative)
http://www.unhchr.ch/html/menu3/b/91.htm [unhchr.ch]
Geneva Convention Article 4
A. Prisoners of war, in the sense of the present Convention, are persons belonging to one of the following categories, who have fallen into the power of the enemy:
1. Members of the armed forces of a Party to the conflict as well as members of militias or volunteer corps forming part of such armed forces.
2. Members of other militias and members of other volunteer corps, including those of organized resistance movements, belonging to a Party to the conflict and operating in or outside their own territory, even if this territory is occupied, provided that such militias or volunteer corps, including such organized resistance movements, fulfil the following conditions:
(a) That of being commanded by a person responsible for his subordinates;
(b) That of having a fixed distinctive sign recognizable at a distance;
(c) That of carrying arms openly;
(d) That of conducting their operations in accordance with the laws and customs of war.
Hate to interrupt your uninformed rant, but persons who violate (b), (c), and (d) don't count as "prisoners of war". It's right there in the text of the Geneva Convention.
HTH
HAND
Not the first example (Score:3, Informative)
This fellow [cbsnews.com] is a fine previous example of an exception to Asimov's first law.
Re:Not really... (Score:2, Informative)
Re:A few thoughts.... (Score:3, Informative)
Or, how about an AEGIS [fas.org] ship itself? AEGIS ships can do about the same thing autonomously--automatically firing missiles at targets they are programmed to consider threatening.
Mind you, these systems are (well, were) almost never put into fully automatic mode--that's usually reserved for times when the feces is hittin' the fan and the operator may not have time to react.
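To make the manual-vs-automatic distinction concrete, here's a toy sketch of the kind of auto-engage gate being described. Every threshold, field name, and rule here is invented for illustration; real AEGIS doctrine statements are classified and vastly more involved.

```python
# Hypothetical auto-engage gate. Thresholds and track fields are made up.
def should_engage(track, auto_mode, max_range_km=20.0, min_closing_mps=200.0):
    """Return True if the system would fire on this track without operator input."""
    if not auto_mode:
        return False                      # manual mode: a human pulls the trigger
    closing = -track["range_rate_mps"]    # negative range rate means inbound
    return (track["range_km"] <= max_range_km
            and closing >= min_closing_mps
            and not track["iff_friendly"])

inbound = {"range_km": 12.0, "range_rate_mps": -300.0, "iff_friendly": False}
print(should_engage(inbound, auto_mode=True))   # True: fast inbound, not friendly
```

The point of the sketch is that "fully automatic mode" is just one boolean away from "a human decides"--which is exactly why operators leave it off unless they'd have no time to react.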
Re:Not really... (Score:3, Informative)
All he had done? Dude, Dr. Asimov INVENTED the word "robotics".
If they had read the robot novels, they would have noticed that even Asimov's robots did not always obey the laws.
If YOU had read them, you would have noticed that they ALWAYS obey the laws. The laws just happen to have loopholes, e.g.: no robot may harm a human... unless it doesn't know that the order it's following will result in a human being harmed, etc.
Re:Not really... (Score:3, Informative)
1) Before ODS (Operation Desert Storm), Iraq had a nuclear weapons program as well as chemical and biological weapons.
2) During ODS, all or nearly all of the capability to produce, refine, or use these weapons was destroyed, either by US bombing or by inspectors.
3) After ODS and before OIF (Operation Iraqi Freedom), the US had poor intelligence on Iraqi production capabilities for nuclear, chemical, and biological weapons, so it was easy to assume that Saddam's saber rattling about having such weapons was true.
4) Now, after (or during) OIF, the truth is that we think there was no nuclear or chemical weapons capability in Iraq (though the materials to make chemical weapons were certainly there), and there is some question as to where a lot of the biological weapon stockpiles went.
Source of this is http://www.cia.gov/cia/reports/iraq_wmd_2004/ [cia.gov]
Ira
Re:Not really... (Score:5, Informative)
I was in the U.S. Army, and we do not do whatever we're told by our superiors, "give or take". There's been no give or take involved since the Vietnam War. I know you said "professional soldiers", but we are talking about the U.S. military, not just any merc.
The U.S. Armed Forces Code of Conduct is taken very, very seriously by all members of the U.S. military. All U.S. soldiers are required to know it BY HEART and to understand every word of it, and its impact on them as a modern soldier.
Read every word of it, since you obviously never have:
http://www.armystudyguide.com/content/army_board_
Pay close attention to article 6: "I will never forget that I am an American, fighting for freedom, responsible for my actions, and dedicated to the principles which made my country free."
Every U.S. soldier is responsible for his own actions, not his superior who ordered him to do something illegal. A soldier who follows an order that is illegal or just plain wrong according to that soldier's morals is just as guilty as his superior who gave him that order.
The bottom line: Any U.S. soldier can refuse to carry out an order if he believes it is illegal, and that soldier's judgement of whether an order is illegal is governed by his own morals.
A robot has no morals, but if this Army robot is just a machine remote controlled by a U.S. soldier, then that soldier will be held accountable for any action by the robot, which is just an extension of him.
Given that freedom that every U.S. soldier has to evaluate the orders they are given, there will still be incidents where soldiers with bad or no morals do horrible things when carrying out their orders.
But, how is it any different when a U.S. citizen decides to take an automatic weapon to a school to gun down a couple of dozen kids?
It all comes down to the morals of the individual, regardless of whether the person is a U.S. citizen or soldier. U.S. soldiers are no better or worse than the average U.S. citizen.
Re:French resistance = unlawful combatants? (Score:3, Informative)
Nowhere does the GC say "okay, you're an illegal combatant, it's okay to torture you."
In fact, the GC doesn't contain the words "illegal combatant". That's an invention of the Bush administration as a category for people they don't want to treat as prisoners of war, but also don't want to treat as criminals--i.e., they don't want any laws at all to apply to them.
Video of flying robot armed with shotgun (Score:3, Informative)
A small company called Neural Robotics has produced a robotic mini-helicopter [defensereview.com] armed with a rapid-fire shotgun. Based on their off-the-shelf AutoCopter [neural-robotics.com], the UAV uses neural network-based flight control algorithms to fly in either a self-stabilizing semi-autonomous mode controlled by a remote operator, or a fully-autonomous mode which can follow GPS waypoints. A video [defensereview.com] of the AutoCopter Gunship is available.
Stepping aside the ethical issues of replacing soldiers with flying shotgun-wielding robots for the moment, their "neural network-based" flight control system seemed like an interesting technical accomplishment. This PDF briefing [neural-robotics.com] has a few details.
Taking a look at page 14 of their PDF, though, perhaps their control system is a little on the simplistic side. It seems to just update roll and pitch based on the current movement and facing of the helicopter, without making use of visual information or other sensors. I'm not too familiar with flight control, but using a neural network for that seems like overkill. When in fully-autonomous mode, I wonder if they make use of sensors for crash avoidance at all, or if they just hope that nothing's in the way of the chosen GPS coordinates.
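For reference, the "just update roll and pitch from current movement" behavior described above can be had with a plain proportional controller, no neural network required. The gains, limits, and state layout below are my assumptions, not anything from Neural Robotics' design:

```python
# Minimal proportional attitude controller: map body-frame velocity errors
# (m/s) to pitch/roll commands (rad). Gains and limits are invented.
def attitude_command(vel_error_fwd, vel_error_right, k=0.05, limit=0.35):
    """Return (pitch_cmd, roll_cmd) that nudge the craft toward the target velocity."""
    def clamp(x):
        return max(-limit, min(limit, x))
    pitch_cmd = clamp(-k * vel_error_fwd)    # pitch nose down to speed up
    roll_cmd = clamp(k * vel_error_right)    # roll right to slide right
    return pitch_cmd, roll_cmd
```

If something this simple covers the page-14 behavior, the neural network is presumably earning its keep elsewhere (e.g., adapting gains to payload changes), or it really is overkill.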
Assuming they haven't done so already, it would be rather neat to load some range-finding sensors on the helicopter and have it automatically avoid nearby obstacles; the basic algorithms should be fairly straightforward.
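One straightforward version of that obstacle-avoidance idea is a basic potential-field scheme: a ring of range finders, each contributing a repulsive "push" away from anything closer than a safety radius. The sensor layout and safety radius here are assumptions for the sketch:

```python
import math

# Potential-field avoidance sketch: each close reading pushes the craft
# directly away from the obstacle, stronger the closer it is.
def avoidance_vector(readings, safe_m=5.0):
    """readings: list of (bearing_rad, distance_m). Returns an (dx, dy) push."""
    dx = dy = 0.0
    for bearing, dist in readings:
        if dist < safe_m:
            push = (safe_m - dist) / safe_m    # 0..1, stronger when closer
            dx -= push * math.cos(bearing)     # push away from the obstacle
            dy -= push * math.sin(bearing)
    return dx, dy
```

Sum this push into the velocity setpoint each control tick and the craft sidesteps obstacles on the way to its waypoint. Potential fields have well-known failure modes (local minima between obstacles), but as a first cut they're about as simple as it gets.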
Another idea is to allow the robot to visually track a point of laser light, potentially allowing somebody to steer the robot with a laser designator. The military application of this is pretty obvious: you could quickly point a laser wherever the people shooting at you are hiding, so that the robot knows what area to scope out. A laser could also be used to trace out a patrol route for the robot, so that a user doesn't have to deal with typing in cumbersome GPS coordinates.
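The laser-following part could be sketched as: find the brightest pixel in a camera frame (assumed to be the laser dot) and turn its offset from the image center into yaw/forward commands. The frame format, dot-detection shortcut, and gains below are all made up for illustration:

```python
# Laser-dot steering sketch: brightest pixel stands in for real dot detection
# (a real system would band-pass filter for the laser's wavelength).
def steer_toward_dot(frame, k_yaw=0.002, k_fwd=0.001):
    """frame: 2-D list of brightness values. Returns (yaw_cmd, fwd_cmd)."""
    rows, cols = len(frame), len(frame[0])
    _, r, c = max((frame[r][c], r, c) for r in range(rows) for c in range(cols))
    yaw_cmd = k_yaw * (c - cols // 2)    # dot right of center -> yaw right
    fwd_cmd = k_fwd * (rows // 2 - r)    # dot above center -> move forward
    return yaw_cmd, fwd_cmd
```

Chaining this with waypoint logic would let an operator "drag" the robot along a patrol route with the laser instead of typing in GPS coordinates.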
As for civilian applications, the AutoCopter with a stabilized camera might be useful for filming video. One could imagine a system of two laser pointers, one for each hand: one would designate a spot for the robot to hover over, while the other would indicate where the robot should direct its camera. Of course, one could alternatively just hire a dedicated RC operator, so perhaps this would be of limited usefulness.