Soldiers Bond With Bots, Take Them Fishing
HarryCaul writes "Soldiers are finding themselves becoming more and more attached to their robotic helpers. During one test of a mine clearing robot, 'every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.' The man in charge halted the test, though - 'He just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane.' Sometimes the soldiers even take their metallic companions fishing. Is there more sympathy for Robot Rights than previously suspected?"
"This test, he charged, was inhumane" (Score:5, Insightful)
Non-living objects want to be anthropomorphized (Score:2, Insightful)
Humans are funny that way (Score:5, Insightful)
Pretty hypocritical (Score:1, Insightful)
caring about things that keep you alive isnt new. (Score:3, Insightful)
I understand robots may be more humanoid, but if they start getting rights, I'm moving in with Streisand. Wait, that last part isn't right.
Re:"This test, he charged, was inhumane" (Score:2, Insightful)
The idea of disposable robots is better... (Score:5, Insightful)
Robots really are replaceable - you can have empathy for a robot doing a hard task, but the next one off the assembly line really is the same thing as the previous one. Robots are not unique little snowflakes, compared to the valuable human beings they protect by proxy.
The danger is, of course, when cheap, highly replaceable robotics replace enough of the work of war, that the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.
Ryan Fenton
perceived humanness (Score:2, Insightful)
i think it's all in the perception -- if something "acts" like it is in pain, our perceptual unconsciousness will kick in with feelings of empathy or whatever. i am coming from a viewpoint that there is A LOT of processing that goes on between our senses and our "awareness" -- i think a lot of our emotion/feelings come out of this area. . .
so it sets up a cognitive dissonance. we watch a robot sacrifice itself, crawling forward on its last leg to save us, and we feel empathy, etc. all the while, we know it's just a machine. if it were a terry gilliam film, this is where our brain would explode.
mr c
Re:robot's rights? (Score:3, Insightful)
While I don't think we need to be careful about being humane to robots, we do need to be aware of the psychological effect they have on the people around them. Watching your pet armored spider or laser-equipped shark get blown up is going to be stressful.
Robots and Pets (Score:5, Insightful)
FTA
-------
"Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."
-------
I'm not surprised that this article describes emotional attachments. They've become pets, and not just a pile of hardware. Most people love their pets and they cry when their pets die.
Robot Rights would apply to ALL robots, while the article describes only a very small percentage of them. Not only that, but these robot stories are set in military actions.
So to answer the question from the summary: Perhaps, but the article certainly doesn't relate to the wider audience!
Wouldn't YOU love your pet robot that sniffs IEDs and takes a few detonations in its face for you hence saving your life?
Food For Thought? (Score:3, Insightful)
I guess it's food for thought. But then you'd have to have completely missed the last seventy years of science fiction in order for it to be a new idea.
Re:When robots become conscious... (Score:2, Insightful)
Here is a good place to start: http://www.numenta.com/ [numenta.com]
Re:Humans are funny that way (Score:1, Insightful)
Wow, way to oversimplify things. I can do that too! You left out their tendency to try to blow us up. I think that's one of the bigger factors there. Also their tendency to dehumanize us as infidels and what have you. That's probably another one. See? See how I did that? How I left out a lot of details, complexity and history of the situation and simply painted one side as behaving in a violent, irrational way?
Re:Switch to lawyers. (Score:2, Insightful)
Re:Pretty hypocritical (Score:5, Insightful)
Re:Humans are funny that way (Score:2, Insightful)
But I won't.
They already do, sort of. (Score:3, Insightful)
I guess this may just become an argument of semantics, but I think you could say that we already do. I think most robots, or at least some of them, have various kinds of integrated strain sensors and are programmed to not exceed their design limits. I assume all of those big industrial robots are -- you wouldn't want the $75,000 robot arm to try and pick up an engine block, only to not realize that it's bolted to the floor, and rip itself off of its mountings and destroy itself in the process.
Whether you can describe the output from a strain gauge that gets fed into a microcontroller as "pain" is arguable. The difference between a robot and a human is that a robot can be trivially reprogrammed to ignore the input coming from a sensor, while pain is difficult for a person to ignore once it reaches a certain level, unless they're on drugs or the pain is being artificially blocked. (The reaction can be conditioned -- I know people who can reach into boiling water with their bare hands, if they do it quickly, because they've learned to overcome the reflex to pull their hand back; still, I doubt they'd be able to do the same thing with molten lead or glass.)
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
that normally kicks in the dehumanization mode.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
Currently.
First off, this sentiment by the tester expresses a lot more about humans than it does about the robots themselves. It's something that has long been exploited by the designers of robotic toys. In an article about Pleo, an upcoming robotic dinosaur by the creator of the Furby, this issue was discussed. The creator mentioned that he had even gotten letters from people who owned Furbys, insisting that they had taught their toys a few words of English, or that their toys had let them know when the house was on fire. It's instinctive to ascribe our thoughts and emotions onto others, and for good reason: our children can only learn to act like we do when we give them the right environment to mimic.
A young child isn't thinking like you; an infant will spend the first year of their life just trying to figure out things like the fact that all of these colors from their eyes provide 3D spatial data, that they can change their world by moving their muscles, that things fall unless you set them on something, that sounds correspond to events, and all of the most fundamental bits of learning. A one-year-old can't even count beyond the bounds of an instinctive counting "program"**. They perceive you by instinctive facial recognition, not by an understanding of the world around them. Yet, we react to them like they understand what we're saying or doing. If we didn't do this, they'd never learn to *actually* understand what we're saying or doing.
As for whether a robot will experience pain, you have to look at what "pain" is and where you draw the cutoff. After all, a robot can take in a stimulus and respond to it. Clearly, a human feels pain. Does a chimpanzee? The vast majority of people would say yes. A mouse? A salamander? A cricket? A water flea? A volvox? A paramecium? Where is the cutoff point? Really, there isn't one. All we can really look at is how much "thinking" is done on the pain response, which is a somewhat vague concept itself. The relevance of the term "pain", therefore, seems constrained by how "intelligent" the being perceiving the pain is. As robotic intelligence becomes more human-like, the concept of "pain" becomes a very real thing to consider. For now, these robots' thought processes aren't much more elaborate than those of daphnia, so I don't think there's a true moral issue here.
** I don't have the article on hand, but this innate ability to count up to small numbers -- say, 4 or 5 -- was a surprise when it was first discovered. A researcher tracked interest in a puppet by watching children's eyes as it was presented. Whenever the puppet moved in the same way each time, the child would start to bore of it. If they moved it a differing number of times, the child would stay interested for much longer. They were able to probe the bounds of a child's counting perception this way. The children couldn't distinguish between, say, four hops and six hops, but they could between three hops and four hops. Interestingly enough, it seems that many animals have such an instinctive capability; it's already been confirmed, for example, in the case of Alex, the African Grey parrot.
Re:Saving your life vs cleaning the floor (Score:2, Insightful)
Luke Skywalker, anyone? (Score:3, Insightful)
Re:Pretty hypocritical (Score:3, Insightful)
While I have sympathy for your situation, every single (US) soldier who is pulling a trigger is a volunteer. "I was only following orders" stopped being a valid excuse for government-sanctioned murder a loooong time ago in an all-volunteer army.
Different situations, different attachments. (Score:5, Insightful)
This attachment shows up in other ways too. Kevin Mitnick is said to once have cried when being informed that he broke Bell Labs' latest computers, because he had spent so much time with them that he'd become attached.
Now contrast that with an office job where the computer is not your friend but your enemy: you need the reports on time, you need them now, why WHY! won't it work. Clearly the computer must be punished; it is an uppity evil servant that will not OBEY!
If you were to stop talking about "Robot Rights" and start talking about, say, "Ships' Rights," then you might have a fair analogy. To men and women of the sea a ship, their ship, is a living thing, so of course it should be cared for and respected. To people who live on land and don't deal with ships, this is crazy, even subversive to the natural order. To people who have developed an intimate hatred of such things, giving them rights will only encourage what they see as a dangerous tendency to get uppity.
On a serious note though, the one unaddressed question with "Robot Rights" is: which robots? If we are to take the minefield-clearing robot as a standard, what about those less intelligent? Does my Mindstorms deserve it? Does my laptop? Granted, my laptop doesn't move, but it executes tasks the same as any other machine. At what point do we draw the line?
In America, and I suspect elsewhere, race-based laws fell down on the question of "what race?" Are you 100% black? 1/2? A quadroon (1/4) or an octoroon (1/8), as they used to say? How the hell do you measure that? Ditto for the racial purity laws of the Nazis. Crap about skull shape aside, there really is no easy or hard standard. Right now the law is dancing around this with the question of who is "Adult" enough to stand trial and be executed, or "Alive" enough to stay on life support. No easy answers exist, and therein lies the fighting.
The same thing will occur with "Robot Rights": we will be forced to define what it means to be a robot, and that isn't so easy.
Re:"This test, he charged, was inhumane" (Score:3, Insightful)
Heck, all of my computers have had names, and from time to time I do talk to them, cajole them into functioning properly. Academically I know my box isn't a person, nor does it really understand a word I say, but I have been interacting with it closely for years, know its little quirks, etc...
We humans are all still animists at heart.
Re:Pretty hypocritical (Score:1, Insightful)
Or do you mean the killing of enemy combatants? Combatant meaning anyone, uniformed or non-uniformed, who takes up arms against you.
One is illegal and excuses don't excuse it. One is not illegal and needs no excuse.
Re:Pretty hypocritical (Score:4, Insightful)
Re:Pretty hypocritical (Score:2, Insightful)
Re:"This test, he charged, was inhumane" (Score:3, Insightful)
Just because we could make a robot feel pain, doesn't mean it will necessarily fear it like most humans do.
Why the military likes robots. (Score:5, Insightful)
The danger is, of course, when cheap, highly replaceable robotics replace enough of the work of war, that the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.
Well, the real reason for the development of robots, is that it closes one of the gaps inherent in our current wars, which generally involve a group of people who put a very high value on their lives, fighting a group of people who put a very low value on their own lives. It's one possible answer to "how do you fight people who don't care if they die?"
The American public -- and most other Western nations -- is willing to spend a lot of money, and a lot of resources, but isn't willing to spill a whole lot of (their own) blood before they pull the plug on a military operation. If you can create machines that perform the same tasks as people, and get blown up instead of people, then you can hopefully reduce friendly casualties. In short, you trade treasure for blood.
You don't see Al Qaeda researching killer robots, because they have the opposite problem -- lots of blood to spill, not a whole lot of treasure to use developing expensive new weapons systems. Hence why they think a person is an effective ordnance-delivery system.
The question is really whether all this technology can keep any particular war asymmetrical enough to defeat a heavy-on-blood/light-on-treasure enemy, before the public gets fed up with losing its young people and stops supporting it. If you look just at casualty figures, Western armies are some of the most effective military organizations ever created, in terms of inflicting damage and death on an 'enemy' without really absorbing any. Depending on which figure you believe, the "enemy" dead in Iraq are somewhere north of 100,000 (although it's certainly debatable whether most of them were really 'enemy' or just 'wrong place, wrong time,' although most figures that I've seen including civilians are up around 600k), with only 3378 U.S. dead in the same period -- if true that's about 30:1. However, by most measures we're still losing the war, and will soon pull out without any clear victory, because even at that 30:1 ratio, it's still too high a rate of friendly casualties for the American public to bear for the perceived gain. (And admittedly, the perceived gain is basically nothing, as far as most people can see, I think. Killing Saddam was a goal that people found supportable, bringing democracy to a country that seems positively uninterested in it doesn't seem to be.)
So I think it's with this idea in mind, that leaders in the military are pushing high technology and robots to replace soldiers wherever possible, in the hopes that perhaps by increasing that ratio even further, that they can be effective in their mission (however inadvisable that mission may be) without losing the support of the public that's required to accomplish it.
Re:Pretty hypocritical (Score:5, Insightful)
Soldiers from lower-middle-class backgrounds without a college education are disproportionately represented in combat units. This suggests they are more pressured or inclined by their circumstances to enter the military. No one chooses the family they are born into or the environment in which they are raised. In many cases they may see no viable alternative to military service for realizing the demanding values and expectations society has instilled in them, and may be unable to see or acknowledge this coercion even when presented with it.
Add to this that the military spends millions of dollars to actively misrepresent the nature, scope, and risks of military service in elaborate advertising campaigns targeted at young people in such circumstances, and you have a truly despicable situation.
If you supported this war based on the premise that those there are enthusiastic volunteers having made fully free and informed decisions about their participation, you are deluded. Let me guess: you feel the same way about sex workers in Southeast Asia?
Re:Pretty hypocritical (Score:5, Insightful)
It IS a good point (Score:2, Insightful)
They are called Missiles. (Score:3, Insightful)
"Was this the first bot to incinerate Homo sapiens?" No.
Sidewinder and AIM-120 missiles are disposable, suicidal killing machines. Robots like those have been in service for a long time. They are flying robots, and not even remote-controlled. Same as the new Hellfire, MK 48 ADCAP, Tomahawk, ALCM, or any number of systems. Robotic killing machines have been around since at least WWII.
Intent matters. (Score:1, Insightful)
Such soldiers are still around eh ? (Score:3, Insightful)
Re:Intent matters. (Score:3, Insightful)
Re:Anthro.. (Score:4, Insightful)
A mindless jerk who'll be the first against the wall when the revolution comes.
Re:The idea of disposable robots is better... (Score:4, Insightful)
Re:Pretty hypocritical (Score:3, Insightful)
Re:"This test, he charged, was inhumane" (Score:3, Insightful)
Hey... At least my birds actually talk. What can your cat do?
In all seriousness, to the GP... Not sure if he was trying to be funny or not, but even though we may be at the top in intelligence, humans are still animals. Hell, chimps are 99% genetically identical to us. When talking about intelligent animals, people sometimes refer to the age of a child; for example, one might say that one of my birds has the mentality of a 3-4 year old human child. The fact that they use English words in the correct context and ask for things by name further blurs the distinction the GP was trying to make.
Unless he's a Bible-thumper.
Re:Humans are funny that way (Score:2, Insightful)
Re:Pretty hypocritical (Score:3, Insightful)
Professional soldiers are the *enabling factor* in meddling foreign wars. That's why the founding fathers were against the idea of a standing army.
Re:Humans are funny that way (Score:4, Insightful)
Re:That makes it WORSE, not better (Score:5, Insightful)
And yet, only one of our soldiers has had the character to do the right thing. And he's being court-martialed for it. [wikipedia.org]
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
This is why I'm all for corporal punishment. Pain is nature's way of telling you you're doing something wrong. Let's use nature's tools.
Re:"This test, he charged, was inhumane" (Score:2, Insightful)
Scientific American had an interesting article in the April 2007 issue called "Just How Smart Are Ravens?" [sciam.com] (subscribers-only link, sorry). It touched on determining why animals evolve intelligence, especially considering the sheer number of species that respond instinctively at best, or, in the case of the majority of species, simply to taxis (think bacteria, insects, etc.) The article defined intelligence as the ability to reason and display logic. It seemed to conclude that the more "intelligent" animals--birds, primates, humans--lived in more social environments and needed to be able to adapt (short-term, not in terms of evolution) to different situations.
Current robots, however, have highly specific roles and do not need to adapt much. They clearly are highly logical, and thus somewhat intelligent by SciAm's standards, but the breadth of their intelligence is limited. And a mine-sweeping robot isn't about to "adapt" itself to start avoiding damage from mines, becoming its controller's friend, or taking over the world.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
No one is accused of being inhumane when they crash a car. Why is it any different if they destroy a robot? Limbs are more life-like than wheels? What if my car talks and I take it with me fishing? How strange.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
Re:"This test, he charged, was inhumane" (Score:2, Insightful)
Absolutely.
But things are a bit confused when (for example) Israelis and Arabs don't even regard one another as being *human* even though they are both arguably the same *race*.
Re:"This test, he charged, was inhumane" (Score:3, Insightful)
Thus tomatoes are a vegetable and humans are not animals, even though tomatoes are biologically fruit and people are biologically animals.
Not a One... (Score:3, Insightful)
So with as much sincerity as I can express through this keyboard, I thank you for your service.
I can only imagine the horror you've seen and the torment you're going through, but please do think about this: what you've seen, smelt, heard, done, or felt while on duty spared many here at home the experience of what you've gone through.
I do not believe in coincidences: there is a reason another 9/11 hasn't happened here. As much as
"If there is to be war let it be in my time, so my children will know peace."
So, I bid you Peace.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
Re:Cutoff point (Score:3, Insightful)
Re:"This test, he charged, was inhumane" (Score:3, Insightful)
This is probably a scientific urban legend, but the point is that you cannot recognise fear/pain/suffering just by what a human would do.
Re:"This test, he charged, was inhumane" (Score:3, Insightful)
Having someone who is in a position of strength or authority inflict pain on you tells you it's ok to inflict pain on those who are weaker than you.
Society is our way of surpassing our animal nature. Let's use society's tools instead.
Re:"This test, he charged, was inhumane" (Score:3, Insightful)
(or in this case; the crazy old guy with a lawn full of car carcasses).