Soldiers Bond With Bots, Take Them Fishing
HarryCaul writes "Soldiers are finding themselves becoming more and more attached to their robotic helpers. During one test of a mine clearing robot, 'every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.' The man in charge halted the test, though - 'He just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane.' Sometimes the soldiers even take their metallic companions fishing. Is there more sympathy for Robot Rights than previously suspected?"
Re:"This test, he charged, was inhumane" (Score:3, Interesting)
Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.
(As opposed to hard coding the limits? I dunno. Humans have some limits hard coded by the structure of bones and the placement of muscles, but other limits aren't.)
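A minimal sketch of what such a self-limiting "pain" mechanism might look like. This is entirely hypothetical -- the function names, angles, and thresholds below are illustrative, not from any real robot -- but it shows the idea of a graded signal that throttles actuation near a limit, rather than a single hard-coded cutoff:

```python
# Hypothetical "pain" feedback for a robot joint: the pain signal rises
# smoothly as the joint approaches its mechanical limit, and commanded
# torque is scaled down in proportion, instead of being cut off abruptly.

def pain_level(angle, soft_limit, hard_limit):
    """Return 0.0 inside the comfort zone, rising linearly to 1.0 at the hard limit."""
    if angle <= soft_limit:
        return 0.0
    return min(1.0, (angle - soft_limit) / (hard_limit - soft_limit))

def commanded_torque(requested, angle, soft_limit=60.0, hard_limit=90.0):
    """Scale the requested torque down as the joint's 'pain' increases."""
    return requested * (1.0 - pain_level(angle, soft_limit, hard_limit))
```

With these made-up thresholds, a torque request at 30 degrees passes through unchanged, is halved at 75 degrees, and drops to zero at the 90-degree hard limit -- the "shoulder" hurts more the further you overextend it.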
Happens with all complex machines. (Score:5, Interesting)
I wouldn't count on that. I worked in a big warehouse once, and some of the guys got pretty attached to their pallet jacks; they'd each have their own and god forbid you tried to drive it. Several of them had names.
People are funny that way. It's not a 'robot thing,' it's a 'complicated machine' thing. When a device gets complicated enough that it develops "quirks" (problems that are difficult to diagnose and/or transient), there's a tendency to anthropomorphize it. But that tendency decreases the more you know about how the machine works. E.g., the people who give names to their cars are generally not auto mechanics; likewise, I suspect the designers of the de-mining robot wouldn't have had as much of a problem testing it to pieces (or rather, their objection would have been "I don't want to watch six months of work get blown up," not "that's inhumane to the robot"), because they know what goes into it.
People do the same thing with computers; I've dealt with lots of people who will say their computer is "tired" when it's really RAM starved -- after running for a while, it exhausts its memory and starts thrashing the disk, which slows everything down. To someone who doesn't understand that, the computer just appears to get 'fatigued' after a certain amount of use. Since they don't know any better, they try to explain the mysterious behavior using the closest analog they do understand, which is themselves and other people.
The nature of bonding (Score:5, Interesting)
I've bonded very thoroughly with my laptop - its name is Turing. I jealously clutch it when I travel. Whenever I put it down, I'm very careful to ensure that there's no stress on any cables, plugs, etc. It contains years of professional information and wisdom - emails, passwords, reams and reams of source code, MP3s, pictures, etc.
Yes, I have backups that are performed nightly. Yes, I've had problems with the laptop, and every few years I replace it with a new one. That doesn't change the bonding - every time there's a problem, it's upsetting to me.
Am I crazy? Perhaps. But there's good reason for the laptop to be so important to me - it is the single most important tool I use to support my wife and six children, who are the most important people in the world to me. My workload is intense, my software is ambitious, my family is large and close, and this laptop is my means of accomplishing my goals.
If I can get this attached to something over my professional career, it wouldn't be out of the norm for soldiers to have strong emotional reactions toward something that preserves their very existence day after day.
Not equivalent (Score:5, Interesting)
So soldiers blowing up robots with landmines is inhumane, but soldiers killing people on their own land with no cause isn't?
Nobody said that killing people is somehow more humane than blowing up robots. Also, training soldiers to kill other humans is actually more difficult than you might think. Study after study has shown this, from WW II to Korea and Vietnam. Killing is not a natural impulse, which is why soldiers who have been involved in killing often come out of it with deep psychological scars. Most of what soldiers do is motivated by a desire to defend themselves and their cohorts, so it makes sense that a robot that saves soldiers from getting blown up by landmines would become dear to them.
Reminds me (Score:2, Interesting)
IANAB, this is just a theory.
In evolution, the only advantage of being 'nice' to another creature is when it can reciprocate and/or when it is in your immediate family. We have these instructions hard-coded in our brains. Unfortunately, evolution has no foresight into how these instructions may affect other human behavior/qualities. As long as the faulty behavior has no evolutionary disadvantage, it will remain in the genes as a by-product of the original instruction.
In the case of becoming attached to robots, as they were not present in our ancestral environment, our brains output a 'must reciprocate' command through the use of emotional attachment, which may be hard to override with logic. There is nothing in the brain that states 'reciprocation will not be required in producing future gain from this particular creature/object'. It assumes that most of the recipients have a similar brain structure.
The Soldier and the Warped Sense of Humour (Score:4, Interesting)
Then they have to do dangerous and uncomfortable things that have nontrivial odds of killing them in horrendous and painful ways.
Plus they may be called upon to kill other human beings (in horrendous and painful ways) which carries its own psychic cost.
And on top of all this, they are usually in a state of mind-numbing boredom, occasionally punctuated by periods of extreme terror.
One of the defense mechanisms one develops (to help one stay sane) is a somewhat twisted and black sense of humour. Not cruel or mean, just... warped.
It isn't something you take at face value; there are layers and layers of irony involved, and you pretty much have to be a soldier to get it.
DG
Really... (Score:2, Interesting)
Robots are tools. If you refuse to use a tool for its primary purpose, you are in fact disgracing the job and existence of the tool in question. If you want to fish with your bot to better understand how to control it, that is fine. But you must in the end allow the tool to perform its task, or else you are basically saying "it's not worthy". If you want robot rights, include pride in such rights.
It's very much like, in ancient times especially, not letting a soldier go out to war because you don't want that person to "die a meaningless death". Sure, you still have the soldier... but if you're not going to let them fight, what was the point of training them in the first place? Keep in mind this is in the context of a feudal society in which soldiers were considered just tools.
If the mining robot was withdrawn because, let's say, they had another mine-detecting bot and repairing the missing legs on the one-legged bot was cheaper than letting the thing be completely destroyed, I wouldn't concern myself. But when human lives are put at risk because machines begin to look pathetic, there is something wrong. I'd say "wrong with today", but through the ages human lives have been subject to the continued functionality of machines. Look at the importance placed on the katana in the Edo (Tokugawa) period of Japan, or the captain dying with his ship.
Although the above examples are more along the lines of pride in owning a machine that performed its task exceedingly well, mostly and in many ways because of that machine's extraordinary use by a human in the field.
I wonder how much I could get on eBay for a wing off the 767 that hit one of the towers... I'd probably be blacklisted all over the media for being insensitive, and people would ask me to put it in a museum... but this is just a hypothetical.
Final Word from all that mess:
A right is a set of basic requirements, minimum standards as it were, given to something. Large animals should not be stuffed in small boxes and shipped around the world without airholes or food. Babies should not be thrown in dumpsters in plastic bags. Etc...etc... But when we extend these rights to things created by man, we are restricting the creations of the future.
And either way: Robots are machines. They have more moving parts and appear to be "an agent of their own", but in the end are machines.
In the end I have a feeling the majority of people who want robot rights are not the same people who build, maintain or design these bots.
On the plus side, the real reason for giving rights to machines is not to protect the machines, but to protect the psyches of humans operating the machines.
People bond with objects and animals (Score:3, Interesting)
In at least one other book [wordpress.com], the protagonist loves, after a fashion, a simulacrum of something he knows cannot be who he loved. As the protagonist says, "We all know that we are material creatures, subject to the laws of physiology and physics, and not even the power of all our feelings combined can defeat those laws." We know robots are nothing but material, but that doesn't stop us from dreaming that they are more, and we have been dreaming of objects that come alive for at least as long as we have been writing things down. The truly strange part is that we are closer than ever to having what we think of as "things" that really do come alive.
The R2-D2 cover-up (Score:3, Interesting)
What the techs didn't tell Luke was that this repair required replacing much of R2's outer casing, as well as fused logic and memory units, with parts from a similar droid. They basically murdered someone else's droid so they could resurrect Luke's.
And then, there was the subtler matter of whether this "new" R2-D2 was even the same droid. It's kind of a philosophical question. They retrieved as much of R2-D2's data store as they could from the original modules, of course, and according to the specs the replacement parts they used should be equivalent to the parts that were damaged. And, of course, they did simple things like make sure that after the repair R2 still recognized the same designation, as well as his established relationships with others - property of Luke, partner of C-3PO, etc. But it'd really be more accurate to call the repaired R2-D2 a new droid, created from parts of a wrecked droid and a scrapped droid.
As for Luke - R2-D2 could be considered stolen property (the fact that this property stole itself, or that they assumed the Jawas weren't selling them stolen goods - and perhaps even the fact that his owner was killed by Vader, and that any heirs may likely have been killed when Alderaan was destroyed - changes little) - but assuming no relations of Captain Antilles were interested in making such a claim, R2-D2 was Luke's property by virtue of the transaction with the Jawas. So it's not as though R2-D2 was the property of the rebel alliance to begin with. The X-Wing fighter Luke left at Cloud City may be another matter, however...
Re:They already do, sort of. (Score:4, Interesting)
I'm sure it doesn't take much imagination to think: "Jesus Christ, TOFFEE? That's going to be far worse than water, because it'll stick and basically rip all your skin clean off!"
But it's well possible and doesn't hurt at all. You just put your hands in a bowl of ice water for a good 5 minutes or so beforehand, till they go totally numb. Bash 'em into the vat, in, out, quick as that, and you don't feel a thing.
Again, kids, don't try this at home.
Re:"This test, he charged, was inhumane" (Score:5, Interesting)
Strange that you should pick the idea of the car. Some people get very attached to their cars (and other belongings) and DO empathize with them. Imagine a car you had first learned to drive as a teenager, lost your virginity in, drove your wife to the hospital in while she was having labor pains, and took your grandfather on a cross country ride right before he passed away later that year. Now imagine that the car has had it and will never again be feasible to drive. Do you take it to the scrapyard to be torn apart for parts and then crushed? Do you donate it to the junkyard derby to be smashed up and discarded?
Hell, at this point I'm not just empathizing with a car, I'm empathizing with a fictional car that I just made up.
Re:"This test, he charged, was inhumane" (Score:3, Interesting)
The odd thing is that we do get attached to cars, yet we would not call someone inhumane for destroying one (even one to which we have attachments), while someone would call inhumane a person who allowed a robot (of the sort we actually have, not the sci-fi sort) to come to harm.
Plus, slashdotters like car analogies ;-)
Re:"This test, he charged, was inhumane" (Score:3, Interesting)
The reason why it's more likely to happen with a robot (even of the kind we have) than a car is because of a more anthropomorphic shape. Just look at how it's described in terms of "limbs" and "legs." A great example of this is this video [youtube.com]. The legs look lifelike enough that I've seen several people wince and pity the robot when it gets kicked. Our brains seem pretty hardwired for this sort of thing. If it looks like a duck and quacks like a duck, we'll probably empathize with it.
Once again, we come back around to our own humanity. As people have said, the terms "humane" and "inhumane" are all about the person being human, not the object on which they are acting. I see this becoming more and more of an issue the better robot construction and, more importantly, virtual reality get. People make a stink about FPSes and GTA being murder simulators, but can you imagine a simulation that was indistinguishable from reality, where the "people" in it were programmed to look and act just like humans? If people decide to treat them differently from humans, what will that do to those people? Will it mess them up and make them treat "real" humans differently? In that sense, will their "inhumane" actions take away some of their humanity?
Re:"This test, he charged, was inhumane" (Score:3, Interesting)
Hired killers feeling empathy with a machine... Don't ask me what this means - All I know is it's fucked up.
Re:"This test, he charged, was inhumane" (Score:3, Interesting)
"Cats," says Pamela. "He was hoping to trade their uploads to the Pentagon as a new smart bomb guidance system in lieu of income tax payments. Something about remapping enemy targets to look like mice or birds or something before feeding it to their sensorium. The old kitten and laser pointer trick."
Manfred stares at her, hard. "That's not very nice. Uploaded cats are a bad idea."
"Thirty-million-dollar tax bills aren't nice either, Manfred. That's lifetime nursing-home care for a hundred blameless pensioners."
Franklin leans back, sourly amused, keeping out of the crossfire.
"The lobsters are sentient," Manfred persists. "What about those poor kittens? Don't they deserve minimal rights? How about you? How would you like to wake up a thousand times inside a smart bomb, fooled into thinking that some Cheyenne Mountain battle computer's target of the hour is your heart's desire? How would you like to wake up a thousand times, only to die again? Worse: The kittens are probably not going to be allowed to run. They're too fucking dangerous - they grow up into cats, solitary and highly efficient killing machines. With intelligence and no socialization they'll be too dangerous to have around. They're prisoners, Pam, raised to sentience only to discover they're under a permanent death sentence. How fair is that?"
"But they're only uploads." Pamela stares at him. "Software, right? You could reinstantiate them on another hardware platform, like, say, your Aineko. So the argument about killing them doesn't really apply, does it?"
"So? We're going to be uploading humans in a couple of years. I think we need to take a rain check on the utilitarian philosophy, before it bites us on the cerebral cortex. Lobsters, kittens, humans -- it's a slippery slope."