
Soldiers Bond With Bots, Take Them Fishing

HarryCaul writes "Soldiers are finding themselves becoming more and more attached to their robotic helpers. During one test of a mine clearing robot, 'every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.' The man in charge halted the test, though - 'He just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane.' Sometimes the soldiers even take their metallic companions fishing. Is there more sympathy for Robot Rights than previously suspected?"
This discussion has been archived. No new comments can be posted.

  • by Applekid ( 993327 ) on Tuesday May 08, 2007 @01:31PM (#19039243)
    Obligatory "Why was I built to feel pain?"

    Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.

    (As opposed to hard coding the limits? I dunno. Humans have some hard coded limits by the structure of bones and placement of muscles, but others don't.)
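    The self-limiting "pain" idea above can be sketched in a few lines. Everything here — the strain units, the threshold, the function names — is invented for illustration, not taken from any real robot control API:

    ```python
    # Hypothetical sketch: "pain" as a soft, self-limiting feedback signal,
    # as opposed to a hard-coded stop. All names and numbers are made up.

    def pain_signal(strain: float, threshold: float) -> float:
        """Return a 0..1 'pain' level that ramps up as strain passes the limit."""
        if strain <= threshold:
            return 0.0
        # Pain grows linearly past the threshold, capped at 1.0.
        return min(1.0, (strain - threshold) / threshold)

    def limited_torque(commanded: float, strain: float, threshold: float) -> float:
        """Scale back the commanded torque in proportion to the pain level."""
        return commanded * (1.0 - pain_signal(strain, threshold))

    # A joint under increasing strain backs itself off gradually,
    # instead of slamming into a structural hard limit.
    for strain in (0.5, 1.0, 1.5, 2.0):
        print(round(limited_torque(10.0, strain, threshold=1.0), 2))
    ```

    The point of the soft ramp is the same as biological pain: the response is graded, so the controller can trade performance against self-damage instead of only having "fine" and "broken" states.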
  • I'm pretty sure that they don't have feelings for a floor jack, or won't until it can move on its own. Now is the time for people to think about and begin establishing 'rights' for machines... WTF?

    I wouldn't count on that. I worked in a big warehouse once, and some of the guys got pretty attached to their pallet jacks; they'd each have their own and god forbid you tried to drive it. Several of them had names.

    People are funny that way. It's not a 'robot thing,' it's a 'complicated machine' thing. When a device gets complicated enough that it develops "quirks" (problems that are difficult to diagnose and/or transient), there's a tendency to anthropomorphize it. But that tendency decreases the more you know about how the thing works. E.g., the people who give names to their cars are generally not auto mechanics; likewise I suspect the designers of the de-mining robot would probably not have had as much of a problem testing it to pieces (or rather, their objection would probably have been "I don't want to watch six months of work get blown up," not "that's inhumane to the robot"), because they know what goes into it.

    People do the same things to computers; I've dealt with lots of people who will say their computer is "tired," when it's really RAM starved -- after using it for a while, it'll run out of memory and start thrashing the disks, slowing it down. To someone who doesn't understand that, they just understand that after a certain amount of time, the computer appears to get 'fatigued.' Since they don't know any better, they try to understand the mysterious behavior using the closest analog to it that they do understand, which is themselves / other people.
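    The "tired computer" effect described above is easy to model: once the working set outgrows physical RAM, a fraction of memory accesses fall through to disk, which is orders of magnitude slower. A toy back-of-the-envelope sketch (all costs and sizes are illustrative, not measured):

    ```python
    # Toy model of RAM starvation: average access cost explodes once the
    # working set exceeds physical memory and pages start living on disk.
    # The numbers below are round illustrative figures, not benchmarks.

    RAM_MB = 1024
    RAM_ACCESS_US = 0.1       # rough cost of a memory access, microseconds
    DISK_ACCESS_US = 10000.0  # rough cost of a disk seek, microseconds

    def avg_access_us(working_set_mb: float) -> float:
        """Expected cost per access when some pages have been swapped out."""
        if working_set_mb <= RAM_MB:
            return RAM_ACCESS_US
        miss_rate = 1.0 - RAM_MB / working_set_mb
        return (1.0 - miss_rate) * RAM_ACCESS_US + miss_rate * DISK_ACCESS_US

    for ws in (512, 1024, 1536, 2048):
        print(ws, round(avg_access_us(ws), 1))
    ```

    Below the RAM ceiling, nothing changes; just past it, average access cost jumps by several orders of magnitude — which is exactly the sudden "fatigue" a non-technical user perceives after the machine has been up for a while.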
  • by mcrbids ( 148650 ) on Tuesday May 08, 2007 @01:40PM (#19039415) Journal
    It's normal for people to bond with people/things that are necessary to their survival.

    I've bonded very thoroughly with my laptop - its name is Turing. I jealously clutch it when I travel. Whenever I put it down, I'm very careful to ensure that there's no stress on any cables, plugs, etc. It contains years of professional information and wisdom - emails, passwords, reams and reams of source code, MP3s, pictures, etc.

    Yes, I have backups that are performed nightly. Yes, I've had problems with the laptop, and every few years I replace it with a new one. That doesn't change the bonding - every time there's a problem it's upsetting to me.

    Am I crazy? Perhaps. But there's good reason for the laptop to be so important to me - it is the single most important tool I use to support my wife and 6 children, who are the most important things in the world to me. My workload is intense, my software is ambitious, my family is large and close, and this laptop is my means of accomplishing my goals.

    If I can get attached like this to something over my professional career, it wouldn't be out of the norm to have strong emotional reactions toward something preserving your very existence day after day.
  • Not equivalent (Score:5, Interesting)

    by Infonaut ( 96956 ) <infonaut@gmail.com> on Tuesday May 08, 2007 @01:51PM (#19039573) Homepage Journal

    soldiers blowing up robots with landmines is inhumane, but soldiers killing people on their own land with no cause isn't?

    Nobody said that killing people is somehow more humane than blowing up robots. Also, training soldiers to kill other humans is actually more difficult than you might think. Study after study has shown this, from WW II to Korea and Vietnam. Killing is not a natural impulse, which is why soldiers who have been involved in killing often come out of it with deep psychological scars. Most of what soldiers do is motivated from a desire to defend themselves and their cohorts, so it makes sense that the robot that saves soldiers from getting blown up by landmines would become dear to them.

  • Reminds me (Score:2, Interesting)

    by resonte ( 900899 ) on Tuesday May 08, 2007 @01:52PM (#19039583)
    When I was a small child, I used to think that the plants in my backyard had feelings. And when my mother started ripping out the weeds in the garden, I replanted them because I thought they were being murdered. I think it was from watching a cartoon that had a talking tree in it.

    IANAB, this is just a theory.

    In evolution, the only advantages of being 'nice' to another creature are when they are receptive and/or when they are in your immediate family. We have these instructions hard-coded in our brains. Unfortunately, with evolution there is no foresight into how these instructions may affect other human behavior/qualities. As long as the faulty behavior has no evolutionary disadvantages, it will remain in the genes as a by-product of the original instruction.

    In the case of becoming attached to robots, as they were not present in our ancestral environment, our brains output a 'must reciprocate' command through the use of emotional attachment, which may be hard to override with logic. There is nothing in the brain that states 'reciprocation will not be required in producing future gain from this particular creature/object'. It assumes that most of the recipients have a similar brain structure.

  • by powerpants ( 1030280 ) * on Tuesday May 08, 2007 @01:54PM (#19039595)
    By "we" I meant humans. I was referring to a human tendency. You however, have dived headlong into us/them-ism... and been modded insightful for it.
  • by scoser ( 780371 ) on Tuesday May 08, 2007 @02:03PM (#19039735) Journal
    Maybe if we treat robots well now, Skynet will decide not to nuke us when it gains sentience.
  • by DG ( 989 ) on Tuesday May 08, 2007 @02:11PM (#19039885) Homepage Journal
    Soldiers are routinely taken away from their homes and loved ones and dumped in the places that are the assholes of the world.

    Then they have to do dangerous and uncomfortable things that have nontrivial odds at killing them in horrendous and painful ways.

    Plus they may be called upon to kill other human beings (in horrendous and painful ways) which carries its own psychic cost.

    And on top of all this, they are usually in a state of mind-numbing boredom, occasionally punctuated by periods of extreme terror.

    One of the defense mechanisms one develops (to help one stay sane) is a somewhat twisted and black sense of humour. Not cruel or mean, just... warped.

    It isn't something you take at face value; there are layers and layers of irony involved, and you pretty much have to be a soldier to get it.

    DG
  • Really... (Score:2, Interesting)

    by kitsunewarlock ( 971818 ) on Tuesday May 08, 2007 @02:11PM (#19039897) Journal
    The best solution (for what I consider a problem) would be to make machines less animal-like ("legs", etc...), but the truth is trying to find a solution better than nature tends to be...difficult...I mean, besides wheels, what do we have that nature hasn't already done?

    Robots are tools. If you refuse to use a tool for its primary purpose, you are in fact disgracing the job and existence of the tool in question. If you want to fish with your bot to better understand how to control it, that is fine. But you must in the end allow the tool to perform its task, or else you are basically saying "it's not worthy". If you want robot rights, include pride in such rights.

    It's very much like, in ancient times especially, not letting a soldier go out to war because you don't want that person to "die a meaningless death". Sure you still have the soldier...but if you're not going to let them fight, what's the point of having trained them in the first place? Keep in mind this is in the context of a feudal society in which soldiers were just considered tools.

    If the mining robot was withdrawn because, let's say, they had another mine-detecting bot and repairing the missing legs on the 1-legged bot was cheaper than letting the thing be completely destroyed, I wouldn't concern myself. But when human lives are put at risk because machines begin to look pathetic, there is something wrong. I'd say "wrong with today", but through the ages human lives have been subject to the continued functionality of machines. Look at the importance placed on the katana in the Edo/Tokugawa period of Japan. The captain dying with his ship.

    Although the above examples are more along the lines of "pride of owning a machine that performed its task exceedingly well, mostly and in many ways because of that machine's extraordinary use BY a human in the field."

    I wonder how much I could eBay a wing off the 767 that hit one of the towers for...I'd probably be blacklisted all over the media for being insensitive, and people would ask me to put it in a museum...but this is just a hypothetical.

    Final Word from all that mess:

    A right is a set of basic requirements, minimum standards as it were, given to something. Large animals should not be stuffed in small boxes and shipped around the world without airholes or food. Babies should not be thrown in dumpsters in plastic bags. Etc...etc... But when we extend these rights to things created by man, we are restricting the creations of the future.

    And either way: Robots are machines. They have more moving parts and appear to be "an agent of their own", but in the end are machines.

    In the end I have a feeling the majority of people who want robot rights are not the same people who build, maintain or design these bots.

    On the plus side, the real reason for giving rights to machines is not to protect the machines, but to protect the psyches of humans operating the machines.
  • by ThousandStars ( 556222 ) on Tuesday May 08, 2007 @02:18PM (#19040015) Homepage
    In Heinlein's Starship Troopers, there's a bit about how the K-9 units just kill the dog part of the team if the human dies, but they can't do the same when the dog half dies, and someone (the narrator?) speculates that it would be more humane if the same happened.

    In at least one other book [wordpress.com], the protagonist loves, after a fashion, a simulacrum of something he knows cannot be who he loved. As the protagonist says, "We all know that we are material creatures, subject to the laws of physiology and physics, and not even the power of all our feelings combined can defeat those laws." We know robots are the opposite of material creatures, but that doesn't stop us from dreaming that they are not, and we have been dreaming of objects that come alive for at least as long as we have been writing things down. The truly strange part is that we are closer to having what we think of as "things" that do come alive.

  • by Have Blue ( 616 ) on Tuesday May 08, 2007 @03:12PM (#19040967) Homepage
    Actually, it *is* possible to dip your hand into molten lead and quickly pull it out with no ill effects, thanks to the Leidenfrost effect [wikipedia.org]. Kids, don't try this at home.
  • The R2-D2 cover-up (Score:3, Interesting)

    by MS-06FZ ( 832329 ) on Tuesday May 08, 2007 @03:28PM (#19041293) Homepage Journal

    Reminds me of the time when Luke Skywalker destroyed the Death Star: when he was asked if he wanted a new droid to replace the busted R2-D2, he outright refused!
    (Actually, he was offered the replacement droid before he sortied... When R2 was still functional but "banged up".)

    What the techs didn't tell Luke was that this repair required replacing much of R2's outer casing, as well as fused logic and memory units, with parts from a similar droid. They basically murdered someone else's droid so they could resurrect Luke's.

    And then, there was the subtler matter of whether this "new" R2-D2 was even the same droid. It's kind of a philosophical question. They retrieved as much of R2-D2's data store as they could from the original modules, of course, and according to the specs the replacement parts they used should be equivalent to the parts that were damaged. And, of course, they did simple things like make sure that after the repair R2 still recognized the same designation, as well as his established relationships with others - property of Luke, partner of C-3PO, etc. But it'd really be more accurate to call the repaired R2-D2 a new droid, created from parts of a wrecked droid and a scrapped droid.

    As for Luke - R2-D2 could be considered stolen property (the fact that this property stole itself, or that they assumed the Jawas weren't selling them stolen goods - and perhaps even the fact that his owner was killed by Vader, and that any heirs may likely have been killed when Alderaan was destroyed - changes little) - but assuming no relations of Captain Antilles were interested in making such a claim, R2-D2 was Luke's property by virtue of the transaction with the Jawas. So it's not as though R2-D2 was the property of the rebel alliance to begin with. The X-Wing fighter Luke left at Cloud City may be another matter, however...
  • Re:Not equivalent (Score:2, Interesting)

    by Anonymous Coward on Tuesday May 08, 2007 @03:28PM (#19041299)
    Indeed -- I was reading an account by a WW2 Japanese veteran of how, when they arrived in China, each new recruit had to bayonet a Chinese PoW -- the point being to "break the spell" around killing someone. And the vet said it worked -- "you realised that it was such a simple thing to kill someone" and after that he had no problem doing it again...
  • by soliptic ( 665417 ) on Tuesday May 08, 2007 @05:05PM (#19043119) Journal
    Totally off-topic, but one of the most curious things I've done is stick my hands into a giant vat of boiling toffee. I can't even remember the occasion, some school/college thing I think, but a whole bunch of us were being taught how to make toffee, and the stage of getting from the giant vat of bubbling liquid into smaller units, was done by simply reaching in and grabbing a fist-size chunk at a time.

    I'm sure it doesn't take much imagination to think: "Jesus Christ, TOFFEE? That's going to be far worse than water, because it'll stick and basically rip all your skin clean off!"

    But it's well possible and doesn't hurt at all. You just put your hands in a bowl of ice water for a good 5 minutes or so beforehand, 'til they go totally numb. Bash 'em into the vat, in, out, quick as that, you don't feel a thing.

    Again, kids, don't try this at home ;)
  • by illegalcortex ( 1007791 ) on Tuesday May 08, 2007 @05:29PM (#19043513)
    Who says human empathy has to make any sense? It's not like it's a rigidly programmed set of rules. We empathize with actors in a film, even when it's pure fiction.

    Strange that you should pick the idea of the car. Some people get very attached to their cars (and other belongings) and DO empathize with them. Imagine a car you had first learned to drive as a teenager, lost your virginity in, drove your wife to the hospital in while she was having labor pains, and took your grandfather on a cross country ride right before he passed away later that year. Now imagine that the car has had it and will never again be feasible to drive. Do you take it to the scrapyard to be torn apart for parts and then crushed? Do you donate it to the junkyard derby to be smashed up and discarded?

    Hell, at this point I'm not just empathizing with a car, I'm empathizing with a fictional car that I just made up.
  • by treeves ( 963993 ) on Tuesday May 08, 2007 @05:38PM (#19043695) Homepage Journal
    No, I completely agree with you. It doesn't make a lot of sense. My comment was more about the strange way that the word inhumane seems to be defined (or not so defined, really).

    The fact that we do get attached to cars but would still not call someone inhumane who destroyed one (even one to which we have attachments) is odd, given that someone would call inhumane one who allowed a robot (of the sort we actually have, not the sci-fi sort) to come to harm.

    Plus, slashdotters like car analogies ;-)

  • by karlandtanya ( 601084 ) on Tuesday May 08, 2007 @06:16PM (#19044391)
    Why do you think ships are referred to as "she"?
  • by illegalcortex ( 1007791 ) on Tuesday May 08, 2007 @06:23PM (#19044533)
    The thing is, I think most people wouldn't call the robot thing inhumane, either. And there are probably some people who would call the car thing inhumane, namely the car's owner. But not always.

    The reason why it's more likely to happen with a robot (even of the kind we have) than a car is because of a more anthropomorphic shape. Just look at how it's described in terms of "limbs" and "legs." A great example of this is this video [youtube.com]. The legs look lifelike enough that I've seen several people wince and pity the robot when it gets kicked. Our brains seem pretty hardwired for this sort of thing. If it looks like a duck and quacks like a duck, we'll probably empathize with it.

    Once again, we come back around to our own humanity. As people have said, the terms "humane" and "inhumane" are all about the person being human, not the object on which they are taking some action. I see this as becoming more and more of an issue the better robot construction and, more importantly, virtual reality gets. People make a stink about FPSes and GTA being murder simulators, but can you imagine if it was a simulation that was indistinguishable from reality, and the "people" in it were programmed to look and act just like humans? If people decide to treat them differently from humans, what will that do to those people? Will it mess them up and make them treat "real" humans differently? In that sense, will their "inhumane" actions take away some of their humanity?
  • by fuzzix ( 700457 ) <flippy@example.com> on Tuesday May 08, 2007 @06:58PM (#19045089) Journal

    You know... Before we get this 'robot rights' thing down, we should get the whole 'human rights' thing right first.

    Hired killers feeling empathy with a machine... Don't ask me what this means - All I know is it's fucked up.
  • teach the soldiers (Score:1, Interesting)

    by Anonymous Coward on Tuesday May 08, 2007 @10:24PM (#19047137)
    Why not give the soldiers some insight into how the robots have been constructed - you know, the basics, like logic gates, programming, that sort of thing. Maybe once they see that it's all just ones, zeros and pre-determined responses, they might get the point that a chunk of metal, plastic and silicon ain't even close to human/animal (or even plant).
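    The "pre-determined responses" point can be made concrete with a minimal sketch; the sensor inputs and actions below are made up for the example, and a real robot's table would just be bigger, not different in kind:

    ```python
    # Bare-bones illustration: the entire 'mind' of such a robot is a
    # lookup table. No feeling, no judgment - just stimulus and response.

    RESPONSES = {
        "mine_detected": "stop_and_flag",
        "obstacle": "turn_left",
        "clear_path": "advance",
    }

    def react(sensor_input: str) -> str:
        # Unrecognized input gets a fixed default, like any other case.
        return RESPONSES.get(sensor_input, "halt")

    print(react("mine_detected"))  # stop_and_flag
    print(react("unknown_blip"))   # halt
    ```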
  • Re:Anthro.. (Score:3, Interesting)

    by ErroneousBee ( 611028 ) <neil:neilhancock DOT co DOT uk> on Wednesday May 09, 2007 @05:19AM (#19049367) Homepage
    While we are still channelling the spirit of DNA, why not build robots that want to be blown up, and are perfectly capable of saying so for themselves?
  • by Fyz ( 581804 ) on Wednesday May 09, 2007 @08:45AM (#19050397)
    This dialogue was taken from Charles Stross' "Accelerando", and takes on the point of the ethics of sentient military devices, though in this case, uploaded ones:

    "Cats," says Pamela. "He was hoping to trade their uploads to the Pentagon as a new smart bomb guidance system in lieu of income tax payments. Something about remapping enemy targets to look like mice or birds or something before feeding it to their sensorium. The old kitten and laser pointer trick."

    Manfred stares at her, hard. "That's not very nice. Uploaded cats are a bad idea."

    "Thirty-million-dollar tax bills aren't nice either, Manfred. That's lifetime nursing-home care for a hundred blameless pensioners."

    Franklin leans back, sourly amused, keeping out of the crossfire.

    "The lobsters are sentient," Manfred persists. "What about those poor kittens? Don't they deserve minimal rights? How about you? How would you like to wake up a thousand times inside a smart bomb, fooled into thinking that some Cheyenne Mountain battle computer's target of the hour is your heart's desire? How would you like to wake up a thousand times, only to die again? Worse: The kittens are probably not going to be allowed to run. They're too fucking dangerous - they grow up into cats, solitary and highly efficient killing machines. With intelligence and no socialization they'll be too dangerous to have around. They're prisoners, Pam, raised to sentience only to discover they're under a permanent death sentence. How fair is that?"

    "But they're only uploads." Pamela stares at him. "Software, right? You could reinstantiate them on another hardware platform, like, say, your Aineko. So the argument about killing them doesn't really apply, does it?"

    "So? We're going to be uploading humans in a couple of years. I think we need to take a rain check on the utilitarian philosophy, before it bites us on the cerebral cortex. Lobsters, kittens, humans -- it's a slippery slope."
  • "Hired Killers?" (Score:3, Interesting)

    by jamrock ( 863246 ) on Wednesday May 09, 2007 @09:24AM (#19050703)

    Hired killers feeling empathy with a machine...
    Hired killers? Is that your view of those who serve in the military? Soldiers kill if duty demands it, not because they enjoy it. Same with cops. By your definition, they're "hired killers" too. On behalf of my former Army and Marine Corps comrades, FUCK YOU.
