Soldiers Bond With Bots, Take Them Fishing

HarryCaul writes "Soldiers are finding themselves becoming more and more attached to their robotic helpers. During one test of a mine clearing robot, 'every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.' The man in charge halted the test, though - 'He just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane.' Sometimes the soldiers even take their metallic companions fishing. Is there more sympathy for Robot Rights than previously suspected?"
This discussion has been archived. No new comments can be posted.

  • by TodMinuit ( 1026042 ) <todminuit@@@gmail...com> on Tuesday May 08, 2007 @01:20PM (#19039055)
    Good thing a robot isn't a human.
  • by techmuse ( 160085 ) on Tuesday May 08, 2007 @01:23PM (#19039101)
    Just like chairs, couches, and other inanimate objects, animate but non-thinking and non-feeling machines want to be anthropomorphized.
  • by powerpants ( 1030280 ) * on Tuesday May 08, 2007 @01:23PM (#19039115)
    We can feel empathy for a machine that's doing us a favor -- but in reality has no feelings -- while simultaneously dehumanizing whole groups of people who only differ from ourselves culturally and/or geographically.
  • by otacon ( 445694 ) on Tuesday May 08, 2007 @01:23PM (#19039117)
    Soldiers blowing up robots with landmines is inhumane, but soldiers killing people on their own land with no cause isn't?
  • by deft ( 253558 ) on Tuesday May 08, 2007 @01:25PM (#19039145) Homepage
    Men used to name their ships and grow attached to them as well. They didn't need to give them rights. It is easy for the human mind to notice "personality" in objects though; it's in our nature to see these things.

    I understand robots may be more humanoid, but if they start getting rights, I'm moving in with Streisand. Wait, that last part isn't right.
  • by cdrdude ( 904978 ) on Tuesday May 08, 2007 @01:26PM (#19039157) Journal
    You can't argue that inhumane isn't the correct term because robots aren't human. We use the same term for mistreating animals. The difference lies in that animals, like humans but unlike robots, can feel pain.
  • by RyanFenton ( 230700 ) on Tuesday May 08, 2007 @01:30PM (#19039215)
    Than the idea of disposable soldiers. And that's really the design ideal here - the cheaper and more disposable the robot can be while meeting reliability requirements, the more extremely dangerous jobs can be done by robots.

    Robots really are replaceable - you can have empathy for a robot doing a hard task, but the next one off the assembly line really is the same thing as the previous one. Robots are not unique little snowflakes, compared to the valuable human beings they protect by proxy.

    The danger is, of course, that when cheap, highly replaceable robotics take over enough of the work of war, the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.

    Ryan Fenton
  • by mrcdeckard ( 810717 ) on Tuesday May 08, 2007 @01:31PM (#19039245) Homepage

    i think it's all in the perception -- if something "acts" like it is in pain, our perceptual unconsciousness will kick in with feelings of empathy or whatever. i am coming from a viewpoint that there is A LOT of processing that goes on between our senses and our "awareness" -- i think a lot of our emotion/feelings come out of this area. . .

    so it sets up a cognitive discord. we watch a robot sacrifice itself, crawling forward on its last leg to save us, and we feel empathy, etc. all the while, we know it's just a machine. if it were a terry gilliam film, this is where our brain would explode.

    mr c
  • Re:robot's rights? (Score:3, Insightful)

    by 644bd346996 ( 1012333 ) on Tuesday May 08, 2007 @01:32PM (#19039253)
    I don't think that we can blame the soldiers for feeling sorry for the robots. After all, the robots are coming closer and closer to looking and acting like living creatures. We model the robot leg systems after what we find in nature, because we can't do better than evolution yet. We constantly strive to make the robots more intelligent, so that they will be more useful. It is inevitable that the best robots will be thought of as pets or friends.

    While I don't think we need to be careful about being humane to robots, we do need to be aware of the psychological effect they have on the people around them. Watching your pet armored spider or laser-equipped shark get blown up is going to be stressful.
  • Robots and Pets (Score:5, Insightful)

    by EvilGrin5000 ( 951851 ) on Tuesday May 08, 2007 @01:33PM (#19039265)
    This article isn't talking about those annoying toy robots available at your nearest junk store for the low, low price of $99.99; it describes robots that take on the impossible jobs of sniffing out bombs, tracking enemies, and searching caves! They become part of the team:

    FTA
    -------
    "Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."
    -------

    I'm not surprised that this article describes emotional attachments. They've become pets, and not just a pile of hardware. Most people love their pets and they cry when their pets die.

    The "Robot Rights" question is in regard to ALL robots, while the article is only describing a very small percentage of robots. Not only that, but these robot stories are set in military actions.

    So to answer the question from the summary: Perhaps, but the article certainly doesn't relate to the wider audience!

    Wouldn't YOU love your pet robot that sniffs IEDs and takes a few detonations in its face for you, hence saving your life?
  • Food For Thought? (Score:3, Insightful)

    by Petersko ( 564140 ) on Tuesday May 08, 2007 @01:36PM (#19039335)
    "At the time we are able to produce systems (robots and/or software) that can become self-aware, we will very likely need to consider "rights" of such. Think about it (no pun intended). At the time a machine realizes it's not aware, it becomes aware. Soon, such a machine will begin to re-design itself, and easily surpass human intelligence.What then? ;-) Food for thought"

    I guess it's food for thought. But then you'd have to have completely missed the last seventy years of science fiction in order for it to be a new idea.
  • by Nick Fury ( 624480 ) <massengillm@ncssm.edu> on Tuesday May 08, 2007 @01:37PM (#19039359)
    Dude, you have got to put down the Matrix and Terminator. Take some time off and go read about the current state of AI design. The real world is very much removed from the fantasy you have concocted within your brain, Mr. Anonymous Coward.

    Here is a good place to start: http://www.numenta.com/ [numenta.com]
  • by Danse ( 1026 ) on Tuesday May 08, 2007 @01:39PM (#19039393)

    while simultaneously dehumanizing whole groups of people who only differ from ourselves culturally and/or geographically.

    Wow, way to oversimplify things. I can do that too! You left out their tendency to try to blow us up. I think that's one of the bigger factors there. Also their tendency to dehumanize us as infidels and what have you. That's probably another one. See? See how I did that? How I left out a lot of details, complexity and history of the situation and simply painted one side as behaving in a violent, irrational way?
  • by john83 ( 923470 ) on Tuesday May 08, 2007 @01:41PM (#19039417)
    If they had mine-clearing politicians, we'd probably have a lot fewer mines.
  • by chuckymonkey ( 1059244 ) <charles DOT d DO ... AT gmail DOT com> on Tuesday May 08, 2007 @01:44PM (#19039479) Journal
    Ok, having been to a war zone I can tell you first hand that you're completely wrong. What the hell do you think PTSD is? You cannot imagine the total mindfuck it is to kill a living breathing person, even if that person was trying to kill you. I'll have nightmares the rest of my life because of it, and that's only the direct instances. Never mind that for what I did, I had a very high kill count, even though it was more distant and I wasn't necessarily pulling the trigger. Yeah, we may joke about it with each other, but all that is is a defense mechanism. If we don't "dehumanize" it we go fucking crazy. I have several friends who are so messed up from thinking about all the horror they've had to do that they'll never really be a good part of society. So yeah, it's inhumane. I did it because I had a choice -- kill him or he'll kill me -- not a really hard choice for me to make, but I have to live with it for the rest of my life. Once the trigger is pulled there's no taking it back, ever. I do agree that it isn't necessarily right and something should be done. That's why I vote and take an active part in trying to get people out of there, because I know first hand the horrors of a war zone, horrors that I hope people like you never have to face. Don't blame the soldiers who do the killing; blame the people in their pinstriped suits who don't have to do the trigger pulling.
  • by Bearpaw ( 13080 ) on Tuesday May 08, 2007 @01:45PM (#19039493)
    Wow, way to assume what specific group of people was meant by "groups of people". I can do that too!

    But I won't.

  • by Kadin2048 ( 468275 ) * <slashdot.kadin@xox y . net> on Tuesday May 08, 2007 @01:46PM (#19039515) Homepage Journal
    Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.

    I guess this may just become an argument of semantics, but I think you could say that we already do. I think most robots, or at least some of them, have various kinds of integrated strain sensors and are programmed to not exceed their design limits. I assume all of those big industrial robots are -- you wouldn't want the $75,000 robot arm to try and pick up an engine block, only to not realize that it's bolted to the floor, and rip itself off of its mountings and destroy itself in the process.

    Whether you can describe the output from a strain gauge that gets fed into a microcontroller as "pain" is arguable. The difference between a robot and a human is that a robot can be trivially reprogrammed to ignore the input coming from a sensor, while pain is difficult for a person to ignore once it reaches a certain level, unless they're on drugs or the pain is being artificially blocked. (Although this can be conditioned -- I know people who can reach into boiling water with their bare hands, if they do it quickly, because they've learned to overcome the reaction to pull their hand back; still, I doubt they'd be able to do the same thing with molten lead or glass.)
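
    To make that concrete, here's a toy version of the kind of self-limiting loop being discussed, in Python. Everything in it is a made-up stand-in -- the strain gauge is simulated, and read_strain/lift aren't any real robot's API -- the point is just that at this level "pain" amounts to a couple of threshold checks:

        import random

        SOFT_LIMIT = 0.7   # fraction of rated load: start easing off ("discomfort")
        HARD_LIMIT = 0.95  # fraction of rated load: abort the motion ("pain")

        def read_strain(force):
            # Simulated strain gauge: measured load tracks commanded force, plus noise.
            return force + random.uniform(-0.05, 0.05)

        def lift(target, max_steps=50):
            force = 0.0
            for _ in range(max_steps):
                load = read_strain(force)
                if load >= HARD_LIMIT:   # the "reflex": stop before self-destruction
                    print("load %.2f hit hard limit; aborting" % load)
                    return False
                if load >= SOFT_LIMIT:   # "discomfort": ease off rather than push through
                    force = max(force - 0.05, 0.0)
                elif force < target:
                    force = min(force + 0.1, target)
                else:
                    print("lift completed at force %.2f" % force)
                    return True
            print("gave up: target not reachable within limits")
            return False

        lift(0.5)  # well within limits: succeeds
        lift(0.9)  # near rated capacity: the soft limit keeps backing it off

    Whether running that loop constitutes "feeling" anything is the semantics argument again -- but note that the whole response can be edited out in seconds, which a nervous system doesn't allow.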
  • by MrMr ( 219533 ) on Tuesday May 08, 2007 @01:47PM (#19039527)
    Why not declare the robots enemy combatants?
    That normally kicks in the dehumanization mode.
  • by Rei ( 128717 ) on Tuesday May 08, 2007 @01:52PM (#19039577) Homepage
    animals, like humans but unlike robots, can feel pain

    Currently. ;)

    First off, this sentiment by the tester expresses a lot more about humans than it does about the robots themselves. It's something that has long been exploited by the designers of robotic toys. In an article about Pleo, an upcoming robotic dinosaur by the creator of the Furby, this issue was discussed. The creator mentioned that he had even gotten letters from people who owned Furbys, insisting that they had taught their toys a few words of English, or that their toys had let them know when the house was on fire. It's instinctive to ascribe our thoughts and emotions onto others, and for good reason: our children can only learn to act like we do when we give them the right environment to mimic.

    A young child isn't thinking like you; an infant will spend the first year of their life just trying to figure out things like the fact that all of these colors from their eyes provide 3d spatial data, that they can change their world by moving their muscles, that things fall unless you set them on something, that sounds correspond to events, and all of the most fundamental bits of learning. A one year old can't even count beyond the bounds of an instinctive counting "program"**. They perceive you by instinctive facial recognition, not by an understanding of the world around them. Yet, we react to them like they understand what we're saying or doing. If we didn't do this, they'd never learn to *actually* understand what we're saying or doing.

    As for whether a robot will experience pain, you have to look at what "pain" is and where you draw the cutoff. After all, a robot can take in a stimulus and respond to it. Clearly, a human feels pain. Does a chimpanzee? The vast majority of people would say yes. A mouse? A salamander? A cricket? A water flea? A volvox? A paramecium? Where is the cutoff point? Really, there isn't one. All we can really look at is how much "thinking" is done on the pain response, which is a somewhat vague concept itself. The relevance of the term "pain", therefore, seems constrained by how "intelligent" the being perceiving the pain is. As robotic intelligence becomes more human-like, the concept of "pain" becomes a very real thing to consider. For now, these robots' thought processes aren't much more elaborate than those of daphnia, so I don't think there's a true moral issue here.

    ** I don't have the article on hand, but this innate ability to count up to small numbers -- say, 4 or 5 -- was a surprise when it was first discovered. A researcher tracked interest in a puppet by watching children's eyes as it was presented. Whenever the puppet moved in the same way each time, the child would start to grow bored of it. If they moved it a differing number of times, the child would stay interested for much longer. They were able to probe the bounds of a child's counting perception this way. The children couldn't distinguish between, say, four hops and six hops, but they could between three hops and four hops. Interestingly enough, it seems that many animals have such an instinctive capability; it's already been confirmed, for example, in the case of Alex, the African Grey parrot.
  • by sehlat ( 180760 ) on Tuesday May 08, 2007 @01:55PM (#19039607)

    than to one that does something like vacuuming the carpet
    I'm not so sure about that. We have a Roomba at home and named it "Pinball." When it got caught on a couple of obstacles in our home and had to be rescued, I found myself feeling sorry for it. People care about things that become part of their lives, particularly the animate ones, natural or artificial. For people or pets, we call it "empathy". The ability to feel such things is a sign of emotional health.
  • by Tatisimo ( 1061320 ) on Tuesday May 08, 2007 @01:56PM (#19039621)
    Reminds me of when Luke Skywalker destroyed the Death Star: asked if he wanted a new droid to replace the busted R2D2, he outright refused! We all grow to love our favorite stuff: computers, cups, cars, blankets, robots, etc. Are soldiers any less human than us? Heck, let them keep their robot buddies after the war as personal assistants; that might make people less scared of technology! If Luke Skywalker could, why can't they?
  • by CantStopDancing ( 1036410 ) on Tuesday May 08, 2007 @01:58PM (#19039653)

    Don't blame the soldiers that do the killing, blame the people in their pinstriped suits that don't have to do the trigger pulling.


    While I have sympathy for your situation, every single (US) soldier who is pulling a trigger is a volunteer. "I was only following orders" stopped being a valid excuse for government-sanctioned murder a loooong time ago in an all-volunteer army.
  • by Irvu ( 248207 ) on Tuesday May 08, 2007 @02:02PM (#19039723)
    Soldiers in the field are themselves constantly at risk of life and limb. They are also constantly under stress and tension. Such stresses and risks are what form the bond with their comrades as well as their equipment. Everything, everyone, has to work right or likely they all die. This is why sailors refer to their ship as she, and call her by name, why they get almost tearful when thinking of a favored ship and wear caps claiming them as a member of her crew. This is why Air Force officers feel an attachment to their planes and why Army officers care for their sidearms. This anthropomorphization is an essential facet of how they operate, not just a side effect. The application to a mine-clearing robot may be new, but not so unprecedented.

    This attachment shows up in other ways too. Kevin Mitnick is said to have once cried when informed that he had broken Bell Labs' latest computers, because he had spent so much time with them that he'd become attached.

    Now contrast that with an office job where the computer is not your friend but your enemy: you need the reports on time, you need them now, why WHY! won't it work. Clearly the computer must be punished; it is an uppity evil servant that will not OBEY!

    If you were to stop talking about "Robot Rights" and start talking about, say, "Ship's Rights", then you might have a fair analogy. To men and women of the sea a ship, their ship, is a living thing, so of course it should be cared for and respected. To people who live on land and don't deal with ships, this is crazy, even subversive to the natural order. To people who have developed an intimate hatred of such things, giving them rights will only encourage what they see as a dangerous tendency to get uppity.

    On a serious note, though, the one unaddressed question with "Robot Rights" is: which robots? If we are to take the minefield-clearing robot as a standard, what about those less intelligent? Does my Mindstorms deserve it? Does my laptop? Granted, my laptop doesn't move, but it executes tasks the same as any other machine. At what point do we draw the line?

    In America, and I suspect elsewhere, race-based laws fell down on the question of "what race?" Are you 100% black? 1/2? A quadroon (1/4) or an octoroon (1/8), as they used to say? How the hell do you measure that? Ditto for the racial purity laws of the Nazis. Crap about skull shape aside, there really is no easy or hard standard. Right now the law is dancing around this with the question of who is "adult" enough to stand trial and be executed, or "alive" enough to stay on life support. No easy answers exist, and therein lies the fighting.

    The same thing will occur with "Robot Rights": we will be forced to define what it means to be a robot, and that isn't so easy.
  • This might be true for you, but for many people it isn't. Many people I know treat their animals like they were their own children, especially if they are a childless couple. I accord my own cat roughly the same level of regard as I do most people; if you were crapping on the carpet I would swat you too. Seriously, though, there is a long history of people anthropomorphizing their tools and machines. Look at naval vessels, bombers, or any other transportation method that people depend on for their very lives; the practice of calling these vessels "she" points to the fact that we don't view them as "merely" machines.

    Heck, all of my computers have had names, and from time to time I do talk to them, cajole them into functioning properly. Academically I know my box isn't a person, nor does it really understand a word I say, but I have been interacting with it closely for years, know its little quirks, etc...

    We humans are all still animists at heart.
  • by Anonymous Coward on Tuesday May 08, 2007 @02:15PM (#19039955)

    While I have sympathy for your situation, every single (US) soldier who is pulling a trigger is a volunteer. "I was only following orders" stopped being a valid excuse for government-sanctioned murder a loooong time ago in an all-volunteer army.
    By government-sanctioned murder do you mean systematically rounding up and slaughtering unarmed civilians? Such as was practiced by the Third Reich?

    Or do you mean the killing of enemy combatants? Combatant meaning anyone, uniformed or non-uniformed, who takes up arms against you.

    One is illegal and excuses don't excuse it. One is not illegal and needs no excuse.

    Turing word: despotic
  • by paranode ( 671698 ) on Tuesday May 08, 2007 @02:20PM (#19040053)
    I am not a huge fan of this war but you need to get your terms straight. Murder is what the jihadis do when they blow up a car or restaurant full of innocent people, including women and children, on purpose. Killing is what the soldiers are doing, and they do it to the asshats who perform acts like I just described.
  • by operagost ( 62405 ) on Tuesday May 08, 2007 @02:29PM (#19040211) Homepage Journal
    I didn't realize that Iranian and Jordanian terrorists owned Iraq. Because that's where those guys are coming from.
  • by karnal ( 22275 ) on Tuesday May 08, 2007 @02:29PM (#19040223)
    One thing that no one has brought up in this thread is that it is OK to feel pain. Fearing pain, however, will typically alter your course of action.

    Just because we could make a robot feel pain doesn't mean it will necessarily fear it like most humans do.
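
    A toy way to see the distinction in code (Python; all the names and numbers here are made up for illustration): the pain signal is just data, and whether the robot "fears" it depends entirely on how much weight the action-selection code gives it.

        def choose_action(task_value, expected_pain, fear_weight):
            # Pick the action whose task value, minus dread of predicted pain, is highest.
            return max(task_value, key=lambda a: task_value[a] - fear_weight * expected_pain[a])

        task_value    = {"clear_mine": 10.0, "stay_put": 0.0}
        expected_pain = {"clear_mine": 8.0,  "stay_put": 0.0}

        print(choose_action(task_value, expected_pain, fear_weight=0.0))  # feels pain, no fear: "clear_mine"
        print(choose_action(task_value, expected_pain, fear_weight=2.0))  # fears pain: "stay_put"

    With fear_weight at zero the robot registers the pain and walks into it anyway; crank the weight up and it starts behaving the way we would.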
  • by Kadin2048 ( 468275 ) * <slashdot.kadin@xox y . net> on Tuesday May 08, 2007 @02:31PM (#19040235) Homepage Journal
    Robots really are replaceable - you can have empathy for a robot doing a hard task, but the next one off the assembly line really is the same thing as the previous one. Robots are not unique little snowflakes, compared to the valuable human beings they protect by proxy.

    The danger is, of course, that when cheap, highly replaceable robotics take over enough of the work of war, the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.


    Well, the real reason for the development of robots is that it closes one of the gaps inherent in our current wars, which generally involve a group of people who put a very high value on their lives fighting a group of people who put a very low value on their own lives. It's one possible answer to "how do you fight people who don't care if they die?"

    The American public -- and most other Western nations -- is willing to spend a lot of money, and a lot of resources, but isn't willing to spill a whole lot of (their own) blood before they pull the plug on a military operation. If you can create machines that perform the same tasks as people, and get blown up instead of people, then you can hopefully reduce friendly casualties. In short, you trade treasure for blood.

    You don't see Al Qaeda researching killer robots, because they have the opposite problem -- lots of blood to spill, not a whole lot of treasure to use developing expensive new weapons systems. Hence why they think a person is an effective ordnance-delivery system.

    The question is really whether all this technology can keep any particular war asymmetrical enough to defeat a heavy-on-blood/light-on-treasure enemy before the public gets fed up with losing its young people and stops supporting it. If you look just at casualty figures, Western armies are some of the most effective military organizations ever created, in terms of inflicting damage and death on an 'enemy' without really absorbing any. Depending on which figure you believe, the "enemy" dead in Iraq are somewhere north of 100,000 (it's certainly debatable whether most of them were really 'enemy' or just 'wrong place, wrong time,' and most figures that I've seen which include civilians are up around 600k), with only 3378 U.S. dead in the same period -- if true, that's about 30:1. However, by most measures we're still losing the war, and will soon pull out without any clear victory, because even at that 30:1 ratio it's still too high a rate of friendly casualties for the American public to bear for the perceived gain. (And admittedly, the perceived gain is basically nothing, as far as most people can see, I think. Killing Saddam was a goal that people found supportable; bringing democracy to a country that seems positively uninterested in it doesn't seem to be.)

    So I think it's with this idea in mind that leaders in the military are pushing high technology and robots to replace soldiers wherever possible, in the hope that by increasing that ratio even further, they can be effective in their mission (however inadvisable that mission may be) without losing the support of the public that's required to accomplish it.
  • by dircha ( 893383 ) on Tuesday May 08, 2007 @02:36PM (#19040303)
    "While I have sympathy for your situation, every single (US) soldier who is pulling a trigger is a volunteer. "I was only following orders" stopped being a valid excuse for government-sanctioned murder a loooong time ago in an all-volunteer army."

    Soldiers from lower middle class backgrounds without a college education are disproportionately represented in combat units. This suggests they are more pressured or inclined by their circumstances to enter the military. No one chooses the family they are born into or the environment in which they are raised. In many cases they may see no viable alternative to military service for realizing the demanding values and expectations society has instilled in them, and may be unable to see or acknowledge this coercion even when presented with it.

    Add to this that the military spends millions of dollars to actively misrepresent the nature, scope, and risks of military service in elaborate advertising campaigns targeted at young people in such circumstances, and you have a truly despicable situation.

    If you supported this war based on the premise that those there are enthusiastic volunteers who made fully free and informed decisions about their participation, you are deluded. Let me guess: you feel the same way about sex workers in Southeast Asia?
  • by Foolhardy ( 664051 ) <[csmith32] [at] [gmail.com]> on Tuesday May 08, 2007 @02:36PM (#19040317)
    Oh yeah, because everyone knows there are no civilian [washingtonpost.com] casualties in Iraq [iraqbodycount.org] from US military actions. Civilian casualties are civilian casualties, be it from terrorism, military invasion, ethnic cleansing, whatever. The innocent are just as dead.
  • It IS a good point (Score:2, Insightful)

    by KKlaus ( 1012919 ) on Tuesday May 08, 2007 @02:40PM (#19040383)
    To the extent that there's no reason to think animal pets have "souls" or any such thing, how are they any different from robots? If the only answer is that they're made of soft gooey parts while robots are made of hard metal and plastic, then I can't see why that should make an emotional attachment to a pet reasonable but one to a robot not. The parent is right: if a reasonably complex robot is essentially a metallic pet, then a developed human attachment is pretty reasonable.
  • by LWATCDR ( 28044 ) on Tuesday May 08, 2007 @02:42PM (#19040423) Homepage Journal
    From the article.
    "Was this the first bot to incinerate Homo sapiens?" No.
    Sidewinder and AIM-120 missiles are disposable, suicidal killing machines. Robots like those have been in service for a long time. They are flying robots, and not even remote-controlled. Same as the new Hellfire, Mk 48 ADCAP, Tomahawk, ALCM, or any number of systems. Robotic killing machines have been around since at least WWII.
  • Intent matters. (Score:1, Insightful)

    by Anonymous Coward on Tuesday May 08, 2007 @02:45PM (#19040463)
    Subject says it all.
  • by unity100 ( 970058 ) on Tuesday May 08, 2007 @02:49PM (#19040529) Homepage Journal
    These types of soldiers are the ones who make heroes, ones you can depend on to defend the innocent and the weak. Hippie speaking here -- we need more soldiers of this type.
  • Re:Intent matters. (Score:3, Insightful)

    by crabpeople ( 720852 ) on Tuesday May 08, 2007 @02:50PM (#19040555) Journal

    Intent matters.
    Not to the dead.

  • Re:Anthro.. (Score:4, Insightful)

    by Constantine XVI ( 880691 ) <trash,eighty+slashdot&gmail,com> on Tuesday May 08, 2007 @02:52PM (#19040597)
    H2G2 definition of ackthpt:
    A mindless jerk who'll be the first against the wall when the revolution comes.
  • If you take away the human cost and human horrors of war, of what benefit is peace?
  • by Cheapy ( 809643 ) on Tuesday May 08, 2007 @02:58PM (#19040717)
    It's a lot easier to grow attached to something that'll save your life than to something that could take your life away.
  • by UseTheSource ( 66510 ) on Tuesday May 08, 2007 @02:59PM (#19040749) Homepage Journal
    Many people I know treat their animals like they were their own children, especially if they are a childless couple. I accord my own cat roughly the same level of regard as I do most people; if you were crapping on the carpet I would swat you too.

    Hey... At least my birds actually talk. What can your cat do? :P

    In all seriousness, to the GP... Not sure if he was trying to be funny or not, but even though we may be at the top in intelligence, humans are still animals. Hell, chimps are 99% genetically identical to us. When talking about intelligent animals, sometimes people refer to the age of a child. For example, one might say that one of my birds has the mentality of a 3-4 year old human child. Couple that with the fact that they use English words in the correct context and ask for things by name, and the distinction the GP was trying to make blurs.

    Unless he's a Bible-thumper. ;)
  • by powerpants ( 1030280 ) * on Tuesday May 08, 2007 @03:13PM (#19040979)

    One sympathizes best with those that one is most similar to...
    I agree with that.

    ...and the modern American soldier has more in common with that robot than most Iraqis. In more ways than one.
    But I have to disagree here. The American soldier has one thing in common with these robots: they are working toward the same goal (in this narrow instance). Aside from that, everything about them is different. To illustrate how much more the Americans have in common with Iraqis (or even insurgents), let me quote from Merchant of Venice:

    I am a Jew. Hath not a Jew eyes? hath not a Jew hands, organs, dimensions, senses, affections, passions? fed with the same food, hurt with the same weapons, subject to the same diseases, healed by the same means, warmed and cooled by the same winter and summer, as a Christian is? If you prick us, do we not bleed? if you tickle us, do we not laugh? if you poison us, do we not die? and if you wrong us, shall we not revenge?
    Clearly, these things are not also true of the robot (except a couple of them, if you take their meanings loosely). And yet, because they help to keep us alive, we feel empathy toward them.
  • by Aurisor ( 932566 ) on Tuesday May 08, 2007 @03:15PM (#19041009) Homepage
    Please. Bush would NEVER have been able to draft troops for this ridiculous excuse for a war. If there were a draft tomorrow there'd be riots in the streets and US troops coming home before they got anyone over there against their will. I know that, you know that, and the white house knows that....that's why there hasn't been a draft (despite huge troop shortages), isn't a draft, and won't be one.

    Professional soldiers are the *enabling factor* in meddling foreign wars. That's why the founding fathers were against the idea of a standing army.
  • by QuasiEvil ( 74356 ) on Tuesday May 08, 2007 @03:32PM (#19041391)

    We can feel empathy for a machine that's doing us a favor -- but in reality has no feelings -- while simultaneously dehumanizing whole groups of people who only differ from ourselves culturally and/or geographically.
    Um, that's because I like my car more than I like most of humanity.
  • by Hatta ( 162192 ) on Tuesday May 08, 2007 @03:40PM (#19041547) Journal
    There are boundaries, and we expect our soldiers to recognize them. We expect soldiers to be able to tell the difference between the lawful application of deadly force and unlawful murder, and we expect soldiers to carry out the first and to refuse to carry out the second. Soldiers who cross the line we expect to be disciplined in the harshest manner possible.

    And yet, only one of our soldiers has had the character to do the right thing. And he's being court-martialed for it. [wikipedia.org]
  • by Tuoqui ( 1091447 ) on Tuesday May 08, 2007 @03:52PM (#19041771) Journal
    You know... Before we get this 'robot rights' thing down, we should get the whole 'human rights' thing right first.
  • by greenbird ( 859670 ) on Tuesday May 08, 2007 @04:29PM (#19042371)

    Imagine not having any stimulus to tell you that putting your hand in front of a blow torch is a bad idea. Not accidentally killing yourself becomes a bit of a challenge. Pain is an excellent instructional tool.

    This is why I'm all for corporal punishment. Pain is nature's way of telling you you're doing something wrong. Let's use nature's tools.

  • by presentt ( 863462 ) on Tuesday May 08, 2007 @04:41PM (#19042615) Homepage Journal

    The brains of different animals, humans included, evolved (yes, EVOLVED ;) for different purposes, so comparing them is apples and oranges.

    Scientific American had an interesting article in the April 2007 issue called "Just How Smart Are Ravens?" [sciam.com] (subscribers-only link, sorry). It touched on determining why animals evolve intelligence, especially considering the sheer number of species that respond instinctively at best or, in the case of the majority of species, simply by taxis (think bacteria, insects, etc.). The article defined intelligence as the ability to reason and display logic. It seemed to conclude that the more "intelligent" animals--birds, primates, humans--lived in more social environments and needed to be able to adapt (short-term, not in terms of evolution) to different situations.

    Current robots, however, have highly specific roles and do not need to adapt much. They clearly are highly logical, and thus somewhat intelligent by SciAm's standards, but the breadth of their intelligence is limited. And a mine-sweeping robot isn't about to "adapt" itself to start avoiding damage from mines, becoming its controller's friend, or taking over the world.

  • by treeves ( 963993 ) on Tuesday May 08, 2007 @04:48PM (#19042769) Homepage Journal
    I think this is right.

    No one is accused of being inhumane when they crash a car. Why is it any different if they destroy a robot? Limbs are more life-like than wheels? What if my car talks and I take it with me fishing? How strange.

  • by kalirion ( 728907 ) on Tuesday May 08, 2007 @05:03PM (#19043059)
    Now if only it was that easy to tell the body "All right, I acknowledge your message that something's wrong. However there's nothing I can do about that, SO STOP YELLING."
  • by myowntrueself ( 607117 ) on Tuesday May 08, 2007 @05:36PM (#19043677)
    we should get the whole 'human rights' thing right first.

    Absolutely.

    But things are a bit confused when (for example) Israelis and Arabs don't even regard one another as being *human* even though they are both arguably the same *race*.
  • by Grant_Watson ( 312705 ) on Tuesday May 08, 2007 @05:39PM (#19043715)
    This is like the whole tomato-as-fruit-or-vegetable thing. It's not necessary for social categories to match scientific ones precisely.

    Thus tomatoes are a vegetable and humans are not animals, even though tomatoes are in the same biological branch as fruit and people are in the same branch as animals.
  • Not a One... (Score:3, Insightful)

    by SixFactor ( 1052912 ) on Tuesday May 08, 2007 @05:51PM (#19043975) Journal
    I dug through all the replies (as of 1738 EDT), but not a one said a simple "Thanks."

    So with as much sincerity as I can express through this keyboard, I thank you for your service.

    I can only imagine the horror you've seen and the torment you're going through, but please do think about this: what you've seen, smelt, heard, done, or felt while on duty spared many here at home the experience of what you've gone through.

    I do not believe in coincidences: there is a reason another 9/11 hasn't happened here. As much as /.'ers mock the Federal government, the military, the TSA, or Homeland Security, I will never denigrate the efforts of those who at least try to keep that from happening again. Finally, for your consideration, as a father, you may be aware of a prayer that goes something like:

    "If there is to be war let it be in my time, so my children will know peace."

    So, I bid you Peace.
  • by An Onerous Coward ( 222037 ) on Tuesday May 08, 2007 @06:48PM (#19044955) Homepage
    I fully disagree. Anything that helps us expand our sphere of empathy helps. Working on one will likely have a positive effect on the other.
  • Re:Cutoff point (Score:3, Insightful)

    by Rei ( 128717 ) on Tuesday May 08, 2007 @07:58PM (#19045827) Homepage
    So you credit a "nervous system", even one with just a few cells, like in daphnia, as being the key. Yet, I'd claim that slime molds respond to outside stimuli with more "thought" than daphnia do. Why require *specifically* a nervous system, when it's not the only way a being can "think"?
  • There's a fairly well-known anecdote about Jane Goodall going to see a chimp lab somewhere. She was shown around and told, "Look, all the chimps here are smiling! They must be happy!" She apparently fled in tears -- chimps don't smile; they only pull their lips back like that when they're terrified.

    This is probably a scientific urban legend, but the point is that you cannot recognise fear/pain/suffering just by what a human would do.
  • Pain is nature's way of telling you you're doing something wrong. Let's use nature's tools.

    Having someone who is in a position of strength or authority inflict pain on you tells you it's ok to inflict pain on those who are weaker than you.

    Society is our way of surpassing our animal nature. Let's use society's tools instead.

  • by Carewolf ( 581105 ) on Wednesday May 09, 2007 @07:19AM (#19049873) Homepage
    It's healthy to have real empathy, that is, for real people. If you start applying too much "empathy" to everything, you become that crazy old woman in the suburbs with a house full of cats (or, in this case, the crazy old guy with a lawn full of car carcasses).
