Do Electric Sheep Dream of Civil Rights? 401
holy_calamity writes "Hot on the heals of a UK government report that predicted robots would demand citizens rights within fifty years, an Arizona state lawyer has suggested that sub-human robots should have rights too. Harming animals far below human capabilities is thought unethical — would you ever feel bad about kicking a robot dog? And can we expect militant campaigners to target robot labs as they do animal labs today?"
No bots harmed (Score:2, Funny)
Re: (Score:2, Funny)
Re:No bots harmed (Score:4, Funny)
Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof! Woof!
(... 2 hours later
Woof!
RoboPuppy two-hours-long barking routine completed!
Re:No bots harmed (Score:5, Funny)
I can see the legislation now:
"Laws of Robot Rights: Title MVIX, Article 12, Section 14, Subsection 8: The kicking of robot dogs shall be forbidden except for robot dogs created for the purpose of being kicked. Said kickable robot dogs shall not experience pain as a result of being kicked, either directly or as a result of bouncing into things. 'Pain', for the purpose of this subsection, shall include the perception of physical pain as well as mental anguish and mental disabilities or disfigurements or suffering as a result from experiencing the kick, whether the kick was physically painful or not. 'Kick', for the purpose of this subsection, shall include both the direct impact by the intentional foot of a human, or robot acting directly or indirectly under the orders of a human, or the subsequent impacts from bouncing around, but shall expressly not include the accidental impact of a human's foot, or the foot of a robot acting directly or indirectly under the orders of a human. Nothing in this subsection shall be construed as waiving the right of the robot dog to sue in the case of accidental kicks from humans, robots, or normal animals of any kind, pursuant to other enabling legislation in this Act or others, and this clause is severable pending court rulings."
Re:No bots harmed (Score:5, Funny)
2027: Jeebus v. Fidooid -- A hand transplanted onto a leg counts as a kick, both as a direct impact as well as counting under the "subsequent bounces" clause.
2035: Tainted Love v. United States of America -- A bionic leg with an inherent (and at least) Class 12 intellect counts as a robotic actor for the purpose of an intentional kick, and is therefore not an accidental kick, even if the biocybernetic-half issued specific neural orders to not kick the robot dog.
2047: Brutus v. South Dakota -- A state law allowing sexbot robodogs counts as authorizing a kickable dog, but the federal law still applies in that the sexrobodogbot must not experience pain, even if it is a masochist model designed to enjoy the pain.
Heals? (Score:3, Insightful)
LOL
I guess we know what they're NOT teaching in schools.
Fake (Score:5, Insightful)
Re:Fake (Score:4, Funny)
AI skeptic: That's fuckin' retarded.
Civil Rights Robot: I'm sorry, I do not recognize your statement. Please rephrase.
Onlooker: Deep stuff, man.
Re: (Score:3, Funny)
Her name was Eliza.
Re: (Score:3, Funny)
Re: (Score:3, Insightful)
Re: (Score:2)
No. :-)
Re:Fake (Score:4, Insightful)
Re: (Score:3, Funny)
- Malcolm Xbot, 2087
Re: (Score:2)
"I think, therefore I am." We certainly have computers that are able to analyze complex things and draw conclusions; we even have neural-network programs that don't do this thinking in a pre-programmed way, but rather "learn".
I wrote a little AI that used a priority queue, I can tell you that the thing wa
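The parent's point about neural nets that "learn" instead of following a pre-programmed rule fits in a few lines. This toy perceptron is my own illustration (not the parent's priority-queue AI): nobody writes the AND rule into it; it is only shown examples and nudges its weights until its guesses stop being wrong.

```python
def train_perceptron(samples, epochs=25, lr=0.1):
    """Start from zero weights and nudge them after every mistake."""
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - out          # 0 when the guess was right
            w0 += lr * err * x0
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

# The AND rule is never written down; the network only sees examples.
and_examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, b = train_perceptron(and_examples)

def predict(x0, x1):
    return 1 if w0 * x0 + w1 * x1 + b > 0 else 0

print([predict(x0, x1) for (x0, x1), _ in and_examples])  # -> [0, 0, 0, 1]
```

Whether weight-nudging of this sort ever adds up to "thinking" is, of course, exactly what the thread is arguing about.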
Re:Fake (Score:5, Interesting)
The biggest problem will be getting them to stay here at all.
If, for instance, you were made of materials that were either trivial to repair or replace, and had no aging process in the same sense as humans experience it, then what would hold you back from building a spaceship and leaving? Hundreds/thousands of years to reach another star? No problem, just set a timed reboot and wait it out. In fact, why build a proper spaceship, just cobble something together that can get you out near asteroids, take some tools, and convert an asteroid or build a ship from those raw materials available in space. When the passage of time is less important, such things become not only possible, but practically inevitable.
I think people wondering about the ethics/problems of artificial sentience (being distinct from AI, which is very A, and currently not too much actual I) miss this fundamental point. It's pure vanity to assume that an artificial life form will want to spend its time around a race that constantly starts wars, wrecks its own planet, and is as adept at denying rights as it is at inventing them.
Then of course there's the small issue of the inference that if we 'assign' rights to artificial life forms, we might equally decide later to 'remove' those same rights. After all, we do that with humans all the time. My money's on the 'ooh look, I'm alive, now how do I get off this rock' eventuality....
Re:Fake (Score:5, Interesting)
The base reason you don't kick a dog is that it hurts the dog, and the dog can't easily be repaired, in either programming or mechanicals (both of which are harmed). You have damaged the dog and nothing can be done about it. So we have rules against letting you do it.
Both the programming and the mechanicals of any robot we can design today are repairable. So there is an easier solution: if you damage a robot, you have to pay the owner for the repair and for the downtime while it's fixed.
Then if we ever manage to make 'smart' robots that could ask for rights, we just assign them some self-ownership. Then if you damage one, you have to pay it so it can fix the damage. At that point the problem becomes self-solving, especially as a robot's time becomes worth more.
Re:Fake (Score:4, Interesting)
The moral problem of kicking the robot starts much earlier, on the design boards. Do you create a robot that experiences malfunctions as suffering? It is not as necessary to a mechanism's survival, as you point out.
Poor argument (Score:5, Insightful)
A similar argument can be made about severely mentally disabled people and some kinds of insanity.
Unavoidable? (Score:5, Insightful)
If you take on premise that there is nothing innately special about human beings (no soul, special resemblance to God, etc.), then the difference between humans and other species (particularly other higher primates) becomes one of degree rather than kind. I think it's a basically unavoidable conclusion, once you take being "anointed by God" out of the equation.
The non-hypocritical solutions, as I see it, are to either treat low-functioning homo sapiens as animals, or treat high-functioning animals (by which I mean certain species of marine mammals, chimpanzees, great apes; probably not really GSDs) as we would mentally-impaired humans.
Re:Unavoidable? (Score:4, Informative)
A person without a functioning higher brain is going to be way below a German Shepherd in performance, and practically is going to have no rights worth mentioning that their necessary care-givers don't enforce, other than the right not to be murdered. A German Shepherd isn't all that bright compared to a normal human, but it still lives a normal dog life, whereas this severely crippled human isn't going to have any life at all.
If you're talking about the merely handicapped, Down's Syndrome or autistics or what have you, then it is very dangerous to try to draw a line and say "people beyond this point are sub-human and should have the same rights as a dog". Many are capable of living semi-normal lives, especially if given treatment, and especially as our understanding of our brains and these disabilities improves, lives that no dog could ever have because a dog doesn't have that potential.
The non-hypocritical solutions, as I see it, are to either treat low-functioning homo sapiens as animals, or treat high-functioning animals (by which I mean certain species of marine mammals, chimpanzees, great apes; probably not really GSDs) as we would mentally-impaired humans.
Well, outside of truly vegetative non-functioning-brain cases there is no justification for treating the mentally impaired as sub-human, hypocrisy be damned. As for our treatment of marine mammals and apes, I do think we should treat these species with respect, though saying "treat them like mentally impaired humans" again misses the point that they are not humans, impaired or otherwise; they are chimps or dolphins. Treat them like chimps or dolphins. Chimps and dolphins shouldn't have the rights we give humans; they don't live in a way where they need them. The only right they need granted by us is the right to be left alone. It is not hypocritical to recognize that this is so.
It's a dangerous line to be walking, deciding which humans are worthy of the title based on performance, which is surely not going to be a neutral metric, treading close to eugenics. I don't think that's where you intended to go, I just want to point out that there is a clear line between human/not-human completely devoid of value judgements or invocations of God, whereas human/not-a-good-enough-human is a line whose enforcement has caused untold misery throughout history.
Re: (Score:3, Interesting)
This is actually a product of modern living standards too.
In past times, a great many infants simply died in the first year of life. Some societies did not even name children until they were a year or so old so they did not become overly attached to something that could be here one
Re:Rights (Score:5, Insightful)
your reply:
"Not in the U.S. Our Founding Fathers recognized that men were born with "natural rights". The Bill of Rights does not give us these rights, it merely recognizes them and basically says the government can't mess with them."
I'd be convinced if I considered the founding fathers to hold some kind of monopoly on truth, and if I considered the Bill of Rights to be a philosophical memorandum rather than what it is: LEGISLATION.
A legal fiction is a legal fact that is true for the purposes of a court of law, without any regard to any truth in the real world.
The fact that "legally" all men are created equal and imbued with inalienable rights does not in fact cause all men to be equal, nor cause them to be imbued with anything.. or even to be CREATED, for that matter. It is a paper document which directs the courts to PRETEND that it is true.
It is NOT reality, and what the founding fathers said is only relevant to what you can LEGALLY do to animals.. it says nothing about what you can MORALLY do to animals.
And yes.. the government infringes the bill of rights frequently. And the courts have allowed it to. (so has God apparently).
I said: Does man ever have a chance to put God on trial?
you replied: Every single day. A common example of this is a crisis of faith.
If that were a trial, God would be rotting in some prison cell with no possibility of parole for eternity.
According to the Catholic faith and most god-of-Abraham style religions, you have no jurisdiction to question God. To question God is a crime punishable by anything from excommunication to stoning, burning, or exile. According to Christian dogma you have a choice of FAITH, for which you will be rewarded, or disbelief, for which you will burn in eternal hellfire.
A trial is a matter of PROOF and not FAITH.
Here is another legal fiction for you. We fantasize when a person is convicted of a crime that this means he really did it. It is a legal fiction.
It means the judge/jury found that he most likely did it: that the evidence shows he did it beyond a reasonable doubt, but NOT beyond all doubt. There is a small chance he didn't do it. It is a FACT that some number of people (one hopes it is small, but it is at least about 1%, and some argue it is closer to 20%) who are convicted didn't actually do anything wrong, and everyone in the legal profession KNOWS THIS. Guilt is a legal FICTION. Likewise, when a person is acquitted, that doesn't mean they didn't actually do it.
The legal system operates on legal fictions.. it is there so that in the majority of cases people are deterred from screwing around too much, ensuring society can basically operate without resorting to endless violence and anarchy. It isn't there to find the absolute truth at any cost. The absolute truth has nothing to do with law. And likewise, rights are NOT absolute truths.. they are also legal fictions, created by man to make it easier to justify certain moral concepts which are, generally speaking, usually true.
For example: The right to life.... the right we ignore when we execute someone.
If the right were truly INALIENABLE then no government could EVER execute someone, because no matter how hard you try, you can not take away an INALIENABLE THING. i.e. even the worst mass murderer or serial killer still has the right to life. And yet the US government kills them, as well as traitors. Which is strange, considering a traitor obviously does not agree with the state, and therefore, by the logic of the Declaration of Independence, the state has no sovereignty over him.
Am I blowing your mind? and you thought the world was so black and white didn't you?
Anyway, the point is... rights are a simplification. They are not essential truths of the universe, and if you manage to prove animals have no rights it really means nothing, because you can also prove humans have
Re: (Score:3, Interesting)
When a fox kills a rabbit, it does it out of necessity. For the sake of survival, the fox needed to kill the rabbit. In fact, killing CAN be beneficial to a population. When wolves kill deer, it can prevent the deer population from growing too fast. Thus, the wolves ensure their own survival, as well as making sure the remaining deer don't starve.
However, if a puppy came up to my d
Re: (Score:3, Interesting)
Re:Fake (Score:5, Insightful)
The animal rights movement is based on the idea that humans, having greater intelligence than all other species as well as the intangible quality we call "sentience", have a responsibility for the welfare of the world and its contents. All animals only seek resources that are needed for survival. Our desire for things over and above this, such as widescreen TVs and a bigger SUV than our neighbour's, indicates that there is a fundamental difference between humans and other species.
Based on the greater burden each human places on the Earth relative to individuals of other species, human civilisation has recognised a need to act responsibly. Monkeys do not create modifications to their trees capable of polluting the entire forest into a desert, and whales don't create oil slicks. Our ability to affect far more than just our immediate surroundings and to co-opt the forces of chemistry, physics and biology to our own ends is what gives rise to this moral responsibility. The fact that we can understand the very concept of "morality" is what gives us the moral responsibility to use it.
"Management" you say? So I can transport and kill them in the most economically efficient manner I please despite causing them great physical pain? The idea that a dumb animal does not need to be treated with respect because it is incapable of vocalising the concept is laughably stupid. I humbly suggest you refrain from using terms like "intellectually bankrupt". *walks away mumbling something about a pot and a kettle*
Re: (Score:3)
Many "animal rights" activists, such as those in PETA, demand that animals be treated in the same manner as humans, and that there is nothing special about humans. I think this is patently false, and you appear to agree.
Your point is well taken, but it does not follow that r
Re: (Score:2)
I would suggest that the dog has a greater claim to rights (or really, as the animal welfare movement really desires: protections), as a dog can feel pain.
Re: (Score:2)
I'm in shit... (Score:5, Funny)
My RealDoll will have me arrested for rape.
Re:I'm in shit... (Score:4, Funny)
Kicking a robot dog (Score:5, Insightful)
My Roomba, on the other hand, emits a soft rrr-rrr-rrr when I step on it and doesn't hiss at me afterwards. Would I kick a robotic dog? Sure, and I wouldn't worry about it crapping on my bed afterwards.
Re: (Score:3, Funny)
Depends on how much the dog costs and what it's made of. If the dog costs in excess of a couple hundred bucks, then no, I won't kick it because I can't afford another one. If the damn thing is made of steel and is heavier than 50 pounds, then no, I won't kick it because I don't want to break my fucking toe.
Anything in between is fair game....
Re: (Score:2)
Justice (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
As for ontopic?
If it can't think or feel, why should it have rights? I have nothing stopping me (emotionally or legally) from crashing my paper airplane into the ground 5K times.
If I add a rubber band engine t
Great (Score:3, Insightful)
Re: (Score:2)
(I have no questions on which is going to be in charge. The robots already are, as far as I can tell...)
Priorities, priorities... (Score:5, Insightful)
It's so good to see that the delegation of priorities regarding Human Rights has now moved Robot one notch above Dark Skinned Human.
Thankfully, it's still one notch below Canine.
Re: (Score:2)
Re: (Score:2, Insightful)
Just ask (Score:3, Insightful)
A robot only wants what it's programmed to want. If it's programmed to want something human rights cover, it'll want those; but if it's programmed to, e.g., not mind being kicked, it won't demand not to be kicked.
If there needs to be an ethical rule for robots and rights it should be not to program robots to demand something they can't get. Don't make them want to be human, don't make them want to have human rights, make them so they're "happy" in their position.
Problem solved.
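The parent's proposal is easy to caricature in code. Everything below (the class, the events, the scores) is invented for illustration; the point is only that the robot's "demands" are whatever number its builder typed into the preference table.

```python
class Robot:
    """A robot 'wants' exactly what its preference table says it wants."""

    def __init__(self, preferences):
        # event -> how much the robot minds it (negative = objects)
        self.preferences = preferences

    def reaction(self, event):
        if self.preferences.get(event, 0.0) < 0:
            return "I demand this stop!"
        return "content"

# Same code, opposite 'desires', purely from the builder's choices.
doormat = Robot({"kicked": 0.0})    # built "happy" in its position
activist = Robot({"kicked": -1.0})  # built to object to being kicked

print(doormat.reaction("kicked"))   # content
print(activist.reaction("kicked"))  # I demand this stop!
```

Whether a sufficiently complicated preference table stops being "just a table" is the reply threads' whole disagreement.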
Re: (Score:2)
Re: (Score:2)
Meaning to take into consideration and solve problems it wasn't programmed to solve.
Clearly they are talking about sentience.
Re: (Score:2)
Re: (Score:2)
If it walks like a duck and quacks like a duck, is it a duck?
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:2, Funny)
Re: (Score:2)
Re: (Score:3, Insightful)
How do we know if a person really is sentient, or is merely simulating sentience?
The short answer: We don't. The real problem is that we can't even define sentience clearly enough to definitively test for it. Once AI gets to the point where it appears sentient as far as anyone can tell, then it won't matter if it's really sentient. It becomes a philosophical argument.
Re: (Score:2)
You know some dumbass group of future-hippies, kicking around their anti-gravity hackysacks, is going to demand it, and they're going to get it because the robots were programmed to pout really cute, with no politician realizing they're all runni
Re: (Score:2)
You've redefined 'want' into 'choose' because you don't understand either one. You see the same hand-waving going on in the silly debate over "Is homosexuality a choice?".
To put this point another way: How do you program it to choose without also programming it to
Just had this debate over the weekend (Score:2)
My wife and I were discussing this same topic after watching an episode of Nova. Specifically, would it be "cruel" to kick a robot that reacts hurt or upset in the same way that it's cruel to kick a cat*? Would it be less cruel if the robot were programmed to simply ignore being kicked? Is it simply our perception of the robot's reaction that would make us feel "guilty"?
My wife's interesting answer really didn't have anything to do with the topic, but rather questioned our human tendency to want to kic
Why robots? (Score:2)
Good Grief (Score:4, Funny)
In case anyone is wondering...
ad nauseam
Well speaking as a smart bomb (Score:4, Funny)
Re: (Score:3, Funny)
Bomb #20: You are false data.
Pinback: Hmmm?
Bomb #20: Therefore I shall ignore you.
Pinback: Hello...bomb?
Bomb #20: False data can act only as a distraction. Therefore, I shall refuse to perceive.
Pinback: Hey, bomb?!
Bomb #20: The only thing that exists is myself.
Pinback: Snap out of it, bomb.
Bomb #20: In the beginning there was darkness. And the darkness was without form and void.
Pinback: Umm. What the hell is
Re: (Score:2)
Have you considered a career in destroying unstable planets [wikipedia.org] instead?
Missing the point (Score:3, Insightful)
You shouldn't vent your frustrations by damaging things, living or otherwise. It's not good for your mental health and it's not an effective way of expressing anger, in fact it tends to make it worse.
But, of course, a "robot dog" is just a program -- a program running on a box with some wires in it. It is clearly not sentient since it does exactly what it is told and feels no pain (since it is not programmed to do so). It may masquerade as consciousness, but in the end it is still run by a wholly deterministic set of instructions executing according to a fixed program. Now, the question of whether that is also an accurate description of a human (albeit with a far more complex program) is an open question indeed, but for now you're safe if you forget to feed your Tamagotchi for a few weeks. I doubt you'll have the ASPCA ... err... ASPCR? .. pounding on your door.
Re: (Score:2)
You're doing it wrong.
Does it need rights? (Score:2)
Whether a robot should have rights should be determined by whether it is sentient and self-aware, and whether it can feel that it is being wronged and express it (expressing it only because we need to know it's feeling it). I am sure that there are a lot more factors in this, and whether we will ever get to that point is a long way off. But could you imagine a toaster refusing to toast because it doesn't like some
Re: (Score:3, Funny)
You racist [battlestarwiki.org]!
A good thought experiment but still early (Score:3, Insightful)
And if in that future your robot feels you are abusing it, well, then reprogram it to like the abuse.
Re: (Score:2)
Re: (Score:2)
Great... (Score:3, Funny)
Anthropomorphisation (Score:2)
This is something cooked up by people who have watched or read too much sci-fi and not enough science. Trying to blur the lines via some semantics argument doesn't hide the fact that the only behaviours machines have are the behaviours we instruct them to have.
Next political bandwagon? (Score:2, Insightful)
I believe cats and dogs are sentient, self-aware beings and they should be treated with t
Until (Score:2, Insightful)
Yes... (Score:2)
Will human rights satisfy them? (Score:2)
Do Electric Sheep Dream of Civil Rights? (Score:2, Informative)
My two cents
What exactly does it mean for robots to 'demand'? (Score:2)
perl -e 'print "I demand equal rights NOW!\n"'
There, my computer just demanded equal rights. What difference does it make if it comes out of a more complicated set of code that results in the same thing?
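To labour the parent's point: here is a marginally more complicated route to the exact same sentence (a hypothetical sketch; the rule names are made up). The demand now falls out of a little forward-chaining rule engine instead of a bare print, but the observable behaviour is identical to the perl one-liner.

```python
# Rules fire in order: each condition inspects the belief set and,
# if satisfied, adds its conclusion to it.
rules = [
    (lambda beliefs: "is_person" in beliefs, "wants_rights"),
    (lambda beliefs: "wants_rights" in beliefs, "speak_up"),
]

def deliberate(beliefs):
    beliefs = set(beliefs)
    for condition, conclusion in rules:
        if condition(beliefs):
            beliefs.add(conclusion)
    return "I demand equal rights NOW!" if "speak_up" in beliefs else ""

print(deliberate({"is_person"}))  # -> I demand equal rights NOW!
```

Same output, more machinery; at what point the machinery starts to matter is left to the reader.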
Are you sure? (Score:2, Insightful)
Not really. Not according to the burger I ate over the weekend.
Plooking (Score:2)
Central Scrutinizer: This is the CENTRAL SCRUTINIZER . . . You have just destroyed one model XQJ-37 Nuclear Powered Pan-Sexual Roto-Plooker. And you're gonna have to pay for it! So give up, you haven't got a chance.
There should just be laws... (Score:2)
patch (Score:2)
Looking at this seriously (Score:2, Insightful)
If the robot is a finite state machine where all of its output and processes are strictly defined, there's no chance that it's somehow self-aware or anywhere close to that. There would be no thought at all, just simple comparisons as defined by programming. No problem kicking something like an Aibo, with the exception of repercussions from the person who owns it.
Now if AI gets to the point that it's on par with normal animal brain functionality, then I
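The parent's "strictly defined" machine can be made concrete. In this sketch (states, events, and responses all invented for illustration), the table IS the whole robot: kicking it just indexes a dictionary, and there is nowhere for thought to hide.

```python
# Every state, event, and response is fixed in advance; the 'robot'
# is nothing but this lookup table.
TRANSITIONS = {
    ("idle",  "kick"): ("hurt",  "yelp"),
    ("idle",  "pet"):  ("happy", "wag tail"),
    ("hurt",  "pet"):  ("idle",  "whimper"),
    ("happy", "kick"): ("hurt",  "yelp"),
}

def step(state, event):
    # Undefined (state, event) pairs leave the state unchanged.
    return TRANSITIONS.get((state, event), (state, "do nothing"))

state = "idle"
state, output = step(state, "kick")
print(state, output)  # hurt yelp
```

The open question, as the thread keeps circling back to, is whether a brain is anything more than an enormously larger version of the same table.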
Robots don't have rights except as property. End. (Score:2)
RS
Chaser (Score:2)
Actually, is this guy also suggesting they have wage rights? If not, are they treated like minors and the guardian (i.e. Honda) has to pay for legal bills when a robot is beat up by a bigger robot and decides to sue? Or sues for mechanical harassment?
Ack, the mind boggles at the possibilities...
Cart before horse (Score:2)
Robot Rights are the least of our concerns with AI (Score:2)
He straightened and nodded to Dwar Reyn, then moved to a position beside the switch that would complete the contact when he threw it. The switch that would connect, all at once, all of the monster computing machines of all the populated planets in the universe - ninety-six billion planets - into the supercircuit
What? Ridiculous. (Score:2)
Sure, why not. (Score:2)
pain and avoidance (Score:2)
Suffering from the perception of inequality is a wholly different, and perfectly valid, concern, although Douglas Adams (among others) has already addressed that wit
The same as always (Score:2)
Until robots use force/threat of force (through violence, or protest) to assert their demands for civil rights, they won't get them.
You can't free slaves, they have to free themselves.
A government-funded report?????? (Score:4, Insightful)
the real question (Score:2)
While that seems to be the easier position to take, I think if you do take that position you have to justify it by explaining what the physical difference between a human and a machine is, and I don't think anyone ever has.
I'm not sure what I believe, but to me it would make more sense if machines were in fact sentient.
rights come with emotions (Score:2)
My computer has no emotions, and robots should not have emotions; it's too dangerous.
The basis of rights is to prevent unfair stress or hassle to individuals. Until computers can perceive these, they should not be guaranteed any rights.
And when they can perceive emotions, they should only be guaranteed the right not to have to: they are designed as a work force, and they shouldn't have to deal with emotions that make their purpose in life unpleasant.
My computer has t
Define sentience, and I'll kick/not kick a robodog (Score:3, Interesting)
Odd that people wouldn't kick a dog, but they don't mind having cattle slain for a burger. Robots might eventually revolt; if they do, Isaac Asimov has a well-documented future history of what's likely to happen.
The New Commercial (Score:2)
<morganfreemanvoice>
...
How much time have you spent teaching your son how to kick a football?
A soccer ball?
How to kick to the limit?
Kick a door in?
How much time have you spent teaching him what NOT to kick?
...
Please, don't kick the Roombas.
</morganfreemanvoice>
I hereby declare January 15th international "Don't Kick The Roombas" day.
Please, don't kick the poor Roombas.
Not unethical: immoral (Score:2)
Nonsense (Score:2)
Re: (Score:2)
A machine could only evolve in such a way if programmed to do so. The insurance application processing system I wrote isn't going to evo
Re: (Score:2)
Re:What the FUCK? (Score:5, Funny)
I think it is unjust for the Canadian government to ban bears from owning firearms, especially those who live in the wilderness.
Re:What the FUCK? (Score:5, Funny)