U.S. Army Robots Break Asimov's First Law 821

buanzo writes "The US Army is deploying armed robots in Iraq that are capable of breaking Asimov's first law, that they should not harm a human. SWORDS (Special Weapons Observation Reconnaissance Detection Systems) robots are equipped with either the M249 machine gun, which fires 5.56-millimeter rounds at 750 rounds per minute, or the M240, which fires 7.62-millimeter rounds at up to 1,000 per minute." Update: this story refers to this article from 2005. But c'mon, robots with machine guns! I don't get to think about that most days!
  • by LiquidCoooled ( 634315 ) on Wednesday March 15, 2006 @10:51AM (#14923796) Homepage Journal
    There's lots of robots designed for this purpose.

    Of course, they are just toys and the big deal is that this will be rolled out, but here are a couple of things I thought of:

    USB Air Darts [gizmodo.com]

    Controllable from the computer :D

    Automatic sentry gun [hackaday.com]
    Uses a built-in camera to detect and aim at moving targets.

    It's all very Half-Life-ish, but plenty of fun.
  • Really? (Score:3, Interesting)

    by AnonymousYellowBelly ( 913452 ) on Wednesday March 15, 2006 @10:54AM (#14923831)
    As the robot is not intelligent enough - or isn't considered as such - to make the decision to open fire, I personally don't think this breaks Asimov's law. These robots are more like 'extensions' of a soldier's body, IMHO.

    I guess that if we are to consider this a violation of Asimov's laws, the computers of guided missiles have been illegally killing people for a long time.
  • by dada21 ( 163177 ) * <adam.dada@gmail.com> on Wednesday March 15, 2006 @11:01AM (#14923909) Homepage Journal
    I am being serious. I firmly believe in the right to bear arms -- all arms. I see nothing wrong with arming myself to protect myself from not only thieves and rapists but from anyone who decides they want to restrain my freedoms on my own land. I also believe the idea of a right to bear arms was protected against tyranny -- all tyranny. The Constitution doesn't guarantee the right, I believe it is a right all humans have from birth. The Constitution merely tells government to stay away from our weapons.

    For me, this also means that while government can afford greater weapons, it shouldn't prevent us from obtaining them as well. Look at the Framers' hatred of big centralized control of the masses and one would believe that they, too, would not want a central army more powerful than the militias that army was supposed to be solely composed of.

    We the People are idiots if we believe that the power hungry aren't utilizing fear as a way to control the average citizen. Don't pay your taxes? Don't accept the draft? Do something on your own property that hurts no one but is considered a crime? Think first: who has the biggest gun?
  • by SisyphusShrugged ( 728028 ) <meNO@SPAMigerard.com> on Wednesday March 15, 2006 @11:04AM (#14923941) Homepage
    Does anyone remember the movie Screamers (and the Philip K. Dick story Second Variety, on which it was based)? In the movie, the robots that were trying to wipe out humanity were called SWORDs. Maybe Bush really wants to wipe out all those annoying voters who are messing up his approval ratings....
  • by Sierpinski ( 266120 ) on Wednesday March 15, 2006 @11:13AM (#14924027)
    ...who fears government having sole access to technology that its own citizens would be jailed for?

    You mean like tanks? Fully-automatic weapons? Explosive devices? Artillery? Jets? Bombers? I don't think any of those can be legally possessed by any normal private citizen (with the possible exception of a collector's/dealer's license, which is apparently not so easy to get).

    The government owning/using technology that the average person cannot use (re: wiretaps) is commonplace. The only problem is to get through the secrecy/red tape to make sure they are not abusing it. (which also is very subjective, I understand that.)
  • by bhsurfer ( 539137 ) <bhsurfer@gmail.MENCKENcom minus author> on Wednesday March 15, 2006 @11:22AM (#14924125)
    It's not only the "average citizen" that these power-hungry people you speak of are seeking to control. Apply your right to bear arms as a basic human right to the nuclear fiasco going on with Iran right now and it's easy to see that the US Gov't has no intention of allowing other sovereign nations to possess the type of hardware that you'd like to be able to possess, let alone its own citizens. The whole "right to bear arms" thing as a defense against governmental tyranny in the US was an antiquated notion 150 years ago and it's even more amusing now. In my opinion the "average citizen" has not had the opportunity for a level playing field in terms of military hardware with a large Gov't in many many years, and (unfortunately) likely never will again.
  • Re:Not really... (Score:2, Interesting)

    by Shakrai ( 717556 ) on Wednesday March 15, 2006 @11:24AM (#14924152) Journal

    First off, "war on terror" != "WW2".

    Where the hell did I say war on terror? Why did you even bring that up? I largely agree with you that the war on terror is bullshit! I took exception to your insulting comments about people in the military. I didn't even specify which military I was talking about and went so far as to include insurgents in my example of how badly people are scarred by war.

    As for the "low threshold". If these people placed any value on the lives of others they would question being in the military and seriously give oppressing other nations a rest.

    Give me a fucking break. Oppressing other nations? They don't see it that way. Hell, I'm opposed to the war and I don't see it that way! The difference between you and me is that I'm willing to acknowledge your viewpoint and you have no idea where mine is coming from.

    *** TEN TIMES *** the number of dead on 9/11 have died in Iraq since March 2003.

    How the hell is that relevant to your insulting comment that started this "discussion"? Why do you even bring that up if for no other reason than to inflame opinion? I'm trying to decide if you are interested in a rational conversation or just a flamefest. I'm leaning towards flamefest.

    *** YOU ARE *** the terrorists you sick fucking depraved lunatics!

    If I'm the "terrorist" and I largely agree with your thoughts about the war then what does that make people who supported it? You are a radical leftist flower child who is just as dangerous as the radical rightist neo-con. Where is the sanity and the rational discussion with people like you (on either side)?

    I'm not wasting any more time with you.

  • by Alien54 ( 180860 ) on Wednesday March 15, 2006 @11:25AM (#14924170) Journal
    And who cares about fictional "laws", anyway?

    Many researchers are spending lots of time researching AI, and the problems for which the Laws of Robotics are an attempted solution; namely, how do you keep the robots from taking over and/or indiscriminately killing mere humans, as seen in so many Hollywood movies? So fictional laws are important as experiments in looking at potential solutions to a real problem.

    As I see it, the main problem consists of two factors, the second developing as a result of the first.

    The first factor is consciousness, also known as self-awareness. The second factor sounds like it is the first, but it includes other areas.

    The second factor is Identity. Identity is not restricted to Self Awareness, but also includes group awareness, etc in expanding circles to include universes, subjective and otherwise. When someone else is considered part of a group identity, as "one of us", then you tend not to act against yourself. When the other person is seen as being "one of the Not Us but Them" then you tend to get an opposition, etc.

    In wars, it is more a universe thing, the Hitler Universe vs the Churchill Universe, for example. Or Religious Figure One (tm) vs Religious Figure Two (tm). Or a religious universe vs a scientific universe.

    Part of the problem of psychopaths, sociopaths, etc. is that they tend to group their victims into the "One of the Not Us/Not Me" category. No sense of being or identity is allowed or granted to the other person, and so, to one degree or another, this rationalizes pigeon-holing people into things that can be abused one way or another. Or else the identity given is some other alteration of reality that legitimizes criminal activity.

    This is difficult enough to deal with in humans. Psychologists and psychiatrists have no cure for psychopaths, since it is seen as being in the genes. You can't make a pill for it, and no psychopath would take it as they do not have the luxury of seeing that anything is wrong with themselves.

    Now we try to apply this to Robotics. Probably the only real solution for the problem is to redefine Human as self aware creatures from earth, and incorporate this awareness somehow into robots, to some slight degree, so that Robots see Humans as "One of Us".

    It is a little touchy on how you would do this. It exposes some of the potential hypocrisy of humans in actions towards other potentially self aware creatures on earth, as well as each other. A self aware robot could see the hypocrisy without the emotional justification people exhibit. At this point, we could be in trouble.

  • Re:Phalanx... (Score:5, Interesting)

    by kryzx ( 178628 ) * on Wednesday March 15, 2006 @11:39AM (#14924316) Homepage Journal
    My dad was in the navy reserves for ages. He had a tale about this thing. Once they were in port, and a helicopter was landing on the ship. For whatever reason, the Phalanx was left turned on. Something about the rotors on the chopper pissed off (i.e. fooled the sensor algorithms of) the Phalanx, which rapidly swung around and pointed itself at the chopper. Luckily, as the parent post says, there was no ammo, so no shooting. But it scared the bejesus out of the helicopter pilots.

    Also, one of the things that makes this thing so kick ass is that once it decides to shoot something, it starts shooting (at 4,500 rounds per minute, or 75/sec) and the radar tracks each bullet's trajectory and corrects the aim based on that. It has eliminated any aiming error before the first bullet gets to the target.
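The correct-while-firing behavior described above amounts to a simple feedback loop: measure where each round actually went, then shift the next aim point by a fraction of the miss. Here's a toy 1-D sketch of that idea; every number in it (the gain, the offsets) is illustrative, not a real CIWS parameter.

```python
# Toy sketch of closed-loop aim correction: each fired round's observed
# miss distance feeds back into the next aim point. Purely illustrative.

RATE_PER_MIN = 4_500
RATE_PER_SEC = RATE_PER_MIN / 60          # 75 rounds per second

def correct_aim(initial_aim, target, rounds=5, gain=0.8):
    """Nudge the aim toward the target using each round's observed miss."""
    aim = initial_aim
    history = []
    for _ in range(rounds):
        miss = aim - target               # radar measures where the round went
        aim -= gain * miss                # correct a fraction of the error
        history.append(aim)
    return history

if __name__ == "__main__":
    print(RATE_PER_SEC)                   # 75.0
    print(correct_aim(10.0, 0.0))         # error shrinks by 5x per round
```

At 75 rounds per second, even a few correction cycles happen before the first round reaches a distant target, which is the point the poster is making.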
  • Re:Not really... (Score:5, Interesting)

    by arivanov ( 12034 ) on Wednesday March 15, 2006 @12:02PM (#14924556) Homepage
    They have an immediate major failing which was duly noted by quite a few other SciFi writers.

    There is at least one missing law: the robot must know that it is a robot.

    Without this one the primary three make no sense.

  • Re:Not really... (Score:3, Interesting)

    by demigod ( 20497 ) on Wednesday March 15, 2006 @12:08PM (#14924619)
    These people you so casually dismiss as "robots" sign up, generally speaking, when they're eighteen or nineteen years old; they believe, almost without exception, that they are doing so to serve their country, to protect the Constitution and the flag and Mom and apple pie. And you know what? At most times throughout our country's history, they've been right.

    Back when I was one of those eighteen year olds and signed up, it sure wasn't for any of that shit. It was to get money for college and some useful training while I was at it. It's not that I'm unpatriotic (far from it); it's just that patriotism had zero influence on the decision.

    Most of the guys I met fell into three categories: those there to get money for college, those there to get training so they could get a decent job on the outside, and those that actually wanted a military career. That third category was tiny by comparison. There were a few exceptions; perhaps the strangest I met was a lawyer who had joined the army national guard as an enlisted combat engineer, for the student loan repayment program.

    If my kids ever want to join up, I think I'll have them read a little Smedley Butler [wikipedia.org] first.

  • Re:Not really... (Score:3, Interesting)

    by Shakrai ( 717556 ) on Wednesday March 15, 2006 @12:19PM (#14924749) Journal

    I think that it's partially a case of Iraqi patriotism, partially a case of the Sunnis not wanting to become a minority power in a country they ruled for decades, and partially the sheer amount of ineptitude that we took into the country in the first place (no plan for the post-war).

    The first case is pretty obvious. Think of how you'd react if our country were invaded. Even if we were being ruled by an insane dictator who was running rape rooms and oppressing two-thirds of the population.

    The second case is also pretty obvious. The Sunnis were the ruling power under Saddam. Whether or not most of them approved of his methods (the power elite certainly did -- I'd like to think the average Sunni on the street did not), they have to be scared shitless that the same kind of stuff will happen to them if they become a minority power. Think of the ruling power in South Africa and the fears they had. To this day it still amazes me that was solved without much bloodshed.

    The third case is where it gets muddy. We obviously didn't win any hearts and minds by securing the oil fields whilst letting hospitals get looted. It's hard to win hearts and minds when you can't even keep the lights on 24 hours a day. Disbanding the Iraqi military was a huge mistake.

    The thing that sticks out in my head the most however was this Iraqi man that got interviewed on the street by a CNN crew embedded with our forces during the first few days of the War. He kept asking our troops "Not like 1991 is it? You won't leave this time?" Think about what we did after the First Gulf War. We (Bush Sr.) encouraged them to rise up and throw off the shackles of Saddam. Then we allowed Saddam to use his helicopters and stood by and did nothing while he slaughtered and brutally crushed them.

    Take that little bit of history then toss it on top of the criminal incompetence that went on during/right after the fall of Saddam's regime and you can see why winning hearts and minds is a next to impossible task.

    The really sad thing is that I don't see any way out of this mess. I don't see a way out of it for us, I don't see a way out of it for the Iraqi insurgent fighter who really thinks he's fighting for his own freedom, and I don't see a way out of it for the Iraqi people. I completely disagreed with going into the country in the first place, but I'm more upset about the sheer incompetence that followed the war than I am with the war in the first place.

    *Sigh*

  • Re:Not really... (Score:3, Interesting)

    by Daniel Dvorkin ( 106857 ) * on Wednesday March 15, 2006 @12:47PM (#14925032) Homepage Journal
    Out of curiosity, when did you enlist? When I first raised my right hand, it was 1987; the Cold War was still on, NATO and the Warsaw Pact were still on a hair trigger, and for all we knew the tanks could start rolling at any minute. Granted, the USSR was already collapsing from the inside and the Soviet forces probably couldn't have sustained high-intensity combat operations for more than a week at the outside, but we didn't know that; to us, they were still the Big Bad. I suspect that for those who joined up even a couple of years later, the picture looked very different.

    My daughter is twelve. Her mother was in the service too, so it's in her blood from both sides; I have no idea if she'll want to enlist or not, but if she does, I think I'll tell her, "Spend the summer after you graduate from high school volunteering at a VA hospital. If you still want to do it after that, you'll have my blessing." There are good reasons to join the service, but I think it would help a lot if these kids knew what they were getting into.
  • Re:Not really... (Score:1, Interesting)

    by Flunitrazepam ( 664690 ) on Wednesday March 15, 2006 @12:57PM (#14925143) Journal
    True, and it is mathematically provable that a Robot (based on a turing machine) can NOT become self aware (Halting Problem)
  • Re:Not really... (Score:3, Interesting)

    by jusdisgi ( 617863 ) on Wednesday March 15, 2006 @01:01PM (#14925188)

    The real problem is not whether machines think, but whether men do.

    -B.F. Skinner

  • by cr0sh ( 43134 ) on Wednesday March 15, 2006 @01:02PM (#14925201) Homepage
    First off, people need to understand that the so-called "Three Laws of Robotics" was nothing more than a literary device Asimov used to help advance the plots in his stories revolving around robots. His robotic protagonists could be considered embodiments of "the ideal man", a conception of an individual who strove to be helpful while preserving integrity and life. Since people are, well, human - and thus prone to fallibility - the concept of a robot with in-built laws to guide it was a perfect literary device to allow Asimov to explore the possibilities of a future where mankind, through being forced by his own creations, was a more altruistic species. Granted, in these stories (and I have by no means read them all) ways were found around these laws by both men and robots, but this is again nothing more than another magnifying glass being focused on mankind's faults.

    Indeed, it is this and other devices which Asimov employs in his fiction-based studies of human nature which make his books masterpieces in the hard-science fiction genre. He could just as easily have written about ordinary men under regular law, during just about any era in history, but such stories wouldn't likely have had the same impact as what he ultimately wrote. His work is great social commentary and insight about the human condition wrapped in a gauze of fiction. Unfortunately, so many people seem to not realize this or choose to ignore it. So much for reading comprehension, I guess.

    With that said, is it really any wonder why we would make automated war machines (especially ones which fail the "Three Laws of Robotics")? Throughout history, technology (amongst other things) has ultimately been spurred on more by violence than by any other force. Information technology and the machines which manifest themselves from it are no different (save for the other great driving force, sex, which also has proven to be a factor in the spread of information and the technology that controls it). Violence and sex - war and pornography - these are ultimately the two great driving forces of information technology in human society.

    Where's my fembot, damn it?!

  • Nintendo generation (Score:2, Interesting)

    by pig_eye_jackson ( 961383 ) on Wednesday March 15, 2006 @01:07PM (#14925244)
    I work at the company that makes these although I'm not in that group. I've heard that when they were first developed the army was afraid that the controls were too complicated for the soldiers. However, because many soldiers were raised playing Nintendo controlling the robots was second nature to them. You can read more about them here [foster-miller.com], and here is their data sheet [foster-miller.com].
  • by Medievalist ( 16032 ) on Wednesday March 15, 2006 @01:13PM (#14925283)

    How beauteous mankind is! O brave new world, that has such people in't! - Shakespeare, The Tempest

    I'm guessing that the reason we still use human operators is because nobody's come up with a solution to the "AK-toting cow" problem.

    It works like this: The US Gubbmint is all overjoyed 'cause robot soldiers will do all kinds of stuff that would psychologically damage normal Americans, and you don't have to re-integrate robots into normal society afterwards.

    But, after you release the robotic hounds, it'll only take about ten nanoseconds for the local equivalent of Viet Cong to figure out that they can make a whole bunch of AK-47 stencils (you fold 'em in half for safe carrying) and stencil "destroy this" on whatever they want.

    So, the mayor's car gets an AK-47, the local CIA listening post gets a few dozen, etc. etc. and the robots blow the living shit out of them. You can use black spray paint at night and the robots won't see 'em until morning, or you can use various solutions that are invisible under most circumstances but will show up in IR, or when light is reflecting off the surface, or whatever.

    A few days later, Fat Tony, the local "legitimate businessman", realises that he can have his "associates" spray paint AKs on his competitors' cattle herd and then phone in an anonymous tip to the local American occupation force about guerillas hiding in the competing stockyard. Then the Baptists figure out they can spraypaint AKs on the Buddhist temple, or whatever.

    Obviously, the Armed Forces don't want the local outlaws calling the shots, regardless of whether the locals are political, religious or economic outlaws.

    Solve that problem and you'll make a world where Abu Ghraib and Bagram will seem like the good ol' days.
  • Re:Not really... (Score:3, Interesting)

    by SnowZero ( 92219 ) on Wednesday March 15, 2006 @01:21PM (#14925355)
    The halting problem is not the same thing as self-awareness, unless you have a very strange definition of "self-aware". So if you, as a human, can't answer whether the Collatz function [wolfram.com] terminates on an arbitrary finite integer, does that mean that you are not self-aware? Do you think a chimpanzee is self-aware? How many theorems have they proved?

    This is exactly why Turing proposed the Turing test. It's the only objective way to gauge human-like intelligence proposed thus far. You have to bypass a human's bias about our uniqueness.
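The Collatz example above is easy to play with: every individual integer anyone has tried terminates, yet no general proof exists, so a human "can't decide" the general question either. A minimal sketch (the step limit is an arbitrary assumption to keep the loop bounded):

```python
def collatz_steps(n, limit=10_000):
    """Count steps for n to reach 1 under the Collatz map (3n+1 if odd,
    n/2 if even), or return None if the step budget runs out."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
        if steps >= limit:
            return None                   # undecided within the budget
    return steps

# Every n we try terminates, but no proof covers all n -- which is the
# poster's point: instance-checking is not the same as deciding in general.
print([collatz_steps(n) for n in range(1, 8)])   # [0, 1, 7, 2, 5, 8, 16]
```

Being unable to settle the general case clearly says nothing about the checker's self-awareness, human or machine.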
  • we didn't know that? (Score:4, Interesting)

    by drgonzo59 ( 747139 ) on Wednesday March 15, 2006 @02:23PM (#14925928)
    But why? Haven't you ever asked yourself that question? Here is this huge country, much bigger than the US; was it really that hard to see that everything was going downhill and that there were only a couple of years left before the "red giant" collapsed?

    It turns out that this image of the Soviet Union as an uber-powerful country that would invade at any minute was in the interests of the neo-cons in power. It is known now that Congressional groups influenced by them would go through the CIA evidence and re-interpret and mix everything with fantasy to make it sound as if the Russians had reached this unprecedented level of technological achievement and were ready to "push the button" at any minute. The media didn't know; it just regurgitated everything that the government told it to. So the minds and opinions of ordinary Americans are controlled by this small group of people whose main principle is to hold society in fear so they can control it.

    Watch the "Power of Nightmares" movie. It is a British documentary, aired on the BBC a while ago, and now it is free for download here [archive.org]. It is very educational; it talks about the ideological forces behind the US neo-cons and Islamic extremism, how each started and how they clashed. There is also a Wiki page about the movie; check it out. Just search on Google for it. Warning: it is a 3-hour-long thing, but I didn't regret taking the time to see it.

  • by mr_burns ( 13129 ) on Wednesday March 15, 2006 @02:23PM (#14925929)
    Both these weapons (m249 and m240g) get really hot. You have to fire them in 3 second strings and swap out the barrels every 300 rounds or so. While you're letting the barrel cool between strings or changing out the barrel... that's when the enemy attacks you.

    So a common technique is "talking machine guns". You have 2 gun crews and they take turns with the firing strings, so there are always rounds going down range and the barrels stay relatively cool. Hopefully you can stagger changing out the barrels too.

    So how do the robots handle this? You'd need moving parts that handle the ammo chain. Either it would have to be able to reload from standard chains by itself or troops would have to link many chains together and load them into a drum beforehand. If you've got a long chain you need an armature to twist the chain in case of a runaway gun. And then there's the barrels. You need more moving parts to change those out. And what if it drops one?

    So to deal with those cooling issues with these weapons you may need 2 weapons per robot or 2 robots working in tandem.

    But even that's not ideal. A minigun is a far better weapon for this kind of thing. The ones on the Blackhawks would be perfect. We already can order them in bulk, the barrels stay cool, and in the case of a runaway gun, you just cut power to the motor. And the moving parts are far less complicated. Much easier to maintain in the field.

    The only advantage I can see to deploying the m249 or m240g is that the robot and troops could share ammo and the troops know how to service them. But the m134 minigun already uses the same ammo as the m240g and if you're going to service a robot, you probably are going to get special training anyway.

    Oh, and in peacetime can they clean my carpets?
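The "talking guns" technique described above is just interleaved scheduling: two guns trade 3-second strings so rounds are always going downrange while the idle barrel cools. A toy sketch of the handoff schedule; the string length comes from the comment, everything else is an illustrative assumption:

```python
# Toy schedule for alternating "talking guns": gun 0 and gun 1 trade
# fixed-length firing strings so one barrel is always cooling.

STRING_SECONDS = 3   # string length from the comment above

def talking_guns(total_seconds):
    """Return (start_time, gun) pairs: which gun fires each string."""
    schedule = []
    t, gun = 0, 0
    while t < total_seconds:
        schedule.append((t, gun))
        t += STRING_SECONDS
        gun = 1 - gun                     # hand off to the other gun
    return schedule

print(talking_guns(12))   # [(0, 0), (3, 1), (6, 0), (9, 1)]
```

Which makes the robot problem concrete: a single SWORDS unit has no partner to hand off to, so it either needs two weapons or a second robot running the other half of this schedule.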
  • Re:Not really... (Score:4, Interesting)

    by smoker2 ( 750216 ) on Wednesday March 15, 2006 @02:33PM (#14926010) Homepage Journal
    Yeah, there's that guy Arthur C. Clarke too; he writes sci-fi, and made up something called a geostationary satellite [sciencemuseum.org.uk], and that will never happen either!

    Oh, wait ...

  • Re:Not really... (Score:3, Interesting)

    by UserGoogol ( 623581 ) on Wednesday March 15, 2006 @02:42PM (#14926095)
    Yes, but I'm not of the opinion that Asimov ever portrayed "the human race being reduced to animals in a robot zoo" as being a bad thing. In the particular stories in I, Robot where he directly touches upon the idea of robots entering politics ("Evidence" and "The Evitable Conflict"), he's sort of nervously optimistic on the subject. In Evidence, he very clearly states that robot overlords would be awesome, and in The Evitable Conflict he's a bit grimmer, but I still don't think it's depicted as a "bad" thing. Yes, humanity loses control of their destinies, but... so? These aren't shitty "I'm gonna go run around in circles because I don't know what the fuck I'm doing" robots like in Runaround. If they run into a situation where they have to harm a human, they will just pick the least harmful option and deal with it. These are smart robots. Asimov (or Dr. Calvin at least) seems to present it as ultimately a good thing.

    Regarding your main claim, I dunno. There's certainly a recurring theme about how strict rules can be rather brittle in real life, but I think you're reading too much into it.

    I think that the overall idea of the three laws is a good idea, though. If you make a robot with a "general purpose intelligence," you're going to have to hard-wire some sort of ethics into it so as to make sure it acts in the best interests of its end-user. You can't just tell the robot, "Oh, go read Nicomachean Ethics or whatever" and hope for the best. Just because a robot knows ethics doesn't mean it'll actually feel like doing what it's told. You need to actually make it want to be ethical, and that's where hardwiring comes in.
  • by Stephen Samuel ( 106962 ) <samuel@bcgre e n . com> on Wednesday March 15, 2006 @03:08PM (#14926357) Homepage Journal
    The real purpose of the three laws was to create a plot device. Asimov was clear that robots could be used to kill people -- that was the excuse for bringing up the three laws to begin with -- people were scared that robots would run around killing them, so the three laws were created to keep people calm.

    That violent future history of robotics leading to the creation of the three laws could have made a story in and of itself, but Asimov relegated it to a footnote -- because that sort of story would be something of a literary FPS... go in, kill things, get killed, clean up the mess. Not a whole lot of plot device in there.

    On the other hand, the three laws -- while looking simple and 'safe' -- introduced all sorts of dilemmas and thought experiments. Like, what do you do when people have to go into a slightly harmful area?
    What happens when your choice is between one person dying and another?
    Can you hurt someone to prevent him from killing someone?
    Is suicide (and thus breaking the third law) better than choosing how to break the first law?

    Is (secret) interference with humanity's destiny justifiable to (hopefully) minimize suffering?

    None of those plotlines are possible without the 3 laws. On the other hand, any plot that requires that any of the 3 laws don't exist can be facilitated by the simple (and very believable) plot device of having a human take the 'shortcut' of removing or modifying the 3 laws so as to allow something 'important' (or just profitable) to get done.

    (( ... and you realize, of course, that getting to the point where an autonomous entity could even recognize when a possible violation of the three laws was occurring would be the excuse for creating and wallowing in entire fields of artificial intelligence that have, so far, only had their surfaces scratched. ))

  • Re:The problem is... (Score:3, Interesting)

    by CrimsonAvenger ( 580665 ) on Wednesday March 15, 2006 @04:02PM (#14926816)
    don't forget that tens of thousands of US soldiers have been severely injured by wounds that would have killed them in previous conflicts.

    As of last month, there were ~16,600 US military wounded. That's all sorts of wounds, from "lost both arms and legs" to "flesh wound". Kerry got a Purple Heart for a wound that was treated with a band-aid, which is probably close to the lower limit of a wound recognized by our military.

    So it's fair to suggest that the vast majority of those wounded do not fit "severely injured by wounds that would have killed them in previous conflicts".

  • Re:Not really... (Score:3, Interesting)

    by ScentCone ( 795499 ) on Wednesday March 15, 2006 @04:45PM (#14927234)
    What have those French riots to do with Iraq?

    Pay attention to the thread, how about? The question is, why would France, despite glaring evidence, have said that it would always, no matter what, use its ill-deserved veto-enabled seat on the Security Council to stop any action, ever, that would involve force against Saddam? Primarily because of a long history of doing business with him (even as he had his forces taking shots at aircraft patrolling the no-fly zones, skimmed "humanitarian" resources from his people to keep building more weapons and palaces, etc.), and because of a gigantic and rapidly expanding Middle-Eastern/African Muslim immigrant population that is simmering away in France's stratified socialist wonderland (with no job prospects, a stagnant economy, and a portable cultural leadership that - as we saw in the riots - leverages flimsy excuses to stir up telegenic trouble). And because, with so little actual global clout available to it, the craven French government would rather watch Iraq rot under Saddam, and watch him send cash to suicide bombers, and watch him shop around for missile parts, etc., just so they could make a show of not being the US. It was callow, and embarrassing for the French people.

    The riots, per se, have very little to do with Iraq... but they are a perfect example of one of the reasons that the French did not want to allow themselves to be painted, by a sensationalist Arab press, as being anti-Muslim in their foreign relations. This has backfired, of course, because they were hoping that Saddam himself would fall from inside, and they'd still have their economic connections intact with the Baathists, the better to reap the financial rewards of providing more services. Poor judgement.

    De-clawing of any law enforcement in Germany? Have you ever even been there?

    Hmmm, let's see. My wife was born in Frankfurt, and also lived in Munich, Berlin, Vienna, and elsewhere. Many family trips there... does any of that count? Many personal friends in foreign service, international business, law enforcement, defense/intel who work there. Does any of that count? Friends and neighbors from places throughout Africa and the Middle East who have spent recent years there in school among immigrant student populations... does any of that count? Germany's wide-open borders, crazily hands-off attitude (only recently starting to straighten out) towards the movement and activities of transparently hostile, radicalized militant Islamists, and drooping economy are well known. The German government's resistance to holding Saddam accountable for his continuing provocations had nothing to do with their awareness of the available intel, but instead with internal elections and cheesy leftist muscle-flexing for the press ("I'll stand up to Bush! Vote for me!") without a single thought for whether or not they were going to condemn millions of Iraqis to another generation of murderous Sunni Baathist rule. You'd think Germany, of all places, would know better (I hope that comment didn't invoke Godwin).

    There is no reason why Syria, or even Iraq, would want to deploy WMDs anyway. If they really wanted to deploy WMDs, or hurt the USA or Israel, they could have already done so in the last 20 years.

    I sure wish I could get as much sleep as you apparently do (I have to assume that you slept through Iraq's casual lobbing of SCUD missiles at Israel as Saddam was being kicked out of Kuwait?). I'm not saying that Syria would be foolish enough to use Saddam's exported WMDs against the US; I'm saying that he (Saddam) did have plenty of such materials still stashed away, and was very busy trying to keep the UN from seeing and touching them. The Germans, the French, and everyone else knew that. More pointless paper sanctions against Saddam weren't going to stop him from stashing stuff with his buddies in Syria.

    All this talk simply takes away the focus of the reason why Europe and many other countries don't support the U
  • Re:Not really... (Score:3, Interesting)

    by owlstead ( 636356 ) on Wednesday March 15, 2006 @07:10PM (#14928604)
    "...perfect example of one of the reasons that the French did not want to allow themselves to be painted, by a sensationalist Arab press, as being anti-Muslim in their foreign relations..."

    And with good reason. But I think that would go for any nation at this point in time. Like it or not, the Arab world is growing and countries need to think or rethink their relation to the Arab world. I am the last to support or like any religion, but there it is.

    "You'd think Germany, of all places, would know better (I hope that comment didn't invoke Godwin)."

    You would think that the USA, of all places, would have learned what invading a country really means. I think that the only real change from such a situation should come from within a country. That would mean less regulation (which really entrenches governments, as history has shown) and trying to help the Iraqis build a better country. Iraq is a very difficult country to rule. Saddam only just managed it by being pretty brutal. Not that that excuses him, but I believe the situation was definitely better than it is now. Since you already mentioned Godwin, I can safely say that Saddam was far from a Hitler. He mostly oppressed his people to stay in power (from what I can surmise).

    "now that Saddam would have still be in exactly the same situation (only more entrenched and richer) if he had been handled according to, say, French or German dictates on the matter"

    Yes, he would have. But is this a better situation? And why save Iraq, when there are plenty of much less divided countries to "save"? Let's hope the USA learned its lesson and won't try to save those either. I am not too worried; most of them don't contain any oil.

    "Did you prefer the Taliban running Afghanistan?"

    Of course not, and I have far fewer problems with invading Afghanistan. There was nothing to mess up worse than it already was. But to say that it was an altruistic action by the USA, neh. I really, really hope that those girls can do something positive with their education, and that the country won't return to chaos (it is going the wrong way at the moment).

    The point of such inane regimes is to replace chaos/war (which is probably worse). To help these countries one should try to remove poverty instead of attacking the ones running the country. Because that is restoring chaos and, unfortunately, war.
  • Re:Not really... (Score:3, Interesting)

    by AlterTick ( 665659 ) on Wednesday March 15, 2006 @10:04PM (#14929705)
    Yes, but I'm not of the opinion that Asimov ever portrayed "the human race being reduced to animals in a robot zoo" as being a bad thing.

    Indeed, I was perhaps editorializing a bit there. I think Asimov did a splendid job of merely presenting the end result, pretty much letting you decide whether you think it's a bad thing to have all your needs taken care of by a watchful, caring robot overlord. I have more of a Heinlein mindset, so I can't help but imagine how a Heinlein story hero would have reacted to such a situation. Suffice to say, rather than relaxing into upholstered opulence, I think I'd be standing with the Heinlein guy on top of a pile of dead robots, myself. Not necessarily rational, but I just wouldn't want my destiny directly managed by something beyond my control.

    Regarding your main claim, I dunno. There's certainly a recurring theme about how strict rules can be rather brittle in real life, but I think you're reading too much into it.

    (shrug) Seemed as obvious as a slap in the face with a fish to me, all through the whole collection of stories. You could fairly easily change the references to "robot" into "human follower of fictional religion X" and the references to the 3 Laws into "Three Commandments of Religion X". I think it's pretty obviously a wide-ranging commentary on everything from slavery to prejudice to human nature, all wrapped up in a masterfully crafted set of stories that are a great read.

    I think that the overall idea of the three laws is a good idea, though. If you make a robot with a "general purpose intelligence," you're going to have to hard-wire some sort of ethics into it so as to make sure it acts in the best interests of its end-user.

    The problem there is that, as the book so elegantly illustrates, the "best interests of the end user" often defy preconceived laws. Sure, a self-aware intelligent robot would necessarily need to be programmed to behave ethically, but there is simply no way to boil that down to a handful of hard-coded rules. Life is complicated. The robots, like we humans, would need to be allowed the free will to adjust their ethical conclusions to fit the situation.

"I've seen it. It's rubbish." -- Marvin the Paranoid Android