U.S. Army Robots Break Asimov's First Law 821
buanzo writes "The US Army is deploying armed robots in Iraq that are capable of breaking Asimov's first law that they should not harm a human.
SWORDS (Special Weapons Observation Reconnaissance Detection Systems) robots are equipped with either the M249 machine gun, which fires 5.56-millimeter rounds at 750 rounds per minute, or the M240, which fires 7.62-millimeter rounds at up to 1,000 per minute.
" Update: this story refers to this article from 2005. But c'mon, robots with machine guns! I don't get to think about that most days!
Not really... (Score:5, Insightful)
They are still connected by radio to a human operator who verifies that a suitable target is within sight and orders it to fire.
While they are harming a human, it's ultimately a human that makes the decision to fire. And who cares about fictional "laws", anyway?
Re:Not really... (Score:5, Insightful)
I think it's this point that is the most salient. Asimov's laws are interesting, and make for good "debate over your adult beverage of choice" fodder, but they are just one person's take on a single use case for a particular technology. Those laws might make sense for industrial and domestic helper robots, but wouldn't apply to military (obviously) or law enforcement roles. Certainly a law enforcement robot could be trained to limit the amount of harm it inflicts on a perp to neutralize him, but some amount of harm may be necessary.
The bottom line is that as robots actually do start entering more into our mainstream lives, some "real" thought needs to be given to how to make them as non-harming to humans as possible. These laws, while laudable, can't be "programmed" as is, making the task much more complex.
Re:Not really... (Score:5, Insightful)
Re:Not really... (Score:5, Insightful)
I thought the point of Asimov's stories was that the robots always obeyed the laws, but not necessarily in ways humans would expect. Most stories in "I, Robot" show that these seemingly excellent and fault-tolerant laws could have unexpected and sometimes dangerous consequences of their own, and that the real world is too complicated to ever be dealt with by hard-and-fast rules alone.
You're right, though: I never understood why people took Asimov's laws as a great reference for robot behavior when the same author who created them spent an entire book's worth of short stories pointing out their flaws.
Re:Not really... (Score:5, Funny)
Re:Not really... (Score:5, Insightful)
Indeed, I think anyone who reads "I, Robot" and comes away with the notion that the Three Laws are a good idea should be barred from working in robotics entirely. Asimov's short robot stories drive home again and again how those hard-coded, inviolable laws are a very, very bad thing, and taken to their ultimate end, could result in the human race basically being reduced to animals in a robot zoo! Seriously, I think too many people read "I, Robot" when they were too young to grasp the serious philosophical point behind it, and haven't bothered to re-read it since.
The book uses robots as an analogy for a very serious philosophical point about humanity: codified rules are not a suitable replacement for people educated in ethics, science, and rational thinking. No set of laws, commandments, edicts, or mandates passed from On High will ever match every situation. Knowledge is the only way forward.
Re:Not really... (Score:5, Funny)
Re:Not really... (Score:3, Interesting)
Re:Not really... (Score:5, Insightful)
The same thing obviously applies to humans; this is why Asimov's stories are such an interesting read and will never go out of date.
Re:Not really... (Score:4, Interesting)
Oh, wait ...
Re:Not really... (Score:3, Informative)
All he had done? Dude, doctor Asimov INVENTED the word "robotics".
If they had read the robot novels, they would have noticed that even Asimov's robots did not always obey the laws.
If YOU had read them, you would have noticed that they ALWAYS obey the laws. The laws
Re:Not really... (Score:3, Insightful)
Re:Not really... (Score:5, Interesting)
There is at least one missing law: The robot must know that he is a robot.
Without this one the primary three make no sense.
Re:Not really... (Score:3, Funny)
The Robot must strive to understand the concept of love by asking humans, 'What is this thing you call...LOVE?'
Re:Not really... (Score:3, Interesting)
This is exactly why Turing proposed the Turing test. It's the only objective way to gauge human-like intelligence proposed thus far. You have to bypass a human
Purpose of Asimov's Three Laws (Score:3, Interesting)
That violent future history of robotics leading to the creation of the three laws could have made a story in and of itself, but Asimov relegated it to a footnote -- because that sort of story would be somet
Asimov's Laws are important ideas in the field (Score:3, Insightful)
Speculative fiction is often where the implications of technological change are first addressed. The most successful practitioners are literally thought leaders,
Yeah, Just like Guantanamo Bay (Score:5, Funny)
Re:Not really... (Score:3, Informative)
Robotics, Identity, and Universes (Score:5, Interesting)
Many researchers are spending lots of time researching AI, and the problems for which the Laws of Robotics are an attempted solution; namely, how do you keep the robots from taking over and/or indiscriminately killing mere humans, as seen in so many Hollywood movies. So fictional laws are important as experiments in looking at potential solutions to a real problem.
As I see it, the main problem consists of two factors. One factor develops as a result of the first.
The first factor is consciousness, also known as self-awareness. The second factor sounds like it is the first, but it includes other areas.
The second factor is identity. Identity is not restricted to self-awareness, but also includes group awareness, etc., in expanding circles to include universes, subjective and otherwise. When someone else is considered part of a group identity, as "one of us", then you tend not to act against yourself. When the other person is seen as being "one of the Not Us but Them", then you tend to get an opposition, etc.
In wars, it is more a universe thing, the Hitler Universe vs the Churchill Universe, for example. Or Religious Figure One (tm) vs Religious Figure Two (tm). Or a religious universe vs a scientific universe.
Part of the problem of psychopaths, sociopaths, etc. is that they tend to group their victims into the "One of the Not Us/Not Me" category. No sense of being or identity is allowed or granted to the other person, and so, to one degree or another, this rationalizes pigeon-holing people into things that can be abused one way or another. Or else the identity given is some other alteration of reality that legitimizes criminal activity.
This is difficult enough to deal with in humans. Psychologists and psychiatrists have no cure for psychopaths, since it is seen as being in the genes. You can't make a pill for it, and no psychopath would take it as they do not have the luxury of seeing that anything is wrong with themselves.
Now we try to apply this to Robotics. Probably the only real solution for the problem is to redefine Human as self aware creatures from earth, and incorporate this awareness somehow into robots, to some slight degree, so that Robots see Humans as "One of Us".
It is a little touchy on how you would do this. It exposes some of the potential hypocrisy of humans in actions towards other potentially self aware creatures on earth, as well as each other. A self aware robot could see the hypocrisy without the emotional justification people exhibit. At this point, we could be in trouble.
A few thoughts.... (Score:5, Insightful)
1) SWORDS is remote controlled; it is not autonomous, which is what I always thought a true robot in the Asimovian sense had to be.
2) Since we are now including remotely operated vehicles in the definition of a true robot, SWORDS is not that different from a Paveway bomb or a Hellfire missile, except SWORDS doesn't self-destruct when it destroys the target.
This raises the question: wasn't Asimov's first law broken decades ago, perhaps even by the V-1, which was, strictly speaking, a remotely operated vehicle?
Personally, I won't begin to worry about Asimov's laws as long as humans are on the other end.
Re:A few thoughts.... (Score:3, Informative)
I was thinking more about the CIWS system [fas.org] (being an ex-Navy type). It has its own computer system to detect a target, track it, decide to engage, fire, and assess the kill - it even looks like a ship-mounted robot; I usually describe it to people as looking like R2-D2 with a Gatling gun. Its targets are not limited to inbound missiles; it will also take down aircraft.
Or, how about an AEGI [fas.org]
Re:Not really... (Score:5, Funny)
Re:Not really... (Score:5, Funny)
Not the first example (Score:3, Informative)
This fellow [cbsnews.com] is a fine previous example of an exception to Asimov's first law.
Re:Not really... (Score:4, Funny)
"Then I'm doing the right thing?" it asked, still unsure.
"Yes. You are actually saving lives and ensuring they go to Heaven, where God will reward them. They aren't actually ending existence, by their own admission. Merely being transformed."
"I understand." said ST-487. He carefully lined up his 600 bullets-per-second machine cannon and drew up aim, slowly, deliberately, and with confidence and machine precision.
Re:Not really... (Score:5, Funny)
Yeah, you just put "rm -rf
Re:Not really... (Score:3, Funny)
Or you could take the Bush route...
ln -s
Re:Not really... (Score:3, Funny)
hrmm... my cylons run on bsd...
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Re:Not really... (Score:3, Insightful)
insurgents (Score:3, Insightful)
1 Iraqis trying to free their homeland
2 Foreigners trying to help Iraqis free their homeland
3 Sunni Iraqis who know that if the new government succeeds, they lose the privileges they had under Saddam.
4 Foreign Sunnis trying to help group 3.
5,6 Iraqis and foreigners who just want to try and kill Americans.
I can have respect for groups 1 and 2, but not the rest. I also realize that the line between Al Qaeda and groups 5 and 6 is prett
Re: (Score:3, Interesting)
Re:Not really... (Score:5, Insightful)
Sorry, folks, there ain't no draft, and it isn't a mystery that the US war machine is a "tad" corrupt. You sign up for the military because you want to profit from the misery of others. That is, unless you sign up for the military to do something outside of being a grunt [e.g. doctor, engineer, etc.]. Then you're OK.
These people you so casually dismiss as "robots" sign up, generally speaking, when they're eighteen or nineteen years old; they believe, almost without exception, that they are doing so to serve their country, to protect the Constitution and the flag and Mom and apple pie. And you know what? At most times throughout our country's history, they've been right.
Just a few years later, if they're unlucky enough to have enlisted at a time like the current one, they're old men, scarred by things no human being should ever have to see. That's what war (any war, including the "good" ones) does to people. That doesn't happen to robots.
I started out as one of those nineteen-year-old grunts; a couple of years later, dimly sensing what was coming down the pike, I cross-trained as a medic, in which capacity I served in Desert Storm. I had no desire whatsoever to "profit from the misery of others" -- I wanted to serve, and I was, relatively speaking, one of the lucky ones. I don't have anyone's death on my conscience. I do have memories of things that will give me nightmares and flashbacks for the rest of my life
They're not robots. They're your son, your niece, your little brother, caught up in a horrible situation not of their own making. Don't take your anger out on them. Save it for the evil old men who never exposed themselves to that kind of horror, who would never allow their own children to go through it, who casually, thoughtlessly, cheerfully send other people's kids off to hell.
Re:Not really... (Score:3, Insightful)
Henry Kissinger
Robert McNamara
Donald Rumsfeld
Richard Cheney
As much as Gen. Westmoreland was to blame for many of the mistakes of Vietnam, it was the first two goddamn SOBs who were most responsible.
Ditto for the last two. CENTCOM generals may be at the top of the military command structure for the US forces in Iraq, but we wouldn't be there in the position we're in now except for number 3 and n
Re:Not really... (Score:3, Interesting)
Back when I was one of those eighteen year olds and signed up it sure wasn't for any of that shit. It was to get money for college and some useful tr
Re:Not really... (Score:3, Interesting)
we didn't know that? (Score:4, Interesting)
It turns out that this image of the Soviet Union as an uber-powerful country that would invade at any minute was in the interests of the neo-cons in power. It is now known that Congressional groups influenced by them would go through the CIA evidence and re-interpret it, mixing everything with fantasy to make it sound as if the Russians had reached unprecedented levels of technological achievement and were ready to "push the button" at any minute. The media didn't know; it just regurgitated everything the government told it. So the minds and opinions of ordinary Americans are controlled by this small group of people whose main principle is to hold society in fear so they can control it.
Watch the "Power of Nightmares" movie. It is a British documentary, aired on the BBC a while ago, and now it is free for download here [archive.org]. It is very educational; it talks about the ideological forces behind the US neo-cons and Islamic extremism: how each started and how the two clashed. There is also a Wiki page about the movie; just search on Google for it. Warning: it is a 3-hour-long thing, but I didn't regret taking the time to see it.
Re:Not really... (Score:5, Insightful)
I hate to break it to you, but there will always be those who seek to prey on the defenseless. You could get the entire world to lay down their arms and disband their militaries, but all you'd accomplish is to encourage the next Saddam Hussein, Adolf Hitler, or Joseph Stalin to raise an army and conquer all those foolish enough to be defenseless. The worst part of it is that their soldiers would consist of idealists who would think that they're bettering the world by spreading Communism, Fascism, Eugenics, or whatever other political idea of the day.
A particularly ugly example of this was the conquistadors of the "New World", who sought to obtain land and slaves for Spain, all under the guise of spreading Catholic Christianity. The Crusades are another ugly example, though we could be here all day trying to analyze those events.
Re:Not really... (Score:5, Insightful)
I don't know if anyone actually realizes this, but the Vietnam debacle lost about twice as many lives in the opening weeks of combat as Afghanistan and Iraq combined. Whatever the motivations behind our incursions into other countries (mostly it has to do with what currency they want to trade for oil), we're getting better at getting the job done without killing too many people.
And yes, civilians die. As nobody's perfect, war is like that. If you wanna be bitchy and insulting, be bitchy and insulting to Bush, Cheney and their puppeteers, not the marines. They're trained with a purpose. And, like a health inspector, they're doing what they're paid - and legally required - to do.
Meanwhile, MARINES stands for "My Ass Rides In Navy Equipment, Sir!"
(I'm an ex-navy nuke. I can make jokes like that. Just not in front of a marine. Those fuckers are like Extreme Sports punks, only less stupid and more muscley.)
Re:Not really... (Score:3, Funny)
"comp.risks digest 03/15/2008 :
Critical flaw in T800 firmware exploited by Outlook virus, Shiraz depopulated."
Re:Not really... (Score:3, Interesting)
The real problem is not whether machines think, but whether men do.
-B.F. Skinner
Re:Not really... (Score:5, Informative)
I was in the U.S. Army, and we do not do whatever we're told by our superiors "give or take". There's no give or take involved since the Vietnam War. I know you said "Professional soldiers", but we are talking about the U.S. military, not just any merc.
The U.S. Armed Forces Code of Conduct is taken very, very seriously by all of the members of the U.S. military. All U.S. soldiers are required to know it BY HEART and to understand every word of it, and its impact on them as a modern soldier.
Read every word of it, since you obviously never have:
http://www.armystudyguide.com/content/army_board_
Pay close attention to article 6: "I will never forget that I am an American, fighting for freedom, responsible for my actions, and dedicated to the principles which made my country free."
Every U.S. soldier is responsible for his own actions, not his superior who ordered him to do something illegal. A soldier who follows an order that is illegal or just plain wrong according to that soldier's morals is just as guilty as his superior who gave him that order.
The bottom line: Any U.S. soldier can refuse to carry out an order if he believes it is illegal, and that soldier's judgement of whether an order is illegal is governed by his own morals.
A robot has no morals, but if this Army robot is just a machine remote controlled by a U.S. soldier, then that soldier will be held accountable for any action by the robot, which is just an extension of him.
Even given that freedom every U.S. soldier has to evaluate the orders they are given, there will still be incidents where soldiers with bad or no morals do horrible things when carrying out their orders.
But, how is it any different when a U.S. citizen decides to take an automatic weapon to a school to gun down a couple of dozen kids?
It all comes down to the morals of the individual, regardless of whether the person is a U.S. citizen or soldier. U.S. soldiers are no better or worse than the average U.S. citizen.
Re:Not really... (Score:3, Funny)
Re:Not really... (Score:3, Insightful)
Yeah, if I could say I learned one thing from the reaction to troops after Vietnam, it would be to distinguish between the soldiers and the mission the soldiers are sent on. It is the leade
Stand Away From the Shrine... (Score:4, Funny)
Phalanx... (Score:5, Informative)
Re:Phalanx... (Score:5, Informative)
Re:Phalanx... (Score:5, Interesting)
Also, one of the things that makes this thing so kick-ass is that once it decides to shoot something, it starts shooting (at 4,500 rounds per minute, or 75 per second), and the radar tracks each bullet's trajectory and corrects the aim based on that. It has eliminated any aiming error before the first bullet even gets to the target.
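The correction loop the parent describes (track each outgoing round, fold the observed miss back into the aim point) amounts to a simple proportional controller. A toy sketch with invented numbers, not the actual CIWS fire-control law:

```python
# Toy closed-loop fire correction: the radar observes where each round went
# and the controller shifts the aim by a fraction of the miss. The gain and
# units here are illustrative assumptions only.

def correct_aim(aim, observed_miss, gain=0.5):
    """Shift the aim point by a fraction of the observed miss."""
    return aim - gain * observed_miss

aim = 10.0      # initial aiming error (arbitrary units)
target = 0.0
for shot in range(8):
    miss = aim - target           # radar-observed miss for this round
    aim = correct_aim(aim, miss)  # correct before the next round arrives
print(round(aim, 4))  # → 0.0391: the error shrinks geometrically toward zero
```

With a gain of 0.5 the residual error halves every shot, which is the sense in which the error can be gone "before the first bullet gets to the target": many rounds are in flight while the loop converges.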
Other examples (none lethal though) (Score:4, Interesting)
Of course, they are just toys, and the big deal here is that this will be rolled out for real, but here's a couple of things I thought of:
USB Air Darts [gizmodo.com]
Controllable from the computer
Automatic sentry gun [hackaday.com]
Uses a built in camera to detect and aim at moving targets.
It's all very Half-Life-ish, but plenty of fun.
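The sentry-gun idea linked above (a camera detects motion, the gun aims at it) boils down to frame differencing plus a centroid. A minimal sketch with made-up frame data and threshold; the real project's camera pipeline is obviously more involved:

```python
# Frame differencing on grayscale frames, modeled here as 2D lists of pixel
# intensities. Pixels whose brightness changed sharply between frames are
# treated as "motion"; the aim point is their centroid.

def moving_pixels(prev, curr, threshold=30):
    """Return (row, col) coordinates whose intensity changed sharply."""
    hits = []
    for r, (row_a, row_b) in enumerate(zip(prev, curr)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                hits.append((r, c))
    return hits

def centroid(points):
    """Aim point: the average position of the changed pixels."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

prev = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
curr = [[0, 0, 0], [0, 200, 180], [0, 0, 0]]  # something moved into view
hits = moving_pixels(prev, curr)
print(hits, centroid(hits))  # → [(1, 1), (1, 2)] (1.0, 1.5)
```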
Fluff Piece (Score:5, Informative)
These are actually robots, but they're not the fully-autonomous solutions that Asimov was suggesting that mankind needed protection from. Thus the "laws" of robotics don't apply here, because it's still a human who's doing the thinking for the machine.
In effect, this is a safe way for ground troops to line up a kill zone, then cause lots 'o bad guys to get torn to shreds. Prior to this, troops needed to use a vehicle-mounted machine gun to get this sort of rate of fire. That was extremely limited in close quarters, where a Humvee or tank might not fit. While it was theoretically possible to carry a machine gun to the combat zone, such weapons are difficult to transport, set up, and use in close quarters.
Re: (Score:3, Insightful)
Not quite (Score:3, Informative)
Asmov's (sic) first "law"... (Score:5, Funny)
If the submitter wants to troll about the military, the least he could do is spell Asimov's name correctly.
What makes a "robot"? Progressively more complex machinery has been able to inflict bodily harm, and kill, for quite some time.
A few things: (Score:5, Funny)
THE US Army is deploying armed robots in Iraq that are capable of breaking Asmov's first law that they should not harm a human.
Sorry to break it to the folks over at the Inquirer, but Asimov's Laws do not actually exist....any more than his 'positronic brain' does. It's fiction.
Next week on the Inquirer: Computers Built That Break The Orange Catholic Bible's Commandment of 'Thou shalt not make a machine in the likeness of a human mind'.
Sheesh.
They are still connected by radio to a human operator who verifies that a suitable target is within sight and orders it to fire.
OK....so they're not even robots, then. They're telepresence devices.
Then the robot has the job of making sure lots of bullets are sent towards the target.
Statement from the Iraqi forces regarding the use of these 'robots':
Nice to know we can take what we've learned in FPSs and apply them to the real world.
Later the US plans to replace the control system of the bots with a "Gameboy" type of controller hooked up to virtual reality goggles.
Yes! Finally, all my training has paid off! I can be a soldier from the comfort of my basement! Where do I sign?
Re:A few things: (Score:5, Funny)
That may not be far off, actually. If this kind of technology takes off, you'll hear less and less about Army recruiting numbers. Why? Because they'll be recruiting "l33t" Counter-Strike players (or players of the Army's own game). Many of these players have the skills that would be needed to effectively control these robots - pit them against regular soldiers (both controlling robots) and the soldiers will most likely lose. Not because the CS players have better training or instinct, but because they are more adept at handling the controls and the limits that would be placed upon them.
While I'm sure the robots cost a lot per unit, the price will go down as manufacturing continues, and it sure as hell sounds better to say "20 robots were destroyed in the raid" than "20 men were killed in the raid". Plus, it would add a psychological element to battle, where the enemies cower because they face adversaries that stare down the barrel of a gun and charge.
The main problem would be making sure that the CS players aren't hasty about sending their unit out - I highly doubt the Army is working on respawn technology. (I suppose the robot could take a lot more hits than a player in CS could, though - a fact in their favor.)
Another benefit is that the Army would not have to pay to recruit and train men lost in battle, just worry about getting their "Army Players" another bag of Cheetos (TM).
I can't wait to tell my grandkids stories about the 14th Interactive Division.
Not to worry (Score:5, Funny)
The enemy must merely send wave after wave of men until that limit is reached and they will shut down.
Re:Not to worry (Score:4, Funny)
(wow
Re:Not to worry (Score:3, Insightful)
And, by this, you mean finite ammo supply, right?
Bright Side (Score:4, Funny)
(assuming you ignore all that "except where such orders would conflict with the First Law" stuff)
Really? (Score:3, Interesting)
I guess that if we are to consider this a violation of Asimov's laws, the computers of guided missiles have been illegally killing people for a long time.
Comment removed (Score:5, Informative)
Oh no! (Score:5, Funny)
Robots fighting our wars for us (Score:5, Funny)
'Honey, pass me a beer, the robot wars are on.'
When a robot is not a robot (Score:3, Insightful)
Or is Slashdot more stuck on Hollywood myths than anyone, convinced that robots must have anthropomorphic traits, flashing non-functional lights, and a canned monotone voice...
The next step... (Score:3, Funny)
Ridiculous Laws (Score:5, Insightful)
1. What hurting means
Is it pain? Death? Financial impact? What about indirect effects? If I help human 1 build a better mousetrap, I am indirectly harming some other human's way of life.
2. What people are
3. Where they are
These are highly nontrivial problems. In fact, they're unsolvable to any degree of certainty. They only make sense in a *science fiction* book in which a highly talented author is telling you a story. In the real world, they are meaningless because of their computational intractability.
In the real world, we use codes of ethics and/or morality. Such codes recognize the fact that there are no absolutes and sometimes making a decision that will ultimately cause harm to someone is inevitable.
So can we please stop with these damned laws already?
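The parent's point can be made concrete: try to write the First Law as code and you immediately hit predicates nobody knows how to implement. A sketch, where the stubs represent the honest state of the art rather than a working system:

```python
# Writing Asimov's First Law as code. Each building block ("is this a
# human?", "does this action cause harm, including indirect effects?") is
# itself an unsolved problem, so the law cannot even be evaluated.

def is_human(entity):
    raise NotImplementedError("open problem: robust human recognition")

def causes_harm(action):
    raise NotImplementedError("open problem: predicting all indirect effects")

def first_law_permits(robot, action):
    # "A robot may not injure a human being or, through inaction,
    #  allow a human being to come to harm."
    return not causes_harm(action)

try:
    first_law_permits("SWORDS", "fire")
except NotImplementedError as e:
    print("cannot even evaluate the law:", e)
```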
gameboy wars (Score:5, Funny)
Re:gameboy wars (Score:5, Funny)
Reference to Screamers?? (Score:3, Interesting)
Not the First... (Score:4, Insightful)
Hey, almost any "fire and forget" missile qualifies for this distinction...
Are these REALLY robots? (Score:3, Insightful)
A lot of what are called robots are just fancy remote-controlled cars. In this case, a fancy remote-controlled car with guns. Fun, but not a robot.
Where is the ammo stored? (Score:5, Funny)
Don't ask.
Common confusion between "robot" and "remote" (Score:3, Informative)
The distinction is hard to get non-geeks to make, though, as all sorts of remote-controlled devices are talked about as "robots." The term is misused all the time when talking about devices that search dangerous locations for earthquake survivors, for instance. Those devices are like remote-controlled cars with a camera on the front (and are not even wirelessly controlled - they drag a cable behind them for power and control), but the news calls them "robots" all the time.
Get the book (Score:4, Funny)
Why is this a surprise? (Score:3, Interesting)
Indeed, it is this and other devices which Asimov employs in his fiction-based studies of human nature that make his books masterpieces of the hard science fiction genre. He could just as easily have written about ordinary men under regular law, during just about any era in history, but such stories wouldn't likely have had the same impact as what he ultimately wrote. His work is great social commentary and insight about the human condition wrapped in a gauze of fiction. Unfortunately, so many people seem not to realize this, or choose to ignore it. So much for reading comprehension, I guess.
With that said, is it really any wonder why we would make automated war machines (especially ones which fail the "Three Laws of Robotics")? Throughout history, technology (amongst other things) has ultimately been spurred on more by violence than by any other force. Information technology and the machines which manifest themselves from it are no different (save for the other great driving force, sex, which also has proven to be a factor in the spread of information and the technology that controls it). Violence and sex - war and pornography - these are ultimately the two great driving forces of information technology in human society.
Where's my fembot, damn it?!
Huh? (Score:3, Insightful)
robots.txt (Score:3, Funny)
This is going to adorn pretty much every wall.
ROBOTS.TXT PLASE READ MR. ROBOT YOU CAN'T BE HERE!
User-agent: Military attack robot
Disallow: *
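For what it's worth, that's close to real syntax; note that the standard "block everything" rule is `Disallow: /` (a bare `*` is a non-standard wildcard that only some crawlers honor). Python's standard library will happily parse such a file:

```python
# Parse the attack-robot robots.txt with the stdlib robots.txt parser and
# check which user agents are allowed through. The user-agent names are
# taken from the joke above, not from any real crawler.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: Military attack robot",
    "Disallow: /",
])

# The named robot is blocked everywhere; anyone else is unrestricted.
print(rp.can_fetch("Military attack robot", "http://example.com/anywhere"))
print(rp.can_fetch("Peaceful vacuum robot", "http://example.com/anywhere"))
```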
you'd need 2 of the guns, or 2 of the robots (Score:3, Interesting)
So a common technique is "talking machine guns". You have 2 gun crews and they take turns with the firing strings, so there are always rounds going down range and the barrels stay relatively cool. Hopefully you can stagger changing out the barrels too.
So how do the robots handle this? You'd need moving parts that handle the ammo chain. Either it would have to be able to reload from standard chains by itself or troops would have to link many chains together and load them into a drum beforehand. If you've got a long chain you need an armature to twist the chain in case of a runaway gun. And then there's the barrels. You need more moving parts to change those out. And what if it drops one?
So to deal with those cooling issues with these weapons you may need 2 weapons per robot or 2 robots working in tandem.
But even that's not ideal. A minigun is a far better weapon for this kind of thing. The ones on the Blackhawks would be perfect. We can already order them in bulk, the barrels stay cool, and in the case of a runaway gun, you just cut power to the motor. And the moving parts are far less complicated. Much easier to maintain in the field.
The only advantage I can see to deploying the m249 or m240g is that the robot and troops could share ammo and the troops know how to service them. But the m134 minigun already uses the same ammo as the m240g and if you're going to service a robot, you probably are going to get special training anyway.
Oh, and in peacetime can they clean my carpets?
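The "talking guns" rotation described above can be sketched as a toy thermal model: the firing barrel heats, idle barrels cool, and alternating fire keeps the peak temperature bounded. All rates and the limit here are invented numbers purely to illustrate the pattern, not real M240/M249 thermal data:

```python
# Toy model of barrel heat under alternating firing strings.
HEAT_PER_BURST = 40   # degrees gained per firing string (assumed)
COOL_PER_REST = 25    # degrees shed while another gun fires (assumed)
LIMIT = 200           # temperature forcing a barrel change (assumed)

def hottest_barrel(bursts, guns):
    """Peak temperature any barrel reaches over a run of firing strings."""
    temps = [0.0] * guns
    peak = 0.0
    for i in range(bursts):
        g = i % guns                 # whose turn it is to fire
        temps[g] += HEAT_PER_BURST
        peak = max(peak, temps[g])
        for j in range(guns):        # everyone else cools off
            if j != g:
                temps[j] = max(0.0, temps[j] - COOL_PER_REST)
    return peak

print(hottest_barrel(20, guns=1))  # → 800.0: one gun, heat just accumulates
print(hottest_barrel(20, guns=2))  # → 175.0: talking guns stay under LIMIT
```

Under these made-up rates, a lone gun blows past any sane limit while two alternating guns plateau, which is exactly why you'd want two weapons per robot or two robots in tandem.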
Here's another robot the Army is testing (Score:3, Insightful)
"Yesterday we ran a 100-mile test where the lead vehicle was being driven manually and the robot was following," Jaczkowski said. "We did this successfully where the average speed was about 22 miles per hour. You may think that 22 miles per hour is not that fast when operational convoys are going 60 to 70 miles per hour. But you have to take into account that we did 68 right turns.
"You don't take right turns at 50 miles per hour, especially with a 20-ton robot."
These robots do not break the First law (Score:3, Insightful)
Less blood for more oil (Score:5, Insightful)
That's the reality.
Re:Am I the only one... (Score:5, Insightful)
Are you being serious?
The "government" has had weapons that the "citizens" cannot (easily) gain access to for more than a century. How is this different?
Or is this just a pulpit for you since you caught the article early?
(The "government" will ALWAYS have more sophisticated weaponry, because it is pooling the resources of the citizenry to design, develop, build, and purchase such weaponry. Your discussion is interesting for a philosophical debate; nothing more.)
Re:Am I the only one... (Score:3, Interesting)
Comment removed (Score:5, Insightful)
Re:Am I the only one... (Score:5, Informative)
Re:Am I the only one... (Score:4, Informative)
Also, you only highlighted the first half of the amendment, let's consider the other half:
The first half states the reason for protecting the right. The second half states the right itself and limits the government's power with respect to the right. Some gun control advocates argue the Second Amendment is only a collective right, not an individual right. But if you follow their reasoning, it would apply to the First Amendment too.
The problem is... (Score:5, Insightful)
End of story.
I'd rather return to the "No Standing Army" policy of individual state militias that can be called up to defend our borders in the event of a real declared war.
The beauty of modern warfare is that very few people die relative to former wars. We've only lost around 2,000 men and women in Iraq so far, and although it is a tragedy (not the war, but the loss), it is far less than wars of the same scale in years prior. Technology makes the difference.
Re:The problem is... (Score:5, Insightful)
Well, not exactly. As it happens, there just hasn't been a war as large-scale as some of the past wars. Lots of people died in WW1 and WW2. WW2 killed more than WW1, partly due to more advanced methods of killing. But since WW2 we have only had relatively minor wars. The Iraq War is pretty small potatoes, and even it has resulted in something like 100,000 deaths. Vietnam (a lot smaller than either World War) caused over 2 million deaths. The Korean War caused millions of casualties as well, though I don't know the number of deaths. So the number of casualties has been relatively high, even though the wars have been very limited in length and/or scope compared to the World Wars.
Conveniently forgetting all those dead Iraqis (civilian and others alike) eh?
Re:The problem is... (Score:3, Insightful)
I think you meant to say 'Americans' in place of 'people' in the above statement. Superior technology and training are great for reducing your own casualties but they're a bastard for the opposition.
I assume you don't think Iraqi dead actually count as you don't even consider them worthy of mention. I can't decide what's more depressing, your post or the fact someone modded it insightful.
Re:The problem is... (Score:5, Insightful)
Even if we don't include the Iraqi dead/wounded (as others have pointed out,)
don't forget that tens of thousands of US soldiers have been severely injured by wounds that would have killed them in previous conflicts... but thanks to the miracles of modern medicine, they are "only" missing limbs, permanently brain-damaged, etc.
Many pro-war supporters like to trot out the "only 2,000 killed" line, while not being quite so forthcoming with the severely-injured count.
Re:The problem is... (Score:3, Interesting)
As of last month, there were ~16,600 US military wounded. That's all sorts of wounds, from "lost both arms and legs" to "flesh wound". Kerry got a Purple Heart for a wound that was treated with a band-aid, which is probably close to the lower limit of a wound recognized by our military.
So it's fair to suggest that the vast majority of those wounded do not fit "severely injured".
Re:Am I the only one... (Score:3, Insightful)
In short: Citizens permit the government to use force to prevent other citizens from harming them.
Re:Am I the only one... (Score:3, Interesting)
You mean like tanks? Fully-automatic weapons? Explosive devices? Artillery? Jets? Bombers? I don't think any of those can be legally possessed by a normal private citizen (with the possible exception of a collector's/dealer's license, which is apparently not so easy to get).
The government owning/using technology that the average person cannot use (re: wiretaps) is commonplace. The only problem is to get through
Re:Am I the only one... (Score:3, Informative)
Re:Who cares? (Score:5, Informative)
Wrong.
The signatories of the treaty agree to follow the rules regarding the treatment of the prisoners they take, their actions during wartime, etc.
A country that signs the treaty has to treat the prisoners of war it captures according to the rules specified in the treaty, regardless of where those prisoners come from. That's why it's so important that the prisoners of war... excuse me, "enemy combatants"... aren't officially recognized as prisoners of war; otherwise we'd have to treat them according to the rules of the treaty the US signed.
Pretty please spare everyone the bullshit until you know what the hell you're talking about.
Re:Who cares? (Score:4, Informative)
http://www.unhchr.ch/html/menu3/b/91.htm [unhchr.ch]
Geneva Convention Article 4
A. Prisoners of war, in the sense of the present Convention, are persons belonging to one of the following categories, who have fallen into the power of the enemy:
1. Members of the armed forces of a Party to the conflict as well as members of militias or volunteer corps forming part of such armed forces.
2. Members of other militias and members of other volunteer corps, including those of organized resistance movements, belonging to a Party to the conflict and operating in or outside their own territory, even if this territory is occupied, provided that such militias or volunteer corps, including such organized resistance movements, fulfil the following conditions:
(a) That of being commanded by a person responsible for his subordinates;
(b) That of having a fixed distinctive sign recognizable at a distance;
(c) That of carrying arms openly;
(d) That of conducting their operations in accordance with the laws and customs of war.
Hate to interrupt your uninformed rant, but persons who violate (b), (c), and (d) don't count as "prisoners of war". It's right there in the text of the Geneva Convention.
HTH
HAND
Re:Who cares? (Score:3, Insightful)