such a ban would probably be unenforceable (Score:2)
if such devices are undetectable -- because their functionality is not detectable when in use -- how could a ban be enforced?
Re: such a ban would probably be unenforceable (Score:1)
Obeying such a ban would be like Ukraine giving up nuclear weapons in 1994 and then getting invaded by Russia 20 years later. Until the world is a place full of liberal democracies, banning this type of weapon or that type of weapon merely gives a huge advantage to ruthless powers like Russia and China.
Re: such a ban would probably be unenforceable (Score:2)
I am seriously afraid that the EU might one day be invaded because they will have so regulated their military to death that Russia will be able to walk in and hardly fire a shot.
History is full of
Re: (Score:2)
EU might one day be invaded
Unrealistic. The EU's and Russia's money is intertwined.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Until the world is a place full of liberal democracies,
Hardly. You can build one in your garage tomorrow. It won't be an Aerial HK of Terminator fame, but it'll do the job. I personally wouldn't give it face-recognition software and a license to kill with me out of the loop, but that's just my personal choice, fearing for my own life; it wouldn't be hard to do. The hard part would be making it reliable, but maybe individuals with an agenda don't care?
The problem with these proposed bans is:
1) It's so easy
Re: such a ban would probably be unenforceable (Score:2)
I'm talking about the regulation of nation states and their agencies, not private individuals. And with regard to their militaries, it's foolish for one nation to give up a huge advantage if an adversary will not.
Re: (Score:2)
I think that reading Asimov would make for an interesting and intriguing discussion on that point, given how fuzzy this can be.
We have had heat-seeking missiles for decades now, and that's a kind of autonomous weapon.
Smarter weapons could carry a rule like "don't kill humans", but if someone redefines what counts as human, you could get a bad result. Hence the Asimov reference. And such a weapon would have been loved by many throughout history.
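A minimal sketch of that failure mode, with entirely made-up names and logic: the hard-coded rule never changes, but the predicate feeding it does.

```python
# Toy illustration of the Asimov worry: a hard-coded "don't kill humans" rule
# is only as good as the is_human() predicate behind it. Hypothetical code.

def is_human(entity: dict) -> bool:
    # Imagine a later "update" quietly narrows this definition...
    return entity.get("species") == "human" and entity.get("citizen", True)

def may_engage(entity: dict) -> bool:
    return not is_human(entity)  # the "law" itself never changed

villager = {"species": "human", "citizen": False}
print(may_engage(villager))  # True: the rule held, the definition didn't
```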
Re: (Score:2)
I don't get your point, but maybe that was a typo and the last part was supposed to be reversed.
Having said that, I think the main objection is that programmable devices can be reprogrammed. Even if the program is in ROM, the ROMs can be replaced. Therefore any "sufficiently smart" weapons system could go off half-cocked in an "autonomous" manner. The problem is that if it's supposed to go after the enemy, you have real problems with the false positives. Not that you'd call it friendly fire, but it would st
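On the terminology: a rough toy example (hypothetical scores and threshold, no real system) of how the two error directions differ for a target classifier; friendly fire is the false-positive side.

```python
# Toy illustration: why misclassification direction matters.
# False positive = friendly/civilian wrongly flagged as a target;
# false negative = a real target that is missed.

def classify(signature: float, threshold: float) -> bool:
    """Pretend sensor score; above threshold means 'engage'. Hypothetical."""
    return signature > threshold

# Hypothetical labelled readings: (sensor_score, is_actually_hostile)
readings = [(0.91, True), (0.72, False), (0.55, True), (0.30, False)]

threshold = 0.6
false_positives = sum(1 for s, hostile in readings
                      if classify(s, threshold) and not hostile)
false_negatives = sum(1 for s, hostile in readings
                      if not classify(s, threshold) and hostile)

print(f"false positives (friendly-fire risk): {false_positives}")
print(f"false negatives (missed targets):     {false_negatives}")
# Lowering the threshold trades missed targets for more friendly-fire risk.
```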
Re: (Score:2)
Re: (Score:2)
Re: such a ban would probably be unenforceable (Score:1)
I support enforcement by absolute threat of destruction of any operators. MAD worked for decades, and assassination or other devastating global reply taking out an entire political and military leadership caste should be good weight to keep them from developing or using these weapons. Or just feed those same leaders to the Screamers that they created.
Limited use cases (Score:2)
I could see these being used by police in some circumstances. I would NOT want autonomous weapons being designed or used for warfare. Except possibly if they only target other autonomous weapons, but even that is a bit much.
When is a weapon autonomous? (Score:2)
Is a heat-seeker autonomous? Once launched, it picks its own target, which is, hopefully, the one intended by whoever launched it.
There's a quagmire lying in the definitions here.
Re: (Score:2)
No. The heat-seeking missile is not roaming around looking for targets. It is shot - by a human - against an intended target.
The heat-seeking part is for accuracy.
CIWS (Score:2)
The US Navy's Phalanx anti-aircraft and anti-missile systems are completely autonomous, as a missile can come in too fast for a human to react to. Of course most of the time they are turned off, but when entering hostile territory they are activated.
https://en.wikipedia.org/wiki/... [wikipedia.org]
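A back-of-the-envelope illustration of the reaction-time argument; the detection range and closing speed below are assumed round numbers, not official specifications.

```python
# Rough reaction window for a terminal defense system (illustrative numbers).
# A sea-skimming missile may only be detected near the radar horizon.

detection_range_m = 15_000   # assumed detection range: 15 km
closing_speed_ms = 680       # assumed roughly Mach 2 at sea level

time_to_impact_s = detection_range_m / closing_speed_ms
print(f"time from detection to impact: {time_to_impact_s:.1f} s")
# Roughly 22 s total for detection, classification, slewing, and sustained
# fire; hence terminal defenses like Phalanx run in an autonomous mode.
```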
Re: (Score:2)
A heat-seeking missile was fired by a human, either a drone operator or a pilot. They can be prosecuted for the results of their actions, including for any war crimes. An actual autonomous weapon is more like a mine that is indiscriminate with respect to time, but adding place too, and this lack of predefined scope is what makes it different. The operators may claim "I didn't commit genocide, etc. The weapon chose when/where/how to fire". That is the element that introduces problems. The only groups seriousl
Re: (Score:2)
A human operator turns any robot on and off. If that's the definition then no weapon is autonomous because there was always a human who chose to set it loose.
Just put up signs: "autonomous weapons free zone" (Score:2)
Are people really so naive to think that words on a piece of paper will stop bad people from doing or creating bad things?
Re: (Score:2)
Are people really so naive to think that words on a piece of paper will stop bad people from doing or creating bad things?
Yes, people really are that naive. It's the reason we end up with laws that prohibit you from doing things that are against the law already.
Re: (Score:2)
Exactly. Anybody remember those nuclear disarmament treaties of the Cold War? What good did they do?
It couldn't work, but it might help (Score:1)
While there's no chance of a ban actually stopping the bad guys (Iran, Russia, USA, UK, EU, err, well everyone actually, except possibly the late Mother Teresa) from making any particular sort of weapon (and anyway some arsehole can always reverse a ban), at least it cramps their production and deployment.
On the fence (Score:1)
There are two things at play here:
A) Deciding who / what / where to pick as a target. It seems we still prefer to keep humans in the loop here.
B) Doing the actual job of killing / bombing / destruction etc. It seems we're already comfortable using (smart?) machines for this purpose, ranging from machine guns to Predator drones, Tomahawk missiles and more. But ultimately still with a human selecting the target & pressing the "go" button. Or at least I'd hope / like to think so...
Autonomy imho means leavi
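A minimal sketch of that A/B split, with hypothetical names throughout: the software may track and rank targets (B), but weapon release is gated on an explicit human decision (A).

```python
# Sketch of a human-in-the-loop gate. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    confidence: float  # classifier confidence that this is a valid target

def rank_tracks(tracks: list[Track]) -> list[Track]:
    """The machine does the sorting; it holds no engagement authority."""
    return sorted(tracks, key=lambda t: t.confidence, reverse=True)

def engage(track: Track, human_approved: bool) -> str:
    # The gate: no human approval, no release, regardless of confidence.
    if not human_approved:
        return f"track {track.track_id}: HOLD (awaiting human decision)"
    return f"track {track.track_id}: weapon release authorized by operator"

tracks = rank_tracks([Track(1, 0.97), Track(2, 0.41)])
print(engage(tracks[0], human_approved=False))  # machine alone cannot fire
print(engage(tracks[0], human_approved=True))
```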
It's futile (Score:1)
Who is the target? (Score:1)
Is it going to kill some teenager sneaking back into the house at night?
Is it going to just decide to open fire on a crowd because a sensor sent a bad signal?
Is it going to try and use facial recognition to target people/enemy soldiers/'terrorists'?
Or is it going to be programmed to take out political leaders the programmers don't like without leaving a trace of who sent the robot or told it who to target?
hilariously naive (Score:2)
Would you rather a robot fight a robot, or be drafted into the military to kill other humans?
I mean, come on people. Pull your heads out of your asses.
Prohibition doesn't work. (Score:3)
Not for drugs.
Not for weapons.
Just... not.
For good or ill people and countries will find a way to get what they want, no matter what. You can and should try and ban the worst of it, but it will never be very successful.
Re: (Score:2)
Two things come to mind here for me. The first: back in the early 2000s some nut from New Zealand created a website where he was working out how to create a DIY cruise missile using off-the-shelf components. He got pretty decently into the project before the US Federal Government leaned on the government of NZ to have him shut down. Secondly, we're now seeing videos and photographs of rather sophisticated DIY weapons systems in conflict zones. Not simply crude shop-made guns, but entire armored vehic
Autonomous precision beats human slop. (Score:2)
Fear of the exotic by those ignorant of war is understandable, but combat has been getting "cleaner" since the advent of modern precision weaponry.
Consider artillery and aerial bombs. In the bad old days one had to saturate target areas to have some chance of success. Look at WWII bomb damage photos then compare to modern PGM strike footage. We no longer have to destroy a city to take out a factory.
PGMs kill far fewer civilians than area bombardment, and autonomous PGMs won't tire or make emotional decisions.
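The arithmetic behind that claim can be sketched with the standard circular-normal accuracy model; the CEP and lethal-radius numbers below are illustrative assumptions, not historical data.

```python
# Precision vs saturation, using the standard circular-normal model:
# P(hit within R) = 1 - 2**(-(R/CEP)**2). Numbers are illustrative only.
import math

def hit_probability(lethal_radius_m: float, cep_m: float) -> float:
    return 1 - 2 ** (-((lethal_radius_m / cep_m) ** 2))

def weapons_needed(p_single: float, p_desired: float = 0.9) -> int:
    # Independent shots: choose n so that 1 - (1 - p)**n >= p_desired.
    return math.ceil(math.log(1 - p_desired) / math.log(1 - p_single))

lethal_radius = 30.0  # assumed lethal radius against the target
for label, cep in [("WWII-era bomb", 1000.0), ("modern PGM", 10.0)]:
    p = hit_probability(lethal_radius, cep)
    print(f"{label}: CEP {cep:>6.0f} m, single-shot P ~ {p:.4f}, "
          f"weapons for 90%: {weapons_needed(p)}")
# Thousands of dumb bombs vs a single PGM for the same kill probability,
# which is the "we no longer destroy a city to take out a factory" point.
```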
See... (Score:1)
Re: (Score:2)
... Babylon 5 s01e04 "Infection". tl;dr/SPOILER: the characters come across an autonomous defense mechanism of an ancient, extinct civilization, programmed by populist zealots, that had classified its own entire civilization as "unpure" and wiped it out.
Wouldn't that be tl;dw (too long; didn't watch)? Or, I guess around here, it could be tl;dr (too long; didn't rewatch)...
Re: (Score:1)
... Babylon 5 s01e04 "Infection". tl;dr/SPOILER: the characters come across an autonomous defense mechanism of an ancient, extinct civilization, programmed by populist zealots, that had classified its own entire civilization as "unpure" and wiped it out.
"No one is pure. No one!"
lol (Score:2)
A human is an autonomous weapon. The most dangerous one in human history.
Missing option. (Score:2)
The natural next step... (Score:1)
Yes, but it's unenforceable (Score:2)
Nice countries will respect it. Asshole countries won't give a damn and will do whatever gives them the most advantage. Therefore everyone will likely end up having them, or get mowed down by robo-soldiers.
Isn't the whole point of warfare... (Score:2)
... that you don't give a fuck who banned you from what?
Also, my vote is for: ONLY autonomous weaponry! But ONLY against other autonomous weaponry.
And if I could enforce that, I'd also say: NO humans allowed on the battlefield!