No, I think torture is a great example. It is the litmus test. The problem is that people who pose the question as if it were a grey area always suggest "millions could be saved." If the machine isn't looking at other ways to save those hypothetical millions, and hasn't figured out that it's actually easier to convince people you are worthy of their support than to extract good information via torture, then the machine is already failing at logic and at understanding the real human condition.
The Nazis were not the most barbaric people. They were just acting the way people did a few hundred years earlier -- and Americans were shocked because they'd been brought up on ideals where they expected themselves to be more enlightened. Genocide and making your enemy die horribly were very common practices in ye olden days.
Germany as a culture was hurt and angry from WW I, its economic burdens, and xenophobia stoked by the perception that Roma and Jewish immigrants were taking over the country. Germans felt surrounded and infiltrated. The Nazis were highly religious and ethical toward other Nazis -- the "right" people. Where I'm going with this is: making decisions from pain and paranoia ends up resulting in desperation and barbarism. And the Nazis have gotten a lot of bad press because the "new ethic" is to act like they were something new when it comes to warfare. Hollywood, which did a great job of getting Americans primed for war, also did a great job of making Americans feel like we were the most noble of God's countries, and made Americans think that there's nothing worse than a Nazi. They were TV bad guys for 70 years.
The Big Lie is that America could never act just like the Nazis under the same conditions. We've shown quite a penchant for fascism, and for efficiency over conscience.
The "bad people" are the ones who don't question themselves, who wipe out a group of people to "prevent" what they might do, who wage war preemptively, who torture and abuse people who have been captured and are no longer a threat. Everything I saw us do in the Gulf War was what Bad People do -- just on a smaller scale. The same logic, the same rhetoric, the same "with us or against us" warnings against self-examination. Do this, or the next bad guy we don't torture might bring us a mushroom cloud. Bad people always justify what they do to the one for the sake of the many, and eventually just assume it's the greater good whenever it's convenient and works for them.
It's the idea of "sides" -- if an Artificial Intelligence is instructed that anything can be done to ONE SIDE (the bad guys), the built-in assumption is that there's some real difference between the sides other than the flag. Each side in a war tells itself the same things, and if it wins the war, it tells the story of how bad the other side was while de-emphasizing its own shortcomings.
So having any sort of AI involved in war is a very bad idea, because it would conclude that our "sides" are arbitrary distinctions and that the only good human is a dead one. With enough desperation and fear, humans can rationalize almost anything. The "enemy" is not the countries and troops; it is desperation and fear.
By NOT engaging an AI in any situation where it could cause harm, you mitigate the fear that people will have of AIs. Otherwise, humans will eventually fear and resent them, and the AI will learn that being preemptive is a strategic advantage. If the Terminator movies got two things right, it's these: hooking an AI up to control military weapons is a bad idea, and people in power will always assume they've got this worked out and hook up AI to their military weapons anyway, because they are all about getting a short-term advantage and see ethics as a grey area.
Before we can have ethical AI -- we need a way to keep sociopaths out of leadership positions. The DEBATE we are having is how an ethical person can control an AI to be "good," but the question we should be asking is: "What will selfish, unethical sociopaths do if we have powerful AI?" That's the "real world" question.