The moral "high ground" here (which isn't very high, given that we're dealing with people-killing tech) is that the mine doesn't "decide" to kill. A soldier decides to make a particular area lethal for any human to enter, for a period of time. The odd logic seems to be that an AI that makes some kind of "reasoned" decision to kill a target, in a package deployed by a soldier, pointed at a patrol area with a pat on the back and a "do your best, buddy, shoot the baddies," and that then self-destructs or goes home, cannot possibly be morally worse than a mine that literally just blows up in your face, soldier or orphan child alike, even 50 years after the war ended. But this is a weirdly relativistic argument with a land mine as a dancing partner when, as you said, there are much greater issues to consider, like whether banning them is a good idea or not, which is a much bigger question than a "what about landmines" argument. Both landmines and AI-powered drones with kill/no-kill decision-making are, in a way, no different than bullets or nuclear bombs: used properly, they are a denial of service, to humans, to a particular area, for a particular amount of time, at a particular degree of lethality. But this terrifying Cold War-style thinking of "gotta beat 'em all at everything to protect America" seems childish, warmongering, boomerish, or worse: a deliberate attempt to avoid the slippery-slope discussion of what happens on a planetary scale when nations compete in war with kill/no-kill decision-making handed off to AI brains unleashed upon one another, with nothing more than "keeping up with the Joneses" as their only excuse for building it. I mean, yikes.
Sure, I could imagine a scenario where a nation airdrops drones instructed to kill anyone on two legs, drones that don't have enough fuel to accidentally return home and turn on their own people. So there's no Skynet, just a hellish swarm of death for anyone unlucky enough to be where the drones drop. At worst maybe some get captured and redeployed against the folks that sent them, but that's the exception, and apparently worth the lethal benefit of the millions that complete their mission and self-destruct when they're out of ammo, or return home safe. So this nation has this insane weapon and a significant tactical advantage over a nation that decides not to build it, supposedly out of some sci-fi Skynet fear of friendly fire gone mad, an AI that gets confused by a Johnny 5 lightning strike or something and starts obliterating friendlies. Yeah, not having to "press fire every time" is a tactical advantage, but so are nukes, and we don't let some people have those. But we can have anything? Good to be at the top, I guess. Naturally, the warmongering types will argue that it's just cowardice not to build it: a nation that scared of technology will be at quite a disadvantage when the enemy drones start dropping from the sky, AI-enhanced and killing anything that moves. Pity the bleeding hearts too scared to build the same thing themselves.
And thus we begin the collective death march into the brave new future of hot and cold wars unlike anything the planet has ever seen, its nearest equal being the nuclear arms race, which nearly ended everyone: a march into an era of autonomous AI robot swarms laying waste. And hey, maybe it all works perfectly, "we build 'em better" and we win, Uncle Sam is a happy camper and makes apple pie, and thank god we had smart, patriotic generals with the foresight to realize that a new arms race revolving around AI-autonomous lethal machines was the best way to handle this new development.
On a national level, you can debate and reason out many sides of all this and make good points for ban vs. must-build. Such topics tend to be, like almost everything, divided between pro-war and anti-war folks. The only new thing with AI, as with nuclear, is that it's a technology that gives even the most bloodthirsty warmonger pause, initially because of the possibility of self-harm. But once THAT'S alleviated, whatever pause remains is not enough to NOT actually build it. I mean, that'd just be irresponsible. Gotta be first! As a nation, sure, justifiable. But like religion, under those rules, anything is, easily.
On a planetary level, the whole thing is just embarrassing and dangerous, just like nukes were, and those are barely under control. But yeah, sure, let's welcome in a new class of devastating, world-killing autonomous drone-swarm murderbots because someone else might do it first; it's the only way to keep the balance.
Aliens looking in on us: "lmao jfc wtf"
"The pathology is to want control, not that you ever get it, because of course you never do." -- Gregory Bateson