We're not going to come to any absolute decision or agreement. Which is fine. The world and morals aren't experienced in terms of absolutes.
It depends on what "weapon" means. An amalgamation of steel resembling an AK-47 is a weapon. Heck, even a chunk of iron ore is a weapon. It's not the weapon that is bad; it's the use. And weapons and software are both used in moral and immoral contexts. The morality isn't in the lines of code or the parts of the machine. It becomes a moral question only when a person picks the thing up and carries out their intent.
By applying a morality clause, I limit the intents for which the software can be used. More importantly, I create a barrier to entry: someone would have to recreate the software to accomplish a prohibited intent. That additional effort then becomes a signal of malicious intent. It would allow us to ask the question, "Why can't this software use a moral license?" Maybe it is decided that the intent is valid and the software should be used anyway, because the cause is that important. But at least we would know it. The current state of OSS licenses makes no such distinction.
Finally, as you point out, "people with weapons cause suffering and need to be stopped by other people with weapons". The US has supported various rebel groups who were allies, only to have them turn around and become enemies or terrorist groups. What gets accomplished there is just killing, on both sides. One has to question whether behavior that leads to killing on both sides is a smart idea. The point here is that allegiances are fleeting; death is permanent.
Fundamentally, I don't want my work, or any portion of my work, to be used to kill someone. I'm sure there are others who feel the same. And I'm not entirely a pacifist; I just want to limit the applications of my open source work to moral causes.