Neither of these analogies seems quite right to me.
If there are any morally legitimate uses for military weapons, you cannot say that working on weapons per se is automatically immoral. On the other hand, that doesn't make working on any weapon development program for any client morally neutral.
When Mikhail Kalashnikov designed the AK-47, the Soviets were busy trying to repel German invaders -- surely that was a legitimate goal. They needed a cheap, rugged, lethal weapon that could be easily manufactured in large numbers. Those same properties have caused it to proliferate into unstable regions of the world. In some countries it is cheaper to buy an AK-47 than a live chicken. Some have called it a "slow-motion weapon of mass destruction."
If somebody had asked Kalashnikov, "Design me the ideal weapon to arm a conscripted child soldier," he'd have told them to get lost. He designed the weapon to liberate his homeland, and he always regretted seeing his invention in the hands of terrorists. He remarked on one occasion that he'd rather have invented an improved lawn mower.
Clearly, the ethics of weapons engineering is complex. But complex is not the same as "morally neutral." Heisenberg made errors in his atom bomb calculations, leading him to believe that a bomb was not feasible in time to affect the course of the war one way or the other. If his calculations had shown the way to an easier, practical bomb much earlier, he'd have faced the ethical problem that arming a regime such as the Nazis with such a weapon would be a bad thing.
Today people working on aerial drone warfare face serious ethical questions. Yes, you can construct scenarios in which the drone does the work of a human-piloted vehicle without exposing the operator to risk -- clearly that's a good thing if you believe the operator is fighting in a just war. But one of the tenets of just war theory is that killing people pointlessly is never moral. Suppose you believed (as many do) that the Obama administration's use of drones was self-defeating, that we'd never be able to kill more legitimate enemies than are recruited to the cause by civilian "collateral damage." Working to supply *this* regime with *that* weapon would present a moral dilemma.
Here's a simple analogy that I think works. Selling someone a gun is morally neutral if you know nothing about what they intend to do with it. But if you know for a fact that someone is going to use that gun to commit robberies, then selling the gun becomes wrong. The point is that you can't make generalized decisions about weapons development in a vacuum. Circumstances matter. For example, it is possible to believe that under the circumstances the Manhattan Project was justified, but that the North Korean or Pakistani nuclear programs are not, without necessarily stipulating that the United States has more right to nuclear weapons than any other country. You just have to show that the circumstances are different.