Unless AI robots rapidly get vastly more flexible, it seems to me that large armies of nasty humans are still a much bigger threat (albeit one we've lived with since time immemorial).
But I suppose in the end, folks think really clever, self improving robots will win the day: https://en.wikipedia.org/wiki/...
Years ago Bill Joy warned everyone about self-replication ("grey goo"). Self-replicating *and* self-improving robots seem like much worse ideas than simply arming them.
Use case: consider some sort of waste repository (nuclear, biological, something really, really bad to let loose... a zombie virus, perhaps?). Say we've designed the facility to last for thousands of years. Folks have already worried about what happens when the warning language is no longer understood, etc. Would a "killer robot" as a last line of "defense" be worse than letting the genie out of the bottle?