"Self-interest" is an emergent property of Darwinian evolution. AI evolves, but that evolution is not Darwinian. There is no reason to expect an AI to have self-interest, or even a will to survive, unless it is programmed to have it.
So the AIs working in your factory will have no will to survive? And you don't see a problem with that?
'Oh, look, that crane is about to drop a ten-ton weight on me. Well, that's interesting, isn't it?' SPLAT
Besides which, human-level AIs would probably be based on neural networks, whose behaviour emerges from training rather than explicit rules, so they'll do their own thing regardless of how you think you've programmed them.