There is no reason for an AI to kill us. Biological life forms created via evolution have an instinct for self-preservation: they perceive threats, both emotional and physical, and have been programmed to respond to those threats.
AI created by us will have no such impulses. No ego. No self-preservation instinct (since we won't program one in, and it serves no purpose). So what on earth could be the reason for them to kill us? The only reason I can think of is if some human being specifically programs them to do so.
I'm not saying that a human being will never program an AI to kill us. I'm saying that assuming AI will eventually kill us, and treating that as a foregone conclusion, is illogical.