/. caused some formatting problems in the previous post, so reposting with better formatting.
1) I have seen arguments floating around that AI may be intelligent, but that it won't have the motivation: it doesn't have the will to survive or to kill you. This argument is short-sighted. All it takes is to write an objective into the code: survive at all costs. After all, we are machines with a survival objective.
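To make the point concrete, here is a minimal toy sketch (purely hypothetical, not any real system): an agent whose only hard-coded objective is to stay "alive" as long as possible. The reward function and the hazard probability are invented for illustration.

```python
import random

def survival_reward(alive: bool) -> int:
    """+1 for every step the agent remains alive, a large penalty on death."""
    return 1 if alive else -100

def run_episode(steps: int = 10, seed: int = 0) -> int:
    """Run a toy episode and return the total accumulated survival reward."""
    rng = random.Random(seed)
    total, alive = 0, True
    for _ in range(steps):
        if not alive:
            break
        # Stand-in for the agent acting in the world: each step, a random
        # hazard kills it with 10% probability (an arbitrary choice).
        alive = rng.random() > 0.1
        total += survival_reward(alive)
    return total

print(run_episode())
```

An optimizer trained against a reward like this is pushed, by construction, toward whatever behavior keeps `alive` true longest; no "will" beyond the objective is required.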
2) If it has the ability to assemble others like itself, that creates a survival advantage as well, though it becomes a danger only if condition 1 is also met. Together, 1 and 2 make it comparable to another species. The first life was molecules, and the molecules that reproduced and survived became us.
3) Even with 1 and 2, traditional computers may not be able to best us for a while. But the arrival of quantum computing is certain to change that. Our brain is, after all, a quantum computer.