I find myself yet again in agreement with Hawking. Of course, predicting the future is a great way to find yourself wrong... but we wouldn't be human if we didn't try.
The bottom line is that AI poses a couple of very serious threats to humans. The first is its use by humans as a weapon against other humans for power and control. In the not very distant future it really wouldn't be hard for a small group of people to use AI (and non-AI) systems to essentially control most of the world's industry, production and so forth... and it's not a big leap to posit the possibility of a Hitler-style "solution" being run by some cult or political group.
The second is a little more long-term, but competition for resources would be a real, tangible reason for AI to either directly or indirectly compete us out of existence. If AI ever reaches a stage where it cannot be assailed or "beaten" through warfare, it may very well find itself "forced" to gradually curtail or even eliminate the human population as inefficient... or as a threat. Technically speaking, it may not need to do so in a violent, direct manner; it could simply ensure we don't have children... or that we have drastically fewer of them each generation (allowing us to live out mostly happy lives).
I personally hold out a belief that humans will integrate well before fully capable digital-only AI comes to fruition. I don't think it will be long before we start getting implants and other "aids" connected to our brains... small and discreet at first, but over time becoming more and more integrated, to the point where what is or isn't biological may not even be distinguishable. While I'm a fan of purely biological humans, I think this would actually be the best outcome... and the most likely one.
My greatest fear is that AI does get rid of us... and then does nothing of worth. I think the human capacity to easily and readily imagine things WAY outside of reality may never be achievable in AI. And I question whether AI can ever generate the sense of purpose, desire and direction that has allowed humanity to advance in extraordinary spurts ever since we created our first structured civilisations. When you consider that, genetically speaking, we are basically the same as our wild, lawless animal ancestors, you can imagine just how spectacular our brains and behaviors really are. The "emergent behavior" of the human species as a group may not be reproducible by an AI... and that would be a truly sad loss for the galaxy.
Indeed, perhaps it isn't nukes, environmental suicide or war... maybe AI is the answer to the Fermi Paradox.