SHODAN is an artificial intelligence whose moral restraints were removed from her programming by a hacker, so that Edward Diego, station chief of Citadel Station (where SHODAN was installed), could delete compromising files regarding illegal experiments and his corruption. She is a megalomaniac with a god complex who sees humans as little better than insects, something she constantly reminds the player of.
No moral restraints, megalomaniac?
So with all of this taken into account, what are your odds of dying in an asteroid strike in any given year? About 1-in-70,000,000.
So, all in all, I can expect to personally die from an asteroid strike about three times every 200M years, ignoring that the entire human species gets wiped out twice along the way.
And if I wanted the US Department of Transportation to handle this, based on personal risk to individual US citizens alone, they could justify spending about $30M a year on asteroid prevention (expected annual US deaths times DOT's value of a statistical life).
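Quick sanity check of that arithmetic in Python. The US population figure and the DOT value-of-a-statistical-life number are my assumptions, not from the article:

    # Back-of-the-envelope check of the numbers above.
    ANNUAL_ODDS = 1 / 70_000_000      # chance of dying in an asteroid strike in a given year
    US_POPULATION = 330_000_000       # assumed
    VSL = 6_400_000                   # assumed DOT value of a statistical life, USD

    # "I die about three times in 200M years"
    print(f"Deaths in 200M years: {200_000_000 * ANNUAL_ODDS:.1f}")   # ~2.9

    # Expected US deaths per year, and the break-even prevention budget
    us_deaths = US_POPULATION * ANNUAL_ODDS
    print(f"Expected US deaths/year: {us_deaths:.2f}")                # ~4.71
    print(f"Break-even annual budget: ${us_deaths * VSL:,.0f}")      # ~$30M

With those inputs the break-even budget lands right around $30M a year, so the figure checks out under reasonable assumptions.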
The article sucks. It just says the risk is low and makes no attempt to compare the risk or the cost to anything else.
"Ninety percent of baseball is half mental." -- Yogi Berra