According to the article, Oxford's Nick Bostrom defines "superintelligence" as "any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest."
Shouldn't we, the human race, hope to have some sort of "superintelligence" some day, or do we want to stay just as we are, eon after eon? And if we do want "superintelligence" some day, are we supposed to wait until we evolve (and then would we be afraid of homo superior, like in so much pulp science fiction?), or should we go ahead and try to achieve it through AI?
I say go ahead and try to achieve superintelligence through AI. There's been a lot of research on human behavior aimed at finding out why we do irrational things that may cause long-term harm. That needs to be taken into account if we develop conscious AI, to be sure. There's a lot that still needs figuring out about human nature (denial, spitefulness, prejudice...), but of course there's a lot that needs figuring out about AI too, so maybe our understanding of both will develop more or less in step. I'm aware that sounds incredibly optimistic, but what's the alternative?