Actually, the singularity is not really about exponential growth -- sure, Ray Kurzweil writes about exponential trends a lot and uses them as evidence to get people excited, but the core idea of the singularity is that once we can create an AI at or above human level, it can modify itself to become "smarter", that modified self can modify itself in turn, and the process continues recursively.
"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an âintelligence explosion,â(TM) and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make."
Also, I don't know about Cerf, but Kurzweil definitely doesn't use this idea to incite fear -- if anything, he's been criticized for being too optimistic.