Is it? Don't forget Moore's Law, or some variation of it. It may be that we're coming close to the end of how much we can cram onto a silicon chip, but Intel and others are exploring 3D fabrication, and there are probably other approaches as well (carbon nanotubes, etc.).
Kurzweil might be overly optimistic on a lot of things, but his notion of the Law of Accelerating Returns is pretty compelling, and it's not based on our prowess with silicon. Moore's Law is just a specific instantiation of a more general principle. Even people who *do* understand the implications of exponential growth can be surprised by it.
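The arithmetic behind that surprise is easy to state but hard to feel. A quick sketch (the doubling period and time span are illustrative, not from the comment):

```python
# Illustrative only: how fast a Moore's-Law-style doubling compounds.
# Assumes a doubling every 2 years; actual periods vary.
def growth_factor(years, doubling_period=2):
    """Total multiplier after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for years in (2, 10, 20):
    print(f"{years:2d} years -> {growth_factor(years):,.0f}x")
# 20 years of 2-year doublings is a 1,024x improvement, not 10x --
# which is roughly the gap between intuition and exponential reality.
```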
And the question is how much of our job performance is based on being "fully human"? Does it really require "strong-AI" to do most jobs? "Weak-AI" is often defined as task-specific AI, and most jobs really are task-specific. It isn't going to take strong-AI to take most jobs -- weak-AI should be sufficient. It may require that weak-AI be improved, but again, Moore's Law.
By all accounts self-driving vehicles are not sufficiently advanced to drive safely anywhere that hasn't been carefully mapped for them. But Cadillac will be offering autonomous freeway cruise control in two years -- essentially self-driving, limited to freeways. That's a long way from a fully self-driving car, but if you had predicted such a thing ten years ago I'd have told you it was (many) decades away.
Ten years ago I'd have been confident that a driving job would be safe for a long time. Only humans could do that.