This bizarre model, in which the car drives except when it doesn't, with no clear demarcation between the two, is damned near impossible to make sense of.
If the car decides it's got no idea what to do and just says "you're in charge", and before you even know what's happening you're in an accident... and the logs say "human was driving, his fault", you're screwed. Or, worse, someone builds in code that lies and just says "human was driving" five minutes before any crash is triggered (so they can avoid liability).
Hell, they're already doing just the opposite. Remember the Hyundai Super Bowl commercial? Within a certain speed range, the car will emergency-brake itself to prevent a collision - and that's with a human driver at the wheel.
Given the VW scandal, I think car companies are going to be under more intense scrutiny for a while. The only self-driving cars I've heard of that will toss control to a driver were extreme-alpha builds, manned by professional drivers. Modern self-driving cars have the opposite problem - they're designed to stop safely if there's a problem, and not to proceed if they don't understand what's going on.
Tossing control to a driver while traveling at 50+ mph moments before an accident isn't something any professional is going to allow. "Cattle car" is a good analogy. Worst case, the car stops safely on the side of the road and buzzes for assistance. That's an acceptable failure mode.