I'd be willing to bet that said data will show that the vast majority of accidents happen just after the driver takes control, and are a direct result of driver actions.
Just like the majority of aircraft incidents are blamed on "pilot error" because, well, there was a pilot on board and he didn't stop whatever bad thing it was from happening. The autopilot went south, ran the elevator trim to full nose-up, and the pilot couldn't get the nose back down before the plane went into a stall/spin/crash/die? That was his error. Or he failed to cancel his flight because he didn't detect the problem before taking off. That's "pilot error" too, just worded as "improper preflight".
So from what you say, as a potential driver of an autonomous vehicle, whenever it barfs and tries to hand control over to me, I should refuse. Otherwise, when I can't fix whatever situation the car has gotten me into, it will be my fault ("accidents happen just after the driver takes control"). To keep from being sued for the accident, I'll have to take the position that "hey, I was never in control, it was Google that failed, sue them."
Of COURSE a large number, even a majority, of accidents in autonomous vehicles will happen "just after" the vehicle has bailed out on the driver and said "tag, you're it". Especially to those drivers who want to abandon responsibility for their own safety and sleep instead of drive. "I said WAKE UP human, I can't deal with thi.... (sound of bending metal) oh never mind."