Even without self-driving cars, it is quite possible to be found to have zero liability for injuring someone:
It was dark, and you were driving with your headlights on the highway. As you turn a corner, a small kid is out on the street chasing after her ball. You slam the brakes, but you still hit her. Your car was in full working order, and you reacted as fast as reasonably expected. There's a good chance the judge finds no one liable, or perhaps the kid's parents for letting her be in a dangerous situation.
You are driving along, and hit the brakes at a stop light. Your brakes fail and you get into an accident. You've had a recent checkup, and you took all reasonable steps to ensure a safe car. Maybe the manufacturer is responsible, maybe the last mechanic you saw, maybe no one.
The only cases I see for someone being liable for an accident involving a self-driving car are:
1. Not keeping your software updated. It would be like not responding to a car recall.
2. Using unauthorized software, beta software, or software that isn't compatible with your car (including modifying your car).
3. Operating the car beyond its safe operating parameters, like running it in extreme weather (in this case the car should detect the conditions and pull over or refuse to start unless the user enables an override, which may be needed in an emergency).