They don't need to do that to be able to drive more safely than an average driver.
Why the hell not? If an automaker sold a manual car whose lug bolts could snap, shedding a wheel and killing the occupants, that car would not be allowed on the market. Why would we allow AI on the market that is in some way not safe? A human can no more be held responsible for a mistake the AI makes than for a lug bolt snapping. I don't care how many people it might save one day; you can't guarantee that result, so people should not be injured or killed today. What are these companies going to say to the parents when one of these vehicles kills a child who was sitting on a curb, with the sun shining at just the right angle to put her in a blind spot? "Oh well, to make an omelette you have to break a few eggs"? It will be tragedy enough if these things kill animals that a human would have avoided.
"Aww, if you make me cry anymore, you'll fog up my helmet." -- "Visionaries" cartoon