Driving is, and always has been, the most intense social activity we engage in on a daily basis. We cooperate, and even conspire with each other to break the law, in order to get to our destinations quickly and safely. I doubt they will be able to program the complex, second-by-second social decisions we make every day.

I think Google now thinks the same, as they have spun the autonomous-car effort out of Google proper, probably because they expect it to be sued into oblivion eventually by the families of human drivers killed by autonomous cars. If an autonomous car breaks the law and someone dies, the record is right there in the database; present that to the jury and watch the money flow. If they want autonomous cars on the roads of this country alongside humans, a corrupt and incompetent Congress will make sure we have to accept a certain number of machine-caused deaths and maimings. Congress will have to strip away human legal rights to allow widespread use of killer machines on our highways, and Google has the bribe money to bring this about.
Another point: OSHA doesn't allow any amount of machine-caused deaths and maimings in our industries, so why would they be allowed here?
It's more an example of the "create a new universal standard" approach to programming, as in the obligatory XKCD cartoon about competing standards.
It's part of the coming complexity collapse: it will not stop at 15 standards but go on to 20, 30, 100....
"The fundamental principle of science, the definition almost, is this: the sole test of the validity of any idea is experiment." -- Richard P. Feynman