[...] But the government was always going to need to step in and regulate this. Firstly, in the case of an accident it's not clear where the liability lies. And secondly, without regulation people will not accept it unless it reaches unrealistic levels of safety. Even being 1000x safer than human drivers wouldn't be OK. If anything that would be worse, because as it is traffic accidents don't make the news, but if self-driving cars had only, say, 10 accidents per year, all 10 would be newsworthy. Not only that, but those accidents may well be ones that an alert, well-trained, skilled human driver would not have made, because self-driving systems will likely fail in different ways.
Yup.
The car doesn't need to know or care whether it's a cat or a dog running across the road, only that it should avoid hitting it. Things become more difficult when it's a black cat or a pothole - which is it? Easy enough for a human to decide, hard as hell for a computer.
Now, is that a plastic bag blowing in the wind, or is that a toddler running across the road? Does your computer decide to cause an accident with another car (and its relatively well-protected occupants) to avoid hitting a plastic bag that could have looked like a child running across the street?
These are HUGE issues in autonomous vehicle design, and there are no easy answers.
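To make the plastic-bag-or-toddler problem concrete, here's a toy expected-cost calculation of the sort a planner might run. Every label, probability, and cost below is a made-up illustration, not anything from a real autonomous-driving stack:

    # Minimal sketch of expected-cost action selection under perception
    # uncertainty. All class labels, costs, and probabilities here are
    # hypothetical illustrations only.

    # Classifier output: probability that the object is each class.
    beliefs = {"plastic_bag": 0.70, "child": 0.25, "cat": 0.05}

    # Cost of hitting each class vs. the fixed cost of an evasive swerve
    # (which risks a collision with adjacent traffic).
    hit_cost = {"plastic_bag": 1, "child": 1_000_000, "cat": 10_000}
    swerve_cost = 5_000

    expected_hit_cost = sum(p * hit_cost[c] for c, p in beliefs.items())

    # Even at 25% confidence, the "child" hypothesis dominates the sum,
    # so the planner swerves despite the bag being the most likely label.
    action = "swerve" if expected_hit_cost > swerve_cost else "continue"
    print(action, expected_hit_cost)

Notice that the answer flips on the costs you assign, not on which label is most probable. That's exactly why there are no easy answers: someone has to pick those numbers.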
Because safety is involved, people want to see that Tesla are beyond reproach. This is not only unrealistic but harmful. No organisation is beyond reproach: mistakes will be made, corners will be cut, and stupid decisions will happen. That's a consequence of it being done by humans. On top of that, engineers will make money-death tradeoffs, essentially working out the cost of a life in dollars. People outside engineering don't like that on the whole and seem shocked if you tell them, but it's impossible to engineer a safety-critical system without such calculations.
Just look at the Ford Pinto gas tank fiasco. In reality, the car was every bit as safe as other hatchbacks at the time. Toyota, whose manufacturing is above reproach, somehow shipped over 200,000 Corollas missing something as seemingly obvious as a speaker - what if it had been a little safety clip in the brakes or front suspension? In 1970s Honda Civics, the passenger could apply the brakes just by pushing too hard on the floor. Those didn't make the news because there were no leaked internal memos costing out the deaths and the payouts. That kind of calculation was a central theme in the great 1999 movie Fight Club, driving the unnamed protagonist's eventual inability to reconcile what he was seeing with the tradeoffs engineers have to make in their line of work.
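Fight Club's narrator even spells out the formula: the number of vehicles in the field (A), times the probable rate of failure (B), times the average out-of-court settlement (C); if A*B*C is less than the cost of a recall, you don't do one. It's trivial to reproduce - all the figures below are invented for illustration:

    # The recall formula from Fight Club, with made-up numbers.
    # Every figure below is illustrative only.

    vehicles_in_field = 1_500_000      # A
    failure_rate      = 0.00002        # B, failures per vehicle
    avg_settlement    = 2_000_000      # C, dollars per failure

    expected_payouts = vehicles_in_field * failure_rate * avg_settlement
    recall_cost      = vehicles_in_field * 80   # say $80 per car to fix

    print(expected_payouts, recall_cost)
    # 60,000,000 vs 120,000,000: by this cold arithmetic, no recall.

The movie plays it for horror, but some version of this arithmetic sits behind every safety-critical design decision.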
I could build you the safest car on the planet. But it would cost you $500,000, would be ugly as sin, would get 5 miles to the gallon on a good day with a nice tailwind, and I wouldn't make any money on it to fund research, innovation, or shareholder profit. And, most importantly, no one would want it in their driveway.
Risk is a consequence of doing anything. And engineering is the profession of balancing design constraints to achieve a good outcome.
The important question is whether it is sufficiently safer than human driving. There's a strong tendency to accept the status quo as somehow better by default, simply because it's there. As it is, every time I drive and see other drivers, I feel that almost any vaguely functional self-driving capability is likely to be better[*]. The autopilot is stupid and reactive, but it is never sleep-deprived, never angry, never frustrated, never yells at its kids, never reads its phone or sends messages, never hassles other drivers, never gets scared by being hassled, and so on.
...never drives drunk... never has a sneezing fit... never has a seizure...
These things are literally two separate and identical computers which decide together what to do. And I'll bet money that when the SpaceX Dragon capsules docked autonomously with the International Space Station, the computers controlling the Dragon weren't all that different from what's inside every Tesla Autopilot system. Different software and firmware for sure, but I'd bet there are Tesla logos on those circuit boards in a few places.
Tesla and SpaceX are not automotive and aerospace companies. They're computer companies which specialize in a ground-up approach to applying high-reliability real-time systems to old problems.
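For what it's worth, the redundancy-with-comparison pattern is easy to sketch in miniature. This is only an illustration of the general idea, not a claim about Tesla's or SpaceX's actual architecture:

    # Toy sketch of dual-redundancy: two independent units compute the
    # same command and a comparator only accepts it when they agree,
    # otherwise dropping to a safe fallback. Purely illustrative.

    def plan_a(sensors):           # runs on computer A
        return round(sensors["target_angle"], 2)

    def plan_b(sensors):           # identical logic on computer B
        return round(sensors["target_angle"], 2)

    def steer_command(sensors, tolerance=0.05):
        a, b = plan_a(sensors), plan_b(sensors)
        if abs(a - b) <= tolerance:
            return a               # both computers agree: act on it
        return 0.0                 # disagreement: hold a safe default

    print(steer_command({"target_angle": 3.14159}))

The point of the duplication is that a hardware fault or bit flip in one unit shows up as a disagreement, and a disagreement is always handled safely rather than acted on.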
The useful question is: what is the accident rate of autopilot vs. human? Granted, I have no idea if the autopilot is good enough, but I still feel that's the important question.
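The comparison itself is trivial arithmetic; the hard part is getting honest, comparable numbers. A back-of-the-envelope sketch, with placeholder figures rather than real data:

    # Crash rates per million miles. Both rates below are invented
    # placeholders; plug in real NHTSA and manufacturer data to make
    # the comparison meaningful.

    human_crashes, human_miles = 6_000_000, 3_200_000_000_000
    auto_crashes, auto_miles   = 300, 400_000_000

    human_rate = human_crashes / human_miles * 1_000_000
    auto_rate  = auto_crashes  / auto_miles  * 1_000_000

    print(f"human: {human_rate:.2f} crashes per million miles")
    print(f"autopilot: {auto_rate:.2f} crashes per million miles")
    # The hard part isn't the division; it's making the two
    # denominators comparable (road type, weather, driver attention).

Autopilot miles skew toward easy highway driving, so a naive ratio flatters the machine. Any honest comparison has to condition on the same kinds of roads and conditions.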
And that's just it. A car accident doesn't make the news unless it's spectacularly tragic. If an autonomous car takes out a stop sign to avoid a gopher, it's news.
I don't think the autonomous technology is ready for the road yet, certainly not here in Ottawa, Canada, with our climate. But yeah, it's coming, and it will soon be safer per kilometer than any human driver.
[*]I'm an excellent driver, of course, and rate myself above average, much like 80% of all drivers...
At least you understand the Dunning-Kruger effect. You are therefore already a better driver than the vast majority of the people on the road.