Comment: Measured approach to incident (Score 2)
I have already read several articles arguing that development should not be halted, because only continued refinement of such a complex product will make it viable. And even with a few bugs, driverless cars may already be less accident-prone than human drivers.
As a software developer, I naturally side with continuing development.
The FAA offers a good model for how to proceed.
When an airplane crashes, the FAA sometimes grounds all models of that plane until the cause of the crash is determined and, if it was a technology error, will not allow the planes to fly again until the problem is satisfactorily resolved.
That would appear to be a measured response to this type of problem.
Don't halt all development. Don't proceed, ignoring the death(s).
Prohibit the specific driverless system from using the public roads until the problem is determined and an acceptable fix is made.
Just as cars have model years that receive approval, so should specific versions of driverless systems.
Then official patches could be deployed on an as-needed basis, rather than merely whenever a software engineer declares a bug fixed.
Very strict controls need to be in place to allow/deny a software/hardware update to a driverless system.
I don't want my car to be hacked and used as a killer weapon.
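To illustrate the kind of control I mean, here is a minimal sketch of an update gate that denies anything not on a regulator-approved version list or not authenticated by the manufacturer. All names (the allowlist, the key, the functions) are hypothetical stand-ins, not a real automotive API, and a production system would use asymmetric signatures rather than a shared secret.

```python
import hmac
import hashlib

# Hypothetical illustration: an update is applied only if
# (a) its version is on a regulator-approved allowlist, and
# (b) its package carries a valid MAC from the manufacturer's key.
APPROVED_VERSIONS = {"2.4.1", "2.4.2"}      # versions cleared for public roads
SIGNING_KEY = b"manufacturer-secret-key"    # stand-in for a real signing key

def sign_update(version: str, payload: bytes) -> str:
    """Compute the manufacturer's MAC over version + payload."""
    msg = version.encode() + b"\x00" + payload
    return hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest()

def allow_update(version: str, payload: bytes, signature: str) -> bool:
    """Allow the update only if the version is approved and untampered."""
    if version not in APPROVED_VERSIONS:
        return False  # deny: version not approved by the regulator
    expected = sign_update(version, payload)
    # Constant-time comparison rejects forged or corrupted packages.
    return hmac.compare_digest(expected, signature)
```

Under this scheme, an unapproved version is rejected even with a valid signature, and a tampered payload is rejected even for an approved version, which is exactly the allow/deny behavior described above.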