Humans are required to pass a standardized driving test before they can operate a vehicle on public roads. Machines (so far) are not.
Why hasn't the self-driving industry, together with stakeholders from government and the public, established such a standard?
At a minimum, the standard should set guidelines for:
- Software security/vulnerability testing
- Computer redundancy (three computers cross-checking each other, as in the aviation industry)
- Obstacle detection
- Traffic-rule downloads/updates by government jurisdiction
- Manual override/safe stop capability
- User interface (voice, smartphone, etc.)
- Weather detection/calibration: ice, snow, rain, high winds/tornadoes, etc.
- National Emergency/Evacuation capability
- Idling/circling rules - in a busy downtown, users could clog streets with cars endlessly circling the block
- Human needs: bathroom breaks, senior/child safety, etc.
and so forth
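To illustrate the redundancy point above: the aviation-style "three computers checking each other" pattern is usually a 2-out-of-3 majority vote. A minimal sketch (the function name and fallback behavior here are hypothetical, not from any actual vehicle standard):

```python
def majority_vote(a, b, c):
    """Return the value at least two of three redundant computers agree on.

    If all three outputs disagree, there is no quorum; raise so the
    vehicle can fall back to a safe stop rather than trust any one unit.
    """
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("no quorum among redundant computers: trigger safe stop")

# Example: two computers command 30 km/h, one faulty unit commands 90 km/h.
# The faulty output is outvoted.
print(majority_vote(30, 90, 30))  # prints 30
```

A standard could then specify what "safe stop" means when the vote fails, rather than leaving it to each manufacturer.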
The point is that no standards exist for any of this. Unless an autonomous vehicle can pass such tests, it should NOT be on the road alongside drivers who have.