Comment Not really AI at all (Score 0) 146
Tesla and just about everyone else in the "autonomous" driving game are using an Expert System. This isn't AI; it is something dug up from the 1970s that sort of modeled what AI could do. Someday.
Well, someday isn't quite here yet. There is no underlying intelligence to these things. It is all based on rules, and if you get to the bottom of the list of rules, the car has no idea what to do. This is freaking dangerous.
A true "AI" would have some default precepts, like "don't crash" and "don't hit people". These precepts or basic concepts do not exist in these vehicles, so when something unexpected comes up, there is no rule and no action. When the roadway presents a situation that isn't recognized, the car will do something unpredictable, or it will do nothing and just keep on going.
Why can't the cars simply be programmed to brake gently to a stop if there is a condition they do not understand? That would require defining what "understand" means. An Expert System has no real intelligence; it is just a series of rules that say when you recognize the conditions of X, then do Y. So a basic requirement of this is to load up the computer with every possible scenario that can happen on the road. Anyone who has driven a car for more than a week will tell you that is impossible. Well, maybe possible, but it would take a huge amount of work.
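To make the "recognize X, do Y" structure concrete, here's a toy sketch in Python. This is not any vendor's actual code; the rule names and scene fields are made up for illustration. It just shows a rule list walked top to bottom, with the safe fallback the post argues for: brake gently when no rule matches, instead of doing nothing and carrying on.

```python
# Hypothetical expert-system-style controller: a list of (condition, action)
# rules checked in order. All names here are illustrative, not real APIs.
RULES = [
    (lambda s: s["light"] == "red", "stop"),
    (lambda s: s["obstacle_ahead"], "brake"),
    (lambda s: s["light"] == "green" and not s["obstacle_ahead"], "proceed"),
]

def decide(scene: dict) -> str:
    """Walk the rule list; the first matching condition wins."""
    for condition, action in RULES:
        if condition(scene):
            return action
    # Bottom of the rule list reached: nothing matched. Rather than
    # "keep on going", default to braking gently to a stop.
    return "brake_gently_to_stop"

print(decide({"light": "red", "obstacle_ahead": False}))   # stop
print(decide({"light": "blue", "obstacle_ahead": False}))  # brake_gently_to_stop
```

The point of the sketch: the intelligence lives entirely in whoever wrote `RULES`. Every scenario not anticipated there falls through to the default, which is exactly why enumerating "every possible scenario" is the impossible part.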
Google had a car that drove into the side of a bus. The response was to add 3600 new rules to the package. Does that illustrate the scope of the problem?
Having a fleet of autonomous vehicles, such as what Uber is thinking of, is foolish and dangerous. Driving on the road with one of these present will offer an unlimited capacity for chaos, because if something unexpected (or unprogrammed) happens, the car will do something unexpected. And that could be dangerous to everyone around.