We can't even, after decades of trying, create an 'artificial intelligence' that can pass the Turing Test, and that's just text on a screen. What makes any of you so sure that 'autonomous cars' are anywhere close to being a reality?
Because those are two wildly different problems.
If it happens once, it's a bug. If it happens twice, it's a feature. If it happens more than twice, it's a design philosophy.