Comment Re:Selling shovels to miners (Score 1) 45
Tesla already has issues with their cars rear-ending others since they removed the front radar, as they simply can't recognize certain types of vehicle.
I'm gonna need you to provide references.
There have been documented cases of Tesla cars, driving themselves, hitting things and in some cases killing people. To my knowledge, most or all of these cases involved very old versions of Autopilot, possibly even "Hardware 1" (which only had a single front-facing camera). And all the ones I know about involved the car hitting a non-moving object, like a fire truck or a semi-truck trailer.
The old self-driving software had trouble telling whether an object in front of the car was actually in the road or not. If the car was heading uphill and there was a sign above the road, the sign could look like it was in the road, so the software had to assume that not everything that appeared to be in the road actually was. (It had to, because panic braking for no real reason is also unsafe.)
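A toy pinhole-camera calculation shows why a single forward camera has this problem: a tall, distant overhead sign and a low, nearby obstacle can land on exactly the same pixel, so one frame alone can't separate them. The focal length and distances below are made-up illustration values, not anything from Tesla.

```python
# Pinhole projection: pixels above the image center for a point at a
# given height above the camera and distance ahead of it.
FOCAL_PX = 1000.0  # assumed focal length in pixels (illustrative)

def image_y(height_above_camera_m: float, distance_m: float) -> float:
    return FOCAL_PX * height_above_camera_m / distance_m

sign = image_y(5.0, 100.0)    # overhead sign: 5 m up, 100 m away
trailer = image_y(1.5, 30.0)  # trailer top: 1.5 m above camera, 30 m away
assert sign == trailer == 50.0  # same pixel -> ambiguous from one frame
```

Anything with the same height-to-distance ratio projects identically, which is exactly the ambiguity the text describes.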
Now, Tesla self-driving decomposes the problem into two parts: first, a neural net takes the 8 camera inputs (360 degree view around the car) and builds a 3D vector space model; then, another neural net uses the 3D model to make driving decisions.
Clearly a 3D model, if correctly constructed, means the car can tell whether something is actually in the road in front of it or not. And the car shouldn't be hitting anything, whether or not it can tell what kind of vehicle it is.
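That two-stage decomposition can be sketched in a few lines. To be clear, this is a hypothetical illustration of the split, not Tesla's actual code: the two functions stand in for the two neural nets, and all names and thresholds are invented.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Obstacle:
    x_m: float  # forward distance from the car (meters)
    y_m: float  # lateral offset from the car's path (meters)

def perceive(camera_frames: List[bytes]) -> List[Obstacle]:
    """Stage 1 (stand-in for the first neural net): fuse the 8 camera
    views into a 3D vector-space model of surrounding objects."""
    ...

def plan(obstacles: List[Obstacle]) -> str:
    """Stage 2 (stand-in for the second net): decide from the 3D model
    alone. It never sees raw pixels, and it doesn't need to classify
    *what* an obstacle is, only where it is."""
    if any(abs(o.y_m) < 1.0 and o.x_m < 30.0 for o in obstacles):
        return "brake"
    return "cruise"
```

The point of the split is in `plan`: once everything is positioned in 3D, "is it actually in my path?" is a geometry question, and avoiding a vehicle no longer depends on recognizing its type.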
I drive a Tesla that has the Full Self-Driving beta. At this point, I trust it not to hit anything; I don't trust it to correctly follow all the rules of the road or perfectly navigate the roads I drive on. I'm definitely not claiming it's perfect. But again, I trust it not to kill me or others.
Everyone else greatly simplified this task with lidar and radar.
Everyone else has also greatly simplified this task using high-resolution maps, with the result that their cars can't drive anywhere that isn't covered by an HD map. Only Tesla is trying to solve the full, general problem.
Are they planning to have an AI driver that is trained on the rules of the road?
Yes, I'm pretty sure that's what they are doing. When my car drives me on the freeway, it will often use the passing lane to pass other cars, but then it gets out of the passing lane. It actually shows me the message "changing lanes to get out of passing lane" (quote from memory and may not be exact). This must be a trained-in behavior.
I don't know if FSD beta was given a rule just for Washington state, but in Washington it's actually state law that the left lane is the "passing lane" and you are supposed to get out of it if you aren't actually passing.
But the carpool lane doesn't count as a passing lane; you can hang out in a carpool lane. And my car doesn't try to leave the carpool lane.
FSD isn't perfect but I can see progress.