You really have no idea what you are talking about. Clearly you have never taught anyone to drive and it is unclear you have ever learned to drive yourself in Canada or anywhere else.
1) you don't teach a teenager to drive at night.
Of course, you do!
That is in fact the only way they can learn to drive at night. You expect them to just become excellent nighttime drivers overnight with no training? You do this after they've shown proficiency driving in the daytime.
2) you don't take a person learning to drive on a highway (freeway for you yanks)
Of course, you do!
That is in fact the only way they can learn to drive on the freeway. You expect them to just become excellent freeway drivers overnight with no training? Again, you do this after they've shown proficiency in driving on surface streets.
Do you seriously think that when someone turns 20 they are suddenly capable of driving on the freeway at night without any training?
In fact, up here in Canada they are not allowed to drive from one hour before sundown until one hour after sunup.
NO, that is not the law anywhere in Canada OR America. Where in God's holy name are you getting such nonsense from? The law is nothing like what you say and exactly what I've implied: that people who are learning can be taken out to practice at night or on the freeway.
3) if the person you are teaching to drive needs to be corrected that much in the area you are driving... YOU DON'T TAKE YOUR EYES OFF THE ROAD!
More nonsense. You clearly have no experience at any of this. What is the magic number of times that a person (or robot) has to do a task before you declare them perfect at that task? You'd only ask such a dumb question if you had no knowledge and no experience. Those who have either know that you gradually get more and more comfortable as proficiency is demonstrated (while never being perfect). At some point you relax and can stop hovering and you let them drive on their own.
The data on the Uber cars suggest they're a lot less safe than Waymo cars. Assuming a government will allow self-driving cars, it's reasonable to look at which ones look safe enough.
I agree the data "suggest" Waymo is safer, especially for the tasks they've been put to. However, it's not certain what this really means as there are too many variables. Uber takes more chances so it's unclear Waymo would handle those situations a lot better.
Damages don't necessarily apply. No amount of money will bring back the victim of the last Uber accident. We don't want rich people and well-funded corporations to go out and kill people.
Of course damages apply. Yours is a terrible philosophy which ends up costing many lives.
I hate that so many people just don't get this, and many of them are the politicians that are making terrible choices.
Pretending that we don't put a $ amount on human lives is just silly and hypocritical. If lives were as precious as you believe then this accident never would have happened because the road would have been better lit, there would have been a pedestrian overpass, there would have been a safety officer to escort homeless people across the road... Any number of safety improvements that we decided were not worth the cost.
Demanding financial compensation isn't a license to go out and murder individuals, it is a value to risk calculation that a responsible government does and responsible corporations also do. Irresponsible governments and corporations also do it, just very badly.
Delaying AV development unnecessarily will cost lives. It may be that Uber will never have a vehicle safer than a human, but the government has no clue how to evaluate that (obviously). As long as they pay the external costs for their testing it should be the company's decision.
If they want to pay $100m for every fatality, I'd gladly let them test in my town and we'd be much safer for it.
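For illustration, the "pay the external costs" proposal can be framed as a simple expected-value calculation. In this sketch, the fatality rate and test mileage are made-up placeholders; only the $100M payout figure comes from the paragraph above:

```python
# Expected external cost of an AV test program (illustrative sketch).
# The fatality rate and mileage are assumed placeholder values;
# only the $100M payout figure comes from the proposal above.
FATALITIES_PER_MILE = 1 / 100_000_000  # assumed: 1 fatality per 100M test miles
PAYOUT_PER_FATALITY = 100_000_000      # the proposed $100M per fatality
TEST_MILES = 1_000_000                 # assumed size of the test program

expected_fatalities = FATALITIES_PER_MILE * TEST_MILES
expected_payout = expected_fatalities * PAYOUT_PER_FATALITY

print(f"Expected fatalities: {expected_fatalities:.2f}")  # 0.01
print(f"Expected payout: ${expected_payout:,.0f}")        # $1,000,000
```

Under these assumed numbers, the company's expected liability is modest; if its real fatality rate is higher, the $100M price tag does the regulating for you.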
Have you ever taught a kid to drive? If that kid drove 13 miles without any guidance or instruction he/she would be off the charts skill wise.
Any normal person would teach a kid to drive the same way as these backup drivers seem to. They would pay close attention on any new or tricky situations, but would relax when the driver (student or AV) was doing the same thing they've done successfully for the last 20 times.
People keep changing the parameters, use whichever you like.
If you are deciding whether a specific AV is "safe enough" you would probably want to be as specific as possible. However, there is probably not enough data on any specific vehicle and program to know for sure. Using expanded data might be useful.
What is more important is whether the company is willing and able to pay for damages their program causes.
Well, yes - if anybody out there thinks they're a perfect driver who has not and/or never will suffer a potentially dangerous brainfart then they're the ones to watch out for.
I think you're missing the point.
what do you mean "if what I say is true?"
That the standard headlights do not provide enough light on that street to go 40 mph. The standard isn't that someone driving under those conditions will never have an accident; it's that they can see well enough that accidents caused by poor visibility will be rare.
The city is responsible for the lights, the road, and the speed limit. If it is truly so dangerous that a properly equipped vehicle cannot drive at the speed limit on a clear night, then they are mostly responsible. What you missed or ignored is that there are thousands of people driving that road at night under those exact conditions. If the city is not ticketing any of them (and by your logic it should ticket all of them), then it is implying that everyone is driving safely.
Surely you don't actually think a 40 mph limit is a guarantee that it's safe to do 39 mph?
It might be safe for some to drive 50, others a max of 0. Of course.
the apparent lack of light - so the car was outdriving its headlamps - i.e. speeding.
I think this is a terrible flaw in our legal/traffic system. If what you say is true, just about everyone on that road should be equally guilty and all should be punished. Imposing a super harsh punishment on the one driver who gets incredibly unlucky is unjust and probably not the best policy if we truly care about safety (rather than just needing someone to punish).
Here's a proposal: "safety drivers" in these tests should have some sort of professional driving qualification beyond a regular license
You might find that this backfires drastically. Perhaps the simple minded and less trained epsilons will actually be more challenged and engaged and end up outperforming your alphas at this boring task.
No, it isn't. It isn't a fair comparison in any way, because we are not comparing the same type of data.
You listed a bunch of reasons that are irrelevant to the comparison, number of miles, number of cars etc. That's all bullshit. You could say the data set is not large enough to be statistically proven, but that does not mean it's not useful. This is pure FUD on your part.
You also talk about how it's not fair because of the backup driver. More FUD. The point is to compare whether these cars are safe enough to test WITH the backup driver.
Then you talk about how it's not fair because there are no stats on AVs for difficult conditions. Again, that would be relevant if we wanted to see whether it was safe to test in those environments, but that is not the question.
In summary, everything in your first post was pure luddite fear mongering.
Your assertion would be OK if the comparison was autonomous vehicles vs. human drivers driving on the same streets, in the same conditions. So if Google is only testing in 3 cities in California or whatever, then only the crash data from those three cities.
Only now do you bring up a truly relevant point, however it is one I addressed in a different post. Sure, it would be interesting to compare the latest Waymo cars against the current drivers on the same streets and conditions (with the backup drivers, of course). If you have that data, please provide it.
Don't use the crash data from Vermont, where there are accidents due to snow or ice, while the Google cars were initially confined only to sunny and dry places.
You would absolutely use that data in deciding whether an AV is safe enough to test on a public road as that data tells you what level of risk we are willing to accept for road transportation. If we're willing to accept 1 accident for every 100k miles then it is irrelevant where and when. This comparison answers the simple question "do these AVs cause more harm than human drivers." The answer is no. It is irrelevant whether these AVs are safer than the average driver on the exact same road. Interesting, sure, but not a measure of whether we should allow testing.
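The "do these AVs cause more harm than human drivers" question above reduces to comparing simple per-mile rates. A minimal sketch with placeholder numbers (neither figure is a real Waymo/Uber or NHTSA statistic):

```python
# Compare accident rates per mile (all numbers are hypothetical).
human_accidents, human_miles = 10, 1_000_000  # assumed human baseline
av_accidents, av_miles = 1, 200_000           # assumed AV test record

human_rate = human_accidents / human_miles  # accidents per mile
av_rate = av_accidents / av_miles           # accidents per mile

# The testing question reduces to a single rate comparison,
# regardless of where or when each set of miles was driven:
print("AVs cause more harm per mile:", av_rate > human_rate)  # False
```

Note that with this framing, the geography of the baseline miles genuinely doesn't enter the comparison; only the rates do.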
they have counted only the actual crashes, or the potential crashes as well (the car would have been in a crash but the human driver intervened to stop it), because those should be counted too for a fair comparison.
It's clear why you keep getting the wrong answers, because you still don't even understand the question.
Multiple choice: are we using this data to decide
A) whether it is safe to test an AV on certain public streets, in certain conditions, with a backup driver, or
B) whether it is safe to deploy these AVs everywhere, under all conditions, with no backup driver?
You keep arguing why B is not true. A point that absolutely nobody here is arguing with.
Ah, 'double or triple the chance of killing someone'. You mean, in exactly the same way as you double your chances of winning the lottery by buying two tickets. Double the chances! It's practically a sure thing!
Wait, so do you want to talk about the odds or not? You can't have it both ways. Either you care about that 1 in 10,000 chance of killing someone or you don't.
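Whether a "doubled" small risk matters depends on the scale of exposure. A quick sketch: the 1-in-10,000 figure is taken from the exchange above, while the number of exposures is an assumption for illustration:

```python
# Relative vs absolute effect of doubling a small per-event risk.
base = 1 / 10_000     # the 1-in-10,000 chance mentioned above
doubled = 2 * base
exposures = 1_000     # assumed number of trips

# Probability of at least one bad outcome across all exposures.
p_base = 1 - (1 - base) ** exposures
p_doubled = 1 - (1 - doubled) ** exposures

print(f"Single event: {base:.4f} vs {doubled:.4f}")
print(f"Over {exposures} trips: {p_base:.3f} vs {p_doubled:.3f}")
```

One trip barely moves, but across a thousand trips the doubled rate roughly doubles the chance of at least one incident, which is why you can't wave the factor of two away once a fleet is driving at scale.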
Per your logic, the 'best' car (with the most points) is one that doesn't move at all.
No, by my logic we should take reasonable risks for reasonable benefits. Why is that hard to understand?
By the way, the biggest cause of weather related accidents isn't snow, or ice, or even rain. It is wet roads. Do you know how often the roads are wet around here? Damn near every day. Guess we should all just be hermits so we are safe.
Or, again, we make reasonable choices. You are an advocate for just saying "fuck it, we're all going to die anyway." Which means you have absolutely nothing constructive to say on the safety of AVs.
I never said autonomous cars had to be 'perfect', but I do think they need to be better than at least AVERAGE human drivers. And that includes getting people where they want to be when they want to go there, whether you think it is a 'dumb' reason or not, and whether or not that involves driving in a thunderstorm (around here known as 'afternoon'), or through a construction zone, or in an inch of snow, or in a congested area, or anywhere that is not 'safe'.
And I don't see anything dumb about any of those you listed.
Of course you don't. Everything is the same and nothing matters. You are the perfect stooge for the Trump era of false equivalence.
Yet nobody ever says that a kid should never be allowed anywhere near a real-world driving condition until he has proven himself 100% safe. The autonomous driving critics, though, make that exact claim. Weird.
Or we could, you know, ignore the extremists on both ends and look for a reasonable middle ground? Nah, where's the fun in talking about that?
"One day I woke up and discovered that I was in love with tripe." -- Tom Anderson