Catch up on stories from the past week (and beyond) at the Slashdot story archive


Comment Re:before you judge anything watch the video (Score 1) 295

Maybe you would have seen her, maybe you would have slowed or swerved enough to avoid a fatality. But would you be willing to bet life in prison that you could have safely avoided (or at least not killed) the pedestrian? That's the real question, isn't it? Whether an average driver would typically have handled this better, and whether only a negligent driver would be held fully responsible.

In that regard, whether the car had LIDAR is irrelevant. Humans don't have LIDAR, so that shouldn't be part of the standard for whether Uber is more at fault than a human would be. Btw, it's quite possible the LIDAR was working fine and detected something, but the other sensors saw nothing, and the software decided to avoid drastic maneuvers that might cause other problems. Similar to how a human might see a hint of shadowy movement, but humans who brake hard and swerve to avoid a fast-moving tree are quickly removed from the gene pool.

No doubt Uber is using the worst camera footage and slanting the details as a defense. That's just what every human driver facing manslaughter charges would do; to expect them to be more forthcoming is unfair and naive.

Comment Re:Big mistake! (Score 1) 295

To be clear, what you are saying is that it's okay for people to double or triple the chance of killing someone because life goes on and they have shit to do. Got to get to your job at 7-Eleven, got to pick up that pepperoni pizza, got to get to that Buffalo Bills game in the snow, ...

What you're saying is that driving is a binary choice: there are certain conditions in which it is just absolutely wrong to drive, but for every other time, every other situation, every other driver, it's just fine. It's not binary.

What bugs me most is the complete conflict, the cognitive dissonance, that this opinion entails (not necessarily yours, but you can see it all over this thread). People are okay with other drivers, sometimes terrible drivers, driving in poor conditions for dumb reasons, but just can't tolerate an AV on the road until it has been proven completely safe.

Comment Re:Big mistake! (Score 1) 295

Except it's not 13 miles on a well-marked highway. It's 13 miles driven at least partially in a complicated city environment, something the ACC systems can't do. If that really means it could do 5 full Uber pickup/dropoff runs without intervention, that is pretty impressive technology. Not ready for primetime, but still unimaginable by the standards of 18 years ago.

Comment Re:Big mistake! (Score 1) 295

1) This comparison is of automated vehicles driving only in the safest conditions to humans driving in all conditions. That introduces a huge bias in favor of automated vehicles.

Fair point, but as a society we've decided to tolerate a certain amount of carnage to get where we need to go. IMO, as long as AV testing stays at about that level of risk, it is doing little harm. That human drivers put themselves in far more dangerous situations is not necessarily a point in their favor.

Comment Re:Big mistake! (Score 1) 295

This is a completely irrelevant statistic at this point. You're comparing accidents per miles driven of regular vs. experimental self-driving cars...the two "sample sizes" so to speak are so vastly different that no valid comparison is possible.

You are quite mistaken. The comparison is fair in deciding whether AVs are safe enough for real world testing in limited environments, not whether they are ready to be left unsupervised in all conditions.

Comment Re:Big mistake! (Score 1) 295

It's just cheaper to put real cars on real roads and endanger real people.

That's all that matters, really. If it would have taken $100 million in extra testing to save this one life, it would not have been worth it. When you also consider that such rigorous testing would delay the deployment of safe AVs (even if only by days), it makes real-world testing even more imperative.

Comment Re:Big mistake! (Score 1) 295

"Failure" is defined as the backup driver having to take over. That says nothing about the overall safety of the car (and driver) on the road, just that the cars are not ready to be left on their own.

Google's relative caution may have less to do with caring about public safety and more about fearing public and regulatory backlash.
