
Comment: Re:Not surprising (Score 1) 479

by CanHasDIY (#47764365) Attached to: California DMV Told Google Cars Still Need Steering Wheels

OK, that was a bad example; I didn't really mean to use an example where the car literally causes the death. But what about emergencies where a collision IS imminent? A computer can react faster than a human, so if a collision is absolutely unavoidable, wouldn't you want the car(s) involved to employ some strategy which seeks to minimize injury by using whatever degree of control remains?

Well, if one of them is my car, I want it to minimize injury to me. Presumably, the vast majority of other drivers feel the same way - self-preservation is pretty instinctual.

As for liability, a collision-mitigation system manufacturer would be no more liable for injuries sustained than an airbag manufacturer.

Except an airbag doesn't make a decision to sacrifice Person A for the sake of Person B. Comparing dumb systems to intelligent ones is like comparing apples to tree trunks - they're both plant parts, but the similarities end there.

Comment: Re:Not surprising (Score 1) 479

by CanHasDIY (#47760817) Attached to: California DMV Told Google Cars Still Need Steering Wheels

Well, when I took my driving class (way back in the '80s), it was called an emergency brake, and we were shown the proper way to use it in an emergency: frantically applying and releasing it at a fast rate

The person who taught that class - they weren't a mechanic, were they?

Well, as a trained mechanic, I can assure you that A) it is not an emergency brake, and B) using it in such a manner sounds like a great way to break your brakes.

IF you insist on using your parking brake as an emergency brake, the safest method would be to slowly apply increasing pressure to the handle/pedal, as you would with your hydraulic brakes, coupled with engine braking, to gradually slow the vehicle.

Comment: Rinse, Repeat (Score 5, Insightful) 179

by CanHasDIY (#47760751) Attached to: Uber Has a Playbook For Sabotaging Lyft, Says Report

Hilarious. No, not the shady tactics - the fact that companies like Uber and Lyft whine about being regulated as taxi services, arguing that they are not taxi services, then get into the same sort of idiotic, self-harming feuds that forced the government to start regulating taxi services in the first place.

History, on a loop!

Comment: Re:Steering wheels are nice, but... (Score 1) 479

by CanHasDIY (#47760427) Attached to: California DMV Told Google Cars Still Need Steering Wheels

Yes... so there would have to be a statistically measurable difference between accident rates for people who have automated cars vs. people who drive manually. It will take quite a long time to collect enough data to reach statistical significance, given the actual number of automobiles on the road.

That sounds a lot more logical and reasonable than all the "OMG UR A SHITY DRIVER" arguments I've seen.
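The statistical-significance point above can be sketched with a quick two-proportion z-test. The crash counts and mileages below are made-up numbers chosen purely to show how the math behaves; treating each mile as an independent Bernoulli trial is a rough simplification, but it illustrates why enormous mileages are needed before a difference in crash rates becomes detectable.

```python
import math

def two_proportion_z(crashes_a, miles_a, crashes_b, miles_b):
    """Z statistic comparing two crash rates (crashes per mile driven).

    Approximates each mile as an independent trial; the pooled-variance
    standard error shows how slowly the test gains power when the
    underlying event rate is tiny.
    """
    p_a = crashes_a / miles_a
    p_b = crashes_b / miles_b
    pooled = (crashes_a + crashes_b) / (miles_a + miles_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / miles_a + 1 / miles_b))
    return (p_a - p_b) / se

# Hypothetical numbers: human drivers with 57 crashes over 30 million
# miles vs. an autonomous fleet with 50 crashes over 30 million miles.
z = two_proportion_z(57, 30_000_000, 50, 30_000_000)
print(round(z, 2))  # well below the 1.96 threshold: not significant at 5%
```

Even with tens of millions of miles on each side, a modest difference in a rate this small stays far from significance, which is the commenter's point about how long the data collection will take.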

Comment: Re:Urgh (Score 1) 489

Communism is not a tool, like a wrench, a computer, or even a tax.

Communism is a philosophy. It puts too much power in the hands of too few and has inevitable evil outcomes.

Every human construct is a tool, even philosophies - they are applied to others in order to influence thinking, the way a hammer is applied to a nail in order to influence its position.

Even as a pure philosophy (taking the human factor out), communism is a tool to move ownership of the means of production (also tools) from the oligarchs to the workers.

Comment: Re:CA is mind bottling (Score 1) 479

by CanHasDIY (#47760077) Attached to: California DMV Told Google Cars Still Need Steering Wheels

Considering how many people die on the roads every year in the United States alone, the biggest factor is humans.

Kind of - it's really more "improperly trained humans."

Properly trained humans, i.e. professional drivers, statistically get into fewer collisions than untrained humans, thus negating the "it's humans" hypothesis.

Since we already have millions of manually operated automobiles, as well as a culture that values the ability to travel anywhere at anytime for any reason, it occurs to me that the most economically and socially reasonable solution is to increase driver training requirements, rather than throw the baby out with the bathwater.

Comment: Re:Backward-thinking by the DMV (Score 1) 479

by CanHasDIY (#47760049) Attached to: California DMV Told Google Cars Still Need Steering Wheels

This is where I am as well - from a safety standpoint, similar gains can be demonstrably achieved with a national, reasonable standard training requirement for licensing drivers, without any major, expensive technological changes.

But then, you can't effectively control the travel of properly trained drivers operating independent automobiles, now can you?

Comment: Re:Backward-thinking by the DMV (Score 1) 479

by CanHasDIY (#47760013) Attached to: California DMV Told Google Cars Still Need Steering Wheels

Autonomous cars need to prove that they're capable of being safer than operator-driven cars. Right now they haven't done so, and until there's data there will be a need for autonomous cars to be manually operable.

Sure they have. Driverless cars have driven thousands of miles without making a single mistake. That error rate is already better than virtually any human could achieve.

How have they fared in adverse conditions, like deep snow, accumulating ice, or muddy, washout conditions?

Comment: Re:Not surprising (Score 1) 479

by CanHasDIY (#47759805) Attached to: California DMV Told Google Cars Still Need Steering Wheels

We need to start pushing for formal regulations with regard to what the cars will do when a collision between vehicles is inevitable. Should your car drive off a bridge, killing you, if it means saving a school bus full of kids? Probably. But I'd like to know how such failure modes are defined.

You are obviously not an automotive engineer.

Or a lawyer.

Here's a protip: engineering ethics == taking legal responsibility for the results.

So the real question isn't, "will cars be programmed with ethics," but rather, "will any car companies be stupid enough to make themselves legally culpable for deaths caused by their products?"

And of course, phrased in that manner, the answer becomes obvious.
