a text editor that is so error-prone that it *needs* to autosave constantly ("continuously"). Or software in general, for that matter.
You've got it backwards--it ain't an error-prone text editor, it's an error-prone human. Even conscientious, process-driven users make stupid mistakes and forget to save their work (especially when they're on a roll). This protects us from ourselves, not the machines we're working on.
Now, you may be among that handful of people who never forget to save--in which case, I congratulate you on being in one of the outlier cohorts that software engineers really shouldn't ever spend their time worrying about.
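That "protects us from ourselves" point is the whole reason debounced autosave exists: save automatically once the user pauses typing, so a forgotten Ctrl-S costs nothing. Here's a minimal toy sketch of the idea--every name in it is hypothetical, and a real editor would also journal to a recovery file and fsync rather than just copying a string:

```python
import time

class AutosaveBuffer:
    """Toy editor buffer that autosaves after a quiet period (debounce).

    Illustrative only: "saving" here just copies the text; a real editor
    would write a recovery file to disk.
    """

    def __init__(self, debounce_seconds=2.0, clock=time.monotonic):
        self.debounce_seconds = debounce_seconds
        self.clock = clock          # injectable so tests can fake time
        self.text = ""
        self.saved_text = ""
        self.last_edit = None

    def edit(self, new_text):
        """Record an edit and restart the quiet-period timer."""
        self.text = new_text
        self.last_edit = self.clock()

    def tick(self):
        """Call periodically; saves once the user has paused long enough."""
        if self.last_edit is None or self.text == self.saved_text:
            return False            # nothing new to save
        if self.clock() - self.last_edit >= self.debounce_seconds:
            self.saved_text = self.text   # stand-in for writing to disk
            return True
        return False
```

The debounce matters: saving on *every* keystroke would hammer the disk, while saving only on pause means the user who's "on a roll" is exactly the one who gets protected the moment they stop.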
Year 1: "You guys, this is even better than [current industry leader]'s tech! Amazing!"
Year 2: "Hardly anybody who has updated to version 5.4 still bleeds from their eyeballs. [current industry leader] hasn't updated their tech for months!"
Year 3: "Samsung is the undisputed leader in virtual reality headsets! They've shipped five times as many units as [current industry leader], and there's no stopping this tidal wave!"
Year 6: "Hey, you should really check out the high-end Samsung VR units. They're every bit as good as [current industry leader] nowadays."
So they drive like I do. Safely. I have zero tickets. I've never even been pulled over.
Those "laws" and "signs" aren't arbitrary guidelines out to ruin your day. If everybody would actually follow them, then accidents -
Hold on, another point here. The word "accident" is bullshit. Accidents imply that the situation was unavoidable. 99.999% of vehicle collisions are entirely preventable by simply following the rules. (Properly maintaining your vehicle is part of the law, too.)
Oh, I do--haven't had a moving violation in 7+ years, not since I was younger and stupider.
That said, I got nailed by a car that decided to try to make a right turn through mine last November. I was in the right lane, going the speed limit, didn't have anyone in front of me, and even saw the other driver overtaking on my left--but there was no way on this green-and-blue earth I could have reacted any faster than I did. A robot -probably- could have, and might well have saved me the annoyance of having to go to a body shop to have the other guy's insurance fix it.
From my own perspective, I'm hard-pressed to see how I could have avoided this collision. And frankly, it doesn't really matter that the other driver could have--that doesn't do me a whole lot of good. I don't get to pick and choose who drives next to me.
There was an article a short while ago written by a journalist who rode in a driverless car for a stretch. There was one adjective that really stood out, an adjective that most people don't take into consideration when talking about driverless cars.
That one word: boring.
Driverless cars drive in the most boring, conservative, milquetoast fashion imaginable. They're going to be far less prone to accidents from the outset simply because they don't take the kinds of chances that many of us wouldn't even begin to call "risky". They drive the speed limit. They follow at an appropriate distance. They don't pull quick lane changes to get ahead of slowpokes. They don't swing around blind corners faster than they can stop upon detecting an unexpected hazard. They don't nudge through crosswalks. They don't cut off cyclists in the bike lane. They don't get impatient. They don't get frustrated. They don't get angry. They don't get sleepy. They don't get distracted. They just drive, in a deliberate, controlled, and entirely boring fashion.
The problem with so, so many of the "what if?" accident scenarios is that the people posing said scenarios presume that the car would be putting itself in the same kinds of unnecessarily hazardous driving positions that human drivers put themselves in every single day, as a matter of routine, and without a moment's hesitation.
Very, very few people drive "boring" safe. Every driverless car will. Every trip. All the time.
Would it pull over if it sees the blinking lights / siren behind it?
Probably, yes--after all, a strobing emergency light is fairly easy to detect, and as automated cars grow in number, you'd likely see more elegant mechanisms for alerting driverless vehicles to the presence of emergency vehicles. I'd imagine that manufacturers would keep some form of the "big red button" emergency stop we've seen in a number of prototypes, as well.
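To give a sense of why a strobe is "fairly easy to detect" in principle: track the per-frame brightness of a bright region and check whether it flashes at a plausible beacon rate. This is purely my own toy sketch under assumed numbers (a 30 fps camera, a 1-4 Hz beacon band)--not how any actual vehicle stack works:

```python
# Toy strobe detector: given a per-frame brightness trace (0..1) for one
# bright region of the image, estimate its flash rate and check whether
# it falls in an assumed emergency-beacon band.

FPS = 30.0  # assumed camera frame rate

def flash_rate_hz(brightness_samples, fps=FPS, threshold=0.5):
    """Estimate flashes per second from a brightness trace."""
    on = [b > threshold for b in brightness_samples]
    # Each off->on transition is one flash...
    flashes = sum(1 for prev, cur in zip(on, on[1:]) if cur and not prev)
    # ...and a trace that starts lit counts as a flash, too.
    if on and on[0]:
        flashes += 1
    duration_s = len(brightness_samples) / fps
    return flashes / duration_s

def looks_like_emergency_strobe(brightness_samples, fps=FPS):
    """True if the region flashes in an assumed 1-4 Hz beacon band."""
    return 1.0 <= flash_rate_hz(brightness_samples, fps) <= 4.0
```

Note how this already answers the Christmas-lights question below: a steady porch light never crosses the threshold band, and typical decorative blinkers are far slower than an emergency beacon.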
Could you spoof it with a bunch of blinking xmas lights on the side of the road?
Unless you have some pretty heavy-duty strobing Christmas lights, probably not. That said, there'll probably be any number of ways you could spoof the behavior of an official vehicle. In doing so, though, I'd imagine that you'd fall afoul of the same impersonation laws that already exist and work quite effectively today.
Actually, the Dust Bowl was mostly caused by human actions, but please don't let _facts_ cause you to pull your head out of the sand.
Oh, sure, next thing you'll be trying to tell us that we're going to have a massive, multi-year drought because some East-coast scientists say that farmers are planting their crops wrong. You:
1. clearly know nothing about farming,
2. are obviously a shill for the Roosevelt administration, and
3. want us to throw out generations of farming wisdom and spend huge amounts of money on a problem that doesn't even exist.
Only an idiot could look at the past decade of incredible crop yields and scream that everything's going wrong. Get off the telegraph, moron.
Josiah H. Blough (Dust-Bowl-skeptic)
When the back door was made of cloth and paper, there wasn't much sense in trying to fool the user guarding the front gate. Now that we've locked that down with a steel door and a proper deadbolt, it's a lot easier to try to sneak past the guard--and it's a lot harder to upgrade a guard than it is to upgrade a door.
I think we're entering a period where forensics and an effective legal apparatus are going to become the primary means of defense.
When the computer is good enough that you haven't had to do any driving in the past 3 months, how much are you really going to be paying attention when something goes wrong?
I'd suggest that once this is consumer-ready, the vast majority of "something goes wrong" scenarios where the car doesn't know what to do would fall into one of two categories:
These things'll never, ever be perfect. They will almost undoubtedly reach a point where they're at least an order of magnitude safer than humans, though. That'll be more than good enough for most people.
Biology grows on you.