What qualifications should the 'driver' of a fully autonomous car need?

Displaying poll results.
Driver's license plus autonomous car license
  4626 votes / 28%
Driver's license
  2558 votes / 15%
Just insurance
  2262 votes / 13%
Basic permit, requiring no tests
  289 votes / 1%
Age restriction only
  484 votes / 2%
No restriction
  1777 votes / 10%
Driver must wear a funny hat
  4302 votes / 26%
16298 total votes.
  • Don't complain about lack of options. You've got to pick a few when you do multiple choice. Those are the breaks.
  • Feel free to suggest poll ideas if you're feeling creative. I'd strongly suggest reading the past polls first.
  • This whole thing is wildly inaccurate. Rounding errors, ballot stuffers, dynamic IPs, firewalls. If you're using these numbers to do anything important, you're insane.
This discussion has been archived. No new comments can be posted.

What qualifications should the 'driver' of a fully autonomous car need?

  • by Grant_Watson (312705) on Saturday May 17, 2014 @02:41PM (#47026817)

    It would make the most sense to require fewer qualifications as the technology becomes more proven; it could start requiring a driver's license with an endorsement and, as the cars become more capable and the kinks are worked out, go down to no license. But gradual deregulation tends to run counter to a bureaucracy's instincts and when the political process steps in it tends to do so suddenly, so I don't know if the idea would work in practice.

    • by fustakrakich (1673220) on Saturday May 17, 2014 @04:07PM (#47027333) Journal

      Best to make 'em go cold turkey. But autonomous has to mean autonomous. If the operator does not control the vehicle, aside from its destination, then he should not be held responsible any more than the guy selecting the floor in the elevator.

    • But gradual deregulation tends to run counter to a bureaucracy's instincts and when the political process steps in it tends to do so suddenly, so I don't know if the idea would work in practice.

      Not to mention that there are other bureaucratic elements (car manufacturers, insurance companies, etc.) that might have an interest in requiring a license, because that would imply that you still have the necessary skills and training to take over in an unusual situation... which also means you'd share liability for the accident in some cases. The possibility of blaming the driver goes away completely if you no longer have proof the driver could even operate the vehicle.

      Even if states waive the license r

    • by Lesrahpem (687242)
      Working from a license with an endorsement down to no license makes sense to me. I would like to see insurance, at least liability insurance, disappear as well. I don't think the owner of an autonomous vehicle should be liable if it crashes into something or runs someone over.
      • If your car hits my car I don't care if you are driving it or not; I care that someone pays for the damage done. So I think it's more than reasonable to require vehicles on the road, autonomous or not, to have a named entity who will be required and able to pay up in the first instance for the damage they do.

        Exactly how the responsibility should be split between manufacturer and owner is more open to question. I could see an arrangement where the autonomous car has a service arrangement which includes liability cover.

        • by jeffmeden (135043)

          If your car hits my car I don't care if you are driving it or not; I care that someone pays for the damage done. So I think it's more than reasonable to require vehicles on the road, autonomous or not, to have a named entity who will be required and able to pay up in the first instance for the damage they do.

          Exactly how the responsibility should be split between manufacturer and owner is more open to question. I could see an arrangement where the autonomous car has a service arrangement which includes liability cover.

          Be careful what you wish for: autonomous cars are going to contain such a huge array of cameras and other sensors that they will readily be able to prove if you, for one split second, diverted your attention, strayed from your lane, followed too closely, exceeded the speed limit, or committed any other violation (which every human driver does dozens of times on every drive), and you will be summarily lawyered to death to prove you were actually not liable.

    • The real question is "will the driver be expected to take over when things go wrong"?

      If the vehicle is truly fully autonomous, with no expectation of driver input even in emergencies, then I wouldn't see any need for a driver's license (I would see a need for liability insurance on the vehicle). If there is an expectation that the driver will take over in an emergency, then the driver needs full driver training and possibly more (training/testing on keeping their attention on the road even though they aren't

  • by Anonymous Coward on Saturday May 17, 2014 @03:16PM (#47027015)

    and then not absolving them of the responsibility is just cruel. Nobody is going to have the presence of mind to react after they've been lulled by hours upon hours of not having to do any driving. Either the car is autonomous, and then the company who makes the car's algorithms or an insurance company must be responsible, or the car isn't autonomous and then it shouldn't pretend to be.

    • by CastrTroy (595695) on Saturday May 17, 2014 @08:02PM (#47028641) Homepage
      Very much agree with this. I think this is the current problem with autonomous cars. They require that the driver be ready to take over at any time in case something goes wrong. However, once the person's been sitting in the car for a few months and never had to take control, how much are they really going to be paying attention? Not to mention, if you're still required to take control, that means you can't be reading a book, playing video games, or doing any of the things you'd rather be doing than driving. So what's the point of even getting one in the first place? If you have to pay attention to the road, you might as well be actually driving.
      • Well, yes and no. If you're going to let autonomous cars drive in the same environments where you let people drive cars right now, as opposed to separated lanes on a few specially designed and maintained roads, the human ultimately should be responsible. For example, if the car crashes because you didn't clean the bird shit off the camera, or because you didn't notice that the radar antenna in the front got dinged by a shopping cart and is about to fall off, that's your fault.
        • Re: (Score:2, Informative)

          by Anonymous Coward

          Wrong. That is the manufacturer's fault for the car not notifying the owner of such problems and attempting to operate anyhow.

        • Heh, the autonomous car will drive itself to the nearest garage if it detects any anomaly.

          Really though, an autonomous car has to be considered like an elevator. And I don't see where a license should be required to select your destination with the push of a button.

          • Well, if the cars only went on special roads, you'd be right, and just like the elevator, the people who are responsible for maintaining the roads are responsible for any accidents. But normal elevators don't pop off their rails and walk you to some arbitrary location on your chosen floor. My point is that autonomous cars should be considered like an airplane with a super-duper autopilot. Sure, the pilot just pushes a button and the plane does its thing, but the pilot is still responsible for a visual inspection
            • Dunno how it is in the US, but here in the Netherlands a driver is obligated to check the car before any trip. Check functionality of lights, check tire pressure, stuff like that.
              In my 30 years I haven't seen anyone do that, except before really long trips (holiday). Not for a daily commute.
              An autonomous car should have some redundancy in critical parts, at least enough to drive safely to an emergency parking area or something like that. It should test the functionality of the sensors before each trip. It should

        • by jtownatpunk.net (245670) on Sunday May 18, 2014 @05:39PM (#47033951)

          That's like saying I should be at fault if I'm a passenger on a bus and it crashes. Sure, if I interfered with the driver or slashed a tire before getting on, I'd be responsible. But, if I get on without sabotaging the bus and sit quietly in my seat without distracting the driver, how can I be at fault if there's an accident?

          As far as the auto-driving car goes, I didn't write the software or design the hardware. If it passed federal and state guidelines, it's not my fault if it fails. If I've maintained the vehicle properly and installed all of the required software updates, that should be the extent of my responsibility and liability. Also, this is what insurance is for. My insurance pays the aggrieved parties, then goes after the manufacturer if a cost/benefit analysis says they should.

    • by chuckugly (2030942) on Sunday May 18, 2014 @11:43AM (#47031871)
      Depends on what it means to be required to take control. I suspect the most likely scenario is not the one people often imagine where some split second emergency has to be avoided by heroic driver effort. This seems unlikely; almost no one would get it right and accidents would result. More likely the car detects a situation it doesn't know how to deal with, comes to a safe stop and beeps at the occupants until someone drives around the obstacle and re-engages auto mode.
    • by gman003 (1693318)

      No. The driver should not be expected to react during autonomous mode, but autonomous mode should fail gracefully. I can easily envision scenarios (construction, unmarked road, adverse weather) where the best thing the AI can do is give up, stop, and let the human take over. For those situations, the car should have at least one occupant who is able to drive.

      However, the AI should not just dump that responsibility back on the user. I would say that the AI can only hand over control when the vehicle is completely stopped.

  • Autonomous Taxis (Score:5, Insightful)

    by Anonymous Coward on Saturday May 17, 2014 @03:25PM (#47027083)

    No licence should be required, because an autonomous car is effectively a chauffeured vehicle, and since when do you need a licence to be driven around? It's my strong suspicion that autonomous vehicles will first hit the scene in the form of autonomous taxis. At first these vehicles are going to be fairly expensive and out of most people's budgets, but even if you could afford your own autonomous vehicle, would you really want one? Why deal with car ownership when you can call up an autonomous taxi to pick you up anywhere and take you where you need to go, or even schedule pickups so a car is waiting for you when you get out of work? All the liability would be with the taxi company, so you're not responsible for any accidents, you wouldn't have to maintain your vehicle, and there's no risk of it being stolen or damaged while it sits idle outside your house. You'd never have to deal with parking again.

    • by leuk_he (194174)

      In some tests by TU Delft [vislab.it], the autonomous car trials are now focusing on how the driver takes over from the car. The arrival of autonomous cars will be gradual. First there will be advanced lane assist and the takeover of tasks that are easy for computers. And of course driving in a "train" with minimal space between cars. More and more functions will be added.

      Fully autonomous is still a long way off; e.g., determining the final street where you want to go might be a very hard task. GPS navigation rarel

  • by Anonyme Connard (218057) on Saturday May 17, 2014 @04:28PM (#47027495)

    The computer may have a failure in the middle of nowhere. If the system is well designed, the car should stop safely. But then?
    The user may want -- or need -- to go somewhere the computer can't drive, e.g. off road.
    Therefore the user should have a driver's license. If it is not mandatory, the user must at least be informed that he may be in trouble in some situations. A driver's license is not only about road regulation; it is also about the ability to handle a car.

    An "autonomous car license" should require some basic knowledge about what the computer can do and can't do, and how to use it: the destination is not always an address.

  • by artor3 (1344997) on Saturday May 17, 2014 @05:10PM (#47027773)

    If the car is truly "fully autonomous" as the question suggests, then the human is just a passenger. Since when do we need a license or insurance to be a passenger? Some age restriction would be nice, so that little five-year-old Jimmy doesn't steal the family car for an automated trip to Disney World, but anything beyond that is just clinging to the past.

    • by LainTouko (926420)
      Well, someone has to have insurance. The reason passengers obviously don't have to have insurance at the moment is because they can rely on the driver having it. (Or at least they should be able to.) If we eliminate drivers from the equation, we don't eliminate the need for insurance, it just becomes less obvious who should be responsible.
      • by AmiMoJo (196126) * <mojoNO@SPAMworld3.net> on Saturday May 17, 2014 @05:37PM (#47027961) Homepage

        It does indicate the qualifications that should be required though. I voted for "none", because you don't need a license to use your private chauffeur driven car, which is essentially what this is.

      • Well, someone has to have insurance.

        Sure, but that should be the company I bought the car from, factored into the price. It's the software the car maker provided making the driving choices, so why should I be responsible in any way for the choices it makes?

        • by Kjella (173770)

          Sure, but that should be the company I bought the car from, factored into the price. It's the software the car maker provided making the driving choices, so why should I be responsible in any way for the choices it makes?

          So if you don't bring it in for service like ever and the brakes fail or you haven't patched the car's software in 10 years, is it still their fault? If you want Google to own that liability, you can also expect Google to set demands.

          • So if you don't bring it in for service like ever and the brakes fail or you haven't patched the car's software in 10 years, is it still their fault? If you want Google to own that liability, you can also expect Google to set demands.

            I have no problem with that. It's an autonomous vehicle, it can drive itself to the nearest link for software updates or over to the repair depot at night. If repairs are going to take more than my normal downtime or a nighttime emergency comes up, a taxi should be able to come and get me.

            I actually anticipate a day when most people don't own their own cars, they subscribe to an on-demand point-to-point transport service instead.

            • I actually anticipate a day when most people don't own their own cars, they subscribe to an on-demand point-to-point transport service instead.

              Before cars we used to have something like this...they were called trains.

              • Really? When were trains on-demand point-to-point? Did they build tracks from your house straight to grandma's house, or what?
                • by kyrsjo (2420192)

                  You might have to switch to a subway / bus / tram and (gasp!) maybe walk a few hundred meters.

                  • Not sure where you live where there's a train/subway/bus/tram within a few hundred meters of everyone's house.

                    I had a set of grandparents that lived 30 km out in the country. The nearest road a bus would have ever used was about 15 km from their house.

                    Note for the unthinking - autonomous cars aren't just for city-dwellers.

                  • As CrimsonAvenger points out, not everybody lives near a train depot.

                    You've also completely ignored the "on-demand" part of the statement. Trains don't depart on your schedule. A train won't pick you up at your front door at 3AM because you have a 6AM flight and the nearest airport is in a city 100km away. A subscription or rental service for autonomous cars should be able to handle that with no sweat.

                    • by kyrsjo (2420192)

                      I was answering your comment, which seemed to imply that they are never point-to-point, when the system as a whole often is. And for the on-demand thing: if it leaves every 10 minutes or so, that's close enough.

                      But sure, if you live out in the countryside, it gets harder, especially during the night. However, it's funny that you mention airports, as they are often quite well served by public transport. I have myself several times taken the airport express train or bus to catch an early flight.

                    • I was answering your comment, which seemed to imply that they are never point-to-point, when the system as a whole often is. And for the on-demand thing: if it leaves every 10 minutes or so, that's close enough.

                      I'm not trying to imply anything, I want to proclaim it boldly! Trains are not point-to-point. They don't go from your door to a destination, and don't generally provide direct routes. They don't depart at your convenience, and in the US, at least, they don't depart every ten minutes. In fact, there is virtually no reliable subway/tram/train service throughout the US except for a relatively small number of cities - more on why below.

                      Even where trains are available, it's not what I meant by point-to-point.

          • The car refuses to drive itself if it's not maintained on schedule.
    • . . . and that passenger might just hit Ctrl-Alt-Del.

      Folks call me up all the time when they have computer problems. Now they'll be calling me with car trouble, as well.

      " . . . but I didn't change anything . . . really . . . it just stopped working . . . but I wasn't doing anything at the time . . ."

    • Completely agree if the car is FULLY autonomous. If it is partially autonomous, e.g. it sometimes needs human intervention, then it is a very confusing situation. I would have no desire to own a partially-autonomous car because of the unclear responsibility. If I need to pay constant attention to what the car is doing, then I want to be driving.

  • by emil (695) on Saturday May 17, 2014 @06:05PM (#47028089) Homepage

    If not, sorry, you can't drive this car.

    "Unix is user friendly - it's just picky about it's friends."

  • In order for the hardware and software for a fully autonomous vehicle to really be deemed safe to be 100% in control of the vehicle on public roads, I'd think the testing the government should require (at least on the level of software testing and qualification for aircraft) would make them so expensive to purchase and operate/maintain that only people with money to throw around will own one. Regardless, there's no way in my world, at least, that you could be the sole occupant of such a vehicle and not be a fully qualified driver.
    • by RR (64484) on Sunday May 18, 2014 @03:45AM (#47030243)

      In order for the hardware and software for a fully autonomous vehicle to really be deemed safe to be 100% in control of the vehicle on public roads...

      If only humans were safe, 100% in control of vehicles, when on public roads. As a bicyclist, I wish fewer humans drove cars.

      Just because the car is supposedly 'autonomous' does not mean it's so sophisticated that it can handle any situation that comes up, especially when most of the other vehicles on the road are not autonomous and therefore must be considered unpredictable.

      The Google autonomous cars are intended for, and have been tested on, city streets, which are emphatically not full of predictable autonomous cars. So far, they've been in one accident, and the other party was at fault. Humans get into that type of accident all the time.

      Autonomous cars are great because they can have much faster reflexes and much deeper memories of how to avoid accidents than humans. Find a way to avoid more accidents, send an update, and all the cars running the same software would avoid those accidents in the future. Contrast that with humans, who decline with age and die, and are replaced with new idiot teenagers all the time. Google is not doing autonomous cars because they want a dystopian future without humans, like those lame protesters say, but because they're trying to save lives. [ted.com]

      At this point, the greatest fear I have about autonomous cars is that lame regulators will make it so difficult to approve them that they never become available to the public.

      • by kheldan (1460303)
        There will ALWAYS be situations that any sort of auto-pilot will NOT be able to handle, and that is why aircraft still have manual controls with fully qualified and experienced pilots sitting there overseeing the autopilot's operation and taking control where necessary or desired. For precisely the same reasons all motor vehicle operators should continue to be trained, tested for competency, licensed, and should strive to be experienced as drivers. It isn't always the case right now, but my main contention
        • by RR (64484) on Monday May 19, 2014 @05:33PM (#47041531)

          There will ALWAYS be situations that any sort of auto-pilot will NOT be able to handle, and that is why aircraft still have manual controls with fully qualified and experienced pilots sitting there overseeing the autopilot's operation and taking control where necessary or desired.

          There's a major difference: In an aircraft, you're always minutes away from falling out of the sky in fiery doom. A car has the option of pulling over and stopping. Also, I've been watching Mayday, [wikipedia.org] and all of the autopilot accidents have been a result of poor user interface design. If an autopilot has difficulty, then a human pilot will have difficulty. On the other hand, the Miracle on the Hudson [wikipedia.org] was facilitated by good use of the autopilot, to make corrections that a human would not be able to handle, in total contrast to that hijacking off Africa. [wikipedia.org]

          For precisely the same reasons all motor vehicle operators should continue to be trained, tested for competency, licensed, and should strive to be experienced as drivers. ... I suspect you, personally, find driving a car to be a chore that you hate, and would rather just let the deus ex machina take the wheel from you instead, and damn the consequences.

          Certainly, the operator of the car should be experienced and properly licensed. Again, as a bicyclist, I think more people should be using human power to move themselves, and not going around in multi-ton metal death boxes like it's a human right. I drive a manual transmission, so I already appreciate how people are deferring to the car's engineering, especially in boring situations.

          Far from being a deus ex machina, an autonomous car is an engineered product. In principle, you can examine its code and analyze how it works. Once it works, it will work the same way every time, unless there are software updates or faults in the sensors. In contrast, your God-given brain is messy and unpredictable. The longer you go without an accident, the more complacent you become. The more safety features you have, the more careless you become. [wikipedia.org] And as long as you go without accidents, the DMV does not bother testing your driving ability, but just renews your license sight-unseen. The current situation is demonstrably not safe.

          The big question is whether having the car drive itself would make the humans' skills atrophy. My guess is that it would improve safety, having the humans drive only in tricky situations where they know they have to be careful. And the rest of the time, the computer would be driving with all of the safety techniques that it knows, constantly alert.

  • Poll question not valid.

    "driver" means, in this context, passenger then?

    is the question assuming that the law would apply to *at least* one passenger but not others?

    is it asking if all passengers in the autonomous vehicle must have the listed options?

    is it about manual override? is the idea that theoretically a law might or might not want the person to be able to "take over"? are we assuming the car will be mandated to have manual override?

    or is it asking about the ***person who programs the software***?

    because that's who's driving the "autonomous" vehicle...the glorious bastards who code & program the thing

  • If the driver needs to be able to step in at any moment, it's not a fully autonomous car in any meaningful sense, it's just a tech demo.

    The only point of a fully autonomous car is so you don't have to drive it. This means (i) being able to concentrate on stuff other than the road, like a book, a movie, or being asleep and (ii) if you're not going to be driving it anyway, why bother acquiring the skill to do so, and the certification to prove you can? Hence, just insurance.

    I really can't see the point in buy

  • Like totally missed the elephant in the room there.

    [ ] Regular MOT-type tests to show that the car is in good working order and that the autonomous systems haven't been interfered with (to allow speeding, murder attempts, etc.).

    • by MrL0G1C (867445)

      Bollocks, one day I'll learn to read.

      Stupid poll anyway - 'autonomous' cars don't have drivers.

      Next poll:
      Should new laws be passed to hold mountains responsible for the people who fall off of them?
      [ ] yes
      [ ] no

  • I am considering that the car may be sent (without passengers) to "Go pickup hubby from work" after the spouse has used the car during the day.

    In that scenario the car would be just like a "Johnny Cab" without the passengers.

  • If it's truly autonomous, with no manual override (or the override can be locked out and proved to be locked out) then why have any restrictions at all? Of course then the rider is really a passenger, not a driver.

    If the car has a way to let the passenger take manual control and override the autopilot, then the passenger has become a driver and should be properly licensed.

    While I don't discuss the licensing issues, my book The Reticuli Deception (set about 100 years from now) has several scenes invo

  • Seems to me, buying an autonomous car should not absolve the owner of liability or responsibility, and the owner (or a designee) needs to be in the vehicle as it is being operated. The owner has to have adequate insurance and have passed a valid written exam, which would cover the vehicle code and the limitations of the technology.

    For car companies, it seems they should have to prove the technology out with a minimum of 5 years of safe driving record in controlled studies (just like the FDA would require controlled studies to
  • Something tells me that autonomous cars will always have a manual override. There are times when it will be easier to take direct control of a vehicle than to communicate to a computer what you want done (e.g. parking in a particular way to load or off-load cargo). There are times when computers make very poor decisions, either being too safe or too dangerous, because of poor or outdated information. (Consider the periodic quirky routes generated by GPS navigation systems.) There are also people who need

  • Until I can legally drink a White Russian and chill out with a joint, some tunes and Wii U Bowling while being driven around by my autonomous car, it's not really autonomous is it, man?

  • I think drivers of autonomous cars should have licenses. Even if they aren't directly driving, they're still responsible for keeping the vehicle in proper functioning condition.

    I do, however, think there should be reasonable exceptions for people who can't get a regular driver's license because of a physical disability such as bad eyesight.

  • by john_uy (187459)

    Maybe some EE and mechanical license to go with it, for troubleshooting in case of problems. :P

    But if it will be autonomous, the operator will be the one responsible.

  • Insurance is already a scam. Let's not make it more so.

    As a driver who owns two cars, you have to insure both cars if you want to alternate driving them, rather than insuring yourself as a driver, which is pretty much the definition of a scam.

  • If you really do put people in fully autonomous cars, will they ever get where they want to go? Where will cars go if they are fully autonomous? Will they choose to go anywhere, or just rest a lot? I'd prefer an "automatic" car that goes wherever I tell it to without me having to drive it.
  • . . . the missing choice.
  • I don't know what an autonomous car is, exactly, as one has never been released. Once you describe to me what is meant by autonomous cars, their capabilities and liabilities, then I'll tell you what qualifications a human passenger must have.

    Right now, it's like asking if extraterrestrial life forms should be granted citizenship in the USA. If they are sentient and as intelligent on average as the average voter, then yes. If they have the intelligence and communication skills of a non-human great ape, then no.

  • Insurance should be held by the owner of a car.

    Here are the main reasons why none of the other crap should be relevant.

    1) Old people whose vision and reflexes are not up to snuff.

    2) Drunk people coming home from a party.

    Basically, the entire point of a driverless car is not to give the driver a break - that is stupid. It's not a driverless car if you need someone behind the wheel qualified to drive it. Nor should they be responsible for what the driverless car does.

    Driverless cars can destroy drunk driving

  • Autonomous driving should be thought of as a tool to make operating your vehicle safer, not as a luxury that allows you to sit back and take a nap. Flying a plane is mostly autopilot (except for takeoff), but pilots are required to have rigorous training. Why would a car autopilot be any different? You need to be prepared for the worst. And no matter how good the technology gets, there will ALWAYS be situations that require manual override.

    And this is where the Slashdot idealists come in and say "well if that'
