If I chose to send my drone (toy) flying around a busy parking lot and a gust of wind sent it crashing into a baby stroller, I would be responsible.
Ok, that's a reasonable analogy. But I think it's 'wrong' on two points.
First, it fails the scale test.
Cars are not small hobby toys, and car accidents happen far more frequently than windblown drones crashing into baby strollers.
In other words, the analogy isn't applicable because if you scaled it up society would NOT be content with the status quo... that of simply holding you liable for your bad decision.
If it were happening thousands of times per day we'd surely see all kinds of new restrictions, regulations, licensing, and mandatory training and insurance for hobby drones. Drone manufacturers would be required to make drones that automatically detect windy weather and land, or refuse to fly in it at all. Perhaps we'd even see an outright ban on private citizens owning hobby drones.
Second, your analogy fails because the idea of it being your operational decision -- choosing to watch youtube in busy traffic versus driving yourself -- is really missing the obvious endgame. We already know various industries (taxi/trucking/delivery/...) all want self-driving cars. There won't be drivers, only passengers, and the passengers won't be making any operational decisions; there may not even BE passengers in lots of cases. When there are passengers, they may not even be able to drive. They may be drunk, or sleeping, or children...
Who is liable for the accidents those vehicles cause?
The passenger? Surely not. They aren't operating the vehicle beyond having called it up and set a destination.
What error in judgement did they make that makes them liable? Provided they maintained the vehicle to the manufacturer's specifications, how are they responsible for accidents resulting from deficiencies in the vehicle's programming, sensor coverage, or testing?
Chrysler/GM/VW/Tesla? It makes sense. They foisted the vehicles on the public. If they crash, it is because the vehicle wasn't sufficiently able to cope with doing the thing it was made to do. Operating in traffic in the real world safely is their function. That includes windy days, or in traffic jams, or during a police road closure or construction detour. If they are not fit to operate reliably, predictably, and safely in all these scenarios then they shouldn't be sold as self-driving cars.
> I can choose to watch Youtube in busy traffic.
*Right now*, yes, there is this notion that the 'driver' is still operating the vehicle and is responsible for deciding whether or not the vehicle operates autonomously... but that's today, right now, this minute. We're in the beginning of a transition phase. Next year the cars will cope with more scenarios, and do it better. The year after that, more still. 20 years from now, situations they can't safely cope with will be much rarer, and the idea that the person sitting in the front seat is responsible minute by minute for whether the car should operate itself will be ridiculous.
We need to consider the future. Because this little window in time where cars can drive themselves safely... but only sometimes, and only when it's really easy... is going to be quite temporary.