> What error in judgement did they make that makes them liable?
> That's not the legal, or fair, standard. The results of my actions are the results, whether I made an error in judgement or just got unlucky.
Actually, you probably will want to read up on tort law, specifically the standards for negligence. In the most detailed legal analysis, there are a number of elements to proving negligence. Along the way, you must establish that a defendant had a "duty" to act in a certain way and then "breached" that duty in some way. But you not only need to prove that the defendant's actions caused something; you also need to prove that they were a direct and legally relevant cause of the harm. Events always have multiple causes -- a lot of tort law is about sorting out which causes are legally relevant and which aren't.
Example: if you run over a pedestrian because you were drinking a soda and not paying attention to the road, a plaintiff generally can't bring a successful action against the shop that sold you the soda. Yes, the fact that you were distracted due to the soda was a cause of the accident, but it wasn't a legally relevant cause, because you were the one driving poorly, and your choice of distraction isn't the fault of the soda shop.
On the other hand, if you run over a pedestrian because you were drinking a bottle of whiskey, and you had bought that bottle after walking into a liquor store noticeably drunk, and surveillance footage has you on camera saying, "Yes, I'm gonna drink this and I'm gonna be drivin' all over town tonight -- I don't give a crap if I hit someone..." -- well, in that case, the pedestrian who was struck might actually have a case to sue the liquor store, because they sold a dangerous item to someone who was already visibly unable to handle it and who declared he was going to use it improperly.
We could just as easily create scenarios with your "log in the road" example too where one person bears primary legal responsibility, or another party, or both.
The problem with accidents and driving is that, unlike most other tort cases, pervasive and legally required insurance has led to default assumptions about where liability must lie in almost all scenarios. Thus, even in cases where a provable manufacturer defect was the primary cause of a crash, you'll frequently still see the drivers' insurance companies arguing over who has to pay damages. That's not how it works in most other legal scenarios -- in some cases, the product manufacturer may be primarily liable and a suit against the user could NOT be successful (and might even be summarily dismissed by a court), depending on the assumptions of "normal" product use and what happened.
According to your legal theory of negligence, consumers in fact could NEVER sue product manufacturers, since "the results of my actions are the results"... and you're apparently solely responsible for them, even if the product blows up unexpectedly on you -- after all, it was your fault for using it in the first place.
> Maybe such a thing will be sold some day. Right now, cruise control and automatic braking aren't anywhere near what you've described.
Then the cars aren't actually "self-driving." Until a car has the ability to handle ALL reasonably foreseeable road conditions as well as (or better than) a human driver, it should not be sold as a "self-driving car." And note that "reasonably foreseeable" goes back to the legal issue again. Just as the soda shop can't reasonably foresee that you'd hold a soda cup up in front of your face for a full 10 seconds while driving before plowing into a pedestrian, there will likely be scenarios where people try to operate "self-driving cars" in situations that a car manufacturer might never consider. But there will also be plenty of conditions the manufacturer WILL consider "reasonable," and if the car causes an accident in those circumstances, the manufacturer should be held liable.
> If UPS's truck rear-ends me on an ice-covered road, I'm going to sue UPS.
This is an incredibly bizarre argument to make right in the middle of what you claim to be your strict direct-proximate-causation theory of negligence. Unless UPS packages deliver themselves to customers' doorsteps, surely there's a driver in that truck. Why aren't you suing HIM, according to your theory of negligence? What does UPS have to do with it? If that driver hadn't started up the truck and taken it on its route, the hazard wouldn't have been created.
And of course UPS DOES have something to do with it, because they deployed the truck fleet. So even if there is a closer proximate cause (the driver putting the truck on the road), there may also be a corporation behind that action that should assume some of the blame. That is EXACTLY the same legal argument (which you implicitly assumed) that makes the self-driving truck company liable. Just as UPS chose to deploy the fleet, so Tesla chose to market the trucks as "self-driving." Depending on whether UPS operated the trucks within spec or not, you may be able to successfully sue UPS or Tesla or both.
Bottom line: events have more than one cause, both in the real world and legally. Not all of those causes are legally the most important. And a guy sitting in the back seat of a "self-driving" car is generally not going to be as responsible as the company that marketed the car as "self-driving" to begin with. Car companies can't be allowed to escape all liability with a 25-page statement saying, "Don't ever use this car under X, Y, Z, and 9583 other conditions" when honoring that isn't practical. If they are going to sell it as a "self-driving" car, it needs to be able to operate under the reasonably foreseeable range of road conditions.
To take your example of icy roads: on a single trip, road conditions may change from clear to icy. A true "self-driving" car must be equipped with some mechanism to detect potentially unsafe road conditions and to warn the passenger that travel is no longer safe. If the passenger then chooses to override that mechanism and have the car continue driving anyway, there should be a clear warning that he is now legally responsible for any damages the car may cause while operating outside of its parameters. Without such a detection and warning system, why should a passenger be liable in something marketed as a "self-driving" car? Part of the duty of a driver is to pay attention to road conditions; a "self-driving" car by definition needs to be able to do the same and adjust its behavior accordingly.