Here's what I think your argument is: it was negligent for Tesla to ship a feature when a reasonable person would foresee substantial misuse of it leading to death.
This kind of product liability in cars has a long line of cases working through these elements... see Jablonski v. Ford Motor Company for a recent example.
When the use or misuse of a product results in death, the burden of diligence falls on both the manufacturer and the operator. Demanding that every product be perfectly safe is an impossible standard; letting manufacturers off the hook entirely is a wild ride too.
Now suppose you're the product engineer looking at data saying that when Autopilot is used correctly it's expected to save lives, and that it only adds to the accident rate when the feature is misused. That phenomenon describes pretty much every safety feature ever added to cars. ABS... great until you try the old-fashioned pump-the-brakes technique. Air bags... awesome unless you put a child seat in the front seat.
So as that product engineer, if you don't roll the feature out, you'll save a few people who would have misused it, and you'll lose others who would have been saved by using it correctly. Beyond case law, basic ethics kick in.
Autonomous cars are going to produce some crazy case law!!