No, but I can imagine a change to the legal system limiting the liability of the manufacturers of self-driving cars.
If we could know that self-driving cars reduce accidents by 95% (not an unrealistic figure), it would be morally wrong for us not to put them on the road. If the only hurdle the manufacturers had left was the liability issue, then it would be morally wrong for Congress not to change the laws.
Of course, Congress has been morally bankrupt since, oh, about 1789, so I doubt they'll see this as an imperative. On the other hand, I can imagine the car makers paying lobbyists and making campaign contributions to ensure that self-driving car manufacturers are exempted from these lawsuits, so it could still happen.
If corporations have the same rights as people in our framework of laws, why should they not be subject to the same penalties, including the death penalty (in those jurisdictions that have it)? By limiting the liability of a corporation, you are placing a higher value on its survival than on that of an actual person.
Limited liability is fine as long as the corporation is viewed as a collection of persons who can be held individually responsible for malfeasance. But the moment you equate the corporation to a person, in any sense, it should suffer the same consequences along with the privileges.