I have been surprised by how few others have mentioned Asimov's laws.
For the trolley-style train argument, the First Law says to save the most lives by sheer number. No debate needed.
For the car crash, run the numbers, especially if you know the number and locations of the passengers. If the choice is between a small car carrying only its driver and a minivan full of children, that is one high-risk injury versus seven lower-risk injuries, and one is better than seven.
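The "run the numbers" step can be sketched as a simple expected-harm comparison. The risk and severity figures below are invented purely for illustration; a real system would derive them from sensor data and crash models.

```python
def expected_harm(occupants, injury_risk, severity):
    """Expected harm: people at risk x probability of injury x severity.

    All inputs here are hypothetical placeholders, not real crash statistics.
    """
    return occupants * injury_risk * severity

# Option A: small car, one driver, high-risk injury.
small_car = expected_harm(occupants=1, injury_risk=0.9, severity=1.0)

# Option B: minivan, seven children, lower-risk injuries.
minivan = expected_harm(occupants=7, injury_risk=0.3, severity=0.5)

# Choose the option with the lower expected harm.
choice = "small car" if small_car < minivan else "minivan"
```

With these made-up weights the small car comes out as the lesser harm, matching the one-versus-seven intuition above; change the weights and the answer can flip, which is exactly why the hard part is assigning the numbers, not comparing them.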
Where it gets tricky (and Asimov wrote several stories along these lines) is in cases where the numbers are hard to determine.
When the choice is between an older civil servant with a high probability of survival and a young child with a low probability of survival, the robot chose based on survival probability. The lawyers and statisticians agreed it was correct based on the odds. The civil servant whose life was saved disagreed and suffered from survivor's guilt: it was clear from his life choices and his career that he would willingly have given his life for the child, no matter how low the odds. A variant of this ended up in the I, Robot movie.
Another one posed not just by Asimov but also sometimes faced in real life: if you can only save the life of the mother or the infant, it is guaranteed at least one will die, and the odds of saving either individual are slim, which do you work on? Those who study law, medicine, and ethics have reached the consensus that you focus on saving the mother, not the child.
Getting closer to the challenge of autonomous vehicles: swerve left and hit the 5-year-old; swerve right and hit the 17-year-old; do nothing and hit them both. At least one must be struck. Again you can play the odds; the 17-year-old is bigger and probably has a better chance of survival.
Taking it further: if the choice is between hitting a 5-year-old child and hitting an elderly great-grandparent, and both carry a high risk of death, the algorithm designer may decide that the young child's many remaining years are worth more than the elderly person's few. Assigning such a number is hard, but it is something that is increasingly important in autonomous decisions.
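One common way designers formalize that trade-off is expected life-years lost rather than raw fatality counts. This is a hedged sketch of that idea only; the life expectancy and risk figures are assumptions, not data.

```python
LIFE_EXPECTANCY = 80  # assumed population average, for illustration only

def expected_life_years_lost(age, death_risk):
    """Remaining years (never negative) weighted by probability of death."""
    return max(LIFE_EXPECTANCY - age, 0) * death_risk

# Same hypothetical death risk for both; only remaining years differ.
child = expected_life_years_lost(age=5, death_risk=0.8)   # 60.0 expected years lost
elder = expected_life_years_lost(age=90, death_risk=0.8)  # 0.0 expected years lost
```

Under this metric the elderly person is struck, which is precisely the kind of value judgment (is a life-year the right unit at all?) that makes the weighting controversial.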
As far as the law and liability are concerned, you still must play the numbers game. An autonomous vehicle is going to keep a log of that type of decision. Hit the single-occupant car instead of 7 children: easy. Hit a large vehicle instead of unprotected pedestrians: again easy. Hit one child instead of two children: fewer fatalities is better. As long as the paper trail shows the least statistically bad option was taken under the circumstances, it is difficult to argue it was the wrong action. The statistics themselves may be up for debate, but choosing the least-bad option in an all-bad scenario is typically the best you can do.