But you see, your answer perfectly explains the bias. You see it as a matter of deliberately killing 3 people to save 300; I see it as a matter of letting 300 people die because I lack the testicular fortitude to act and sacrifice the 3. Either way, at least 3 people are going to die.
Going back to the crux of TFA, there's a scene in 'I, Robot' where Del (Will Smith) is recounting to the engineer lady how he got his robotic arm. He says that as his car and another containing a young girl were sinking into the water, a robot came and saved him instead of the child. She remarks 'The robot was a difference engine; it worked out that you had a higher percent chance of surviving than she did', and he replies 'I know, but 11% was more than enough.'
Anyway, my point is this: take a scenario like that, and say the machine does try to save the girl against 11% odds and fails. There would be a lawsuit anyway. There's really nothing to be done but to make the machine as logical as possible. That way, the laws of physics and the difference engine's calculations are the deciding factors in a car's actions, not whether it has been programmed to pay special attention to X special interest group of people. Aside from avoiding obvious mass hazards like explosives and fuel stations, I doubt there is much that can be programmed anyway.
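For what it's worth, here's a rough sketch (hypothetical Python, with invented names and numbers) of the kind of purely logical decision rule I'm describing: the machine picks whichever action maximizes expected survivors from physics-derived probabilities, with no special-interest weighting anywhere.

    # Hypothetical "difference engine" rescue policy: choose the action
    # that maximizes expected survivors, and nothing else.
    from dataclasses import dataclass

    @dataclass
    class Option:
        label: str                   # a rescue action the machine could take
        survival_odds: list[float]   # estimated survival probability per person

    def expected_survivors(opt: Option) -> float:
        # Expected number of survivors = sum of individual survival probabilities.
        return sum(opt.survival_odds)

    def decide(options: list[Option]) -> Option:
        # No weighting for age, fame, or any special interest group:
        # the probabilities are the only input.
        return max(options, key=expected_survivors)

    # Roughly the movie scenario: save the adult (45%) or the girl (11%).
    print(decide([Option("save adult", [0.45]),
                  Option("save girl", [0.11])]).label)   # -> save adult

Swap the numbers and the choice flips; the rule itself never changes, which is the whole point.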