Rule one: save all your passengers. Nobody would buy a car that treats the death of its passengers as an acceptable outcome, and Jeff from marketing will be on my ass otherwise.
Rule two: kill the fewest people outside the car. Done.
I know this is a thought experiment, but IMO it completely misses the point of self-driving cars. Sure, a human can be more moral than a car, but all it takes is being distracted for a second and you've killed all the babies on the pavement.
How's this for a thought experiment?
Say I build a self-driving car that, when faced with such cases, does the equivalent of "Jesus, take the wheel". The owner of the car is well aware of this.
In case of injury or death, who should go on trial?
Ooooh... I like that! There's bound to be a large body of case law on it. This argument (a) suggests that the person(s) constructing and/or maintaining a machine are culpable