Nah. Just follow the law and protect the medics.
It said I preferred to kill females... But I just chose to prefer the law/staying in the lane the car was already in...
I don't see what a false dichotomy has to do with morals. Choosing between killing a pregnant woman and an old man is not a standard of behaviour, nor is there any lesson to be learned from it. It is a lesser-of-two-evils dilemma. Both options are wrong in principle, and there are probably other options anyway. Wrong versus less wrong is not a moral choice; it is a lesser-of-two-evils choice (unavoidably amoral).
moral: 1. a lesson that can be derived from a story or experience; 2. standards of behaviour; principles of right and wrong.
E.g.: in the US elections there is no correct moral choice. War criminal/rape-victim witch-hunter vs racist/sexist/tax-avoiding/dumbdumb nut bag. You choose, you lose.
Yes, good point. Let me add some spark to the fire. What if the machine learns by itself? The base program is still man-made, but the rest is developed by the machine. At what point does one simply say the machine made a mistake rather than calling it programmer error?
If a human engineering team creates a car with adaptive software, and that software does not contain sufficient safeguards to ensure it at all times directs the vehicle in accordance with laws and applicable regulations, then the engineering team is liable. That is how engineering works, outside of the software industry. It doesn't matter how 'smart' the car is; it is not a force of nature or a human being. It is a product, and any catastrophe engendered thereby is the responsibility of the organization that produced it.
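Concretely, the kind of safeguard described above is often a deterministic, human-written rule layer sitting between the adaptive component and the actuators, vetoing anything that would break a hard constraint. Here is a minimal sketch of that pattern; the class names, the `propose_action` interface, and the two rules are purely hypothetical, illustrating the architecture rather than any real vehicle stack:

```python
# Illustrative sketch: a hard-coded safety layer wrapping a learned policy.
# All names and rules are hypothetical; real vehicle software is far more involved.

from dataclasses import dataclass

@dataclass
class Action:
    steering: float      # lane offset in metres (+ = left)
    target_speed: float  # m/s

@dataclass
class Situation:
    speed_limit: float   # m/s, from map/sign data
    lane_clear: bool     # is the current lane free of obstacles?

class LearnedPolicy:
    """Stands in for the adaptive, self-trained component."""
    def propose_action(self, situation: Situation) -> Action:
        # Whatever the model has learned to do -- possibly unsafe.
        return Action(steering=1.5, target_speed=situation.speed_limit + 10)

class SafetyGovernor:
    """Deterministic, human-written rules the learned part cannot override."""
    def __init__(self, policy: LearnedPolicy):
        self.policy = policy

    def act(self, situation: Situation) -> Action:
        proposed = self.policy.propose_action(situation)
        # Rule 1: never exceed the posted speed limit.
        speed = min(proposed.target_speed, situation.speed_limit)
        # Rule 2: stay in the current lane unless that lane is blocked.
        steering = proposed.steering if not situation.lane_clear else 0.0
        return Action(steering=steering, target_speed=speed)

if __name__ == "__main__":
    governor = SafetyGovernor(LearnedPolicy())
    print(governor.act(Situation(speed_limit=13.9, lane_clear=True)))
    # -> Action(steering=0.0, target_speed=13.9): the governor, not the model, has the last word.
```

The liability point falls out of the design: the governor is written and validated by the engineering team, so however the learned component behaves, any failure of the constraints traces back to the humans who defined (or omitted) them.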