I was listening to a Software Engineering Daily podcast episode with Lex Fridman about deep learning for self-driving cars.
Very interesting discussion on the ethics of self-driving cars. What he was saying is that we need to accept the fact that people are going to die in incidents involving autonomous vehicles. In order for these systems to learn how to drive, people will have to die. It's more of a societal change that is needed: roughly 30,000 people die on US roads every year, and to decrease that number we need self-driving cars, even at a price that society, as of now, can't accept.
It's very possible to get self-driving systems to learn to drive without killing people.
Waymo and Cruise have done a good job of that so far, by taking the conservative route of testing their systems in controlled environments before unleashing them on an involuntary public.