
I was listening to a Software Engineering Daily podcast with Lex Fridman about self-driving deep learning. Very interesting discussion on the ethics of self-driving cars. What he was saying is that we need to accept the fact that people are going to die in incidents involving autonomous vehicles. In order for these systems to learn how to drive, people will have to die. It's more of a societal change that is needed. 30,000 people die on the roads in the US every year; to decrease that number we need self-driving cars, even at a price that society, as of now, can't accept.



> we need to accept the fact that people are going to die following incidents with autonomous vehicles involved.

Car accidents will kill people at least as long as there are any human-piloted vehicles on the road and probably long after (pedestrians, etc.)

What doesn't need to happen is companies in the self-driving space recklessly exaggerating their cars' ability to self-drive.


It's very possible to get self-driving systems to learn to drive without killing people.

Waymo and Cruise have done a good job at that so far, by taking the conservative route of testing their systems in controlled environments before unleashing them on an unwitting public.



