
This was my set of rules:

1) Save humans at the cost of animals

2) If both options involve killing humans, always prefer to do nothing (continue straight ahead). That way it's the failed brakes that kill them, rather than the car's "decision". I know that not making a decision is also making a decision, but a passive decision is not on the same level as an active one.

No distinction based on who the people are, whether they're passengers or not, or even how many there are.
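For concreteness, a toy Python sketch of those two rules (the Outcome type and its fields are my own invention, not anything from a real system):

    from dataclasses import dataclass

    @dataclass
    class Outcome:
        swerve: bool          # False = continue straight ahead (passive)
        humans_killed: int
        animals_killed: int

    def choose(outcomes: list[Outcome]) -> Outcome:
        # Rule 1: prefer any outcome that kills no humans,
        # whatever the cost in animals.
        candidates = [o for o in outcomes if o.humans_killed == 0] or outcomes
        # Rule 2: among what's left, do nothing (continue straight)
        # rather than make an active "decision". Identities, passenger
        # status, and head counts are deliberately ignored.
        passive = [o for o in candidates if not o.swerve]
        return (passive or candidates)[0]

    # Example: swerving kills one pedestrian, straight kills two passengers.
    options = [Outcome(swerve=True,  humans_killed=1, animals_killed=0),
               Outcome(swerve=False, humans_killed=2, animals_killed=0)]
    print(choose(options).swerve)  # False: stay straight, per rule 2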




Sounds pretty close to the heuristic I expect real-world autonomous machines to follow, which is "do the thing that is least likely to cause a lawsuit."

Doing nothing is generally seen as more innocent than doing something, so I'd expect most mobile robots to freeze when something goes wrong.
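Roughly the fail-stop pattern (a hypothetical skeleton; this robot interface is made up, not any real API):

    def control_loop(robot):
        while True:
            try:
                cmd = robot.plan_next_move()
                robot.execute(cmd)
            except Exception:
                # On any fault, command a full stop and hold it: the
                # "freeze" default, since doing nothing in an unknown
                # state looks most defensible afterwards.
                robot.emergency_stop()
                raise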



