Hacker News

I guess one can't assign values to a moral dilemma, namely zero (0) and one (1), where zero stands for death and one for life.

This kind of moral 'spectrum' is highly biased and forces you to assign a 'bigger' value to people you really know nothing about.

In this case, an athletic male would be the preferred one to save over an elderly woman. But that elderly woman could have been Marie Curie, and when the vehicle is driving at 60 km/h towards a random, unknown crowd, neither a person nor a machine can know such details.

This is the same as the question 'Would you kill a Jew to preserve the great German Reich?' It's biased and can't be resolved by defining a moral compass, or by using a statistical sample as a training set.
