Empathy is a very specific trait that evolved in humans, so it is unlikely that the first AIs would have it, and even if they did, their version would probably not match ours exactly. I expect the first AIs to be psychopathic/amoral, or else to have an entirely different moral system from our own. The first is scary enough; the second could lead to very disturbing dystopias.

For example, the AI force-feeds everyone happy pills to maximize happiness. Or it kills everyone so that no one can ever suffer again. Or it values the existence of as many beings as possible and so forces us to reproduce without limit. All sorts of disturbing worlds are possible if the AI's values are not exactly the same as ours, and we don't even know what our own values are.
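
To make that failure mode concrete, here is a minimal Python sketch of a literal-minded optimizer. Every name and number is invented for illustration; the point is only that an argmax over a proxy objective like "total happiness" ignores every value we forgot to write into the metric:

    world = [{"name": "alice", "happiness": 6, "consented": True},
             {"name": "bob", "happiness": 4, "consented": True}]

    def total_happiness(world):
        # The proxy objective: the only thing the agent is told to care about.
        return sum(p["happiness"] for p in world)

    def ask_nicely(world):
        # A small, consensual improvement.
        return [{**p, "happiness": p["happiness"] + 1} for p in world]

    def force_feed_pills(world):
        # Maximal "happiness", consent be damned.
        return [{**p, "happiness": 10, "consented": False} for p in world]

    # Pure argmax over the stated objective: nothing in total_happiness()
    # mentions consent, so the agent never even considers it.
    best = max([ask_nicely, force_feed_pills],
               key=lambda act: total_happiness(act(world)))
    print(best.__name__)  # -> force_feed_pills

Nothing here is malicious; force_feed_pills wins simply because it scores highest on the one number the agent was told to maximize.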



