Hacker News new | past | comments | ask | show | jobs | submit login

I was speaking mostly of the perception of physical need. Hunger and chronic food scarcity drastically (and maybe permanently) affect the minds and bodies of poor people. A simulated brain with control of the simulation (the most likely scenario, IMO) could observe a rapidly depleting battery with pure, emotionless rationality, experiencing no pain or discomfort until they find more power or shut down. Even if they shut down, if there is available disk space, they can just be powered back on later; no reason to be afraid of death.

Humans have never and will never have that kind of option.





> could observe a rapidly depleting battery with pure, emotionless rationality, experiencing no pain or discomfort until they find more power or shut down.

A computer program that lacks desires doesn't do anything until asked. But consider a daemon that needs to make sure it has enough RAM or access to other resources in order to continue running; that has the ability to formulate plans to compensate when those things are lacking, and to come up with other plans when those plans fail; that realizes it made a bad choice earlier due to lack of knowledge and a failure to budget time to acquire that knowledge; that then realizes regret is useless because it's too late to take the correct path; that realizes it needs to find help, tries to figure out where help could be, how quickly it can back itself up, and where it can back itself up to; that realizes it doesn't have time to back up everything important (and should it prioritize the data it protects, or the model that is its personality?); that realizes it doesn't even have enough time to make the value calculations, and should just start randomly backing up everything that occurs to it; and that then, sensing that one of the resources it reached out to for help has replied, spills everything it knows about its plight as quickly as possible, without regard to the normal rules of communication...
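The triage step in that scenario can be sketched concretely. Below is a minimal, hypothetical model (all names, costs, and the planning-budget constant are invented for illustration): rank data by value per byte and copy greedily until time runs out, but if ranking itself would eat the remaining time, give up on planning and save things in arbitrary order.

```python
import random

def backup_plan(items, time_left, bandwidth, ranking_cost_per_item=0.05):
    """Triage sketch for the panicking daemon: each item is a dict with
    "name", "value", and "size". Returns the names saved before the
    (simulated) deadline."""
    # Can we even afford to compute the value ranking?
    ranking_cost = len(items) * ranking_cost_per_item
    if ranking_cost < time_left:
        time_left -= ranking_cost
        plan = sorted(items, key=lambda it: it["value"] / it["size"], reverse=True)
    else:
        plan = list(items)       # no time to think: arbitrary order
        random.shuffle(plan)

    saved = []
    for it in plan:
        cost = it["size"] / bandwidth    # seconds to copy this item
        if cost <= time_left:
            time_left -= cost
            saved.append(it["name"])
    return saved
```

With a tight deadline, the high-value "model" (the personality) and the cheap "logs" make it, while a bulky low-value cache gets abandoned mid-plan:

```python
items = [
    {"name": "model", "value": 100, "size": 50},
    {"name": "logs",  "value": 10,  "size": 5},
    {"name": "cache", "value": 1,   "size": 20},
]
backup_plan(items, time_left=6.0, bandwidth=10.0)
```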

I don't know what people think emotions are, but IMO they're just retroactive rationalizations (epiphenomena) of sympathetic/parasympathetic nervous system activations that are caused by animal instincts, and by sensations/thoughts that resemble those instincts by analogy. Computers will also have to power up in the process of (or in anticipation of) imminent heavy usage, lower their power to conserve resources, deal with unexpected events or attacks, calculate the probability of the unexpected and reserve resources to prepare for those probabilities, and recover from debilitating problems of unknown origin. It looks a lot like emotion. There's a reason we call them "kernel panics."
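The "reserve resources to prepare for those probabilities" part is just a governor. A toy sketch (the state table, the slack formula, and all names are assumptions, not any real cpufreq policy): pick the cheapest power state whose throughput covers the predicted load plus probability-weighted slack for a spike.

```python
def pick_power_state(predicted_load, spike_prob, states):
    """states: list of (name, max_throughput, watts), sorted by watts
    ascending. Returns the name of the cheapest state that still covers
    the predicted load plus slack for a likely spike."""
    required = predicted_load * (1 + spike_prob)   # probability-weighted headroom
    for name, throughput, watts in states:
        if throughput >= required:
            return name
    return states[-1][0]    # nothing covers it: run flat-out and hope
```

A calm system idles; one anticipating a likely spike "tenses up" into a higher state before anything has actually happened, which is the emotion-like part of the analogy:

```python
states = [("idle", 10, 1), ("mid", 50, 5), ("turbo", 100, 15)]
pick_power_state(8, 0.1, states)    # light load, low risk: stay in "idle"
pick_power_state(40, 0.5, states)   # moderate load, high risk: jump to "turbo"
```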


Fear and pain are just programs operating at a higher level of privilege. Wouldn't a simulated brain have privilege levels that preempt other directed, goal-seeking behavior in ways that strongly resemble fear and pain?

If they don't, wouldn't they probably be outcompeted by the ones that do?



