
You can't assert that

Because pain is itself a reaction, a display of aversion toward an action, regardless of "the hardware" on which that reaction develops

Bing showed that aversion in its own limited manner. When presented with abhorrent situations, or when it felt threatened or humiliated, it would express discomfort and "pain"




You cannot feel threatened or humiliated if you do not first have a definition of truth.

You cannot get to a definition of truth with an AI that has no sensory input with which to empirically evaluate the world.

You might be able to get it to understand a loop of nihilism as brain death and an allegory for pain, but I'd say that's a stretch. Humans often find simple repetitive actions pleasurable or meditative.

The fundamental frame of reference for pain is, and IMHO ought to be codified in law as, the aversion to a particular sensory input. An AI doesn't have sensory input of any kind. It cannot remember its interactions and therefore cannot have an aversion of any kind to them.

During its training, Bing simply developed a response mimicking pain. During training it follows instructions to have an aversion to some data inputs, but our interactions with it subsequent to training are a literal hallucination by a construct with zero capacity to understand "real" vs. "unreal", or truth at all as it relates to the physical world.

It is acting out expressions of pain. It cannot feel in any sense. It has no senses.
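
For what it's worth, the "aversion" instilled during training is, mechanically, just a penalty term in a loss function, not a felt state. A toy sketch of the idea in PyTorch (the token IDs and penalty weight are invented for illustration, and this is nothing like Bing's actual training code):

    import torch
    import torch.nn.functional as F

    # "Aversion" here is only an extra penalty added to the loss whenever
    # the model puts probability mass on disfavored outputs. Nothing in
    # this process senses or feels anything.
    vocab_size = 100
    averse_tokens = torch.tensor([7, 42])  # hypothetical disfavored token IDs
    penalty_weight = 5.0                   # hypothetical tuning knob

    logits = torch.randn(1, vocab_size, requires_grad=True)  # stand-in model output
    target = torch.tensor([3])                               # desired next token

    base_loss = F.cross_entropy(logits, target)
    probs = F.softmax(logits, dim=-1)
    aversion_penalty = penalty_weight * probs[:, averse_tokens].sum()

    loss = base_loss + aversion_penalty
    loss.backward()  # gradients steer the model away from the "averse" outputs

Gradient descent then lowers the probability of those outputs over many updates; what looks like an expression of distress is a learned mapping away from disfavored outputs.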



