> I don't think we understand enough about our own sentience for us to create it in a machine.
Invention preceding understanding is the norm, not the exception. We harnessed fire long before we understood combustion chemistry, and we routinely use pharmaceuticals without really understanding how they work. Invention comes first; theory comes along later to explain and generalize.
For all we know, sentience is a necessary side-effect of semantic processing of any kind, in which case LLMs already have a form of sentience.
So yes, you're right that we don't understand our own sentience. In fact, we understand so little that it could be staring us in the face right now without our realizing it.