The True Materialist will tell you that it's biological cells and DNA that give rise to sentience.

AIs made from artificial neural networks are actually not quite the materialist thing. They're some spooky emergent property that doesn't depend on any particular material substrate: you can run an AI on an Nvidia card, sure, but you could also run it in your head (theoretically, if you lived forever), or at least with the assistance of (a lot of) pen and paper. Where does the sentience live then? In your mind? That doesn't sound very materialistic to me :D

(Note: I think the GP is full of it, but I just thought I'd set the record straight about materialism here :P )




FWIG, one materialist view of sentience is that it's an emergent property: a way of describing a state of a complex system.

The question, then, is: what are the properties of the system that the term describes?

In this case, the properties look kinda like: memory, self-awareness, and understanding. Memory isn't in question: these things can clearly remember (even if sometimes that memory takes the form of access to data on the internet). Self-awareness is trickier, but by most measurable schemata the machines seem to be doing about as well as humans do in this regard (they can describe themselves and talk about their place in the world in context). Understanding is a topic of hot debate, but I think it falls prey to the same problems as sentience generally: we have this concept of what understanding is, and we know enough about how LLMs work to say that they're not the same; yet if you start poking at how to actually measure it, we're back to square one, because what's apparent might not be what is.



