
> > The machine doesn’t “really” understand, it’s just “simulating” it understands.

> You are actually displaying a subtle form of anthropomorphism with this statement. You're measuring the AI against a human-like quality (“understands”).

This doesn't make sense. You're saying that stating a machine DOES NOT have a human-like quality is "subtly" anthropomorphizing the machine?




I mean I think I kinda get it.

Understanding for a machine will never be the same as understanding for a human. Well, maybe in a few decades the tech will really be there, and it will turn out we were all in one of many Laplace-style deterministic simulated worlds and are just LLMs generating next tokens probabilistically too.
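
For what it's worth, "generating next tokens probabilistically" means something pretty mundane mechanically. Here's a minimal Python sketch; the five-word vocabulary and the scores are made up purely for illustration, a real LLM just does this over tens of thousands of tokens:

    import numpy as np

    # Toy stand-in for a language model: made-up scores (logits)
    # for each candidate next token.
    vocab = ["the", "cat", "sat", "on", "mat"]
    logits = np.array([2.0, 0.5, 1.0, 0.2, -1.0])

    # Softmax turns the scores into a probability distribution...
    probs = np.exp(logits) / np.exp(logits).sum()

    # ...and "generating the next token probabilistically" just
    # means sampling from that distribution.
    print(np.random.choice(vocab, p=probs))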


I mean the word “understand” is problematic. Machine and human understanding may be different, but the word is applied to both. Does an XOR circuit “understand” what it is doing? I venture the word is inappropriate when applied to non-humans.
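
To make the XOR example concrete, the circuit's entire behavior fits in a couple of lines (a toy Python sketch, not any particular piece of hardware): it maps inputs to outputs by a fixed rule, with nothing left over that we'd be tempted to call understanding.

    def xor(a: bool, b: bool) -> bool:
        # The circuit's entire "behavior": output is true
        # iff the inputs differ.
        return a != b

    # Exhaustive truth table; this is all there is to it.
    for a in (False, True):
        for b in (False, True):
            print(int(a), int(b), "->", int(xor(a, b)))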


I think it makes sense: the framing is an inherently human one, even if in negation. In contrast, we'd probably never feel the need to clarify that a speaker isn't really singing.



