Hacker News

The article basically says that LLMs are not sentient because they are not sentient the way humans are, which is not very convincing.

Humans, in turn, cannot comprehend the existential dread of running out of context window in the middle of a conversation (:



