Hacker News

How do you know the inner workings of the mind don't operate in a similar manner? How many different solutions to the problem are constructed within your mind before the correct one 'just arrives'?



I suspect there is some similarity between language models and the structure of language in the mind, but there's a whole lot more going on behind the scenes in the brain than simple runtime statistical model output: intentionality, planning, narrativity, memory formation, object permanence... Language models are exciting and interesting because they can apparently perform abstract symbolic manipulation and produce coherent text, but I wouldn't call AGI solved quite yet.



