
I wouldn't even give ChatGPT as much credit as you do. But I don't see it doing anything different from what human brains do; it's just still not very good at it.

If it were up to me I'd try to give it a representation other than just words. I think those models should be trained to represent text as relationship graphs of objects. There's not much natural data like that, but it should be fairly easy to create vast amounts of synthetic data: text generated from relationship graphs. The model should be able to make the connection to natural language.
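To make the synthetic-data idea concrete, here's a toy sketch of what I mean (the relation names and templates are just examples I made up): represent a graph as (subject, relation, object) triples, then render it to text with per-relation templates, giving you aligned (graph, text) pairs for training.

```python
# Toy synthetic-data generator: relationship graph -> natural-language text.
# A graph is a list of (subject, relation, object) triples; each relation
# has a sentence template. The (graph, text) pair is one training example.

TEMPLATES = {
    "owns": "{s} owns {o}.",
    "located_in": "{s} is located in {o}.",
    "works_for": "{s} works for {o}.",
}

def graph_to_text(triples):
    """Render a relationship graph as text, one sentence per triple."""
    return " ".join(TEMPLATES[r].format(s=s, o=o) for s, r, o in triples)

graph = [
    ("Alice", "owns", "a bakery"),
    ("The bakery", "located_in", "Lyon"),
    ("Bob", "works_for", "Alice"),
]

print(graph_to_text(graph))
# -> Alice owns a bakery. The bakery is located in Lyon. Bob works for Alice.
```

Vary the entities, relations, and templates randomly and you can generate as many aligned pairs as you like; richer paraphrase templates would keep the model from overfitting to one surface form.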

Once models are taught this representation, they might learn how the graphs transform during reasoning just by training on natural-language reasoning.




You might find Drexler's "Quasilinguistic Neural Representations" stimulating.



