Hacker News

> Today's AI models are missing the ability to reason abstractly, including asking and answering questions of "Why?" and "How?"

The author lost me here. GPT-4 regularly asks me these questions, or something close to them; it depends on your prompting. GPT-4 can recognize when you are approaching a problem from the wrong side and suggest a different, more correct approach.

The problem is the default prompting. GPT-4 is biased by your prompt and tries to go along with it, even if it internally "knows" better or has learned biases against it.
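To illustrate what I mean (these example prompts are my own, not from the article), compare a leading prompt with one that explicitly invites pushback:

```text
Leading prompt:
  "Help me speed up this function by adding a cache."

Prompt that invites pushback:
  "I want to speed up this function. I'm considering adding a cache,
   but tell me if you think that's the wrong approach, and why."
```

In my experience the first framing gets you a cache whether or not it's a good idea, while the second is far more likely to get a "why are you doing it this way?" style answer.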



