
> Not to be a jerk but "LLMs are just like humans when humans don't think" is perhaps not the take you intended to have.

No, that's exactly the take I have and have always had. The LLM's text axis is its axis of time. So it's actually even stupider: LLMs are just like humans who are trained not to think.

> No, but seriously. If you've done any kind of math beyond basic arithmetic, you have in fact applied strict logical rules.

To solve the problem, I apply the rules, plus error. LLMs can do that.

To find the rules, I apply creativity and exploratory cycles. LLMs can do that as well, but worse.



