Hacker News new | past | comments | ask | show | jobs | submit login

Life is optimized for _SURVIVAL_, which means being able to navigate the environment, find and utilize resources, and ensure that they continue to exist. Reproduction is just a strategy for that.

LLMs are human thinking emulators. They're absolutely garbage compared to "system 1" thinking in humans, which is massively more efficient. They're more comparable to "system 2" human thought, but even there I doubt they're close to humans except for cases where the task involves a lot of mundane, repetitive work - even for complex logic and problem solving tasks I'd be willing to bet that the average competitive mathematician is still an order of magnitude more efficient than a SoTA LLM at problems they could both solve.




> LLMs are human thinking emulators.

They aren't. They are text predictors. Some people think verbally, and you could plausibly make your statement about them. But for the people who eg think in terms of pictures (or touch or music or something abstract), that's different.

> They're absolutely garbage compared to "system 1" thinking in humans, which is massively more efficient. They're more comparable to "system 2" human thought, but even there I doubt they're close to humans except for cases where the task involves a lot of mundane, repetitive work - even for complex logic and problem solving tasks I'd be willing to bet that the average competitive mathematician is still an order of magnitude more efficient than a SoTA LLM at problems they could both solve.

LLMs are still in their infancy compared to where we will be soon. However, for me the amazing thing isn't that they can do a bit of mathematical reasoning (badly), but that they can do almost anything (badly). Including reformulating your mathematical proof in the style of Chaucer, or in Spanish, etc.

As for solving math problems: LLMs have read approximately every paper ever published, but are not very bright. They are like a very well read intern. If anyone has ever solved something like your problem before (and many problems have been), you have an ok chance that the LLM will be able to help you.

If your problem is new, or you are just getting unlucky, current LLMs are unlikely to help you.

But if you are in the former case, the LLM is most likely going to be more efficient than the mathematician, especially if you compare costs: companies can charge very little for each inference, and still cover the cost of electricity and amortise training expenses.

A month of OpenAI paid access costs you about 20 dollars or so? You'd have to be a pretty clueless mathematician if 20 dollars an hour were your best money-making opportunity. Rates of 100+ dollars an hour are more common for mathematicians working as, eg, actuaries or software engineers or quants. (Of course, mathematicians might not optimise for money, and might voluntarily go into low-paying jobs like teaching, or just laze about. But that's irrelevant for the comparison of opportunity costs.)
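The comparison above is back-of-envelope arithmetic, so it can be sketched directly. All the figures below (the 20-dollar subscription, the 100-dollar hourly rate, the hours per problem, the problems per month) are assumptions taken from or extrapolating the comment, not measured data:

```python
# Back-of-envelope opportunity-cost comparison: LLM subscription vs
# a mathematician's hourly rate. All constants are assumptions.

LLM_SUBSCRIPTION_PER_MONTH = 20.0    # dollars, e.g. a paid OpenAI plan
MATHEMATICIAN_RATE_PER_HOUR = 100.0  # dollars, assumed opportunity cost
HOURS_PER_PROBLEM = 2.0              # assumed human time per problem

# What one problem costs if a human solves it:
human_cost_per_problem = MATHEMATICIAN_RATE_PER_HOUR * HOURS_PER_PROBLEM

def llm_cost_per_problem(problems_per_month: int) -> float:
    """Flat subscription amortised over the problems it helps with."""
    return LLM_SUBSCRIPTION_PER_MONTH / problems_per_month

for n in (1, 10, 100):
    print(f"{n:>3} problems/month -> ${llm_cost_per_problem(n):.2f} per problem "
          f"(human: ${human_cost_per_problem:.2f})")
```

The point the arithmetic makes: even if the LLM only helps with one problem a month, the subscription is an order of magnitude cheaper than two hours of a mathematician's time at the assumed rate, and the gap widens with every additional problem.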



