
In the case of AI, the more parameters, the better! In physics, it's the opposite.



One of the hardest parts of training models is avoiding overfitting, so "more parameters are better" should really be "more parameters are better, provided you're using them the right way," which can get hard and complicated.
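
To make that concrete, here's a minimal NumPy sketch (toy data, nothing from the thread): a degree-9 polynomial has enough parameters to fit 10 noisy points almost exactly, yet generalizes worse than a plain linear fit.

    import numpy as np

    rng = np.random.default_rng(0)
    x_train = np.linspace(0, 1, 10)
    y_train = 2 * x_train + rng.normal(0, 0.1, size=10)  # ground truth is linear
    x_test = np.linspace(0, 1, 100)
    y_test = 2 * x_test

    for degree in (1, 9):
        coeffs = np.polyfit(x_train, y_train, degree)  # least-squares polynomial fit
        train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")

The high-degree model wins on training error and loses on test error: the extra parameters memorize the noise instead of the trend.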

Also, LLMs do just straight up overfit, which makes them function as a database, albeit a really bad one. So while more parameters might be better in practice, that feels like a cop-out on the real problem. TBD what scaling issues we hit in the future.
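
A toy sketch of the "database, but a bad one" point (plain Python, purely illustrative; real LLM memorization is statistical and partial, not an exact lookup): a model that memorizes exact training continuations can replay what it saw verbatim, but returns nothing for a prompt that's even slightly off.

    from collections import defaultdict

    corpus = "the cat sat on the mat . the dog sat on the rug .".split()

    # Memorize every 3-word context and its exact next word.
    table = defaultdict(list)
    for i in range(len(corpus) - 3):
        table[tuple(corpus[i:i + 3])].append(corpus[i + 3])

    def complete(prompt, steps=5):
        words = prompt.split()
        for _ in range(steps):
            nxt = table.get(tuple(words[-3:]))
            if not nxt:
                return " ".join(words) + "  [no memorized continuation]"
            words.append(nxt[0])
        return " ".join(words)

    print(complete("the cat sat"))    # replays the training sentence verbatim
    print(complete("the cat slept"))  # unseen context, so the lookup fails

Perfect recall on the training set, zero generalization off it: a database with a terrible query interface.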


There's a dichotomy between these fields.



