
Isn't that just scale? Even small LLMs have more parts than any car.

LLMs are more analogous to economics, psychology, politics -- it is possible there's a core science with explicability, but the systems are so complex that even defining the question is hard.

You can make a bigger ICE engine (like a container ship engine) and still understand how the whole thing works. There may be more moving parts, but it still has the structure of an ICE engine.

With neural networks, big or small, we have no clue what's going on. You can observe the whole system, from the weights and biases to the activations, gradients, etc., and still learn nothing.
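To make that concrete, here is a minimal sketch (assuming PyTorch and a toy two-layer net, both my choices for illustration): you can dump every parameter, hook every activation, and read every gradient, and still have nothing that explains what the model is doing.

    # Minimal sketch (assumes PyTorch; toy 2-layer net chosen for illustration):
    # we can dump every weight, bias, activation, and gradient, yet the raw
    # numbers alone say nothing about what the network has learned.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
    x = torch.randn(4, 8)

    # Full visibility into the parameters...
    for name, p in model.named_parameters():
        print(name, tuple(p.shape))   # e.g. '0.weight (16, 8)'

    # ...into every intermediate activation via forward hooks...
    acts = {}
    for name, module in model.named_modules():
        if name:
            module.register_forward_hook(
                lambda m, inp, out, name=name: acts.__setitem__(name, out.detach()))

    # ...and into every gradient after a backward pass.
    loss = model(x).sum()
    loss.backward()
    grads = {n: p.grad for n, p in model.named_parameters()}

    # We now "see" the whole system -- weights, activations, gradients --
    # but none of these tensors, on its own, explains the model's behaviour.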

On the other hand, one of the reasons economics, psychology, and politics are hard is that we can't open up people's heads to define and measure what they're thinking.


One way I've heard it summarized: Computer Science as a field is used to things being like physics or chemistry, but we've suddenly encountered something that behaves more like biology.



"God" as a concept in unproven to exist, it is also impossible to prove, so for all intents and porpouses it doesn't exist.

Could be, but it does not change the fact that we do not understand them as of now.


