
Neural nets can approximate any function.

A large enough LLM with memory is Turing complete.

So theoretically I don’t think there is anything they can never do.




> Neural nets can approximate any function.

Common misunderstanding of the universal approximation theorem.

Consider this: can an MLP approximate a sine wave?

> A large enough LLM with memory is Turing complete.

With (a lot of) chain of thought it could be.

Read the paper, and its references.


Sort of moot anyway. If-statements can approximate any function, and most programming languages are effectively Turing complete. What's important about specific architectures like transformers is that they allow for comparatively efficient determination of the set of weights that will approximate some narrower class of functions. It's finding the weights that's important, not the theoretical representational power.
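The if-statement point can be made concrete with a toy sketch (all names here are illustrative, not from the thread): a chain of ifs is just a lookup over intervals, and with enough pieces it approximates any reasonably smooth function on a bounded range.

```python
import math

def step_approx(f, lo, hi, n):
    """Approximate f on [lo, hi] by n constant pieces, selected with an if-chain."""
    width = (hi - lo) / n
    # One constant per piece, sampled at the interval midpoint.
    table = [f(lo + (i + 0.5) * width) for i in range(n)]
    def g(x):
        for i in range(n):
            if x < lo + (i + 1) * width:  # the "if statements"
                return table[i]
        return table[-1]
    return g

# 1000 pieces pin sin down to within roughly half a piece-width on [0, pi].
approx_sin = step_approx(math.sin, 0.0, math.pi, 1000)
```

The catch, as the comment says, is that nothing here tells you how to find such a table efficiently when you only know the target function through samples.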


"Consider this: can an mlp approximate a sine wave?"

Well, yes - we have neural speech and music synthesis and compression algorithms which do this exceedingly well...


I think the person you're replying to may have been referring to the problem of an MLP approximating a sine wave on out-of-distribution samples, i.e., over the entire real line.


There's all sorts of things a neural net isn't doing without a body. Giving birth or free soloing El Capitan come to mind. It could approximate the functions for both in token-land, but who cares?



