
I don't know why this mistake gets repeated, but NNs are universal approximators of continuous functions (on compact domains). They can't approximate non-continuous functions in general, and there are plenty of non-continuous computable functions.
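
A minimal numeric sketch of the sup-norm obstruction (Python/NumPy; the setup and names are my own, purely illustrative): a single sigmoid neuron can be made arbitrarily steep, yet its worst-case error against the Heaviside step never drops below about 0.5 near the jump.

  import numpy as np

  def step(x):
      # Heaviside step: the discontinuous target
      return (x >= 0).astype(float)

  def steep_sigmoid(x, k):
      # a one-neuron "network": sigmoid(k*x); clip the argument to avoid overflow
      return 1.0 / (1.0 + np.exp(-np.clip(k * x, -500, 500)))

  x = np.linspace(-1, 1, 200001)   # grid that includes the jump at 0
  for k in (10, 100, 1000, 10000):
      print(k, np.max(np.abs(step(x) - steep_sigmoid(x, k))))
  # worst-case error stays ~0.5 for every k, however steep the sigmoid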

Also, this theorem is almost useless in practice: it only tells you that, for any continuous function and any desired error bound, there exists some NN that approximates the function within that bound. It says nothing about whether any known training procedure could actually find the weights of that NN, even if we magically knew its architecture (which we don't). And since we don't know whether such a training algorithm even exists, we have even less idea of how long training would take, or how large the training set would have to be.
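
To see what the "exists" part buys you, here's a sketch (Python/NumPy; the construction and names are mine, purely illustrative) that writes down the weights of a one-hidden-layer ReLU network by hand, with no training at all, so that it interpolates sin(x) at a few hundred knots. Guarantees of this kind say the weights exist; they say nothing about whether gradient descent from a random initialization would ever find them.

  import numpy as np

  def build_relu_net(f, a, b, n):
      # Hand-pick weights so that
      #   net(x) = f(a) + sum_i c_i * relu(x - knot_i)
      # equals the piecewise-linear interpolant of f at n+1 equally spaced knots.
      xs = np.linspace(a, b, n + 1)
      ys = f(xs)
      slopes = np.diff(ys) / np.diff(xs)     # slope on each segment
      c = np.diff(slopes, prepend=0.0)       # slope change introduced at each knot
      return xs[:-1], c, ys[0]

  def net(x, knots, c, bias):
      # one hidden ReLU layer, one linear output neuron
      return bias + np.maximum(0.0, x[:, None] - knots[None, :]) @ c

  knots, c, bias = build_relu_net(np.sin, 0.0, 2 * np.pi, 200)
  x = np.linspace(0.0, 2 * np.pi, 10001)
  print(np.max(np.abs(net(x, knots, c, bias) - np.sin(x))))   # ~1e-4, no training involved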

So this theorem doesn't really help answer whether our current AI techniques (of which training is a fundamental component) could be used to approximate human reason.




> They can't approximate non-continuous functions in general, and there are plenty of non-continuous computable functions.

For a given error bound, I guess. So it's a discussion about how well we can approximate such a function.


If we don't care about any bound on the error, any function can be said to approximate any other function.
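
One way to make that concrete (a Python/NumPy sketch continuing the step-function example above, purely illustrative): against a step function, the worst-case error of a steepening sigmoid is stuck near 0.5, while its average error does shrink as the sigmoid gets steeper, so "how well" depends entirely on which error you agree to bound.

  import numpy as np

  x = np.linspace(-1, 1, 200001)
  target = (x >= 0).astype(float)                  # discontinuous target
  for k in (10, 100, 1000):
      approx = 1.0 / (1.0 + np.exp(-np.clip(k * x, -500, 500)))
      err = np.abs(target - approx)
      print(k, err.max(), err.mean())              # max stuck ~0.5, mean shrinks with k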





