
For your interest, Taylor series are not universal function approximators. Consider the Taylor series around 0 for

f(x) = e^(-1/x^2) if x != 0 else 0

It is identically zero (every derivative of f at 0 is 0), but the function itself is clearly not identically zero. So the radius of convergence of this Taylor series is infinite, yet its sum agrees with the function at only one point.
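
Here's a quick numerical illustration (plain Python, standard library only) - the function is visibly nonzero away from 0, while every Taylor polynomial at 0 is identically zero:

    import math

    def f(x):
        # The classic smooth-but-not-analytic function:
        # f(x) = e^(-1/x^2) for x != 0, and f(0) = 0.
        return math.exp(-1.0 / x**2) if x != 0 else 0.0

    # Every derivative of f at 0 is 0, so the Taylor series of f around 0
    # is the zero function; any Taylor polynomial predicts 0 everywhere.
    for x in [0.5, 1.0, 2.0]:
        print(f"x={x}  f(x)={f(x):.6f}  Taylor series value=0.0")

    # Finite differences hint at why: f vanishes faster than any power
    # of x near 0, e.g. (f(h) - f(0)) / h -> 0 as h -> 0.
    for h in [0.5, 0.1, 0.05]:
        print(f"h={h}  f(h)/h = {f(h) / h:.3e}")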

I'm sure there are conditions you can put on f to make its Taylor series converge to it (analyticity is the relevant property, if I remember my real analysis right), but it's been quite a while!

That doesn't detract from the overall point, though, that there are UFAs which are not neural nets. I should say I don't know the precise definition of a UFA, but I assume it has to require more than agreement at a single point.




Taylor series recover a function only on intervals where it is analytic, not merely differentiable. You specifically chose a function that is smooth everywhere but not analytic at 0, so of course its series there is not a good approximation.
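
To make the contrast concrete, here is a minimal sketch (again plain standard-library Python): for an analytic function like e^x the Taylor partial sums converge to the function everywhere, while for the function above they are stuck at zero.

    import math

    def taylor_exp(x, n_terms=20):
        # Partial sum of the Taylor series of e^x around 0: sum of x^k / k!
        return sum(x**k / math.factorial(k) for k in range(n_terms))

    # e^x is analytic, so its Taylor series converges to it everywhere:
    for x in [0.5, 1.0, 2.0]:
        print(f"x={x}  e^x={math.exp(x):.6f}  20-term Taylor={taylor_exp(x):.6f}")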


Note that the UFA theorems for neural networks do apply to that function: https://en.wikipedia.org/wiki/Universal_approximation_theorem

Generally, they only assume the function to be approximated is continuous on a compact set, and e^(-1/x^2) (extended by 0 at the origin) is continuous - in fact smooth - everywhere. Its pathology is a failure of analyticity, not of continuity.
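
As a quick sanity check, here is a minimal sketch (assuming numpy and scikit-learn are available; the architecture and hyperparameters are illustrative, not tuned) of a small feed-forward net fitting that function on a compact interval:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def f(x):
        # Smooth-but-not-analytic target, vectorized; the inner where
        # avoids a division-by-zero warning at x = 0.
        safe = np.where(x != 0, x, 1.0)
        return np.where(x != 0, np.exp(-1.0 / safe**2), 0.0)

    X = np.linspace(-2.0, 2.0, 2000).reshape(-1, 1)
    y = f(X).ravel()

    # A small MLP; the universal approximation theorem says some network
    # of this general shape gets uniformly close on [-2, 2].
    net = MLPRegressor(hidden_layer_sizes=(64, 64), activation="tanh",
                       max_iter=5000, random_state=0)
    net.fit(X, y)
    print("max abs error on the grid:", np.abs(net.predict(X) - y).max())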



