> Neural networks are universal function approximators

What about discontinuous functions?




Every function can be made continuous if you add one dimension to its output for defined/not defined.


Continuous functions are dense in &lt;pick your favorite function space&gt;, so yes.
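A minimal numeric sketch of that density claim (plain Python; the function names and grid are my own choices): a single steep sigmoid approximates the discontinuous Heaviside step arbitrarily well in L^1, even though the sup-norm error never drops below ~0.5 near the jump.

```python
import math

def sigmoid(x):
    # numerically stable logistic function
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

def step(x):
    # Heaviside step: discontinuous at x = 0
    return 1.0 if x >= 0 else 0.0

def errors(k, n=100001):
    # Compare sigmoid(k*x) against step(x) on [-1, 1]:
    # Riemann-sum estimate of the L^1 error, plus the sup error on the grid.
    xs = [-1.0 + 2.0 * i / (n - 1) for i in range(n)]
    diffs = [abs(sigmoid(k * x) - step(x)) for x in xs]
    l1 = sum(diffs) * (2.0 / n)
    sup = max(diffs)
    return l1, sup

for k in (10, 100, 1000):
    l1, sup = errors(k)
    print(f"k={k:5d}  L1 error ~ {l1:.4f}  sup error ~ {sup:.4f}")
```

As k grows, the L^1 error shrinks toward 0 (approximation in that norm succeeds), while the sup error stays pinned near 0.5, which is why the universal approximation theorem for discontinuous targets is stated in L^p rather than uniform norm.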