Gradient descent also allows batching, which reduces memory requirements. Can the other methods support that?
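To be concrete about what I mean by batching, here's a rough sketch in NumPy (the data, loss, and hyperparameters are just illustrative): each update only ever touches one mini-batch, so you never need the full-dataset gradient in memory at once.

    import numpy as np

    # Toy example: linear regression trained with mini-batch gradient descent.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100_000, 50))      # pretend this is huge / memory-mapped
    y = X @ rng.normal(size=50) + 0.1 * rng.normal(size=100_000)

    w = np.zeros(50)
    lr, batch_size = 0.01, 256
    for step in range(1_000):
        idx = rng.integers(0, len(X), size=batch_size)  # sample a mini-batch
        Xb, yb = X[idx], y[idx]
        grad = 2 * Xb.T @ (Xb @ w - yb) / batch_size    # gradient on the batch only
        w -= lr * grad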



As far as I know, this is the #1 gradient descent superpower that makes it the preferred choice above all others. I don't think L-BFGS supports batching, for example; I've certainly never seen it.
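For contrast, a stock L-BFGS implementation (SciPy's, say, purely as an illustration) expects a deterministic objective and gradient evaluated over the full dataset at every iteration, since its curvature estimate is built from successive gradient differences, so you can't just swap in noisy mini-batch gradients:

    import numpy as np
    from scipy.optimize import minimize

    def full_batch_loss_and_grad(w, X, y):
        # L-BFGS assumes the same deterministic objective on every call,
        # so the whole dataset participates in each gradient evaluation.
        r = X @ w - y
        return (r @ r) / len(y), 2 * X.T @ r / len(y)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(10_000, 50))   # illustrative data
    y = X @ rng.normal(size=50)

    res = minimize(full_batch_loss_and_grad, np.zeros(50),
                   args=(X, y), jac=True, method='L-BFGS-B')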



