
You need it for L1 and L2 regularization, which are popular these days, don't you?



No. L2 regularization (ridge regression) also has an analytical solution, and L1 regularization (lasso) can be solved more efficiently with the LARS (least-angle regression) algorithm.
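The analytical ridge solution is just the normal equations with the penalty folded in: w = (XᵀX + λI)⁻¹ Xᵀy. A minimal sketch with NumPy, using made-up data and an assumed penalty strength `lam`:

```python
import numpy as np

# Synthetic data (illustrative values, not from the discussion)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

lam = 0.5  # regularization strength (assumed)

# Closed-form ridge solution: solve (X'X + lam*I) w = X'y
# No iterative optimization needed.
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```

One linear solve recovers the coefficients directly, which is why gradient descent is unnecessary here.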

Gradient descent would work for ridge regression (it would just be slow). Plain gradient descent wouldn't even work for lasso, because the L1 penalty isn't differentiable at the origin.



