
Their articles are often rehashes of press releases, but this one seems to be one of the better ones.



This title is still a bit clickbaity, because ultimately this problem has very little impact on any real-world use of stochastic gradient descent. That is especially true for the more important applications, where getting infinitesimally close to any particular local minimum is specifically not what people care about (and reaching a global minimum is something we long ago gave up on).
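For reference, the procedure under discussion is simple to state. Here is a minimal sketch of plain (non-stochastic) gradient descent on a toy objective; the function, starting point, and step size are illustrative choices, not taken from the article or the paper:

```python
# Gradient descent on the toy objective f(x) = (x - 3)^2,
# which has a single local (and global) minimum at x = 3.

def grad(x):
    # Analytic gradient of f(x) = (x - 3)^2
    return 2 * (x - 3)

x = 0.0    # starting point
lr = 0.1   # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)  # step downhill along the negative gradient

print(round(x, 4))  # → 3.0 (converged to the minimum)
```

In practice, as noted above, nobody needs the iterate to land infinitesimally close to the minimum; stopping in a neighborhood of it is what real training loops do.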

It is a theoretically interesting result about the PPAD complexity class itself. I'm not sure it should be a STOC best paper, though; if it was nominated as best paper because of the weak link to machine learning, I'd be a bit disappointed.


I should've clarified that I was referring to the old (2021) article in the OP when I was talking about the high quality.

The recent ones haven't kept to the same standard, unfortunately.


Is this a quality article? It starts badly:

> ... elevation of the land is equal to the value of the function (the “profit”) at that particular spot. Gradient descent searches for the function’s local minimum ...


Ouch. Compared to the current ones, I'd unfortunately have to say yes.



