Hacker News

Interestingly, since the L^1 discrepancy is convex, it's possible to analytically minimize it by using the subdifferential, and thereby prove that its minimum is attained at the median. The subdifferential is a set-valued generalisation of the derivative to all convex functions, including non-differentiable ones. See https://en.wikipedia.org/wiki/Subderivative and https://towardsdatascience.com/beyond-the-derivative-subderi...

Whenever the subdifferential of a convex function at a point includes 0, that point is a global minimum of the function.
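This is easy to check numerically: a brute-force scan of the L^1 discrepancy bottoms out at the median. A minimal sketch (the grid scan and sample data are just illustrative choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=101)  # odd count, so the median is unique

def l1_discrepancy(s, x):
    """Sum of absolute deviations of the data x from the point s."""
    return np.abs(x - s).sum()

# Scan candidate values of s; the minimizer should land at the median.
grid = np.linspace(x.min(), x.max(), 10001)
best = grid[np.argmin([l1_discrepancy(s, x) for s in grid])]
print(best, np.median(x))  # the two agree up to grid resolution
```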




L^1's derivative is a perfectly good function almost everywhere; it's just not defined (or continuous) where the argument of the absolute value hits 0. That's also why the median handles even-sized sets in a special way.

0 = d/ds \sum_i | x_i - s | = \sum_i sign(s - x_i)

which is true precisely when x_i <= s for half the elements and x_i >= s for the other half (taking sign(0) = 0, sign(+x) = 1, sign(-x) = -1).
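Concretely, at the median the positive and negative deviations balance, so the sum of signs vanishes. A tiny sketch with made-up data:

```python
import numpy as np

x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])  # odd-sized sample, median = 3
m = np.median(x)

# Positive and negative deviations balance at the median, so the sum of
# sign terms vanishes (numpy uses the convention sign(0) = 0).
print(np.sign(m - x).sum())  # -> 0.0
```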


A derivative can also be obtained for the abs function interpreted as a distribution (generalized function). However, I don't think that helps with this calculation, because of the Dirac measures that pop up.


"Whenever the subderivative of a convex function at a point includes 0, that point is a global minimum of the function"

* you mean a strictly convex function


In the non-strictly convex case, the minimum is still a global minimum; it's just not unique.
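For an even-sized set this is exactly what happens: every point between the two middle values attains the same minimal value, so the minimizer is an interval rather than a point. A small worked example (the data are arbitrary):

```python
import numpy as np

x = np.array([0.0, 1.0, 3.0, 4.0])  # even-sized sample

def l1(s):
    """Sum of absolute deviations from s."""
    return np.abs(x - s).sum()

# Every s in [1, 3] achieves the same minimum, so the global minimum
# value is unique even though the minimizing point is not.
print(l1(1.0), l1(2.0), l1(3.0))  # all equal: 6.0
```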


[edit] right, you're correct, sorry





