
> when did that happen?

This happened a long time ago, at least 9 years ago.

Source: The advisors of my math diploma thesis were Prof. Andreas Griewank and Prof. Andrea Walther, who happen to be the authors of the de facto standard book on AD: "Evaluating Derivatives" http://epubs.siam.org/doi/book/10.1137/1.9780898717761

The second edition of that book is from 2008 and uses (and prefers) the term "algorithmic differentiation".

> algorithmic, to me, conjures up ideas more related to numerical differentiation. there are algorithms in the implementations but the concept really is automatic.

I beg to differ.

I like the term "algorithmic differentiation" because it describes what is differentiated (algorithms rather than plain formulas). This is better than saying how it is differentiated, as symbolic differentiation is also "automatic" in the sense that any computer algebra system performs that task automatically.
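To make "differentiating an algorithm" concrete, here is a minimal forward-mode sketch using dual numbers (my own illustration, not code from the book). Each value carries its derivative along, and the chain rule is applied operation by operation, so loops and other control flow are differentiated as-is, without ever producing a closed-form formula:

```python
# Forward-mode AD via dual numbers: (val, dot) pairs where dot is the derivative.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule applied per operation.
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

# An *algorithm* (a loop), not a plain formula:
def f(x):
    result = x
    for _ in range(3):
        result = result * result + 1
    return result

x = Dual(2.0, 1.0)      # seed dx/dx = 1
y = f(x)
print(y.val, y.dot)     # f(2) and f'(2), computed alongside the loop
```

A computer algebra system would first have to unroll the loop into one symbolic expression; forward-mode AD just executes the original code on a richer number type.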

The novelty of AD is not that it is automatic, but that it operates on code rather than formulas. The techniques of AD remain valuable even if you transform your code by hand (purely mechanically, using the rules of AD). This is especially true for the reverse mode, which is a great recipe for writing your gradient-calculation code efficiently. That is easier and less error-prone than, say, differentiating naively and then optimizing the resulting code by hand.
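Here is what that hand-applied reverse mode looks like for a toy function (my own illustrative example): a forward sweep that stores intermediates, followed by a backward sweep that propagates adjoints, one statement per elementary operation.

```python
import math

# Reverse mode applied by hand to f(x1, x2) = x1*x2 + sin(x1).
def f_and_grad(x1, x2):
    # Forward sweep: evaluate and record intermediates.
    v1 = x1 * x2
    v2 = math.sin(x1)
    y = v1 + v2
    # Reverse sweep: propagate adjoints ("bar" values) back to the inputs.
    y_bar = 1.0
    v1_bar = y_bar                                 # dy/dv1 = 1
    v2_bar = y_bar                                 # dy/dv2 = 1
    x1_bar = v1_bar * x2 + v2_bar * math.cos(x1)   # x1 feeds both v1 and v2
    x2_bar = v1_bar * x1
    return y, (x1_bar, x2_bar)
```

One forward pass plus one reverse pass yields the whole gradient, at a cost proportional to evaluating f once, regardless of the number of inputs.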



