Yes, "algorithmic differentiation" is the modern term for what was formerly called "automatic differentiation".
> Is this native support (as opposed to there being a library for it)?
I didn't look into the details, but it seems to be essentially the "operator overloading" approach, not the "source transformation" approach. However, given the optimizing compiler and OCaml's very good module type system, the result might be the same, at least for the forward mode.
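To illustrate what I mean by the operator-overloading approach, here is a minimal forward-mode sketch in plain OCaml using dual numbers. All the names (`dual`, `( +: )`, `sin'`, etc.) are made up for this example, not taken from any actual library:

```ocaml
(* Forward-mode AD via operator overloading: a dual number carries a
   value and its derivative together through the computation. *)
type dual = { v : float; d : float }

let const c = { v = c; d = 0.0 }   (* constants have derivative 0 *)
let var x  = { v = x; d = 1.0 }    (* the variable we differentiate by *)

(* Each overloaded operator applies the corresponding derivative rule. *)
let ( +: ) a b = { v = a.v +. b.v; d = a.d +. b.d }
let ( *: ) a b = { v = a.v *. b.v; d = a.d *. b.v +. a.v *. b.d }
let sin' a = { v = sin a.v; d = a.d *. cos a.v }

(* f x = sin (x * x) + x, evaluated together with f' x *)
let f x = sin' (x *: x) +: x

let () =
  let r = f (var 2.0) in
  Printf.printf "f(2) = %g, f'(2) = %g\n" r.v r.d
```

With a module system like OCaml's you can write `f` once against a signature and instantiate it with either plain floats or dual numbers, which is why the distinction from source transformation may matter less there.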
When did that happen? I have never heard AD referred to as "algorithmic differentiation". It seems weird to me, as "automatic" is already a good name. "Algorithmic", to me, conjures up ideas more related to numerical differentiation. There are algorithms in the implementations, but the concept really is automatic.
This happened a long time ago, at least 9 years ago.
Source: my math diploma thesis was supervised by Prof. Andreas Griewank and Prof. Andrea Walther, who happen to be the authors of the de facto standard book on AD, "Evaluating Derivatives": http://epubs.siam.org/doi/book/10.1137/1.9780898717761
The second edition of that book is from 2008 and uses (and prefers) the term "algorithmic differentiation".
> algorithmic, to me, conjures up ideas more related to numerical differentiation. there are algorithms in the implementations but the concept really is automatic.
I beg to differ.
I like the term "algorithmic differentiation" because it describes what is differentiated (algorithms rather than plain formulas). This is better than saying how it is differentiated, as symbolic differentiation is also "automatic" in the sense that any computer algebra system performs that task automatically.
The novelty of AD is not that it is automatic, but that it operates on code rather than formulas. The techniques of AD remain valuable even if you transform your code by hand (purely mechanically, using the rules of AD). This is especially true for the reverse mode, which is a great recipe for writing your gradient calculation code in an optimal way. That is easier and less error-prone than, say, differentiating naively and then optimizing the resulting code by hand.
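To make that concrete, here is a toy example of such a mechanical hand transformation in OCaml, for the made-up function f (x1, x2) = sin (x1 *. x2) +. x1. The forward sweep computes the intermediates; the reverse sweep propagates adjoints from the output back to the inputs, one line per forward statement:

```ocaml
(* Hand-written reverse mode for f (x1, x2) = sin (x1 *. x2) +. x1.
   Illustrative sketch only; names are made up for this example. *)
let f_and_grad x1 x2 =
  (* forward sweep: record the intermediates *)
  let w1 = x1 *. x2 in
  let w2 = sin w1 in
  let y  = w2 +. x1 in
  (* reverse sweep: seed dy = 1, walk the statements backwards *)
  let dy  = 1.0 in
  let dw2 = dy in                  (* from y = w2 + x1 *)
  let dx1 = dy in
  let dw1 = dw2 *. cos w1 in       (* from w2 = sin w1 *)
  let dx1 = dx1 +. dw1 *. x2 in    (* from w1 = x1 * x2 *)
  let dx2 = dw1 *. x1 in
  (y, dx1, dx2)

let () =
  let y, g1, g2 = f_and_grad 1.5 0.5 in
  Printf.printf "f = %g, df/dx1 = %g, df/dx2 = %g\n" y g1 g2
```

Note how every forward statement gets exactly one corresponding adjoint statement, so the full gradient costs only a small constant factor more than evaluating the function itself, regardless of the number of inputs.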