From what I gather, much of AI boils down to gradient descent, which appears to involve matrices of partial derivatives.
I took calculus/linear algebra/PDEs decades ago, but remember little of them. Are there any resources for efficiently relearning the parts that are useful for AI?
https://arxiv.org/abs/1802.01528
From the intro:
>This article walks through the derivation of some important rules for computing partial derivatives with respect to vectors, particularly those useful for training neural networks. This field is known as matrix calculus, and the good news is, we only need a small subset of that field, which we introduce here. While there is a lot of online material on multivariate calculus and linear algebra, they are typically taught as two separate undergraduate courses so most material treats them in isolation. The pages that do discuss matrix calculus often are really just lists of rules with minimal explanation or are just pieces of the story.
>In contrast, we're going to rederive and rediscover some key matrix calculus rules in an effort to explain them.
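To make the connection to the original question concrete, here's a minimal sketch (mine, not from the paper) of what those matrix calculus rules buy you: for a tiny linear model with mean-squared-error loss, the gradient is just the vector of partial derivatives of the loss with respect to each weight, and one line of matrix calculus gives it in closed form. The model, data, and learning rate below are made up for illustration:

```python
import numpy as np

# Tiny linear model y_hat = X @ w with mean-squared-error loss.
# Matrix calculus gives the gradient (the vector of partial
# derivatives d(loss)/d(w_i)) in closed form:
#   grad = (2/n) * X.T @ (X @ w - y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 samples, 3 features
w_true = np.array([1.0, -2.0, 0.5])    # weights we want to recover
y = X @ w_true

w = np.zeros(3)                        # start from zero weights
lr = 0.1                               # learning rate (arbitrary choice)
for _ in range(200):
    grad = 2 / len(y) * X.T @ (X @ w - y)  # vector of partial derivatives
    w -= lr * grad                          # one gradient descent step

print(w)  # converges toward w_true
```

The paper walks through how to derive expressions like that `X.T @ (X @ w - y)` term yourself (chain rule over vectors and matrices), rather than memorizing them as rules.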