Here is the special sauce: “We can consider the discrete Fourier transform (DFT) to be an artificial neural network: it is a single layer network, with no bias, no activation function, and particular values for the weights. The number of output nodes is equal to the number of frequencies we evaluate.”
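Concretely, here is a minimal sketch of that view (my own example, using NumPy): the DFT as one fixed weight matrix applied to the input, with no bias and no activation, where output node k is the k-th frequency.

    import numpy as np

    # DFT of length N as a single "layer": fixed weights
    # W[k, n] = exp(-2j*pi*k*n/N), no bias, no activation.
    N = 8
    n = np.arange(N)
    W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # (N, N) weight matrix

    x = np.random.randn(N)   # input "signal"
    y = W @ x                # forward pass = DFT of x

    assert np.allclose(y, np.fft.fft(x))  # matches NumPy's FFT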
A single-layer neural network is a sum of products, and the basic Fourier equation is a sum of products.
In this view there are lots of single-layer neural networks out there. For me, it’s the training algorithm (backprop) that sets the neural net apart.
The activation function is what keeps the layers separate. Without it, a pair of layers devolves into a matrix multiplication, and the resulting product matrix can replace the pair as a single layer.
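A quick sketch of that collapse (my own example, with arbitrary shapes): two linear layers with nothing in between are exactly one layer whose weights are the product of the two matrices.

    import numpy as np

    # Two linear layers with no activation in between collapse to one:
    # W2 @ (W1 @ x) == (W2 @ W1) @ x.
    rng = np.random.default_rng(0)
    W1 = rng.standard_normal((16, 32))  # first layer: 32 -> 16
    W2 = rng.standard_normal((8, 16))   # second layer: 16 -> 8
    x = rng.standard_normal(32)

    two_layers = W2 @ (W1 @ x)
    one_layer = (W2 @ W1) @ x           # the "resulting matrix", a single layer

    assert np.allclose(two_layers, one_layer)
    # A nonlinearity between W1 and W2 is what prevents this collapse.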
It seems that backpropagation might be a good abstraction for biological learning rules after all: "local predictive coding converges asymptotically (and in practice rapidly) to exact backprop gradients on arbitrary computation graphs using only local learning rules" from https://arxiv.org/abs/2006.04182
Isn't "backpropagation" just a synonym for the partial derivative of the scalar cost function with respect to the weights? And in the mathematical formulation of biological neural networks these derivatives can't be computed analytically. Your comment sounds like "backpropagation" is some kind of natural phenomenon.