
Try reading the last two paragraphs of the article first. They won't make much sense unless (I suspect) you already understand the point the article is trying to convey, but you'll save yourself some time being confused:

> By now it’s probably worth dropping the allegory: the “workers” in our story are models, which could be individual layers of a neural network, or even whole models. And the process we’ve been discussing is of course the backpropagation of gradients, which are used to iteratively update the weights of a model.

> The allegory also introduced Thinc’s particular implementation strategy for backpropagation, which uses function composition. This approach lets you express neural network operations as higher-order functions. On the one hand, there are times when managing the backward pass explicitly is tricky, and it’s another place your code can go wrong. But the trade-off is that there’s much less API surface to work with, and you can spend more time thinking about the computations that should be executed, instead of the framework that’s executing them. For more about how Thinc is put together, read on to its Concept and Design.
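
To make the function-composition idea concrete, here's a rough sketch in plain NumPy (not Thinc's actual API; the linear/relu/chain names here are just illustrative) of how each layer can return its output together with a backprop callback, and how composing layers composes those callbacks in reverse order:

    import numpy as np

    # Each "layer" is a function: X -> (Y, backprop), where backprop maps the
    # gradient of Y back to the gradient of X (and nudges any parameters).

    def linear(W, b, lr=0.01):
        def layer(X):
            Y = X @ W + b
            def backprop(dY):
                dX = dY @ W.T
                # In-place parameter updates, just to keep the sketch short.
                W[...] -= lr * (X.T @ dY)
                b[...] -= lr * dY.sum(axis=0)
                return dX
            return Y, backprop
        return layer

    def relu(X):
        Y = np.maximum(X, 0.0)
        def backprop(dY):
            return dY * (X > 0)
        return Y, backprop

    def chain(*layers):
        def composed(X):
            callbacks = []
            for layer in layers:
                X, backprop = layer(X)
                callbacks.append(backprop)
            def backprop_all(dY):
                # Run the stored callbacks in reverse: that's the backward pass.
                for cb in reversed(callbacks):
                    dY = cb(dY)
                return dY
            return X, backprop_all
        return composed

    # Usage: the forward pass returns predictions plus a callback that runs the
    # whole backward pass when handed the gradient of the loss.
    rng = np.random.default_rng(0)
    model = chain(linear(0.1 * rng.normal(size=(4, 8)), np.zeros(8)),
                  relu,
                  linear(0.1 * rng.normal(size=(8, 1)), np.zeros(1)))

    X = rng.normal(size=(16, 4))
    target = rng.normal(size=(16, 1))
    Y, backprop = model(X)
    d_loss = 2 * (Y - target) / len(Y)   # gradient of mean squared error
    backprop(d_loss)                     # propagates gradients, updates W and b

Thinc's real Model objects also handle parameter storage, optimizers, and typing, but as I understand it the core shape (forward returns the output plus a backprop callback, and a chain combinator composes them) is the idea the quoted paragraph is describing.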



