It sounds complicated, but neural networks these days are basically just a bunch of filters with nonlinear cut-offs and binning (from a signal processing point of view).
Super simple to implement the feed-forward scenario for decoding.
Not entirely sure what the residual/memory aspect of these networks adds in terms of complexity, but it's probably just another vector add-multiply, or something to that effect.
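To make that concrete, here's a minimal sketch of that view: a linear "filter", a nonlinear cut-off (ReLU), and a residual connection, which really is just an element-wise vector add. All names, shapes, and sizes here are made up for illustration; this isn't any particular network's architecture.

```python
import numpy as np

def relu(x):
    # the "nonlinear cut-off": clamp negatives to zero
    return np.maximum(x, 0.0)

def feed_forward_block(x, W1, b1, W2, b2):
    # linear "filter" -> nonlinearity -> linear projection back
    h = relu(x @ W1 + b1)
    # residual connection: literally just a vector add
    return x + (h @ W2 + b2)

# toy dimensions, chosen arbitrarily
rng = np.random.default_rng(0)
d, hidden = 8, 32
x  = rng.standard_normal(d)
W1 = rng.standard_normal((d, hidden)) * 0.1
b1 = np.zeros(hidden)
W2 = rng.standard_normal((hidden, d)) * 0.1
b2 = np.zeros(d)

y = feed_forward_block(x, W1, b1, W2, b2)
print(y.shape)
```

The feed-forward decode pass is just this block applied layer after layer, which is why it's so simple to implement.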