Zero-divergence inference learning

Researchers develop a new predictive coding training algorithm for neural networks that they say is biologically plausible and equivalent to backpropagation.

News article:

Paper:

Implementation:


Then that would make backpropagation itself biologically plausible.

It also suggests that backprop is something brains actually do, just implemented differently than in conventional software.

I suppose it depends on your definition of backprop, but what I got from that paper is not that the brain actually does backprop. Rather, with predictive coding, local Hebbian updates can systematically converge to following the same gradients that backprop computes. That means the brain's circuitry could be applied to many of the same problems that standard ML algorithms handle (and, more interestingly, that those algorithms could be implemented on massively more parallel architectures than they typically are).
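
To make that concrete, here's a toy sketch of the convergence I mean (my own illustration, not the paper's Z-IL implementation; the two-layer tanh network, quadratic loss, and relaxation schedule are all assumptions): relax the network's value nodes to minimize local prediction errors, form the purely local error-times-presynaptic-rate update for each weight matrix, and compare its direction against the gradient backprop computes on the same weights:

```python
# Toy sketch: predictive coding's local updates vs. backprop gradients.
# Everything here (two-layer tanh net, quadratic loss, relaxation schedule)
# is an illustrative assumption, not the paper's Z-IL implementation.
import numpy as np

rng = np.random.default_rng(0)

def f(x):  return np.tanh(x)
def df(x): return 1.0 - np.tanh(x) ** 2

sizes = [4, 8, 3]                      # input -> hidden -> output
W = [rng.standard_normal((sizes[i + 1], sizes[i])) * 0.3 for i in range(2)]
x_in = rng.standard_normal(sizes[0])
target = rng.standard_normal(sizes[2])

# --- Reference: backprop gradients of 0.5 * ||output - target||^2 ---
a1 = W[0] @ f(x_in)                    # hidden pre-activation
out = W[1] @ f(a1)                     # network output
e_out = out - target
g_W1 = np.outer(e_out, f(a1))
g_W0 = np.outer((W[1].T @ e_out) * df(a1), f(x_in))

# --- Predictive coding: clamp input and output, relax the hidden node ---
x = [x_in, a1.copy(), target.copy()]
for _ in range(200):                   # inference phase: descend the energy
    e = [x[i + 1] - W[i] @ f(x[i]) for i in range(2)]  # local prediction errors
    x[1] += 0.1 * (-e[0] + df(x[1]) * (W[1].T @ e[1]))

# Energy gradient w.r.t. each weight matrix: -(local error) x (presynaptic
# rate). Both factors are available at the synapse, so the update is Hebbian.
e = [x[i + 1] - W[i] @ f(x[i]) for i in range(2)]
pc_W1 = np.outer(-e[1], f(x[1]))
pc_W0 = np.outer(-e[0], f(x[0]))

for name, g, pc in [("W1", g_W1, pc_W1), ("W0", g_W0, pc_W0)]:
    cos = np.dot(g.ravel(), pc.ravel()) / (np.linalg.norm(g) * np.linalg.norm(pc))
    print(f"cosine({name} backprop gradient, PC update direction): {cos:.4f}")
```

On a small setup like this the cosine similarities should come out close to 1, which is the sense in which the local updates "follow the same gradients"; the paper's Z-IL algorithm goes further and makes the equivalence exact.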

It would be like saying we discovered that a skateboard can follow the same sidewalk gradients that a bicycle can, so a skateboard may in fact be a bicycle. At the end of the day, backprop and Hebbian learning with predictive coding are still two different learning strategies.
