FF NN by Geoff Hinton

Just putting this here in case it hasn’t been shared yet: Hinton’s preliminary investigations into his forward-forward learning algorithm for neural networks. The positive and negative passes sound interesting.

Deep Learning Pioneer Geoffrey Hinton Publishes New Deep Learning Algorithm


Here’s a pretty good explanation video

It is also mentioned here Signal Propagation: A Framework for Learning and Inference In a Forward Pass - #2 by DanML


Maybe it’s worth summarizing for those too busy to click links.

Hinton’s made a huge step toward bridging the ‘but brains don’t backpropagate’ gap.

He’s devised a super-simple NN construct (ForwardForward) that learns without invoking backpropagation. Super-simple as in: you can code it up in PyTorch (say) in a few minutes.

If it can be shown that backprop can be ‘swapped out’ with ForwardForward, that significantly narrows the gap between MachineIntelligence and BioIntelligence.
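To make the “code it up in a few minutes” claim concrete, here is a minimal sketch of a single forward-forward layer in plain Python. This is my own hedged reading of the idea, not Hinton’s code: goodness is the sum of squared ReLU activations, positive data is nudged above a threshold `theta` and negative data below it, and the weight update is purely local to the layer (no backpropagated error). Names like `FFLayer` and `train_step` are invented for illustration.

```python
import math
import random

def relu(x):
    return x if x > 0 else 0.0

class FFLayer:
    """Sketch of one forward-forward layer (assumptions noted above).

    Each layer has its own local objective: push goodness (sum of squared
    activations) above theta for positive data, below theta for negative
    data, via the gradient of a logistic loss on (goodness - theta).
    """

    def __init__(self, n_in, n_out, theta=2.0, lr=0.03, seed=0):
        rnd = random.Random(seed)
        self.w = [[rnd.gauss(0.0, 0.1) for _ in range(n_in)]
                  for _ in range(n_out)]
        self.theta = theta
        self.lr = lr

    def forward(self, x):
        return [relu(sum(wi * xi for wi, xi in zip(row, x)))
                for row in self.w]

    def goodness(self, x):
        return sum(a * a for a in self.forward(x))

    def train_step(self, x, positive):
        h = self.forward(x)
        g = sum(a * a for a in h)
        sign = 1.0 if positive else -1.0
        # p = sigmoid(sign * (g - theta)); the local loss is -log(p),
        # so the gradient w.r.t. goodness is sign * (1 - p).
        p = 1.0 / (1.0 + math.exp(-sign * (g - self.theta)))
        coeff = sign * (1.0 - p)
        for j, a in enumerate(h):
            if a <= 0:
                continue  # ReLU gate: dead units get no update
            for i, xi in enumerate(x):
                # dg/dw[j][i] = 2 * a_j * x_i
                self.w[j][i] += self.lr * coeff * 2.0 * a * xi
```

In a multi-layer network each layer would be trained with this same local rule, with activations length-normalized between layers so that goodness from one layer can’t trivially leak into the next; this sketch shows only the per-layer mechanics.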


If you want to go straight to implementations:

Keras: Using the Forward-Forward Algorithm for Image Classification

PyTorch: GitHub - mohammadpz/pytorch_forward_forward: Implementation of Hinton's forward-forward (FF) algorithm - an alternative to back-propagation

They implement enough of the paper to see it working and give you something to play with in terms of tuning.
However, the architecture is entirely standard. The interesting bits from the paper, like the timing of layer feedback, are not demonstrated.