Switch Net 4 - linking multiple tiny neural layers with a connectionist device

You can link multiple small neural network layers together using the fast Walsh-Hadamard transform (WHT) as a connectionist device.
Java-style code:
https://discourse.processing.org/t/switch-net-4-neural-network/33220
JavaScript code:
https://editor.p5js.org/seanhaddps/sketches/ERaMwFcej
Full Screen: https://editor.p5js.org/seanhaddps/full/ERaMwFcej
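
If you haven't seen it before, the fast WHT is only a few lines. A minimal sketch (the length must be a power of two; scale every element by 1/sqrt(n) afterwards if you want a norm-preserving, self-inverse version):

```java
// Minimal in-place fast Walsh-Hadamard transform, O(n log n).
// vec.length must be a power of two.
void wht(float[] vec) {
  int n = vec.length;
  for (int h = 1; h < n; h += h) {
    for (int i = 0; i < n; i += h + h) {
      for (int j = i; j < i + h; j++) {
        float a = vec[j];
        float b = vec[j + h];
        vec[j] = a + b;     // sum butterfly
        vec[j + h] = a - b; // difference butterfly
      }
    }
  }
}
```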
In the code I used width-4 neural layers, combined with the Beyond ReLU idea I’ve mentioned before.
You can use non-linear layers of other widths (2, 8, 1, 16, …) or just use parametric non-linear functions. A sketch of one width-4 block follows below.
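
One way to read a width-4 non-linear layer is a tiny 4x4 dense block followed by a two-slope activation, where ReLU is the special case of positive slope 1 and negative slope 0. This is only an illustrative sketch with made-up names (block4, pos, neg), not the exact code behind the links above:

```java
// Sketch of one width-4 block acting on vec[off..off+3]:
// a trainable 4x4 matrix w, then a two-slope activation with
// trainable per-output slopes pos[] and neg[].
void block4(float[] vec, int off, float[][] w, float[] pos, float[] neg) {
  float[] out = new float[4];
  for (int r = 0; r < 4; r++) {
    float sum = 0f;
    for (int c = 0; c < 4; c++) {
      sum += w[r][c] * vec[off + c];
    }
    // Two-slope ("beyond ReLU") activation: the sign of the
    // input selects which trainable slope to apply.
    out[r] = sum >= 0f ? pos[r] * sum : neg[r] * sum;
  }
  System.arraycopy(out, 0, vec, off, 4);
}
```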
The WHT is a connectionist device because it connects each input to all the outputs with equal weight magnitude. That lets you turn sparse neural networks into pseudo-fully-connected networks, for example.
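
Putting the two sketches together, one hedged reading of a full layer: a WHT for global mixing, then the cheap width-4 blocks for the non-linearity. Since the WHT already touches every element, the sparse blocks end up behaving like a pseudo-fully-connected layer:

```java
// Sketch: one pseudo-fully-connected layer, assuming the wht()
// and block4() sketches above. vec.length must be a power of two
// (and at least 4, so it divides into width-4 blocks).
void layer(float[] vec, float[][][] w, float[][] pos, float[][] neg) {
  wht(vec); // global mixing: every input reaches every element
  for (int b = 0; b * 4 < vec.length; b++) {
    block4(vec, b * 4, w[b], pos[b], neg[b]); // local non-linearity
  }
}
```

Stack a few of these and every input can influence every output through the repeated WHT mixing, at O(n log n) cost per layer instead of the O(n^2) of a dense layer.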