Fast Transform (a.k.a. Fixed Filter Bank) neural networks, trained by evolution and by backpropagation.

Evolution: https://s6regen.github.io/Fast-Transform-Neural-Network-Evolution/

Backpropagation: https://s6regen.github.io/Fast-Transform-Neural-Network-Backpropagation/

A fast transform (the Walsh Hadamard Transform) is used as a fully connected set of fixed weights. If combined with a conventional neural network activation function, this would give you a real but completely nonadjustable neural network layer.
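As a minimal sketch of the fixed-weight idea, the Walsh Hadamard Transform below mixes all inputs with all outputs in O(n log n) time using only additions and subtractions, with no adjustable parameters at all (function name and implementation details are illustrative, not from the original projects):

```python
import numpy as np

def wht(x):
    """Fast Walsh-Hadamard Transform (unnormalized, butterfly form).

    Acts like multiplication by a fixed dense n x n weight matrix,
    but costs only O(n log n) add/subtract operations and stores
    zero parameters. Input length must be a power of two.
    """
    x = np.asarray(x, dtype=float).copy()
    n = x.size
    assert n > 0 and (n & (n - 1)) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b  # butterfly step
        h *= 2
    return x
```

Because every output depends on every input, the transform plays the role of a fully connected layer; it is "fixed filter bank" in the sense that those connection weights (all +1 or -1 up to scaling) can never be trained.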

Something must bend: since the weights are fixed, the adjustability has to move into the activation functions. What you can do is give each neuron its own individually adjustable parametric activation function.

With switching-based parametric activation functions, the resulting fast transform neural network behaves very similarly to a ReLU-based network, while being faster to compute and requiring fewer parameters per layer.
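A minimal sketch of one such layer, assuming a switching activation that scales each element by one learnable slope when positive and another when negative (the sign acts as the switch, analogous to ReLU's gating; all names and the exact parameterization are illustrative assumptions, not taken from the linked projects):

```python
import numpy as np

def wht(x):
    """Fast Walsh-Hadamard Transform (unnormalized) - the fixed weights."""
    x = np.asarray(x, dtype=float).copy()
    n = x.size
    assert n > 0 and (n & (n - 1)) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

def switch_act(x, a, b):
    """Per-element switching activation: slope a[i] for x[i] >= 0,
    slope b[i] otherwise. With a=1, b=0 this reduces to ReLU."""
    return np.where(x >= 0, a * x, b * x)

def ft_layer(x, a, b):
    """One fast transform layer: fixed WHT mixing, then the
    individually adjustable switching activation. Only the 2n
    values in a and b are trainable, versus n*n for a dense layer."""
    return switch_act(wht(x), a, b)
```

For width n, a stack of such layers trains only 2n parameters per layer, while still giving full input-output connectivity through the transform; that is the parameter saving the paragraph above refers to.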

Information: