What do so-called fast transform neural networks and conventional ReLU neural networks have in common? ReLU can be viewed as a single-pole, single-throw switch; the activation function in a fast transform net can be viewed as a single-pole, double-throw switch. In either case the switches connect and disconnect different (or potentially different) dot products. In one case the dot products (weighted sums) are internally fully adjustable; in the other, only the magnitudes of a set of orthogonal dot products are adjustable. It might seem obvious which one to choose, but I wouldn't be so hasty, because of the 'orthogonal' part and how variously scaled versions of one set of orthogonal dot products will interact when processed by further orthogonal dot products.
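One plausible reading of the two switch types, sketched in Python. The function names and the two-gain form of the double-throw switch are illustrative assumptions on my part, not taken from any particular library:

```python
# ReLU as a single-pole, single-throw switch: the incoming value is
# either connected through (positive side) or disconnected (zeroed).
def relu_switch(s):
    return s if s > 0 else 0.0

# A double-throw reading of the fast transform activation: the value is
# routed to one of two paths depending on its sign. Here the two paths
# are represented by two separately adjustable gains (an assumption).
def spdt_switch(s, gain_pos, gain_neg):
    return gain_pos * s if s > 0 else gain_neg * s
```

Note that `relu_switch` is just `spdt_switch` with the negative-side gain fixed at zero, which is one way to see the relation between the two networks.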
In any event, for a particular input, the state of each switch becomes known.
Each neuron in the net receives some particular composition: a dot product of switched (connected or disconnected) dot products of switched dot products, and so on.
However, a dot product of a number of dot products can be condensed right back down to a single, simple dot product of the input vector.
In particular, the value of each output neuron can be viewed as that of a single dot product of the input vector. The entire output vector is then the result of a matrix of such dot products applied to the input.
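The condensing step is just associativity of matrix multiplication. A minimal numpy sketch, with the switches taken as all connected so the layers are plain linear maps:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)

# Two weight layers: each hidden value is a dot product of dot products.
W1 = rng.standard_normal((4, 4))
W2 = rng.standard_normal((3, 4))

nested = W2 @ (W1 @ x)       # dot products of dot products
condensed = (W2 @ W1) @ x    # one condensed matrix: each output neuron
                             # is a single dot product of the input
```

The two computations give the same output vector; each row of `W2 @ W1` is the single condensed dot product seen by one output neuron.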
For both types of neural network, a particular input causes the switches in the system to be thrown decidedly one way or the other, inducing a particular matrix, and hence a matrix-multiply mapping of the input vector to the output vector.
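For the ReLU case this induced matrix is easy to exhibit: record the switch states for a given input as a 0/1 mask, fold the mask in as a diagonal matrix, and the whole forward pass collapses to one matrix multiply. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((3, 8))
x = rng.standard_normal(4)

h = W1 @ x
mask = (h > 0).astype(float)   # the switch states induced by this input
y = W2 @ (mask * h)            # ordinary ReLU forward pass

# The same input-dependent matrix, condensed into a single linear map:
M = W2 @ np.diag(mask) @ W1
```

Here `M @ x` reproduces `y` exactly; a different input would throw different switches and induce a different `M`.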
Then what they have in common is some kind of dancing matrix operation, like Proteus changing shape on the beach.
What if most of the synapses in the brain were fixed and random? Then the thing would be a vast 3D fast-transform random projection. Only a small percentage of synapses/neurons would need to act as activation functions within that matrix of fast-transform random projections. That would make learning more efficient, in the sense that far fewer parameters would need to be adjusted to get a particular wanted behavior or response.
I’m not saying that is how it is, I’m just putting it forward as an idea.