SwitchNet (fixed filter bank) neural network

I tidied up the code for the SwitchNet neural network and added a mathematical justification in terms of linear mappings.

https://sites.google.com/view/algorithmshortcuts/switchnet-newer

I’m not a paid computer scientist and I don’t get a cent out of this, so only outline arguments are given.

2 Likes

I’ve written a dense-layer version where you can exceed the n² parameter limit of a conventional neural network layer, or make layers sparser than a conventional dense layer.
I would call the code alpha quality, because I’ve hardly done any programming for a year or more.
https://archive.org/details/switch-net-dl
It is written in Processing (processing.org), a Java-based language.
It is mainly provided to point out that such things are possible.
The information needed to build neural networks around fast transforms has been available for years, so who knows who has quietly pursued it and could be far ahead of OpenAI and the rest. No one? A few?
Anyway, you get far more switching decisions per parameter than with conventional dense ReLU layers, whatever the significance of that may be.
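For concreteness, here is a rough parameter count under one reading of the scheme: each sublayer applies a two-gain switched product (two parameters per element) followed by a fixed fast transform, which contributes no parameters at all. The width n, sublayer count m, and the two-gains-per-element rule are my own illustrative assumptions, not taken from the linked code.

```python
n = 256  # layer width (a power of two, as fast transforms require)
m = 4    # switched-transform sublayers stacked into one "layer"

# Conventional dense ReLU layer: n*n weights, one switching decision per output.
dense_params = n * n    # 65536
dense_switches = n      # 256

# Switched-transform layer: two gains per element per sublayer; the fixed
# fast transform itself is parameter-free.
switchnet_params = 2 * n * m    # 2048
switchnet_switches = n * m      # 1024

# Switching decisions per parameter: 1/2 versus 1/n.
print(switchnet_switches / switchnet_params)  # 0.5
print(dense_switches / dense_params)          # ~0.0039
```

Stacking enough sublayers (m > n/2) pushes the parameter count past n², while each sublayer still only costs O(n log n) to evaluate.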

1 Like

I wrote an online version.
https://sites.google.com/view/algorithmshortcuts/switchnet-dense-layer
The basic concept is: Hadamard product, Hadamard transform, switched Hadamard product.
You could call that adaptive linear-algebra filtering, which is also what a ReLU-based neural network is; the adaptive switching is the brains of the operation.
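As I read the recipe, one sublayer is: an elementwise (Hadamard) product with a parameter vector, a fast Walsh-Hadamard transform, then a switched product where each element's sign selects between two gains. A minimal NumPy sketch, where the two-gain switch rule and all function names are my own assumptions, not the author's code:

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform, O(n log n); len(x) must be a power of two."""
    x = x.astype(float).copy()
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x / np.sqrt(n)  # normalized so fwht is its own inverse

def switched_product(x, up, down):
    """Switched Hadamard product: each element's sign picks one of two gains."""
    return np.where(x >= 0.0, up, down) * x

def sublayer(x, gains, up, down):
    """Hadamard product -> Hadamard transform -> switched Hadamard product."""
    return switched_product(fwht(gains * x), up, down)
```

For any fixed pattern of sign decisions the whole sublayer is a linear map (diagonal matrices sandwiching the orthogonal Hadamard matrix), which is the "adaptive linear algebra filtering" view: the parameters shape the filters, the switches pick which linear map applies to which input.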

1 Like

Is there a GitHub or GitLab repo for your code?

Thanks.
Fred

1 Like