Weight Switching in Neural Networks using Locality Sensitive Hashing

Locality sensitive hashing gives you a very wide range of choices for selecting weights in a neural network. There are not only various select-from-a-pool strategies, but also various switching scopes (e.g. the input weights to a neuron, the weights in the next layer connected to the output of a neuron in the current layer, and other distinct groupings of weights within the network).

That results in quite a large menu of options:

https://ko-fi.com/post/Neural-Network-Weight-Switching-using-Locality-Sen-Y8Y21KRZZB
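As a concrete illustration, here is a minimal sketch of one point on that menu: random-hyperplane LSH over a neuron's input vector, selecting that neuron's input weights from a small pool. Everything here (pool size, hash width, names) is an illustrative assumption, not a prescription from the linked post.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM = 16       # layer width
N_PLANES = 4   # hash bits, so the pool holds 2**4 = 16 weight vectors
POOL = 2 ** N_PLANES

# Random hyperplanes define the locality sensitive hash.
planes = rng.normal(size=(N_PLANES, DIM))

# Pool of candidate input-weight vectors for one neuron
# (switching scope here: the input weights to a neuron).
weight_pool = rng.normal(size=(POOL, DIM)) / np.sqrt(DIM)

def lsh_bucket(x):
    """Sign-of-projection LSH: N_PLANES bits -> bucket index."""
    bits = (planes @ x) >= 0.0
    return int(bits @ (1 << np.arange(N_PLANES)))

def neuron_forward(x):
    """Hash the input, fetch the selected weight vector, take the dot product."""
    w = weight_pool[lsh_bucket(x)]
    return float(w @ x)

x = rng.normal(size=DIM)
print(lsh_bucket(x), neuron_forward(x))
```

Nearby inputs tend to hash to the same bucket, so the selected weights change in a piecewise fashion; switching scope (e.g. hashing to pick a whole outgoing weight row instead) is the same code with a different pool shape.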


And if you view ReLU as a switch with a switching decision (x >= 0), then you can reuse that switching decision for multiple other things. It is rather blinkered to use it only for the single gating operation it normally performs.
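For instance, here is a toy sketch of that reuse: each neuron's decision bit still gates its own output as usual, but the same bit also selects which of two candidate outgoing weight rows feeds the next layer. The two matrices and the per-row selection are a hypothetical illustration, not a scheme from the post.

```python
import numpy as np

rng = np.random.default_rng(1)

WIDTH = 8
W_a = rng.normal(size=(WIDTH, WIDTH))  # row i: one choice of neuron i's outgoing weights
W_b = rng.normal(size=(WIDTH, WIDTH))  # row i: the alternative choice

def switched_layer(pre):
    d = pre >= 0.0                       # the ReLU switching decision, one bit per neuron
    y = np.where(d, pre, 0.0)            # ordinary ReLU output, using the same bits
    W = np.where(d[:, None], W_a, W_b)   # the bits reused: per-neuron outgoing-row choice
    return W.T @ y                       # next layer's pre-activation

print(switched_layer(rng.normal(size=WIDTH)))
```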

With a certain dash of creativity, it should be possible to find new and useful alternatives to current neural network structuring.


For example, switching decisions could be applied to layer-wise dynamic weight synthesis:

Layer-wise Dynamic Weight Synthesis for Neural Networks

The adaptation would use discrete switching decisions rather than continuous variables; a sketch of what that might look like follows.
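Since the linked post isn't reproduced here, the sketch below is only a guess at the general shape of such a scheme: a layer's weight matrix is synthesized from a bank of basis matrices, with binary sign-test decisions replacing the continuous mixing coefficients. All names and shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

DIM, N_BASES = 8, 4

# Bank of basis weight matrices for one layer.
bases = rng.normal(size=(N_BASES, DIM, DIM)) / np.sqrt(DIM)

# Continuous synthesis would mix the bases with soft coefficients;
# the switching variant replaces those with hard 0/1 decisions.
decision_planes = rng.normal(size=(N_BASES, DIM))

def synthesize_weights(x):
    """Binary decisions (sign tests on x) pick which bases are summed."""
    d = (decision_planes @ x) >= 0.0          # one switching bit per basis
    return np.tensordot(d.astype(x.dtype), bases, axes=1)

def layer_forward(x):
    W = synthesize_weights(x)                 # input-dependent weight matrix
    return np.maximum(W @ x, 0.0)             # ReLU on the synthesized layer

x = rng.normal(size=DIM)
print(layer_forward(x))
```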
