Binary Reservoirs

Just connecting up some ideas:
https://groups.google.com/forum/#!topic/comp.ai.neural-nets/r29N5BBhl5U

I wouldn’t depend on Micron continuing with that project indefinitely, though if they find more applications, that might help.

New paper I was given a link to:
https://arxiv.org/pdf/1702.03812.pdf

I found that churning the data around in the analog/floating point domain first, and only then converting it to binary -1,+1, works really well:
https://drive.google.com/open?id=0BwsgMLjV0BnhRFNoTTh3UEhwZlk
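For concreteness, here is a minimal sketch of that ordering in numpy. The fixed random mixing matrix is just one illustrative choice of float-domain churn, not necessarily what the linked demo uses:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64

# Fixed random mixing matrix: an illustrative stand-in for the
# "churning" step; any sufficiently chaotic float-domain mixer would do.
mix = rng.standard_normal((n, n)) / np.sqrt(n)

def to_binary(x):
    """Mix in the floating point domain first, then convert to -1,+1."""
    mixed = mix @ x                       # churn while still analog
    return np.where(mixed >= 0.0, 1, -1)  # binarize last

x = rng.standard_normal(n)
print(to_binary(x))
```

The point is the ordering: do the mixing before you throw away magnitude information, not after.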

Hello @Sean_O_Connor, a friendly suggestion.

The variety, out-of-the-box thinking, and passion for different approaches that you bring to this forum are very much appreciated. It might be better if you expanded on how the things you share relate to biology, neuroscience-based AI solutions, the cortex, the brain, or HTM, if they do at all.

Best of luck on your research :slight_smile:


I’m not paid to. It is only hobbytek, as the Germans call it.

Anyway, if you mix up your input data as a binary pool/reservoir, then depending on how chaotic the mixing is, you get either an extremely large locality sensitive hash or just an ordinary hash. If you break the reservoir up into 8-bit pieces, each of the 256 possible values is almost equally likely. With 32-bit pieces you get a very non-uniform distribution, but that is too sparse. Somewhere in between, you usefully get pieces whose values can be stored (in a hash table) as features of the input example. You can mix detected features back into the reservoir if you want to push matters further. Then a simple floating point readout layer seems to work well with binary features, to map to the target output.

Anyway, I have to buy a new hard drive for my laptop before I can do anything further. It’s analog electronics design for the next few weeks.
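Here is a rough sketch of the chunking and readout steps, assuming the reservoir has already been mixed and reduced to 0/1 bits. The flat weight array standing in for the hash table, and the LMS-style update hinted at in the last comment, are my own illustrative choices:

```python
import numpy as np

CHUNK_BITS = 8                   # 8-bit pieces: 256 near-uniform values each

def chunk_values(bits):
    """Break a binary (0/1) reservoir into 8-bit pieces and return
    each piece's integer value, keyed by piece position."""
    pieces = []
    for i in range(0, len(bits) - CHUNK_BITS + 1, CHUNK_BITS):
        value = 0
        for b in bits[i:i + CHUNK_BITS]:
            value = (value << 1) | int(b)
        pieces.append((i // CHUNK_BITS, value))
    return pieces

def to_slots(pieces):
    """Give each (piece, value) pair its own weight slot; a flat array
    indexed this way plays the role of the hash table of features."""
    return [p * (1 << CHUNK_BITS) + v for p, v in pieces]

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=256)       # stand-in mixed reservoir state
w = np.zeros((len(bits) // CHUNK_BITS) * (1 << CHUNK_BITS))

idx = to_slots(chunk_values(bits))
y = w[idx].sum()                          # simple floating point readout
# e.g. LMS update toward a target: w[idx] += rate * (target - y)
```

With 8-bit pieces each slot gets hit often enough to learn something; at 32 bits per piece the 2^32 possible values leave the slots too sparse to train, which is the trade-off mentioned above.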