Here I have another idea for a TP implementation.
You keep an accumulator (a 1D array of ints) of size 2048, and every time there is a bit set at position X in the input you increment that element. The input is the TM's predicted SDR.
Now comes the interesting part: the accumulator is decaying. What do I mean? Here is the formula:
val = exp(- (t/n))
where t is the time step and 'n' is a parameter that controls how fast it decays.
On every time step we take winner-takes-all (the biggest values) at the required sparsity.
Now we have values that change according to the underlying data. The next step is to pass this pattern through a Spatial Pooler, and we get a stable and unique pattern.
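Here's a minimal sketch of how I picture the accumulate / decay / winner-takes-all steps (class and variable names are mine, not from a real library; the final Spatial Pooler step is omitted). Note that multiplying the whole array by exp(-1/n) once per step gives each increment exactly the exp(-(t/n)) weight after t steps:

```python
import numpy as np

N_BITS = 2048      # width of the TM predicted SDR
SPARSITY = 0.02    # required output sparsity (~40 of 2048 bits)
DECAY_N = 10.0     # 'n' in val = exp(-(t/n)); larger = slower decay

class DecayingAccumulator:
    """Accumulates TM predicted SDRs with exponential decay."""

    def __init__(self, n_bits=N_BITS, decay_n=DECAY_N):
        self.acc = np.zeros(n_bits)
        # Per-step factor: applying this every step is equivalent
        # to weighting a t-steps-old increment by exp(-(t/n)).
        self.decay = np.exp(-1.0 / decay_n)

    def step(self, predicted_sdr_bits):
        # 1. Decay all accumulator cells by one time step.
        self.acc *= self.decay
        # 2. Increment the cells whose bits are active in the input SDR.
        self.acc[list(predicted_sdr_bits)] += 1.0
        # 3. Winner-takes-all: keep the top-k cells at the required sparsity.
        k = int(len(self.acc) * SPARSITY)
        winners = np.argpartition(self.acc, -k)[-k:]
        return set(winners.tolist())
```

Recently incremented bits outcompete old ones, so the output drifts slowly as the underlying sequence changes; the output set would then be fed to the SP for stabilization.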
What do you think ?
This would mean that your pooling layer has fixed sparsity, right? But what if the TM layer(s) it's monitoring are unstable due to randomness in their inputs? Wouldn't it be good if the pooling layer were itself less stable during those times?
I think I get the decaying function you have there, and the idea makes sense to me. Though if you have a diagram showing how connections are formed between cells in your TP and TM layers, that would help make it clearer!
yes, fixed sparsity …
The TM generates a 2048-bit SDR on every step, which is used to increment the accumulator.
No connections; this is a purely algebraic solution.
Maybe later some sort of feedback links, but I'm not sure yet how TP=>TM feedback would work. I was about to post a question on that.
I think that’s a great idea. I came to a very similar solution, tested it, and found that it works well. I wrote about it here: https://github.com/ctrl-z-9000-times/sdr_algorithms/blob/master/ascii_stability.pdf