Nooby questions about implementation

I was working on my own Temporal Memory implementation a while back, but work decided to eat a lot of my time; now I'm back at it. I like to program these algorithms myself to understand them better, but I'm having some difficulty with a few things while following the BAMI book.

the " function activatePredictedColumn(column)" seems to add all predicting cells to winners, and reinforces their segments connections. This means multiple cells in a column can predict the exact same thing in the exact same context. This is wasteful, no?

growSynapses() looks like it continuously grows new synapses and will eventually consume all memory, unless createNewSynapse() removes below-threshold synapses. Is this the case?
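And this is my reading of growSynapses(), reusing the toy Synapse type from the sketch above. The 0.21 initial permanence is just a placeholder:

```python
import random

def grow_synapses(segment, prev_winner_cells, new_synapse_count,
                  initial_permanence=0.21):
    """Grow up to new_synapse_count synapses to cells sampled from the
    previous step's winner cells, skipping cells already on the segment."""
    already_on_segment = {syn.presynaptic_cell for syn in segment.synapses}
    candidates = [c for c in prev_winner_cells if c not in already_on_segment]
    random.shuffle(candidates)
    for presynaptic_cell in candidates[:new_synapse_count]:
        segment.synapses.append(Synapse(presynaptic_cell, initial_permanence))
    # In BAMI, new_synapse_count is SYNAPSE_SAMPLE_SIZE minus the number of
    # already-active potential synapses on the segment, so each call only grows
    # the segment toward a target sample size rather than by a fixed amount.
```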

It also appears that growNewSegment() could be called a lot, so there could be tonnes of segments consuming all the memory as well.

Is there perhaps a better place to look for Temporal Memory pseudocode? I find the BAMI book annoying to build from, though maybe the problem is with me.

Any help is appreciated!


Short answer: SDR.
Most, if not all, of the problems you've mentioned are never serious in practice, thanks to the sparse nature of SDRs.
And I guess the fact that each cell subsamples its inputs to predict helps too.
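As a rough back-of-envelope, with typical parameters (2048 columns at ~2% sparsity and a synapse sample size of 32; these are common defaults, not anything from your setup):

```python
num_columns = 2048
sparsity = 0.02
synapse_sample_size = 32

active_columns = round(num_columns * sparsity)            # ~41 active columns per step
max_new_synapses_per_step = active_columns * synapse_sample_size

print(active_columns, max_new_synapses_per_step)          # 41 1312
# Worst case, only a few dozen cells win per timestep, and each grows at most a
# sample-sized handful of new synapses, and only while the input is still novel.
# Once a sequence is learned, existing segments mostly just get reinforced.
```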


Yes, but it is not described in the book. HTM continuously removes the weak distal connections.
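The idea, roughly: a synapse whose permanence decays to zero gets destroyed, and a segment left with no synapses goes with it. A sketch of that cleanup (using the toy structures from the first sketch above; not the actual code):

```python
def prune_weak_connections(cell_segments, min_permanence=0.0):
    """Drop synapses whose permanence has decayed to (or below) a floor, then
    drop any segment that ends up with no synapses at all."""
    for segment in list(cell_segments):
        segment.synapses = [syn for syn in segment.synapses
                            if syn.permanence > min_permanence]
        if not segment.synapses:
            cell_segments.remove(segment)
```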


That pseudocode does the absolute bare minimum necessary to get basic working HTM algorithms, and it does not pay attention to optimization. It sounds like you may be looking for less pseudo-y code; here are links to those functions in production:
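One optimization you will find there, if I remember right, is a hard cap on growth through parameters along the lines of maxSegmentsPerCell and maxSynapsesPerSegment, with the least recently used segment getting recycled once a cell hits its limit. A simplified sketch of that idea (reusing the toy Segment type from above; the 255 default is only illustrative):

```python
def create_segment(cell, segments_by_cell, current_iteration,
                   max_segments_per_cell=255):
    """Grow a distal segment on `cell`, recycling the least recently used
    segment if the cell is already at its limit."""
    segments = segments_by_cell.setdefault(cell, [])
    if len(segments) >= max_segments_per_cell:
        # Reuse the stalest segment instead of growing without bound.
        oldest = min(segments, key=lambda seg: seg.last_used_iteration)
        segments.remove(oldest)
    new_segment = Segment(cell=cell)
    new_segment.last_used_iteration = current_iteration
    segments.append(new_segment)
    return new_segment
```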
