Temporal Memory algorithm: Permanence of synapses

Hi, in the BaMI Temporal Memory section, it says: "When a dendrite segment becomes active, modify the permanence values of all the synapses associated with the segment. For every potential synapse on the active dendrite segment, increase the permanence of those synapses that are connected to active cells and decrement the permanence of those synapses connected to inactive cells. These changes to synapse permanence are marked as temporary."

As for "temporary permanences of synapses", what I understand is that for every active dendrite segment of a cell, we increase the permanence in a temporary variable, say "temporaryWeightVariable", instead of in WeightVariable, which is permanent and whose value carries forward into the next timesteps. WeightVariable would only change its value once the cell's prediction turns out to be right or wrong. Is that true?
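
To make the question concrete, here is a rough Python sketch of what I mean. The names (temporary_delta, on_segment_active, resolve_prediction) are just my own for illustration, not anything from NuPIC:

```python
class Synapse:
    def __init__(self, presynaptic_cell, permanence):
        self.presynaptic_cell = presynaptic_cell
        self.permanence = permanence    # "WeightVariable": persists across timesteps
        self.temporary_delta = 0.0      # "temporaryWeightVariable": staged change

class Segment:
    def __init__(self, synapses):
        self.synapses = synapses

def on_segment_active(segment, active_cells, increment=0.10, decrement=0.10):
    """Stage permanence changes while the segment is active, without committing them."""
    for syn in segment.synapses:
        if syn.presynaptic_cell in active_cells:
            syn.temporary_delta = increment
        else:
            syn.temporary_delta = -decrement

def resolve_prediction(segment, was_correct):
    """Commit the staged changes only if the prediction turned out right; discard otherwise."""
    for syn in segment.synapses:
        if was_correct:
            syn.permanence = min(1.0, max(0.0, syn.permanence + syn.temporary_delta))
        syn.temporary_delta = 0.0
```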

Yes, I believe that is correct. I’m sure there are other ways to do this in code, but I think this is how NuPIC does it.

You might want to look at NuPIC’s TemporalMemory for a reference implementation. Here are the docs.
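
From memory, the usage looks something like the snippet below; the parameter values are only illustrative, so double-check the docs for the exact names and defaults:

```python
from nupic.algorithms.temporal_memory import TemporalMemory

tm = TemporalMemory(
    columnDimensions=(2048,),
    cellsPerColumn=32,
    permanenceIncrement=0.10,        # reward for active synapses on matching segments
    permanenceDecrement=0.10,        # punishment for inactive synapses on matching segments
    predictedSegmentDecrement=0.004  # punishment for segments that predicted the wrong input
)

# encodedInputs is a placeholder: each item is an iterable of active column indices.
for activeColumns in encodedInputs:
    tm.compute(activeColumns, learn=True)  # permanences are updated here, in place
    predictiveCells = tm.getPredictiveCells()
```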


The quoted section appears in the initial 0.4 release of the Temporal Memory description, but not in the 0.5 update: https://numenta.com/assets/pdf/temporal-memory-algorithm/Temporal-Memory-Algorithm-Details.pdf

It seems to me that the Python reference code does not implement temporary permanence updates.

@mrcslws Is this true?

Finally, is there an answer to this?

@rogert is correct, that section is no longer in BaMI.

Our Temporal Memory implementation doesn’t have a notion of a “temporary” change. When the Temporal Memory receives a new input, it evaluates the input against its predictions. On segments that correctly predicted the input, it rewards active synapses (PERMANENCE_INCREMENT) and punishes inactive synapses (PERMANENCE_DECREMENT). On segments that incorrectly predicted another input, it punishes active synapses (PREDICTED_DECREMENT).
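
In other words, the learning step looks roughly like the pseudo-Python below. The segment bookkeeping and helper names are simplified for illustration, and the constant values are placeholders (they are configurable parameters), so don't read this as our actual code:

```python
PERMANENCE_INCREMENT = 0.10   # placeholder values; the real ones are configurable
PERMANENCE_DECREMENT = 0.10
PREDICTED_DECREMENT = 0.004

def clip(p):
    return min(1.0, max(0.0, p))

def learn(correctly_predictive_segments, wrongly_predictive_segments, prev_active_cells):
    # Segments that correctly predicted the current input:
    # reward the synapses that contributed, punish the ones that didn't.
    for segment in correctly_predictive_segments:
        for syn in segment.synapses:
            if syn.presynaptic_cell in prev_active_cells:
                syn.permanence = clip(syn.permanence + PERMANENCE_INCREMENT)
            else:
                syn.permanence = clip(syn.permanence - PERMANENCE_DECREMENT)

    # Segments that predicted an input that didn't arrive:
    # punish their active synapses so the false prediction fades.
    for segment in wrongly_predictive_segments:
        for syn in segment.synapses:
            if syn.presynaptic_cell in prev_active_cells:
                syn.permanence = clip(syn.permanence - PREDICTED_DECREMENT)
```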
