I am a beginner in HTM. I watched the HTM School video on Temporal Memory and tried to follow the pseudocode of the TM implementation in the BAMI book, but I have difficulty understanding it because the terminology is new to me (I am not from a biology background). So, here are my questions:
- The BAMI book mentions this in TM section:
Every cell has many distal dendrite segments. If the sum of the active synapses on a distal segment exceeds a threshold, then the associated cell enters the predicted state. Since there are multiple distal dendrite segments per cell, a cell’s predictive state is the logical OR operation of several constituent threshold detectors. - My understanding was that every cell in a column has a one-to-one connection with other cells in the same layer, but this statement says each cell has multiple segments, which in turn may connect to different cells. I am confused by this. Please explain what dendrite segments and synapses are, in the context of the HTM School videos.
- Is the SDR union concept used in TM? If so, can you tell me in which part of the algorithm?
- I also looked at the pseudocode of TM in the BAMI book. I don't understand the term LEARNING_THRESHOLD, and I don't understand what "synapses" means here (whether it is a one-to-one connection from one cell to another in the same layer, or something else), because of my confusion in question 1.
Another set of questions I had after studying TM:
- How does the brain decode the predicted neuron signals from TM into, say, a string value (in the case of predicting the next word, for example)?
- What is the effect of having multiple active cells per mini-column at every timestep in TM? Does it make the predictions more robust?
- Please point me towards the biological terminologies to learn to understand the TM pseudocode.
A cell has multiple distal segments, each of which connects to other cells. This allows a single cell to detect multiple patterns (roughly one pattern per segment).
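As a toy sketch of that idea (all names, sets, and the threshold value below are my own for illustration, not from BAMI), the "logical OR of several constituent threshold detectors" can be written as:

```python
# Hypothetical sketch: a cell is predictive if ANY of its distal segments
# has enough active synapses -- the "logical OR of threshold detectors".
ACTIVATION_THRESHOLD = 3  # assumed value for illustration

# Each segment is modeled as the set of presynaptic cells it synapses onto.
cell_segments = [
    {0, 1, 2, 7},      # segment A: detects one learned pattern
    {4, 5, 6, 8, 9},   # segment B: detects a different pattern
]

def is_predictive(segments, active_cells, threshold=ACTIVATION_THRESHOLD):
    # OR over segments: any segment whose overlap with the currently
    # active cells meets the threshold puts the cell in predictive state.
    return any(len(seg & active_cells) >= threshold for seg in segments)

print(is_predictive(cell_segments, {0, 1, 2}))   # True: segment A fires
print(is_predictive(cell_segments, {3, 7, 9}))   # False: no segment reaches threshold
```

Note that each segment acts as an independent detector; the cell never sums activity across segments.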
I don’t know how the brain does that. But in practice, there are some ways to do it. An example would be having a trained decoder with linear regression.
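As a hedged sketch of such a linear decoder (all data below is synthetic, and this is fitted with ordinary least squares rather than any NuPIC-specific API):

```python
# Hypothetical sketch: decode sparse binary cell-activity vectors (stand-ins
# for TM output SDRs) back into scalar values with a linear decoder.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_samples = 64, 200

# Fake training data: sparse binary activity vectors and the scalar each
# one encodes (here generated from a known linear map, for illustration).
true_w = rng.normal(size=n_cells)
X = (rng.random((n_samples, n_cells)) < 0.1).astype(float)  # sparse SDRs
y = X @ true_w                                              # target scalars

# Fit the decoder weights: w = argmin ||Xw - y||^2
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Decode a new activity pattern and compare to the true value.
x_new = (rng.random(n_cells) < 0.1).astype(float)
print(abs(x_new @ w - x_new @ true_w) < 1e-5)
```

In practice the targets would come from labeled training pairs (activity, value) rather than a known linear map, but the fitting step is the same.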
Each cell in a column represents a possible temporal pattern. Multiple active cells means multiple possible patterns. Thus, having only a single cell active per column would make it consider only one pattern.
Based on your questions I’d very highly recommend reading the papers on core HTM structures, like this one.
I think you’ll find it well worth the time; it will enable you to visualize the dynamics of HTM systems, which are totally different from the well-known ANN/MLP models. Then I’d go back to HTM School, and once that’s clear, move on to the pseudocode.
@sheiser1 Thanks for the reply. Reading the paper cleared up most of my confusion, but I still have a query about the meaning of matchingSegments. Is it a segment where the sum of its active synapses is below the threshold, or does matchingSegments include active segments as well? I see it being used both in punishing a predicted column and in bursting a column. According to the former function, I feel it includes active segments, while according to the latter, I think matchingSegments are segments with fewer than activationThreshold active synapses.
matchingSegments is a list of segments that have potential synapses (connections) to active cells, including the active segments.
bestMatchingSegment is different from matchingSegments: it is the segment that has the most potential synapses connected to active cells in a column. It might or might not be an active segment, but that doesn't matter, as it is only used when there are no active segments.
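A toy sketch of that selection (the threshold, segment names, and cell sets here are all invented for illustration):

```python
# Hypothetical sketch: matching segments are those whose potential synapses
# overlap the active cells by at least the learning threshold, and
# bestMatchingSegment is the matching segment with the LARGEST overlap.
MIN_THRESHOLD = 2  # assumed learning threshold for illustration

active_cells = {1, 4, 5, 9}

# segment -> set of presynaptic cells reachable via potential synapses
segments = {
    "s0": {1, 4, 7},       # overlap 2 -> matching
    "s1": {4, 5, 9, 12},   # overlap 3 -> matching, and the best
    "s2": {2, 3},          # overlap 0 -> not matching
}

overlaps = {name: len(cells & active_cells) for name, cells in segments.items()}
matching = [name for name, n in overlaps.items() if n >= MIN_THRESHOLD]
best = max(matching, key=overlaps.get)

print(matching)  # ['s0', 's1']
print(best)      # 's1'
```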
Isn't it better to use activeSegments instead of matchingSegments?
Then it would never "truly forget" a pattern (or at least what it thinks is one) that it will never see again, which can waste resources, resulting in worse performance and higher memory usage. It should completely get rid of the "knowledge" if it was a false positive caused by noise.
Yes, which can be seen at work in this code from nupic/temporal_memory.py:
```python
def activateDendrites(self, learn=True):
    """
    Calculate dendrite segment activity, using the current active cells.

    :param learn: (bool) If true, segment activations will be recorded.
      This information is used during segment cleanup.

    Pseudocode:
      for each distal dendrite segment with activity >= activationThreshold
        mark the segment as active
      for each distal dendrite segment with unconnected activity >= minThreshold
        mark the segment as matching
    """
    (numActiveConnected,
     numActivePotential) = self.connections.computeActivity(
       self.activeCells,
       self.connectedPermanence)

    activeSegments = (
      self.connections.segmentForFlatIdx(i)
      for i in xrange(len(numActiveConnected))
      if numActiveConnected[i] >= self.activationThreshold)

    matchingSegments = (
      self.connections.segmentForFlatIdx(i)
      for i in xrange(len(numActivePotential))
      if numActivePotential[i] >= self.minThreshold)

    self.activeSegments = sorted(activeSegments,
                                 key=self.connections.segmentPositionSortKey)
    self.matchingSegments = sorted(matchingSegments,
                                   key=self.connections.segmentPositionSortKey)
    self.numActiveConnectedSynapsesForSegment = numActiveConnected
    self.numActivePotentialSynapsesForSegment = numActivePotential

    if learn:
      for segment in self.activeSegments:
        self.lastUsedIterationForSegment[segment.flatIdx] = self.iteration
      self.iteration += 1
```
matchingSegments are those linked to at least minThreshold previously active cells (from t-1), whereas activeSegments are those linked to at least activationThreshold. Since the former defaults to 10 and the latter to 13, activeSegments is a subset of matchingSegments. Therefore, when looping over all matchingSegments in punishPredictedColumn, you're decrementing the active synapses of both matching and active segments.
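A toy illustration of why that subset relation holds (the thresholds are the defaults mentioned above; the synapse counts are invented):

```python
# Sketch: connected synapses are a subset of potential synapses, and
# minThreshold < activationThreshold, so every active segment is also
# matching -- punishing all matching segments punishes the active ones too.
MIN_THRESHOLD = 10        # default minThreshold
ACTIVATION_THRESHOLD = 13 # default activationThreshold

# segment -> (active connected synapses, active potential synapses)
counts = {"a": (14, 15), "b": (5, 11), "c": (2, 4)}

active = {s for s, (conn, _) in counts.items() if conn >= ACTIVATION_THRESHOLD}
matching = {s for s, (_, pot) in counts.items() if pot >= MIN_THRESHOLD}

print(active <= matching)  # True: activeSegments is a subset of matchingSegments
print(sorted(matching))    # ['a', 'b'] -- both would be punished
```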