The algorithm's increment/decrement of permanence uses clipping, as indicated by the pseudo-code.
Do I understand that correctly?
for c in activeColumns(t)
    for s in potentialSynapses(c)
        if active(s) then
            s.permanence += synPermActiveInc
            s.permanence = min(1.0, s.permanence)
        else
            s.permanence -= synPermInactiveDec
            s.permanence = min(0.0, s.permanence)
Wouldn’t it be better to do it the “sigmoid” way:
permanence += synPermActiveLearnRate * (1 - permanence)
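For illustration, here is a minimal sketch (plain Python, with made-up parameter names and values, not anything from the HTM spec) of that soft-bounding update. The increment shrinks as permanence approaches 1.0, so no explicit clip is needed at the upper end; the symmetric decrement `permanence -= rate * permanence` for the inactive case is my own assumption:

```python
# Soft-bounded permanence update: step size shrinks near the bounds,
# so the value asymptotically approaches 1.0 (or 0.0) without clipping.
# Parameter name and value are hypothetical placeholders.
SYN_PERM_ACTIVE_LEARN_RATE = 0.1

def update_permanence(permanence, active, rate=SYN_PERM_ACTIVE_LEARN_RATE):
    if active:
        # diminishing-returns increment: large when far from 1.0
        return permanence + rate * (1.0 - permanence)
    # assumed symmetric decrement: large when far from 0.0
    return permanence - rate * permanence

p = 0.5
for _ in range(100):
    p = update_permanence(p, active=True)
print(p)  # approaches but never reaches 1.0
```

Note the value stays strictly inside (0, 1) by construction, which is the appeal over hard clipping.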
But unless one actually compares the two in the context of HTM, you wouldn’t know for sure.
This sigmoid-style rule was found to be an interesting technique for updating synaptic weights for point neurons, with some form of diminishing returns… Nothing really tells us it would be significant for HTM (which does not treat these values as weights per se).
There does seem to be a relationship between synapse size, permanence, and possibly firing efficacy. A “permanence value” seems to be related to all of these. However, since it is a stand-in for all of them, it could be muddying our understanding of what is going on in the biology.
There is also the matter of how effectively neurotransmitters are stored, and synapse size could be related to habituation: how long until the supply is exhausted.
I sure hope that last line is a max operation.
s.permanence = max(0.0, s.permanence);
Otherwise you’re going to be nulling out the permanence the first time it’s not active. There’s also nothing to stop it from going more negative.
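Putting that correction together with the rest of the clipped rule, the update might look like the following. This is just a sketch in Python with invented parameter values, not the reference implementation:

```python
# Clipped permanence update: hard bounds at [0.0, 1.0].
# Parameter values are hypothetical placeholders.
SYN_PERM_ACTIVE_INC = 0.05
SYN_PERM_INACTIVE_DEC = 0.03

def update_permanence(permanence, active):
    if active:
        permanence += SYN_PERM_ACTIVE_INC
        return min(1.0, permanence)  # clip at the upper bound
    permanence -= SYN_PERM_INACTIVE_DEC
    return max(0.0, permanence)      # clip at the lower bound (max, not min)

p = update_permanence(0.02, active=False)
print(p)  # clipped to 0.0 rather than driven negative
```

With `max` on the lower bound, an inactive synapse decays toward 0.0 and stops there instead of going negative.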