Placeholder for development of this idea in our repo is:
Definition: Boosting =
SP parameter boostStrength:
A number greater than or equal to 0, used to
control boosting strength. No boosting is applied if it is set to 0.
The strength of boosting increases as a function of boostStrength.
Boosting encourages columns to have activeDutyCycles similar to their
neighbors’, which will lead to more efficient use of columns. However,
too much boosting may also lead to instability of SP outputs.
This is what I referred to as homeostatic boosting.
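For reference, if I remember the current NuPIC SP code correctly, the boost factor per column is an exponential of the gap between the target local density and the column’s activeDutyCycle; a quick numpy sketch (function and variable names are mine):

```python
import numpy as np

def boost_factors(active_duty_cycles, target_density, boost_strength):
    # boostStrength = 0 gives exp(0) = 1 for every column, i.e. no boosting;
    # columns firing below the target density get factors > 1 and vice versa.
    return np.exp(boost_strength * (target_density - active_duty_cycles))
```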
SP parameter minPctOverlapDutyCycles:
A number between 0 and 1.0, used to set
a floor on how often a column should have at least
stimulusThreshold active inputs. Periodically, each column looks
at the overlap duty cycle of all other columns within its
inhibition radius and sets its own internal minimal acceptable
duty cycle to: minPctDutyCycleBeforeInh * max(other columns’
duty cycles). On each iteration, any column whose overlap duty
cycle falls below this computed value will get all of its
permanence values boosted up by synPermActiveInc. Raising all
permanences in response to a sub-par duty cycle before
inhibition allows a cell to search for new inputs when either
its previously learned inputs are no longer ever active, or when
the vast majority of them have been “hijacked” by other columns.
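To make the mechanics concrete, here is a rough numpy sketch of that floor computation and permanence bump; all the names (bump_weak_columns, neighbors_of, potential, ...) are my placeholders, not the actual NuPIC internals:

```python
import numpy as np

def bump_weak_columns(overlap_duty_cycles, min_pct, permanences,
                      potential, syn_perm_active_inc, neighbors_of):
    """Raise permanences of columns whose overlap duty cycle is sub-par."""
    for c in range(len(overlap_duty_cycles)):
        # Floor = minPctDutyCycleBeforeInh * max(neighbors' duty cycles).
        floor = min_pct * overlap_duty_cycles[neighbors_of(c)].max()
        if overlap_duty_cycles[c] < floor:
            # Boost every potential synapse so the column can search for
            # new inputs (its old ones are dead or hijacked).
            permanences[c, potential[c]] += syn_perm_active_inc
    np.clip(permanences, 0.0, 1.0, out=permanences)
    return permanences
```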
Why? To help columns/cells cover the input space more effectively (underperforming ones are artificially given a chance to find a new niche)
Biological plausibility:
There’s no good biological explanation that our current implementation could correspond to.
Idea:
Add concept of energy for each unit (should it be a cell, a segment, or a synapse?); see the sketch after this list.
when a cell activates, its energy increases (the cell gets new food)
when? at random, to keep a constant cell count, …
where? randomly over the input space, at hotspots (where most cells are), in deserts (the opposite)
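A minimal sketch of what the bookkeeping could look like, with cell-level energy and purely illustrative constants (nothing here is settled):

```python
import numpy as np

FOOD_PER_STEP = 10.0  # arbitrary: total food scattered into the layer per step
UPKEEP = 0.01         # arbitrary: baseline metabolic cost per cell per step

def step_energy(energy, active_cells, rng):
    """One timestep of the proposed energy model (illustrative only)."""
    energy = energy.copy()
    # Active cells get new food.
    energy[active_cells] += 1.0
    # Scatter extra food randomly over the population; variants could target
    # hotspots (where most cells fire) or deserts (the opposite) instead.
    lucky = rng.integers(0, energy.size, size=int(FOOD_PER_STEP))
    np.add.at(energy, lucky, 1.0)
    # Everyone pays upkeep; cells stuck at zero would be candidates for
    # death/recycling, keeping the cell count roughly constant.
    return np.clip(energy - UPKEEP, 0.0, None)
```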
Questions:
what do you think of this model?
how should it be implemented (with regard to speed)?
are there good datasets to test boosting?
I recall a counter “dataset”: take a large SP with a simple pattern (a line); boosting will go crazy and generate artificial spikes in the anomaly scores (I had a few methods to mitigate that)
I have been thinking about a similar approach for controlling the creation and pruning of synapses along a dendrite. If there is a “metabolic sustenance factor” that feeds a set of synapses, this can be manipulated into replacing boosting and forgetting.
“An interesting side note on path-based SDRs is the biologically plausible enforcement of scarcity - there could also be the metabolic equivalent of a growth promoter shared over the length of each path SDR. An “empty” dendrite could grow a lot from whatever activation it receives - a “full” dendrite would starve older synapses to reinforce new synapse learning.
Another tweak is that the distance between two connections could be an influencing factor in learning to enforce spacing of synapse growth.”
If I understand the idea of “Path SDRs”, you want to hint at where a new synapse should grow, and assume it should happen “in the direction of” other active cells. (right?)
To be biological, we could use the hypothesis that a signal going through an axon (or rather a pathway) also generates some electrical field in its neighbourhood (even if shielded by the myelin sheath), so the graph approximates the path of the axon (which can actually be curved). Cells near an often-active pathway get an energy bonus. That would be the “pathway SDR growth”.
Though I’m not completely sure this approach wouldn’t break some assumptions about how SDRs form.
Back to energy, cell death, boosting, and forgetting:
do we have a good experiment/dataset to test its efficiency?
I have the counterexample with “empty input, large SP”
regarding implementation, how do we do this without hurting performance too much?
so we agree the “metabolic sustenance” should be applied at the synaptic level?
boosting is about supporting columns/synapses to “grow into unexplored areas”. How do we go about that?
the “pathway SDR growth” would do the opposite.
I think we could do this model (rough sketch after the list):
neurons have some “fibres” that nurture them (give them food)
generalize local inhibition to competition for food
in popular areas (many active cells) the competition is tough
in deserted areas (no activations) there’s an abundance of food, so a random cell can activate and grow synapses.
…TBD
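As a strawman for the competition step (all names and numbers are placeholders):

```python
import numpy as np

def compete_for_food(overlaps, energy, k_winners, rng):
    """One round of 'local inhibition as competition for food'.

    Columns bid with overlap weighted by stored energy; winners spend
    energy, and food is then dropped preferentially on quiet columns, so
    a starved desert can eventually activate and grow synapses.
    """
    scores = overlaps * energy
    winners = np.argsort(scores)[-k_winners:]
    energy[winners] -= 0.5                       # activating costs energy
    # Feed the deserts: less food where activity (competition) is high.
    energy += 0.1 / (1.0 + overlaps) + 0.01 * rng.random(energy.size)
    return winners, np.clip(energy, 0.0, None)
```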
The path I am describing is the dendrite and not the axon.
The path is fixed and what changes is the addition or modification of synaptic connections. As it starts out, it may pass thousands of potential connections but only a few are randomly initialized. As it learns, synapses along the path are added to the list.
The relationship is as follows:
Cell body holds an index to which path table entry this cell is using and a table of synapse values.
cell body --> index --> path table entry
cell body --> synapse list
Each synapse has a value (same as in regular HTM) and an index to where it is in the selected path table list for this cell body.
synapse = value and index --> position in path table
Each path table entry holds a list of addresses (offsets relative to the cell body). This table may be several hundred or a few thousand entries in length. I envision a few hundred paths to be sufficient to ensure random connections.
For this cell and dendrite, the only synapses that can be evaluated or added have to be at positions along this path. For training, the evaluation is the intersection of all potential positions in this path table and the active cell bodies in this cycle.
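If I’m reading the layout right, it maps onto something like this Python sketch (types and field names are my guesses at your design, not a spec):

```python
from dataclasses import dataclass, field

Offset = tuple[int, int]  # address as an offset relative to the cell body

@dataclass
class PathTable:
    # A few hundred fixed paths, each a list of potential synapse positions.
    paths: list[list[Offset]]

@dataclass
class Synapse:
    value: float      # permanence, same as in regular HTM
    path_index: int   # where it sits in this cell's path table entry

@dataclass
class CellBody:
    path_id: int                       # which path table entry this cell uses
    synapses: list[Synapse] = field(default_factory=list)
    predictive: bool = False           # set from dendrite/active-cell matches

def trainable_positions(cell: CellBody, table: PathTable,
                        active_offsets: set[Offset]) -> list[int]:
    """Positions along this cell's path that coincide with currently active
    cell bodies; only these can be evaluated or grown this cycle."""
    path = table.paths[cell.path_id]
    return [i for i, off in enumerate(path) if off in active_offsets]
```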
The cell body also has other values like “predictive” in regular HTM. This is set by evaluating interactions between the dendrites in the synapse list and active cell bodies.
The metabolic factor could be a fixed value for each iteration. As part of the learning pass, this could be split over the synapses in this dendrite only and used as a co-factor both for modifying existing synapses and for adding new ones. If there were a lot of existing synapses, there would not be much juice left over to grow new ones. If there are only a few, they would add new synapses or boost training strongly. This accomplishes the goal of boosting - the strong only change a little, the weak can grow quickly. The nice thing here is that it is more plausible than boosting from the biological view; it is local to the dendrite.
Every so many cycles you could run through all synapse lists and shrink all values (with a floor function) to promote forgetting.
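A toy version of both the budget split and the forgetting pass, with the constants as made-up assumptions:

```python
import numpy as np

METABOLIC_BUDGET = 1.0  # arbitrary: fixed "juice" per dendrite per iteration
DECAY = 0.001           # arbitrary: shrink rate for the occasional pass

def learn_with_budget(values, active, n_new_candidates):
    """Split a fixed metabolic budget over one dendrite's synapses.

    The per-synapse share is the budget over (existing + candidate)
    synapses: a crowded dendrite has little juice left to grow new ones,
    while a sparse one grows new synapses or boosts training strongly.
    """
    share = METABOLIC_BUDGET / (len(values) + n_new_candidates)
    values = np.clip(values + share * active, 0.0, 1.0)  # co-factor on updates
    new_values = np.full(n_new_candidates, share)        # seed new synapses
    return np.concatenate([values, new_values])

def forgetting_pass(values):
    # Run every so many cycles: shrink everything, floored at zero.
    return np.maximum(values - DECAY, 0.0)
```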
I don’t like killing brain cells… I don’t know if this is a valid argument but it just seems like a bad idea.
For segments on distal dendrites, it might be useful to remove old & rarely used segments if you are running out of free segments, but there is already a mechanism for doing just that.