Introduce new spatial pooler boosting rule (breaking change)


I have implemented a new boosting rule in the HTM spatial pooler (SP). Boosting in SP encourages efficient use of columns by applying a multiplicative factor to the inputs of each column. We call this the boost factor and update it according to the activation history of individual columns (activeDutyCycle).

In the old boosting rule, boost factors are always >= 1.0. We increase boost factors for “weak” columns (controlled by the parameter minPctActiveDutyCycles), and the strength of boosting is controlled by a parameter called maxBoost. Through a series of experiments, we found that the discrete nature of the old boosting rule often introduces instability.
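For reference, the old rule can be sketched roughly like this (a simplified NumPy sketch, not the exact nupic implementation; the per-column minimum duty cycles are derived from minPctActiveDutyCycles):

```python
import numpy as np

def update_boost_factors_old(active_duty_cycles, min_duty_cycles, max_boost):
    """Old rule (sketch): boost factors are >= 1.0. A column whose
    activeDutyCycle falls below its minimum duty cycle gets a factor
    that grows linearly up to maxBoost; all other columns get 1.0."""
    boost = ((1.0 - max_boost) / min_duty_cycles) * active_duty_cycles + max_boost
    boost[active_duty_cycles > min_duty_cycles] = 1.0
    return boost
```

The hard cutoff at minDutyCycles is the discrete behavior referred to above: a small change in a column's duty cycle can switch its boost factor abruptly.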

In the new boosting rule, the boosting function is a continuous exponential function, with its slope controlled by a parameter called boostStrength. We not only increase excitability for weak columns, but also decrease excitability for strong columns.
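A minimal sketch of the new rule, based on the description above (an exponential of the deviation of each column's activeDutyCycle from the target activation density; function and argument names are illustrative):

```python
import numpy as np

def update_boost_factors_new(active_duty_cycles, target_density, boost_strength):
    """New rule (sketch): a continuous exponential function of how far each
    column's activeDutyCycle deviates from the target activation density.
    Columns below the target get factors > 1 (more excitable); columns
    above it get factors < 1 (less excitable)."""
    return np.exp(-boost_strength * (active_duty_cycles - target_density))
```

Note that with boostStrength = 0 the factor is 1.0 for every column, so setting the parameter to zero disables boosting entirely.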

The new boosting rule is implemented in nupic.core and is about to be merged into nupic.

We propose to change the parameter name ‘maxBoost’ to ‘boostStrength’ and eliminate the parameter minPctActiveDutyCycles. The related PRs are here for NuPIC, and here for nupic.core.

Please note that this is a breaking change: old models will no longer be supported. We will wait 72 hours for community response before merging.

A complete description of the spatial pooler learning algorithm can be found in our HTM SP paper.


Nice, Yuwei. It seems intuitive that this would be better.

Do you have performance evaluations to demonstrate that this improves column distribution?


Yes, I have an entropy metric that measures the column distribution. There is a clear improvement with the new boosting rule. Below is a figure that compares the distribution of activeDutyCycles with/without boosting on the NYC taxi dataset. Without boosting, ~50% of the columns were not used at all. The old boosting rule is very close to “no boosting” in this case.
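One way to compute such a metric (a sketch of my assumption of how it works, not necessarily the exact code used): treat the normalized activeDutyCycles as a probability distribution over columns and compute its Shannon entropy.

```python
import numpy as np

def duty_cycle_entropy(active_duty_cycles):
    """Sketch of an entropy metric over column usage: normalize the
    activeDutyCycles into a probability distribution and compute its
    Shannon entropy in bits. Higher entropy means columns are used more
    evenly; the maximum, log2(numColumns), is reached when every column
    is used equally often. Unused columns contribute nothing."""
    p = np.asarray(active_duty_cycles, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))
```

When ~50% of columns are never active, roughly half the probability mass is concentrated on the remaining columns and the entropy drops well below its maximum, which is what the metric picks up.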

The new boosting also improves prediction accuracy on several standard datasets, including hotgym and nyc-taxi. There is a separate PR that updates the hotgym example with the new boosting rule.


Thanks for sharing this. Do you think this might be extended to the bumping mechanism (increasing the synapse permanences of less-used columns)? It seems to use a similar approach (minPctOverlapDutyCycles) to calculating the boost factors, so maybe bumping would also benefit from a similar change.


Interesting question. I think these are two separate mechanisms (despite the similarity of the parameter names). The boosting mechanism is related to homeostatic excitability control (I recommend Matt’s video on this topic).

The bumping mechanism is related to spontaneous synaptogenesis. It is possible a more continuous bumping mechanism could also be beneficial for the spatial pooler (e.g., by introducing a small amount of random synaptogenesis). That could be an interesting research direction.