Scaling down NuPIC model size

So I’m looking to add a ‘model-scale’ parameter to my function that generates NuPIC models: a value of 1.0 would give a full-sized model, 0.5 a half-sized model, and so on.

I want to make sure I adjust the SP and TM params in a valid way to achieve this, so here is the scaling logic I have so far:

SP

  • ‘column_count’: 2048 --> 1024

TM

  • ‘column_count’: 2048 --> 1024

  • ‘input_width’: 2048 --> 1024

  • ‘newSynapseCount’: 20 --> 10

  • ‘maxSynapsesPerSegment’: 32 --> 16

  • ‘maxSegmentsPerCell’: 128 --> 64

  • ‘minThreshold’: 9 --> 5

  • ‘activationThreshold’: 12 --> 6

Is there anything missing from this? Here’s a minimal sketch of how I’d apply it:
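This assumes the dict returned by `getScalarMetricWithTimeOfDayAnomalyParams`, where the SP/TM settings live under `modelConfig.modelParams` with camelCase keys — adjust the key names if your layout differs:

```python
def scale_model_params(params, model_scale=1.0):
    """Scale SP/TM sizes in place; model_scale=1.0 means no change."""
    model_params = params["modelConfig"]["modelParams"]
    sp = model_params["spParams"]
    tm = model_params["tmParams"]

    def scaled(value):
        # Round, but never let a count or threshold collapse to 0.
        return max(1, int(round(value * model_scale)))

    sp["columnCount"] = scaled(sp["columnCount"])
    # sp["inputWidth"] (the encoder output width) is left alone here; it
    # only changes if the encoders are shrunk as well.

    # The TM consumes the SP's output, so these must match the SP.
    tm["columnCount"] = sp["columnCount"]
    tm["inputWidth"] = sp["columnCount"]

    for key in ("newSynapseCount", "maxSynapsesPerSegment",
                "maxSegmentsPerCell", "minThreshold",
                "activationThreshold"):
        tm[key] = scaled(tm[key])

    # Sanity check: the TM shouldn't require more active synapses on a
    # segment than it grows per learning step.
    assert tm["minThreshold"] <= tm["activationThreshold"] <= tm["newSynapseCount"]
    return params
```

With `model_scale=0.5` this reproduces the halvings listed above (`minThreshold` rounds 4.5 up to 5 under Python 2’s rounding).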

I’m using the ‘getScalarMetricWithTimeOfDayAnomalyParams’ function to generate the params, which uses RDSE encoders. Should the size of these encoders change as well? I assume the smaller SP would map its fewer columns onto the same-size encoder output, resulting in bigger receptive fields for each column. Is that assumption correct?
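For reference, here is where the encoder config lives in the generated params, and how the encoder could be shrunk too if that turns out to be the right move. The `"c1"` key is the template’s default name for the scalar field, and `n=400` / `w=21` are the RDSE defaults, so treat those specifics as assumptions:

```python
from nupic.frameworks.opf.common_models.cluster_params import (
    getScalarMetricWithTimeOfDayAnomalyParams)

params = getScalarMetricWithTimeOfDayAnomalyParams(
    metricData=[0],     # dummy data; resolution is derived from minVal/maxVal
    minVal=0.0,
    maxVal=100.0)

encoders = params["modelConfig"]["modelParams"]["sensorParams"]["encoders"]
rdse = encoders["c1"]   # the RandomDistributedScalarEncoder config dict

model_scale = 0.5
# One option: scale n (total output bits) while keeping w (active bits)
# fixed, so each encoding still has enough on-bits for the SP. Note n must
# stay comfortably larger than w for the RDSE to build distinct encodings,
# and spParams["inputWidth"] should be re-checked against the total encoded
# width when the model is built.
rdse["n"] = int(rdse.get("n", 400) * model_scale)
```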

Thanks as always :smile:


Could I get a quick thumbs up/down or a redirect, @rhyolight?

The reason for the scaling is to reduce the memory use of each model, since I have n of them running at once. However, I am skeptical that this scaling would reliably help: I suspect a model’s memory footprint depends more on the complexity and noise of the data than on model scale. That’s my hunch given the nature of HTM state: sparse, computationally simple (no heavy math), and accumulated from zero as the model learns.
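To test that hunch, I’m thinking of using each model’s serialized size on disk as a rough proxy for its memory footprint. A sketch, assuming OPF’s `Model.save()` works with your TM implementation (it tracks accumulated structure like segments and synapses, not exact resident memory):

```python
import os
import shutil
import tempfile

from nupic.frameworks.opf.model_factory import ModelFactory  # "modelfactory" in older nupic

def model_size_bytes(params):
    """Build a model, save it to a temp dir, and sum the file sizes."""
    model = ModelFactory.create(params["modelConfig"])
    model.enableInference(params["inferenceArgs"])
    # ... run your data through the model here, so learned structure counts ...
    tmp = tempfile.mkdtemp()
    try:
        save_dir = os.path.join(tmp, "model")  # save() wants its own directory
        model.save(save_dir)
        return sum(os.path.getsize(os.path.join(root, name))
                   for root, _, names in os.walk(save_dir)
                   for name in names)
    finally:
        shutil.rmtree(tmp)
```

If the hunch is right, this number should grow with the data’s complexity more than it shrinks with model scale.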

It seems OK to me, Sam, but I haven’t experimented with this type of thing at all yet, so I also have to add :man_shrugging:.
