Generic NuPIC anomaly / usage questions

Discussion continued from Anomaly detection - params and questions / definition of terms.

Finally got a Java / NuPIC app running and have built up a list of questions along the way:

a - When I run ~20K iterations of my dataset (timestamp, param0, p1, p2, result), where only the timestamp and result fields change, the values for ‘SDR.cellsAsColumnIndices(inf.getPredictiveCells(), …)’ and ‘inf.getAnomalyScore()’ never change: the first is consistently integers from 0-18 and the second is consistently 0.0. Is this insufficient data to find patterns?
b - The ‘hot gym’ sample seems to find an equilibrium fairly rapidly. Perhaps my data is not regular enough over that span of time?
c - How would I expand the ‘memory’ of the HTM setup? i.e., what governs how far back it looks to build an anomaly ‘model’?
d - How does NuPIC deal with timestamped data that’s out of sequence?
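On question a: as I understand it (a sketch of the standard HTM raw anomaly score, not the HTM.Java source), the score is the fraction of currently active columns that were not predicted at the previous step, so a constant 0.0 means every active column is always being predicted:

```java
import java.util.Set;

// Sketch of the standard HTM raw anomaly score: the fraction of active
// columns that were NOT among the columns predicted at the previous step.
// A score stuck at 0.0 means the TM is predicting every active column,
// i.e. the sequence has become fully "expected".
public class AnomalyScore {
    static double compute(Set<Integer> activeColumns, Set<Integer> predictedColumns) {
        if (activeColumns.isEmpty()) return 0.0;
        int unpredicted = 0;
        for (int c : activeColumns) {
            if (!predictedColumns.contains(c)) unpredicted++;
        }
        return (double) unpredicted / activeColumns.size();
    }
}
```

With only timestamp and result changing, it is plausible the model converges to perfect prediction quickly, which would explain the flat 0.0.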

There are ways you can configure the algorithm parameters that will always produce these results. What are the params you’re using?

How often is data coming in? Is it at a regular interval? What date semantics are you encoding?

I think these are the TM parameters that might affect this:

    # Parameters of the TM region.

    # Active synapses get their permanence counts incremented by permanenceInc.
    # All other synapses get their permanence counts decremented by permanenceDec.
    permanenceInc: 0.1
    permanenceDec: 0.1
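The update rule those two parameters describe can be illustrated with a toy sketch (this is not the actual HTM.Java TM code, just the rule from the comment above):

```java
// Toy illustration of the permanence update described above: synapses on
// active presynaptic cells are incremented by permanenceInc, all others
// are decremented by permanenceDec, with the result clamped to [0, 1].
public class PermanenceUpdate {
    static final double PERMANENCE_INC = 0.1;
    static final double PERMANENCE_DEC = 0.1;

    // Returns the new permanence value, clamped to [0, 1].
    static double update(double permanence, boolean presynapticActive) {
        double p = presynapticActive ? permanence + PERMANENCE_INC
                                     : permanence - PERMANENCE_DEC;
        return Math.max(0.0, Math.min(1.0, p));
    }
}
```

With inc and dec both at 0.1, forgetting is as fast as learning, which affects how long a pattern has to repeat before its synapses stay connected.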

Not well. It assumes data is chronological.
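Since the algorithm assumes chronological input, one workaround is to re-sequence slightly out-of-order arrivals in a small time-ordered buffer before feeding them to the network. A minimal sketch (hypothetical `Record` type, not an HTM.Java API):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.PriorityQueue;

// Sketch: hold incoming records in a small min-heap keyed by timestamp,
// so records that arrive slightly out of order are emitted chronologically.
// Records delayed by more than the buffer depth will still be emitted late.
public class Reorderer {
    record Record(long timestamp, double value) {}

    private final PriorityQueue<Record> buffer =
            new PriorityQueue<>(Comparator.comparingLong(Record::timestamp));
    private final int maxBuffered;

    Reorderer(int maxBuffered) { this.maxBuffered = maxBuffered; }

    // Accepts one record; returns any records now safe to emit, oldest first.
    List<Record> offer(Record r) {
        buffer.add(r);
        List<Record> ready = new ArrayList<>();
        while (buffer.size() > maxBuffered) {
            ready.add(buffer.poll());
        }
        return ready;
    }
}
```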


Parameters parameters = Parameters.getAllDefaultParameters();

//SpatialPooler specific
parameters.set(Parameters.KEY.POTENTIAL_RADIUS, 12);
parameters.set(Parameters.KEY.POTENTIAL_PCT, 0.5);
parameters.set(Parameters.KEY.GLOBAL_INHIBITION, true);
parameters.set(Parameters.KEY.NUM_ACTIVE_COLUMNS_PER_INH_AREA, 40.0);
parameters.set(Parameters.KEY.SYN_PERM_INACTIVE_DEC, 0.085);
parameters.set(Parameters.KEY.SYN_PERM_ACTIVE_INC, 0.05);
parameters.set(Parameters.KEY.SEED, 1960);

//Temporal Memory specific
parameters.set(Parameters.KEY.MIN_THRESHOLD, 10);
parameters.set(Parameters.KEY.MAX_NEW_SYNAPSE_COUNT, 20);
parameters.set(Parameters.KEY.PERMANENCE_INCREMENT, 0.1);
parameters.set(Parameters.KEY.PERMANENCE_DECREMENT, 0.1);
parameters.set(Parameters.KEY.ACTIVATION_THRESHOLD, 12);

Data comes in sporadically (from a Kafka topic). I’m working on smoothing it out to 1Hz.
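One way to smooth sporadic arrivals into a fixed 1Hz feed is to bucket readings by epoch second and average within each bucket. A sketch (not tied to any Kafka API):

```java
import java.util.TreeMap;

// Sketch: aggregate sporadic readings into 1-second buckets by averaging,
// producing at most one value per second for a fixed-rate feed to the model.
public class OneHzResampler {
    // epoch second -> {sum, count}
    private final TreeMap<Long, double[]> buckets = new TreeMap<>();

    void add(long epochMillis, double value) {
        long sec = epochMillis / 1000L;
        double[] b = buckets.computeIfAbsent(sec, k -> new double[2]);
        b[0] += value;
        b[1] += 1;
    }

    // Mean value for the given second, or NaN if no samples landed there.
    // Gaps (NaN seconds) would still need a fill strategy before feeding HTM.
    double valueAt(long sec) {
        double[] b = buckets.get(sec);
        return (b == null) ? Double.NaN : b[0] / b[1];
    }
}
```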

How are you encoding date semantics? (What are your date encoder settings?)

 Map<String, Map<String, Object>> fieldEncodings = setupMap(
            0, // n
            0, // w
            0, 0, 0, 0, null, null, null,
            "stamp", "datetime", "DateEncoder");

fieldEncodings.get("stamp").put(Parameters.KEY.DATEFIELD_PATTERN.getFieldName(), "YYYY-MM-dd HH:mm:ss");
fieldEncodings.get("stamp").put(Parameters.KEY.DATEFIELD_DOFW.getFieldName(), new Tuple(1, 1.0)); // Day of week
fieldEncodings.get("stamp").put(Parameters.KEY.DATEFIELD_TOFD.getFieldName(), new Tuple(21, 9.5)); // Time of day

I think I remember a bug in NuPIC where potentialRadius was not set properly. I’m not sure if this is an issue in HTM.Java. But if there is no global inhibition it should be set to total number of columns in the SP. (right?)

Also @phil_d_cat I assume you are using 2048 columns and 32 cells per column?

Anything not set there is all default.

That was the cause of my summer of tears! :frowning: Yes, that was a problem in HTM.Java but it is automatically handled now… (tg)

Ok so even though he’s setting it, it is overridden because global inhibition is on, right?

No. Good catch. If it’s set, I don’t override it, because the user knows better than I what they want. It defaults to “-1”, which, if not altered, gets set internally by the Network to the # of columns. (The documentation is very LOUD about that too.)

So in short: he shouldn’t set it if using global inhibition. If he is, then it’s a problem for sure - good catch!

@phil_d_cat Did you get that? Remove this parameter setting above. (Thanks for confirmation, @cogmission).
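In other words, with GLOBAL_INHIBITION on, the SpatialPooler block from earlier would drop the POTENTIAL_RADIUS line entirely, letting the default of -1 stand so the Network replaces it with the total column count (a config fragment, assuming the same HTM.Java setup as above):

```java
//SpatialPooler specific -- POTENTIAL_RADIUS intentionally NOT set;
//its default of -1 is replaced internally with the total column count.
parameters.set(Parameters.KEY.POTENTIAL_PCT, 0.5);
parameters.set(Parameters.KEY.GLOBAL_INHIBITION, true);
parameters.set(Parameters.KEY.NUM_ACTIVE_COLUMNS_PER_INH_AREA, 40.0);
parameters.set(Parameters.KEY.SYN_PERM_INACTIVE_DEC, 0.085);
parameters.set(Parameters.KEY.SYN_PERM_ACTIVE_INC, 0.05);
parameters.set(Parameters.KEY.SEED, 1960);
```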


Ok. I’m spending my cycles of late trying to get my data massaged into some form that HTM will be happier with.