Anomaly detection stops after a number of outputs

Hi,
I’m running anomaly detection with HTM.Java. I noticed that after it has emitted a number of scores (around 1,000) it stops and doesn’t compute any more.
But my dataset contains 180,000 records… No errors are shown, HTM simply stops.
Is there an implicit limit, and if so, can I remove it in the network setup?

Hi @Andrea_Giordano,

Do you get 180,000 lines of output? Is it stopping because your input stopped? Can you give me more details? And maybe example code so I can see if there is in fact a problem?

My input doesn’t stop and I’m sure the data is being pushed into the system. Moreover, no errors are shown at all.

public Network createNetwork(Object key) {
        Parameters p = getParameters();
        p = p.union(getEncoderParams());
        
        p.set(Parameters.KEY.INFERRED_FIELDS, getInferredFieldsMap("read", CLAClassifier.class));
        
        return Network.create("Network API Demo", p)
                .add(Network.createRegion("Region 1")
                        .add(Network.createLayer("Layer 2/3", p)
                                .alterParameter(Parameters.KEY.AUTO_CLASSIFY, Boolean.TRUE)
                                .add(Anomaly.create())
                                .add(new TemporalMemory())
                                .add(new SpatialPooler())
                                .add(MultiEncoder.builder().name("").build())));
    }

public static Map<String, Map<String, Object>> getSensorFieldEncodingMap() {
        Map<String, Map<String, Object>> fieldEncodings = setupMap(
                null,
                300,
                21,
                0, 0, 0, 0.01, Boolean.FALSE, Boolean.FALSE, null,
                "read", "float", "RandomDistributedScalarEncoder");
      
        return fieldEncodings;
    }

public static Parameters getEncoderParams() {
        Map<String, Map<String, Object>> fieldEncodings = getSensorFieldEncodingMap();
        Parameters parameters = Parameters.getAllDefaultParameters();
        
        parameters.set(Parameters.KEY.GLOBAL_INHIBITION, true);
        parameters.set(Parameters.KEY.COLUMN_DIMENSIONS, new int[] { 2048 });
        parameters.set(Parameters.KEY.CELLS_PER_COLUMN, 32);
        parameters.set(Parameters.KEY.NUM_ACTIVE_COLUMNS_PER_INH_AREA, 40.0);
        parameters.set(Parameters.KEY.POTENTIAL_PCT, 0.85);
        parameters.set(Parameters.KEY.SYN_PERM_CONNECTED, 0.1);
        parameters.set(Parameters.KEY.SYN_PERM_ACTIVE_INC, 0.04);
        parameters.set(Parameters.KEY.SYN_PERM_INACTIVE_DEC, 0.0005);
        parameters.set(Parameters.KEY.MAX_BOOST, 1.0);
        parameters.set(Parameters.KEY.MAX_NEW_SYNAPSE_COUNT, 20);
        parameters.set(Parameters.KEY.INITIAL_PERMANENCE, 0.21);
        parameters.set(Parameters.KEY.PERMANENCE_INCREMENT, 0.1);
        parameters.set(Parameters.KEY.PERMANENCE_DECREMENT, 0.1);
        parameters.set(Parameters.KEY.MIN_THRESHOLD, 12);
        parameters.set(Parameters.KEY.ACTIVATION_THRESHOLD, 16);
        parameters.set(Parameters.KEY.FIELD_ENCODING_MAP, fieldEncodings);
        
        return parameters;
    }

To call the computation:

        Map<String, Object> multiInput = new HashMap<>();
        multiInput.put("timestamp", new DateTime(record.getTimestamp()));
        multiInput.put("xValue", val);
        Inference inference = network.computeImmediate(multiInput);
        System.out.println(inference.getAnomalyScore());
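
A minimal sketch of how that call is driven over the whole dataset (the Record type and its getters here are placeholders for my actual data source, not the real ingestion code):

        // Hypothetical loop over the 180,000 records; Record, getTimestamp()
        // and getValue() stand in for whatever the real data source provides.
        for (Record record : records) {
            Map<String, Object> multiInput = new HashMap<>();
            multiInput.put("timestamp", new DateTime(record.getTimestamp()));
            multiInput.put("xValue", record.getValue());
            Inference inference = network.computeImmediate(multiInput);
            System.out.println(inference.getAnomalyScore());
        }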

Can you tell me what input line number the output stops at? Does it always stop at the same line number? What is your Java heap size set to? Can you try increasing/decreasing the heap size - does it have an effect on the number of outputs you get?

Example:
The heap size can be configured with the -Xmx VM option, which sets the maximum Java heap size:

-Xmx2000M (2 gigabytes)

-Xmx1000M

-Xmx100M

-Xmx10M
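
One way to double-check which maximum heap the JVM actually picked up is to print it from inside the application (a minimal sketch, not part of the demo code):

        // Prints the maximum heap the JVM will attempt to use, in megabytes,
        // so you can confirm the -Xmx setting is really being applied.
        System.out.println("Max heap: "
                + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");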

HTM stops after line 831. Always the same line. I tried setting -Xmx4000M but nothing changed.

Just to try, I changed n from 300 to 1000 and w from 21 to 71, and I reached 3071 lines.
Anyway, that is not enough for my 180,000 records…
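
For reference, a sketch of the encoder map with those values plugged into the same setupMap call (every other argument left as in getSensorFieldEncodingMap):

        Map<String, Map<String, Object>> fieldEncodings = setupMap(
                null,
                1000,   // n, previously 300
                71,     // w, previously 21
                0, 0, 0, 0.01, Boolean.FALSE, Boolean.FALSE, null,
                "read", "float", "RandomDistributedScalarEncoder");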

Hi @Andrea_Giordano,

Can you send me your code, and a copy of your input? Up to 10,000 lines should do it, because you say you don’t get any further than 3071. Also, send me your parameters.

Send it in a zip file to cognitionmission@gmail.com

Cheers,
David

My issue concerned the htm.java port for Apache Flink. Yesterday Eron Wright updated the library, and now the problem seems to have vanished. It was probably a bug. Thank you anyway.

Thanks for keeping me updated. I’m happy a solution for your problem was found! :slight_smile:

Cheers,
David