Once NuPIC has seen an anomaly, why doesn't it show it again as anomalous behaviour? Shouldn't it flag it every time it sees the same anomaly?

Hi everyone,

I am new to NuPIC and I am using it to detect anomalies in ECG data. My main target is to detect arrhythmia. I am currently trying NuPIC's examples: One Hot Gym prediction/anomaly and sine prediction.
I was able to successfully implement them on my machine.

I was wondering: if I feed in ECG data using the same principles as in the One Hot Gym example, it will show an anomaly the first time it sees one. After a few time intervals, it will learn it as normal ECG behaviour and will not flag it the next time.

Suppose I have to detect a distorted QRS complex. Given the current learning algorithm, it will detect a few instances of the anomaly, after which things will look normal to the system.

Is my understanding correct? Please clear up this doubt.

Thanks.


Yes, it will learn that recurring anomalies are no longer anomalies, just like your brain does. If you don't want this to happen, you can disable learning once your model has learned the patterns you want it to recognize. Then, as it sees new anomalies occurring over and over, it will always flag them as anomalous.
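To make the effect of the learn flag concrete, here is a minimal, NuPIC-free sketch: a toy frequency-based detector (not HTM itself, just an illustration of the principle) where a pattern counts as anomalous until it has been seen often enough *while learning is enabled*. The class, threshold, and pattern names are all invented for this example.

```python
from collections import Counter

class ToyDetector:
    """Toy stand-in for an anomaly detector: a pattern is anomalous
    until it has been seen `threshold` times with learning enabled."""

    def __init__(self, threshold=3):
        self.counts = Counter()
        self.threshold = threshold

    def step(self, pattern, learn=True):
        is_anomaly = self.counts[pattern] < self.threshold
        if learn:
            self.counts[pattern] += 1
        return is_anomaly

# Train on the "normal" pattern with learning on.
detector = ToyDetector()
for _ in range(10):
    detector.step("normal-beat")

# With learning on, a repeated anomaly stops being flagged after a few hits.
with_learning = [detector.step("distorted-QRS", learn=True) for _ in range(5)]
print(with_learning)    # [True, True, True, False, False]

# With learning off after training, the same anomaly is flagged every time.
frozen = ToyDetector()
for _ in range(10):
    frozen.step("normal-beat")
without_learning = [frozen.step("distorted-QRS", learn=False) for _ in range(5)]
print(without_learning)  # [True, True, True, True, True]
```

The second run mirrors the advice above: freeze the model once it has learned the behaviour you consider normal, and anything outside that stays anomalous forever.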


You make a good point. It should be possible to detect that some event is anomalous in the sense that it is rare or out of context, even though that event may be recognised in a global sense.

Events that are rare enough may continue to be flagged as anomalies, depending on the distal punishment rate (for unfulfilled predictions).

Turning off learning has a cost in not adapting to gradual changes. On the other hand, you might want to detect accumulated gradual changes.


@rhyolight Sir, thanks so much for the reply. This makes sense in theory. I recently read about this in the HTM white paper, under Chapter 1, heading “Learning”:

After initial training, an HTM can continue to learn or, alternatively, learning can be disabled after the training phase. Another option is to turn off learning only at the lowest levels of the hierarchy but continue to learn at the higher levels.

However, it will take some time for me to implement this practically. I would really appreciate it if you could give me a little idea of how I should go about it.

Thank you.

@floybix Thanks for the reply. I am specifically interested in detecting arrhythmia in ECG data. In such a scenario, rare encounters with anomalies can be neglected, as it is normal to skip a beat or two. The problem occurs when these encounters are not few but many.
You make one very important point there, Sir: turning off learning comes at the cost of not adapting to gradual changes.
Will you please elaborate on that last statement of yours? Consider ECG data as the input.

Thank you.

@ioarun Anomalies occur at the edge of known and unknown knowledge/patterns. While that is the essence of intelligence, what is really of practical use is classification. Here you are looking for the ability to classify a pattern as something (warning/danger/success/cat/dog) and to be told that you encountered that pattern. So erratic heart conditions can get learned and no longer show as anomalies, but they can still trigger a classification and a corresponding alert.
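A hedged sketch of that classification idea in plain Python (not NuPIC; the templates, feature values, and alert labels below are invented for illustration): instead of relying on surprise, label known patterns and raise an alert on the label, so a familiar-but-dangerous pattern still triggers a warning.

```python
# Toy 1-nearest-neighbour classifier over labelled beat templates.
# Templates map a made-up 3-number feature vector to a label.
TEMPLATES = {
    (1.0, 0.2, 0.1): "normal",
    (0.4, 0.9, 0.6): "distorted-QRS",   # known *and* dangerous
}
ALERT_LABELS = {"distorted-QRS"}

def classify(beat):
    # Squared Euclidean distance to each stored template.
    def dist(template):
        return sum((a - b) ** 2 for a, b in zip(beat, template))
    return TEMPLATES[min(TEMPLATES, key=dist)]

def check(beat):
    """Return (label, alert?) for one beat's feature vector."""
    label = classify(beat)
    return label, label in ALERT_LABELS

print(check((0.98, 0.22, 0.12)))  # ('normal', False)
print(check((0.42, 0.88, 0.58)))  # ('distorted-QRS', True)
```

The key design point is that the alert no longer depends on the pattern being rare: the distorted beat fires an alert on every occurrence, however often the system has seen it.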

Regards
Chandan


I ran into the same question. Can you show me how to turn off learning? I couldn't find where learning is handled in the code.

If you are using an OPF model, use disableLearning().

If using the algorithms directly, the spatial pooler's and the temporal memory's compute() calls each take a learn parameter.
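A rough sketch of both routes, assuming NuPIC (Python 2) is installed; model creation is elided (see the One Hot Gym example for model parameters), and the variable names here are placeholders:

```python
# OPF route: train with learning on, then freeze the model.
model.disableLearning()   # from here on, known-but-unwanted patterns
                          # keep producing high anomaly scores

# Algorithm route: pass learn=False to both compute() calls.
sp.compute(encoding, learn=False, activeArray=active_columns)
tm.compute(active_column_indices, learn=False)
```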


Thanks for your reply! A great help for me.