Measuring the performance of an anomaly detection algorithm

Hi,
I am developing an algorithm for an anomaly detection problem that also uses the HTM engine.
I am having trouble estimating the accuracy of the algorithm.
The algorithm works well, but it detects each anomaly with a phase offset, and the offset changes over time.
For example:

y_actual = [0,0,0,0,1,0,1,0…,1,0,0,0,…,0,1,0,0,0]

y_predict = [0,0,0,0,0,0,0,1,…,0,0,0,1,…0,0,0,0,1]

The algorithm recognizes/predicts an anomaly in the neighborhood of the actual anomaly, but not at the exact position (the phase differs). For my purposes that is fine.

When I try to compare the results position by position, I get poor results with the standard estimation method (TPR = 72%, TNR = 98%). When I inspect the results manually, the algorithm looks more accurate than that (when there is no anomaly, the algorithm returns TNR = 100%, TPR = 100%).
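
To make the problem concrete, here is a minimal sketch of that position-by-position scoring (the arrays below are illustrative placeholders, not my real data): a detection that is even one step late is counted as both a miss and a false alarm.

def positionwise_rates(y_actual, y_predict):
    # Count hits and misses at each position independently.
    tp = sum(1 for a, p in zip(y_actual, y_predict) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(y_actual, y_predict) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(y_actual, y_predict) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(y_actual, y_predict) if a == 1 and p == 0)
    tpr = tp / (tp + fn) if (tp + fn) else 1.0
    tnr = tn / (tn + fp) if (tn + fp) else 1.0
    return tpr, tnr

# The single detection is only one step late, yet TPR drops to 0.
y_actual  = [0, 0, 0, 0, 0, 0, 1, 0, 0, 0]
y_predict = [0, 0, 0, 0, 0, 0, 0, 1, 0, 0]
print(positionwise_rates(y_actual, y_predict))  # -> (0.0, 0.888...)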
Can someone recommend a methodology for estimating the accuracy in this situation?
Thanks,
MAK

HTM predicts the future, and the anomaly is determined by how bad the prediction is (how different the current pattern is from what was predicted), so there will be some offset in when the anomaly is detected.
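
For reference, the usual raw anomaly score is the fraction of currently active columns that the temporal memory did not predict at the previous step, so the score only rises once the input actually diverges from the learned sequence. A minimal sketch (the column sets below are illustrative):

def raw_anomaly(active_columns, predicted_columns):
    # Fraction of currently active columns that were NOT among the columns
    # predicted by the temporal memory at the previous time step.
    active = set(active_columns)
    if not active:
        return 0.0
    return len(active - set(predicted_columns)) / len(active)

print(raw_anomaly({1, 4, 9}, {1, 4, 9}))  # perfectly predicted input -> 0.0
print(raw_anomaly({1, 4, 9}, {2, 5, 7}))  # completely unexpected input -> 1.0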

If you feel like it (it requires some more advanced programming), you can try encoding your data with Grid Cell encoders. They are more sensitive than the Scalar/RDSE encoders.
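
Very roughly, the idea is that each grid module wraps the input value with a different period, so the combination of per-module phases changes even for small shifts of the input. The toy sketch below only illustrates that idea; it is not any library's actual Grid Cell encoder API, and the periods and cell counts are made up:

def toy_grid_encode(value, periods=(7, 11, 13), cells_per_module=16):
    # Each "module" quantizes the phase of the value within its own period.
    active = []
    offset = 0
    for period in periods:
        phase = (value % period) / period      # position within this module's cycle
        cell = int(phase * cells_per_module)   # quantize the phase to one cell
        active.append(offset + cell)
        offset += cells_per_module
    return active

print(toy_grid_encode(42.0))
print(toy_grid_encode(42.5))  # a small shift -> a different combination of active cells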

Hi,
First, can you share a link for the Grid Cell encoder?
The anomaly detection result of HTM with the scalar encoder also satisfies me; I can accept that the algorithm has an offset in its detection time.
My question is more a general one and less an HTM-core one: how do I measure the accuracy of the HTM algorithm even when it predicts with a changing offset in time?
Thanks,
M

Be sure to read the anomaly detection docs. You might also watch this old video.
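
On the measurement question itself, one common approach that is not HTM-specific is to score detections against a tolerance window around each labelled anomaly rather than the exact position, so a prediction that lands within a few steps of the true anomaly still counts as a hit (the Numenta Anomaly Benchmark scores detections against windows in a similar spirit). A minimal sketch, with the window half-width as an assumed parameter:

def windowed_rates(y_actual, y_predict, tolerance=3):
    # A predicted anomaly is a true positive if an actual anomaly lies within
    # +/- tolerance steps of it; an actual anomaly counts as detected if some
    # prediction lies within +/- tolerance steps of it.
    actual_idx = [i for i, v in enumerate(y_actual) if v == 1]
    predict_idx = [i for i, v in enumerate(y_predict) if v == 1]
    detected = sum(1 for a in actual_idx
                   if any(abs(a - p) <= tolerance for p in predict_idx))
    true_preds = sum(1 for p in predict_idx
                     if any(abs(p - a) <= tolerance for a in actual_idx))
    recall = detected / len(actual_idx) if actual_idx else 1.0
    precision = true_preds / len(predict_idx) if predict_idx else 1.0
    return precision, recall

# The detection at index 7 is one step late but falls inside the window
# around the actual anomaly at index 6, so it now counts as a hit.
y_actual  = [0, 0, 0, 0, 0, 0, 1, 0, 0, 0]
y_predict = [0, 0, 0, 0, 0, 0, 0, 1, 0, 0]
print(windowed_rates(y_actual, y_predict, tolerance=2))  # -> (1.0, 1.0)

With event-based matching like this, it is usually more natural to report precision and recall per anomaly event than a per-position TNR.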