Shifting in the predicted signal

I have a question about prediction in HTM. The HTM algorithm is supposed to be able to make predictions multiple steps ahead in time. For time-series data, say a sine wave, the predicted value should lead the input by the number of prediction steps. For instance, if we are making a 2nd-order prediction, the predicted signal should be two steps ahead in time of the original sine wave.

When I was looking at the hot-gym example yesterday, I found that the predicted signal overlaps exactly with the presented signal. Why does this happen?

Also, I have another question about setting up the SDR classifier for prediction. Why is the record number for inference one step ahead of the record number for training, as shown in the example below?

# construct the classifier first, e.g.:
from nupic.algorithms.sdr_classifier import SDRClassifier
c = SDRClassifier(steps=[1])  # classify against the value one step ahead
# learning
c.compute(recordNum=0, patternNZ=[1, 5, 9],
          classification={"bucketIdx": 4, "actValue": 34.7},
          learn=True, infer=False)
# inference
result = c.compute(recordNum=1, patternNZ=[1, 5, 9],
                   classification={"bucketIdx": 4, "actValue": 34.7},
                   learn=False, infer=True)

Is there any correlation between the shift in the record number and the order of prediction?

// Thanks

Look into the inference shifter and see if that solves your problem.
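For reference, this is roughly how the hot-gym example uses it; a minimal sketch assuming NuPIC's InferenceShifter from nupic.data.inference_shifter (the model, records, and plotting step here are hypothetical placeholders):

from nupic.data.inference_shifter import InferenceShifter

shifter = InferenceShifter()
for record in records:              # records: your input rows (hypothetical)
    result = model.run(record)      # model: an OPF model as in the hot-gym tutorial
    result = shifter.shift(result)  # delays the inferences so the prediction made at
                                    # time t lines up with the actual value at t+1
    # ... plot/compare the shifted inference against the current actual value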

But what if I’m using the SDR classifier alone, without the inference shifter? Would my statement above still be valid?

What about my second question?

// Thanks

To clarify, this is not a 2nd-order prediction (in the sense of first-order vs. high-order memory). You are not classifying the predicted cells themselves into a value; you are classifying the cells from X steps in the past into the current value. So predicting 2 steps into the future is more of a classification parameter than a core HTM thing. The HTM will learn the same way no matter how many steps into the future you try to classify.
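A minimal sketch of what that means in code, assuming NuPIC's SDRClassifier from nupic.algorithms.sdr_classifier; the number of steps ahead is purely a classifier parameter:

from nupic.algorithms.sdr_classifier import SDRClassifier

c1 = SDRClassifier(steps=[1])             # classify cells against the value 1 step ahead
c2 = SDRClassifier(steps=[2])             # same cells, classified against the value 2 steps ahead
c_multi = SDRClassifier(steps=[1, 2, 5])  # one classifier can track several horizons at once
# The temporal memory feeding these classifiers is configured and learns identically in all cases.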

If inference shifting is on for plotting and the two curves overlap exactly, that means the predictions were exactly correct (probably not the case). More likely, inference shifting is not being used, and you’re seeing the predictions actually trailing the actual values by one step. When no good prediction is available, the most likely next input tends to be the same input it just saw, so that is what gets returned.

Inference is always one time step ahead of reality. So if you are at a given time step, the real-world value is for now and the inference is for one step from now.
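To make that alignment concrete, here is a sketch using the same SDRClassifier API as the snippet quoted earlier; the record stream and its fields are hypothetical:

from nupic.algorithms.sdr_classifier import SDRClassifier

c = SDRClassifier(steps=[1])
predictions = []   # prediction made at record t (it is about record t+1)
actuals = []       # actual value seen at record t

for t, (patternNZ, bucketIdx, actValue) in enumerate(records):   # records: hypothetical stream
    result = c.compute(recordNum=t, patternNZ=patternNZ,
                       classification={"bucketIdx": bucketIdx, "actValue": actValue},
                       learn=True, infer=True)
    actuals.append(actValue)
    best = result[1].argmax()                         # most likely bucket for 1 step ahead
    predictions.append(result["actualValues"][best])

# predictions[t] should be compared against actuals[t + 1], not actuals[t];
# plotting both without that one-step shift is what makes them appear to overlap.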