Delta encoder example?

Hi all,

So I just implemented the delta encoder on my 2-field scalar data, which looks like this:

I’m wondering if I’ve set this up properly in the params file. How does this look?

I have my min and max values as the min and max of the actual scalars. Is this right, or should they be the min and max of the differences instead? Thanks,

  • Sam

Were you able to implement the delta encoder for prediction as well as anomaly detection? I am trying to do the same but am not sure how to set up the params file.

I think the best way to go is to do the differencing in preprocessing, then feed the difference values into NuPIC, treating them like any other numeric input. This lets you use the ScalarEncoder, which the classifier is set up to work with.
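To illustrate, here is a minimal sketch of that preprocessing step in plain Python. The function name and sample values are just for illustration; each resulting diff would then be fed to the model as an ordinary scalar input, encoded with the ScalarEncoder configured in the params file.

```python
def to_diffs(values):
    """Return first-order differences: diffs[i] = values[i+1] - values[i]."""
    return [b - a for a, b in zip(values, values[1:])]

# Hypothetical raw scalar readings:
raw = [10.0, 12.5, 11.0, 14.0]
diffs = to_diffs(raw)  # [2.5, -1.5, 3.0]
```

Note that the diff stream is one element shorter than the raw stream, and the encoder's min/max in the params file would then cover the range of the diffs, not of the raw values.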

Thank you sheiser1. Actually I tried this approach some time back and it gave me an RMSE of 300. This might be because of the negative (decreasing) and positive (increasing) diffs I got, but I'm not sure. Even though I want to model the differences, I want the output of the model to be the actual value instead of the diff. How do we do that? Maybe with a post-processing module that keeps track of the original value and adds the model output to it?

This seems like the way to me. Take the forecasted differences from the model and add them to the current raw values (the unprocessed input values that aren't differenced). Current raw value + predicted diff = predicted next raw value.
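The reconstruction step above can be sketched like this; the function name and values are illustrative, and `predicted_diff` stands in for whatever the model emits at each step:

```python
def reconstruct(current_raw, predicted_diff):
    """Recover the predicted next raw value from the predicted difference."""
    return current_raw + predicted_diff

current_raw = 11.0      # last unprocessed input value seen
predicted_diff = 3.0    # hypothetical output of the model trained on diffs
predicted_next = reconstruct(current_raw, predicted_diff)  # 14.0
```

A small post-processing module would just hold the latest raw value and apply this addition to every prediction before reporting it.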