Making a deep copy of an OPF model




I have an instance of an OPF model (the same one used in the numenta detector in NAB, actually) that I’d like to make a deep copy of in my code. The obvious approach, “model_copy = copy.deepcopy(model)”, does not work: when I try it, it fails because self.__restoringFromState gets set to True inside model_copy, which trips an assert statement.

Could somebody explain or point me in the direction of documentation which explains how to make deep copies of models?



You probably want to use the old way of saving, which pickles things.


Interesting…is there a new way of saving/loading models that is under development? Is that why it’s deprecated?


Yes, see the rest of the doc I sent. But to use this method you need to get hold of the algorithm instances.


Thanks for the quick responses…I have to create and load many copies, and unfortunately writing to the file system is exceedingly slow. I’m going to keep trying to deep copy a model entirely in memory.


I notice the __eq__ method is not overridden in the base model class or in htm_prediction_model. Is there any code out there to test equality between two model instances?


I’ll have to defer to @scott on that question.


I think you can probably use either the new or old serialization completely in memory. The old method requires separate steps for the Python state (pickle) and the C++ state, so the new method is probably easier (and faster).
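The in-memory round trip for the pickle-based method looks roughly like this. This is a minimal sketch with a toy stand-in class (the real OPF model also needs its C++ state serialized separately, which this example skips):

```python
import pickle

class ToyModel:
    """Hypothetical stand-in for an OPF model's Python state."""
    def __init__(self):
        self.steps = 0

    def run(self, value):
        self.steps += 1
        return value + self.steps

original = ToyModel()
original.run(1.0)

# Same save/load idea as writing to disk, but kept entirely in memory.
blob = pickle.dumps(original)
clone = pickle.loads(blob)

assert clone.steps == original.steps  # clone starts from the same state
clone.run(2.0)
assert original.steps == 1  # mutating the clone leaves the original alone
```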

For the new method, you can see how to do an in-memory copy here:

It might be faster to use to_bytes_packed and from_bytes_packed instead of the unpacked version used in the example code. See pycapnp reference here:

Serializing an anomaly detection model

And you probably want to save the result of builderProto.to_bytes() into a variable that you can call from_bytes on multiple times to create multiple copies of the model.
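The serialize-once, copy-many pattern looks like this. It is sketched here with pickle and a plain dict standing in for the model state; with the new method the bytes would come from builderProto.to_bytes() and go back through from_bytes instead:

```python
import pickle

# Hypothetical stand-in for the model's serialized state; with capnp
# this blob would be the result of builderProto.to_bytes().
state = {"weights": [0.1, 0.2], "step": 7}
blob = pickle.dumps(state)  # serialize once up front...

# ...then build as many independent copies as needed from the same bytes,
# instead of re-serializing the model for every copy.
copies = [pickle.loads(blob) for _ in range(3)]
copies[0]["step"] = 99
assert copies[1]["step"] == 7  # each copy is independent
```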


Excellent. You’re right, that method is much faster. For determining equality I’m just checking that they give the same output over time, which is the case. Thanks so much for your help.
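Roughly, the output-based equality check looks like this (a sketch with toy models and a hypothetical helper standing in for the OPF model and its run method):

```python
class ToyModel:
    """Hypothetical stand-in for an OPF model."""
    def __init__(self, scale):
        self.scale = scale
        self.total = 0.0

    def run(self, value):
        self.total += value * self.scale
        return self.total

def outputs_match(model_a, model_b, inputs):
    # Treat two models as "equal enough" if they produce identical
    # outputs over the same input stream, since __eq__ is not defined.
    return all(model_a.run(x) == model_b.run(x) for x in inputs)

a, b = ToyModel(2.0), ToyModel(2.0)
assert outputs_match(a, b, [1.0, 2.0, 3.0])
assert not outputs_match(ToyModel(2.0), ToyModel(3.0), [1.0])
```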