Does NuPIC Allow Deepcopying a Trained OPF Model?



Hi everyone,
When I implemented an OPF model for anomaly detection, my idea was to feed test data into the trained model to get anomaly scores. To avoid training multiple times, and at the same time not pollute the well-trained model with previous test data, I want to keep a deep copy of the trained model. However, deep-copying the nupic.engine.Network object fails with a SwigPyObject error. I guess this is because Network wraps a C++ class and the wrapper defines no __deepcopy__ or __copy__ method. I also notice there is no NuPIC API for copying or deep-copying a nupic.frameworks.opf.model.Model object.

So could anyone provide any hints or solutions for this? I would appreciate the NuPIC developers' attention to this issue. Thanks for your time and patience!

The error is as follows:
TypeError: object.__new__(SwigPyObject) is not safe, use SwigPyObject.__new__()


Also, the following error comes up when I try to pickle a TemporalMemory object using pickle.dump(tm, open("tm_admin", "wb")):

in _reduce_ex
    raise TypeError("a class that defines __slots__ without "
TypeError: a class that defines __slots__ without defining __getstate__ cannot be pickled

I'd appreciate it if any NuPIC developer could resolve this. Thanks!
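For context on that second error: under the classic pickle protocols that Python 2 (which NuPIC targets) uses, a class that declares __slots__ but no __getstate__ raises exactly this TypeError. A minimal sketch of the underlying fix, using a hypothetical stand-in class (not the real TemporalMemory, which may need its own serialization path such as the Cap'n Proto read/write methods in newer NuPIC):

```python
import pickle

# A class with __slots__ needs explicit __getstate__/__setstate__
# to be picklable with the classic (protocol 0/1) pickle machinery.
class SlottedState(object):
    __slots__ = ("cells", "threshold")

    def __getstate__(self):
        # Return a picklable snapshot of all slot attributes.
        return {name: getattr(self, name) for name in self.__slots__}

    def __setstate__(self, state):
        for name, value in state.items():
            setattr(self, name, value)

# Stand-in instance, NOT the real TemporalMemory.
tm_like = SlottedState()
tm_like.cells = [0, 1, 2]
tm_like.threshold = 12

# Round-trip through pickle now works, even at protocol 0.
clone = pickle.loads(pickle.dumps(tm_like, protocol=0))
print(clone.cells, clone.threshold)
```

This only illustrates why pickle complained; whether patching methods onto NuPIC's TemporalMemory is appropriate is a separate question for the developers.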


The OPF has its own serialization method. Try that first.


Thank you, Matt! I have already tried the OPF serialization. Although pickling works through the OPF, it does not allow deep-copying a trained model, which seems to make keeping a copy of the trained model's state impossible for now.
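One workaround that gives the effect of a deep copy without __deepcopy__ support: checkpoint the model once after training, then reload the checkpoint before each test pass instead of copying the live object. A minimal sketch with a toy stand-in class (in NuPIC the role of pickle here would be played by the OPF's own serialization, e.g. model.save() and ModelFactory.loadFromCheckpoint(); check the OPF docs for the exact signatures):

```python
import os
import pickle
import tempfile

# Toy stand-in for a trained model whose state mutates as it runs.
class ToyModel(object):
    def __init__(self):
        self.state = []            # "learned" state

    def run(self, record):
        self.state.append(record)  # running pollutes the model

# "Train" the model.
trained = ToyModel()
for record in [1, 2, 3]:
    trained.run(record)

# Checkpoint the trained state once.
checkpoint = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(checkpoint, "wb") as f:
    pickle.dump(trained, f)

# Feeding test data mutates the in-memory model...
trained.run(99)

# ...but restoring the checkpoint recovers the pre-test state,
# which is all a deep copy of the trained model would give you.
with open(checkpoint, "rb") as f:
    restored = pickle.load(f)

print(restored.state)  # the pre-test state
```

The restored object is a fresh instance with the trained state, so each test run can start from an identical, unpolluted model.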


But that is exactly what this is supposed to do. How are you saving your OPF model?