Would breaking NuPIC serialization backwards compatibility affect you?

NuPIC models can be serialized to disk. Many people do this to save their models in case of failure, to stop and restart them for performance reasons, or to save them for later use.
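For anyone unfamiliar with what that looks like in practice, here is a minimal sketch using the OPF API. The `MODEL_PARAMS` dict, the “consumption” predicted field, and the checkpoint path are placeholders for whatever your application actually uses, and the `ModelFactory` import path differs slightly between NuPIC releases:

```python
from nupic.frameworks.opf.modelfactory import ModelFactory  # model_factory in newer releases

# MODEL_PARAMS and the "consumption" field stand in for whatever your
# application actually uses.
model = ModelFactory.create(MODEL_PARAMS)
model.enableInference({"predictedField": "consumption"})

# ... feed your live data stream into the model with model.run(record) ...

# Serialize the model's full learned state to disk.
model.save("/absolute/path/to/checkpoint")

# Later (after a restart or failure), restore the model and keep going.
model = ModelFactory.loadFromCheckpoint("/absolute/path/to/checkpoint")
```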

Breaking backwards compatibility of this serialization technique means that any current models saved using the old method could not be resurrected in any version of NuPIC released after the backwards-incompatible change was made. [1]

The ability to break serialization backwards compatibility would untie our hands in many ways. We could rethink how code is organized, named, and structured. We could remove legacy terminology, making it easier for new contributors to understand. We could clarify our interfaces, allowing us to get to a NuPIC 1.0 version much sooner. It would speed up our build pipelines because we’d no longer need to run backwards compatibility tests around serialization.

But what happens if we break NuPIC serialization backwards compatibility?

Any NuPIC models serialized by current production applications would not load in newer versions of NuPIC, preventing those applications from upgrading without losing the state of all their models.

How disruptive would this actually be?

Remember that affected applications would not need actual code changes, because the code API hasn’t changed (that would be a different situation entirely). It just means that their models would lose all memory of the data they have seen and start from scratch.

So, is this a big deal to anyone? Remember that NuPIC is still pre-1.0, so I will not feel very guilty about pulling this trigger. But I do want to gauge how much this would affect all of you who might have some NuPIC models running today.

  • I am not affected because I have no NuPIC models running that I care about
  • I am affected, but am ok losing my model history when I upgrade NuPIC
  • I am affected, but I just won’t upgrade to a later version of NuPIC
  • I am affected, and this situation is unacceptable and I’m going to raise hell about it

Or instead of taking the poll you can just respond below. (Or both.)


[1] It is possible to create a model migration program that reads in models in the old format and saves them out in the new format, but we have no plans to work on this right away.

I appreciate that you’re being “sensitive” to users, but isn’t this to be expected of pre-release software? You need some flexibility to hone the software into the best form possible - which means occasionally changing crucial things until the API is one you can commit to. Just wondering?

How might someone deal with the format change in their production application, short of a transcoding mechanism? One way would be to delete the old checkpoints and “replay” the cached data stream over a “reasonable” timeframe into newly created models (assuming you’ve been caching a reasonable amount of recent data for the affected models).
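A rough sketch of that replay approach, under the same kind of assumptions as the checkpoint example above (the OPF ModelFactory, a MODEL_PARAMS dict, a “consumption” predicted field), plus a hypothetical `iter_cached_records()` that yields the recent records your application has been caching:

```python
from nupic.frameworks.opf.modelfactory import ModelFactory  # model_factory in newer releases

def rebuild_model(model_params, checkpoint_dir, cached_records):
    """Create a fresh model and replay cached history into it."""
    model = ModelFactory.create(model_params)
    model.enableInference({"predictedField": "consumption"})

    # Warm the new model back up on whatever recent data you kept around.
    for record in cached_records:
        model.run(record)

    # Checkpoint it again, now in the new serialization format.
    model.save(checkpoint_dir)
    return model

# e.g. rebuild_model(MODEL_PARAMS, "/absolute/path/to/checkpoint", iter_cached_records())
```

The more of the recent stream you’ve cached, the closer the rebuilt model should be to the one you discarded.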

When you add pre-release software to your “production application”, you are doing it with the knowledge that things will change and that you might have to make some changes to your own software if you decide to “update” dependencies to the latest version. Most of the software I have seen that was shared before a 1.0 release comes with the caveat that change is to be expected.

@cogmission In response to both your replies, I think I’ll just reiterate what I said:

@rhyolight Wow, I totally read your initial post and somehow glossed over the fact that you had already expressed my point of view, sorry.