KjException when serializing

I’m trying to use the writeToFile function from “New Serialization”, which I found here: http://nupic.docs.numenta.org/1.0.0/guides/serialization.html, but I get the following error:
capnp.lib.capnp.KjException: src/capnp/arena.c++:106: failed: Exceeded message traversal limit. See capnp::ReaderOptions.
stack: 0x7f666095c6f9 0x7f666096032d 0x7f66608c1d1c 0x7f666093d374 0x7f6660904c5f 0x7f666092cd57 0x7f666092d7af 0x7f666081af5c 0x7f666081b40e 0x7f66607b8746 0x7f66607c32b0 0x7f66607c19f2 0x7f66607c160f 0x7f66607c5290 0x4c45fa 0x4c9d7f

I would appreciate any help.
Thanks in advance.
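For context (not from the original thread): the limit in this error is Cap’n Proto’s read-side traversal guard, which defaults to 8M words (64 MiB, since a word is 8 bytes). The sketch below is a hypothetical illustration of checking a file against that default and, if pycapnp is available, reading a message with a raised limit; the schema and file names are made up, and nupic 1.x does not expose this option through writeToFile/readFromFile.

```python
# capnp's default traversal limit, counted in 8-byte words (64 MiB total).
DEFAULT_TRAVERSAL_LIMIT_WORDS = 8 * 1024 * 1024

def exceeds_default_limit(num_bytes):
    # A serialized message larger than the limit will trip the
    # "Exceeded message traversal limit" KjException when read.
    return num_bytes // 8 > DEFAULT_TRAVERSAL_LIMIT_WORDS

try:
    import capnp  # pycapnp; optional for this sketch
except ImportError:
    capnp = None

def read_with_raised_limit(schema_path, struct_name, data_path,
                           limit_words=2**61):
    # Hypothetical helper: load a .capnp schema and read a serialized
    # message with a much larger traversal limit than the default.
    if capnp is None:
        raise RuntimeError("pycapnp is not installed")
    schema = capnp.load(schema_path)
    struct_type = getattr(schema, struct_name)
    with open(data_path, "rb") as f:
        return struct_type.read(f, traversal_limit_in_words=limit_words)
```

This only illustrates the pycapnp option the error message points at; whether it applies depends on where in the nupic write/read path the limit is actually hit.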

@Sergey Please update to the latest nupic and try again. You want nupic 1.0.3 / nupic.bindings 1.0.0.

Thanks for your help. I updated nupic, but I got the same error. Do you know another approach?

Can you please share the code that is causing this problem? What object are you writing and how are you writing it?

Thanks

Of course. This is the code causing the problem (the second line):

with open("out_sp.tmp", "wb") as f1:
    sp.writeToFile(f1)

I want to write both the Spatial Pooler and the Temporal Memory, but for now let’s discuss the SP.

@lscheinkman @scott Does this sound familiar? This SP write should be pretty cut-and-dried.

I just realized it is probably because it’s an OPF model?

@Sergey exactly what object is sp in your code? How do you create it?

Could be related to this?

This is how I create the sp:

with open(PARAMS_PATH, "r") as f:
    modelParams = yaml.safe_load(f)["modelParams"]
    # enParams = modelParams["sensorParams"]["encoders"]
    spParams = modelParams["spParams"]
    tmParams = modelParams["tmParams"]
sp = SpatialPooler(
    inputDimensions=(encodingWidth,),
    columnDimensions=(spParams["columnCount"],),
    potentialPct=spParams["potentialPct"],
    potentialRadius=encodingWidth,
    globalInhibition=spParams["globalInhibition"],
    localAreaDensity=spParams["localAreaDensity"],
    numActiveColumnsPerInhArea=spParams["numActiveColumnsPerInhArea"],
    synPermInactiveDec=spParams["synPermInactiveDec"],
    synPermActiveInc=spParams["synPermActiveInc"],
    synPermConnected=spParams["synPermConnected"],
    boostStrength=spParams["boostStrength"],
    seed=spParams["seed"],
    wrapAround=True
)

I don’t know exactly. I didn’t use any capnp classes directly.

Have you passed any data into your SP before trying to serialize it?

Yes, I have a function where I call sp.compute, pass the result to the TM, and then to the classifier. After these steps, in the same function, I try to serialize the SP.

Ok, does this happen if you don’t send data into it and try to serialize?

I just tried. I put this code at the beginning of my function and got the same error.

with open("out_sp.tmp", "wb") as f1:
    sp.writeToFile(f1)

Are you on Windows?

Linux

Does this script fail with the same error?

from nupic.algorithms.spatial_pooler import SpatialPooler

def testWriteSp():
  sp = SpatialPooler(
    inputDimensions=(400,),
    columnDimensions=(1024,),
    wrapAround=True
  )
  with open("out_sp.tmp", "wb") as f1:
    sp.writeToFile(f1)


if __name__ == "__main__":
  testWriteSp()

(it works for me)

I don’t see anything different in my function; it’s the same approach. Maybe you can spot a difference in these functions:

def runLearning(numRecords):
  
  learning_time = time()
  with open("test3.csv", "r") as fin:
    reader = csv.reader(fin)
    headers = reader.next()
    reader.next()
    reader.next()

    for count, record in enumerate(reader):

      if count >= numRecords: break

      # Convert data string into Python date object.
      #dateString = datetime.datetime.strptime(record[0], "%m/%d/%y %H:%M")
      # Convert data value string into float.
      event_value = float(record[6]) # device 5
      event_value_3 = float(record[4]) # device 3
      event_value_2 = float(record[3]) #device 2
      event_value_7 = float(record[8]) # device 7
      bezline_all = float(record[10])

      flow_value  = float(record[0])
      # To encode, we need to provide zero-filled numpy arrays for the encoders
      # to populate.
      eventBits = numpy.zeros(eventEncoder.getWidth())
      eventBits_2 = numpy.zeros(eventEncoder2.getWidth())
      eventBits_3 = numpy.zeros(eventEncoder1.getWidth())
      eventBits_7 = numpy.zeros(eventEncoder7.getWidth())

      baseline_Bits = numpy.zeros(baselineEncoder.getWidth())
      flowBits = numpy.zeros(flowEncoder.getWidth())


      # Now we call the encoders to create bit representations for each value.
      eventEncoder.encodeIntoArray(event_value, eventBits)
      eventEncoder1.encodeIntoArray(event_value_3,eventBits_3)
      eventEncoder2.encodeIntoArray(event_value_2,eventBits_2)
      eventEncoder7.encodeIntoArray(event_value_7,eventBits_7)

      baselineEncoder.encodeIntoArray(bezline_all,baseline_Bits)
      flowEncoder.encodeIntoArray(flow_value, flowBits)


      # Concatenate all these encodings into one large encoding for Spatial
      # Pooling.
      encoding = numpy.concatenate(
        [eventBits,flowBits,baseline_Bits,eventBits_2,flowBits,baseline_Bits,eventBits_3,flowBits,baseline_Bits,eventBits_7,flowBits,baseline_Bits]
      )


      # Create an array to represent active columns, all initially zero. This
      # will be populated by the compute method below. It must have the same
      # dimensions as the Spatial Pooler.
      activeColumns = numpy.zeros(spParams["columnCount"])
      # activeColumns1 = numpy.zeros(spParams["columnCount"])


      # Execute Spatial Pooling algorithm over input space.

      sp.compute(encoding,True,activeColumns)

     # sp.compute(encoding1, True, activeColumns)

      activeColumnIndices = numpy.nonzero(activeColumns)[0]

      # Execute Temporal Memory algorithm over active mini-columns.
      tm.compute(activeColumnIndices, learn=True)

      activeCells = tm.getActiveCells()

      # Get the bucket info for this input value for classification.
      bucketIdx = eventEncoder.getBucketIndices(event_value)[0]
      bucketIdx_2 = eventEncoder2.getBucketIndices(event_value_2)[0]
      bucketIdx_3 = eventEncoder1.getBucketIndices(event_value_3)[0]
      bucketIdx_7 = eventEncoder7.getBucketIndices(event_value_7)[0]
      print "BucketIdx_3:",bucketIdx_3
      print "BucketIdx:",bucketIdx


      # Run classifier to translate active cells back to scalar value.
      classifierResult = classifier.compute(
        recordNum=count,
        patternNZ=activeCells,
        classification={
          "bucketIdx": bucketIdx,
          "actValue": event_value
        },
        learn=True,
        infer=False
      )
      classifierResult1 = classifier1.compute(
        recordNum=count,
        patternNZ=activeCells,
        classification={
          "bucketIdx": bucketIdx_3,
          "actValue": event_value_3
        },
        learn=True,
        infer=False
      )
      classifierResult7 = classifier7.compute(
        recordNum=count,
        patternNZ=activeCells,
        classification={
          "bucketIdx": bucketIdx_7,
          "actValue": event_value_7
        },
        learn=True,
        infer=False
      )
      classifierResult2 = classifier2.compute(
        recordNum=count,
        patternNZ=activeCells,
        classification={
          "bucketIdx": bucketIdx_2,
          "actValue": event_value_2
        },
        learn=True,
        infer=False
      )
    learning_time_end = time()
    print "Time", (learning_time_end - learning_time)
    with open("out_sp.tmp", "wb") as f1:
        sp.writeToFile(f1)
    with open("out_tm.tmp", "wb") as f:
        tm.writeToFile(f)

    return result
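One cheap sanity check on the loop above (purely illustrative, with made-up widths): the concatenation repeats flowBits and baseline_Bits after each of the four event encodings, so the SP’s inputDimensions must match that full width, not a single pass over the encoders. A stdlib-only sketch of the width arithmetic:

```python
def total_encoding_width(event_widths, flow_w, baseline_w):
    # Mirrors the concatenation order in runLearning: each of the four
    # event encodings is followed by a copy of flowBits and baseline_Bits.
    return sum(w + flow_w + baseline_w for w in event_widths)

# Hypothetical widths; the real values come from each encoder's getWidth().
# If the SP was created with inputDimensions=(encodingWidth,), this total
# is what encodingWidth has to equal: 4 * (50 + 40 + 30) = 480.
width = total_encoding_width([50, 50, 50, 50], 40, 30)
assert width == 480
```

A mismatch here typically fails earlier than serialization, but it rules out one variable when comparing against the minimal repro script.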

That’s not all your code. I don’t see where the sp is initialized.

But did the script I posted above work for you? Can you save it into a file and run it with python? If it fails for you, I can rule out data and model params as a source of the problem.

Your script works for me.

My initialization:

with open(PARAMS_PATH, "r") as f:
    modelParams = yaml.safe_load(f)["modelParams"]
    # enParams = modelParams["sensorParams"]["encoders"]
    spParams = modelParams["spParams"]

sp = SpatialPooler(
    inputDimensions=(encodingWidth,),
    columnDimensions=(spParams["columnCount"],),
    potentialPct=spParams["potentialPct"],
    potentialRadius=encodingWidth,
    globalInhibition=spParams["globalInhibition"],
    localAreaDensity=spParams["localAreaDensity"],
    numActiveColumnsPerInhArea=spParams["numActiveColumnsPerInhArea"],
    synPermInactiveDec=spParams["synPermInactiveDec"],
    synPermActiveInc=spParams["synPermActiveInc"],
    synPermConnected=spParams["synPermConnected"],
    boostStrength=spParams["boostStrength"],
    seed=spParams["seed"],
    wrapAround=True
)