Frontiers Topology Paper Code Errors

I am looking at using the topology feature of NuPIC to analyze 2D inputs with non-global inhibition and non-global receptive fields ("potential pool" is the NuPIC term, I think?). To this end I am trying to run the code for the 2016 Frontiers paper “The HTM Spatial Pooler - A Neocortical…”

Both the 1D and 2D examples give me errors (different ones). The 1D case looks like an object property name was changed; the 2D case looks like the input dimensionality isn’t handled correctly. Below are the two commands and excerpts from the output. Note that I also had to make a small change to the 2D script’s import logic so it could find the parameters file.

Any help would be appreciated. Thanks!

1D case:

$python train_sp.py -d 'randomBarPairs'
.
.
.
[lots of parameter output, and finishing the first training epoch]
.
.
.
Mean number of active columns: 20.0
Traceback (most recent call last):
  File "train_sp.py", line 640, in <module>
    metrics, expName = runSPexperiments(expConfig)
  File "train_sp.py", line 553, in runSPexperiments
    reconstructionError(sp, testInputs, activeColumnsCurrentEpoch))
  File "C:\Python27\lib\site-packages\htmresearch\frameworks\sp_paper\sp_metrics.py", line 836, in reconstructionError
    numActiveColumns = int(sp._localAreaDensity * sp._numColumns) + 0.0
AttributeError: 'SpatialPooler' object has no attribute '_localAreaDensity'
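In case anyone else hits this before a fix lands: the metric reads the private attributes `_localAreaDensity` and `_numColumns`, which appear to have been removed or renamed. A hedged sketch of a workaround using what I believe are the SpatialPooler's public getters (`getLocalAreaDensity`, `getNumColumns`, `getNumActiveColumnsPerInhArea`); the helper name is mine, not nupic's:

```python
def expected_active_columns(sp):
    """Hypothetical replacement for int(sp._localAreaDensity * sp._numColumns).

    Uses public accessors instead of private attributes. NOTE: the getter
    names are my assumption about nupic's SpatialPooler API.
    """
    density = sp.getLocalAreaDensity()
    if density > 0:
        # A fractional density was configured; convert it to a column count.
        return int(density * sp.getNumColumns())
    # Otherwise a fixed per-inhibition-area count was configured instead.
    return sp.getNumActiveColumnsPerInhArea()
```

This is just a sketch of the idea; the real fix belongs in `sp_metrics.py`.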

2D case:

$ python train_sp_topology_simple.py -d 'randomBarPairs'
.
.
.
[lots of parameter output]
.
.
.
training SP epoch 0
ERR:  Expecting 1D array but got 2D array [C:/projects/nupic-core/src/nupic/py_support/NumpyVector.hpp line 406]
Traceback (most recent call last):
  File "train_sp_topology_simple.py", line 150, in <module>
    sp.compute(inputVector, True, outputColumns)
  File "C:\Python27\lib\site-packages\nupic\bindings\algorithms.py", line 2873, in compute
    return _algorithms.SpatialPooler_compute(self, *args)
RuntimeError: Expecting 1D array but got 2D array
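In case it helps narrow things down: the C++ binding seems to want flat arrays even when the SP was built with 2D `inputDimensions`, so flattening before the call may be a workaround. A sketch (untested against that script; the array shapes are illustrative, not the script's real dimensions):

```python
import numpy as np

# Illustrative shapes; train_sp_topology_simple.py's real sizes may differ.
inputImage = np.zeros((32, 32), dtype=np.uint32)   # 2D input
outputColumns = np.zeros(1024, dtype=np.uint32)    # prod of columnDimensions

# Flatten 2D (32, 32) -> 1D (1024,) before handing it to the binding.
inputVector = inputImage.flatten()
# sp.compute(inputVector, True, outputColumns)  # both arrays now 1D
```

The topology itself is still carried by `inputDimensions`/`columnDimensions`; only the buffers passed to `compute` need to be flat, as far as I can tell.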

Just to be sure before I work on this, you are using this code, right?

That is correct.

I think this is a bug. I’ve filed a report.

It took me 32 minutes to confirm this. :wink:

Wow, I think it took @lscheinkman less time to fix this bug than it took me to confirm it!


I pulled the latest commits, and the without-topology case now works when I run the random SDRs, fixed sparsity experiment. However, I am still getting the _localAreaDensity error in the random SDRs, fixed sparsity, with-topology case.

Perhaps I am misunderstanding the dependencies I need? I have the latest nupic and htmresearch installed (via pip), but it sounds like I should be using Docker and the Dockerfile instead.

I don’t think that will work. htmresearch does not install via pip; you have to install it from source. Right, @lscheinkman?

The htmresearch repo’s README indicates it should install 0.0.3 via pip.

Also, maybe I should clarify that my real goal is to set up a simple (hotgym-like) anomaly detection system that uses a 2D topology and a passthrough encoder. I figured I would run what you have already implemented to understand how it works, then port the topology aspect to my own code (which has grown out of the hotgym example).

So if there is a simpler way of getting to that, maybe via the HTMSchool topology examples, I am open to pursuing that instead.

Then I would start here, where the params are modified for topology:

If you want to try to run this yourself, you’ll need to run NuPIC on an HTTP server. Here are some instructions; I think they still work?

Thanks for the link, I will look into it. I also found a comment in the documentation that:

Note that the OPF will only create one-dimensional input and spatial pooling structures, so during SP creation columnCount translates to columnDimensions=(columnCount,) and inputDimensions=(inputWidth,).

which seems to explain my trouble with simply changing the hotgym example’s parameters, and my confusion over the parameter names. So I guess the way forward is to abandon the OPF/hotgym example and use either the Network API or the algorithms API.
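For what it’s worth, here is roughly how I’d expect the 2D setup to look when going through the algorithms API directly. The parameter names follow the SpatialPooler constructor as I understand it; the values, and the import path in the comment, are just my guesses for a hotgym-scale experiment:

```python
# Hypothetical 2D SP configuration, bypassing the OPF's 1D flattening.
# All values are illustrative, not from the paper's scripts.
SP_PARAMS = {
    "inputDimensions": (32, 32),        # 2D input topology
    "columnDimensions": (32, 32),       # 2D column topology
    "potentialRadius": 5,               # local receptive fields ("potential pool")
    "globalInhibition": False,          # local inhibition, needed for topology
    "localAreaDensity": 0.02,           # fixed sparsity as a fraction
    "numActiveColumnsPerInhArea": -1,   # disabled when localAreaDensity is set
    "wrapAround": True,
}
# from nupic.algorithms.spatial_pooler import SpatialPooler  # path may vary by version
# sp = SpatialPooler(**SP_PARAMS)
```

If I understand the constructor correctly, `localAreaDensity` and `numActiveColumnsPerInhArea` are mutually exclusive, which is why one of them is set to -1 here.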
