Discussion on compatibility of various Linux build environments



I’m curious which Linux build environments folks consider ideal/recommended for building NuPIC Core from source, and which have caused problems.

I’ve lately been experimenting with building Docker images for the armhf architecture, and it seems that more recent base images (Debian Stretch or Ubuntu Bionic, for example) run into more deprecation problems (and ultimately test case failures) than older base images (Debian Jessie or Ubuntu Trusty, for example), while the older images tend to have problems with outdated package versions. These problems are not specific to the armhf architecture.

Figured I’d get some input from folks who have more experience building NuPIC Core, and any lessons they have learned along the way.

For reference, the most promising attempt I have so far is this one. It fails with an error about setuptools being outdated, but resolves most of the other problems.


I have been running Linux on Windows (Ubuntu flavor) for a while now, and it has been good for everything I have thrown at it. Mostly this has been fMRI-based brain viewing tools and Circos.

I am curious if anyone has used it for NuPIC Core.


I have had no success building NuPIC on Linux. I offer two workarounds:

  • Install Windows and use a prebuilt binary. I got Windows 10 for free and put it on its own partition so that I can dual-boot my workstation. This is IMO the simplest and quickest way to get NuPIC Core.
  • Use NuPIC without NuPIC Core. I did this because I wanted to use NuPIC’s encoders and SDR classifiers on a Linux installation. I had to modify each file I used, deleting references to NuPIC Core. On this NuPIC installation I also converted from Python 2 to Python 3 using the program “2to3”, and it mostly worked.

Good luck


I fixed the remaining build issues and got the required versions in place, so the build now completes on the unmodified source without any errors. The recipe I ended up with: start with debian:jessie and install the prerequisite packages besides cmake (copied from the official NuPIC Core Dockerfile). Then add jessie-backports to the sources list and pull in cmake from there to get a high enough version. Finally, use the --upgrade flag when installing setuptools with pip to get a high enough version of it.
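The recipe above, sketched as a Dockerfile. This is a hedged reconstruction, not the exact file I used: the package list is abbreviated (the real names come from the official NuPIC Core Dockerfile), and the jessie-backports mirror URL is an assumption.

```dockerfile
FROM debian:jessie

# Prerequisites, minus cmake (abbreviated; copy the full list from the
# official NuPIC Core Dockerfile)
RUN apt-get update && apt-get install -y \
    g++ git python2.7 python2.7-dev python-pip

# Pull a recent-enough cmake from jessie-backports
# (mirror URL is an assumption and may need adjusting)
RUN echo "deb http://ftp.debian.org/debian jessie-backports main" \
        >> /etc/apt/sources.list && \
    apt-get update && \
    apt-get install -y -t jessie-backports cmake

# Upgrade setuptools so the build doesn't fail on an outdated version
RUN pip install --upgrade setuptools
```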

Though installation completes without any issues, there are still some test cases failing (only on armhf), so I haven’t quite achieved my ultimate goal yet. I think at this point the remaining issue is just with the architecture (since the exact same recipe passes all tests on x64).

The remaining failures on armhf are similar to this one:

_____________ NetworkTest.testSimpleTwoRegionNetworkIntrospection ______________

self = <tests.network_test.NetworkTest testMethod=testSimpleTwoRegionNetworkIntrospection>

    def testSimpleTwoRegionNetworkIntrospection(self):
      # Create Network instance
      network = engine.Network()
      # Add two TestNode regions to network
      network.addRegion("region1", "TestNode", "")
      network.addRegion("region2", "TestNode", "")
      # Set dimensions on first region
      region1 = network.getRegions().getByName("region1")
>     region1.setDimensions(engine.Dimensions([1, 1]))

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <nupic.bindings.engine_internal.Dimensions;  >, args = ([1, 1],)

    def __init__(self, *args):
            __init__(self) -> Dimensions
            __init__(self, v) -> Dimensions
            __init__(self, x) -> Dimensions
            __init__(self, x, y) -> Dimensions
            __init__(self, x, y, z) -> Dimensions
>       this = _engine_internal.new_Dimensions(*args)
E       NotImplementedError: Wrong number or type of arguments for overloaded function 'new_Dimensions'.
E         Possible C/C++ prototypes are:
E           nupic::Dimensions::Dimensions()
E           nupic::Dimensions::Dimensions(std::vector< size_t,std::allocator< size_t > >)
E           nupic::Dimensions::Dimensions(size_t)
E           nupic::Dimensions::Dimensions(size_t,size_t)
E           nupic::Dimensions::Dimensions(size_t,size_t,size_t)

../src/nupic/bindings/engine_internal.py:613: NotImplementedError

It appears that Python is passing an array of numbers, and that pattern is not properly matching any of the native prototypes.

I’m thinking a clue is the use of “size_t” here, which on armhf is 4 bytes long, but on x64 is 8 bytes. I’ll do some more digging to verify that this is the issue (one way to check would be to build on the x86 architecture, which also has a 4-byte size_t). Anyone have some thoughts on this particular problem, and how it might be addressed?
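A quick way to see the size_t width difference from Python, since ctypes mirrors the platform’s C types:

```python
import ctypes

# size_t width differs across the platforms discussed here:
# 4 bytes on 32-bit armhf/x86, 8 bytes on x64 and arm64.
size_t_bytes = ctypes.sizeof(ctypes.c_size_t)
pointer_bytes = ctypes.sizeof(ctypes.c_void_p)
print("sizeof(size_t) =", size_t_bytes)
print("pointer width  =", pointer_bytes)
```

On the platforms in question, size_t matches the pointer width, so this prints 8 and 8 on x64 but would print 4 and 4 inside a 32-bit armhf or x86 container.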


I verified that the same problem also occurs on x86 (building from the 32bit/debian:jessie image). As a last test to be sure, 64-bit ARM should work fine (building from the arm64v8/debian:jessie or aarch64/debian:jessie image – I’m still not clear on the difference between these two yet). I should be able to use Crazyhead90’s port of pi64 to test this on the Pi 3B+.

If my theory is correct, then NuPIC Core is not compatible with 32-bit systems.


I sort of thought that I was done dealing with code that cared about endianness and word/byte alignment issues.



The unit tests for the native code all pass, so I’m hoping the issue is limited to the binding interface that the Python code uses to interact with the native core. I’m still relatively new to Python myself, so this is an area I’ll have to study up on to see if I can confirm the scope of the issue and contribute a fix. My goal is to run NuPIC (with the native core for performance) on the Pi Zero W, which of course can’t do 64-bit (being only ARMv6).


A little more information. The issue appears to be related to this 32-bit fix. Essentially what it does is only create the Dimset template for std::vector<size_t> when compiling on something other than 32-bit Linux. Removing this condition results in a compiler error about redefinition of ‘struct swig::traits_asval<unsigned int>’ (which is of course why the fix was added).

My current theory is that because this template doesn’t get created (since the compiler recognizes size_t and unsigned int as the same type), some logic later on doesn’t make the connection that engine.Dimensions([1, 1]) should match the prototype nupic::Dimensions::Dimensions(std::vector< size_t,std::allocator< size_t > >).
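Interestingly, ctypes exhibits the same aliasing that I believe trips up the SWIG template machinery: on 32-bit platforms c_size_t is literally the same class as c_uint, while on 64-bit Linux it aliases c_ulong instead. A small sketch of the idea:

```python
import ctypes

# On ILP32 platforms ctypes defines c_size_t as an alias of c_uint --
# the two names refer to one type, just as the C++ compiler treats
# size_t and unsigned int as the same type there. That identity is why
# a separate traits_asval<size_t> specialization would be a redefinition
# of traits_asval<unsigned int> on 32-bit builds.
aliased_to_uint = ctypes.c_size_t is ctypes.c_uint
print("size_t aliases unsigned int:", aliased_to_uint)

# On LP64 platforms (x64, arm64 Linux) size_t is unsigned long instead.
print("size_t aliases unsigned long:", ctypes.c_size_t is ctypes.c_ulong)
```

So on x64 the first line prints False (the types are distinct and both templates can exist), while in a 32-bit container it would print True, matching the redefinition error.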

I don’t have a solution yet, but at least feel like I am making some progress in understanding the issue.


Solved it

================================================================================== 210 passed, 4 skipped, 2 pytest-warnings in 4.74 seconds ==================================================================================

Now to test on armhf :fearful: