Compiling NuPIC on Ubuntu 14

Quick Install Script

If you don’t care what’s actually happening in the installation described below, you can just run this script (with sudo rights!) on a freshly installed Ubuntu 14.04 LTS, and it should do everything you need. However, I suggest you install NuPIC by following the step-by-step directions below so you’ll know exactly what the installation process does in your environment.

Setup

Install some basic dependencies

sudo apt-get update -y
sudo apt-get install git g++ cmake python-dev -y

Clone nupic.core and nupic

git clone https://github.com/numenta/nupic.core.git
git clone https://github.com/numenta/nupic.git

Set some environment vars

These point to our checkout directories. We’ll need them later for running tests and scripts. (You’ll probably want to add these lines to your .bashrc or .bash_profile.)

export NUPIC=$HOME/nupic
export NUPIC_CORE=$HOME/nupic.core
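One way to persist these variables across shell sessions is to append them to your shell startup file (a sketch; use ~/.bash_profile instead if that is what your shell reads):

```shell
# Append the NuPIC environment variables to ~/.bashrc so they
# are set automatically in every new shell session.
printf 'export NUPIC=$HOME/nupic\nexport NUPIC_CORE=$HOME/nupic.core\n' >> ~/.bashrc
```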

Get the latest pip

Note that we are not using apt-get here, because it provides a very old version of pip that is hard to upgrade properly.

curl https://bootstrap.pypa.io/get-pip.py | sudo python

NuPIC Core

Install nupic.core python dependencies

This takes a while because NumPy has to compile!

cd $NUPIC_CORE
pip install -r bindings/py/requirements.txt --user

Install pycapnp

This will install the Python Cap’n Proto bindings along with the C++ Cap’n Proto library.

pip install pycapnp==0.5.8 --user

Configure and generate C++ build files

See the README for details about the following commands.

mkdir -p $NUPIC_CORE/build/scripts
cd $NUPIC_CORE/build/scripts
cmake $NUPIC_CORE -DCMAKE_INSTALL_PREFIX=../release -DPY_EXTENSIONS_DIR=$NUPIC_CORE/bindings/py/nupic/bindings

Build

Note: The -j3 option tells Make to run up to 3 jobs in parallel to speed up the build. You can increase this number depending on how many cores your CPU has.

make -j3
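Rather than hard-coding 3, you can size the job count to the machine. This sketch uses nproc, which reports the number of available CPU cores:

```shell
# Derive the parallel job count from the number of CPU cores.
JOBS=$(nproc)
echo "building with $JOBS parallel jobs"
# then, from $NUPIC_CORE/build/scripts:
#   make -j"$JOBS"
```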

Install

Stay in the $NUPIC_CORE/build/scripts directory.

make install

Run Tests

cd $NUPIC_CORE/build/release/bin
./cpp_region_test
./unit_tests

Install the Python Bindings

This exposes the C++ nupic.core library to Python by installing the nupic.bindings Python module.

cd $NUPIC_CORE
python setup.py install --user

NuPIC

Install

cd $NUPIC
python setup.py install --user

Run Tests

First, make sure that py.test is on your path. It got installed into ~/.local/bin in a previous step.

export PATH=~/.local/bin:$PATH

Now run unit tests.

python $NUPIC/scripts/run_nupic_tests.py -u

Note:
If executing that last line generates an error such as “ImportError: No module named _markerlib”, it is likely caused by Ubuntu’s outdated pre-installed pip. Running these three lines should fix the problem.

sudo apt-get install python-pip python-dev build-essential
sudo pip install --upgrade pip
sudo pip install --upgrade virtualenv

Running swarms

To run swarms, a MySQL database is required. If MySQL is already installed and set up with a username and password, update the user and password values accordingly in $NUPIC/config/default/nupic-default.xml. If it is not installed, then simply executing

sudo apt-get install mysql-server

will install MySQL on Ubuntu. During installation, you will be prompted to enter a password for the “root” user; simply press Enter to leave it empty and complete the installation. By default, this creates a MySQL server on “localhost” with the username “root” and an empty password. Once done, you can verify that everything was set up correctly by executing

python $NUPIC/examples/swarm/test_db.py

which should generate some text, with the very last line confirming that the connection test was successful.
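If your MySQL server uses non-default credentials, the entries to change in nupic-default.xml look roughly like this (a sketch; the exact property names may differ, so check them against your copy of the file):

```xml
<property>
  <name>nupic.cluster.database.user</name>
  <value>root</value>
</property>
<property>
  <name>nupic.cluster.database.passwd</name>
  <value>yourpassword</value>
</property>
```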

Note:
To run swarms, the script run_swarm.py must be executed. For this to work, you must either add the $NUPIC/scripts directory to your PATH or invoke the script by its full path every time, as in

$NUPIC/scripts/run_swarm.py $NUPIC/your_folder/yoursearch_def.json --maxWorkers=4
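Alternatively, put the scripts directory on your PATH once so you can invoke the script by name (a sketch; it defaults $NUPIC to $HOME/nupic so the snippet is self-contained):

```shell
# Make run_swarm.py invocable by name from any directory.
export NUPIC="${NUPIC:-$HOME/nupic}"
export PATH="$NUPIC/scripts:$PATH"
# then:
#   run_swarm.py your_search_def.json --maxWorkers=4
```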

More details and an in-depth explanation of swarms and how to run them can be found in the NuPIC swarming documentation.