Where is the sensorimotor code?


I’ve watched a couple videos of Jeff Hawkins talking about their work on sensorimotor inference and also allocentric object representations. Is there any code associated with these experiments?

I can see how it might work, but the biggest question I have is how to do the coordinate transformations in an HTM way. I’ve read some modeling papers where they shifted the expected inputs of visual fields by feeding in a motor efferent copy from an intended saccade. Of course, this was a very restrictive model for a very specialized application. I’d like to know the more general approach to doing coordinate transformations.

I presume that it would be represented by a number of “paths” that a predicted sequence could take, and then the particular motor efferent copy that it receives dictates the sequence that will be predicted for the upcoming inputs. Similarly, for a particular sequence of inputs, you could back out a motor prediction.
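To make that intuition concrete, here is a toy sketch, assuming nothing about Numenta’s actual implementation: the same sensed feature has several possible successors (“paths”), and the motor efferent copy selects which one is predicted; running the same mapping backwards backs out a motor prediction from an observed sensory sequence. All feature names and motor commands here are invented for illustration.

```python
# Hypothetical sketch: (currently sensed feature, motor efferent copy)
# -> feature predicted next. The object layout and command names are
# made up; this is not Numenta's representation.
transitions = {
    ("edge", "move_right"):  "corner",
    ("edge", "move_left"):   "flat",
    ("corner", "move_left"): "edge",
}

def predict_next(feature, motor_copy):
    """The motor efferent copy picks which 'path' the prediction follows."""
    return transitions.get((feature, motor_copy))

def infer_motor(feature, next_feature):
    """Back out the motor prediction from a sensed pair of features."""
    for (f, motor), successor in transitions.items():
        if f == feature and successor == next_feature:
            return motor
    return None
```

The point of the sketch is only that the same table serves both directions: motor copy plus current input predicts the next input, and two successive inputs predict the motor command.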

But these are allocentric predictions of motor signals. I think I would be happy with just that without having to worry about the transform between allocentric and egocentric coordinate frames.

Any experiments or code along these lines? I’ve looked in the repositories and I can’t find anything obvious.

Help with an HTM implementation

It has not really been written yet. I say “really” because, while there may be some proofs of concept in our research repository, nothing is official. There are a few things to work out before really writing code. I dare say the coding is the easy part.

By the way, we don’t treat our research codebase the same way we treat NuPIC. See the README to find out why. Any new sensorimotor codebases will end up there eventually. But nothing there is official.

When it is ready, the sensorimotor codebase will be moved to (or re-implemented in) https://github.com/numenta/nupic. But we don’t have a timeline. I am working to get NuPIC into a 1.0 state before this happens, but I know it is coming. :wink:

When the new codebase is ready for applications to be built on top of it, we will release NuPIC 2.0. I’ll be talking about this more in the future.


How the “location” is represented and calculated is a fascinating problem. Grid cells in the entorhinal cortex solve a similar problem. Over the past few weeks we have been reading and studying what is known about these cells. It has given us some fascinating insights into how neurons do this in an “HTM way”. I don’t have time to summarize it here, but if this topic interests you, you might want to read about grid cells. Marcus and Yuwei have been searching the literature. They might have some paper recommendations if you need them.


Thanks Matt,

I was pretty sure there was no official release, but I’m certain you guys would have done some experiments by now. I’m very curious to see what a 3D spatial object representation will look like, done the HTM way. Any evidence of this?

  • Jacob



Thanks for the pointer. I was aware of place cells, but not grid cells. I would definitely be interested in any paper references.

It seems like place and grid cells form a conceptual hierarchy of location in space. Place cells record the “place” you’re in like a living room, a cave, a nook, a turn in the road, while grid cells record the actual position within that place.

Do grid cells record position in 3D as well? It’s not clear from the Wikipedia article I read what the representation looks like, only that the cells are active under certain conditions.

  • Jacob


Place cells fire when an animal is at one particular location in an environment. E.g. as you move around a room different place cells fire for each location within the room. If you remembered a sequence of place cell activations you would remember your path through the room.

A grid cell fires at multiple locations within an environment. To determine where you are you have to look at multiple grid cells.

There is no reason grid cells can’t represent high-dimensional spaces. If this is new to you, I would start with Wikipedia, which has good entries and references.
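A minimal numeric illustration of the point above (a 1-D toy, not a model of real grid cells): each grid module fires periodically, so any one module is ambiguous about location, but modules with different (here coprime) periods jointly pin down a unique position over a much larger range. The periods and positions are arbitrary choices for the example.

```python
def active_phase(position, period):
    # Which cell in a grid module of `period` cells fires at this
    # position. A single cell fires at every position sharing its
    # phase, so one module alone cannot tell those positions apart.
    return position % period

def decode(phases, periods):
    # Brute-force search for the position consistent with the active
    # phase of every module. With coprime periods this is unique below
    # their product (Chinese remainder theorem).
    limit = 1
    for p in periods:
        limit *= p
    for pos in range(limit):
        if all(pos % per == ph for ph, per in zip(phases, periods)):
            return pos
    return None

periods = [3, 5, 7]   # three toy modules -> unique range of 3*5*7 = 105
phases = [active_phase(52, p) for p in periods]
```

A module with period 3 confuses positions 1, 4, 7, …, but the ensemble disambiguates them, which is the sense in which “to determine where you are you have to look at multiple grid cells.”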


@jacobeverist There is evidence that flying mammals (bats in particular) do indeed exhibit grid cells that tessellate 3D space. Nachum Ulanovsky’s lab is the research group to watch for bat-related hippocampal work.


I don’t have time to summarize it here, but if this topic interests you, you might want to read about grid cells. Marcus and Yuwei have been searching the literature. They might have some paper recommendations if you need them.

Hi People,
Is it possible Marcus or Yuwei could provide us with the papers’ links?
I am very interested in the topic too.


@jhawkins, did you post the code you mentioned in the “How Sensorimotor Inference Works (Part 3)” video? You mentioned that you built the sensorimotor code and tested it extensively. I’d like to see what that looks like if you’re ready to share.

  • Jacob


In the past we have posted our research code, even in its raw and changing-daily form. I don’t know if we have done that here. I will ask.


Hi, here’s a delayed response! Here are two good reviews.


Hi mrcslws,

thanks a lot for the links!

Excuse my delay too. It seems I erased the email by mistake without noticing your reply. Then, when I entered the forum to re-check the topic, I found this great surprise!

Thank you very much.




@jacobeverist check out htmresearch/frameworks/layers/l2_l4_inference.py and all the files using it:

~/nta/htmresearch  (master)
$ git grep -n l2_l4_inference

especially htmresearch/projects/l2_pooling/infer_hand_crafted_objects.py. Cheers.


A post was split to a new topic: Free will Volition module


Can I get some hints on how to get the htmresearch repo installed and working? I’ve been trying with no success. I thought I should build htmresearch_core, then nupic, then htmresearch, based on a comment from Austin a while back. It seems my problem lies with htmresearch_core: I can’t import things from it in the Python interpreter (e.g., from htmresearch_core import algorithms fails), even though I followed the ‘install from sources’ instructions and the install appeared successful. Oddly, if I say help(htmresearch_core), the package only contains:

dummy proto (package)

(and from htmresearch_core import proto works)

I know htmresearch installation isn’t supported but maybe I can bribe people with a delicious lunch or dinner offer on me. think about it folks :wink:


@mccall.ryan - htmresearch-core now just provides the research-related code, so it should be installed alongside nupic.bindings. The nupic.bindings.* modules will come from the nupic.bindings package but will no longer include the “nupic.bindings.experimental” extension; that will be in the htmresearch_core namespace.


That’s helpful and does clear things up. The problem I’m having now is that htmresearch-core somehow hasn’t installed properly (from the looks of it), because I can’t import from it. I followed the htmresearch-core README:

pip install -r bindings/py/requirements.txt --user
pip install pycapnp==0.5.8
python setup.py install --user
python

>>> from htmresearch_core.experimental import ExtendedTemporalMemory
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named experimental

Perhaps I should follow the Developer Installation instead of the simple installation…


@mccall.ryan Did you have any success?


Hi Thanh Binh, I tried installing htmresearch-core the pip way and also the developer way by building the C++. Both ways appeared to work successfully, but I cannot import things from htmresearch_core that htmresearch depends on; e.g., I couldn’t import ExtendedTemporalMemory. Maybe the README is incomplete, since it looks like a copy of the nupic-core README and I have to change some arguments to adapt the instructions to htmresearch_core.


I have the same problems as you. I have tried different ways without any success. Could you Numenta guys please help us?