Has anyone made a generic sensorimotor app with nupic?

I’m not asking about a perfect instantiation of the still-theoretical ideas, but about something simple: the most naive, most generic sensorimotor app possible.

The diagram I’m going to share is laughably simple, but perhaps it gives some idea, at the highest possible level, of what kind of thing I’m asking about.

Does this sort of thing exist, even the simplest version of what it might look like?

I think the key to making the simplest version is to restrict it to environments that don’t change unless the agent does something, and that are fully observable: there are no hidden variables, so the current state of the environment is all you need to know to determine your location in a theoretical Markov map of it. I believe that’s called the Markov property.
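To make that concrete, here’s a minimal sketch of the kind of environment I mean (a toy `GridWorld` class I made up, not from any library): the state never changes unless the agent acts, and `observe()` returns the complete state, so the current observation alone satisfies the Markov property.

```python
class GridWorld(object):
    """A 4x4 grid the agent can walk around in. Fully observable and
    deterministic: nothing ever changes unless step() is called."""

    def __init__(self, width=4, height=4):
        self.width, self.height = width, height
        self.x, self.y = 0, 0

    def actions(self):
        return ["up", "down", "left", "right"]

    def step(self, action):
        # The one and only way the environment changes.
        dx, dy = {"up": (0, -1), "down": (0, 1),
                  "left": (-1, 0), "right": (1, 0)}[action]
        self.x = min(max(self.x + dx, 0), self.width - 1)
        self.y = min(max(self.y + dy, 0), self.height - 1)
        return self.observe()

    def observe(self):
        # The full state of the world -- no hidden variables.
        return (self.x, self.y)
```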

It seems to me that if your environment conforms to these simple rules, you might be able to create an adequate sensorimotor inference engine with the technology we have today in NuPIC, or perhaps with a combination of other technologies.
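As a rough illustration of what I’m imagining (just a sketch, assuming the legacy Python 2 `nupic` package and a toy hash-based encoder of my own rather than NuPIC’s real encoders): encode each (sensation, motor command) pair as an SDR, feed the stream into a TemporalMemory, and treat its predictive cells as the agent’s guess about what it will sense next.

```python
import random

from nupic.algorithms.temporal_memory import TemporalMemory

NUM_COLUMNS = 2048
SPARSITY = 40  # active columns per input

def encode(sensation, action):
    """Toy encoder: deterministically map a (sensation, action) pair
    to a sparse, sorted set of column indices."""
    rng = random.Random(hash((sensation, action)))
    return sorted(rng.sample(range(NUM_COLUMNS), SPARSITY))

tm = TemporalMemory(columnDimensions=(NUM_COLUMNS,), cellsPerColumn=16)

env = GridWorld()                 # the environment sketched above
sensation = env.observe()
for _ in range(1000):
    action = random.choice(env.actions())         # random exploration
    tm.compute(encode(sensation, action), learn=True)
    sensation = env.step(action)                  # world changes only here
    # tm.getPredictiveCells() now holds the model's expectation of the
    # next (sensation, action) encoding -- the beginnings of inference.
```

I’m not claiming this is how Numenta’s sensorimotor work actually does it; it’s just the simplest loop I can picture.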

So I wonder, has anyone done this sort of thing?
