Ogma's Sparse Predictive Hierarchies

@cezar_t

SPH, HTM, and Sparsey (official website/code) are the pioneering exemplars of what I call Binary Pattern Neural Networks (BPNNs), with Sparsey being the first. The description of BPNNs below is from my draft paper (which I have yet to complete).

BPNNs generally have the following properties: 1) they receive input in the form of binary vectors, 2) they use a form of Winner-Take-All (WTA) computation to select which neurons to activate, and 3) the neurons have a binary activation function for output. Implementations of BPNNs differ in how neuron activation is implemented, how the network learns, and how the network is architected. BPNNs are not to be confused with Binary Neural Networks (BNNs) [?], which are traditional ANNs with activation functions that transform the underlying scalar weights and neuron states into binary values. Unlike BNNs, BPNNs natively operate on binary states.
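To make those three properties concrete, here is a minimal sketch of a single BPNN-style layer in Python/NumPy. It is my own illustrative code, not any actual SPH/HTM/Sparsey implementation; the class name, the k-WTA selection, the Hebbian-style update, and all parameters are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

class BinaryWTALayer:
    """Illustrative layer: binary input, k-WTA selection, binary output."""

    def __init__(self, n_inputs, n_neurons, k):
        self.k = k  # number of winners allowed to activate
        # Scalar weights are internal bookkeeping only; inputs and outputs stay binary.
        self.weights = rng.random((n_neurons, n_inputs))

    def forward(self, x):
        # x is a binary (0/1) vector; overlap measures how well each neuron's
        # receptive field matches the active input bits.
        overlap = self.weights @ x
        winners = np.argsort(overlap)[-self.k:]   # k-Winner-Take-All selection
        y = np.zeros(self.weights.shape[0], dtype=np.uint8)
        y[winners] = 1                            # binary activation
        return y

    def learn(self, x, y, lr=0.1):
        # Hebbian-style reinforcement applied to winners only (illustrative rule).
        for i in np.flatnonzero(y):
            self.weights[i] += lr * (x - self.weights[i])

layer = BinaryWTALayer(n_inputs=64, n_neurons=32, k=4)
x = (rng.random(64) < 0.1).astype(np.uint8)  # sparse binary input vector
y = layer.forward(x)
layer.learn(x, y)
print(y.sum())  # exactly k neurons are active
```

Note that the scalar weights here only mediate the WTA competition; only binary vectors cross the layer boundary, which is what distinguishes a BPNN from a BNN that binarizes an otherwise scalar network.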

Given the previous discussion on clusterons, I think treating dendrites as first-order computational objects should also fall under this umbrella definition. The power of research in this area lies in the clear, visual exploration of distributed representation and computation over discrete information packets. This is in contrast to the ANN approach, which linearizes computation and embeds information into vector spaces. I’m starting to strongly believe that the availability of mature linear algebra tooling and mathematics for the latter is severely inhibiting scientific advancement in artificial cognition.

What’s missing for BPNNs, and why HTMs seem to have stalled, is a clear theoretical framework for how information is represented and transformed through a BPNN. Without that theory, connecting spatial poolers and temporal sequence memories to produce some effect is like wiring black boxes together to see what happens. When you fail, there’s no way to understand why you failed or how to improve without that theoretical understanding.

I think I have part of this theory, but it’s still a long way from explaining what’s happening, and how to get desired effects.
