Clone-structured cognitive graphs (CSCG)

I do not know if everyone is familiar with Dileep George’s paper “Clone-structured graph representations enable flexible learning and vicarious evaluation of cognitive maps”.

In any case, he recently gave a seminar at the Simons Institute, “Space is a latent sequence: a unifying theory of representations and remapping in the hippocampus”, where he presented the main concepts from this paper.

According to Dileep, clone-structured cognitive graphs (CSCGs) offer a coherent explanation for place cells, place cell remapping, boundary vector cells, and almost all phenomena connected with place cells, except perhaps grid cells.

I found his model very compelling.

At the beginning of his talk I thought there was a strong similarity to HTM, but it later became clear that this is not so. Though they may share some genes, HTM and CSCG model completely different phenomena.

I wonder if CSCG could be the underlying mechanism for the Thousand Brains Theory (TBT) frames of reference.

6 Likes

Interesting article; I will have to read this a few times, as parts of it look similar to an approach I have been trying out. The splitting at t=1 in Fig. 1(f) is an interesting process, as it is what I would think of (in language terms) as, say, b = “full”, c = “cup”, d = “mug”, e = “coffee”, where the model is really indifferent to the sensory stream (i.e. c could be “cup” and e the sensory feeling of a warm object, such as warmth sensed on a fingertip).

I would tend to think of the splitting as occurring by virtue of the number of relevant parallel streams varying through a memory. That is, if you start off with one sensory input and then experience two senses in parallel, this sudden change requires temporal branching or splitting. The difference is that those branches then have their own relative temporal relevance and are not time-locked to each other.
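For concreteness, here is how I read the clone structure behind that splitting, as a minimal sketch (the token strings, clone count, and random transitions are my own illustrative assumptions, not code from the paper): each observed token gets several hidden “clone” states, and it is the learned transitions between clones, rather than the observations themselves, that carry the context.

```python
import numpy as np

# Illustrative clone-structured setup (not the authors' code):
# each observation symbol is assigned a fixed block of hidden "clone" states.
observations = ["full", "cup", "mug", "coffee"]
clones_per_obs = 3  # assumed; in the paper this is a free capacity parameter

n_obs = len(observations)
n_states = n_obs * clones_per_obs

def clone_ids(obs_index):
    """Hidden-state indices reserved for one observation symbol."""
    start = obs_index * clones_per_obs
    return np.arange(start, start + clones_per_obs)

# Deterministic emission: a clone can only emit the symbol it belongs to.
emission = np.zeros((n_states, n_obs))
for o in range(n_obs):
    emission[clone_ids(o), o] = 1.0

# Transitions between clones are what encode context; initialised randomly
# here, they would normally be learned from observed sequences (e.g. with EM).
rng = np.random.default_rng(0)
transition = rng.random((n_states, n_states))
transition /= transition.sum(axis=1, keepdims=True)

# Two different clones of "cup" can sit in different learned sequences,
# e.g. one reached after "full" and another reached after "coffee",
# even though both emit exactly the same observation.
print(emission[clone_ids(observations.index("cup"))])
```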

2 Likes

Does ‘Vicarious AI’ have a forum like this one?

1 Like

My (purely subjective) opinion is that Vicarious AI is a purely commercial endeavour, not particularly interested in developing a community or allowing free use of their work. Even the code published in support of the CSCG paper on GitHub carries what is essentially a “keep your hands off” sign.

BTW, Vicarious AI has just been acquired by Alphabet (Google’s parent company).

2 Likes

I take back my earlier comment that CSCG and HTM are not similar.

If the columns of CSCG nodes are implemented by HTM mini-columns, with a one-hot vector as input rather than an SDR, and, furthermore, synapses are allowed between adjacent neurons in the mini-column, then the resulting HTM Temporal Memory should be able to implement most of the features of CSCG.
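Roughly what I have in mind, as a toy sketch (the class, parameter names, and winner-selection rule are my own placeholders, not from any HTM library):

```python
import numpy as np

class CloneColumnMemory:
    """Toy sketch: HTM-style mini-columns whose cells act as CSCG clones.

    One mini-column per symbol (one-hot input instead of an SDR); each cell
    in a column is a clone of that symbol, and lateral "synapses" between
    cells active at consecutive time steps are stored as transition counts.
    """

    def __init__(self, n_symbols, cells_per_column=4):
        self.cells_per_column = cells_per_column
        n_cells = n_symbols * cells_per_column
        # counts[i, j]: how often cell i at time t-1 preceded cell j at time t
        self.counts = np.zeros((n_cells, n_cells))
        self.prev_cell = None

    def _column_cells(self, symbol):
        start = symbol * self.cells_per_column
        return np.arange(start, start + self.cells_per_column)

    def step(self, symbol):
        """Feed one one-hot symbol; pick and strengthen a clone cell for it."""
        cells = self._column_cells(symbol)
        if self.prev_cell is None:
            winner = cells[0]
        else:
            scores = self.counts[self.prev_cell, cells]
            if scores.max() > 0:
                # A clone is already predicted from this context: reuse it.
                winner = cells[np.argmax(scores)]
            else:
                # Unseen context: recruit the least-used clone, which is what
                # splits one symbol into several context-specific clones.
                usage = self.counts[:, cells].sum(axis=0)
                winner = cells[np.argmin(usage)]
            self.counts[self.prev_cell, winner] += 1.0
        self.prev_cell = winner
        return winner

# Usage: the same symbol ends up represented by different cells (clones)
# depending on what preceded it, which is the CSCG-like context splitting.
mem = CloneColumnMemory(n_symbols=4, cells_per_column=4)
for s in [0, 1, 2, 1, 3, 1]:
    print(s, mem.step(s))
```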

If I am right, then this could be an elegant way to incorporate spatial and conceptual reference frames into cortical columns. I hope to test this idea.

4 Likes

Do you have any updates on testing this idea?
We are working on a probabilistic extension of the HTM model in a way very similar to CSCG. One of the differences is indeed using an SDR instead of a one-hot vector. The other difference is that our model learns fully online, without the backward pass over future information (required by the Baum-Welch belief-propagation algorithm) used in CHMM/CSCG. We have a preprint that broadly describes our approach; however, I must say it is pretty raw at the moment.
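As a toy illustration of that online-versus-Baum-Welch distinction (not the actual algorithm from the preprint; the counting rule and names are placeholders, and it assumes the active clone at each step is already known from some winner-take-all filtering):

```python
import numpy as np

# Toy contrast: an online learner updates clone-to-clone transition
# statistics from the current step only, with no Baum-Welch backward pass.

n_states = 12                            # e.g. 4 symbols x 3 clones each
counts = np.ones((n_states, n_states))   # Laplace-smoothed transition counts

def online_update(prev_state, state, lr=1.0):
    """Increment only the single transition taken at this time step."""
    counts[prev_state, state] += lr

def transition_probs():
    """Current point estimate of the transition matrix."""
    return counts / counts.sum(axis=1, keepdims=True)

# Streaming usage: each step is consumed once and never revisited, unlike
# EM/Baum-Welch, which re-estimates every step using future observations.
stream = [0, 4, 8, 5, 11, 4]
for prev, cur in zip(stream, stream[1:]):
    online_update(prev, cur)
print(transition_probs()[0])
```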

4 Likes

At the time I hit a wall with the repeating-pattern limitation of HTM, so I temporarily put it aside to deal with other problems and, unfortunately, forgot about it. Thanks for reminding me :slight_smile:
Thanks also for the preprint. It looks interesting.

2 Likes

Have you considered habituation? That is, a local factor that reduces a cell’s response based on the number of its recent responses.
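Something like this toy sketch is what I mean (the decay rate and scaling are arbitrary choices):

```python
import numpy as np

# Toy habituation: each cell's response is scaled down by a local trace of
# how often it has fired recently.
class Habituation:
    def __init__(self, n_cells, decay=0.9, strength=0.5):
        self.trace = np.zeros(n_cells)  # recent-activity trace per cell
        self.decay = decay
        self.strength = strength

    def apply(self, responses):
        """Dampen responses of recently active cells, then update the trace."""
        damped = responses / (1.0 + self.strength * self.trace)
        self.trace = self.decay * self.trace + (responses > 0)
        return damped

# Repeatedly firing cells respond less and less on each presentation.
hab = Habituation(n_cells=4)
for _ in range(3):
    print(hab.apply(np.array([1.0, 0.0, 1.0, 0.0])))
```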

That would be counter-productive to a CSCG implementation.

1 Like