Using composition and time to explain non-hierarchical connections

I watched episode 16 of Thousand Brains Theory & Hierarchy today. It's a great tutorial.

Then it occurred to me that there might be another way to explain connections that skip hierarchical levels.

The reason for hierarchical structure could be the reuse of shared objects. For example, sentence A is “I am a polar bear.” and sentence B is “Bear is my name.” “Bear” is a shared object in both sentences. Storing it once instead of twice saves space, and it also associates the two sentences through the same sub-object.

Cross-level connections could then be formed by when and where an object is required in a combination. For example, “I” and “am a polar bear” are combined at L4, while “Bear” joins two combinations at L1 and L3.
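To make the sharing concrete, here is a tiny Python sketch (the structures and names are hypothetical, purely for illustration): two sentence structures reference the same “bear” node instead of each holding a copy.

```python
# Hypothetical sketch: two sentences stored as structures that share
# one "bear" node, so the sub-object exists once and is referenced twice.
bear = {"word": "bear"}

sentence_a = ("I", "am", "a", "polar", bear)  # "I am a polar bear."
sentence_b = (bear, "is", "my", "name")       # "Bear is my name."

# Both structures point at the same object, not a copy:
print(sentence_a[4] is sentence_b[0])  # True -- one shared sub-object
```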


I’ve been thinking about this a little bit too. I’ll share some of my thoughts.

You can think of generic problems in software development as trying to find some specific location in some high-dimensional space. Most software is written so that individual variables represent specific locations in that space, but it’s also possible to build software where a variable is a projection of the full space, and its value a low-resolution approximation of a subset of that space. In a sense this is what a Bloom filter does, and it’s part of what gives Bloom filters their weird properties.
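For instance, here’s a minimal Bloom filter sketch in Python (the sizes and hash counts are arbitrary choices, nothing canonical): each bit stands for a region of the key space, and a membership query reads off whether a key falls in all the regions it hashes into.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: each bit approximates membership in a
    subset of the full key space, so the filter as a whole is a
    low-resolution projection of the set it summarizes."""

    def __init__(self, num_bits=256, num_hashes=3):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = [0] * num_bits

    def _positions(self, key):
        # Derive k bit positions from salted hashes of the key.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.num_bits

    def add(self, key):
        for pos in self._positions(key):
            self.bits[pos] = 1

    def might_contain(self, key):
        # False positives are possible, false negatives are not.
        return all(self.bits[pos] for pos in self._positions(key))

bf = BloomFilter()
bf.add("polar bear")
print(bf.might_contain("polar bear"))  # True
print(bf.might_contain("penguin"))     # almost certainly False
```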

Interestingly, SDRs can be thought of in a similar way. Things get complicated once you bring in unions of SDRs and bursting (unfortunately I don’t have time to go into that complexity at the moment), but the simple version is this: the idea that each bit represents some feature of the object can be reinterpreted as meaning that each bit represents the space of all possible objects possessing that feature, and an SDR, being a concatenation of such bits, represents the intersection of all of the spaces corresponding to its “1” bits.
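A toy illustration of that reading (the features and object names here are made up): each “1” bit names a feature space, and the SDR denotes their intersection.

```python
# Each bit position stands for a feature; the set attached to it is
# the space of all objects possessing that feature (names are made up).
FEATURE_SPACES = {
    0: {"polar bear", "penguin", "seal"},   # lives in a polar region
    1: {"polar bear", "grizzly", "panda"},  # is a bear
    2: {"polar bear", "penguin", "swan"},   # is white
}

def interpret(sdr_bits):
    """An SDR denotes the intersection of the spaces of its '1' bits."""
    spaces = [FEATURE_SPACES[i] for i, b in enumerate(sdr_bits) if b]
    return set.intersection(*spaces) if spaces else set()

print(interpret([1, 1, 1]))  # {'polar bear'} -- all three features
print(interpret([1, 0, 1]))  # {'polar bear', 'penguin'} -- more ambiguous
```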

There are some really interesting ideas here, but unfortunately doing them justice would require a very long explanation. Representing spaces rather than explicit objects happens to provide shortcuts for approximating a lot of NP problems, or at least for representing and manipulating such problems as primitives, in much the same way we represent and manipulate numbers. It doesn’t directly allow you to solve them, though it can make them somewhat easier to solve.

The core operations involved are equivalent to intersections and unions of sets (or at least rough approximations of sets), or, put another way, decreasing and increasing ambiguity. In this light, a lot of language, as you explain, can be thought of as composing ideas, and composition both adds and removes ambiguity. I’ve been thinking for a while that merely presenting the brain with several ideas in succession would do this automatically. That lines up well with the properties of Temporal Memory (TM), and would explain not just the example you bring up, but also phenomena like the Kuleshov effect.
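Here’s a rough numpy sketch of those two operations on SDRs (the bit counts and match threshold are arbitrary): a union widens the set of candidate objects, and intersecting with new evidence narrows it again.

```python
import numpy as np

def random_sdr(n=128, active=8, seed=None):
    """Toy SDR: n bits with a small number of active ('1') bits."""
    rng = np.random.default_rng(seed)
    sdr = np.zeros(n, dtype=bool)
    sdr[rng.choice(n, active, replace=False)] = True
    return sdr

cat, dog, fish = (random_sdr(seed=s) for s in (1, 2, 3))

# Union: hold several candidate objects at once (ambiguity goes up).
candidates = cat | dog

def fits(sdr, union, threshold=8):
    # An SDR matches a union if (nearly) all of its bits are present.
    return int((sdr & union).sum()) >= threshold

print(fits(cat, candidates))   # True  -- still a candidate
print(fits(fish, candidates))  # almost certainly False -- ruled out

# New evidence intersects the union and narrows it (ambiguity goes down):
candidates &= dog
print(fits(dog, candidates))   # True -- the surviving interpretation
```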

Also, if I remember correctly, the Pirahã language, which controversially does not appear to support linguistic recursion (thus making the description of hierarchical relationships rather difficult), seems to approximate it through sentence composition.


I just find it hard to represent everything in a high-dimensional space. For example, it is hard to represent a sentence or article without losing information. I am trying to develop a system that could define the meaning of an object by the set of environments the object was in.

For example, the meaning of “apple” could be the time sequences in which a person saw it. Consider the object “apple” as a vocabulary word and its surroundings as a sentence. Then the meaning of “apple” would be a node connecting all the sentences it appeared in (e.g., how it sits on an apple tree, what it looks like).

Of course, I will try to compress overlapping “sentences” to save more space. Instead of matrices and vectors, I think a graph is a better way to represent memory.
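As a rough sketch of that graph idea (all names here are hypothetical), a word’s node can simply be its set of edges to every context it appeared in, with overlapping contexts stored once and shared between words.

```python
from collections import defaultdict

# Hypothetical sketch: a word's "meaning" node is just its set of
# edges to every context (sentence/scene) it has appeared in.
contexts_of = defaultdict(set)   # word -> contexts it was seen in
words_in = defaultdict(set)      # context -> words (reverse edges)

def observe(context_id, words):
    for w in words:
        contexts_of[w].add(context_id)
        words_in[context_id].add(w)

observe("scene:orchard", ["apple", "tree", "red"])
observe("scene:kitchen", ["apple", "knife", "red"])

# The "meaning" of apple: every environment it was seen in.
print(contexts_of["apple"])  # {'scene:orchard', 'scene:kitchen'}

# Overlapping contexts are stored once and shared, saving space:
print(contexts_of["apple"] & contexts_of["red"])  # both scenes
```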