The brain is unlike a computer in that a computer is very precise (i.e. if you change one bit in a byte of computer memory it changes the whole meaning of the representation, whereas in the brain it only slightly changes the meaning). The brain tends to be imprecise because imprecision facilitates flexibility, and SDRs are a great example of that flexibility. However, sequence memory in computers tends to be quite rigid. The shifting example above encodes ABCD in that exact sequence. ABCD can be content-accessible by providing incomplete sequences such as A-C-, AB--, -B-D, or A--D (where - is a placeholder for any symbol), but the exact order needs to be maintained (A then B then C then D). ABCD will have little overlap with AABCCD or A-BC-D even though our brains still perceive them as similar. How could this invariance be applied when processing a sequence of SDRs?
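To make that rigidity concrete, here is a minimal sketch (function name and wildcard convention are my own, not from any library) of a sequence memory that is content-accessible via incomplete cues, yet only when the exact order is preserved:

```python
def matches(cue, stored):
    """Return True if the cue (with '-' as a wildcard for any symbol)
    fits the stored sequence symbol-for-symbol, in exact order."""
    return len(cue) == len(stored) and all(
        c == '-' or c == s for c, s in zip(cue, stored)
    )

stored = "ABCD"
print(matches("A-C-", stored))  # True: order preserved
print(matches("-B-D", stored))  # True
print(matches("ACBD", stored))  # False: same symbols, wrong order
```

Note that ACBD fails even though it contains exactly the same symbols; this is the order-rigidity the rest of this post tries to soften.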
A sequence can be spread out across a binomial tree, breaking the sequence into parts.
In first-order memory A activates (or ‘maps to’) B, then B activates C, then C activates D. At the next level of the tree AB activates BC, then BC activates CD, and the same applies further up the tree. This is lateral activation. There is also vertical activation, wherein if A and B activate in that exact sequence then the node above (AB) will be activated; then if AB and BC are activated in that exact order, the ABC node above them will be activated. The core idea here is that a sequence is made up of discrete sub-sequences.
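The vertical-activation rule can be sketched as building the tree row by row; each node fires when its two (overlapping) children fire in order. This is a toy rendering with strings standing in for SDRs, and `build_tree` is a hypothetical name:

```python
def build_tree(seq):
    """Row 0 holds the individual symbols; each higher row combines
    adjacent overlapping nodes from the row below (AB + BC -> ABC),
    up to the full sequence at the apex."""
    rows = [list(seq)]
    while len(rows[-1]) > 1:
        below = rows[-1]
        # Vertical activation: a node forms when both children occur in order.
        rows.append([below[i] + below[i + 1][-1] for i in range(len(below) - 1)])
    return rows

for row in build_tree("ABCD"):
    print(row)
# ['A', 'B', 'C', 'D']
# ['AB', 'BC', 'CD']
# ['ABC', 'BCD']
# ['ABCD']
```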
The sub-sequences in the tree can be compared to those in another tree to find overlap.
There is enough overlap in AABCCD to fulfil the ABCD sequence.
As BC and CD in the ABCD tree overlap with BC and CD in the AABCCD tree, they activate BCD, which in turn activates ABCD.
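Comparing just the first rows of the two trees already shows the shared sub-sequence nodes (`bigrams` is a hypothetical helper for row 1 of the tree):

```python
def bigrams(seq):
    """Row 1 of the tree: adjacent pairs, in order."""
    return [seq[i:i + 2] for i in range(len(seq) - 1)]

a, b = bigrams("ABCD"), bigrams("AABCCD")  # AABCCD also yields AA and CC
shared = [g for g in a if g in b]
print(shared)  # ['AB', 'BC', 'CD'] -- every first-row node of ABCD survives
```

All three of ABCD's first-row nodes are present in the AABCCD tree, which is why there is enough vertical activation to reach the ABCD apex.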
Let’s take a less similar sequence: ABVVVCD. ABCD does exist in this sequence; it’s just that AB and CD are separated by the sub-sequence VVV. (Even upon reading ABVVVCD as a human, it might not be immediately obvious that ABCD is in the sequence. This is simply because there is slightly less sequential similarity/overlap.)
Overlaps are only found on the first row of the tree. In an ABCD tree there is not enough vertical activation to fully activate ABC and BCD; however, they could be partially activated, then combine vertically to partially activate ABCD. This partial activation is ideal as it shows that the match is not 100%. It is analogous to the activation frequency of an SDR: high similarity gives a high frequency, low similarity a low frequency. In this case, if VVV were already a known sub-sequence then it would activate with a higher frequency than ABCD. In a competitive population of cells with lateral inhibition, the VVV representation will inhibit the ABCD representation because it has the higher frequency. However, if you had an attentional bias for ABCD (i.e. if you were looking for it) then top-down activation would increase the frequency of ABCD, allowing it to inhibit VVV.
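A crude stand-in for that activation frequency is the fraction of a target's first-row nodes found in the observed sequence (names here are mine; this ignores lateral inhibition and attention, which would be applied on top):

```python
def bigrams(seq):
    return [seq[i:i + 2] for i in range(len(seq) - 1)]

def activation(target, observed):
    """Fraction of the target's first-row nodes present in the observed
    sequence -- a toy proxy for the firing frequency of its representation."""
    nodes = bigrams(target)
    observed_nodes = bigrams(observed)
    hits = sum(g in observed_nodes for g in nodes)
    return hits / len(nodes)

print(activation("ABCD", "ABVVVCD"))  # partial: only AB and CD match (2/3)
print(activation("VVV", "ABVVVCD"))   # full: 1.0, so VVV out-competes ABCD
```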
I will compare one more sequence that is very dissimilar to ABCD. The sequence AVBFCSD contains ABCD, even though it is far from obvious. As a human I would not have spotted ABCD immediately, even if I were looking for it. However, using another method ABCD can still be detected.
As seen above, there is no direct overlap in any of the nodes. However, take the first row of the AVBFCSD tree and OR each node into a union. Then build another union from the first row of the ABCD tree. Comparing the two unions with an overlap/AND operation reveals the AB, BC, and CD nodes. The same procedure can be applied to all the rows of multiple trees simultaneously. Using this method should ideally result in a low-frequency activation of the ABCD nodes.
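Why does the AND of the unions reveal AB, BC, and CD when no individual nodes match? Because each node encodes its children shifted by position, so the union of AVBFCSD's first row contains A-in-first-position (from AV), B-in-second-position (from VB), and so on. A sketch, modelling the shifted bits as (position, symbol) pairs (all names hypothetical):

```python
def node_bits(pair):
    """A node's encoding: each child's 'SDR bits' shifted by its
    position within the node, modelled as (position, symbol) pairs."""
    return {(i, s) for i, s in enumerate(pair)}

def row_union(seq):
    """OR every first-row (bigram) node of the sequence's tree together."""
    union = set()
    for i in range(len(seq) - 1):
        union |= node_bits(seq[i:i + 2])
    return union

# AND the two unions, then check which of ABCD's nodes survive intact.
overlap = row_union("AVBFCSD") & row_union("ABCD")
revealed = [g for g in ("AB", "BC", "CD") if node_bits(g) <= overlap]
print(revealed)  # ['AB', 'BC', 'CD']
```

AV contributes A at position 0 and VB contributes B at position 1, so the AB node is fully covered by the overlap even though AB itself never appears as a node in the AVBFCSD tree.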
I have yet to work out the implementation details of all this. The basic idea is that each node of the tree encodes its child nodes by bit-shifting them into a union. This makes comparisons with other nodes/trees quite simple, as nodes can be reverse-shifted back into their original SDRs, maintaining full semantic value.
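A minimal sketch of that encoding, assuming each symbol is a small fixed-width bit pattern (these toy 8-bit values stand in for real SDRs, which would be much wider and sparse):

```python
WIDTH = 8  # bits per child SDR (toy size)
SDR = {"A": 0b00000011, "B": 0b00001100, "C": 0b00110000, "D": 0b11000000}

def encode(children):
    """Bit-shift each child's SDR into a single union word."""
    word = 0
    for i, c in enumerate(children):
        word |= SDR[c] << (i * WIDTH)
    return word

def decode(word, n):
    """Reverse-shift a node back into its n child SDRs -- no semantics lost."""
    mask = (1 << WIDTH) - 1
    return [(word >> (i * WIDTH)) & mask for i in range(n)]

ab = encode("AB")
print(decode(ab, 2) == [SDR["A"], SDR["B"]])  # True: round-trips exactly
```

Because the encoding is a lossless pack rather than a lossy hash, node-level comparisons (AND of two words) still operate on the children's actual SDR bits.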