Triadic Memory — A Fundamental Algorithm for Cognitive Computing

I found this interesting paper on the whole subject of associative / sparse distributed memory implementation. It also seems to be optimized for SDRs, without using that acronym.

How does the brain store and compute with cognitive information? In this research report, I revisit Kanerva’s Sparse Distributed Memory
theory and present a neurobiologically plausible implementation, founded on combinatorial connectivity patterns of large neural
assemblies. This type of neural network gives rise to a new efficient algorithm for content-addressable memory. Extending this new kind of
memory to store cross-associations (i.e. triples of items) leads to the discovery of the triadic memory algorithm, which turns out to be a
powerful building block for cognitive computing and possibly the long sought-for universal algorithm of human intelligence. As proof of
concept, I develop a semantic triplestore, analogy-finding operators, autoencoders for sparse hypervectors, and a basic temporal memory.
Software implementations are included in this report for reference.


The idea is intriguingly simple; I managed to write a simple Python class. Not thoroughly tested. SDRs are in sparse format (a list of ON bit indices).
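For reference, here is a minimal sketch of such a class, based on my reading of the paper's counter-cube formulation. The class and method names are my own, and the cube is kept small here (the paper's example uses N = 1000):

```python
import numpy as np

class TriadicMemory:
    """Minimal sketch: an N x N x N cube of counters.
    SDRs are passed in sparse format (lists of ON-bit indices)."""

    def __init__(self, n=100, p=10):
        self.n = n  # SDR dimension
        self.p = p  # number of ON bits per SDR ("solidity")
        self.w = np.zeros((n, n, n), dtype=np.uint8)

    def store(self, x, y, z):
        # increment the counter of every (x, y, z) bit combination
        self.w[np.ix_(x, y, z)] += 1

    def query(self, x, y):
        # sum counters over the given x and y bits,
        # then keep the p strongest z bits
        sums = self.w[np.ix_(x, y)].sum(axis=(0, 1))
        return sorted(int(i) for i in np.argsort(sums)[-self.p:])
```

Querying the x or y slot works the same way, summing over the other two axes instead.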


Adding 2 lines to your code:

    X2 = [10,11,12,13,14,15,16,17,18,20]
    print(tm.query(X2, Y))

Then X2 query returns:
[200 201 202 203 204 205 206 207 208 330]

Expected Z:
Z = [200,201,202,203,204,205,206,207,208,209]

So we get mostly Z as the result, but a random value at the non-matched offset (330). Or is this some sort of bitwise merge?
Is this what you would expect?

330 is out of range for a uint8 (on init), which makes it easy to spot. Is this actually a feature?

Also, out of interest, how did you find this paper? What was it linked from?

Stored vectors cannot have repeated elements.

e.g. swap 11 for a second 10

    X3 = [10,10,12,13,14,15,16,17,18,19]

Will not retrieve the same sequence on query. Instead you’ll get:
X3 = [10,12,13,14,15,16,17,18,19,329]

Looks like another feature. Not a show stopper at all though.

Thanks for posting this! Just signed up here to drop a comment…

Yes, the algorithm operates on SDRs. As an associative memory, it stores a superposition of SDR triples. One part of a triple can be queried by giving the other two parts.

Combining two such memories in a feedback circuit, one can build a simple temporal memory which behaves like the HTM temporal pooler.

By the way, the paper also describes an efficient implementation of a classical hetero-associative SDM. If there’s a practical need for this, I’d be happy to share the code.


Please do.

@cezar_t Thanks for sharing your code earlier, also.

I think I stumbled onto this in the r/MachineLearning subreddit.

The author’s GitHub link is at the end of the PDF. I tried a Python version since he had only C and Mathematica code.


What I find quite cool is the simple storage/retrieval of triple associations/relationships.

We had some discussion here about how to encode two terms in a single SDR and use an associative query to find the third, missing term of the triple. That, however, would involve some overlap + permutation (or rotation) trickery to pack order semantics into a single, “thicker” SDR.

But here all three terms are “clean” and with a single store operation of [A,B,C] sequence, one can query either of:

“What follows [A,B] ?”
“What precedes [B,C]?”
“What is supposed to be in between [A,C]?”
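Those three query directions can be illustrated with a toy, dict-based version (my own sketch with hypothetical names, not the author's implementation, and far slower than the reference counter cube):

```python
from collections import Counter
from itertools import product

class TripleStore:
    """Counts (i, j, k) ON-bit combinations of stored triples; any one slot
    can be recovered from the other two by summing counts per candidate bit."""

    def __init__(self, p):
        self.p = p  # ON bits per SDR
        self.w = Counter()

    def store(self, x, y, z):
        for ijk in product(x, y, z):
            self.w[ijk] += 1

    def _best(self, sums):
        # the p bits with the highest summed counts, in ascending order
        return sorted(sorted(sums, key=sums.get)[-self.p:])

    def query_z(self, x, y):  # "What follows [A, B]?"
        sums = Counter()
        for (i, j, k), c in self.w.items():
            if i in x and j in y:
                sums[k] += c
        return self._best(sums)

    def query_x(self, y, z):  # "What precedes [B, C]?"
        sums = Counter()
        for (i, j, k), c in self.w.items():
            if j in y and k in z:
                sums[i] += c
        return self._best(sums)

    def query_y(self, x, z):  # "What is in between [A, C]?"
        sums = Counter()
        for (i, j, k), c in self.w.items():
            if i in x and k in z:
                sums[j] += c
        return self._best(sums)
```

A single store of (A, B, C) then answers all three questions.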


First off, my Python version reflects a rough understanding and doesn’t match the author’s algorithm. There are a couple of edge cases which need to be handled.

You can try his C reference version which compiles immediately.


Ah ok.

I found that you cannot store more than about 50 (uint8, 10-digit) random vectors before the x,y,z store fails and the value z cannot be retrieved intact. It appears to be data-dependent, because it varies by (random) run.

Ok, this does not work at all as I expected. I’m missing something obvious.

The test is:

  • store (a,b,c) test (a,b,_ ) [expect c]
  • store (b,c,d) test (a,b, _) [expect c again]

c comes back with two different values.

What is the expectation here please?

This is the output from the C version:

./triadicmemory 1000 0
{39 54 93 200 205 211 232 248, 18 58 71 109 168 205 211 255, 46 49 62 71 125 155 167 216}
{39 54 93 200 205 211 232 248, 18 58 71 109 168 205 211 255,_}
46 49 62 71 125 155 167 216 
{18 58 71 109 168 205 211 255, 46 49 62 71 125 155 167 216, 55 102 117 162 171 192 199 208}
{39 54 93 200 205 211 232 248, 18 58 71 109 168 205 211 255,_}
46 49 55 62 71 102 117 125 155 162 167 171 192 199 208 216

I think 0 in the command line is a bit… wrong. It should be the implicit “solidity” (aka the number of 1 bits) of your SDRs.

The first argument is the size of the SDR; the second is the assumed number of 1 bits.

The author’s example is 1000 and 10. I know 1% is a bit sparser than in HTM canon, but that’s the limitation at this stage.

Thanks, well spotted.
But it does not change the result at all.

Well, although (probably) flawed, the result is interesting, because you stored the (A,B,C) sequence, then the (B,C,D) one.

When queried for (A,B,_) the answer contains C and D overlapped, which in a sense is correct, because it can be regarded as a response to the longer (A,B,C,D) sequence.

Those integers represent the non-zero positions of an SDR, so indeed elements should not be repeated.


The memory capacity is approximately sparsity^-3 triples of random SDRs.

For SDRs of dimension 1000 and a sparse population of 10, that would be one million triple associations that can be stored and accurately retrieved.
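As a quick sanity check on that arithmetic (values taken from the paper's example):

```python
n, p = 1000, 10          # SDR dimension and ON-bit count from the paper's example
sparsity = p / n         # 0.01
capacity = (n / p) ** 3  # = sparsity**-3
print(int(capacity))     # 1000000 random triples
```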

./triadicmemory 1000 10
{13 22 56 57 142 185 197 246, 26 31 37 72 85 96 153 162, 22 36 71 81 87 102 104 206}
{22 36 71 81 87 102 104 206, 3 77 113 118 134 135 195 241, 19 46 56 73 114 174 187 235}
{13 22 56 57 142 185 197 246, 26 31 37 72 85 96 153 162, _}
22 36 71 81 87 102 104 206 
{_, 3 77 113 118 134 135 195 241, 19 46 56 73 114 174 187 235}
22 36 71 81 87 102 104 206 
{_, 3 77 113 118 134 135 195 241, 105 125 155 166 198 232 237 251}

{_, 3 77 113 118 134 135 195 241, 26 31 37 72 85 96 153 162}                          

{3 77 113 118 134 135 195 241, 19 46 56 73 114 174 187 235, _}

{_, 26 31 37 72 85 96 153 162, 22 36 71 81 87 102 104 206}
13 22 56 57 142 185 197 246 
{26 31 37 72 85 96 153 162, 22 36 71 81 87 102 104 206, _}

or, tidied into human-readable form:

./triadicmemory 1000 10
{A, B, C}
{C, D, E}
{A, B, _}
C 
{_, D, E}
C 
{_, D, F}
/blank
{_, D, B}                          
/blank
{D, E, _}
/blank
{_, B, C}
A
{B, C, _}
/blank

which seems to imply that you can’t store a sequence, since you need two of the three items to find the remaining one, and order matters (you can’t use permutations).

Ok, I pushed a (hopefully) corrected Python version, as advised by POv.

The main test stores two sequences, seemingly correctly:
W,X,Y
X,Y,Z

@DanML I think you have to “stitch” the sequences two at a time:
{A, B, C}
{B, C, D} # You have to have this
{C, D, E}


There are several ways to represent sequences:

You can build a skeleton linked list {A,B,C}, {B,C,D}, {C,D,E}, {D,E,F}
and traverse it forwards or backwards. To each node you can then attach values
using permutations, for example {C,valB,A}, {D,valC,B}, {E,valD,C}, {F,valE,D}

Or fix one of the input vectors to get a bi-directional associative memory:

{X,A,B}, {X,B,C}, {X,C,D}, {X,D,E}

Also this data structure can be traversed in both directions.
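The skeleton-linked-list idea can be sketched as follows, assuming the counter-cube formulation; the class, method names, and the tiny non-overlapping SDRs are mine, chosen so the walk is deterministic:

```python
import numpy as np

class TM:
    """Sketch of a triadic memory used as a skeleton linked list."""
    def __init__(self, n, p):
        self.n, self.p = n, p
        self.w = np.zeros((n, n, n), dtype=np.uint32)

    def store(self, x, y, z):
        self.w[np.ix_(x, y, z)] += 1

    def next(self, x, y):   # forward step: recover z from (x, y)
        s = self.w[np.ix_(x, y)].sum(axis=(0, 1))
        return sorted(int(i) for i in np.argsort(s)[-self.p:])

    def prev(self, y, z):   # backward step: recover x from (y, z)
        s = self.w[np.ix_(np.arange(self.n), y, z)].sum(axis=(1, 2))
        return sorted(int(i) for i in np.argsort(s)[-self.p:])

# five non-overlapping toy SDRs A..E, 5 ON bits each in a 50-bit space
A, B, C, D, E = ([10 * k + j for j in range(5)] for k in range(5))

tm = TM(n=50, p=5)
for x, y, z in [(A, B, C), (B, C, D), (C, D, E)]:  # the skeleton linked list
    tm.store(x, y, z)

print(tm.next(A, B))   # -> C: walk forwards
print(tm.next(B, C))   # -> D
print(tm.prev(D, E))   # -> C: walk backwards
```

The same stored triples serve both directions; only the axis being summed over changes.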


Thanks. It still fails at low repetition counts (0-60), depending on the (random) 10-digit vectors.

python test_triadic_memory.py 
Retrieval failed!
z q 12, #92, 104, 120, 138, 205, 213, 234, 248
q z 12, 104, 120, 138, 205, 213, 234, 248
Failed at offset 35

Indicating that, with stitching (as above), at some point the query (q) returns more items than are held in (z). In the example, an extra element with value 92 appeared in the result. This happened at input sequence 35.

Offset examples:

  1. ABC
  2. BCD
  3. CDE

This matches the C code behaviour. Any theories on how to handle this?