Sequence memory: minicolumns, or AND-ing bitvectors?



Hi all, I have a question about the sequence memory proposed by Subutai and Jeff. I don’t quite get why the minicolumn approach is better than AND-ing the underlying bitvectors.

For example, take the sequence A, B, C, D with these (illustrative) encodings:

A     B    C    D
0100 0101 0110 0111

Doing a pairwise collapse (bitwise AND of adjacent pairs):

A & B -> 0100 
B & C -> 0100
C & D -> 0110

then again:

ABC -> 0100 (AB & BC)
BCD -> 0100 (BC & CD)

and finally:

ABCD -> 0100 (ABC & BCD)

By sequentially collapsing the sequence like this, wouldn’t you get a multi-level, hierarchical sequence memory? The encoding above is obviously not how one would actually do it, but with a proper encoding, could this work?
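To make the idea concrete, here is a minimal sketch of the pairwise AND-collapse described above, using the same toy 4-bit encodings (the `collapse` helper is just for illustration, not anything from the HTM codebase):

```python
# Toy 4-bit encodings from the example above.
A, B, C, D = 0b0100, 0b0101, 0b0110, 0b0111

def collapse(seq):
    """Bitwise-AND each adjacent pair, producing the next level up."""
    return [seq[i] & seq[i + 1] for i in range(len(seq) - 1)]

level1 = collapse([A, B, C, D])  # [A&B, B&C, C&D]
level2 = collapse(level1)        # [ABC, BCD]
level3 = collapse(level2)        # [ABCD]

print([f"{x:04b}" for x in level1])  # ['0100', '0100', '0110']
print([f"{x:04b}" for x in level2])  # ['0100', '0100']
print([f"{x:04b}" for x in level3])  # ['0100']
```

Each level is the AND of a wider contiguous window of the original sequence, so the top of the hierarchy ends up being the AND of every element.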


I’m not sure I understand. Doesn’t this throw away all ordinal information, so ABCD would have the same representation as DCBA, or any other permutation? This is not a desirable property when you want to predict ordered sequences.
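A quick sketch of the objection: because bitwise AND is commutative and associative, the fully collapsed code is the AND of all elements, regardless of their order. Using the same toy encodings as above:

```python
from functools import reduce
from operator import and_

# Toy 4-bit encodings from the example above.
A, B, C, D = 0b0100, 0b0101, 0b0110, 0b0111

# AND is commutative and associative, so every permutation
# collapses to the same final representation.
abcd = reduce(and_, [A, B, C, D])
dcba = reduce(and_, [D, C, B, A])

print(f"{abcd:04b}")  # 0100
print(f"{dcba:04b}")  # 0100, same code, ordinal information is gone
```

So ABCD and DCBA are indistinguishable once collapsed, which is exactly the problem for predicting ordered sequences.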