Exploring the "Repeating Inputs" problem

temporal-memory

#21

I like the mechanism you introduce here, and awesome presentation, thanks! I wanted to ask about learning representations in the output layer. In this example we presume to know about ‘ABCD’ and ‘XBCY’ as repeated sequences, but what if we didn’t know to look for these? Or what if another sequence like ‘DAYC’ appears? Is there a mechanism for forming a new representation bit in the output layer? (I’ve sketched what I mean below.)

It seems to me that, with this feature, the flexibility of this pooling mechanism could really help the TM overall, especially in sequence classification. Great work!
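To make the question concrete, here is a minimal toy sketch of the kind of mechanism I'm imagining: an output layer that compares the pooled TM activity for a sequence against stored signatures and allocates a fresh sparse representation when nothing matches. Everything here (`OutputLayer`, `novelty_threshold`, the random SDR allocation) is my own illustrative assumption, not anything from this thread:

```python
import random

class OutputLayer:
    """Toy pooling layer: one stable SDR per recognized sequence."""

    def __init__(self, num_bits=1024, sparsity=0.02, novelty_threshold=0.5):
        self.num_bits = num_bits
        self.num_active = max(1, int(num_bits * sparsity))
        self.novelty_threshold = novelty_threshold
        # label -> (pooled TM cell signature, output SDR)
        self.representations = {}

    def _overlap(self, a, b):
        # Fraction of the incoming activity covered by a stored signature.
        return len(a & b) / max(len(a), 1)

    def pool(self, pooled_tm_cells):
        """Return (label, SDR) for this pooled TM activity; allocate a
        new representation if nothing stored matches well enough."""
        best_label, best_score = None, 0.0
        for label, (signature, _) in self.representations.items():
            score = self._overlap(pooled_tm_cells, signature)
            if score > best_score:
                best_label, best_score = label, score
        if best_label is not None and best_score >= self.novelty_threshold:
            return best_label, self.representations[best_label][1]
        # Novel sequence (e.g. 'DAYC'): form a brand-new output SDR.
        label = f"seq-{len(self.representations)}"
        sdr = frozenset(random.sample(range(self.num_bits), self.num_active))
        self.representations[label] = (frozenset(pooled_tm_cells), sdr)
        return label, sdr

# Stand-in cell sets for the pooled TM activity of each sequence:
abcd = set(range(0, 40))
dayc = set(range(100, 140))

layer = OutputLayer()
print(layer.pool(abcd)[0])  # seq-0 (new representation for 'ABCD')
print(layer.pool(abcd)[0])  # seq-0 (same representation reactivated)
print(layer.pool(dayc)[0])  # seq-1 (novel sequence gets a new one)
```

In this toy version a repeat of ‘ABCD’ matches its stored signature and reactivates the same output SDR, while ‘DAYC’ falls below the threshold and gets a fresh representation. Obviously the real output layer would learn these incrementally rather than allocating them wholesale, but is something along these lines feasible?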


#22

Yes, I have been working on a mechanism that utilizes hex grid formation. I’ll go into it in more depth in a separate thread (I didn’t want to muddy the waters here, since the focus of this thread is to explore strategies for addressing the repeating inputs problem).