Temporal hierarchy like spatial hierarchy?

I wasn’t sure before, but I think you’re right - the sequences in each region are arbitrary in length and begin as a string of subsequences that gradually ‘join together’ to form a more stable representation. This can be thought of intuitively: when you learn a sequence for the first time (say a song), you remember parts of it, and it’s kind of fuzzy. You may remember bits that repeat, or parts that are similar to other songs. As you listen to the song more, your memory goes from fuzzy to specific.

I was worried that a low-level region would be able to learn very long sequences that should be represented higher in the hierarchy. However, given that low-level inputs tend to be very noisy, it is unlikely that the exact same input sequence would repeat enough times. It is only when TP generalizes the noisy low-level sequences that they become more stable in the higher regions. So it almost self-organizes these levels of abstraction - in theory.
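To convince myself of the "noise prevents verbatim memorization" point, here's a toy sketch (my own construction, not anything from the thread - the bit-flip noise model and probabilities are assumptions): with even modest per-bit noise on a longish input sequence, no two presentations match exactly, so a low-level region never sees the same long sequence twice.

```python
import random

random.seed(0)

def noisy(seq, flip_prob=0.05):
    """Return a copy of seq with each bit flipped with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in seq]

# A "long" low-level binary sequence, presented 100 times through noise.
base = [random.randint(0, 1) for _ in range(200)]
presentations = [tuple(noisy(base)) for _ in range(100)]

# Count exact verbatim repeats among the 100 noisy presentations.
exact_repeats = len(presentations) - len(set(presentations))
print(exact_repeats)  # almost certainly 0: no presentation repeats exactly
```

With 200 bits and 5% flip probability, the chance that any two presentations match exactly is on the order of 10^-9 per pair, which is why only a generalized (pooled) representation can ever stabilize.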

I am currently taking my time writing a post in the other thread about TP. I’m taking what you explained about TM and bursting and combining it with competing TP cells that increase their activity when they predict correctly and decrease it when they predict incorrectly. That, along with lateral inhibition, gives a good measure of how similar TP cells are to the input sequence. The more I think about TM and bursting, the more things become possible (e.g. subsequence abstraction). Thanks for opening my eyes :wink:
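Here's roughly what I have in mind, as a minimal sketch (the cell class, the boost/decay constants, and the top-k inhibition rule are all my assumptions, not an established TP implementation): each competing TP cell tracks a scalar activity that is boosted on correct predictions and decayed on misses, and lateral inhibition lets only the best-matching cell(s) stay active.

```python
class TPCell:
    """Hypothetical temporal-pooling cell competing to represent a sequence."""

    def __init__(self, name):
        self.name = name
        self.activity = 0.0

    def update(self, predicted_correctly, boost=1.0, decay=0.5):
        if predicted_correctly:
            self.activity += boost  # reinforce a cell that tracks the input
        else:
            self.activity = max(0.0, self.activity - decay)  # punish a miss

def lateral_inhibition(cells, k=1):
    """Keep only the k most active cells 'on'; the rest are suppressed."""
    winners = sorted(cells, key=lambda c: c.activity, reverse=True)[:k]
    return [c.name for c in winners]

cells = [TPCell("A"), TPCell("B"), TPCell("C")]
# Toy prediction outcomes: A tracks the input sequence well, B and C don't.
outcomes = {"A": [True, True, True],
            "B": [True, False, False],
            "C": [False, False, False]}
for step in range(3):
    for c in cells:
        c.update(outcomes[c.name][step])

print(lateral_inhibition(cells))  # → ['A']
```

The activity score here is just a stand-in for "how similar this cell's learned sequence is to the current input"; the inhibition step is what turns that score into a stable winner.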