HTM and Deep learning

In my own experience, HTM is:

  1. Around as good as a vanilla RNN at predicting known sequences
  2. Way faster than an equal-size LSTM
  3. Very bad at inference (predicting patterns that are not in the training data)
  4. Great at anomaly detection compared to AutoEncoders and Recurrent AutoEncoders (see the sketch right after this list).
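For context on point 4: HTM's raw anomaly score is simply the fraction of currently active columns that were not predicted at the previous timestep, which is why it works well as an out-of-the-box anomaly signal. Here is a minimal plain-Python sketch of that score (the sets of column indices and the function name are my own, not from any particular library):

```python
def htm_anomaly_score(active_columns: set, predicted_columns: set) -> float:
    """Raw HTM anomaly score: the fraction of currently active columns
    that were NOT among the columns predicted at the previous timestep.
    0.0 = fully predicted input, 1.0 = complete surprise."""
    if not active_columns:
        return 0.0
    unexpected = active_columns - predicted_columns
    return len(unexpected) / len(active_columns)

# Example: 40 active columns, 30 of which were predicted -> score 0.25
active = set(range(40))
predicted = set(range(30))
print(htm_anomaly_score(active, predicted))  # 0.25
```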

And some of my speculations:

  1. The prediction power/memory grows logarithmically: since the connections are generated randomly, it gets harder and harder to connect to the right bits
  2. Around 5~16 steps of memory capacity in NLP tasks (better than an RNN but way worse than an LSTM)

I developed a toy NLP program in HTM not long ago. You can see for yourself how well it works.
