HTM-Based Large Language Model

Here is a somewhat working HTM-based language model. It's very small; I'm not sure it could handle anything more than a paragraph right now, but it does support continuous language learning.

This was vibe-coded with Gemini 2.5 Pro.
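
In case it helps readers picture the idea: below is a minimal, purely illustrative sketch of an HTM-flavored next-token predictor that learns online (continuously). This is not the code in the repo; the SDR size, the sparsity, and the simple first-order bit-transition memory are all simplifying assumptions on my part.

```python
import numpy as np
from collections import defaultdict

class TinyHTMLanguageModel:
    """Illustrative HTM-flavored next-token predictor (not the repo's code).

    Each token gets a fixed random sparse distributed representation (SDR);
    a first-order transition memory learns which SDR bits tend to follow
    which, and learning happens online, one token at a time.
    """

    def __init__(self, sdr_size=1024, active_bits=20, seed=42):
        self.sdr_size = sdr_size
        self.active_bits = active_bits
        self.rng = np.random.default_rng(seed)
        self.token_sdr = {}  # token -> frozenset of active bit indices
        # bit -> (next bit -> transition strength)
        self.transitions = defaultdict(lambda: defaultdict(float))

    def _encode(self, token):
        # Assign a random, fixed sparse code the first time a token is seen.
        if token not in self.token_sdr:
            bits = self.rng.choice(self.sdr_size, self.active_bits, replace=False)
            self.token_sdr[token] = frozenset(int(b) for b in bits)
        return self.token_sdr[token]

    def learn(self, tokens):
        # Online learning: strengthen bit-to-bit transitions between
        # consecutive tokens. Calling this repeatedly keeps adapting the model.
        for prev, nxt in zip(tokens, tokens[1:]):
            for a in self._encode(prev):
                for b in self._encode(nxt):
                    self.transitions[a][b] += 1.0

    def predict_next(self, token):
        # Predict the token whose SDR best overlaps the bits voted for
        # by the current token's active bits.
        votes = defaultdict(float)
        for a in self._encode(token):
            for b, w in self.transitions[a].items():
                votes[b] += w
        best, best_score = None, -1.0
        for cand, bits in self.token_sdr.items():
            score = sum(votes.get(b, 0.0) for b in bits)
            if score > best_score:
                best, best_score = cand, score
        return best


if __name__ == "__main__":
    lm = TinyHTMLanguageModel()
    text = "the cat sat on the mat".split()
    for _ in range(5):               # continuous learning: just keep feeding text
        lm.learn(text)
    print(lm.predict_next("the"))    # likely "cat" or "mat"
```

A real HTM system would use a proper encoder, spatial pooler, and temporal memory instead of this toy transition table, but the online-update loop is the part that matters for continuous learning.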

Here is a re-architected version; it shows some intelligence when trained on a tiny book: GitHub - NQevxvEtg/dao

The architecture diagram is at dao/references/diagrams/diagram_rdr.md at main · NQevxvEtg/dao · GitHub.

@MTIzNDU2Nzg5 thanks, I found your work very interesting. Two cents from me:

  1. It might be better to implement this with HTMcore from the HTM community, or better yet to use HTM's region concept so you can work cleanly with separate layers/regions. HTMcore does run a little slowly, though. (A rough sketch of driving it is at the end of this post.)
  2. For acceleration, you could use the Etaler implementation of HTM:
    GitHub - etaler/Etaler: A flexable HTM (Hierarchical Temporal Memory) framework with full GPU support.

What do you think?
Can you share your concept with us here?
Thanks
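
For reference, here is roughly what streaming a token sequence into HTMcore (the htm.core Python bindings) could look like. This is only a sketch based on my understanding of those bindings; the SDR and TemporalMemory class names, the compute() call, and the anomaly attribute are assumptions on my part, the parameter values are arbitrary, and it is not code from your repo.

```python
import numpy as np
from htm.bindings.sdr import SDR
from htm.bindings.algorithms import TemporalMemory

COLUMNS = 1024   # number of mini-columns (arbitrary choice)
ACTIVE = 20      # active columns per token (arbitrary choice)
rng = np.random.default_rng(0)

# Give every token a fixed random set of active columns; a stand-in for a
# proper encoder plus spatial pooler.
tokens = "the cat sat on the mat".split()
codes = {tok: sorted(rng.choice(COLUMNS, ACTIVE, replace=False).tolist())
         for tok in set(tokens)}

tm = TemporalMemory(columnDimensions=(COLUMNS,), cellsPerColumn=8)

for epoch in range(10):       # continuous learning: keep streaming text
    tm.reset()                # clear sequence state between passes
    for tok in tokens:
        active = SDR(COLUMNS)
        active.sparse = codes[tok]
        tm.compute(active, True)   # second argument: learn online
    # The anomaly score should drop toward 0 as the sequence becomes predictable.
    print(f"epoch {epoch}: anomaly of last step = {tm.anomaly:.2f}")
```

Swapping the random token codes for a real encoder and adding a spatial pooler in front of the TemporalMemory would be the next step.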

Thanks, this would definitely help. I'm still exploring different concepts; right now the biggest hurdle is compute resources for testing on a larger dataset. I'm also considering using pre-trained embeddings.

Not related to HTM theory, but some may find this LLM interesting: it does continuous learning using more standard LLM approaches. GitHub - NQevxvEtg/dao-The-Living-Dynamo
