Here is a somewhat working HTM-based language model. It's very small, and I'm not sure it could handle more than a paragraph right now, but it does support continuous language learning.
This was vibe-coded with Gemini 2.5 Pro.
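To give a rough idea of what I mean by continuous language learning: every token the model sees is also a training step, so there is no separate train/inference phase. Here is a toy NumPy sketch of that loop. It is not the actual code from the repo; the random sparse codes and the Hebbian transition matrix are simplified stand-ins for a real HTM encoder and temporal memory.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 2048, 40          # code width and active-bit count (~2% sparsity, HTM-style)
W = np.zeros((N, N))     # transition weights: bit active at t -> bit active at t+1
vocab = {}               # token -> fixed random sparse code (indices of active bits)

def encode(token):
    """Assign each token a fixed random sparse code the first time it appears."""
    if token not in vocab:
        vocab[token] = rng.choice(N, size=K, replace=False)
    return vocab[token]

def learn(prev_bits, next_bits, lr=0.1):
    # Online Hebbian update: every step of the stream is a training step,
    # so the model keeps adapting as new text arrives (continuous learning).
    W[np.ix_(prev_bits, next_bits)] += lr

def predict(prev_bits):
    """Score each known token by overlap between its code and the predicted bits."""
    pred = W[prev_bits].sum(axis=0)
    scores = {tok: pred[bits].sum() for tok, bits in vocab.items()}
    return max(scores, key=scores.get)

text = "the cat sat on the mat and the cat sat".split()
codes = [encode(t) for t in text]
for prev, nxt in zip(codes, codes[1:]):
    learn(prev, nxt)

print(predict(encode("cat")))   # expected: "sat"
```

The real model is of course much more involved, but the shape of the loop (encode, predict, learn, repeat) is the same.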
Here is a re-architected version that shows some intelligence when trained on a tiny book: GitHub - NQevxvEtg/dao
The architecture diagram is at dao/references/diagrams/diagram_rdr.md at main · NQevxvEtg/dao · GitHub.
@MTIzNDU2Nzg5 thanks. I found your work very interesting. Two cents from me:
What do you think?
Can you share your concept with us here?
Thanks
Thanks, this would definitely help. I'm still exploring different concepts; right now the biggest hurdle is compute resources for testing on a larger dataset. I'm also considering using pre-trained embeddings.
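One way pre-trained embeddings could plug into an SDR setting, sketched below: binarize each dense vector by keeping its top-k components, so similar words end up with overlapping sparse codes. Again, this is just an illustration, not code from the repo; the random matrix here stands in for real pre-trained vectors like GloVe or word2vec.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim, k = 1000, 300, 24   # k active bits ~ 8% sparsity (a tunable choice)
embeddings = rng.standard_normal((vocab_size, dim))  # placeholder for real vectors

def to_sdr(vec, k=k):
    """Keep the k largest components as active bits; zero out the rest."""
    active = np.argpartition(vec, -k)[-k:]
    sdr = np.zeros(vec.shape[0], dtype=bool)
    sdr[active] = True
    return sdr

a, b = to_sdr(embeddings[1]), to_sdr(embeddings[2])
print("overlap between two token SDRs:", int((a & b).sum()))
```

The nice property is that semantically similar embeddings should yield overlapping SDRs, which is exactly what HTM-style encoders want.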
Not related to HTM theory, but some may find this LLM interesting: it does continuous learning using more standard LLM methods. GitHub - NQevxvEtg/dao-The-Living-Dynamo
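For a rough picture of what continuous learning looks like with standard LLM machinery (a toy stand-in, not the actual code from that repo): instead of freezing the model after training, take one gradient step per observed token pair as the stream arrives.

```python
import torch
import torch.nn as nn

vocab_size, dim = 100, 32
# Toy next-token model: embedding followed by a linear readout over the vocab.
model = nn.Sequential(nn.Embedding(vocab_size, dim), nn.Linear(dim, vocab_size))
opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

def observe(prev_id, next_id):
    """Each observed (token, next-token) pair is immediately a training step."""
    logits = model(torch.tensor([prev_id]))
    loss = loss_fn(logits, torch.tensor([next_id]))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

stream = [3, 7, 3, 7, 3, 7]                   # toy token stream
for prev, nxt in zip(stream, stream[1:]):
    print(f"loss: {observe(prev, nxt):.3f}")  # loss falls as the stream repeats
```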