I’m experimenting with a method for self-learning the structure of any language (since they all share the commonality of biology) and have a preliminary working prototype that breaks out definite and indefinite articles, demonstratives, pronouns, and possessive determiners. It learns from text input alone, which to the system is effectively just a string of numbers. Cautiously excited.
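To be clear, the post doesn’t describe how the prototype works, so the following is only a generic illustration of the kind of thing that’s possible from raw text: a minimal distributional sketch (my own toy example, not the author’s method) showing that determiner-like words can be grouped purely from the words that follow them, with no labels or grammar rules.

```python
from collections import Counter, defaultdict

# Tiny toy corpus; a real learner would see far more text.
corpus = (
    "the cat sat on the mat . a dog chased the cat . "
    "this dog ate my food . that cat saw a mouse . "
    "my dog likes the mouse . his cat ate this food ."
).split()

# Right-context count vectors: which words follow each word.
context = defaultdict(Counter)
for w, nxt in zip(corpus, corpus[1:]):
    context[w][nxt] += 1

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    keys = set(a) | set(b)
    dot = sum(a[k] * b[k] for k in keys)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

# Determiners are followed by nouns, so "the" and "a" have more
# similar right-contexts than "the" and a noun like "cat" do.
sim_det = cosine(context["the"], context["a"])      # determiner vs determiner
sim_mixed = cosine(context["the"], context["cat"])  # determiner vs noun
print(sim_det > sim_mixed)  # → True
```

This is just classic distributional clustering, nothing biological; it’s here only to ground what “breaking out determiners from a string of numbers” can look like in the simplest possible form.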
I’m still experimenting at this stage because my view of what English represents and how it is structured may be a little wide of current thinking, but I believe it is far closer to the biology than any current model I know of. That said, even after looking at brain scans, charts, biology, etc., I still know nothing really.
I think the method is closer to the biological process that goes on between the hippocampus, memory, and cortex (Thousand Brains Theory style), and I believe it is the fundamental foundation for HTM to “evolve”, rather than relying on top-down learning. It integrates continuous learning, HTM concepts, and long-term memory. I think the same process can work for audio, vision, etc., but I’m already frying my brain with English at this stage and am still a few months off even contemplating anything other than language.
I’m not willing to expand on the process at this point, as it may just be me overthinking 1+1=2, or dancing with joy at having created fire while everyone else drives around in a car.
What I was curious about is: what research methods/papers have been tried that are closer to biology than the majority of blind statistical approaches to NLP? I’m not interested in bag-of-words, n-grams, and the like. Is there NLP that includes conceptual understanding? That learns a language from just text?
This might just be an oncoming Homer Simpson moment of “D’oh!” Back to the drawing board…