Numenta Research Meeting - July 1, 2020

@cmaver VP Marketing, Numenta

Steve Omohundro on GPT-3

In today’s research meeting, guest Stephen Omohundro gave a fascinating talk on GPT-3, the new massive OpenAI Natural Language Processing model. He reviewed the network architecture, training process, and results in the context of past work. There was extensive discussion on the implications for NLP and for Machine Intelligence / AGI.

Link to GPT-3 paper:
Link to slides from this presentation:


Great talk. I believe the rule of thumb is that statistical analysis, in all its varieties, will demonstrate benchmarks approximating human-like abilities in what they refer to as “type 1” intelligence. “Type 1” is quite similar to what has been called crystallised intelligence, so this is not that surprising: SDRs and other statistical mechanisms perform comparably on such tasks, as long as the tasks concern maintaining long chains of statistical relations over large sets of data. “Type 2,” or fluid, intelligence requires sensory-motor abilities; fluid intelligence enables one to sequentially plan out different motor actions, but the same mechanism also brings about an agent’s ability to formalise learned knowledge. In some ways, it is the crux of human intelligence: almost all creative acts of humankind are the results of fluid intelligence. Fluid intelligence also embodies working memory. HTM is the only neural architecture that comprehensively structures both forms of intelligence in a single framework, though it will require attention, motor feedback, and hierarchy to be more fully integrated into the theory before it is even partially complete.


To put it more succinctly: as long as the performance tasks being measured concern the accuracy of predictions over static/passive data, SDRs and HTM will not be exceptional among other forms of statistical analysis. SDRs and HTM are special because they also form the foundation for the possibilities of motor behaviour, attention…
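As an illustrative aside (not from the talk, and with names of my own choosing): the SDR property the comment leans on — that sparse binary vectors can be compared by counting shared active bits — can be sketched in a few lines of Python, assuming the typical HTM parameters of roughly 40 active bits out of 2048:

```python
import random

def make_sdr(n=2048, w=40, seed=None):
    """A random SDR: w active bits out of n (~2% sparsity, typical for HTM).

    Represented as a frozenset of active bit indices.
    """
    rng = random.Random(seed)
    return frozenset(rng.sample(range(n), w))

def overlap(a, b):
    """Overlap score: number of shared active bits.

    This is the basic SDR similarity measure — high overlap means
    semantically similar representations, near-zero overlap means
    unrelated ones.
    """
    return len(a & b)

a = make_sdr(seed=1)
b = make_sdr(seed=2)
print(overlap(a, a))  # 40 — identical SDRs overlap fully
print(overlap(a, b))  # near zero — two random SDRs share almost no bits
```

The point of the sketch is only that overlap between high-dimensional sparse codes is robust to noise and collision, which is what makes SDRs usable for both passive prediction and, in HTM, for representing motor and attentional state.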