Speaking of compositionality, the advances introduced by Numenta's new framework could be really significant, given what deep learning experts are saying about the problem.
On one side, Numenta's new paper, “A Framework for Intelligence and Cortical Function Based on Grid Cells in the Neocortex”, proposes a theory that could explain the compositionality of objects (and, by extension, the compositionality of language):
We propose an extension of grid cells, called “displacement cells”. Displacement cells enable us to learn how objects are composed of other objects, also known as object compositionality.
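To make the idea concrete, here is a toy sketch of how a displacement could encode "object A contains object B at this offset". This is my own illustration, not Numenta's code: I assume a location is a tuple of phases, one per grid-cell module, and a displacement is the per-module phase difference.

```python
# Toy illustration of "displacement cells" (an assumption-laden sketch,
# not Numenta's implementation). A location is a tuple of phases, one per
# grid-cell module; a displacement is the per-module phase difference.
# A composite object (e.g. a logo printed on a mug) can then be stored as
# a displacement from the parent object's origin to the sub-object.

MODULE_SCALES = [3.0, 5.0, 7.0]  # assumed periods of three grid modules

def displacement(loc_a, loc_b):
    """Per-module phase difference taking you from loc_a to loc_b."""
    return tuple((b - a) % s for a, b, s in zip(loc_a, loc_b, MODULE_SCALES))

def apply_displacement(loc, disp):
    """Move from loc by disp (the inverse of displacement())."""
    return tuple((p + d) % s for p, d, s in zip(loc, disp, MODULE_SCALES))

# Hypothetical phases for a mug's origin and a logo's position on the mug:
mug_origin = (0.5, 1.0, 2.0)
logo_loc = (2.0, 4.5, 6.0)

# Learn the mug-to-logo displacement once...
mug_to_logo = displacement(mug_origin, logo_loc)

# ...and reuse it wherever the mug appears: a newly sensed mug origin
# predicts where the logo should be, independent of the mug's location.
new_mug_origin = (1.5, 3.0, 5.5)
predicted_logo = apply_displacement(new_mug_origin, mug_to_logo)
```

The key property the sketch tries to show is that the displacement is the same for every instance of the mug, so the composition "mug contains logo" is learned once and transferred.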
On the deep learning side, the famous Yoshua Bengio has just co-authored a paper underlining how difficult it is for current deep learning methods to learn the compositional properties of language:
Chevalier-Boisvert, Bahdanau, Bengio et al. - “BabyAI: A Platform to Study the Sample Efficiency of Grounded Language Learning” - October 2018 - http://export.arxiv.org/pdf/1810.08272
We put forward strong evidence that current deep learning methods are not yet sufficiently sample efficient when it comes to learning a language with compositional properties.
Are you aware of other AI models that integrate compositionality in their approach?
I'm really curious to see how those approaches will evolve!