Karan Grewal gives an overview of the paper “Continual Lifelong Learning with Neural Networks: A Review” by German Parisi et al. He first explains three main areas of current continual learning approaches. Then, he outlines four research areas that the authors argue will be crucial to developing lifelong learning agents.
In the second part, Jeff Hawkins discusses new ideas and improvements on our previous “Frameworks” paper. He proposes a more refined grid cell module in which each layer of minicolumns contains 1D voltage-controlled oscillating modules, each representing movement in a particular direction. Jeff first explains the mechanisms within each column and how anchoring occurs in grid cell modules. He then gives an overview of displacement cells and reasons that if there are 1D grid cell modules, there are very likely 1D displacement cell modules as well. Furthermore, he makes the case that the mechanisms for orientation cells are analogous to those of grid cells. He argues that each minicolumn is driven by various 1D modules representing orientation and location, and that together these produce what appears as a classic grid cell / orientation cell module. A toy sketch of this idea follows below.
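The following is only an illustrative sketch of the 1D-module idea, not Numenta's model: the class and method names (OneDGridModule, anchor, displacement_to) are hypothetical, and the phase arithmetic is just one simple way a module with a preferred direction could path-integrate movement, be anchored by a sensory cue, and express a displacement as a phase difference.

```python
import numpy as np

class OneDGridModule:
    """Toy 1D path-integration module (hypothetical names).

    The module tracks a single phase on a ring. Movement is projected onto
    the module's preferred direction and integrated modulo the module's
    scale, giving a periodic, grid-cell-like representation in 1D.
    """

    def __init__(self, preferred_direction_deg, scale):
        theta = np.deg2rad(preferred_direction_deg)
        self.direction = np.array([np.cos(theta), np.sin(theta)])
        self.scale = scale   # spatial period of the module
        self.phase = 0.0     # current phase in [0, 1)

    def move(self, displacement_xy):
        # Path integration: project the 2D movement onto the preferred
        # direction and advance the phase, wrapping around the ring.
        delta = np.dot(displacement_xy, self.direction) / self.scale
        self.phase = (self.phase + delta) % 1.0

    def anchor(self, sensed_phase):
        # Anchoring: a recognized sensory cue resets the phase directly,
        # correcting drift accumulated during path integration.
        self.phase = sensed_phase % 1.0

    def displacement_to(self, other_phase):
        # Rough 1D "displacement cell" analogue: the phase difference
        # between two represented locations, also on the ring.
        return (other_phase - self.phase) % 1.0


# Example: several 1D modules with different preferred directions jointly
# represent a 2D location, loosely analogous to a classic grid cell module.
modules = [OneDGridModule(d, s) for d, s in [(0, 30.0), (60, 30.0), (120, 30.0)]]
for m in modules:
    m.move(np.array([10.0, 5.0]))
print([round(m.phase, 3) for m in modules])
```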
“Continual Lifelong Learning with Neural Networks: A Review” by German Parisi et al. (ScienceDirect)
“A Framework for Intelligence and Cortical Function Based on Grid Cells in the Neocortex” (Frontiers in Neural Circuits)
Other papers mentioned in Karan’s presentation:
- “Overcoming catastrophic forgetting in neural networks”: https://arxiv.org/abs/1612.00796
- “Continual Learning Through Synaptic Intelligence”: https://arxiv.org/abs/1703.04200
- “Learning without Forgetting”: https://arxiv.org/abs/1606.09282
- “Using Fast Weights to Deblur Old Memories” (1987, CiteSeerX)
- “FearNet: Brain-Inspired Model for Incremental Learning”: https://arxiv.org/abs/1711.10563
- “Learning and development in neural networks: the importance of starting small” (ScienceDirect)