In this research meeting Subutai and Karan focus on reviewing 4 related meta-learning papers. Subutai (after an initial surprise reveal) summarizes MAML, a core meta-learning technique, by @chelseabfinn et al., and a simpler variant, Reptile, by Alex Nichol et al. Karan reviews two probabilistic/Bayesian variants of MAML by Tom Griffiths et al.

Papers:
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks,
On First-Order Meta-Learning Algorithms,
Recasting Gradient-Based Meta-Learning as Hierarchical Bayes,
and Reconciling meta-learning and continual learning with online mixtures of tasks.
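To make the discussion concrete, here is a minimal sketch of the Reptile meta-update from Nichol et al., applied to the sine-wave regression toy problem used in the MAML paper. This is not code from the meeting or from the papers' released implementations; the network size, learning rates, and number of inner steps are illustrative assumptions.

```python
# Minimal Reptile sketch (illustrative hyperparameters, not the papers' values).
import copy
import math
import torch
import torch.nn as nn

def sample_sine_task(n_points=10):
    """One task = a sine wave with random amplitude and phase (toy task from the MAML paper)."""
    amplitude = torch.empty(1).uniform_(0.1, 5.0)
    phase = torch.empty(1).uniform_(0.0, math.pi)
    x = torch.empty(n_points, 1).uniform_(-5.0, 5.0)
    y = amplitude * torch.sin(x + phase)
    return x, y

model = nn.Sequential(
    nn.Linear(1, 40), nn.ReLU(),
    nn.Linear(40, 40), nn.ReLU(),
    nn.Linear(40, 1),
)
inner_lr, meta_lr, inner_steps = 0.01, 0.1, 5  # assumed values for illustration

for meta_iter in range(1000):
    x, y = sample_sine_task()

    # Inner loop: adapt a copy of the current meta-parameters to this task with plain SGD.
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        opt.zero_grad()
        nn.functional.mse_loss(adapted(x), y).backward()
        opt.step()

    # Reptile meta-update: move the meta-parameters toward the adapted weights,
    # theta <- theta + meta_lr * (phi - theta).
    with torch.no_grad():
        for p, p_adapted in zip(model.parameters(), adapted.parameters()):
            p.add_(meta_lr * (p_adapted - p))
```

MAML differs in the outer step: instead of interpolating toward the adapted weights, it evaluates the adapted model on held-out query data and backpropagates that loss through the inner-loop adaptation to update the meta-parameters.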