Why not review Vicarious papers?

Just curious: why don’t you review Vicarious papers? (Unrelated to differentiable plasticity.)

@aetolicus - Do you have a link to these papers?

Here’s their main paper on the Recursive Cortical Network: https://science.sciencemag.org/content/358/6368/eaag2612

Some more neuroscience-related papers:


The main advantage I see over other papers is that they don’t use backpropagation, while still solving datasets harder than MNIST.


All the papers we’re reviewing have to do with sparsity in neural networks, which is the topic we’re focusing on. Are you aware of any Vicarious papers that deal with applied sparsity?

Not really. The Vicarious team shares the same initial ideas as Numenta, but they concentrate more on high-level concepts. I understand that Numenta has its own research trajectory, but the Vicarious papers were insightful for me.
