Evolution of the neocortex

Thanks! Jeff wrote recently: “common cortical algorithm doesn’t mean there are no variations … the issue is how much is common in all cortical regions, and how much is different. The evidence suggests that there is a huge amount of commonality.” My impression is that this is what Jeff has always believed, and if the TBT book suggests otherwise, I would assume it’s just a poor choice of words.

My impression of TBT was that they were imagining something like current HTM plus some missing ingredients (involving grid cells) that they’re still working out the details of. Someone can correct me if I’m wrong.

Yes, the thing I’m talking about is a “single minicolumn algorithm that is parameterized”.

Current HTM definitely has loads of adjustable parameters (what I’m calling “hyperparameters”), like how many coincident firings you need before forming a synapse, the number of neurons in a pooling layer, etc. (or something like that, I forget the details of HTM).
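To make the “hyperparameters” idea concrete, here’s a minimal sketch of what a parameterized region might look like. The parameter names here are made up for illustration, not actual HTM/NuPIC identifiers:

```python
from dataclasses import dataclass

@dataclass
class CorticalRegionParams:
    """Hypothetical knobs in the spirit of HTM; the names are illustrative."""
    synapse_formation_threshold: int = 10  # coincident firings before a synapse forms
    pooling_layer_neurons: int = 2048      # neurons in a pooling layer
    minicolumns_per_region: int = 1024
    cells_per_minicolumn: int = 32

# "Hyperparameter variation": the same algorithm, tuned differently per region.
visual_region = CorticalRegionParams(minicolumns_per_region=4096)
auditory_region = CorticalRegionParams(cells_per_minicolumn=16)
```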

I guess there’s bound to be some gray area between “hyperparameter variation” and “totally different algorithms”, but there are also things that are clearly one or the other. For example, merge sort vs. a database query compiler are definitely “totally different algorithms”, while a ConvNet with learning rate 1e-3 vs. one with learning rate 1e-4 is definitely “hyperparameter variation” (sketched below). Again, there’s probably a gray area in between, but I can’t think of any examples off the top of my head. :slight_smile:
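On the ConvNet example: the training code is literally identical and only a numeric setting changes. Here’s a toy sketch with plain gradient descent standing in for the ConvNet (the structure of the point is the same):

```python
def train(lr: float, steps: int = 100) -> float:
    """Minimize f(w) = (w - 3)**2 by gradient descent.
    The algorithm is fixed; lr is a hyperparameter.
    """
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)  # df/dw
        w -= lr * grad
    return w

# Same algorithm, different hyperparameters:
print(train(lr=1e-1))  # converges close to 3
print(train(lr=1e-3))  # identical code, much slower convergence
```

Swapping `train` out for, say, a SAT solver would be a “totally different algorithm”; changing `lr` is not.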
