Here is a back-of-the-envelope calculation …
100,000 CC = 200M mCols = 20B neurons
Assume a receptive field == a Cortical Column (CC)? Use 2000 mCols per CC.
Ncc = number of CCs = 200M mCols / 2000 mCols-per-CC = 100,000
Capacity = (mCol-capacity / sparsity) * Ncc
mCol-capacity = neurons * segments * patterns-per-segment
Let's use numbers rounded to powers of 10:
mCol capacity = 100 neurons * 100 segments * 10 patterns = 100,000 patterns
CC capacity = 100,000 / 0.02 (2% sparsity) = 5M transitions per CC
Capacity = 5M * 100,000 CCs = 500B transitions
So the capacity of the neocortex is ~500 billion transitions <<
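The arithmetic above can be checked in a few lines (variable names are mine; the numbers are the rounded ones from the post):

```python
# Back-of-the-envelope check of the capacity numbers above.
neurons_per_mcol = 100
segments_per_neuron = 100
patterns_per_segment = 10
sparsity = 0.02           # 40 of 2000 mini-columns active
n_cc = 100_000            # cortical columns in the cortex

mcol_capacity = neurons_per_mcol * segments_per_neuron * patterns_per_segment
cc_capacity = mcol_capacity / sparsity      # transitions per cortical column
cortex_capacity = cc_capacity * n_cc        # transitions for the whole cortex

print(f"mCol capacity:   {mcol_capacity:,} patterns")        # 100,000
print(f"CC capacity:     {cc_capacity:,.0f} transitions")    # 5,000,000
print(f"cortex capacity: {cortex_capacity:,.0f} transitions")# 500,000,000,000
```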
In old-AI terms we can think of it as 500 billion fuzzy if-then rules:
if SDR then SDR
So you can look at it as a Production System with 500B lines of code.
That of course excludes the Memory and the other parts of the brain.
What do you think about it? Does it look like too much? Too little?
100,000 CCs seems too few to me!! Wouldn't organizing them into a bigger structure increase the capacity, i.e. patterns-of-patterns in groups of CCs, before we "count" the cortex?
Does it sound plausible? Can you figure out a better storage system, a Dict?
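On the Dict idea: here is a minimal sketch of a dict-based "fuzzy if-then" transition store. All of it is my own assumption for illustration (the class, the frozenset-of-active-indices SDR representation, the overlap matching), not HTM code; the threshold of 10 overlapping bits comes from the "10 syns to detect pattern" row below.

```python
class TransitionDict:
    """Toy dict-based store for SDR -> SDR transitions (hypothetical sketch).

    An SDR is represented as a frozenset of active mini-column indices
    (e.g. 40 of 2000 at 2% sparsity).
    """

    def __init__(self, match_threshold=10):
        # 10 overlapping bits are enough to detect a pattern
        self.match_threshold = match_threshold
        self.rules = {}  # frozenset(SDR) -> frozenset(SDR)

    def learn(self, sdr_from, sdr_to):
        self.rules[frozenset(sdr_from)] = frozenset(sdr_to)

    def predict(self, sdr):
        # Fuzzy match: the stored key with the largest overlap fires,
        # provided the overlap clears the detection threshold.
        sdr = frozenset(sdr)
        best = max(self.rules, key=lambda k: len(k & sdr), default=None)
        if best is not None and len(best & sdr) >= self.match_threshold:
            return self.rules[best]
        return None

store = TransitionDict()
a = frozenset(range(0, 40))        # toy 40-bit SDRs
b = frozenset(range(40, 80))
store.learn(a, b)
noisy_a = frozenset(range(0, 30))  # 30 of a's 40 bits survive
print(store.predict(noisy_a) == b)  # True: noisy input still recalls b
```

The nice property of this scheme is exactly the "fuzzy" part: a noisy or partial SDR still triggers the stored transition, which a plain exact-key dict lookup would miss.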
PS> Should we look at a CC as 2000 mCols? In my simple tests even 500 could do it!
| unit | composition | capacity | notes |
|---|---|---|---|
| pattern | ....10001010.... | (2000 choose 40) | 40 active of 2000 bits |
| synapse | equiv. to a bit | | 10 syns to detect a pattern |
| dendrite segment | 100 synapses | 10 patterns | |
| neuron | 100 dendrite segments | 1000 patterns | 10,000 synapses |
| mini-column | 100 neurons | 100 n * 100 segs * 10 pat = 100,000 patterns | 50 µm |
| macro-column? | 100 mini-cols | | 500 µm |
| cortical-column | 2000 mini-cols | 5M transitions | |
| cortex | 100,000 CC | 500B transitions | 20B neurons |
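For scale, the (2000 choose 40) entry in the pattern row is the number of distinct SDRs a CC could in principle represent, which dwarfs the 5M transitions it can actually store. A quick check (the comparison is mine):

```python
import math

# "pattern" row of the table: 40 active bits out of 2000 mini-columns.
n_patterns = math.comb(2000, 40)   # distinct representable SDRs
stored = 5_000_000                 # transitions a CC stores (from above)

print(f"representable SDRs ~ 1e{len(str(n_patterns)) - 1}")
print(f"stored transitions per CC = {stored:,}")
```

So the representational space is astronomically larger than the storage capacity, which is what makes random SDRs so unlikely to collide.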