If the neocortex represents SDRs, why does it have 25 billion neurons?

As we've seen, even a modest-sized SDR has an astronomically large representational capacity. So if the brain uses SDRs, why does it need 25 billion neurons? It seems like overkill to me. I know our brain isn't a designer garment, but is that number some kind of evolutionary baggage that we don't really need?
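For concreteness, here's a back-of-envelope calculation of that capacity. This is just a sketch; the 2048-bit / 40-active parameters are typical HTM-style values I'm assuming, not numbers from this thread:

```python
from math import comb

# Assumed, illustrative SDR parameters (~2% sparsity, HTM-style).
n_bits = 2048   # total bits in the SDR
n_active = 40   # number of active (on) bits

# The number of distinct SDRs is the number of ways
# to choose which bits are active.
capacity = comb(n_bits, n_active)
print(len(str(capacity)))  # on the order of 10^84 distinct patterns
```

So even one dendrite's worth of representation dwarfs anything you'd ever need to count, which is exactly why the question above is a fair one.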


Every neuron has dendrites, and each dendrite can represent one, or perhaps a few, SDRs.

Why so many?

To learn EVERYTHING you perceive; to build a predictive model of all the perceptions in your life, to the point where you are surprised if a door feels light or heavy when you pull on the handle - that requires a vast memory store.


I always assumed that the human brain must have massive redundancy: patients can survive destruction of a large percentage of the brain and suffer only minor symptoms in some cases, and some even prosper after a hemispherectomy.

The brain is also a noisy place; having massive built-in redundancy might help maintain a clean signal too.

I always wonder what the MVP (“Minimum Viable Product”) is for a human-like brain. I guess in the coming decades we are going to find out.


Evolution doesn’t ask what is the most efficient, it asks what will survive.


…or what's good enough.


You have to keep in mind that brains don't just serve to provide consciousness; they also control all the biological systems that support and maintain physical existence. Those functions are less relevant when building a brain that operates a mechanical body.


Another useful view on it is the Red Queen Hypothesis. Absolute capacity doesn't matter if a little additional capacity turned out to be statistically beneficial for the genes (in the Dawkinsian sense) that provided for it. Keeping everything in your head at the same time is an open-ended problem, as it may require modeling other brains: social structures, interactions, grudges, affairs, cheats, conspiracies - and anything else that could be beneficial for passing on genes: poetry, music, etc.

Coming back to the Red Queen: it’s sufficient that this delta caused some genes to survive, and the next generations had to start from there.

As brains are costly, they might have actually been decreasing in size recently.

There is another reason why brains may be shrinking:
the processed-food experiment has done extensive damage to the older generation in developed countries.
And now we see the emergence of minimum-cognitive-load politics and policies that seem simple and ‘obvious’ but are not solutions at all.
Correlation much?


It’s worth noting that the complexity of the world also scales exponentially.

Think about it this way: 32 bytes of memory (256 bits) can store 2^256 possible states. That’s astronomically large, so why would computers ever need gigabytes of memory?

Say you’re writing some text. A couple of paragraphs can easily exceed a kilobyte, and a kilobyte can store so many possible states! So what’s all that extra capacity for? Well, most of those states are random gibberish, and even among the ones that aren’t, an unfathomable number are just variations on sensible texts, full of typos.
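To put a number on that (a quick sketch; the only assumption is 1 KiB = 8192 bits):

```python
# Raw state count of one kibibyte of memory: every bit
# pattern counts as one "state", almost all of them gibberish.
kib_bits = 1024 * 8        # 8192 bits in a kibibyte
states = 2 ** kib_bits

print(len(str(states)))    # → 2467 (a 2467-digit number)
```

The point is that raw state count wildly overstates useful capacity: the meaningful states are a vanishing sliver of those 2^8192 patterns.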

You can throw compression at the problem to weed some of these out, but that makes it harder to reason about the data, per the time-space tradeoff.

As soon as you start considering compositional data - cases where multiple objects exist independently and must be tracked independently (for example, the individual letters in a piece of text) - the state space grows exponentially with the number of objects. Even though the number of states your representation can hold grows rapidly, the complexity of the environment grows at a similar rate.
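A minimal sketch of that growth, treating each character of text as an independent object with 26 possible states (a toy model, not anything specific from the thread):

```python
# Each additional independent object multiplies the state space,
# so the total grows exponentially in the number of objects.
alphabet_size = 26  # states per object: one lowercase letter

def state_space(n_objects: int) -> int:
    """Distinct configurations of n_objects independent letters."""
    return alphabet_size ** n_objects

for n in (1, 2, 10):
    print(n, state_space(n))  # state spaces: 26, 676, 141167095653376
```

Ten letters already yield ~10^14 configurations, which is why a representation with astronomical raw capacity can still be fully "spent" by a compositional world.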


That’s not actually true. In a closed system, efficiency is an overriding factor.

Brains are extremely expensive to operate, and are only possible within the constraints of the cell’s collective biological machinery and prior anatomy.

But the more efficient solution wins out.