I’ve been thinking a lot lately about how computation is done in the brain very generally.
Generally speaking, the brain computes by changing its network connections. The change in network connections is the computation itself: there is no central processing unit doing the computing; it's distributed across the network.
In trying to wrap my head around what this means I’ve answered two questions on Quora about the topic.
How does bandwidth relate to computation?
Perhaps generalization is best done on a network for a reason…
I’d love your help condensing these thoughts down to one idea: one theory or model of what computation is in a network context. It seems to me that a rigorous and universal definition of computation has not been created, let alone a mathematical one. All we have are some languages that can approximate what computation is in a serial fashion (Turing machines, the lambda calculus).
These languages seem to assume a central processing unit that does the computation, or they’re abstracted away from the implementation of the computation entirely, dealing in the realm of symbols and logic.
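To make that “separate processor and memory” picture concrete, here’s a toy sketch (my own illustration, not taken from any particular textbook formalism) of a Turing-machine step in Python. The tape is passive memory; a tiny state-plus-head “processor” is the only thing that ever changes it:

```python
# A minimal Turing-machine step: memory (the tape) and the processing
# unit (state + head) are explicitly separate data structures.
# The rules below are the classic 2-state "busy beaver", chosen only
# to illustrate the separation, not any particular theory.

def run(rules, tape, state="A", head=0, halt="H", max_steps=100):
    """Apply `rules` until the halt state; computation happens *to* memory."""
    tape = dict(enumerate(tape))          # sparse tape, blanks default to 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(head, 0)
        write, move, state = rules[(state, symbol)]
        tape[head] = write                # the processor mutates passive memory
        head += 1 if move == "R" else -1
    return [tape[i] for i in sorted(tape)]

rules = {
    ("A", 0): (1, "R", "B"),
    ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"),
    ("B", 1): (1, "R", "H"),
}
print(run(rules, [0, 0, 0]))  # → [1, 1, 1, 1, 0]
```

Everything interesting happens in one place, and the memory never changes itself. The network end of the spectrum is exactly the opposite.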
However, it seems to me that computational power (the power to change memory) exists along a spectrum from centralized to decentralized, from serial to parallel, and there’s no cohesive description articulating what that spectrum means.
I think if we had this description (mathematically articulated) we’d have the precise relationship between memory and computation: on one side of the spectrum they are separate, and on the network side they’re one and the same thing. Their interaction with each other is, perhaps, the mathematical description of what intelligence actually is, or what it ideally is.
I mean, when Jeff says the brain is not a computer but a memory structure, that’s what I’m talking about. It changes over time, and the way it changes is based on its base structure, which is approximated by HTM theory. That structure is highly constrained (relative to the vast combinatorial complexity it could allow) yet highly variable: able to approximate almost any model, that is to say, able to mirror any data. But I’m certain that HTM is a specific case of a general memory-as-computation theory, because HTM had evolutionary pressure to evolve the way it did. That memory-as-computation model lives at one end of the spectrum; if we could define how memory interacts with computation as the two become decoupled, we’d have a complete theory of efficient information processing.
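A crude way to see “memory and computation as one and the same” is a Hopfield-style network (a toy illustration of the general idea, not HTM): the weight matrix is the stored memory, and recall is nothing but repeated application of that same matrix.

```python
import numpy as np

# Toy illustration (not HTM): the weight matrix W is simultaneously
# the memory (it stores the pattern) and the computation (recall is
# just repeated multiplication by W). No separate processor exists.

def store(patterns):
    """Hebbian learning: changing the connections *is* the act of storing."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)
    return W

def recall(W, probe, steps=10):
    """The structure itself transforms the input back toward the memory."""
    s = probe.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

pattern = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
W = store(pattern)
noisy = pattern[0].copy()
noisy[0] = -1                      # corrupt one bit
print(recall(W, noisy))            # the stored pattern re-emerges
```

The “computation” here is inseparable from the connection structure: erase W and you erase both the memory and the ability to compute with it.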
But what do I know?
I’d love to hear your thoughts; it’d be great if someone could simplify this mess. It seems overly complex to me, and I can’t quite grasp it.
The spectrum, in relation to memory and computing power (meaning structure, and how the structure changes over time), looks like this. On the left side computation is serial, so it’s straightforward and simple with logic gates; on the other end it’s… complicated.
It seems to me that all along the spectrum, from centralized to decentralized, from computation and memory being separate to their being one and the same in the form of a network, the algorithm for how computation should be done on the network (that is, the algorithm for how the network should change) should itself be changing. And I think the most generalizable context in which to view how that computational algorithm should change is through a feedback loop with some environment: we should view each computational entity, every memory structure, as a sensory-motor inference engine.
You know, it seems like on the centralized end there’s very little feedback between computation and structure; in other words, the loop is really big, looping in the external environment: programmers need to be involved to change the way programs are written (how computation is encoded in memory). But on the decentralized end, the feedback loops are many and more interconnected. In other words, one of the key metrics, I think, has to be self-referential feedback loops. But I don’t think that’s the whole story, because the optimal solution is not simply an RNN for everything.
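Here’s how I picture the sensory-motor inference engine at its most stripped down (all names and numbers below are made up purely for illustration): the agent’s only memory is a model, and the only thing that changes that memory is prediction error fed back from an environment.

```python
# Hedged sketch of "memory structure as sensory-motor inference engine".
# The environment and the learning rule are arbitrary toy choices.

def environment(action):
    """Toy world: the 'correct' response is always 2 * action."""
    return 2 * action

estimate = 0.0                      # the agent's entire memory: one number
lr = 0.5                            # learning rate (arbitrary)

for step in range(20):
    action = 3                      # fixed probe action
    predicted = estimate * action   # the model *is* the computation
    observed = environment(action)
    error = observed - predicted
    estimate += lr * error / action # updating memory closes the loop

print(round(estimate, 3))           # → 2.0
```

On the centralized end that loop runs through a programmer; on the network end, many small versions of it run inside the structure itself. The loop shape is the same either way, which is why I think it’s the right unifying frame.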
Anyway, I think now I’m just ranting. Let me know how you view it.