Well, I have a somewhat fundamental question. I have been wondering what objective function the brain is trying to optimize. HTM gives an elegant neocortex formulation, but I don’t think the brain is just about modelling sequences. I think modelling sequences is a means, not an end. Recently, I started to wonder whether there is some form of label or objective function that the brain might be optimizing in a global sense, which then leads to sequence modelling and feature extraction or encoding at the lower levels.
Having had a baby recently, I noticed that the first thing she did after birth was cry. A few minutes later she started moving her mouth around looking for food. As the days went by, she started crying when she felt hot or cold. What I’m saying is that, without much ability to process sensory inputs, she was acting based on pain signals from her stomach, skin, bones, or elsewhere.
Could it be that the brain maps sensory data into a pain space, and when the result is outside a design boundary specified by biology, it controls the body to act until the pain equivalent of the sensory data falls back within that specification? That would mean the brain needs to learn:
- A sensory-data-to-pain transformation
- A comparison of the result against the biological specification
- An optimal action policy that returns the recognized or predicted pain level to the level required by biology
The hypothesis would mean an individual or an intelligent organism must have this pain specification encoded in its biology, otherwise it would not work. It would also mean the brain first learns a sensory-data-to-pain regression, using the output of nociceptors as the target, and then learns an optimal action policy that returns the body to its biological specification.
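To make the idea concrete, here is a minimal sketch of the two learning stages, assuming a linear pain model, a scalar setpoint, and a small discrete action set. Everything here (the linear nociceptor stand-in, the greedy one-step policy, the numbers) is an illustrative assumption, not a claim about how the brain actually does it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: learn the sensory-data -> pain regression.
# Assumed ground truth: the nociceptor signal is a fixed linear
# function of the sensory vector (a simplifying assumption).
true_w = np.array([0.8, -0.5, 1.2])

def nociceptor(s):
    """Stand-in for the body's pain signal (the regression target)."""
    return float(true_w @ s)

w = np.zeros(3)                      # learned sensory->pain weights
for _ in range(2000):                # plain SGD on squared error
    s = rng.normal(size=3)
    err = w @ s - nociceptor(s)
    w -= 0.05 * err * s

# Stages 2-3: compare predicted pain to the biological setpoint and
# greedily pick the action that moves it back toward that setpoint.
SETPOINT = 0.0                       # the "biology specification"
ACTIONS = [np.array([0.3, 0.0, 0.0]), np.array([-0.3, 0.0, 0.0]),
           np.array([0.0, 0.0, 0.3]), np.array([0.0, 0.0, -0.3]),
           np.zeros(3)]              # small, hypothetical action set

def act(state):
    """Pick the action whose predicted next-state pain is closest to the setpoint."""
    return min(ACTIONS, key=lambda a: abs(w @ (state + a) - SETPOINT))

state = np.array([2.0, 0.0, 1.0])    # a state with high predicted "pain"
for _ in range(50):
    state = state + act(state)

final_pain = abs(w @ state - SETPOINT)
print(f"learned weights: {np.round(w, 3)}, final pain: {final_pain:.3f}")
```

In this toy version the "regulation to specification" falls out of a one-step greedy rule; a real formulation would presumably need a proper reinforcement-learning policy over delayed pain consequences.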
- Is there evidence of a pain specification encoded in mammalian bodies?
- Is there evidence of the brain learning associations between sensory inputs and nociceptor signals?
- Finally, would this formulation be useful in building intelligent machines?
Please feel free to share your comments.