Biological way to decode SDR back to Objects

@breznak

In the brain, there is a tight coupling between the decode and encode parts of speech.
In the “Dad’s song” project we hypothesize that the critter learns the sound first, then learns the muscle movements necessary to mimic the sound.

So how does that play out in a conversation?

Extending this to what is happening in the brain - you perceive something in the overall global workspace. That is the extended network of connected areas/maps.

The “highest level” version of this activation pattern extends to the temporal lobe to be experienced. This is shared with the hippocampus and through that - to the rest of the “older brain structures.” What they get is a highly digested version of the perception of the world - they are spoon-fed your experience.

The lower brain structures process this and then project a command output to the frontal lobe, which is elaborated through the various levels of the forebrain. This elaboration is shaped by fiber loops connecting to brain areas that are also used to parse the world - guided by remembered facts about the sensed world, from its physics through to the facts and relationships of remembered objects and places.

@Gary_Gaulin This bit should give you some crazy ideas for your “bug creatures.”
You can think of the level of processing in these lower brain structures as somewhat like a moth flying up to the light in response to all the cues that signal mating time. In the moth, genetics have tuned it to fly up toward the moon to mate; genetics did not plan for porch lights. With our big brains these old senses and drives are vastly enhanced, and should be far better at processing sensory cues and turning those drives into suitable action plans. I call this my dumb boss/smart advisor model.

At the lowest levels of the forebrain, the output fibers don’t project to the body; they project to the temporal lobes to be experienced as “thinking” and “recall.” This is really the same thing as the lower brain structures pointing the eyes through the FEF (Frontal Eye Fields) to look at things of interest, but this part is all internal to the brain. These recalled memories are then experienced by the temporal lobe, hippocampus, and related structures in a loop of experience we call consciousness.

Some of these forebrain activities may result in the selection and production of motor activity - words and actions. These are all stored motor programs that are being called into play, customized by the recalled memories and drives from the limbic system/forebrain. The networks in the various areas settle into states where there is the least “conflict” between the various activation patterns. Experienced AI researchers will recognize this as a relaxation computing process.
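To make the “relaxation computing” idea concrete, here is a toy sketch of my own (not anything from the post, and far simpler than what the brain does): a tiny Hopfield-style network whose units flip one at a time to reduce conflict with their neighbors, until the whole network settles into a stored activation pattern.

```python
# Toy relaxation computation: units update asynchronously to reduce
# "conflict" with the other units, settling into a stored pattern.
# Purely illustrative; the brain's settling process is far richer.

def train(patterns):
    """Hebbian weights from a list of +1/-1 patterns."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def relax(w, state, max_sweeps=100):
    """Flip units toward agreement with their inputs until nothing changes."""
    n = len(state)
    state = list(state)
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):
            h = sum(w[i][j] * state[j] for j in range(n))
            s = 1 if h >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:      # no conflict left; the network has settled
            break
    return state

stored = [1, -1, 1, -1, 1, -1]
w = train([stored])
noisy = [1, -1, -1, -1, 1, -1]   # one unit corrupted
print(relax(w, noisy))           # settles back to the stored pattern
```

The point of the sketch is only that “least conflict” can be computed locally: each unit looks at its neighbors, and the global pattern emerges from those local settlements.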

So - to address your question about decoding SDR contents - the right kind of internal activity could interrogate the contents of memory. It would have to be done at a system level to be biologically plausible. The system must learn the access method at the same time it learns the data.
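As a minimal sketch of that last point (my own toy example, using typical HTM-style SDR sizes): the only “access method” is the mapping that was stored while learning, and decoding is just asking which learned SDR overlaps the query the most. Object names and the `decode` helper here are hypothetical.

```python
import random

random.seed(42)
N, W = 2048, 40  # SDR width and number of active bits (typical HTM values)

def random_sdr():
    """A random sparse code: W active bits out of N."""
    return frozenset(random.sample(range(N), W))

# Learning stores the data AND the access method together:
# the object->SDR mapping is both.
memory = {name: random_sdr() for name in ["cat", "dog", "cup"]}

def decode(sdr):
    """Return the stored object whose SDR overlaps the query the most."""
    return max(memory, key=lambda name: len(memory[name] & sdr))

# A degraded query: only 30 of the cat SDR's 40 bits survive.
query = frozenset(list(memory["cat"])[:W - 10])
print(decode(query))  # -> cat, despite the missing bits
```

Because random SDRs of this sparsity share almost no bits by chance, even a heavily degraded pattern still overlaps its own stored code far more than any other, which is the system-level robustness the post is pointing at.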
