Outline of Jeff's April 4, 2019 presentation "What do we know about Cortical Columns?"

Facts and conjecture on REM sleep:
Fact: The frontal eye fields are driven by lower forebrain, which is in turn driven from sub-cortical structures. (The lizard brain)

Fact: Visual object recognition is a form of 20 questions in which you gather a basket of features of an object to learn its identity, driven by the lizard brain parsing your environment.

Fact: The rapid eye movements are strongly suspected to be part of the memory consolidation process where the contents of today’s learning in the EC/HC complex are moved to long term storage in the cortex.

Conjecture: This REM activity could be a by-product of consolidating the sequence of movement and visual features to identify an object, based on today’s experience in scanning said objects.
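The "20 questions" style of identification described above can be sketched as a candidate-elimination loop: each fixation gathers one feature and prunes the set of consistent objects. This is only a toy illustration; the object names and feature sets are invented.

```python
# Toy sketch of "20 questions" object identification: each saccade/touch
# gathers one feature, and candidates are narrowed until one object remains.
# Objects and features here are invented for illustration.

OBJECTS = {
    "coffee_cup": {"curved_surface", "handle", "rim", "hollow"},
    "soda_can":   {"curved_surface", "rim", "hollow", "metal_tab"},
    "ball":       {"curved_surface", "uniform", "no_opening"},
}

def identify(observed_features):
    """Return the objects consistent with every feature observed so far."""
    candidates = set(OBJECTS)
    for f in observed_features:
        candidates = {o for o in candidates if f in OBJECTS[o]}
    return candidates

# Each new feature prunes the candidate set, like answers in 20 questions.
print(identify(["curved_surface"]))            # still ambiguous: all three
print(identify(["curved_surface", "handle"]))  # unique: coffee_cup
```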


I agree that the REM activity during sleep is most probably used for this. I can say that when I am going to sleep (and it’s your choice to believe it or not, of course), as I start seeing dreams while still being conscious, I have to move my eyes in order to see objects. Lucid dreaming was proven to the scientific community by Stephen LaBerge precisely by memorizing a pattern of eye movements and then replaying it in the dream. This study, Single-neuron activity and eye movements during human REM sleep and awake vision - PMC, also suggests that eye movements in dreams are correlated with the picture observed.

My question here is: why are the eye muscles not paralyzed during sleep? Most of the body’s muscles are, and we still experience sensations in dreams, yet the eyes are left unparalyzed.

Different efferent path not passing through the same gating structure as the rest of the body.
More than likely this is due to the special nature of direct control by the lizard brain.
See:

This part of the sub-cortical structures needs direct access to the eyes to get things done.


I would add to what Gary is offering.

At the lowest level what he posted is absolutely correct.

This is part of a larger system of map interconnections.

While the effort to identify the function of the various layers is a worthy exercise, you cannot ignore the vast body of literature that localizes control and sensing to much larger regions of the brain.

The location signal is part of a large distributed system that covers most of the brain. Efforts to fix the location function to a tiny cortical column should be expanded to match this schema.

Here is the general bottom-to-top construction of the postural system, including the formation of your sense of location in your environment, which is the basis of your sense of self.

See that the blocks labeled Neck joint receptors down to postural adjustment are all part of the area shown in dotted lines below. While this is vastly simplified and not strictly correct, the general layout of map connections looks something like this:

The arrows are all interconnecting fiber tracts between maps. See how much of the brain is part of generating and using the location signals. The amount of this location signal that is available to any individual mini-column to form an object is very limited. Object recognition must be considered to happen in a much larger and more distributed form in the cortex.

This paper is one of the very best I know of to give some idea of how semantic knowledge is distributed over the cortex.
https://www.cell.com/trends/cognitive-sciences/pdf/S1364-6613(13)00122-8.pdf

Once you start to think of things this way it is easy to see that the parts that code for an entire object are clearly spread over a MUCH larger area than single columns, and likewise the representation of space and the relations between those objects.

@Viacheslav, to amplify on @Gary_Gaulin’s excellent answer, the high-level coordination between touch and motor planning is in the loops joining sensory fusion in the association areas and the pre-motor areas of the frontal cortex.

BTW: I am prepared to defend the global layout of function presented above, but it is off-topic for this thread, so I am asking you to “take my word for it” for this post.


Thanks for the detailed answer. I think I should’ve been more specific with my question. I also meant to ask that we still don’t know how the location (and orientation) signals are generated, as mentioned in this paper (for location only): “we don’t know how the location signal is generated”. https://www.frontiersin.org/articles/10.3389/fncir.2017.00081/full
Or maybe we do know now or are there any suggestions?

It should be clear in my answer that I do not agree with this paper.

It’s not that there is no location signal. This is essential to decoding objects with motion. The paper makes a clear case for the utility of this function.

My problem is that this paper ignores the well known distribution of function in the brain. To square the circle I need to see how this global location signal is parceled out to the individual column structures.

I see this “distribution of location” as implemented by the reverse hierarchy mechanism.

Retinal motion sensors require a controlled amount of motion over what is being scanned.

Audio waves are already in motion, and move at the speed of sound. No muscles are required to move the waves across our ears.

From what I can see auditory disattention pertains more to controlling unwanted auditory noise and hallucinations that come from the mind itself, not sound waves traveling through the air.

Motion sensors requiring motion to sense an object, and audio waves already being in motion, would in this case both be unrelated to mechanisms of auditory disattention.

(Just a minor aside)
REM sleep can’t be equated to dreaming; dreaming can’t be equated to REM sleep.
REM-free dreams and dream-free REM sleep are neither pathological nor very rare.
Mark Solms presents lots of data in his 1997 “The Neuropsychology of Dreams: A Clinico-Anatomical Study,” and in the papers below.

Paper available here:
Dreaming and REM sleep are controlled by different brain mechanisms - Mark Solms

… and an interesting paper by the same author here:
The Interpretation of Dreams and the Neurosciences - Mark Solms
https://psychoanalysis.org.uk/articles/the-interpretation-of-dreams-and-the-neurosciences-mark-solms

Experimental research on dreaming: state of the art and neuropsychoanalytic perspectives - Perrine M. Ruby


Eureka! I found something important that’s related to the connections included in the outline, for the (gatekeeper of the cerebral cortex) thalamus:

During sleep, most thalamic neurons are in burst mode. During waking many thalamic neurons remain in burst mode. In burst mode, neurons cannot communicate specific information. However, if a novel stimulus is presented, the sudden change from burst to tonic mode may be a major factor in alerting the cortex. Additionally, this intrinsic rhythmicity probably contributes to the generation of cortical electroencephalographic rhythms.

https://www.dartmouth.edu/~rswenson/NeuroSci/chapter_10.html
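The burst/tonic distinction quoted above can be caricatured in a few lines: a relay in burst mode cannot pass specific information, but a novel stimulus flips it to tonic mode, first sending an undetailed alert, then faithfully relaying the stimulus. This is a cartoon of the quoted idea, not a biophysical model; the class and stimulus labels are invented.

```python
# Toy model of the burst/tonic thalamic relay described in the quote above.
# In burst mode the relay only signals "something changed"; in tonic mode
# it relays the stimulus itself. A cartoon, not a biophysical model.

class ThalamicRelay:
    def __init__(self):
        self.mode = "burst"
        self.last_stimulus = None

    def relay(self, stimulus):
        novel = stimulus != self.last_stimulus
        self.last_stimulus = stimulus
        if self.mode == "burst":
            if novel:
                self.mode = "tonic"  # novelty flips the relay to tonic
                return "ALERT"       # wake-up call to cortex, no detail
            return None              # rhythmic bursting, no specific info
        return stimulus              # tonic mode: faithful relay

relay = ThalamicRelay()
print(relay.relay("tone_A"))  # ALERT  (novel stimulus while bursting)
print(relay.relay("tone_A"))  # tone_A (now tonic, detail passes through)
```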

During REM sleep, the brainstem’s non-logical, emotion-related circuitry can be expected (through the pons) to also influence the thalamus.

This sounds to me like a categorization process needed to store emotion based memory data to the most appropriate neocortical memory locations. If that’s true then the closest electronic example I can think of for the circuit being constructed is an address decoder:
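To make the address-decoder analogy concrete, here is a minimal sketch in which an address (standing in for an emotion category) drives exactly one select line high, routing a write to one memory region. The categories, region names, and bit widths are invented purely for illustration.

```python
# Toy address decoder, as an analogy only: an "emotion category" address
# selects which memory region today's experience is written to.
# Categories and region names are invented for illustration.

REGIONS = {0b00: "neutral_store", 0b01: "fear_store",
           0b10: "reward_store", 0b11: "social_store"}

def decode(address_bits):
    """One-hot select: exactly one region line goes high per address."""
    return {region: (bits == address_bits) for bits, region in REGIONS.items()}

select = decode(0b01)
assert select["fear_store"] and sum(select.values()) == 1  # one line high
```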

Temporal memory can play back sequences as they were previously experienced. From the outline, sensory input goes into Layer 4, and Layer 5ab output goes outward laterally.

4   Primary input layer for skin, eye, and ear sensory input. Where everything starts.
    Minicolumn network represents sequences, melodies, how things behave.
    Output connects to 2/3, then non-drivers connect 2/3 back to 4.
    Sensory input is gated by the thalamus.

5ab Lateral output layer,
    also connects (through the thalamus) laterally to 4, making Places part of an Object.
    Also connects to its 6.
    Represents motor behaviors; motor outputs connect subcortically.
    Two cell types, one for output, the other not.
    In humans, 5a thin tufted cells connect laterally to other 5a.
    In humans, 5b thick tufted cells are assigned to movement.
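The sequence-playback idea mentioned above can be sketched very simply: learn pairwise transitions from an experienced sequence, then replay it from a cue. This is only the bare idea, not Numenta’s temporal memory algorithm; the melody is an arbitrary example.

```python
# Minimal sketch of sequence playback (the bare idea, not Numenta's
# temporal memory algorithm): learn pairwise transitions, then replay
# a previously experienced melody from its first note.

def learn(sequence):
    """Store each element -> next-element transition."""
    return {a: b for a, b in zip(sequence, sequence[1:])}

def playback(transitions, cue, length):
    """Replay the learned sequence starting from a cue element."""
    out = [cue]
    while len(out) < length and out[-1] in transitions:
        out.append(transitions[out[-1]])
    return out

melody = ["do", "re", "mi", "fa", "so"]
print(playback(learn(melody), "do", 5))  # replays the whole melody
```

Note that a first-order transition table like this fails on sequences with repeated elements; real temporal memory uses context to disambiguate, which is exactly what this toy omits.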

Each whisker and other body sensor has its own cortical column, located in the brain relative to where the sensor is spatially located on the body and in relation to the other whiskers or sensors that may in turn have been involved.

The speed of something moving through the field of vision is a frequency that comes from retinal motion sensors hooked up in a long string, firing one after another as an edge moves across in the proper direction. So many are strung through each other that a retina looks like a tangled mess. Along with color and other information, moving visual experiences can be recalled by a sequence of events that maps to the retina instead of the ears.
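The "string of motion sensors" idea above can be sketched numerically: an edge sweeping at constant speed across a row of evenly spaced detectors fires them one after another, so the firing rate encodes the edge's speed. Sensor spacing and speeds below are arbitrary illustration values.

```python
# Sketch of a chain of motion sensors: an edge sweeping across a row of
# detectors at constant speed fires them in order, so the firing rate
# (spikes per unit time) is proportional to the edge's speed.
# Spacing and speed values are arbitrary.

def firing_times(n_sensors, spacing, speed):
    """Times at which each sensor in the chain fires as the edge passes."""
    return [i * spacing / speed for i in range(n_sensors)]

def firing_rate(times):
    """Spikes per unit time over the sweep."""
    return (len(times) - 1) / (times[-1] - times[0])

slow = firing_times(10, spacing=1.0, speed=2.0)
fast = firing_times(10, spacing=1.0, speed=4.0)
assert firing_rate(fast) == 2 * firing_rate(slow)  # rate encodes speed
```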