Intelligence vs Consciousness


#90

I see daisy chains.
Brain images display the beauty and complexity of consciousness:
https://www.newscientist.com/article/mg23431290-400-brain-images-display-the-beauty-and-complexity-of-consciousness/


#91

Without getting deep into the philosophical swamp, I think the practically useful definition for consciousness is simply: “The ability of a system to represent itself as part of its model of the world”, i.e. it has the concept of “self”.
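One way to read that definition concretely is as a test on the world model itself. Below is a minimal, hedged sketch in Python (every class and name here is hypothetical, purely for illustration): a system counts as “conscious” under this definition only if its world model contains an entry referring to the system itself.

```python
# Minimal illustrative sketch (hypothetical names): under the definition above,
# a system is "conscious" if its world model contains an entry for itself.

class WorldModel:
    def __init__(self):
        self.entities = {}                      # entity name -> set of properties

    def add_entity(self, name, properties):
        self.entities[name] = set(properties)


class Agent:
    def __init__(self):
        self.model = WorldModel()

    def observe(self, name, properties):
        self.model.add_entity(name, properties)

    def has_self_concept(self):
        # The practical test proposed above: does the world model contain "self"?
        return "self" in self.model.entities


agent = Agent()
agent.observe("cup", {"graspable", "on_table"})
print(agent.has_self_concept())                 # False: it models the world, not itself
agent.observe("self", {"has_gripper", "at_origin"})
print(agent.has_self_concept())                 # True: the model now includes the agent
```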


#92

Wouldn’t a higher region forwarding an input to L4 in a lower region be indistinguishable from the cortex having sensors (as in the context of the opening post of this thread)?


#93

This is essentially what I am proposing here:


#94


#95

Aren’t Intelligence and Consciousness a classification of 2 very different phenomena?

Intelligence is the ability of an entity to acquire adaptability, knowledge, and skills to become more proficient in its environment. (An organism doesn’t have to have any idea that it’s acquiring abilities in order to acquire them, no?)

Consciousness is the ability to assess one’s own relationship to one’s reality and environment, and to consider oneself? The ability to have meta-knowledge?


#96

In my work, yes, these are two different things. But for intelligence to work it needs a consciousness system in place first, as a support structure/system.


#97

Good read on this topic.


#98

Lizard consciousness is simple: one reality being played out.
In a bigger, better system there are many realities and backup models of reality to fall back on, with blocking neurons suppressing the unwanted, incorrect realities at the last moment.
How well this is done is the intelligence of the system. Boosting from other realities to assemble a reality is possible.
In wetware many of these must be running in parallel. In silicon, many realities could be run one after another at high speed and the right one selected.

But for this to work you need an echo or a memory model.
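To make the silicon version of this concrete, here is a rough, hedged sketch (all names and numbers are hypothetical, not anyone’s actual model): each candidate “reality” is a simple predictive model, each is replayed against an echo (a short memory of recent observations) one after another, and the reality with the lowest prediction error is selected while the rest are blocked.

```python
import random

# Rough illustrative sketch (hypothetical): run candidate "realities" serially
# against an echo (a short memory of recent input) and keep the best-matching one.

def make_reality(drift):
    """A candidate reality: predicts the next value as the last value plus a drift."""
    def predict(history):
        return history[-1] + drift
    return predict

def prediction_error(reality, echo):
    """Replay the echo and accumulate the reality's one-step prediction errors."""
    error = 0.0
    for t in range(1, len(echo)):
        error += abs(reality(echo[:t]) - echo[t])
    return error

# The echo / memory model the post says is required.
echo = [0.0]
for _ in range(20):
    echo.append(echo[-1] + 1.0 + random.gauss(0, 0.1))    # the world drifts by about +1

candidates = {d: make_reality(d) for d in (-1.0, 0.0, 0.5, 1.0, 2.0)}

# "Run one after another at high speed and select the right one":
# unwanted realities are scored and then simply blocked (discarded).
best = min(candidates, key=lambda d: prediction_error(candidates[d], echo))
print("selected reality: drift =", best)                   # expect drift close to 1.0
```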

RE: Alan Watts Reflection ft. Who Am I?:


#99

I think you nailed the key problem… we can never ‘describe’ consciousness because if you really think about it, we cannot describe anything completely. At any level of graininess/abstraction, describing anything completely would be describing everything (including what it is not). To communicate, we can only allude to stuff, and hope (or take for granted) that the receiver has the requisite experiential representations. In other words, could we communicate without any shared experience? ( If not, physics is our only hope with aliens :stuck_out_tongue: )


#100

I think intelligence (the ability to learn and create new responses) requires the inclination to respond contextually while informed by memory. Intelligence enjoys creating responses that are synthetic, i.e. more than just remembered behaviors.

The book (On Intelligence by Jeff Hawkins) starts off right by declaring that any section of the (six-layer) cerebral cortex is similar to any other, and its plasticity even allows vision (normally experienced in the occipital lobes) to be experienced as a kind of vision in the sensory cortex (behind the central sulcus), such as when a blind person wears a device that transduces video to a tongue-mounted display. This requires intelligence, and it also means that any form of mental activity in any piece of neocortex runs on the same natural physics as any other ongoing mental activity, which is the essence of consciousness.

Activity at any part of the cerebral cortex can be measured by electrical field variances - I like to refer to these fleeting bits of electrical activity as mental objects. They can be detected by EEG or probes, and when a pulse is skillfully applied to the same spot in a person’s brain, the person will remark that the mental object is (more or less) present (a sensorimotor type of test in humans).

Conscious activity in the cerebral cortex is the ongoing arising and passing away of mental objects, both via the senses and from formed associative memories (recallable sequences of mental objects having commonality with currently active mental objects).

A natural part of consciousness includes the continuous formation of new recallable sequences.
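As a rough, hedged illustration of the “recallable sequence” idea above (all names here are made up for the example, not a claim about how cortex implements it): stored sequences of mental objects are recalled when their leading objects overlap the currently active set, and whatever just occurred can be stored as a new recallable sequence.

```python
# Illustrative sketch (hypothetical names): sequences of "mental objects" are
# recalled by their commonality with currently active mental objects.

stored_sequences = [
    ["smell_coffee", "see_mug", "reach", "grasp", "sip"],
    ["hear_phone", "see_screen", "reach", "answer"],
]

def recall(active_objects, sequences, min_overlap=1):
    """Return stored sequences whose leading objects overlap the active set."""
    return [seq for seq in sequences
            if len(set(seq[:2]) & active_objects) >= min_overlap]

def form_new_sequence(recent_objects, sequences):
    """The continuous formation of new recallable sequences: store what just happened."""
    sequences.append(list(recent_objects))

active = {"see_mug", "smell_toast"}
print(recall(active, stored_sequences))         # recalls the coffee sequence
form_new_sequence(["smell_toast", "see_plate", "reach"], stored_sequences)
print(len(stored_sequences))                    # 3: a new sequence has been formed
```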

I would say,

Consciousness is experienced in the brain’s dimension of memory, usually forming new memory. Some instances of consciousness are more intelligent than others.


#101

Good. Basically, from the perspective of information theory, it is the coding of information.
Is there any explanation of thinking, attention, and consciousness based on this theory? We know that thinking is a necessary feature of AI.
Or is there any explanation of thinking not related to the neocortex?


A Framework for Intelligence and Cortical Function Based on Grid Cells in the Neocortex
#102

“Thinking” is fuzzy enough to be essentially a useless term.
As for the rest of your list, please see if this addresses your questions:

and


#103

Thinking includes analysis, reasoning, comparison, induction, etc.
How could it be a useless term? In fact, I don’t care much about consciousness.
Let us assume that grid cell theory can model the universe; how does it do planning or logical reasoning?


#104

It is interesting that you include a list of activities that could well include consciousness and attention all under the umbrella of “thinking.” It is this combination of a large number of interacting processes that makes it a useless term. What you are really asking is “how does the whole brain work?”

Consciousness is likely to be the “vehicle” that carries the thinking process but as I indicated in the linked post - consciousness is composed of many sub-activities that work together to create a final result.

I dare say that nobody in neuroscience will be able to tie all of these together into a whole until someone gets a working AGI to stand as an example.

If you feel up to the task I will be delighted to read your take on how all this works.


#105

This is not what I want to ask.
For example, I won’t ask how to achieve emotions, because I don’t think they are a necessary feature of AGI.


#106

While I do disagree with you on the need for emotions in a functioning AGI, I won’t engage on that in this thread - I have posted my take on this topic in numerous places in this forum.

To try and keep this discussion on-topic for the original “framework for …” I would like to turn this back to you and ask “What would an answer to your question look like?”

I proposed a description of the process of consciousness and assume that the contents of consciousness will include many of the processes that you are asking about. You have rejected that out of hand, so what would these processes look like to your way of thinking?


#107

Both intelligence and consciousness are fuzzy concepts, reflected in multiple definitions. All the definitions I know are post-hoc products: they start from an elusive (human) internal sense of familiarity, or from a presumed function these concepts serve, and lead us to compile a definition that complies with those motivations. The first question should be whether we could explain and model human behavior without needing to invoke these concepts at all. Nonetheless, these concepts are fascinating and addictive, as they “color” internal processing to make it stand out.
As for consciousness, a helpful model would be to see it as some sort of co-activation of a specific representation (input or output) with some sort of self-representation. This co-activation “tags” these representations as having higher relevance to the self (for inputs) or self-agency (for outputs). This in turn is valuable gain-control information for learning. For instance, our outputs are also an input for us through various sensory channels, and it is important to distinguish them from inputs generated by other agents. This is essential for reinforcement learning.
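A minimal, hedged sketch of that gain-control reading (every name and number below is hypothetical, just to make the idea concrete): when the system issues an output, a record of that output stands in for the co-active self-representation; any later input matching that record is tagged as self-generated and learned from with a different gain than inputs attributed to other agents.

```python
# Illustrative sketch (hypothetical names and values): co-activation with a
# "self" record tags an input as self-generated and modulates the learning gain.

SELF_GAIN = 1.5      # self-solicited events weighted more strongly for learning
OTHER_GAIN = 0.5     # events attributed to other agents weighted less

class Learner:
    def __init__(self):
        self.value_estimate = 0.0
        self.pending_self_actions = set()    # record of our own recent outputs

    def act(self, action):
        # Issuing an output co-activates the self-representation for that action.
        self.pending_self_actions.add(action)

    def sense(self, event, reward):
        # For simplicity the sensory event carries the name of the action that
        # caused it, so self-generated inputs can be recognized by lookup.
        self_generated = event in self.pending_self_actions
        self.pending_self_actions.discard(event)
        gain = SELF_GAIN if self_generated else OTHER_GAIN
        # Gain-controlled update of a simple running value estimate.
        self.value_estimate += 0.1 * gain * (reward - self.value_estimate)
        return self_generated

agent = Learner()
agent.act("press_lever")
print(agent.sense("press_lever", reward=1.0))   # True: self-generated, high gain
print(agent.sense("loud_noise", reward=-1.0))   # False: other agency, low gain
```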


#108

Hard to accept it as the activation of a representation presented to the self.
I have often heard of the “self” applied as an agency separate from consciousness.
Memory and sensation should not be considered products for consumption by a separate self - if they were, then what is the self, where is it located, and how does that work… this leads to a too heavily layered model.

Memory and sensation are entwined in “consciousness”, AKA “normal brain function while awake or dreaming”.


#109

Maybe I’m confounding consciousness with self-awareness and self-consciousness. Anyway, I’m not referring to some homunculus structure with mysterious and miraculous properties, just a representation that gets activated whenever the input/output is self-generated or solicited (via attention, for instance). Maybe, if I refine these thoughts, a representation locked in space-time to some entity. Similarly, working memory and episodic memory are different in nature (and in importance for learning) from semantic memory. Finally, as I stated before, consciousness could be non-existent as we think of it, existing only as an emergent property or byproduct of a system that evolved to function in an ever-changing environment.