Right so from that premise how do you build emotional drives which first maintain the basic functions of life and then defend life and then pass on the information that is a life?
That is the question isn’t it? We’re all here to contribute to whatever the final solution is. But at least we can begin to determine what that final solution’s constituent parts must be?
Um, I think that something was lost in the translation here.
The limbic system does all this stuff I posted above all by itself. Its functions are much older than the cortex and are vital to survival for all the “lower” lifeforms.
The Papez circuit loops in dozens of clusters of nerve bodies, each tuned up to do different things.
I have mentioned the amygdala but there are other important bits and pieces involved:
Each of these bits has bunches of sub-components - for example - the hypothalamus:
Back to the amygdala - it is also composed of lots of subsystems:
Its tiny little projecting fibers go all over the place!
So all these emotions are not some little bit you learn - they are discrete chemical pathways that are learned as part of the mix of fibers feeding into the prefrontal cortex and the temporal lobe.
As you experience whatever it is that you experience, the limbic system is adding its output to the other senses, to be learned and remembered right alongside those other senses.
Remember - this is all before the cortex has anything to say about things.
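The idea above can be put in code form as a toy sketch (not a biological model, and all names here are illustrative): each sensory observation is stored together with an "emotional tag" from the limbic side, so that recalling the pattern brings the tone back with it.

```python
# Toy sketch: memories carry an emotional tag alongside the sensory
# content, so recall returns both together - loosely analogous to
# limbic output being consolidated alongside the other senses.
from dataclasses import dataclass

@dataclass
class Memory:
    sensory: tuple          # e.g. (sight, sound) features
    emotional_tone: float   # limbic output: -1.0 (threat) .. +1.0 (reward)

episodic_buffer = []

def experience(sensory, emotional_tone):
    episodic_buffer.append(Memory(sensory, emotional_tone))

def recall(cue):
    # Recalling a sensory pattern retrieves its emotional tone with it.
    return [m for m in episodic_buffer if cue in m.sensory]

experience(("snake", "hiss"), -0.9)
experience(("mother", "smile"), 0.8)
print(recall("snake")[0].emotional_tone)  # -0.9
```

The point of the sketch is only that the tag is not a separate lesson learned later; it is part of the stored record from the start.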
Right so there is that portion of beneficial survival pre-programming that is genetically passed on… Emotions and behaviors having to do with survival… But how much of this is necessary? And, how much of this is necessary in an artificial intelligence? Is our goal to reproduce something that looks human or is it to produce intelligent cognitive applications which can benefit humanity without anthropomorphism?
Do we need to duplicate the emotional/survival machinery in order to produce sentience?
This stuff is well preserved from reptiles and amphibians.
Until you know how the stuff works, I would assume that it is all needed until you learn otherwise.
This is the “dumb boss” part of my “dumb boss / smart adviser” model.
The “dumb boss” is still pretty smart.
When you can program a robot to do everything a crocodile does using neural networks you can make the call on what parts are not needed.
How do you purposely evolve an entity that will come about through emergence? How do you place constraints on an emergent process? Moral constraints? Behavioral constraints? Because I don’t think any of this can be built - it has to be grown from something capable of evolving these qualities?
Those are some of the questions I have been working on for the last 20 or so years.
The work progresses slowly.
So if our AIs are going to evolve through emergence, what scares me is that the one example we have passes through a selfish/bratty stage. Ever try to negotiate with a 5 year old? They couldn’t care less about equanimity.
I am much more worried about making an autistic or psychotic AGI.
Or one that we have no model of in humans but is still very undesirable.
It’s hard enough programming a little human to do the right thing. Nobody has any experience with AGIs so we have no idea what you can do wrong. I am certain we will find out.
Pro tip: No matter how it pleads or what fancy argument it offers: don’t give your new AGI a great big strong invincible robot body or control of a nuclear arsenal.
Come with me if you want to live! lmao!
What if human beings are simply an interim development on the road to what nature intends to create?
Emotions are how we make sense out of what to do in terms of preserving, protecting and propagating life. All living things do these three things in some way…what I suggest is that the basic control of preserving life gives rise to learning and that this mechanism is likely the same across species and even kingdoms. Why else would plants play?
I think if we figure out how seeking and play evolve as drives, either from our emotional centers or the equivalent in plants and simpler animals, we will nail the learning question. Learning is more than just patterning… it’s actively seeking out patterns, strengthening them by making them more dense, and then playing with them to see how patterns go together. The rest of the emotion thing just confuses things because it’s there to protect against threats that should be largely irrelevant in a complex civilization.
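The “seeking” drive described above can be caricatured in a few lines. A minimal sketch, assuming a simple count-based novelty bonus (a standard toy from reinforcement learning, not anything from the brain): the agent prefers states it has seen least, so it actively hunts for new patterns rather than passively recording whatever arrives.

```python
# Toy sketch of "seeking" as a drive: prefer the least-visited state.
# All function and variable names are illustrative.
from collections import Counter
import math

visits = Counter()

def novelty_bonus(state):
    # Bonus shrinks as a state becomes familiar.
    return 1.0 / math.sqrt(1 + visits[state])

def choose(states):
    best = max(states, key=novelty_bonus)
    visits[best] += 1
    return best

# The agent cycles through unseen states before revisiting familiar ones.
seen = [choose(["a", "b", "c"]) for _ in range(3)]
print(seen)  # ['a', 'b', 'c']
```

Under this caricature, exploration is not an add-on to learning; it is the selection pressure that decides what gets learned next.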
I beg to differ.
We are social animals.
The amygdala is very good at picking out faces and social cues.
Did what I did make mom smile or frown?
Did the alpha male groom us or strike us?
Do my peers emit affirming or threatening behavior?
All of this adds emotional tone to my actions (not yet consolidated in my hippocampus memory buffer) and modulates learning rates. I would not consider this irrelevant to a complex civilization. In fact, we have examples of humans that did not have this influence baked into the cortex during the critical phases of development (abandoned orphans in the post-Soviet era), and these are deeply damaged persons.
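“Modulates learning rates” can be sketched concretely. A minimal toy, assuming a plain delta-rule learner (the scaling factor and names are illustrative, not a claimed biological mechanism): social/emotional salience multiplies the learning rate, so emotionally charged events (mom frowned, the alpha struck us) are learned faster than neutral ones.

```python
# Toy sketch: emotional salience scales the learning rate of a
# delta-rule update. Names and constants are illustrative.
def update_weight(w, target, prediction, base_lr=0.1, salience=0.0):
    """salience in [0, 1]: 0 = neutral event, 1 = highly charged."""
    lr = base_lr * (1.0 + 4.0 * salience)  # charged events learn up to 5x faster
    return w + lr * (target - prediction)

w_neutral = update_weight(0.0, 1.0, 0.0, salience=0.0)  # 0.1
w_charged = update_weight(0.0, 1.0, 0.0, salience=1.0)  # 0.5
```

The same prediction error produces a bigger weight change when the event carried emotional weight, which is one crude way to read “emotion as a learning-rate signal.”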
This is also the area where social shapes & cues are recognized - secondary sexual characteristics, mating dances, pheromones, “my” species recognition. All part of civilization.
I know that it is attractive to focus on the exploration/play as the important parts and toss out all the rest as “unnecessary” messy bits but I don’t think that is wise.
Likewise - some would have us toss out the sensorimotor parts as unnecessary to making an AGI. These people will be on their own without guidance from modeling the brain as this is proving to be an integral part of how the brain forms semantic categories. Indeed - it is looking like speech is, in fact, a learned motor action encompassing broad swaths of the sensorimotor areas.
Like it or not - the only example we have of an actual working intelligence has messy biological sub-systems that seem to be needed to work the way we want it to work. Let me clarify to head off misunderstandings; not the actual bloody bits but the functions and arrangements of functions.
I wonder if this means language is “un-learnable” without a tongue? (…or functionally equivalent apparatus). For example, is the physical motion of word and syllable formation necessary to acquire language distinctions? Interesting!
…and then there is the fact that there is no knowing without language. Observation is a phenomenon of distinction in the domain of language - so language is the foundation of sentience! (By language I don’t have to necessarily mean words though - but conceptual representation)
Well at least we can eliminate the parts which are present to support the fact that the mechanisms are biological… (i.e. neurotransmitters, pheromones/hormones etc). Mind you, I don’t mean eliminate what they do - but replace their function digitally.
Yes and no. One of the interesting applications is diagnosis based on chemicals in the breath.
In general - a sense of smell would also be a useful feature.
So, in this case, the digital sensor will still be reading the messy analog bits.
I am fully aware of the utility of emotions and was not suggesting that we don’t need them in our current state of civilization. Civilization evolved with emotions as part of the mix and so it obviously still requires them. What I mean is that emotions seem to be a very imprecise and inarticulate form of communication. Language emerges as a more precise means of communication but is still terribly imprecise. As an example, I could take your words above as an admonishment for not thinking the way you do. However, I choose to see them as a means by which I can draw out my own thoughts about something very complex. Even in language there are huge areas of imprecision - nowhere near as imprecise as emotional communication, but still lacking in precision.

I think it all stems from biological signalling which slowly becomes more organized and precise. Signals are patterned and formed into more complex forms of communication, each level adding to the precision with which we can communicate. Underlying it all has to be a simple repeated signalling system which gradually self-organizes according to some as yet intangible pattern.

I also think we have to be careful not to assume that we are the only example of working intelligence. Many other organisms display elements of intelligent behaviour and even the capacity to learn from the environment, just not at human rates - otherwise why bother experimenting with chimps and lab rats at all? The fact that we lack the language to communicate with other organisms is what makes us think they are not intelligent. We cannot have thoughts without some sort of symbolic language to convey them, but that language can take many forms. Is sign language really the same as English? Can it really convey the same thoughts in the same way with the same level of precision?
What about drawing or dancing or music - are these not forms of communication, just different in their precision and/or better suited to conveying certain types of information? Bees do a dance to very precisely convey information about where a food source is to be found. All kinds of social animals convey meaning through different channels of communication… is this not intelligence? Tool use and prediction are everywhere in the natural world and we highly value these things in defining our own intelligence. Just a thought, and in the interests of precision, not a foregone conclusion - just ideas.
Bitking, thanks for this info it is new to me.
Humor is very important. I hope Hal has a sense of humor.
pro tip: Make friends with the AIs.
BTW please note the use of the word SHOULD in the above quote… we should be able to do better than we do at this stage in our progression through evolution. I think we got sidetracked here. Emotions can and do play a role in learning… it’s just a question of which ones and why. I don’t claim to have any of this figured out… just trying to learn. The reality in human learning is that emotions, more often than not, are a hindrance, largely because of how complex the interactions are.
Chill dude - I’m not attacking.
I do see on this site that many of the AGI experimenters are focused on the cortex as the final answer.
My main point is that it is just one (a big one) of many interacting systems.
It is my strongly held personal opinion that if you remove the functions provided by these other systems you will end up with a useless toy. I am not trying to attack anyone and I do understand the desire to make things simple and easy. I just don’t think that hooking a bunch of cortex maps together with a few encoders and decoders is going to get a functional AGI.
I did not come to this point lightly - it is the result of a long study. I will share what I can to help other people see why I think this is true.