I think there is the realization that altruism and self-help are indistinguishable… In the same way, if I live in a neighborhood dominated by Hondas and I go out and buy a Rolls Royce, I then have to have a garage and an alarm system and other protective measures to ensure the security of my disproportionately expensive property. So the more valuable something is compared to what others have, the less freedom I’d have with it.
So freedom is proportional to the pervasiveness of a thing: the rarer and more conspicuous a possession, the less freely it can be enjoyed. Therefore, real prosperity can’t exist in an isolated condition, because its enjoyment is constrained in proportion to its conspicuousness.
So in order to experience true prosperity - everyone around us must be prosperous as well… so altruism is maybe actually the way to benefit one’s self?
Could be… maybe long-term benefit vs. short-term gain? I don’t disagree… still, it seems so many problems could be solved if we put ourselves second… the penalty for too little is much higher than the penalty for too much, from a biological standpoint. There has to be some simple principle at work here that works its way up to the neocortex… otherwise how does a frog or a stink bug or a phytoplankton do it?
Um, I think that something was lost in the translation here.
The limbic system does all the stuff I posted above all by itself. Its functions are much older than the cortex and are vital to survival for all the “lower” lifeforms.
The Papez circuit loops through dozens of clusters of nerve bodies, each tuned up to do different things.
I have mentioned the amygdala but there are other important bits and pieces involved:
Right so there is that portion of beneficial survival pre-programming that is genetically passed on… Emotions and behaviors having to do with survival… But how much of this is necessary? And, how much of this is necessary in an artificial intelligence? Is our goal to reproduce something that looks human or is it to produce intelligent cognitive applications which can benefit humanity without anthropomorphism?
Do we need to duplicate the emotional/survival machinery in order to produce sentience?
How do you purposely evolve an entity that will come about through emergence? How do you place constraints on an emergent process? Moral constraints? Behavioral constraints? Because I don’t think any of this can be built - it has to be grown from something capable of evolving these qualities?
So if our AIs are going to evolve through emergence, what scares me is that the one example we have passes through a selfish/bratty stage. Ever try to negotiate with a 5-year-old? They couldn’t care less about equanimity.
Emotions are how we make sense out of what to do in terms of preserving, protecting and propagating life. All living things do these three things in some way…what I suggest is that the basic control of preserving life gives rise to learning and that this mechanism is likely the same across species and even kingdoms. Why else would plants play?
I think if we figure out how seeking and play evolve as drives, either from our emotional centers or the equivalent in plants and simpler animals, we will nail the learning question. Learning is more than just patterning… it’s actively seeking out patterns, strengthening them by making them more dense, and then playing with them to see how patterns go together. The rest of the emotion thing just confuses things because it’s there to protect against threats that should be largely irrelevant in a complex civilization.
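Just to make the “seeking drive” idea concrete, here is a toy sketch (purely illustrative; the agent, its ring world, and every parameter are made up for this post): an agent that simply prefers whatever it has experienced least ends up exploring its whole world without any external reward at all.

```python
import random

def novelty_seeking_walk(n_states=5, steps=200, seed=0):
    """Agent on a ring of states. At each step it moves to the
    neighboring state it has visited least -- a count-based stand-in
    for a 'seeking' drive -- breaking ties at random."""
    rng = random.Random(seed)
    visits = [0] * n_states
    state = 0
    for _ in range(steps):
        visits[state] += 1
        left, right = (state - 1) % n_states, (state + 1) % n_states
        if visits[left] < visits[right]:
            state = left
        elif visits[right] < visits[left]:
            state = right
        else:
            state = rng.choice([left, right])
    return visits
```

Running it, the visit counts come out roughly uniform: novelty-seeking alone produces full coverage of the environment, which is the kernel of “actively seeking out patterns” rather than passively recording them.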
The amygdala is very good at picking out faces and social cues.
Did what I did make mom smile or frown?
Did the alpha male groom us or strike us?
Do my peers emit affirming or threatening behavior?
All of this adds emotional tone to my actions (not yet consolidated in my hippocampal memory buffer) and modulates learning rates. I would not consider this irrelevant to a complex civilization. In fact, we have examples of humans who did not have this influence baked into the cortex during the critical phases of development (abandoned orphans of the post-Soviet era), and they are deeply damaged persons.
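The “emotional tone modulates learning rates” point can be cartooned in a few lines (my own sketch, not a claim about how the amygdala actually computes; the function name and the 0–1 salience scale are invented for illustration): a standard prediction-error update where emotionally salient events simply learn faster.

```python
def emotional_update(value, outcome, salience, base_lr=0.1):
    """One prediction-error update. 'salience' (0..1) stands in for
    an amygdala-style emotional tag and scales the learning rate,
    so charged events reshape expectations more than neutral ones."""
    lr = base_lr * (1.0 + salience)  # salient events learn faster
    return value + lr * (outcome - value)
```

So the same surprising outcome moves my expectation twice as far when it arrives with full emotional salience (`salience=1.0`) as when it arrives flat (`salience=0.0`), which is one cheap way a single scalar “tone” could gate learning without any extra machinery.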
This is also the area where social shapes & cues are recognized - secondary sexual characteristics, mating dances, pheromones, “my” species recognition. All part of civilization.
I know that it is attractive to focus on the exploration/play as the important parts and toss out all the rest as “unnecessary” messy bits but I don’t think that is wise.
Likewise - some would have us toss out the sensorimotor parts as unnecessary to making an AGI. These people will be on their own, without guidance from modeling the brain, as this is proving to be an integral part of how the brain forms semantic categories. Indeed - it is looking like speech is, in fact, a learned motor action encompassing broad swaths of the sensorimotor areas.
Like it or not - the only example we have of an actual working intelligence has messy biological sub-systems that seem to be needed to work the way we want it to work. Let me clarify to head off misunderstandings; not the actual bloody bits but the functions and arrangements of functions.
I wonder if this means language is “un-learnable” without a tongue? (…or functionally equivalent apparatus). For example, is the physical motion of word and syllable formation necessary to acquire language distinctions? Interesting!
…and then there is the fact that there is no knowing without language. Observation is a phenomenon of distinction in the domain of language - so language is the foundation of sentience! (By language I don’t have to necessarily mean words though - but conceptual representation)
Well, at least we can eliminate the parts which are present only to support the fact that the mechanisms are biological (e.g. neurotransmitters, pheromones/hormones, etc.). Mind you, I don’t mean eliminate what they do - but replace their function digitally.
I am fully aware of the utility of emotions and was not suggesting that we don’t need them in our current state of civilization. Civilization evolved with emotions as part of the mix and so it obviously still requires them. What I mean is that emotion seems to be a very imprecise and inarticulate form of communication. Language emerges as a more precise means of communication but is still terribly imprecise. As an example, I could take your words above as an admonishment for not thinking the way you do. However, I choose to see them as a means by which I can draw out my own thoughts about something very complex. Even in language there are huge areas of imprecision - nowhere near as imprecise as emotional communication, but still lacking in precision.

I think it all stems from biological signalling which slowly becomes more organized and precise. Signals are patterned and formed into more complex forms of communication, each level adding to the precision with which we can communicate. Underlying it all has to be a simple repeated signalling system which gradually self-organizes according to some as yet intangible pattern.

I also think we have to be careful not to assume that we are the only example of working intelligence. Many other organisms display elements of intelligent behaviour and even the capacity to learn from the environment, just not at human rates - otherwise why bother experimenting with chimps and lab rats at all? The fact that we lack the language to communicate with other organisms is what makes us think they are not intelligent. We cannot have thoughts without some sort of symbolic language to convey them; however, that language can take many forms. Is sign language really the same as English? Can it really convey the same thoughts in the same way with the same level of precision?
What about drawing or dancing or music - are these not forms of communication, just different in their precision and/or better suited to conveying certain types of information? Bees do a dance to very precisely convey information about where a food source is to be found. All kinds of social animals convey meaning through different channels of communication… is this not intelligence? Tool use and prediction are everywhere in the natural world, and we highly value these things in defining our own intelligence. Just a thought, and in the interests of precision, not a foregone conclusion - just ideas.