I have to apologize to both of you. This is old baggage for me.
The baggage comes from several prior exchanges with @abshej on this same general topic. I have been seeing this viewpoint from various AI researchers for years: the idea that there is some intelligence function separable from running a body, and that if we can just isolate it, we can dispense with the messy biological bits like emotion and the body's I/O functions. Often this is mixed with cortex chauvinism, the notion that the older parts of the brain are just relay stations whose job is to feed the cortex the data it needs to do what it does.
Some sort of general g-factor thing, with a sprinkle of the pixie dust of self-awareness.
This runs so contrary to what I have been reading that I wonder why they can't see the brain just does not work that way. I am convinced that function follows form here, and that trying to extract the "intelligence" from the brain will mostly end up duplicating the functions (if not the actual form) of the brain.
The brain: it's not a filter. It's not a computer. It's not some heightened expression of panpsychism. It's not an algorithm or a cost-reduction function. And I really think it's silly to try to tie in quantum uncertainty. I'm sorry that these are triggers for me, but after 35 years of reading about this stuff, these sorts of statements are annoying.
I will try to keep this in check, but it's a struggle.
For example: