Language: the base of conscious thought?

neuroscience
#21

That raises another question: what exactly does it mean to experience something?
The architecture it uses is called the Transformer (it’s not even an RNN!).
It “explores” its text by attending to previous inputs selectively.
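
If you’re curious, here’s a minimal sketch of the scaled dot-product self-attention that does this (plain NumPy with made-up weight matrices, not any library’s actual API): each position scores its similarity to the positions before it and takes a weighted mix of them.

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Minimal single-head scaled dot-product self-attention with a causal mask.

    x: (seq_len, d_model) token embeddings
    Wq, Wk, Wv: (d_model, d_k) projection matrices (illustrative only)
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv            # project into query/key/value spaces
    scores = q @ k.T / np.sqrt(k.shape[-1])     # similarity of each position to every other
    # causal mask: each position may only attend to itself and earlier positions
    mask = np.triu(np.ones(scores.shape, dtype=bool), k=1)
    scores[mask] = -np.inf
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the allowed positions
    return weights @ v                          # weighted mix of earlier inputs

# toy usage: 5 tokens, 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(causal_self_attention(x, Wq, Wk, Wv).shape)  # (5, 8)
```

The causal mask is what makes the “attending to previous inputs” literal: position t can only look at positions up to t.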
Where’s the line between mere statistics and meaningful action driven by true understanding?
P.S. The positional encoding it uses has very similar characteristics to grid cells. It uses sine waves of different frequencies, covering different scales of position, to represent where each word is. I just thought this was interesting.
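
Here’s roughly what that looks like, a small sketch following the sine/cosine scheme from the “Attention Is All You Need” paper (the function name and sizes are just for illustration):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Sine/cosine positional encoding in the style of 'Attention Is All You Need'."""
    positions = np.arange(seq_len)[:, None]                 # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]                # even dimension indices
    angle_rates = 1.0 / np.power(10000.0, dims / d_model)   # geometrically spaced frequencies
    angles = positions * angle_rates                        # (seq_len, d_model/2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                            # even dims: sine
    pe[:, 1::2] = np.cos(angles)                            # odd dims: cosine
    return pe

pe = sinusoidal_positional_encoding(seq_len=10, d_model=16)
print(pe.shape)  # (10, 16)
```

Each pair of dimensions oscillates at a different frequency, so every position gets represented at many scales at once, which is the part that reminded me of grid cells.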

1 Like

#22

Thanks for informing me. Do you have an understanding of when in the training process it does this? Does it get the option to attend to previous inputs selectively between training epochs?

Meaningful action takes place in an environment over time. “Meaningful” is also objective. For a biological organism, “meaningful” means an action likely to increase its chances of gene survival. This can be as simple as not walking off a cliff, or as complex as planning and building a shelter. These are the types of actions we want to get to. I think if you have the right modeling system hooked up to sensors in an environment, you should be able to build models of the space starting with random movements. At every time step, with every motion, you get immediate feedback and the models are updated. You can learn extremely quickly this way, especially once you’ve defined yourself, the space itself, and the line between them. Now I’m rambling, sorry. Saturday night. :stuck_out_tongue:
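
To make that concrete, here’s a toy sketch of the kind of loop I mean (the ring-shaped “world” and the tabular model are hypothetical stand-ins, not any real sensorimotor framework): random movements, immediate sensory feedback, and a model update at every time step.

```python
import numpy as np

rng = np.random.default_rng(0)
n_locations = 6
sensations = rng.normal(size=n_locations)     # what each location "feels like"

# model[location, action] -> predicted sensation after moving (0 = left, 1 = right)
model = np.zeros((n_locations, 2))
counts = np.zeros((n_locations, 2))

location = 0
for t in range(2000):
    action = rng.integers(2)                              # random motor command
    next_location = (location + (1 if action == 1 else -1)) % n_locations
    feedback = sensations[next_location]                  # immediate sensory feedback

    # update the model right away with a running average
    counts[location, action] += 1
    model[location, action] += (feedback - model[location, action]) / counts[location, action]

    location = next_location

# after enough random exploration the model predicts the consequence of each move
print(model[0, 1], "vs actual:", sensations[1])           # stepping right from location 0
```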

1 Like

#23

As I understand it, it doesn’t happen during the training phase, if that’s what you’re asking.
It only happens in the forward pass.
I think it’s just for finding the context of the current word.

I don’t think it does, as that wouldn’t help with finding the context.
But I’m not sure, as I’ve only just started reading about it. :confused:

I think you could say it takes meaningful actions through statistics.
Does that make any sense?

I would agree that it doesn’t have its own intention or self-motivation.
And it wouldn’t do anything of its own interest beyond what it has been told to do, just like every other weak AI.
Like you said, I think being aware of itself, so it can differentiate itself from the environment, is very important.

1 Like

#24

You are absolutely right. Let me know if you find anything else interesting.

2 Likes

#25

“everything lives in the domain of language”

My dog checks the yard every morning. If he finds a spot that smells like an intruder, he pees on top of it.

1 Like

#26

Bit and Paul, yes, we need meaning and a way to store/represent/recall meaning. We also need a model of the world, because meaning is context-dependent. An artful balance of several systems.

1 Like