I thought this might be interesting. It may provide insight into the structure of hierarchy used by the brain.
This grammatical rule seems to be shared among all languages, with the relative order of adjectives influenced by the language's basic word order (SVO/VSO/OSV) and how that shapes the expression of ideas.
What this means is that our brains are wired to understand something like this:
The awesome small new rectangular white Apple glass communications device was introduced yesterday.
As opposed to this:
The Apple small communications rectangular awesome glass new white device was introduced yesterday.
The second is grammatically equivalent to the first, but sounds like word salad. Both are awkward and arbitrary, but the first sounds far more coherent.
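As a toy illustration, the rule can be modeled as sorting adjectives by category rank (opinion > size > age > shape > color > origin > material > purpose, one common formulation of the English ordering). The category lexicon below is a hypothetical hand-built one covering only this example, not a real NLP resource:

```python
# Canonical English pre-noun adjective order (one common formulation).
ORDER = ["opinion", "size", "age", "shape", "color", "origin", "material", "purpose"]

# Hypothetical toy lexicon covering only the example sentence.
LEXICON = {
    "awesome": "opinion", "small": "size", "new": "age",
    "rectangular": "shape", "white": "color", "Apple": "origin",
    "glass": "material", "communications": "purpose",
}

def order_adjectives(adjectives):
    """Sort adjectives into the canonical pre-noun order by category rank."""
    return sorted(adjectives, key=lambda a: ORDER.index(LEXICON[a]))

scrambled = ["Apple", "small", "communications", "rectangular",
             "awesome", "glass", "new", "white"]
print(" ".join(order_adjectives(scrambled)) + " device")
# → awesome small new rectangular white Apple glass communications device
```

The point of the sketch: the "correct" ordering falls out of a single fixed category ranking, which is what makes it feel like a hierarchy rather than an arbitrary convention.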
I was thinking that this meta-grammar could give insight into how knowledge is represented in the brain. Knowing that rule, or other meta-grammatical rules, could inspire an HTM-based natural language system built on the hierarchy the rules imply.
Does anyone know of other similar rules or phenomena in language?
This is very insightful. This would be a perfect way to introduce grid cells in natural language comprehension. Thanks for this link.
I remember vaguely from Latin classes that the Romans were allowed to put the verb wherever they wanted. But poets started a tradition of reciting verses with strict rhythm. In Latin this is called scandere. (Google Translate doesn't know it, unfortunately.)
If you look into the history of oral traditions and the evolution of writing, you'll find all sorts of tantalizing hints that memory-palace-style mnemonics were used. I just recently read "The Memory Code", which goes down the rabbit hole of memory skills and knowledge transmission.
There was a discussion about writing in the era of scrolls, which had no index or numbered pages and so required comprehensive knowledge of the whole text in order to navigate, and about how phonetic mnemonics were implicit in the writing. There's something in those concepts that hints at a similar meta-grammatical rule, or set of rules. They're phenomena only a degree or two of separation away from grid cell phenomena.
There's something about Zipf's law, mnemonics, grid cells, and communication/language that seems to correlate. This adjective-order phenomenon is a piece of it, and I think there might be something valuable underlying it all.
It could be that those phenomena correlate simply because intelligence is required - they all imply a high level of Phi / integrated information. Or maybe there's a way to parse out hierarchy or cognitive structure.
I've wondered about those same topics, except in relation to gematria. Do letters and numbers have some kind of equivalency? Humans seem fascinated by the possibility.
We know both words and numbers can be encoded as SDRs. Maybe displacement cells could be related to addition and subtraction, with movement up and down the hierarchy acting like multiplication/division. I'm not sure how language fits onto that structure, though.
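To make the displacement idea concrete, here is a minimal sketch of a scalar-to-SDR encoder in the spirit of HTM encoders. The function name and all parameter values are arbitrary choices for illustration, not Numenta's actual encoder. Nearby numbers share active bits, and adding a constant amounts to sliding the active block - a displacement-like operation:

```python
import numpy as np

def scalar_sdr(value, min_v=0, max_v=100, n_bits=64, w=8):
    """Toy scalar encoder: a contiguous block of w active bits whose
    position along the array encodes the value. Parameter values are
    arbitrary choices for illustration."""
    start = int(round((value - min_v) / (max_v - min_v) * (n_bits - w)))
    sdr = np.zeros(n_bits, dtype=np.int8)
    sdr[start:start + w] = 1
    return sdr

# Nearby values share many active bits; distant values share none.
a, b, far = scalar_sdr(10), scalar_sdr(12), scalar_sdr(90)
print(int((a & b).sum()), int((a & far).sum()))  # → 7 0

# Adding a fixed amount to the value slides the active block to the
# right - a pure shift of the pattern, which is what makes the analogy
# to displacement cells tempting.
```
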
I also find Zipf's law screaming at me that there must be something valuable here.
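For reference, Zipf's law predicts that a word's frequency falls off roughly as 1/rank, so frequency × rank should stay roughly constant down the ranking. A quick sketch of that diagnostic (plain Python, toy data):

```python
from collections import Counter

def zipf_table(text, top=10):
    """Rank words by frequency; under Zipf's law, freq * rank is
    roughly constant across ranks."""
    ranked = Counter(text.lower().split()).most_common(top)
    return [(rank, word, freq, freq * rank)
            for rank, (word, freq) in enumerate(ranked, start=1)]

# Tiny constructed sample whose counts (4, 2, 1) are near-Zipfian:
for row in zipf_table("the the the the of of to"):
    print(row)
# → (1, 'the', 4, 4)
#   (2, 'of', 2, 4)
#   (3, 'to', 1, 3)
```

Run it over any real corpus and the fourth column stays surprisingly flat across the top ranks.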
As for hex grids (if I understand correctly), the original function was locations-in-room, and later in evolution, points-on-object. It seems like further abstractions of letters-in-word and numbers-on-line would function similarly?
I've been thinking about Braille lately in regard to these subjects. Here we have points-on-object and letters-in-word as the exact same thing, like a bridge.
Separately, the "adjective order rule meta-grammar" has me thinking about the uni-directional flow of time, for some reason.
I might have a different opinion about it. I think some default structure is present in the brain, but there's more (maybe buffering) going on.
For example, these two Esperanto sentences can be parsed intuitively by speakers from any culture:
La knabo amas grandan pomojn -> The boy loves big apples
La knabo amas granda pomojn -> The big boy loves apples
The only difference is the -n suffix after granda (the Esperanto word for big), indicating that the adjective should be applied to the object instead of the subject. This sort of sentence comes up a lot in Esperanto conversations, and people have no problem with it.
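That suffix-driven role assignment can be sketched mechanically. The toy function below is hypothetical and hard-coded to the "La <noun> <verb> <adj> <noun>" pattern above (it ignores real Esperanto morphology such as plural -j agreement); it decides what the adjective modifies purely from the -n ending, not from word order:

```python
def adjective_target(sentence):
    """Hypothetical toy parser for the fixed 'La <noun> <verb> <adj> <noun>'
    pattern: the accusative -n suffix on the adjective points it at the
    object noun; the bare form points it at the subject noun."""
    _la, subject, _verb, adj, obj = sentence.lower().split()
    return obj if adj.endswith("n") else subject

print(adjective_target("La knabo amas grandan pomojn"))  # → pomojn
print(adjective_target("La knabo amas granda pomojn"))   # → knabo
```

The word order never changes between the two calls; only the suffix moves the attachment, which is the point being made about parsing not relying purely on a default positional structure.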
(And there's also Chinese. Its grammar is a horrible mess.)
The shape and arrangement of the letters primes reading patterns and parsing of the individual words.
Languages that depend on endings direct attention to the shape of the endings much like the overall shape of the word. It is part of learning to read that language. You pay more attention to the shape of the word than to the letters when you are reading at speed.
And then there is this:
What can you infer from the decoding process before you learn to mirror-read? How does it feel, and what "automatic" parts are you having to fill in for? As a teacher I have become reasonably proficient at reading student papers and handwriting upside down. It does become automatic after a remarkably short time.
By the way - "default structure" is not a thing. Many languages read vertically. This is learned along with learning the overall language. I suspect that readers of bidirectional languages have very different scanning patterns than I do.
In the dad's song project we posit that the structure of language is learned by listening first, then imitation.
In a nutshell:
Passively learning dad's song, coupled with emotional flavoring.
Passively learning other sounds with no, or different, emotional flavoring.
(These could be alert calls or other signalling, such as food calls.)
Actively learning to produce sounds through exploratory vocalization motor skills.
Time passes.
Hormonally / reinforcement-driven learning to reproduce the previously learned dad's song.
As far as "innate structures" of language go, two areas seem to be key: Broca's and Wernicke's areas, which process grammar and objects separately.
Broca's area seems to be specialized for motor production, and the learned patterns correspond somewhat to parsing and producing grammar.
Wernicke's area seems to be specialized for high-level object representation. I see this as being of somewhat the same complexity as the Cortical.io Retina construct. More details here.
Considering the relation between grammar and the word store should give some flavor of the kind of information that can be carried by an association fiber - in this case, the arcuate fasciculus.
By the way - the first link mentions "mirror neurons." The relation between maps and the object store is relevant here: maybe it's time to ditch mirror neurons? The referenced Friedemann Pulvermüller paper shows that the idea of a central semantic object store is hopelessly idealized, and that in fact the representation is grounded and distributed over all of the sensory processing streams. To borrow a concept from SQL database lore, Wernicke's area is the key, or index, to the distributed semantic concept representation.