I have noticed that memory in our brains is generally associated with the synapses, yet HTM theory doesn't seem to incorporate those principles.
Considering how memory storage and retrieval can play an important role in intelligence, is there a plan to study that in the future, or is it not considered necessary for machine intelligence?
I have no idea why you think that synapses play no part in HTM theory.
The data elements in each SDR directly correspond to synapses.
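To make the correspondence concrete, here is a minimal sketch (my own illustration, not Numenta's code) of how the bits of an SDR can relate to synapses on an HTM-style dendritic segment. The segment size, permanence values, and thresholds below are made up for the example:

```python
# Sketch: an HTM-style segment holds potential synapses (index + permanence).
# A synapse is "connected" if its permanence crosses a threshold, and the
# segment activates when enough connected synapses see active SDR bits.
import numpy as np

rng = np.random.default_rng(0)

INPUT_SIZE = 2048          # size of the input SDR (illustrative)
CONNECTED_PERM = 0.5       # synapse counts as connected above this permanence
ACTIVATION_THRESHOLD = 10  # connected-and-active synapses needed to fire

# One segment: which input bits it synapses onto, and how strongly.
synapse_indices = rng.choice(INPUT_SIZE, size=40, replace=False)
permanences = rng.uniform(0.3, 0.7, size=40)

def segment_active(active_sdr_bits):
    """Count connected synapses whose presynaptic SDR bit is currently active."""
    connected = permanences >= CONNECTED_PERM
    overlap = np.isin(synapse_indices[connected], list(active_sdr_bits)).sum()
    return overlap >= ACTIVATION_THRESHOLD

# An SDR here is just the set of indices of its active bits.
sdr = set(rng.choice(INPUT_SIZE, size=40, replace=False).tolist())
print(segment_active(sdr))
```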
I meant to say that synapses themselves are not studied in detail, as far as I can recall.
Am I misguided, or is a detailed study not required?
Are nerves still considered transmission lines? Is it only computer RAM that can store a state, because nerves cannot hold one? I may have missed some new science.
I like the concept of using a daisy chain of neurons in a loop to hold a pulse, just like the mercury delay-line memory of the old days.
Old computer tech: delay-line memory.
My artificial neural network AGI brain uses this type of memory.
The speed of a neuron pulse is about 25 miles per hour. The slower the pulse, the more memory can be stored, and the shorter the nerve, the better.
A loop of four neurons can hold one pulse: anywhere from zero pulses up to a max of one. An eight-neuron loop can hold up to two pulses: from zero up to a max of two. A daisy-chain loop can be hundreds of neurons long, or even longer.
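Here is a toy simulation of that idea, just to show the circulation; it assumes one pulse slot per neuron and ignores refractory periods, so the names and timing are illustrative only:

```python
# Toy sketch of a daisy-chain neuron loop used as delay-line memory.
# Each tick, every neuron hands its pulse (0 or 1) to the next neuron in
# the ring, so an injected pulse keeps circulating like a bit in an old
# mercury delay line.
from collections import deque

class NeuronLoop:
    def __init__(self, length):
        # One slot per neuron; 1 means that neuron is carrying a pulse.
        self.loop = deque([0] * length)

    def inject(self, pulse=1):
        # Write a pulse into the neuron at the "input" end of the loop.
        self.loop[0] = pulse

    def tick(self):
        # Advance one time step: each pulse moves on to the next neuron.
        self.loop.rotate(1)

    def read(self):
        # Read the pulse at the "output" end without destroying it.
        return self.loop[0]

loop = NeuronLoop(4)   # a four-neuron loop holding a single circulating pulse
loop.inject(1)
for t in range(8):
    print(t, list(loop.loop))
    loop.tick()
```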
Synapses are the primary mechanism for memory. Numenta is indeed using a form of synaptic learning called Hebbian learning, and it seems they've done more research on synapses than anyone working on popular ANNs.
Check out this paper for a good overview of the theory; you'll find a section on synapses.
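For anyone unfamiliar with the term, here is a minimal sketch of a Hebbian-style permanence update in the spirit of what HTM does; the increment and decrement values are made up, not Numenta's actual parameters:

```python
# Hebbian-style update: strengthen synapses whose presynaptic cell was
# active when the segment fired, weaken the rest. Values are illustrative.
import numpy as np

PERM_INC = 0.05   # reward synapses with an active presynaptic bit
PERM_DEC = 0.02   # punish synapses whose presynaptic bit was silent

def hebbian_update(permanences, presynaptic_indices, active_input_bits):
    """Adjust synapse permanences based on presynaptic activity."""
    active = np.isin(presynaptic_indices, list(active_input_bits))
    permanences[active] += PERM_INC
    permanences[~active] -= PERM_DEC
    np.clip(permanences, 0.0, 1.0, out=permanences)  # keep in valid range
    return permanences
```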