I’ve gotten so used to the ML idea that synapses between neurons carry real-valued weights. After learning from Numenta’s research (and others’) that pyramidal neurons effectively have binary synapses, it got me thinking about non-cortical neurons and their synapses.
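To make the contrast concrete, here’s a toy sketch of the two models as I understand them. This is my own simplification, not Numenta’s actual code: in the HTM picture, each synapse has a scalar “permanence,” but its effective weight is binary (connected or not), and the cell fires when enough connected synapses see active input.

```python
def weighted_activation(inputs, weights, threshold=1.0):
    """ML-style neuron: real-valued weights, weighted sum vs. a threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return total >= threshold

def binary_activation(inputs, permanences,
                      perm_threshold=0.5, activation_threshold=2):
    """HTM-style neuron (my simplification): a synapse counts as
    connected (weight 1) only if its permanence crosses a threshold;
    otherwise it contributes nothing. Learning nudges permanences,
    but the effective weight stays binary."""
    connected = [p >= perm_threshold for p in permanences]
    overlap = sum(1 for x, c in zip(inputs, connected) if x and c)
    return overlap >= activation_threshold

inputs = [1, 1, 0, 1]
print(weighted_activation(inputs, [0.4, 0.7, 0.9, 0.1]))   # True (sum = 1.2)
print(binary_activation(inputs, [0.6, 0.2, 0.9, 0.7]))     # True (overlap = 2)
```

The binary version is arguably more robust to noise in individual synapses, since small permanence fluctuations around the threshold flip a connection on or off rather than perturbing a delicate sum — which is part of what prompts my question below.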
For all the genetically determined neural circuits that don’t implement learning, are their synapses weighted? I’ve always assumed so, but finely tuned weights almost seem far too fragile for a system as noisy and robust as the brain. Then again, I can’t imagine every synapse having a connection strength of exactly 1.
From what I’ve read about the nervous system and the more primitive parts of the brain, I haven’t seen anything about those synapses being weighted. Is computation between neurons performed by some mechanism other than synaptic weights?
Anyway, I was hoping someone might be able to shed some light on this for me.