Learning how birds teach themselves to sing Dad's song

Sure, Hebbian learning is THE candidate for error-based learning, which is pretty awesome. Think about a cell that is connected to other cells in an SDR through its distal dendrites. With Hebbian learning, if the cell is active while receiving input on those distal connections from the other cells in the SDR, then the synapses between them get reinforced ('neurons that fire together wire together'). In feedforward learning the proximal dendrites are the primary drivers of cell activity: when a cell becomes active from proximal sensory input, its distal synapses to the other nearby active cells strengthen. Now take this idea and swap out proximal input for apical input.
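To make that concrete, here is a minimal Python sketch of the feedforward case. All the names and sizes (`N_CELLS`, `hebbian_update`, the dense permanence matrix) are made up for illustration, not anyone's actual implementation: proximal input picks the active cells, and a simple Hebbian rule then strengthens the distal synapses between every pair of co-active cells.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CELLS = 64          # cells in the region (hypothetical size)
LEARN_RATE = 0.1      # Hebbian increment per co-activation
PERM_MAX = 1.0

# Permanence of distal synapses between every pair of cells
# (a dense matrix just for illustration; real SDRs are sparse).
distal_perm = rng.uniform(0.0, 0.3, size=(N_CELLS, N_CELLS))

def hebbian_update(active_cells, perm, lr=LEARN_RATE):
    """'Neurons that fire together wire together':
    strengthen distal synapses between every pair of co-active cells."""
    for i in active_cells:
        for j in active_cells:
            if i != j:
                perm[i, j] = min(PERM_MAX, perm[i, j] + lr)
    return perm

# Feedforward case: proximal (sensory) input decides which cells become active.
proximal_drive = rng.random(N_CELLS)
active = np.argsort(proximal_drive)[-8:]   # top-k winners ~ a sparse SDR
distal_perm = hebbian_update(active, distal_perm)
```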

If apical feedback can drive cell activity like proximal input does, then learning will occur in the same way. The feedback (teaching signal) comes down the apical dendrite and excites the cell (just as proximal input would), and any other active cells connected to it via distal dendrites will have their synapses strengthened. So if this were a stochastic SDR and there was feedback from a teacher region, the cells in that SDR would get reinforced and tend to co-activate as an SDR in the future. The more times the apical teaching signal is fed to the same cells in that SDR, the more strongly they are reinforced.
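And here is the same hypothetical sketch with apical input swapped in for proximal (continuing the code above; the teacher SDR and repeat count are arbitrary): a teacher region repeatedly drives the same set of cells through their apical dendrites, the Hebbian rule keeps strengthening their distal interconnections, and afterwards a partial cue tends to recall the whole SDR.

```python
# Teaching case: an apical signal from a higher region decides which cells
# fire, instead of proximal sensory input. The learning rule is unchanged.
teacher_sdr = np.array([3, 7, 12, 20, 33, 41, 50, 60])  # cells the teacher excites

for _ in range(10):                 # each repeat of the teaching signal reinforces the SDR
    apical_active = teacher_sdr     # apical feedback drives these cells to fire
    distal_perm = hebbian_update(apical_active, distal_perm)

# After training, the taught cells are strongly interconnected, so activating
# a few of them tends to depolarize and recall the rest of the SDR.
partial_cue = teacher_sdr[:3]
recall_drive = distal_perm[partial_cue].sum(axis=0)   # distal drive onto each cell
recalled = np.argsort(recall_drive)[-8:]
print(sorted(recalled), "vs taught", sorted(teacher_sdr))
```

The only thing that changed between the two snippets is which input selects the active cells; the Hebbian update itself is identical, which is the point of the argument.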


So from a cell's perspective it is feedforward learning flipped on its head: instead of proximal input driving the cell, it is apical input.

Of course the jury is still out on whether apical dendrites can drive cell activation. But there is some evidence that apical dendrites form 'plateau potentials' which cause the cell to burst, which is super convenient for a cell that is learning. Here is a post with the relevant papers.

Does that answer your question?
