Multidirectional propagation in biological neurons - could/should we recreate it, e.g. with joint distribution neurons?

While ANNs are typically trained for unidirectional propagation, action potential propagation in biological neurons can be symmetric, e.g. “it is not uncommon for axonal propagation of action potentials to happen in both directions” (from “Dynamics of signal propagation and collision in axons”, PRE). Since this is possible, neurons should be evolutionarily optimized for such multidirectional propagation, which might be crucial e.g. for learning (currently not well understood), or for consciousness (?)

Have artificial neurons operating in a multidirectional way been considered?

One approach is to somehow contain a representation of a joint distribution model, e.g. ρ(x,y,z), which allows one to find conditional distributions in any direction by substituting some variables and normalizing. Below is an inexpensive practical realization from , allowing for many additional training approaches - could biology use some of them?
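To make the density-as-linear-combination idea concrete, here is a minimal sketch. Everything in it is an illustrative assumption (the basis degree, function names, and toy data are mine, not taken from the cited realization): the joint density on [0,1]² is written as ρ(x,y) ≈ Σᵢⱼ aᵢⱼ fᵢ(x) fⱼ(y) in an orthonormal polynomial basis, the coefficients are estimated as plain sample averages, and a conditional density in either direction follows by substituting one variable and normalizing over the other.

```python
import numpy as np

# Sketch (illustrative, not a specific implementation): joint density on
# [0,1]^2 as rho(x,y) ~ sum_ij a[i,j] f_i(x) f_j(y) in an orthonormal
# polynomial basis; conditionals follow by substitution and normalization,
# so "propagation" works in either direction from the same coefficients.

def basis(x):
    """First three orthonormal polynomials on [0,1] (rescaled Legendre)."""
    return np.stack([np.ones_like(x),
                     np.sqrt(3.0) * (2.0 * x - 1.0),
                     np.sqrt(5.0) * (6.0 * x**2 - 6.0 * x + 1.0)])

def fit_joint(xs, ys):
    """Coefficients a[i,j] are just sample means of f_i(x) f_j(y)."""
    fx, fy = basis(xs), basis(ys)            # each of shape (3, n)
    return fx @ fy.T / len(xs)

def conditional(a, x0, y_grid):
    """rho(y | x = x0): substitute x, clip negatives, normalize over y."""
    fx0 = basis(np.array([x0]))[:, 0]        # f_i(x0)
    rho = np.maximum(fx0 @ a @ basis(y_grid), 1e-12)
    dy = y_grid[1] - y_grid[0]
    return rho / (rho.sum() * dy)

# Toy data: y strongly correlated with x
rng = np.random.default_rng(0)
xs = rng.uniform(size=5000)
ys = np.clip(xs + 0.1 * rng.normal(size=5000), 0.0, 1.0)

a = fit_joint(xs, ys)
grid = np.linspace(0.0, 1.0, 101)
dy = grid[1] - grid[0]
mean_lo = (grid * conditional(a, 0.1, grid)).sum() * dy   # E[y | x = 0.1]
mean_hi = (grid * conditional(a, 0.9, grid)).sum() * dy   # E[y | x = 0.9]
```

Conditioning on a large x shifts the predicted mean of y upward; swapping the roles of x and y gives propagation in the opposite direction from the same coefficient table `a`, which is the point of the multidirectional setup.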

Are there different approaches? Research in this direction?

Is multidirectional propagation important/crucial for (e.g. learning of) biological neural networks?


I don’t think they’re optimized for that; it’s simply the way action potentials work: the signal propagates radially in all directions, except towards areas that have already discharged. So it will gradually discharge the whole cell membrane regardless of where the spark starts.

Imagine a network of pipes, in the shape of a single neuron, filled with air and flammable gas. You can start a spark anywhere within the pipes and the flame will propagate everywhere; after all the gas has burned, it is slowly replenished for the next firing.

The above seems “directional” because, most frequently, the “spark plugs” - upstream input axon terminations (synaptic boutons, to be more specific) - attach to dendritic trees, since these expose/cover a larger surface area/volume than any other part of the neuron.

Well, at least that’s my armchair, half-scientific arguing.
Synaptic boutons, if you look at pictures, are like tiny fingers; they can “grab” onto any other neuron membrane they bump into.

I don’t think the directionality of the signal between different neurons actually changes - I mean, yes, if an action potential starts on the axon it will propagate both ways, towards the axon’s own boutons and towards the cell body and dendrites, but only the axonal synapses (boutons) will be able to forward that signal to other neurons. Synapses are one-way.


But on one hand, we don’t understand e.g. the learning of biological NNs - while we literally use backpropagation for that in ANNs … so isn’t some propagation in the opposite direction crucial for biological learning?
Maybe the focus on unidirectional propagation is one of the reasons for being unsuccessful in this understanding?

On the other hand, such symmetric multidirectional propagation just happens in biological realizations - shouldn’t evolution do its best to exploit it?


It’s dendritic backflow inside a neuron, after an axonal spike. That’s how Hebbian learning / synaptic potentiation works.
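The “fire together, wire together” rule mentioned here can be sketched in a few lines. The firing probabilities, learning rate, and decay below are made-up illustrative values, not a biophysical model:

```python
import numpy as np

# Minimal Hebbian sketch: a weight grows when pre- and postsynaptic activity
# coincide. All parameters (rates, eta, decay) are illustrative, not fitted.
rng = np.random.default_rng(1)
eta = 0.1
w = np.zeros(4)
fire_prob = np.array([0.9, 0.9, 0.1, 0.1])   # first two inputs drive the cell

for _ in range(200):
    pre = (rng.uniform(size=4) < fire_prob).astype(float)
    post = float(pre[:2].sum() >= 1.0)       # spike if a driving input fired
    w += eta * pre * post                    # Hebbian: coincidence strengthens
    w *= 0.99                                # slow decay keeps weights bounded
```

After the loop, the inputs whose firing is correlated with the postsynaptic spike (the first two) end up with much larger weights than the rarely-coincident ones - the wiring follows the firing.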


Indeed, “Neurons that fire together, wire together” is a nice summary, but it seems we are still far from understanding the learning of biological neural networks?

Maybe there are some more sophisticated mechanisms to uncover, and the focus on unidirectional propagation could be what is blocking us …

For example, in theory a neuron could hide/estimate/update a model of the joint distribution of its connections, which can be relatively simple to represent as above … how could we confirm or deny it?


Much closer than you think. Read up on computational neuroscience, there’s a ton of very detailed models. To begin with: Spike-timing-dependent plasticity - Wikipedia
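For reference, the STDP rule linked above is usually written as an exponential timing window. The amplitudes and time constants below are typical textbook-style placeholders, not values from any specific study:

```python
import numpy as np

# STDP timing window sketch. A_plus/A_minus and the time constants are
# illustrative placeholder values, not fitted to any dataset.
A_plus, A_minus = 0.01, 0.012
tau_plus, tau_minus = 20.0, 20.0    # ms

def stdp(dt_ms):
    """Weight change for dt = t_post - t_pre: potentiation if pre leads."""
    if dt_ms > 0:
        return A_plus * np.exp(-dt_ms / tau_plus)
    return -A_minus * np.exp(dt_ms / tau_minus)

dw_pot = stdp(10.0)    # pre fires 10 ms before post -> strengthen
dw_dep = stdp(-10.0)   # post fires 10 ms before pre -> weaken
```

The sign of the change depends only on which spike came first, and the magnitude decays with the timing gap - which is exactly the timing-sensitive refinement of the plain Hebbian picture.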


I suspect this is still a low-level understanding - not the details of its consequences, leading e.g. to much more efficient learning than in current artificial neural networks (?)

To really understand it, we could start with the reachable theoretical possibilities - then search for them in the hidden, subtle dependencies of biological neural networks.

And an agnostic theoretical possibility for a single neuron is modelling the joint distribution of its connections (more than the standard value dependence in ANNs) - containing the entire statistical dependencies, allowing for multidirectional propagation (as in biological neurons), and adding subtle novel training possibilities … It is also relatively easy to represent using density as a linear combination, as above - so it should be reachable by biology + evolution.


Agnostic is not good for a ~200-year-old field.


Agnostic as in avoiding arbitrary assumptions - here: instead of guessing a parametrization like standard ANNs do, model the joint distribution, which contains all the statistical information.


ANN is not brute-force enough for you? Sorry, this is not going anywhere.


Standard ANNs are brute-force optimizations of guessed parametrizations … maybe biology is smarter? The question is in which direction …

The available statistical information is the joint distribution; a neuron modelling it could additionally e.g. propagate in any direction, propagate values or distributions, and would have additional training modes … it seems basic, but looks like a new research direction.
