Inhibitory activity

Here is an example of a spiking neural network model, with inhibitory synaptic plasticity.
It uses “leaky integrate-and-fire” neurons.
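For readers unfamiliar with the model class, here is a minimal leaky integrate-and-fire neuron sketch. This is not the paper's actual model; all parameter values and the function name are illustrative assumptions:

```python
import numpy as np

# Illustrative LIF parameters (assumed values, not from the paper).
tau_m   = 20.0   # membrane time constant (ms)
v_rest  = -70.0  # resting potential (mV)
v_th    = -54.0  # spike threshold (mV)
v_reset = -70.0  # reset potential after a spike (mV)
dt      = 0.1    # Euler integration step (ms)

def simulate_lif(input_current, r_m=10.0):
    """Integrate dV/dt = (-(V - v_rest) + R*I) / tau_m; spike and reset at threshold."""
    v = v_rest
    spike_times = []
    for step, i_ext in enumerate(input_current):
        v += dt * (-(v - v_rest) + r_m * i_ext) / tau_m
        if v >= v_th:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A constant suprathreshold current (asymptote -50 mV, above the -54 mV
# threshold) makes the neuron fire regularly.
spike_times = simulate_lif(np.full(10000, 2.0))  # 1 s of constant drive
```

The "leaky" part is the decay toward `v_rest`; the "integrate-and-fire" part is the accumulation of input until threshold, followed by a reset.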

Luz Y, Shamir M (2012) Balancing Feed-Forward Excitation and Inhibition via Hebbian Inhibitory Synaptic Plasticity. PLoS Comput Biol 8(1): e1002334.

It has been suggested that excitatory and inhibitory inputs to cortical cells are balanced, and that this balance is important for the highly irregular firing observed in the cortex. There are two hypotheses as to the origin of this balance. One assumes that it results from a stable solution of the recurrent neuronal dynamics. This model can account for a balance of steady state excitation and inhibition without fine tuning of parameters, but not for transient inputs. The second hypothesis suggests that the feed forward excitatory and inhibitory inputs to a postsynaptic cell are already balanced. This latter hypothesis thus does account for the balance of transient inputs. However, it remains unclear what mechanism underlies the fine tuning required for balancing feed forward excitatory and inhibitory inputs. Here we investigated whether inhibitory synaptic plasticity is responsible for the balance of transient feed forward excitation and inhibition. We address this issue in the framework of a model characterizing the stochastic dynamics of temporally anti-symmetric Hebbian spike timing dependent plasticity of feed forward excitatory and inhibitory synaptic inputs to a single post-synaptic cell. Our analysis shows that inhibitory Hebbian plasticity generates ‘negative feedback’ that balances excitation and inhibition, which contrasts with the ‘positive feedback’ of excitatory Hebbian synaptic plasticity. As a result, this balance may increase the sensitivity of the learning dynamics to the correlation structure of the excitatory inputs.
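The ‘negative feedback’ idea in the abstract can be shown with a toy rate-based caricature: when postsynaptic activity exceeds a target, a Hebbian inhibitory update (pre-activity times postsynaptic deviation) grows the inhibitory weight, pulling the output back down. This is a simplified sketch, not the spiking STDP model the paper actually analyzes; every number and name here is an assumption:

```python
# Toy demo of inhibitory Hebbian plasticity as negative feedback.
eta    = 0.005  # learning rate (illustrative)
target = 5.0    # assumed target postsynaptic rate (Hz)

r_exc, w_exc = 20.0, 1.0   # fixed excitatory input rate and weight
r_inh, w_inh = 10.0, 0.0   # inhibitory input rate; weight is plastic

rates = []
for step in range(200):
    # Postsynaptic rate: rectified excitation minus inhibition.
    post = max(r_exc * w_exc - r_inh * w_inh, 0.0)
    rates.append(post)
    # Hebbian inhibitory update: presynaptic rate times (post - target).
    # post > target strengthens inhibition; post < target weakens it.
    w_inh = max(w_inh + eta * r_inh * (post - target), 0.0)

# The postsynaptic rate converges toward the target as inhibition
# grows to balance the fixed excitatory drive.
```

Contrast this with excitatory Hebbian plasticity, where higher postsynaptic activity strengthens excitation and drives activity even higher (the ‘positive feedback’ the abstract mentions).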


I doubt that. DNNs do not use biological learning rules, or really any part of biology beyond the basic concept of a “neuron” as a unit of information processing.


Cool! The same authors wrote two review articles about inhibitory plasticity, which look useful.

Inhibitory Plasticity: Balance, Control, and Codependence

Published 2017
https://doi.org/10.1146/annurev-neuro-072116-031005
http://www.neurotheory.ox.ac.uk/~timv/pub/Hennequin2017.pdf

Abstract

Inhibitory neurons, although relatively few in number, exert powerful control over brain circuits. They stabilize network activity in the face of strong feedback excitation and actively engage in computations. Recent studies reveal the importance of a precise balance of excitation and inhibition in neural circuits, which often requires exquisite fine-tuning of inhibitory connections. We review inhibitory synaptic plasticity and its roles in shaping both feedforward and feedback control. We discuss the necessity of complex, codependent plasticity mechanisms to build nontrivial, functioning networks, and we end by summarizing experimental evidence of such interactions.

Inhibitory synaptic plasticity: spike timing-dependence and putative network function

doi: 10.3389/fncir.2013.00119
http://www.neurotheory.ox.ac.uk/~timv/pub/VogelsFrontiers2013.pdf
Published 2013

Abstract

While the plasticity of excitatory synaptic connections in the brain has been widely studied, the plasticity of inhibitory connections is much less understood. Here, we present recent experimental and theoretical findings concerning the rules of spike timing-dependent inhibitory plasticity and their putative network function. This is a summary of a workshop at the COSYNE conference 2012.

The way back-propagation is implemented is not biologically feasible. But just as HTM argues that many details are unimportant, it is debatable how accurate the models need to be and which shortcuts can be taken. There is a case to be made that, from an abstract perspective, back-propagation is simply modelling biological learning.

DNNs do sometimes use much more biology than you might imagine: CNNs are often modelled on the biological visual system’s hierarchy. In some cases they show high levels of correlation with biological networks, e.g. learning to detect similar visual features at the same levels of the hierarchy. Other examples include recurrence, memory elements, and reinforcement signals, all biologically inspired.

The remark was made by Prof. Giacomo Indiveri; he works on neuromorphic hardware, so he is not at all in the DNN camp. So it may be worth considering.


In terms of output, inhibitory neurons in cortex are pretty much local. I recall there’s at least one exception, but I can’t find it, so I don’t know whether it was just part of biological development or outside cortex. Outside cortex, inhibitory neurons aren’t always local, e.g. in the basal ganglia.

Input-wise, they can receive from non-local sources.

Some do, like Martinotti cells. I don’t know what inputs their apical dendrites get.

I think @Bitking mentioned a biologically plausible approach to deep learning.
