Searching the most efficient path to real-time learning

In a new study, researchers at the University of Bern proposed that the connections between neurons follow a "least-action" rule—mirroring a fundamental principle of physics.

A Unifying Concept in Physics and Neuroscience

In physics, the basic laws of motion across the various subdisciplines can be derived from a single principle: the principle of least action. In classical mechanics, it states that a body moves along the path of minimal action, where the action is the time integral of the difference between kinetic and potential energy. Newton’s laws of motion, the orbits of planets, and the motion of electrons can all be derived from this principle. It draws on the mathematical theory of variational calculus, which provides a local condition for a trajectory to have minimal action among its neighbouring trajectories. Simply put, this local least-action condition states that the forces derived from the various energy contributions keep the bodies in a dynamic equilibrium at every moment in time. Prof. Walter Senn, lead author of the study, further elaborates: "A similar concept of the global brain activity maintaining a moving equilibrium can also describe the state of individual neurons within a recurrently connected network, where neurons continuously influence one another."
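In the classical-mechanics setting just described, the principle can be written compactly: the action is the time integral of the Lagrangian (kinetic minus potential energy), variational calculus supplies the local stationarity condition (the Euler–Lagrange equation), and Newton's law drops out as a special case:

```latex
% Action: time integral of the Lagrangian (kinetic minus potential energy)
S[x] \;=\; \int_{t_0}^{t_1} L(x,\dot x)\,\mathrm{d}t,
\qquad L \;=\; T - V .
% Local stationarity condition (Euler--Lagrange equation):
\frac{\mathrm{d}}{\mathrm{d}t}\,\frac{\partial L}{\partial \dot x}
\;-\; \frac{\partial L}{\partial x} \;=\; 0 .
% For L = \tfrac{1}{2} m \dot x^{2} - V(x) this reduces to Newton's law:
m\,\ddot x \;=\; -\,\frac{\partial V}{\partial x}.
```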

Dynamic Neural Processing: Adapting to Continuous Sensory Input

Energy-based approaches to neural networks have so far been restricted to describing steady states, reached while the external inputs are held constant in time. Yet sensory input streams change continuously, and the brain must integrate new inputs while still processing the previous ones. Prof. Senn explains: “We set out to describe the ongoing processing of cortical networks of pyramidal neurons and interneurons locally in space and time, while they integrate external feedback and reduce internal and external errors, the ‘mismatch energy’, on the fly.”
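The steady-state limitation can be illustrated with a toy model. In this minimal sketch (illustrative only, not the paper's model), a neuron's state follows gradient descent on a quadratic "mismatch energy" between the state and its input: with constant input the state relaxes to a steady state, whereas a continuously changing input would leave it perpetually lagging behind.

```python
def relax(u, v0, lr=0.1, steps=200):
    """Discretized gradient flow dv/dt = -dE/dv for the toy mismatch
    energy E(v) = 0.5 * (v - u)**2, with the input u held constant.
    The names, learning rate, and step count are illustrative choices."""
    v = v0
    for _ in range(steps):
        v -= lr * (v - u)   # gradient step: -dE/dv = -(v - u)
    return v

# With a fixed input, the state converges to the steady state v = u.
v_steady = relax(u=1.0, v0=0.0)
```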

Prospective Prediction Errors Drive Neuronal Computation and Learning

The key notion of the neuronal least-action principle is the prospective somato-dendritic prediction error. “The error is prospective because it linearly extrapolates current errors into the future, using temporal derivatives. Each model pyramidal neuron extracts its own prospective error from what the postsynaptic neurons to which it projects signal back,” explains Prof. Senn. Apical dendrites try to explain away the top-down feedback via local interneurons, forming local errors. These errors prospectively modulate the somatic voltages of the pyramidal neurons, reducing the errors in real time, and induce synaptic plasticity on the basal dendrites to prevent errors in the future.

Unraveling Real-Time Brain Dynamics: From Cognitive Frameworks to Neuromorphic Innovation

A real-time description of complex brain dynamics in a functional, rather than purely biophysical, context is fundamental. It allows local spatio-temporal processes in the brain to be structured in terms of notions related to cognition and behaviour. It also stimulates experiments to test the postulated prospective error representation and error learning in pyramidal neurons and microcircuits within and across cortical areas, and it inspires the rebuilding of these processes in brain-inspired neuromorphic hardware.

Publication details:

Walter Senn, Dominik Dold, Akos F. Kungl, Benjamin Ellenberger, Jakob Jordan, Yoshua Bengio, João Sacramento and Mihai A. Petrovici. A neuronal least-action principle for real-time learning in cortical circuits. eLife. DOI: https://doi.org/10.7554/eLife.89674.3