Using new theoretical results and experiments on cultures of neurons (brain cells), a group of scientists led by Prof. Ido Kanter of Israel’s Bar-Ilan University has demonstrated that the long-held assumption that learning occurs only in the synapses (links) between neurons is mistaken.

The brain is a complex network containing billions of neurons. Each of these neurons communicates simultaneously with thousands of others via synapses. A neuron collects its incoming synaptic signals through several extremely long, branched “arms” called dendritic trees.

In an article published in the journal Scientific Reports, the researchers challenge conventional wisdom, showing that learning is actually carried out by a few dendrites, via a mechanism similar to the slow learning currently attributed to the synapses.

“The newly discovered process of learning in the dendrites occurs at a much faster rate than in the old scenario suggesting that learning occurs solely in the synapses,” said Kanter, a physicist in the university’s Gonda (Goldschmied) Multidisciplinary Brain Research Center.

“In this new dendritic learning process, there are a few adaptive parameters per neuron, in comparison to thousands of tiny and sensitive ones in the synaptic learning scenario.”
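The contrast in parameter counts can be made concrete with a toy sketch. This is not the model from the paper; the numbers of inputs and dendrites, the branch assignment, and the gain rule are all illustrative assumptions, chosen only to show how few adjustable quantities the dendritic scenario would involve compared with one weight per synapse.

```python
# Toy illustration (NOT the paper's model): contrast the number of
# adaptive parameters for one neuron under synaptic vs. dendritic learning.
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 1000     # incoming synapses per neuron (illustrative figure)
n_dendrites = 5     # adaptive dendritic branches (illustrative figure)

# Synaptic scenario: one adjustable weight per synapse.
synaptic_weights = rng.normal(size=n_inputs)

# Dendritic scenario: synaptic strengths are held fixed; each dendrite
# carries a single adjustable gain applied to its branch's summed input.
fixed_synapses = rng.normal(size=n_inputs)
branch_of = rng.integers(0, n_dendrites, size=n_inputs)  # synapse -> branch
dendritic_gains = np.ones(n_dendrites)

def neuron_output(x):
    """Sum each branch's (fixed) synaptic drive, then scale by its gain."""
    branch_drive = np.zeros(n_dendrites)
    np.add.at(branch_drive, branch_of, fixed_synapses * x)
    return float(dendritic_gains @ branch_drive)

print("adaptive parameters, synaptic learning:", synaptic_weights.size)  # 1000
print("adaptive parameters, dendritic learning:", dendritic_gains.size)  # 5
```

In this sketch, learning in the synaptic scenario would adjust all 1,000 weights, while the dendritic scenario adjusts only the 5 gains closest to the cell body.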

In 1949, Donald Hebb’s pioneering work suggested that learning occurs in the brain by modifying the strength of the synapses, whereas neurons function as the computational elements in the brain. This has remained the common assumption until today.

Neurons don’t need to wire together?

The newly suggested learning scenario indicates that learning occurs in a few dendrites that are in much closer proximity to the neuron.

“Does it make sense to measure the quality of air we breathe via many tiny, distant satellite sensors at the elevation of a skyscraper, or by using one or several sensors in close proximity to the nose? Similarly, it is more efficient for the neuron to estimate its incoming signals close to its computational unit, the neuron,” said Kanter.

Hebb’s theory, which gave rise to the popular phrase “neurons that fire together wire together,” has been so deeply rooted in the scientific world that no one has ever proposed such a different approach.

Kanter’s team’s discovery may spark a reevaluation of current treatments for brain disorders.

In addition, because the brain’s learning mechanism underpins advances in machine learning and deep learning, this change in understanding opens new horizons for different types of deep-learning algorithms and artificial-intelligence applications that imitate brain functions at much greater speed.

Another important finding of the study is that weak synapses, previously assumed to be insignificant even though they make up the majority of the brain’s synapses, play an important role. They induce oscillations of the learning parameters rather than pushing them to unrealistic fixed extremes, as previously suggested.
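The oscillation-versus-saturation contrast can be illustrated with a purely schematic toy, which is not the paper’s model: a clip-bounded update rule, a persistent strong drive, and a weak sign-alternating drive are all assumptions here, meant only to show the qualitative difference between a parameter pinned at an extreme and one that keeps fluctuating.

```python
# Purely illustrative toy (NOT the paper's equations): a bounded
# parameter driven by strong, same-sign input saturates at an extreme,
# while weak, sign-alternating input keeps it oscillating.
import numpy as np

def evolve(drives, eta=0.1, w0=0.0, lo=-1.0, hi=1.0):
    """Apply the clipped update w <- clip(w + eta * drive, lo, hi)."""
    w = w0
    trace = []
    for d in drives:
        w = min(hi, max(lo, w + eta * d))
        trace.append(w)
    return np.array(trace)

rng = np.random.default_rng(1)

strong = np.ones(200)                        # persistent strong drive
weak = 0.5 * rng.choice([-1.0, 1.0], 200)    # weak, alternating-sign drive

saturating = evolve(strong)    # climbs until it is pinned at the bound
oscillating = evolve(weak)     # wanders, never settles at a fixed value

print("strong drive settles at:", saturating[-1])
print("weak drive keeps fluctuating, tail std:", oscillating[-100:].std())
```

Under the strong drive the parameter hits the bound and stays there; under the weak mixed-sign drive it keeps moving, a rough analogue of the oscillatory behavior the study attributes to weak synapses.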

Kanter’s research is supported in part by the TELEM grant of the Israel Council for Higher Education. His team on this study included Shira Sardi, Roni Vardi, Anton Sheinin, Amir Goldental and Herut Uzan.