One of the main advantages computers have over the human brain is that a computer can remember and quickly access all the data it is fed, while the brain limits how much data it can take in and recall.

What if the brain evolved so that this seeming downside was actually an intentional advantage?

That’s one of the surprising conclusions of new Israeli-American research led by Prof. Arnon Lotem of Tel Aviv University’s department of zoology.

Lotem and his team hypothesize that the brain’s “mechanism of filtering the data from the surroundings is an integral element in the learning process. Moreover, a limited working memory may paradoxically be helpful in some cognitive tasks,” such as language.

Lotem points out that chimpanzees have a larger working memory than humans, but the chimps use it mainly to discriminate between objects, such as different types of trees in the forest. Humans don’t need to keep all that visual data in mind, because language lets us encode and share such information instead.

As a result, the human brain assigns most of its resources to the “computational burden” of language (running it, in computing terms, as a background process) rather than keeping those resources free for working memory.

“When we listen to a string of syllables, we need to scan a massive number of possible combinations to identify words,” Lotem says.
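The scale of that search can be made concrete: a string of n syllables can be segmented into candidate words in 2^(n-1) ways, since each of the n-1 gaps between syllables is either a word boundary or not. A minimal sketch (the syllables and function name are illustrative, not from the study):

```python
def segmentations(seq):
    """Enumerate every way to split a syllable sequence into
    contiguous chunks, i.e. candidate 'words'."""
    if not seq:
        return [[]]
    result = []
    for i in range(1, len(seq) + 1):
        head = seq[:i]  # first candidate word
        for rest in segmentations(seq[i:]):
            result.append([head] + rest)
    return result

syllables = ["pre", "ty", "ba", "by"]
print(len(segmentations(syllables)))  # 2**(4 - 1) = 8 candidate parses
```

Even four syllables yield eight possible parses; a full sentence yields thousands, which is the combinatorial burden Lotem refers to.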

Lotem developed this new learning model in collaboration with Prof. Joseph Halpern and Prof. Shimon Edelman of Cornell University, and Oren Kolodny of Stanford University (formerly a PhD student at Tel Aviv University). The research was recently published in the Proceedings of the National Academy of Sciences.

The research looks at how cultural activities (such as language and tool-making) have shaped the evolution of cognition. “We believe that, over lengthy time scales, some aspects of the brain must have changed to better accommodate the learning parameters required by various cultural activities,” Lotem explains.

Lotem believes that his new research may be used someday to improve artificial intelligence.

“Currently the concept of limiting memory in order to improve computation is not something that people do in the field of AI, but perhaps they should try and see whether it can paradoxically be helpful in some cases, as in our human brain,” Lotem says.