Study of polyglots – people proficient in many languages – offers insight into the brain’s language processing
With one intriguing exception, activity increased in the areas of the cerebral cortex that make up the brain’s language-processing network when the polyglots – study participants who spoke between five and 54 languages – heard languages in which they were more proficient, compared with languages of lesser or no proficiency.
“We think this is because when you process a language that you know well, you can engage the full suite of linguistic operations – the operations that the language system in your brain supports,” said Massachusetts Institute of Technology neuroscientist Evelina Fedorenko, a member of MIT’s McGovern Institute for Brain Research and senior author of the study published on Monday in the journal Cerebral Cortex.
“You can access all the word meanings from memory, you can build phrases and clauses out of the individual words, and you can access complex, sentence-level meanings,” Fedorenko added.
But an exception caught the attention of the researchers. In many of the participants, listening to their native language elicited a smaller brain response than hearing other languages they knew – about 25% lower on average. And in some of the polyglots, listening to their native language activated only part of the brain’s language network, not the whole thing.
“Polyglots become experts in their native language from the point of view of efficiency of neural processes that are required to process it. Therefore, the language network in the brain does not activate as much when they do native versus non-native language processing,” said neuroscientist and study co-lead author Olessia Jouravlev of Carleton University in Canada.
“One’s native language may hold a privileged status, at least in this population,” Fedorenko added, referring to the study’s polyglot participants.
The brain’s language network involves a few areas situated in its frontal and temporal lobes.
“The language network supports comprehension and production across modalities – spoken, written, signed, etc. – and helps us encode our thoughts into word sequences and decode others’ thoughts from their utterances,” Fedorenko said.
Study co-lead author Saima Malik-Moraleda, a doctoral student at the Harvard/MIT Program in Speech and Hearing Bioscience and Technology, said the findings suggest that the distillation of meaning governs brain response to language.
“The more meaning you can extract from the language input you are receiving, the greater the response in language regions – except for the native language, presumably because the speaker is more efficient in extracting meaning from the linguistic input,” Malik-Moraleda said.
The 34 study participants, 20 men and 14 women, ranged in age from 19 to 71. Twenty-one were native English speakers, with the rest native speakers of French, Russian, Spanish, Dutch, German, Hungarian and Mandarin Chinese.
Their brain activity was monitored as they listened to recordings of passages in eight languages: their native language; three others in which they were highly, moderately and minimally proficient, respectively; and four they did not know. Half heard recordings of Lewis Carroll’s “Alice in Wonderland.” The other half heard recordings of biblical stories.
The lesser brain response to hearing one’s native language was most pronounced among the study participants who heard the biblical stories – linguistically simpler, according to Fedorenko, than Carroll’s writing.
“A lot of work in language research,” Fedorenko said, “has focused on individuals with linguistic difficulties – developmental or acquired. But we can also learn a lot about the cognitive and neural infrastructure of some function by looking at individuals who are ‘experts’ in that function. Polyglots are one kind of language ‘experts.’”