If yuo cna raed tihs, then teehr are two bnair netowrks in plya
Reading the line above engages two distinct brain networks, scientists from the University of Texas Health Science Center at Houston (UTHealth Houston) have discovered.
The study offers clues about how the brain comprehends and generates language, insights that could help people with conditions such as dyslexia and aphasia.
Using an invasive approach, the researchers recorded from electrodes implanted in the brains of people with epilepsy, collecting data on how the brain interprets words and sentences.
Dr Nitin Tandon, professor and chairman of UTHealth Neurosciences–Texas Medical Center and lead author, says the electrodes provide insights into the brain’s inner workings, especially for an activity like reading.
“Our work is making it clear that most processes – say comprehension or language generation – don’t occur in a single region but are best understood as very transient states that many separate areas of the brain achieve by very brief, yet critical, interactions,” Dr Tandon says in a statement.
Language centres of the brain
Multiple regions of the brain play a role in reading and comprehending text. The temporal lobe, for example, is the primary region for interpreting the pronunciation and sound of words, while Broca’s area in the frontal lobe governs speech and comprehension.
This study observed crosstalk between these two prime regions for the first time.
To listen in on this ‘conversation’, Dr Tandon’s team studied the continuous flow of information from the implanted electrodes.
“Most of the time, when people study the brain’s language centres, they do so using non-invasive measures, such as measuring brain activity using an MRI scan. This method [MRI] tells us where activity is happening. Still, we only get a snapshot of what has happened in the last few seconds. Hence, it gives us a very static picture of what’s happening,” Dr Oscar Woolnough, the first author of the study, tells Happiest Health.
This information was gathered from participants who already had electrodes implanted in their left (language-dominant) hemisphere to manage localised epilepsy, a condition in which a person experiences seizures due to abnormal electrical activity in one specific brain region.
Thirty-six participants, fitted with a total of 2,675 electrodes, provided spatiotemporal data as they silently read three types of text: regular sentences, lines from Jabberwocky (a nonsense poem by Lewis Carroll), and lists of nonsense words. Their neural activity was monitored throughout, allowing the researchers to compare responses to meaningful sentences with responses to text lacking meaning or structure.
What the readings revealed
The researchers observed specific back-and-forth crosstalk between two distinct brain regions: the inferior frontal gyrus (IFG) of the frontal lobe and the middle temporal gyrus (MTG) of the temporal lobe.
In the first network, activity from the IFG to the MTG ramped up as a person read a regular sentence but was absent for Jabberwocky sentences and nonsense strings of words. The researchers concluded that this network is involved in deriving meaning from a sentence.
The researchers then uncovered a second, functionally distinct network: here, signals from the MTG to the IFG were independent of sentence structure but depended on the complexity of the individual words.
One network finds meaning in the overall text, and the second identifies the meaning of each word. The IFG is responsible for combining the words into an understandable phrase before sending signals to the temporal lobe to decipher the words.
Dr Woolnough says that this means we can relate meanings to words while simultaneously getting the meaning of a complex sentence.
How it works in bilingual people
“Our work here is most relevant for people with aphasia, [which disrupts a person’s] ability to understand or produce language fluently and accurately. It’s a condition with which Emilia Clarke, from Game of Thrones, has had issues, and [which] Bruce Willis recently announced he has,” says Dr Woolnough.
In previous studies, the same team examined the brain circuits that derive meaning from written words when they are combined into complex sentence structures.
“We are now working with bilingual people to see if these networks work the same or differently across each language. Some recent non-invasive studies using dozens of languages have shown that all languages seem to activate the same regions in people who speak them,” adds Dr Woolnough.
Takeaways from the study
1. While reading, two networks in the brain process the context and meaning of words.
2. This can be seen across languages; the prefrontal cortex and temporal lobe primarily manage reading and language comprehension.
3. This counters the notion that reading relies on a single brain region.