Decoding how the brain understands sentences in real-time

In a recent study published in Nature Communications, researchers examined the neural networks of the left, language-dominant hemisphere responsible for semantic integration using intracranial recordings in epilepsy patients during reading tasks. The experiments were also designed to distinguish between effects driven by semantic coherence and those driven by task-based referentiality.

Study: The spatiotemporal dynamics of semantic integration in the human brain. Image Credit: Triff / Shutterstock.com

Background 

Understanding the neural mechanisms responsible for human sentence processing is pivotal for grasping the structure and timing of cortical computation. Despite the fundamental role of language in cognition, enabling us to derive meaning from unfamiliar cues, the specific brain regions responsible remain debated.

Certain areas, such as the posterior temporal lobe and the prefrontal and parietal cortices, are implicated in language processing; however, researchers have yet to reach a consensus on their precise roles. Furthermore, traditional methods like functional magnetic resonance imaging (fMRI) lack the temporal resolution needed to dissect the intricacies of language processing.

Thus, additional research is essential to better understand the specific cortical regions involved in semantic processing and their specializations, especially given the variability in the literature and the challenge of isolating distinct semantic processes with high spatiotemporal resolution.

About the study 

In the present study, 58 native English-speaking patients aged between 18 and 41 underwent experiments after providing informed consent. Patients with significant neurological histories or anomalies such as prosopagnosia were excluded.

All study participants underwent thorough neuropsychological evaluations. All procedures were approved by the University of Texas Health Science Center at Houston's Committee for the Protection of Human Subjects.

Data were gathered using depth or subdural grid electrodes that were surgically implanted, and their positions were confirmed using a combination of MRI and computed tomography (CT) imaging. Intracranial data collection began one day after electrode implantation for depth electrodes and two days for grid electrodes. Data were scrutinized for noise and artifacts, and any unreliable data points were discarded.

For the main experiment, patients were shown sentences word by word and asked to promptly and accurately name the everyday object each sentence described. Some sentences were designed to be incoherent, challenging patients to determine whether a meaning could be derived. A secondary norming study was conducted in a non-clinical population to ascertain the effectiveness of the stimuli.

A thorough analysis of the collected data revealed that, of the more than 13,000 electrode contacts implanted, 9,388 were suitable for analysis. The raw data were subsequently filtered, transformed, and smoothed to identify significant activation during the trials.
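
To give a sense of what this kind of preprocessing can look like, the sketch below computes a smoothed broadband gamma-power envelope for a single channel with SciPy. It is a minimal illustration rather than the authors' actual pipeline; the 70-150 Hz band, 1 kHz sampling rate, and 50 ms smoothing window are assumptions chosen for demonstration.

```python
# Illustrative sketch (not the study's exact pipeline): extract a smoothed
# broadband gamma-power envelope from one intracranial channel.
# Band limits, sampling rate, and smoothing window are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                       # assumed sampling rate in Hz
t = np.arange(0, 5, 1 / fs)     # 5 s of synthetic signal
raw = np.random.randn(t.size)   # stand-in for a recorded iEEG trace

# Band-pass filter in an assumed broadband gamma range (70-150 Hz)
b, a = butter(4, [70, 150], btype="bandpass", fs=fs)
gamma = filtfilt(b, a, raw)

# Analytic amplitude (envelope) via the Hilbert transform
envelope = np.abs(hilbert(gamma))

# Smooth with a simple moving average (assumed 50 ms window)
win = int(0.05 * fs)
smoothed = np.convolve(envelope, np.ones(win) / win, mode="same")

# 'smoothed' would then be epoched around word onsets and averaged across trials
```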

The data were then mapped onto a cortical surface model to localize significant activity. A significant portion of the analysis was dedicated to understanding the interrelation between the language system and episodic memory networks. The study also used the Human Connectome Project parcellation to define regions of interest and confirm precise electrode placement.
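
Conceptually, assigning electrodes to atlas-defined regions of interest can be pictured as a nearest-vertex lookup on a labeled cortical surface. The following sketch illustrates that idea with synthetic coordinates and labels; it does not reproduce the study's surface models or the actual Human Connectome Project atlas files.

```python
# Illustrative sketch: assign each electrode to the nearest labeled surface
# vertex. Coordinates and labels are synthetic stand-ins, not real atlas data.
import numpy as np

vertices = np.random.rand(10000, 3) * 100   # stand-in surface vertex coordinates (mm)
labels = np.random.randint(0, 180, 10000)   # stand-in parcel label per vertex
electrodes = np.random.rand(50, 3) * 100    # stand-in electrode coordinates (mm)

# Euclidean distance from every electrode to every vertex
dists = np.linalg.norm(electrodes[:, None, :] - vertices[None, :, :], axis=-1)
nearest = dists.argmin(axis=1)              # index of the closest vertex per electrode
electrode_rois = labels[nearest]            # parcel assignment per electrode
```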

Study findings 

The average reaction time was 1,765 milliseconds (ms) after the end of the final word in a sentence. In referential trials, articulation reaction times were significantly faster than in non-referential trials.
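
As an illustration of how such a reaction-time comparison could be run, the sketch below contrasts two synthetic sets of articulation latencies with a two-sample t-test; the numbers are invented, and the choice of test is an assumption rather than the statistic reported in the paper.

```python
# Illustrative sketch with synthetic reaction times (ms); not the study's data
# or its exact statistical analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
referential = rng.normal(1650, 300, 40)      # hypothetical referential-trial RTs
non_referential = rng.normal(1850, 300, 40)  # hypothetical non-referential-trial RTs

t_stat, p_value = stats.ttest_ind(referential, non_referential)
print(f"mean difference = {non_referential.mean() - referential.mean():.0f} ms, p = {p_value:.3f}")
```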

Advanced mapping techniques were used to gain insights into the spatiotemporal dynamics of orthographic sentence processing. While reading a sentence, there was a gradual escalation of activation in certain regions of the brain, particularly the inferior frontal gyrus, medial parietal cortex, anterior temporal lobe, and posterior middle temporal gyrus.

Subsequent phases of sentence processing revealed activation in the ventromedial prefrontal cortex, posterior cingulate, and orbitofrontal cortex. Some regions remained active and exhibited heightened activity towards the end of the sentence reading process.

At the beginning of the final word, referential trials displayed greater brain activity in various areas, such as the middle frontal gyrus (MFG), middle inferior frontal sulcus (IFS), medial parietal cortex (MPC), parahippocampal cortex, ventromedial prefrontal cortex (vmPFC), and orbitofrontal cortex (OFC). Comparatively, non-referential trials led to increased activity in the posterior superior temporal cortex and anterior inferior frontal gyrus immediately after the onset of the final word.

When exploring semantic coherence, distinct brain activity patterns were observed in the analysis of non-referential sentences based on their coherence. Incoherent non-referential sentences caused heightened activity in the medial frontal cortex and superior medial parietal cortex. Coherent non-referential sentences resulted in increased activity in regions like the IFS, anterior inferior frontal gyrus (aIFG), angular gyrus, posterior middle temporal gyrus (pMTG), and OFC.

The study further examined integrative lexical access by assessing semantic narrowing, which refers to the probability of identifying a defined object even before the final word of a sentence is presented. For example, while some sentences provided clear hints about the object in question, others remained ambiguous until the very end.
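
To make the idea concrete, semantic narrowing can be approximated as a cloze-style probability: the fraction of norming-study respondents who already produce the target object before the final word appears. The sketch below is a hypothetical illustration with invented responses and an assumed 0.5 cutoff, not the study's actual scoring procedure.

```python
# Illustrative sketch: estimate narrowing as the proportion of norming
# responses that already name the target object before the final word.
# Responses and the 0.5 threshold are invented for demonstration.
def narrowing_score(responses, target):
    """Fraction of sentence completions matching the target object."""
    matches = sum(1 for r in responses if r.strip().lower() == target)
    return matches / len(responses)

strong = narrowing_score(["clock", "clock", "watch", "clock"], "clock")  # 0.75
limited = narrowing_score(["ball", "clock", "hat", "shoe"], "clock")     # 0.25

def label(score, threshold=0.5):  # assumed cutoff for demonstration
    return "strong" if score >= threshold else "limited"

print(label(strong), label(limited))  # -> strong limited
```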

There was no significant difference in articulation reaction times between strong and limited narrowing conditions. Furthermore, there were no notable disparities in sentence length or frequency of the final word between the two conditions.

Referential trials with limited semantic narrowing had increased brain activity in regions such as the posterior superior temporal sulcus, MPC, IFS, anterior temporal lobe, and OFC, especially after the onset of the final word.

Journal reference:
  • Murphy, E., Forseth, K. J., Donos, C., et al. (2023). The spatiotemporal dynamics of semantic integration in the human brain. Nature Communications. doi:10.1038/s41467-023-42087-8
Written by Vijay Kumar Malesu
