
White matter connectome with cortical lesion map clarifies temporal auditory comprehension

Medical University of South Carolina


IMAGE: Cortical brain regions and their white matter connections related to speech comprehension.

Credit: From "Temporal lobe networks supporting the comprehension of spoken words," Brain. Published online August 3, 2017. doi:10.1093/brain/awx169. © The Author (2017). Published by Oxford University Press on behalf of...

Communicating via spoken language is a fundamental human capability that enables us to form connections with other people by sharing knowledge and emotions, working together, and assessing our experiences. Thus, losing the ability to comprehend speech because of a stroke, traumatic brain injury, or a neurological disorder such as dementia is particularly devastating.

Auditory word comprehension is a complex cognitive process that requires the participation of multiple brain areas to transform initial auditory signals into meaningful abstract concepts. The first, and most basic, aspect of understanding spoken language is the auditory processing of speech sounds. However, many subsequent steps involving multi-layered, hierarchical brain networks are necessary to derive phonemes, syllables, words, syntax, meaning and context.

Currently, our understanding of exactly which brain areas handle the various aspects of spoken language comprehension is incomplete. The prevailing theory is that neocortical regions adjacent to the auditory cortex are primarily responsible for word comprehension. However, recent studies in patients with primary progressive aphasia have challenged this concept and suggest that the left temporal pole may play a central role.

To unravel these conflicting findings, a team of researchers, led by Leonardo Bonilha, M.D., Ph.D., associate professor in MUSC's Department of Neurology, in close collaboration with Julius Fridriksson, Ph.D., Professor of Communication Sciences and Disorders at the University of South Carolina's (USC) Arnold School of Public Health and the USC Aphasia Lab, developed a novel study methodology to identify the specific neural structures that, when damaged by stroke, are associated with impaired auditory word comprehension.

"We need to better define what takes place in the brain when someone understands speech so we're better able to help those with aphasia who cannot do that anymore," explains Bonilha. "Evidence indicates that areas associated with speech comprehension are in the posterior lateral temporal lobe and close to those responsible for hearing. What we need to define is how they are linked to other brain areas that process all the secondary associations that enable you to understand and connect meaning to the sounds you hear."

Capitalizing on their recent work to optimize connectome mapping in individuals with post-stroke brain lesions, the team designed a study combining traditional voxel-based lesion symptom mapping (VLSM) with connectome-lesion symptom mapping (CLSM). CLSM, a new brain mapping method based on the concept of the human brain connectome, provides a three-dimensional map of all medium- and long-range white matter connections outside of the area damaged by the stroke.

"Before CLSM, we primarily looked at the stroke lesion, the damaged area," explains Bonilha. "We focused on understanding what brain areas were gone and matched those to what function was gone. But, of course, brain functions don't depend exclusively on one area. Using the connectome, we can see the impact of the stroke beyond the lesion and begin to identify networks that the damage has disconnected beyond the stroke lesion. These areas might appear to be OK on MRI after the stroke, but, in fact, they are disconnected and do not receive the signals they need to function."
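The core idea behind CLSM, as the release describes it, is to relate the strength of each white matter connection between brain regions to a behavioral score across patients, so that weakened connections beyond the lesion itself can be linked to lost function. A minimal illustrative sketch of that idea follows; the function name, array shapes, and use of a simple Pearson correlation are assumptions for demonstration, not the study's actual analysis pipeline:

```python
import numpy as np
from scipy.stats import pearsonr

def clsm_sketch(connectomes, scores):
    """Toy connectome-lesion symptom mapping.

    connectomes: (n_patients, n_regions, n_regions) symmetric matrices of
                 white matter connection strength (e.g., streamline counts)
    scores:      (n_patients,) behavioral scores (e.g., word comprehension)

    Returns an (n_regions, n_regions) matrix correlating each pairwise
    connection strength with the behavioral score across patients.
    """
    n_patients, n_regions, _ = connectomes.shape
    corr = np.zeros((n_regions, n_regions))
    for i in range(n_regions):
        for j in range(i + 1, n_regions):
            link = connectomes[:, i, j]
            if np.std(link) > 0:  # skip connections absent in all patients
                # Positive correlation: stronger connection, better score
                corr[i, j] = corr[j, i] = pearsonr(link, scores)[0]
    return corr
```

In a real analysis, connection strengths would come from diffusion MRI tractography and the resulting map would be corrected for lesion size and multiple comparisons; this sketch only conveys the connection-to-behavior mapping logic.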

The team reasoned that assessing white matter networks beyond the area of cortical necrosis may provide a more comprehensive assessment of brain damage, residual brain integrity and its impact on language processing.

They recruited 67 people with chronic aphasia who had suffered a one-time ischemic stroke at least six months prior to the study. Each participant was assessed for word comprehension, aphasia and aphasia severity, as well as semantic processing. Magnetic resonance imaging (MRI) was conducted to facilitate VLSM and CLSM. The computational steps to measure the connectome from MRI were developed in collaboration with Chris Rorden, Ph.D., professor of neuroimaging and endowed chair in USC's Department of Psychology.

VLSM and CLSM are complementary. "Both tools provide valuable information," says Bonilha. "The connectome tells us about areas outside the lesion that were highly connected to that region and where reduced connections post-stroke are affecting a particular function. But the connectome is not good for looking at areas inside the lesion to determine what functions happen there. That's where voxel-based methods are more useful."
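The voxel-based approach Bonilha contrasts with the connectome method works by asking, for each voxel, whether patients whose lesion includes that voxel score worse than patients whose lesion spares it. A minimal illustrative sketch is below; the function name, the use of an independent-samples t-test, and the minimum group size are assumptions for demonstration, not the study's actual procedure:

```python
import numpy as np
from scipy.stats import ttest_ind

def vlsm_sketch(lesion_masks, scores, min_n=5):
    """Toy voxel-based lesion-symptom mapping.

    lesion_masks: (n_patients, n_voxels) binary array; 1 = voxel lesioned
    scores:       (n_patients,) behavioral scores (e.g., word comprehension)

    Returns a t-statistic per voxel (NaN where a group is too small).
    A large positive t means damage at that voxel predicts worse scores.
    """
    n_patients, n_voxels = lesion_masks.shape
    t_map = np.full(n_voxels, np.nan)
    for v in range(n_voxels):
        lesioned = scores[lesion_masks[:, v] == 1]
        spared = scores[lesion_masks[:, v] == 0]
        if len(lesioned) >= min_n and len(spared) >= min_n:
            t_map[v] = ttest_ind(spared, lesioned).statistic
    return t_map
```

Real VLSM analyses additionally correct for lesion volume and for the many thousands of voxel-wise tests; the sketch only conveys the lesion-to-behavior comparison at the heart of the method.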

Study results supported the prevailing view that posterior lateral and inferior aspects of the temporal cortex are most critical for word comprehension. In addition, these areas may serve as a 'hub' that integrates the auditory and conceptual information necessary to recognize words. CLSM results also explained why other studies suggest that the temporal pole plays a role in word comprehension by revealing that the temporal pole is functionally and structurally connected to the middle temporal gyrus. The authors propose that, when the pole is disproportionately affected, an indirect knock-on effect may lead to a statistical association with poor word comprehension.

This study provides a more comprehensive description of crucial neuronal networks involved in speech comprehension that may contribute to improved targeting of therapy for individuals with impaired auditory speech comprehension. Its findings demonstrate that temporal poles (in the anterior temporal lobe) are part of a broader network associated with semantic interpretation. However, when the effect of object recognition is factored out, only the core of that network (i.e., the middle and inferior temporal areas) is necessary for word comprehension. While the left temporal pole has an indirect role in word comprehension, the anterior temporal regions most likely play a central role in additional and deeper levels of semantic processing. The study's findings also indicate that the temporal pole is likely to be essential for recognizing objects -- an important early process for matching spoken words to pictures or objects.


About MUSC

Founded in 1824 in Charleston, the Medical University of South Carolina is the oldest medical school in the South. Today, MUSC continues the tradition of excellence in education, research, and patient care. MUSC educates and trains more than 3,000 students and residents, and has nearly 13,000 employees, including approximately 1,500 faculty members. As the largest non-federal employer in Charleston, the university and its affiliates have collective annual budgets in excess of $2.2 billion. MUSC operates a 750-bed medical center, which includes a nationally recognized Children's Hospital, the Ashley River Tower (cardiovascular, digestive disease, and surgical oncology), Hollings Cancer Center (a National Cancer Institute-designated center), a Level I Trauma Center, and the Institute of Psychiatry.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.