In the first imaging study to directly compare reading and listening activity in the human brain, Carnegie Mellon scientists discovered that the same information produces systematically different patterns of brain activation. Knowing which parts of the brain fire during reading or listening comprehension bears on one of the classic questions about language comprehension: whether the means of delivery, through the eyes or the ears, makes a difference. "The brain constructs the message, and it does so differently for reading and listening. The pragmatic implication is that the medium is part of the message. Listening to an audio book leaves a different set of memories than reading does. A newscast heard on the radio is processed differently from the same words read in a newspaper," said Carnegie Mellon Psychology Professor Marcel Just, co-author of the report that appears in this month's issue of the journal Human Brain Mapping.
Just said that the most recent methods of functional magnetic resonance imaging (fMRI) were applied to measure brain activity during these high-level conceptual processes. Rather than examining the processing of single digits or words, his group is applying brain imaging to societal, workplace, and instructional issues. "We can now see how cell-phone use can affect driving, how reading differs from listening, and how visual thinking is integrated with verbal thinking," Just said.
Using non-invasive fMRI, the scientists were able to measure the amount of activity in each of 20,000 peppercorn-sized regions of the brain every three seconds and create visual maps of how the mental work of thinking was allocated throughout the brain from moment to moment. To the scientists' surprise, there were two big differences in the brain activity patterns while participants were reading or listening to identical sentences, even at the conceptual level of understanding the meaning of a sentence. First, during reading, the right hemisphere was not as active as anticipated, which raises the possibility of qualitative differences in the nature of the comprehension we experience in reading versus listening.
Second, while listening was taking place, there was more activation in the left-hemisphere brain region called the pars triangularis (the triangular section), a part of Broca's area that usually activates when there is language processing to be done or there is a requirement to maintain some verbal information in an active state (sometimes called verbal working memory). The greater amount of activation in Broca's area suggests that there is more semantic processing and working memory storage in listening comprehension than in reading.
Because spoken language is so temporary, each sound hanging in the air for a fraction of a second, the brain is forced to immediately process or store the various parts of a spoken sentence in order to be able to mentally glue them back together in a conceptual frame that makes sense. "By contrast," Just said, "written language provides an 'external memory' where information can be re-read if necessary. But to replay spoken language, you need a mental playback loop (called the articulatory-phonological loop), conveniently provided in part by Broca's area."
The study doesn't attempt to suggest that one means of delivering information is better than another, Just said. "Is comprehension better in listening or in reading? It depends on the person, the content of the text, and the purpose of the comprehension. In terms of persons, some people are more skilled at one form of comprehension and typically exercise a preference for their more skilled form where possible. It may be that because of their experience and biology they are better and more comfortable in listening or reading," he explained.
Just carries out his research on the human brain through the Center for Cognitive Brain Imaging at Carnegie Mellon (www.ccbi.cmu.edu). The language comprehension project is funded by the National Institutes of Health.