News Release

How does the brain learn categorization for sounds? The same way it does for images

A two-step process may explain how the brain quickly builds on past learning

Peer-Reviewed Publication

U.S. National Science Foundation

Image: Functional MRI response from a representative subject during a listening task.

Credit: Xiong Jiang, Georgetown University

Categorization, or the recognition that individual objects share similarities and can be grouped together, is fundamental to how we make sense of the world. Previous research has revealed how the brain categorizes images. Now, researchers funded by the National Science Foundation (NSF) have discovered that the brain categorizes sounds in much the same way.

The results are published today in the journal Neuron.

"Categorization involves applying a single label to a wide variety of sensory inputs," said Max Riesenhuber, professor of neuroscience at Georgetown University and lead co-author of the article. "For example, apples come in many colors, shapes and sizes, yet we label each as an apple. Children do this all the time as they learn language, but we actually know very little about how the brain categorizes the world."

The importance of this work was underlined by Uri Hasson, program director for NSF's Cognitive Neuroscience Program, which supported the research.

"These findings reveal what may not only be a general mechanism about how the brain learns, but also about how learning changes the brain and allows the brain to build on that learning," Hasson said. "The work has potential implications for understanding individual differences in language learning and can provide a foundation for understanding and treating people with learning disorders and other disabilities."

Riesenhuber's group at Georgetown had previously studied how the brain categorizes visual objects and found that at least two distinct regions of the brain were involved. One region, in the visual cortex, encoded images, while a region in the prefrontal cortex signaled their category membership. For their more recent research, Riesenhuber and lead author Xiong Jiang were interested in whether the same processes underlie categorization of auditory cues. They joined forces with co-author Josef Rauschecker, also a professor of neuroscience at Georgetown and an expert on the auditory cortex and neural plasticity.

To find out how the brain categorizes auditory input, the researchers created novel sounds with an acoustic blending tool that morphed two types of monkey calls. The blending produced hundreds of new sounds that differed from the original calls.
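The release does not describe the blending tool itself. As a rough intuition for what "blending" two call types means, the minimal sketch below mixes two synthetic waveforms with a sweep of weights to generate many intermediate sounds; the sample rate, the stand-in tones, and the simple weighted mixture are illustrative assumptions, not the study's actual morphing method.

```python
import numpy as np

# Illustrative only: the study used a dedicated acoustic-blending tool.
# Here "blending" is approximated as a weighted mixture of two synthetic
# waveforms standing in for the two monkey call types.
sr = 16000                                      # sample rate in Hz (assumed)
t = np.linspace(0, 0.5, int(sr * 0.5), endpoint=False)

call_a = np.sin(2 * np.pi * 440 * t)            # stand-in for call type A
call_b = np.sin(2 * np.pi * 880 * t) * (1 - t)  # stand-in for call type B

# Sweep the mixing weight to produce many intermediate sounds that differ
# from both originals, analogous to the hundreds of blended calls.
morph_weights = np.linspace(0.0, 1.0, 11)
blends = [(1 - w) * call_a + w * call_b for w in morph_weights]
```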

Subjects listened to several hundred of the calls and learned to categorize them under two arbitrary labels created by the researchers. Before and after this training, the researchers used functional MRI to image subjects' brains while they listened to the sounds without labeling them. The results showed that learning to categorize the sounds had increased the brain's sensitivity to the acoustic features that distinguished one sound from another. This occurred in the lower-level auditory cortex, which is responsible for representing sound but does not appear to give it any meaning or significance.

The subjects' brains were then scanned while they judged which category the sounds belonged to. These scans showed that neural activity patterns in another brain region, the prefrontal cortex, distinguished between categories and that subjects used that information to make their judgments. The researchers also found that the category selectivity of neural activity patterns in the prefrontal cortex was task-dependent. When subjects were listening to the sounds but not judging which category they belonged to, the neural activity patterns in the prefrontal cortex region did not distinguish one category from another.
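Showing that "neural activity patterns distinguished between categories" is typically done with multivariate pattern classification: a classifier is trained to decode the category from the pattern of voxel responses, and above-chance cross-validated accuracy indicates the region carries category information. The sketch below is a generic illustration of that idea on simulated data; the data, classifier choice, and parameters are assumptions for demonstration, not the analysis pipeline reported in the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated data: 100 trials x 200 voxels, with a weak mean shift between
# the two sound categories (purely synthetic, for illustration).
n_trials, n_voxels = 100, 200
labels = np.repeat([0, 1], n_trials // 2)
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1, :20] += 0.5   # small category signal in some voxels

# Cross-validated decoding accuracy above chance (0.5) would indicate that
# the activity patterns distinguish the two categories.
clf = LinearSVC(dual=False)
scores = cross_val_score(clf, patterns, labels, cv=5)
print("mean decoding accuracy:", scores.mean())
```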

The discovery of similar processes for visual and auditory categorization promises important advances for how we understand learning.

"Knowing how senses learn the world may help us devise workarounds in our very plastic brains," Riesenhuber said. "If a person can't process one sensory modality, say vision, because of blindness, there could be substitution devices that allow visual input to be transformed into sounds." Added Rauschecker, "One disabled sense would be processed by other sensory brain centers."

###

