News Release

Brain scan patterns identify objects being viewed

Peer-Reviewed Publication

NIH/National Institute of Mental Health

National Institute of Mental Health (NIMH) scientists have shown that they can tell what kind of object a person is looking at -- a face, a house, a shoe, a chair -- by the pattern of brain activity it evokes. These landscapes of strong, intermediate and weak responses, in a visual processing area on the bottom surface of the brain, are different for each category of objects.

The patterns may provide a key to deciphering the brain's code for recognizing objects and faces, say the researchers. James Haxby, Ph.D., NIMH Laboratory of Brain and Cognition, and colleagues report on their functional magnetic resonance imaging (fMRI) study in the September 28, 2001, issue of Science.

"Brain imaging may be able to show how the brain encodes complex information, such as the appearance of objects, not just where the encoding occurs," said Haxby.

How is the brain's visual system able to represent a virtually unlimited number of faces and objects? Neuroscientists are looking for the answer in the functional architecture of the ventral temporal lobe of the visual cortex.

Some have suggested that specialized sites have evolved there for recognizing faces, spaces and objects used for navigation.

Others propose that the face recognition site might instead be specialized for expert recognition of any category of objects. By contrast, the NIMH group finds evidence for a different kind of organization.

They show that information specific to numerous categories is more widely distributed and overlapping. For example, even within "specialized" regions, such as the "place" area, distinct patterns of responses can be distinguished for other objects, such as shoes, cats and chairs. They call this model "object form topography."

Earlier NIMH fMRI studies had shown that brain areas that respond maximally to a particular category of object are consistent across different people.

This new study finds that the full pattern of responses -- not just the areas of maximal activation -- is consistent within the same person for a given category of object.

Moreover, all of the responses -- even the weak responses -- convey information about attributes of the category. Even when the areas of maximal activation are excluded from analysis, it's possible to infer the category from the distinctive constellation of weaker responses.

The researchers measured patterns of response with fMRI while 6 subjects viewed pictures of faces, cats, and 5 categories of man-made objects: houses, chairs, scissors, shoes and bottles. They were also shown scrambled nonsense images as a control. Pairs of responses within and between categories were compared. Overall, the pattern of fMRI responses predicted the category with 96% accuracy.

Accuracy was 100% for faces, houses and scrambled pictures. When the areas of maximal response were excluded from the analysis, accuracy dropped to 94%.
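The kind of analysis the release describes -- predicting the viewed category by comparing response patterns measured on independent occasions -- can be illustrated with a minimal sketch. This is not the authors' code: the voxel patterns here are simulated, and names such as `signatures` and `templates` are invented for illustration. It shows one simple version of the idea, classifying a pattern by which category template it correlates with best.

```python
# Hedged sketch: pattern-correlation classification on simulated fMRI data.
# Assumes each object category evokes a reproducible landscape of strong,
# intermediate and weak voxel responses, plus measurement noise.
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 200
categories = ["face", "house", "chair", "shoe"]

# Simulated "true" response landscape for each category (hypothetical).
signatures = {c: rng.normal(size=n_voxels) for c in categories}

def measure(category, noise=0.5):
    """One noisy measurement of the pattern a category evokes."""
    return signatures[category] + rng.normal(scale=noise, size=n_voxels)

# Build a template for each category from one half of the data...
templates = {c: measure(c) for c in categories}

def classify(pattern):
    """Assign the category whose template correlates best with the pattern."""
    return max(categories,
               key=lambda c: np.corrcoef(pattern, templates[c])[0, 1])

# ...and test on independent measurements from the other half.
correct = sum(classify(measure(c)) == c for c in categories)
print(f"{correct}/{len(categories)} categories identified correctly")
```

Because each category's full landscape of responses carries information, the correlation between two independent measurements of the same category tends to exceed the correlation between measurements of different categories, which is the basis of the classification.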

The specificity of the brain's responses "is not restricted to categories for which dedicated systems might have evolved because of their biological significance" -- i.e., human faces, say the researchers. Rather, an extensive "topography" of responses, large and small, with finer spatial resolution than could be accommodated by category-specific areas, somehow represents how complex attributes of objects and faces are "related visually, structurally or semantically." Such an architecture "has the capacity to produce unique representations of a virtually unlimited number of object categories."

###

Also participating in the study were: Drs. Ida Gobbini, Maura Furey, Alumit Ishai and Jennifer Schouten, NIMH Laboratory of Brain and Cognition, and Dr. Pietro Pietrini, University of Pisa, Italy.

The National Institute of Mental Health (NIMH) is part of the National Institutes of Health (NIH), the Federal Government's primary agency for biomedical and behavioral research. NIH is a component of the U.S. Department of Health and Human Services.

