News Release

The audiovisual integration of material information in preverbal infants

Peer-Reviewed Publication

Chuo University

A joint group of researchers from Chuo University, Japan Women's University, and Kagoshima University has revealed that infants aged 4 to 5 months already hold a primary cerebral representation of the audiovisual integration of material information in their right hemisphere, and that the number of material types an infant's brain can process increases with experience of those materials. This finding may lead to a better understanding of the trajectory by which we acquire general knowledge about the objects around us.

The study was published online in Scientific Reports on 18 June.

How multisensory associations of material properties are formed is one of the current topics in cognitive science. The material properties of objects are conveyed through multiple senses. However, input from multiple modalities is not always necessary to categorize materials, because our perceptual system allows us to infer visual material information from auditory information alone. Previous studies on the neural basis of material category perception in adult humans and monkeys indicated cortical activation in the ventral visual pathway for vision-based material perception, specifically the fusiform gyrus in humans (Hiramatsu et al., 2011) and the inferior temporal cortex in monkeys (Goda et al., 2014). On the other hand, activation in the ventro-medial pathway has been reported for audition-based material perception in humans (Arnott et al., 2008). A recent functional magnetic resonance imaging study in monkeys revealed that a supramodal neural representation develops in the posterior inferior temporal cortex after simple long-term visuo-haptic experience. In humans, however, the development of the association between the auditory and visual material properties of objects remains poorly understood. Therefore, in the present study, we addressed this question by measuring brain activity in preverbal infants, before the acquisition of language.

In our study, we used near-infrared spectroscopy (NIRS) to measure brain activity in the right temporal region of preverbal 4- to 8-month-old infants. We assessed differences in signal strength between matching conditions by alternating two visual materials ("Metal" and "Wood") while presenting the impact sound of one of the two materials. We demonstrated for the first time that preverbal infants map auditory material properties onto visual material information in the right temporal region. Furthermore, we found that infants acquire the audiovisual mapping for the "Metal" material later than for the "Wood" material, consistent with infants forming a visual representation of the "Metal" material only after approximately 6 months of age. Our findings indicate that the development of associations between multisensory material properties may depend on a material's familiarity during the first half year of life.

###

This study was supported by JSPS KAKENHI Grant Numbers JP16H07207 and JP26730076. This study was also supported in part by JST-RISTEX. This work was also supported by Grants-in-Aid for Scientific Research on Innovative Areas No. 17H06343 "Construction of the Face-Body Studies in Transcultural Conditions" and No. 16H01677 "SHITSUKAN Science and Technology".

Journal: Scientific Reports, 8, 9301
Title: Crossmodal association of auditory and visual material properties in infants.
Authors: Yuta Ujiie (Chuo University), Wakayo Yamashita (Kagoshima University), Waka Fujisaki (AIST and Japan Women's University), So Kanazawa (Japan Women's University), and Masami K. Yamaguchi (Chuo University).
Affiliations: 1 Chuo University, 2 Kagoshima University, 3 AIST, 4 Japan Women's University
Corresponding author: Yuta Ujiie, Chuo University. E-mail: yuta.ujiie.160330@gmail.com
