News Release

Facial expression recognition is modulated by approach-avoidance behavior

Approaching or distancing from others can influence how we perceive their facial expressions

Peer-Reviewed Publication

Toyohashi University of Technology (TUT)

Figure: Participants were more likely to perceive the avatar’s expression as angry when they actively avoided the avatar, compared to when the avatar moved away from them.


Credit: COPYRIGHT(C)TOYOHASHI UNIVERSITY OF TECHNOLOGY. ALL RIGHTS RESERVED.

<Summary>

A research team from the Cognitive Neurotechnology Unit and the Visual Perception and Cognition Laboratory at Toyohashi University of Technology has found that approach–avoidance behavior in a virtual reality (VR) environment modulates how individuals recognize facial expressions. Notably, the study demonstrated that participants were more likely to perceive a facial expression as “angry” when they actively moved away from the face stimulus than when the face moved away from them. These findings contribute to a better understanding of the reciprocal relationship between perception and action in social contexts. The study was published online on July 31, 2025, in the International Journal of Affective Engineering. https://doi.org/10.5057/ijae.IJAE-D-24-00049

<Details>

Facial expressions play a fundamental role in social communication. While it is well established that others’ expressions influence our behavior—such as approaching a smiling person or avoiding an angry one—the reverse effect, namely whether our own behavior affects how we recognize others’ expressions, has been less explored.
To address this question, the research team conducted three psychophysical experiments using VR. Participants wore a head-mounted display and observed 3D face models (avatars) under four distinct approach–avoidance conditions:

  1. Active approach: The participant approached the avatar.
  2. Active avoidance: The participant moved away from the avatar.
  3. Passive approach: The avatar approached the participant.
  4. Passive avoidance: The avatar moved away from the participant.

The facial expressions were generated by morphing between happy and angry (or fearful) expressions across seven levels. Participants were instructed to judge each expression as either “happy” or “angry” (or “happy” or “fearful”) depending on the experimental condition.
Results from Experiment 1 showed that participants were more likely to recognize the avatar’s expression as “angry” when they actively avoided the face, compared to when the avatar moved away from them. This suggests that one’s own avoidance behavior may enhance the perception of threat in others’ facial expressions. The pattern supports the hypothesis that behavior and perception are linked in a bidirectional manner.
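In psychophysical designs like this, binary judgments across morph levels are typically summarized as a psychometric function, whose midpoint (the point of subjective equality) indicates the morph level judged "angry" half the time. The paper does not publish its analysis code; the following is a minimal illustrative sketch with made-up response proportions and a simple grid-search logistic fit, not the authors' actual data or method.

```python
import math

# Hypothetical data: for 7 morph levels (0 = fully happy, 6 = fully angry),
# the fraction of trials judged "angry" (illustrative values only).
morph_levels = [0, 1, 2, 3, 4, 5, 6]
p_angry = [0.02, 0.05, 0.20, 0.50, 0.80, 0.95, 0.98]

def logistic(x, midpoint, slope):
    """Standard logistic psychometric function."""
    return 1.0 / (1.0 + math.exp(-slope * (x - midpoint)))

def fit_psychometric(xs, ys):
    """Least-squares grid search over midpoint (PSE) and slope."""
    best_err, best_m, best_s = float("inf"), None, None
    for m in [i * 0.1 for i in range(0, 61)]:      # midpoint candidates 0.0..6.0
        for s in [i * 0.1 for i in range(1, 51)]:  # slope candidates 0.1..5.0
            err = sum((logistic(x, m, s) - y) ** 2 for x, y in zip(xs, ys))
            if err < best_err:
                best_err, best_m, best_s = err, m, s
    return best_m, best_s

pse, slope = fit_psychometric(morph_levels, p_angry)
# A shift of the PSE toward the "happy" end of the morph continuum in one
# condition would mean more intermediate morphs are judged "angry" there.
print(f"PSE = {pse:.1f}, slope = {slope:.1f}")
```

Under this framing, the Experiment 1 result would correspond to a lower PSE (more "angry" responses) during active avoidance than during passive avoidance.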

Yugo Kobayashi, the first author and a doctoral student at the Department of Computer Science and Engineering, commented: “In today’s communication environments such as video conferencing, opportunities for physical movement are limited. These findings suggest that face-to-face communication involving bodily action may facilitate more natural recognition of facial expressions.”

<Future Directions>

The study provides evidence that one’s own approach–avoidance behavior can modulate facial expression recognition. Future work will examine which aspects of these behaviors—such as motor intention, visual motion, or proprioceptive feedback—are critical to this modulation.

<Acknowledgments>

This work was supported by JSPS KAKENHI (Grant Numbers JP21K21315, JP22K17987, JP20H05956, and JP20H04273), the Nitto Foundation, and research funding support for doctoral course students at Toyohashi University of Technology in FY2024.

<Publication Information>

Kobayashi, Y.*, Tamura, H., Nakauchi, S., & Minami, T. (2025). Facial expression recognition is modulated by approach–avoidance behavior. International Journal of Affective Engineering, 24(3), 253–266. https://doi.org/10.5057/ijae.IJAE-D-24-00049
*Corresponding author.
