A Multimodal Method for Modeling Human Emotions
Caption
Researchers used a multilayered multimodal latent Dirichlet allocation model to integrate bodily signals, sensory information, and language from human participants. By learning emotion concepts from these multimodal data and evaluating their consistency with human emotion categories, the computational model offers insight into the mechanisms by which human emotions form.
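As a rough illustration of the idea behind the caption, the sketch below runs a plain single-layer latent Dirichlet allocation with collapsed Gibbs sampling over toy "multimodal" documents, where discretized bodily-signal, sensory, and language tokens share one bag-of-words vocabulary. The data, token names, and hyperparameters are all hypothetical, and the study's actual model is multilayered; this only conveys how latent "emotion concept" topics can be inferred from mixed-modality observations.

```python
import random
from collections import defaultdict

# Hypothetical participant records: each mixes discretized bodily-signal
# tokens (hr_*, skin_*), sensory tokens (img_*), and language tokens (word_*).
docs = [
    ["hr_high", "skin_warm", "word_happy", "word_joy", "img_bright"],
    ["hr_high", "skin_warm", "word_excited", "img_bright", "word_joy"],
    ["hr_low", "skin_cold", "word_sad", "img_dark", "word_gloomy"],
    ["hr_low", "skin_cold", "word_gloomy", "img_dark", "word_sad"],
]

K = 2                      # number of latent "emotion concept" topics
alpha, beta = 0.1, 0.1     # symmetric Dirichlet priors
vocab = sorted({w for d in docs for w in d})
V = len(vocab)

random.seed(0)
# z[d][i]: topic currently assigned to token i of document d
z = [[random.randrange(K) for _ in d] for d in docs]
ndk = [[0] * K for _ in docs]               # document-topic counts
nkw = [defaultdict(int) for _ in range(K)]  # topic-word counts
nk = [0] * K                                # tokens per topic
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        t = z[d][i]
        ndk[d][t] += 1
        nkw[t][w] += 1
        nk[t] += 1

# Collapsed Gibbs sampling: resample each token's topic from its
# conditional distribution given all other assignments.
for _ in range(200):
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            t = z[d][i]
            ndk[d][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
            weights = [
                (ndk[d][k] + alpha) * (nkw[k][w] + beta) / (nk[k] + V * beta)
                for k in range(K)
            ]
            t = random.choices(range(K), weights=weights)[0]
            z[d][i] = t
            ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1

# Each document's dominant topic acts as its inferred emotion concept.
dominant = [max(range(K), key=lambda k: ndk[d][k]) for d in range(len(docs))]
print(dominant)
```

Evaluating whether such inferred topics line up with human-labeled emotion categories is, per the caption, how the researchers checked the model's consistency with human emotion formation.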
Credit
Assistant Professor Chie Hieida, Nara Institute of Science and Technology, Japan
Usage Restrictions
Cannot be reused without permission.
License
Original content