News Release

Building a better VR headset

UTA technology can reconstruct facial activity for VR, health care environments

Grant and Award Announcement

University of Texas at Arlington

Image: VP Nguyen. Credit: UT Arlington

A researcher at The University of Texas at Arlington is developing technology that would allow virtual reality users to see the facial expressions of the person they are interacting with online.

VP Nguyen, an assistant professor of computer science, received a grant of nearly $250,000 from the National Science Foundation for the project, which is part of a larger grant with Jian Liu at the University of Tennessee–Knoxville.

“This is one of the first devices that will allow us to monitor human facial activity in detail, and it has a variety of potential applications ranging from virtual reality gaming to health care,” Nguyen said. “The project bridges the gap between anatomical and muscular knowledge of the human face and electrical and computational modeling techniques to develop analytical models, hardware and software libraries for sensing face-based physiological signals.”

Current virtual reality headsets block the majority of a user’s face. In response, Nguyen’s team has created a device that is lightweight and unobtrusive and preserves users’ privacy.

The device would have applications in speech enhancement for stroke patients and others who have trouble speaking, allowing physicians to pinpoint exactly which facial muscles are affected and create a treatment plan based on that information. It could also be used with phones in noisy environments, such as subways or bars, where speech recognition technology performs poorly.

“Monitoring human facial muscle activities at fine-grained details and reconstructing 3D facial expressions are challenging tasks,” said Hong Jiang, chair of UTA’s Computer Science and Engineering Department. “This project will advance the state of the art in ear-worn sensing techniques and transfer what is learned across multiple sensing modalities. The findings from this research could enable a wide range of applications, ranging from virtual and augmented reality to emotion recognition to health care.”

Nguyen, who joined UTA in 2020, is the director of the Wireless and Sensor Systems Laboratory, which focuses on building wireless, mobile and embedded systems for cyber-physical solutions to monitor and improve human health conditions and living environments. This includes wearable, mobile and wireless sensing systems to monitor human breathing volume, brain activities, muscle activities, skin surface deformation and blood pressure for various health care applications.

  • Written by Jeremy Agor, College of Engineering
