Feature Story | 2-Apr-2024

Wristband uses echoes, AI to track hand positions for VR and more

Cornell University

ITHACA, N.Y. – Cornell University researchers have developed a wristband device that continuously detects hand positioning – as well as objects the hand interacts with – using AI-powered, inaudible soundwaves.

Potential applications include tracking hand positions for virtual reality (VR) systems, controlling smartphones and other devices with hand gestures, and understanding a user’s activities; for example, a cooking app could narrate a recipe as the user chops, measures and stirs. The technology is small enough to fit onto a commercial smartwatch and lasts all day on a standard smartwatch battery.

EchoWrist is among the newest low-power, body pose-tracking technologies from the Smart Computer Interfaces for Future Interactions (SciFi) Lab. Cheng Zhang, assistant professor of information science, directs the lab.

“The hand is fundamentally important – whatever you do almost always involves hands,” Zhang said. “This device offers a solution that can continuously track your hand pose cheaply and also very accurately.”

The device uses two tiny speakers mounted on the top and underside of a wristband to bounce inaudible sound off the hand and any hand-held objects. Two nearby microphones pick up the echoes, which are interpreted by a microcontroller. A battery smaller than a quarter powers the device.
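The setup described above is a form of active acoustic sensing. As a rough illustration only (the actual device parameters and signal processing are not given in this story), one common way to turn speaker-and-microphone hardware into distance information is to emit an inaudible frequency sweep and cross-correlate the microphone recording with it, producing an "echo profile" of reflection strength versus round-trip delay. All frequencies, rates, and sizes below are hypothetical:

```python
import numpy as np

# Illustrative active acoustic sensing sketch (parameters are assumptions,
# not EchoWrist's actual values): emit an inaudible chirp, then
# cross-correlate the received signal with it. Peaks in the result
# correspond to reflections arriving after particular delays.

FS = 50_000              # sample rate in Hz (illustrative)
F0, F1 = 18_000, 21_000  # inaudible sweep band in Hz (illustrative)
DUR = 0.01               # chirp duration in seconds

t = np.arange(int(FS * DUR)) / FS
# Linear chirp sweeping from F0 to F1 over DUR seconds
chirp = np.sin(2 * np.pi * (F0 * t + (F1 - F0) / (2 * DUR) * t**2))

def echo_profile(mic_signal: np.ndarray) -> np.ndarray:
    """Cross-correlate the recording with the known chirp; the peak
    index gives the round-trip delay (in samples) of the echo."""
    return np.abs(np.correlate(mic_signal, chirp, mode="valid"))

# Simulate one echo: the chirp comes back attenuated after a 2 ms delay
delay = int(0.002 * FS)
received = np.zeros(len(chirp) + delay + 100)
received[delay:delay + len(chirp)] += 0.3 * chirp

profile = echo_profile(received)
print(profile.argmax())  # strongest reflection at ~`delay` samples
```

In a wearable like the one described, such profiles would be computed continuously; changes in the echo pattern reflect changes in hand pose and nearby objects.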

The team developed a type of artificial intelligence model inspired by neurons in the brain, called a neural network, to interpret a user’s hand posture from the resulting echoes. To train the neural network, they compared echo profiles with videos of users making various gestures and reconstructed the positions of 20 hand joints from the sound signals alone.
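The mapping the story describes, from an echo profile to 20 hand-joint positions, can be sketched as a small neural network forward pass. The architecture and sizes below are purely illustrative (the paper's actual model is not described here), and the weights are random; in practice they would be trained against joint positions extracted from the synchronized video:

```python
import numpy as np

# Hypothetical sketch: a tiny two-layer network mapping a flattened
# echo profile to 20 hand joints, each with (x, y, z) coordinates.
# Layer sizes and weights are illustrative, not the trained model.

rng = np.random.default_rng(0)
N_IN, N_HID, N_JOINTS = 256, 64, 20   # assumed sizes

W1 = rng.normal(0, 0.1, (N_IN, N_HID))
b1 = np.zeros(N_HID)
W2 = rng.normal(0, 0.1, (N_HID, N_JOINTS * 3))
b2 = np.zeros(N_JOINTS * 3)

def predict_joints(echo_profile: np.ndarray) -> np.ndarray:
    """One forward pass: echo profile -> (20, 3) joint coordinates."""
    h = np.maximum(0, echo_profile @ W1 + b1)   # ReLU hidden layer
    return (h @ W2 + b2).reshape(N_JOINTS, 3)

joints = predict_joints(rng.normal(size=N_IN))
print(joints.shape)  # (20, 3)
```

Training would minimize the distance between these predicted joints and the video-derived ground truth, after which the camera is no longer needed.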

With help from 12 volunteers, the researchers tested how well EchoWrist detects objects such as a cup, chopsticks, water bottle, pot, pan and kettle, and actions like drinking, stirring, peeling, twisting, chopping and pouring. Overall, the device had 97.6% accuracy.

The technology could be used to reproduce hand movements for VR applications. Existing VR and augmented reality systems accomplish this task using cameras mounted on the headset, but this approach uses a lot of power and can’t track the hands once they leave the headset’s limited field of view.

The researchers noted, however, that EchoWrist still struggles to distinguish objects with very similar shapes, such as a fork and a spoon. The team is confident that object recognition will improve as they refine the technology. With further optimization, they believe EchoWrist could easily be integrated into an existing off-the-shelf smartwatch.

The project was funded by the National Science Foundation.

For additional information, see this Cornell Chronicle story.

Media note: Pictures can be viewed and downloaded here.



Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.