News Release

Machine learning lends a helping ‘hand’ to prosthetics

Using machine learning, a camera, and a sensor, researchers improve the way prosthetic hands predict the required grip strength.

Peer-Reviewed Publication

American Institute of Physics

Pressure sensors are placed at the fingertips of the prosthetic hand, and a camera near its palm (left). The prosthetic hand was tested on a can, an egg, and a USB stick (right).


Credit: Li et al.

WASHINGTON, Jan. 20, 2026 — Holding an egg requires a gentle touch. Squeeze too hard, and you’ll make a mess. Opening a water bottle, on the other hand, needs a little more grip strength.

According to the U.S. Centers for Disease Control and Prevention, there are approximately 50,000 new amputations in the United States each year. The loss of a hand can be particularly debilitating, affecting patients’ ability to perform standard daily tasks. One of the primary challenges with prosthetic hands is the ability to properly tune the appropriate grip based on the object being handled.

In Nanotechnology and Precision Engineering, an AIP Publishing journal, researchers from Guilin University of Electronic Technology in China describe an object identification system for prosthetic hands that guides grip strength decisions in real time.

“We want to free the user from thinking about how to control [an object] and allow them to focus on what they want to do, achieving a truly natural and intuitive interaction,” said author Hua Li.

Pens, cups and bottles, balls, sheet-metal objects like keys, and fragile objects like eggs make up more than 90% of the items patients handle in daily life. The researchers measured the grip strength needed to handle each of these common items and fed the measurements into a machine learning-based object identification system that uses a small camera mounted near the palm of the prosthetic hand.

Their system uses an electromyography (EMG) sensor at the user’s forearm to determine what the user intends to do with the object at hand.

“An EMG signal can clearly convey the intent to grasp, but it struggles to answer the critical question: How much force is needed? This often requires complex training or user calibration,” said Li. “Our approach was to offload that ‘how much’ question to the vision system.”
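The division of labor Li describes can be illustrated with a minimal sketch: the EMG channel supplies only the intent to grasp, while the camera's object classification supplies the amount of force via a per-object lookup. All names, class labels, thresholds, and force values below are hypothetical placeholders, not values from the paper.

```python
# Hypothetical sketch: EMG answers "whether to grasp";
# vision answers "how much force". Values are illustrative only.

OBJECT_GRIP_FORCE_N = {
    "pen": 2.0,
    "cup": 5.0,
    "bottle": 8.0,
    "ball": 4.0,
    "key": 3.0,
    "egg": 1.5,  # fragile object: gentle grip
}

DEFAULT_FORCE_N = 4.0        # fallback for unrecognized objects
EMG_GRASP_THRESHOLD = 0.6    # normalized EMG activation level

def classify_object(camera_frame):
    """Stand-in for the vision model; returns an object label."""
    return camera_frame.get("label", "unknown")

def target_grip_force(emg_activation, camera_frame):
    """Commanded grip force in newtons; 0.0 when no grasp intent."""
    if emg_activation < EMG_GRASP_THRESHOLD:
        return 0.0  # user is not trying to grasp
    label = classify_object(camera_frame)
    return OBJECT_GRIP_FORCE_N.get(label, DEFAULT_FORCE_N)

print(target_grip_force(0.9, {"label": "egg"}))     # gentle grip
print(target_grip_force(0.9, {"label": "bottle"}))  # firmer grip
print(target_grip_force(0.2, {"label": "bottle"}))  # no intent
```

In this toy version the user never has to modulate force consciously: once the EMG crosses the intent threshold, the commanded force depends only on what the camera sees.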

The group plans to integrate haptic feedback into the system, giving the user an intuitive physical sensation of the grip and, with additional EMG signals, establishing a two-way communication bridge between the user and the hand.

“What we are most looking forward to, and currently focused on, is enabling users with prosthetic hands to seamlessly and reliably perform the fine motor tasks of daily living,” said Li. “We hope to see users be able to effortlessly tie their shoelaces or button a shirt, confidently pick up an egg or a glass of water without consciously calculating the force, and naturally peel a piece of fruit or pass a plate to a family member.”

###

The article “Design of intelligent artificial limb hand with force control based on machine vision” is authored by Yao Li, Xiaoxia Du, and Hua Li. It will appear in Nanotechnology and Precision Engineering on Jan. 20, 2026 (DOI: 10.1063/5.0253551). After that date, it can be accessed at https://doi.org/10.1063/5.0253551.

ABOUT THE JOURNAL

Nanotechnology and Precision Engineering (NPE) is an open access journal published on behalf of Tianjin University. As a peer-reviewed, interdisciplinary research journal, NPE covers all areas related to nanotechnology and precision engineering, providing a forum for researchers in these fields around the world. See https://pubs.aip.org/tu/npe.


