News Release

University of Utah engineers give a bionic hand a mind of its own

Researchers use AI to fine-tune robotic prosthesis to improve manual dexterity

Peer-Reviewed Publication

University of Utah

Image: Trout and participant

Lead author Marshall Trout, right, worked with four amputees to investigate how AI could be used to autonomously control an advanced prosthesis. The AI-powered prosthesis was capable of working intelligently alongside the amputees to enhance dexterity and make the prosthesis more intuitive to use.

Credit: Utah NeuroRobotics Lab

Whether you’re reaching for a mug, a pencil or someone’s hand, you don’t need to consciously instruct each of your fingers on where they need to go to get a proper grip.

The loss of that intrinsic ability is one of the many challenges people with prosthetic arms and hands face. Even with the most advanced robotic prostheses, these everyday activities come with an added cognitive burden as users purposefully open and close their fingers around a target.

Researchers at the University of Utah are now using artificial intelligence to solve this problem. By integrating proximity and pressure sensors into a commercial bionic hand, and then training an artificial neural network on grasping postures, the researchers developed an autonomous approach that is more like the natural, intuitive way we grip objects. When working in tandem with the artificial intelligence, study participants demonstrated greater grip security, greater grip precision and less mental effort.

Critically, the participants were able to perform numerous everyday tasks, such as picking up small objects and raising a cup, using different gripping styles, all without extensive training or practice.

The study was led by engineering professor Jacob A. George and Marshall Trout, a postdoctoral researcher in the Utah NeuroRobotics Lab, and appears Tuesday in the journal Nature Communications.

“As lifelike as bionic arms are becoming, controlling them is still not easy or intuitive,” Trout said. “Nearly half of all users will abandon their prosthesis, often citing poor control and cognitive burden.”

One problem is that most commercial bionic arms and hands have no way of replicating the sense of touch that normally gives us intuitive, reflexive ways of grasping objects. Dexterity is not simply a matter of sensory feedback, however. We also have subconscious models in our brains that simulate and anticipate hand-object interactions; a “smart” hand would also need to learn these automatic responses over time.

The Utah researchers addressed the first problem by outfitting an artificial hand, manufactured by TASKA Prosthetics, with custom fingertips. In addition to detecting pressure, these fingertips were equipped with optical proximity sensors designed to replicate the finest sense of touch. The fingers could detect an effectively weightless cotton ball being dropped on them, for example.

For the second problem, they trained an artificial neural network model on the proximity data so that the fingers would naturally move to the exact distance necessary to form a secure grasp of the object. Because each finger has its own sensor and can “see” in front of it, each digit works in parallel to form a stable grasp across any object.
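
The paper’s trained network is not reproduced here, but the per-finger idea can be illustrated with a short, hypothetical Python sketch. It substitutes a simple proportional rule for the authors’ neural network, and every name, threshold and gain below is an illustrative assumption rather than part of the published system.

# Conceptual sketch only: each digit closes on its own, driven by its fingertip's
# optical proximity reading, and stops once light contact pressure is detected.
# A simple proportional rule stands in for the trained neural network; all
# names and values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class FingertipReading:
    proximity_mm: float  # distance to whatever the fingertip "sees" in front of it
    pressure_n: float    # contact force once the fingertip touches the object

def finger_close_velocity(reading: FingertipReading,
                          target_gap_mm: float = 2.0,
                          contact_force_n: float = 0.5,
                          gain: float = 0.2) -> float:
    """Closing-velocity command for one digit; each finger runs this in parallel."""
    if reading.pressure_n >= contact_force_n:
        return 0.0  # light, stable contact reached: stop closing
    gap_error = reading.proximity_mm - target_gap_mm
    return max(0.0, gain * gap_error)  # farther fingers close faster

# Fingers at different distances close at different rates, so the hand
# conforms to the object's shape without a global grasp plan.
for r in (FingertipReading(25.0, 0.0), FingertipReading(10.0, 0.0), FingertipReading(2.0, 0.6)):
    print(finger_close_velocity(r))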

But one problem still remained. What if the user didn’t intend to grasp the object in that exact manner? What if, for example, they wanted to open their hand to drop the object? To address this final piece of the puzzle, the researchers created a bioinspired approach that involves sharing control between the user and the AI agent. The success of the approach relied on finding the right balance between human and machine control.
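
As a rough illustration of that balance, the hypothetical sketch below blends the user’s grip command with the autonomous command through a single weight. The linear blend and its numbers are assumptions made for illustration, not the paper’s shared-control policy.

# Conceptual sketch only: a weighted blend of user and machine grip commands.
# Commands range from 0.0 (hand fully open) to 1.0 (fully closed).
# The linear blend and the weight value are illustrative assumptions.

def shared_control(user_command: float,
                   machine_command: float,
                   machine_weight: float = 0.4) -> float:
    """machine_weight = 0 gives the user full authority; 1 hands control to the AI."""
    blended = (1.0 - machine_weight) * user_command + machine_weight * machine_command
    return min(1.0, max(0.0, blended))

# If the user deliberately opens the hand (command near 0) while the autonomous
# controller still wants to hold the object, a moderate weight keeps the user's
# intent dominant instead of letting the machine "fight" for control.
print(shared_control(user_command=0.0, machine_command=0.8))  # 0.6 * 0.0 + 0.4 * 0.8 = 0.32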

"What we don’t want is the user fighting the machine for control. In contrast, here the machine improved the precision of the user while also making the tasks easier,” Trout said. “In essence, the machine augmented their natural control so that they could complete tasks without having to think about them."

The researchers conducted their studies with four participants whose amputations fall between the elbow and the wrist. In addition to showing improved performance on standardized tasks, the participants attempted multiple everyday activities that required fine motor control. Simple tasks, like drinking from a plastic cup, can be incredibly difficult for an amputee: squeeze too softly and you’ll drop it, but squeeze too hard and you’ll crush it.

“By adding some artificial intelligence, we were able to offload this aspect of grasping to the prosthesis itself,” George said. “The end result is more intuitive and more dexterous control, which allows simple tasks to be simple again.”

George is the Solzbacher-Chen Endowed Professor in the John and Marcia Price College of Engineering’s Department of Electrical & Computer Engineering and the Spencer Fox Eccles School of Medicine’s Department of Physical Medicine and Rehabilitation.

 

This work is part of the Utah NeuroRobotics Lab’s larger vision to improve the quality of life for amputees.

 

"The study team is also exploring implanted neural interfaces that allow individuals to control prostheses with their mind and even get a sense of touch coming back from this,” George said. “Next steps, the team plans to blend these technologies, so that their enhanced sensors can improve tactile function and the intelligent prosthesis can blend seamlessly with thought-based control."

#####

The study was published online Dec. 9 in Nature Communications under the title “Shared human-machine control of an intelligent bionic hand improves grasping and decreases cognitive burden for transradial amputees.”

Coauthors include NeuroRobotics Lab members Fredi Mino, Connor Olsen and Taylor Hansen; Masaru Teramoto, research assistant professor in the School of Medicine’s Division of Physical Medicine & Rehabilitation; David Warren, research associate professor emeritus in the Department of Biomedical Engineering; and Jacob Segil of the University of Colorado Boulder. Funding came from the National Institutes of Health and the National Science Foundation.

