image: Prabhat Pathak and James Arnold demonstrate the wearable robot in the lab.
Credit: Eliza Grinnell / Harvard SEAS Communications
Key Takeaways
- Harvard researchers have created a soft, wearable robotic device that provides personalized movement assistance for individuals with upper-limb impairment, such as stroke and ALS patients.
- The latest version of the robot combines machine learning and a physics-based model to learn each user’s unique movements and provide support for daily activities like eating and drinking.
- The device was tested with stroke and ALS patients and could someday offer both assistive and rehabilitative benefits.
Having lived with an ALS diagnosis since 2018, Kate Nycz can tell you firsthand what it’s like to slowly lose motor function for basic tasks. “My arm can get to maybe 90 degrees, but then it fatigues and falls,” the 39-year-old said. “To eat or do a repetitive motion with my right hand, which was my dominant hand, is difficult. I’ve mainly become left-handed.”
People like Nycz who live with a neurodegenerative disease such as ALS, or who have had a stroke, often have impaired movement of the shoulder, arm or hands, which keeps them from everyday tasks like brushing their teeth, combing their hair or eating.
For the last several years, Harvard bioengineers have been developing a soft, wearable robot that not only provides movement assistance for such individuals but could even augment therapies to help them regain mobility.
But no two people move exactly the same way. Physical motions are highly individualized, especially for the mobility-impaired, making it difficult to design a device that works for many different people.
It turns out advances in machine learning can provide a more personal touch. Researchers in the John A. Paulson School of Engineering and Applied Sciences (SEAS), together with physician-scientists at Massachusetts General Hospital and Harvard Medical School, have upgraded their wearable robot to respond to an individual user’s exact movements, giving wearers more personalized and better-controlled support for daily tasks.
The research published in Nature Communications was led by Conor Walsh, the Paul A. Maeder Professor of Engineering and Applied Sciences, whose lab develops human-centered assistive robotic devices for those with movement impairments. For more than six years, Walsh’s lab has collaborated with stroke and neurorehabilitation specialist Dr. David Lin, director of the Massachusetts General Hospital Neurorecovery Clinic; and ALS specialist Dr. Sabrina Paganoni, co-director of the Massachusetts General Hospital Neurological Clinical Research Institute — both paper co-authors — to develop clinically relevant devices for patients.
“This has been a wonderful collaboration as Dr. Walsh’s team prioritized including both the clinician and patient perspectives from Day one,” Paganoni said. “This collaborative approach allowed us to work together on the very initial prototypes and study design.”
Nycz was referred to the SEAS study team by Paganoni in 2018, not long after she was diagnosed with ALS a week shy of her 33rd birthday. Nycz has provided data and user testing for several iterations of the device, including the latest version, which adds the personalized motor feedback component. “I’m big on technology and devices to help improve quality of life for people living with ALS … I feel like this robot could help with that goal,” she said.
Software update with machine learning model
The paper describes a major update to the software powering the device, which consists of a sensor-loaded vest with a balloon attached underneath the arm that inflates and deflates to apply mechanical assistance to a weak or impaired limb.
The researchers used a machine learning model that personalizes assistance levels to the individual user by learning which movement the user is attempting, via sensors that track both motion and pressure.
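As a rough illustration of what this kind of intent recognition involves (not the study’s actual model), a classifier can be trained per user on short windows of motion and pressure signals. The window length, feature choices, and classifier in the sketch below are assumptions made for illustration.

```python
# Minimal sketch of per-user intent recognition from motion and pressure signals.
# Window length, features, and classifier are illustrative assumptions,
# not the model described in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

WINDOW = 50  # samples per sensor window (assumed)

def window_features(shoulder_angle, balloon_pressure):
    """Summarize one window of shoulder-angle and balloon-pressure signals."""
    return np.array([
        shoulder_angle.mean(), shoulder_angle.std(),   # posture and its variability
        np.gradient(shoulder_angle).mean(),            # direction of movement
        balloon_pressure.mean(),                       # current actuator state
        np.gradient(balloon_pressure).mean(),          # pressure trend
    ])

# Per-user training data: one feature row per window, labeled with the movement
# the user was attempting (0 = lowering the arm, 1 = raising it).
rng = np.random.default_rng(0)                 # stand-in for recorded sessions
X = rng.normal(size=(200, 5))
y = rng.integers(0, 2, size=200)

clf = LogisticRegression().fit(X, y)

# At run time, classify the most recent sensor window to estimate intent.
latest = window_features(rng.normal(size=WINDOW), rng.normal(size=WINDOW))
p_raise = clf.predict_proba(latest.reshape(1, -1))[0, 1]  # probability the user wants to raise the arm
```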
In previous versions of the device, which only tracked motion, the researchers found that users had trouble pushing their arm back down once the robot had helped lift it. “Some people didn’t have enough residual strength to overcome any kind of mistake the robot was making,” explained co-first author and graduate student James Arnold.
In the new version, in addition to the machine learning model, they incorporated a physics-based model they had previously developed that estimates the minimum pressure needed to support the arm during movement. This makes the robot’s assistance feel more natural to the user, offering more nuanced help on basic tasks like eating and drinking. Combining the models lets the robot quickly dial its assistance up or down at any moment, based on what it has learned about how that user normally moves.
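One way to picture how the two models might fit together: the physics-based model sets a baseline pressure that just balances the arm’s weight at its current angle, and the learned intent estimate scales that baseline up or down. The constants, torque balance, and blending rule in this sketch are illustrative assumptions, not the controller reported in the paper.

```python
# Illustrative sketch of blending a physics-based pressure floor with a learned
# intent estimate. All constants and the blending rule are assumptions.
import numpy as np

ARM_GRAVITY_TORQUE = 4.0   # N*m, assumed torque of the arm's weight held horizontal
BALLOON_MOMENT_ARM = 0.08  # m, assumed lever arm of the balloon's push on the upper arm
BALLOON_AREA = 0.01        # m^2, assumed effective contact area of the inflated balloon

def min_support_pressure(shoulder_angle_rad: float) -> float:
    """Physics-based floor: the pressure that just balances gravity at this posture."""
    gravity_torque = ARM_GRAVITY_TORQUE * np.sin(shoulder_angle_rad)
    support_force = gravity_torque / BALLOON_MOMENT_ARM
    return support_force / BALLOON_AREA          # Pa

def commanded_pressure(shoulder_angle_rad: float, p_raise: float) -> float:
    """Scale the floor by the learned intent (p_raise in [0, 1]) so assistance
    backs off quickly when the user is trying to push the arm back down."""
    return p_raise * min_support_pressure(shoulder_angle_rad)

# Example: arm at 60 degrees, classifier fairly confident the user wants it raised.
print(commanded_pressure(np.deg2rad(60.0), p_raise=0.9))
```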
User testing
In collaboration with the clinical researchers at MGH, the engineers tested their device with nine volunteers, including Nycz: five who had experienced a stroke and four living with ALS.
“For people living with ALS, the most important considerations include comfort, ease of use, and the ability of the device to adapt to their specific needs and movement patterns,” Paganoni said. “Personalization is crucial to enhance their functional independence and quality of life … This device holds the potential to significantly improve upper limb function, enhance daily living activities, and reduce compensatory movements.”
Results showed that a robot trained on an individual user’s movement data could distinguish the user’s shoulder movements with 94% accuracy. The force a person needed to lower their arm was reduced by about a third compared with previous versions. Users also showed larger ranges of motion in their shoulders, elbows, and wrists, reducing the need to compensate by leaning or twisting the body and making their movements more precise and efficient overall.
Past studies with the wearable robot had focused on a single joint or a single clinical score for evaluating patient movement, explained co-first author and postdoctoral fellow Prabhat Pathak. “What we did here was look at simulated activities of daily living, using a highly accurate motion capture system — similar to systems used in movies. We looked at how each and every joint movement changed, and if they were able to do the tasks more efficiently.”
Nycz said seeing the different versions of the device over the years has been gratifying, and she’s noticed some of her feedback has been reflected in newer versions.
“They’ve done a great job incorporating and including the person,” she said. “They’re not sitting in the lab just playing with the robot. I felt like they were really engaged with me. I didn’t feel like a lab rat or a cog in a wheel.”
Generalizable to many populations
The researchers noted that their device could be generalizable to many populations of people with upper-limb impairments. While stroke patients are usually focused on rehabilitation, gradually regaining strength and movement, ALS is degenerative, which means the device might be most valuable purely for movement assistance. Through continued support from the National Science Foundation’s Convergence Accelerator, under the Directorate for Technology, Innovation and Partnerships, the team is continuing to refine the technology so users can someday operate it independently at home.
The paper was co-authored by Yichu Jin, David Pont-Esteban, Connor M. McCann, Carolin Lehmacher, John P. Bonadonna, Tanguy Lewko, Katherine M. Burke, Sarah Cavanagh, Lynn Blaney, Kelly Rishe, and Tazzy Cole.
The research had federal support from the National Science Foundation under grants No. 2236157 and 2345107, and from the NSF Graduate Research Fellowship under grant No. DGE 2140743.
Watch: https://www.youtube.com/watch?v=ZhHnEOf7eeY
Journal
Nature Communications
Method of Research
Experimental study
Subject of Research
People
Article Title
Personalized ML-based wearable robot control improves impaired arm function
Article Publication Date
2-Aug-2025