A team of Harvard researchers has developed a soft, wearable robotic device that uses artificial intelligence to deliver personalized assistance to people with upper-limb impairments, offering new hope to patients living with conditions such as stroke and amyotrophic lateral sclerosis (ALS).
The latest version of the robot, created at Harvard’s John A. Paulson School of Engineering and Applied Sciences (SEAS), merges machine learning with a physics-based model to learn how individual patients move and then adapt support for daily tasks such as eating, drinking, and brushing teeth. Early trials with stroke and ALS patients suggest the technology could serve both as an assistive tool and as a rehabilitation device in clinical settings.
For people like Kate Nycz, who was diagnosed with ALS in 2018, the promise is deeply personal. “My arm can get to maybe 90 degrees, but then it fatigues and falls,” said Nycz, 39. “To eat or do a repetitive motion with my right hand, which was my dominant hand, is difficult. I’ve mainly become left-handed.”
ALS is a progressive neurodegenerative disease that robs patients of motor control, while stroke can leave survivors with partial paralysis or chronic weakness. Both conditions often make simple tasks nearly impossible.
From concept to clinical testing
More than six years ago, the wearable robot began as a collaboration between SEAS bioengineer Conor Walsh, who directs the Harvard Biodesign Lab, and physician-scientists at Massachusetts General Hospital and Harvard Medical School. Stroke specialist Dr. David Lin and ALS expert Dr. Sabrina Paganoni joined the effort early, advising on design and helping identify patients for testing.
“This has been a wonderful collaboration,” Paganoni said. “From day one, the engineers involved both clinicians and patients in the process. That allowed us to shape the prototypes in ways that truly matter in people’s daily lives.”
Nycz was one of the first participants recruited in 2018, only weeks after learning she had ALS. She has provided feedback for several iterations of the device. “I’m big on technology and devices to help improve the quality of life for people living with ALS,” she said. “I feel like this robot could help with that goal.”
How it works
The device resembles a lightweight vest equipped with motion and pressure sensors. Under the arm sits a soft, inflatable balloon-like actuator that fills with air to gently lift the limb when needed. Earlier versions relied solely on motion tracking, but users often struggled to push their arms back down after a lift. Some lacked the strength to correct the robot when it overcompensated.
The newest upgrade combines two layers of intelligence. A machine learning model studies each user’s movement patterns to predict their intended motion, while a physics-based model calculates the minimum amount of pressure required to assist without overcorrecting. Together, the models allow the robot to adjust assistance in real time, providing subtle boosts when a patient eats, drinks or raises an arm, then backing off when less support is needed.
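The study does not publish its control code, but the two-layer idea described above can be sketched in simplified form. In this illustration, every constant, function name, and threshold is a hypothetical stand-in: a trivial trend rule plays the role of the learned intent model, and a basic torque balance plays the role of the physics-based layer that computes the minimum assistive pressure.

```python
import math

# Hypothetical sketch of the two-layer controller described above.
# All constants and names are illustrative assumptions, not the
# published implementation.

ARM_WEIGHT_N = 20.0       # assumed effective weight of the arm (newtons)
MOMENT_ARM_M = 0.08       # assumed lever arm of the actuator (meters)
ACTUATOR_AREA_M2 = 0.004  # assumed effective area of the inflatable actuator

def predict_intent(recent_angles):
    """Stand-in for the learned model: classify intended motion
    from the trend of recent shoulder angles (degrees)."""
    trend = recent_angles[-1] - recent_angles[0]
    if trend > 2.0:
        return "raise"
    if trend < -2.0:
        return "lower"
    return "hold"

def minimum_pressure(shoulder_angle_deg, intent):
    """Physics-based layer: smallest actuator pressure (Pa) that
    offsets gravity at the current angle, backed off when the
    user intends to lower the arm."""
    gravity_torque = ARM_WEIGHT_N * MOMENT_ARM_M * math.sin(
        math.radians(shoulder_angle_deg))
    support_fraction = {"raise": 1.0, "hold": 0.8, "lower": 0.3}[intent]
    force_needed = support_fraction * gravity_torque / MOMENT_ARM_M
    return max(force_needed / ACTUATOR_AREA_M2, 0.0)

# Example: a rising shoulder-angle trend signals an intended lift,
# so the controller supplies full gravity compensation; a lowering
# trend would cut the commanded pressure sharply instead.
angles = [40.0, 42.0, 45.0]
intent = predict_intent(angles)
pressure = minimum_pressure(angles[-1], intent)
print(intent, round(pressure, 1))
```

The key property this toy version shares with the real system is that assistance is scaled down, rather than held constant, when the predicted intent is to lower the arm, which is what lets users who lack the strength to fight the actuator still bring their arm back down.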
“Some people didn’t have enough residual strength to overcome any kind of mistake the robot was making,” said James Arnold, a graduate student and co-first author of the study. “The new system feels more natural because the robot can quickly dial its support up or down.”
Nine volunteers tested the new robot in partnership with Massachusetts General Hospital: five stroke survivors and four ALS patients, including Nycz. The device was trained on each person’s movements, achieving 94 percent accuracy in distinguishing shoulder motions. Participants required about one-third less force to lower their arms than in older versions, and their overall range of motion improved.
Researchers also noted reductions in compensatory behaviors such as leaning the torso or twisting the spine to offset weak arms. Motion-capture systems, similar to those used in film animation, revealed more precise, efficient joint movements across the shoulders, elbows, and wrists.
“For people living with ALS, comfort, ease of use, and adaptability are critical,” Paganoni said. “This device can potentially improve functional independence and quality of life in a very meaningful way.”
Human-centered design
Patients who tested the device emphasized that their voices were heard. Nycz said she could see her suggestions reflected in later versions. “They’re not sitting in the lab just playing with the robot,” she said. “They included me in the process. I didn’t feel like a lab rat or a cog in a wheel.”
The project reflects a broader shift in assistive robotics: tailoring machines to human anatomy and highly individualized movement patterns. Because no two people with stroke or ALS move in the same way, algorithms capable of adapting in real time may prove critical to broader adoption.
Researchers believe the device can eventually be generalized to a wide range of upper-limb impairments. For stroke patients, it could be deployed in rehabilitation programs where regaining strength and mobility is the goal. For ALS patients, whose condition progresses over time, the focus would be on providing ongoing support to preserve independence as long as possible.
Through funding from the National Science Foundation’s Convergence Accelerator program, Walsh’s team is refining the technology to enable home use. Future versions could be donned independently by patients, without clinical oversight.
“We’re not just developing a medical device,” Walsh said. “We’re creating a system that can adapt to each user’s unique body mechanics, giving them more control in their daily lives.”
The researchers stress that the robot is still experimental, but the results, detailed in Nature Communications, represent an important step forward in merging artificial intelligence with soft robotics for medical care.
For Nycz, the payoff comes in small, everyday victories. “Even if it’s just making eating or drinking easier, that’s huge,” she said. “It’s about quality of life. That’s what matters most.”
The study was co-authored by Yichu Jin, David Pont-Esteban, Connor M. McCann, Carolin Lehmacher, John P. Bonadonna, Tanguy Lewko, Katherine M. Burke, Sarah Cavanagh, Lynn Blaney, Kelly Rishe, and Tazzy Cole. Funding came from the National Science Foundation under grants 2236157 and 2345107 and the NSF Graduate Research Fellowship.

