New Technology Could Make Prosthetic Hands Easier to Use

Researchers placed EMG sensors on the forearms of able-bodied volunteers, tracking neuromuscular signals as they performed various actions.

Photograph courtesy of NC State University.

Researchers in the joint biomedical engineering program at North Carolina State University and the University of North Carolina at Chapel Hill have developed technology that can decode neuromuscular signals to control powered prosthetic wrists and hands. The approach relies on computer models that closely mimic the behavior of the natural structures in the forearm, wrist, and hand. It has worked well in early testing but has not yet entered clinical trials, so it will not be commercially available for some time.

Current state-of-the-art prostheses rely on machine learning for pattern recognition-based control. This approach requires users to teach the device to recognize specific patterns of muscle activity and translate them into commands, such as opening or closing a prosthetic hand.
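To make the contrast concrete, the sketch below illustrates one common form of pattern-recognition myoelectric control: windowed EMG is reduced to a few time-domain features, and a classifier trained on user-specific examples maps each window to a discrete command. The channel count, features, classifier, and data here are hypothetical placeholders, not the specifics of any commercial device or of the study described in this article.

```python
# Minimal sketch of pattern-recognition myoelectric control (illustrative only).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def time_domain_features(window):
    """Common time-domain EMG features for one analysis window (channels x samples)."""
    mav = np.mean(np.abs(window), axis=1)                        # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)         # waveform length
    zc = np.sum(np.diff(np.sign(window), axis=1) != 0, axis=1)   # zero crossings
    return np.concatenate([mav, wl, zc])

# Hypothetical training data: EMG windows recorded while the user repeats each motion.
rng = np.random.default_rng(0)
train_windows = rng.standard_normal((200, 8, 256))  # 200 windows, 8 channels, 256 samples
train_labels = rng.integers(0, 3, size=200)         # e.g., 0 = rest, 1 = hand open, 2 = hand close

X = np.array([time_domain_features(w) for w in train_windows])
clf = LinearDiscriminantAnalysis().fit(X, train_labels)

# At run time, each new EMG window is classified into a command for the prosthesis.
new_window = rng.standard_normal((8, 256))
command = clf.predict(time_domain_features(new_window)[None, :])[0]
print("Predicted command:", command)
```

Because the classifier only knows the examples it was trained on, any change in signal conditions, such as a new arm posture, typically requires collecting and labeling more training data, which is the burden Huang describes below.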

“Pattern recognition control requires patients to go through a lengthy process of training their prosthesis,” says He (Helen) Huang, PhD, a professor in the joint biomedical engineering program. “This process can be both tedious and time-consuming. We wanted to focus on what we already know about the human body,” says Huang, who is senior author of a paper on the work. “This is not only more intuitive for users, it is also more reliable and practical.

“That’s because every time you change your posture, your neuromuscular signals for generating the same hand/wrist motion change. So relying solely on machine learning means teaching the device to do the same thing multiple times: once for each different posture, once for when you are sweaty versus when you are not, and so on. Our approach bypasses most of that.”

Instead, the researchers developed a user-generic musculoskeletal model. They placed EMG sensors on the forearms of six able-bodied volunteers, tracking which neuromuscular signals were sent when they performed various actions with their wrists and hands. The data were then used to create the generic model, which translates those neuromuscular signals into commands that drive a powered prosthesis.
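The following is a deliberately simplified illustration of the general idea behind model-based control, not the published model: normalized EMG is treated as muscle activation, scaled into muscle force, and mapped through assumed moment arms onto a basic wrist-joint simulation whose angle would serve as the prosthesis command. All parameters and dynamics here are hypothetical stand-ins for the far more detailed musculoskeletal model the researchers used.

```python
# Heavily simplified sketch of model-based myoelectric control (illustrative only).
import numpy as np

# Hypothetical parameters for two antagonist muscle groups (flexors, extensors).
MAX_FORCE = np.array([300.0, 250.0])    # N, assumed maximum isometric forces
MOMENT_ARM = np.array([0.015, -0.012])  # m, assumed wrist moment arms (sign = direction)
INERTIA, DAMPING, STIFFNESS = 0.005, 0.2, 2.0  # assumed passive wrist-joint properties

def wrist_step(activation, angle, velocity, dt=0.01):
    """Advance the simplified wrist model one time step from muscle activations (0..1)."""
    active_torque = np.sum(activation * MAX_FORCE * MOMENT_ARM)  # muscle-driven torque
    passive_torque = -STIFFNESS * angle - DAMPING * velocity     # passive joint resistance
    accel = (active_torque + passive_torque) / INERTIA
    velocity += accel * dt
    angle += velocity * dt
    return angle, velocity

# Example: sustained flexor activity drives wrist flexion; the resulting angle would be
# sent as a position command to the powered prosthetic wrist.
angle, velocity = 0.0, 0.0
for _ in range(100):  # simulate 1 second at 100 Hz
    angle, velocity = wrist_step(np.array([0.6, 0.1]), angle, velocity)
print(f"Simulated wrist angle after 1 s: {np.degrees(angle):.1f} degrees")
```

Because the mapping is built from the physics of the forearm rather than from examples of one person's muscle activity, the same model can in principle be applied across users and postures without retraining, which is what the preliminary testing below examined.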

In preliminary testing, both able-bodied and amputee volunteers could use the model-controlled interface to perform the required hand and wrist motions with little training. The study, “Myoelectric Control Based on a Generic Musculoskeletal Model: Towards a Multi-User Neural-Machine Interface,” was published online May 18 in IEEE Transactions on Neural Systems and Rehabilitation Engineering.

“We’re currently seeking volunteers who have transradial amputations to help us with further testing of the model to perform activities of daily living,” Huang says. “We want to get additional feedback from users before moving ahead with clinical trials.”

Editor’s note: This story was adapted from materials provided by NC State University.
