Mind and Machine Share Control in Next-Gen Prosthesis

Engineers have long struggled to design bionic hands that combine advanced functionality with ease of use. Unfortunately, as devices grow more complex, they demand more mental effort, to the point that they become impractical for routine tasks. It's simply too exhausting.

Researchers at the University of Utah have developed a fresh approach to that challenge, testing an upper-limb prosthesis that offloads some of the cognitive load to the device’s own machinery.

Their prototype, described last month in a Nature Communications paper, employs a “shared control” model in which amputees don’t have to concentrate on every movement. Instead, the device makes certain decisions intuitively—in much the same way that your fingers, without conscious intervention, wrap themselves securely around everything from car keys to champagne flutes, kettlebells, and shoelaces. In granting some autonomy to bionic hands, the Utah team hopes to point the way toward more dexterous, intuitive control.

The idea of shared control was borrowed in part from robotics, where human operators work with semi-autonomous systems. This type of machinery has been applied in various settings, including manufacturing, surgery, and aviation. The winning device in the Assistive Robot race at the 2024 Cybathlon employed a shared-control model.

To translate this approach to upper-limb prosthetics, the Utah researchers modified a commercial bionic hand by embedding proximity sensors and pressure detectors in each fingertip. These sensors give the hand a form of “awareness” of nearby objects and contact forces. Using that information, a machine controller—essentially a trained neural network—positions each digit in a sensible way relative to the object about to be grasped. These machine-generated commands are blended continuously with signals from the user’s residual muscles, which the device decodes via standard surface electromyography. The result is a dynamic partnership in which the user decides what to grasp and how strongly, and the machine figures out how to position the wrist, palm, and fingers to execute the intended action.
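For readers curious what such blending looks like in code, here is a toy sketch. It is not the Utah team's actual controller; the policy, blending weight, and sensor scaling are all invented for illustration. The idea is just that a machine policy proposes digit positions from fingertip proximity and pressure readings, while the user's EMG-decoded command is mixed in with a fixed share of authority.

```python
import numpy as np

def machine_policy(proximity, pressure):
    """Stand-in for the trained neural network (hypothetical):
    close each digit in proportion to how near the object is,
    then ease off once contact pressure builds."""
    target = np.clip(1.0 - proximity, 0.0, 1.0)          # nearer object -> more closure
    return np.clip(target - 0.5 * pressure, 0.0, 1.0)    # back off under contact

def shared_control(user_cmd, proximity, pressure, alpha=0.6):
    """Blend the user's EMG-decoded closure command with the machine's
    proposed digit positions; alpha is the machine's share of authority."""
    return alpha * machine_policy(proximity, pressure) + (1.0 - alpha) * user_cmd

# Example: three digits; digit 2 is far from the object, digit 0 is touching.
user = np.array([0.8, 0.8, 0.8])    # user squeezes all digits equally
prox = np.array([0.1, 0.2, 0.9])    # normalized distance to the object
press = np.array([0.3, 0.0, 0.0])   # normalized contact pressure
cmd = shared_control(user, prox, press)
```

Even though the user commands all digits equally, the blended output closes the near digits more than the far one, which is the qualitative behavior the paper describes.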

To illustrate this sharing process, the authors describe a “fragile object transfer” test in which participants were asked to manipulate a breakable item without breaking or dropping it. Too much grip force would crush the object; too little would cause it to fall out of the user’s grasp. “The machine automatically adjusted its target position to the exact joint position necessary to … stabilize the user’s grip around the object,” the authors explained, “effectively offloading the positioning of the digits to the machine and allowing the user to focus exclusively on the force they intended to produce.”
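The division of labor described in that quote can be sketched as a simple feedback loop. This is a hypothetical illustration, not the paper's algorithm: the machine nudges each digit toward a contact-pressure setpoint, closing digits that are losing contact and opening digits pressing too hard, while the user's commanded force passes through unchanged.

```python
def stabilize_grip(positions, pressures, user_force,
                   target_pressure=0.2, gain=0.1):
    """Adjust digit positions toward a contact-pressure setpoint
    (machine's job) while leaving the user's force command untouched.
    All values are normalized to [0, 1]; the setpoint and gain are
    invented for this sketch."""
    new_positions = []
    for pos, p in zip(positions, pressures):
        error = target_pressure - p                       # positive -> grip too loose
        new_positions.append(min(max(pos + gain * error, 0.0), 1.0))
    return new_positions, user_force

# Digit 0 is slipping (low pressure), digit 1 is squeezing too hard.
pos, force = stabilize_grip([0.5, 0.5], [0.05, 0.35], user_force=0.4)
```

Run over successive sensor readings, a loop like this keeps the grip stable without the user having to think about individual digits, which is the cognitive offloading the authors report.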

In experiments with both intact-limb participants and people with transradial amputations, this collaboration between user and machine led to striking improvements in function and ease of motion. Users were better able to secure objects and modulate grip force, and they did so with a reduced cognitive load. Everyday tasks such as drinking from a cup, moving an egg between two plates, and lifting a single sheet of paper became appreciably easier with the shared-control system than with conventional myoelectric control alone. With conventional myoelectrics, participants repeatedly broke or dropped these commonplace objects. Under shared control, they completed the tasks flawlessly.

This approach doesn’t try to remove the user from the loop. It simply makes partnership with the machine more intuitive. As a result, users never feel as if they’re being overridden, nor left alone to manage a dozen moving parts. The experience feels more like assistance than automation, a form of cooperation rather than replacement.

“This work represents the first demonstration of shared control with multiple amputee participants using a physical prosthesis,” the authors conclude. “We show enhanced performance both in terms of physical and cognitive function; participants had better grip security, enhanced grip precision, and reduced cognitive burden with no increase in physical burden.”

The full paper is available online at Nature Communications.

Amplitude