ASU Researchers Using Virtual Reality to Tune Prosthetic Arms


A bioengineering doctoral student in the ASU Neural Engineering Lab demonstrates how patients will use the Oculus Rift headset to learn controlled movement of phantom fingers. Photograph by Kevin O’Neill, courtesy of ASU.

According to Bradley Greger, PhD, an associate professor of biomedical engineering in the Arizona State University (ASU) Ira A. Fulton Schools of Engineering, having “super amazing” robotic limbs isn’t enough. “The hard part is the interface, getting the prosthetics to talk to the nerves,” he said. “It’s not just telling the fingers to move; the brain has to know the fingers have moved as directed.” Toward this end, he and his colleagues will be using virtual reality (VR) to help tune the arms.

Research by Greger’s team, published in the March issue of the Journal of Neural Engineering, seeks to establish bidirectional communication between a user and a new prosthetic limb capable of more than 20 distinct movements. In the published study, an array of 96 electrodes was implanted for 30 days into the median and ulnar nerves in the arms of two individuals with upper-limb amputations. The electrodes were stimulated individually and in groups at varying amplitudes and frequencies to determine how the participants perceived the stimulation. Neural activity was recorded while the subjects intended to move their “phantom fingers,” and 13 specific movements were decoded as the subjects controlled the individual fingers of a virtual robotic hand. The motor and sensory information provided by the implanted microelectrode arrays indicates that patients fitted with a highly dexterous prosthetic limb controlled through a similar bidirectional peripheral nerve interface might begin to think of the prosthesis as an extension of themselves rather than a piece of hardware, Greger explained.
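
The article does not describe the team’s stimulation software, but the kind of parameter sweep it implies, varying amplitude and frequency across electrodes and logging what the participant reports, might look roughly like the following sketch. The function names, parameter values, and grid sizes here are hypothetical placeholders, not the study’s actual code.

    import itertools

    # Example parameter grids; actual values in the study varied per participant.
    AMPLITUDES_UA = [10, 20, 40, 80]       # stimulation amplitude, microamps
    FREQUENCIES_HZ = [25, 50, 100, 200]    # stimulation frequency, pulses per second

    def map_perception(electrodes, deliver_stimulus, record_response):
        """Sweep amplitude and frequency on each electrode and log whether the
        participant reports a sensation. `deliver_stimulus` and `record_response`
        are stand-ins for the lab's stimulation hardware and participant-response
        logging, which are not described in this article."""
        results = {}
        for electrode in electrodes:
            for amp, freq in itertools.product(AMPLITUDES_UA, FREQUENCIES_HZ):
                deliver_stimulus(electrode, amplitude_ua=amp, frequency_hz=freq)
                results[(electrode, amp, freq)] = record_response()
        return results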

“We’re now at the stage in this process where we ask patients to mirror movements between hands,” explained Greger. “We can’t record what the amputated hand is doing, but we can record what a healthy hand is doing.” Asking the patient to wave both hands simultaneously, or to point at an object with both hands, will be integral to the feedback loop built around the latest technology in the project: an Oculus Rift VR headset. The advantage of the headset is that the patient can interact directly with his or her virtual limb rather than watching it on a screen.

Kevin O’Neill, a doctoral student at ASU, is developing the technology that not only allows the patient to see what his or her virtual limb is doing, but also decodes the neural messages that enable the motion to happen. “At first, when patients are learning to manipulate their virtual hands, they will be asked to strictly mirror movements of a healthy hand,” he explained. “Once we have learned what information the signals contain, we can build a neural decoding system and have patients drive the virtual representation of a missing limb independently of a healthy hand.”
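
O’Neill’s decoding software is not detailed in this article, but the mirror-training idea he describes, pairing neural signals with motion of the intact hand and then using the learned mapping to drive the virtual limb from neural activity alone, can be sketched in a few lines. The array shapes, the ridge-regression decoder, and the random placeholder data below are illustrative assumptions, not the team’s actual pipeline.

    import numpy as np
    from sklearn.linear_model import Ridge

    # Placeholder data standing in for real recordings:
    # X: neural features per time window (e.g., activity on 96 electrode channels)
    # y: simultaneous joint angles of the intact hand during mirrored movements
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 96))
    y = rng.normal(size=(500, 5))    # e.g., a flexion angle for each of five fingers

    # Learn the mapping from neural activity to hand motion (the "neural decoder").
    decoder = Ridge(alpha=1.0).fit(X, y)

    # Once trained, new neural activity alone can drive the virtual hand,
    # independently of the intact hand.
    virtual_hand_command = decoder.predict(X[:1])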

For Greger, the most important next steps are getting the technology into human trials and then creating effective limbs that are available to patients at an affordable price. “We’re working toward limbs that are accessible both financially and in terms of usability,” said Greger. “We want to create limbs that patients will use as true extensions of themselves.”


Editor’s note: This story was adapted from materials provided by ASU.
