Researchers Improve Brain Control of Robotic Arm

In another demonstration of brain-computer interface (BCI) technology, a woman with quadriplegia used her thoughts alone to shape the hand of a robotic arm into a grasping position. She controlled the robotic arm with ten-dimensional (10D) performance, reaching for, grasping, and placing a variety of objects, including large and small boxes, a ball, an oddly shaped rock, and thick and thin tubes. The findings, by researchers at the University of Pittsburgh (Pitt) School of Medicine, were published online December 17 in the Journal of Neural Engineering.

“Our project has shown that we can interpret signals from neurons with a simple computer algorithm to generate sophisticated, fluid movements that allow the user to interact with the environment,” said senior investigator Jennifer Collinger, PhD, an assistant professor in the Pitt School of Medicine’s Department of Physical Medicine and Rehabilitation, and a research scientist for the VA Pittsburgh Healthcare System.

In February 2012, small electrode grids with 96 tiny contact points each were surgically implanted in the regions of trial participant Jan Scheuermann’s brain that would normally control her right arm and hand movement. Each electrode point picked up signals from an individual neuron, which were then relayed to a computer to identify the firing patterns associated with particular observed or imagined movements, such as raising or lowering the arm or turning the wrist. That “mind-reading” was used to direct the movements of a prosthetic arm developed by the Johns Hopkins University Applied Physics Laboratory.
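The decoding pipeline described above — per-channel neural signals mapped to movement commands — can be sketched as a simple linear decoder, a technique widely used in BCI research. Everything below (the weight matrix, the baseline subtraction, the specific numbers) is an illustrative assumption, not the study's actual algorithm:

```python
import numpy as np

# Illustrative sketch only: a linear decoder of the kind commonly used in
# BCI work. Firing rates from 96 contact points are mapped to velocity
# commands in D control dimensions via a weight matrix that would be fit
# during calibration. All names and values here are hypothetical.

rng = np.random.default_rng(0)
n_channels = 96      # tiny contact points per electrode grid
n_dims = 10          # dimensions of arm/hand control

# In practice these weights would be learned from recorded firing
# patterns; here they are random placeholders.
weights = rng.normal(size=(n_dims, n_channels))
baseline = rng.normal(size=n_channels)   # per-channel mean firing rate

def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Map a vector of per-channel firing rates to a command vector."""
    return weights @ (firing_rates - baseline)

command = decode(baseline + rng.normal(scale=0.1, size=n_channels))
print(command.shape)  # one value per control dimension -> (10,)
```

Each call to `decode` would run many times per second, turning the stream of neural activity into continuous commands for the prosthetic arm.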

Within a week of the surgery, Scheuermann could reach in and out, left and right, and up and down with the arm to achieve 3D control, and before three months had passed, she also could flex the wrist back and forth, move it from side to side, and rotate it clockwise and counterclockwise, as well as grip objects, adding up to seven dimensions of control. Those findings were published in The Lancet in 2012.

To bring the total of arm and hand movements to ten, the pincer grip was replaced by four hand shapes: finger abduction, in which the fingers are spread out; scoop, in which the last fingers curl in; thumb opposition, in which the thumb moves outward from the palm; and a pinch of the thumb, index, and middle fingers. As before, Scheuermann watched animations and imagined the movements while the team recorded the signals her brain was sending, a process called calibration. Then they used what they had learned to read her thoughts so she could move the hand into the various positions.
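The calibration step described above — recording neural signals while the user watches and imagines movements, then learning to map those signals to the intended movements — can be sketched as a regression problem. This is a minimal sketch assuming ordinary least squares on synthetic data; the study's actual calibration procedure is more sophisticated:

```python
import numpy as np

# Hypothetical calibration sketch: regress recorded firing rates against
# the movements the user was imagining, yielding decoder weights.
# All data here is synthetic; names and sizes are illustrative.

rng = np.random.default_rng(1)
n_samples, n_channels, n_dims = 500, 96, 10

true_W = rng.normal(size=(n_channels, n_dims))          # unknown "true" mapping
rates = rng.normal(size=(n_samples, n_channels))        # recorded firing rates
intended = rates @ true_W + rng.normal(scale=0.01,      # imagined movements,
                                       size=(n_samples, n_dims))  # plus noise

# Fit decoder weights W so that rates @ W approximates the intended movements.
W, *_ = np.linalg.lstsq(rates, intended, rcond=None)

err = np.abs(rates @ W - intended).max()
print(W.shape)  # (96, 10): one weight per channel per control dimension
```

After calibration, the fitted `W` plays the role of the decoder: new firing-rate vectors are multiplied by it to recover the movement the user intends.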

“Jan used the robot arm to grasp more easily when objects had been displayed during the preceding calibration, which was interesting,” said co-investigator Andrew Schwartz, PhD, a professor of neurobiology in the Pitt School of Medicine. “Overall, our results indicate that highly coordinated, natural movement can be restored to people whose arms and hands are paralyzed.”

Editor’s note: This story was adapted from materials provided by the University of Pittsburgh School of Medicine.