Senses Get Under Robot’s Skin


A doctoral student shakes hands with an optoelectronically innervated prosthesis. Photograph by Huichan Zhao, courtesy of Cornell.

Most robots achieve grasping and tactile sensing through motorized means, which can be bulky and rigid. A group at Cornell University has devised a way for a soft robot to feel its surroundings internally, in much the same way humans do. A paper describing how stretchable optical waveguides act as curvature, elongation, and force sensors in a soft robotic hand was published December 6 in the debut edition of Science Robotics. The work has implications for use in bio-inspired robots and prostheses.

“Most robots today have sensors on the outside of the body that detect things from the surface,” said doctoral student Huichan Zhao, who is lead author of the paper. “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”

Optical waveguides have been in use since the early 1970s for numerous sensing functions, including tactile, position, and acoustic sensing. Fabricating stretchable optical waveguides was originally a complicated process, but the advent of soft lithography and 3D printing has led to elastomeric sensors that are easily produced and incorporated into soft robotic applications. A group led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of Cornell’s Organic Robotics Lab, employed a four-step soft lithography process to produce the core, through which light propagates, and the cladding, the outer surface of the waveguide, which also houses the light-emitting diode (LED) and the photodiode. The more the prosthetic hand deforms, the more light is lost through the core. That variable loss of light, as detected by the photodiode, is what allows the prosthesis to “sense” its surroundings.

“If no light was lost when we bend the prosthesis, we wouldn’t get any information about the state of the sensor,” Shepherd said. “The amount of loss is dependent on how it’s bent.”
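In rough terms, the sensing principle is a light budget: the photodiode reports how much of the LED’s light survives the trip through the core, and a calibrated loss curve maps that reading back to a bend state. The sketch below is a minimal illustration of that idea in Python; the exponential attenuation law and its constants are assumptions for demonstration, not values from the Cornell paper.

# A minimal sketch (assumed model, not from the paper) of the light-loss
# sensing principle: bending attenuates light in the core, and inverting a
# calibrated loss curve recovers the bend. All constants are illustrative.
import numpy as np

I0 = 1.0      # assumed LED input intensity (arbitrary units)
K_LOSS = 0.8  # assumed loss coefficient per unit curvature (hypothetical)

def photodiode_reading(curvature: float) -> float:
    """Light surviving the core: the more the waveguide bends, the less arrives."""
    return I0 * np.exp(-K_LOSS * curvature)

def estimate_curvature(reading: float) -> float:
    """Invert the calibrated loss curve to recover the bend state."""
    return np.log(I0 / reading) / K_LOSS

# A straight waveguide loses no light; a bent one reveals its curvature.
for c in (0.0, 0.5, 1.5):
    r = photodiode_reading(c)
    print(f"curvature={c:.1f}  photodiode={r:.3f}  estimated={estimate_curvature(r):.1f}")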

The group used its optoelectronic prosthesis to perform a variety of tasks, including grasping and probing for both shape and texture. Most notably, the hand was able to scan three tomatoes and determine, by softness, which was the ripest.

Future work on optical waveguides in soft robotics will focus on increased sensory capabilities, in part by 3D printing more complex sensor shapes and by incorporating machine learning to decouple signals from an increased number of sensors. “Right now,” Shepherd said, “it’s hard to localize where a touch is coming from.”
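To make Shepherd’s point concrete: with many embedded waveguides, a single touch perturbs several of them at once, so the raw readings arrive as a mixture that must be untangled before a touch can be localized. The sketch below shows one plausible, greatly simplified approach using a learned linear unmixing map; the synthetic data, the linear mixing model, and the least-squares fit are illustrative assumptions, not the method described in the paper.

# Hedged sketch: one way machine learning could decouple overlapping sensor
# signals. The linear mixing model and synthetic data are assumptions for
# illustration only; the Cornell paper does not specify this method.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_touch_sites, n_samples = 6, 3, 200

# Assume each touch site perturbs several waveguides at once (crosstalk).
mixing = rng.uniform(0.1, 1.0, size=(n_sensors, n_touch_sites))

touches = rng.uniform(0, 1, size=(n_samples, n_touch_sites))   # ground truth
readings = touches @ mixing.T + rng.normal(0, 0.01, (n_samples, n_sensors))

# Learn the unmixing map by least squares (a minimal stand-in for ML).
unmix, *_ = np.linalg.lstsq(readings, touches, rcond=None)

test = rng.uniform(0, 1, size=(1, n_touch_sites))
recovered = (test @ mixing.T) @ unmix
print("true touch:", test.round(2), "recovered:", recovered.round(2))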


Editor’s note: This story was adapted from materials provided by Cornell.
