Technology

Sensitive Robots on the Horizon As E-Skin Makes Them Capable Of Feeling Pain

Roboticists have long been working to develop electronic skin (e-skin): technology that could act like our own largest sensory organ, relaying information about the surroundings it touches. Engineers at the University of Glasgow now believe they have built a device that brings that goal closer: a large-area, highly sensitive electronic skin capable of relaying information almost instantly. The computational e-skin prototype, which allows a robot to register something akin to pain, is described in the journal Science Robotics and is presented as a significant step forward for touch-sensitive robotics, with the potential to give prosthetic limbs near-human sensitivity to touch.

Previous attempts to build touch-sensitive robots have run into a processing bottleneck: an array of distributed sensors can generate a huge volume of data, and a central computer needs time to turn that data into anything useful. The new design takes its inspiration from the human peripheral nervous system, which begins processing sensations at the point of contact and passes only the most critical information up to the brain. In a robot, a similar approach frees up communication channels and keeps the central processor from being clogged with excess sensory data.
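
The paper implements this filtering in analogue hardware, so the code below is only an illustrative software sketch of the general idea: each sensor node decides locally whether a reading is worth forwarding, and the central controller only ever sees those filtered events. All names and values here (SensorNode, TouchEvent, the 0.1 change threshold) are hypothetical, not taken from the study.

```python
# Sketch (not the authors' code) of peripheral-nerve-style pre-processing:
# each sensor node filters its own readings and forwards only meaningful
# changes, instead of streaming every raw sample to the central processor.

from dataclasses import dataclass


@dataclass
class TouchEvent:
    sensor_id: int
    pressure: float  # normalised 0..1


class SensorNode:
    """Local pre-processing at the point of contact."""

    def __init__(self, sensor_id: int, threshold: float = 0.1):
        self.sensor_id = sensor_id
        self.threshold = threshold
        self.last_sent = 0.0

    def process(self, pressure: float) -> TouchEvent | None:
        # Forward an event only when the reading changes meaningfully,
        # so the central controller is not flooded with raw data.
        if abs(pressure - self.last_sent) >= self.threshold:
            self.last_sent = pressure
            return TouchEvent(self.sensor_id, pressure)
        return None


class CentralController:
    """Receives only the filtered, 'critical' events."""

    def handle(self, event: TouchEvent) -> None:
        print(f"sensor {event.sensor_id}: pressure {event.pressure:.2f}")


# Usage: only meaningful changes reach the controller.
# (The real array has 168 elements; one node suffices for illustration.)
node = SensorNode(sensor_id=42)
controller = CentralController()
for reading in [0.0, 0.02, 0.35, 0.36, 0.9]:
    event = node.process(reading)
    if event is not None:
        controller.handle(event)
```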

The key to this kind of on-the-spot processing is a grid of 168 synaptic transistors made from zinc-oxide nanowires, which can be spread across a flexible surface. The researchers deployed the array over a human-shaped robotic hand fitted with skin sensors, producing a limb that can distinguish between a light and a heavy touch.
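
The light-versus-heavy distinction is made by the synaptic transistors themselves, but as a rough software analogy a single sensor reading could simply be binned against a cut-off. The pressure unit and the 20 kPa boundary below are assumptions for illustration, not values from the paper.

```python
# Software analogy (not the hardware implementation) of binning one skin
# sensor reading into coarse touch categories.

def classify_touch(pressure_kpa: float, light_max_kpa: float = 20.0) -> str:
    """Return a coarse touch category for a single sensor reading."""
    if pressure_kpa <= 0:
        return "no contact"
    if pressure_kpa <= light_max_kpa:
        return "light touch"
    return "heavy touch"


print(classify_touch(5.0))   # light touch
print(classify_touch(80.0))  # heavy touch
```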

Making a robot experience pain may sound cruel, but the goal is to improve sensitivity in a way that supports trial-and-error learning. Pain is a powerful teacher in childhood, hammering home lessons such as “touching a hot iron is bad”, and the sense of touch can help robots learn from external stimuli in much the same way. Professor Ravinder Dahiya, who heads the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) Group, said in a statement: “What we’ve been able to create through this process is an electronic skin capable of distributed learning at the hardware level, which doesn’t need to send messages back and forth to a central processor before taking action.”
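
In the study this distributed learning happens in the synaptic transistors themselves; purely as a software analogy of the trial-and-error idea, a local node could lower its own withdrawal threshold after a “painful” (very strong) stimulus, with no round-trip to a central processor. Every name and number in this sketch is hypothetical.

```python
# Illustrative sketch only: a local "pain" learning rule, not the
# synaptic-transistor mechanism described in the paper. A node that
# experiences a very strong stimulus lowers its withdrawal threshold
# so it reacts earlier next time.

class LearningSkinNode:
    def __init__(self, withdraw_threshold: float = 0.8,
                 pain_level: float = 0.95, learn_rate: float = 0.1):
        self.withdraw_threshold = withdraw_threshold
        self.pain_level = pain_level
        self.learn_rate = learn_rate

    def react(self, pressure: float) -> str:
        if pressure >= self.pain_level:
            # "Pain": adapt locally, no central processor involved.
            self.withdraw_threshold = max(
                0.2, self.withdraw_threshold - self.learn_rate)
            return "withdraw (painful)"
        if pressure >= self.withdraw_threshold:
            return "withdraw"
        return "no action"


node = LearningSkinNode()
print(node.react(0.97))  # painful stimulus -> threshold drops to 0.7
print(node.react(0.75))  # now strong enough to trigger withdrawal
```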

“Instead, by reducing the amount of computing necessary, it considerably speeds up the process of responding to touch. We feel this is a significant step forward in our efforts to develop large-scale neuromorphic printed electronic skin that can respond to inputs correctly.” Beyond robots that can learn to understand their surroundings and avoid damage, the technology is expected to find uses in human prosthetics. “In the future, this research could serve as the foundation for a more advanced electronic skin that allows robots to explore and interact with the world in novel ways, or for the development of prosthetic limbs with near-human levels of touch sensitivity,” said Fengyuan Liu, a member of the BEST group and a co-author of the paper.