Technology

Scientists Have Developed an AI System That Deciphers Human Gestures

Artificial Intelligence (AI) is a branch of computer science that studies intelligent systems (e.g. software, computers, robots). Scientists have now developed an AI system that recognizes hand gestures by combining skin-like electronics with computer vision. There are speech recognition bots out there, but until they can decipher what the hell it is we’re doing with our hands, even the most sophisticated AI will be missing a crucial aspect of human communication. AI gesture recognition systems that started out as visual-only have been improved by integrating inputs from wearable sensors, an approach known as ‘data fusion’.

Recent advances have improved AI’s ability to encode the complexity of human interaction. The problem, it turns out, was the data being fed into these algorithms. But with better sensors, the Nanyang Technological University (NTU) team thinks it has that sorted out. A further challenge is integrating the visual and sensory data: they are mismatched datasets that must be processed separately and only merged at the end, which is inefficient and leads to slower response times.
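
To make the ‘data fusion’ idea concrete, here is a minimal sketch of feature-level (early) fusion, where camera features and strain-sensor features are combined before classification rather than being processed separately and merged at the end. The layer sizes, feature dimensions, and class count below are illustrative assumptions, not the NTU team’s actual architecture.

```python
# Minimal early-fusion sketch (illustrative only, not the published model).
# Assumes each modality has already been reduced to a fixed-length feature vector.
import torch
import torch.nn as nn

class EarlyFusionGestureClassifier(nn.Module):
    def __init__(self, image_dim=128, sensor_dim=16, num_gestures=10):
        super().__init__()
        # Concatenate both modalities and classify jointly, instead of running
        # two separate models and merging their outputs at the end (late fusion).
        self.classifier = nn.Sequential(
            nn.Linear(image_dim + sensor_dim, 64),
            nn.ReLU(),
            nn.Linear(64, num_gestures),
        )

    def forward(self, image_features, sensor_features):
        fused = torch.cat([image_features, sensor_features], dim=-1)
        return self.classifier(fused)

# Hypothetical batch of 4 samples.
model = EarlyFusionGestureClassifier()
logits = model(torch.randn(4, 128), torch.randn(4, 16))
print(logits.shape)  # torch.Size([4, 10])
```

Fusing the features up front lets a single classifier learn from both modalities at once, which is the efficiency argument made above.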

Power Glove

AI can now encode words roughly as well as a person does, along with emotion and other non-verbal aspects of communication. Scientists will sometimes use computer vision to train AI, but it helps to supplement that data with spatial information from people making gestures while wearing motion-capture gloves. To capture reliable sensory data from hand gestures, the research team fabricated a transparent, stretchable strain sensor that adheres to the skin but cannot be seen in camera images.
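
If, as is typical, the strain sensor samples faster than the camera records frames, the two streams have to be brought into step before they can be fused. The sketch below shows one simple way to do that, pairing each camera frame with the sensor reading closest in time; the sampling rates and channel count are assumptions for illustration, not figures from the paper.

```python
# Illustrative sketch only: pairing a fast strain-sensor stream with slower
# camera frames by nearest timestamp, so each frame has a matching sensor
# reading before fusion. Rates and channel count are assumptions, not values
# from the paper.
import numpy as np

camera_ts = np.arange(60) / 30.0                 # 60 frames at 30 fps (seconds)
sensor_ts = np.arange(400) / 200.0               # 400 samples at 200 Hz (seconds)
sensor_vals = np.random.rand(sensor_ts.size, 5)  # 5 hypothetical strain channels

# For each camera frame, pick the sensor sample closest in time.
idx = np.abs(sensor_ts[None, :] - camera_ts[:, None]).argmin(axis=1)
aligned_sensor = sensor_vals[idx]                # shape: (60, 5)

print(aligned_sensor.shape)  # (60, 5)
```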

Those devices can be clunky, and it turns out the data they provided was all but useless for capturing the fine motions an AI would have to interpret. So the Nanyang Tech team did away with the bulky hardware and coated their performers’ hands with a stretchy, form-fitting sensor instead, according to research published in the journal Nature Electronics. The result is a system that can recognize human gestures more accurately and efficiently than existing methods.

Use the Force

AI coding of gestures has been shown to be moderately to substantially reliable. The resulting AI isn’t exactly a master conversationalist, and it’s way too soon to expect any algorithm to truly comprehend what we say. But the team does know the system at least understood what gestures meant: they were able to guide a robot through a maze using nothing but fine hand gestures. Results showed that hand gesture recognition powered by the bio-inspired AI system guided the robot through the maze with zero errors, compared with six recognition errors made by a purely visual recognition system.
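
As a rough illustration of the robot-guidance setup, recognized gestures only need to be mapped to simple navigation commands. The gesture names and command set below are hypothetical; the exact command vocabulary used in the experiment isn’t given here.

```python
# Illustrative sketch only: map a recognized gesture label to a robot command.
# Gesture names and commands are hypothetical, not taken from the paper.
GESTURE_TO_COMMAND = {
    "point_forward": "move_forward",
    "point_left": "turn_left",
    "point_right": "turn_right",
    "fist": "stop",
}

def command_for(gesture: str) -> str:
    """Return the robot command for a recognized gesture, or 'stop' if unknown."""
    return GESTURE_TO_COMMAND.get(gesture, "stop")

for g in ["point_forward", "point_left", "wave"]:
    print(g, "->", command_for(g))
```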

The NTU research team is now looking to build a VR and AR system based on the AI system it developed, for use in areas where high-precision recognition and control are desired, such as entertainment technologies and rehabilitation in the home. It’s always cool to learn that AI is starting to catch up, but for the wild-armed Italians among us (hi), it’ll still be quite some time before machines understand what it is we’re going on about.