If new research is to be believed, you may find yourself coming home from work in a bad mood one day, only to have your smart speaker automatically scan your emotions and begin to play soothing music. A team of researchers in the United Kingdom is using radio waves to detect subtle changes in heart rhythm and then using an advanced AI called a neural network to decipher what those signals mean: in other words, what the subject is feeling. It’s a breakthrough that could one day help human-intelligence analysts in Afghanistan determine who poses an insider threat.
That is one application for a new neural network that Queen Mary University of London engineers have taught to automatically interpret certain human emotions by bombarding people with radio waves and picking up on emotional cues such as changes in their heartbeat. According to research published earlier this month in the journal PLOS One, the algorithm can detect feelings such as fear, disgust, joy, and relaxation with 71 percent accuracy. That’s far from perfect, but it’s impressive enough that we might be able to put it to use in the real world.
According to researchers at Queen Mary University of London, radio-wave measurements of heart rate and breathing signals can be used to predict how someone is feeling even in the absence of any other visual cues, such as facial expressions.
For this study, the researchers used deep learning techniques, in which an artificial neural network learns its own features from raw, time-dependent data, and demonstrated that this approach could detect emotions more accurately than traditional machine learning methods.
The team, led by Yang Hao, dean of research at the Faculty of Science and Engineering, bounced radio waves off subjects using a small transmitting antenna. They used the signals to create a database of different heart rhythms while the subjects watched emotionally charged videos eliciting relaxation, fright, disgust, and joy. The team also connected the subjects to an electrocardiogram to confirm that the signals picked up by the antenna were accurate. The researchers ran their data through a deep neural network and found that their system correctly classified the subjects’ emotional state 71 percent of the time.
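For readers curious what “running the data through a deep neural network” can look like in practice, here is a minimal, hypothetical sketch in PyTorch. It is not the architecture from the PLOS One paper; the window length, layer sizes, and the EmotionNet name are all illustrative assumptions. The point is simply that the model consumes the raw, time-dependent signal and learns its own features, rather than being handed pre-computed statistics.

```python
# Illustrative sketch only: the paper's exact architecture is not reproduced here.
# Assumes each training example is a fixed-length window of the heartbeat signal
# recovered from the reflected radio waves, labelled with one of four emotions
# (relaxation, fright, disgust, joy).
import torch
import torch.nn as nn

NUM_CLASSES = 4          # relaxation, fright, disgust, joy
WINDOW_LENGTH = 1024     # hypothetical number of samples per signal window

class EmotionNet(nn.Module):
    """A small 1D convolutional network that learns features directly from
    the raw, time-dependent signal instead of hand-crafted statistics."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis to one value per channel
        )
        self.classifier = nn.Linear(32, NUM_CLASSES)

    def forward(self, x):              # x: (batch, 1, WINDOW_LENGTH)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)      # logits over the four emotion classes

# Example forward pass on a batch of eight random signal windows.
model = EmotionNet()
dummy_batch = torch.randn(8, 1, WINDOW_LENGTH)
logits = model(dummy_batch)            # shape: (8, NUM_CLASSES)
predicted_emotion = logits.argmax(dim=1)
```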
Black Hat
According to Defense One, the algorithm has been trained to detect changes in a person’s heartbeat, as picked up by radio waves, and interpret them as specific feelings. The military-focused publication was naturally interested in whether the system could be used for interrogation, but lead author and Queen Mary engineer Yang Hao explained that wasn’t the case.
“As for its implications to… national security,” Hao told Defense One, “more research is needed, just like other issues concerning ethics and responsible use of this technology.”
Learning Curve
Of course, 71 percent accuracy isn’t ideal, but the research shows that the neural network significantly outperforms other, less sophisticated AI architectures.
According to the study, a more traditional machine learning algorithm only got it right about 40 percent of the time. So, while we don’t yet have machines that understand the complex, subjective experience of human emotions, we’re getting closer to tools that can assist us in decoding them.
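By way of contrast, a traditional machine learning pipeline typically reduces each signal to a handful of hand-crafted statistics before classifying it. The sketch below, built with scikit-learn on made-up data and an arbitrary feature set, is purely illustrative and is not the baseline used in the study; it only shows where the two approaches differ.

```python
# Illustrative contrast only: the study's actual baseline is not detailed here.
# A "traditional" pipeline summarises each raw signal window with a few
# hand-crafted statistics, then feeds them to a conventional classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def handcrafted_features(window: np.ndarray) -> np.ndarray:
    """Reduce a raw signal window to a few simple statistics
    (hypothetical feature set, chosen for illustration)."""
    diffs = np.diff(window)
    return np.array([
        window.mean(),
        window.std(),
        window.max() - window.min(),
        np.abs(diffs).mean(),      # rough measure of beat-to-beat variability
    ])

# Fake data standing in for labelled heartbeat windows (four emotion classes).
rng = np.random.default_rng(0)
raw_windows = rng.normal(size=(200, 1024))
labels = rng.integers(0, 4, size=200)

X = np.array([handcrafted_features(w) for w in raw_windows])
clf = make_pipeline(StandardScaler(), SVC())
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```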
Methods for detecting human emotions are frequently used by researchers involved in psychological or neuroscientific studies, but these approaches may have broader implications for health and wellbeing management.
While this study shows that this approach can be used to detect emotions, any large-scale deployment would need to take into account other social and ethical concerns, such as data protection. “In the past, similar approaches have been used for identifying human physiological data, which has been widely used in body-centric wireless communications and wearable/implantable sensors for healthcare monitoring,” Professor Yang Hao added. “As with many other wireless technologies, having a secure network to protect users’ privacy is critical.”
In the future, the research team plan to work with healthcare professionals and social scientists on public acceptance and ethical concerns around the use of this technology.