A novel artificial intelligence (AI) approach based on wireless signals could help to reveal our inner emotions, according to new research from Queen Mary University of London. The study, published in the journal PLOS ONE, demonstrates the use of radio waves to measure heart rate and breathing signals and to predict how someone is feeling, even in the absence of any other visual cues such as facial expressions.
Participants were initially asked to watch a video selected by the researchers for its ability to evoke one of four basic emotions: anger, sadness, joy, and pleasure. While each individual watched the video, the researchers emitted harmless radio signals towards them, like those transmitted by any wireless system such as radar or Wi-Fi, and analyzed the signals that bounced back. By measuring the changes in these signals caused by subtle body movements, the team was able to reveal ‘hidden’ information about a person’s heart rate and breathing rate.
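To make that signal-processing step concrete, here is a minimal sketch of how breathing and heart rates could be recovered from a reflected radio signal. It assumes the raw reflections have already been demodulated into a one-dimensional chest-displacement time series; the sampling rate, the band limits, and the synthetic test signal are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: recovering breathing and heart rates from a reflected
# radio signal. Assumes the raw reflections have already been demodulated
# into a 1-D chest-displacement time series `x` sampled at `fs` Hz.
# Band limits and sampling rate are illustrative, not from the paper.
import numpy as np
from scipy.signal import butter, filtfilt

def dominant_rate_bpm(x, fs, low_hz, high_hz):
    """Band-pass the signal and return the dominant frequency in cycles/min."""
    b, a = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, x)
    spectrum = np.abs(np.fft.rfft(filtered * np.hanning(len(filtered))))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

fs = 50.0                          # sampling rate of the demodulated signal
t = np.arange(0, 30, 1.0 / fs)     # 30 seconds of data
# Synthetic stand-in: 0.25 Hz breathing + 1.2 Hz heartbeat + noise.
x = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)
x += 0.05 * np.random.randn(len(t))

print("breathing: %.1f breaths/min" % dominant_rate_bpm(x, fs, 0.1, 0.5))
print("heartbeat: %.1f beats/min" % dominant_rate_bpm(x, fs, 0.8, 2.0))
```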
Previous research has used similar non-invasive or wireless methods of emotion detection; however, in those studies the data analysis depended on classical machine learning approaches, whereby an algorithm is used to identify and classify emotional states within the data. For this study, the scientists instead employed deep learning techniques, in which an artificial neural network learns its own features from the time-dependent raw data, and showed that this approach could detect emotions more accurately than traditional machine learning methods. Their proposed solution is a novel neural architecture that processes the time-dependent wireless signal and its frequency-domain wavelet-transform images simultaneously, while retaining the temporal and spatial relationships in the data.
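The description above implies a network with two inputs: the raw time series and a wavelet-transform image derived from it. The sketch below shows one plausible shape for such a two-branch model, pairing a recurrent branch for the temporal signal with a convolutional branch for the image; the layer types and sizes are illustrative assumptions, not the authors’ published architecture.

```python
# Illustrative sketch (not the authors' published architecture): a two-branch
# network fusing a raw wireless time series with its wavelet "scalogram"
# image, as the paper's description suggests. All layer sizes are assumptions.
import torch
import torch.nn as nn

class TwoBranchEmotionNet(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        # Branch 1: LSTM over the 1-D time-dependent signal (temporal structure).
        self.lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)
        # Branch 2: small CNN over the wavelet-transform image (spatial structure).
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (batch, 32)
        )
        # Fusion head: concatenate both feature vectors, classify four emotions.
        self.head = nn.Linear(64 + 32, n_classes)

    def forward(self, signal, scalogram):
        # signal: (batch, time, 1); scalogram: (batch, 1, H, W)
        _, (h_n, _) = self.lstm(signal)   # final hidden state summarizes time
        fused = torch.cat([h_n[-1], self.cnn(scalogram)], dim=1)
        return self.head(fused)

# Example shapes: 8 windows of 1500 samples, paired with 64x64 scalograms.
model = TwoBranchEmotionNet()
logits = model(torch.randn(8, 1500, 1), torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 4])
```

Intuitively, the recurrent branch captures when things happen in the signal, while the convolutional branch captures how energy is distributed across frequencies; concatenating the two feature vectors lets the classifier use both views at once.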
Achintha Avin Ihalage, a Ph.D. student at Queen Mary, said: “Deep learning allows us to assess data in a similar way to how a human brain would work, looking at different layers of information and finding connections between them. Most of the published literature that uses machine learning measures emotions in a subject-dependent way, recording a signal from a specific individual and using it to predict their emotion at a later stage.
“With deep learning, we’ve shown we can accurately measure emotions in a subject-independent way, where we can look at a whole collection of signals from different individuals and learn from this data and use it to predict the emotion of people outside of our training database.”
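As a concrete illustration of what subject-independent evaluation means, the sketch below trains on every participant but one and tests on the held-out person, cycling through all participants. scikit-learn’s LeaveOneGroupOut handles the per-subject split; the random-forest classifier and the synthetic data are generic stand-ins for the paper’s deep network and wireless features.

```python
# Minimal sketch of subject-independent evaluation: train on signals from
# some participants, test on a participant the model has never seen.
# The classifier and data here are stand-ins, not the paper's setup.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 20))           # 120 feature vectors (e.g. per window)
y = rng.integers(0, 4, size=120)         # four emotion labels
subjects = np.repeat(np.arange(12), 10)  # 12 participants, 10 windows each

scores = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                         groups=subjects, cv=LeaveOneGroupOut())
print("held-out-subject accuracy per fold:", np.round(scores, 2))
```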
Traditionally, emotion detection has relied on the assessment of visible signals such as facial expressions, speech, body gestures, or eye movements. However, these methods can be unreliable, as they do not effectively capture an individual’s internal emotions, and researchers are increasingly looking towards ‘invisible’ signals, such as the electrocardiogram (ECG), to understand emotions.
ECG signals measure electrical activity in the heart, providing a link between the nervous system and heart rhythm. To date, these signals have largely been recorded using electrodes placed on the body, but recently researchers have been investigating non-invasive approaches that use radio waves to detect them.
Methods for detecting human emotions are often used by researchers involved in psychological or neuroscientific studies, but it is thought that these approaches could also have wider implications for the management of health and wellbeing. The research team plans to work with healthcare professionals and social scientists on questions of public acceptance and the ethical concerns around the use of this technology.
Ahsan Noor Khan, a Ph.D. student at Queen Mary and first author of the study, said: “Being able to detect emotions using wireless systems is a topic of increasing interest for researchers as it offers an alternative to bulky sensors and could be directly applicable in future ‘smart’ home and building environments. In this study, we’ve built on existing work using radio waves to detect emotions and show that the use of deep learning techniques can improve the accuracy of our results.”
“We are also looking to investigate how we could use low-cost existing systems, such as Wi-Fi routers, to detect the emotions of a large number of people gathered together, for example in an office or work environment. This type of approach would enable us to classify the emotions of people on an individual basis while they perform routine activities.”
Professor Yang Hao, the project lead, added: “This research opens up many opportunities for practical applications, especially in areas such as human-robot interaction, healthcare, and emotional wellbeing, which has become increasingly important during the current COVID-19 pandemic.”