Emotions can be tricky enough for humans to read, let alone machines, but a new system can predict people’s feelings with 87 percent accuracy by bouncing wireless signals off them, researchers say.
The setup, dubbed EQ-Radio, analyzes the signal reflected off a subject’s body to monitor both breathing and heartbeat. These physiological cues are commonly used to detect a person’s emotions, but doing so typically requires hooking up the subject to a host of sensors.
Using a device smaller than a Wi-Fi router, researchers at MIT were able to monitor a person’s breathing and heartbeat wirelessly. These measurements were then fed into a machine-learning algorithm that classified the subject’s emotion as excited, happy, angry or sad. The accuracy was similar to that of state-of-the-art wired approaches, the scientists said.
The inventors say potential applications include health care systems that detect if you’re getting depressed before you do, “smart” homes that can tune lighting and music to your mood or tools that allow filmmakers to get real-time feedback on their audience’s reaction.
“The idea is that you can enable machines to recognize our emotions so they can interact with us at much deeper levels,” said Fadel Adib, a doctoral student at MIT’s Computer Science and Artificial Intelligence Lab who helped design the system.
To test EQ-Radio, 12 subjects were monitored for 2 minutes at a time while experiencing no emotion and also while using videos or music to recall memories that evoked each of the four emotions (excited, happy, angry and sad). A machine-learning algorithm was then trained on each subject’s heartbeat and breathing data from each monitoring period.
According to Adib, the system intelligently combines the two and then maps the results onto a graph where one axis represents arousal and the other represents “valence” – essentially, whether an emotion is positive or negative. This is then used to classify the emotion into the four broad categories.
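The quadrant mapping Adib describes can be sketched in a few lines. This is a hypothetical illustration of the valence-arousal idea, not EQ-Radio's actual model: the function name, score ranges and thresholds are all assumptions.

```python
# Illustrative sketch: assigning one of the four broad emotion labels
# from signed scores on the two axes the article describes.
# Positive/negative valence and high/low arousal define four quadrants.
# The thresholds and score conventions here are invented for illustration.

def classify_emotion(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point to one of four emotion quadrants."""
    if arousal >= 0:
        # High arousal: excited if the feeling is positive, angry if negative.
        return "excited" if valence >= 0 else "angry"
    # Low arousal: happy (content) if positive, sad if negative.
    return "happy" if valence >= 0 else "sad"

print(classify_emotion(0.7, 0.8))    # positive valence, high arousal
print(classify_emotion(-0.5, -0.6))  # negative valence, low arousal
```

In practice the system would derive the valence and arousal scores from the heartbeat and breathing features rather than take them as inputs; this sketch only shows the final quadrant step.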
After training on each subject individually, the system could accurately classify their emotional states 87 percent of the time, the researchers said. A separate system trained on data from 11 participants was able to classify the emotions of the unseen 12th subject 72.3 percent of the time.
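The two evaluation protocols the researchers describe, per-subject training and testing on an unseen subject, correspond to standard person-dependent and leave-one-subject-out setups. The sketch below mocks the features and uses scikit-learn as an assumed stand-in for the authors' classifier; the data shapes and model choice are illustrative only.

```python
# Sketch of leave-one-subject-out evaluation: train on 11 subjects,
# classify the held-out 12th, and repeat for each subject.
# Features here are random placeholders, not real physiological data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(48, 6))          # 12 subjects x 4 emotions, 6 mock features
y = np.tile(np.arange(4), 12)         # emotion label per recording
groups = np.repeat(np.arange(12), 4)  # subject ID per recording

# Each fold holds out all recordings from one subject.
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                         groups=groups, cv=LeaveOneGroupOut())
print(scores.mean())  # accuracy averaged over the 12 held-out subjects
```

With random features the accuracy hovers near chance; the point is the evaluation structure, which explains why the person-independent figure (72.3 percent) is lower than the person-dependent one (87 percent).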
“Our emotions are continuous and it doesn’t make sense for us just to assign them to one of these states,” Adib told Live Science. “But it’s a way to start, and moving forward we can develop techniques to better understand the different classes or subclasses of emotion.”
The system relies on a radar technique called frequency-modulated continuous wave (FMCW), which is particularly powerful because it can eliminate reflections from static objects and other humans, the researchers said. This high-precision body tracking is sensitive enough to pick up the rising and falling of the chest during breathing, as well as minute vibrations caused by blood pulsing through the body. Because heart contractions happen much faster than breathing, acceleration measurements are used to isolate the fainter heartbeat signal, they added.
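The separation idea described above can be illustrated numerically: differentiating the chest-displacement signal twice (taking its acceleration) boosts fast components relative to slow ones, which favors the heartbeat over breathing. This is a minimal sketch under invented signal parameters, not the researchers' actual processing pipeline.

```python
# Illustrative sketch: acceleration amplifies a fast, faint heartbeat
# component relative to a slow, large breathing component.
# Frequencies, amplitudes and sample rate below are assumptions.
import numpy as np

fs = 200                      # sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
breathing = 5.0 * np.sin(2 * np.pi * 0.25 * t)  # ~15 breaths/min, large & slow
heartbeat = 0.05 * np.sin(2 * np.pi * 1.2 * t)  # ~72 beats/min, tiny & fast

def accel(x):
    """Numerical second derivative (acceleration) of a signal."""
    return np.gradient(np.gradient(x, t), t)

# Heartbeat-to-breathing amplitude ratio, before and after differentiating twice.
ratio_disp = np.abs(heartbeat).max() / np.abs(breathing).max()
ratio_accel = np.abs(accel(heartbeat)).max() / np.abs(accel(breathing)).max()
print(ratio_disp, ratio_accel)
```

Each differentiation scales a sinusoid's amplitude by its frequency, so taking the acceleration multiplies the heartbeat-to-breathing ratio by roughly (1.2/0.25)² ≈ 23, making the heartbeat far easier to pick out.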
Dimitrios Hatzinakos, a professor of electrical and computer engineering at the University of Toronto who specializes in biometric security, said the potential for automated emotion recognition is huge. But he said the controlled nature of the experiments on the EQ-Radio device makes it hard to judge whether it would work in real-world situations.
“Real life is brutal in this sense. The system might work fine under some conditions and fail in others,” Hatzinakos told Live Science. “A thorough evaluation should be done in real-life environments if we want to talk about practical systems.”
But Dina Katabi, a professor of electrical engineering and computer science at MIT, who led the research, is confident the device will hold up in real-life situations. She plans to incorporate the emotion-detection capability into devices made by her company Emerald that use wireless signals to detect falls among the elderly.
The researchers also think the fact that the system relies on mechanical signals rather than electrical ones to monitor the heart could lead to significant applications in health care.
“What really tells you about functioning of the heart are the mechanical signals,” Adib said. “So it will be very interesting to try to explore what are the conditions we can actually extract, given that we are getting this level of granularity.”
The team will present the work at the Association for Computing Machinery’s International Conference on Mobile Computing and Networking, which is being held from Oct. 3 to 7 in New York City.
Original article on Live Science.