Carnegie Mellon scientists have successfully determined what emotion a person is experiencing based on readings of their brain activity.
Researchers used functional magnetic resonance imaging (fMRI) scans and computer modeling to measure brain signals, then developed a way to "read" which emotions people were experiencing.
Before this study, there was no objective way to measure emotions. People aren't always honest about their feelings, the authors pointed out, and they may sometimes experience emotions without realizing it.
"This research introduces a new method with potential to identify emotions without relying on people's ability to self-report," lead author Karim Kassam, assistant professor of social and decision sciences at Carnegie Mellon in Pittsburgh, Pa., said in a press release. "It could be used to assess an individual's emotional response to almost any kind of stimulus, for example, a flag, a brand name or a political candidate."
Scientists recruited 10 actors from CMU's School of Drama and showed them words naming nine emotions: anger, disgust, envy, fear, happiness, lust, pride, sadness and shame. The actors were then told to think about each word while they were being scanned.
"Our big breakthrough was my colleague Karim Kassam's idea of testing actors, who are experienced at cycling through emotional states. We were fortunate, in that respect, that CMU has a superb drama school," co-author George Loewenstein, the Herbert A. Simon University Professor of Economics and Psychology, said in the press release.
Film clips were not used because the emotional impact of a scene decreases after repeated viewings.
To make sure the scans were measuring the actual emotion, and not just the act of thinking about an emotion, the researchers then showed the participants neutral and disgusting photos they had not seen before.
A computer model -- which was based on data from the fMRI scans linking different brain activation patterns to different emotions -- was able to accurately determine which emotion the participants were feeling based on the photograph they saw.
The model achieved a rank accuracy of 0.84 for the neutral pictures and 0.91 for the disgusting pictures. Rank accuracy reflects the percentile rank of the correct emotion among the model's guesses -- random guessing would yield a rank accuracy of 0.5. For the disgusting photos, the model ranked disgust as the most likely of the nine emotions 60 percent of the time, and placed it among its top two choices 80 percent of the time.
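The rank-accuracy metric described above can be illustrated with a small sketch. The emotion scores below are invented for the example, not taken from the study; the point is only how a percentile rank turns the position of the correct emotion into a number between 0 and 1, with 0.5 as the chance level.

```python
def rank_accuracy(scores, true_label):
    """Percentile rank of the correct label among the model's scores.

    1.0 means the correct emotion was ranked first; 0.0 means it was
    ranked last; random guessing averages 0.5 over many trials.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    position = ranked.index(true_label)  # 0 = top-ranked
    n = len(ranked)
    return 1.0 - position / (n - 1)

# Invented scores for the nine emotions after a "disgust" trial:
scores = {"anger": 0.4, "disgust": 0.9, "envy": 0.1, "fear": 0.5,
          "happiness": 0.2, "lust": 0.3, "pride": 0.15,
          "sadness": 0.35, "shame": 0.25}

print(rank_accuracy(scores, "disgust"))  # top-ranked -> 1.0
```

With nine candidate emotions, each step down the ranking costs 1/8 of a point, which is why a score like 0.91 means the correct emotion sat at or very near the top of the list on most trials.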
Researchers then combined data from nine participants to create fMRI emotion profiles and tested the resulting model on the tenth subject, who was asked to cycle through the nine emotions. The model achieved a rank accuracy of 0.71.
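The cross-subject test described above follows a leave-one-subject-out pattern: build emotion profiles from nine people, then classify scans from the held-out tenth. The sketch below mimics that structure with invented feature vectors standing in for real fMRI activation patterns; the nearest-profile classifier and all the numbers are assumptions for illustration, not the study's actual model.

```python
import random

EMOTIONS = ["anger", "disgust", "envy", "fear", "happiness",
            "lust", "pride", "sadness", "shame"]

def base_pattern(emotion):
    # Deterministic stand-in for the shared activation pattern of one
    # emotion (invented, not real fMRI data).
    rng = random.Random(emotion)
    return [rng.random() for _ in range(5)]

def subject_scan(emotion, noise_rng):
    # One subject's scan: the shared pattern plus individual noise.
    return [x + noise_rng.gauss(0, 0.05) for x in base_pattern(emotion)]

noise = random.Random(42)

# "Train": average scans from nine subjects into one profile per emotion.
profiles = {}
for e in EMOTIONS:
    scans = [subject_scan(e, noise) for _ in range(9)]
    profiles[e] = [sum(col) / 9 for col in zip(*scans)]

def classify(scan):
    # Pick the nearest profile by squared Euclidean distance.
    def dist(p):
        return sum((a - b) ** 2 for a, b in zip(scan, p))
    return min(profiles, key=lambda e: dist(profiles[e]))

# "Test": a held-out tenth subject cycles through all nine emotions.
correct = sum(classify(subject_scan(e, noise)) == e for e in EMOTIONS)
print(f"{correct}/{len(EMOTIONS)} held-out trials classified correctly")
```

The design choice the study's result speaks to is exactly this one: because the profiles were averaged across other people, classifying the tenth subject above chance is only possible if different people encode emotions in neurally similar ways.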
The computer model identified happiness most easily and had the hardest time with envy. It was least likely to misidentify lust, which may mean that emotion produces a distinctive neural pattern, and it rarely confused positive and negative emotions.
"Despite manifest differences between people's psychology, different people tend to neurally encode emotions in remarkably similar ways," co-author Amanda Markey, a graduate student in the Department of Social and Decision Sciences, said in a press release.
The researchers hope the model will further emotion research, particularly in identifying emotions that people are trying to hide and in untangling cases where a person is feeling several emotions at once.
The study was published in PLOS ONE on June 19.