Hooked on a Feeling
Award shows, like the Tony Awards®, are a time when actors put their poker faces to the test.
At Carnegie Mellon University, art meets technology when actors put their emotions on display to help researchers map where feelings — such as happiness and sadness — reside in the brain.
The study, published last year in PLOS ONE, combined functional magnetic resonance imaging (fMRI) with machine learning to measure brain signals and accurately read individuals' emotions.
Led by researchers in CMU's Dietrich College of Humanities and Social Sciences, the findings illustrate how the brain categorizes feelings, giving researchers the first reliable process to analyze emotions.
The research team faced the challenge of finding a way to repeatedly and reliably evoke different emotional states in participants. They solved the problem by recruiting actors from CMU's School of Drama, which is celebrating its centennial this year.
"Our big breakthrough was my colleague Karim Kassam's idea of testing actors, who are experienced at cycling through emotional states. We were fortunate, in that respect, that CMU has a superb drama school," said George Loewenstein, the Herbert A. Simon University Professor of Economics and Psychology.
For the study, 10 actors were scanned at CMU's Scientific Imaging & Brain Research Center while viewing the words of nine emotions: anger, disgust, envy, fear, happiness, lust, pride, sadness and shame. While inside the fMRI scanner, the actors were instructed to enter each of these emotional states.
"This research introduces a new method with potential to identify emotions without relying on people's ability to self-report," said Kassam, assistant professor of social and decision sciences and lead author of the study. "It could be used to assess an individual's emotional response to almost any kind of stimulus, for example, a scene, a character or an entire play."
Identifying emotions based on neural activity builds on previous discoveries by CMU's Marcel Just and Tom M. Mitchell, which used similar techniques to create a computational model that identifies individuals' thoughts of concrete objects, often dubbed "mind reading."
Another challenge was to ensure that the technique measured emotions per se, and not the act of trying to induce an emotion in oneself. So a second phase of the study presented participants with neutral and disgusting photos they had not seen before. The computer model, built by statistically analyzing the fMRI activation patterns gathered for 18 emotional words, had learned the emotion patterns from self-induced emotions. It was nevertheless able to correctly identify the emotional content of the photos from the viewers' brain activity.
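The two-phase design can be sketched in code. The following is an illustrative sketch only, not the study's actual pipeline: it simulates activation patterns as synthetic feature vectors, trains a simple classifier on patterns from "self-induced" emotional states, and then tests it on patterns evoked by unseen stimuli. The Gaussian Naive Bayes classifier and all data here are stand-ins chosen for clarity.

```python
# Illustrative sketch: a classifier trained on one set of activation
# patterns generalizing to patterns evoked by a different stimulus type.
# All data are synthetic; the classifier choice is an assumption.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
emotions = ["disgust", "neutral"]
n_voxels = 50  # stand-in for a set of selected fMRI voxels

# Phase 1: synthetic activation patterns from self-induced emotional states,
# each emotion modeled as a distinct "neural signature" plus noise.
centers = {e: rng.normal(size=n_voxels) for e in emotions}
X_train = np.vstack([centers[e] + 0.5 * rng.normal(size=(20, n_voxels))
                     for e in emotions])
y_train = np.repeat(emotions, 20)

clf = GaussianNB().fit(X_train, y_train)

# Phase 2: patterns evoked by viewing unseen photos -- same underlying
# signatures, different elicitation method.
X_test = np.vstack([centers[e] + 0.5 * rng.normal(size=(5, n_voxels))
                    for e in emotions])
y_test = np.repeat(emotions, 5)
print("cross-phase accuracy:", clf.score(X_test, y_test))
```

If the model had only learned the act of self-induction rather than the emotions themselves, this cross-phase test would fail; accurate classification of the photo-evoked patterns is what licenses the study's conclusion.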
"Despite manifest differences between people's psychology, different people tend to neurally encode emotions in remarkably similar ways," noted Amanda Markey, a graduate student in the Department of Social and Decision Sciences.
The research team also found that, on average, the model was best at identifying happiness and least accurate at identifying envy.
Just, the D.O. Hebb University Professor of Psychology, director of the university's Center for Cognitive Brain Imaging and a leading neuroscientist, explained, "We found that three main organizing factors underpinned the emotion neural signatures, namely the positive or negative valence of the emotion, its intensity — mild or strong, and its sociality — involvement or non-involvement of another person. This is how emotions are organized in the brain."
Related Links: Brain and Behavior | Dietrich College of Humanities and Social Sciences | Scientific Imaging & Brain Research Center | Center for Cognitive Brain Imaging | Social and Decision Sciences | Read press release | Carnegie Mellon Computer Model Reveals How Brain Represents Meaning