Marianne Reddan spent the last 10 years staring at human faces, searching for traces of two distinct but closely related emotions: surprise and fear. After so long studying the intricacies of facial expressions, she can barely tell the two apart anymore.
That's why Reddan, a post-doctoral neuroscience fellow at Stanford University, knew her research was onto something when a machine learning-powered system trained to detect emotions successfully distinguished between the two.
Reddan was impressed to find that the system, known as "EmoNet," wasn't just looking at facial expressions to make sense of human emotions; it was taking in context clues to determine the overall mood, just like a person would.
"If EmoNet can tell surprise and fear apart, it says to me it's not just picking up the expressions on a face, it's learning something important about the actions going on," Reddan told The Daily Beast.
The machine learning system itself, a neural network built from many preexisting datasets, took researchers from the University of Colorado Boulder and Duke University a year to develop.
First, Reddan and her fellow researchers repurposed AlexNet, a deep learning model that enables computers to recognize objects (it's modeled after the human visual cortex), and retrained it to recognize emotions instead.
AlexNet was originally trained to identify various objects by being fed photos of items like chairs and pens, each paired with a class label. Reddan and the other researchers wondered if a similar classification could be done with emotions. With the model already adept at detecting objects, it seemed ready for a new challenge.
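The team's exact training code isn't reproduced in this coverage, but the general recipe, swapping a pretrained object recognizer's output layer for a new emotion classifier, can be sketched in a few lines of PyTorch. This is a minimal illustration assuming torchvision's version of AlexNet, not the authors' published pipeline:

```python
import torch.nn as nn
from torchvision import models

# Load AlexNet pretrained on ImageNet object recognition.
# (A sketch of the generic transfer-learning recipe, not the
# authors' code; torchvision's AlexNet stands in for their setup.)
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Freeze the convolutional layers so the visual representations
# learned from object recognition stay intact.
for param in model.features.parameters():
    param.requires_grad = False

# Replace the final 1,000-way object classifier with a new
# 20-way head, one output per emotion category.
model.classifier[6] = nn.Linear(in_features=4096, out_features=20)
```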
Lead researcher Philip Kragel, a research associate at the University of Colorado Boulder's Institute of Cognitive Science, fed the neural network 25,000 images and asked the system to sort them into 20 categories of emotion, some of them subtle.
The list included emotions like anxiety and boredom, but also less obvious human emotional experiences like "aesthetic appreciation" and "empathic pain." To make sense of each image, the neural network parsed the facial expressions and body posture of the humans depicted.
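From there, training reduces to ordinary supervised classification over the 20 labels. A hedged sketch continuing the snippet above, with small random tensors standing in for the study's 25,000 labeled photos so the example runs end to end:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic stand-in data: random tensors in place of real photos,
# just to make the sketch self-contained and runnable.
images = torch.randn(64, 3, 224, 224)      # AlexNet's expected input size
labels = torch.randint(0, 20, (64,))       # one of 20 emotion categories
loader = DataLoader(TensorDataset(images, labels), batch_size=16)

criterion = nn.CrossEntropyLoss()
# Only the new 20-way head is updated; the frozen features are untouched.
optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)

for epoch in range(3):
    for batch_images, batch_labels in loader:
        optimizer.zero_grad()
        logits = model(batch_images)       # shape: (batch, 20)
        loss = criterion(logits, batch_labels)
        loss.backward()
        optimizer.step()
```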
Then it was time to see how EmoNet's emotion categorization skills compared with those of the human brain. Eighteen human subjects were brought in and hooked up to functional magnetic resonance imaging (fMRI) machines. Their brain activity was measured as they were shown flashes of 112 images, while the neural network analyzed the same pictures in parallel.
The results suggested that the neural net was capable of tracking the humans' own emotional responses and that "rich, category-specific visual features can be reliably mapped to distinct emotions," according to the paper, which was published in the journal Science Advances.
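The paper's statistical pipeline isn't detailed here, but one common way to test whether a model's features can be "reliably mapped" to brain responses is a cross-validated linear mapping from model outputs to measured activity. A toy sketch, with random arrays standing in for the network's per-image emotion scores and the fMRI data:

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# Hypothetical data matching the study's shape: 112 images shown.
# model_scores: the network's 20 emotion-category outputs per image.
# voxel_response: one brain region's fMRI response per image.
rng = np.random.default_rng(0)
model_scores = rng.standard_normal((112, 20))   # stand-in for real outputs
voxel_response = rng.standard_normal(112)       # stand-in for real fMRI data

# Cross-validated ridge regression: if held-out predictions correlate
# with measured activity, the model's emotion features track the brain's.
ridge = RidgeCV(alphas=np.logspace(-2, 2, 10))
scores = cross_val_score(ridge, model_scores, voxel_response,
                         cv=5, scoring="r2")
print(f"mean held-out R^2: {scores.mean():.3f}")
```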
Building a neural network, a computer program that simulates the human brain, has been a scientific dream for many years, but even sophisticated computers struggle with some aspects of the human experience. "Emotions are a big part of our daily lives," Kragel told The Daily Beast. "If neural networks didn't account for them, they would have a very limited understanding of how the brain works."
Kragel was surprised the neural network worked as well as it did, but it still had limitations. The two categories the system identified most accurately were "sexual desire" and "craving," but it often fared poorly with dynamic emotions like surprise, which can easily turn to joy or anger. The system also struggled to tell the difference between emotions like adoration, amusement, and joy, in part because those emotions are so closely intertwined.
In the future, a neural network like EmoNet could be used to moderate online content, serving as a filter that pre-screens visual posts before they reach human eyes.
Researcher Hannah Davis, a professor of generative music at NYU and former OpenAI scholar, previously worked on a project in which she used AI to generate "emotional landscapes": landscape images a computer associates with various human emotions. She says the new emotion research seems innovative in the way it has mapped brain patterns and created a model that can decode facial expressions according to those categories.
But while Davis believes that teaching a computer to read emotions isn't inherently dangerous, "there's also the risk of assuming that all brains work the same, and poorly classifying people's behavior based on overly generalizable models," she wrote in an email.
In the future, Kragel wants to investigate whether a neural network like EmoNet can categorize emotions in spoken language, using tone and timbre to detect differences.
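How "tone and timbre" would be represented is an open question; purely as illustration, audio libraries like librosa can already extract the kind of pitch and timbre descriptors such a model might take as input (the file path here is a placeholder):

```python
import librosa

# Placeholder path; any short speech recording would do.
y, sr = librosa.load("speech_sample.wav")

# Tone: estimate the fundamental frequency (pitch) contour.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"))

# Timbre: mel-frequency cepstral coefficients, a standard compact
# description of a voice's spectral character.
mfccs = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

print(f0.shape, mfccs.shape)
```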
Reddan remains cautious about what the research means. Detecting a facial expression that correlates with a human emotion isn't the same as understanding it.
"Is the model feeling the emotions? It's definitely not. It's just sorting into chunky categories, not the complexity of the human experience," Reddan told The Daily Beast. "Could it one day feel? Maybe."