'Stereotyping' emotions is getting in the way of artificial intelligence. Scientists say they've discovered a better way.


Facial cues, like a smile, aren't always indicative of a corresponding emotional state. (Strelka Institute for Media, Architecture and Design/Flickr)

  • Emotions are complicated.
  • For example, facial expressions don't map neatly to emotional states.
  • A recent review paper from researchers at Northeastern University and the California Institute of Technology found a major disconnect between common assumptions about how emotions work and how people actually express and perceive them.
  • This could lead to major problems. As artificial intelligence grows more prevalent, the researchers warn that "the science of emotion is ill-equipped to support any of these initiatives."

Understanding an emotion isn't as simple as noticing a smile, but we still look to facial movements for everything from navigating everyday social interactions to developing emotionally attuned artificial intelligence.

According to a July 2019 study from researchers at Northeastern and the California Institute of Technology, facial expressions reflect only the surface of emotion: culture, situation, and the specific person behind an expression all shape how a feeling is conveyed.

For example, the researchers note that Olympic athletes who won medals smiled only when they knew an audience was watching. While waiting behind the podium or facing away from people, they didn't smile (though they were presumably still happy). These results reinforce the idea that facial expressions aren't always reliable indicators of emotion.

"Such findings are consistent," the researchers write, "with more recent sociological evidence that smiles are social cues that can communicate different social messages depending on the cultural context."


Common assumptions about emotions are getting in the way of actually understanding them, especially in tech.

This stance challenges what the researchers call the "common view": the assumption that certain facial movements, like a furrowed brow or tightened lips, map directly to emotional categories like anger or frustration. That presents a real risk for technologists: if artificial intelligence developers rely on the common view, they could oversimplify emotional states and build AI on incomplete information.

Yet technology companies are already investing millions of dollars in research built on the common view. Amazon is even exploring virtual-human technology to interact with consumers, and virtual humans are being used in everything from educating children to training physicians. Regardless of where you stand in the corporate hierarchy, understanding the nuances of emotional intelligence, and how it can be developed, can help you understand yourself and your colleagues (whether human, virtual, or artificial) a bit better.

The researchers suggest that the science of emotion isn't as far along as people assume, which may also be why technologies like AI currently fall short in connecting the dots of human emotion.

Technology companies should take note as they attempt to develop emotional intelligence in AI.

But the "common view" is also an issue for everyday interactions.

On a more personal level, moving away from the common view could impact how well you deal with your emotions. Communicating your emotions as specifically as possible, and learning how you and others express them, elevates your emotional intelligence.


Dr. Lisa Feldman Barrett, lead author on the study and director of Northeastern University's Interdisciplinary Affective Science Laboratory, introduced the concept of "emotional granularity": the notion that the more finely you can distinguish and communicate your feelings, the more precisely you'll experience emotion.

"So if you have a very fine-grained conceptual system for emotion, you know a lot about emotion, then your brain is able to construct very precise prediction in a way that's tailored very specifically to your situation," she told Drake Baer, current deputy editor at Business Insider. "So you're not using stereotypes; you're using these very fine-grained, honed, situated predictions."

Instead of expressions, humans and AI alike should be looking for emotions in context.

Based on that research, tech companies now seem to be asking the wrong question. If individual emotional intelligence can be improved by understanding the context in which emotions occur, and by communicating feelings specifically, then AI developers should prioritize those factors over facial movements alone when interpreting emotional cues, as the sketch below illustrates.
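To make the contrast concrete, here is a minimal sketch in Python of the two approaches. Everything in it is hypothetical: the labels, cues, and rules are invented for illustration, drawing on the article's Olympic-podium example rather than the researchers' paper or any real emotion-recognition system.

```python
# A minimal, hypothetical sketch contrasting the "common view" with a
# context-aware reading of the same facial movement. Nothing here comes
# from the researchers' paper; the names and rules are invented.

# Common-view approach: map a facial movement directly to an emotion label.
COMMON_VIEW = {
    "smile": "happy",
    "scowl": "angry",
    "frown": "sad",
}

def common_view_guess(facial_movement: str) -> str:
    """One-to-one lookup: assumes the face alone reveals the emotion."""
    return COMMON_VIEW.get(facial_movement, "unknown")

def context_aware_guess(facial_movement: str, context: dict) -> str:
    """Weighs situational cues alongside the expression (illustrative only)."""
    if facial_movement == "smile":
        if context.get("audience_present"):
            # A smile in front of an audience may be a social signal,
            # not direct evidence of the underlying feeling.
            return "ambiguous: possibly happy, possibly a social cue"
        return "likely happy"
    return "insufficient information"

if __name__ == "__main__":
    # The podium example: medalists smiled only when an audience was
    # watching, so the smile alone is an unreliable indicator.
    print(common_view_guess("smile"))                                 # happy
    print(context_aware_guess("smile", {"audience_present": True}))   # ambiguous
    print(context_aware_guess("smile", {"audience_present": False}))  # likely happy
```

The point of the toy example is only that the second function takes the situation as an input at all; a real system would need far richer context (culture, body posture, what the person says) than a single flag.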

"It is time to move beyond a science of stereotypes to develop a science of how people actually move their faces to express emotion in real life," Barrett and her colleagues write, "and the processes by which those movements carry information about emotion to someone else."
