Some of us give away more than others in our expressions, but there’s no doubt the human face can convey a wide range of emotions.
New research digs into just how flexible and versatile our facial features can be when it comes to expressing how we’re feeling.
Using a sophisticated, bespoke graphics software platform, the researchers showed 100 participants a variety of computer-generated faces, and asked them to record the emotions that these faces expressed.
It turns out that the emotional expression written into our faces is as diverse and rich as the faces themselves, the study revealed: a single facial movement, like a raised brow or a gaping mouth, can convey two different types of information at once.
In other words, something like a wrinkled nose can signal both broad information (negativity, for example) and a specific emotional category (sadness, for example). Signals that carry both kinds of information at once are known as multiplexed.
“This research addresses the fundamental question of how facial expressions achieve the complex signaling task of communicating emotion messages,” says Rachael Jack, a professor of computational social cognition at the University of Glasgow in the UK.
“Using computer generated faces combined with subjective human perceptual responses and novel analytical tools, we show that facial expressions can communicate complex combinations of emotion information via multiplexed facial signals.”
Participants placed the various faces they saw into one of the six categories traditionally used to group emotions: happiness, surprise, fear, disgust, anger, and sadness. They also had to say whether a broader dimension was expressed: positivity or negativity, or how 'activated' the face looked (the difference between calm and excited, for instance).
Facial movements (called action units, or AUs) that combined both a category and a dimension were then labeled as multiplexed. Out of 42 different AUs, from the closing of the eyes to the dropping of the jaw, 26 got the multiplexed badge.
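The labeling rule described above can be sketched in a few lines of code. This is a minimal illustration, not the authors' actual analysis: the AU names and the signals attached to them below are invented for the example, and the real study derived these associations from participants' perceptual responses.

```python
# Hypothetical sketch of the "multiplexed" labeling rule: an action unit
# (AU) is multiplexed if it signals BOTH a specific emotion category and
# a broad dimension. All data here is illustrative, not from the study.

# Each AU maps to the category and dimension signals associated with it
# (None = no reliable association).
au_signals = {
    "brow raiser":   {"category": "surprise", "dimension": "activated"},
    "nose wrinkler": {"category": "disgust",  "dimension": "negative"},
    "jaw drop":      {"category": "surprise", "dimension": None},
    "eye closure":   {"category": None,       "dimension": "calm"},
}

def is_multiplexed(signals):
    """An AU is multiplexed if it carries both kinds of information."""
    return signals["category"] is not None and signals["dimension"] is not None

multiplexed = [au for au, s in au_signals.items() if is_multiplexed(s)]
print(multiplexed)  # the AUs that signal both a category and a dimension
```

Under this toy rule, only the first two AUs qualify, mirroring how 26 of the study's 42 AUs met the dual-signal criterion.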
“Facial expressions can jointly represent specific emotion categories and broad dimensions to perceivers via multiplexed facial signal components,” the researchers write in their paper.
This new study opens up several interesting questions – not least how we perceive emotions as they’re expressed through the faces of other people. Do dimensions like negativity then help us to recognize emotion categories like sadness, for example?
The researchers are hoping to include some kind of brain scanning in future studies in order to look more closely at this. After all, we’ve been relying on decoding the expressions of other people for as long as our species has been around.
The same research results could prove useful far into the future as well: in robotics and virtual reality, it's crucial that artificial faces are designed to convey the sentiments they're supposed to.
“These results advance our fundamental understanding of the system of human communication with direct implications for the design of socially interactive AI, including social robots and digital avatars to enhance their social signaling capabilities,” says Jack.
The research has been published in Current Biology.