How neural circuits orchestrate facial expressions

When a baby smiles at you, it's almost impossible not to smile back. This spontaneous reaction to a facial expression is part of the back-and-forth that allows us to understand each other's emotions and mental states.

Faces are so important to social communication that we've evolved specialized brain cells just to recognize them, as Rockefeller University's Winrich Freiwald has discovered. It's just one of a suite of groundbreaking findings the scientist has made in the past decade that have greatly advanced the neuroscience of face perception.

Now he and his team in the Laboratory of Neural Systems have turned their attention to the counterpart of face perception: facial expression. How neural circuits in the brain and the muscles of the face work together to form, for example, a smile has remained largely unknown, until now. As the team reports in Science, Freiwald's lab has discovered a facial motor network and the neural mechanisms that keep it operating.

In this first systematic study of the neural mechanisms of facial movement control, they found that both lower-level and higher-level brain regions are involved in encoding different types of facial gestures, contrary to long-held assumptions. It had long been thought that these activities were segregated, with emotional expressions (such as returning a smile) originating in the medial frontal lobe and voluntary actions (such as eating or speaking) in the lateral frontal lobe.

"We had a good understanding of how facial gestures are received, but now we have a much better understanding of how they're generated," says Freiwald, whose research is supported by the Price Family Center for the Social Brain at Rockefeller.

"We found that all regions participated in all types of facial gestures but operate on their own distinct timescales, suggesting that each region is uniquely suited to the 'job' it performs," says co-lead author Geena Ianni, a former member of Freiwald's lab and a neurology resident at the Hospital of the University of Pennsylvania.

Where facial expressions come from

Our need to communicate through facial expressions runs deep, all the way down to the brain stem. It's there that the so-called facial nucleus is located, which houses the motoneurons that control the facial muscles. The facial nucleus, in turn, receives input from multiple cortical regions, including different areas of the frontal cortex, which contributes to both motor function and complex thinking.

Neuroanatomical work has demonstrated that there are multiple regions in the cortex that directly access the muscles of facial expression (a unique feature of primates), but how each one specifically contributes has remained largely unknown. Studies of people with brain lesions suggest different regions may code for different facial movements. When people have damage to the lateral frontal cortex, for example, they lose the ability to make voluntary movements, such as speaking or eating, while lesions in the medial frontal cortex lead to the inability to spontaneously express an emotion, such as returning a smile.

"They don't lose the ability to move their muscles, just the ability to do it in a particular context."

Winrich Freiwald, Rockefeller University

"We wondered, could these regions make unique contributions to facial expressions? It turns out that no one had really investigated this," Ianni says.

Adopting an innovative approach designed by the Freiwald lab, the researchers used an fMRI scanner to visualize the brain activity of macaque monkeys while the animals produced facial expressions. In doing so, they located three cortical areas that directly access facial musculature: the medially located cingulate motor cortex and the laterally located primary motor and premotor cortices, as well as the somatosensory cortices.
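
For readers curious what this kind of localization looks like computationally, here is a minimal sketch of a general linear model contrast of the sort widely used to find movement-related voxels in fMRI data. Everything in it (the array shapes, the boxcar task design, the threshold) is an illustrative assumption rather than the lab's actual pipeline, which would also convolve regressors with a hemodynamic response function and correct for multiple comparisons.

```python
import numpy as np

# Toy dimensions: T fMRI volumes, V voxels (illustrative only).
T, V = 200, 5000
rng = np.random.default_rng(0)

# Boxcar regressor: 1 during blocks of facial movement, 0 at rest.
# (A real analysis would convolve this with a hemodynamic response function.)
movement = np.zeros(T)
movement[20:40] = movement[80:100] = movement[140:160] = 1.0

# Design matrix: movement regressor plus an intercept column.
X = np.column_stack([movement, np.ones(T)])

# Synthetic BOLD data; the first 50 voxels respond to facial movement.
Y = rng.normal(size=(T, V))
Y[:, :50] += 1.5 * movement[:, None]

# Ordinary least squares fit of the GLM, one regression per voxel.
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

# t-statistic for the "movement > rest" contrast at every voxel.
dof = T - X.shape[1]
resid = Y - X @ beta
sigma2 = (resid ** 2).sum(axis=0) / dof
c = np.array([1.0, 0.0])
var_c = c @ np.linalg.inv(X.T @ X) @ c
t = (c @ beta) / np.sqrt(sigma2 * var_c)

print("voxels passing t > 5:", int((t > 5).sum()))
```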

Mapping the network

Using these methods, they were able to map out a facial motor network spanning different regions of the frontal lobe (the lateral primary motor cortex, ventral premotor cortex, and medial cingulate motor cortex) and the primary somatosensory cortex in the parietal lobe.

Using this targeted map, the researchers were then able to record neural activity in each cortical region while the monkeys produced facial expressions. The researchers studied three types of facial movements: threatening, lipsmacking, and chewing. A threatening look from a macaque involves staring straight ahead with an open jaw and bared teeth, while lipsmacking involves rapidly puckering the lips and flattening the ears against the skull. These are both socially meaningful, contextually specific facial gestures that macaques use to navigate social interactions. Chewing, by contrast, is neither social nor emotional, but voluntary.

The researchers used a variety of dynamic stimuli to elicit these expressions in the lab, including direct interaction with other macaques, videos of other macaques, and artificial digital avatars controlled by the researchers themselves.

They were able to link neural activity from these regions to the coordinated movement of specific parts of the face: the eyes and eyebrows; the upper and lower mouth; and the lower face and ears.
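
As a rough illustration of how such a link can be made quantitative, the sketch below fits a ridge regression from simulated population firing rates to a single kinematic trace for one facial part and scores it on held-out data. The data shapes, the Poisson firing rates, and the "lower-lip opening" trace are hypothetical; the study's actual encoding models are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: firing rates of N neurons over T time bins, and a
# 1-D kinematic trace for one facial part (say, lower-lip opening).
T, N = 1000, 60
rates = rng.poisson(5.0, size=(T, N)).astype(float)
true_w = 0.1 * rng.normal(size=N)
kinematics = rates @ true_w + rng.normal(scale=0.5, size=T)

# Split into train and test halves.
Xtr, Xte = rates[:800], rates[800:]
ytr, yte = kinematics[:800], kinematics[800:]

# Ridge regression, closed form: w = (X'X + lam*I)^-1 X'y.
lam = 1.0
w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(N), Xtr.T @ ytr)

# Held-out fit quality (R^2): how well neural activity predicts movement.
pred = Xte @ w
ss_res = ((yte - pred) ** 2).sum()
ss_tot = ((yte - yte.mean()) ** 2).sum()
print(f"held-out R^2: {1 - ss_res / ss_tot:.2f}")
```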

The researchers found that both higher and lower cortical regions were involved in producing both emotional and voluntary facial expressions. However, not all of that activity was the same: The neurons in each region operated at a distinct tempo when producing facial gestures.
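
One standard way to test whether a region "participates" in a gesture type is to ask whether a simple decoder can read the gesture off that region's activity. The sketch below does this on synthetic data using the study's three gesture labels; the simulated rates, the linear classifier, and the cross-validation scheme are illustrative choices, not the paper's analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Hypothetical trials x neurons firing-rate matrix for one region, with
# labels 0 = threat, 1 = lipsmack, 2 = chew.
n_trials, n_neurons = 300, 40
labels = np.repeat([0, 1, 2], n_trials // 3)
rates = rng.normal(size=(n_trials, n_neurons))
rates += 0.5 * labels[:, None] * rng.normal(size=n_neurons)  # gesture-dependent shift

# If the region carries gesture information, a linear decoder should beat
# the 1/3 chance level on held-out trials.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, rates, labels, cv=5)
print(f"decoding accuracy: {acc.mean():.2f} (chance ~ 0.33)")
```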

"Lateral regions like the primary motor cortex housed fast neural dynamics that changed on the order of milliseconds, while medial regions like the cingulate cortex housed slow, stable neural dynamics that lasted for much longer," says Ianni.

In related work based on the same data, the team recently documented in PNAS that the different cortical regions governing facial movement work together as a single interconnected sensorimotor network, adjusting their coordination based on the movement being produced.

"This suggests facial motor control is dynamic and flexible rather than routed through fixed, independent pathways," says Yuriria Vázquez, co-lead author and a former postdoc in Freiwald's lab.

"This is contrary to the standard view that they work in parallel and separate action," Freiwald adds. "That really underscores the connectivity of the facial motor network."

Better brain-machine interfaces

Now that Freiwald's lab has gained significant insights into both facial perception and facial expression in separate experiments, he next hopes to study these complementary elements of social communication simultaneously.

"We think that will help us better understand emotions," he says. "There's a big debate in this field about how motor signals relate to emotions internally, but we think that if you have perception on one side and a motor response on the other, emotions somehow happen in between. We would like to find the areas controlling emotional states-we have ideas about where they are-and then understand how they work together with motor areas to generate different kinds of behaviors."

Vázquez sees two possible future avenues of research that could build on their findings. The first involves understanding how dynamic social cues (faces, eye gaze), internal states, and reward influence the facial motor system. These insights would be crucial for explaining how decisions about facial expression production are made. The second relates to using this integrated network for clinical applications.

The findings may also help improve brain-machine interfaces. "As with our approach, those devices also involve implanting electrodes to decode brain signals, and then they translate that information into action, such as moving a limb or a robotic arm," Freiwald says. "Communication has proven far more difficult to decode. And because of the importance of facial expression to communication, it will be very useful to have devices that can decode and translate these kinds of facial signals."
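
Stripped to its skeleton, such a device is a loop that bins neural features and maps each bin to a communicative command. The toy sketch below shows that loop with a random linear decoder; the command set, feature size, and weights are invented for illustration and bear no relation to any real interface.

```python
import numpy as np

# Hypothetical command set and decoder weights "learned offline".
COMMANDS = ["neutral", "smile", "speech-shape"]
rng = np.random.default_rng(5)
W = rng.normal(size=(len(COMMANDS), 64))  # commands x neural features

def decode(features):
    """Map one bin of neural features to the highest-scoring command."""
    return COMMANDS[int(np.argmax(W @ features))]

# Simulated stream of binned neural features (e.g., one vector per 20 ms).
for step in range(3):
    features = rng.normal(size=64)
    print(f"bin {step}: decoded command -> {decode(features)}")
```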

Adds Ianni, "I hope our work moves the field, even the tiniest bit, towards more naturalistic and rich artificial communication designs that will improve the lives of patients after brain injury."

Journal reference:

Ianni, G. R., et al. (2026) Facial gestures are enacted through a cortical hierarchy of dynamic and stable codes. Science. DOI: 10.1126/science.aea0890. https://www.science.org/doi/10.1126/science.aea0890
