Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging

https://doi.org/10.1016/j.neuropsychologia.2006.06.003

Abstract

Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly by fearful expressions, but also by other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on fusiform activity; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces. Other electrical components arise at later latencies and involve more sustained activities, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with activity distributed in time and space. Moreover, although traditional models in cognitive neuropsychology have often considered that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.

Introduction

Faces are multi-dimensional stimuli conveying many important signals simultaneously, each with a complex social and motivational significance. Faces provide not only distinctive information about a person's identity, gender, or age, but also more subtle signals related to emotion, trustworthiness, and attractiveness, as well as the gaze direction or intentions of other people. However, little is still known about how these various dimensions are coded and how they are integrated into a single face percept. Results from haemodynamic and electrophysiological brain imaging in humans have begun to uncover the distributed nature of neural activity arising during the perception of faces and facial expressions, and to reveal important interactions taking place between regions in this network (see Gobbini & Haxby, 2007; Winston, O’Doherty, Kilner, Perrett, & Dolan, 2007; Puce, Epling, Thompson, & Carrick, 2007). Here we will provide an overview of recent studies specifically concerning the interaction between face and emotion processing.

A traditional view in cognitive neuropsychology has considered that different aspects of face processing involve different specialized parallel processing routes (Bruce & Young, 1986; Burton, Young, Bruce, Johnston, & Ellis, 1991; Hancock, Bruce, & Burton, 2000). This model was primarily derived from dissociations observed in brain-damaged patients (e.g., Adolphs, Tranel, Damasio, & Damasio, 1995; Bowers, Bauer, Coslett, & Heilman, 1985; Sergent & Villemure, 1989) and behavioral measures in healthy subjects (Bruce, 1986; Bruce & Young, 1986). According to the influential cognitive model proposed by Bruce and Young (1986), facial expression and facial identity are processed along two separate pathways after an initial stage of visual structural encoding, such that expression can be processed regardless of identity, and vice versa (Bauer, 1984; Breen, Caine, & Coltheart, 2000). Indeed, in some behavioral experiments, the accuracy and speed of expression categorization tasks are unaffected by the familiarity of the faces (Young, McWeeny, Hay, & Ellis, 1986), consistent with independent processing routes for identity and expression. However, in other experiments, expression judgments can be modulated by face identity and familiarity, even though identity judgments are independent of expression (Schweinberger & Soukup, 1998), suggesting asymmetric dependencies between these processes. Similarly, learning new faces is facilitated when these unfamiliar faces are initially seen with different expressions (Baudouin, Gilibert, Sansone, & Tiberghien, 2000; Sansone & Tiberghien, 1994), again suggesting some interactions between emotion and identity processing in some circumstances. On the other hand, prosopagnosic patients with lesions in associative visual cortices can still recognize facial expressions (Damasio, Damasio, & Van Hoesen, 1982; Damasio, Tranel, & Damasio, 1990; Sergent & Villemure, 1989), whereas deficits in expression recognition can occur in patients without prosopagnosia, and often seem selective for certain categories of emotion depending on the site of the brain lesion, e.g., fear after amygdala lesions (Adolphs et al., 1995), disgust after insula damage (Calder, Keane, Manes, Antoun, & Young, 2000), or anger after ventral basal ganglia lesions (Calder, Keane, Lawrence, & Manes, 2004). These neuropsychological data provide compelling evidence for specialized neural systems underlying explicit recognition of facial expressions.

More recently, functional brain-imaging studies have delineated an extensive neural network of areas implicated in face processing in humans. These include not only face-selective regions in lateral fusiform gyrus (Kanwisher, McDermott, & Chun, 1997) and inferior occipital gyrus (Hoffman & Haxby, 2000), but also other regions in the superior temporal sulcus (STS) and anterior temporal pole (Haxby, Hoffman, & Gobbini, 2000; Ishai, Ungerleider, Martin, & Haxby, 2000; Sergent, Ohta, & MacDonald, 1992), as well as several areas traditionally related to the limbic system such as the amygdala, orbitofrontal cortex, and retrosplenial or posterior cingulate regions (Gorno-Tempini et al., 1998; Ishai, Pessoa, Bikle, & Ungerleider, 2004; Shah et al., 2001; Gobbini & Haxby, 2007). In this distributed network, different regions have been associated with dissociable abilities, in keeping with the traditional cognitive models. For instance, distinct cortical regions in fusiform and superior temporal cortex may subserve the recognition of invariant (e.g., identity) versus changeable aspects (e.g., expression) of faces, respectively (see Haxby et al., 2001). Likewise, in the temporal domain, electrophysiological studies using EEG or MEG have suggested that face processing activates specialized neural systems in inferior temporal cortex within 200 ms post-stimulus onset, as typically indexed by the N170 component recorded over posterior scalp electrodes (Bentin, Allison, Puce, Perez, & McCarthy, 1996; Carmel & Bentin, 2002; George, Evans, Fiori, Davidoff, & Renault, 1996; Schweinberger, Pickering, Jentzsch, Burton, & Kaufmann, 2002; or the M170 with MEG, see Liu, Harris, & Kanwisher, 2002) as well as by the N200 measured intracranially (Allison, Ginter et al., 1994; McCarthy, Puce, Belger, & Allison, 1999; Seeck et al., 2001), whereas distinct components with different latencies and topographies seem more specifically sensitive to expressions (e.g., Krolak-Salmon, Fischer, Vighetto, & Mauguiere, 2001; Munte et al., 1998).

However, the exact role and dynamics of these different brain areas and cognitive processes are still far from settled, although increasing evidence from imaging studies suggests that several subregions within the distributed face network act in concert and in fact influence each other in an interactive manner, rather than truly operating independently of one another. Yet, the functional consequences of such interactions are just beginning to be understood. Here we review the neuroanatomical systems underlying the interactions of face perception with emotion processing and attention, focusing mainly on fearful expressions, since this emotion category has been by far the most extensively studied in recent years and is probably the most easily corroborated by corresponding animal studies on fear processing (Davis & Whalen, 2001; LeDoux, 1996).

The case of emotional expression is also worth considering because facial expressions constitute important social and biologically meaningful signals (Ohman & Mineka, 2001), with different content corresponding to six basic emotions (Ekman & Friesen, 1976) plus several other secondary categories (Calder, Burton, Miller, Young, & Akamatsu, 2001; Eisenberg, 2000), all playing an important role in guiding interpersonal exchanges and behavior during social interactions. Thus, emotional signals perceived from a face are likely to influence how an unknown person will be approached and later remembered; conversely, previous familiarity with a person might certainly influence how his or her facial expressions are perceived and interpreted. Moreover, interactions between emotion and face perception not only constitute a central issue for understanding the architecture of social functions in the human brain (Adolphs, 2003), but also provide important insights into more general mechanisms underlying reciprocal links between emotion and cognitive processes (see Drevets & Raichle, 1998). A better understanding of how emotion can modulate perception and cognition should thus help to go beyond strictly modular views of neural architecture and information processing.

In this paper, we will review recent findings showing that face processing in visual cortex and other brain regions is modulated by the affective significance of faces, considering first results from PET and fMRI studies, and then results from EEG and MEG. However, although this work has now provided a considerable amount of data on the spatial extent and temporal dynamics of face and affective processing, it remains relatively unclear which aspects may involve mechanisms strictly specific to faces (Bruce & Young, 1986) and which may reflect more general mechanisms governing emotion-cognition interactions (Drevets & Raichle, 1998). In this perspective, the modulation of face-selective responses in visual cortex by affective expressions might correspond to a fundamental regulatory role of basic emotion signals, especially fear and threat, but also to more complex effects associated with social appraisal and attention to salient visual stimuli (e.g., see Schultz et al., 2003; Singer, Kiebel, Winston, Dolan, & Frith, 2004).

Modulation of face processing by emotion

Classic neurophysiological studies recording single-cell responses in the monkey reported that different neurons in the superior temporal sulcus were selective for the identity or the expression of faces (Hasselmo, Rolls, & Baylis, 1989), with distinct responses to different types and intensities of expression. Responses were generally found to be stronger to threatening than to other faces. More recently, Sugase, Yamane, Ueno, and Kawano (1999) showed that the activity of face-selective neurons

Sources of emotional modulation during face processing

The existence of anatomical connections projecting directly from the amygdala to visual cortical regions (Fig. 1B) has long been suspected to play an important role in modulating sensory responses to emotional stimuli (Amaral et al., 2003; LeDoux, 1996). Indirect support was first provided by an early PET study (Morris, Friston et al., 1998) showing a significant correlation between the enhancement of fusiform responses to fearful faces and the magnitude of amygdala activation by fearful vs

Visual pathways for emotional face processing

A number of findings converge to indicate that emotional responses to fearful faces in the amygdala may persist under some conditions of inattention or unawareness, such as when faces are shown at ignored locations (Anderson et al., 2003; Vuilleumier, Armony et al., 2001), masked (Morris, Ohman et al., 1998; Whalen et al., 1998), suppressed by binocular rivalry (Pasley et al., 2004; Williams et al., 2004), or presented on the neglected side in parietal patients (Vuilleumier et al., 2002) or in

The speed of emotional face perception

The notion that emotional face perception may involve distinct stages of processing with important feedback interactions between distant brain areas necessarily implies that such processing should not only be distributed anatomically within large-scale neural networks but also spread over different periods of time. Several issues concerning the temporal course of face and emotion perception have been intensively investigated by EEG and MEG in humans in recent years.

What electrophysiological

Conclusions and future directions

Emotional face perception is a complex visual process, involving a distributed brain network (Haxby et al., 2000; Gobbini & Haxby, 2007) in which distant areas make selective contributions with distinct time-courses and exert reciprocal influences on each other at different latencies. Here, we have highlighted recent data from neuroimaging studies mainly concerning the processing of fearful expressions, for which a large body of converging evidence has now begun to provide a fairly detailed

Acknowledgements

The authors’ work is supported by a grant from the Swiss National Fund (632.065935). We thank J. Armony, R. Dolan, J. Driver, S. Schwartz, J. Winston, M. Richardson, N. George, D. Sander, and D. Grandjean for many helpful discussions and enjoyable collaborations.

References (199)

  • R. Adolphs. Cognitive neuroscience of human social behaviour. Nature Reviews Neuroscience (2003)
  • R. Adolphs et al. Fear and the human amygdala. The Journal of Neuroscience (1995)
  • R. Adolphs et al. A mechanism for impaired fear recognition after amygdala damage. Nature (2005)
  • T. Allison et al. Face recognition in human extrastriate cortex. Journal of Neurophysiology (1994)
  • T. Allison et al. Human extrastriate visual cortex and the perception of faces, words, numbers, and colors. Cerebral Cortex (1994)
  • S. Anders et al. Parietal somatosensory association cortex mediates affective blindsight. Nature Neuroscience (2004)
  • A.K. Anderson et al. Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature (2001)
  • A.K. Anderson et al. Neural correlates of the automatic processing of threat facial signals. The Journal of Neuroscience (2003)
  • V. Ashley et al. Effects of orbital frontal cortex lesions on ERPs elicited by emotional faces. Society for Neuroscience Abstracts (2002)
  • V. Ashley et al. Time course and specificity of event-related potentials to emotional expressions. Neuroreport (2004)
  • J.M. Baas et al. Threat-induced cortical processing and startle potentiation. Neuroreport (2002)
  • M. Bar. A cortical mechanism for triggering top-down facilitation in visual object recognition. Journal of Cognitive Neuroscience (2003)
  • J.Y. Baudouin et al. When the smile is a cue to familiarity. Memory (2000)
  • S. Bentin et al. Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience (1996)
  • J. Bullier. Integrated model of visual processing. Brain Research Reviews (2001)
  • A.M. Burton et al. Understanding covert recognition. Cognition (1991)
  • A.J. Calder et al. A principal component analysis of facial expressions. Vision Research (2001)
  • S. Campanella et al. Discrimination of emotional facial expressions in a visual oddball task: An ERP study. Biological Psychology (2002)
  • D. Carmel et al. Domain specificity versus expertise: Factors influencing distinct processing of faces. Cognition (2002)
  • L. Carretie et al. An ERP study on the specificity of facial expression processing. International Journal of Psychophysiology (1995)
  • A. Cowey et al. The neurobiology of blindsight. Trends in Neurosciences (1991)
  • B. de Gelder et al. Affective blindsight: Are we blindly led by emotions? Response to Heywood and Kentridge. Trends in Cognitive Sciences (2000)
  • J. Driver et al. Perceptual awareness and its loss in unilateral neglect and extinction. Cognition (2001)
  • E. Eger et al. Rapid extraction of emotional expression: Evidence from evoked potential fields during brief presentation of face stimuli. Neuropsychologia (2003)
  • M. Eimer et al. Event-related brain potential correlates of emotional face processing. Neuropsychologia (2007)
  • H. Fischer et al. Brain habituation during repeated exposure to fearful and neutral faces: A functional MRI study. Brain Research Bulletin (2003)
  • I. Fried et al. Single neuron activity in human hippocampus and amygdala during recognition of faces and objects. Neuron (1997)
  • C.K. Friesen et al. Abrupt onsets and gaze direction cues trigger independent reflexive attentional effects. Cognition (2003)
  • N. George et al. Brain events related to normal and moderately scrambled faces. Cognitive Brain Research (1996)
  • N. George et al. Seen gaze-direction modulates fusiform activity and its coupling with other brain areas during face processing. Neuroimage (2001)
  • M.I. Gobbini et al. Neural systems for recognition of familiar faces. Neuropsychologia (2007)
  • E. Halgren et al. Spatio-temporal stages in face and word processing. 2: Depth-recorded potentials in the human frontal and Rolandic cortices. Journal of Physiology, Paris (1994)
  • P.J. Hancock et al. Recognition of unfamiliar faces. Trends in Cognitive Sciences (2000)
  • M.E. Hasselmo et al. The role of expression and identity in the face-selective responses of neurons in the temporal visual cortex of the monkey. Behavioural Brain Research (1989)
  • J.V. Haxby et al. The distributed human neural system for face perception. Trends in Cognitive Sciences (2000)
  • M.J. Herrmann et al. Face-specific event-related potential in humans is independent from facial expression. International Journal of Psychophysiology (2002)
  • P.C. Holland et al. Amygdala circuitry in attentional and representational processes. Trends in Cognitive Sciences (1999)
  • A. Holmes et al. The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research (2003)
  • S. Kastner et al. Increased activity in human visual cortex during directed attention in the absence of visual stimulation. Neuron (1999)
  • L. Kilpatrick et al. Amygdala modulation of parahippocampal and frontal regions during emotionally influenced memory storage. Neuroimage (2003)