Research report
Neural substrates of facial emotion processing using fMRI
Introduction
The ability to decode emotional information from the face is critical to human communication. Clinical and experimental research in humans and in non-human primates suggests right hemisphere specialization [10], [11], [12], [22], [26], [35], [47], [53], [54]. Electrical stimulation, lesion, and microelectrode recording studies have identified several intrahemispheric brain regions involved in facial affect processing. These regions include the right temporal lobe [27], [55], [63], the right or left basal ganglia and anterior temporal lobe [16], the right mesial occipital and right inferior parietal regions [2], the right somatosensory cortex [1], and the amygdala [3], [4], [17], [72].
Neuroimaging studies of facial emotion perception suggest that there may be two parallel pathways, one for conscious processing of stimuli and one for unconscious processing of simple and associative stimulus features [45], [52]. Several imaging studies have implicated the amygdala in the processing of both seen and unseen fearful faces in humans [6], [14], [49], [50], [51], [58], [59], [70]. Morris et al. [52] identified a subcortical pathway from the midbrain and thalamus to the right amygdala that is involved in the processing of unseen visual events (e.g. fear-conditioned faces). Their data suggest that fusiform and orbitofrontal regions may be components of a pathway subserving conscious identification. In a more recent functional magnetic resonance imaging (fMRI) study, Critchley et al. [20] found that explicit processing of facial expressions of emotion activated the temporal cortex, whereas implicit processing evoked significantly greater activity in the amygdala.
While investigators are beginning to map out the pathways engaged in conscious and unconscious emotional processing, little is known about how these pathways compare across different emotions, especially the pathways involved in conscious processing. Many studies, such as those by Blair et al. [8], Morris et al. [49], Phillips et al. [58], and Sprengelmeyer et al. [67], required subjects to perform a gender discrimination task with a button-press response when emotional faces were presented. Thus, attention was directed away from the emotions, and the regions activated are likely to differ from those engaged by fully conscious emotional face processing [20].
Only a few neuroimaging studies have examined the response to more than one facial emotion without using a backward masking paradigm or a task unrelated to emotional face perception. Using fMRI, Phillips et al. [57] found that happy face processing was associated with increased activity in the left anterior cingulate, bilateral medial frontal region, bilateral posterior cingulate, left supramarginal gyrus, and right putamen. They did not find any areas to be significantly more active in comparisons between sad and neutral faces. Also using fMRI, Breiter et al. [14] found bilateral activation of the amygdala and fusiform gyrus associated with viewing fearful versus neutral faces. They also found activation of the left amygdala and bilateral fusiform in comparisons of happy and neutral faces. In a 133Xe inhalation study, Gur et al. [33] found that happy and sad emotional face discrimination tasks activated the right parietal lobe more than did an age discrimination task. Furthermore, the happy discrimination task activated the left frontal region relative to the sad discrimination task. In sum, the literature identifying brain regions engaged in the conscious processing of different facial emotions is limited and inconclusive.
Our fMRI study was designed to investigate explicit processing of happy, sad, angry, frightened, and neutral faces by normal volunteers. Subjects were instructed to ‘concentrate on each person’s expression’ when viewing a series of emotionally expressive faces. We did not require a behavioral response such as a button press because we did not wish to force subjects to engage in an artificial decision making task [60]. For each subject, four separate runs corresponding to the four different emotions were presented. We predicted that activations observed during the emotion conditions, over and above those seen during the neutral condition, would occur in the left and right fusiform gyri and in the left and right amygdalae and that these effects would be greater in the right hemisphere than in the left. We also sought to compare and contrast the brain regions involved in the perception of the four different emotions, although the existing literature did not permit us to make specific predictions in this regard.
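The prediction above amounts to a voxelwise contrast: signal during an emotion condition over and above signal during the neutral condition, evaluated in each subject and tested across the group. A minimal sketch of that logic, using a paired t-test per voxel, is shown below; the function name, array shapes, and data are illustrative assumptions, not the study's actual analysis pipeline.

```python
# Hypothetical sketch of an emotion-vs-neutral voxelwise contrast.
# Shapes and data are illustrative, not taken from the study.
import numpy as np

def emotion_vs_neutral_t(emotion, neutral):
    """Paired t statistic per voxel.

    emotion, neutral: arrays of shape (n_subjects, n_voxels) holding each
    subject's mean signal in that condition.
    """
    diff = emotion - neutral                       # per-subject contrast
    n = diff.shape[0]
    # t = mean difference / standard error of the difference
    return diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(n))

rng = np.random.default_rng(1)
neutral = rng.normal(100.0, 1.0, size=(21, 1000))           # 21 subjects, toy voxels
emotion = neutral + rng.normal(0.2, 1.0, size=(21, 1000))   # small added signal
t_map = emotion_vs_neutral_t(emotion, neutral)
active = t_map > 3.0    # simple threshold for "activated" voxels
```

In practice the study's thresholds and statistics would come from a full fMRI analysis package (e.g. AFNI, cited in the references), not a bare t-map like this.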
Section snippets
Subjects
Twenty-one healthy adult volunteers (11 men, 10 women; aged 18–45; mean age 21.6 years) gave informed consent under an institutionally approved protocol. All participants were right-handed and reported no first-degree left-handed biological relatives. Exclusion criteria were current cigarette smoking, visual acuity poorer than 20/25, and medical conditions that could affect the central nervous system as determined by a board-certified neurologist (CDS).
Visual stimuli
Stimuli consisted of black and white still
Behavioral data and subject debriefing
All subjects performed an emotion labeling task immediately after the second of two fMRI scanning sessions. Every face that had been shown during the four emotion runs was singly presented. Subjects were asked to categorize each face as neutral, happy, sad, angry, afraid, ‘other,’ or ‘don’t know’ and to rate the intensity of each expression on a five-point scale. Three analysis-of-variance F-tests based on data from all but the neutral faces showed that emotion category had a significant effect
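The debriefing analysis described above tests whether emotion category affects the intensity ratings, which is the structure of a one-way ANOVA. A minimal sketch follows; the function, toy rating values, and group sizes are illustrative assumptions, not the study's data or exact tests.

```python
# Hypothetical sketch of the debriefing analysis: a one-way ANOVA F statistic
# testing whether emotion category affects intensity ratings.
import numpy as np

def one_way_anova(groups):
    """F statistic: between-group mean square / within-group mean square."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    k = len(groups)                 # number of emotion categories
    n = len(all_vals)               # total number of ratings
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy ratings for three categories (placeholders, not the study's data)
f = one_way_anova([np.array([1.0, 2, 3]),
                   np.array([2.0, 3, 4]),
                   np.array([5.0, 6, 7])])
# f == 13.0 for these toy groups
```

The p-value for the F statistic would then be read from an F distribution with (k − 1, n − k) degrees of freedom, e.g. via `scipy.stats.f.sf`.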
Discussion
This study was undertaken to identify the brain regions that underlie the conscious perception of four basic facial emotions. To determine this, we first sought to distinguish regions associated with the processing of emotionally neutral faces. We found that the following areas activated above threshold in comparisons of neutral faces with scrambled images: the right and left fusiform gyri, the right and left amygdalae/entorhinal cortices, the right superior temporal sulcus, the right inferior
Acknowledgements
These studies were supported by NSF Grant IBN-9604231. We thank Robin Avison, Sherry C. Williams, Aileen Wiglesworth, Xia Wang, Derek Mace, and Marta Mendiondo for their technical assistance.
References (71)
- et al., Functional magnetic resonance imaging of facial affect recognition in children and adolescents, J. Am. Acad. Child Adolesc. Psychiatry (1999)
- et al., Dissociation between the processing of affective and nonaffective faces in patients with unilateral brain lesions, Brain Cogn. (1985)
- et al., Response and habituation of the human amygdala during visual processing of facial expression, Neuron (1996)
- et al., Lesion localization in acquired deficits of emotional expression and comprehension, Brain Cogn. (1990)
- AFNI: Software for analysis and visualization of functional magnetic resonance neuroimages, Comput. Biomed. Res. (1996)
- et al., Affective neuroscience: the emergence of a discipline, Curr. Opin. Neurobiol. (1995)
- et al., Recognition and discrimination of emotional faces and pictures, Brain Lang. (1980)
- et al., Neural activation during covert processing of positive emotional facial expressions, Neuroimage (1996)
- Perceptual and conceptual organization of facial emotions: hemispheric differences, Brain Cogn. (1984)
- et al., Gender differences during transient self-induced sadness or happiness, Biol. Psychiatry (1996)