Identification of emotional intonation evaluated by fMRI
Introduction
During speech production, information about a speaker's emotional state is predominantly conveyed by the modulation of intonation (affective prosody). At the perceptual level, emotional tone is characterized by variations of pitch, syllable durations, loudness, and voice quality across utterances (suprasegmental features) imposed upon segmental verbal information encoded in phonetic/phonological units (Ackermann et al., 1993, Bachorowski and Owren, 2003, Banse and Scherer, 1996, Cutler et al., 1997, Sidtis and Van-Lancker-Sidtis, 2003). As concerns the cerebral topography of prosody processing, observations in patients suffering from focal brain lesions indicate that the well-established left-sided dominance for language comprehension does not extend to perception of emotional intonation (Adolphs, 2002, Baum and Pell, 1999, Borod et al., 2001, Borod et al., 2002, Charbonneau et al., 2003, Pell and Baum, 1997). According to the neuroanatomical model proposed by Ross (1981), prosodic information is processed within distinct right-sided perisylvian regions that are organized in complete analogy to left-sided language areas. Expression of affective prosody is thus believed to rely on the homologue of Broca's area within the right inferior frontal cortex, whereas comprehension of intonational information is presumed to be bound to the homologue of Wernicke's area within the right superior temporal region. However, the empirical evidence for this model provided by Ross (1981) was based on only a few case reports, and more systematic investigations yielded divergent results. Nevertheless, as concerns comprehension of speech melodies, the findings of the majority of lesion studies seem to be compatible with the assumption that perceptual prosodic functions are predominantly bound to the right posterior perisylvian cortex (Borod et al., 2002, Darby, 1993, Heilman et al., 1984, Starkstein et al., 1994).
In addition, various clinical examinations indicate that a widespread network of—partially bilateral—cerebral regions including the frontoparietal cortex (Adolphs et al., 2002, Breitenstein et al., 1998) and the basal ganglia (Breitenstein et al., 1998, Breitenstein et al., 2001, Cancelliere and Kertesz, 1990, Pell and Leonard, 2003) contributes to comprehension of emotional intonation. In line with these findings, neuroimaging studies have, as a rule, yielded rightward lateralization of hemodynamic activation within posterior temporal regions (Buchanan et al., 2000, Kotz et al., 2003, Mitchell et al., 2003, Wildgruber et al., 2002) and revealed additional—partially bilateral—responses within the frontal cortex (Buchanan et al., 2000, George et al., 1996, Imaizumi et al., 1997, Kotz et al., 2003, Wildgruber et al., 2002, Wildgruber et al., 2004), the anterior insula (Imaizumi et al., 1997, Wildgruber et al., 2002, Wildgruber et al., 2004), and the basal ganglia (Kotz et al., 2003) during recognition of affective intonation. Considerable differences in lateralization and exact localization of cerebral responses, however, have not allowed for an indisputable identification of the neural substrate of prosody processing. Presumably, the observed discrepancies are due to differences in methodology, including stimulus selection, task conditions, control conditions, and imaging modalities. It has been proposed, for example, that extraction of the specific acoustic properties underlying emotional prosody relies on differentially lateralized cerebral regions (Ackermann et al., 2001, Sidtis and Van-Lancker-Sidtis, 2003, Van Lancker and Sidtis, 1992, Wildgruber et al., 2002, Zatorre, 2001). Moreover, the selection of emotional categories included in the stimulus material might matter, since the valence of emotional expression has been reported to influence lateralization of cerebral responses (Canli et al., 1998, Davidson and Tomarken, 1989, Davidson et al., 1999, Murphy et al., 2003).
As concerns speech intonation, several clinical examinations failed to show any interactions between hemispheric lateralization and emotional valence (Baum and Pell, 1999, Borod et al., 2002, Kucharska-Pietura et al., 2003, Pell, 1998). Considering functional imaging data, however, distinct cerebral activation patterns bound to specific emotional categories such as disgust, anger, fear, or sadness have been observed during perception of facial emotional expressions (Kesler-West et al., 2001, Murphy et al., 2003, Phan et al., 2002, Sprengelmeyer et al., 1998). Several studies have corroborated the notion that responses of the amygdala are specifically related to facial expressions of fear (Adolphs, 2002, Phan et al., 2002), whereas facial expressions of disgust seem to be linked to activation of the anterior insula (Calder et al., 2000, Phan et al., 2002, Phillips et al., 1998, Sprengelmeyer et al., 1998, Wicker et al., 2003). As concerns vocal emotional expressions, fear-specific responses within the amygdalae have also been reported (Morris et al., 1999, Phillips et al., 1998), whereas the predicted disgust-related activation of the anterior insula was not observed in a prior PET experiment (Phillips et al., 1998). It remains unsettled, therefore, to what extent the lateralization and exact localization of cerebral activation during comprehension of emotional prosody are linked to specific emotional categories. Based on the aforementioned clinical and neuroimaging studies, there are presumably cerebral regions—including the right posterior temporal cortex—that contribute to comprehension of emotional prosody independent of any specific emotional content, whereas other regions—including the amygdala and anterior insula—are selectively linked to comprehension of specific emotional categories.
To allow for a separation of these components, the stimulus material used in the present study comprised sentences spoken in the emotional intonation of five different basic emotions (happiness, anger, fear, sadness, and disgust). Verbal utterances were presented to healthy subjects under two different task conditions during functional magnetic resonance imaging (fMRI). On the one hand, subjects were asked to identify the emotion expressed by the tone of voice; on the other, they performed a phonetic identification task. Since both tasks require evaluation of identical acoustic stimuli and involve very similar response mechanisms, comparison of the respective hemodynamic activation patterns allows for the separation of task-specific cerebral responses independently of stimulus characteristics and unspecific task components. In order to delineate cerebral structures contributing to identification of affective prosody independent of specific emotional categories, responses during identification of affective prosody across all emotional categories were compared to the phonetic control condition. To disentangle patterns of cerebral activation related to comprehension of specific emotional categories, each emotional category was compared against the others. The main goal of the study, thus, was to test predictions for the following two questions:
- (a)
Which areas of the human brain contribute to identification of affective intonation independent of specific emotional information conveyed? Hypothesis: a network of right-sided regions including the right posterior temporal cortex.
- (b)
Does the localization of responses vary for specific emotions? Hypothesis: perception of different emotional categories is associated with specific brain regions, such as fear-specific responses within the amygdalae and disgust-specific responses within the anterior insula.
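The two comparisons described above correspond to contrast weights over a GLM design matrix. The sketch below illustrates this logic; the condition names and column ordering are illustrative assumptions, not the study's actual design matrix.

```python
# Hedged sketch: contrast weights for the two planned comparisons.
# The condition labels and their order are hypothetical assumptions.
conditions = ["happy", "angry", "fearful", "sad", "disgusted", "phonetic"]

# (a) identification of affective prosody (all five categories) vs. the
# phonetic control task; weights sum to zero, so the contrast tests a
# pure difference in activation between the two task conditions
emotion_vs_phonetic = [1, 1, 1, 1, 1, -5]

# (b) one emotional category against the remaining four, with the
# phonetic condition weighted zero (here: fear vs. other emotions)
def category_contrast(target):
    emotions = [c for c in conditions if c != "phonetic"]
    return [len(emotions) - 1 if c == target else (-1 if c in emotions else 0)
            for c in conditions]

print(category_contrast("fearful"))  # [-1, -1, 4, -1, -1, 0]
```

Balancing each contrast to sum to zero is what makes comparison (a) independent of stimulus characteristics shared by both tasks, and comparison (b) independent of activation common to all emotional categories.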
Materials and methods
In order to generate appropriate verbal stimuli, 100 short German declarative sentences with emotionally neutral content were selected, such as "Der Gast hat sich für Donnerstag ein Zimmer reserviert [The visitor reserved a room for Thursday]" and "Die Anrufe werden automatisch beantwortet [Phone calls are answered automatically]". Sentences were randomly assigned to one of five different target emotions (happiness, anger, fear, sadness, or disgust) and spoken by a professional actress and an
Results
During the fMRI experiment, subjects correctly identified sentences with respect to emotional tone at a slightly lower rate (mean: 75.2 ± 7.9%) as compared to vowel identification (mean: 83.4 ± 7.0%, P < 0.05). The accuracy scores for happy (90%), angry (82%), and sad (84%) expressions did not differ significantly from pretest ratings, whereas fearful (51%) and disgusted (57%) expressions were identified at significantly lower rates (P < 0.05) (Fig. 2a). Response times for the emotional task
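The reported accuracy comparison (emotional identification 75.2 ± 7.9% vs. vowel identification 83.4 ± 7.0%, P < 0.05) is a within-subject comparison, since every subject performed both tasks. The sketch below illustrates such a paired t-test; the per-subject scores are hypothetical examples, not the study's data.

```python
import statistics as st

# Hypothetical per-subject accuracy (%) under the two task conditions;
# these values are illustrative only, not the study's measurements.
emotional = [70.0, 65.0, 80.0, 74.0, 78.0, 68.0, 85.0, 76.0]
vowel     = [80.0, 75.0, 88.0, 82.0, 86.0, 79.0, 92.0, 85.0]

# Paired t-statistic: mean within-subject difference divided by its
# standard error; pairing removes between-subject variability.
diffs = [e - v for e, v in zip(emotional, vowel)]
n = len(diffs)
t = st.mean(diffs) / (st.stdev(diffs) / n ** 0.5)

# For df = n - 1 = 7, the two-tailed critical value at alpha = 0.05
# is about 2.365; |t| beyond that indicates a significant difference.
print(f"t({n - 1}) = {t:.2f}")
```

A negative t-value here reflects lower accuracy on the emotional task, matching the direction of the reported group difference.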
Discussion
Evaluation of the behavioral data revealed considerable differences in accuracy rates across emotional categories. Lower identification rates for prosodic expression of disgust and fear as compared to happy, sad, and angry vocalizations are in good accordance with prior observations and might be related to differential degrees of acoustic recognizability for specific emotions (Banse and Scherer, 1996). Significantly less accurate performance on these emotional expressions during the fMRI
Conclusion
The findings of the present study support the hypothesis of an important contribution of right-sided temporal and frontal regions to the processing of emotional prosody independent of specific emotional categories. The observed rightward lateralization at the level of the posterior temporal cortex (BA 22/42) might be bound to extraction of specific acoustic cues from complex speech signals (i.e., suprasegmental features such as pitch contours and rhythmic structures), whereas engagement of the
Acknowledgments
This study was supported by the Junior Science Program of the Heidelberg Academy of Sciences and Humanities and the German Research Foundation (DFG WI 2101).
References
- Neural systems for recognizing emotion. Curr. Opin. Neurobiol. (2002)
- et al. Impaired perception of vocal emotions in Parkinson's disease: influence of speech time processing and executive functioning. Brain Cogn. (2001)
- et al. Recognition of emotional prosody and verbal components of spoken language: an fMRI study. Cogn. Brain Res. (2000)
- et al. Lesion localization in acquired deficits of emotional expression and comprehension. Brain Cogn. (1990)
- et al. Perception and production of facial and prosodic emotions by chronic CVA patients. Neuropsychologia (2003)
- et al. Regional brain function, emotion and disorders of emotion. Curr. Opin. Neurobiol. (1999)
- et al. Face and voice expression identification in patients with emotional and behavioral changes following ventral frontal lobe damage. Neuropsychologia (1996)
- et al. Neural substrates of facial emotion processing using fMRI. Cogn. Brain Res. (2001)
- et al. Improvement of the acquisition of a large amount of MR images on a conventional whole body system. Magn. Reson. Imaging (1999)
- et al. On the lateralization of emotional prosody: an event-related functional MR investigation. Brain Lang. (2003)
- Perception of emotions from faces and voices following unilateral brain damage. Neuropsychologia
- The neural response to emotional prosody, as revealed by functional magnetic resonance imaging. Neuropsychologia
- Saying it with feeling: neural responses to emotional vocalization. Neuropsychologia
- Assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia
- Recognition of prosody following unilateral brain lesions: influence of functional and structural attributes of prosodic contours. Neuropsychologia
- Unilateral brain damage, prosodic comprehension deficits, and the acoustic cues to prosody. Brain Lang.
- Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. NeuroImage
- A critical review of PET studies of phonological processing. Brain Lang.
- Auditory lexical decision, categorical perception, and FM direction discrimination differentially engage left and right auditory cortex. Neuropsychologia
- Both of us disgusted in my insula: the common neural basis of seeing and feeling disgust. Neuron
- Dynamic pattern of brain activation during sequencing of word strings evaluated by fMRI. Cogn. Brain Res.
- Dynamic brain activation during processing of emotional intonation: influence of acoustic parameters, emotional valence and sex. NeuroImage
- Prosodische Störungen bei neurologischen Erkrankungen: eine Literaturübersicht [Prosodic disorders in neurological diseases: a literature review]. Fortschr. Neurol. Psychiatr.
- Rate-dependent activation of a prefrontal–insular–cerebellar network during passive listening to trains of click stimuli: an fMRI study. NeuroReport
- Neural systems for recognition of emotional prosody: a 3-D lesion study. Emotion
- Sounds of emotion: production and perception of affect-related vocal acoustics. Ann. N. Y. Acad. Sci.
- Acoustic profiles in vocal emotion expression. J. Pers. Soc. Psychol.
- The neural basis of prosody: insights from lesion studies and neuroimaging. Aphasiology
- Electrodynamic headphones and woofers for application in magnetic resonance imaging scanners. Med. Phys.
- Dissociable neural responses to facial expressions of sadness and anger. Brain
- Asymmetries of emotional perception and expression in normal adults
- Emotional processing deficits in individuals with unilateral brain damage. Appl. Neuropsychol.
- Emotional processing following cortical and subcortical brain damage: contribution of the fronto-striatal circuitry. Behav. Neurol.
- Impaired recognition and experience of disgust following brain injury. Nat. Neurosci.