Cerebral pathways in processing of affective prosody: A dynamic causal modeling study
Introduction
In spoken language, information about the emotional state of the speaker can be expressed via propositional cues at the verbal level and via non-verbal means of communication by modulation of the speech melody (affective prosody). Affective prosody is characterized by variations of suprasegmental language features, such as pitch, syllable duration, and voice quality (Banse and Scherer, 1996). Evidence obtained from lesion studies indicates a right-hemispheric superiority in the processing of these features (Blonder et al., 1991, Bowers et al., 1987, Borod et al., 2002). In line with this view, functional imaging studies have consistently demonstrated right-lateralized activations at the level of the auditory cortex during evaluation of affective prosody (Mitchell et al., 2003, Wildgruber et al., 2005). However, it has been shown that left-hemispheric lesions can also compromise comprehension of affective speech melody (Adolphs et al., 2002, Hornak et al., 1996, Hornak et al., 2003, Kucharska-Pietura et al., 2003, Pell, 1998, van Lancker and Sidtis, 1992), challenging the hypothesis that processing of affective prosody is exclusively subserved by the right hemisphere. Specifically, unilateral lesions within the inferior frontal cortex of either hemisphere as well as deep white matter lesions of the mid-rostral part of the corpus callosum can result in severe deficits in comprehension of affective prosody (Hornak et al., 1996, Hornak et al., 2003, Ross et al., 1997). Accordingly, functional neuroimaging and event-related electrophysiological studies on the neural correlates underlying the comprehension of affective prosody have demonstrated bilateral activations in the inferior frontal lobe (Imaizumi et al., 1997, Pihan et al., 2000, Wildgruber et al., 2002, Wildgruber et al., 2004).
These converging results from lesion and neuroimaging studies suggest that the frontal lobes of both hemispheres cooperate in decoding non-verbal emotional information in the voice and that intact transcallosal transfer of information is necessary for comprehension of affective prosody. However, neither lesion studies nor conventional analysis of functional imaging data can clarify whether this cooperation is accomplished serially, via sequential processing steps, or whether both frontal lobes receive their input independently of each other from the right temporal cortex. The aim of this study was to investigate the connectivity pattern subserving communication between the right temporal cortex and the frontal lobes during decoding of affective prosody. To this end, we used event-related functional magnetic resonance imaging (fMRI) combined with the novel technique of dynamic causal modeling (Friston et al., 2003). Dynamic causal modeling enables inferences on (1) the parameters representing influences of experimentally designed inputs, (2) the intrinsic coupling of different brain regions, and (3) how this coupling is modulated by an experimental factor. Given the lack of knowledge on the connectivity between neural areas implicated in processing of affective prosody, we omitted modulatory factors and focused on the investigation of input regions and the intrinsic connectivity pattern within this network. Therefore, we compared models in which the right secondary auditory cortex serves as input region with models in which direct inputs are assumed to enter the network via one of the frontal areas. Furthermore, to investigate the architecture of the interregional connections, we compared dynamic causal models corresponding to serial and parallel processing within the frontal lobes.
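With modulatory effects omitted, the deterministic core of a dynamic causal model reduces to the state equation dx/dt = Ax + Cu, where A encodes intrinsic coupling and C encodes where driving inputs enter the network. The following sketch contrasts a serial with a parallel frontal architecture; the region labels, coupling values, and input timing are all hypothetical and serve only to illustrate the distinction, not to reproduce the study's fitted parameters.

```python
import numpy as np

# Regions (hypothetical labels): right temporal cortex and right/left frontal cortex.
regions = ["rSTG", "rIFG", "lIFG"]

def simulate(A, C, T=10.0, dt=0.01):
    """Euler integration of dx/dt = A @ x + C @ u (no modulatory B terms)."""
    x = np.zeros(A.shape[0])
    trace = []
    for step in range(int(T / dt)):
        u = np.array([1.0 if step * dt < 2.0 else 0.0])  # brief auditory input
        x = x + dt * (A @ x + C @ u)
        trace.append(x.copy())
    return np.array(trace)

# Serial model: input enters rSTG; rSTG -> rIFG -> lIFG.
A_serial = np.array([[-1.0,  0.0,  0.0],
                     [ 0.5, -1.0,  0.0],
                     [ 0.0,  0.5, -1.0]])
# Parallel model: input enters rSTG; rSTG projects to both frontal regions.
A_parallel = np.array([[-1.0,  0.0,  0.0],
                       [ 0.5, -1.0,  0.0],
                       [ 0.5,  0.0, -1.0]])
C = np.array([[1.0], [0.0], [0.0]])  # stimulus drives the temporal region only

x_serial = simulate(A_serial, C)
x_parallel = simulate(A_parallel, C)

# In the serial model, left-frontal activity peaks after right-frontal activity;
# in the parallel model, both frontal regions respond with the same latency.
print(np.argmax(x_serial[:, 2]) > np.argmax(x_serial[:, 1]))       # True
print(np.argmax(x_parallel[:, 2]) == np.argmax(x_parallel[:, 1]))  # True
```

The two architectures thus make distinguishable predictions about regional response dynamics, which is what Bayesian comparison of the fitted models exploits.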
Section snippets
Subjects
Twenty-four right-handed German native speakers (11 males, 13 females, mean age 24.4 years) with no history of neurological or psychiatric illness participated in an fMRI experiment. Handedness was determined using the Edinburgh Inventory (Oldfield, 1971). The Ethical Committee of the University of Tuebingen approved the investigation, and informed consent was obtained according to the Declaration of Helsinki.
Stimuli
Six professional actors (3 females/3 males) pronounced 162 German adjectives in either happy,
Behavioral data
Mean group ratings of acoustic stimuli in the prestudy and mean group ratings during fMRI were strongly correlated for both emotional word content (r = 0.93) and affective prosody (r = 0.93), indicating that the participants of the fMRI experiment comprehended verbal and non-verbal affective information with sufficient accuracy in the presence of scanner noise (see Fig. 2).
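The consistency check underlying this result is a Pearson correlation between the two sets of mean ratings. A minimal sketch follows; the rating values are invented for illustration (the study reports r = 0.93 for both dimensions):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two rating vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical mean valence ratings per stimulus, prestudy vs. in-scanner.
prestudy_ratings = [1.2, 3.5, 4.8, 2.1, 4.0, 1.8]
scanner_ratings  = [1.4, 3.3, 4.9, 2.4, 3.8, 2.0]

r = pearson_r(prestudy_ratings, scanner_ratings)
print(round(r, 2))  # a value near 1 indicates ratings survive scanner noise
```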
Conventional fMRI analysis
To identify brain regions specifically contributing to the processing of affective prosody and emotional word content, blood
Discussion
In the present study, conventional analysis of fMRI data based on a general linear model was employed to identify brain regions subserving the comprehension of affective prosody and emotional word content. Subsequently, dynamic causal modeling (Friston et al., 2003) was used to investigate the input regions and the architecture of interregional connections within the network involved in comprehension of affective prosody.
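Adjudicating between competing input regions and architectures rests on Bayesian model comparison (cf. Penny et al., "Comparing dynamic causal models"): given the log model evidence ln p(y|m) for each fitted model, the Bayes factor for model i over model j is exp of the difference in log evidence. The sketch below uses entirely hypothetical model names and evidence values to show the mechanics only; it does not reproduce the study's results.

```python
import math

# Hypothetical log model evidences for three candidate DCMs.
log_evidence = {
    "input_temporal_serial":   -1510.0,
    "input_temporal_parallel": -1504.0,
    "input_frontal_parallel":  -1521.0,
}

best = max(log_evidence, key=log_evidence.get)
for model, le in log_evidence.items():
    if model != best:
        bf = math.exp(log_evidence[best] - le)  # Bayes factor of best over model
        print(f"Bayes factor of {best} over {model}: {bf:.1f}")

# A Bayes factor above ~20 (log-evidence difference > 3) is conventionally
# taken as strong evidence in favor of the winning model.
```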
Acknowledgments
This study was supported by the Deutsche Forschungsgemeinschaft (DFG grant WI 2101/1-1) and by the Junior Science Program of the Heidelberg Academy of Sciences and Humanities.
References (48)
- et al. (1987). Comprehension of emotional prosody following unilateral hemispheric lesions: processing defect versus distraction defect. Neuropsychologia.
- et al. (1994). Measuring emotion: the self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry.
- et al. (2000). Recognition of emotional prosody and verbal components of spoken language: an fMRI study. Cogn. Brain Res.
- et al. (2002). Classical and Bayesian inference in neuroimaging: applications. NeuroImage.
- et al. (2003). Dynamic causal modelling. NeuroImage.
- et al. (1996). Face and voice expression identification in patients with emotional and behavioural changes following ventral frontal lobe damage. Neuropsychologia.
- et al. (2002). Phonetic perception and the temporal cortex. NeuroImage.
- et al. (2005). A PET study of visual and semantic knowledge about objects. Cortex.
- et al. (2003). Perception of emotions from faces and voices following unilateral brain damage. Neuropsychologia.
- et al. (2003). The neural response to emotional prosody, as revealed by functional magnetic resonance imaging. Neuropsychologia.
- Assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia.
- Recognition of prosody following unilateral brain lesion: influence of functional and structural attributes of prosodic contours. Neuropsychologia.
- Comparing dynamic causal models. NeuroImage.
- Auditory lexical decision, categorical perception, and FM direction discrimination differentially engage left and right auditory cortex. Neuropsychologia.
- Lateralization of affective prosody in brain and the callosal integration of hemispheric language functions. Brain Lang.
- The neuroanatomical and functional organization of speech perception. Trends Neurosci.
- Going beyond the information given: a neural system supporting semantic interpretation. NeuroImage.
- Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. NeuroImage.
- Dynamic brain activation during processing of emotional intonation: influence of acoustic parameters, emotional valence, and sex. NeuroImage.
- Identification of emotional intonation evaluated by fMRI. NeuroImage.
- Neural systems for recognition of emotional prosody: a 3-D lesion study. Emotion.
- Acoustic profiles in vocal emotion expression. J. Pers. Soc. Psychol.
- The role of the right hemisphere in emotional communication. Brain.
- Neural correlates of sensory and decision processes in auditory object identification. Nat. Neurosci.