NeuroImage

Volume 49, Issue 1, 1 January 2010, Pages 994-1005

CNS activation and regional connectivity during pantomime observation: No engagement of the mirror neuron system for deaf signers

https://doi.org/10.1016/j.neuroimage.2009.08.001

Abstract

Deaf signers have extensive experience using their hands to communicate. Using fMRI, we examined the neural systems engaged during the perception of manual communication in 14 deaf signers and 14 hearing non-signers. Participants passively viewed blocked video clips of pantomimes (e.g., peeling an imaginary banana) and action verbs in American Sign Language (ASL) that were rated as meaningless by non-signers (e.g., TO-DANCE). Relative to visual fixation, pantomimes strongly activated fronto-parietal regions (the mirror neuron system, MNS) in hearing non-signers, but only bilateral middle temporal regions in deaf signers. When contrasted with ASL verbs, pantomimes selectively engaged inferior and superior parietal regions in hearing non-signers, but right superior temporal cortex in deaf signers. For deaf signers, the perception of ASL verbs recruited regions similar to those engaged by pantomimes, with some evidence of greater involvement of left inferior frontal gyrus for ASL verbs. Functional connectivity analyses with left hemisphere seed voxels (ventral premotor, inferior parietal lobule, fusiform gyrus) revealed robust connectivity with the MNS for the hearing non-signers. For the fusiform gyrus seed voxel, deaf signers exhibited right hemisphere functional connectivity that was not observed for the hearing group. We suggest that life-long experience with manual communication, and/or auditory deprivation, may alter regional connectivity and brain activation when viewing pantomimes. We conclude that the lack of activation within the MNS for deaf signers does not support an account of human communication that depends upon automatic sensorimotor resonance between perception and action.

Introduction

The linguistic articulators for sign language are the same as those involved in everyday human actions, such as reaching, grasping, object manipulation, and communicative gesture. Here, we explore the interaction between the neural systems that support human action understanding and those involved in sign language comprehension. Recently, the human mirror neuron system (MNS) has been argued to be the neural mechanism that underlies action understanding through embodied simulation and automatic sensorimotor resonances (e.g., Gallese, 2007, Rizzolatti and Craighero, 2004). The MNS is hypothesized to be a perception–action matching system that is automatically engaged during the observation of both communicative and non-communicative gestures or actions. The neuroanatomical correlates of the human MNS consist of the inferior frontal gyrus (IFG), ventral premotor cortex, and the inferior parietal lobule (IPL) (see Rizzolatti and Sinigaglia, 2008, for review). In addition, other regions outside the MNS play a role in the perception of actions and gestures. Specifically, the superior temporal sulcus (STS) is involved in the perception of biological motion and, more broadly, in processing social communication (e.g., Grossman et al., 2000, Allison et al., 2000). Visual regions, including the fusiform face area (FFA) and the extrastriate body area (EBA), are also recruited during the perception of gestures and actions involving the hands, arms, and face (Montgomery and Haxby, 2008, Astafiev et al., 2004).

We investigated whether knowledge and use of American Sign Language (ASL) has an impact on the neural systems recruited during the perception of pantomimes, which are meaningful but non-linguistic (i.e., they are not lexical signs). Unlike signs, pantomimes can involve the whole body, are not stored in a signer's lexicon, and may violate phonological constraints on form (Klima and Bellugi, 1979). Since signers have different life experiences with manual communication from those of hearing non-signers, we speculated that observing pantomimes might engage distinct neural regions for deaf signers compared to hearing non-signers. Native deaf signers have been exposed from birth to a manual linguistic system that serves as their primary means of communication. In addition, deaf signers have extensive experience with pantomimic communication through their interactions with hearing non-signers and through storytelling in ASL, which often incorporates pantomimic body and facial gestures (Emmorey, 1999). We hypothesized that these different experiences with manual communication might alter the nature of the neural systems that underlie pantomime recognition for deaf signers.

In support of this hypothesis, Corina et al. (2007) recently reported the surprising result that deaf signers did not engage the fronto-parietal network associated with the MNS when passively viewing manual actions that were self-oriented (e.g., scratch neck, lick lips, rub shoulder) or object-oriented (e.g., bite an apple, read a book, pop a balloon; i.e., the model handled the objects). In contrast, hearing non-signers exhibited robust activation within the MNS when observing these actions. Corina et al. (2007) hypothesized that life-long experience with a visual language shifts neural processing of human actions to extrastriate association areas (including the EBA), regions that were particularly active for the deaf signers. Corina et al. (2007) suggested that this shift arises because signers must actively filter human actions in order to be able to quickly distinguish linguistic from non-linguistic actions for further semantic and syntactic processing. Such pre-processing of human action is not required for non-signers. In the current study, we attempt to replicate and extend this finding by investigating whether differences between signers and non-signers in the neural circuitry for action observation extend to the processing of meaningful pantomimes.

A second question we addressed was whether and how neural regions differ when signers are engaged in processing meaningful hand movements that have linguistic form (ASL signs) vs. meaningful hand movements that are non-linguistic (pantomimes). Comprehension of single lexical signs (even iconic signs) can be impaired in deaf signers with aphasia who nevertheless are able to recognize pantomimes (Corina et al., 1992, Marshall et al., 2004). However, there are no reports of patients with preserved sign language comprehension who are impaired in recognizing pantomimes, suggesting that a double dissociation may not exist between processing sign language and gesture (MacSweeney et al., 2008). There is also some evidence that similar neural circuitry supports processing linguistic signs and non-linguistic gestures. MacSweeney et al. (2004) contrasted perception of signed sentences (British Sign Language) with perception of a set of non-linguistic manual gestures known as Tic Tac, used in racecourse betting (the gestures were not known to the participants in the fMRI study). In general, very similar neural systems were recruited for both types of stimuli, although left perisylvian regions were recruited to a greater extent for the linguistic signs than for the non-linguistic Tic Tac gestures (left IFG, posterior STS, and anterior supramarginal gyrus).

Corina et al. (2007) contrasted perception of meaningful linguistic stimuli (ASL nouns) with perception of manual actions (actions on objects and self-oriented actions), and found that the neural systems recruited during sign perception were different from those recruited during action perception: ASL signs engaged left inferior frontal cortex (BA 46/9), left superior temporal gyrus (BA 41), and the insula, whereas actions engaged bilateral superior frontal cortex (BA 10) and right occipital-temporal cortex, extending into the right temporal pole.

Note that the Tic Tac stimuli used in MacSweeney et al. (2004), although non-linguistic in form, were symbolic and, in this sense, similar to the sign stimuli. The gestures had the potential to communicate and, in fact, participants were instructed to guess which Tic Tac gesture string did not make sense; in other words, participants were looking for meaning in the gestures. In contrast, the actions used in Corina et al. (2007) were neither linguistic nor symbolic. In the current study, we tease apart some of these effects by presenting ASL verbs and pantomimes. Both stimuli are meaningful to deaf signers, but signs participate in a linguistic system of constraints and have stored lexical representations; pantomimes do not. As in Corina et al. (and unlike MacSweeney et al.), participants in our study passively viewed the stimuli, rather than performing a semantic judgment task.

A third question we explored was how neural activation patterns differ when hearing non-signers observe meaningful hand gestures (pantomimes) compared to meaningless hand gestures (ASL verbs). Although the contrast between meaningful and meaningless hand gestures has been of central importance in the apraxia literature (e.g., Buxbaum, 2001, Goldenberg, 2009), very few neuroimaging studies have examined whether and how meaningfulness impacts the neural correlates underlying the observation of human movements. Decety et al. (1997) found that viewing pantomimes recruited more left hemisphere structures than did viewing ASL signs (which were meaningless to their participants). Greater left hemisphere involvement for meaningful movements can be attributed to the greater role of the left hemisphere in processing semantic information (e.g., Gonzalez-Rothi et al., 1991). However, Villarreal et al. (2008) recently reported that an extensive, bilateral common neural network was engaged during the recognition of both meaningful hand movements (pantomimes and emblems) and meaningless hand movements (actions involving movements comparable to those involved in meaningful actions but with no goal). Here, we further investigate the extent to which meaningful hand movements (pantomimes) and meaningless hand movements (ASL signs, which were not known to the hearing participants) engage extensively overlapping regions and whether meaningful movements preferentially activate left hemisphere structures.

Finally, we applied functional connectivity analyses to characterize more fully the extent to which sign and gesture processing might build upon the fronto-parietal MNS in deaf signers and hearing non-signers. Functional connectivity analyses quantify the extent to which activation levels in two regions are correlated; this correlation is interpreted as reflecting the degree to which the two regions are functionally connected (e.g., Friston, 1994). Such analyses have demonstrated differences in functional connectivity within the left perisylvian language network for high- vs. low-capacity readers (Prat et al., 2007) and have revealed altered connectivity patterns within the motor network for patients with multiple sclerosis (Rocca et al., 2007). In this experiment, we selected seed voxels from regions within the MNS (ventral premotor cortex and inferior parietal cortex) and from a region outside the MNS (fusiform gyrus). The fusiform gyrus was chosen because this region is known to be engaged when viewing faces and bodies. In these analyses, correlation coefficients were computed between the mean time series in the seed voxels and the time series of all other voxels in the brain.
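
To make the connectivity computation concrete, the following is a minimal sketch of a seed-based correlation analysis of the kind described above, assuming the BOLD data have already been preprocessed and reshaped into a voxels-by-timepoints array. The function name, synthetic data, and seed definition are illustrative; this is not the study's actual pipeline.

```python
import numpy as np

def seed_connectivity(data, seed_ts):
    """Correlate a seed time series with every voxel's time series.

    data    : (n_voxels, n_timepoints) array of preprocessed BOLD signals
    seed_ts : (n_timepoints,) mean time series extracted from the seed region
    Returns a (n_voxels,) array of Pearson correlation coefficients.
    """
    # Z-score along time so that a dot product divided by n yields Pearson's r
    z_data = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
    z_seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    return z_data @ z_seed / seed_ts.size

# Illustrative use with synthetic data: 1,000 voxels, 150 volumes
rng = np.random.default_rng(0)
bold = rng.standard_normal((1000, 150))
seed = bold[:10].mean(axis=0)                       # hypothetical seed: mean of 10 voxels
r_map = seed_connectivity(bold, seed)               # one r value per voxel
z_map = np.arctanh(np.clip(r_map, -0.999, 0.999))   # Fisher z for group statistics
```

Correlation maps are conventionally Fisher z-transformed before group-level comparisons (e.g., signers vs. non-signers) so that the values are approximately normally distributed.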

Section snippets

Participants

Fourteen deaf signers (7 males) and 14 hearing non-signers (6 males) participated in the experiment. All participants were right-handed (mean Oldfield handedness scores: 85.6 for the deaf group and 88.3 for the hearing group), and all had attended college. The deaf signers (mean age = 22.3 years; range: 19–43 years) were all born into signing families, were exposed to ASL from birth, and reported a hearing loss of ≥ 70 dB. The hearing non-signers (mean age = 24.3 years; range: 22–29 years) reported

Pantomime minus fixation

As expected, hearing non-signers showed extensive activation within the mirror neuron system when observing meaningful pantomimes compared to fixation baseline, as shown in Fig. 2A. Significant bilateral activation was present in the inferior frontal gyrus (IFG), premotor cortex, and the inferior parietal lobule (IPL), extending into the superior parietal lobule (SPL) (Table 1). In contrast, no significant activation was observed in the MNS when deaf signers viewed pantomimes (Fig. 2A and Table

Discussion

The most striking result of this study was the lack of activation within the mirror neuron system (inferior frontal gyrus, ventral premotor cortex, and inferior parietal lobule) for deaf ASL signers when passively viewing either signs or communicative gestures compared to a fixation baseline (see Fig. 2A). In contrast, hearing non-signers showed robust activation within the MNS for both sets of stimuli despite the fact that, for these participants, pantomimes are meaningful and signs are not.

Acknowledgments

This research was supported in part by NIH grants R01 HD-13249 and R01 DC-00201. We would like to thank Franco Korpics for help recruiting Deaf participants, Jeannette Vincent for help with stimulus development, Helsa Borinstein for help with the gesture norming study, and Jeffery Solomon for technical support with the connectivity analysis. Finally, we thank all of the deaf and hearing individuals who participated in the study.

References (45)

  • Hickok, G., et al. (1998). The neural organization of language: evidence from sign language aphasia. Trends Cogn. Sci.

  • Kang, E., et al. (2003). Developmental hemispheric asymmetry of interregional metabolic correlation of the auditory cortex in deaf subjects. NeuroImage.

  • MacSweeney, M., et al. (2002). Speechreading circuits in people born deaf. Neuropsychologia.

  • MacSweeney, M., et al. (2004). Dissociating linguistic and nonlinguistic gestural communication in the brain. NeuroImage.

  • MacSweeney, M., et al. (2008). The signing brain: the neurobiology of sign language. Trends Cogn. Sci.

  • Manthey, S., et al. (2003). Premotor cortex in observing erroneous action: an fMRI study. Cogn. Brain Res.

  • Poldrack, R.A. (2000). Imaging brain plasticity: conceptual and methodological issues—a theoretical review. NeuroImage.

  • Saxe, R., et al. (2004). A region of right posterior superior temporal sulcus responds to observed intentional actions. Neuropsychologia.

  • Toni, I., et al. (2008). Language beyond action. J. Physiol. Paris.

  • Vaid, J., et al. (1989). Visual field asymmetries in numerical size comparisons of digits, words, and signs. Brain Lang.

  • Villarreal, M., et al. (2008). The neural substrate of gesture recognition. Neuropsychologia.

  • Allen, J.S., et al. (2008). Morphology of the insula in relation to hearing status and sign language. J. Neurosci.