
Journal of Physiology-Paris

Volume 98, Issues 1–3, January–June 2004, Pages 161-170

Visuo-tactile representation of near-the-body space

https://doi.org/10.1016/j.jphysparis.2004.03.007

Abstract

Here we report findings from neuropsychological investigations showing the existence, in humans, of intersensory integrative systems representing space through the multisensory coding of visual and tactile events. In addition, these findings show that visuo-tactile integration may take place in a privileged manner within a limited sector of space closely surrounding the body surface, i.e., the near-peripersonal space. They also demonstrate that the representation of near-peripersonal space is not static, as objects in out-of-reach space can be processed as nearer, depending upon (illusory) visual information about hand position in space, and upon the use of tools as physical extensions of the reachable space. Finally, new evidence is provided suggesting that the multisensory coding of peripersonal space can be achieved through bottom-up processing that, at least in some instances, is not necessarily modulated by more “cognitive” top-down processing, such as the expectation of being touched. These findings are entirely consistent with the functional properties of multisensory neuronal structures coding near-peripersonal space in monkeys, as well as with behavioral and neuroimaging evidence for the cross-modal coding of space in normal subjects. This high level of convergence ultimately favors the idea that multisensory space coding is achieved through similar multimodal structures in both humans and non-human primates.

Introduction

A stable representation of external space requires the integration of information stemming from multiple sensory modalities. As we typically receive a simultaneous flow of information from each of our senses in real-world situations, our perception of space is the product of integrated multisensory processing. It seems entirely adaptive that the multiple sources of information derived from the different modalities can be combined to yield the best estimate of external events [14].
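
One standard way to make this “best estimate” idea concrete is the maximum-likelihood cue-combination scheme. The following is an illustrative sketch, not a formula taken from the studies reviewed here: if vision and proprioception provide unbiased estimates $\hat{s}_V$ and $\hat{s}_P$ of the same spatial variable, with variances $\sigma_V^2$ and $\sigma_P^2$, the statistically optimal combination weights each cue by its relative reliability:

$$\hat{s} = w_V\,\hat{s}_V + w_P\,\hat{s}_P, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_P^2}, \qquad w_P = 1 - w_V,$$

with combined variance $\sigma^2 = \left(1/\sigma_V^2 + 1/\sigma_P^2\right)^{-1}$, lower than that of either cue alone. Under this scheme the more reliable modality dominates the fused percept, a point that becomes relevant below when vision and proprioception are placed in conflict.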

Some of the sensory integration systems responsible for such processing have now been documented physiologically, and their functional role in the coding of nearby space has been shown. For instance, animal experiments have provided an ever-growing body of evidence supporting the notion that the visual space surrounding the body (peripersonal space) in the monkey is coded through multisensory integration at the single-neuron level [15], [16], [25], [26], [56], [57]. Several brain structures (putamen, parietal and premotor areas) contain a relatively high percentage of multimodal neurons that have, for example, both a tactile and a visual receptive field (RF). The visual RF roughly matches the somatotopic location of the tactile RF, extending only a few centimetres outward from the skin. Besides responding to tactile stimuli, these neurons also respond to visual stimuli, provided that visual objects are presented very close to the particular body part (e.g., the head or the hand) where the tactile RF is located.

The main functional properties of these multimodal neurons can be summarized as follows: (1) the visual RFs, which are in approximate spatial register with the tactile RFs, operate to some degree in body-part-centered co-ordinates, moving with the body part when it moves, and not when the eye moves; (2) the extent of the visual RFs is typically restricted to the space immediately surrounding the body part; (3) the strength of the visual response decreases with distance from the body part.
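
These three properties can be captured in a toy computational sketch (Python). This is purely illustrative: the Gaussian fall-off, the hand-centered coordinate frame, and all parameter values (e.g., a 10-cm RF extent) are our own assumptions, not fits to the neurophysiological data:

```python
import numpy as np

def bimodal_response(stim_pos, hand_pos, tactile=False,
                     rf_extent=0.10, baseline=0.05):
    """Toy firing-rate model of a visuo-tactile bimodal neuron.

    The visual RF is anchored to the hand (body-part-centered
    coordinates): only the stimulus-to-hand distance matters, not the
    retinal position. The visual response decays with that distance
    and is negligible beyond a few centimetres (rf_extent, in metres).
    """
    if tactile:
        return 1.0  # touch on the tactile RF drives the cell fully
    dist = np.linalg.norm(np.asarray(stim_pos) - np.asarray(hand_pos))
    return baseline + (1.0 - baseline) * np.exp(-(dist / rf_extent) ** 2)

# The RF moves with the hand: the same external location is "near"
# or "far" depending only on where the hand currently is.
print(bimodal_response([0.05, 0.0], hand_pos=[0.00, 0.0]))  # ~0.79, near
print(bimodal_response([0.40, 0.0], hand_pos=[0.00, 0.0]))  # ~0.05, far
print(bimodal_response([0.40, 0.0], hand_pos=[0.38, 0.0]))  # ~0.96, near again
```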

On the basis of these functional properties, several authors have suggested that the premotor cortex, parietal areas and putamen form an interconnected system for the integrated multisensory coding of near-peripersonal space [7], [16], [22], [23], [28], within which multisensory responses are observed, as opposed to a far-peripersonal space lying outside the range of bimodal responsiveness.

By close analogy with the monkey data, recent neuropsychological findings from our laboratory provided the first evidence that the human brain forms integrated visual–tactile representations of the space closely surrounding the hand. Starting from the animal evidence reported above, we put forward the hypothesis that a similar multisensory system may operate in the human brain. To verify this hypothesis, we took advantage of a neuropsychological condition called “extinction”, which has provided considerable insight into the behavioral characteristics of multimodal spatial representation in humans. Extinction [43], [54] is a pathological sign following brain damage whereby some patients fail to report a stimulus presented to the contralesional, affected side only when a concurrent stimulus is presented on the ipsilesional side of the body, that is, under conditions of double simultaneous stimulation [1]. Although some sensory deficits may be present in extinction patients [38], extinction is not merely a disorder of primary sensory processing, as confirmed by the fact that extinction patients are able to detect the same contralesional stimulus most of the time when it is delivered singly to the affected side.

Extinction emerges as a consequence of an uneven competition between spared and affected representations of space, whereby ipsi- and contra-lesional stimuli benefit from competitive weights of different strength in accessing a limited pool of attentional resources [9], [12], [13], [17], [18]. When stimuli on opposite sides are engaged in competition in patients affected by extinction, the ipsilesional stimulus benefits from a higher weight as compared to the contralesional stimulus, whose competitive strength has been reduced by the lesion of a brain area representing a portion of contralateral space. As a consequence of this uneven competition, the contralesional stimulus loses, owing to the weaker activation of the representation of that portion of space, and appears to be extinguished by the more strongly represented ipsilesional stimulus [10], [66].
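
The competitive account can be made concrete with a minimal simulation. This is a deliberately simplified sketch: the weights, noise level, and report criterion are hypothetical, chosen only to show how an uneven competition yields extinction under double but not single stimulation:

```python
import numpy as np

rng = np.random.default_rng(1)

def p_report_left(w_left=0.5, w_right=1.0, double=True,
                  criterion=0.40, noise=0.15, n=10_000):
    """Each stimulus activates its spatial representation with a weight;
    the lesion lowers the contralesional (left) weight. Stimuli compete
    for a limited attentional pool in proportion to their noisy
    activations, and the left stimulus is reported when its share of
    the pool exceeds the criterion."""
    a_left = np.clip(w_left + noise * rng.standard_normal(n), 0.01, None)
    a_right = np.clip((w_right if double else 0.0)
                      + noise * rng.standard_normal(n), 0.0, None)
    share_left = a_left / (a_left + a_right)
    return (share_left > criterion).mean()

print(p_report_left(double=False))  # ~1.0: a single left touch is detected
print(p_report_left(double=True))   # ~0.2: mostly extinguished by the right
```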

In the following sections we introduce a series of studies from our laboratory which demonstrated that the competition between the right and left representations of space can lead to extinction across the visual and tactile modalities [1], [50]. In addition, and more relevant to the aim of the present paper, the prediction of these studies was that if a multisensory (visuo-tactile) system processing tactile and visual stimuli near the body is in charge of coding left and right spatial representations, then delivering visual stimuli close to a body part (≤7 cm) would be more effective in producing cross-modal visuo-tactile extinction than presenting the same visual stimuli at larger distances (≥35 cm). Hereinafter, we will refer to these locations as, respectively, the near-peripersonal and the far-peripersonal space of a given body part.

In a recent study by Làdavas et al. [40] on right-hemisphere-damaged (RBD) patients suffering from reliable tactile extinction, a visual stimulus presented close to the patient's ipsilesional hand (i.e., in the near-peripersonal space) inhibited the processing of a tactile stimulus delivered on the contralesional hand (cross-modal visuo-tactile extinction for near space) to a comparable extent as did an ipsilesional tactile stimulation (unimodal tactile extinction). In striking contrast, weaker interfering effects of vision on touch perception were observed when the visual stimulus was presented farther from the patient's hand (i.e., in the far-peripersonal space).

This pattern of results is what should be expected if an ipsilesional visual stimulus presented near the body were processed by an integrated visual–tactile system coding near-peripersonal space, functionally analogous to the one described in monkeys. Owing to this sensory integration, a visual stimulus presented near ipsilesional body parts strongly activates the corresponding somatosensory representation of those body parts. Because extinction becomes manifest when there is a competition between two or more spatial representations [18], the simultaneous activation of the somatosensory representation of the left hand by a tactile stimulus, and of the right hand by a visual stimulus presented in the near-peripersonal space, produces extinction of the stimulus falling in the weaker representation, in this case that of the left hand. By contrast, the same visual stimulus presented in the far-peripersonal space only weakly activates the somatosensory representation of the right hand, and thus a reduction of cross-modal visual–tactile extinction is observed.

To examine the spatial co-ordinates used by this integrated visuo-tactile system to code near-peripersonal space, a patient with tactile extinction was asked to cross the hands, such that the left hand lay in the right hemispace and the right hand in the left hemispace [11]. A visual stimulus presented near the right hand (now in the left hemispace) extinguished tactile stimuli applied to the left hand (in the right hemispace). Thus, cross-modal visuo-tactile extinction was not modulated by the position of the hands in space. This finding suggests that, when the hand is moved, the visual peripersonal space remains anchored to the hand and therefore moves with it.

One interesting question is whether a visual peripersonal space also exists for other body parts. On the basis of neurophysiological evidence showing that bimodal neurons mainly represent body parts such as the animal's hands and face, we hypothesized that the face might also be involved in the same multisensory processing. We investigated this possibility in patients with tactile extinction and found that, as for the hand, visual stimuli delivered in the space near the ipsilesional side of the face extinguished tactile stimuli on the contralesional side (cross-modal visuo-tactile extinction) to a comparable extent as did an ipsilesional tactile stimulation (unimodal tactile extinction). However, when visual stimuli were delivered in the far-peripersonal space of the head, visuo-tactile extinction effects were dramatically reduced [41]. Thus, these studies suggest the existence of an integrated system that controls both visual and tactile inputs within the near-peripersonal space around the face and the hand, which can be functionally separated from the system controlling visual information in the far-peripersonal space [20], [39].

While the simultaneous co-occurrence of visual and tactile events (relative to the affected hand) clearly worsens tactile perception, the spatial distance of visual events from a given sector of the patient's body is crucial for the occurrence of space-specific cross-modal effects. But how does the multisensory system estimate the distance between the hands and nearby visual objects? In humans, one possibility is by combining proprioception and vision [30], [60], [64]. In monkeys, these inputs can be merged at the level of single multimodal neurons, which respond to near-peripersonal visual stimuli even when proprioception is the only available source of information for reconstructing arm position, that is, when direct vision of the arm by the monkey is prevented. However, this proprioception-based response is much weaker than that evoked when vision of the arm is also allowed [24], [46], [53]. Therefore, one may ask whether visual information regarding hand posture is more relevant than proprioceptive information for the cross-modal representation of near-peripersonal space in humans.

Accordingly, we investigated whether the cross-modal modulation of tactile perception on the hand can still be obtained when the patient's hand is not visible [42]. To test this, two different experiments were performed with another group of RBD patients affected by left tactile extinction: one in which patients could see their hands and one in which vision of their hands was prevented. The results showed that the reduction of contralesional tactile perception was much stronger when the visual stimulus was presented near the visible right hand than when vision of the hand was prevented. In fact, the amount of cross-modal extinction obtained when vision was prevented was comparable no matter whether the visual stimulus was presented near or far from the patients' ipsilesional hand. That is, cross-modal extinction was no longer prevalent in the near-peripersonal space when only proprioceptive cues about hand position were available, thus showing that proprioception alone can only weakly mediate the representation of the hand-centered visual peripersonal space.

Therefore, vision of hand position in space has a major impact on coding the distance of visual stimuli from one's hands, in agreement with the notion that visual information usually overwhelms senses with lower spatial resolution, such as proprioception [8], [33], [51], [58], [65].
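
A two-line numerical illustration of this reliability weighting (the localization precisions below are invented, plausible-order values, not measurements from these studies) shows why vision dominates:

```python
# Suppose vision localizes the hand with ~0.5 cm SD and proprioception
# with ~3 cm SD (hypothetical values). Reliability-weighted fusion gives:
sigma_v, sigma_p = 0.5, 3.0
w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_p**2)
print(round(w_v, 3))  # 0.973: the fused position estimate is almost all vision
```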

Recent studies have shown that visual information about the hand, besides being necessary, can also be sufficient for mediating the integrated processing of visual–tactile input in peripersonal space. Graziano et al. [27] found that when the monkey's real arm was hidden from view with a shield, and a realistic fake arm was visible above the shield and placed either at the same location or at a different location from the monkey's own arm, many neurons in area 5 were modulated also by the sight of the fake arm. Similar evidence has been provided in a group of RBD patients with tactile extinction by Farnè et al. [21]. In this study, visual stimuli were presented near or far from the patients' ipsilesional real hand (real-hand cross-modal condition), or they were always presented far from the patients' ipsilesional real hand, placed behind their back, but near a rubber hand that could be either visually aligned or misaligned with the patients' ipsilesional shoulder (rubber-hand cross-modal conditions). In both situations, unseen tactile stimuli were delivered on the patients' contralesional hand. Patients were simply required to verbally report the side(s) of stimulation, irrespective of the sensory modality.

The results showed that a visual stimulus presented near the seen right rubber hand induced strong cross-modal visual–tactile extinction, comparable to that obtained by presenting the same visual stimulus near the patients' real hand. Critically, this cross-modal effect was strongly reduced when the seen rubber hand was arranged in an implausible posture (i.e., misaligned with respect to the subjects' right shoulder).

These findings clearly show that the integrated processing of visuo-tactile inputs in near-peripersonal space can also be activated solely on the basis of visual information regarding hand position, whereas proprioception per se does not seem to be crucial for the distinction between near and far space. The results also provide neuropsychological evidence that the human brain can form visual representations of the near-peripersonal space of a non-owned body part, like a rubber hand, as if it were a real hand. Indeed, the multisensory system coding hand-centered peripersonal space in humans uses mainly visual information to compute the spatial relation between the positions of hands and visual stimuli (see also [48]). Thus, vision of a fake hand can dominate over conflicting information provided by proprioception, such that a visual stimulus presented far from a patient's real hand is processed as if it were in near-peripersonal space. However, this phenomenon can take place only if the fake hand looks plausible with respect to the patient's body, showing that visual dominance is not complete, and probably need not be, because in normal situations vision and proprioception convey congruent information. When extremely conflicting information is provided by the different senses, the seen rubber hand may no longer be processed as a personal belonging and, thus, does not capture the felt hand position. It is important to underline that this result is again fully consistent with the neurophysiological evidence. When a realistic fake arm is visible instead of the animal's own arm, the activity of many visuo-tactile neurons in ventral premotor and parietal areas is modulated by the congruent or incongruent location of the seen fake arm [24], [27].

Besides interfering with tactile perception, in some circumstances the vision of a body part can actually enhance tactile sensitivity, both in neurological patients [31], [32] and in normal subjects [62], [63]. Seeing the corporeal origin of a tactile sensation can ameliorate impaired somatosensory perception in brain-damaged patients with hemisensory loss of the upper limb [31], [32], [52]. Similar to what was described earlier, this has also been shown when the seen body part is a fake, rubber replica of a hand. Rorden et al. [59], for example, have shown that a patient with hemisensory loss could recover lost tactile sensitivity if a visual stimulus, presented in spatial congruence with the unseen tactile stimulation of the affected hand, was attached to a rubber hand mimicking the real one. By taking advantage of the rubber hand illusion [3], [55], the patient's tactile sensitivity deficit could be strongly ameliorated. Once again, the occurrence of cross-modal enhancement of tactile sensation strictly depended upon the rubber hand's orientation with respect to the patient's shoulder. Indeed, the improvement was present only when the seen position of the rubber hand was visually compatible with the patient's body, that is, when the rubber hand was superimposed and aligned with the subject's hidden hand. Moreover, Kennett et al. [36] have recently shown in normal subjects that not only tactile detection, but also two-point discrimination, can be improved by viewing the tactually stimulated body part through a magnifying glass, even without seeing the tactile stimulation itself.

How far does the representation of near-peripersonal space spread out from the body surface? Is the extension of peripersonal space fixed, or can it be modified? If it can be modified, by what kind of experience? Is a simple change of our visual body image sufficient to dynamically re-map far space as near, or is some kind of visuo-motor activity necessary to produce this re-mapping? These important questions concerning near-peripersonal space in humans are tightly linked to, and partially suggested by, the available neurophysiological evidence. Recent animal studies have examined whether the near-peripersonal space of the monkey's hands, and especially its spatial extension and location, can be modified through different kinds of sensorimotor experience. So far, the manipulations attempted have concerned the use of a tool as an extension of the reachable space, and the online visual control of hand movements projected on a video monitor [34], [35], [53].

A re-coding of relatively far visual stimuli as nearer ones has been observed in monkey single-cell studies after extensive experience with the use of a tool. In these studies, a rake-shaped tool was used to connect the animal's hand with objects located outside its reaching distance, with the result of actually extending the hand's reachable space. Once the monkeys were trained to use the rake to retrieve distant food, a few minutes of tool use induced an expansion of the visual RFs of bimodal neurons recorded in the parietal cortex. This rapid expansion along the tool axis seemed to incorporate the tool into the representation of the hand's peripersonal space. The extended visual RF contracted back to its pre-tool-use dimensions after a short rest, even if the monkey was still passively holding the rake [34]. Therefore, the tool-use-related expansion of the visual RFs was strictly dependent upon the intended use of the rake to reach distant objects. No modification was ever found when the monkey was just passively holding the tool.
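
The reported time course suggests a simple descriptive model of the RF extent. The sketch below is ours, under stated assumptions: the tool length, the two-minute decay constant, and the exponential relaxation are illustrative choices, not parameters estimated from the recordings:

```python
import numpy as np

def visual_rf_extent(base=0.10, tool_length=0.40,
                     active_use=False, minutes_since_use=None):
    """Toy model of tool-use-dependent RF expansion (metres).

    Active raking extends the peri-hand visual RF along the tool axis;
    after use stops, the expansion relaxes back to baseline within a
    few minutes, even if the tool is still passively held."""
    if active_use:
        return base + tool_length
    if minutes_since_use is not None:
        return base + tool_length * np.exp(-minutes_since_use / 2.0)
    return base  # passive holding alone never expands the RF

print(visual_rf_extent(active_use=True))        # 0.50: expanded along the rake
print(visual_rf_extent(minutes_since_use=5.0))  # ~0.13: contracted after rest
print(visual_rf_extent())                       # 0.10: passive holding only
```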

A similar re-coding of visual stimuli located in far-peripersonal space, as if they were closer to the participant's body, has been found behaviorally in a group of right brain-damaged patients with tactile extinction by Farnè and Làdavas [19]. In this study, cross-modal visual–tactile extinction was assessed by presenting visual stimuli far from the patients' ipsilesional hand, at the distal edge of a rake statically held in that hand. The results showed that cross-modal extinction was more severe after the patients used the rake to retrieve distant objects than in a condition in which the rake was not used. The evidence of an expansion of peri-hand space lasted only a few minutes after tool use. After a 5-min resting period, the amount of cross-modal extinction was comparable to that obtained before tool use, suggesting that the spatial extension of the hand's peripersonal space had contracted back towards the patients' hand. Finally, pointing movements towards distant objects produced cross-modal extinction entirely comparable to that obtained in the pre-tool-use condition, showing that the expansion of hand peripersonal space is strictly dependent upon the use of the tool [19], [47], aimed at physically reaching objects located outside the hand's reaching space, and does not merely result from directional motor activity.

A phenomenon related to the expansion of peri-hand space is that recently described by Berti and Frassinetti [2] in a neglect patient who showed rightward errors in line bisection with a laser pointer in near but not in far space. However, when she bisected far lines with a long stick, which brought the unreachable lines into reachable space, her neglect re-emerged. In this case too, the re-mapping of space was apparent only when the patient used the stick to reach far objects, in this case lines.

In conclusion, these neurophysiological and neuropsychological findings constitute direct evidence that the representation of peri-hand space can be expanded along the tool axis to include the tool's length, and show that the re-mapping of far space as near space can be achieved through a re-sizing of the peri-hand area where visual–tactile integration occurs.

Another way of extending one's own visual peripersonal space has very recently been shown by Iriki et al. [35]. In this interesting study, the authors trained monkeys to recognize the image of their hand on a video monitor. Before the training, no neuron responded to visual stimuli presented around the image of the hand on the monitor screen; but immediately after the monkey learned to recognize the self-image on the monitor, a group of neurons appeared with new visual RFs formed around the screen image of the hand. A similar re-coding of far visual stimuli as near ones has recently been observed in a patient with tactile extinction and in normal subjects [48], [49]. In this patient, a flash of light actually delivered close to the right hand, but appearing in far space because it was observed via a mirror, produced effects closely similar to those produced by a light directly viewed near the hand in peripersonal space. Thus, seeing one's own hand via a mirror activates a representation of the peripersonal space around that hand, not of the extrapersonal space suggested by the distant visual image in the mirror. A mirror can therefore be considered a different kind of tool, one that allows objects located in far space to be treated as falling close to the actual location of the patient's hand.

Summarizing, tools enable human beings, as well as other animals, to act on objects that are not directly reachable by hand. Acting on distant objects by means of a tool requires sensory information that is mainly provided by vision and touch. The expansion of the peri-hand area within which vision and touch are integrated would make it possible to reach and manipulate far objects as if they were near the hand.

The results of the fake-hand study show that, owing to the dominance of vision over proprioception, the system coding the space surrounding the body can be deceived by the vision of a fake hand, provided that its appearance looks plausible with respect to the subject's body. Although this deception may seem surprising, it can be better understood as the result of a normal adaptive process. Because the visual response of the monkey's bimodal neurons does not change after repeated stimulation [28], it has been suggested that the basic functional properties of these neurons, e.g., the distance-dependent gradient of firing, are hard-wired, and that the spatial correspondence between visual and tactile RFs can be further calibrated through experience [28], [61]. While a major experience-dependent change in the extent of visual RFs can be induced by tool use, as reported above, the experience through which spatial calibration can be achieved would consist of repeated exposure to visual stimuli approaching the hand, and vice versa. In most of these instances, both the visual stimulus and the hand are under visual control, and the felt position of the hand is congruent with its seen position. Thus, the deception operated by a rubber hand may reflect a sort of impenetrability of the integrated visual–tactile system to discrepant information provided by proprioception and vision which, in Bayesian terms, will normally have little chance of conveying conflicting information. In addition, it also shows that this impenetrability persists despite the subject's conscious awareness of the actual discrepancy between the senses.
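
The Bayesian intuition can be sketched in causal-inference terms (our framing, not developed in the original): let $C = 1$ denote the hypothesis that the seen and the felt hand signals, $x_V$ and $x_P$, originate from one and the same hand. Then

$$P(C=1 \mid x_V, x_P) = \frac{P(x_V, x_P \mid C=1)\,P(C=1)}{\sum_{c\in\{0,1\}} P(x_V, x_P \mid C=c)\,P(C=c)},$$

where a strong prior $P(C=1)$, built up by lifelong exposure to congruent inputs, makes the common-source interpretation, and hence visual capture of felt hand position, the default outcome. Only a gross visuo-proprioceptive discrepancy, such as an implausibly oriented rubber hand, pushes the posterior toward the independent-sources interpretation and abolishes the capture.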

Owing to such a hard-wired property, the integrated processing of visual peripersonal space could also be resistant to other types of subjective knowledge or expectation, for example, that regarding the possibility of being touched. Recent brain imaging studies revealed that activation of the primary and secondary somatosensory cortices, similar to that obtained following tactile stimulation, can be induced when no touch is actually delivered but the subject is waiting for it [6]. The common pattern of cerebral activity during anticipation and real somatosensory stimulation indicates that anticipation may invoke sustained top-down regulation of neural processing.

These findings raise the question of whether multisensory processing of the space near the hand can be affected by the subject's expectancy of being touched. On the basis of the above-cited resistance of the multisensory system to the subject's conscious awareness, one should expect this system to process proximal visual stimuli independently of their actual possibility of touching the subject's body. In order to test the hypothesis that peripersonal space representation is not affected by cognitive top-down regulation, a study was conducted in which the right hand of a patient with left tactile extinction was covered by a transparent Plexiglas barrier. In this experimental setting, the patient clearly knows that the transparent barrier prevents any possibility for the visual stimulus (the experimenter's finger) to come into physical contact with his/her hand. However, the Plexiglas prevents neither the direct vision of the right hand nor its proprioceptive processing. Both of these sensory sources provide congruent inputs concerning the spatial proximity of the hand and the visual stimulus. This information should activate the integrated visual–tactile system responsible for coding the visual space near the body, despite the interposition of a protective, but transparent, barrier between the experimenter's and the patient's hands. As a consequence, we should expect to find cross-modal extinction also when the right hand is covered by the Plexiglas.

In order to verify this prediction, a patient with left tactile extinction was examined in a cross-modal visuo-tactile paradigm similar to the previous one. The experimenter's hand was placed either in the near- or the far-peripersonal space of the patient's hand, which, in separate blocks, could be “protected” or not by the Plexiglas.


Case report

Patient RMA is a 52-year-old lady who gave her informed consent to participate in the study. She suffered a haemorrhagic stroke in the right hemisphere 3 months before the test (Table 1). A CT scan revealed a right unilateral lesion involving the frontal and parietal lobes. On clinical examination she was alert and oriented in time and space. No visual field defects were evident when assessed by Goldmann perimetry. She did not present any sign of visual neglect, as assessed by Albert's test.

Results

Patient RMA never responded to catch trials. A substantial extinction phenomenon emerged in the cross-modal conditions. Overall, the left tactile stimulus was detected on only 29% of trials when presented together with a simultaneous touch on the right, whereas it was detected on 90% of trials when presented alone (p < 0.0001 by Fisher exact test). Most importantly, cross-modal extinction varied according to the distance of the visual stimulus from the patient's hand: it was more severe when the visual stimulus was presented in the near- than in the far-peripersonal space of the patient's hand.
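
The reported contrast can be reproduced with a standard Fisher exact test. The trial counts below are hypothetical, chosen only to be consistent with the reported detection rates, since the actual numbers of trials are not given in this excerpt:

```python
from scipy.stats import fisher_exact

# Hypothetical counts matching the reported rates: 9/31 left touches
# detected under double stimulation (~29%), 27/30 detected when the
# left touch was delivered alone (~90%).
table = [[9, 22],   # double stimulation: detected, missed
         [27, 3]]   # single stimulation: detected, missed
odds_ratio, p = fisher_exact(table)
print(f"p = {p:.1e}")  # well below 0.0001 for counts of this order
```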

Discussion and conclusions

The results of the single case study clearly show that visual stimuli presented in the near-peripersonal space of the ipsilesional hand induce strong cross-modal extinction of tactile stimuli simultaneously delivered on the contralesional hand, independently of the patient's knowledge concerning the real possibility of being touched. Therefore, this knowledge does not prevent the activation of the somatotopic representation of the right hand by a visual stimulus presented near that hand.

References (66)

  • J.B. Mattingley et al., Attentional competition between modalities: extinction between touch and vision after right hemisphere damage, Neuropsychologia (1997)
  • G. Rizzolatti et al., The organization of the cortical motor system: new concepts, Electroencephalogr. Clin. Neurophysiol. (1998)
  • G. Rizzolatti et al., Afferent properties of periarcuate neurons in macaque monkeys. II. Visual responses, Behav. Brain Res. (1981)
  • M.B. Bender, Disorders in Perception (1952)
  • A. Berti et al., When far becomes near: remapping of space by tool use, J. Cog. Neurosci. (2000)
  • M. Botvinick et al., Rubber hands “feel” touch that eyes see, Nature (1998)
  • K. Carlsson et al., Tickling expectations: neural processing in anticipation of a sensory stimulus, J. Cog. Neurosci. (2000)
  • C.L. Colby et al., Ventral intraparietal area of the macaque: anatomic location and visual response properties, J. Neurophysiol. (1993)
  • P. Dassonville, Haptic localization and the internal representation of the hand in space, Exp. Brain Res. (1995)
  • G. di Pellegrino et al., Spatial extinction to double asynchronous stimulation, Neuropsychologia (1997)
  • G. di Pellegrino et al., Seeing where your hands are, Nature (1997)
  • J. Driver, The neuropsychology of spatial attention
  • J. Driver et al., Extinction as a paradigm measure of attentional bias and restricted capacity following brain injury
  • J.R. Duhamel et al., Congruent representation of visual and somatosensory space in single neurons of monkey ventral intra-parietal area (VIP)
  • J.R. Duhamel et al., Ventral intraparietal area of the macaque: congruent visual and somatic response properties, J. Neurophysiol. (1998)
  • J. Duncan, The locus of interference in the perception of simultaneous stimuli, Psychol. Rev. (1980)
  • J. Duncan, Coordinated brain systems in selective perception and action
  • A. Farnè et al., Dynamic size-change of hand peripersonal space following tool use, Neuroreport (2000)
  • A. Farnè et al., Auditory peripersonal space in humans, J. Cog. Neurosci. (2002)
  • A. Farnè et al., Left tactile extinction following visual stimulation of a rubber hand, Brain (2000)
  • L. Fogassi et al., Coding of peripersonal space in inferior premotor cortex (area F4), J. Neurophysiol. (1996)
  • L. Fogassi et al., Visual responses in the dorsal premotor area F2 of the macaque monkey, Exp. Brain Res. (1999)
  • M.S.A. Graziano, Where is my arm? The relative role of vision and proprioception in the neuronal representation of limb position, Proc. Nat. Acad. Sci. (1999)