Research report
The ventriloquist in motion: Illusory capture of dynamic information across sensory modalities

https://doi.org/10.1016/S0926-6410(02)00068-X

Abstract

Integrating dynamic information across the senses is crucial to survival. However, most laboratory studies have only examined sensory integration for static events. Here we demonstrate that strong crossmodal integration can also occur for an emergent attribute of dynamic arrays, specifically the direction of apparent motion. The results of the present study show that the perceived direction of auditory apparent motion is strongly modulated by apparent motion in vision, and that both spatial and temporal factors play a significant role in this crossmodal effect. We also demonstrate that a split-brain patient who does not perceive visual apparent motion across the midline is immune to this audiovisual dynamic capture effect, highlighting that motion must actually be experienced for this new multisensory illusion to occur.

Introduction

Objects and events in our everyday environments typically produce correlated input to several sensory modalities simultaneously (i.e., they are multisensory), as well as information about movement with respect to the observer (i.e., they are dynamic). However, crossmodal interactions involving dynamic stimuli have rarely been studied in the laboratory and are consequently poorly understood (e.g., Refs. [1], [16], [17], [22], [25], [29]). Determining whether multisensory interactions occur in the domain of motion perception, and elucidating the factors that modulate this integration, is important because it should enhance our understanding of the information processing that takes place in the dynamic multisensory environments of everyday life.

Previous research using spatially static events has revealed that information presented to different modalities is frequently integrated into a unitary multisensory percept (see Refs. [7], [27] for reviews). For example, in the classic ventriloquist illusion, people often mislocalize a static sound toward a light flash presented concurrently at a different spatial location (see Refs. [2], [3], [6]). The picture is less clear, however, when motion is introduced in two sensory modalities. While some researchers have reported that concurrent visual apparent motion can modulate the ability to experience auditory apparent motion [16], [29], others have found null results [1]. Moreover, previous studies disagree about the nature of this dynamic crossmodal interaction (e.g., Refs. [1], [16], [29]). Some suggest that the relative direction of motion of the stimuli (whether they move in the same or different directions) may not play a critical role in the integration of information from dynamic events in different modalities [1], [16], while others have reported strong congruency effects [29]. Beyond these conflicting results, all previous studies of dynamic crossmodal integration suffer from potential shortcomings. Most of them confounded sensory modality with spatial location [1], [16]: that is, stimuli in different modalities were presented from different possible spatial locations (e.g., sounds were typically presented over headphones and lights from LEDs in front of the observer). Given the important role that spatial coincidence plays in crossmodal integration (e.g., Ref. [14]), it is possible that the mixed results obtained in previous studies reflect differences in integrating information across different physical locations. Critically, the only study in which spatial location was not confounded with sensory modality [29] relied solely on phenomenological reports from the participants, compromising the reliability and replicability of the findings (see Ref. [1] for an extended discussion).

In the present study, the perceived direction of apparent motion in audition was evaluated as a function of the direction of apparent motion in vision. Synchrony (sounds and lights presented synchronously versus asynchronously) and Congruency (sounds and lights moving in the same versus opposite directions) were factors in all the experiments. The introduction of synchrony as a factor allowed us to assess the effects of temporal coincidence in multisensory integration, and to distinguish the congruency effects due to post-perceptual stages of processing from the congruency effects due to perceptual integration. Note that post-perceptual processes include simple response biases from the irrelevant modality or confusion about which modality the participant had to respond to, and as such, their impact should be equivalent across synchronous and asynchronous conditions. By contrast, the effects due to perceptual integration should be revealed only in the synchronous condition, as crossmodal integration between two events breaks down quickly as one moves away from their simultaneous occurrence (e.g., Refs. [2], [14], [21], [22], [27]).
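
The logic of this design can be made concrete with a minimal Python sketch (ours, not part of the original study; the accuracy values are purely illustrative):

```python
# Minimal sketch of the Synchrony x Congruency logic, using hypothetical
# per-condition accuracies (proportion of trials on which the direction
# of auditory apparent motion was reported correctly).
accuracy = {
    ("synchronous",  "congruent"):   0.90,  # illustrative values only
    ("synchronous",  "incongruent"): 0.45,
    ("asynchronous", "congruent"):   0.85,
    ("asynchronous", "incongruent"): 0.80,
}

def congruency_effect(sync: str) -> float:
    """Accuracy cost of vision moving opposite to audition."""
    return accuracy[(sync, "congruent")] - accuracy[(sync, "incongruent")]

# Response bias or modality confusion should affect synchronous and
# asynchronous trials alike, so the asynchronous congruency effect
# estimates the post-perceptual component...
post_perceptual = congruency_effect("asynchronous")

# ...and any extra congruency effect present only under synchrony is the
# signature of perceptual integration (dynamic capture).
dynamic_capture = congruency_effect("synchronous") - post_perceptual

print(f"post-perceptual component: {post_perceptual:+.2f}")
print(f"dynamic capture (perceptual): {dynamic_capture:+.2f}")
```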

In Experiments 1 and 2, we tested multisensory integration of dynamic information under conditions of spatial coincidence and spatial displacement. As noted above, this factor may be a key reason for the mixed results reported in previous studies. In Experiment 3, we evaluated the degree to which the sensation of motion modulates perceptual integration in the present illusion, allowing us to test directly the role of dynamic vs. non-dynamic factors. Finally, in Experiment 4 we examined the crossmodal integration of motion information in a split-brain patient, to further evaluate the importance of dynamic properties of the visual stimulus in this type of crossmodal integration.

Section snippets

Participants

We tested undergraduates from the University of British Columbia who volunteered in exchange for course credit. All participants reported normal hearing and normal or corrected-to-normal vision.

Apparatus and materials

Two loudspeaker cones (Audax VE100AO) were positioned 15 cm to either side of the participants’ midline (30 cm center to center), and each loudspeaker was connected to one channel (left/right) of the computer’s soundcard (ProAudio Basic 16, MediaVision). An orange LED (64.3 cd/m²) was centered in front…
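
To make the stimulus concrete, here is a minimal Python sketch (our illustration, not the authors' code; tone frequency, duration, and ISI are placeholder values) that builds the kind of two-channel auditory apparent-motion stimulus this setup delivers:

```python
# Sketch of a left-to-right auditory apparent-motion stimulus: a brief
# tone on the left channel, an inter-stimulus interval, then the same
# tone on the right channel. All stimulus parameters are placeholders.
import numpy as np
from scipy.io import wavfile  # assumes SciPy is available

RATE = 44100                              # samples per second
TONE_MS, ISI_MS, FREQ_HZ = 50, 100, 2000  # placeholder values

def tone(duration_ms: int) -> np.ndarray:
    t = np.arange(int(RATE * duration_ms / 1000)) / RATE
    return 0.5 * np.sin(2 * np.pi * FREQ_HZ * t)

burst = tone(TONE_MS)
gap = np.zeros(int(RATE * ISI_MS / 1000))
silence = np.zeros_like(burst)

left = np.concatenate([burst, gap, silence])    # tone first on the left,
right = np.concatenate([silence, gap, burst])   # then on the right
stereo = np.stack([left, right], axis=1).astype(np.float32)

wavfile.write("auditory_apparent_motion.wav", RATE, stereo)
```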

Experiment 1

This experiment (n=25) included two separate blocks of trials (13 participants completed the two blocks in one order and the rest in the reverse order): the two blocks were identical except that auditory stimuli were presented over headphones in one block and from loudspeakers situated directly behind the LEDs in the other.

Individual accuracy in determining the direction of auditory apparent motion for each condition was submitted to an analysis of variance (ANOVA) including…
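
As a sketch of the kind of within-subjects analysis described above (an assumed pipeline run on simulated data, not the authors' code or results), one could write:

```python
# Repeated-measures ANOVA with Synchrony and Congruency as within-subject
# factors, run on simulated data; the Synchrony x Congruency interaction
# is the statistical signature of dynamic capture.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
for subj in range(1, 26):  # n = 25, as in Experiment 1
    for sync in ("synchronous", "asynchronous"):
        for cong in ("congruent", "incongruent"):
            # Simulated effect: accuracy drops only when synchronous
            # visual motion conflicts with the sounds.
            base = 0.50 if (sync, cong) == ("synchronous", "incongruent") else 0.85
            rows.append({
                "subject": subj, "synchrony": sync, "congruency": cong,
                "accuracy": float(np.clip(base + rng.normal(0, 0.05), 0, 1)),
            })
df = pd.DataFrame(rows)

result = AnovaRM(df, depvar="accuracy", subject="subject",
                 within=["synchrony", "congruency"]).fit()
print(result)  # F and p values for both main effects and the interaction
```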

Experiment 2

In this experiment (n=12), sounds were always presented from external loudspeakers placed 30 cm apart (as in Experiment 1, see Fig. 1), while the location of the LEDs was varied systematically across three arrangements: the Overlapping Close arrangement (LEDs placed 10 cm apart, centered between the two loudspeakers); the Overlapping Far arrangement (LEDs placed 50 cm apart, each 10 cm outside its nearest loudspeaker); and the Orthogonal arrangement (LEDs placed 30 cm apart, centered vertically on the midline of the setup). The positions implied by these descriptions are summarized in the sketch below.
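
In coordinate form, the three arrangements amount to the following positions (a plain restatement of the distances given above, not additional measurements from the paper):

```python
# Stimulus positions implied by the text, in centimeters relative to the
# center of the setup (x: left/right, y: up/down). A restatement of the
# distances described above, not additional measurements from the paper.
SPEAKERS = [(-15, 0), (+15, 0)]  # 30 cm apart, as in Experiment 1

LED_ARRANGEMENTS = {
    "overlapping_close": [(-5, 0), (+5, 0)],    # 10 cm apart, between the speakers
    "overlapping_far":   [(-25, 0), (+25, 0)],  # each 10 cm outside a speaker
    "orthogonal":        [(0, -15), (0, +15)],  # 30 cm apart vertically, on the midline
}
```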

Experiment 3

In Experiment 3 (n=24), we kept the spatial arrangement fixed and varied the interstimulus interval (ISI) between the two pairs of events (first and second light, and first and second sound). We used the known relationship between ISI and the strength of apparent motion (apparent motion weakens as the ISI between the two events increases; e.g., Refs. [11], [28] in vision; [4] in audition) to measure dynamic capture at different degrees of perceived motion. The ISIs used were 50, 100, 300,…
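
The timing manipulation can be illustrated with a short Python sketch (our illustration; the audiovisual offset for asynchronous trials and the event duration are placeholder values, not taken from the paper):

```python
# Event timeline for one left-to-right trial: two sounds separated by the
# ISI under test, plus two lights that are either synchronous with the
# sounds or shifted by a placeholder lag on asynchronous trials.
def trial_timeline(isi_ms: int, synchronous: bool,
                   event_ms: int = 50, lag_ms: int = 500):
    """Return (onset_ms, modality, side) tuples, sorted by onset."""
    sound_onsets = [0, event_ms + isi_ms]
    light_shift = 0 if synchronous else lag_ms  # placeholder offset
    events = [(t, "sound", side) for t, side in zip(sound_onsets, "LR")]
    events += [(t + light_shift, "light", side)
               for t, side in zip(sound_onsets, "LR")]
    return sorted(events)

for isi in (50, 100, 300):  # ISIs named in the text
    print(isi, trial_timeline(isi, synchronous=True))
```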

Experiment 4

The fact that cortical areas are crucial to both visual (e.g., Ref. [26]) and auditory (e.g., Refs. [10], [12]) motion processing suggests that, as in many other aspects of crossmodal integration, cortical mechanisms may also be critically involved in the crossmodal integration of dynamic information [5]. Indeed, recent fMRI research has identified several cortical areas potentially involved in this type of crossmodal integration (e.g., Ref. [13]). In addition, as indicated by the…

General discussion

The results of the present study demonstrate a strong crossmodal interaction in the domain of motion perception. In particular, our findings point to the obligatory perceptual integration of dynamic information across sensory modalities, often producing an illusory reversal of the perceived direction of auditory apparent motion (which is unambiguous when presented in isolation). Further support for the uniqueness of this phenomenon comes from the fact that dynamic capture was shown to depend on…

Acknowledgements

This work was supported by a postdoctoral award from the Killam Trust to S.S.-F., and by grants from the Natural Sciences and Engineering Research Council of Canada and from the Human Frontier Science Program to A.K. We thank John McDonald for comments on a previous version of the manuscript.

References (27)

  • M.S. Gazzaniga et al., Semin. Neurol. (1984)
  • T.D. Griffiths et al., Evidence for a sound movement area in the human cerebral cortex, Nature (1996)
  • P.A. Kolers, The illusion of movement, Sci. Am. (1964)