Low-level integration of auditory and visual motion signals requires spatial co-localisation

  • Research Article
  • Published in Experimental Brain Research

Abstract

It is well known that the detection thresholds for stationary auditory and visual signals are lower if the signals are presented bimodally rather than unimodally, provided the signals coincide in time and space. Recent work on auditory–visual motion detection suggests that the facilitation seen for stationary signals is not seen for motion signals. We investigate the conditions under which motion perception also benefits from the integration of auditory and visual signals. We show that the integration of cross-modal local motion signals that are matched in position and speed is consistent with thresholds predicted by a neural summation model. If the signals are presented in different hemi-fields, move in different directions, or both, then behavioural thresholds are predicted by a probability-summation model. We conclude that cross-modal signals have to be co-localised and co-incident for effective motion integration. We also argue that facilitation is only seen if the signals contain all localisation cues that would be produced by physical objects.
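The two candidate models named above can be sketched in a few lines. The formulations below are illustrative only, not the authors' fitting procedure: probability summation assumes the observer monitors each modality independently and detects if either channel responds, while neural summation is sketched here as Minkowski pooling of unimodal sensitivities (in the spirit of Quick's vector-magnitude model); the exponent `k` is an assumed free parameter.

```python
def prob_summation(p_a: float, p_v: float) -> float:
    """Bimodal detection probability if the auditory and visual
    signals are detected by independent mechanisms: the observer
    misses only when both channels miss."""
    return 1.0 - (1.0 - p_a) * (1.0 - p_v)


def minkowski_sensitivity(s_a: float, s_v: float, k: float = 2.0) -> float:
    """Pooled bimodal sensitivity under neural (Minkowski) summation
    with exponent k: k = 1 is linear summation of the two signals;
    large k approaches winner-take-all (no facilitation)."""
    return (s_a ** k + s_v ** k) ** (1.0 / k)


# Two modalities that are each detected 50% of the time alone:
p_bimodal = prob_summation(0.5, 0.5)            # 0.75

# Equal unimodal sensitivities pooled with k = 2:
s_bimodal = minkowski_sensitivity(1.0, 1.0)     # sqrt(2), about 1.41
```

Because neural summation predicts a larger bimodal gain than probability summation, comparing measured thresholds against the two predictions indicates whether the cross-modal signals were genuinely combined or merely detected in parallel.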

Figures 1–8 (shown in the full article)



Acknowledgements

This work was supported by the EU TMR projects SPHEAR and HOARSE and by the Royal Society. We are grateful to the subjects who took part in the experiments.

Author information

Correspondence to Georg F. Meyer.


Cite this article

Meyer, G.F., Wuerger, S.M., Röhrbein, F. et al. Low-level integration of auditory and visual motion signals requires spatial co-localisation. Exp Brain Res 166, 538–547 (2005). https://doi.org/10.1007/s00221-005-2394-7
