Integration of visual and inertial cues in the perception of angular self-motion

  K. N. De Winkel*, F. Soyka, M. Barnett-Cowan, H. H. Bülthoff, E. L. Groen, P. J. Werkhoven
  *Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    37 Citations (Scopus)

    Abstract

    The brain is able to determine angular self-motion from visual, vestibular, and kinesthetic information. There is compelling evidence that both humans and non-human primates integrate visual and inertial (i.e., vestibular and kinesthetic) information in a statistically optimal fashion when discriminating heading direction. In the present study, we investigated whether the brain also integrates information about angular self-motion in a similar manner. Eight participants performed a 2IFC task in which they discriminated yaw rotations (2-s sinusoidal acceleration) on peak velocity. Just-noticeable differences (JNDs) were determined as a measure of precision in unimodal inertial-only and visual-only trials, as well as in bimodal visual-inertial trials. The visual stimulus was a moving stripe pattern, synchronized with the inertial motion. Peak velocity of comparison stimuli was varied relative to the standard stimulus. Individual analyses showed that the data of three participants exhibited an increase in bimodal precision consistent with the optimal integration model, while the data from the other five participants did not conform to maximum-likelihood integration schemes. We suggest that either the sensory cues were not perceived as congruent, that integration might be achieved with fixed weights, or that estimates of visual precision obtained from non-moving observers do not accurately reflect visual precision during self-motion.
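    The maximum-likelihood (optimal) integration model referenced in the abstract makes a concrete quantitative prediction: if the unimodal visual and inertial estimates have noise standard deviations σ_v and σ_i, the bimodal estimate should weight each cue by its inverse variance, and the bimodal noise should be σ_vi = √(σ_v²σ_i²/(σ_v² + σ_i²)), which is always smaller than either unimodal value. A minimal sketch of this prediction (function names are illustrative, not from the paper; JNDs are assumed proportional to σ):

    ```python
    import math

    def mle_weights(sigma_v, sigma_i):
        """Inverse-variance weights for the visual and inertial cues."""
        w_v = sigma_i**2 / (sigma_v**2 + sigma_i**2)
        return w_v, 1.0 - w_v

    def mle_bimodal_sigma(sigma_v, sigma_i):
        """Predicted bimodal noise under maximum-likelihood integration.

        sigma_vi^2 = (sigma_v^2 * sigma_i^2) / (sigma_v^2 + sigma_i^2)
        This is always <= min(sigma_v, sigma_i): combining cues can only
        improve (or at worst match) the better unimodal precision.
        """
        return math.sqrt((sigma_v**2 * sigma_i**2) / (sigma_v**2 + sigma_i**2))

    # Example with hypothetical unimodal JNDs (deg/s): visual = 3, inertial = 4.
    sigma_v, sigma_i = 3.0, 4.0
    w_v, w_i = mle_weights(sigma_v, sigma_i)
    sigma_vi = mle_bimodal_sigma(sigma_v, sigma_i)
    print(w_v, w_i)      # more weight on the more precise (visual) cue
    print(sigma_vi)      # predicted bimodal JND, smaller than both unimodal JNDs
    ```

    Comparing the measured bimodal JND against this predicted value is the standard test for optimal integration: the three participants described as consistent with the model showed bimodal JNDs near this prediction, while the others did not show the predicted precision gain.
    
    
    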

    Original language: English
    Pages (from-to): 209-218
    Number of pages: 10
    Journal: Experimental Brain Research
    Volume: 231
    Issue number: 2
    DOIs
    Publication status: Published - Nov 2013

    Bibliographical note

    Funding Information:
    Acknowledgments: This research was supported by Grant Number ALWGO-MG/08-04 of the Netherlands Institute for Space Research (SRON) and the European research project SUPRA (contract FP7-233543, www.supra.aero). F.S. and M.B.C. were supported by Max Planck Society stipends. H.H.B. was supported by the WCU (World Class University) program through the National Research Foundation of Korea funded by the Ministry of Education, Science and Technology (R31-10008). The authors wish to thank J. Weesie and R. van de Pijpekamp for scientific discussion and technical support.

    Keywords

    • Inertial
    • Maximum likelihood
    • Multisensory integration
    • Self-motion
    • Vestibular
    • Visual

    ASJC Scopus subject areas

    • General Neuroscience
