Abstract
Mental rotation is the capacity to predict the outcome of spatial relationships after a change in viewpoint. Such changes arise either from the rotation of the test object array or from the rotation of the observer. Previous studies showed that the cognitive cost of mental rotations is reduced when the viewpoint change results from the observer's motion, which was explained by the spatial updating mechanism engaged during self-motion. However, little is known about how the various sensory cues available contribute to updating performance. We used a virtual reality setup in a series of experiments to investigate table-top mental rotations under different combinations of visual, bodily, and auditory modalities. We found that mental rotation performance gradually improved as sensory cues were added for the moving observer (from None to Body or Vision, and then to Body & Audition or Body & Vision), whereas processing time dropped to the same level in all sensory contexts. These results are discussed in terms of an additive contribution of co-activated sensory modalities to the spatial updating mechanism engaged during self-motion. Interestingly, this multisensory approach can account for several divergent findings reported in the literature.
Field | Value
---|---
Original language | English
Pages (from-to) | 59-68
Number of pages | 10
Journal | Experimental Brain Research
Volume | 197
Issue number | 1
DOIs |
Publication status | Published - Jul 2009
Bibliographical note
Funding Information: This work was presented at the IMRF 2008 conference. Manuel Vidal received a post-doctoral scholarship from the Max Planck Society and Alexandre Lehmann received a doctoral scholarship from the Centre National de la Recherche Scientifique. We are grateful to the workshop of the Max Planck Institute for the construction of the table set-up.
Keywords
- Mental rotations
- Multisensory
- Spatial updating
- Virtual reality
ASJC Scopus subject areas
- Neuroscience (all)