Abstract
To perceive the external environment, our brain uses multiple sources of sensory information derived from several different modalities, including vision, touch and audition. All these different sources of information have to be efficiently merged to form a coherent and robust percept. Here we highlight some of the mechanisms that underlie this merging of the senses in the brain. We show that, depending on the type of information, different combination and integration strategies are used and that prior knowledge is often required for interpreting the sensory signals.
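The "combination and integration strategies" mentioned in the abstract are commonly formalized in this literature as statistically optimal (maximum-likelihood) cue combination, where each sensory estimate is weighted by its reliability (inverse variance). The sketch below is an illustration of that general scheme, not code from the article; the function name, the example numbers, and the assumptions of independent Gaussian noise are ours.

```python
import numpy as np

def mle_combine(estimates, variances):
    """Combine independent sensory estimates of the same property
    (e.g. visually and haptically sensed size) by reliability-weighted
    averaging. Under independent Gaussian noise, the maximum-likelihood
    combined estimate weights each cue by its inverse variance."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    reliabilities = 1.0 / variances           # w_i proportional to 1 / sigma_i^2
    weights = reliabilities / reliabilities.sum()
    combined = np.dot(weights, estimates)     # reliability-weighted average of the cues
    combined_var = 1.0 / reliabilities.sum()  # never larger than any single cue's variance
    return combined, combined_var

# Hypothetical example: vision reports 5.0 cm with low noise, touch reports
# 5.6 cm with higher noise; the fused estimate leans toward the reliable cue.
size, var = mle_combine([5.0, 5.6], [0.1, 0.4])
print(size, var)  # ~5.12 cm, variance 0.08
```

In a Bayesian extension of this scheme, the prior knowledge mentioned in the abstract can be treated as one more term entering the same reliability-weighted average alongside the sensory estimates.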
Original language | English |
---|---|
Pages (from-to) | 162-169 |
Number of pages | 8 |
Journal | Trends in Cognitive Sciences |
Volume | 8 |
Issue number | 4 |
DOIs | |
Publication status | Published - 2004 Apr |
Bibliographical note
Funding Information: This work was supported by the Max Planck Society and by the 5th Framework IST Program of the EU (IST-2001-38040, TOUCH-HapSys). We thank Marty Banks, Roberta Klatzky, Andrew Welchman and Knut Drewing for helpful comments on this draft and Martin Breidt for help with the figures.
ASJC Scopus subject areas
- Neuropsychology and Physiological Psychology
- Experimental and Cognitive Psychology
- Cognitive Neuroscience