Probing cross-modal interactions in the perception of object motion and self-motion (2009–2011)
Our experience of the world results from integrating information from all our senses simultaneously. Multisensory integration often enhances sensory experience and reduces the perceptual ambiguity that can arise from relying on a single sensory modality. This project aims, first, to probe the nature of auditory/visual interactions in resolving visual displays of objects in motion, looking specifically at the contributions of the sensory and decisional factors involved; and second, to investigate the individual contributions of auditory, vestibular, and visual information, and their combinations, to the perception of self-motion.