Spatial representations during eye movements
We perceive the world around us as stable. This is remarkable given that our body parts, as well as we ourselves, are constantly in motion. Humans and other primates move their eyes more often than their heart beats. Such eye movements lead to coherent motion of the images of the outside world across the retina. Furthermore, during everyday life we constantly approach targets, avoid obstacles or otherwise move through space. These movements induce motion across different sensory receptor epithelia: optic flow across the retina, tactile flow across the body surface, and even auditory flow as detected by the two ears. It is generally assumed that motion signals induced by one's own movements must be identified and differentiated from real motion in the outside world.
In my lecture I will review a number of studies from my lab that aimed to functionally characterize the primate posterior parietal cortex (PPC) and its role in the multisensory encoding of spatial and motion information. Extracellular recordings in the macaque monkey showed that, during steady fixation, the visual, auditory and tactile spatial representations in the ventral intraparietal area (VIP) are congruent. This finding was of major importance given that an fMRI study identified the functional equivalent of macaque area VIP in humans. Further recordings in other areas of the dorsal stream of the macaque visual cortical system pointed towards the neural basis of perceptual phenomena (heading detection during eye movements, saccadic suppression, mislocalization of visual stimuli during eye movements) as determined in psychophysical studies in humans.
Supported by the DFG and the EU.