Scientific Article

Multisensory spatial representations in eye-centered coordinates for reaching

Ducom, Jean Christophe
Torri, Jeffrey
Published in Cognition. 2002, vol. 83, no. 1, p. B1-11
Abstract
Humans can reach for objects with their hands whether the objects are seen, heard or touched. Thus, the position of objects is recoded in a joint-centered frame of reference regardless of the sensory modality involved. Our study indicates that this frame of reference is not the only one shared across sensory modalities. The location of reaching targets is also encoded in eye-centered coordinates, whether the targets are visual, auditory, proprioceptive or imaginary. Furthermore, the remembered eye-centered location is updated after each eye and head movement. This is quite surprising since, in principle, a reaching motor command can be computed from any non-visual modality without ever recovering the eye-centered location of the stimulus. This finding may reflect the predominant role of vision in human spatial perception.
Keywords: Adult; Attention; Discrimination Learning; Female; Form Perception; Humans; Male; Orientation; Psychomotor Performance; Psychophysics; Sound Localization; Stereognosis
PMID: 11814488
Full text
Article (Published version) (278 Kb), public document, free access
Citation (ISO format)
POUGET, Alexandre et al. Multisensory spatial representations in eye-centered coordinates for reaching. In: Cognition, 2002, vol. 83, no. 1, p. B1-11. doi: 10.1016/s0010-0277(01)00163-9




Deposited on: 2017-12-12
