Scientific Article
Title

Multisensory spatial representations in eye-centered coordinates for reaching

Authors
Pouget, Alexandre
Ducom, Jean-Christophe
Torri, Jeffrey
Published in Cognition. 2002, vol. 83, no. 1, p. B1-11
Abstract
Humans can reach for objects with their hands whether the objects are seen, heard or touched. Thus, the position of objects is recoded in a joint-centered frame of reference regardless of the sensory modality involved. Our study indicates that this frame of reference is not the only one shared across sensory modalities. The location of reaching targets is also encoded in eye-centered coordinates, whether the targets are visual, auditory, proprioceptive or imaginary. Furthermore, the remembered eye-centered location is updated after each eye and head movement. This is quite surprising since, in principle, a reaching motor command can be computed from any non-visual modality without ever recovering the eye-centered location of the stimulus. This finding may reflect the predominant role of vision in human spatial perception.
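The abstract's claim that the remembered eye-centered location is updated after each eye and head movement can be pictured as a simple remapping: when gaze shifts, a target stored relative to the line of sight must shift by the opposite amount. The sketch below is not taken from the paper; the function name update_eye_centered and the 2-D degree-of-visual-angle coordinates are illustrative assumptions only.

    import numpy as np

    def update_eye_centered(target_eye, gaze_shift):
        """Remembered target location, relative to gaze, after an eye movement.

        target_eye : 2-D location of the target relative to the line of sight
                     before the movement (e.g. degrees of visual angle).
        gaze_shift : displacement of gaze produced by the eye or head movement,
                     expressed in the same coordinates.
        """
        # When gaze shifts by gaze_shift, the target's position *relative to
        # the eyes* shifts by the opposite amount.
        return np.asarray(target_eye) - np.asarray(gaze_shift)

    # Example: a target 10 deg right of fixation, followed by a 15 deg rightward
    # saccade, ends up 5 deg left of the new fixation point.
    print(update_eye_centered([10.0, 0.0], [15.0, 0.0]))  # -> [-5.  0.]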
Keywords Adult; Attention; Discrimination Learning; Female; Form Perception; Humans; Male; Orientation; Psychomotor Performance; Psychophysics; Sound Localization; Stereognosis
Identifiers
PMID: 11814488
Full text
Article (Published version, 278 Kb) - Public document, free access
Citation
(ISO format)
POUGET, Alexandre et al. Multisensory spatial representations in eye-centered coordinates for reaching. In: Cognition, 2002, vol. 83, n° 1, p. B1-11. https://archive-ouverte.unige.ch/unige:100158

Deposited on: 2017-12-12
