UNIGE document Scientific Article

On the use of the auditory pathway to represent image scenes in real-time

Published in Neurocomputing. 2009, vol. 72, no. 4, p. 839-849
Abstract The See ColOr interface transforms a small portion of a colored video image into sound sources rendered as spatialized musical instruments. The conversion of colors into sounds is achieved by quantization of the HSL color system. Our purpose is to enable visually impaired individuals to perceive their environment in real time. In this work we present the design principles of the system and several experiments carried out by blindfolded persons with See ColOr prototypes, involving static pictures on a tablet and simple video images. The goal of the first experiment was to identify the colors of the main features of static pictures and then to interpret the image scenes. Although learning all instrument sounds in a single training session proved too difficult, participants found that colors helped to narrow down the possible image interpretations. The experiments on the analysis of static pictures suggested that the order of magnitude of the slow-down factor incurred by using the auditory channel instead of the visual channel could correspond to the order of magnitude of the ratio of visual channel capacity to auditory channel capacity. Afterwards, two experiments based on a head-mounted camera were performed. The first, pertaining to object manipulation, is based on the pairing of colored socks, while the second is related to outdoor navigation, with the goal of following a colored serpentine painted on the ground. The “socks” experiment demonstrated that blindfolded individuals were able to accurately match pairs of colored socks. The same participants, joined by a blind individual, successfully followed a red serpentine painted on the ground for more than 80 m. According to task durations, the order of magnitude of the slow-down factor in the “socks” and “serpentine” experiments could be equal to one. From a cognitive perspective this would be consistent with the fact that these two tasks are simpler than the interpretation of image scenes.
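The abstract states that colors are turned into sounds by quantizing the HSL color space. As a minimal sketch of this idea (not the actual See ColOr mapping, which is described in the full paper), one can partition the hue circle into sectors, assign each sector an instrument, and let luminance drive loudness; the instrument list and the saturation threshold below are assumptions for illustration:

```python
import colorsys

# Hypothetical instrument assignment per hue sector; the real
# See ColOr color-to-instrument mapping differs and is given in the paper.
INSTRUMENTS = ["oboe", "viola", "pizzicato violin", "flute",
               "trumpet", "piano", "saxophone"]

def sonify_pixel(r, g, b):
    """Quantize an RGB pixel (components in 0-255) in HSL space.

    Returns (sound_label, loudness), where loudness follows luminance
    and near-achromatic pixels fall back to a grey-level sound.
    """
    # colorsys works on [0, 1] floats and returns (hue, lightness, saturation)
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.2:  # assumed threshold: low saturation treated as grey
        return ("grey noise", l)
    idx = int(h * len(INSTRUMENTS)) % len(INSTRUMENTS)
    return (INSTRUMENTS[idx], l)
```

For example, a pure red pixel falls into the first hue sector, while a mid-grey pixel is routed to the achromatic sound regardless of hue.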
Keywords: Sensory substitution; Color sonification; Sound spatialization; Blind navigation
Research groups Computer Vision and Multimedia Laboratory
Multimodal Interaction Group
Citation (ISO format)
BOLOGNA, Guido, DEVILLE, Benoît, PUN, Thierry. On the use of the auditory pathway to represent image scenes in real-time. In: Neurocomputing, 2009, vol. 72, n° 4, p. 839-849. doi: 10.1016/j.neucom.2008.06.020 https://archive-ouverte.unige.ch/unige:47416




Deposited on: 2015-03-03
