Scientific article
Open access

On the use of the auditory pathway to represent image scenes in real-time

Published in: Neurocomputing, vol. 72, no. 4, p. 839-849
Publication date: 2009

The See ColOr interface transforms a small portion of a colored video image into sound sources represented by spatialized musical instruments. The conversion of colors into sounds is achieved by quantizing the HSL color system. Our purpose is to provide visually impaired individuals with real-time perception of their environment. In this work we present the design principles of the system and several experiments carried out by blindfolded participants with See ColOr prototypes, involving static pictures on a tablet and simple video images. The goal of the first experiment was to identify the colors of the main features of static pictures and then to interpret the image scenes. Although learning all the instrument sounds in a single training session proved too difficult, participants found that colors helped narrow down the possible image interpretations. The experiments on static-picture analysis suggested that the slow-down factor incurred by using the auditory channel instead of the visual channel is of the same order of magnitude as the ratio of visual channel capacity to auditory channel capacity. Afterwards, two experiments based on a head-mounted camera were performed. The first, pertaining to object manipulation, is based on pairing colored socks, while the second is related to outdoor navigation, with the goal of following a colored serpentine painted on the ground. The “socks” experiment demonstrated that blindfolded individuals were able to accurately match pairs of colored socks. The same participants, joined by a blind individual, successfully followed a red serpentine painted on the ground for more than 80 m. Based on task durations, the slow-down factor for the “socks” and “serpentine” experiments is of order one.
From a cognitive perspective this would be consistent with the fact that these two tasks are simpler than the interpretation of image scenes.
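The color-to-sound conversion described in the abstract (quantizing the HSL color space so that hue selects a spatialized musical instrument) can be sketched as follows. The hue boundaries, instrument names, and lightness-to-pitch rule below are illustrative assumptions for the sake of a runnable example, not the paper's exact mapping:

```python
import colorsys

# Hypothetical hue sectors mapped to instruments; the actual See ColOr
# quantization (sector boundaries and instrument choices) may differ.
HUE_INSTRUMENTS = [
    (30,  "oboe"),      # red-ish hues:   [0, 30)
    (90,  "trumpet"),   # yellow-ish:     [30, 90)
    (150, "flute"),     # green-ish:      [90, 150)
    (210, "clarinet"),  # cyan-ish:       [150, 210)
    (270, "piano"),     # blue-ish:       [210, 270)
    (330, "violin"),    # purple-ish:     [270, 330)
    (360, "oboe"),      # wraps around back to red
]

def sonify_pixel(r, g, b):
    """Map an RGB pixel to (instrument, pitch_level) via HSL quantization.

    Hue selects the instrument; lightness is quantized into four
    discrete pitch levels (an illustrative rule, not the paper's).
    """
    # colorsys works on floats in [0, 1] and returns hue, lightness, saturation.
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    instrument = next(name for bound, name in HUE_INSTRUMENTS if hue_deg < bound)
    pitch_level = min(3, int(l * 4))
    return instrument, pitch_level

print(sonify_pixel(255, 0, 0))  # pure red
print(sonify_pixel(0, 0, 255))  # pure blue
```

In a full system, each sampled pixel of the image row would be converted this way and rendered as a spatialized sound source whose azimuth encodes the pixel's horizontal position.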

  • Sensory substitution
  • Color sonification
  • Sound spatialization
  • Blind navigation
Citation (ISO format)
BOLOGNA, Guido, DEVILLE, Benoît, PUN, Thierry. On the use of the auditory pathway to represent image scenes in real-time. In: Neurocomputing, 2009, vol. 72, n° 4, p. 839–849. doi: 10.1016/j.neucom.2008.06.020
Main files (1)
Article (Accepted version)
ISSN of the journal: 0925-2312

Technical information

Creation: 03/03/2015 4:36:06 PM
First validation: 03/03/2015 4:36:06 PM
Update time: 03/14/2023 10:56:41 PM
Status update: 03/14/2023 10:56:41 PM
Last indexation: 08/29/2023 3:03:32 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva