Proceedings chapter
Open access

Transferring Neural Representations for Low-Dimensional Indexing of Maya Hieroglyphic Art

Presented at Amsterdam (The Netherlands), 8-10 and 15-16 October 2016
Publisher: Cham: Springer International Publishing
Publication date: 2016

We analyze the performance of deep neural architectures for extracting shape representations of binary images and for generating low-dimensional representations of them. In particular, we focus on indexing binary images exhibiting compounds of Maya hieroglyphic signs, referred to as glyph-blocks, which constitute a very challenging artistic dataset given their visual complexity and large stylistic variety. More precisely, we demonstrate empirically that intermediate outputs of convolutional neural networks can be used as representations for complex shapes, even when their parameters are trained on gray-scale images, and that these representations can be more robust than traditional handcrafted features. We also show that such representations can be compressed to only three dimensions without losing much of their discriminative structure, so that effective visualizations of Maya hieroglyphs can be rendered for subsequent epigraphic analysis.
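The pipeline the abstract describes — intermediate convolutional activations used as shape descriptors, followed by a projection down to three dimensions — can be sketched roughly as follows. This is a minimal illustrative stand-in, not the paper's method: the random filter bank, the tiny binary images, and the PCA projection are all assumptions made for the sake of a self-contained example (the paper uses a pretrained CNN and its own dimensionality-reduction choices).

```python
import numpy as np

def conv_features(img, filters):
    """Valid 2-D convolution of a binary image with a small filter bank,
    followed by ReLU and global average pooling -- a toy stand-in for
    intermediate CNN activations used as a shape descriptor."""
    h, w = img.shape
    fh, fw = filters.shape[1:]
    feats = []
    for f in filters:
        out = np.zeros((h - fh + 1, w - fw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(img[i:i + fh, j:j + fw] * f)
        feats.append(np.maximum(out, 0).mean())  # ReLU + global average pool
    return np.array(feats)

def pca_3d(X):
    """Project feature vectors onto their top three principal components,
    giving a 3-D embedding suitable for visualization."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:3].T

rng = np.random.default_rng(0)
filters = rng.standard_normal((8, 3, 3))                  # 8 random 3x3 filters
images = (rng.random((20, 16, 16)) > 0.5).astype(float)   # 20 synthetic binary "glyphs"
X = np.stack([conv_features(im, filters) for im in images])
Z = pca_3d(X)
print(Z.shape)  # (20, 3): one 3-D point per glyph image
```

In the paper's setting, `conv_features` would be replaced by the intermediate outputs of a network pretrained on gray-scale images, and the 3-D points would be plotted to support epigraphic analysis of the glyph-blocks.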

Keywords
  • Shape retrieval
  • Neural networks
  • Dimensionality reduction
Citation (ISO format)
ROMAN RANGEL, Edgar Francisco et al. Transferring Neural Representations for Low-Dimensional Indexing of Maya Hieroglyphic Art. In: Computer Vision – ECCV 2016 Workshops. Amsterdam (The Netherlands). Cham : Springer International Publishing, 2016. p. 842–855. doi: 10.1007/978-3-319-46604-0_58
Main files (2)
Proceedings chapter (Accepted version)
Proceedings chapter (Published version)

Technical information

Creation: 09/13/2019 3:04:00 PM
First validation: 09/13/2019 3:04:00 PM
Update time: 03/15/2023 6:02:05 PM
Status update: 03/15/2023 6:02:04 PM
Last indexation: 10/19/2023 11:08:07 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva