Proceedings chapter
Open access
English

Learning User Queries in Multimodal Dissimilarity Spaces

Presented at Glasgow (UK), Jul. 28-29
Publisher: Springer
Collection
  • Lecture Notes in Computer Science; 3877
Publication date: 2005
Abstract

Different strategies for learning user semantic queries from dissimilarity representations of audio-visual content are presented. When dealing with large corpora of video documents, a feature representation requires the online computation of distances between all documents and a query. A dissimilarity representation may therefore be preferred, because its offline computation speeds up the retrieval process. We show how distances related to visual and audio video features can be used directly to learn complex concepts from a set of positive and negative examples provided by the user. Based on the idea of dissimilarity spaces, we derive three algorithms to fuse modalities and thereby enhance the precision of retrieval results. The evaluation of our technique is performed on artificial data and on the annotated TRECVID corpus.
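As an illustration of the general idea only (a minimal sketch, not the paper's three fusion algorithms), the snippet below represents each document by its precomputed distances to a small reference set per modality, concatenates the per-modality dissimilarity vectors, and trains a linear classifier on user-supplied positive and negative examples to rank the corpus. All data, reference choices, and the Euclidean metric are hypothetical assumptions.

```python
# Sketch: query learning in a fused dissimilarity space (assumed setup).
import numpy as np
from sklearn.svm import LinearSVC

def dissimilarity_space(features, references, metric):
    """Map raw feature vectors to distances against a fixed reference set."""
    return np.array([[metric(x, r) for r in references] for x in features])

rng = np.random.default_rng(0)
# Hypothetical per-modality feature matrices (n_docs x dim).
visual = rng.normal(size=(200, 64))
audio = rng.normal(size=(200, 32))

# Offline step: choose reference documents and precompute distances per modality.
ref_idx = rng.choice(200, size=20, replace=False)
euclid = lambda x, r: np.linalg.norm(x - r)
D_visual = dissimilarity_space(visual, visual[ref_idx], euclid)
D_audio = dissimilarity_space(audio, audio[ref_idx], euclid)

# Simple fusion: concatenate the two dissimilarity spaces.
D_fused = np.hstack([D_visual, D_audio])

# Online step: the user marks a few positive / negative examples.
labeled = np.array([3, 17, 42, 58, 99, 120])
labels = np.array([1, 1, 1, 0, 0, 0])  # 1 = relevant, 0 = not relevant

clf = LinearSVC().fit(D_fused[labeled], labels)
scores = clf.decision_function(D_fused)  # score every document in the corpus
ranking = np.argsort(-scores)            # best matches first
print(ranking[:10])
```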

Keywords
  • Data Structures, Cryptology and Information Theory
  • Information Storage and Retrieval
  • Multimedia Information Systems
  • Information Systems Applications (incl. Internet)
  • Image Processing and Computer Vision
  • Artificial Intelligence (incl. Robotics)
Citation (ISO format)
BRUNO, Eric, MOENNE-LOCCOZ, Nicolas, MARCHAND-MAILLET, Stéphane. Learning User Queries in Multimodal Dissimilarity Spaces. In: Adaptive Multimedia Retrieval: User, Context, and Feedback : 3rd International Workshop, AMR 2005. Revised Selected Papers. Glasgow (UK). [s.l.] : Springer, 2005. p. 168–179. (Lecture Notes in Computer Science) doi: 10.1007/11670834_14
Main files (1)
Proceedings chapter (Published version)
Access level: Public
Identifiers
478 views
175 downloads

Technical information

Creation: 03/06/2015 5:12:10 PM
First validation: 03/06/2015 5:12:10 PM
Update time: 03/14/2023 10:58:53 PM
Status update: 03/14/2023 10:58:53 PM
Last indexation: 08/29/2023 3:10:18 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva