Scientific article

Using hand gestures to control mobile spoken dialogue systems

Published in: Universal access in the information society, vol. 13, no. 3, p. 257-275
Publication date: 2014

Speech and hand gestures offer the most natural modalities for everyday human-to-human interaction. The availability of diverse spoken dialogue applications and the proliferation of accelerometers in consumer electronics allow the introduction of new interaction paradigms based on speech and gestures. Little attention has been paid, however, to controlling spoken dialogue systems (SDS) through gestures. Situationally induced impairments, as well as actual disabilities, are key factors motivating this type of interaction. In this paper, six concise and intuitively meaningful gestures are proposed that can be used to trigger the commands in any SDS. Using different machine learning techniques, a classification error of less than 5% is achieved for the gesture patterns, and the proposed set of gestures is compared with ones proposed by users. An examination of the social acceptability of this interaction scheme reveals high levels of acceptance for public use. An experiment comparing a button-enabled and a gesture-enabled interface showed that the latter imposes little additional mental and physical effort. Finally, results are provided from three recruited participants: a male subject with spastic cerebral palsy, a blind female user, and an elderly woman.

Research group
Citation (ISO format)
TSOURAKIS, Nikolaos. Using hand gestures to control mobile spoken dialogue systems. In: Universal access in the information society, 2014, vol. 13, n° 3, p. 257–275. doi: 10.1007/s10209-013-0317-0
ISSN of the journal: 1615-5289

Technical information

Creation: 11/05/2013 12:26:00 PM
First validation: 11/05/2013 12:26:00 PM
Update time: 03/14/2023 8:38:34 PM
Status update: 03/14/2023 8:38:34 PM
Last indexation: 01/16/2024 8:15:32 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva