Scientific article
Open access

Advocating a Componential Appraisal Model to Guide Emotion Recognition

Published in: International Journal of Synthetic Emotions, vol. 3, no. 1, p. 18–32
Publication date: 2012

Most models of automatic emotion recognition take a discrete perspective and a black-box approach, i.e., they output an emotion label chosen from a limited pool of candidate terms on the basis of purely statistical methods. Although these models are successful in emotion classification, a number of practical and theoretical drawbacks limit the range of possible applications. In this paper, the authors suggest adopting an appraisal perspective in modeling emotion recognition. The authors propose using appraisals as an intermediate layer between expressive features (input) and emotion labeling (output). The model would then consist of two parts: first, expressive features would be used to estimate appraisals; second, the resulting appraisals would be used to predict an emotion label. While the second part of the model has already been the object of several studies, the first remains unexplored. The authors argue that this model should be built on the basis of both theoretical predictions and empirical results about the link between specific appraisals and expressive features. For this purpose, the authors suggest using the component process model of emotion, which includes detailed predictions of the efferent effects of appraisals on facial expression, voice, and body movements.
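The two-stage architecture advocated in the abstract can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' method: all feature names, appraisal dimensions, weights, and emotion prototypes are invented for demonstration. Stage 1 estimates appraisal scores from expressive features (here, a simple linear mapping); stage 2 assigns the label whose appraisal prototype is closest to the estimate.

```python
import math

# Stage 1: linear mapping from expressive features to appraisal dimensions.
# Weights are hypothetical, chosen only to illustrate the architecture.
APPRAISAL_WEIGHTS = {
    "novelty":      {"brow_raise": 0.8, "voice_pitch": 0.4},
    "pleasantness": {"smile": 0.9, "voice_pitch": 0.1},
    "control":      {"posture_expansion": 0.7, "smile": 0.2},
}

# Stage 2: hypothetical appraisal prototypes for candidate emotion labels.
EMOTION_PROTOTYPES = {
    "joy":      {"novelty": 0.5, "pleasantness": 0.9, "control": 0.7},
    "surprise": {"novelty": 0.9, "pleasantness": 0.5, "control": 0.3},
    "anger":    {"novelty": 0.4, "pleasantness": 0.1, "control": 0.8},
}

def estimate_appraisals(features):
    """Stage 1: map expressive features (values in 0-1) to appraisal scores."""
    return {
        appraisal: sum(w * features.get(f, 0.0) for f, w in weights.items())
        for appraisal, weights in APPRAISAL_WEIGHTS.items()
    }

def label_emotion(appraisals):
    """Stage 2: pick the emotion whose prototype is nearest (Euclidean)."""
    def distance(proto):
        return math.sqrt(sum((appraisals[a] - v) ** 2 for a, v in proto.items()))
    return min(EMOTION_PROTOTYPES, key=lambda e: distance(EMOTION_PROTOTYPES[e]))

# Example: a smiling, moderately expansive expression.
features = {"smile": 0.8, "brow_raise": 0.3, "voice_pitch": 0.5, "posture_expansion": 0.6}
print(label_emotion(estimate_appraisals(features)))  # prints "joy"
```

Unlike a black-box classifier, the intermediate appraisal layer is inspectable: each stage can be validated separately against theoretical predictions, which is the practical advantage the paper argues for.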

Research group
Citation (ISO format)
MORTILLARO, Marcello, MEULEMAN, Ben, SCHERER, Klaus R. Advocating a Componential Appraisal Model to Guide Emotion Recognition. In: International Journal of Synthetic Emotions, 2012, vol. 3, n° 1, p. 18–32. doi: 10.4018/jse.2012010102
Main files (1)
Article (Published version)
ISSN of the journal: 1947-9093

Technical information

Creation: 09/18/2017 3:07:00 PM
First validation: 09/18/2017 3:07:00 PM
Update time: 03/15/2023 2:04:59 AM
Status update: 03/15/2023 2:04:58 AM
Last indexation: 05/02/2024 7:31:48 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva