UNIGE document: Scientific article
Title

Emotion recognition from expressions in face, voice, and body: The Multimodal Emotion Recognition Test (MERT)

Authors
Bänziger, Tanja
Grandjean, Didier Maurice
Scherer, Klaus R.
Published in: Emotion, 2009, vol. 9, no. 5, p. 691-704
Abstract
Emotion recognition ability has been identified as a central component of emotional competence. We describe the development of an instrument that objectively measures this ability on the basis of actor portrayals of dynamic expressions of 10 emotions (2 variants each for 5 emotion families), operationalized as recognition accuracy in 4 presentation modes combining the visual and auditory sense modalities (audio/video, audio only, video only, still picture). Data from a large validation study, including construct validation using related tests (Profile of Nonverbal Sensitivity; Rosenthal, Hall, DiMatteo, Rogers, & Archer, 1979; Japanese and Caucasian Facial Expressions of Emotion; Biehl et al., 1997; Diagnostic Analysis of Nonverbal Accuracy; Nowicki & Duke, 1994; Emotion Recognition Index; Scherer & Scherer, 2008), are reported. The results show the utility of a test designed to measure both coarse and fine-grained emotion differentiation and modality-specific skills. Factor analysis of the data suggests 2 separate abilities, visual and auditory recognition, which seem to be largely independent of personality dispositions.
Research group: Swiss Center for Affective Sciences
Citation (ISO format)
BÄNZIGER, Tanja, GRANDJEAN, Didier Maurice, SCHERER, Klaus R. Emotion recognition from expressions in face, voice, and body: The Multimodal Emotion Recognition Test (MERT). In: Emotion, 2009, vol. 9, n° 5, p. 691-704. https://archive-ouverte.unige.ch/unige:15739

Deposited on: 2011-05-10
