Scientific article (English)

Emotion recognition from expressions in face, voice, and body: The Multimodal Emotion Recognition Test (MERT)

Published in: Emotion, vol. 9, no. 5, p. 691-704
Publication date: 2009
Abstract

Emotion recognition ability has been identified as a central component of emotional competence. We describe the development of an instrument that objectively measures this ability on the basis of actor portrayals of dynamic expressions of 10 emotions (2 variants each for 5 emotion families), operationalized as recognition accuracy in 4 presentation modes combining the visual and auditory sense modalities (audio/video, audio only, video only, still picture). Data from a large validation study, including construct validation using related tests (Profile of Nonverbal Sensitivity; Rosenthal, Hall, DiMatteo, Rogers, & Archer, 1979; Japanese and Caucasian Facial Expressions of Emotion; Biehl et al., 1997; Diagnostic Analysis of Nonverbal Accuracy; Nowicki & Duke, 1994; Emotion Recognition Index; Scherer & Scherer, 2008), are reported. The results show the utility of a test designed to measure both coarse and fine-grained emotion differentiation and modality-specific skills. Factor analysis of the data suggests 2 separate abilities, visual and auditory recognition, which seem to be largely independent of personality dispositions.

Citation (ISO format)
BÄNZIGER, Tanja, GRANDJEAN, Didier Maurice, SCHERER, Klaus R. Emotion recognition from expressions in face, voice, and body: The Multimodal Emotion Recognition Test (MERT). In: Emotion, 2009, vol. 9, n° 5, p. 691–704. doi: 10.1037/a0017088
Main files (1)
Article (Published version)
Access level: Restricted
Identifiers
ISSN of the journal: 1931-1516
54 views
0 downloads

Technical informations

Creation: 04/19/2011 4:40:00 PM
First validation: 04/19/2011 4:40:00 PM
Update time: 03/30/2023 10:08:11 AM
Status update: 03/30/2023 10:08:10 AM
Last indexation: 05/02/2024 11:56:41 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva.