Scientific article
Open access
English

Accuracy of judging others' traits and states: Comparing mean levels across tests

Published in: Journal of Research in Personality, vol. 42, no. 6, p. 1476–1489
Publication date: 2008
Abstract

Tests of accuracy in interpersonal perception take many forms. Often, such tests use designs and scoring methods that produce overall accuracy levels that cannot be directly compared across tests. Therefore, progress in understanding accuracy levels has been hampered. The present article employed several techniques for achieving score equivalency. Mean accuracy was converted to a common metric, pi [Rosenthal, R., & Rubin, D. B. (1989). Effect size estimation for one-sample multiple-choice-type data: Design, analysis, and meta-analysis. Psychological Bulletin, 106, 332–337] in a database of 109 published results representing tests that varied in terms of scoring method (proportion accuracy versus correlation), content (e.g., personality versus affect), number of response options, item preselection, cue channel (e.g., face versus voice), stimulus duration, and dynamism. Overall, accuracy was midway between guessing level and a perfect score, with accuracy being higher for tests based on preselected than unselected stimuli. When item preselection was held constant, accuracy was equivalent for judging affect and judging personality. However, comparisons must be made with caution due to methodological variations between studies and gaps in the literature.
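The common metric referred to in the abstract, pi (Rosenthal & Rubin, 1989), converts the proportion correct on a k-alternative test to the equivalent proportion correct on a two-alternative test, so that chance performance always maps to .50 and perfect performance to 1.0. A minimal sketch of that conversion (the function name is illustrative, not from the article):

```python
def pi_effect_size(p, k):
    """Rosenthal & Rubin (1989) pi: convert proportion correct p on a
    k-alternative multiple-choice test to the equivalent proportion
    correct on a two-alternative test."""
    if not 0.0 <= p <= 1.0 or k < 2:
        raise ValueError("p must be in [0, 1] and k must be >= 2")
    return p * (k - 1) / (1 + p * (k - 2))

# Chance on a 4-choice test (p = .25) maps to pi = .50;
# perfect accuracy maps to pi = 1.0.
print(pi_effect_size(0.25, 4))  # 0.5
print(pi_effect_size(1.0, 4))   # 1.0
```

On this metric, the abstract's finding that accuracy was "midway between guessing level and a perfect score" corresponds to pi values near .75.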

Keywords
  • Interpersonal sensitivity
  • Personality judgment
  • Emotion recognition
  • Accuracy
  • Pi
  • Binomial Effect Size Display
Affiliation: Not a UNIGE publication
Citation (ISO format)
HALL, Judith A. et al. Accuracy of judging others' traits and states: Comparing mean levels across tests. In: Journal of Research in Personality, 2008, vol. 42, n° 6, p. 1476–1489. doi: 10.1016/j.jrp.2008.06.013
Main files (1)
Article (Published version)
Access level: Public
Identifiers
ISSN of the journal: 0092-6566
553 views
766 downloads

Technical information

Creation: 11/20/2017 1:49:00 PM
First validation: 11/20/2017 1:49:00 PM
Update time: 03/15/2023 7:43:15 AM
Status update: 03/15/2023 7:43:14 AM
Last indexation: 01/17/2024 1:59:02 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva