Scientific article (English)

A pilot study of the relationship between experts' ratings and scores generated by the NBME's Computer-Based Examination System

Published in: Academic medicine, vol. 67, no. 2, p. 130-132
Publication date: 1992
Abstract

This pilot study evaluates the consistency of experts' ratings of students' performances on the National Board of Medical Examiners' Computer-Based Examination (CBX) cases and the relationship of those ratings to the CBX's scoring algorithm. The authors investigated whether an automated scoring algorithm can adequately assess an examinee's management of a computer-simulated patient. In 1989-90, at the Michigan State University College of Human Medicine, eight students completing a surgery clerkship each managed eight CBX cases and took a computer-administered, multiple-choice examination. Six clerkship coordinators rated the students' performances in terms of overall management, efficiency, and dangerous actions. The ratings correlated highly with scores produced by the CBX's scoring system.

Keywords
  • Algorithms
  • Clinical Clerkship
  • Clinical Competence
  • Computer Simulation
  • Educational Measurement/methods
  • General Surgery/education
  • Patient Simulation
  • Pilot Projects
Affiliation: Not a UNIGE publication
Citation (ISO format)
SOLOMON, David J. et al. A pilot study of the relationship between experts' ratings and scores generated by the NBME's Computer-Based Examination System. In: Academic medicine, 1992, vol. 67, no. 2, p. 130-132. doi: 10.1097/00001888-199202000-00020
Main files (1)
Article (Published version)
Access level: Restricted
Identifiers
ISSN of the journal: 1040-2446
477 views
2 downloads

Technical information

Creation: 01/06/2016 12:07:00 PM
First validation: 01/06/2016 12:07:00 PM
Update time: 03/15/2023 12:01:59 AM
Status update: 03/15/2023 12:01:58 AM
Last indexation: 01/16/2024 7:57:20 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva