Validity Study of an End-of-Clerkship Oral Examination in Obstetrics and Gynecology

Elsevier

Available online 18 October 2022

Journal of Surgical Education

OBJECTIVE

Surgical clerkships frequently include oral exams to assess students' ability to critically analyze data and apply clinical judgment in common scenarios. Limited guidance exists for interpreting the validity of oral exam scores, making improvements difficult to target. We examined the development, administration, and scoring of a clerkship oral exam within a validity evidence framework.

DESIGN

This was a retrospective study of a third-year, end-of-clerkship oral exam in obstetrics and gynecology (OBGYN). Validity evidence for content, response process, internal structure, and relationships to other variables was collected and evaluated for five versions of the oral exam.

SETTING

Albert Einstein College of Medicine, Bronx, New York City.

PARTICIPANTS

Participants were 186 third-year medical students who completed the OBGYN clerkship in the academic year 2020 to 2021.

RESULTS

The average number of objectives assessed per oral exam version was uniform, but the distribution of questions across Bloom's levels of cognition was uneven. Student scores on all questions, regardless of Bloom's level, were >87%, and the reliability (Cronbach's alpha) of item scores ranged from 0.58 to 0.74. There was a moderate, positive correlation (Spearman's rho) between oral exam scores and national shelf exam scores (0.35). Correlations between oral exam scores and (a) clinical performance ratings (0.14) and (b) formal presentation scores (-0.19) were low.
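For context on the two statistics reported above (an editorial addition, not part of the published abstract), the standard formulas are:

```latex
% Cronbach's alpha for an exam version with k items, where \sigma^2_{Y_i} is the
% sample variance of scores on item i and \sigma^2_X is the variance of total scores:
\alpha = \frac{k}{k - 1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)

% Spearman's rho for n students with untied ranks, where d_i is the difference
% between student i's ranks on the two measures being compared:
\rho = 1 - \frac{6 \sum_{i=1}^{n} d_i^2}{n(n^2 - 1)}
```

By the common rule of thumb that an alpha of at least 0.70 indicates acceptable internal consistency, the 0.58 to 0.74 range reported here is consistent with the authors' conclusion that further modification is needed before high-stakes use.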

CONCLUSIONS

This study provides an example of how to examine the validity of oral exam scores in order to target improvements. Further modifications are needed before the scores can be used for high-stakes decisions. The authors recommend additional sources of validity evidence to collect so that any surgical clerkship oral exam can better meet its goals.

INTRODUCTION

Surgical clerkships, including those in obstetrics and gynecology (OBGYN), frequently include oral exams that aim to assess students' ability to critically analyze data and apply clinical judgment. Although approximately half of the specialties under the umbrella of the American Board of Medical Specialties require physicians to pass oral exams as part of their board certification process, medical student and resident exposure to quality oral assessments is variable.1 At the clerkship level …

Overview

We conducted a retrospective educational assessment validity study of a third-year, end-of-clerkship oral exam in OBGYN. Our goals were to examine the development, administration, and scoring of a pre-existing oral exam using Messick's validity framework, to guide modifications and improvement of the exam at the institutional level, and to share our experience with other clerkship educators who use oral exams. Practical examples of each of the four types of validity evidence as applied to oral exams …

Content Validity

The average number of APGO objectives assessed per exam form was 23, ranging from 21 to 24. Questions were unevenly distributed across Bloom's levels on each form (Table 3). The percentage of questions at Bloom's level two or higher per individual exam form ranged from a low of 70% to a high of 96%. Average student scores on each Bloom's question type were relatively high on all forms regardless of level, ranging from 87% to 100% (Table 4).

Internal Structure Validity

Reliability coefficients for version one …
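As a practical illustration of how per-version reliability and score correlations can be computed, a minimal sketch follows. The data, shapes, and variable names are hypothetical assumptions, not the authors' analysis code; it implements the standard sample-variance form of Cronbach's alpha and uses SciPy's spearmanr.

```python
# Minimal sketch (hypothetical data, not the authors' code): Cronbach's alpha
# for one exam version and a Spearman correlation with an external measure.
import numpy as np
from scipy.stats import spearmanr

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_items) score matrix."""
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # per-item sample variance
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical scores: 5 students x 4 items, each item scored 0-2.
item_scores = np.array([
    [2, 1, 2, 2],
    [1, 1, 0, 1],
    [2, 2, 2, 1],
    [0, 1, 1, 0],
    [2, 2, 1, 2],
])
print(f"Cronbach's alpha = {cronbach_alpha(item_scores):.2f}")

# Relationship to other variables: correlate oral exam totals with a
# hypothetical external measure such as national shelf exam scores.
oral_totals = item_scores.sum(axis=1)
shelf_scores = np.array([78, 65, 82, 60, 75])
rho, p_value = spearmanr(oral_totals, shelf_scores)
print(f"Spearman's rho = {rho:.2f} (p = {p_value:.3f})")
```

In practice, the same function would be run once per exam version, matching the per-version reliability coefficients reported above.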

Discussion

To improve the quality of an existing oral OBGYN clerkship exam and to guide other educators in developing their own, we collected validity evidence in four domains of Messick's framework. Our study was not without limitations. Below we suggest additional sources of validity evidence that must be added to the validity argument to better serve our primary goals of differentiating students by cognitive skill and adding a meaningful component to the summative clerkship grade.

Because the content …

ACKNOWLEDGMENTS

We would like to thank Denise Mabry, Audrey Paul, Joy McIntyre, Brian Martin, Dr. Philip Reeves, and the Johns Hopkins Master of Education in the Health Professions program.

REFERENCES (18)

V Wass et al. Assessment of clinical competence. Lancet (2001)

J Hassett et al. Utility of an oral examination in a surgical clerkship. Am J Surg (1992)

D Juul et al. Oral Examinations

T Sisson et al. Clerkship Examinations

Y Zhou et al. Effectiveness of written and oral specialty certification examinations to predict actions against the medical licenses of anesthesiologists. Anesthesiology (2017)

BW Wormald et al. Assessment drives learning: an unavoidable truth? Anat Sci Educ (2009)

KJ Hardy et al. Undergraduate surgical examinations: an appraisal of the clinical orals. Med Educ (1998)

M Wiggins et al. Implementing a structured oral examination into a residency program: getting started. Ophthalmic Surg Lasers Imaging (2008)

M Shenwai et al. Introduction of structured oral examination as a novel assessment tool to first year medical students in physiology. J Clin Diag Res (2013)

There are more references available in the full text version of this article.


© 2022 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
