Do mock medical licensure exams improve performance of graduates? Experience from a Saudi medical college



ORIGINAL ARTICLE | Year: 2022 | Volume: 10 | Issue: 2 | Page: 157-161


Mona Hmoud Al-Sheikh1, Waleed Albaker2, Muhammed Zeeshan Ayub3
1 Department of Physiology, College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
2 Department of Medicine, College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
3 Department of Medical Education, College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia

Date of Submission: 11-Mar-2021 | Date of Decision: 10-Jun-2021 | Date of Acceptance: 04-Apr-2022 | Date of Web Publication: 28-Apr-2022

Correspondence Address:
Mona Hmoud Al-Sheikh
Department of Physiology, College of Medicine, Imam Abdulrahman Bin Faisal University, King Fahd University Hospital, P. O. Box 2208, Al Khobar 31952
Saudi Arabia

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/sjmms.sjmms_173_21



Background: All medical graduates in Saudi Arabia are required to pass the Saudi Medical Licensure Exam (SMLE) to practice and/or enroll in postgraduate training. Mock exams are a useful preparatory tool, but no study from Saudi Arabia has assessed their impact on performance in the actual licensure examination.
Objectives: To evaluate the impact of a series of mock SMLEs with immediate personalized feedback on graduate scores and their performance in the actual SMLE.
Methods: This retrospective study included medical students who graduated in the 2019–20 academic year from Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia, and undertook the mock SMLEs offered in August 2020. Three mock exams were constructed using the SMLE blueprint and were offered to graduates 1 week apart. Immediately after each exam, a personalized learning outcomes achievement report was sent to each graduate. Exam reliability, measured by the Kuder–Richardson formula, was 0.87, 0.94, and 0.96 for the first, second, and third exams, respectively.
Results: A total of 71, 70, and 61 students completed the first, second, and third exams, respectively. Across the three mock exams, the mean (±SD) score showed an increasing trend, from 87.6 (±33.4; range: 28–191) in the first test to 93.5 (±45.6; range: 15–204) in the second and 96 (±42.6; range: 25–203) in the third. Forty graduates completed all three mock exams; of these, the scores of 25 (62.5%) significantly improved in both the second and third exams compared with the prior test (P = 0.002). A nonsignificant positive correlation was found between the average mock and actual SMLE scores of the graduates for whom data were available (r = 0.29; P = 0.27).
Conclusion: The performance of graduates improved across subsequent mock exams, and there was a nonsignificant positive correlation between the mock and actual SMLE results. This study supports the usefulness of mock exams as a preparatory tool for licensure examinations in Saudi Arabia.

Keywords: Formative feedback, graduate medical education, licensure exam, mock exam, norm-referenced assessment, Saudi Arabia


How to cite this article:
Al-Sheikh MH, Albaker W, Ayub MZ. Do mock medical licensure exams improve performance of graduates? Experience from a Saudi medical college. Saudi J Med Med Sci 2022;10:157-61

Introduction

The quality and critical nature of the medical profession require strict regulation as to who is “fit to practice.”[1] Being licensed to practice means that a medical graduate has an adequate level of competency.[2] A medical licensure exam (MLE) helps decide which medical graduates are eligible to pursue specialization and compete, in a norm-referenced design, for the limited seats of postgraduate medical training.[3] Setting a standard for medical practice is important to ensure that only competent graduates and safe doctors can practice.[4]

Two calls for a national medical licensure examination were made in 2008 by 18 Saudi educationists. The rationale for the proposed Saudi Medical Licensure Exam (SMLE) was discussed, and an action plan was defined to establish this high-stakes exam.[5],[6] Presently, the SMLE is a 300-multiple-choice-question (MCQ) paper divided into three sections, with a total duration of six hours. The SMLE blueprint is based on the Saudi-MED framework.[6] The licensure exam is offered 11 times a year, and registration may be required three months ahead of the test date.

The SMLE scores of medical graduates determine their fate: it is their license to practice and/or to enroll in postgraduate training. For medical colleges, the mean SMLE score of their graduates determines their rank and popularity within the country. This has become a strong driver for some medical colleges to teach to the SMLE or to design the curriculum around the Saudi-MED framework and SMLE blueprint, which has been the main criticism of medical licensure exams in the long-standing debate.[7] Nonetheless, there is strong evidence that better performance in MLEs is associated with better patient care, greater patient safety, improved quality of care, better health indices, and fewer medical errors.[8],[9] In fact, a systematic review concluded that MLEs are associated with greater patient safety and improved quality of health care.[10]

Mock exams are formative assessments that are useful as a preparatory tool, given that they expose students to an exam-like situation, help them understand the exam structure, and allow them to address identified weaknesses.[11] However, to the best of the authors' knowledge, no study from Saudi Arabia has yet assessed the effectiveness of mock SMLEs in improving performance in the actual SMLE. The current study was conducted with the aim of evaluating the impact of a series of mock SMLEs with immediate personalized feedback on graduates' performance in the actual SMLE. We hypothesized that student performance would increase incrementally with the number of exams undertaken and that there would be a correlation between the mock and actual SMLE results.

Methods

Study design and participants

This retrospective study included medical students who graduated in the 2019-20 academic year from Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia, and undertook mock SMLE exams offered in August 2020. The study was approved by the institutional review board of the University.

Graduates were defined as those who had completed all course requirements for the MBBS degree and had not yet taken the SMLE or enrolled in postgraduate training. Students who had not completed all course requirements or had already taken the SMLE were not offered the mock exams.

Mock exam design and sampling

Three mock SMLE exams (300 MCQs each) were offered 1 week apart to all eligible graduates. The mock SMLEs were developed by closely following the exam blueprint outlined by the Saudi Commission for Health Specialties, and the following five online MCQ banks were used for item generation: International Database for Enhancement of Assessment and Learning, Canadian Q Bank, Prometric, Fudul, and AMBOSS. MCQs were of the single-best-answer (A-type) format with four options. The three mock SMLE papers contained different MCQ items but shared the same blueprint/learning outcomes. Exam reliability was measured by the Kuder–Richardson formula and found to be 0.87, 0.94, and 0.96, respectively.
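
For illustration only, the following Python sketch shows how a Kuder–Richardson reliability coefficient (here KR-20, the variant commonly applied to dichotomously scored items) can be computed from a matrix of examinee responses. This is not the software used in the study, and the simulated response matrix is hypothetical.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """Kuder-Richardson formula 20 (KR-20) for dichotomously scored items.

    responses: 2-D array, rows = examinees, columns = items (1 = correct, 0 = incorrect).
    KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / variance of total scores)
    """
    k = responses.shape[1]                         # number of items (300 per mock SMLE)
    p = responses.mean(axis=0)                     # proportion answering each item correctly
    q = 1.0 - p                                    # proportion answering each item incorrectly
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of examinees' total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Hypothetical example: 60 examinees of varying ability answering 300 items.
rng = np.random.default_rng(seed=1)
ability = rng.uniform(0.3, 0.9, size=(60, 1))
responses = (rng.uniform(size=(60, 300)) < ability).astype(int)
print(f"KR-20 = {kr20(responses):.2f}")
```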

As in the actual examination, optical mark reader (OMR) sheets were used for the mock SMLE, wherein the selected bubble was shaded with an H2 pencil. Electronic scoring was done using the OpScan® 8 optical mark reader (Scantron Inc., Minnesota, USA). The REMARK Classic OMR 6 software was customized to produce learning outcomes achievement feedback based on the blueprint fed into it. An additional customized feature of the software allowed automatic emailing of the personalized learning outcomes achievement report to the respective student.
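
The blueprint-based feedback report was generated by the customized REMARK software described above; as a rough sketch of the underlying idea only, the Python snippet below maps each item to a learning-outcome domain and computes a per-domain achievement percentage for one graduate. The item-to-domain assignments, domain names, and scores are hypothetical.

```python
from collections import defaultdict

# Hypothetical blueprint fragment: item number -> learning-outcome domain.
# The real mock SMLE blueprint covers all 300 items.
blueprint = {1: "Medicine", 2: "Surgery", 3: "Pediatrics",
             4: "Obstetrics & Gynecology", 5: "Medicine", 6: "Surgery"}

def outcome_report(item_scores: dict[int, int]) -> dict[str, float]:
    """Percentage of items answered correctly per learning-outcome domain.

    item_scores: item number -> 1 (correct) or 0 (incorrect), as read from the OMR sheet.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for item, score in item_scores.items():
        domain = blueprint[item]
        total[domain] += 1
        correct[domain] += score
    return {domain: 100.0 * correct[domain] / total[domain] for domain in total}

# Hypothetical graduate's responses to the six illustrative items above.
print(outcome_report({1: 1, 2: 0, 3: 1, 4: 1, 5: 0, 6: 1}))
```

A per-domain report of this form could then be emailed automatically to each graduate, as the customized emailing feature did.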

All graduates (N = 215) of the chosen academic year (i.e., a single student cohort) were approached by the Vice Dean of Clinical Affairs secretariat, which is in charge of coordinating internship slots and schedules, and were informed of their eligibility to undertake the three mock SMLEs. Graduates were informed that these mock exams were optional and were assured that the scores obtained would not be used for any summative purposes.

Following the mock SMLEs, graduates were requested to respond to the following qualitative questions: (1) What were the content areas most expected? (2) What were the content areas least expected? In addition, students were requested to provide their perceptions of the usefulness of the exams. The items were sent electronically by email as a link using the QuestionPro survey software. The purpose of this questionnaire was to record students' perceptions of this experience.

Outcomes

The primary outcomes were determining differences in scores between the first, second, and third mock exams and measuring the correlation between the mock and actual SMLE scores. The secondary outcome was measuring the students' perception of the usefulness of mock exams.

The actual SMLE scores of the students who participated in the mock tests were later acquired from the Vice Dean of Clinical Affairs secretariat after ensuring data confidentiality in line with the University's policy and the anonymization of findings.

Data analysis

Data were analyzed using SPSS version 26 (SPSS Inc., Chicago, IL). Student performance scores were summarized using descriptive statistics. Student's t-test and ANOVA were used to compare scores; improvement was defined as a significant increase in the total score. The correlation between the mock and actual SMLE scores was calculated using Pearson's correlation. P < 0.05 was considered significant. Qualitative feedback was analyzed by manual coding.
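
As a minimal illustration of this analysis plan (not the SPSS procedures actually run), the sketch below applies a paired t-test, a one-way ANOVA across three mock exams, and Pearson's correlation between average mock and actual SMLE scores using SciPy; all score vectors are invented for demonstration.

```python
import numpy as np
from scipy import stats

# Hypothetical scores for graduates who completed all three mock exams and the actual SMLE.
mock1 = np.array([70, 85, 90, 110, 95, 120, 60, 140])
mock2 = np.array([80, 88, 95, 115, 100, 130, 70, 150])
mock3 = np.array([85, 92, 99, 118, 104, 135, 78, 155])
smle = np.array([58, 61, 63, 70, 66, 74, 55, 78])

# Paired (Student's) t-test: change between the first and second mock exam.
t, p_t = stats.ttest_rel(mock1, mock2)

# One-way ANOVA across the three mock exams.
f, p_f = stats.f_oneway(mock1, mock2, mock3)

# Pearson correlation between the average mock score and the actual SMLE score.
r, p_r = stats.pearsonr((mock1 + mock2 + mock3) / 3, smle)

print(f"paired t-test: t = {t:.2f}, p = {p_t:.3f}")
print(f"ANOVA:         F = {f:.2f}, p = {p_f:.3f}")
print(f"Pearson:       r = {r:.2f}, p = {p_r:.3f}")
```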

Results

Of the 215 eligible graduates, 142, 143, and 140 registered for mock SMLEs 1, 2, and 3, respectively, but only 71 (33%), 70 (32.6%), and 61 (28.4%) appeared for and completed the respective exams. Across the three mock exams, the mean (±SD) score showed an increasing trend, from 87.6 (±33.4; range: 28–191) in the first test to 93.5 (±45.6; range: 15–204) in the second and 96 (±42.6; range: 25–203) in the third.

A total of 100 graduates completed at least one mock exam: 40 completed all three, 22 completed two, and 38 completed only one. Of those who completed all three exams, the scores of 25 (62.5%) significantly improved in both the second and third exams compared with the prior test (P = 0.002), the scores of 2 improved in only one test, 11 had a nonsignificant decline in scores in the second and third tests compared with the first (P = 0.28), and the scores of 2 remained the same throughout the three tests. Of the 22 graduates who repeated the exam once, 13 (59.1%) had a nonsignificant improvement in scores compared with the first test [Table 1].

The actual SMLE scores were only available for 52 graduates at the time of reporting this study. A nonsignificant positive correlation was found between the average mock and the actual SMLE scores (r = 0.29; P = 0.27) [Figure 1].

Figure 1: Correlation plot between mock and actual SMLE scores. SMLE: Saudi Medical Licensure Exam


Qualitative data

The qualitative feedback revealed that graduates found the mock SMLE useful, as it allowed them to familiarize themselves with long MCQ tests and learn to tolerate the tension of a comprehensive exam setting. Many also reported that the mock exams reduced their anxiety, increased their awareness of the SMLE blueprint, and taught them how to study and prepare for the SMLE. Students reported being least exposed to “Anesthesia” items and most exposed to “Surgery” items. Students also raised concerns about the timing and mode of the mock SMLE, as it was held during internship and required physical presence on campus, and suggested offering it online.

Discussion

This study found that the mock exam performance of students increased incrementally and had a nonsignificant positive correlation with the actual SMLE results. This suggests that the comprehensive personalized learning outcome achievement feedback provided after each test helped students diagnose their knowledge gaps and that this, along with familiarization with the exam structure and setting, allowed for improvement in the actual SMLE performance.

Performance on MLEs strongly predicts performance during residency with regard to the interpersonal and communication skills, patient care, and professionalism competency domains.[12],[13] In Saudi Arabia, the SMLE score determines a graduate's eligibility for postgraduate studies in medical colleges and may also play a role in employers' confidence in hiring. For universities, SMLE scores serve as important benchmarks for assessing curriculum effectiveness, the quality of education imparted, and program outcomes, as poor performance in the MLE affects both graduates and medical colleges. To remedy this, institutions use various tools, including predictors to identify graduates at risk of failing the licensure exam and support them before they reach the final years.[12] In Saudi Arabia, several medical colleges have incorporated SMLE preparation into their curriculum.[14] As shown in this study, mock SMLEs may be another useful method for improving the performance of graduates in the actual exam.

The current study findings are in line with studies from medical and dental programs.[15],[16],[17] For example, in one study from the United States, passing a mock prosthodontic preparation exam was found to be significantly associated with successful completion of the licensure examination.[17] Similarly, for successful completion of the American Registry of Radiologic Technologists (ARRT) certification exam, knowledge mastery, exam familiarity, and skill strategies were identified as key themes contributing to the perception of preparedness; mock exams provide exposure to, and the opportunity to master, all three of these themes.[18] In a study by Ha et al., which assessed the impact of introducing a mock exam before the actual summative exam in a pharmaceutics course, mock exams contributed to the year-on-year increase in the average score (compared with the year without mock exams), and performance in the mock exams correlated with scores obtained in the actual exams.[11] Currently, there is limited evidence from Saudi Arabia regarding the impact of mock exams on the actual exam; therefore, future studies can build on the current findings to reach a wider consensus regarding the use of mock exams for various licensure examinations in this country. The participants' perceptions of their degree of exposure to content areas in the mock SMLE are useful for the Curriculum Committee. Furthermore, data regarding the representation of subject domains would be helpful in developing a more representative mock exam.

Limitations

A key limitation of this study is that it was a single-center study, and a relatively small proportion of students undertook all three mock exams. This could have been due to various factors, including the timing of the exams, which were conducted during the graduates' internship period, thereby limiting participation. The study also could not account for several confounding factors in the analysis owing to its retrospective design. Therefore, future prospective studies that account for confounding factors and recruit larger student populations are needed to confirm our findings.

Conclusion

Student performance increased across subsequent mock exams, and there was a nonsignificant positive correlation between the mock and actual SMLE results. This study supports the usefulness of mock exams with immediate personalized feedback on learning outcome achievement as a preparatory tool for licensure examinations in Saudi Arabia.

Ethical considerations

The study was approved by the Institutional Review Board at Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia (Ref. no.: 2021-01-097; dated: March 09, 2021). At the time of undertaking the mock exams, all participants had provided written consent for their results and responses to be used for study purposes. The study adhered to the principles of the Declaration of Helsinki, 2013.

Data availability statement

The datasets generated during and/or analyzed during the current study are not publicly available but are available from the corresponding author on reasonable request.

Peer review

This article was peer-reviewed by four independent and anonymous reviewers.

Acknowledgement

We would like to acknowledge all staff of the Vice-Deanship of Clinical Affairs for their efforts in conducting the exams and communicating with graduate students.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 

References

1. Taber S, Akdemir N, Gorman L, van Zanten M, Frank JR. A “fit for purpose” framework for medical education accreditation system design. BMC Med Educ 2020;20:306.
2. Huijskens EG, Hooshiaran A, Scherpbier A, van der Horst F. Barriers and facilitating factors in the professional careers of international medical graduates. Med Educ 2010;44:795-804.
3. Ponnamperuma GG. Selection for postgraduate training. Clin Teach 2010;7:276-80.
4. Southgate L, Hays RB, Norcini J, Mulholland H, Ayers B, Woolliscroft J, et al. Setting performance standards for medical practice: A theoretical framework. Med Educ 2001;35:474-81.
5. Bajammal S, Zaini R, Abuznadah W, Al-Rukban M, Aly SM, Boker A, et al. The need for national medical licensing examination in Saudi Arabia. BMC Med Educ 2008;8:53.
6. Zaini RG, Bin Abdulrahman KA, Al-Khotani AA, Al-Hayani AM, Al-Alwan IA, Jastaniah SD. Saudi Meds: A competence specification for Saudi medical graduates. Med Teach 2011;33:582-4.
7. Jolly B. National licensing exam or no national licensing exam? That is the question. Med Educ 2016;50:12-4.
8. Busche K, Elks ML, Hanson JT, Jackson-Williams L, Manuel RS, Parsons WL, et al. The validity of scores from the new MCAT exam in predicting student performance: Results from a multisite study. Acad Med 2020;95:387-95.
9. Swanson DB, Roberts TE. Trends in national licensing examinations in medicine. Med Educ 2016;50:101-14.
10. Archer J, Lynn N, Coombes L, Roberts M, Gale T, Price T, et al. The impact of large scale licensing examinations in highly developed countries: A systematic review. BMC Med Educ 2016;16:212.
11. Ha C, Ahmed U, Khasminsky M, Salib M, Andey T. Assessment of mock exam utility: A correlative and comparative study in a pharmaceutical calculations course. Am J Pharm Educ 2022;8654 [Online ahead of print].
12. Marcus-Blank B, Dahlke JA, Braman JP, Borman-Shoap E, Tiryaki E, Chipman J, et al. Predicting performance of first-year residents: Correlations between structured interview, licensure exam, and competency scores in a multi-institutional study. Acad Med 2019;94:378-87.
13. Rayamajhi S, Dhakal P, Wang L, Rai MP, Shrotriya S. Do USMLE steps, and ITE score predict the American Board of Internal Medicine Certifying Exam results? BMC Med Educ 2020;20:79.
14. Abu-Zaid A, Salem H, Alkattan K. The Saudi Medical Licensure Examination-Clinical Skills (SMLE-CS): A call for implementation. J Family Med Prim Care 2020;9:12-5.
15. Stewart CM, Bates RE Jr., Smith GE. Does performance on school-administered mock boards predict performance on a dental licensure exam? J Dent Educ 2004;68:426-32.
16. McConnell MM, St-Onge C, Young ME. The benefits of testing for learning on later performance. Adv Health Sci Educ Theory Pract 2015;20:305-20.
17. Stewart CM, Vertucci FJ, Bates RE. Improving performance on the endodontic section of the Florida Dental Licensure Examination. J Dent Educ 2004;68:829-33.
18. Harradon SE. American Registry of Radiologic Technologists Exam Preparation: A Case Study [thesis]. Portland-Biddeford, Maine: University of New England; 2020.