
Can CREOG In-Service Exam Scores Be Improved by a Standard Intervention?

Caitlin Bump, MD, MS; St. Luke's-Roosevelt Hospital, New York, NY
Objective: A quantitative milestone for assessing residents’ medical knowledge is performance on the Council on Resident Education in Obstetrics and Gynecology (CREOG) in-training exam. The CREOG Council and the American Board of Obstetrics and Gynecology (ABOG) have stated that this test is a self-assessment tool for both programs and residents and is not to be used in a “punitive” way. Increasingly, however, fellowship programs are requesting residents’ CREOG scores, implying that the exam is being used in the selection process and is no longer purely an internal measure of medical knowledge. In the past five years, our residents have had a 100% pass rate on the ABOG written and oral exams; however, only 33% of our residents have scored ≥200 (50th percentile) on the CREOG exam (Figure 1). This was concerning to both residents and faculty, as it was unclear whether performance on this test reflected the preparation of the learner (resident) or the teacher (program). The objective of this study was to improve CREOG scores by instituting a standard intervention, with the goal of having ≥80% of our residents score ≥200.

Methods: In the 2012–2013 academic year, in a 24-resident program (6 per year), we instituted an intervention to improve 2013 CREOG scores relative to 2012 scores. Residents were required to complete an online, computer-based OB/GYN board review course prior to the 2013 CREOG exam; completion of this course was documented by the program coordinator. Our established academic program, consisting of weekly lectures (based on the CREOG Educational Objectives), structured resident-led chapter reviews, and a six-session CREOG review course, remained unchanged. Using a mean score of 200, we compared each resident to his or her own prior performance and to the national scores to analyze whether there was an improvement from 2012 scores (pre-intervention) to 2013 scores (post-intervention).
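The pre/post comparison described above can be sketched with a standard paired t statistic. This is a minimal illustration only: the scores below are hypothetical, and this is not the study’s actual data or analysis code.

```python
# Illustrative sketch of a paired pre/post comparison of CREOG scores.
# All scores are hypothetical, not the study's data.
from math import sqrt
from statistics import mean

pre  = [185, 192, 178, 205, 198, 188, 176, 201, 190]   # hypothetical 2012 scores
post = [196, 199, 185, 210, 204, 195, 189, 207, 198]   # hypothetical 2013 scores

# Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)), where d = post - pre
d = [b - a for a, b in zip(pre, post)]
n = len(d)
d_bar = mean(d)
sd = sqrt(sum((x - d_bar) ** 2 for x in d) / (n - 1))  # sample standard deviation
t = d_bar / (sd / sqrt(n))

# The program's stated target: fraction of residents scoring >= 200
at_target = sum(s >= 200 for s in post) / n
print(round(t, 2), round(at_target, 2))
```

In practice the study’s p-values would come from comparing such a statistic to the t distribution with n − 1 degrees of freedom (e.g. via a statistics package).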

Results: Resident demographics: mean age was 32 ± 3.3 years; the average number of years from college to residency was 5.5 ± 3.39; 62% of residents were Caucasian, 21% African American, and 17% Asian; 88% spoke English as their first language; and 71% graduated from American medical schools. There was no difference in demographics between residents who scored <200 and those who scored ≥200. All residents (100%) took the USMLE exams, and there was a linear correlation between USMLE Step I and CREOG test scores (Figure 2). Table 1 presents all 18 residents’ scores pre- and post-intervention. The mean CREOG score of all residents compared with all years significantly improved post-intervention (p=0.0020), but not when residents were compared to their own year. Post-intervention, residents scored significantly better in all sub-categories except Primary Care and General Considerations; in General Considerations they scored significantly worse (p=0.0448). When each PGY year was analyzed individually, the results were similar to the overall analysis. PGY-2s significantly improved in all categories except General Considerations and Primary Care. PGY-3s significantly improved in REI, Oncology, and Genomics, but scored significantly worse in General Considerations. PGY-4s significantly improved only in Gynecology and Genomics, and overall scored higher than the other residents. When 2013 PGY-1 scores (post-intervention) were compared with 2012 PGY-1 scores (pre-intervention), the 2013 PGY-1s had significantly higher scores in Gynecology (p=0.0026), REI (p=0.0480), Oncology (p=0.0270), and Genomics (p=0.0065).
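The linear correlation reported between USMLE Step I and CREOG scores can be quantified with a Pearson correlation coefficient. The sketch below uses invented score pairs purely for illustration; it is not the study’s data:

```python
# Pearson correlation between USMLE Step I and CREOG scores.
# Score pairs are hypothetical, invented for this sketch.
from math import sqrt
from statistics import mean

usmle = [210, 225, 198, 240, 232, 205]   # hypothetical Step I scores
creog = [190, 204, 182, 215, 208, 188]   # hypothetical CREOG scores

mx, my = mean(usmle), mean(creog)
# r = covariance / (sd_x * sd_y), computed from sums of squared deviations
num = sum((x - mx) * (y - my) for x, y in zip(usmle, creog))
den = sqrt(sum((x - mx) ** 2 for x in usmle) * sum((y - my) ** 2 for y in creog))
r = num / den
print(round(r, 3))
```

An r close to 1 would correspond to the strong linear relationship shown in Figure 2; the exact value depends entirely on the data used.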

Conclusion: Our data suggest that a standardized intervention to improve residents’ CREOG scores made a difference in overall score and in individual categories, although not of the magnitude we proposed. Even so, the CREOG exam is an informative tool that enables educators to study the results and improve the accountability of both teacher and learner. Further study is needed to determine whether an individualized approach will improve scores and help residents achieve this milestone.

Keywords: Resident education; In-training examinations; Resident assessment


General Information

Date Presented
2014 CREOG & APGO Annual Meeting
Poster Abstract

Keywords
Feedback & Evaluation, Assessment, Licensure & Certification 

Intended Audience
Resident, Residency Director

Competencies Addressed
Medical Knowledge

Educational Continuum
GME

Educational Focus
Assessment

Clinical Focus
General Ob-Gyn

Author Information

Primary Author
Caitlin Bump, MD, MS
St. Luke's-Roosevelt Hospital, New York, NY

Co-Authors
Lois Brustman, MD
Barbara Deli, MD

Association of Professors of Gynecology and Obstetrics
