Abstract
In response to growing concern about declining performance on the American Board of Family Medicine Certification Examination, several strategies were employed to assist program directors in preparing their residents for the examination. These efforts appear to have resulted in significant improvement in performance.
Previous reports expressed concern about the declining pass rate of graduating residents on the American Board of Family Medicine (ABFM) Certification Examination (CE) from 2007 to 2011.1,2 In analyzing program pass rates, Falcone and Middleton2 demonstrated that 47.8% of family medicine residency training programs violated Accreditation Council for Graduate Medical Education program requirements, with more than 10% of their residents failing the examination on a 3- to 5-year rolling average.
To help program directors better prepare their residents for the examination, the ABFM undertook several strategic initiatives beginning in 2009. These included creating a common scoring scale across administrations of both the CE and the In-Training Examination (ITE) in 2009; moving the first CE administration of each year from July to April in 2012; introducing Family Medicine Certification entry requirements, including mandatory completion of at least 1 Self-Assessment Module (SAM), in 2012; and providing better predictive feedback from the ITE through the Bayesian Score Predictor in 2013. Given these changes, we wished to determine what effect, if any, they may have had on subsequent resident performance on the CE.
Using ABFM examination data, we calculated pass rates for both United States medical graduate (USMG) and international medical graduate (IMG) initial certifiers who had not failed a previous attempt on the CE for the years 2009 through 2016. Because the ABFM Board of Directors lowered the minimum passing standard (MPS) for the CE in 2014 from a scaled score of 390 to 380, we calculated the 2014 to 2016 pass rates using both the 380 and 390 passing standards to adjust for the effect of the change in MPS.
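As a minimal illustration of this adjustment, the sketch below computes group pass rates under both standards. The data structure, field names, and values are hypothetical; the source does not describe the ABFM's actual data or analysis code.

```python
# Hypothetical sketch: pass rates under two minimum passing standards (MPS).
# Examinee records and scores below are illustrative, not ABFM data.

from dataclasses import dataclass

@dataclass
class Examinee:
    scaled_score: int  # scaled CE score
    group: str         # "USMG" or "IMG"

def pass_rate(examinees, group, mps):
    """Percentage of examinees in `group` scoring at or above `mps`."""
    cohort = [e for e in examinees if e.group == group]
    if not cohort:
        return float("nan")
    passed = sum(1 for e in cohort if e.scaled_score >= mps)
    return 100.0 * passed / len(cohort)

# Example: one cohort's pass rates under the 380 and 390 standards
examinees = [Examinee(405, "USMG"), Examinee(385, "USMG"),
             Examinee(392, "IMG"), Examinee(378, "IMG")]
for mps in (380, 390):
    print(f"MPS {mps}: USMG {pass_rate(examinees, 'USMG', mps):.1f}%, "
          f"IMG {pass_rate(examinees, 'IMG', mps):.1f}%")
```

Under this approach, the same 2014 to 2016 cohort yields two pass rates, allowing comparison against earlier years on the original 390 standard.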
The results are shown in Figure 1. USMG performance reached a low of 91.3% in 2010 and increased steadily thereafter to a high of 98.3% in 2016. IMG performance reached a nadir of 81.2% in 2011 and, for the most part, increased steadily thereafter to a high of 96.8% in 2016. The change in the MPS did increase the 2014 to 2016 pass rates for both USMGs and IMGs, but the increase from the 2013 baseline to the 390-adjusted pass rate was always larger than the increase from the 390-adjusted pass rate to the actual pass rate under the 380 standard.
Although it is difficult to attribute the variance in these trends to each of the individual strategies employed, we have previously demonstrated the predictive value of the ITE for subsequent CE performance, as well as a greater likelihood of passing the CE after completion of a SAM.3,4 Given the timing of these initiatives relative to the pass rate increases, it seems plausible that a time lag, reflecting adoption of the tools and policies and how educators and residents learned to make the best use of them, was at work. However, we cannot discount that some or all of the improvement might be explained by the improving quality of family medicine trainees recruited into training programs during this period. Further investigation should explore the impact of each of these factors on examination performance.
Notes
This article was externally peer reviewed.
Funding: none.
Conflict of interest: JCP, MRP, and TRO are employees of the ABFM.
To see this article online, please go to: http://jabfm.org/content/30/5/570.full.
- Received for publication February 20, 2017.
- Revision received May 8, 2017.
- Accepted for publication May 8, 2017.