Considerations of a Resident Recruitment Committee on the USMLE Step 1 Examination
Abstract:
To the Editor: As members of a residency recruitment committee (RRC), we disagree with Andolsek1 and Chen et al.2 Eliminating the United States Medical Licensing Examination (USMLE) Step 1 score would limit the RRC's ability to evaluate candidates for postgraduate training, threaten board certification, and jeopardize the excellence of physicians.

Selection of medical school applicants for residency is rigorous. Ideal applicant characteristics include motivation to achieve excellence (a higher goal than competence) and a strong likelihood of successful board certification. We previously described a successful, well-rounded approach to candidate selection.3 Assessment of medical knowledge (MK) is continuous throughout medical school, postgraduate training, and maintenance of certification, and successful certification is paramount for individuals as well as residency training programs. In anesthesiology, for example, Step 1 and in-training examination scores predict academic and clinical success.4,5 Additionally, anesthesiology residents must pass part one of their specialty board certification exams during their second year of residency; failing this exam precludes continuation in residency. Data from our institution demonstrate that success on this exam correlates strongly with Step 1 performance. Furthermore, the lay public expects physicians to possess excellent MK; we suspect they would embrace excellence over competence.

The paucity of useful information within the medical student performance evaluation (MSPE) increases the need to rely on Step 1 scores. Most MSPEs now use pass/fail grading, employ an arbitrary ranking system in which "outstanding" is equally likely to mean the first or the fourth quartile, and omit National Board of Medical Examiners shelf examination scores from clerkships. Without useful information in the MSPE, numeric Step 1 scores are critical for the evaluation of residency applicants.

Johanna Blair de Haan, MD
Assistant professor, Department of Anesthesiology, McGovern Medical School at UTHealth, Houston, Texas; [email protected]

Travis Markham, MD
Assistant professor, Department of Anesthesiology, McGovern Medical School at UTHealth, Houston, Texas

Semhar Ghebremichael, MD
Assistant professor, Department of Anesthesiology, McGovern Medical School at UTHealth, Houston, Texas

Keywords:
Excellence
Board certification
Specialty
Licensure
Grading
Graduate medical education
Educational measurement
Milestone Achievements in a National Sample of PEM Fellows: Impact of Primary Residency Training
Abstract: Purpose: Pediatric Emergency Medicine (PEM) fellowships uniquely draw from two distinct residencies: pediatrics (American Board of Pediatrics, ABP) or emergency medicine (American Board of Emergency Medicine, ABEM). The Accreditation Council for Graduate Medical Education (ACGME) defines separate track requirements for each, with the 2015 PEM Milestones reflecting a combination of milestones from the two residencies. Training is disparate: most applicants from pediatrics or EM complete 3 years of residency, while some EM residents complete a 4-year residency. While …
Milestone
Graduate medical education
Pediatric emergency medicine
Abstract: Objective: To determine whether changes in graduate medical education (GME) funding have had an impact on emergency medicine (EM) residency training programs.

Methods: A 34-question survey was mailed to the program directors (PDs) of all 115 Accreditation Council for Graduate Medical Education (ACGME)-accredited EM residency programs in the United States in the fall of 1998, requesting information concerning the impact of changes in GME funding on various aspects of EM training. The results were then compared with a similar unpublished survey conducted in the fall of 1996.

Results: One hundred one completed surveys were returned (88% response rate). Seventy-one (70%) of the responding EM residency programs were PGY-I through PGY-III, compared with 55 (61%) of the responding programs in 1996. The number of PGY-II through PGY-IV programs decreased from 25 (28%) of responding programs in 1996 to 17 (16%). The number of PGY-I through PGY-IV programs increased slightly (13 vs 10); the number of EM residency positions remained relatively stable. Fifteen programs projected an increase in their number of training positions in the next two years, while only three predicted a decrease. Fifty-six programs reported reductions in non-EM residency positions, and 35 programs reported elimination of fellowship positions at their institutions; only four of these were EM fellowships. Forty-six respondents reported a reduction in the number of non-EM residents rotating through their emergency departments (EDs), and 11 of these programs reported a moderate to significant effect on their ability to adequately staff the ED with resident physicians. Sixteen programs limited resident recruitment to only those eligible for the full three years of GME funding. Eighty-seven EM programs reported no change in faculty size due to funding issues. Sixty-two programs reported no change in the total number of hours of faculty coverage in the ED, while 34 reported an increase. Three EM programs reported recommendations being made to close their residency programs in the near future.

Conclusions: Changes in GME funding have not caused a decrease in the number of existing EM residency and fellowship training positions, but they may have had an impact in other areas, including an increase in the number of EM programs structured in a PGY-I through PGY-III format (with a corresponding decrease in PGY-II through PGY-IV programs); a decrease in the number of non-EM residents rotating through the ED; restriction by some EM training programs of applicants ineligible for full GME funding; and an increase in total faculty clinical hours without an increase in faculty size.
Graduate medical education
To determine whether the 2003 Accreditation Council for Graduate Medical Education (ACGME) duty hours reform affected medical knowledge as reflected by written board scores for internal medicine (IM) residents, the authors conducted a retrospective cohort analysis of postgraduate year 1 (PGY-1) IM residents who started training before and after the 2003 duty hours reform, using a merged data set of American Board of Internal Medicine (ABIM) board examination and National Board of Medical Examiners (NBME) United States Medical Licensing Examination (USMLE) Step 2 Clinical Knowledge test scores. Specifically, using four regression models, the authors compared IM residents beginning PGY-1 training in 2000 and completing training unexposed to the 2003 duty hours reform (PGY-1 2000 cohort, n = 5,475) with PGY-1 cohorts starting in 2001 through 2005 (n = 28,008), all with some exposure to the reform.

The mean ABIM board score for the unexposed PGY-1 2000 cohort was 491 (SD = 85). Adjusting for demographics, program, and USMLE Step 2 exam score, the mean differences (95% CI) in ABIM board scores between the PGY-1 2001, 2002, 2003, 2004, and 2005 cohorts and the PGY-1 2000 cohort were -5.43 (-7.63, -3.23), -3.44 (-5.65, -1.24), 2.58 (0.36, 4.79), 11.10 (8.88, 13.33), and 11.28 (8.98, 13.58) points, respectively. None of these differences exceeded one-fifth of an SD in ABIM board scores.

The duty hours reform of 2003 did not meaningfully affect medical knowledge as measured by scores on the ABIM board examinations.
Graduate medical education
Board certification
Demographics
Educational measurement
Institutional review board
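The conclusion above rests on comparing each adjusted cohort difference against one-fifth of an SD (85/5 = 17 points). A minimal sketch of that arithmetic in Python, using only the values reported in the abstract above:

```python
# Effect sizes of adjusted cohort differences relative to the unexposed
# cohort's SD, using the values reported in the abstract above.
sd_2000 = 85  # SD of ABIM board scores, PGY-1 2000 cohort

# Adjusted mean differences (each cohort minus PGY-1 2000), in score points
diffs = {2001: -5.43, 2002: -3.44, 2003: 2.58, 2004: 11.10, 2005: 11.28}

threshold = sd_2000 / 5  # one-fifth of an SD = 17 points
for year, d in diffs.items():
    print(f"{year}: {d:+.2f} points = {d / sd_2000:+.3f} SD "
          f"(exceeds 0.2 SD: {abs(d) > threshold})")
```

The largest difference, 11.28 points, is roughly 0.13 SD, which is why the authors describe the effect as not meaningful.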
Purpose: To determine whether scores on structured interview (SI) questions designed to measure noncognitive competencies in physicians (1) predict subsequent first-year resident performance on Accreditation Council for Graduate Medical Education (ACGME) milestones and (2) add incremental validity over United States Medical Licensing Examination (USMLE) Step 1 and Step 2 Clinical Knowledge scores in predicting performance.

Method: The authors developed 18 behavioral description questions to measure key noncognitive competencies (e.g., teamwork). In 2013–2015, 14 programs (13 residency, 1 fellowship) from 6 institutions used subsets of these questions in their selection processes. The authors conducted analyses to determine the validity of SI and USMLE scores in predicting first-year resident milestone performance in the ACGME's core competency domains and overall.

Results: SI scores predicted midyear and year-end overall performance (r = 0.18 and 0.19, respectively; P < .05) and year-end performance on the patient care, interpersonal and communication skills, and professionalism competencies (r = 0.23, 0.22, and 0.20, respectively; P < .05). SI scores contributed incremental validity over USMLE scores in predicting year-end performance on patient care (ΔR = 0.05), interpersonal and communication skills (ΔR = 0.09), and professionalism (ΔR = 0.09; all P < .05). USMLE scores contributed incremental validity over SI scores in predicting year-end performance overall and on patient care and medical knowledge.

Conclusions: SI scores predict first-year resident year-end performance in the interpersonal and communication skills, patient care, and professionalism competency domains. Future research should investigate whether SIs predict a range of clinically relevant outcomes.
Graduate medical education
Milestone
Licensure
Educational measurement
Core competency
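Incremental validity of the kind reported above (ΔR) is typically estimated with hierarchical regression: fit a baseline model on USMLE scores, add the SI scores, and compare the multiple correlations. A minimal sketch on synthetic data; the variable names, distributions, and effect sizes are illustrative assumptions, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200  # simulated residents; all values below are synthetic

usmle = rng.normal(230, 15, size=(n, 2))  # hypothetical Step 1 / Step 2 CK scores
si = rng.normal(3.5, 0.6, size=(n, 1))    # hypothetical structured-interview score
# Synthetic year-end milestone rating with small contributions from each predictor
milestone = 0.01 * usmle.sum(axis=1) + 0.4 * si.ravel() + rng.normal(0, 1, n)

def multiple_r(X, y):
    """Multiple correlation R (square root of R^2) from an OLS fit."""
    r2 = LinearRegression().fit(X, y).score(X, y)
    return np.sqrt(max(r2, 0.0))

r_base = multiple_r(usmle, milestone)                   # USMLE scores only
r_full = multiple_r(np.hstack([usmle, si]), milestone)  # USMLE + SI scores
print(f"R(USMLE) = {r_base:.3f}, R(USMLE+SI) = {r_full:.3f}, "
      f"delta R = {r_full - r_base:.3f}")
```

In the study's terms, a positive ΔR for the SI block means the interviews add predictive information beyond the licensing examinations.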
The American Academy of Family Physicians (AAFP) and the Accreditation Council for Graduate Medical Education's (ACGME) program requirements for residency education in family practice acknowledge the importance of research and other scholarly activity in residency training.[1] Included in the …
Graduate medical education
Graduate Education
To determine if there is an association between several commonly obtained premedical school and medical school measures and board certification performance, we specifically included measures from our institution for which we have predictive validity evidence into the internship year. We hypothesized that board certification would be most likely to be associated with clinical measures of performance during medical school and with scores on standardized tests, whether taken before or during medical school.

Achieving board certification in an American Board of Medical Specialties specialty was used as our outcome measure for a 7-year cohort of graduates (1995-2002). Age at matriculation, Medical College Admission Test (MCAT) score, undergraduate college grade point average (GPA), undergraduate college science GPA, Uniformed Services University (USU) cumulative GPA, USU preclerkship GPA, USU clerkship year GPA, departmental competency committee evaluation, Internal Medicine (IM) clerkship clinical performance rating (points), IM total clerkship points, history of Student Promotion Committee review, United States Medical Licensing Examination (USMLE) Step 1 score, and USMLE Step 2 clinical knowledge score were tested for association with this outcome.

Ninety-three of 1,155 graduates were not certified, resulting in an average board certification rate of 91.9% for the study cohort. Significant small correlations were found between board certification and IM clerkship points (r = 0.117), IM clerkship grade (r = 0.108), clerkship year GPA (r = 0.078), undergraduate college science GPA (r = 0.072), preclerkship GPA and medical school GPA (r = 0.068 for both), USMLE Step 1 (r = 0.066), undergraduate college total GPA (r = 0.062), and age at matriculation (r = -0.061). Comparing the board certified and not board certified groups, significant differences were seen for all included variables except the MCAT and USMLE Step 2 clinical knowledge scores. Taken together, all the variables explained 4.1% of the variance in board certification by logistic regression.

This investigation provides some additional validity evidence that the measures collected for student evaluation before and during medical school are warranted.
Matriculation
Board certification
Specialty
Licensure
Educational measurement
Entrance exam
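The "4.1% of the variance" figure above is the kind of number a pseudo-R² from a logistic regression yields. A minimal sketch on synthetic data of how such a figure is computed; the predictors, coefficients, and effect sizes are illustrative assumptions (only n = 1,155 and the roughly 92% certification rate come from the abstract):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1155  # cohort size from the abstract; everything else is synthetic

# Three standardized predictors, e.g. clerkship GPA, Step 1, science GPA
X = rng.normal(size=(n, 3))
# Weak effects and an intercept giving a ~92% baseline certification rate
logit_p = 2.4 + X @ np.array([0.25, 0.15, 0.10])
certified = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

full = sm.Logit(certified, sm.add_constant(X)).fit(disp=0)
null = sm.Logit(certified, np.ones((n, 1))).fit(disp=0)

# Cox-Snell and Nagelkerke pseudo-R^2 from the two log-likelihoods
cox_snell = 1 - np.exp((2 / n) * (null.llf - full.llf))
nagelkerke = cox_snell / (1 - np.exp((2 / n) * null.llf))
print(f"McFadden R2 = {full.prsquared:.3f}, Nagelkerke R2 = {nagelkerke:.3f}")
```

With a rare negative outcome and weak predictors, pseudo-R² values in the low single digits, like the 4.1% reported, are typical.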
Preparing family medicine physicians to meet the needs of their patients is a fundamental goal of residency training. These needs shift, and so training must also adapt. The revised Accreditation Council for Graduate Medical Education (ACGME) requirements for GME in family medicine call on residency …
Graduate medical education
Introduction: The United States Medical Licensing Examination (USMLE) Step 1 score is one of the few standardized metrics used to objectively review applicants for residency. In February 2020 the USMLE program announced that numerical Step 1 scoring would be changed to a binary (pass/fail) system. In this study we sought to characterize how this change in score reporting will affect the application review process for emergency medicine (EM) program directors (PDs).

Methods: In March 2020 we electronically distributed a validated anonymous survey to the PDs of 236 US EM residency programs accredited by the Accreditation Council for Graduate Medical Education.

Results: Of 236 EM PDs, 121 responded (51.3% response rate). Overall, 72.7% believed binary Step 1 scoring would make objectively comparing applicants more difficult. A minority (19.8%) believed the change was a good idea, and 33.1% felt it would improve medical student well-being. The majority (88.4%) reported that they would increase their emphasis on Step 2 Clinical Knowledge (CK) in resident selection, and 85% planned to require Step 2 CK scores at the time of application submission.

Conclusion: Our study suggests most EM PDs disapprove of the new Step 1 scoring. As more objective data are peeled away from the residency application, EM PDs will be left to rely more heavily on the few remaining measures, including Step 2 CK scores and standardized letters of evaluation. Further changes are needed to promote equity and improve the overall quality of the application process for students and PDs.
Graduate medical education
Equity