A working group to develop educational Milestones for surgery was first convened in 2009. Under the leadership of Richard H. Bell Jr, MD, of the American Board of Surgery (ABS), the group consisted of 12 members representing the ABS, the Accreditation Council for Graduate Medical Education (ACGME), the Residency Review Committee for Surgery (RRC-Surgery), and the Association of Program Directors in Surgery (APDS). The group included 1 resident member and an expert in surgical evaluation and performance assessment. In 2011, 5 new members were added: 2 each from the RRC-Surgery and the APDS, and a new chair from the ABS, Thomas H. Cogbill, MD.

The working group began the development of the Milestones by reviewing the Dreyfus model of professional development (from novice to expert),1 along with perspectives from the literature on expertise2 and contemporary methods of evaluating surgeons in training.3–5 The working group then undertook a multistep project to define the primary domains of practice in which surgical trainees needed to develop proficiency. Eight distinct domains of surgical practice were selected: (1) care for diseases and conditions; (2) coordination of care; (3) performance of operations and procedures; (4) self-directed learning; (5) teaching; (6) improvement of care; (7) maintenance of physical and emotional health; and (8) performance of administrative tasks. These domains provided a framework for identifying Milestones integrally related to residents' learning and performance of patient care and related professional responsibilities. The group identified the ACGME competencies most relevant to each domain and then developed Milestones for each of these competency-domain pairs.

The group drew from the widely used SCORE curriculum to describe advancing expertise in patient care and technical skills.6 For example, early levels of patient care for the domain "Performance of Operations and Procedures" describe the expectation that residents perform steps in some essential/common operations identified in the SCORE curriculum, whereas higher levels describe the expectation that residents perform all essential operations without the need for direct supervision and have significant experience with complex operations in the SCORE curriculum. The working group also referenced the 2011 report from the ACGME expert panel on Milestone development for professionalism, interpersonal and communication skills, practice-based learning and improvement, and systems-based practice to modify draft Milestones. A guiding principle was to make the Milestones as succinct as possible while ensuring that the essential skills, knowledge, and behaviors for unsupervised surgical practice were represented. With this in mind, the working group conducted numerous reviews of progressive iterations of the Milestones and made revisions to reduce redundancy and fine-tune the language.

The final product consisted of 16 subcompetencies, each mapped to both an ACGME competency and a single domain of surgical practice. For each subcompetency, descriptors were composed for a critical deficiency level and for a "Milestone set" consisting of 4 levels of performance; the descriptors were designed to discriminate among the levels of performance.

The 16 sets of Surgery Milestones map to the ACGME competencies as follows: 3 Milestone sets for patient care; 2 for medical knowledge; 2 for systems-based practice; 3 for practice-based learning and improvement; 3 for professionalism; and 3 for interpersonal and communication skills. The Milestone sets map to the domains of surgical practice as follows: 5 sets for care for diseases and conditions; 2 for coordination of care; 3 for performance of operations and procedures; 1 for self-directed learning; 1 for teaching; 2 for improvement of care; 1 for maintenance of physical and emotional health; and 1 for performance of administrative tasks.
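As a simple arithmetic check on the two mappings just listed, both tallies must account for the same 16 Milestone sets, since each set maps to exactly one competency and one domain. A minimal Python sketch restating the counts from the text:

```python
# Tallies of the 16 Surgery Milestone sets, restated from the text.
by_competency = {
    "patient care": 3,
    "medical knowledge": 2,
    "systems-based practice": 2,
    "practice-based learning and improvement": 3,
    "professionalism": 3,
    "interpersonal and communication skills": 3,
}
by_domain = {
    "care for diseases and conditions": 5,
    "coordination of care": 2,
    "performance of operations and procedures": 3,
    "self-directed learning": 1,
    "teaching": 1,
    "improvement of care": 2,
    "maintenance of physical and emotional health": 1,
    "performance of administrative tasks": 1,
}

# Each Milestone set maps to exactly one competency and one domain,
# so both tallies must sum to the same 16 sets.
assert sum(by_competency.values()) == 16
assert sum(by_domain.values()) == 16
```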
In December 2011, an alpha pilot test of a semiannual evaluation form for the Surgery Milestones was conducted in 8 stakeholder surgical residencies, the programs of 8 members of the Surgery Milestone Working Group. The results and comments were compiled and shared with the entire working group. The document then underwent significant revision, with reordering of some descriptors by level of performance, incorporation of more consistent language, and removal of duplicative descriptions. The revised document and a summary of the process used in its development were shared with the ABS, the APDS, and the RRC-Surgery at the spring meeting of each organization, and additional revisions were made in response to their suggestions.

In June 2012, the working group launched a 2-phase beta pilot test. An invitation was sent to 22 diverse general surgery residency programs; residencies with a stakeholder member of the Milestone Working Group were excluded. The test involved Clinical Competency Committee (CCC) evaluation of a sample of each program's residents (2 residents per postgraduate year) at 2 times approximately 6 months apart: the first at the end of academic year 2011–2012 and the second in the middle of academic year 2012–2013. The pilot was designed to investigate whether the Milestones would detect change in resident learning and performance over a 6-month period and whether the predicted patterns of performance across levels of training and levels of Milestones would occur. Participants also completed surveys at the end of each phase, which collected participant perspectives on Milestone clarity, usefulness relative to existing methods, and feasibility of implementation, including time to complete the Milestone report. The survey also asked participants to identify Milestones that needed revision.

Eighteen programs completed Phase 1 in August 2012 and 17 programs completed Phase 2 in January 2013. The typical CCC consisted of 4 members, and the average time to complete the semiannual resident evaluation was 18 minutes in Phase 1 and 14 minutes in Phase 2. For most residents, CCCs selected higher Milestone performance levels in Phase 2 than in Phase 1, and Milestone evaluation results showed the expected progression for most subcompetencies.

Faculty participants provided positive evaluations of the Milestones, with 97% of respondents reporting that the descriptors were clearly understandable. In addition, 85% concluded that use of the Milestones allowed for meaningful evaluation of the resident, and 76% stated that Milestone-based evaluations provided a fairer and more systematic semiannual evaluation than their program's current review process. The working group also trialed 1 additional approach to the CCC, with the pilot instructions asking that all members review all residents in the context of the CCC.
A Phase 2 survey question asked participants about the CCC approach they anticipated using in the future. The most frequent response expressed a need to distribute the Milestone reporting work across time, committee members, and education sites.

The results of this comprehensive pilot were shared with the ABS, ACS, and APDS at their respective annual meetings. In May 2013, the Surgery Milestone Working Group considered the comments and suggestions received from the participants of the beta pilot test as well as from members of the aforementioned organizations. A considerable number of additions, revisions, and deletions were made to incorporate this feedback, and the document was finalized. The final draft of the Surgery Milestone document was posted to the ACGME website in June 2013.

Early in Milestone design, the working group considered the process by which Milestone evaluations would be accomplished. An evaluation plan was drafted that identified a set of existing evaluation tools that could be used at each level of training. The working group's discussions emphasized that the evaluation process would need to be continuous and practical, and that new evaluation tools would need to be developed to allow CCCs to complete the semiannual reports for all residents.

The working group intends the educational Milestones in surgery to be a dynamic document. Future changes will be made based on a review of suggestions and comments from surgery program directors and CCC members at surgery residency programs. The group anticipates that changes will also be needed to parallel advances in surgical care and education. The working group is eager to learn how the document will perform both for individual resident evaluation and as a tool for the accumulation of national aggregate data on resident performance.
The Accreditation Council for Graduate Medical Education (ACGME) has outlined its "Next Accreditation System" (NAS), which will focus on resident and residency outcome measurements. Emergency medicine (EM) is one of seven specialties that will implement the NAS beginning July 2013; all other specialties will follow in July 2014. A key component of the NAS is the development of assessable milestones, which are explicit accomplishments or behaviors that occur during the process of residency education. Milestones describe competencies more specifically and identify specialty-specific knowledge, skills, attitudes, and behaviors (KSABs) that can be used as outcome measures within the general competencies. The ACGME and the American Board of Emergency Medicine (ABEM) convened an EM milestone working group to develop the EM milestones. This article describes the development of the EM milestones, their use within the NAS, and the challenges they present.

In May 2008, Dr. Thomas Nasca outlined the Accreditation Council for Graduate Medical Education (ACGME) vision of "The Next Step in the Outcomes-Based Accreditation Project."1 This included two components: 1) the development of competency milestones for each specialty and 2) the implementation of milestone assessment tools for each specialty.2 This represented an extension of the Outcomes Project, in which six general competencies were defined,3 and marked a transition from a process focus to an outcomes focus. In 2011, the ACGME, through the Residency Review Committee for Emergency Medicine (RRC-EM), and the American Board of Emergency Medicine (ABEM) jointly convened the Emergency Medicine (EM) Milestone Working Group (EM MWG). This group was composed of representatives from major organizations within the specialty and reported to the EM Milestone Advisory Panel (see Data Supplement S1, available as supporting information in the online version of this paper). The EM MWG was given the task of developing the EM Milestones.2 This article describes the process used for the initial development of the EM Milestones, as well as challenges in their assessment.
In 1998, the ACGME shifted the focus of resident evaluation from a predominantly process-oriented approach toward the measurement of educational outcomes.3 The Outcomes Project identified the six general competencies (patient care, medical knowledge, interpersonal and communication skills, professionalism, practice-based learning and improvement, and systems-based practice) and a timeline for moving from awareness of the six competencies to the implementation of outcome measurements related to each of them.4 The competencies served as a framework for residency program curricula. In 2001, the general competencies were written into the ACGME program requirements, and by 2008, programs were expected to enhance teaching and assessment of the competencies. Beginning in 2009, the next phase of the Outcomes Project was identified as the time when the ACGME would begin to collect and use educational outcome data in accreditation.5

The concept of milestone development for residency training was introduced in May 2008, when Thomas Nasca, MD, ACGME CEO, announced the Milestone Project in the ACGME Bulletin.1 Educational milestones represent explicit accomplishments or behaviors that occur during the process of becoming a physician capable of the "entrustable professional activities" defined by each specialty.6 Milestones augment the general competencies by 1) describing competencies more specifically than current program requirements do; 2) identifying core specialty-specific knowledge, skills, attitudes, and behaviors (KSABs); and 3) describing the competencies as progressions of KSABs that residents should demonstrate across the residency continuum from entry to graduation. The decision to develop milestones describing progressive expectations for learning and performance is consistent with developmental models such as the Dreyfus and Dreyfus model of skill acquisition,7 which emphasize the progressive nature of acquiring expertise and thereby provide a conceptual basis for milestones. Resident acquisition of the KSABs within the milestones will function as educational outcome data for accreditation.

The ACGME will launch its Next Accreditation System (NAS) beginning July 2013.8 The milestones satisfy the Outcomes Project's overall goal of increasing the emphasis on outcomes and will serve as an indicator of residency program quality in the NAS. Current plans call for resident acquisition of markers within the milestones to be assessed semiannually. Each program will monitor the forward progress of each resident through the milestones and use stalled progress or regression as one indication of the need for remediation. Each program's aggregate data on resident progress within the various milestone subcompetencies will be reviewed by the specialty's Residency Review Committee to determine overall progression in performance. Initially, milestone data will be used by the ACGME to identify areas of improvement for programs rather than as a source of citations; the NAS emphasizes program improvement in contrast to conferring adverse actions. As such, patterns of milestone data will be examined to determine whether there are gaps in the curriculum or in resident capabilities that need to be addressed. In addition, milestone data will be used to provide assurance to the public, payers, and policy-makers that residency programs are educating residents in targeted areas of health care delivery. These include the ability to practice in a variety of settings and systems, coordination of care across settings, cost and value of diagnostic and treatment options, interprofessional and multidisciplinary care teams, methods for identifying system errors and implementing system solutions, and the use of health technology.
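The monitoring rule described above (stalled progress or regression between semiannual reviews as one trigger for remediation review) can be expressed as a simple screen. A hypothetical sketch; the function name, data layout, and half-level scale are illustrative assumptions, not part of the NAS specification:

```python
def flag_for_review(history):
    """Given one resident's semiannual milestone levels for a single
    subcompetency, ordered oldest to newest, flag stalled progress or
    regression between consecutive reviews.

    Returns a list of (review_index, reason) pairs. In practice a CCC
    would apply judgment (e.g., a resident already at the graduation
    target is not "stalled"); this is only the raw screen.
    """
    flags = []
    for i in range(1, len(history)):
        prev, curr = history[i - 1], history[i]
        if curr < prev:
            flags.append((i, "regression"))
        elif curr == prev:
            flags.append((i, "stalled"))
    return flags

# Example: a resident advances, stalls once, then regresses.
print(flag_for_review([1.0, 1.5, 1.5, 1.0]))
# [(2, 'stalled'), (3, 'regression')]
```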
Emergency medicine has a history of defining the specialty's core content, especially in regard to the conditions encountered and the skills necessary for the practice of EM.5 These skills and tasks form the origins of the EM milestones and can be traced back to 1975 (Figure 1), when a national survey of EM practice was conducted. This effort was coordinated by the American College of Emergency Physicians and the University Association for Emergency Medicine and administered by the Office of Medical Education, Research and Development (OMERAD) at Michigan State University9 (see Data Supplement S2, available as supporting information in the online version of this paper). A 25-member task force rated the relative importance of the conditions and skills to the practice of EM. This project resulted in the first EM Core Content document, which was used by ABEM to develop certification examination specifications.9

After 20 years of use, the EM Core Content was updated to reflect the evolving specialty.10 To guide this process, the National Board of Medical Examiners (NBME) conducted a practice analysis of EM. The NBME conducts practice (job) analyses for the purpose of establishing or refining examination content specifications,11 and this analysis had direct applicability to describing the breadth of EM practice. During this review, 40,000 ED patient encounters were examined to determine the frequency of ED visits by individual condition or disease. Based on these data, the Core Content Task Force II developed the Model of the Clinical Practice of Emergency Medicine (the EM Model), which included individual conditions and components (similar to the EM Core Content document) categorized by acuity (critical, emergent, low acuity).12 The EM Model also identified 14 separate physician tasks, such as emergency stabilization, therapeutic interventions, multitasking, and team management. The original EM Model was published in 2001 and is revised every 2 years by representatives of the major EM societies and organizations.13

In 2004, ABEM began a review of initial certification standards, and an Initial Certification Task Force (ICTF) was formed in 2007. A working group, referred to as the Relevance of Examinations to Physician Practice (REPP), was formed to review changes in the practice of EM and to determine whether ABEM examinations reflected current EM practice. As a result of this study, several changing practice patterns were identified (Data Supplement S3, available as supporting information in the online version of this paper), and a comprehensive practice survey was used to rank the prevalence of these changes. The REPP findings were important because they highlighted the changing practice of EM, such as orders being completed before a patient is seen by a physician, increased multitasking, greater use of information technology, and greater use of diagnostic testing, including ultrasound. Given the importance of the REPP findings, ABEM formed the ICTF Advisory Panel with the intention of revising its standards for initial certification.
The ICTF Advisory Panel identified the knowledge, skills, and abilities (KSAs) required to practice EM, based on changes to practice (as informed by the REPP task force findings), the EM Model, and the six competencies. After the KSAs were written, a national survey of all ABEM-certified physicians was conducted asking about the importance and frequency of the newly developed KSAs, procedures, and conditions (9,740 surveys were sent, with 2,571 respondents [28.2%]).14 The results of the survey validated the KSAs developed by the ICTF Advisory Panel. Furthermore, hierarchical levels of achievement emerged, from which new initial certification testing standards were developed.

These final certification standards and the KSAs were used by the EM MWG to closely align the completion of residency training with the initial certification examination for the milestones related to physician tasks (milestone subcompetencies 1 through 9).2 The EM MWG also developed additional procedure-based milestones through an expert consensus process (milestone subcompetencies 9 through 14).2 In addition, the ACGME convened expert panels to develop subcompetencies and milestones related to the common competencies (professionalism, interpersonal and communication skills, systems-based practice, and practice-based learning and improvement); the EM MWG modified selected subcompetencies and milestones from this group to enhance their relevance to EM (subcompetencies 15 through 23).2 Teaching was originally a subcompetency identified by the EM MWG but was later removed because of difficulty in clearly delineating milestone acquisition and progression.

As each topic was reviewed, ABEM's certification standard was also considered. To align the completion of residency training with the initial certification examinations, the EM MWG assigned ABEM's certification standard to the Level 4 category within the ACGME Milestone scoring schema (five levels of proficiency) when possible. Through this process, the EM MWG was able to align EM graduate medical education goals with ABEM initial certification standards. With the ABEM certification content standard fixed as Level 4 in the ACGME hierarchical schema, each subcompetency was then fully developed.

With each specialty developing milestones focused on that specialty, there was a need for common nomenclature across all specialties. A milestone set consists of all milestones at all levels within a particular subcompetency (Figure 2). Each milestone subcompetency (EM has 23) is categorized by the general competency affected, such as patient care for the subcompetency of emergency stabilization. A milestone is an observable behavior that falls within one of five levels of proficiency, from entry level at Level 1 (a medical school graduate) to Level 5, a level expected to be achieved only after years of clinical practice. Level 4 is the behavior expected of a graduating EM resident, matching the ABEM initial certification content standard.
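The nomenclature just defined maps naturally onto a small data model. A minimal sketch; the class and field names are illustrative, and only the structure (23 subcompetencies, five levels, Level 4 as the graduation target) comes from the text:

```python
from dataclasses import dataclass

GRADUATION_TARGET = 4  # Level 4 behavior is expected of a graduating EM
                       # resident and matches the ABEM initial certification
                       # content standard.

@dataclass
class MilestoneSet:
    """All milestones, at all five proficiency levels, for one subcompetency."""
    subcompetency: str       # e.g., "PC1: Emergency Stabilization" (EM has 23)
    general_competency: str  # e.g., "patient care"
    levels: tuple            # levels[n] lists the observable behaviors
                             # (milestones) at proficiency Level n + 1;
                             # Level 1 = a medical school graduate, Level 5 =
                             # a level reached after years of clinical practice

    def behaviors_at(self, level: int):
        if not 1 <= level <= 5:
            raise ValueError("proficiency levels run from 1 to 5")
        return self.levels[level - 1]

# Placeholder behavior text; the actual milestone wording lives in the
# published EM Milestone document.
pc1 = MilestoneSet("PC1: Emergency Stabilization", "patient care",
                   (["..."], ["..."], ["..."], ["..."], ["..."]))
```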
The ACGME will use residents' collective achievement of milestones as one of nine program performance indicators in its NAS. Programs are expected to prepare a summary milestone evaluation semiannually for each resident, and these data will be used in an annual accreditation review of program performance. Features of the process include standardized reporting, the formation of a Clinical Competency Committee (CCC), and the use of milestone acquisition as a reflection of resident and residency program performance. More specifically, programs will use a standard reporting worksheet and electronic reporting system provided by the ACGME. Each residency will form a CCC that includes faculty. The CCC will determine the milestone achievements of each resident on a semiannual basis, and its data will be entered online to the ACGME. The resulting reports can then be used by the program director toward meeting the requirement for semiannual evaluation of resident performance. In addition, the ACGME will produce reports for accreditation showing residents' attainment of milestones across time, to ascertain the program's progression in the subcompetencies.

The structure and use of the semiannual milestone report have raised a number of questions. A primary concern is data integrity, particularly with regard to accuracy and validity. Milestone subcompetency proficiency-level reporting is expected to be based on objective data. The major threats to performance assessment validity are too few performance cases (construct underrepresentation) and bias in rating (construct-irrelevant variance) arising when faculty raters know residents' year level.15 Faculty development to ensure consistent and appropriate evaluation will be an important aspect of an EM program's initiation of this process, and educational sessions that clarify the meaning of the milestones and illustrate their use in structuring feedback could help address these concerns. The use of a CCC to gather many points of information about a resident's performance will help to equalize the variation of assessments by different evaluators and thereby lessen this threat to data integrity.

A second issue tied to semiannual reporting is the potential difficulty in interpreting data when comparing milestone performance across residents, even within the same postgraduate year (PGY). Some PGY2 residents could perform better than other PGY2 residents on a subset of milestones simply because of the sequence of their exposure to a topic during the residency. Likewise, the theoretically steady progression in performance expected across 6-month evaluation periods may not be demonstrated, owing to curriculum organization and the timing of learning opportunities and rotations.
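One way a CCC might combine several faculty members' independent ratings into the single semiannual level described above is a robust consensus statistic such as the median. A hypothetical sketch; the median rule, half-level scale, and names are assumptions for illustration, not ACGME requirements:

```python
from statistics import median

def semiannual_level(ratings):
    """Combine independent faculty ratings (proficiency levels 1-5,
    half levels assumed) for one resident on one subcompetency.

    The median damps the effect of a single outlying rater, one simple
    way to equalize the variation of assessments across evaluators.
    """
    if not ratings:
        raise ValueError("at least one rating is required")
    return median(ratings)

# Four CCC members rate a resident on one subcompetency; one is an outlier.
print(semiannual_level([2.5, 3.0, 3.0, 4.5]))  # 3.0
```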
The EM Milestones and the ACGME EM Program Requirements16 are separate but related works. The program requirements define what a residency must do to receive and maintain accreditation, including providing evidence of resident progress toward competency (outcomes). The milestones are expected to demonstrate the progress a resident makes in specialty-specific achievements at established intervals throughout training.1 As an example, where the program requirements state that residents "must demonstrate competency in adult medical resuscitation" [(IV.A.5.b).(2).(c).(i)], the milestones provide the measurable instrument, Milestone PC1, "Emergency Stabilization: Prioritizes critical initial stabilization action and mobilizes hospital support services in the resuscitation of a critically ill or injured patient and reassesses after stabilizing intervention."

Simultaneous with the development of the milestones and in preparation for the NAS, every EM program requirement was identified as a core requirement, a detail requirement, or an outcome requirement. While core and detail requirements refer to program structure or process (e.g., emergency department volume, case logs), outcome requirements are "statements that specify expected measurable or observable attributes (knowledge, abilities, skills, or attitudes) of residents or fellows at key stages of their graduate medical education."17 The new program requirements include this categorization. As the EM MWG developed the milestones, it ensured that they were integrated into the program requirements; collaboration between the EM MWG and the RRC-EM resulted in the milestones being referenced as identifiable outcome requirements. Because the development of the milestones coincided with the revision of the core program requirements, the EM Milestones could be incorporated directly into the EM core program requirements.

The American public is both the consumer and the financier of the U.S. residency training system. By standardizing learning outcomes in GME, the milestones will assure the public that objective markers of knowledge and skill acquisition are being monitored. This addresses one issue the Institute of Medicine identified when it called for the development of new approaches to medical education.18 With the advent of the Milestone Project, there are now observable events that each resident must demonstrate. This will assure the public that key professional development has occurred before a residency program certifies that a resident is competent for independent practice in a medical specialty.

The bridge between the individual competencies demonstrated during residency training and the actual clinical practice of medicine is described by entrustable professional activities (EPAs),19 the series of interconnected actions performed by a physician that define a specific specialty's clinical practice and that a trainee must demonstrate he or she can be trusted to perform independently. Some specialties, such as pediatrics, have delineated EPAs and use milestone progression and completion to document for the public the professional development that leads to them. The EM Model incorporates physician tasks into its framework, and these tasks form the foundation of the EM Milestone subcompetencies. The EM MWG focused on these subcompetencies because of the significant body of data available to support their use. Further expansion of the milestone subcompetencies into EPAs for EM should be based on need and on the added value to overall milestone subcompetency assessment.

The milestones also clarify the expectations training programs have of undergraduate medical education (UME). Because all residents are expected to start their GME training with the initial competencies for each milestone, medical schools have a much clearer understanding of the basic knowledge and abilities that training programs expect of students.20 The Milestone Project thus has the potential to significantly affect UME. At the other end of the continuum, the EM Milestones have the potential to inform continuing professional development, especially with regard to the ABEM Maintenance of Certification (MOC) program.
In 2004, MOC was developed in an effort to ensure that physicians are committed to lifelong learning and competency in their specialty by requiring ongoing acquisition of skill and knowledge.21 The milestone subcompetencies lie along this continuum of knowledge and skill: acquisition during residency and maintenance during practice.

Milestones are observable behaviors that will be reported to the ACGME semiannually within each of the 23 EM subcompetencies. The next challenge for each residency, and for the specialty as a whole, is the development of objective measures of milestone subcompetency assessment. Multiple assessment tools will need to be developed that provide objective measures of one or more milestone subcompetencies, and issues of assessment tool validity and inter-rater reliability will need to be studied and addressed as these tools are developed, piloted, and put into widespread use.

The specialty of EM has developed a set of milestone subcompetencies.2 These 23 subcompetencies span the six general competencies. Their development was based on expert group opinion with input from all major EM organizations, previous work completed by ABEM, expert groups within the ACGME, and a desire to closely align the milestone subcompetency proficiency levels with the corresponding ABEM initial certification standards. The development and implementation of assessment methods will present the next great challenge to the specialty of EM.
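Inter-rater reliability, noted above as an open measurement issue, is commonly quantified with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal illustrative sketch for two raters assigning proficiency levels; the data are invented:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels
    (e.g., milestone proficiency levels) to the same residents."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must label the same nonempty set of cases")
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters match.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    if expected == 1:  # both raters used a single identical label
        return 1.0
    return (observed - expected) / (1 - expected)

# Two faculty rate six residents on one subcompetency (levels 1-5).
print(round(cohens_kappa([2, 3, 3, 4, 2, 5], [2, 3, 4, 4, 2, 5]), 2))  # 0.78
```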
Objectives: To report the results of a project designed to develop and implement a prototype methodology for identifying candidate patient care quality measures for potential use in assessing the outcomes and effectiveness of graduate medical education in emergency medicine.

Methods: A workgroup composed of experts in emergency medicine residency education and patient care quality measurement was convened. Workgroup members performed a modified Delphi process that included iterative review of potential measures; individual expert rating of the measures on four dimensions, including whether a measure reflects quality of care and educational effectiveness; development of consensus on measures to be retained; external stakeholder rating of measures followed by a final workgroup review; and a post hoc stratification of measures. The workgroup also completed a structured exercise to examine the linkage of patient care process and outcome measures to educational effectiveness.

Results: The workgroup selected 62 measures for inclusion in its final set: 43 measures for 21 clinical conditions, eight medication measures, seven measures for procedures, and four measures for department efficiency. Twenty-six measures met the more stringent criteria applied post hoc to further stratify and prioritize measures for development. Nineteen of these measures received high ratings from 75% of the workgroup and external stakeholder raters on importance for care in the ED, on whether the measure reflects quality of care, and on whether it reflects educational effectiveness; the majority of raters considered these indicators feasible to measure. The workgroup used a simple framework to explore the relationships among residency program educational activities, competencies from the six Accreditation Council for Graduate Medical Education general competency domains, patient care quality measures, and external factors that could intervene to affect care quality.

Conclusions: Numerous patient care quality measures have potential for use in assessing the educational effectiveness and performance of graduate medical education programs in emergency medicine. The measures identified in this report can be used as a starter set for further development, implementation, and study. Implementation of the measures, especially for high-stakes use, will require resolution of significant measurement issues.
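The consensus criterion described in the Results lends itself to a simple computation. A hypothetical sketch, assuming a 1-5 rating scale and treating a "high rating" as 4 or above; only the 75% threshold and the three dimension names come from the abstract:

```python
def retain(measure_ratings, dimensions, threshold=0.75, high=4):
    """measure_ratings: {dimension: [rating, ...]} on an assumed 1-5 scale.
    A measure is retained when at least `threshold` of raters score it
    `high` or above on every listed dimension."""
    return all(
        sum(r >= high for r in measure_ratings[d]) / len(measure_ratings[d])
        >= threshold
        for d in dimensions
    )

# Eight raters score one candidate measure on the three dimensions named
# in the Results (importance for ED care, reflects quality of care,
# reflects educational effectiveness).
ratings = {
    "importance": [5, 4, 4, 5, 4, 3, 5, 4],
    "quality_of_care": [4, 4, 5, 4, 4, 4, 3, 5],
    "educational_effectiveness": [4, 5, 4, 4, 3, 4, 4, 5],
}
print(retain(ratings, list(ratings)))  # True: 7/8 raters high on each
```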