Article · Open access · Peer-reviewed

How Competent Are Emergency Medicine Interns for Level 1 Milestones: Who Is Responsible?

2013; Wiley; Volume: 20; Issue: 7; Language: English

10.1111/acem.12162

ISSN

1553-2712

Authors

Sally A. Santen, Nicholas Rademacher, Sheryl L. Heron, Sorabh Khandelwal, Samantha J. Hauff, Laura R. Hopson

Topic(s)

Radiology practices and education

Abstract

The Next Accreditation System (NAS) of the Accreditation Council for Graduate Medical Education (ACGME) includes the implementation of developmental milestones for each specialty. The milestones include five progressively advancing skill levels, with Level 1 defining the skill level of a medical school graduate and Level 5 that of an attending physician. The goal of this study was to ask interns how well they thought their medical schools had prepared them to meet the proposed emergency medicine (EM) Level 1 milestones. In July 2012, an electronic survey was distributed to the interns of 13 EM residency programs, asking whether they had been taught and assessed on the proposed Level 1 milestones. Of the possible participants, 113 of 161 interns responded (70% response rate), representing all four regions of the country. Interns reported that the rates at which Level 1 milestones had been taught ranged from 61% for ultrasound to 98% for performance of a focused history and physical examination. A substantial number of interns (up to 39%) reported no instruction on milestones such as patient disposition, pain management, and vascular access. Graduating medical students were less commonly assessed than taught the milestones. Skills with technology, including "explain the role of the electronic health record and computerized physician order entry," were assessed for only 39% of interns, whereas knowledge (USMLE) and the history and physical examination were assessed in nearly all interns. Disposition, ultrasound, multitasking, and wound management were assessed less than half of the time. Many entering EM interns may not have had either teaching or assessment on the knowledge, skills, and behaviors making up the Level 1 milestones expected of graduating medical students. Thus, there is a potential gap in the teaching and assessment of entering EM interns. Based on these findings, it is unclear who will be responsible (medical schools, EM clerkships, or residency programs) for ensuring that medical students entering residency have achieved the Level 1 milestones.
The Accreditation Council for Graduate Medical Education (ACGME) has mandated transition to the Next Accreditation System (NAS).1 One intent of the NAS is to improve graduate medical education (GME), specifically in the focus areas of "patient safety, quality improvement, care transitions, supervision, and professional responsibility."1 Another component of the NAS involves implementation of specific developmental milestones for each specialty to improve the clarity of competency-based assessment. Each of the six competencies has multiple milestones that should be met progressively through training. Each milestone contains five levels of progressively advanced skills; Level 1 describes the competencies that should be demonstrated by medical school graduates, and Level 5 describes those of attending physicians.2, 3 The proposed emergency medicine (EM) milestones released in May 2012 are meant to help guide training from the end of medical school through residency completion.2

While the EM Residency Review Committee and the ACGME, as the accrediting bodies, set clear residency competency objectives, the Liaison Committee on Medical Education is not prescriptive in setting objectives for undergraduate medical education (UME). This allows medical schools to determine their own educational objectives for graduation and introduces potential variation into achievement of the Level 1 milestones. While many schools have adopted the ACGME competencies, it is unclear where the milestones will be embedded in the curriculum and how well prepared new medical school graduates are to meet Level 1 milestones. Moreover, the question of whether the responsibility for ensuring that interns have met the milestones lies with UME or GME has not been resolved.

There is a paucity of data on the level of preparation of medical school graduates for these competencies. The objective of this study was to begin to understand to what extent medical schools currently prepare EM interns to meet the Level 1 milestones by querying interns in July about which milestones they had been taught or assessed on. This information will help guide the three major stakeholder groups (EM clerkship directors, residency directors, and medical school deans) in planning for the transition from medical school to internship.
An anonymous electronic survey (Qualtrics Labs, Provo, UT) was distributed several times during July to EM interns starting residency at 13 EM programs. This study was reviewed by the institutional review board and granted exemption from informed consent requirements. The survey was developed by the authors. To provide content validity evidence, the milestones were listed verbatim. The survey was distributed to selected individuals for feedback on clarity, readability, and appropriateness to ensure response process validity. The survey consisted of questions asking EM interns whether they recalled having been taught and assessed on the proposed EM Level 1 milestones (see the Data Supplement S1, available as supporting information in the online version of this paper).2 To preserve student anonymity, we collected only the region where each intern's medical school was located. Residency leadership were sent the survey invitation multiple times and asked to forward the e-mail link to their interns. One site was excluded due to difficulty distributing the survey. Descriptive data are reported without statistical analysis, demonstrating the rates of EM milestones taught and assessed.

The response rate was 70%, with responses from 113 of 161 interns who had graduated from US medical schools. Participants attended medical schools from all four regions (16% west, 22% south, 34% midwest, and 28% northeast). Seventy-five percent of respondents attended medical schools with EM residency programs. Participants had a mean (±SD) of 3.1 (±0.7) months of EM training during medical school. The responses of EM interns on the rates of EM Level 1 milestones taught and assessed are noted in Table 1. It appears that medical schools are more likely to teach than to assess the milestones, with many of the milestones taught to nearly all of the EM interns. A substantial minority (up to 39%) reported no instruction on milestones such as patient disposition, pain management, multitasking, or ultrasound. Moreover, disposition, ultrasound, multitasking, and wound management were assessed less than half of the time.

The July 2014 NAS phase-in presents a daunting task.1 This study demonstrates that while many graduating medical students have been taught and assessed on some of the Level 1 EM milestones, there remains a teaching and assessment gap. The question remains: who is responsible for filling the gap and ensuring that interns have consistently met the Level 1 milestones? Based on the ACGME Milestone Project, Level 1 milestones should be achieved by the end of medical school. Yet medical schools do not teach or assess all milestones for each specialty. Most would agree that schools are responsible for teaching basic milestones that cross all specialties, such as the skills of history and physical examination. However, many medical schools will struggle to create individualized curricula that teach milestones across the many specialties. Moreover, many schools have not moved to clear, competency-based assessment.4 While some schools may be teaching the skills of basic milestones, rigorous assessments have not been demonstrated. Assessment of the milestones attained by students prior to graduation from medical school might help to equalize the knowledge and skills of incoming intern classes. The Milestones Project is geared toward identifying markers of achievement along the continuum of GME.
While it would be ideal for students entering EM residency programs to be competent in Level 1 milestones, this poses challenges to those directing curricula at the undergraduate level. The responsibility of UME leaders is to develop graduates with fundamentally sound core competencies, irrespective of career choice. Mandatory EM rotations are developed with the intent of teaching emergency principles to the general student, such as one who may be going into neurology or psychiatry. While EM rotations teach milestones relevant to all graduating students, they often do not teach all EM milestones. Looking to the utility of the fourth year, more specialty-specific items (e.g., ultrasound, procedural skills) can be developed on a tailored, individual basis according to a student's desired training pathway. This may also add deliberate rigor to a year whose importance is not clearly defined.5-7 However, it is important to remember that not all schools have mandatory EM clerkships or EM residency programs. Expecting students going into EM to be competent in all Level 1 milestones in effect requires schools to offer dedicated EM experiences or the opportunity to rotate at other institutions.

The evolving UME curriculum speaks to the development of generic Level 1 competencies for all graduating students. The call for the creation of critical care competencies is yet another example of how UME can be more systematic in how medical schools prepare their graduates.8, 9 For example, emergency stabilization, airway management, and vascular access could fall under the rubric of critical care in addition to EM.8, 9 At this juncture, the milestones do not include strategies or direction for how UME will meet the expectations of the suggested Level 1 competencies. Perhaps specialty-specific milestones warrant the creation of subinternships or boot camps in each specialty. Students could then choose a sequence of advanced courses that would take on the task of demonstrating the Level 1 milestone competencies of their proposed training pathways.

There are clear and significant advantages for postgraduate training programs when trainees enter with a uniform set of competencies. This is particularly important in the core areas of EM clinical skills such as patient assessment, basic stabilization measures, and diagnostic and clinical reasoning. Ultimately, it allows optimization of postgraduate training. Some GME programs have implemented assessment prototypes to evaluate interns and allow deficiencies to be addressed early in training. One example is the Postgraduate Orientation Assessment, which offers an objective structured clinical examination for incoming residents to identify gaps.10 Another option is a rigorous intern orientation month to ensure acquisition and assessment of Level 1 milestones.

From the perspectives of our author group of students, program directors, and deans, we recommend that the responsibility for attaining Level 1 milestones be shared. Standard medical school curricula should teach and assess the milestones that cut across all specialties. EM subinternships should address the EM-specific milestones. On the GME side, it is appropriate for GME programs to assess the cross-cutting competencies of incoming interns and identify gaps. Finally, EM intern orientation months should confirm the EM Level 1 milestones and start interns on the pathway to expertise.
The survey was based on the proposed EM milestones, which have subsequently been revised and re-released. The revisions might change the rates found in this study. We intentionally used the proposed EM milestones, prior to revision, to be able to contribute early to the discussion of who owns the responsibility for ensuring competency in the Level 1 milestones. While the revised milestones have some differences, many Level 1 milestones are unchanged (Data Supplement S1). In addition, the sample size is small and does not represent all medical schools. There are two potential biases: response bias, in which less prepared interns may not have responded, and recall bias or a lack of understanding of the specific milestones, which could lead respondents to over- or underestimate the frequencies. We intentionally chose interns for this study because they represent a broad cross-section of medical schools and are a key stakeholder group. However, other stakeholders (clerkship directors, program directors, medical school deans) might have better knowledge of the teaching and assessment of the milestones, and future studies might query the complementary views of these groups. Finally, the results may not be generalizable across all medical schools and residencies.

While competency in the Level 1 milestones is expected at graduation from medical school, many entering emergency medicine interns reported that they were not taught or assessed on many of these milestones. It is unclear at this time whether medical schools, emergency medicine clerkships, or residency programs are responsible for ensuring that medical students entering residency have achieved the Level 1 milestones. Further research is warranted to monitor the teaching and assessment of students during the transition from medical school to residency training in emergency medicine.

Reference(s)