by Eric Holmboe, MD, MACP, FRCP, Chief Research, Milestone Development, and Evaluation Officer at ACGME
Competency-based medical education (CBME) is an outcomes-based approach to physician training. Competencies are the framework used to define the educational outcomes (i.e., individual abilities) in graduate medical education (GME). The American Board of Medical Specialties (ABMS) and Accreditation Council for Graduate Medical Education (ACGME) originally approved six general competencies in 1999 and formally launched their implementation with the Outcome Project in 2001. However, program directors and faculty members still struggle to implement, teach, and assess specific competencies (e.g., systems-based practice). A lack of shared understanding, or shared mental models, has hampered curriculum development and slowed the evolution of better assessments. Despite ongoing challenges, substantial progress has been made in the last 22 years, such as the introduction of the competency Milestones and entrustable professional activities (EPAs). Milestones, deliberately designed to be developmental and formative, continue to evolve based on ongoing validity research and evaluation of the Milestone subcompetencies. Several ABMS Member Boards are in the early stages of implementing EPAs as part of determining eligibility for initial certification.
Yet misconceptions about CBME remain, and current regulatory structures and processes can slow improvements and inhibit innovation. One important and persistent misconception is that CBME primarily focuses on shortening training. In CBME, time is viewed as a resource, not an intervention or a measure. However, time is still often used as a measure of competence in GME. For example, some requirements for certification still depend on counting specific curricular experiences or volume of experience (e.g., months on specific rotations, number of specific procedures, or types of medical conditions seen by the learner). While ensuring appropriate clinical experiences remains vital, these process-based quantity measures are insufficient for determining whether a learner is ready for unsupervised practice.
Shortening training is not the primary goal of CBME. Time should be used wisely, and the amount of training time required should be based on when the learner has achieved the abilities necessary for effective, unsupervised practice (i.e., educational outcomes). The core principles of CBME can still be advanced within pre-specified (fixed) program lengths by designing outcomes-based flexibility into a residency or fellowship and extending training for those who need extra time.
The five core components of CBME should become embedded within blueprints for certification:1
- Competencies required for practice are clearly articulated (e.g., six general competencies).
- Competencies are arranged and sequenced progressively.
- Learning experiences facilitate the progressive development of competencies.
- Teaching practices promote and support the progressive development of competencies.
- Programmatic assessment practices are essential to support and document the progressive development of all competencies.
Importantly, the core components framework (CCF) is grounded in a developmental and growth mindset. Moving forward, the CCF should guide a significant redesign of our assessment practices, including a greater focus on promoting learner growth and development through frequent formative assessment (i.e., assessment for learning). CBME requires robust longitudinal, programmatic assessment that enables residencies and fellowships to accurately determine the progress of the learner and to support the learner through frequent feedback, coaching, and adjustments to their learning plans. CBME explicitly recognizes that learners progress through their training at different rates within and across competencies. Our current models of residency and fellowship, heavily influenced by certification and accreditation policies, have impeded the implementation of developmentally focused curricula and programmatic assessment. Programmatic assessment is a systematic approach involving a group of integrated and related assessment activities (or methods) managed in a coordinated manner. It includes continually and longitudinally collecting information about learner progress, rigorously analyzing that information to support professional development during training, and enabling a fair and valid high-stakes judgment at the end of training.
Professional self-regulation is an important component of the overall GME system, especially regarding assessment. The primary constituency for accreditation and certification organizations is the public, and these organizations have a collective responsibility to work toward ensuring all graduates are truly prepared for their next stage of training or practice. Yet the complex professional self-regulatory system must also recognize that it is the training programs that ultimately make the consequential decision of whether to graduate learners into unsupervised practice.
What are the implications for accreditation and certification? First, we must examine our own practices and assumptions critically and honestly, asking whether current policies and requirements still support optimal training. Letting go of policies that may no longer be optimal is difficult, yet we routinely do this in clinical practice. For example, few would argue we should treat hepatitis C the same way we did 22 years ago; that could be malpractice. Second, the regulatory community must accelerate support for innovation. Modest progress is being made through the Advancing Innovation in Residency Education pathway, but it is only a start. Innovation means testing our assumptions and taking informed, hypothesis-driven risks, combined with rigorous evaluation of the innovations. This work must be done collectively. The CBME symposia in 2022 and 2023, hosted by ABMS and ACGME, brought together the specialty boards, accreditation review committees, and key leaders from the GME community to explore ways to build momentum for CBME and support innovation. Together, this community can support GME programs on the path to achieving the full potential of CBME to better meet the needs of patients and communities and the complex health care systems that serve them.
References
- Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-based Medical Education Collaborators. A Core Components Framework for Evaluating Implementation of Competency-Based Medical Education Programs. Acad Med. 2019 Jul;94(7):1002-1009.