Plenary speakers for the Lois Margaret Nora Endowed Lecture at ABMS Conference 2024 revisited the start of their specialty’s journey toward competency-based medical education (CBME) and relayed where they are today. Integrating CBME into the continuum from training through certification to practice is the endgame. All stressed the importance of collaborating with key stakeholders, developing useful assessment tools, and engaging in faculty and resident development.
Pathology’s road to CBME
For pathology, the road toward CBME began with developing entrustable professional activities (EPAs) that align with the Accreditation Council for Graduate Medical Education (ACGME) Milestones, noted Bronwyn H. Bryant, MD, Co-chair of the Pathology EPA Working Group. In 2017, the College of American Pathologists (CAP) Graduate Medical Education Committee published EPAs for anatomic and clinical pathology. In 2018, the Association of Academic Pathology (AAPath) and CAP co-sponsored the establishment of the EPA Working Group, which included stakeholders such as the ACGME Residency Review Committee (RRC) for Pathology and American Board of Pathology (ABPath).
The EPA Working Group chose four EPAs to pilot test. Next steps involved building assessment tools, validating them, and training faculty to use them.
Faculty and resident development occurred concurrently as both are key players in the process, said Dr. Bryant, who is the Associate Program Director of the Pathology Residency Program at the University of Vermont Medical Center. These sessions included background information about CBME and EPAs, performance dimension training to define what a competent job looks like, and frame-of-reference training involving the review of two vignettes for each EPA.
For faculty, the goal was to set a shared mental model. “Your faculty are your assessment tool; the EPAs capture their assessments,” she said, adding that “Faculty must be trained to translate observations into a rating on the EPA scale.” While entrustment is intuitive, it rests on subjective judgments that must be made explicit using terms such as competence, reliability, honesty, and humility.
Six residency programs participated in the 2021-22 national pilot to validate the EPAs. Residents noted that the EPAs serve as a mini curriculum for any given task. They had a better understanding of what is expected of them, and they received more specific feedback than in the past. Faculty indicated that the EPAs captured entrustment assessments, offered language for providing feedback, and supported coaching efforts. The program directors (PDs) and Clinical Competency Committee (CCC) members noted that they have a better understanding of residents’ skills and abilities at year’s end, and mapping to the Milestones was easier.
Among the barriers in this pilot was the use of paper EPAs. While they fit best in the clinical workflow, paper EPAs are not a feasible long-term platform, she said. There was some lack of faculty buy-in, as well. “It was just one more thing to ask of a stressed workforce,” said Dr. Bryant, who believes the EPA tool will ultimately increase efficiency as its use increases.
The EPA Working Group built an app-based tool, which has doubled the number of assessments being completed, she said. Next steps include expanding the tool to multiple institutions, collecting validity evidence in a database, and expanding faculty development.
Communication is essential for keeping all the stakeholders informed of the progress, she noted. Dr. Bryant continues to present at the AAPath’s annual meeting. During the pilot, there were quarterly check-ins. In 2023, the EPA Pathology Community, which is composed largely of frontline educators, was formed. During these meetings, pilot participants share challenges, celebrate achievements, share tips, and provide updates from the EPA Working Group. Every year, more PDs show interest in implementing EPAs in their programs, she said. On a broader scale, the ACGME-ABMS CBME Learning Community provides an opportunity to strategize with all the Member Boards to move the needle forward.
Meanwhile, ABPath has approved pilot initiatives to re-evaluate the length, format, and content within the primary and subspecialty certification exams; pilot in-training competency-based assessments (CBAs) to provide formative and summative feedback to trainees and PDs prior to the primary certification exam; and determine the relationship of CBA with certification. Another pilot will involve simulated sign-out of surgical reports. ABPath is also collaborating with the ACGME RRC for Pathology to review future training requirements; Dr. Bryant would like to include language that incorporates CBA. Other tasks include validating EPAs for graduated responsibility in the ongoing CBME/CBA pilots in pathology and continuing to engage with the ABPath Cooperating Societies on in-training CBAs. “There are multiple roads leading to CBME and we will continue to work on the tools to get us there,” Dr. Bryant concluded.
ABOS’ assessment tool
Surgical education is under stress from electronic medical records, billing practices, and duty hour restrictions, to name a few, stated David F. Martin, MD, Executive Director of the American Board of Orthopaedic Surgery (ABOS). At the same time, deficiencies in how residents’ competence is documented, measured, and taught have been identified, raising the question, “Are graduating residents really ready to enter the board certification process and begin independent practice?”
Orthopaedic surgery residency programs and ABOS have a common mission: to educate residents and produce competent orthopaedic surgeons, Dr. Martin said. The Member Board’s role is to provide assessment tools to measure knowledge, surgical skills, and professional behavior (KSB). The ABOS KSB Program is a system that the board developed for measuring and tracking resident progress.
In addition to ACGME, other organizations involved in the tool’s development and moving the specialty toward CBME include the American Academy of Orthopaedic Surgeons (AAOS) and American Orthopaedic Association/Council of Orthopaedic Residency Directors (AOA/CORD). Dr. Martin also called out the Association of Residency Coordinators in Orthopaedic Surgery (ARCOS) and the ABOS Resident Advisory Panel (RAP), the latter of which ABOS consults when rolling out various initiatives.
ABOS heard from ARCOS and RAP that residents want timely feedback that is driven by them, and formative feedback that translates into a summative assessment of competence, he said. Finally, they want their progress tracked through residency.
The ABOS KSB Program measures the following:
- resident knowledge using the AAOS Orthopaedic In-Training Examination® (OITE), which is administered by AAOS and now linked to the ABOS Part I Examination;
- surgical skills using the O-P Surgical Skills Assessment tool that aligns with the ACGME Review Committee for Orthopaedic Surgery case logs; and
- professional behavior using the ABOS Behavior Tool (ABOSBT) through assessment requests of attendings and other health care professionals, plus an end-of-rotation review and 360° evaluation.
Dr. Martin explained how ABOS decided to measure these three areas, which was accomplished through a great deal of collaboration. Regarding orthopaedic knowledge, ABOS and AAOS developed a common set of questions and standard blueprint to be used on the ABOS Part I Examination and the OITE, and identified a minimum score on the OITE that roughly corresponds to an ABOS Part I Examination minimum passing standard. Residents have access to their own ABOS Resident Dashboard that tracks their progress and displays their scaled OITE scores as well as their surgical skills and professional behavior assessments.
Concerning surgical skills, AOA/CORD, ARCOS, and ABOS collaborated to develop a web-based surgical skills evaluation program. The goal of the program, which can be completed on a computer or mobile app, is to provide meaningful, timely feedback with minimal burden. Residents are assessed in eight different facets of surgical skills, much like ABOS’ Part II Oral Examination, Dr. Martin said. The web-based surgical skills assessment tool was pilot tested across 16 residency programs. The published findings support the construct validity of the tool for differentiating levels of autonomy by year in training.
Regarding professional behavior, AOA/CORD, ACGME, ARCOS, and ABOS collaborated to develop a blueprint to assess behaviors in five domains (i.e., ethical behavior, communication, interaction, reliability, and self-assessment). The ABOSBT was piloted in 18 residency programs. While 97.6 percent of all evaluations reflected high levels of professional behavior, the 360° evaluations identified more low performers than the ABOSBT. This information enables PDs to remediate low-performing residents during training, Dr. Martin noted.
These tools map to the ACGME Milestones, so resident reports can be used by the program’s CCC, he said. Efforts are underway to map them to the ACGME Milestones 2.0.
All orthopaedic surgery residency programs will be onboarded by January 2025. Beginning in the academic year 2025-26, residents must meet ABOS KSB participation requirements to be eligible for the ABOS Part I Board Certification Exam. While participation is required, specific levels of achievement are not yet in place, he said.
“Launching a program like this requires a dedicated staff, a committed board, dependable collaborators, and dependable partners in information technology and psychometrics,” Dr. Martin concluded. “Quality is critical as is persistence. We want the residents to own this as their program.”
CBME across the continuum
The American Board of Pediatrics (ABP) developed its EPA framework approximately 10 years ago, stated David A. Turner, MD, ABP’s Vice President of CBME. It includes 17 EPAs for general pediatrics, seven EPAs common to all subspecialties, and three to six EPAs for each specific subspecialty. The EPAs serve to define the expectations/outcomes for the specialty, and guide both curriculum and assessment, he said.
At ABP, EPAs are an important part of defining the outcomes of a CBME framework, Dr. Turner stated. EPAs represent one of the five core components of CBME and also integrate the ACGME competencies and Milestones. In a fully realized CBME model, programs have tools to collect data to inform decisions about readiness to perform EPAs, and the approach should involve collecting multiple data points from many observers, consistent with the concept of programmatic assessment. Programmatic assessment is another of the five core components of CBME and provides programs with the ability to make a decision about a person’s readiness to practice.
Right now, PDs are often making these decisions based on very limited information, he said. A common approach is to make decisions on “default,” that is, trainees should progress unless there are specific concerns. “In health care and medical education, that’s not the approach we should take,” Dr. Turner stressed.
During the past decade, ABP has conducted numerous research studies to learn more about implementation of EPAs, mostly at the level of the CCC in graduate medical education training programs. The data, compiled from more than 10,000 trainees, continue to suggest that most, but not all, residents are ready for unsupervised practice, with the vast majority of people who complete training ultimately becoming board certified. “That’s why the CBME agenda has to be advanced in the training space,” Dr. Turner said, adding, “We’re trying to improve consistency of outcomes and make sure our graduates are ready to do the things our patients and their families expect and need from them as certified physicians.”
For ABP, implementing EPAs for assessment means using “competency-based board eligibility,” which includes program attestation of readiness to practice using the EPA framework, integrating EPAs into the CCC decision-making process, and integrating the core components of CBME (including frontline EPA-based assessments). “What’s missing right now in many programs is the frontline assessment piece,” he said. In the graduate medical education space, specialties such as orthopaedics, pathology, and surgery have provided some form of assessment tool designed to increase consistency of the frontline assessment to facilitate better informed decision-making, and ABP has begun a similar pilot. To that end, ABP is facilitating more robust data for residency program decision-making through piloting a smartphone-based app for its EPA-based frontline assessment. Optimizing equity in assessment is an important priority of this pilot project.
The ABP is also looking to incorporate CBME as a set of organizing principles for its entire certification program. The board plans to use EPAs for initial certification decision-making by 2028. Determining what that looks like, Dr. Turner said, will largely depend on collaborations with the pediatric community and data gathered from the frontline EPA-based assessment in training pilot. Given the variability in readiness to practice at the conclusion of training, there also needs to be a structure for supporting learners in lifelong learning once they become board certified, he said, reinforcing the importance of integrating the principles of CBME in continuing certification programs.
Dr. Turner points out that core components of CBME are already represented in many of the Member Boards’ continuing certification programs. Longitudinal assessments often have an “assessment for learning” component that individualizes education based on the needs of learners, which is a key principle of CBME. Many programs also integrate the principles of CBME into parts of their improving health and health care programs. Advancing CBME in continuing certification, however, requires a comprehensive assessment framework that connects an individual who graduates from training with a certain level of competence to continued growth toward mastery and expertise. That requires breaking down some silos between training and practice and thinking about CBME across the entire continuum, he said. Ongoing discussions should include determining what a partnership with training programs looks like in a fully realized CBME model, where the boards fall on this spectrum, and how the boards can continue to advance CBME principles in continuing certification.
“The ABP’s approach has been methodical and has prioritized research, learning, and partnership designed to reinforce our goal of collaboration to facilitate transformational change in education and assessment,” Dr. Turner concluded. “We’ve learned a lot along the way, and we’ve built some significant partnerships within our community. I genuinely believe that we’re at a tipping point now as we think about how we can transform both education and assessment along the entire continuum from training into practice.”
© 2024 American Board of Medical Specialties