International Journal of Academic Medicine

Year: 2021  |  Volume: 7  |  Issue: 1  |  Page: 1-4

What's new in academic medicine? Focus on evolving models of competence in Graduate Medical Education

Nicholas Taylor1, Nicole Defenbaugh2, Alaa-Eldin A Mira3, Erin Bendas4
1 Department of Obstetrics and Gynecology, St. Luke's University Health Network, Bethlehem, Pennsylvania, USA
2 Department of Graduate Medical Education, St. Luke's University Health Network, Bethlehem, Pennsylvania, USA
3 Department of Geriatric Medicine, St. Luke's University Health Network, Bethlehem, Pennsylvania, USA
4 Department of Hospice and Palliative Medicine, St. Luke's University Health Network, Bethlehem, Pennsylvania, USA

Correspondence Address:
Dr. Nicholas Taylor
Department of Obstetrics and Gynecology, St. Luke's University Health Network, Bethlehem, Pennsylvania

How to cite this article:
Taylor N, Defenbaugh N, Mira AEA, Bendas E. What's new in academic medicine? Focus on evolving models of competence in Graduate Medical Education. Int J Acad Med 2021;7:1-4

How to cite this URL:
Taylor N, Defenbaugh N, Mira AEA, Bendas E. What's new in academic medicine? Focus on evolving models of competence in Graduate Medical Education. Int J Acad Med [serial online] 2021 [cited 2021 Apr 12];7:1-4
Available from:

Full Text

Defining competence in Graduate Medical Education (GME) is an ever-evolving challenge.[1] The Accreditation Council for GME (ACGME) and analogous accrediting bodies around the world are tasked with something that most would consider both challenging and a “constantly moving target”: parsing and quantifying the qualities that make competent physicians and measuring a learner's progress through training.[1],[2],[3] Competency-based medical education (CBME) originated in the early 1970s, with the overarching goal of ensuring competence that met “local needs” and expectations.[4] This implied that the definition of competency is flexible, and the goals of the learner and the training program, including specific interpretations of those goals, play an important role within a multifactorial matrix. In 1999, the ACGME and the American Board of Medical Specialties (ABMS) introduced six general competencies: medical knowledge (MK), patient care (PC), interpersonal and communication skills (ICS), practice-based learning and improvement (PBLI), professionalism (PROF), and systems-based practice (SBP), with the goal of providing a “shared model” of professional development.[5] Initially, programs struggled to understand what the competencies meant and how they could be applied in clinical practice. Each subspecialty then developed subcompetencies, using milestones to better describe the six general competencies. These specialty-specific milestones, introduced in 2013, were meant to show the development and progress of the trainee over time.[5] There was wide variation among subspecialties for the more subjective milestones: across 26 subspecialties, there were 230 different descriptions of PROF, 171 for PBLI, 176 for ICS, and 122 for SBP.[6] This is not entirely surprising. Physicians are trained in quantitative research and analysis, so it can be difficult to measure some of the more qualitative competencies and establish construct validity. Ultimately, uniform subcompetencies for PROF, PBLI, ICS, and SBP were developed.[5]

Given the sheer number of subcompetencies within each subspecialty (e.g., 28 in obstetrics/gynecology, 20 in gynecologic oncology, 19 in family medicine, and 23 in preventive medicine), extensive engagement from educators and trainees is necessary to ensure that the ACGME model of competence maintains the ability to provide objective measures of a trainee's knowledge, attitudes, and skills.[7] Various subspecialties require learners to self-assess their milestone competencies[8] or participate as a member of their self-assessment team,[9],[10] recognizing the role self-assessment plays in GME. There are also other important considerations, as discussed below.

For example, lack of faculty engagement, perhaps from feeling overwhelmed by the multiple subcompetencies, can lead to uniform evaluations based on a global perception of competence.[11],[12] The overarching perception that a trainee is “good” can inform the remainder of the evaluation, essentially invalidating the instrument by narrowing the evaluation down to a single factor.[13],[14],[15] This innate tendency toward global assessment raises the important question of whether “more items to evaluate” is superior to “more evaluative observations.”[16] The “less is more” approach to evaluating trainees is tempting because it may be perceived as less time consuming, but it will not suffice as training programs grow more complex. Gynecologic oncology fellows, for example, are expected not only to train in their discipline but also to understand translational research, aspects of radiation oncology, chemotherapy, genetics, oncofertility, and critical care.[17] Similarly, family medicine residents who choose to specialize in preventive medicine must train in family medicine (19 subcompetencies) and in public health and general preventive medicine (23 subcompetencies) to gain competence in areas such as descriptive and analytic epidemiology, disease outbreak, behavioral health, environmental health, and biostatistics.[18] The complexity of these fields calls for more sophisticated models of competency. In another area of training, fine-tuning competencies for a fellowship in hospice and palliative medicine required a dedicated workgroup to reframe what is most valuable in the field. Recently, the milestones underwent a revision[19] to better define what is expected of graduating hospice and palliative medicine fellows, including comprehensive assessment of physical, emotional, and spiritual suffering; prognostication; and symptom management for patients with serious illness.[20]

The complexity of the GME model has driven research into more objective models of competency assessment that can inform the relevant milestones. For example, myTIPreport,[21] an app designed to track surgical progress in obstetrics and gynecology through immediate evaluation of a resident's ability to perform portions of a procedure, reflects evolving experience well: postgraduate year 4 residents rated higher than postgraduate year 1 residents.[22] Other methods of surgical competency assessment include laparoscopic simulation. The Fundamentals of Laparoscopic Surgery curriculum is endorsed by the American College of Surgeons.[23],[24] Passing the course, which includes benchmarks such as “time to completion” for various fundamental laparoscopic skills, is also required for our obstetrics and gynecology residents.

Objective models of assessment can additionally inform competency in communication skills, PROF, and SBP. For example, an emergency medicine residency program used standardized patient encounters to evaluate important communication-focused scenarios such as death notification, medical error disclosure, treatment refusal, and medical noncompliance.[25] A pediatric residency program developed an observed standardized clinical examination simulation with standardized patients to assess the ethical tenets in the PROF milestones.[26] To better evaluate PROF-related competencies such as conscientiousness, integrity, accountability, aspiring to excellence, teamwork, stress tolerance, and patient-centered care, Cullen et al.[27] developed a situational judgment test applied across 21 residency programs at two institutions. The test score predicted PC, SBP, PBLI, ICS, and PROF competencies 1 year later.[27] Five family medicine residency sites participated in validating an instrument for communication skills and PROF assessment (InCoPrA), which was used to assess ICS and PROF domains in simulated scenarios.[28] One of the more “real-world” methods of objective assessment is the use of entrustable professional activities (EPAs). An EPA is a unit of professional practice: a task or responsibility that a trainee is entrusted to perform unsupervised. EPAs, in many ways, closely resemble practice because they cross multiple milestones. Initial evaluation of EPAs demonstrates that they correlate well with experience (better performance with more training) and with milestones.[29],[30] In primary care, for example, the patient-centered medical home (PCMH) model has transformed education and assessment, leading to the creation of 25 PCMH EPAs in internal medicine.[31]

The aging population calls for change in medical education, as comorbidity among older adults and age-related conditions are often underrecognized in residency training. Geriatrics training incorporates unique medical, social, and ethical challenges requiring a multidimensional, interdisciplinary approach.[32] There is a growing mandate for residency programs to directly assess residents' clinical competence in care of the elderly.[33] A dedicated working group identified seven domains for the essential internal medicine/family medicine (IM/FM) geriatric competencies: medication management, cognitive and behavioral health, palliative and end-of-life care, hospital patient safety, chronic illness, transitions of care, and ambulatory care. Selecting a set of essential geriatric domains that focus on clinical outcomes and can be assessed during patient care will allow a transition from knowledge-oriented to performance-oriented competencies.[34] For example, under the medication management domain, residents will be assessed on the outcomes of applying their medical knowledge of high-risk medications in older adults.

The ACGME model of competency, whether or not augmented with simulation, skills tests, or EPAs, has been rigorously studied and appears to be a valid way of demonstrating a trainee's progress. Do trainee progression and milestone achievement correlate with performance as an attending physician? There is evidence that achieving MK milestones correlates with passing the written boards in obstetrics and gynecology.[35] Similarly, MK milestones have been correlated with in-training examination scores.[36] Although there is a “shared model” between the ACGME and the ABMS, assessment of the six general ACGME competencies constitutes only a small part of physician evaluation after graduation. Chart completion, adequate documentation, appropriate billing, patient satisfaction, academic participation, malpractice cases, readmission rates, surgical complication rates, operating room time, and similar metrics are often quoted as objective measures of competency for practicing physicians. The gap between training milestones and these more business-oriented “milestones” in practice is large, and the significance of that gap is unknown. There are no studies, to date, evaluating milestones against the frequency of malpractice suits, for example. Regardless of this unproven correlation with real-world metrics, CBME is here to stay, and we should take every opportunity to enrich the milestone framework through the development, implementation, and study of new objective models of competency. The more objective measures we have for assessing trainee competency, the less opportunity there is for global assessment and straight-line scoring to torpedo assessment validity. And the more valid the training model, the higher the correlation with practice (theoretically). In education, success depends heavily on how it is measured.


1DeWaay DJ, Clyburn EB, Brady DW, Wong JG. Redesigning Medical Education in Internal Medicine: Adapting to the changing landscape of 21st century medical practice. Am J Med Sci 2016;351:77-83.
2Stawicki SP, Firstenberg MS, Orlando JP, Papadimos TJ. Introductory Chapter: A Quest to Transform Graduate Medical Education into a Seamless Journey toward Practice Readiness, in Contemporary Topics in Graduate Medical Education. London: IntechOpen; 2019.
3Stawicki SP, Galwankar S, Garg M, Firstenberg MS, Papadimos TJ, Barrera R, et al. What's new in Academic International Medicine? Highlighting the need for establishing a national accreditation system for International Medical Programs. Int J Acad Med 2019;5:151.
4McGaghie WC, Miller GE, Sajid A, Telder TV. Competency-based curriculum development in medical education: An introduction. Public Health Pap 1978;68:11-91.
5Edgar L, Roberts S, Holmboe E. Milestones 2.0: A step forward. J Grad Med Educ 2018;10:367.
6Edgar L, Roberts S, Yaghmour NA, Leep Hunderfund A, Hamstra SJ, Conforti L, et al. Competency crosswalk: A multispecialty review of the accreditation council for graduate medical education milestones across four competency domains. Acad Med 2018;93:1035-41.
7ACGME. Family Medicine and Preventive Medicine Milestones; 2020. Available from: [Last accessed on 2021 Feb 20].
8Goldflam K, Bod J, Della-Giustina D, Tsyrulnik A. Emergency medicine residents consistently rate themselves higher than attending assessments on ACGME milestones. West J Emerg Med 2015;16:931-5.
9Foster E, Defenbaugh N, Hansen SE, Biery N, Dostal J. Resident assessment facilitation team: Collaborative support for activated learning. Qual Res Med Health 2017;1:109-20.
10Foster E, Biery N, Dostal J, Larson D. RAFT (Resident Assessment Facilitation Team): Supporting resident well-being through an integrated advising and assessment process. Fam Med 2012;44:731-4.
11Hawkins RE, Welcher CM, Holmboe ES, Kirk LM, Norcini JJ, Simons KB, et al. Implementation of competency-based medical education: Are we addressing the concerns and challenges? Med Educ 2015;49:1086-102.
12McQueen SA, Petrisor B, Bhandari M, Fahim C, McKinnon V, Sonnadara RR. Examining the barriers to meaningful assessment and feedback in medical training. Am J Surg 2016;211:464-75.
13Metheny WP. Limitations of physician ratings in the assessment of student clinical performance in an obstetrics and gynecology clerkship. Obstet Gynecol 1991;78:136-41.
14Natesan P, Batley NJ, Bakhti R, El-Doueihi PZ. Challenges in measuring ACGME competencies: Considerations for milestones. Int J Emerg Med 2018;11:1-6.
15Pulito AR, Donnelly MB, Plymale M. Factors in faculty evaluation of medical students' performance. Med Educ 2007;41:667-75.
16Williams RG, Verhulst S, Colliver JA, Dunnington GL. Assuring the reliability of resident performance appraisals: More items or more observations? Surgery 2005;137:141-7.
17Alvarez RD, Fowler WC Jr. The times they are a-changin'-Transformation of accreditation and certification in gynecologic oncology. Gynecol Oncol 2017;145:221-3.
18The preventive medicine milestone project: Public health and general preventive medicine. J Grad Med Educ 2014;6:271-80.
19Gustin JL, Yang HB, Radwany SM, Okon TR, Morrison LJ, Levine SK, et al. Development of curricular milestones for hospice and palliative medicine fellowship training in the US. J Pain Symptom Manage 2019;57:1009-17.e6.
20ACGME. Hospice and Palliative Medicine Milestones; 2020. Available from: [Last accessed on 2021 Feb 23].
21Connolly A, Goepfert A, Blanchard A, Buys E, Donnellan N, Amundsen CL, et al. myTIPreport and training for independent practice: A tool for real-time workplace feedback for milestones and procedural skills. J Grad Med Educ 2018;10:70.
22Husk KE, Learman LA, Field C, Connolly A. Implementation and initial construct validity evidence of a tool, myTIPreport, for Interactive Workplace Feedback on ACGME Milestones. J Surg Educ 2020;77:1334-40.
23Oropesa I, Sánchez-González P, Lamata P, Chmarra MK, Pagador JB, Sánchez-Margallo JA, et al. Methods and tools for objective assessment of psychomotor skills in laparoscopic surgery. J Surg Res 2011;171:e81-95.
24Hesham H, et al. Innovative methods to teach and train minimally invasive surgery. Atlas Gynecol Oncol Invest Surg 2018;38:293.
25Vora S, Lineberry M, Dobiesz VA. Standardized patients to assess resident interpersonal communication skills and professional values milestones. West J Emerg Med 2018;19:1019.
26Waltz M, Davis A, Cadigan RJ, Jaswaney R, Smith M, Joyner B. Professionalism and ethics: A standardized patient observed standardized clinical examination to assess ACGME pediatric professionalism milestones. MedEdPORTAL 2020;16:10873.
27Cullen MJ, Zhang C, Marcus-Blank B, Braman JP, Tiryaki E, Konia M, et al. Improving our ability to predict resident applicant performance: validity evidence for a situational judgment test. Teach Learn Med 2020;32:508-21.
28Abu Dabrh AM, Waller TA, Bonacci RP, Nawaz AJ, Keith JJ, Agarwal A, et al. Professionalism and inter-communication skills (ICS): A multi-site validity study assessing proficiency in core competencies and milestones in medical learners. BMC Med Educ 2020;20:362.
29Johnson NR, Pelletier A, Berkowitz LR. Mini-clinical evaluation exercise in the era of milestones and entrustable professional activities in obstetrics and gynaecology: Resume or reform? J Obstet Gynaecol Can 2020;42:718-25.
30Albright JB, Meier AH, Ruangvoravat L, VanderMeer TJ. Association between entrustable professional activities and milestones evaluations: Real-time assessments correlate with semiannual reviews. J Surg Educ 2020;77:e220-8.
31Chang A, Bowen JL, Buranosky RA, Frankel RM, Ghosh N, Rosenblum MJ, et al. Transforming primary care training—patient-centered medical home entrustable professional activities for internal medicine residents. J Gen Intern Med 2013;28:801-9.
32Braude P, Reedy G, Dasgupta D, Dimmock V, Jaye P, Birns J. Evaluation of a simulation training programme for geriatric medicine. Age Ageing 2015;44:677-82.
33Charles L, Triscott JA, Dobbs BM, McKay R. Geriatric core competencies for family medicine curriculum and enhanced skills: Care of elderly. Can Geriatr J 2014;17:53-62.
34Williams BC, Warshaw G, Fabiny AR, Lundebjerg Mpa N, Medina-Walpole A, Sauvigne K, et al. Medicine in the 21st century: Recommended essential geriatrics competencies for internal medicine and family medicine residents. J Grad Med Educ 2010;2:373-83.
35Bienstock JL, Shivraj P, Yamazaki K, Connolly A, Wendel G, Hamstra SJ, et al. Correlations between Accreditation Council for Graduate Medical Education Obstetrics and Gynecology Milestones and American Board of Obstetrics and Gynecology qualifying examination scores: An initial validity study. Am J Obstet Gynecol 2020;224:308.e1-308.e25.
36Kimbrough MK, Thrush CR, Barrett E, Bentley FR, Sexton KW. Are surgical milestone assessments predictive of in-training examination scores? J Surg Educ 2018;75:29-32.