AAIM Perspectives

AAIM is the largest academically focused specialty organization representing departments of internal medicine at medical schools and teaching hospitals in the United States and Canada. As a consortium of five organizations, AAIM represents department chairs and chiefs; clerkship, residency, and fellowship program directors; division chiefs; and academic and business administrators, as well as other faculty and staff in departments of internal medicine and their divisions.

Milestones: Do Learners and Teachers Agree?

Kathleen Heist, MD (a); Jason Post, MD (b); Lauren Meade, MD (c); Suzanne Brandenburg, MD (a)

(a) Department of Medicine, Division of General Internal Medicine, University of Colorado, Denver; (b) Department of Medicine, General Internal Medicine, Mayo Clinic, Rochester, Minn; (c) Department of Medicine, General Internal Medicine, Tufts Baystate Medical Center, Springfield, Mass.

Funding: None.
Conflict of Interest: None.
Authorship: The 4 individuals listed as authors are qualified for authorship, and all those who are qualified to be authors are listed in the author byline. Each author has made substantial contributions to the conception and design of the study, acquisition of data, analysis and interpretation of data, and drafting of the article.
Requests for reprints should be addressed to Kathleen Heist, MD, University of Colorado Denver, 1635 Aurora Court F729, Aurora, CO 80045.
E-mail address: [email protected]
0002-9343/$ - see front matter © 2013 Alliance for Academic Internal Medicine. All rights reserved. http://dx.doi.org/10.1016/j.amjmed.2012.11.003

In 1998, the Accreditation Council for Graduate Medical Education (ACGME) initiated the drive towards competency- and outcomes-based training and evaluation with the Outcomes Project. Six domains of competency were established at that time: patient care; medical knowledge; practice-based learning and improvement; interpersonal and communication skills; professionalism; and systems-based practice.1 Although competency-based evaluation has many benefits over previous models,2 competencies are often intangible to the learner, and their relationship to patient care may be unclear.3

In 2007 and 2008, the American Board of Internal Medicine (ABIM) and ACGME convened a task force of internal medicine education stakeholders to articulate milestones, which are discrete, observable behaviors that characterize each proficiency.4 One hundred forty-two curricular milestones linked to the 6 core competencies were established. These milestones can be bundled together to facilitate the evaluation of learners at critical times in training, which then ultimately demonstrates the development of skills that define the profession. Such skill sets are labeled entrustable professional activities (EPAs), or activities that the public entrusts all physicians to be capable of performing competently.5 By July 2013, ACGME will require milestones-based evaluation systems for accreditation.6 However, integration of these milestones into current curricula and well-defined evaluation processes has been challenging.3,7

Clearly defining the most important and relevant milestones for a particular EPA is one proposed method in the creation of competency-based evaluation tools. Learner and evaluator buy-in has been cited as a key step in defining competencies;8 however, few data exist as to whether learners and evaluators agree about prioritization of particular skill sets to demonstrate a proficiency. Most of the work on the transition to milestones-based evaluations has been done by faculty members without significant input from and engagement of the learner.

In 2009, the Educational Research Outcomes Collaborative (E-ROC) for Internal Medicine was formed to investigate practical application of milestones to the residency evaluation process. This group identified the 22 milestones from the original set of 142 established by the ABIM/ACGME task force that most distinctively demonstrate the EPA "readiness for indirect supervision for essential ambulatory care." Mastery of this EPA allows the early learner in ambulatory care to see simple patients without direct attending supervision.
E-ROC previously engaged faculty members in an exercise to prioritize these ambulatory care milestones. In total, 149 units of 1-4 faculty members from 13 diverse institutions were involved in this prior exercise (Table 1).

Table 1. Faculty Demographics

Program | City, State | Number of Q-Sorts (1-4 Faculty per Q-Sort)
Aurora Healthcare | Milwaukee, Wis | 7
Banner Good Samaritan Medical Center | Phoenix, Ariz | 9
Duke University Medical Center | Durham, NC | 21
Hennepin County Medical Center | Minneapolis, Minn | 9
Henry Ford Hospital | Detroit, Mich | 1
Mayo Clinic College of Medicine | Rochester, Minn | 16
Scripps Mercy Hospital | San Diego, Calif | 11
Summa Health System | Akron, Ohio | 21
Southern Illinois University | Springfield, Ill | 10
University of San Francisco, School of Medicine | San Francisco, Calif | 14
University of Cincinnati | Cincinnati, Ohio | 8
University of Wisconsin | Madison, Wis | 6
Westchester Medical/New York Medical College | Valhalla, NY | 6



We engaged learners in prioritization of the same 22 established milestones for the same EPA, "readiness for indirect supervision for essential ambulatory care," at 2 internal medicine residency programs. We assessed the level of agreement of resident prioritization with the previously reported faculty prioritization.

PERSPECTIVES VIEWPOINTS
● Q-sort methodology is a tool that can be used to introduce learners and teachers to the concept of milestones in resident evaluation.
● Learners and teachers from different institutions are in agreement as to the most important milestones related to essential ambulatory care.
● Both learners and teachers should be engaged in the creation of new evaluation methods.

METHODS

Two internal medicine residency training programs participated in the study: Mayo Clinic and University of Colorado. The internal medicine residency program at Mayo Clinic has 48 categorical residents per year as well as 24 preliminary residents. The 144 categorical residents participate in continuity clinics at one clinical site that is divided into 6 firms. Each firm contains 24 residents. Residents from all 3 years of training were involved in this observational study at Mayo Clinic. The internal medicine residency program at University of Colorado has 165 residents; the 65 interns include 38 categorical interns, 12 primary care interns, and 15 preliminary interns. There are 8 different continuity clinic sites: 2 at university-based clinics, 3 at a local county hospital, 2 at a veterans' hospital, and 1 at a private hospital. There are between 3 and 12 interns at each clinic site. At University of Colorado, only interns were included in this educational exercise.

We used Q-sort methodology as our means of standardizing the prioritization process. Q-sort methodology is both a qualitative and quantitative exercise of prioritizing opinions in which participants rank a set of statements (in this case about curricular milestones) from most important to least important.9,10 This methodology engages participants in the prioritization process and allows for organization of divergent opinions. A Q-sort "game board" (Figure) allowed for an interactive process in which participants place "game pieces" of associated milestones in rank order in a standard bell-curve distribution pattern based on importance. University of Colorado used a paper version of this Q-sort game, whereas Mayo Clinic used an electronic version.

[Figure: Q-sort game board and game pieces. Milestone game pieces are placed on the board in rank columns from 7 (most important) to 1 (least important).]
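To make the forced-distribution mechanics concrete, the sketch below models a Q-sort board in Python. It is illustrative only: the column capacities (1-2-4-8-4-2-1 across ranks 7 to 1, which sum to 22) and the milestone labels are assumptions, since the exact board layout is not published here.

```python
from collections import Counter

# Hypothetical column capacities for 22 milestone "game pieces" across rank
# columns 7 (most important) .. 1 (least important); 1+2+4+8+4+2+1 = 22.
COLUMN_CAPACITY = {7: 1, 6: 2, 5: 4, 4: 8, 3: 4, 2: 2, 1: 1}


def validate_q_sort(placements: dict) -> None:
    """Check that one participant's placements fill the board exactly.

    `placements` maps each milestone label to the rank column (7..1)
    where that participant placed its game piece.
    """
    if len(placements) != sum(COLUMN_CAPACITY.values()):
        raise ValueError("every milestone must be placed exactly once")
    counts = Counter(placements.values())
    for rank, capacity in COLUMN_CAPACITY.items():
        if counts.get(rank, 0) != capacity:
            raise ValueError(f"rank column {rank} requires exactly {capacity} pieces")


# Toy usage with made-up milestone labels m01..m22.
milestones = [f"m{i:02d}" for i in range(1, 23)]
ranks = [7, 6, 6, 5, 5, 5, 5] + [4] * 8 + [3] * 4 + [2, 2, 1]
validate_q_sort(dict(zip(milestones, ranks)))  # raises if the layout is violated
```

Under this scheme, each consensus group (as at University of Colorado) or individual resident (as at Mayo Clinic) produces one such set of placements.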

At University of Colorado, interns were involved in the project during their intern orientation week, before the start of their clinical rotations in June 2011. During one afternoon, all interns were introduced to milestones and Q-sort during a brief orientation session. Then the 65 interns were divided into 15 groups with 3-5 interns per group. The groups were each asked to come to consensus on the importance of each milestone in demonstrating the EPA using Q-sort methodology. Paper game boards and game pieces were used.

At Mayo Clinic, approximately 70 residents from all 3 years of training completed individual Q-sort exercises during March and April 2011. Q-sort methodology was introduced to the residents as they were participating in a quality review session for their continuity clinic. Due to scheduling limitations, not all categorical residents were able to attend these sessions. Residents not attending the sessions were not asked to participate in the Q-sort exercise. After discussing the rationale for the exercise, the Q-sort was sent to each resident as a Microsoft Word document. Electronic game boards and game pieces were used. Residents completed the exercise individually.

At both institutions, the Q-sort data were collected and analyzed to determine resident prioritization of milestones. These data were then analyzed for agreement with the previously reported data from faculty members at the 13 diverse institutions mentioned above. Comparisons were made with regard to rank category and standard distribution based on the normal bell-shaped distribution curve. The top 8 prioritized milestones were analyzed for concordance between residents and faculty members. The top 8 prioritized milestones also were analyzed for concordance by level of training, using data gathered at Mayo Clinic.
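As one concrete illustration of this comparison, the sketch below aggregates individual Q-sorts into an overall ranking and counts the overlap between two top-8 lists. The mean-rank aggregation rule, the function names, and the input structure are assumptions for illustration; they are not necessarily the exact computation used in the study.

```python
def overall_ranking(q_sorts: list) -> list:
    """Order milestones by their mean rank column (7 = most important)
    across all Q-sorts in a group, highest mean first."""
    milestones = q_sorts[0].keys()
    mean_rank = {m: sum(q[m] for q in q_sorts) / len(q_sorts) for m in milestones}
    return sorted(mean_rank, key=mean_rank.get, reverse=True)


def top8_overlap(resident_sorts: list, faculty_sorts: list) -> int:
    """Number of milestones shared by the resident and faculty top-8 lists."""
    resident_top8 = set(overall_ranking(resident_sorts)[:8])
    faculty_top8 = set(overall_ranking(faculty_sorts)[:8])
    return len(resident_top8 & faculty_top8)
```

In this study, the analogous comparison found the same 7 milestones in every group's top 8, with only the eighth-ranked milestone differing (see RESULTS).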

RESULTS

Residents at both institutions as well as faculty members ranked the same 7 milestones highest, although the order was slightly different for each group (Table 2). The eighth-ranked milestone varied among the residency programs and the faculty. The most highly ranked milestone from all groups was "recognize situations with a need for urgent or emergent medical care including life-threatening conditions." The other top 6 milestones highlighted the need to recognize limitations and to know when to ask for help. The top milestones also addressed fundamental patient encounter skills, such as the ability to take an appropriate and hypothesis-driven history; the ability to perform an accurate and targeted physical examination; and the ability to synthesize data to define the patient's central clinical condition and formulate an appropriate differential diagnosis with a therapeutic plan. Not ranked in the top 8 were milestones related to communication skills; shared decision-making; incorporating patient preference and background; demonstration of an understanding of the patient's socioeconomic barriers; ability to incorporate feedback; ability to accept personal errors and honestly acknowledge them; and demonstration of a commitment to relieve suffering.

Table 2. Milestones Rank by Training Site (percentage of time each milestone was ranked in the top 8, with overall rank in parentheses)

Milestone | Mayo Clinic Residents | CU Residents | Composite Faculty
Recognize situations with a need for urgent medical care | 95 (1) | 100 (1) | 95 (1)
Synthesize available data | 86 (2) | 93 (2*) | 78 (4)
Acquire history: customized, prioritized, hypothesis driven | 83 (3) | 93 (2*) | 87 (2)
Perform accurate physical examination | 75 (4) | 87 (4*) | 85 (3)
Develop differential diagnosis, diagnostic and therapeutic plan | 70 (5*) | 87 (4*) | 76 (5)
Recognize scope of abilities, ask for assistance appropriately | 70 (5*) | 87 (4*) | 71 (6)
Recognize when to seek additional guidance | 63 (7) | 80 (7) | 56 (7)
Deliver succinct, hypothesis-driven oral presentations | - | 60 (8) | -
Accept personal errors and honestly acknowledge them | 48 (8) | - | -
Demonstrate empathy and compassion to all patients | - | - | 41 (8)

CU = University of Colorado. *Tied for rank.

At Mayo Clinic, residents from all postgraduate year (PGY) levels were included in this study, which allowed for comparison of prioritization based on level of training. Again, similar prioritization was found. Residents from all years ranked the same 7 milestones consistently in the top 8, with a focus on taking an accurate history and physical, synthesis of data, and developing a differential and appropriate therapeutic plan, as well as knowing when to seek additional guidance. PGY-1s and PGY-3s both ranked "accept personal errors and honestly acknowledge them" in the top 8, whereas PGY-2s ranked "demonstrate empathy and compassion to all patients" in the top 8 (Table 3).

DISCUSSION

This innovation demonstrates concordance between learners and teachers for the prioritization of milestones reflecting the EPA "readiness for indirect supervision in essential ambulatory care." This concordance, observed across institutions and levels of experience, adds validity to the use of milestones to measure competency with regard to a specific EPA. As the medical community moves towards more concrete methods of competency-based resident evaluation, definition of critical skill sets is a key first step. This study demonstrates that agreement from both learners and teachers about the most important skills is possible. This innovation also demonstrates one method to facilitate the daunting task of breaking down the large number of curricular milestones into a more manageable number of key milestones related to one EPA to allow for focused evaluation.

This study demonstrates that Q-sort methodology can be used as an innovative exercise to engage learners in the development of their own evaluation process. The medical education community is embarking on a challenging journey as we transition to the next accreditation system. Simply introducing the concept of milestones to both residents and faculty is just one beginning step in that process. The Q-sort exercise is an enjoyable, interactive approach that can be used to introduce the milestones concept and engage learners as residency programs work to develop new evaluation systems.

Q-sort methodology can be used in a variety of different formats, as was demonstrated in this study, both in person and electronically, in group format or on an individual basis. This exercise was easy to implement in a variety of educational settings and at various levels of training. The innovation tool was implemented at various times during training, from before any clinical experience until the end of a residency. The amount of agreement across these different levels of training and different institutions demonstrates that the tool is effective.

We did not receive specific feedback from the participants about their experience with the Q-sort process. Future studies might be warranted to determine whether paper or electronic formats, and group or individual sorts, are more engaging to the learner.

Our study only investigated prioritization of milestones relevant to ambulatory care, and it is unclear whether the concordance between learner and teacher would be as consistent in the inpatient setting. This area is another potential study as we move towards the next accreditation system. In addition, our study focused on ambulatory milestones at the intern level. It would be interesting to have residents and faculty prioritize the key milestones at the PGY-2 or PGY-3 level in the ambulatory care setting. While faculty members and residents prioritized the fundamental aspects of any examination during this exercise, it would be interesting to see how the key milestones change as the level of training advances. Would upper-level resident-prioritized milestones focus on very important but more sophisticated aspects of a patient encounter, such as communication skills and the incorporation of patient preferences, culture, and socioeconomic backgrounds into decision-making? Further investigation is needed to determine how prioritization would change at different levels and in different clinical settings.

We hope this innovative approach provides guidance for future studies that will incorporate both teachers and learners in the development of competency-based evaluation tools. Perhaps this idea can be extended to other members of the health care team, such as patients, nurses, and social workers. This prioritization exercise has demonstrated that engagement of the learner in the development of new evaluation processes is possible. We hope it persuades educators that input from both learners and teachers about the most important skills should be sought as evaluation tools are developed.

Table 3. Top 8 Ranked Milestones by Postgraduate Year (PGY) Level at Mayo Clinic (percentage shown in parentheses)

Rank | PGY-1 (n = 23) | PGY-2 (n = 25) | PGY-3 (n = 23)
1 | Item 1: Acquire accurate and relevant history (96%) | Item 6: Recognize situations with a need for urgent or emergent medical care (96%) | Item 4: Synthesize all available data (87%)
2 | Item 6: Recognize situations with a need for urgent or emergent medical care (87%) | Item 4: Synthesize all available data (92%) | Item 6: Recognize situations with a need for urgent or emergent medical care (87%)
3 | Item 4: Synthesize all available data (74%) | Item 1: Acquire accurate and relevant history (76%) | Item 2: Perform an accurate physical examination (78%)
4 | Item 5: Develop differential diagnoses, diagnostic and therapeutic plan (74%) | Item 2: Perform an accurate physical examination (76%) | Item 20: Recognize the scope of his/her abilities (78%)
5 | Item 2: Perform an accurate physical examination (65%) | Item 20: Recognize the scope of his/her abilities (72%) | Item 1: Acquire accurate and relevant history (74%)
6 | Item 7: Recognize when to seek additional guidance (61%) | Item 5: Develop differential diagnoses, diagnostic and therapeutic plan (68%) | Item 7: Recognize when to seek additional guidance (74%)
7 | Item 17: Accept personal errors, honestly acknowledge them (61%) | Item 7: Recognize when to seek additional guidance (52%) | Item 5: Develop differential diagnoses, diagnostic and therapeutic plan (63%)
8 | Item 20: Recognize the scope of his/her abilities (57%) | Item 18: Demonstrate empathy and compassion to all patients (44%) | Item 17: Accept personal errors, honestly acknowledge them (52%)

References
1. Swing S. The ACGME Outcome Project: retrospective and prospective. Med Teach. 2007;29:648-654.
2. Weinberger SE, Pereira AG, Iobst WF, et al. Competency-based education and training in internal medicine. Ann Intern Med. 2010;153(11):751-756.
3. Jones MD, Rosenberg AA, Gilhooly JT, Carraccio CL. Perspective: competencies, outcomes, and controversy—linking professional activities to competencies to improve resident education and practice. Acad Med. 2011;86(2):161-165.
4. Green ML, Aagaard EM, Caverzagie KJ, et al. Charting the road to competence: developmental milestones for internal medicine residency training. J Grad Med Educ. 2009;1(1):5-20.
5. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ. 2010;2(3):419-422.
6. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056.
7. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39(12):1176-1177.
8. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77:361-367.
9. Brown SR. Q methodology and qualitative research. Qual Health Res. 1996;6:561-567.
10. Valenta AL, Wigger U. Q-methodology: definition and application in health care informatics. J Am Med Inform Assoc. 1997;4:501-510.