Competence Assessment - anno 2015
Gitte Eriksen, Educational Coordinating Consultant, PhD, MPM, HR - Medical Education, Aarhus Universitetshospital


Slide 1 • Competence assessment at OUH
– Challenges
– Arguments
The physician's professional development:
• personal abilities and qualities
• the ability to apply personal abilities in practice
SST report, the 7 physician roles, 2012
Aarhus Universitetshospital Region Midtjylland
– 2 minutes on your own
– What effect would it have in my department if we systematically assessed everyone's competences?
... or a video?
Status – Effect – Systematic
• Take your own notes...
– What in this talk inspires me with respect to introducing systematic competence assessment?
... some of the talk's points?
We assess competences because
• It is our specialist examination
360º feedback
Simulation, etc.
– "workplace-based assessment"
Framework from Miller GE (1990). Acad Med 65(9 Suppl): S63-S67
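Miller's framework is usually summarised as a four-level pyramid, each level calling for a different kind of assessment. A minimal illustrative sketch follows; the example methods are common pairings from the assessment literature, not taken from these slides:

```python
# Miller's pyramid (Miller, 1990): four levels of clinical competence,
# each matched here with a commonly cited assessment method.
# The method examples are illustrative pairings, not from the slides.
MILLER_PYRAMID = [
    ("Knows",     "factual knowledge",            "multiple-choice test"),
    ("Knows how", "applied knowledge",            "case-based discussion"),
    ("Shows how", "demonstrated skill",           "OSCE / simulation"),
    ("Does",      "performance in real practice", "workplace-based assessment, e.g. mini-CEX, 360º feedback"),
]

for level, meaning, method in MILLER_PYRAMID:
    print(f"{level:9s} - {meaning:30s} - {method}")
```

The slides' point sits at the top level: 360º feedback and other workplace-based assessments are the tools that reach "Does".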
High-stakes: part of product control = summative
– Has the competence been acquired - has the goal been achieved?
We can get better at this...
– "There is a lack of knowledge about competence assessment and feedback among supervisors and junior doctors - this is a significant barrier"
– "A culture/tradition of competence assessment is lacking"
– a toolbox...
– for a working group...
• Definition
• Utility
Structured
• Understanding the role of assessor – dove/hawk perspective, "the right level"
– With consequences – "license to" or "no go"
• Constructive feedback – using feedback, and being able to receive it, supports learning
• Formal training of supervisors and junior doctors – a prerequisite for implementation
• Prioritised by the department/hospital/region – backing from management as well as from the specialists
into practice
Do we not dare to assess?
systematic competence assessment?
SLOW tracks
FAST tracks
• Greek metaphysics: "Nothing can come of nothing"
• Christianity: "Out of nothing, God created everything"
• Storm P: "Nothing comes of nothing - except pocket lint"
for the postgraduate training of doctors
– Junior doctors take part in clinical work sooner and more competently
– improved quality of patient treatment
• Means
• Focus on assessment of competences (IT, web-based 360º)
competence assessment?
Standardisation
junior doctors?"
Knowledge: Which methods exist?
How can/should rating scales be used?
What does the evidence say?
– Level 1: Lack of understanding
• "I don't understand why! How am I supposed to approach it...!"
– Level 2: Emotional reaction
• "I don't like it! I'm under too much pressure! I can't cope...!"
– Level 3: Lack of trust in the education leaders
• "I don't like you! I don't believe we are recognised...!"
» Rick Maurer: Resistance to change
There is resistance to competence assessment
• Fred knows there is a crack in the ice
• How does Fred get the penguins to move?
... that competence assessment, as a joint effort, produces more skilled doctors?
Utility = R × V × A × E × C
• Reliability – dependable results
• Validity
• Acceptability/feasibility
• Effect on learning
• Cost – the balance between costs and gains
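The utility formula above is multiplicative, which is its whole point: a single weak factor drags the overall utility of an assessment down, no matter how strong the others are. A minimal sketch (the factor scores are invented for illustration):

```python
# Van der Vleuten's utility model as a product of five factors,
# each scored here on an invented 0..1 scale for illustration.
def utility(reliability, validity, acceptability,
            educational_effect, cost_effectiveness):
    """Multiplicative utility: any factor near zero ruins the total."""
    return (reliability * validity * acceptability
            * educational_effect * cost_effectiveness)

# A test that is highly reliable and valid, but unacceptable to trainees,
# still ends up with low overall utility:
print(utility(0.9, 0.9, 0.1, 0.8, 0.7))  # low despite strong R and V
```

This is why the slides stress acceptability and training of supervisors: a psychometrically excellent method that nobody uses contributes nothing.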
Developing and maintaining an assessment system - a PMETB guide to good practice
References
1. PMETB. Principles for an assessment system for postgraduate medical training. 2004. Available from: www.pmetb.org.uk/
pmetb/publications/
2. van der Vleuten C. The assessment of professional competence: developments, research and practical implications.
Advances in Health Sciences Education. 1996; 1: 41-67.
3. Schuwirth L, van der Vleuten C. How to design a useful test: the principles of assessment. Edinburgh: ASME. 2006.
4. Schuwirth LW, Southgate L, Page GG, Paget NS, Lescop JM, Lew SR, et al. When enough is enough: a conceptual basis for
fair and defensible practice performance assessment. Medical Education. 2002 Oct; 36(10): 925-30.
5. van der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Medical
Education. 2005 Mar; 39(3): 309-17.
6. Downing SM. Reliability: on the reproducibility of assessment data. Medical Education. 2004 Sep; 38(9): 1006-12.
7. Crossley J, Davies H, Humphris G, Jolly B. Generalisability: a key to unlock professional assessment. Medical Education.
2002 Oct; 36(10): 972-8.
8. Cronbach L, Shavelson R.J. My current thoughts on coefficient alpha and successor procedures. Educational and
Psychological Measurement. 2004 June; 64(3): 391-418.
9. Streiner D, Norman G. Health Measurement Scales: A Practical Guide to their Development and Use. 2nd ed. New York:
Oxford University Press. 1995.
10. Newble D, Jolly B, Wakeford R, editors. The Certification and Recertification of Doctors: Issues in the Assessment of
Clinical Competence. Cambridge University Press. 1994.
11. Downing SM. Validity: on meaningful interpretation of assessment data. Medical Education. 2003 Sep; 37(9): 830-7.
12. Crossley J, Humphris GM, Jolly B. Assessing health professionals: introduction to a series on methods of professional
assessment. Medical Education. 2002; In press.
13. Dauphinee D, Fabb W, Jolly B, Langsley D, Wealthall S, Procopis P. Determining the content of certifying examinations.
In: Newble D, Jolly B, Wakeford R, editors. The certification and recertification of Doctors: Issues in the assessment of
clinical competence: Cambridge University Press. 1994; 92-104.
14. Miller G. The assessment of clinical skills/competence/performance. Academic Medicine. 1990; 65(Suppl): S63-S67.
15. Bloom B. Taxonomy of educational objectives. London: Longman. 1965.
16. Eraut M. Developing Professional Knowledge and Competence. London: Falmer Press. 1994.
17. Bridge PD, Musial J, Frank R, Roe T, Sawilowsky S. Measurement practices: methods for developing content-valid
student examinations. Medical Teacher. 2003 Jul; 25(4): 414-21.
18. Roberts C, Newble D, Jolly B, Reed M, Hampton K. Assuring the quality of high-stakes undergraduate assessments of
clinical competence. Medical Teacher. 2006 Sep; 28(6): 535-43.
19. GMC. Good Medical Practice. 2006 [cited 2002 October]. Available from: http://www.gmc-uk.org/standards/
20. Swanson D, Norman G, Linn R. Performance-based assessment: lessons learnt from the health professions. Educational Researcher. 24(5): 5-11.
21. Swanwick T, Chana N. Workplace assessment for licensing in general practice. British Journal of General Practice. 2005;
55: 461-7.
22. Dixon H. Trainees' views of the MRCGP examination and its effects upon approaches to learning: a questionnaire study in the Northern Deanery. Education for Primary Care. 2003; 14: 146-57.
23. Livingston SA, Zieky MJ. Passing scores: a manual for setting standards of performance on educational and occupational
tests. Princeton: Educational Testing Service. 1982.
24. Glass G. Standards and criteria. Journal of Educational Measurement. 1978; 15(4): 237-61.
25. Cizek G. Standard Setting. In: Downing S, Haladyna T, editors. Handbook of Test Development. Mahwah, NJ: Lawrence
Erlbaum; 2006; 225-57.
on the same individual by different assessors, or several times with the same assessor, under the same conditions
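The reliability idea above - do different assessors, or repeated assessments, give the same result? - can be illustrated with the simplest possible agreement statistic. A sketch with invented ratings, not data from the presentation:

```python
# Inter-rater reliability sketch: two assessors rate the same trainees
# on a three-point scale. All ratings here are invented for illustration.
def percent_agreement(rater_a, rater_b):
    """Fraction of cases where the two assessors gave the same rating."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

a = ["expected", "above", "expected", "below", "expected"]
b = ["expected", "expected", "expected", "below", "above"]
print(percent_agreement(a, b))  # 3 of 5 ratings agree
```

Real studies use chance-corrected statistics (e.g. kappa, generalisability coefficients, see refs 6-8 above), but the underlying question is the same: is the score a property of the trainee, or of the assessor?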
AOGS MAIN RESEARCH ARTICLE
A structured four-step curriculum in basic laparoscopy: development and validation
JEANETT STRANDBYGAARD1, FLEMMING BJERRUM1, MATHILDE MAAGAARD1, CHRISTIAN RIFBJERG LARSEN2, BENT OTTESEN1 & JETTE L. SORENSEN1
1Department of Obstetrics and Gynecology, Juliane Marie Center, Center for Children, Women and Reproduction, Rigshospitalet University Hospital, Copenhagen and 2Department of Obstetrics and Gynecology, Nordsjaellands Hospital, Hillerød, Denmark

There are no conflicts of interest in connection with this article. All authors have signed the ICMJE Form for Disclosure of Potential Conflicts of Interests.

A structured four-step curriculum in basic laparoscopy: development and validation. Acta Obstet Gynecol Scand 2014; DOI: 10.1111/aogs.12330

Abstract
Objective. The objective of this study was to develop a four-step curriculum in basic laparoscopy consisting of validated modules integrating a cognitive component, a practical component and a procedural component. Design. A four-step curriculum was developed. The methodology was different for each step. Step 1: A 1-day course in basic laparoscopy developed on the background of a regional needs analysis. Step 2: A multiple-choice test, developed and validated through interviews with experts in laparoscopy and subsequently through a Delphi audit involving regional chief physicians. Step 3: A procedural training task (a salpingectomy) on a validated virtual reality simulator. Step 4: An operation on a patient (a salpingectomy) with following formative assessment based on a validated assessment scale. Setting. University hospital, Copenhagen, Denmark. Population. Fifty-two first-year residents in obstetrics and gynecology from 2009 to 2011. Method. Observational cohort study. Main outcome measure. Completion rate. Results. All participants completed step 1 and improved post-course test scores compared with pre-course test scores, p = 0.001. Step 2 was completed by 75% (37/52); all improved test scores after 6 months, p = 0.001. Step 3 was completed by 75%. Participants used 238 min (range 75-599) and 38 repetitions (range 8-99) to reach proficiency level on a virtual reality simulator. Step 4 was completed by 55%. There was no correlation between test scores and simulator training time. Protected training time was correlated with increasing completion rate. Conclusion. A four-step curriculum in basic laparoscopy is applicable in residency training. Protected training time correlated with increasing completion rate.

Abbreviations: FLS, Fundamentals of Laparoscopic Surgery; OSA-LS, Objective Structured Assessment of Laparoscopic Salpingectomy.

Introduction
"... initial laparoscopic skills training outside the operating room on simulators to increase patient safety (1,2). However, in spite of convincing research demonstrating the advantages of laparoscopic simulators, including virtual ..."

Key Message
"... both a cognitive and a practical component. Protected training time is essential for participation and completion of a basic laparoscopy curriculum."

© 2014 Nordic Federation of Societies of Obstetrics and Gynecology, Acta Obstetricia et Gynecologica Scandinavica
ACTA Obstetricia et Gynecologica
Use
• To support feedback and learning?
• To approve/certify?
• To detect unprofessional behaviour (Cees van der Vleuten, Ottawa 2014)
• "The gold standard" – does not exist
Professionalism, attitude & clinical judgement...
As an argument
– "everything is rated alike - the halo effect"
• Positive biases
• Gender, ethnicity, etc.
Bested et al. 2011
the reliability of competence assessment
– Assessors should acknowledge their sympathies or antipathies towards junior doctors
... but a piece of information
– "TICK BOX"
Below expected level | Expected level | Above expected level
• Physical examination (not observed)
• Empathy and professional behaviour
• Clinical judgement
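A tick-box form like the one sketched above can be modelled in a few lines. This is an illustrative sketch only; the item names mirror the slide, while the data structure and field names are invented:

```python
# Minimal model of a tick-box assessment form with a three-point scale.
# Item names mirror the slide; everything else is illustrative.
SCALE = ("below expected level", "expected level", "above expected level")

form = {
    "physical examination": None,  # None = not observed
    "empathy and professional behaviour": "expected level",
    "clinical judgement": "above expected level",
}

# Validate that every recorded tick is either "not observed" or a legal
# value on the scale - the kind of constraint a web-based 360º tool enforces.
for item, tick in form.items():
    assert tick is None or tick in SCALE, f"invalid rating for {item}"
```

The slide's point survives the formalism: each tick is not a verdict, just one piece of information to feed back to the trainee.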
One model could be
"EPAs" - Entrustable Professional Activities
• Can the junior doctor be entrusted with professional practice?
– responsibility for a concrete "task/activity"
– requires multiple competences
– also at night/on a Friday?
competence assessment
– "the sister test" - would I let that doctor treat my sister?
• the professional expert's judgement of complex competences
Cees van der Vleuten, Ottawa 2012
• Subjectivity is necessary
– Perhaps the supervisor is shaped not by bias, but by wisdom?
Kevin Eva, 2011
It is too complex...
Junior doctor
New learning goals/agreements
Every department, however different they are, must be able to answer:
• When do we assess junior doctors' competences?
• How do we use the methods?
• How do we follow up?
Competence assessment tools for ...
• Do we have a handle on the methods we need to use?
• Do we use the methods systematically and according to plan?
• Are the junior doctors active "team players"?
• Could we use the methods more simply than we do?
• Are the supervisors sufficiently trained in the methods?
• Do we have a handle on assessor bias?
• Do we use competence assessment to create and track development?
"Competence assessment drives learning"