2010; 32: e443–e447
WEB PAPER
Competency-based integrated practical examinations: Bringing relevance to basic science laboratory examinations
RIFFAT SHAFI, KHURRAM IRSHAD & MOBEEN IQBAL
Shifa College of Medicine, Pakistan
Abstract
Background: Practical examinations in a subject-based curriculum have been criticized for lacking relevance and clinical application. We developed competency-based integrated practical examinations (IPEs) for the first two years of our integrated curriculum, incorporating basic science principles with clinical relevance.
Aim: To bring relevance to basic science laboratory practical examinations by conducting competency-based IPEs.
Methods: IPEs were developed according to competency-based blueprinting for each integrated module. Clinical scenarios were used as triggers, followed by tasks pertaining to laboratory tests, relevant physical diagnosis, and ethical/professional aspects utilizing standardized patients. Checklists were developed for standardized marking. A feedback questionnaire was administered, and two focus group discussions were held, with a random group of first- and second-year students. Faculty members' feedback was also recorded on a questionnaire.
Results: Almost all the students agreed that the IPE was a useful experience. Eighty-nine percent agreed that it was a fair examination that elicited less psychological stress, and 82% agreed that the IPE encouraged critical thinking and application of knowledge. However, students suggested better organization and longer station durations. Faculty members also viewed the experience favorably.
Conclusion: IPEs were well received and valued by both students and faculty members.
Introduction
Assessment is an essential part of the teaching and learning
process (Harden 2001). Any teaching method needs to be
matched by an appropriate assessment that relates to the
objectives of the teaching. Students may or may not learn what
is in the curriculum or what we teach, but they will learn what
we assess them on, as ‘‘assessment drives learning’’ (Shimura
et al. 2004). As medical curricula around the world move from
discipline-based to integrated teaching and learning, the
integration of assessment must necessarily follow (Hudson &
Tonkin 2004). A change in instructional methods without
changing assessment will not achieve desired outcomes
(Ahmed et al. 2007). Studies have shown that such a mismatch
can eventually lead to failure of any adopted curriculum
(Ghosh & Pandya 2008).
The limitations of clinical and practical examinations have
been realized for a long time and have given rise to attempts at
improving the current scenario (Stillman et al. 1977; Edelstein
& Ruder 1990; Newble 1991). An earlier innovation in this
regard is the objective structured clinical examination (OSCE)
later extended to the objective structured practical examination
(OSPE) described in 1975 and in greater detail in 1979 by
Harden and his group from Dundee (Harden et al. 1975;
Harden & Gleeson 1979). This method with some modifica-
tions has stood the test of time and has largely overcome the
problems of the conventional clinical examinations mentioned
earlier (Ananthakrishnan 1993). It appears that the assessment
of problem solving skills must occur in the context of a clinical
Practice points
- The integration of assessment must match the integrated medical curriculum.
- Shifa College of Medicine, Islamabad, Pakistan, moved from a discipline-based to a system-based integrated curriculum.
- The IPE, a tool for assessing multiple competencies, was developed.
- Basic science knowledge was linked to clinical practice in the IPEs.
- IPEs were well received by students and faculty.
Correspondence: R. Shafi, Department of Basic Health Sciences, Shifa College of Medicine, Sector H-8/4, Islamabad, Pakistan. Tel: 92 51 460 3365;
fax: 92 51 443 5046; email: [email protected]
ISSN 0142–159X print/ISSN 1466–187X online/10/100443–5 © 2010 Informa UK Ltd. DOI: 10.3109/0142159X.2010.513405
Med Teach. Downloaded from informahealthcare.com by Biblioteka Uniwersytetu Warszawskiego on 10/28/14. For personal use only.
problem, yet no unanimity exists on the tools to be used
(Nendaz & Tekian 1990).
Skills assessment not only measures performance but also provides an indication of the effectiveness of learning strategies and the appropriateness of the content input. A good assessment should be valid, reliable, and practicable, and should have educational value (Ahmed et al. 2007).
Miller acknowledged that no single method of assess-
ment could provide all the data required for judgment
of the delivery of professional services by a physician (Miller
1990).
The move from discipline-based to system-based inte-
grated curriculum at Shifa College of Medicine, Islamabad,
Pakistan in 2008, provided exciting opportunities to integrate
the learning of basic and clinical sciences for medical
undergraduates. After introducing integrated written assessments in the form of integrated multiple choice questions (MCQs) and short answer questions (SAQs), we aimed to develop a competency-based integrated practical examination (IPE) for the first and second years of our 5-year undergraduate curriculum. This type of assessment serves as a tool for testing multiple competencies related to skills (as in performance exercises), grounded knowledge (as in the OSPE; Howley 2004; Hudson & Tonkin 2004), and student attitudes.
The objective of this study is to share our experience of bringing relevance to basic science laboratory practical examinations by conducting competency-based IPEs, and to analyze their efficacy for the students.
Methods
The CanMEDS framework for competencies was utilized in developing the curriculum (Mickelson & MacNeily 2008). The level of achievement for these competencies was defined according to the year of undergraduate education. In our 5-year undergraduate curriculum, spirally integrated modules were developed in two spirals delivered over the first three years. The first spiral comprised system-based modules covering anatomy, physiology, biochemistry, ethics, professionalism, clinical relevance, and evidence-based medicine. The modules in the second spiral revisited the systems with emphasis on pharmacology, community health sciences, pathology, relevant clinical disciplines, ethics, professionalism, and evidence-based medicine. Modules revolved around longitudinal themes that were revisited with varying objectives throughout the curriculum. The last 2 years of the curriculum revolved around clinical clerkships in general internal medicine, family medicine, surgery, ophthalmology, otolaryngology, and obstetrics and gynecology.
Multidisciplinary modular teams developed objectives using the SMART (specific, measurable, attainable, relevant, and targeted) technique, linking objectives to the CanMEDS competency framework. Longitudinal clinical themes covering the important clinical concepts were developed for all the modules. Appropriate learning strategies were employed, including small group discussions (mostly followed by a large group wrap-up session), problem-based learning, and self-directed learning. Modular delivery placed considerable emphasis on learning clinical skills for data gathering and physician–patient interaction. Students were given the opportunity to learn skills in skills laboratory sessions. Faculty members from both basic and clinical sciences facilitated skill-related sessions utilizing models, simulations, and standardized and real patients.
In our traditional curriculum, practical examinations were discipline based. The practical assessments revolved around laboratory techniques for biochemical and physiological testing, with very little relevance to the real-life practice of a physician.
IPEs were constructed for each module by the team
members from various disciplines including Anatomy,
Physiology, Biochemistry, General medicine, and Surgery.
Content validity was ensured by developing a blueprint and through repeated discussions during planning meetings.
Competencies related to Performance Skills,
Communication Skills, Reasoning Skills, and Humanistic
Qualities/Professionalism were incorporated by developing
an IPE construction template. A rating instrument was also
developed which had these competencies listed on it. The
raters were given workshops on the rating instrument.
Students were rated on all the competencies incorporated
in an IPE on a three-category scale of unsatisfactory,
satisfactory, and superior. Practical performance and clinical
skills like history taking, physical examination, and coun-
seling skills were assessed in a clinical context. Relevant
ethical and professional aspects were also addressed. Basic
science knowledge was linked to clinical practice and high
construct validity was achieved by subjecting the draft of
each station to extensive review. Faculty members involved
in the process of assessment were briefed about the
process of assessment and rating of the students.
It was a multistation, objective, structured examination. A case scenario, video, image/photograph/model/specimen, or standardized patient was used as the trigger at each station, followed by three or four tasks relevant to the trigger. Each module exam had 8–12 stations. Each station was allotted equal time, after which the candidate was required to move on to the next station. The stations included tasks that were both interactive (for example, taking a history or performing an examination on a patient) and static (for example, identifying a slide or interpreting data).
A faculty member was present as an observer at each station where performance was required. A global rating scale was used to rate the students, with a checklist as a reminder.
Feedback
A feedback questionnaire was administered to a random group of 44 students from the first- and second-year classes, who were asked to comment on the IPE and give suggestions; their comments were recorded. A feedback pro forma was also administered to the faculty members.
Sample stations of IPE
Results
A Likert scale was used; for the purpose of analysis, "strongly agree" and "agree" were merged, as were "strongly disagree" and "disagree".
Of the students, 80–90% agreed with the organization and content of the tasks at the various stations. The structure and sequence of the IPE and the time given for each task were the areas where fewer than 50% of the students agreed (Table 1). Of the students, 70–80% agreed with the content of the tasks in each IPE station (Table 2).
Comments of the students
Most of the students welcomed the change from the traditional examination to the IPE, but thought it needed better organization; they felt the time given for each station was not enough for performing the tasks. Some also suggested that the oral viva should be more extensive. They thought it was an unbiased way of examination.
Representative comments of the students regarding gains and concerns are shown in Table 3.
Comments of the faculty
Faculty members found the IPE a good way of assessing students' application of knowledge and skills (Table 4).
Discussion
In our study, the feedback from students was strongly positive.
This is consistent with another study in which the multistation
IPE was ranked high by the majority of students (Abraham
et al. 2005).
The organization as well as clinical relevance in the
practical exam was highly appreciated by the students.
Eighty-two percent of the students thought that it reflected
relevance and helped create connections across various
disciplines. This is consistent with another study in
Kathmandu, although that study examined only the discipline of pharmacology, whereas our examination was integrated. In the Kathmandu University curriculum, stations testing skills in pharmacology were proposed for introduction, and the attitude of students
Station A
Trigger: A 40-year-old male presents to the emergency room with a history of severe pain in his right loin for 2 days.
- Task 1: Take a history relevant to his symptoms (standardized patient). Disciplines represented: Clinical skills/Communication skills
- Task 2: Identify the abnormality in the imaging study (an IVU with a stone). Disciplines represented: Anatomy/Clinical skills
- Task 3: Identify the structure under the microscope and give two reasons that favor your identification (slide under microscope). Discipline represented: Anatomy
- Task 4: How do you relate the urine microscopic findings to the patient's presentation? (urine microscopy). Discipline represented: Biochemistry

Station B
Trigger: A 60-year-old chronic smoker came to the OPD with the complaint of on-and-off episodes of coughing and breathlessness for the last 3 years.
- Task 1: Perform peak expiratory flow rate measurement on the patient (standardized patient). Discipline represented: Physiology
- Task 2: Identify the impressions marked on the specimen (lung specimen). Discipline represented: Anatomy
- Task 3: Identify the marked structures on the chest radiograph (X-ray chest). Disciplines represented: Anatomy/Clinical skills
- Task 4: Interpret the data (arterial blood gases report). Disciplines represented: Physiology/Biochemistry
Table 1. Student’s feedback on module organization and structure.
Agree n (%) Uncertain n (%) Disagree n (%)
IPE was fair examination when compared with traditional practical and viva 39 (89) 2 (4) 3 (7)
IPE reflected relevance and helped create connections across various disciplines 36 (82) 7 (16) 1 (2)
IPE was well-administered 20 (70) 15 (34) 9 (20)
IPE was well-structured and sequenced 21 (48) 14 (32) 9 (20)
Clinical cases and instructions at the workstation were appropriate 24 (55) 11 (25) 9 (20)
Tasks reflected learning objectives as discussed/taught in the module 36 (82) 6 (14) 2 (4)
IPE was a useful experience 43 (98) 1 (2) 0
Time at the stations was enough for various tasks 20 (45) 9 (20) 15 (34)
Degree of psychological stress elicited by IPE was less than
any other format of exam (viva, traditional practical)
39 (89) 2 (4) 3 (7)
toward this development was generally positive (Shankar &
Mishra 2002). Another study by Boon et al. (2001) conducted
at the University of Pretoria in South Africa shows that the
majority of students considered that the clinical case studies
gave them a better understanding of the relevant basic
sciences.
The majority of the students agreed that the IPE was a
better way of assessing practical skills when compared with
the traditional method of practical exam. Eighty-nine percent thought it was a fair exam with less stress for the candidate.
This is consistent with the results of another study by Hudson
and Tonkin (2004), where students admitted that the previous
assessment method encouraged test-directed studying. It was
pleasing that a significant number of students thought that the
examination was clinically relevant and it was a good
preparation for later assessments and clinical practice.
Students acknowledged that the integrated practical items
had high clinical relevance.
The feedback on the knowledge areas covered in the IPE was strongly positive. Eighty-two percent of the students agreed that it covered a wide knowledge area, and seventy-nine percent appreciated that the clinical relevance helped clarify concepts in the basic sciences. This is consistent with another study in which students found that clinical cases and interactions with actual patients helped them synthesize and retain basic science knowledge, although we used standardized patients in the IPEs (Muller et al. 2008).
In our modules, formal teaching of important competencies related to medical ethics, professionalism, and communication skills, which are generally ignored in traditional curricula (Verma et al. 1991; Nayar et al. 1995), was also introduced using the CanMEDS framework of competencies (Mickelson & MacNeily 2008). History taking, physical examination, professionalism, and communication skills were assessed during IPEs, and ethics-related issues were also incorporated into the written assessment. The assessment of attitudes in the IPE, where it was integrated with basic science knowledge, was appreciated by the faculty in this study. Students and faculty members both appreciated the competency-based integrated examinations; however, they felt that there was still room for improvement in the structure and organization of IPEs and that the time allocated to each station was insufficient.
Running the IPE simultaneously for all the students would certainly be the best way to ensure a fair exam, but lack of space was a limitation, which was overcome by utilizing the same space for all the groups in turn. The stations and tasks were changed for each group to ensure security of the exam content, although this took longer and was hectic for the faculty and standardized patients. The concerns raised in students' feedback about the organization of the module and the inadequate time at each station were addressed in subsequent examinations.
Students study more thoughtfully when they anticipate certain examination formats (Hakstian 1971), and changes in format can shift their focus to clinical rather than theoretical issues (Newble & Jaeger 1983). The introduction of the competency-based IPE is a novel way of assessing basic science practicals in an undergraduate medical curriculum, addressing the multiple competencies required of a good practicing physician.
In conclusion, competency-based IPEs were well received and valued by both students and faculty members.
Declaration of interest: The authors report no conflicts of
interest. The authors alone are responsible for the content and
writing of the article.
Notes on contributors
RIFFAT SHAFI, MBBS, Fellow College of Physicians and Surgeons of
Pakistan (FCPS). She is an assistant professor in the Section of Physiology,
Department of Basic Health Sciences. She is a curriculum coordinator at
Shifa College of Medicine, Islamabad, Pakistan.
KHURRAM IRSHAD, MBBS, Fellow College of Physicians and Surgeons of
Pakistan (FCPS). He is an assistant professor in the Section of Physiology,
Table 3. Student’s comments on IPEs.
Traditional exam vs. IPE
Favorable feedback Areas to be improved
IPE’s far better than traditional exam
Structured and fair ways of examination Time duration at individual
stations is brief
Helpful in clarifying concepts Need more organization in
terms of logisticsExcellent way of accessing practical skills
It has clinical relevance
It is interactive
Integration with clinical sciences
makes it interesting and easier
Table 4. Faculty’s comments on IPEs.
Traditional exam vs. IPE
Favorable feedback Areas to be improved
Excellent way of assessing practical skills Need more organization
Helps in clarifying the concepts
Integration between basic sciences
and clinical skills is good and useful
Good training program for
preclinical students
Tests the attitude along with
knowledge and skill
Table 2. Student’s feedback on the content of the tasks.
Agreen (%)
Uncertainn (%)
Disagreen (%)
Wide knowledge area
was covered
36 (82) 5 (11) 3 (7)
IPE encouraged critical thinking
and application of knowledge
36 (82) 6 (14) 2 (4)
IPE provided opportunities to learn
and clarify concepts further
35 (79) 6 (14) 3 (7)
Department of Basic Health Sciences, Shifa College of Medicine, Islamabad,
Pakistan.
MOBEEN IQBAL, MBBS, Fellow American College of Chest Physicians
(FCCP). He is a professor of Medicine and associate dean of Medical
Education at Shifa College of Medicine. He is a consultant, Pulmonary and
Critical Care Medicine, at Shifa International Hospital, Islamabad, Pakistan.
References
Abraham RR, Upadhya S, Torke S, Ramnarayan K. 2005. Student perspec-
tives of assessment by TEMM model in physiology. Adv Physiol Educ
29:94–97.
Ahmed A, Begum M, Begum S, Akhter R, Rahman N, Khatun F. 2007. Views
of the students and teachers about the new curriculum (curriculum
2002) and their opinion on in-course assessment system. J Med 8:39–43.
Ananthakrishnan N. 1993. Objective structured clinical/practical examina-
tion (OSCE/OSPE). J Postgrad Med 39:82–84.
Boon JM, Meiring JH, Richards PA, Jacobs CJ. 2001. Evaluation of clinical
relevance of problem oriented teaching in undergraduate anatomy at
the University of Pretoria. Surg Radiol Anat 23:57–60.
Edelstein DR, Ruder HJ. 1990. Assessment of clinical skills using video tapes
of the complete medical interview and physical examination.
Med Teach 12:155–162.
Ghosh S, Pandya HV. 2008. Implementation of integrated learning program
in neurosciences during first year of traditional medical course:
Perception of students and faculty. BMC Med Educ 8:44.
Hakstian RA. 1971. The effects of type of examination anticipated on test
preparation and performance. J Educ Res 64:319–324.
Harden RM. 2001. The learning environment and the curriculum.
Med Teach 23:335–336.
Harden RM, Gleeson FA. 1979. Assessment of clinical competencies using
an objective structured clinical examination (OSCE). ASME Medical
Education Booklet no. 8. Dundee: ASME.
Harden RM, Stevenson M, Downie WW, Wilson GM. 1975. Assessment of clinical competence using objective structured examination. BMJ 1:447–451.
Howley LD. 2004. Performance assessment in medical education: Where
we’ve been, and where we’re going. Eval Health Prof 27:285–303.
Hudson JN, Tonkin AL. 2004. Evaluating the impact of moving from
discipline based to integrated assessment. Med Educ 38:832–843.
Mickelson JJ, MacNeily AE. 2008. Translational education: Tools for
implementing the CanMeds competencies in Canadian urology resi-
dency training. Can Urol Assoc J 2(4):395–404.
Miller GE. 1990. The assessment of clinical skills/competence/performance.
Acad Med 65(9 Suppl.):S63–S67.
Muller JH, Jain S, Loeser H, Irby DM. 2008. Lessons learned about
integrating a medical school curriculum: Perceptions of students,
faculty and curriculum leaders. Med Educ 42:778–785.
Nayar U, Verma K, Adkoli BV, editors. 1995. Inquiry-driven strategies for
innovation in medical education in India. New Delhi: AIIMS.
Nendaz MR, Tekian A. 1990. Assessment in problem-based learning
medical schools: A literature review. Teach Learn Med 11(4):232–243.
Newble DI. 1991. The observed long case in clinical assessment. Med Educ
25:369–373.
Newble DI, Jaeger K. 1983. The effect of assessments and examinations on
the learning of medical students. Med Educ 17:165–171.
Shankar PR, Mishra P. 2002. Student feedback on the objective structured
component of the practical examination in pharmacology. J Nepal Med
Assoc 41:368–374.
Shimura T, Aramaki T, Shimizu K, Miyashita T, Adachi K, Teramoto A. 2004.
Implementation of integrated medical curriculum in Japanese medical
schools. J Nippon Med Sch 71(1):11–16.
Stillman PL, Brown DR, Redfield DL, Sabors DL. 1977. Construct validation
of the Arizona clinical interview rating scale. Educ Psychol Meas
37:1031–1038.
Verma K, D’Monte B, Adkoli BV, editors. 1991. Inquiry-driven
strategies for innovation in medical education in India. New Delhi:
AIIMS.