
Dental students’ peer assessment: a prospective pilot study

J. Tricio 1,2, M. Woolford 1, M. Thomas 1, H. Lewis-Greene 1, L. Georghiou 1, M. Andiappan 1 and M. Escudier 1

1 King’s College London Dental Institute, London, UK; 2 Faculty of Dentistry, University of los Andes, Santiago, Chile

Keywords: peer assessment; peer feedback; Direct Observation of Procedural Skills.

Correspondence
Jorge Tricio, King’s College London Dental Institute, Guy’s Hospital, Central Office, 18th Floor, Tower Wing, London SE1 9RT, UK
Tel: +44 (0)20 71881162; Fax: +44 (0)20 71881159; e-mail: [email protected]

Accepted: 8 July 2014

doi: 10.1111/eje.12114

Abstract

Introduction: Peer assessment is increasingly used in health education. The aims of this study were to evaluate the reliability, accuracy, educational impact and students’ perceptions of a structured, prospective peer assessment and peer feedback protocol for undergraduate pre-clinical and clinical dental students.

Materials and methods: Two Direct Observation of Procedural Skills (DOPS) forms were modified for use in pre-clinical and clinical peer assessment. Ten year 2 dental students working in a phantom-head skills laboratory and 16 year 5 dental students attending a comprehensive care clinic piloted the two peer DOPS forms. After training, pairs of students observed, assessed and provided immediate feedback to each other, using their respective peer DOPS forms as frameworks. At the end of the 3-month study period, students anonymously provided their perceptions of the protocol.

Results: Year 2 and year 5 students completed 57 and 104 peer DOPS forms, respectively. The generalizability coefficient was 0.62 for year 2 (six encounters) and 0.67 for year 5 (seven encounters). Both groups were able to differentiate amongst peer-assessed domains and to detect improvement in peers’ performance over time. Peer DOPS scores of both groups showed a positive correlation with their mean end-of-year examination marks (r ≥ 0.505, P ≥ 0.051), although this was not statistically significant. There was no difference (P ≥ 0.094) between the end-of-year examination marks of the participating students and the rest of their respective classes. The vast majority of both groups expressed positive perceptions of the piloted protocol.

Discussion: There are no data in the literature on the prospective use of peer assessment in the dental undergraduate setting. In the current study, both pre-clinical and clinical students demonstrated the ability to identify those domains where peers performed better, as well as those which needed improvement. Despite no observable educational impact, most students reported positive perceptions of the peer DOPS protocol.

Conclusions: The results of this pilot study support the need for, and the potential benefit of, a larger and longer-term follow-up study utilising the protocol.

Introduction

The demand to develop dentists who are self-directed, life-long learners and reflective practitioners (1, 2) has stimulated the development and use of alternative forms of assessment. These include peer assessment and peer feedback, which aim to develop these skills and support students’ learning (3–6). In this context, peer assessment is an arrangement in which students who have attained the same general level of training, expertise and status observe their peers’ work and either judge structured tasks or provide global impressions of its amount, level, value, worth, quality or success (7–9).


Students’ formative peer assessment can successfully focus on the provision of objective and immediate feedback (10, 11) and can enhance the students’ learning process in several ways. It can cultivate high levels of student responsibility (4), encourage diplomatic criticism (12) and foster peer integration (13). It can also facilitate greater student involvement in their learning development (3) and increase their familiarity with evaluation criteria (14), whilst at the same time helping them to overcome unrealistic expectations (13). Importantly, it also encourages reflection and lifelong learning (15), as well as critical skills (5, 16), and by so doing can improve students’ overall performance (17).

However, there remain a number of limitations to peer assessment. These include ‘friendship’ or collusive marking (18), students not accepting peer assessment and peer feedback as accurate and helpful (19), and a reluctance to accept any responsibility for assessing or criticising friends (20). Nevertheless, most students enjoy (21), see the benefits of (22) and are in favour of peer assessment (23) as a fair and valid form of assessment (24), provided it has an open and clear rationale (13) with guidelines and criteria known in advance (10).

The implementation of, and criteria used in, peer assessment of clinical performance using standardised forms (questionnaires) are well documented for medical undergraduate students (25), foundation and postgraduate medical trainees (26), revalidation of medical career grades (27), dental postgraduate trainees (18) and dental undergraduate students (22, 28, 29).

However, no study has been published on the prospective use of peer assessment in dental undergraduate students, despite the potential for dental peers to contribute to each other’s learning process. The basis for this benefit is their frequent exposure to, and hence detailed knowledge of, each other’s work in a variety of contexts, which is not always available to faculty members (3). Students also have the advantage of observing each other performing the complete task or procedure under real conditions (30). They are therefore uniquely placed to formally assess each other fairly and accurately (4), with the added benefit of a stress-free environment (18).

As part of a chronological line of research on peer assessment at King’s College London Dental Institute (KCLDI), and before implementing a larger and longer-term follow-up investigation, this study reports the development and piloting of a structured protocol of formative prospective peer assessment of pre-clinical and clinical dental students’ skills, used as a framework for the provision of immediate peer feedback. The aims were:

• To evaluate the reliability and educational impact of undergraduate pre-clinical and clinical dental students’ structured and prospective peer assessment.

• To investigate students’ perceptions of the suitability of the assessed domains, feasibility for future use, identification of learning needs, and acceptability and fairness of the prospective peer assessment and peer feedback protocol.

Materials and methods

Ethical approval

The study received full ethical approval from the King’s College London Biomedical Sciences, Dentistry, Medicine and Natural & Mathematical Sciences Ethical Committee (number BDM/11/12-21).

Developing the instrument

Two standard Direct Observation of Procedural Skills forms (31, 32) were used as templates to develop pre-clinical and clinical peer assessment tools [peer DOPS (Direct Observation of Procedural Skills)]. These formed a framework for a structured protocol of prospective peer assessment and peer feedback of undergraduate dental students’ pre-clinical competence and clinical performance.

Changes to the original templates included a new general layout and replacement of the traditional norm-referenced assessment scale (Below expectation, Borderline, Meets expectations and Above expectations) (33), which junior students might find difficult to use when judging the quality of their peers’ performance (8), with a criterion-referenced scale containing four written descriptions based on the frequency of clarification needed (Frequent, Some, Very Little and No Clarification, Warning and/or Assistance) (32). An ‘unable to comment’ option for when a given behaviour was not observed was also included.

The assessment domains differed between pre-clinical and clinical peer assessment. Based on blueprinting principles (34), the pre-clinical peer DOPS form contained 10 non-compounded items (35) (representing the main learning outcomes of the KCLDI year 2 coursebooks), designed for peer assessment of any training procedure performed in the simulation skills laboratory. Similarly, the clinical peer DOPS form (representing the main learning outcomes of the year 5 coursebooks) was intended for peer assessment of whichever clinical procedure students performed on their patients.

Both forms also incorporated an assessment of ‘Students’ insight into their performance’ (36) and a 6-point Likert scale for students to rate the utility of giving/receiving feedback as a technique to improve their future performance. Written instructions on how to complete the forms and a wider explanation of the grading scale were also included.

Both peer DOPS drafts were reviewed by five internal pre-clinical and clinical teachers (each with at least 7 years of teaching experience) to ensure they sampled all the relevant domains (37). Following this, Bachelor of Dental Surgery (BDS) year 2 and year 5 students were also asked to review the wording and content of the forms and then use them once before feeding back. This process identified two areas of student concern. The first related to the new criterion-referenced scale and the need to grade the frequency of peer ‘clarification’ whilst working. This was felt to affect peer collaboration negatively, as students would refrain from asking questions in order to obtain a better assessment. As this was not the intention of the exercise, the criterion-referenced scale was changed to a six-option educationally referenced one containing a graphical and written anchor of the desired ‘increasing ability over time’ (Fig. 1), starting at the beginning of the respective training year, to facilitate understanding, and hence use, by junior students (38, 39). For example, when a student first performs a practical task, they would be peer-rated as ‘starting to develop’ the ability for that task. Subsequently, he or she would ideally progress to ‘initial capability’, followed by ‘constant acceptable’, ‘constant clear’, ‘constant good’ and finally ‘constant extremely good’ ability. Consequently, the new anchor did not require students to make any judgements about the quality of performance or the frequency of clarification (Fig. 1).

The second student concern was the use of the word ‘assessor’ in relation to the ‘observing’ student, as they were unhappy to be ‘assessing’ their peers. In view of this, all references to ‘assessor’ were replaced by ‘observer’ (Fig. 2).
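To make the form structure concrete, the following is a minimal sketch, not the authors’ implementation, of how one completed peer DOPS encounter could be represented; the class and field names are illustrative assumptions, while the anchor wording and the ‘observer’/‘trainee’ roles follow the text above.

```python
# Illustrative representation of a single peer DOPS encounter (hypothetical).
from dataclasses import dataclass, field
from typing import Optional

# Six-point educationally referenced scale from Fig. 1.
ANCHORS = (
    "Starting to develop",
    "Show initial capability",
    "Show constant acceptable ability",
    "Show constant clear ability",
    "Show constant good ability",
    "Show constant extremely good ability",
)

@dataclass
class PeerDOPSForm:                     # hypothetical name
    observer: str                       # 'assessor' was renamed 'observer'
    trainee: str
    procedure: str
    # One rating per assessed domain; None stands for 'unable to comment'.
    ratings: dict[str, Optional[str]] = field(default_factory=dict)
    feedback: str = ""
    action_plan: str = ""

    def rate(self, domain: str, anchor: Optional[str]) -> None:
        if anchor is not None and anchor not in ANCHORS:
            raise ValueError(f"unknown anchor: {anchor}")
        self.ratings[domain] = anchor

form = PeerDOPSForm("student A", "student B", "amalgam restoration")
form.rate("Technical skills, manual dexterity and instruments handling",
          "Show constant good ability")
form.rate("Seeking help where appropriate", None)  # behaviour not observed
```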

Administration and data collection

Participants

In January 2012, 26 invited students (18 females and 8 males, aged 18–40, mean = 24.3, SD = 5.9), comprising two groups, consented to participate in this peer assessment and peer feedback pilot study. The first group consisted of 10 pre-clinical BDS 2 Conservative Dentistry students who were under a single clinical supervisor and working at neighbouring phantom heads. The second group comprised 16 clinical BDS 5 Primary Dental Care (PDC) students who worked on the same day of the week as clinical partners (alternating dentist/assistant roles).

Peer assessment training

At the start of the study, each group of students received a 45-min peer assessment training and familiarisation session, delivered by the same researcher (JT), relating to observation, peer assessment, peer feedback, action planning and completion of the instrument. Using written/video examples and role-playing, they learnt and practised how to give and receive confidential, brief, constructive, task-focused and immediate dialogic feedback (9), using their peer DOPS domains as a framework (40). BDS 2 students (organised in fixed pairs) working at neighbouring phantom heads and BDS 5 clinical partners (randomly allocated each session) acted as ‘observer’ and ‘trainee’, respectively, during the first half of the day and then switched roles during the second half.

Peer assessment piloting

On six occasions, BDS 2 students performed their own procedures as normal whilst ‘observing’ their peers’ pre-clinical work every 15 min, to avoid interfering with their own work. BDS 5 students performed their usual clinical activities in pairs, so that on seven occasions the assisting student ‘observed’ the dentist student whilst they treated the patient together. The observed procedure was then scored on each of the respective pre-clinical or clinical peer DOPS domains by selecting and ticking one of the six options of the educationally referenced scale (Figs 1 and 2) for every domain. If a given behaviour was not observed, students ticked the ‘unable to comment’ option. These scores provided a grounded framework for informed feedback. Subsequently, the pair agreed an appropriate action plan to address any developmental needs (41). Finally, after signing the forms, students self-reflected on the feedback and noted their thoughts in a private reflection diary.

Students’ perceptions

To investigate students’ perceptions of the prospective peer assessment and peer feedback protocol, during the final session of peer assessment both groups anonymously answered the following four questions using a 5-point Likert scale (Strongly agree, Agree, Neutral, Disagree and Strongly disagree). To what extent do you agree that the peer assessment and peer feedback protocol used in this study: (i) assessed you in areas that correspond to your activity in the pre-clinic/clinic? (ii) could be introduced in the future to all students at KCLDI as part of their pre-clinical/clinical education? (iii) helped you to identify learning needs and to improve your performance? (iv) was acceptable and fair?

Statistical analysis

All peer DOPS form data were manually digitised by the same researcher (JT) into a spreadsheet. To analyse students’ peer assessment scores, each of the six levels of ‘increasing ability over time’ on the educationally referenced scale was assigned a numerical value from 1 to 6. Thus, the initial ‘starting to develop’ stage of ability was given a score of 1, ‘show initial capability’ a score of 2, and so on up to the highest, ‘show constant extremely good ability’, which was given a score of 6. Scores were then checked for normality assumptions using histograms and box plots before carrying out any parametric analysis.
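As a minimal sketch on hypothetical data, the coding step above amounts to a lookup table from anchors to the values 1–6, with ‘unable to comment’ treated as missing rather than scored; the Shapiro-Wilk test stands in here for the study’s histogram and box-plot inspection.

```python
# Numeric coding of the educationally referenced scale (hypothetical data).
import numpy as np
from scipy import stats

SCALE = {
    "Starting to develop": 1,
    "Show initial capability": 2,
    "Show constant acceptable ability": 3,
    "Show constant clear ability": 4,
    "Show constant good ability": 5,
    "Show constant extremely good ability": 6,
}

ticked = ["Show constant good ability", "Unable to comment",
          "Show constant clear ability", "Show constant good ability",
          "Show constant extremely good ability", "Show constant good ability"]
# 'Unable to comment' is absent from SCALE, so it drops out as missing.
numeric = np.array([SCALE[t] for t in ticked if t in SCALE], dtype=float)

w, p = stats.shapiro(numeric)  # p > 0.05: no evidence against normality
print(f"n={numeric.size}, mean={numeric.mean():.2f}, "
      f"SD={numeric.std(ddof=1):.2f}, Shapiro-Wilk p={p:.3f}")
```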

The reliability of the two peer DOPS tools’ scores was assessed independently using a generalizability coefficient. A crossed three-facet [10 students (s) × 6 occasions (o) × 11 items (i)] random-effects design was used for BDS 2 (fixed pairs of students throughout the study) and a nested three-facet (16 students × 7 occasions × 13 items) random-effects design for BDS 5 (random pairs of students).
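The three-facet analyses were run in EduG (see below); purely to illustrate the logic of a G coefficient, here is a hedged sketch of a simplified one-facet design (students crossed with occasions, items already averaged out) on simulated scores. Variance components are estimated from two-way ANOVA mean squares, and the relative G coefficient is var_s / (var_s + var_residual / n_o).

```python
# One-facet generalizability study on simulated data (a simplification;
# the paper's actual designs were three-facet and computed with EduG).
import numpy as np

rng = np.random.default_rng(0)
n_s, n_o = 10, 6                                   # BDS 2 pilot dimensions
# Simulated matrix: one mean peer DOPS score per student per occasion.
scores = np.clip(rng.normal(5.0, 0.7, (n_s, n_o)), 1, 6)

grand = scores.mean()
ss_s = n_o * ((scores.mean(axis=1) - grand) ** 2).sum()
ss_o = n_s * ((scores.mean(axis=0) - grand) ** 2).sum()
ss_res = ((scores - grand) ** 2).sum() - ss_s - ss_o

ms_s = ss_s / (n_s - 1)
ms_o = ss_o / (n_o - 1)
ms_res = ss_res / ((n_s - 1) * (n_o - 1))

var_res = ms_res                                   # residual component
var_s = max((ms_s - ms_res) / n_o, 0.0)            # student component
var_o = max((ms_o - ms_res) / n_s, 0.0)            # occasion component

# Relative G: student variance over itself plus the rank-ordering error
# averaged over the n_o observed occasions.
g = var_s / (var_s + var_res / n_o)
print(f"components s/o/res: {var_s:.3f}/{var_o:.3f}/{var_res:.3f}; G = {g:.2f}")
```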

Descriptive statistics were used to summarise peer assessment scores, peer-observation time, peer feedback time and the utility of giving/receiving feedback. The same method was used to describe students’ perceptions of the prospective peer assessment and peer feedback protocol. When comparing the various measures observed for the BDS 2 and BDS 5 groups, an independent-samples t-test was used.

Fig. 1. Six-point educationally referenced scale used in both the pre-clinical Bachelor of Dental Surgery (BDS) year 2 and the clinical BDS year 5 peer DOPS instruments, which asks the ‘observing’ student to judge their peer’s ability over time. The anchors run from ‘Starting to develop’ at the beginning of BDS 2–5 training to ‘Show constant extremely good ability’ at its endpoint, with an additional ‘Unable to comment’ option.

To compare the scores students gave to each other with their high-stakes marks, a Pearson correlation analysis was performed between BDS 2 and BDS 5 students’ peer DOPS scores and their respective official end-of-year mean examination marks. Further, to investigate a possible effect of the peer DOPS exercise on participating students’ academic performance, an independent-samples t-test was used to compare the end-of-year mean examination marks of the 26 BDS 2 and BDS 5 students who used the peer assessment protocol with those of the rest of their respective classes.

One-way analysis of variance (ANOVA) was carried out to compare the mean scores of the 11 items (domains) from the BDS 2 peer assessment and the 13 items from the BDS 5 peer assessment, separately. Where the ANOVA showed significant results, a post hoc analysis was carried out using Tukey’s test. The total peer DOPS scores observed at the various time points were compared using repeated-measures ANOVA for the BDS 2 and BDS 5 groups, separately.

Fig. 2. Modified clinical peer DOPS form used for BDS year 5 peer assessment. Students had to complete all XVII items of the instrument at each encounter.

All analyses were carried out using SPSS version 19 (SPSS Inc., IBM, Chicago, IL, USA), except for the generalizability coefficient, which was calculated using the software EduG 6.1e (Neuchatel, Switzerland).
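The analyses above were run in SPSS; the sketch below merely reproduces the same family of tests on simulated stand-in data with SciPy (the repeated-measures ANOVA would additionally need a dedicated routine such as statsmodels’ AnovaRM). All numbers are invented, and scipy.stats.tukey_hsd assumes SciPy 1.8 or later.

```python
# Pearson correlation, independent-samples t-test and one-way ANOVA with
# Tukey post hoc, mirroring the analyses described above (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Correlation: per-student mean peer DOPS score vs end-of-year exam mark.
peer_means = rng.normal(4.8, 0.4, 10)
exam_marks = 40 + 5 * peer_means + rng.normal(0, 4, 10)
r, p_r = stats.pearsonr(peer_means, exam_marks)

# Independent-samples t-test: participants vs the rest of the class.
participants = rng.normal(60.5, 7.2, 10)
rest_of_class = rng.normal(60.1, 8.5, 142)
t, p_t = stats.ttest_ind(participants, rest_of_class)

# One-way ANOVA across three hypothetical item (domain) score sets,
# followed by Tukey's HSD where the ANOVA is significant.
item_scores = [rng.normal(mu, 0.7, 57) for mu in (4.7, 4.8, 5.3)]
f, p_a = stats.f_oneway(*item_scores)
print(f"r = {r:.3f} (P = {p_r:.3f}); t-test P = {p_t:.3f}; ANOVA P = {p_a:.4f}")
if p_a < 0.05:
    print(stats.tukey_hsd(*item_scores))
```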

Results

In line with current best practice (8), one of the researchers (JT) carefully organised, delivered and monitored the whole peer assessment piloting process. Thus, starting in February 2012, on six fortnightly occasions for BDS 2 and seven for BDS 5, students observed, assessed and provided feedback to one another, completing 57 and 104 peer DOPS forms, respectively (BDS 2 mean = 5.7 per student; BDS 5 mean = 6.5 per student). BDS 2 students were peer-assessed across seven different pre-clinical procedures, ranging from composite, amalgam and temporary restorations to root canal treatments and direct veneers. BDS 5 students were assessed on 19 clinical procedures, including oral health instruction; impression, bite and face-bow registration; composite, amalgam and temporary restorations; crown, bridge and veneer preparation and cementation; root canal treatments; wax try-in; and root surface debridement.

Considering all single peer assessment scores from both groups on all occasions, they ranged from 2 (show initial capability) to 6 (show constant extremely good ability) (mean = 5.0, SD = 0.7, mode = 5) and were normally distributed. The generalizability coefficient for BDS 2 was 0.62 for six encounters, whereas for BDS 5 it was 0.67 for seven encounters. The variance analysis for the BDS 2 peer DOPS tool showed a maximum of 36.9% of the variance attributed to undifferentiated error, followed by the student component (20.7%), the student-by-item interaction (16.5%), occasions (8.7%), the occasion-by-item interaction (6.9%), the student-by-occasion interaction (6.2%) and items (4.1%). Similarly, for the BDS 5 peer DOPS tool, the largest percentage of variance was the undifferentiated error, which accounted for 60.3%, followed by the occasion-by-item interaction (15.8%), the student-by-occasion interaction (6.5%), the student-by-item interaction (5.7%), occasions (5.0%), students (3.6%) and items (3.1%).

The overall mean peer assessment scores for the BDS 2 and BDS 5 groups, along with their respective peer-observation times, peer feedback times and the utility of giving/receiving feedback, are presented and compared in Table 1. The mean peer assessment scores for each of the 11 BDS 2 peer DOPS domains (Table 2) showed significant differences (F = 3.94, P < 0.0001) between these 11 items, with the post hoc analysis using Tukey’s test revealing that the better-performed ‘Observing aseptic technique. . .’ (item 6) was significantly (P = 0.04) different from all other items. Similarly, the BDS 5 mean peer assessment scores for the 13 peer DOPS items (Table 3) also differed significantly (F = 6.55, P < 0.0001). Peers’ scores for ‘Consideration of patient/professionalism’ (item 11) were statistically higher than all other items (P = 0.02).

The prospective peer DOPS marks for every fortnightly assessment occasion for pre-clinical (BDS 2) and clinical (BDS 5) students are shown in Fig. 3. The repeated-measures ANOVA of total peer DOPS scores observed at these various time points (occasions) showed that the overall performance scores differed significantly between occasions (P < 0.0001) for both the BDS 2 and BDS 5 groups.

Mean peer DOPS scores of the 10 individual participating BDS 2 students (mean = 4.8) showed a positive correlation (r = 0.593) with their mean end-of-year examination marks; however, this was not statistically significant (P = 0.071). Similarly, for the 16 participating BDS 5 students, the correlation was r = 0.505, which was also not statistically significant (P = 0.051).

TABLE 1. Mean (standard deviation, SD) peer-observation and peer feedback times (minutes), overall peer assessment score (scale 1–6) and students’ perception of the utility of giving and receiving feedback to improve future performance (scale 1–6), for each of the studied groups, with the statistical significance of the difference (independent-samples t-test) between pre-clinical Bachelor of Dental Surgery (BDS) year 2 and clinical BDS year 5 students.

Variable                         BDS 2          BDS 5          P value
Observation time (min)           153.2 (28.2)   100.2 (21.9)   <0.0001
Feedback time (min)              6.7 (2.9)      4.8 (1.9)      <0.0001
Overall peer assessment score    4.8 (0.8)      5.6 (0.8)      <0.0001
Utility of giving feedback       4.8 (1.0)      5.0 (0.6)      0.34
Utility of receiving feedback    5.1 (0.8)      5.3 (0.5)      0.19

TABLE 2. Mean (SD) mark for each of the 11 pre-clinical peer DOPS items, showing the ability of pre-clinical BDS year 2 participating students to identify differences (F = 3.94, P < 0.0001) in their peers’ performance (57 completed forms).

Item  Pre-clinical peer DOPS assessment items for BDS 2                               Mean (SD)
1     Understanding indications and technique of this procedure                       4.7 (0.7)
2     Understanding the properties of dental materials being used for this procedure  4.7 (0.6)
3     Preparing for procedure according to taught protocol                            4.8 (0.7)
4     Technical skills, manual dexterity and instruments handling                     4.8 (0.7)
5     Following sequence and completing accurately all steps of the procedure         4.8 (0.6)
6     Observing aseptic technique/infection control and safe use of instruments       5.3 (0.7)
7     Seeking help where appropriate                                                  4.8 (0.9)
8     Managing time/punctuality effectively                                           4.7 (0.8)
9     Supporting and communicating effectively with colleagues and tutors             5.0 (0.8)
10    Overall ability to perform procedure                                            4.8 (0.8)
11    Does the trainee show insight into his/her performance?                         4.5 (0.8)

BDS, Bachelor of Dental Surgery; DOPS, Direct Observation of Procedural Skills.


Subsequently, participating students’ mean end-of-year examination marks were compared with those of the rest of their respective classes. The 10 participating BDS 2 students showed a mean examination mark of 60.5 (SD = 7.2), whilst the remaining 142 BDS 2 students who did not use the peer assessment protocol showed a mean of 60.1 (SD = 8.5); the difference between these two groups was not statistically significant (P = 0.886). Further, the 16 participating BDS 5 students exhibited a mean examination mark of 67.9 (SD = 5.5), whilst the other 146 BDS 5 students who did not use the peer assessment protocol showed a mean of 65.5 (SD = 5.9). Once again, this difference was not statistically significant (P = 0.094).

Selected students’ peer DOPS feedback comments and the subsequently agreed action plans for both study groups are presented in Table 4. The majority of BDS 2 and BDS 5 students expressed a positive perception of the prospective peer assessment and peer feedback protocol (Table 5).

Discussion

The present study reports the development and piloting of a structured protocol of formative prospective peer assessment of pre-clinical and clinical dental students’ skills, used as a framework for the provision of immediate peer feedback via pre-clinical and clinical peer DOPS forms.

The reliability coefficient G was higher for BDS 5 students (0.67 for seven occasions) than for BDS 2 students (0.62 for six occasions). These G coefficients are comparable to those reported by Wilkinson et al. (42), where six multi-source feedback (MSF) encounters yielded 0.65 when assessing medical specialists. However, they are lower than those reported by Moonen-van Loon et al. (43) for six (0.74) and seven (0.76) MSF encounters (supervisors/peers/nurses/administrative staff/patients/self-assessment), and below the 0.7 and 0.8 thresholds generally accepted for low- and high-stakes judgements, respectively (44, 45).

Despite both groups of undergraduate students being on the path from novice to expert (46), the analysis of the data showed that both groups were able to identify differences in performance when assessing their peers across the respective 11- and 13-domain forms. BDS 2 students distinguished between a better-performed clinical skill in ‘Observing aseptic technique. . .’ and a lower-rated behaviour in ‘Does the trainee show insight into his/her performance?’ (Table 2). Similarly, BDS 5 students rated the behaviour ‘Consideration of patient/professionalism’ with the maximum possible score, whilst several other domains were rated significantly lower (Table 3). These findings are in agreement with the study of Bennett et al. (47), who reported medical undergraduates’ ability in peer assessment to identify ‘areas in which peers performed well and those that required improvement’.

A further indication of the students’ ability to make an accurate inference of their peers’ performance (37) was their ability to detect changes in their peers’ performance over time, though with dissimilar patterns (Fig. 3). BDS 5 students were able to perceive significant progress in their peers’ performance by the third peer assessment session. In contrast, BDS 2 students only started to notice peer improvement at the fifth session, whilst during the first four encounters scores dropped significantly. This difference might be explained by an initial calibration process (48), the time needed to gain experience as evaluators (49) or adjustment to the learning environment (50), before scores started to rise.

TABLE 3. Mean (SD) mark for each of the 13 clinical peer DOPS items, showing the ability of clinical BDS year 5 participating students to identify differences (F = 6.55, P < 0.0001) in their peers’ performance (104 completed forms).

Item  Clinical peer DOPS assessment items for BDS 5                                                Mean (SD)
1     Demonstrates understanding of indications, dental materials, complications and technique
      of the procedure                                                                             5.4 (1.0)
2     Obtains informed consent after explaining procedure & possible complications                 5.6 (0.6)
3     Demonstrates appropriate preparation pre-procedure                                           5.6 (0.7)
4     Administers effective analgesia or safe sedation                                             5.7 (0.6)
5     Demonstrates appropriate technical ability in line with usual practice                       5.5 (0.6)
6     Demonstrates aseptic technique/infection control & safe use of instruments & sharps          5.4 (1.3)
7     Deals with unexpected events or seeks help when appropriate                                  5.4 (1.0)
8     Completes post-procedure managements                                                         5.3 (0.8)
9     Communication skills (patient & team)                                                        5.4 (0.8)
10    Organisation/efficiency and time management                                                  5.7 (0.6)
11    Consideration of patient/professionalism                                                     6.0 (0.4)
12    Overall ability to perform procedure                                                         5.7 (0.6)
13    Does the trainee show insight into his/her performance?                                      5.7 (0.5)

BDS, Bachelor of Dental Surgery; DOPS, Direct Observation of Procedural Skills.

Fig. 3. Graphical representation of prospective peer DOPS marks (mean and standard deviation) for every fortnightly assessment occasion, showing pre-clinical Bachelor of Dental Surgery (BDS) year 2 and clinical BDS year 5 students’ ability to detect changes in their peers’ performance with time (P < 0.0001). Plotted mean scores across the fortnightly occasions from the first half of February to the first half of May: BDS 5: 5.31, 5.32, 5.52, 5.60, 5.71, 5.64, 5.80; BDS 2: 5.17, 4.87, 4.75, 4.54, 4.55, 4.85.


This increase in ratings over time is in keeping with the earlier findings of Prescott-Clements et al. (36) in postgraduate dental trainees assessed by staff members on eight similar items, and of Davies et al. (51) in foundation medical trainees’ peer assessment.

Further, whilst not statistically significant (probably owing to sample size), both groups’ peer assessment scores were positively correlated with students’ end-of-year examination marks.

Similarly, despite students strongly perceiving that the peer DOPS protocol ‘helped them to identify learning needs and improve performance’, it had no observable educational impact, as both groups of participating students showed no difference in their end-of-year examination marks when compared with the other students of their respective classes who did not use the peer assessment protocol. This might be explained by the small sample size or the short observation period (around 3 months), which allowed only a few assessment encounters per student (BDS 2 mean = 5.7; BDS 5 mean = 6.5).

At the end of the pilot, an anonymous questionnaire gathered both groups’ perceptions of the peer assessment protocol (Table 5). All students (100%) agreed or strongly agreed that both peer DOPS instruments assessed suitable domains. Likewise, the vast majority (>80%) agreed or strongly agreed that the peer DOPS protocol was feasible for future use in all courses as part of their pre-clinical/clinical education, helped them to identify learning needs and improve performance, and, perhaps most importantly, was acceptable and fair. The latter is particularly important, as a sound assessment needs to be accepted, fair (52) and authentic (53) or it is destined to fail (54). The fact that students’ views and opinions were sought during the development phase of both peer DOPS instruments may have contributed to increased acceptability (55). Further, we neither observed nor received any negative comment or concern related to the peer assessment experience, as previously reported (4, 20).

Whilst highly relevant to the implementation of an undergraduate peer assessment protocol, the current study has a number of limitations. The small sample size (N = 26) means care should be exercised when trying to generalise. Together with the short observation period, it also limited the possibility of reviewing the psychometric properties of the instruments used. A further limitation is that no evaluation of the quality of either the feedback or the action plans was undertaken.

TABLE 4. Selected BDS year 2 and year 5 students’ feedback comments and agreed challenges and actions, extracted from their completed peer DOPS forms.

Year 2
- Comment: Good work, maybe more packing could be used with amalgam. Action: Better condensation of amalgam.
- Comment: Shape and anatomy of amalgam must be improved. Action: Better/improved fissure pattern.
- Comment: Need specific targeted advice for improving positioning for vision in phantom head; tending to work outside head, freehand. Action: Work more together with demonstrator to facilitate effective practice in the phantom head.
- Comment: Difficulty with indirect vision, e.g. direct for posteriors. Action: Practise using dental mirrors to get used to indirect vision.
- Comment: Margin of restoration could have been smoother. Action: Cavity margins smoother and better anatomy on next restoration.
- Comment: Time management lacking. Action: To be more efficient, aim for two teeth per session.

Year 5
- Comment: Very difficult case. Could be better at listening to patient and own time management. Action: In future, try not to undertake work beyond own capabilities and ensure complexity of dentistry is understood.
- Comment: Being more confident in own ability. Action: Correct laboratory instructions.
- Comment: Trouble placing temporary crown with Temp Bond. Action: Better control throughout.
- Comment: Problem placing rubber dam; place the hole more centrally for full coverage. Action: Improve technique.
- Comment: Could improve infection control procedures, technically and clinically. Action: Better cross-infection control.
- Comment: Student a little out of practice in jaw registration of partial dentures. Action: Read up on jaw registration of partials.

BDS, Bachelor of Dental Surgery; DOPS, Direct Observation of Procedural Skills.

TABLE 5. Anonymous BDS year 2 (n = 10) and BDS year 5 (n = 16) students’ perceptions (%) of the peer assessment protocol after using the peer DOPS instruments on six and seven occasions, respectively. No student selected ‘Disagree’ or ‘Strongly disagree’ for any question.

To what extent do you agree that the peer assessment and feedback protocol used in this study:

Assessed you in areas that correspond to your activity in the pre-clinic/clinic?
  Year 2: Neither agree nor disagree 0, Agree 70.0, Strongly agree 30.0
  Year 5: Neither agree nor disagree 0, Agree 37.5, Strongly agree 62.5

Could be introduced in the future to all students at the Dental Institute as part of their pre-clinical/clinical education?
  Year 2: Neither agree nor disagree 20.0, Agree 40.0, Strongly agree 40.0
  Year 5: Neither agree nor disagree 12.5, Agree 12.5, Strongly agree 75.0

Helped you to identify learning needs and to improve your performance?
  Year 2: Neither agree nor disagree 30.0, Agree 50.0, Strongly agree 20.0
  Year 5: Neither agree nor disagree 12.5, Agree 50.0, Strongly agree 37.5

Was acceptable and fair?
  Year 2: Neither agree nor disagree 20.0, Agree 50.0, Strongly agree 30.0
  Year 5: Neither agree nor disagree 0, Agree 18.8, Strongly agree 81.3

BDS, Bachelor of Dental Surgery; DOPS, Direct Observation of Procedural Skills.


Finally, the implementation of this peer assessment protocol was very time-consuming and required careful and rigorous preparation, as well as dedication, especially from junior students.

There was no statistically significant correlation between peer scores and end-of-year examination marks, and no observable educational impact of the protocol. However, the results demonstrate that both pre-clinical and clinical dental undergraduate students are able to (i) identify differences in performance when assessing their peers across different domains and (ii) detect performance changes with time. The applicability of these findings to a larger, wider-ranging cohort, followed over a longer period of time, requires further study.

Conclusion

Both BDS 2 and BDS 5 students were able to identify differences in peers’ performance across various domains and to detect improvement over time. Junior students generally required a longer period of peer assessment to adapt to the protocol, whilst both groups expressed positive perceptions of the approach. These findings support the further development, implementation and assessment of the tool over a longer period and across a broader study group to provide more reliable and generalizable data.

References

1 General Dental Council. Preparing for practice: dental team learning outcomes for registration. London, UK: GDC, 2012: 1–104.
2 American Commission on Dental Accreditation CODA. Accreditation Standards for Dental Education Programs, 2010.
3 Dochy F, Segers M, Sluijsmans D. The use of self-, peer and co-assessment in higher education: a review. Stud High Educ 1999: 24: 331–350.
4 Boud D, Falchikov N. Rethinking assessment in higher education: learning for the longer term. Hoboken, NJ: Taylor and Francis, 2007.
5 Speyer R, Pilz W, Van Der Kruis J, Brunings JW. Reliability and validity of student peer assessment in medical education: a systematic review. Med Teach 2011: 33: e572–e585.
6 Hobson R, Rolland S, Rotgans J, et al. Quality assurance, benchmarking, assessment and mutual international recognition of qualifications. Eur J Dent Educ 2008: 12: 92–100.
7 Topping K. Peer assessment between students in colleges and universities. Rev Educ Res 1998: 68: 249–276.
8 Norcini J. Peer assessment of competence. Med Educ 2003: 37: 539–543.
9 Finn GM, Garner J. Twelve tips for implementing a successful peer assessment. Med Teach 2011: 33: 443–446.
10 Falchikov N, Goldfinch J. Student peer assessment in higher education: a meta-analysis comparing peer and teacher marks. Rev Educ Res 2000: 70: 287–322.
11 Sargeant J, McNaughton E, Mercer S, Murphy D, Sullivan P, Bruce DA. Providing feedback: exploring a model (emotion, content, outcomes) for facilitating multisource feedback. Med Teach 2011: 33: 744–749.
12 Falchikov N. Peer feedback marking: developing peer assessment. Innovat Educ Train Int 1995: 32: 175–187.
13 Hounsell D, Blair S, Falchikov N, et al. Innovative assessment across the disciplines: an analytical review of the literature. Heslington, UK: Higher Education Academy, 2007.
14 Higgins R, Hartley P, Skelton A. The conscientious consumer: reconsidering the role of assessment feedback in student learning. Stud High Educ 2002: 27: 53–64.
15 Schön D. The reflective practitioner: how professionals think in action. Farnham, Surrey, UK: Ashgate Publishing, 2009.
16 Manogue M, Kelly M, Bartakova Masaryk S, et al. 2.1 Evolving methods of assessment. Eur J Dent Educ 2002: 6: 53–66.
17 Boursicot K, Etheridge L, Setna Z, et al. Performance in assessment: consensus statement and recommendations from the Ottawa conference. Med Teach 2011: 33: 370–383.
18 Evans AW, Leeson RMA, Petrie A. Reliability of peer and self-assessment scores compared with trainers’ scores following third molar surgery. Med Educ 2007: 41: 866–872.
19 Beaumont C, O’Doherty M, Shannon L. Reconceptualising assessment feedback: a key to improving student learning? Stud High Educ 2011: 36: 671–687.
20 Dannefer EF, Henson LC, Bierer SB, et al. Peer assessment of professional competence. Med Educ 2005: 39: 713–722.
21 Orsmond P, Merry S, Reiling K. The importance of marking criteria in the use of peer assessment. Assess Eval High Educ 1996: 21: 239–250.
22 Larsen T, Jeppe-Jensen D. The introduction and perception of an OSCE with an element of self- and peer-assessment. Eur J Dent Educ 2008: 12: 2–7.
23 Cheng W, Warren M. Having second thoughts: student perceptions before and after a peer assessment exercise. Stud High Educ 1997: 22: 233–239.
24 Gukas ID, Miles S, Heylings DJ, Leinster SJ. Medical students’ perceptions of peer feedback on an anatomy student-selected study module. Med Teach 2008: 30: 812–814.
25 Nofziger AC, Naumburg EH, Davis BJ, Mooney CJ, Epstein RM. Impact of peer assessment on the professional development of medical students: a qualitative study. Acad Med 2010: 85: 140–147.
26 Archer J, Norcini J, Southgate L, Heard S, Davies H. mini-PAT (peer assessment tool): a valid component of a national assessment programme in the UK? Adv Health Sci Educ Theory Pract 2008: 13: 181–192.
27 Mackillop LH, Crossley J, Vivekananda-Schmidt P, Wade W, Armitage M. A single generic multi-source feedback tool for revalidation of all UK career-grade doctors: does one size fit all? Med Teach 2011: 33: e75–e83.
28 Ali K, Heffernan E, Lambe P, Coombes L. Use of peer assessment in tooth extraction competency. Eur J Dent Educ 2014: 18: 44–50.
29 Taylor CL, Grey NJA, Satterthwaite JD. A comparison of grades awarded by peer assessment, faculty and a digital scanning device in a pre-clinical operative skills course. Eur J Dent Educ 2013: 17: e16–e21.
30 Shumway JM, Harden RM. AMEE guide no. 25: the assessment of learning outcomes for the competent and reflective physician. Med Teach 2003: 25: 569–584.
31 Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE guide no. 31. Med Teach 2007: 29: 855–871.
32 The Royal College of Surgeons of England. Intercollegiate Surgical Curriculum Programme (ISCP) workplace based assessments. Available at: https://www.iscp.ac.uk/dental/wbas.aspx (accessed July 2014).
33 Norcini J. Workplace assessment. In: Swanwick T, ed. Understanding medical education: evidence, theory and practice. Chichester, UK: Wiley-Blackwell, 2011: 232–245.
34 Crossley J, Humphris G, Jolly B. Assessing health professionals. Med Educ 2002: 36: 800–804.
35 Mackillop L, Parker-Swift J, Crossley J. Getting the questions right: non-compound questions are more reliable than compound questions on matched multi-source feedback instruments. Med Educ 2011: 45: 843–848.
36 Prescott-Clements LE, van der Vleuten CPM, Schuwirth L, Gibb E, Hurst Y, Rennie JS. Measuring the development of insight by dental health professionals in training using workplace-based assessment. Eur J Dent Educ 2011: 15: 159–164.
37 Streiner DL, Norman G. Health measurement scales: a practical guide to their development and use. Oxford: Oxford University Press, 2008.
38 Beard J. Workplace-based assessment: the need for continued evaluation and refinement. Surgeon 2011: 9: S12–S13.
39 Crossley J, Jolly B. Making sense of work-based assessment: ask the right questions, in the right way, about the right things, of the right people. Med Educ 2012: 46: 28–37.
40 Beard JH, O’Sullivan P, Palmer BJA, Qiu M, Kim EH. Peer assisted learning in surgical skills laboratory training: a pilot study. Med Teach 2012: 34: 957–959.
41 Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ 2010: 341: c5064.
42 Wilkinson JR, Crossley JG, Wragg A, Mills P, Cowan G, Wade W. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ 2008: 42: 364–373.
43 Moonen-van Loon JMW, Overeem K, Donkers HHLM, van der Vleuten CPM, Driessen EW. Composite reliability of a workplace-based assessment toolbox for postgraduate medical education. Adv Health Sci Educ Theory Pract 2013: 18: 1087–1102.
44 Crossley J, Davies H, Humphris G, Jolly B. Generalisability: a key to unlock professional assessment. Med Educ 2002: 36: 972–978.
45 Beard J, Marriott J, Purdie H, Crossley J. Assessing the surgical skills of trainees in the operating theatre: a prospective observational study of the methodology. Health Technol Assess 2011: 15: 1–168.
46 Dreyfus H, Dreyfus S, Athanasiou T. Mind over machine: the power of human intuition and expertise in the era of the computer. New York, NY: Free Press, 1988.
47 Bennett D, Kelly M, O’Flynn S. Framework for feedback: the peer mini-clinical examination as a formative assessment tool. Med Educ 2012: 46: 512.
48 Hauser AM, Bowen DM. Primer on preclinical instruction and evaluation. J Dent Educ 2009: 73: 390–398.
49 Karl M, Graef F, Wichmann M, Beck N. Evaluation of tooth preparations: a comparative study between faculty members and pre-clinical students. Eur J Dent Educ 2011: 15: 250–254.
50 Schoenrock-Adema J, Heijne-Penninga M, van Duijn MAJ, Geertsma J, Cohen-Schotanus J. Assessment of professional behaviour in undergraduate medical education: peer assessment enhances performance. Med Educ 2007: 41: 836–842.
51 Davies H, Archer J, Southgate L, Norcini J. Initial evaluation of the first year of the Foundation Assessment Programme. Med Educ 2009: 43: 74–81.
52 Norman GR, van der Vleuten CPM, De Graaff E. Pitfalls in the pursuit of objectivity: issues of validity, efficiency and acceptability. Med Educ 1991: 25: 119–126.
53 McCoubrie P. Improving the fairness of multiple-choice questions: a literature review. Med Teach 2004: 26: 709–712.
54 van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ Theory Pract 1996: 1: 41–67.
55 Shue CK, Arnold L, Stern DT. Maximizing participation in peer assessment of professionalism: the students speak. Acad Med 2005: 80: S1–S5.

© 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.