
DESIGNING A FACULTY PEER EVALUATION SYSTEM FOR INSTRUCTIONAL SKILLS

Lise McCoy, EdD (ABD)
Sharon Obadia, DO
Christine Morgan, EdD

ATSU-SOMA

AACOM Conference
Washington, DC
April 3, 2014

Participant Learning Objectives

• Identify reasons for developing a system for peer evaluation of instruction
• Explore the rubric development process
• Review rating categories and scoring system
• Rate an instructional performance

A System of Peer Evaluation

• Peer Evaluation System
• Rubrics: Large & Small Group Instruction
• Peer Coaching: Feedback
• Self-Reflection
• Goal Setting

What are some advantages of a peer-evaluation system?

Peer Evaluation System Goals

• Set standards for instructional delivery
• Provide opportunities for formative feedback
• Strengthen community among faculty
• Teach constructive feedback techniques

Goal: Encourage and reward instructional skill competency

What is a Rubric?

“An assessment tool to save grading time, convey effective feedback, and promote student learning”

- Stevens & Levi, 2005

Are you currently using a peer evaluation rubric for faculty development?

Rubric Development Process

Research & Design

• Discovery process > Literature review

• Faculty suggestions

• Learning-Centered principles

Stakeholder Review

• Administration

• Faculty Council

• Education Specialists

• Faculty Forum

Pilot & Implementation

• IRB

• Training

• Pilot

Large Group Instruction Rubric Categories

• Content

• Relevance

• Clarity

• Learning Objectives

• Engagement

• Inquiry

• Exam Items

Scoring System

• Scale: 1-4
• Underline specific statements
• Converge toward a major score category and mark the tick box (see the sketch below)
• OK to use a .5 (half-point) system
• Mark during the peer evaluation & discuss scores later with your peer-partner
• Submit a reflection on areas to strengthen to Faculty Development (FD)
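
To make the convergence step concrete, here is a minimal Python sketch of one way the underlined per-aspect marks could converge into a single holistic category score; the function name and tallying rule are illustrative assumptions, not part of the SOMA rubric materials.

```python
# Hypothetical illustration only: the rubric prescribes a judgment-based
# convergence, not a formula. Assumption: each underlined phrase is
# tallied at the level (1-4) of the column it appears in.

def holistic_score(aspect_levels):
    """Converge per-aspect marks (each 1-4) into one category score,
    snapped to the rubric's half-point (.5) grid."""
    if not aspect_levels:
        raise ValueError("mark at least one aspect before scoring")
    mean = sum(aspect_levels) / len(aspect_levels)
    return max(1.0, min(4.0, round(mean * 2) / 2))

# Example: "Content" aspects marked at levels 4, 3, and 4
print(holistic_score([4, 3, 4]))  # -> 3.5
```

In practice the evaluator converges the marks by eye on the paper form; the snippet only shows that a 1-4 scale with half-point steps behaves like a rounded average.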

Rating Rubric: Content

Please mark one tick box for each row.

1. Content
4 (Highly Effective): Provides highly organized materials. Content is current and accurate. Visual aids are high quality.
3 (Effective): Provides organized materials. Content appears accurate. Visual aids are mainly high quality.
2 (Improvement Necessary): Some attempt to organize materials. Content and visual aids are of limited quality.
1 (Does Not Meet Standards): Materials lack organization. Most content and visual aids are of poor quality.

Underline phrases describing the performance level for each aspect of the category. Then converge the scores to select a holistic score.

Rating Rubric: Relevance

Please mark one tick box for each row.

2. Relevance
4 (Highly Effective): All content is level appropriate, medically relevant, and integrates well within the sequence of the course. References are made to related lectures.
3 (Effective): Most content is level appropriate, medically relevant, and integrates within the course.
2 (Improvement Necessary): Some content is not level appropriate, medically relevant, or is insufficiently integrated within the course.
1 (Does Not Meet Standards): Content is not level appropriate, medically relevant, or integrated within the course.

Rating Rubric: Clarity

Please mark one tick box for each row.

3. Clarity
4 (Highly Effective): All lesson content, examples, and explanations are presented in a clear, stepwise manner, defining all concepts and terms.
3 (Effective): Most lesson content, examples, and explanations are presented in a clear, stepwise manner, defining most concepts and terms.
2 (Improvement Necessary): Some content or explanations are unclear. Examples were not provided. Definitions were not provided.
1 (Does Not Meet Standards): Lesson content and explanations are not presented in a clear manner. No examples were provided. Concepts lack definition.

Rating Rubric: Objectives

Please mark one tick box for each row.

4. Learning Objectives
4 (Highly Effective): Presents LO’s early in the lesson. LO’s perfectly comply with LO guidelines.* Content is perfectly aligned to LO’s.
3 (Effective): Presents LO’s early in the lesson. LO’s mainly comply with guidelines. Content is mainly aligned to LO’s.
2 (Improvement Necessary): Presents LO’s, but they require better alignment with guidelines. Content is not well aligned with LO’s.
1 (Does Not Meet Standards): Begins instruction without providing any LO’s or indication of where instruction is headed.

Underline phrases describing the performance level for each aspect of the category. Then converge the scores to select a holistic score.

Rating Rubric: Engagement

5. Engagement**
4 (Highly Effective): The presentation is interesting. Provides more than one highly participatory, engaging active-learning feature,** such as rich discussion, pause activities, or team concept practice.
3 (Effective): Most of the presentation is interesting. Provides at least one active-learning feature.**
2 (Improvement Necessary): Some of the presentation is interesting. Limited or ineffective attempt to provide engaging features.**
1 (Does Not Meet Standards): Presentation is not interesting. Engaging features** or pause activities are not provided.

Rating Rubric: Inquiry – In Person

6a. Inquiry (In-Person)
4 (Highly Effective): Provides several opportunities to ask or answer questions. Ensures an accurate response to the question and facilitates discussion well.
3 (Effective): Provides opportunities to ask or answer questions in some sections of the lesson. Facilitates discussion adequately.
2 (Improvement Necessary): Provides minimal opportunities to ask or answer questions, or does not adequately facilitate the discussion.
1 (Does Not Meet Standards): Provides no opportunities to ask or answer questions. Some students express confusion.

Rating Rubric: Inquiry – Distance

6b. Inquiry (Distance)
4 (Highly Effective): Provides a variety of discussion and inquiry features.***
3 (Effective): Provides some discussion and inquiry features.
2 (Improvement Necessary): Provides few discussion and inquiry features.
1 (Does Not Meet Standards): Provides no discussion and inquiry features.

Rating Rubric: Exam Items

7. Exam Items
4 (Highly Effective): Well-written items comply with NBOME guidelines. They reflect application of knowledge. Perfectly aligned with LO’s.
3 (Effective): Items require minor editing to comply with NBOME guidelines. They reflect some application of knowledge. Mainly aligned with LO’s.
2 (Improvement Necessary): Items require editing to align with NBOME guidelines. Most are recall of isolated facts or ineffective application of skill. Sometimes aligned with LO’s.
1 (Does Not Meet Standards): Items require major editing to meet NBOME guidelines. Nearly all are factual recall. Not aligned with LO’s.

Rubric Guidelines

SOMA faculty requested guidelines for these categories (see p. 2 of the rubric):

• Objectives
• Engagement
• Inquiry

Also requested: didactics for strengthening weak areas.

Activity: Rate An Instructional Performance

Content: Lab Values and Normal Ranges

1. Watch the following video clip.
2. Rate this performance for the “Content” category.
3. Confer with a partner to compare and converge your scores toward a consensus score (see the sketch below).
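
As a rough illustration of step 3, the following sketch averages two partners' category scores onto the rubric's half-point grid; the function name is hypothetical, and in the session the consensus is reached by discussion rather than computation.

```python
def consensus_score(rater_a, rater_b):
    """Average two partners' scores for one category; clamp to the
    1-4 scale and snap to the rubric's half-point (.5) grid."""
    avg = (rater_a + rater_b) / 2
    return max(1.0, min(4.0, round(avg * 2) / 2))

# Example: partners rated "Content" at 3.0 and 4.0
print(consensus_score(3.0, 4.0))  # -> 3.5
```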

Questions?

• What are your thoughts/insights?
• Provide written feedback using the comment card provided.

References

• AACOM, TOPCE. http://www.aacom.org/InfoFor/educators/mec/facultydev/Pages/default.aspx
• Philadelphia, M. (2013). Will school-based online faculty development be an effective tool for their professional growth? University of Southern California.
• Arreola, R. (2000). Developing a comprehensive faculty evaluation system. Bolton, MA: Anker Publishing Company.
• Rhem, J. (2013). Thresholds are troublesome. National Teaching and Learning Forum. Accessed 11.13.13 from: http://onlinelibrary.wiley.com/journal/10.1002/%28ISSN%292166-3327
• Stevens, D. D., & Levi, A. J. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.
• Frank, J. R., Mungroo, R., Ahmad, Y., Wang, M., De Rossi, S., & Horsley, T. (2010). Toward a definition of competency-based education in medicine: A systematic review of published definitions. Medical Teacher, 32(8), 631–637. doi:10.3109/0142159X.2010.500898
• Skinner, M. E., & Welch, F. C. (1996). Peer coaching for better teaching. College Teaching, 44(4).
• OUCOM. (2013). Classroom Observation Form. http://www.oucom.ohiou.edu/fd/programs.htm
• Steinert, Y., Mann, K., Centeno, A., Dolmans, D., Spencer, J., Gelula, M., & Prideaux, D. (2006). A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Medical Teacher, 28(6), 497–526. doi:10.1080/01421590600902976
• Sutkin, G., Wagner, E., Harris, I., & Schiffer, R. (2008). What makes a good clinical teacher in medicine? A review of the literature. Academic Medicine, 83(5).

Rubric Development History

• The 2014 edition of the rubric described in this presentation draws on the following sources:
• 2012: A draft “Faculty Performance Rubric (2012)” was developed by the SOMA team of Chris Sullivan, PhD, Marjorie Buick-Kinney, MEd, and Sharon Obadia, DO, and was subsequently circulated to SOMA faculty. The format of this 2012 rubric was developed with inspiration from a rubric model for delivery of instruction by American educator Kim Marshall.
• 2013: During Academic Year 2013-2014, the team of Lise McCoy, Sharon Obadia, and Chris Morgan evolved the categories and descriptions through six iterations involving SOMA leadership, department chairs, the Faculty Development Advisory Group, Faculty Council, Curricular Dean input, and literature searches.
• 2014: The 2014 edition (McCoy, Obadia, Morgan) adds updated category descriptors, guidelines, a rating and scoring system, and discrete categories for distance engagement and inquiry.