Evaluation and Case Study Review Dr. Lam TECM 5180


Page 1

Evaluation and Case Study Review

Dr. Lam, TECM 5180

Page 2

Summative Evaluation vs. Formative Evaluation

• Summative evaluation- Assessment OF learning

• Formative evaluation- Assessment FOR learning

Page 3

Summative or Formative?

• You have been asked to determine how much money your training program has saved over a six-month period.

• You asked trainees to complete a 50-question paper and pencil exam covering the learning objectives of your course.

• You have been asked to observe trainees doing their jobs and write a report that describes their level of knowledge transfer.

• You asked trainees for feedback about the content and delivery of the course and the facilitator.

• After six months, you have emailed managers and asked them about each trainee’s performance.

Page 4

Types of Evaluation

• Lots of models of evaluation
  • Kirkpatrick’s four-level model of evaluation
  • Stufflebeam’s four-step evaluation process
  • Rossi’s five-domain evaluation model
  • Brinkerhoff’s success case method

• We’ll talk about Kirkpatrick (1994) because:
  • It’s widely accepted
  • It’s easy to grasp

Page 5
Page 6

Reactions

• What?: Perceptions of the trainees
• How?: Questionnaires and feedback forms
• Why?: Gives designers insight about training satisfaction, which can be good and bad

• Trainee feedback is relatively quick and easy to obtain, and it is typically inexpensive to analyze

Page 7

Evaluation instrument examples

• See Piskurich pages 274-275

Page 8

Questionnaires

• Open-ended items- allow users to express opinions in their own words
  • Advantages: allows users to give unique, open, and honest feedback
  • Disadvantages: difficult to analyze; trainees often prefer not to fill them out (biased results)

• Close-ended items- allow users to express opinions on a predetermined quantitative scale
  • Advantages: easy to analyze; fast completion for trainees
  • Disadvantages: inhibits unique feedback; doesn’t always provide a full picture

Page 9

Creating close-ended questions

• Use a scale that allows for degrees of comparison
  • Not good: Did you find the course beneficial? Yes or No
  • Better: On a scale from 1 to 5, how beneficial did you find the course?

• Always use the same scale (e.g., 5-point or 7-point Likert scale)

• Construct grammatically consistent questions
• Develop questions for specific purposes (i.e., don’t ask questions if you don’t know what you’ll do with the result)
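One payoff of a consistent close-ended scale is that responses are trivial to aggregate. A minimal sketch, using hypothetical 5-point Likert responses (the question and data are illustrative, not from the source):

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to: "On a scale from 1 to 5,
# how beneficial did you find the course?"
responses = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]

avg = mean(responses)      # central tendency of the ratings
dist = Counter(responses)  # how many trainees chose each scale point

print(f"Mean rating: {avg:.1f}")                    # Mean rating: 3.9
print("Distribution:", dict(sorted(dist.items())))  # {2: 1, 3: 2, 4: 4, 5: 3}
```

Using the same scale across all items, as the slide advises, is what makes these summaries comparable question to question.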

Page 10

Creating open-ended questions

• Limit your use of these
• Use them to supplement close-ended responses
• Reserve these for unique responses

• Bad use of open-ended: What did you like about the presentation slides?
  • Improved: On a scale from 1 to 5, how useful were the slides in supplementing the facilitator’s content?

• Bad use of close-ended: Rate the following on a scale of 1 to 5, with 1 being strongly disagree and 5 being strongly agree: I would make changes to the delivery of this course.
  • Improved: What changes would you make to the delivery of the course?

Page 11

Learning

• What?: Measure of increase in knowledge before and after training

• How?: Formal assessment; interview or observation

• Why?: To ensure your trainees have learned what you set out for them to learn

• Already created if you’ve designed and developed your course properly (See last week’s presentation slides for assessment overview)
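Since learning is measured as the change in knowledge before and after training, the core comparison is simple arithmetic on assessment scores. A sketch with hypothetical pre/post exam scores (names and numbers are illustrative):

```python
# Hypothetical pre- and post-training exam scores (percent correct) per trainee
pre_scores  = {"trainee_a": 55, "trainee_b": 70, "trainee_c": 60}
post_scores = {"trainee_a": 85, "trainee_b": 80, "trainee_c": 90}

# Knowledge gain per trainee, and for the cohort as a whole
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}
avg_gain = sum(gains.values()) / len(gains)

print(gains)  # {'trainee_a': 30, 'trainee_b': 10, 'trainee_c': 30}
print(f"Average gain: {avg_gain:.1f} points")
```

The same pre/post structure works whether the assessment is a formal exam, an interview rubric, or observation scores.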

Page 12

Behavior

• What?: The extent of applied learning back on the job
• How?: Observation and interviews over time; retesting
• Why?: To measure the long-term efficacy of your training program
• Measuring behavior is difficult and requires the cooperation of management and other overseers of a trainee’s day-to-day operations

Page 13

Piskurich’s “Transfer to the job” Evaluation

1. Did the training address the requirements of the job?

2. Were the trainees performing the job requirements competently before the training?

3. Are the trainees now performing the job requirements competently?

4. What are the trainees still not doing correctly?
5. Were there any unintended consequences of the training?

Page 14

Examples

• See Piskurich pages 278-279

Page 15

Results

• What?: The effect on the business or environment of the trainee

• How?: Measured with already-implemented systems; ROI; cost-effectiveness analysis

• Why?: To measure the impact training has on the organization (at a macro level)

• Difficult to isolate training as a variable

Page 16

ROI

• Return on investment
• Use ROI to: demonstrate effectiveness; promote importance; suggest refinements; project future costs; measure success

• Drawbacks to ROI: can’t compute intangible benefits; can’t measure all variables and data; can be misleading

Page 17

How to calculate ROI

• ROI = Net Benefits / Costs
• Net benefits can include: increased productivity; greater customer satisfaction; higher-quality work product
• Costs can include: design and development costs; ongoing costs; evaluation costs
• The hard part is quantifying benefits and costs
• Although ROI is a quantifiable metric, there is an interpretive element in calculating it
• Therefore, the logic and rationale behind the metric are as important as the metric itself
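The arithmetic itself is the easy part. A minimal sketch of the calculation above with hypothetical benefit and cost figures (as the slides note, the real work is justifying these numbers):

```python
# Hypothetical quantified benefits of the training program (annualized, USD)
benefits = {
    "increased_productivity": 120_000,
    "customer_satisfaction": 30_000,
    "higher_quality_work": 25_000,
}

# Hypothetical program costs (USD)
costs = {
    "design_and_development": 60_000,
    "ongoing_delivery": 40_000,
    "evaluation": 10_000,
}

total_benefits = sum(benefits.values())      # 175,000
total_costs = sum(costs.values())            # 110,000
net_benefits = total_benefits - total_costs  # 65,000

# ROI = Net Benefits / Costs, usually reported as a percentage
roi = net_benefits / total_costs
print(f"ROI: {roi:.0%}")                     # ROI: 59%
```

Each line item here (e.g., pricing "customer satisfaction" at $30,000) embodies the interpretive judgment the slide warns about, which is why the rationale should travel with the number.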

Page 18

How to determine what evaluations to conduct

• Why do I want to evaluate?
• What am I going to evaluate?
• Who should I involve as part of the evaluation?
• How am I going to do the evaluation?
• When should I do the evaluation?

Page 19

Implementing Revisions

• As-needed revisions- most common type of revision; reactionary

• Planned revisions- less common type of revision; proactive