TRAINING EVALUATION & METRICS


Page 1: Evaluating Training and Metrics

TRAINING EVALUATION & METRICS

Page 2: Evaluating Training and Metrics

• Evaluation is the systematic process of determining the worth, value or meaning of any process or activity.

Page 3: Evaluating Training and Metrics

PURPOSE

PURPOSE OF TRAINING

EVALUATION

FEEDBACK

RESEARCH

INTERVENTION

POWER

CONTROL

Page 4: Evaluating Training and Metrics

REASONS

• ALIGN WITH BUSINESS NEEDS

• JUSTIFY THE INVESTMENT

• RIGHT DIRECTION

• IDENTIFICATION OF GAPS: IMPROVING FUTURE TRAINING EFFORTS

• BUILD CREDIBILITY

Page 5: Evaluating Training and Metrics

BARRIERS

• LACK OF TOP MANAGEMENT SUPPORT

• LACK OF SKILLS TO EVALUATE

• KNOWING WHAT CRITERIA TO EVALUATE ON

• RISKY AND EXPENSIVE ENTERPRISE

Page 6: Evaluating Training and Metrics

TYPES OF EVALUATION OBJECTIVES

Formative

Formative evaluation provides ongoing feedback to the curriculum designers and developers to ensure that what is being created really meets the needs of the intended audience.

Process evaluation provides information about what occurs during training. This includes giving and receiving verbal feedback.

Summative

Outcome evaluation determines whether the desired results of applying new skills (e.g., what participants are doing) were achieved in the short term.

Impact evaluation determines how the results of the training affect the strategic goal.

Page 7: Evaluating Training and Metrics

MODEL / FRAMEWORK: Levels of evaluation

KIRKPATRICK: Four levels (reaction, learning, behaviour, results)

CIPP (GALVIN): Four levels (context, input, process and product)

CIRO: Context, input, reaction and outcome

BRINKERHOFF: Six stages (goal setting, program design, program implementation, immediate outcomes, intermediate or usage outcomes, and impacts and worth)

SYSTEMS APPROACH (IPO): Four sets of activities (inputs, process, output and outcome)

KRAIGER, FORD AND SALAS: A classification scheme that specifies three categories of learning outcomes (cognitive, skill-based, affective) and proposes evaluation measures appropriate for each category of outcomes

KAUFMAN AND KELLER: Five levels (enabling and reaction, acquisition, application, organizational outputs, and societal outcomes)

HOLTON: Identifies five categories of variables and the relationships among them (secondary influences, motivation elements, environmental elements, outcomes, ability/enabling elements)

DIFFERENT APPROACHES…

Page 8: Evaluating Training and Metrics

KIRKPATRICK / PHILLIPS MODEL

Level 1: Reaction

Level 2: Learning

Level 3: Behaviour

Level 4: Results

Level 5: ROI

Kirkpatrick (1959): Levels 1-4

Phillips (1970): Level 5, ROI

Page 9: Evaluating Training and Metrics

PROCESS

The evaluation pyramid, from top to base: ROI, Results, Behaviour, Learning, Reaction & Planned Action.

Suggested share of programs to evaluate at each level:

Level 5: 5%-10%

Level 4: 10%-20%

Level 3: 30%-40%

Level 2: 40%-50%

Level 1: 90%-100%
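The percentage targets above are commonly read as the suggested share of training programs to evaluate at each Kirkpatrick/Phillips level. A minimal sketch of checking an evaluation portfolio against those targets (the `coverage_gaps` helper and the program counts are hypothetical, not from the slides):

```python
# Suggested share of programs to evaluate at each level (from the slide above).
TARGETS = {  # level: (min %, max %)
    1: (90, 100),
    2: (40, 50),
    3: (30, 40),
    4: (10, 20),
    5: (5, 10),
}

def coverage_gaps(evaluated, total_programs):
    """Return the levels where evaluation coverage falls short of the target,
    with the shortfall in percentage points."""
    gaps = {}
    for level, (low, _high) in TARGETS.items():
        pct = 100.0 * evaluated.get(level, 0) / total_programs
        if pct < low:
            gaps[level] = round(low - pct, 1)
    return gaps

# Hypothetical portfolio of 40 programs, most evaluated only at Level 1
print(coverage_gaps({1: 40, 2: 12, 3: 6, 4: 2, 5: 1}, 40))
# → {2: 10.0, 3: 15.0, 4: 5.0, 5: 2.5}
```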

Page 10: Evaluating Training and Metrics

PROCESS

BEFORE

DURING

AFTER

Page 11: Evaluating Training and Metrics

Evaluation

• Plan the Evaluation during Design of the Training Program

• Develop Data collection tools to be used in each stage

Page 12: Evaluating Training and Metrics

Reaction Level Evaluation –Level I

Page 13: Evaluating Training and Metrics

Reaction - What Is It?

• How favorably participants react to the training (“Customer satisfaction”)

– Collects reactions to instructor, course, and learning environment

– Communicates to trainees that their feedback is valued

– Can provide quantitative information

Page 14: Evaluating Training and Metrics

Reaction - What It Looks Like

• Feedback Questionnaire – Program Objectives

– Content

– Methods of delivery

– Course materials

– Instruction tools

– Assignments

– Media

– Facilitator

– Facilities

– Trainee Motivation to Learn

Page 15: Evaluating Training and Metrics

Reaction - How to Perform

• Determine what you want to find out

• Design a form to collect/quantify reactions

• Do it immediately: “Give it tomorrow”

• Develop acceptable scoring standards

• Follow up as appropriate: count and collate the responses

• Give feedback on the feedback received

• Utilise and incorporate
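The “acceptable scoring standards” step can be as simple as comparing an item's mean rating against a cut-off. A minimal sketch (the `reaction_summary` helper, the 1-4 scale and the ratings are illustrative assumptions):

```python
def reaction_summary(ratings, standard=3.0):
    """Average a set of reaction ratings (e.g. on a 1-4 scale) and flag
    whether the item meets the acceptable-score standard."""
    avg = sum(ratings) / len(ratings)
    return round(avg, 2), avg >= standard

# Hypothetical facilitator ratings from one feedback questionnaire item
avg, meets_standard = reaction_summary([4, 3, 4, 2, 3, 4, 3])
print(avg, meets_standard)  # → 3.29 True
```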

Page 16: Evaluating Training and Metrics

Questionnaire

• Open Ended Questionnaire

• Check list

• Two-way questions

• Multiple Choice Questions

• Ranking

Page 17: Evaluating Training and Metrics

Uses of Reaction Data

• Monitoring customer satisfaction

• Identifying strengths and weaknesses

• Evaluating facilitators

• Evaluating facilities

• Evaluating planned improvements

• Linking with follow up data

• Marketing programs

Page 18: Evaluating Training and Metrics

Limitations

• Subjective, based on the feelings at the time of testing

• Participants may be too polite or too rude

• It is also subject to misuse

• A good rating is no assurance that participants will practice what has been taught

“For those who believe, no proof is required;

for those who do not, no proof is sufficient.”

Page 19: Evaluating Training and Metrics

Factors associated with Rating

• Ratings are associated with the time of day the session is offered, participants’ desire to attend the training, and the gender and physical attractiveness of the trainer (Baldwin, 2004; Oliver & Sautter, 2005)

• Feedback systems are inappropriate when used to evaluate learning in an environment where participants are expected to share their experiences and learn (Smith, 2004)

Page 20: Evaluating Training and Metrics

Learning Level Evaluation – Level II

Page 21: Evaluating Training and Metrics

Objectives of L II Evaluation

• Providing individual feedback

• Improving training program

– Objectives, Content & Delivery

• Evaluating instructors

Page 22: Evaluating Training and Metrics

What is learnt

Knowledge

Skills

Self-Concepts

Traits

Motives

Page 23: Evaluating Training and Metrics

Measuring Learning with Tests

• Based on medium

– Paper-and-pencil tests, simulations, actual pieces of work and computer-based tests

• Based on test design

– Oral exams, essay tests, objective tests, norm-referenced tests, criterion-referenced tests and performance testing

Page 24: Evaluating Training and Metrics

Tests to Measure Learning

• Norm-referenced tests

– Compare trainees with each other rather than to specific instructional objectives

• Criterion-referenced tests

– An objective test with a predetermined cut-off score for specific instructional objectives

• Performance testing

– Allows trainees to exhibit a skill that was learned in the program
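The difference between the two test designs can be shown in a few lines. A minimal sketch (the names and scores are hypothetical):

```python
def criterion_referenced(scores, cutoff):
    """Pass/fail each trainee against a predetermined cut-off score
    tied to specific instructional objectives."""
    return {name: score >= cutoff for name, score in scores.items()}

def norm_referenced(scores):
    """Rank trainees against each other rather than against objectives
    (1 = highest score)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {name: rank for rank, name in enumerate(ordered, start=1)}

scores = {"Asha": 82, "Ben": 67, "Chitra": 91}
print(criterion_referenced(scores, cutoff=70))  # → {'Asha': True, 'Ben': False, 'Chitra': True}
print(norm_referenced(scores))                  # → {'Chitra': 1, 'Asha': 2, 'Ben': 3}
```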

Page 25: Evaluating Training and Metrics

Measure Learning with Simulation

• This involves constructing and applying a procedure or task that models the activity for which the program is being conducted

• Simulations can provide an accurate evaluation if performance in the simulation is objective and can be clearly measured

Role Plays

Business Games

Page 26: Evaluating Training and Metrics

Learning - How to Perform

• Use a control group, if feasible

• Evaluate knowledge, skills, and/or abilities before and after

• Get 100% participation or use statistical sample

• Follow-up as appropriate
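With a control group and pre/post measurement, the training effect can be estimated as the trained group's gain minus the control group's gain. A minimal difference-in-differences sketch (the `learning_gain` helper and all scores are hypothetical):

```python
def learning_gain(trained_pre, trained_post, control_pre, control_post):
    """Training effect = (trained group's pre/post gain) minus
    (control group's pre/post gain): a simple difference-in-differences."""
    avg = lambda xs: sum(xs) / len(xs)
    trained_gain = avg(trained_post) - avg(trained_pre)
    control_gain = avg(control_post) - avg(control_pre)
    return round(trained_gain - control_gain, 2)

# Hypothetical knowledge-test scores (out of 100)
effect = learning_gain(
    trained_pre=[52, 48, 60], trained_post=[78, 70, 83],
    control_pre=[55, 50, 58], control_post=[58, 54, 60],
)
print(effect)  # → 20.67
```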

Page 27: Evaluating Training and Metrics

Exercise

1.Analysis of Feedback Forms

2.Analysis of Compiled Ratings

Page 28: Evaluating Training and Metrics

Behaviour Level Evaluation –Level III

Page 29: Evaluating Training and Metrics

Behavior

• Transfer of knowledge, skills, and /or abilities to the real world

– Measure achievement of performance objectives

• Involve the immediate supervisor/s

• Observe performer, first-hand

• Survey key people who observe performer

• Use checklists, questionnaires, interviews, or combinations

Page 30: Evaluating Training and Metrics

Organisational Support and Transfer Climate

• Organizational policies and practices

• Extent of encouragement to apply new learning

• Consequences when applying

• Reinforcement and Coaching

Page 31: Evaluating Training and Metrics

Trainee Characteristics

• Ability to learn: aptitude and specific intelligence

• Motivation to learn: WIIFM, belief in training, perceived need for KSA improvement, perceived back-to-job situation

• Attitude: job satisfaction, low organizational commitment, intention to leave

• Personality (the Big 5 factors): initiative, openness to experience, extraversion, agreeableness, conscientiousness

Page 32: Evaluating Training and Metrics

Demonstrated Capability - Trainee

• Ability to transfer (self efficacy)

• Intention to transfer (WIIFM & perceived support)

• Initiation

• Partial transfer

• Conscious maintenance

• Unconscious maintenance

Page 33: Evaluating Training and Metrics

Behavior - How to Perform

• Evaluate before and after training

• Allow ample time before observing

• Survey key people

• Consider cost vs. benefits

– 100% participation or a sampling

– Repeated evaluations at appropriate intervals

– Use of a control group

Page 34: Evaluating Training and Metrics

Data Collection Questionnaire

• Action plan implementation

• Use of program materials

• KSA application

• Frequency of application

• Measurable improvements

• Improvements linked to program

• Monetary impact

• Barriers

• Enablers

Page 35: Evaluating Training and Metrics

Level IV Results

Page 36: Evaluating Training and Metrics

Results - What Is It?

• Assesses “bottom line,” final results

• Definition of “results” dependent upon the objectives of the training program

Page 37: Evaluating Training and Metrics

Results - What It Looks Like

• Depends upon objectives of training program

– Quantify

• Proof vs. Evidence

– Proof is concrete

– Evidence is soft

Page 38: Evaluating Training and Metrics

Results - How to Perform

• Use a control group

• Allow time for results to be realized

• Measure before and after the program

• Consider cost versus benefits

• Be satisfied with evidence when proof is not possible

Page 39: Evaluating Training and Metrics

L III & L IV – Data Collection

Method                         L III   L IV
Follow-up Questionnaires       Yes     Yes
Observation                    Yes     -
Interviews with Participants   Yes     -
Action Planning                Yes     Yes
Performance Contracting        Yes     Yes
Performance Monitoring         -       Yes

Page 40: Evaluating Training and Metrics

Tools for Evaluation

Tools / Level                           Reactions   Learning   Behaviour   Results
Participant questionnaires or reports   ✓           ✓          ✓           ✓
Manager questionnaires or reports       ✓           ✓          ✓           ✓
Written test or examination                         ✓
Practical test or demonstration                     ✓          ✓
Customer survey                                                ✓           ✓
Employee survey                                                ✓           ✓
Interviews                              ✓           ✓          ✓           ✓
Observation ‘on the job’                                       ✓           ✓

Page 41: Evaluating Training and Metrics

Metrics

Page 42: Evaluating Training and Metrics

Metrics for measuring T & D efficiency

Areas: Key Performance Indicators

Training cost: company training expenditure (% of salaries & wages)

Training hours: average no. of training hours per employee

Training courses: no. of courses offered; no. of courses implemented

Training satisfaction: employee satisfaction with training

Training budget: % of HR budget spent on training; average training cost per employee

Training results: % of employees who have gone through training; average time to competence; % of employees reaching competence after training

Training penetration rate: % of employees completing the course compared to the total no. of employees

E-learning training: e-learning courses utilised; % e-learning pass rate
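Most of these KPIs are simple ratios over headcount and cost data. A minimal sketch computing a few of them (the `training_kpis` helper and all figures are illustrative assumptions):

```python
def training_kpis(completed, headcount, total_hours, training_cost, salaries):
    """Compute a few of the efficiency KPIs from the table above."""
    return {
        "penetration_rate_pct": round(100.0 * completed / headcount, 1),
        "avg_hours_per_employee": round(total_hours / headcount, 1),
        "cost_pct_of_salaries": round(100.0 * training_cost / salaries, 2),
        "avg_cost_per_employee": round(training_cost / headcount, 2),
    }

print(training_kpis(completed=420, headcount=500, total_hours=12000,
                    training_cost=150_000, salaries=6_000_000))
# → {'penetration_rate_pct': 84.0, 'avg_hours_per_employee': 24.0,
#    'cost_pct_of_salaries': 2.5, 'avg_cost_per_employee': 300.0}
```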

Page 43: Evaluating Training and Metrics

[Chart: responses to “I am provided effective coaching by my supervisor to enhance my in-clinic performance”, rated on a 1-4 scale; average scores of 3.2, 3.1, 3.2 and 3.1 across total responses and the Maxima, Critica and Intima groups.]

Page 44: Evaluating Training and Metrics

[Chart: responses to “I get adequate support in handling customer queries”, rated on a 1-4 scale; average scores of 3.2, 3.2, 3.3 and 3.2 across total responses and the Maxima, Critica and Intima groups.]

Page 45: Evaluating Training and Metrics

Post-training data across batches: Leaders Academy

[Charts: average ratings (0-7 scale) from Self, Manager and Direct Reports across Batch 8, Batch 1 and Batch 2 for Step 1: Be Supportive; Step 2: Define the Topic and Need; Step 3: Establish the Impact; Step 4: Initiate a Plan.]

Page 46: Evaluating Training and Metrics

Post-training data across batches

[Charts: average ratings (0-7 scale) from Self, Manager and Direct Reports across Batch 8, Batch 1 and Batch 2 for Step 5: Get a Commitment; Step 6: Confront Excuses & Resistance; Step 7: Clarify Consequences; Step 8: Don't Give Up.]

Page 47: Evaluating Training and Metrics

Impact: Certification Metrics

• Criteria for certification (time status):

1. Develop and present district business plan to RBMs (Oct ’06)

2. Territory-wise analysis as per case study model (Nov ’06)

3. Demonstrate business turnaround in two territories (6 months)

• Present L4 metrics

Two project PSOs chosen for the project (DM / HQ: Nagpur; RBM):

HQ              Before enrolment (YTD Sept ’06)    After enrolment (YTD Mar ’07)
                Growth / Achievement               Growth / Achievement
HQ - Raipur     6% / 98%                           10% / 110%
HQ - Bilaspur   2% / 79%                           18% / 106%

Page 48: Evaluating Training and Metrics

Impact: Advanced Retailing Skills (for Consumer Health Care)

Feedback from a PSO - L3 & L4 metrics

IMPACT METRICS                  Feb ’06 (Pre-Training)   Apr ’06 (Post-Training)   Impact
Retail Calls                    242                      253                       More retailers covered
Outlet Conversion               8%                       16%                       Improved outlet conversion
Personal Order Booking          Rs. 65,000               Rs. 84,000                Increase in POB value (29.2%)
Average booking per outlet      Rs. 288                  Rs. 373                   Enhanced (~30%)
Order booking as % of TGT       11.7%                    15.3%                     Growth

Productive calls: increased from 3 to 5 amongst the 10 retailers covered
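The growth figures on this slide are simple before/after percentages, and they can be checked in one line (the `pct_growth` helper is ours, not from the slides):

```python
def pct_growth(before, after):
    """Percentage growth from a pre-training to a post-training figure."""
    return round(100.0 * (after - before) / before, 1)

# Personal order booking rose from Rs. 65,000 to Rs. 84,000
print(pct_growth(65000, 84000))  # → 29.2, matching the 29.2% on the slide
```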

Page 49: Evaluating Training and Metrics

RETURN ON INVESTMENT

Participants’ input in a leadership program for managers:

Participant   Annual Improvement Value   Reason                                  Confidence   Isolation Factor   Adjusted Value
11            $36,000                    Improvement in efficiency of group      85%          50%                $15,300
42            $90,000                    Turnover reduction                      90%          40%                $32,400
74            $24,000                    Improvement in customer response time   60%          55%                $7,920
55            $2,000                     5% improvement in my effectiveness      75%          50%                $750
96            $10,000                    Absenteeism reduction                   85%          75%                $6,375
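Each adjusted value in the table is the reported annual improvement discounted by the participant's confidence and by the isolation factor (the share of the improvement attributable to training); Phillips-style ROI then compares total adjusted benefits with program cost. A minimal sketch (the $30,000 program cost is a hypothetical figure, not from the slide):

```python
def adjusted_value(annual_improvement, confidence, isolation):
    """Discount a participant-reported benefit by confidence and by the
    isolation factor (share of the improvement attributable to training)."""
    return round(annual_improvement * confidence * isolation, 2)

def roi_pct(total_benefits, program_cost):
    """Phillips Level 5 ROI: net benefits as a percentage of program cost."""
    return round(100.0 * (total_benefits - program_cost) / program_cost, 1)

# First row of the table: $36,000 x 85% x 50% = $15,300
print(adjusted_value(36000, 0.85, 0.50))  # → 15300.0

# The five adjusted values sum to $62,745; assume a $30,000 program cost
print(roi_pct(62_745, 30_000))
```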

Page 50: Evaluating Training and Metrics

Training Failure Analysis

Create Focus / Build Intentionality:

• Trainees did not really need the training

• Training was focused on the wrong people

• Trainees were not prepared/motivated to learn

• Training was not aligned with performance needs

• Trainees were not prepared to apply learning on the job

Provide Quality Learning Interventions:

• Trainees could not learn the material; instruction was not good enough

• Training design was flawed

• Exercises/simulations were irrelevant

• Facilitator did a bad job

Support Performance Improvement:

• No opportunity to use the training

• Trainees did not get support from their manager when trying to use the training

• Trainees gave up on new skills too quickly when running into obstacles

• Lack of peer support

• No incentive to use it

• Lack of feedback/coaching when trying to use it

Page 51: Evaluating Training and Metrics

Your partners in building Excellence…

Accelerate

Thank You

Page 52: Evaluating Training and Metrics

Remember learning to ride a bike?

Page 53: Evaluating Training and Metrics

Leveraging Great Learning

Learning x Follow-through = Results

Page 54: Evaluating Training and Metrics

The Literature on Follow-Through

• Improvement correlates to follow-up

– Goldsmith

• Months of practice

– Goleman

• Involvement of manager

– Brinkerhoff & Montesino

Page 55: Evaluating Training and Metrics

Why Different Results?

Background: 8,000 Fortune 100 managers. All received 360-degree feedback and leadership training. Leadership effectiveness was evaluated 18 months later.

Three distinct groups:

– Little or no improvement

– Moderate improvement

– Significant, sustained improvement

Same course, same company, different results.

Why?

Page 56: Evaluating Training and Metrics

The Difference is Follow-Through

[Charts: distribution of change in leadership effectiveness (percent of managers, x-axis from -3 “less effective” through 0 “no change” to +3 “more effective”) under three conditions: No Follow-Through, Some Follow-Through, and Consistent Follow-Through.]

Goldsmith, M: “Ask, Learn, Follow-up, and Grow,” in Hesselbein et al: Leaders of the Future, 1996

“Consistent or periodic follow-up had a dramatic, positive impact.”

Page 57: Evaluating Training and Metrics

Conclusions - 8 company study

• Real leadership development is a process.

• Almost any follow-up is better than none.

• One of the greatest weaknesses in most training and development is the insufficient attention paid to follow-up.

• The biggest challenge for most leaders is not understanding the practice of leadership; it is practicing their understanding of leadership.

Goldsmith and Morgan, Best Practices in Organizational Development, in press, 2003

Page 58: Evaluating Training and Metrics

Months of Practice are Required

Why does emotional intelligence competence take months rather than days?

Because the emotional centers of the brain, not just the neocortex, are involved. To master a new behavior, the emotional centers need repetition and practice.

The more often a behavioral sequence is repeated, the stronger the underlying brain circuits become.

At some point, the new neural pathways become the brain’s default option.

Goleman, “Leadership that Gets Results” Harvard Business Review March 2000

Page 59: Evaluating Training and Metrics

Manager Involvement

Learners who had pre/post course discussions with their managers (on new skills, applications, etc.) reported significantly higher skill levels and success.

Brinkerhoff & Montesino, Partnerships for Training Transfer, HRD Quarterly Fall 1995

Page 60: Evaluating Training and Metrics

What Hinders Follow-Through

• Barriers to Transfer

• High vs. Low Performance Factors

• The Knowing/Doing Gap

Page 61: Evaluating Training and Metrics

Barriers to Transfer

• Lack of reinforcement on the job

• Non-supportive organizational climate

• Learners: new skills are impractical, irrelevant

• Separation from instructional source

• Negative peer pressure

Broad & Newstrom, Transfer of Training, 1992

Page 62: Evaluating Training and Metrics

High vs. Low Performance Factors

• High-performance learners:

– Explored content before training

– Held pre/post discussions with their managers

– Had a clear idea of how to apply new skills

– Practised frequently after training

• Low-performance learners and their managers:

– Had none of the above factors supporting use of new skills

Feldstein & Boothman, In Action: Transferring Learning to the Workplace, 1997 ASTD

Page 63: Evaluating Training and Metrics

Final thoughts…

Training should move from being a service provider to an internal consultant: passion

Develop rapport proactively with internal KOLs: speak the customer’s language

Move out of the comfort zone: innovate and benchmark with other industries

Adopt new technology at the speed of thought: ensure stakeholder buy-in

Maintain a relentless focus on the top line and the bottom line

Take pride in your role: you are a Life Changer!

Page 64: Evaluating Training and Metrics

HOW EFFECTIVE IS YOUR TRAINING PROGRAM?

• Broad and Newstrom (1992) report that studies have shown less than 30% of what is actually taught transfers to the job in a way that enhances performance.

• Source: Broad, M., & Newstrom, J. W. (1992). Transfer of training: Action-packed strategies to ensure high payoff from training investments. Reading, MA: Addison-Wesley.

Page 65: Evaluating Training and Metrics

THANK YOU

Page 66: Evaluating Training and Metrics

Research

“If we knew what we were doing, it wouldn’t be research.”

- Albert Einstein

Page 67: Evaluating Training and Metrics

The Indian Scenario– a research

• The sample consisted of the following eight companies:

• Godfrey Phillips India Ltd.

• Mahindra & Mahindra Limited

• Indian Oil Corporation.

• Tata Power Ltd.

• Larsen & Toubro Ltd.

• Tata Motors Ltd.

• Johnson & Johnson Ltd.

• The Indian Hotels Company Ltd.

Page 68: Evaluating Training and Metrics

Training Evaluation - Kirkpatrick

In Kirkpatrick's four-level model, each successive evaluation level is built on information provided by the lower levels.

Page 69: Evaluating Training and Metrics

The Indian Scenario– a research

Organisation: ITES

Need: Team Leader Effectiveness

Specific area evaluated: Team Huddles

Process: Pre/Post Training scores- 8 E

Outcomes: Improved scores

Drop in absenteeism

Reduced attrition

Page 70: Evaluating Training and Metrics

Training Evaluation –Recommendations

• Build capacity by doing research & benchmarking

• Train the team to understand the basics of TE

• Quantify information before the intervention: get stakeholder involvement

• Set specific targets for evaluation at all levels

• Decide on specific measurements

• Allocate resources

• Integrate TE with performance management

Page 71: Evaluating Training and Metrics

Training Evaluation -Recommendations

• Start small: one course as a pilot

• Focus on a small sample size: simplify

• Share results with trainers and the entire organization

• Celebrate success stories with stakeholders

• Design improvement plans

Page 72: Evaluating Training and Metrics

Parting Words

“If you always do what you always did, you’ll always get what you always got.”

- Anon

Page 73: Evaluating Training and Metrics

The Last Word

“In the last analysis, management is practice.

Its essence is not knowing but doing.

Its test is not logic but results.

Its only authority is performance.”

Peter Drucker

Page 74: Evaluating Training and Metrics

Thank You