EFFECTIVE METRICS AND CONTINUOUS IMPROVEMENT FOR THE L&D FUNCTION 15 APRIL 2015
Daniel Mitchell Talent, Learning & Diversity Leader, APAC Marsh
MARSH
A request comes through from the business…
1 13 July 2015
To: (your name)
From: Jolin Tsai, General Manager of Production
Re: Motivation problems – need training
Dear (name):
Recently I’ve noticed quite an increase in staff turnover among production line workers in plant 2. While plants 1 and 3 have relatively stable retention rates, plant 2’s turnover has been steadily increasing over the past 6 or 7 months. A quick survey of production line workers suggests the problem is that our production supervisors are in serious need of communication skills training. Staff reported, in the survey and in conversations with me, that they feel supervisors don’t have their best interests in mind, are self-serving in their management style, are too pushy and demanding, and generally lack solid communication and leadership skills.
Could you help me find a suitable vendor with experience working with our kind of organization who can help us identify what’s up with our supervisors and suggest a training program to address these issues?
Thanks, Jolin.
Our agenda
• Understanding needs and the individual performer
• Analyzing and assessing needs – have an approach
• CEOs – are they concerned?
• Why learning design matters to measurement
• Metrics: The Kirkpatrick Model & The Brinkerhoff “Success Case Method”
• The Kirkpatrick Model: working with the levels
• Why Remembering and Forgetting matter
• The Brinkerhoff “Success Case Method”
• Six Continuous Improvement Practices for the L&D Function
MARSH
Back to that request…
How do you respond?
“I can help you solve your problem...” But after you make this statement to your client, what do you do next?
Yes or no?
• Agree to provide training (or whatever solution was requested)?
• Offer your support?
• Create a plan for conducting more in-depth discussion and analysis?
• Say, “…but I won’t! I’m too busy and you should first think about how to solve your own problems before coming to me!”
Before doing anything, what are some of the questions you might ask your internal ‘client’ when they request a performance improvement intervention?
1. What triggered this request?
2. In an ideal world, what would you love to see happening?
3. Compared to this ideal state, what specifically is happening now?
4. What new behaviors would you have to see to be convinced that all is right in your team/business/country?
5. Others?
Understanding “The Anatomy of Performance”
[Diagram: Geary Rummler’s “Anatomy of Performance”. The Business Environment (the economy, the culture, the country, legislation) surrounds the organization. Management (A) plans performance — expectations and goals set, plans operational, support requirements determined and in place — and (B) manages it — performance/behavior monitored, gaps analyzed and causes determined, action taken to correct, prevent, or sustain. (C) Performance is delivered through the primary process, which draws on resources (capital, human resources, materials, technology) within the organization system & culture, competes for orders for products/services in the consumer market, and delivers products/services to customers and earnings/returns to capital markets and financial stakeholders. © Geary Rummler]
Impacting performance begins with understanding how individuals deliver results
Knowledge, Skills, Abilities / Attitudes → Performer → Behaviors → Tasks → Job Outputs → Process Outputs → Organization Outputs

Knowledge or skill input is provided to the performer, either through education or training, or through on-the-job experience.
The performer uses their knowledge, skills and abilities to produce certain behaviors in the workplace.
Behaviors either contribute to or inhibit execution of relevant job tasks.
The accumulated job tasks produce specific job outputs.
The performer’s job outputs lead to process outputs. All jobs in the organization are part of and contribute to work processes.
These process outputs contribute to organizational outputs, which may or may not be in line with the organization’s goals.

© Geary Rummler
Begin by understanding the “should”
Knowledge, Skills, Abilities / Attitudes → Behaviors → Tasks → Job Outputs → Process Outputs → Organization Outputs

A sales-role example:
• Knowledge & skills: interview planning, listening, questioning, reinforcement, and summarization skills; knowledge of meeting guidelines
• Behaviors / tasks: establish meeting objectives & time parameters; ask questions; collect examples; diagram answers where possible; restate points to verify; ask for other data sources; summarize what you learned
• Job outputs: information gathered; needs identified; decision-makers and users identified; constraints determined; credibility established
• Process outputs: qualified leads; appointments; opportunities; proposal requests; proposals
• Organization outputs: customer orders / sales; profits

© Geary Rummler
The “is” (or “often is”)
What questions would you ask to understand the “is” and the “should”?
How might you use this tool to better understand the situation Jolin Tsai has told you about in her email? Who might you focus on: production line workers or production supervisors?
An alternative approach: The “Human Performance System” view

PERFORMER
- Necessary understanding and skill to perform
- Capacity to perform (intellectually, emotionally, etc.)
- Willingness to perform (given the incentives available)

INPUT
- Clear or sufficiently recognizable indications of the need to perform
- Minimal interference from incompatible or extraneous demands
- Necessary resources (budget, personnel, equipment) to perform

OUTPUT
- Adequate and appropriate criteria (standards) with which to judge successful performance

CONSEQUENCES
- Sufficient positive consequences (incentives) to perform
- Few, if any, negative consequences (disincentives) to perform

FEEDBACK
- Frequent and relevant feedback on how well (or how poorly) the job is being performed

© Geary Rummler
And yet another: The “Performance Analysis Grid” Decide if training is the answer, or not
Rate the employee on two axes, each from 1 to 10:
• Vertical: Does the employee have adequate job knowledge and skills?
• Horizontal: Does the employee have the proper attitude (desire) to perform the job?

The resulting quadrants point to different interventions:
• High skills, low attitude → Motivation
• Low skills, high attitude → Training
• Low skills, low attitude → Selection / Discharge / Transfer
• High skills, high attitude → Resources / Work Environment
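The grid’s quadrant logic can be sketched as a small decision function. This is an illustrative sketch only, not part of the original tool: the midpoint cutoff of 5 on the 1–10 scales is an assumption, and a real assessment would weigh the ratings with far more judgment.

```python
def recommend_intervention(skills: int, attitude: int, threshold: int = 5) -> str:
    """Map 1-10 ratings of job knowledge/skills and attitude (desire)
    to the four interventions of the Performance Analysis Grid.

    The midpoint threshold is an illustrative assumption.
    """
    has_skills = skills > threshold
    has_attitude = attitude > threshold
    if has_skills and has_attitude:
        # Able and willing, yet underperforming: look outside the person
        return "Resources / Work Environment"
    if has_skills:
        # Able but unwilling
        return "Motivation"
    if has_attitude:
        # Willing but unable
        return "Training"
    # Neither able nor willing
    return "Selection / Discharge / Transfer"

# A willing performer who lacks skills points to training
print(recommend_intervention(skills=3, attitude=8))  # Training
```

Note how the function makes training the answer in only one of four quadrants, which is the slide’s point: decide whether training is the answer before prescribing it.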
Metrics & Continuous Improvement
Which process you use is less important than the questions it raises. The important question for L&D is this: to what extent does what we do, plan to do, or are asked to do make it more likely that behaviors change in a way that supports sustainable performance improvement?
CEOs are concerned about the lack of credible metrics
“Such figures might be understandable in the context of general-purpose training without any business objectives. But let’s imagine a bank that knows its sales performance could improve if call-center employees were better at identifying unmet customer needs. A range of skills might be relevant to achieve this objective. Assessing which skills really affect sales performance and applying metrics that show how well employees deploy them are critical for allocating training resources effectively and for actually boosting sales.”
Source: “Do your training efforts drive performance?”, www.mckinsey.com, March 2015, by Richard Benson-Armer, Silke-Susann Otto, and Nick van Dam
A bit on designing learning interventions
Adapted from work done by Will Thalheimer (www.atworklearning.com)
Metrics: The Kirkpatrick Model & The Brinkerhoff “Success Case Method”

What strengths does this model have? Weaknesses? What successes have you had with it? What are your concerns about it?
MARSH
The Kirkpatrick Model Level 1 (Reaction): Uses and Misuses
Level 1 evaluation focuses on the reaction of participants to the training program. Although this is the lowest level of measurement, it remains an important dimension to assess in terms of participant satisfaction.
Best practice: Ask learners not just about satisfaction, but also about how valuable the training is to job performance.
Why: Asking about value is more relevant and meaningful.

Best practice: Ask learners to evaluate each topic separately in terms of its value.
Why: Learners are better at assessing details than generalities.

Best practice: Ask learners these questions after they’ve returned to the job and can really evaluate the value.
Why: This enables learners to focus on relevant applicability.
The Kirkpatrick Model Level 1 (Reaction): Uses and Misuses
Example evaluation item:

Learning concept: “Measuring retrieval is essential (even if we measure on-the-job performance and results) because retrieval is required for on-the-job application. It is on the causal pathway from learning to performance and results.”

Value of this concept (circle one number): Low 1 2 3 4 5 6 High

Circle one: 1. Concept was new to me. 2. Deepened earlier understanding. 3. Provided nice reminder. 4. I already use concept regularly. 5. Most people already know this.

Circle one: 1. Taught really well. 2. Taught well. 3. Taught inadequately. 4. Taught poorly.
The Kirkpatrick Model Level 2 (Learning): Uses and Misuses
Level 2 evaluation determines whether the participants actually learned what they were supposed to learn as a result of the training session. It measures the participant’s acquisition of cognitive knowledge or behavior skills.
Best practice: Measure learning at the end of the program and again after a week or more; or measure only after a week or more.
Why: Delayed measurement is a better predictor of on-the-job retrieval.

Best practice: Avoid memorization questions on critical information, unless it is directly prerequisite or relevant to performing on the job.
Why: Memorization questions are not as relevant as other question types.

Best practice: Use scenario-based decision-making questions.
Why: Such questions are moderately realistic and predictive.

Best practice: Use simulations as measurement instruments.
Why: Simulations are relatively realistic and predictive.

Best practice: If assessing to build learner understanding, provide substantial feedback and more practice.
Why: Feedback and additional practice are helpful for new material.

Best practice: If assessing to boost remembering, use repeated, realistic retrieval-practice opportunities.
Why: Repeated retrieval practice bolsters remembering.
A note on “Forgetting”
Forgetting is more likely than additional learning, because so much of what we learn is not used routinely.
Don’t mislead yourself: avoid doing level 2 assessments at the moment learning ends (unless measurement is meant simply for compliance purposes).

What might lead to more learning, even after training ends? What leads to forgetting? What’s L&D’s level of influence? How?
MARSH
The Kirkpatrick Model Level 3 (Application): Uses and Misuses
Level 3 evaluation focuses on the degree to which training participants are able to transfer learning to their workplace behaviors.
Best practice: Determine, in advance, what on-the-job (or real-life) improvements you expect the training to facilitate, and measure that or some proxy for it.
Why: It’s good to measure what you most care about.

Best practice: Work with management to reinforce behaviors through performance management and recognition / reward programs.
Why: If people think new behaviors are “nice to have” but won’t be recognized, reinforced, or come with accountability, why bother?

Best practice: Consider legal requirements. Tests must validly and reliably predict job performance and must not create unfair disadvantages; it is not acceptable to document validity post hoc.
Why: If your tests don’t meet legal requirements, your organization may face legal consequences.
The Brinkerhoff “Success Case Method” What are the steps?
Step 1: Clarify business goals and training process and costs; complete Impact Analysis Profile
Step 2: Design and administer brief survey to sample of trainees (long enough after training for learning to have been applied)
Step 3: Analyze survey data; gauge scope of impact and identify success and non-success cases
Step 4: Conduct success case interviews (usually by phone, sometimes in person; 20–30 minutes)
Step 5: Analyze all impact and performance support data; communicate results!
How is this method different from other common methods of measurement? What do you think is meant by “communicate results”?
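Steps 2 and 3 — survey a sample of trainees, then gauge the scope of impact and flag the success and non-success cases for interviews — can be sketched as a small triage routine. This is an illustrative sketch only: the 1–6 impact scale and the score cutoffs are assumptions, not part of Brinkerhoff’s published method.

```python
def split_cases(responses, high=5, low=2):
    """Sort post-training survey responses into success and non-success
    cases for follow-up interviews (Success Case Method, step 3).

    Each response is (trainee_id, impact_score); the assumed 1-6 scale
    and the high/low cutoffs are illustrative.
    """
    successes = [tid for tid, score in responses if score >= high]
    non_successes = [tid for tid, score in responses if score <= low]
    # Rough gauge of the scope of impact across the surveyed sample
    scope = len(successes) / len(responses)
    return successes, non_successes, scope

# Hypothetical survey data from five trainees
responses = [("a", 6), ("b", 2), ("c", 5), ("d", 3), ("e", 1)]
wins, misses, scope = split_cases(responses)
print(wins, misses, round(scope, 2))  # ['a', 'c'] ['b', 'e'] 0.4
```

Interviewing both extremes is what distinguishes the method: the success cases show what impact looks like when training works, and the non-success cases show what blocked it.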
The Kirkpatrick Model Level 4 (Impact): Uses and Misuses
Level 4 evaluation moves beyond the training participant to assess the impact of training on organizational performance. Before level 4 can be done, levels 1–3 need to be measured.
Best practice: Keep metrics limited, and focus on those most influenced by a small number of employee behaviors.
Why: Organizational results are affected by many variables, not just training.

Best practice: Where possible, use metrics the organization already measures, and use comparison-group strategies to isolate the effects of learning from other factors.
Why: Avoid creating new metrics that management doesn’t already look at.

Best practice: Conduct level 4 evaluation when the course is expected to be part of a core curriculum and have a long life.
Why: Level 4 is not easy to measure; don’t burn yourself out.

Best practice: Conduct level 4 evaluation when the course has high visibility with, and importance to, senior management.
Why: It puts L&D in a position to partner with management on improving learning-investment decisions.
The Kirkpatrick Model Level 4 (Impact): How it works
Step 1: Conduct the business or needs analysis, and determine the metric to be measured (e.g. Management: specific measures of employee engagement)
Step 2: Develop your evaluation plan (should include all four levels)
Step 3: Design & develop your instruments and methodologies, e.g. surveys or questionnaires (with confidence levels); consider comparing against a control group
Step 4: Collect your data
Step 5: Analyze and interpret. Look specifically for impact in your chosen business metric(s). Using a control group might allow you to measure impact of the training vs. performance of a group with no training.
Adapted from Chapter 31: Level 4 – Results (ASTD Handbook for Workplace Learning Professionals) by Donald V. McCain (2008)
Who has experience measuring learning impact? Was it useful? How? How can you avoid conflating results with other drivers of performance?
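The control-group comparison in step 5 amounts to subtracting the untrained group’s change in the chosen business metric from the trained group’s change, so that background trends don’t get credited to the training. A minimal difference-in-differences sketch, with made-up engagement scores (a real evaluation would also test statistical significance and check that the groups are comparable):

```python
def diff_in_diff(trained_before, trained_after, control_before, control_after):
    """Estimate training impact on a business metric by subtracting the
    control group's change from the trained group's change."""
    def mean(xs):
        return sum(xs) / len(xs)
    trained_change = mean(trained_after) - mean(trained_before)
    control_change = mean(control_after) - mean(control_before)
    return trained_change - control_change

# Hypothetical employee-engagement scores before/after a management course
impact = diff_in_diff(
    trained_before=[60, 62, 58], trained_after=[70, 72, 68],
    control_before=[61, 59, 60], control_after=[63, 61, 62],
)
print(impact)  # 8.0: a 10-point rise, minus the 2-point background trend
```

Without the control group, the full 10-point rise would be attributed to the training, conflating it with whatever else moved engagement over the same period.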
Six Continuous Improvement Practices for the L&D Function 1. Don’t just take orders; assess and analyze needs
Which process you use is less important than the questions it raises. The important question for L&D is this: to what extent does what we do, plan to do, or are asked to do make it more likely that behaviors change in a way that supports sustainable performance improvement? Help managers see learning differently.
Six Continuous Improvement Practices for the L&D Function 2. Don’t forget the 90%
Six Continuous Improvement Practices for the L&D Function 3. Don’t treat face-to-face delivery like the most important step
• While many of you might agree with this point, a simple analysis of your annual learning spend would probably reveal that the vast majority goes to live training and other “formal” interventions.
• Consider investing in:
– Learning infrastructure (e.g. resource guides)
– Coaching
– Assessment & development centers
– Broadening & deepening talent reviews
– Better on-boarding
– Career maps and similar programs
Six Continuous Improvement Practices for the L&D Function 4. Evaluate impact, but do so selectively
Time – Effort – Resources
Six Continuous Improvement Practices for the L&D Function 5. Partner with the business
[Slide graphic] What’s your manifesto? Business-focused examples of L&D best-practice activities:
• Employee engagement / retention
• Seamless in executing the fundamentals
• Guided by better metrics and measurement
• Effective in supporting line managers
• Cost effective
• More strategic

Activities: performance analysis; organizational-performance mindset; myth-busting; focusing on the 80/20; leadership & manager coaching; recognition; talent-differentiated learning
Average retention rates from various learning modes:
Lecture: 5%
Reading: 10%
Audio-visual: 20%
Demonstration: 30%
Learner-focused discussion: 50%
Learner practice in learning context: 75%
Immediate application of learning in a real situation: 90%

Everyone’s seen this, right? Is it “right”?
Six Continuous Improvement Practices for the L&D Function 6. Continuously educate yourself and challenge convention
There’s a surprising amount of misinformation out there about learning and performance improvement.
What works…
What doesn’t…
What delivers the best results…
Experienced senior leaders in organizations are sometimes the worst ambassadors of bad ideas (largely because, like others, they believe the way they learned is THE best way).
Be a proponent of good ideas backed by research.
Adapted from www.willatworklearning.com by Will Thalheimer, Ph.D.
Dr. Michelene Chi of the University of Pittsburgh (one of the world’s leading authorities on expertise) said this about the graph: “I don’t recognize this graph at all. So the citation is definitely wrong, since it’s not my graph.”
Resources & Acknowledgements
• Will Thalheimer, Ph.D. (www.willatworklearning.com)
• The ISPI Handbook of Human Performance Technology
• The ASTD Handbook for Workplace Learning Professionals
• www.google.com
• Experience in the trenches learning (hopefully) from my mistakes