
welcome

[email protected]

Andrew Downes, Learning and Interoperability Consultant

Learning Evaluation: Beyond Surveys

key takeaways

Good learning evaluation is more complex than simple surveys

A seven-step model (with worksheets) that you can implement.

Practical advice and case studies you can follow to get started.

More than surveys

Evaluation in Learning and Development

“Oh! You’re a survey tool?”

Evaluation in other industries

evaluation models

01 Kirkpatrick – Four levels of evaluation

02 Kaufman – Five levels of evaluation

03 Phillips – Return on Investment

04 Anderson – Value of learning

05 Brinkerhoff – Success case method

Kirkpatrick
1. Reaction – did the training feel useful?
2. Learning – did they learn what they were supposed to?
3. Behavior – did they change how they worked?
4. Results – did the business metric improve?

Kirkpatrick
• Useful, well-known starting point
• Higher levels are more important
• Lower levels give faster warning of problems
• All four levels are important

Kaufman
• Based on Kirkpatrick
• Considers societal and customer consequences
• Splits Kirkpatrick's Level 1 into:
  • Input – quality of resources
  • Process – delivery of learning experiences

Kaufman
• Useful to evaluate learning resources separately from experiences
• Societal/customer impact is usually either too far removed to evaluate or already included in business metrics

Phillips
• Adds a fifth level to Kirkpatrick – Return on Investment (ROI)

ROI = ($ benefit of training - cost of training) / cost of training  (worked example below)

Costs include:
• Technology costs
• Learning team time
• Time attending/completing training

Phillips
• ROI should underpin the Level 4 business goal.
• Figure out ROI at the start of the project.
• If the ROI is not going to be acceptable, either the project budget is too high or your business goal is wrong.
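To make the formula concrete, here is a minimal Python sketch using made-up figures: a $120,000 measured benefit against $80,000 of total costs (technology, learning team time, and time spent attending/completing the training).

def roi_percent(benefit: float, cost: float) -> float:
    """Phillips ROI: net benefit of the training divided by its cost, as a percentage."""
    return (benefit - cost) / cost * 100

# Hypothetical figures for illustration only.
print(roi_percent(120_000, 80_000))  # -> 50.0, i.e. a 50% return on investment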

Anderson
3-stage cycle:
1. Determine current alignment
2. Assess and evaluate learning's contribution
3. Establish the most relevant approaches

• Is a learning program that meets its goals successful if those goals are bad?
• Goals need to be aligned to the organization's strategic priorities

Anderson's Value of Learning

What metrics do my stakeholders need? The model plots this on two axes – emphasis on short-term vs. long-term benefits, and trust in the value of learning vs. a need for evidence of its value – giving four types of measure:
• Trust in the value of learning + short-term emphasis → learning function effectiveness measures
• Trust in the value of learning + long-term emphasis → return on expectation
• Need evidence of the value of learning + short-term emphasis → return on investment
• Need evidence of the value of learning + long-term emphasis → measures against industry benchmarks

Anderson
• It's very important that Learning's goals support organizational priorities
• Use the model alongside other methods that evaluate specific learning programs and experiences

Brinkerhoff
Any learning program, no matter how good or bad, will have good and bad elements.
1. Identify the very best successes and very worst failures of the program
2. Conduct interviews and other research around those successes and failures to learn lessons
3. Promote success stories to market the program

Brinkerhoff
• Best to use alongside, not in place of, previous evaluation models
• Use the success case method to dig deeper, learn lessons and shout about successes

seven steps of evaluation

Align → Define → Discover → Design → Monitor → Analyze → Explore

https://www.watershedlrs.com/blog/

01

Step 1: Align

Identify program goals and evaluate alignment with strategic priorities.

• Define program goals
• Evaluate how closely goals align with strategic priorities
• Decide whether or not to proceed as planned

01

Step 1: Align

Who are your stakeholders?

02

Step 2: Define

Identify success metrics most appropriate to the organization.

• Identify reporting stakeholders and metrics that will meet their needs.

• Define the set of metrics to monitor and analyze the program

• Expect these metrics to change and reduce in scope during the discover and design steps

02

Your learning design is mirrored by your evaluation metrics design:

4. Business goal ↔ Was the goal achieved?
3. What do people need to do to achieve that? ↔ Are people doing what they need to do?
2. What do people need to learn to do that? ↔ Are people learning what they need to learn?
1. What program is needed for them to learn that? ↔ Is the program working?

03
Step 3: Discover

Identify what learning is already happening that supports the program’s goals.

• Identify formal and informal learning within your organization that relates to the program’s goals.

• Evaluate how effective these learning experiences are.

• Determine the extent to which the learning is positive (are people learning the right thing?)

04
Step 4: Design

Design how evaluation metrics will be captured, aggregated, and displayed.

• Evaluate the feasibility of evaluation metrics and finalize your list.

• Design monitoring dashboards and analysis reports.

05

Step 5: Monitor

Continually monitor success and progress towards the program goal.

• Identify any problems early on.
• Quickly implement fixes for the problems.

06

Step 6: Analyze

Analyze data in detail at key points in the program.

• Determine whether or not program goals were achieved
• Evaluate the reasons why
• Collect success stories
• Document lessons learned

07
Step 7: Explore

Research further into particularly successful and unsuccessful elements

• Create detailed case studies of success and failure

• Promote the program within your organization and beyond

Getting started with xAPI and an LRS

“You can’t eat an elephant all at once”

Start with one or two ‘easy’ data sources to prove the concept.

Level 1: Learning resources and experience

Data about utilization from the LMS, intranet, etc.
• xAPI integration (see the sketch below)
• xAPI Apps attendance tool
• CSV import

Quality of resources & experience data from surveys
• SurveyGizmo
• Recommended xAPI authoring tool
• Google Forms with Zapier
• CSV import from another survey tool
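To show what the "xAPI integration" option can look like in practice, here is a minimal Python sketch that sends one utilization statement to an LRS via the standard xAPI statements endpoint. The endpoint URL, credentials, learner, and activity ID are hypothetical placeholders, not Watershed-specific values.

import requests

# Hypothetical LRS endpoint and credentials - replace with your own.
LRS_ENDPOINT = "https://lrs.example.com/xapi"
AUTH = ("lrs_key", "lrs_secret")

# A minimal "experienced" statement recording that a learner viewed a resource.
statement = {
    "actor": {"name": "Example Learner", "mbox": "mailto:learner@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "https://example.com/intranet/onboarding/module-1",
        "definition": {"name": {"en-US": "Onboarding Module 1"}},
    },
}

response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=AUTH,
)
response.raise_for_status()  # the LRS responds with the stored statement's ID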

Level 2: What did they learn?

Data from learning assessments

• Recommended xAPI authoring tool
• Recommended xAPI LMS assessment
• In-house app/platform
• CSV import
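For assessment data, the key addition is the xAPI result object, which carries the score alongside pass/fail and completion flags. Here is a sketch of the kind of statement an assessment tool might record, with hypothetical actor and activity IDs; it would be posted to the LRS exactly as in the Level 1 sketch above.

# Hypothetical assessment statement with a result object.
assessment_statement = {
    "actor": {"name": "Example Learner", "mbox": "mailto:learner@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/passed",
        "display": {"en-US": "passed"},
    },
    "object": {
        "id": "https://example.com/assessments/safety-quiz",
        "definition": {"name": {"en-US": "Safety Quiz"}},
    },
    "result": {
        "score": {"raw": 17, "min": 0, "max": 20, "scaled": 0.85},
        "success": True,
        "completion": True,
    },
}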

Level 3: Did they change?

Job performance data from surveys

• SurveyGizmo
• Recommended xAPI authoring tool
• Google Forms with Zapier
• CSV import from another survey tool

But: how accurate is job performance survey data?

See https://hbr.org/2015/02/most-hr-data-is-bad-data

Job performance data from observations

• xAPI Apps

Integration with real job tools

• xAPI integration
• Zapier integration
• Stand-alone connector
• CSV import

Level 4: Business metrics

Data about the business metric the learning is designed to impact

• xAPI integration
• Zapier integration
• Stand-alone connector
• CSV import

Often the data set is small enough that CSV is the most sensible option (a minimal sketch follows below).
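Where the business metric arrives as a small CSV export rather than through an integration, a few lines of Python are usually enough to check whether the metric moved. The file name and column names below are made up for the example.

import csv

# Compare a business metric before and after the learning program.
# "sales_by_month.csv" and its columns ("period", "revenue") are hypothetical.
with open("sales_by_month.csv", newline="") as f:
    rows = list(csv.DictReader(f))

before = [float(r["revenue"]) for r in rows if r["period"] == "before"]
after = [float(r["revenue"]) for r in rows if r["period"] == "after"]

print("Average revenue before:", sum(before) / len(before))
print("Average revenue after: ", sum(after) / len(after))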

Case studies

Conducting an A/B test – and saving money

Content utilization – what's working and who's engaged?

Monitoring progress – towards pre-defined milestones

Comparing learning performance – with life-saving KPIs

* Not real data

thank you.

Seven steps
Read the blog & download the worksheet:
• https://www.watershedlrs.com/blog/watersheds-seven-steps-of-learning-evaluation

• https://www.watershedlrs.com/blog/define-metrics-for-learning-evaluation

• https://www.watershedlrs.com/blog/learning-evaluation-whats-happening

• https://www.watershedlrs.com/blog/design-evaluation-metrics-learning-evaluation

• https://www.watershedlrs.com/blog/monitoring-success-learning-evaluation

• https://www.watershedlrs.com/blog/analyze-outcomes-learning-evaluation

• https://www.watershedlrs.com/blog/the-final-step-of-learning-evaluation-explore-the-outcomes

https://www.watershedlrs.com/blog/all