
Measuring the impact of training


Garry Platt explores what is needed to prove training’s effectiveness

The default setting for the evaluation of training in many organisations is the issuing of a 'happy sheet' at the conclusion of an event, usually at 4:55pm when people just want to go home. Participants hastily tick through a series of vaguely defined questions, the results of which are then held up as evidence of the effectiveness of the training. Duh!

The missed opportunity to objectively assess and review how training is being undertaken, and the results it is achieving, is a sad reflection on the status and function of training in many organisations. Because the training function fails to undertake much significant analysis of what training is achieving, and then to act on the feedback, the role is often seen as a peripheral activity within the organisation, and usually one of the first to experience financial or staffing cutbacks when times are hard. This should not be a surprise and is, in fact, entirely appropriate. Any department that fails to ensure that what it does is of real benefit, and expends no energy in confirming this, really deserves to be first up against the wall come the revolution!

What most learning or training departments need is a structured approach to evaluating and measuring training. Kirkpatrick1, in his original 1959 work, Evaluating Training Programs, made giant steps towards creating this more structured approach but failed, in some respects, to define in detail what can be done or to give detailed examples. Consequently, many of his ideas languished and were not taken up. The strategy proposed here is a variant of the original Kirkpatrick model and looks at a number of different levels.

1 Kirkpatrick, D. L. (1959) Evaluating Training Programs, 2nd ed., Berrett-Koehler, San Francisco.


Level 1: Response – This level looks at the experience the individual has during the learning event. It is important and relevant: our reactions and immediate experiences colour and influence how we receive and retain the learning imparted.

Level 2: Knowledge – In order for learning to take place, knowledge must be acquired and information and data must be imparted to the learners. Without 'know-how' there can be no resulting action or behaviour.

Level 3: Behaviour – Learning must result in the demonstration and application of new skills and behaviours. If it doesn't, learning is pointless. In creating and defining our training events we must ensure that individuals get to practise and receive feedback on their actions and behaviours.

Level 4: Job – Participating in learning should be focused on achieving changes in workplace results. The job the individual does should be undertaken more effectively or more efficiently as a result of the learning acquired.

Level 5: Organisation – The host organisation should ultimately benefit from the training delivered. The bottom line of the organisation should be impacted as a result of the learning achieved.

These five levels create a framework around which a comprehensive and practical approach to evaluation might be built: a structured approach that will supply answers to the following three essential questions:

1. Which aspects of development are working and which are not?
2. What changes to the training design and delivery need to take place?
3. What different types of value has learning contributed to the organisation?

Each of these levels presents its own challenges and problems, but each can be approached in a creative and constructive way. However, before we can even begin to consider measuring training we need to be clear about why we are doing the training. So to start with, we need to look at how to identify true training needs and what that means, and determine what the programme content should be.

Occasionally, training or development is used as a substitute for discharging management responsibility, such as giving someone unpalatable feedback. Instead, the individual is sent on a training course. This type of training will always be valueless. A clear definition is required of what is a training issue and what is a 'management' issue. This is most clearly illustrated in a flowchart (see figure 1), which shows that training should, in effect, be the last port of call for creating workplace change.

Once appropriate training has been identified, we must be clear about what it will achieve by developing training objectives. The absence, or poor quality, of training objectives is often the second barrier to effective evaluation of training, as we have no clear definition of what the training is intended to do. Training objectives are often vague, indeterminate statements such as: 'At the conclusion of the training the learner will be familiar with the process of undertaking a management appraisal.' This statement might initially sound perfectly acceptable as a training objective, but in fact means nothing. What does 'familiar with' mean? What does it actually define? The answer is nothing. The statement can be interpreted in any number of ways and consequently determines nothing.

There is a way to write training objectives, using the R.F. Mager system of defining a behaviour, then the standard to which that behaviour must be demonstrated, and the conditions under which that behaviour should be conducted. Behaviour, standard and condition together define very precisely what is to be undertaken and done. So now the objective, in this example, becomes:

Behaviour: Undertake a management appraisal.

Standard: As defined in the company handbook, Section 3, 'Undertaking a Performance Management Review', pages 23 to 32, with no errors, incorrect sequencing or omissions.

Conditions: (Training) In a simulated role play. (Workplace) In the office.
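To make the behaviour, standard, condition structure easier to apply, here is a minimal sketch of how such an objective might be captured as a simple record and checked for completeness. The class and field names are illustrative assumptions for this article, not part of the Mager system or of any particular tool.

    from dataclasses import dataclass

    @dataclass
    class TrainingObjective:
        """One objective in behaviour-standard-condition form (illustrative only)."""
        behaviour: str    # the observable action the learner must perform
        standard: str     # how well it must be done, and against which reference
        conditions: dict  # the circumstances, e.g. {"training": ..., "workplace": ...}

        def is_complete(self) -> bool:
            # A usable objective states all three elements, not just the behaviour.
            return bool(self.behaviour.strip() and self.standard.strip() and self.conditions)

    # The appraisal example from the text, expressed in this form:
    appraisal = TrainingObjective(
        behaviour="Undertake a management appraisal",
        standard="As defined in the company handbook, Section 3, pages 23 to 32, "
                 "with no errors, incorrect sequencing or omissions",
        conditions={"training": "in a simulated role play",
                    "workplace": "in the office"},
    )
    assert appraisal.is_complete()

The point of structuring objectives this way is that each element can later be checked: the behaviour tells us what to observe, the standard tells us how to mark it, and the conditions tell us where the evidence should come from.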


Once we have established what is appropriate to train and what that training is to achieve, the five evaluation levels now come into play. The training should be designed in a way that integrates a range of measurement strategies that will supply useful data as to the real success of the learning delivered. Some of the issues and options to consider are as follows.

Level 1: Response
• What can you and can't you ask in an end-of-course review sheet?
• Marking strategies: four- or five-box models?
• Analysing feedback
• Alternatives to review sheets: detailed lesson plan feedback
• Immediate response posters
• Immediate reactionnaires
• Using electronic options, e.g. Survey Monkey.

Level 2: Knowledge
• Types of exam/test questions
• Determining a fair pass mark
• Creating PowerPoint quizzes
• Creating paper-based quizzes
• Using learning logs
• Using class reviews
• Creating debates and dialogues.

Level 3: Behaviour
• Understanding defensible assessment
• Determining what we are looking for and how to record it
• Using and assessing role play
• Using activities to demonstrate behaviours.

Level 4: Job
• Using SMART objectives
• Using workplace assignments
• Supporting and encouraging management involvement
• Using capability matrices.


Figure 1: Training or management flowchart. Under the heading 'Training or management?', the chart starts from whether there is a defined performance gap and works through a series of questions: Has the manager given the individual feedback? Do they perform the task infrequently? Is the performance of the task being punished? Is the individual missing the knowledge and skills necessary to do the job? Can the task be simplified? Do obstacles prevent performance? Each answer routes to an action: do nothing, give feedback, provide practise and give feedback, remove punishment, train, simplify the task, remove obstacles, or look at alternative employment options.
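For readers who prefer the decision sequence spelled out, below is a minimal sketch of one plausible reading of the flowchart as code. The exact order and routing of the questions are assumptions made for illustration; the one point taken directly from the article is that training sits near the end of the sequence, as the last port of call.

    def recommend_intervention(performance_gap: bool,
                               feedback_given: bool,
                               task_performed_infrequently: bool,
                               performance_punished: bool,
                               obstacles_present: bool,
                               task_can_be_simplified: bool,
                               knowledge_or_skills_missing: bool) -> str:
        """One assumed reading of the 'training or management?' flowchart.

        The question order here is illustrative; only the principle that
        training comes late in the sequence is taken from the article.
        """
        if not performance_gap:
            return "Do nothing"
        if not feedback_given:
            return "Give feedback"
        if task_performed_infrequently:
            return "Provide practise and give feedback"
        if performance_punished:
            return "Remove punishment"
        if obstacles_present:
            return "Remove obstacles"
        if task_can_be_simplified:
            return "Simplify the task"
        if knowledge_or_skills_missing:
            return "Train"
        return "Look at alternative employment options"

However the boxes are ordered, the design intent is the same: exhaust the cheaper management remedies before commissioning a course.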


Level 5: Organisation
• LEarning and APplication (LEAP) system
• Measuring workplace change using IOTA (Impact of Training Analysis).

In addition to these options, there are also things we can do to streamline and focus applications for attendance on training events, so that we at least have a fighting chance from the beginning of achieving some form of positive outcome and result. There are often three types of people on training courses:

The political prisoners: has to be there, doesn't want to be there and doesn't know why they're there.

The tourists: a day's training is better than a day working.

The explorers: wants to actually learn something new to help them in the workplace.

Many requests for training are not predicated on workplace changes, but merely reflect a whim or prejudice of the attendee or line manager, or both. This is not really their fault. They have probably never truly understood what the role and function of training is within the organisation, and may well have viewed it as a reward mechanism, a burden of professional development or even a punishment that must be endured. Applications for training should essentially be requests for development tied to a business case. Any request for, or compulsory attendance on, a training event should go ahead only once the following questions have been answered:

• Why?
• What is going to be stopped, started, or done faster or better?
• How will we (the business) know that this money has been well spent?

Any attendance on any learning event that does not have clear answers to these points has got to be questioned. We can support managers and attendees by creating much more effective application or attendance forms which help all parties understand exactly why they are participating and what they, and the organisation, should get out of the process. With that degree of clarity, success is still not guaranteed, but it is more likely.

Training is a fashion industry: trends, initiatives and fads come and go. We adopt and jump on these bandwagons just like buses, except with buses we know where we want to go and when to get off. With training fashions, we adopt them and use them with no clear understanding of whether they are working, or what they should achieve. For the last couple of years it has been coaching. Before that it was emotional intelligence, and before that NLP, and then it was ... well, you get the picture.

Training, learning, development (call it whatever you like) should be measured and evaluated regardless of our personal opinions or how the developmental initiative came to be introduced. What learning and development professionals need to do is begin a process of tracking the results of their training. This can be accomplished by introducing a range of methods and systems, either by augmenting existing approaches or by starting afresh. Using these techniques will provide data about the effectiveness, or otherwise, of the development delivered, and from this it will be possible to revise, reallocate and maintain a range of effective training interventions and, as importantly, to have the evidence to prove it.


Garry Platt is a training and development specialist. Further details of the systems and processes described here are covered in TJ's Measuring the Impact of Training workshop. The outline and agenda for the day can be found at www.trainingjournal.com/events/mit.pdf. To book your place, contact 01353 865352.