Evaluation of Training Program

Anita F. Lopes, BSN, MSN, RN


Objectives

• Overall Aim: The trainers are able to identify and implement the various methods of ‘Evaluation of the Training Program’


Specific Objectives

The group:
- Reviews the definition, purposes and types of evaluation
- Defines 'Evaluation of the Training Program' and differentiates training program evaluation from training evaluation
- Reviews the various models of evaluation for a training program
- Identifies the various aspects of evaluation in a training program
- Implements the various evaluation methods/tools


Definition: Evaluation

• Evaluation is the assessment of merit or worth. - Scriven, 1963

• Evaluation is a systematic determination of a subject's merit, worth and significance, using criteria governed by a set of standards. - Wikipedia


TYPES


TRAINING PROGRAMME EVALUATION


• A systematic, rigorous, and meticulous application of scientific methods to assess the design, implementation, improvement, or outcomes of a program.

• It is a resource-intensive process, frequently requiring evaluation expertise, labor, time, and a sizable budget.

• Training program evaluation is a continual and systematic process of assessing the value or potential value of a training program.

• Results of the evaluation are used to guide decision-making around various components of the training (e.g. instructional design, delivery, results) and its overall continuation, modification, or elimination.


Differentiation

Training Program Evaluation
• Examines the entire program, including the training component.
• A training program should be consistently monitored and evaluated to determine progress and areas for improvement.
• All processes within the program contribute to program success or failure, yet not all processes are "training".
• Knowledge, understanding, and skill in program evaluation are necessary to effectively evaluate training and development programs.

Training Evaluation
• Assessment of the training course, process, or component within a program.


Purpose of Evaluation

• To assist an organization, program, project, or any other intervention or initiative to assess an aim, a realisable concept/proposal, or an alternative, to help in decision-making.

• To ascertain the degree of achievement or value with regard to the aim, objectives, and results of any such action that has been completed.

• To enable reflection and assist in the identification of future change.

• To "determine the quality of a program by formulating a judgment" - Marthe Hurteau, Sylvain Houle, Stéphanie Mongiat (2009).


MODELS OF EVALUATION


Various Models of Evaluation

• Daniel Stufflebeam's CIPP Model (Context, Input, Process, Product): The CIPP Model is a comprehensive framework for guiding formative and summative evaluations of programs, projects, personnel, products, institutions, and systems. It was introduced by Daniel Stufflebeam in 1966 to guide mandated evaluations of U.S. federally funded projects.

• Robert Stake's Responsive Evaluation Model: Robert Stake (1975) coined the term responsive evaluation. Responsive evaluation distinguishes four generations in the historical development of evaluation: measurement, description, judgment and negotiation. ‘Measurement’ includes the collection of quantitative data.


Various Models of Evaluation

• Robert Stake's Countenance Model: This model focuses on description and judgment. Stake wrote that greater emphasis should be placed on description, and treated judgment as itself a form of data collection. He wrote about connections (contingencies) in education between antecedents, transactions, and outcomes, and also noted connections between intentions and observations, which he called congruence. Stake developed matrices for the notation of data for the evaluation.

• CIRO (Context, Input, Reaction, Outcome): This model is based on four stages or types of evaluation – context, input, reactions and outcome – and is underpinned by a set of three questions, which, according to the authors, the trainer should always bear in mind.

• PERT (Program Evaluation and Review Technique): The Program (or Project) Evaluation and Review Technique, commonly abbreviated PERT, is a statistical tool used in project management, designed to analyze and represent the tasks involved in completing a given project. First developed by the U.S. Navy in the 1950s, it is commonly used in conjunction with the critical path method (CPM).
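As a rough illustration of PERT's statistical side, the classic three-point estimate combines an optimistic, a most likely, and a pessimistic duration for each task. The tasks and figures below are invented for the example:

```python
# Illustrative sketch (hypothetical tasks): PERT's three-point estimate.
# Each task gets an optimistic (o), most likely (m), and pessimistic (p)
# duration, combined into an expected duration and a standard deviation.

def pert_estimate(o, m, p):
    """Return (expected duration, standard deviation) for one task."""
    expected = (o + 4 * m + p) / 6
    std_dev = (p - o) / 6
    return expected, std_dev

# Hypothetical tasks on a training project's critical path (days)
tasks = {
    "design curriculum": (4, 6, 10),
    "prepare A.V. aids": (2, 3, 5),
    "pilot session": (1, 2, 4),
}

# Expected durations along a path are summed
total = sum(pert_estimate(o, m, p)[0] for o, m, p in tasks.values())
print(f"Expected critical-path duration: {total:.2f} days")
```

The weighting of the most likely estimate by 4 reflects PERT's underlying beta-distribution assumption about task durations.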


Various Models of Evaluation

• Michael Scriven's Goal-Free Evaluation Approach: This approach is premised on the assumption that an evaluation should establish the value of a program by examining what it is doing rather than what it is trying to do.

• Provus' Discrepancy Model: Developed in 1969, this model provides information for program assessment and program improvement. Under this model, evaluation is defined as the comparison of an actual performance to the desired standard.

• Illuminative Evaluation Model: An illuminative evaluation is a custom-built research strategy which lacks formal statements of objectives, avoids (but does not exclude) statistical procedures, employs subjective methods, and is primarily interested in the informing function of an evaluation, rather than the more usual inspectoral or grading functions.


Kirkpatrick's evaluation model


ASPECTS


Common Tools for Evaluation

• Rating scales
• Observation checklists
• Questionnaires
• Anecdotes of the observer


1) Evaluation of Objectives

• General/overall aim of the training program
• Specific objectives
• Each session will have specific objectives
• Specific evaluation areas:
  - Objectives must be set up in line with the goal of the training program
  - Objectives must cover from the definition to evaluation (feedback) of the specified training content


Tool for Evaluating Educational Objectives


2) Evaluating the Teaching Content

• Sequence must be maintained (e.g., from definition to nursing care)
• Includes A.V. aids
• Relevant examples
• Demonstrations/re-demonstrations
• Feedback (teaching-learning activity)
• Assignments (classroom teaching)


Tool for Evaluating the Teaching Content

• Pre-test / post-test
• Post-test only
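A pre-test/post-test design can be summarized numerically by comparing average scores before and after the session. A minimal sketch, with invented trainee scores:

```python
# Illustrative sketch (hypothetical scores): gauging learning gain
# from a pre-test/post-test evaluation of a training session.

pre_scores  = [12, 15, 10, 14, 11]   # scores out of 20 before the session
post_scores = [17, 18, 14, 19, 15]   # same trainees, after the session

def mean(xs):
    return sum(xs) / len(xs)

gain = mean(post_scores) - mean(pre_scores)

# Normalized gain: fraction of the possible improvement actually achieved
max_score = 20
norm_gain = gain / (max_score - mean(pre_scores))

print(f"Average gain: {gain:.1f} points")
print(f"Normalized gain: {norm_gain:.2f}")
```

A post-test-only design would instead compare the post-test average against a predefined pass standard, since no baseline is collected.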


OBSERVATION CHECKLIST


3) EVALUATION OF AV AIDS


Tool for Evaluating A.V. Aids

• Subject related
• Supplements teaching content
• Properly presented
• Accurate
• Variety
• Handled appropriately


4) Evaluation of Speakers

• Well prepared with the content
• Identification of work needs and people
• Involvement in training programme and evaluation development
• Support of pre-event preparation and holding briefing meetings with the learner
• Giving ongoing, practical support to the training programme
• Holding a debriefing meeting with the learner on their return to work to discuss, agree, or help modify the actions in their action plan
• Reviewing the progress of learning implementation
• Final review of implementation success and assessment


Role of the Speaker

• Analyzes the needs of the group, then facilitates learning
• Enables learning
• Plans and designs the training programmes
• Acts as a collaborator
• Selects appropriate teaching technology and methods
• Enables skill development
• Keeps abreast of the latest trends and technology
• Conducts, organizes, and participates in various workshops


RATING SCALE

Example of Tool for Evaluating Speaker
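Rating-scale responses are typically summarized per item. A minimal sketch, with hypothetical items and Likert-style responses (1-5) invented for the example:

```python
# Illustrative sketch (hypothetical data): summarizing a 5-point rating
# scale used to evaluate a speaker. Items and responses are invented.

responses = {
    "Well prepared with the content": [5, 4, 5, 4, 3],
    "Used A.V. aids effectively":     [4, 4, 3, 5, 4],
    "Encouraged participation":       [3, 4, 4, 4, 5],
}

# Average rating per item highlights strong and weak areas
item_means = {item: sum(r) / len(r) for item, r in responses.items()}

for item, avg in item_means.items():
    print(f"{item}: {avg:.1f} / 5")
```

Item averages below a chosen threshold (say, 3.5) could then be flagged for follow-up with the speaker.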


5) Selection of Evaluators

The American Evaluation Association has created a set of Guiding Principles for evaluators. The principles run as follows:

1) Systematic Inquiry: Evaluators conduct systematic, data-based inquiries about whatever is being evaluated.


Selection of Evaluators

2) Competence: Evaluators need the required expertise.
3) Integrity/Honesty: Underscored by three principles: impartiality, independence, and transparency.


Selection of Evaluators

4) Respect for People: Evaluators respect the security, dignity, and self-worth of respondents, program participants, and clients.

5) Responsibilities for General and Public Welfare: Evaluators articulate and take into account the diversity of interests and values that may be related to the general and public welfare.


6) Teaching Learning Environment


Teaching Learning Environment

The environment has to be conducive in order for learning to take place.

• Must supplement the effectiveness of teaching
• State of the learners
• Effectiveness of the speaker


Tool for Evaluating the Teaching-Learning Environment

• Physical facilities: seating arrangement, visibility, lighting, washroom facilities
• Safety measures
• Communication equipment: audibility, A.V. aids
• Room temperature


7) Organization of the Workshop

1. Identification and setting of objectives by the organization
2. Planning, design and preparation of the training programmes against the objectives
3. Pre-course identification of people with needs and completion of the preparation required by the training programme
4. Provision of the agreed training programmes
5. Pre-course briefing meeting between learner and line manager
6. Pre-course or start-of-programme identification of learners' existing knowledge, skills and attitudes
7. Interim validation as programme proceeds


Organization of the Workshop

8. Assessment of terminal knowledge, skills, etc., and completion of perceptions/change assessment
9. Completion of end-of-programme reactionnaire
10. Completion of end-of-programme Learning Questionnaire or Key Objectives Learning Questionnaire
11. Completion of Action Plan
12. Post-course debriefing meeting between learner and line manager
13. Line manager observation of implementation progress
14. Review meetings to discuss progress of implementation
15. Final implementation review meeting


Checklist on Evaluating the Organisation of Workshop

I. Registration
II. Program schedule provided
III. Teaching-learning environment maintained
IV. Implemented as per schedule
V. Meets the requirements of delegates
VI. Delegates' feedback


Need for Evaluation

Use of evaluation data meets these demands in various ways:

1. Planning: To assess needs, set priorities, direct allocation of resources, and guide policy

2. Analysis of Course/Program Effectiveness or Quality: To determine achievement of objectives, identify strengths and weaknesses of a program/course, determine the cost-effectiveness of a program/course, and assess causes of success or failure

3. Direct Decision-making: To improve effectiveness, identify and facilitate needed change, and continue, expand, or terminate a program/course

4. Maintain Accountability: To stakeholders, funding sources, and the general public


SUMMARY
