OECD/INFE Tools for evaluating financial education programmes Adele Atkinson, PhD Policy Analyst OECD With the support of the Russian/World Bank/OECD Trust Fund




OECD/INFE evaluation principles and guides

Outline

Terminology

Motivation for focusing on evaluation

Overview of the OECD INFE research and tools developed under the Trust Fund

Introduction to the 2 practical guides

Focus on High-level Principles

Terminology

Checking whether targets were met by monitoring inputs and outputs

Keeping track of the day-to-day inputs and processes involved in delivering the education

Assessing the outcomes and impact for participants

Analysing the cost-effectiveness of the programme

Monitoring and evaluating financial education programmes: what do we mean?

Motivation for focusing on evaluation

Benefits for programme delivery

The challenges faced

46 authorities from 29 countries responded to an INFE request for information about the extent to which they were evaluating and the challenges they faced; 28 authorities in 23 countries had evaluated.

The most frequently faced challenges:

The stocktake indicates that it is not uncommon for financial education initiatives to be evaluated, and many authorities undertake monitoring and evaluation on a regular basis. However, it is striking that each authority has made its own decision about how to evaluate, which methods to use, whom to ask for guidance, and which aspects of the education process to focus on.

Evaluation research and tools

OECD/INFE Guide available online

Non-technical 7-page guide, answering questions such as:
Why evaluate?
What types of questions will an evaluation answer?

Providing guidance on:
The principles of a good evaluation
The key steps
Suggested methods

Detailed guidance also available online

16-page detailed guidance in non-technical language:
Information about the theory of change
Detailed guidance on the steps of evaluation
Information about analysis and interpretation of data
Reminders about reporting the results
Annex with information about additional resources

OECD/INFE High-level Principles

5 key principles:
New programmes should be evaluated; try to also evaluate existing programmes
Include evaluation in the budget
Evaluation should be credible: consider an external evaluator or reviewer
Design the evaluation in accordance with the objectives and type of programme
Report what worked, and what didn't work

3 steps: planning, implementation, reporting

1. Evaluation: an essential element of FE programmes

New programmes:
Develop a strategy for monitoring and evaluating alongside programme design
Keep in mind the benefit of collecting information before the programme starts

All programmes:
Encourage dialogue and collaboration with key stakeholders to ensure clarity and consistency of aims and objectives
Reassure providers that evaluation is not designed to judge them, but to improve efficiency where appropriate and to identify successful programmes that will ensure the best possible outcomes for future participants

FE programme planning should include evaluation. Note that the objectives should be realistic and clear.

2. Budget for evaluation

Find out how much other evaluations have cost and gather estimates before finalising the budget
The amount of money available shouldn't determine the design of the evaluation, but may indicate the need to prioritise certain aspects of evaluation
Look for ways of reducing costs: e.g. sharing questionnaires, drawing on existing data and international methodology, drawing on contacts, and piloting programmes before large-scale roll-out

A good evaluation ensures that resources are being well spent: it is a wise expense!

3. External evaluators: adding credibility, skills and experience

External evaluators bring skills and independence.

Other ways of ensuring credibility:
Use technology: administrative systems and websites can automate data collection; electronic games can store scores and measure improvement.
Ask (well-designed) questions of participants and non-participants, trainers and designers: use surveys, tests, interviews, focus groups.
Corroborate findings where possible: check bank statements, pension fund records, credit counselling services.

It is much more likely that unintended consequences will be identified through in-depth conversations than through a survey.

Don't overburden participants, especially if they haven't invested a lot into the programme or they are in the control/comparison group.

The style of questioning can really help to improve the quality of an evaluation. Prefer objective questions such as "Have you been overdrawn on your current account in the last 3 months?" rather than "Would you say you have been overdrawn more or less often since you finished the course than you were before?"

4. Appropriate evaluation design

Continuous monitoring: count/measure/quantify how many participants, hours of contact, leaflets distributed, etc.
Measure change according to programme type and objectives: monitor improved awareness, evaluate behaviour change strategies, test knowledge.
Identify ways of attributing change: create a control group (lottery for places, random marketing of courses) according to programme design.
Undertake comparisons of: knowledge, behaviour and attitudes before vs. after and long after; participants vs. non-participants; targets vs. achievements; budget vs. expenditure; opinions of providers vs. users.
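The before/after and participant/non-participant comparisons can be combined into a simple difference-in-differences estimate of a programme's effect. The sketch below is illustrative only and is not taken from the OECD/INFE guides; the test scores are hypothetical.

```python
# Minimal difference-in-differences sketch for a financial education course.
# All scores are hypothetical knowledge-test results on a 0-100 scale.

def mean(xs):
    return sum(xs) / len(xs)

def diff_in_diff(part_before, part_after, comp_before, comp_after):
    """Average change among participants minus average change in the
    comparison group: the change attributable to the programme."""
    participant_change = mean(part_after) - mean(part_before)
    comparison_change = mean(comp_after) - mean(comp_before)
    return participant_change - comparison_change

# Hypothetical data: participants improve by 15 points on average and the
# comparison group by 3, so roughly 12 points are attributable to the course.
participants_before = [52, 48, 60, 55]
participants_after  = [68, 63, 74, 70]
comparison_before   = [50, 54, 58, 49]
comparison_after    = [53, 57, 60, 53]

effect = diff_in_diff(participants_before, participants_after,
                      comparison_before, comparison_after)
print(round(effect, 1))  # prints 12.0
```

Subtracting the comparison group's change nets out improvement that would have happened anyway (for example, from general media coverage of financial topics), which is why the guides stress comparing participants with non-participants rather than relying on before/after scores alone.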

Expert evaluators can advise on this, but in order to hire the most appropriate expert, use the INFE Guides to identify the design. For example:

Objective: to reach 1 million web users
Monitor with a website visit counter. Could also undertake an online survey with a web panel to compare users of different sites.

Objective: to improve knowledge of 100 adults
Attendance records plus tests before and after the programme to monitor changes.

Objective: increase take-up of insurance in region x using social marketing

5. Reporting

Reporting is critical for the future of FE programmes

Avoid over-generalisation: check carefully and get advice on whether findings may apply more widely
Also report the method and limitations of the evaluation
Disseminate the findings widely, using different styles of reporting (newsletter, academic paper, etc.)
Draw on the report when making future funding decisions and designing future programmes
Compare your results to those of other programmes

Questions, comments, further information:

[email protected]

OECD/INFE www.financial-education.org

Russian Trust Fund www.finlitedu.org