The ALNAP Meta-evaluation
Tony Beck, Peter Wiles and John Lakeman
What is the ALNAP meta-evaluation?
An overview of evaluation of humanitarian action quality
Identification of strengths and weaknesses
Recommendations for improvement across the sector and in individual agencies
The ALNAP Quality Proforma
ALNAP’s meta-evaluation tool
Draws on good practice in EHA and evaluation in general
Revised and peer reviewed this year
2003-4 meta-evaluation
Rated representative set of 30 evaluations
Focus this year, at the request of ALNAP members, on:
Good practice
Dialogue with 11 evaluation offices, focusing on the impact of evaluation processes on evaluation quality
Agencies included in dialogue
CAFOD, Danida, ECHO, ICRC, OCHA, OFDA, Oxfam, SC-UK, SIDA, UNHCR, and WHO
Findings from dialogue with evaluation managers
Some areas affecting evaluation quality not currently captured by the QP
Evaluation quality depends on subtle negotiations within agencies about key issues, e.g. staffing and use of the DAC criteria
Likely follow-up to recommendations is difficult to predict and depends on a number of processes within agencies
Findings from dialogue with evaluation managers: the EHA market
Main constraint to improved evaluation quality is agencies' access to available evaluators with appropriate skills
Does the EHA market need further regulation?
Mainstreaming of the Quality Proforma
By ECHO to revise terms of reference (lesson learning, protection, identification of users, prioritisation, time frame and users of recommendations, etc.)
DEC Southern Africa evaluation (rated 7 agency reports)
Groupe URD (for planning of evaluations)
Findings from the Quality Proforma 2003-2004
Significant improvement in use of DAC criteria, although efficiency and coherence still problematic
Greater attention to protection (2002/3: 6% rated satisfactory or better; 2003/4: 32%)
Findings from the Quality Proforma 2003-2004 (continued)
No improvement in the appropriateness of evaluation methods used, vis-à-vis good practice
Limited improvement in primary stakeholder consultation (13% satisfactory or better in 2002/3; 20% in 2003/4)
Most other QP areas rated fairly similar to the 2000-2002 average
Greater attention to HIV/AIDS
Good practice examples
Follow-up
Consultation with primary stakeholders
Socio-economic analysis
Real-time evaluation (RTE)
Evaluation of efficiency
Protection
Next steps
Agencies valued interaction and dialogue with meta-evaluators; this should continue.
E.g. internal/external rating by agencies and meta-evaluators using a slimmed-down Quality Proforma (mainstreaming).
E.g. interaction between non-agency evaluators and meta-evaluators.
Next steps (continued)
Is work needed on the EHA market, e.g. bringing in evaluators from the South?
Best format for proceeding: a working group on evaluation quality?