Impact Evaluation for Real Time Decision Making


DIME – FRAGILE STATES, DUBAI, MAY 31 – JUNE 4

Impact Evaluation for Real Time Decision Making
Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank

FIRST: What is a results chain?

Inputs
• Financial and human resources

Activities
• Actions to convert inputs into outputs

Outputs
• Tangible products: information campaigns, trainings, studies

Intermediate outcomes
• Use of outputs by intended population
• Take-up
• Use

Final Outcomes
• Objective of the program: growth, social cohesion, employment status

Implementation (SUPPLY SIDE) → Results (DEMAND RESPONSE)

Example of a results chain

Inputs
• Financial and human resources to build local institutions

Activities
• Develop communication
• Set up grant mechanisms
• Provide technical assistance to communities

Outputs
• Communication package
• Grant offerings
• Technical assistance visits

Intermediate outcomes
• Grants awarded and disbursed
• Projects implemented
• Community rallies

Final Outcomes
• Reduced conflict
• Improved inclusion of ex-combatants
• Expanded employment

Monitor whether the program is implemented as planned; evaluate whether the program is effective.


The results chain

A developmental hypothesis that helps you define:
• what you are doing and for what purpose
• what needs to be monitored and what needs to be evaluated

What is monitoring?

Monitoring tracks indicators over time (in the treatment group).

It is a descriptive before-after analysis.

It tells us whether things are moving in the right direction.

What is Impact Evaluation?
Impact evaluation tracks outcomes over time in the treatment group relative to a control group.

It measures the effect of an intervention on outcomes relative to a counterfactual: what would have happened without it?

It identifies the causal output-outcome link separately from the effect of other time-varying factors.

Monitoring & Impact Evaluation
Use monitoring to track implementation efficiency (input-output).

INPUTS → OUTPUTS → OUTCOMES
Supply side (inputs → outputs): monitor efficiency.
Demand response (outputs → outcomes): evaluate effectiveness.

Use impact evaluation to measure effectiveness (output-outcome).

Pick the right method to answer your questions

Descriptive analysis: monitoring (and process evaluation)
• Is the program being implemented efficiently?
• Is the program targeting the right population?
• Are outcomes moving in the right direction?

Causal analysis: impact evaluation
• What was the effect of the program on outcomes?
• How do alternative implementation modalities compare?
• Is the program cost-effective?

Discuss among yourselves (5m)
What happens if you use monitoring to evaluate impact?

Discuss among yourselves (5m)
What happens if you use monitoring to evaluate impact?

You get the wrong answer… 100% of the time.

[Figure: before-after diagram. The outcome rises from A at t0 to B at t1 around the intervention; the observed change from A to B overstates the effect, and only the portion C above the counterfactual is the impact.]
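The bias in the before-after comparison can be made concrete with a small numeric sketch (all numbers hypothetical): monitoring attributes the whole change in the treatment group to the program, while a difference-in-differences estimate nets out the change a control group experienced anyway.

```python
# Hypothetical outcome means (e.g., an employment rate in %) in a
# treated and a control group, before and after an intervention.
treatment = {"before": 40.0, "after": 55.0}
control = {"before": 40.0, "after": 50.0}

# Monitoring-style before-after "impact": credits the program with
# everything that changed, including background trends.
before_after = treatment["after"] - treatment["before"]

# Difference-in-differences: subtracts the change that occurred
# without the program (the control group's trend).
did = before_after - (control["after"] - control["before"])

print(f"before-after estimate: {before_after:+.1f}")  # +15.0
print(f"difference-in-differences estimate: {did:+.1f}")  # +5.0
```

Here monitoring alone would claim a 15-point gain, when only 5 points are attributable to the program.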

NEXT: Do we know ex ante…
On Community Driven Development, what:
• information will get communities to respond?
• facilitation will result in high-quality proposals?
• rules will increase inclusion in the decision-making process?
• monitoring mechanisms and co-payments will improve local projects and their use of funds?

On Disarmament, Demobilization and Reintegration:
• are community-based or targeted approaches most effective?
• should we try to delink combatants from units or build on unit cohesion?
• is including or excluding command structures most effective?

Trial and error

We turn to our best judgment for guidance and pick an information campaign, a package of services, a structure of incentives.

Is there any other campaign, package, or incentive structure that would do better?

The decision process is complex
A few big decisions are taken during design, but many more are taken during roll-out and implementation.

Design → Early roll-out → Implementation

Pick up the ball: What is a results tree?
A results tree is a representation of the set of results chains that are considered viable during program design or program restructuring.

It is a set of competing policy and operational alternatives to reach a specific objective.

Example of a decision tree for a combatant reintegration program

Reintegration
• Public communication campaign to reintegrate ex-combatants
  • Use combatant unit cohesion structures
    • Include command structures
    • Exclude command structures
  • Delink combatants and dismantle unit structure
    • Supply-based training
    • Demand-based training
• Community-based campaign
  • Use combatant unit cohesion structures
    • Include command structures
    • Exclude command structures
  • Delink combatants and dismantle unit structure
    • Supply-based training
    • Demand-based training

How to select between plausible alternatives?
• Establish which decisions will be taken upfront and which will be tested during roll-out
• Experimentally test critical nodes: measure the impact of one option relative to another, or to no intervention
• Pick better and discard worse during implementation

You cannot learn everything at once: select carefully what you want to test by involving all relevant partners.
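Experimentally testing a critical node means randomly assigning units (for example, communities in the early roll-out) to the competing options so the arms are comparable. A minimal sketch, with hypothetical arm and community names:

```python
import random

# Hypothetical competing options at one node of the decision tree.
arms = ["include command structures", "exclude command structures"]

# Units to randomize (e.g., communities in the early roll-out phase).
communities = [f"community-{i:02d}" for i in range(1, 21)]

# Seeded shuffle for reproducibility, then alternating assignment
# so the two arms end up the same size.
rng = random.Random(2024)
shuffled = communities[:]
rng.shuffle(shuffled)
assignment = {c: arms[i % len(arms)] for i, c in enumerate(shuffled)}

# Each arm receives exactly half of the communities.
for arm in arms:
    n = sum(1 for a in assignment.values() if a == arm)
    print(f"{arm}: {n} communities")
```

Because assignment is random, later differences in outcomes between the two arms can be attributed to the option itself rather than to which communities happened to receive it.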

Walk along the decision tree to get more results out of a reintegration program
At each node, MAX take-up, then MAX use of services, then MAX benefits from services, along the same decision tree (from the communication campaign down through the command-structure and training options).

How IE can support you

Discuss among yourselves (5m)
How many times do you make changes to your program on the feeling that something is not working right?

How useful would it be to know for sure which ways are best?

Why evaluate?

• Improve the quality of programs
  • Test alternatives and inform design in real time
  • Increase program effectiveness
  • Answer the “so what” questions
• Build government institutions for evidence-based policy-making
  • Plan for implementation of options, not solutions
  • Find out what alternatives work best
  • Adopt better ways of doing business and taking decisions

The market for evidence

• PM/Presidency: communicate to constituencies (campaign promises, accountability); demands evidence on the effects of government programs
• Treasury/Finance: allocate budget; demands evidence on the cost-effectiveness of different programs
• Line ministries: deliver programs (service delivery) and negotiate budget; demand evidence on the cost-effectiveness of alternatives and the effect of sector programs

Shifting Program Paradigm
From: a program is a set of activities designed to deliver expected results; the program will either deliver or not.

To: a program is a menu of alternatives with a learning strategy to find out which work best; change programs over time to deliver more results.

Shifting Evaluation Paradigm

From retrospective, external, independent evaluation:
• Top-down
• Determines whether the program worked or not

To prospective, internal, operationally driven (and externally validated) impact evaluation:
• Set the program learning agenda bottom-up
• Consider plausible implementation alternatives
• Test scientifically and adopt the best
• Provide just-in-time advice to improve the effectiveness of the program over time

Retrospective (designed & evaluated ex-post) vs. prospective (designed ex-ante and evaluated ex-post)

Retrospective impact evaluation:
• Collecting data after the event, you don’t know how participants and nonparticipants compared before the program started
• You have to try to disentangle, after the event, why the project was implemented where and when it was

Prospective impact evaluation:
• Design the evaluation to answer the question you need to answer
• Collect the data you will need later
• Ensure analytical validity

Is this a one-shot analytical product?
It is an evaluative process designed to provide useful (actionable) information at each step of the impact evaluation.

Discuss among yourselves (5m)

What are some of the entry points in policy-making cycles?

What are some of the answers you would like to have?


Ethical considerations
• It is not ethical to deny the benefits of something that is available and that we know works (e.g., HIV medicine proven to prolong life)
• It is ethical to test interventions before scale-up if we do not know whether they work or whether they have unforeseen consequences (e.g., food aid may destroy local markets and create perverse incentives)
• Most times we use opportunities created by roll-out and budget constraints to evaluate, so as to minimize ethical concerns
• AND we can always, and should, test alternatives to maximize our results

Questions? Comments?


[Figure: Evolution of IE in the Bank, number of ongoing impact evaluations by region (SAR, MNA, LAC, ECA, EAP, AFR), 2004–2010.]
