Impact Evaluation Methods

Impact Evaluation Methods: Randomized Trials, Regression Discontinuity, Matching, Difference in Differences


Page 1

Impact Evaluation Methods

Page 2

Methods

• Randomized Trials

• Regression Discontinuity

• Matching

• Difference in Differences

Page 3

The Goal

• Causality

We did program X, and because of it, Y happened.

Page 4

The Goal

• Causal Inference

Y happened because of X, not for some other reason. Thus it makes sense to think that if we did X again in a similar setting, Y would happen again.

Page 5

Getting to Causality

In a more research-friendly universe, we’d be able to observe a single person (call him Fred) after we both gave and didn’t give him the treatment.

Y_treated(Fred) − Y_untreated(Fred)

Page 6

Getting to Causality

In the reality-based community, finding this “counterfactual,” Y_treated(Fred) − Y_untreated(Fred), is impossible.

Is the solution to get more people?

Page 7

Getting to Causality

With more people, we can calculate

Average(treated) − Average(untreated).

But what if there’s an underlying difference between the treated and untreated?

Page 8

Getting to Causality

Confounding Factors / Selection Bias / Omitted Variable Bias

Textbook Example: If textbooks were deliberately given to the most needy schools, the simple difference is incorrect.

If textbooks were already present in the schools where parents cared a lot about education, the simple difference is incorrect.
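To see the problem concretely, here is a minimal simulation sketch in Python; all numbers and variable names (parent_interest, has_textbooks, scores) are hypothetical, invented for illustration, not taken from any study. An unobserved factor drives both textbook access and test scores, so the simple treated-minus-untreated difference overstates the true effect.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical unobservable: parental interest in education raises scores directly...
parent_interest = rng.normal(0, 1, n)
# ...and also makes it more likely the school already has textbooks (selection).
has_textbooks = (parent_interest + rng.normal(0, 1, n)) > 0

true_effect = 2.0  # assumed causal effect of textbooks on test scores
scores = 50 + 5 * parent_interest + true_effect * has_textbooks + rng.normal(0, 5, n)

# Simple difference in means: far above 2.0, because it also picks up parental interest.
print(scores[has_textbooks].mean() - scores[~has_textbooks].mean())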

Page 9

Problem Solved

If we randomize the treatment, on average, treatment and control groups should be the same in all respects, and there won’t be selection bias.

Check that it’s true for all observables.

Hope that it’s therefore true for all unobservables.

Page 10

Math You’d Rather Not See

See Clair’s slides from September 15:

• omitted variable bias

• selection bias

Very accessible reading from the same week by Duflo, Glennerster & Kremer.

Page 11

Randomization

Randomize who gets treated. Check if it came out OK.

Basically, that’s it.

Y_T − Y_C
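As a rough sketch of what that amounts to in practice, here is a Python version with illustrative column names (treated, outcome, age) that are not from any particular study: compare mean outcomes, and check balance on observables first.

import pandas as pd

def randomized_estimate(df: pd.DataFrame) -> float:
    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # Balance check: under randomization the groups should look alike on observables.
    print("mean age:", treated["age"].mean(), "vs.", control["age"].mean())

    # The impact estimate, Y_T - Y_C: difference in mean outcomes.
    return treated["outcome"].mean() - control["outcome"].mean()

# Tiny made-up example.
df = pd.DataFrame({"treated": [1, 0, 1, 0],
                   "outcome": [12.0, 9.0, 11.0, 10.0],
                   "age":     [30, 31, 29, 32]})
print(randomized_estimate(df))  # (12 + 11)/2 - (9 + 10)/2 = 2.0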

Page 12

Randomization

Examples:

• Progresa: cash if kids go to school

• Moving to Opportunity: voucher to move to a better neighborhood

• Fertilizer & hybrid seed

• Loan maturity & interest rate

• Deworming

Page 13

Regression Discontinuity

Being involved in a program is clearly not random.

Smarter kids get scholarships. Kids in smaller classes learn better. Big firms are more likely to unionize.

Page 14

Regression Discontinuity

Being involved in a program is clearly not random. Or is it?

Scholarship cutoff +1 girl vs. scholarship cutoff −1 girl

Israeli 41-kid school vs. Israeli 40-kid school

Union-yes 50%+1 firm vs. union-yes 50%−1 firm


Page 16

So how do we actually do this? 1. Draw two pretty pictures:

1. Eligibility criterion (test score, income, or whatever) vs. Program Enrollment

2. Eligibility criterion vs. Outcome

[Figure 1: Participation in PANES and eligibility (x-axis: standardized SES)]

[Figure 2: Political support for the government and program eligibility (x-axis: standardized SES)]

Page 17

So how do we actually do this? 2. Run a simple regression.

(Yes, this is basically all we ever do, and the stats programs we use can run the calculation in almost any situation. But before we run it, we need to make sure the situation is appropriate and draw the graphs, so that we can be confident our estimates are actually causal.)

Outcome as a function of test score (or whatever), with a binary (1 if yes, 0 if no) variable for program enrollment.
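A minimal sketch of that regression in Python with statsmodels, restricted to a window around the cutoff. The column names (score, enrolled, outcome), the cutoff, and the bandwidth are placeholders, not values from the figures above.

import pandas as pd
import statsmodels.formula.api as smf

CUTOFF, BANDWIDTH = 0.0, 0.02  # e.g. a standardized eligibility score

def rd_estimate(df: pd.DataFrame) -> float:
    # Keep only observations close to the eligibility cutoff.
    near = df[(df["score"] - CUTOFF).abs() <= BANDWIDTH].copy()
    near["centered"] = near["score"] - CUTOFF

    # Outcome on the running variable plus a 0/1 enrollment dummy;
    # the coefficient on 'enrolled' estimates the jump at the cutoff.
    fit = smf.ols("outcome ~ centered + enrolled", data=near).fit()
    return fit.params["enrolled"]

A fancier version would let the slope differ on each side of the cutoff, but the idea is the same.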


Page 18

As Good As Random, Sort Of

Randomize who gets treated (within a bandwidth). Check if it came out OK (within a bandwidth).

Basically, that’s it (within a bandwidth).

Y_T − Y_C (within a bandwidth)

Page 19

Difference in Differences

Change for the treated − change for the control:

(t1 − t0) − (c1 − c0)
= t1 − t0 − c1 + c0
= t1 − c1 − t0 + c0
= (t1 − c1) − (t0 − c0)

Which is the same as the treated−control gap after the program minus the treated−control gap before.
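As a minimal sketch, here is that calculation in Python two equivalent ways; the column names (treated, post, outcome) are illustrative, not from any of the studies cited below. The first builds the double difference from the four group means; the second reads it off as the interaction coefficient in a regression.

import pandas as pd
import statsmodels.formula.api as smf

def did_means(df: pd.DataFrame) -> float:
    # (t1 - t0) - (c1 - c0), built from the four group means.
    m = df.groupby(["treated", "post"])["outcome"].mean()
    return (m[1, 1] - m[1, 0]) - (m[0, 1] - m[0, 0])

def did_regression(df: pd.DataFrame) -> float:
    # In this two-group, two-period setup, the same number is the
    # coefficient on the treated x post interaction.
    fit = smf.ols("outcome ~ treated + post + treated:post", data=df).fit()
    return fit.params["treated:post"]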



Page 21

Examples

Malaria

• Bleakley, Hoyt. Malaria Eradication in the Americas: A Retrospective Analysis of Childhood Exposure. Working paper.

Land Reform

• Besley, Timothy and Robin Burgess. Land Reform, Poverty Reduction, and Growth: Evidence from India. Quarterly Journal of Economics, May 2000, 389-430.

Page 22

Matching

• Match each treated participant to one or more untreated participants based on observable characteristics.

• Assumes no selection on unobservables.

• Or condense all observables into one “propensity score” and match on that score.

Page 23

Matching

• After matching each treated unit to the most similar untreated unit, subtract the outcomes and calculate the average difference:

[ (Y_Jon(T) − Y_John(C)) + (Y_Jim(T) − Y_Tim(C)) ] / 2
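A minimal propensity-score matching sketch in Python, using scikit-learn for the score and one nearest untreated match per treated unit; the column names and the covariate list are placeholders, not from the studies cited below.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def matching_estimate(df: pd.DataFrame, covariates: list) -> float:
    # Condense the observables into one propensity score.
    model = LogisticRegression().fit(df[covariates], df["treated"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # Match each treated unit to the untreated unit with the closest score,
    # take the outcome difference, then average the differences.
    diffs = []
    for _, row in treated.iterrows():
        match = control.iloc[(control["pscore"] - row["pscore"]).abs().argmin()]
        diffs.append(row["outcome"] - match["outcome"])
    return float(np.mean(diffs))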

Page 24

Matching

Examples:

Does piped water reduce diarrhea?

• Jalan, Jyotsna and Martin Ravallion. Does Piped Water Reduce Diarrhea for Children in Rural India? Journal of Econometrics, January 2003, 153-173.

Anti-poverty program in Argentina

• Jalan, Jyotsna and Martin Ravallion. Estimating the Benefit Incidence of an Antipoverty Program by Propensity Score Matching. Journal of Business and Economic Statistics, January 2003, 19-30.

Page 26

Summary

The method that rests on the weakest (easiest to believe) assumptions is the best method.

Randomization wins.

Real scientists use it too.

Page 27

Proof by One Example

LaLonde, Robert. Evaluating the Econometric Evaluations of Training Programs with Experimental Data. American Economic Review, September 1986.

Run a randomized evaluation and analyze it well. Then pretend you don’t have all the data that you do, construct fake comparison groups using the census, and show that none of your crazy non-experimental methods gets you the right answer.