Example of an education intervention


Girls’ scholarship program

What helps improve learning?
Often small/no impacts on actual learning in education research
◦ Inputs (textbooks, flipcharts): little impact on learning
◦ De-worming affected attendance but not test scores

What is often most important in education policies and programs? Incentives

What happens if we offer direct incentives for student learning?

What happens if we offer this only to a disadvantaged subgroup? Girls

The debate over cash incentives
◦ “Pros”
  Incentives to exert effort
  Helps with self-control problems
  Externalities to effort
◦ Possible “cons”
  Exacerbate inequality
  Weaken intrinsic motivation (short or long run)
  Gaming the system (cramming, cheating)

Merit awards could affect
◦ Eligible students’ own effort
◦ Other students’ effort & teacher effort, which could be either complements or substitutes

The Girls’ Scholarship Program
Randomized evaluation in Kenyan primary schools
◦ 63 treatment & 64 comparison schools
◦ Balanced treatment groups

Announced an award for girls in treatment schools
◦ Based on end-of-year standardized test scores
◦ Top 15% of grade 6 girls in program schools win the award

◦ 1,000 KSh (US$12.80) for the winner and her family
◦ 500 KSh (US$6.40) for school fees
◦ Public recognition at an award ceremony

Two cohorts of scholarship winners, 2001 & 2002
Survey data on attendance, study habits, attitudes

Program implemented in two districts: Teso & Busia

Randomization and awards stratified by district
◦ Historical and ethnic differences in the two districts
◦ NGOs have poor relations with some Teso communities
◦ Tragic lightning incident early 2001
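A minimal sketch of school-level randomization stratified by district, as described above. The school counts (63 treatment, 64 comparison) follow the slides; the IDs, per-district split, and seed are illustrative assumptions, not the study’s actual procedure.

# Illustrative stratified randomization: assign schools within each
# district separately. IDs and seed are hypothetical.
import random

random.seed(2001)  # fixed seed so the assignment can be reproduced

schools = [{"id": f"{d}-{i}", "district": d}
           for d, n in [("Busia", 64), ("Teso", 63)]
           for i in range(n)]

assignment = {}
for district in ("Busia", "Teso"):
    group = [s["id"] for s in schools if s["district"] == district]
    random.shuffle(group)
    half = len(group) // 2
    for sid in group[:half]:
        assignment[sid] = "T"   # treatment: girls eligible for the award
    for sid in group[half:]:
        assignment[sid] = "C"   # comparison

print(sum(v == "T" for v in assignment.values()), "treatment schools")
print(sum(v == "C" for v in assignment.values()), "comparison schools")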

[Map: GSP schools in the Teso and Busia districts, marking comparison (C) and treatment (T) schools, school attrition and pull-outs, the lightning strike, and a winner who refused the award; scale in miles.]

School attrition: five Teso schools pulled out in 2001

Test attrition: share of students with complete 2001 test data, treatment vs. comparison
◦ Teso 54% vs. 66%, Busia 77% vs. 77%

Differential test attrition: significantly more high-achieving students took the 2001 exam in comparison schools relative to program schools, likely to bias estimated program impacts toward zero in Teso

How to deal with it?
Make sure that no one drops out from your original treatment and control groups.
If there is still attrition…
◦ Check that it is not different in treatment and control.
◦ Also check that it is not correlated with observables.
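A minimal sketch of these checks, assuming a student-level dataset with hypothetical columns (took_exam, treat, baseline_score, school_id) and school-clustered standard errors; none of the names come from the study.

# Sketch: is attrition different across arms, and is it correlated
# with baseline observables? All column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gsp_students.csv")      # hypothetical file
df["attrited"] = 1 - df["took_exam"]      # 1 if no 2001 test score

# Raw attrition rates by arm
print(df.groupby("treat")["attrited"].mean())

# Differential attrition: regress attrition on treatment,
# clustering standard errors at the school level
m1 = smf.ols("attrited ~ treat", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school_id"]})

# Does attrition vary with baseline achievement, and differently by arm?
m2 = smf.ols("attrited ~ treat * baseline_score", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(m1.params["treat"], m2.params["treat:baseline_score"])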

If there is differential attrition:
1. Impute the outcome variable based on baseline covariates
2. Bounds: run the analysis under the “best-case” and “worst-case” scenarios, assuming either the best or the worst students are the ones who drop out, at a rate equal to the rate of differential attrition
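A sketch of the trimming logic in point 2, assuming scores are missing for attriters and the treatment indicator is 0/1; the function and column names are hypothetical.

# Trimming bounds: in the arm with less attrition, drop the best or the
# worst observed scorers at the differential-attrition rate and
# re-estimate the effect under each scenario.
import pandas as pd

def trimmed_bounds(df, score="test_score", treat="treat"):
    # share of students with an observed score, by arm
    p = df.groupby(treat)[score].apply(lambda s: s.notna().mean())
    high, low = (1, 0) if p[1] > p[0] else (0, 1)    # high = arm with less attrition
    trim_share = 1 - p[low] / p[high]

    obs = df.dropna(subset=[score])
    grp = obs[obs[treat] == high].sort_values(score)
    k = int(round(trim_share * len(grp)))            # number of students to trim

    estimates = []
    for trimmed in (grp.iloc[k:], grp.iloc[:len(grp) - k]):  # drop worst k / best k
        means = {high: trimmed[score].mean(),
                 low: obs.loc[obs[treat] == low, score].mean()}
        estimates.append(means[1] - means[0])        # treatment minus comparison
    return {"lower": min(estimates), "upper": max(estimates)}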

Estimate total effects and district effects
Estimate effects for treatment schools (T):

TEST_ist = α + β·T_s + X_ist′δ + ε_ist   (i = student, s = school, t = year)
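A sketch of estimating this specification by OLS with school-clustered standard errors; the data file, covariates, and column names are placeholders, not the study’s.

# Sketch of the estimating equation above. Covariates are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gsp_test_scores.csv")   # hypothetical file

model = smf.ols("test_score ~ treat + C(district) + age + grade",
                data=df).fit(cov_type="cluster",
                             cov_kwds={"groups": df["school_id"]})
print(model.params["treat"])              # beta: estimated program effect

# District-specific effects: interact treatment with the district dummies,
# e.g. "test_score ~ treat:C(district) + C(district) + age + grade"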

Cheating is likely not a concern
Evidence of “learning”: consistent effects over two years and two cohorts
No effect on tutoring, household textbook purchases, self-esteem, attitudes toward school, or amount of chores at home

Teachers report more parental support in Busia

Student and teacher attendance increased

Important to think through programmatic issues when designing interventions – incentives must be aligned (teachers, parents, students…)

Randomizing by school can help to pick up within class/school externalities

Things can go wrong – need to monitor attrition
Large and persistent gains in learning are possible to achieve

What if instead of linking student performance to students, we made the teachers responsible?

Randomized evaluation in Kenya (Glewwe, Ilias and Kremer 2004)

Offered teachers prizes based on schools’ average scores
◦ Top-scoring schools and most-improved schools (relative to baseline)
◦ In each category, 1st, 2nd, 3rd and 4th prizes were awarded (21% to 43% of teacher monthly salary)
◦ Penalized teachers for dropouts by assigning low scores to students who did not take the exam
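A small sketch of that scoring rule: compute each school’s average with absent students counted at a low penalty score, so dropouts pull the average down. The penalty value, function, and column names are illustrative, not taken from Glewwe, Ilias and Kremer.

# School averages for the teacher-prize ranking, counting students who
# did not sit the exam at a penalty score. Values are illustrative.
import pandas as pd

def school_prize_scores(df, penalty=0.0):
    """Mean exam score per school, with absent pupils scored at `penalty`."""
    filled = df["exam_score"].fillna(penalty)          # missing = did not take exam
    return filled.groupby(df["school_id"]).mean().sort_values(ascending=False)

# Top-scoring schools: school_prize_scores(exam_df).head(4)
# "Most improved" would rank the change in this average relative to a baseline year.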

What was affected:
◦ Treatment scores 0.14 sd above control
◦ Strongest for geography, history, and religion (most memorization)
◦ Exam participation rose
◦ Extra prep sessions

What was NOT affected:
◦ Dropout/repetition rates
◦ Teacher attendance
◦ Homework assignment or pedagogy
◦ Lasting test score gains

Conclusions:
Teachers’ effort concentrated on improving short-run outcomes rather than stimulating long-run learning

Busia: overall effect of 0.18–0.20 s.d.
◦ Persistent effect for girls the next year
◦ Spillover effect for boys

Teso: Scholarship less successful: either no significant program effect or unreliable estimates

Merit-based scholarships can motivate students to exert effort
◦ Test score and attendance gains among girls in the medium run

Positive classroom externalities
◦ Initially low-achieving girls, boys, and teachers

Possible multiple equilibria in classroom culture
Cost-effective way to boost test scores
Equity concerns – may wish to restrict to particular areas or populations
