Page 1

PEPA is based at the IFS and CEMMAP

© Institute for Fiscal Studies

Programme Evaluation for Policy Analysis
Mike Brewer, 19 October 2011

www.pepa.ac.uk

Page 2

Programme Evaluation for Policy Analysis: overview

• Part of the ESRC-funded National Centre for Research Methods

• PEPA is about ways to do, and ways to get the most out of, “programme evaluation”


(“programme evaluation” = “estimating the causal impact of” “government policies”, although the methods can often generalise beyond these)

Page 3

Programme Evaluation for Policy Analysis: overview

• Part of the ESRC-funded National Centre for Research Methods

• PEPA is about ways to do, and ways to get the most out of, “programme evaluation”
– Estimating the counterfactual
– Characterising the uncertainty
– Generalising and synthesising
(see the sketch at the end of this slide for a minimal illustration of the first two items)

• Beneficiaries
– those who do programme evaluation
– those who commission or design evaluations, or make decisions based on the results of evaluations
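
The following is not from the original slides: a minimal Python sketch, under purely hypothetical parameter values, of the first two items above. It simulates a randomised experiment, estimates the counterfactual comparison as a difference in mean outcomes, and characterises the uncertainty with a conventional standard error.

```python
import numpy as np

# Hypothetical illustration: in a randomised experiment the control group
# supplies the counterfactual, so the average impact is a difference in means
# and its uncertainty is summarised by a conventional standard error.
rng = np.random.default_rng(0)

n = 2_000                    # sample size (assumed)
true_effect = 0.5            # impact used to generate the data (assumed)
treated = rng.integers(0, 2, size=n).astype(bool)            # random assignment
outcome = 1.0 + true_effect * treated + rng.normal(0.0, 2.0, size=n)

# Difference in means estimates E[Y(1)] - E[Y(0)] under randomisation.
impact_hat = outcome[treated].mean() - outcome[~treated].mean()

# Standard error of a difference between two independent means.
se = np.sqrt(outcome[treated].var(ddof=1) / treated.sum()
             + outcome[~treated].var(ddof=1) / (~treated).sum())

print(f"estimated impact = {impact_hat:.3f} (s.e. {se:.3f}); true effect = {true_effect}")
```

In most policy settings randomisation is unavailable, which is exactly where the methods PEPA studies (matching, control functions, bounds, structural models) come in.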


Page 4

1. Step change in conduct of programme evaluation

• Training courses, workshops, on-line resources

• Research
– doing inference more accurately
– social networks and policy interventions
– estimating bounds on the true impact

• Substantive research projects as exemplars


Page 5

2. Maximise the value of programme evaluation

• Research
– Combining behavioural models with the results of evaluations
– Comparing RCTs with non-experimental approaches
– Synthesising results

• Training courses, especially for
– Those who commission evaluations and interpret results of evaluations (link with Cross-Government Evaluation Group)


Page 6

PEPA: who we are

• Professor Richard Blundell, UCL & IFS
• Professor Mike Brewer, University of Essex & IFS (Director)
• Professor Andrew Chesher, UCL & IFS
• Dr Monica Costa Dias, IFS (Deputy Director)
• Dr Thomas Crossley, Cambridge & IFS
• Professor Lorraine Dearden, Institute of Education & IFS
• Dr Hamish Low, Cambridge & IFS
• Professor Imran Rasul, UCL & IFS
• Dr Barbara Sianesi, IFS
• Department for Work and Pensions is a partner

www.pepa.ac.uk

Page 7

Thoughts on evidence provision

• Pilots are often not designed with the aim of answering “what works?”

• Funding “what works?” research is seen as the government’s responsibility

• Data

• Limitations of “what works” evidence


Page 8

Spare slides on PEPA


Page 9

PEPA: overview

PEPA (Director: Mike Brewer)

1. Are RCTs worth it? (Barbara Sianesi, Jeremy Lise)

2. Inference (Thomas Crossley, Mike Brewer, Marcos Hernandez, John Ham)

3. Control functions and evidence synthesis (Richard Blundell, Adam Rosen, Monica Costa Dias, Andrew Chesher)

4. Structural dynamic models (Hamish Low, Monica Costa Dias, Costas Meghir)

5. Social networks (Imran Rasul, Marcos Hernandez)

0. Core programme evaluation skills


Page 10

PEPA: research questions

1. Are RCTs worth it?
– Can non-experimental methods replicate the results of RCTs?
– How can we combine results from RCTs with models of labour market behaviour?
– How do GE effects alter the estimated impact of training programmes?

2. Inference (see the first sketch after this list)
– Correct inference and power calculations where data have multi-level structure & serially-correlated shocks?
– Correct inference when policy impacts are complex functions of estimated parameters?
– Impact of time-limited in-work benefits on job retention?

3. Control functions and evidence synthesis (see the second sketch after this list)
– Can we weaken the control function approach to estimate bounds?
– Link between control functions and structural or behavioural models?
– How are lessons from multiple evaluations best synthesised?

4. Structural dynamic models
– How best to use ex post evaluations in ex ante analysis?
– How are education decisions affected by welfare-to-work programmes?
– How do life-cycle time limits on welfare receipt affect behaviour?

5. Social networks
– How best to collect data on social networks?
– How is the impact of a policy affected by the social networks within and between treated and control groups?
– Can social networks explain heterogeneity in the impact of a health intervention?
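
First sketch (not from the original slides): a minimal Python illustration, under assumed parameter values, of the inference questions in work package 2. Treatment is assigned at the group level, outcomes share a within-group shock, the effect is estimated by OLS with a cluster-robust (sandwich) standard error, and power is approximated by simulation as the rejection rate across replications. Serially-correlated shocks are not modelled here; only the multi-level structure is.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_once(n_groups=40, group_size=25, effect=0.2, group_sd=1.0, noise_sd=2.0):
    """One simulated evaluation with treatment assigned at the group level (values assumed)."""
    g = np.repeat(np.arange(n_groups), group_size)            # group index for each unit
    treat_group = rng.integers(0, 2, size=n_groups).astype(float)
    d = treat_group[g]                                         # unit-level treatment status
    group_shock = rng.normal(0.0, group_sd, size=n_groups)[g]  # shared within-group shock
    y = effect * d + group_shock + rng.normal(0.0, noise_sd, size=len(g))
    return y, d, g

def cluster_robust_t(y, d, g):
    """t-statistic on d from OLS of y on (1, d) with a cluster-robust sandwich variance."""
    X = np.column_stack([np.ones_like(d), d])
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    meat = np.zeros((2, 2))
    for grp in np.unique(g):
        score = X[g == grp].T @ resid[g == grp]
        meat += np.outer(score, score)
    V = XtX_inv @ meat @ XtX_inv
    return beta[1] / np.sqrt(V[1, 1])

# Power calculation by simulation: how often does the cluster-robust
# t-statistic reject "no effect" at the 5% level?
n_reps = 500
rejections = sum(abs(cluster_robust_t(*simulate_once())) > 1.96 for _ in range(n_reps))
print(f"estimated power: {rejections / n_reps:.2f}")
```

Ignoring the group structure and using the usual OLS standard error would overstate precision, and hence power, which is the kind of mistake this work package is designed to prevent.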

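Second sketch (not from the original slides): a deliberately simple linear control-function example for work package 3, with hypothetical data and a single instrument. The first-stage residual is added to the outcome equation to absorb the endogenous variation in the treatment; the work package's question is how far these parametric assumptions can be weakened, for example to deliver bounds rather than point estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: treatment intensity d is endogenous (it depends on the
# unobservable u), but an instrument z shifts d and is independent of u.
n = 5_000
z = rng.normal(size=n)                       # instrument (assumed valid)
u = rng.normal(size=n)                       # unobservable driving both d and y
d = 0.8 * z + 0.5 * u + rng.normal(size=n)
y = 1.0 + 0.7 * d + u + rng.normal(size=n)   # true causal effect of d is 0.7

def ols(y, regressors):
    """OLS coefficients with an intercept added as the first column."""
    X = np.column_stack([np.ones(len(y))] + list(regressors))
    return np.linalg.lstsq(X, y, rcond=None)[0]

naive = ols(y, [d])[1]                 # biased: d is correlated with u

# Control function: first stage of d on z, then include the first-stage
# residual v_hat in the outcome equation to control for the endogeneity.
a0, a1 = ols(d, [z])
v_hat = d - (a0 + a1 * z)
cf = ols(y, [d, v_hat])[1]

print(f"naive OLS: {naive:.2f}; control-function estimate: {cf:.2f} (truth 0.7)")
```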

Page 11

PEPA: training and capacity building

0. Core programme evaluation skills
– Core course in evaluation methods
– Courses for designers and users of evaluations
– How-to guide for PS matching (see the sketch after this list)

1. Are RCTs worth it?
– Course on estimating “search models”
– Workshop on using “search models”
– Workshop on the value of RCTs

2. Inference
– Course, manual and software tools on power calculations
– Courses, survey, manual & software for better inference
– Workshop and courses on using survivor models for policy evaluation

3. Control functions and evidence synthesis
– Course and workshop on control functions
– Course on bounds in policy evaluation

4. Structural dynamic models
– Courses and resources for building dynamic behavioural models
– Workshop on dynamic behavioural models and policy evaluation

5. Social networks
– Survey and courses on using data on social networks
– Survey and courses on collecting data on social networks
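
Not the how-to guide itself, which the slide only announces: the sketch below shows the bare mechanics of propensity score matching in Python on simulated data, with hypothetical variable names and scikit-learn assumed available. It uses a logit for the propensity score and single nearest-neighbour matching with replacement, with no caliper or common-support checks, so it illustrates the idea rather than a recommended recipe.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Hypothetical observational data: participation depends on observed x, and so
# does the outcome, so a raw comparison of treated and untreated means is confounded.
n = 4_000
x = rng.normal(size=(n, 2))
p_treat = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
d = rng.random(n) < p_treat
y = 2.0 * d + x[:, 0] + 0.5 * x[:, 1] + rng.normal(size=n)   # true ATT is 2.0

# Step 1: estimate the propensity score P(D = 1 | x) with a logit.
pscore = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]

# Step 2: match each treated unit to the control with the closest propensity
# score (nearest neighbour, with replacement).
treated_idx = np.flatnonzero(d)
control_idx = np.flatnonzero(~d)
gaps = np.abs(pscore[control_idx][None, :] - pscore[treated_idx][:, None])
matches = control_idx[gaps.argmin(axis=1)]

# Step 3: the ATT is the mean outcome gap between treated units and their matches.
att = (y[treated_idx] - y[matches]).mean()
print(f"matching estimate of the ATT: {att:.2f} (data generated with 2.0)")
```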
