Workshop: Evaluations as a Policy Instrument for Sustainable Development – How can we improve the learning process in political decision-making?
Stockholm, October 18 – Johan Harvard
Workshop program
13.00 Welcome – Johan Harvard, Swedish Agency for Growth Policy Analysis
13.00 – 13.20 The Power of Knowing: Why Improved Evaluation and Learning are Needed to Realise the Green Future – Johan Harvard, Swedish Agency for Growth Policy Analysis
13.20 – 13.40 Dealing with Complex Policy Evaluations – The European Social Fund – Steffen Ovdahl, Ramböll Management Consulting
13.40 – 14.30 Plenary Panel Discussion – Comments by:
Erik Fahlbeck, Chief Analyst, Economic Analysis Secretariat, Ministry of Enterprise
Barbro Widerstedt, Analyst, Swedish Agency for Growth Policy Analysis
Steffen Ovdahl, Senior Consultant, Ramböll Management Consulting
Presentation: The Power of Knowing: Why Improved Evaluation and Learning are Needed to Realise the Green Future
Summary
• Lots of policy movement in the SD/GG field in developed countries
• Complex challenges – complex policy
• Policy is not sufficiently/appropriately evaluated
• Lessons not learned – inefficient policy and wasted tax money
• However – SD/GG policy is very complex to evaluate
• Thereby – SD/GG policy evaluations are difficult for policymakers to use
Key terminology
• Evaluation vs Monitoring vs Benchmarking
• Evaluating process/implementation, results, impact
• Factual vs Counterfactual: need to assess additionality
Green policy is picking up speed
• European Union’s Growth Strategy for 2020
• Korea’s National Strategy and Five-Year Plan for green growth
• Green development focus of China’s Five-Year Plan
• South Africa’s New Growth Path and Green Economy Accord
Needs highlighted
• Wish to stimulate sustainable growth: push for action
The policy cycle
• Analysis preceding policy action – ex-ante analysis
• Policy action – decisions are made
• Implementation – execution of policy
• Learning during/after implementation – lessons learned
• New policy cycle
Status – Policy Process: Crucial Elements Flawed
What is being done today?
• Local/regional level
– Yes: some monitoring/evaluation of implementation and results
– No: impact evaluations are very rare
• National level
– Yes: benchmarking, monitoring macro indicators
– No: impact evaluations and system evaluations linking policy to outcomes – very rare
Consequence: Suboptimal policy
• If you do not measure results, you cannot tell success from failure.
• If you cannot see success, you cannot reward it.
• If you cannot reward success, you are probably rewarding failure.
• If you cannot see success, you cannot learn from it.
• If you cannot recognize failure, you cannot correct it.
• If you can demonstrate results, you can win public support.
Source: Adapted from Osborne & Gaebler 1992.
But… difficult problem to address
• SD/GG policy is inherently complex
– Policy on all levels simultaneously (local/regional/national/international)
– Very different types of policies implemented simultaneously (supply/demand, skills/capital/incentives/market)
– External factors very important (economic turmoil, global market demand, competition)
– Conflicting objectives (economic growth vs sustainability)
– Multiple policy areas – many different actors in play
– Interaction effects
Simple – Complicated - Complex
• Simple problems – like baking a cake. A few basic techniques, and once mastered the likelihood of success is high.
• Complicated problems – like sending a rocket to the moon. There is no straightforward recipe, and they often require multiple people and teams. Unanticipated difficulties are frequent, and timing and coordination become serious concerns. Yet once you learn how to send a rocket to the moon, you can repeat and improve the process – one rocket is like another rocket.
• Complex problems – like raising a child. Raising one child is not like raising another; each is unique, and one child may require a completely different approach from the previous one. This points to another feature of highly complex problems: their outcomes remain highly uncertain. Source: Rogers (2008)
Evaluating SD/GG policy
• Qualitative evaluation
– Contribution analysis
– Theory-based approaches
– etc.
• Quantitative evaluation
– Comparison groups (to deal with selection bias)
– Quasi-experimental methods (propensity score matching, regression discontinuity, etc.)
– Difference-in-differences (if unobservables are time-invariant)
• There is no single solution – combinations are needed: mixed methods
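The difference-in-differences idea can be sketched numerically. The sketch below uses synthetic, illustrative figures (not data from any Swedish program) and assumes the method's parallel-trends condition: without the intervention, treated and comparison firms would have changed by the same amount, so the comparison group's change stands in for the counterfactual.

```python
# Minimal difference-in-differences sketch on synthetic data.
# Assumption: parallel trends – the comparison group's change
# approximates what the treated group would have done untreated.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Average change in the treated group minus the average
    change in the comparison group (the additionality estimate)."""
    def mean(xs):
        return sum(xs) / len(xs)
    treated_change = mean(treated_post) - mean(treated_pre)
    control_change = mean(control_post) - mean(control_pre)
    return treated_change - control_change

# Hypothetical firm turnover (e.g. MSEK) before/after a support program.
treated_pre = [10.0, 12.0, 11.0]
treated_post = [14.0, 16.0, 15.0]   # treated firms grew by 4 on average
control_pre = [9.0, 11.0, 10.0]
control_post = [10.0, 12.0, 11.0]   # comparison firms grew by 1

effect = did_estimate(treated_pre, treated_post, control_pre, control_post)
print(effect)  # → 3.0: factual change (4) minus counterfactual trend (1)
```

This illustrates the factual-vs-counterfactual distinction from the terminology slide: the treated firms' raw growth of 4 overstates the program's effect, because part of it (1) would have happened anyway.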
Useful evaluation – evaluation utilization
• Currently, partial evaluations are carried out, leading to an inability to draw clear conclusions – such evaluations are difficult to use

Patton:
• ”Evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that is done, from beginning to end, will affect use”
• Focus on intended use by intended users• Mutual responsibility of users and evaluators
Sweden’s Cleantech Industry Development Strategy – Evaluation
• Encompassing 4 x 100 million SEK over 2011–2014; currently 18 programs launched to help businesses grow
– Includes supply- and demand-side policies: export advisory services, innovative public procurement, cleantech incubation, inward FDI, commercialization & export of public sector expertise, establishing business–research development consortia, matching of investors and investees, etc.
• Evaluation:
– Process, results, impacts
– Top-down + bottom-up
– Quantitative + qualitative
– Utilization: hoping for process use as well as instrumental use
Conclusions and Recommendations
• Need for systematic improvement of evaluations – more impact evaluations
• Authorities should not underestimate the complexity and importance of impact evaluations– Need better incentives? Will cost more initially!
• Evaluators should improve their use of best methods for impact evaluation – including use of quantitative methods
Questions
• We need more and better evaluations at all levels – how can this be achieved?
– Incentives for impact evaluations?
– How can we make sure evaluations include economic and environmental aspects – not either/or?
• How can the procurement of evaluations improve, to facilitate better evaluations and improve evaluation use?
• Complex evaluations are difficult for policymakers to use: how can this be overcome? Capacity for use?
• Could/should policy be designed differently, to facilitate evaluation?
– …even if a changed design would compromise quality?