Evaluation Collaboration: Opportunities and Challenges
Michael Quinn Patton, Utilization-Focused Evaluation
19 May 2010
Evaluating the Haiti Response: Encouraging Improved System-wide Collaboration
A joint OECD/DAC Evalnet-UNEG-ALNAP Roundtable Meeting


VISION and PERCEPTION

THOMAS’ THEOREM

What is perceived as real is real in its consequences.

Degrees of working together
1. Networking: sharing information and ideas.
2. Cooperating: helping distinct members accomplish their separate individual goals.
3. Coordinating: working separately on shared goals.
4. Collaborating: working together toward a common goal but maintaining separate resources/responsibilities.
5. Partnering: shared goals, shared decisions, shared resources within a single entity.


Please number a sheet of paper:

1.

2.

3.


Question 1. What is the current state of working together in humanitarian evaluation (the current baseline)?

0. Not at all working together.
1. Networking: sharing information and ideas.
2. Cooperating: helping distinct members accomplish their separate individual goals.
3. Coordinating: working separately on shared goals.
4. Collaborating: working together toward a common goal but maintaining separate resources and responsibilities.
5. Partnering: shared goals, shared decisions, shared resources within a single entity.


Question 2. From your perspective, what is the desired level of working together for the Haitian relief evaluation that should be the goal of this meeting?

0. Not at all working together.
1. Networking: sharing information and ideas.
2. Cooperating: helping distinct members accomplish their separate individual goals.
3. Coordinating: working separately on shared goals.
4. Collaborating: working together toward a common goal but maintaining separate resources and responsibilities.
5. Partnering: shared goals, shared decisions, shared resources within a single entity.


Question 3. At what level of working together are you personally committed on behalf of your organization?

0. Not at all working together.
1. Networking: sharing information and ideas.
2. Cooperating: helping distinct members accomplish their separate individual goals.
3. Coordinating: working separately on shared goals.
4. Collaborating: working together toward a common goal but maintaining separate resources and responsibilities.
5. Partnering: shared goals, shared decisions, shared resources within a single entity.

Evaluation Framework

The WORKING TOGETHER continuum as an evaluation framework for evaluating the Haitian relief evaluation.

Potentially different degrees of working together for different evaluation purposes.


Diverse Types of Evaluation

Different uses for different primary intended users


Accountability

• Were funds used appropriately? Each organization is accountable to its own stakeholders for basic accountability.

• What can be done together? Criteria for accountability, for example:
  * What is appropriate use of relief funds?
  * What is timely relief?
  * What is coordinated relief?


Other Evaluation Purposes

• Improvement of relief efforts in the short term
• Learning and sharing lessons
• Development of Haiti in the long term
• Overall judgment about Haitian relief

Speaking Truth to Power



Shake Hands with the Devil by Roméo Dallaire


New Directions for Evaluation
Enhancing Disaster and Emergency Preparedness, Response, and Recovery Through Evaluation
Edited by Liesel Ashley Ritchie and Wayne MacDonald

1. Enhancing Disaster and Emergency Preparedness, Response, and Recovery Through Evaluation. Liesel Ashley Ritchie and Wayne MacDonald
2. Real-Time Evaluation in Humanitarian Emergencies. Emery Brusset, John Cosgrave, and Wayne MacDonald
3. The Interagency Health and Nutrition Evaluation Initiative in Humanitarian Crises: Moving From Single-Agency to Joint, Sectorwide Evaluations. Olga Bornemisza, André Griekspoor, Nadine Ezard, and Egbert Sondorp
4. Save the Children’s Approach to Emergency Evaluation and Learning: Evolution in Policy and Practice. Megan Steinke-Chase and Danielle Tranzillo
5. Logic Modeling as a Tool to Prepare to Evaluate Disaster and Emergency Preparedness, Response, and Recovery in Schools. Kathy Zantal-Wiener and Thomas J. Horwood
6. Evolution of a Monitoring and Evaluation System in Disaster Recovery: Learning from the Katrina Aid Today National Case Management Consortium. Amanda Janis, Kelly M. Stiefel, and Celine C. Carbullido
7. Disasters, Crises, and Unique Populations: Suggestions for Survey Research. Patric R. Spence and Kenneth A. Lachlan
8. Evaluation of Disaster and Emergency Management: Do No Harm, But Do Better. Liesel Ashley Ritchie and Wayne MacDonald

FOCUS and PRIORITIES

Utilization-Focused Evaluation lessons
1. Less is more: the dangers and delusions of comprehensiveness. Quality first.
2. Ask important questions: better to get weaker data on important questions than harder data on unimportant questions.
3. Stay focused on use: actionable findings for intended uses by primary intended users.


Understanding and Creating CONTEXT

“Social scientists need to recognize that individual behavior is strongly affected by the context in which interactions take place rather than being simply a result of individual differences.”

Elinor Ostrom, 2009 Nobel Prize in Economics, for her analysis of economic governance, especially the commons.


The Central Role of Trust

“Crucial role of trust among participants as the most efficient mechanism to enhance transactional outcomes….

Empirical studies confirm the important role of trust in overcoming social dilemmas.”

Reference: American Economic Review 100 (June 2010): 133. http://www.aeaweb.org/articles.php?doi=10.1257/aer.100.3.1


The Central Role of Trust

• “Thus, it is not only that individuals adopt norms but also that the structure of the situation generates sufficient information about the likely behavior of others to be trustworthy reciprocators who will bear their share of the costs of overcoming a dilemma.”


Elinor Ostrom’s Conclusion

“Extensive empirical research leads me to argue that … a core goal of public policy should be to facilitate the development of institutions that bring out the best in humans. We need to ask how diverse polycentric institutions help or hinder the innovativeness, learning, adapting, trustworthiness, levels of cooperation of participants, and the achievement of more effective, equitable, and sustainable outcomes at multiple scales.”


Evaluation Theory of Change

• Process Use – Beyond findings’ use

• How an evaluation is conducted has an impact beyond the findings that come from the evaluation.

• For example, evaluation questions carry messages about what matters, what’s important.


Lessons on effective evaluation collaborations

Research on Shared Measurement Platforms, Comparative Performance Systems, and Adaptive Learning Systems

The research was based on six months of interviews and research by FSG Social Impact Advisors. They examined 20 efforts to develop shared approaches to performance, outcome or impact measurement across multiple organizations.

Reference: Kramer, Mark, Marcie Parkhurst, & Lalitha Vaidyanathan (2009). Breakthroughs in Shared Measurement and Social Impact. FSG Social Impact Advisors. www.fsg-impact.org


Eight factors important to effective evaluation collaborations

1. Strong leadership and substantial funding throughout a multi-year development period.

2. Broad engagement in the design process by many organizations in the field, with clear expectations about confidentiality or transparency.

3. Voluntary participation open to all relevant organizations.

4. Effective use of web-based technology.


5. Independence from funders in devising indicators and managing the system.

6. Ongoing staffing to provide training and facilitation and to review the accuracy of all data.

7. Testing and continually improving the system through user feedback.

8. In more advanced systems, a facilitated process for participants to gather periodically to share results, learn from each other, and coordinate their efforts.


Adaptive Learning Systems

The most important lesson learned: the power of breakthroughs to promote a systemic and adaptive approach to solving social problems.

Adaptive Learning Systems offer a new vision

that goes beyond capacity building for individual organizations.


Adaptive Learning Systems

Breakthroughs offer ways to increase the efficiency, knowledge, and effectiveness of the entire system of interrelated organizations that affect complex interactions and working together.

Adaptive Learning Systems provide a collaborative process for all participating organizations to learn, support each other’s efforts, and improve over time.


Adaptive Learning Systems

“We believe that shared measurement systems can help move the sector beyond fragmented and disconnected efforts…by creating a new degree of coordination and learning that can magnify the impact of funders and grantees alike.”


Adaptive Learning Systems
Attentive to complexity concepts and understandings: COMPLEX DYNAMIC INTERACTIONS, EMERGENCE, ADAPTABILITY, CO-EVOLUTION, DEALING WITH UNCERTAINTY, SYSTEMS CHANGE, INNOVATION.

For more on complexity and aid, see Ramalingam et al. (2008), Exploring the Science of Complexity, ODI Working Paper 285. http://www.odi.org.uk/resources/download/583.pdf


Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use

Elinor Ostrom on Complexity

To explain the world of interactions and outcomes occurring at multiple levels, we also have to be willing to deal with complexity instead of rejecting it. Some mathematical models are very useful for explaining outcomes in particular settings. We should continue to use simple models where they capture enough of the core underlying structure and incentives that they usefully predict outcomes. When the world we are trying to explain and improve, however, is not well described by a simple model, we must continue to improve our frameworks and theories so as to be able to understand complexity and not simply reject it.


Major Capacity Development Purpose Question

Is the collaborative system to be designed only for Haitian relief, or is it to become the foundation for future evaluations of disaster relief?

What’s the VISION for the system of evaluation collaboration being built?


Results of the Baseline Survey

THOMAS’ THEOREM

What is perceived as real is real in its consequences.
