Developmental Evaluation: Systems Thinking and Complexity Science

Michael Quinn Patton

MESI 2014

Page 1

Developmental Evaluation: Systems Thinking and Complexity Science

Michael Quinn Patton

MESI 2014

Page 2

Research on Social Innovators’ Perceptions of Evaluation

Page 3

Interpretive Frameworks

• Kathleen Sutcliffe and Klaus Weber, "The High Cost of Accuracy," Harvard Business Review, May 2003.

They concluded that "the way senior executives interpret their business environment is more important for performance than how accurately they know their environment."

Page 4

They further concluded that it is a waste of resources to spend a lot of money increasing the marginal accuracy of data available to senior executives compared to the value of enhancing their capacity to interpret whatever data they have. Executives were more limited by a lack of capacity to make sense of data than by inadequate or inaccurate data.

In essence, they found that interpretive capacity, or "mind-sets," distinguishes high performance more than data quality and accuracy.

Page 5

Original Primary Options

Formative and Summative Evaluation
(Mid-term and End-of-Project Reviews)

Page 6

Blandin Community Leadership Program

Page 7

Fundamental issue

Development vs. Improvement

Page 8

Complex development situations are ones in which this…

[diagram: a linear logic model — INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → IMPACT — laid out over time. Inspired by Jeff Conklin, cognexus.org]

Page 9

And this…

Page 10

Turns out to be this…

[diagram: over time, multiple inputs, activities, outputs, and outcomes tangle into a nonlinear web rather than a straight chain]

Page 11

…looks like this

[diagram: the same tangled web of inputs, activities, outputs, and outcomes]

Page 12

Henry Mintzberg, strategic leadership expert

Evaluation of strategy implementation

Page 13

Mintzberg on Strategy

[diagram: Intended Strategy becomes Deliberate Strategy, which leads to Realized Strategy; Unrealized Strategy falls away, and Emergent Strategy feeds into Realized Strategy]

Page 14

Developmental Evaluation Defined

Purpose: Developmental evaluation (DE) informs and supports innovative and adaptive development in complex dynamic environments.

DE brings to innovation and adaptation the processes of asking evaluative questions, applying evaluation logic, and gathering and reporting evaluative data to support project, program, product, and/or organizational development with timely feedback.

Page 15

Key DE Characteristics

• Focus on development (versus improvement, accountability, or summative judgment)
• Takes place in complex dynamic environments
• Feedback is rapid (as close to real time as possible)
• The evaluator works collaboratively with social innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development

Page 16

Key DE Characteristics

• The DE evaluator can be part of the intervention team.
• The evaluator's primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision-making in the developmental process.
• DE becomes part of the intervention.

Page 17

High Degree of Process Use

Process use refers to, and is indicated by, individual changes in thinking and behavior, and program or organizational changes in procedures and culture, that occur among those involved in evaluation as a result of the learning that occurs during the evaluation process. Evidence of process use is represented by the following kind of statement after an evaluation: "The impact on our program came not so much from the findings but from going through the thinking process that the evaluation required."

Page 18

Small group exercise: distinguishing improvement from development

Each person describes a project example that distinguishes an improvement from a development.

Page 19

Other names

• Real-time evaluation
• Emergent evaluation
• Action evaluation
• Adaptive evaluation

Page 20

Development Evaluation

Developmental Evaluation

DD2 = developmental evaluation used for development evaluation

Page 21

Five purposes of developmental evaluation (purposes 1–3)

1. Ongoing development
   Complex system challenges: being implemented in a complex and dynamic environment; no intention to become a fixed/standardised model.
   Implications: identifies effective principles.

2. Adapting effective principles to a new context
   Complex system challenges: innovative initiative; develop 'own' version based on adaptation of effective principles and knowledge.
   Implications: top-down, general principles knowledge disseminated; bottom-up, sensitivity to context, experience, capabilities, and priorities; adaptation vs. adoption.

3. Developing a rapid response in turbulent crisis conditions (e.g., natural resource or humanitarian disaster)
   Complex system challenges: existing initiatives and responses no longer effective as conditions change suddenly.
   Implications: planning, execution, and evaluation occur simultaneously.

Page 22

Five purposes of developmental evaluation (purposes 4–5)

4. Pre-formative development of a potentially scalable innovation
   Complex system challenges: changing and dynamic situations require innovative solutions to worsening conditions; the model needs to be developed / does not exist.
   Implications: some models may move into formative and summative evaluation, others remain in developmental mode; inform different potential scaling options.

5. Major systems change and cross-scale developmental evaluation
   Complex system challenges: disrupting the existing system; taking an innovation to scale; major systems change and changing scale add levels of complexity, new uncertainties, and disagreements.
   Implications: models change as they are taken across time, space, and to larger systems; adaptive cross-scale innovations assume complex, nonlinear dynamics, requiring agility and responsiveness; adaptation vs. replication.

Page 23

Conditions that challenge traditional model-testing evaluation

• High innovation
• Development
• High uncertainty
• Dynamic
• Emergent
• Systems change

→ Adaptive Management and Developmental Evaluation

Page 24

Complexity concepts & evaluation

• Emergence: self-organizing; attractors
• Nonlinear: small actions can have large reactions (the "butterfly wings" metaphor)
• Dynamical: interactions within, between, and among subsystems and parts within systems can be volatile and changing
• Getting to Maybe: uncertainty; unpredictable; uncontrollable; unanticipated consequences
• Coevolution: process uses; interdependence
• Adaptation: staff and intended beneficiaries

Page 25

EMERGENCE

• Self-organizing group experiences and outcomes
• Inter-relationships and interconnections
• Boundaries and levels of analysis

Page 26

Taking Emergence Seriously

• Beyond "unanticipated consequences" to genuine openness

Page 27

Uncertainty and Emergence

"No battle plan ever survives contact with the enemy." Field Marshal Helmuth von Moltke

"Everyone has a plan…until he gets hit." Former world heavyweight boxing champion Mike Tyson

Tom Peters (1996), Liberation Management: "READY. FIRE. AIM."

Page 29

Seeing Through a Complexity Lens

DEFINING COMPLEXITY

Page 30

Getting to Maybe: How the World Is Changed
Frances Westley, Brenda Zimmerman, Michael Q. Patton
Random House Canada, 2006

Page 31

Conceptual Options

• Simple
• Complicated
• Complex

Page 32

Situation Analysis Matrix: Mapping the Territory

[matrix: horizontal axis = degree of certainty, from close to far; vertical axis = degree of agreement, from close to far]

Page 33

Simple Space

[matrix: close to certainty and close to agreement = Simple: plan, control]

Page 34

METAPHORS

Simple: Following a Recipe. Complicated: A Rocket to the Moon. Complex: Raising a Child.

Simple (following a recipe):
• The recipe is essential
• Recipes (best practices) are tested to assure replicability
• Recipes produce standard products
• Focus is on following the recipe (fidelity evaluation)
• The goal is certainty of the same results every time

Page 35

Page 36

Technically Complicated

[matrix: Simple (close to certainty and agreement): plan, control. Technically Complicated (far from certainty, close to agreement): experiment, coordinate expertise]

Page 37

Simple (following a recipe):
• The recipe is essential
• Recipes are tested to assure replicability of later efforts
• No particular expertise; knowing how to cook increases success
• Recipes produce standard products
• Certainty of the same results every time

Complicated (a rocket to the moon):
• Formulae are critical and necessary
• Sending one rocket increases assurance that the next will be OK
• High level of expertise in many specialized fields, plus coordination
• Rockets are similar in critical ways
• High degree of certainty of outcome

Page 38

Socially Complicated

[matrix: Simple: plan, control. Technically Complicated: experiment, coordinate expertise. Socially Complicated (close to certainty, far from agreement): build relationships, create common ground]

Page 39

Space Shuttle Disasters

• Challenger disaster: January 28, 1986
• Columbia disaster: February 1, 2003

Page 40

Socially complicated: implementing human rights agreements, like gender equity or outlawing child labor; environmental initiatives

• Many different and competing stakeholders
• Diverse vested interests
• High stakes

Page 41

Socially complicated situations pose the challenge of coordinating and integrating many players.

Page 42

Know When Your Challenges Are in the Zone of Complexity

[matrix: Simple: plan, control. Technically Complicated: experiment, coordinate expertise. Socially Complicated: build relationships, create common ground. Far from both certainty and agreement = the Zone of Complexity]

Page 43

Simple (following a recipe):
• The recipe is essential
• Recipes are tested to assure replicability of later efforts
• No particular expertise; knowing how to cook increases success
• Recipes produce standard products
• Certainty of the same results every time

Complicated (a rocket to the moon):
• Sending one rocket increases assurance that the next will be OK
• High level of expertise in many specialized fields, plus coordination
• Rockets are similar in critical ways
• High degree of certainty of outcome

Complex (raising a child):
• Formulae have only a limited application
• Raising one child gives no assurance of success with the next
• Expertise can help but is not sufficient; relationships are key
• Every child is unique
• Uncertainty of outcome remains

Page 44

Simple (following a recipe):
• The recipe is essential
• Recipes are tested to assure replicability of later efforts
• No particular expertise; knowing how to cook increases success
• The recipe notes the quantity and nature of the "parts" needed
• Recipes produce standard products
• Certainty of the same results every time

Complicated (a rocket to the moon):
• Formulae are critical and necessary
• Sending one rocket increases assurance that the next will be OK
• High level of expertise in many specialized fields, plus coordination
• Separate into parts and then coordinate
• Rockets are similar in critical ways
• High degree of certainty of outcome

Complex (raising a child):
• Formulae have only a limited application
• Raising one child gives no assurance of success with the next
• Expertise can help but is not sufficient; relationships are key
• Can't separate parts from the whole
• Every child is unique
• Uncertainty of outcome remains

Page 45

"A Leader's Framework for Decision Making," David J. Snowden and Mary E. Boone, Harvard Business Review, November 2007:

Wise executives tailor their approach to fit the complexity of the circumstances they face.

Page 46

Wise evaluators tailor their approach to fit the complexity of the circumstances they face.

Page 47

Example: The McGill-McConnell Leadership Program

• Simple elements
• Complicated elements
• Complex elements

Page 48

Simple outcomes

• Increase knowledge and skills of participants

Evaluation: pre-post data and documentation of learning

Page 49

Complicated Impacts

• Change participants' organizations

Evaluation: case studies of organizational change

Page 50

Complex Vision

• Infuse energy into the moribund not-for-profit (voluntary) sector
• Make the sector more dynamic
• Create a network of leaders who actively engage in change

Page 51

Evaluating the Complex

• Real-time follow-up of network connections and actions
• Follow-up is an intervention
• Rapid feedback of findings permits infusion of resources in support of emergent outcomes

Page 52

Exercise

1. Identify a program or project at your table.
2. What elements are simple? Complicated? Complex?

Page 53

Purposes of developmental evaluation (purposes 1–3)

1. Ongoing development
   Complex system challenges: being implemented in a complex and dynamic environment; no intention to become a fixed/standardised model.
   Implications: identifies effective principles.

2. Adapting effective principles to a new context
   Complex system challenges: innovative initiative; develop 'own' version based on adaptation of effective principles and knowledge.
   Implications: top-down, general principles knowledge disseminated; bottom-up, sensitivity to context, experience, capabilities, and priorities; adaptation vs. adoption.

3. Developing a rapid response in a turbulent major-change context
   Complex system challenges: existing initiatives and responses no longer effective as conditions change suddenly.
   Implications: planning, execution, and evaluation occur simultaneously.

Page 54

Evidence-based Practice

Evaluation grew up in the "projects," testing models under a theory of change that pilot testing would lead to proven models that could be disseminated and taken to scale: the search for best practices and evidence-based practices.

Page 55

Fundamental Issue: How the World Is Changed

Top-down dissemination of "proven models" versus bottom-up adaptive management

Page 56

Models vs. Principles

Identifying proven principles for adaptive management (bottom-up approach) versus identifying and disseminating proven models (top-down approach)

Page 57

Some premises:

• Evaluation is part of initial program design, including conceptualizing the theory of change.
• The evaluator's role is to help users clarify their purpose, hoped-for results, and change model.
• Evaluators can and should offer conceptual and methodological options.
• Evaluators can help by questioning assumptions.
• Evaluators can play a key role in facilitating evaluative thinking all along the way.

Page 58

Three ways of conceptualizing and mapping theories of change

• Linear Newtonian causality
• Interdependent systems relationships
• Complex nonlinear dynamics

Page 59

Linear Logic Model

INPUTS (people, materials) → ACTIVITIES (processes) → OUTPUTS → OUTCOMES (changes in people's lives) → IMPACTS (changes in communities)

Page 60

Systems Logic Model

[diagram: planning and implementation with a feedback loop. Inputs (staff resources, financial resources, internal standards, external requirements and information, equipment/materials) answer "What inputs need to go into the process to make the product that produces the desired result?" Key processes and functions (inputs organized and utilized; procedures; steps) answer "What steps need to be taken to create the product that achieves the desired result?"; here one measures variability, assesses process control, assesses fidelity to planned procedures, assesses the impact of variation, and evaluates opportunities to raise the bar. The output/product has essential attributes required to meet or exceed customer needs ("do the right thing": efficacy, appropriateness) and characteristics that meet or exceed customer wants and expectations of excellence ("do the right thing well": efficiency, dignity and respect, effectiveness, timeliness, reduced waste, safety, continuity, availability). Outcomes: customer outcomes and satisfaction; measure effectiveness and satisfaction, and feed improvement needs back into the process.]

Page 61

Division for Heart Disease and Stroke Prevention Evaluation Planning Logic Model

[diagram: internal inputs — a diverse, skilled workforce; available, timely resources; division leadership providing sufficient infrastructure, policies, and strategic planning; effective management, coordination, and staff development; and a division workplace that offers a healthy work environment, recognizes excellence, provides quality training and management, and includes effective systems, procedures, and communication (Goal 5) — support planning activities (leadership, disparities, surveillance, research, evaluation, program, policy, communication, collaboration) and the translation and dissemination of the current knowledge base, along with identification of ways to improve that knowledge base. These lead to enhanced competency of the public health workforce, enhanced ability of programs to apply findings to improve public health, enhanced external application of Division goals and strategies, increased advocacy and an "activated constituency," an engaged network of states and partners, enhanced integration among chronic disease programs, and increased focus on heart disease and stroke prevention efforts, especially with regard to disparities. Increased adoption, reach, implementation, and sustainability of recommended public health strategies advance the strategic plan goals: prevent risk factors for heart disease and stroke (Goal 1), increase detection and treatment of risk factors (Goal 2), increase early identification and treatment of heart attacks and strokes (Goal 3), and prevent recurring cardiovascular events (Goal 4). Ultimate impacts: increased knowledge of signs and symptoms, improved emergency response, improved quality of care, reduced risk factors, reduced economic impact of heart disease and stroke, eliminated preventable strokes and risks, reduced disparities, and reduced morbidity and mortality of heart disease and stroke.]

Page 62

Purposes of developmental evaluation (purposes 4–5)

4. Pre-formative development of a potentially scalable innovation
   Complex system challenges: changing and dynamic situations require innovative solutions to worsening conditions; the model needs to be developed / does not exist.
   Implications: some models may move into formative and summative evaluation, others remain in developmental mode; inform different potential scaling options.

5. Major systems change and cross-scale developmental evaluation
   Complex system challenges: disrupting the existing system; taking an innovation to scale; major systems change and changing scale add levels of complexity, new uncertainties, and disagreements.
   Implications: models change as they are taken across time, space, and to larger systems; adaptive cross-scale innovations assume complex, nonlinear dynamics, requiring agility and responsiveness; adaptation vs. replication.

Page 63

Systems

• Parts are interdependent, such that a change in one part changes all parts
• The whole is greater than the sum of the parts
• Focus on interconnected relationships
• Systems are made up of subsystems and function within larger systems

Page 64

Systems Concepts in Evaluation: An Expert Anthology. Bob Williams and Iraj Imam, eds. AEA Monograph. Point Reyes, CA: EdgePress/AEA, 2006.

Page 65

Understanding the Elephant from a Systems Perspective

Page 66

Page 67

The relationship between what goes in and what comes out

What conceptual framework informs front-end evaluation work?

Page 68

Teen Pregnancy Program Example

Page 69

Logic Model for Pregnant Teens Program

1. Program reaches out to pregnant teens
2. Pregnant teens enter and attend the program (participation)
3. Teens learn prenatal nutrition and self-care (increased knowledge)
4. Teens develop commitment to take care of themselves and their babies (attitude change)
5. Teens adopt healthy behaviors: no smoking, no drinking, attend prenatal clinic, eat properly (behavior change)
6. Teens have healthy babies (desired outcome)

Page 70

Systems web showing possible influence linkages to a pregnant teenager

[diagram: the young pregnant woman's attitudes and behaviors at the center, linked to her parents and other family members, the child's father and peers, her peer group, teachers and other adults, and prenatal program staff]

Page 71

Program systems web showing possible institutional influences affecting pregnant teenagers

[diagram: young pregnant women's attitudes and behaviors at the center, linked to the school system; prenatal clinic and hospital outreach; church; the prenatal program; other community-based youth programs; youth culture; other systems (welfare, legal, nutrition programs, transportation, child protection, media messages); and context factors (politics, economic incentives, social norms, culture, music)]

Page 72

SYSTEMS CHANGE: Interrelationships, Boundaries, Perceptions, Networks

Page 73

HIV/AIDS Example

• Hits every system: health, family, social, religious, economic, political, community, international
• Requires multiple interventions on multiple fronts in all subsystems simultaneously
• Resulting reactions, interactions, and consequences are dynamic, unpredictable, emergent, and ever changing

Page 74

Challenges: Situation Recognition and Appropriate Evaluation Designs

Page 75

The nature of EXPERTISE

Page 76

Complex Situations

• Highly emergent (difficult to plan and predict)
• Highly dynamic, rapidly changing
• Relationships are interdependent and nonlinear rather than simple and linear (cause-effect)

Page 77

Some Particularly Appropriate Applications of DE: Examples of Innovative Arenas

• Social movements and networks
• Advocacy evaluation
• Large-scale, cross-sector, collaborative initiatives
• R&D in public health, technological innovation, science
• Prevention

Page 78

Beyond just Summative and Formative

Page 79

Beyond Static Accountability Models

Page 80

Creative Challenge

Situational adaptability: contingency-based evaluation; appropriateness
• Using standard forms of evaluation, and
• Going beyond standard forms when appropriate and useful

Page 81

Paradigms and Lenses

• The importance of interpretive frameworks
• Complexity as an interpretive framework

Page 82

VALUES CONTRASTS

Traditional evaluation: 1. Testing models
Complexity-based, developmental evaluation: 1. Supporting innovation and adaptation

Page 83

Traditional evaluation: 2. Renders definitive judgments of success or failure: does the program work?
Developmental evaluation: 2. Renders nuanced, disaggregated feedback and generates learnings for adaptation and development: what works for whom, in what ways, under what conditions?

Page 84

Traditional evaluation: 3. INDEPENDENCE: evaluator external, independent, objective
Developmental evaluation: 3. RELATIONSHIP-FOCUSED, COLLABORATIVE: evaluator is a facilitator and learning coach, bringing evaluative thinking to the table and supporting the innovator's vision

Page 85

Traditional evaluation: 4. CONTROL: the evaluator determines the design based on the evaluator's perspective about what is important; the evaluator controls the evaluation
Developmental evaluation: 4. OPENNESS & NATURALISTIC INQUIRY: the evaluator goes with the flow and watches for what emerges

Page 86

Traditional evaluation: 5. CERTAINTY: predetermined outcomes, design fixed upfront, predetermined indicators, fixed questions
Developmental evaluation: 5. FLEXIBILITY: emergent outcomes, flexible design, emergent indicators, dynamic questions

Michael Quinn Patton, AEA 2011

Page 87

Traditional evaluation: 6. Linear cause-effect thinking and logic models
Developmental evaluation: 6. Systems and complexity thinking, with attention to dynamics, permeable boundaries, interdependencies, and emergent interconnections

Page 88

Traditional evaluation: 7. Values top-down change based on generalizable findings across time and space: high-fidelity, prescriptive "best practices" based on summative evaluation
Developmental evaluation: 7. Values bottom-up principles that provide direction but have to be adapted to context: aims to produce context-specific understandings that inform ongoing innovation and adaptation

Page 89

Traditional evaluation: 8. Accountability focused on and directed to external authorities and funders
Developmental evaluation: 8. Accountability centered on the innovators' deep sense of fundamental values and commitments, and learning as accountability

Page 90

Traditional evaluation: 9. Being outside the action, above the fray
Developmental evaluation: 9. Being part of the action, engaged in the fray

Page 91

Traditional evaluation: 10. TRUTH: speaking truth to power
Developmental evaluation: 10. PERSPECTIVES: facilitating dialogue and engagement with complexity and shifting understandings

Page 92

Page 94

References

Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Michael Quinn Patton. Guilford Press, 2011.

Utilization-Focused Evaluation, 4th ed. Michael Quinn Patton. Sage, 2008.

Essentials of Utilization-Focused Evaluation. Michael Quinn Patton. Sage, 2012.