
Page 1

Evaluation’s Contribution to the Measurement of Research, Technology and Development: From Prescription to Menu to Guidelines

Presentation at the

Canadian Evaluation Society

Victoria, BC

May 4, 2010

[email protected]

Page 2

The Situation

• Evaluation has influenced the performance measurement of RT&D

• Traditionally, a prescription has been offered:
– Inputs
– Outputs
– Publication productivity
– Citations
– Some ‘impacts’

• Problem: Mission is sometimes neglected

Page 3

Some Background

• A short ‘history’ of RT&D indicators:

• A paper by Fred Gault, Statistics Canada (SPRU 2006) provides an excellent review of Canadian and international R&D and innovation statistics:
– Accepted input indicators include GERD and R&D personnel, assumed to result in innovation (input / output model)
– Output indicators link to IP (publications, patents) and new HQP, also assumed to link directly to innovation and economic growth
– Outcomes relate to increased innovation and need to consider a wide range of factors outside the direct influence of R&D
– There is recognition that the linear innovation model is simplistic and does not reflect sectoral differences, the influence of other policies, or the international economic situation

• More recently we have seen these formalized as scorecards and ‘payback’ categories

Page 4

Evaluating the Valuation Approaches

Aggregate Economic Statistics
Examples: expenditures on R&D; GERD; market share; trade deficit in innovative goods
Assessment: not ‘outcomes’; attribution problems; understanding?

Micro-economic Benefit-Cost
Examples: benefit-cost analysis; cost-effectiveness analysis
Assessment: favours easily quantifiable ‘outcomes’; monetization problems; high sensitivity to assumptions (see the sketch following this table)

Scientific Merit Valuation
Examples: peer review; publications; citations
Assessment: partial ‘outcomes’; bias problems (science viewpoint); misses the bigger system
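To illustrate why micro-economic valuations are so sensitive to assumptions, here is a minimal, hedged sketch in Python. The cash flows, benefit lag and discount rates are invented for the example and are not drawn from any program discussed in this deck.

```python
# Minimal sketch: benefit-cost analysis of a hypothetical RT&D project.
# All figures and discount rates are illustrative assumptions, not program data.

def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first) at a given discount rate."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Hypothetical project: $2M research cost up front, benefits realized in years 5-10.
costs = [-2.0] + [0.0] * 10                       # $ millions per year
benefits = [0.0] * 5 + [0.6] * 6                  # $ millions per year

for rate in (0.03, 0.07, 0.10):                   # alternative discount-rate assumptions
    total_npv = npv([c + b for c, b in zip(costs, benefits)], rate)
    bc_ratio = npv(benefits, rate) / -npv(costs, rate)
    print(f"rate={rate:.0%}  NPV={total_npv:+.2f}M  B/C ratio={bc_ratio:.2f}")
```

Even this toy example swings from a clearly positive net present value at 3% to a negative one at 10%, which is the sensitivity problem flagged in the assessment above.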

Page 5

Recent Developments

• Results logic to ‘inform’ indicators

• Sort the levels

• Suggest the connections

Page 6

Case Impacts – A Structural Assessment

Project Incrementality (Direct Influence)

Involvement

□ project would not have been done without program assistance

□ contributed to completing the project more quickly

□ contributed to completing the project more thoroughly

Direct Collaborator / Performer / User Impacts

Technical results

□ new or improved product

□ new or improved process

□ advancement of knowledge

□ increased technical capabilities

□ improved quality control

□ new skills internally

□ increased efficiency / improved productivity

□ technology transfer

Policy / legislative results

□ policy behavioural changes

□ agreement / accord

□ legislation / regulation

Infratechnology results

□ Codes, standards, databases, protocols

□ acceptance of standards

Commercial results

□ increased sales

□ increased market share

□ increased profitability

□ cost savings

Organizational effects

□ increase in jobs

□ diversification

□ expansions

□ strategic alliances / partnerships

□ achievement awards / recognition

Industry Sector / Supply Community Impacts

□ production process efficiencies

□ increased science and technology information

□ increased sales

□ cost savings

□ changes to industry structure (e.g., concentration, competitiveness internationally)

□ spin-off companies

□ technology infrastructure (e.g., standard scientific and engineering data, industry standards, test protocols, and instrumentation)

□ training of technological problem-solvers whose talents can be applied in many areas

□ establishment of quality, performance standards

Economy / Societal Impacts

Economic Benefits

□ reduced consumer costs

□ increased employment

□ improved competitiveness

□ reduction in subsidies

Societal Benefits

□ improved quality of life

□ protection of environment

□ improved energy efficiency

□ improved public health and safety

□ education / awareness

□ public service efficiency gains (i.e. lowered taxpayer burden)

* Note that this template attempts to integrate a wide range of impacts into a structured quantitative and qualitative assessment.
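One hedged way to operationalize a checklist template like this is to record ticked items per impact category and tally them. The categories and ticks below are abbreviated, illustrative examples, not an assessment of any actual project.

```python
# Minimal sketch: tallying ticked items in a structural impact-assessment checklist.
# Categories and ticked items are abbreviated, illustrative examples only.

checklist = {
    "Involvement": {"project would not have been done without program assistance": True,
                    "contributed to completing the project more quickly": False},
    "Technical results": {"new or improved product": True,
                          "advancement of knowledge": True,
                          "technology transfer": False},
    "Commercial results": {"increased sales": False,
                           "cost savings": True},
    "Societal benefits": {"improved public health and safety": True},
}

for category, items in checklist.items():
    ticked = sum(1 for checked in items.values() if checked)
    print(f"{category}: {ticked}/{len(items)} items ticked")
```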

Page 7

A Generic Version of the Diffusion Model Appropriate to Knowledge Entities (diagram; stages and box contents recovered below)

Access information and are aware of:
- Knowledge gaps
- Information resources (curriculum, tech reports, etc.)
- Program opportunities (program offerings / labels, etc.)
- EERE potential / opportunities

Persuasion / information search:
- More knowledge sought in this area
- New research ideas developed
- Obtained and tried knowledge tools (e.g., software, curriculum)
- Demonstrated / saw demonstration
- Hands-on experience

Product filter: knowledge / tool has value
- Relative advantage: more credible information; information confers economic, efficiency, production and quality benefits
- Compatible: with other knowledge / tools, production requirements, organizational goals / systems, culture
- Complexity: easy to understand; simplifies existing knowledge
- Can be tried by target audience
- Benefits of use can be observed

Confirm value:
- Knowledge valued in community
- Knowledge cited by others
- Knowledge used by others

Implemented: behavior / practices changed
- Capital: larger, stronger knowledge community / critical mass, core competencies; invest in more research in this area
- Design / plan: changed knowledge-seeking behavior (e.g. research approach, how savings are calculated, use of different procedures, consideration of labeling criteria); considered limitations, benefits / opportunities (e.g. in setting standards)
- Implement: changed attitude, perception of value used in negotiations, textbooks, technical specifications; used tool to measure, choose options, shift loads
- Maintain: keep databases current, correct, consistent over time (e.g. climate data, wind resources data)

Advocacy: spreads word to others; awareness of confirmed value spreads through advocacy groups, peer networks and one-to-one interactions

Broadcast contagion: innovators, early adopters, early and late majority; others observe changes and learn through peer interaction

Replication: emulate changes within the organization / household / facility / firm
- Changes at the same site
- Changes at another site
- Replication by those not directly involved with the program
- Widespread acceptance of product among target group; tip the market; market penetration increases

Sustained knowledge: the R&D community, policy makers, households and firms use and share the knowledge routinely in textbooks, technical manuals, regulations, etc.

Source: A Generic Theory Based Logic Model for Creating Scientifically Based Program Logic Models. John Reed, AEA 2006
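To make the adoption dynamics behind a diffusion model like this concrete, here is a minimal, hedged sketch of a Bass-style diffusion curve. This is a standard textbook adoption model, not the Reed logic model itself, and the innovation (p) and imitation (q) coefficients are illustrative assumptions.

```python
# Minimal sketch: Bass-style diffusion of an innovation through a population.
# p (innovation) and q (imitation) coefficients are illustrative assumptions only.

def bass_adoption(population, p=0.03, q=0.38, periods=15):
    """Return cumulative adopters per period under a simple discrete Bass diffusion model."""
    adopters = 0.0
    history = []
    for _ in range(periods):
        remaining = population - adopters
        new = (p + q * adopters / population) * remaining
        adopters += new
        history.append(adopters)
    return history

for period, total in enumerate(bass_adoption(population=1000), start=1):
    print(f"period {period:2d}: {total:7.1f} cumulative adopters")
```

The early periods of the curve correspond to the innovators and early adopters in the figure, the steep middle to the early and late majority, and the plateau to sustained, routine use of the knowledge.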

Page 8

Logic Chart for a Research and Technology Development and Deployment Program


Source: Logic Models: A Tool for Telling Your Program's Performance Story. John A. McLaughlin and Gretchen B. Jordan, July 1998

Page 9

CAHS Framework Logic Model of Health Research Progression to Impacts

Source: Making an Impact, Canadian Academy of Health Sciences, Assessment Report, January 2009.

Page 10

The Challenge: Can We Go Farther?

• Can we use results logic, or a simplified results hierarchy, to sort RT&D indicators? (A small sorting sketch follows this list.)

• Can an innovation diffusion model help? (Rogers, Bennett)

• Can we increase the emphasis on learning, as compared to ‘justification’?
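As one hedged answer to the first question, the Python sketch below groups a flat list of RT&D indicators under a simple results hierarchy. The hierarchy labels and the level assigned to each indicator are illustrative assumptions, not a prescribed coding scheme.

```python
# Minimal sketch: sorting a flat list of RT&D indicators by a simple results hierarchy.
# The level assignments below are illustrative assumptions, not a prescribed coding scheme.

RESULTS_HIERARCHY = [
    "Inputs", "Activities", "Engagement", "Reactions",
    "Knowledge/attitude/skill change", "Practice and behaviour change", "End results",
]

indicators = [
    ("R&D expenditures (GERD)", "Inputs"),
    ("Projects funded", "Activities"),
    ("Partners engaged", "Engagement"),
    ("Stakeholder satisfaction", "Reactions"),
    ("Publications and citations", "Knowledge/attitude/skill change"),
    ("Adoption of new practices", "Practice and behaviour change"),
    ("Reduced disease burden", "End results"),
]

# Group indicators under each hierarchy level, lowest level first.
for level in RESULTS_HIERARCHY:
    names = [name for name, lvl in indicators if lvl == level]
    print(f"{level}: {', '.join(names) if names else '(none)'}")
```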

Page 11

CAHS ROI Indicators

Over 60 indicators of the following:

• Advancing knowledge indicators and metrics include measures of research quality, activity, outreach and structure. Identified are some aspirational indicators of knowledge impacts using data that are highly desirable but currently difficult to collect and/or analyze (such as an expanded relative-citation impact that covers a greater range of publications, including book-to-book citations and relative download rates per publication compared to a discipline benchmark). (A small relative-citation-impact sketch follows this list.)

• Research capacity‐building indicators and metrics fall into subgroups that represent personnel (including aspirational indicators for improving receptor and absorptive capacity), additional research‐activity funding and infrastructure.

• Informing decision‐making indicators and metrics represent the pathways from research to its outcomes in health, wealth and well‐being. They fall into health‐related decision‐making (where health is broadly defined to include health care, public health, social care, and other health‐related decisions such as environmental health); research decision‐making (how future health research is directed); health‐products industry decision‐making; and general public decision‐making. Provided are two aspirational indicators for this category (media citation analysis and citation in public policy documents).

• Health‐impact indicators and metrics include those on health status, determinants of health and health‐system changes, and they include quality of life as an important component of improved health. Determinants of health indicators can be further classified into three major subcategories: modifiable risk factors, environmental determinants, and modifiable social determinants.

• Broad economic and social impacts are classified into activity, commercialization, health benefit (specific costs of implementing research findings in the broad health system), wellbeing, and social‐benefit indicators (socio‐economic benefits).
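As one hedged illustration of the ‘relative citation impact’ idea mentioned in the first bullet, the sketch below normalizes a publication set's citations against a discipline benchmark. The citation counts and benchmark value are invented for the example.

```python
# Minimal sketch: field-normalized (relative) citation impact for a set of publications.
# Citation counts and the discipline benchmark are invented, illustrative values.

publication_citations = [12, 3, 0, 27, 8, 5]   # citations per publication in the set
discipline_benchmark = 9.5                     # mean citations per publication in the field

mean_citations = sum(publication_citations) / len(publication_citations)
relative_citation_impact = mean_citations / discipline_benchmark

print(f"mean citations per publication: {mean_citations:.2f}")
print(f"relative citation impact vs. benchmark: {relative_citation_impact:.2f}")
```

Values above 1.0 indicate the set is cited more than the discipline norm; the CAHS point is that extending this to books and download rates is desirable but the data are currently hard to collect.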

Page 12

Source: The Returns from Arthritis Research, RAND Europe 2004

The RAND Payback Categories

Page 13

Theory of Action: Levels of Evidence and the Logic Model


Source:  Adapted from Claude Bennett 1979.  Taken from Michael Quinn Patton, Utilization‐Focused Evaluation:  The New Century Text, Thousand Oaks, California, 1997, p 235.

Program Chain of Events (Theory of Action) and Matching Levels of Evidence:

7. End results – Measures of impact on overall problem, ultimate goals, side effects, social and economic consequences
6. Practice and behavior change – Measures of adoption of new practices and behavior over time
5. Knowledge, attitude, and skill changes – Measures of individual and group changes in knowledge, attitudes, and skills
4. Reactions – What participants and clients say about the program; satisfaction; interest, strengths, weaknesses
3. Participation – The characteristics of program participants and clients; numbers, nature of involvement, background
2. Activities – Implementation data on what the program actually offers or does
1. Inputs – Resources expended; number and types of staff involved; time expended

Page 14

Results Chains vs. RAND Case Categories

Results Chain – RAND ‘Payback’ Scoring Areas:

7. End results – Final outcomes*; wider economic sector benefits**; health sector benefits**
6. Practice and behavior change – Adoption by practitioners and public*; informing policy and product development**
5. Knowledge, attitude, skill and / or aspiration changes – Knowledge production / capacity building**
4. Reactions
3. Engagement / involvement – Research targeting (participation of groups)**
2. Activities and outputs – Issue identification, project selection, research process*
1. Inputs – Inputs to research*

* RAND, Returns From Arthritis Research (2004), Section 2.1; ** Ibid., Section 3.3.2

Note: These results categories can be tracked over time to tell a performance story.
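A minimal sketch of the mapping above, expressed as a lookup from results-chain level to RAND payback scoring areas. The structure follows the pairing shown on this slide for illustration only; it is not RAND's own notation.

```python
# Minimal sketch: results-chain levels mapped to RAND 'payback' scoring areas,
# following the pairing shown on this slide (illustrative structure only).

PAYBACK_BY_LEVEL = {
    "End results": ["Final outcomes", "Wider economic sector benefits", "Health sector benefits"],
    "Practice and behaviour change": ["Adoption by practitioners and public",
                                      "Informing policy and product development"],
    "Knowledge, attitude, skill changes": ["Knowledge production / capacity building"],
    "Engagement / involvement": ["Research targeting (participation of groups)"],
    "Activities and outputs": ["Issue identification", "Project selection", "Research process"],
    "Inputs": ["Inputs to research"],
}

for level, areas in PAYBACK_BY_LEVEL.items():
    print(f"{level}: {'; '.join(areas)}")
```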

Page 15

A Results Chain Typical Indicator and Measurement “Menu” for CCSRI National Programs and Networks

(Columns: Initiative Chain of Results; Hierarchy of Evaluation Criteria / Evidence; Typical Indicators; Typical Sources / Methods)

End outcomes
Criteria / evidence: Measures of impact on overall problem, ultimate goals, side effects, social and economic consequences
Typical indicators:
• Rate or incidence of cancer (incidence, mortality, morbidity)
• Level of quality of life (index TBD)
• Level of advances in cancer science / research
Typical sources / methods:
• Specialized analyses / evaluations
• Statistical agency data
• Canadian cancer statistics
• Analytical and policy groups

Practice and behaviour change
Criteria / evidence: Measures of adoption of new practices and behaviour over time
Typical indicators:
• Level of research used (knowledge transfer, practice adoption) by scientists / policy makers / institutions / health care practitioners / consumers
• Level of research used in curricula for new researchers (citation in textbooks and reading lists)
• Level of research cited in ongoing health professional education material
• Level of research cited in public policy documents
• Level of research cited in advocacy publications
Typical sources / methods:
• Physical observation
• Inspections, reviews
• Surveys
• Evaluation studies
• Content analysis

Knowledge, attitude, skill and aspiration change
Criteria / evidence: Measures of individual and group changes in knowledge, abilities, skills and aspirations
Typical indicators:
• Level of understanding of key related science information generated through research, by scientists / policy makers / institutions / health care practitioners / consumers
• Level of self-expressed commitment to specific areas of science / research or practice / protocol / policy change by scientists / policy makers / institutions / health care practitioners / consumers
• Level of development of new knowledge in cancer research
• Level of development of new methods in cancer research
• Level of research findings published in a timely manner and in peer-reviewed journals with high “impact factors”
Typical sources / methods:
• Independent review of target groups
• Content analysis
• Survey, group self-assessment
• Testing / certification
• Bibliometrics

Reactions
Criteria / evidence: What participants and clients say about the program; satisfaction; interest, strengths, and weaknesses
Typical indicators:
• Level of program recognition and support from key stakeholders / target groups / participants
• Level (volume, accuracy and ‘tone’) of media coverage of research and program activities
Typical sources / methods:
• Usage / participation tracking
• Correspondence content analysis
• Surveys
• Media content analysis

Engagement / participation
Criteria / evidence: The characteristics of program participants and clients; number, nature of involvement, and background
Typical indicators:
• Level of engagement with other centres, networks, academic institutions, government agencies, etc.
• Level of engagement by stakeholders / target groups / participants
• Level of multidisciplinary and / or multisectoral research activities
• Level of recruitment and retention of stakeholders / target groups / participants (e.g. junior investigators, researchers, review panellists, etc.)
• Level of established external scientific advisory board(s)
Typical sources / methods:
• Web use tracking
• Correspondence content analysis
• Observation of meetings / events
• Meeting attendance records
• Stakeholder relationship management / tracking (e.g. contracts and agreements)
• Surveys

Activities & outputs
Criteria / evidence: Implementation data on what the program actually offers
Typical indicators:
• Level of research as per internal review guidelines
• Extent to which plans, strategies, frameworks, etc. are delivered as per expectations (expected timelines, resource usage and quality levels)
• Extent to which governance structure adheres to internal guidelines
• Extent to which policy and financial decisions are made according to Board / Senior Management / Expert Advisory Committee(s) accepted guidelines and standards
• Extent to which internal and external communication strategies adhere to internal standards and protocols / policies
Typical sources / methods:
• Project / initiative tracking
• Project reports
• Content analysis of records
• Peer review
• Operating reviews

Inputs
Criteria / evidence: Resources expended; number and types of staff involved; time expended
Typical indicators:
• Level of human resources (staffing) at all levels (according to norms, vacancies, expectations, benchmarks)
• Level of financial resources (budgets vs. actuals) at all levels
Typical sources / methods:
• Budget analysis
• Time, reporting and budget / plan review
• Activity-based costing

Sources: Networks of Centres of Excellence, National Cancer Institute and Canadian Academy of Health Sciences.
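To show how such a menu can be drawn on rather than prescribed, here is a hedged sketch in which a program picks a small number of indicators per results level. The menu entries are abbreviated and the selection rule is arbitrary; they are illustrative examples, not CCSRI's actual choices.

```python
# Minimal sketch: drawing a small indicator set from a results-chain 'menu'.
# The menu entries and selections below are abbreviated, illustrative examples only.

MENU = {
    "End outcomes": ["Rate or incidence of cancer", "Level of quality of life"],
    "Practice and behaviour change": ["Level of research used by practitioners",
                                      "Level of research cited in public policy documents"],
    "Knowledge change": ["New knowledge in cancer research", "Peer-reviewed publications"],
    "Reactions": ["Program recognition by stakeholders", "Tone of media coverage"],
    "Engagement": ["Engagement with other centres and networks"],
    "Activities & outputs": ["Delivery against plans and timelines"],
    "Inputs": ["Budget vs. actuals"],
}

def select_indicators(menu, picks_per_level=1):
    """Pick the first N indicators per level; a real program would choose deliberately."""
    return {level: options[:picks_per_level] for level, options in menu.items()}

for level, chosen in select_indicators(MENU).items():
    print(f"{level}: {chosen}")
```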

Page 16

The Vision

• Refer to a hierarchy to sort key indicators

• Focus on Mission as the key end outcome

• Tell a performance story – over time

• Emphasize behaviour change

• Employ ‘Realist Evaluation’ Questions… not simply ‘payback’ questions 

Page 17

Chart: mock-up results chain, first build. Rows: End Result (Outcome); Behaviour / Use; Knowledge; Reactions; Engagement; Activities; Human Resources; Financial Resources. Period T1 (0-5): Financial Resources ($): 5-year operating grant for CCSRI; Human Resources: (Res).

Page 18

Chart: mock-up results chain, second build (same rows as the previous page). Period T2 (6-10) adds: a further 5-year operating grant for CCSRI ($); a fellowship / student award; (Res) human resources; and publications at the Knowledge level.

Page 19

Chart: mock-up results chain, third build (same rows). Period T3 (11-15) adds: funding for clinical trials, CCSRI (Phase 1); (Res) and (CTG) human resources; Clinical Trials (CTs); further publications; and ‘research used by …’ at the Behaviour / Use level.

Page 20

Chart: mock-up results chain, fourth build (same rows). Period T4 (16-20) adds: further clinical trial funding, CCSRI (Phase 2); CTs Phase 2; communications activities; published CT findings and published CT learning; community support (Reactions); commitment to use and ‘practice’ learning (use of X); and approval of use of X by ‘accrediting bodies’ A, B and C (Behaviour / Use).

Page 21

Chart: mock-up results chain, fifth build (same rows). Periods T5 (21-25) and T6 add: expanded research and implementation CCSRI funding; further communications activities; engagement of health organizations and health practitioners; use of X by …; and, at the End Result (Outcome) level, reduced ‘burden’ (inc / mort / morb / effects / etc.) by Y.

Page 22

Mock-up Results Chain Applied Over Decades to a Research Initiative

Chart: the full mock-up results chain across time periods T1 (0-5), T2 (6-10), T3 (11-15), T4 (16-20), T5 (21-25) and T6, with results tagged by RAND payback category (1* to 5*):

Financial Resources: 5-year operating grants for CCSRI (T1, T2); funding for clinical trials, CCSRI Phase 1 (T3); further clinical trial funding, CCSRI Phase 2 (T4); expanded research and implementation CCSRI funding (T5-T6)
Human Resources: (Res) and (CTG); fellowship / student award
Activities: research; Clinical Trials (CTs) (2*); CTs Phase 2; communications activities
Knowledge: publications (1*); published CT findings (1*); published CT learning (1*)
Reactions: community support
Engagement: engagement of health organizations and health practitioners
Behaviour / Use: research used by … (2*); approval of use of X by ‘accrediting bodies’ A, B and C (3*); commitment to use and ‘practice’ learning (use of X) (3*); use of X by … (3*)
End Result (Outcome): reduced ‘burden’ (inc / mort / morb / effects / etc.) by Y (4*)

* Linkage to RAND payback categories, respectively: 1. Knowledge Creation; 2. Research Targeting, Capacity Building; 3. Informing Policy or Product Development; 4. Health Benefits; 5. Broader Economic Benefits
Source: S. Montague, ACOR Presentation, May 27, 2009
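The sketch below is one hedged way to record such a results chain by level and 5-year period and print it as a simple performance story. The entries reproduce the mock-up's placeholder labels (X, Y, CCSRI grants) rather than real data, and the data structure is an illustrative assumption.

```python
# Minimal sketch: recording a mock-up results chain by level and 5-year period,
# then printing it as a simple performance story. Entries mirror the slide's mock-up labels.

results = {
    ("T1 (0-5)",  "Financial resources"): "5-year operating grant for CCSRI",
    ("T2 (6-10)", "Knowledge"):           "Publications; fellowship / student award",
    ("T3 (11-15)", "Activities"):         "Clinical trials (Phase 1)",
    ("T3 (11-15)", "Behaviour / use"):    "Research used by others",
    ("T4 (16-20)", "Behaviour / use"):    "Approval of use of X by accrediting bodies",
    ("T5 (21-25)", "Behaviour / use"):    "Use of X by health organizations and practitioners",
    ("T5 (21-25)", "End result"):         "Reduced burden (incidence / mortality / morbidity) by Y",
}

for (period, level), entry in sorted(results.items()):
    print(f"{period:10s} | {level:20s} | {entry}")
```

Tracking entries like these period by period is one way to operationalize the note that results categories can be tracked over time to tell a performance story.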

Page 23

Realist Evaluation Questions
An explanatory compendium for complex RDT initiatives

(Adapted from Evidence‐based Policy, Ray Pawson)

1. Program theories – how is the RDT initiative supposed to work?

2. Reasoning and reactions of stakeholders – are there differences in the understanding of the RDT initiative theory?

3. Integrity of the implementation chain – is the RDT initiative theory applied consistently and cumulatively?

4. Negotiation and feedback in implementation – does the RDT initiative theory tend to bend in actual usage?

5. Contextual influences – does the RDT initiative theory fare better with particular individuals, interpersonal relations, institutions and infrastructures?

6. History of the RDT initiative and relationships with other policies – does the policy apparatus surrounding the theory advance or impede it?

7. Multiple, unintended, long‐term effects – is the theory self‐affirming or self‐defeating or self‐neutralizing?

Page 24

Challenges

• Research culture vs. management culture… Investigator vs. mission driven

• The criticism that this might drive out explorative innovation…

• ‘Getting’ the point that the real evaluation story is all about human behaviours

Page 25

Conclusions

• A hierarchy of results can help to sort RT&D indicators

• A storyline is key to provide indicator context; evaluation logic models offer this, but they need to be adapted for RT&D. Stay tuned.

• Realist evaluation questions may prove more promising as a support to management than simple ‘payback’ enquiries, at least if we are serious about learning as well as ‘justification’

• Principle-based guidelines work better than prescriptions
