
Implementation Research: A Primer

Stefan Baral, MD, MPH, CCFP, FRCPC (CM)

Overview

Definitions
Characteristics
Conceptual Frameworks
Evidence and Outcomes
Methods
  Qualitative
  Quantitative: Descriptive, Analytic, Experimental
Conclusions


Definitions

Implementation: The use of strategies to introduce or change evidence-based health interventions (policies, programs, individual practices) within specific settings

Implementation Science in HIV: Implementation science is a multidisciplinary field that seeks generalizable knowledge about the behaviour of stakeholders, organizations, communities, and individuals in order to understand the magnitude of, reasons for, and strategies to close the gap between evidence and routine practice for health in real-world contexts

Key themes: multidisciplinary; generalizable; multiple stakeholders; closing the gap between evidence and practice; real-world contexts

Sources: Lobb and Colditz, Implementation Science and Its Application to Population Health. Annual Review of Public Health, 2013; Odeny, Padian, Doherty, Baral, Beyrer, Ford, Geng, Definitions of implementation science used in the HIV/AIDS literature: a synthetic review. The Lancet Infectious Diseases, in press, 2015

Implementation Research and Other PH Study Designs

Source: Olakunle Alonge; Lobb and Colditz, Implementation Science and Its Application to Population Health. Annual Review of Public Health, 2013; https://catalyst.harvard.edu/pathfinder/t2detail.html

Characteristics of Implementation Research


Findings are warranted to inform policy/programs: there is "sufficient evidence" to support the conclusions of the work (what is sufficient evidence?)

Transparency of methods supports critical assessment of the study: whether the processes are adequate, the conclusions are justified, and the work is repeatable

Don't be afraid of "failure": a well done study is still a success in terms of generating generalizable knowledge

Traditional Scientific Method

Source: http://www.sciencebuddies.org/science-fair-projects/project_scientific_method.shtml

Differences with IR

Competencies on an IR team:
  Research methodologists (qualitative, quantitative, mixed methods)
  Ministry, government, and agency representatives, either as members of the team or of a study oversight committee
  Health professionals: involvement of health professionals from the study settings
  Communications
  Public health professionals (health commissioner/associate health commissioner, public health inspector/public health nurse)
  Privacy expert
  Stakeholder assessment
  Community

Source: http://www.who.int/tdr/publications/year/2014/ir-toolkit-manual/en/

Traditional IR Approaches

Source: http://www.sciencebuddies.org/science-fair-projects/project_scientific_method.shtml

[Figure: the traditional scientific-method steps, each annotated "Active Consultation", with the team-assembly step annotated "Appropriate Team, Active Consultation"]

IR Specific Objectives: Three Broad Areas

1. Describe Health Situation or Interventions

2. Provide Data to Evaluate Ongoing Interventions or Information Needed to Adjust Interventions

3. Analyze missed targets and potential solutions


Describe Health Situation and Intervention

Magnitude of the problem
Distribution of health needs in the population
Risk factors for some problems
People's awareness of the problem
Utilization patterns of services
Cost-effectiveness of available and other potential interventions


Evaluate Interventions

Coverage of priority health needs
Coverage of target groups
Acceptability of the services
Quality of services
Cost-effectiveness of the intervention
Impact of the program on health


Analyze Missed Targets

Availability
Acceptability
Affordability
Service delivery problems
  Fidelity


Evidence and Outcomes in Implementation Research

Conceptual Frameworks, Outcomes, Evidence

Conceptual Frameworks Commonly used in IR

RE-AIM: Reach, Efficacy/Effectiveness, Adoption, Implementation, Maintenance (Reach and Adoption are sketched as simple proportions below)

Stages of implementation (National Implementation Research Network): Exploration and Adoption, Program Installation (preparation), Initial Implementation (pilot/adapt), Full Implementation (>50% coverage), Sustainability

Consolidated Framework for Implementation Research: Intervention Characteristics, Inner Setting, Outer Setting, Individuals in the Intervention, Implementation Process

Many others…

Source: Glasgow et al 1999, National Implementation Research Network, 2005, Damschroder, 2009
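
As a minimal illustration of the kind of quantities RE-AIM asks for, the Python sketch below summarizes Reach and Adoption as simple proportions. All counts are hypothetical assumptions, not data from the primer.

```python
# Hypothetical sketch: two RE-AIM dimensions expressed as proportions.
# All counts are assumed values for illustration only.

def proportion(numerator: int, denominator: int) -> float:
    """Return a proportion, guarding against division by zero."""
    return numerator / denominator if denominator else 0.0

# Reach: participants who received the intervention out of the eligible population
reach = proportion(numerator=1_250, denominator=5_000)

# Adoption: sites delivering the intervention out of the sites invited to deliver it
adoption = proportion(numerator=18, denominator=30)

print(f"Reach: {reach:.1%}, Adoption: {adoption:.1%}")
```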

Outcomes in Implementation Research


Implementation Outcomes: Acceptability, Adoption, Appropriateness, Costs, Feasibility, Fidelity, Penetration, Sustainability

Services Outcomes: Efficiency, Coverage, Equity, Responsiveness

Health Outcomes
  Client outcomes: Satisfaction, Symptomatology, Function
  Population-based outcomes: Incidence of diseases, Morbidity, Mortality, DALYs

Source: Olakunle Alonge, Proctor et al 2011

Implementation Outcomes


Acceptability
  Working definition: Perception among stakeholders that an intervention is agreeable
  Related factors: comfort, relative advantage, credibility

Adoption
  Working definition: Intention, initial decision, or action to try to employ a new intervention
  Related terms: uptake, utilization, intention to try

Appropriateness
  Working definition: Perceived fit or relevance of the intervention in a particular setting or for a particular target audience (e.g. provider or consumer) or issue
  Related terms: relevance, perceived fit, compatibility, perceived usefulness or suitability

Feasibility
  Working definition: The extent to which an intervention can be carried out in a particular setting or organization
  Related terms: practicality, actual fit, utility, trialability

Fidelity (see the sketch after this table)
  Working definition: The degree to which an intervention was implemented as it was designed in an original protocol, plan, or policy
  Related terms: adherence, delivery as intended, integrity, quality of programme delivery, intensity or dosage of delivery

Implementation cost
  Working definition: Incremental cost of the implementation strategy
  Related terms: marginal cost, total cost

Coverage
  Working definition: Degree to which the population that is eligible to benefit from an intervention actually receives it
  Related terms: reach, access, service spread or effective coverage, penetration

Sustainability
  Working definition: The extent to which an intervention is maintained or institutionalized in a given setting
  Related terms: maintenance, continuation, routinization, institutionalization, incorporation

Source: Proctor et al 2011; Peters, Adams, Alonge et al 2013
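
To make two of these definitions concrete, here is a minimal sketch, assuming a hypothetical set of protocol components and illustrative per-client costs (none of these figures come from the primer), of fidelity as the share of planned components delivered as designed and of implementation cost as the incremental cost of the strategy over standard of care.

```python
# Hypothetical sketch: fidelity and incremental implementation cost.
# Component names and cost figures are illustrative assumptions only.

protocol_components = {
    "peer_navigator_contact": True,    # delivered as designed?
    "point_of_care_cd4": True,
    "sms_reminders": False,
    "community_art_initiation": True,
}

# Fidelity: proportion of planned components delivered as designed
fidelity = sum(protocol_components.values()) / len(protocol_components)

# Implementation cost: incremental cost of the strategy versus standard of care
cost_strategy_per_client = 84.0   # assumed USD per client
cost_standard_per_client = 61.0   # assumed USD per client
incremental_cost = cost_strategy_per_client - cost_standard_per_client

print(f"Fidelity: {fidelity:.0%}")
print(f"Incremental implementation cost per client: ${incremental_cost:.2f}")
```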

What is Sufficient Evidence?

Evidence-based medicine is a global standard: the double-blinded (DB) RCT is the gold standard

Evidence-based PH interventions should also be a global standard: evidence is often limited, yet a PH decision still needs to be made

A precautionary principle for PH? "When there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation"

To develop guidelines, we need to characterize efficacious, effective, and sustainable and scalable programs


Tension Between Internal and External Validity: Challenges for Evidence in Combination Prevention

Internal validity: minimal study biases, giving confidence in the ultimate conclusion of the study
External validity: generalizability of the ultimate findings to the broader population

Traditional questions for clinicians/programmers: Does it work? What is the effect size? Should I use it?
Implementation questions: How, when, why, and where does it work? What factors influence effectiveness? Should I use it? How should I use it?

Traditional Research Pathway

Effectiveness research (and guideline development) generally happens prior to implementation research. Are there more time-effective approaches to integrating implementation research with effectiveness/efficacy research?

Assess barriers and facilitators to intervention uptake, acceptance, adoption, and routinization
Diagnose quality gaps: fidelity
Characterize sustainability: maintenance, cost-effectiveness

Methods in Implementation Research

Research objectives and the methods that address them:

Qualitative methods
  Exploration: using inductive methods to explore concepts, constructs, phenomena, and situations in order to develop hypotheses

Quantitative methods
  Explanation: characterizing the relationships between concepts and phenomena and the reasons for the occurrence of events
  Prediction: using knowledge to forecast events
  Experiment: intervening in or manipulating different settings or variables to produce a desired outcome

Mixed methods
  Description: identifying and describing the precursors/antecedents, nature, or etiology of an outcome or phenomenon

Qualitative Methods in Implementation Research

To explore a health problem: an evidence-based approach to characterizing the individual, network, and structural determinants of the health issue
To identify variables that can later be measured: measuring the "right" implementation outcomes
To develop a complex and detailed understanding of an issue: formative work to design the implementation plan for an effective intervention, maximizing acceptability and appropriateness
To understand the contexts or settings surrounding an experience or phenomenon: case study of a program or a policy
To understand the meaning behind an issue or experience: exploring "why" or "how" an intervention works or does not work (fidelity)

Quantitative Research Designs

Observational
  Categories: descriptive, analytic
  Designs: cross-sectional, prospective, quasi-experimental (no randomization of individuals, communities, or institutions)

Experimental
  Rapid innovation in implementation design

Cohorts in Implementation Research

Method
  Traditional "active" cohorts include enrollment of participants and active follow-up of individuals; expensive to ensure retention
  "Passive" cohorts (registries, EHR repositories, etc.) are rapidly becoming the standard in IR, but limitations include the quality and extent of data collection

What is different from traditional cohort studies?
  Traditional cohorts measure incidence, prevalence, and the relationship between exposure and outcome
  Cohorts in IR focus on measuring traditional IR outcomes: acceptability, adoption, appropriateness, feasibility, and fidelity of interventions; implementation costs (cost-effectiveness); determinants of coverage; and sustainability/maintenance

Implementation Outcomes

Outcome of interest: Proportion of HIV-positive men who enter care
  Definition: At least one clinic visit attended (where they see a clinician) after a positive test result, within 3, 6, and 12 months of the study visit
  Data source: Clinic records, NHLS documentation of lab tests completed

Outcome of interest: Proportion of men who receive CD4 results
  Definition: Receive POC CD4 results at the study visit, or get CD4 tests at the clinic and return to receive results
  Data source: Study visit CRFs, clinic records, SMS surveys

Outcome of interest: Proportion of ART-eligible men who initiate ART
  Definition: ART initiated by 12 months
  Data source: Study visit CRFs, clinic records

Outcome of interest: Time to ART initiation for ART-eligible men
  Definition: Length of time between receiving a positive HIV test result and CD4 <350 and initiating ART
  Data source: Study visit CRFs, clinic records

Outcome of interest: Proportion of ART-eligible men who initiate ART and are retained in care (see the sketch after this table)
  Definition: Attend the 6-month clinic visit (see a clinician); attend the 12-month clinic visit (see a clinician); attend 2+ clinic visits at least 3 months apart within a 12-month period
  Data source: Clinic records, NHLS documentation of lab tests completed (CD4, viral load, others)

Outcome of interest: Proportion of treatment-ineligible HIV-infected men who receive a CD4 test within 6 months following their study visit CD4
  Definition: Receipt of a repeat CD4 test
  Data source: Clinic records, NHLS documentation of lab tests completed (CD4, viral load, others)
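
As one hedged example of turning a passive-cohort data source into an implementation outcome, the sketch below applies the retention definition from the table above (two or more clinic visits at least three months apart within 12 months of entering care) to hypothetical visit dates. The dates and the 90-day approximation of "3 months" are assumptions for illustration.

```python
from datetime import date

# Hypothetical clinic-record visit dates for one participant; in practice these
# would come from clinic registers or NHLS laboratory documentation.
entry_into_care = date(2014, 2, 10)
visit_dates = [date(2014, 3, 1), date(2014, 7, 15), date(2015, 1, 20)]

def retained_in_care(entry: date, visits: list) -> bool:
    """Two or more visits, at least ~3 months (90 days) apart, within 12 months of entry."""
    window = sorted(v for v in visits if 0 <= (v - entry).days <= 365)
    return any((later - earlier).days >= 90
               for i, earlier in enumerate(window)
               for later in window[i + 1:])

print(retained_in_care(entry_into_care, visit_dates))  # True for these assumed dates
```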

Implementation Outcomes

Outcome of interest: Acceptability of the outreach/CBO-based testing intervention
  Data source: Questions on study visit CRFs; surveys about why participants chose to test

Outcome of interest: Relative advantages of the non-clinic-based ART initiation and retention package compared to standard of care
  Data source: CRFs, questions in surveys, qualitative data from both participants and providers

Outcome of interest: Perceived credibility of CBOs to initiate ART as compared to standard ART clinics
  Data source: Survey indicators, qualitative data

Outcome of interest: Adoption of experimental interventions, including participants' intention to use decentralized NIMART-trained nurse-initiated ART and peer navigator-based support
  Data source: CRFs, survey indicators, qualitative data

Outcome of interest: Implementation costs associated with the experimental condition
  Data source: Review of clinic-based budgets and ultimate costs to assess the marginal and total costs of the interventions as compared to standard of care

Outcome of interest: Maintenance and routinization of clinic-based approaches and peer navigators for retention, as indicators of the potential sustainability of the interventions
  Data source: Provider and participant qualitative data, survey indicators

Experimental Studies

Explanatory trials (the traditional gold standard): understand and explain the benefit of an intervention under controlled conditions; maximize internal validity

Pragmatic trials: focus on the intervention in routine practice; intentional maximization of variability in how the study is implemented and in the research settings (communities, practice settings, types of providers, patients); maximize external validity

Adaptive designs: an emerging area of implementation research that attempts to balance internal and external validity

Pragmatic Trials

Testing a new intervention while maximizing external validity
Formative period: qualitative work, some descriptive or analytic observational work
Consider different types of effectiveness outcomes:
  Directly measured health outcomes
  Service outcomes
  Implementation outcomes
  Resources (institutional, human, financial): cost-effectiveness, cost-utility, cost-minimization, cost-benefit, etc. (see the ICER sketch below)
  Indirect assessment/modeled benefits: increasing use of mathematical models to scale results to potential longer-term outcomes, etc.
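
Where cost-effectiveness appears among the resource outcomes above, the basic summary quantity is the incremental cost-effectiveness ratio (ICER), the extra cost per extra unit of effect. The sketch below uses assumed costs and effects, not results from any study.

```python
# Hypothetical sketch of an incremental cost-effectiveness ratio (ICER).
# Costs and effects are assumed values for illustration only.

cost_intervention = 125_000.0   # total cost of the new delivery strategy (assumed)
cost_standard = 90_000.0        # total cost of standard of care (assumed)

effect_intervention = 410.0     # e.g. persons virally suppressed (assumed)
effect_standard = 325.0

icer = (cost_intervention - cost_standard) / (effect_intervention - effect_standard)
print(f"ICER: ${icer:,.0f} per additional person virally suppressed")
```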

Effectiveness-Implementation Hybrid Trials

Goal: measure markers of implementation and impact in the same study

Three broad designs, differentiated by the prioritization of data collection:
  Type 1: 1st priority, impact of the health intervention; 2nd priority, gathering measures of implementation (feasibility/acceptability using qualitative/mixed methods)
  Type 2: equal priority to impact and implementation
  Type 3: 1st priority, implementation (fidelity of the intervention, measures of adoption); 2nd priority, impact of the health intervention

Stepped Wedge Cluster Randomized Designs

Method: assess the baseline situation in all communities, but randomly phase in intervention activities in steps, evaluating the impact of intervention time on outcomes (a simple allocation sketch follows the source link below)

http://www.biomedcentral.com/1471-2288/6/54
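
A minimal sketch of how a stepped-wedge allocation schedule can be generated, assuming six hypothetical clusters and four crossover steps (cluster names and step count are assumptions, not from the primer): every cluster starts under control conditions and crosses over to the intervention at a randomly assigned step, so that all clusters are exposed by the end.

```python
import random

# Hypothetical stepped-wedge allocation: clusters cross over from control to
# intervention at randomly assigned steps. Names and step count are assumptions.

clusters = ["clinic_a", "clinic_b", "clinic_c", "clinic_d", "clinic_e", "clinic_f"]
n_steps = 4  # crossover steps after the all-control baseline period

random.seed(2015)
random.shuffle(clusters)

# Spread clusters across crossover steps as evenly as possible
schedule = {cluster: 1 + i % n_steps for i, cluster in enumerate(clusters)}

for cluster, start_step in sorted(schedule.items()):
    exposure = ["control" if step < start_step else "intervention"
                for step in range(n_steps + 1)]  # step 0 is the all-control baseline
    print(cluster, exposure)
```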

Stepped Wedge Designs

Pros
  Differences in exposure time allow every community site to receive the intervention

Ethics
  Mixed views on the ethics of stepped wedge designs: some believe it is more ethical to give the intervention to all and more feasible to implement; others believe a trial is not warranted if the success of the intervention is certain, and that if the standard of care can be justified, it is cleaner to assess the intervention with a parallel design

Cons
  Analysis concerns around when an intervention starts: for example, if a community starts receiving the intervention today but takes 2 months to scale up and reach a substantial number of people, the measured exposure time will be diluted because everyone within the cluster is counted as exposed from the same start date
  If the outcome cannot be expected to occur over the time period of one step, stepped wedge designs will be underpowered

Adaptive Designs For Implementation Studies

Adaptive intervention/adaptive implementation strategy: specific decision rules for the implementation of an intervention based on individual/community needs; trying to maximize both internal and external validity

Trial design: sequential, multiple assignment, randomized trials (SMART); use outcome data to inform the implementation of the intervention being evaluated; can occur at multiple steps

http://methodology.psu.edu/ra/adap-inter

SMART Study Example

Research question: Among clinics not responding to Replicating Effective Programs (REP), how much does external or internal facilitation help improve a mood disorders program?

Source: Kilbourne, Almirall, Implementation Science, 2014

Adaptive implementation strategy: a priori decisions about the intervention based on response (sketched below); randomization to control for confounders while being done in a real-world setting; improved balance of internal and external validity

Measure Implementation Outcomes Throughout
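
A minimal sketch of the kind of a priori decision rule a SMART uses, loosely modeled on the REP facilitation example above: sites whose interim uptake falls below a pre-specified response threshold are re-randomized to an augmented strategy. The threshold, site names, and uptake figures are assumptions for illustration only.

```python
import random

# Hypothetical SMART-style adaptive implementation rule: non-responding sites
# are re-randomized between two augmentation options. All values are assumed.

RESPONSE_THRESHOLD = 0.50  # assumed proportion of eligible patients reached

site_uptake = {"site_1": 0.72, "site_2": 0.31, "site_3": 0.44, "site_4": 0.66}

random.seed(7)
second_stage = {}
for site, uptake in site_uptake.items():
    if uptake >= RESPONSE_THRESHOLD:
        second_stage[site] = "continue initial strategy"
    else:
        # Non-responders are re-randomized between the augmentation options
        second_stage[site] = random.choice(["external facilitation",
                                            "external + internal facilitation"])

print(second_stage)
```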

Power and Sample Size Calculations

Power and sample size calculations for IS studies are complicated. Options include:
  Powering to assess the "preponderance of evidence" of the benefit of the interventions: most realistic, but murky
  Powering for at least the primary outcome (e.g. viral suppression): cleaner, but is this really implementation research?
  Powering on both the health outcome and the implementation outcomes: limited resources, etc.
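
As a minimal sketch of the second option, assuming illustrative suppression proportions, cluster size, and ICC (none of which come from the primer): sample size per arm for a binary primary outcome such as viral suppression, inflated by a design effect when randomization is by cluster.

```python
from math import ceil, sqrt
from statistics import NormalDist

# Hypothetical sketch: sample size for comparing a binary primary outcome
# (e.g. viral suppression) between two arms, with a design effect for
# cluster randomization. All inputs are assumed values for illustration.

def n_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> float:
    """Normal-approximation sample size per arm for two independent proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    return ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
             + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p1 - p2) ** 2

n_individual = n_per_arm(p1=0.60, p2=0.75)   # assumed suppression proportions
design_effect = 1 + (30 - 1) * 0.02          # assumed cluster size 30, ICC 0.02

print(ceil(n_individual), "per arm under individual randomization")
print(ceil(n_individual * design_effect), "per arm after the design effect")
```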

De-Implementation Science Studies

The science of dissemination and implementation confronts two problems:
  Getting wider uptake of evidence-based interventions in clinical or public health practice
  Eliminating from clinical or public health practice tests and interventions that use resources without enhancing patient outcomes

As a field, we focus more on increasing interventions than on reducing unnecessary ones: there is more incentive to discover new tools, and it is more politically sensitive to try to remove existing services

De-implementation science methods use many of the same experimental methods (CRCT, stepped wedge, etc.), but in reverse

Conclusions

Implementation research seeks generalizable information intended to close the gap between evidence and practice

Tension between internal and external validity: what is most important to you and your stakeholders?

Rapid evolution of experimental approaches in implementation research
