Real-World Evidence for Drug Effectiveness Evaluation: Addressing the Credibility Gap
Richard Willke, PhD, Chief Science Officer, ISPOR
NIH Collaboratory Grand Rounds, October 25, 2019
2
Disclosures
Richard Willke was employed by
Pfizer and its legacy companies
from 1991 to 2016
Acknowledgements
This presentation has benefited
from my participation in several
working groups and conferences in
recent years.
ISPOR is an international,
multistakeholder nonprofit dedicated
to advancing health economics
and outcomes research excellence
to improve decision making for
health globally.
ISPOR
Stakeholders
4
The Challenge of Real World Evidence
So much data, so much potential information
but is the evidence derived
reliable and trustworthy?
5
Framework for FDA’s
Real-World Evidence Program
December 2018
“As the breadth and reliability of RWE increases, so do the opportunities for FDA to make use of this information.”
Scott Gottlieb, FDA Commissioner

National Academies of Sciences, Engineering, and Medicine, Examining the Impact of RWE on Medical Product Development, September 19, 2017

“FDA will work with its stakeholders to understand how RWE can best be used to increase the efficiency of clinical research and answer questions that may not have been answered in the trials that led to the drug approval, for example how a drug works in populations that weren’t studied prior to approval.”
Janet Woodcock, M.D., Director, CDER
SOURCES OF REAL WORLD EVIDENCE
• PRAGMATIC CLINICAL TRIALS
• PROSPECTIVE OBSERVATIONAL STUDIES / REGISTRIES
• SECONDARY USE OF EXISTING RWD
– Retrospective Observational Studies of Existing Datasets
6
Making RWE Useful Requires
• Quality Production
– Careful data collection and/or curation
– Appropriate analytic methods
– Good procedural practices for transparent study process
– Replicability/reproducibility
• Responsible Consumption
– Informed interpretation
– Fit-for-purpose application
7
Recent work on Data Quality
from the Duke-Margolis
RWE Collaborative
8
RWD analytical gremlins
• Non-representative populations
• Upcoding
• Missing data, especially when not at random
• Misclassification bias, other types
• Immortal time bias
• Ascertainment bias
• Protopathic bias
• Berkson’s paradox
• Informative censoring
• Depletion of susceptibles
• Channeling bias/confounding by indication
• Healthy user effect
• Adjustment for causal intermediaries
• Reverse causality
• Time-varying confounding
• Selection bias or endogeneity by any other name
• And … p-hacking
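One of these gremlins, immortal time bias, is easy to see in a toy simulation. The sketch below is my own illustration (simulated data, hypothetical numbers), not material from the presentation: the drug has no true effect, yet classifying patients as "ever treated" from cohort entry makes it look strongly protective, and reallocating the pre-prescription person-time removes the artifact.

```python
# Toy simulation of immortal time bias (illustrative only; names and numbers are hypothetical).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
survival = rng.exponential(scale=5.0, size=n)   # years from cohort entry to death
rx_wait = rng.exponential(scale=1.0, size=n)    # years until a prescription would be filled

# Patients are labelled "treated" only if they survive long enough to fill the prescription.
treated = survival > rx_wait

# Naive analysis: classify all person-time by ever/never treated from cohort entry.
# The pre-prescription ("immortal") time of the treated is misattributed to exposure.
naive_treated_rate = treated.sum() / survival[treated].sum()        # deaths per person-year
naive_untreated_rate = (~treated).sum() / survival[~treated].sum()
print(f"naive:     treated {naive_treated_rate:.2f} vs untreated {naive_untreated_rate:.2f}")

# Corrected analysis: person-time before the prescription counts as unexposed;
# only post-prescription time counts as exposed (time-dependent classification).
exposed_rate = treated.sum() / (survival - rx_wait)[treated].sum()
unexposed_rate = (~treated).sum() / (survival[~treated].sum() + rx_wait[treated].sum())
print(f"corrected: exposed {exposed_rate:.2f} vs unexposed {unexposed_rate:.2f}")  # both near 0.20
```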
9
“If you don't know where you're going, you'll end up someplace else.”
And a variety of analytical pathways
• New user design
• Stratification
• Propensity score matching
• Regression analysis
• GLM/GEE
• Instrumental variable analysis
• Finite mixture modeling
• Classification trees
• Random forest
• Other machine learning approaches
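To make one of these pathways concrete, here is a minimal propensity-score-matching sketch on simulated, claims-like data (the column names, coefficients, and data-generating process are hypothetical, not from any study cited in this talk). A real analysis would add calipers, balance diagnostics, and sensitivity analyses.

```python
# Minimal propensity-score matching sketch on simulated data (illustrative only).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 20_000
df = pd.DataFrame({"age": rng.normal(60, 10, n), "severity": rng.normal(0, 1, n)})

# Confounding by indication: sicker patients are more likely to receive the drug.
p_treat = 1 / (1 + np.exp(-(0.8 * df["severity"] - 0.02 * (df["age"] - 60))))
df["treated"] = rng.binomial(1, p_treat)

# True treatment effect on the outcome is -1.0; severity and age also drive the outcome.
df["outcome"] = (2.0 * df["severity"] + 0.05 * df["age"]
                 - 1.0 * df["treated"] + rng.normal(0, 1, n))

crude = df.loc[df.treated == 1, "outcome"].mean() - df.loc[df.treated == 0, "outcome"].mean()
print(f"crude difference: {crude:.2f}")          # biased away from -1 by confounding

# 1) Estimate the propensity score from measured covariates.
X = df[["age", "severity"]]
df["ps"] = LogisticRegression().fit(X, df["treated"]).predict_proba(X)[:, 1]

# 2) 1:1 nearest-neighbour matching of treated to untreated on the propensity score
#    (greedy, with replacement, for brevity).
treated = df[df.treated == 1]
controls = df[df.treated == 0]
nn = NearestNeighbors(n_neighbors=1).fit(controls[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = controls.iloc[idx.ravel()]

att = treated["outcome"].mean() - matched_controls["outcome"].mean()
print(f"matched (ATT) estimate: {att:.2f}")      # should move toward the true -1.0
```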
10
Dynamite with a laser beam?
Causal inference approaches, e.g.,
• Directed acyclic graphs
• Structural equation models
• Marginal structural models
• G-estimation of structural nested models
• Sequential approaches
– Estimate prediction/classification models using machine learning techniques to select features
– Estimate causal models with epidemiologic or econometric approaches using selected features in the model specifications
• Targeted maximum likelihood
As well as:
• Quasi-experimental designs, e.g., natural experiments and difference-in-differences analysis, nonequivalent group designs, regression discontinuity designs
• Specification tests for residual confounding
From Johnson ML, Crown W, et al. Value in Health 2009; 12:1062-1073.
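As one worked example from this family, the sketch below implements inverse-probability-of-treatment weighting for a single point treatment, the building block that marginal structural models extend to time-varying treatments. The data, variable names, and effect sizes are simulated and hypothetical; this is a sketch under those assumptions, not the method of any study cited here.

```python
# Hedged sketch of inverse-probability-of-treatment weighting (IPTW), illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 50_000
severity = rng.normal(0, 1, n)
p_treat = 1 / (1 + np.exp(-1.2 * severity))                       # confounding by indication
treated = rng.binomial(1, p_treat)
outcome = 1.5 * severity - 0.5 * treated + rng.normal(0, 1, n)    # true effect = -0.5
df = pd.DataFrame({"severity": severity, "treated": treated, "outcome": outcome})

# 1) Model treatment as a function of the measured confounder(s).
X = sm.add_constant(df[["severity"]])
ps = sm.Logit(df["treated"], X).fit(disp=0).predict(X)

# 2) Stabilized inverse-probability weights.
p_marg = df["treated"].mean()
w = np.where(df["treated"] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

# 3) Weighted outcome model with treatment only: in the weighted pseudo-population the
#    measured confounder is no longer associated with treatment. (Naive WLS standard
#    errors are not valid here; robust/sandwich SEs or GEE are needed in practice.)
msm = sm.WLS(df["outcome"], sm.add_constant(df[["treated"]]), weights=w).fit()
print(msm.params)   # the "treated" coefficient should be close to the true -0.5
```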
11
ISPOR Task Force Reports on RWD Methods for Comparative Effectiveness Analysis (among many other sources)

Berger ML, Mamdani M, Atkins D, Johnson ML. Good research practices for comparative effectiveness research: defining, reporting and interpreting nonrandomized studies of treatment effects using secondary data sources: The ISPOR good research practices for retrospective database analysis task force report—Part I. Value Health 2009;12:1044-52.

Cox E, Martin BC, Van Staa T, Garbe E, Siebert U, Johnson ML. Good research practices for comparative effectiveness research: approaches to mitigate bias and confounding in the design of non-randomized studies of treatment effects using secondary data sources: The ISPOR good research practices for retrospective database analysis task force report—Part II. Value Health 2009;12:1053-61.

Johnson ML, Crown W, Martin BC, et al. Good research practices for comparative effectiveness research: analytic methods to improve causal inference from nonrandomized studies of treatment effects using secondary data sources: The ISPOR good research practices for retrospective database analysis task force report—Part III. Value Health 2009;12:1062-73.
12
ISPOR/ISPE Joint Special Task Force on Real World Evidence in Health Care Decision Making
Transparency Paper Co-Chairs:
Marc Berger, MD, New York, NY, USA
C. Daniel Mullins, PhD, University of Maryland, Baltimore, MD, USA

Reproducibility Paper Co-Chairs:
Sebastian Schneeweiss, MD, ScD, FISPE, Harvard Medical School, Boston, MA, USA
Shirley Wang, PhD, MSc, Harvard Medical School, Boston, MA, USA
Objective: To provide a clear set of good practices for enhancing the transparency, credibility, and reproducibility of real world database studies in healthcare, with the aim of improving the confidence of decision-makers in utilizing such evidence.
STF work initiated late 2016, published Sept 2017
13
Read the freely available Good Practices Reports:
ispor.org/RWEinHealthcareDecisions
14
Transparency of study processes
15
Transparency of study processes
Reproducibility of study implementation
16
Reproducibility - Good study procedures
• The importance of achieving consistently reproducible research is recognized in many reporting guidelines:
– STROBE, RECORD, PCORI Methodology Report, ENCePP
– ISPE Guidelines for Good Pharmacoepidemiology Practice (GPP)
• While these guidelines certainly increase transparency, even
strict adherence to existing guidance would not provide all the
information necessary for full reproducibility.
17
What do we need?
Sharing data
– Would allow exact reproduction
– However: data use agreements usually do not allow sharing HIPAA-limited data with third parties

Sharing programming code
– Demonstrates good will
– However: it is almost impossible for a third party to assess whether a study was implemented as intended

Sharing all study implementation parameters and definitions
– Provides clarity on what was actually done and enables reproduction with confidence
18
Transparency - Primary Recommendations
1. A priori, determine and declare that the study is a “Hypothesis-Evaluating Treatment Effect” (HETE) or “exploratory” study.
2. Post a HETE study protocol and analysis plan on a public study registration site prior to conducting the
study analysis.
3. Publish HETE study results with attestation to conformance and/or deviation from the original analysis plan.
4. Enable opportunities for replication of HETE studies whenever feasible (ie, for other researchers to be
able to reproduce the same findings using the same data set and analytic approach).
5. Perform HETE studies on a different data source and population than the one used to generate the
hypotheses to be tested, unless it is not feasible.
6. Authors of the original study should work to publicly address methodological criticisms of their study
once it is published.
7. Include key stakeholders (eg, patients, caregivers, clinicians, clinical administrators, HTA/payers,
regulators, and manufacturers) in designing, conducting, and disseminating the research.
19
Which studies?
[Diagram: study types arranged by Interventional vs. Non-Interventional Study and by primary vs. secondary data use, spanning Phase I, Phase II-IV, single-arm, and pragmatic trials; prospective cohorts, some patient registries, and add-on studies; and RWE using routinely collected data]
20
Which studies?
[Same diagram, with Hypothesis-Evaluating Treatment Effect Studies highlighted]
21
Which studies?
[Same diagram, narrowed further to Hypothesis-Evaluating Treatment Effect secondary data use studies]
22
Transparency of Process - Primary Recommendations
1. A priori, determine and declare that the study is a “HETE” or “exploratory” study.
2. Post a HETE study protocol and analysis plan on a public study registration site prior to conducting the study analysis.
3. Publish HETE study results with attestation to conformance and/or deviation from the original analysis plan.
4. Enable opportunities for replication of HETE studies whenever feasible (ie, for other researchers to be
able to reproduce the same findings using the same data set and analytic approach).
5. Perform HETE studies on a different data source and population than the one used to generate the
hypotheses to be tested, unless it is not feasible.
6. Authors of the original study should work to publicly address methodological criticisms of their study
once it is published.
7. Include key stakeholders (eg, patients, caregivers, clinicians, clinical administrators, HTA/payers,
regulators, and manufacturers) in designing, conducting, and disseminating the research.
23
ISPOR
Real-World Evidence Transparency
Study Registration Working Group
February 25-26, 2019
Gaylord National Resort
Washington, DC, USA
24
Real-World Evidence Transparency Partnership
Representatives from 7 pharma companies
25
Objective: Building trust and transparency in
secondary observational research
Focus on:
– Studies using secondary (retrospective) use of data
– Objective of studying comparative treatment effects (including safety)
What is needed to ensure transparency of study process?
– What ‘mechanism’ is needed? Is pre-registering the best way to build
credibility?
– Which data and documents are required? And when?
– How do we hold investigators accountable, and who does so?
26
Starting point: ISPOR/ISPE RECOMMENDATION 2
Post a HETE study protocol and analysis plan on a public study registration site prior to
conducting the study analysis.
• Publicly declare the “intent” of the study—exploratory or hypothesis evaluation—as
well as basic information about the study.
• Registration in advance of beginning a study is a key step in reducing publication
bias
• For transparency, posting of exploratory study protocols is also encouraged.
• Options include EU Post‐Authorisation Study Register (ENCePP), clinicaltrials.gov,
and perhaps others
– None of these options may be ideal as they currently stand
27
Key Areas of Discussion
• Rationale for pre-registration
• Review of potential registries and general need for modifications
• Definition of a study
• Need to provide a basis/rationale for study hypothesis as part of the protocol
• Reporting of “pre-looks” and study verification
• Need for confidential “lockbox”
• Philosophy about enforcement
• Need for evolution of technical solutions and business processes
• Potential use cases
• Key issues for technical groups to address
28
Draft White Paper Released on Sept. 18th
– Open for Public Comment
This White Paper was authored by the Steering Committee of
the Real-World Evidence Transparency Initiative Partnership.
The Initiative is led by ISPOR, the International Society for
Pharmacoepidemiology, Duke-Margolis Center for Health
Policy, and the National Pharmaceutical Council, with
involvement of a number of other organizations and
stakeholders. A list of all authors can be found in the appendix.
The white paper comment period remains open through
Nov. 15: https://www.ispor.org/strategic-initiatives/real-
world-evidence/real-world-evidence-transparency-initiative
29
White paper recommendations (1 of 3)
1. Near term: Identify location for pre-registration of secondary observational research studies
Considerations
– with a view to modify or enhance existing registration sites
– clearly define the study type: hypothesis-evaluating treatment effect (HETE) studies for decision making
Actions
– Actively encourage registration on current sites NOW
– Initiate discussion with leaders of current registries, clinicaltrials.gov and ENCePP/EMA
– Look at the Center for Open Science format for possible new site, if needed
30
Potential Registries for Non-interventional RWE Studies
• NIH clinicaltrials.gov
• ENCePP EU-PAS Register
• Center for Open Science OSF Register
31
White paper recommendations (2 of 3)
2. Medium term: Determine what a “good” registration process entails to fit the purpose
Considerations
– Feasibility: research and reviewer workload
– Core elements of study registration including website fields and associated documents (e.g. protocol content)
– Transparency vs confidentiality ("lock box" with different access levels)
– Time-stamped registration including data looks
– Don’t let perfect be the enemy of good - this should be a progressive effort
Actions
Consider creating ‘task forces’ to:
– Survey potential users about needs and considerations regarding feasibility, transparency and confidentiality
– Design core elements of registration and protocol
– Design timing of release of information
– Pilot test registration site updates and update partner site or new site if required
32
White paper recommendations (3 of 3)
3. Long term: Incentives for routine pre-registration for HETE studies
Considerations
– End users start requiring registration: funding bodies, journals, regulators, payers/health technology assessors
– Provide register ‘use reports’ (quarterly reports of registered studies, with key information), e.g., posted on the website and published from time to time
Actions
– Build off collaboration with key stakeholders from task force activities to encourage adoption of pre-registration requirements.
– Involve key stakeholders from survey of potential users.
33
ISPOR Summit 2019
Real-World Evidence Transparency Initiative
October 11, 2019 | Baltimore, MD, USA
Agenda
1. Transparency in RWE - Time for a Unified Approach
2. Registration site(s) - Opportunities to Optimize
3. Nuts and Bolts of Fit-for-Purpose
4. Behavior Modification - Boosting and Nudging
5. Transparency in RWE - Moving Forward
www.ispor.org/Summit2019
34
35
RWE Credibility
• Data Quality
• Reproducibility
• Analytic Methods
• Process Transparency
36
Richard Willke, PhD, Chief Science Officer | [email protected]