
Trial Transparency

Deborah A. Zarin, M.D. Director, ClinicalTrials.gov

October 2013

Overview

• Trial transparency requires a system
  – Policies
  – Databases/websites
  – Implementation by key players
• There are a lot of trials (~400/week)
• ClinicalTrials.gov can help

Need for Transparency (issue → suggested policy)

• Publication Bias → Prospective registration of all clinical trials
• Selective Reporting (e.g., outcome measures, adverse events) → Public reporting of all results for all clinical trials
• Fidelity to Protocol → Registration, results reporting, & disclosure of full protocol/SAP
• Integrity of Study Results → IPD & full documentation
• Improving the CER (e.g., design of future trials; unnecessary duplication) → All of the above

Key Reporting Policies

• 1997 FDA Modernization Act (FDAMA)
• 2001 European Commission (EC)/European Medicines Agency (EMA)
• 2005 International Committee of Medical Journal Editors (ICMJE)
• 2007 FDA Amendments Act (FDAAA)
• 2008 (2013) Declaration of Helsinki (DoH)
• 2013 Centers for Medicare & Medicaid Services (CMS)

ICMJE vs. FDAAA

• Why?
  – ICMJE: Required for journal publication
  – FDAAA: Required by US federal law
• Which trials?
  – ICMJE: Interventional studies (all phases, all intervention types)
  – FDAAA: Interventional studies (not phase 1/feasibility; drugs, biologics, devices)
• When to submit results?
  – ICMJE: Not applicable
  – FDAAA: Within 12 months of final data collection for the primary outcome
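The FDAAA column above amounts to a scope test plus a deadline. A minimal Python sketch, assuming simplified inputs (the statutory definition of an "applicable clinical trial" has more conditions than this table shows, and the function and field names are hypothetical):

    from datetime import date, timedelta

    # Rough sketch of the FDAAA scope summarized above; illustration only,
    # not legal guidance. The statutory definition has additional conditions.
    FDAAA_INTERVENTION_TYPES = {"drug", "biologic", "device"}

    def subject_to_fdaaa(study_type: str, phase: str, intervention_type: str) -> bool:
        """Interventional studies of drugs, biologics, or devices,
        excluding phase 1 / feasibility studies."""
        return (
            study_type == "interventional"
            and phase not in {"phase 1", "feasibility"}
            and intervention_type in FDAAA_INTERVENTION_TYPES
        )

    def results_due_date(primary_completion: date) -> date:
        """Summary results are due within 12 months of final data collection
        for the primary outcome (approximated here as 365 days)."""
        return primary_completion + timedelta(days=365)

    print(subject_to_fdaaa("interventional", "phase 3", "drug"))  # True
    print(results_due_date(date(2013, 6, 30)))                    # 2014-06-30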

• Need policy incentives
• Non-legally binding policy can work
• Even now, evidence of (many?) unregistered trials

[Diagram: scope of key reporting policies]
• FDAAA – Drugs/devices, not phase 1
• CMS – Medicare (coverage for routine costs)
• ICMJE/WHO – All interventional studies

Summary Results Data

• Decision makers (other than FDA) rely on summary data
  – Clinical decision making
  – Policy decision making (e.g., payors)
• Characteristics of summary data
  – Convenient
  – Assumed to be an accurate reflection of the underlying participant-level data (i.e., little room for subjectivity)

However…

Summary Data May Not Always Be an Accurate Reflection of Participant-Level Data

Issues in Trial Registration and Results Reporting

• Most, but not all, trials are registered
• Outcome measure specification is still problematic (e.g., “cardiovascular and neurologic outcomes”)
• Little academic leadership

Four Levels of Specification in Reporting Outcome Measures

• Level 1 – Domain: e.g., anxiety, depression, schizophrenia
• Level 2 – Specific measurement: e.g., Beck Anxiety Inventory, Hamilton Anxiety Rating Scale, Fear Questionnaire
• Level 3 – Specific metric: e.g., end value, change from baseline, time to event
• Level 4 – Method of aggregation:
  – Continuous: mean, median
  – Categorical: proportion of participants with decrease ≥50%; proportion of participants with decrease ≥8 points

Together, the four levels yield a description of the measure at a specified time.

Zarin et al. N Engl J Med. 2011 Mar 3;364(9):852-60.
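To see how much room the lower levels leave for judgment, here is a sketch with made-up participant-level scores (illustration only, not data from any trial): the same values yield quite different-looking summaries depending on the Level 3 metric and Level 4 aggregation chosen.

    from statistics import mean, median

    # Hypothetical baseline and endpoint scores on a single rating scale
    # (Level 2), invented purely for illustration.
    baseline = [30, 20, 28, 16, 24, 18, 26, 22]
    endpoint = [12, 11, 13, 9, 11, 10, 25, 21]
    change = [e - b for b, e in zip(baseline, endpoint)]  # Level 3: change from baseline

    # Level 4: different aggregation choices over the same participant-level data
    summaries = {
        "mean change": mean(change),
        "median change": median(change),
        "% with >=50% decrease": 100 * sum(
            (b - e) / b >= 0.5 for b, e in zip(baseline, endpoint)) / len(baseline),
        "% with decrease >= 8 points": 100 * sum(
            (b - e) >= 8 for b, e in zip(baseline, endpoint)) / len(baseline),
    }
    for label, value in summaries.items():
        print(f"{label}: {value:.1f}")
    # mean change (-9.0) and median change (-8.5) agree closely here,
    # but the two "responder" definitions give 37.5% vs 62.5%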

“…the plethora of analytical and interpretation options may infuse subjectivity in the evidence procured by randomized controlled trials.”

Saquib N, Saquib J, Ioannidis JP. BMJ. 2013;347:f4313.

Issues in Trial Registration and Results Reporting

• Summary results reporting has proven challenging
  – Not standard practice (!?!)
  – Uneven quality
  – Limited leadership outside of industry
• We train “data submitters” but not always the right people
  – Must be considered an intellectual endeavor
  – Need training and academic credit
• But over 10,000 results entries, and growing

Figure. Information loss as clinical trial data progress from raw uncoded data to summary data
[Figure: data types progress from uncoded → abstracted → coded → computerized → edited/cleaned → analyzable → analyzed/summary, and the level of information falls from maximum to minimum along the way. The earlier stages are individual participant-level data; the analyzed/summary stage is aggregated data.]

Documents that may help to explain the journey

• Protocol and Amendments
• Investigator Brochure
• Statistical Analysis Plan (SAP)
• Informed Consent Form(s)
• DSMB Reports
• Clinical Study Reports
• AE Reports
• Other??

Individual Participant Level Data (IPD)

• Provides an audit trail and may improve confidence in summary data
• Allows for different types of analyses, e.g.,
  – Sub-group analyses, different metrics
  – Different ways of categorizing AEs
• Allows for data pooling across studies (see the sketch below)
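A small illustration of the pooling point, with invented records: pooled IPD supports a subgroup cut that neither study's own summary tables may have reported.

    from statistics import mean

    # Invented participant-level records from two studies, for illustration only.
    ipd = [
        {"study": "A", "sex": "F", "age": 64, "change": -12},
        {"study": "A", "sex": "M", "age": 58, "change": -7},
        {"study": "B", "sex": "F", "age": 71, "change": -10},
        {"study": "B", "sex": "M", "age": 49, "change": -3},
        {"study": "B", "sex": "F", "age": 66, "change": -9},
    ]

    # Subgroup analysis across the pooled studies: women over 60,
    # a cut the per-study summary tables would not necessarily report.
    subgroup = [r["change"] for r in ipd if r["sex"] == "F" and r["age"] > 60]
    print(f"n = {len(subgroup)}, mean change = {mean(subgroup):.1f}")  # n = 3, mean change = -10.3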

Are We Playing “Tetris” with IPD?


Concept of Vertical Transparency

[Figure: a matrix of trials (A through Y; all clinical trials of intervention X for condition Y in population X) crossed with types of information: trial registration record, summary results database, journal publication, clinical study report (CSR), and individual participant-level data (IPD: uncoded, coded, analyzable). Examples: Trial A is “documented at all levels”; Trial Y is “invisible”; Trial K has “registration & publication” only.]
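One way to make the matrix concrete is as a per-trial record of which information levels are publicly available. A minimal sketch, assuming hypothetical level names rather than any actual ClinicalTrials.gov schema:

    # Hypothetical information levels per trial; names are illustrative only.
    INFORMATION_LEVELS = [
        "registration", "summary_results", "journal_publication", "csr", "ipd",
    ]

    trials = {
        "Trial A": set(INFORMATION_LEVELS),                  # documented at all levels
        "Trial K": {"registration", "journal_publication"},  # registration & publication
        "Trial Y": set(),                                    # invisible
    }

    def describe(available: set) -> str:
        if not available:
            return "invisible"
        if available == set(INFORMATION_LEVELS):
            return "documented at all levels"
        return "partially documented: " + ", ".join(
            level for level in INFORMATION_LEVELS if level in available)

    for trial_id, available in trials.items():
        print(f"{trial_id}: {describe(available)}")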

Points to Consider

• Decision makers will always need summary data
• Structured, curated data help to mitigate acts of commission and acts of omission
• Participant-level data might allow for
  – Audit/accountability function
  – Subgroup and other analyses not possible with summary data
  – Pooling of data, leading to potential new discoveries
• Non-systematic data release could also generate a new kind of “disclosure bias,” e.g., if release depends on publication
• IPD policies that do not build on the current system could undermine the current trial disclosure system

Key Questions to Ask About New Data Disclosure Policies

1. What is the scope of trials for which participant-level data will be made accessible?
2. Which data (e.g., type, format) and supporting materials will be accessible? (Be precise.)
3. What is the process for obtaining access?
4. How transparent is that process?

Zarin DA. N Engl J Med. 2013 Aug 1;369(5):468-9.

There Are a Lot of Trials

As Reflected by ClinicalTrials.gov


ClinicalTrials.gov Statistics (First Registered in FY 2013)

Registration Total*: 19,111

Type of Trial:
  Observational: 3,794 (20%)
  Interventional: 15,177 (80%)
    Drug & Biologic: 8,141
    Behavioral, Other: 4,873
    Surgical Procedure: 1,560
    Device: 2,335

International Sites (148 countries):
  US only: 6,756 (35%)
  Non-US only: 10,361 (54%)
  US & Non-US mixed: 652 (3%)
  Not Specified: 1,342 (7%)

* Includes 140 expanded access programs and device trials eligible for delayed posting (FDAAA)

ClinicalTrials.gov Statistics (First Registered in FY 2013)

Registration Total: 19,111

Funded by:
  NIH: 1,238 (6%)
  Industry: 6,146 (32%)
  University, Other: 11,811 (62%)

User Statistics:
  Page views per month: 98 million
  Unique visitors per month: 900,000

Funder (First Registered in FY2013; n = 19,111)
[Pie chart: Industry 6,062 (32%); NIH 1,238 (6%); Other 11,811 (62%)]

Possibly Subject to FDAAA (First Registered in FY2013; n = 19,111)

[Pie chart: Yes 3,884 (20%); No 15,227 (80%)]

Key Issue: Standards for Organizing Clinical Trial Data


Sample Issues

• Clinical trial data need to be structured and organized, e.g.,
  – For which trial (and summary information)?
  – Standards for describing the data set and documentation
  – How to report uniquely/unambiguously which data set was used and where it is located? (see the sketch below)
  – How to search for and find all trial data and results?
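As one hedged illustration of the “which data set, and where” questions above, a minimal machine-readable description could tie each data set to its trial's NCT Number, a version, and a location. The field names below are hypothetical, not an existing standard, and the URL is a placeholder.

    # Hypothetical minimal metadata record for locating and citing a trial data set.
    dataset_record = {
        "nct_id": "NCT00000001",            # links the data set to its registry entry
        "dataset_type": "analyzable IPD",   # e.g., uncoded, coded, analyzable
        "version": "2013-10-01",            # which cut of the data was used
        "format": "CSV + data dictionary",
        "location": "https://example.org/datasets/NCT00000001-v1",  # placeholder URL
        "access": "controlled (data use agreement required)",
        "documentation": ["protocol", "SAP", "annotated case report form"],
    }

    # A citable, unambiguous identifier could combine trial ID, data set type, and version.
    dataset_id = "{nct_id}:{dataset_type}:{version}".format(**dataset_record)
    print(dataset_id)  # NCT00000001:analyzable IPD:2013-10-01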

Sample Search

“Trials of oxygen levels for resuscitation of premature infants to prevent retinopathy of prematurity (ROP)”


ClinicalTrials.gov (14 Records Retrieved)


WHO ICTRP Search Portal (3 Records Retrieved)


Potential Role of the National Library of Medicine (NLM)

• Expertise in standards development, data management, informatics

• US Federal Government entity


Potential Role for ClinicalTrials.gov

• Provide framework and access to key trial information
  – Registration
  – Results
  – Links
  – Documents
• Provide context for available information
  – List of all trials for a given topic
  – Documentation of what information is available for each trial
  – Help to avoid “disclosure biases” of all sorts

“Informational Chaos”: Diffuse, hard-to-access information about a single study

Sample Routes of Dissemination of Information about a Single Study
[Diagram: the sponsor and investigator disseminate information about a single study through many separate channels: the ClinicalTrials.gov record, results database entries, journal publications, conference abstracts, full protocols, SAPs, CSRs, IPD sets, and other study documents.]

ClinicalTrials.gov: Informational Scaffold

[Diagram: the ClinicalTrials.gov record as a central scaffold linking to results database entries, journal publications, full protocols, SAPs, CSRs, IPD, conference abstracts, other study documents, and other information (e.g., press releases, news articles, editorials).]

Mockup: “Checklist” for Each Trial

Trial Registration: NCT00000001

Primary Outcome Measure:
  • Outcome Measure 1
  • Outcome Measure 2
Secondary Outcome Measure:
  • Outcome Measure 3
  • Outcome Measure 4
  • Outcome Measure 5

• Documents Available?
  – Full Protocol
• Individual Participant-Level Data Available?
• Summary Results?

Take Home Messages

• IPD policies and standards need to support (rather than undermine) the quest for a full set of registration and summary results data.
• Leverage the ClinicalTrials.gov infrastructure as the central scaffolding for access to and sharing of IPD, rather than inventing a new system, e.g.,
  – ClinicalTrials.gov Identifier (NCT Number)
  – ClinicalTrials.gov search engine

Take Home Messages – 2

• Simple things to do now to support “horizontal” transparency
  – Ensure trial registration and summary results reporting
  – Reference the NCT Number on all trial-related information and documents (see the sketch below)
• Support training and incentives for trial reporting
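Consistent use of the NCT Number is also what makes automated cross-linking of trial documents feasible. A minimal sketch (the snippet of document text is made up); NCT Numbers take the form “NCT” followed by eight digits:

    import re

    # NCT Numbers: "NCT" followed by eight digits.
    NCT_PATTERN = re.compile(r"\bNCT\d{8}\b")

    # Made-up snippet of trial-related text, purely for illustration.
    document_text = """
    Results for the primary outcome appear in the registry record NCT00000001;
    the statistical analysis plan also references NCT00000001.
    """

    nct_ids = sorted(set(NCT_PATTERN.findall(document_text)))
    print(nct_ids)  # ['NCT00000001']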
