
Evaluation of Media Reports as a Data Source for Deaths During Hurricane Sandy — United States, 2012

Olaniyi Olayinka, MD, MPH

EIS Officer, Health Studies Branch

Disaster Epidemiology Community of Practice

April 17, 2014

National Center for Environmental Health

Division of Environmental Hazards and Health Effects

Background

October 29, 2012

Hurricane Sandy

Category 2 at landfall

Storm surge

Power outage

Public Health Impact of Hurricane Sandy

Public health
o Deaths
o Severe morbidity (injuries, disabilities)
o Mental health

Economic loss
o $68 billion

Current Problems with Reporting Disaster-Related Mortality

There is no national-level active surveillance system for disaster-related fatalities

There is a time-lag in reporting by state vital statistics

Death certificates do not always indicate disaster-relatedness

Description of CDC/HSB Media Report Pilot Surveillance to Track Hurricane Sandy-Related Deaths

From October 29 to November 5, 2012, CDC/HSB tracked Hurricane Sandy-related deaths by using the Google search engine to search the Internet for demographics and circumstances of death

Keywords
o Death
o Disaster
o Drowning
o Hurricane Sandy
o Memorial
o Sandy
o Storm

Removed duplicates for deaths reported from multiple sources (see the sketch below)

Reported actively tracked Sandy-related deaths to CDC EOC every 24 hrs.
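To illustrate the deduplication step, the following is a minimal Python sketch that collapses the same death reported by several news outlets into a single record. The field names (name, age, state, date_of_death) and the choice of deduplication key are assumptions for illustration, not the actual HSB workflow.

    # Sketch of deduplicating media-reported deaths collected from multiple
    # news sources. Field names are illustrative assumptions.

    def dedup_key(report):
        """Build a normalized key so the same death reported by several outlets collapses to one record."""
        return (
            report.get("name", "").strip().lower(),
            report.get("age"),
            report.get("state", "").strip().upper(),
            report.get("date_of_death"),
        )

    def remove_duplicates(reports):
        """Keep the first media report seen for each unique death."""
        seen = set()
        unique = []
        for report in reports:
            key = dedup_key(report)
            if key not in seen:
                seen.add(key)
                unique.append(report)
        return unique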

Objectives of Evaluation

Assess CDC/HSB media report tracking for active mortality surveillance during Hurricane Sandy

Establish

o Accuracy

o Usefulness

Stakeholders

One city and eight state health departments

o New York City (NYC), Connecticut (CT), Maryland (MD), New Hampshire (NH), North Carolina (NC), New Jersey (NJ), Pennsylvania (PA), Virginia (VA), and West Virginia (WV)

Media

CDC/HSB

Local disaster management officers

Evaluation Design

In 2013, we requested vital statistics records or line lists of all Hurricane Sandy-related deaths from NYC and eight states

1. Requested vital statistics offices to identify Sandy-related deaths using the following:

ICD-10-CM (X37) code

Text string searches using keywords (e.g., “Hurricane Sandy” or “storm related”) to identify deaths (see the sketch following this list)

2. If a state could not provide a comprehensive list of Sandy-related deaths, we requested vital statistics records based on the media list we compiled
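To illustrate the searches requested in step 1, the Python sketch below flags a death record if it carries the X37 code or if its free-text fields mention the storm. The record layout and field names (icd10_code, cause_of_death_literal, injury_description) are assumptions for illustration, not the format used by the vital statistics offices.

    import re

    # Assumed keywords drawn from the evaluation design above.
    SANDY_PATTERN = re.compile(r"hurricane sandy|storm[- ]related", re.IGNORECASE)

    def is_sandy_related(record):
        """Flag a death record if it carries the X37 code or if its
        free-text fields mention the storm."""
        if record.get("icd10_code", "").upper().startswith("X37"):
            return True
        free_text = " ".join([
            record.get("cause_of_death_literal", ""),
            record.get("injury_description", ""),
        ])
        return bool(SANDY_PATTERN.search(free_text))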

Evaluation Design (Continued)

Compared data based on media reports with information from vital statistics records

• Determined percent agreement for all deaths that could be matched

• Considered a match between a media-reported death and vital statistics if the first and last name matched, or if sex and one of the following key attributes matched with vital statistics data (as sketched below):

o Age (+/- 1 year)
o Date of death
o Place of death (i.e., state, city, county, borough, or specific place of death)

Calculated sensitivity and positive predictive value (PPV)
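The matching rule can be expressed as a short function. This is a simplified Python sketch under assumed field names; the actual record linkage against vital statistics records may have handled name, date, and place comparisons with more nuance.

    def is_match(media, vital):
        """Matching rule: full name agrees, OR sex agrees plus at least one
        key attribute (age within 1 year, date of death, or place of death)."""
        # Exact match on first and last name is sufficient on its own.
        if (media["first_name"].strip().lower(), media["last_name"].strip().lower()) == \
           (vital["first_name"].strip().lower(), vital["last_name"].strip().lower()):
            return True

        if media["sex"] != vital["sex"]:
            return False

        age_ok = (
            media.get("age") is not None
            and vital.get("age") is not None
            and abs(media["age"] - vital["age"]) <= 1
        )
        date_ok = (media.get("date_of_death") is not None
                   and media["date_of_death"] == vital.get("date_of_death"))
        place_ok = (media.get("place_of_death") is not None
                    and media["place_of_death"] == vital.get("place_of_death"))

        return age_ok or date_ok or place_ok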

Flow Chart of Process Used in Selecting Deaths Reported in Media for Comparison with Vital Statistics (N=75)

115 media-reported deaths from all states
o 16 media-reported deaths from non-participating states (WV, MD, NH) excluded

99 media-reported deaths from jurisdictions participating in evaluation (CT, NC, NJ, NYC, PA, VA)
o 24 deaths excluded:
   - Did not match a vital statistics record (n=18)
   - Were outside study area (n=5)
   - Were outside study time period (n=1)

75 deaths reported from media matched with vital statistics records

Number of Hurricane Sandy-related deaths reported by the media and vital statistics, NYC and five Northeastern states (CT, NC, NJ, PA, VA) (N=75)

                     CT n (%)   NC n (%)   NJ n (%)   NYC n (%)   PA n (%)   VA n (%)   Total
Media reports        5 (5)      4 (4)      24 (24)    47 (47)     17 (17)    2 (2)      99
Vital statistics     5 (6)      2 (2)      23 (26)    41 (46)     17 (19)    2 (2)      90
Matched, n (%)       5 (100)    2 (50)     19 (79)    34 (72)     13 (76)    2 (100)    75 (76)

[Figure: Percent Agreement of Hurricane Sandy-Related Deaths Reported by Media and Vital Statistics Death Records in NYC and Five Northeastern States (CT, NC, NJ, PA, VA) (N=75). Horizontal bar chart of percent agreement (0–100%) for each key attribute: Name, Sex, Age, Place of Death, Circumstance of Death, Date of Death, and Cause of Death.]

Sensitivity and Positive Predictive Value

                                New York City Vital Statistics Record
                                Yes       No        Total
Reported in the Media   Yes     34        7         41
                        No      7         ___
                        Total   41

Sensitivity = 34 / 41 = 83%
Positive Predictive Value = 34 / 41 = 83%
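Reading the 2x2 table above: 34 deaths appeared in both sources, 7 vital statistics deaths were missed by the media, and 7 media-reported deaths did not match a vital statistics record. A minimal Python calculation of the two measures:

    # NYC 2x2 table values from the slide above.
    both = 34        # reported in the media and found in vital statistics
    vital_only = 7   # in vital statistics but not reported in the media
    media_only = 7   # reported in the media but not matched to vital statistics

    sensitivity = both / (both + vital_only)   # 34 / 41, about 83%
    ppv = both / (both + media_only)           # 34 / 41, about 83%
    print(f"Sensitivity: {sensitivity:.0%}, PPV: {ppv:.0%}")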

Limitations

Keywords used in conducting text searches of vital statistics death records were not exhaustive

PPV was calculated only for NYC; it could not be calculated for the other five states

Media-reported deaths did not have unique identifiers other than the key attributes (e.g., name, age, sex); thus, the accuracy of some key attributes might be overestimated

Conclusions

Media-reported deaths provided timely information on Hurricane Sandy-related deaths

Information provided by the media was moderately sensitive, and there was high agreement for many of the key attributes

If deaths are not actively tracked during a disaster, it can be difficult to identify deaths that were related to the event

Recommendations

Use media death reports as supplemental source of information on disaster-related deaths

o Situational awareness

o Immediate public health decision-making during the initial stages following a disaster

Use more traditional sources of information when more accurate information is needed, such as specific details on cause of death

National Center for Environmental Health

Division of Environmental Hazards and Health Effects

The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.

Acknowledgment

CDC Operation Dragon Fire Workgroup

• Brant Goode

Participating City and States

Connecticut State Vital Records Office

New Jersey Office of Vital Statistics

New York City Office of Vital Records

North Carolina Vital Records

Pennsylvania Department of Health

Virginia Office of Vital Records

CDC/NCEH

Tesfaye Bayleyegn

Amy Wolkin

Sherry Burrer

Lauren Lewis

Nicole Nakata