
Department of Homeland Security presentation at the Chief Analytics Officer Forum East Coast USA (#CAOForum)



The Role of the Chief Analytics Officer in DHS S&T
Adapting to the DHS mission and environment

Dewey Murdick, Ph.D.
Chief Analytics Officer / Deputy Chief Scientist
Office of the Chief Scientist, Science and Technology Directorate

26 January 2016

Department of Homeland Security (DHS) Science and Technology Directorate (S&T)

S&T mission: To deliver effective and innovative insight, methods, and solutions for the critical needs of the Homeland Security Enterprise.

S&T:
Monitors technology and threats
Capitalizes on technological advancements at a rapid pace
Develops solutions and bridges capability gaps

Created by Congress in 2003, S&T conducts DHS-relevant:
Basic and applied research
Development
Demonstration
Testing and evaluation

Visit http://www.dhs.gov/science-and-technology

The DHS S&T “Board of Directors”

Executive:
President of the United States of America
DHS Secretary

Congressional:
House Committee on Homeland Security
House Committee on Science, Space, and Technology
Senate Committee on Homeland Security and Governmental Affairs
Appropriations Committees

DHS S&T Business – Mission Components

Coast Guard
Customs & Border Protection
Federal Emergency Management Agency
Secret Service
Transportation Security Administration
U.S. Citizenship & Immigration Services
U.S. Immigration & Customs Enforcement
Domestic Nuclear Detection Office
Federal Law Enforcement Training Center
Intelligence & Analysis
National Protection & Programs Directorate
Office of Health Affairs
Operations Coordination & Planning

Many inputs / directions / expectations, and an R&D budget of $400-$450M

Decision Support Analytics Mission

Mission: To develop and execute analytic strategies to improve the efficiency, effectiveness, and/or timeliness of decision making within S&T and DHS.

Positioned within the Office of the Chief Scientist under the Under Secretary for S&T

Chief Analytics Officer Responsibilities

Proposed “Objective Function” within government:
Anticipate and support decision making with timely, data-driven input
Improve independent analysis of the portfolio: data collection/updates, analysis, and periodic reviews
Provide independent quality assurance for projects (as needed)
Manage knowledge and lessons learned
Track performance over time; run analytics to support S&T decisions
Establish a robust technical horizon scanning capability
Prototype anticipatory analytics capabilities with DHS Components
Marshal internal and external data resources; discover new sources
Other: rapid response, strategic planning, governance input, …

Example S&T Data Sources (In Progress)

S&T portfolio:
Project data
Milestones and metrics
Publication / patent output
Contracts and deliverable output
Knowledge management records
Financial records
Human resource records and planning
Strategic goals and requirements (e.g., President, DHS Secretary, Congress, Under Secretary, Components, Missions, …)
External data exploration (e.g., product futures, venture capital, crowd sourcing, news media, …)
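As an illustration of pulling these sources together into one analyzable record per project, here is a minimal sketch of a record type; the field names are assumptions chosen for illustration, not an actual S&T schema.

```python
from dataclasses import dataclass, field

@dataclass
class ProjectRecord:
    """Illustrative per-project record combining the data sources listed above."""
    project_id: str
    milestones_met: int = 0
    milestones_planned: int = 0
    publications: int = 0
    patents: int = 0
    deliverables: int = 0
    obligated_dollars: float = 0.0
    strategic_goals: list[str] = field(default_factory=list)  # e.g., Secretary / Component priorities

# Example usage with made-up values.
rec = ProjectRecord("ST-0001", milestones_met=3, milestones_planned=5,
                    publications=2, obligated_dollars=1.2e6,
                    strategic_goals=["border security"])
print(rec)
```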

S&T “Decision Levers” (Selected)

Project prioritization
Degree of alignment with presidential, congressional, secretary, under secretary, and other priorities
Criteria to update S&T priorities
Risk/reward tolerance, portfolio balance
New start criteria
Mid-project rebalancing or course correction

Engagement profile
Traditional vs. non-traditional entities
Awards (e.g., SETA, deliverable contract, grants, FFRDC, prize, Other Transactions)

Infrastructure
What is critical to maintain? Outsource?

Update executive initiatives
Start, stop, or revise an initiative

Project health in context (internal and external)
Start, stop, or revise a project
Group / Division funding level for next cycle

Project communication strategy
Timing and audience scope for announcements
Transition (e.g., when to get commitments)

Project security protection
Tech protection, risk management
Export control
Classification

Human capital
Tech specialization areas
Seniority, number, …
Tenure and position duration

Operation Points (Notional)

Model(s) or Mode(s) of Operation:
• Basic R&D
• Applied R&D
• High-risk, high-payoff R&D
• First Adopter
• Rapid Deployment and Integration
• Tech Horizon Scanning / Warning, Analysis
• …

[Notional charts: number of projects vs. project $ value (binned); total funding vs. entity’s degree of previous engagement (history and % of budget from USG, binned), highlighting non-traditional orgs; portfolio risk balance (binned by total $) across risks (tech, adoption, …) or project duration, with annotations: Medium risk? Low risk projects? … steady state?]

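To make the notional operating-point charts concrete, here is a minimal sketch of binning a project portfolio by dollar value and splitting total funding by a performer's prior USG engagement; the project records, bin edges, and the 50% "non-traditional" cutoff are illustrative assumptions, not S&T data or thresholds.

```python
from collections import Counter, defaultdict

# Hypothetical project records: (project dollar value, fraction of performer's budget from USG).
projects = [(0.4e6, 0.05), (1.2e6, 0.90), (2.5e6, 0.75), (0.2e6, 0.10), (5.0e6, 0.95)]

# Illustrative bin edges for project value, not official thresholds.
value_bins = [0, 0.5e6, 1e6, 2.5e6, 5e6, float("inf")]

def bin_label(x, edges):
    """Return a label for the half-open bin [lo, hi) that contains x."""
    for lo, hi in zip(edges, edges[1:]):
        if lo <= x < hi:
            return f"${lo/1e6:.1f}M+" if hi == float("inf") else f"${lo/1e6:.1f}M-${hi/1e6:.1f}M"
    return "out of range"

# Chart 1: number of projects per project-value bin.
counts = Counter(bin_label(value, value_bins) for value, _ in projects)

# Chart 2: total funding by prior USG engagement
# ("non-traditional" here means <50% of budget from USG -- an assumed cutoff).
funding = defaultdict(float)
for value, usg_share in projects:
    key = "non-traditional" if usg_share < 0.5 else "traditional"
    funding[key] += value

print(dict(counts))
print(dict(funding))
```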

Potential Priority Analytics

Project portfolio map(s), e.g., a graph (see the sketch below):
Relatedness of goals/methods/teams
Map to strategic goals, gap identification
Quality distributions, for example, clarity and data quality measures
Program management milestones / target achievement
State-of-the-art alignment (sampled set)
Deliverable analysis (sampled set)
Engagement profile analytics: type of work and who is performing it
Project risk analytics: high risk, medium risk, …
Transition impact analysis (sampled set)
Financial analysis

Iterative feedback required
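A minimal sketch of the portfolio-map idea above, connecting projects whose goal/method/team tags overlap; the project names, tags, and the 0.25 similarity threshold are invented for illustration, not S&T data.

```python
from itertools import combinations

# Hypothetical project records: name -> set of goal/method/team tags.
projects = {
    "Border Sensors": {"sensors", "detection", "CBP"},
    "Cargo Screening": {"detection", "imaging", "CBP"},
    "Flood Modeling": {"simulation", "FEMA", "resilience"},
    "Surge Forecasting": {"simulation", "forecasting", "FEMA"},
}

def jaccard(a, b):
    """Overlap of two tag sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

THRESHOLD = 0.25  # assumed cutoff for calling two projects "related"

# Build an edge list: one edge per pair of projects with enough tag overlap.
edges = [
    (p, q, jaccard(projects[p], projects[q]))
    for p, q in combinations(projects, 2)
    if jaccard(projects[p], projects[q]) >= THRESHOLD
]

for p, q, w in sorted(edges, key=lambda e: -e[2]):
    print(f"{p} <-> {q}: similarity {w:.2f}")
```

Edges like these support the gap-identification use: strategic goals with few or no connected projects stand out immediately.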

Anticipatory Analytics for Decision Support

Exploring…

Anticipatory Analytics – Motivating Research

Event Type | Program | Lead-Time | Accuracy | Scale

Geo-political:
Conflict, Elections, Economic Events, Instability, etc. | ACE | 10-100+ days | 87% of days with correct forecast | 325 questions
Civil Unrest | OSI | 8 days | 75% accurate | >10,000 events in South America
Elections | OSI | 14 days | 85% accurate | 20 events in South America

Epidemiology & Biosecurity:
Flu | OSI | 26 days | 70% accurate | 4,320 events in South America
Rare Diseases | OSI | 6 days | 75% accurate | 70 events

Scientific and Technical:
S&T Milestone | ForeST* | 10-100+ days | 65% of days with correct forecast | 172 questions

Results current as of Feb 2015. Visit http://www.iarpa.gov
*See the FUSE Program for longer-term forecasts.

Anticipatory Analytics – Key Elements

Decisions (and who makes them) must be clearly defined. Example elements include the required forecast accuracy, tolerance for false positives, the lead time needed to take action, how often decisions of a particular type need to be made, etc.

Events must have a very crisp definition, with enough clarity to be forecastable and for warnings to be falsifiable.

Event forecasts need to be well-defined enough to inform action and will likely include who, what, when, where, and how characteristics.

Base rates for these events need to be determined; they indicate how predictable an event is.

Keep score! Decisions made and events that occur must be recorded to form the ground truth required to evaluate and eventually improve forecasting performance and decision utility. Warnings generated also need to be recorded to maintain a reliable calibration of system performance (see the sketch below).

Indicators need to be discovered and their predictive impact evaluated. Relevant data streams that could provide indicators need to be identified to enhance the accuracy and lead time of event forecasting.

Competitive forecasting “tournaments” can be particularly effective.

Getting usable predictive accuracy can take time and iteration.
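A minimal sketch of the "keep score" idea: scoring recorded probabilistic warnings against ground-truth outcomes with a Brier score and a coarse calibration check. The forecast records below are invented for illustration.

```python
# Each record: (forecast probability issued ahead of the event, did the event occur?).
# These numbers are invented for illustration only.
records = [(0.9, True), (0.7, True), (0.6, False), (0.2, False),
           (0.8, True), (0.3, True), (0.1, False), (0.75, True)]

# Brier score: mean squared error between forecast probability and outcome (0 = perfect).
brier = sum((p - float(hit)) ** 2 for p, hit in records) / len(records)

# Coarse calibration check: within each probability band, compare the average
# forecast probability to the observed event frequency.
bands = [(0.0, 0.5), (0.5, 1.01)]
for lo, hi in bands:
    in_band = [(p, hit) for p, hit in records if lo <= p < hi]
    if not in_band:
        continue
    mean_p = sum(p for p, _ in in_band) / len(in_band)
    freq = sum(hit for _, hit in in_band) / len(in_band)
    print(f"forecasts in [{lo:.1f}, {hi:.1f}): mean p = {mean_p:.2f}, observed rate = {freq:.2f}")

print(f"Brier score: {brier:.3f}")
```

Keeping both the warnings and the outcomes makes this kind of scoring possible after the fact; without the recorded warnings there is nothing to calibrate.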

Anticipatory Prototypes with Components

DHS-component-specific anticipatory analytics prototypes

Task outline, 12 months, ~2 prototypes:
Characterize a subset of decision making
Crisply define events that trigger priority decisions
Determine base rates, ground truth, and indicator feasibility
Build a baseline prototype system and measure performance (see the sketch below)

Working with multiple Component partners
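One way to read "determine base rates" and "build a baseline prototype": before fielding any model, forecast each event at its historical frequency and score that constant forecast, so later prototypes have a bar to beat. A minimal sketch with invented monthly event counts follows.

```python
# Invented history: number of months observed and how many had at least one event.
months_observed = 36
months_with_event = 9

# Base rate: historical frequency of the event, used as a constant forecast.
base_rate = months_with_event / months_observed  # 0.25

# Invented outcomes for the next 6 months (True = event occurred).
outcomes = [False, True, False, False, True, False]

# Brier score of the base-rate forecaster: the baseline any prototype must beat.
baseline_brier = sum((base_rate - float(o)) ** 2 for o in outcomes) / len(outcomes)
print(f"base rate = {base_rate:.2f}, baseline Brier score = {baseline_brier:.3f}")
```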

Next Steps

Assemble analytics team (continued)
Execute implementation plan (refine)
Map decisions, refine questions, define best practices for data/methods, keep metrics, measure decision-support performance and impact
Explore decision/event suitability for anticipatory analytics prototypes within DHS

Decisions → Questions → Data / Methods → Outcome Metrics