
Knowledge Management Status Report to eRA Project Team

Transition to Pre-Production Phase

27 February 2003 · Richard W. Morris <rmorris@niaid.nih.gov>

Life Cycle of Disruptive Technologies

[Diagram: three evolutionary stages. Stage 1, driver: concept, goal: demonstration. Stage 2, driver: true believers, goal: niche applications. Stage 3, driver: competition, goal: maturity / diffusion.]

Myers et al., “Practitioner’s View: Evolutionary Stages of Disruptive Technologies,” IEEE Transactions, vol. 49, no. 4, Nov. 2002.


Aims Today

1. Where we have been.

2. Where we are going.

3. How we’ll get there (if we answer a few questions).


Where we have been.


Concept Phase

[Diagram: the systems-engineering life cycle, spanning the conceptual, design, and pre-production phases: system requirements, functional requirements, performance requirements, detailed design, build / test prototype, assess results, modify (produce).]

Source: Management of Systems Engineering, Wilton P. Chase.


Conceptual Phase

[Diagram: the life-cycle diagram for the MITRETEK conceptual phase, with candidate tools collexis, semio, stratify, inxight, and i411.]

Source: Management of Systems Engineering, Wilton P. Chase.


Design Phase

[Diagram: the same life-cycle diagram, focused on the design phase.]

Source: Management of Systems Engineering, Wilton P. Chase.


Design Phase

[Diagram: the life-cycle diagram with two design-phase efforts marked: Pilot #1, situational awareness, and Pilot #2, reviewer selection.]

Source: Management of Systems Engineering, Wilton P. Chase.


Project Management Plan

[Diagram: the life-cycle diagram laid out against the project timeline, with milestone dates of Dec. 1, 2002, Feb. 1, 2003, and Sept. 1, 2003 across the conceptual, design, and pre-production phases.]

Source: Management of Systems Engineering, Wilton P. Chase.


Next Steps: Pre-Production

[Diagram: the same life-cycle diagram, focused on the pre-production phase.]

Source: Management of Systems Engineering, Wilton P. Chase.


Where we are going.


NIH KM Overview: Core KM Prototypes


Expanded NIH KM Overview: Complete System


Metrics for GRS

Estimated Benefits
• Reduced cycle time
• Improved quality and consistency of referrals
• Time saved by the organization

Methods Used
• Sampling
• Survey
• Interviews
• Internal logs

System Performance
• Key Measures: recall and precision (see the sketch after this table); reviewer selection; system response time; scalability (number of research proposals, number of reviewers); institute routing
• Key Outputs: time spent “selecting” candidate reviewers; time spent “screening” candidate reviewers; number of conflicts identified; percentage of candidate reviewers chosen; percentage of correct institute routing
• Key Outcomes: user satisfaction; time saved by the organization in “selecting” and “screening” candidate reviewers; savings or improvements in organizational quality and efficiency; time saved in institute routing

System Usage
• Key Measures: system down time; scalability (number of users, frequency of use); user feedback (real-time); usability survey (time-lag); training time / learning curve
• Key Outputs: usefulness survey; feedback results; duration of learning curve; duration of training time
• Key Outcomes: user satisfaction; savings or improvements in organizational quality and efficiency; time saved by the organization; reduced training time or learning curve

System Operation & Maintenance
• Key Measures: frequency of updates; system downtime; Help Desk support
• Key Outputs: number of Help Desk support requests
• Key Outcomes: user satisfaction; reallocation of Help Desk resources; recency of information
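The recall and precision measures above can be made concrete with a small worked example. This is a minimal sketch in Python, using hypothetical reviewer IDs rather than pilot data; the function name and inputs are illustrative assumptions, not part of the GRS design.

```python
# Illustrative only: hypothetical reviewer IDs, not pilot results.
# Shows how recall, precision, and "percentage of candidate reviewers chosen"
# could be computed for a single reviewer-selection run.

def reviewer_selection_metrics(suggested, chosen):
    """suggested: reviewers proposed by the system; chosen: reviewers actually selected."""
    suggested, chosen = set(suggested), set(chosen)
    overlap = suggested & chosen
    recall = len(overlap) / len(chosen) if chosen else 0.0           # share of chosen reviewers the system found
    precision = len(overlap) / len(suggested) if suggested else 0.0  # share of suggestions that were used
    pct_candidates_chosen = 100.0 * precision                        # "percentage of candidate reviewers chosen"
    return recall, precision, pct_candidates_chosen

# Hypothetical run: five suggestions, three reviewers chosen, two of them from the suggestions.
print(reviewer_selection_metrics(["r1", "r2", "r3", "r4", "r5"], ["r2", "r3", "r9"]))
# -> approximately (0.667, 0.4, 40.0)
```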


Metrics for GTS

Estimated Benefits
• Situational awareness
• Discovery of patterns and trends
• Informed decision making
• Time saved by the organization

Methods Used
• Sampling
• Survey
• Interviews
• Internal logs

System Performance
• Key Measures: system response time; scalability (number of research proposals); proposal analysis
• Key Outputs: time spent in understanding proposals; time spent in analyzing and identifying relationships among concepts; percentage of successful document categorization (see the sketch after this table); time spent in analyzing and identifying distributions
• Key Outcomes: user satisfaction; time saved by the organization; awareness of relationships among proposals; improvements in document categorization; visual awareness of distributions, patterns, and trends

System Usage
• Key Measures: system down time; scalability (number of users, frequency of use); user feedback (real-time); usability survey (time-lag); training time / learning curve
• Key Outputs: usefulness survey; feedback results; duration of learning curve; duration of training time
• Key Outcomes: user satisfaction; savings or improvements in organizational quality and efficiency; time and cost saved by the organization; reduced training time or learning curve; visual identification of concept relationships

System Operation & Maintenance
• Key Measures: frequency of updates; system downtime; Help Desk support
• Key Outputs: number of Help Desk support requests
• Key Outcomes: user satisfaction; reallocation of Help Desk resources
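The “percentage of successful document categorization” output and the distribution view can be sketched the same way. A minimal sketch in Python, with hypothetical proposals and category labels; pairing a system-assigned category with a reference category is an illustrative assumption, not the GTS interface.

```python
# Illustrative only: hypothetical proposals and categories, not pilot data.
from collections import Counter

def categorization_summary(assignments):
    """assignments: (system_category, reference_category) pairs, one per proposal."""
    total = len(assignments)
    correct = sum(1 for sys_cat, ref_cat in assignments if sys_cat == ref_cat)
    pct_successful = 100.0 * correct / total if total else 0.0
    distribution = Counter(sys_cat for sys_cat, _ in assignments)  # proposals per assigned category
    return pct_successful, distribution

# Hypothetical run over three proposals: two categorized correctly.
pairs = [("immunology", "immunology"),
         ("virology", "immunology"),
         ("virology", "virology")]
print(categorization_summary(pairs))
# -> approximately (66.7, Counter({'virology': 2, 'immunology': 1}))
```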


How we’ll get there.

1. Understand impact of disruptive technologies.

2. Use KM to align workflows and data flows.

3. Answer the hard, but practical questions.


Life Cycle of Disruptive Technologies

[Diagram: the three-stage life cycle shown earlier. Stage 1, driver: concept, goal: demonstration. Stage 2, driver: true believers, goal: niche applications. Stage 3, driver: competition, goal: maturity / diffusion.]

Myers et al., “Practitioner’s View: Evolutionary Stages of Disruptive Technologies,” IEEE Transactions, vol. 49, no. 4, Nov. 2002.


Workflows

[Diagram: workflow view. Inputs (data, docs, decisions, tasks) are captured, managed, and released as outputs (data, docs, disks, decisions, tasks) to a display.]


Data Flows

[Diagram: data-flow view of the same pipeline. Inputs (data, docs, decisions, tasks) are captured (prep/input), processed (extract/merge), stored, processed again (collate/sort), and released as outputs (data, docs, disks, decisions, tasks) to a display; a minimal sketch of the stages follows.]
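The capture / manage / release stages in the diagram map naturally onto a staged pipeline. The sketch below is a minimal illustration in Python: the stage names follow the diagram, while the function bodies are trivial placeholders standing in for real prep/input, extract/merge, store, and collate/sort logic.

```python
# Illustrative only: placeholder logic for the capture -> manage -> release stages.

def capture(inputs):
    """prep/input: normalize the incoming data and documents."""
    return [item.strip().lower() for item in inputs]

def manage(records):
    """extract/merge the captured records, then store them."""
    merged = sorted(set(records))   # extract/merge: de-duplicate and order
    return list(merged)             # store: stand-in for a database or index

def release(store):
    """collate/sort the stored records and hand them to the display."""
    return sorted(store)            # outputs feed downstream decisions and tasks

# Hypothetical inputs (data and docs) flowing through the pipeline.
print(release(manage(capture(["Proposal A ", "proposal a", "Proposal B"]))))
# -> ['proposal a', 'proposal b']
```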


Key Questions

Assessment
• Do we have a credible means of verifying best practices?
• Do we have baselines, and are we ready to assess impacts?

Inputs
• Is the XML corpus ready? (If so, when and where?)
• Do we have the needed data sets?

Staff
• Does the contractor have the needed resources and skill sets?

Management
• Do we have the staff to manage and oversee the project?

Budget
• Do we have funds and a plan for a full-scale implementation?
• Do we have Phase 2 funds for pre-production piloting?

Readiness
• Do we have the pilot sites identified, with buy-in?
• Is the organization ready?


IC of the Future

IC of the future must serve the needs of several end-users:
– experimental biologists,
– clinical researchers,
– science administrators, and
– even public health officials.

Biology today is quantitative; it depends on computers for the production, analysis, and management of scientific data.

S. J. Wiback and B. O. Palsson, Biophysical Journal, 8:2002


END


Expanded NIH KM Overview: Assisted Specialized Taxonomy Generation


Expanded NIH KM Overview: Research Proposal Archiving & Collaborative Resources


Project Plan

[Diagram: the life-cycle diagram spanning the conceptual, design, and pre-production phases.]

Source: Management of Systems Engineering, Wilton P. Chase.


PM Plan

[Diagram: the life-cycle diagram annotated with the roles involved: internal, leadership, and end-user.]

Source: Management of Systems Engineering, Wilton P. Chase.


KM Project Overview

[Diagram: the life-cycle diagram spanning the conceptual, design, and (pre-)production phases.]

Source: Management of Systems Engineering, Wilton P. Chase.


Conceptual Phase

[Diagram: the conceptual phase of the life cycle, showing MITRETEK, the candidate tools collexis, semio, stratify, inxight, and i411, Pilot #1 (situational awareness), and Pilot #2 (reviewer selection), with a “Recommended” marker.]

Source: Management of Systems Engineering, Wilton P. Chase.