
Qualitative Risk Assessment

Charles Yoe, PhD cyoe1@verizon.net

Institute for Water Resources 2009

The Need

• Manage risk intentionally

• Do better than has been done

• Quantitative risk assessment not always possible or necessary

• Qualitative risk assessment can be a viable option
– Partial assessments are often useful

Use Qualitative Assessment

• When consistency and transparency in handling risk are desired

• When theory, data, time or expertise are limited

• When dealing with broadly defined problems where quantitative risk assessment is impractical

Qualitative Risk Assessment

• The process of compiling, combining and presenting evidence to support a statement about risk
– Descriptive or categorical treatment of information

• Is a formal, organized, reproducible method based on science and sound evidence

• Flexible and consistent
• Easy to explain to others
• Supports risk management decision making

Qualitative Methods Toolbox

• Increase or Decrease Risk
• Risk Narratives
• Evidence Mapping
• Screening
• Ratings
• Rankings

• Enhanced Criteria Ranking

• Operational Risk Management (Risk Matrix)

• Develop a Generic Process

• Qualitative Assessment Models

• Multi-Criteria Decision Analysis

Qualitative Assessment

• May include all or just some of the risk assessment steps

• Qualitative risk characterization is usually the endpoint

• Not every qualitative assessment is a true risk assessment

For Any Method

• Necessary preparation!
– Identify the problem
– Identify the goals
– Identify the questions to be answered

• Use an assessment framework

Increase or Decrease Risk

• For some problems it may be enough to know if things are getting better or worse

• Identify the direction of change in a risk and the specific reasons for it
– Storm damage has weakened structure
– Funding uncertainty clouds future

• Clarifies thinking and rationale

• Not good for netting changes

Risk Narratives

• What can go wrong?
• How can it happen?
• How likely is it?
• What are the consequences?

• Use simple narratives that answer these questions honestly
• Tell story of existing risk
• Tell story of mitigation effectiveness (risk reduction)
• Tell story of residual, transferred or transformed risk

Evidence Frameworks

• A risk evaluation technique
• Identify how experts evaluate current scientific evidence on chosen topics
• What conclusions do they reach regarding risk potential
• What evidence/arguments do they use to justify conclusions
• What consensus/disagreement exists
• What uncertainties remain

Source: Risk evaluation of the health effects of mobile phone communication (10/2005) by Peter Wiedermann, Holger Schütz, and Albena Spangenberg

Core Elements

• Evidence base or data
• Pro and con arguments, the warrants
– Includes respective supporting or attenuating arguments
• Conclusions, i.e., claims about the existence of a hazard, with remaining uncertainties

Ordering Techniques

• Screening, rating and ranking with increasing levels of detail/information

• Used to identify hazards, commodities, commodity-pathogen pairs, pathways, mitigation measures, potential risk and the like that are of interest to decision makers

Screening

• Process of separating elements into categories of interest and no interest through systematic elimination

• Requires
– Items to be screened
– Carefully defined categories (yes/no)
– Criteria for screening
– Evidence for the criteria
– An algorithm for using the criteria to separate the items into the desired categories

Screening Algorithms

• Domination procedures (better/worse on all criteria)

• Conjunctive procedures (meets all criteria thresholds)

• Disjunctive procedures (meets at least one criterion threshold)

• Elimination by aspects (set cut-off value for most important criterion and eliminate, then set cut-off value for next most important criterion, etc.)

• Lexicographic rules (rank the criteria by importance, then order the alternatives on the most important criterion, breaking ties with the next)
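The first two rules above lend themselves to a direct implementation. The sketch below is only an illustration of conjunctive versus disjunctive screening; the items, criteria, and thresholds are hypothetical and not taken from the presentation.

```python
# Illustrative screening sketch (hypothetical items, criteria, and thresholds).
# Conjunctive rule: keep an item only if it meets EVERY criterion threshold.
# Disjunctive rule: keep an item if it meets AT LEAST ONE criterion threshold.

items = {
    "Gate A": {"age": 25, "uses_per_year": 365, "consequence": 3},
    "Gate B": {"age": 8,  "uses_per_year": 12,  "consequence": 1},
    "Gate C": {"age": 15, "uses_per_year": 200, "consequence": 2},
}

# Minimum value an item must reach to be "of interest" on each criterion.
thresholds = {"age": 20, "uses_per_year": 100, "consequence": 2}

def conjunctive(item):
    """Meets all criteria thresholds."""
    return all(item[c] >= t for c, t in thresholds.items())

def disjunctive(item):
    """Meets at least one criterion threshold."""
    return any(item[c] >= t for c, t in thresholds.items())

of_interest_conj = [name for name, vals in items.items() if conjunctive(vals)]
of_interest_disj = [name for name, vals in items.items() if disjunctive(vals)]

print("Conjunctive screen keeps:", of_interest_conj)   # only Gate A
print("Disjunctive screen keeps:", of_interest_disj)   # Gates A and C
```

Elimination by aspects would apply one threshold at a time in order of criterion importance, dropping items as it goes; domination would compare items pairwise across all criteria.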

Rating

• Systematic process of separating elements into multiple categories of varying degrees of interest
– Individual items are rated

• High, medium, low, no risk

• Requires
– Items to be rated
– Carefully defined categories (non-ordinal is okay)
– Criteria for rating
– Evidence for the categories
– An algorithm for using the criteria to separate the items into the desired categories

Ranking

• Systematic process used to put items in an ordinal sequence
– Rated items can be ranked

• May rely on ordinal ranked categories or an ordinal ranking of each individual item

• Simple when objective measures of a risk or other characteristic of interest are available

• Requires
– Items to be ranked (alternatives)
– Carefully defined science-based criteria for ranking
– Evidence of each item's measurement or rating for each criterion
– Differential weights for criteria when appropriate
– A synthesis algorithm
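Where the evidence supports it, the synthesis algorithm can be as simple as a weighted sum of criterion ratings. The sketch below is a minimal illustration of that idea; the alternatives, criteria, rating-to-number scale, and weights are all assumptions, not part of the presentation.

```python
# Illustrative ranking sketch: weighted-sum synthesis of criterion ratings.
# Alternatives, ratings, scale, and weights are hypothetical.

RATING_SCORE = {"H": 3, "M": 2, "L": 1}   # ordinal ratings mapped to numbers

alternatives = {
    "Item 1": {"age": "H", "use": "L", "consequence": "M"},
    "Item 2": {"age": "M", "use": "H", "consequence": "L"},
    "Item 3": {"age": "L", "use": "M", "consequence": "H"},
}

# Differential weights (summing to 1) used when criteria are not equally important.
weights = {"age": 0.25, "use": 0.25, "consequence": 0.50}

def weighted_score(ratings):
    return sum(weights[c] * RATING_SCORE[r] for c, r in ratings.items())

# Rank in descending order of the synthesized score.
ranked = sorted(alternatives.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)

for rank, (name, ratings) in enumerate(ranked, start=1):
    print(rank, name, ratings, round(weighted_score(ratings), 2))
```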

Enhanced Criteria Based Ranking

• Criteria
• Ratings
• All Possible Combinations of Ratings
• Ranking
• Evaluate Reasonableness of Ranking
• Add Criteria
• New Combinations of Ratings
• New Ranking

Question?

• Which lock gates in the division present the greatest potential risk to health and safety and therefore should be repaired first?

Step One: Criteria

• Assume criteria equally important (or not).
• Reflect most important aspects of evaluating risk.
• Define H, M, L scenarios for each criterion.
• Use three or four evidence-based criteria.

Criterion 1: Age
– H = Twenty years of age and above
– M = Ten to twenty years of age
– L = Zero to ten years of age

Criterion 2: Frequency of Use
– H = Daily use (approximately 365 times a year)
– M = Greater than one and less than 365 times a year
– L = Annual use (once a year)

Criterion 3: Consequence of Failure
– H = Loss of life and/or property
– M = Structure damage
– L = Minimal loss of property and/or damage

Step Two: Rating

• Use expert judgment to critically evaluate the available information
• Develop estimates for each "hazard" against the criteria
• Use letters or numbers, but numbers do not represent an absolute measurement of risk, only a relative means for comparison

Gate            Criterion 1   Criterion 2   Criterion 3
Knightsbridge        H             L             M
Steadly              H             M             M
Redwood              M             H             L
Jackflash            M             H             L
Cantget              L             L             L
Roughjustice         H             M             L
IORR                 L             M             H
19                   L             H             L

Step Three: All Possible Combinations

• Greatest Risk: HHH
• HHM, HMH, MHH
• HHL, HLH, LHH, HMM, MMH, MHM
• HLM, MHL, HML, LMH, MLH, MMM, LHM
• HLL, LHL, LLH, MML, LMM, MLM
• MLL, LML, LLM
• Least Risk: LLL

This is for equally weighted criteria. Unequal weights yield different listings.
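The tiers above can be reproduced mechanically. The short sketch below enumerates the 27 combinations and groups them with an assumed equal-weight score (H = 2, M = 1, L = 0); the numeric scale is only a device for ordering the tiers, not part of the presentation.

```python
from itertools import product
from collections import defaultdict

# Enumerate all H/M/L combinations for three equally weighted criteria and
# group them into tiers by a simple equal-weight score (assumed H=2, M=1, L=0).
SCORE = {"H": 2, "M": 1, "L": 0}

tiers = defaultdict(list)
for combo in product("HML", repeat=3):
    rating = "".join(combo)
    tiers[sum(SCORE[c] for c in rating)].append(rating)

# Print tiers from greatest to least relative risk.
for score in sorted(tiers, reverse=True):
    print(score, tiers[score])
```

With unequal weights the score becomes a weighted sum and, as the slide notes, the tiers come out differently.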

Step Four: Rank Subjectively

• Establish rank according to descending relative risk

• Identify subjective clusters.

Gate            Rating   Ranking
Steadly          HMM     Greatest Risk
Roughjustice     HML     Greatest Risk
Jackflash        MHL     Greatest Risk
Knightsbridge    HLM     Moderate Risk
Redwood          MHL     Moderate Risk
IORR             LMH     Moderate Risk
19               LHL     Moderate Risk
Cantget          LLL     Least Risk

Step Five: Add Criteria?

• Look at rankings, do they make sense?
• Have you thought properly about this issue?
• If they do not, perhaps you did not consider all the most relevant criteria
• A new criterion may be added to more accurately reflect the assessor's rationale for ranking

Step Five: Add Criteria? (cont)

• Suppose the following was added to our example

• Criterion 4: Cost of emergency repair

– H = Major disruptions to navigation or power, much higher costs to repair

– M = Much higher costs to repair

– L = Same as scheduled repair

Step Six: New Ratings

Gate            Criterion 4 Rating   New Combined Rating
Steadly                 H                  HMMH
Jackflash               H                  MHLH
Knightsbridge           H                  HLMH
Redwood                 M                  MHLM
IORR                    M                  LMHM
19                      H                  LHLH
Roughjustice            L                  HMLL
Cantget                 H                  LLLH

Step Seven: New Ranking

Gate            Combined Rating   New Ranking
Steadly              HMMH         Greatest Risk
Jackflash            MHLH         Greatest Risk
Knightsbridge        HLMH         Greatest Risk
Redwood              MHLM         Moderate Risk
IORR                 LMHM         Moderate Risk
19                   LHLH         Moderate Risk
Roughjustice         HMLL         Moderate Risk
Cantget              LLLH         Least Risk
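As an illustration, the re-ranking in Steps Six and Seven can be reproduced with the same kind of scoring. The sketch below uses the assumed H = 2, M = 1, L = 0 scale and score cut-offs chosen to match the slide's subjective clusters; neither the scale nor the cut-offs come from the presentation.

```python
# Gate ratings from the example (criteria: age, use, consequence, repair cost).
ratings = {
    "Steadly": "HMMH", "Jackflash": "MHLH", "Knightsbridge": "HLMH",
    "Redwood": "MHLM", "IORR": "LMHM", "19": "LHLH",
    "Roughjustice": "HMLL", "Cantget": "LLLH",
}

SCORE = {"H": 2, "M": 1, "L": 0}          # assumed equal-weight scale

def total(rating):
    return sum(SCORE[letter] for letter in rating)

# Sort gates by descending total score; cluster cut-offs chosen to mirror the slide.
for gate, rating in sorted(ratings.items(), key=lambda kv: total(kv[1]), reverse=True):
    score = total(rating)
    cluster = "Greatest" if score >= 5 else "Moderate" if score >= 3 else "Least"
    print(f"{gate:14s} {rating}  score={score}  {cluster} Risk")
```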

Operational Risk Management (ORM)

Steps

• Determine purpose and use of matrix
– Identify the question to be answered

• Define consequences of interest

• Identify consequence ranges and definitions

• Identify likelihood ranges and definitions

• Identify levels of risk in the cells of the matrix

Your DE Has Seen This

• “Mishap Risk”

• DOD "Standard Practice For System Safety”

• MIL-STD-882D 10 February 2000

Consequence Severities

Probability Levels

Risk Assessment Values

• Each risk you assess is placed in a cell and managed accordingly

Risk Levels
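In code, a risk matrix is simply a lookup table: each (probability, severity) pair maps to a cell, and the cell holds the risk level used to manage it. The sketch below is generic; the category names echo the MIL-STD-882 style, but the level assigned to each cell is an assumption for illustration, not the standard's table.

```python
# Generic risk-matrix lookup (illustrative cell assignments, not MIL-STD-882D's).
SEVERITY = ["Negligible", "Marginal", "Critical", "Catastrophic"]
PROBABILITY = ["Improbable", "Remote", "Occasional", "Probable", "Frequent"]

# One risk level per (probability, severity) cell; rows = probability, columns = severity.
MATRIX = [
    # Negligible, Marginal,  Critical,  Catastrophic
    ["Low",      "Low",      "Medium",  "Medium"],   # Improbable
    ["Low",      "Low",      "Medium",  "High"],     # Remote
    ["Low",      "Medium",   "High",    "High"],     # Occasional
    ["Medium",   "Medium",   "High",    "Extreme"],  # Probable
    ["Medium",   "High",     "Extreme", "Extreme"],  # Frequent
]

def risk_level(probability, severity):
    """Place a risk in its cell and return the level used to manage it."""
    return MATRIX[PROBABILITY.index(probability)][SEVERITY.index(severity)]

print(risk_level("Occasional", "Critical"))   # -> High
```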

Another Example

Source: Assessing Environmental Risk, A Lecture to the Irish Environmental Law Association By: L. M. Ó Cléirigh, 29 June 2004

Risk Matrix

Three Axioms

• Weak consistency
• Betweenness
• Consistent coloring
• 3x3 and 4x4 should look like this to minimize problems

[Diagram: recommended placement of Low and High cells in a 3x3 risk matrix]

Source: What’s Wrong with Risk Matrices? by Louis Anthony Cox, Risk Analysis, Vol. 28, No. 2, 2008

The Risk Management Point of the Matrix

[Matrix diagram: Probability of Adverse Impact vs. Consequence of Adverse Impact]

Generic Process

Risk = (Age x Condition x Usage x Event) x (Economic Damage Potential + Environmental Damage Potential + Perceived Damage Potential)
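As a worked illustration of the generic process formula as reconstructed above, the sketch below multiplies ordinal scores for the likelihood-related factors and sums the three damage potentials; the 1-to-3 scales and the example numbers are invented.

```python
# Illustrative use of the generic process formula:
# Risk = (Age x Condition x Usage x Event)
#        x (Economic + Environmental + Perceived damage potential)
# All scores below are hypothetical ordinal values on a 1-3 scale (1 = low, 3 = high).

def generic_risk(age, condition, usage, event, economic, environmental, perceived):
    likelihood_side = age * condition * usage * event
    consequence_side = economic + environmental + perceived
    return likelihood_side * consequence_side

# Example: an old, heavily used structure with moderate damage potentials.
print(generic_risk(age=3, condition=3, usage=3, event=2,
                   economic=2, environmental=2, perceived=1))   # -> 270 (relative index only)
```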

Qualitative Assessment Models

• Probability the hazard exists: rank as H, M, L

• Probability an adverse consequence occurs if exposed: rank as H, M, L

• Probability exposure occurs: rank as H, M, L

• Overall risk estimate integrates hazard, consequence, and exposure: rank as H, M, L
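The presentation does not prescribe how the three H, M, L judgments are integrated, so the sketch below assumes one simple rule (average the ordinal values and map back to a category) purely to show that the integration step can be made explicit and repeatable.

```python
# Assumed integration rule for the qualitative model above: average the ordinal
# values of the three judgments and map the result back to H/M/L.
LEVELS = {"H": 3, "M": 2, "L": 1}

def overall_risk(hazard, consequence, exposure):
    avg = (LEVELS[hazard] + LEVELS[consequence] + LEVELS[exposure]) / 3
    if avg >= 2.5:
        return "H"
    if avg >= 1.5:
        return "M"
    return "L"

print(overall_risk(hazard="H", consequence="M", exposure="L"))   # -> M
```

A stricter rule would let the lowest of the three ratings cap the overall estimate; what matters is that the combination rule is stated and applied consistently.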

MCDM/MCDA

• Multi-Criteria Decision Making (MCDM) is the study of methods and procedures by which concerns about multiple conflicting criteria can be formally incorporated into the management planning process

• Decision maker contemplates choice of action in an uncertain environment

• MCDA helps people choose among a set of pre-specified alternatives

• Decision making relies on information about these alternatives

• Quality of information can range from scientifically derived hard data to subjective interpretations

• Outcomes of decisions may be certain (deterministic information) or uncertain, represented by probabilities or fuzzy numbers

• MCDA can assist in information processing and may lead to better decisions
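A minimal MCDA sketch, assuming a weighted-sum model: criteria measured on different scales are normalized to the 0-1 range, weighted, and summed for each alternative. The plans, criterion values, weights, and normalization choice are all hypothetical.

```python
# Minimal weighted-sum MCDA sketch (hypothetical alternatives, weights, and data).
alternatives = {
    "Plan A": {"cost": 120.0, "risk_reduction": 0.8, "acceptability": 3},
    "Plan B": {"cost": 60.0,  "risk_reduction": 0.5, "acceptability": 4},
    "Plan C": {"cost": 90.0,  "risk_reduction": 0.7, "acceptability": 2},
}
weights = {"cost": 0.3, "risk_reduction": 0.5, "acceptability": 0.2}
benefit = {"cost": False, "risk_reduction": True, "acceptability": True}  # cost is minimized

def normalized(criterion, value):
    """Min-max scale a criterion value to [0, 1], flipping criteria to be minimized."""
    values = [a[criterion] for a in alternatives.values()]
    lo, hi = min(values), max(values)
    scaled = (value - lo) / (hi - lo)
    return scaled if benefit[criterion] else 1.0 - scaled

def score(alt):
    return sum(weights[c] * normalized(c, v) for c, v in alt.items())

# Rank alternatives by descending weighted score.
for name, alt in sorted(alternatives.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(name, round(score(alt), 3))
```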

Take Away Points

• Not all risk assessment needs to be quantitative

• Develop a few consistent and well-developed techniques for your own use

Charles Yoe, Ph.D. cyoe1@verizon.net

Questions?
