
1

Using Formal Models of Utility to Guide the Development of Safety-Critical Systems

Chris Johnson

University of Glasgow, Scotland. http://www.dcs.gla.ac.uk/~johnson

2

Can PRA Guide Formal Methods?

3

Occurrence rankings (probability of failure):

  Very High: failure is almost inevitable – 1 in 2 (rank 10), 1 in 3 (rank 9)
  High: repeated failures – 1 in 8 (rank 8), 1 in 20 (rank 7)
  Moderate: occasional failures – 1 in 80 (rank 6), 1 in 400 (rank 5), 1 in 2,000 (rank 4)
  Low: relatively few failures – 1 in 15,000 (rank 3), 1 in 150,000 (rank 2)
  Remote: failure is unlikely – 1 in 1,500,000 (rank 1)

Detection rankings (likelihood that the Design Control will detect a potential Cause of failure or subsequent Failure Mode):

  Absolute Uncertainty – no detection, or there is no Design Control (rank 10)
  Very Remote – very remote chance of detection (rank 9)
  Remote – remote chance of detection (rank 8)
  Very Low – very low chance of detection (rank 7)
  Low – low chance of detection (rank 6)
  Moderate – moderate chance of detection (rank 5)
  Moderately High – moderately high chance of detection (rank 4)
  High – high chance of detection (rank 3)
  Very High – very high chance of detection (rank 2)
  Almost Certain – detection is almost certain (rank 1)

Severity rankings (effect of the failure):

  Hazardous without warning – potential failure mode affects safe operation or involves non-compliance with a government regulation, without warning (rank 10)
  Hazardous with warning – failure affects safe product operation or involves non-compliance with government regulation, with warning (rank 9)
  Very High – product is inoperable, with loss of primary function (rank 8)
  High – product is operable, but at a reduced level of performance (rank 7)
  Moderate – product is operable, but comfort or convenience item(s) are inoperable (rank 6)
  Low – product is operable, but comfort or convenience item(s) operate at a reduced level of performance (rank 5)
  Very Low – fit & finish or squeak & rattle item does not conform; most customers notice the defect (rank 4)
  Minor – fit & finish or squeak & rattle item does not conform; average customers notice the defect (rank 3)
  Very Minor – fit & finish or squeak & rattle item does not conform; discriminating customers notice the defect (rank 2)
  None – no effect (rank 1)
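These three rankings combine into the usual Risk Priority Number (RPN = severity × occurrence × detection) discussed later in the talk. A minimal sketch of that calculation; the failure mode and the particular ranking values chosen below are hypothetical illustrations, not taken from any case study:

```python
# Minimal RPN sketch: Risk Priority Number = severity * occurrence * detection.
# The ranking values below are illustrative lookups against the tables above;
# the failure mode itself is hypothetical.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Each ranking is an integer in 1..10, as in the tables above."""
    for name, value in (("severity", severity),
                        ("occurrence", occurrence),
                        ("detection", detection)):
        if not 1 <= value <= 10:
            raise ValueError(f"{name} ranking must be in 1..10, got {value}")
    return severity * occurrence * detection

# Hypothetical failure mode: "hazardous with warning" (severity 9),
# "1 in 400" occurrence (5), "remote" chance of detection (8).
print(rpn(severity=9, occurrence=5, detection=8))  # -> 360
```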

4

Classical Decision Theory

(f1,p1;f2,p2;…;fn,pn)

∀s ∈ S: V(s) = (Π_{i=1..n} pi) · u(f1 ∧ f2 ∧ … ∧ fn)

∀s, s1 ∈ S: s1 ≽_risk s ⇔ V(s) > V(s1).
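Read this way, V(s) weights the utility of the joint consequence f1 ∧ … ∧ fn by the product of the individual probabilities, i.e. treating them as independent. A minimal sketch of that reading; the independence treatment and any utility value a caller supplies are assumptions, not prescriptions from the slides:

```python
from math import prod

# Minimal sketch of V(s) as reconstructed above:
# V(s) = (p1 * p2 * ... * pn) * u(f1 ∧ f2 ∧ ... ∧ fn),
# i.e. the consequence probabilities are treated as independent and u is an
# analyst-supplied utility for the joint outcome (an assumption, not given here).

def value(probabilities: list, joint_utility: float) -> float:
    """V(s) for a strategy s = (f1, p1; ...; fn, pn)."""
    return prod(probabilities) * joint_utility

# Example with made-up numbers: two failure probabilities and a joint (dis)utility.
print(value([0.001, 0.01], -10_000.0))   # -> -0.1
```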

5

Applications of Classical Decision Theory

pump_v(exchanger_error, 0.0000000003, bbv_error, 0.00000241) ≽_risk analyser_v(analyser_error, 0.0000000003)

compound_failure ≡ exchanger_error ∧ bbv_error ⇒ AX(display_exchanger_warning ∧ display_bbv_warning)

ordered_response_failure ≡ compound_failure ∧ analyser_failure ⇒ AX(start_standby_pump ∧ EF(display_analyser_warning ∧ reroute_analysis))
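Plugging the probabilities from the pump_v and analyser_v prospects into the earlier V(s) sketch gives a direct comparison. The utility values attached to each joint failure below are purely hypothetical placeholders, since the slides do not supply them:

```python
from math import prod

# Comparison of the two prospects on this slide using the V(s) sketch above.
# Failure probabilities are taken from the slide; the utilities (costs) of the
# joint failures are hypothetical placeholders.

def value(probabilities, joint_utility):
    return prod(probabilities) * joint_utility

U_PUMP = -1_000_000.0     # assumed disutility of exchanger_error ∧ bbv_error
U_ANALYSER = -100_000.0   # assumed disutility of analyser_error

v_pump = value([0.0000000003, 0.00000241], U_PUMP)
v_analyser = value([0.0000000003], U_ANALYSER)

# The ≽_risk ordering above compares these two values directly.
print(f"V(pump_v) = {v_pump:.3e}")          # ~ -7.230e-10
print(f"V(analyser_v) = {v_analyser:.3e}")  # ~ -3.000e-05
```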

6

Decision Theory and Formal Methods

s0 |= AX(f ; y) iff:
  #{Fk ∈ F | (s0, s1) ∈ Fk ∧ s1 |= f} / #{Fj ∈ F | (s0, s1) ∈ Fj} = y

s0 |= AX[f P y] iff:
  #{sx ∈ P | sx |= f} / #{sx ∈ P} = y

[Diagram: a transition structure showing the initial state s0, the failure-transition set F and the state set P.]
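As a rough illustration of this frequency-based reading of AX, here is a minimal sketch over a tiny hand-made transition system; the states, labels and helper functions below are all hypothetical, not the semantics of any particular tool:

```python
# Minimal sketch of the frequency-based AX operator above:
# s0 |= AX(phi; y) holds when the fraction of outgoing transitions from s0
# that lead to a phi-state equals y. The tiny model below is hypothetical.

from fractions import Fraction

# successors of each state, and the atomic propositions true in each state
SUCCESSORS = {"s0": ["s1", "s2", "s3", "s4"]}
LABELS = {"s1": {"warning"}, "s2": {"warning"}, "s3": set(), "s4": {"warning"}}

def ax_frequency(state: str, prop: str) -> Fraction:
    """Fraction of the next states of `state` in which `prop` holds."""
    nxt = SUCCESSORS[state]
    satisfying = sum(1 for s in nxt if prop in LABELS[s])
    return Fraction(satisfying, len(nxt))

def holds_ax(state: str, prop: str, y: Fraction) -> bool:
    """state |= AX(prop; y) under the frequency reading sketched above."""
    return ax_frequency(state, prop) == y

print(ax_frequency("s0", "warning"))              # 3/4
print(holds_ax("s0", "warning", Fraction(3, 4)))  # True
```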

7

RPN Paradoxes

8

Decision Theory and Formal Methods

• Issues with probability:
  – limited incident data;
  – relational databases;
  – poor interpretation.

9

Why Bother with Utility?

[Graph: expected marginal utility of resources (y axis) against expenditure (x axis), marking the point of diminishing returns.]

10

Why Bother with Utility?

[Graph: curves plotted over two attributes X1 and X2 (axes marked 0, 1, 2).]

11

Why Bother with Utility?

H. Kortner and A. Kjellsen, Det Norske Veritas - 2000.

[Graph: cost in Euros (0–4,000) against maintenance interval in months (1–23), plotting the cost of failure, the cost of maintenance and the total cost.]
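The trade-off behind that graph can be reproduced with a toy model: as the maintenance interval grows, maintenance cost falls but expected failure cost rises, so the total cost has an interior minimum. A minimal sketch; the cost formulas and constants below are invented for illustration, not taken from the Det Norske Veritas data:

```python
# Toy reconstruction of the maintenance-interval trade-off: maintenance cost
# falls with the interval, expected failure cost rises, and the total has an
# interior minimum. All constants below are invented for illustration.

def maintenance_cost(interval_months: int) -> float:
    return 3000.0 / interval_months           # fewer services -> cheaper

def failure_cost(interval_months: int) -> float:
    return 150.0 * interval_months            # longer gaps -> more failures

def total_cost(interval_months: int) -> float:
    return maintenance_cost(interval_months) + failure_cost(interval_months)

best = min(range(1, 24), key=total_cost)      # intervals of 1..23 months, as on the axis
print(best, round(total_cost(best), 1))       # e.g. 4 months, ~1350 Euros
```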

12

Why Bother with Utility

[Graph: trade-off between Attribute A (e.g. ride quality) and Attribute B (e.g. speed).]

13

Standard Models of Utility

• Users have a consumption set X.

• Trade-offs exist between elements of X:

• There are preference relations over X:
  – x1 ≽ x2: "x1 is at least as good as x2".

• Axioms avoid paradoxes & define “rationality”.

14

Rationality Axiom 1: Completeness

• For any x1, x2 ∈ X, either x1 ≽ x2 or x2 ≽ x1.

• Implication 1. The Completeness Axiom makes the unrealistic assumption that designers will always be able to express a preference between any two of the strategies or plans that they might exploit.

15

Rationality Axiom 2: Reflexivity

• For all x ∈ X, x ≽ x.

• Implication 2 The Reflexivity Axiom states that any alternative is at least as good as itself but designers may associate different values with different means of obtaining the same outcome.

16

Rationality Axiom 3: Transitivity

• For any x1, x2, x3 ∈ X:
  if x1 ≽ x2 and x2 ≽ x3 then x1 ≽ x3.

• Implication 3 The Transitivity Axiom makes an unrealistic assumption that users act as “rational” consumers in a technical environment that they may not fully understand.
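Taken together, the three rationality axioms are straightforward to check mechanically on a finite set of alternatives. A minimal sketch over an invented three-element preference relation (the relation and element names are examples, not data from the talk):

```python
from itertools import product

# Minimal checks of the three rationality axioms (completeness, reflexivity,
# transitivity) for a weak-preference relation given as a set of pairs (x, y)
# meaning "x is at least as good as y". The example relation is invented.

X = {"a", "b", "c"}
PREF = {("a", "a"), ("b", "b"), ("c", "c"),
        ("a", "b"), ("b", "c"), ("a", "c")}   # a ≽ b ≽ c

def complete(rel, xs):
    return all((x, y) in rel or (y, x) in rel for x, y in product(xs, xs))

def reflexive(rel, xs):
    return all((x, x) in rel for x in xs)

def transitive(rel, xs):
    return all((x, z) in rel
               for x, y, z in product(xs, xs, xs)
               if (x, y) in rel and (y, z) in rel)

print(complete(PREF, X), reflexive(PREF, X), transitive(PREF, X))  # True True True
```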

17

Preference Topologies

• Definition 1 (≽ preference):
  constrained to satisfy the rationality axioms.

• Definition 2 (>> strict preference):
  x1 >> x2 iff x1 ≽ x2 and ¬(x2 ≽ x1).

• Definition 3 (~ indifference):
  x1 ~ x2 iff x1 ≽ x2 and x2 ≽ x1.
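The derived relations follow mechanically from the weak preference; a minimal sketch, reusing the invented relation from the axiom checks above:

```python
# Strict preference (>>) and indifference (~) derived from a weak-preference
# relation, following Definitions 2 and 3. PREF is the invented example
# relation used in the axiom checks above (a ≽ b ≽ c).

PREF = {("a", "a"), ("b", "b"), ("c", "c"),
        ("a", "b"), ("b", "c"), ("a", "c")}

def strictly_prefers(rel, x1, x2):
    """x1 >> x2 iff x1 ≽ x2 and not (x2 ≽ x1)."""
    return (x1, x2) in rel and (x2, x1) not in rel

def indifferent(rel, x1, x2):
    """x1 ~ x2 iff x1 ≽ x2 and x2 ≽ x1."""
    return (x1, x2) in rel and (x2, x1) in rel

print(strictly_prefers(PREF, "a", "b"))  # True
print(indifferent(PREF, "a", "a"))       # True
```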

18

Preference Topologies

• For some point x0 = (x0_1, x0_2):

• At least as good as: {x | x ∈ X, x ≽ x0}.

• No better than: {x | x ∈ X, x0 ≽ x}.

• Worse than: {x | x ∈ X, x0 >> x}.

• Preferred: {x | x ∈ X, x >> x0}.

• Indifferent: {x | x ∈ X, x ~ x0}.
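On a finite grid these five sets can be enumerated directly once a preference is fixed. A minimal sketch, assuming purely for illustration that bundles are compared by the sum of their two components:

```python
from itertools import product

# Enumerate the preference sets around a reference bundle x0 on a small grid.
# The weak preference used here is an assumption for illustration only:
# x ≽ y iff x1 + x2 >= y1 + y2.

def weakly_prefers(x, y):
    return sum(x) >= sum(y)

GRID = list(product(range(3), range(3)))   # bundles (x1, x2) with components 0..2
x0 = (1, 1)

at_least_as_good = [x for x in GRID if weakly_prefers(x, x0)]
no_better_than   = [x for x in GRID if weakly_prefers(x0, x)]
preferred        = [x for x in GRID if weakly_prefers(x, x0) and not weakly_prefers(x0, x)]
worse_than       = [x for x in GRID if weakly_prefers(x0, x) and not weakly_prefers(x, x0)]
indifferent      = [x for x in GRID if weakly_prefers(x, x0) and weakly_prefers(x0, x)]

print(indifferent)   # [(0, 2), (1, 1), (2, 0)] -- the "indifference curve" through x0
```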

19

Preference Topologies

[Graph: the regions >>(x0), <<(x0) and ~(x0) around a point x0, plotted over two quantities X1 and X2, with more of each quantity preferred.]

- Shows X as a 2D vector of reals.
- Paradox to left of x0.

- So introduce additional axioms.

20

Axioms of Taste: Continuity

• For all x ∈ Rn, both the "at least as good as" set ≽(x) and the "no better than" set ≼(x) are closed.

• Implication 4 The Continuity Axiom ensures topological nicety and is neutral with respect to safety-critical development.

21

Axioms of Taste: Strict Monotonicity

• For all x0, x1 ∈ Rn+: if x0 is greater than or equal to x1 then x0 ≽ x1, while if x0 is strictly greater than x1 then x0 >> x1.

• Implication 5 The Axiom of Strict Monotonicity fails to characterise certain aspects of safety-critical development in which more of a resource can yield a worse design.
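Implication 5 can be made concrete: under strict monotonicity, more of a resource can never yield a strictly worse design, yet a safety-critical example such as adding ever more alarms plausibly violates this. A minimal sketch with an invented, single-attribute "design utility" that peaks and then declines (the formula and constants are assumptions for illustration):

```python
# Illustration of why strict monotonicity can fail for safety-critical design:
# an invented utility over the "number of alarms" that rises to a peak and
# then declines, so more of the resource eventually yields a worse design.

def design_utility(alarms: int) -> float:
    coverage = 1.0 - 0.5 ** alarms            # diminishing benefit of extra alarms
    overload = 0.05 * alarms                  # operator overload grows linearly
    return coverage - overload

monotone = all(design_utility(n + 1) >= design_utility(n) for n in range(0, 20))
print(monotone)                               # False: utility falls once overload dominates
print(max(range(0, 20), key=design_utility))  # the alarm count with peak utility (4 here)
```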

22

Axioms of Taste: Strict Monotonicity

[Graph: the regions >>(x0) and <<(x0) around x0, over quantities X1 and X2 (more of each quantity preferred), with points x1, x2 and xt marked.]

Continuity reduces indifference region. Monotonicity ensures all preferred sets are strictly above indifference sets (non-satiation).

23

Axioms of Taste: Strict Convexity

• If x1 ≽ x0 and x1 ≠ x0 then t·x1 + (1−t)·x0 >> x0 for all t ∈ (0, 1).

• Implication 6 The Axiom of Strict Convexity reflects a "balanced" approach to resource allocation or substitution. As one of the preference axioms of taste, however, it is not appropriate for all forms of safety-critical development.
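The strict-convexity condition can be checked directly against any candidate utility. A minimal sketch, assuming for illustration that preference is represented by the product of the two attribute levels (under which mixtures of distinct, weakly preferred bundles are indeed strictly preferred):

```python
# Checking the strict-convexity condition t*x1 + (1-t)*x0 >> x0 for a sample of
# mixing weights t in (0, 1). The utility u(x) = x[0] * x[1] is an assumption
# chosen purely to illustrate a strictly convex preference.

def u(x):
    return x[0] * x[1]

def mix(t, x1, x0):
    return tuple(t * a + (1 - t) * b for a, b in zip(x1, x0))

def strictly_convex_at(x0, x1, steps=9):
    """True if every sampled mixture of x1 and x0 is strictly preferred to x0."""
    if x1 == x0 or u(x1) < u(x0):        # axiom only applies when x1 ≽ x0 and x1 ≠ x0
        return True
    ts = [i / (steps + 1) for i in range(1, steps + 1)]   # t strictly inside (0, 1)
    return all(u(mix(t, x1, x0)) > u(x0) for t in ts)

print(strictly_convex_at((2.0, 1.0), (1.0, 2.0)))   # True: mixtures beat either extreme
```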

24

The Way Forward

• Perhaps I’m missing the point.

• Quantitative analysis unimportant?

• But we keep getting PRA wrong:
  – Formal methods might help?

25

Wider Conclusions

• Question use of convex utility curves:
  – in risk analysis and decision theory;
  – (in stochastic multiplexing, caching, etc.)

• Things are more complex than I thought:
  – subjectivity; ceteris paribus; risk homeostasis.

26

27

National Attitudes to Risk

[Chart: national attitudes to risk for France, Germany, Italy, Portugal and the UK (values between 2.4 and 3.4).]

28

Caveat

• Preference relation orders the consumption set, X.

• Utility functions map preferences onto numeric scale.

• Utility functions “inherit” complete, reflexive, transitive, continuous and strictly monotonic properties.

• Time to examine these assumptions...
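The "inheritance" point can be illustrated by fixing a concrete utility function and checking that the ordering it induces satisfies the earlier axioms by construction (continuity and strict monotonicity would need the full real-valued setting). A minimal sketch, assuming for illustration the utility u(x) = x1 · x2:

```python
# Illustration of the caveat: once preferences are represented by a numeric
# utility, the induced ordering "x ≽ y iff u(x) >= u(y)" is automatically
# complete, reflexive and transitive, because >= on the reals is.
# The utility u(x) = x[0] * x[1] is an assumed example, not from the talk.

from itertools import product

def u(x):
    return x[0] * x[1]

def prefers(x, y):
    return u(x) >= u(y)

bundles = [(1, 1), (1, 2), (2, 1), (2, 2)]

complete = all(prefers(x, y) or prefers(y, x) for x, y in product(bundles, bundles))
reflexive = all(prefers(x, x) for x in bundles)
transitive = all(prefers(x, z)
                 for x, y, z in product(bundles, bundles, bundles)
                 if prefers(x, y) and prefers(y, z))

print(complete, reflexive, transitive)   # True True True
```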