    Risk Analysis, Vol. 27, No. 2, 2007 DOI: 10.1111/j.1539-6924.2007.00883.x

On the Ethical Justification for the Use of Risk Acceptance Criteria

    Terje Aven

To protect people from hazards, the common safety regulation regime in many industries is based on the use of minimum standards formulated as risk acceptance or tolerability limits. The limits are seen as absolute, and in principle these should be met regardless of costs. The justification is ethical: people should not be exposed to a risk level exceeding certain limits. In this article, we discuss this approach to safety regulation and its justification. We argue that the use of such limits is based on some critical assumptions: that low accident risk has a value in itself, that risk can be accurately measured, and that the authorities specify the limits. However, these assumptions are not in general valid, and hence the justification of the approach can be questioned. In the article, we look closer into these issues, and we conclude that there is a need for rethinking this regulation approach; its ethical justification is not stronger than for alternative approaches. Essential for the analysis is the distinction between ethics of the mind and ethics of the consequences, which has several implications that are discussed.

KEY WORDS: Bayesian approach; ethical justification; risk acceptance criteria; risk perspectives; risk regulations

    1. INTRODUCTION

In this article, we are concerned about the justification of the use of risk acceptance criteria, and as an illustrative example, let us consider the Norwegian offshore oil and gas activities and the regulations on health, environment, and safety (HES) issued by the Norwegian Petroleum Safety Authority (PSA, 2001). The regulations include a number of specific requirements related to HES, for example, specifying the capacity of the firewalls protecting the living quarters. Most requirements are of a functional form, saying what to achieve rather than the solution to implement. In addition to such requirements, the PSA regulations require that the operators specify risk acceptance criteria for major accident risk and environmental risk. Risk acceptance criteria mean the upper limit of acceptable risk relating to major accidents and risk relating to the environment. Risk acceptance criteria shall be set for the personnel on the facility as a whole and for groups of personnel that are particularly risk exposed, for pollution from the facility, and for damage done to a third party. The risk acceptance criteria shall be used in assessing results from the quantitative risk analyses.

University of Stavanger, Risk Management, Stavanger, Norway; [email protected].

Beyond this level, defined by these requirements, the specific requirements, and the risk acceptance criteria, risk shall be further reduced to the extent possible, i.e., the ALARP principle applies. Hence we may see the specific requirements and the risk acceptance criteria as minimum requirements to be fulfilled by the operators. The justification of these minimum requirements concerning people and the environment is ethical: people and the environment should not be exposed to a risk level exceeding certain limits. Having established such minimum requirements, the authorities' supervision can be carried out by checking that these requirements are met. Hence the authorities are in a position to conclude whether the HES level is acceptable or not, depending on the fulfillment of these requirements. Of course, in practice the extent to which the implementation of the ALARP principle is performed is also an issue, but as there are no strict limits to look for, the supervision of the implementation of this principle is more difficult.

The purpose of this article is to discuss the ethical justification of such a regulation regime based on the use of minimum requirements, in the form of specific requirements for arrangements and risk acceptance criteria. The emphasis is on the risk acceptance criteria. Does this regime have a stronger ethical justification than other regimes that do not include risk acceptance criteria as a part of the framework? What conditions need to be fulfilled to obtain such a justification? In the Norwegian offshore industry the operators define the risk acceptance criteria. Would that violate the basic idea of minimum requirements, as the operators could specify criteria that in practice are always met?

The purpose of the article is not to conclude what safety regulation regime is preferable. It is beyond the scope of this article to discuss in detail all the pros and cons of the alternative regimes. It is, however, clear from Aven and Vinnem (2005) that we are in favor of a regulation regime without the use of predefined risk acceptance criteria, but we will not repeat here the full argumentation for this view. The present article focuses on the ethical dimension, which is not addressed by Aven and Vinnem (2005). There seems to be a prevailing perspective among many regulators that the use of risk acceptance (tolerability) limits has a superior ethical position compared to other regimes. The aim of this article is to show that such a stand cannot be justified. It is not possible to distinguish between the various regimes using ethics as a criterion. Other arguments need to be put forward to determine the preferred regime, as done by, e.g., Aven and Vinnem (2005).

When discussing the ethical justification for the various regulation regimes we have to distinguish between various types of ethics. Two basic directions are (e.g., Hovden, 1998; Cherry & Fraedrich, 2002):

1. Ethics of the mind: an action is justified by reference to its purpose, meaning, or intention.

2. Ethics of the consequences: focusing on the good or bad results of an action. The rightness of an action is totally determined by the consequences of the action.

These types of ethics are labeled as deontological and teleological theories, respectively. A variant of the teleological theories is utilitarianism, which searches for alternatives with the best balance of good over evil. The use of cost-benefit analysis may be considered a way of making the theory operational. The deontological theories stress that the rightness of an act is not determined by its consequences. Certain actions are correct in and of themselves because they stem from fundamental obligations and duties (Cherry & Fraedrich, 2002).

A regime based on requirements of HES and the use of risk acceptance criteria as required by the PSA is often linked to the former type, the ethics of the mind, whereas the use of the ALARP principle is linked to the latter type, the ethics of the consequences. The point is that in the former case, the requirements should in principle be fulfilled without a reference to other attributes such as costs, whereas in the latter case, the decision making is based on a consideration of all consequences of the possible alternatives.

However, a further look into this way of reasoning shows that it is problematic. When it comes to safety, what are the consequences: the expected outcomes from an activity assigned by some analysts, or the real outcomes generated by the activities? And how do we measure the value of these consequences? For example, how good is a cost reduction compared to a reduction in the safety level? This is discussed in more detail in the article, based on different ethical stands: the duty and the utility stands, but also the justice and discourse stands. The justice approach to ethics focuses on how fairly or unfairly our actions distribute benefits and burdens among the members of a group: people should be treated the same unless there are morally relevant differences between them. The discourse stand is based on a search for consensus through open, informed, and democratic debate (Hovden, 1998).

We bring new light to this discussion by making a sharp distinction between the possible outcomes, the uncertainty assessments of what will be the outcomes, and our valuation of the outcomes and quantities expressed through the uncertainty assessments.

Compared to much of the existing literature in this field, our discussion has a higher level of precision on the way uncertainty, probability, and expected values are understood and measured. Such a precision level is required to give the necessary sharpness on what risk acceptance criteria mean and how the criteria are and can be used. The perspective on risk influences the way risk acceptance criteria are understood and used. We distinguish between a traditional (classical) perspective and a Bayesian perspective. We refer to the Appendix for a short review of the most common risk perspectives.

We study an activity, typically associated with the operation of a complex technological system involving high risks, where risk is defined by the combination of possible consequences and related probabilities (or uncertainties when adopting the Bayesian perspective) (Aven, 2003; Aven & Kristensen, 2005). Our main concern is individual risk, but we also briefly address environmental risk. Our starting point is the petroleum industry, but the discussion is to a large extent general and applicable to other areas as well.

Ethical justification for safety investment has been thoroughly discussed in the literature. In addition to the above cited reference of Hovden, we refer to HSE (2001), Hattis and Minkowitz (1996), Hokstad and Steiro (2005), and Shrader-Frechette (1991). However, a comparison of the ethical justification of different regulation regimes, with and without the use of risk acceptance criteria, does not seem to have been carried out.

The article is organized as follows. In Section 2 we review the concepts of risk acceptance criteria and tolerability limits. In Section 3 we analyze how the risk perspective influences the ethical stands. Then, in Section 4 we discuss the ethical justification for the use of such criteria and the other specific requirements. Section 5 concludes.

2. RISK ACCEPTANCE (TOLERABILITY) LIMITS

A risk acceptance criterion states what is deemed as an unacceptable risk level, and in this context it is restricted to an upper limit of acceptable risk. Two examples of such criteria are

1. The FAR value should be less than 10 for all personnel of the group, where the FAR value is defined as the expected number of fatalities per 100 million exposed hours (or any other relevant reference unit).

2. The individual probability that a person is killed in an accident during one year should not exceed 0.1%.
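To make these two example criteria concrete, the following sketch (in Python, with figures invented for illustration and not taken from the article or any actual risk analysis) shows how a FAR value could be computed from an assessed expected number of fatalities and the number of exposed hours, and how both criteria could be checked against the stated limits.

```python
# Illustrative sketch only; all figures are invented for the example and are
# not taken from the article or from any actual risk analysis.

def far_value(expected_fatalities: float, exposed_hours: float) -> float:
    """FAR: expected number of fatalities per 100 million exposed hours."""
    return expected_fatalities * 1e8 / exposed_hours

# Assumed (hypothetical) results for a personnel group.
expected_fatalities_per_year = 0.02       # assigned/estimated in the risk analysis
exposed_hours_per_year = 8_760_000        # e.g., 1,000 persons x 8,760 hours each

far = far_value(expected_fatalities_per_year, exposed_hours_per_year)
individual_annual_fatality_prob = 0.0004  # assigned for a specific person

# Criterion 1: FAR < 10; Criterion 2: individual annual probability <= 0.1%.
print(f"FAR = {far:.2f} (criterion 1 met: {far < 10})")
print(f"Individual risk = {individual_annual_fatality_prob:.2%} "
      f"(criterion 2 met: {individual_annual_fatality_prob <= 0.001})")
```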

Sometimes the term risk tolerability limit is used instead of risk acceptance criterion. The need for risk-reducing measures is assessed with reference to these criteria. The criteria are used in combination with quantitative risk analyses, the method used to check whether the criteria are met or not.

Risk acceptance criteria (tolerability levels) may be considered a tool for the regulators to ensure a minimum level of safety. The tool is often seen in relation to the ALARP principle, following a three-region approach:

1. The risk is so low that it is considered negligible;

2. The risk is so high that it is intolerable;

3. An intermediate level where the ALARP principle applies.
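As a simple illustration of this three-region logic, the following sketch classifies an assessed individual annual fatality probability into one of the regions. The threshold values are hypothetical; the article does not prescribe any particular limits.

```python
# Hypothetical threshold values for illustration only; actual limits are set
# by regulators/operators and are not prescribed in the article.
NEGLIGIBLE_LIMIT = 1e-6   # below this annual fatality probability: Region 1
INTOLERABLE_LIMIT = 1e-3  # above this annual fatality probability: Region 2

def alarp_region(annual_fatality_probability: float) -> str:
    """Classify an assessed individual risk into the three-region scheme."""
    if annual_fatality_probability < NEGLIGIBLE_LIMIT:
        return "Region 1: negligible"
    if annual_fatality_probability > INTOLERABLE_LIMIT:
        return "Region 2: intolerable"
    return "Region 3: ALARP - identify and assess risk-reducing measures"

for p in (5e-7, 2e-4, 3e-3):
    print(f"p = {p:.0e}: {alarp_region(p)}")
```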

In most cases in practice, risk is found to be in Region 3 (the ALARP region) and the ALARP principle is adopted, which means that an ALARP evaluation process is required. This will include a dedicated search for possible risk-reducing measures, and a subsequent assessment of these in order to determine which are to be implemented.

The ALARP principle normally applies in such a way that the higher a risk is, the more employers are expected to spend to reduce it. At high risks, close to the level of intolerability, they are expected to spend up to the point where further expenditure would be grossly disproportionate to the risk; i.e., the costs and/or operational disturbances are out of proportion to the risk reduction. This is in general considered to be a reasonable approach, as higher risks call for greater spending. More money should be spent to reduce the risk if the risk is just below the intolerability level than if the risk is far below this level.

Risk acceptance criteria and the ALARP principle are reviewed and discussed in, e.g., Hokstad et al. (2004), Aven and Vinnem (2005), Fischhoff et al. (1981), UKOOA (1999), Melchers (2001), Pape (1997), Schofield (1998), Lind (2002), Rimington et al. (2003), and Skjong and Ronold (2002).

3. THE INFLUENCE OF THE RISK PERSPECTIVES ADOPTED

Consider the regulation of an activity involving a potential for hazardous situations and accidents leading to loss of lives and injuries. From an ethical point of view, we would require no fatalities and no injuries. This is ethics of the mind: no one should be killed or be injured in his or her job. However, in practice no one can guarantee 100% safety, and alternatives are sought. Examples include

1. the individuals feel safe;

2. the individual risk is sufficiently low;

3. the calculated individual risk is sufficiently low;

4. risk is reduced to a level that is as low as reasonably practicable;

5. the uncertainties related to possible situations and events leading to loss of lives and injuries are reduced to a level that is as low as reasonably practicable.

To discuss these goals and criteria, we need to distinguish between different perspectives on risk, as the meaning of these goals and criteria is different depending on the perspective. In this article we restrict attention to two main categories of perspectives:

1. a traditional (classical) approach to risk and risk analysis, and

2. a Bayesian perspective.

Either one starts from the idea that risk (probability) is an objective quantity and this risk has to be estimated, or one starts from the idea that risk (probability) is a subjective measure of uncertainty as seen through the eyes of the analyst. The former case, which is referred to as the traditional or the classical view, means that risk is a fictional quantity, expressing, e.g., the proportion of fatal accidents in an infinite reference population of similar situations. The latter case is referred to as Bayesian, and has no reference to such an underlying population. Note that there exist many variations of the Bayesian paradigm; here we use the term when probability is used as a subjective measure of uncertainty; cf. the Appendix and Aven (2003). There are also paradigms that link the classical approach and the Bayesian approach. An example is the probability of frequency approach; see Kaplan (1992) and Aven (2003). We will, however, not go further into these paradigms in this article. Our discussion below for the classical approach also applies to the probability of frequency approach.

3.1. A Traditional Approach to Risk and Risk Analysis

We first look at the case when risk acceptance criteria are used.

Traditionally, risk has been seen as an objective property of the activity being studied, and hence there exists an objective real individual risk expressing the probability that the person is killed or injured. If this probability is low, the person would normally also feel safe. If it can be verified that the real individual risk is below a certain number, the regulator would have ensured that the activity is acceptable from a safety point of view. Accidents leading to fatalities or injuries may occur, but the chance would be small and under control. It is still an argument based on ethics of the mind, as it is grounded on a reflection of what is right and not linked to the possible consequences of the action. A typical value used for individual risk is 0.1%, meaning that there should be a maximum of 0.1% probability that a specified individual is killed due to an accident during one year. This number is used with no reference to the consequences it would induce related to, for example, costs.

The idea that such an objective risk exists is the basis for the regulations in many countries. It is seldom or never explicitly stated, but it is clear from the way the regulations are formulated that such a perspective on risk is adopted.

As an alternative to the above regime based on risk acceptance limits, consider a regulation regime based on the same principles 1-5 above, but with no use of predefined risk acceptance criteria. The justification for such a regime would be partly ethics of the mind and partly ethics of the consequences:

1. Ethics of the mind: the basic idea, what is a correct risk level for the individual, has to be seen in a broader context taking into account what he or she, and others, gain by the activity. A low accident risk has no value in itself.

2. Ethics of the consequences: the specific choice, the action or decision, needs to reflect what the possible consequences are. For example, an alternative may be generated that leads to high risks for some but extremely positive benefits for others, and the risks can be compensated by salary and insurance.

Of course, even in the case of risk acceptance criteria, the ethical justification is partly teleological: we have to look at the consequences. Requiring a risk level equal to, say, 0.01% would have severe consequences and in many cases mean that activities are not being performed, etc. Adopting the traditional level 0.1%, it is known from many years of experience of using this criterion that it is met for most or nearly all types of activities in the Western world.

Addressing a new type of situation, where we have no or little experience from previous studies, it is difficult to specify the risk acceptance criteria. We simply do not know the consequences. An example would be a unique operation of great importance. Then, using the ethics of the mind to specify a certain limit would be difficult to justify as the consequences need to be addressed.

The classical approach is based on the idea that an objective risk exists, but in practice we have to estimate this risk, and this estimate would normally be subject to large uncertainties. And this uncertainty needs to be taken into account. Using a risk acceptance criterion of the form 0.1% and adopting a classical view means that uncertainties in the risk analysis estimate need to be addressed. The true risk number could be significantly different from the estimate. Hence by adopting a regime based on risk acceptance criteria, no minimum requirements have been established, as meeting the 0.1% level is not saying that the true risk meets this level. We may try to express the uncertainties of the estimates, but that leads to so complex an analysis and so wide uncertainty intervals in most real-life cases that the whole idea of using risk analysis and risk acceptance criteria breaks down; see Aven (2003).

An alternative is to refer to procedure 3 above: specify a limit for the calculated individual risk. However, this would not be satisfactory as there is no guarantee that the real risk is under control.

    3.2. A Bayesian Perspective

Adopting the Bayesian view, an objective individual risk does not exist. Using a risk acceptance criterion of the form 0.1% means that the risk analyst's assessment concludes that risk is acceptable or unacceptable, depending on the result of the analysis. However, different assessments could produce different numbers depending on the assumptions made and the analysts chosen for the job.

The idea of minimum requirements defined by the risk acceptance criteria seems to lose its meaning when risk does not exist as an objective quantity. But a further analysis reveals that the Bayesian perspective is not that different compared to the classical approach, acknowledging that in the classical case we have to deal with risk estimates and in the Bayesian case with subjective assignments. It is possible to use risk acceptance criteria also in the Bayesian case, interpreting the criteria as limits against which the risk assignments are compared. Except for criterion 2, stated in Section 3.1, we can implement the others, i.e., 1 and 3-5, with and without predefined risk acceptance criteria. The ethical justification would be as in the classical case, interpreting risk according to the Bayesian perspective.

    We will discuss this in more depth in the followingsection.

    4. DISCUSSION

It is obvious from the above considerations that the results generated by the risk analysis need to be seen in a broader context, taking into account that the risk analysis depends on the assumptions made, the analysts performing the analysis, etc. An ethical stand, based on ethics of the mind, for adopting a predefined level can still be put forward, but the limitations of the analyses weaken its position. We may formulate a risk acceptance limit as a minimum requirement, but the tool to be used to check its fulfillment does not have the accuracy or precision needed. To cope with this lack of accuracy or precision, we could specify a rather strong requirement, say 0.01%, and use the analysis to check that this level is fulfilled as a guarantee for the real risk to be lower than 0.1% (say) in the classical case, or as a guarantee that different analyses would all ensure a level of 0.1% (say) in the Bayesian case. However, such a strong limit would not be used, as the consequences could easily become unacceptable, as discussed in Section 3.1. Instead, a weak limit would be preferred, such as 0.1%, and then the calculated risk would nearly always meet this limit. A minimum safety level is then established, but this level is so weak that it seldom or never is being applied. A lot of energy and resources are used to verify that these limits are met, which is not very cost efficient as the results are obvious in nearly all cases.

The use of conservative assumptions, leading to overestimation of risk or higher risk assignments than the best judgments made by the analysts, is often seen in practice. However, this does not add anything new to the above reasoning, except that such a procedure could simplify the analyses. If the criterion is not met in a first run of a risk analysis, it is necessary to perform a further detailing and remove some of the conservative assumptions, which normally leads to acceptance.

At the point of decision making, the consequences X, representing, for example, the number of fatalities, are unknown. Expectations, E[X], may be calculated and uncertainties assessed, but there is a fundamental difference between the real outcomes and the predictions and uncertainty assessments. The fact that we do not know the outcomes means that we cannot just apply the ethics of the consequences. In the case of large uncertainties in the phenomena being studied, the weight on the ethics of the mind would necessarily also be large.

We can calculate individual death probabilities and expected net present values in a cost-benefit analysis; however, there would be a need for seeing beyond these calculations, as they are based on a number of assumptions. How to deal with the uncertainties has to have a strong component of ethics of the mind. We are led to the adoption of principles such as the cautionary principle, saying that in the face of uncertainties, caution should be a ruling principle, and the precautionary principle, saying that in the case of lack of scientific certainty about the consequences, the activity should be avoided or measures implemented; see Lofstedt (2003), HSE (2001), and Aven (2006). These principles are primarily principles of ethics of the mind. They are of course related to the consequences in the sense that they are implemented to avoid negative consequences, but the basic ideas of using these principles are founded in a belief that in the face of uncertainties, caution and precaution should be the ruling policy; you should not gamble with lives.

Adopting a classical perspective on risk, we would add that uncertainties in the risk estimates provide another reason for adopting the cautionary and precautionary principles, and for emphasizing the ethics of the mind. In a way these uncertainties are mind-constructed uncertainties, as the objective underlying risks are mind-constructed quantities, and in this sense the ethics of the mind may be given a too strong weight compared to the ethics of the consequences.

To evaluate the uncertainties, risk analysis constitutes a key instrument, but risk perception also plays a role. If people perceive the risks and uncertainties related to a phenomenon as high, it could influence the decision-making process and the weighting of the various concerns. However, taking risk perception into account in the decision-making process does not necessarily mean that more emphasis is placed on the ethics of the mind, relative to the ethics of the consequences, as risk perception is also a consequence or an outcome of the actions and measures considered. Depending on the perspective on risk adopted, risk perception provides to a varying degree relevant information about the consequences. If risk is considered an objective property of the system being analyzed, risk perception would in general be given less attention than if risk is a subjective measure of uncertainty. Note that risk perception is not the same as a subjective probability (risk). Subjective probabilities may be a basis for the risk perception, but risk perception also incorporates emotional aspects such as fear and anxiety.

So far we have focused on individual safety. Now some words about environmental issues. Here, the ideal would be no damage to the environment. Since this ideal cannot be achieved fully in most cases, the concepts of risk and uncertainty need to be addressed. The first issue we then would like to discuss is whether a low environmental risk level has a value in itself. Clearly, a life has a value in itself, and most people would conclude that the environment has a value in itself. But the value is not necessarily very large. If we have to choose between producing a certain type of units, causing some risk of pollution, or not producing them, we often accept the risk of pollution. The benefits outweigh the negative consequences. We adopt ethics of the consequences. However, as for lives, we also use ethics of the mind, in the face of risks and uncertainties, as we cannot picture the exact consequences of an action. We apply the cautionary and the precautionary principles. The discussion of using risk acceptance criteria or not would be analogous to the discussion for the individual risk.

In general, one would expect that the regulators put more emphasis on the uncertainties and the ethics of the mind, compared to the industry, as the regulators necessarily have a broader societal perspective. This creates a dilemma. Modern safety management is based on the use of the internal control principle, saying that industry has full responsibility for its activities. In the Norwegian oil industry this principle is implemented, and the oil companies specify the risk acceptance criteria. However, the primary goal of the industry is profit. The drive for profit means that safety is optimized to some extent. Maybe other words are used officially, but in practice the industry would seek to avoid unnecessary constraints in the optimization process, and hence reduce the criteria based on ethics of the mind to a minimum. If the regulations require such criteria, the result would be the implementation of very weak limits, such that the criteria do not induce any practical constraints.

Thus the regulators need to specify the criteria if they would like to implement a certain safety standard in the industry. To some extent this is being done in the offshore industry. For example, the Norwegian and U.K. petroleum authorities have defined upper limits for the frequencies of impairment of specific safety functions; see, e.g., Aven and Vinnem (2005). If not forced by the regulator, one should not expect that the industry would define risk acceptance criteria beyond these limits, as such criteria could be considered in conflict with the primary goals of the industry. The ethics of the consequences would necessarily rule.

Finally, some words about the justice and discourse stands.

According to the justice approach to ethics, people should be treated the same unless there are morally relevant differences between them. An application of this principle to safety and risk in society and industry is meaningless, as safety/risk is just one out of many attributes that define welfare and is relevant in the decision-making process. Specifying some minimum safety standards does not imply a full implementation of the principle, but provides some constraints for the optimization. However, how these minimum safety standards should be defined cannot be deduced from an ethical principle. The use of risk acceptance criteria is one way of making such standards operational, but there are other approaches that could be used as well, as discussed above.

The discourse stand is based on a search for consensus through open, informed, and democratic debate. Many aspects of this principle are to a large extent implemented in the Western world through modern regulation and management regimes, emphasizing involvement and dialogue. Applying this principle is mainly based on the ethics of the mind, as it is believed that the principle is the right way of dealing with risks and uncertainties.

    5. CONCLUSIONS

Many people, in particular safety people, mock the utility principle; see, e.g., Hovden (1998). What they often do is to argue against the practical tool for implementing the principle, the cost-benefit analyses. And that is easy. Any tool being used for balancing pros and cons would have strong limitations and be based on a number of assumptions. Hence there is a need for seeing beyond the tools. We need some managerial review and judgment that opens up for a broader perspective, reflecting the limitations and assumptions of the tools, and all the ethical concerns that need to be taken into account when making decisions in the face of uncertainties.

The utility stand (ethics of the consequences) would be important because we need to balance the pros and cons. However, it is not possible to make this principle operational without also reflecting the other ethical stands (ethics of the mind, justice, and discourse). There should be no discussion on this. What can and should be debated is the balance of the various principles and concerns. For example, in the case of helicopter shuttling between offshore installations, what should be an acceptable safety or risk level for the workers? Should we put some limitations on the number of flights to control the safety level for the personnel? Yes, in practice this is done, and it seems ethically correct. The argument is of course based on considerations of the consequences, but also ethics of the mind and the justice stand: the workers should be ensured a certain safety level. As a result of the regulation and management regime, processes are implemented involving the workers in specifying this level. To the extent possible, consensus is sought.

Observe that the specification of such a safety level (e.g., expressed by a maximum number of flights n) is not the same as using a predetermined risk acceptance criterion, as discussed above. The decision to be taken is to determine n, and for that purpose a procedure emphasizing the generation of alternatives (different n values) and assessment of the associated consequences may be adopted. This process generates a specific solution, the proper n. The shuttling risk analysis is used as a basis for specifying this n.

Of course, one may decide that the associated risk should be reformulated as a risk acceptance criterion to be used for other applications. If that is the case, the discussion in Sections 3 and 4 applies. The ethics of the mind is highlighted relative to the ethics of the consequences. However, as discussed in Sections 3 and 4, it is possible to also formulate procedures according to the ALARP principle that are strongly motivated by the ethics of the mind; the point is that significant uncertainties in the consequences cannot be adequately handled by standard cost-benefit analyses. Applications of the cautionary and precautionary principles are required.

As discussed in this article, we see no stronger ethical arguments for using predefined risk acceptance criteria compared to other regimes. There are obviously arguments for using and not using any of the above regimes, but these are not primarily of an ethical character. To decide which regime to implement, ethical considerations should of course be taken into account, but such a decision has to be put into a wider context, reflecting the practical implementation of the regimes and how to understand and deal with risk and uncertainty. Most people would agree that the chosen regime must balance a number of concerns and ethical perspectives. The aim of this article has been to contribute to clarification of this context and provide an improved basis for performing this balance.

    ACKNOWLEDGMENTS

This work is a part of a project on HES in the petroleum industry sponsored by the Norwegian Research Council. The work has been inspired and partially sponsored by a project run by the Petroleum Safety Authority on decision support for HES regulations. The financial support is gratefully acknowledged.

The author is grateful to Kjell Hausken, Gerhard Ersdal, and Willy Røed, as well as three anonymous reviewers, for useful comments and suggestions on an earlier version of the article.

APPENDIX: ALTERNATIVE PERSPECTIVES ON RISK

Probably the most common definition of risk is that risk is the combination of probability and consequences, where the consequences relate to various aspects of HES, e.g., loss of lives and injuries. This definition is in line with the one used by ISO (2002). However, it is also common to refer to risk as probability multiplied by consequences (losses), i.e., what is called the expected value in probability calculus. If the focus is the number of fatalities during a certain period of time, X, then the expected value is given by E[X], whereas risk defined as the combination of probability and consequence expresses probabilities for different outcomes of X, for example, the probability that X does not exceed 10. Adopting the definition that risk is the combination of probability and consequence, the whole probability distribution of X is required, whereas the expected value just refers to the center of gravity of this distribution. In the scientific risk discipline there is a broad consensus concluding that risk cannot be restricted to expected values. We need to see beyond the expected values, e.g., by expressing the probability of a major accident having a number of fatalities.
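The point that the expected value E[X] only reflects the center of gravity of the distribution, while risk as the combination of probability and consequence requires the whole distribution, can be illustrated by a small numerical sketch (the two distributions below are invented for the example):

```python
# Two hypothetical distributions for the number of fatalities X, invented for
# illustration: the same expected value, very different accident potential.
dist_a = {0: 0.99, 1: 0.01}          # E[X] = 0.01, no major-accident potential
dist_b = {0: 0.9999, 100: 0.0001}    # E[X] = 0.01, small chance of 100 fatalities

def expected_value(dist):
    # Center of gravity of the distribution.
    return sum(x * p for x, p in dist.items())

def prob_exceeding(dist, threshold):
    # Probability that X exceeds the threshold, e.g., P(X > 10).
    return sum(p for x, p in dist.items() if x > threshold)

for name, dist in (("A", dist_a), ("B", dist_b)):
    print(f"Activity {name}: E[X] = {expected_value(dist):.4f}, "
          f"P(X > 10) = {prob_exceeding(dist, 10):.4f}")
```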

Hence we are led to a definition saying that risk is the combination of probability and consequence. But what is a probability? There are different interpretations. Here are the two main alternatives:

1. A probability is interpreted in the classical statistical sense as the relative fraction of times the events occur if the situation analyzed were hypothetically repeated an infinite number of times. The underlying probability is unknown, and is estimated in the risk analysis.

2. Probability is a measure for expressing uncertainties about what will be the outcomes (consequences), seen through the eyes of the assessor and based on some background information and knowledge. Probability is a subjective measure of uncertainty related to the occurrence of an event.

Following definition (1), we produce estimates of the underlying true risk. This estimate is uncertain, as there could be large differences between the estimate and the correct risk value. Aspects of the uncertainties (the statistical variation) may be expressed by confidence intervals. Alternatively, the uncertainties may be assessed using subjective probabilities (the probability of frequency approach; Kaplan, 1992; Aven, 2003).
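A minimal sketch of this classical setting, under assumed data: the accident rate is treated as a fixed but unknown quantity, estimated from observed counts, and the statistical variation is expressed by an approximate confidence interval (a simple normal approximation to the Poisson count is used here; the numbers are invented).

```python
import math

# Hypothetical observations: 4 accidents over 2,000,000 exposure hours.
accidents_observed = 4
exposure_hours = 2_000_000

# Classical point estimate of the (unknown, assumed objective) accident rate.
rate_estimate = accidents_observed / exposure_hours  # accidents per hour

# Approximate 95% confidence interval based on a normal approximation to the
# Poisson count; with so few events the interval is wide, illustrating how
# uncertain the estimate of the "true" risk can be.
half_width = 1.96 * math.sqrt(accidents_observed) / exposure_hours
lower = max(0.0, rate_estimate - half_width)
upper = rate_estimate + half_width

print(f"Estimated rate: {rate_estimate:.2e} per hour")
print(f"Approx. 95% CI: [{lower:.2e}, {upper:.2e}] per hour")
```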

Following interpretation (2), we assign a probability by performing uncertainty assessments, and there is no reference to a correct probability. There are no uncertainties related to the assigned probabilities, as they are expressions of uncertainties.

Depending on the risk perspective, one is led to different ways of thinking when it comes to risk analysis and assessments, risk acceptance, etc.; see Aven (2003).

There exist a number of other perspectives on risk than those mentioned earlier. Below some of these are summarized (Pidgeon & Beattie, 1998; Okrent & Pidgeon, 1998; Aven, 2003):

1. In psychology there has been a long tradition of work that adopts the perspective on risk that uncertainty can be represented as an objective probability. Here, researchers have sought to identify and describe people's (lay people's) ability to express level of danger using probabilities, and to understand which factors influence the probabilities. A main conclusion is that people are poor assessors if the reference is a real objective probability value, and that the probabilities are strongly affected by factors such as dread.

2. Economists usually see probability as a way of expressing uncertainty of what will be the outcome, often seen in relation to the expected value. The variance is a common measure of risk. Both the interpretations (1) and (2) are applied, but in most cases without making it clear which interpretation is used. In economic applications a distinction has traditionally been made between risk and uncertainty, based on the availability of information. Under risk the probability distribution of the performance measures can be assigned objectively, whereas under uncertainty these probabilities must be assigned or estimated on a subjective basis (Douglas, 1983). This latter definition of risk is seldom used in practice.

3. In decision analysis, risk is often defined as minus expected utility, i.e., -E[u(X)], where the utility function u expresses the assessor's preference function for different outcomes X (a small numerical sketch is given after this list).

4. Social scientists often use a broader perspective on risk. Here, risk refers to the full range of beliefs and feelings that people have about the nature of hazardous events, their qualitative characteristics and benefits, and, most crucially, their acceptability. Such a definition is seen as useful if lay conceptions of risk are to be adequately described and investigated. The motivation is the fact that there is a wide range of multidimensional characteristics of hazards, rather than just an abstract expression of uncertainty and loss, which people evaluate in forming perceptions, such that the risks are seen as fundamentally and conceptually distinct. Furthermore, such evaluations may vary with the social or cultural group to which a person belongs and the historical context in which a particular hazard arises, and may also reflect aspects of both the physical and human or organizational factors contributing to hazard, such as the trustworthiness of existing or proposed risk management.

5. Another perspective, often referred to as cultural relativism, expresses that risk is a social construction and it is therefore meaningless to speak about objective risk.
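For item 3 above, a small numerical sketch with an invented outcome distribution and utility function shows how risk would be computed as minus expected utility, -E[u(X)]:

```python
# Hypothetical outcome distribution for the number of fatalities X and an
# assumed utility function; both are invented, the article only states the definition.
outcomes = {0: 0.95, 1: 0.04, 10: 0.01}   # P(X = x)

def utility(x: float) -> float:
    """Assumed assessor preference: worse outcomes get lower (more negative) utility."""
    return -float(x) ** 1.5   # convex disutility in the number of fatalities

expected_utility = sum(p * utility(x) for x, p in outcomes.items())
risk = -expected_utility      # decision-analysis definition: risk = -E[u(X)]

print(f"E[u(X)] = {expected_utility:.3f}, risk = -E[u(X)] = {risk:.3f}")
```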

There also exist perspectives that intend to unify some of the perspectives above; see, e.g., Rosa (1998), Aven (2003), and Aven and Kristensen (2005). One such perspective, the predictive Bayesian approach (Aven, 2003), is based on interpretation (2), and makes a sharp distinction between historical data and experiences, future quantities of interest such as loss of lives, injuries, etc. (referred to as observables), and predictions and uncertainty assessments of these. The thinking is analogous to cost risk assessments, where the costs, the observables, are predicted, and the uncertainties of the costs are assessed using probabilistic terms. Risk is then viewed as the combination of possible consequences (outcomes) and associated uncertainties. This definition is in line with the definition adopted by the U.K. government; see Cabinet Office (2002, p. 7). The uncertainties are expressed or quantified using probabilities, as in (2) above. Hence in the case of quantification, this definition reduces to the one commonly used: risk defined as the combination of possible consequences and probabilities. Using such a perspective, with risk seen as the combination of consequences and associated uncertainties (probabilities), a distinction is made between risk as a concept and terms like risk acceptance, risk perception, risk communication, and risk management, in contrast to the broad definition used by some social scientists in which this distinction is not clear.

    REFERENCES

Aven, T. (2003). Foundations of Risk Analysis. New York: John Wiley.

Aven, T. (2006). On the precautionary principle, in the context of different perspectives on risk. Risk Management: An International Journal, 8, 192-205.

Aven, T., & Kristensen, V. (2005). Perspectives on risk: Review and discussion of the basis for establishing a unified and holistic approach. Reliability Engineering and System Safety, 90, 1-14.

Aven, T., & Vinnem, J. E. (2005). On the use of risk acceptance criteria in the offshore oil and gas industry. Reliability Engineering and System Safety, 90, 15-24.

Cabinet Office. (2002). Risk: Improving Government's Capability to Handle Risk and Uncertainty. Strategy Unit report. UK.

Cherry, J., & Fraedrich, J. (2002). Perceived risk, moral philosophy and marketing ethics: Mediating influences on sales managers' ethical decision-making. Journal of Business Research, 55, 951-962.

Douglas, E. J. (1983). Managerial Economics: Theory, Practice and Problems, 2nd ed. Englewood Cliffs, NJ: Prentice Hall.

Fischhoff, B., Lichtenstein, S., Slovic, P., Derby, S., & Keeney, R. (1981). Acceptable Risk. New York: Cambridge University Press.

Hattis, D., & Minkowitz, W. S. (1996). Risk evaluation: Criteria arising from legal traditions and experience with quantitative risk assessment in the United States. Environmental Toxicology and Pharmacology, 2, 103-109.

Hokstad, P., & Steiro, T. (2005). Overall strategy for risk evaluation and priority setting of risk regulations. Reliability Engineering and System Safety, 91, 100-111.

Hokstad, P., Vatn, J., Aven, T., & Sørum, M. (2004). Use of risk acceptance criteria in Norwegian offshore industry: Dilemmas and challenges. Risk, Decision and Policy, 9, 193-206.

Hovden, J. (1998). Ethics and safety: Mortal questions for safety management. Paper presented at the conference Safety in Action, Melbourne.

HSE. (2001). Reducing Risks, Protecting People: HSE's Decision-Making Process [R2P2]. HSE Books. http://www.hse.gov.uk/dst/r2p2.pdf.

ISO. (2002). Risk management vocabulary. ISO/IEC Guide 73.

Kaplan, S. (1992). Formalism for handling phenomenological uncertainties: The concepts of probability, frequency, variability, and probability of frequency. Nuclear Technology, 102, 137-142.

Lind, N. (2002). Social and economic criteria of acceptable risk. Reliability Engineering and System Safety, 78, 21-26.

Lofstedt, R. E. (2003). The precautionary principle: Risk, regulation and politics. Transactions of IChemE, 81, 36-43.

Melchers, R. E. (2001). On the ALARP approach to risk management. Reliability Engineering and System Safety, 71, 201-208.

Okrent, D., & Pidgeon, N. (1998). Special issue on risk perception versus risk analysis. Reliability Engineering and System Safety, 59.

Pape, R. P. (1997). Developments in the tolerability of risk and the application of ALARP. Nuclear Energy, 36(6), 457-463.

PSA (Petroleum Safety Authority). (2001). Regulations Relating to Management in the Petroleum Activities. Available at http://www.npd.no/regelverk/r2002/frame e./htm.

Pidgeon, N. F., & Beattie, J. (1998). The psychology of risk and uncertainty. In P. Calow (Ed.), Handbook of Environmental Risk Assessment and Management (pp. 289-318). London: Blackwell Science.

Rimington, J., McQuaid, J., & Trbojevic, V. (2003). Application of Risk-Based Strategies to Workers' Health and Safety Protection: UK Experience. Ministerie van Sociale Zaken en Werkgelegenheid.

Rosa, E. A. (1998). Metatheoretical foundations for post-normal risk. Journal of Risk Research, 1, 15-44.

Schofield, S. (1998). Offshore QRA and the ALARP principle. Reliability Engineering and System Safety, 61, 31-37.

Shrader-Frechette, K. S. (1991). Risk and Rationality: Philosophical Foundations for Populist Reforms. Berkeley: University of California Press.

Skjong, R., & Ronold, K. O. (2002). So much for safety. Proceedings of OMAE, Oslo, June 23-28, 2002.

UKOOA. (1999). A Framework for Risk Related Decision Support: Industry Guidelines. UK Offshore Operators Association.