
The security and privacy impact of criminalising the distribution of hacking tools

Vasilios Katos and Steven Furnell

Furnell and Katos scrutinise the effect of changes to the Computer Misuse Act on privacy.

Following the recent amendments to computer misuse legislation in the UK, which essentially criminalise the possession of and research into security hacking tools, concerns have been raised by security practitioners and researchers. These concerns revolve around the apparently unfair and disadvantageous position in which the “good guys” will be placed when required to perform security assessments as part of their IT auditing responsibilities. In this paper we investigate the impact this revised legal framework will have on privacy, by adopting market analysis tools ported from the macroeconomics domain. The study concludes that indiscriminate criminalisation of the distribution of hacking tools, irrespective of intention, is not a sound decision: not only will it have a negative impact on privacy, but the price of maintaining privacy at this reduced level will increase.

Introduction

Although it is generally accepted in many domains that prevention is better than cure (e.g. in relation to information systems security, medicine, and so on), establishing the appropriate preventive measures is not always straightforward. Not only may the preventive measures not be completely effective, but they can also introduce new problems or amplify existing ones.



For example, many preventative security controls do not meet acceptable levels of usability.1,2 As a result, introducing a security mechanism with poor usability is likely to meet with user resistance.

The interference and impact of security become even more pronounced when a given security measure affects fundamental aspects of a society, such as privacy. Privacy, being a constitutional right in some countries (see, for example, the Fourth Amendment to the US Constitution), appears to have a synergetic yet competing relationship with security and, more precisely, with the access control aspects of security. Although no privacy can be assumed if there are no access control mechanisms in place, the principle of accountability works against privacy by requiring the disclosure of the user's identity on a particular system or process. Furthermore, privacy is a concept comprising philosophical and technical attributes, making it resistant to a straightforward explanation and definition.3 Other researchers4 adopted a stricter position on this matter by conjecturing that the privacy problem is intractable.

In this turbulent and dynamic environment, where security and privacy are under constant negotiation, policy makers and legislators are challenged to develop organisational safeguards that balance security and privacy. Major security incidents (such as 9/11) push the balance towards security, whereas confidentiality breaches (such as the loss by HM Revenue & Customs of two CDs containing citizens' private data5) justify and support privacy concerns.

This paper explores the likely impact that legislation criminalising the distribution of hacking tools will have on privacy. The methodology used is the one published by Katos and Patel in 2008.6 We argue that this methodology is suitable because it explores the relationship between security and privacy at a macro level and allows one to apply a ceteris paribus assumption to the unknown variables and quantities affecting security and privacy, as explained later on. Methodologies offering a micro treatment of the security and privacy balancing problem include the works of Acquisti et al (2003),7 Acquisti (2004),8 Huang (1998),9 Laudon (1996),10 and Otsuka and Onozawa (2001),11 but these are not applicable here as they cannot examine the overall, aggregate consequences of a modified legal framework.

In this paper we first set out an overview of UK computer misuse legislation. Next we present a brief overview of the methodology and then proceed to utilise it as a basis for two example scenarios. The first scenario involves the straightforward hypothesis of introducing stricter laws against hacking, whereas the second reflects the amendments to computer misuse legislation in the UK, which serve to criminalise the distribution and supply of hacking tools. Having interpreted the results from these scenarios, we present the conclusions and suggestions arising from the study.

An overview of UK computer misuse legislation

The Computer Misuse Act appeared in 1990, in response to rising concerns about issues such as hacking and viruses, and the fact that existing legislation had already proven itself an ineffective basis for handling the problems (the landmark example being the failed attempt to use the Forgery and Counterfeiting Act 1981 to prosecute Steve Gold and Robert Schifreen for their hacking of BT's Prestel system). The Act introduced three offences, as summarised in Table 1, which were subsequently used to provide both a deterrent and a means of prosecuting a variety of cases. Over the years, however, the law remained static, whereas the technologies and attacks that it sought to police continued to evolve, with the consequence that new forms of attack began to emerge for which the original wording of the Act appeared to offer loopholes. A case in point was provided by denial-of-service (DoS) attacks, which can be achieved without involving unauthorised access or modification as defined by the original Act; for example, the operation of a website can be impaired simply by flooding it with a high volume of permitted connections. As a result, the perceived utility of the Act decreased, and by the early part of the decade both the IT and law enforcement communities were claiming that it was unsuited to dealing with a range of problems that had been ushered in by the widespread adoption of the Internet.12 Such concerns led to calls for the law to be updated, and a series of amendments were ultimately introduced by the Police and Justice Act 2006. These are outlined in the right-hand column of Table 1, with the wording of section 3 being broadened to refer to impairment of the system rather than specifically requiring modification, and a new offence being defined in relation to making, supplying or obtaining articles that could be used in computer misuse activities.

Sec. | Original Act | Amended Act
1 | Unauthorised access to computer material | Unauthorised access to computer material
2 | Unauthorised access with intent to commit or facilitate the commission of further offences | Unauthorised access with intent to commit or facilitate the commission of further offences
3 | Unauthorised modification of computer material | Unauthorised acts with intent to impair, or with recklessness as to impairing, operation of computer, etc.
3A | N/A | Making, supplying or obtaining articles for use in offence under section 1 or 3

Table 1: The original and amended sections of the Computer Misuse Act.



Notably, however, the addition of section 3A was met with some concern. Although the additional wording clearly serves to broaden the Act, enabling it to encompass a wider range of offences, it also raises questions about the potential for perfectly legitimate activities to be swept up in the process. Specifically, the Act seems problematic on two grounds:

First, the criminalisation of the possession, fabrication and distribution of hacking tools has a broad remit, leading to problematic definitions and situations when attempting to establish one's (malicious) intent. Indeed, a related concern was expressed by Lord Lawson of Blaby when the amendment was debated in the House of Lords13: “I am concerned about how this wording will be interpreted […] The wording of the Government's amendment is, ‘is likely to be used’, which means anything that is capable of being used. That goes much further than this House should be comfortable with. I hope that the government will therefore give it consideration. With this amendment, they seek to narrow the conditions, but they are not narrowing them at all.”

Second, the likelihood of someone committing an offence is not clarified. Admittedly, establishing likelihood and setting probabilities against the truthfulness of events or “facts” in a legal context has always been a challenging and controversial subject. However, when a guidance document sets out a candidate list of factors influencing likelihood, there are inherent risks of both misinterpretation and subjectivity. Table 2 comments on the purpose, feasibility and effectiveness of the specific factors that the Crown Prosecution Service (CPS) has suggested for establishing the likelihood of a criminal offence being committed.14

As a consequence of these issues, those who might previously have made use of ‘dual use’ tools to legitimately test their own systems may now be more cautious about doing so until case law precedents have been established. As such, although the intention behind criminalising the distribution of tools is to increase security by reducing the potential for offences, there is potential for it to have the opposite effect. In order to explore this issue more formally, the paper proceeds to model the situation based upon an examination of the relationship between security and privacy. This begins in the next section with a description of the underlying modelling approach, followed by its application to the problem domain.

An outline of the methodology

This section summarises the main points of the model. For a more detailed explanation, the reader is referred to Katos and Patel (2008).6

Initially we accept that there is no universally accepted, objective measure of privacy. However, comparative statements can be constructed, for example:

P1: Privacy decreases if I go shopping (as opposed to staying at home)

P2: Privacy decreases more than in P1, if I pay by credit card

P3: Privacy decreases more than in P2, if I apply for a mortgage

It should be obvious that the exact initial level of privacy, as well as the absolute change, cannot be established. This is not only because privacy is qualitative, but also because we have no control over, or knowledge of, all the variables affecting privacy. For example, how many monitoring technologies (such as CCTV elements) have invaded our private space, to whom are the captured data available, and for how long? What is the quality of the captured data, and therefore the likelihood of positive identification? What protection does the legal system provide against third-party enquiries to access the data?

CPS guidance | Comments and examples
“Has the article been developed primarily, deliberately and for the sole purpose of committing a CMA offence (i.e. unauthorised access to computer material)?” | This is, in our opinion, a good indicator. Spear phishing attacks using custom worms attempting to exploit a specific (business) infrastructure would fall under this category.
“Is the article available on a wide scale commercial basis and sold through legitimate channels?” | This seems to have missed legitimate freeware tools.15 Also, establishing the commercial scale of the tool may introduce further challenges and debates on scope, such as local versus global market, etc.
“Is the article widely used for legitimate purposes?” / “Does it have a substantial installation base?” | Answers here would not add significant insight to proving one's intent, as there are dual use tools, such as nmap, and more offensive tools, such as Nessus, which are widely used for both benevolent and malicious purposes.
“What was the context in which the article was used to commit the offence compared with its original intended purpose?” | This is a challenging question and in principle would require the contribution of expert and professional witnesses on a per-case basis. Clayton (2007) reports that the introduction of “context” will cause problems with respect to the interpretation of the dual use nature of a hacking tool.

Table 2: The Crown Prosecution Service guidance on CMA amendments.

12Computer Fraud & Security July 2008

Not only can the number and diversity of such questions be exceedingly high, but answering them is challenging even in principle.

Determining qualitative variables in uncertain and open problem domains has been a major topic of interest in the discipline of macroeconomics. The well-known “cross” methodology has contributed significantly to the understanding of the market forces of supply and demand.16,17 The remainder of this section ports these proven techniques to the domain of privacy and security.

Initially we need to classify the relevant technologies into two “markets”: security technologies and adversarial technologies. By security technologies we mean those intended to support the confidentiality of our private data, such as firewalls, antivirus tools and so on. In other words, these are defensive technologies, and primarily access control measures. By adversarial technologies we mean those used for testing our security technologies. These are hacking tools, such as vulnerability scanners, exploits and security assessment frameworks. These offensive security mechanisms are required in order to assess the security level of an IT infrastructure. A key differentiator is the purpose, or intention of use, of a given technology. In the security technologies market, the technologies can only be used for benign purposes, whereas in the adversarial technologies market, the technologies can be used for either benign or malicious purposes. “Ethical hacking”, for instance, is the term used to capture the benign use of adversarial tools.

Against the above, we can now proceed to define the two markets. Figure 1 shows the security technologies market. The process for establishing the respective relations is as follows. Our objective is to determine the relationship between price and privacy (i.e. quadrant Q1) in this market. To do this, we need to define Q2, Q3 and Q4; Q1 is then obtained as an equilibrium by attaching all the assumptions (functions) of the other three quadrants.

Starting with Q2, we assume that there is an inverse relationship between the aggregate demand for security technologies and price: the lower the price of security technologies (P), the higher the quantity of security technologies demanded (SD). That is, SD = f(P), with f decreasing.

The assumption captured in Q4 is that there is a positive relationship between the supply of security technologies and the level of privacy. Indeed, we support the view that there can be no privacy if there are no security technologies in place to protect the relevant personal information (i.e. privacy is bounded by security). Hence the security technologies supply function, SS = g(V), is rationalised by the fact that the more important (higher) privacy (V) is, the higher the quantity of security technologies supplied (SS) to keep privacy at high levels.

Quadrant Q3 reflects the view of economists who suggest that market forces and economic laws, if left alone, will eventually push the demand for security technologies into equilibrium with their supply, regardless of the initial allocation. This is represented by the identity SD = SS, or f(P) = g(V).

By attaching the three assumptions in Q2, Q3 and Q4, we can derive the curve in Q1. Consider price P0, which corresponds to demand SD0, which in turn matches supply SS0, which in turn rests at privacy level V0. Similarly, consider the path for a different price P1, leading to demand SD1, matching supply SS1 and resulting in privacy V1. The two variable pairs (P0, V0) and (P1, V1) define two points of the privacy-price curve, A0 and A1 respectively. Repeating this for the infinite number of (P, V) pairs, we eventually obtain the SD-SS curve.

Figure 1: The security technologies market.
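For readers who prefer a computational illustration, the following Python sketch traces the Q1 curve for one hypothetical choice of linear f and g. The coefficients are illustrative assumptions only; the paper itself specifies no functional forms, only the signs of the slopes.

```python
import numpy as np

# Hypothetical linear functional forms (illustrative assumptions only):
#   Q2: security demand  SD = f(P) = 100 - 2P   (inverse in price)
#   Q4: security supply  SS = g(V) = 10 + 3V    (increasing in privacy)
f = lambda P: 100.0 - 2.0 * P
g_inv = lambda S: (S - 10.0) / 3.0  # privacy level V at which g(V) = S

# Trace the Q1 (SD-SS) curve: pick a price, read off demand (Q2),
# equate demand to supply (Q3), then map supply back to privacy (Q4).
for P in np.linspace(0.0, 20.0, 5):
    V = g_inv(f(P))
    print(f"P = {P:5.1f}  ->  V = {V:5.2f}")
# The output shows V falling as P rises: the SD-SS curve slopes downward.
```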

A similar line of reasoning is followed for the adversarial technologies market.



Although Q2 and Q4 remain the same as in the security technologies market, attention should be drawn to Q3, which in this case is substantially different (Figure 2). More specifically, Q3 captures the assumption relating to the use, or intention of use, of the technology. Assuming that an adversarial system may be used either for benign or for malicious purposes, SS* = SM + SB denotes the total number of systems used, where SM is the number of systems used for malicious purposes and SB the number used for benign purposes. The line drawn in Q3 shows this as a constraint, and any point on this line sets the proportion between malicious and benign systems: owing to the geometric nature of the −45° line, the two components of demand always add up to the total supply on each axis, so the −45° line directly represents the equilibrium condition. Any point on this line gives a demand component for technologies used for malicious purposes plus a demand component for benign, privacy-enhancing uses, which together add up to the total supply of adversarial technologies.

Following the process of identifying the (P, V) pairs by equating all the assumptions in Q2, Q3 and Q4, we obtain the positively sloped SM-SB curve in Q1, which represents the adversarial technologies market.

Figure 2: The adversarial technologies market.

So far we have derived two pieces of geometric equipment. One gives the equilibrium pairs of P and V in Figure 1, i.e. the SD-SS curve in the security technologies market, and the other gives the equilibrium pairs of P and V in Figure 2, i.e. the SM-SB curve in the adversarial technologies market. By placing these two curves on the same quadrant, as shown in Figure 3, that is, by solving the two equilibrium equations f(P) = g(V) and SS* = h(P) + k(V) simultaneously, we can find the single (P, V) pair that gives equilibrium in both markets. This is shown as the equilibrium point E(PE, VE) at the intersection of the SD-SS and SM-SB curves in Figure 3.

Figure 3: The equilibrium.
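To make the simultaneous solution concrete, the sketch below extends the earlier linear example with hypothetical functions h and k for the adversarial market, and solves the two equilibrium equations as a 2×2 linear system. Again, the coefficients and the total stock SS* are assumptions chosen purely for demonstration.

```python
import numpy as np

# Hypothetical linear forms, continuing the earlier illustrative example:
#   f(P) = 100 - 2P    security demand       g(V) = 10 + 3V  security supply
#   h(P) = 60 - 1.5P   malicious-use demand  k(V) = 5 + 2V   benign-use demand
a, b, c, d = 100.0, 2.0, 10.0, 3.0
m, n, q, r = 60.0, 1.5, 5.0, 2.0

def equilibrium(SS_star):
    """Solve f(P) = g(V) and h(P) + k(V) = SS* for (P, V).

    Rearranged into matrix form:
        -b*P - d*V = c - a
        -n*P + r*V = SS* - m - q
    """
    A = np.array([[-b, -d], [-n, r]])
    rhs = np.array([c - a, SS_star - m - q])
    P, V = np.linalg.solve(A, rhs)
    return P, V

P_E, V_E = equilibrium(80.0)  # SS* = 80: assumed total stock of adversarial tools
print(f"Equilibrium: P_E = {P_E:.2f}, V_E = {V_E:.2f}")
```

With these assumed numbers, the intersection lies at roughly P_E = 15.88 and V_E = 19.41; only the direction of the comparative-statics movements below, not these values, carries any meaning.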

Taken alone, this “snapshot” of the expected level of privacy is not very helpful, as we have still not escaped the need to measure privacy objectively. However, the benefits of this method are realised if we perform a comparative statics exercise and incorporate the scenario of criminalising the distribution of hacking tools.

Case study analysis

In order to demonstrate the case simulation through a comparative statics exercise, we first sketch a validation of the model by running a case with a fairly predictable and intuitive result.

Baseline case: criminalisation of malicious hacking

Let us consider the case where a government decides to update a regulatory system by introducing stricter laws in relation to hacking (i.e. the malicious use of adversarial technologies). We could accept that the introduction and application of a strict law on malicious activities should result in higher privacy overall; a number of less determined, opportunistic hackers would be deterred, and therefore the ratio of benign to malicious adversarial technologies would increase, indicating a redistribution of adversarial technologies to good causes.

The above scenario is expressed in the model by accepting that the demand for adversarial technologies for malicious purposes decreases, as indicated in Figure 4 by a rightward shift of the demand curve from SM = h(P) to SM = h´(P). That is, for the same price, fewer adversarial systems will be used for malicious purposes. As a result, following the dashed lines in Figure 4, the SM-SB curve shifts to the right, to position SM´-SB´, and the equilibrium point thus moves along the SD-SS curve from E to E´. Comparing points E and E´ (comparative statics), the equilibrium price decreases from PE to PE´ and privacy increases from VE to VE´. In other words, under the assumption of exogenously

fixed supply of security technologies, the introduction of a stricter regulatory system results in a reallocation of demand between malicious and privacy-enhancing purposes (decreasing for malicious purposes and increasing for benign ones), in lower prices for security technologies and, correspondingly, in higher levels of privacy.

Figure 4: The new equilibrium, following the introduction of a strict law against hacking.
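In the illustrative linear model sketched earlier, this stricter law can be simulated by lowering the intercept of the assumed malicious-demand function h(P). The size of the shift is arbitrary; only its direction follows the scenario.

```python
import numpy as np

# Same hypothetical linear model as before; the only change is the
# malicious-demand intercept m, shifted from 60 to 50 to represent
# the stricter law (an assumed magnitude, for illustration only).
a, b, c, d = 100.0, 2.0, 10.0, 3.0
n, q, r, SS_star = 1.5, 5.0, 2.0, 80.0

def equilibrium(m):
    A = np.array([[-b, -d], [-n, r]])
    rhs = np.array([c - a, SS_star - m - q])
    return np.linalg.solve(A, rhs)  # returns (P, V)

for label, m in [("before (E) ", 60.0), ("after  (E´)", 50.0)]:
    P, V = equilibrium(m)
    print(f"{label}: P = {P:.2f}, V = {V:.2f}")
# Price falls (15.88 -> 12.35) while privacy rises (19.41 -> 21.76),
# matching the movement from E to E´ along the SD-SS curve in Figure 4.
```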

Case: criminalisation of the distribution of hacking tools

We now consider the scenario where the distribution of adversarial technologies is criminalised irrespective of their use. This position is due to the broad description of computer misuse included in the new Act, which states that a person is guilty of an offence if “he obtains any article with a view to its being supplied for use to commit, or to assist in the commission of, [a hacking offence]” (Police and Justice Act, 2006).18 As presented earlier, the wording of the amendment has a broad remit and, as we demonstrate below, it is expected to have a negative impact upon privacy.

Since the legislation targets the distribution and possession of adversarial technologies, we can assume that aggregate demand will drop for both malicious and benign purposes. For example, a system administrator who discovers vulnerabilities, whether intentionally or accidentally, will be deterred from sharing the findings with security auditors, who in turn would be reluctant to accept the relevant hacking tools, as possession could be loosely coupled with intent to commit an offence under the amended Act – particularly if the vulnerability later ended up as part of a threat vector applied maliciously by third parties. In the model, this event displaces the SS* line in Q3 of Figure 5 in parallel towards the origin (0,0), indicating a reduction in the total number of adversarial technologies.

Figure 5: The impact of criminalising any distribution and use of hacking tools, irrespective of intention.

The effect of this displacement is to move the SM-SB line towards the left in Q1. As a result, the equilibrium E moves to E´´, defined by the pair (VE´´, PE´´), indicating an increase in price and a reduction in privacy. This interesting result can be explained as follows. The reduction in the aggregate demand for hacking tools will, for a given price, decrease the ratio of benign to malicious technology demand, i.e. proportionally more malicious technologies will be used, illustrating the perceived disadvantage the “good guys” will experience. This disadvantage would in effect prevent security audits from being properly performed, and therefore more investment in security technologies will be made in order to compensate for this handicap, hence the price increase. Nevertheless, privacy is expected to drop, as the auditing tasks cannot be performed effectively, so the robustness and security of the systems have a higher likelihood of failure.
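The same toy model reproduces this second result if, instead of shifting h(P), the total stock SS* is reduced. Once more, the magnitudes are assumptions for illustration, but the direction of the movement matches the description above.

```python
import numpy as np

# Same hypothetical linear model; here the criminalisation of distribution
# is represented by shrinking the total stock SS* of adversarial
# technologies (80 -> 60 is an assumed magnitude, for illustration only).
a, b, c, d = 100.0, 2.0, 10.0, 3.0
m, n, q, r = 60.0, 1.5, 5.0, 2.0

def equilibrium(SS_star):
    A = np.array([[-b, -d], [-n, r]])
    rhs = np.array([c - a, SS_star - m - q])
    return np.linalg.solve(A, rhs)  # returns (P, V)

for label, SS_star in [("before (E)  ", 80.0), ("after  (E´´)", 60.0)]:
    P, V = equilibrium(SS_star)
    print(f"{label}: P = {P:.2f}, V = {V:.2f}")
# The parallel displacement of SS* towards the origin raises the
# equilibrium price (15.88 -> 22.94) and lowers privacy (19.41 -> 14.71),
# i.e. the movement from E to E´´ described in the text.
```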

Conclusions

According to the theoretical research exercise described in this paper, indiscriminate criminalisation of hacking tools – or, alternatively, adversarial technologies – is expected to have negative consequences for privacy at an aggregate level: not only would privacy drop, but the price of maintaining this reduced level of privacy would increase. Although the increase in price may initially come as a surprise, it can be explained by inspecting information security processes. More specifically, adopting the view that the “security market” consists of both (defensive) security and adversarial technologies, the latter will move the balance toward the malicious side, supporting the claim that organised crime is unlikely to be affected by this localised (i.e. national) legal “restriction”.

It should be highlighted that the above is a hypothesis suggested by the application of the method described in this paper. Further research validating this hypothesis within specific socio-economic contexts is an area for future work. Nevertheless, the method applied here can certainly assist by highlighting directions for balancing security and privacy, which can lead to informed decision making and policy planning.

References

1. Cranor, L. and Garfinkel, S. (eds). 2005. Security and Usability. O'Reilly.
2. Furnell, S.M., Katsabas, D., Dowland, P.S. and Reid, F. 2007. “A practical usability evaluation of security features in end-user applications”. Proceedings of the 22nd IFIP International Information Security Conference (IFIP SEC 2007), Sandton, South Africa, 14-16 May 2007, pp.205-216.
3. Brunk, B. 2002. “Understanding the Privacy Space”. First Monday, 7(10).
4. Odlyzko, A. 2003. “Privacy, economics, and price discrimination on the internet”. In ACM Fifth International Conference on Electronic Commerce, pp.355-366.
5. McCue, A. 2007. “Missing: 25 million child benefit records”. silicon.com, 17 November 2007.
6. Katos, V. and Patel, A. 2008. “A Partial Equilibrium View on Security and Privacy”. Information Management and Computer Security, Emerald. To appear.
7. Acquisti, A., Dingledine, R. and Syverson, P. 2003. “On the economics of anonymity”. In Financial Cryptography (FC '03), Springer Verlag, LNCS.
8. Acquisti, A. 2004. “Privacy in Electronic Commerce and the Economics of Immediate Gratification”. Proceedings of the ACM Electronic Commerce Conference (EC 04). New York, NY: ACM Press, pp.21-29.
9. Huang, P. 1998. “The Law and Economics of Consumer Privacy Versus Data Mining”. Available at SSRN: http://ssrn.com/abstract=94041 or DOI: 10.2139/ssrn.94041.
10. Laudon, K. 1996. “Markets and privacy”. Communications of the ACM, 39(9).
11. Otsuka, T. and Onozawa, A. 2001. “Personal Information Market: Toward a Secure and Efficient Trade of Privacy”. In Proceedings of the First International Conference on Human Society and the Internet, Springer LNCS 2105, p.151.
12. Range, S. 2002. “Renewed calls to fight cybercrime”. Computing, 15 February 2002. http://www.computing.co.uk/computing/analysis/2075433/renewed-calls-fight-cybercrime
13. Hansard. 2006. Police and Justice Bill. Lords Hansard, Vol.685, Part 188, 10 October 2006, Column 218. http://www.publications.parliament.uk/pa/ld199697/ldhansrd/pdvn/lds06/text/61010-0015.htm#061010178000054
14. CPS. 2008. Computer Misuse Act 1990 – Guidance. Crown Prosecution Service. http://www.cps.gov.uk/legal/section12/chapter_s.pdf
15. Clayton, R. 2007. “Hacking tool guidance finally appears”. Security Research, Computer Laboratory, University of Cambridge, 31 December 2007. http://www.lightbluetouchpaper.org/2007/12/31/hacking-tool-guidance-finally-appears/
16. Dornbusch, R. and Fischer, S. 1998. Macroeconomics, 7th ed. McGraw-Hill, New York.
17. Branson, W.H. and Litvack, J.M. 1981. Macroeconomics, 2nd ed. Harper & Row, New York.
18. Police and Justice Act 2006. Chapter 48. Available from: https://www.hmso.gov.uk/acts/acts2006/pdf/ukpga_20060048_en.pdf




About the authors

Vasilios Katos is a member of the School of Computing, University of Portsmouth, UK.

Steven Furnell is Professor of Information Systems Security at the University of Plymouth, UK.

RESEARCH FOCUS

When security causes its own type of harm…

Academics have examined how heavy security controls can ultimately impinge on security itself. Gerald V Post, of the University of the Pacific, and Albert Kagan, of Arizona State University, have analysed a user survey examining the trade-off between protection and accessibility. The authors applied a structural equation model to evaluate the impact on security levels. “Tightening security by making systems more inaccessible can hinder employees and make them less productive,” said the paper. “It can also result in lower security as workers struggle to find ways around the security conditions to enable them to do their jobs.”

Two hundred and fifteen US-based employees from a variety of industries took part in the Web-based survey.

Employees were asked whether they strongly agreed, agreed, felt neutral about, disagreed or strongly disagreed with the following statements:

• “I consider myself to be knowledgeable about computers.
• I consider myself to be knowledgeable about security.
• I try hard to maintain computer security.
• The MIS department provides timely help and support.
• The MIS department has a supportive attitude.
• The information system provides the information I need.
• The information system is easy to use.
• The information system and data are adequately secure.
• The computer security group is supportive and helpful.
• Computer security is emphasized so much that it interferes with my work.
• I often get announcements about computer security.
• Computer security is effective and I see few attacks.
• Computer security slows down my computer or delays my work.
• This organisation has caught several people committing fraud or attacking computers.
• I like my job.”

“An important conclusion of this study is that 34% of the respondents perceived interference or delays caused by the computer security systems as a consequence of their current business environment,” said the researchers.

“And the questionnaire showed…in general employees perceive that increases (more onerous measures) in security policies and practices result in greater interference(s) with their job responsibilities.”

Post and Kagan said the result justified concern, and showed a need for more studies and for care when designing and implementing security controls.

They also suggested that the findings show security controls need increased support from the IT department, and that a balanced approach is recommended.

“In other words firms need to design a security policy that will protect the organisation's assets and subsequently allow for user access without undue constraints on job performance and task completion,” said the paper.

Users should also be involved in creating security policy, with the more knowledgeable users selected to design security controls, according to the research. Security restrictions should also be tested on users to ensure that task interference is low.



Paper: Evaluating information security tradeoffs: Restricting access can interfere with user tasks
Authors: Gerald V Post and Albert Kagan
Published: Computers & Security, 26 (2007)