
ijcrb.webs.com
INTERDISCIPLINARY JOURNAL OF CONTEMPORARY RESEARCH IN BUSINESS
NOVEMBER 2013, VOL 5, NO 7
COPYRIGHT © 2013 Institute of Interdisciplinary Business Research

Main human factors affecting information system security

Saeed Soltanmohammadi 1, Saman Asadi 2, Norafida Ithnin 3

Department of Computer Science, Faculty of Computing, Universiti Teknologi Malaysia, 81310 Skudai, Johor

Abstract

In this research, areas relevant to Information System Security have been reviewed in the context of the Malaysian healthcare industry. Concepts such as the definition of Information System Security, system security goals, system security threats and human error have been studied. The human factors affecting Information System Security have been highlighted and relevant models introduced. Reviewing the previously identified factors helped to determine the Health Information System factors. Finally, the human factors affecting Health Information Systems have been identified and the structure of the healthcare industry studied. These factors are categorized into three new groups: Organizational Factors, Motivational Factors and Learning. This information will help in designing a framework for Health Information Systems.

Keywords: Organizational Factors, Motivational Factors and Learning, Information system security

1. Introduction

A major part of information systems security strategies are technical in nature, with less concern for people and organizational issues. Because most such strategies concentrate on technically oriented solutions, for instance checklists, risk analysis and assessment techniques, they tend to disregard the social factors of risk and the informal structures of organizations; there is therefore a need to investigate other ways of managing information systems security.

This investigation concentrates chiefly on human and organizational factors within the computer and information security system. Irrespective of the strength of technical controls, the impact on security can be drastic when human and organizational factors influence their deployment and use (Bishop, 2002). In this respect, weak computer and information security protection (e.g., weak passwords or poor usability) can set the stage for vulnerabilities, and malicious intentions may then appear. Vulnerabilities result from flawed organizational policies and individual practices whose origins are deeply rooted in early design assumptions or managerial choices (Besnard and Arief, 2004).

The Health Information System (HIS) has been implemented in Malaysia since the late 1990s. HIS is an integration of several hospitals' information systems to manage administrative work, patients and clinical records. Because HIS data are easy to access through the internet, their vulnerability to misuse, data loss and attack increases. Health data are very sensitive and therefore require strong protection; information security must be carefully watched, as it plays an important role in protecting the data from being stolen or harmed. Despite the vast body of research in information security, the human factor has been neglected by the research community, with most security research focusing on the technological component of an information technology system. The human factor remains subject to attack and is thus in need of auditing and of having any existing vulnerabilities addressed.

It is a mistaken assumption that system security expectations will be met when people are simply expected to follow secure behavioral patterns. It is equally mistaken to claim that security is something that can simply be purchased, or that the human factor can be relied upon to behave predictably. A critical point in Information Security is without question the human factor. An attacker can take advantage of people who make untried decisions that grant access, or who might even purposely attack their own premises.

External threats are not the main concern in information security, even though many organizations apply advanced technologies such as smart cards and biometrics in their security systems (Kreicberge, 2010; Leach, 2003). As Leach (2003) states, the main concerns are internal threats such as users' carelessness, errors and omissions, which are all caused by internal factors and categorized as poor user behavior. According to several studies, employees in an organization are responsible, intentionally or unintentionally, for a great many security breaches (Kreicberge, 2010; Siponen et al., 2010); this employee involvement constitutes an internal threat. As Boujettif and Wang (2010) report, four out of five security incidents in organizations are caused by internal threats. Research in Malaysia supports this: for example, human error is one of the main internal threats in applying the Health Information System in Malaysia (Samy, 2010; Humaidi and Balakrishnan, 2013). This study aims to categorize the human factors affecting information system security (ISS) in Malaysian hospitals.

Literature Review

2. Information System Security

A system is defined by cooperation among diverse entities towards fulfilling a shared goal. An Information Technology system is one in which people and technology, each with their own constituents and associated activities, interact for the same purpose. Like other systems, Information Technology systems can fail to operate as anticipated. It is not just the machine segment of the system that must operate adequately but the user as well; when a user fails, the result is human error. Human error is in fact one of the fundamental explanations for system failure. It is a consequence of attempting to shrink human nature into a simplified model: simplifying a model to make it easier to use means discarding elements that at first may not seem significant, but whose absence can result in catastrophe. Information Technology is such a case; these man-made systems are diminished models of a reality we want to represent, and they inherit the flawed patterns of their creators, us. The very commands that we prescribe confine our analog nature within a digital world of binary digits. Hence, as there are nearly limitless possibilities for something unforeseen to happen, error is something we should expect (Rupere et al., 2012).

3. System Security Goals

Information Security ensures data confidentiality, integrity and availability (Pfleeger, 2003; Bishop, 2003), three properties which together assure data security. Confidentiality exists when every system constituent can be accessed only by authorized parties; the term access covers awareness of the very existence of the constituent as well as viewing or printing it (Pfleeger, 2003). Integrity is the assurance that system constituents can be modified only by authorized persons or in authorized ways; modification includes writing, altering, changing status, erasing and creating (Pfleeger, 2003). Availability means that system constituents are accessible to authorized persons at specified times. The opposite of availability is denial of service, in which access to a specific set of objects is refused for a specified time (Pfleeger, 2003).
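To make the three properties concrete, the sketch below models them as simple checks on an access request. It is a minimal illustration under stated assumptions, not drawn from the paper; the Resource class, the check_request function and the policy rules are hypothetical.

# Minimal sketch of the three security goals as access-request checks.
# All names (Resource, check_request, etc.) are illustrative assumptions,
# not an API from the paper or any particular library.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    readers: set = field(default_factory=set)   # confidentiality: who may access
    writers: set = field(default_factory=set)   # integrity: who may modify
    online: bool = True                          # availability: is it reachable

def check_request(res: Resource, user: str, action: str) -> bool:
    """Grant a request only if it violates none of the three goals."""
    if not res.online:                 # availability: otherwise denial of service
        return False
    if action == "read":               # confidentiality check
        return user in res.readers
    if action in ("write", "delete"):  # integrity check
        return user in res.writers
    return False

# Usage: a nurse may read a record but not alter it.
record = Resource("patient-42", readers={"nurse", "doctor"}, writers={"doctor"})
assert check_request(record, "nurse", "read") is True
assert check_request(record, "nurse", "write") is False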

4. System Security Threats

In a general framework, the first step in building a secure system is to recognize the probable threats. The threats to a system can be classified as interception, interruption, modification and fabrication; these four classes encompass all types of threats that a system can experience (Pfleeger, 2003). Interception denotes information that has become accessible to an outside party without proper authority. The outside party, which may or may not be traceable, can be an individual, a program or a system (Pfleeger, 2003). An unsuccessful wiretap is an example of a traced interception, while a successful one is an example of an untraced interception. Interruption occurs when a system constituent becomes lost, inaccessible or unusable (Pfleeger, 2003). An example is the deliberate damage of the cables linking a crucial system: system connectivity is disrupted and the resources inside immediately become inaccessible. Modification, unlike interception, consists not only of an unauthorized party accessing a system component but also of changing it. Depending on the technical discernibility of the changes, modifications may or may not be distinguishable (Pfleeger, 2003). A computer virus that changes the keyboard output is an example of an identifiable modification: the user will immediately become aware of a system change. Conversely, if the same system is instead infected by a rootkit, the user might not notice any alteration in the system output or general experience, despite the alterations to the system kernel. Fabrication is the insertion of counterfeit objects by an illegitimate party (Pfleeger, 2003). Being added objects, fabrications may be simpler to identify, though this depends on the attacker's skill. For instance, by inserting a module into a bank's database server, a malevolent user could credit to his own account an extremely small and apparently untraceable amount from every transaction.
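As a compact summary of this taxonomy, the sketch below tags example security events with one of the four threat classes. It is an illustrative mapping under the definitions above; the event descriptions are hypothetical, not drawn from the paper.

# Hedged sketch: Pfleeger's four threat classes as an enum, with
# hypothetical example events mapped onto them.
from enum import Enum

class Threat(Enum):
    INTERCEPTION = "unauthorized party gains access to an asset"
    INTERRUPTION = "asset becomes lost, unavailable or unusable"
    MODIFICATION = "unauthorized party accesses and tampers with an asset"
    FABRICATION = "unauthorized party inserts counterfeit objects"

EXAMPLES = {
    "wiretap on a network link": Threat.INTERCEPTION,
    "cut cable to a critical server": Threat.INTERRUPTION,
    "rootkit altering the kernel": Threat.MODIFICATION,
    "rogue module skimming bank transactions": Threat.FABRICATION,
}

for event, threat in EXAMPLES.items():
    print(f"{event:40s} -> {threat.name}")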

5. Characteristics of a System Intrusion

A system condition that could permit an intrusion is called a system vulnerability. An intrusion is a successful attempt to infiltrate a system by abusing an existing vulnerability. A vulnerable constituent will not necessarily lead to an intrusion; nevertheless, responsible system management should prevent vulnerabilities from being abused. An attacker initially attempts to locate the weakest point of a system; while strong-looking defenses might also be vulnerable, the weakest point offers a greater chance of attack success and is therefore, on average, the first to be targeted (Pfleeger, 2003).

6. Targeted System Components

Hardware, software and data make up the technological portion of an Information Technology system. These constituents may be targeted directly or indirectly: when they are attacked directly, the human part of the system is not abused, whereas an indirect attack implies that the human part of the system is abused as well. Figure 1 shows how the constituents interact.

Figure 1. The Targeted System Components (Nikolakopoulos, 2009)

Hardware is the physical portion of a computer system. It accommodates the software and data constituents and is the lowest level of an Information Technology system; the hard disk, the memory modules and the keyboard are some of its constituents. Attacks that aim at the hardware are normally the hardest to protect against, as most security measures focus on software and data security. An example of an indirect hardware-based attack is a hardware keyboard listener set up to deceive the authorized system user. Software is the part of the computer system that facilitates all functions by quietly managing the hardware resources. Software includes the Operating System (OS), any installed applications and the hardware firmware; it is positioned between the user and the hardware and allows the former to make effective use of the latter. As software is the most targeted system constituent, most security solutions are software based. The execution of malicious code on the system through an enticing email attachment is an example of an indirect software-based attack: when the targeted user permits code execution by opening the attachment, the attacker acquires entry to the system. Data is the system output produced by the execution of software. Data are used in the software layer and are not executed directly on the hardware; a database configuration, the database contents or the output of a database query can all be defined as data. Duping a system user by phone into providing fake verification and enquiry details for a database record is an example of an indirect attack on data. It is worth noting that, in contrast to the other system constituents, data cannot be abused directly unless another system constituent has already been infiltrated; in the example above, the human factor reveals the data after being successfully abused.

7. Security Implementation

With all threats charted, achieving a secure system would typically be just a matter of cost. Security, however, is an often underestimated aspect of technology. Even when a sound process for building an Information Technology system integrates security, one needs to be conscious of normal security practice and of the fact that the human factor is frequently the first point of failure. A secure system is not guaranteed a priori by the presence of security mechanisms, just as data protection is not guaranteed by appropriate security-oriented configuration alone. Human factors should be assessed, and highlighted where necessary, for security to be deployed effectively. An infiltration defeats one or more of the security objectives irrespective of which technological constituent is infiltrated. Thus the focus of investigation is the possible point of intrusion, which in our topic is the human factor, and the attacks will be analyzed on human-centered rather than technological criteria. Figure 2 highlights the topology of the fundamental constituents of an Information Technology system under attack and where the human factor is positioned within it.

Figure 2. Linking the Human Factor (Nikolakopoulos, 2009)

8. Role of Human and Organizational Factors in Computer and Information Security

The role of human and organizational factors in computer and information security has been investigated from a series of disciplinary standpoints. The list includes, though is not limited to, work from cognitive engineering, computer science, human factors engineering, information systems, macroergonomics, management sciences and system dynamics. Although these research tracks have analyzed various aspects of human and organizational factors in computer and information security, investigators have pressed for more studies in these areas (Dhillon and Backhouse, 2001; Furnell, 2007; Schultz, 2005; Cresswell and Hassan, 2007). Some of the investigations across these disciplines are summarized in this overview.

Stanton et al. (2005) carried out a survey of 1167 end users in the financial, industrial, health, military, government and telecommunications sectors on password-related behaviors, training and organizational awareness. They discovered substantial correlations between healthy password-related behavior and training and awareness. Albrechtsen (2007) carried out an interview study to clarify how organizational factors shape users' experience of information security. Among other outcomes, the study reported that organizational factors such as high workload create a conflict of interest between functionality and information security. These outcomes have been supplemented by studies of security managers' and network administrators' views of human and organizational aspects; these studies also discovered that, among other things, weakened system states, human mistakes and system performance in general were linked to tasks and high workload (Kraemer et al., 2006; Kraemer and Carayon, 2007).

Organizational factors in computer and information security research include policies, culture and management support. Studies involving policies have analyzed a variety of dimensions, such as the effective application and distribution of computer and information security policies (Fulford and Doherty, 2003), employees' responses to security requirements (Pahnila et al., 2007), and the significance of policy management and organizational processes for the execution and acceptance of computer and information security policies (Karyda et al., 2005).

Culture in security systems has likewise been found to be multidimensional, encompassing dimensions of organizational security culture (Ruighaver et al., 2007), top management support (Knapp et al., 2006), employee involvement and training (Kraemer and Carayon, 2005), and employee awareness of security (Siponen, 2000). Other investigations in computer and information security have analyzed the interplay between human and organizational factors (Werlinger et al., 2009). Their analysis stressed the interactions among the various factors in the computer and information security system and generated a cohesive framework of human, organizational and technological obstacles associated with security management. The study found that a strategy highlighting all the factors together is needed in order to address the range of computer and information security issues effectively. Effective computer and information security is certainly multidimensional and touches many features of the computer and information security system, an aspect stressed by the large body of research on organizational factors. However, these lines of investigation have not yet clearly connected factors to particular performance metrics, such as specific susceptibilities or vulnerability types. In our current strategy, we aim to detect possible interactions and pathways among factors that may have both direct and indirect influences on various vulnerability types.

Investigations into system dynamics, using causal loop diagramming based on the viewpoint of security managers, have produced models of human and organizational factors in computer and information security systems (Sarriegi et al., 2006). These studies have examined security management from a dynamic and complex perspective in particular, highlighting that the complex nature of these systems requires a modeling procedure capable of representing the multifarious associations among factors. Factors such as security management, culture and the lack of employee participation, among others, were described as contributing to weakened security performance.

9. Information Security and Types of Human Factor Errors

Humans are constantly referred to as the weakest link in security (Schneier, 2000; Huang et al., 2007). Concentrating solely on the technical aspects of security, without taking into account how the human interacts with the system, is clearly insufficient. This section covers the types of human factor errors that can cause security breaches and deliberates on a number of explanations for these faults.

Information security breaches can be classified in a number of different ways. Swain and Guttman (1983) differentiated five kinds of human factor error, which can be used to elaborate on information security breaches. First, there are acts of omission, where people forget to perform a required action; in an information security domain this could be the failure to change passwords regularly. Second, there are acts of commission, in which people perform an incorrect procedure or action, such as writing down a password. Third, extraneous acts are faults caused by doing something unnecessary. Fourth, sequential acts are errors caused by performing something in the wrong order. Finally, Swain and Guttman (1983) describe time errors, triggered by people's failure to perform a task within the stipulated time.
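The sketch below encodes the Swain and Guttman (1983) taxonomy so that logged security incidents can be tagged by error type. It is a minimal illustration; the incident strings and their mapping are hypothetical examples, not data from the paper.

# Hedged sketch: the five Swain & Guttman (1983) human-error types
# as an enum, with hypothetical security incidents tagged by type.
from enum import Enum, auto

class HumanError(Enum):
    OMISSION = auto()    # required action forgotten
    COMMISSION = auto()  # incorrect action performed
    EXTRANEOUS = auto()  # unnecessary action performed
    SEQUENTIAL = auto()  # actions performed in the wrong order
    TIME = auto()        # action not performed within the required time

incident_log = [
    ("password not changed for 18 months", HumanError.OMISSION),
    ("password written on a sticky note", HumanError.COMMISSION),
    ("unneeded file-sharing service enabled", HumanError.EXTRANEOUS),
    ("patch applied before backup was taken", HumanError.SEQUENTIAL),
    ("certificate renewed after it expired", HumanError.TIME),
]

for description, error in incident_log:
    print(f"{error.name:10s} | {description}")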

In truth, the bulk of human factor errors can be seen as unintentional. Accidental human factor errors are connected to the way the individual interacts with a system, and the evidence indicates that people may encounter problems in identifying, comprehending and using security features (Furnell, 2005).

One typical human error that can result in security breaches is the capture error. Such errors happen when a well-known activity or customary routine takes over (or captures) an unfamiliar one, giving rise to cognitive failures or errors (Norman, 1981). For instance, capture errors explain why people frequently press OK when they know they should not: the act of pressing the OK button is so customary that people often follow the routine without properly considering the repercussions. These errors are typically more frequent during periods of inattention or tiredness.

Post-completion errors can also result from inattention and tiredness: the individual fails to undertake an important 'tidy-up' or 'clean-up' action that is needed after the chief goal has been accomplished (Anderson, 2008). From an information security standpoint, for instance, the chief goal may be to send an email from a secure system; once that goal has been accomplished, it is vital to finish the closing action of logging off the system. A situation in which the individual fails to finish that closing task involves a post-completion error, exposing the system to a probable security infringement.

Relatedly, a number of security procedures rely on human memory, and since memory capacity is restricted, this can reduce safety (Besnard and Arief, 2004; Sasse et al., 2001). For example, users normally have an immense number of passwords to recall, and frequently these passwords must comply with a number of strict policies (such as a specific length or a given mixture of characters), which can further decrease ease of recall. People are rather adept at recalling meaningful items (including words), but a strong password should be a meaningless string of numbers and letters, which is much more challenging to recall. The odds of passwords being written down are greatly amplified when people are expected to recall, and frequently change, many complex passwords (Adams and Sasse, 1999).
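The short sketch below illustrates the tension described here: a composition policy of the kind mentioned (minimum length, mixed character classes) rejects exactly the memorable strings users prefer. The policy thresholds are hypothetical, chosen only for illustration.

# Hedged sketch of a strict password-composition policy of the kind
# discussed above. The exact thresholds are illustrative assumptions.
import string

def meets_policy(pw: str, min_len: int = 12) -> bool:
    """True if pw satisfies a length + mixed-character-class rule."""
    return (
        len(pw) >= min_len
        and any(c in string.ascii_lowercase for c in pw)
        and any(c in string.ascii_uppercase for c in pw)
        and any(c in string.digits for c in pw)
        and any(c in string.punctuation for c in pw)
    )

# A memorable word fails; a meaningless string passes - exactly the
# property that burdens human memory and invites written-down passwords.
print(meets_policy("sunshine"))        # False
print(meets_policy("k7!Qz@9rT#2w"))    # True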

In the same vein, many anti-virus updates and other security patches require human intervention; such procedures may therefore not be carried out, owing to limitations of time or human memory, leading to a decline in security.

A lecture by McCauley-Bell on human factors issues and their influence on Information Security highlights that the heightened dangers of information technology have produced new responses aimed at technological methods, while work on the related human factors has remained tremendously limited, with the only evident exception of password generation (Nikolakopoulos et al., 2009). Although security depends on it, the human factor goes unnoticed by organizations numerous times (Kahraman, 2005). Technology is frequently perceived as the direct answer to Information Security problems (Hinson, 2003). Yet despite the fact that numerous organizations deploy a tremendous number of technical security controls, the number of security breaches they suffer is not correspondingly reduced; this occurs because Information Security is basically a human factors problem that remains unresolved (Schultz, 2005). Since people are the ones who use technology, it is equally important to invest in the human factor (Hinson, 2003).

Whatever its design and execution, a security system will have to depend on the human factor; the ongoing deployment of technical solutions alone will fail to deal with the people (Gonzalez et al., 2003). Moreover, Schneier states that technology is unable to overcome security problems, and that expecting it to do so reflects a lack of understanding of both the problems and the technology (Schneier, 2000). Mitnick contends that users are targeted when technological attacks fail, and finds technological protection insufficient, in contrast to most sources, which find that users are targeted first as the weakest link in the system (Mitnick, 2001). Information Security is a set of measures that should be treated as a system, not as a single unit (Sapronov, 2005). Besides encompassing the human factor as a constituent, an Information Security system is also defined as a constantly evolving entity (Danchev, 2006). Panko acknowledges deliberate threats from both inside and outside the organization's premises, though without examining accidental exposure of the system to risk (Raymond, 2004). A security questionnaire from Cisco Systems disclosed that even though users who work remotely admit to knowing about security threats, they still engage in actions that harm system security (Moos, 2006).

Unauthorized use of computer systems arises from either unintentional or deliberate causes (Carayon and Kraemer, 2009). Accidental causes include unforeseen natural disasters and the human factor, for instance power outages or misconfiguration (Carayon and Kraemer, 2009). Deliberate causes are actions arising from conscious choices, for instance exploiting a program defect to gain entry into a computer system (Carayon and Kraemer, 2009). An assessment of the factors that create security breaches has demonstrated that sixty-five percent of the economic damage in Information Security breaches is due to human mistake, and just three percent to malevolent outsiders (Nikolakopoulos, 2009). Given that efforts to assess the human factor in Information Security are basically absent, it is puzzling that there has been so much emphasis on technological means (Nikolakopoulos, 2009).

People, as part of the system, interact with it by creating, configuring and using both software and hardware; an optimal and faultless software or hardware solution will still be of no use if the user is poorly trained (Sapronov, 2005). Hence, people will always be a weak system constituent (Sapronov, 2005). Users frequently treat their computer systems as a black box, without understanding how they operate or desiring to find out (Sapronov, 2005; Moos, 2006). A good case in point is that users want to operate their computers the same way as any other household electrical appliance (Sapronov, 2005). Many users are found to handle personal information negligently, keeping empty passwords or using their own name as one, even though the same users would never purposely leave their keys in the outside door lock (Sapronov, 2005). People are undeniably involved in technology, whatever partial automation is available (Schultz, 2005). Thus there is always the possibility of human mistakes, which may give rise to a system leak (Schultz, 2005).

Employees are themselves a possible point of intrusion, since they have de facto access to, and knowledge of, the system; the security of an Information Technology system is therefore vastly affected by them (Maiwald, 2004). Security problems may come into focus when employees' skills are greater (Maiwald, 2004): skilled users may require additional software, which enlarges the attack surface, and an employee with the know-how to exploit existing system vulnerabilities could readily attack the organization from inside (Maiwald, 2004). It may also happen, however, that users with greater technological skills need software for which they already possess the required administrative and security configuration skills. This assumption might therefore not automatically hold, and further investigation from the research community is needed.

Security breaches are normally caused by careless and unaware users (Danchev, 2006). One behavioral inclination that gives rise to attacks is that the majority of people care more about finishing their jobs than about safeguarding themselves (Schneier, 2000). On top of that, most people do not register small threats, and they participate in actions which might expose the system (Schneier, 2000). Exception handling, that is, how people react when something unforeseen happens, is another aspect not addressed by any of the related sources; attackers many times depend on the improvised actions people take when they encounter something for the first time (Schneier, 2000).

Social engineering is another human factor vulnerability: attackers straightforwardly abuse the user by pressing them to do what is demanded of them (Schneier, 2000; Rupere et al., 2012). Social engineering is a tremendously potent attack that sidesteps every technological security measure (Schneier, 2000). Attackers normally take advantage of users' trust, developing this relationship before the attack if required (William, 2002). Furthermore, social engineering attacks quite often rely on the failure to verify a person's identity, mainly when interactions happen over the telephone; the attacker might then, for instance, pretend to be a person in authority or a colleague in need of assistance (William, 2002). With regard to the unresolved issue of user awareness, Schultz cites a survey showing that twenty percent of users would not avoid opening email attachments (Schultz, 2005).

Compared to Information Technology staff, end users are normally less educated, less skilled and less security-conscious, which renders them the personnel most vulnerable to attack in an organization. Moreover, significantly more security vulnerabilities are found for workstation computer systems than for servers, tremendously increasing both the probability of attack and the significance of workstation security. With the workstation taken as the weakest link of an Information Technology system, an organization must be able to integrate workstations and their users into the defense front line if it wants to succeed security-wise (Arce, 2003).

Securing a system is a continuous procedure requiring ceaseless investment in both technology and user education; technology cannot be the sole constituent of a protected infrastructure, as it is only one part of it (Danchev, 2006). The fundamental factors identified in addressing the human side of security are the threats and responsibilities users should learn about (Pfleeger, 2003), together with education and awareness (Moos, 2006). It is also suggested that, beyond giving users an understanding of the existence of threats, they should be persuaded of the importance of security; people will then comply with the security requirements of a particular situation (Danchev, 2006; Moos, 2006).

A Master's thesis that assesses Information Technology security performance also includes a human factors assessment based on awareness, training and education (Kahraman, 2005); however, the outcomes are not derived from the users, as the assessment is performed from an organizational standpoint rather than user-wise.

A booklet on system security, published by a technical group, can be labeled a more general reference; it was found to focus on threats to assets rather than on security dangers to assets. Its target group is system administrators, and it is not a scientific publication but a compilation of good practices. The probability of human factor intrusion is characterized only under deliberate human threats, and inadequate knowledge of the human factor leaves a strong probability of a security incident. The highlighting it proposes is mainly software based, trying to solve a human-based problem by changing the technological constituent of the system. Even though it states that the security problem is chiefly one created by people, the only user-centric solution the authors recommend, with regard to social engineering, is improved education (Wagner, 1997). Lastly, it fails to identify the human factor as an unintentional threat and inappropriately recommends user education for hindering attacks on the system.

A publication of a similarly general standpoint, focusing on analyzing security in an enterprise setting, was also found to give insufficient consideration to the role of the human factor in security (Newman, 2003). In the defense examples given, the threats are assumed to come from outside the premises, always with intent and attacking through technical means, despite the acknowledgement that threat sources originate from inside as well as outside. Hence the majority of the defense mechanisms are either software or hardware based, comparable to the previously reviewed source. Recommendations for lessening such incidents are again made with only a fleeting reference to social engineering. The possibility of an inadvertent exposure of the system through a user is not assessed; however, the Enterprise Systems Security Review in the appendices does contain a human factors checklist, which names education and awareness as two features that should be investigated.

Panko suggests that the principle of clear roles should be established for designing or auditing security operations (Panko, 2008). Roles demarcate who does what and who decides on procedures (Panko, 2008). Another interesting contribution is the proposal for user training (Panko, 2008), grouped under three parts: security awareness, responsibility and self-defense (Panko, 2008). Security awareness training uses attack patterns and case analysis to help users understand the existence of threats (Panko, 2008). Responsibility training helps users become accustomed to the actions they should take or avoid, based on certain rules and the reasoning behind those rules (Panko, 2008). Self-defense training is intended to prepare a user to take decisive action during an attack; it also enlists users in identifying issues and reporting unsatisfactory user behavior (Panko, 2008). An alternative solution, from Mitnick, is basic training for everyone plus added training according to the user's particular position (Sapronov, 2005), a solution that takes into account the role principle Panko provides.

Hinson identifies awareness as the most cost-effective security control and makes a recommendation on how to enhance the investment in controls, but without offering a solution (Hinson, 2003). He finds that, in terms of proactive risk management, numerous organizations assess new products and do intermittent testing of the technological part of their systems, but very few make a serious attempt to recognize the dangers related to the system's users (Hinson, 2003). A better understanding of feedback is required for security enhancement (Hinson, 2003); since feedback from the technological constituent can be automated, the feedback concerning human factors remains a concealed, neglected topic.

The human factors assessment technique for computer and Information Security created by Kraemer does have certain limitations. The resulting vulnerability assessment follows a technical vulnerability audit and operates on vulnerabilities with human factor constituents (Kraemer, 2006); this precludes a human factors vulnerability assessment that does not first review the technological constituent. On top of that, the vulnerability assessment is based solely on the previously found technical vulnerabilities (Kraemer, 2006), so a significant portion of the human factor, the non-technical inadvertent vulnerabilities, remains unexamined. The feedback derives from qualitative interviews not with end users but with the participating network administrators (Kraemer, 2006); the information gained would be richer and more genuine if the opposite were done. Moreover, the assessment is qualitative, based on results derived from a qualitative analysis software package (Kraemer, 2006), which carries a risk of discrepancy, as the results may differ if an inappropriate classification is made.

The sources reviewed above constitute the scientific basis for exploring the human factor as a possible point of intrusion. The human factor is undeniably a crucial part of Information Security (Schultz, 2005; Nikolakopoulos, 2009; Kahraman, 2005; Hinson, 2003; Jose, 2003; Schneier, 2000; Sapronov, 2005; Danchev, 2006; Moos, 2006; Maiwald, 2004; Wagner, 1997). This is so because users assume technology to be present and working for them (Sapronov, 2005; Moos, 2006), or simply bypass any security mechanism to get their jobs done (Schneier, 2000). The protection of users should be of greater importance, users being the most targeted and susceptible link of an Information Technology system (Arce, 2003). The education and security awareness of the user should be the chief target of an investigation (Hinson, 2003; Moos, 2006; Wagner, 1997; Newman, 2003), two features which may also explain exploitability. Education is regarded as a vital factor in addressing the human factor vulnerability (Pfleeger, 2003; Hinson, 2003; Schneier, 2000; William, 2002; Danchev, 2006; Panko, 2008; Moos, 2006).

10. The Significance of Human Factors in Information Systems Security

Until lately, initiatives to enhance Information Security have been software-centered or hardware-oriented, and attempts to address the security obstacles confronted by users have been few and far between. Only recently has it been recognized that system users, and their interaction with computers, are the ultimate loophole in Information Systems security. The statement that humans are the weakest link in information security is made in reference (Mitnick and Simon, 2002). System users have to collaborate with information security, but regrettably numerous organizations focus on hardware and software solutions, leaving 'people-ware' out of the equation, with the result that bad design and a bad security culture become established (Flechais, 2005).

In the framework of this investigative study, human factors in information security comprise all those activities mistakenly performed by system users that decrease information security, regardless of all the technical measures installed, for example firewalls, Intrusion Detection Systems and anti-virus software. Human factors here refers to inadvertent activities by system users that affect the security of the system, such as inappropriate use of passwords, input mistakes, forgetting to log out of systems, not adhering to procedures, inexperience, and users who give their passwords to colleagues so that problems can be fixed while they are out of the office. Such activities are distinct from insider threats, which consist of malevolent activities intended to attack a system, carried out by people, especially staff, who work with an information system. Human factors in this framework are likewise not associated with any local or international malevolent activities by an organization's staff or by unauthorized people deliberately seeking to attack a system, sometimes called 'insider threats'; illegally altering portions of code, providing the IP addresses of the organization's servers to rivals, and engaging in activities such as phishing, wiretapping, password breaking and identity theft are some examples. This investigation concentrates on activities performed by system users by mistake, which nonetheless leave the information system vulnerable to attack. End-user behavior can expose a system to security threats. An organization must also address the human side in order to apply information system security fully; otherwise the security will be inadequate, leaving the system vulnerable to attack.

Moreover, some security experts have rejected the idea that automating procedures can eliminate human mistakes in information systems security: investigations have shown that the majority of human interactions with systems are problematic to automate, and information security cannot be totally automated (Hassell et al., 2004; Gonzales et al., 2002).

It is not possible to separate the human from the technology factors: both elements are indispensable for achieving a given task. Today very few professions can claim to get by without the help of machines. At the same time, machines have no intuition or intelligence; they require instructions in the form of commands such as setup, start and stop operations. The human worker can receive feedback from the machine, e.g. control parameters, alarms and other data, but only humans can understand such machine data, analyze it and transform it into new machine inputs. Humans are not ready to live in a fully automated society: an attempt by Airbus to develop fully automated airliners was rejected by consumers. Interaction between humans and machines will always exist (Bubb, 2005).

Both machines and humans are subject to error and can influence the quality of a product, although ultimately every failure can be put down to a human mistake. Our society tends always to search for someone to bear the responsibility for an accident or error; in that sense, humans are under constant pressure and hold the responsibility for the quality of the end product.

Depending on the nature of the industry, such errors can result in huge losses, so potential human errors cannot be ignored in a thorough risk analysis. There can be many different reasons for human error, including carelessness, inadequate training, lack of supervision and lack of concentration.

11. The Contribution of Human Errors in Information Security

It has been reported that in excess of 80% of accidents, in settings stretching from air transport operations to nuclear power plants, are attributed to human error (Harper et al., 2011; Hollnagel, 1993). Even if we conservatively estimate the impact of human error on security practices at two-thirds of that in safety accidents, human error would still be present in the majority of security incidents.

Information security investigators have begun concentrating on human errors and generating statistics about them, acknowledging them as a large constituent of computer security problems. The Global Financial Services Industry (GFSI) Security Survey disclosed that the majority (86%) of respondents endorsed human error as the main cause of information systems failure (Ahmed et al., 2012). As shown in Table 1, 65% of the economic loss caused by information security breaches was due to human error and only 3% to malicious outsiders; the figures derive from (Brostoff, 2004; Fischer and Hubner, 2001), as mentioned by the National Institute of Standards and Technology. Brostoff (2004) and Spruit et al. (1996) found that 41% of security incidents were due to human error, while only 9% were due to deliberate crimes.

Table 1. The contribution of human errors in information security (Ahmed et al., 2012)

Percentage of economic loss
Violations (22%), sabotage: malicious outsiders 3%, dishonest employees 13%, disgruntled employees 6%
Errors (65%): slips and lapses (skill-based errors); mistakes (rule-based and knowledge-based errors)

An interesting experiment was carried out by the U.S. Department of Homeland Security to identify how simple it would be for hackers to influence workers in order to gain entry into computer systems (Edwards, 2001). It involved covertly dropping computer discs and USB sticks in the car parks of government buildings and private contractors. Nearly 60% of those who picked the devices up plugged them into their office computers; if the drive or CD bore an official logo, the installation rate rose to 90%.

Although much of the statistics generated thus far concentrates on human error in organizational settings, there is still no sound investigation of, or statistics on, methods for reducing or moderating human error.

12. Human Error Models and Concepts

To hinder human errors from happening in information security contexts, it is essential to recognize the diverse kinds of human error, inform users of the potential risks and implement strategies to avoid them. Within the field of human factors, a variety of models and concepts have been created for understanding and describing the various kinds and stages of human error, and these models and concepts have been employed productively in a variety of industries to examine the causes of accidents.

Efforts at modeling human error in a way that provides predictive power are scarce. There have, however, been a large number of taxonomic and descriptive efforts to explain human error behavior. Some of the best known are the Generic Error Modeling System (GEMS) approach (Reason, 1990), the stages-of-action model (Norman, 1986) and the Cognitive Reliability and Error Analysis Method (CREAM) (Hollnagel, 1998). However, they are neither mechanistic, which limits their explanatory power, nor predictive. Without predictive power these approaches cannot generally be used to determine which of two designs would generate fewer or less serious errors.

12.1 Cognitive Reliability and Error Analysis Method (CREAM)

The Cognitive Reliability and Error Analysis Method (CREAM), developed by Hollnagel (1998), is a representative second-generation HRA method. Two main features of CREAM are that (1) it emphasizes the important influence of context on human performance, and (2) it has a useful cognitive model and framework that can be used in both retrospective and prospective analysis. The core idea of CREAM is that human error is not stochastic but is largely shaped by the context of the task. CREAM identifies nine common performance conditions (CPCs), which together provide a comprehensive, well-structured basis for characterizing the conditions under which performance is expected to take place. The term control mode is used to reflect the characteristics of the different conditions. Four control modes are defined in CREAM: scrambled control, opportunistic control, tactical control and strategic control, with the error probability reducing progressively along that order. Which control mode applies depends on the combined characteristics of all the CPCs, i.e. the combined CPC score. A typical combined CPC score can be derived simply by counting the number of CPCs expected (1) to reduce performance reliability, (2) to have no significant effect, and (3) to improve performance reliability; this can be expressed as Σreduced, Σnot significant and Σimproved. One purpose of the prospective analysis in CREAM is to provide a quantified prediction of performance reliability in the context of probabilistic safety assessment (PSA). CREAM approaches the quantification in two steps, through a basic and an extended method. The basic method of CREAM can be used for an initial screening of human actions, producing an overall assessment of performance reliability. Considering the characteristics of the nine CPCs of a task, one of the four control modes is selected in the relatively simple way illustrated in Figure 3. Each control mode has a corresponding reliability interval, shown in Table 2. Although the aim of the basic method is to find a good indicator of the probability of an interaction rather than a precise failure probability, the intervals in Table 2 appear to be too wide even for use in screening (Fujita and Hollnagel, 2004).

Figure 3. Relations between CPC score and control modes.


Table 2. Control modes and probability intervals (He et al., 2008)

Control mode      Probability interval
Strategic         0.00005 < p < 0.01
Tactical          0.001 < p < 0.1
Opportunistic     0.01 < p < 0.5
Scrambled         0.1 < p < 1.0
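As an illustration of the basic method's screening step, the sketch below counts the CPC assessments into the triplet (Σreduced, Σnot significant, Σimproved) and maps the result to a control mode with the reliability intervals of Table 2. The region boundaries of Figure 3 are not reproduced in this text, so the thresholds used here are simplified assumptions for illustration, not Hollnagel's exact mapping.

# Hedged sketch of CREAM's basic screening method. Control-mode
# selection normally follows the regions of Figure 3; since those exact
# boundaries are not reproduced here, select_control_mode() uses a
# simplified, assumed rule based only on the net CPC score.
from collections import Counter

# Table 2 intervals (He et al., 2008).
PROBABILITY_INTERVALS = {
    "strategic": (0.00005, 0.01),
    "tactical": (0.001, 0.1),
    "opportunistic": (0.01, 0.5),
    "scrambled": (0.1, 1.0),
}

def cpc_score(assessments):
    """Count the nine CPC assessments into the combined CPC score.

    Each assessment is 'reduced', 'not_significant' or 'improved'.
    Returns the triplet (sum_reduced, sum_not_significant, sum_improved).
    """
    c = Counter(assessments)
    return c["reduced"], c["not_significant"], c["improved"]

def select_control_mode(assessments):
    """Pick a control mode from the CPC score (simplified assumption)."""
    reduced, _, improved = cpc_score(assessments)
    net = improved - reduced   # illustrative stand-in for Figure 3 regions
    if net >= 4:
        return "strategic"
    if net >= 1:
        return "tactical"
    if net >= -3:
        return "opportunistic"
    return "scrambled"

# Usage: a task where several conditions degrade performance reliability.
task = ["reduced"] * 3 + ["not_significant"] * 5 + ["improved"]
mode = select_control_mode(task)
print(mode, PROBABILITY_INTERVALS[mode])   # opportunistic (0.01, 0.5)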

12.2 Situation Awareness (SA)

In complex and dynamic environments, human decision making is highly dependent on situation awareness (SA), a constantly evolving picture of the state of the environment. SA is formally defined as a person's 'perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future' (Endsley, 1988).

SA encompasses not only an awareness of specific key elements in the situation (Level 1 SA), but also a gestalt comprehension and integration of that information in light of operational goals (Level 2 SA), along with an ability to project future states of the system (Level 3 SA). The higher levels of SA (Levels 2 and 3) have been found to be particularly critical to effective functioning in complex environments such as the cockpit, air traffic control, driving, medicine and control rooms.

The consequences of error in these environments can be severe. Failures in human decision

making are frequently cited as causal in investigations of error in a wide variety of these

systems. In aviation mishaps, for instance, failures in decision making are attributed as a causal

factor in approximately 51.6% of all fatal accidents and 35.1% of non-fatal accidents, of the 80-

85% of accidents which are attributed to human error (Jensen, 1982).

While some of these incidents may represent failures in actual decision making (action selection), a high percentage are actually errors in situation awareness: the aircrew makes the correct decision for its picture of the situation, but that picture is in error. This is a fundamentally different category of problem from a decision error, in which the correct situation is comprehended but a poor course of action is chosen, and it calls for different types of solutions.
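The sketch below is an illustrative encoding of this distinction, not part of Endsley's published method: incident causes are tagged with one of the three SA failure levels, or as a decision error when the picture of the situation was correct but the chosen action was poor. The incident descriptions are hypothetical.

```python
from enum import Enum

class SAFailure(Enum):
    LEVEL_1 = "failure to perceive key elements"
    LEVEL_2 = "failure to comprehend or integrate information"
    LEVEL_3 = "failure to project future system states"
    DECISION = "correct picture of the situation, poor choice of action"

# Hypothetical incident annotations, for illustration only.
incidents = {
    "alert not noticed during shift handover": SAFailure.LEVEL_1,
    "alert seen but its safety implication missed": SAFailure.LEVEL_2,
    "trend understood but escalation not anticipated": SAFailure.LEVEL_3,
    "situation fully understood, wrong procedure chosen": SAFailure.DECISION,
}

sa_failures = [desc for desc, level in incidents.items()
               if level is not SAFailure.DECISION]
print(f"{len(sa_failures)} of {len(incidents)} incidents are SA errors")
```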

12.3 Generic Error Modeling System

The cognitive mechanisms involved in human errors, as well as the role of organizational and management factors in the formation of error-prone conditions, are studied through the Generic Error Modeling System (GEMS) (Reason, 1990; Reason and Rowan, 1981). This model offers a potential framework for clarifying human errors in information security. GEMS is an extension of the SRK approach and is explained thoroughly in Reason (1990). GEMS demonstrates how switching occurs between the different kinds of information processing (skill-, rule- and knowledge-based) within a task, as displayed in Figure 4.


Figure 4. Dynamics of the Generic Error Modeling System (GEMS) (Reason, 1990; Embrey, 2005)

In Reason's (1990) GEMS model, mental operations can be in either the attentional mode or the schematic control mode. The attentional mode is associated with the user's awareness and working memory; it is slow, effortful, and difficult to sustain for an extended period. Tasks such as setting goals, monitoring progress and recovering from errors or mistakes are generally performed in this mode; in a security context, a user may rely on it when recalling system logon details such as a username and password. The schematic control mode, by contrast, processes familiar information rapidly, requires no conscious effort or significant mental strain, and is not restricted in the amount or retention time of stored information. Different types and levels of human error may occur within the various cognitive processing stages (Embrey, 2005).
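A minimal sketch of this switching dynamic follows, under the assumption that schematic control can be approximated by a lookup of stored rules and attentional control by slow, effortful reasoning that may create new rules; the rule store and event names are hypothetical.

```python
# Minimal sketch of GEMS-style mode switching (after Reason, 1990).
# The rule store and event names are hypothetical.

RULE_STORE = {
    "login_prompt": "type memorised username/password",
    "routine_alarm": "acknowledge and log",
}

def handle(event, attention_budget):
    """Try schematic control first; fall back to effortful attentional
    processing when no stored rule matches the event."""
    if event in RULE_STORE:                  # schematic mode: fast, automatic
        return f"schematic: {RULE_STORE[event]}"
    if attention_budget <= 0:                # attentional mode is costly and
        return "error: attention exhausted"  # hard to sustain for long
    # Attentional mode: slow reasoning from first principles.
    plan = f"diagnose '{event}' from first principles"
    RULE_STORE[event] = plan                 # successful solutions become rules
    return f"attentional: {plan}"

print(handle("login_prompt", attention_budget=3))
print(handle("unfamiliar_security_warning", attention_budget=3))
print(handle("unfamiliar_security_warning", attention_budget=3))  # now rule-based
```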


13. Categories of Behavior to Distinguish Types of Error

Reason (1990) assumes that human errors may be separated into classifications of behavior based upon an individual's level of performance. The errors can be differentiated by both psychological and situational variables.

In Figure 5, the slips/mistakes distinction is further elaborated by relating it to the Rasmussen

SRK classification of performance discussed earlier. Slips can be described as being due to

misapplied competence because they are examples of the highly skilled, well practiced activities

that are characteristic of the skill-based mode. Mistakes, on the other hand, are largely confined

to the rule and knowledge based domains.

Figure 5. Classification of Human Errors (adapted from Embrey, 2005)

The three categories of human error behavior are skill-based, rule-based and knowledge-based errors. Skill-based errors: these errors are immediate, unconscious and commonly made; they occur under the schematic control mode and are known as slips, inadvertent actions or lapses. Rule-based errors: in this kind of behavior, previously stored rules are selected and applied to the information; it is mostly immediate and unconscious, and it occurs when a change is required to alter the automatic behavior found at the skill-based level. To monitor the progress and outcome of the action, the user may apply a memorized rule with periodic checks. Knowledge-based errors: this kind of behavior operates from first principles and occurs under attentional control; it arises only after repeated failures, when no pre-existing answer is available. In general, most errors are skill-based rather than rule- or knowledge-based (Embrey, 2005).
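As a rough illustration of how these three categories could be distinguished in practice, the toy classifier below keys on the two features emphasized above: whether the act was automatic, and whether a stored rule was applied. The feature names and examples are hypothetical, not part of Embrey's scheme.

```python
def classify_error(automatic_action, applied_stored_rule):
    """Toy SRK classifier for the three behaviour levels above.
    automatic_action: the act was executed without conscious thought.
    applied_stored_rule: a previously learned if-then rule was applied."""
    if automatic_action:
        return "skill-based (slip/lapse)"
    if applied_stored_rule:
        return "rule-based mistake (wrong or misapplied rule)"
    return "knowledge-based mistake (faulty first-principles reasoning)"

# Hypothetical examples:
print(classify_error(True, False))   # habitual click-through on a warning
print(classify_error(False, True))   # stored password rule applied where policy forbids it
print(classify_error(False, False))  # novel phishing page misjudged as legitimate
```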

An influential classification of the different types of information processing involved in

industrial tasks was developed by J. Rasmussen of the Risø Laboratory in Denmark. This


scheme provides a useful framework for identifying the types of error likely to occur in different

operational situations, or within different aspects of the same task where different types of

information processing demands on the individual may occur. The classification system, known

as the Skill, Rule, Knowledge based (SRK) approach is described in a number of publications,

e.g. Rasmussen (1982, 1987) and Reason (1990). An extensive discussion of Rasmussen's influential work in this area is contained in Goodstein et al. (1988), which also includes a comprehensive bibliography.

The terms skill-, rule- and knowledge-based information processing refer to the degree of conscious control the individual exercises over his or her activities. Two extreme cases are contrasted in Table 3. In the knowledge-based mode, the human conducts the task in nearly full awareness. This would happen where a beginner is performing the task (e.g. a trainee process worker) or where an experienced individual is confronted with a completely new situation. In both cases the worker would have to exert considerable mental effort to evaluate the situation, and his or her responses are likely to be slow. In addition, after each control action the worker would need to evaluate its effect before taking the next action, which would slow the responses to the situation even further.

The skill-based mode refers to the efficient execution of highly practiced, largely physical actions with almost no conscious monitoring. Skill-based responses are generally triggered by a specific event, for example the requirement to operate a valve, which may arise from an alarm, a procedure, or another individual. The highly practiced operation of opening the valve will then be executed largely without conscious thought.

Table 3. Contrast of two extreme cases: knowledge-based mode and skill-based mode

Knowledge-Based Mode (Conscious)            Skill-Based Mode (Automatic)
Unskilled or occasional user                Skilled, regular user
Novel environment                           Familiar environment
Slow                                        Fast
Effortful                                   Effortless
Requires considerable feedback              Requires little feedback
Causes of error:                            Causes of error:
  i. Overload                                 i. Strong habit intrusions
  ii. Manual variability                      ii. Frequently invoked rule used inappropriately
  iii. Lack of knowledge of modes of use      iii. Situational changes that do not trigger the need to change habits
  iv. Lack of awareness of consequences


Table 4. Summary of reviewed models (Fotta et al., 2005)

GEMS
  Skill-based: Inattention - double-capture slips; omissions following interruptions; reduced intentionality; perceptual confusions; interference errors
  Skill-based: Overattention - omissions; reversals; repetitions
  Rule-based: Misapplication of good rules - first exceptions; countersigns and non-signs; rigidity; informational overload; rule strength; redundancy; general rules
  Rule-based: Application of bad rules - lack of encoding; protection by specific rules; wrong rules; inaccurate encoding; inelegant rules; inadvisable rules
  Knowledge-based (KB) - selectivity; workspace limitations; illusory correlation; out of sight, out of mind; confirmation bias; causality; overconfidence; biased reviewing
  KB: Problems with complexity - delayed feedback; thematic vagabonding; encysting; causal series vs. nets; processes in time

CREAM
  Observation (O) - observation missed
  O: False observation - false reaction; false recognition
  O: Wrong identification - mistaken cue; partial identification; incorrect identification
  Interpretation (I) - delayed interpretation; incorrect prediction
  I: Faulty diagnosis - wrong diagnosis; incomplete diagnosis
  I: Wrong reasoning - induction error; deduction error; wrong priorities
  I: Decision error - decision paralysis; wrong decision; partial decision
  Planning (P): Inadequate plan - incomplete plan; wrong plan
  P: Priority error - wrong goal selected
  Temporary, person-related (TP) - delayed response; performance variability; inattention
  TP: Memory failure - forgotten; incorrect recall; incomplete recall
  TP: Fear - random actions; freeze
  TP: Distraction - task suspended; task not completed; goal forgotten

Situation Awareness
  Level 1: Failure to correctly perceive information - data not available; data discrimination/detection; memory loss; misperception of data; failure to monitor or observe data
  Level 2: Failure to correctly integrate or comprehend information - poor mental model; use of incorrect mental model; over-reliance on default values; other
  Level 3: Failure to project future actions or state of the system - poor mental model; over-projection of current trends; other
  General - habitual schema; failure to maintain multiple goals


14. Classification of Identified Human Error Factors

Information security breaches caused by the human errors of computer users can arise in a variety of ways. Lack of computer knowledge, technical faults, or simple carelessness on the part of users can all result in such errors.

In this internet age, an ever-increasing population has access to a computer. However, most people know only the basics of using one, such as forwarding emails, web browsing and word processing. The significance of security measures such as anti-virus software, firewalls, regular updates and patches is not known or understood by most users (Roberts, 2004). Such users easily fall victim to malicious software and hackers, and this kind of user fault can cause a computer to be compromised and used as a launch pad for further attacks on other unsecured systems.

Carelessness is probably one of the most typical and damaging causes of human error in information security contexts. Many common security breaches can be connected to carelessness: users writing passwords on sticky notes attached to keyboards, users entering dangerous websites despite repeated warnings displayed by their web browsers, and workers deliberately disregarding or failing to abide by security policies and procedures.

Inconsiderate and untrained insiders pose even greater dangers to organizations; this group includes malevolent and dissatisfied employees as well as workers who fall victim to social engineering attacks. Businesses lose millions due to security breaches, most of which can be traced back to human error. Regardless of their investments in physical and software security measures, most organizations remain susceptible to some of the most rudimentary security risks. A balanced combination of policies, procedures, training and technology can help organizations reduce the risk of human errors.
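As a simple illustration of such a balanced combination, the sketch below scores an organization's residual human-error risk against a small checklist of controls. The control names and weights are invented for demonstration and carry no empirical weight.

```python
# Illustrative (not empirically derived) checklist scoring for the
# balanced combination of controls discussed above.

CONTROLS = {
    "written security policy enforced": 0.25,
    "incident-reporting procedure in place": 0.15,
    "regular security awareness training": 0.35,
    "technical safeguards (AV, firewall, patching)": 0.25,
}

def residual_human_error_risk(in_place):
    """in_place: set of control names present in the organization.
    Returns a toy residual-risk fraction in [0, 1]."""
    covered = sum(w for name, w in CONTROLS.items() if name in in_place)
    return round(1.0 - covered, 2)

print(residual_human_error_risk({"written security policy enforced",
                                 "regular security awareness training"}))  # 0.4
```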



Table 5. Previous research on human errors

The table maps each reviewed study to the human-error factors it examined: human memory, carelessness, high workload, policy, password, culture, awareness, employee cooperation, management support, security training, social engineering, and education.

Studies reviewed: Stanton et al. (2005); Nikolakopoulos et al. (2009); Sapronov (2005); Kahraman (2005); Newman (2003); Kraemer and Carayon (2005); Besnard and Arief (2004); Sasse, Brostoff and Weirich (2001); Adams and Sasse (1999); Schneier (2000); Rupere et al. (2012); William (2002); Schultz (2005); Wagner (1997); Danchev (2006); Hinson (2003); Moos (2006); Pfleeger (2003); Panko (2008); Siponen (2000); Sarriegi et al. (2005, 2006); Albrechtsen (2007); Kraemer et al. (2006); Kraemer and Carayon (2007); Fulford and Doherty (2003); Karyda et al. (2005); Ruighaver et al. (2007); Knapp et al. (2006).

15. Motivational Factors and Information System Security

Burns (1978) and Aarons (2006) found that better management performance is associated with the ways in which leaders monitor and guide their employees. Leadership is a process of influencing others to follow rules and methods in order to reach objectives, and it is exercised through a leadership style. Two such styles are the transformational and the transactional leadership style. As Burns (1978) notes, transformational leaders stand with their team and encourage its members; they are fully involved in their teams' activities. Transactional leaders, according to Bass (1985), act within existing systems and cultures and control the system closely during implementation in an organization. Many leadership studies have revealed that both leadership styles notably affect employees' work performance (Kaushal, 2011; Lo et al., 2010). Powerful leadership is needed to direct users in their decisions and to help them comply with information security policies.

16. Organizational Factors and Information System Security

Based on past studies, we can categorize some of the factors influencing information system security as organizational factors. These factors include culture, policy and workload.

Culture and policy are treated as organizational elements in information security and computing research. Many different aspects have been studied, for instance computer distribution, the effective application of information security policies (Fulford and Doherty, 2003), employees' responses to security requirements (Pahnila, 2007), and the significance of policy management and the organizational process for the acceptance and execution of an information security policy (Karyda, 2005).

Because Malaysia is a multinational country, and past studies have revealed that cultural issues can affect employees' behavior, investigating the role of culture should contribute substantially to improving information system security.

Organizational policy is likewise one of the important factors that can compel employees to act properly, which in turn increases information system security (Sarriegi et al., 2006; Fulford and Doherty, 2003). In addition, many studies have identified workload as another of these factors (Kraemer et al., 2006; Kraemer and Carayon, 2007).


17. Learning and Information System Security

Above and beyond monitoring and controlling employees, information security training plays an important role in employees' acquisition of information security skills. Information security training is a program that introduces and provides information about the importance of information system security; it increases users' skill and understanding of information security. The value of information system security can be realized through security training (Koskosas et al., 2011; Martin and Rice, 2011). Efficient security training educates employees, and consequently management should ensure that security training in their organizations is conducted professionally. The two learning factors are individual learning and organizational learning.

18. Proposed Framework

Following the discussion above, this study categorizes the main factors influencing information system security in Malaysian hospitals. In this framework, every linkage is supported by previous studies (see Figure 6). The framework links Organizational Factors (Culture, Policy), Motivational Factors (Management Support, Reward/Penalty, Appraisal) and Learning (Individual Learning, Organizational Learning) to Information System Security.

Figure 6. Proposed Framework
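For illustration only, the framework's factor groups can be encoded as a simple data structure. The field names follow Figure 6, while the 0-1 ratings and the unweighted mean used as a composite indicator are assumptions made for this sketch, not part of the proposed framework.

```python
from dataclasses import dataclass

@dataclass
class FrameworkAssessment:
    # Organizational factors
    culture: float = 0.0
    policy: float = 0.0
    # Motivational factors
    management_support: float = 0.0
    reward_penalty: float = 0.0
    appraisal: float = 0.0
    # Learning
    individual_learning: float = 0.0
    organizational_learning: float = 0.0

    def iss_indicator(self) -> float:
        """Unweighted mean of all factor ratings; an assumed
        composite, not part of the framework itself."""
        ratings = [self.culture, self.policy, self.management_support,
                   self.reward_penalty, self.appraisal,
                   self.individual_learning, self.organizational_learning]
        return sum(ratings) / len(ratings)

hospital = FrameworkAssessment(culture=0.7, policy=0.8,
                               management_support=0.6, reward_penalty=0.4,
                               appraisal=0.5, individual_learning=0.6,
                               organizational_learning=0.5)
print(f"composite ISS indicator: {hospital.iss_indicator():.2f}")
```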

19. Conclusion

Given the urgent need of the Malaysian healthcare industry for information system security (ISS), this study has reviewed various variables that may affect ISS. Human factors accounted for a large share of these variables; by considering previous research together with important factors in human resource management, this research has proposed a new framework. The framework is based on three groups of factors: motivational factors, organizational factors and learning.

Finally, this research suggests testing the proposed framework in the Malaysian healthcare industry, and also in other domains, in order to highlight the importance of each variable.

For future study, it is recommended that other researchers emphasize the factors rooted in human resource management. In other words, the lack of management topics in information system security research is evident. For example, the role of HRM practices, job satisfaction and organizational commitment in improving information system security should be highlighted.



REFERENCES

Aarons, G. A. (2006). Transformational and transactional leadership: Association with attitudes toward evidence-based practice. Psychiatric Services, 57, 1162-1169.

Adams, A., & Sasse, M. A. (1999). Users are not the enemy. Communications of the ACM, 42(12), 41-46.

Ahmed, M., Sharif, L., Kabir, M., & Al-Maimani, M. (2012). Human errors in information security. International Journal, 1(3).

Albrechtsen, E. (2007). A qualitative study of users' view on information security. Computers & Security, 26(4), 276-289.

Anderson, R. J. (2008). Security Engineering: A Guide to Building Dependable Distributed Systems (2nd ed.). New York: Wiley.

Arce, I. (2003). The weakest link revisited. IEEE Security and Privacy, 1, 72-76.

Bass, B. M. (1985). Leadership and Performance Beyond Expectation. New York: The Free Press.

Besnard, D., & Arief, B. (2004). Computer security impaired by legitimate users. Computers and Security, 23, 253-264.

Bishop, M. (2002). Computer Security: Art and Science. Addison Wesley Professional.

Bishop, M. (2003). Computer Security: Art and Science. Pearson Education, Inc.

Boujettif, M., & Wang, Y. (2010). Constructivist approach to information security awareness in the Middle East. In Broadband, Wireless Computing, Communication and Applications (BWCCA), 2010 International Conference.

Brostoff, A. (2004). Improving password system effectiveness. PhD thesis, UCL, UK, unpublished.

Bubb, H. (2005). Human reliability: a key to improved quality in manufacturing. Human Factors and Ergonomics in Manufacturing, 15(4), 353-368.

Burns, J. M. (1978). Leadership. New York: Harper and Row.

Carayon, P., & Kraemer, S. (2009). A human factors vulnerability evaluation method for computer and information security.

Cresswell, A., & Hassan, S. (2007). Organizational impacts of cyber security provisions: a sociotechnical framework. In 40th Hawaii International Conference on System Sciences.

Danchev, D. (2006). Reducing "human factor" mistakes.

Dhillon, G., & Backhouse, J. (2001). Current directions in IS security research: towards socio-organizational perspectives. Information Systems Journal, 11, 127-153.

Edwards, C., Kharif, O., & Riley, M. (2011). Human errors fuel hacking as test shows nothing stops idiocy. Bloomberg, June 2011. Online at http://www.bloomberg.com/news/2011-06-27/human-errors-fuel-hacking-as-test-showsnothing-prevents-idiocy.html. Accessed 13th March 2012.

Embrey, D. (2005). Understanding human behaviour and error. Human Reliability Associates, 1, 1-10.

Endsley, M. R. (1988). Design and evaluation for situation awareness enhancement. In Proceedings of the Human Factors Society 32nd Annual Meeting (pp. 97-101). Santa Monica, CA: Human Factors Society.


Fischer-Hübner, S. (2001). IT-Security and Privacy: Design and Use of Privacy-Enhancing Security Mechanisms. Springer-Verlag.

Flechais, I. (2005). Designing Secure and Usable Systems. University College London.

Fotta, M. E., Byrne, M. D., & Luther, M. S. (2005). Developing a human error modeling architecture (HEMA). In Proceedings of Human-Computer International. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Fulford, H., & Doherty, N. F. (2003). The application of information security policies in large UK-based organizations: an exploratory investigation. Information Management & Computer Security, 11(3), 106-114.

Furnell, S. (2007). Making security usable: are things improving? Computers & Security, 26(6), 434-443.

Furnell, S. (2005). Why users cannot use security. Computers and Security, 24, 274-279.

Gonzales, J. J., & Sawicka, A. (2002). A framework for human factors in information security. In Proceedings of the WSEAS International Conference on Information Security, Rio de Janeiro, Brazil.

Gonzalez, J. J., & Sawicka, A. (2003). A framework for human factors in information security.

Harper, A., Harris, S., Ness, J., Eagle, C., Lenkey, G., & Williams, T. (2011). Gray Hat Hacking: The Ethical Hacker's Handbook (3rd ed.). McGraw Hill.

Hassell, L., & Wiedenbeck, S. (2004). Human Factors and Information Security. Drexel University College of Information Science and Technology.

He, X., Wang, Y., Shen, Z., & Huang, X. (2008). A simplified CREAM prospective quantification process and its application. Reliability Engineering & System Safety, 93(2), 298-306.

Hinson, G. (2003). Human factors in information security.

Hollnagel, E. (1998). Cognitive Reliability and Error Analysis Method. Oxford, UK: Elsevier Science Ltd.

Hollnagel, E. (1993). Human Reliability Analysis: Context and Control. London: Academic Press.

Huang, D., Rau, P. P., & Salvendy, G. (2007). A survey of factors influencing people's perception of information security. In J. Jacko (Ed.), Human-Computer Interaction, Part IV. Heidelberg: Springer.

Humaidi, N., & Balakrishnan, V. (2013). Exploratory factor analysis of users' compliance behaviour towards health information system's security. Journal of Health & Medical Informatics.

Jensen, R. S. (1982). Pilot judgment: training and evaluation. Human Factors, 24(1), 61-73.

Kahraman, E. (2005). Evaluating IT security performance with quantifiable metrics. Master's thesis, DSV SU/KTH.

Karyda, M., Kiountouzis, E., & Kokolakis, S. (2005). Information systems security policies: a contextual perspective. Computers & Security, 24, 246-260.

Kaushal, S. (2011). Effect of leadership and organizational culture on information technology effectiveness: A review. In Research and Innovation in Information Systems (ICRIIS) International Conference.


Koskosas, I., Kakoulidis, K., & Siomos, C. (2011). Examining the linkage between information security and end-user trust. International Journal of Computer Science & Information Security, 9, 21-31.

Kraemer, S., Carayon, P., & Clem, J. F. (2006). Characterizing violations in computer and information security systems. In Proceedings of the 16th Triennial Congress of the International Ergonomics Association, Maastricht, The Netherlands.

Kraemer, S., & Carayon, P. (2007). Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists. Applied Ergonomics, 38(2), 143-154.

Kreicberge, L. (2010). Internal threat to information security: countermeasures and the human factor within SMEs. Business Administration and Social Sciences, Lulea University of Technology, 1-66.

Leach, J. (2003). Improving user security behaviour. Computers & Security, 22(8), 685-692.

Lo, M.-C., Ramayah, T., & de Run, E. C. (2010). Does transformational leadership style foster commitment to change? The case of higher education in Malaysia. Procedia Social and Behavioral Sciences, 2, 5384-5388.

Maiwald, E. (2004). Fundamentals of Network Security. McGraw-Hill Technology Education.

Martin, N., & Rice, J. (2011). Cybercrime: Understanding and addressing the concerns of stakeholders. Computers & Security, 30, 803-814.

Mitnick, K. D., & Simon, W. L. (2002). The Art of Deception: Controlling the Human Element of Security. Indianapolis, IN: Wiley Publishing, Inc.

Moos, T. T. (2006). Cisco-sponsored security survey of remote workers reveals the need for more user awareness.

Newman, R. E. (2003). Enterprise Security (1st ed.). Pearson Education, Inc.

Nikolakopoulos, T. (2009). Evaluating the human factor in information security.

Norman, D. A. (1981). Categorization of action slips. Psychological Review, 88(1), 1-15.

Pahnila, S., Siponen, M., & Mahmood, A. (2007). Employees' behavior towards IS security policy compliance. In Proceedings of the 40th Annual Hawaii International Conference on System Sciences (HICSS'07). IEEE.

Pfleeger, C. P., & Pfleeger, S. L. (2003). Security in Computing. Prentice Hall.

Raymond, R. P. (2004). Corporate Computer and Network Security. Pearson Education, Inc.

Reason, J. (1990). Human Error. Cambridge, UK: Cambridge University Press.

Reason, P., & Rowan, J. (1981). Human Inquiry: A Sourcebook of New Paradigm Research. Chichester: John Wiley.

Roberts, P. (2004). AOL survey finds home user ignorant to online threats. ComputerWeekly, April 2010. Online at http://www.computerweekly.com/news/2240058434/AOLsurvey-finds-home-user-ignorant-to-online-threats. Accessed 10th March 2012.

Ruighaver, A. B., Maynard, S. B., & Chang, S. (2007). Organisational security culture: extending the end-user perspective. Computers & Security, 26(1), 56-62.


Rupere, T., Mary, M., & Zanamwe, N. (2012). Towards minimizing human factors in end-user information security. International Journal of Computer Science and Network Security, 12(12), 159-167.

Samy, N. G., Ahmad, R., & Ismail, Z. (2010). Security threats categories in healthcare information systems. Health Informatics Journal, 16, 201-209.

Sapronov, K. (2005). The human factor and information security.

Sarriegi, J. M., Santos, J., Torres, J. M., Imizcoz, D., & Plandolit, A. (2006). Modeling security management of information systems: analysis of an ongoing practical case. In The 24th International Conference of the System Dynamics Society, Nijmegen, The Netherlands.

Sasse, M. A., Brostoff, S., & Weirich, D. (2001). Transforming the 'weakest link': a human/computer interaction approach to usable and effective security. BT Technology Journal, 19(3), 122-131.

Schneier, B. (2000). Secrets and Lies: Digital Security in a Networked World. Indianapolis, IN: Wiley Publishing, Inc.

Schultz, E. (2005). The human factor in security. Computers & Security, 24, 425-426.

Siponen, M. T. (2000). A conceptual foundation for organizational information security awareness. Information Management & Computer Security, 8(1), 31-41.

Siponen, M., Pahnila, S., & Mahmood, M. A. (2010). Compliance with information security policies: An empirical investigation. Computer, 43(2), 64-71.

Spruit, M. E. M., & Looijen, M. (1996). IT security in Dutch practice. Computers and Security, 15(2), 157-170.

Stanton, J. M., Stam, K. R., Mastrangelo, P., & Jeffery, J. (2005). Analysis of end user security behaviors. Computers & Security, 24, 124-133.

Swain, A. D., & Guttman, H. E. (1983). Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. NUREG/CR-1278, U.S. Nuclear Regulatory Commission, Washington, D.C.

Wagner, D. A., & Crabb, M. D. (1997). System security: A management perspective. September.

Werlinger, R., Hawkey, K., & Beznosov, K. (2009). An integrated view of human, organizational, and technological challenges of IT security management. Information Management & Computer Security, 17(1), 4-49.

William, L. S., & Kevin, D. M. (2002). The Art of Deception. Wiley Publishing, Inc.

Knapp, K. J., Marshall, T. E., Rainer, R. K., & Ford, F. N. (2006). Information security: management's effect on culture and policy. Information Management & Computer Security, 14(1), 24-36.

Mitnick, K. D., & Simon, W. L. (2001). The Art of Deception: Controlling the Human Element of Security. Wiley.

Panko, R. R. (2008). IT employment prospects: beyond the dotcom bubble. European Journal of Information Systems, 17(3), 182-197.

Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4(2), 311-333.


Rasmussen, J. (1987). Cognitive control and human error mechanisms. New Technology and Human Error, 53-61.