
Page 1:

Security & Morality: A Tale of User Deceit

Models of Trust on the Web

Edinburgh, UK

May 2006

L. Jean Camp, C. McGrath, A. Genkina

Page 2:

Security & Morality: A Tale of User Deceit?

• Hypotheses about human trust behavior developed from social science

• Compared with implicit assumptions in common technical mechanisms

• Test computer-human trust behaviors

• Conclude with guidance for trust design

Page 3:

Design for Trust

• Start with human trust behaviors

• Trust
  – Used for simplification
  – Encompasses discrete technical problems
    • privacy, integrity, data security
  – Embeds discrete policy problems
    • business behavior, customer service, quality of goods, privacy

Page 4:

Human and Computer Trust

– Trust is approached differently by different disciplines

• Social Studies of Human Behavior
  – Studies based on a micro approach
  – Experiments to evaluate how people extend trust
  – Game theory
  – Common assumption: information exposure == trust

• Philosophy
  – Macro approach
  – Trust is a need
    » high default to trust
  – Trust is a tool for simplification
  – Examine societies and cultural practices

Page 5:

Experimental Definition of Trust

• Coleman’s Three-Part Test (see the sketch below)
  – enables something not otherwise possible
    • the individual who trusts is worse off if the trusted party acts in an untrustworthy manner
    • individuals who trust are better off if the trusted party acts in a trustworthy manner
  – there is no constraint placed on the trusted party
  – a time lag exists between a decision to trust and the outcome
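Coleman's test can be read as a conjunction of conditions that must all hold before a relation counts as trust. The following is a minimal sketch of that reading; the field names and payoff encoding are ours, purely illustrative, not Coleman's formalism or the authors'.

```python
from dataclasses import dataclass

@dataclass
class TrustSituation:
    """Illustrative encoding of Coleman's test; field names are ours, not Coleman's."""
    enables_otherwise_impossible: bool  # trusting enables something not otherwise possible
    payoff_if_honored: float            # truster's gain if the trusted party is trustworthy
    payoff_if_betrayed: float           # truster's loss if the trusted party is untrustworthy
    trusted_party_constrained: bool     # any constraint (contract, escrow) on the trusted party?
    time_lag: bool                      # outcome arrives after the decision to trust

def is_trust(s: TrustSituation) -> bool:
    """All of Coleman's conditions must hold for the situation to count as trust."""
    return (s.enables_otherwise_impossible
            and s.payoff_if_honored > 0      # better off if trust is honored
            and s.payoff_if_betrayed < 0     # worse off if trust is betrayed
            and not s.trusted_party_constrained
            and s.time_lag)

# Example: lending a stranger money with no contract, repayment due next month.
print(is_trust(TrustSituation(True, 10.0, -10.0, False, True)))  # True
```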

Page 6:

Trust & Individuation

• People interacting with a computer do not distinguish between computers as individuals but rather respond to their experience with "computers"
  – People begin too trusting
  – People learn to trust computers
    • first observed by Sproull among computer scientists on the net in 1991
    • confirmed by later experiments
  – Computers are perceived as moral agents

• People will continue to extend trust, so creating another source of trust doesn’t defeat trusting behaviors

Page 7:

Research on Humans Suggests...

• Humans may not differentiate between machines

• Humans become more trusting of ‘the network’

• Humans begin with too much trust for computers
  – Confirmed by philosophical macro observation
  – Confirmed by computer security incidents
    • E-mail based scams, viruses & hoaxes
    • Masquerade attacks

Page 8:

Three Hypotheses

• Do humans respond differently to human or computer "betrayals" in terms of forgiveness?

• Do people interacting with a computer distinguish between computers as individuals or respond to their experience with "computers"?
  – Does the tendency to differentiate between remote machines increase with computer experience?

Page 9:

Security & Morality: A Tale of User Deceit

• Hypotheses about human trust behavior developed from social science

• Compared with implicit assumptions in common technical mechanisms

• Test computer-human trust behaviors

• Conclude with guidance for trust design

Page 10:

H1: Response to Failure

• Do humans respond differently to human or computer "betrayals" in terms of forgiveness?
  – Attacks which are viewed as failures are 'ignored' or forgiven
  – Technical failures are seen as accidents rather than design decisions

• May explain why people tolerate repeated security failures

Page 11:

H2: Differentiation

• When people interact with networked computers, they discriminate among distinct computers (hosts, websites), treating them as distinct entities, particularly in their readiness to extend trust and secure themselves from possible harms.

• People become more trusting over time
• People differentiate more, not less, with experience
• Do people learn to differentiate or trust?

– “educate the user” may not work

Page 12:

Security & Morality: A Tale of User Deceit

• Hypotheses about human trust behavior developed from social science

• Compared with implicit assumptions in common technical mechanisms

• Test computer-human trust behaviors

• Conclude with guidance for trust design

Page 13:

The Experiment

• Developed three websites
  – "life management"

• Elephantmine.com

• Reminders.name

• MemoryMinder.us

Page 14:

Initial Tests

• What information would you share with each site?

• Do you trust the site?
  • user-defined trust, no macro definition given

• Rejected MemoryMinders.us
  • people dislike lime green?

• Other two designs had similar evaluations

Page 15:

Two “Betrayal” Types

• One group faced a technical betrayal
  – Another person’s data is displayed
  – "John Q. Wilson"
  – DoB, credit card number, social network data

• One group faced a moral betrayal
  – Change in privacy policy announced
  – Collection of third-party information correlated with compiled data
    • very common policy
    • eBay, Facebook, MySpace

Page 16:

Three Step Process

• Users introduced to first site
  – Sites in the same order

• Users experience betrayal
  – Half the users have technical failure
  – Half had privacy change
  – Both sets of users experience a failure upon departure from the first site

• Then users go to second site

Page 17:

Findings: Differentiation

• Users respond to first-site betrayal with significant change in behavior wrt the second site
  – users had on average seven years of experience with the Internet
  – computer experience not at all significant
  – second site not seen as a "new" entity

• Cannot support the hypothesis that users differentiate
  – users do not enter each transaction with a new calculation of risk

Page 18:

Findings: Betrayal Type

• Stronger reaction to privacy change
  – Yet technical failure indicated an inability to protect privacy

Willingness to share          Privacy betrayal (malevolence)     Security betrayal (incompetence)
                              Before   After                     Before   After
Your IM Buddy List            22%      9%     (p < .001)         16%      13%    (p < .001)
Coworkers’ Names & Contact    44%      31%    (p < .01)          42%      52%
Friends’ Names & Contact      53%      34%    (p < .001)         65%      68%
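The p-values in the table presumably come from comparing the before/after proportions within each condition. As a hedged illustration of how such a comparison can be made, the sketch below runs a chi-square test on a 2x2 contingency table; the sample size of 100 per condition is a made-up placeholder, since the slide reports only percentages.

```python
# Illustrative only: testing a before/after change in willingness to share.
# n = 100 is a hypothetical sample size, not the study's actual n.
from scipy.stats import chi2_contingency

n = 100
before_yes, after_yes = 22, 9          # e.g. "Your IM Buddy List", privacy-betrayal group
table = [
    [before_yes, n - before_yes],      # before betrayal: shared vs. withheld
    [after_yes,  n - after_yes],       # after betrayal:  shared vs. withheld
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```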

Page 19:

Security & Morality: A Tale of User Deceit

• Hypotheses about human trust behavior developed from social science

• Compared with implicit assumptions in common technical mechanisms

• Test computer-human trust behaviors

• Conclude with guidance for trust design

Page 20:

What To Conclude

• Assuming the human will act like the computer has been a core design problem

• Either remove assumptions about humans

• Or computer security must be designed with social science in mind

Page 21:

Differentiation

• The tendency to differentiate between remote machines decreases with computer experience
  – More use results in more lumping
    • Make better lumping
  – Explains common logons/passwords
    • along with cognitive limits
    • "My Internet is Down"

• Need explicit DO NOT TRUST signals

Page 22:

Observations

• Users are bad security managers
  – PGP, P3P, passwords, ...

• Security should necessarily be a default

• Surveys illustrate a continuing confusion of privacy & security
  – educate all Net users, OR
  – build upon the connection between the moral (privacy) and technical (security)

Page 23:

Computer security is built for machines

• Passwords
  – Humans are a bad source of entropy

• SSL
  – Two categories: secure and not secure
  – Requiring per-site differentiation does not enable human differentiation
  – Every site should include a unique graphic with the lock (see the sketch below)
  – Users trust all machines with the lock
    • SSL-secured phishing has already occurred
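One way to realize "a unique graphic with the lock" is to derive a stable, site-specific visual cue from the site's identity, so a lookalike site cannot simply display the same generic lock. The sketch below is our illustration of that idea, not the authors' design; the hash-to-color mapping and the function name are assumptions.

```python
# Illustrative sketch (not the authors' design): derive a deterministic,
# site-specific color that could accompany the browser lock indicator.
import hashlib

def site_color(domain: str) -> str:
    """Map a domain to a stable RGB hex color (hypothetical scheme)."""
    digest = hashlib.sha256(domain.lower().encode("utf-8")).digest()
    return "#" + digest[:3].hex()

for d in ("elephantmine.com", "reminders.name", "e1ephantmine.com"):
    print(d, site_color(d))  # the lookalike domain gets a visibly different color
```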

Page 24:

PKI is built for Machines

• Better lumping, not demands for user differentiation
  – Different levels of key revocation are needed (see the sketch below)

• Falsified initial credential
  – All past transactions suspect

• Change in status
  – Future transactions prohibited

• Unrecognized hierarchy
  – Messages are confusing

• No domain
  – No alert when moving to IP address space not connected to DNS
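The bullets above argue that a single undifferentiated "revoked" state throws away information a verifier could act on. A minimal sketch of distinguishing two of the listed revocation levels follows; the enum and handling rules are ours, inferred from the slide, not a real PKI library's API.

```python
# Illustrative sketch of level-specific revocation handling (names are ours, not a real PKI API).
from enum import Enum, auto

class RevocationReason(Enum):
    FALSIFIED_CREDENTIAL = auto()  # the initial credential was falsified
    STATUS_CHANGE = auto()         # the key holder's status changed

def transactions_to_distrust(reason: RevocationReason) -> str:
    """Which transactions a verifier should treat as suspect, per the slide's levels."""
    if reason is RevocationReason.FALSIFIED_CREDENTIAL:
        return "all past and future transactions"  # everything signed under the key is suspect
    if reason is RevocationReason.STATUS_CHANGE:
        return "future transactions only"          # past transactions remain valid
    raise ValueError(reason)

print(transactions_to_distrust(RevocationReason.STATUS_CHANGE))
```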

Page 25:

Building for Trust

• Security technologies are not adopted
  – patching, PGP

• Security technologies do not address user conceptions of trust
  – Patching
    • more secure machine with regular updates to Microsoft?
  – PGP
    • signed email w/o confidentiality to most people

• Technologies linking security (competence) to privacy (beneficence) may prove more effective in trust building than security alone

Page 26:

Example Project

• Focused on individuals
  – computer-computer trust
  – computer-human trust

• Explicit “do not trust” signals