Why Usable Security Matters and How Testing Can Help Achieve It (SECTEST Keynote)


1

Why Usable Security Matters and

How Testing Can Help Achieve It

SECTEST Keynote

Paul Ammann

March 31, 2014

2

Outline
• A Poll
• What’s wrong with usable security thinking
• The consequences of unusable security
• Lessons from airplane safety
• The path forward
• Security testing
  – What we need to test
  – How we need to test

4

What’s Wrong With ‘Usable Security’ Thinking?

Security implementers sometimes invent the user instead of discovering the user

5

Proper Focus: Fit with Users & Activity

• If you want productive & secure users
  – and security is usually the secondary task
• Then you need to understand
  – Primary user activities
  – User motivations
  – User behavior
  – Impact on bottom line

6

The Consequences of Unusable Security

• Unusable Security Costs Money

• Unusable Security Costs Security

7

Unusable Security Costs Money

8

Standard Security Thinking: “Users Should Make the Effort”

• Question: how much? It all adds up:
1. Time spent on security tasks: authentication, access control, warnings, security education …
2. Failure: time spent on errors and error recovery (user and visible organizational cost)
3. Disruption of primary tasks = re-start cost

9

Does This Really Help Security?

10

Time is Money

“An hour from each of the US’s 180 million online users is worth approximately US$2.5 billion. A major error in security thinking has been to treat users’ time—an extremely valuable resource—as free.”

C Herley, IEEE S&P Jan/Feb 2014
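The arithmetic behind Herley’s figure is easy to check. The hourly value of time below is an assumption (roughly the US average wage), since the quote gives only the total:

```python
# Back-of-envelope reproduction of Herley's figure. The per-hour value
# is an assumed number (~US average wage), not stated in the quote.
ONLINE_USERS = 180_000_000   # US online users, per the quote
VALUE_PER_HOUR = 14          # assumed $/hour

total = ONLINE_USERS * VALUE_PER_HOUR
print(f"One hour of everyone's time: ${total / 1e9:.1f} billion")
```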

11

Impact on Productivity – Lost Sales

Not a particularly effective security measure

Not usable: failure rate around 40%, so customers go elsewhere

“CAPTCHAs waste 17 years of human effort every day” (Pogue, Scientific American March 2012)

12

Authentication ‘Wall of Disruption’

13

Authentication Hate List

1. Why can’t I reuse my old password?
2. Repeated authentication to the same system (e.g. because of 15-minute time-outs)
3. Authenticating to infrequently used systems
  – Difficult to recall previous password
  – Password could have expired in the meantime
  – Resetting a password is not easy
4. Creating a valid password (different rules for each system)

14

Authentication Hate List

5. Managing a high number of different credentials
  – Different policies mean strategies for creating & recalling passwords don’t work
  – Which credentials to use for which system
6. Use of RSA tokens
  – “It's this extra, again, effortful stuff. I have to dig around in my bag and get the RSA ID token out and then set it on my laptop and then type out the number, make sure that you're not typing it right before it changes or as it's changing or whatever.”

15

Impact on Productivity – Long-Term

1. Users opt out of services, return devices
  – Improves their productivity, but often reduces organizational productivity (example: email)
  – Organization has less control over alternatives
2. Stifling innovation: new opportunities that would require changes in security
3. Staff leave the organization to be more productive/creative elsewhere

16

Unusable Security is Ridiculous …

17

One Alternative

18

Technology Should be Smarter than This

• Move from explicit to implicit authentication:
1. Proximity sensors to detect user presence
2. Behavioral biometrics: zero-effort, one-step, two-factor authentication
3. Exploit modality of interaction: use video-based authentication in video, audio in audio, etc.
4. Web fingerprinting can identify users – why not use it for good?
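One of these mechanisms, behavioral biometrics via keystroke dynamics, can be sketched very roughly. The timings, threshold, and function names here are invented for illustration; real schemes use richer features and trained models:

```python
# Illustrative sketch of a behavioral biometric: keystroke dynamics.
# A stored template of inter-key timings is compared to a fresh sample.
# All timings and the threshold are invented for illustration.

def timing_distance(template, sample):
    """Mean absolute difference between inter-key intervals (seconds)."""
    return sum(abs(t - s) for t, s in zip(template, sample)) / len(template)

def matches(template, sample, threshold=0.05):
    """Zero-effort check: accept if the typing rhythm is close enough."""
    return timing_distance(template, sample) < threshold

enrolled = [0.12, 0.31, 0.18, 0.22]   # user's enrolled rhythm
genuine  = [0.13, 0.29, 0.19, 0.24]   # same user, slight variation
imposter = [0.25, 0.10, 0.40, 0.08]   # different typing rhythm

print(matches(enrolled, genuine))    # → True
print(matches(enrolled, imposter))   # → False
```

The user never performs an explicit authentication step; the signal is collected as a side effect of the primary task.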

19

‘Green shoots’ 1: FIDO, a commercial alliance to replace passwords …

www.fido.org

20

‘Green shoots’ 2: Security that supports user goals: Parental controls

Apparently parents didn’t much care
But business users loved it!

PayPhrase discontinued in February 2012
“Purchase Delegation” introduced for business users

21

The Consequences of Unusable Security

• Unusable Security Costs Money

• Unusable Security Costs Security

22

Unusable Security Costs Security!

1. User errors – even when trying to be secure
2. Non-compliance/workarounds to get tasks done
3. Security policies that cannot be followed make effort seem futile:

“It creates a sense of paranoia and fear, which makes some people throw up their hands and say, ‘there’s nothing to be done about security,’ and then totally ignore it.”
– Expert Round Table, IEEE S&P Jan/Feb 2014

23

User Errors When Trying to be Secure

• Document redaction prone to error

• Is the document really free of confidential data?

• If not:– Blame the user?– Or look deeper?

24

Noncompliance

Are these legitimate users?

25

You Can Only Ask For So Much

26

Reasons For Non-Compliance

• Compliance requires ability and willingness

Can’t comply
– Security tasks that are impossible to complete – remove/redesign (security hygiene)

Could comply but won’t comply
– Security tasks that can be completed in theory, but require a high level of effort and/or reduce productivity – identify & reduce friction through better design or better policies

Can comply and do comply
– Security tasks that staff routinely comply with – provides examples of what is workable in a particular environment = template for security
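The three categories map onto a small decision table; a sketch, with the responses paraphrased from the slide:

```python
# The compliance categories as a decision table (responses paraphrased
# from the slide; the function name is invented for illustration).
def security_response(can_comply, willing):
    if not can_comply:
        return "remove/redesign the task (security hygiene)"
    if not willing:
        return "reduce friction via better design or policy"
    return "workable in this environment: template for security"

print(security_response(False, False))
print(security_response(True, False))
print(security_response(True, True))
```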

27

Revocation

• Usability and revocation
• Who identifies unneeded privileges?
  – Manager? Employee?
  – Answer says a lot about the organization
• Demo environment vs. actual practice
  – “How does that work with 1000 privileges?”

28

Old Security, No Longer Usable

• Entering a complex password on a touchscreen keyboard is time-consuming and error-prone

• Users look for passwords that are easy to enter → severely reduced password space
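The shrinkage is easy to quantify. The alphabet sizes below are illustrative assumptions (full printable ASCII vs. lowercase-only, the latter avoiding keyboard-layer switching on a touchscreen):

```python
import math

# Entropy loss when users restrict themselves to easy-to-type characters.
# Alphabet sizes are illustrative assumptions, not study data.
LENGTH = 8
full_set = 94   # printable ASCII: letters, digits, symbols
easy_set = 26   # lowercase only: no shift/symbol layers on a touchscreen

full_bits = LENGTH * math.log2(full_set)
easy_bits = LENGTH * math.log2(easy_set)
print(f"full set: {full_bits:.1f} bits, easy-to-type: {easy_bits:.1f} bits")
```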

29

New Security, Unusable Implementation

• Replacing existing 2FA card with a more secure one – good

• Replacing 6-digit numeric code with 8-char alphanumeric password (valid for 1 minute) – not good

30

Impact on Security – Long-Term

1. Increased likelihood of security breaches
2. ‘Noise’ created by habitual non-compliance makes malicious behavior harder to detect
3. Lack of appreciation of and respect for security creates a bad security culture
4. Frustration can lead to disgruntlement: intentional malicious behavior – insider attacks, sabotage

31

Lessons From Airplane Safety

• April 26, 1994, Nagoya, Japan
• Sequence of events on landing:
  – F/O inadvertently entered GO AROUND mode
  – Subsequent crew actions led to stall
  – Crash killed 264 of 271 on board
• Official cause
  – Crew error
• But usability played a key role
  – Mental model of crew diverged from actual airplane state
  – Crew actions reasonable in a different airplane state
• FAA and aircraft manufacturers have learned from this!

32

Analyzing Aircraft Accidents

• Typical report, as paraphrased by Norman:
  – Air Force: It was pilot error—the pilot failed to take corrective action.
  – Inspector General: That’s because the pilot was probably unconscious.
  – Air Force: So you agree, the pilot failed to correct the problem.
• There is a similar attitude in security
  – Fact: Users don’t do what they are supposed to
  – Question: Is it their fault?
• Can we learn from aircraft designers?
  – They have a multi-decade head start on us!

33

The Path Forward?

34

First Things First: There Has to be a Reason for System Developers to Care

• Some organizations don’t care about usability or usable security
  – Not much to do there
  – Dangerous invitation to competitors!
• Some do care
  – Q: How to make it happen?
  – A: High-level commitment
  – A: Feedback loops
  – A: Appropriate personnel

35

Models Have to Include the User

• Modern aircraft design has a critical role for human factors
  – The same needs to happen in security
• Security is a secondary task
  – Users are trying to do something else
• Modeling human behavior is critical
  – The user is part of the system
  – We need to understand how things go wrong

36

Notions from Fault Tolerance

• Error management can be viewed in terms of
  – Error avoidance
  – Error detection
  – Error recovery
• Software engineering is well-acquainted with this approach already!
  – But we haven’t really applied it to security

37

Classification of Errors (Norman)

• Slips
  – goal is correct
  – but execution is flawed
• Mistakes
  – goal is wrong

Violations are not errors
This looks a lot like mutation analysis!
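The parallel to mutation analysis can be made concrete: a slip resembles a single-operator mutant, where the goal (the test oracle) is right but one step of the execution is flawed. A minimal sketch, with invented example functions:

```python
# Minimal mutation-analysis sketch: a "slip" as a single-operator mutant.
# The goal (the oracle) is correct; only the execution is flawed.

def original(a, b):
    return a + b        # intended behavior

def mutant(a, b):
    return a - b        # single-operator slip: '+' became '-'

def oracle(f):
    """The goal: adding 2 and 3 should give 5."""
    return f(2, 3) == 5

print(oracle(original))   # → True: goal met
print(oracle(mutant))     # → False: the slip is detected (mutant killed)
```

Mutation analysis asks whether the test suite notices the flawed execution; the usable-security question is whether the system notices (or tolerates) the user’s slip.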

38

What We Need To Test

• Security mechanisms in practice
  – What actually happens?
  – What do users actually do?
• Models that incorporate user behavior
  – Can we assess how a given system behaves under various profiles of human behavior?
• Let’s look at two examples:
  – Warnings
  – Spear Phishing
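Assessing a system under a profile of human behavior can be made concrete with a toy model; all the rates below are invented for illustration, not study data:

```python
# Toy model: how a user-behavior profile changes a warning's effectiveness.
# All rates are invented for illustration.
def expected_compromises(downloads, malicious_rate, ignore_rate):
    """Warnings only help for the fraction of warnings users heed."""
    return downloads * malicious_rate * ignore_rate

baseline = expected_compromises(1000, 0.02, 1.0)  # profile: everyone ignores
observed = expected_compromises(1000, 0.02, 0.9)  # profile: 90% ignore
print(baseline, observed)
```

Even this crude model makes the point: with a 90% ignore rate, the warning removes almost none of the risk, so testing the mechanism in isolation (does the dialog appear?) says little about how the system behaves with real users in the loop.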

39

The Problem With Warnings

• Fact: PDF files are dangerous.
  – That’s a usability problem!
  – Is a generic warning helpful? Why not?
  – Is a detailed warning better?

40

The Problem With Warnings (2)

• How are users supposed to react?
  – Standard advice: Only download “trusted” PDFs
  – “Please phish me!”
• The user has no way to accurately assess risk
  – Hence, the only rational action is to give up
  – Which is exactly what users do
• Study results (Krol et al.)
  – Very high “ignore” rate
  – Risk actually higher for “expert” users
    • “Expert” users didn’t understand PDF risks

41

Gone Phishing

• Goal: Train motivated users to avoid spear-phishing
  – Send out a bespoke, but fake, phish
  – Present “clickers” with training
    • Simple notification: “You’ve been spear-phished” vs.
    • Deep training: “Here’s how to recognize an attack”
• Study results (Caputo et al.)
  – Deep training doesn’t work
    • “Clickers” panic
    • And immediately close all windows!
    • No one reads the deep training!
• Need to study actual users!!!

42

Skill Set for Phishing Studies

• Assess targeting phish
  – Not too obvious, not too “real”
• Looking for more than a simple hypothesis test
  – Case study design critical
• Analysis of unstructured data
  – Why did “clickers” fail to benefit?
• These are not part of the typical CS skill set!

43

How We Need To Test

• Testing human behavior
  – Requires experts in human behavior!
  – “Security Testing” needs to be interdisciplinary
• We need to understand different approaches to analyzing systems
  – Case studies that actually are case studies
  – Rigorous understanding of what works
    • and what doesn’t

44

One Possible Model

• Lots of software is “user tested” by fielding
  – Google does this all the time
  – Collects data on usage
  – Makes decisions about products based on facts
• Security is a bit different
  – Harder to monitor difficulty with mechanisms
  – Impossible to monitor non-compliance
  – But there is still a lot of data to analyze

45

Questions?

• Contact:
  – pammann@gmu.edu

• Acknowledgements:
  – Angela Sasse has taught me a lot about usable security and shared slides generously!

• Further reading
  • Adams and Sasse: Users are not the enemy (CACM 1999)
  • Krol et al.: Rethinking security warnings (7th CRiSIS 2012)
  • Caputo et al.: Going spear phishing (S&P magazine Jan/Feb 2014)
  • Herley: More is not the answer (S&P magazine Jan/Feb 2014)
  • Norman: The Design of Everyday Things (latest edition 2013)
