Eran Toch: Designing for Privacy


A lecture at Microsoft Herzliya, Out-of-the-Box week. The lecture revolves around privacy threats in mobile computing and their remedies.

1

http://toch.tau.ac.il/

Designing for privacy

Microsoft Herzliya, April 2013

Department of Industrial Engineering

A Brief History of Privacy

2

“the right to be let alone”

- Samuel D. Warren and Louis D. Brandeis, 1890

“And Balaam lifted up his eyes, and he saw Israel dwelling according to its tribes.” What did he see? He saw that the openings of their tents did not face one another. He said: these are worthy that the Divine Presence should rest among them.

- Talmud, Bava Batra 60a (c. 100 BCE–300 CE)

“Controlling information and accessibility to others”

- Ruth Gavison, 1980

Agenda

① Privacy disasters

② The mobile privacy landscape

③ Is privacy important?

④ The privacy toolbox

3

1. Privacy Disasters

4

What’s the worst that can happen?

Remember Google Buzz?

5

Followers in Buzz

‣ Google suggested a list of followers to new users.

‣ The suggestions were the people who corresponded most with the user.

‣ By default, the list was open to the public and accessible through the user’s profile page.

6

After 4 Days…

‣ Google canceled the automatic follower list.

‣ And removed Buzz’s public profile entirely.

7

After a Week…

‣ Lawsuits and FTC complaints were filed.

‣ Users quickly abandoned Buzz.

‣ Google agreed to pay $8.5 million and accepted considerable restrictions on its handling of user data.

‣ Buzz was cancelled a year later.

8

2. The Mobile Privacy Landscape

9

Privacy Spheres in Mobile Computing

10

Physical Privacy: interference with the user’s physical environment and attention

Data Privacy: collecting and using information gathered in the user’s action sphere

Information Threats

11

‣ Can other people find out where the person is?

‣ And physically threaten the user or her property?

Identity Threats

‣ With only 4 locations of a person,

‣ and a census database,

‣ 95% of the population can be uniquely identified.

12

Yves-Alexandre de Montjoye, César A. Hidalgo, Michel Verleysen & Vincent D. Blondel, Unique in the Crowd: The privacy bounds of human mobility, Nature 2013
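The uniqueness claim above can be made concrete with a small sketch. This is not the paper’s method; it is a toy estimate, and the trace format (a user mapped to `(location, hour)` observations) is an assumption for illustration only.

```python
from collections import Counter

def fraction_unique(traces, n_points=4):
    """Estimate how many users are uniquely identified by a few
    spatio-temporal points. `traces` maps each user to a list of
    (location, hour) observations; toy interface, not the paper's."""
    # Use each user's first n sorted points as an identity key.
    keys = [tuple(sorted(t)[:n_points]) for t in traces.values()]
    counts = Counter(keys)
    # A user is unique if no one else shares the same key.
    return sum(1 for k in keys if counts[k] == 1) / len(keys)
```

On real mobility datasets, four points suffice to single out roughly 95% of users, which is exactly what makes location data an identity threat.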

Social Threats

‣ A location can tell about:

‣ What the user does

‣ Who the user meets

‣ Information is shared with the social network.

13

Physical Privacy

The extent to which the phone interferes with the user’s physical context, or draws the attention of the user or those around them.

14

Vellux Beepers: sounds and notifications

Concerns in Information Privacy

15

Tsai, Janice, Patrick Kelley, Lorrie Cranor, and Norman Sadeh. "Location-sharing technologies: Privacy risks and controls." TPRC, 2009.

3. Is Privacy Important Anymore?

16

17

“You already have zero privacy anyway. Get over it.”

Scott McNealy, Sun Microsystems CEO, 1999

Do Users Actually Care?

18

Shoppers at a mall were offered a $10 discount card, and an extra $2 discount if they agreed to share their shopping data. 50% declined the extra offer.

Source: The New York Times - http://www.nytimes.com/2013/03/31/technology/web-privacy-and-how-consumers-let-down-their-guard.html?smid=pl-share

But Wait…

19

Shoppers were offered a $12 discount card and the option of trading it in for a $10 card to keep their shopping record private. 90% chose to trade privacy for $2.

Privacy is not Abstract Anymore

20

Google Buzz, Facebook, Path

People care about concrete privacy threats that impact their actual lives.

What do users actually do?

21

Facebook users in an American University

Professional and Ethical Duty

22

Legal Duty

23

It is a Basic Human Need

24

It is impossible to live without a safe space for experimentation, growth, and personal expression.

4. The Privacy Toolbox

25

Types of Tools

26

Policy-based ↔ Architecture-based tools, ordered by increasing privacy guarantee:

Notice

Choice

Access and Recourse

Data Minimization

Source: Marc Langheinrich. 2001. Privacy by Design - Principles of Privacy-Aware Ubiquitous Systems. In Proceedings of the 3rd international conference on Ubiquitous Computing (UbiComp '01),

‣ Be open with the user.

‣ Tell the user what happens to the data, at the right moment, and at the right context.

27

Notice

What is a Good Notice?

‣ A good notice enables the user to make an informed decision.

‣ We need to think: what is the default? What are the implications? Is there an undo?

28

Notice: tell the user what happens to the data.

29

Privacy as Part of the App Decision-Making Process. Patrick Gage Kelley, Lorrie Faith Cranor, and Norman Sadeh. CHI 2013.

http://cups.cs.cmu.edu/privacyLabel

‣ Provide the user with meaningful control over the information:

‣ Discriminative

‣ Easy to use

‣ Works out of the box

‣ A simple test should be: the data belongs to the user. Can she effectively exercise her ownership?

30

Choice

Discriminative Control

31

http://ie.microsoft.com/testdrive/browser/donottrack/

The Do Not Track (DNT) header asks a web application to disable its tracking of the individual user, including cross-site tracking.

Do Not Track

32

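Honoring the DNT preference on the server side is straightforward. This is a minimal sketch, assuming a dict-like view of the request headers; the helper name is hypothetical.

```python
def tracking_allowed(headers):
    """Honor the Do Not Track preference: a 'DNT: 1' header means
    the user opts out of tracking. Hypothetical helper sketch."""
    return headers.get("DNT") != "1"
```

An application would consult this before writing tracking cookies or logging cross-site identifiers.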
33

Non-Discriminative: Access to Locations

‣ Application-level limitations:

‣ Not all locations are the same.

‣ Not all situations are the same.

‣ Not all information destinations are the same.

‣ The default is overpowering
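One way to make location access discriminative, rather than all-or-nothing, is to let the user share a coarsened location. A minimal sketch, assuming latitude/longitude coordinates; the precision choice is an illustration, not a recommendation.

```python
def coarsen(lat, lon, decimals=2):
    """Share a less precise location: two decimal places is
    roughly 1 km of precision at mid latitudes. Illustrative sketch."""
    return (round(lat, decimals), round(lon, decimals))
```

Different destinations could receive different precision levels, reflecting that not all information destinations are the same.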

Control is Tough: what happens when we ask the user to control complex sharing preferences?

How can we balance usability and privacy?

34

Crowdsourcing Privacy Preferences

35

‣ Aggregator: collecting preferences and their underlying context

‣ Modeler: building a model of the preferences according to context

‣ Personalizer: personalizing the model for a specific, given user

‣ Application: using the preference model in a specific application

From: Eran Toch, Crowdsourcing Privacy Management in Context-Aware Applications, Personal and Ubiquitous Computing, 2013.
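The four-stage pipeline above can be sketched in a few lines. The data shapes (context strings, 1-5 willingness ratings, an additive per-user bias) are assumptions for illustration, not the paper’s implementation.

```python
from collections import defaultdict

def aggregate(observations):
    """Aggregator: group crowd preferences (1-5 willingness to
    share) by their context. Toy data shapes."""
    by_context = defaultdict(list)
    for context, rating in observations:
        by_context[context].append(rating)
    return by_context

def build_model(by_context):
    """Modeler: one default preference per context (the mean)."""
    return {c: sum(r) / len(r) for c, r in by_context.items()}

def personalize(model, user_bias):
    """Personalizer: shift the crowd model by a per-user bias,
    clamped to the 1-5 scale."""
    return {c: max(1.0, min(5.0, m + user_bias)) for c, m in model.items()}
```

The application stage would then look up the personalized value for the current context and use it as the sharing default.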

Our User Study

‣ 30 users, 2 weeks.

‣ Smart-Spaces: Tracking locations and activities.

‣ Participants were surveyed three times a day.

‣ Asked about their willingness to share their location on a Likert scale.

36

Place Discrimination

37

Some places are shared by almost everybody; some places are considered private.

Willingness-to-share scale: 1 (less likely to share) to 5 (more likely to share)

38

Accuracy of Decision Strategies

Defaults are Enormously Important

‣ People have a tendency to stick to the defaults:

‣ Organ donation choices

‣ Access control policies

‣ Browser selection

39

Generating Defaults

40

Oded Maimon, Ron Hirschprung, Eran Toch. Evaluating Bi-Directional Data Agent Applicability and Design in Cloud Computing Environment. In Proceedings of the 17th Industrial Engineering Conference, 2012.

41

Testing the Defaults

‣ Privacy is a long-term relationship.

‣ Applications need to provide ongoing access to privacy data and controls.

‣ Meaningful recourse (helping with problems) is crucial for the user’s security and trust.

42

Access and Recourse

Personal Data Centers

43

44

Privacy through Time

Digital information is rarely erased. With search engines and timelines, it becomes more and more accessible. What are the consequences for user-controllable privacy?

45

‣ Between-subject user study (n=298)

‣ Analyzing differences between users, randomly assigned to four conditions:

‣ One month

‣ One year

‣ Two years

‣ More than two years.

‣ Using a custom FB application.

Our Study

Eran Toch and Oshrat Rave-Ayalon. Understanding the Temporal Aspects of Sharing Preferences in Online Social Networks, Submitted to SOUPS 2013

46

Willingness to Share Over Time

47

Implications for Design

A default expiration time of 1.5 years
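The 1.5-year default could be enforced as a simple visibility check. A minimal sketch; the function and constant names are hypothetical, not from the study.

```python
from datetime import datetime, timedelta

# Default expiration suggested by the study: 1.5 years.
DEFAULT_EXPIRATION = timedelta(days=int(365 * 1.5))

def is_visible(posted_at, now=None, expiration=DEFAULT_EXPIRATION):
    """Hide content once it is older than the expiration default.
    Users could still override the default per post."""
    now = now or datetime.now()
    return now - posted_at < expiration
```

Making expiration the default, rather than an opt-in, matters precisely because people stick to defaults.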

‣ The best solution for privacy is trying not to know anything about the user.

‣ In most interesting applications, it is not possible.

‣ However, analyzing the minimal data requirements of an application is always worthwhile.
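Once the minimal data requirements are known, minimization can be as simple as stripping every other field before storage. A sketch, with hypothetical field names:

```python
def minimize(record, required):
    """Data minimization: keep only the fields the application
    actually needs, dropping everything else before storage."""
    return {k: v for k, v in record.items() if k in required}
```

The hard part is the analysis that produces the `required` set, not the stripping itself.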

48

Data Minimization

Anonymity Levels

49

More recognition → less recognition, with an increasing privacy guarantee:

Identified → Pseudo-anonymous → Anonymous

Pseudo-anonymous Profiles

50

Managing Identity

‣ Don’t ask users to identify themselves.

‣ If users need a personalized service, rely on pseudo-anonymous identification.

‣ Use k-anonymity, l-diversity, t-closeness, and differential privacy to release user information.
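Before releasing a dataset, k-anonymity can be checked directly: every combination of quasi-identifier values must be shared by at least k records. A minimal sketch; the record format and field names are hypothetical.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every combination of quasi-identifier values is
    shared by at least k records, so no record stands out."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return all(count >= k for count in groups.values())
```

If the check fails, values are generalized (e.g., truncating ZIP codes, bucketing ages) until it passes; l-diversity and t-closeness then add constraints on the sensitive attribute within each group.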

51

Architectural Choices

52

Clients → Server → Clients: the server is the privacy bottleneck.

53

Eran Toch, Department of Industrial Engineering, Tel Aviv University, Israel

http://toch.tau.ac.il/

erant@post.tau.ac.il
