Privacy in Wearable Computing
Thad Starner, Contextual Computing Group, College of Computing, Georgia Tech


Page 1: Privacy in Wearable Computing Thad Starner Contextual Computing Group College of Computing Georgia Tech

Privacy in Wearable Computing

Thad Starner

Contextual Computing Group

College of Computing

Georgia Tech

Page 2:

Handouts

The Challenges of Wearable Computing (Starner)

Privacy Protection in the 1980s (Turn)

Excerpts from Agre and Rotenberg, Technology and Privacy: The New Landscape

Page 3:

Background

College of Computing, Georgia Tech

Founder, Charmed Technologies

Founder, MIT Wearable Computing Project

IEEE ISWC and Wearable Information Systems TC

Everyday use since 1993

Page 4:

Wearable Challenges

Power and heat (mips/watt)
On- and off-body networking (bits/joule)
Privacy
Interface (additional capability vs. load)
– User interface (cognitive load)
– Ergonomics/human factors (weight, heat, etc.)

(Intertwined: changing one affects the others)

Page 5:

Resources (wearable and ubiquitous computing)

Phil Agre’s Red Rock Eater list
ACM technology alerts
Foner, “Political Artifacts and Personal Privacy: The Yenta Multi-Agent Distributed Matchmaking System” (MIT PhD thesis)
Langheinrich, “Privacy by Design”
EPIC, EFF, ACLU, CPSR, Privacy Journal (on-line), privacyinternational.org, privacy.org
Colleagues: Rhodes, Bruckman, Foner, Kapor, Mann, Pentland, wearables research mailing list, privacy panel at ISWC98

Page 6:

Resources (general)

Pool: Technologies of Freedom; The Social Impact of the Telephone
Agre and Rotenberg: Technology and Privacy: The New Landscape
Westin: Privacy and Freedom
R.E. Smith: Our Vanishing Right to Privacy
Rothfeder: Privacy for Sale
Miller: The Assault on Privacy
McCarthy: The Rights of Publicity and Privacy
Rosen: The Unwanted Gaze (also the article “Is Nothing Private?”)

Page 7:

Resources (general)

Rosen: The Unwanted Gaze (also the article “Is Nothing Private?”)
Computers, Freedom, and Privacy
IEEE Security and Privacy

Page 8:

Definitions

Privacy – right of individuals to control the collection and use of personal information about themselves

Security – protection of information from unauthorized users

Page 9:

Definition (Gellman)

“No definition … is possible, because issues are fundamentally matters of values, interests, and power”

Page 10:

Black’s Law Dictionary

Right of privacy: The right to be let alone; the right of a person to be free from unwanted publicity; and the right to live without unwarranted interference by the public in matters with which the public is not necessarily concerned … concept of ordered liberty, and such right prevents governmental interference in the intimate personal relationships or activities, freedoms of the individual to make fundamental choices involving himself, his family, and his relationship with others.

Page 11:

Privacy Violation

Tort – “civil wrong”

Unreasonable intrusion upon the seclusion of another individual, if such intrusion would be highly offensive to a reasonable person

Appropriation of the other’s name or likeness for one’s own use or benefit

Page 12:

Privacy Violation (cont.)

Unreasonable publicity given to the other’s private life if the published matter would be highly offensive to a reasonable person and is no concern of the public

Publicity that unreasonably places the other in a false light before the public where the false light would be highly offensive to a reasonable person and the publisher knows the falsity of the published matter

Page 13:

Revolutionary War to Now

Wilkes case in England
Packwood diaries
Clinton

Page 14:

What are PDAs and wearables?

Filing cabinets?
– Who bought the machine?
– Who owns it?

Diaries?
– What is its use?
– Separate section for private info?

Page 15:

A Selection of Cases

“Fatty” Arbuckle
– Tabloids, ubiquitous surveillance, and the changing social perception of smear campaigns

Larry Flynt and the Republican witch hunt
Linda Tripp
Cellular phone monitoring
– Newt Gingrich
– Prince Charles
– 911 emergency phone call

EZ Pass

Page 16:

Anti-privacy Arguments

If you have nothing to hide, then you should have no concern for your privacy
– Many have personal situations they would not want exposed (victim of rape, child abuse, fraud, …)
– Opponents will use facts in the worst possible light (politicians, tabloids, etc.)
– Unfair, unregulated environments (racism, health concerns, etc.)

Page 17:

Anti-privacy Arguments

“I don’t care about privacy”
– But other people have the right to care about theirs
– Equivalent to saying “I don’t say anything controversial, so I don’t care about free speech”

Page 18:

Anti-privacy Arguments

The privacy discussion is overblown; big organizations don’t really care about individuals, just narrow goals
– Most harm is in the aggregate, but it can still have large effects on the individual (high-cost home mortgage loans)
– FBI files – sabotage against nonviolent dissidents
– Whistleblowers: Ralph Nader and GM
– Individuals against individuals – university professors and the Freedom of Information Act

Page 19:

Anti-privacy Arguments

Surveillance is inevitable – the real issue is achieving a balance of power where we can watch the people who are watching us
– Equal access to information is not the same as equal ability to use that information. A corporation or government can, and often does, dedicate resources to watching a person or group of people that few individuals can match.
– Political/technical feasibility of forcing corporations, governments, and elites to comply

Page 20:

Anti-privacy Arguments

Computer technology is just returning us to the rural village of yesteryear / technology brings nothing new to the privacy debate
– Simply incorrect: computers allow large-scale data mining that was previously impossible with paper
– In a rural village, one did not have to worry about corporations with large budgets and large databases; a rural village implies some sense of parity

Page 21:

Anti-privacy Arguments

We must balance privacy against industrial concerns / societal needs / government expense
– Assumes you cannot have the same services in a privacy-preserving manner. In most cases you can – even at lower cost!
– Ignores the industry created by having active privacy protections

Page 22:

Anti-privacy Arguments

Individuals can make up their own minds about what to reveal
– Not if they don’t have the proper information
– Currently, it is the companies that would like to benefit from your information that inform the individual of the risks – if they bother at all
– This opinion pits one person’s capabilities against those of large companies with specialists who concentrate on exploiting such data. It also assumes technology will not make new uses of such data possible in the future.

Page 23:

Anti-privacy Arguments

There is no privacy in public
– Reasonable expectation
– U.S. law (used to be) designed to protect people, not places (1967)
– Aggregation of data, not individual instances
– Just because it can be done doesn’t mean it isn’t wrong

Page 24:

Anti-privacy Arguments

Companies that distribute collections of personal data are protected under free speech laws
– Who owns the bits? Is personal information property? If so, the rules of copyright and patent are accepted restrictions on the use of such speech.

Page 25:

Anti-privacy Arguments

Tagging a car or cellular phone does not equal tagging a person – it is only circumstantial evidence
– Circumstantial evidence is used all the time, both in and out of court (tabloids)
– License plates
– EZ Pass in NYC
– License for more directed investigation/harassment

Page 26:

Anti-privacy Arguments

People on welfare should expect a reduction of privacy in exchange for benefits / the elite can violate privacy with enough money, so why not make that available to everyone?
– The right to privacy should not vary according to social class – there is no reason it should
– In any implementation, someone can take advantage. That does not mean we should not design our systems as well as possible.

Page 27:

U.S. Privacy Act of 1974 (Turn and Langheinrich)

Openness and transparency – no secret record keeping
Individual participation – ability to see one’s own records
Collection limitation – don’t exceed needs
Data quality – relevant to purpose and kept updated
Use limitation – only authorized personnel, for a specific purpose
Reasonable security
Accountability

Page 28:

U.S. Privacy Act of 1974

Sounds reasonable, but…
Only applicable to federal agencies and certain contractors!
Conflicts with Freedom of Information Acts!

Page 29:

EU Directive 95/46/EC

Data may be transferred only to non-EU countries with “adequate” levels of privacy protection
Explicit consent
U.S. Safe Harbor – companies self-certify
– HP the only major player

Page 30:

Detecting Privacy Violations

Violations must be punishable and detectable
Different aliases
– E-mail
– True names, addresses

Trap entries
– Bogus street names
– Bogus student names/addresses

Ralph Nader and General Motors
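
The trap-entry idea above can be sketched in a few lines of Python. This is an illustrative assumption, not a description of any system from the slides: the record fields, helper names, and hash-based tagging are invented here to show how a uniquely seeded "canary" record maps a leak back to its source.

```python
import hashlib

def make_trap_entry(recipient, secret):
    """Build a plausible but fictitious record unique to one data
    recipient. If this record later appears anywhere else, the
    recipient it was seeded to is the source of the leak."""
    tag = hashlib.sha256((secret + recipient).encode()).hexdigest()[:6]
    return {
        "name": f"Pat Q. {tag.upper()}",                    # fabricated surname, unique per recipient
        "street": f"{int(tag, 16) % 900 + 100} Canary Lane", # bogus street, also unique
        "recipient": recipient,                              # kept privately; never shared
    }

def find_leaker(observed_name, traps):
    """Map a name observed 'in the wild' back to the data recipient
    whose copy of the dataset contained that trap entry."""
    for trap in traps:
        if trap["name"] == observed_name:
            return trap["recipient"]
    return None

# Seed one unique trap record per party the dataset is shared with.
traps = [make_trap_entry(r, "s3cret") for r in ("AcmeAds", "DataBrokerCo")]
```

Unsolicited mail addressed to a trap name then identifies which recipient resold the list; unique e-mail aliases per service work the same way.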

Page 31:

Leonard Foner

“Those who design systems which handle personal information therefore have a special duty: They must not design systems which unnecessarily require, induce, persuade, or coerce individuals into giving up personal privacy in order to avail themselves of the benefit of the system being designed”

Page 32:

Risks of Ubiquitous Computing (Langheinrich)

Ubiquity
Invisibility
Sensing
Memory amplification

– Bush’s Memex

Page 33:

Wearable vs. Environmental Approach

Data collected on user
Released by user
Extreme form: all sensing powered by user
– On-body sensing
– RFID

Wearable confounder
Little brother vs. big brother

Page 34:

Privacy Barriers (Starner)

Physical
Technological
– Encryption, biometrics, etc.

Legislative
– Changing laws, software monopolies, speed of innovation, enforcement

Social
Obscuring

Page 35:

Privacy by Design (Langheinrich)

Notice
Choice and consent
Anonymity and pseudonymity
Proximity and locality
Adequate security
Access and recourse
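
Of these principles, pseudonymity is the most directly implementable on a wearable. A minimal sketch, assuming a per-user secret held only on the device (the function and names are illustrative, not taken from Langheinrich's paper):

```python
import hashlib
import hmac

def pseudonym(user_secret, collector):
    """Derive a stable per-collector pseudonym via HMAC. The same
    collector always sees the same identifier, so its own records stay
    consistent, but two collectors cannot link their databases without
    the user's secret."""
    return hmac.new(user_secret, collector.encode(), hashlib.sha256).hexdigest()[:16]

secret = b"held only on the wearable"                 # never leaves the device
vending_id = pseudonym(secret, "vending-machines")    # shown to the snack machine
door_id = pseudonym(secret, "building-access")        # shown to the door reader
```

Because the mapping is keyed by the user, the user can also rotate the secret to unlink past records, shifting control over aggregation back to the person being sensed.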

Page 36:

Case Study – RAVE

Page 37:

Case Studies – Active Badge

Active badge system for security and convenience at a U.S. state university
– Access to secure areas at a distance
– Purchasing of snacks at vending machines
– Tracking telephone calls
– Auto-login
– …

Page 38:

Dangers

Security
– Spoofing
– Tracking
– Traffic analysis
– Social hacking
– Bribery

Legal access to data
– FOI
– Discrimination/harassment lawsuits

Page 39:

Case Studies

Global Positioning System
Locust
Cellular phone 911 tracking
Pagers
EZ Pass
Automobile black box

Page 40:

Case Studies (cont.)

Patent search system
MIT Wearable Computing Project
– Remembrance Agent
  Augmented memory of previous conversations
  Deniability
  Implicit, unpredictable violations of shared databases
– Webcam

Page 41:

Case Studies (cont.)

Face Recognition
– FaceIt
– Eigenfaces
– Social engagement

Augmented Reality and Snow Crash’s CIC

Page 42:

Technologists Are the First Line of Protection

Design the system so that it is easier and more economical to preserve privacy than to violate it

Provide mechanisms by which privacy violation can be detected

Use combinations of mechanisms so that privacy and security can be adjusted for different social conditions and new threats

Design with guidelines in mind

Page 43:

Benjamin Franklin

“They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety”