
Security and Cryptography II

Anonymous & Unobservable Communication

Stefan Köpsell (Slides [mainly] created by Andreas Pfitzmann)

Technische Universität Dresden, Faculty of Computer Science, D-01187 Dresden, Nöthnitzer Str. 46, Room 3067

Phone: +49 351 463-38272, e-mail: sk13@inf.tu-dresden.de, https://dud.inf.tu-dresden.de/

Observability of users in switched networks

[Figure: switched network – stations (radio, television, videophone, phone, internet) connected via network terminations and a telephone exchange; possible attackers: an interceptor on the lines; at the telephone exchange: operator, manufacturer (Trojan horse), employee]

countermeasure encryption

• link encryption

Observability of users in switched networks

countermeasure encryption

• end-to-end encryption


Observability of users in switched networks

countermeasure encryption

• link encryption

• end-to-end encryption

Problem: traffic data
who with whom? when? how long? how much information?

Aim: “protect” traffic data (and thereby also data on interests) so that it cannot be captured.

data on interests (Who? What?) – observable by the communication partner


Observability of users in broadcast networks

[Figure: broadcast network – stations radio, television, videophone, phone, internet on a shared medium; possible attacker: interceptor]

(Examples: bus and radio networks)

Any station gets:
• all bits
• analogue signals (distance, bearing)

Reality since about 1990

A Video-8 tape stores 5 GByte = 3 × all census data of 1987 in Germany; memory cost < 25 EUR.

100 Video-8 tapes (or, in 2014, one hard disk with 500 GByte for ≈ 35 EUR) store all telephone calls of one year:

Who with whom? When? How long? From where?
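The 500 GByte claim can be checked with a back-of-the-envelope calculation; the per-record size and call volume below are rough assumptions of mine, not figures from the slide.

```python
# Back-of-the-envelope check (assumed figures, not from the slide):
# how much storage does one year of call *metadata* need?

BYTES_PER_RECORD = 30        # assumption: caller, callee, start time, duration, location
CALLS_PER_DAY = 50_000_000   # assumption: rough call volume for one country

records_per_year = CALLS_PER_DAY * 365
total_gbyte = records_per_year * BYTES_PER_RECORD / 10**9

print(f"{records_per_year:,} records/year -> {total_gbyte:.0f} GByte")
```

Under these assumptions the result (≈ 550 GByte) lands in the order of magnitude of the slide’s 500 GByte figure.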

Reality or fiction?

With the development of television, and the technical advance which made it possible to receive and transmit simultaneously on the same instrument, private life came to an end.

George Orwell, 1948

Excerpt from: 1984

Examples of changes w.r.t. anonymity and privacy

Broadcast allows recipient anonymity: it is not detectable who is interested in which programme and information.
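One way to realize recipient anonymity on a broadcast channel is implicit addressing: every station receives all bits and checks locally whether a message is meant for it. A minimal sketch (the HMAC-based tag and all names are my own illustration, not the lecture’s construction):

```python
import hashlib
import hmac
import os

def implicit_address(key: bytes, msg: bytes) -> bytes:
    # tag that only the holder of `key` can recognize
    return hmac.new(key, msg, hashlib.sha256).digest()

def broadcast(tag: bytes, msg: bytes, stations: dict) -> dict:
    # every station gets all bits; each checks locally whether it is addressed
    return {name: hmac.compare_digest(tag, implicit_address(k, msg))
            for name, k in stations.items()}

stations = {"alice": os.urandom(16), "bob": os.urandom(16), "carol": os.urandom(16)}
msg = b"programme 7"
tag = implicit_address(stations["bob"], msg)   # sender knows Bob's key

delivered = broadcast(tag, msg, stations)
print(delivered)   # only 'bob' recognizes the tag
```

An outside observer sees the same broadcast regardless of who the intended recipient is; only the addressing check, performed locally, differs.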

Examples of changes w.r.t. anonymity and privacy

Internet radio, IPTV, video on demand etc. support profiling.

Remark: plain old letter post has shown its dangers, but nobody demands full traceability of it …

Anonymous plain old letter post is substituted by “surveillanceable” e-mails.

The mass medium “newspaper” will be personalised by means of the Web, electronic paper and print on demand.

Privacy and the Cloud?

[http://www.apple.com/icloud/]

Mechanisms to protect traffic data

Protection outside the network

• public terminals – use is cumbersome

• temporally decoupled processing – not usable for communications with real-time properties

• local selection – costs transmission performance of the network; complicates paying for services with fees

Protection inside the network

Attacker (-model)

Questions:
• How widely distributed? (stations, lines)
• observing / modifying?
• How much computing capacity? (computationally unrestricted, computationally restricted)

Realistic protection goals / attacker models: is a technical solution possible?


Online Social Networks – Web 2.0

The Facebook-Problems

…at least two different problems:

1. Information leakage by (more or less) intentionally published (profile) data
• (e-mail) contact list
• face recognition

2. Profiling of every Internet user
• “Like” button

developed in 1997 by Netscape; original purpose: enable sessions (transactions) on the Web

a small amount of data, sent from the Web server to the browser, will be:

• stored by the browser
• automatically transmitted with every visit of the Web server

usual content: a unique identifier for re-identification (tracking)

Cookies – served on the Web

User ↔ Web server

1st visit of a Web site:
① user’s request
② Web server’s response (sets the cookie)

2nd and further visits of that Web site:
① user’s request (cookie transmitted automatically)
② Web server’s response
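The flow above can be simulated with the standard library’s cookie parser; the server name, the `uid` cookie and both helper functions are my own illustration, not a fixed convention.

```python
from http.cookies import SimpleCookie
import uuid

jar = {}  # the browser's cookie store, keyed by server name

def server_response(request_cookies):
    # server: re-identify a returning browser, otherwise set a fresh unique id
    if "uid" in request_cookies:
        return None, f"welcome back, {request_cookies['uid']}"
    c = SimpleCookie()
    c["uid"] = uuid.uuid4().hex          # unique identifier for tracking
    return c["uid"].OutputString(), "first visit"

def browser_visit(server="example.org"):
    sent = dict(jar.get(server, {}))     # cookies transmitted automatically
    set_cookie, body = server_response(sent)
    if set_cookie:                        # browser stores the cookie
        c = SimpleCookie()
        c.load(set_cookie)
        jar[server] = {k: v.value for k, v in c.items()}
    return body

first = browser_visit()
second = browser_visit()
print(first)    # first visit
print(second)   # welcome back, <uid> – the same identifier is retransmitted
```

The second visit is linkable to the first purely through the stored identifier; no login or other user action is needed.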

Besides cookies, many other tracking mechanisms exist in modern browsers: Flash cookies, DOM storage, geo-location, Web bugs, list of fonts, list of plugins, …

Tracking enables profiling, especially group profiles. Goal: link a person to a group of persons so that unknown attributes of that person can be derived (“read out”): “behavioural targeting / advertising”.

Why? Make money! “If you are not paying for it, you're not the customer; you're the product being sold.” [post on MetaFilter.com, August 26, 2010]

To be tracked or not to be tracked?

Google’s revenue in million dollars

[Chart: Google’s quarterly revenue in million dollars, Q4/2003 – Q4/2012; y-axis from 0 to 3,500]

Facebook-“Like“-Button

a small picture, embedded into many Web sites (> 350,000 Web sites)

if a Facebook user clicks the Like button, his friends will be informed; Facebook learns which sites a user likes

Facebook-“Like“-Button

<iframe src="//www.facebook.com/plugins/like.php?href=www.tu-dresden.de&amp;send=false&layout=standard&width=450&show_faces=true&action=like&colorscheme=light&font&height=80" scrolling="no" frameborder="0" style="border:none; overflow:hidden; width:450px; height:80px;" allowTransparency="true"></iframe>
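Note that the iframe’s src (after HTML-unescaping the &amp;) already names the visited page in its href parameter; since the browser requests this URL from facebook.com while rendering the page, Facebook learns the visit even without any click. A quick parsing sketch:

```python
from urllib.parse import parse_qs, urlparse

# the iframe's src attribute, with the HTML entity &amp; unescaped to &
src = ("//www.facebook.com/plugins/like.php?href=www.tu-dresden.de"
       "&send=false&layout=standard&width=450&show_faces=true"
       "&action=like&colorscheme=light&font&height=80")

parsed = urlparse(src)
params = parse_qs(parsed.query, keep_blank_values=True)

print(parsed.netloc)       # www.facebook.com – the request goes to Facebook
print(params["href"][0])   # www.tu-dresden.de – the visited site, sent along
```

Combined with the Facebook cookie the browser also transmits, each embedding site visit becomes a tracked event.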


Facebook-“Like“-Button

Before you are allowed to enter this Web site, you have to call Facebook.

Please tell your name, your address and the web-sites you plan to visit.

Thanks for your cooperation.

Attacker (-model)

Questions:
• How widely distributed? (stations, lines)
• observing / modifying?
• How much computing capacity? (computationally unrestricted, computationally restricted)

Unobservability of an event E
For the attacker it holds, for all his observations B: 0 < P(E|B) < 1
perfect: P(E) = P(E|B)
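The definition can be made concrete with a toy model; the prior of 1/3 and the dummy-traffic scenario are my own example, not from the slide.

```python
from fractions import Fraction

# Toy model: E = "Alice sends a real message this round", prior P(E) = 1/3.
# With link encryption plus dummy traffic, the attacker observes the same
# fixed-length ciphertext stream B whether or not E occurs:
P_E = Fraction(1, 3)
P_B_given_E = Fraction(1)
P_B_given_notE = Fraction(1)

# total probability, then Bayes' rule
P_B = P_B_given_E * P_E + P_B_given_notE * (1 - P_E)
P_E_given_B = P_B_given_E * P_E / P_B

assert 0 < P_E_given_B < 1      # unobservability: B does not decide E
assert P_E_given_B == P_E       # perfect: P(E) = P(E|B)
print(P_E_given_B)              # 1/3
```

Because the observation B is independent of E, the attacker’s posterior equals his prior: observing the network teaches him nothing.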

Anonymity of an entity

Unlinkability of events

if necessary: partitioning in classes

Countermeasures

anonymous & unobservable communication:
• broadcast
• Mixes
• DC-net
• private information retrieval
• …

privacy-preserving identity management:
• service utilisation
• value exchange

The Mix protocol

Idea: Provide unlinkability between incoming and outgoing messages

[Figure: messages pass through Mix 1 and Mix 2 in sequence]

A Mix collects messages, changes their coding and forwards them in a different order.

Only if all Mixes work together can they reveal the route of a given message.
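The three operations of a mix (collect a batch, change the coding, reorder) can be sketched in a few lines. The layered “encryption” below is a toy XOR keystream, chosen only so the sketch is self-contained; real mixes use asymmetric cryptography, and the key names are invented.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # deterministic keystream from SHA-256 in counter mode (toy, NOT secure)
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def recode(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse: applying a mix's key removes that mix's layer
    return bytes(a ^ b for a, b in zip(keystream(key, len(data)), data))

MIX_KEYS = [b"key-mix-1", b"key-mix-2"]   # hypothetical keys of Mix 1 and Mix 2

def prepare(msg: bytes) -> bytes:
    # sender wraps the message in one layer per mix (innermost for the last mix)
    for key in reversed(MIX_KEYS):
        msg = recode(key, msg)
    return msg

def mix(key: bytes, batch: list) -> list:
    # a mix collects messages, changes their coding and forwards
    # them in a different order
    out = [recode(key, m) for m in batch]
    secrets.SystemRandom().shuffle(out)
    return out

batch = [prepare(m) for m in [b"hello", b"world", b"mixes"]]
for key in MIX_KEYS:
    batch = mix(key, batch)

plaintexts = set(batch)
print(sorted(plaintexts))   # all messages arrive, order decoupled from senders
```

An observer comparing a mix’s input and output sees different bit patterns in a different order, so individual messages cannot be followed through; tracing would require the cooperation of every mix on the route.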

Pseudonyms

person pseudonyms / role pseudonyms

Anonymity: scalability concerning the protection (anonymity increases from top to bottom):

• public person pseudonym (example: phone number)
• non-public person pseudonym (example: account number)
• anonymous person pseudonym (example: biometric, DNA – as long as no register exists)
• business-relationship pseudonym (example: pen name)
• transaction pseudonym (example: one-time password)

Pseudonyms: Use across different contexts partial order

A → B stands for “B enables stronger unlinkability than A”

number of an identity card, social security number, bank account number
→ pen name, employee identity card number
→ customer number
→ contract number
→ one-time password, TAN, one-time-use public-key pair

Usually: one identity per user

Identity Management

[Figure: a single identity comprising attributes such as age, driving license, name, address, phone number, tax class, account number, e-mail]

Problem: Linkability of records

Many Partial-Identities per user

Management / disclosure / linkability under the control of the user

Privacy-preserving identity management

[Figure: partial identities p1 … p5, each a different subset of attributes – p1: name, e-mail, phone number; p2: age, name, address, tax class, account number; p3: name, account number; p4: e-mail; p5: age, driving license]

• many services need only a few data

• revealing that data under a Pseudonym prevents unnecessary linkability with other data of the user

• different actions / data are initially unlinkable if one uses different pseudonyms
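The three points above can be sketched as a small data-structure exercise: the user holds the full attribute set and, per service, discloses only the needed subset under a fresh pseudonym. All attribute values and names here are hypothetical placeholders.

```python
import secrets

# hypothetical user data – attribute names follow the slide, values are made up
identity = {"name": "Jane Doe", "age": 35, "address": "Somestreet 1",
            "tax class": 1, "account number": "DE00 0000",
            "e-mail": "jane@example.org", "driving license": True}

pseudonym_log = {}   # which attributes were disclosed under which pseudonym

def disclose(needed):
    # a fresh transaction pseudonym per service keeps actions unlinkable
    p = "p-" + secrets.token_hex(4)
    partial_identity = {a: identity[a] for a in needed}
    pseudonym_log[p] = partial_identity
    return p, partial_identity

p1, d1 = disclose(["driving license", "age"])   # e.g. a car rental
p2, d2 = disclose(["e-mail"])                   # e.g. a newsletter

# distinct pseudonyms, and the two disclosures share no attribute
print(p1 != p2, set(d1) & set(d2))
```

Since the two services see disjoint attribute sets under unrelated pseudonyms, neither they nor an observer can link the two actions to one user from this data alone.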

Implementation: Pseudonyms

Example: Car Rental

necessary data:
• possession of a driving license valid for the car wanted

[Figure: the customer rents cars under different pseudonyms p1, p2]

Anonymous Credentials

Credential = attestation of an attribute of a user (e.g. “user has a driving license”)

Steps:
• the organisation issues credentials
• the user shows a credential to the service provider

Properties:
• the user can show credentials under different pseudonyms (transformation)

Using the same credential with different pseudonyms prevents linkability by the service provider and the issuer.
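Deployed anonymous-credential systems use schemes such as CL signatures; a much smaller primitive that already shows the unlinkability idea is Chaum’s blind RSA signature: the issuer signs the credential without seeing it, so a later showing cannot be linked back to the issuing transaction. A textbook-RSA toy with tiny primes, for illustration only:

```python
import hashlib
from math import gcd

# Toy textbook-RSA blind signature (Chaum) – illustration only, NOT secure.
p, q = 61, 53                  # tiny primes for readability
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))

def h(msg: bytes) -> int:
    # hash the attested attribute into the RSA group
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

msg = b"has driving license"   # the attested attribute
r = 7                          # blinding factor (random in practice), gcd(r, n) == 1
assert gcd(r, n) == 1

blinded = (h(msg) * pow(r, e, n)) % n    # user blinds the message
blind_sig = pow(blinded, d, n)           # issuer signs, seeing only `blinded`
sig = (blind_sig * pow(r, -1, n)) % n    # user unblinds: sig = h(msg)^d mod n

# any verifier checks the credential without contacting the issuer
print("credential verifies:", pow(sig, e, n) == h(msg))
```

Since the issuer only ever saw the blinded value, it cannot recognize `sig` when the user later shows it, which is exactly the unlinkability between issuing and showing claimed above.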

[Figure: the organisation issues a credential to the user and publishes the credential types; the user shows credentials to service providers]

Usage of Anonymous Credentials

[Figure: users A, B, …, X each obtain a “has driving license” credential from the issuing organisation; user A shows “has driving license” to a service provider]