
1

Modeling and Analysis of Anonymous-Communication Systems

Joan Feigenbaum, http://www.cs.yale.edu/homes/jf

WITS’08; Princeton NJ; June 18, 2008

Acknowledgement: Aaron Johnson

2

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

3

Anonymity: What and Why

• The adversary cannot tell who is communicating with whom. Not the same as confidentiality (and hence not solved by encryption).

• Pro: Facilitates communication by whistle blowers, political dissidents, members of 12-step programs, etc.

• Con: Inhibits accountability

4

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

5

Anonymity Systems

• Remailers / mix networks
  – anon.penet.fi
  – MixMaster
  – Mixminion

• Low-latency communication
  – Anonymous proxies, anonymizer.net
  – Freedom
  – Tor
  – JAP

• Data publishing
  – FreeNet

6

Mix Networks

• First outlined by Chaum in 1981

• Provide anonymous communication
  – High latency
  – Message-based (“message-oriented”)
  – One-way or two-way

7

Mix Networks

[Diagram: Users → Mixes → Destinations]

8

Mix Networks

[Diagram: Users → Mixes → Destinations, now with an adversary observing part of the network]

9

Mix Networks

Protocol

[Diagram: Users → Mixes → Destinations, observed by the adversary]

10

Mix Networks

1. User selects a sequence of mixes and a destination.

[Diagram: user u chooses the path M1 → M2 → M3 and destination d; the adversary observes the network]

11

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

[Diagram: user u with path M1 → M2 → M3 and destination d]

12

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

Onion Encrypt:
1. Proceed in reverse order of the user’s path.
2. Encrypt (message, next hop) with the public key of the mix.

[Diagram: u → M1 → M2 → M3 → d]

13

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

Onion Encrypt:
1. Proceed in reverse order of the user’s path.
2. Encrypt (message, next hop) with the public key of the mix.

Result: {{{m, d}_M3, M3}_M2, M2}_M1

[Diagram: u → M1 → M2 → M3 → d]

14

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

Onion Encrypt:
1. Proceed in reverse order of the user’s path.
2. Encrypt (message, next hop) with the public key of the mix.

[Diagram: u sends {{{m, d}_M3, M3}_M2, M2}_M1 to M1]

15

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

Onion Encrypt:
1. Proceed in reverse order of the user’s path.
2. Encrypt (message, next hop) with the public key of the mix.

[Diagram: M1 removes its layer and forwards {{m, d}_M3, M3}_M2 to M2]

16

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

Onion Encrypt:
1. Proceed in reverse order of the user’s path.
2. Encrypt (message, next hop) with the public key of the mix.

[Diagram: M2 removes its layer and forwards {m, d}_M3 to M3]

17

Mix Networks

1. User selects a sequence of mixes and a destination.

2. Onion-encrypt the message.

3. Send the message, removing a layer of encryption at each mix.

Onion Encrypt:
1. Proceed in reverse order of the user’s path.
2. Encrypt (message, next hop) with the public key of the mix.

[Diagram: M3 removes the last layer and delivers the message m to d]
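A minimal Python sketch of steps 2 and 3 above, onion-encrypting in reverse path order and peeling one layer per mix. The enc/dec pair is a toy stand-in for real public-key encryption, and the function names are illustrative only:

```python
# Toy onion-encryption for a mix network: enc/dec stand in for public-key
# encryption and simply wrap the payload with the name of the intended mix.

def enc(mix, payload):
    return {"for": mix, "payload": payload}   # placeholder for E_mix(payload)

def dec(mix, onion):
    assert onion["for"] == mix                # only the named mix can "decrypt"
    return onion["payload"]

def onion_encrypt(message, destination, path):
    """Wrap (message, next hop) once per mix, in reverse order of the path."""
    onion, next_hop = message, destination
    for mix in reversed(path):                # M3, then M2, then M1
        onion = enc(mix, (onion, next_hop))
        next_hop = mix
    return onion                              # {{{m,d}_M3, M3}_M2, M2}_M1

def route(onion, first_mix):
    """Each mix removes one layer and forwards to the next hop."""
    hop = first_mix
    while True:
        payload, next_hop = dec(hop, onion)
        if isinstance(payload, dict):         # another layer: forward to next mix
            onion, hop = payload, next_hop
        else:                                 # innermost layer: deliver to destination
            return next_hop, payload

onion = onion_encrypt("hello", "d", ["M1", "M2", "M3"])
print(route(onion, "M1"))                     # -> ('d', 'hello')
```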

18

Mix Networks

Anonymity?

1. No one mix knows both source and destination.

[Diagram: u → mixes → d, observed by the adversary]

19

Mix Networks

Anonymity?

1. No one mix knows both source and destination.
2. The adversary cannot follow multiple messages through the same mix.

[Diagram: users u and v send messages through the same mix to destinations d and f]

20

Mix Networks

Anonymity?

1. No one mix knows both source and destination.
2. The adversary cannot follow multiple messages through the same mix.
3. More users provide more anonymity.

[Diagram: users u, v, and w send messages through the mixes to destinations d, e, and f]

21

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

22

Provable Anonymity in Mix Networks

Setting

• N users

• Passive, local adversary:
  – The adversary observes some of the mixes and the links.
  – A fraction f of the links is not observed by the adversary.

• Users and mixes are roughly synchronized.

• Users choose mixes uniformly at random.

23

Provable Anonymity in Mix Networks

Definition

• Users should be unlinkable to their destinations.

• Let Π be a random permutation that maps users to destinations.

• Let C be the traffic matrix observed by the adversary during the protocol: C_e,i = number of messages on link e in round i.

[Example: C has one row per observed link (e1, e2, …) and one column per round (1–5 in the figure); each entry counts the messages seen on that link in that round.]
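A small sketch of how such a traffic matrix could be tallied; representing each observed message as a (link, round) pair is an assumption made for this illustration:

```python
from collections import Counter

def traffic_matrix(observed_messages):
    """observed_messages: iterable of (link, round) pairs, one per message the
    adversary sees. Returns C with C[(link, round)] = message count."""
    return Counter(observed_messages)

# Example: two links observed over a few rounds.
C = traffic_matrix([("e1", 1), ("e2", 2), ("e1", 4), ("e2", 4)])
print(C[("e1", 1)], C[("e1", 2)])   # 1 0
```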

24

Provable Anonymity in Mix Networks

Information-theory background

• Use information theory to quantify the information gained from observing C.

• H(X) = Σ_x −Pr[X=x] log Pr[X=x] is the entropy of the random variable X.

• I(X : Y) is the mutual information between X and Y:
  I(X : Y) = H(X) − H(X | Y), where H(X | Y) = Σ_{x,y} −Pr[X=x ∧ Y=y] log Pr[X=x | Y=y].
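As a quick illustration of these definitions, a short Python sketch that computes H(X) and I(X : Y) from an explicit joint distribution (the dictionary representation is just a convenience for this example):

```python
import math
from collections import defaultdict

def entropy(dist):
    """H(X) = sum_x -Pr[X=x] * log2 Pr[X=x], for a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X:Y) = H(X) - H(X|Y), where joint[(x, y)] = Pr[X=x, Y=y]."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    # H(X|Y) = sum_y Pr[Y=y] * H(X | Y=y)
    h_x_given_y = 0.0
    for y0, pyy in py.items():
        if pyy > 0:
            cond = {x: p / pyy for (x, y), p in joint.items() if y == y0}
            h_x_given_y += pyy * entropy(cond)
    return entropy(px) - h_x_given_y

# Independent variables carry zero mutual information...
print(mutual_information({(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}))  # 0.0
# ...while fully correlated ones reveal a full bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```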

25

Provable Anonymity in Synchronous Protocols

Definition: The protocol is ε(N)-unlinkable if I(C : Π) ≤ ε(N).

Definition: An ε(N)-unlinkable protocol is efficient if:

1. It takes T(N) = O(polylog(N/ε(N))) rounds.

2. It uses O(N·T(N)) messages.

Theorem (Berman, Fiat, and Ta-Shma, 2004): The basic mixnet protocol is ε(N)-unlinkable and efficient when T(N) = Θ(log N · log²(N/ε(N))).

26

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

27

Onion Routing [GRS’96]

• Practical design with low latency and overhead

• Connection-oriented, two-way communication

• Open source implementation (http://tor.eff.org)

• Over 1000 volunteer routers

• Estimated 200,000 users

28

How Onion Routing Works

[Diagram: user u runs a client; routers 1–5 run onion-routing servers; d is an Internet destination]

29

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

[Diagram: u begins building a circuit through the routers]

30

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

[Diagram: the circuit is extended another hop]

31

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

[Diagram: the completed 3-hop circuit runs u → 1 → 4 → 3]

32

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

[Diagram: u → 1 → 4 → 3 → d]

33

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: u sends {{{m}_3}_4}_1 along the circuit]

34

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: router 1 removes its layer and forwards {{m}_3}_4]

35

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: router 4 removes its layer and forwards {m}_3]

36

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: router 3 removes the last layer and delivers m to d]

37

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: d returns a reply m′ to router 3]

38

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: router 3 wraps the reply as {m′}_3 and sends it to router 4]

39

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: router 4 adds its layer, giving {{m′}_3}_4, and sends it to router 1]

40

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

[Diagram: router 1 adds its layer, giving {{{m′}_3}_4}_1, which u receives and fully decrypts]
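A minimal sketch of this layered data exchange over the circuit u → 1 → 4 → 3 → d. Fernet symmetric keys stand in for the per-hop session keys u shares with each router; this is an illustration of the layering, not Tor’s actual cell protocol:

```python
# Layered encryption over an onion-routing circuit (toy model).
from cryptography.fernet import Fernet

path = [1, 4, 3]                                   # circuit u -> 1 -> 4 -> 3 -> d
keys = {router: Fernet(Fernet.generate_key()) for router in path}

def client_wrap(data: bytes) -> bytes:
    """u adds one layer per hop, innermost layer for the last router."""
    for router in reversed(path):                  # 3, then 4, then 1
        data = keys[router].encrypt(data)
    return data                                    # {{{data}_3}_4}_1

def forward(onion: bytes) -> bytes:
    """Each router peels its layer; the last router sends the plaintext to d."""
    for router in path:
        onion = keys[router].decrypt(onion)
    return onion

def reply(data: bytes) -> bytes:
    """On the way back, each router adds its layer; u removes all of them."""
    for router in reversed(path):
        data = keys[router].encrypt(data)
    return data

request = client_wrap(b"GET /")
print(forward(request))                            # b'GET /' arrives at d
wrapped_reply = reply(b"200 OK")                   # {{{reply}_3}_4}_1 returns to u
for router in path:                                # u strips the layers in path order
    wrapped_reply = keys[router].decrypt(wrapped_reply)
print(wrapped_reply)                               # b'200 OK'
```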

41

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

4. Stream is closed.

[Diagram: u, routers 1–5, destination d]

42

How Onion Routing Works

1. u creates 3-hop circuit through routers (u.a.r.).

2. u opens a stream in the circuit to d.

3. Data are exchanged.

4. Stream is closed.

5. Circuit is changed every few minutes.

[Diagram: u, routers 1–5, destination d]

43

Adversary: Active & Local

[Diagram: u, routers 1–5 (some controlled by the adversary), destination d]

44

Outline

• Anonymity: What and why

• Examples of anonymity systems

• Theory: Definition and proof

• Practice: Onion Routing

• Theory meets practice

45

Formal Analysis (Feigenbaum, Johnson, and Syverson, 2007)

Timing attacks result in four cases:

[Diagram: users u, v, w build circuits through routers 1–5 to destinations d, e, f]

46

Formal Analysis (Feigenbaum, Johnson, and Syverson, 2007)

Timing attacks result in four cases:
1. First router compromised

[Diagram: u’s circuit with its first router controlled by the adversary]

47

Formal Analysis (Feigenbaum, Johnson, and Syverson, 2007)

Timing attacks result in four cases:
1. First router compromised
2. Last router compromised

[Diagram: u’s circuit with its last router controlled by the adversary]

48

Formal Analysis (Feigenbaum, Johnson, and Syverson, 2007)

Timing attacks result in four cases:
1. First router compromised
2. Last router compromised
3. First and last compromised

[Diagram: u’s circuit with both its first and last routers controlled by the adversary]

49

Formal Analysis (Feigenbaum, Johnson, and Syverson, 2007)

Timing attacks result in four cases:
1. First router compromised
2. Last router compromised
3. First and last compromised
4. Neither first nor last compromised

[Diagram: u’s circuit with neither end router controlled by the adversary; other users v, w connect to destinations e, f]
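Assuming each of u’s two end routers is compromised independently with probability b (the fraction of adversary-controlled routers introduced on the next slide), the four cases occur with the probabilities below; a small illustrative sketch:

```python
def timing_attack_cases(b):
    """Probability of each case when u's first and last routers are each
    compromised independently with probability b."""
    return {
        "first and last compromised (u linked to d)": b * b,
        "only first compromised": b * (1 - b),
        "only last compromised": (1 - b) * b,
        "neither compromised": (1 - b) ** 2,
    }

print(timing_attack_cases(0.2))
```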

50

Black-Box, Onion-Routing Model

Let U be the set of users.
Let Δ be the set of destinations.
Let the adversary control a fraction b of the routers.

Configuration C:
• User destinations C_D : U → Δ
• Observed inputs C_I : U → {0,1}
• Observed outputs C_O : U → {0,1}

Let X be a random configuration such that
Pr[X=C] = ∏_u [p_u^{C_D(u)}] [b^{C_I(u)} (1−b)^{1−C_I(u)}] [b^{C_O(u)} (1−b)^{1−C_O(u)}],
where p_u^e is the probability that user u chooses destination e.
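A minimal sketch of sampling a configuration from this distribution; the names (sample_configuration, dest_dists) are illustrative:

```python
import random

def sample_configuration(users, dest_dists, b, rng=random):
    """Draw X = (C_D, C_I, C_O): each user's destination from p_u, plus
    independent probability-b bits for observed input and observed output."""
    C_D, C_I, C_O = {}, {}, {}
    for u in users:
        dests = list(dest_dists[u])
        weights = [dest_dists[u][d] for d in dests]
        C_D[u] = rng.choices(dests, weights=weights, k=1)[0]
        C_I[u] = 1 if rng.random() < b else 0    # u's first router compromised
        C_O[u] = 1 if rng.random() < b else 0    # u's last router compromised
    return C_D, C_I, C_O

# Example: two users, three destinations, a quarter of routers compromised.
dists = {"u": {"d": 0.5, "e": 0.3, "f": 0.2}, "v": {"d": 0.1, "e": 0.1, "f": 0.8}}
print(sample_configuration(["u", "v"], dists, b=0.25))
```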

51

Indistinguishability

Indistinguishable configurations

[Diagram: four configurations of users u, v, w connecting to destinations d, e, f that the adversary cannot tell apart]

52

Indistinguishability

Indistinguishable configurations

[Diagram: four configurations of users u, v, w connecting to destinations d, e, f that the adversary cannot tell apart]

Note: Indistinguishable configurations form an equivalence relation.

53

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u) = d | X ∈ C]

54

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u) = d | X ∈ C]

Note: This is different from the metric of mutual information used to analyze mix nets.

55

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u) = d | X ∈ C]

Exact Bayesian inference

• Adversary after long-term intersection attack

• Worst-case adversary

56

Probabilistic Anonymity

The metric Y for the linkability of u and d in C is:

Y(C) = Pr[X_D(u) = d | X ∈ C]

Exact Bayesian inference

• Adversary after long-term intersection attack

• Worst-case adversary

Linkability given that u visits d:

E[Y | X_D(u) = d]
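To make “exact Bayesian inference” concrete, here is a brute-force sketch under an assumed observation model: the adversary sees which users’ entries it controls, which exits it controls, and links a user to its destination only when it controls both ends. That model, and all names below, are assumptions made for illustration only:

```python
from itertools import product

def observation(dest, inp, out):
    """What the adversary sees under the assumed model."""
    users = sorted(dest)
    entries = frozenset(u for u in users if inp[u])
    linked = frozenset((u, dest[u]) for u in users if inp[u] and out[u])
    unlinked_exits = tuple(sorted(dest[u] for u in users if out[u] and not inp[u]))
    return entries, linked, unlinked_exits

def linkability(u, d, obs, users, dests, p, b):
    """Y(C) = Pr[X_D(u)=d | X in C], where C is the class of configurations
    producing observation `obs`. Exponential enumeration: toy sizes only."""
    num = den = 0.0
    n = len(users)
    for assignment in product(dests, repeat=n):
        dest = dict(zip(users, assignment))
        w_dest = 1.0
        for v in users:
            w_dest *= p[v][dest[v]]
        for bits in product((0, 1), repeat=2 * n):
            inp = dict(zip(users, bits[:n]))
            out = dict(zip(users, bits[n:]))
            if observation(dest, inp, out) != obs:
                continue
            w = w_dest
            for v in users:
                w *= b if inp[v] else 1 - b
                w *= b if out[v] else 1 - b
            den += w
            if dest[u] == d:
                num += w
    return num / den if den else 0.0

# Tiny example: only u's entry is observed, nothing else is.
users, dests = ["u", "v"], ["d", "e"]
p = {"u": {"d": 0.7, "e": 0.3}, "v": {"d": 0.5, "e": 0.5}}
obs = (frozenset({"u"}), frozenset(), ())
print(linkability("u", "d", obs, users, dests, p, b=0.2))  # ~0.7 = p_u^d: nothing links u to d
```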

57

Anonymity Bounds

1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

58

Anonymity Bounds

1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

2. Upper bounds:
   a. p_v^1 = 1 for all v ≠ u, where destination 1 satisfies p_u^1 ≥ p_u^e for all e ≠ d
   b. p_v^d = 1 for all v ≠ u

59

Anonymity Bounds

1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

2. Upper bounds:
   a. p_v^1 = 1 for all v ≠ u, where destination 1 satisfies p_u^1 ≥ p_u^e for all e ≠ d:
      E[Y | X_D(u)=d] ≈ b + (1−b) p_u^d + O(log n / n)
   b. p_v^d = 1 for all v ≠ u:
      E[Y | X_D(u)=d] ≈ b² + (1−b²) p_u^d + O(log n / n)

60

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

61

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:

62

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

63

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

64

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {Cᵢ} be the configuration equivalence classes.
Let Dᵢ be the event X ∈ Cᵢ ∧ X_D(u)=d.

65

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {Cᵢ} be the configuration equivalence classes.
Let Dᵢ be the event X ∈ Cᵢ ∧ X_D(u)=d.

E[Y | X_D(u)=d ∧ X_I(u)=0] = Σᵢ (Pr[Dᵢ])² / (Pr[Cᵢ] Pr[X_D(u)=d])

66

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {Cᵢ} be the configuration equivalence classes.
Let Dᵢ be the event X ∈ Cᵢ ∧ X_D(u)=d.

E[Y | X_D(u)=d ∧ X_I(u)=0]
  = Σᵢ (Pr[Dᵢ])² / (Pr[Cᵢ] Pr[X_D(u)=d])
  ≥ (Σᵢ Pr[Dᵢ])² / Pr[X_D(u)=d]        (by Cauchy–Schwarz, since Σᵢ Pr[Cᵢ] = 1)

67

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let {Cᵢ} be the configuration equivalence classes.
Let Dᵢ be the event X ∈ Cᵢ ∧ X_D(u)=d.

E[Y | X_D(u)=d ∧ X_I(u)=0]
  = Σᵢ (Pr[Dᵢ])² / (Pr[Cᵢ] Pr[X_D(u)=d])
  ≥ (Σᵢ Pr[Dᵢ])² / Pr[X_D(u)=d]        (by Cauchy–Schwarz, since Σᵢ Pr[Cᵢ] = 1)
  = p_u^d

68

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

69

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]
              ≥ b² + b(1−b) p_u^d + (1−b) p_u^d

70

Lower Bound

Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]
              ≥ b² + b(1−b) p_u^d + (1−b) p_u^d
              = b² + (1−b²) p_u^d

71

Upper Bound

72

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … (destinations ordered by u’s probabilities, omitting d).

73

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … (destinations ordered by u’s probabilities, omitting d).

Step 1: Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.

74

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … (destinations ordered by u’s probabilities, omitting d).

Step 1: Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.
Step 2: Show the maximum occurs when, for all v ≠ u, e_v = d or e_v = 1.

75

Upper Bound

Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when

1. p_v^1 = 1 for all v ≠ u, OR

2. p_v^d = 1 for all v ≠ u.

Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … (destinations ordered by u’s probabilities, omitting d).

Step 1: Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.
Step 2: Show the maximum occurs when, for all v ≠ u, e_v = d or e_v = 1.
Step 3: Show the maximum occurs when e_v = d for all v ≠ u, or when e_v = 1 for all v ≠ u.

76

Upper-bound Estimates

Let n be the number of users.

77

Upper-bound Estimates

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1 − (1−p_u^1)b) + O(log n / n)

Let n be the number of users.

78

Upper-bound Estimates

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1 − (1−p_u^1)b) + O(log n / n)

Theorem 5: When p_v^d = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) p_u^d/(1 − (1−p_u^d)b) + O(log n / n)

Let n be the number of users.

79

Upper-bound Estimates

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1 − (1−p_u^1)b) + O(log n / n)

Let n be the number of users.

80

Upper-bound Estimates

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1 − (1−p_u^1)b) + O(log n / n)
              ≈ b + (1−b) p_u^d        (for p_u^1 small)

Let n be the number of users.

81

Upper-bound Estimates

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1 − (1−p_u^1)b) + O(log n / n)
              ≈ b + (1−b) p_u^d        (for p_u^1 small)

Compare the lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let n be the number of users.

82

Upper-bound Estimates

Theorem 4: When p_v^1 = 1 for all v ≠ u:

E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1 − (1−p_u^1)b) + O(log n / n)
              ≈ b + (1−b) p_u^d        (for p_u^1 small)

Compare the lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Let n be the number of users.

Increased chance of total compromise: from b² to b.
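To see the b²-to-b degradation numerically, a short sketch comparing the lower bound with the simplified worst-case estimate from Theorem 4; the p_u^d value is arbitrary:

```python
def lower_bound(b, p_ud):
    """Theorem 2: E[Y | X_D(u)=d] >= b^2 + (1 - b^2) * p_u^d."""
    return b**2 + (1 - b**2) * p_ud

def worst_case_estimate(b, p_ud):
    """Theorem 4, simplified for p_u^1 small: E[Y | X_D(u)=d] ~ b + (1 - b) * p_u^d."""
    return b + (1 - b) * p_ud

p_ud = 0.05
for b in (0.05, 0.1, 0.2, 0.5):
    print(f"b={b:0.2f}  lower={lower_bound(b, p_ud):0.3f}  "
          f"worst-case={worst_case_estimate(b, p_ud):0.3f}")
# The leading term grows from b^2 to b: adversarial behavior by the other
# users sharply increases the chance that u is fully deanonymized.
```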

83

Conclusions

• Many challenges remain in the design, implementation, and analysis of anonymous-communication systems.

• It is hard to prove theorems about real systems – or even to figure out what to prove.

• “Nothing is more practical than a good theory!” (Tanya Berger-Wolf, UIC)