Alma Cemerlic
Department of Computer Science and Engineering
Master Thesis Defense


Page 1

Alma Cemerlic
Department of Computer Science and Engineering

Master Thesis Defense

Page 2

Introduction
  Wireless ad hoc network
  Motivation of the study
  Definition of terms
Related work
Our methodology
  Fine-grained reputation system
  Integration with a wireless routing protocol
Simulation and evaluation
Conclusion

Page 3

Ad hoc mode
  No base stations
  Nodes can only transmit to other nodes within link coverage
  Nodes organize themselves into a network and route among themselves

Page 4

Ad hoc network characteristics
  1. No infrastructure support
  2. Limited resources (power, memory, and processing)
  3. Easily eavesdropped
  4. Naïve trust model
Examples
  Hostile environments such as battlefields or rescue operations

Page 5

Most legacy models of wireless ad hoc routing do not consider security and use hop count as a measure of path cost.

We explore the reliability of each node to improve quality of service and security.

An approach to routing that incorporates the security (reputation) of nodes into legacy routing metrics.

Page 6

Basic idea: route packets through nodes with high reputation values
  Protect network traffic from misbehaving nodes
  Improve effective throughput
  Minimize interaction with misbehaving nodes

Solution
  Evaluate the reputation of each node
  Integrate the reputation metric into route selection

Page 7

Neighbors – two nodes whose communication ranges overlap.
Neighborhood – all neighbors of a node.
Behavior – the way a node handles packet forwarding, i.e., correctly forwards, alters, injects, etc.
Reputation – evaluated through observation of a neighbor's behavior.
Report – reputation information periodically exchanged among neighboring nodes.
Trust – indicates whether reports coming from a particular node can be considered trustworthy.

Page 8

Bidirectional communication on every link

▪ Many wireless Medium Access Control (MAC) layer protocols require bidirectional communication for reliable transmission

Network interfaces on the nodes support promiscuous mode operation

▪ If a node A is within range of node B, it can overhear all communications to and from B

Page 9

Adversaries are nodes that misbehave in such ways that they degrade integrity and availability of the network

Misbehaviors include:
  Packet alteration and injection, done by malicious nodes when they are supposed to forward packets for other nodes
  Packet dropping, done by selfish nodes when they are supposed to forward packets for other nodes

Page 10

Use forwarding behavior of a node to estimate its reliability.

Nodes categorized as good or bad (Beta Reputation System)
  Distinguishing between selfish and malicious behavior is not possible
  ▪ CONFIDANT (Buchegger 2002a), CORE (Michiardi 2002a), SORI (Wang 2004b), SAFE (Rebahi 2005)

Virtual currency in wireless routing
  Packet Purse Model, Packet Trade Model
  ▪ Nuglets (Buttyan 2001), (Wang 2004a), (Jakobsson 2003)

Page 11

“Self-organized system”
  Decentralized; cooperation cannot be guaranteed
Reputation systems – “soft security”
  Stimulate ethical behavior and integrity of members in collaborative environments; recognize and sanction intolerable behavior, reward obedient members
Each node observes the behavior and evaluates the reputation of its neighbors
Neighbors exchange reputation information
Reputation is used to select the most reliable and secure path from source to destination

Page 12

Friendly nodes
  ▪ Correctly forward packets; expected behavior
Selfish nodes
  ▪ Drop packets; expect other nodes to route their packets
  ▪ Physical properties (battery, overload), attempts to save resources, random failures
  ▪ Harm the availability of the network
Malicious nodes
  ▪ Misroute, alter, or inject packets
  ▪ Harm the integrity of packets

If behavior changes, the node's reputation changes correspondingly.

Page 13

Total reputation is the combination of first-hand and second-hand reputation
  First-hand reputation: from direct observation of neighbors' behavior
  Second-hand reputation: from neighbors' reports

Page 14

Each node observes the forwarding behavior of its neighbors: packets may be correctly forwarded, dropped, or maliciously modified.

A node's behavior follows the Dirichlet distribution Dir(α), where α = (α1, ..., αn)
  Conjugate prior to the multinomial distribution
  The probability that each independent trial results in exactly one of some fixed finite number n of possible outcomes, with probabilities p1, ..., pn

Bayes' Theorem
  Allows us to incorporate a new observation into prior knowledge and obtain the posterior probability of a node's behavior.
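As a reminder of why conjugacy matters here (a standard Bayesian identity, not taken from the slides): multiplying a Dirichlet prior by a multinomial likelihood yields another Dirichlet whose parameters are the prior counts plus the observed counts, which is what makes the per-window update on the next slides a simple addition.

$$
p(\mathbf{p} \mid \boldsymbol{\alpha}) \propto \prod_{i=1}^{n} p_i^{\alpha_i - 1},
\qquad
p(D \mid \mathbf{p}) \propto \prod_{i=1}^{n} p_i^{N_i}
\;\Longrightarrow\;
p(\mathbf{p} \mid D) \propto \prod_{i=1}^{n} p_i^{\alpha_i + N_i - 1}
= \mathrm{Dir}(\alpha_1 + N_1, \ldots, \alpha_n + N_n)
$$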

Page 15

Combine the Dirichlet distribution and Bayes' Theorem.

Given the prior

$$P(X_i) = \mathrm{Dir}(X_i \mid \alpha_{i1}, \ldots, \alpha_{ir})$$

the posterior distribution is calculated as

$$P(X_i \mid D) \propto P(D \mid X_i)\, P(X_i) = \mathrm{Dir}(X_i \mid \alpha_{i1} + N_{i1}, \ldots, \alpha_{ir} + N_{ir})$$

Starting with the initial state of the prior distribution, $\alpha_i^{0}$, the parameters are updated whenever new data D is available; N represents the counts of the new observations.

The first-hand reputation value at time t is equal to the expectation of Dir(α):

$$E[x_{ij}] = \frac{\alpha_{ij}}{\sum_{k=1}^{r} \alpha_{ik}}$$
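A minimal Python sketch of this update (the function names, behavior ordering, and uniform prior are illustrative assumptions, not taken from the thesis): it adds the counts from a new observation window to the Dirichlet parameters and returns the expectation used as the first-hand reputation.

from typing import List

# Behaviors by index: 0 = correctly forwarded, 1 = dropped, 2 = modified (illustrative ordering).

def update_dirichlet(alpha: List[float], new_counts: List[int]) -> List[float]:
    """Posterior parameters: alpha_i' = alpha_i + N_i."""
    return [a + n for a, n in zip(alpha, new_counts)]

def expectation(alpha: List[float]) -> List[float]:
    """First-hand reputation F = E[Dir(alpha)], i.e. alpha_i / sum(alpha)."""
    total = sum(alpha)
    return [a / total for a in alpha]

# Example: start from an (assumed) uniform prior and fold in one observation window.
alpha = [1.0, 1.0, 1.0]
alpha = update_dirichlet(alpha, [40, 10, 0])   # counts from overheard packets
print(expectation(alpha))                      # approx. [0.774, 0.208, 0.019]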

Page 16

Behaviors of a node change from friendly to selfish and malicious.

Window | Observed packets | α1 | α2 | α3
0      | 1                | 0  | 0  |
1      | 50               | 40 | 10 | 0
2      | 50               | 30 | 15 | 5
3      | 50               | 40 | 5  | 5
4      | 50               | 35 | 10 | 5
5      | 50               | 40 | 10 | 0

Total number of observed packets in windows 0-5: 250, with α totals (185, 50, 15)

Expectation of X_i: 185 / 250 = 0.74, 50 / 250 = 0.2, 15 / 250 = 0.06
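For completeness, a short Python check of the arithmetic above (variable names are illustrative): it accumulates the per-window counts and reproduces the expectation values 0.74, 0.2, and 0.06.

# Per-window counts (alpha1, alpha2, alpha3) for windows 1-5 from the table above.
windows = [(40, 10, 0), (30, 15, 5), (40, 5, 5), (35, 10, 5), (40, 10, 0)]
totals = [sum(col) for col in zip(*windows)]        # [185, 50, 15]
grand_total = sum(totals)                           # 250
expectations = [t / grand_total for t in totals]    # [0.74, 0.2, 0.06]
print(totals, grand_total, expectations)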

Page 17

Collaborative monitoring: exchange first-hand reputation with the neighbors
  Nodes are required to periodically broadcast their first-hand reputation reports

Deviation test – to detect false reports
  Increase or decrease trust?
  Incorporate the report into the total reputation value?

$$\left| E\big(\mathrm{Dir}(\alpha_1, \alpha_2, \alpha_3)_{AB}\big) - E\big(\mathrm{Dir}(\alpha_1, \alpha_2, \alpha_3)_{CB}\big) \right| \le d$$
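A sketch of the deviation test in Python, under the assumption that the two Dirichlet expectations are compared component-wise and the largest difference is checked against the threshold d (the distance measure and all names are illustrative):

def expectation(alpha):
    total = sum(alpha)
    return [a / total for a in alpha]

def deviation_test(alpha_AB, alpha_CB, d):
    """Accept C's report about B only if its expectation stays within d of
    A's own first-hand expectation (max component difference used here
    as an illustrative distance)."""
    e_own = expectation(alpha_AB)
    e_report = expectation(alpha_CB)
    deviation = max(abs(x - y) for x, y in zip(e_own, e_report))
    return deviation <= d

# Example: a consistent report passes, an exaggerated one fails.
print(deviation_test([40, 8, 2], [38, 10, 2], d=0.1))   # True
print(deviation_test([40, 8, 2], [5, 5, 40], d=0.1))    # False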

Page 18

Indicates how trustworthy a neighbor's reports are.

The same Bayesian approach as for the fine-grained reputation system
  Only two possible instances of behavior: trustworthy and not trustworthy
  Use the Beta distribution, the conjugate prior to the Binomial distribution
  T_AB ~ Beta(γ, δ), where γ = trustworthy, δ = not trustworthy
  Updated when the results of the deviation test are available

The trust value is calculated as:

$$\omega_{AB} = E\big(\mathrm{Beta}(\gamma, \delta)\big)$$
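A minimal sketch of the trust update in Python, assuming each deviation-test outcome simply increments γ or δ and that the trust value is the Beta mean γ/(γ+δ); the Beta(1, 1) starting point is an illustrative choice, not taken from the slides.

def update_trust(gamma: float, delta: float, passed_deviation_test: bool):
    """Beta-distributed trust: gamma counts consistent reports,
    delta counts inconsistent ones."""
    if passed_deviation_test:
        gamma += 1.0
    else:
        delta += 1.0
    omega = gamma / (gamma + delta)   # trust value = E[Beta(gamma, delta)]
    return gamma, delta, omega

# Example: start from an uninformative prior Beta(1, 1).
gamma, delta, omega = update_trust(1.0, 1.0, passed_deviation_test=True)
print(omega)   # 0.666...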

Page 19

Merge first-hand and second-hand reputation information

Second-hand reputation is discounted by the factor ω (trust value) expressing the disbelief in the accuracy of the report

$$R_{AB} = F_{AB} + \omega_{AC} \cdot F_{CB}$$
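A sketch of the merge step, following the reconstruction R_AB = F_AB + ω_AC · F_CB and extending it to several reporters; the final renormalization (so the result is again a distribution over friendly/selfish/malicious) is my assumption, not stated on the slide.

def merge_reputation(F_AB, reports, trust):
    """Total reputation: first-hand value plus each reporter's value
    discounted by the trust omega in that reporter, then renormalized."""
    r = list(F_AB)
    for reporter, F_CB in reports.items():
        w = trust.get(reporter, 0.0)
        r = [ri + w * fi for ri, fi in zip(r, F_CB)]
    total = sum(r)
    return [ri / total for ri in r]

# Example: A's own view of B, plus a report from C discounted by omega_AC = 0.8.
print(merge_reputation([0.8, 0.15, 0.05], {"C": [0.7, 0.2, 0.1]}, {"C": 0.8}))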

Page 20

[Diagram: node A computes first-hand reputation for neighbor B from its observation windows, receives reputation reports F_CB and F_DB from reporters C and D, runs the deviation test and trust threshold test, and maintains a trust table.]

1. Calculate first-hand reputation F_AB based on observations.
2. Receive second-hand reputation reports F_CB and F_DB from neighbors C and D.
3. Run the deviation test and the trust threshold test on the reports.
4. Depending on the outcome of the deviation test and the trust threshold test, decide whether to accept the second-hand reports.
5. Merge first-hand and second-hand reputation values to derive the total reputation value R_AB.
6. Update the trust values for the reporters C and D (ω_AC and ω_AD) depending on the outcome of the deviation test.

(A combined sketch of these steps follows below.)
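Putting the six steps together, a compact and deliberately simplified Python sketch of one reporting round at node A (all names, thresholds, and the exact acceptance rule are illustrative assumptions):

def one_round(alpha_AB, reports, trust, d=0.1, trust_threshold=0.5):
    """One reporting round at node A for neighbor B.
    alpha_AB: A's Dirichlet counts for B (first-hand observations).
    reports:  dict reporter -> reputation vector reported about B.
    trust:    dict reporter -> (gamma, delta) Beta trust counts."""
    def expectation(a):
        return [x / sum(a) for x in a]
    F_AB = expectation(alpha_AB)                     # 1. first-hand reputation
    R = list(F_AB)
    for c, F_CB in reports.items():                  # 2. second-hand reports
        gamma, delta = trust.get(c, (1.0, 1.0))
        omega = gamma / (gamma + delta)
        passed = max(abs(x - y) for x, y in zip(F_AB, F_CB)) <= d   # 3. deviation test
        if passed and omega >= trust_threshold:      # 4. accept or reject the report
            R = [r + omega * f for r, f in zip(R, F_CB)]            # 5. merge
        trust[c] = (gamma + 1.0, delta) if passed else (gamma, delta + 1.0)  # 6. trust update
    total = sum(R)
    return [r / total for r in R], trust

# Example: two reporters, one consistent with A's view and one lying about B.
R_AB, trust = one_round([40, 8, 2],
                        {"C": [0.78, 0.18, 0.04], "D": [0.1, 0.1, 0.8]},
                        {"C": (4.0, 1.0), "D": (1.0, 1.0)})
print(R_AB)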

Page 21

Two ways to update first-hand information
  Based on all observations
  Based on the most recent observations
  ▪ Reduces computational complexity
  ▪ Early detection of changes in behavior
  ▪ Possibility of redemption over time for misbehaving nodes

To calculate first-hand information, we divide the observation history into time intervals (windows) of equal size and consider only a limited number of the most recent intervals (see the sketch below).
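A sketch of the moving-window bookkeeping in Python, assuming per-window Dirichlet counts kept in a fixed-length deque so that old windows age out automatically (the window limit and names are illustrative):

from collections import deque

WINDOW_LIMIT = 3   # number of most recent windows considered (illustrative)

windows = deque(maxlen=WINDOW_LIMIT)   # each entry: (n_forwarded, n_dropped, n_modified)

def add_window(counts):
    """Record one observation window; the deque silently drops the oldest one."""
    windows.append(counts)

def recent_reputation():
    """First-hand reputation computed from the most recent windows only."""
    totals = [sum(col) for col in zip(*windows)]
    s = sum(totals)
    return [t / s for t in totals]

# Example using the counts from the next slide: with only windows 3-5 kept,
# the friendly component drops to 0.8 (vs. 0.88 over the full history).
for c in [(50, 0, 0), (50, 0, 0), (40, 0, 10), (40, 0, 10), (40, 0, 10)]:
    add_window(c)
print(recent_reputation())   # [0.8, 0.0, 0.2]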

Page 22

Window | Observed packets | α1 | α2 | α3
1      | 50               | 50 | 0  | 0
2      | 50               | 50 | 0  | 0
3      | 50               | 40 | 0  | 10
4      | 50               | 40 | 0  | 10
5      | 50               | 40 | 0  | 10

Windows 1-5 (full history): 250 observed packets, α totals (220, 0, 30)
  Expectation of X_i: 220 / 250 = 0.88, 0 / 250 = 0, 30 / 250 = 0.12

Windows 3-5 (recent history): 150 observed packets, α totals (120, 0, 30)
  Expectation of X_i: 120 / 150 = 0.8, 0 / 150 = 0, 30 / 150 = 0.2

Reputation based on recent history better reflects changes in behavior.

Page 23

Our fine-grained system classifies wireless nodes as friendly, selfish, or malicious. Malicious nodes cause more damage than selfish nodes.

Reputation of each node is denoted by a vector containing friendly, selfish and malicious parameters (f, s, and m).

Each node considers the reputation of its neighbor nodes when routing packets.

[Diagram: example topology with nodes A, B, C, and D; link reputation vectors <0.8, 0.2, 0.0> and <0.9, 0.1, 0.1>.]

Page 24

A cost function converts a reputation vector <f, s, m> into a cost value used to determine the most reliable path:

$$C_A(f_A, s_A, m_A) = a \cdot f_A + b \cdot s_A + c \cdot m_A, \qquad a + b + c = 1$$

Malicious nodes cause more damage than selfish nodes; therefore b < c.

Routing when there is no distinction between selfish and malicious behavior: compare the fine-grained reputation system against the Beta reputation system.
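A sketch of the cost function as reconstructed above, applied per node and summed along a path. The weights are illustrative (the slide only requires a + b + c = 1 and b < c); one way to read the a·f term is that it preserves a hop-count-like baseline cost while selfish and malicious behavior add extra penalties.

def path_cost(reputations, a=0.2, b=0.3, c=0.5):
    """Path cost = sum over nodes of C(f, s, m) = a*f + b*s + c*m,
    with a + b + c = 1 and b < c so malicious behavior costs more."""
    assert abs(a + b + c - 1.0) < 1e-9 and b < c
    return sum(a * f + b * s + c * m for (f, s, m) in reputations)

# Example with the reputation vectors from the earlier diagram slide:
# <0.8, 0.2, 0.0> -> 0.22 and <0.9, 0.1, 0.1> -> 0.26, so the path cost is 0.48.
print(path_cost([(0.8, 0.2, 0.0), (0.9, 0.1, 0.1)]))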

Page 25

Moving window size
  Determines the sensitivity of the system
  Allows gradual redemption

[Chart: reputation (f component) vs. observation window for a selfish node and a malicious node, with the detection threshold shown.]

Page 26

50 nodes are placed on an 800 m by 800 m stage
Data packet emission rate is 50 packets/second
Low mobility – static
All nodes are connected
Random sender-destination pairs

Simulator implemented in Adobe Flash CS3, in AS3.0
  Graphical and interactive capabilities
  Support for object-oriented programming
  Platform independence and portability

Page 27

Evaluation metrics
  Effective throughput
  Percentage of packets manipulated by malicious nodes
  Connectivity of the network

Compare the fine-grained reputation system with
  Beta system – classifies nodes as friendly or selfish
  Beta system – classifies nodes as friendly or malicious

Page 28

Same scenarios tested on all three systems

Selfish nodes – drop approximately 20% of packets

Malicious nodes – inject approximately 20% of packets

Number of misbehaving nodes varies from 4% to 40% of the population

Page 29

Effective throughput
  The ratio of legitimate traffic to total traffic, where legitimate means that the traffic was not produced as a consequence of malicious activity
  Our system shows the highest effective throughput, above both Beta systems

Effective throughput per simulation case (% selfish / % malicious nodes):

Case (selfish / malicious) | <f,s,m> | <g,b> malicious | <g,b> selfish
4% / 36%                   | 0.8848  | 0.8442          | 0.8603
10% / 30%                  | 0.8894  | 0.8114          | 0.8347
20% / 20%                  | 0.9221  | 0.8467          | 0.8586
30% / 10%                  | 0.988   | 0.9232          | 0.9099
36% / 4%                   | 0.9981  | 0.9952          | 0.9871
4% / 26%                   | 0.9308  | 0.9029          | 0.9019
10% / 20%                  | 0.9731  | 0.9203          | 0.921
20% / 10%                  | 0.993   | 0.9478          | 0.9685
26% / 4%                   | 0.9983  | 0.9944          | 0.9961
4% / 16%                   | 0.9923  | 0.9829          | 0.9689
10% / 10%                  | 0.9936  | 0.992           | 0.9888
16% / 4%                   | 0.998   | 0.9956          | 0.9972
4% / 6%                    | 0.9972  | 0.9879          | 0.994
4% / 4%                    | 0.9916  | 0.9901          | 0.9912
6% / 4%                    | 0.9968  | 0.9869          | 0.994

Page 30

Percentage of packets manipulated by malicious nodes
  Our system shows the lowest percentage of packets manipulated by malicious nodes.

Percentage of injected packets per simulation case (% selfish / % malicious nodes):

Case (selfish / malicious) | <f,s,m> | <g,b> malicious | <g,b> selfish
4% / 36%                   | 0.1122  | 0.1545          | 0.1376
10% / 30%                  | 0.0991  | 0.1803          | 0.1619
20% / 20%                  | 0.0658  | 0.1394          | 0.1331
30% / 10%                  | 0.0099  | 0.064           | 0.0795
36% / 4%                   | 0.0016  | 0.004           | 0.0107
4% / 26%                   | 0.0679  | 0.0962          | 0.0968
10% / 20%                  | 0.025   | 0.0765          | 0.0761
20% / 10%                  | 0.0064  | 0.0487          | 0.0287
26% / 4%                   | 0.0016  | 0.0052          | 0.0036
4% / 16%                   | 0.0075  | 0.0169          | 0.0306
10% / 10%                  | 0.0064  | 0.0079          | 0.0111
16% / 4%                   | 0.002   | 0.0044          | 0.0028
4% / 6%                    | 0.0028  | 0.0119          | 0.006
4% / 4%                    | 0.0083  | 0.0099          | 0.0087
6% / 4%                    | 0.0032  | 0.013           | 0.006

Page 31

Connectivity: the average cost for nodes to communicate with each other.
  Our system shows higher connectivity than the Beta (malicious) system.
  Our system shows slightly lower connectivity than the Beta (selfish) system, but the increased connectivity in the latter is used to carry malicious traffic, as indicated by its lower effective throughput.

[Chart: connectivity vs. simulation case (% selfish / % malicious: 4/36, 10/30, 20/20, 30/10, 36/4) for the <f,s,m>, <g,b> malicious, and <g,b> selfish systems.]

Page 32

Random communication, 10% misbehaving on every 50 packets forwarded

Effective throughput
  ▪ Beta malicious vs. <f,s,m>: 2.84% increase
  ▪ Beta selfish vs. <f,s,m>: 2.5% increase
Percentage of malicious packets
  ▪ Beta malicious vs. <f,s,m>: 2.76% decrease
  ▪ Beta selfish vs. <f,s,m>: 2.49% decrease
Connectivity (over the 40% misbehaving set)
  ▪ Beta malicious vs. <f,s,m>: 9.55% increase
  ▪ Beta selfish vs. <f,s,m>: 10.56% decrease

Page 33

Metric                                                | Fine-grained system | Beta (selfish) system | Beta (malicious) system
Effective throughput                                  | HIGH                | LOW                   | MEDIUM
Percentage of packets manipulated by malicious nodes  | LOW                 | MEDIUM                | HIGH
Connectivity                                          | MEDIUM              | HIGH                  | MEDIUM

Page 34

CONTRIBUTIONS

Proposed a novel solution to evaluating and managing reputation and trust in a P2P environment.

Integrated reputation management into wireless ad hoc routing.

Defended wireless ad hoc networks against misbehaving nodes.

Implemented and evaluated the fine-grained reputation system in the context of wireless ad hoc routing.

Our system performs better than the Beta reputation systems: higher effective throughput and fewer packets manipulated by malicious nodes.

Page 35

Buchegger, S., Le Boudec, J. (2002). Performance analysis of the CONFIDANT protocol: Cooperation Of Nodes – Fairness In Dynamic Ad-hoc NeTworks. IEEE/ACM Symposium on Mobile Ad Hoc Networking and Computing (MobiHOC), Lausanne, Switzerland.

Buchegger, S., Mundinger, J., Le Boudec, J. (2008). Reputation systems for self-organized networks: lessons learned. IEEE Technology and Society Magazine, Special Issue on Limits of Cooperation in Wireless Communications.

Buttyan, L., Hubaux, J. (2001). Nuglets: a virtual currency to stimulate cooperation in self-organized ad hoc networks. Lausanne, Switzerland, Institute for Computer Communications and Applications, Swiss Federal Institute of Technology.

Jakobsson, M., Hubaux, J., Buttyan, L. (2003). A micro-payment scheme encouraging collaboration in multi-hop cellular networks. Proceedings of Financial Crypto, La Guadeloupe.

Michiardi, P., Molva, R. (2002a). CORE: A COllaborative Reputation mEchanism to enforce node cooperation in mobile ad hoc networks. The IFIP TC6/TC11 Sixth Joint Working Conference on Communications and Multimedia Security: Advanced Communications and Multimedia Security.

Rebahi, Y., Mujica, V. E., Simons, C., Sisalem, D. (2005). SAFE: Securing pAcket Forwarding in ad hoc nEtworks. 5th Workshop on Applications and Services in Wireless Networks. Paris, France.

Wang, W., Li, X., Wang, Y. (2004a). Ad hoc-VCG: a truthful and cost-efficient routing protocol for mobile ad hoc networks with selfish agents. MobiCom '04, San Diego (CA), ACM.

Wang, Y., Giruka, V. C., Singhal, M. (2004b). A fair distributed solution for selfish nodes problem in wireless ad hoc networks. Ad-Hoc, Mobile, and Wireless Networks, Springer Berlin / Heidelberg. 3158: 211-224.

Yang, L., Kizza, J.M., Cemerlic, A., Liu, F. (2007). Fine-grained reputation-based routing in wireless ad hoc networks. IEEE International Conference on Intelligence and Security Informatics, New Brunswick, NJ, IEEE.

Yang, L., Novobilski, A, Ege, R. (2008). Trust-based usage control in collaborative environment. The International Journal of Information Security and Privacy 2(2).

Page 36