A Framework for Using Trust to Assess Risk in Information Sharing
Chatschik Bisdikian, Yuqing Tang, Federico Cerutti, Nir Oren
AT-2013, Thursday 1st August 2013
© 2013 Federico Cerutti <[email protected]>

Cerutti-AT2013-Trust and Risk


DESCRIPTION

In this paper we describe a decision process framework allowing an agent to decide what information it should reveal to its neighbours within a communication graph in order to maximise its utility. We assume that these neighbours can pass information onto others within the graph, and that the communicating agent gains and loses utility based on the information which can be inferred by specific agents following the original communicative act. To this end, we construct an initial model of information propagation and describe an optimal decision procedure for the agent.


Page 1: Cerutti-AT2013-Trust and Risk

A Framework for Using Trust to Assess Risk in Information Sharing

Chatschik Bisdikian, Yuqing Tang, Federico Cerutti, Nir Oren

AT-2013, Thursday 1st August 2013

© 2013 Federico Cerutti <[email protected]>

Page 2: Cerutti-AT2013-Trust and Risk

Summary

Framework for describing how much information should be disclosed
Preliminary discussion on multi-agent systems
Illustration of the relevant definition with a scenario
Description of the decision support this framework can provide given this scenario
Missing in this presentation: some statistical properties of the proposed approach


Page 3: Cerutti-AT2013-Trust and Risk

A Scenario

British Intelligence sent two spies, James and Alec, to France
James: clever, very loyal
Alec: clumsy, selfish

London knows that France will be invaded by Germany, but London just informs her men that France will be invaded by a European country
Purpose: James and Alec can use this information for recruiting new agents in France
Risk: if they share that Germany will invade France, this will result in a loss of credibility for the UK government (they are the only ones aware of these plans)


Page 4: Cerutti-AT2013-Trust and Risk

A Probabilistic Approach: the Big Picture

[Figure: a producer p shares a message x with a consumer c. From x, c infers information y, with Pr(infer y | x) ≈ f_I(y; x) dy (inference). When c acts on y, the producer incurs an impact z, with Pr(impact z | y) ≈ f_B(z; y) dz (behavioral trust). Composing the two yields the overall risk to p: Pr(impact z | x) ≈ f_R(z; x) dz.]


Page 5: Cerutti-AT2013-Trust and Risk

The Formal Definitions (i)

Definition
A Framework for Risk Assessment (FRA) is a 6-tuple

〈A, C, M, ag, m, Tg〉

where:

A is a set of agents;
C ⊆ A × A is the set of communication links among agents;
M is the set of all the messages that can be exchanged;
ag ∈ A is the producer, viz. the agent that shares information;
m ∈ M is a message to be assessed;
A \ {ag} is the set of consumers, and in particular:
  Tg ⊆ A \ {ag} are the desired consumers, with 〈ag, agX〉 ∈ C for all agX ∈ Tg;
  A \ ({ag} ∪ Tg) are the undesired consumers.


Page 6: Cerutti-AT2013-Trust and Risk

The Example Formalised (i)

FRA_BI = 〈A_BI, C_BI, M_BI, ag_BI, m_BI, Tg_BI〉, where:

{BI, James, Alec} ⊆ A_BI;
{〈BI, James〉, 〈James, BI〉, 〈BI, Alec〉, 〈Alec, BI〉} ⊆ C_BI;
{m1, m2} ⊆ M_BI, with:
  m1: France will be invaded by Germany;
  m2: France will be invaded by a European country;
ag_BI = BI;
m_BI = m1;
{James, Alec} ⊆ Tg_BI.
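To make this instantiation concrete, here is a minimal sketch of how FRA_BI could be encoded in code. The class and field names are illustrative choices for this sketch, not part of the framework itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FRA:
    """Illustrative encoding of a Framework for Risk Assessment (a 6-tuple)."""
    agents: frozenset        # A
    links: frozenset         # C ⊆ A × A, directed communication links
    messages: frozenset      # M
    producer: str            # ag ∈ A
    message: str             # m ∈ M, the message to be assessed
    targets: frozenset       # Tg, the desired consumers

    @property
    def undesired_consumers(self):
        # A \ ({ag} ∪ Tg): everyone who is neither producer nor desired consumer
        return self.agents - self.targets - {self.producer}

m1 = "France will be invaded by Germany"
m2 = "France will be invaded by a European country"

fra_bi = FRA(
    agents=frozenset({"BI", "James", "Alec"}),
    links=frozenset({("BI", "James"), ("James", "BI"),
                     ("BI", "Alec"), ("Alec", "BI")}),
    messages=frozenset({m1, m2}),
    producer="BI",
    message=m1,
    targets=frozenset({"James", "Alec"}),
)
```

In this toy instance the set of undesired consumers is empty; adding any further agent to `agents` without adding it to `targets` would make it an undesired consumer.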


Page 7: Cerutti-AT2013-Trust and Risk

The Formal Definitions (ii)

Definition
Given a set of agents A, a message m ∈ M, and ag1, ag2 ∈ A, x^{ag2}_{ag1}(m) ∈ [0, 1] is the degree of disclosure of message m used between agent ag1 and agent ag2, where x^{ag2}_{ag1}(m) = 0 implies no sharing and x^{ag2}_{ag1}(m) = 1 implies full disclosure between the two agents.

We define the disclosure function as

d : M × [0, 1] → M

d(·, ·) accepts as input a message and a degree of disclosure of the same message, and returns the disclosed part of the message as a new message.


Page 8: Cerutti-AT2013-Trust and Risk

The Example Formalised (ii)

Let’s suppose that x^{James}_{BI} = x^{Alec}_{BI} = x. In other terms, BI uses the same disclosure degree with both James and Alec.

In addition, d(m1, x) = m2.

N.B.
m1: France will be invaded by Germany;
m2: France will be invaded by a European country.
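A sketch of one possible disclosure function for this example. The framework only requires that d(m1, x) = m2 for the degree x that BI actually uses; the cut-off values and the value of x below are assumptions made purely for illustration.

```python
M1 = "France will be invaded by Germany"
M2 = "France will be invaded by a European country"

def disclose(message: str, degree: float) -> str:
    """Illustrative d(m, x): returns the disclosed part of m as a new message."""
    if not 0.0 <= degree <= 1.0:
        raise ValueError("degree of disclosure must lie in [0, 1]")
    if degree == 0.0:            # x = 0: no sharing at all
        return ""
    if message == M1 and degree < 1.0:
        return M2                # partial disclosure hides the invader's identity
    return message               # x = 1: full disclosure

x = 0.5                          # assumed common degree BI uses with James and Alec
assert disclose(M1, x) == M2
assert disclose(M1, 1.0) == M1
```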


Page 9: Cerutti-AT2013-Trust and Risk

Disclosure Degree and Multi-Agent Networks

d(m′, x^{ag3}_{ag2}) = d(m, x^{ag3}_{ag1})

where

x^{ag3}_{ag1} = 〈s^{ag2}_{ag1}, x^{ag2}_{ag1}〉 ⊛ 〈s^{ag3}_{ag2}, x^{ag3}_{ag2}〉;

s^{ag2}_{ag1} ∈ [0, 1] is the probability that ag1 will propagate to ag2 the disclosed part of m that it receives;

⊛ is a transitive function such that

⊛ : ([0, 1] × [0, 1]) × ([0, 1] × [0, 1]) → [0, 1]

and x^{ag3}_{ag1} ≤ x^{ag2}_{ag1}.


Page 10: Cerutti-AT2013-Trust and Risk

Disclosure Degree and Multi-Agent Networks

merge(d(m′, x^{ag4}_{ag2}), d(m″, x^{ag4}_{ag3})) = d(m, x^{ag4}_{ag1})

where

x^{ag4}_{ag1} = (〈s^{ag2}_{ag1}, x^{ag2}_{ag1}〉 ⊛ 〈s^{ag4}_{ag2}, x^{ag4}_{ag2}〉) ⊕ (〈s^{ag3}_{ag1}, x^{ag3}_{ag1}〉 ⊛ 〈s^{ag4}_{ag3}, x^{ag4}_{ag3}〉);

s^{ag2}_{ag1} ∈ [0, 1] is the probability that ag1 will propagate to ag2 the disclosed part of m that it receives;

⊕ is a transitive function

⊕ : [0, 1] × [0, 1] → [0, 1]

and x^{ag4}_{ag1} ≤ min{x^{ag2}_{ag1}, x^{ag3}_{ag1}}.
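These two slides leave ⊛ and ⊕ abstract, requiring only the bounds above. The sketch below picks one concrete instantiation that respects those bounds; this is an assumption for illustration, not the paper's definition: ⊛ multiplies the propagation probabilities and takes the minimum of the two disclosure degrees, and ⊕ takes the minimum of the two incoming estimates.

```python
from typing import Tuple

Hop = Tuple[float, float]   # (s, x): propagation probability and disclosure degree

def serial(hop_1: Hop, hop_2: Hop) -> float:
    """Assumed instantiation of ⊛ for a chain ag1 → ag2 → ag3.

    The result never exceeds the first hop's disclosure degree, matching the
    constraint x^{ag3}_{ag1} ≤ x^{ag2}_{ag1}.
    """
    s1, x1 = hop_1
    s2, x2 = hop_2
    return s1 * s2 * min(x1, x2)

def combine(x_via_first: float, x_via_second: float) -> float:
    """Assumed instantiation of ⊕ for merging two paths into the same agent.

    Taking the minimum guarantees x^{ag4}_{ag1} ≤ min{x^{ag2}_{ag1}, x^{ag3}_{ag1}}.
    """
    return min(x_via_first, x_via_second)

# ag1 → ag2 → ag4 and ag1 → ag3 → ag4, with purely illustrative numbers
x_via_ag2 = serial((0.9, 0.8), (0.5, 0.6))    # ≈ 0.9 * 0.5 * 0.6 = 0.27
x_via_ag3 = serial((0.7, 0.4), (0.9, 0.9))    # ≈ 0.7 * 0.9 * 0.4 = 0.252
x_ag4_ag1 = combine(x_via_ag2, x_via_ag3)     # ≈ 0.252
```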


Page 11: Cerutti-AT2013-Trust and Risk

The Formal Definitions (iii)

Definition
Given a FRA 〈A, C, M, ag, m, Tg〉, let agX ∈ Tg:

P(x^{agX}_{ag}) is a r.v. (F_P(·; x^{agX}_{ag}), f_P(·; x^{agX}_{ag})) representing the benefit agent ag receives when sharing the message m with agent agX with a degree of disclosure x^{agX}_{ag};

y_{ag2|x^{ag2}_{ag1}} ∈ [0, 1] is the amount of knowledge of m that ag2 can infer given x^{ag2}_{ag1}, according to the r.v. I_{ag2}(x^{ag2}_{ag1}) (F_{I_{ag2}}(·; x^{ag2}_{ag1}), f_{I_{ag2}}(·; x^{ag2}_{ag1}));

z_{ag1|x^{ag1}_{ag}} ∈ [0, 1] is the impact that the information producer ag incurs when an information consumer ag1 makes use of the information y_{ag|ag1} inferred from a message m disclosed with x^{ag1}_{ag}, according to the r.v. B(y_{ag|ag1}) (F_B(·; y_{ag|ag1}), f_B(·; y_{ag|ag1})).


Page 12: Cerutti-AT2013-Trust and Risk

The Formal Definitions (iv)

Proposition
Given a FRA 〈A, C, M, ag, m, Tg〉 and an agent agY ∈ A that has received a message d(m, x), with x = x^{agY}_{ag}, let y be the information inferred by agY according to the r.v. I(x) (with probability ≈ f_I(y; x) dy). Then, assuming that the impact z is independent of the degree of disclosure x given the inferred information y, ag expects a level of risk z described by the r.v. R(x) with density:

f_R(z; x) = ∫_0^1 f_B(z; y) f_I(y; x) dy

Definition
Given a FRA 〈A, C, M, ag, m, Tg〉, let agX ∈ Tg. For every agY ∈ A, the net benefit for the producer of sharing information with agY is described by C = P − R, with an average, or expected, benefit E{C(x^{agY}_{ag})} = E{P(x^{agY}_{ag})} − E{R(x^{agY}_{ag})}.
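A numerical sketch of the proposition, assuming (purely for illustration) Beta densities for both the inference variable I(x) and the impact variable B(y); neither the distributional families, nor the parameter values, nor the supplied E{P} come from the paper. It evaluates f_R(z; x) by numerical integration and then the expected net benefit E{C} = E{P} − E{R}.

```python
import numpy as np
from scipy.stats import beta

# Assumed densities (illustrative only): both I(x) and B(y) are modelled as
# Beta distributions whose mass shifts upwards as the conditioning value grows.
def f_I(y, x):
    return beta.pdf(y, 1 + 4 * x, 1 + 4 * (1 - x))      # inference density f_I(y; x)

def f_B(z, y):
    return beta.pdf(z, 1 + 4 * y, 1 + 4 * (1 - y))      # impact density f_B(z; y)

def f_R(z, x, n=400):
    """Risk density f_R(z; x) = ∫_0^1 f_B(z; y) f_I(y; x) dy (trapezoidal rule)."""
    y = np.linspace(0.0, 1.0, n)
    return np.trapz(f_B(z, y) * f_I(y, x), y)

def expected_risk(x, n=400):
    """E{R(x)} = ∫_0^1 z f_R(z; x) dz."""
    z = np.linspace(0.0, 1.0, n)
    return np.trapz(z * np.array([f_R(zi, x) for zi in z]), z)

def expected_net_benefit(expected_p, x):
    """E{C(x)} = E{P(x)} - E{R(x)}; E{P(x)} is supplied directly here."""
    return expected_p - expected_risk(x)

print(expected_net_benefit(expected_p=0.6, x=0.5))       # positive ⇒ sharing at x pays off
```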


Page 13: Cerutti-AT2013-Trust and Risk

A Probabilistic Approach: the Big Picture

[Figure repeated from page 4: producer p shares a message x with consumer c, who infers y with Pr(infer y | x) ≈ f_I(y; x) dy (inference); acting on y yields an impact z on p with Pr(impact z | y) ≈ f_B(z; y) dz (behavioral trust); overall, Pr(impact z | x) ≈ f_R(z; x) dz.]


Page 14: Cerutti-AT2013-Trust and Risk

Our Scenario Revisited

[Figure: a two-stage tree for the scenario. At the inference stage the consumer infers y = 0 with probability q and y = 1 with probability 1 − q; at the impact stage, given y, the impact to the provider is 10K with probability w(y) and 100K with probability 1 − w(y). The benefit of sharing is P̄(x) ≈ 25K.]

Average impact:

E{h} = q{10 w(0) + 100[1 − w(0)]} + (1 − q){10 w(1) + 100[1 − w(1)]}
     = 100 − 90{q[w(0) − w(1)] + w(1)}

Expected net benefit:

C̄(x) = P̄(x) − 100 + 90{q[w(0) − w(1)] + w(1)}

C̄ ≥ 0 ⇒ (100 − P̄(x))/90 ≤ q w(0) + (1 − q) w(1) ≤ 1
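A short sketch that recomputes the quantities on this slide directly from its formulas; the parameter values in the usage lines are the ones given on the following two slides for James and Alec (impacts in thousands, benefit P̄(x) = 25).

```python
def expected_impact(q, w0, w1):
    """Average impact E{h} = 100 - 90 * (q * (w(0) - w(1)) + w(1)), in K."""
    return 100 - 90 * (q * (w0 - w1) + w1)

def can_share(benefit, q, w0, w1):
    """Sharing condition: (100 - P̄(x)) / 90 <= q * w(0) + (1 - q) * w(1)."""
    return (100 - benefit) / 90 <= q * w0 + (1 - q) * w1

# James (next slide): q = 0.1, w(0) = w(1) = 0.9, P̄(x) = 25
print(can_share(25, 0.1, 0.9, 0.9))      # True: 75/90 ≈ 0.83 <= 0.9

# Alec (the slide after): q = 0.6, w(0) = 0.6, w(1) = 0.4, P̄(x) = 25
print(expected_impact(0.6, 0.6, 0.4))    # ≈ 53.2, as on the Alec slide
print(can_share(25, 0.6, 0.6, 0.4))      # False: 75/90 > 0.52
```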


Page 15: Cerutti-AT2013-Trust and Risk

Our Scenario Revisited: James

[Figure: the same tree with q = 0.1, w(0) = 0.9, w(1) = 0.9, impacts 10K and 100K, and benefit P̄(x) ≈ 25K.]

Average impact: 10K

Net benefit: 75/90 ≤ 0.9 ≤ 1

Conclusion: BI can “safely” share with James the information that France is going to be invaded


Page 16: Cerutti-AT2013-Trust and Risk

Our Scenario Revisited: Alec

[Figure: the same tree with q = 0.6, w(0) = 0.6, w(1) = 0.4, impacts 10K and 100K, and benefit P̄(x) ≈ 25K.]

Average impact: 53.2K

Net benefit: 75/90 ≰ 0.52 ≤ 1

Conclusion: BI cannot “safely” share with Alec the information that France is going to be invaded


Page 17: Cerutti-AT2013-Trust and Risk

Conclusions

A framework enabling an agent to determine how much information it should disclose to others in order to maximise its utility
It allows distinguishing between “desired” (e.g. James) and “undesired” consumers (e.g. Alec)
It helps in handling the risk of information being propagated across a network of agents
Potential applications in strategic contexts where pieces of information are shared across several partners which may have hidden agendas
Future work:

Integration with quantitative trust models
Studying statistical properties of the r.v. R(x)
Developing statistical operators for representing the propagation of information across a (partially known) network of agents


Page 18: Cerutti-AT2013-Trust and Risk

In loving memory of Chatschik Bisdikian Ph.D.

Born December 21st 1960 — Died April 24th 2013
Researcher at IBM, IEEE Fellow, inductee of the Academy of Distinguished Engineers and the Hall of Fame of the School of Engineering of the University of Connecticut, lifelong member of the Eta Kappa Nu and Phi Kappa Phi Honor Societies.


Page 19: Cerutti-AT2013-Trust and Risk

Acknowledgement

Research was sponsored by the US Army Research Laboratory and the UK Ministry of Defence and was accomplished under Agreement Number W911NF-06-3-0001. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the US Army Research Laboratory, the US Government, the UK Ministry of Defence, or the UK Government. The US and UK Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.
