
Page 1

Robust Declassification

Steve Zdancewic, Andrew Myers

Cornell University

Page 2

Information Flow Security

Information flow policies are a natural way to specify precise, system-wide, multi-level security requirements.

Enforcement mechanisms are often too restrictive – they prevent desired policies from being implemented.

Information flow controls provide declassification mechanisms to accommodate intentional leaks.

But declassification makes it hard to understand end-to-end system behavior.

Page 3

Declassification

Declassification (downgrading) is the intentional release of confidential information.

Policy governs use of declassification operation.

Page 4

Password Example

[Figure: a PasswordChecker takes a public query and a confidential password, and produces a public result.]

This system includes declassification.

Page 5

Attack: Copy data into password

[Figure: the attacker copies high-security data into the password; the checker's public result then leaks that data.]

The attacker can launder data through the password checker.

Page 6

Robust Declassification

Goal: Formalize the intuition that an attacker should not be able to abuse the downgrade mechanisms provided by the system to cause more information to be declassified than intended.

Page 7

How to Proceed?

• Characterize what information is declassified.

• Make a distinction between “intentional” and “unintentional” information flow.

• Explore some of the consequences of robust declassification.

Page 8

A Simple System Model

A system S is a pair (Σ, ↦):
• Σ is a set of states: σ1, σ2, …
• ↦ ⊆ Σ × Σ is a transition relation.
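To make this concrete, here is a minimal Python sketch of the model, assuming finite state sets; the name System and this encoding are illustrative assumptions, not from the paper.

from typing import NamedTuple

class System(NamedTuple):
    states: frozenset  # the state set Σ
    steps: frozenset   # the transition relation ↦, a set of (state, state) pairs

# Example: two states over Σ = String × Integer and a single transition.
S = System(states=frozenset({("x", 1), ("y", 1)}),
           steps=frozenset({(("x", 1), ("y", 1))}))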

Page 9

Views of a System

A view of (Σ, ↦) is an equivalence relation ≈ on Σ.

Example: Σ = String × Integer, "integer component is visible":

(x, i) ≈_I (y, j) iff i = j

("attack at dawn", 3) ≈_I ("retreat", 3)
("attack at dawn", 3) ≉_I ("retreat", 4)

Page 10

Example Views

Example: Σ = String × Integer

"string component is visible": (x, i) ≈_S (y, j) iff x = y
"integer is even or odd": (x, i) ≈_E (y, j) iff i % 2 = j % 2
"complete view" (the identity relation): (x, i) ≈ (y, j) iff (x, i) = (y, j)

Page 11

Passive Observers

A view induces an observation of a trace:

τ = ("x",1)("y",1)("z",2)("z",3) through view ≈_I: 1 1 2 3

τ′ = ("a",1)("b",2)("z",2)("c",3) through view ≈_I: 1 2 2 3
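Under the labeling encoding of views sketched earlier, observing a trace is just mapping the view over its states; this hypothetical observe helper reproduces the two observations above.

def observe(trace, view):
    # The observation of a trace through a view: one label per state.
    return [view(s) for s in trace]

tau1 = [("x", 1), ("y", 1), ("z", 2), ("z", 3)]
tau2 = [("a", 1), ("b", 2), ("z", 2), ("c", 3)]
print(observe(tau1, lambda s: s[1]))  # [1, 1, 2, 3]
print(observe(tau2, lambda s: s[1]))  # [1, 2, 2, 3]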

Page 12

Observational Equivalence

The induced observational equivalence is S[≈]:

σ S[≈] σ′ if the traces from σ look the same as the traces from σ′ through the view ≈.
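For small finite systems, S[≈] can be approximated by brute force: enumerate traces from each state up to a bound, observe them through the view, collapse stuttering, and relate two states when their observation sets agree. This is only a sketch under those assumptions (the paper's definition quantifies over all, possibly infinite, traces up to stutter equivalence); induced_equiv and its helpers are hypothetical names.

def destutter(obs):
    # Collapse adjacent repeats: [1, 1, 2, 3] -> (1, 2, 3).
    out = []
    for x in obs:
        if not out or out[-1] != x:
            out.append(x)
    return tuple(out)

def observations(steps, start, view, depth):
    # Destuttered observations of all traces from `start`, up to `depth` steps.
    results, frontier = set(), [[start]]
    for _ in range(depth):
        nxt = []
        for trace in frontier:
            succs = [t for (s, t) in steps if s == trace[-1]]
            if not succs:  # maximal trace: record its observation
                results.add(destutter([view(s) for s in trace]))
            nxt.extend(trace + [t] for t in succs)
        frontier = nxt
    for trace in frontier:  # traces truncated at the depth bound
        results.add(destutter([view(s) for s in trace]))
    return results

def induced_equiv(states, steps, view, depth=6):
    # Pairs (a, b) related by (this approximation of) S[≈].
    obs = {s: observations(steps, s, view, depth) for s in states}
    return {(a, b) for a in states for b in states if obs[a] == obs[b]}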

Page 13

Simple Example

Page 14

Simple Example

[Figure: the example system viewed through ≈ and through the induced equivalence S[≈].]

Page 15

Why Did We Do This?

≈ is a view of Σ indicating what an observer sees directly.

S[≈] is a view of Σ indicating what an observer learns by watching S evolve.

Need some way to compare them…

Page 16

An Information Lattice

The views of a system S form a lattice.

Order: ≈_A ⊑ ≈_B

[Figure: Hasse diagram of the example views ≈_I, ≈_E, and ≈_S.]

Intuition: Higher = "more information"

Page 17

Lattice Order

⊤ is the identity relation; ⊥ is the total relation.

⊤ = ≈_I ⊔ ≈_S

Join: ≈_A ⊔ ≈_B
Meet: ≈_A ⊓ ≈_B

[Figure: the example views ≈_I, ≈_E, and ≈_S in the lattice.]
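In code, with each view represented extensionally as a set of state pairs over a finite Σ, "higher" means finer, i.e. fewer pairs. The join therefore pools information by intersecting the pair sets, while the meet keeps only common information: the transitive closure of the union (the union of two equivalence relations need not be transitive). A sketch under that assumed encoding:

def join(r1, r2):
    # ≈_A ⊔ ≈_B: related only if related under both views (e.g. ≈_I ⊔ ≈_S = ⊤).
    return r1 & r2

def meet(r1, r2):
    # ≈_A ⊓ ≈_B: related if related under either view, closed under transitivity.
    rel = set(r1 | r2)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(rel):
            for (c, d) in list(rel):
                if b == c and (a, d) not in rel:
                    rel.add((a, d))
                    changed = True
    return rel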

Page 18

Information Learned via Observation

Lemma: ≈ ⊑ S[≈]

Intuition: An observer only learns information by watching the system. (Can't lose information.)

Page 19

Natural Security Condition

Definition: A system S is ≈-secure whenever ≈ = S[≈].

Closely related to standard definitions of non-interference-style properties.
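Continuing the brute-force sketch from the S[≈] snippet, the security condition becomes an equality check between the view, taken extensionally, and the induced equivalence; relation_of and is_secure are hypothetical helpers.

def relation_of(states, view):
    # A labeling view as a set of pairs: (a, b) with equal labels.
    return {(a, b) for a in states for b in states if view(a) == view(b)}

def is_secure(states, steps, view, depth=6):
    # ≈-security: watching the system teaches the observer nothing, ≈ = S[≈].
    return relation_of(states, view) == induced_equiv(states, steps, view, depth)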

Page 20

Declassification

Declassification is intentional leakage of information.

Implies that ≈ ≠ S[≈].

We want to characterize unintentional declassification.

Page 21

A Simple Attack Model

An A-attack is a system A = (Σ, ↦_A) such that ≈_A = A[≈_A].

≈_A is the attacker's view.

↦_A is a set of additional transitions introduced by the attacker.

≈_A = A[≈_A] means the attack is a "fair environment": the attacker's own transitions do not exploit information the attacker can't see.
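The fair-environment condition can be checked with the same machinery: the attack, run as a system by itself, must teach the attacker nothing. A sketch reusing the hypothetical relation_of and induced_equiv helpers from the earlier snippets:

def is_fair_attack(states, attack_steps, attacker_view, depth=6):
    # ≈_A = A[≈_A]: watching only the attack transitions reveals nothing new.
    return (relation_of(states, attacker_view)
            == induced_equiv(states, attack_steps, attacker_view, depth))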

Page 22

Attacked Systems

Given a system S = (Σ, ↦) and an attack A = (Σ, ↦_A), the attacked system is:

S∨A = (Σ, ↦ ∪ ↦_A)
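In the encoding used in the earlier sketches this is a one-liner: keep the state space and take the union of the two transition relations.

def attacked(steps, attack_steps):
    # S∨A: same states, combined transition relation ↦ ∪ ↦_A.
    return steps | attack_steps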

Page 23

More Intuition

S[≈_A] describes the information intentionally declassified by the system – a specification for how S ought to behave.

(S∨A)[≈_A] describes the information obtained by an attack A.

Page 24

Example Attack

[Figure: a system S and the attacker's view ≈_A.]

Page 25

Example Attack

[Figure: the same system with the attack transitions ↦_A added.]

Attack transitions affect the system.

Page 26

Example Attack

[Figure: S[≈_A] compared with (S∨A)[≈_A].]

The attacked system may reveal more.

Page 27

Robust Declassification

A system S is robust with respect to attack A whenever

(S∨A)[≈_A] ⊑ S[≈_A].

[Figure: in the lattice, (S∨A)[≈_A] lies below S[≈_A], and both lie above ≈_A.]

Intuition: Attack reveals no additional information.
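As a pair-set check: finer relations have fewer pairs, so R ⊑ R′ holds exactly when R′ ⊆ R. Robustness then says every pair related by S[≈_A] stays related in (S∨A)[≈_A]. A sketch reusing the hypothetical helpers from the earlier snippets:

def is_robust(states, steps, attack_steps, attacker_view, depth=6):
    # (S∨A)[≈_A] ⊑ S[≈_A]: the attack reveals no additional information.
    spec = induced_equiv(states, steps, attacker_view, depth)
    under_attack = induced_equiv(states, attacked(steps, attack_steps),
                                 attacker_view, depth)
    return spec <= under_attack  # R ⊑ R′ iff R′ ⊆ R on pair sets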

Page 28

Secure Systems are Robust

Theorem: If S is ≈_A-secure then S is robust with respect to all A-attacks.

Intuition: S doesn't leak any information to the ≈_A-observer, so there are no declassifications to exploit.

Page 29

Characterizing Attacks

Given a system S and a view ≈_A, for what A-attacks is S robust?

Need to rule out attack transitions like this (an integrity property):

[Figure: an attack transition connecting states that the policy view S[≈] distinguishes.]

Page 30

Integrity of Policy

[Figure: attack transitions crossing the boundaries of S[≈].]

The attack circumvents or alters the policy decision made by the system.

Page 31

Providing Robustness

• Identify a class of relevant attacks
  – Example: attacker may modify / copy files
  – Example: untrusted host may send bogus requests
• Try to verify that the system is robust vs. that class of attacks.
  – Proof fails… system is insecure.
• Possible to provide enforcement mechanisms against certain classes of attacks
  – Protecting integrity of downgrade policy

Page 32

Future Work

This is a simple model – extend the definition of robustness to better models.

How does robustness fit in with intransitive non-interference?

We're putting the idea of robustness into practice in a distributed version of Jif.

Page 33

Conclusions

It’s critical to prevent downgrading mechanisms in a system from being abused.

Robustness is an attempt to capture this idea in a formal way.

Suggests that integrity and confidentiality are linked in systems with declassification.

Page 34

Correction to Paper

A bug-fixed version of the paper is available at http://www.cs.cornell.edu/zdance

Page 35

A Non-Attack

[Figure: purported attack transitions that depend on parts of the state the attacker cannot observe.]

This is not a fair attack: it uses information the attacker can't see!

Page 36

Observational Equivalence

σ S[≈] σ′ ?

Are σ and σ′ observationally equivalent with respect to ≈?

Page 37

Observational Equivalence

Take the sets of traces starting at σ and σ′: Trc_σ(S) and Trc_σ′(S).

[Figure: the trace sets rooted at σ and σ′.]

Page 38

Observational Equivalence

Look at the traces modulo ≈.

[Figure: the traces from σ and σ′ with each state replaced by its ≈-equivalence class.]

Page 39

Observational Equivalence

Look at the traces modulo ≈.

[Figure: the observations, with each trace collapsed to its sequence of ≈-classes.]

Page 40

Observational Equivalence

This yields the observation sets Obs_σ(S, ≈) and Obs_σ′(S, ≈).

[Figure: the observation sets for σ and σ′.]

Page 41

Observational Equivalence

Obs_σ(S, ≈) and Obs_σ′(S, ≈): are they stutter equivalent?

Yes! So σ S[≈] σ′.