Robust Declassification
Steve Zdancewic and Andrew Myers
Cornell University
CSFW14 2
Information Flow Security

Information flow policies are a natural way to specify precise, system-wide, multi-level security requirements.

Enforcement mechanisms are often too restrictive: they prevent desired policies.

Information flow controls therefore provide declassification mechanisms to accommodate intentional leaks.

But… this makes end-to-end system behavior hard to understand.
Declassification
Declassification (downgrading) is the intentional release of confidential information.

A policy governs the use of the declassification operation.
Password Example
[Diagram: a PasswordChecker component takes a public query and a confidential password, and produces a public result.]

This system includes declassification.
Attack: Copy data into password
[Diagram: the attacker copies high-security data into the password input; the public result now leaks that data.]

The attacker can launder data through the password checker.
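The laundering attack can be sketched in a few lines of Python (the names and the whole-string guess are illustrative, not from the talk):

```python
# Hypothetical sketch of laundering data through the password checker.
def check_password(query: str, password: str) -> bool:
    # The boolean result is the intentional declassification:
    # each query deliberately releases one bit about the password.
    return query == password

# The attack: copy high-security data into the password slot...
secret = "attack at dawn"
password = secret

# ...then use the public query/result channel to confirm guesses.
leaked = check_password("attack at dawn", password)
```

With repeated queries the attacker can search the guess space, so the one intended bit per query ends up revealing arbitrary high-security data.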
Robust Declassification
Goal: formalize the intuition that an attacker should not be able to abuse the downgrade mechanisms provided by the system to cause more information to be declassified than intended.
How to Proceed?
• Characterize what information is declassified.
• Make a distinction between “intentional” and “unintentional” information flow.
• Explore some of the consequences of robust declassification.
A Simple System Model
A system S is a pair (Σ, ↦), where Σ is a set of states σ1, σ2, … and ↦ ⊆ Σ × Σ is a transition relation.
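As a minimal sketch (the set-of-pairs encoding is an assumption of these examples, not prescribed by the talk), Σ and ↦ can be coded directly:

```python
# A system S = (Σ, ↦): a set of states and a transition relation,
# represented as a set of (source, target) pairs.
states = {1, 2, 3}
steps = {(1, 2), (2, 2), (2, 3)}

def successors(s):
    """The states reachable from s in one ↦-step."""
    return {b for (a, b) in steps if a == s}
```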
Views of a System
A view of (Σ, ↦) is an equivalence relation ≈ on Σ.

Example: Σ = String × Integer, with "integer component is visible":

(x, i) ≈_I (y, j)  iff  i = j

so ("attack at dawn", 3) ≈_I ("retreat", 3), but ("attack at dawn", 3) ≉_I ("retreat", 4).
Example Views
Example: Σ = String × Integer.

"string component is visible":  (x, i) ≈_S (y, j)  iff  x = y
"integer is even or odd":       (x, i) ≈_E (y, j)  iff  i % 2 = j % 2
"complete view":                (x, i) ≈ (y, j)   iff  (x, i) = (y, j)
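These example views translate directly into predicates on pairs of states (a sketch; encoding states as Python tuples is an assumption):

```python
# Each view is an equivalence relation on states (string, integer).
def view_I(a, b):      # "integer component is visible"
    return a[1] == b[1]

def view_S(a, b):      # "string component is visible"
    return a[0] == b[0]

def view_E(a, b):      # "integer is even or odd"
    return a[1] % 2 == b[1] % 2

def view_top(a, b):    # "complete view": the identity relation
    return a == b
```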
Passive Observers
A view ≈ induces an observation of a trace:

τ  = ("x", 1)("y", 1)("z", 2)("z", 3)  through view ≈_I  gives  1 1 2 3
τ' = ("a", 1)("b", 2)("z", 2)("c", 3)  through view ≈_I  gives  1 2 2 3
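The induced observation can be sketched by projecting each state to its ≈-class (represented here, as an assumption, by a representative function) and comparing modulo stuttering:

```python
def observe(trace, rep):
    """Project each state in a trace to its ≈-class representative."""
    return [rep(s) for s in trace]

def destutter(seq):
    """Collapse adjacent repeats; observations compare up to stutter."""
    out = []
    for c in seq:
        if not out or out[-1] != c:
            out.append(c)
    return out

rep_I = lambda s: s[1]   # the view ≈_I: integer component visible
t1 = [("x", 1), ("y", 1), ("z", 2), ("z", 3)]
t2 = [("a", 1), ("b", 2), ("z", 2), ("c", 3)]
```

`observe(t1, rep_I)` is `[1, 1, 2, 3]` and `observe(t2, rep_I)` is `[1, 2, 2, 3]`; after `destutter` both become `[1, 2, 3]`, so the two traces look the same to the ≈_I-observer.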
Observational Equivalence
The induced observational equivalence is S[≈]:

σ S[≈] σ' if the traces from σ look the same as the traces from σ' through the view ≈.
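A bounded approximation of S[≈] can be sketched by enumerating length-k traces and comparing their stutter-free observations (a finite approximation chosen for illustration, not the talk's definition, which ranges over all traces):

```python
def traces(steps, s, k):
    """All length-k traces from s; a state with no successor stutters."""
    if k == 0:
        return {(s,)}
    out = set()
    for t in traces(steps, s, k - 1):
        succ = [b for (a, b) in steps if a == t[-1]] or [t[-1]]
        for b in succ:
            out.add(t + (b,))
    return out

def destutter(seq):
    out = []
    for c in seq:
        if not out or out[-1] != c:
            out.append(c)
    return tuple(out)

def obs(steps, rep, s, k):
    """Stutter-free observations of the length-k traces from s."""
    return {destutter(tuple(rep(x) for x in t)) for t in traces(steps, s, k)}

def obs_equiv(steps, rep, s1, s2, k=4):
    """Bounded check of s1 S[≈] s2."""
    return obs(steps, rep, s1, k) == obs(steps, rep, s2, k)
```

For example, with the view "parity is visible" (`rep = lambda s: s % 2`) and a single transition 1 ↦ 2, the states 1 and 3 are ≈-equivalent but not S[≈]-equivalent: watching the system evolve distinguishes them.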
Simple Example

[Diagram: a small example system and the observational equivalence S[≈] its view ≈ induces.]
Why Did We Do This?
≈ is a view of Σ indicating what an observer sees directly.

S[≈] is a view of Σ indicating what an observer learns by watching S evolve.

We need some way to compare them…
An Information Lattice
The views of a system S form a lattice, with order A ⊑ B.

[Diagram: Hasse diagram of the example views ≈_I, ≈_E, ≈_S.]

Intuition: higher = "more information".
Lattice Order

⊤ is the identity relation; ⊥ is the total relation.

[Diagram: the example lattice, with ⊤ = ≈_I ⊔ ≈_S above ≈_I, ≈_E, and ≈_S.]

Join: A ⊔ B    Meet: A ⊓ B
Information Learned via Observation

Lemma: ≈ ⊑ S[≈]

Intuition: an observer only learns information by watching the system; it can't lose information.
Natural Security Condition
Definition: a system S is ≈-secure whenever ≈ = S[≈].

This is closely related to standard definitions of noninterference-style properties.
Declassification
Declassification is the intentional leakage of information.

It implies that ≈ ≠ S[≈].

We want to characterize unintentional declassification.
A Simple Attack Model
An ≈_A-attack is a system A = (Σ, ↦_A) such that ≈_A = A[≈_A].

≈_A is the attacker's view.

↦_A is the set of additional transitions introduced by the attacker.

The condition ≈_A = A[≈_A] means the attack is a "fair environment": the attacker learns nothing just by watching its own transitions run.
Attacked Systems
Given a system S = (Σ, ↦) and an attack A = (Σ, ↦_A), the attacked system is:

S∨A = (Σ, ↦ ∪ ↦_A)
More Intuition
S[≈] describes the information intentionally declassified by the system: a specification for how S ought to behave.

(S∨A)[≈_A] describes the information obtained by an attack A.
Example Attack

[Diagram: the attacker's transitions ↦_A are added to the system; attack transitions affect the system's behavior.]

[Diagram: the attacked system may reveal more: (S∨A)[≈_A] can hold more information than S[≈_A].]
Robust Declassification
A system S is robust with respect to attack A whenever

(S∨A)[≈_A] ⊑ S[≈_A].

Intuition: the attack reveals no additional information.
Secure Systems are Robust
Theorem: if S is ≈_A-secure, then S is robust with respect to all ≈_A-attacks.

Intuition: S doesn't leak any information to the ≈_A-observer, so there are no declassifications to exploit.
Characterizing Attacks
Given a system S and a view ≈_A, for which ≈_A-attacks is S robust?

We need to rule out attack transitions that cross the boundaries of S[≈]-equivalence classes (an integrity property).
Integrity of Policy
[Diagram: an attack transition crossing an S[≈]-class boundary.]

Such an attack circumvents or alters the policy decision made by the system.
Providing Robustness
• Identify a class of relevant attacks
  – Example: the attacker may modify or copy files
  – Example: an untrusted host may send bogus requests
• Try to verify that the system is robust against that class of attacks
• If the proof fails, the system is insecure
• It is possible to provide enforcement mechanisms against certain classes of attacks, e.g. protecting the integrity of the downgrade policy
Future Work
This is a simple model; we want to extend the definition of robustness to richer models.

How does robustness fit in with intransitive noninterference?

We're putting the idea of robustness into practice in a distributed version of Jif.
Conclusions
It's critical to prevent the downgrading mechanisms in a system from being abused.

Robustness is an attempt to capture this idea in a formal way.

It suggests that integrity and confidentiality are linked in systems with declassification.
Correction to Paper
A bug-fixed version of the paper is available at http://www.cs.cornell.edu/zdance
A Non-Attack
[Diagram: a purported attack whose transitions depend on state the attacker cannot distinguish.]

This uses information the attacker can't see, so the fairness condition ≈_A = A[≈_A] rules it out.
Observational Equivalence

Are σ and σ' observationally equivalent with respect to ≈, i.e. does σ S[≈] σ' hold?

Take the sets of traces starting at σ and σ': Trc_σ(S) and Trc_σ'(S).

Look at the traces modulo ≈; this yields the observation sets Obs_σ(S, ≈) and Obs_σ'(S, ≈).

Are the two sets stutter equivalent? Yes! So σ S[≈] σ'.