
Ebrahim Tarameshloo, Philip W. L. Fong, Payman Mohassel
University of Calgary, Calgary, Alberta, Canada
{etarames, pwlfong, pmohasse}@ucalgary.ca

On Protection in Federated Social Computing Systems
March 2014

Federated Social Computing Systems
Example: her access policy is (share with my friends)@Foursquare vs. (share with public)@Twitter

Privacy challenges: the access control policy of the originating SCS may not be honored by the destination SCS.


A user may belong to multiple social computing systems (SCSs): Facebook, Foursquare, Twitter, Banjo, Yelp, etc.

Such systems facilitate sharing of content across the system boundary, e.g., by connecting an account to another SCS [1].

Outline
- Privacy in Federated Social Computing Systems
- Formal model
- Privacy via Private Function Evaluation (PFE)
- Privacy via safe function evaluation

Closer Look at Protection Challenges
- Policy fidelity: ambiguity as to which policy should be used for protecting shared contents
- Mechanism fidelity: the challenge of the destination site tracking the protection model of the origin site
- State fidelity: user information may not be available for policy enforcement at the destination SCS

Policy fidelity: the access control policy of the shared content prior to migration is not communicated to the destination SCS. Mechanism fidelity: even if the origin policy (e.g., "friends" in Facebook) is known to the destination site (e.g., Foursquare), the latter may not be able to enforce it; the destination site has to emulate the authorization mechanism of the origin site, and the protection model of an SCS is a moving target. State fidelity: in SCSs, user-contributed information is used for authorization; examples are friendship in Facebook and check-ins at Foursquare.

Assumptions
User identity: the manual identity mapping process is consistent and applied whenever needed.

Authorization service: secure, queriable PDPs (Policy Decision Points) for each SCS of the confederation.

User identity: manual identity mapping is performed when resources are shared (by pulling or pushing the data); we need this too at the time of access to a shared resource. Authorization service: the members of the confederation open up their PDPs (or parts of them) as services that other member SCSs can query. Examples: are these two users friends? Are these two users colocated?

Feature Overview of Our Protection Model
Protection of Shared Resources

Native access: (Not the focus of this work)

Shared access: (The goal of our work)

Our proposed protection model for federated SCSs offers four key features.

Native access: suppose a Foursquare member requests access to the location information of another member. Shared access: suppose the requester instead attempts to access the resource from within Facebook; we call this a shared access. The goal of our work is to articulate a relational protection model for shared accesses.


Feature Overview of Our Protection Model
Shared Access Policies
- Policies for controlling shared accesses are defined by the resource owner
- Addresses policy fidelity

The owner of the shared resource needs to specify a policy for controlling shared accesses to her resource.

Wherever a shared access is requested, the SCS to which the resource is shared will honor this policy.

Feature Overview of Our Protection Model
Distributed Evaluation of Situated Queries

Shared access policy in the form of situated queries

Example: friend@Facebook, co-located@Foursquare

Distributed evaluation ensures Mechanism and State Fidelity

Example: what does this mean? It means that a user may demand that her resource be accessible only to her friends on Facebook, no matter where that resource has been moved.

In the context of this example, suppose a user requests a shared access on Google+. Google+ will not attempt to emulate friendship testing; instead, it will query the authorization service of Facebook to check whether the requester and the owner are friends.
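To make that flow concrete, here is a minimal Python sketch of how a destination SCS might dispatch a situated query to the PDP of the SCS where the query is situated. The confederation registry, the `PDP` class, and its `query` method are hypothetical illustrations, not an API from this work.

```python
# Sketch of distributed evaluation of a situated query, assuming each
# member SCS exposes (part of) its PDP as a queriable service.
# All names below are hypothetical.

class PDP:
    """Toy policy decision point for one SCS."""
    def __init__(self, relations):
        # relations: query name -> set of (owner, requester) pairs
        self.relations = relations

    def query(self, name, owner, requester):
        return (owner, requester) in self.relations.get(name, set())

# Hypothetical confederation: Facebook's PDP tracks friendship.
confederation = {
    "Facebook": PDP({"friend": {("alice", "bob")}}),
}

def evaluate_situated_query(query, owner, requester):
    """Evaluate e.g. 'friend@Facebook' by asking the PDP of the SCS
    where the query is situated; the current SCS never emulates it."""
    name, scs = query.split("@")
    return confederation[scs].query(name, owner, requester)

# Google+ authorizes a shared access by querying Facebook's PDP:
print(evaluate_situated_query("friend@Facebook", "alice", "bob"))  # True
```

The point of the sketch is that friendship testing stays inside Facebook's PDP, which is what preserves mechanism and state fidelity.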

Feature Overview of Our Protection Model
Policy Composition

More flexible protection model

Made up of boolean combinations of situated queries

Example: (friend@Facebook ∨ follower@Twitter) ∧ nearby@Foursquare

To make the protection model more flexible, users may formulate composite policies made up of boolean combinations of situated queries.

Outline
- Privacy in Federated Social Computing Systems
- Formal model
- Privacy via Private Function Evaluation (PFE)
- Privacy via safe function evaluation

Formal Model of Federated SCSs
- Confederation schema: specifies the constant entities in the federation
- Privacy configuration: specifies the current privacy settings of the confederation
- Protection state: tracks the current protection state of member SCSs and the whereabouts of shared resources

Our formal model of federated SCSs is composed of three layers: schema, configuration, and state. Confederation schema: specifies the basic entities that exist in the confederation; I is an interpretation function that defines the semantics of atomic queries. The components of the schema remain constant unless the membership of the confederation changes.

Privacy configuration: specifies the active users and resources, the origin and owner of each resource, and the policies of user resources.

Policy Language
Distinctive features:
- Atomic queries can be interpreted at a specific SCS
- Composite policies are built by composition of atomic queries
Syntax

Semantics: the resource owner and the requester must satisfy the policy formula in a given protection state.

Features: a user can specify policies as atomic queries to be interpreted at specific SCSs; such atomic queries can then be composed into composite policies via boolean connectives.

Syntax: for simplicity, we used propositional logic. We could have adopted other composition frameworks, e.g., Belnap logic, but such an extension is trivial.

Absolute vs. relative location tags
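As a sketch of how such a propositional policy language could be evaluated, the following Python snippet models situated queries as atoms and composes them with boolean connectives. The atoms and the truth assignment are illustrative, not taken from the paper; in a real confederation the assignment would come from the PDPs of the relevant SCSs.

```python
# Minimal evaluator for policies built from situated queries (atoms)
# and boolean connectives. Hypothetical example data.

AND, OR, NOT = "and", "or", "not"

def evaluate(policy, state):
    """Evaluate a policy tree against a protection state, given here
    as a map from situated queries to PDP answers."""
    if isinstance(policy, str):          # atom, e.g. "friend@Facebook"
        return state[policy]
    op, *args = policy
    if op == AND:
        return all(evaluate(a, state) for a in args)
    if op == OR:
        return any(evaluate(a, state) for a in args)
    if op == NOT:
        return not evaluate(args[0], state)
    raise ValueError(op)

# (friend@Facebook or follower@Twitter) and nearby@Foursquare
policy = (AND, (OR, "friend@Facebook", "follower@Twitter"),
               "nearby@Foursquare")
state = {"friend@Facebook": False,
         "follower@Twitter": True,
         "nearby@Foursquare": True}
print(evaluate(policy, state))  # True
```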

The semantics of the policy language is defined with respect to a schema, a configuration, and a state.

Outline
- Privacy in Federated Social Computing Systems
- Formal model
- Privacy via Private Function Evaluation (PFE)
- Privacy via safe function evaluation

Let's look at the privacy concerns in distributed evaluation of shared policies.

Privacy via Secure Multiparty Computation
- Distributed evaluation of shared access policies
- Privacy effect: disclosure of SCSs' protection states
- Example: friend@Facebook ∧ nearby@Foursquare; evaluation may disclose user location claims in Foursquare to Facebook
- Privacy goal: preserving the privacy of SCSs' protection states during the evaluation of shared access policies
- Possible approach: Secure Multiparty Computation (SMC)

Each SCS opens up part of its PDP as a queriable service.

Privacy effect: the protection state, and thus the users' contributed information, is disclosed.

So one possible approach to prevent this disclosure is to use SMC.

SMC and Output Privacy
SMC allows a group of parties to collectively compute a function of their inputs while keeping those inputs private.

SMC does not guarantee output privacy. For example, SMC does not try to determine which functions are safe to compute.

OK, now we can keep the contributed inputs private.

But what if we compute the conjunction of three inputs?

We may infer some information about the inputs from the output.
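This inference step can be demonstrated in a few lines of Python: even if SMC hides the inputs during the computation, a conjunction's output alone can pin them down (example data is illustrative).

```python
# Why SMC alone is not enough: the *output* of a conjunction can
# reveal the private inputs.
from itertools import product

def conj3(x1, x2, x3):
    return x1 and x2 and x3

# Enumerate which input tuples are consistent with output "grant".
consistent_with_true = [xs for xs in product([0, 1], repeat=3)
                        if conj3(*xs)]
print(consistent_with_true)  # [(1, 1, 1)] -- a grant reveals all inputs
```

In other words, observing a grant from a three-way conjunction tells every party that all three inputs were 1.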

SMC and Output Privacy
Privacy challenge in our scheme. Example: evaluation of the policy at Instagram may leak users' location and friendship information.

Possible approaches:
- Hide policy formulas from federated SCSs
- Evaluate only safe public policy formulas

Now consider the same example in our setting.

We consider two possible approaches to deal with output privacy.

Approach 1: PFE-based Architectures
- Hide the policy formula from the SCSs involved
- Advantage: no restriction on what the formula can be
- Core challenge: hiding the policy while running the SMC protocol

Private Function Evaluation (PFE)

Three PFE-based architectures:
- Origin arch. (origin tracks policy)
- User arch. (user tracks policy)
- TP arch. (third party tracks all policies)

Core challenge: how can we hide the formula and still run an SMC protocol?

Variants of SMC

We proposed three architectures based on PFE protocols. They differ in who is responsible for taking care of shared policies. We have also done a comparative assessment based on three factors: privacy, knowledge of query vocabulary, and fault tolerance.

Origin Arch. (Origin SCS Tracks Policy)
[Diagram: the current SCS asks the origin SCS to initiate PFE, which produces the authorization decision]

Each SCS tracks the shared access policies of its own resources.

Privacy: the authorization decision should be hidden from the origin SCS if it contributes an input to the policy formula. Knowledge of query vocabulary: every SCS must understand the full query vocabulary of all other SCSs in the confederation.

User Arch. (User Tracks Policy)
[Diagram: the current SCS asks to initiate PFE; the origin SCS and the current SCS produce the authorization decision]

Each user stores shared access policies on user-owned storage.

Privacy: there should not be any collusion between the storage service and any SCS (example: Google+ and Google Drive). Fault tolerance: failure of a user's storage will affect only the shared resources of that user.

TP Arch. (Third Party Tracks Policy)
[Diagram: the current SCS asks the TP to initiate PFE; the origin SCS and the current SCS produce the authorization decision]

Centralized policy storage service by a trusted third party (TP)

The TP will be contacted when a shared access is to be authorized.

Privacy assumptions:
- The TP will not collude with any SCS
- Policies are hidden from all SCSs except the TP

Knowledge of query vocabulary: only the TP must understand the full query vocabulary.

Challenge of Policy Administration
Every user must define a shared access policy for every resource.

Tedious for users

Default policies for various categories of resources

It is unlikely that a user will bother to go through such a tedious exercise.

The provision of relative location tags has exactly this application in mind.

Assessment of the Three Architectures
Privacy:
- Origin arch.: the authorization decision should be hidden from the origin SCS if it contributes an input to the policy formula
- User arch.: there should not be any collusion between the storage service and any SCS (example: Google+ and Google Drive)
- TP arch.: the TP should remain trusted
Knowledge of query vocabulary:
- Origin arch.: every SCS must understand the full query vocabulary of all other SCSs in the confederation
- User arch.: same as origin arch.
- TP arch.: only the TP must understand the full query vocabulary
Fault tolerance:
- Origin arch.: failure of one SCS affects policy lookup for all resources originating from that SCS
- User arch.: failure of a user's storage affects only the shared resources of that user
- TP arch.: single point of failure; will affect the entire confederation
Policy administration:
- Every user must define a shared access policy for every resource, which is tedious for users
- Default policies for various categories of resources. Example:

We may turn this into a table.

Outline
- Privacy in Federated Social Computing Systems
- Formal model
- Privacy via Private Function Evaluation (PFE)
- Privacy via safe function evaluation

Approach 2: Privacy via Safe Functions
- All shared access policies are allowed to be public (example: default policies)
- The confederation evaluates only safe policies
- Privacy goal: no inference of inputs from output values

An SCS can refrain from providing input if a policy is detected to be unsafe. Our safe-function definition is based on Sutherland's definition of information flow via the notion of deducibility.

In the previous approach, the policy formulas were hidden from the SCSs; here they are public.

refrain= hold back

We adapt the safe-function definition based on Sutherland's notion of deducibility.

Input Nondeducibility
[Truth table of a boolean function f over inputs x1, x2, ..., xi, ..., xn-1, xn]
Example: if the policy evaluated at Google+ is false, the requester is a family member. What if the policy is evaluated at LinkedIn?

To describe the general idea in our context, suppose we have this example. A user's shared-access policy is

If f is input nondeducible, then whatever the function's output is, we cannot infer anything about xi.

We provide a more complex version of IND.

A function is IND despite J, where J is the set of inputs contributed by the party that can see the authorization decision.

Application and Complexity of IND
SCSs test whether the policy function is I-th input nondeducible, where I is the set of inputs contributed by an SCS.
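The basic (single-input) notion can be checked by brute force for small arities: an output value must never pin down the value of input xi. The following Python sketch is an illustrative formalization of that idea, not the paper's exact definition; it checks that every attainable output is producible both with xi = 0 and with xi = 1.

```python
# Brute-force check of i-th input nondeducibility for a boolean
# function f of arity n (illustrative formalization).
from itertools import product

def is_input_nondeducible(f, n, i):
    """True iff no output value of f determines the i-th input:
    every attainable output must be producible with x_i = 0 and 1."""
    seen = {}  # output value -> set of x_i values that can yield it
    for xs in product([0, 1], repeat=n):
        seen.setdefault(f(*xs), set()).add(xs[i])
    return all(vals == {0, 1} for vals in seen.values())

# XOR's output reveals nothing about either input:
print(is_input_nondeducible(lambda a, b: a ^ b, 2, 0))   # True
# AND is deducible: output 1 forces both inputs to 1:
print(is_input_nondeducible(lambda a, b: a & b, 2, 0))   # False
```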

Deciding input nondeducibility

To implement the static analysis:
- The complement of IND lies in the second level of the polynomial hierarchy
- Encode an IND instance as a Quantified Boolean Formula (QBF)
- Use a QBF solver to test satisfiability

So what would be the application of IND in federated SCSs?

Both IND problems are in the second level of the polynomial hierarchy.

IND Functions
- Rarity of input nondeducible functions
- Limited composability

Useful IND functions:
- Threshold function: returns 1 if at least m of the n inputs are 1; a replacement for conjunction

- Conditional function: a replacement for disjunction
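A small Python sketch can illustrate why a threshold can stand in for conjunction: a 2-of-3 threshold grants only when most inputs hold, yet its decision never determines any single input, whereas a strict 3-of-3 conjunction does. The inline nondeducibility check is an illustrative formalization, and the 2-of-3 parameters are chosen for the example.

```python
# Threshold policy as a privacy-friendlier replacement for conjunction.
from itertools import product

def threshold(m, xs):
    """Grant (1) iff at least m of the inputs are 1."""
    return 1 if sum(xs) >= m else 0

def nondeducible(f, n, i):
    """Illustrative check: no output of f determines input x_i."""
    seen = {}
    for xs in product([0, 1], repeat=n):
        seen.setdefault(f(xs), set()).add(xs[i])
    return all(vals == {0, 1} for vals in seen.values())

# 2-of-3 threshold: every single input stays nondeducible.
print(all(nondeducible(lambda xs: threshold(2, xs), 3, i)
          for i in range(3)))                              # True
# Strict conjunction (3-of-3): a grant reveals every input.
print(nondeducible(lambda xs: threshold(3, xs), 3, 0))     # False
```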

Example: of the 16 boolean functions with 2 inputs, only XOR and its negation are i-th input nondeducible. The usual conjunction and disjunction functions do not guarantee input nondeducibility, and XOR is not useful for policy composition.

There are indeed boolean functions that are both useful and input nondeducible.

They can be seen as replacements for conjunction and disjunction.

Policy Idioms
It is unwise to leave it to the user to formulate safe policies.

Users can be provided with templates of safe policies

Safe policy templates:
- Threshold policy
- Conditional policy

Considering the rarity of functions that are input nondeducible, and their limited composability, it is unwise to leave it to users to formulate their own shared access policies.

Related Work
[1] Ko, Moo Nam, Gorrell P. Cheek, Mohamed Shehab, and Ravi Sandhu. "Social-networks connect services." Computer 43, no. 8 (2010): 37-43.
[2] Shehab, Mohamed, Moo Nam Ko, and Hakim Touati. "Enabling cross-site interactions in social networks." Social Network Analysis and Mining 3, no. 1 (2013): 93-106.
[3] Squicciarini, Anna Cinzia, Giuseppe Petracca, and Elisa Bertino. "Adaptive data protection in distributed systems." In Proceedings of the Third ACM Conference on Data and Application Security and Privacy, 2013.

And finally, the related work that inspired us.

Calgary

ICT Bldg. at the University of Calgary