Privacy in Biometrics

Stelvio Cimato, Marco Gamassi, Vincenzo Piuri, Roberto Sassi, and Fabio Scotti
Biometric features are increasingly used for authentication and identification purposes in a broad variety of institutional and commercial systems. The large diffusion of e-government, e-banking, and e-commerce applications requires more stringent methodologies to identify customers or citizens, in order to prevent any malicious behavior that could lead to economic loss or fraud attempts for the involved parties. Biometric data are natural candidates for use in authentication systems that should guarantee a higher level of security. Such data are indeed unique for each person and strictly associated with their owner. They are irrevocable, in the sense that the association cannot be changed during a person's life, and in many cases they are hard to forge.
Many different authentication systems have been proposed, taking into account different biometric traits, some physiological, some behavioral, each proposal having different advantages and drawbacks. In some cases, practical settings have been devised, and different solutions are available in commercial applications or for border control. If on one side the interest in biometric techniques is steadily increasing because of their advantages (security, reliability, etc.), on the other side the potential threat to the privacy of users, coming from the abuse of biometric information, is an object of discussion and often prevents the adoption of biometric systems on a large scale. In fact, people are generally not willing to give out biometric traits with little assurance that they cannot be stolen or used without their express consent. For the same reasons, many people are increasingly worried about the adoption of biometric systems in practical situations.
Biometrics: Theory, Methods, and Applications. Edited by Boulgouris, Plataniotis, and Micheli-Tzanakou. Copyright 2010 the Institute of Electrical and Electronics Engineers, Inc.
Recently, much research work has been devoted to the construction of techniques for the protection of biometric templates. In this way, biometric authentication schemes can be devised that satisfy the increasing demand for privacy coming from users. Such techniques usually enable the generation of secure identifiers after a transformation of the input biometric traits, making it impossible to recover the original biometric features (thus preserving the privacy of the biometric traits). Several proposals have been formulated combining cryptography and biometrics in order to increase the confidence in the system when biometric templates are stored for verification.
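The core idea can be illustrated with a minimal sketch, which is not any specific scheme from the literature: a feature vector is quantized (to absorb small acquisition noise) and then passed through a salted one-way hash, so that the stored identifier does not reveal the original features. The quantization step and feature values below are arbitrary assumptions for illustration.

```python
import hashlib
import os

def protect(features: list[float], salt: bytes, step: float = 0.25) -> bytes:
    """Toy protected-template sketch: quantize the features, then hash with a salt.

    Quantization absorbs small acquisition noise; the hash is one-way, so the
    stored identifier cannot be inverted back to the biometric features.
    (Real schemes handle noise near quantization boundaries; this toy does not.)
    """
    quantized = bytes(int(f / step) & 0xFF for f in features)
    return hashlib.sha256(salt + quantized).digest()

salt = os.urandom(16)          # per-enrollment salt makes the identifier revocable
enrolled = protect([0.51, 1.02, 0.26], salt)

# A slightly different acquisition falling in the same quantization bins
# yields the same identifier, so verification can compare hashes directly.
probe = protect([0.52, 1.05, 0.27], salt)
assert probe == enrolled
```

Changing the salt produces a fresh, unlinkable identifier from the same trait, which is the basic mechanism behind revocable (cancelable) templates.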
This chapter reviews the privacy issues related to the use of biometrics, presents some of the most advanced techniques available to date, provides a comparative analysis, and gives an overview of future trends. The chapter is structured as follows. In the next section we present the most common biometric traits and features used in real-world applications, as well as the associated privacy risk level for individuals. In Section 25.3 we introduce efficient representations of biometric features that can be used to protect biometric templates and construct privacy-compliant authentication systems. In Section 25.4 we discuss privacy issues in multimodal biometric systems, where more than one biometric trait is used, and in Section 25.5 we present an innovative method for building multimodal privacy-aware verification systems.
25.2 BIOMETRIC TRAITS AND PRIVACY
In this section we discuss the privacy issues concerning the practical usage of biometric systems. To this purpose it is important to consider both the view of users and the real risks to which they could be exposed. Different perspectives on privacy can also be given with respect to the application context in which biometrics are exploited and the particular methodology used for the collection of biometric data. Finally, privacy risks can also be evaluated by considering the specific traits upon which the biometric systems are based.
25.2.1 User Perception and Real Risks
Users commonly perceive biometric authentication and identification techniques as a threat to their privacy rights. In particular, there are some aspects that reinforce this perception. The first is related to the fact that the acquisition of biometric traits is considered an exact and permanent filing of the users' activities and behaviors. For example, it is a very common belief that most biometric systems have 100% identification accuracy and that the biometric samples and templates are necessarily stored and/or sent over a network, exposing them to further risks of disclosure. Actually, the latter is a well-founded concern. In fact, while users should be assured that the biometric information collected will not be used for any activities other than the ones expressly declared, in some cases it is harder to guarantee this, especially if the biometric samples themselves are sent over a network. The second issue is related to the possibility of tracking down the user activities associated with the biometric acquisition, even in the far future. This produces in the user the
perception of being trackable in his or her movements, purchases, and lifestyle. Commonly, this issue is associated with a sort of Big Brother phobia, in which a superior entity is capable of observing and acquiring knowledge of each activity of the user.
In a non-negligible part of the population, the usage of a biometric system is also perceived as uncomfortable or dangerous. For example, the fingerprint sensor, when previously used by other people and not properly cleaned, can be considered unpleasant or disgusting; or face and iris acquisition systems might induce apprehension about having the eyes damaged by lasers and/or IR sources. Very interestingly, users often overlook other privacy-related problems arising when biometrics are involved.
The first point concerns the possible usage of biometric information for operating proscription lists. For example, a user can be classified, based on a previous behavior or activity, into a specific class, and then, as a consequence of this classification, some services and accesses can be denied. Important examples of this situation are the blacklists kept by call centers and service providers, specifically designed to identify and manage users considered offensive or non-collaborative. Other examples are the bad-credit lists maintained by many investment and mutual fund companies. Indeed, proscription lists can be employed also without the adoption of biometric systems (and actually they are), but the usage of biometric technologies can make the situation considerably worse.
The second point concerns the fact that many biometric features can be used to obtain personal information about the users, such as medical information on past illnesses or current (and future) clinical trends. For example, the retinal pattern acquired by a biometric system can provide valuable information on the presence of hypertension, diabetes, and other illnesses. Much more personal information can be extracted from DNA samples.
25.2.2 Applicative Contexts
The real risk of privacy invasiveness can be analyzed in more detail with respect to both the final application to which the biometric system is dedicated and the biometric trait that is involved. Table 25.1 gives a qualitative representation of the privacy risks versus 10 different application features, according to the International Biometric Group.
Covert biometric applications (such as surveillance systems operating without explicit authorization from the users) are considered the most privacy invasive. On the other hand, biometric systems for identification or verification that are optional are considered more privacy compliant. In this case, users can decide not to be checked by a biometric system and can adopt a different identification/verification system.
Privacy is considered to be exposed to a greater risk when the biometric systemperforms an identification instead of a simpler verification task. This is related to thefact that the identification process encompasses a one-to-many comparison, which,
Table 25.1. Applicative Aspects Concerning the Privacy (According to the IBG)

Lower Risk of Privacy Invasiveness    Greater Risk of Privacy Invasiveness
Overt                                 Covert
Optional                              Mandatory
Verification                          Identification
Fixed period                          Indefinite
Private sector                        Public sector
Individual, customer                  Employee, citizen
Enrollee                              Institution
Personal storage                      Database storage
Behavioral                            Physiological
Templates                             Images
in most cases, is not carried out in the same place as the acquisition (typically, the biometric data are sent through a network to a database for the comparison).
The duration of the retention of the biometric data also impacts the privacy risk. If retention expires after a fixed period of time, the privacy risk is reduced. Best-practice notions require that every project encompassing biometric data retention should always explicitly state its duration.
Different risks are present with respect to the sector of application: biometric setups in the public sector are considered more susceptible to privacy invasiveness than the same installations in the private sector.
The role of the individuals who use the biometric system also has a great impact on privacy. The following roles carry increasing privacy risk: individual, customer, employee, citizen. The most relevant privacy invasion is related to the association of the fundamental rights of the individual with a biometric identity test. The privacy risks are lower in applications where the individuals retain usage rights over their biometric data.
The storage method of the biometric data also affects the privacy risk. The worst case is when all the data are stored in a central database, out of the user's control. The best case is when the user personally holds the biometric data, for example, when the personal biometric information is stored only on a smart card belonging to the user.
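The architectural difference can be sketched as a match-on-card style design, in which the enrolled template never leaves the user's token and only the accept/reject decision is released. The class below is an illustrative sketch, not a real smart-card API; the cosine-similarity matcher and the threshold are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class SmartCard:
    """Illustrative match-on-card token: the template stays on the card."""

    _template: list[float]  # enrolled feature vector, private to the card

    def verify(self, probe: list[float], threshold: float = 0.9) -> bool:
        """Compare a probe against the on-card template.

        Only the boolean decision leaves the card; the template is never
        transmitted, so there is no central database to compromise.
        """
        dot = sum(a * b for a, b in zip(self._template, probe))
        norm = (sum(a * a for a in self._template) ** 0.5) * \
               (sum(b * b for b in probe) ** 0.5)
        return dot / norm >= threshold  # cosine similarity, an assumed matcher

card = SmartCard([0.2, 0.9, 0.4])   # enrollment happens once, on the card
decision = card.verify([0.2, 0.9, 0.4])  # reader learns only True/False
```

Contrast this with the central-database case, where every verification request ships biometric data over a network to a store outside the user's control.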
The distinction between behavioral and physiological traits is relevant with respect to the privacy risks. Physiological data (such as fingerprints or iris templates) can be used in a more invasive manner. This is related to the fact that physiological traits are the most stable in time and are characterized by very high verification/identification accuracies. On the other hand, behavioral traits tend to be less accurate and, most of the time, require the user's collaboration.
The storage format is also relevant: templates usually carry much less information than the original samples/images. While they are less powerful when used as directly identifiable data, they are also less privacy-invasive.
Table 25.2. Data Collection Approaches

Protective     Enterprise security, account-holder verification
Sympathetic    Application of the best-practice notions in common applications
Neutral        Personal PDA, home PC, access control
Invasive       Surveillance, some centralized national ID services
25.2.3 System Design and Data Collection
Another useful taxonomy concerns the different approaches to biometric data collection and storage. The IBG identifies four different classes concerning privacy protection (Table 25.2): protective, sympathetic, neutral, and invasive.
A privacy-protective system is designed to protect or limit access to personal information, providing a means for an individual to establish a trusted identity. In this case, the biometric system uses biometric data to protect personal information that might otherwise be copied, stolen, or misused.
A privacy-sympathetic system limits the access to and usage of personal data. A privacy-sympathetic approach encompasses the specific design of elements able to protect biometric data from unauthorized access and usage. Also, the storage and transmission of biometric data must be informed, if not driven, by privacy concerns.
In a privacy-neutral system, privacy aspects are not important, or the potential privacy impact is slight. Privacy-neutral systems are designed to be difficult to misuse with regard to privacy, but they do not have the capability to protect personal privacy.
A privacy-invasive system facilitates or enables the usage of personal data in a fashion contrary to privacy principles. In privacy-invasive systems, personal data are used for purposes broader than those originally intended. Systems that facilitate the linkage of personal data without an individual's consent, as well as those in which personal data are loosely protected, belong to this class.
25.2.4 Technology Evaluation
The different biometric technologies associated with each biometric trait can produce various levels of privacy risk. Table 25.3 shows the overall risk for the users' privacy associated with each specific trait. The privacy-related aspects are summarized by taking into account the four most significant technology features.
The first feature is associated with the capability of the technology to processsearches in databases of biometric records. The higher this capability, the higher theprivacy risk.
The second feature is associated with the ability of the technology to work effectively in an overt or covert fashion. For example, a face recognition system can be
Table 25.3. Privacy Risk Ranking with Respect to the Available Technologies^a

Trait        Verif./Id.  Behav./Phys.  Ov./Cov.  DB Comp.  Overall Risk
Face         High        Medium        High      High      High
Fingerprint  High        High          Low       High      High
Retina       High        High          Low       Low       Medium
Iris         High        High          Low       Low       Medium
Hand         Low         Medium        Low       Low       Low
Voice        Low         Low           Medium    Low       Low
Keystroke    Low         Low           Medium    Low       Low
Signature    Low         Low           Low       Low       Low

^a Verif., verification; Id., identification; Behav., behavioral; Phys., physiological; Ov., overt; Cov., covert; DB Comp., database compatibility.
used in a covert manner more easily than a classical fingerprint system. The higher this capability, the higher the privacy risk.
The third feature distinguishes behavioral traits from physiological ones. The acquisition of most behavioral traits needs cooperation from the user, and these traits are less stable in time; hence they are considered more privacy compliant than physiological ones. The higher the need for user cooperation or the variability in time, the lower the privacy risk.
The fourth feature is related to two points: (1) the interoperability of the technology when working with different databases and (2) the presence of numerous and/or large available databases against which to process comparisons. For example, a face acquisition can be used for multiple searches in different databases with relatively low effort. Similarly, many, and large, databases of fingerprint templates exist...
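The qualitative ratings of Table 25.3 can also be encoded numerically. The 1-3 scale, the averaging, and the thresholds in the sketch below are assumptions made purely for illustration (the IBG ranking itself is qualitative); they happen to be chosen so that this simple rule reproduces the table's Overall Risk column.

```python
# Illustrative encoding of Table 25.3. Scale and thresholds are assumed,
# not part of the IBG methodology.
LEVELS = {"Low": 1, "Medium": 2, "High": 3}

# Per-trait ratings: (Verif./Id., Behav./Phys., Ov./Cov., DB Comp.)
TRAITS = {
    "Face":        ("High", "Medium", "High",   "High"),
    "Fingerprint": ("High", "High",   "Low",    "High"),
    "Retina":      ("High", "High",   "Low",    "Low"),
    "Iris":        ("High", "High",   "Low",    "Low"),
    "Hand":        ("Low",  "Medium", "Low",    "Low"),
    "Voice":       ("Low",  "Low",    "Medium", "Low"),
    "Keystroke":   ("Low",  "Low",    "Medium", "Low"),
    "Signature":   ("Low",  "Low",    "Low",    "Low"),
}

def overall_risk(ratings):
    """Average the four feature scores and map back to a risk level."""
    mean = sum(LEVELS[r] for r in ratings) / len(ratings)
    if mean >= 2.5:
        return "High"
    if mean >= 2.0:
        return "Medium"
    return "Low"

for trait, ratings in TRAITS.items():
    print(f"{trait:<11} {overall_risk(ratings)}")
```

Any real risk assessment would weight the features according to the deployment context; the point of the sketch is only that the four per-feature ratings jointly determine the overall ranking.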