


Arenberg Doctoral School of Science, Engineering & Technology
Faculty of Engineering
Department of Electrical Engineering (ESAT)

Cryptographic Protocols For Privacy Enhanced Identity Management

Markulf Kohlweiss

Dissertation presented in partial fulfillment of the requirements for the degree of Doctor in Engineering

March 2010


Cryptographic Protocols For Privacy Enhanced Identity Management

Markulf Kohlweiss

Jury:
Prof. dr. ir. Paul Van Houtte, president
Prof. dr. ir. Bart Preneel, promotor
Prof. dr. ir. Vincent Rijmen
Prof. dr. ir. Bart De Decker
Dr. Claudia Diaz
Dr. Jan Camenisch (IBM Zurich Research Laboratory)
Dr. George Danezis (Microsoft Research Cambridge)
Prof. dr. David Pointcheval (École Normale Supérieure)

Dissertation presented in partial fulfillment of the requirements for the degree of Doctor in Engineering

U.D.C. number

March 2010


© Katholieke Universiteit Leuven – Faculty of Engineering
Address, B-3001 Leuven (Belgium)

Alle rechten voorbehouden. Niets uit deze uitgave mag worden vermenigvuldigd en/of openbaar gemaakt worden door middel van druk, fotocopie, microfilm, elektronisch of op welke andere wijze ook zonder voorafgaande schriftelijke toestemming van de uitgever.

All rights reserved. No part of the publication may be reproduced in any form by print, photoprint, microfilm or any other means without written permission from the publisher.

ISBN 978-94-6018-191-7
Legal depot D/2010/7515/30


All great truths begin as blasphemies. (George Bernard Shaw)


Acknowledgments

It is my pleasure to thank many people for helping me to realize this Ph.D. and for making my time here in Leuven much more enjoyable.

First, I want to thank Prof. Bart Preneel for agreeing to be the supervisor of this thesis—and finding the money to fund all my trips. And also for his incredible patience and wise advice.

I would like to express my gratitude to Dr. Jan Camenisch, Dr. George Danezis, Prof. Bart De Decker, Prof. Claudia Diaz, Prof. David Pointcheval, and Prof. Vincent Rijmen for kindly accepting to be members of the jury, and Prof. Paul Van Houtte for chairing it.

I am indebted to my many co-authors: Christer Andersson, Mira Belenkiy, Melissa Chase, Sebastian Faust, Lothar Fritsch, Bartek Gedrojc, Susan Hohenberger, Leonardo Martucci, Andriy Panchenko, Alfredo Rial, Caroline Sheedy, and Claudio Soriente, who were both my teachers and my friends. And in particular to Jan Camenisch, Anna Lysyanskaya, and Gregory Neven for being my daily supervisors. I was lucky, not merely for being a dwarf on the shoulders of giants, but for being guided in the ascent by many able hands.

Working at COSIC is like having an oracle that can answer many cryptographic and mathematical questions, in ways that are not only entertaining and informative, but that also help to better understand the questions themselves. In no particular order and with no claim to completeness, I want to thank the following members of this oracle, who are all as individuals more than the division of the whole: Jan, Mina, Dries, Carmela, Josep, Benedikt, Stefan, Stefaan, Fre, Robert, Thomas, Danny, Svetla, Antoon, An, Karel, Elke, Klaus, Sebastiaan, Seda, Andreas, Andrey, Emilia, Orr, Bart, Michael, Jens, Junfeng, Kazuo, Nessim, Len, Filipe, Roel, Miroslav, Ozgul, Stefan, Koen, Leif, Gauthier, Vesselin, Venellin, Victor, Anthony, and Brecht. In particular, I would like to thank Elena Andreeva and Sebastian Faust for all those years of sharing and moving offices together throughout the entire ESAT building and for their very special sense of humor. Over the years, the oracle has continued to grow in scope and has assumed a more distributed nature. I also want to thank my many friends and colleagues abroad.


Two members of COSIC deserve mention for being more down-to-earth: Pela Noe, not only for practical matters such as paperwork, but primarily for, through pure essence, being someone whom you can always talk to (and there is a lot to talk about). Without her COSIC would certainly be a different place. And Elvira Wouters, for making sure that I signed my working contract every year and for maintaining discipline in a place that is otherwise recklessly laissez-faire. Bedankt!

Last, but definitely not least, I want to thank my partner Ksenia Orman for her continuous support and for fighting off reductions. And her family for their hearty welcome. My best thoughts are extended to my parents and family. Without their initial support and encouragement this thesis would not even have been started.

Markulf Kohlweiss
Leuven, December 2009


Abstract

Privacy, as informational self-determination, is an important individual right in a world flooded with data. User-centric identity management aims at providing users with software that allows them to fully take advantage of their privacy rights when interacting with novel online services and applications. In this dissertation, we propose several cryptographic protocols that reduce the reliance of online transactions on the release of personal data while at the same time improving transaction security.

This thesis looks at two families of such protocols. The first are protocols for the anonymous release of certified data. Example protocols in this family are anonymous credentials, anonymous electronic cash, and group signatures. These protocols all support anonymity by hiding the relation between the released data and the identity of the user. We construct the first efficient delegatable anonymous credential scheme, as well as building blocks that avoid the use of an idealized hash function (a random oracle) when designing privacy enhancing protocols. Based on these building blocks we construct the first efficient compact e-cash scheme that does not require a random oracle for its proof of security.

The second protocol family covers protocols for the private access of data. Example protocols in this family are private information retrieval, oblivious transfer, and oblivious and private searching. These protocols hide the access patterns of the user. We introduce the first oblivious keyword search scheme for public key encrypted data.

In their pure form such protocols have only limited applications. Anonymity needs to be counterbalanced by accountability to make sure that users do not abuse their anonymity; otherwise, it would for instance be possible to share an anonymous subscription with an unlimited number of users. Similarly, when giving users the possibility to access data privately, it is often necessary to check the user's access rights, or to require appropriate payment, before granting access to the data. We construct flexible mechanisms to restrict the number of certified releases, i.e., the number of anonymous credential shows, per time period. We also show how to implement payment and access control mechanisms for oblivious transfer and oblivious keyword search based access to encrypted data. We show how to use these mechanisms in two security- and privacy-critical applications: location-based services and data retention.


Samenvatting

Privacy, het recht tot het beschermen van de persoonlijke levenssfeer, is een belangrijk individueel recht in een wereld overspoeld met data. Het doel van identiteitsbeheer is om gebruikers van software te voorzien die hen in staat stelt om over het geheel van hun privérechten te beschikken tijdens het gebruik van online diensten en toepassingen. In deze thesis stellen we meerdere cryptografische protocollen voor. Deze reduceren de afhankelijkheid van online transacties van de bekendmaking van vertrouwelijke gegevens, en verbeteren daarnaast de veiligheid van dergelijke transacties.

In deze thesis worden twee protocolfamilies bekeken. Als eerste beschouwen we protocollen voor anonieme bekendmaking van gecertificeerde gegevens. Voorbeelden van zulke protocollen zijn anonieme credentials, anonieme elektronische cash (e-cash), en groepshandtekeningen. Deze protocollen verschaffen anonimiteit door het verband tussen geopenbaarde data en de identiteit van de gebruiker te verbergen. We introduceren het eerste efficiënte schema voor delegeerbare anonieme credentials, alsook bouwblokken die het gebruik van een geïdealiseerde hashfunctie (een willekeurig orakel) voor het ontwerp van privacyverbeterende protocollen vermijden. Middels deze bouwblokken construeren we het eerste efficiënte compacte e-cashsysteem dat geen willekeurig orakel nodig heeft voor het veiligheidsbewijs.

De tweede protocolfamilie bestaat uit protocollen voor privétoegang tot gegevens. Voorbeelden van zulke protocollen zijn systemen voor het terugvinden van vertrouwelijke gegevens (private information retrieval), blinde overdrachten (oblivious transfers), en blind en geheim zoeken (oblivious and private searching). Deze protocollen verbergen patronen in het toegangsgedrag van gebruikers. In het bijzonder introduceren we het eerste schema voor het blind zoeken (oblivious search) naar sleutelwoorden in data versleuteld middels publiekesleutelcryptografie.

In hun basisvorm hebben de beschreven protocollen slechts beperkte toepassingen. Anonimiteit en aansprakelijkheid dienen te worden gebalanceerd om te garanderen dat gebruikers hun anonimiteit niet misbruiken. Inderdaad, het zou anders bijvoorbeeld mogelijk zijn om een anoniem abonnement met een willekeurig aantal andere gebruikers te delen. Bovendien, wanneer gebruikers de mogelijkheid hebben om privégegevens te bekijken, is het vaak wenselijk om van deze gebruikers de toegangsrechten te kunnen controleren, of om een betaling te kunnen vereisen alvorens hen toegang te verlenen. Wij construeren flexibele mechanismen om het aantal gecertificeerde bekendmakingen, dat wil zeggen het aantal vertoningen van anonieme credentials, per tijdsperiode in te perken. Verder tonen wij hoe betalingsmechanismen, mechanismen voor toegangscontrole voor blinde overdrachten, en schema's voor het blind zoeken (oblivious search) naar sleutelwoorden in vercijferde data kunnen geïmplementeerd worden. Bovendien tonen wij hoe deze mechanismen kunnen gebruikt worden in twee toepassingen waarvoor veiligheid en privacy belangrijk zijn: locatiegebonden dienstverlening en dataretentie.


Zusammenfassung

Unsere Privatsphäre, im Sinne von informationeller Selbstbestimmung, ist ein wichtiges Individualrecht in unserer von Daten überfluteten Welt. Nutzerzentriertes Identitätsmanagement soll, vermittels Software, Online-Nutzern ermöglichen, dieses Recht möglichst ganzheitlich auszuschöpfen. Das ist insbesondere für elektronische Interaktionen mit neuartigen Online-Diensten von wachsender Bedeutung. In dieser Doktorarbeit entwerfen wir kryptographische Protokolle, die die Notwendigkeit zur Herausgabe personenbezogener Daten eliminieren, gleichzeitig aber die Sicherheit der durchgeführten Transaktionen verbessern.

Insbesondere untersuchen wir zwei Protokollfamilien. Erstere umfasst Protokolle zur anonymen Herausgabe zertifizierter Daten. Beispiele für solche Protokolle sind anonyme elektronische Berechtigungsnachweise, anonymes elektronisches Geld und Gruppensignaturen. Diese Protokolle unterstützen Anonymität, indem sie die Beziehung zwischen den herausgegebenen zertifizierten Daten und der Identität des Benutzers verschleiern. Teil des wissenschaftlichen Beitrages dieser Dissertation ist die Konstruktion delegierbarer anonymer Berechtigungsnachweise. Ein weiterer Beitrag sind Bausteine, die zur Konstruktion Privatsphäre schützender Protokolle dienen. Dabei legten wir besonderen Wert auf Sicherheitsbeweise, die ohne idealisierte Zufallsorakel auskommen.

Die zweite Familie umfasst Protokolle für den privaten Datenzugriff. Beispiele für solche Protokolle sind Private-Information-Retrieval, Oblivious-Transfer (auf Deutsch etwa: "privater Informationszugriff" und "Übertragung ohne Gedächtnis") und "gedächtnisloses" und privates Suchen. Diese Protokolle verbergen die Zugriffsmuster der Nutzer. Unser Beitrag ist die erste Lösung für "gedächtnislose" Stichwortsuche in asymmetrisch verschlüsselten Daten.

In ihrer reinen Form sind derartige Protokolle in für die Praxis relevanten Systemen lediglich begrenzt anwendbar. Vielmehr müssen häufig Maßnahmen ergriffen werden, die sicherstellen, dass Nutzer verantwortungsvoll handeln und ihre Anonymität nicht missbrauchen. Andernfalls wäre es beispielsweise möglich, ein anonymes elektronisches Abonnement mit einer unbegrenzten Anzahl von Nutzern zu teilen. Ebenso ist es beim privaten Zugriff auf Daten häufig nötig, die Zugriffsrechte von Nutzern zu überprüfen oder eine adäquate Zahlung für den Zugriff auf die Daten einzufordern. Wir entwerfen flexible Mechanismen, um die Anzahl der pro Zeitabschnitt anonym vorgewiesenen Berechtigungsnachweise einzuschränken. Des Weiteren realisieren wir Zahlungs- und Zugangskontrollmechanismen für Oblivious-Transfer und "gedächtnislose" Stichwortsuche, die keine zusätzliche Information über die Zugriffs- und Suchmuster des Benutzers preisgeben. Wir präsentieren diese Mechanismen in zwei Anwendungen, nämlich Vorratsdatenspeicherung und ortsbasierte Dienste.


Основные положения диссертации

Неприкосновенность частной жизни в смысле информационного самоопределения (privacy) является важным индивидуальным правом в современном мире, изобилующем данными. Ориентированное на пользователя управление идентификацией имеет целью обеспечение пользователей программами, позволяющими им в полной мере реализовать свое право на неприкосновенность частной жизни при работе с новыми онлайн-услугами и приложениями. В этой диссертации предлагается несколько криптографических протоколов, которые уменьшают зависимость сетевых транзакций от степени открытости личных данных, в то же время повышая безопасность таких транзакций.

Рассматриваются две группы таких протоколов. Первая группа включает в себя протоколы, предназначенные для анонимного предоставления сертифицированных данных. Например, к таким протоколам относятся анонимное электронное удостоверение, анонимные "электронные деньги" и групповые электронные цифровые подписи. Эти протоколы обеспечивают анонимность, скрывая связь между опубликованными данными и личностью пользователя. Предлагается первая эффективная делегируемая схема анонимных электронных удостоверений со вспомогательными компонентами, которые не требуют использования идеализированной хэш-функции (случайного оракула) для разработки протоколов с повышенным уровнем безопасности. На основе этих вспомогательных компонентов мы строим первую эффективную и компактную схему "электронных денег", не требующую случайного оракула для доказательства её безопасности.

Вторая группа охватывает протоколы частного доступа к данным. Примерами таких протоколов являются поиск личной информации, протоколы передачи с забыванием, а также частный поиск данных с забыванием. Эти протоколы скрывают параметры доступа к данным. Разработана первая схема поиска по ключевому слову с забыванием для данных, зашифрованных на открытом ключе.

В чистом виде эти протоколы имеют лишь ограниченное применение. Анонимность должна быть уравновешена ответственностью, чтобы убедиться, что пользователи не злоупотребляют своей анонимностью. В противном случае, к примеру, было бы возможным передавать анонимную подписку неограниченному количеству пользователей. Подобным же образом, давая пользователям возможность частного доступа к данным, зачастую необходимо контролировать их права доступа или требовать соответствующей платы за доступ к данным. В работе были построены гибкие механизмы ограничения числа сертифицированных передач данных, т. е. числа предъявлений анонимного электронного удостоверения в единицу времени. Также было показано, как реализовывать механизмы осуществления платежей и управления доступом к данным в протоколах передачи с забыванием, а также в протоколах


поиска по ключевому слову с забыванием для зашифрованных данных. Было продемонстрировано, как использовать описанные в работе механизмы в двух приложениях c высокими требованиями к безопасности и неприкосновенности частной жизни: в службах, основанных на определении местоположения, и в услугах хранения данных.


Contents

1 Introduction
  1.1 Online Transactions and Internet Privacy
  1.2 State of the Art of Privacy Protocols
  1.3 Outline and Summary of Contribution

2 Preliminaries
  2.1 Modular and Reductionist Protocol Security
  2.2 Notation
    2.2.1 Protocol Participants and Algorithms
    2.2.2 Groups and Generators
    2.2.3 Asymptotic Security
  2.3 Assumptions
    2.3.1 Communication, Trust, and Setup Assumptions
    2.3.2 Idealized Assumptions
    2.3.3 Complexity Assumptions
  2.4 Cryptographic Building Blocks
    2.4.1 Commitment Schemes
    2.4.2 Public-Key Encryption Schemes
    2.4.3 Digital Signature Schemes
    2.4.4 Pseudo-Random Function (PRF)
    2.4.5 Secure Two-party Computation
  2.5 Cryptographic Proof Systems
    2.5.1 Interactive and Non-Interactive Cryptographic Proofs
    2.5.2 Cryptographic Proofs of Knowledge
    2.5.3 Hiding Properties of Cryptographic Proofs
    2.5.4 Proofs of Knowledge about Discrete Logarithms
    2.5.5 Proofs About Pairing Product Equations
    2.5.6 Randomizable Non-Interactive Proofs

I Anonymous Credential Related Schemes

3 Introduction to Part I


  3.1 Basic Anonymous Credential System
  3.2 Pseudonym Systems
  3.3 Limited Spending
  3.4 Pseudonymously Delegatable Credentials
  3.5 Summary of Contribution

4 P-signatures
  4.1 Security Notion
  4.2 P-Signatures From Weak BB Signatures
  4.3 P-Signatures From Full BB Signatures
  4.4 A Multi-block P-Signature Scheme
  4.5 Conclusion

5 Compact e-Cash Based on a Common Reference String
  5.1 Security Notions
  5.2 Simulatable Verifiable Random Functions
    5.2.1 A New sVRF Construction
    5.2.2 NIZK Proofs for Pseudo-Random Functions
    5.2.3 NIZK Proofs for Double-Spending Equations
  5.3 Construction of Compact E-Cash
  5.4 Conclusion

6 Periodic n-Times Spending
  6.1 Security Notion
  6.2 Periodic n-Times Anonymous e-Tokens
    6.2.1 Basic Construction
    6.2.2 Efficiency Discussion
  6.3 Additional Extensions
    6.3.1 Weak Exculpability
    6.3.2 Strong Exculpability
    6.3.3 Tracing
    6.3.4 Dynamic Revocation
  6.4 Conclusion

7 Delegatable Anonymous Credentials
  7.1 Security Notion
  7.2 Building Blocks
    7.2.1 Certification Secure Message Authentication Scheme
    7.2.2 Additional Protocols
  7.3 Construction of Delegatable Credentials
  7.4 Conclusion


II Oblivious Transfer Related Schemes

8 Introduction to Part II
  8.1 Protocols for the Private Access to Data
    8.1.1 Private Information Retrieval
    8.1.2 Private Information Search
    8.1.3 Oblivious Transfer
  8.2 Hiding Access Patterns during Access Control
  8.3 Summary of Contribution

9 Efficient Oblivious Augmented Maps
  9.1 Privacy Protected LBS Scheme: Definition and Solution Sketch
    9.1.1 Definition
    9.1.2 High-Level Approach and First Sketch
    9.1.3 First Revision: Database Secrecy
    9.1.4 Second Revision: Payment Infrastructure
  9.2 A Multi-Party Proxy LBS Scheme
  9.3 Security and Efficiency
    9.3.1 Efficiency Analysis
    9.3.2 Security Analysis
  9.4 Conclusion

10 Authorized Private Searches on Public Key Encrypted Data
  10.1 Definitions of Committed Blind Anonymous IBE and PEOKS
    10.1.1 Anonymous Identity-Based Encryption
    10.1.2 Committed Blind Anonymous IBE
    10.1.3 Public Key Encryption with Oblivious Keyword Search
  10.2 Construction of a Committed Blind Anonymous IBE Scheme
    10.2.1 The Underlying Anonymous IBE Scheme
    10.2.2 Blind Extraction Protocol
  10.3 Transformation to PEOKS
  10.4 Authorized Private Searches on Public Key Encrypted Data
  10.5 Conclusion

11 Conclusions and Future Research
  11.1 Conclusions
  11.2 Future Work


Bibliography

List of Publications

Curriculum Vitae

Appendix

A Security Proofs
  A.1 Security of First Construction of P-Signatures
  A.2 Security of Second Construction of P-Signatures
  A.3 Security of Multi-block Signature Scheme
  A.4 Security of sVRF Scheme
  A.5 Security of Our Compact E-Cash Scheme
  A.6 Security of e-Token Scheme
  A.7 Security of Delegatable Credential Scheme
  A.8 Security of Committed Blind Anonymous IBE Scheme
    A.8.1 Security of Anonymous IBE Scheme Variant
    A.8.2 Security of Blind Extraction Protocol

B Generic Group Proofs
  B.1 Generic Group Proof of IDDHI
  B.2 Generic Group Proof of BB-HSDH
  B.3 Generic Group Security of BB-CDH


List of Figures

2.1 Ideal functionality based definitions

3.1 Identity federation
3.2 Pseudonym system

8.1 Adaptive OT based on Chaum blind signatures

9.1 Five phases of oblivious map protocol

10.1 Identity federation

List of Tables

3.1 Input and output description of the selective-show protocol
3.2 Backus-Naur form grammar for credential claims
3.3 Input and output description of the blind-issue protocol
3.4 Input and output description of the delegation protocol

8.1 Private information retrieval protocol
8.2 OT^n_1 protocol


List of Symbols and Acronyms

Basic integers and variables
a, b, c, u, v, w   often group elements, but also exponents and bit strings
A, B, X, Y   and other capital letters are often group elements
b   also for single bits
g, h   generators, often g ∈ G1 and h ∈ G2
i, j   commonly used as indices
I, J, N, M   lengths of lists with index i, j, n, m
lx   length parameter for x
L   levels of delegation
k   security parameter
m, n, ℓ   used for indices, lengths of lists, and upper limits
n, n   products of two large primes
p   often a prime
q   often another prime or the size of a one-more assumption
r, s, x, y, z, ρ, θ, α, β, γ   commonly used for secret exponents
v, u   font indicates vectors or lists of elements
v, u   when clear from context, vectors might use normal font
y, w, L, R   instance, witness, language, and relation of a proof

Basic sets
G, H, G1, G2, GT   groups
N   natural numbers, i.e., the set {0, 1, 2, 3, . . .}
NP   languages decidable in non-deterministic polynomial time
R   real numbers
V   vector space
Z   integers, i.e., the set {a, −a : a ∈ N}
Zn   remainders modulo n
Z*n   multiplicative group of remainders modulo n
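As a quick illustration of the last two sets above, here is a small Python sketch (for intuition only; the function names Z_mod and Z_mod_star are ours, not part of the thesis notation). Zn collects all remainders modulo n, while Z*n keeps only the residues that are coprime to n and hence invertible:

```python
from math import gcd

def Z_mod(n):
    """Z_n: the set of remainders modulo n, i.e., {0, 1, ..., n-1}."""
    return set(range(n))

def Z_mod_star(n):
    """Z*_n: the multiplicative group of remainders modulo n,
    i.e., those a with gcd(a, n) = 1, which are invertible mod n."""
    return {a for a in range(n) if gcd(a, n) == 1}

# For n = 10: Z_10 = {0, ..., 9} and Z*_10 = {1, 3, 7, 9},
# which has phi(10) = 4 elements.
```

Note that Z*_n is closed under multiplication modulo n, which is what makes it a group.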

xix

Page 22: Cryptographic Protocols For Privacy Enhanced Identity Management

xx LIST OF TABLES

Functions
f   some function, e.g., function concretizing f-extraction
fs   pseudo-random function
F   function concretizing F-unforgeability
H   cryptographic hash function
κ   knowledge error
ν   a negligible function
poly   a polynomial function
Ys, Ps   functions mauling instance and proof respectively

Operators
Algorithm   font indicates an algorithm
a‖b   concatenation of the two bit strings a and b
|a|   bit length of bit string a
|S|   size of set S
e : G1 × G2 → GT   a bilinear map
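The defining property of the bilinear map in the last entry is e(g^a, h^b) = e(g, h)^(ab). The following toy sketch (ours; not a real pairing, which in practice is built from elliptic curves) mimics this in the exponent by representing each group element through its discrete logarithm modulo a prime group order:

```python
q = 101  # toy prime group order; illustration only

def e(a, b):
    """Toy 'pairing' in exponent notation: if g^a stands in for an
    element of G1 and h^b for an element of G2, then
    e(g^a, h^b) = e(g, h)^(a*b) corresponds to a*b mod q in GT."""
    return (a * b) % q

# Bilinearity: pairing a product of G1 elements (a sum of exponents)
# equals the product of the pairings (a sum in GT's exponent).
lhs = e((3 + 5) % q, 7)
rhs = (e(3, 7) + e(5, 7)) % q
assert lhs == rhs
```

This exponent-level view is exactly the property that protocols such as Groth-Sahai proofs and P-signatures exploit.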

Parameters
atti   an attribute name
ct   ciphertext
comm   commitment value
cred   credential
D   e-token dispenser
ext   extraction trapdoor
m   message
nym   pseudonym
open   opening value for commitment
π, φ   cryptographic proofs
params   parameters that may contain a common reference string
pk   a public key
S   serial number
σ   signature
sim   simulation trapdoor
sk   a secret key
T   double-spending tag


Parties
A   adversary
E   environment, knowledge extractor, or a generic entity/party
I   ideal protocol or credential issuer
KGC   key generation center
L   location-based service provider
M   merchant
P   prover and location-based service proxy
R   real protocol
S   simulator and service provider
T   privacy trustee
TGE   trapdoor generation entity
U   user
V   verifier

Assumptions
BB-CDH   Boneh-Boyen CDH
BB-HSDH   Boneh-Boyen HSDH
CDH   Computational Diffie-Hellman
DBDH   Decisional Bilinear Diffie-Hellman
DCR   Decisional Composite Residuosity
DDHI   Decisional Diffie-Hellman Inversion
DLIN   Decision Linear
HSDH   Hidden Strong Diffie-Hellman
IDDHI   Interactive DDH Inversion
IHSDH   Interactive Hidden SDH
RSA   Security of Rivest-Shamir-Adleman cryptosystem
TDH   Triple Diffie-Hellman
SDH   Strong Diffie-Hellman
SRSA   Strong RSA
XDH   External Diffie-Hellman


List of abbreviations
2PC   Two-Party Computation
BB   Boneh-Boyen
BNF   Backus-Naur Form
CEO   Chief Executive Officer
CL-signature   Camenisch-Lysyanskaya signature
CPA   Chosen Plaintext Attack
CRS   Common Reference String
DB   Database
DH   Diffie-Hellman
DY   Dodis-Yampolskiy
e-cash   electronic cash
EF-CMA   Existential Forgery under adaptive Chosen Message Attack
GMR-notation   Goldwasser-Micali-Rivest notation
GS   Groth-Sahai
GSM   Global System for Mobile communications
IBE   Identity-Based Encryption
IMS   Identity Management Systems
IND-CCA   INDistinguishability under Chosen Ciphertext Attack
IND-CPA   INDistinguishability under Chosen Plaintext Attack
IP   Internet Protocol
ISP   Internet Service Provider
LBS   Location-Based Service
MAC   Media Access Control
NIPK   Non-Interactive Proof of Knowledge
NIZK   Non-Interactive Zero-Knowledge
NIZKPK   Non-Interactive Zero-Knowledge Proof of Knowledge
OT   Oblivious Transfer
PEKS   Public key Encryption with Keyword Search
PEOKS   Public key Encryption with Oblivious Keyword Search
PET   Privacy Enhancing Technologies
PIP   Pseudonymous Identification Protocol
PIR   Private Information Retrieval
PKI   public key infrastructure
p.p.t.   probabilistic polynomial time
PRF   Pseudo-Random Function
PRIME   PRivacy and Identity Management for Europe
P-signature   Signature with efficient Protocols
RSA   Rivest-Shamir-Adleman
sVRF   simulatable Verifiable Random Function
TSN   Token Serial Number
UBSS   Unique Blind Signature Schemes
WBB   Weak Boneh-Boyen signature scheme
ZK   Zero-Knowledge


Chapter 1

Introduction

New technologies have a profound influence on society. In the last decades, technology has changed our perception of privacy. Today, a lot more information about us is available for automatic processing than ever before. In this thesis, we look at ways in which technology, and in particular cryptographic protocols, enables individuals to manage their personal data in more effective and more secure ways.

The internet and other forms of networked electronic communication such as satellite communication, wireless local area networks, and mobile phones have become part of our daily lives. They have changed the way we communicate, learn, and play. Today, electronic communication is an important channel for entertainment, business, and governance. As a medium that is readily accessible to small businesses and individuals, the internet is an important driver of technological innovation and social change.

Our dependence on these new technologies, however, creates threats and risks that are not yet fully understood. Lacking such an understanding, and often also the technical expertise, many services do not adequately protect their users and may open new attack surfaces.

The innovations that result in new services are often matched in resourcefulness by criminals who aim at abusing and attacking these services. Prominent examples of such abuses are internet worms, phishing attacks, and spam. Internet worms are self-replicating software programs that spread over the internet. In a phishing attack, the attacker masquerades as a trustworthy entity in an electronic communication with the goal of stealing the user's access credentials. Spam is unsolicited electronic communication that is sent in bulk, primarily for commercial reasons. Such attacks can be made more efficient if they can be targeted at specific individuals based on personal information.

Other threats, this time to our individual liberties, are censorship [1] and surveillance [2]. Commercial products, too, can be perceived as offensive intrusions into our online lives, e.g., Facebook Beacon [3]. A common thread behind these perceived threats is the blurring and rewriting of the border between the public and the private sphere.

Privacy, as the right to informational self-determination, is an important self-defense mechanism against such excesses. Unfortunately, privacy on the internet is still in its infancy. Many users adopt new services without being aware of the long-term consequences [4]. Arguably, users are becoming more privacy aware over time, even if they continue to adopt new services that provide sub-optimal privacy protection [5]. Presumably, one reason for this is that more privacy-friendly services are simply not available.

In our daily lives we often give away information about ourselves in order to obtain services or goods. We do, however, have preferences concerning the amount of information revealed and want to assess the trustworthiness of data recipients. The aim of privacy mechanisms is to reduce the privacy and security risks associated with a transaction. As a rule of thumb, the less data is revealed, the lower the probability that it is abused. Consequently, transactions should only require users to reveal necessary information. This is known as the data minimization principle, a legal principle embedded in European data protection legislation [eu-06a].

Processes that have the primary purpose of managing personal data fall into the area of identity management. Electronic identity management systems manage the attributes and access rights of network users. They play an important role in securing electronic services because they manage the authorization of user requests. Traditionally this is done in several steps. First, a user authenticates with the help of his access credentials, such as his username and password or his public key certificate. After user authentication, the identity management system provides the service with user-specific attributes and configuration information. The collection of information about a user is called a profile. Finally, an authenticated user is authorized based on his profile and the business rules and access control policies of the service.
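
These steps can be illustrated with a small sketch. All names, the credential store, and the policy below are hypothetical; real systems would hash passwords and externalize the access control policy:

```python
# Sketch of the traditional authorization flow: authenticate the user,
# fetch the profile, then apply a business rule. Illustrative only.
USERS = {"alice": "s3cret"}                      # credential store
PROFILES = {"alice": {"language": "nl", "subscription": "premium"}}

def authorize(username: str, password: str, action: str) -> bool:
    # Step 1: authentication against the stored credential.
    if USERS.get(username) != password:
        return False
    # Step 2: the identity management system supplies the profile.
    profile = PROFILES.get(username, {})
    # Step 3: authorization based on profile and business rules.
    if action == "premium_content":
        return profile.get("subscription") == "premium"
    return True

assert authorize("alice", "s3cret", "premium_content")
assert not authorize("alice", "wrong", "premium_content")
```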

We speak of service-centric identity management if every service provider operates his own identity management system with its own user interface and

[1] A more subtle form of censorship is self-censorship. This is also related to the so-called chilling effect: http://en.wikipedia.org/wiki/Chilling_effect_(term).

[2] A fitting metaphor that expresses the dangers of surveillance very pointedly is the panopticon: http://en.wikipedia.org/wiki/Panopticon. Some express the fear that the internet may become an electronic panopticon [Bri02].

[3] A service that broadcast shopping behavior. It was removed after intense media scrutiny: http://www.pcworld.com/article/id,140182-c,onlineprivacy/article.html.

[4] This is particularly true for cloud computing applications such as webmail, web-based office applications, social networking, and online storage: http://www.worldprivacyforum.org/pdf/WPF_Cloud_Privacy_Report.pdf

[5] http://www.ecommercetimes.com/story/%2020346.html

user experience. With the rapid increase in the number of services accessed by users, this process is neither very user-friendly nor very secure: users have to manage a multitude of access credentials and learn multiple user interfaces. Overall, because of their reliance on widely distributed profile data, service-centric identity management systems lack support for data minimization and do not adequately protect the privacy of users.

In a user-centric identity management system a user manages all of his credentials and profiles through a common interface. Such systems have received growing attention as an important tool for improving the privacy of online users [CK01, Her08]. Through an adequately designed user interface and the support of privacy enhancing protocols, users regain control over their personal data. The goal of this thesis is not the design of a comprehensive user-centric identity management solution; that was the goal of research projects the author participated in [pri09a, pri09b]. Rather, the goal of our inquiry is to design new cryptographic protocols that effectively and efficiently support identity management and data minimization goals.

To make the most of data minimization, we need to deepen our understanding of two different areas. First, we need to understand the business processes of online transactions and their impact on privacy. Second, we need to understand the technical means, in the form of cryptographic protocols, that can be used to minimize the reliance of these processes on the release of personal data.

In the next section we describe an abstract and somewhat oversimplified transaction model for internet privacy. We use this model to classify existing privacy enhancing protocols in Section 1.2. We summarize the contribution of this thesis in Section 1.3.

1.1 Online Transactions and Internet Privacy

When referring to transactions in the context of this thesis, we do not necessarily imply any specific technical properties, such as atomicity or consistency [6], but mean any agreement, communication, or movement carried out between separate parties that involves the exchange of items of value, such as information, goods, services, and money [7]. We focus on privacy requirements [GZ09] and particularly on data minimization. For the purpose of this thesis, we consider the following three-facet transaction model:

1. Negotiation and authorization: The parties involved in the transaction negotiate which credentials and data to release to each other, exchange the agreed-upon credentials and data, and authorize the transaction.

[6] http://en.wikipedia.org/wiki/ACID
[7] http://en.wikipedia.org/wiki/Transaction

2. Service personalization and provisioning: The service is provided to the user, after being personalized based on his context (e.g., his geographic location) and profile (e.g., his language preferences or subscription type).

3. Accountability and law enforcement: In exceptional cases, accountability mechanisms are invoked to ensure that the service is not abused. Illegal transactions need to be detected. Upon complaint of one party, conflicts between parties should be resolvable in court or through arbitration. (Mechanisms that guarantee accountability need to be interwoven with the processes of Facets 1 and 2 in order to be effective.)

In the offline world we often do not pay attention to the security, privacy, and accountability issues that arise during the execution of a transaction. We enter a shop, bargain for the service or goods we want to buy, and pay in cash. Security and accountability are, for instance, ensured by running after the thief and calling the police. With the possibility to choose whichever shop we want and the option to pay anonymously using cash, offline transactions can be very privacy-friendly. Both the processes that guarantee accountability and the processes that support our privacy have evolved over a long time, and appear natural to us. Though these transactions are more complex than meets the eye, we have learned how to carry them out.

At the start of an online transaction, participants know even less about each other. They may be based in different jurisdictions, and they may provide false information [8]. This, together with the fact that we often access the online world from the privacy of our own homes, may seem good for privacy.

Because of the technical mediation involved, it is much harder to judge what is going on in the online world. Today's online transactions are rather privacy invasive for ordinary users. In the absence of countermeasures, every online action leaves behind information about network identifiers such as internet protocol (IP) and (in the local network) media access control (MAC) addresses. Moreover, application programs, such as web browsers, can reveal even more information [9]. All of this information can be used to track and identify a user [10].

It is a common misconception that more information means better security. IP addresses carry a privacy risk but provide few security guarantees [11]. Attackers can fake IP addresses or can use compromised machines to launch attacks. Users with the necessary technical knowledge and criminal energy can disguise their

[8] Proverbially, ‘On the Internet, nobody knows you’re a dog.’ Page 61 of the July 5, 1993 issue of The New Yorker (Vol. 69, no. 20), http://www.unc.edu/depts/jomc/academics/dri/idog.html

[9] http://stackoverflow.com/questions/87365/
[10] Technology such as credit cards, customer loyalty cards, RFID chips, and surveillance cameras can also lead to deteriorating privacy in the offline world.
[11] In the absence of alternatives, they are nevertheless used for some forms of access control on the internet, e.g., for http://www.springerlink.com or http://www.wikipedia.org.

identity, create fake identities, and commit identity theft. Service providers that collect personal information such as credit card numbers, addresses, email addresses, and phone numbers do not improve security, as this information about their users could be stolen or misused. Such collection is also bad for privacy, because these data items allow users to be identified, and can be extended with behavioral information such as, for instance, click patterns, search queries, and past purchases. Indiscriminate retention of traffic data for law enforcement, e.g., as decreed by a highly controversial European directive [eu-06b], raises similar concerns. Both the profile data stored at service providers and the data retained for the police and the secret services have left the user's sphere of control. Unlawful use of the data, e.g., for illegal data mining, and the loss or compromise of the data can be the consequence. As a simple security heuristic, the more widespread information is, the higher the likelihood that it is abused.

1.2 State of the Art of Privacy Protocols

A security protocol employs cryptographic mechanisms to achieve a security-related goal such as key agreement, entity authentication, message encryption, or message authentication. These goals mutually support each other and are free of conflicts. The goal of privacy protocols used for identity management and service provisioning is to simultaneously address the privacy and security issues of online transactions. In particular we look at data minimization protocols that either allow data to be released only anonymously or keep it hidden altogether. At the same time our protocols show how to fulfill the constraints on authorization, personalization, and accountability outlined in the transaction model. The thesis will demonstrate how several seemingly conflicting identity management and privacy requirements (for instance authorization and anonymity) can be simultaneously achieved using advanced cryptographic techniques. We look at two families of protocols:

Protocols for the anonymous release of certified data. The goal of protocols for the anonymous release of certified data is to generate trustworthy assertions about a user's attributes and credentials. The assertions can be handed to service providers without compromising the privacy of users and are used to authorize the actions of users. This touches Facet 1: Negotiation and authorization of the transaction model (cf. Section 1.1).

Standard public key certificates provide similar mechanisms, i.e., they can be used as trustworthy assertions. However, as for instance pointed out by Brands [Bra00], public key certificates have many privacy issues. A public key certificate consists of a list of attribute-value pairs A = (att1 = a1, . . . , attn = an) and a signature σ = Sign(skCA, A) on these attributes. The signature is generated by a certification authority (CA) that publishes the public key pkCA needed for verification. Two

mandatory attributes of such a certificate are the user's name and his public key pk. The user keeps the corresponding secret key sk hidden. To prove ownership of a certificate, a user reveals A, σ and proves to the recipient of the certificate that he is the owner of pk (this is done using the secret key sk). The recipient of the certificate checks the signature σ to verify the user's attributes. This is highly privacy invasive: a user reveals A, which contains all his attributes, to every verifier. This alone allows linkability and reveals unnecessary information; even worse, if the proof of ownership is done using a digital signature, all of the user's actions are non-repudiable.
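
The privacy problem can be seen in a small sketch of this show protocol. The HMAC-based "signature" below is a toy stand-in for a real CA signature scheme such as RSA or ECDSA, and all attribute names are hypothetical:

```python
import hmac, hashlib, json

# Toy stand-in for a CA signature: an HMAC under a CA secret, which here
# both "signs" and "verifies". A real CA would publish pkCA and use a
# public-key signature scheme; the privacy argument is the same.
CA_SECRET = b"ca-demo-key"

def ca_sign(attributes: dict) -> bytes:
    msg = json.dumps(attributes, sort_keys=True).encode()
    return hmac.new(CA_SECRET, msg, hashlib.sha256).digest()

def verify(attributes: dict, sigma: bytes) -> bool:
    return hmac.compare_digest(ca_sign(attributes), sigma)

# The CA certifies ALL attributes at once ...
A = {"name": "Alice", "birth_year": 1990, "subscription": "premium"}
sigma = ca_sign(A)

# ... so to convince a verifier of ANY single attribute (say, the
# subscription), the user must reveal the full attribute list A.
assert verify(A, sigma)
print("verifier learned:", sorted(A))    # name and birth year leak too
```

The signature binds the whole list A, so selective disclosure is impossible: tampering with any single attribute invalidates σ, and verifying σ requires all of A. Anonymous credentials, discussed next, break exactly this limitation.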

Anonymous credentials [Cha85, Bra00, Ver01, CL01, Lys02] provide a more privacy-friendly alternative. They allow a user to selectively prove access rights and attributes about himself; e.g., a user could prove that he possesses a subscription and is younger than 25. A user can do this without revealing any other information about himself or his credentials. In particular, multiple shows of the same credential are unlinkable. The name anonymous credentials stems from the property that, given an anonymous channel, users remain anonymous within the set of users that have the same access rights and attributes.

The use of anonymous credentials also affects Facet 3: Accountability and law enforcement of the transaction model. From an accountability perspective, users who maliciously share the same credential are a major concern. Moreover, in case of abuse, users may need to be deanonymized [CL01] and their credentials may need to be revoked [BS01, CL02a, BS04].

Other protocols in this family are electronic cash [Cha82, Bra93b, CHL05] [12], group signatures [CvH91, CS97a, ACJT00, BBS04], and blind signatures [Cha82]. An important building block for the construction of many of these protocols is the Camenisch-Lysyanskaya (CL) signature scheme [CL02b].

Protocols for the private access to data. If the information that the user needs to release for the service to be provided is already privacy sensitive, then anonymization alone may be insufficient. Protocols for the private access to data go one step further and hide the access patterns of the user. When such protocols are used in Facet 2: Service personalization and provisioning of our transaction model, then the service provider might not even get to know which exact service the user is requesting from him. Take for example the provisioning of a location-based service. The user wants to request information about his current location. This means that the service access patterns of the user are location dependent. The stream of location data revealed by an individual can be highly privacy sensitive. It allows for tracking and identification, e.g., through revealing the home and work address of the user [Kru07] and re-identification [MDBP08]. Iqbal and Lim [IL07] show how to infer further information using data mining and pattern recognition techniques. The information

[12] E-cash is a certified statement about a payment that a merchant can cash in at a bank.

derived from movement patterns can include indicators of illnesses and of political, religious, and sexual orientation. A more privacy-friendly solution would make use of a cryptographic protocol called oblivious transfer [Rab81, NP99b, NP01, CNS07]. With the help of cryptography, the user would learn the information item that corresponds to his location while the service provider that stores the information remains oblivious of the user's location.

Other protocols in this family are private information retrieval [CGKS95, CMS99] and oblivious and private searching [OK04, WBDS04, ABC+05, CGKO06, OS07a, DD07].

Oblivious transfer is a two-party protocol in which the receiver can obtain one message out of a list of messages held by the sender. We speak of oblivious transfer if (a) the sender does not learn which message the receiver obtained and (b) the receiver does not learn anything about the other messages of the sender. A protocol that only guarantees (a) still qualifies as a private information retrieval protocol. In oblivious and private searching, the user does not necessarily know which item (or items) he is requesting, but makes a query based on the search terms he is interested in.
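
As a toy illustration of 1-out-of-2 oblivious transfer, the following sketch loosely follows the classic RSA-based construction of Even, Goldreich, and Lempel. It is not one of the schemes cited above, and the demo-sized RSA parameters are insecure; it only shows how properties (a) and (b) can be achieved:

```python
import random

# Toy 1-out-of-2 oblivious transfer in the style of Even-Goldreich-Lempel,
# using textbook RSA with tiny demo parameters (p=61, q=53). Illustrative
# only; a real deployment needs full-size RSA and proper encoding.
p, q = 61, 53
N, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))        # sender's private exponent

def ot_1_of_2(m0: int, m1: int, b: int) -> int:
    """Receiver learns m_b; sender does not learn b (toy sketch)."""
    # Sender publishes two random values x0, x1.
    x = [random.randrange(N), random.randrange(N)]
    # Receiver blinds its choice with a random k: v = x_b + k^e mod N.
    k = random.randrange(N)
    v = (x[b] + pow(k, e, N)) % N
    # Sender unblinds both ways; exactly one of k0, k1 equals k,
    # but the sender cannot tell which (property (a)).
    k0 = pow((v - x[0]) % N, d, N)
    k1 = pow((v - x[1]) % N, d, N)
    c0 = (m0 + k0) % N
    c1 = (m1 + k1) % N
    # Receiver can strip k only from the ciphertext of its choice;
    # the other message stays masked by a random-looking pad (property (b)).
    return ((c0, c1)[b] - k) % N

assert ot_1_of_2(1234, 567, 0) == 1234
assert ot_1_of_2(1234, 567, 1) == 567
```

In the location-based service example, m0 and m1 would be the information items for two locations, and b the user's actual location.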

When used in a commercial environment, protocols for the private access to services also need to support Facet 1: Negotiation and authorization of the transaction model. Service providers that provide private access to databases, while being in the dark about which records a user is accessing, still want to enforce access control on those records based on the (potentially hidden) access privileges of users [CGH09, CJKT09]. Service providers also have an interest in receiving adequate payment before authorizing access to the database. Priced oblivious transfer [AIR01] provides a privacy-friendly solution based on an encrypted prepaid account.

1.3 Outline and Summary of Contribution

Outline. We define what we mean by a cryptographic protocol in Chapter 2 and present the state-of-the-art methodology for constructing such protocols in a secure way from smaller building blocks. We also introduce the building blocks and sub-protocols that we use in the rest of the thesis. This chapter also describes a new building block: randomizable non-interactive proofs.

The rest of the dissertation is divided into two parts: privacy enhanced certification schemes are the topic of Part I. This includes anonymous credentials, their building blocks, and their features, as well as electronic cash. Part II is concerned with protocols for the private access to data.

The final chapter of the thesis draws conclusions and discusses ways for extending and deepening the techniques presented in Part I and Part II.

Summary of Contribution. This thesis contains contributions to the theory of protocols for the anonymous release of certified data in Part I: Anonymous Credential Related Schemes and to the theory of protocols for the private access to data in Part II: Oblivious Transfer Related Schemes. Chapter 3 and Chapter 8 give more specific introductions to these protocol families and discuss related work. To allow for an in-depth treatment we discuss the technical contributions in Section 3.5 and Section 8.3, respectively.

The goal of this section is to allow for the assessment of the overall contribution of the work presented in this thesis. All content chapters have been published in the proceedings of peer-reviewed international conferences: [BCKL08, BCKL09, CHK+06, BCC+09, KFF+07, CKRS09].

Chapter 2: Preliminaries. In this chapter we introduce cryptographic theory and building blocks that are used both in Part I and Part II. Non-interactive randomizable proofs are a novel contribution presented in this chapter. Such proofs allow a randomizing party to rerandomize a proof such that the randomized proof looks like a freshly generated proof. We make use of such proof systems in our construction of delegatable anonymous credentials in Chapter 7.

Chapter 3: Introduction to Part I.

Chapter 4: P-signatures. We construct a signature with an efficient non-interactive proof of knowledge of a signature on a committed message in the common reference string (CRS) model, and an efficient issuing protocol. Such a signature can take over the role of a CL-signature [CL02b] and is an important building block for the construction of protocols for the anonymous release of certified data.
This work was published in [BCKL08] and is joint work with Mira Belenkiy, Melissa Chase, and Anna Lysyanskaya.

Chapter 5: Compact e-Cash Based on a Common Reference String. We use such a P-signature scheme and pseudo-random functions with an efficient non-interactive proof to construct compact e-cash in the CRS model. In this way we remove reliance on an idealized hash function that has been taken as a given in most previous e-cash schemes.
This work was published in [BCKL09] and is joint work with Mira Belenkiy, Melissa Chase, and Anna Lysyanskaya.

Chapter 6: Periodic n-Times Spending. We demonstrate how to show and verify anonymous credentials in a way that cryptographically assures the verifier that in total the credential was not shown more than n times per time period (to this or other verifiers). The restrictions only apply for the current time period. In a new time period, identified through a different time period identifier, credentials can again be shown n times anonymously. We point out applications to anonymous sensor reports and anonymous online gaming, as well as to other subscription-based services.
This work was published in [CHK+06] and is joint work with Jan Camenisch, Susan Hohenberger, Anna Lysyanskaya, and Mira Belenkiy.

Chapter 7: Delegatable Anonymous Credentials. The efficient delegation of anonymous credentials was a long outstanding open problem. Credentials are delegated similarly to public key certificates that form a certification hierarchy: the owner of a certificate can extend the certification chain by creating a certificate for the next level. In delegatable anonymous credentials, parties only know each other under unlinkable pseudonyms, and can receive, delegate, and show credentials anonymously using different pseudonyms.
This work was published in [BCC+09] and is joint work with Mira Belenkiy, Jan Camenisch, Melissa Chase, Anna Lysyanskaya, and Hovav Shacham.

Chapter 8: Introduction to Part II.

Chapter 9: Efficient Oblivious Augmented Maps. We show how to use oblivious transfer for location-based services (LBS). The setting is more complex than is typical in cryptography. The protocol involves multiple LBS providers and mobile users that are subscribers of a mobile operator. The mobile operator acts as a semi-trusted intermediary that knows the users' location and has contractual relationships with the LBS providers. The privacy goal of the protocol is to reveal as little information as possible. In particular, we achieve service privacy, meaning that the mobile operator and the LBS providers do not learn which service the user accesses, and location privacy towards the LBS providers. As even the provider of the service consumed by the user is excluded from learning which users are accessing his service, we provide an anonymous payment mechanism that allows the LBS subscription fees collected by the mobile operator to be distributed to the LBS providers based on the popularity of their services.
This work was published in [KFF+07] and is joint work with Sebastian Faust, Lothar Fritsch, Bartek Gedrojc, and Bart Preneel.

Chapter 10: Authorized Private Searches on Public Key Encrypted Data. Searching on encrypted data has many privacy-protecting applications. We take the secure implementation of a data retention scheme as an example. Data is retained in encrypted form using a public key encryption scheme, such that even the retaining server cannot decrypt the retained entries. In order to decrypt the entries, the police (or a secret service) has to obtain a search trapdoor for a keyword from a high security server. We describe such a system and show how the police can obtain search trapdoors for keywords that have been authorized by a third party, e.g., a judge, without revealing the keywords to the high security server. This protects the secrecy needs of investigators while assuring accountability and the privacy needs of the data subject.
This work was published in [CKRS09] and is joint work with Jan Camenisch, Alfredo Rial, and Caroline Sheedy.

Chapter 11: Conclusion and Future Research.

These and seven more articles have been published as part of the research for this thesis and can be found in the list of publications on page 215.

Collaboration. The work presented in Chapters 4, 5, 6, and 7 is the product of a long-term collaboration with Prof. Anna Lysyanskaya and her students Mira Belenkiy and Melissa Chase at Brown University. Many of the core ideas were developed in interactive sessions, thus credit belongs to everyone. The initial ideas were then worked out in a long process of formalization and refinement of the security protocols and their security definitions.

Chapters 9 and 10 were both started as collaborations with researchers who visited our research group: Bartek Gedrojc and Caroline Sheedy. In both works I provided the original idea and participated in the completion of the research. In the latter work I also collaborated with Alfredo Rial, then an Erasmus master student and now a fellow researcher.

Throughout my thesis I worked with and was guided by Jan Camenisch, the technical leader of my work in the PRIME [pri09a] and PrimeLife [pri09b] research projects, and my supervisor Prof. Bart Preneel.

Chapter 2

Preliminaries

We start the chapter with a short discussion of reductionist security, a subfield of provable security. Next, we introduce notation, assumptions, and cryptographic primitives. Special attention is paid to zero-knowledge and in particular to non-interactive zero-knowledge proof systems. The main aim of this chapter is to make the thesis self-contained, and we advise the reader to treat it primarily as a reference. Several of the techniques, in particular f-extractable, randomizable, non-interactive proofs of knowledge, however, have been developed as part of the research for this thesis and are published in [BCKL08] and [BCC+09]. Further information on provable security can be found in [Gol00, Gol04, Bel98, Can01].

2.1 Modular and Reductionist Protocol Security

A cryptographic protocol is a protocol that performs a security (or privacy) related function with the help of cryptographic building blocks. Its specification consists of a set of algorithms that make use of the cryptographic building blocks. The specification also describes how protocol participants should use these algorithms when exchanging protocol messages. When designing a secure cryptographic protocol one needs to take every possible adversarial behavior into account. To establish the security of a protocol one has to consider the security properties of the protocol's building blocks as well as their composition to check whether the protocol meets its security notion.

The study of basic building blocks for cryptographic protocols is an important area of ongoing research. A cryptographic assumption is the belief that such a building block, be it a cryptographic primitive (such as a cryptographic hash function) or a number theoretic one-way function (such as modular exponentiation), is hard to break for any ‘realistic’ adversary. We call an assumption weaker than another assumption if it is more ‘rational’ to assume that breaking the corresponding building block is hard. We say that a protocol's security is based on a set of assumptions if breaking the protocol implies breaking at least one of its building blocks (that are secure given the assumptions). This is generally proved using a reduction. Assuming the existence of a successful attacker A for the protocol, a security reduction B is an efficient algorithm that uses A to break at least one of the protocol's building blocks. Formal relations between building blocks, and thus between assumptions, are established in the same way.

While it is preferable to base the security of a protocol on the weakest possible assumptions, as they provide the highest security, stronger assumptions often allow for constructions that have higher performance, or have features that could hitherto not be implemented at all. The use of groups with bilinear maps for constructing identity-based encryption schemes is a prominent example. Schemes are called theoretical if they are too inefficient for practical use. The goal is to come up with practical constructions that are both based on reasonable assumptions and efficient, especially for protocols for which until now only theoretical constructions are known.

A security notion is a specification of what a protocol should and should not do. In particular, it says what an adversary should not be able to achieve when attacking a protocol, and what he is allowed to achieve; e.g., encrypted message communication reveals some information about the size of the encrypted message and about the source and destination of the message, but not the plaintext of the message unless the key is known. Different security notions have been proposed for different protocols, together with theoretical and practical constructions. These security notions can be roughly divided into attack-based (also known as property-based) and ideal-functionality-based (also known as simulation-based) definitions.

Attack-based definitions. These definitions try to describe all conceivable attacks. A cryptographic scheme is secure if all attackers have negligible probability of succeeding in each of the attacks. More formally, each attack is defined as a game between the attacker and a challenger that the attacker needs to win. The challenger generates all secret parameters and simulates the environment in which the cryptographic scheme is expected to work.

As an example, the indistinguishability under chosen plaintext attack (IND-CPA) property of public key encryption is defined as an IND-CPA game in which the attacker receives a public key and then provides the challenger with two messages m0 and m1. The challenger flips a bit b and returns the encryption of mb. The attacker succeeds if he can guess the bit b with probability non-negligibly greater than 1/2. For indistinguishability under chosen ciphertext attack (IND-CCA), this needs to hold even in an environment in which the attacker can ask for the decryption of messages other than the challenge ciphertext. This is modeled by a modified IND-CCA2 game in which the challenger provides the attacker with an additional oracle that decrypts such messages.
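
The IND-CPA game can be written down as a small harness. The sketch below pits it against textbook (unpadded) RSA with illustrative toy parameters; since textbook RSA is deterministic, the attacker can simply re-encrypt both candidate messages and compare, winning every game. This is exactly why IND-CPA security forces encryption to be randomized:

```python
import random

# IND-CPA game harness, challenged with textbook (unpadded) RSA using
# tiny demo parameters. Deterministic encryption loses the game with
# probability 1: the attacker re-encrypts both messages and compares.
p, q, e = 61, 53, 17
N = p * q

def encrypt(pk, m):
    n, exp = pk
    return pow(m, exp, n)

def ind_cpa_game(attacker) -> bool:
    """Returns True if the attacker guesses the challenge bit b."""
    pk = (N, e)
    m0, m1 = attacker.choose(pk)          # attacker picks two messages
    b = random.randrange(2)               # challenger flips a bit
    c = encrypt(pk, (m0, m1)[b])          # challenge ciphertext
    return attacker.guess(pk, c) == b

class ReencryptAttacker:
    def choose(self, pk):
        self.m0, self.m1 = 42, 1337
        return self.m0, self.m1
    def guess(self, pk, c):
        # Deterministic encryption: re-encrypt m0 and compare with c.
        return 0 if encrypt(pk, self.m0) == c else 1

wins = sum(ind_cpa_game(ReencryptAttacker()) for _ in range(100))
print(wins)   # the attacker wins all 100 games
```

For a secure (randomized) scheme, no efficient attacker should win non-negligibly more often than the 50% achievable by random guessing.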

Ideal functionality based definitions. In this case one starts by describing an idealized version of the cryptosystem that can never be broken. The ideal system usually involves an abstract third party that is always trusted and that can establish secure communication channels with the parties involved in the protocol.

A proof of security for a cryptographic scheme under such a definition includes the description of a simulator that interacts with the idealized system. The simulator may use a copy of the attack algorithm internally. A protocol is secure if the simulator and the idealized system produce output that is indistinguishable from the output of the attacker and the real protocol. (This captures the notion that the cryptographic scheme will always behave like the idealized scheme, independent of the environment in which it is employed.) Now the game that the attacker needs to win is the standard one of forcing the simulation to fail. This captures the idea that, as long as the simulation is successful, all damage that can be done (e.g., secret information learned) by the adversary interacting with the real protocol could also be done (learned) by the simulator interacting with the ideal protocol alone.

Figure 2.1: Ideal functionality based definition: no environment E can distinguish whether it is interacting with an attacker A attacking the real protocol R, or with a simulator S (having black-box access to A) interacting with the ideal protocol I.

Another difference between attack-based and ideal functionality based definitions is how the definition models the environment in which the protocol is expected to work. In attack-based definitions the environment is hard-coded into the challenger, and there is no guarantee that the protocol would still be secure if attacked in a different way. Ideal functionality based definitions can be formulated in a way that they guarantee security for all possible environments. In such a definition the environment is an arbitrary algorithm that interacts either with the real protocol R (and the adversary A) or with the ideal protocol I (and the simulator S). If no environment E can distinguish these two settings, the protocol is said to be secure. See Figure 2.1.

Two properties make this approach interesting: (a) the ideal functionality captures all security properties in one concise abstraction; (b) the environment captures all possible situations in which such a protocol may ever be used. Extensive frameworks based on these two ideas allow one to reason about the preservation of security properties of protocols under composition [Can01, PW00]. Here secure composition means that the composed protocol remains secure whenever one of its ideal sub-protocols is replaced by a secure real protocol. The most common framework is the universal composability framework [Can01].

Remark 2.1.1 Security definition styles and proof techniques may be easily applicable to some schemes, but difficult to apply to others. For instance, ideal functionality based definitions are particularly well suited for secure two-party and multi-party computation protocols: the function or circuit that should be computed corresponds in a straightforward manner to the ideal functionality. An adequate ideal functionality based definition for digital signatures, however, was only found after several attempts [BH04, Can04]. In this thesis we give property-based definitions for our protocols. However, a better understanding of the desirable and achievable properties of privacy enhancing protocols contributes to the longer-term goal of defining unified and comparable ideal functionality based definitions for privacy enhancing protocols.

2.2 Notation

Most of our notation is standard and can be skipped by readers familiar with the cryptographic literature. All computation is modeled by probabilistic polynomial time (p.p.t.) Turing machines. For easy reference, we summarize the notation that we use to represent parties, algorithms, and protocols between two parties. We also recall some basic results about algebraic groups frequently used in cryptography.

Let {0, 1} be the set containing the bits 0 and 1, and let {0, 1}* be the set of bit strings. Let N denote the natural numbers 0, 1, . . .; then for k ∈ N, 1^k is the bit string of k ones, and {0, 1}^k is the set of bit strings of length k. We write ε for the empty string. For a bit string a, |a| signifies its length and a‖b signifies the concatenation of the two strings a and b. Let S be a finite set. We denote the size of the set by |S| and write x ← S to mean that x is chosen uniformly at random from S.
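The notation above maps directly onto code; as a small sketch (Python, with the standard `secrets` module for uniform sampling; all variable names are illustrative):

```python
import secrets

S = {3, 5, 7, 11}
x = secrets.choice(sorted(S))   # x <- S: uniform sample from a finite set
a, b = '101', '01'              # bit strings in {0,1}*
concat = a + b                  # a || b, concatenation
length = len(a)                 # |a|, the length of a
ones = '1' * 8                  # 1^k, the string of k ones (here k = 8)
print(x in S, concat, length, ones)  # True 10101 3 11111111
```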

Page 39: Cryptographic Protocols For Privacy Enhanced Identity Management

NOTATION 15

2.2.1 Protocol Participants and Algorithms

We denote the entities E participating in a protocol by calligraphic letters. Entities (also called parties) make use of probabilistic polynomial time (p.p.t.) algorithms for computation. We write (out1, . . . , outm) ← Algorithm(in1, . . . , inn, ρ) to denote an algorithm Algorithm that takes inputs ini and produces outputs outi. The random tape ρ is usually omitted. We say that an algorithm is p.p.t. if there exists a polynomial poly such that its running time is upper bounded by poly(|in1| + · · · + |inn|).

For instance, a p.p.t. key generation algorithm Keygen(1^k) must run in time polynomial in the security parameter k. We refer to the unnamed algorithm of an entity E by the same calligraphic letter (E1, E2 if an entity has two algorithms). We will thus often use E to denote both the entity and the algorithm that it uses for computation.

In our analysis, algorithms can receive random variables as input and produce random variables as output. We use the standard GMR [GMR88] notation to describe probability spaces. Let pred be a Boolean predicate. With out ← Algorithm(in) : pred(out) we denote the event that pred(out) is true when out is obtained by running Algorithm on input in, and Pr[out ← Algorithm(in) : pred(out)] denotes the probability of this event.

An entity E with algorithm (mout, stateout) ← AlgE(min, statein) communicates with other entities by taking a message min generated by the algorithm of some entity (and its own state) as input and producing a message mout directed at the algorithm of another (or the previous) entity (as well as an updated state for possible future runs) as output. The incoming message min is ⊥ if the entity is initiating the protocol. We introduce a special notation to refer to two such algorithms of entities E and E′ that exchange messages in an orderly fashion. We interpret the initial and final states of AlgE and AlgE′ as in, out and in′, out′ respectively. To represent a two-party protocol, we write as a short form either (out, out′) ← AlgE(in) ↔ AlgE′(in′) or (out, out′) ← Alg(E(in), E′(in′)). The latter notation can be generalized to more than two parties.

We also introduce a notation to say that an algorithm can call another algorithm as an oracle. We write A^{OAlg(sk,·)} to denote that the algorithm A can make use of the algorithm Alg. In the example above, the algorithm is run on input sk, which is hidden from A. The second input is picked freely by A. Oracles can keep internal state Qtype; e.g., Qx might denote the set of inputs on which the oracle has been called. As a convention, a call to an oracle takes unit time, so that the algorithm is restricted to polynomially many oracle queries.

In our constructions we aim at practical efficiency. This means, for example, that we try to minimize the number of exponentiations and the number of calls to the cryptographic algorithms that we use as building blocks. We define the security of a protocol with respect to all possible p.p.t. attack algorithms.


2.2.2 Groups and Generators

A group (in multiplicative notation) is a set G that is closed under an associative and invertible binary operation with identity element 1 ∈ G, i.e., for all a, b ∈ G, ab ∈ G; for all a, b, c ∈ G, (ab)c = a(bc); for all a ∈ G there exists a^{-1} ∈ G such that aa^{-1} = 1; and for all a ∈ G, a1 = a.

1. In a commutative group (also called Abelian group) the group operation is commutative, i.e., for all a, b ∈ G, ab = ba.

2. A group is called finite if it has only a finite number of elements.

3. The number of elements |G| in a finite group G is called its order.

4. A group G is cyclic if there exists an element g such that for all a ∈ G there exists some x ∈ N such that g^x = a. We say that g generates the group or that g is a generator of G.

5. For a ∈ G the set of powers {a^x | x ∈ N} forms a cyclic subgroup H of G, called the subgroup 〈a〉 generated by a.

6. The order of a ∈ G is defined as t = min{x ∈ N \ {0} | a^x = 1}. If such a t does not exist, the order is said to be infinite. For an a of finite order, t corresponds to the size of the subgroup generated by a. Consequently, t = |〈a〉|.

Theorem 2.2.1 (Lagrange’s Theorem) Let G be a finite group. The order of a subgroup H always divides the order of G.

Gauss proved in 1801 in his Disquisitiones arithmeticae¹ that there exists a subgroup for every number dividing the order of a cyclic group. This extends the result of Lagrange, which stipulates that the order of a subgroup is always a divisor of the corresponding group order.

Theorem 2.2.2 Every subgroup of a cyclic group G is cyclic; in fact, there exists exactly one subgroup H for every divisor t of the order |G| of G. Furthermore, if g is a generator of G, then g^{|G|/t} is a generator of the subgroup H of order t.

As a simple corollary, a prime order group G has only two subgroups: the trivial one-element group 〈1〉 and G itself.

¹ In the original Latin: http://resolver.sub.uni-goettingen.de/purl?PPN235993352.
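Both theorems are easy to check computationally in a small cyclic group. A sketch in Python using the toy group Z11* of order 10 with generator 2 (all parameter choices here are purely illustrative):

```python
# Verify Lagrange's theorem and Theorem 2.2.2 in the cyclic group
# Z_11^* of order 10, whose generator g = 2 is checked below.
p = 11
G = set(range(1, p))
g = 2
assert {pow(g, x, p) for x in range(p - 1)} == G  # g generates Z_11^*

def subgroup(a):
    """The subgroup <a> of powers of a modulo p."""
    return {pow(a, x, p) for x in range(1, p)}

order = len(G)                           # |G| = 10
for t in (1, 2, 5, 10):                  # every divisor t of |G| ...
    H = subgroup(pow(g, order // t, p))  # ... g^{|G|/t} generates the
    assert len(H) == t                   # subgroup of order t
    assert order % len(H) == 0           # Lagrange: |H| divides |G|

print(sorted(subgroup(pow(g, order // 5, p))))  # [1, 3, 4, 5, 9]
```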


Groups and subgroups modulo a prime q. The non-zero remainders modulo q form a cyclic multiplicative group Z∗q. The order of the group is q − 1.

According to Theorem 2.2.2 there exists a cyclic subgroup H of order t for every divisor t | q − 1 of the order of Z∗q. This subgroup is particularly interesting for cryptography if t = p with p a large prime, for example p ≥ 2^160. Algorithms for creating the modulus and generators for such groups can be found in [MvOV96].
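The standard recipe (cf. [MvOV96]) raises a random element to the power (q − 1)/p to land in the order-p subgroup. A sketch with toy parameters q = 23, p = 11 (real parameters would be hundreds of bits):

```python
# Construct a generator of the prime-order-p subgroup of Z_q^*.
q, p = 23, 11              # toy parameters with p | q - 1
assert (q - 1) % p == 0
r = 5                      # any element whose (q-1)/p power is not 1
h = pow(r, (q - 1) // p, q)
assert h != 1
assert pow(h, p, q) == 1   # h has order exactly p (p is prime)
H = {pow(h, x, q) for x in range(p)}
print(len(H))  # 11: <h> is the prime-order subgroup
```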

Groups and subgroups modulo a composite n. The multiplicative group Z∗n of remainders modulo a composite n is of order ϕ(n) = (p1 − 1)p1^{k1−1} · · · (pr − 1)pr^{kr−1}, with pj the distinct primes of the prime factorization n = p1^{k1} · · · pr^{kr}.

Such groups are of interest to cryptography if the composite n is hard to factor, e.g., if it is the product of two large primes p and q. Because of its use in the RSA cryptosystem, we call such a composite an RSA modulus. As in this case the order ϕ(n) = (p − 1)(q − 1) is hard to compute from n alone, we also refer to such groups as hidden order groups.

These groups are typically non-cyclic. However, if n = pq is a special RSA modulus with p = 2p′ + 1 and q = 2q′ + 1 the corresponding safe primes, then the subgroup QRn of quadratic residues modulo n is a cyclic group under multiplication. A prime p′ (and q′) for which a corresponding prime p = 2p′ + 1 (respectively q = 2q′ + 1) exists is called a Sophie Germain prime. The size of this group is p′q′ = ((p − 1)/2) · ((q − 1)/2). It can be seen that this group has two subgroups of order p′ and q′. Thus all but p′ + q′ − 1 of the elements of QRn are generators.
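These counts can be checked exhaustively for toy safe primes (p′ = 11, q′ = 23, so p = 23, q = 47; real moduli are of course far too large to enumerate):

```python
import math

p_, q_ = 11, 23                 # Sophie Germain primes (toy sizes)
p, q = 2 * p_ + 1, 2 * q_ + 1   # the safe primes 23 and 47
n = p * q

# QR_n: the squares of the units modulo n.
QRn = {x * x % n for x in range(1, n) if math.gcd(x, n) == 1}
assert len(QRn) == p_ * q_      # |QR_n| = p'q' = ((p-1)/2)*((q-1)/2)

# Non-generators lie in the subgroups of order p' or q'.
non_gen = {x for x in QRn if pow(x, p_, n) == 1 or pow(x, q_, n) == 1}
print(len(QRn), len(non_gen))   # 253 33: all but p'+q'-1 elements generate
```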

Groups with a bilinear map. Let G1, G2, and GT be groups of prime order p. A bilinear map e : G1 × G2 → GT must satisfy the following properties: (a) Bilinearity: the map e is bilinear if e(a^x, b^y) = e(a, b)^{xy}; (b) Non-degeneracy: for all generators g ∈ G1 and h ∈ G2, e(g, h) generates GT; (c) Efficiency: there exists a p.p.t. algorithm BMGen(1^k) that outputs (p, G1, G2, GT, e, g, h) to generate the bilinear map, and an efficient algorithm to compute e(a, b) for any a ∈ G1, b ∈ G2.

All known instantiations of groups with a bilinear map use elliptic curves and elliptic curve pairings such as the Tate or the Weil pairing. An overview of the different pairing types used in cryptography is given in [GPS08].
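Real pairings require elliptic curve machinery; purely to illustrate the bilinearity and non-degeneracy conditions, here is a toy symmetric "pairing" on the additive groups (Zp, +). It is cryptographically useless, because discrete logarithms in (Zp, +) are trivial, but the algebra of the definition is visible:

```python
# Toy bilinear map: G1 = G2 = (Z_p, +), GT = the order-p subgroup of
# Z_q^* generated by u; e(a, b) = u^{a*b mod p}. Insecure by design.
p, q, u = 11, 23, 2        # u = 2 has order 11 modulo 23

def e(a, b):
    return pow(u, a * b % p, q)

a, b, x, y = 3, 5, 4, 7
# Bilinearity: e(x*a, y*b) = e(a, b)^{xy}  (group operation is addition,
# so "exponentiation" in G1, G2 is multiplication by a scalar mod p).
assert e(x * a % p, y * b % p) == pow(e(a, b), x * y, q)
print(e(1, 1))  # 2: e of the generators generates GT (non-degeneracy)
```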

2.2.3 Asymptotic Security

We define our assumptions and the security of protocols asymptotically. Instead of bounding the concrete success probability of the attacker, we require that this probability decreases super-polynomially as the security parameter k increases. We say that a function ν : Z → R is negligible if for all integers c there exists an integer K such that ∀k > K, |ν(k)| < 1/k^c.

2.3 Assumptions

Every cryptographic protocol necessarily has numerous connections with the real world: Who are the attackers? How can they interact with the protocol? Which building blocks and mathematical structures are used in the implementation of the protocol? How do these building blocks and structures behave under attack?

During the design of a protocol we make abstractions that generalize and simplify the real world into a formal model. Ideally, a correct instantiation of a cryptographic protocol secure in the formal model is also secure in all environments in which it would later be employed. We already discussed the importance of security definitions, either attack-based (with a sufficiently general environment description that includes all relevant attacks) or simulation-based (with an appropriate ideal functionality). A second requirement is to correctly capture all assumptions of the model that are made during the definition, construction, and proof of the protocol.

2.3.1 Communication, Trust, and Setup Assumptions

Communication assumptions cover the communication mechanisms available to parties. Trust assumptions model assumptions about the behavior of parties, such as, e.g., certification authorities. Setup assumptions can be seen as trust assumptions that are restricted to the time and place in which the setup of the protocol is performed. The common reference string model [BFM88] is a type of setup assumption that assumes the existence of a binary string drawn from a random distribution. This string can either be computed by a trusted party, or be derived from some natural phenomenon, i.e., a random pattern found in nature.

Communication and trust assumptions. While only an authenticated channel can give hard guarantees about the identity of communication partners, most channels leak information about the sender and the recipient of the communication, e.g., their internet protocol (IP) address. Channels that hide this information are called anonymous communication channels and are important building blocks of privacy enhancing protocols.

A real world deployment of a cryptographic protocol often needs to bind cryptographic players to parties in the real world. A common approach for achieving this binding is to associate cryptographic players with their public key. In a second step these public keys are bound to real world entities. This binding needs to be modeled by some form of assumption. The most common of them is that of a trusted certification authority that certifies public keys as part of a public key infrastructure. Using such certified public keys it is possible to establish an authenticated (and confidential) channel between two public key holders [DH76, Can01, Can04].

The binding of cryptographic players to parties in the real world is also relevant in settings where parties are anonymous or pseudonymous. Mechanisms for achieving such a binding in a privacy friendly way will be the topic of Part I of this thesis. Also in these settings one has to put some trust in those organizations that vouch for the real world properties of the user, such as, e.g., his age.

Common reference string model. The common reference string (CRS) model captures the assumption that a trusted setup exists in which all involved parties get access to the same string params taken from some distribution D. Schemes proven secure in the CRS model are secure given that the setup was performed correctly. The common reference string model is a generalization of the common random string model, in which D is the uniform distribution over bit strings. As stated in [CF01], the CRS model is equivalent to other models with a trusted setup, e.g., the reference string model [FF00] and the public parameters model [Dam00].

The CRS model was first introduced in the study of non-interactive zero-knowledge proofs [BFM88]. The implications of the CRS model are, however, much bigger. It was for instance shown that in the absence of an honest majority a CRS (or a random oracle) is necessary in order to realize many desirable ideal functionalities (e.g., commitments [CF01]) in a way that allows for secure composition with other protocols [CKL06].

2.3.2 Idealized Assumptions

The random oracle model and the generic group model are idealized assumptions about cryptographic primitives (hash functions) and mathematical structures (groups), respectively, that are used to simplify the analysis.

Random oracle model. A random oracle behaves like a random function that maps every possible query value in the input domain to a random response from its output domain. In cryptography a random oracle is used as a black box that acts as an idealized abstraction of a cryptographic hash function in security proofs [FS87, BR93].

Proofs in the random oracle model are useful in practice, as they allow one to check existing protocols for weaknesses that are independent of the hash function used in a concrete instantiation. Proofs in the random oracle model have, however, been criticized [CGH04]. The following are two important points of this criticism: 1) a random oracle cannot be implemented by any efficient algorithm; 2) there exist insecure protocols (sometimes called contrived counterexamples) that can be proven secure in the random oracle model.

The intuition is that the security of the protocol relies on some unknown property of the function used in the cryptographic protocol that, in the best case, holds for the concrete hash function used in a real world instantiation, and in the worst case, as in the counterexamples, can only hold for the random oracle itself.

In the area of privacy enhancing protocols there are, however, also other reasons for avoiding the random oracle model and primitives that make use of a cryptographic hash function. For instance, to achieve anonymity in certification protocols one can use the following strategy: instead of revealing a signature, one might want to prove possession of a signature on a committed message. If the signature involves a hash function that is modeled as a random oracle, a concrete hash function needs to be fixed before proving the relation between the signature and the commitment. As the committed message must first be hashed and then signed, a proof that a committed message has been signed must not only prove knowledge of a valid signature on the resulting hash, but must also prove that the pre-image of this value is contained in the given commitment. For most modern hash functions it is completely unclear how to do this efficiently. Another area in which hash functions can be an obstacle are the randomizable non-interactive proof systems introduced in Section 2.5.6.

Generic group model. The generic group model [Sho97, Mau05] is an idealized cryptographic model where the adversary is only given access to a randomly chosen encoding of a group, instead of the efficient encodings, such as those of the finite field or elliptic curve groups, used in practice.

The model includes an oracle for the group operation. The oracle takes the random encodings of two group elements a and b as input and outputs the random encoding of the element ab. If the group should allow for a pairing operation, this operation would be modeled as an additional oracle.

One of the main uses of the generic group model is to analyze number theoretic complexity assumptions. An analysis in the generic group model can answer the question: “What is the fastest generic algorithm for breaking a cryptographic complexity assumption?” A generic algorithm is an algorithm that only makes use of the group operation and does not consider the encoding of the group. This question was answered for the discrete logarithm problem by Shoup [Sho97]. Other results in the generic group model include [MW98].

The generic group model suffers from some of the same problems as the random oracle model. In particular, it has been shown [Den02], using a similar argument as in [CGH04], that there exist cryptographic schemes which are provably secure in the generic group model, but which are trivially insecure once the random group encoding is replaced with any efficiently computable instantiation of the encoding function.

2.3.3 Complexity Assumptions

In most cases in cryptography we face a situation in which an all-powerful adversary can in principle always break a cryptographic protocol. Such protocols are only secure against a restricted adversary. The standard restriction that we consider in this work is that of a probabilistic polynomial time (p.p.t.) adversary. We use complexity theoretic assumptions to state problems believed to be hard for such an adversary.

The main purpose of this section is to serve as a reference for the interpretation of the specific security guarantees offered by a protocol. We will refer back to this section in future chapters whenever we make use of a new assumption.

The security of our protocols relies on a number of complexity assumptions related to the factoring of large numbers and the discrete logarithm problem. Just assuming the difficulty of these two basic problems is, however, insufficient for the construction and proof of most cryptographic protocols. Even for the most basic asymmetric cryptographic primitives, the RSA cryptosystem and the Diffie-Hellman key exchange protocol, the exact relation of the security of these schemes to the assumption that factoring and computing discrete logarithms are hard is still in question. To cope with this situation, cryptographers introduced separate assumptions, called the RSA assumption and the computational Diffie-Hellman (CDH) assumption respectively ([Mau94] proves the equivalence between CDH and discrete logarithm for certain groups). Intuitively, the RSA assumption states that given an RSA modulus n, an integer e coprime with ϕ(n), and an integer c ∈ Z∗n, it is hard to compute an integer m such that m^e ≡ c (mod n). In turn, the computational Diffie-Hellman assumption requires that given g^x, g^y ∈ G it is hard to compute g^{xy}.
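The RSA problem becomes easy once the factorization, and hence ϕ(n), is known; this is exactly the key holder's trapdoor. A sketch with toy parameters (real moduli are 2048+ bits; the primes below are purely illustrative):

```python
import math

# An RSA problem instance (n, e, c): find m with m^e = c (mod n).
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
assert math.gcd(e, phi) == 1

m = 1234
c = pow(m, e, n)            # the instance seen by the attacker
d = pow(e, -1, phi)         # trapdoor: d = e^{-1} mod phi(n),
assert pow(c, d, n) == m    # so c^d recovers the e-th root m
print(c)
```

Without the factorization, no way of computing d (or the e-th root directly) is known, which is what the RSA assumption formalizes.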

Cryptographic complexity assumptions come in a variety of flavors [Nao03]. Especially in the case that we do not want to rely on random oracles, new cryptographic schemes are often proven secure under new and unconventional assumptions. We motivate the use of two such flavors: one-more assumptions and decisional assumptions.

The Decisional Diffie-Hellman (DDH) assumption requires that given g^x, g^y ∈ G it is hard to distinguish g^{xy} from a random element in G. This assumption is equivalent to the semantic security of the ElGamal encryption scheme [Bon98] and has been used for the construction of pseudo-random functions [NR97]. Many other decisional assumptions have been introduced for the analysis of public key encryption schemes and pseudo-random functions.

One-more assumptions have been heavily used in the analysis of cryptographic schemes. The need for such assumptions naturally arises if an attacker is able to interact in a security reduction with a challenger, or a simulator, that provides him with q solution instances of a hard problem. The attacker himself, however, should be unable to produce a (q+1)-th solution for a related instance. For instance, in a signature scheme secure against chosen message attacks, the attacker can obtain q signatures of his choice, but should not be able to produce a further signature himself. While it is desirable to avoid one-more assumptions as they are inherently stronger [Che06, KM08], this is not always feasible, especially if the proof does not make use of random oracles, if the reduction should be tight, and if the protocol should be efficient, i.e., have small keys and messages.

The description of the assumptions below is only semi-formal. A formal definition could be described as a game between a challenger and an adversary. The challenger sets up the groups involved in an assumption, generates the assumption instance, and (for an interactive assumption) answers the assumption’s oracle queries. The assumption then requires that no p.p.t. adversary can win such a game.

Assumptions for commitment, encryption, and NIZK schemes. The following assumptions underlie the security of the encryption and commitment schemes used in our protocols. As commitments based on the External Diffie-Hellman (XDH) and Decision Linear (DLIN) assumptions play an essential role in the non-interactive proof system [GS08] that we employ, the witness indistinguishability and zero-knowledge properties of this proof system also rely on one of these two assumptions.

Definition 2.3.1 (Decisional Composite Residuosity (DCR) [Pai99]) Given a random RSA modulus n and a random z ∈ Z_{n^2}, decide whether z is an n-residue modulo n^2 or not, i.e., whether there exists a y such that z ≡ y^n (mod n^2). The DCR assumption holds if all p.p.t. algorithms have negligible (with respect to the bit length of n) advantage in solving the above problem.
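With the factorization of n known, DCR becomes easy: the n-residues modulo n^2 are exactly the z with z^{ϕ(n)} ≡ 1 (mod n^2), which is the trapdoor underlying Paillier's cryptosystem [Pai99]. A sketch with toy parameters:

```python
# DCR trapdoor sketch: deciding n-residuosity given phi(n). Toy sizes.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
n2 = n * n

def is_n_residue(z):
    """z is an n-th power mod n^2 iff z^{phi(n)} = 1 mod n^2
    (valid here since gcd(n, phi(n)) = 1)."""
    return pow(z, phi, n2) == 1

y = 42
z = pow(y, n, n2)               # an n-residue by construction
assert is_n_residue(z)
assert not is_n_residue(1 + n)  # 1+n has order n: not an n-residue
print(is_n_residue(z))  # True
```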

Definition 2.3.2 (Decisional Bilinear Diffie-Hellman (DBDH)) Given g, g^a, g^b, g^c ∈ G1, h, h^a, h^b ∈ G2, and Z ∈ GT for random exponents a, b, c ∈ Zp, decide whether Z = e(g, h)^{abc} or a random element from GT. The DBDH assumption holds if all p.p.t. algorithms have negligible (with respect to the bit length of p) advantage in solving the above problem.

Definition 2.3.3 (Decision Linear (DLIN) [BBS04]) Given g, g^a, g^b, g^{ac}, g^{bd}, Z ∈ G1 and h, h^a, h^b ∈ G2 for random exponents a, b, c, d ∈ Zp, decide whether Z = g^{c+d} or a random element in G1. The Decision Linear assumption holds if all p.p.t. algorithms have negligible (with respect to the bit length of p) advantage in solving the above problem.

Definition 2.3.4 (External Diffie-Hellman (XDH)) The XDH assumption requires that the DDH assumption holds in groups with a bilinear map. By necessity this can only be the case for an asymmetric bilinear map e : G1 × G2 → GT. Moreover, saying w.l.o.g. that DDH should hold in G1, there must not exist an efficiently computable homomorphism that maps elements of G1 to elements of G2.

If homomorphisms in both directions are excluded, and if DDH is also required to hold in G2, the combined assumption is called the Symmetric XDH (SXDH) assumption. The SXDH assumption was first used by Scott [Sco02], and has been discussed and used extensively since [BBS04, GR04, Ver04, BGdMM].

Assumptions for signature schemes. The signature schemes used in our protocols all rely on one or more of the following complexity theoretic assumptions. Some of these assumptions are one-more assumptions.

We say that it is hard to compute a value or a set of values v from Z∗n, Zp, or G (of order p) if all p.p.t. algorithms have negligible (with respect to the bit length of n or p, respectively) advantage in outputting such a value.

Definition 2.3.5 (Strong RSA (SRSA) [BP97, FO97]) Given an RSA modulus n and a random element c ∈ Z∗n, the Strong RSA assumption requires that it is hard to compute w ∈ Z∗n and an integer e > 1 such that w^e ≡ c (mod n).

We use a modulus n of the special form pq, where p = 2p′ + 1 and q = 2q′ + 1 are safe primes.

Definition 2.3.6 (q-Strong Diffie-Hellman (q-SDH) [BB04b, BB08]) Given g, g^x, g^{x^2}, . . . , g^{x^q} ∈ G for a random exponent x ∈ Zp, the q-Strong Diffie-Hellman assumption requires that it is hard to compute a pair (c, g^{1/(x+c)}) ∈ Zp × G.

Boyen and Waters [BW07] defined the q-Hidden SDH assumption over bilinear maps using symmetric groups e : G × G → GT. We give a definition over asymmetric maps e : G1 × G2 → GT. In a symmetric pairing setting our definition is identical to the Boyen-Waters definition of the q-HSDH assumption.


Definition 2.3.7 (q-Hidden Strong Diffie-Hellman (q-HSDH)) On input g, g^x, u ∈ G1, h, h^x ∈ G2 and {g^{1/(x+c_ℓ)}, h^{c_ℓ}, u^{c_ℓ}}_{ℓ=1...q} for random exponents x, c_1, . . . , c_q ∈ Zp, it is hard to compute a new tuple (g^{1/(x+c)}, h^c, u^c).

When (p, G1, G2, GT, e, g, h) and H = h^x are fixed, we refer to tuples of the form (g^{1/(x+c)}, h^c, u^c) as HSDH triples. Note that we can determine whether (A, B, C) forms an HSDH triple using the bilinear map e, as follows: given a tuple (A, B, C), we check that e(A, BH) = e(g, h) and that e(u, B) = e(C, h).

We extend the q-HSDH assumption further and introduce a new assumption that we call the q-Interactive HSDH assumption. We allow the adversary to adaptively query an oracle for HSDH triples on c_i of his choice.

Definition 2.3.8 (q-Interactive Hidden SDH (q-IHSDH)) The q-IHSDH assumption requires that for a random exponent x ∈ Zp it is hard to compute a tuple (g^{1/(x+c)}, h^c, u^c) given g, g^x, u ∈ G1 and h, h^x ∈ G2, even when given the possibility to make q queries to an oracle O_x(c) that returns g^{1/(x+c)}. The c in the HSDH triple output by the adversary must be different from the values it used to query O_x(·).

We relax the q-HSDH assumption further and introduce a new assumption we call the q-BB-HSDH assumption. Intuitively speaking, we allow the adversary to obtain the c_ℓ used in his challenge. We call this assumption q-Boneh-Boyen HSDH, because the adversary is given q Weak-BB signatures (g^{1/(x+c_ℓ)}, c_ℓ) for random messages c_ℓ. It is easy to see that q-BB-HSDH implies q-HSDH. Thus our generic group proof for q-BB-HSDH (see Appendix B.2) also establishes generic group security for q-HSDH.

Definition 2.3.9 (q-Boneh-Boyen HSDH (q-BB-HSDH)) On input g, g^x, u ∈ G1, h, h^x ∈ G2, random c_1, . . . , c_q ∈ Zp, and g^{1/(x+c_1)}, . . . , g^{1/(x+c_q)} for a random exponent x ∈ Zp, it is hard to compute a new tuple (g^{1/(x+c)}, h^c, u^c).

We introduce a new assumption we call q-Boneh-Boyen CDH. It is a relaxed version of CDH in which the adversary is also given q weak BB signatures as input. We can show that it is implied by the q-SDH assumption. We present it as a separate assumption to simplify our security proof. Note that we obtain generic group security from the generic group proof for q-SDH in [BB08].

Definition 2.3.10 (q-Boneh-Boyen CDH (q-BB-CDH)) On input g, g^x, g^y ∈ G1, h, h^x ∈ G2, random c_1, . . . , c_q ∈ Zp, and g^{1/(x+c_1)}, . . . , g^{1/(x+c_q)} for random exponents x, y ∈ Zp, it is hard to compute g^{xy}.


We introduce a new assumption we call q-Triple DH. This assumption is similar in nature to q-BB-CDH but stronger. It is used together with the q-HSDH assumption to prove the security of our multi-block signature scheme in Section 4.4.

Definition 2.3.11 (q-Triple DH (q-TDH)) On input g, g^x, g^y ∈ G1, h, h^x ∈ G2, random c_1, . . . , c_q ∈ Zp, and g^{1/(x+c_1)}, . . . , g^{1/(x+c_q)} for random exponents x, y ∈ Zp, it is hard to compute a tuple (h^{μx}, g^{μy}, g^{μxy}) for μ ≠ 0.

Assumptions for pseudo-random functions. The assumptions for pseudo-random functions are both decisional and one-more. The security of the Dodis-Yampolskiy (DY) [DY05] pseudo-random function that we use in this thesis requires one of q-DDHI [BB04a, CHL05] or q-IDDHI.

In some of our schemes we require that the q-DDHI assumption holds either in G1 or G2. Note that this is slightly stronger than the DBDHI assumption used in [DY05] to construct an efficient VRF (there the challenge is e(g, h)^{1/α} or a random element of GT). However, it is still weaker than the BDHBI assumption used in the sVRF construction in [CL07].

Definition 2.3.12 (q-Decisional Diffie-Hellman Inversion (q-DDHI)) Given g, g^a, g^{a^2}, . . . , g^{a^q}, Z ∈ G for a random exponent a ∈ Zp, decide whether Z = g^{1/a} or a random element in G. The q-Decisional Diffie-Hellman Inversion assumption holds if all p.p.t. algorithms have negligible (with respect to the bit length of p = |G|) advantage in solving the above problem.

If the pseudo-random function is required to have a very large domain from which the adversary can adaptively choose, we require the stronger q-IDDHI assumption.

Definition 2.3.13 (q-Interactive DDH Inversion (q-IDDHI)) Let Oa(·) be an oracle that, on input z ∈ Z∗p, outputs g^{1/(a+z)}. Given g, g^a, Z ∈ G, for random exponent a ∈ Zp, after choosing x ∈ Z∗p decide whether Z = g^{1/(a+x)} or a random element in G. The q-Interactive Decisional Diffie-Hellman Inversion assumption holds if all p.p.t. algorithms that can make q accesses to Oa(·) have negligible (with respect to the bit length of p = |G|) advantage in solving the above problem.


2.4 Cryptographic Building Blocks

We summarize the necessary information about our system components. Most of the definitions and cryptographic primitives are standard and well established, with the exception of strongly computationally hiding commitments, Groth-Sahai commitments, CL-signatures, DY-PRFs, and a dedicated secure function evaluation protocol for modular arithmetic.

2.4.1 Commitment Schemes

A commitment scheme is a two-phase scheme that allows a user to commit to a hidden value, while preserving the ability of the user to reveal the committed value at a later stage. The standard definition of a non-interactive commitment scheme consists of a setup algorithm ComSetup, and an algorithm Com that is used both in the commit and reveal stage. ComSetup(1^k) outputs public parameters paramsCom for the commitment scheme. Com(paramsCom, x, open) is a deterministic function that outputs comm, a commitment to x, using randomness open. One opens a commitment comm by revealing x and open and verifying that Com(paramsCom, x, open) = comm.

The properties of a commitment scheme are hiding: the value committed to must remain undiscovered until the reveal stage, and binding: the only value that may be revealed is the one that was chosen in the commit stage. In our protocols we make use of different binding and hiding properties:

Definition 2.4.1 (Perfectly Binding) For all params output by ComSetup it holds that for any x ≠ x′ the set of commitments to x is disjoint from the set of commitments to x′. This means that there do not exist open and open′ such that Com(paramsCom, x, open) = Com(paramsCom, x′, open′).

Definition 2.4.2 (Computational Binding) For all p.p.t. algorithms that on input paramsCom ← ComSetup(1^k) output x, x′, open, open′ with x ≠ x′, the probability that Com(paramsCom, x, open) = Com(paramsCom, x′, open′) is a negligible function ν in k.

Definition 2.4.3 (Perfect, Statistical and Computational Hiding) Let Uk be the uniform distribution over the opening values under paramsCom ← ComSetup(1^k). A commitment scheme is perfectly, statistically, or computationally hiding, if for all x ≠ x′ the probability ensembles {Com(ComSetup(1^k), x, Uk)}k∈N and {Com(ComSetup(1^k), x′, Uk)}k∈N are equal, statistically close, or computationally indistinguishable, respectively.


Definition 2.4.4 (Strongly Computationally Hiding) There exists an alternate setup HidingSetup(1^k) that outputs parameters (computationally indistinguishable from the output of ComSetup(1^k)) so that the commitments become perfectly hiding.

In some cases we also make use of two additional properties: the extraction andthe chameleon property.

Extractable commitment schemes were introduced by De Santis et al. [SCP00]. Informally speaking, a commitment scheme is extractable if there exists an extractor with two efficient algorithms. The first algorithm prepares commitment parameters and a trapdoor, such that the second algorithm can later extract the committed value from any valid commitment sent by the committer. The extraction parameters must be indistinguishable from the real parameters of the commitment scheme.

Chameleon commitment schemes were introduced by Brassard et al. [BCC88]. Informally, the property requires that there exists a simulator with two efficient algorithms. The first algorithm prepares commitment parameters and a trapdoor, such that the second algorithm can later open any of its commitments to a value of its choice. The chameleon parameters must be indistinguishable from the real parameters of the commitment scheme.

We use these properties in the common reference string model. This model allows us (for proof purposes) to replace the honestly generated commitment setup produced by ComSetup with an indistinguishable alternative setup.

Pedersen and Fujisaki-Okamoto commitments. We use the perfectly hiding commitment scheme proposed by Pedersen [Ped92]: given parameters paramsCom that describe a group G of prime order p with generators g and h such that logg(h) is unknown, generate a commitment comm to x ∈ Zp by choosing at random openx ← Zp and computing comm = g^x h^{openx}. The commitment is opened by revealing x and openx. We sometimes use a generalized Pedersen commitment comm = g1^{x1} · · · gn^{xn} h^{open} to multiple values. The Pedersen commitment scheme is binding under the discrete logarithm (DL) assumption (see Section 2.3.3). Pedersen commitments have the chameleon property, with logg(h) acting as the trapdoor, but are not extractable.

Fujisaki and Okamoto [FO97] showed how to expand this scheme to hidden order groups.
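As a concrete illustration, the Pedersen scheme above can be sketched in a few lines of Python. The group parameters below (a small safe prime and fixed generators) are toy-sized illustrative assumptions chosen for readability, not for security; in a real setup logg(h) must be unknown to the committer.

```python
# Toy Pedersen commitment: comm = g^x * h^open in the order-q subgroup of Z_p^*.
import secrets

p, q = 1019, 509          # toy safe prime p = 2q + 1; real parameters are much larger
g = 4                     # generator of the order-q subgroup (a quadratic residue mod p)
h = pow(g, 177, p)        # demo only: in practice log_g(h) must be unknown

def commit(x, opening):
    """Commit to x in Z_q with randomness 'opening' in Z_q."""
    return (pow(g, x, p) * pow(h, opening, p)) % p

def verify(comm, x, opening):
    """Reveal phase: recompute the commitment and compare."""
    return comm == commit(x, opening)

x = 42
opening = secrets.randbelow(q)
comm = commit(x, opening)
assert verify(comm, x, opening)          # correct opening is accepted
assert not verify(comm, 43, opening)     # a different value does not open
```

The generalized multi-value commitment g1^{x1} · · · gn^{xn} h^{open} is obtained the same way by multiplying one `pow(g_i, x_i, p)` factor per value.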

Groth-Sahai (GS) commitments. Groth-Sahai commitments [GS08] are commitments that behave advantageously when used together with a bilinear map. To commit to a group element x of a prime order group G1 of size p, the committing party computes a vector of I group elements. We use multiplicative notation, thus two vectors can be multiplied, and individual vectors are scaled using component-wise exponentiation with an element in Zp. In a first step x is mapped to a vector through an injective function µ. This can for instance be done by mapping the group element to one component of the vector, and setting all other components to the neutral element 1. Let I be the dimension of the vector space V1. In order to hide the committed element x, the resulting vector is combined with a random linear combination of vectors ui ∈ V1, 1 ≤ i ≤ I.

To commit to an element x ∈ G1, choose a random opening open = (r1, . . . , rI) ← Zp^I and compute C = µ1(x) · ∏_{i=1}^{I} ui^{ri}. Elements y ∈ G2 are committed to in the same way using µ2 and v1, . . . , vJ ∈ V2, and an opening vector open ∈ Zp^J. For simplicity we assume that GSCom(paramsGS, m, open) first determines whether m ∈ G1 or m ∈ G2 and then follows the appropriate instructions. The same commitment scheme can be used to commit to a value m ∈ Zp using a group element h as the base. We write ExpCom(paramsGS, h, m, open) = GSCom(paramsGS, h^m, open).

If the subspace generated by the vectors ui and the range of µ share only the 1 vector, the commitment scheme is perfectly binding. Clearly, this requires that the vectors ui are not all linearly independent. For the commitment scheme to be strongly computationally hiding, the vectors ui generated by ComSetup need to be computationally indistinguishable from the linearly independent vectors output by HidingSetup. A random combination of linearly independent vectors ui, 1 ≤ i ≤ I generates the whole of V1 and hides the value x perfectly.

The property that makes GS commitments so useful for the construction of non-interactive proofs [GS08] is that they allow for the evaluation of a bilinear map e : G1 × G2 → GT on committed elements in the committed domain. Given a commitment to a and a commitment to b it is possible to compute a vector of elements in GT, using a map E : V1 × V2 → VT, that acts as a commitment to the value e(a, b). Intuitively, if the subspace generated by the vectors E(ui, vj) is orthogonal to E(µ1(G1), µ2(G2)), then the resulting commitment scheme is perfectly binding. Moreover, the commitment is strongly computationally hiding if the commitments to a and b are strongly computationally hiding.

We instantiate this approach with commitment schemes that are perfectly binding and strongly computationally hiding under the SXDH and DLIN assumptions (see Section 2.3.3). We also show that, based on appropriate parameters and trapdoor information, the resulting commitments have the extraction property for GSCom and the chameleon property for ExpCom.

SXDH instantiation. In the SXDH setting, one commits to elements in G1 as follows (committing to elements in G2 is similar):


Let vector space V1 = G1 × G1. The parameters are generated by choosing random s, z and computing u1 = (g, g^z) and u2 = (g^s, g^{sz}). The public parameters are u1, u2. If extraction is necessary, the trapdoor will be s, z.

One commits to x ∈ G1 by choosing random r1, r2 ∈ Zp and computing (1, x) u1^{r1} u2^{r2}. The commitment is opened by revealing x, r1, r2. Given the trapdoor s, z, it is possible to extract x from a commitment (c1, c2) by computing c2/c1^z. The commitment is perfectly binding and extractable.

The commitment scheme is strongly computationally hiding. Perfectly hiding parameters are generated by choosing random s, z, w ∈ Zp and computing u1 = (g, g^z) and u2 = (g^s, g^w). The public parameters are u1, u2. Note that these public parameters are indistinguishable from those described above under the SXDH assumption and that under these parameters the commitment scheme is perfectly hiding.

Under this setup the commitment scheme is a chameleon commitment. We can form commitments for which we can use the trapdoor s, z, w to open the commitment to any value for which we know the discrete logarithm. We compute such a commitment by choosing random c1, c2 ∈ Zp and computing (g^{c1}, g^{c2}). To open this commitment to any value g^φ, we need only find a solution (r1, r2) to the equations c1 = r1 + sr2 and c2 = φ + zr1 + wr2.
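The algebra of the SXDH binding instantiation can be checked in a toy subgroup of Z∗p. The sketch below (with illustrative group sizes and helper names `vmul`, `vexp`, `gs_commit`, `gs_extract` that are our own, not from [GS08]) commits as (1, x) u1^{r1} u2^{r2} and extracts with the trapdoor z.

```python
# Toy SXDH-style GS commitment over the order-q subgroup of Z_p^* (demo sizes only).
import secrets

p, q, g = 1019, 509, 4        # toy group parameters

def vmul(a, b):               # component-wise product of vectors of group elements
    return tuple((x * y) % p for x, y in zip(a, b))

def vexp(a, r):               # component-wise exponentiation (scaling by r in Z_q)
    return tuple(pow(x, r, p) for x in a)

# Binding parameters: u1 = (g, g^z), u2 = (g^s, g^{sz}); extraction trapdoor (s, z).
s, z = secrets.randbelow(q - 1) + 1, secrets.randbelow(q - 1) + 1
u1 = (g, pow(g, z, p))
u2 = (pow(g, s, p), pow(g, s * z, p))

def gs_commit(x, r1, r2):
    """C = (1, x) * u1^{r1} * u2^{r2}, committing to group element x."""
    return vmul(vmul((1, x), vexp(u1, r1)), vexp(u2, r2))

def gs_extract(c):
    """With trapdoor z: c2 / c1^z recovers x (perfect binding)."""
    c1, c2 = c
    return (c2 * pow(pow(c1, z, p), -1, p)) % p

x = pow(g, 123, p)
c = gs_commit(x, secrets.randbelow(q), secrets.randbelow(q))
assert gs_extract(c) == x     # c1 = g^{r1+s r2}, c2 = x * g^{z(r1+s r2)}
```

The extraction works because c2/c1^z cancels exactly the u1, u2 contribution, leaving x; under the hiding parameters (random w instead of sz) this cancellation fails and the commitment is perfectly hiding.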

DLIN instantiation. In the DLIN setting one commits to elements in G1 as follows (committing to elements in G2 is similar):

Let vector space V1 = G1 × G1 × G1. The parameters are generated by choosing random a, b, z, s and computing u1 = (g^a, 1, g), u2 = (1, g^b, g), and u3 = (g^{az}, g^{bs}, g^{z+s}). The public parameters are u1, u2, u3. If extraction is necessary, the extraction trapdoor will be a, b, z, s.

One commits to x ∈ G1 by choosing random r1, r2, r3 ∈ Zp and computing (1, 1, x) u1^{r1} u2^{r2} u3^{r3}. Opening would reveal x, r1, r2, r3. In this case, given the trapdoor a, b, s, z, we are able to extract x from a commitment (c1, c2, c3) by computing c3/(c1^{1/a} c2^{1/b}). The commitment is perfectly binding and extractable.

The commitment scheme is strongly computationally hiding. Perfectly hiding parameters are generated by choosing random a, b, s, z, w ∈ Zp and computing u1 = (g^a, 1, g), u2 = (1, g^b, g), and u3 = (g^{az}, g^{bs}, g^w). The public parameters are u1, u2, and u3. Note that these public parameters are indistinguishable from those described above under the DLIN assumption and that the resulting commitment scheme is perfectly hiding.

Under this setup the commitment scheme is a chameleon commitment. We can form commitments for which we can use the chameleon trapdoor a, b, s, z, w to open to any value for which we know the discrete logarithm. We compute such a commitment by choosing random c1, c2, c3 ∈ Zp and computing (g^{c1}, g^{c2}, g^{c3}). To open this commitment to any value g^φ, we need only find a solution (r1, r2, r3) to the equations c1 = ar1 + azr3, c2 = br2 + bsr3 and c3 = φ + r1 + r2 + wr3.

2.4.2 Public-Key Encryption Schemes

A public key encryption scheme consists of three algorithms: Keygen, Enc, and Dec. Keygen(1^k) generates a public encryption key pk and a secret decryption key sk. Enc(pk, m) encrypts message m and produces ciphertext ct. Dec(sk, ct) decrypts the ciphertext and returns m.

Let Enc(pk, m, r) be the encryption algorithm, with the random tape of the encryption algorithm made explicit. For each key pair (pk, sk) generated by the randomized algorithm Keygen and every possible random tape r, the functions Enc(pk, ·, r) : M → C and Dec(sk, ·) : C → M are an injective and a surjective mapping, respectively. M is called the message space, and C the ciphertext space. An encryption scheme is said to be correct if Dec(sk, ·) is always the inverse of Enc(pk, ·, r) (when the domain of the former is restricted to the range of the latter), i.e., for all key pairs (pk, sk) ← Keygen(1^k) and all messages m, Dec(sk, Enc(pk, m)) = m.

Definition 2.4.5 (Secure Public-Key Encryption (IND-CCA2)) Let k be a security parameter. A public key encryption scheme is secure (against adaptive chosen ciphertext attack [GM84, RS92]) if every p.p.t. adversary A has an advantage negligible in k in the following game:

Setup. The game runs Keygen(1^k) to generate (pk, sk) and hands pk to A.

Phase 1. A may query an oracle ODec(sk, ·) polynomially many times. On input ct, the oracle runs Dec(sk, ct) and returns the associated m.

Challenge. A presents two challenge messages m0, m1. The game selects a random bit b, and returns the challenge ciphertext ct∗ = Enc(pk, mb) to A.

Phase 2. A may again query oracle ODec(sk, ·) polynomially many times on ct, provided ct is not ct∗.

Guess. A outputs b′. We define the advantage of A as |Pr[b′ = b] − 1/2|.

Security against chosen plaintext attack (IND-CPA) is defined in a similar way,but the adversary is not given access to the decryption oracle.
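The structure of the IND-CPA game can be expressed as a small experiment. The sketch below is illustrative only: it uses a toy ElGamal-style scheme, an obviously broken "encryption" for contrast, and an adversary `guess` of our own invention, to show how the game measures advantage.

```python
# Sketch of the IND-CPA game: challenger flips b, adversary sees Enc(pk, m_b)
# and guesses b. Toy parameters; not a secure implementation.
import secrets

p, q, g = 1019, 509, 4        # order-q subgroup of Z_p^*

def keygen():
    sk = secrets.randbelow(q - 1) + 1
    return pow(g, sk, p), sk  # pk = g^sk

def enc(pk, m):               # toy ElGamal; m encoded as a subgroup element
    r = secrets.randbelow(q - 1) + 1
    return pow(g, r, p), (m * pow(pk, r, p)) % p

def enc_broken(pk, m):        # 'encryption' that leaks the plaintext outright
    return 0, m

def ind_cpa_game(encrypt, adversary_guess, trials=200):
    """Fraction of games the adversary wins; ~1/2 means no advantage."""
    wins = 0
    m0, m1 = pow(g, 3, p), pow(g, 7, p)
    for _ in range(trials):
        pk, sk = keygen()
        b = secrets.randbelow(2)
        ct = encrypt(pk, (m0, m1)[b])
        wins += adversary_guess(pk, m0, m1, ct) == b
    return wins / trials

# Adversary that compares the second ciphertext component with m1:
guess = lambda pk, m0, m1, ct: 1 if ct[1] == m1 else 0

assert ind_cpa_game(enc_broken, guess) == 1.0   # full advantage vs broken scheme
```

Against the randomized ElGamal-style `enc`, the same adversary wins only about half the time; the gap between the two win rates is exactly the advantage the definition bounds.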

We will make use of several variants and extensions of public key encryption.

Verifiable encryption. For our purposes, in a verifiable encryption scheme, the encrypter/prover convinces a verifier that the plaintext of an encryption under a known public key is equivalent to the value hidden in a Pedersen commitment.


Camenisch and Damgård [CD00] developed a technique for turning any secure encryption scheme into a secure verifiable encryption scheme.

Bilinear ElGamal encryption. We require a cryptosystem where g^x is sufficient for decryption and the public key is φ(g^x) for some function φ. One example is the bilinear ElGamal cryptosystem [BF01, AFGH06]. Here the public key corresponds to pk = e(g, g)^x. To encrypt m, the ciphertext is computed as ct = (g^r, m · pk^r). Such a ciphertext can be decrypted given knowledge of g^x, i.e., m = (m · e(g, g)^{rx})/e(g^x, g^r). This scheme is semantically secure under the DBDH assumption (see Section 2.3.3).
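To see the algebra of bilinear ElGamal without a pairing library, one can emulate a symmetric pairing by tracking exponents: g^a is represented by the integer a mod q and e(g^a, g^b) = e(g, g)^{ab} by the product ab mod q. The sketch below mirrors only the correctness equations of the scheme, not its security; all parameter choices are illustrative.

```python
# Exponent-level emulation of bilinear ElGamal (correctness check only).
import secrets

q = 509                        # toy prime group order

def pairing(a, b):             # e(g^a, g^b) -> exponent of e(g, g)
    return (a * b) % q

def keygen():
    x = secrets.randbelow(q - 1) + 1
    return pairing(1, x), x    # pk = e(g, g)^x; decryption key is g^x (here: x)

def enc(pk, m):                # m: exponent of a GT element
    r = secrets.randbelow(q - 1) + 1
    return r, (m + pk * r) % q # ct = (g^r, m * pk^r) in exponent form

def dec(gx, ct):
    c1, c2 = ct
    return (c2 - pairing(gx, c1)) % q   # m = c2 / e(g^x, g^r)

pk, x = keygen()
m = 123
assert dec(x, enc(pk, m)) == m
```

The point of the construction, visible in `dec`, is that the decryptor only needs g^x and the pairing, never x itself, matching the requirement that g^x suffices for decryption.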

Homomorphic encryption. Homomorphic encryption is a form of malleable encryption. Given two ciphertexts, it is possible to create a third ciphertext with a plaintext that is related to the first two.

In homomorphic encryption, the ‘operations’ on the ciphertext and the plaintext preserve the homomorphism of Ench:

Ench(m1)⊗ Ench(m2) = Ench(m1m2)

We speak of additive homomorphic encryption if the message space M corresponds to a ring. For additive homomorphic encryption with M = Zn, the encrypted plaintexts fulfill the following relations:

Ench(m1)⊕ Ench(m2) = Ench(m1 +m2), c⊗ Ench(m) = Ench(c ·m).

In this case + corresponds to the addition operation of the ring. Note that n can (but does not necessarily) correspond to an RSA modulus. For c ∈ Zn, we write c ⊗ Ench(m) to denote the c-fold homomorphic addition of Ench(m). Note that for Paillier [Pai99] and Damgård-Jurik encryption [DJ01], c ⊗ Ench(m) corresponds to Ench(m)^c and can be implemented efficiently.
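The additive relations above can be checked with a minimal Paillier sketch. The primes are toy-sized and there is no hardening; this illustrates the homomorphism only.

```python
# Minimal Paillier: Ench(m1) ⊕ Ench(m2) = Ench(m1 + m2), c ⊗ Ench(m) = Ench(m)^c.
import math
import secrets

p_, q_ = 293, 433                  # toy primes; real keys use large primes
n = p_ * q_
n2 = n * n
lam = math.lcm(p_ - 1, q_ - 1)     # λ(n)
g_ = n + 1                         # standard choice of generator

def enc(m):
    r = secrets.randbelow(n - 1) + 1
    return (pow(g_, m, n2) * pow(r, n, n2)) % n2

def dec(ct):
    u = pow(ct, lam, n2)           # = 1 + (m·λ mod n)·n  mod n^2
    return ((u - 1) // n * pow(lam, -1, n)) % n

def hom_add(c1, c2):               # Ench(m1) ⊕ Ench(m2): multiply ciphertexts
    return (c1 * c2) % n2

def hom_scale(c, k):               # k ⊗ Ench(m) = Ench(m)^k
    return pow(c, k, n2)

assert dec(hom_add(enc(20), enc(22))) == 42
assert dec(hom_scale(enc(7), 6)) == 42
```

Ciphertext multiplication adds plaintexts and exponentiation scales them, which is exactly the ⊕ and ⊗ notation used in the text.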

Damgård-Jurik encryption. The Damgård-Jurik cryptosystem [DJ01] is a generalization and adaptation of the Paillier cryptosystem [Pai99] based on the decisional composite residuosity (DCR) assumption (see Section 2.3.3). It allows for the encryption of arbitrarily long messages without the need to generate new keys. It preserves the homomorphic property of Paillier encryption. Damgård and Jurik [DJ01] also describe a threshold decryption variant and efficient zero-knowledge proofs for their scheme.

Threshold decryption. In a distributed decryption protocol a private key is shared among a group of parties, where only a threshold of the parties is allowed to decrypt a ciphertext ct, whereas fewer parties learn nothing about the secret key nor about the decryption of ct.

2.4.3 Digital Signature Schemes

A digital signature scheme consists of three algorithms: Keygen, Sign, and VerifySig. Keygen(1^k) generates a secret signing key sk and a public verification key pk. Sign(sk, m) computes a signature σ on m. VerifySig(pk, m, σ) outputs accept if σ is a valid signature on m, reject if not.

Let Sign(sk, m, r) be the signing algorithm, with the random tape made explicit. Each key pair (pk, sk) generated by the randomized algorithm Keygen and every possible random tape r defines two functions Sign(sk, ·, r) : M → S and VerifySig(pk, ·, ·) : M × S → {accept, reject}, where M is called the message space, and S the signature space. A signature scheme is said to be correct if VerifySig(pk, m, σ) outputs accept for all values for which there exists an r such that σ = Sign(sk, m, r), i.e., for all key pairs (pk, sk) ← Keygen(1^k) and all messages m, VerifySig(pk, m, Sign(sk, m)) = accept.

Definition 2.4.6 (Secure Digital Signature (EF-CMA) [GMR88]) We say that a digital signature scheme is secure (against existential forgery under adaptive chosen message attack) if it is Correct and Unforgeable.

Correctness. For keys generated by Keygen, all signatures obtained using the Sign algorithm should be accepted by the VerifySig algorithm.

Unforgeability. No adversary should be able to output a valid message/signature pair (m, σ) unless he has previously obtained a signature on m. Formally, for every p.p.t. adversary A, there exists a negligible function ν such that

Pr[(pk, sk) ← Keygen(1^k); (QSign, m, σ) ← A(paramsSig, pk)^{OSign(sk,·)} : VerifySig(pk, m, σ) = 1 ∧ m ∉ QSign] < ν(k).

OSign(sk, m) records all queried messages m in QSign and returns Sign(sk, m).

CL-signatures. In a series of papers, Camenisch and Lysyanskaya [CL01, CL02b, CL04] identified a key building block commonly called “a CL-signature” that is frequently used in the construction of privacy protocols. A CL-signature is a signature scheme with a pair of useful protocols.


The first protocol, called Issue, lets a user obtain a signature on committed messages without revealing the messages. The user wishes to obtain a signature on values a1, . . . , an from a signer with public key pk. The user forms commitments commi for every ai and gives the commi to the signer. After running the protocol, the user obtains a signature credpk on the ai, and the signer learns no information about a1, . . . , an other than the fact that he has signed the values that the user has committed to.

The second protocol, called Prove, is a zero-knowledge proof of knowledge of a signature on committed values. The prover has a message-signature pair ((a1, . . . , an), cred) and a commitment opening openi for every ai. The verifier only knows the commitments comm1, . . . , commn. The prover proves in zero-knowledge that he knows a pair ((a1, . . . , an), cred) and values (open1, . . . , openn) such that VerifySig(pk, (a1, . . . , an), cred) = accept and commi = Com(ai, openi). (Cryptographic proof systems will be introduced in Section 2.5.)

It is clear that using general secure two-party computation [Yao86] and zero-knowledge proofs of knowledge of a witness for any NP statement [GMW86], we can construct the Issue and Prove protocols from any signature scheme and commitment scheme. Camenisch and Lysyanskaya’s contribution was to construct specially designed signature schemes that, combined with Pedersen [Ped92] and Fujisaki-Okamoto [FO98] commitments, allowed them to construct Issue and Prove protocols that are efficient enough for use in practice. In turn, CL-signatures have been implemented and standardized [CV02, BCC04]. They have also been used as a building block in many other constructions [JS04, BCL04, CHL05, CHL06, DDP06, CHK+06, TS06, CGH06, CLM07].

2.4.4 Pseudo-Random Function (PRF)

A pseudo-random function family fsk [GGM86] requires the existence of two efficient polynomial-time algorithms Keygen and Eval: Keygen(1^k) outputs a secret seed sk that acts as an index into the function family and fixes the output domain O; Eval(sk, x) evaluates the function fsk on point x and outputs the image y of x in O.

Definition 2.4.7 A pseudo-random function family is secure if no adversary, given the possibility to query either a pseudo-random function fsk or a completely random function f∗ with output domain O, can distinguish whether he is interacting with the one or the other.

More formally, for all p.p.t. adversaries A, there exists a negligible function ν such that

|Pr[(O, sk) ← Keygen(1^k); b ← {0, 1}; A(O)^{OEval(b,sk,·)} = b] − 1/2| < ν(k).


Depending on bit b, when queried on a fresh x, OEval(b, sk, x) either runs y ← Eval(sk, x) or samples a random y from the output domain O and stores it as its image under x.

DY PRF. Let p be a prime of bit length k, let G = ⟨g⟩ be a group of order p and let a be a random element of Z∗p. Dodis and Yampolskiy [DY05] showed that f^{DY}_{g,a}(x) = g^{1/(a+x)} is a pseudo-random function with output domain G, under the q-DDHI assumption, when either: (1) the inputs are drawn from the restricted domain {0, 1}^{O(log(k))} only, or (2) the adversary specifies a polynomial-sized set of inputs from Z∗p before a function is selected from the PRF family (i.e., before the value a is selected). For our purposes, we require something stronger: that the DY construction work for inputs drawn arbitrarily and adaptively from Z∗p.

Theorem 2.4.8 In the generic group model, the Dodis-Yampolskiy PRF is adaptively secure for inputs in Z∗p.

The proof of Theorem 2.4.8 follows from the generic group proof of the q-IDDHIassumption; see Appendix B.1.
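A toy evaluation of the DY construction in a small subgroup of Z∗p shows the shape of the function; the parameters below are illustrative only, and the consistency check y^{a+x} = g is just the defining equation of g^{1/(a+x)}.

```python
# Toy Dodis-Yampolskiy PRF f_{g,a}(x) = g^{1/(a+x)} in an order-q subgroup of Z_p^*.
p, q, g = 1019, 509, 4     # demo parameters; real p has bit length k

def dy_prf(a, x):
    """g^{1/(a+x)}, with the inverse taken modulo the group order q."""
    inv = pow((a + x) % q, -1, q)   # requires a + x != 0 mod q
    return pow(g, inv, p)

a = 77                      # secret seed
y = dy_prf(a, 5)
assert pow(y, (a + 5) % q, p) == g  # consistency: y^{a+x} = g
```

Deterministic evaluation (the same (a, x) always gives the same output) is what makes the function usable as a PRF; unpredictability without a rests on q-DDHI, respectively q-IDDHI for adaptive inputs.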

2.4.5 Secure Two-party Computation

The security of a two-party computation protocol [Yao82] is usually defined through a comparison with an idealized scenario that is secure by definition. The idealized scenario involves a trusted party that collects the input of the two parties over secure channels and returns the result of the computation if none of the parties chooses to abort. The cryptographic two-party computation protocol is secure if it behaves no worse than this ideal protocol, but without the additional trust assumptions. This is usually modeled using a simulator. The task of the simulator is to act as a wrapper around the idealized protocol to make it appear like the cryptographic protocol. The simulation succeeds with respect to an unbounded (respectively computationally bounded) adversary if the output of the simulator is equal to (respectively computationally indistinguishable from) the output of the cryptographic protocol. A two-party computation protocol is secure if for all adversaries there exists a successful simulator.

Secure function evaluation is a special form of two-party computation, in which each party provides input to the computation exactly once. As such, a secure function evaluation is an interactive algorithm between entities E and E′ that evaluates the functions fp and f′p parameterized by some public value p on secret inputs in and in′. E has private input in and E′ has private input in′. This is written (fp(in, in′), f′p(in, in′)) ← E(p, in) ↔ E′(p, in′). If none of the parties chooses to abort, E learns fp(in, in′) and E′ learns f′p(in, in′). The standard definition of what it means for an interactive algorithm to securely compute a function (see, for example, Lindell and Pinkas [LP07]) is a subcase of secure two-party computation and implies, among other things, the extraction of the inputs in and in′ of dishonest parties by the simulator and the possibility of simulating the behavior of honest parties knowing only their output and not their input to the function.

In our protocols we are particularly interested in the evaluation of functions on committed values from Zp. Jarecki and Shmatikov give a protocol for secure two-party computation on committed inputs [JS07]. We describe an efficient committed secure function evaluation protocol for computing algebraic terms with addition and multiplication modulo a prime p that generalizes ideas presented in [CKW04]. Note that the protocol described below only evaluates f′p and E does not receive any output. This restriction can be removed by running the protocol again on the same committed inputs with reversed roles.

Committed secure function evaluation for modular arithmetic. The protocol uses an additive homomorphic encryption scheme with encryption and decryption functions Ench and Dech with message space Zn. In addition, the encryption should be verifiable [CD00], meaning it should allow for efficient proofs of knowledge about the encrypted content. The key pair is generated by E′ and is made available to E. Let x1, . . . , xN, open_{x1}, . . . , open_{xN} ∈ Zp and y1, . . . , yM, open_{y1}, . . . , open_{yM} ∈ Zp be the secret input variables and openings of E and E′ respectively, and let comm_{xi} and comm_{yi} be public commitments to the xi and yi. We provide a protocol for computing the multivariate polynomial ∑_{ℓ=1}^{L} aℓ ∏_{n=1}^{N} xn^{uℓn} ∏_{m=1}^{M} ym^{vℓm}, where u11, . . . , uLN, v11, . . . , vLM ∈ {0, 1} and aℓ ∈ Zp are publicly known values.

The parties can do parts of the computation locally: E computes Xℓ = aℓ ∏_{n=1}^{N} xn^{uℓn} mod p and E′ computes Yℓ = ∏_{m=1}^{M} ym^{vℓm} mod p. To prove that the computation was done correctly, E computes the commitment comm_{Xℓ} = Com(Xℓ, open_{Xℓ}) and E′ computes comm_{Yℓ} = Com(Yℓ, open_{Yℓ}).

The parties can complete the computation using homomorphic encryption as described in the following protocol (the size n of the message space of the homomorphic encryption needs to be at least 2^k L p^2). The ⊕ operator denotes the homomorphic addition of multiple ciphertexts.

The round complexity of the protocol is 3 if non-interactive proofs of knowledge are used and 12 if interactive proofs of knowledge are used.²

²The round complexity can be reduced by interleaving the proofs and piggybacking some of the messages.


The protocol between E(x1, . . . , xN, open_{x1}, . . . , open_{xN}) and E′(y1, . . . , yM, open_{y1}, . . . , open_{yM}) proceeds as follows:

1. E′ sends the encryptions {eℓ = Ench(Yℓ)}_{ℓ=1}^{L} to E, together with proof PK1.

2. E computes ex = (⊕_{ℓ=1}^{L} (eℓ ⊗ Xℓ)) ⊕ (Ench(r) ⊗ p), sends ex to E′, and gives proof PK2.

3. E′ computes x = Dech(ex), chooses open_x ← Zp, sends the commitment comm_x = g^x h^{open_x} to E, and gives proof PK3.

Party E′ encrypts each Yℓ and sends it to E. The proof PK1 assures E that comm_{Yℓ} was computed correctly using the values in the commitments comm_{yi} and that the eℓ are encryptions of the values committed to in the comm_{Yℓ}.

Next, E computes the encrypted result. The term r · p, 0 < r < (2^k − 1)Lp, is added to avoid possible modulo overflows from revealing any statistically significant information about the input of E. E proves to E′ in PK2 that comm_{Xℓ} was computed correctly using the values in the commitments comm_{xi} and that ex was computed correctly using the values committed to in comm_{Yℓ}.

As a last step, E′ decrypts ex, does a single modulo p reduction to obtain the result of the computation, and commits to the result in the commitment comm_x. In PK3, E′ proves to E that comm_x contains the same value modulo p as encrypted in ex. For details on how to do the proofs PK1, . . . , PK3 we refer to [CS97b, CS03]. An efficient implementation of such a protocol is presented in [CKW04] using the Paillier homomorphic encryption scheme [Pai99].
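The core of the committed evaluation (omitting the proofs PK1-PK3 and the commitments) can be sketched with toy Paillier parameters. The example polynomial, variable values, and blinding bound below are illustrative assumptions, not part of the protocol specification.

```python
# Sketch of the evaluation core: E' encrypts its local products Y_l, E folds in
# its X_l homomorphically plus a blinding multiple of the prime, and E' decrypts
# and reduces once mod the prime. Paillier serves as Ench/Dech; toy sizes only.
import math
import secrets

pe, qe = 293, 433                  # toy Paillier primes (message space Z_n)
n, lam = pe * qe, math.lcm(pe - 1, qe - 1)
n2 = n * n

def enc(m):
    r = secrets.randbelow(n - 1) + 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def dec(ct):
    return ((pow(ct, lam, n2) - 1) // n * pow(lam, -1, n)) % n

pr = 13                            # the arithmetic is modulo this small prime p

# Example: f(x, y) = 3*x1*y1 + 5*y2, i.e. X = (3*x1, 5), Y = (y1, y2)
x1, y1, y2 = 4, 6, 9
X = [(3 * x1) % pr, 5]             # E's local terms X_l
Y = [y1 % pr, y2 % pr]             # E''s local terms Y_l

# Step 1: E' -> E, encryptions e_l = Ench(Y_l)
e = [enc(v) for v in Y]

# Step 2: E computes ex = (⊕_l e_l ⊗ X_l) ⊕ Ench(r)⊗p, blinding the overflow
blind = secrets.randbelow(2 ** 8) + 1
ex = enc(blind * pr)
for e_l, X_l in zip(e, X):
    ex = (ex * pow(e_l, X_l, n2)) % n2

# Step 3: E' decrypts and does a single reduction mod pr
result = dec(ex) % pr
assert result == (3 * x1 * y1 + 5 * y2) % pr
```

Because the blinded sum blind·pr + Σ Xℓ·Yℓ stays well below n, the single reduction modulo pr at the end recovers exactly the polynomial value, while blind hides the carry information from E′.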

2.5 Cryptographic Proof Systems

The proof systems used in cryptographic protocols are a generalization of static mathematical pencil-and-paper proofs. The reductionist proofs mentioned in Section 2.1 are examples of pencil-and-paper proofs. They are written as a means of verification of the protocol’s design. Cryptographic proofs [GMR89, Gol00, BFM88, FLS99], on the contrary, form part of a larger cryptographic protocol and prove statements about messages exchanged between parties; these proofs are created and verified through electronic means. (Example applications of cryptographic proofs in the current chapter are verifiable encryption schemes, CL-signature schemes, and secure two-party computation protocols.)

Both mathematical and cryptographic proofs provide answers to the following question: “Given a statement, which additional resources are needed to convince a verifier V that the statement is true?”. The proofs found in mathematical textbooks fall into the complexity class NP, for which the additional resources available to the verifier are a polynomial length witness w and a deterministic polynomial time machine V. The goal of the cryptographic proof systems used in this thesis is to hide the witness w used in the proof from the verifier and instead provide him with alternative resources, e.g., the possibility to interact with the prover in a question-and-answer game. Other resources that allow for non-interactive proofs are a random oracle or a common reference string.

In a formal context, statements are often defined in terms of membership of an instance y in a formal language L. The statement is then y ∈ L. The instances and the corresponding witnesses define the witness relation R, such that for every y in the language and every possible witness wi for y we have (y, wi) ∈ R.

Informally, zero-knowledge captures the notion that a verifier learns nothing from the proof but the truth of the statement. Witness indistinguishability is a weaker notion that guarantees that the verifier learns nothing about which witness was used in the proof. In either case, we will also require soundness, meaning that an adversarial prover cannot convince an honest verifier of a false statement, and completeness, meaning that all correctly computed proofs are accepted by the honest verification algorithm. Sometimes we require a stronger soundness property, namely that the prover ‘knows’ a witness for the statement he is proving. This is formalized using an extractor that can obtain the witness from a successful prover.

Proofs with basic soundness guarantees are called proofs of language membership, while extractable proofs are referred to as proofs of knowledge. In our protocols we will use cryptographic proofs as basic building blocks. The description of our protocols abstracts from the concrete proof system. We quickly summarize our notation. (The subsections explaining the definitions of the different proof variants are not strictly necessary for understanding the contribution of the thesis.)

Notation for proofs of language membership. For proofs of language membership it suffices to define their language. In practical settings we often want to show that some system of equations Eqs holds for public values y1, . . . , ym and secret values x1, . . . , xn, where without loss of generality the first ℓ values x1, . . . , xℓ are committed in (perfectly binding) commitments C1, . . . , Cℓ with openings open1, . . . , openℓ.³ For statement y = (C1, . . . , Cℓ, y1, . . . , ym) and witness w = (x1, open1, . . . , xℓ, openℓ, xℓ+1, . . . , xn) the language is defined as

Lparams = {y | ∃w such that C1 = Com(params, x1, open1) ∧ . . . ∧ Cℓ = Com(params, xℓ, openℓ) ∧ Eqs(params, x1, . . . , xn, y1, . . . , ym)}.

³Perfectly hiding commitments would not add restrictions to the language.

Page 62: Cryptographic Protocols For Privacy Enhanced Identity Management

38 PRELIMINARIES

The language is parametrized with the public parameters params that are shared with the commitment scheme. For practical proof systems the parameters usually include the description of a group that is used when performing the proof. We often omit the parameters when they are clear from the context.
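As a concrete toy illustration of a perfectly binding commitment Com(params, x, open), the following sketch uses an ElGamal-style commitment in a small prime-order group. The parameters p, q, g, h are hypothetical toy values chosen for readability, not a scheme used in the thesis; real instantiations use groups of cryptographic size.

```python
# Illustrative ElGamal-style commitment: perfectly binding,
# computationally hiding. Toy parameters only.
p, q = 23, 11      # order-q subgroup of Z_p^*, with p = 2q + 1
g, h = 4, 9        # two generators of the order-q subgroup

def commit(x, opening):
    """Com(params, x, open) = (g^open, g^x * h^open) mod p."""
    return (pow(g, opening, p), (pow(g, x, p) * pow(h, opening, p)) % p)

# The first component determines `opening` uniquely (mod q), which in
# turn fixes x: the commitment is perfectly binding, as required above.
assert commit(5, 3) == (18, 8)
```

Perfect binding is what makes the language well defined: each Ci determines its content xi uniquely.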

Notation for proofs of knowledge. When referring to proofs of knowledge, we follow the notation introduced by Camenisch and Stadler [CS97b, CS97a]. We generalize their notation so that it can be used for all proofs of knowledge that can be described by a set of equations. As for proofs of language membership, we are interested in proofs of knowledge about commitments. (For proofs of knowledge it is meaningful to make statements about perfectly hiding commitments.)

We define f-extraction to be able to deal with proof systems that do not allow the extraction of the full witness; e.g., in the Groth-Sahai proof system [GS08] one cannot extract exponents or the openings of commitments. (In this case we will again require commitments to be perfectly binding.) We express partial extraction of an f-function of the witness using the notation

PK{ f(x1, open1, . . . , xℓ, openℓ, xℓ+1, . . . , xn) :

∀ i ∈ {1, . . . , ℓ} : Ci = Com(xi, openi) ∧ Equations(x1, . . . , xn, y1, . . . , ym) } .

In our construction we will use proofs based on extractable commitments. These proofs let us extract the committed value xi, but not the opening openi. In this case f(params, (x1, open1, . . . , xℓ, openℓ, xℓ+1, . . . , xn)) = (x1, . . . , xn). To simplify the notation we use ‘x in C’ to denote that there exists open such that C = Com(params, x, open) and write

PK{ (x1, . . . , xn) : x1 in C1 ∧ . . . ∧ xℓ in Cℓ ∧ Eqs(x1, . . . , xn, y1, . . . , ym) } .

We write NIPK for non-interactive proofs. In our notation, π ← NIPK{. . .} denotes the creation of a proof and π ∈ NIPK{. . .} means that VerifyProof accepts the proof π for instance (C1, . . . , Cℓ, y1, . . . , ym). To emphasize that a non-interactive proof is zero-knowledge we sometimes write NIZKPK instead of NIPK.

Other notational conventions. The concatenation of two non-interactive proofs π and π′ is a proof that combines all the commitments and proves the AND of the two conditions. If a non-interactive proof π proves a condition about the values of a set of commitments C, a projection of π with respect to a subset S of C proves the same condition about the values of the subset C \ S of commitments.

In Section 2.5.6 we define randomizable non-interactive proofs and randomizable commitments. The randomization algorithm RandProof((comm1, . . . , commℓ), (open′1, . . . , open′ℓ), π) takes a proof, a list of commitments, and a list of opening updates as input. It outputs a proof π′ that looks like a freshly generated proof for the same equations but for randomized commitments comm1 · Com(1, open′1), . . . , commℓ · Com(1, open′ℓ). The commitments we use are multiplicatively homomorphic, so for a commitment comm = Com(x, open), comm · Com(1, open′) = Com(x, open + open′). If commitments are to remain unchanged, they can be omitted from the input.
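The multiplicative homomorphism can be sketched with a toy commitment to a group element X (illustrative parameters, not the actual scheme used later): Com(X, open) = (g^open, X · h^open), so that componentwise multiplication adds the openings.

```python
# Toy multiplicatively homomorphic commitment to a group element X.
# Hypothetical small parameters; not the commitment scheme of [GS08].
p, g, h = 23, 4, 9           # g, h generate the order-11 subgroup of Z_23^*

def commit(X, opening):
    return (pow(g, opening, p), (X * pow(h, opening, p)) % p)

def mul(c1, c2):
    """Componentwise product of two commitments."""
    return ((c1[0] * c2[0]) % p, (c1[1] * c2[1]) % p)

# comm * Com(1, open') = Com(X, open + open'):
X = 13
assert mul(commit(X, 5), commit(1, 2)) == commit(X, 7)
```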

Outline of this section. We formalize the properties of different cryptographic proof systems below and provide further explanations on our notation. Section 2.5.1 introduces the distinction between interactive and non-interactive proof systems and defines their completeness and soundness properties. Section 2.5.2 formally defines proofs of knowledge for both interactive and non-interactive proof systems, and further discusses the Camenisch and Stadler notation. Section 2.5.3 formally defines the hiding properties (witness indistinguishability and zero-knowledge) of both interactive and non-interactive proof systems. Sections 2.5.4 and 2.5.5 introduce proof systems for statements that are particularly relevant for cryptographic protocols: proofs about discrete logarithms and proofs about pairing product equations. Section 2.5.6 introduces randomizable proofs.

2.5.1 Interactive and Non-Interactive Cryptographic Proofs

Interactive proof systems. In an interactive proof system, the prover and the verifier can make random coin tosses and interact in a question-and-answer game [GMR89]. Formally, a prover P and a verifier V interact using the following interactive algorithm:

(⊥, accept/reject)← P(y, w)↔ V(y) .

Definition 2.5.1 (Perfect Completeness) V accepts for every (y, w) ∈ R after interacting with P on common input y.

∀(y, w) ∈ R : Pr[(⊥, accept)← P(y, w)↔ V(y)] = 1

Definition 2.5.2 (Soundness) Soundness requires that no prover can make the verifier accept a wrong statement y ∉ L except with some small probability. The upper bound ε of this probability is referred to as the soundness error of a proof system. More formally, for every prover P, and every y ∉ L:

Pr[(⊥, accept)← P(y)↔ V(y)] < ε .

The above definition is parameterized by the soundness error ε. While for practical systems a soundness error of around 2^-80 is recommended, the definition is also meaningful for bigger values. As long as the soundness error is bounded by a polynomial fraction in the potential running time of the verifier, it is always possible to amplify soundness by repeating the proof until the soundness error becomes negligible relative to the running time of the verifier. (Per definition, a p.p.t. verifier runs in poly(|y|) time.) After ℓ repetitions, a soundness error ε will be reduced to ε^ℓ.

In our protocols we will make use of proof systems with a soundness error that is negligible in the size of the instance y, which will in turn depend on the security parameter k. As noted above, this can always be achieved using repetition.
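The amplification by repetition can be checked numerically. The helper below is a hypothetical utility, not part of the thesis' formalism; it uses base-2 logarithms so the computation is exact for powers of two.

```python
import math

def repetitions_needed(eps, target):
    """Smallest l such that eps**l <= target (exact for powers of two)."""
    return math.ceil(math.log2(target) / math.log2(eps))

# A proof with soundness error 1/2 needs 80 repetitions to push the
# error below 2^-80, while a proof with error 2^-10 needs only 8.
assert repetitions_needed(0.5, 2**-80) == 80
assert repetitions_needed(2**-10, 2**-80) == 8
```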

Non-interactive proof systems. The term ‘non-interactive’ is somewhat misleading. Such proofs put strong restrictions on the interactions allowed between the prover and the verifier; these restrictions are, however, not as strict as in static mathematical proofs.

The two relaxations that we consider are the random oracle model and the common reference string model. In the random oracle model, both the prover and the verifier are given access to a random oracle. The prover can make random coin tosses and can send a single message to the verifier [BR93].

We will look in more detail at non-interactive proof systems in the common reference string model. In this model the prover P and the verifier V are in possession of a reference string sampled from a distribution D by a trusted setup params ← Setup(1^k). To prove statement y ∈ L with witness w, P runs π ← Prove(params, y, w) and sends the proof π to the verifier. V accepts if VerifyProof(params, y, π) = accept, and rejects otherwise. To account for the fact that params may influence the statements that are being proven, the witness relation can be generalized to (y, w) ∈ Rparams parametrized by params.

Definition 2.5.3 (Completeness) VerifyProof accepts for all params ∈ Setup(1^k) and every (y, w) ∈ Rparams after interacting with Prove on common input y. More formally, for all k, all params ∈ Setup(1^k), and all (y, w) ∈ Rparams:

Pr[π ← Prove(params, y, w) : VerifyProof(params, y, π) = accept] = 1

Definition 2.5.4 (Soundness) Soundness requires that no prover can make the verifier accept a wrong statement y ∉ L except with some small probability. The upper bound of this probability is referred to as the soundness error of a proof system. Formally, for every prover P, there exists a negligible function ν such that

Pr[params ← Setup(1^k); (y, π) ← P(params) :

y ∉ L ∧ VerifyProof(params, y, π) = accept] = ν(k) .


The above definition requires the soundness error to be negligible in the security parameter k. By increasing k the soundness error can be made arbitrarily small. If the soundness error is 0 for all k, we speak of perfect soundness.

2.5.2 Cryptographic Proofs of Knowledge

In a proof of knowledge the prover P does not only need to convince the verifier of the truth of the statement y ∈ L; rather, P needs to prove that he ‘knows’ a witness w such that (y, w) is in the witness relation R.

This requires a formalization of what it means for a machine to know something. Loosely speaking, a machine knows a value if it is possible to extract the value using an additional machine that is referred to as a knowledge extractor E. A proof is a proof of knowledge if the success probability of E is comparable to the success probability of P.

This is formalized differently depending on the model.

Definition 2.5.5 (Blackbox Extractability) There exists a p.p.t. extractor E that is given oracle access to a potentially malicious prover P. Extractability requires that for every P, it is the case that E^P(y)(y) ∈ R(y) ∪ {⊥} and

Pr[E^P(y)(y) ∈ R(y)] ≥ Pr[P(y) ↔ V(y) → accept] − κ(y) .

Let R(y) be the set of all witnesses for instance y, while the result ⊥ signifies that the extractor E did not come to a conclusion.

The knowledge error κ(y) denotes the probability that the verifier V might accept y, even though the prover does in fact not know a witness w. This definition of the validity property is a combination of the validity and strong validity properties in [BG92]. For small knowledge errors κ(y), such as, e.g., 2^-80 or 1/poly(|y|), it can be seen as being stronger than the soundness property of ordinary interactive proofs.

In our protocols we will make use of proof systems with a knowledge error that is negligible in the size of the instance y, which will in turn depend on the security parameter k. Like soundness, knowledge extraction can be amplified using repetition.
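As a toy illustration of extraction (not one of the thesis' protocols), consider Schnorr's proof of knowledge of a discrete logarithm: an extractor that rewinds the prover to obtain two accepting transcripts with the same first message but different challenges can compute the witness.

```python
# Special-soundness extraction for Schnorr's protocol, toy parameters.
p, q, g = 23, 11, 4          # g generates the order-q subgroup of Z_p^*
x = 7                        # the prover's witness for y = g^x
y = pow(g, x, p)

r = 3                        # prover's commitment randomness
t = pow(g, r, p)             # first message (commitment)

def respond(c):              # honest prover's response s = r + c*x mod q
    return (r + c * x) % q

c1, c2 = 2, 5
s1, s2 = respond(c1), respond(c2)
# Both transcripts (t, c, s) verify: g^s = t * y^c.
assert pow(g, s1, p) == (t * pow(y, c1, p)) % p
assert pow(g, s2, p) == (t * pow(y, c2, p)) % p

# The extractor computes x = (s1 - s2) / (c1 - c2) mod q.
extracted = ((s1 - s2) * pow(c1 - c2, -1, q)) % q
assert extracted == x
```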

Definition 2.5.6 (CRS Extractability) For non-interactive proof systems in the common reference string model, extraction is defined using an extractor that can output the witness after manipulating the reference string.

More formally, as defined by De Santis et al. [SCP00], there has to exist an extractor E with p.p.t. algorithms (ExtractSetup, Extract). ExtractSetup(1^k) outputs (ext, params) where params is distributed identically to the output of Setup(1^k). For all p.p.t. adversaries A, the probability that A(1^k, params) outputs (y, π) such that VerifyProof(params, y, π) = accept and Extract(ext, y, π) fails to extract a witness w such that (y, w) ∈ Rparams is negligible in k. We have perfect extractability if this probability is 0.

To formalize partial extraction we define f-extractability as a generalization of extractability. f-Extractability means that the extractor Extract only has to output a partial witness wpar such that ∃w : (y, w) ∈ Rparams ∧ wpar = f(params, w).

If f(params, ·) is the identity function, we get the usual notion of extractability. If f(params, ·) is a constant function we only get soundness.

PK and NIPK notation. When referring to proofs of knowledge, we follow the notation introduced by Camenisch and Stadler [CS97b, CS97a] for various proofs of knowledge of discrete logarithms and proofs of the validity of statements about discrete logarithms. We generalize their notation so that it can be used for all proofs of knowledge whose witness relation R can be described as a set of equations with public variables y1, . . . , ym corresponding to the instance y, and secret variables x1, . . . , xn corresponding to the witness w:

PK{ (x1, . . . , xn) : Eqs(x1, . . . , xn, y1, . . . , ym) } .

By convention, the letters in the parenthesis, in this example x1, . . . , xn, denote quantities whose knowledge is being proven, while all other values are known to the verifier.

These techniques allow us to express statements about cryptographic primitives that are defined by a set of equations. For instance, one can specify a proof that two commitments contain the same value, or that a value was verifiably encrypted. Given statements Alg(x) = y and Alg′(x′) = y′ about two algorithms, with secret inputs x, x′ and public outputs y, y′, it is possible to prove AND and OR relations of these statements. We write in a short form notation, e.g., for AND

PK{ (x, x′) : Alg(x) = y ∧ Alg′(x′) = y′ } .

Following Camenisch and Stadler [CS97a], we use the following notation to express an f-extractable proof for an instance (C1, . . . , Cℓ, y1, . . . , ym) that includes commitments to a subset x1, . . . , xℓ of the witness elements. The witness then includes the openings for these commitments, i.e., w = (x1, open1, . . . , xℓ, openℓ, xℓ+1, . . . , xn). We can express partial extraction of an f-function of the witness using the notation

PK{ f(params, (x1, open1, . . . , xℓ, openℓ, xℓ+1, . . . , xn)) :

∀ i ∈ {1, . . . , ℓ} : Ci = Com(xi, openi) ∧ Eqs(params, x1, . . . , xn, y1, . . . , ym) } .


The f-extractability ensures that, with overwhelming probability over the choice of params, if VerifyProof accepts then we can extract f(params, (x1, open1, . . . , xℓ, openℓ, xℓ+1, . . . , xn)) from π, such that xi is the content of the commitment Ci, and Eqs(params, x1, . . . , xn, y1, . . . , ym) is satisfied.

We primarily use f-extraction together with non-interactive proofs, which we denote by NIPK{. . .}. We do, however, not exclude similar relaxations for interactive proofs of knowledge.

2.5.3 Hiding Properties of Cryptographic Proofs

Witness indistinguishability. Loosely speaking, a proof is witness indistinguishable if after execution of the proof the verifier gains no advantage in determining which witness from the set R(y) = {w | (y, w) ∈ R} was used to prove the statement y ∈ L [FS90].

Definition 2.5.7 (Witness Indistinguishability) A proof system is perfect, statistical, or computational witness indistinguishable, if for all y, w0, w1 with (y, w0) ∈ R and (y, w1) ∈ R, all auxiliary information z ∈ {0, 1}*, and all malicious p.p.t. verifiers V, the probability ensembles {P(y, w0) ↔ V(y, z)}|y| and {P(y, w1) ↔ V(y, z)}|y| are equal, statistically close, or computationally indistinguishable, respectively. Here |y| is the bit length of the statement being proven. All algorithms are restricted to polynomial time in |y|.

Definition 2.5.8 (Non-interactive Witness Indistinguishable Proof) Algorithms (Setup, Prove, VerifyProof) constitute a witness indistinguishable non-interactive proof system if the following property holds.

For all p.p.t. A1, A2 there exists a negligible function ν such that:

Pr[params ← Setup(1^k); (y, w0, w1, state) ← A1(params);

b ← {0, 1}; π ← Prove(params, y, wb);

b′ ← A2(state, π) :

(y, w0) ∈ Rparams ∧ (y, w1) ∈ Rparams ∧ b = b′] = 1/2 + ν(k) .

Zero-knowledge. Informally, zero-knowledge captures the notion that a verifier learns nothing from the proof but the truth of the statement. This is modeled with the help of an additional machine S, called the simulator, that produces output that cannot be distinguished from a real proof [GMR89].


Definition 2.5.9 (Blackbox Zero-knowledge) A proof system is perfect, statistical, or computational black-box zero-knowledge, if for all (y, w) ∈ R and all auxiliary information z ∈ {0, 1}* there exists a p.p.t. simulator S such that for all malicious p.p.t. verifiers V the probability ensembles {P(y, w) ↔ V(y, z)}|y| and {S^V(y,z)(y, z)}|y| are equal, statistically close, or computationally indistinguishable. All algorithms are restricted to polynomial time in |y|.

Definition 2.5.10 (Non-interactive Zero-Knowledge Proof) The non-interactive proof system (Setup, Prove, VerifyProof) is multi-theorem zero-knowledge, if there exists a simulator S = (SimSetup, SimProve) such that for all p.p.t. A there exists a negligible function ν such that

| Pr[params ← Setup(1^k) : A^OProve(params,·,·)(params) = 1] −

Pr[(params, sim) ← SimSetup(1^k) : A^OSimProve(params,sim,·,·)(params) = 1] | = ν(k)

Here OProve(params, y, w) outputs Prove(params, y, w), OSimProve(params, sim, y, w) outputs SimProve(params, sim, y) for (y, w) ∈ Rparams, and both oracles output ⊥ otherwise.

Composability of Cryptographic Proofs. Composability refers to either the witness indistinguishability or the zero-knowledge property. Interactive proofs can be sequentially or concurrently composable, meaning that the sequential, respectively concurrent, execution of multiple proofs does not leak additional information about the witness. Witness indistinguishable proof protocols are always also sequentially and concurrently composable [FS90]. This does not necessarily hold for zero-knowledge proof protocols [GK96].

For non-interactive proofs concurrent composition guarantees that arbitrary proofs can be created using the same common reference string params.

In a composable (under the definition of Groth and Sahai [GS08]) non-interactive proof system there exists an algorithm SimSetup that outputs params together with a trapdoor sim, such that the params output by SimSetup are indistinguishable from those output by Setup. Composable witness-indistinguishability or composable zero-knowledge requires that, under these parameters, the witness-indistinguishability or zero-knowledge property holds even when the adversary is given the trapdoor sim.

2.5.4 Proofs of Knowledge about Discrete Logarithms

We use several existing results to prove statements about discrete logarithms: (1) proof of knowledge of a discrete logarithm [Sch91]; (2) proof of knowledge of the equality of some element in different representations [CP93b]; (3) proof that a commitment opens to the product of two other committed values [Bra97, Cam98, CM99a]; and (4) proof of the disjunction or conjunction of any two of the previous [CDS94].

These results are often given in the form of Σ-protocols [CDS94, Cra97, Dam02]. Σ-protocols are a popular family of interactive proofs that are characterized by a three-move commit, challenge, and response structure. They allow one to prove various statements about discrete logarithms in groups of known and hidden order [Sch91, Bra97, CS97b, BCM05]. Σ-protocols can be turned into zero-knowledge protocols using efficient zero-knowledge compilers [Dam99, Dam02].

Such proofs can be represented using the PK notation.

PK{ (α, β, δ) : y = g^α h^β ∧ ỹ = g̃^α h̃^δ }

denotes a “zero-knowledge Proof of Knowledge of integers α, β, and δ such that y = g^α h^β and ỹ = g̃^α h̃^δ hold”, where y, g, h, ỹ, g̃, and h̃ are elements of some groups G = ⟨g⟩ = ⟨h⟩ and G̃ = ⟨g̃⟩ = ⟨h̃⟩ that have the same order. (Note that some elements in the representations of y and ỹ are equal.)
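Such a statement can be proven with a Σ-protocol that uses one shared response for the common exponent α. The sketch below uses toy parameters (here both groups are the same order-11 subgroup, which the notation permits since only equal order is required); it is an illustration, not an implementation from the thesis.

```python
# Toy Σ-protocol for PK{(α, β, δ) : y = g^α h^β ∧ ỹ = g̃^α h̃^δ}.
p, q = 23, 11
g, h = 4, 9              # generators of the order-q subgroup of Z_p^*
gt, ht = 2, 3            # second pair of generators (the 'tilde' group)

alpha, beta, delta = 3, 2, 9                       # the secrets
y = (pow(g, alpha, p) * pow(h, beta, p)) % p
yt = (pow(gt, alpha, p) * pow(ht, delta, p)) % p

# Commit: one fresh random value per secret; ra is shared for α.
ra, rb, rd = 5, 8, 1
t = (pow(g, ra, p) * pow(h, rb, p)) % p
tt = (pow(gt, ra, p) * pow(ht, rd, p)) % p

c = 4                                              # verifier's challenge
sa = (ra + c * alpha) % q                          # shared response for α
sb = (rb + c * beta) % q
sd = (rd + c * delta) % q

# Verifier checks both equations; the shared sa links the two α's.
assert (pow(g, sa, p) * pow(h, sb, p)) % p == (t * pow(y, c, p)) % p
assert (pow(gt, sa, p) * pow(ht, sd, p)) % p == (tt * pow(yt, c, p)) % p
```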

We refer to [CKY09] for a formal treatment of how to implement such a specification in a secure way when used as part of a bigger protocol.

Σ-protocols can be compiled into non-interactive zero-knowledge proofs of knowledge in the random oracle model by applying a cryptographic trick called the Fiat-Shamir heuristic [FS87].
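A minimal sketch of the transform for Schnorr's protocol follows (toy parameters; SHA-256 stands in for the random oracle, and the helper names are hypothetical):

```python
import hashlib

# Fiat-Shamir: the challenge is a hash of the statement and first message.
p, q, g = 23, 11, 4
x = 7                                # witness for y = g^x
y = pow(g, x, p)

def challenge(*vals):
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x, r):                     # r would be fresh randomness in practice
    t = pow(g, r, p)
    c = challenge(g, y, t)           # hash replaces the verifier's challenge
    return (t, (r + c * x) % q)

def verify(proof):
    t, s = proof
    c = challenge(g, y, t)           # verifier recomputes the challenge
    return pow(g, s, p) == (t * pow(y, c, p)) % p

assert verify(prove(x, r=3))
```

Since the challenge is recomputable from the transcript, the proof is a single message that anyone can verify.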

2.5.5 Proofs About Pairing Product Equations

Pairing-based cryptography has led to several cryptographic advancements. One of these advancements is the development of powerful and efficient non-interactive zero-knowledge proofs in the common reference string model. The seminal idea was to hide the values for the evaluation of the bilinear map in a commitment whose parameters are given as part of the reference string. Using different commitment schemes, this idea was used to build non-interactive proof systems under the sub-group hiding [GOS06b] and under the decisional linear assumption [GOS06a].

These proof systems prove circuit satisfiability, and thus by the Cook-Levin theorem [Coo71] allow one to prove membership for every language in NP. The size of the common reference string and the proofs is relatively small; however, transforming a statement into a Boolean circuit causes a considerable overhead.

Proof systems under the sub-group hiding, decisional linear, and external Diffie-Hellman assumptions that allow one to directly prove the pairing product equations common in pairing-based cryptography have been proposed by Groth and Sahai [GS08] and were revisited by [EGW09].

While proofs for such equations can also be implemented using interactive proofs about discrete logarithms, we do not want to rely on the random oracle model for obtaining non-interactive proofs. Moreover, Groth-Sahai proofs give us something that we do not know how to obtain with random oracles: the randomizability property introduced in Section 2.5.6.

Summary of Groth-Sahai proofs. Let paramsBM = (p, G1, G2, GT, e, g, h) be the setup for pairing groups of prime order p, with pairing e : G1 × G2 → GT, and g, h generators of G1, G2 respectively.⁴

The GS proof instance consists of the coefficients of the pairing product equation:

{a_q}_{q=1...Q} ∈ G1, {b_q}_{q=1...Q} ∈ G2, t ∈ GT, {α_{q,m}}_{q=1...Q, m=1...M}, {β_{q,n}}_{q=1...Q, n=1...N} ∈ Zp .

The prover knows a set of values {x_m}_{m=1...M}, {z_n}_{n=1...N} that satisfy the pairing product equation

∏_{q=1}^{Q} e(a_q · ∏_{m=1}^{M} x_m^{α_{q,m}}, b_q · ∏_{n=1}^{N} z_n^{β_{q,n}}) = t .

As the first step in creating the proof, the prover prepares commitments {C_m}_{m=1...M} and {D_n}_{n=1...N} for all values x_m, z_n in G1 and G2 respectively. Alternatively, it is possible to reuse commitments from the proof instance. Thus, the instance, known to the prover and verifier, is the pairing product equation (e.g., its coefficients) and a number of commitments, while the witness, known only to the prover, consists of the secret values and the openings of these commitments.

We now describe how to construct the proof. Let V1, V2 be the vector spaces underlying two GS commitment schemes for committing to elements in G1 and G2, and let E : V1 × V2 → VT be a bilinear map that evaluates the bilinear map e in the committed domain. Also let μ1, μ2, μT be efficiently computable embeddings that map elements of G1, G2, GT into V1, V2, VT, respectively. Note that by the properties of E, E(μ1(a), μ2(b)) = μT(e(a, b)). The public parameters paramsGS contain a common reference string with elements u_1, . . . , u_I ∈ V1, v_1, . . . , v_J ∈ V2 and values η_{h,i,j}, 1 ≤ i ≤ I, 1 ≤ j ≤ J, and 1 ≤ h ≤ H as defined by Groth and Sahai [GS08].

Groth and Sahai show how to efficiently compute proofs {π_i}_{i=1...I}, {ψ_j}_{j=1...J} that prove that the values in C_m and D_n satisfy a pairing product equation. To verify such a proof the verifier computes, for all 1 ≤ q ≤ Q, C_q ← μ1(a_q) · ∏_{m=1}^{M} C_m^{α_{q,m}} and D_q ← μ2(b_q) · ∏_{n=1}^{N} D_n^{β_{q,n}}. Then the verifier checks that

∏_{q=1}^{Q} E(C_q, D_q) = μT(t) · ∏_{i=1}^{I} E(u_i, π_i) · ∏_{j=1}^{J} E(ψ_j, v_j) .

⁴ There is also an instantiation for composite order groups, but we will not consider it here.

The soundness of the proof system follows from the fact that under the perfectly binding parameters, the vectors u_i and v_j can be seen as commitments to 1. Then, by the properties of the bilinear map E, ∏_{i=1}^{I} E(u_i, π_i) · ∏_{j=1}^{J} E(ψ_j, v_j) is a commitment to the 1 element of GT, and based on the homomorphic property of the commitment schemes, ∏_{q=1}^{Q} E(C_q, D_q) necessarily is a commitment to t ∈ GT.

Witness indistinguishability is more difficult to argue, but follows from the fact that under the perfectly hiding parameters, the proofs {π_i}_{i=1...I}, {ψ_j}_{j=1...J} are random vectors of V1 and V2 that are only restricted by the constraint that ∏_{q=1}^{Q} E(C_q, D_q) = μT(t) · ∏_{i=1}^{I} E(u_i, π_i) · ∏_{j=1}^{J} E(ψ_j, v_j). As the commitments are perfectly hiding, all proofs are drawn according to the same distribution, no matter which witness was used by the prover. For further details we refer to [GS08].

Applying the notation to Groth-Sahai proofs. We use ‘x in C’ to denote that there exists open such that C = Com(params, x, open). The Groth-Sahai proof system generates NIPK proofs of the form:

NIPK{ (x_1, . . . , x_M, z_1, . . . , z_N) : ∧_{m=1}^{M} x_m in C_m ∧ ∧_{n=1}^{N} z_n in D_n ∧ ∏_{q=1}^{Q} e(a_q · ∏_{m=1}^{M} x_m^{α_{q,m}}, b_q · ∏_{n=1}^{N} z_n^{β_{q,n}}) = t } .

(Note that the openings of the commitments cannot be extracted.) Our notation may be of independent interest.

2.5.6 Randomizable Non-Interactive Proofs

We consider a proof system with an additional algorithm RandProof. The basic idea is that RandProof takes a proof π for instance y in relation R, and produces a randomized proof of the same statement. The resulting proof must be indistinguishable from a new proof of the same statement. We allow the adversary to choose the instance y, the proof π that is used as input for RandProof, and the witness w that is used to form a new proof of the same statement. More formally:

Definition 2.5.11 We say that (Setup, Prove, VerifyProof, RandProof) constitute a randomizable proof system if the following property holds. For all p.p.t. A1, A2 there exists a negligible function ν such that:

| Pr[params ← Setup(1^k); (y, w, π, state) ← A1(params);

π0 ← Prove(params, y, w); π1 ← RandProof(params, y, π);

b ← {0, 1}; b′ ← A2(state, πb) :

(y, w) ∈ R ∧ VerifyProof(params, y, π) = 1 ∧ b = b′] − 1/2 | = ν(k) .

Randomization is perfect if ν(k) = 0.

Remark 2.5.12 Note that the existence of RandProof implies witness indistinguishability. However, randomization is a much stronger property. We will create an algorithm RandProof for the Groth-Sahai proof system; this is the only proof system that we know to be randomizable.

Instantiating a randomizable proof system. Randomization is a fundamentally new property. It is not clear how one might randomize proofs in any of the existing NIZK proof systems [BDMP91, KP98, FLS99]. The one exception is the recent proof system of Groth and Sahai [GS08] (which is an extension of [GOS06b]). Groth and Sahai give a witness-indistinguishable (and in certain cases zero-knowledge) proof system that lets us efficiently prove statements in the context of groups with bilinear maps. We will show that the Groth-Sahai proof system is indeed randomizable, by constructing the appropriate RandProof function.

Independently of us, Fuchsbauer and Pointcheval [FP09] showed that the subgroup-hiding variant of the GS proof system is randomizable.

Lemma 2.5.13 Groth-Sahai proofs are randomizable.

Proof sketch. RandProof gets as input an instance with the a_q, b_q, t, α_{q,m}, β_{q,n} values as well as the proof

[(π_1, . . . , π_I, ψ_1, . . . , ψ_J), Π] .

Π contains the internal commitments C_1, . . . , C_M and D_1, . . . , D_N.

The algorithm first chooses randomization exponents (r_{1,1}, . . . , r_{M,I}) and (s_{1,1}, . . . , s_{N,J}) at random from Zp. It then rerandomizes the commitments C_m and D_n to C′_m = C_m · ∏_{i=1}^{I} u_i^{r_{m,i}} and D′_n = D_n · ∏_{j=1}^{J} v_j^{s_{n,j}}. Then it computes s_{q,i} = ∑_{m=1}^{M} r_{m,i} · α_{q,m}, z_{q,j} = ∑_{n=1}^{N} s_{n,j} · β_{q,n}, C_q ← μ1(a_q) · ∏_{m=1}^{M} C_m^{α_{q,m}}, and D′_q ← μ2(b_q) · ∏_{n=1}^{N} (D′_n)^{β_{q,n}}. Next, the prover sets

π′_i ← π_i · ∏_{q=1}^{Q} (D′_q)^{s_{q,i}} and ψ′_j ← ψ_j · ∏_{q=1}^{Q} (C_q)^{z_{q,j}} .

These π′_i and ψ′_j will satisfy the verification equation for the new commitments.

Now the prover must make a certain technical step to fully randomize the proof. Intuitively, for every set of commitments, there are many proofs (π_1, . . . , π_I, ψ_1, . . . , ψ_J) that can satisfy the verification equation. Given one such proof, we can randomly choose another: the prover chooses t_{i,j}, t_h at random, and multiplies each

π_i := π_i · ∏_{j=1}^{J} v_j^{t_{i,j}} and each ψ_j := ψ_j · ∏_{i=1}^{I} u_i^{∑_{h=1}^{H} t_h · η_{h,i,j} − t_{i,j}} .

See [GS08] for a detailed explanation of this operation.

The algorithm outputs the new proof [(π′_1, . . . , π′_I, ψ′_1, . . . , ψ′_J), Π′] where Π′ contains the internal commitments C′_1, . . . , C′_M and D′_1, . . . , D′_N.

Composable proofs. Groth and Sahai show that their proofs satisfy composable witness indistinguishability and in some cases composable zero-knowledge. To simplify our definitions and proofs, we will require that the randomizability property also be composable.

In the same spirit, we define composable randomizability as a randomizability property that holds even when the distinguisher is given the simulation trapdoor sim. We do not strictly need composability for our application, but it will make the definitions and proofs much simpler. As the randomization of GS proofs also holds for simulation parameters, and GS proofs are composable, so is their randomization.

Malleable proofs and randomizable commitments. For our applications simply randomizing the proof is not sufficient; we also need to randomize (anonymize or pseudonymize) the statement that we are proving. Thus we require some limited malleability of the instance being proven. We model this by a family of transformations {Y_s, P_s}_{s∈S} that transform the instance and the proof respectively (in our construction, S will be the set of all possible commitment openings). For all (y, π) and all s ∈ S, we require that if π is a valid proof for y, then P_s(π) is a valid proof for Y_s(y). More formally:

Definition 2.5.14 We say that (Setup, Prove, VerifyProof, RandProof, {Y_s, P_s}_{s∈S}) constitute a Y-malleable randomizable proof system, if for all p.p.t. A1, A2 there exists a negligible ν such that:

Pr[params ← Setup(1^k); (y, π, s, state) ← A1(params) :

VerifyProof(params, y, π) = 1 ∧ VerifyProof(params, Y_s(y), P_s(π)) = 0] = ν(k) .


Note that this offers no guarantee on the resulting proof besides that it will be accepted by the verification algorithm. However, if we also apply our randomization procedure to the resulting proof P_s(π), then we can be certain that the result will be indistinguishable from a random fresh proof for Y_s(y).

Now we will see how this applies to Groth-Sahai proofs. The above description shows that Groth-Sahai proofs can be used to prove the existence of a solution to a set of pairing product equations. However, what turns out to be even more useful is that Groth-Sahai proofs can be used to prove that the values in a given set of commitments form a solution to a specific set of pairing product equations. In this case, the commitments are not part of the proof, but part of the instance y. Thus, if we rerandomize a proof to make it unlinkable to a previous proof, we probably also want to randomize the commitments that are part of the instance.

More formally, let Com(params, ·, ·) : X × Open → {0, 1}* be a non-interactive commitment scheme. X denotes the input domain of the commitment, and Open denotes the domain for the randomness (both may depend on params). We require that Open is an efficiently samplable group with efficiently computable ‘0’ element and ‘+’ and ‘−’ operations.

The prover wants to show that some system of equations Equations holds for the values committed to in commitments C_1, . . . , C_ℓ. The instance is y = (C_1, . . . , C_ℓ, y_1, . . . , y_m), and the witness for that instance is w = (x_1, open_1, . . . , x_ℓ, open_ℓ, x_{ℓ+1}, . . . , x_n), where (x_i, open_i) is the opening of commitment C_i. The relation is

R = { (params, y, w) | C_1 = Com(params, x_1, open_1) ∧ . . . ∧ C_ℓ = Com(params, x_ℓ, open_ℓ) ∧ Equations(params, x_1, . . . , x_n, y_1, . . . , y_m) } .

A proof system supports randomizable commitments if there exist efficient algorithms Y and P that on input s = (open′_1, . . . , open′_ℓ), y = (C_1, . . . , C_ℓ, y_1, . . . , y_m), and π ← Prove(params, y, w) output a mauled proof π′ ← P(s, π) for instance (C′_1, . . . , C′_ℓ, y_1, . . . , y_m) ← Y(s, y) with C′_i = Com(params, x_i, open_i + open′_i), and if Y and P fulfill the malleability requirements of Definition 2.5.14.

Lemma 2.5.15 The Groth-Sahai proof system is malleable with respect to the randomness in the commitments.

The proof is analogous to the proof of randomization, except that now the external commitments are randomized according to s = (open′_1, . . . , open′_ℓ).

Remark 2.5.16 To simplify notation, in describing our application to delegatable credentials we will refer to a RandProof algorithm that takes s = (open′_1, . . . , open′_ℓ) as input, applies the appropriate function P_s, and then runs the randomization algorithm. (The commitments C_1, . . . , C_ℓ are now part of the instance rather than the proof, so our randomization algorithm should not affect them. Thus, in the randomization procedure described above, we will simply set the randomization exponents to 0 for these commitments.)
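The instance transformation Y_s can be sketched with a toy homomorphic commitment (hypothetical scheme and parameters, not the GS commitments themselves): each commitment in the instance is multiplied by a commitment to 1 under the corresponding opening update, while other instance parts are left unchanged.

```python
# Sketch of Y_s: rerandomize the commitments in an instance by the
# opening updates s = (open'_1, ..., open'_l). Toy commitment scheme.
p, g, h = 23, 4, 9

def commit(X, opening):              # multiplicatively homomorphic
    return (pow(g, opening, p), (X * pow(h, opening, p)) % p)

def Y(s, instance):
    """Multiply the first len(s) instance entries by Com(1, open'_i)."""
    updated = [((c0 * pow(g, d, p)) % p, (c1 * pow(h, d, p)) % p)
               for (c0, c1), d in zip(instance, s)]
    return updated + instance[len(s):]

# C'_i = Com(params, x_i, open_i + open'_i); the public part is untouched.
instance = [commit(13, 5), commit(7, 2), "public value"]
assert Y([2, 3], instance) == [commit(13, 7), commit(7, 5), "public value"]
```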


Part I

Anonymous Credential Related Schemes


Chapter 3

Introduction to Part I

In our transaction model we have identified Facet 1: Negotiation and authorization as an important concern of identity management systems. Parties negotiate which information to release to each other, exchange credentials to establish the trustworthiness of this information, and authorize the transaction depending on which credentials were successfully presented. We pay particular attention to electronic transactions that involve two parties: a user, and a service provider. This is a common setting for many e-business, e-government, and e-health services. On the internet users usually have to register for a service before requests are authorized. During registration a user is asked to reveal personal data about himself to the organization.1 A variety of identity management systems have been proposed that try to simplify this process. These systems may provide additional features to organizations and users.

Traditionally, identity management systems (IMS) focus on controlling access to the resources of an organization. Thus, access control based on identification, entity authentication, and authorization of users (e.g., clients, employees, partners) is one of the key functionalities of identity management from the perspective of the organization. During identification the user provides an identifier. Authentication is based on (non-anonymous) credentials that assure that the identifier belongs to the user requesting the resource. Examples of such identifiers and credentials are username-password pairs and client-side public key certificates.

Traditional IMS neglect the users’ need to control their personal data. Moreover, as the number of online transactions increases, the effort for the user to manage his identifiers and credentials quickly spirals out of control, if each organization

1Note that authorization is two-sided. The user implicitly gives authorization by proceeding with the transaction.


[Figure 3.1 diagram: User, Identity provider, and Service provider connected by arrows 1: Request, 2: Redirect/Authenticate, 3: Redirect back, 4: Get attributes]

Figure 3.1: Identity federation: A user (1) requests a service, (2) gets redirected to the identity provider (where he is either already authenticated or authenticates on demand), and (3) gets redirected back with an assertion. Optionally, (4) the service provider may request further attributes.

uses its own identity management solution. Thus, these systems are unsatisfactory from a privacy point of view.

User-centric identity management allows users to interact with different organizations using the same user interface. Standardization and development efforts that are illustrative of this change in focus include the open source project Higgins [hig09], Microsoft CardSpace [car09], the Web Service standards [web09], the Liberty Alliance [lib09], and the OpenID Foundation [ope09]. These projects have in common that they rely on the identity federation model. As schematized in Figure 3.1, identity providers take over the tasks of authenticating the user, managing his attributes, and providing this information to service providers. This solution, however, is primarily focused on identity management and less on privacy preservation [AA06]. It typically does not protect the privacy of the user towards his identity providers, as they can collect a lot of information about the user and his interactions with service providers. Consequently, these systems are still unsatisfactory from a privacy point of view.

A more privacy friendly solution restricts the role of identity providers to the certification of personal data through the issuing of anonymous credentials. With the help of an identity management client, a user interacts with credential issuers and credential verifiers. The client provides an adequately designed user interface and supports anonymous credential protocols to prove attributes and access rights towards services, all without having to reveal the user’s identity. Such a privacy enhanced identity management solution was put forward by the PRIME project [pri09a] and is explored in [ACK+10]. It can be summarized through its


guiding principles:

Start from maximum privacy. This is a fresh pseudonym with an empty profile about which the organization initially has no information. (A profile is a collection of information about a user that is linked to an identifier or a pseudonym.)

User control. The user is in control of his data. He controls which data is released to whom for which purpose. The user also controls the linkability of information to pseudonyms, i.e., whether it is written into a new empty profile or added to the profile of an existing pseudonym.

Attribute-based access control. Organizations do not need to uniquely identify a user in order to grant or deny access; instead they use certified attributes and/or entitlements as the basis for their access control decisions. Attribute-based access control as defined here is independent of the concrete access control policies that will be implemented by individual systems. For example, one can implement role-based access control and mandatory access control by adding a role attribute and a confidentiality level attribute, respectively.

Explicit policies. The use of pseudonyms, personal data, and anonymous credentials by users and organizations is controlled by explicit policies. This makes it possible to assess the compliance of policies with data protection legislation.

As these technologies are novel, we briefly discuss the socio-political and technological environment in which they would need to be deployed. We then narrow our scope and focus on anonymous credentials.

Discussion. In today’s internet, and given the political climate, it is a challenge to start from maximum privacy. It has been argued that governments do not sufficiently support the use of anonymization techniques and other privacy enhancing technologies out of fear that criminals can abuse anonymity. What is more, in an attempt to further restrict anonymity on the internet, many governments enforce the retention of traffic data [eu-06b]—data that could fall into the wrong hands. It is important to recognize that this ignores the real problem, as most technologically savvy criminals can be anonymous despite such measures by using open wireless networks and compromised machines. The latter could be organized into so called ‘botnets’, collections of software agents on compromised machines that run autonomously and automatically. Moreover, criminals can move to or tunnel their communication through countries with favorable legal systems.2

The state of the art of anonymous communication and traffic analysis is important for understanding the degree of privacy that can or cannot be achieved with the

2While being aware that this is a political rather than scientific statement, it is the conviction of the author that advanced identity management systems with strong privacy measures are not contradictory but, on the contrary, essential for stopping the rise of cybercrime.


help of privacy-enhanced identity management systems. Traffic analysis examines data about the time and duration of communication to determine the detailed shape of the communication streams, the identities of the parties communicating, and what can be established about their locations [DC08]. Identity management can also be done over a multitude of different ‘channels’, e.g., it could be done at the point of sale using a smart card or a mobile device, which would remove the need for long-distance communication altogether. Unique identifiers such as MAC addresses, device identifiers, and even the physical signal characteristics of a device need to be considered. As in security, the attacker will target the weakest link to undermine a user’s privacy.

The privacy enhancing mechanisms proposed in this thesis operate at the application level, and in any case the analysis of the overall privacy provided by a real-world information system needs to be done on a system per system basis. Thus, for the purpose of this thesis, traffic analysis is out of scope and we assume sufficient protection. (More information on anonymity techniques can be found in [DDS09].) In the absence of a perfect anonymization system, we advocate a cautiously optimistic rather than a fatalistic attitude: even given a less than perfect communication channel, privacy-enhancing identity management systems provide some data minimization.

Anonymous credentials and the certification of personal data. It has been shown (e.g., [KM99]) that extensive e-commerce systems and electronic legal frameworks can be built on non-anonymous certification systems.3

A public key certificate is an important type of digital credential. As already discussed in Section 1.2, a public key certificate is a digital signature on a list of attribute-value pairs (att1 = a1, . . . , attn = an) that was created by a credible authority that vouches for the information. For example, the personal data of a user may consist of attribute-value pairs such as

(name= “Alice”, bdate= “1969-08-15”, address= “15 A Street, Sometown”, . . . ) .

However, as we have discussed, the disadvantages of public key certificates are that they reveal all attribute values in every show, and that transactions that use them for authorization can always be linked to each other and to the user’s real identity (or a long-term pseudonym in the case of a pseudonym-certificate).

There are many challenging open questions concerning the design of a privacy enhanced identity management system. In this thesis, we focus on the use of

3The importance of such a system for small and medium size businesses is stressed by Kleist [Kle04]: “Firms that tend to do one-time, non-repetitive exchanges may benefit more from establishing a strong electronic commerce infrastructure of trust than those firms that have long term, well-established and unchanging relationships with other firms or customers.”


anonymous credentials as a means to avoid application-level linkability, while providing the certification mechanisms required for establishing trust.

An anonymous credential is also a digital signature on an ordered list of attribute-value pairs. An anonymous credential protocol allows the owner of a credential (the user) to selectively show a subset of the attribute values to a verifier. (In fact, it is possible to prove predicates about attribute values without revealing the values themselves.) This is done in a protocol that convinces the verifier that the user has a valid credential with appropriate attribute values. The properties of the protocol guarantee that the user cannot claim attribute values for which he does not have credentials. At the same time the verifier does not learn anything more about the credential besides the selected attributes (or predicates). In particular the protocol does not reveal the value of the signature. Instead it uses a randomized version of the signature that looks different for every show. This guarantees that even if the issuer and all verifiers collude, they cannot link the issuing to a show transaction or two show transactions to each other, unless the attribute data revealed about the user allows for such linkability. For instance, if anonymous credentials are used as online newspaper subscriptions, anonymous accesses by different users cannot be linked based on the information revealed about their credentials. However, if only a single user has a specific type of subscription no anonymity is afforded.

Outline of this chapter. In Section 3.1 we present the basic functionality of anonymous credential systems. At the same time we give a language for expressing advanced features of anonymous credentials. These advanced features are introduced in the following sections: Section 3.2 discusses anonymous credential systems with pseudonyms, a combination that is sometimes also referred to as a pseudonym system. Section 3.3 discusses mechanisms to enforce spending limits for anonymous credential systems. Finally, in Section 3.4 we introduce delegatable anonymous credentials.

The chapter concludes with a summary of the contributions of Chapters 4–7 in Section 3.5.

3.1 Basic Anonymous Credential System

In [ACK+10] we describe the requirements that an anonymous credential system has to fulfill on an abstract level. The notation cred^pk is used to denote a credential (i.e., the signature) issued by an authority with public key pk. The list of attribute names (att1, . . . , attn) and the meta-information that describes the semantics attached to these attributes determine the type (cred type) of a credential.


Selective-show protocol
Common input: pk1, . . . , pk`, claim
User input: cred^{pk1}_1, . . . , cred^{pk`}_`, a_{i,j}
Verifier input: none
User output: none
Verifier output: accept/reject

Table 3.1: Input and output description of the selective-show protocol

Definition 3.1.1 (Credential attribute expression) If cred^pk is a credential signed under the public key pk of a credential type cred type that defines the attributes att1, . . . , attn, then cred type^pk[atti] (with 1 ≤ i ≤ n) is an expression that refers to attribute atti in cred^pk.

We introduce credential claims to express restrictions on the attribute values of a credential that a user needs to access a service, as follows:

Definition 3.1.2 (Credential claim) Given a public key pk and a list of Boolean conditions 〈condlist〉 over the attributes of the credential type cred type, a credential claim cred type^pk[〈condlist〉] refers to a credential of type cred type, signed under pk, whose attributes satisfy the restriction expressed by 〈condlist〉.

Example 3.1.3 The following credential claim states that the user possesses a credential of type identity card and that according to that credential he is older than 18. The function today() returns today’s date.

identity card^{pkG}[bdate < today() − 18Y]
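The plain semantics of this condition (ignoring the zero-knowledge machinery that keeps bdate hidden) can be sketched in Python. The helper name older_than and the leap-day handling are illustrative assumptions, not part of the claim language.

```python
from datetime import date

def older_than(bdate: date, years: int, today: date) -> bool:
    """Evaluate bdate < today - years, the plain meaning of bdate < today() - 18Y."""
    try:
        cutoff = today.replace(year=today.year - years)
    except ValueError:  # today is Feb 29 and the target year has no Feb 29
        cutoff = today.replace(year=today.year - years, day=28)
    return bdate < cutoff

# A holder born on 1969-08-15 satisfies the 18-year condition on 2010-03-01.
assert older_than(date(1969, 8, 15), 18, date(2010, 3, 1))
```

In an anonymous credential show the verifier of course learns only that this predicate holds, never the birth date itself.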

Selective-show protocol. A user can be in possession of credentials from multiple issuers. This reduces the trust that has to be put into a single issuer. Credentials from different issuers can be shown together, e.g., Alice could use one credential to prove her subscription and another credential to prove her age. The selective show protocol supports advanced features that make it possible to prove complex claims about the attribute values of a list of credentials cred^{pk1}_1, . . . , cred^{pk`}_`.

Let cred^{pki}_i certify the ni attributes (att_{i,1}, . . . , att_{i,ni}). Claims are expressed as a predicate claim over all attributes of cred^{pk1}_1, . . . , cred^{pk`}_`. The predicate can be composed using ∧ and ∨ and consists of credential claims and linear terms of credential attribute expressions that are compared using the relations “=”, “<”, and “>”. We give a Backus-Naur Form (BNF) grammar for such a predicate language in Table 3.2. This is an adapted version of the original proposal in [ACK+10]. The


〈claim〉 → 〈cred type〉[〈condlist〉] | 〈exp〉 〈relational〉 〈exp〉 | 〈claim〉〈logic〉〈claim〉 .
〈cred type〉 → 〈identifier〉〈pk〉 | 〈identifier〉〈nym〉 .
〈condlist〉 → 〈cond〉 | 〈cond〉 , 〈condlist〉 .
〈cond〉 → 〈attr name〉 〈relational〉 〈exp〉 | 〈nymder〉 | 〈serialder〉 | 〈escrowshare〉 .
〈exp〉 → 〈cred type〉[〈attr name〉] | 〈string〉 | 〈number〉 | 〈var〉 | 〈number〉*〈exp〉 | 〈exp〉+〈exp〉 .
〈relational〉 → = | ≥ | < | > | ≤ .
〈logic〉 → ∧ | ∨ .
〈identifier〉 → The encoding of an identifier .
〈pk〉 → A unique hash of the public key of the credential issuer .
〈nym〉 → A unique hash of the pseudonym of the credential issuer .
〈attr name〉 → 〈identifier〉 .
〈number〉 → The encoding of an integer number .
〈string〉 → The encoding of a string .
〈nymder〉 → NymDer( 〈pseudonym〉 , 〈attr name〉 ) .
〈serialder〉 → SerialDer( 〈serial〉 , 〈attr name〉 , 〈context〉 , 〈limit〉 ) .
〈context〉 → 〈string〉 .
〈limit〉 → 〈number〉 | 〈var〉 .
〈escrowshare〉 → EscrowShare( 〈escrow〉 , 〈serial〉 , 〈exp〉 , 〈attr name〉 , 〈context〉 , 〈limit〉 ) .

Table 3.2: Backus-Naur form grammar for credential claims

predicates for 〈nymder〉, 〈serialder〉, 〈escrowshare〉, and 〈nym〉 will be explained when introducing advanced credential features in Sections 3.2 to 3.4. A selective show of a predicate claim takes inputs and produces outputs as described in Table 3.1. The user’s input to the protocol are the credentials cred^{pki}_i and the corresponding secret attributes. After the protocol the verifier learns nothing but the truth of the claim (accept/reject).

Example 3.1.4 When purchasing wine Alice may have to reveal her address and her credit card information. Moreover, she may have to prove that she is older than 18. This can be expressed by a claim about two credentials of the following form. The first credential is of credential type identity card, the second of type credit card.

identity card^{pkG}[bdate < today() − 18Y, address = “15 A Street, Sometown”]

∧ credit card^{pkB}[cardnr = 123456, exp > today()]

∧ credit card^{pkB}[name] = identity card^{pkG}[name]


As shown in the example, anonymous credentials can also be used for data intensive transactions. The credential show above, however, still hides the age, the name, and the gender of Alice. Multiple transactions that involve the same address or the same card number would however be linkable.

A more privacy friendly solution would use e-cash and can be expressed using the following claim. (The advanced features used in this example will be explained in Section 3.3.)

identity card^{pkG}[bdate < today() − 18Y, address = “15 A Street, Sometown”]

∧ e-coin wallet^{pkB}[SerialDer(S, seed, ε, J), EscrowShare(ess, S, 〈owner〉, seed, ε, J)]

∧ 0 < J ≤ e-coin wallet^{pkB}[size]

∧ identity card^{pkG}[name] = e-coin wallet^{pkB}[owner]

Blind-issue protocol. Credential systems also offer a blind issue protocol. A user can choose some of the values to be included in a credential without revealing them to the issuer. The user can prove claims about the values he chose with respect to attribute values of other credentials. Let cred^{pk1}_1, . . . , cred^{pk`}_` be credentials the user already owns and let cred^{pk`+1}_{`+1} be the newly obtained credential on attributes (att_{`+1,1}, . . . , att_{`+1,n`+1}). The blind issuing of a new credential whose attributes fulfill the claim over the attributes of all ` + 1 credentials takes the inputs and produces the outputs described in Table 3.3.

The user’s input to the protocol are the credentials cred^{pki}_i and the corresponding secret attributes. The attributes (att_{`+1,j} = a_{`+1,j})_{j∈D}, D ⊂ {1, . . . , n_{`+1}}, are public; while the attributes (att_{`+1,j} = a_{`+1,j})_{j∉D} are the user’s secret input into the protocol. After the protocol, the issuer only learns whether he successfully issued a credential that fulfills the predicate claim. The user obtains the new credential cred^{pk`+1}_{`+1}.

Constructing anonymous credentials from CL-signatures. Anonymous credentials can be built using CL-signatures (cf. Section 2.4.3) and zero-knowledge proofs that make it possible to prove statements about values in commitments.

The credential selective-show protocol runs one CL-signature proof protocol per credential cred^{pk1}_1, . . . , cred^{pk`}_`. Each CL-signature proof protocol proves that the user possesses a signature under the issuer’s public key on values contained in commitments. The claim is then proven using zero-knowledge proofs about the content of these commitments.


Blind-issue protocol
Common input: pk1, . . . , pk`+1, claim, (att_{`+1,j} = a_{`+1,j})_{j∈D}
User input: cred^{pk1}_1, . . . , cred^{pk`}_`, a_{i,j}, (att_{`+1,j} = a_{`+1,j})_{j∉D}
Issuer input: sk_{`+1}
User output: cred^{pk`+1}_{`+1}
Issuer output: accept/reject

Table 3.3: Input and output description of the blind-issue protocol

The credential blind-issue protocol makes use of the CL-signature issue protocol, which allows the user to obtain a signature on committed values. First the user runs one CL-signature proof protocol per credential cred^{pk1}_1, . . . , cred^{pk`}_`. Then he provides commitments to the attribute values of the new credential cred^{pk`+1}_{`+1}. The claim is proven using zero-knowledge proofs about the content of the commitments for the attributes of the old credentials and the new credential. Using the CL-signature issue protocol the user obtains a signature cred^{pk`+1}_{`+1} on attribute values that are partially hidden from the credential issuer.

3.2 Pseudonym Systems

A pseudonymous identification scheme allows a user to derive multiple pseudonyms nym from a single master secret skU. This is achieved through a randomized NymDer(skU) algorithm. The user can then authenticate himself through a pseudonymous identification protocol (PIP) by proving that he knows the master secret underlying a pseudonym nym. A PIP scheme is secure against impersonation attacks if no user can authenticate himself under a pseudonym without knowing the underlying master secret, and unlinkable if no one can tell whether two pseudonyms were derived from the same master secret or not. A user only has to protect a single master secret in order to protect the logins to multiple services.
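One common instantiation of NymDer, in line with the commitment-based construction from CL-signatures discussed later in this section, makes each pseudonym a fresh Pedersen-style commitment to skU. The following Python sketch illustrates that idea under that assumption; the toy modulus and generators are placeholders, not parameters from the thesis.

```python
import secrets

p = 2**61 - 1          # toy prime modulus (real schemes use vetted groups)
g, h = 3, 7            # assumed independent generators

def nym_der(sk_u: int) -> tuple[int, int]:
    """Derive a fresh pseudonym nym = g^sk_U * h^r mod p for random r."""
    r = secrets.randbelow(p - 1)
    nym = pow(g, sk_u, p) * pow(h, r, p) % p
    return nym, r      # r stays secret; it is needed to prove ownership

sk_u = secrets.randbelow(p - 1)
nym1, r1 = nym_der(sk_u)
nym2, r2 = nym_der(sk_u)
# Both pseudonyms commit to the same master secret, yet without r1 and r2
# the hiding property of the commitment makes them unlinkable.
assert nym1 == pow(g, sk_u, p) * pow(h, r1, p) % p
assert nym2 == pow(g, sk_u, p) * pow(h, r2, p) % p
```

Proving knowledge of (sk_u, r) behind a pseudonym, without revealing either, is then a standard zero-knowledge proof of a commitment opening.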

One issue with credential systems that allow for the simultaneous show of multiple credentials is that multiple users could combine their credentials to obtain access rights that none of them can obtain on his own. Countermeasures against such an attack are said to guarantee credential consistency. Pseudonymous identification protocols that can be efficiently combined with the issuing and show protocols of anonymous credential systems are of particular relevance to identity management, as they make it possible to achieve credential consistency in an elegant way. Some authors see pseudonymous identification as an essential part of anonymous credentials. This is sound in theory as the key skU provides a safe anchor for binding a credential to a specific user. As not all uses of anonymous credentials require


pseudonyms, we instead refer to the combination of a pseudonymous identification protocol and an anonymous credential system as a pseudonym system. As depicted in Figure 3.2, a pseudonym system allows a user to obtain, and later on use, credentials under different unlinkable pseudonyms. The first ideas for such a system can be traced back to David Chaum’s influential article [Cha85]. Lysyanskaya et al. [LRSW99] gave the first formal definitions of a pseudonym system and a theoretical construction that fulfills these definitions. Camenisch and Lysyanskaya [CL01] presented the first efficient construction of a pseudonym system.

[Figure 3.2 diagram: User with pseudonyms nym_U^(I) and nym_U^(S), Credential issuer (1: Get credential), Service provider (2: Show credential)]

Figure 3.2: Pseudonymous system: A user (1) obtains an anonymous credential under pseudonym nym_U^(I) and (2) can show it under the unlinkable pseudonym nym_U^(S). The anonymous credential protocols make it possible to blindly obtain and selectively prove attributes.

In a pseudonym system, the master secret skU becomes one of the attributes of the anonymous credential (usually the first attribute att1 is used).4 The selective show of the credential system is extended with an additional predicate NymDer(nym, att1) that states that nym was derived using the master secret in attribute att1.

If the NymDer predicate is used in an anonymous credential show, the credential is said to be shown with respect to the pseudonym nym. This makes it possible to implement a variety of features. One of them is credential consistency. If the user creates a fresh pseudonym nymt and shows multiple credentials with respect to the same nymt, the verifier is guaranteed that all credentials were issued to the same user.

4The blindness property of the blind issuing protocol guarantees that the issuer does not learn skU during the issuing.


This avoids attacks in which a group of users combine their credentials to obtain privileges that no single group member could obtain on his own.

As a related feature, a master secret as a common attribute in all credentials hinders credential sharing/pooling and credential theft. If the signature is of a form that does not reveal information about the messages that were signed, then a credential can be protected by storing the master secret in a trusted hardware device.5 In this way the credential cannot be shared or stolen unless the hardware device is shared or stolen.

Relation to other credential features. Pseudonymous identification is related to two other credential features that we will discuss in the following two sections.

The first feature is another measure for discouraging the abuse of credentials that we will introduce in the next section: limited spending. For instance, one might want to enforce policies such as “Users cannot spend more than 5000 e-coin credentials at each merchant per month” to hinder money laundering. The unique master secret defines the ownership of an anonymous credential in a clean way: “The owner of a credential is whoever knows the master secret skU encoded into it”. This makes it possible to distinguish between anonymous credentials of different users. Camenisch et al. [CHL06] show how to implement the above policy.

The second related feature is the pseudonymous delegation of credentials. Pseudonymously delegatable credentials as introduced in Section 3.4 allow a delegator Bob to delegate a credential received from Alice to a delegee Carol. In pseudonymous delegation, Alice and Carol only know Bob under the pseudonyms nym_B^(A) and nym_B^(C) respectively. The information revealed during the delegation protocol should not allow anyone to link Bob’s pseudonyms, not even Alice and Carol. Similarly, Bob may only know Alice and Carol under pseudonyms nym_A^(B) respectively nym_C^(B).

Constructing pseudonym systems from CL-signatures. Pseudonym systems are an immediate consequence of CL-signatures (for background see Section 2.4.3). Alice registers pseudonyms nym_A^(B) and nym_A^(C) with Bob and Carol. The pseudonyms nym_A^(B) and nym_A^(C) are commitments to her secret key skU, and are unlinkable by the security properties of the commitment scheme. Suppose Alice wishes to obtain a credential from Carol and show it to Bob. Alice goes to Carol and identifies herself as the owner of pseudonym nym_A^(C). They run the CL-signature Issue protocol as a result of which Alice gets Carol’s signature on her secret key. Now

5The signature schemes that are used for building anonymous credentials generally fulfill this property through randomizing the signature. The randomization value should also be stored on the trusted hardware device.


Alice uses the CL-signature proof protocol to construct a zero-knowledge proof that she has Carol’s signature on the value in commitment nym_A^(B). The latter is an implementation of the NymDer(nym_A^(B), skU) predicate.

3.3 Limited Spending

Certain applications, such as e-cash or e-coupons, require that the number of times that a credential can be shown anonymously is limited. For instance, a credential representing a wallet of n coins can be shown n times. A user can nevertheless attempt to use a credential more often. This is always possible as digital data can be arbitrarily reproduced. For this case we require mechanisms that make it possible to detect overspending and, if necessary, to obtain an escrow. The escrow is certified information about the user that is hidden until overspending occurs. Only then can it be obtained to reveal for instance the user’s identity or her bank-account number.

In addition to enabling applications such as e-cash and e-coupons, restricting the number of times a credential can be shown in a certain context is an important security precaution against the sharing and theft of credentials. With context-dependent limited spending we mean that given a concrete context, e.g., a time and place such as “at verifier X on January 1st, 2009”, the credential can only be shown a limited number of times. Legitimate anonymous shows from different contexts are, however, always unlinkable. Applications such as e-cash can be seen as a special case of context-dependent limited spending that has only one global context.

Cryptographic serial numbers. Technically, the limited spending of anonymous credentials is implemented using cryptographic serial numbers. A cryptographic serial number looks like a random number, but is in fact deterministically derived from a unique seed in a credential, the spending context, and the number of times that the credential has already been shown in this context. This determinism guarantees that for each credential there can only exist up to the spending limit many different serial numbers per context. If a user wants to use a credential more often she is forced to reuse one of these serial numbers, which in turn can be detected.

We extend our claim language with a predicate that ensures the correctness of a cryptographic serial number S.

Definition 3.3.1 (Cryptographic serial numbers) Condition SerialDer(S, att, context, limit) refers to S being one of the limit valid serial numbers for context context and seed att.
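A minimal sketch of this determinism follows, with HMAC-SHA256 standing in for the algebraic pseudorandom functions that the actual schemes evaluate inside zero-knowledge proofs; the function name serial_der and all parameter values are illustrative assumptions.

```python
import hashlib
import hmac

def serial_der(seed: bytes, context: bytes, counter: int, limit: int) -> bytes:
    """Derive the counter-th of `limit` serial numbers for this context.

    The serial looks random but is a deterministic function of
    (seed, context, counter), so at most `limit` distinct serials
    exist per context.
    """
    assert 0 <= counter < limit, "spending limit exhausted for this context"
    msg = context + b"|" + counter.to_bytes(4, "big")
    return hmac.new(seed, msg, hashlib.sha256).digest()

seed = b"unique-credential-seed"
ctx = b"verifier X, 2009-01-01"
# Only `limit` distinct serials exist per context; a fourth show in this
# context forces a reuse, which the verifier can detect.
serials = {serial_der(seed, ctx, j, 3) for j in range(3)}
assert len(serials) == 3
assert serial_der(seed, ctx, 0, 3) in serials   # derivation is deterministic
```

The cryptographic schemes additionally prove, in zero knowledge, that a released serial was derived correctly from the certified seed without revealing the seed or the counter.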


Several anonymous credential schemes and related protocols, such as anonymous e-cash, realize some form of cryptographic serial numbers (e.g., [TFS04, BCC04, NSN05, CHL05, DDP06, CHK+06]).

Cryptographic serial numbers restrict the unlinkability of overspent anonymous credential shows, but a malicious anonymous user can still try to get away with showing the same serial number multiple times. The server is supposed to maintain a database with spent serial numbers. If the shown number already occurs in the database, then the credential is clearly being overspent, so the server can refuse to accept the credential.

Cryptographic escrow. In some situations, however, checking the serial number in real time against a central database is impossible. For example, spending could occur at thousands of servers at the same time, so that the central database would become a bottleneck in the system, or spending could occur offline. In these cases, the server cannot refuse access when a credential is being overspent, but needs a way to detect overspending after the fact, and a way to deanonymize fraudulent users.

Anonymous credentials again offer a solution. When showing a credential, the user can give a piece of (certified) identity information in escrow, meaning that this identity information is only revealed when overspending occurs. She does so by releasing an escrow share at each spending. Such an approach was first proposed by Franklin and Yung [FY93]. If two escrow shares for the same serial number are combined they reveal the embedded identity information, but a single share does not leak any information.

To avoid that a malicious user reveals the same escrow share twice, escrow shares have to be computed with respect to a unique nonce that is part of the share. The verifier of the anonymous credential is responsible for checking that this value is globally unique. One way of guaranteeing this is to make sure that the nonce is verifier dependent and time dependent. In addition, the verifier can keep a small cache of already used nonces for a certain time interval. Another option is for the verifier to send a random nonce as a challenge before the credential show.
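The two-shares-reveal, one-share-hides property can be sketched as points on a secret line over a prime field, in the spirit of the Franklin-Yung approach cited above. The field size and the way the blinding value is tied to a serial number are illustrative assumptions, not the thesis's exact construction.

```python
q = 2**61 - 1   # toy prime field (illustrative)

def escrow_share(identity: int, blind: int, nonce: int) -> tuple[int, int]:
    """Reveal the point (nonce, identity + nonce*blind mod q) on a secret line.

    `blind` is assumed fixed per serial number (e.g. derived from the
    credential seed), so two spends of the same serial lie on the same
    line. A single point leaks nothing about `identity`, since for any
    candidate identity there is a `blind` explaining the point.
    """
    return nonce, (identity + nonce * blind) % q

def recover_identity(share1, share2) -> int:
    """Two shares with distinct nonces determine the line's intercept."""
    (n1, t1), (n2, t2) = share1, share2
    blind = (t1 - t2) * pow(n1 - n2, -1, q) % q
    return (t1 - n1 * blind) % q

identity, blind = 123456, 987654321          # hypothetical embedded identity
s1 = escrow_share(identity, blind, 17)       # first spend, nonce 17
s2 = escrow_share(identity, blind, 42)       # overspend, fresh nonce 42
assert recover_identity(s1, s2) == identity
```

This also makes the role of the unique nonce concrete: if a cheating user could replay the same nonce, both shares would be the same point and the identity would stay hidden.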

In our policy language, the requirement to give a certified expression 〈exp〉 of the user's credential attributes in escrow is expressed as follows.

Definition 3.3.2 (Cryptographic escrow)
Condition EscrowShare(ess, S, 〈exp〉, att, context, limit) refers to ess being a valid escrow share with serial number S of the attribute expression 〈exp〉 for context context, spending limit limit, and seed att.

A subset of the anonymous credential schemes and related protocols that support cryptographic serial numbers also support cryptographic escrow, e.g., [NSN05, CHL05, DDP06, CHK+06].

Example 3.3.3 Alice could receive from her rich sister Alicia a gift credential that Alice can spend on buying 3 wine bottles. The gift credential is a credential under pkW, the wine shop's public key, containing attribute-value pairs

(patron = Alicia, seed = . . . , maxprice = 50).

When Alice wants to hand in her gift credential, e.g., to buy a 45 Euro wine bottle, she computes a serial number S and proves the claim

three gift credential_pkW[maxprice ≥ 45, SerialDer(S, seed, ε, 3)].

The wine shop checks that it has not received the same S before to verify the validity of the gift certificate. The empty string ε denotes the global context.

Example 3.3.4 An escrow-based gift credential scheme might be useful if partner shops that might not be constantly online should also accept the gift credential:

three gift credential_pkW[maxprice ≥ 45, EscrowShare(ess, S, 〈three gift credential_pkW[patron]〉, seed, ε, 3)].

If Alice overspends her gift credential, the escrow shares allow the retrieval of Alicia's name. Alicia would then be charged the price of the wine bottle, and the gift credential would be revoked.

3.4 Pseudonymously Delegatable Credentials

A delegatable credential (that has been delegated L − 1 times) is essentially a chain of digital signatures cred_1, . . . , cred_L that sign L ordered lists of attribute-value pairs A_1 = (att^1_1 = a_{1,1}, . . . , att^1_{n_1} = a_{1,n_1}), . . . , A_L = (att^L_1 = a_{L,1}, . . . , att^L_{n_L} = a_{L,n_L}). Each list A_i represents one delegation level. A credential authority that vouches for the information contained in the attributes creates the first signature, which certifies n_1 attributes of the level 1 user. The possession of the certified attributes bestows some rights on the user, which he may want to delegate to other users acting on his behalf. The role of the n_i attributes of the level i user is to restrict the rights of the delegee and to define the purpose of the delegation. Moreover, level i attributes allow the user at level i − 1 to certify additional attributes about the user at level i.

A simple example of a two-level delegation is that of a company and its employees. A credential authority certifies the name, sector and address of a company, e.g.

A_1 = (name = “Cryptonite”, sector = “IT security”, country = “USA”, address = “32 Main Street, Sometown”).

In turn, the company can use its own signing key to certify information about its employees, e.g.,

A_2 = (name = “Alice”, phone = “09876541”, position = “CEO”).

This discussion is still independent of any privacy considerations; this system could be implemented based only on a conventional public key infrastructure (PKI). Note that if a PKI is used, all of the certified information is revealed during each and every transaction. Two important aspects are instantiated differently for conventional delegation and anonymous delegatable credentials.

The first is the certification of signing keys. In the PKI case, it is natural to solve this by including the public key of the user at level i as one of his attributes in A_i. This public key can then be used to verify that the signature cred_{i+1} on A_{i+1} verifies correctly. This is the standard way of implementing a certification chain. For anonymous delegation we will make use of a pseudonymous identification scheme.

The second aspect is how to chain the signatures at the different levels together. For instance, in the example above, Alice should only be able to prove that she is the CEO of Cryptonite and not of some other company. A viable approach is to use a cryptographic hash function to hash all the information in A_i into a unique identifier, and add it to the list A_{i+1} that is to be signed. (The fact that the secret key used to sign A_{i+1} matches the public key added to A_i may not be sufficient, e.g., if the delegator possesses multiple credentials.) In anonymous delegation, the delegator has the choice to decide which of his attributes he wants to make available during delegation.
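The hash-chaining idea can be sketched as follows; the function and attribute names are ours, and SHA-256 merely stands in for a suitable cryptographic hash.

```python
# Sketch of chaining delegation levels: hash all of A_i into an
# identifier that is included in A_{i+1} before A_{i+1} is signed.
import hashlib

def level_id(attrs: dict) -> str:
    """Hash all attribute-value pairs of one delegation level into a
    unique identifier (canonical ordering avoids ambiguity)."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()

A1 = {"name": "Cryptonite", "sector": "IT security", "country": "USA"}
A2 = {"name": "Alice", "position": "CEO",
      "parent": level_id(A1)}   # binds Alice's certificate to Cryptonite's

other = {"name": "OtherCorp", "sector": "IT security", "country": "USA"}
assert A2["parent"] == level_id(A1)
assert A2["parent"] != level_id(other)  # Alice cannot claim OtherCorp's CEO role
```

In the anonymous setting this public hash link is replaced by a hidden link that the delegator can selectively open, as discussed next.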

Privacy-friendly certification of signing keys. In order to allow for a delegatable anonymous credential system with privacy-friendly certification of signing keys, we assume the existence of a pseudonymous identification system. In such a system, users, who are defined through their knowledge of their sk_U, can not only show credentials with respect to pseudonyms, but they can also delegate credentials to other users. Here we want to preserve the property that all users, when delegating and receiving credentials, only know each other under unlinkable cryptographic pseudonyms.

To simplify our model, we say that a delegatable credential system has only one type of participant: users. An originator O of a certain type of credential can register a pseudonym nym_O as its public key to act as credential authority. Users interact with other users, including authorities, using many different pseudonyms. Thus a user A can be known to authority O as nym_A^(O) and to user B as nym_A^(B). If authority O issues user A a credential for nym_A^(O), then user A can prove to user B that nym_A^(B) has a credential from authority O. We say that credentials received directly from the authority are level 1 credentials or basic credentials, credentials that have been delegated once are level 2 credentials, and so on. Thus user A can also delegate his credential to user B, and user B can then prove that he has a level 2 credential from authority O.

We speak of pseudonymously delegatable credentials because A enjoys the same anonymity properties for delegating credentials as for showing credentials. This means that a delegation transaction is unlinkable to other issue, delegate, and show transactions that involve the same credential. Users only know each other under unlinkable pseudonyms, and may reveal certified attributes about themselves. This is a very strong definition that can be restricted on demand using attributes.

Let cred_nymO be a level L credential from authority O with attributes (att^1_1, . . . , att^1_{n_1}, . . . , att^L_1, . . . , att^L_{n_L}) owned by a delegator. The user obtains a level L + 1 credential cred′_nymO on attributes (att^1_1, . . . , att^1_{n_1}, . . . , att^{L+1}_1, . . . , att^{L+1}_{n_{L+1}}).

As a convention, each attribute att^i_1, 1 ≤ i ≤ L, contains a user secret sk_{U_i}. The last such attribute, att^L_1, contains the master secret sk_{U_L} of the credential owner. A delegating user A proves the predicate NymDer(nym_A, att^L_1). A user B receiving the credential will obtain a level L + 1 credential for which he can prove NymDer(nym_B, att^{L+1}_1).

An attribute model for delegatable anonymous credentials. In a delegatable anonymous credential scheme, all parties can choose which attributes to reveal during issue, delegate, and show transactions.

As in the non-delegatable case, the information revealed about the credential owner's attributes during an issue or show transaction is best expressed as a claim over multiple credentials. The attribute model for delegation is, however, more restrictive. The delegator of a level L credential can choose a subset P_L ⊆ {2, . . . , n_L} of his attributes that should be revealed. (1 is excluded because sk_{U_L} can never be revealed.) For the receiving user these attributes are now public and need to be included in all delegations and shows of this credential (together with the attributes indicated by the sets P_i, 1 ≤ i < L, of previous delegations). The attributes of the receiving user can, however, be partially hidden from the delegator. The restrictions are once more expressed as a claim over multiple credentials.

Delegation protocol

Common input: nym_{O_1}, . . . , nym_{O_n}, nym_O, nym_A, nym_B, claim,
  (att^i_j = a_{i,j})_{j ∈ P_i} for i ∈ {1, . . . , L},
  (att^{L+1}_j = a_{L+1,j})_{j ∈ D}, D ⊂ {2, . . . , n_{L+1}}
User input: cred_1, . . . , cred_n; (att^{L+1}_j = a_{L+1,j})_{j ∉ D}
Delegator input: cred_nymO
User output: cred′_nymO
Delegator output: accept/reject

Table 3.4: Input and output description of the delegation protocol

In addition to the selective-show protocol and the blind-issue protocol, delegatable anonymous credentials offer a delegation protocol. See Table 3.4 for a description of its inputs and outputs. Let cred_1, . . . , cred_n be (potentially delegatable) credentials. The claim is about the secret attributes of these credentials and the new secret attributes (att^{L+1}_j = a_{L+1,j})_{j ∉ D} of the delegated credential. Here the attributes (att^{L+1}_j = a_{L+1,j})_{j ∈ D} are public. The attributes (att^{L+1}_j = a_{L+1,j})_{j ∉ D} are the user's secret input into the protocol. The delegation also reveals the pseudonym nym_O of the credential authority, as well as the attributes from the sets P_i, 1 ≤ i ≤ L, about the intermediary delegators and the original credential owner.

3.5 Summary of Contribution

Chapter 4: P-signatures. A shortcoming of CL-signature schemes is that the Prove protocol is interactive. Rounds of interaction are a valuable resource. In certain contexts, proofs need to be verified by third parties who are not present during the interaction. For example, in off-line e-cash, a merchant accepts an e-coin from a buyer and later deposits the e-coin at the bank. The bank must be able to verify that the e-coin is valid.

There are two known techniques for making the CL Prove protocols non-interactive. We can use the Fiat-Shamir heuristic [FS87], which requires the random-oracle model. A series of papers [CGH04, DNRS03, GK03] show that proofs of security in the random-oracle model do not necessarily imply security in the standard model. The other option is to use general techniques: [BFM88, DSMP88, BDMP91] show how any statement in NP can be proven in non-interactive zero-knowledge. This option is, however, prohibitively expensive.

In Chapter 4 we give the first practical non-interactive zero-knowledge proof of knowledge of a signature on a committed message. We call signatures with such an efficient non-interactive proof protocol P-signatures. We give three constructions for P-signatures based on several practical signature schemes and a special class of commitments and proofs due to Groth and Sahai [GS08]. The first two constructions allow for signing a single message block, while the third construction is a multi-block signature scheme that allows for signing multiple message blocks at once. (This translates to the signing of multiple attributes in anonymous credential systems.) Our constructions are secure in the common reference string model under a subset of the following complexity assumptions: XDH, DLIN, IHSDH, HSDH, and TDH (see Section 2.3.3).

Chapter 5: Compact e-Cash Based on a Common Reference String. Electronic cash (e-cash) was introduced by Chaum [Cha82] as an electronic analogue of physical cash money and has been extensively studied since [CFN90, FY93, CP93a, Bra93a, CPS94, Bra93c, SPC95, FTY96, Tsi97, BP02]. The participants in an e-cash system are users who withdraw and spend e-cash; a bank that creates e-cash and accepts it for deposit; and merchants who offer goods and services in exchange for e-cash, and then deposit the e-cash at the bank. The main security requirements are anonymity and unforgeability. Unfortunately, it is easy to see that, as described above, e-cash is useless. The problem is that money is represented by data, and it is possible to copy data. Suppose Alice copies a coin and spends it with two different merchants. Unforgeability guarantees that the bank will honor at most one of the two copies for deposit and will reject the other. Anonymity guarantees that there is no recourse against Alice. So one of the merchants will be cheated.

There are two known remedies against this double-spending behavior: online and offline double-spending detection. The two options correspond to the two options for limiting the spending of anonymous credentials described in Section 3.3: cryptographic serial numbers and cryptographic escrow. In online double-spending detection, the serial number of a newly received coin is checked against the database of spent coins before the merchant accepts the coin. Offline double-spending detection allows the merchant to accept coins offline, without interacting with the bank. If a user double spends, the escrow share makes it possible to obtain the identity of the offending user.

In conventional e-cash, a coin is always at least of the same size as its serial number. In compact e-cash [CHL05], the user withdraws N coins in a withdrawal protocol whose computational, communication, and storage complexity is O(log N) rather than O(N). Historically, compact e-cash can be seen as a type of context-dependent limited spending protocol in which there is only one context ε.
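The source of this compactness is that all N serial numbers are derived from a single short seed by a pseudo-random function. As a toy illustration only, the sketch below uses HMAC-SHA256 as a stand-in PRF; the actual construction needs a PRF that additionally admits efficient zero-knowledge proofs, and the names here are ours.

```python
# Toy "compact wallet": one seed determines all N coin serial numbers,
# so the wallet stores a short seed and a counter instead of N serials.
import hmac, hashlib

def serial(seed: bytes, i: int) -> str:
    """i-th coin serial S_i = f_seed(i); HMAC-SHA256 stands in for the PRF."""
    return hmac.new(seed, i.to_bytes(8, "big"), hashlib.sha256).hexdigest()

seed, N = b"wallet-seed-from-withdrawal", 1024
# The wallet keeps only (seed, counter); each spend derives the next serial.
serials = {serial(seed, i) for i in range(N)}
assert len(serials) == N            # all N serials are distinct
assert serial(seed, 5) in serials   # and re-derivable from the seed alone
```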

Compact e-cash and variants such as e-tokens (see Chapter 6) can be obtained from a signature scheme, a pseudo-random function, and a non-interactive zero-knowledge proof of knowledge (NIZKPK) system for the appropriate language. Our contribution to the study of electronic cash is to give a construction of a signature scheme and a PRF such that the non-interactive ZKPK system is provably secure and can be realized efficiently enough to be usable in practice.

All prior work on compact e-cash [CHL05] and most of the work on e-cash [CFN90, Bra00] was in the random-oracle model, with non-interactive proofs obtained from interactive proofs via the Fiat-Shamir heuristic [FS87]. The reason for this is that there were no efficient NIZKPK systems for the languages most heavily used in cryptographic constructions (such as languages of true statements about discrete logarithm representations) until the recent proof system of Groth and Sahai [GS08].

One of the main building blocks of our construction is a pseudo-random function f_s and an unconditionally binding commitment scheme Com with an efficient proof of knowledge system for the following language: L_f = {(S, C_y, C_s) | ∃ s, y, r_s, r_y such that S = f_s(y), C_y = Com(y, r_y), C_s = Com(s, r_s)} (the prover will prove knowledge of the witness). Such a pseudo-random function with a proof system is a special case of a simulatable verifiable random function (sVRF), introduced by Chase and Lysyanskaya [CL07].
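Given the DDHI assumption listed below, a PRF of the algebraic shape f_s(y) = g^{1/(s+y)} (in the style of Dodis and Yampolskiy) is a natural candidate for f_s. The following toy evaluation over a tiny schoolbook group is our illustration of that shape only; the parameters are deliberately insecure and the code proves nothing in zero knowledge.

```python
# Toy evaluation of a PRF of the shape f_s(y) = g^{1/(s+y)}.
# p = 23, q = 11: g = 2 generates the order-q subgroup of Z_p*.
p, q, g = 23, 11, 2

def f(s: int, y: int) -> int:
    """f_s(y) = g^{(s+y)^{-1} mod q} mod p; requires s + y != 0 (mod q)."""
    return pow(g, pow((s + y) % q, -1, q), p)

assert f(3, 4) == f(3, 4)   # deterministic: same seed and input, same output
assert f(3, 4) != f(3, 5)   # distinct inputs give distinct outputs here
```

The algebraic structure (a single exponentiation with an inverted exponent) is what makes statements about f_s expressible in the pairing-based proof systems discussed in this thesis.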

Our other building block is a signature scheme and an unconditionally binding commitment scheme (the same one as for the sVRF construction) that allows for an efficient NIZKPK of a signature on a set of committed values, as well as for an efficient protocol for signing a committed value. Here we make use of the multi-block P-signature scheme presented in Chapter 4. We show how to obtain compact e-cash using these building blocks. Our construction is secure in the common reference string model under a subset of the following complexity assumptions: XDH, DLIN, HSDH, TDH, and DDHI for G_1 (see Section 2.3.3).

Chapter 6: Periodic n-Times Spending. In Chapter 6 we show how to extend compact e-cash to so-called e-tokens that allow up to n anonymous transactions per time period (for example, this could correspond to subscriptions to an interactive game site or to the submission of anonymous sensor reports). Note that context-dependent limited spending is obtained by replacing the time period identifier with a more general context identifier.
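The period limit can be pictured as follows: an e-token serial depends on the time period t and a counter i < n, so a user who shows more than n tokens in one period is forced to reuse a serial number. The sketch below is our toy illustration; HMAC stands in for the PRF and all names are ours.

```python
# Toy periodic n-times spending: only n distinct serials exist per period.
import hmac, hashlib

def etoken_serial(seed: bytes, period: str, i: int, n: int) -> str:
    """Serial of the i-th e-token in a given time period, i < n."""
    assert 0 <= i < n, "at most n anonymous shows per time period"
    msg = f"{period}|{i}".encode()
    return hmac.new(seed, msg, hashlib.sha256).hexdigest()

seed, n = b"dispenser-seed", 3
day1 = {etoken_serial(seed, "2010-03-01", i, n) for i in range(n)}
assert len(day1) == 3   # exactly n distinct serials per period
# A fourth show on the same day must reuse one of these three serials,
# which the escrow mechanism then turns into identification.
```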

A first solution for this problem was presented by Damgård, Dupont and Pedersen [DDP06] and is based on Σ-protocols with a pseudo-random challenge. Their solution, however, only realizes periodic 1-time spending and is quite inefficient. The user has to perform 57 + 68k exponentiations during a show, where k is the security parameter (the user can cheat with probability 2^{−k}).

We provide a completely different approach that yields a practical, efficient, and provably secure solution. We relate the problem to electronic cash (e-cash) [Cha82] and in particular to compact e-cash [CHL05]. In our approach, each participant obtains a set of e-tokens from the central server. Similar to the withdrawal protocol of e-cash, the protocol through which a participant obtains these e-tokens does not reveal any information to the server about what these e-tokens actually look like. Our protocol lets a participant obtain all the e-tokens it will ever need in its lifetime in one efficient transaction. The user performs only 3 multi-base exponentiations to obtain e-tokens, and 35 multi-base exponentiations to show a single e-token. If the user is limited to one e-token per time period (as in Damgård et al.'s scheme), the scheme can be further simplified and the user will need to do only 13 multi-base exponentiations to show an e-token.

Distributed sensors can use an e-token to anonymously authenticate the data they send to the central server. In the on-line game scenario, each e-token can be used to establish a new connection to the game. Unlike e-cash, where it is crucial to limit the amount of money withdrawn in each transaction, the number of e-tokens obtained by a participant is unlimited, and a participant can go on sending data or connecting to the game for as long as needed. The e-tokens are anonymous and unlinkable to each other and to the protocol where they were obtained. However, the number of e-tokens that are valid during a particular time period is limited. Similarly to what happens in compact e-cash, reusing e-tokens leads to the identification of the rogue participant. We also show how to reveal all of its past and future transactions.

Thus, in the sensor scenario, a sensor cannot send more than a small number of data items per time period, so there is a limit to the amount of misleading data that a rogue sensor can submit. Should a rogue sensor attempt to do more, it will have to reuse some of its e-tokens, which will lead to its identification. Using an additional tracing feature it is also possible to identify all of its past and future transactions. Similarly, in the on-line game scenario, a license cannot be used more than a small number of times per day, and so it is impossible to share it widely. Our construction is secure under the SRSA and DDHI assumptions (see Section 2.3.3).

Chapter 7: Delegatable Anonymous Credentials. This chapter presents the design of an anonymous delegatable credential scheme in which participants can obtain, delegate, and demonstrate possession of credential chains without revealing any additional information about themselves. This is a natural and desirable goal, the solution of which has, until now, proven elusive. Our main contribution is the first fully delegatable anonymous credential scheme. The only previously known construction of delegatable anonymous credentials, due to Chase and Lysyanskaya [CL06], needs k^{Ω(L)} space to store a certification chain of length L (for security parameter k), and therefore cannot tolerate non-constant L. Our solution is practical: all operations on chains of length L need Θ(kL) time and space.

In Section 2.5.6 we defined and constructed extractable and randomizable proofs of knowledge of a witness to a certain class of relations based on the proof system for pairing product equations due to Groth and Sahai [GS08]. In Chapter 7 we define and construct a delegatable anonymous credential system based on this and other building blocks. Our construction is efficient whenever the building blocks can be realized efficiently. We give efficient instantiations of the building blocks based on either the XDH or the DLIN assumption together with the BB-HSDH and BB-TDH assumptions (see Section 2.3.3).

Details on publications. The contributions of Chapters 4-7 have been published in [BCKL08, BCKL09, CHK+06, BCC+09], respectively. To support our argument, we moved the material on multi-block P-signatures from [BCKL09] to Chapter 4 and the material on randomizable non-interactive proofs from [BCC+09] to Section 2.5.

Chapter 4

P-signatures

We introduce the concept of P-signatures — signatures with efficient Protocols [BCKL08]. The main difference between P-signatures and CL-signatures is that P-signatures have non-interactive proof protocols. As these protocols can be useful for a variety of applications [JS04, BCL04, CHL05, CHL06, DDP06, CHK+06, TS06, CGH06, CLM07], it is important to give a careful treatment of the security guarantees they provide.

Challenges and Techniques. We use Groth and Sahai's f-extractable non-interactive proofs of knowledge [GS08] to build P-signatures. An issue we confront is that Groth-Sahai proofs are f-extractable and not fully extractable. Suppose we construct a proof whose witness x contains a ∈ Z_p and the opening of a commitment to a. For this commitment, we can only extract b^a ∈ f(x) from the proof, for some base b. Note that the proof can be about multiple committed values. Thus, if we construct a proof of knowledge of (m, σ) where m ∈ Z_p and VerifySig(pk, m, σ) = accept, we can only extract some function F(m) from the proof. However, even if it is impossible to forge (m, σ) pairs, it might be possible to forge (F(m), σ) pairs. Therefore, for our proof system to be meaningful, we need to define F-unforgeable signature schemes, i.e., schemes where it is impossible for an adversary to compute an (F(m), σ) pair on his own.

Our first construction uses the Weak Boneh-Boyen (WBB) signature scheme [BB04b]. Using a rather strong assumption, we prove that WBB is F-unforgeable and that our P-signature construction is secure. Our second construction is based on the Full Boneh-Boyen signature scheme [BB04b] and uses a better assumption. We had to modify the Boneh-Boyen construction, however, because the GS proof system would not allow the knowledge extraction of the entire signature. The multi-block P-signature construction is an extension of the second scheme.

Outline of the chapter. We give a definition of security in Section 4.1. In Sections 4.2 and 4.3 we give two P-signature constructions for signing single message blocks. Section 4.4 contains our multi-block P-signature construction.

4.1 Security Notion

We introduce P-signatures, a primitive which lets a user (1) obtain a signature σ on a committed message m without revealing the message, (2) construct a non-interactive zero-knowledge proof of knowledge of (F(m), σ) such that VerifySig(pk, m, σ) = accept and m is committed to in a commitment comm, and (3) give a non-interactive proof that a pair of commitments are to the same value. In this section, we formally define the properties of a non-interactive P-signature scheme.

We construct P-signatures from signature schemes that consist of four algorithms: SigSetup, Keygen, Sign, and VerifySig. We split the key generation algorithm of the classical [GMR88] definition into two parts: SigSetup and Keygen. SigSetup generates universal system parameters (e.g., a description of the groups that will be used) that are shared by all algorithms and which otherwise would need to be included in both the public key and the secret key. For the P-signature scheme we will also make use of these parameters for both the commitment scheme and the proof system, easing the transition to the common reference string model.

We first introduce the concept of F-unforgeability by extending the traditional definition of digital signatures given in Section 2.4.3. Unforgeability under adaptive chosen message attack is insufficient for our purposes for the following reason: our P-signature constructions prove that we know some value y = F(m) (for an efficiently computable but not necessarily efficiently invertible bijection F) and a signature σ such that VerifySig(params_Sig, pk, m, σ) = accept. However, even if an adversary cannot output (m, σ) without first obtaining a signature on m, he might be able to output (F(m), σ). Therefore, we introduce the notion of F-unforgeability.

F-unforgeability. Let F be an efficiently computable bijection. No adversary should be able to output (F(m), σ) unless he has previously obtained a signature on m. Formally, for every probabilistic polynomial time (p.p.t.) adversary A, there exists a negligible function ν such that

Pr[params_Sig ← SigSetup(1^k); (pk, sk) ← Keygen(params_Sig); (y, σ) ← A^{O_Sign(params_Sig, sk, ·)}(params_Sig, pk) : VerifySig(params_Sig, pk, F^{−1}(y), σ) = 1 ∧ y ∉ F(Q_Sign)] < ν(k).

O_Sign(params_Sig, sk, m) records m-queries in Q_Sign and returns Sign(params_Sig, sk, m). F(Q_Sign) evaluates F on all values in Q_Sign.

Lemma 4.1.1 F-unforgeable signatures are secure in the standard [GMR88] sense.
Proof sketch. Suppose an adversary can compute a forgery (m, σ). The reduction can then use it to compute (F(m), σ).

Definition 4.1.2 (F-secure Signature Scheme) A signature scheme is F-secure (against adaptive chosen message attacks) if it is Correct (VerifySig always accepts a signature obtained using the Sign algorithm) and F-unforgeable.

P-signatures

A P-signature scheme extends a signature scheme (SigSetup, Keygen, Sign, VerifySig) and a commitment scheme (SigSetup, Com). It consists of eleven algorithms: (SigSetup, Keygen, Sign, VerifySig, Com, ObtainSig, IssueSig, Prove, VerifyProof, EqComProve, VerEqCom). We describe the function of the newly introduced algorithms.

SigSetup(1^k) outputs public parameters params. These parameters include parameters for the signature scheme and the commitment scheme.

ObtainSig(params, pk, m, comm, open) ↔ IssueSig(params, sk, comm) executes the signature issuing protocol between a user and the issuer. The user takes as input (params, pk, m, comm, open) such that comm = Com(params, m, open), and gets a signature σ as output. The issuer gets (params, sk, comm) as input and gets nothing as output.

Prove(params, pk, m, σ) outputs the values (comm, π, open), such that comm = Com(params, m, open) and π is a proof of knowledge of a signature σ on m.

VerifyProof(params, pk, comm, π) takes as input a commitment to a message m and a proof π that the message has been signed by the owner of public key pk. The algorithm outputs accept if π is a valid proof of knowledge of a signature on m, and outputs reject otherwise.

EqComProve(params, m, open, open′) takes as input a message and two commitment opening values. It outputs a proof π that comm = Com(m, open) is a commitment to the same value as comm′ = Com(m, open′). This proof is used to bind the commitment of a P-signature proof to a more permanent commitment.

VerEqCom(params, comm, comm′, π) takes as input two commitments and a proof, and accepts if π is a proof that comm and comm′ are commitments to the same value.
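For orientation, the eleven algorithms can be collected as an interface sketch. The Python rendering below is purely structural (no cryptography is implemented) and the method names are our transliterations of the algorithm names above.

```python
# Structural sketch of the eleven P-signature algorithms as an interface.
from abc import ABC, abstractmethod

class PSignatureScheme(ABC):
    # Underlying signature scheme
    @abstractmethod
    def sig_setup(self, k): ...          # SigSetup(1^k) -> params
    @abstractmethod
    def keygen(self, params): ...        # -> (pk, sk)
    @abstractmethod
    def sign(self, params, sk, m): ...   # -> sigma
    @abstractmethod
    def verify_sig(self, params, pk, m, sigma): ...
    # Commitment scheme (shares SigSetup's parameters)
    @abstractmethod
    def com(self, params, m, opening): ...
    # Blind issuing protocol (two sides)
    @abstractmethod
    def obtain_sig(self, params, pk, m, comm, opening): ...  # user side
    @abstractmethod
    def issue_sig(self, params, sk, comm): ...               # issuer side
    # Non-interactive proofs
    @abstractmethod
    def prove(self, params, pk, m, sigma): ...   # -> (comm, pi, opening)
    @abstractmethod
    def verify_proof(self, params, pk, comm, pi): ...
    @abstractmethod
    def eq_com_prove(self, params, m, open1, open2): ...
    @abstractmethod
    def ver_eq_com(self, params, comm1, comm2, pi): ...
```

Any concrete instantiation, such as the constructions of Sections 4.2-4.4, would subclass this interface and supply the actual group operations.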

We define the security of P-signatures using a number of games that each describe one security property.

Correctness. An honest user who obtains a P-signature from an honest issuer will be able to prove to an honest verifier that he has a valid signature.

∀ m ∈ {0, 1}*: Pr[params ← SigSetup(1^k); (pk, sk) ← Keygen(params); σ ← Sign(params, sk, m); (comm, π, open) ← Prove(params, pk, m, σ) : VerifyProof(params, pk, comm, π) = accept] = 1

Signer privacy. Signer privacy guarantees that the adversary cannot learn more than a signature on the committed value from the protocol. This is modeled using a simulation argument, i.e., no p.p.t. adversary can tell if it is running IssueSig with an honest issuer or with a simulator who merely has access to a signing oracle. Formally, there exists a simulator SimIssue such that for all p.p.t. adversaries (A_1, A_2), there exists a negligible function ν so that:

|Pr[params ← SigSetup(1^k); (sk, pk) ← Keygen(params); (m, open, state) ← A_1(params, sk); comm ← Com(params, m, open); b ← A_2(state) ↔ IssueSig(params, sk, comm) : b = 1]
− Pr[params ← SigSetup(1^k); (sk, pk) ← Keygen(params); (m, open, state) ← A_1(params, sk); comm ← Com(params, m, open); σ ← Sign(params, sk, m); b ← A_2(state) ↔ SimIssue(params, comm, σ) : b = 1]| < ν(k)

Note that we ensure that IssueSig and SimIssue get an honest commitment to whatever m, open the adversary chooses.

Since the goal of signer privacy is to prevent the adversary from learning anything except a signature on the committed value, this is sufficient for our purposes. Note that our SimIssue will be allowed to rewind A. Also, we have defined signer privacy in terms of a single interaction between the adversary and the issuer. A simple hybrid argument can be used to show that this definition implies privacy over many sequential instances of the issue protocol.

User privacy. No p.p.t. adversary (A_1, A_2) can tell if it is running ObtainSig with an honest user or with a simulator. Formally, there exists a simulator S with algorithm SimObtain such that for all p.p.t. adversaries (A_1, A_2), there exists a negligible function ν so that:

|Pr[params ← SigSetup(1^k); (pk, m, open, state) ← A_1(params); comm = Com(params, m, open); b ← A_2(state) ↔ ObtainSig(params, pk, m, comm, open) : b = 1]
− Pr[(params, sim) ← SimSetup(1^k); (pk, m, open, state) ← A_1(params); comm = Com(params, m, open); b ← A_2(state) ↔ SimObtain(params, pk, comm) : b = 1]| < ν(k)

Here again SimObtain is allowed to rewind the adversary.

Note that we require that only the user's input m be hidden from the issuer, but not necessarily the user's output σ. The reason that this is sufficient is that in actual applications (for example, in anonymous credentials), a user would never show σ in the clear; instead, he would just prove that he knows σ. An alternative, stronger way to define signer privacy and user privacy together would be to require that the pair of algorithms ObtainSig and IssueSig carry out a secure two-party computation. This alternative definition would ensure that σ is hidden from the issuer as well. However, as explained above, this feature is not necessary for our application, so we preferred to give a special definition that captures the minimum properties required.

The definitions of signer privacy and user privacy formally capture security requirements that are equally applicable to CL-signatures.

Unforgeability. We require that no p.p.t. adversary can create a proof for any message m for which he has not previously obtained a signature or proof from the oracle.

A P-signature scheme is unforgeable if an extractor (ExtractSetup, Extract) and a bijection F exist such that (1) the parameters output by ExtractSetup(1^k) are indistinguishable from the output of SigSetup(1^k), and (2) no p.p.t. adversary can output a proof π that VerifyProof accepts, but from which we extract (F(m), σ) such that one of the following three conditions holds: (a) σ is not a valid signature on m, or (b) comm is not a commitment to m, or (c) the adversary has never previously queried the signing oracle on m.


Formally, for all p.p.t. adversaries A, there exists a negligible function ν such that:

Pr[params_0 ← SigSetup(1^k); (params_1, ext) ← ExtractSetup(1^k); b ← {0, 1} :
    A(params_b) = b] < 1/2 + ν(k), and

Pr[(params, ext) ← ExtractSetup(1^k); (pk, sk) ← Keygen(params);
    (comm, π) ← A^{OSign(params,sk,·)}(params, pk);
    (y, σ) ← Extract(params, ext, π, comm) :
    VerifyProof(params, pk, comm, π) = accept
    ∧ ( [VerifySig(params, pk, F^{−1}(y), σ) = reject]
      ∨ [∀ open : comm ≠ Com(params, F^{−1}(y), open)]
      ∨ [VerifySig(params, pk, F^{−1}(y), σ) = accept ∧ y ∉ F(QSign)] ) ] < ν(k).

Oracle OSign(params, sk, m) runs the function Sign(params, sk, m) and returns the resulting signature σ to the adversary. It records the queried message in QSign. By F(QSign) we mean F applied to every message in QSign.

For CL-signatures, the unforgeability definition would have to allow for an interactive proof protocol.

Zero-knowledge. There exists a simulator S with three algorithms (SimSetup, SimProve, SimEqCom), such that for all p.p.t. adversaries with algorithms A1 and A2, there exists a negligible function ν such that under parameters output by SimSetup, Com is perfectly hiding and (1) the parameters output by SimSetup are indistinguishable from those output by SigSetup, but SimSetup also outputs a special auxiliary string sim; (2) when params are generated by SimSetup, the output of SimProve(params, sim, pk) is indistinguishable from that of Prove(params, pk, m, σ) for all (pk, m, σ) where σ ∈ σ_pk(m); and (3) when params are generated by SimSetup, the output of SimEqCom(params, sim, comm, comm′) is indistinguishable from that of EqComProve(params, m, open, open′) for all (m, open, open′) where comm = Com(params, m, open) and comm′ = Com(params, m, open′).

This is formally defined as follows:

|Pr[params ← SigSetup(1^k); b ← A(params) : b = 1]
 − Pr[(params, sim) ← SimSetup(1^k); b ← A(params) : b = 1]| < ν(k), and

|Pr[(params, sim) ← SimSetup(1^k); (pk, m, σ, state) ← A1(params, sim);
    (comm, π, open) ← Prove(params, pk, m, σ); b ← A2(state, comm, π) : b = 1]
 − Pr[(params, sim) ← SimSetup(1^k); (pk, m, σ, state) ← A1(params, sim);
    (comm, π) ← SimProve(params, sim, pk); b ← A2(state, comm, π) : b = 1]| < ν(k), and

|Pr[(params, sim) ← SimSetup(1^k); (m, open, open′, state) ← A1(params, sim);
    π ← EqComProve(params, m, open, open′); b ← A2(state, π) : b = 1]
 − Pr[(params, sim) ← SimSetup(1^k); (m, open, open′, state) ← A1(params, sim);
    π ← SimEqCom(params, sim, Com(params, m, open), Com(params, m, open′));
    b ← A2(state, π) : b = 1]| < ν(k).

Zero-knowledge implies witness indistinguishability. However, it is possible to give a weaker definition of secure P-signatures that requires only witness indistinguishability.

Witness indistinguishability. No p.p.t. adversary can determine which of two message/signature pairs (m0, σ0) and (m1, σ1) was used to generate a proof (comm, π). Formally, for all p.p.t. adversaries A, there exists a negligible function ν such that:

Pr[params ← SigSetup(1^k); (pk, m0, σ0, m1, σ1) ← A(params); b ← {0, 1};
    (comm, π, open) ← Prove(params, pk, mb, σb) : A(comm, π) = b
    ∧ VerifySig(params, pk, m0, σ0) = accept
    ∧ VerifySig(params, pk, m1, σ1) = accept] < 1/2 + ν(k).

Definition 4.1.3 (Secure P-signature Scheme) Let F be an efficiently computable bijection (possibly parametrized by public parameters). A P-signature scheme is secure if (SigSetup, Keygen, Sign, VerifySig) form an F-unforgeable signature scheme, if (SigSetup, Com) is a perfectly binding, strongly computationally hiding commitment scheme, if (SigSetup, EqComProve, VerEqCom) is a non-interactive proof system, and if the Correctness, Signer privacy, User privacy, Unforgeability, and Zero-knowledge properties hold.


4.2 P-Signatures From Weak BB Signatures

Our first construction of a P-signature scheme uses the weak Boneh-Boyen signature scheme (WBB) [BB04b, BB08] as a building block. The WBB scheme is as follows:

WBB-SigSetup(1^k) runs BMGen(1^k) to get the pairing parameters params_BM = (p, G1, G2, GT, e, g, h). In the sequel, by z we denote z = e(g, h). As this value can be computed on the fly, params_Sig = params_BM.

WBB-Keygen(params_Sig). The secret key is α ← Zp. The public key is pk = (v, ṽ), where v = h^α and ṽ = g^α.¹ The correctness of the public key can be verified by checking that e(g, v) = e(ṽ, h).

WBB-Sign(params_Sig, sk, m) calculates σ = g^{1/(α+m)}, where sk = α.

WBB-VerifySig(params_Sig, pk, m, σ) outputs accept if the public key is correctly formed and if e(σ, v·h^m) = z, where pk = (v, ṽ). Outputs reject otherwise.

Boneh and Boyen proved that the WBB signature scheme is only weakly secure given SDH, which is insufficient for our purposes. (In a weak chosen message attack, the adversary has to submit all signature queries before seeing the public key.) The WBB signature scheme is F-secure given IHSDH (which implies standard [GMR88] security).
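The exponent algebra of WBB can be checked in a few lines. The sketch below is a toy: it works in a small prime-order subgroup of Z_p^* instead of a bilinear group, and it replaces the public pairing check e(σ, v·h^m) = z by the equivalent secret-key check σ^{α+m} = g. All parameter values are illustrative and chosen by us, not taken from the thesis.

```python
# Toy WBB sketch: q prime, p = 2q + 1, and g generates the order-q
# subgroup of Z_p^*. Real WBB lives in a pairing group; here we replace
# the public check e(sigma, v h^m) = z by the secret-key check
# sigma^(alpha + m) == g, which exercises the same exponent algebra.
p, q, g = 1019, 509, 4        # tiny illustrative parameters, NOT secure

alpha = 123                   # secret key alpha <- Zp (fixed for the demo)
v = pow(g, alpha, p)          # toy stand-in for the public key

def wbb_sign(alpha, m):
    # sigma = g^{1/(alpha + m)}; the inverse is taken mod the group order q
    return pow(g, pow((alpha + m) % q, -1, q), p)

def wbb_check(alpha, m, sigma):
    return pow(sigma, (alpha + m) % q, p) == g

sigma = wbb_sign(alpha, 42)
assert wbb_check(alpha, 42, sigma)        # valid signature verifies
assert not wbb_check(alpha, 43, sigma)    # wrong message is rejected
```

Note that exponent inverses are taken modulo the group order q, not modulo p; this detail recurs in all constructions of this chapter.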

Theorem 4.2.1 Let F(x) = (h^x, u^x), where u ∈ G1 and h ∈ G2 as given in the statement of the IHSDH assumption. The weak Boneh-Boyen signature scheme is F-secure given IHSDH.

We extend the WBB signature scheme to obtain a P-signature scheme (SigSetup, Keygen, Sign, VerifySig, Com, ObtainSig, IssueSig, Prove, VerifyProof, EqComProve, VerEqCom), as follows:

SigSetup(1^k). First, obtain params_GS ← Setup(1^k). Note that params_GS includes the pairing groups setup params_BM = (p, G1, G2, GT, e, g, h). Pick u ← G1. As before, z is defined as z = e(g, h). Return params = (params_GS, u).

Keygen(params) runs WBB-Keygen(params_BM) and outputs sk = α, pk = (h^α, g^α) = (v, ṽ).

Sign(params, sk, m) runs WBB-Sign(params_BM, sk, m) to obtain σ = g^{1/(α+m)}, where α = sk.

VerifySig(params, pk, m, σ) runs WBB-VerifySig(params_BM, pk, m, σ).

¹The shadow value ṽ does not exist in [BB04b] and is needed to prove zero-knowledge of our P-signatures in pairing settings in which no efficient isomorphisms exist.


Com(params, m, open). To commit to m, compute C = ExpCom(params_GS, h, m, open).²

ObtainSig(params, pk, m, comm, open) ↔ IssueSig(params, sk, comm). User and issuer run the following protocol:

1. The user chooses ρ ← Zp.
2. The user and issuer engage in a secure two-party computation protocol,³ where the user's private input is (ρ, m, open), and the issuer's private input is sk = α. The issuer's private output is x = (α + m)ρ if comm = Com(params, m, open), and x = ⊥ otherwise.
3. If x ≠ ⊥, the issuer calculates σ′ = g^{1/x} and sends σ′ to the user.
4. The user computes σ = σ′^ρ = g^{1/(α+m)}. The user checks that the signature is valid.
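The blinding step above is pure exponent arithmetic: the issuer only sees x = (α + m)ρ, which is (for α + m ≠ 0 and random ρ) essentially a random element of Zp that hides m, yet raising σ′ = g^{1/x} to ρ undoes the blinding. A toy trace in a small prime-order subgroup follows; the two-party computation itself is not modelled, and all secrets are in the clear purely to check the algebra (parameters are illustrative, not from the thesis).

```python
# Blinding/unblinding algebra of the issuing protocol (toy subgroup of
# Z_p^*, p = 2q + 1). The 2PC is not modelled: we just compute its
# output x on behalf of both parties.
p, q, g = 1019, 509, 4
alpha, m = 123, 42            # issuer's secret key, user's hidden message
rho = 77                      # user's random blinding factor

x = ((alpha + m) * rho) % q               # issuer's private 2PC output
sigma_blind = pow(g, pow(x, -1, q), p)    # issuer: sigma' = g^{1/x}
sigma = pow(sigma_blind, rho, p)          # user: sigma'^rho = g^{1/(alpha+m)}

# The unblinded value equals a directly computed WBB signature on m.
assert sigma == pow(g, pow((alpha + m) % q, -1, q), p)
```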

Prove(params, pk, m, σ). Check if pk and σ are valid, and if they are not, output ⊥. Else, pick an appropriate open and form the following GS commitment: comm = Com(params, m, open). Compute the proof π = NIPK{(h^m, u^m, σ) : e(σ, v·h^m) = z ∧ h^m in comm}. Output (comm, π, open).

VerifyProof(params, pk, comm, π). Output accept if the proof π is a valid proof of the statement described above for commitment comm and for a properly formed pk = (v, ṽ).

EqComProve(params, m, open, open′). Let comm = GSCom(params, h^m, open) and comm′ = GSCom(params, h^m, open′). Use the GS proof system as described in Section 2.5.5 to compute π ← NIPK{(x, y, h^θ) : e(x/y, h^θ) = 1 ∧ e(g, h^θ) = e(g, h) ∧ x in comm′ ∧ y in comm}. Groth and Sahai [GS08] show that such witness-indistinguishable proofs are also zero-knowledge. A simulator that knows the simulation trapdoor sim for the GS proof system can simulate the above proof. The trapdoor sim allows the simulator to open an internal commitment to θ to arbitrary values. He can then simulate the first and the second pairing product equations by setting θ to 0 and 1, respectively. In this way he obtains a trivial witness for each of the equations and can fake the proofs for arbitrary commitments.

VerEqCom(params, comm, comm′, π). Verify the proof π using the GS proof system as described in Section 2.5.5.

²Recall that ExpCom(params_GS, h, m, open) = GSCom(params_GS, h^m, open).

³This can be done using the protocol we describe in Section 2.4.5. Alternatively, Jarecki and Shmatikov [JS07] give a protocol for secure two-party computation on committed inputs; their construction can be adapted here. In general, using secure two-party computation is expensive, but here we only need to compute a relatively small circuit on the inputs.


Theorem 4.2.2 (Efficiency) Using SXDH, each P-signature proof for the weak Boneh-Boyen signature scheme consists of 12 elements in G1 and 10 elements in G2. The prover performs 22 multi-exponentiations and the verifier 44 pairings. Using DLIN, each P-signature proof consists of 27 elements in G1 = G2. The prover performs 27 multi-exponentiations and the verifier 54 pairings.

Theorem 4.2.3 (Security) Our first P-signature construction is secure for the bijection F(m) = (h^m, u^m) given IHSDH, the security of the GS commitments and proofs, and the security of the two-party computation. The proof of security is in Appendix A.1.

Remark 4.2.4 While in most cases the P-signature scheme based on the full Boneh-Boyen signatures presented next is preferable, the above scheme offers better performance with comparable security if only security against weak chosen message attacks is required, e.g., for signing the coin indices 1, . . . , n in Chapter 5.

4.3 P-Signatures From Full BB Signatures

In this section, we present a new signature scheme and then build a P-signature scheme from it. The new signature scheme is based on the full Boneh-Boyen signature scheme [BB04b, BB08].

New-SigSetup(1^k) runs BMGen(1^k) to get the pairing parameters (p, G1, G2, GT, e, g, h). Pick u ← G1. In the sequel, by z we denote z = e(g, h). As this value can be computed on the fly, params_Sig = (params_BM, u).

New-Keygen(params_Sig) picks random α, β ← Zp. The signer calculates v = h^α, w = h^β, ṽ = g^α, w̃ = g^β. The secret key is sk = (α, β). The public key is pk = (v, w, ṽ, w̃). The public key can be verified by checking that e(g, v) = e(ṽ, h) and e(g, w) = e(w̃, h).

New-Sign(params_Sig, (α, β), m) chooses r ← Zp \ {−(α+m)/β} and calculates σ1 = g^{1/(α+m+βr)}, σ2 = w^r, σ3 = u^r. The signature is (σ1, σ2, σ3).

New-VerifySig(params_Sig, (v, w, ṽ, w̃), m, (σ1, σ2, σ3)) outputs accept if e(σ1, v·h^m·σ2) = z, e(u, σ2) = e(σ3, w), and if the public key is correctly formed, i.e., e(g, v) = e(ṽ, h) and e(g, w) = e(w̃, h).⁴
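As a sanity check of the exponent relations, the scheme can be mimicked in a toy prime-order subgroup of Z_p^*, replacing both pairing checks by checks that use the secret exponents directly. The parameters and secret values below are illustrative choices of ours, not part of the construction.

```python
# Toy run of the full-BB-style scheme above. The real verifier checks
# e(sigma1, v h^m sigma2) = z and e(u, sigma2) = e(sigma3, w) via
# pairings; here we verify the same exponent relations with the secret
# values alpha, beta, r known.
p, q, g, u = 1019, 509, 4, 16     # u = g^2, so u lies in the subgroup
alpha, beta = 123, 200            # secret key
w = pow(g, beta, p)
m, r = 42, 99

s = (alpha + m + beta * r) % q    # exponent behind sigma1
sigma1 = pow(g, pow(s, -1, q), p)
sigma2 = pow(w, r, p)             # w^r
sigma3 = pow(u, r, p)             # u^r

# stand-in for e(sigma1, v h^m sigma2) = z
assert pow(sigma1, s, p) == g
# stand-in for e(u, sigma2) = e(sigma3, w): both sides have exponent beta*r
assert pow(u, (beta * r) % q, p) == pow(sigma3, beta, p)
```

The second check is what ties σ2 and σ3 to the same randomness r, which is exactly what the pairing equation e(u, σ2) = e(σ3, w) enforces publicly.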

We extend the above signature scheme to obtain a P-signature scheme (SigSetup, Keygen, Sign, VerifySig, Com, ObtainSig, IssueSig, Prove, VerifyProof, EqComProve, VerEqCom), as follows:

⁴The latter is needed only once per public key, and is meaningless in a symmetric pairing setting.


SigSetup(1^k). First, obtain params_GS ← Setup(1^k). Note that params_GS includes the pairing groups setup params_BM = (p, G1, G2, GT, e, g, h). Pick u ← G1. Let params = (params_GS, u). As before, z is defined as z = e(g, h).

Keygen(params) runs New-Keygen(params_BM) and outputs sk = (α, β), pk = (h^α, h^β, g^α, g^β) = (v, w, ṽ, w̃).

Sign(params, sk, m) runs New-Sign(params_BM, sk, m) to obtain σ = (σ1, σ2, σ3), where σ1 = g^{1/(α+m+βr)}, σ2 = w^r, σ3 = u^r, and sk = (α, β).

VerifySig(params, pk, m, σ) runs New-VerifySig(params_BM, pk, m, σ).

Com(params, m, open). To commit to m, compute C = ExpCom(params_GS, h, m, open).

ObtainSig(params, pk, m, comm, open) ↔ IssueSig(params, sk, comm). The user and the issuer run the following protocol:

1. The user chooses ρ1, ρ2 ← Zp.
2. The issuer chooses r′ ← Zp.
3. The user and the issuer run a secure two-party computation protocol where the user's private inputs are (ρ1, ρ2, m, open), and the issuer's private inputs are sk = (α, β) and r′. The issuer's private output is x = (α + m + βρ1r′)ρ2 if comm = Com(params, m, open), and x = ⊥ otherwise.
4. If x ≠ ⊥, the issuer calculates σ′1 = g^{1/x}, σ′2 = w^{r′}, and σ′3 = u^{r′}, and sends (σ′1, σ′2, σ′3) to the user.
5. The user computes σ1 = (σ′1)^{ρ2}, σ2 = (σ′2)^{ρ1}, and σ3 = (σ′3)^{ρ1}, and then verifies that the signature (σ1, σ2, σ3) is valid.
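The two blinding factors play different roles: ρ2 blinds the issuing exponent (as ρ did in the WBB protocol), while ρ1 turns the issuer's known randomness r′ into an effective signature randomness r = ρ1r′ that the issuer cannot link to the final signature. A toy trace of the arithmetic in a prime-order subgroup follows; the 2PC is not modelled and all values are illustrative.

```python
# Two-factor blinding in the full-BB issuing protocol, in a toy subgroup
# of Z_p^*. All secrets are in the clear here purely to check the algebra.
p, q, g, u = 1019, 509, 4, 16
alpha, beta = 123, 200
w = pow(g, beta, p)
m = 42                                  # user's committed message
rho1, rho2, r_prime = 31, 77, 13        # user randomness / issuer randomness

x = ((alpha + m + beta * rho1 * r_prime) * rho2) % q   # issuer's 2PC output
s1b = pow(g, pow(x, -1, q), p)          # sigma'_1 = g^{1/x}
s2b = pow(w, r_prime, p)                # sigma'_2 = w^{r'}
s3b = pow(u, r_prime, p)                # sigma'_3 = u^{r'}

sigma1 = pow(s1b, rho2, p)              # g^{1/(alpha + m + beta*rho1*r')}
sigma2 = pow(s2b, rho1, p)              # w^{rho1 * r'}
sigma3 = pow(s3b, rho1, p)              # u^{rho1 * r'}

r = (rho1 * r_prime) % q                # effective signature randomness
assert pow(sigma1, (alpha + m + beta * r) % q, p) == g
assert sigma2 == pow(w, r, p) and sigma3 == pow(u, r, p)
```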

Prove(params, pk, m, σ). Check if pk and σ are valid, and if they are not, output ⊥. Then the user computes the commitment comm = Com(params_GS, m, open) for a random opening open and the proof

π = NIPK{(σ1, σ2, σ3, h^m, u^m) : e(σ1, v·h^m·σ2) = z ∧ e(u, σ2) = e(σ3, w) ∧ h^m in comm}

and outputs (comm, π, open).

VerifyProof(params, pk, comm, π) outputs accept if the proof π is a valid proof of the above statement for commitment comm and for a properly formed pk.

EqComProve(params, m, open, open′) and VerEqCom(params, comm, comm′, π) can be implemented in the same way as for the WBB scheme.

Theorem 4.3.1 (Efficiency) Using SXDH GS proofs, each P-signature proof for our new signature scheme consists of 18 elements in G1 and 16 elements in G2. The prover performs 34 multi-exponentiations and the verifier 68 pairings. Using DLIN, each P-signature proof consists of 42 elements in G1 = G2. The prover has to compute 42 multi-exponentiations and the verifier 84 pairings.

Theorem 4.3.2 (Security) Our second P-signature construction is secure given HSDH and TDH, the security of the GS commitments and proofs, and the security of the two-party computation. For the proof of security, see Appendix A.2.

4.4 A Multi-block P-Signature Scheme

We first construct an F-secure multi-block signature scheme.

MB-SigSetup(1^k) runs BMGen(1^k) to get the pairing parameters (p, G1, G2, GT, e, g, h). Pick u ← G1. In the sequel, by z we denote z = e(g, h). The algorithm returns params_Sig = (params_BM, u).

MB-Keygen(params) picks random α, β1, . . . , βn ← Zp. The signer calculates v = h^α, ṽ = g^α, wi = h^{βi}, w̃i = g^{βi} for 1 ≤ i ≤ n. The secret key is sk = (α, β1, . . . , βn). The public key is pk = (v, w1, . . . , wn, ṽ, w̃1, . . . , w̃n). The public key can be verified by checking that e(g, v) = e(ṽ, h) and e(g, wi) = e(w̃i, h) for all i.

MB-Sign(params, (α, β1, . . . , βn), m) chooses a random r ← Zp \ {−(α + β1m1 + · · · + βnmn)} and calculates σ1 = g^{1/(α+r+β1m1+···+βnmn)}, σ2 = h^r, σ3 = u^r. The signature is (σ1, σ2, σ3).

MB-VerifySig(params, (v, w1, . . . , wn, ṽ, w̃1, . . . , w̃n), m, (σ1, σ2, σ3)) accepts if e(σ1, v·σ2·∏_{i=1}^n wi^{mi}) = z and e(u, σ2) = e(σ3, h).
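The multi-block exponent algebra is a direct extension of the single-message case: the single product βm becomes the inner product Σβimi, and the randomness r moves into the exponent of σ1. A toy check in a prime-order subgroup, with pairing checks again replaced by secret-exponent checks and all values illustrative:

```python
# Toy multi-block signature: sigma1 = g^{1/(alpha + r + sum beta_i m_i)},
# sigma2 = h^r, sigma3 = u^r. A single subgroup of Z_p^* stands in for
# (G1, G2), so h is simply g here.
p, q, g, u = 1019, 509, 4, 16
alpha = 123
betas = [5, 9, 14]                 # beta_1 .. beta_n
ms = [42, 7, 300]                  # message block m_1 .. m_n
r = 99

s = (alpha + r + sum(b * m for b, m in zip(betas, ms))) % q
sigma1 = pow(g, pow(s, -1, q), p)
sigma2 = pow(g, r, p)              # h^r in the real scheme
sigma3 = pow(u, r, p)              # u^r

# stand-in for e(sigma1, v sigma2 prod_i w_i^{m_i}) = z
assert pow(sigma1, s, p) == g
# stand-in for e(u, sigma2) = e(sigma3, h): both sides have exponent 2r
assert pow(u, r, p) == pow(sigma2, 2, p)   # u = g^2 in this toy setup
```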

Theorem 4.4.1 Let F(m) = (h^m, u^m). The above signature scheme is F-secure given the HSDH and TDH assumptions. See Appendix A.3 for the proof.

We extend the above signature scheme to obtain a P-signature scheme (SigSetup, Keygen, Sign, VerifySig, Comn, ObtainSig, IssueSig, Prove, VerifyProof, EqComProven, VerEqComn) for signing blocks of messages m = (m1, . . . , mn), as follows:

SigSetup(1^k). First, obtain params_GS ← Setup(1^k). Note that params_GS includes the pairing groups setup params_BM = (p, G1, G2, GT, e, g, h). Pick u ← G1. Let params = (params_GS, u). As before, z is defined as z = e(g, h).

Keygen(params) runs MB-Keygen(params_BM) and outputs sk = (α, β1, . . . , βn) and pk = (v, w1, . . . , wn, ṽ, w̃1, . . . , w̃n).

Sign(params, sk, m) runs MB-Sign(params_BM, sk, m) to obtain σ = (σ1, σ2, σ3), where σ1 = g^{1/(α+r+β1m1+···+βnmn)}, σ2 = h^r, σ3 = u^r.

VerifySig(params, pk, m, σ) runs MB-VerifySig(params_BM, pk, m, σ).


Comn(params, m, open). We commit to m by committing to the individual mi as commi = ExpCom(params_GS, h, mi, openi). Let comm = (comm1, . . . , commn) and open = (open1, . . . , openn).

ObtainSig(params, pk, m, comm, open) ↔ IssueSig(params, sk, comm). The user and the issuer run the following protocol:

1. The user chooses ρ1, ρ2 ← Zp.
2. The issuer chooses r′ ← Zp.
3. The user and the issuer run a secure two-party computation protocol where the user's private inputs are (ρ1, ρ2, m, open), and the issuer's private inputs are sk = (α, β1, . . . , βn) and r′. The issuer's private output is x = (α + r′ρ1 + β1m1 + · · · + βnmn)ρ2 if commi = Com(params, mi, openi) for all 1 ≤ i ≤ n, and x = ⊥ otherwise.
4. If x ≠ ⊥, the issuer calculates σ′1 = g^{1/x}, σ′2 = h^{r′}, and σ′3 = u^{r′}, and sends (σ′1, σ′2, σ′3) to the user.
5. The user computes σ1 = (σ′1)^{ρ2}, σ2 = (σ′2)^{ρ1}, and σ3 = (σ′3)^{ρ1}, and then verifies that the signature (σ1, σ2, σ3) is valid.

Prove(params, pk, m, (σ1, σ2, σ3)) is defined as follows. Pick random open ← Zp^n, compute comm = Comn(m, open) = (comm1, . . . , commn), and form the Groth-Sahai proof

π ← NIZK{(h^{m1}, u^{m1}, w1^{m1}, . . . , h^{mn}, u^{mn}, wn^{mn}, σ1, σ2, σ3) :
    e(σ1, v·σ2·∏_{i=1}^n wi^{mi}) = z ∧ e(u, σ2)·e(σ3, h^{−1}) = 1
    ∧ {e(w̃i, h^{mi})·e(g^{−1}, wi^{mi}) = 1}_{i=1}^n
    ∧ {e(u, h^{mi})·e(u^{mi}, h^{−1}) = 1}_{i=1}^n
    ∧ h^{m1} in comm1 ∧ . . . ∧ h^{mn} in commn}.

To see that the witness-indistinguishable proof π is also zero-knowledge, the simulation setup sets u = g^a. The simulator can then pick s, m1, . . . , mn ← Zp and compute σ1 = g^{1/s}. We implicitly set r = s − (α + Σ_{i=1}^n miβi). Note that the simulator does not know r and α. However, he can compute h^r = h^s/(v·∏_{i=1}^n wi^{mi}) and u^r = u^s/(ṽ·∏_{i=1}^n w̃i^{mi})^a. Now he can use h^{m1}, u^{m1}, w1^{m1}, . . . , h^{mn}, u^{mn}, wn^{mn}, σ1, σ2 = h^r, σ3 = u^r as a witness and construct the proof π in the same way as the real Prove protocol. By witness indistinguishability, a proof using the faked witness is indistinguishable from a proof using a real witness. See also [BCKL08].

VerifyProof(params, pk, π, comm) simply verifies the proof π.

EqComProven(params, m, open, open′) and VerEqComn(params, comm, comm′, π) are implemented by running EqComProve and VerEqCom on all of the commi, respectively.


Theorem 4.4.2 The above construction is a secure P-signature scheme given the HSDH and TDH assumptions, either the SXDH or DLIN assumption, and the security of the two-party computation protocol.

The proof follows from the F-unforgeability of the multi-block signature scheme and the security of the Groth-Sahai proofs, which depend on either the SXDH or DLIN assumptions. The zero-knowledge simulations are done as sketched above. For details we refer to [GS08, BCKL08].

4.5 Conclusion

A pseudonym system with non-interactive anonymous credential shows is a direct result of replacing CL-signatures with P-signatures in the pseudonym system construction sketched in Section 3.2. We present such a system in [BCKL08].⁵

P-signatures, and in particular their multi-block variant, have, however, many other applications. They can, e.g., form the basis of a general-purpose anonymous credential system. One application that we investigate in more detail in this thesis is their use in constructing a compact e-cash scheme that does not rely on the random oracle heuristic in its security proof.

⁵Non-interactivity together with randomizable proofs is an important property that will later on allow us to construct delegatable anonymous credentials, in which a non-interactive credential is randomized and then passed on to the recipient of the delegation.


Chapter 5

Compact e-Cash Based on a Common Reference String

Electronic cash (e-cash) was introduced by Chaum [Cha82] as an electronic analogue of physical cash money¹ and has been a subject of ongoing cryptographic research since then [CFN90, FY93, CP93a, Bra93a, CPS94, SPC95, FTY96, Tsi97, BP02]. The participants in an e-cash system are: users who withdraw and spend e-cash, a bank that creates e-cash and accepts it for deposit, and merchants who offer goods and services in exchange for e-cash and then deposit the e-cash at the bank. The main security requirements are (1) unforgeability: even if all the users and all the merchants collude against the bank, they still cannot deposit more money than they withdrew; (2) anonymity: even if the bank and the merchant and all the remaining users collude with each other, they still cannot distinguish Alice's purchases from Bob's.

Unfortunately, it is easy to see that, as described above, e-cash is useless. The problem is that here money is represented by bits, and it is possible to copy bits. Suppose Alice pays the same coin to two different merchants. Unforgeability will guarantee that the bank will only honor at most one of the two copies for deposit and will reject the other one. Anonymity will guarantee that there is no recourse against Alice. So one of the merchants will be cheated. There are two known remedies against this double-spending behavior. The first remedy is on-line e-cash [Cha82], where the bank is asked to vet a coin before the spend protocol can terminate successfully. The second remedy is off-line e-cash, introduced by Chaum, Fiat and Naor [CFN90]. The additional requirement of an off-line e-cash system is (informally) that no coin can be double-spent without revealing the identity of the perpetrator.

¹An interesting related concept is quantum money [BBBW82, Aar09].


A further development in the literature on e-cash was compact e-cash [CHL05]. In compact e-cash, the user withdraws N coins in a withdrawal protocol whose complexity is O(log N) rather than O(N). The main idea is as follows: in the withdrawal protocol, a user obtains the bank's signature on (x, s, t), where s and t are random seeds of a pseudo-random function (PRF) f_(·)(·) and x is the user's identifier. Note that in the withdrawal protocol s and t are kept hidden from the bank. In the spend protocol, the serial number of the ith coin is computed as S = f_s(i), and a double-spending equation is computed as T = x + R·f_t(i), where R is a random challenge by the merchant. The coin itself consists of (S, T, R, π), where π is a non-interactive zero-knowledge proof of knowledge of the values x, s, t, i, σ, where σ is the bank's signature on (x, s, t), 1 ≤ i ≤ N, S = f_s(i), and T = x + R·f_t(i) mod q. If g is a generator of a group G of order q, and G is the range of the PRF f_(·)(·), then the double-spending equation can instead be computed as T = g^x·f_t(i)^R. It is easy to see that two double-spending equations for the same t, i but different R's allow us to compute g^x. It was shown that this approach yields a compact e-cash scheme [CHL05]. Later, this was extended to so-called e-tokens [CHK+06] that allow up to n anonymous transactions per time period (for example, this would correspond to subscriptions to interactive game sites or anonymous sensor reports).
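The identification mechanism behind the double-spending equation is simple linear algebra over Zq: two spends T1 = x + R1·f_t(i) and T2 = x + R2·f_t(i) of the same coin with R1 ≠ R2 form a solvable linear system in the two unknowns x and f_t(i). A sketch with toy numbers (none of the concrete values are tied to any instantiation):

```python
# Double-spending arithmetic: T = x + R * f_t(i) mod q. Two spends of
# the same coin (same seed t and index i) under different challenges
# R1 != R2 let the bank solve for the user identifier x.
q = 509
x, ft_i = 123, 77          # user identifier and PRF value f_t(i)

def double_spend_tag(R):
    return (x + R * ft_i) % q

R1, R2 = 5, 11
T1, T2 = double_spend_tag(R1), double_spend_tag(R2)

# Bank's side: eliminate f_t(i) by subtracting, then recover x.
ft = ((T1 - T2) * pow(R1 - R2, -1, q)) % q
recovered = (T1 - R1 * ft) % q
assert ft == ft_i and recovered == x
```

The group-element variant T = g^x·f_t(i)^R admits the same elimination in the exponent, yielding g^x rather than x itself.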

However, until now no efficient instantiations of the NIZK proofs could be given, and all practical instantiations of compact e-cash had to derive the non-interactive proofs from interactive proofs via the Fiat-Shamir heuristic [FS87], which is known not to yield provably secure constructions [GK03].

Challenges and Techniques. Constructing an efficient provably secure compact e-cash scheme is not simply a matter of replacing the Fiat-Shamir based NIZK proofs with the Groth-Sahai system. There are several issues that arise when we attempt to apply the Groth-Sahai proofs. First, recall that the Groth-Sahai system only works for proofs of particular types of statements. Thus, we must find a PRF and a signature scheme where verification can be phrased in terms of such statements. In the case of the PRF, we use a modification of the Dodis-Yampolskiy verifiable random function (VRF) [DY05], which outputs elements of a bilinear group G1. We show that this is secure under the assumption that DDHI holds in this group.
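The Dodis-Yampolskiy construction evaluates as f_s(i) = g^{1/(s+i)}. Its exponent algebra can be sketched in a toy prime-order subgroup of Z_p^*; the variant actually used in this chapter lives in a bilinear group G1, and the verification and proof machinery is omitted here. Parameters and the seed below are illustrative.

```python
# Toy Dodis-Yampolskiy-style PRF: f_s(i) = g^{1/(s + i)} in the order-q
# subgroup of Z_p^*, p = 2q + 1. Distinct inputs give distinct inverse
# exponents mod q, hence distinct group elements.
p, q, g = 1019, 509, 4
s = 123                          # secret seed

def f(s, i):
    return pow(g, pow((s + i) % q, -1, q), p)

serials = [f(s, i) for i in range(1, 6)]   # serial numbers of coins 1..5
assert len(set(serials)) == 5              # all serial numbers distinct
```

This is exactly the shape of the coin serial numbers S = f_s(i) in the compact e-cash sketch above: one secret seed s yields N unlinkable-looking but deterministic serial numbers.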

For the signature scheme, we note that verification of Boneh-Boyen signatures [BB08] can be phrased as a pairing product equation. However, because Groth-Sahai proofs only allow for the extraction of group elements, we need a stronger notion of unforgeability: it must be impossible to produce F(m) together with a valid signature σ for an unsigned message m, where F(m) is a value that can be extracted from a commitment to m. We also need the bank to be able to sign a larger set of messages. Our multi-block P-signatures from Chapter 4 fulfill all of these requirements.


We also need to be able to prove that the coin value falls within a given range. The original Camenisch et al. construction uses a technique by [Bou00], which relies on the fact that the underlying RSA group has unknown order. Groth-Sahai proofs, on the other hand, rely on the cryptographic bilinear group model, and it is not known how to construct such groups with unknown order. Thus, we must use a different technique for our range proofs. We follow the basic concept of [TS06, CCS08], and implement the range proofs using the new P-signatures mentioned above.

Finally, while Groth and Sahai present a NIZK proof system for a large class of statements, their simpler witness-indistinguishable proof system is much more efficient. Thus, we specifically design our protocols to use NIZK proofs only when necessary. As a result, we obtain a construction that is almost competitive in efficiency with the original Camenisch et al. construction (with a gap of less than an order of magnitude).

Simulatable verifiable random functions. Our main observation is that the NIZK proof for a compact e-cash serial number, a proof of the language

L_F = {(S, C_y, C_s) | ∃ s, y, r_s, r_y such that S = f_s(y), C_y = Com(y, r_y), C_s = Com(s, r_s)},

is a special case of a simulatable verifiable random function (sVRF), introduced by Chase and Lysyanskaya [CL07]. Chase and Lysyanskaya gave an efficient construction of a multi-theorem non-interactive zero-knowledge proof system for any language L from a single-theorem one for the same language (while other single-theorem to multi-theorem transformations required the Cook-Levin reduction [Coo71] to an NP-complete language first).

Chase and Lysyanskaya [CL07] gave two constructions for sVRFs. The first is based on generic non-interactive zero-knowledge proofs and any PRF. It is impractical, as it would have to make use of the Cook-Levin reduction itself. The second construction is based on composite order bilinear pairings [BGN05, FST06], and has several shortcomings. In particular, its range is either only logarithmic in the security parameter or it is only weakly simulatable. Our construction is thus more efficient by a factor of the security parameter k; it is also designed in a way that is more modular and therefore easier to understand (and improve). Therefore, this result is of independent interest.

Our contribution and outline of the chapter. We define the security properties of simulatable verifiable random functions and compact e-cash in Section 5.1. We give our new simulatable verifiable random function and e-cash constructions in Section 5.2 and Section 5.3, respectively.


5.1 Security Notions

We recall the definition of a simulatable verifiable random function (sVRF) of [CL07] and the notion of compact e-cash as put forward in [CHL05].

Definition of simulatable VRF. At a high level, a simulatable verifiable random function (sVRF) is an extension of a pseudo-random function (PRF) (and also of a slightly weaker extension, called a verifiable random function (VRF) [DY05]). It includes a key generation procedure that generates a seed for the PRF along with a corresponding public key. It also includes a proof system for proving that a particular output is correct with respect to a given input and a given public key. We require fairly strong hiding properties from this proof system – in particular, we do not want it to interfere with the pseudorandomness properties of the PRF.

We review the definition of sVRFs (as defined in [CL07]).

Definition 5.1.1 (Trapdoor-indistinguishable sVRF) Let Setup(·) be an algorithm generating public parameters params on input security parameter 1^k. Let D(params) and R(params) be families of efficiently samplable domains for all params ∈ Setup. The set of algorithms (Gen, Eval, Prove, Verify) constitutes a verifiable random function (VRF) for parameters generated by Setup, with input domain D(·) and output range R(·) if:

Correctness. Correctness means that the verification algorithm Verify will always accept (params, pk, x, y, π) when y = Eval(params, sk, x) and π is the proof of this fact generated using Prove.

Pseudorandomness. Pseudorandomness means that, given input (params, pk), even with oracle access to Eval(params, sk, ·) and Prove(params, sk, ·), no p.p.t. adversary can distinguish F_pk(x) from a random element of R(params) without explicitly querying for it.

Verifiability. For all k and for all params ∈ Setup(1^k), there do not exist values (pk, x, y1, π1, y2, π2) such that y1 ≠ y2, but Verify(params, pk, x, y1, π1) = Verify(params, pk, x, y2, π2) = accept.

Trapdoor-Indistinguishable Simulatability.² There exist (SimSetup, SimGen, SimProve) such that the distribution Setup(1^k) is computationally indistinguishable from the distribution SimSetup(1^k), and for all p.p.t. A, A's views in the following two games are indistinguishable:

²Note that TI-Simulatability implies standard Simulatability as described in [CL07], which in turn implies pseudorandomness.


Real. (params, t) ← SimSetup(1^k), (pk, sk) ← Gen(params), and then A(params, t, pk) gets access to the following oracle Real: On query x, Real returns y = Eval(params, sk, x) and π ← Prove(params, sk, x).

Simulated. (params, t) ← SimSetup(1^k), (pk, sk) ← SimGen(params, t), and then A(params, t, pk) gets access to the following oracle S: On query x, S (1) checks if x has previously been queried, and if so, computes π ← SimProve(params, sk, x, y, t) for the stored y and returns (y, π); (2) otherwise, obtains a random y ← R(params) and π ← SimProve(params, sk, x, y, t), returns (y, π), and stores y.

In the rest of the chapter, we will refer to the output of Eval(params, sk, x) as PRF_sk(x) when the parameters are clear.

Trapdoor-indistinguishable simulatability can be shown via a simple hybrid argument to imply full simulatability, where the adversaries can interact with many public keys and still cannot distinguish (Setup, Prove, Eval) from (SimSetup, SimProve) and random sampling from R(params).

For the full definition, see [BCKL09] or [CL07].

Definition of compact e-cash. Compact e-cash as defined by [CHL05] involves a bank I as well as many users U and merchants M. Merchants are treated as a special type of user that has a publicly known identifier idM. The parties I, U, and M interact using the algorithms CashSetup, BankKG, UserKG, SpendCoin, VerifyCoin, Deposit, Identify, and the interactive protocol Withdraw.

CashSetup(1k) creates the public parameters params.

BankKG(params, n) outputs the key pair (pkI, skI) that is used by the bank to issue wallets of n coins. For simplicity, we assume that the secret key contains the corresponding public key.

UserKG(params) generates a user (or merchant) key pair (pkU, skU). The keys are used for entity authentication and non-repudiation. Merchants also have a publicly known identifier idM = f(pkM) associated with their public keys (f is some publicly known mapping).

Withdraw(U(params, pkI, skU), I(params, pkU, skI)) is an interactive protocol in which a user withdraws a wallet W of n coins from the bank, where n is specified in pkI. The wallet includes the public key of the bank. The bank learns some trace information TW that it can later use to identify double-spenders. After a successful protocol run, the bank adds TW to its database DBT.


SpendCoin(params, W, pkM, skU, info) allows a user with a non-empty wallet W and some unique transaction information info to create a coin. The output of the algorithm is (W′, coin), the updated wallet and an e-coin that can be given to a merchant. The e-coin consists of a serial number S, transaction information idM‖info, and a proof π.

VerifyCoin(params, pkM, pkI, coin) allows a merchant to verify coin = (S, π, idM‖info) received from a user. The output of the algorithm is either accept or reject. The merchant accepts the coin on accept, but only if he has never accepted a coin with the same info before. The info value could for instance depend on the current time, and the merchant can keep a list of all info values that contain the same valid time label as part of their information.

Deposit(params, pkI, pkM, coin, stateI) allows the bank to verify a coin received from a merchant. The bank needs to maintain a database stateI of all previously accepted coins. The output of the algorithm is an updated database state′I and the flag result, which can have three values:

(i) accept indicates that the coin is correct and fresh. The bank deposits the value of the e-coin into the merchant's account and adds (pkM, coin) to stateI.

(ii) merchant indicates that either VerifyCoin(params, skM, pkI , coin) =reject, or that stateI already contains an entry (pkM, coin). The bankrefuses to accept the e-coin because the merchant failed to properlyverify it.

(iii) user indicates that there exists a second coin with the same serial numberS registered in stateI . (Using the two coins the bank will identify thedouble-spending user.) The bank pays the merchant (who accepted thee-coin in good faith) and punishes the double-spending user.

Identify(params, pkI , coin, coin′) allows the bank to identify a double-spender. Thealgorithm outputs TW , which the bank compares to the trace information itstores after each withdrawal transaction.

Remark 5.1.2 Camenisch et al. [CHL05] only define spending as the interactiveprotocol Spend(U(params,W , pkM),M(params, skM, pkI)). We can derive theirprotocol from our non-interactive algorithms. First the merchant sends the user arandom info. Then the user runs SpendCoin(params,W , pkM, skU , info) and sendsthe resulting coin back to the merchant. The merchant accepts the e-coin only ifVerifyCoin(params, pkM, pkI , coin) outputs accept and the info used to constructthe e-coin is correct. Non-interactive spend protocols are important when two-waycommunication is not available or impractical, e.g. when sending an e-coin byemail.
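The message flow of Remark 5.1.2 can be sketched as follows. The functions and coin contents below are hypothetical stand-ins (the PRF serial number and the NIZK proof are elided), so the sketch only illustrates the two messages and the merchant's freshness bookkeeping, not the cryptography.

```python
import os

def spend_coin(wallet, sk_user, id_M, info):
    # stand-in for SpendCoin: the real serial number is PRF_s(J) and
    # "pi" is a non-interactive proof; here they are placeholder strings
    coin = {"S": f"prf({wallet['seed']},{wallet['ctr']})",
            "info": id_M + "||" + info,
            "pi": "proof"}
    wallet["ctr"] += 1          # move to the next coin in the wallet
    return coin

def merchant_spend(wallet, sk_user, id_M, seen_infos):
    # 1. merchant chooses a fresh random info and sends it to the user
    info = os.urandom(8).hex()
    # 2. user creates the coin non-interactively and sends it back
    coin = spend_coin(wallet, sk_user, id_M, info)
    # 3. merchant accepts only if the coin embeds his info and info is fresh
    assert coin["info"] == id_M + "||" + info and info not in seen_infos
    seen_infos.add(info)
    return coin
```

Two runs against the same wallet yield coins with distinct serial numbers, which is what allows the bank to recognize honest spending.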


Definition 5.1.3 (Secure Compact E-Cash with Non-Interactive Spend) A compact e-cash scheme consists of the non-interactive algorithms CashSetup, BankKG, UserKG, SpendCoin, VerifyCoin, Deposit, Identify, and the interactive protocol Withdraw. We say that such a scheme is secure if it has the Correctness, Anonymity, Balance, and Identification properties. Weak Exculpability is an additional desirable property that our scheme can achieve.

Correctness. When the bank and user are honest, and the user has sufficient funds, Withdraw will always succeed. An honest merchant will always accept an e-coin from an honest user. An honest bank will always accept an e-coin from an honest merchant.

Anonymity. A malicious coalition of banks and merchants should not be able to distinguish whether the Spend protocol is executed by honest users or by a simulator that does not know any of the users' secret data.

Balance. No coalition of users should be able to deposit more e-coins than they collectively withdrew.

Identification. The bank will be able to identify any user who generates two valid e-coins (i.e. e-coins that pass the VerifyCoin test) with the same serial number.

Weak Exculpability. If a user never double-spends, then even a malicious bank colluding with malicious merchants will not be able to frame the honest user, i.e. to produce coin1, coin2 such that Identify(params, coin1, coin2) = TW , where TW is the trace information for this user.

More formal definitions of security for compact e-cash can be found in [BCKL09].

5.2 Simulatable Verifiable Random Functions

Here we present our new construction for sVRFs. Later, we will show that an extension of this construction (as described in Sections 5.2.2 and 5.2.3) can be used to construct provably secure e-cash.

5.2.1 A New sVRF Construction

Our construction is in the bilinear group setting where (p, G1, G2, GT , e, g, h) ← BMGen(1k). We will use the function PRFs(x) = g^{1/(s+x)} to build an efficient simulatable VRF.3 Note that the base function is similar to the Dodis-Yampolskiy VRF [DY05], which uses the function PRFs(x) = e(g, h)^{1/(s+x)} and thus gives output in GT . Moving our function to output elements in G1 is the crucial step that allows us to use the Groth-Sahai proof techniques.

Theorem 5.2.1 For (p, G1, G2, GT , e, g, h) ← BMGen(1k), let Dk ⊂ Z denote a family of domains of size polynomial in k. If the DDHI assumption holds in G1, then the set {g^{1/(s+x)}}x∈Dk is indistinguishable from the set {g^{rx}}x∈Dk , where s and {rx}x∈Dk are chosen at random from Zp.

The proof is very similar to that in [DY05].
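To make the function concrete, here is a toy evaluation of PRFs(x) = g^{1/(s+x)} in a small prime-order subgroup of Z∗r, with plain modular arithmetic standing in for the bilinear group G1. The parameters are illustrative only and provide no security.

```python
# Toy parameters: r = 2p + 1 with both prime; g generates the order-p subgroup.
p = 1019          # subgroup order: exponents are computed modulo p
r = 2 * p + 1     # 2039, also prime
g = 4             # 2^2 mod r, a generator of the order-p subgroup

def prf(s, x):
    # PRF_s(x) = g^{1/(s+x)}: the exponent is the inverse of s+x modulo p
    return pow(g, pow(s + x, -1, p), r)

s = 123
# raising the output back to the power s+x undoes the inversion in the exponent
y = prf(s, 7)
assert pow(y, s + 7, r) == g
```

Distinct inputs x give distinct pseudorandom-looking group elements, which is exactly how the construction derives coin serial numbers later on.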

We will build an sVRF based on this function as follows:

Setup(1k). Let (p, G1, G2, GT , e, g, h) ← BMGen(1k) be the parameters of a bilinear map and let paramsGS be the parameters for the corresponding Groth-Sahai non-interactive zero-knowledge (NIZK) proof system (either in the XDH or the DLIN setup). Output parameters paramsVRF = ((p, G1, G2, GT , e, g, h), paramsGS).

Keygen(paramsVRF). Pick a random seed s ← Zp and random opening information seed, and output secret key sk = (s, seed) and public key pk = GSCom(h^s, seed).

Eval(paramsVRF , sk = (s, seed), x). Compute y = g^{1/(s+x)}.

Prove(paramsVRF , sk = (s, seed), x). Compute y = g^{1/(s+x)} and Cy = GSCom(y, openy) from random opening openy. Next create the following two proofs: π1, a composable NIZK proof that Cy is a commitment to y; this is a proof that the value v committed to in Cy fulfills the pairing product equation e(v/y, h^θ) = 1 ∧ e(g, h^θ) = e(g, h) (see Section 2.5.5 and [GS08] for details); π2, a GS composable witness-indistinguishable proof that Cy is a commitment to some value Y and pk is a commitment to some value S such that e(Y, S·h^x) = e(g, h). Output π = (Cy, π1, π2).

VerifyProof(params, pk, x, y, π = (Cy, π1, π2)). Use the Groth-Sahai verification to verify π1 and π2 with respect to Cy, x, pk, y.

Theorem 5.2.2 This construction with domain size q is a strong sVRF under the q-DDHI assumption for G1 and under the assumption that the Groth-Sahai proof system is secure. For the proof, see [BCKL09] or Appendix A.4.

3 This is the same function that underlies the weak Boneh-Boyen signature scheme [BB08].


5.2.2 NIZK Proofs for Pseudo-Random Functions

In some applications, we need something stronger than an sVRF. In our e-cash application, we need to be certain that the proofs will reveal no information about which wallet was used, which means that they should completely hide the seed used. Furthermore, we do not want to reveal which coin in the wallet is being spent, thus we also want to hide the input x.

Thus, we will build a composable NIZK proof for the following language:

LS = { (Cs, Cx, y) | ∃ x, s, openx, opens such that Cs = Com(s, opens) ∧ Cx = Com(x, openx) ∧ y = fs(x) }.

Note that there are four points where an sVRF proof is weaker than a full NIZK proof. First, the sVRF public key is not guaranteed to hide the secret key, only to hide enough information to preserve the pseudorandomness of the output values. However, this is not a problem in the above construction, since our public key is formed as a commitment. Second, an sVRF has a fixed public key, while we want to be able to compute unlinkable proofs for many different values of the PRF. This again is not relevant in the above construction: since we form our public key using a commitment scheme, we can easily use a different value in each proof. Third, in the sVRF proof, the input x is given in the clear. We can fix this fairly easily by replacing x by a commitment and proof. The final difference is that the sVRF proof need not be fully zero-knowledge – the sVRF simulator is given the secret key as input (in our construction, the opening of the commitment Cs). We resolve this last point by adding extra commitments C′s, C′x (whose opening the zero-knowledge simulator will know), and zero-knowledge proofs that they commit to the same values as Cs, Cx.

On input (Cs, Cx, y) and (x, s, openx, opens), a NIZK proof of membership in LS is done as follows: We first compute a commitment C′s to h^s. Then we compute Cy, π1 as in the sVRF Prove protocol, with pk = C′s. Next we compute a commitment C′x to h^x, and a GS composable witness-indistinguishable proof π2 that Cy is a commitment to Y , C′x is a commitment to X, and C′s is a commitment to S such that e(Y, S·X) = e(g, h). Finally, to make the construction zero-knowledge, we add composable NIZK proofs πs and πx that Cs and C′s, and Cx and C′x, are commitments to the same values. Let v be s or x, respectively. Then each proof is a proof that the values v and v′ committed to in Cv and C′v respectively fulfill the pairing product equation e(v/v′, h^θ) = 1 ∧ e(g, h^θ) = e(g, h). See [GS08] for why this is zero-knowledge. The final proof is π = (C′s, C′x, Cy, π1, π2, πs, πx).

The proof is verified using the Groth-Sahai verification techniques to check π1, π2, πs, πx with respect to Cs, Cx, y, C′s, C′x, Cy.


Theorem 5.2.3 The above proof system is a secure composable zero-knowledge proof system for the language LS(params), where params is output by Setup.

The proof appears in [BCKL09].

5.2.3 NIZK Proofs for Double-Spending Equations

In our application, we use NIZKs about PRFs in two different places. The first is to prove that a given serial number has been computed correctly as PRFs(x) according to a committed seed s and committed input x. That can be done using the NIZK protocol described in the previous section. However, we also need to be able to prove that the double-spending value T has been computed correctly. Thus, we also need a proof system for the following language:

LT = { (Cs, Cx, Csk , tag, ch) | ∃ x, s, sk, openx, opens, opensk such that Cs = Com(s, opens) ∧ Cx = Com(x, openx) ∧ Csk = Com(sk, opensk ) ∧ tag = (g^sk)^ch · fs(x) }.

We can generalize our above proof system to handle this as well.

Prove(params, Cs, Cx, Csk , tag, ch, s, opens, x, openx, sk, opensk ). We first form new commitments C′s = GSCom(h^s, open′s), C′x = GSCom(h^x, open′x), and C′sk = GSCom(h^sk , open′sk ). Then we compute zero-knowledge proofs π1 for (Cs, C′s), π2 for (Cx, C′x), and π3 for (Csk , C′sk ), showing that both commitments in each pair commit to the same value, using the techniques described in Section 2.5.5.

Next, we compute a commitment C′y to PRFs(x), and a commitment C′′sk = GSCom(g^sk , open′′sk ).

We then compute a GS witness-indistinguishable proof π4 that the value committed to in C′y is the correct output given the seed in C′s and the input in C′x, i.e. that C′y commits to Y , C′s commits to S, and C′x commits to X such that e(Y, S·X) = e(g, h).

Next we compute a GS witness-indistinguishable proof π5 that the value committed to in C′′sk is correct with respect to C′sk , i.e. that C′′sk commits to K′′ and C′sk commits to K′ such that e(K′′, h) = e(g, K′).

We can also compute C′tag = (C′′sk )^ch · C′y. Note that by the homomorphic properties of the commitment scheme, C′tag is a commitment to (g^sk)^ch · y, which is the correct value for tag.

Finally, we compute a zero-knowledge proof π6 that C′tag is a commitment to tag, as in Section 2.5.5.
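The homomorphic step C′tag = (C′′sk)^ch · C′y can be checked with a toy ElGamal-style commitment to group elements (componentwise multiplication and exponentiation). The group, bases, and randomness below are illustrative stand-ins for the GS commitment scheme, not its actual instantiation.

```python
p, r, g = 1019, 2039, 4          # toy order-p subgroup of Z_r^*
h = pow(g, 123, r)               # second base (toy; its dlog should be unknown)

def commit(m, rnd):
    # ElGamal-style commitment to a group element m with randomness rnd
    return (pow(g, rnd, r), (m * pow(h, rnd, r)) % r)

def cmul(c1, c2):                # Com(m1) * Com(m2) = Com(m1 * m2)
    return ((c1[0] * c2[0]) % r, (c1[1] * c2[1]) % r)

def cpow(c, e):                  # Com(m)^e = Com(m^e)
    return (pow(c[0], e, r), pow(c[1], e, r))

sk, ch = 77, 13
gsk = pow(g, sk, r)              # g^sk, committed in C''_sk
y = pow(g, 55, r)                # stands in for PRF_t(J), committed in C'_y
tag = (pow(gsk, ch, r) * y) % r  # tag = (g^sk)^ch * y

C_sk = commit(gsk, 3)
C_y = commit(y, 9)
# combining the commitments yields a commitment to tag with known randomness
assert cmul(cpow(C_sk, ch), C_y) == commit(tag, 3 * ch + 9)
```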


The final proof is π = (C′s, C′x, C′sk , C′y, C′′sk , C′tag, π1, π2, π3, π4, π5, π6).

VerifyProof(params, Cs, Cx, Csk , tag, ch, π = (C′s, C′x, C′sk , C′y, C′′sk , C′tag, π1, π2, π3, π4, π5, π6)). Uses the Groth-Sahai proof verification procedure to verify π1, π2, π3, π4, π5, π6 with respect to Cs, Cx, Csk , tag, ch, C′s, C′x, C′sk , C′y, C′′sk , C′tag.

Theorem 5.2.4 The proof system Setup, Prove, VerifyProof is a secure composable zero-knowledge proof system for the language LT (params) described above, where params is output by Setup.

5.3 Construction of Compact E-Cash

We construct a compact e-cash scheme using our multi-block P-signatures and sVRF protocols.

CashSetup(1k). The setup runs SigSetup(1k) and returns the P-signature parameters params. Our construction is non-black-box: we reuse the GS NIZK proof system parameters paramsGS that are contained in params. The parameters paramsGS in turn contain the setup paramsBM = (p, G1, G2, GT , e, g, h) for a bilinear pairing e : G1 × G2 → GT over groups of prime order p.

BankKG(params, n). The bank creates two P-signature key pairs: (pkw, skw) ← Keygen(params) for issuing wallets and (pkc, skc) ← Keygen(params) for signing coin indices. Then the bank computes P-signatures on the n coin indices, Σ1, . . . , Σn, where Σi = Sign(skc, i). The bank's secret key is skI = (skw, skc) and its public key is pkI = (pkw, pkc, Σ1, . . . , Σn).

UserKG(params). The user picks skU ← Z∗p and returns (pkU = e(g, h)^{skU}, skU ). Merchants generate their keys in the same way, but also have a publicly known identifier idM = f(pkM) associated with their public keys (f is some publicly known mapping).

Withdraw(U(params, pkI , skU , n), I(params, pkU , skI , n)). The user obtains a wallet from the bank.

1. The user picks s′, t′ ← Zp; computes commitments commsk = Com(skU , opensk ), comms′ = Com(s′, opens′), and commt′ = Com(t′, opent′); and sends commsk , comms′ , and commt′ to the bank. The user proves in zero-knowledge that he knows the openings of these values, and that commsk corresponds to the secret key used for computing pkU .4

4 These and the rest of the proofs in the issue protocol can be done using efficient Σ-protocols [CDS94, Cra97, Dam02] and their zero-knowledge compilers [Dam00].


2. If the proofs verify, the bank sends the user random values s′′, t′′ ∈ Zp.

3. The user picks random opens, opent; commits to comms = Com(s′ + s′′, opens) and commt = Com(t′ + t′′, opent); sends comms and commt to the bank; and proves that they are formed correctly. Let s = s′ + s′′ and t = t′ + t′′.

4. The user and bank run ObtainSig(params, pkw, (skU , s, t), (opensk , opens, opent)) ↔ IssueSig(params, skw, (commsk , comms, commt)), respectively. The user obtains a P-signature σ on (skU , s, t). The user stores the wallet W = (s, t, pkI , σ, 0); the bank stores tracing information TW = pkU .
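The blinded choice of the wallet seed s = s′ + s′′ in steps 1–3 can be sketched as follows; the hash commitment here is a hypothetical stand-in for the Pedersen-style commitment comms′ used in the protocol.

```python
import hashlib
import secrets

p = 1019  # toy group order; wallet seeds live in Z_p

# step 1 (user): choose s' and commit to it before seeing the bank's share
s_user = secrets.randbelow(p)
opening = secrets.token_bytes(16)
comm = hashlib.sha256(opening + s_user.to_bytes(2, "big")).hexdigest()

# step 2 (bank): contribute fresh randomness s'' only after receiving comm
s_bank = secrets.randbelow(p)

# step 3 (user): the final seed mixes both shares, so neither party alone
# controls s; the bank can later check comm against (opening, s')
s = (s_user + s_bank) % p
assert comm == hashlib.sha256(opening + s_user.to_bytes(2, "big")).hexdigest()
```

The commitment pins down s′ before s′′ is revealed, which is what makes the resulting seed jointly random.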

SpendCoin(params, (s, t, pkI , σ, J), pkM, skU , info). The user increases the counter J by one and checks that the resulting value is smaller than or equal to n. Then he calculates a serial number S = PRFs(J) = g^{1/(s+J)}. The user needs to prove that he knows a signature σ on (skU , s, t) and a signature ΣJ on J such that S = PRFs(J). Next the user constructs a double-spending equation T = (g^{idM‖info})^{skU} · PRFt(J).5 The user proves that T is correctly formed for the values skU , t, and J signed in σ and ΣJ .

All these proofs need to be done non-interactively. We now give more details. The user runs Prove, first on σ and pkw to obtain commitments and proof ((Cid , Cs, Ct), π1, (opensk , opens, opent)) ← Prove(params, pkw, σ, (skU , s, t)) for skU , s, and t, respectively, and second on ΣJ and pkc to obtain commitment and proof (CJ , π2, openJ ) ← Prove(params, pkc, ΣJ , J) for J. Then the user constructs non-interactive zero-knowledge proofs that indeed (S, T, Cid , Cs, Ct, CJ , idM‖info) are well formed. This is done by computing two proofs πS and πT : πS proves that (Cs, CJ , S) ∈ LS and is computed as described in Section 5.2.2, where LS is defined as:

LS = { (Cs, Cx, y) | ∃ x, s, openx, opens such that Cs = Com(s, opens) ∧ Cx = Com(x, openx) ∧ y = fs(x) };

πT proves that (Ct, CJ , Cid , T, idM‖info) ∈ LT and is computed as described in Section 5.2.3, where LT is defined as:

LT = { (Cs, Cx, Csk , tag, ch) | ∃ x, s, sk, openx, opens, opensk such that Cs = Com(s, opens) ∧ Cx = Com(x, openx) ∧ Csk = Com(sk, opensk ) ∧ tag = (g^sk)^ch · fs(x) }.

The user outputs a coin = (S, T, Cid , Cs, Ct, CJ , π1, π2, πS , πT , idM‖info).

5 The merchant is responsible for ensuring that he never accepts two coins with the same info. Coins which have the same serial number and the same idM‖info cannot be deposited, and the damage lies with the merchant. The danger that users get cheated by merchants that do not accept coins with correct info can be mitigated using techniques such as endorsed e-cash [CLM07].


VerifyCoin(params, pkM, pkI , coin). To verify, parse coin as (S, T, Cid , Cs, Ct, CJ , π1, π2, πS , πT , idM′‖info) and check that the following checks succeed:

1. Check that idM′ = f(pkM).
2. VerifySig(params, pkw, π1, (Cid , Cs, Ct)) = accept.
3. VerifySig(params, pkc, π2, CJ ) = accept.
4. VerifyProofLS (paramsGS , (Cs, CJ , S), πS) = accept.
5. VerifyProofLT (paramsGS , (Ct, CJ , Cid , T, idM′‖info), πT ) = accept.

Note that the merchant is responsible for ensuring that info is unique over all of his transactions. Otherwise his deposit might get rejected by the following algorithm.

Deposit(params, pkI , pkM, coin, stateI). The algorithm parses the coin as coin = (S, T, Cid , Cs, Ct, CJ , π1, π2, πS , πT , idM‖info) and performs the same checks as VerifyCoin. The bank maintains a database stateI of all previously accepted coins. The output of the algorithm is an updated database state′I = stateI ∪ {coin} and the flag result, which is computed as follows:

(i) If the coin verifies and no coin with serial number S is stored in stateI , result = accept to indicate that the coin is correct and fresh. The bank deposits the value of the e-coin into the merchant's account and adds coin to stateI .

(ii) If the coin doesn't verify, or if a coin with the same serial number and the same idM‖info is already stored in stateI , result = merchant to indicate that the merchant cheated. The bank refuses to accept the e-coin because the merchant failed to properly verify it.

(iii) If the coin verifies but there is a coin with the same serial number S but different idM‖info in stateI , result = user to indicate that a user double-spent. The bank pays the merchant (who accepted the e-coin in good faith) and punishes the double-spending user.
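The three-way Deposit decision can be captured in a few lines. Verification of the signatures and proofs is elided (verify below is a stub), so this sketch only shows the database logic that separates merchant replay from user double-spending.

```python
def verify(coin):
    # stub for the VerifyCoin checks (signatures and NIZK proofs)
    return True

def deposit(state, coin):
    """coin is modelled as a pair (S, id_info) of serial number and idM||info."""
    S, id_info = coin
    if not verify(coin):
        return state, "merchant"
    for S2, id_info2 in state:
        if S2 == S and id_info2 == id_info:
            return state, "merchant"          # exact coin deposited twice
        if S2 == S:
            return state | {coin}, "user"     # same S, different idM||info
    return state | {coin}, "accept"

state = set()
state, r1 = deposit(state, ("S1", "M1||t1"))   # fresh coin
state, r2 = deposit(state, ("S1", "M1||t1"))   # merchant replays it
state, r3 = deposit(state, ("S1", "M2||t2"))   # user double-spent
assert (r1, r2, r3) == ("accept", "merchant", "user")
```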

Identify(params, pkI , coin1, coin2) allows the bank to identify a double-spender. Parse coin1 = (S, T, Cid , Cs, Ct, CJ , π1, π2, πS , πT , idM1‖info1) and coin2 = (S′, T′, C′id , C′s, C′t, C′J , π′1, π′2, π′S , π′T , idM2‖info2). The algorithm aborts if one of the coins doesn't verify, if S ≠ S′, or if idM1‖info1 = idM2‖info2. Otherwise, the algorithm outputs TW = pkU = e((T/T′)^{1/(idM1‖info1 − idM2‖info2)}, h), which the bank compares to the trace information it stores after each withdrawal transaction.
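The identification equation can be checked in the same toy group as the earlier PRF sketch. The pairing e(·, h) is omitted, so we recover g^sk directly rather than pkU = e(g, h)^skU, and the two idM‖info values are modelled as small integers ch1, ch2.

```python
# toy recovery of a double-spender's g^sk from two double-spending tags
p, r, g = 1019, 2039, 4        # order-p subgroup of Z_r^*, as before

sk = 321                       # the cheating user's secret key
y = pow(g, 55, r)              # PRF_t(J), identical in both spendings
ch1, ch2 = 17, 29              # the two distinct idM||info values, as numbers

T1 = (pow(g, sk * ch1, r) * y) % r   # tag from the first spending
T2 = (pow(g, sk * ch2, r) * y) % r   # tag from the second spending

# the PRF part cancels in T1/T2, and 1/(ch1-ch2) undoes the exponent
ratio = (T1 * pow(T2, -1, r)) % r
recovered = pow(ratio, pow(ch1 - ch2, -1, p), r)
assert recovered == pow(g, sk, r)
```

Because the PRF value is the same in both tags, only the user-specific part survives the division, which is why two spendings with the same serial number expose the user.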

Theorem 5.3.1 This e-cash scheme is a secure compact e-cash scheme given the security of the P-signature scheme, the PRF, and the Groth-Sahai NIZK proof system.

For the proof, see Appendix A.5.


5.4 Conclusion

Using multi-block P-signatures and other cryptographic machinery, such as simulatable verifiable random functions (sVRFs) and non-interactive range proofs, we constructed the first compact e-cash scheme that does not rely on the random oracle model in its proof. Such a scheme can also be seen as an accountability mechanism that makes non-interactive anonymous credentials showable only n times.

The next chapter shows how to restrict the number of shows per time period. For a change, this will be done in the interactive setting. The result is a more efficient scheme that can, however, only be made non-interactive using the Fiat-Shamir heuristic, which can only be proven secure using random oracles. While the techniques in this chapter might be applicable, this would require further work.


Chapter 6

Periodic n-Times Spending

In [CHK+06] we present a solution for periodic n-times use of anonymous credentials or other cryptographic tokens that allow for the authentication of otherwise unlinkable transactions. This solution considerably improves on previous work by Damgård, Dupont and Pedersen [DDP06]. We motivate the need for such a cryptographic mechanism using two different scenarios.

Sensors scenario. As computer devices get smaller and less intrusive, it becomes possible to place them everywhere and use them to collect information about their environment. For example, with today's technology, sensors mounted on vehicles may report to a central traffic service which parts of the roads are treacherous, thus assisting people in planning their commutes. Some have proposed mounting sensors in refrigerators to report the consumption statistics of a household, thus aiding in public health studies, or even mounting them in people's bodies in an attempt to aid medical science. In all of these areas, better information may ultimately lead to a better quality of life.

Yet this vision appears to be incompatible with privacy. A sensor installed in a particular car will divulge that car's location, while one installed in a fridge will report the eating and drinking habits of its owner.

A naive solution would be to supply only the relevant information and nothing else.1 A report about the road conditions should not say which sensor made the measurement. However, then nothing would stop a malicious party from supplying lots of false and misleading data. We need to authenticate the information reported by a sensor without divulging the sensor's identity. We also need a way to deal with rogue sensors, i.e., formerly honest sensors with valid cryptographic keys that are captured by a malicious adversary and used to send lots of misleading data.

1 Note that divulging the relevant information alone may already constitute a breach of privacy; in this thesis, we do not address this aspect of the problem; it has more to do with statistical properties of the data itself. See Sweeney [Swe02] and Chawla et al. [CDM+05] on the challenges of determining which data is and is not safe to reveal.

Software license scenario. The same problem arises in other scenarios. Consider an interactive computer game. Each player must have a license to participate, and prove this fact to an on-line authority every time she wishes to play. For privacy reasons, the player does not wish to reveal anything other than the fact that she has a license. How can we prevent a million users from playing the game for the price of just one license?

Previous work. A suite of cryptographic primitives such as group signatures [CvH91, CS97a, ACJT00, BBS04] and anonymous credentials [Cha85, Dam90, LRSW99, CL01, CL02b, CL04] has been developed to let us prove that a piece of data comes from an authorized source without revealing the identity of that particular source. However, none of the results cited above provide a way to ensure anonymity and unlinkability of honest participants while at the same time guaranteeing that a rogue cannot undetectably provide misleading data in bulk. Indeed, it seems that the ability to provide false data is a consequence of anonymity.

Damgård, Dupont and Pedersen [DDP06] presented a scheme that overcomes this paradox. The goal is to allow an honest participant to anonymously and unlinkably submit data at a small rate (for example, reporting on road conditions once every fifteen minutes, or joining one game session every half an hour), and at the same time to have a way to identify participants that submit data more frequently. This limits the amount of false information a rogue sensor can provide or the number of times that a given software license can be used per time period.

While the work of Damgård et al. is the first step in the right direction, their approach yields a prohibitively expensive solution. To authenticate itself, a sensor acts as a prover in a zero-knowledge (ZK) proof of knowledge of a relevant certificate. In their construction, the zero-knowledge property crucially depends on the fact that the prover must make some random choices; should the prover ever re-use the random choices he made, the prover's secrets can be efficiently computed from the two transcripts. The sensor's random choices are a pseudo-random function of the current time period (which must be proven in an additional ZK proof protocol). If a rogue sensor tries to submit more data in the same time period, he will have to use the same randomness in the proof, thus exposing his identity. It is very challenging to instantiate this solution with efficient building blocks. Damgård et al. use the most efficient building blocks available, and also introduce some of their own; their scheme requires that the user perform 57 + 68k exponentiations to authenticate, where k is the security parameter (a sensor can cheat with probability 2^{−k}).

Our solution will use techniques from compact e-cash [CHL05, CHL06] to solve this problem in a more efficient way. A generic attempt to solve this problem using anonymous e-cash [Cha82] could work as follows: each report needs to be submitted together with an e-coin; after the submission of a report and a check that the coin was not double-spent and is not expired, a new coin is issued that can only be used in the next time period. The disadvantage of such an approach is that the e-coin issuer needs to be constantly online, and sensors need to continuously submit reports to make sure that their e-coins do not expire.

Outline of the chapter. Our main contribution is a new approach to the problem that is an order of magnitude more efficient than the solution of Damgård et al. In Section 6.2, we present our basic construction, which is based on previously proposed complexity-theoretic assumptions (SRSA and y-DDHI, cf. Section 2.3.3) and is secure in the plain model. We provide details on efficiency in Section 6.2.2. Our construction builds on prior work on anonymous credentials [CL01, Lys02], so that it is easy to see which parts need to be slightly modified, using standard techniques, to add additional features such as an anonymity-revoking trustee, identity attributes, etc. The computational cost of these additional features is a few additional modular exponentiations per transaction.

In Section 6.3, we consider more variations of our basic scheme. We show how to enable the issuer and verifiers to prove to third parties that a particular user has (excessively) reused e-tokens (this is called weak exculpability). In another variation we enable the issuer and verifiers to trace all e-tokens of a dispenser that was excessively reused (this is called tracing). We also show, in the public parameters and random-oracle models, how to achieve strong exculpability: here the honest verifiers can prove to third parties that a user reused a particular e-token. Finally, we explain how e-token dispensers can be revoked; this requires a model where the revocation authority can continuously update the issuer's public key.

6.1 Security Notion

Our definitions for periodic n-times anonymous authentication are based on the e-cash definitions of [CHL05] and [CHL06]. We define a scheme where users U obtain e-token dispensers from the issuer I, and each dispenser can dispense up to n anonymous and unlinkable e-tokens per time period, but no more; these e-tokens are then given to verifiers V that guard access to a resource that requires authentication (e.g., an on-line game). U , V, and I interact using the following algorithms:


– IKeygen(1k, params) is the key generation algorithm of the e-token issuer I. It takes as input 1k and, if the scheme is in the public parameters model, these parameters params. It outputs a key pair (pkI , skI). Assume that params are appended as part of pkI and skI .

– UKeygen(1k, pkI) creates the user's key pair (pkU , skU ) analogously.

– Obtain(U(pkI , skU , n), I(pkU , skI , n)). At the end of this protocol, the user obtains an e-token dispenser D usable n times per time period, and (optionally) the issuer obtains tracing information tD and revocation information rD. The issuer adds tD and rD to a record RU that is stored together with pkU .

– Show(U(D, pkI , t, n), V(pkI , t, n)) shows an e-token from dispenser D in time period t. The verifier outputs a token serial number (TSN) S and a transcript τ . The user's output is an updated e-token dispenser D′.

– Identify(pkI , S, τ, τ ′). Given two records (S, τ) and (S, τ ′) output by honest verifiers in the Show protocol, where τ ≠ τ ′, the algorithm computes a value sU that can identify the owner of the dispenser D that generated TSN S. The value sU may also contain additional information specific to the owner of D that (a) will convince third parties that U is a violator (weak exculpability), (b) will convince third parties that U double-showed this e-token (strong exculpability), or (c) can be used to extract all token serial numbers of U (traceability).

A periodic n-times anonymous authentication scheme needs to fulfill the following three properties:

Soundness. Given an honest issuer, a set of honest verifiers are guaranteed that, collectively, they will not have to accept more than n e-tokens from a single e-token dispenser in a single time period. There is a knowledge extractor E that executes u Obtain protocols with all adversarial users and produces functions f1, . . . , fu with fi : T × I → S, where I is the index set [0..n − 1], T is the domain of the time period identifiers, and S is the domain of TSNs. Running through all j ∈ I, fi(t, j) produces all n TSNs for dispenser i at time t ∈ T. We require that for every adversary, the probability that an honest verifier accepts S as a TSN of a Show protocol executed in time period t, where S ≠ fi(t, j) for all 1 ≤ i ≤ u and 0 ≤ j < n, is negligible.
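The extractor's functions fi can be pictured as follows, with a hash standing in for the PRF that generates token serial numbers in the construction of Section 6.2; the function f below is a hypothetical illustration, not the scheme's actual PRF.

```python
import hashlib

def f(seed, t, j):
    # stand-in for f_i(t, j): deterministically maps a dispenser seed,
    # a time period t, and an index j < n to a token serial number
    return hashlib.sha256(f"{seed}|{t}|{j}".encode()).hexdigest()

n = 5
seed, t = "dispenser-1", 42
predicted = {f(seed, t, j) for j in range(n)}   # all n TSNs for period t

# soundness demands that any TSN accepted in period t lies in this set
shown = f(seed, t, 3)
assert shown in predicted and len(predicted) == n
```

Since the set has exactly n elements per time period, a verifier coalition that checks TSNs against it can never be forced to accept more than n tokens from one dispenser.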

Identification. There exists an efficient function φ with the following property. Suppose the issuer and verifiers V1, V2 are honest. If V1 outputs (S, τ) and V2 outputs (S, τ ′) with τ ≠ τ ′ as the result of Show protocols, then Identify(pkI , S, τ, τ ′) outputs a value sU such that φ(sU ) = pkU , the violator's public key. In the sequel, when we say that a user has reused an e-token, we mean that there exist (S, τ) and (S, τ ′) that are both output by honest verifiers.


Anonymity. An issuer, even when cooperating with verifiers and dishonest users, cannot learn anything about an honest user's e-token usage behavior except what is available from side information from the environment. This property is captured by a simulator S that can interact with the adversary as if it were the user. S does not have access to the user's secret or public key, or her e-token dispenser D.

Formally, we create an adversary A that will play the role of the issuer and of all verifiers. A will create the public and private keys of the issuer and verifiers. Then, A will be given access to an environment E that is either using real users or a simulator; A must determine which. A can make four types of queries to E:

EnvSetup(1k) generates the public parameters params (if any) and the private parameters sim for the simulator (if there is one).

EnvGetPK(i) returns the public key of user Ui, generated by UKeygen(1k, pkI).

EnvObtain(i) runs the Obtain protocol with Ui: Obtain(U(pkI , skUi , n), A(state)). We use state to denote whatever state the adversary maintains, and use Dj to denote the dispenser generated during the jth run of the Obtain protocol.

EnvShow(j, pkI , t) behaves differently depending on whether the environment is using a simulator. If the environment is using real users, it will simply run the Show protocol with the user U that holds the dispenser Dj : Show(U(Dj , pkI , t, n), A(state)). If the environment is using a simulator S, then it will run the Show protocol with it: Show(S(params, sim, pkI ), A(state)); S will not have access to the dispenser Dj or know who owns it.

An adversary is legal if it never asks a user to use the same dispenser to show more than n e-tokens in the same time interval. We say that an e-token scheme preserves anonymity if no computationally bounded legal adversary can distinguish when the environment is playing with users and when it is using a simulator.

Additional Extensions. In Section 6.3, we discuss natural extensions to our basic construction that build on prior work on anonymous credentials and e-cash, namely the concepts of weak and strong exculpability, tracing, and revocation. We now define the corresponding algorithms and security guarantees for these extensions:

– VerifyViolator(pkI , pkU , sU ) publicly verifies that the user with public key pkU has double-spent at least one e-token.

– VerifyViolation(pkI , S, pkU , sU ) publicly verifies that the user with public keypkU is guilty of double-spending the e-token with TSN S.

– Trace(pkI , pkU , sU , RU , n), given a valid proof sU and the user's tracing record RU , computes all TSNs corresponding to this user. Suppose the user has obtained u e-token dispensers; then Trace outputs functions f1, . . . , fu such that by running through all j ∈ [0..n − 1], fi(t, j) produces all n TSNs for e-token dispenser Di at time t. If sU is invalid, i.e., VerifyViolator(pkI , pkU , sU ) rejects, Trace does nothing.

– Revoke(pkI , rD, RD) takes as input a revocation database RD (initially empty) and revocation information rD that corresponds to a particular user (see Obtain). It outputs the updated revocation database RD. In the sequel, we assume that RD is part of pkI .

These algorithms should fulfill the following properties:

Weak exculpability. An adversary cannot successfully blame an honest user U for reusing an e-token. More specifically, suppose an adversary can adaptively direct a user U to obtain any number of dispensers and show up to n e-tokens per dispenser per time period. Then, the probability that the adversary produces sU such that VerifyViolator(pkI , pkU , sU ) accepts is negligible.

Strong exculpability. An adversary cannot successfully blame a user U for reusing an e-token with token serial number S, even if U double-showed some other e-tokens. More specifically, suppose an adversary can adaptively direct a user to obtain any number of dispensers and show any number of e-tokens per dispenser per time period (i.e., he can reset the dispenser's state so that the dispenser reuses some of its e-tokens). The probability that the adversary outputs a token serial number S that was not reused and a proof sU such that VerifyViolation(pkI , S, pkU , sU ) accepts is negligible.

Tracing of violators. The token serial numbers of violators can be efficiently computed. More specifically, given a value sU such that VerifyViolator(pkI , pkU , sU ) accepts, and supposing U has obtained u e-token dispensers, Trace(pkI , pkU , sU , RU , n) produces functions f1, . . . , fu such that by running through all j ∈ [0..n − 1], fi(t, j) produces all n TSNs for e-token dispenser i at time t.

Dynamic revocation. The Show protocol will only succeed for dispensers D that have not been revoked with Revoke. (Recall that Show takes as input the value pkI that contains the database RD of revoked users.)

6.2 Periodic n-Times Anonymous e-Tokens

In a nutshell, the issuer and the user both have key pairs. Let the user's key pair be (pkU , skU ), where pkU = g^skU and g is a generator of some group G of known order. Let fs be a pseudo-random function whose range is the group G.


During the Obtain protocol, the user obtains an e-token dispenser D that allows her to show up to n tokens per time period. The dispenser D is comprised of the seed s for the PRF fs, the user's secret key skU , and the issuer's signature on (s, skU ). We use CL-signatures to prevent the issuer from learning anything about s or skU . In the Show protocol, the user shows her ith token in time period t: she releases the token serial number (TSN) S = fs(0, t, i), a double-show tag E = pkU · fs(1, t, i)^R (for a random R supplied by the verifier), and runs a ZK proof protocol that (S, E) correspond to a valid dispenser for time period t and 0 ≤ i < n (the user proves that S and E were properly formed from values (s, skU ) signed by the issuer). Note that we use a function fs with an additional binary parameter. This allows us to use the same PRF for both the TSN and the double-show tag. Since fs is a PRF, and all the proof protocols are zero-knowledge, it is computationally infeasible to link the resulting e-token to the user, the dispenser D, or any other e-tokens corresponding to D. If a user shows n + 1 e-tokens during the same time interval, then two of the e-tokens must use the same TSN. The issuer can easily detect the violation and compute pkU from the two double-show tags E = pkU · fs(1, t, i)^R and E′ = pkU · fs(1, t, i)^R′ with R ≠ R′. From these equations, fs(1, t, i) = (E/E′)^((R−R′)^−1) and pkU = E/fs(1, t, i)^R.

6.2.1 Basic Construction

Let k be a security parameter, and let lq ∈ Ω(k), lx, ltime, and lcnt be length parameters such that lq ≥ lx ≥ ltime + lcnt + 2 and 2^lcnt − 1 > n, where n is the number of tokens we allow per time period.

In the following, we assume implicit conversion between binary strings and integers, e.g., between {0, 1}^l and [0, 2^l − 1]. Let F(g,s)(x) := f^DY(g,s)(x) := g^(1/(s+x)) for x, s ∈ Z*_q and 〈g〉 = G of prime order q. For suitably defined ltime, lcnt, and lx, define the function c : {0, 1}^(lx−ltime−lcnt) × {0, 1}^ltime × {0, 1}^lcnt → {0, 1}^lx as:

c(u, v, z) := (u · 2^ltime + v) · 2^lcnt + z .
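As a concrete illustration, the packing function c and the Dodis-Yampolskiy PRF can be sketched in a few lines of Python. The tiny primes, the length parameters, and the seed below are toy values chosen for readability, nowhere near the secure sizes the construction requires.

```python
# Toy sketch of the packing function c and the DY PRF F_(g,s)(x) = g^(1/(s+x)).
# All parameters are illustrative stand-ins (insecure sizes).

L_TIME, L_CNT = 8, 4                  # toy ltime and lcnt

def c(u, v, z):
    """Pack (u, v, z) into one integer: (u*2^ltime + v)*2^lcnt + z."""
    assert 0 <= v < 2**L_TIME and 0 <= z < 2**L_CNT
    return (u * 2**L_TIME + v) * 2**L_CNT + z

q = 1019                              # prime order of the subgroup
p = 2 * q + 1                         # 2039, also prime
g = pow(2, 2, p)                      # squaring lands in the order-q subgroup

def f_dy(s, x):
    """g^(1/(s+x) mod q) mod p; undefined when s + x = 0 (mod q)."""
    return pow(g, pow(s + x, -1, q), p)

s = 123                               # dispenser seed (user-chosen secret)
t, J = 5, 2                           # time period and counter
S = f_dy(s, c(0, t, J))               # token serial number for this show
```

Note that raising S to the power s + c(0, t, J) gives back g, which is essentially the relation that the proof of validity below establishes in zero knowledge.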

Issuer Key Generation: In IKeygen(1^k, params), the issuer I generates two cyclic groups:

1. A group 〈g〉 = 〈h〉 = G of composite order p′q′ that can be realized as the multiplicative group of quadratic residues modulo a special RSA modulus N = (2p′ + 1)(2q′ + 1). In addition to CL-signatures, this group will be needed for the zero-knowledge proofs of knowledge used in the sequel. Note that soundness of these proof systems is computational only and assumes that the prover does not know the order of the group.


2. A group 〈g〉 = 〈ĝ〉 = 〈h〉 = G of prime order q with 2^(lq−1) < q < 2^lq . We require that the bases are chosen in such a way that the discrete logarithms of these generators with respect to one another are unknown to all parties.

The issuer must also prove in zero-knowledge that N is a special RSA modulus and that g and h are quadratic residues modulo N . In the random oracle model, one non-interactive proof may be provided. In the plain model, the issuer must agree to interactively prove this to anyone upon request.

Furthermore, the issuer generates a CL-signature key pair (pk, sk) set in group G. The issuer's public key will contain (g, h, G, g, ĝ, h, G, pk), while the secret key will contain all of this information together with sk.

User Key Generation: In UKeygen(1^k, pkI ), the user chooses a random skU ∈ Zq and sets pkU = g^skU ∈ G.

Get an e-Token Dispenser: Obtain(U(pkI , skU , n), I(pkU , skI , n)). Assume that U and I have mutually authenticated. A user U obtains an e-token dispenser from an issuer I as follows:

1. U and I agree on a commitment C to a random value s ∈ Zq as follows:

(a) U selects r, s′ at random from Zq and computes a special Pedersen commitment C′ = PedCom(skU , s′; r) = g^skU ĝ^s′ h^r.

(b) U sends C′ to I and proves that it is constructed correctly.

(c) I sends a random r′ from Zq back to U .

(d) Both U and I compute C = C′ ĝ^r′ = PedCom(skU , s′ + r′; r). U computes s = s′ + r′ mod q.

2. I and U execute the CL signing protocol on commitment C. Upon success, U obtains σ, the issuer's signature on (skU , s). This step can be efficiently realized using the CL protocols [CL02b, CL04] in such a way that I learns nothing about skU or s.

3. U initializes counters T := 1 (to track the current period) and J := 0 (to count the e-tokens shown in the current time period). U stores the e-token dispenser D = (skU , s, σ, T, J).
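Step 1 of the protocol above can be sketched as follows. The toy group, the bases g, g_hat, h, and the variable names are illustrative stand-ins, and the zero-knowledge proof of correct construction in step (b) is omitted.

```python
# Toy sketch (insecure sizes) of step 1 of Obtain: the parties jointly fix
# s = s' + r' mod q inside a commitment, so that the issuer contributes
# randomness to the seed without ever learning it.

import secrets

q = 1019; p = 2 * q + 1                    # small prime-order subgroup of Z_p*
g, g_hat, h = pow(2, 2, p), pow(3, 2, p), pow(5, 2, p)   # assumed independent bases

def pedcom(sk, s, r):
    """Pedersen-style commitment to the pair (sk, s) with randomness r."""
    return pow(g, sk, p) * pow(g_hat, s, p) * pow(h, r, p) % p

sk_U = 42                                  # user's secret key
s_prime = secrets.randbelow(q)             # (a): user's share of the seed
r = secrets.randbelow(q)
C_prime = pedcom(sk_U, s_prime, r)         # (a)/(b): sent to the issuer

r_prime = secrets.randbelow(q)             # (c): issuer's random contribution
C = C_prime * pow(g_hat, r_prime, p) % p   # (d): both sides compute C
s = (s_prime + r_prime) % q                # only the user learns the seed s

assert C == pedcom(sk_U, s, r)             # C indeed commits to (sk_U, s)
```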

Use an e-Token: Show(U(D, pkI , t, n), V(pkI , t, n)). Let t be the current time period identifier with 0 < t < 2^ltime . A user U reveals a single e-token from a dispenser D = (skU , s, σ, T, J) to a verifier V as follows:

1. U compares t with T . If t ≠ T , then U sets T := t and J := 0. If J ≥ n, abort!


2. V sends to U a random R ∈ Z*_q .

3. U sends to V a token serial number S and a double-spending tag E computed as follows:

S = F(g,s)(c(0, T, J)), E = pkU · F(g,s)(c(1, T, J))^R .

4. U and V engage in a zero-knowledge proof of knowledge of values skU , s, σ, and J such that:

(a) 0 ≤ J < n,
(b) S = F(g,s)(c(0, t, J)),
(c) E = g^skU · F(g,s)(c(1, t, J))^R,
(d) VerifySig(pkI , (skU , s), σ) = accept.

5. If the proof verifies, V stores (S, τ), with τ = (E, R), in his database. Note that verification can be implemented as a distributed system with multiple distributed verifiers and a single central database.

6. U increases the counter J by one. If J ≥ n, the dispenser is empty. It will be refilled in the next time period.
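The dispenser bookkeeping of steps 1 and 6 amounts to a small state machine. A minimal sketch with hypothetical names, ignoring all cryptography:

```python
# Sketch of the dispenser counters in Show: at most n e-tokens per time
# period; the counter J resets whenever the period t changes.

class Dispenser:
    def __init__(self, n):
        self.n = n          # tokens allowed per period
        self.T = 1          # current period tracker (step 3 of Obtain)
        self.J = 0          # tokens shown in the current period

    def next_index(self, t):
        """Return the counter J to use for a Show in period t,
        or None if the dispenser is empty for this period."""
        if t != self.T:     # step 1: new period, refill
            self.T, self.J = t, 0
        if self.J >= self.n:
            return None     # step 1: abort, no tokens left
        J = self.J
        self.J += 1         # step 6: consume one token
        return J

d = Dispenser(n=3)
assert [d.next_index(5) for _ in range(4)] == [0, 1, 2, None]
assert d.next_index(6) == 0     # refilled in the next time period
```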

Technical Details. The proof in Step 4 is done as follows:

1. U generates the commitments CJ = g^J h^r1 , Cu = g^skU h^r2 , Cs = g^s h^r3 , and sends them to V.

2. U proves that CJ is a commitment to a value in the interval [0, n − 1] using standard techniques [CFT98, CM99a, Bou00, CCS08].

3. U proves knowledge of a CL-signature from I for the values committed to by Cu and Cs, in that order. This step can be efficiently realized using the CL protocols [CL02b, CL04].

4. U as prover and V as verifier engage in the following proof of knowledge,using the notation by Camenisch and Stadler [CS97a]:

PK{(α, β, δ, γ1, γ2, γ3) : g = (Cs g^c(0,t,0) CJ)^α h^γ1 ∧ S = g^α ∧

g = (Cs g^c(1,t,0) CJ)^β h^γ2 ∧ Cu = g^δ h^γ3 ∧ E = g^δ (g^R)^β} .

U proves that she knows the values of the Greek letters; all other values are known to both parties.

Let us explain the last proof protocol. From the second step we know that CJ encodes some value J with 0 ≤ J < n, i.e., CJ = g^J h^rJ for some rJ. From the third step we know that Cs and Cu encode values s and u on which the prover U knows a CL-signature by the issuer. Therefore, Cs = g^s h^rs and Cu = g^u h^ru for some rs and ru. Next, recall that by the definition of c(·, ·, ·) the term g^c(0,t,0) corresponds to g^(t·2^lcnt). Now consider the first term g = (Cs g^c(0,t,0) CJ)^α h^γ1 in the proof protocol. We can conclude that the prover U knows values a and r such that g = g^((s+t·2^lcnt+J)a) h^r and S = g^a. From the first equation it follows that a = (s + t·2^lcnt + J)^−1 (mod q) must hold, provided that U is not privy to log_g h (as we show via a reduction in the proof of security), and thus we have established that S = F(g,s)(c(0, t, J)) is a valid serial number for the time period t. Similarly one can derive that E = g^u · F(g,s)(c(1, t, J))^R, i.e., that E is a valid double-spending tag for time period t.

Identify Cheaters: Identify(pkI , S, (E, R), (E′, R′)). If the verifiers who accepted these tokens were honest, then R ≠ R′ with high probability, and the proof of validity ensures that E = pkU · fs(1, T, J)^R and E′ = pkU · fs(1, T, J)^R′. The violator's public key can now be computed by first solving for fs(1, T, J) = (E/E′)^((R−R′)^−1) and then computing pkU = E/fs(1, T, J)^R.
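The two algebraic steps of Identify can be checked with a toy Python sketch; the tiny subgroup and the stand-in value for fs(1, T, J) are illustrative only.

```python
# Toy sketch of Identify: two double-show tags for the same TSN with
# distinct challenges R, R' reveal the violator's pkU via
#   f = (E/E')^((R-R')^-1)  and  pkU = E / f^R.

q = 1019; p = 2 * q + 1
g = pow(2, 2, p)                       # order-q subgroup generator

sk_U = 77
pk_U = pow(g, sk_U, p)
f = pow(g, 123, p)                     # stands in for fs(1, T, J)

R1, R2 = 11, 500                       # verifiers' random challenges
E1 = pk_U * pow(f, R1, p) % p          # tag from the first show
E2 = pk_U * pow(f, R2, p) % p          # tag from the reused token

def identify(E1, R1, E2, R2):
    base = E1 * pow(E2, -1, p) % p                 # E/E'
    f_rec = pow(base, pow(R1 - R2, -1, q), p)      # (E/E')^((R-R')^-1)
    return E1 * pow(pow(f_rec, R1, p), -1, p) % p  # E / f^R

assert identify(E1, R1, E2, R2) == pk_U
```

The exponent (R − R′)^−1 is taken modulo the group order q, while the group operations are modulo p.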

Theorem 6.2.1 Protocols IKeygen, UKeygen, Obtain, Show, and Identify described above achieve the soundness, identification, and anonymity properties in the plain model assuming Strong RSA, and n-DDHI if lx ∈ O(log k) or n-IDDHI otherwise. For the proof consult [CHK+06] or Appendix A.6.

6.2.2 Efficiency Discussion

To analyze the efficiency of our scheme, it is sufficient to consider the number of (multi-base) exponentiations the parties have to do in the composite-order group and in the prime-order group G. In a decent implementation, a multi-base exponentiation takes about the same time as a single-base exponentiation, provided that the number of bases is small. For the analysis we assume that the Strong RSA based CL-signature scheme is used.

Obtain: both the user and the issuer perform 3 exponentiations in G. Show: the user performs 12 multi-base exponentiations in G and 23 multi-base exponentiations in G, while the verifier performs 7 multi-base exponentiations in G and 13 multi-base exponentiations in G. If n is odd, the user only needs to do 12 exponentiations in G, while the verifier needs to do 7. To compare ourselves to the Damgard et al. [DDP06] scheme, we set n = 1. In this case, Show requires that the user perform 12 multi-base exponentiations in G and 1 multi-base exponentiation in G, and that the verifier perform 7 multi-base exponentiations in G and 1 multi-base exponentiation in G. Damgard et al. requires 57 + 68k exponentiations in G, where k is the security parameter (i.e., 2^−k is the probability that the user can cheat). Depending on the application, k should be at least 20 or even 60. Thus, our scheme is an order of magnitude more efficient than that of Damgard et al.


6.3 Additional Extensions

One advantage of our approach to periodic anonymous authentication is that its modular construction fits nicely with previous work [CL02a, CHL05]. Thus, it is clear which parts of our system can be modified to enable additional features.

6.3.1 Weak Exculpability

Recall that weak exculpability allows an honest verifier (or group of verifiers) to prove in a sound fashion that the user with public key pkU reused some token. This convinces everyone in the system that the user with pkU is untrustworthy.

To implement weak exculpability, we need to define the algorithm VerifyViolator and to slightly adapt the IKeygen, UKeygen, Show, and Identify algorithms. IKeygen′ now also runs BMGen, and the parameters for the bilinear map e : G1 × G2 → GT are added to pkI . UKeygen′ selects a random skU ∈ Z*_q and outputs pkU = e(g1, g2)^skU . In the Show′ protocol, the double-spending tag is calculated as E = g1^skU · F(g1,s)(c(1, T, J))^R. Consequently the value sU returned by Identify′ is g1^skU – which is secret information! Thus, the VerifyViolator algorithm is defined as follows: VerifyViolator(pkI , pkU , sU ) accepts only if e(sU , g2) = e(g1^skU , g2) = pkU . Intuitively, because g1^skU is secret information, its release signals that this user misbehaved.

A subtle technical problem with this approach is that the tag E is now set in a bilinear group G1. We need to ensure that the DY PRF is still secure in this group. Thus we require n-DDHI, respectively n-IDDHI, to hold in G1.

Theorem 6.3.1 The above scheme provides weak exculpability under the Strong RSA, n-DDHI if lx ∈ O(log k) or n-IDDHI in G1, and either XDH or Sum-Free DDH assumptions.

6.3.2 Strong Exculpability

Recall that strong exculpability allows an honest verifier (or group of verifiers) to prove in a sound fashion that the user with public key pkU reused an e-token with TSN S.

For strong exculpability, we need to define VerifyViolation and to adapt the Show and Identify algorithms. In Show′′, the ZK proof of validity is transformed into a non-interactive proof, denoted Π, using the Fiat-Shamir heuristic [FS87]. The proof Π is added to the coin transcript, denoted τ . And Identify′′(pkI , S, τ1, τ2) adds both transcripts τ1 and τ2 to its output sU . (The function φ(sU ) = pkU ignores the extra information.)

Thus, the VerifyViolation algorithm is defined as follows: VerifyViolation(pkI , S, pkU , sU ) parses τ1 = (E1, R1, Π1) and τ2 = (E2, R2, Π2) from sU . Then, it checks that φ(sU ) = pkU and that Identify′′(pkI , S, τ1, τ2) = sU . Next, it verifies both non-interactive proofs Πi with respect to (S, Ri, Ti). If all checks pass, it accepts; else, it rejects.
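The Fiat-Shamir transformation used in Show′′ can be illustrated on a plain Schnorr proof of knowledge of a discrete logarithm; the group and the use of SHA-256 below are illustrative choices, not the instantiation of this chapter.

```python
# Hedged sketch of the Fiat-Shamir heuristic: an interactive proof of
# knowledge of sk with pk = g^sk is made non-interactive by deriving
# the verifier's challenge from a hash of the statement and commitment.

import hashlib, secrets

q = 1019; p = 2 * q + 1
g = pow(2, 2, p)                       # order-q subgroup generator

def fs_challenge(*vals):
    h = hashlib.sha256(repr(vals).encode()).digest()
    return int.from_bytes(h, "big") % q

def prove(sk, pk):
    k = secrets.randbelow(q)
    a = pow(g, k, p)                   # prover's commitment
    c = fs_challenge(g, pk, a)         # challenge = H(statement, a)
    z = (k + c * sk) % q               # response
    return (a, z)

def verify(pk, proof):
    a, z = proof
    c = fs_challenge(g, pk, a)         # recompute the challenge
    return pow(g, z, p) == a * pow(pk, c, p) % p

sk = 321; pk = pow(g, sk, p)
assert verify(pk, prove(sk, pk))
```

Because the challenge is recomputable from the transcript alone, the resulting proof Π can be stored in τ and checked later by anyone, which is what VerifyViolation exploits.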

A subtlety here is that, for these proofs to be sound even when the issuer is malicious, the group G′ that is needed as a parameter for zero-knowledge proofs must be a system parameter generated by a trusted third party, such that no one, including the issuer, knows the order of this group. So in particular, G′ cannot be the same as G [CM99b].

Theorem 6.3.2 The above scheme provides strong exculpability under the Strong RSA, and n-DDHI if lx ∈ O(log k) or n-IDDHI assumptions, in the random oracle model with trusted setup for the group G′.

6.3.3 Tracing

We can extend our periodic n-times authentication scheme so that if a user reuses even one e-token, all possible TSN values she could compute using any of her dispensers become publicly computable. This is for instance useful as a means of revocation: verifiers can compute the TSN values of all future e-tokens of a particular user. It is also useful for looking into the past, for example to remove malicious road condition reports that were submitted by a sensor before double-spending detection.

We use the same IKeygen′, UKeygen′, Show′, and Identify′ algorithms as for weak exculpability, slightly modify the Obtain protocol, and define a new Trace algorithm.

In UKeygen′, the user's key pair (e(g1, g2)^skU , skU ) is of the correct form for the bilinear ElGamal cryptosystem, where the value g1^skU is sufficient for decryption. Now, in our modified Obtain′, the user will provide the issuer with a verifiable encryption [CD00] of the PRF seed s under her own public key pkU . The issuer stores this tracing information in RU . When Identify′ exposes g1^skU , the issuer may run the following trace algorithm:

Trace(pkI , pkU , sU , RU , n). The issuer extracts g1^skU from sU and verifies this value against pkU ; it aborts on failure. The issuer uses g1^skU to decrypt all values in RU belonging to that user, and recovers the PRF seeds for all of the user's dispensers.


For seed s and time t, all TSNs can be computed as fs(t, j) = F(e(g1,g2),s)(c(0, t, j)) for all 0 ≤ j < n.

Theorem 6.3.3 The above scheme provides for the tracing of violators under the Strong RSA, the n-DDHI if lx ∈ O(log k) or the n-IDDHI in G1 assumptions.

6.3.4 Dynamic Revocation

Implementing dynamic revocation requires modifying the Obtain and Show protocols in the basic scheme, and defining a new Revoke algorithm.

The mechanisms introduced in [CL02a] can be used for revoking CL-signatures. In an adjusted CL protocol for obtaining a signature on a committed value, the user obtains an additional witness w = v^(e^−1), where v is the revocation public key and e is a unique prime which is part of the CL-signature σ. In the CL protocol for proving knowledge of a signature, the user also proves knowledge of this witness. A violator with prime ē can be excluded by updating the revocation public key v, such that v′ = v^(ē^−1), and publishing ē. While all non-excluded users can update their witness by computing a function f(e, ē, v′, w) = w′ without knowing the order of G, this update does not work when e = ē.

Thus, our e-token dispensers can be revoked by revoking their CL-signature σ. Obtain′′′ is adapted to provide users with a witness w and to store the corresponding prime e as rD. Show′′′ is adapted to update and prove knowledge of the witness. The Revoke(pkI , rD, RD) algorithm is defined as follows: compute v′ = v^(rD^−1) and publish it together with the update information rD. Additional details are in [CL02a].
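The witness update f(e, ē, v′, w) = w′ can be sketched with a toy RSA accumulator. The modulus below is far too small to be secure, and for transparency the sketch uses the group order to set up the initial witness, which in the real scheme only the issuer can do; the update itself uses only Bezout coefficients of e and ē, which is exactly why it fails when e = ē.

```python
# Toy sketch of [CL02a]-style revocation: a witness w with w^e = v
# proves non-revocation.  Revoking prime e_bar replaces v by
# v' = v^(e_bar^-1); users with e != e_bar update w publicly.

from math import gcd

p_, q_ = 1019, 1511                  # "secret" primes (toy sizes)
N = p_ * q_
phi = (p_ - 1) * (q_ - 1)            # hidden group order

v = pow(7, 2, N)                     # revocation public key
e, e_bar = 13, 17                    # user's prime and the revoked prime

w = pow(v, pow(e, -1, phi), N)       # issuer-computed witness: w^e = v
assert pow(w, e, N) == v

v_new = pow(v, pow(e_bar, -1, phi), N)   # Revoke publishes v' and e_bar

# Public witness update: find a, b with a*e + b*e_bar = 1, then
# w' = w^b * v'^a satisfies w'^e = v'.  Fails exactly when e == e_bar.
assert gcd(e, e_bar) == 1
a = pow(e, -1, e_bar)
b = (1 - a * e) // e_bar             # b is negative; pow handles inverses
w_new = pow(w, b, N) * pow(v_new, a, N) % N
assert pow(w_new, e, N) == v_new
```

To check the update: (w^b · v′^a)^e = v^b · v′^(ae) = v′^(b·ē + a·e) = v′, using v = v′^ē.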

Theorem 6.3.4 The above scheme provides dynamic revocation under the Strong RSA, and n-DDHI if lx ∈ O(log k) or n-IDDHI assumptions.

6.4 Conclusion

The work in this chapter shows the flexibility and power of accountability mechanisms for non-delegatable anonymous credentials. The presented mechanism for implementing n-times periodic spending restrictions is efficient and can be combined with other mechanisms such as tracing and revocation. The efficiency relies to some extent on the use of hash functions, Σ-proofs, and hidden-order groups.

In the next chapter, we move back to a setting that enables us to use the Groth-Sahai proof system. The transfer of ideas between these two settings could be the subject of future research.


Chapter 7

Delegatable Anonymous Credentials

We present work published in [BCC+09]. Access control is one of the most common uses of cryptography today. We frequently need to answer the question: does the person requesting access to a resource possess the required credentials? A credential typically consists of a certification chain rooted at some authority responsible for managing access to the resource and ending at the public key of the user in question. The user presents the credential and demonstrates that he knows the corresponding secret key. Sometimes, the trusted authority issues certificates directly to each user (so the length of each certification chain is 1). More often, the authority delegates responsibility. A system administrator allows several webmasters to use his server. A webmaster can create several forums, with different moderators for each forum. Moderators approve some messages, reject others, and even give favored users unlimited posting privileges. Imagine a world where a single system administrator had to approve every single message posted on every single forum!

We want cryptographic credentials to follow the same delegation model as access control follows in the real world. Our system administrator issues credentials of length 1 to his trusted webmasters. Each webmaster can extend the credential chain by delegating a credential of length 2 to a moderator. In general, a user with a credential of length L can issue a credential of length L + 1.

A conventional signature scheme immediately allows for (non-anonymous) delegatable credentials: Alice, who has a public signing key pkA and a certification chain of length L, can sign Bob's public key pkB, giving Bob a certification chain of length L + 1. There is, however, no straightforward transformation of anonymous credential schemes without delegation [Cha85, Dam90, Bra00, LRSW99, CL01, CL04, BCKL08] into delegatable schemes. Existing approaches for delegatable anonymous credentials are either inefficient [CL06] or make use of interaction [DDJ07]. Prior work on anonymous credentials [Lys02, CL02b, CL04, BCKL08] used signature schemes that lend themselves to the design of efficient protocols for (1) obtaining a signature on a committed value; and (2) proving that a committed value has been signed. Suppose Oliver wants to issue a credential to Alice. They would run protocol (1) so that Alice obtains a signature on her secret key. Whenever Alice wants to show her credential, she would give the verifier a fresh commitment to her secret key and use protocol (2) to prove that the committed key has been signed.

The problem with this approach is that Alice gets Oliver's signature, and the signature might reveal his identity. Generalizing to credential chains, Alice would learn the identities of all intermediate signers. Thus, the old approach does not lend itself to delegation, and we must try something very different.

Our approach. Our insight is that instead of giving Alice his signature, Oliver gives Alice a non-interactive proof of knowledge of the signature. The trick is to find a proof system that would then let Alice (1) delegate the credential by extending the proof and (2) rerandomize the proof every time she shows (or extends) it, to preserve her anonymity.

Let Oliver be a credential authority with public key pkO and secret key skO; and let Alice be a user with secret key skA. Alice wants to obtain the credential directly from Oliver (so her certification chain will be of length 1). Under the old approach, they would run a secure two-party protocol as a result of which Alice obtains a signature σpkO(skA) on skA, while Oliver gets no output. Under the new approach, Alice's output is (commA, πA), where commA is a commitment to her secret key skA, and πA is a proof of knowledge of Oliver authenticating the contents of commA. Note that a symmetric authentication scheme is sufficient because no one ever sees the authenticator; all verification is done on the proof of knowledge. The symmetric key skO remains secret to Oliver; we create a "public" key CO that is simply a commitment to skO.

How can Alice use this credential anonymously? If the underlying proof system is malleable in just the right way, then given (commA, πA) and the opening to commA, Alice can compute (comm′A, π′A) such that comm′A is another commitment to her skA that she can successfully open, while π′A is a proof of knowledge of Oliver authenticating the contents of comm′A. Malleability is usually considered a bug rather than a feature. But in combination with the correct extraction properties, we still manage to guarantee that these randomizable proofs give us a useful building block for the construction. The bottom line is that (comm′A, π′A) should not be linkable to (commA, πA), and it should also not be possible to obtain such a tuple without Oliver's assistance.
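The rerandomization idea can be illustrated on a bare Pedersen commitment; in the actual construction the attached proof πA is rerandomized consistently alongside the commitment, which this sketch elides. Parameters are toy values.

```python
# Sketch of commitment rerandomization: comm = g^sk * h^r can be turned
# into an unlinkable comm' = comm * h^delta that the owner can still
# open with the adjusted randomness r + delta.

import secrets

q = 1019; p = 2 * q + 1
g, h = pow(2, 2, p), pow(5, 2, p)       # assumed independent bases

def commit(sk, r):
    return pow(g, sk, p) * pow(h, r, p) % p

sk = 42                                 # Alice's secret key
r = secrets.randbelow(q)
comm = commit(sk, r)                    # commA

delta = secrets.randbelow(q)            # fresh randomness
comm2 = comm * pow(h, delta, p) % p     # comm'A, unlinkable to commA
r2 = (r + delta) % q                    # new opening, known only to Alice

assert comm2 == commit(sk, r2)          # same sk behind both commitments
```

To a party who does not know delta, comm2 is distributed like a fresh commitment to an arbitrary value, which is what makes the showings unlinkable.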


How does Alice delegate her credential to Bob? Alice and Bob can run a secure two-party computation protocol as a result of which Bob obtains (commB, πB), where commB is a commitment to Bob's secret key skB and πB is a proof of knowledge of an authenticator issued by the owner of comm′A on the contents of commB. Now, essentially, the set of values (comm′A, commB, π′A, πB) together indicates that the owner of comm′A got a credential from Oliver and delegated it to the owner of commB, and so it constitutes a proof of possession of a certification chain. Moreover, it hides the identity of the delegator Alice! Now Bob can, in turn, use the randomization properties of the underlying proof system to randomize this set of values so that it becomes unlinkable to his original pseudonym commB; he can also, in turn, delegate to Carol.

It may be somewhat counter-intuitive that adversaries cannot take advantage of the malleable proofs to forge proofs of possession of a certification chain. The explanation is that we use a perfectly extractable proof system. This lets us extract a certification chain from any proof, no matter how randomized. As a result, in our security argument we can use an adversary that fakes a proof to attack the unforgeability properties of the underlying authentication scheme. Our solution also addresses some important details such as (1) how to make it impossible for adversarial users to mix and match pieces of different certification chains to create unauthorized certification chains; and (2) how to define and construct an authentication scheme that remains secure even when an adversary can ask an honest user to authenticate chosen messages and obtain authentication tokens on the user's secret key (i.e., the adversary gets a signature on the user's secret key, but does not learn the secret key itself).

Attributes. In many contexts, we want the ability to express why a credential has been delegated. For example, our system administrator may want to give each webmaster access only to his own website. Webmasters may want to give some moderators the right to add new users to the forum, while other moderators would only have the right to approve or reject messages.

Why anonymity? The system administrator clearly needs to hold webmasters accountable if they crash the server. However, moderators and forum users may want to be able to act anonymously in order to foster a free and lively exchange of ideas.

Our delegatable anonymous credential system lets users add human-readable attributes to each credential. Oliver can give Alice a level 1 credential with attribute "webmaster of Crypto Forum." Alice can then delegate her credential to Bob with attribute "moderator of Crypto Forum." As a result, Bob can log on to the server anonymously and prove that the "webmaster of the Crypto Forum" made him the "moderator of the Crypto Forum." Our construction lets users add as many attributes as they want to each credential, allowing for the flexibility and expressibility that we see in modern (non-anonymous) access control systems. In some cases, the issuer may want to explicitly add the user's identity to the attributes to create accountability. Or, he might want to include an expiration date for the credential, or other conditions controlling its use.

Outline of the chapter. In Section 7.1 we define a delegatable anonymouscredential system. In Section 7.2 we define an appropriate authentication systemand the appropriate zero-knowledge proof of knowledge system. We show howthese building blocks can be instantiated under appropriate assumptions aboutgroups with bilinear maps. In Section 7.3 we give a construction of anonymousdelegatable credentials based on these building blocks.

7.1 Security Notion

An anonymous delegatable credential system has only one type of participant: users. Any user O can become an originator of a credential; all he needs to do is publish one of his pseudonyms NymO as its public key to act as a credential authority.1 Each user has one secret key, but many different pseudonyms corresponding to that secret key. Thus a user A with secret key skA can be known to authority O as Nym_A^(O) and to user B as Nym_A^(B). If authority O issues user A a credential for Nym_A^(O), then user A can prove to user B that Nym_A^(B) has a credential from authority O. We say that credentials received directly from the authority are level 1 credentials or basic credentials, credentials that have been delegated once are level 2 credentials, and so on. Thus user A can also delegate her credential to user B, and user B can then prove that he has a level 2 credential from authority O. A delegatable credential system consists of the following algorithms:

Setup(1^k) outputs the trusted public parameters of the system, paramsDC .

Keygen(paramsDC ) creates the secret key of a party in the system.

Nymgen(paramsDC , sk). On each run, the algorithm outputs a new pseudonymNym and auxiliary info aux(Nym) for secret key sk.2

Issue(paramsDC , NymO, skI , NymI , aux(NymI ), cred, NymU , L) ↔ Obtain(paramsDC , NymO, skU , NymU , aux(NymU ), NymI , L) are the interactive algorithms that let a user I issue a level L + 1 credential to a user U .

1We assume some kind of PKI for storing the authorities' public keys, but this is outside the scope of this thesis.

2We assume the existence of a predicate VerifyAux that accepts if and only if Nym is a valid pseudonym for (sk, aux(Nym)), and of a protocol NymProve ↔ NymVerify that is a zero-knowledge proof of knowledge of sk, aux(Nym) such that VerifyAux(paramsDC , Nym, sk, aux(Nym)) = accept.


The pseudonym NymO is the authority's public key, skI is the issuer's secret key, NymI is the issuer's pseudonym with auxiliary information aux(NymI), cred is the issuer's level L credential rooted at NymO, skU is the user's secret key, and NymU is the user's pseudonym with auxiliary information aux(NymU). If NymI = NymO, then the issuer is the authority responsible for this credential, so L = 0 and cred = ε. The issuer runs Issue with these inputs and gets no output; the user runs Obtain and gets a credential credU.

CredProve(paramsDC, NymO, cred, sk, Nym, aux(Nym), L). Takes as input a level L credential cred from authority NymO and outputs a value credproof.

CredVerify(paramsDC, NymO, credproof, Nym, L). Outputs accept if credproof is a valid proof that the owner of pseudonym Nym possesses a level L credential with root NymO, and reject otherwise.

We define the security of delegatable anonymous credentials more formally in [BCC+09]. Intuitively, the algorithms Setup, Keygen, Nymgen, NymProve, NymVerify, VerifyAux, Issue, Obtain, CredProve, and CredVerify constitute a secure anonymous delegatable credential scheme if the following properties hold:

Correctness. We say that a credential cred is a proper credential if, for all of the user's pseudonyms, CredProve always creates a proof that CredVerify accepts. The delegatable credential system is correct if an honest user and an honest issuer can run Obtain ↔ Issue and the honest user gets a proper credential. (The correctness property requires that Issue and CredProve check that their inputs are valid and that the credentials they delegate or prove are proper. This is necessary to avoid selective-failure attacks in which users are given credentials that work for some identities/pseudonyms, but not all.)

Anonymity. Intuitively, anonymity requires that the adversary's interactions with the honest parties in the real game be indistinguishable from some ideal game in which pseudonyms, credentials and proofs are truly anonymous.

Specifically, there has to exist a simulator (SimSetup, SimProve, SimObtain, SimIssue) such that SimSetup produces parameters indistinguishable from those output by Setup, along with a simulation trapdoor sim. Under these parameters we require that the following properties hold even when SimProve, SimObtain, SimIssue and the distinguisher are given sim:

• When generated using the parameters output by SimSetup, Nym is distributed independently of sk.


• No distinguisher can tell whether it is interacting with Issue run by an honest party with a valid credential, or with SimIssue, which is not given the credential and the issuer's secret key, but only the authority, the length of the credential chain, and the pseudonyms of the issuer and user.

• No distinguisher can tell whether it is interacting with Obtain run by an honest party with secret sk, or with SimObtain, which is only given the authority, the length of the credential chain, and the pseudonyms of the issuer and user (but not sk).

• The simulator SimProve can output a fake credential proof credproof that cannot be distinguished from a real credential proof, even though SimProve is only told the authority, the length of the credential chain, and the pseudonym of the user (it is given neither the user's secret key nor his private credentials).

Remark 7.1.1 Our definition, somewhat in the spirit of the composable zero-knowledge definition given in [GS08], requires that each individual protocol (SimIssue, SimObtain, SimProve), when run on a single adversarially chosen input, produces output indistinguishable from that of the corresponding real protocol, even when the adversary is given the simulation trapdoor. A simple hybrid argument shows that this implies the more complex but weaker definition in which the adversary only controls the public inputs to the algorithm. This stronger definition is much easier to work with, as we need only consider one protocol at a time, and only a single execution of each protocol.

Unforgeability. To define unforgeability, we have all of the honest parties controlled by a single oracle, and we keep track of all honestly issued credentials. We then require that an adversary given access to this oracle has only negligible probability of outputting a forged credential.

For unforgeability of credentials to make sense, we have to define it in a setting where pseudonyms are completely binding, i.e., for each pseudonym there is exactly one valid corresponding secret key. Thus, there must exist some (potentially exponential-time) extraction algorithm that takes as input a pseudonym and outputs the corresponding secret key. A forgery of a level L credential occurs when the adversary can "prove" that Nym has a level L credential when such a credential was never issued to any pseudonym owned by skL = Extract(Nym). Our definition is slightly stronger, in that we require an efficient algorithm Extract that produces F(skL) for some bijection F; we thus get F-extraction.

Suppose we can extract an entire chain of secret keys from the credential proof. Then we can say that a forgery occurs when the adversary produces a proof of a level L credential with authority O from which we extract sk1, ..., skL−1, skL such that a level L credential rooted at O was never delegated by skL−1 to skL. Thus, we are not concerned with exactly which sk2, ..., skL−2 are extracted. In practical terms, this means that once skL−1 has delegated a level L credential from authority O to skL, we do not care whether the adversary can forge credentials with different credential chains, as long as they have the same level, are from the same authority, and are for the same skL. (Obviously, we can also consider functions of the secret keys F(ski) in this discussion.)

Of course, this only makes sense if skL−1 belongs to an honest user; otherwise we have no way of knowing what credentials he issued. But what if the owner of skL−1 is adversarial and the owner of skL−2 is honest? Then the owner of skL should be able to prove possession of a credential if and only if skL−2 delegated a level L−1 credential rooted at authority O to user skL−1. Generalizing this idea, our definition says that a forgery is successful if we extract sk0, ..., skL such that there is a prefix skO, ..., ski where ski−1 is honest, but ski−1 never issued a level i credential from root O to ski.

Let F be an efficiently computable bijection that is also a one-way function. There exist a p.p.t. algorithm ExtSetup and a deterministic algorithm Extract with five properties:

• Algorithm ExtSetup outputs (paramsDC, ext). The parameters paramsDC are distributed identically to those output by Setup, and ext is an extraction trapdoor used by Extract.

• Under these parameters, pseudonyms are perfectly binding, i.e., for any Nym there exists at most one sk for which there exists aux(Nym) such that VerifyAux accepts.

• Given an honestly generated level L credential proof and the extraction trapdoor ext, Extract always extracts F-values corresponding to the correct chain of L+1 identities. I.e., if the credential is formed by using sk0 to delegate to sk1, which delegates to sk2, and so on until skL, Extract will produce (f0, ..., fL) = (F(sk0), ..., F(skL)). Note that this must hold for level 0 as well: for any valid pseudonym NymO, Extract will produce f0 = F(skO), where skO corresponds to NymO.

• Given an adversarially generated level L credential proof credproof from authority NymO for the pseudonym Nym, Extract will always produce either the special symbol ⊥ or f0, ..., fL such that NymO is a pseudonym for F^{-1}(f0) and Nym is a pseudonym for F^{-1}(fL).


• No adversary can output a valid credential proof from which an unauthorized chain of identities is extracted:

Pr[(paramsDC, ext) ← ExtSetup(1^k);
(credproof, Nym, NymO, L) ← A^{O(paramsDC, ·, ·)}(paramsDC, ext);
(f0, ..., fL) ← Extract(paramsDC, ext, credproof, Nym, NymO, L) :
CredVerify(paramsDC, NymO, credproof, Nym, L) = accept ∧
(∃i such that (f0, i, fi−1, fi) ∉ ValidCredentialChains ∧ fi−1 ∈ HonestUsers)] ≤ ν(k),

where O(paramsDC, command, input) describes all possible ways for the adversary A to interact with the delegatable credential system: A can ask the oracle to add new honest users; the oracle generates sk ← Keygen(paramsDC), stores it in the list HonestUsers, and returns F(sk) as the handle.[3] A can ask for new pseudonyms for existing honest users, referenced by F(sk), and he can provide a credential and ask an honest user to generate the corresponding proof. Finally, he can ask honest users to run the Issue ↔ Obtain protocol on credentials of his choice, either between themselves or with an adversarial issuer or obtainer. In this case, we need to keep track of which credentials are being issued, so that we will be able to identify a forgery. To do this, we use the Extract algorithm to extract and store the chain of identities behind each credential being issued on the list ValidCredentialChains. For details, see [BCC+09].

The oracle responds to the following types of queries:

AddUser. The oracle runs sk ← Keygen(paramsDC). It stores (sk, F(sk)) in the user database, gives the adversary F(sk), and stores F(sk) in the list HonestUsers.

FormNym(f). The oracle looks up (sk, f) in its user database and terminates if it does not exist. It calls (Nym, aux(Nym)) ← Nymgen(paramsDC, sk). The oracle stores (sk, Nym, aux(Nym)) in its pseudonym database and gives the adversary Nym.

Issue(NymO, NymI, credI, NymU, L). The oracle looks up (skU, NymU, aux(NymU)) and (skI, NymI, aux(NymI)) in its pseudonym database and outputs an error if they do not exist, or if skU = skI (honest users cannot issue to themselves). The oracle runs CredProve(paramsDC, NymO, credI, skI, NymI, aux(NymI), L) to obtain credproof (for L = 0, credproof = ε). Then it runs Extract(paramsDC, ext, credproof, NymO, NymI, L) to obtain f0, f1, ..., fL (note that by correctness fL = F(skI)). The oracle runs Issue(paramsDC, NymO, skI, NymI, aux(NymI), credI, NymU, L) ↔ Obtain(paramsDC, NymO, skU, NymU, aux(NymU), NymI, L) → credU. The oracle stores (f0, L+1, fL, F(skU)) in ValidCredentialChains and outputs credU to the adversary.

[3] Since we are assuming that the adversary is given the extraction trapdoor ext, he will be able to extract F(sk) from any pseudonym for this user. Note that F(sk) must not reveal sk.

IssueToAdv(NymO, NymI, credI, Nym, L). The oracle looks up the values (skI, NymI, aux(NymI)) in its pseudonym database and outputs an error if they do not exist. The oracle generates a credential proof credproof by running CredProve(paramsDC, NymO, credI, skI, NymI, aux(NymI), L). It runs Extract(paramsDC, ext, credproof, NymI, NymO, L) to obtain f0, ..., fL. It then identifies the recipient by running Extract(paramsDC, ext, ε, Nym, Nym, 0) to obtain fL+1. Finally the oracle executes Issue(paramsDC, NymO, skI, NymI, aux(NymI), credI, Nym, L) to interact with the adversary. If the protocol does not abort, it stores (f0, L+1, fL, fL+1) in ValidCredentialChains.

ObtainFromAdv(NymO, NymA, NymU, L). The oracle looks up the values (skU, NymU, aux(NymU)) in its pseudonym database and outputs an error if they do not exist. It runs Obtain(paramsDC, NymO, skU, NymU, aux(NymU), NymA, L) with the adversary to get cred, and outputs cred.

Prove(NymO, cred, Nym, L). The oracle looks up (sk, Nym, aux(Nym)) in its pseudonym database and outputs an error if it does not exist. The oracle then runs CredProve(paramsDC, NymO, cred, sk, Nym, aux(Nym), L) to obtain credproof, and outputs this result.
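To make the bookkeeping in this game concrete, the following Python sketch mimics the oracle's state (the user database, HonestUsers, and ValidCredentialChains). All cryptographic operations are replaced by hypothetical stubs, and the names Oracle, record_issue and is_forgery are illustrative; only the list-keeping mirrors the definition above.

```python
import hashlib
import secrets

# Sketch of the unforgeability oracle's bookkeeping; crypto is stubbed out.
P = (1 << 127) - 1  # a Mersenne prime stands in for the group order p

def F(sk):
    # Stand-in for the one-way bijection F (a real scheme uses F(sk) = (h^sk, u^sk)).
    return pow(5, sk, P)

class Oracle:
    def __init__(self):
        self.users = {}            # F(sk) -> sk            (user database)
        self.honest_users = set()  # the list HonestUsers
        self.valid_chains = set()  # the list ValidCredentialChains
        self.pseudonyms = {}       # Nym -> (sk, aux)       (pseudonym database)

    def add_user(self):
        sk = secrets.randbelow(P)              # sk <- Keygen(paramsDC)
        handle = F(sk)
        self.users[handle] = sk
        self.honest_users.add(handle)
        return handle                          # the adversary only sees F(sk)

    def form_nym(self, handle):
        sk = self.users[handle]                # terminates if the user is unknown
        aux = secrets.randbelow(P)             # opening of the commitment
        nym = hashlib.sha256(f"{sk}:{aux}".encode()).hexdigest()  # stub for Com
        self.pseudonyms[nym] = (sk, aux)
        return nym

    def record_issue(self, f0, level, f_issuer, f_user):
        # After a successful Issue <-> Obtain run, store the delegation edge.
        self.valid_chains.add((f0, level, f_issuer, f_user))

    def is_forgery(self, f0, chain):
        # Forgery: some honest f_{i-1} never issued a level i credential
        # from root f0 to f_i (the winning condition of the game above).
        return any(
            (f0, i, chain[i - 1], chain[i]) not in self.valid_chains
            and chain[i - 1] in self.honest_users
            for i in range(1, len(chain))
        )
```

Here is_forgery implements exactly the prefix condition of the game: a chain extracted from a verifying proof wins for the adversary if some honest fi−1 never issued the corresponding level i credential from root f0.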

Remark 7.1.2 We let the adversary track honest users' credentials and pseudonyms (but, of course, not their secret keys). Our definition is strictly stronger than one that uses a general oracle that does not reveal the credentials of honest users to the adversary. This approach results in a simpler definition and analysis.

7.2 Building Blocks

In Section 7.2.1 we define certification-secure authentication schemes. In Section 7.2.2 we give new efficient protocols for randomizable proofs (cf. Section 2.5.6) of an authenticator. As part of one of these protocols, we give an efficient two-party protocol for generating a Boneh-Boyen signature [BB04b] on a committed message, which may be of independent interest.


7.2.1 F-Unforgeable Certification-Secure Message Authentication Scheme

A message authentication scheme is similar to a signature scheme. However, there is no public key, and a user uses the secret key to verify that a message has been authenticated. We need an authentication scheme for a vector of n messages m. This is reminiscent of signatures on blocks of messages [CL02b]. In the common parameters model such a scheme consists of four algorithms: AuthSetup(1^k) outputs common parameters paramsA. AuthKg(paramsA) outputs a secret key sk. Auth(paramsA, sk, m) outputs an authentication tag auth that authenticates the vector of messages m. VerifyAuth(paramsA, sk, m, auth) accepts if auth is a proper authenticator for m under key sk.

Security. Just like a signature scheme, an authentication scheme must be complete and unforgeable. However, we need to strengthen the unforgeability property to fit our application: F-unforgeability, introduced by Belenkiy et al. [BCKL08], requires that for some well-defined bijection F, no adversary can output (F(m), auth) without first getting an authenticator on m. We need this stronger definition because in our construction of delegatable credentials we are only able to extract a function of the message. We write F(m) to denote (F(m1), ..., F(mn)). In addition, we require the authentication scheme to be unforgeable even if the adversary learns a signature on the secret key. This is because an adversary in a delegatable credential system might give a user a credential, i.e., sign the user's secret key. We call this certification security.

An authentication scheme is F-unforgeable and certification secure if for all p.p.t. adversaries A:

Pr[paramsA ← AuthSetup(1^k); sk ← AuthKg(paramsA);
(y, auth) ← A^{O_Auth(paramsA, sk, ·), O_Certify(paramsA, ·, (sk, ·, ..., ·))}(paramsA, F(sk)) :
VerifyAuth(paramsA, sk, F^{-1}(y), auth) = 1 ∧ F^{-1}(y) ∉ Q_Auth] ≤ ν(k),

where the oracle O_Auth(paramsA, sk, m) outputs Auth(paramsA, sk, m) and stores m on Q_Auth, and the oracle O_Certify(paramsA, sk*, (sk, m2, ..., mn)) outputs Auth(paramsA, sk*, (sk, m2, ..., mn)).

Construction. Our authentication scheme is similar to the Boneh-Boyen weak signature scheme [BB04b], where Sign_sk(m) = g^{1/(sk+m)}. Belenkiy et al. showed that the Boneh-Boyen signature scheme is F-unforgeable for the bijection F(m) = (g^m, u^m), and that the Groth-Sahai proof system can be used to prove knowledge of such a signature. However, for the Boneh-Boyen construction certification security does not hold, because Sign_sk(m) = Sign_m(sk). In addition, we also need to sign a vector of messages.

We use the Boneh-Boyen signature scheme as a building block. For example, to sign a message (m1, m2), we choose random keys K*, K1, K2 and compute (Sign_sk(K*), Sign_{K*}(K1), Sign_{K*}(K2), Sign_{K1}(m1), Sign_{K2}(m2), F(K*), F(K1), F(K2)). At a high level, this structure eliminates any symmetries between Auth_sk(m) and Auth_m(sk). More formally:

AuthSetup(1^k) generates groups G1, G2, GT of prime order p (where |p| is chosen based on the security parameter k), a bilinear map e : G1 × G2 → GT, and group elements g, u, u*, u1, ..., un ∈ G1 and h ∈ G2. It outputs paramsA = (G1, G2, GT, e, p, g, u, u*, u1, ..., un, h). Note that the element u is only needed to define F; it is not used in creating the authenticator.

AuthKg(paramsA) outputs a random sk ← Zp.

Auth(paramsA, sk, (m1, ..., mn)) chooses random K*, K1, ..., Kn ← Zp. It outputs

auth = (g^{1/(sk+K*)}, h^{K*}, u*^{K*}, {g^{1/(K*+Ki)}, h^{Ki}, ui^{Ki}, g^{1/(Ki+mi)}}_{1≤i≤n}).

VerifyAuth(paramsA, sk, (m1, ..., mn), auth). Parse auth = (A*, B*, C*, {Ai, Bi, Ci, Di}_{1≤i≤n}). Verify that e(A*, h^{sk} B*) = e(g, h) and that e(u*, B*) = e(C*, h). For 1 ≤ i ≤ n verify that e(Ai, B* Bi) = e(g, h), that e(ui, Bi) = e(Ci, h), and that e(Di, Bi h^{mi}) = e(g, h). Accept if and only if all verifications succeed.

Theorem 7.2.1 The message authentication scheme above is F-unforgeable and certification secure for F(mi) = (h^{mi}, u^{mi}) under the BB-HSDH and BB-CDH assumptions. See [BCC+09] for a proof. The signature scheme obtained by setting pk = h^{sk} may be of independent interest.

7.2.2 Additional Protocols

We describe three additional protocols that we will need in our construction. First, we introduce a commitment scheme with which we can commit to messages m in the secret key space of the authentication scheme. We show that, given an extraction trapdoor, we are able to extract F(m) (as defined in Section 7.2.1) from such a commitment.

Next, we describe a protocol for proving knowledge of an authenticator. It is possible that an authenticator leaks information about a user's secret key, which would compromise anonymity. Thus, rather than presenting an authenticator as part of a credential, the users in our system generate a zero-knowledge proof of knowledge of such an authenticator. We can use the Groth-Sahai composable witness-indistinguishable proof system to form this proof.

Finally, if Bob wants to prove that he has a credential from Alice, he needs to show that he has an authenticator under her secret key on his secret key. However, neither user has enough information to form this authenticator, let alone the corresponding proof. Thus, we provide a secure two-party protocol for jointly computing a proof of knowledge of an authenticator.

Commitment scheme. A commitment comm_x = Com(x, open_x) to an exponent x consists of two GS commitments to group elements, Com(x, (o1, o2)) = (C^(1), C^(2)) = (GSCom(h^x, o1), GSCom(u^x, o2)), together with a NIPK proof that these commitments are correctly related. This allows us to extract F(x) = (h^x, u^x).

Proof of knowledge of an authenticator. We need a zero-knowledge proof of knowledge of an unforgeable authenticator for messages m = (m1, m2), where the first value is hidden in the commitment comm_m1 and the second value m2 is publicly known. In our notation, this is:

NIZKPK{(F(sk), F(m1), auth) : sk in comm_sk ∧ m1 in comm_m1 ∧ VerifyAuth(paramsA, sk, (m1, m2), auth) = 1}.

We use Groth-Sahai witness-indistinguishable proofs as a building block. We create a concatenation of three proofs, π = π_auth π_sk π_m1:

π_auth ← NIPK{(F(sk), F(m1), A*, B*, C*, {Ai, Bi, Ci, Di}_{1≤i≤2}) :
h^{sk} in comm'^(1)_sk ∧ u^{sk} in comm'^(2)_sk ∧ h^{m1} in comm'^(1)_m1 ∧ u^{m1} in comm'^(2)_m1 ∧
e(u, h^{sk}) = e(u^{sk}, h) ∧ e(A*, h^{sk} B*) = e(g, h) ∧
e(u*, B*) = e(C*, h) ∧ e(u, h^{m1}) = e(u^{m1}, h) ∧
{e(Ai, B* Bi) = e(g, h) ∧ e(ui, Bi) = e(Ci, h) ∧ e(Di, Bi h^{mi}) = e(g, h)}_{1≤i≤2}}.

Proofs π_sk and π_m1 are proofs that two commitments contain the same value. Let x = sk, m1, respectively. Then the two proofs are of the form

π_x ← NIPK{(x, x', h^θ) : x in comm^(1)_x ∧ x' in comm'^(1)_x ∧ e(x/x', h^θ) = 1 ∧ e(g, h^θ) = e(g, h)}.

Groth and Sahai [GS08] show that witness-indistinguishable proofs like π_sk and π_m1 are also zero-knowledge. A simulator that knows the simulation trapdoor sim for the GS proof system can simulate the two conditions by setting θ to 0 and 1, respectively. In this way he can fake the proofs for arbitrary commitments.

Note that if π_sk and π_m1 are zero-knowledge, then the composite proof π is zero-knowledge as well: the simulator first picks sk' and m'_1 at random and uses them to generate an authentication tag. It uses the authentication tag as a witness for the witness-indistinguishable proof π_auth and then fakes the proofs that the commitments C'_sk and C'_m1 are to the same values as the original commitments C_sk and C_m1.

Two-party protocol for creating a proof of knowledge of an authenticator. The issuer chooses random K*, K1, K2 ← Zp. He computes a partial authentication tag

auth' = (g^{1/(sk+K*)}, h^{K*}, u*^{K*}, {g^{1/(K*+Ki)}, h^{Ki}, ui^{Ki}}_{1≤i≤2}, g^{1/(K2+m2)}).

Then the issuer creates π_sk and a NIZK_GS proof π'_auth for this partial authenticator auth'.

Let (Keygen, Enc, Dec) be an additively homomorphic, semantically secure encryption scheme; let "⊕" denote the homomorphic operation on ciphertexts, and for a ciphertext e and an integer r let e ⊗ r denote "adding" e to itself r times. (For a specific efficient instantiation using the Paillier encryption scheme, see [BCC+09]; for a generalization of this protocol see Section 2.4.5.) Recall that p is the order of the bilinear groups. The user and the issuer run the following protocol:

1. The issuer generates (sk_hom, pk_hom) ← Keygen(1^k) in such a way that the message space is of size at least 2^k p^2. He then computes ct1 = Enc(pk_hom, K1), sends ct1 and pk_hom to the user, and engages with her in an interactive zero-knowledge proof that ct1 encrypts a message in [0, p].

2. The user chooses r1 ← Zp and r2 ← {0, ..., 2^k p}, then computes ct2 = ((ct1 ⊕ Enc(pk_hom, m1)) ⊗ r1) ⊕ Enc(pk_hom, r2 p) and sends ct2 to the issuer.

3. The issuer and the user perform an interactive zero-knowledge proof in which the user shows that ct2 has been computed correctly using the message in comm_m1, and that r1 and r2 are in the appropriate ranges.

4. The issuer decrypts x = Dec(sk_hom, ct2), computes σ* = g^{1/x}, and sends it to the user.

5. The user computes σ = σ*^{r1} and verifies that it is a correct weak BB signature on m1. The issuer obtains no information about m1.

Based on g^{1/(K1+m1)} the user computes a proof π''_auth that extends the proof of the partial authenticator to a proof of a full authenticator, together with the proof π_m1. Finally the user computes π = π'_auth π''_auth π_sk π_m1 and randomizes the combined proof.
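The arithmetic behind steps 2-5 can be checked directly: the issuer decrypts x = (K1 + m1)·r1 + r2·p, the additive blinding term r2·p vanishes modulo the group order p, and the multiplicative blinding r1 cancels when the user exponentiates σ*. The sketch below verifies this with discrete logarithms mod p standing in for group elements and ciphertexts (the prime and the statistical parameter are illustrative, not the real ones):

```python
import secrets

# Toy check of the blind-issuing arithmetic, tracking only exponents mod p.
p = (1 << 127) - 1                      # stands in for the bilinear group order
k = 20                                  # illustrative (small) statistical parameter

K1 = secrets.randbelow(p)               # issuer's key component K_1
m1 = secrets.randbelow(p)               # user's committed message m_1

# Step 2 (user): blind K1 + m1 multiplicatively by r1 and additively by r2 * p.
r1 = secrets.randbelow(p - 1) + 1       # r1 <- Zp, nonzero
r2 = secrets.randbelow((1 << k) * p)    # r2 <- {0, ..., 2^k p}
x = (K1 + m1) * r1 + r2 * p             # the plaintext the issuer decrypts from ct2

# Step 4 (issuer): sigma* = g^{1/x}; we track the exponent 1/x mod p.
log_sigma_star = pow(x, -1, p)

# Step 5 (user): sigma = (sigma*)^{r1}; r2*p vanishes mod p and r1 cancels.
log_sigma = log_sigma_star * r1 % p
assert log_sigma == pow(K1 + m1, -1, p)  # sigma = g^{1/(K1+m1)}, a weak BB signature
```

The final assertion is exactly the correctness claim of step 5: the unblinded value is the weak Boneh-Boyen signature on m1 under K1, while the issuer sees only the blinded x.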


7.3 Construction of Delegatable Credentials

We construct delegatable credentials using the tools defined in Section 7.2. The parameters of the system combine the parameters paramsA of the authentication scheme and paramsPK of the composable and randomizable NIZKPK proof system with its associated commitment scheme Com. The key space of the authenticator must be a subset of the input space of the commitment scheme. Each user U has a secret key skU ← AuthKg(paramsA) and forms his pseudonyms using Com: NymU = Com(skU, openU). U can create arbitrarily many different pseudonyms by choosing new random values openU. A user can act as an authority (originator) for some type of credential by making his pseudonym NymO publicly available.

How the proof system fits with the construction. Let sk0 be the secret key of a credential authority A whose public key is NymO = Com(sk0, open0), and let sk1 be the secret key of a user U whose pseudonym is NymU = Com(sk1, open1). Let VerifyAuth be as defined in Section 7.2.1, and let F be a function such that the message authentication scheme is F-unforgeable. A level 1 credential from A to U will be

π1 ← NIZKPK{(F(sk0), F(sk1), auth) : sk0 in NymO ∧ sk1 in NymU ∧ VerifyAuth(paramsA, sk0, (sk1, r1), auth)},

where the value r1 will be defined later. Note that A and U have to compute π1 jointly using the protocol defined in Section 7.2.2. While neither can compute auth independently, together they have the information to compute auth, and therefore the information to compute a proof of knowledge of auth. Note that neither of them learns auth itself.

A level L credential from A to UL (whose pseudonym is NymL) will be

πL ← NIZKPK{(F(sk0), F(sk1), ..., F(skL), auth1, ..., authL) :
sk0 in NymO ∧ skL in NymL ∧
VerifyAuth(paramsA, sk0, (sk1, r1), auth1) ∧ ... ∧
VerifyAuth(paramsA, skL−1, (skL, rL), authL)}.

The extractor can extract F(ski) from the credential πL, where ski is the secret key of any intermediate delegator Ui, 1 ≤ i < L.

The value πL+1 needs to be computable by UL and UL+1 without contacting anyone else. This is tricky because, in fact, neither of them knows any of the values (sk0, ..., skL−1) and (auth1, ..., authL)! So how can they possibly compute a proof of knowledge of these values?

Our first observation is that, although they do not know any of the secret keys and authentication tags, UL already has a πL that is distributed correctly. So, in fact, if UL and UL+1 can jointly compute

π'_{L+1} ← NIZKPK{(F(skL), F(skL+1), authL+1) :
skL in NymL ∧ skL+1 in NymL+1 ∧
VerifyAuth(paramsA, skL, (skL+1, rL+1), authL+1)}

and then compute the concatenated proof πL+1 = πL π'_{L+1}, they will in fact arrive at a πL+1 from which all the right values can be extracted, and whose zero-knowledge properties are preserved as well.

However, this method of computing πL+1 is inadequate for our purposes, because πL+1 can be linked to πL and, as a result, violates the anonymity requirements. This is why it is crucial that the underlying proof system be randomizable.

Full construction. We denote a user's private credential as cred. To show or delegate the credential, the user randomizes cred to form credproof. In our construction, cred is in fact an NIZKPK of a statement about U's specific secret pseudonym SU = Com(skU, 0) (this specific pseudonym does not in fact hide skU, since it is formed as a deterministic function of skU), while credproof is a statement about a proper pseudonym NymU = Com(skU, open) for a randomly chosen open. So U randomizes cred to obtain credproof using the RandProof algorithm described in Section 2.5.6.

Suppose a user with secret key skU has a level L credential from some authority A, and let (skO, sk1, ..., skL−1, skU) be the keys such that the owner of ski delegated the credential to ski+1 (we let sk0 = skO and skL = skU). A certification chain is a list of authenticators auth1, ..., authL such that ski generated authenticator authi+1 on ski+1.

To make sure that pieces of different certification chains cannot be mixed and matched, we add a label ri to each authenticator. The labels have to be unique for each authority and delegation level. Let H be a collision-resistant hash function with an appropriate range. For a credential chain rooted at NymO, we set ri = H(NymO, i). Each authi is an output of Auth(paramsA, ski−1, (ski, ri)).
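One possible instantiation of these labels is sketched below, assuming SHA-256 as the collision-resistant hash H and reducing its output into Zp; both the hash choice and the encoding of (NymO, i) are illustrative assumptions, not fixed by the construction.

```python
import hashlib

# Illustrative instantiation of the labels r_i = H(NymO, i).
p = (1 << 127) - 1                      # stands in for the group order

def label(nym_o: str, level: int) -> int:
    # Assumed encoding: hash "NymO|i" with SHA-256, reduce the digest mod p.
    digest = hashlib.sha256(f"{nym_o}|{level}".encode()).digest()
    return int.from_bytes(digest, "big") % p

# Labels differ per authority and per delegation level, so authenticators
# from different chains or levels cannot be mixed and matched.
r1 = label("NymO-example", 1)
r2 = label("NymO-example", 2)
```

Any party that knows the public root pseudonym NymO can recompute every label in the chain, which is what allows the verifier to check that the authenticators belong together.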

Here, F is a bijection such that the authentication scheme is F-unforgeable. As a result, if on input cred the extractor outputs a certification chain involving honest users, this certification chain corresponds to these users' delegation activities; if it does not, the security of the authentication scheme is broken. We use the concatenation, projection and randomization properties of our NIZKPK to allow a delegating issuer I with a level L credential to delegate it to a user U. We have already explained this in Section 7.2.

We now give the full construction. Let (PKSetup, PKProve, PKVerify) be a proof system, let (AuthSetup, AuthKg, Auth, VerifyAuth) be an authentication scheme, and let H : {0,1}* → Zp be any collision-resistant hash function.

Setup(1^k). Use AuthSetup(1^k) to generate paramsA and PKSetup(1^k) to generate params; choose the hash function H (as explained above); and output paramsDC = (paramsA, params, H).

Keygen(paramsDC). Run AuthKg(paramsA) and output the secret key sk.

Nymgen(paramsDC, sk). Choose a random open ∈ Y (where Y is the domain of openings of the commitment scheme Com associated with params, as explained in Section 7.2). Compute Nym = Com(params, sk, open) and output the pseudonym Nym and auxiliary information open.

CredProve(paramsDC, NymO, cred, skU, NymU, openU, L). Recall that cred should be an NIZKPK of a certification chain (above, we already gave the NIZKPK formula for it with the correct condition and extraction function f). If PKVerify(paramsPK, (NymO, Com(skU, 0)), cred) rejects, or if NymU ≠ Com(skU, openU), abort. Otherwise, set credproof ← RandProof((NymO, NymU), (0, openU), cred). Note that, by the randomization properties of the proof system,

credproof ∈ NIZKPK{(F(sk0), ..., F(skL), auth1, ..., authL) :
sk0 in NymO ∧ skL in NymU ∧
VerifyAuth(paramsA, sk0, (sk1, r1), auth1) ∧ ... ∧
VerifyAuth(paramsA, skL−1, (skL, rL), authL)}.

CredVerify(paramsDC, NymO, credproof, NymU, L). Run PKVerify to verify this proof.

Issue(paramsDC, NymO, skI, NymI, openI, cred, NymU, L) ↔ Obtain(paramsDC, NymO, skU, NymU, openU, NymI, L). If L = 0 and NymO ≠ NymI, then this protocol is aborted. The issuer verifies his cred using CredVerify and aborts if it does not verify, if NymI ≠ Com(skI, openI), or if NymU is not a valid pseudonym. Otherwise the issuer and the user both compute rL+1 = H(NymO, L+1). The issuer and the user run a two-party protocol with the following specification: the public input is (NymI, NymU, rL+1); the issuer's private input is (skI, openI); and the user's private input is (skU, openU). The output of the protocol is as follows: if the issuer did not supply (skI, openI) such that NymI = Com(skI, openI), or if the user did not supply (skU, openU) such that NymU = Com(skU, openU), the protocol aborts; otherwise, the issuer receives no output while the user receives as output the value π computed as:


π ← NIZKPK{(F(skI), F(skU), auth) : skI in NymI ∧ skU in Com(skU, 0) ∧ VerifyAuth(paramsA, skI, (skU, rL+1), auth)}.

In Section 7.2.2 we gave an efficient instantiation of such a two-party computation protocol for the specific authentication and NIZKPK schemes we use. If L = 0, then the user outputs credU = π. If L > 0, the issuer obtains credproofI ← CredProve(paramsDC, NymO, cred, skI, NymI, openI, L) and sends it to the user. Let SU = Com(skU, 0). Intuitively, credproofI is a proof that the owner of NymI has a level L credential under public key NymO, while π is a proof that the owner of NymI delegated to the owner of SU. The user concatenates credproofI and π to obtain credproofI π ∈

NIZKPK{(F(skO), F(sk1), ..., F(skL−1), F(skI), F(skU), auth1, ..., authL, authL+1) :
skO in NymO ∧ skI in NymI ∧ skU in SU ∧
VerifyAuth(paramsA, skO, (sk1, r1), auth1) ∧
VerifyAuth(paramsA, sk1, (sk2, r2), auth2) ∧ ... ∧
VerifyAuth(paramsA, skL−1, (skI, rL), authL) ∧
VerifyAuth(paramsA, skI, (skU, rL+1), authL+1)}.

To get credU, U now needs to project credproofI π so that it becomes a proof about (NymO, SU) and not about NymI.

Theorem 7.3.1 If (AuthSetup, AuthKg, Auth, VerifyAuth) is a certification-secure F-unforgeable authentication scheme, if H is a collision-resistant hash function, if (PKSetup, PKProve, PKVerify) is a randomizable, partially extractable, composable zero-knowledge non-interactive proof of knowledge system, and if the two-party protocol is secure, then the above construction constitutes a secure anonymous delegatable credential scheme. See Appendix A.7 for the proof.

Remark 7.3.2 We can easily extend our construction to attach public attributes to each level of the credential. The main modification is that we now compute r_ℓ = H(NymO, ℓ, attr1, ..., attr_ℓ), where attr_i is the set of attributes added by the ith delegator in the delegation chain. When the user shows or delegates a credential, he must then display all the attributes associated with each level.

7.4 Conclusion

We build the first efficient delegatable anonymous credential system. Future workcould, e.g., look at accountability mechanisms or more flexible attribute models.


Part II

Oblivious Transfer Related Schemes



Chapter 8

Introduction to Part II

The first part of this thesis was concerned with anonymous credential protocols and closely related schemes. We showed how a user can obtain authorization for a service without revealing his identity. Anonymity is sufficient if the user requests a generic electronic good, meaning that the same information is provided to a large number of users. Examples of such services are the online access to newspapers and the download of software or videos.1

In this second part we take a closer look at personalized services. The information provided by a personalized service depends in one way or another on who the user is, where the user is, and what the user wants. The content of the service is generated based on the input of the user and the database of the service. The latter is the same for all users. Sometimes the information used for personalization can be provided anonymously to avoid user specific data collection. We will, however, be concerned with privacy-enhancing protocols for services for which the information needed for personalization is sensitive or identifying. We use location-based services as an example. It has been shown that location traces are highly privacy sensitive [Kru07, IL07] and can be used for re-identification [MDBP08].

A simple solution for providing a personalized service in a privacy-friendly way, and a good benchmark for the best possible solution in terms of privacy, is to transfer the database to the user. The user does all computation required for personalization locally and does not have to reveal any personal information. An example of such a system is an automotive navigation system with its own database of maps. Such a system is privacy friendly, as the user's location is not transferred to any service provider.

1 In fact, electronic services are hardly ever completely generic. For instance, every user has his own way of reading news with respect to time, speed, and order. In such cases, however, it seems still reasonable to assume that the level of protection of the user's privacy offered by anonymization is sufficient.

Many services, however, have restrictions that make such an approach unsatisfactory. In this work we consider the following constraints:

Communication and computation costs: Transferring the whole database to the user and performing the computation locally may be costly.

Intellectual property: The service may not want to reveal the database to the user.

Remote storage: Users may decide to store some of their own personal data remotely. Storing data on a service can reduce the risk of data loss, and gives the user the ability to access his data from everywhere through different devices. The user might however want to hide which data he accesses, how often, and for which purposes.

Third party access: The user may grant other entities access to his remotely stored data. This access may even happen if the user is offline. Typical use cases for such applications are social networks and online health records. For example, a patient may grant a doctor access to his health records stored at a health record provider. Here a direct request of the doctor for a particular patient record can reveal sensitive personal information.2

Inter-organizational transactions: Many privacy critical transactions do not directly involve individuals, but are about them. A government agency may want to check its list of terrorist suspects against the passenger list of an airline company, or may want to search retained data to learn more about the online activities of suspects.3

A general solution based on secure two-party computation. Failing to simultaneously fulfill the privacy requirements of users and the intellectual property requirements of the database owner can lead to a less than optimal outcome. The same is true if in inter-organizational transactions one organization has to reveal information to another (more powerful and secretive) organization, e.g., the National Security Agency (NSA), and thus fails to fulfill its data handling responsibilities towards all of its customers (and not only those accused of terrorist activities). Organizations need means for performing a transaction in such a way that both parties can keep their information secret from the other party. Digital signatures [GMR88] (for authorization) and general secure two-party computation [Yao82] (for the private computation of the result) establish the existence of such protocols. For instance, the data retention scenario could be implemented through a secure two-party computation that takes a signed warrant as input provided by the police, and the retained database as input provided by the retaining organization. As a result of the protocol, the police only learns the records that matched the warrant, if any, while the retaining organization does not learn anything (not even the content of the warrant).

2 U.S. publications published false rumors that actress Nicole Kidman might be suffering from breast cancer after someone leaked information about her breast exam to reporters: http://www.wired.com/techbiz/media/news/2004/02/62356.

3 Both air traffic surveillance [LT06] and data retention [eu-06b] have raised privacy concerns: http://www.schneier.com/blog/archives/2006/12/automated_targe.html, http://epic.org/privacy/intl/data_retention.html.

General techniques for secure two-party computation are, however, still inefficient. This is especially true if the computation involves cryptographic elements (such as the signature for the warrant) and large amounts of data. It is thus desirable to design cryptographic protocols that give us the advantages of general two-party computation for security while reducing both computation and communication overhead.

Specialized protocols for service access. Since creating a solution that would support all services is hard, we focus on a subclass of services for which efficient protocols can be developed. We consider protocols for accessing data items from a database. The common privacy goal for all of these protocols is to hide the user's access patterns. (Here, by user we refer to the entity querying the database. As discussed above, the database may contain personal information about this user or other users of a larger information system.) Such systems do not support dynamic content, i.e., content that is computed based on the input of the user. We can emulate some amount of dynamic content either by creating and storing a different data item for every possible user input combination or by giving the user the possibility to search for items that match certain criteria.

Remote storage and inter-organizational transactions often use personal identifiers of data subjects as input. During a third party access to remotely stored user data, an identifier of the user is used to select the correct data item and thus becomes part of the information used for personalization, e.g., the patient's identifier for a health record request.4 Similarly, many inter-organizational transactions are search problems in which both organizations have some information about different users and want to check whether there are matches, e.g., when comparing a no-fly list and a list of passenger names.5

A feature that we consider important is the support of encrypted storage. Encryption is an important countermeasure against data leakage. Consequently, data that is stored remotely or is retained for law enforcement purposes should be stored in encrypted form. We are interested in special encryption schemes that support search operations.

Protocols that allow a user to efficiently access a database while hiding all information about which items are accessed are known as private information retrieval (PIR) protocols. Such protocols will form the basis for privacy enhanced service access. In Section 8.1.1 we summarize PIR research and sketch a PIR protocol based on homomorphic encryption that we will use in Chapter 9. In Section 8.1.2 we look at private information searching. These are protocols that allow one to retrieve items based on search criteria without knowing the position of the data item (and whether it exists at all) in the database.

4 Even if the user is anonymous as, e.g., in [DD05], such a system needs to be carefully designed to be secure against traffic analysis attacks.

5 Cryptographic solutions for such problems have also been discussed in the popular press: http://www.msnbc.msn.com/id/13351116/.

Private information retrieval protocol
Common input:  Number of messages n
User input:    ı                      Service input:  m1, . . . , mn
User output:   mı or more             Service output: none

Table 8.1: Private information retrieval protocol

Moreover, service providers and users have an interest in enforcing access control. It should be possible to limit who can access which data, search for which keywords, and decrypt which content, while at the same time hiding the access pattern from the service provider. To enforce access control, we need to address two concerns. First, the access mechanism must only reveal the data items requested by the user. In Section 8.1.3 we discuss oblivious transfer protocols. We sketch an oblivious transfer protocol based on unique blind signatures that we use in Chapter 9. Second, the access must be authorized through an appropriate cryptographic mechanism. We sketch different approaches in Section 8.2. Both mechanisms hide the user's queries.

In Section 8.3 we summarize the contribution of Chapter 9 and Chapter 10.

8.1 Protocols for the Private Access to Data

We take a look at existing work on private information retrieval, private information search, and oblivious transfer.

8.1.1 Private Information Retrieval

The goal of private information retrieval (PIR) is to allow a user to retrieve a data item from a service without revealing which item is being retrieved. The database of the service is represented as a list of items m1, . . . , mn. The user obtains mı without the service learning ı (cf. Table 8.1).


As PIR only needs to hide the access patterns of the user, but does not have to protect the database of the service, a trivial solution is to send the whole database to the user and let the user make the selection. This solution is also optimal if the user's privacy should be protected information-theoretically and the database of the service is stored on a single server [CGKS95].

Chor et al. [CGKS95] show how to overcome this limitation by replicating the database on multiple non-cooperating servers. Kushilevitz and Ostrovsky [KO97] show how to build private information retrieval for a single server based on complexity theoretic assumptions. Since then, private information retrieval protocols with better communication efficiency have been proposed both for the multi-server [BIKR02, BIK05] and the complexity theoretic setting [GR05, Lip05, OS07b].

As shown by Ostrovsky and Skeith [OS07b], the schemes by Kushilevitz and Ostrovsky [KO97] and Lipmaa [Lip05] make use of similar ideas based on homomorphic encryption. An important difference is that the Kushilevitz and Ostrovsky protocol is based on the Goldwasser-Micali cryptosystem [GM84] while the protocol by Lipmaa is based on the more efficient and length-flexible Damgård-Jurik cryptosystem [DJ01].

Private information retrieval using homomorphic encryption. A well-known property of additively homomorphic encryption is that given an encryption Q = Ench(1) it is possible to compute an encryption of a message m as m ⊗ Q = Ench(Σ_{i=1}^{m} 1) = Ench(m). However, if Q = Ench(0), the same operation results in an encryption of 0, i.e., m ⊗ Ench(0) = Ench(0) (see also Section 2.4.2).

Given the semantic security of the encryption, the party trying to encode the message cannot distinguish the two cases above. Based on this observation, a PIR protocol can be constructed by using a query vector Q = (Q1, . . . , Qn). To request message mı, Qı = Ench(1) and Qi = Ench(0) for i ≠ ı. The communication complexity of the database can be reduced by computing E = ⊕_{i=1}^{n} mi ⊗ Qi, and transferring only E to the recipient. Even given this short response message, this PIR protocol still has communication complexity O(n) because of the large query size.

Performance can be improved by addressing the items of the database using the discrete coordinates in a d-dimensional cube. This allows condensing the size of the user's query to d · n^(1/d) ciphertexts. The example above corresponds to the case d = 1, in which the items are addressed as the discrete points of a line of length n. For d = 2 the items are arranged as a square of width √n. Using log(n)/2 dimensions and a length-flexible homomorphic encryption scheme such as Damgård-Jurik, this gives rise to a near optimal, poly-logarithmic PIR protocol with O(k log²(n)) bits sent by the user and O(k log(n)) bits sent by the provider of the database [Lip05].
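The d = 1 protocol above can be sketched concretely. The following toy Python implementation uses a Paillier cryptosystem as the additively homomorphic scheme; the tiny primes, function names, and the Paillier instantiation itself are illustrative assumptions, not the constructions of [KO97, Lip05], and the parameters are far too small to be secure.

```python
# Toy sketch of single-server PIR from additively homomorphic encryption
# (the d = 1 case): the query is a vector of encryptions of 0 with a
# single encryption of 1 at the requested index. Insecure toy parameters.
import math
import random

def keygen(p, q):
    """Paillier key generation for primes p, q (toy sizes)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    # mu = (L(g^lam mod n^2))^-1 mod n, with g = n + 1 and L(u) = (u-1)/n
    mu = pow((pow(n + 1, lam, n * n) - 1) // n, -1, n)
    return (n,), (lam, mu)

def encrypt(pk, m):
    (n,) = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    (n,), (lam, mu) = pk, sk
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

def pir_query(pk, idx, n_items):
    """User: encrypt the selection vector (1 at idx, 0 elsewhere)."""
    return [encrypt(pk, 1 if i == idx else 0) for i in range(n_items)]

def pir_answer(pk, query, db):
    """Service: homomorphically compute Enc(sum_i m_i * b_i) = Enc(m_idx)."""
    (n,) = pk
    acc = 1  # multiplicative neutral element, an encryption of 0
    for q_i, m_i in zip(query, db):
        acc = (acc * pow(q_i, m_i, n * n)) % (n * n)
    return acc

pk, sk = keygen(1019, 1031)
db = [42, 7, 99, 1234]
answer = pir_answer(pk, pir_query(pk, 2, len(db)), db)
print(decrypt(pk, sk, answer))  # recovers db[2] = 99
```

Note that the response is a single ciphertext, but the query still consists of n ciphertexts; this is exactly the O(n) communication cost that the d-dimensional arrangement above reduces.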


It has nevertheless been argued that private information retrieval is impractical [SC07]. The impracticality stems from the requirement to perform a computation of order Ω(n) on each request. Intuitively, this is because the computation has to involve the whole database in order to guarantee privacy. Following the work of [GO96], a trusted processor with a small amount of secure memory can be used to overcome this limitation. An efficient PIR protocol based on this approach is described in [WSC08].

It has also been shown how to build a PIR protocol using sender anonymity [IKOS06, TP07].

8.1.2 Private Information Search

The goal of private searching is to hide both the search patterns and the search results. Ideally, both the keywords against which data items are matched and the relation between keywords and data items should be hidden. Search mechanisms for a large number of different scenarios have been proposed:

Searching on encrypted data. The user encrypts data using a symmetric key and stores the data at a storage server. He is the only one who can generate trapdoors that allow the server to search for keywords. The server only learns which ciphertexts match which trapdoors [SWP00, CGKO06].

Searching on public key encrypted data. The user can publish a public key. Everyone can encrypt data and send it to the storage server. The user is the only one who can generate trapdoors that allow the server to search for keywords. The server only learns which trapdoors match which ciphertexts [BDCOP04, ABC+05]. As a consequence of the public key setting, the server can do trial decryptions (for ciphertexts for which he knows the keyword) to find the keyword that corresponds to a given trapdoor.

Public key encryption that allows PIR queries. The user publishes a public key. Everyone can encrypt data and send it to the storage server. The user is the only one who can initiate searches, and he is also the only one to learn the result of the search. Most of the processing is performed at the server, which sends a short encrypted result to the user [BKOS07].

Oblivious keyword search. The data items that are to be searched are available in clear text at the storage server. The user learns all data items that match a certain keyword, while the server learns nothing [CGN98, OK04, FIPR05].

Private set intersection. Both the user and the service provider are in possession of a set. A set intersection algorithm allows the user to learn the intersection between his set and the set of the server [FNP04, JL09, CT09]. Set intersection is related to oblivious keyword search in the following way: the service encodes each of its set elements as the keyword of a data item in its database; the user also interprets his own set elements as keywords and can use oblivious keyword search to check whether one of his elements matches an item in the database.

1-out-of-n oblivious transfer protocol
Common input:  Number of messages n
User input:    ı                      Service provider input:  m1, . . . , mn
User output:   mı                     Service provider output: none

Table 8.2: OT^n_1 protocol

Private searching on streaming data. The data that is to be searched is available in the clear at the storage server, or is streamed through the server. The user generates an encrypted query for some keywords that can be interpreted as a straight-line program. The server executes the query on the data that is to be searched and obtains an encrypted result that is sent to the user. The server does not learn which data items are returned in the encrypted result. In particular, he is unable to learn the keywords of the query [OS07a, DD07].

8.1.3 Oblivious Transfer

PIR can leak information about the database of the service provider. PIR protocols that do not reveal any information besides the data item that is retrieved are sometimes referred to as symmetric PIR (SPIR). They are, however, more commonly known as 1-out-of-n oblivious transfer (OT). Di Crescenzo et al. showed in [DCMO00] that for the case where there is only one copy of the database there exists a communication-efficient reduction from any PIR protocol to a 1-out-of-n oblivious transfer protocol.

Oblivious transfer was first introduced by Rabin [Rab81]. Even et al. [EGL85] introduced 1-out-of-2 oblivious transfer (OT^2_1). The receiver chooses which message out of two possible messages he wants to receive. In turn it was shown how to construct OT^n_1 protocols using multiple invocations of an OT^2_1 protocol [BCR87, NP99a] (cf. Table 8.2). Direct constructions for OT^n_1 were proposed by [NP01, AIR01, Kal05].

For unobservable service access, we are not so much interested in single executions of oblivious transfer, but want to query the same database multiple times at different indices. This problem is known as adaptive oblivious transfer (adaptive OT). Adaptive OT can be achieved by letting the service commit to the database and execute an OT^n_1 scheme multiple times. However, this is not the most efficient solution. The first adaptive oblivious transfer protocol was proposed in [NP99b]. More efficient schemes were proposed by [OK04, CT05].


User(ı)                                  Service provider(m1, . . . , mn)

                                         (n, d, e) ← Kg
                                         For i = 1 . . . n:
                                           Ci ← Encs(mi; H1(H2(i)^d))

                 n, e, C1, . . . , Cn
             <---------------------------

b ← Z*_n
                 H2(ı) · b^e
             --------------------------->

                 H2(ı)^d · b
             <---------------------------

mı ← Decs(Cı; H1(H2(ı)^d))

Figure 8.1: Adaptive OT based on Chaum blind signatures

Oblivious transfer based on unique blind signatures. Camenisch et al. [CNS07] recognized that the schemes in [OK04, CT05] are based on a common principle to construct adaptive OT from unique blind signature schemes (UBSS).

We sketch the basic idea of adaptive OT schemes based on UBSS. First, all messages m1, . . . , mn are symmetrically encrypted using the hashed signature of the index i, 1 ≤ i ≤ n, as the key. Thus Ci = Encs(mi; H(Sign(i; sk))). H is a cryptographic hash function, Encs is a secure symmetric cipher, Sign is the signature algorithm corresponding to the UBSS, and sk is the signing key of the service provider that will be used for creating the blind signature. The ciphertexts C1, . . . , Cn are transferred to the user. To obtain message mı, the user runs a blind signature protocol with the service on message ı to obtain the symmetric key (a signature on ı).

We show a concrete scheme based on Chaum blind signatures [Cha82] in Figure 8.1. All messages are symmetrically encrypted using the RSA signature of the index. The values n, e, d are the RSA modulus, verification key, and signing key, respectively. Note that here we have two hash functions H1 and H2. H1 corresponds to H, while H2 is a hash function that is used by the signature scheme. First, the encrypted database C1, . . . , Cn is transferred to Alice. When Alice wants to obtain the information for location ı, she runs a Chaum blind signature protocol with the service to obtain the key. This last step of the protocol can be repeated multiple times.
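The exchange of Figure 8.1 can be traced end-to-end in a few lines of Python. The toy RSA parameters, the SHA-256-based hash functions, and the XOR-based symmetric cipher below are simplifying assumptions for illustration only; they are not a secure instantiation of the scheme.

```python
# Toy walk-through of Figure 8.1: adaptive OT from Chaum blind signatures.
# Insecure toy RSA parameters; H2 is a simplified full-domain hash and
# Encs is a one-time-pad-style XOR cipher. For illustration only.
import hashlib
import math
import random

# Service provider's RSA key: (n, e) public, d private
p, q = 1009, 1013
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))

def H1(x):  # derives a symmetric key from a signature value
    return hashlib.sha256(b"H1|" + str(x).encode()).digest()

def H2(i):  # hashes an index into Z_n (simplified full-domain hash)
    h = hashlib.sha256(b"H2|" + str(i).encode()).digest()
    return int.from_bytes(h, "big") % n

def enc_s(m, key):  # toy symmetric cipher: XOR with the derived key
    return bytes(a ^ b for a, b in zip(m, key))

dec_s = enc_s  # XOR is its own inverse

# Service provider: encrypt each message under the key H1(H2(i)^d)
db = [b"msg-one", b"msg-two", b"msg-three"]
C = [enc_s(m, H1(pow(H2(i), d, n))) for i, m in enumerate(db, start=1)]

# User: retrieve item i via one blind-signature interaction (repeatable)
def retrieve(i):
    b = random.randrange(2, n)
    while math.gcd(b, n) != 1:
        b = random.randrange(2, n)
    blinded = (H2(i) * pow(b, e, n)) % n   # user -> service: H2(i) * b^e
    signed = pow(blinded, d, n)            # service -> user: H2(i)^d * b
    sig = (signed * pow(b, -1, n)) % n     # unblind: recover H2(i)^d
    return dec_s(C[i - 1], H1(sig))

print(retrieve(2))  # recovers db[1] = b"msg-two"
```

Because the service only ever sees the blinded value H2(ı) · b^e, it signs without learning which index, and hence which ciphertext, the user can now decrypt.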

Remark 8.1.1 As the generated signature never needs to be verified by anyone but the user, a blind unique message authenticator, also known as a message authentication code, would be sufficient. We do, however, need to guarantee in the interactive generation protocol that the authenticator was computed correctly. For more information on the problem of building a blind unique message authenticator, we refer the reader to [NNA07]. This problem is also related to oblivious pseudo-random functions [FIPR05].

8.2 Hiding Access Patterns during Access Control

We show how oblivious transfer and oblivious keyword search protocols can be extended to implement access control policies that depend both on the rights granted to the user and on the attributes associated with the data items that are to be retrieved.

Access control based on commitments and proofs. Coull et al. [CGH09] provide a solution for implementing arbitrary access control policies for oblivious transfer protocols.

Their approach is based on the following strategy. In order to extend OT with access control, the user gives the database a hidden handle to her choice. The user can create such a handle by committing to the database index she is interested in. The hiding property of the commitment assures that the database does not learn about the user's choice, while the binding property and a zero-knowledge proof that binds the commitment to the underlying OT protocol guarantee that the user cannot retrieve a data item with a different index.

The interface for such a committed adaptive OT consists of a setup protocol

(stateU, stateS) ← InitU() ↔ InitS(m1, . . . , mn)

that generates a state, and a transfer protocol

((mı, state′U), state′S) ← TransferU(stateU, ı, open) ↔ TransferS(stateS, comm)

that allows the user to retrieve a message mı for an index ı (possibly chosen based on previously retrieved messages). The commitment comm = Com(ı, open) commits the user to his choice of ı.

Access control can be enforced using CL-signatures (cf. Section 2.4.3) and anonymous credentials (cf. Part I). Coull et al. [CGH09] achieve their generality by representing access control policies as a state transition graph. They also introduce the concept of stateful credentials that encode the current state of the user. A transition from one state to another state implies that the user is granted access to specific data items. By executing the access protocol, the state of the user's anonymous credential will change from its current state to the target state of the transition.

We will show how to implement more specific access control policies. For example, role based access control can be implemented by assigning roles to both data items and users. For instance, if we assign the role 'manager' to data item m1, then only a user with role 'manager' can access this record.

The service provider can assign roles using its signing key skDB of a CL-signature scheme. The corresponding verification key pkDB is used during access control enforcement. Roles are assigned to a data item by signing the index of the item and the roles associated with it. Roles are assigned to a user by signing the user's master secret key together with his roles. Note that here we use the issuing protocol of the CL-signature scheme, so the database does not actually have to learn the secret keys of users. As a simple example, signatures may look as follows:

σ1 = Sign(1, 'manager'; skDB), . . . , σn = Sign(n, 'director'; skDB)

σU1 = Sign(skU1, 'manager'; skDB), . . . , σU` = Sign(skU`, 'director'; skDB) .

All of these signatures are made publicly available, but only Uj should know the master secret signed by σUj. Before accessing a data item with committed index ı, the user Uj has to prove that he knows two signatures σı = Sign(ı, role; skDB), σUj = Sign(skUj, role′; skDB), and the secret key skUj such that role = role′ and ı is the same as the value committed in comm. This can be achieved by the following zero-knowledge proof of knowledge:

PK{(σUj, σı, skUj, role, ı, open) : comm = Com(ı, open) ∧

        Verify(σUj, skUj, role; pkDB) = 1 ∧ Verify(σı, ı, role; pkDB) = 1} .

Such a system was proposed in [CDN09].

This approach also allows the enforcement of more complicated policies. For instance, a service provider can impose age restrictions on certain content. The provider associates a minimum age with each database element mi, 1 ≤ i ≤ n, and creates signatures that bind each index to its minimum age: σ1 = Sign(1, 11; skDB), σ2 = Sign(2, 18; skDB), . . . , σn = Sign(n, 18; skDB). The first data item has a minimum age of 11 years, the last of 18 years.

When accessing the ıth item, the user has to prove that he possesses a credential with a birthdate such that, for the credential associated with the data item, today() − birthdate > minage. Here today() is a function that returns today's date.

PK{(σUj, σı, skUj, birthdate, minage, ı, open) :

        Verify(σUj, skUj, birthdate; pkDB) = 1 ∧ Verify(σı, ı, minage; pkDB) = 1 ∧

        comm = Com(ı, open) ∧ today() − birthdate > minage} .

A user that is 16 cannot succeed in executing the above proof for ı = 2, while she can do so for ı = 1.


Access control based on signature authorization. The principal idea behind signature based authorization is that users should only be able to access a data item if they are in possession of a signature on an agreed-upon message. This functionality can be achieved by the approach described above, which is based on zero-knowledge proofs. Other, less general but more efficient mechanisms have been proposed in the literature.

Li et al. [LDB05] presented oblivious signature-based envelopes that allow for the access of individual data items. Privacy-preserving policy-based information transfer [CJKT09] can be seen as a signature-authorized oblivious transfer scheme. The two schemes are closely related. In [CT09] the same approach is used for authorizing the inputs for a private set intersection protocol.

8.3 Summary of Contribution

Chapter 9: Efficient Oblivious Augmented Maps. Location-based services (LBS) are an example of services for which the user's input necessary for the provisioning of the service, i.e., his location, is already highly privacy sensitive. For instance, the user's request for his pollen risk level using a pollen allergy warning service would reveal his allergies and his location.

We describe a protocol that involves the mobile operator of the users as a semi-trusted intermediary. The mobile operator knows the users' locations and has contractual relationships with the LBS providers. We assume that the LBS are provided by smaller companies, often without a well-established reputation, whereas the mobile operator already has to be trusted by users to use the location data only for the agreed-upon purposes. Consequently, it is desirable that LBS providers need to be trusted less than the mobile operator. A key privacy requirement is to prevent the location-based services from learning the location of users. Involving the mobile operator in the protocol allows us to overcome some of the performance restrictions that would be particularly irksome for users with restricted mobile devices.

However, we do not want the mobile operator to gain an unfair advantage from his intermediary role, and thus want to hide both the service usage patterns and the LBS providers' databases from him. The mobile operator does not need to know which service a user is interested in, which would for instance reveal that a user has a pollen allergy. In addition, we achieve service unobservability, meaning that even the LBS providers do not learn which service is accessed by the user.

We show how to use the PIR mechanism based on homomorphic encryption that is implicit in [OS07a], and that is described in more detail in [OS07b], to build a distributed private subscription channel.


The public key pkU is generated by the user, and the user knows the corresponding secret key. This key is used for the aforementioned PIR. To subscribe to a service, the user uses an additive homomorphic encryption scheme to generate two encryptions of 1. For all other services, he generates two encryptions of 0. The two elements of the jth encryption pair are encrypted under two different public keys: pkU and pkSj. The public key pkSj is for a threshold decryption scheme and is used to support payments. The corresponding secret key is shared between a quorum of parties: the mobile operator, the corresponding LBS provider Lj, and an additional privacy trustee. To turn the PIR into an OT and to guarantee payment fairness, the user proves in zero-knowledge that the ciphertext pairs are formed correctly, i.e., that one pair contains encryptions of 1 and that all others contain encryptions of 0.

For accessing the location-specific information, the mobile operator and the LBS providers use the homomorphic property of the first ciphertext of each pair to embed the result of a second oblivious transfer (that the mobile operator executes with the LBS provider) into the outcome of the homomorphic PIR protocol.

For payments, the mobile operator uses the homomorphic property of the second ciphertext in a subscription pair to sum up all encryptions in the encrypted domain. A threshold decryption, involving the privacy trustee, allows one to obtain the number of subscriptions per LBS provider. This is akin to techniques used for building voting schemes based on homomorphic threshold decryption [HS00]. The subscription fees collected by the mobile operator are then distributed to the LBS providers based on the popularity of their services.
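The encrypted-domain tally can be sketched with an additively homomorphic (toy Paillier) scheme: each user posts, per service, an encryption of 1 or 0, and multiplying the ciphertexts of all users yields an encryption of the subscriber count. The single decryption key below stands in for the quorum's threshold decryption, and all parameters and names are illustrative assumptions.

```python
# Toy sketch of the homomorphic subscription tally: multiplying Paillier
# ciphertexts adds their plaintexts, so the operator can aggregate the
# users' encrypted 0/1 subscription flags without decrypting them.
# Insecure toy parameters; a single key replaces the threshold decryption.
import math
import random

def keygen(p, q):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow((pow(n + 1, lam, n * n) - 1) // n, -1, n)
    return n, (lam, mu)

def encrypt(n, m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(n, sk, c):
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

n, sk = keygen(1019, 1031)
n_services = 3

# Three users; users 2 and 3 subscribe to service 2. Each user posts
# Enc(1) for his chosen service and Enc(0) for every other service.
chosen = [0, 2, 2]
posted = [[encrypt(n, 1 if j == s else 0) for j in range(n_services)]
          for s in chosen]

# Operator: multiply the j-th ciphertexts of all users -> Enc(sum of flags)
tally = []
for j in range(n_services):
    acc = 1  # encryption of 0, the neutral element
    for row in posted:
        acc = (acc * row[j]) % (n * n)
    tally.append(decrypt(n, sk, acc))

print(tally)  # subscriber counts per service: [1, 0, 2]
```

The operator never sees an individual flag in the clear; only the aggregated counts per service are decrypted, which is what the fee distribution requires.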

The key ideas behind this work are (1) nesting two OT protocols such that only the originator of the first OT learns the final outcome, and (2) using the homomorphic evaluation of a tally for payment instead of voting.

Chapter 10: Authorized Private Searches on Public Key Encrypted Data. Legislation on data retention, including the EU data retention directive (Directive 2006/24/EC) [eu-95], requires that communication service providers such as internet service providers (ISPs) and telecommunication providers (telcos) store user-related data over an extended period of time (from 6 to 24 months), so that it is available for criminal investigations and law enforcement. While these laws also require such data to be adequately protected, practice shows [Cle] that security breaches are common when data is stored at a large scale.

Although storing such data in encrypted form is an obvious protection method, this is not always an effective solution. The data must also be made available so that it can be searched, accessed, and analyzed. Decrypting the entire database for each search is not practical and decreases security. In particular, it provides no protection against malicious insiders. The party doing the search may not (or should not) have the authority to see the whole database. Danezis [Dan09] describes a centralized data retention scheme that employs encryption and discusses the risks of such an approach.

We present two new cryptographic primitives, and describe how they can be applied to the problem of searching on encrypted data in the data retention setting. The first primitive presented is a committed blind anonymous identity-based encryption (IBE) scheme, extending and improving upon the concept of a blind IBE scheme previously proposed [GH07]. In this context, anonymity means that the ciphertext does not leak the key (identity) under which it was encrypted [Gen06, BW06], and blind means that a user can request the decryption key for a given identity without the key holder learning the identity [GH07]. We use commitments to enforce restrictions on the blinded identity. This allows one to ensure that a key is only issued to an individual who can prove (in zero-knowledge) that the identity contained in the commitment fulfills certain conditions imposed by the key holder. The second new primitive we present is public key encryption with oblivious keyword search (PEOKS), which we implement based on our proposed committed blind anonymous IBE scheme.

We describe an application scenario for our scheme in which a communication service provider retains data in compliance with the EU data retention directive. Our solution achieves the following properties: (1) The data is stored in a secure and efficient manner. (2) An investigator is able to retrieve data relevant to the specific (authorized) investigation while still protecting irrelevant (unauthorized) data. The key holder can keep an audit record that consists of commitments that are signed by judges. These commitments can be used in an audit to hold judges accountable. (3) Neither the data holder nor the key holder learns any details about the investigation.

Normally, both the key holder role and the data holder role will be taken on by the communication service provider that is retaining the data. While the servers corresponding to the data holders could be widely spread and easy to attack, the key holder function could be implemented as a centralized high security module that can only be accessed by the investigator. Different setups are possible.


Chapter 9

Efficient Oblivious Augmented Maps

Do our electronic maps need to know where we are – or that we are looking at them? Electronic maps can be augmented with information provided by location-based services (LBSs). This way subscribed users can quickly find what they need. With LBSs, however, location privacy is at stake. To achieve good privacy, it is advisable to limit access to identity and location information. Even the regular observation of service usage patterns can reveal private information.

Today LBSs are provided in two different ways. Either all the service specific data is made available to the user, who computes the result locally—this is the case for many car navigation systems; or the service is provided remotely. The latter is the dominant approach for providing LBSs in mobile communication networks. Such LBSs can be seen as a by-product of the GSM system (Global System for Mobile Communications), as the location of subscribers is already used for mobility management [GSM03]. Given these constraints, we aim at achieving privacy for mobile subscribers who want to use LBSs. We presume that the location will be gathered from a mobile network that provides the communication, while external service providers will provide the service.

Our approach uses cryptographic protocols to ensure privacy: oblivious transfer and homomorphic encryption. By developing a framework where the user's location and subscription are processed in the encrypted domain, we achieve privacy for certain classes of LBSs in a new way. Unlike previous approaches based on anonymization and pseudonymization [KFKK05], our approach achieves the following strong privacy properties: first, the mobile operator learns nothing in addition to what he already knows, except for the subset of users that are interested in using LBSs.


Thus, no usage profiles can be collected. Second, the service providers only learn the number of subscribers to their service. Thus, service providers do not learn the users' locations.

Our protocol offers the additional privacy property of service usage privacy. Even the service providers do not know whether a user is accessing their service or not. By introducing a privacy trustee, we are able to preserve service usage privacy even in case of service/operator collaboration.

The privacy provided is close to optimal: the operator needs to know the set of LBS users so that he can localize and charge them, and service providers are paid by the operator depending on the number of subscribers they were able to attract.

Efficiency consideration. Despite the strong security requirements, our protocol scales well: initialization is independent of the number of users, and subscriptions grow linearly with the number of services, both in size and computational effort. This is unavoidable, as all services need to be involved to guarantee service privacy. The mobile user, as the party with the most restricted resources, has to receive and decrypt only a single message.

Related Work. Various privacy enhancing technologies (PETs) have been proposed for LBSs. Most of these techniques focus on providing pseudonymity and anonymity for LBSs. Federrath et al. [FJKP95] proposed the use of a trusted fixed station and mixes [Cha81] for hiding the relation of real world identities to location data in GSM networks. While our protocol can be adapted to such a privacy enhanced GSM network by letting the fixed station localize the user (and take over some of the responsibilities of the mobile operator), we explicitly focus on the less privacy friendly but more practical setting where a third party knows the user's location.

Researchers then developed LBS specific PETs called mix-zones (see [BS03] and [GG03]). Mix-zones allow users to switch the pseudonyms associated to their location in an unobservable way. Kölsch et al. [KFKK05] use pseudonymization techniques in the following realistic setting: a network operator (or a party connected to multiple operators) knows the user's location, while the LBSs are provided by independent service providers that know the user only under short-lived pseudonyms. However, as location information is inherently attached to a person's life, re-identification is often easy [Fri07, MDBP08]. Location needs to be hidden, not anonymized [Kru07, IL07].

Outline of the chapter. We describe the properties and high-level implementation of our privacy protecting LBS scheme in Section 9.1. The detailed construction is given in Section 9.2. We analyze the efficiency and security of our solution in Section 9.3 and finally conclude in Section 9.4.

9.1 Privacy Protected LBS Scheme: Definition and Solution Sketch

9.1.1 Definition

Parties. Our protocol involves a user U who accesses LBSs on his mobile device. His goal is to obtain location specific information on topics of his interest. This information is collected and served by service providers L1, . . . , Lℓ. A third party who knows the user's location and has a financial relationship with the user acts as a proxy P between users and services — this could be the mobile operator of the user or an organization associated with it. The proxy is responsible for the security of the location information and assists in the payment transaction. We assume that the number of users connected through a proxy is much higher than the number of services. Finally, we assume the existence of an independent party with no commercial interests: a privacy protection organization T who can be offline most of the time. We refer to all parties except users as organizations.

Security and Privacy Requirements. A secure and privacy friendly LBS protocol should protect the assets and interests of all involved parties. The assets that need to be protected are: the user's location, the user's subscription, the topic specific databases of the Lj, and the payment. We consider the following requirements:

• Location privacy: The protocol does not reveal the user's location to the service.

• Service usage privacy: Even when the proxy and the LBSs collude, the secrecy of the user's subscription remains protected. This includes message privacy; i.e., only the user can obtain the information provided by the service he subscribed to.

• Database secrecy: The proxy learns no information about the database of Lj. A user learns only the information items that correspond to the requested locations. This property must hold even if the proxy and the user collude.

• Fairness: It is guaranteed that either the user receives the expected data for the requested location and the LBS receives his expected payment, or the cheating party can be uniquely identified.


Protocol phases. In the Setup phase the involved parties generate their keys. During the Service Update phase, each service Lj encrypts its topic specific database and transfers it to the proxy. In the Subscription phase a user U creates an encrypted subscription for a service, sends it to the proxy, and is charged the subscription fee. We currently assume that all services have the same price. Support for different prices could be added based on ideas of priced oblivious transfer [AIR01, RKP09]. In the Data Retrieval phase the proxy runs a protocol with every service Lj and obtains an encrypted result. The proxy combines them into a single encrypted result for the user such that he only receives the data of the subscribed service. The fair allocation of the money collected in the subscription phase takes place in the Settlement phase under the supervision of the trustee T.

Remarks. The database of a service Lj is represented as a one-dimensional vector with one element for each location. We assume that the number of locations n is the same for all services. Further, we assume that services only update the whole database at once. In the current version of our protocol a user can only be subscribed to a single service. Service usage privacy is guaranteed with respect to the total number of subscribed users during a subscription period. A subscription period is defined as the time between two settlement phases. Finally, we assume that parties communicate over secure channels and that P, Lj, and T are able to authenticate communication and sign messages.

9.1.2 High-Level Approach and First Sketch

We follow a constructive approach in the description of our protocol. We take building blocks from Section 2.4 and Chapter 8, put them into place, and describe their function in our construction. Some of the security requirements can be fulfilled by the functionality provided by individual building blocks; others require a complex interplay between building blocks. As a consequence, the mapping from building blocks to the sub-protocols of our solution is not one-to-one. We sketch the sub-protocols as they are assembled from their building blocks.

Our main building blocks are two variants of OT and a threshold encryption scheme. Additive homomorphic encryption and zero-knowledge protocols are building blocks of these schemes, but are also used to glue them together in a secure way. The two OT protocols are specifically selected for their good performance under repetition of input data. The blind signature based OT scheme (cf. Section 8.1.3) is optimized for the case in which the input database is fixed while the index varies. The homomorphic encryption based PIR/OT (cf. Section 8.1.1) is efficient in the opposite case, where the index is fixed.

During the protocol execution, a single proxy interacts with a multitude of users and multiple services. The first building block we put into place is the blind signature based OT protocol. It is executed with the proxy acting as the requester and one of the service providers as the sender. It allows the proxy to retrieve location specific information mı,j for a user at location ı without service Lj learning the user's location. This guarantees location privacy. The proxy executes this sub-protocol with all offered services. This assures service privacy at the service side. In this way the proxy obtains an information vector mı,1, . . . , mı,ℓ.

Our second building block is the homomorphic encryption based PIR/OT protocol (cf. Section 8.1.1). The proxy acts as the sender (using the aforementioned vector as input) and the user acts as the requester (using the index ȷ of the service Lȷ he subscribed to as input). The protocol allows the user to learn mı,ȷ without the proxy learning the user's subscription; we achieve full service privacy.

Note how the choice of OT protocols is crucial for the performance of our protocol. In the first OT protocol, the same database is queried by the proxy for all users (and different locations as they move about). The database needs to be encrypted and transferred to the proxy only once. For the second OT between user and proxy, the subscribed service is invariant for the duration of a subscription period, and it is sufficient to send the first (and expensive) message of the homomorphic OT protocol only once. Consequently, we split off these operations as sub-protocols which have the semantics of a service update and a user's subscription.

This gives us a first instantiation of the first four protocol phases. The outline of the protocol is depicted in Figure 9.1. Some sub-protocols are not yet implemented at this stage. The detailed protocol description in Section 9.2 contains an implementation of all the sub-protocols in the figure.

Setup. (cf. Figure 9.1.1: ① KeygenU, ② KeygenL) Every user generates a key-pair for a homomorphic encryption scheme ①. These keys are used for the OT based on homomorphic encryption. Every service generates a key-pair (skI, pkI) that is used for the OT based on blind signatures ②.

Service Update. (cf. Figure 9.1.2: ② EncryptData) The database of the LBS Lj consists of the n elements m(1,j), . . . , m(n,j) ①. Each of the elements is encrypted with its own symmetric key H(ki) that is computed by hashing the signature ki = Sign(i; skI) of the index ②. The encrypted database DBj = (C1, . . . , Cn), with Ci = Encs(mi; H(ki)), is sent to the proxy ③.

Subscription. (cf. Figure 9.1.3: ① Subscribe) A user's subscription ① consists of ℓ elements S(U,1), . . . , S(U,ȷ), . . . , S(U,ℓ), one for each service ②. Each element contains a ciphertext Q of the homomorphic encryption scheme. Q decrypts to 1 for the service Lȷ the user subscribes to and to 0 otherwise. To ensure the security of the OT, the user proves in zero-knowledge that all S(U,j) are correctly constructed.


Data Retrieval. (cf. Figure 9.1.4: ① Request, ② Combine, ④ Decrypt) In the data retrieval phase a user obtains location-specific data from his subscribed service. The proxy is involved since he is aware of the user's location and stores the encrypted databases of the services. Recall that these databases are encrypted using hashed signatures as keys. The proxy acts on the user's behalf and can request decryption of individual items without revealing the location of the user. To guarantee service usage privacy, the proxy has to repeat the following steps for every service Lj ①:

The proxy blinds the location ı and sends the blinded value Blind(ı; b, pkI) to the service. The service replies with the blinded signature 〈kı〉blind. The proxy computes mı,j = Decs(Cı; H(Unblind(〈kı〉blind; b, pkI))). This completes the first OT. The proxy collects mı,1, . . . , mı,ℓ and continues with the second OT (the user's first message is taken from his subscription). The proxy takes the Q corresponding to S(U,j) and computes Ej = mı,j ⊗ Q for all 1 ≤ j ≤ ℓ. This corresponds to an encryption of mı,ȷ for Lȷ and an encryption of 0 otherwise.
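The first OT can be sketched with RSA blind signatures standing in for the blind signature scheme of Section 8.1.3. The parameter sizes and the helper names (`fdh`, `blind`, `sign_blinded`, `unblind`) are illustrative assumptions of this sketch, not the scheme's actual instantiation.

```python
import hashlib, math, secrets

# Toy RSA key for the service's blind-signature scheme (tiny, illustrative primes;
# a real deployment uses a full-size modulus).
p, q, e = 104729, 104723, 65537
N = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def fdh(i: int) -> int:
    """Full-domain hash of a location index into Z_N."""
    return int.from_bytes(hashlib.sha256(i.to_bytes(4, "big")).digest(), "big") % N

def blind(i: int):
    """Proxy: hide location i behind a random blinding factor b."""
    while True:
        b = secrets.randbelow(N - 2) + 2
        if math.gcd(b, N) == 1:
            return (fdh(i) * pow(b, e, N)) % N, b

def sign_blinded(x: int) -> int:
    """Service: sign the blinded value without learning i."""
    return pow(x, d, N)

def unblind(s: int, b: int) -> int:
    """Proxy: remove the blinding, recovering k_i = fdh(i)^d mod N."""
    return (s * pow(b, -1, N)) % N

blinded, b = blind(7)
k = unblind(sign_blinded(blinded), b)
assert k == pow(fdh(7), d, N)  # the same k_i the service used to encrypt C_7
```

Because the signature is deterministic in the index, the recovered `k` is exactly the key under which the service encrypted the corresponding database entry, while the service only ever saw the blinded value.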

As a last step, the proxy combines the Ej by homomorphically adding all the encryptions (not knowing which of them contain the message) ②. This way all encryptions of 0 cancel out. The result is transferred to the user ③. He decrypts E to obtain mı,ȷ ④.
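The select-and-combine step (Ej = mı,j ⊗ Q followed by the homomorphic sum) can be illustrated with any additively homomorphic scheme. The sketch below substitutes a toy Paillier cryptosystem for Damgård-Jurik (an assumption of this sketch), realizing ⊗ as ciphertext exponentiation and the homomorphic sum as ciphertext multiplication; key sizes and data values are illustrative only.

```python
import math, secrets

# Toy Paillier keys (tiny primes; the scheme itself uses Damgård-Jurik
# with full-size moduli).
p, q = 104729, 104723
n, n2 = p * q, (p * q) ** 2
lam, g = math.lcm(p - 1, q - 1), p * q + 1

def enc(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    L = lambda u: (u - 1) // n
    mu = pow(L(pow(g, lam, n2)), -1, n)
    return (L(pow(c, lam, n2)) * mu) % n

# Subscription: the user encrypts 1 for the subscribed service (index 1), 0 elsewhere.
Q = [enc(0), enc(1), enc(0)]

# Data retrieval: the proxy holds the per-service items m_{i,j} and computes
# E_j = m_{i,j} (x) Q_j, i.e. scalar multiplication = ciphertext exponentiation.
items = [111, 222, 333]
E = [pow(Qj, m, n2) for Qj, m in zip(Q, items)]

# Combine: the homomorphic sum is the product of ciphertexts; encryptions of 0 cancel.
combined = 1
for Ej in E:
    combined = (combined * Ej) % n2

assert dec(combined) == 222  # only the subscribed service's item survives
```

The proxy performs the exponentiations and the final product without ever learning which `Q` encrypts 1, which is exactly what gives service usage privacy.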

The two main issues with this construction are (1) the proxy learns the mı,j vector for the locations of all users, which is a compromise of database secrecy, and (2) the lack of a fair payment infrastructure.

9.1.3 First Revision: Database Secrecy

We address the lack of database secrecy by intertwining the first OT with the second. To this end, we let the proxy pass on S(U,j) to Lj. Now (after agreeing on who sends which bit range) both Lj and the proxy can act as senders in the second OT without learning each other's inputs. This is made possible by the properties of homomorphic encryption, which lets everyone manipulate encrypted data. Informally, the last message of the first OT will be transferred as part of the encrypted payload of the second OT. This guarantees that only the user with his secret key can obtain the results of both protocols.

More concretely, the following changes have to be made in the subscription and data retrieval phases.

Subscribe. The S(U,j) are now also sent to the services ②.

Figure 9.1: Five phases of the oblivious map protocol: subscription S(U,j), encrypted database DBj, service result Ej, combined result E, location-specific message m(ı,ȷ), number of subscriptions Nj, location ı, and subscribed service ȷ (cf. Section 9.2 for details).

Data Retrieval. During Request ①, the proxy blinds the location ı and sends the blinded value Blind(ı; b, pkI) to the service. To ensure that only the user (and not the proxy) can decrypt Cı, the service encrypts the blinded signature 〈kı〉blind. This is done with an additive homomorphic encryption scheme. Remember that during subscription the user (through the proxy) provided the service Lȷ with an encryption Q = Ench(1). The service computes E = 〈kı〉blind ⊗ Q = Ench(〈kı〉blind · 1) = Ench(〈kı〉blind). The result is sent to the proxy, who uses a similar approach to add b and Cı to E. These requests are done for all services, including those the user did not subscribe to. The latter, however, received Q = Ench(0) during Subscribe, and all the operations result in Ej = Ench(0) for j ≠ ȷ.

As a last step, the proxy computes the homomorphic sum of all encryptions—not knowing which of them contain the unblinding information, the encrypted message, and the blinded signature ②. This way all encryptions of 0 cancel out. The result is transferred to the user ③. He decrypts E, obtains b‖Cı‖〈kı〉blind, and computes mı,ȷ = Decs(Cı; H(Unblind(〈kı〉blind; b, pkI))) ④.
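The layout of the combined plaintext can be sketched with plain integer shifts. The concrete bit lengths below are assumptions for illustration; the field order follows the shift formula of Section 9.2, where (Cı‖b) is shifted left past the lB bits reserved for the blinded signature.

```python
# Packing/unpacking of the combined plaintext carrying C_i, b, and <k_i>_blind.
# The bit lengths l_B, l_b, l_D are the length parameters of Section 9.2;
# the concrete values here are assumed for illustration.
l_B, l_b = 64, 32   # blinded signature and blinding factor field widths

def pack(b: int, C: int, k_blind: int) -> int:
    # The service contributes <k_i>_blind in the low l_B bits; the proxy
    # homomorphically adds (C || b) << l_B, which occupies the higher bits.
    return (((C << l_b) | b) << l_B) | k_blind

def unpack(payload: int):
    k_blind = payload & ((1 << l_B) - 1)
    b = (payload >> l_B) & ((1 << l_b) - 1)
    C = payload >> (l_B + l_b)
    return b, C, k_blind

b, C, k = 0xCAFE, 0xF00D, 0xBEEF
assert unpack(pack(b, C, k)) == (b, C, k)
```

Because the three fields occupy disjoint bit ranges, the homomorphic additions performed by the service and the proxy never interfere, and the user can split the decrypted plaintext back into its parts.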

Page 182: Cryptographic Protocols For Privacy Enhanced Identity Management

158 EFFICIENT OBLIVIOUS AUGMENTED MAPS

9.1.4 Second Revision: Payment Infrastructure

The core idea for the payment infrastructure is to bind the request of the second OT (the subscription) to a vote. Now revenues can be fairly distributed between services by anonymously counting the number of times users voted for (subscribed to) a service. We use ballot-counting techniques based on homomorphic encryption and threshold decryption. We make the following changes to the setup and subscription phases, and we provide an implementation for the settlement phase.

Setup. (cf. Figure 9.1.1: ③ PaymentSetup) Each LBS Lj runs a distributed key generation protocol together with the proxy and the privacy trustee ③. This results in a key pair with a secret key shared according to a (3, 3)-threshold scheme. The shared key is needed in the settlement phase to jointly compute the payment result.1

Subscription. (cf. Figure 9.1.3: ① Subscribe, ③ VerifySubscription) A user's subscription ① consists of ℓ elements S(U,1), . . . , S(U,ȷ), . . . , S(U,ℓ), one for each service ②. Each element contains two ciphertexts Q and P of the homomorphic encryption scheme, where the first is encrypted with the user's public key and the latter with the payment key. Both Q and P decrypt to 1 for the service Lȷ the user subscribes to, and to 0 otherwise. To ensure the security of the OT and the payment, U proves in zero-knowledge that Q and P are constructed correctly. The service providers check these proofs before providing the service ③.

Settlement. (cf. Figure 9.1.5: ① Settlement) The technique used in the Settlement phase is similar to a technique used in electronic voting protocols. The non-interactive zero-knowledge proof sent by the user in the subscription ensures that P and Q encrypt the same value (either 1 or 0). The homomorphic property of the ciphertexts makes it possible to anonymously sum up the contents of all different P values. The trustee T ensures that only the homomorphic sums (and not individual subscriptions) are decrypted in a 3-out-of-3 threshold decryption ①. Based on the result, the proxy divides the subscription money received from the users during subscription in a fair way ②.

1 Instead of the distributed key generation, one could make a trust assumption and assume that the key is generated and distributed by a trusted party. This party is only needed for the initialization of the scheme.


9.2 A Multi-Party Proxy LBS Scheme

Length Parameters. Let k be a security parameter that determines the key sizes of the underlying cryptographic schemes. We use lN ∈ Θ(k) to denote the length of the RSA modulus N used for Damgård-Jurik encryption. The ciphertext for a plaintext of length s · lN has length (s + 1) · lN. For simplicity we use ciphertexts of length lN(s + 1) to encode plaintexts of length (lN − 1)s. We use lH to denote the length of the plaintext for the homomorphic encryption scheme. Let lB, lb, and lD be the lengths of a blinded signature, the blinding factor, and an encrypted database entry respectively. We require lB + lb + lD ≤ lH.

Setup. PaymentSetup(Lj(1k), P(1k), T(1k)) is executed for each service Lj. The privacy trustee T, the proxy P, and the LBS Lj run a distributed key generation protocol for an additive homomorphic encryption scheme to generate a public payment key pkSj. The secret key skSj is shared according to a (3, 3)-threshold encryption scheme such that only the three parties together can reconstruct the key. This results in three secret shares skS(Lj,j), skS(P,j), and skS(T,j), which are included in the secret keys of Lj, P, and T respectively.

LBS Key Generation. Every service provider Lj runs KeygenL(1k) to generate the key pair (pkBj, skBj) for a unique blind signature protocol. The key pair (pkLj, skLj) = ((pkSj, pkBj), (skS(Lj,j), skBj)).

Proxy Key Generation. For our construction the proxy does not need to generate keys of his own. The public key pkP of P results from combining the public payment keys of the services: the public key pkP = (pkS1, . . . , pkSℓ) and the secret key skP = (skS(P,1), . . . , skS(P,ℓ)).

User Key Generation. KeygenU(1k) generates a key pair (pkU, skU) for the additive homomorphic Damgård-Jurik cryptosystem [DJ01] used in the homomorphic OT protocol.

Service Update. Each LBS Lj encrypts its location specific information using algorithm EncryptData(m(1,j,v), . . . , m(n,j,v), v; skLj). The value v denotes the version number of the data update. Note that m(i,j,v) contains i, j, and v and is signed by Lj to allow for authenticity checks. A service Lj uses his secret key skBj to compute DB(j,v) = (C(1,j,v), . . . , C(n,j,v), v):

k(i,j,v) = Sign(i‖v; skBj)

C(i,j,v) = Encs(m(i,j,v);H(k(i,j,v))).


The cryptographic hash function H is used for computing the symmetric key. Note that in Sign the values i and v need to be interpreted as fixed length bit strings. The resulting database DB(j,v) is sent to the proxy.
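A minimal sketch of EncryptData, assuming RSA full-domain-hash signatures for Sign and a SHA-256-derived one-time pad standing in for the symmetric scheme Encs (both are assumptions of this sketch, as are the helper names and parameter sizes):

```python
import hashlib

# Toy RSA signing key for the service (tiny, illustrative primes).
p, q, e = 104729, 104723, 65537
N = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def fdh(msg: bytes) -> int:
    """Full-domain hash into Z_N."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def sign(i: int, v: int) -> int:
    """k_(i,j,v) = Sign(i||v; skB_j): deterministic signature on index and version."""
    return pow(fdh(i.to_bytes(4, "big") + v.to_bytes(4, "big")), d, N)

def enc_s(m: bytes, k: int) -> bytes:
    """Encs(m; H(k)): XOR m with a pad derived by hashing the signature."""
    pad = hashlib.sha256(k.to_bytes(8, "big")).digest()
    return bytes(a ^ b for a, b in zip(m, pad))

# EncryptData for version v: one ciphertext per location, shipped to the proxy.
v = 1
items = [b"cafe nearby", b"fuel 1.2km ", b"atm  200m  "]
DB = [enc_s(m, sign(i, v)) for i, m in enumerate(items)] + [v]
```

Since the signature is a deterministic function of i‖v, anyone who later obtains k(ı,j,v) through the blind signature protocol can recompute H(k) and decrypt exactly one entry, while the proxy holding DB(j,v) learns nothing about the plaintexts.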

Subscription. The user U must subscribe to a service Lȷ to receive location specific information. To subscribe, the user runs the protocol Subscribe(P(skP, pkU), U(ȷ; skU, pkP)) with the proxy. The proxy's public key is parsed as pkP = (pkS1, . . . , pkSℓ) and the user proceeds as follows:

1. U uses his public key pkU to compute subscription elements S(U,j) = (Q(U,j), P(U,j), πj) for 1 ≤ j ≤ ℓ, where

   Q(U,j) = Ench(1; pkU) if j = ȷ, and Ench(0; pkU) otherwise;
   P(U,j) = Ench(1; pkSj) if j = ȷ, and Ench(0; pkSj) otherwise;

   πj = NIPK{(r1, r2) : (Q(U,j) = Ench(1; r1, pkU) ∧ P(U,j) = Ench(1; r2, pkSj)) ∨ (Q(U,j) = Ench(0; r1, pkU) ∧ P(U,j) = Ench(0; r2, pkSj))}.

   The Q(U,j) are used to request the location specific information from the LBS Lȷ and the P(U,j) are used for the oblivious payment.

2. The resulting S(U,1), . . . , S(U,ℓ) are sent to the proxy together with the payment for the subscription, e.g., by using a credit card.

3. Additionally, the user proves in zero-knowledge that the homomorphic sum of the values Q(U,j) is an encryption of 1. This can be done using standard techniques from [DJ01]. See also Section 2.5.4.

4. If the verification of the last proof and of the individual πj proofs succeeds, the proxy adds a time stamp to each S(U,j) and signs it.

The proxy passes each subscription S(U,j) on to the respective service Lj. He keeps a counter of the number of user subscriptions in this subscription period. Optionally, the proxy may also retain all S(U,j).

Verify Subscription. Service Lj runs VerifySubscription(S(U,j), j; pkU, pkP) to verify that S(U,j) is correct, i.e., that the contents of the queries Q(U,j) and P(U,j) are equal and encryptions of 0 or 1, and that the proxy cannot deny that the user has subscribed. The former is done by verifying the proofs of knowledge, the latter by verifying the proxy's time stamp and signature.

If the algorithm succeeds, S(U,j) is added to a list of subscriptions Sj. The P(U,j) of Sj will later on be added up using the homomorphic property of the underlying encryption scheme Ench. If one of the verifications done by the services Lj, 1 ≤ j ≤ ℓ, fails, they refuse to provide the information and the proxy has to refund the payment for the subscription to the user.

Data Retrieval. The proxy runs Request(P(DB(j,v), ı; skP, pkU, pkLj), Lj(S(U,j); skLj, pkU, pkP)) with Lj to request location specific information for user U.

The input of the algorithm is the database DB(j,v) with the most up-to-date version v and the current location ı of the user. The proxy's output is either an encryption of m(ı,j,v), based on the location of the user, or an encryption of 0 if U is not subscribed to Lj.

1. The proxy chooses a random b and computes

   〈ı‖v〉blind = Blind(ı‖v; b, pkBj) .

   The random blinding factor b hides the location ı of the user in 〈ı‖v〉blind.

2. The proxy sends 〈ı‖v〉blind to Lj, 1 ≤ j ≤ ℓ, which looks up Q(U,j) in S(U,j) and computes

   Ej = 〈k(ı,j,v)〉blind ⊗ Q(U,j), where 〈k(ı,j,v)〉blind = Sign(〈ı‖v〉blind; skBj) .

3. Every service Lj sends Ej back to the proxy.

4. The proxy enriches Ej with C(ı,j,v) and b. This is done by computing Ej := Ej ⊕ ((C(ı,j,v)‖b) ≪ lB) ⊗ Q(U,j), where ≪ denotes shifting to the left. This only changes the content of Ej if Q(U,j) is an encryption of 1.

Combining. After running Request with every Lj and receiving the corresponding Ej, the proxy runs Combine(E1, . . . , Eℓ, skP, pkU) to compute E = E1 ⊕ · · · ⊕ Eℓ.

Decrypting. The user decrypts E using Decrypt(E; skU, pkLȷ):

C(ı,ȷ,v)‖b‖〈k(ı,ȷ,v)〉blind = Dech(E; skU)

k(ı,ȷ,v) = Unblind(〈k(ı,ȷ,v)〉blind; b, pkBȷ)

m(ı,ȷ,v) = Decs(C(ı,ȷ,v); H(k(ı,ȷ,v))) .

Settlement. The proxy P can share the money collected during subscription fairly by counting the number of users that subscribed to service Lj in a given subscription period. However, this has to be done without revealing the users' service usage. First, Lj transfers Sj to the proxy and the privacy trustee. P and T check the signature and the time stamp of each S(U,j) to make sure that Lj does not add self generated subscriptions. Moreover, the proxy checks whether |Sj| corresponds to his subscription counter. This is needed to guarantee that services do not try to shrink the anonymity set of users. As the trustee is not online all the time, he can only check the plausibility of the value. However, privacy savvy users can submit their encrypted subscriptions (or their hashes) to the privacy trustee, who checks whether their subscriptions are considered during settlement.

The proxy, the privacy trustee, and the service Lj jointly compute Lj's share of the payment. First, they compute ⊕U P(U,j). The result is an encryption of the number of users having subscribed to service Lj. Since ⊕U P(U,j) is encrypted with pkSj, all three parties have to participate in a distributed (3, 3)-threshold decryption. The output of Settlement(Lj(Sj; skLj, pkP), P(skP, pkLj), T(skS(T,j))) is the total number of users subscribed to service Lj. As long as not all parties collude, the service usage privacy of the users is guaranteed. See Section 9.3.2 for details.
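The tally ⊕U P(U,j) can be illustrated with the same kind of additively homomorphic scheme. The sketch below uses a toy Paillier cryptosystem, and single-key decryption stands in for the (3, 3)-threshold decryption of the actual protocol (an assumption of this sketch; in the real scheme P, Lj, and T must decrypt jointly).

```python
import math, secrets

# Toy Paillier scheme for the Settlement tally (tiny primes; single-key
# decryption replaces the (3,3)-threshold decryption for illustration).
p, q = 104729, 104723
n, n2 = p * q, (p * q) ** 2
lam, g = math.lcm(p - 1, q - 1), p * q + 1

def enc(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    L = lambda u: (u - 1) // n
    return (L(pow(c, lam, n2)) * pow(L(pow(g, lam, n2)), -1, n)) % n

# Each P_(U,j) encrypts 1 if user U subscribed to L_j and 0 otherwise.
P_Uj = [enc(1), enc(0), enc(1), enc(1), enc(0)]  # five users

# The homomorphic sum (+)_U P_(U,j) is the product of the ciphertexts.
tally = 1
for c in P_Uj:
    tally = (tally * c) % n2

assert dec(tally) == 3  # N_j: three of the five users subscribed to L_j
```

Only the aggregate ever gets decrypted; the individual P(U,j) ciphertexts remain unopened, which is what the trustee enforces in the protocol.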

9.3 Security and Efficiency

9.3.1 Efficiency Analysis

For our efficiency analysis we focus on two main factors. The first is the limitation of computation and communication resources on small mobile devices. The other factor is scalability for the organizations with respect to the location resolution, i.e., the number n of map cells, the number ℓ of services, and the number of users. The costs for the setup of the payment and for the service and proxy key generation are independent of the number of users and map cells. They are executed by unrestricted parties, and are thus ignored in the analysis. A similar argument holds for the settlement. We consider only public key operations and use the length parameters from Section 9.2.

User. The costs incurred by the user are key generation, subscription, and decryption. Computation: Key generation involves the generation of a single RSA key. The decryption requires a single Damgård-Jurik decryption. The most relevant cost for the user is the generation of the subscriptions with 12 exponentiations. However, this cost is incurred only once per subscription period. In principle, the computed values can even be reused for multiple periods. Communication: A Damgård-Jurik ciphertext size is about lH + lN, where lN is the size of the RSA modulus. The overhead in addition to the message length lD is lB + lb + lN. If we use RSA blind signatures, this is three times the size of an RSA modulus. The size of a subscription is about 12ℓ(lH + lN). For small devices and slow communication channels, we suggest to do the key generation and subscription on the user's PC and synchronize skU to the user's mobile device, or to create the keys on the device and move only pkU to the PC for added security.

Organizations. The scalability of our service is nearly optimal. Key setup and database encryption are independent of the number of users. Database encryption is linear in the number of locations. Subscriptions are linear in the number of services—optimal, as all services need to be involved to guarantee service privacy. In practice, data transfer is independent of the number of locations and again only depends on the number of services.2

Computation: The most prominent cost incurred by the service is database encryption, requiring one signature operation per location. Communication: The most dominant cost for L_j and P is the transmission of the encrypted database DB_(j,v), which has length n · l_D. Moreover, these costs occur whenever any of the m_i should be updated. This is a consequence of our strong database security definition. For a wide range of services it appears reasonable to relax these requirements. Updated encryptions C_i are then computed as in EncryptData, but only for a subset U ⊆ {1, . . . , n} of updated locations.

9.3.2 Security Analysis

Our main goal is to implement LBSs without revealing additional information about the user's location and his service usage profiles. An adversary involved in our protocol should learn nothing except what he already could have learned by being involved in a scenario without LBSs. For the proxy this implies that he is allowed to know the user's location but should not learn anything about the user's service usage profiles. For service providers this implies that both the user's location and the user's service usage profiles have to be concealed. Note that in a scenario that includes payment mechanisms we have to deviate slightly from this strong security notion of no additional knowledge, since a service can infer the number of subscribed users from the amount of money received.

As an additional trust assumption we state that the proxy helps to resolve fairness conflicts between users and service providers. This trust assumption is supported by the rationality of the proxy: his core competence is setting up the mobile infrastructure such that services and users can communicate in a fair way. Cheating, or not cooperating in resolving fairness disputes, decreases his reputation and thus his profit. Only in cases where accusing the cheater would endanger the user's service privacy do we make use of the privacy trustee as an additional off-line trusted party. The assumption of a trusted third party to resolve fairness

² De facto the dependence is logarithmic, as locations are included in the message; 4 billion different locations can be encoded using 32 bits.


problems is common in the literature on fair exchange protocols. [ASW97, ASW98] have shown that the problem of fairly exchanging data requires at least an off-line trusted party.

Location privacy. Note that in our protocol service providers only come into contact with location-related information in the data retrieval phase. There, however, the blind-signature-based OT protocol protects location-related information from being revealed unintentionally. The security of the OT scheme is based on the signature's blindness property. This property guarantees that for any malicious service L_j the view of L_j for a message M_0 = i‖v and for a message M_1 = i′‖v′ is computationally indistinguishable. As the user's location ı is hidden in 〈ı‖v〉_blind, location privacy can be reduced to the blindness property of the blind signature scheme used.

Service usage privacy. Achieving service usage privacy is more challenging. The relationship between a user and his service provider plays a role in several protocol stages; it is present during data retrieval and in the payment processes. Moreover, we consider a stronger, but realistic, adversary model and allow for a corrupted proxy, who possibly collaborates with any service. This implies that service usage privacy cannot rely on the help of the proxy. We proceed by analyzing the relevant phases.

In the subscription phase, the user's subscription together with the proof of knowledge could reveal the user's service usage. The semantic security of the underlying homomorphic encryption scheme guarantees that two different subscriptions are indistinguishable. The zero-knowledge property of π_j ensures that no further information is revealed.

During the settlement, privacy is protected by using a joint decryption technique, i.e., a ciphertext can only be decrypted if the three parties P, L_j, and T work together. Hence, even if P and L_j are corrupted, there is no way for the adversary to force the decryption of a single subscription that would reveal the user/service relationship. This only holds as long as it is not possible to present faked subscriptions to the privacy trustee that are later accepted; such subscriptions would reduce the size of the anonymity set |S_j|. The faked-subscription attack reduces to the (n − 1)-attack [Cha81], against which full protection is impossible. However, to make it more difficult for an adversary, it could be made mandatory for users to always submit a subscription (or a hash of it) to the trustee. This technique leads to a lower bound on the size of the anonymity set |S_j|. To some extent the sensitivity to these attacks can be reduced by using authenticated channels, such that subscriptions can be assigned to real users.³

³ A powerful adversary will always find ways to forge subscriptions, even if only by paying real users to cooperate.


Although one cannot assume the honesty of the proxy in general, it often seems more realistic to limit the adversary's power to corrupting only services. This is further supported by our assumption of the proxy's rationality. When the proxy is honest, we have stronger protection against the faking attack: first, because the proxy and the trustee verify the correctness of the signature and the timestamp, and second, because the proxy checks the equality of |S_j| and his subscription counter. Both techniques help to protect against attempts to shrink the anonymity set.

Database secrecy. In contrast to location and service usage privacy, database secrecy protects the interests of the LBS. It guarantees that a user gets no more than the requested information, even if he collaborates with the proxy. The database secrecy of our scheme relies on two aspects: First, the service encrypts its database before sending it to the proxy. Second, as a result of the data retrieval phase the user only gets to know the requested data. This is due to the 'database security' [OK04] of the underlying secure OT protocol.

Fairness. Our system is said to be fair if the interests of the user and of the service are equally protected. With respect to our protocol this means that neither can the user get access to content without paying, nor can the service cheat and receive payment without providing the information.

The fairness of our protocol relies significantly on the rationality, and consequent semi-honesty, of the proxy. Furthermore, we require a trade-off between service privacy and fairness to protect against active attacks by services.

User fairness. From the user's perspective, fairness is accomplished when the user receives the appropriate location information for his payment.

The proxy sends a request to each service provider. The service provider responds with a signed value E_j that is either an encryption of 0 (if the user did not subscribe to L_j) or an encryption of the requested data. Should a service fail to provide any value E_j, the proxy forces it to pay a fee corresponding to the price of a service subscription and passes that money on to the affected user.

Now the proxy computes the user's final value E by combining all responses. The user decrypts E and checks whether it corresponds to his requested data. To facilitate this, the message m_(ı,j,v) contains ı, j, and v and is signed by L_j. If a message is incorrect, the user can file a complaint.⁴

In this case the user has three choices: he can choose full privacy and give up his money; he can file the complaint with the privacy trustee; or he can complain directly to the proxy. Complaints contain a proof of correct decryption of a signed

⁴ Note that a complaining user acts as a decryption oracle. Together with the homomorphic property this can lead to the decryption of arbitrary messages. Consequently, users should not complain about random-looking messages to an untrusted party.


E. The recipient of the complaint verifies the proof and pays the money back. T will ask the proxy for the money given to users during conflict resolution, in return for a list of bad services. These services will receive reduced payment or be sanctioned otherwise. Should the proxy refuse to pay the money to the trustee, this can only be resolved in court.

Our protocol does not protect against denial-of-service attacks in which malicious services send random ciphertexts instead of 〈k_(ı,j,v)〉_blind ⊗ Q_j. However, this can be detected if users are willing to give up their service usage privacy towards the proxy. We again rely on an optimistic strategy and the punishment of attackers upon detection. It is an open problem to propose an efficient data retrieval protocol based on zero-knowledge protocols that solves this problem.

Service fairness. This means that service providers receive fair payment. In particular, a service provider must receive money for every user he serves.

This is ensured by the checks done in VerifySubscription. The algorithm checks that Q_(U,j) and P_(U,j) encode the same value v ∈ {0, 1}. We refer to P_(U,j) as the vote. Obviously, if v = 0, no service is provided. Consider the case v = 1: if (1) all votes are considered and (2) the votes are counted correctly, P_(U,j) increases the subscription counter of L_j by 1. The first property is ensured by the timestamp and signature of the proxy. The second relies on the security of the homomorphic encryption scheme and the security of the distributed decryption against active adversaries.
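The homomorphic vote counting can be sketched with plain single-key Paillier encryption. This is our own illustrative toy with small fixed primes; it omits the distributed/threshold decryption among P, L_j, and T that the protocol actually requires:

```python
import math
import random

# Toy Paillier encryption to illustrate homomorphic vote counting.
# Small fixed primes for demonstration only; real use needs large
# random primes and a distributed decryption among several parties.
p, q = 293, 433                 # tiny demo primes (assumed)
n, n2 = p * q, (p * q) ** 2
g = n + 1                       # standard generator choice g = n + 1
phi = (p - 1) * (q - 1)
mu = pow(phi, -1, n)            # since L(g^phi mod n^2) = phi mod n

def L(u):
    return (u - 1) // n

def enc(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return (L(pow(c, phi, n2)) * mu) % n

# Each vote P_(U,j) encrypts v in {0, 1}; multiplying ciphertexts adds
# the plaintexts, so the product decrypts to L_j's subscription count.
votes = [1, 0, 1, 1, 0, 1]
tally = 1
for v in votes:
    tally = (tally * enc(v)) % n2
count = dec(tally)
```

Multiplying the encrypted votes corresponds to adding the encoded bits, so the decrypted product is exactly the number of v = 1 votes.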

9.4 Conclusion

We introduced the first privacy-preserving LBS framework based on cryptographic techniques, namely oblivious transfer and homomorphic encryption. The privacy of the user is protected by hiding the user's location from the services and by not revealing information on the user/service relationship. Additionally, we presented a system for subscription management, including a fair yet anonymous payment scheme.

We have given strong intuitions for the security properties of our scheme; however, it remains an open challenge to prove them in a formal context.


Chapter 10

Authorized Private Searches on Public Key Encrypted Data

Vast quantities of sensitive personal data are retained for the purpose of network forensics and cyber investigations [eu-06b]. However, the presumption of innocence prevents 'mass surveillance' by requiring judicial guarantees: it is only possible to search with a warrant. Moreover, the eventual societal advantages of data retention must be weighed against the dangers that such data could fall into the wrong hands.

The encryption of retained data is a desirable countermeasure against data theft or abuse. But how, then, can an investigator, such as the police or a secret service, search the data without having to decrypt the whole database? What if the investigator should only be given access to data that fulfills certain criteria? This seems to be a hard problem, as the criteria themselves may be sensitive and thus require protective measures, such as encryption. Moreover, a secret service might be reluctant to reveal the queries it wants to run on the encrypted database.

We consider a scenario in which an investigator searches for data described by multiple keywords without revealing the keywords or the search results to the database server. This scenario is akin to the private searching of streaming data presented in [OS07a]. While in [OS07a] the data is searched as it is generated (and can thereafter be discarded), in our scenario data is first stored in encrypted form and can be searched at a later stage. To provide a high level of security we make use of asymmetric cryptography. The retaining server only possesses the public encryption key (and cannot decrypt the retained data itself). In this way, data that is already encrypted remains secure even against a strong adversary that breaks into the database server. The decryption key is stored by a security server, which



Figure 10.1: Authorized private searches on public key encrypted data: The security server generates keys and distributes the public key (1). An investigator obtains a warrant for a keyword from a judge (2) and can use the warrant to get a search trapdoor from the security server (3). Using the trapdoor the investigator can search the database of retained data (4).

will only be involved when executing search queries. The security server could, for instance, also be operated by the retaining organization, but should be subject to stronger protection.

As the details of the queries made are to be obscured even from the security server, it is necessary to impose some restrictions on the investigator. Thus we introduce some checks and balances to avoid abuse by overzealous or malicious agents. One obvious restriction is on the number of queries that the investigator can make: an unreasonable number of requests may be an indication of abuse. Another restriction that we consider is to involve a judge in granting search warrants to the investigator. The keyword is still hidden, but the security server is guaranteed that a judge (or another trusted entity) has approved the search for a specific keyword. These interactions are schematized in Figure 10.1.

Waters et al. [WBDS04] build an encrypted and searchable audit log. They propose two schemes, one based on symmetric encryption and one based on asymmetric encryption. They conclude that asymmetric encryption provides better security, as it reduces the trust in the encrypting entity. Our work can be seen as an extension of their asymmetric scheme with the possibility to obliviously search the encrypted database. For the symmetric case, in which the audit log server knows all the information needed for decrypting the database, the problem of performing oblivious searches is covered by [CGN98, OK04]. The problem of oblivious searching on public key encrypted data is more difficult.


Our solution. In [WBDS04], the asymmetric searchable encryption scheme is based on identity-based encryption (IBE) [BF01]. The keywords themselves are used to encrypt the database, i.e., they are the identity strings of the IBE scheme. The anonymity property of the Boneh-Franklin IBE scheme [BF01] ensures that a ciphertext does not leak the identity string used to generate the encryption. The security server holds the master secret key that is used to derive the secret keys corresponding to the keywords that are needed for searching. A similar technique for searchable encryption was formalized as public key encryption with keyword search (PEKS) by [BDCOP04]. In PEKS, the derived keys are referred to as search trapdoors, which can be given to third parties to grant them search rights.

When trying to build an oblivious search mechanism for such a database we have to address two difficulties: hiding the keywords from the security server, and hiding the keywords and the search results from the database. For the former, we present two new cryptographic primitives. The first primitive is committed blind anonymous IBE. In this context, anonymous means that the ciphertext does not leak the key (identity) under which it was encrypted [Gen06, BW06], and blind means that a user can request the decryption key for a given identity without the key generation entity learning the identity [GH07]. The work of [GH07] describes how to construct blind key derivation protocols for [BB04a] and [Nac07, CS05], but these schemes are not anonymous. Moreover, it is much harder to derive a blind key derivation protocol for the Boneh-Franklin IBE scheme [BF01] used in [WBDS04], and we are interested in IBE schemes that do not require random oracles for their security proofs. As a corollary to our results, we obtain the first instantiation of the scheme proposed by Waters et al. [WBDS04] that is secure without random oracles.

We design a committed blind anonymous IBE scheme based on the anonymous IBE scheme due to Boyen and Waters [BW06]. As the scheme in [BW06] is only selective-ID secure, we extend it to adaptive-ID security and prove the modified scheme secure. For the modified scheme we design a blind key extraction protocol. This leads to the first blind anonymous IBE scheme we are aware of. We extend the definition of blind IBE to allow for the derivation of a secret key for a committed identity.

The second primitive we present is public key encryption with oblivious keyword search (PEOKS), which we implement using our committed blind anonymous IBE scheme. First, we extend the definition of PEKS to incorporate the encryption of a secret message when computing a searchable encryption. This secret message can contain a symmetric key, which allows PEKS to be used directly in settings such as [WBDS04]. Then we define blind key extraction with committed keywords, which facilitates the use of a policy that states for which keywords a trapdoor can be extracted, while still keeping them hidden from the trapdoor generation entity.

In order to hide the search results from the database, one could in theory download the whole database and then use PEOKS to do the search. This is inefficient.


We describe a data structure that allows the use of private information retrieval (PIR) [CGKS95] to improve the communication efficiency of the search.

Our contribution. We define and construct the first blind anonymous IBE scheme. We generalize PEKS to be usable in settings such as [WBDS04], and we extend it to incorporate the facility to perform oblivious keyword searches (PEOKS). Both our blind anonymous IBE scheme and our PEOKS scheme support committed blind key extraction and thus allow for complex policies. The full power of the cryptographic proof systems in Section 2.5 and the anonymous credential mechanisms in Chapter 3 could be used to prove statements about the keyword in the commitment. Finally, we describe the first public key encrypted database that allows for oblivious searches, i.e., one in which both the keywords and the search results remain hidden.

Outline of the chapter. In Section 10.1 we define committed blind anonymous IBE and PEOKS. We construct a committed blind anonymous IBE scheme and show how to apply it to build a PEOKS scheme in Section 10.2. In Section 10.4, we describe the use of PEOKS to construct a privacy-preserving searchable encrypted database. Finally, Section 10.5 draws a conclusion and discusses future work.

10.1 Definitions of Committed Blind Anonymous IBE and PEOKS

10.1.1 Anonymous Identity-Based Encryption

We recall the definition of identity-based encryption [BF01]. An IBE scheme Π = (IBESetup, IBEExtract, IBEEnc, IBEDec) consists of the algorithms:

IBESetup(1^k) outputs parameters params and master secret key msk.

IBEExtract(params, msk, id) outputs the secret key sk_id for identity id.

IBEEnc(params, id, m) outputs ct encrypting m under id.

IBEDec(params, sk_id, ct) outputs the message m encrypted in ct.

An IBE scheme is anonymous [ABC+05] if it is not possible to associate the identity id used to encrypt a message m with the resulting ciphertext ct (in the context of public key encryption this is also known as key privacy [BBDP01]).

Abdalla et al. [ABC+05] define anonymity through a security game in which the adversary receives a ciphertext encrypted with an identity that is randomly picked


from two identities of his choosing. The adversary has to guess the identity used to encrypt the ciphertext. As in [Gen06], we combine this game with the standard chosen plaintext security game for IBE, in which the adversary needs to guess which of two possible messages was encrypted.¹

Definition 10.1.1 (Secure Anonymous IBE [ABC+05]). Let k be a security parameter. An anonymous IBE scheme Π is secure if every p.p.t. adversary A has an advantage negligible in k in the following game:

Setup. The game runs IBESetup(1^k) to generate (params, msk).

Phase 1. A may query an oracle O_IBEExtract(params, msk, ·) polynomially many times. On input id, the oracle runs IBEExtract(params, msk, id) and returns the associated sk_id.

Challenge. A presents two target identities id_0, id_1, which have not been queried in Phase 1, and two challenge messages m_0, m_1. The simulator selects two random bits b_1 and b_2, and returns the challenge ciphertext ct = IBEEnc(params, id_{b_1}, m_{b_2}) to A.

Phase 2. A may again query the oracle O_IBEExtract(params, msk, ·) polynomially many times on id, provided id is neither id_0 nor id_1.

Guess. A outputs b′_1, b′_2. We define the advantage of A as |Pr[b′_1 = b_1 ∧ b′_2 = b_2] − 1/4|.

10.1.2 Committed Blind Anonymous IBE

In standard IBE schemes, a key generation center KGC executes the key extraction algorithm IBEExtract, which returns the secret key sk_id corresponding to an input identity id. Green and Hohenberger [GH07] propose extracting the secret key in a blinded manner. The blinding obscures the identity from the KGC. We extend this concept by proposing a committed blind anonymous IBE scheme, where the KGC is given a commitment to the requested identity. A user can reveal partial information about the identity, or prove statements about it, using efficient zero-knowledge proofs about commitments [Bou00, Bra97].²
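The commitments used throughout this chapter can be instantiated with Pedersen commitments, Com(id, open_id) = g^id · h^open_id. A minimal sketch over a prime-order subgroup follows; the group parameters and variable names are our own toy choices, and a real instantiation needs cryptographically large groups with independently generated g and h:

```python
import random

# Toy Pedersen commitment Com(v, open) = g^v · h^open mod p in the
# prime-order-q subgroup of Z_p*. Tiny illustrative parameters only.
q = 1019                      # subgroup order (prime)
p = 2 * q + 1                 # p = 2039, also prime, so q divides p - 1
g = pow(2, 2, p)              # squaring maps into the order-q subgroup
h = pow(3, 2, p)              # second generator; its dlog w.r.t. g must be
                              # unknown in a real setup (trivial in this toy)

def commit(value, opening):
    return (pow(g, value % q, p) * pow(h, opening % q, p)) % p

identity = 42
opening = random.randrange(q)
comm = commit(identity, opening)

# Verification, as the KGC would do when given (id, open_id):
ok = comm == commit(identity, opening)
```

The commitment is hiding because the random opening masks the value, and binding because finding two openings for different values would reveal the discrete logarithm of h with respect to g.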

A committed blind anonymous IBE scheme consists of the algorithms Π of an IBE scheme, a secure commitment scheme Com, and the protocol IBEBlindExtract:

IBEBlindExtract(U(params, id, open_id), KGC(params, msk, comm)) generates the secret decryption key sk_id for U's identity id in an interactive key issuing protocol between U and the KGC. If comm = Com(id, open_id), U's output is a decryption

¹ We define an adaptive-identity security game, as this is required by our IBE-to-PEOKS transformation.

² Technically, this can be seen as restricting the blind key derivation queries to a certain language, membership of which is proven in zero-knowledge.


key sk_id and the output of the KGC is empty. Otherwise both parties output ⊥.

Green and Hohenberger [GH07] construct a security argument for blind IBE by defining two properties for the IBEBlindExtract protocol: leak freeness and selective-failure blindness. Leak freeness requires that IBEBlindExtract is a secure two-party computation that does not leak any more information than IBEExtract.³ Selective-failure blindness requires that a potentially malicious authority does not learn anything about the user's identity during the IBEBlindExtract protocol. Additionally, it cannot cause the IBEBlindExtract protocol to selectively fail depending on the user's choice of identity. We provide adapted versions of these properties for committed blind anonymous IBE.

Definition 10.1.2 (Leak Freeness [GH07]). An IBEBlindExtract protocol of an IBE scheme is leak free if, for all efficient adversaries A, there exists an efficient simulator S such that for every value k, no efficient distinguisher E can determine whether it is playing Game Real or Game Ideal with non-negligible advantage, where

Game Real: Run IBESetup(1^k). As many times as E wants, he picks a commitment comm and A's input state. A executes IBEBlindExtract(A(params, state), KGC(params, msk, comm)) with the KGC. A returns the resulting view to E.

Game Ideal: Run IBESetup(1^k). As many times as E wants, he picks a commitment comm and an initial input state. S obtains (params, state) and may choose values id, open_id to query an oracle O_IBEExtract that knows msk and is parametrized with comm. If comm = Com(id, open_id), the oracle returns the key sk_id ← IBEExtract(params, msk, id), otherwise ⊥. S returns a simulated view to E.

Definition 10.1.3 (Selective-Failure Blindness [CNS07]). An IBEBlindExtract protocol is said to be selective-failure blind if every adversary A has a negligible advantage in the following game: A outputs params and a pair of identities id_0, id_1. A random bit b ∈ {0, 1} is chosen, and A is given two fresh commitments comm_b, comm_{1−b} to id_b and id_{1−b}, and black-box access to two oracles U(params, id_b, open_{id_b}) and U(params, id_{1−b}, open_{id_{1−b}}). The algorithms produce sk_b and sk_{1−b}, respectively. If both sk_b ≠ ⊥ and sk_{1−b} ≠ ⊥, A receives (sk_0, sk_1); if only sk_{1−b} = ⊥, (ε, ⊥); if only sk_b = ⊥, (⊥, ε); and if sk_b = sk_{1−b} = ⊥, A receives (⊥, ⊥). Finally, A outputs his guess b′ (for b). The advantage of A in this game is |Pr[b′ = b] − 1/2|.

³ It also implies that the user is required 'to know' the id for which she needs a key to be extracted. We also require that she knows the opening of the commitment.


Following [GH07] we define a secure committed blind anonymous IBE as follows.

Definition 10.1.4 (Secure Committed Blind Anonymous IBE). A committed blind anonymous IBE scheme (Π, IBEBlindExtract, Com) is secure if and only if: (1) the underlying Π is a secure anonymous IBE scheme, (2) Com is a secure commitment scheme, and (3) IBEBlindExtract is leak free and selective-failure blind.

10.1.3 Public Key Encryption with Oblivious Keyword Search

We recall and extend the definition of PEKS [BDCOP04]. A PEKS scheme Υ = (KeyGen, PEKS, Trapdoor, Test) consists of the algorithms:

KeyGen(1^k) outputs a public key A_pub and secret key A_priv.

PEKS(A_pub, W, m) outputs a searchable encryption S_W of m under keyword W.

Trapdoor(A_pub, A_priv, W) outputs a trapdoor T_W that allows searching for the keyword W.

Test(A_pub, S_W, T_W′) outputs the message m encoded in S_W if W = W′; otherwise it outputs ⊥.

The definition of Υ extends the standard definition of a PEKS scheme in [BDCOP04] by encoding a secret m into the PEKS element S_W generated by the PEKS algorithm. Test outputs this secret when a match occurs.

A secure PEKS scheme must be chosen plaintext attack (CPA) secure and consistent [ABC+05]. CPA security requires that an attacker cannot distinguish two PEKS elements generated for keywords and messages of his choice, even if given oracle access to Trapdoor for other keywords. Consistency requires that if the searchable encryption and the trapdoor are computed using different keywords, then algorithm Test outputs ⊥ on such input.
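To make the KeyGen/PEKS/Trapdoor/Test control flow concrete, here is a toy stand-in of our own. It is HMAC-based and symmetric-key, so it is emphatically not the public key construction of this chapter and not a secure searchable encryption; it only mirrors the interface, including the consistency behaviour where Test recovers m on a keyword match and returns ⊥ (None) otherwise:

```python
import hashlib
import hmac
import os

# Toy interface demo only: the "trapdoor" is a PRF value, and a
# "searchable encryption" stores a tag plus the message (up to 32 bytes)
# XORed with a keystream derived from the trapdoor. The encryptor here
# knows the master key, unlike in PEKS/PEOKS.

def keygen():
    return os.urandom(32)                      # master secret (no public key in this toy)

def trapdoor(key, W):
    return hmac.new(key, W.encode(), hashlib.sha256).digest()

def peks_enc(key, W, m):
    T = trapdoor(key, W)
    nonce = os.urandom(16)
    tag = hmac.new(T, b"tag" + nonce, hashlib.sha256).digest()
    stream = hmac.new(T, b"enc" + nonce, hashlib.sha256).digest()
    ct = bytes(a ^ b for a, b in zip(m, stream))
    return (nonce, tag, ct)

def peks_test(S, T):
    nonce, tag, ct = S
    expected = hmac.new(T, b"tag" + nonce, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None                            # keywords differ: output ⊥
    stream = hmac.new(T, b"enc" + nonce, hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(ct, stream))
```

For example, peks_test on an element created under the same keyword returns the embedded secret (e.g. a symmetric key, as in the [WBDS04] setting), while any other trapdoor yields None.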

In PEKS, the party holding the secret key A_priv runs the Trapdoor algorithm to obtain the trapdoor T_W for a keyword W. Public key encryption with oblivious keyword search (PEOKS) is an extension of PEKS in which a user U performing a search can obtain the trapdoor T_W from the trapdoor generation entity TGE in a committed and blinded manner. The TGE only learns a commitment to the search term.

A PEOKS scheme consists of the algorithms Υ of a PEKS scheme, a secure commitment scheme Com used to commit to keywords, and the following BlindTrapdoor protocol:


BlindTrapdoor(U(A_pub, W, open_W), TGE(A_pub, A_priv, comm)) generates the trapdoor T_W for a keyword W in an interactive protocol between U and TGE. If comm = Com(W, open_W), U's output is the trapdoor T_W and the output of TGE is empty. Otherwise both parties output ⊥.

Leak freeness and selective-failure blindness can be defined for BlindTrapdoor following the definitions for IBEBlindExtract.⁴ We define the security of PEOKS similarly to that of a committed blind anonymous IBE scheme, assuming a secure underlying PEKS scheme.

Definition 10.1.5 (Secure PEOKS). A PEOKS scheme (Υ, BlindTrapdoor, Com) is secure if and only if: (1) the underlying Υ is a secure PEKS scheme, (2) Com is a secure commitment scheme, and (3) BlindTrapdoor is leak free and selective-failure blind.

10.2 Construction of a Committed Blind Anonymous IBE Scheme

10.2.1 The Underlying Anonymous IBE Scheme

We present an anonymous IBE scheme that is adaptive-identity secure in the standard model, based on the anonymous IBE scheme proposed by Boyen and Waters [BW06]. The Boyen-Waters scheme is selective-identity secure. We use a transformation due to Naccache [Nac07], a variant of that of Waters [Wat05], to achieve the required adaptive-identity security. The use of such a transformation was proposed by Boyen and Waters [BW06]; we provide what we believe to be the first proof of security for this variant. Our scheme supports asymmetric bilinear pairings, allowing the use of a wider range of potentially more efficient implementations using different pairing types [GPS08]. Let identity id ∈ {0, 1}^{ℓ·n} and let id_1‖. . .‖id_n = id be the separation of id into ℓ-bit integers id_i. Let H_1(id) = g_0 · ∏_{i=1}^{n} g_i^{id_i} and H_2(id) = h_0 · ∏_{i=1}^{n} h_i^{id_i}. The functions H_1 and H_2 are referred to as Waters hash functions [Wat05]. We require two such functions, as we describe our scheme for an asymmetric pairing e : G1 × G2 → GT. A Waters hash function is often used in the construction of cryptographic primitives that do not make use of a random oracle in their proof of security. Our anonymous IBE scheme Π = (IBESetup, IBEExtract, IBEEnc, IBEDec) consists of the following algorithms:
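Computationally, a Waters hash is just a multi-exponentiation over the ℓ-bit chunks of id. A sketch over a toy group Z_p^* (our own illustrative parameters; the scheme itself works in the pairing groups G1 and G2):

```python
import random

# Waters hash H(id) = g0 · ∏ g_i^{id_i} mod p over ℓ-bit chunks of id.
# Toy prime modulus and pseudo-random generators for illustration only.
p = 2**61 - 1                 # Mersenne prime, toy stand-in for a group order
ell, n_chunks = 8, 4          # id is split into n integers of ℓ bits each
rng = random.Random(0)
gens = [rng.randrange(2, p) for _ in range(n_chunks + 1)]  # g0, g1, ..., gn

def waters_hash(identity: int) -> int:
    acc = gens[0]
    for i in range(n_chunks):
        id_i = (identity >> (ell * i)) & ((1 << ell) - 1)  # i-th ℓ-bit chunk
        acc = (acc * pow(gens[i + 1], id_i, p)) % p
    return acc
```

The function is deterministic in the generators, and distinct single-chunk identities map to distinct group elements unless a generator happens to equal 1, which the sampling above excludes.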

⁴ The inputs are mapped one-to-one: KeyGen is used instead of IBESetup and Trapdoor instead of IBEExtract.


IBESetup(1^k). Run BMGen(1^k) to obtain a bilinear map setup (p, G1, G2, GT, e, g, h). Choose values α, z_0, z_1, . . . , z_n, t_1, t_2, t_3, t_4 ← Z_p^* and keep msk = (α, t_1, t_2, t_3, t_4) as the master key. Compute the system parameters as

params = (Ω = e(g, h)^{t_1 t_2 α}, g, h, g_0 = g^{z_0}, . . . , g_n = g^{z_n}, v_1 = g^{t_1}, . . . , v_4 = g^{t_4}, h_0 = h^{z_0}, . . . , h_n = h^{z_n}).

IBEExtract(params, msk, id). Choose two random values r_1, r_2 ← Z_p^* and compute the key

sk_id = (h^{r_1 t_1 t_2 + r_2 t_3 t_4}, h^{−α t_2} H_2(id)^{−r_1 t_2}, h^{−α t_1} H_2(id)^{−r_1 t_1}, H_2(id)^{−r_2 t_4}, H_2(id)^{−r_2 t_3}).

IBEEnc(params, id, m). To encrypt a message m ∈ GT, choose s, s_1, s_2 ← Z_p, and generate the ciphertext

ct = (Ω^s · m, H_1(id)^s, v_1^{s−s_1}, v_2^{s_1}, v_3^{s−s_2}, v_4^{s_2}).

IBEDec(params, sk_id, ct). Parse sk_id as (d_0, d_1, d_2, d_3, d_4) and ct as (c′, c_0, c_1, c_2, c_3, c_4) and return

m = c′ · e(c_0, d_0) · e(c_1, d_1) · e(c_2, d_2) · e(c_3, d_3) · e(c_4, d_4).

Theorem 10.2.1. The scheme Π is a secure anonymous IBE under the DBDH and DLIN assumptions. See Appendix A.8.1 for a proof.

10.2.2 Blind Extraction Protocol

We introduce an interactive blind key extraction protocol IBEBlindExtract, which extends the algorithm IBEExtract.

Intuition behind our construction. Generating a randomly distributed secret key by means of the IBEBlindExtract protocol requires the values r_1, r_2 to be jointly chosen by the user and the key issuer in a manner that prevents either party from learning anything about the other's randomness. This prevents a user who learns the issuer's randomness from potentially decrypting messages of other users, and


an issuer who learns a user's randomness from potentially breaking the blindness of the key issued.

The key issuer, KGC, chooses random values r1, r2 ← Z*p, and the user U picks random values r′1, r′2 ← Z*p. The key generation protocol may be implemented using standard secure two-party computation techniques [Yao82], as a protocol in which the user inputs r′1, r′2 and the KGC inputs α, t1, t2, t3, t4, r1, r2. The user's output in the protocol is a secret key

skid = (h^(r1·t1·t2 + r2·t3·t4), h^(−α·t2)·H2(id)^(−r1·t2), h^(−α·t1)·H2(id)^(−r1·t1), H2(id)^(−r2·t4), H2(id)^(−r2·t3))

with r1 = r1·r′1 and r2 = r2·r′2 (the products of the issuer's and the user's contributions). The KGC learns nothing further, and outputs nothing. By decomposing this protocol into sub-protocols whose results only require simple arithmetic operations (addition and multiplication), we obtain an efficient protocol.

Construction. Our committed blind anonymous IBE scheme consists of the algorithms Π of the underlying IBE scheme, the Pedersen commitment scheme Com, and the following IBEBlindExtract protocol:

IBEBlindExtract(U(params, id, openid) ↔ KGC(params, msk, comm)).

1. KGC chooses at random r1, r2 ← Z*p, and the user U chooses at random u0, u1, u2 ← Zp and u3, r′1, r′2 ← Z*p. Implicitly, r1 = r1·r′1 and r2 = r2·r′2.

U computes commu3 = Com(u3, openu3), and KGC computes commr1 = Com(r1, openr1) and commr2 = Com(r2, openr2). KGC and U make use of a two-party protocol for simple arithmetic modulo p (parametrized by commu3, commr1, and commr2). U inputs u0, u1, u2, u3, openu3, r′1, r′2 and KGC inputs α, t1, t2, t3, t4, r1, openr1, r2, openr2, openx0, openx1, openx2. If commu3 = Com(u3, openu3), commr1 = Com(r1, openr1), and commr2 = Com(r2, openr2), the output of KGC is

x0 = (r1·r′1·t1·t2 + r2·r′2·t3·t4) + u0 (mod p),

x1 = −(u3/r′1 · α·t2) + u1 (mod p),

x2 = −(u3/r′1 · α·t1) + u2 (mod p).

Provided that KGC does not abort at that moment, U obtains commx0 = Com(x0, openx0), commx1 = Com(x1, openx1), and commx2 = Com(x2, openx2) as output. Otherwise, both parties output ⊥. In Section 2.4.5 we show how to efficiently realize such a protocol.

2. U computes ID′ = H2(id)^(u3), where u3 is a blinding value, and sends ID′ to KGC. U proves that the identity in ID′ corresponds to commid and that ID′ is well-formed using commu3. KGC returns ⊥ if the proof fails.


3. KGC computes

skid′ = (h^(x0), h^(x1)·ID′^(−r1·t2), h^(x2)·ID′^(−r1·t1), ID′^(−r2·t4), ID′^(−r2·t3)).

4. KGC sends the blinded key skid′ = (d′0, d′1, d′2, d′3, d′4) to U, and engages in a zero-knowledge proof of knowledge that it is correctly constructed. The proof assures U that KGC's values r1, r2, openr1, openr2, t1, t2, t3, t4, x0, x1, x2, openx0, openx1, openx2 correspond to skid′ and to the commitments commr1, commr2, commx0, commx1, and commx2. If the proof fails, U returns ⊥. Otherwise, she computes

skid = (d0, d1, d2, d3, d4) = (d′0·h^(−u0), (d′1·h^(−u1))^(r′1/u3), (d′2·h^(−u2))^(r′1/u3), d′3^(r′2/u3), d′4^(r′2/u3)).

Theorem 10.2.2 The interactive IBEBlindExtract protocol is a leak-free and selective-failure blind committed blind extraction protocol for the adaptive-identity secure anonymous IBE scheme.

Proof sketch. Leak freeness: Note that the simulator S can rewind an instance of the adversary A that he runs internally. He simulates the communication between the distinguisher E and A by passing E's input to A and A's output to E. In the two-party protocol, S can provide random input. Using rewinding techniques, S extracts A's input r′1, r′2 and u0, u1, u2, u3 to the two-party computation protocol. In the next step of the blind issuing protocol, A must send ID′ = H2(id)^(u3) together with a proof of knowledge of a correct representation of ID′ and commid. S uses its rewinding access to A in order to also extract id.

Next, S submits id to OIBEExtract to obtain a valid secret key skid = (d0, d1, d2, d3, d4). S returns (d0·h^(u0), d1^(u3/r′1)·h^(u1), d2^(u3/r′1)·h^(u2), d3^(u3/r′2), d4^(u3/r′2)) to A. These values are distributed in the same way as in IBEBlindExtract.

Selective-failure blindness: A provides params and two identities id0, id1. The game chooses a random bit b. A is given commitments commb = Com(idb, openb) and comm1−b = Com(id1−b, open1−b). A has black-box access to two oracles U(params, id1−b, open1−b) and U(params, idb, openb).

Note that once an oracle U is activated, A can run a two-party protocol with the oracle, the results of which are three randomly distributed values x0, x1, x2 in Zp. In the next step, the oracle provides a randomly distributed value ID′ in G2 to A. Then the oracle performs a zero-knowledge proof with A.

Suppose that A runs one or both of the oracles up to this point. Up to now, the distributions of the two oracles are computationally indistinguishable. (Otherwise we could break the security of the two-party computation, the hiding property of the commitment scheme, or the witness indistinguishability of the zero-knowledge proof. The latter is implied by the zero-knowledge property of the proof system.)


A must provide values (d′0, d′1, d′2, d′3, d′4) and a proof that these values were correctly computed. We can assume that A chooses these values using an arbitrarily complex strategy. We show that any adversary A can predict the output ski of U without further interaction with the oracles:

1. A does the proof of Step 4 internally with itself. If the proof fails, it records sk0 = ⊥. Otherwise, the adversary temporarily records sk0 = IBEExtract(params, msk, id0).

2. In turn, A generates different (d′0, d′1, d′2, d′3, d′4) and executes a second proof of knowledge (again internally), now for the second oracle. It performs the same checks and recordings for sk1 and id1.

3. Finally, the adversary predicts (sk0, sk1) if both sk0 ≠ ⊥ and sk1 ≠ ⊥; (ε, ⊥) if only sk1 = ⊥; (⊥, ε) if only sk0 = ⊥; and (⊥, ⊥) if sk0 = sk1 = ⊥.

These predictions result in the same distribution as that returned by the oracles, as the same checks are performed. Moreover, note that in the case that keys are returned by the game, they are in both cases equally distributed random keys because of the random values r′1 and r′2 contributed by the oracles.

For a formal proof based on sequences of indistinguishable games see Appendix A.8.2.

10.3 Transformation to PEOKS

We construct a suitable PEKS scheme for our application scenario using the anonymous IBE scheme presented in Section 10.2.1. We follow a generic transformation by Abdalla et al. [ABC+05] from IBE to PEKS. The transformation takes as input the algorithms Π of a secure IBE scheme and returns a PEKS scheme Υ = (KeyGen, PEKS, Trapdoor, Test). Note that our scheme differs from preexisting schemes in that Test returns a secret message in case of a match:

KeyGen(1k) runs algorithm IBESetup(1k) and returns the key pair (Apub, Apriv) = (params, msk) of the IBE scheme.

PEKS(Apub, W, m) takes as input public key Apub, keyword W, and message m. It outputs a searchable encryption SW of message m under keyword W as follows:

1. Generate a random value C2 ∈ {0, 1}^k.
2. Compute C1 = IBEEnc(Apub, W, m‖C2).


3. Output the tuple SW = (C1, C2).

Trapdoor(Apub, Apriv, W) outputs a trapdoor TW = IBEExtract(Apub, Apriv, W) that enables a search for the keyword W.

Test(Apub, SW, TW′) parses SW as (C1, C2) and computes M = IBEDec(Apub, TW′, C1). If M = m‖C2, it outputs the message m encoded in SW; if there is no match, it outputs ⊥.

In order to achieve the oblivious property of our PEOKS scheme, we extend the Trapdoor algorithm to a BlindTrapdoor protocol. Our PEOKS scheme is thus composed of the algorithms Υ of the PEKS scheme, a secure commitment scheme Com, and a BlindTrapdoor protocol where

BlindTrapdoor(U(Apub, W, openW) ↔ T GE(Apub, Apriv, comm)) generates a trapdoor TW for a keyword W by running protocol IBEBlindExtract(U(Apub, W, openW) ↔ KGC(Apub, Apriv, comm)).

10.4 Authorized Private Searches on Public Key Encrypted Data

We describe a public key encrypted database that enables oblivious searches. Our construction is similar to the audit log presented in [WBDS04]. Each data record is encrypted using a fresh random symmetric key and associated with several searchable encryptions. Each searchable encryption is generated on input a keyword that describes the content of the record and a secret message that contains the symmetric key. Once an investigator obtains a trapdoor that matches a searchable encryption (i.e., both were computed on input the same keyword), she is returned the symmetric key that allows her to decrypt the record.

In constructing authorized private searches, we ensure that neither the keywords of interest to the investigator nor the search results are revealed to the retaining organization. For the first property, we employ the PEOKS scheme. The investigator runs protocol BlindTrapdoor with the trapdoor generation entity (T GE) in order to retrieve a trapdoor for a committed keyword in a blind manner. The committed blind extraction allows the T GE to construct policies detailing the data that a particular investigator can obtain. To enforce these restrictions, the T GE requires the investigator to prove in zero-knowledge that the keyword used to compute the commitment belongs to a certain language. We consider a party (such as a judge) in charge of deciding which keywords the investigator is allowed to use. We describe how the investigator obtains a search warrant from the judge and how she uses it with the T GE. The judge and the T GE are only involved in providing search warrants and trapdoors, respectively, and can remain off-line when not required to perform these tasks.

To obscure the search results, we describe a data structure that allows the use of a PIR scheme and that integrates concepts from [CGKO06] to improve the efficiency of the searches.5 Since the PIR queries are made over encrypted data, we also ensure that the investigator does not obtain any information about data described by keywords for which she was not authorized to retrieve a trapdoor. It should also be noted that, due to the public key setting, the database only stores the public key of the PEOKS scheme. Thus, in the event that it gets corrupted, records encrypted prior to corruption remain secure (forward secrecy).

Details on data storage. We describe a data structure in which only one searchable encryption per keyword is computed, while still allowing each data record to be described by several keywords. Once the investigator finds the searchable encryption that matches her trapdoor, she receives the information needed to decrypt all the data records described by the corresponding keyword. This mechanism of data storage allows for an efficient search (not all the searchable encryptions need to be tested) and is privacy enhancing insofar as it hides from the investigator the number of keywords that describe a record.

We use encrypted linked lists and store the encrypted nodes at random positions in the PIR database to hide which node belongs to which linked list, as introduced in [CGKO06]. We construct one linked list per keyword. Each node in the linked list contains the information required to retrieve and decrypt one record associated with the keyword. A node contains a PIR query index PR for the data record and the key KR used to encrypt the record. It also stores a PIR query index to the next node in the list, and the key used to encrypt it. To encrypt the nodes and the records of data, we employ a symmetric encryption algorithm Enc.

When the data holder adds a keyword W for which no searchable encryption has previously been computed, he chooses a symmetric key KN1 and runs algorithm PEKS(Apub, W, KN1‖PN1) to compute the searchable encryption. PN1 is the PIR query index to the first node of the list and KN1 is the symmetric key used to encrypt this node. He then builds the node N1 = (PR, KR, PN2, KN2), computes Enc(KN1, N1), and stores the node in the position given by PN1. Finally, he deletes PN1 and KN1 from his memory but keeps the values PN2 and KN2, the PIR query index and the key for the next node in the list. In position PN2 a flag is stored to indicate the end of the list.

5 The amount of PIR queries may give some indication about the number of records retrieved. This information can be hidden through dummy transactions, up to an upper limit on the number of matching records.


When the data holder chooses this keyword to describe another record R′, he builds the second node N2 = (PR′, KR′, PN3, KN3), runs Enc(KN2, N2), and stores the encrypted node in the position given by PN2. He deletes PN2 and KN2 from his memory but keeps PN3 and KN3 to facilitate adding another node to the list. He also stores the end-of-list flag in PN3. This iterative procedure is applied as many times as required.

If a data record is described by several keywords, one node per keyword is generated and stored in its corresponding linked list. All these nodes contain the same PIR query index to the data record and the same key used to encrypt the record.
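The storage procedure above can be sketched as follows. This is an illustrative model: an HMAC-based stream cipher stands in for the symmetric scheme Enc, array indices stand in for PIR query indices (the PIR layer itself is omitted), and names such as DataHolder and free_slot are ours.

```python
# Per-keyword encrypted linked lists at random database positions, as in the
# construction based on [CGKO06]. Illustrative sketch only.
import hmac, hashlib, secrets, struct

KEYLEN = 16

def _pad(key, n):  # HMAC-based stream; stand-in for the symmetric scheme Enc
    out, ctr = b"", 0
    while len(out) < n:
        out += hmac.new(key, ctr.to_bytes(4, "big"), hashlib.sha256).digest()
        ctr += 1
    return out[:n]

def enc(key, m):
    return bytes(a ^ b for a, b in zip(m, _pad(key, len(m))))

dec = enc  # XOR stream cipher: decryption equals encryption

class Database:
    """Flat array of slots; a slot index models a PIR query index."""
    def __init__(self, size):
        self.slots = [None] * size
    def free_slot(self):
        while True:  # random position, hiding which list a node belongs to
            pos = secrets.randbelow(len(self.slots))
            if self.slots[pos] is None:
                return pos

def pack_node(p_rec, k_rec, p_next, k_next):
    return struct.pack("!I16sI16s", p_rec, k_rec, p_next, k_next)

class DataHolder:
    def __init__(self, db):
        self.db = db
        self.tails = {}  # keyword -> (position, key) reserved for its next node

    def add(self, keyword, record):
        # encrypt the record under a fresh key and store it
        k_rec, p_rec = secrets.token_bytes(KEYLEN), self.db.free_slot()
        self.db.slots[p_rec] = enc(k_rec, record)
        # reserve the slot for the *next* node and flag it as end-of-list
        p_next, k_next = self.db.free_slot(), secrets.token_bytes(KEYLEN)
        self.db.slots[p_next] = b"END"
        head = None
        if keyword not in self.tails:
            # first record for W: (K_N1, P_N1) is what PEKS(Apub, W, K_N1||P_N1)
            # would carry inside the searchable encryption
            p_node, k_node = self.db.free_slot(), secrets.token_bytes(KEYLEN)
            head = (k_node, p_node)
        else:
            p_node, k_node = self.tails[keyword]
        self.db.slots[p_node] = enc(k_node, pack_node(p_rec, k_rec, p_next, k_next))
        self.tails[keyword] = (p_next, k_next)
        return head  # non-None only when a new searchable encryption is needed

def search(db, head):
    """Traverse one keyword's list given the (key, position) pair from Test."""
    k_node, p_node = head
    records = []
    while db.slots[p_node] != b"END":
        p_rec, k_rec, p_node, k_node = struct.unpack(
            "!I16sI16s", dec(k_node, db.slots[p_node]))
        records.append(dec(k_rec, db.slots[p_rec]))
    return records

# usage: only the first add for a keyword yields a head for a new PEKS element
db = Database(64)
holder = DataHolder(db)
head = holder.add("fraud", b"record A")
assert holder.add("fraud", b"record B") is None
assert search(db, head) == [b"record A", b"record B"]
```

The sketch mirrors the text: each add deletes its knowledge of the written node's position and key, keeping only the reserved pointer for the next node, and traversal needs exactly the pair that a matching Test reveals.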

Authorizing and performing private searches. An investigator who wants to search the encrypted database follows this procedure:

1. The investigator requests authorization from the judge to perform a search on a given database for a particular keyword W. Assuming the investigator holds the relevant credentials, the judge grants a warrant. In practice, this means that the investigator runs a protocol GetCredential with the judge, which returns to the investigator a credential cred with attribute W.

2. The investigator requests a trapdoor from the T GE. This is a three-step process:
(a) The investigator has a commitment comm = Commit(W, openW) to the keyword W for which she wants to receive a trapdoor, and sends comm to the T GE.
(b) The investigator and the T GE run an interactive protocol ShowCredential. This verifies the validity of the credential presented by the investigator and the claim that the keyword used to compute the commitment is the same as the keyword contained in the credential's attributes.
(c) The investigator and the T GE execute the BlindTrapdoor protocol, with investigator input Apub, W, openW and T GE input Apub, Apriv, comm. The protocol returns no output to the T GE, and a trapdoor TW to the investigator.

3. The investigator downloads the list of PEKS elements for all the keywords.

4. If the investigator performs a successful Test on a PEKS element (using the correct trapdoor), the algorithm returns the key and PIR query index pair that correspond to the first node of the list. The investigator uses the PIR scheme to retrieve the node and the first record. As above, each node provides sufficient information to link to the next node, until all data related to the keyword have been returned.
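The commitment comm = Commit(W, openW) used in step 2 can be instantiated with a Pedersen commitment, as in our construction. A toy sketch follows; the parameters are far too small for real use, the keyword encoding is our own illustrative choice, and in practice h must be generated so that log_g(h) is unknown.

```python
# Toy Pedersen commitment: Com(W, open_W) = g^W * h^open_W in the order-q
# subgroup of Z_p^*, with p = 2q + 1 a safe prime. Illustrative only.
import hashlib, secrets

q = 509        # subgroup order (prime)
p = 2 * q + 1  # 1019, a safe prime
g, h = 4, 9    # squares mod p, hence generators of the order-q subgroup

def keyword_to_scalar(word):
    """Map a keyword string into Z_q (an illustrative encoding choice)."""
    return int.from_bytes(hashlib.sha256(word.encode()).digest(), "big") % q

def commit(w, opening=None):
    if opening is None:
        opening = secrets.randbelow(q)  # open_W
    return (pow(g, w, p) * pow(h, opening, p)) % p, opening

def verify(comm, w, opening):
    return comm == (pow(g, w, p) * pow(h, opening, p)) % p

# usage: commit to a keyword, later open it toward the TGE
w = keyword_to_scalar("fraud")
comm, open_w = commit(w)
assert verify(comm, w, open_w)
assert not verify(comm, (w + 1) % q, open_w)
```

The hiding property (a commitment reveals nothing about W) and the binding property (the investigator cannot open comm to a different keyword) are exactly what the ShowCredential and BlindTrapdoor steps rely on.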

Remark 10.4.1 GetCredential and ShowCredential can be implemented using conventional signatures: during GetCredential the judge signs comm = Commit(W, openW) to create the credential cred (a signature on comm); in the ShowCredential protocol the investigator sends cred together with comm, and the T GE verifies the signature. More sophisticated credential protocols, such as those described in Part I, allow for the implementation of more complex policies, such as the time-restricted searches described below.

Time-restricted searches. The notion of temporary keyword search, as proposed by Abdalla et al. [ABC+05], uses searchable encryptions and trapdoors that are related to a specific time period in such a way that, even if the keyword used to compute them is the same, they do not match if the time period is different. The simplest way to build PEKS with temporary keyword search is to concatenate keywords W and time periods t when computing searchable encryptions and trapdoors. When applying this solution to our database, multiple linked lists are generated for the same keyword, one for each time period it is concatenated with.

This functionality is useful for searches in which the investigator is allowed to obtain all records described by a specific keyword that were stored within a restricted period of time. In this case, the credential issued by the judge is extended to contain two additional attributes t1 and t2, corresponding to a start timestamp and an end timestamp which limit the period of an investigation. When showing the credential to the T GE, the investigator computes a commitment to W‖t and also proves that t1 ≤ t ≤ t2. This can, for instance, be done using the techniques described in [Bou00].

10.5 Conclusion

We have defined and implemented a searchable encryption scheme, PEOKS, that allows for oblivious searches on public key encrypted data. For this purpose, we have extended the PEKS primitive by adding blind trapdoor extraction with committed keywords. In order to implement PEOKS, we have defined committed blind anonymous IBE and provided a construction of such a scheme. Finally, we applied PEOKS to build a public key encrypted database that permits authorized private searches.

More efficient anonymous identity-based encryption schemes with more lightweight key derivation protocols would translate directly into highly efficient PEOKS. We also observe that in a practical application it is likely that an investigator would want to search for data described by a predicate formed by conjunctions and disjunctions of keywords. Future work would focus on using attribute-hiding predicate encryption [KSW08] to build a scheme that permits oblivious searches on encrypted data by specifying predicates over keywords.


Chapter 11

Conclusions and Future Research

11.1 Conclusions

Identity management and privacy are broad and vague concepts. Even the term 'protocol' does not restrict one too much when doing cryptographic and security-related research. Nevertheless, we hope that the selection of topics chosen for this thesis, as well as their arrangement, communicates a coherent vision for cryptographic research on identity management and privacy.

Cryptography can improve privacy by minimizing the amount of personal data that is revealed during the provisioning of a service. At the same time, we show how to address three identity management concerns: authorization, service personalization, and accountability. We show that these concerns can be reconciled with data minimization.

To this end we presented several protocols for data minimization. We have divided these protocols into two classes, which were presented in separate parts of the thesis. The first are protocols for the anonymous release of certified data. Here it is anonymity that contributes to data minimization, as anonymity guarantees that separate information snippets about the user cannot be recombined into a complete picture of his online activities. The features of such protocols, which are primarily based on digital signatures and zero-knowledge proofs, allow the implementation of privacy-friendly authorization and accountability mechanisms. There is, however, still a gap that needs to be bridged with respect to service personalization, as the data used for personalization can be privacy sensitive in its own right. We have addressed this problem using our second protocol class, which is concerned with private access to data. Here we make use of two-party computation protocols, such as private information retrieval, oblivious transfer, and private searching.

We believe that cryptographic protocols such as the ones described in this thesis should be deployed in next-generation user-centric identity management systems. On the one hand, it is uncertain whether a free market will adopt strong privacy-enhancing identity management systems. On the other hand, technical expertise in privacy protection is likely to become a competitive advantage for companies such as IBM and Microsoft.1 This environment of high risk but also high reward (both economically and socially) is ideal for collaboration between the business and academic worlds. Not surprisingly, the work in this thesis was supported by European research projects. The European Commission sees "[investment in privacy-enhancing technologies as] key to placing European industry ahead in a sector that will grow as these technologies become increasingly required by technological standards and by consumers more aware of the need to protect their rights in cyberspace."2

Insights for protocol designers. Individually, all content chapters of this thesis solve some interesting cryptographic problem. As a whole, they contain several insights for the designer of cryptographic privacy protocols. In research, the focus very often is on doing something new. In cryptography this can mean functionalities that run counter to common sense or that solve long outstanding problems. Such a focus on 'magical results' neglects other characteristics such as efficiency, security, modularity, and integration. We summarize our contributions and lessons learned for all five aspects.

Efficiency: Theoretical cryptography has many famous possibility results. The early works on multi-party computation protocols and zero-knowledge proofs showed that almost everything is possible. As the protocols used for these possibility results were often highly inefficient, this resulted in a growing divide between cryptographic theory and cryptographic practice. Much of cryptographic research is concerned with bridging this divide. Our work on periodic n-times show credentials (cf. Chapter 6) achieves a huge efficiency speed-up over previous work and is immediately practical. Other work, such as our work on P-signatures (cf. Chapter 4) and compact e-cash (cf. Chapter 5), is close to practical, even while avoiding idealized assumptions. Our work on oblivious maps (cf. Chapter 9) and authorized private searches (cf. Chapter 10) contains ideas on how to efficiently and securely distribute and delegate trust among multiple parties. The goals of these protocols are more specific than general multi-party computation and are closer to engineering than to theory. Finally, our work on delegatable anonymous credentials (cf. Chapter 7) brings something that was purely theory closer to practice.

1 Peter Cullen on Microsoft privacy strategy: http://redmonk.com/jgovernor/?p=1433
2 http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:52007DC0228:EN:NOT

Security: Reductionist security (sometimes also called provable security) can improve the quality of protocol design. Without a security proof, the security of a protocol rests on the assumption (a leap of faith) that the protocol will withstand attacks and will behave as expected in all present and future deployments. This leap of faith is often made with little knowledge about which types of attacks and deployment scenarios to expect. Security definitions are a formalization of the protection that a protocol should offer and the types of attacks that it should withstand. A security proof shows that an attacker who cannot solve certain hard complexity-theoretic problems cannot break the security of the scheme as captured by its definition. Security proofs can vary in detail from a sanity check of the protocol to a concrete analysis that describes the exact relation between the security of the protocol and the hard problems. The schemes in Chapters 4-7 have formal security proofs that do not make use of random oracles. This comes at the cost of having to introduce several new assumptions. The application scenarios in Chapters 9 and 10 do not have formal definitions or proofs, but make use of provably secure building blocks. Overall, the proofs only show asymptotic security, and primarily aim at being a sanity check of the protocols' design.

Modularity: Breaking down bigger systems into smaller modules, and constructing systems in such a way that individual building blocks can later be replaced with better (in the case of cryptography, faster and more secure) building blocks, is an important aspect of every engineering discipline. Chapter 2 describes the building blocks used in our constructions. Generally speaking, cryptographic algorithms such as commitment, signature, and encryption schemes, and pseudo-random functions, provide the core functionality, while zero-knowledge proofs that guarantee that the values generated by these algorithms are correctly related function as glue. Randomizable non-interactive proofs, a new building block presented in this thesis, allow reusing some of this glue in order to prove something about a combination of old and new values. The resulting proof looks like a freshly generated proof, which is useful when constructing protocols that require unlinkability with respect to previous proofs. Similarly, the P-signatures presented in Chapter 4 are a new high-level building block for privacy protocols.

Integration: It is not enough to develop privacy-enhancing mechanisms in isolation. For optimal support of the transaction environment sketched in Section 1.1, several cryptographic mechanisms need to interoperate. For example, systems that provide anonymity need to be combined with accountability mechanisms that can be invoked when this anonymity is abused. Sometimes this is done by fixing a party that is able to deanonymize users; in this case we speak of conditional anonymity. In our work on periodic n-times show credentials (cf. Chapter 6), we show how to achieve unconditional anonymity for honest users who do not overspend their credentials. Spending restrictions are only one of many features available to make anonymous credentials more accountable. In our work on efficient oblivious augmented maps (cf. Chapter 9) we integrate oblivious transfer protocols with an anonymous payment mechanism based on threshold cryptography; and in our work on private search (cf. Chapter 10) we show how to realize authorized private searches by integrating an authorization mechanism with the blind trapdoor generation mechanism.

Magic results: Cryptographers are obsessed with realizing functionalities that run counter to common sense. Multi-party computation and zero-knowledge proofs certainly have this magic touch to them, and so do private information retrieval and anonymous credential protocols. Once the mechanisms for realizing the functionalities are understood, there is of course nothing magical about them, but some of the fascination remains. Apart from its scientific value, understanding such 'magic results' is important from a policy perspective. Privacy protection legislation and other policy prescriptions that do not consider the full extent of the technical means available to protect our online privacy may fall short of their full potential. In this collection, especially the work on delegatable anonymous credentials has this quality (cf. Chapter 7). Similarly, the property of randomizability of proofs is fascinating, as we do not know how to realize it in any way other than the one described in this thesis. Naturally, this work still lacks some of the refinement in the support of multiple features that is found in non-delegatable anonymous credentials.

11.2 Future Work

The search space for finding the research questions posed and answered in this thesis was spanned along three dimensions: (i) cryptographic primitives, such as efficient signature and encryption schemes, that have (ii) efficient zero-knowledge proofs and two-party computation protocols, and that allow us to (iii) do things we did not know how to do electronically with good privacy.

The complexity encountered along these three dimensions led to methodological challenges. Ideally, we would formalize the requirements of the application domain first. Then we would choose well-established cryptographic primitives (or prove new building blocks secure under established assumptions), and in a third step we would combine them to achieve the domain-specific functionality in a provably secure way, using well-understood zero-knowledge proofs and secure two-party computation techniques.

This is an ideal scenario that we failed to achieve in several ways. Here is a list of some of the pitfalls and open problems found in this thesis:


No ideal functionality definitions: We failed to fully model the domain-specific privacy functionalities. Instead we used attack-based definitions that list the attack scenarios a protocol is required to cope with. We call our protocols secure if (based on our assumptions) all efficient p.p.t. attackers have negligible success probability for all of these attacks. We did not define the functionality required by an application scenario in the form of an ideal functionality, although this seems to be the most promising approach for the long term. As described in Section 2.1, an ideal functionality defines a privacy functionality with the help of an abstract trusted third party and thus can never be broken. The goal of a cryptographic protocol realizing the same functionality is then to achieve the same security against all p.p.t. adversaries while removing the reliance on the abstract trusted third party. In a recent article, which is not included in this thesis, we did some work of this type by defining and implementing a universally composable priced oblivious transfer protocol [RKP09]. The works in [CGH09, CDN09] are also moving in this direction.

New assumptions: Our protocols make use of several custom-built cryptographic primitives that could only be proven secure under new or modified assumptions. This is evident from our lengthy Section 2.3.3. The main reason for this dire state is that the GS proof system we are using only allows for the extraction of group elements. The design of efficient signature schemes and pseudo-random functions that fulfill our requirements and at the same time rely only on an assumption implied by the XDH or DLIN assumption is an interesting open problem. Even if we cannot base our protocols on minimal assumptions, it is good to have options among different assumptions; e.g., Fuchsbauer [Fuc09] shows how to avoid the TDH assumption at the cost of introducing a new modified HSDH assumption.

Quality of proofs: Complex protocols such as those presented in this thesis are hard to get right. While we were careful to only make security claims that we can back up with reductions to our underlying assumptions, human mistakes cannot be excluded. Moreover, the verification of handwritten proofs is cumbersome, and we doubt that most proofs are checked by more than a handful of people. While mathematicians read proofs for their beauty, as stories told by the universe, most security proofs for protocols are of a more mundane and contingent nature, not unlike the calculations made by the architect of a bridge. We would not print these calculations in textbooks for future generations to study, but the correctness of the calculations is nonetheless important. Should the protocols in this thesis come into widespread use, a more formal analysis of their security and their underlying assumptions would certainly be necessary. In particular, an analysis based on the concrete security paradigm [BKR94] would allow balancing the efficiency and security of the protocols under concrete security parameters.

Automated proofs: Automated proof generation and verification promise to make proving cryptographic protocols more reliable and less tedious. Existing automated or semi-automated proof techniques are either symbolic, e.g., [AG99],


in which case they do not adequately capture the computational assumptions that underlie most cryptographic primitives, or achieve computational security in one of two ways. Computational security can either be proven indirectly, by first establishing the security property symbolically and then using a computational soundness theorem to transfer the obtained security guarantees to the computational setting, as pioneered by [AR02]. Or it can be proven directly, by transforming the formalized protocol using probabilistic observational equivalences defined for the computational setting, as pioneered by [Lau04, BP06]. An overview of the state of the art of these two approaches is given in [ABCL09]. Another promising approach that supports both symbolic and computational reasoning is (C)PCL [DDMR07]. Doubt is expressed by Koblitz [Kob07].

Currently, none of these techniques supports all of the building blocks used by our protocols. In particular, most formal techniques have difficulties with the algebraic reasoning that is common in pairing-based cryptography. It is conceivable that algebraic reasoning can be avoided by only making use of algebraic properties in the construction of building blocks. The building blocks could then be axiomatized. Automated proof systems that support homomorphic encryption, zero-knowledge proofs and two-party computation protocols are, however, still unexplored terrain. Zero-knowledge proofs have been studied symbolically in [BMU08, BHM08, BU08]. The closest attempt at the formalization of a complete anonymous credential system is due to Mödersheim and Sommer [MS09], who formalized the idemix protocol using the so-called Alice-and-Bob notation [CVB06].

The computational soundness theorem, however, holds only under certain conditions and for a limited set of cryptographic primitives: [AR02] proposed such criteria for symmetric encryption schemes, Backes et al. [BPW03] proposed a composable cryptographic library, and Canetti and Herzog [CH04] proposed the use of composable public key encryption. For an overview see Canetti [Can08].

Automated generation: Another approach to constructing provably secure protocols is to generate them automatically from a description of their functionality. The correctness of the compiler then guarantees that the resulting protocol fulfills its security specification. Examples of this approach are zero-knowledge compilers and multi-party computation compilers.

While the focus of this thesis is on special-purpose protocols for specific functionalities, it would certainly be a mistake to write off general-purpose techniques. Such techniques would potentially allow one to implement every implementable functionality. Even if they are currently still impractical, the recent discovery of a fully homomorphic encryption scheme [Gen09] showed that major breakthroughs are still possible. Other general techniques, such as zero-knowledge proofs and general two-party (2PC) and multi-party (MPC) secure computation, have, however, made important progress towards achieving practicality [BCK+08, MNPS04, BDNP08, BCD+08].


Especially communication-efficient 2PC (a generalization of PIR) is relevant as a means for providing electronic services. There has been progress on communication-efficient non-interactive 2PC [CEJ+07, Sah08, Lip08, GGP09]. Such protocols are also known as crypto-computing protocols and can be seen as positioned somewhere between 2PC and fully homomorphic encryption.
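The crypto-computing idea, computing on data one cannot read, can be illustrated with a toy additively homomorphic (Paillier-style) scheme; the parameters below are illustrative and far too small for any real security:

```python
# Toy additively homomorphic encryption in the style of Paillier,
# illustrating crypto-computing: a server can add plaintexts by
# multiplying ciphertexts, without ever decrypting.  Tiny toy
# parameters, illustrative only.
import random
from math import gcd

p, q = 293, 433                 # toy primes
n, n2 = p * q, (p * q) ** 2
phi = (p - 1) * (q - 1)          # gcd(phi, n) = 1 for these primes
g = n + 1                        # standard choice: (1+n)^m = 1 + m*n mod n^2

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # c^phi = (1 + m*phi*n) mod n^2, so recover m*phi and divide out phi.
    L = (pow(c, phi, n2) - 1) // n
    return (L * pow(phi, -1, n)) % n

# Homomorphic property: Enc(a) * Enc(b) decrypts to a + b (mod n).
a, b = 1234, 5678
assert decrypt((encrypt(a) * encrypt(b)) % n2) == (a + b) % n
```

A fully homomorphic scheme extends this from addition alone to arbitrary circuits, which is what makes [Gen09] such a significant step.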

Despite these shortcomings of our theory, the general thrust of this thesis is towards practical feasibility. It is possible to build privacy-enhancing protocols even for the challenging application scenarios that are most privacy invasive.

A substantial amount of work remains to be done, especially at the higher protocol layers, where the cryptography needs to be integrated with the identity management system and the application itself. An important challenge here is usable human-computer interfaces. Important work in this direction has been completed, especially for the kind of protocols presented in Part I [pri09a], but similar efforts for the protocols in Part II are necessary. More work is also needed at the lower protocol layers, at the anonymous communication and device management level. Anonymous communication mechanisms such as Tor [DMS04] are an important infrastructure for privacy-enhancing identity management. Other 'channels' that should be considered are privacy-enhancing protocols for smart cards and mobile devices that would allow for a more direct interlinkage of physical and electronic transactions. More solutions are also needed for securing identity information and credentials against theft and (somewhat paradoxically) for making sure that one can nevertheless make use of this information everywhere, all the time. The metaphor here is that of a digital wallet. Clearly a solution is needed for bootstrapping the system should such a wallet break down or be lost by its owner.

It is only to be expected that this integration and systems effort will create new and refined cryptographic requirements and challenging research questions.


Bibliography

[AA06] Mansour Alsaleh and Carlisle Adams, Enhancing consumer privacy in the Liberty Alliance identity federation and Web Services frameworks, Privacy Enhancing Technologies – PET 2006 (George Danezis and Philippe Golle, eds.), Lecture Notes in Computer Science, vol. 4258, Springer-Verlag, Berlin, Germany, 2006, pp. 59–77. 54

[Aar09] Scott Aaronson, Quantum copy-protection and quantum money, IEEE Conference on Computational Complexity, IEEE Computer Society, 2009, pp. 229–242. 89

[ABC+05] M. Abdalla, M. Bellare, D. Catalano, E. Kiltz, T. Kohno, T. Lange, J. Malone-Lee, G. Neven, P. Paillier, and H. Shi, Searchable encryption revisited: Consistency properties, relation to anonymous IBE, and extensions, Advances in Cryptology – CRYPTO 2005 (Victor Shoup, ed.), Lecture Notes in Computer Science, vol. 3621, Springer-Verlag, Berlin, Germany, 2005, pp. 205–222. 7, 142, 170, 171, 173, 178, 182

[ABCL09] Martín Abadi, Bruno Blanchet, and Hubert Comon-Lundh, Models and proofs of protocol security: A progress report, Computer Aided Verification – CAV 2009, Lecture Notes in Computer Science, vol. 5643, Springer-Verlag, Berlin, Germany, 2009, pp. 35–49. 188

[ACJT00] Giuseppe Ateniese, Jan Camenisch, Marc Joye, and Gene Tsudik, A practical and provably secure coalition-resistant group signature scheme, Advances in Cryptology – CRYPTO 2000 (Mihir Bellare, ed.), Lecture Notes in Computer Science, vol. 1880, Springer-Verlag, Berlin, Germany, 2000, pp. 255–270. 6, 104

[ACK+10] Claudio A. Ardagna, Jan Camenisch, Markulf Kohlweiss, Ronald Leenes, Gregory Neven, Bart Priem, Pierangela Samarati, Dieter Sommer, and Mario Verdicchio, Exploiting cryptography for privacy-enhanced access control: A result of the PRIME project, Journal of Computer Security 18 (2010), no. 1, 123–160. 54, 57, 58, 215

[AFGH06] Giuseppe Ateniese, Kevin Fu, Matthew Green, and Susan Hohenberger, Improved proxy re-encryption schemes with applications to secure distributed storage, ACM Trans. Inf. Syst. Secur. 9 (2006), no. 1, 1–30. 31

[AG99] Martín Abadi and Andrew D. Gordon, A calculus for cryptographic protocols, Inf. Comput. 148 (1999), no. 1, 1–70. 187


[AIR01] William Aiello, Yuval Ishai, and Omer Reingold, Priced oblivious transfer: How to sell digital goods, Advances in Cryptology – EUROCRYPT 2001 (Innsbruck, Austria) (Birgit Pfitzmann, ed.), Lecture Notes in Computer Science, vol. 2045, Springer-Verlag, Berlin, Germany, 2001, pp. 119–135. 7, 143, 154

[AKMP08a] Christer Andersson, Markulf Kohlweiss, Leonardo Martucci, and Andriy Panchenko, A self-certified and sybil-free framework for secure digital identity domain buildup, Information Security Theory and Practices. Smart Devices, Convergence and Next Generation Networks – WISTP 2008 (Seville, Spain) (Jose Antonio Onieva, Damien Sauveron, Serge Chaumette, Dieter Gollmann, and Constantinos Markantonakis, eds.), Lecture Notes in Computer Science, vol. 5019, Springer-Verlag, Berlin, Germany, 2008, pp. 64–77. 216

[AKMP08b] ———, Self-certified sybil-free pseudonyms, 1st ACM Conference on Wireless Network Security – WiSec 2008 (Alexandria, VA, USA) (Virgil D. Gligor, Jean-Pierre Hubaux, and Radha Poovendran, eds.), Association for Computing Machinery, 2008, pp. 154–159. 216

[AR02] Martín Abadi and Phillip Rogaway, Reconciling two views of cryptography (the computational soundness of formal encryption), Journal of Cryptology 15 (2002), no. 2, 103–127. 188

[ASW97] N. Asokan, Matthias Schunter, and Michael Waidner, Optimistic protocols for fair exchange, ACM CCS 97: 4th Conference on Computer and Communications Security (Zurich, Switzerland), ACM Press, 1997, pp. 7–17. 164

[ASW98] N. Asokan, Victor Shoup, and Michael Waidner, Asynchronous protocols for optimistic fair exchange, Proceedings of the IEEE Symposium on Research in Security and Privacy, IEEE Computer Society Press, 1998, pp. 86–99. 164

[BB04a] Dan Boneh and Xavier Boyen, Efficient selective-ID secure identity-based encryption without random oracles, Advances in Cryptology – EUROCRYPT 2004 (Interlaken, Switzerland) (Christian Cachin and Jan Camenisch, eds.), Lecture Notes in Computer Science, vol. 3027, Springer-Verlag, Berlin, Germany, 2004, pp. 223–238. 25, 169

[BB04b] ———, Short signatures without random oracles, Advances in Cryptology – EUROCRYPT 2004 (Interlaken, Switzerland) (Christian Cachin and Jan Camenisch, eds.), Lecture Notes in Computer Science, vol. 3027, Springer-Verlag, Berlin, Germany, 2004, pp. 54–73. 23, 75, 82, 84, 125, 126, 251, 256

[BB08] Dan Boneh and Xavier Boyen, Short signatures without random oracles and the SDH assumption in bilinear groups, Journal of Cryptology 21 (2008), no. 2, 149–177. 23, 24, 82, 84, 90, 96

[BBBW82] Charles H. Bennett, Gilles Brassard, Seth Breidbard, and Stephen Wiesner, Quantum cryptography, or unforgeable subway tokens, Advances in Cryptology – CRYPTO'82 (David Chaum, Ronald L. Rivest, and Alan T. Sherman, eds.), Lecture Notes in Computer Science, Springer-Verlag, Berlin, Germany, 1982, pp. 267–275. 89


[BBDP01] Mihir Bellare, Alexandra Boldyreva, Anand Desai, and David Pointcheval, Key-privacy in public-key encryption, Advances in Cryptology – ASIACRYPT 2001 (Gold Coast, Australia) (Colin Boyd, ed.), Lecture Notes in Computer Science, vol. 2248, Springer-Verlag, Berlin, Germany, 2001, pp. 566–582. 170

[BBS04] Dan Boneh, Xavier Boyen, and Hovav Shacham, Short group signatures using strong Diffie-Hellman, Advances in Cryptology – CRYPTO 2004 (Matthew Franklin, ed.), Lecture Notes in Computer Science, vol. 3152, Springer-Verlag, Berlin, Germany, 2004, pp. 41–55. 6, 22, 23, 104

[BCC88] Gilles Brassard, David Chaum, and Claude Crépeau, Minimum disclosure proofs of knowledge, Journal of Computer and System Sciences 37 (1988), no. 2, 156–189. 27

[BCC04] Ernest F. Brickell, Jan Camenisch, and Liqun Chen, Direct anonymous attestation, ACM CCS 04: 11th Conference on Computer and Communications Security (Washington D.C., USA) (Vijayalakshmi Atluri, Birgit Pfitzmann, and Patrick Drew McDaniel, eds.), ACM Press, 2004, pp. 132–145. 33, 65

[BCC+08] Mira Belenkiy, Jan Camenisch, Melissa Chase, Markulf Kohlweiss, Anna Lysyanskaya, and Hovav Shacham, Randomizable proofs and delegatable anonymous credentials, Cryptology ePrint Archive, Report 2008/428, http://eprint.iacr.org/2008/428, 2008. 215

[BCC+09] ———, Randomizable proofs and delegatable anonymous credentials, Advances in Cryptology – CRYPTO 2009 (Shai Halevi, ed.), Lecture Notes in Computer Science, vol. 5677, Springer-Verlag, Berlin, Germany, 2009, pp. 108–125. 8, 9, 11, 73, 117, 121, 124, 127, 129, 215

[BCD+08] Peter Bogetoft, Dan Lund Christensen, Ivan Damgård, Martin Geisler, Thomas Jakobsen, Mikkel Krøigaard, Janus Dam Nielsen, Jesper Buus Nielsen, Kurt Nielsen, Jakob Pagter, Michael Schwartzbach, and Tomas Toft, Multiparty computation goes live, Cryptology ePrint Archive, Report 2008/068, 2008, http://eprint.iacr.org/. 188

[BCK+08] Endre Bangerter, Jan Camenisch, Stephan Krenn, Ahmad-Reza Sadeghi, and Thomas Schneider, Automatic generation of sound zero-knowledge protocols, Cryptology ePrint Archive, Report 2008/471, 2008, http://eprint.iacr.org/. 188

[BCKL07] Mira Belenkiy, Melissa Chase, Markulf Kohlweiss, and Anna Lysyanskaya, Non-interactive anonymous credentials, Cryptology ePrint Archive, Report 2007/384, 2007, http://eprint.iacr.org/. 216

[BCKL08] ———, P-signatures and noninteractive anonymous credentials, TCC 2008: 5th Theory of Cryptography Conference (New York, NY, USA) (Yevgeniy Dodis and Victor Shoup, eds.), Lecture Notes in Computer Science, vol. 4948, Springer-Verlag, Berlin, Germany, 2008, pp. 356–374. 8, 11, 73, 75, 87, 88, 118, 126, 216

[BCKL09] ———, Compact e-cash and simulatable VRFs revisited, Pairing-Based Cryptography – Pairing 2009 (Palo Alto, CA, USA) (Hovav Shacham and Brent Waters, eds.), Lecture Notes in Computer Science, vol. 5671, Springer-Verlag, Berlin, Germany, 2009, pp. 114–131. 8, 73, 93, 95, 96, 98, 215

[BCL04] Endre Bangerter, Jan Camenisch, and Anna Lysyanskaya, A cryptographic framework for the controlled release of certified data, Twelfth International Workshop on Security Protocols, Lecture Notes in Computer Science, Springer-Verlag, Berlin, Germany, 2004, pp. 720–42. 33, 75

[BCM05] Endre Bangerter, Jan Camenisch, and Ueli M. Maurer, Efficient proofs of knowledge of discrete logarithms and representations in groups with hidden order, PKC 2005: 8th International Workshop on Theory and Practice in Public Key Cryptography (Les Diablerets, Switzerland) (Serge Vaudenay, ed.), Lecture Notes in Computer Science, vol. 3386, Springer-Verlag, Berlin, Germany, 2005, pp. 154–171. 45

[BCR87] Gilles Brassard, Claude Crépeau, and Jean-Marc Robert, All-or-nothing disclosure of secrets, Advances in Cryptology – CRYPTO'86 (Santa Barbara, CA, USA) (Andrew M. Odlyzko, ed.), Lecture Notes in Computer Science, vol. 263, Springer-Verlag, Berlin, Germany, August 1987, pp. 234–238. 143

[BDCOP04] Dan Boneh, Giovanni Di Crescenzo, Rafail Ostrovsky, and Giuseppe Persiano, Public key encryption with keyword search, Advances in Cryptology – EUROCRYPT 2004 (Interlaken, Switzerland) (Christian Cachin and Jan Camenisch, eds.), Lecture Notes in Computer Science, vol. 3027, Springer-Verlag, Berlin, Germany, 2004, pp. 506–522. 142, 169, 173

[BDMP91] Manuel Blum, Alfredo De Santis, Silvio Micali, and Giuseppe Persiano, Non-interactive zero-knowledge, SIAM Journal on Computing 20 (1991), no. 6, 1084–1118. 48, 69

[BDNP08] Assaf Ben-David, Noam Nisan, and Benny Pinkas, FairplayMP: a system for secure multi-party computation, ACM CCS 08: 15th Conference on Computer and Communications Security (Alexandria, Virginia, USA) (Peng Ning, Paul F. Syverson, and Somesh Jha, eds.), ACM Press, 2008, pp. 257–266. 188

[Bel98] Mihir Bellare, Practice-oriented provable security, Lectures on Data Security, Lecture Notes in Computer Science, vol. 1561, Springer-Verlag, Berlin, Germany, 1998, pp. 1–15. 11

[BF01] Dan Boneh and Matthew Franklin, Identity-based encryption from the Weil pairing, Advances in Cryptology – CRYPTO 2001 (Joe Kilian, ed.), Lecture Notes in Computer Science, vol. 2139, Springer-Verlag, Berlin, Germany, 2001, pp. 213–229. 31, 169, 170

[BFM88] Manuel Blum, Paul Feldman, and Silvio Micali, Non-interactive zero-knowledge and its applications, 20th Annual ACM Symposium on Theory of Computing (Chicago, Illinois, USA), ACM Press, May 2–4, 1988, pp. 103–112. 18, 19, 36, 69

[BG92] Mihir Bellare and Oded Goldreich, On defining proofs of knowledge, Advances in Cryptology – CRYPTO'92 (Ernest F. Brickell, ed.), Lecture Notes in Computer Science, vol. 740, Springer-Verlag, Berlin, Germany, 1992, pp. 390–420. 41


[BGdMM] Lucas Ballard, Matthew Green, Breno de Medeiros, and Fabian Monrose, Correlation-resistant storage, Johns Hopkins University, CS Technical Report # TR-SP-BGMM-050705, http://spar.isi.jhu.edu/~mgreen/correlation.pdf, 2005. 23

[BGN05] Dan Boneh, Eu-Jin Goh, and Kobbi Nissim, Evaluating 2-DNF formulas on ciphertexts, TCC 2005: 2nd Theory of Cryptography Conference (Cambridge, MA, USA) (Joe Kilian, ed.), Lecture Notes in Computer Science, vol. 3378, Springer-Verlag, Berlin, Germany, 2005, pp. 325–341. 91

[BH04] Michael Backes and Dennis Hofheinz, How to break and repair a universally composable signature functionality, ISC 2004: Information Security Conference (Palo Alto, CA, USA) (Kan Zhang and Yuliang Zheng, eds.), Lecture Notes in Computer Science, vol. 3225, Springer-Verlag, Berlin, Germany, 2004, pp. 61–72. 14

[BHM08] Michael Backes, Catalin Hritcu, and Matteo Maffei, Type-checking zero-knowledge, ACM CCS 08: 15th Conference on Computer and Communications Security (Alexandria, Virginia, USA) (Peng Ning, Paul F. Syverson, and Somesh Jha, eds.), ACM Press, 2008, pp. 357–370. 188

[BIK05] Amos Beimel, Yuval Ishai, and Eyal Kushilevitz, General constructions for information-theoretic private information retrieval, J. Comput. Syst. Sci. 71 (2005), no. 2, 213–247. 141

[BIKR02] Amos Beimel, Yuval Ishai, Eyal Kushilevitz, and Jean-François Raymond, Breaking the O(n^{1/(2k-1)}) barrier for information-theoretic private information retrieval, 43rd Annual Symposium on Foundations of Computer Science (Vancouver, British Columbia, Canada), IEEE Computer Society Press, 2002, pp. 261–270. 141

[BKOS07] Dan Boneh, Eyal Kushilevitz, Rafail Ostrovsky, and William E. Skeith III, Public key encryption that allows PIR queries, Advances in Cryptology – CRYPTO 2007 (Alfred Menezes, ed.), Lecture Notes in Computer Science, vol. 4622, Springer-Verlag, Berlin, Germany, 2007, pp. 50–67. 142

[BKR94] Mihir Bellare, Joe Kilian, and Phillip Rogaway, The security of cipher block chaining, Advances in Cryptology – CRYPTO'94 (Yvo Desmedt, ed.), Lecture Notes in Computer Science, vol. 839, Springer-Verlag, Berlin, Germany, 1994, pp. 341–358. 187

[BMU08] Michael Backes, Matteo Maffei, and Dominique Unruh, Zero-knowledge in the applied pi-calculus and automated verification of the direct anonymous attestation protocol, IEEE Symposium on Security and Privacy, IEEE Computer Society, 2008, pp. 202–215. 188

[Bon98] Dan Boneh, The decision Diffie-Hellman problem, ANTS 1998: Third Algorithmic Number Theory Symposium (Joe Buhler, ed.), Lecture Notes in Computer Science, vol. 1423, Springer-Verlag, Berlin, Germany, 1998, pp. 48–63. 21

[Bou00] Fabrice Boudot, Efficient proofs that a committed number lies in an interval, Advances in Cryptology – EUROCRYPT 2000 (Bruges, Belgium) (Bart Preneel, ed.), Lecture Notes in Computer Science, vol. 1807, Springer-Verlag, Berlin, Germany, 2000, pp. 431–444. 91, 111, 171, 182

[BP97] Niko Barić and Birgit Pfitzmann, Collision-free accumulators and fail-stop signature schemes without trees, Advances in Cryptology – EUROCRYPT'97 (Konstanz, Germany) (Walter Fumy, ed.), Lecture Notes in Computer Science, vol. 1233, Springer-Verlag, Berlin, Germany, 1997, pp. 480–494. 23

[BP02] Mihir Bellare and Adriana Palacio, GQ and Schnorr identification schemes: Proofs of security against impersonation under active and concurrent attacks, Advances in Cryptology – CRYPTO 2002 (Moti Yung, ed.), Lecture Notes in Computer Science, vol. 2442, Springer-Verlag, Berlin, Germany, 2002, pp. 162–177. 70, 89

[BP06] Bruno Blanchet and David Pointcheval, Automated security proofs with sequences of games, Advances in Cryptology – CRYPTO 2006 (Cynthia Dwork, ed.), Lecture Notes in Computer Science, vol. 4117, Springer-Verlag, Berlin, Germany, 2006, pp. 537–554. 188

[BPW03] Michael Backes, Birgit Pfitzmann, and Michael Waidner, A composable cryptographic library with nested operations, ACM CCS 03: 10th Conference on Computer and Communications Security (Washington D.C., USA) (Sushil Jajodia, Vijayalakshmi Atluri, and Trent Jaeger, eds.), ACM Press, 2003, pp. 220–230. 188

[BR93] Mihir Bellare and Phillip Rogaway, Random oracles are practical: A paradigm for designing efficient protocols, ACM CCS 93: 1st Conference on Computer and Communications Security (Fairfax, Virginia, USA) (V. Ashby, ed.), ACM Press, 1993, pp. 62–73. 19, 40

[Bra93a] Stefan Brands, An efficient off-line electronic cash system based on the representation problem, Tech. Report CS-R9323, CWI, April 1993. 70, 89

[Bra93b] ———, Electronic cash systems based on the representation problem in groups of prime order, Advances in Cryptology – CRYPTO'93 (Douglas R. Stinson, ed.), Lecture Notes in Computer Science, vol. 773, Springer-Verlag, Berlin, Germany, 1993, pp. 26.1–26.15. 6

[Bra93c] ———, Untraceable off-line cash in wallets with observers, Advances in Cryptology – CRYPTO'94 (Yvo Desmedt, ed.), Lecture Notes in Computer Science, vol. 839, Springer-Verlag, Berlin, Germany, 1993, pp. 302–318. 70

[Bra97] ———, Rapid demonstration of linear relations connected by boolean operators, Advances in Cryptology – EUROCRYPT'97 (Konstanz, Germany) (Walter Fumy, ed.), Lecture Notes in Computer Science, vol. 1233, Springer-Verlag, Berlin, Germany, 1997, pp. 318–333. 45, 171

[Bra00] ———, Rethinking public key infrastructures and digital certificates, MIT Press, 2000. 5, 6, 71, 118

[Bri02] T. Brignall, The new panopticon: The internet viewed as a structure of social control, Theory and Science 3 (2002), no. 1, 335–348. 2

[BS01] Emmanuel Bresson and Jacques Stern, Group signatures with efficient revocation, PKC 2001: 4th International Workshop on Theory and Practice in Public Key Cryptography (Kwangjo Kim, ed.), Lecture Notes in Computer Science, vol. 1992, Springer-Verlag, Berlin, Germany, 2001, pp. 190–206. 6


[BS03] Alastair R. Beresford and Frank Stajano, Location privacy in pervasive computing, IEEE Pervasive Computing 2 (2003), no. 1, 46–55. 152

[BS04] Dan Boneh and Hovav Shacham, Group signatures with verifier-local revocation, ACM CCS 04: 11th Conference on Computer and Communications Security (Washington D.C., USA) (Vijayalakshmi Atluri, Birgit Pfitzmann, and Patrick Drew McDaniel, eds.), ACM Press, 2004, pp. 168–177. 6

[BU08] Michael Backes and Dominique Unruh, Computational soundness of symbolic zero-knowledge proofs against active attackers, CSF 2008: 21st IEEE Computer Security Foundations Symposium (Pittsburgh, Pennsylvania, USA), IEEE Computer Society, 2008, pp. 255–269. 188

[BW06] Xavier Boyen and Brent Waters, Anonymous hierarchical identity-based encryption (without random oracles), Advances in Cryptology – CRYPTO 2006 (Cynthia Dwork, ed.), Lecture Notes in Computer Science, vol. 4117, Springer-Verlag, Berlin, Germany, 2006, pp. 290–307. 149, 169, 174

[BW07] ———, Full-domain subgroup hiding and constant-size group signatures, PKC 2007: 10th International Workshop on Theory and Practice in Public Key Cryptography (Tatsuaki Okamoto and Xiaoyun Wang, eds.), Lecture Notes in Computer Science, vol. 4450, Springer-Verlag, Berlin, Germany, 2007, pp. 1–15. 23, 254

[Cam98] Jan Camenisch, Group signature schemes and payment systems based on the discrete logarithm problem, Ph.D. thesis, ETH Zürich, 1998. 45

[Can01] Ran Canetti, Universally composable security: A new paradigm for cryptographic protocols, 42nd Annual Symposium on Foundations of Computer Science (Las Vegas, Nevada, USA), IEEE Computer Society Press, 2001, pp. 136–145. 11, 14, 19

[Can04] ———, Universally composable signature, certification, and authentication, CSFW 2004: 17th IEEE Computer Security Foundations Workshop (Pacific Grove, California, USA), IEEE Computer Society, 2004, pp. 219–235. 14, 19

[Can08] ———, Composable formal security analysis: Juggling soundness, simplicity and efficiency, ICALP 2008: 35th International Colloquium on Automata, Languages and Programming, Part II (Reykjavik, Iceland) (Luca Aceto, Ivan Damgård, Leslie Ann Goldberg, Magnús M. Halldórsson, Anna Ingólfsdóttir, and Igor Walukiewicz, eds.), Lecture Notes in Computer Science, vol. 5126, Springer-Verlag, Berlin, Germany, 2008, pp. 1–13. 188

[car09] Windows CardSpace, online, 2009, available from http://www.microsoft.com/windows/products/winfamily/cardspace/, accessed September 2009. 54

[CCS08] Jan Camenisch, Rafik Chaabouni, and Abhi Shelat, Efficient protocols for set membership and range proofs, Advances in Cryptology – ASIACRYPT 2008 (Melbourne, Australia) (Josef Pieprzyk, ed.), Lecture Notes in Computer Science, vol. 5350, Springer-Verlag, Berlin, Germany, 2008, pp. 234–252. 91, 111


[CD00] Jan Camenisch and Ivan Damgård, Verifiable encryption, group encryption, and their applications to group signatures and signature sharing schemes, Advances in Cryptology – ASIACRYPT 2000 (Kyoto, Japan) (Tatsuaki Okamoto, ed.), Lecture Notes in Computer Science, vol. 1976, Springer-Verlag, Berlin, Germany, 2000, pp. 331–345. 31, 35, 114

[CDM+05] Shuchi Chawla, Cynthia Dwork, Frank McSherry, Adam Smith, and Hoeteck Wee, Toward privacy in public databases, Second Theory of Cryptography Conference (Joe Kilian, ed.), Lecture Notes in Computer Science, vol. 3378, Springer, 2005, pp. 363–385. 103

[CDN09] Jan Camenisch, Maria Dubovitskaya, and Gregory Neven, Oblivious transfer with access control, ACM CCS 09: 16th Conference on Computer and Communications Security (New York, NY, USA) (Ehab Al-Shaer, Somesh Jha, and Angelos D. Keromytis, eds.), ACM Press, 2009, pp. 131–140. 146, 187

[CDS94] Ronald Cramer, Ivan Damgård, and Berry Schoenmakers, Proofs of partial knowledge and simplified design of witness hiding protocols, Advances in Cryptology – CRYPTO'94 (Yvo Desmedt, ed.), Lecture Notes in Computer Science, vol. 839, Springer-Verlag, Berlin, Germany, 1994, pp. 174–187. 45, 99

[CEJ+07] Seung Geol Choi, Ariel Elbaz, Ari Juels, Tal Malkin, and Moti Yung, Two-party computing with encrypted data, Advances in Cryptology – ASIACRYPT 2007 (Kuching, Malaysia) (Kaoru Kurosawa, ed.), Lecture Notes in Computer Science, vol. 4833, Springer-Verlag, Berlin, Germany, 2007, pp. 298–314. 189

[CF01] Ran Canetti and Marc Fischlin, Universally composable commitments, Advances in Cryptology – CRYPTO 2001 (Joe Kilian, ed.), Lecture Notes in Computer Science, vol. 2139, Springer-Verlag, Berlin, Germany, 2001, pp. 19–40. 19

[CFN90] David Chaum, Amos Fiat, and Moni Naor, Untraceable electronic cash, Advances in Cryptology – CRYPTO'88 (Shafi Goldwasser, ed.), Lecture Notes in Computer Science, vol. 403, Springer-Verlag, Berlin, Germany, 1990, pp. 319–327. 70, 71, 89

[CFT98] Agnes Chan, Yair Frankel, and Yiannis Tsiounis, Easy come – easy go divisible cash, Advances in Cryptology – EUROCRYPT'98 (Espoo, Finland) (Kaisa Nyberg, ed.), Lecture Notes in Computer Science, vol. 1403, Springer-Verlag, Berlin, Germany, 1998, pp. 561–575. 111

[CGH04] Ran Canetti, Oded Goldreich, and Shai Halevi, The random oracle methodology, revisited, Journal of the ACM 51 (2004), no. 4, 557–594. 20, 21, 69

[CGH06] Sébastien Canard, Aline Gouget, and Emeline Hufschmitt, A handy multi-coupon system, ACNS 2006: 4th International Conference on Applied Cryptography and Network Security (Singapore) (Jianying Zhou, Moti Yung, and Feng Bao, eds.), Lecture Notes in Computer Science, vol. 3989, Springer-Verlag, Berlin, Germany, 2006, pp. 66–81. 33, 75


[CGH09] Scott E. Coull, Matthew Green, and Susan Hohenberger, Controlling access to an oblivious database using stateful anonymous credentials, PKC 2009: 12th International Workshop on Theory and Practice in Public Key Cryptography (Irvine, CA, USA) (Stanislaw Jarecki and Gene Tsudik, eds.), Lecture Notes in Computer Science, vol. 5443, Springer-Verlag, Berlin, Germany, 2009, pp. 501–520. 7, 145, 187

[CGKO06] Reza Curtmola, Juan A. Garay, Seny Kamara, and Rafail Ostrovsky, Searchable symmetric encryption: improved definitions and efficient constructions, ACM CCS 06: 13th Conference on Computer and Communications Security (Alexandria, Virginia, USA) (Ari Juels, Rebecca N. Wright, and Sabrina De Capitani di Vimercati, eds.), ACM Press, 2006, pp. 79–88. 7, 142, 180

[CGKS95] Benny Chor, Oded Goldreich, Eyal Kushilevitz, and Madhu Sudan, Private information retrieval, 36th Annual Symposium on Foundations of Computer Science (Milwaukee, Wisconsin), IEEE Computer Society Press, 1995, pp. 41–50. 7, 141, 170

[CGN98] Benny Chor, Niv Gilboa, and Moni Naor, Private information retrieval by keywords, Cryptology ePrint Archive, Report 1998/003, 1998, http://eprint.iacr.org/. 142, 168

[CH04] Ran Canetti and Jonathan Herzog, Universally composable symbolic analysis of cryptographic protocols (the case of encryption-based mutual authentication and key exchange), Cryptology ePrint Archive, Report 2004/334, 2004, http://eprint.iacr.org/. 188

[Cha81] David Chaum, Untraceable electronic mail, return addresses, and digital pseudonyms, Communications of the ACM 24 (1981), no. 2, 84–88. 152, 164

[Cha82] ———, Blind signatures for untraceable payments, Advances in Cryptology – CRYPTO'82 (David Chaum, Ronald L. Rivest, and Alan T. Sherman, eds.), Lecture Notes in Computer Science, Springer-Verlag, Berlin, Germany, 1982, pp. 199–203. 6, 70, 71, 89, 105, 144

[Cha85] ———, Security without identification: Transaction systems to make big brother obsolete, Communications of the ACM 28 (1985), no. 10, 1030–1044. 6, 62, 104, 118

[Che06] Jung Hee Cheon, Security analysis of the strong Diffie-Hellman problem, Advances in Cryptology – EUROCRYPT 2006 (St. Petersburg, Russia) (Serge Vaudenay, ed.), Lecture Notes in Computer Science, vol. 4004, Springer-Verlag, Berlin, Germany, 2006, pp. 1–11. 22

[CHK+06] Jan Camenisch, Susan Hohenberger, Markulf Kohlweiss, Anna Lysyanskaya, and Mira Meyerovich, How to win the clone wars: Efficient periodic n-times anonymous authentication, ACM CCS 06: 13th Conference on Computer and Communications Security (Alexandria, Virginia, USA) (Ari Juels, Rebecca N. Wright, and Sabrina De Capitani di Vimercati, eds.), ACM Press, 2006, p. 11. 8, 9, 33, 65, 66, 73, 75, 90, 103, 112, 216

[CHL05] Jan Camenisch, Susan Hohenberger, and Anna Lysyanskaya, Compact e-cash, Advances in Cryptology – EUROCRYPT 2005 (Aarhus, Denmark) (Ronald Cramer, ed.), Lecture Notes in Computer Science, vol. 3494, Springer-Verlag, Berlin, Germany, 2005, pp. 302–321. 6, 25, 33, 65, 66, 70, 71, 75, 90, 92, 93, 94, 105, 113

[CHL06] , Balancing accountability and privacy using e-cash, SCN2006: 6th International Conference on Security in CommunicationNetworks(Roberto De Prisco and Moti Yung, eds.), Lecture Notes inComputer Science, vol. 4116, Springer-Verlag, Berlin, Germany, 2006. 33,63, 75, 105

[CJKT09] Emiliano De Cristofaro, Stanislaw Jarecki, Jihye Kim, and Gene Tsudik,Privacy-preserving policy-based information transfer, Privacy EnhancingTechnologies – PET 2009 (Ian Goldberg and Mikhail Atallah, eds.), LectureNotes in Computer Science, vol. 5672, Springer-Verlag, Berlin, Germany,2009, pp. 164–184. 7, 147

[CK01] Sebastian Clauß and Marit Köhntopp, Identity management and its support of multilateral security, Computer Networks 37 (2001), no. 2, 205–219, Electronic Business Systems. 3

[CKL06] Ran Canetti, Eyal Kushilevitz, and Yehuda Lindell, On the limitations of universally composable two-party computation without set-up assumptions, Journal of Cryptology 19 (2006), no. 2, 135–167. 19

[CKRS09] Jan Camenisch, Markulf Kohlweiss, Alfredo Rial, and Caroline Sheedy, Blind and anonymous identity-based encryption and authorised private searches on public key encrypted data, PKC 2009: 12th International Workshop on Theory and Practice in Public Key Cryptography (Irvine, CA, USA) (Stanislaw Jarecki and Gene Tsudik, eds.), Lecture Notes in Computer Science, vol. 5443, Springer-Verlag, Berlin, Germany, 2009, pp. 196–214. 8, 10, 215

[CKS09] Jan Camenisch, Markulf Kohlweiss, and Claudio Soriente, An accumulator based on bilinear maps and efficient revocation for anonymous credentials, PKC 2009: 12th International Workshop on Theory and Practice in Public Key Cryptography (Irvine, CA, USA) (Stanislaw Jarecki and Gene Tsudik, eds.), Lecture Notes in Computer Science, vol. 5443, Springer-Verlag, Berlin, Germany, 2009, pp. 481–500. 215

[CKW04] Jan Camenisch, Maciej Koprowski, and Bogdan Warinschi, Efficient blind signatures without random oracles, SCN 2004: 4th International Conference on Security in Communication Networks (Carlo Blundo and Stelvio Cimato, eds.), Lecture Notes in Computer Science, vol. 3352, Springer-Verlag, Berlin, Germany, 2004, pp. 134–148. 35, 36

[CKY09] Jan Camenisch, Aggelos Kiayias, and Moti Yung, On the portability of generalized Schnorr proofs, Advances in Cryptology – EUROCRYPT 2009 (Cologne, Germany) (Antoine Joux, ed.), Lecture Notes in Computer Science, vol. 5479, Springer-Verlag, Berlin, Germany, 2009, pp. 425–442. 45

[CL01] Jan Camenisch and Anna Lysyanskaya, Efficient non-transferable anonymous multi-show credential system with optional anonymity revocation, Advances in Cryptology – EUROCRYPT 2001 (Innsbruck, Austria) (Birgit Pfitzmann, ed.), Lecture Notes in Computer Science, vol. 2045, Springer-Verlag, Berlin, Germany, 2001, pp. 93–118. 6, 32, 62, 104, 105, 118

[CL02a] , Dynamic accumulators and application to efficient revocation of anonymous credentials, Advances in Cryptology – CRYPTO 2002 (Moti Yung, ed.), Lecture Notes in Computer Science, vol. 2442, Springer-Verlag, Berlin, Germany, 2002, pp. 61–76. 6, 113, 115

[CL02b] , A signature scheme with efficient protocols, SCN 2002: 3rd International Conference on Security in Communication Networks (Stelvio Cimato, Clemente Galdi, and Giuseppe Persiano, eds.), Lecture Notes in Computer Science, vol. 2576, Springer-Verlag, Berlin, Germany, 2002, pp. 268–289. 6, 8, 32, 104, 110, 111, 118, 126

[CL04] , Signature schemes and anonymous credentials from bilinear maps, Advances in Cryptology – CRYPTO 2004 (Matthew Franklin, ed.), Lecture Notes in Computer Science, vol. 3152, Springer-Verlag, Berlin, Germany, 2004, pp. 56–72. 32, 104, 110, 111, 118

[CL06] Melissa Chase and Anna Lysyanskaya, On signatures of knowledge, Advances in Cryptology – CRYPTO 2006 (Cynthia Dwork, ed.), Lecture Notes in Computer Science, vol. 4117, Springer-Verlag, Berlin, Germany, 2006, pp. 78–96. 72, 118

[CL07] , Simulatable VRFs with applications to multi-theorem NIZK, Advances in Cryptology – CRYPTO 2007 (Alfred Menezes, ed.), Lecture Notes in Computer Science, vol. 4622, Springer-Verlag, Berlin, Germany, 2007, pp. 303–322. 25, 71, 91, 92, 93

[Cle] Privacy Rights Clearinghouse, Chronology of data breaches, Available from http://www.privacyrights.org/ar/ChronDataBreaches.htm. 148

[CLM07] Jan Camenisch, Anna Lysyanskaya, and Mira Meyerovich, Endorsed e-cash, IEEE Symposium on Security and Privacy, IEEE Computer Society, 2007, pp. 101–115. 33, 75, 100

[CM99a] Jan Camenisch and Markus Michels, Proving in zero-knowledge that a number n is the product of two safe primes, Advances in Cryptology – EUROCRYPT'99 (Prague, Czech Republic) (Jacques Stern, ed.), Lecture Notes in Computer Science, vol. 1592, Springer-Verlag, Berlin, Germany, 1999, pp. 107–122. 45, 111

[CM99b] , Separability and efficiency for generic group signature schemes, Advances in Cryptology – CRYPTO'99 (Michael J. Wiener, ed.), Lecture Notes in Computer Science, vol. 1666, Springer-Verlag, Berlin, Germany, 1999, pp. 413–430. 114

[CMS99] Christian Cachin, Silvio Micali, and Markus Stadler, Computationally private information retrieval with polylogarithmic communication, Lecture Notes in Computer Science 1592 (1999), 402–414. 7

[CNS07] Jan Camenisch, Gregory Neven, and Abhi Shelat, Simulatable adaptive oblivious transfer, Advances in Cryptology – EUROCRYPT 2007 (Barcelona, Spain) (Moni Naor, ed.), Lecture Notes in Computer Science, vol. 4515, Springer-Verlag, Berlin, Germany, 2007, pp. 573–590. 7, 144, 172

[Coo71] Stephen A. Cook, The complexity of theorem-proving procedures, 3rd Annual ACM Symposium on Theory of Computing (Shaker Heights, Ohio, USA), ACM Press, May 3–5, 1971, pp. 151–158. 45, 91

[CP93a] David Chaum and Torben Pryds Pedersen, Transferred cash grows in size, Advances in Cryptology – EUROCRYPT'92 (Balatonfüred, Hungary) (Rainer A. Rueppel, ed.), Lecture Notes in Computer Science, vol. 658, Springer-Verlag, Berlin, Germany, 1993, pp. 390–407. 70, 89

[CP93b] , Wallet databases with observers, Advances in Cryptology – CRYPTO'92 (Ernest F. Brickell, ed.), Lecture Notes in Computer Science, vol. 740, Springer-Verlag, Berlin, Germany, 1993, pp. 89–105. 45

[CPS94] Jan Camenisch, Jean-Marc Piveteau, and Markus Stadler, Blind signatures based on the discrete logarithm problem, Advances in Cryptology – EUROCRYPT'94 (Perugia, Italy) (Alfredo De Santis, ed.), Lecture Notes in Computer Science, vol. 950, Springer-Verlag, Berlin, Germany, 1994, pp. 428–432. 70, 89

[Cra97] Ronald Cramer, Modular design of secure yet practical cryptographicprotocols, Ph.D. thesis, University of Amsterdam, Amsterdam, 1997. 45, 99

[CS97a] Jan Camenisch and Markus Stadler, Efficient group signature schemes for large groups, Advances in Cryptology – CRYPTO'97 (Burton S. Kaliski Jr., ed.), Lecture Notes in Computer Science, vol. 1294, Springer-Verlag, Berlin, Germany, 1997, pp. 410–424. 6, 38, 42, 104, 111

[CS97b] , Proof systems for general statements about discrete logarithms, Tech. Report TR 260, Institute for Theoretical Computer Science, ETH Zurich, March 1997. 36, 38, 42, 45

[CS03] Jan Camenisch and Victor Shoup, Practical verifiable encryption and decryption of discrete logarithms, Advances in Cryptology – CRYPTO 2003 (Dan Boneh, ed.), Lecture Notes in Computer Science, vol. 2729, Springer-Verlag, Berlin, Germany, 2003, pp. 126–144. 36

[CS05] Sanjit Chatterjee and Palash Sarkar, Trading time for space: Towards an efficient IBE scheme with short(er) public parameters in the standard model, ICISC 05: 8th International Conference on Information Security and Cryptology (Seoul, Korea) (Dongho Won and Seungjoo Kim, eds.), Lecture Notes in Computer Science, vol. 3935, Springer-Verlag, Berlin, Germany, 2005, pp. 424–440. 169

[CT05] Cheng-Kang Chu and Wen-Guey Tzeng, Efficient k-out-of-n oblivious transfer schemes with adaptive and non-adaptive queries, PKC 2005: 8th International Workshop on Theory and Practice in Public Key Cryptography (Les Diablerets, Switzerland) (Serge Vaudenay, ed.), Lecture Notes in Computer Science, vol. 3386, Springer-Verlag, Berlin, Germany, January 23–26, 2005, pp. 172–183. 143, 144

[CT09] Emiliano De Cristofaro and Gene Tsudik, Practical private set intersection protocols, Cryptology ePrint Archive, Report 2009/491, 2009, http://eprint.iacr.org/. 142, 147

[CV02] Jan Camenisch and Els Van Herreweghen, Design and implementation of the idemix anonymous credential system, ACM CCS 02: 9th Conference on Computer and Communications Security (Washington D.C., USA) (Vijayalakshmi Atluri, ed.), ACM Press, 2002. 33

[CVB06] Carlos Caleiro, Luca Viganò, and David Basin, On the semantics of Alice&Bob specifications of security protocols, Theor. Comput. Sci. 367 (2006), no. 1, 88–122. 188

[CvH91] David Chaum and Eugene van Heyst, Group signatures, Advances in Cryptology – EUROCRYPT'91 (Brighton, UK) (Donald W. Davies, ed.), Lecture Notes in Computer Science, vol. 547, Springer-Verlag, Berlin, Germany, 1991, pp. 257–265. 6, 104

[Dam90] Ivan Damgård, Payment systems and credential mechanism with provable security against abuse by individuals, Advances in Cryptology – CRYPTO'88 (Shafi Goldwasser, ed.), Lecture Notes in Computer Science, vol. 403, Springer-Verlag, Berlin, Germany, 1990, pp. 328–335. 104, 118

[Dam99] , Concurrent zero-knowledge is easy in practice, Available online at Theory of Cryptography Library, June 1999. 45

[Dam00] , Efficient concurrent zero-knowledge in the auxiliary string model, Advances in Cryptology – EUROCRYPT 2000 (Bruges, Belgium) (Bart Preneel, ed.), Lecture Notes in Computer Science, vol. 1807, Springer-Verlag, Berlin, Germany, 2000, pp. 431–444. 19, 99

[Dam02] , On Σ-protocols, Available at http://www.daimi.au.dk/~ivan/Sigma.ps, 2002. 45, 99

[Dan09] George Danezis, The least privacy-damaging centralised traffic data retention architecture (extended abstract), Cambridge Security Protocols Workshop (SPW 2009), Cambridge, UK, 2009. 149

[DC08] George Danezis and Richard Clayton, Introducing traffic analysis, Digital Privacy: Theory, Technologies, and Practices (Alessandro Acquisti, Stefanos Gritzalis, Costas Lambrinoudakis, and Sabrina di Vimercati, eds.), Auerbach Publications, 2008, pp. 95–116. 56

[DCMO00] Giovanni Di Crescenzo, Tal Malkin, and Rafail Ostrovsky, Single database private information retrieval implies oblivious transfer, Advances in Cryptology – EUROCRYPT 2000 (Bruges, Belgium) (Bart Preneel, ed.), Lecture Notes in Computer Science, vol. 1807, Springer-Verlag, Berlin, Germany, 2000, pp. 122–138. 143

[DD05] Liesje Demuynck and Bart De Decker, Privacy-preserving electronic health records, CMS 2005: 9th Communications and Multimedia Security Conference (Salzburg, Austria) (Jana Dittmann, Stefan Katzenbeisser, and Andreas Uhl, eds.), Lecture Notes in Computer Science, vol. 3677, Springer-Verlag, Berlin, Germany, 2005, pp. 150–159. 139

[DD07] George Danezis and Claudia Díaz, Space-efficient private search with applications to rateless codes, Financial Cryptography 2007 (Scarborough, Trinidad and Tobago) (Sven Dietrich and Rachna Dhamija, eds.), Lecture Notes in Computer Science, vol. 4886, Springer-Verlag, Berlin, Germany, 2007, pp. 148–162. 7, 143

[DDJ07] Liesje Demuynck, Bart De Decker, and Wouter Joosen, A credential-based system for the anonymous delegation of rights, SEC 2007: 22nd IFIP TC-11 International Information Security Conference (Hein S. Venter, Mariki M. Eloff, Les Labuschagne, Jan H. P. Eloff, and Rossouw von Solms, eds.), IFIP, vol. 232, Springer-Verlag, Berlin, Germany, 2007, pp. 169–180. 118

[DDMR07] Anupam Datta, Ante Derek, John C. Mitchell, and Arnab Roy, Protocol composition logic (PCL), Electr. Notes Theor. Comput. Sci. 172 (2007), 311–358. 188

[DDP06] Ivan Damgård, Kasper Dupont, and Michael Østergaard Pedersen, Unclonable group identification, Advances in Cryptology – EUROCRYPT 2006 (St. Petersburg, Russia) (Serge Vaudenay, ed.), Lecture Notes in Computer Science, vol. 4004, Springer-Verlag, Berlin, Germany, 2006, pp. 555–572. 33, 65, 66, 71, 75, 103, 104, 112

[DDS09] George Danezis, Claudia Díaz, and Paul Syverson, Systems for anonymous communication, CRC Handbook of Financial Cryptography and Security (B. Rosenberg and D. Stinson, eds.), CRC Cryptography and Network Security Series, Chapman & Hall, 2009, p. 61. 56

[Den02] Alexander W. Dent, Adapting the weaknesses of the random oracle model to the generic group model, Advances in Cryptology – ASIACRYPT 2002 (Queenstown, New Zealand) (Yuliang Zheng, ed.), Lecture Notes in Computer Science, vol. 2501, Springer-Verlag, Berlin, Germany, 2002, pp. 100–109. 21

[DH76] Whitfield Diffie and Martin E. Hellman, New directions in cryptography, IEEE Trans. on Information Theory IT-22 (1976), no. 6, 644–654. 19

[DJ01] Ivan Damgård and Mads Jurik, A generalisation, a simplification and some applications of Paillier's probabilistic public-key system, PKC 2001: 4th International Workshop on Theory and Practice in Public Key Cryptography (Kwangjo Kim, ed.), Lecture Notes in Computer Science, vol. 1992, Springer-Verlag, Berlin, Germany, 2001, pp. 119–136. 31, 141, 159, 160

[DKD+09] Claudia Díaz, Eleni Kosta, Hannelore Dekeyser, Markulf Kohlweiss, and Girma Nigusse, Privacy preserving electronic petitions, Identity in the Information Society 1 (2009), no. 1, 14. 215

[DMS04] Roger Dingledine, Nick Mathewson, and Paul F. Syverson, Tor: The second-generation onion router, USENIX Security Symposium (Boston, MA, USA), USENIX, 2004, pp. 303–320. 189

[DNRS03] Cynthia Dwork, Moni Naor, Omer Reingold, and Larry J. Stockmeyer, Magic functions, Journal of the ACM 50 (2003), no. 6, 852–921. 69

[DSMP88] Alfredo De Santis, Silvio Micali, and Giuseppe Persiano, Non-interactive zero-knowledge proof systems, Advances in Cryptology – CRYPTO'87 (Carl Pomerance, ed.), Lecture Notes in Computer Science, vol. 293, Springer-Verlag, Berlin, Germany, 1988, pp. 52–72. 69

[DY05] Yevgeniy Dodis and Aleksandr Yampolskiy, A verifiable random function with short proofs and keys, PKC 2005: 8th International Workshop on Theory and Practice in Public Key Cryptography (Les Diablerets, Switzerland) (Serge Vaudenay, ed.), Lecture Notes in Computer Science, vol. 3386, Springer-Verlag, Berlin, Germany, 2005, pp. 416–431. 25, 34, 90, 92, 96

[EGL85] Shimon Even, Oded Goldreich, and Abraham Lempel, A randomized protocol for signing contracts, Communications of the Association for Computing Machinery 28 (1985), no. 6, 637–647. 143

[EGW09] Essam Ghadafi, Nigel P. Smart, and Bogdan Warinschi, Groth–Sahai proofs revisited, Cryptology ePrint Archive, Report 2009/599, 2009, http://eprint.iacr.org/. 46

[eu-95] Directive 1995/46/EC of the European Parliament and of the Council, Official Journal of the European Union, October 1995. 148

[eu-06a] Directive 2002/58/EC of the European Parliament and of the Council, Official Journal of the European Union, April 2006. 2

[eu-06b] Directive 2006/24/EC of the European Parliament and of the Council, Official Journal of the European Union, April 2006. 5, 55, 138, 167

[FF00] Marc Fischlin and Roger Fischlin, Efficient non-malleable commitment schemes, Advances in Cryptology – CRYPTO 2000 (Mihir Bellare, ed.), Lecture Notes in Computer Science, vol. 1880, Springer-Verlag, Berlin, Germany, 2000, pp. 413–431. 19

[FIPR05] Michael J. Freedman, Yuval Ishai, Benny Pinkas, and Omer Reingold, Keyword search and oblivious pseudorandom functions, TCC 2005: 2nd Theory of Cryptography Conference (Cambridge, MA, USA) (Joe Kilian, ed.), Lecture Notes in Computer Science, vol. 3378, Springer-Verlag, Berlin, Germany, 2005, pp. 303–324. 142, 145

[FJKP95] Hannes Federrath, Anja Jerichow, Dogan Kesdogan, and Andreas Pfitzmann, Security in public mobile communication networks, IFIP TC 6 International Workshop on Personal Wireless Communications, 1995, pp. 105–116. 152

[FLS99] Uriel Feige, Dror Lapidot, and Adi Shamir, Multiple noninteractive zero knowledge proofs under general assumptions, SIAM Journal on Computing 29 (1999), no. 1, 1–28. 36, 48

[FNP04] Michael Freedman, Kobbi Nissim, and Benny Pinkas, Efficient private matching and set intersection, Advances in Cryptology – EUROCRYPT 2004 (Interlaken, Switzerland) (Christian Cachin and Jan Camenisch, eds.), Lecture Notes in Computer Science, vol. 3027, Springer-Verlag, Berlin, Germany, 2004, pp. 1–19. 142

[FO97] Eiichiro Fujisaki and Tatsuaki Okamoto, Statistical zero knowledge protocols to prove modular polynomial relations, Advances in Cryptology – CRYPTO'97 (Burton S. Kaliski Jr., ed.), Lecture Notes in Computer Science, vol. 1294, Springer-Verlag, Berlin, Germany, 1997, pp. 16–30. 23, 27

[FO98] , A practical and provably secure scheme for publicly verifiable secret sharing and its applications, Advances in Cryptology – EUROCRYPT'98 (Espoo, Finland) (Kaisa Nyberg, ed.), Lecture Notes in Computer Science, vol. 1403, Springer-Verlag, Berlin, Germany, 1998, pp. 32–46. 33

[FP09] Georg Fuchsbauer and David Pointcheval, Proofs on encrypted values in bilinear groups and an application to anonymity of signatures, Pairing-Based Cryptography – Pairing 2009 (Palo Alto, CA, USA) (Hovav Shacham and Brent Waters, eds.), Lecture Notes in Computer Science, vol. 5671, Springer-Verlag, Berlin, Germany, 2009, pp. 132–149. 48

[Fri07] Lothar Fritsch, Profiling and location based services, FIDIS D7.5: Profiling the European Citizen, Cross-disciplinary perspectives (M. Hildebrandt and S. Gutwirth, eds.), 2007. 152

[FS87] Amos Fiat and Adi Shamir, How to prove yourself: Practical solutions to identification and signature problems, Advances in Cryptology – CRYPTO'86 (Andrew M. Odlyzko, ed.), Lecture Notes in Computer Science, vol. 263, Springer-Verlag, Berlin, Germany, 1987, pp. 186–194. 19, 45, 69, 71, 90, 113

[FS90] Uriel Feige and Adi Shamir, Witness indistinguishable and witness hiding protocols, 22nd Annual ACM Symposium on Theory of Computing (Baltimore, Maryland, USA), ACM Press, May 14–16, 1990, pp. 416–426. 43, 44

[FST06] David Freeman, Michael Scott, and Edlyn Teske, A taxonomy of pairing-friendly elliptic curves, Cryptology ePrint Archive, Report 2006/372, 2006, http://eprint.iacr.org/. 91

[FTY96] Yair Frankel, Yiannis Tsiounis, and Moti Yung, "Indirect discourse proofs:" Achieving efficient fair off-line e-cash, Advances in Cryptology – ASIACRYPT'96 (Kyongju, Korea) (Kwangjo Kim and Tsutomu Matsumoto, eds.), Lecture Notes in Computer Science, vol. 1163, Springer-Verlag, Berlin, Germany, 1996, pp. 286–300. 70, 89

[Fuc09] Georg Fuchsbauer, Automorphic signatures in bilinear groups, Cryptology ePrint Archive, Report 2009/320, 2009, http://eprint.iacr.org/. 187

[FY93] Matthew K. Franklin and Moti Yung, Secure and efficient off-line digital money (extended abstract), ICALP'93: 20th International Colloquium on Automata, Languages and Programming (Lund, Sweden) (Andrzej Lingas, Rolf G. Karlsson, and Svante Carlsson, eds.), Lecture Notes in Computer Science, vol. 700, Springer-Verlag, Berlin, Germany, 1993, pp. 265–276. 65, 70, 89

[Gen06] Craig Gentry, Practical identity-based encryption without random oracles, Advances in Cryptology – EUROCRYPT 2006 (St. Petersburg, Russia) (Serge Vaudenay, ed.), Lecture Notes in Computer Science, vol. 4004, Springer-Verlag, Berlin, Germany, 2006, pp. 445–464. 149, 169, 171

[Gen09] Craig Gentry, Fully homomorphic encryption using ideal lattices, 41st Annual ACM Symposium on Theory of Computing (Bethesda, MD, USA), ACM Press, May 31 – June 2, 2009, pp. 169–178. 188

[GG03] Marco Gruteser and Dirk Grunwald, Anonymous usage of location-based services through spatial and temporal cloaking, MobiSys'03: 1st International Conference on Mobile Systems, Applications, and Services (San Francisco, CA, USA) (Dan Siewiorek, ed.), ACM SIGMOBILE and USENIX, 2003, pp. 31–42. 152

[GGM86] Oded Goldreich, Shafi Goldwasser, and Silvio Micali, How to construct random functions, Journal of the ACM 33 (1986), no. 4, 792–807. 33

[GGP09] Rosario Gennaro, Craig Gentry, and Bryan Parno, Non-interactive verifiable computing: Outsourcing computation to untrusted workers, Cryptology ePrint Archive, Report 2009/547, 2009, http://eprint.iacr.org/. 189

[GH07] Matthew Green and Susan Hohenberger, Blind identity-based encryption and simulatable oblivious transfer, Advances in Cryptology – ASIACRYPT 2007 (Kuching, Malaysia) (Kaoru Kurosawa, ed.), Lecture Notes in Computer Science, vol. 4833, Springer-Verlag, Berlin, Germany, 2007, pp. 265–282. 149, 169, 171, 172, 173

[GK96] Oded Goldreich and Hugo Krawczyk, On the composition of zero-knowledge proof systems, SIAM Journal on Computing 25 (1996), no. 1, 169–192. 44

[GK03] Shafi Goldwasser and Yael Tauman Kalai, On the (in)security of the Fiat-Shamir paradigm, 44th Annual Symposium on Foundations of Computer Science (Cambridge, Massachusetts, USA), IEEE Computer Society Press, 2003, pp. 102–115. 69, 90

[GM84] Shafi Goldwasser and Silvio Micali, Probabilistic encryption, Journal of Computer and System Sciences 28 (1984), no. 2, 270–299. 30, 141

[GMR88] Shafi Goldwasser, Silvio Micali, and Ronald Rivest, A digital signature scheme secure against adaptive chosen-message attacks, SIAM Journal on Computing 17 (1988), no. 2, 281–308. 15, 32, 76, 77, 82, 138

[GMR89] Shafi Goldwasser, Silvio Micali, and Charles Rackoff, The knowledge complexity of interactive proof-systems, SIAM Journal on Computing 18 (1989), no. 1, 186–208. 36, 39, 43

[GMW86] Oded Goldreich, Silvio Micali, and Avi Wigderson, Proofs that yield nothing but their validity and a method of cryptographic protocol design, 27th Annual Symposium on Foundations of Computer Science (Toronto, Ontario, Canada), IEEE Computer Society Press, 1986, pp. 174–187. 33

[GO96] Oded Goldreich and Rafi Ostrovsky, Software protection and simulation on oblivious RAMs, Journal of the ACM 43 (1996), no. 3, 431–473. 142

[Gol00] Oded Goldreich, Foundations of cryptography: Volume 1, basic tools, Cambridge University Press, New York, NY, USA, 2000. 11, 36

[Gol04] , Foundations of cryptography: Volume 2, basic applications, Cambridge University Press, New York, NY, USA, 2004. 11

[GOS06a] Jens Groth, Rafail Ostrovsky, and Amit Sahai, Non-interactive Zaps and new techniques for NIZK, Advances in Cryptology – CRYPTO 2006 (Cynthia Dwork, ed.), Lecture Notes in Computer Science, vol. 4117, Springer-Verlag, Berlin, Germany, 2006, pp. 97–111. 45

[GOS06b] , Perfect non-interactive zero knowledge for NP, Advances in Cryptology – EUROCRYPT 2006 (St. Petersburg, Russia) (Serge Vaudenay, ed.), Lecture Notes in Computer Science, vol. 4004, Springer-Verlag, Berlin, Germany, 2006, pp. 339–358. 45, 48

[GPS08] Steven D. Galbraith, Kenneth G. Paterson, and Nigel P. Smart, Pairings for cryptographers, Discrete Appl. Math. 156 (2008), no. 16, 3113–3121. 17, 174

[GR04] Steven Galbraith and Victor Rotger, Easy decision Diffie-Hellman groups, LMS Journal of Computation and Mathematics 7 (2004), 201–218. 23

[GR05] Craig Gentry and Zulfikar Ramzan, Single-database private information retrieval with constant communication rate, ICALP 2005: 32nd International Colloquium on Automata, Languages and Programming (Lisbon, Portugal) (Luís Caires, Giuseppe F. Italiano, Luís Monteiro, Catuscia Palamidessi, and Moti Yung, eds.), Lecture Notes in Computer Science, vol. 3580, Springer-Verlag, Berlin, Germany, 2005, pp. 803–815. 141

[GS08] Jens Groth and Amit Sahai, Efficient non-interactive proof systems for bilinear groups, Advances in Cryptology – EUROCRYPT 2008 (Istanbul, Turkey) (Nigel P. Smart, ed.), Lecture Notes in Computer Science, vol. 4965, Springer-Verlag, Berlin, Germany, 2008. 22, 27, 28, 38, 44, 46, 47, 48, 49, 70, 71, 73, 75, 83, 88, 96, 97, 122, 128

[GSM03] GSM Association, Location based services, Permanent reference document SE.23, Tech. report, 2003. 151

[GZ09] Paolo Guarda and Nicola Zannone, Towards the development of privacy-aware systems, Information and Software Technology 51 (2009), no. 2, 337–350. 3

[Her08] Paul De Hert, Identity management of e-ID, privacy and security in Europe. A human rights view, Information Security Technical Report 13 (2008), no. 2, 71–75. 3

[hig09] Higgins project, online, 2009, Available from http://www.eclipse.org/higgins/, accessed in September 2009. 54

[HS00] Martin Hirt and Kazue Sako, Efficient receipt-free voting based on homomorphic encryption, Advances in Cryptology – EUROCRYPT 2000 (Bruges, Belgium) (Bart Preneel, ed.), Lecture Notes in Computer Science, vol. 1807, Springer-Verlag, Berlin, Germany, 2000, pp. 539–556. 148

[IKOS06] Yuval Ishai, Eyal Kushilevitz, Rafail Ostrovsky, and Amit Sahai, Cryptography from anonymity, 47th Annual Symposium on Foundations of Computer Science (Berkeley, California, USA), IEEE Computer Society Press, 2006, pp. 239–248. 142

[IL07] Muhammad Iqbal and Samsung Lim, An automated real-world privacy assessment of GPS tracking and profiling, 2nd Workshop on Social Implications of National Security: From Dataveillance to Uberveillance, 2007, pp. 225–240. 6, 137, 152

[JL09] Stanislaw Jarecki and Xiaomin Liu, Efficient oblivious pseudorandom function with applications to adaptive OT and secure computation of set intersection, TCC 2009: 6th Theory of Cryptography Conference (San Francisco, CA, USA) (Omer Reingold, ed.), Lecture Notes in Computer Science, vol. 5444, Springer-Verlag, Berlin, Germany, 2009, pp. 577–594. 142

[JS04] Stanislaw Jarecki and Vitaly Shmatikov, Handcuffing big brother: an abuse-resilient transaction escrow scheme, Advances in Cryptology – EUROCRYPT 2004 (Interlaken, Switzerland) (Christian Cachin and Jan Camenisch, eds.), Lecture Notes in Computer Science, vol. 3027, Springer-Verlag, Berlin, Germany, 2004, pp. 590–608. 33, 75

[JS07] , Efficient two-party secure computation on committed inputs, Advances in Cryptology – EUROCRYPT 2007 (Barcelona, Spain) (Moni Naor, ed.), Lecture Notes in Computer Science, vol. 4515, Springer-Verlag, Berlin, Germany, 2007, pp. 97–114. 35, 83

[JS08] Tibor Jager and Jörg Schwenk, On the equivalence of generic group models, ProvSec 2008: 2nd International Conference on Provable Security (Shanghai, China) (Joonsang Baek, Feng Bao, Kefei Chen, and Xuejia Lai, eds.), Lecture Notes in Computer Science, vol. 5324, Springer-Verlag, Berlin, Germany, 2008, pp. 200–209. 249

[Kal05] Yael Tauman Kalai, Smooth projective hashing and two-message oblivious transfer, Advances in Cryptology – EUROCRYPT 2005 (Aarhus, Denmark) (Ronald Cramer, ed.), Lecture Notes in Computer Science, vol. 3494, Springer-Verlag, Berlin, Germany, May 22–26, 2005, pp. 78–95. 143

[KFF+07] Markulf Kohlweiss, Sebastian Faust, Lothar Fritsch, Bartek Gedrojc, and Bart Preneel, Efficient oblivious augmented maps: Location-based services with a payment broker, Privacy Enhancing Technologies – PET 2007 (Nikita Borisov and Philippe Golle, eds.), Lecture Notes in Computer Science, vol. 4776, Springer-Verlag, Berlin, Germany, 2007, pp. 77–94. 8, 9, 216

[KFKK05] Tobias Kölsch, Lothar Fritsch, Markulf Kohlweiss, and Dogan Kesdogan, Privacy for profitable location based services, SPC 2005: 2nd International Conference on Security in Pervasive Computing (Boppard, Germany) (Dieter Hutter and Markus Ullmann, eds.), Lecture Notes in Computer Science, vol. 3450, Springer-Verlag, Berlin, Germany, 2005, pp. 164–178. 151, 152

[Kle04] Virginia Franke Kleist, A transaction cost model of electronic trust: Transactional return, incentives for network security and optimal risk in the digital economy, Electronic Commerce Research 4 (2004), no. 1-2, 41–57. 56

[KM99] Reto Kohlas and Ueli M. Maurer, Reasoning about public-key certification: On bindings between entities and public keys, Financial Cryptography'99 (Anguilla, British West Indies) (Matthew Franklin, ed.), Lecture Notes in Computer Science, vol. 1648, Springer-Verlag, Berlin, Germany, 1999, pp. 86–103. 56

[KM08] Neal Koblitz and Alfred Menezes, Another look at non-standard discrete log and Diffie-Hellman problems, Journal of Mathematical Cryptology 2 (2008), no. 4, 311–326. 22

[KO97] Eyal Kushilevitz and Rafail Ostrovsky, Replication is not needed: single database, computationally-private information retrieval, 38th Annual Symposium on Foundations of Computer Science (Miami Beach, Florida), IEEE Computer Society Press, 1997, pp. 364–373. 141

[Kob07] Neal Koblitz, Another look at automated theorem-proving, Cryptology ePrintArchive, Report 2007/401, 2007, http://eprint.iacr.org/. 188

[KP98] Joe Kilian and Erez Petrank, An efficient non-interactive zero-knowledge proof system for NP with general assumptions, Journal of Cryptology 11 (1998), no. 1, 1–27. 48

[Kru07] John Krumm, Inference attacks on location tracks, Pervasive 2007: 5th International Conference on Pervasive Computing (Toronto, Canada) (Anthony LaMarca, Marc Langheinrich, and Khai N. Truong, eds.), Lecture Notes in Computer Science, vol. 4480, Springer-Verlag, Berlin, Germany, 2007, pp. 127–143. 6, 137, 152

[KSW08] Jonathan Katz, Amit Sahai, and Brent Waters, Predicate encryption supporting disjunctions, polynomial equations, and inner products, Advances in Cryptology – EUROCRYPT 2008 (Istanbul, Turkey) (Nigel P. Smart, ed.), Lecture Notes in Computer Science, vol. 4965, Springer-Verlag, Berlin, Germany, 2008, pp. 146–162. 182

[Lau04] Peeter Laud, Symmetric encryption in automatic analyses for confidentiality against active adversaries, Proc. 25th IEEE Symposium on Security and Privacy, 2004, pp. 71–85. 188

[LDB05] Ninghui Li, Wenliang Du, and Dan Boneh, Oblivious signature-based envelope, Distributed Computing 17 (2005), no. 4, 293–302. 147

[lib09] Liberty Alliance project, online, 2009, Available from http://www.projectliberty.org/, accessed in September 2009. 54

[Lip05] Helger Lipmaa, An oblivious transfer protocol with log-squared communication, ISC 2005: Information Security Conference (Singapore) (Jianying Zhou, Javier Lopez, Robert H. Deng, and Feng Bao, eds.), Lecture Notes in Computer Science, vol. 3650, Springer-Verlag, Berlin, Germany, 2005, pp. 314–328. 141

[Lip08] , Private branching programs: On communication-efficient cryptocomputing, Cryptology ePrint Archive, Report 2008/107, 2008, http://eprint.iacr.org/. 189

[LP07] Yehuda Lindell and Benny Pinkas, An efficient protocol for secure two-party computation in the presence of malicious adversaries, Advances in Cryptology – EUROCRYPT 2007 (Barcelona, Spain) (Moni Naor, ed.), Lecture Notes in Computer Science, vol. 4515, Springer-Verlag, Berlin, Germany, 2007. 35

[LRSW99] Anna Lysyanskaya, Ron Rivest, Amit Sahai, and Stefan Wolf, Pseudonym systems, SAC 1999: 6th Annual International Workshop on Selected Areas in Cryptography (Howard M. Heys and Carlisle M. Adams, eds.), Lecture Notes in Computer Science, vol. 1758, Springer-Verlag, Berlin, Germany, 1999. 62, 104, 118

[LT06] Phil Landfried and Hugo Teufel III, Privacy impact assessment for the automated targeting system, US Department of Homeland Security Report, http://www.dhs.gov/xlibrary/assets/privacy/privacy_pia_cbp_ats.pdf, 2006. 138

[Lys02] Anna Lysyanskaya, Signature schemes and applications to cryptographic protocol design, Ph.D. thesis, Massachusetts Institute of Technology, Cambridge, Massachusetts, September 2002. 6, 105, 118

[Mau94] Ueli Maurer, Towards the equivalence of breaking the Diffie-Hellman protocol and computing discrete logarithms, Advances in Cryptology – CRYPTO'94 (Yvo Desmedt, ed.), Lecture Notes in Computer Science, vol. 839, Springer-Verlag, Berlin, Germany, 1994, pp. 271–281. 21

[Mau05] Ueli M. Maurer, Abstract models of computation in cryptography, 10th IMA International Conference on Cryptography and Coding (Cirencester, UK) (Nigel P. Smart, ed.), Lecture Notes in Computer Science, vol. 3796, Springer-Verlag, Berlin, Germany, 2005, pp. 1–12. 20, 249, 251

[MDBP08] Yoni De Mulder, George Danezis, Lejla Batina, and Bart Preneel, Identification via location-profiling in GSM networks, Workshop on Privacy in the Electronic Society – WPES 2008 (Alexandria, VA, USA) (Vijay Atluri and Marianne Winslett, eds.), ACM Press, 2008, pp. 23–32. 6, 137, 152

[MNPS04] Dahlia Malkhi, Noam Nisan, Benny Pinkas, and Yaron Sella, Fairplay –secure two-party computation system, USENIX Security Symposium (SanDiego, CA, USA), USENIX, 2004, pp. 287–302. 188

[MS09] Sebastian Modersheim and Dieter Sommer, A formal model of identity mixer,Tech. report, Technical report, IBM Zurich Research Lab, 2009, 2009. 188

[MvOV96] Alfred J. Menezes, Paul C. van Oorschot, and Scott A. Vanstone, Handbookof applied cryptography, CRC, 1996, http://www.cacr.math.uwaterloo.ca/hac/. 17

[MW98] Ueli Maurer and Stefan Wold, Lower bounds on generic algorithms in groups,Advances in Cryptology – EUROCRYPT’98 (Espoo, Finland) (Kaisa Nyberg,ed.), Lecture Notes in Computer Science, vol. 1403, Springer-Verlag, Berlin,Germany, 1998. 20

[Nac07] D. Naccache, Secure and practical identity-based encryption, InformationSecurity, IET 1 (2007), no. 2, 59–64. 169, 174, 241, 243, 244

[Nao03] Moni Naor, On cryptographic assumptions and challenges, Advances inCryptology – CRYPTO 2003 (Dan Boneh, ed.), Lecture Notes in ComputerScience, vol. 2729, Springer-Verlag, Berlin, Germany, 2003, pp. 96–109. 21

[Nec94] V. I. Nechaev, Complexity of a determinate algorithm for the discretelogarithm, Mathematical Notes 55 (1994), 165–172. 251

[NNA07] Chanathip Namprempre, Gregory Neven, and Michel Abdalla, A study ofblind message authentication codes, IEICE Transactions 90-A (2007), no. 1,75–82. 145

[NP99a] Moni Naor and Benny Pinkas, Oblivious transfer and polynomial evaluation,31st Annual ACM Symposium on Theory of Computing(Atlanta, Georgia,USA), ACM Press, May 1–4, 1999, pp. 245–254. 143

[NP99b] Moni Naor and Benny Pinkas, Oblivious transfer with adaptive queries,Advances in Cryptology – CRYPTO’99 (Michael J. Wiener, ed.), LectureNotes in Computer Science, vol. 1666, Springer-Verlag, Berlin, Germany,1999, pp. 573–590. 7, 143

[NP01] Moni Naor and Benny Pinkas, Efficient oblivious transfer protocols, 12thAnnual ACM-SIAM Symposium on Discrete Algorithms(Washington, DC,USA), ACM-SIAM, January 7–9, 2001, pp. 448–457. 7, 143

[NR97] Moni Naor and Omer Reingold, Number-theoretic constructions of efficientpseudo-random functions, 38th Annual Symposium on Foundations ofComputer Science(Miami Beach, Florida), IEEE Computer Society Press,1997, pp. 458–467. 21


[NSN05] Lan Nguyen and Reihaneh Safavi-Naini, Dynamic k-times anonymous authentication, ACNS 2005: 3rd International Conference on Applied Cryptography and Network Security (New York, NY, USA) (John Ioannidis, Angelos D. Keromytis, and Moti Yung, eds.), Lecture Notes in Computer Science, vol. 3531, Springer-Verlag, Berlin, Germany, 2005, pp. 318–333. 65, 66

[OK04] Wakaha Ogata and Kaoru Kurosawa, Oblivious keyword search, Journal of Complexity 20 (2004), no. 2-3, 356–371. 7, 142, 143, 144, 165, 168

[ope09] OpenID Foundation, online, 2009, Available from http://openid.net/, accessed in September 2009. 54

[OS07a] Rafail Ostrovsky and William E. Skeith III, Private searching on streaming data, Journal of Cryptology 20 (2007), no. 4, 397–430. 7, 143, 147, 167

[OS07b] Rafail Ostrovsky and William E. Skeith III, A survey of single-database private information retrieval: Techniques and applications, PKC 2007: 10th International Workshop on Theory and Practice in Public Key Cryptography (Beijing, China) (Tatsuaki Okamoto and Xiaoyun Wang, eds.), Lecture Notes in Computer Science, vol. 4450, Springer-Verlag, Berlin, Germany, 2007, pp. 393–411. 141, 147

[Pai99] Pascal Paillier, Public-key cryptosystems based on composite degree residuosity classes, Advances in Cryptology – EUROCRYPT '99 (Prague, Czech Republic) (Jacques Stern, ed.), Lecture Notes in Computer Science, vol. 1592, Springer-Verlag, Berlin, Germany, 1999, pp. 223–238. 22, 31, 36

[Ped92] Torben Pryds Pedersen, Non-interactive and information-theoretic secure verifiable secret sharing, Advances in Cryptology – CRYPTO '91 (Joan Feigenbaum, ed.), Lecture Notes in Computer Science, vol. 576, Springer-Verlag, Berlin, Germany, 1992, pp. 129–140. 27, 33

[pri09a] PRIME project, online, 2009, Available from https://www.prime-project.eu/, accessed in September 2009. 3, 10, 54, 189

[pri09b] PrimeLife project, online, 2009, Available from http://www.primelife.eu/, accessed in September 2009. 3, 10

[PW00] Birgit Pfitzmann and Michael Waidner, Composition and integrity preservation of secure reactive systems, ACM CCS 00: 7th Conference on Computer and Communications Security (Athens, Greece) (S. Jajodia and P. Samarati, eds.), ACM Press, 2000, pp. 245–254. 14

[Rab81] Michael O. Rabin, How to exchange secrets by oblivious transfer, Tech. Report TR-81, Harvard Aiken Computation Laboratory, 1981. 7, 143

[RKP09] Alfredo Rial, Markulf Kohlweiss, and Bart Preneel, Universally composable adaptive priced oblivious transfer, Pairing-Based Cryptography – Pairing 2009 (Palo Alto, CA, USA) (Hovav Shacham and Brent Waters, eds.), Lecture Notes in Computer Science, vol. 5671, Springer-Verlag, Berlin, Germany, 2009, pp. 231–247. 154, 187, 215

[RS92] Charles Rackoff and Daniel R. Simon, Non-interactive zero-knowledge proof of knowledge and chosen ciphertext attack, Advances in Cryptology – CRYPTO '91 (Joan Feigenbaum, ed.), Lecture Notes in Computer Science, vol. 576, Springer-Verlag, Berlin, Germany, 1992, pp. 433–444. 30


[Sah08] Amit Sahai, Computing on encrypted data, ICISS 2008: 4th International Conference on Information Systems Security (Hyderabad, India), Lecture Notes in Computer Science, vol. 5352, Springer-Verlag, Berlin, Germany, 2008, pp. 148–153. 189

[SC07] Radu Sion and Bogdan Carbunar, On the practicality of private information retrieval, NDSS 2007: 14th Annual Network and Distributed System Security Symposium (San Diego, California, USA), The Internet Society, 2007, http://www.isoc.org/isoc/conferences/ndss/07/proceedings.shtml. 142

[Sch80] Jacob T. Schwartz, Fast probabilistic algorithms for verification of polynomial identities, Journal of the ACM 27 (1980), no. 4, 701–717. 251

[Sch91] Claus P. Schnorr, Efficient signature generation by smart cards, Journal of Cryptology 4 (1991), no. 3, 239–252. 44, 45

[Sco02] Mike Scott, Authenticated ID-based key exchange and remote log-in with insecure token and PIN number, Cryptology ePrint Archive, Report 2002/164, 2002, http://eprint.iacr.org/. 23

[SCP00] Alfredo De Santis, Giovanni Di Crescenzo, and Giuseppe Persiano, Necessary and sufficient assumptions for non-interactive zero-knowledge proofs of knowledge for all NP relations, ICALP 2000: 27th International Colloquium on Automata, Languages and Programming (Geneva, Switzerland) (Ugo Montanari, Jose P. Rolim, and Emo Welzl, eds.), Lecture Notes in Computer Science, vol. 1853, Springer-Verlag, Berlin, Germany, 2000, pp. 451–462. 27, 41

[Sho97] Victor Shoup, Lower bounds for discrete logarithms and related problems, Advances in Cryptology – EUROCRYPT '97 (Konstanz, Germany) (Walter Fumy, ed.), Lecture Notes in Computer Science, vol. 1233, Springer-Verlag, Berlin, Germany, 1997, pp. 256–266. 20, 249, 251

[SPC95] Markus Stadler, Jean-Marc Piveteau, and Jan Camenisch, Fair blind signatures, Advances in Cryptology – EUROCRYPT '95 (Saint-Malo, France) (Louis C. Guillou and Jean-Jacques Quisquater, eds.), Lecture Notes in Computer Science, vol. 921, Springer-Verlag, Berlin, Germany, 1995, pp. 209–219. 70, 89

[Swe02] Latanya Sweeney, k-anonymity: a model for protecting privacy, International Journal on Uncertainty, Fuzziness and Knowledge-based Systems 10 (2002), no. 5, 557–570. 103

[SWP00] Dawn Xiaodong Song, David Wagner, and Adrian Perrig, Practical techniques for searches on encrypted data, S&P 2000: 21st IEEE Symposium on Security and Privacy (Oakland, California, USA), IEEE Computer Society, 2000, pp. 44–55. 142

[TFS04] Isamu Teranishi, Jun Furukawa, and Kazue Sako, k-times anonymous authentication (extended abstract), Advances in Cryptology – ASIACRYPT 2004 (Jeju Island, Korea) (Pil Joong Lee, ed.), Lecture Notes in Computer Science, vol. 3329, Springer-Verlag, Berlin, Germany, 2004, pp. 308–322. 65

[TP07] Jonathan Trostle and Andy Parrish, Efficient computationally private information retrieval from anonymity or trapdoor groups, Cryptology ePrint Archive, Report 2007/392, 2007, http://eprint.iacr.org/. 142


[TS06] Isamu Teranishi and Kazue Sako, k-times anonymous authentication with a constant proving cost, PKC 2006: 9th International Workshop on Theory and Practice in Public Key Cryptography (New York, NY, USA) (Moti Yung, ed.), Lecture Notes in Computer Science, vol. 3958, Springer-Verlag, Berlin, Germany, 2006, pp. 525–542. 33, 75, 91

[Tsi97] Yiannis S. Tsiounis, Efficient electronic cash: New notions and techniques, Ph.D. thesis, Northeastern University, Boston, Massachusetts, 1997. 70, 89

[Ver01] Eric R. Verheul, Self-blindable credential certificates from the Weil pairing, Advances in Cryptology – ASIACRYPT 2001 (Gold Coast, Australia) (Colin Boyd, ed.), Lecture Notes in Computer Science, vol. 2248, Springer-Verlag, Berlin, Germany, 2001, pp. 533–551. 6

[Ver04] Eric R. Verheul, Evidence that XTR is more secure than supersingular elliptic curve cryptosystems, Journal of Cryptology 17 (2004), no. 4, 277–296. 23

[Wat05] Brent Waters, Efficient identity-based encryption without random oracles, Advances in Cryptology – EUROCRYPT 2005 (Aarhus, Denmark) (Ronald Cramer, ed.), Lecture Notes in Computer Science, vol. 3494, Springer-Verlag, Berlin, Germany, 2005, pp. 114–127. 174, 241, 243

[WBDS04] Brent Waters, Dirk Balfanz, Glenn Durfee, and Diana K. Smetters, Building an encrypted and searchable audit log, NDSS 2004: 11th Annual Network and Distributed System Security Symposium (San Diego, California, USA) (Mike Reiter and Dan Boneh, eds.), The Internet Society, 2004, http://www.isoc.org/isoc/conferences/ndss/04/proceedings/. 7, 168, 169, 170, 179

[web09] Web Services Activity, online, 2009, Available from http://www.w3.org/2002/ws/, accessed in September 2009. 54

[WSC08] Peter Williams, Radu Sion, and Bogdan Carbunar, Building castles out of mud: practical access pattern privacy and correctness on untrusted storage, ACM CCS 08: 15th Conference on Computer and Communications Security (Alexandria, Virginia, USA) (Peng Ning, Paul F. Syverson, and Somesh Jha, eds.), ACM Press, 2008, pp. 139–148. 142

[Yao82] Andrew C. Yao, Protocols for secure computations, 23rd Annual Symposium on Foundations of Computer Science (Chicago, Illinois), IEEE Computer Society Press, 1982, pp. 160–164. 34, 138, 176

[Yao86] Andrew C. Yao, How to generate and exchange secrets, 27th Annual Symposium on Foundations of Computer Science (Toronto, Ontario, Canada), IEEE Computer Society Press, 1986, pp. 162–167. 33


List of Publications

Journals

1. Claudio A. Ardagna, Jan Camenisch, Markulf Kohlweiss, Ronald Leenes, Gregory Neven, Bart Priem, Pierangela Samarati, Dieter Sommer, and Mario Verdicchio, “Exploiting cryptography for privacy-enhanced access control: A result of the PRIME project,” In Journal of Computer Security (JCS), 18(1), 37 pages, 2010. [ACK+10]

2. C. Diaz, E. Kosta, H. Dekeyser, M. Kohlweiss, and G. Nigusse, “Privacy preserving electronic petitions,” In Identity in the Information Society (IDIS) 1(1), 14 pages, 2009. [DKD+09]

International Conferences/Workshops

1. M. Belenkiy, J. Camenisch, M. Chase, M. Kohlweiss, A. Lysyanskaya, and H. Shacham, “Randomizable Proofs and Delegatable Anonymous Credentials,” In Advances in Cryptology - CRYPTO 2009, Lecture Notes in Computer Science, Springer-Verlag, 17 pages, 2009. [BCC+09] (Full version [BCC+08])

2. M. Belenkiy, M. Chase, M. Kohlweiss, and A. Lysyanskaya, “Compact E-Cash and Simulatable VRFs Revisited,” In Pairing-Based Cryptography - Pairing 2009, Lecture Notes in Computer Science, Springer-Verlag, 17 pages, 2009. [BCKL09]

3. Alfredo Rial, Markulf Kohlweiss, and Bart Preneel, “Universally Composable Adaptive Priced Oblivious Transfer,” In Pairing-Based Cryptography - Pairing 2009, Lecture Notes in Computer Science, Springer-Verlag, 24 pages, 2009. [RKP09]

4. J. Camenisch, M. Kohlweiss, and C. Soriente, “An Accumulator Based on Bilinear Maps and Efficient Revocation for Anonymous Credentials,” In Public Key Cryptography, 12th International Workshop on Practice and Theory in Public Key Cryptosystems, PKC 2009, Lecture Notes in Computer Science, Springer-Verlag, 20 pages, 2009. [CKS09]

5. J. Camenisch, M. Kohlweiss, A. Rial, and C. Sheedy, “Blind and Anonymous Identity-Based Encryption and Authorised Private Searches on Public Key Encrypted Data,” In Public Key Cryptography, 12th International Workshop on Practice and Theory in Public Key Cryptosystems, PKC 2009, Lecture Notes in Computer Science, Springer-Verlag, 18 pages, 2009. [CKRS09]


6. C. Andersson, M. Kohlweiss, L. Martucci, and A. Panchenko, “Self-certified Sybil-free Pseudonyms,” In 1st ACM Conference on Wireless Network Security (WiSec 2008), ACM, 6 pages, 2008. [AKMP08b]

7. C. Andersson, M. Kohlweiss, L. Martucci, and A. Panchenko, “A Self-certified and Sybil-Free Framework for Secure Digital Identity Domain Buildup,” In Information Security Theory and Practices. Smart Devices, Convergence and Next Generation Networks, Lecture Notes in Computer Science 5019, Springer-Verlag, 13 pages, 2008. [AKMP08a]

8. M. Belenkiy, M. Chase, M. Kohlweiss, and A. Lysyanskaya, “P-signatures and Non-interactive Anonymous Credentials,” In Theory of Cryptography Conference (TCC 2008), Lecture Notes in Computer Science, Springer-Verlag, 18 pages, 2008. [BCKL08] (Full version [BCKL07])

9. M. Kohlweiss, S. Faust, L. Fritsch, B. Gedrojc, and B. Preneel, “Efficient Oblivious Augmented Maps: Location-Based Services with a Payment Broker,” In Proceedings of Privacy Enhancing Technologies, 7th International Workshop, PET 2007, Lecture Notes in Computer Science 4776, N. Borisov and P. Golle (eds.), Springer-Verlag, 17 pages, 2007. [KFF+07]

10. J. Camenisch, S. Hohenberger, M. Kohlweiss, A. Lysyanskaya, and M. Meyerovich, “How to Win the Clone Wars: Efficient Periodic n-Times Anonymous Authentication,” In Proceedings of the 13th ACM Conference on Computer and Communications Security, CCS 2006, S. De Capitani di Vimercati, V. Shmatikov, and R. N. Wright (eds.), ACM, 11 pages, 2006. [CHK+06]

11. Tobias Kölsch, Lothar Fritsch, Markulf Kohlweiss, and Dogan Kesdogan, “Privacy for Profitable Location Based Services,” In Security in Pervasive Computing, Second International Conference, SPC 2005, Lecture Notes in Computer Science 3450, Dieter Hutter and Markus Ullmann (eds.), 14 pages, Springer-Verlag, 2005.


Curriculum Vitae

Markulf Kohlweiss was born on May 29, 1978 in Wieselburg, Austria. He received the degree of Master in Computer Science (Diplom-Ingenieur, Dipl.-Ing.) from the University of Klagenfurt, Austria. He visited IBM Research Zurich for half a year, starting in November 2002, to write his master's thesis under the supervision of Dr. Jan Camenisch. In September 2004 he started to work as a researcher in the European PRIME project, first in Frankfurt at the Johann Wolfgang Goethe University and then, from October 2005 onwards, in COSIC (Computer Security and Industrial Cryptography) at the Department of Electrical Engineering (ESAT) of the K.U.Leuven, where he also started the work on this thesis. He continues his close collaboration with IBM Research Zurich and Dr. Jan Camenisch, and also visited Prof. Lysyanskaya at Brown University, Providence, USA, for several months in the springs of 2007 and 2008.


Appendix A

Security Proofs

A.1 Security of First Construction of P-Signatures

Theorem 1 Let $F(x) = (h^x, u^x)$, where $u \in \mathbb{G}_1$ and $h \in \mathbb{G}_2$, as required by the IHSDH assumption. The Weak Boneh-Boyen signature scheme is $F$-secure given IHSDH.

Proof. Correctness is straightforward. To prove unforgeability, we create a reduction to the IHSDH assumption. The reduction gets as input $(p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, G, h, H, u)$, where $G = g^x$ and $H = h^x$ for some secret $x$. The reduction sets up the public parameters of the Weak Boneh-Boyen signature scheme as $params = (p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, h)$ and a public key $pk = (H, G)$. To answer a signature query on message $m_\ell$, the reduction sends a query to $\mathcal{O}_w(m_\ell)$ and sends $g^{1/(x+m_\ell)}$ back to the adversary. Eventually, the adversary will output a forgery $(\sigma, y)$, where $\sigma = g^{1/(x+m)}$, $y = F(m) = (h^m, u^m)$, and $m \neq m_\ell$ for all $\ell$. The reduction can then output the HSDH tuple $(\sigma, h^m, u^m)$.
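As a sanity check on the exponent arithmetic (my own toy illustration, not part of the formal proof), the pairing verification of the weak Boneh-Boyen scheme, $e(\sigma, H h^m) = e(g, h)$, collapses in the exponent to $\sigma^{x+m} = g$, which can be tested in an ordinary Schnorr group; the parameters below are illustrative and far too small to be secure.

```python
# Toy check of the weak Boneh-Boyen equation sigma = g^{1/(x+m)}.
# In bilinear groups the verifier checks e(sigma, H * h^m) = e(g, h);
# in the exponent this is sigma^(x+m) = g, which we test directly in
# an order-q subgroup of Z_p^*. Toy parameters, not secure sizes.
q = 1019                # prime order of the subgroup
p = 2 * q + 1           # 2039, also prime
g = 4                   # generator of the order-q subgroup

def sign(x, m):
    # sigma = g^{1/(x+m)}: the inverse is taken modulo the group order q
    return pow(g, pow((x + m) % q, -1, q), p)

def verify(x, m, sigma):
    # exponent form of the pairing equation
    return pow(sigma, (x + m) % q, p) == g

x, m = 123, 45          # illustrative secret key and message
sigma = sign(x, m)
assert verify(x, m, sigma)
assert not verify(x, m + 1, sigma)   # the signature binds to m
```

The three-argument `pow` with a negative exponent (Python 3.8+) computes the modular inverse used for the $1/(x+m)$ exponent.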

Theorem 2 (Security) Our first P-signature construction is secure for the bijection $F(m) = (h^m, u^m)$ given IHSDH and the security of the GS commitments and proofs.

Proof. Correctness follows from correctness of GS proofs.

Signer Privacy. We construct a SimIssue algorithm that, given $params$, a commitment $comm$, and a signature $\sigma$ as input, simulates the adversary's view. SimIssue invokes the simulator of the two-party computation (2PC) protocol to interact with an internal copy of the adversary algorithm. Recall that in two-party computation, the simulator can first extract the input of the adversary: in this case, some $(\rho, m, open)$. If the 2PC protocol aborts or $comm \neq \mathsf{Com}(params, m, open)$, the simulator aborts just as the real issuer would. Otherwise the simulator sends the value $\sigma' = \sigma^{1/\rho}$ to the adversary. If the adversary can determine that he is talking with a simulator, he either breaks the binding property of the commitment scheme or the security of the two-party computation protocol.

User privacy. The P-signature simulator invokes the simulator for the two-party computation protocol. Recall that in two-party computation, the simulator can first extract the input of the adversary (in this case, some $\alpha'$, not necessarily the valid secret key). Then the 2PC simulator requires the target output of the 2PC computation (in this case, the value $x$, which is just a random value that the P-signature simulator can pick itself), and proceeds to interact with the adversary such that if the adversary completes the protocol, its output is $x$. If the adversary can determine that it is talking with a simulator, then we break the security of the two-party computation protocol.

Zero-knowledge. Consider the following algorithms. SimSetup runs $\mathsf{BMGen}(1^k)$ to get $params_{BM} = (p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, h)$ and generates the simulation parameters $params_{GS}$ and trapdoor $sim_{GS}$ for the GS proof system. It then picks $t \leftarrow \mathbb{Z}_p$ and sets $u = g^t$. The final parameters are $params = (params_{GS}, u, z = e(g, h))$ and $sim = (t, sim_{GS})$. Note that the distribution of $params$ is indistinguishable from the distribution output by SigSetup. Also note that using these parameters, the commitments generated by GSCom are perfectly hiding.

SimProve receives $params$, $sim$, and public key $(v, \tilde{v})$, and can use trapdoor $sim$ to create a random P-signature forgery as follows. Pick $s \leftarrow \mathbb{Z}_p$ and compute $\sigma = g^{1/s}$. We implicitly set $m = s - \alpha$. Note that the simulator knows neither $m$ nor $\alpha$. However, it can compute $h^m = h^s/v$ and $u^m = (g^s/\tilde{v})^t$. Now it can use $\sigma$, $h^m$, and $u^m$ to create commitments. The proof $\pi$ is computed in the same way as in the real Prove protocol, using $\sigma$, $h^m$, and $u^m$ and the opening information of the commitments as witnesses. By the witness indistinguishability of the GS proof system, a proof using the faked witnesses is indistinguishable from a proof using a real witness; thus SimProve is indistinguishable from Prove.
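The simulator's fake-witness identities can be checked numerically (again a toy illustration of my own, in a single Schnorr group rather than the two pairing groups): with $u = g^t$, $v = h^\alpha$, $\tilde{v} = g^\alpha$, and $\sigma = g^{1/s}$, the values $h^s/v$ and $(g^s/\tilde{v})^t$ indeed equal $h^m$ and $u^m$ for the implicit $m = s - \alpha$.

```python
# Toy check that the simulator's values h^s / v and (g^s / v~)^t equal
# h^m and u^m for the implicit m = s - alpha. One Schnorr group stands
# in for both pairing groups; all values are illustrative only.
q = 1019
p = 2 * q + 1
g = 4
h = pow(g, 7, p)                     # a second generator of the subgroup

t, alpha, s = 29, 77, 500            # trapdoor, secret exponent, random s
u = pow(g, t, p)
v, v_tilde = pow(h, alpha, p), pow(g, alpha, p)

sigma = pow(g, pow(s, -1, q), p)     # sigma = g^{1/s}
m = (s - alpha) % q                  # implicit message, unknown to simulator

# computed from s, v, v_tilde, and t only -- never from m itself
h_m = pow(h, s, p) * pow(v, -1, p) % p
u_m = pow(pow(g, s, p) * pow(v_tilde, -1, p) % p, t, p)

assert h_m == pow(h, m, p)
assert u_m == pow(u, m, p)
```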

Finally, we need to show that we can simulate proofs of EqComProve given the trapdoor $sim_{GS}$. This follows from the composable zero-knowledge property of EqComProve.

Unforgeability. Consider the following algorithms: $\mathsf{ExtractSetup}(1^k)$ outputs the usual $params$, except that it follows the GS extraction algorithm to generate $params_{GS}$ and a trapdoor $ext$. The trapdoor $ext$ allows the extraction of the values in individual commitments. As GS extraction parameters are identical to standard parameters, the parameters generated by ExtractSetup are indistinguishable from those generated by SigSetup.

$\mathsf{Extract}(params, ext, comm, \pi)$ extracts the values from commitment $comm$ and proof $\pi$ using the GS commitment extractor. If VerifyProof accepts, then $comm = \mathsf{Com}(m)$ and $F(m) = (h^m, u^m)$.

Now suppose we have an adversary that can break the unforgeability of our P-signature scheme for this extractor and this bijection. We create a reduction to break the IHSDH assumption. The reduction gets $(p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, \tilde{X}, h, X, u)$, where $X = h^x$ and $\tilde{X} = g^x$ for some unknown $x$. The reduction runs $\mathsf{ExtractSetup}(p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, h)$ to get $params_{GS}$ and $ext$. It otherwise creates $params$ in the same way as SigSetup (and ExtractSetup). Note that $ext$ lets it open all commitments. The reduction gives $(params, ext, pk = (X, \tilde{X}))$ to the adversary. Whenever the adversary queries $\mathcal{O}_{Sign}$ on $m$, the reduction returns $\sigma \leftarrow \mathcal{O}_x(m)$ and stores $m$ in $Q_{Sign}$.

Eventually, the adversary outputs a proof $\pi$. Since $\pi$ is $f$-extractable and perfectly sound, $\mathsf{Extract}(params, ext, comm, \pi)$ will return $a = h^m$, $b = u^m$, and $\sigma = g^{1/(x+m)}$. Thus we have a valid HSDH tuple, and $m = F^{-1}(a, b)$ will always satisfy VerifySig. We also know that, since VerifyProof accepts, $comm = M_h = \mathsf{Com}(params, m, open)$ for some $open$. Thus, since this is a forgery, it must be the case that $(a, b) = F(m) \notin F(Q_{Sign})$. This means that we never queried $\mathcal{O}_x$ on $m$, and the reduction has generated a fresh HSDH tuple.

A.2 Security of Second Construction of P-Signatures

Theorem A.2.1 Let $F(x) = (h^x, u^x)$, where $u \in \mathbb{G}_1$ and $h \in \mathbb{G}_2$, as in the HSDH and TDH assumptions. Our new signature scheme is $F$-secure given HSDH and TDH.

Proof. Correctness is straightforward. Unforgeability is more difficult. Suppose we try to do a straightforward reduction to HSDH. The reduction will set up the parameters for the signature scheme. Whenever the adversary queries $\mathcal{O}_{Sign}$, the reduction will use one of the provided tuples $(g^{1/(x+c_\ell)}, g^{c_\ell}, v^{c_\ell})$ to construct a signature for input message $m_\ell$. We choose $r_\ell$ such that $c_\ell = m_\ell + \beta r_\ell$. Thus, $C_1 = g^{1/(x+c_\ell)}$, $C_2 = w^{r_\ell}$, and $C_3 = u^{r_\ell}$. (The actual proof will be more complicated, because we do not know $c_\ell$ and therefore cannot calculate $r_\ell$ directly.)

Eventually, the adversary returns $F(m) = (h^m, u^m)$ and a valid signature $(C_1, C_2, C_3)$. Since the signature is valid, we get that $C_1 = g^{1/(x+m+\beta r)}$, $C_2 = w^r = h^{\beta r}$, and $C_3 = u^r$. We can have two types of forgeries. In Type 1, the adversary returns a forgery such that $m + \beta r \neq m_\ell + \beta r_\ell$ for all of the adversary's previously queried messages $m_\ell$, in which case we can easily create a new HSDH tuple. In Type 2, the adversary returns a forgery such that $m + \beta r = m_\ell + \beta r_\ell$. In this case, we cannot use the forgery to construct a new HSDH tuple. Therefore, we divide our proof into two categories: for Type 1 we reduce to the HSDH assumption, and for Type 2 we reduce to the TDH assumption.

Type 1 forgeries: $\beta r + m \neq \beta r_\ell + m_\ell$ for any $r_\ell, m_\ell$ from a previous query. The reduction gets an instance of the HSDH problem $(p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, v, \tilde{v}, h, u, \{C_\ell, H_\ell, U_\ell\}_{\ell=1\ldots q})$, such that $v = h^x$ and $\tilde{v} = g^x$ for some unknown $x$, and for all $\ell$, $C_\ell = g^{1/(x+c_\ell)}$, $H_\ell = h^{c_\ell}$, and $U_\ell = u^{c_\ell}$ for some unknown $c_\ell$. The reduction sets up the parameters of the new signature scheme as $(p, \mathbb{G}_1, \mathbb{G}_2, e, g, h, u, z = e(g, h))$. Next, the reduction chooses $\beta \leftarrow \mathbb{Z}_p$ and calculates $w = h^\beta$ and $\tilde{w} = g^\beta$. The reduction gives the adversary the public parameters and the public key $(v, w, \tilde{v}, \tilde{w})$.

Suppose the adversary's $\ell$th query is to Sign message $m_\ell$. The reduction will implicitly set $r_\ell$ to be such that $c_\ell = m_\ell + \beta r_\ell$. This is one equation with two unknowns, so we know neither $r_\ell$ nor $c_\ell$. The reduction sets $C_1 = C_\ell$. It computes $C_2 = H_\ell/h^{m_\ell} = h^{c_\ell}/h^{m_\ell} = w^{r_\ell}$. Then it computes $C_3 = (U_\ell/u^{m_\ell})^{1/\beta} = (u^{c_\ell}/u^{m_\ell})^{1/\beta} = u^{(c_\ell-m_\ell)/\beta} = u^{r_\ell}$. The reduction returns the signature $(C_1, C_2, C_3)$.
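The algebra of this signing-query simulation can be verified numerically. The toy sketch below (my own illustration, in a single Schnorr group with illustrative values) checks that $H_\ell/h^{m_\ell}$ and $(U_\ell/u^{m_\ell})^{1/\beta}$ equal $w^{r_\ell}$ and $u^{r_\ell}$ for the implicitly defined $r_\ell = (c_\ell - m_\ell)/\beta$.

```python
# Toy check of the Type 1 signing-query simulation: with the implicit
# r_l defined by c_l = m_l + beta * r_l, the reduction's answers
# H_l / h^{m_l} and (U_l / u^{m_l})^{1/beta} equal w^{r_l} and u^{r_l}.
# Single Schnorr group, illustrative parameters only.
q = 1019
p = 2 * q + 1
g = 4
h = pow(g, 7, p)

beta = 321
w = pow(h, beta, p)
u = pow(g, 13, p)

c_l = 888                            # hidden inside the HSDH tuple
H_l, U_l = pow(h, c_l, p), pow(u, c_l, p)
m_l = 222                            # adversary's queried message

# the reduction's computation: it never sees c_l or r_l individually
C2 = H_l * pow(pow(h, m_l, p), -1, p) % p
C3 = pow(U_l * pow(pow(u, m_l, p), -1, p) % p, pow(beta, -1, q), p)

# the implicitly defined randomness r_l = (c_l - m_l) / beta mod q
r_l = (c_l - m_l) * pow(beta, -1, q) % q
assert C2 == pow(w, r_l, p)
assert C3 == pow(u, r_l, p)
```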


Eventually, the adversary returns $F(m) = (F_1, F_2)$ and a valid signature $(C_1, C_2, C_3)$. Since this is a valid $F$-forgery, we get that $F_1 = h^m$, $F_2 = u^m$, $C_1 = g^{1/(x+m+\beta r)}$, $C_2 = w^r = h^{\beta r}$, and $C_3 = u^r$. Since this is a Type 1 forger, we also have that $m + \beta r \neq m_\ell + \beta r_\ell$ for any of the adversary's previous queries. Therefore, $(C_1, F_1 C_2, F_2 C_3^\beta) = (g^{1/(x+m+\beta r)}, h^{m+\beta r}, u^{m+\beta r})$ is a new HSDH tuple.

Type 2 forgeries: $\beta r + m = \beta r_\ell + m_\ell$ for some $r_\ell, m_\ell$ from a previous query. The reduction receives $(p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, h, X, Z, Y, \{\sigma_\ell, c_\ell\})$, where $X = h^x$, $Z = g^x$, $Y = g^y$, and for all $\ell$, $\sigma_\ell = g^{1/(x+c_\ell)}$. The reduction chooses $\gamma \leftarrow \mathbb{Z}_p$ and sets $u = Y^\gamma$. The reduction sets up the parameters of the new signature scheme as $(p, \mathbb{G}_1, \mathbb{G}_2, e, g, h, u, z = e(g, h))$. Next the reduction chooses $\alpha \leftarrow \mathbb{Z}_p$ and calculates $v = h^\alpha$, $w = X^\gamma$, $\tilde{v} = g^\alpha$, $\tilde{w} = Z^\gamma$. It gives the adversary the parameters and the public key $(v, w, \tilde{v}, \tilde{w})$. Note that we set up our parameters and public key so that $\beta = x\gamma$ for the unknown $x$, and $u = g^{\gamma y}$.

Suppose the adversary's $\ell$th query is to Sign message $m_\ell$. The reduction sets $r_\ell = (\alpha + m_\ell)/(c_\ell \gamma)$ (which it can compute). The reduction computes $C_1 = \sigma_\ell^{1/(\gamma r_\ell)} = (g^{1/(x+c_\ell)})^{1/(\gamma r_\ell)} = g^{1/(\gamma r_\ell (x+c_\ell))} = g^{1/(\alpha+m_\ell+\beta r_\ell)}$. Since the reduction knows $r_\ell$, it computes $C_2 = w^{r_\ell}$ and $C_3 = u^{r_\ell}$ and sends $(C_1, C_2, C_3)$ to $\mathcal{A}$.
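A numeric sanity check of this step (my own toy computation in a Schnorr group, with the implicit $\beta = x\gamma$ and illustrative values): raising $\sigma_\ell = g^{1/(x+c_\ell)}$ to $1/(\gamma r_\ell)$ for $r_\ell = (\alpha + m_\ell)/(c_\ell\gamma)$ indeed yields $g^{1/(\alpha+m_\ell+\beta r_\ell)}$.

```python
# Toy check of the Type 2 signing-query simulation: with
# r_l = (alpha + m_l)/(c_l * gamma) and beta = x * gamma, raising
# sigma_l = g^{1/(x+c_l)} to 1/(gamma * r_l) gives g^{1/(alpha+m_l+beta*r_l)}.
# Single Schnorr group, illustrative parameters only.
q = 1019
p = 2 * q + 1
g = 4

x, gamma, alpha = 101, 202, 303      # x stays hidden inside the TDH instance
beta = x * gamma % q                 # implicitly defined; the reduction never computes it

c_l = 404
sigma_l = pow(g, pow((x + c_l) % q, -1, q), p)
m_l = 50

# computable by the reduction from alpha, m_l, c_l, gamma alone
r_l = (alpha + m_l) * pow(c_l * gamma % q, -1, q) % q
C1 = pow(sigma_l, pow(gamma * r_l % q, -1, q), p)

assert C1 == pow(g, pow((alpha + m_l + beta * r_l) % q, -1, q), p)
```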

Eventually, the adversary returns $F(m) = (F_1, F_2)$ and a valid signature $(C_1, C_2, C_3)$. Since this is an $F$-forgery, we get that $F_1 = h^m$, $F_2 = u^m$, and that $C_1 = g^{1/(x+m+\beta r)}$, $C_2 = w^r = h^{\beta r}$, and $C_3 = u^r$. Since this is a Type 2 forger, we also have that $m + \beta r = m_\ell + \beta r_\ell$ for one of the adversary's previous queries. (We can learn $m_\ell$ and $r_\ell$ by comparing $F_1 C_2$ to $h^{m_\ell} w^{r_\ell}$ for all $\ell$.) We define $\delta = m - m_\ell$. Since $m + \beta r = m_\ell + \beta r_\ell$, we also get that $\delta = \beta(r_\ell - r)$. Using $\beta = x\gamma$, we get that $\delta = x\gamma(r_\ell - r)$. We compute $A = F_1/h^{m_\ell} = h^{m-m_\ell} = h^\delta$, $B = u^{r_\ell}/C_3 = u^{r_\ell-r} = u^{\delta/(\gamma x)} = g^{y\delta/x}$, and $C = (F_2/u^{m_\ell})^{1/\gamma} = u^{(m-m_\ell)/\gamma} = u^{\delta/\gamma} = g^{\delta y}$. We implicitly set $\mu = \delta/x$; thus $(A, B, C) = (h^{\mu x}, g^{\mu y}, g^{\mu x y})$ is a valid TDH tuple.
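The extraction algebra for the TDH tuple can likewise be checked numerically (again a toy sketch of my own in a single Schnorr group, with $\beta = x\gamma$, $u = g^{\gamma y}$, and illustrative values):

```python
# Toy check of the Type 2 extraction: a collision m + beta*r = m_l + beta*r_l
# gives delta = m - m_l = beta*(r_l - r), and
# (A, B, C) = (h^delta, u^{r_l - r}, u^{delta/gamma}) equals the TDH tuple
# (h^{mu*x}, g^{mu*y}, g^{mu*x*y}) for mu = delta/x. Illustrative values only.
q = 1019
p = 2 * q + 1
g = 4
h = pow(g, 7, p)

x, y, gamma = 111, 222, 333
beta = x * gamma % q
u = pow(g, gamma * y % q, p)

m_l, r_l, r = 55, 66, 44             # engineer a Type 2 collision:
m = (m_l + beta * (r_l - r)) % q     # m + beta*r = m_l + beta*r_l (mod q)

delta = (m - m_l) % q
A = pow(h, delta, p)
B = pow(u, (r_l - r) % q, p)
C = pow(u, delta * pow(gamma, -1, q) % q, p)

mu = delta * pow(x, -1, q) % q
assert A == pow(h, mu * x % q, p)
assert B == pow(g, mu * y % q, p)
assert C == pow(g, mu * x % q * y % q, p)
```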

Theorem A.2.2 (Security) Our second P-signature construction is secure given HSDH and TDH and the security of the GS commitments and proofs.

Proof. Correctness. VerifyProof will always accept properly formed proofs.

Signer Privacy. We must construct the SimIssue algorithm that is given as input $params$, a commitment $comm$, and a signature $\sigma = (C_1, C_2, C_3)$, and must simulate the adversary's view. SimIssue will invoke the simulator for the two-party computation protocol. Recall that in two-party computation, the simulator can first extract the input of the adversary: in this case, some $(\rho_1, \rho_2, m, open)$. Then SimIssue checks that $comm = \mathsf{Com}(params, m, open)$; if it is not, it terminates. Otherwise, it sends to the adversary the values $(C_1' = C_1^{1/\rho_2}, C_2' = C_2^{1/\rho_1}, C_3' = C_3^{1/\rho_1})$. If the adversary can determine that it is talking with a simulator, we can break either the security property of the commitment scheme or that of the two-party computation.

User Privacy. The simulator will invoke the simulator for the two-party computation protocol. Recall that in two-party computation, the simulator can first extract the input of the adversary (in this case, some $(\alpha', \beta')$, not necessarily the valid secret key). Then the simulator is given the target output of the computation (in this case, the value $x$, which is just a random value that the simulator can pick itself), and proceeds to interact with the adversary such that if the adversary completes the protocol, its output is $x$. Suppose the adversary can determine that it is talking with a simulator. Then it breaks the security of the two-party computation protocol.

Zero-knowledge. Consider the following algorithms. SimSetup runs BMGen to get $params_{BM} = (p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, h)$. It then picks $t \leftarrow \mathbb{Z}_p$ and sets $u = g^t$. Next, it generates the simulation parameters $params_{GS}$ and trapdoor $sim_{GS}$ for the GS proof system. The final parameters are $params = (params_{GS}, u, z = e(g, h))$ and $sim = (t, sim_{GS})$. Note that the distribution of $params$ is indistinguishable from the distribution output by SigSetup. SimProve receives $params$, $sim$, and public key $(v, \tilde{v}, w, \tilde{w})$, and can use trapdoor $sim$ to create a random P-signature forgery as follows. Pick $s, r \leftarrow \mathbb{Z}_p$ and compute $\sigma = g^{1/s}$. We implicitly set $m = s - \alpha - r\beta$. Note that the simulator knows neither $m$ nor $\alpha$. However, it can compute $h^m = h^s/(v w^r)$ and $u^m = u^s/(\tilde{v}^t \tilde{w}^{tr})$. Now it can use $\sigma$, $h^m$, $u^m$, $w^r$, $u^r$ as a witness and construct the proof $\pi$ in the same way as the real Prove protocol. By the witness indistinguishability of the GS proof system, a proof using the faked witnesses is indistinguishable from a proof using a real witness; thus SimProve is indistinguishable from Prove.

Finally, we need to show that we can simulate proofs of EqComProve given the trapdoor $sim_{GS}$. This follows from the composable zero-knowledge property of EqComProve.

Unforgeability. Consider the following algorithms: $\mathsf{ExtractSetup}(1^k)$ outputs the usual $params$, except that it follows the GS extraction algorithm to get alternative $params_{GS}$ and the trapdoor $ext$. The trapdoor allows the extraction of individual GS commitments in $\mathbb{G}_1$ and $\mathbb{G}_2$. As GS extraction parameters are identical to standard parameters, the parameters generated by ExtractSetup are indistinguishable from those generated by SigSetup.

$\mathsf{Extract}(params, ext, comm, \pi)$ extracts the values from commitment $comm$ and the commitments $M_h$, $M_u$ contained in the proof $\pi$ using the GS commitment extractor. If VerifyProof accepts, then $comm = M_h$. Let $F(m) = (h^m, u^m)$.

Now suppose we have an adversary that can break the unforgeability of our P-signature scheme for this extractor and this bijection.

A P-signature forger outputs a proof from which we extract $(F(m), \sigma)$ such that either (1) $\mathsf{VerifySig}(params, pk, m, \sigma) = \mathsf{reject}$, or (2) $comm$ is not a commitment to $m$, or (3) the adversary never queried us on $m$. Since VerifyProof checks a set of pairing product equations, $f$-extractability of the GS proof system trivially ensures that (1) never happens. Since VerifyProof checks that $M_h = comm$, this ensures that (2) never happens. Therefore, we consider the third possibility. The extractor calculates $F(m) = (h^m, u^m)$ where $m$ is fresh. Due to the randomness element $r$ in the signature scheme, we have two types of forgeries. In a Type 1 forgery, the extractor can extract from the proof a tuple of the form $(g^{1/(\alpha+m+\beta r)}, w^r, u^r, h^m, u^m)$, where $m + r\beta \neq m_\ell + r_\ell\beta$ for any $(m_\ell, r_\ell)$ used in answering the adversary's signing or proof queries. The second type of forgery is one where $m + r\beta = m_\ell + r_\ell\beta$ for $(m_\ell, r_\ell)$ used in one of these previous queries. We show that a Type 1 forger can be used to break the HSDH assumption, and a Type 2 forger can be used to break the TDH assumption.


Type 1 forgeries: $\beta r + m \neq \beta r_\ell + m_\ell$ for any $r_\ell, m_\ell$ from a previous query. The reduction gets an instance of the HSDH problem $(p, \mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, e, g, X, \tilde{X}, h, u, \{C_\ell, H_\ell, U_\ell\}_{\ell=1\ldots q})$, such that $X = h^x$ and $\tilde{X} = g^x$ for some unknown $x$, and for all $\ell$, $C_\ell = g^{1/(x+c_\ell)}$, $H_\ell = h^{c_\ell}$, and $U_\ell = u^{c_\ell}$ for some unknown $c_\ell$. The reduction sets up the parameters of the new signature scheme as $(p, \mathbb{G}_1, \mathbb{G}_2, e, g, h, u, z = e(g, h))$. Next, the reduction chooses $\beta \leftarrow \mathbb{Z}_p$, sets $v = X$ and $\tilde{v} = \tilde{X}$, and calculates $w = h^\beta$ and $\tilde{w} = g^\beta$. The reduction gives the adversary the public parameters, the trapdoor, and the public key $(v, w, \tilde{v}, \tilde{w})$.

Suppose the adversary's ℓ-th query is to Sign message m_ℓ. The reduction will implicitly set r_ℓ to be such that c_ℓ = m_ℓ + βr_ℓ. This is an equation with two unknowns, so we do not know r_ℓ and c_ℓ. The reduction sets C_1 = C_ℓ. It computes C_2 = H_ℓ/h^(m_ℓ) = h^(c_ℓ)/h^(m_ℓ) = h^(c_ℓ−m_ℓ) = w^(r_ℓ). Then it computes C_3 = (U_ℓ)^(1/β)/u^(m_ℓ/β) = (u^(c_ℓ))^(1/β)/u^(m_ℓ/β) = u^((c_ℓ−m_ℓ)/β) = u^(r_ℓ). The reduction returns the signature (C_1, C_2, C_3).

Eventually, the adversary returns a proof π. Since π is f-extractable and perfectly sound, we extract σ = g^(1/(x+m+βr)), a = w^r, b = u^r, c = h^m, and d = u^m. Since this is a P-signature forgery, (c, d) = (h^m, u^m) ∉ F(Q_Sign). Since this is a Type 1 forger, we also have that m + βr ≠ m_ℓ + βr_ℓ for any of the adversary's previous queries. Therefore, (σ, c·a, d·b^β) = (g^(1/(x+m+βr)), h^(m+βr), u^(m+βr)) is a new HSDH tuple.
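The exponent algebra behind the assembled HSDH tuple can be sanity-checked in any prime-order group. The following toy Python sketch (all concrete values hypothetical; the quadratic-residue subgroup modulo a small safe prime stands in for the bilinear groups, and no pairing is needed for these identities) verifies that c·a and d·b^β indeed equal h^(m+βr) and u^(m+βr):

```python
# Toy check of the Type 1 forgery algebra: given extracted values
# sigma = g^(1/(x+m+beta*r)), a = w^r, b = u^r, c = h^m, d = u^m,
# the tuple (sigma, c*a, d*b^beta) equals
# (g^(1/(x+m+beta*r)), h^(m+beta*r), u^(m+beta*r)).
P = 1019            # safe prime; its quadratic residues form a group of order q
q = 509             # prime group order (exponents live in Z_q)
g = 4               # generator of the order-q subgroup

h = pow(g, 7, P)    # hypothetical generator h
u = pow(g, 11, P)   # hypothetical generator u
x, beta = 123, 45   # toy secret key x and parameter beta
w = pow(h, beta, P)

m, r = 17, 29       # forged message and randomness (toy values)
inv = pow((x + m + beta * r) % q, -1, q)   # 1/(x+m+beta*r) mod q
sigma = pow(g, inv, P)
a, b = pow(w, r, P), pow(u, r, P)
c, d = pow(h, m, P), pow(u, m, P)

# Assemble the HSDH tuple from the extracted pieces
t2 = (c * a) % P                 # h^m * h^(beta*r) = h^(m+beta*r)
t3 = (d * pow(b, beta, P)) % P   # u^m * u^(beta*r) = u^(m+beta*r)
assert t2 == pow(h, (m + beta * r) % q, P)
assert t3 == pow(u, (m + beta * r) % q, P)
assert pow(sigma, (x + m + beta * r) % q, P) == g   # sigma has the right exponent
```

The asserts mirror the three components of the new HSDH tuple claimed above.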

Type 2 forgeries: βr + m = βr_ℓ + m_ℓ for some (r_ℓ, m_ℓ) from a previous query. The reduction receives (p, G1, G2, GT, e, g, h, X, Z, Y, {σ_ℓ, c_ℓ}), where X = h^x, Z = g^x, Y = g^y, and for all ℓ, σ_ℓ = g^(1/(x+c_ℓ)). The reduction chooses γ ← Zp and sets u = Y^γ. The reduction sets up the parameters of the new signature scheme as (p, G1, G2, e, g, h, u, z = e(g, h)). Next the reduction chooses α ← Zp, and calculates v = h^α, w = X^γ, ṽ = g^α, w̃ = Z^γ. It gives the adversary the parameters, the trapdoor, and the public key (v, w, ṽ, w̃). Note that we set up our parameters and public key so that β is implicitly defined as β = xγ, and u = g^(γy).

Suppose the adversary's ℓ-th query is to Sign message m_ℓ. The reduction sets r_ℓ = (α + m_ℓ)/(c_ℓγ) (which it can compute). The reduction computes C_1 = σ_ℓ^(1/(γr_ℓ)) = (g^(1/(x+c_ℓ)))^(1/(γr_ℓ)) = g^(1/(γr_ℓ(x+c_ℓ))) = g^(1/(α+m_ℓ+βr_ℓ)). Since the reduction knows r_ℓ, it computes C_2 = w^(r_ℓ), C_3 = u^(r_ℓ) and sends (C_1, C_2, C_3) to A.

Eventually, the adversary returns a proof π. Since the proof π is f-extractable and perfectly sound, the reduction can extract σ = g^(1/(x+m+βr)), a = w^r, b = u^r, c = h^m, and d = u^m. Therefore, VerifySig will always accept (m = F^(−1)(c, d), σ, a, b). We also know that if this is a forgery, then VerifyProof accepts, which means that comm = M_h, which is a commitment to m. Thus, since this is a P-signature forgery, it must be the case that (c, d) = (h^m, u^m) ∉ F(Q_Sign). However, since this is a Type 2 forger, we also have that ∃ℓ : m + βr = m_ℓ + βr_ℓ, where m_ℓ is one of the adversary's previous Sign or Prove queries. We implicitly define δ = m − m_ℓ. Since m + βr = m_ℓ + βr_ℓ, we also get that δ = β(r_ℓ − r). Using β = xγ, we get that δ = xγ(r_ℓ − r). We compute: A = c/h^(m_ℓ) = h^(m−m_ℓ) = h^δ, B = u^(r_ℓ)/b = u^(r_ℓ−r) = u^(δ/(γx)) = g^(yδ/x), and C = (d/u^(m_ℓ))^(1/γ) = u^((m−m_ℓ)/γ) = u^(δ/γ) = g^(δy). We implicitly set μ = δ/x; thus (A, B, C) = (h^(μx), g^(μy), g^(μxy)) is a valid TDH tuple.
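The implicit-exponent bookkeeping here (δ = β(r_ℓ − r), μ = δ/x) is easy to get wrong, so a toy check helps. The Python sketch below (hypothetical toy values; a small prime-order subgroup stands in for G1 and G2) verifies that A, B, C as computed above really form (h^(μx), g^(μy), g^(μxy)):

```python
# Toy check of the Type 2 algebra: with beta = x*gamma, u = g^(gamma*y),
# and delta = m - m_l = beta*(r_l - r), the values
# A = c/h^(m_l), B = u^(r_l)/b, C = (d/u^(m_l))^(1/gamma)
# form a TDH tuple (h^(mu*x), g^(mu*y), g^(mu*x*y)) for mu = delta/x.
P, q, g = 1019, 509, 4
x, y, gamma = 3, 5, 7                       # toy secrets
h = pow(g, 13, P)                           # hypothetical second generator
u = pow(g, (gamma * y) % q, P)              # u = g^(gamma*y)
beta = (x * gamma) % q

m_l, r_l = 21, 8                            # values from a previous query
r = 11                                      # adversary's randomness
m = (m_l + beta * (r_l - r)) % q            # Type 2 condition: m + beta*r = m_l + beta*r_l
delta = (m - m_l) % q
mu = (delta * pow(x, -1, q)) % q            # mu = delta/x (implicit in the proof)

b, c, d = pow(u, r, P), pow(h, m, P), pow(u, m, P)   # extracted from the proof
A = (c * pow(pow(h, m_l, P), -1, P)) % P             # c / h^(m_l) = h^delta
B = (pow(u, r_l, P) * pow(b, -1, P)) % P             # u^(r_l - r)
C = pow((d * pow(pow(u, m_l, P), -1, P)) % P, pow(gamma, -1, q), P)

assert A == pow(h, (mu * x) % q, P)          # h^(mu*x)
assert B == pow(g, (mu * y) % q, P)          # g^(mu*y)
assert C == pow(g, (mu * x * y) % q, P)      # g^(mu*x*y)
```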


A.3 Security of Multi-block Signature Scheme

Theorem 3 Let F(m⃗) = (h^(m_1), u^(m_1), …, h^(m_n), u^(m_n)). The above signature scheme is F-secure given the HSDH and TDH assumptions.

We distinguish two types of forgeries. In a Type 1 forgery the signature consists of a tuple (σ_1, σ_2, σ_3) where σ_1 ≠ σ_1^(q) for any σ_1^(q) used in answering the forger's signature queries. We will show how a Type 1 forger can be used to break the HSDH assumption. In a Type 2 forgery, ∃q : σ_1 = σ_1^(q). We will show how a Type 2 forger can be used to break the TDH assumption.

Type 1 forgeries: The reduction gets as input g, ṽ = g^α, u, h, v = h^α, {S_q = g^(1/(α+c_q)), H_q = h^(c_q), U_q = u^(c_q)}_{q=1...Q}, as well as a description of the groups (p, G1, G2, GT). It needs to compute a new tuple (g^(1/(α+c)), h^c, u^c) such that ∀q : c ≠ c_q.

Setup(1^k). The reduction computes paramsGS and gives the adversary public parameters params = (p, G1, G2, GT, g, h, u, paramsGS).

Keygen(params). The reduction picks random β_1, …, β_n ← Zp. It computes, ∀i ∈ [1, n]: w_i = h^(β_i) and w̃_i = g^(β_i). The reduction gives the adversary the public key pk = (v, w⃗, ṽ, w̃⃗). (The secret key is sk = (α, β⃗), though the reduction does not know α.)

OSign(params, (α, β⃗), m⃗). Note that the reduction does not know α and will use elements of the HSDH instance to answer signature queries. At the first sign query, the reduction sets the counter q = 1, and increments it after responding to each sign query. The reduction will implicitly set c_q = r + Σ_{i=1}^n β_i m_i. Thus, r = c_q − Σ_{i=1}^n β_i m_i. However, the reduction knows neither c_q nor r. The reduction computes σ_1, σ_2, σ_3 as follows:

σ_1 = S_q = g^(1/(α+c_q))

σ_2 = H_q / h^(Σ_{i=1}^n β_i m_i) = h^(c_q − Σ_{i=1}^n β_i m_i) = h^r

σ_3 = U_q / u^(Σ_{i=1}^n β_i m_i) = u^(c_q − Σ_{i=1}^n β_i m_i) = u^r.

Forgery. Eventually, the adversary returns a Type 1 forgery F(m⃗) = (h^(m_1), u^(m_1), …, h^(m_n), u^(m_n)) = (A_1, B_1, …, A_n, B_n) and σ_1 = g^(1/(α+r+β_1m_1+⋯+β_nm_n)), σ_2 = h^r, and σ_3 = u^r. We implicitly set c = r + β_1m_1 + ⋯ + β_nm_n. We now have σ_1 = g^(1/(α+c)).


We compute the rest of the HSDH tuple:

A = σ_2 · Π_{i=1}^n A_i^(β_i) = h^r · Π_{i=1}^n h^(m_i β_i) = h^c

B = σ_3 · Π_{i=1}^n B_i^(β_i) = u^r · Π_{i=1}^n u^(m_i β_i) = u^c.

The tuple (σ_1, A, B) is a fresh HSDH tuple because this is a Type 1 forgery (∀q : c ≠ c_q).
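The assembly of A and B is plain exponent arithmetic, so it can be sanity-checked in a toy prime-order group (all values hypothetical; the quadratic-residue subgroup modulo a small safe prime stands in for the bilinear groups):

```python
# Toy check of the multi-block Type 1 assembly: with c = r + sum(beta_i*m_i),
# A = sigma2 * prod(A_i^beta_i) = h^c and B = sigma3 * prod(B_i^beta_i) = u^c.
P, q, g = 1019, 509, 4
h, u = pow(g, 3, P), pow(g, 5, P)            # hypothetical generators
betas = [2, 9, 14]                           # toy beta_1..beta_n
ms = [7, 1, 4]                               # forged message blocks m_1..m_n
r = 6
c = (r + sum(b * m for b, m in zip(betas, ms))) % q

sigma2, sigma3 = pow(h, r, P), pow(u, r, P)
As = [pow(h, m, P) for m in ms]              # A_i = h^(m_i)
Bs = [pow(u, m, P) for m in ms]              # B_i = u^(m_i)

A, B = sigma2, sigma3
for Ai, Bi, bi in zip(As, Bs, betas):
    A = (A * pow(Ai, bi, P)) % P
    B = (B * pow(Bi, bi, P)) % P

assert A == pow(h, c, P)
assert B == pow(u, c, P)
```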

Type 2 forgeries: The reduction gets as input g, G = g^x, u = g^y, h, H = h^x, {c_q, S_q = g^(1/(x+c_q))}_{q=1...Q}, as well as a description of the groups (p, G1, G2, GT). It needs to compute a tuple (h^(μx), g^(μy), g^(μxy)) such that μ ≠ 0.

Setup(1^k). The reduction computes paramsGS and gives the adversary public parameters params = (p, G1, G2, GT, g, h, u, paramsGS).

Keygen(params). The reduction chooses a random t ← {1, …, n} and picks random α, {β_i}_{i=1,i≠t}^n ← Zp. It computes, ∀i ∈ {1, …, n}\{t}: w_i = h^(β_i) and w̃_i = g^(β_i). Then, the reduction picks a random γ ← Zp, and sets w_t = H^γ = h^(xγ) and w̃_t = G^γ = g^(xγ). The reduction gives the adversary the public key pk = (v = h^α, w⃗, ṽ = g^α, w̃⃗). The secret key is sk = (α, β⃗), though the reduction does not know β_t = xγ.

OSign(params, (α, β⃗), m⃗). At the first sign query, the reduction sets the counter q = 1, and increments it after responding to each sign query. The reduction sets c_q = (α + r + Σ_{i=1,i≠t}^n β_i m_i)/(γ m_t), and solves for r. Then it is easy to verify that

(x + c_q)·γ m_t = xγ m_t + α + r + Σ_{i=1,i≠t}^n β_i m_i = α + r + Σ_{i=1}^n β_i m_i.

This is precisely the inverse of the exponent which forms the first part of the signature, σ_1. The reduction sets σ_1 = S_q^(1/(γ m_t)) = g^(1/((x+c_q)γ m_t)). Since the reduction knows r, it computes σ_2 = h^r and σ_3 = u^r.

Forgery. Eventually, the adversary returns a Type 2 forgery F(m⃗) = (h^(m_1), u^(m_1), …, h^(m_n), u^(m_n)) = (A_1, B_1, …, A_n, B_n) and σ_1 = g^(1/(α+r+Σ_{i=1}^n β_i m_i)), σ_2 = h^r, and σ_3 = u^r.

Since this is a Type 2 forgery, the reduction already has a message/signature pair such that r + Σ_{i=1}^n β_i m_i = r^(q) + Σ_{i=1}^n β_i m_i^(q). Since it is a forgery, ∃m_i ∈ m⃗ : m_i ≠ m_i^(q). With probability 1/n, i = t.

That means that m_t ≠ m_t^(q) but β_t m_t + r + Σ_{i=1,i≠t}^n β_i m_i = β_t m_t^(q) + r^(q) + Σ_{i=1,i≠t}^n β_i m_i^(q).

Now, we will implicitly set μ = (m_t^(q) − m_t)γ (note that if we have guessed t correctly, this will be nonzero). We cannot compute this value; however, we will be able to compute h^(μx), g^(μy), g^(μxy) as follows:


Compute the following values:

W_1 = (u^(m_t^(q)) / B_t)^γ = (u^(m_t^(q) − m_t))^γ = u^μ = g^(yμ)

W_2 = Π_{i=1,i≠t}^n (A_i / h^(m_i^(q)))^(β_i) · σ_2 / h^(r^(q)) = h^(Σ_{i=1,i≠t}^n β_i(m_i − m_i^(q))) · h^(r − r^(q)) = h^(β_t(m_t^(q) − m_t)) = h^(xγ(m_t^(q) − m_t)) = h^(xμ)

W_3 = Π_{i=1,i≠t}^n (B_i / u^(m_i^(q)))^(β_i) · (σ_3 / u^(r^(q))) = u^(Σ_{i=1,i≠t}^n β_i(m_i − m_i^(q))) · u^(r − r^(q)) = u^(β_t(m_t^(q) − m_t)) = u^(xγ(m_t^(q) − m_t)) = u^(xμ) = g^(xyμ)

The reduction will output (W_1, W_2, W_3), which will be a valid tuple if this is a Type 2 forgery such that m_t ≠ m_t^(q).
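The cancellation in W_2 (and identically in W_3) can likewise be checked in a toy group. The sketch below uses hypothetical small values, lets β_t play the role of xγ, and derives r from the Type 2 sum condition; the product over i ≠ t then collapses to h^(β_t(m_t^(q) − m_t)):

```python
# Toy check of the W_2 cancellation: when r + sum(beta_i*m_i) = rq + sum(beta_i*mq_i),
# prod_{i!=t}(A_i/h^(mq_i))^beta_i * sigma2/h^(rq) collapses to h^(beta_t*(mq_t - m_t)).
P, q, g = 1019, 509, 4
h = pow(g, 3, P)
n, t = 3, 1                                  # t is the guessed block index
betas = [5, 77, 12]                          # betas[t] plays the role of x*gamma
mq = [4, 9, 2]                               # previous query's message blocks
ms = [6, 30, 2]                              # forgery blocks; may differ in several places
rq = 8
# Type 2 sum condition pins down r: r = rq + sum(beta_i*(mq_i - m_i))
r = (rq + sum(b * (a - c) for b, a, c in zip(betas, mq, ms))) % q

As = [pow(h, m, P) for m in ms]              # A_i = h^(m_i)
sigma2 = pow(h, r, P)

W2 = (sigma2 * pow(pow(h, rq, P), -1, P)) % P
for i in range(n):
    if i != t:
        W2 = (W2 * pow((As[i] * pow(pow(h, mq[i], P), -1, P)) % P, betas[i], P)) % P

assert W2 == pow(h, (betas[t] * (mq[t] - ms[t])) % q, P)
```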

A.4 Security of sVRF Scheme

Theorem 4 This construction with domain size q is a strong sVRF under the q-DDHI assumption for G1 and under the assumption that the Groth-Sahai proof system is secure.

Proof. Correctness and Verifiability follow from the corresponding properties of the WI and NIZK GS proof systems.

Pseudorandomness can be shown via an approach very similar to our proof below for Simulatability, so we will not show it here.

Trapdoor-Indistinguishable Simulatability. We define the following simulator algorithms:

SimSetup(1^k). Let e : G1 × G2 → GT be a bilinear map for groups of prime order p. Let g be a generator of G1, and h a generator of G2. Let (paramsGS, simGS) ← SimSetup(p, G1, G2, GT, g, h) be simulation parameters for a Groth-Sahai NIZK proof system. Output parameters paramsVRF = (p, G1, G2, GT, g, h, paramsGS) and trapdoor t = simGS.

SimGen(paramsVRF). Pick a random seed s ← Zp and random opening information seed, and output sk = (s, seed) and public key pk = GSCom(h^s, seed).

SimProve(paramsVRF, sk = (s, seed), x, y, t). Compute y′ = g^(1/(s+x)) and commitment C = GSCom(y′). Use the NIZK simulator to compute a simulated proof π_1 that C is a commitment to y. Create π_2 as an honest GS witness-indistinguishable proof that C is a commitment to y′ and pk is a commitment to S such that e(y′, S·h^x) = e(g, h). Output π = (π_1, π_2).
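The value y′ = g^(1/(s+x)) is a Dodis-Yampolskiy-style evaluation. A toy Python sketch (hypothetical small parameters; a prime-order subgroup stands in for G1, and no pairing is available) illustrates the exponent arithmetic behind the verification relation:

```python
# Toy evaluation of y = g^(1/(s+x)) in a prime-order subgroup
# (quadratic residues mod the safe prime 1019, order q = 509).
# All parameter values are hypothetical stand-ins.
P, q, g = 1019, 509, 4
s = 42                                        # secret seed

def eval_vrf(x):
    # modular inverse of (s+x) taken in the exponent group Z_q
    return pow(g, pow((s + x) % q, -1, q), P)

y = eval_vrf(10)
# The pairing check e(y', S*h^x) = e(g, h) reduces, in the exponent,
# to (1/(s+x)) * (s+x) = 1; the pairing-free analogue of that check:
assert pow(y, (s + 10) % q, P) == g
```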


Now we need to show that Game Simulated Proofs, when instantiated using this simulator, is indistinguishable from Game Real Proofs. We will do this by considering a series of intermediate games.

First consider the following intermediate simulator algorithm:

HybSimProve(paramsVRF, sk = (s, seed), x, t). Computes y = g^(1/(s+x)), and then proceeds as SimProve: it computes y′ = g^(1/(s+x)) and commitment C = GSCom(y′). It uses the NIZK simulator to compute a simulated proof π_1 that C is a commitment to y. It creates π_2 honestly, as an honest GS witness-indistinguishable proof that C is a commitment to y′ and pk is a commitment to S such that e(y′, S·h^x) = e(g, h). It outputs π = (π_1, π_2).

Game Hybrid Sim 1. Runs as Game Real Proofs except that it uses HybSimProve instead of Prove: (P, t) ← SimSetup(1^k), (pk, sk) ← G(P), and then A(P, t, pk) gets access to the following oracle Real: on query x, Real returns y = Eval(P, sk, x) and π ← HybSimProve(P, sk, x, t).

Game Hybrid Sim 1 is indistinguishable from Game Real Proofs by the zero-knowledge property of the GS NIZK and by the perfect WI property of the GS WI proofs.

Now consider a second intermediate game:

Game Hybrid Sim 2. Runs as Game Simulated Proofs except that it computes y correctly: (P, t) ← SimSetup(1^k), (pk, sk = (s, seed)) ← SimGen(P, t), and then A(P, t, pk) gets access to the following oracle S: on query x, S (1) checks if x has previously been queried, and if so, computes π ← SimProve(P, sk, x, y, t) for the stored y and returns (y, π); (2) otherwise, it computes y = g^(1/(s+x)) and π ← SimProve(P, sk, x, y, t), returns (y, π), and stores y.

First note that Game Hybrid Sim 2 is identical to Game Hybrid Sim 1. Next, we can see that by Theorem 5.2.1, Game Hybrid Sim 2 is indistinguishable from Game Simulated Proofs. Thus, Game Simulated Proofs is indistinguishable from Game Real Proofs, and the simulatability property holds.

A.5 Security of Our Compact E-Cash Scheme

This proof makes use of the security definitions of P-signatures in Section 4.1 and the standard security notions of pseudo-random functions and zero-knowledge proof systems. We refer the reader to Section 4.1 for a definition of the Signer privacy, User privacy, Correctness, Unforgeability, and Zero-knowledge properties of a P-signature scheme


and the corresponding simulator protocols (SimIssue, SimObtain, SimSetup, SimProve) and extraction algorithms (ExtractSetup, Extract) of a P-signature scheme.

Theorem A.5.1 The e-cash scheme presented in Section 5.3 is a secure e-cash scheme given the security of the P-signature scheme, the PRF, and the NIZK proof system.

Proof. We need to prove that CashSetup, BankKG, UserKG, SpendCoin, VerifyCoin, Deposit, Identify, and the interactive protocol Withdraw fulfill the Correctness, Anonymity, Balance, and Identification properties.

Correctness. Correctness is straightforward.

Anonymity. Consider the following simulator S = (SimCashSetup, SimSpend):

SimCashSetup(1^k). Runs SimSetup(1^k) to obtain (params, sim). Our construction is non-black-box: we reuse the GS NIZK proof system parameters paramsGS contained in params and the GS NIZK simulation parameters simGS contained in sim. The parameters paramsGS in turn contain the setup for a bilinear pairing paramsBM = (p, G1, G2, GT, e, g, h) for a pairing e : G1 × G2 → GT over groups of prime order p. The algorithm returns (params, sim).

SimSpend(params, sim, pkI, pkM, info).

• The simulator uses SimProve(params, sim, pk_w, 3) to compute ((C_id, C_s, C_t), π_1).
• The simulator uses SimProve(params, sim, pk_c, 1) to compute (C_J, π_2).
• The simulator picks a random serial number and double-spending tag S, T ← G1 and simulates the non-interactive zero-knowledge proofs π_S and π_T using the zero-knowledge simulator for L_S and L_T.

We consider a sequence of six games:

Game 1. Corresponds to the game A plays when interacting with OSpend(params, pkI, ·, ·).
Game 2. As Game 1, except that CashSetup is replaced by SimCashSetup to obtain sim.
Game 3. As Game 2, except that the oracle uses sim and SimProve to compute ((C_id, C_s, C_t), π_1).
Game 4. As Game 3, except that the oracle uses sim and SimProve to compute (C_J, π_2).
Game 5. As Game 4, except that the oracle uses simGS and the zero-knowledge simulator for languages L_S and L_T.
Game 6. As Game 5, except that S and T are now chosen at random. This corresponds to the game with OSimSpend(params, pkI, ·, ·, ·).

Games 1 and 2 are indistinguishable by the properties of the P-signature scheme.

A non-negligible probability of distinguishing between Games 2 and 3, or between Games 3 and 4, allows us to break the zero-knowledge property of the P-signature scheme. A non-negligible probability of distinguishing between Games 4 and 5 breaks the zero-knowledge property of the proof system.


A non-negligible probability of distinguishing between Games 5 and 6 allows us to break the pseudorandomness of the PRF through the following reduction. The reduction either gets oracle access to two pseudo-random functions PRF_s(·) and PRF_t(·) or to two random functions.¹ Given the simulators, it can simulate all the rest of the spend protocol without knowing s and t. In one case it plays Game 5; in the other, Game 6. If a distinguisher can distinguish between the two games, we can break the pseudorandomness of the PRF.

As Game 6 corresponds to the view generated by the simulator, the success probability of an adversary in breaking the anonymity property is bounded by the sum of the distinguishing advantages in the above games. This advantage is negligible.

Balance. A successfully deposited coin can be parsed as coin = (S, (T, C_id, C_s, C_t, C_J, π_1, π_2, π_S, π_T), idM‖info). We consider multiple games.

Game 1. The first game is the same as the balance definition with the oracles using the real protocol.
Game 2. As Game 1, except that in CashSetup the algorithm SigSetup is replaced with ExtractSetup to obtain td.
Game 3. As Game 2, except that in ODeposit the game checks every deposited coin and uses td and the Extract algorithm to extract y_s, y_t, and y_id from C_id, C_s, C_t, π_1 and y_J from C_J, π_2. It aborts if the triple (y_s, y_t, y_J) already appeared in a previously deposited coin.
Game 4. As Game 3, except that it also aborts if the value y_J is not in {F(1), …, F(n)}.
Game 5. As Game 4, except that it aborts if the number of deposited coins with different (y_s, y_t) pairs is greater than the number of withdrawals.

Games 1 and 2 are indistinguishable as SigSetup and ExtractSetup are indistinguishable.

Games 2 and 3 are indistinguishable because Game 3 aborts only with negligible probability. An abort can occur only if one of F^(−1)(y_s), F^(−1)(y_t), F^(−1)(y_J) does not correspond to the opening of C_s, C_t, C_J, in which case we have found a forgery for one of the two P-signatures, or if we broke the soundness of the proof system used to prove the languages L_S and L_T. We guess which of the three options is the case to do a reduction and break the unforgeability of the P-signature scheme or the soundness of the proof system.

A distinguisher between Games 3 and 4 allows us to break the unforgeability of the P-signature scheme, as the only time Game 4 aborts is when it obtains a signature on a value F^(−1)(y_J) > n. As such a J value was never signed, we obtain a P-signature forgery.

A distinguisher between Games 4 and 5 allows us to break the unforgeability of the P-signature scheme, as the number of different (y_s, y_t) pairs with corresponding signatures is greater than the number of correctly generated signatures output by OWithdraw. We create a reduction and guess which (y_s, y_t) is the forgery to break P-signature unforgeability with probability at least 1/(withdrawals + 1).

¹ A standard hybrid argument can be used to show that S and T can be replaced one after the other.


In Game 5, we can bound the number of successful deposits to at most withdrawals · n. The success probability of A is bounded by the sum of the distinguishing probabilities between Games 1 to 5. This probability is negligible.

Identification. A successful adversary A in the identification game outputs two coins (coin_1, coin_2) that verify and have the same serial number S but different idM‖info. We consider multiple games.

Game 1. Is the same as the original security game.
Game 2. As Game 1, but in CashSetup the algorithm SigSetup is replaced with ExtractSetup to obtain td.
Game 3. As Game 2, but the game parses the coins coin_1 = (S, (T, C_id, C_s, C_t, C_J, π_1, π_2, π_S, π_T), idM_1‖info_1) and coin_2 = (S, (T′, C′_id, C′_s, C′_t, C′_J, π′_1, π′_2, π′_S, π′_T), idM_2‖info_2) obtained from the adversary and uses td and the Extract algorithm to extract y_id, y_s, y_t, y_J from (C_id, C_s, C_t, π_1), (C_J, π_2) and y′_id, y′_s, y′_t, y′_J from (C′_id, C′_s, C′_t, π′_1), (C′_J, π′_2). Game 3 aborts if the values y_J or y′_J are not in {F(1), …, F(n)}.
Game 4. As Game 3, but it also aborts if y_s ≠ y′_s.
Game 5. As Game 4, but it also aborts if y_J ≠ y′_J.
Game 6. As Game 5, but it aborts if y_t ≠ y′_t or y_id ≠ y′_id.

Games 1 and 2 are indistinguishable as SigSetup and ExtractSetup are indistinguishable.

A distinguisher between Games 2 and 3 allows us to break the unforgeability of the P-signature scheme, as the only time Game 3 aborts is when it obtains a signature on a value F^(−1)(y_J) > n. As such a J value was never signed, we obtain a P-signature forgery.

Games 3 and 4 are indistinguishable. The abort in Game 4 can only happen with probability greater than n²/p (roughly the probability of a collision if S were computed by a random function) in one of the following four cases: (i. and ii.) one of F^(−1)(y_s), F^(−1)(y′_s) does not correspond to the opening of C_s, C′_s respectively (in this case we can find a forgery for the P-signature scheme), iii. we break the soundness of the proof system for language L_S, or iv. we break the pseudorandomness of the PRF.

The reduction to the pseudorandomness works as follows: we have a polynomially sized domain, so an adversary in the pseudorandomness game can compute the output on every element in the domain. By pseudorandomness, this should look like a completely random polynomially sized subset of G1. So the probability that two randomly chosen seeds will produce intersecting ranges should be negligible. Otherwise we can build a reduction which breaks the pseudorandomness property without even seeing the seed: we are given oracle access to the PRF (or a random function). We query it on all points in the domain. Then we choose another random seed and compute the PRF on all points in the domain. If there is an intersection with the first set, we output "pseudo-random"; otherwise we output "random".
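The range-intersection distinguisher just described can be sketched directly. The Python below is a toy stand-in (the DY-style function, domain, and parameters are all hypothetical and far too small to be secure); it only illustrates the structure of the reduction, not a real attack:

```python
# Sketch of the range-intersection distinguisher: query the oracle on the whole
# polynomial-size domain, evaluate a freshly seeded PRF on the same domain, and
# output "pseudo-random" iff the two ranges intersect.
import random

P, q, g = 1019, 509, 4
domain = range(1, 20)                        # toy polynomially sized domain

def dy_prf(seed, x):
    # toy DY-style PRF F_s(x) = g^(1/(s+x)); seed+x stays nonzero mod q below
    return pow(g, pow((seed + x) % q, -1, q), P)

def distinguish(oracle):
    # fresh seed chosen so that seed + x never hits 0 mod q on this domain
    fresh_seed = random.randrange(1, q - len(domain) - 1)
    oracle_range = {oracle(x) for x in domain}
    fresh_range = {dy_prf(fresh_seed, x) for x in domain}
    return "pseudo-random" if oracle_range & fresh_range else "random"
```

With a real PRF family, intersecting ranges across two random seeds would translate directly into distinguishing advantage, which is why the collision event in Game 4 must be negligible.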

Note that n < poly(k) and thus Games 3 and 4 are indistinguishable; otherwise we guess which of the four cases mentioned above holds and do the appropriate reduction.


Games 4 and 5 are indistinguishable for the same reason, except that the probability of a collision for perfectly random functions is replaced by the probability of distinguishing a random function from a random permutation given only n < poly(k) queries.

Games 5 and 6 are indistinguishable as aborts in Game 6 occur only with negligible probability. In Game 6 we abort if y_t ≠ y′_t or y_id ≠ y′_id. As the seed s is chosen at random, it is highly unlikely that two withdrawn wallets contain the same seed. Consequently y_t = y′_t and y_id = y′_id, or we break the unforgeability of the P-signature scheme.

In Game 6 the probability of e((T/T′)^(1/(idM_1‖info_1 − idM_2‖info_2)), h) ∉ DB_T is bounded by the soundness error of the proof protocol or the probability that y_id was never signed by the bank or does not correspond to the commitment C_id. If the probability of the first is non-negligible we break the soundness of the proof protocol; if the probability of the latter is non-negligible we break the unforgeability of our P-signature scheme.

As Games 1 to 6 are computationally indistinguishable, A's success probability in the real game is also negligible.

Weak Exculpability. A successful adversary A in the weak exculpability game outputs two coins (coin_1, coin_2) that verify and have the same serial number S but different idM‖info.

While A knows the user's public key pk_U = e(g^(sk_U), h), it is hard to compute g^(sk_U) from pk_U alone without knowing sk_U. As no additional information about g^(sk_U) is revealed until U reuses an e-token, an adversary computing g^(sk_U) can be used to break the asymmetric computational Diffie-Hellman (asymmetric CDH) assumption: given random g^a, h^b, compute g^(ab). Asymmetric CDH is implied by q-DDHI and DLIN.

More formally we define a sequence of games to eliminate all sources of information thatan adversary may potentially have about the honest user’s key besides the public keylearned through a OGetKey query:

Game 1. Here the adversary plays the same game as in the weak exculpability definition.
Game 2. As Game 1, but CashSetup is replaced with SimCashSetup.
Game 3. As Game 2, but in OWithdraw the algorithm ObtainSig is replaced with the P-signature simulator SimObtain.
Game 4. As Game 3, but in OSpend the algorithm SpendCoin is replaced with SimSpend.

Games 1 and 2 are indistinguishable by the properties of the P-signature scheme (and the anonymity of the e-cash scheme itself). If Games 2 and 3 can be distinguished, we break the user privacy of the P-signature scheme. Games 3 and 4 are indistinguishable based on the anonymity of the e-cash scheme.

In order to break the asymmetric CDH assumption we do the following reduction. We answer a random OGetKey query with e(g^a, h^b). Then we use the simulators SimCashSetup, SimObtain, and SimSpend to simulate all interactions with the adversary where this user is involved. A successful adversary outputs g^(ab) as (T/T′)^(1/(idM_1‖info_1 − idM_2‖info_2)).


A.6 Security of e-Token Scheme

Theorem 5 Protocols IKeygen, UKeygen, Obtain, Show, and Identify described above achieve the soundness, identification, and anonymity properties in the plain model assuming Strong RSA, and y-DDHI if ℓ_x ∈ O(log k) or SDDHI otherwise.

Proof. Soundness. Informally, in our system, tokens are unforgeable because each token serial number (TSN) is a deterministic function F_(g,s)(c(0, t, J)) of the seed s, the time period t, and J ∈ [0, n − 1]. Thus, there are only n valid TSNs per time period, and since a user must provide a ZK proof of validity for each token, showing n + 1 or more tokens per time period requires, by the pigeonhole principle, that two shows use the same TSN.
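The counting argument can be made concrete with a toy sketch (the encoding c and the DY-style PRF stand-in below are hypothetical illustrations, not the scheme's actual instantiation): fixing s and t, the J values enumerate exactly n valid serial numbers.

```python
# Toy illustration: token serial numbers are a deterministic function of
# (s, t, J) with J in [0, n-1], so a period t admits exactly n valid TSNs.
P, q, g = 1019, 509, 4
n, s, t = 5, 42, 3

def c(kind, t, j):
    # toy injective encoding c(kind, t, J) into Z_q
    return (kind * 10000 + t * 100 + j) % q

def F(seed, x):
    # toy DY-style PRF F_(g,seed)(x) = g^(1/(seed+x))
    return pow(g, pow((seed + x) % q, -1, q), P)

tsns = {F(s, c(0, t, J)) for J in range(n)}
assert len(tsns) == n    # n distinct valid TSNs for period t
# showing n+1 tokens in period t must therefore reuse one of these TSNs
```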

More formally, we will describe a knowledge extractor E that, after executing u Obtain protocols with an adversary A acting on behalf of all malicious users, can output functions f_1, …, f_u that allow computing all possible token serial numbers that A could output in any given time period t. Let n be the number of shows allowed per time period. Our extractor E operates as follows:

1. In step one of Obtain, E behaves as an honest issuer and agrees on an extended Pedersen commitment C = g^(sk_i)·g̃^s·h^r = PedCom(sk_i, s; r) with A, where sk_i is whatever secret key A chooses to use and s is the PRF seed.

2. In step two, E must run the CL signing protocol with A to provide A with a signature on (sk_i, s). As part of the CL protocol, A is required to prove knowledge of (α, β, γ) such that C = g^α·g̃^β·h^γ. There are a number of ways to guarantee that this proof of knowledge is extractable; in this step, E employs one of the methods of CL to extract the secrets (sk_U, s) from A. (Here we will enforce that Obtain protocols must be run sequentially, so that rewinding does not become a problem.)

3. E outputs the function f_i as the description of the DY PRF (or whatever PRF is alternatively used) together with the seed s.

Since E knows the value s used for every dispenser, it can calculate the token serial number S := F_(g,s)(c(0, t, J)). The CL signature scheme and its protocols are secure under the Strong RSA assumption.

Identification of Violators. Suppose (S, E, R) and (S, E′, R′) are the result of two Show protocols with honest verifier(s). Since the verifier(s) were honest, it is the case that R ≠ R′ with high probability, since an honest verifier chooses R ∈ Z*_q at random. Due to the soundness of the ZK proof of validity, it must be the case that E = pk_U · F_(g,s)(c(1, t, J))^R and E′ = pk_U · F_(g,s)(c(1, t, J))^(R′) for the same values of s, t, J, and pk_U. Thus, the violator's public key can be computed as follows:

(E^(1/R) / E′^(1/R′))^(RR′/(R′−R)) = (g^(sk_U/R) · F_(g,s)(c(1, t, J)) / (g^(sk_U/R′) · F_(g,s)(c(1, t, J))))^(RR′/(R′−R)) = (g^(sk_U(1/R − 1/R′)))^(RR′/(R′−R)) = g^(sk_U) = pk_U.
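This identification formula is purely exponent arithmetic and can be verified end to end in a toy prime-order group (hypothetical small parameters; the subgroup of quadratic residues modulo a small safe prime stands in for G):

```python
# Toy check of the violator-identification formula: from two shows
# E = pk * F^R and E' = pk * F^(R') with pk = g^sk and the same PRF value F,
# (E^(1/R) / E'^(1/R'))^(R*R'/(R'-R)) recovers pk. All values hypothetical.
P, q, g = 1019, 509, 4
sk = 77
pk = pow(g, sk, P)
F = pow(g, 123, P)                           # F_(g,s)(c(1,t,J)), same in both shows
R, Rp = 5, 9                                 # honest verifiers' random challenges

E = (pk * pow(F, R, P)) % P
Ep = (pk * pow(F, Rp, P)) % P

# all exponent inverses are taken mod the group order q
lhs = (pow(E, pow(R, -1, q), P) * pow(pow(Ep, pow(Rp, -1, q), P), -1, P)) % P
exp = (R * Rp * pow((Rp - R) % q, -1, q)) % q
assert pow(lhs, exp, P) == pk
```

The division by R and R′ in the exponent is why the definition needs R ≠ R′; with R = R′ the exponent RR′/(R′ − R) is undefined.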


To be explicit with respect to our definition of this property, the value sU := pkU and thefunction φ is the identity.

Anonymity. Informally, the intuition for anonymity is that the issuer does not learn the PRF seed during Obtain. Then showing a token in Show consists of releasing (S, E), where S and E are functions of the PRF (indistinguishable from random), and a zero-knowledge (ZK) proof that the token is valid (reveals one bit). Thus, if a user is honest, nothing about her identity is revealed by two random-looking numbers and a ZK proof.

More formally, we will describe a simulator S which an adversary A cannot distinguish from an honest user during the Show protocol. Recall that A, playing the role of a coalition of adversarial issuer and verifiers, first runs the Obtain protocol u times with honest users, who then obtain a dispenser D. Let the number of allowed shows per time period be n.

Now, at some point, A outputs a value j ∈ [1, u] and a time period t. A will now execute the Show protocol with either the honest user U that holds the token dispenser D_j at time t or with the simulator S, whose input is only the global parameters params, t, n, and the issuer's public key pkI. To impersonate an unknown user, S behaves as follows:

1. In steps one and two of the Show protocol, S does nothing.
2. In step three, S sends to A random values (S, E) ∈ G^2.
3. In step four, S simulates a proof of knowledge of (z, s, J, σ) for the statements:
(a) 0 ≤ J < n,
(b) S = F_(g,s)(c(0, t, J)),
(c) E = g^z · F_(g,s)(c(1, t, J))^R,
(d) VerifySig(pkI, (z, s), σ) = accept.

Proving this statement in the honest setting follows the standard discrete-logarithm-based Σ-protocol, as we detailed in Section 6.2. Thus, for this step, S can simulate this Σ-protocol in the two standard ways: (1) rewind the adversary (interactive proof) or (2) use its control over the random oracle (non-interactive proof). To prevent any rewinding difficulties, Show protocols should be executed sequentially.

This simulator's behavior is indistinguishable from a user with dispenser D_j. The ZK proof is standard. The random values (S, E) are indistinguishable from the user's real (S′, E′) due to the security of the DY PRF F_(·), which relies on y-DDHI (for small system parameters) and otherwise on SDDHI. Specifically, SDDHI is required whenever the parameter ℓ_x becomes super-logarithmic, due to a technicality in the original proof of security for the DY PRF.

A.7 Security of Delegatable Credential Scheme

Claim A.7.1 (Correctness) The delegatable credential construction given in Section 7.3 satisfies the Correctness property.


Proof. (a). Note that Obtain aborts if the credproof that it receives from the issuer does not pass CredVerify. Then Obtain only completes without aborting if the two-party computation completes successfully. In this case, by the security of the two-party computation, we are certain that π_L is indistinguishable from an honestly generated proof of knowledge of an honestly generated authenticator. Then by the correctness of the NIZKPK system and the correctness of the authentication scheme, we know that π_L will pass the NIZKPK verification. Thus, the new cred that results from combining credproof with π_L will pass the NIZKPK verification. Furthermore, by the correctness of rerandomization, any rerandomization of this proof will pass the NIZKPK verification and thus CredVerify, so cred is proper.

(b). Now, if Issue is interacting with Obtain with the appropriate inputs, then it will produce a credproof that will be accepted by the honest user. Then the two-party computation will be successful by the 2PC correctness property. That means Obtain will not abort, so by property (a) it will produce a proper credential.

(c). If the cred is not a proper credential, then by the properties of the rerandomization, CredVerify will not accept the resulting credproof, so Issue will abort. Issue also aborts if NymU is not valid, or if skI, openI do not satisfy VerifyAux.

(d). Note that if cred passes CredVerify, then by the correctness of rerandomization, it must be a proper credential. Thus, if cred is not a proper credential, or if (Nym, sk, aux(Nym)) does not pass VerifyAux, then CredProve aborts. If both of these are correct, then credproof is a rerandomization of a proper credential. That means that the proof must verify, so CredVerify will accept.

(e). Follows from the definition of VerifyAux and Nymgen.

Claim A.7.2 (Anonymity) The delegatable credential construction given in Section 7.3 satisfies the Anonymity property under the assumption that our building blocks are secure.

Proof. Define the simulator algorithms as follows:

SimSetup(1^k). Uses AuthSetup(1^k) to generate paramsA for an F-unforgeable certification-secure authentication scheme and then uses SimSetup to choose corresponding paramsP for a rerandomizable commitment scheme with a partially extractable randomizable composable NIZKPK proof system, and to choose the appropriate trapdoor sim_NIZK. Finally, the setup chooses a collision-resistant hash function H whose range is the message space of the authentication scheme and outputs public parameters paramsDC = (paramsA, paramsP, H), sim = sim_NIZK.

SimProve(paramsDC, sim, NymO, Nym, L, flag). If flag = reject, abort and return ⊥. Otherwise let Nym_0 = NymO and Nym_L = Nym, generate commitments Nym_1, …, Nym_{L−1} to random values, and compute r_1, …, r_L where r_i = H(NymO, i).


Then we use the NIZKPK simulator and the simulation trapdoor sim to simulate proofs π_1, …, π_L, where π_i is a simulated proof of the form

π_i ← SimNIZKPK{(F(sk_{i−1}), F(sk_i), auth) : sk_{i−1} in Nym_{i−1} ∧ sk_i in Nym_i ∧ VerifyAuth(params, sk_{i−1}, (sk_i, r_i), auth) = accept}.

Finally, we output Nym_0, …, Nym_L and π_1, …, π_L.

SimObtain(paramsDC, NymO, NymU, NymA, L, flag). If flag = reject, abort and return ⊥. It then proceeds as follows:

It then proceeds as follows:

1. Receives credproof from the adversary.
2. Runs CredVerify(paramsDC, NymO, credproof, NymI, L) to check that credproof is correct. If the checks fail, it aborts. Otherwise, it computes r_L = H(NymO, L), the committed message comm_{m_1} = (NymU), and the public message m_2 = r_L.
3. Now we must simulate the two-party computation protocol. We will do this by using the 2PC simulator which interacts with a corrupt signer. (Note that this simulator expects no input from the trusted functionality.)

SimIssue(paramsDC, sim, NymO, NymD, NymA, L, flag). If flag = reject, abort and return ⊥. Otherwise let Nym0 = NymO, NymL = NymD and NymL+1 = NymA, generate commitments Nym1, . . . , NymL−1 to random values, and compute r1, . . . , rL+1, where ri = H(NymO, i). Then we use the NIZKPK simulator and the simulation trapdoor sim to simulate proofs π1, . . . , πL+1, where πi is a simulated proof of the form

πi ← SimNIZKPK((F(sk_{i−1}), F(sk_i), auth) : sk_{i−1} in Nym_{i−1} ∧ sk_i in Nym_i ∧ VerifyAuth(params, sk_{i−1}, (sk_i, r_i), auth) = accept).

1. Send Nym0, . . . , NymL, π1, . . . , πL to the adversary.

2. Receive commm1 and check that it matches NymA.

3. Now we must simulate the two-party computation protocol. We do this by using the 2PC simulator which interacts with a corrupt recipient. Note that this simulator expects to be provided with a proof of knowledge of the appropriate authenticator. Thus, we give it the proof πL+1 computed above.

Now we will prove that these algorithms satisfy the required properties:

(a). Holds by the composable zero-knowledge properties of the underlying NIZK proof system.

(b). Holds by the hiding property of the underlying composable commitment scheme.


(c). Note that the difference between SimProve and CredProve is that SimProve generates Nym1, . . . , NymL−1 as random commitments and uses SimNIZKPK to generate simulated proofs π1, . . . , πL. These commitments are indistinguishable from the honest commitments by the strong computational hiding property, and the simulated proofs are indistinguishable from the honestly rerandomized proofs by the randomization property of the randomizable proof system.

(d). Note that the difference between SimObtain and Obtain is that SimObtain uses the simulator to run the two-party computation. This should be indistinguishable from the honest Obtain protocol by the security of the 2PC.

(e). SimIssue differs in three ways from Issue. First, the initial credproof that is sent to the user is formed using SimNIZKPK instead of by rerandomizing a real cred. Second, SimIssue uses the simulator to run the two-party computation. Third, the resulting πL+1 is the output of SimNIZKPK and not the result of a valid NIZKPK of an Auth computed by the 2PC.

We can prove that SimIssue and Issue are indistinguishable by considering several hybrid algorithms.

Hybrid 1 will be given the same input as Issue. It will verify that cred is proper, that it has been given a correct sk, Nym, aux(Nym), and that NymA is a valid pseudonym. It will compute credproof honestly. Then it will use the 2PC simulator to run the two-party protocol. This simulator will extract skA, aux(NymA) from the user, and expect to receive a corresponding authenticator proof. We will use sk to form an authenticator on skA, rL+1, and then we will use the honest NIZKPK to generate πL+1 from it. Finally, we will pass πL+1 to the 2PC simulator, which will complete the protocol.

Note that Hybrid 1 is indistinguishable from the game involving the real Issue by the security of the 2PC.

Hybrid 2 will be given the same input as Issue. It will verify that cred is proper, that it has been given a correct sk, Nym, aux(Nym), and that NymA is a valid pseudonym. It will compute credproof honestly. Then it will use the 2PC simulator to run the two-party protocol. This simulator will extract skA, aux(NymA) from the user, and expect to receive a corresponding authenticator proof. This time, we will use SimNIZKPK to simulate the proof of knowledge of an authenticator for NymA. We will pass πL+1 to the 2PC simulator, which will complete the protocol.

Note that Hybrid 2 is indistinguishable from Hybrid 1 by the zero-knowledge properties of SimNIZKPK.

Finally, note that the difference between Hybrid 2 and the game with SimIssue is that Hybrid 2 generates credproof by rerandomizing cred, while SimIssue uses SimNIZKPK. This means the two games are indistinguishable by the rerandomization properties of the NIZKPK system.

Claim A.7.3 (Unforgeability) The delegatable credential construction given in Section 7.3 satisfies the unforgeability property under the assumption that our building blocks are secure.


Proof. Let ExtSetup be the same as Setup, except that when generating params it uses the extraction setup of the partially extractable randomizable composable NIZKPK proof system. (a) Follows from the indistinguishability of the extraction setup of the proof system. (b) The commitment scheme used with the NIZKPK proof system is perfectly binding, and so are our pseudonyms.

Let F correspond to the F of the F-unforgeable authentication scheme, and let Extract be an algorithm that verifies the credential and aborts with output ⊥ if CredVerify rejects; otherwise it uses the extractability features of the NIZKPK proof system to extract all F(ski) values. (c) An honestly generated level L credential is a proof of knowledge of a certification chain from sk0 to skL. It allows us to extract (f0, . . . , fL) = (F(sk0), . . . , F(skL)). If the length of the certification chain is 0, Extract extracts f0 = F(skO) from any valid commitment NymO using the extraction property of the commitment. (d)

(e) Let Q be the maximum number of users in a credential system. We consider two games. In Game 1 the adversary plays the real unforgeability game. In Game 2 we pick a random q ∈ {1, . . . , Q}. Game 2 is the same as Game 1, except that the oracle queries with commands IssueToAdv and ObtainFromAdv are answered differently for the qth user.

Let sk∗ be the secret key generated for the qth AddUser query.

IssueToAdv(NymI, credI, Nym, L, NymO). If Nym is not a valid pseudonym for sk, aux(Nym), the oracle terminates. The oracle looks up (skI, pkI, NymI, aux(NymI)) in its pseudonym database, and outputs an error if they do not exist. If skI ≠ sk∗ it proceeds as in Game 1. Otherwise the oracle follows the Issue protocol, but uses the simulator for the two-party protocol to simulate the interaction with the adversarial user. Note that the simulator will extract the adversary's message m, and expect to get an authenticator proof as input. In this game we know sk∗, so we can generate this by simply computing the authenticator.

ObtainFromAdv(NymA, NymU, NymO, L). Similarly, the oracle looks up (skU, NymU, aux(NymU)) in its pseudonym database, and outputs an error if they do not exist. If skU ≠ sk∗ the oracle runs Obtain(paramsDC, NymO, skU, NymU, aux(NymU), NymA) with the adversary to get cred (the same as in Game 1). Otherwise it follows the Obtain protocol, and now uses the simulator for the two-party protocol to simulate the interaction with the adversarial issuer. It outputs cred.

By a simple hybrid argument, either Game 1 and Game 2 are computationally indistinguishable, or we can break the security of the two-party computation.

Next we give a reduction to show that an adversary A that can win in Game 2 can be used to break the security of our authentication scheme. The reduction gets paramsA, f∗ = F(sk∗) and access to OAuth(paramsA, sk∗, ·), OCertify(paramsA, ·, (sk∗, ·, . . . )) for a challenge secret key sk∗ from the authentication scheme's unforgeability game. It creates matching proof system parameters params and a trapdoor ext, and combines them into paramsDC. It hands paramsDC and ext to A.

The reduction answers A's oracle queries as follows:


AddUser. The oracle keeps a counter indicating the number of times it was queried; otherwise it behaves as the original oracle, except for the qth query. To answer the qth query, the oracle stores (‘’, F(sk∗)) in the user database and adds F(sk∗) to the list HonestUsers. It returns F(sk∗) to A. Note that we use the special token ‘’ to indicate the unknown challenge key.

FormNym(f). The oracle looks up (sk, f) in its user database and terminates if it does not exist (we explicitly allow sk = ‘’). If f ≠ f∗ the oracle behaves as the original oracle. Otherwise it picks a random aux(Nym) and computes Nym = Com′(f, aux(Nym)). The oracle stores (‘’, Nym, aux(Nym)) in its pseudonym database and gives the adversary Nym.

Issue(NymI, NymU, credI, L, NymO). The oracle looks up (skU, NymU, aux(NymU)) and (skI, NymI, aux(NymI)) in its pseudonym database and outputs an error if they do not exist or if skU = skI. If skU ≠ ‘’ and skI ≠ ‘’ the oracle behaves as the original oracle. Otherwise we distinguish two cases (note that skU = ‘’ and skI = ‘’ together is not possible, as honest users do not issue to themselves):

(a) skU = ‘’: If L > 0, it runs Extract(paramsDC, ext, credproofI, NymI) to obtain f0, f1, . . . , fL. If fL ≠ F(skI), it aborts. Otherwise the oracle computes credU in the same way as Issue ↔ Obtain, except that it uses a query to OCertify(paramsA, skI, (sk∗, H(NymO, L))) to obtain the witness for the proof π, which it can then compute itself. The oracle stores (f0, L, F(skI), f∗) in ValidCredentialChains and outputs credU to the adversary.

(b) skI = ‘’: If L > 0, it runs Extract(paramsDC, ext, credproofI, NymI) to obtain f0, f1, . . . , fL. If fL ≠ f∗, it aborts. Otherwise the oracle computes credU in the same way as Issue ↔ Obtain, except that it uses a query to OAuth(paramsA, sk∗, (skU, H(NymO, L))) to obtain the witness for the proof π, which it can then compute itself. The oracle stores (f0, L, f∗, F(skU)) in ValidCredentialChains and outputs credU to the adversary.

IssueToAdv(NymI, credI, Nym, L, NymO). If Nym is not a valid pseudonym for sk, aux(Nym), the oracle terminates. The oracle looks up (skI, pkI, NymI, aux(NymI)) in its pseudonym database, and outputs ⊥ if they do not exist. If skI ≠ ‘’ the oracle behaves as the original oracle. Otherwise the oracle follows the Issue protocol, but uses the simulator for the two-party protocol to simulate the interaction with the adversarial user. Note that the simulator of the two-party protocol provides us with skU; we can then simulate the ideal functionality of the two-party protocol with an OAuth(paramsA, sk∗, (skU, H(NymO, L))) query. If the oracle's output is not ⊥, the oracle stores (f0, L, f∗, F(skU)) in ValidCredentialChains.

ObtainFromAdv(NymA, NymU, NymO). Similarly, the oracle looks up the values (skU, NymU, aux(NymU)) in its pseudonym database, and outputs an error if they do not exist. If skU ≠ ‘’ the oracle behaves normally. Otherwise it follows the Obtain protocol, and now uses the simulator for the two-party protocol to simulate the interaction with the adversarial issuer. After a successful protocol execution the oracle outputs cred = OCertify(paramsA, skI, (sk∗, H(NymO, L))).

Prove(Nym, cred, NymO). The Prove protocol does not require the secret key of any user, and is answered as by the original oracle.

This simulation of Game 2 is perfect. We now need to show that a forgery in the credential system can be turned into a forgery for the authentication scheme.


A successful adversary outputs (credproof, Nym, NymO, L) such that CredVerify(paramsDC, pkO, credproof, Nym, L) = accept and (f0, . . . , fL) ← Extract(paramsDC, ext, credproof, Nym, NymO).

If we guessed correctly for which honest user A would create the forgery, then fi−1 = f∗ and we can extract an authi such that VerifyAuth(paramsA, sk∗, (ski, H(NymO, i)), authi) = 1. As (f0, i, fi) ∉ ValidCredentialChains, this message was never signed before and constitutes a valid authentication forgery.

A.8 Security of Committed Blind Anonymous IBE Scheme

A.8.1 Security of Anonymous IBE Scheme Variant

Theorem 6 The scheme Π presented in Section 10.2.1 is a secure anonymous IBE scheme under the DBDH and D-Linear assumptions.

Proof. We prove security using hybrid games. Let ct = (c′, c0, c1, c2, c3, c4) be the challenge ciphertext given to the adversary during a real attack. Let R be a random element of GT and R′, R′′ be random elements of G1. The following games differ in what challenge ciphertext is given by the game to the adversary:

Game 1. The challenge is ct = (c′, c0, c1, c2, c3, c4).
Game 2. The challenge is ct = (R, c0, c1, c2, c3, c4).
Game 3. The challenge is ct = (R, c0, R′, c2, c3, c4).
Game 4. The challenge is ct = (R, c0, R′, c2, R′′, c4).

Note that the ciphertext in Game 2 leaks no information about the message. Indistinguishability between Games 1 and 2 thus corresponds to chosen-plaintext-attack security. Similarly, Game 4 leaks no information about the identity, since it is composed of six random group elements. We show that the transitions from Game 1 to Game 2 (Lemma A.8.1) and from Game 2 to Game 3 and Game 4 (Lemma A.8.2 and Lemma A.8.3, respectively) are all computationally indistinguishable.

An adversary playing the game in Definition 10.1.1 reacts the same upon receiving real challenge ciphertexts or random ciphertexts; the probability that this does not occur is negligible (otherwise he could act as a distinguisher, contradicting the above results about Games 1 to 4). For random ciphertexts his success probability is exactly 1/4; consequently, in the real game his success probability is at most 1/4 + ν(k).


Lemma A.8.1 (semantic security) Under the DBDH assumption, there is no p.p.t. adversary that can distinguish Games 1 and 2 with non-negligible advantage.

Proof. We begin by assuming that such an adversary A exists. We construct a reduction B that solves the DBDH problem with non-negligible advantage. Let q be an upper bound on the number of the adversary's private key queries. Algorithm B receives a DBDH challenge (g, A = g^a, B = g^b, C = g^c, h, Ã = h^a, B̃ = h^b, z) as input and outputs a guess β′ as to whether z = e(g, h)^{abc} (β = 1) or z is a random element in GT (β = 0). We first describe a simulator that does not quite work, and then modify it so that it does, an approach that has been used previously [Nac07, Wat05].

Setup: The simulator sets an integer m = 2q and chooses a random integer k ∈ {0, . . . , n}, a random n-length vector x = (x1, . . . , xn), where xi ∈ {0, . . . , m}, and x′ ∈ {1, . . . , m − 1}. Let X∗ denote the pair (x′, x). The simulator also chooses a random y′ ← Zp and an n-length vector y = (yi), where yi ← Zp. Let Y∗ denote the pair (y′, y).

For a given identity id = (id1, . . . , idn), define three functions for ease of analysis:

F(id) = x′ + ∑_{i=1}^{n} id_i x_i − mk,    J(id) = y′ + ∑_{i=1}^{n} id_i y_i,

K(id) = 0 if x′ + ∑_{i=1}^{n} id_i x_i ≡ 0 (mod m), and K(id) = 1 otherwise.

The simulator then generates the public parameters h = h, h0 = B̃^{x′−mk} h^{y′} and h_i = B̃^{x_i} h^{y_i}, as well as g = g, g0 = B^{x′−mk} g^{y′} and g_i = B^{x_i} g^{y_i}, where 1 ≤ i ≤ n (writing Ã = h^a and B̃ = h^b for the G2 elements of the challenge). It also chooses random t1, . . . , t4 ← Zp and publishes the parameters (Ω = e(A, B̃)^{t1 t2}, g, h, g0, . . . , gn, h0, . . . , hn, g^{t1}, . . . , g^{t4}). The distribution is identical to the real construction, and the master secret is (α, t1, t2, t3, t4). Note that α is implicitly set to ab; it is unknown to the reduction.

Phase 1: The simulator must answer the private key queries of A, who issues a query for an identity id. If K(id) = 0, the simulator aborts and randomly chooses its guess β′ of the challenger's bit β.

Otherwise the simulator chooses random values r̃1, r̃2 ∈ Z∗p and constructs the private key d = (d0, d1, d2, d3, d4), where

d0 = (Ã^{−1/F(id)} h^{r̃1})^{t1 t2} h^{r̃2 t3 t4},

d1 = (Ã^{−J(id)/F(id)} (h0 ∏_{i=1}^{n} h_i^{id_i})^{r̃1})^{−t2},


d2 = (Ã^{−J(id)/F(id)} (h0 ∏_{i=1}^{n} h_i^{id_i})^{r̃1})^{−t1},

d3 = (h0 ∏_{i=1}^{n} h_i^{id_i})^{−r̃2 t4},

d4 = (h0 ∏_{i=1}^{n} h_i^{id_i})^{−r̃2 t3}.

Let r1 = r̃1 − a/F(id) and r2 = r̃2. Then

d0 = (Ã^{−1/F(id)} h^{r̃1})^{t1 t2} h^{r̃2 t3 t4} = (h^{−a/F(id)} h^{r̃1})^{t1 t2} h^{r̃2 t3 t4} = h^{r1 t1 t2 + r2 t3 t4}.

Using the facts that h0 ∏_{i=1}^{n} h_i^{id_i} = B̃^{F(id)} h^{J(id)} and (h0 ∏_{i=1}^{n} h_i^{id_i})^{a/F(id)} = B̃^a Ã^{J(id)/F(id)}, we obtain

d1 = (Ã^{−J(id)/F(id)} (h0 ∏_{i=1}^{n} h_i^{id_i})^{r̃1})^{−t2}
   = (Ã^{−J(id)/F(id)} (B̃^{F(id)} h^{J(id)})^{r̃1})^{−t2}
   = (B̃^a (B̃^{F(id)} h^{J(id)})^{−a/F(id)} (B̃^{F(id)} h^{J(id)})^{r̃1})^{−t2}
   = (B̃^a (h0 ∏_{i=1}^{n} h_i^{id_i})^{r̃1 − a/F(id)})^{−t2}
   = h^{−α t2} (h0 ∏_{i=1}^{n} h_i^{id_i})^{−r1 t2}.

Similarly,

d2 = (Ã^{−J(id)/F(id)} (h0 ∏_{i=1}^{n} h_i^{id_i})^{r̃1})^{−t1} = h^{−α t1} (h0 ∏_{i=1}^{n} h_i^{id_i})^{−r1 t1}.

As r2 = r̃2, the components d3 and d4 are easily seen to be correct. The reduction is feasible, as F(id) ≠ 0 (mod p) is implied by K(id) ≠ 0.

Challenge: The adversary A submits a message m and an identity id = id1‖· · ·‖idn. If x′ + ∑_{i=1}^{n} id_i x_i ≠ km, the simulator aborts and answers with a random guess. Otherwise,


the simulator picks random exponents s1, s2 ∈ Zp and constructs the ciphertext

ct = (z^{t1 t2} m, C^{J(id)}, C^{t1} g^{−s1 t1}, g^{s1 t2}, C^{t3} g^{−s2 t3}, g^{s2 t4}).

If z = e(g, h)^{abc}, then for α = ab and s = c, the above is a correctly formed ciphertext for an identity id = id1‖· · ·‖idn that fulfills the equation x′ + ∑_{i=1}^{n} id_i x_i = km:

(c′, c0, c1, c2, c3, c4) = (z^{t1 t2} m, C^{J(id)}, C^{t1} g^{−s1 t1}, g^{s1 t2}, C^{t3} g^{−s2 t3}, g^{s2 t4})
                        = (e(g, h)^{α t1 t2 s} m, H1(id)^s, g^{t1(s−s1)}, g^{t2 s1}, g^{t3(s−s2)}, g^{t4 s2}).

If z is a random element, then c′ is also a random element in GT .

Phase 2: The simulator repeats Phase 1.

Guess: The adversary outputs a bit γ to guess which hybrid game it is playing. The reduction forwards γ as its guess for the solution to the DBDH problem.

Artificial Aborts: As in [Nac07, Wat05], this simulation has an issue: it aborts with a probability that is a function of the queried identities idj and the challenge identity id∗. A solution is to artificially abort the simulator at the end of the guess phase. The idea is to make the overall probability of the simulator aborting consistent across all adversary behaviors.

Using the probabilistic analysis of Naccache [Nac07], we have the following. Beginning at the challenge phase, fix the random variables that are visible to the adversary, fix the DBDH tuple and the public parameters, and also fix the random values r̃1, r̃2 of phase one. These fixed parameters are remembered by B. This fixes the queried identities idj, 1 ≤ j ≤ q, and the challenge identity id∗. The adversary can now be seen as a deterministic algorithm.

Using the random variables x′ and xi, the list of private key queries ID = (id1, . . . , idq), and X = (x′, x1, . . . , xn), define the function

τ(X, ID, id∗, k) = 0 if F(id∗) = 0 and F(idj) ≠ 0 mod m for all 1 ≤ j ≤ q, and τ(X, ID, id∗, k) = 1 otherwise.

The reduction does not abort iff τ(X, ID, id∗, k) = 0. The lower bound for the probability that B does not abort is

Pr_{X,k}[τ(X, ID, id∗, k) = 0] ≥ λ = 1/(4 · q · 2^l · n).

The simulator is modified so that its overall probability of not aborting approaches λ, independently of the adversary's queries. In the guess phase, the new simulator B′ samples an estimate η′ of the probability


Pr_{X,k}[τ(X, ID, id∗, k) = 0], which is a function of the idj and id∗. Following the analysis of Naccache [Nac07], the probability of breaking the IBE scheme is less than q · 2^{l+4} · n times the probability of solving the DBDH problem. For the analysis of an optimal value for l, the reader is referred to [Nac07].
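The artificial-abort step can be sketched as follows (a simplified illustration with hypothetical names and sampling counts, not Naccache's exact procedure): after the guess phase, the simulator re-samples fresh choices of (X, k) against the now-fixed query pattern to estimate the not-abort probability η′, and then aborts artificially so that every query pattern continues with probability close to λ.

```python
import random

def tau(X, k, m, queried_ids, challenge_id):
    """tau = 0 iff this choice of (X, k) lets the simulation finish."""
    x_prime, x = X
    F = lambda idb: x_prime + sum(b * xi for b, xi in zip(idb, x)) - m * k
    if F(challenge_id) != 0:
        return 1
    if any(F(idb) % m == 0 for idb in queried_ids):
        return 1
    return 0

def estimate_eta(m, n, queried_ids, challenge_id, samples=10000):
    """Monte-Carlo estimate of Pr_{X,k}[tau = 0] for a fixed query pattern."""
    hits = 0
    for _ in range(samples):
        k = random.randint(0, n)
        X = (random.randint(1, m - 1), [random.randint(0, m) for _ in range(n)])
        hits += 1 - tau(X, k, m, queried_ids, challenge_id)
    return hits / samples

def artificial_abort(eta_estimate, lam):
    """Abort so that the overall continue-probability is close to lam."""
    if eta_estimate <= lam:
        return False  # already rare enough; never abort artificially
    return random.random() < 1 - lam / eta_estimate
```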

Lemma A.8.2 (Anonymity part 1) Under the Decision Linear assumption, no p.p.t. adversary can distinguish between Games 2 and 3 with non-negligible advantage.

Proof. Suppose the existence of an adversary A that distinguishes between the two games, Game 2 and Game 3, with advantage ε. We construct a simulator that wins the Decision Linear game as follows.

The simulator takes in a D-Linear instance (g, g^a, g^b, g^{ac}, g^{bd}, Z, h, h^a, h^b), where Z is either g^{c+d} or random in G1 with equal probability. For convenience, we rewrite this as [g, g^a, g^b, g^{ac}, Y = g^{bd}, g^s] for s such that g^s = Z (that is, s is either c + d or random). Consider the task of deciding whether Y = g^{b(s−c)}. The simulator plays the following game:

Setup: The simulator first chooses random exponents α, t3, t4. It lets g, h in the simulation be as in the instance and sets v1 = g^b, v2 = g^a. If we pose t1 = b and t2 = a, we note that the parameters are distributed as in the real scheme. The simulator sets an integer m = 2q and chooses a random integer k ∈ {0, . . . , n}, a random n-length vector x = (x1, . . . , xn), where xi ∈ {0, . . . , m}, and x′ ∈ {1, . . . , m − 1}. Let X∗ denote the pair (x′, x). The simulator also chooses a random y′ ← Zp and an n-length vector y = (yi), where yi ← Zp. Let Y∗ denote the pair (y′, y). For a given identity id = (id1, . . . , idn), define the three functions F(id), J(id) and K(id) as above.

The simulator generates the public parameters h0 = h^{b(x′−mk)} h^{y′} and h_i = h^{b x_i} h^{y_i}, as well as g0 = g^{b(x′−mk)} g^{y′} and g_i = g^{b x_i} g^{y_i}, where 1 ≤ i ≤ n. The public parameters are published as

(Ω = e(g^a, h^b)^α, g, h, g0, . . . , gn, h0, . . . , hn, v1 = g^b, v2 = g^a, v3 = g^{t3}, v4 = g^{t4}).

Phase 1: To answer a private key extraction query for an identity id = id1‖· · ·‖idn, the simulator chooses random exponents r̃1, r̃2 ∈ Zp and outputs a private key d = (d0, d1, d2, d3, d4), where

d0 = h^{a r̃1} h^{r̃2 t3 t4},

d1 = (h^a)^{−α − F(id) r̃1},

d2 = (h^b)^{−α − F(id) r̃1},

d3 = (h^a)^{−r̃1 J(id)/t3} H2(id)^{−r̃2 t4},

d4 = (h^a)^{−r̃1 J(id)/t4} H2(id)^{−r̃2 t3}.


This is a well-formed private key skid:

skid = (h^{r1 t1 t2 + r2 t3 t4}, h^{−α t2} H2(id)^{−r1 t2}, h^{−α t1} H2(id)^{−r1 t1}, H2(id)^{−r2 t4}, H2(id)^{−r2 t3}),

with r1 = r̃1 F(id)/(F(id) b + J(id)) and r2 = r̃2 + J(id) a r̃1/(t3 t4 (F(id) b + J(id))), where r̃1, r̃2 are the exponents chosen by the simulator.

Challenge: The simulator gets from the adversary a message m, which it can discard, and responds with a challenge ciphertext for the identity id∗. Pose s1 = c. To proceed, the simulator picks a random exponent s2 ∈ Zp and a random element R ∈ GT, and outputs the ciphertext as:

ct = (R, (g^s)^{J(id∗)}, Y, g^{ac}, (g^s)^{t3} g^{−s2 t3}, g^{s2 t4}).

If Y = g^{b(s−c)}, i.e. g^s = Z = g^{c+d}, then c1 = v1^{s−s1} and c2 = v2^{s1}; all parts of the challenge but c′ are thus well formed, and the simulator behaved as in Game 2. If instead Y is independent of a, b, s, s1, s2, which happens when Z is random, then the simulator responds as in Game 3.

Phase 2: The simulator answers the queries as in Phase 1.

Output: The adversary outputs a bit γ to guess which hybrid game the simulator has been playing. To conclude, the simulator forwards γ as its own answer in the Decision Linear game.

Artificial Aborts are handled analogously to Lemma A.8.1.

Lemma A.8.3 (Anonymity part 2) Under the Decision Linear assumption, no p.p.t. adversary can distinguish between Games 3 and 4 with non-negligible advantage.

Proof. The proof follows that of anonymity part 1, except that the simulation is done over the parameters v3 and v4 in place of v1 and v2.

A.8.2 Security of Blind Extraction Protocol

Claim A.8.4 The interactive IBEBlindExtract protocol of Section 10.2.2 is leak free.

We prove the claim using a sequence of indistinguishable games. We start with a game that resembles the real game, in which the distinguisher E obtains the output of adversary A interacting with the KGC of a real IBEBlindExtract protocol. The last game will be equivalent to the ideal game, in which the distinguisher obtains the output of a simulator S that has access to an oracle OIBEExtract (and the attack algorithm A). S uses simulation to simulate the environment of A and extraction to obtain the correct input for OIBEExtract.


Game 0. The same as Game Real.

Game 1. The same as Game 0, except that we use the simulator of the two-party protocol to extract the inputs r′1, r′2, and u0, u1, u2, u3 and the remaining randomness, including the openings of the commitments used by A. Game 0 and Game 1 are indistinguishable based on the security of the two-party protocol.

Game 2. The same as Game 1, except that we extract the witness from the proof of knowledge of Step 2 of the blind extraction protocol. This includes id and an opening of the commitment commu3. Successful extraction also guarantees that ID′ is constructed correctly. Game 1 and Game 2 are indistinguishable based on the extraction property of the proof of knowledge.

Game 3. The same as Game 2, except that we abort if the values first extracted in Game 1 and Game 2 allow opening commu3 in two different ways. Game 2 and Game 3 are indistinguishable based on the binding property of the commitment scheme.

Game 4. As Game 3, but we replace the zero-knowledge proof in Step 4 that proves that the blinded key skid′ is constructed correctly by its zero-knowledge simulator. Game 3 and Game 4 are indistinguishable based on the zero-knowledge property of the proof system. Note that the commitments commr1, commr2, commx0, commx1, and commx2 can now be chosen at random, as they never need to be opened (both the 2PC and the zero-knowledge proof are simulated). As the commitments are perfectly hiding, the proofs remain proofs about true statements.

Game 5. As Game 4, but instead of computing skid′ as described in Step 3 of the blind extraction protocol, we compute it by first computing skid and then reversing the user's unblinding steps of Step 4, i.e., for skid = (d0, d1, d2, d3, d4),

skid′ = (d0 · h^{u0}, d1^{u3/r′1} h^{u1}, d2^{u3/r′1} h^{u2}, d3^{u3/r′2}, d4^{u3/r′2}).

As from Game 3 onwards we already guarantee the well-formedness of ID′, the distributions of Game 4 and Game 5 are the same.

By replacing the generation of skid with a query to OIBEExtract, Game 5 can be seen to resemble the distribution generated by the simulation algorithm for Game Ideal.

Claim A.8.5 The interactive IBEBlindExtract protocol of Section 10.2.2 is selective-failure blind.

We prove the claim using a sequence of indistinguishable games, starting at the original selective-failure blindness game for committed blind IBE and ending at a game in which the adversary clearly has zero advantage.

Game 0. The same as the original selective-failure blindness game. The adversary is given the correct commitments, and the resulting keys (sk0, sk1) are computed using the IBEBlindExtract protocol.

Game 1. Extract the secret key msk = (α, t1, t2, t3, t4) and the rest of the randomness, including the openings of the commitments used by the KGC, from the two-party computation protocol. Game 0 and Game 1 are indistinguishable based on the security of the two-party computation protocol.


Game 2. Use the two-party computation simulator instead of the two-party computation protocol. To simulate the two-party computation protocol, the simulator only needs the output of the KGC, the three random values x0, x1, x2. Game 1 and Game 2 are indistinguishable based on the security of the two-party computation protocol.

Game 3. Extract the witness of the proof of knowledge of Step 4. Successful extraction also guarantees soundness, i.e., that the values extracted fulfill the equations that are proven. Game 2 and Game 3 are indistinguishable based on the extraction property of the proof of knowledge system.

Game 4. As Game 3, but abort if the values first extracted in Game 1 and Game 3 allow opening a commitment used in the two-party computation protocol in two different ways. Game 3 and Game 4 are indistinguishable based on the binding property of the commitment scheme.

Game 5. Instead of computing sk0 and sk1 (if they are not ⊥) by unblinding sk′0 and sk′1, Game 5 computes them directly using the extracted msk. We already guarantee in Game 4 that the values extracted from the two-party protocol and from the proof of knowledge are the same, and that the values (d′0, d′1, d′2, d′3, d′4) for both sk′0 and sk′1 fulfill the following equations for r1 = r̃1 r′1 and r2 = r̃2 r′2, where r̃1, r̃2 are the KGC's exponents:

d0 = h^{x0} h^{−u0} = h^{(r̃1 r′1 t1 t2 + r̃2 r′2 t3 t4) + u0} h^{−u0} = h^{r1 t1 t2 + r2 t3 t4}

d1 = (h^{x1} ID′^{−r̃1 t2} h^{−u1})^{r′1/u3} = (h^{−(u3/r′1) α t2 + u1} (H2(id)^{u3})^{−r̃1 t2} h^{−u1})^{r′1/u3} = h^{−α t2} H2(id)^{−r1 t2}

d2 = (h^{x2} ID′^{−r̃1 t1} h^{−u2})^{r′1/u3} = (h^{−(u3/r′1) α t1 + u2} (H2(id)^{u3})^{−r̃1 t1} h^{−u2})^{r′1/u3} = h^{−α t1} H2(id)^{−r1 t1}

d3 = (ID′^{−r̃2 t4})^{r′2/u3} = ((H2(id)^{u3})^{−r̃2 t4})^{r′2/u3} = H2(id)^{−r2 t4}

d4 = (ID′^{−r̃2 t3})^{r′2/u3} = ((H2(id)^{u3})^{−r̃2 t3})^{r′2/u3} = H2(id)^{−r2 t3}.

Consequently the distributions of Game 4 and Game 5 are identical.

Game 6. The same as Game 5, but now, instead of providing the adversary with correct commid and ID′ values, the game uses random commitments. Pedersen commitments are perfectly hiding, and the u3 values perfectly hide the identity in ID′. Game 5 and Game 6 are indistinguishable based on the witness indistinguishability property of the proof system.

In Game 6, the game no longer uses id0 and id1 in its interaction with the adversary. The probability that A correctly guesses b is thus exactly 1/2.
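The unblinding identities above can be sanity-checked numerically in the exponent (writing every group element h^x by its discrete logarithm x mod p; all parameter values below are arbitrary illustrations, not part of the scheme):

```python
import random

p = 1000003  # a prime standing in for the group order (illustrative)

def inv(a):
    return pow(a, p - 2, p)  # inverse mod p, p prime

# master secret and KGC-side randomness (hypothetical values)
alpha, t2 = random.randrange(1, p), random.randrange(1, p)
rk1 = random.randrange(1, p)                               # KGC's exponent
r1p, u1, u3 = (random.randrange(1, p) for _ in range(3))   # user's blinding
H = random.randrange(1, p)                                 # dlog of H2(id) to base h
ID_prime = H * u3 % p                                      # dlog of ID' = H2(id)^u3

# the KGC returns x1 = -(u3/r1') * alpha * t2 + u1 from the 2PC:
x1 = (-(u3 * inv(r1p) % p) * alpha * t2 + u1) % p
# blinded component d1' = h^x1 * ID'^(-rk1 t2) * h^(-u1), in the exponent:
d1_blind = (x1 - ID_prime * rk1 * t2 - u1) % p
# the user unblinds by raising to r1'/u3:
d1 = d1_blind * r1p % p * inv(u3) % p

r1 = rk1 * r1p % p
assert d1 == (-alpha * t2 - H * r1 * t2) % p  # h^(-alpha t2) H2(id)^(-r1 t2)
```

All steps are exact field operations modulo p, so the assertion holds for every choice of the random values, mirroring the fact that the d1 equation of Game 5 is an identity.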


Appendix B

Generic Group Proofs

Let e : G1 × G2 → GT be a bilinear map over groups G1, G2, GT of prime order p. In the generic group model, we encode group elements of G1, G2, and GT as unique random strings. Group operations are performed by an oracle that operates on these strings and keeps an internal representation of the group. We set α : Zp → {0, 1}∗ to be the opaque encoding of elements of G1. Let g be a generator of G1 and h a generator of G2; α(a) maps a ∈ Zp to the string representation of g^a ∈ G1. The function β : Zp → {0, 1}∗ maps a ∈ Zp to the string representation of h^a ∈ G2, and the function τ : Zp → {0, 1}∗ maps a ∈ Zp to the string representation of e(g, h)^a ∈ GT.

We can represent all operations in terms of the random maps α, β, τ. Let a, b ∈ Zp. We define an oracle OG that answers queries for the following operations:

Group operation. OG(multG1, α(a), α(b)) = α(a + b). This corresponds to a group operation in G1, as g^a · g^b = g^{a+b}. Operations multG2 and multGT for G2 and GT are defined accordingly using the representations β and τ.

Exponentiation by a constant. OG(expG1, α(b), a) = α(ab). This corresponds to the exponentiation of a group element, as (g^b)^a = g^{ab}. Operations expG2 and expGT for G2 and GT are defined accordingly using the representations β and τ.

Pairing. OG(e, α(a), β(b)) = τ(ab). This is because e(g^a, h^b) = e(g, h)^{ab}.
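A minimal sketch of such an oracle (class and method names are ours): the internal representation is simply the exponent in Zp, and each distinct exponent is bound to a fresh random string the first time it is encoded.

```python
import secrets

class GenericGroupOracle:
    """Generic group model oracle: elements of G1/G2/GT are represented
    internally by exponents in Z_p, and handed out only as opaque strings."""

    def __init__(self, p):
        self.p = p
        # one encoding table per group: exponent -> random string
        self.enc = {"G1": {}, "G2": {}, "GT": {}}

    def _encode(self, group, a):
        a %= self.p
        table = self.enc[group]
        if a not in table:
            table[a] = secrets.token_hex(16)  # fresh unique random string
        return table[a]

    def _decode(self, group, s):
        for a, enc in self.enc[group].items():
            if enc == s:
                return a
        raise ValueError("unknown encoding")

    def mult(self, group, s, t):   # g^a * g^b = g^(a+b)
        return self._encode(group, self._decode(group, s) + self._decode(group, t))

    def exp(self, group, s, a):    # (g^b)^a = g^(ab)
        return self._encode(group, self._decode(group, s) * a)

    def pair(self, s, t):          # e(g^a, h^b) = e(g, h)^(ab)
        return self._encode("GT", self._decode("G1", s) * self._decode("G2", t))
```

In Shoup's model the adversary sees only these strings, so equality of encodings is the only structure it can exploit.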

Proof strategy. The generic group proofs below follow the proof strategy laid out by Shoup [Sho97]. Jager and Schwenk [JS08] show equivalence between the Shoup and the Maurer [Mau05] models.

In the proof, instead of letting the attack algorithm interact with the actual oracles, we let it interact with a simulation for which we can bound the adversary's success probability.



We simulate the adversary's oracle queries using lists of rational multivariate polynomials. The variables X1, . . . , Xℓ of the polynomials correspond to the secret values x1, . . . , xℓ of the assumptions. The goal is to fix these values only after the simulation is completed.

The simulation maintains three lists of pairs Lα = {(Fα,s, αs) : s = 0, . . . , Sα − 1}, Lβ = {(Fβ,s, βs) : s = 0, . . . , Sβ − 1}, and Lτ = {(Fτ,s, τs) : s = 0, . . . , Sτ − 1}, such that, at step S in the game, we have Sα + Sβ + Sτ = S + S0, where S0 is the number of group elements given as input to the adversary. The Fα,s, Fβ,s, Fτ,s are rational functions (i.e., fractions whose numerators and denominators are polynomials), and all polynomials are multivariate polynomials in Zp[X1, . . . , Xℓ]. The αs, βs, and τs are set to unique random strings in {0, 1}∗.

We start the game at step S = 0 with S0 polynomials Fα,s, Fβ,s, and Fτ,s and the corresponding random strings αs, βs, τs.

The simulation provides the adversary with the S0 strings αs, βs, τs as required by the assumption. We now describe how the simulation answers the oracle queries A may make.

Group operation. A inputs αs and αt. B checks that αs and αt are in its list Lα, andreturns ⊥ if they are not. Then B computes F = Fα,s +Fα,t. If F is already in thelist Lα, then B returns the appropriate αv. Otherwise, B chooses a random αSα ,sets Fα,Sα = F and adds this new tuple to the list. B increments the counter Sαby 1. B performs a similar operation if the inputs are in G2 or GT .

Exponentiation by a constant. A inputs αs and a constant a ∈ Zp. B checks thatαs is in its list Lα, and returns ⊥ if it is not. Then B computes F = Fα,s · a. If Fis already in the list Lα, then B returns the appropriate αv. Otherwise, B choosesa random αSα , sets Fα,Sα = F and adds this new tuple to the list. B incrementsthe counter Sα by 1. B performs a similar operation if the inputs are in G2 or GT .

Pairing. A inputs αs and βt. B checks that αs and βt are in its lists Lα and Lβ ,respectively, and returns ⊥ if they are not. Then B computes F = Fα,s · Fβ,t. If Fis already in the list Lτ , then B returns the appropriate τv. Otherwise, B chooses arandom τSτ , sets Fτ,Sτ = F and adds this new tuple to the list. B increments thecounter Sτ by 1.
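The bookkeeping performed by B in these three oracles can be sketched in code. The following is a minimal illustrative simulator (our own sketch, not from the thesis), restricted to a single secret X so that polynomials are univariate coefficient lists; group elements are opaque random strings, and each oracle manipulates the polynomial associated with its operands:

```python
# Hypothetical sketch of B's simulation: polynomials (coefficient lists
# in the secret X) are mapped to fresh random strings, one table per group.
import secrets

class GenericGroup:
    def __init__(self, p):
        self.p = p
        self.tables = {"G1": {}, "G2": {}, "GT": {}}  # poly tuple -> string

    def _canon(self, coeffs):
        coeffs = [c % self.p for c in coeffs]
        while coeffs and coeffs[-1] == 0:
            coeffs.pop()
        return tuple(coeffs)

    def encode(self, grp, coeffs):
        f = self._canon(coeffs)
        tab = self.tables[grp]
        if f not in tab:                      # fresh element: new random string
            tab[f] = secrets.token_hex(16)
        return tab[f]

    def _lookup(self, grp, s):
        for f, t in self.tables[grp].items():
            if t == s:
                return list(f)
        return None                           # string not on the list: return ⊥

    def mult(self, grp, s, t):                # OG(mult, α(a), α(b)) -> α(a+b)
        f, g = self._lookup(grp, s), self._lookup(grp, t)
        if f is None or g is None:
            return None
        n = max(len(f), len(g))
        f += [0] * (n - len(f)); g += [0] * (n - len(g))
        return self.encode(grp, [a + b for a, b in zip(f, g)])

    def exp(self, grp, s, a):                 # OG(exp, α(b), a) -> α(ab)
        f = self._lookup(grp, s)
        return None if f is None else self.encode(grp, [a * c for c in f])

    def pair(self, s, t):                     # OG(e, α(a), β(b)) -> τ(ab)
        f, g = self._lookup("G1", s), self._lookup("G2", t)
        if f is None or g is None:
            return None
        prod = [0] * (len(f) + len(g) - 1)
        for i, a in enumerate(f):
            for j, b in enumerate(g):
                prod[i + j] += a * b
        return self.encode("GT", prod)
```

Two queries that implicitly compute the same polynomial return the same string, which is exactly the equality behaviour the proofs rely on.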

If the assumption is interactive, there will be additional assumption specific oracles thatare answered analogously.

Such a simulation assumes that all equality relations between group elements are expressed by the polynomials, i.e., two group elements are equal if and only if their polynomials are equal. Moreover, as group elements are represented as random strings, equality is the only relation between group elements that the adversary can observe. For a computational assumption, one has to show that the adversary cannot compute polynomials that fulfill the equations of the assumption's output. For a decisional assumption, one adds first the polynomial equations describing the first possible world, and then those describing the second, to the set of polynomials. Neither must lead to additional equalities between the polynomials.


In the last step of a generic group proof, we show that the simulation was perfect with high probability. For this we use the following theorem due to Schwartz [Sch80] and Shoup [Sho97], as stated in [Mau05].

Theorem B.0.6 The fraction of solutions (x1, . . . , xℓ) ∈ Zn^ℓ of the multivariate polynomial equation F(x1, . . . , xℓ) ≡ 0 (mod n), where F has degree d, is at most d/p, where p is the largest prime factor of n.

The degree of a multivariate polynomial F (X1, . . . , X`) is the maximal degree of anadditive term, where the degree of a term is the sum of the powers of the variables in theterm. Note that we will generally be concerned with the special case n = p.

This allows us to bound the probability that, for random (x1, . . . , xℓ), equality relations between group elements arise that were not revealed by equality relations between polynomials.
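As a quick numerical illustration of the theorem (our own toy example, univariate case with n = p prime): a nonzero polynomial of degree d over Zp has at most d roots, so a uniformly random point is a root with probability at most d/p.

```python
# F(X) = (X-1)(X-2)(X-3) over Z_101: degree d = 3, exactly 3 roots.
p, d = 101, 3

def F(x):
    return ((x - 1) * (x - 2) * (x - 3)) % p

roots = [x for x in range(p) if F(x) == 0]
assert len(roots) <= d          # at most d roots, as the theorem promises
print(len(roots) / p)           # empirical root probability, at most d/p
```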

B.1 Generic Group Proof of IDDHI

To provide more confidence in the IDDHI assumption, we prove a lower bound for itscomplexity in generic groups [Nec94, Sho97]. The following definition of the assumptionis slightly more formal than the definition given in the preliminaries.

Definition B.1.1 (Interactive DDH Inversion (IDDHI)) Suppose that g ∈ G is a random generator of order q ∈ Θ(2^k). Let Oa(·) be an oracle that, on input z ∈ Zq^∗, outputs g^{1/(a+z)}. Then, for all probabilistic polynomial time adversaries A(·) that do not query the oracle on x,

Pr[a ← Zq^∗; (x, state) ← A^{Oa}(g, g^a); y0 = g^{1/(a+x)}; y1 ← G; b ← {0, 1}; b′ ← A^{Oa}(yb, state) : b = b′] < 1/2 + 1/poly(k).
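For concreteness, the oracle Oa(·) can be instantiated in a toy (completely insecure, parameters our own) prime-order group: the order-q subgroup of Zp^∗ with p = 2q + 1 = 23, q = 11 and generator g = 2 (since 2^11 ≡ 1 mod 23).

```python
# Toy instantiation of the IDDHI oracle O_a: on input z, return
# g^{1/(a+z)} in the order-q subgroup of Z_p^*. Illustrative only.
p, q, g = 23, 11, 2
a = 7                                   # the secret exponent a ∈ Z_q^*

def oracle_a(z):
    """Return g^(1/(a+z)) mod p, or None if a+z ≡ 0 (mod q)."""
    if (a + z) % q == 0:
        return None
    inv = pow(a + z, -1, q)             # inverse of (a+z) in Z_q (Python 3.8+)
    return pow(g, inv, p)

# Consistency check: oracle_a(z)^(a+z) == g in the subgroup
y = oracle_a(3)                         # y == 12 here
assert pow(y, (a + 3) % q, p) == g
```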

We follow the notation of Boneh and Boyen [BB04b]. We give the adversary as much power as possible and consider the case where we set IDDHI in G1, where e : G1 × G2 → GT and the XDH assumption holds in G1. This implies that the IDDHI assumption holds for non-bilinear groups as well.

In the generic group model, elements of the bilinear groups G1,G2 and GT are encodedas unique random strings. Thus, the adversary cannot directly test any property otherthan equality. Oracles are assumed to perform operations between group elements, suchas performing the group operations in G1,G2 and GT .

Theorem B.1.2 (IDDHI is Hard in Generic Groups) Let A be an algorithm that solves the IDDHI problem in the generic group model, making a total of qG queries to the oracle OG computing the group action in G1, G2, GT of order p and the bilinear map e, and to the oracle Oa(·) that, on input i ∈ Zp^∗, outputs α(1/(a + i)). If a ∈ Zp^∗ and α, β, τ are chosen at random, then, when A does not query Oa on x,

Pr[(x, state) ← A^{OG,Oa}(p, α(1), α(a), β(1)); y0 = α(1/(a + x)); r ← Zp^∗; y1 = α(r); b ← {0, 1}; b′ ← A^{Oa}(yb, state) : b = b′] ≤ 1/2 + (qG + 3)^2 (4qG + 6)/p = 1/2 + O(qG^3/p).

Proof. We simulate the environment of the adversary using multivariate polynomials in Zp[A, R]. We start the IDDHI game at step S = 0 with Sα = 2, Sβ = 1 and Sτ = 0, i.e., S0 = 3, polynomials Fα,0 = 1, Fα,1 = A, Fβ,0 = 1, and random strings α0, α1, β0.

B begins the game with A by providing it with the 3 strings α0, α1, β0. The additionaloracle for the interactive assumption is simulated as follows:

Oracle Oa(·): A inputs c ∈ Zp^∗, and B sets Fα,Sα = 1/(A + c). If Fα,Sα = Fα,u for some u ∈ {0, . . . , Sα − 1}, then B sets αSα = αu; otherwise, it sets αSα to a random string in {0, 1}^∗ \ {α0, . . . , αSα−1}. B sends αSα to A, adding (Fα,Sα, αSα) to Lα. Finally, B adds one to Sα.

Eventually A pauses and outputs a value x ∈ Zp^∗ and an arbitrary state string state. B sets Fα,Sα = R, for a new variable R, sets αSα to a random string in {0, 1}^∗ \ {α0, . . . , αSα−1}, updates Lα, returns (αSα, state) to A, and increments Sα.

Next, B allows A to continue querying the above oracles. (Recall that at no time was Aallowed to query Oa on x.) Eventually, A stops and outputs a bit b′. This is A’s guess asto whether it received α(1/(a+ x)) or α(r) for an independently chosen r, although Bhas not fixed the values b, a, and r yet.

Notice that A cannot use the bilinear map to test his challenge value, because both candidate challenge values lie in G1. Furthermore, notice that A cannot compute a representation in G1 corresponding to the polynomial 1/(A + x), for a direct comparison, unless he queries the oracle for it, which is forbidden. To see this, consider that A can only compute polynomials of the form

P(A, R) = c0 + c1·A + Σ_{j=1}^n c2,j/(A + xj) + c3·R,

where the c0, c1, c2,j, and c3 are integer constants chosen by A and the xj are the values with which it queries oracle Oa(·). A needs to solve P(A, R) = 1/(A + x0) for some x0 ≠ xj for all j. By multiplying out the denominators, we have

(P(A, R) − 1/(A + x0)) · ∏_{j=0}^n (A + xj) = c0·∏_{j=0}^n (A + xj) + c1·A·∏_{j=0}^n (A + xj) + c3·R·∏_{j=0}^n (A + xj) + Σ_{i=1}^n c2,i·(∏_{j=1, j≠i}^n (A + xj))·(A + x0) − ∏_{j=1}^n (A + xj) = 0.


We can see that c0 = c1 = c3 = 0, because otherwise we cannot cancel the terms A^{n+1}, R·A^{n+1}, and A^{n+2}. As a result we can simplify the equation to

Σ_{i=1}^n c2,i·(∏_{j=1, j≠i}^n (A + xj))·(A + x0) = ∏_{j=1}^n (A + xj).

In order for the left and the right side to be equal, the corresponding polynomials must have the same linear factors. Note that (A + x0) is a factor on the left side and that the factors on the right correspond to the oracle queries. However, this means that x0 must be equal to xi for some 1 ≤ i ≤ n. Thus, we have a contradiction, for x1, . . . , xn are the values with which A queried the oracle. We have shown that A cannot compute the polynomial 1/(A + x0). This means that adding the equation R = 1/(A + x0) does not lead to any additional equalities between the polynomials. Thus, A has exactly a 1/2 chance of solving IDDHI, provided that B's simulation of this game works successfully for both possible values of R. We now evaluate this simulation.
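The contradiction can be checked exhaustively for a small instance (our own illustration, n = 2 oracle queries over Z_101): the left side always carries the factor (A + x0), which the right side lacks when x0 is fresh, so no choice of the constants c2,1, c2,2 works.

```python
# Exhaustive check that no constants c_{2,i} satisfy
#   sum_i c_{2,i} * (prod_{j!=i} (A+x_j)) * (A+x_0) = prod_j (A+x_j)
# when x_0 is distinct from all oracle queries x_j.
p = 101
x0, x1, x2 = 5, 7, 9                  # x0 is not among the queries x1, x2

def polmul(f, g):
    """Multiply coefficient lists mod p (index = power of A)."""
    r = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            r[i + j] = (r[i + j] + a * b) % p
    return r

lhs1 = polmul([x2, 1], [x0, 1])       # (A + x2)(A + x0), the i = 1 term
lhs2 = polmul([x1, 1], [x0, 1])       # (A + x1)(A + x0), the i = 2 term
rhs = polmul([x1, 1], [x2, 1])        # (A + x1)(A + x2)

solutions = [(c1, c2)
             for c1 in range(p) for c2 in range(p)
             if [(c1 * a + c2 * b) % p for a, b in zip(lhs1, lhs2)] == rhs]
assert solutions == []                # no constants satisfy the equation
```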

Analysis of B’s Simulation. To see how the success probability of A is bounded, we now set A = a for a random a ∈ Zp^∗ and show that we can set R to both R = 1/(a + x) and R = r for a random r ∈ Zp^∗.

We test (in Equations B.1, B.2, B.3, B.4, B.5, and B.6) whether the simulation was perfectfor all of these cases; that is, if the instantiation of A by a or R by 1/(a+x) or r does notcreate any equality relation among the polynomials that was not revealed by the randomstrings provided to A. Thus, A’s overall success is bounded by the probability that anyof the following hold.

F1,i(a, r) − F1,j(a, r) = 0, for some i, j such that F1,i ≠ F1,j, (B.1)

F1,i(a, 1/(a+x)) − F1,j(a, 1/(a+x)) = 0, for some i, j such that F1,i ≠ F1,j, (B.2)

F2,i(a, r) − F2,j(a, r) = 0, for some i, j such that F2,i ≠ F2,j, (B.3)

F2,i(a, 1/(a+x)) − F2,j(a, 1/(a+x)) = 0, for some i, j such that F2,i ≠ F2,j, (B.4)

FT,i(a, r) − FT,j(a, r) = 0, for some i, j such that FT,i ≠ FT,j, (B.5)

FT,i(a, 1/(a+x)) − FT,j(a, 1/(a+x)) = 0, for some i, j such that FT,i ≠ FT,j. (B.6)

We distinguish two cases. In the first case, (a + xi) = 0 for some 0 ≤ i ≤ n, so that a denominator vanishes. This happens with probability at most τ1/p, and in this case the simulation fails. Otherwise we look at


the degree of the resulting polynomials when these rational functions are summed and the denominators are multiplied out. Each polynomial F1,i, F2,i and FT,i then has degree at most τ1, 0 and τ1, respectively. As the polynomials in Equations B.3 and B.4 are only constants, these equations cannot be fulfilled for F2,i ≠ F2,j.

For fixed i and j, we satisfy Equations B.1 and B.2 with probability ≤ τ1/p, and Equations B.5 and B.6 with probability ≤ τ1/p. Now summing over all (i, j) pairs in each case, we bound A's overall success probability by

ε ≤ (τ1 choose 2)·τ1/p + (τT choose 2)·τ1/p = (τ1/p)·((τ1 choose 2) + (τT choose 2)).

By combining the two cases, the error probability of the simulation is less than τ1/p + (1 − τ1/p)·(τ1/p)·((τ1 choose 2) + (τT choose 2)). Since τ1 < p and τ1 + τ2 + τT ≤ qG + 3, we end up with

ε ≤ (qG + 3)/p + (qG + 3)^3/p = O(qG^3/p).

The following corollary is immediate.

Corollary B.1.3 Any adversary that breaks the IDDHI assumption with constant probability 1/2 + ε > 0 in generic groups of order p requires Ω(∛(εp)) generic group operations.

B.2 Generic Group Proof of BB-HSDH

We provide more confidence in the BB-HSDH assumption by proving lower bounds on the complexity of the problem in the generic group model. Note that this also establishes hardness of the HSDH assumption in generic groups, as conjectured in [BW07].

Definition 1 (q-Boneh-Boyen HSDH (q-BB-HSDH)) On input g, g^x, u ∈ G1, h, h^x ∈ G2, random c1, . . . , cq ∈ Zp, and g^{1/(x+c1)}, . . . , g^{1/(x+cq)} for a random exponent x ∈ Zp, it is hard to compute a new tuple (g^{1/(x+c)}, h^c, u^c).

When an adversary tries to break the BB-HSDH assumption in the generic group model, the adversary does not get g, g^x, u, h, h^x and {g^{1/(x+cℓ)}, cℓ}_{ℓ=1...q} as input. Instead, the values are encoded using the random maps α, β, τ. The generators g, h and e(g, h) become α(1), β(1) and τ(1) respectively. We encode g^x as α(x), g^{1/(x+cℓ)} as α(1/(x + cℓ)), and h^x as β(x). Since g is a generator of G1, there exists a y ∈ Zp such that g^y = u. So we choose y at random and set u = α(y).

To break the BB-HSDH assumption, the adversary needs to output a triple (A, B, C) of the form (g^{1/(x+c)}, h^c, u^c) for some c ∈ Zp. We can test that the triple is well-formed using the bilinear map: e(A, h^x · B) = e(g, h) ∧ e(C, h) = e(u, B).
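Written in the exponent (with A = g^a, B = h^b, C = g^{c′}, u = g^y and e(g^s, h^t) = e(g, h)^{st}), the two pairing checks reduce to a·(x + b) ≡ 1 and c′ ≡ y·b (mod p). A toy check with our own small parameters:

```python
# Well-formedness of a BB-HSDH tuple, verified in the exponent.
p = 11                      # toy prime group order (our choice)
x, y, c = 7, 4, 3           # secret x, discrete log y of u, adversary's c

a = pow(x + c, -1, p)       # exponent of A = g^{1/(x+c)}
b = c                       # exponent of B = h^c
cp = (y * c) % p            # exponent of C = u^c = g^{yc}

# e(A, h^x B) = e(g, h)  <=>  a*(x + b) ≡ 1 (mod p)
# e(C, h) = e(u, B)      <=>  cp ≡ y*b   (mod p)
assert (a * (x + b)) % p == 1
assert cp == (y * b) % p

# A malformed B (encoding c+1 instead of c) fails the first check.
assert (a * (x + b + 1)) % p != 1
```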

In the generic group model we require that the adversary output random representations (αA, βB, αC). The adversary can either compute these values using the group oracles, or pick them at random, which he could also do by performing a generic exponentiation with a random constant. The adversary succeeds if αA = α(1/(x + c)) ∧ βB = β(c) ∧ αC = α(yc).

Theorem B.2.1 (BB-HSDH is Hard in the Generic Group Model) Let G1, G2, GT be groups of prime order p (where p is a k-bit prime) with bilinear map e : G1 × G2 → GT. We choose maps α, β, τ at random. There exists a negligible function ν : N → [0, 1] such that for every p.p.t. A:

Pr[x, y, {cℓ}_{ℓ=1...q} ← Zp; (αA, βB, αC) ← A^{OG}(α(1), α(x), α(y), β(1), β(x), {α(1/(x + cℓ)), cℓ}_{ℓ=1...q}) : ∃c : αA = α(1/(x + c)) ∧ βB = β(c) ∧ αC = α(yc)] ≤ ν(k).

Proof. Let A be any p.p.t. attack algorithm. We create an environment B that interactswith A as follows:

B maintains three lists: Lα = {(Fα,s, αs) : s = 0, . . . , Sα − 1}, Lβ = {(Fβ,s, βs) : s = 0, . . . , Sβ − 1}, and Lτ = {(Fτ,s, τs) : s = 0, . . . , Sτ − 1}. The Fα,s, Fβ,s, Fτ,s contain rational functions; their numerators and denominators are polynomials in Zp[X, Y]. B uses Fα,s, Fβ,s, Fτ,s to store the group action queries that A makes and αs, βs, τs to store the results. Once we fix X, Y to concrete values x, y, we have αs = α(Fα,s(x, y)), βs = β(Fβ,s(x, y)) and τs = τ(Fτ,s(x, y)).

B sets Sα = 3 + q, Sβ = 2 and Sτ = 1. B chooses random strings α0, α1, α2, {α_{2+ℓ}}_{ℓ=1...q}, β0, β1, τ0 ∈ {0, 1}^∗, and sets the corresponding polynomials as:

Fα,0 = 1, Fα,1 = X, Fα,2 = Y, Fα,2+ℓ = 1/(X + cℓ),
Fβ,0 = 1, Fβ,1 = X,
Fτ,0 = 1.

Then B sends the strings to A. Whenever A calls the group action oracle, B updates itslists.

At the end of the game, A outputs (αA, βB , αC). These values must correspond tobivariate polynomials Fα,A, Fβ,B and Fα,C in our lists. (If one of these values is not inour lists, then A must have guessed a random group element; he might as well have askedthe oracle to perform exponentiation on a random constant and added a random value tothe list. Thus we ignore this case.)

Since A must have computed these polynomials as a result of oracle queries, they must be of the form a0 + a1·X + a2·Y + Σ_{ℓ=1}^q a3,ℓ/(X + cℓ). If A is to be successful,

Fα,A · (X + Fβ,B) = 1 and (B.7)

Fα,C = Y·Fβ,B. (B.8)

For Equation B.8 to hold identically in Zp[X, Y], Fα,C and Fβ,B either need to be 0 or Fβ,B needs to be a constant, because the only possible term for Y in the polynomials is a2·Y. In both cases, the term (X + Fβ,B) in Equation B.7 has degree 1, and Equation B.7 can only be satisfied identically in Zp[X, Y] if Fα,A has degree ≥ p − 1. We know that the degree of Fα,A is at most q, and conclude that there exists an assignment in Zp to the variables X and Y for which Equations B.7 and B.8 do not both hold. Since Equation B.7, after multiplying out denominators, is a non-trivial polynomial equation of degree ≤ 2q, it admits at most 2q roots in Zp.

Analysis of B’s Simulation. At this point B chooses random x, y ∈ Zp, and now setsX = x and Y = y. B now tests (in Equations B.9, B.10, B.11, and B.12) if its simulationwas perfect; that is, if the instantiation of X by x or Y by y does not create any equalityrelation among the polynomials that was not revealed by the random strings providedto A. Thus, A’s overall success is bounded by the probability that any of the followingholds:

Fα,i(x, y) − Fα,j(x, y) = 0 in Zp, for some i, j such that Fα,i ≠ Fα,j, (B.9)

Fβ,i(x, y) − Fβ,j(x, y) = 0 in Zp, for some i, j such that Fβ,i ≠ Fβ,j, (B.10)

Fτ,i(x, y) − Fτ,j(x, y) = 0 in Zp, for some i, j such that Fτ,i ≠ Fτ,j, (B.11)

Fα,A(x, y)·(x + Fβ,B(x, y)) = 1 ∧ Fα,C(x, y) = y·Fβ,B(x, y) in Zp. (B.12)

Each polynomial Fα,i, Fβ,i and Fτ,i has degree at most q, q and 2q, respectively. For fixed i and j, we satisfy Equations B.9 and B.10 with probability ≤ q/p and Equation B.11 with probability ≤ 2q/p. We can bound the probability that Equation B.12 holds by ≤ 2q/p.

Now summing over all (i, j) pairs in each case, we bound A's overall success probability by

ε ≤ 2·(Sα choose 2)·q/p + (Sτ choose 2)·2q/p + 2q/p ≤ (2q/p)·((Sα choose 2) + (Sτ choose 2) + 1).

Let qG be the total number of group oracle queries made; then Sα + Sβ + Sτ = qG + q + 6. We obtain ε ≤ (qG + q + 6)^2 · 2q/p = O(qG^2·q/p + q^3/p).

The following corollary restates the above result:

Corollary B.2.2 Any adversary that breaks the q-BB-HSDH assumption with constant probability ε > 0 in generic bilinear groups of order p such that q < O(∛p) requires Ω(∛(εp)/q) generic group operations.

B.3 Generic Group Security of BB-CDH

We provide more confidence in the q-BB-CDH assumption by proving that it is implied by the (q+1)-SDH assumption [BB04b]. As SDH is secure in the generic group model, BB-CDH is too.

Definition 2 (q-Boneh-Boyen CDH (q-BB-CDH)) On input g, g^x, g^y ∈ G1, h, h^x ∈ G2, random c1, . . . , cq ∈ Zp, and g^{1/(x+c1)}, . . . , g^{1/(x+cq)} for random exponents x, y ∈ Zp, it is hard to compute g^{xy}.


Given a (q+1)-SDH challenge (g1, g1^z, g1^{z^2}, . . . , g1^{z^{q+1}}, g2, g2^z), our reduction tries to compute g1^{1/z}. To create a challenge for the q-BB-CDH assumption, we pick random c1, . . . , cq, r ← Zp and set g = g1^{z·∏_{i=1}^q (1+ci·z)}, X = g^x = g1^{∏_{i=1}^q (1+ci·z)}, Y = g^y = g1^{r·∏_{i=1}^q (1+ci·z)}, h = g2^z, Z = h^x = g2. Note that implicitly x = 1/z and y = r/z. As z·∏_{i=1}^q (1 + ci·z) is a polynomial in z of degree at most q + 1, the reduction can compute g.

We also need to compute g^{1/(x+ci)} = g^{1/(1/z + ci)} = g^{z/(1+ci·z)}. Substituting g = g1^{z·∏_{j=1}^q (1+cj·z)}, we get

g^{z/(1+ci·z)} = (g1^{z·∏_{j=1}^q (1+cj·z)})^{z/(1+ci·z)} = g1^{z^2·∏_{j=1, j≠i}^q (1+cj·z)}.

The polynomial z^2·∏_{j=1, j≠i}^q (1+cj·z) is again of degree at most q + 1. Thus the reduction can compute the g^{1/(x+ci)} values.

We are ready to query our BB-CDH adversary to obtain g^{xy}. We know that

g^{xy} = g^{r/z^2} = (g1^{z·∏_{i=1}^q (1+ci·z)})^{r/z^2} = g1^{r·∏_{i=1}^q (1+ci·z)/z}.

If we multiply out the product, we obtain a sum of q + 1 terms in the exponent: g^{xy} = g1^{r·(Σ_{i=1}^q ai·z^{q−i}) + r/z}. The value r·(Σ_{i=1}^q ai·z^{q−i}) is a polynomial of degree q − 1 in z. We can compute g1^{r·(Σ_{i=1}^q ai·z^{q−i})} and obtain g1^{r/z} through division. We compute the r-th root to obtain the desired g1^{1/z}.
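The exponent bookkeeping of this reduction can be sketched with toy parameters (our own: the order-11 subgroup of Z23^∗, q = 3): from the SDH powers g1^{z^i}, i ≤ q + 1, one can compute g1^{P(z)} for any known polynomial P of degree at most q + 1, such as the reduction's generator g = g1^{z·∏(1+ci·z)}.

```python
# Toy check: compute g1^{P(z)} from SDH powers without knowing z directly,
# then cross-check against a direct computation that uses the secret z.
p_mod = 23          # group Z_23^*, order-11 subgroup
r_ord = 11          # prime order of the subgroup
g1 = 2              # element of order 11 mod 23 (2^11 ≡ 1 mod 23)
zsec = 6            # the SDH secret z (hidden from the reduction)
qq = 3
powers = [pow(g1, pow(zsec, i, r_ord), p_mod) for i in range(qq + 2)]

def g1_to_poly(coeffs):
    """g1^{P(z)} by multi-exponentiation over the powers g1^{z^i}."""
    out = 1
    for c, gz in zip(coeffs, powers):
        out = (out * pow(gz, c, p_mod)) % p_mod
    return out

# Build the polynomial z * prod_i (1 + c_i z) for c = [1, 2, 3].
cs = [1, 2, 3]
coeffs = [0, 1]                             # the polynomial z
for c in cs:                                # multiply by (1 + c*z)
    coeffs = [(a + c * b) % r_ord
              for a, b in zip(coeffs + [0], [0] + coeffs)]
g = g1_to_poly(coeffs)                      # the reduction's generator g

# Cross-check against direct computation with the secret z.
exp_direct = (zsec * (1 + 1*zsec) * (1 + 2*zsec) * (1 + 3*zsec)) % r_ord
assert g == pow(g1, exp_direct, p_mod)
```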

The following corollary follows from the generic group proof of the SDH assumption:

Corollary B.3.1 Any adversary that breaks the q-BB-CDH assumption with constant probability ε > 0 in generic bilinear groups of order p such that q < O(∛p) requires Ω(∛(εp)/q) generic group operations.