
Page 1: Launching the DNS War: Dot-Com Privatization and the Rise of Global Internet Governance

UNIVERSITY OF MIAMI

LAUNCHING THE DNS WAR: DOT-COM PRIVATIZATION AND

THE RISE OF GLOBAL INTERNET GOVERNANCE

By

Craig Lyle Simon

A DISSERTATION

Submitted to the Faculty of the University of Miami

in partial fulfillment of the requirements for the degree of Doctor of Philosophy

Coral Gables, Florida

December 2006


©2006 Craig Lyle Simon

All Rights Reserved


UNIVERSITY OF MIAMI

A dissertation submitted in partial fulfillment of the requirements for the degree of

Doctor of Philosophy

LAUNCHING THE DNS WAR: DOT-COM PRIVATIZATION AND

THE RISE OF GLOBAL INTERNET GOVERNANCE

Craig Lyle Simon

Approved:

Vendulka Kubalkova, Department of International Studies
Dr. Steven G. Ullmann, Dean of the Graduate School
A. Michael Froomkin, School of Law
Haim Shaked, Department of International Studies
Jaime Suchlicki, Department of International Studies
Nicholas G. Onuf, Florida International University, Department of International Relations


SIMON, CRAIG LYLE (Ph.D., International Studies)
Launching the DNS War: Dot-Com Privatization and the Rise of Global Internet Governance (December 2006)

Abstract of a dissertation at the University of Miami.

Dissertation supervised by Professor Vendulka Kubalkova. Number of pages in text: 457.

This dissertation investigates the Internet governance debates of the mid 1990s,

narrating events that led to the signing of the Generic Top Level Domains Memorandum of

Understanding (gTLD-MoU) in May 1997. During that period, an unlikely alliance formed

to create a new institutional structure that would administer the Internet’s Domain Name

System (DNS). The collaborators included members of the Internet technical community’s

“old guard,” leading officials of the International Telecommunications Union, representatives

of organized trademark interests, and others. Their ambitious project aimed at constituting

a formal procedural apparatus capable of operating at a world-wide level, independent of the

sovereign state system. Institutional membership in the new structure was intended to confer

participation rights and normative obligations, thereby establishing status relationships that

resonated with the kinship, ingroup, and citizenship relationships of legacy social orders.

The example serves as a particularly valid and germane case study that can be used

to model power relations among responsible agents in an expressly global system of rule.

This postulated case allows for a more useful comparison of power relations within past,

present, and future epochs.


For my mother, Shirley Simon.


Acknowledgments

Family, friends, co-workers, colleagues, and members of the University

administration have all seen a long span of time play out between the inception and

conclusion of this project. So much so, that several years in, I decided to tell my nieces and

nephews that I was in the 27th grade. I thank everyone for their extraordinary patience.

Many individuals deserve mention for their helpful assistance and advice during this

long experience, but I want to single out a few.

I was exceptionally fortunate to receive feedback and advice from Michael Froomkin,

whose expertise in the Internet governance debates prompted numerous corrections and

improvements in the text. Though it became clear early on that he and I often stood on

different sides of this partisan issue, he treated my work with impeccable fairness. I am

deeply impressed by his example.

It has been a privilege to work with Nick Onuf, whose framework profoundly

affected the way I think about the study of human societies and how to do constructive

scholarship. My growing appreciation of his analytical framework radically advanced my

approach to this investigation, providing tools that allowed me to probe far more deeply into

the nature of rule making than I ever imagined possible.

Finally, special thanks go to my dissertation advisor, Vendulka Kubalkova. She is

responsible more than anyone for enabling me to resume academic work, and for stepping

forward at critical junctures to help me see my Doctorate in International Studies, started so

long ago, through to completion. It is traditional when completing a scholarly work of this

sort to acknowledge the input and assistance of others, while taking sole responsibility for

any errors in the result. Indeed, all errors are my own. Her role has been so pivotal, however,

that she deserves a share of the blame for whatever impact this work might have on the

world.


Preface

In the late 1960s an elite group of scientists, engineers, industrialists, educators, and

other specialists embarked upon a long-term project to advance the utility of computer

networks. Their goal was to create a highly scalable suite of data telecommunication

protocols and media technologies. The result is what we now know as the Internet. By the

1980s, leaders within that elite began to refashion their goals, envisioning their project as a

bid to create a universal system of communication. They undertook new projects, intending

to create formal mechanisms for managing key Internet resources. Those efforts were

organized in a manner that some hoped would shape an emerging system of Internet-based

global socio-political organization.

The worldwide surge in the use of the Internet in the mid-1990s sharpened the

organizational efforts of those founding elites. Prominent newcomers fostered idealistic

thinking about how Internet-based activity could provide models for a historical successor

to the international system of states. In other words, a real social phenomenon – the rise of

the Internet – was portrayed as an episode of grand significance, presaging a new moment

in human civilization.

Controversy arose as members of the founding elite attempted to reform the

administration of the Internet’s Domain Name System (DNS), a distributed database whose

functional apex is called “the root.” The DNS infrastructure serves as the prime navigational

service underlying the World Wide Web, and is essential to the successful function of many

applications, especially the distribution of electronic mail.
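The tree structure sketched above can be made concrete with a short, purely illustrative Python fragment (the function name `delegation_path` is hypothetical, not drawn from this text): every domain name is a path in a tree whose unnamed apex is the root, and resolution delegates authority downward, zone by zone.

```python
def delegation_path(name: str) -> list:
    """Illustrative only: list the chain of zones, from the root down,
    that delegate authority when resolving `name`."""
    labels = name.rstrip(".").split(".")
    path = ["."]  # the root zone -- the "functional apex" of the DNS
    # Read the labels right to left, building each more specific zone.
    for i in range(len(labels) - 1, -1, -1):
        path.append(".".join(labels[i:]) + ".")
    return path

print(delegation_path("www.example.com"))
# ['.', 'com.', 'example.com.', 'www.example.com.']
```

Because every chain begins at the root, whoever administers the root sits above every delegation in the hierarchy, which is what made it such a contested prize.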

The exploding use of the Internet and the apparently singular importance of the DNS

at the “commanding heights” of Internet rule-making led to the engagement of a wide circle

of actors. They soon fell out with one another, eventually coalescing into factions. Combatants in

the so-called “DNS War” went on to exert lasting influence within broader Internet

governance debates. The legacy of the conflict is likely to resonate within a larger, still-

evolving discourse. Key themes of the Internet governance debates could be replayed

throughout the 21st century as new arrays of global institutions become prominent and

perhaps ascendant in human culture.


Many participants in the DNS reform controversy articulated direct challenges to the

nation-state system. Its defenders responded energetically. The contending parties made

ideological claims that reflected clearly discernible patterns of opinion regarding the benefits

and detriments of globalization. Analysis of their discourse gives important insights into a

speculative line of questioning that was contemporaneous with the DNS reform debates.

Namely, how might the rise of “cyberspace” promote a “post-Westphalian” transformation

of world order?

The question is inherently too speculative to be answered with certainty, but it can

be answered insightfully. Any serious consideration of a future world order must also take

up the nature of the present one. Scholarly distinctions on the workings of power within the

current historical epoch are well known. One position speaks of an international system that

is an arrangement of autonomous states in a fundamentally anarchic environment. Another

speaks of a world society that is an expression of overlapping interests and shared values.

As a consequence of the DNS War, it is now also possible to speak of a formally constituted,

expressly global system of rule. Such a system would be empowered to exercise hierarchical

control over identifiable resources.

Many participants in the DNS controversy were notable members of the Internet

standards-making community. Other participants included business people, government

officials, and public interest advocates. Their combined activity displayed a wide variety of

status-based, normative, and material motivations. They generally understood that the

outcome of their arguments would affect essential components of the Internet. Since they

were undertaking rule-making on a grand scale, they were self-consciously aware that they

were engaged in a kind of social engineering. The levels of polemic argument and reflective

introspection were often quite high, as if the participants were categorically and self-

consciously justifying the foundations of their own rationality, even laying a record for

posterity by presuming to speak to distant historical observers.

At issue was the extent to which accountability for the development of the Internet's

name and number allocation system should be: 1) driven by impersonal market forces; 2) left

in the hands of elite Internet engineers; 3) reverted to national authorities; 4) given over to

formal public processes organized under a new global constituency, or; 5) managed by some

mix of these approaches. The bids to organize formal processes were often remarkably


ambitious, but only a few bore fruit. Some plans raised the prospect of establishing new

jurisdictions beyond the scope of traditional state powers. One planet-wide election did take

place, and there were hopes for more.

An overarching goal shared by most participants was to organize an accountable

deliberative body capable of governing the administration of key DNS resources. A central

point of contention was how to constitute a legitimate organization whose officers would,

on one hand, be expertly conscious of the Internet’s technical constraints and, on the other,

be directly responsive to the as yet unarticulated interests of a legitimating global polity.

Achievement of the goal was encumbered by contention over: 1) formal legal structure; 2)

how to establish geographic or geopolitical balance in the organization’s membership; 3) the

extent to which the organization’s powers should be limited, and; 4) how to address

conflicting demands from various propertied, credentialed, and ideological constituencies.

This dissertation narrates the early history of the Internet governance debates,

preparing the ground for further investigation and analysis. Research is based on interviews

and correspondence with leading figures in the debates, participation in some key events as

they occurred, and review of documentary evidence.

The methodological approach used here is qualitative, not quantitative. Rather than

prove a statistical correlation between causal variables and specific outcomes, the goal is to

make these events understandable through a historical narrative that is accessible to a wider

audience. It focuses on “The Root,” the most contended piece of “property” within the

Internet’s domain name system. The story of that fight contains enough acronyms to fill a

large can of alphabet soup. No one could claim a full understanding of what happened during

the domain name wars without mastering vocabularies from several diverse fields, including

computer science and intellectual property law.

The dissertation starts by discussing some of the foundational analytical problems

which motivated this investigation. The central narrative begins by foreshadowing one of the

most dramatic episodes in the controversy, an incident in early 1998 which revealed the root

as a point of contention. It also introduces Jon Postel, an eminent personality in Internet

history, as a decisive figure. Relevant aspects of the technical, ideological, and legal

background will be raised along the way.


The story then jumps back in time to describe how key institutional and personal

relationships were established. The creation and rise of the Internet is covered here, from the

origin of its technical precursors in the late 1960s, its expansion through the early 1990s,

and culminating with the first phase of the DNS War near the end of 1994.

The story then returns to the late 1960s to retrace the building of the Internet again,

but this time focusing specifically on the technical and administrative origins of the domain

name system. Building on a base of established characters and concepts, the narrative tracks

the launch of the DNS War, and recounts its early battles in three distinct phases:

1) Prior to September 1995, the US National Science Foundation (NSF) held a

preeminent role delegating oversight responsibility and making policy for the root. A crisis

was triggered on September 14, 1995 when officers within the NSF unilaterally instituted a

radical policy change. As a result, fees were imposed for the formerly free registration of

names within the well-known top level domains, .com, .net, and .org;

2) In response, many constituents of the Internet community demanded reform. Postel

moved to assume policy-making leadership. He was a likely candidate, given his capacity as

a highly respected figure within the Internet engineering community, and his position as

director of the Internet Assigned Numbers Authority (IANA), a key organ of the Internet

standards-making community. Despite giving considerable attention to the problem, he was

unable to offer a solution that satisfied a sufficiently broad constituency of interested parties.

3) In mid 1996 Postel passed the initiative to the International Ad Hoc Committee

(IAHC), a conglomeration of associates from the standards-making community, plus some

representatives of trademark interests. Together they developed a plan which proposed

radical reform of DNS management. Postel endorsed their results, but vehement resistance

gave rise to a concerted opposition. This phase culminated in the signing of the Generic Top

Level Domains Memorandum of Understanding (gTLD-MoU) on May 1, 1997, a bid to set

up an expressly global system for Internet governance.

Ensuing events will be chronicled in a brief synopsis. The gTLD-MoU signing led

to an intense phase of the DNS War, culminating in US Government intervention against it

in early 1998. In the aftermath, various agencies of the US government and a number of

lobbyists coordinated their efforts to develop a counter proposal, eventually leading to the

creation of the Internet Corporation for Assigned Names and Numbers (ICANN).


The narrative will be followed first by an analysis of the motivation of certain key

agents, particularly Vint Cerf, and then by a summary of findings. Three appendices are

included. The first provides a list of relevant acronyms, the second offers a justification for the

metaphorical scheme used in the narrative, and the third describes Internet-based sources and

interviews referred to in the text.


TABLE OF CONTENTS

List of Tables . . . xiii

List of Figures . . . xiii

List of Chronologies . . . xiii

“Who Makes the Rules?” . . . 1
    Constituting the Root of Power . . . 1
    Power Makers . . . 11
    Predecessors . . . 15
    Encountering the Internet . . . 23
    Investigating the IETF . . . 26
    Evaluating ICANN’s Errors . . . 33
    Opening the Inquiry . . . 39

Postel in Power . . . 42
    The Day the Internet Divided . . . 42
    Authority . . . 50
    Agency . . . 52

Internetting . . . 57
    ARPANET . . . 57
    Czar . . . 65
    IANA . . . 68
    TCP/IP . . . 74
    Commercialization and Privatization, Round 1 . . . 83

Institutionalizing the IETF . . . 95
    Open Culture . . . 95
    The Rise of the IAB . . . 101
    Discretionary Authority . . . 106
    ISOC Underway . . . 110
    The Anti-ITU . . . 116
    Palace Revolt . . . 123
    “It Seeks Overall Control” . . . 129
    Standardizing the Standards Process . . . 135
    Charters . . . 137

Lord of Hosts . . . 144
    What it All Means . . . 144
    Dawn of the DNS . . . 145
    Domain Requirements – 1983-1985 . . . 150


    From SRI to GSI-NSI – 1991 . . . 159
    One-Stop Shopping – Summer 1992 - Spring 1993 . . . 165
    Running the Internet’s Boiler Room – 1993 and Beyond . . . 172
    The Goldrush Begins – 1994 . . . 181
    IEEE Workshop – September 1994 . . . 191
    InterNIC Interim Review – November 1994 . . . 194
    Exit GA – January 1995 . . . 197
    Enter SAIC – February 1995 . . . 198
    Top Level Domains: Who Owns Them? – March 1995 . . . 201
    Under New Ownership – Summer 1995 . . . 211
    Pulling the Trigger – September 1995 . . . 213

A New Age Now Begins . . . 221
    Scrambling for Solutions – September 1995 . . . 221
    On the Roadshow – November 1995 . . . 230
    You Must Be Kidding – January 1995 . . . 235
    ISPs Ascending – February 1996 . . . 240
    The Rise of ARIN – Flashback . . . 248
    The Alt-Root Backlash – Spring 1996 . . . 255
    Draft Postel – May 1996 . . . 263
    IRE BOF – June 1996 . . . 273
    Blessed Envelopes – July 1996 . . . 279

The Technical Construction of Globalism . . . 293
    A Chorus of Critics – September 1996 . . . 293
    Blue Ribbons – November 1996 . . . 304
    IAHC – Late 1996 . . . 306
    Constituting CORE – December 1996 . . . 313
    Blocking the MoUvement – Jan-April 1997 . . . 316
    Haiti and iTLDS – March 1997 . . . 333
    Laying the Cornerstone for a Global Commons . . . 336

After Geneva. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 346

Conclusion . . . 359
    Pursuing the Gutenberg Analogy . . . 359
    Summary of Findings . . . 368

Appendix I: Acronyms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371

Appendix II: Power . . . 380
    Rule-making on the Internet vs rule-making in general . . . 380
    The Skilled Practice of Rules and Categories . . . 382

        Structuration . . . 382
        Speech Acts . . . 390
        Interests . . . 396


    A Short History of Interests and Motivations . . . 400
        Fear . . . 400
        Rationality . . . 404
        Reflexivity . . . 407

    Deploying Rules . . . 411
        The Scope of Parts and Wholes . . . 411
        From Neurons to Narratives . . . 414
        Direction of Fit . . . 428
        Guides, Gatekeepers, and Peers . . . 437

Appendix III: Comment on Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 443

Bibliography and Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 449


List of Tables

Table 1  Reclassification of the IP address space, September 1981, RFC 790 . . . 80
Table 2  David Conrad’s Models of Internet Governance . . . 247
Table 3  Conceptual Sources of Rule . . . 386
Table 4  Triadic Constructions of Motivations and Interests . . . 405
Table 5  Matrix of Rule Styles . . . 428
Table 6  Searle’s Five Illocutionary Points and Constructivist Derivations . . . 436
Table 7  Mailing Lists and Abbreviations . . . 445
Table 8  Selected Schedule Interviews . . . 446
Table 9  Meetings and Conferences Attended . . . 447

List of Figures

Figure 1  Partial representation of the legacy DNS hierarchy . . . 44
Figure 2  Root Server Hierarchy of the late 1990s and early 2000s . . . 45
Figure 3  Creators of the ARPANET . . . 56
Figure 4  Jon Postel, Steve Crocker and Vint Cerf . . . 56
Figure 5  Jon Postel in 1997 . . . 56
Figure 6  Depiction of early ARPA Network . . . 58
Figure 7  Danny Cohen’s proposal for TCP/IP, circa 1977 . . . 76
Figure 8  “The In-Tree Model for Domain Hierarchy” . . . 148

List of Chronologies

Chronology 1  Selected Events in Internet History . . . 41
Chronology 2  Early Infrastructural Development . . . 94
Chronology 3  Early Steps in IAB, IETF, and ISOC Governance . . . 143
Chronology 4  Selected Events in DNS History . . . 345
Chronology 5  DNS History after the gTLD-MoU . . . 358


1. “Alt Hierarchy History - Brian Reid, Usenet Newsgroups, Backbone...”, Usenet Newsgroups History, http://livinginternet.com/u/ui_alt.htm.

2. Recounted by Stuart Lynn in an interview with WashingtonPost.com, June 20, 2002, http://discuss.washingtonpost.com/wp-srv/zforum/02/technews_lynn062002.htm.


It's a thorny subject. As the state of my mental hands prove every time I think I have grasped it.

Harald Tveit Alvestrand

1. “WHO MAKES THE RULES?”

a. Constituting the Root of Power

Among the most hallowed beliefs circulating during the late 20th century was that no

one was in charge of the Internet. That myth was especially popular among those who should

have known better – the computer specialists and public officials who pioneered the system.

The reality was quite different, of course. Their vision of a scalable, interoperable, globally-

accessible network shaped not only the Internet, but the unfolding of the new millennium.

Nevertheless, the powerful were happy to retreat behind their legend of powerlessness.

Consider Brian Reid, who began working as a computer scientist while an

undergraduate at the University of Maryland in the late 1960s, and who went on to contribute

pivotal advances in electronic document processing, scalable routing systems, and search

engine technology. In 1988 Reid proclaimed, “[To] offer an opinion about how the net

should be run [is] like offering an opinion about how salamanders should grow: nobody has

any control over it, regardless of what opinions they might have.”1

A long parade of unfounded assumptions kept the legend alive. One quip, nearly a

decade after Reid’s, epitomized them all: “Asking who manages the Internet is like asking

who manages the world's sidewalk program.”2 The instigator of that one, Stephen Wolff, was

Director of the Networking Division at the United States National Science Foundation

between 1986 and 1995. Wolff presided over the construction of the Internet’s first high

speed data backbone. He then organized the transfer of that backbone from the US

Government to private ownership. Under Wolff’s leadership the NSF distributed funds that

made it possible for hundreds of universities to connect to the Internet. This made him a key


figure in the events that led to the dominance of the Internet Protocol and the

computerization of modern society. Yet, despite Wolff’s remarkable personal achievements,

the myth conveyed by his quip is far better known than his name.

It is appropriate to set the record straight. Sidewalks do not build or maintain

themselves, and neither did the Internet. It is not a distinct life form, like a salamander, with

its own independent will and self-contained anatomy. The Internet is the intentional product

of human choices, and thus forever subject to them.

The fantasy of the Internet’s sovereign anarchy was sustained by Reid’s and Wolff’s

misleading metaphors and many like them. Over and again, people who laid the foundations

of the Internet fostered myths which denied the far-reaching impact of their choices. This was

not due to modesty, but to a familiar ethic of irresponsibility. It allows people from all walks

of life – leaders and followers, inventors and functionaries – to excuse themselves from the

implications of their power and the ramifications of their actions.

This is not to say that Wolff and Reid were part of some grand conspiracy to foist the

Internet on a naïve public. They embraced the myth of the Internet’s sovereign anarchy as

dearly as anyone. It would be fair to say that they felt humbled and privileged by the

opportunity to contribute to something so amazingly, so excitingly big. The paradox of the

salamander and sidewalk myths was an overweening humility that left no place for the

guiding role of human intention.

Cherished delusions are not unusual in human history. Nowadays they are quite

prominent, made manifest as individuals act out their choices while declaring humble

obedience to providential powers. The list of credited forces is long and familiar: the anarchy

of the international system (the law of the jungle writ large), the invisible hand of the market,

the immutable logic of Reason, the sacred inviolability of tradition and taboo, the will of

supernatural beings, or the rule of scientific progress. These pretenses are characteristic of

a culture in which people learn to deny responsibility for their individual deeds by chanting

the dictates of socially exalted creeds.

It is certainly true that many people shamelessly proclaim virtuous-sounding

philosophies to cover their vicious intents. But that sort of self-conscious hypocrisy is a


3. The approach used here generally follows from the views of Giddens (1984).

4. A basic precept is that large complex systems having slightly varied initial conditions may produce widely varying effects. Gleick (1988) remains the most popular book on the concept.

different matter, and ultimately less pervasive. As in Reid’s and Wolff’s cases, most people

seem honestly unaware of the contradiction between the powers proved by their deeds and

the humility implied by their creeds.

Why did the myth that “no one is in charge of the Internet” become such a compelling

conventional wisdom? First of all, because it resonated with the order-out-of-chaos, free-

market Zeitgeist so prominent in that era. And that resonated with a deeper human urge to

subordinate one’s own willful actions to a desire for graceful harmony with the grandest

truths of the universe. But why do people buy into such justifications so readily? And why

was the idea of the Internet’s anarchy so compelling at that particular time?

The answer lies in the way people recognize and deploy the powers that are accessible

to them, and thus their capacity to make choices as responsible agents within a social

context.3 By acting to make a difference in his or her own life, someone may either reinforce

an existing social context, or perhaps change it slightly, or even inspire one that is altogether

new. One’s conception of agency informs one’s own sense of power.

Certain questions about power on the Internet suggest themselves: Who sets policy

for the Internet now? How did they acquire their positions? As I will show, a particular

cast of characters tended to be in the middle of things, collaborating or battling among

themselves. But questions about responsibility must probe deeply, and questions about

attitude toward responsibility more deeply still. Those are the underlying questions that drive

this investigation: Why did the Internet’s pioneers so often justify their decisions by crediting

superior forces beyond their control? Why did those justifications resonate so successfully

as overarching ideologies?

Many of the Internet’s creators were smitten by intellectually fashionable concepts

from chaos theory, a field of mathematics.4 Playing up its metaphors to describe growth and

change was a popular exercise in the 1980s and 1990s among thinkers in fields as diverse as

economics, psychology, and biology. As the jargon slipped into the engineering community,


5. “Self-organizing” was by far the more common metaphor. For a typical example of “edge-controlled,” see National Research Council (2001), http://newton.nap.edu/html/coming_of_age/ch4.html.

it became trendy to say that power-making on the Internet, like its technical structure, was

self-organizing... edge-controlled.5

Those metaphors intersected nicely with the “end-to-end principle,” a fundamental

architectural doctrine of the Internet community. The authors of the end-to-end principle rejected the

pyramid-like hierarchical structure of other computer systems in favor of moving intelligence

and control mechanisms out to peripheral devices. This approach paid off dramatically.

Earlier generations of mainframe/dumb-terminal computer systems and switched-circuit

telephones required relatively high overhead to maintain communication between endpoints.

By refusing to concentrate memory and computing power at a single machine, the Internet’s

designers simplified operational management and the process of introducing technological

upgrades. More importantly, the Internet avoided a centralized, circuit-switching approach

in favor of a highly-distributed, packet-based design. This scalable architecture enabled

movement toward a network of interconnected networks – thus, “internet” – far larger than

anything conceived before.

Another feature of the end-to-end design was that information about the state of a

conversation between devices could be maintained exclusively by the devices along the

periphery. There was no need for a central authority devoted to monitoring whether lines

were available or busy. Conceptually, the Internet was a “dumb network,” like a robotic

golem with no consciousness of its own.
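The division of labor described above can be sketched in miniature. The toy model below is my own illustration, not any Internet specification: a “dumb” network object merely forwards (and may silently drop) packets, while all conversation state – sequence numbers, received data, and loss detection – lives in the endpoint objects at the edge.

```python
# Toy illustration of the end-to-end principle: the network in the middle
# forwards packets but keeps no conversation state; the endpoints at the
# edge track sequence numbers and detect loss themselves.

class DumbNetwork:
    """Forwards packets verbatim; may drop them; remembers nothing."""
    def __init__(self, drop=None):
        self.drop = drop or set()      # sequence numbers to "lose" in transit

    def forward(self, packet):
        seq, _ = packet
        return None if seq in self.drop else packet


class Endpoint:
    """All conversation state lives here, at the periphery."""
    def __init__(self):
        self.next_seq = 0      # sender-side state
        self.received = {}     # receiver-side state: seq -> data

    def send(self, network, peer, data):
        packet = (self.next_seq, data)
        self.next_seq += 1
        delivered = network.forward(packet)
        if delivered is not None:
            peer.receive(delivered)

    def receive(self, packet):
        seq, data = packet
        self.received[seq] = data

    def missing(self):
        """The receiver itself infers gaps -- no help from the network."""
        if not self.received:
            return []
        top = max(self.received)
        return [s for s in range(top + 1) if s not in self.received]


net = DumbNetwork(drop={1})        # the network silently loses packet 1
a, b = Endpoint(), Endpoint()
for word in ["dumb", "network", "smart", "edges"]:
    a.send(net, b, word)
print(b.received)   # {0: 'dumb', 2: 'smart', 3: 'edges'}
print(b.missing())  # [1] -- endpoint b, not the network, detects the loss
```

The point of the sketch is that `DumbNetwork` holds no record of any conversation; if a packet vanishes, only the endpoints can notice and react, which is exactly why intelligence accumulated at the periphery.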

As the end-to-end principle proved successful, related concepts like dumb networks

and intelligent peripherals diffused through the Internet community, taking on creedal

properties. Quite a few engineers and analysts were already wary of central authority, perhaps

because of an anti-establishment cultural Zeitgeist inherited from the 1960s, or because of

the accumulated frustrations of working within large bureaucracies. For these and other

reasons, they found it convenient to blend in the trendy jargon of chaotically-styled, edge-

originated, edge-driven variation. It all rolled up into an allegory for Internet-mediated

personal freedom. Personification (or at least Salamanderization) of the Internet gave the


6. Wiener (1964:53-55).

golem a goal. The Internet’s most avid boosters began to portray it as an avaricious vanguard

of human progress.

Ironically, by crediting a complex mathematical theory for the logic of their design,

members of the engineering community tended to disown credit for furnishing the

circumstances under which variation could occur. Pioneers like Wolff and Reid were happy

to accept recognition for specific contributions, but not for creating and perpetuating the

overarching structure itself. There was more honor in the idea of serving the community

rather than leading it, and in doing so by mimicking the exigencies of Nature.

Not long before the rise of the Internet, the eminent mathematician Norbert Wiener

warned of “gadget worshipers” whose “admiration of the machine” betrayed a “desire to

avoid the personal responsibility” for potentially disastrous decisions.6 Gadget worship has

ramifications not only for physical devices, but for institutions as well. Social organizations can

be as enthralling as any new tool.

To say “no one controls the Internet” is to say that one of the most important

institutional innovations of the modern era is immune to human intervention. This is

mistaken. To embed a definite form of decision making at the heart of a process – be it dumb

or intelligent, centralized or anarchic – is to champion one’s own design preferences for the

body at large. Whether they admitted it or not, the Internet’s engineers were social engineers.

People may want to deny their responsibility by denying their power as agents, but such

denials will not hold up. An immutable laissez-faire is as bound by intention as any tyranny.

* * *

The crux of the story is this: There was a war. As in any war, the victors wrote the

rules. In this one, however, they described their actions as a process of natural evolution.

The war I will be chronicling ultimately came to be known as the DNS War. DNS

stands for Domain Name System. The DNS consists of a set of databases that are used to link

Internet domain names such as internic.net with underlying numeric addresses that identify

specific computer resources. The conflict erupted on September 14, 1995, when names


7. See John Gilmore, “IP: SAIC's acquisition of NSI - details,” IP, July 6, 2002. See the filing for SAIC’s purchase at http://edgar.sec.gov/Archives/edgar/data/1030341/0000950133-97-003115.txt and the filing for the sale to VeriSign at http://www.sec.gov/Archives/edgar/data/1030341/0001012870-00-002608.txt.

ending in dot-com – names that had once been registered free of charge through a

government-subsidized agency called the InterNIC – were suddenly subjected to registration

fees. The ultimate beneficiary was Science Applications International Corporation (SAIC),

a well-connected Defense contractor which reaped a multi-billion dollar windfall from the

decision.7

The DNS War of the late 1990s and early 2000s engaged countless partisans, plus

untold numbers of interested observers, unwitting bystanders, and aspiring referees. Nearly

all of them were sucked into a complex, drawn-out combat. Fortunes were won and lost.

There was stealth, betrayal, courtroom drama, and backroom dealing, and even the death of

a heroic figure in the heat of battle. There were regular shows of selflessness, and even a few

authentic acts of it. The grand prize of the struggle was control of a small but critical

database at the apex of the DNS. That database is called the root.
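The hierarchy at stake can be sketched with a toy resolver. This is an illustration of the delegation concept only; real DNS servers, record types, and zone contents are far richer, and all names and addresses below are placeholders of my own invention. Resolution starts at the root, which delegates to a top-level-domain database such as dot-com or dot-net, which in turn holds a specific name's numeric address. Whoever controls the root's table controls what every lookup beneath it can find – hence its value as a prize.

```python
# Toy model of DNS delegation: each zone is a database mapping the next
# label to either another zone (a delegation) or a numeric address.
# The "root" sits at the apex; every lookup begins there.
# All entries are made-up placeholders (RFC 5737 documentation addresses).

ROOT = {
    "com": {                        # delegation to the dot-com zone
        "example": "192.0.2.10",    # made-up leaf entry: name -> address
        "internic": "192.0.2.1",
    },
    "net": {                        # delegation to the dot-net zone
        "internic": "192.0.2.2",
    },
}

def resolve(name, root=ROOT):
    """Walk the hierarchy from the apex: 'internic.net' -> root -> net zone."""
    zone = root
    for label in reversed(name.split(".")):   # rightmost label first
        if label not in zone:
            raise KeyError(f"{name!r}: no entry for label {label!r}")
        zone = zone[label]
    return zone  # a numeric address string at the leaf

print(resolve("internic.net"))   # 192.0.2.2
print(resolve("example.com"))    # 192.0.2.10
```

Because every walk begins at `ROOT`, editing that one small table changes the result of every lookup below it, which is the sense in which the root database was the grand prize.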

Each veteran willing to reminisce about those days is bound to offer distinct

impressions of what the fighting was about. For many, the dominant theme would be the

privatization of the Internet’s domain name system – the creation of a market for a service

that had once been provided without charge by the US Government. Others would stress

more grandiose concerns – the preservation of free speech and open markets in a world they

called cyberspace.

It was about values, and money, and ethics, and establishing the rule of law. It was

about technology, innovation, and visions of the future. So much seemed to be at stake that the

drive to participate in making the decision was all-consuming... simultaneously exhilarating

and exhausting. The fight was about so many things that much is likely to be forgotten.

The parts that will linger, however, concern its central power play... the effort of

various factions to establish an institutional gatekeeper for the Internet. In fact, a quasi-

formal gatekeeping authority had been in place long before the DNS War. But that authority

had assumed a tacit and even self-denying persona, bowing to the self-humbling myth of the


8. Consider, for example, the “Cyber-Federalist” writings of Hans Klein under the aegis of the Computer Professionals for Social Responsibility, http://www.cpsr.org/internetdemocracy/cyber-federalist.html and cyber-federalist.org, and Marc Holitscher et al., “Debate: Internet Governance,” Swiss Political Science Review, 1999, 5(1): 115-136, http://www.holitscher.org/downloads/InternetGovernanceSPSR1.pdf.

Internet’s anarchy. Combatants in the DNS War sought to put things on a new level, out in

the open. This meant openly formalizing or perhaps even superseding the chain of authority

that had been in quiet effect under Steve Wolff and his predecessors.

The conflict was intrinsically political. To build up the powers of the gatekeeper’s

office, it would be necessary to provide a foundation of legal standing. That step suggested

another. Formal authority was not enough. There would also have to be some deliberate

articulation of the gatekeeper’s guiding doctrine. Consequently, a deeper, almost

philosophical kind of justification was necessary. What would it mean to regulate the use and

administration of Internet-based services? What would it mean to establish Internet

governance?

Those questions prompted vigorous debates about underlying values. Was the

Internet about freedom for freedom’s sake, the right to compete (or innovate, or

disintermediate), or stable growth for the sake of jobs and commerce? Some saw the debates

as an opportunity to sanctify a brave new orthodoxy. Politics begat religion.

The crisis raised hopes of another exciting step forward in the technical community’s

amazing experiment. Some participants were convinced that the world was on the verge of

a grand constitutional moment, as if the current order could be swept away, and redesignated

ancien régime.8 After all, the nation-state system wasn’t necessarily a permanent feature of

the world stage. It had superseded the Holy Roman Empire, and perhaps could be superseded

itself, marking advancement to a new phase in human community.

The goals of naming a global Internet gatekeeper and defining its duties were often

supported with that prospect very much in mind. Done right, it seemed, the results would

serve to advance the interests of global community and human fraternity. The DNS War

provided a venue in which people could dare to impose their own Utopian visions of a good

society onto the blueprints of tomorrow’s Internet. It was an opportunity to exert power right

at the root of a new world order and thus shape the future of our collective civilization.


9. Preservation of “openness” has served as a battle cry ever since. See, for example, Cliff Saran, “US proposals threaten openness of internet, warns Berners-Lee,” ComputerWeekly.com, May 24, 2006, http://www.computerweekly.com/Articles/2006/05/24/216093/US+proposals+threaten+openness+of+internet%2c+warns+Berners-Lee.htm.

Yet there was an inherent paradox in this Utopianism. The prevailing myths of the

Internet contrived a realm of unconstrained, participant-directed freedom. True believers

demanded that the Internet’s “openness” be preserved and fortified.9 Consequently, any move

to formalize enforcement of the Internet’s overarching rules would seem, on its face,

heretical.

Those sentiments were exploited by individuals whose material interests were best

served by maintaining the status quo. The move to fee-charging created instant beneficiaries

– SAIC among them – who worked hard to preserve that advantage. Rhetorically, they often

found it quite useful to declare themselves harmoniously in line with the vanguard of the

Internet’s spirit, whether one called it freedom, openness, unplanned informal order,

forbearance, chaos, or anarchy. They found no shortage of idealistically-motivated “useful

idiots” who could be enlisted to serve their purposes.

Beyond these tactical skirmishes, the struggle was also confounded by the classic

political dilemma of establishing a fair decision-making process. Those who hoped to lead

the Internet’s first overtly constitutional act faced a double-edged challenge. They needed to

constitute a venue for power making whose legitimacy depended on showing regard for

power sharing. They needed to enthrone a Boss, and they needed to make it look fair.

The text of the Internet’s new guiding values – as well as the rules which would

govern the gatekeeper – was being negotiated and articulated by parties whose material

interests were directly opposed to each other. So-called “stakeholders” squared off against

self-proclaimed “public interest advocates.” The participation of eminent technologists, legal

experts, and duly appointed representatives of national governments made it a Battle Royale.

The goal of the fight, ironically, was not to be recognized as the last fighter standing, or even

as the leader of the pack. The ultimate prize was to be acknowledged as referee.

Those jockeying for power sought to elevate themselves as the responsible peers of

a newly constituted order. All tended to frame their discourse with references to what was


10. Donald M. Heath, “Beginnings: Internet Self-governance: A Requirement to Fulfill the Promise,” speech delivered April 29, 1997, http://www.itu.int/newsarchive/projects/dns-meet/HeathAddress.html.

best for the Internet community as a whole, as if their ultimate motivation was nobly distinct

from their own particular concerns.

Partisans worked hard to establish themselves as rule-makers. Yet, despite their

remarkable record of success at producing vast changes in the world, they habitually

disclaimed their own agency. Even as they sought to consolidate control of the Internet’s

most valuable resources, they personified the Internet as an entity which could foil any

overseer. Consider the words of Don Heath, President and CEO of the Internet Society, at

a pivotal moment early in the DNS War.

We believe that for the Internet to reach its fullest potential, it will require self-governance. Our intent is to facilitate that realization. The Internet is without boundaries; it routes around barriers that are erected to thwart its reach - barriers of all kinds: technical, political, social, and, yes, even ethical, legal, and economic. No single government can govern, regulate, or otherwise control the Internet, nor should it. (my emphasis)10

In that context, “self-governance” meant loyal advancement of the Internet’s

barrier-breaking imperative. Traditional human constraints were deemed expendable. Such

handwashing rhetoric sought to shift responsibility onto a higher order purpose, as if one’s

own sources of power were rooted somewhere far beyond oneself.

* * *

By the end of 2001, administrative reform of the system had trailed off into a

bureaucratic quagmire, disappointing the idealists, and reducing their grandiose visions to

historical footnotes. Any description of what an Internet-mediated world order might look

like belongs firmly within the realm of speculation. But the episode left a trail of insights.

Participants in the DNS War offered models of association intended to suit an expressly

global political order. Those models were clearly distinguishable from the familiar kinds of

social arrangements that conventionally arise from within the bounds of sovereign states.


Those distinctions must be examined more closely, but the prospect of doing so raises

a formidable analytical problem. To understand the world that Internet idealists thought they

were creating, we must understand the world they actually inhabited. What framed their

vision? What circumstances did they seek to surpass? In other words, what is the structure

of the contemporary international system in the first place?

Traditional realists portray the current order as a fluid arrangement of autonomous

sovereign states. More economically-minded observers see an interdependent web of

pluralistic interests. And others describe a deepening regime of shared values and common

law. Is it a bit of all, or something else altogether?

For now, let us say that the present order is a discernible reality of rules and

rulemakers in which nation states are deemed to be the preeminent locus of politically-

mediated activity, framing the rights, obligations, and interests of the subjects and citizens

who are, at heart, its animating agents. Ultimately, what matters is that

any civilized human social structure is rooted in agency. If this is true for the modern

international system, it will also be true for its predecessor or its successor.

The key to making a comparison between past, present, and prospective social

arrangements, then, is to get a clear understanding of the mechanisms by which people

become social agents. That, in turn, means finding out how people actually go about

practicing old rules, making new ones, and abandoning others. If sociality is the skilled

practice of teaming up, it is necessary to examine the precise details of how social skills are

exercised.

In the end, I will argue, the DNS War largely reconfirmed the modern cultural attitude

of exalted denial. The issue of who may rightfully exercise power over expressly global

resources was left unsettled. Gatekeeping power over the Internet’s core resources simply

calcified further into the hands of those who already held it. People, for the most part, still

prefer to do as Dorothy was told in The Wizard of Oz... “Pay no attention to the man behind

the curtain.” Still, there were skirmishes in the DNS War that were about much more than

a greedy battle for market share. The experience challenged us to acknowledge the power of


agency, to open up the curtain, and to impose accountability on those who are at the controls,

starting with ourselves.

b. Power Makers

This is neither a simple story nor a short one. Its details are spun from an intricate

combination of technical, legal, and political threads having antecedents and postscripts that

span decades. Measured in “Internet years,” it spanned epochs. For some it became the

defining struggle of their lives.

Steve Wolff and Brian Reid will appear again later in this narrative, but another pair

of Internet pioneers – Jon Postel and Vint Cerf – will play a far more prominent role. In

many ways, this is a story about them. Their combined impact is likely to have such far-

reaching effects on history that they might rightfully be compared to epoch-making inventors

such as Johannes Gutenberg and Thomas Edison.

By inducing radical and authoritative transformations in peoples’ day-to-day

practices, Cerf and Postel emerged as exceptionally powerful agents of change. Having

known them both, however, I must hasten to add that they did not present themselves as

arrogant, larger-than-life masters of the universe, but as down-to-earth professionals who

were simply smart, diligent, and public-minded. Yet they quite intentionally forged careers

as recognized leaders within the Internet community. To the extent the creation of the

Internet launched a new phase of human order, the outcome bears the imprint of their design.

Three distinct metaphors – gatekeepers, guides, peers – can explain the powers of

these two pioneers. In fact, those metaphors shed light on the powers of any agent.

The first concept is easiest to convey. A gatekeeper’s action brings about a changed

state of affairs that other members of a society are forcefully obliged (whether through

physical or formal means) to recognize and respect. Gatekeepers close out or open up

options. They let people and things pass through gates, or can push them through.

Gatekeepers oblige others to act.

The second metaphor is also quite familiar. A guide’s power can be equated with

concepts like reputation and unenforceable moral authority. Guides make truth claims. But


being recognized as someone who makes a truth claim does not necessarily guarantee one

will be believed or followed. Guides act by speaking of obligation.

The last metaphor is somewhat more sophisticated. The power of a peer generally

implies the action of one party freely joining another to participate in a transaction that binds

them both. It could also mean testing the merits of a guide’s power, or voluntarily deciding

to play out the rules of an existing gatekeeping arrangement. Peers agree to oblige

themselves, and then act accordingly.

Each of these metaphors will be handy for showing the kinds of power struggles that

occurred during the DNS War, and what was at stake. Interested readers can turn to

Appendix 2 to find an extended discussion of how these metaphors were derived. But a

probing theoretical discussion is not necessary to set out the basic distinctions. For that, a few

more examples will suffice.

As the Internet’s most important gatekeeper, Jon Postel headed an office he called

the Internet Assigned Numbers Authority. He was thus empowered to make final decisions

about the allocation of resources within the Internet’s critical symbolic hierarchy. He could

make those decisions himself, or delegate the authority to do so. Postel was also a highly respected

guide in that he played a key role in defining the values and principles that motivated both

his delegates and the community at large. In fact, any Internet engineering veteran could

rightfully be considered a guide, presuming he or she advocated the idea of the end-to-end

principle or propounded the virtues of the dumb network.

The Internet’s most famous guide, arguably, was Vint Cerf. As co-creator of the

Internet Protocol, as a self-described evangelist for the technology, and as a perennial

figurehead within various organizational efforts, he became an internationally recognized

personality. Cerf also acted as a leading gatekeeper from time to time, channeling material

resources on behalf of public and private agencies to fund activities that turned out to be

critically important.

Like many who participated in the creation of the Internet, Cerf and Postel were both

peers. They produced experiments, they negotiated as colleagues, and they made agreements

about how to proceed as collaborators. Indeed, the goal of enabling further collaboration was


a central premise of the Internet’s end-to-end principle. The whole project reflected a

conscious intent to create an ever-expanding peer-enabling structure.

To get a richer flavor of Cerf and Postel as people, one might want to know how they

looked, how they sounded, how they moved, how they dealt with the emotions and puzzles

of their inner lives, and perhaps even what clothes they wore on this or that day. Those

pictures would offer a sense of who they were as living, breathing, flesh-and-blood people.

For starters, it might help to note that the correct pronunciation of Postel’s name is päs-‘tel,

with two short vowels and the accent on the second syllable. He was quiet, casual, rather

overweight, and easily recognized by his long beard, ponytail and Birkenstocks. Cerf wore

a beard as well, but his was trim and elegant, like the rest of his clothing and the assiduously

fine manner of his public persona.

Drawing a finer picture will be left to others. Given that this narrative will be focused

on agency in the Internet’s DNS War, biography must take a back seat to describing how

people acted as guides, gatekeepers and peers. Fortunately, those roles are not simply dry

academic abstractions. They reflect activities that any living, breathing individual might

undertake... from acts as grand as Internet engineering to those as mundane as toilet training.

To get a fuller flavor of how guides, gatekeepers, and peers are indeed walking, talking

people, it might help to consider another thing they do, and how their doings deploy power:

Consider, for example, the power of a kiss.

To bestow a kiss reflects a sheerly empirical kind of gatekeeping. An observer using

specific criteria would be able to tell whether or not a kiss had been given, with no other

meaning beyond that raw fact, like any snap of the fingers or clap of the hands. A kiss takes

place in material reality. But the performance of a kiss can also indicate an event of social

significance. For example, depending on the context, bestowing a kiss could change the

status of the recipient. A ceremonial kiss might formalize the recipient’s official standing in

a hierarchy. In certain circumstances it might mark the recipient for arrest, even

assassination. The point here is that the instance of the kiss brings about a changed state of

affairs. And so might a finger snap or a handclap signal that a new reality has come to pass.

The kisser is a gatekeeper.


Once a kiss has been bestowed, those who are sufficiently skilled to know the

implications of the kiss within that social context will have been empowered to speak to its

particular meaning and value. Everybody who knows about the kiss is a potential guide,

starting with those who kiss and tell, and including any gossip or purveyor of status.

There is also, of course, the freely-given, mutually erotic kiss that readers of this

passage probably imagined first. It is most certainly an agreeable act between peers. To the

extent we cherish such kisses and say they are to be cherished, we guide our preferences and

suggest to others what should be called good.

* * *

Please consider yet one more example of how peering involves creative, risk-laden

endeavors. During my research I met quite a few Internet pioneers, several of whom talked

at length about the excitement of working on the very first computer networks. Some of the

equipment they described was hulkingly primitive. Their stories evoked the imagery of a

sepia-colored bygone era. As these graying men recounted the highlights of their youth,

resurfacing the thrill of intense effort, I often saw a change in the appearance of their eyes...

a joyful glimmer that I labeled “The Twinkle.”

When I met Jon Postel in 1997 I noticed he could get rather grumpy when talking

about the ongoing responsibilities of being the Internet’s big-shot guide and gatekeeper

(which, at the time, was an increasingly arduous and politically messy task). On one

occasion, however, our conversation turned to his involvement in an advanced networking

experiment. The technology was far ahead of its time, and he liked talking about its speed.

Once again, I noticed The Twinkle, but with a twist. It was motivated by thoughts about the

present, and not the past, as if he had just been out driving a high-performance concept car,

flat-out and open-throttle, with the top down and the wind in his hair.

The point here is to stress that acting as a peer involves creative risk. A prospective

peer cannot know for sure how things will turn out, or even whether potential partners in the

endeavor will be reliable. The attempt to engage as a peer may succeed delightfully.

Or it may crash. In either case, the attempt is life affirming.

Page 30: Launching the DNS War: Dot-Com Privatization and the Rise of Global Internet Governance


c. Predecessors

There have been prior narrations of the DNS War, of course, and others are likely to

emerge. What distinguishes this one is its focus on the precursors and early stages of the

conflict. At the time of this writing, the fullest published account of the story can be found

in Milton Mueller’s Ruling the Root: Internet Governance and the Taming of Cyberspace

(2002). Mueller received his advanced degrees from the Annenberg School for

Communication at the University of Pennsylvania and wrote the book while teaching

Information Studies at Syracuse University. His telling spans a period that starts with the

origins of the Internet in the 1970s, closing near the end of 2000 as a new institutional

framework for Internet governance finally took stable form.

This dissertation provides a far more detailed chronicle of the story, but focuses on

a narrower span, beginning in 1990, when the transition to commercialization began to take

shape. It concludes with the May 1, 1997 signing of an instrument called the Generic Top

Level Domains Memorandum of Understanding (gTLD-MoU).

Several of the DNS War’s most famous (and most infamous) incidents will receive only

brief treatment here because they occurred after the gTLD-MoU signing. The benefit of

concentrating on this earlier period is to present a discriminating look at the actions and

justifications undertaken by key agents prior to the signing. That enables a more detailed

replay of various dialogues and more attention to the context of each speaker’s circumstances.

Mueller and I reach very different conclusions about what the story means. Drawing

on analytical frameworks of political scientist Elinor Ostrom and economist Gary Libecap,

he portrays the fight over the domain name system as part of the evolution of a “common

pool resource.” Those would be resources that are easily accessible, and that are likely to

regenerate if not overused. Well-known examples include trees in forests, fish in fishing

grounds, and supplies of fresh water in rivers and aquifers.

Mueller sets up the problem in the following way: 1) The establishment, discovery,

or “endowment” of new resources occurs, which immediately spurs the next step; 2) A

rivalrous “appropriation” of those resources gets underway, prompting competitive battles

that lead to conditions of exclusion and scarcity, and finally, therefore; 3) The

“institutionalization” of rules helps resolve conflicts and facilitate continued exploitation of

the contended new resource.[11]

11. Mueller (2002: 57-8, 267).

12. Mueller (2002: 163-70).

13. Note how Vint Cerf’s influence is characterized by Mueller (2002: 151).

14. Mueller (2002: 110-2).

15. Mueller (2002: 266-7).

Framing the Internet governance debates within that theoretical lens, Mueller argues

that the technicians who created the domain name system sought to use it as a mechanism

to fund their future activities. Postel and Cerf were chief among them. Supported by allies

in the telecommunications industry, the trademark lobby, and elsewhere, this core group of

the Internet technical community formed a “dominant coalition” around itself intending to:

1) “reject the authority of the US government” to designate who would be entitled to collect

the rent on domain names; 2) “capture” the Internet governance process, putting itself fully

in charge of that process via the gTLD-MoU, and; 3) impose a new institutional regime over

the domain name system.[12] Members of that dominant coalition are the implicit villains of

Mueller’s story.[13]

Putting aside the question of whether Internet resources strictly equate with common

pool resources (since nothing is actually regenerated or replenished when the domain name

system is left fallow), Mueller gives a remarkably brief and sanguine account of the decision-

making process that endowed dot-com registrations with value in the first place. Names in

dot-com had been provided free of charge until September 14, 1995 when a secretly-

negotiated policy amendment caught the Internet community by surprise. Though many

paths to the institution of fees were possible, Mueller jumps to conclude that the September

14 event was “the only option.”[14] His entire analysis depends on interpreting this single

decision as responsible for transforming the Internet’s name space into a common pool

resource. Mueller goes on to argue that such resources are characterized by “disequilibriating

freedom... social innovation [and] absolute freedom.”[15] By implication, this utopian state of

16. Mueller (2002: 12, 267).

affairs reigned over the dot-com namespace in the immediate wake of the fee-charging

decision. That is, until the “dominant consensus” came in to wreck it. Members of the

“entrenched technical hierarchy,” he argues, “lacked the vision to understand” that their

resentment of the fee-charging decision and, more to the point, their subsequent

countermoves starting with the gTLD-MoU, violated the “old spirit” of the freewheeling and

“self-governing” system they had created years before.[16]

My closer study of the events leading up to the fee-charging decision will reveal that

the engineering community had good reason to be suspicious of how that decision was made.

It led, after all, to a massive financial windfall for one company – Science Applications

International Corporation (SAIC). As I will show, several of that firm’s subordinates had

cultivated very friendly connections inside the US government, especially within the agency

where the decision was enacted.

Also, Mueller vehemently criticizes members of the “technical hierarchy” for

forfeiting the “Jeffersonian decentralization” that had supposedly characterized their culture

in an earlier period. This is mistaken for a variety of reasons. First of all, the “Jeffersonian”

characterization overstates the case. Despite its famed openness, the community had no lack

of hierarchical decision-making and resource-allocation structures. For example, mechanisms

controlled by Postel became well entrenched quite early on. As I will show, those

mechanisms were sustained by a general recognition of cases in which the authority of

responsible gatekeepers was essential to their own institutional continuity and to the

operation of the Internet itself.

Second, rather than blindly forfeiting their norms, as Mueller suggests, members of

the technical hierarchy responded to the fee-charging event by offering up the gTLD-MoU

as an institutional structure designed to operate independently from states. Though

problematic, that structure was perceived as a less odious compromise of the community’s

norms than subordination to a government’s dictates. Given that states have more coercive

power than other legitimate social bodies, avoiding their potentially meddlesome

participation had an obvious appeal. As we will see, many members of the Internet

community were becoming extremely suspicious of the motives held by the US and other

governments. One could even argue that the community’s strategy of bypassing governments

was reasonably compatible with Mueller’s undisguised Libertarian-infused stance.[17]

Circumstances at the onset of the crisis had clearly demonstrated the need for a deeper

intersection between domain name operations, formal legal doctrine, and stable regulatory

bodies. The question was how to do it. The community’s answer was to suggest that the rule

of law need not necessarily depend on the rule of states.[18]

17. Mueller has occasionally published under the aegis of the Libertarian-oriented Cato Institute. It is noteworthy that he equates an uncoerced style of power – which he implicitly prefers – with institutional control. Mueller (2002: 11). Parenthetically, by my scheme, whenever gatekeeping power comes into play, so would some degree of coercion, whether the brute physical coercion of states, or the ability to have the final say regarding a simple allocation decision.

18. For a succinct argument advocating deeper intersection, see Hans Klein, “ICANN Reform: Establishing the Rule of Law: A policy analysis prepared for The World Summit on the Information Society (WSIS), Tunis, 16-18 November 2005,” Georgia Institute of Technology, http://www.ip3.gatech.edu/images/ICANN-Reform_Establishing-the-Rule-of-Law.pdf.

Finally, it is noteworthy that leaders of the engineering community continually

asserted the nostrum that “no one controls the Internet” as a guiding principle even as they

refashioned its hierarchical mechanisms to establish a new array of gatekeepers. They did

indeed publicize their culture in Jeffersonian terms... as a decentralized, globally inclusive

community made up of innovative, highly autonomous freedom lovers, even if they

sometimes may have seemed to act at cross purposes with such principles. But that

compromise does not mean that they abandoned their core vision of highly scalable, end-to-

end networking. In fact, their strategy for resolving the domain name crisis was intended to

advance that goal by facilitating continued growth of the Internet, and with it, the capacity

of people to engage as peers.

Another point on which I differ from Mueller concerns the principle of

acknowledging oneself as an agent and not just as an author. Though many argue that the

task of scholars is to speak objectively and to project a neutral, uninvolved stance, I consider

it proper and necessary to disclose the facts that I participated in the Internet Governance

debates beginning in late 1997, and that I tended to be a moderate supporter of the technical

community’s position.[19] Mueller was involved also, and far more intensely than I. In fact,

he was an exceptionally well-traveled and outspoken opponent of the group he calls the

“dominant consensus.” But his book did not reveal that activity.[20]

19. Most notably, I served as a scribe at the first IFWP conference in Reston, Virginia, in July 1998, and I was a member of ICANN’s Working Group C (New gTLDs) in early 2000.

20. Beyond his partisanship in the governance debates, he also provided testimony as an expert witness for PG Media, one of the plaintiffs in related litigation (see “Declaration of Milton Mueller on Behalf of Plaintiff PGMedia, Inc.,” http://namespace.org/law/Mueller-Declaration.html). And Mueller later took on official responsibilities as a panelist in the Uniform Dispute Resolution Process (UDRP) that emerged from the institutionalization effort.

Readers will find that my narrative moves in directions quite distinguishable from

Mueller’s. That said, I also consider him to be a serious scholar and a friendly colleague.

There is no denying that his work stood as the most significant study of this aspect of Internet

history before now, and that it will remain highly useful to any student of the

subject.

* * *

It happens that important elements of the “Internet Governance” story were covered

in various works published well before the topic was recognized as a specific field of

concern. Two of those older resources provide particularly valuable background. Katie

Hafner and Matthew Lyon’s Where Wizards Stay Up Late (1996) is a popular account

of the origins of the Internet and its technical precursor, the ARPANET. Their history ends

before the DNS controversy begins outright, but they fill their pages with personal stories of

many key personalities, including Cerf. Janet Abbate’s Inventing the Internet (2000) also

focuses on the ARPANET and the early Internet, drawing attention to how supervision passed

from military to civilian agencies. She briefly addresses the domain name controversy, which

was in an early phase as she finished her study. One online history that deserves a look is “A

Brief History of the Internet,” written by several of the field’s engineering stars, including Cerf, Postel,

Wolff, and Barry M. Leiner.[21] Others include Zakon’s “Hobbes’ Internet Timeline,” Anthony

Anderberg’s “History of the Internet and Web,” and an online tutorial by David Mills.[22]

21. Barry M. Leiner, et al., “A Brief History of the Internet (Version 3.23, revised December 10, 2003),” http://www.isoc.org/internet/history/brief.shtml.

22. Robert H. Zakon, “Hobbes’ Internet Timeline v8.2,” http://www.zakon.org/robert/internet/timeline/. An earlier version was published in November 1997 as RFC 2235. Anthony Anderberg, “History of the Internet and Web,” http://www.anderbergfamily.net/ant/history/. David Mills (2005), “Tutorial on the Technical History of the Internet,” http://www.eecis.udel.edu/~mills/database/brief/goat/goat.pdf.

23. Rony and Rony (1998: viii).

24. For an early example, see David J. Loundy, “A Primer on Trademark Law and Internet Addresses,” 15 John Marshall J. of Computer and Info. Law 465 (1997), and http://www.loundy.com/JMLS-Trademark.html.

Ellen Rony and Peter Rony’s copious and wide-ranging book, The Domain Name

Handbook: High Stakes and Strategies in Cyberspace (1998), focuses directly on domain

names, covering a time span that overlaps the one treated in this dissertation. Their chief motivation

was not unraveling the controversy, however, but explaining domain name “registration

procedures and pitfalls” to their readers.[23] (Though it is peripheral to this dissertation, the

“How do I get a domain name?” question has proven to be the one I am asked most often in

public discussions.) The Rony and Rony book also provides plentiful detail about specific

trademark disputes. Similar territory has also been covered by various legal scholars who

have a professional interest in matters of jurisdiction and intellectual property.[24] I will deal

with a few trademark-related cases only in passing, addressing certain background concerns

– especially political ones – leaving the case histories and legal reasoning to others.

Some other issues examined by Rony and Rony will receive extensive treatment here.

One was an entrepreneurial movement they labeled the “Alterweb.” Another concerned the

questionable legitimacy of the so-called “charter” of the Internet Assigned Numbers

Authority. Like Mueller, they chronicle some of the events which led to the signing of the

gTLD-MoU. Again, what I hope to contribute is a fuller and more focused narration of the

associated episodes, so that readers can gain a livelier picture of agents and their

circumstances.

25. Michael Froomkin, “Habermas@discourse.net: Toward a Critical Theory of Cyberspace,” 116 Harvard Law Review 749 (2003).

26. A. Michael Froomkin, “Wrong Turn in Cyberspace: Using ICANN to Route Around the APA and the Constitution,” 50 Duke Law Journal 17 (2000). Also online at http://personal.law.miami.edu/~froomkin/articles/icann-main.htm. Note that while the publication date of “Wrong Turn” precedes “Habermas” (fn. 25), the latter was available online in draft form well before 2000.

27. See also Paré (2003) and Harold Feld, “Structured to Fail: ICANN and the Privatization Experiment,” in Thierer and Crews (2003).

A pair of relevant studies were published by A. Michael Froomkin (a Professor of

Law at the University of Miami who was active in the DNS debates, and is also a member

of my dissertation committee). In “Habermas@discourse.net: Toward a Critical Theory of

Cyberspace,” Froomkin made a sustained argument that the standards-making processes

developed by members of the Internet’s technical community met exceptionally high criteria

for ethics and legitimacy.[25] This illustrates the early appeal of the Internet’s culture, and how

that culture was attributed a status approaching an exemplar of moral authority. Froomkin’s

“Wrong Turn in Cyberspace: Using ICANN to Route Around the APA and the Constitution,”

written later, portrays some segments of the technical community with very low regard,

indicating the dramatic changes that occurred after the Internet Governance debates got

underway. Froomkin became very critical of the technical community’s participation,

deeming the policy process which emerged to be “precisely the type of illegitimate agency

decisionmaking that modern administrative law claims to be most anxious to prevent.”[26]

Other studies include Governing the Internet: The Emergence of an International

Regime (2001), by Marcus Franda, and Who Controls the Internet?: Illusions of a Borderless

World (2006), by Jack Goldsmith and Tim Wu.[27] Franda’s book launched a three-part series

about the Internet’s effects on global economic development. That first volume was

particularly concerned with the norms, principles and assumptions underlying the Internet

expansion policies promoted by the West, especially the regulatory forbearance policies

intended to encourage investment. In addressing the domain name controversy and the

related Internet Governance debates as examples of “agenda setting and negotiation,” Franda

was understandably far more concerned with the results of the DNS War than the details of

its episodes.[28] Goldsmith and Wu, on the other hand, offer a narrative that briefly recounts

some of the DNS War’s most celebrated events, linking them as anecdotes in a bigger story

about how the Internet’s idealistic culture ultimately conformed to the demands of geography

and the dictates of governmental authorities.[29]

28. Franda (2001: 43-81).

29. Goldsmith and Wu (2006: vii-ix).

30. Friedman (2005: 52).

* * *

It will be some time before history’s final verdict is rendered, but the idea that the

Internet launched the world on a path of fundamental transformation has become deeply

embedded and is likely to persist. Take the prognostications of Thomas Friedman, the author

and New York Times columnist whose best-selling The World is Flat (2005) portrayed the

dawn of the new millennium as marking the move to a new era of global technological

convergence. He based this prediction on the appearance of ten so-called “flatteners” –

crucial developments that sparked radical, unabating explosions of cross-border transactions.

All but one of them stemmed directly from increasing reliance on the Internet and

computerization. The other was the destruction of the Berlin Wall. And even that welcome

event was a product of the “information revolution,” he wrote, because interconnected

computers and faxes broke the “totalitarian monopoly” on tools of communication.[30]

Internet Governance debates were well below the horizon in Friedman’s book, as were

the civil society movements that have increasingly come to the fore in the early 21st century.

Yet the proliferation of cross-border grassroots political movements during the first decade

of the century certainly appears to validate Friedman’s views about the march of “flattening.”

If anything, it is accelerating.

* * *

Herodotus thought it best to “always wait and mark the end.” Perhaps recent

expectations of an Internet-engendered global transformation were unwarranted, revealing

a lot about people’s capacity to engage in exaggerated projections and wishful thinking, and

very little about likely consequences of the current reality. Whether a wrong turn was taken,

or no turn was taken, it is clear enough that a turn was intended. What follows is the story

of agents who hoped to drive the world in a new direction.

31. Mosaic was developed under the leadership of Marc Andreessen at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. Microsoft’s Internet Explorer browser is a direct descendant of that project. Andreessen also developed the Mozilla code base which was the heart of Netscape’s original Navigator browser. See http://ibiblio.org/andreessen.html.

A useful way to begin, I believe, is to describe for the reader my own story of how

I became aware of the DNS War and how I developed my particular understanding of it.

d. Encountering the Internet

My involvement began inadvertently. Though I had already been “online” for several

years through the Compuserve Information Service, I had no direct experience with the

Internet until 1994. That introduction was fascinating, but baffling. It was busy, electric, a

hopping swirl of electrons inside a vast labyrinth of code and wire, zipping with traffic,

organized unlike anything I had encountered before. There was no obvious beginning or end,

just the chance to plunge in and sweep along, giddy, feeling the current push out to an ever-

expanding universe of information-bearing islands and grand tangential way stations.

The main pieces of the Internet’s structure were not easy to make out. At first glance,

the format of email addresses seemed to offer a clue. Compuserve identified its members

with seemingly arbitrary numbers like 72210,3613, whereas the Internet let people use names

like [email protected]. It was an easy guess that email was being routed across the Internet

based on the parts of the name following the @ sign. Beyond that, I was stumped. Still, there

was a way to grope ahead. Compuserve offered something called a “gateway” to the Internet

through which users could send and receive email. Correspondents had to translate between

the two systems’ different address formats. I learned the procedure, but I was performing rote

operations step-by-step, without real insight as to why they were necessary.
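That guess about the @ sign was on the right track: the text after the @ is a domain name whose dot-separated labels form a hierarchy read from right to left. A minimal sketch of the idea (my own illustration, using a hypothetical address, not anything drawn from the Compuserve gateway):

```python
# Split a hypothetical email address into its DNS labels.
# DNS reads the domain right to left: the top-level domain is the
# most significant label, and each label to its left is more specific.
def domain_labels(email: str) -> list[str]:
    """Return the DNS labels of an address, most significant first."""
    _, _, domain = email.partition("@")
    return list(reversed(domain.lower().split(".")))

print(domain_labels("jane@cs.example.edu"))
# ['edu', 'example', 'cs']
```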

Finally, in early 1995, I saw a demonstration of the World Wide Web. A librarian at

my University had a machine running a program called Mosaic, a web browsing tool which

was the precursor of Netscape’s Navigator and Microsoft’s Explorer.[31] One could see that


the layout of the Web’s hypertext links, such as http://www.miami.edu/index.html, slightly

resembled Internet email addresses. The demonstration was exciting, but perplexing.

Compuserve did not yet provide web access. To get a browser running at home meant having

to find out about a whole new class of tools, and where to get them. With so little knowledge

of the Internet or how to maneuver my way around it, catching up looked like a daring

project. I was willing to strike up a conversation with anyone who seemed like they might

be able to demystify it for me, its naming system included.

The “net,” as insiders called it back then, was particularly difficult to understand

because there was no “there” to it. It had few features that resembled the online world I

already knew. Compuserve, in rudimentary terms, relied on a wide array of

telephone dial-up nodes that fed into high speed trunk lines. Those lines ultimately fed into

a powerful computer system in Columbus, Ohio. Compuserve used a technology known as

time sharing, so called because commands submitted by each logged-in user were serviced

in small slices of time. By sharing a computer’s processing power this way, a complicated

command issued by one user would never seem to bog down the system for everyone else.

It was easy enough to imagine bundles of wires feeding into Compuserve’s building where

a gigantic mainframe computer was humming away behind glass walls, a blinking monolith

tended by bald men wearing thick glasses and white coats.
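The time-sharing idea just described can be sketched as a toy round-robin loop (an illustration of the concept only; Compuserve’s actual scheduler was of course far more elaborate, and the user names here are hypothetical):

```python
from collections import deque

# Toy round-robin scheduler: each logged-in user's pending work is
# serviced one small slice at a time, so a long-running command never
# monopolizes the machine for everyone else.
def run_time_shared(jobs: dict[str, int]) -> list[str]:
    """jobs maps a user to units of work; returns the order of service."""
    queue = deque(jobs.items())
    order = []
    while queue:
        user, remaining = queue.popleft()
        order.append(user)          # give this user one slice of time
        if remaining > 1:
            queue.append((user, remaining - 1))  # unfinished: requeue
    return order

print(run_time_shared({"alice": 3, "bob": 1}))
# ['alice', 'bob', 'alice', 'alice']
```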

Before long it became clear that there was little sense to questions like “Where is the

Internet based?” The consistent response was that it didn’t work that way.

A full answer would take a lot of explaining. And catching on meant a lot of

relearning. The categories which rendered Compuserve comprehensible weren’t sufficient

to cover the Internet, where data storage and processing power were physically distributed.

Making analogies to familiar technologies could be misleading. Unlike the phone system,

which depended on continuous transmission over dedicated circuits, the Internet leveraged

a technology called packet switching, sending bits of disassembled information over shared

media.
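The contrast can be made concrete with a minimal sketch of the packet idea (my illustration, not any real protocol): the sender breaks a message into numbered packets that may traverse shared links in any order, and the receiver reassembles them by sequence number.

```python
# Break a message into fixed-size, numbered packets, then reassemble
# them by sequence number, regardless of arrival order.
def packetize(message: str, size: int = 4) -> list[tuple[int, str]]:
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    # Sort on the sequence number carried in each packet's "header".
    return "".join(chunk for _, chunk in sorted(packets))

packets = packetize("HELLO INTERNET")
packets.reverse()              # simulate out-of-order arrival
print(reassemble(packets))     # HELLO INTERNET
```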

Domain names stood out as particularly interesting creatures. The Internet was

supposed to be open and decentralized, yet its naming structure was obviously hierarchical.


It was clear that some organization somewhere was in charge of assignments. I developed

a routine question, “Who makes the rules for getting domain names?”

Not long after my first look at Mosaic I met someone who had a good grip on the

answer. He worked for an Internet Service Provider (ISP) which, back then, was a fairly new

and rapidly growing kind of business, and he gave me a reasonable summary of Internet

administration as it then stood. A US government-funded organization called the InterNIC

gave out names free of charge to qualified applicants. Names ending in .com were meant for

businesses. Names ending in .net were for ISPs, .edu was for colleges and universities, and

names ending with .org were for non-profit corporations.

I didn’t bother to ask who designed the system, or why they chose those particular

suffixes, or how the InterNIC was put in place, or any other of a host of questions that now

seem so relevant to the “who makes the rules” part of my inquiry. At the time I was

wondering whether I could get a domain name of my own. As it happened, the rules were

about to change, making the acquisition process easier for everyone.

In September 1995, as the result of a revision in US Government policy, the

subcontractor which had been operating the InterNIC adopted a fee-based, first-come first-

served policy for names ending in .com. Proof of qualification would no longer be needed.

Names had suddenly become commodities. It was impossible to miss the surge of advertising

which ensued. Many small businesses began offering name registration services, typically

for about $200.

For my own particular reasons I liked the word flywheel as a possible business name.

It was suddenly clear that if I didn’t act immediately to register flywheel.com, someone else

would surely beat me to it. I experienced a rising sense of panic that many Internet-savvy

people must have felt back then – the fear of missing a chance to get first pick at one’s

identity (a type of dread that must be deeply rooted in the human psyche).

Luckily, an acquaintance was starting his own Internet business and knew what to do.

He filled out the form required by this mysterious InterNIC, and sent it in on my behalf. He

even took care of requirements of which I was then unaware, like matching the name with

an underlying Internet Protocol number, called an IP address. I later received a $100 invoice


directly from the InterNIC, covering a two year registration period. What a bargain. With my

friend’s generous help I had been able to get the right to control flywheel.com. But I still had

no clue at all about how to put it to use.

Around that same time I signed up for an Internet account via a service provider

which had advertised in a book I had purchased. (This was still well before the genre of

primers pitched at self-selected morons had become a market phenomenon.) Soon I was

exploring news groups on the Usenet, and having some fun with a search engine called

Archie. I wasn’t using the Web at home yet, and I still had only the vaguest notion of how

the Internet fit together. I continued on with Compuserve. I also installed an early version of

America Online, seduced by its promise of hours upon hours of free service. My domain

name continued to lie dormant.

I had learned to place value on domain names, and I had even learned that I could –

with lots of help – acquire them. In a sense, I had just been transformed by a process of

socialization. I had entered into a society – I had become an agent within it – by adopting

pertinent values shared by other members. Other important evidence of that agency was that

I had adapted my actions to suit that society’s rules. Like the other newcomers, I was

acquiring new skills appropriate to the culture of an Internet domain name owner.

Nevertheless, I really hadn’t acquired any deeper understanding of who made the rules for

the InterNIC, or how those rules worked from the top down.

As a consequence of my ad hoc, bottom-up approach, I had learned that one of the

chief rules for getting domain names was to act quickly and in league with people who knew how

to work the system – in this case my generous friend; the entrepreneurs selling name

services; and the operators of the InterNIC at the top of the chain. I had gotten through the

gates.

e. Investigating the IETF

In early April, 1997 a group of about one thousand computer networking specialists

gathered in Memphis, Tennessee for a week-long meeting of the Internet Engineering Task

Force, better known as the IETF. For them, the meeting was about fleshing out the details

of the Internet’s newest technical standards. For me, the flight to Memphis was a fishing trip.

I needed a case study for a Ph.D. dissertation I had just started, and the IETF promised rich

waters.

32. Stephen D. Krasner, ed., International Regimes (1983). See also Robert Keohane, After Hegemony (1984).

My general idea for a topic concerned how recent developments in the creation of

standards for high-tech telecommunications seemed to be prompting the emergence of a new

kind of global regime. In common parlance, the word regime generally means a country’s

ruling government. For International Relations scholars like myself, however, regime is a

very precisely defined term of art concerning explicit and implicit kinds of international

arrangements that reveal interdependence among states. Academics generally use the term

to account for interlocking systems of trade, security, law, and even culture. More precisely,

regimes are said to emerge from a confluence of norms, interests, and decision-making

processes pertaining to specific issue areas.[32] Regime-oriented theories have been used to

explain international collaboration on issues as diverse as public health, arms control, and

management of ocean resources.

While most scholars who wrote about regimes tended to be interested in traditional

arrangements initiated and managed by state actors, I was investigating the nature of

arrangements initiated and managed by explicitly global regimes. The distinction depended

on two factors. One was the preeminent role of individuals within the regime who were

representing private bodies such as corporations, non-governmental organizations (NGOs),

civil society groups, or perhaps even just themselves. Consequently, diplomats and other

governmental functionaries were relatively less likely to perform as leading guides and

gatekeepers, establishing fundamental principles and carrying out operational rules.

The second factor was the increasing role of technical elites in setting policies for the

distribution of important resources. This is what made the IETF so interesting. Recent trends

hinted that governments were either ceding power to entities other than themselves, or that

governments were simply unwilling (or perhaps even unable) to hold the reins of control over certain kinds of modern resources. I was interested in finding examples that could prove the trend and uncover its causes.

[Footnote 33: The standard definition is “a network of professionals with recognized expertise and competence in a particular domain and an authoritative claim to policy-relevant knowledge within that domain or issue-area.” See Peter M. Haas, “Introduction: Epistemic Communities and International Policy Coordination,” 46 International Organization 1 (1992). Peter M. Haas should not be confused with Richard N. Haass, the former State Department policy planning director.]

The IETF was not the only option for a case study. Other candidates included

agencies that were allocating radio spectrum, designating satellite orbits, putting out faster

modem protocols, or developing new specifications for moving data across high speed fibre

and cable. The activity in so many areas indicated, to me at least, that a real power shift was

underway in the world. The technical elites who designed such standards were making

important decisions about rules and resources which the rest of the people in the world relied

on but rarely thought about.

In the field of International Relations these expert-run, scientifically-oriented regimes

were known as epistemic communities – groups of “knowers.” They were like technocratic

meritocracies, bringing together people around a set of shared assumptions and rigorous

methodologies as they generated increasingly specialized truth claims within highly focused

disciplinary areas. Epistemic communities exemplified systems of rule by professionally

trained technical specialists, supposedly immune from the “truths” that political leaders

might try to impose. They were weather wonks and garbage geeks. My kind of people.[33]

To prove the point I wanted to make about the growing influence of such groups, I

needed an example of a standard-making process that had a significant world-wide effect,

yet operated without direct government oversight. The IETF seemed to be more promising

than the other options for several reasons: First of all, the Internet was the new darling of

global communication. A dissertation on the dynamics of Internet standards-making would

certainly have more sizzle than one about the backroom politics of modems. Also, the IETF

was a notoriously open organization with a large and diverse mix of participants who met

relatively frequently. An outsider would find it much easier to observe activities there than

within any of the other candidate agencies.

[Footnote 34: These Areas are managed through the Internet Engineering Steering Group. See http://www.ietf.org/iesg.html.]

Most intriguing of all, some IETF members were remarkably uninhibited about

making radical sounding pronouncements. For example, at a meeting in 1992 one of the

IETF’s leaders, Dave Clark, a professor at MIT, had declared, “We are opposed to kings,

presidents and voting. We believe in rough consensus and running code.” The slogan was

soon printed on a thousand T-shirts which the IETF’s membership snatched up and wore

enthusiastically. When I learned about Clark’s statement and how strongly it had resonated

in that community, my mind ran wild with inferences about what it could mean.

The creators of the Internet were already being heralded in the media as technical

revolutionaries. Now I could see that they were daring to talk like political ones. I wanted to

know what kind of a revolution they had in mind. How did their political agenda affect their

technical one? Which of the IETF’s standard development processes most clearly

exemplified the political program of its members? If I could somehow identify the Next Big

Standard at its moment of creation, I believed I would be in a position to watch the future

unfold.

* * *

The Memphis IETF meeting was attended by nearly two thousand people, who spent

the better part of a week discussing hundreds of interrelated standards. The meetings were

arranged according to a handful of overarching categories called Areas, including Security,

Applications, Routing, Operations and Management, Transport, User Services, and Internet,

which dealt with fundamental technical protocols. Each Area had several working groups,

making up about one hundred active groups in all.[34] Working groups were identified by

acronyms that were only slightly more arcane than the titles they stood for. For example, the

Border Gateway Multicast Protocol group was known as BGMP. There was a lot to absorb.

The participants seemed to be an army of geniuses, led by an inner corps of “doers”

who seemed to turn up everywhere. The meetings also included many silent “goers” who

were just monitoring the proceedings. Not surprisingly, the meetings were pitched at a high

level of skill. Jargon-laden conversations focused on extraordinarily specialized matters. Had I wanted, I might have tried passing myself off as a goer rather than a topic-hunter, but I was exposed every time sign-in sheets were passed around to attendees at the working group sessions.

Nearly everyone wrote down a normal-looking Internet email address, whereas my Internet-readable CompuServe address stuck out like a sore thumb. I could have used my

AOL address, but quickly surmised that doing so would have made me look even more, for

lack of a better word, unhip. It hadn’t taken long to pick up on the fact that many IETF

members – people who had been using the Internet since the mid 1980s and earlier – deeply

resented the new mass market email services provided via aol.com, yahoo.com, hotmail.com,

and others. Use of those suffixes immediately identified the owners as clueless newbies –

rough and uncouth in the ways of the veterans’ beloved ‘net. There was even a word for an

IETF member’s prejudice against people who used mass market addresses – “domainism.”

* * *

Prior to my Memphis trip, the most appealing candidate for a case study of standards-

making was the effort to revise the Internet Protocol. The version then in use, IPv4, had a

glaring flaw. It was designed in the 1970s by a team that vastly underestimated the Internet’s eventual popularity. As a result, IPv4 didn’t contain enough numeric address space to meet

projected demand through the first decade of the 21st century. It was as if the phone system

had been designed with too few digits in the length of a telephone number. A two-digit phone numbering system, for example, would support only 100 distinct connections.

The Internet’s IPv4 standard could theoretically support “only” about 4.3 billion (2^32) unique connections... certainly not enough if everyone in the world eventually got

online with at least one device that needed an IP number. Avoiding the expected scarcity

would require an overhaul much more traumatic than simply adding area codes here and

there. Some stopgap measures were in place, but IETF members were hard at work on the

next version, IPv6. Their priority was to deliver gigantically higher capacity.
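The scale gap behind that priority is easy to verify with simple arithmetic. As a quick illustrative sketch (my addition, not part of the original account): IPv4 addresses are 32 bits wide, IPv6 addresses are 128 bits wide, and the telephone analogy above follows the same pattern with powers of ten.

```python
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits wide.
ipv4_space = 2 ** 32    # roughly 4.3 billion addresses
ipv6_space = 2 ** 128   # roughly 3.4 x 10**38 addresses

print(f"IPv4 capacity: {ipv4_space:,}")
print(f"IPv6 capacity: {ipv6_space:,}")

# The telephone analogy used above: a numbering plan with n digits
# can distinguish at most 10**n endpoints.
two_digit_plan = 10 ** 2
print(two_digit_plan)  # 100 distinct connections
```

Exhaustion was never about raw capacity alone; allocation practices wasted large swaths of the 32-bit space, which is why stopgaps appeared well before the theoretical limit.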

But that was not all. They were also working to deliver mechanisms that promised

lots more security. New features facilitating encryption and authentication of identity were intended to strengthen confidence that online communication could be a private and trustworthy form of interaction.

[Footnote 35: Credit goes to Michael Froomkin for bringing the “lock” analogy to my attention.]

[Footnote 36: The most celebrated cases involved Phil Zimmermann, who developed the Pretty Good Privacy (PGP) standard in 1991. Zimmermann then became a target of criminal investigation for violations of munitions control regulations, until the US government unilaterally dropped the matter.]

Providing security, of course, happens to be one of the most important things that

states do. The approaches were fundamentally different, however. Where states can provide

public security by coercing people, the Internet community was offering tools by which

individuals could better protect their private communication. Better “locks” could keep

potential thieves out of one’s business, and could keep governments out as well.[35] It didn’t

take a great leap of imagination to wonder about the ulterior activist motives of the IPv6

designers. The question of how much security to build into the protocol had become a

growing point of contention between the technical community and some governmental

agencies, sparking a series of legal battles between the US government and some IETF

members over the publication and export of some encryption algorithms.[36]

It seemed that focusing on development of the protocol revision would force me to

learn a great deal about how the Internet worked, both technically and politically. This, in

turn, would enable me to describe how an epistemic community melds the processes of

technical design and social architecture. Also, the topic offered a very appealing bonus... lots

of spicy source material. Technical though it was, the encryption debate stimulated real

passion. Security questions have a way of triggering a primal emotional response in people.

An important lesson of this poking around in the IETF was that the foundational

structure of the Internet was actually independent from the domain name hierarchy. Routing

IP packets turned out to be an issue that was far more important than domain names, and

much harder to visualize. Routing depended on a complex topology of nodes and exchange

points that constituted the roads and ramps of what was popularly called the “information

superhighway.” For various technical reasons, efficient routing depended on how IP

addresses were allocated among ISPs. Consequently, it was clear that the people who controlled the distribution of the new numbers would play a major role in shaping the flow of Internet traffic for decades to come. Given all these factors, an analysis of the debates surrounding IPv6 seemed like a perfect topic for anyone who wanted to investigate how an elite technical community could wield powerful influence over the development of globe-spanning infrastructures.

[Footnote 37: This was Jeanette Hofmann. See Sabine Helmers, Ute Hoffmann, Jeanette Hofmann, “Standard Development as Techno-social Ordering: The Case of the Next Generation of the Internet Protocol,” Social Science Research Center, Berlin, May 1996, http://duplox.wz-berlin.de/docs/ipng/.]
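Why allocation patterns matter for routing can be sketched with Python’s standard ipaddress module (my illustration, not drawn from the dissertation): contiguous blocks assigned to one ISP aggregate into a single route advertisement, while the same amount of space scattered across unrelated ranges demands a separate routing-table entry per block.

```python
import ipaddress

# Four contiguous /26 blocks handed to a single ISP aggregate into one
# /24 route advertisement, keeping backbone routing tables compact.
contiguous = [ipaddress.ip_network(n) for n in (
    "192.0.2.0/26", "192.0.2.64/26", "192.0.2.128/26", "192.0.2.192/26")]
print(list(ipaddress.collapse_addresses(contiguous)))
# -> [IPv4Network('192.0.2.0/24')]

# The same total address space scattered across unrelated ranges cannot
# be summarized: each block needs its own routing entry.
scattered = [ipaddress.ip_network(n) for n in (
    "192.0.2.0/26", "198.51.100.0/26", "203.0.113.0/26", "192.0.2.128/26")]
print(len(list(ipaddress.collapse_addresses(scattered))))  # -> 4
```

The example uses documentation-reserved prefixes (192.0.2.0/24, 198.51.100.0/24, 203.0.113.0/24); real allocations involved production blocks, but the aggregation principle is the same.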

As it turned out, there was a German sociologist at the Memphis meeting who was

apparently working on that topic, and she had already developed a strong background as a

result of previous studies on IETF processes.[37] Moreover, she seemed to be on familiar terms

with many people at the meeting and she clearly had an inside track on getting interviews

with its leaders. Academics generally prefer to be the first to write about a subject, so I was

open to finding another. Finally, on the next to the last day, I saw a presentation about plans

for a pact between the IETF, the International Telecommunication Union (ITU), and several

other private and public organizations.

The agreement, presented in the form of a Memorandum of Understanding (MoU),

had three practical goals. The first was to extend the domain name system by creating seven

additional suffixes, including .web, .art, .shop, and .info. These suffixes are known as Top

Level Domains, or – to people in the know – TLDs. Sales of names within those TLD

categories were to be managed by a new registry service that would function as a kind of

worldwide wholesaler. The second goal was to create a global array of registrar agents to

provide retail and customer service functions for both the new TLD registries and any

existing ones that wanted to leverage those services. The third goal was to develop and apply

a new quasi-judicial system to settle disputes between entities who wanted the same domain

name.

All of this was to be done under the aegis of a radically new kind of administrative

structure constituted expressly for that purpose. That structure was to be based in Geneva.

Since Switzerland was traditionally regarded as a neutral political venue, the move would signal to the people of the world that the Internet’s management was no longer dominated by

the United States Government. Also, to improve the process of registering domain names,

a new supporting standard called “core-db” was to be developed. Taken together, these steps

would transform the US-centered commercial monopoly based around the InterNIC into a

globally-diversified market. Everything was supposed to be completed well before the end

of 1997.

Here, it seemed, was an issue that concerned a fairly narrow band of events, yet was

rich with analytical possibilities. The IETF’s move to create new TLDs was clearly a case

of standards-making prompted by private actors. But there was an intriguing twist, in that the

ITU was one of the world’s oldest international organizations. Writing about the gTLD-MoU

seemed like a perfect fit for an International Relations program, and it would indulge my

abiding curiosity about who makes the rules for getting domain names.

The task ahead appeared to be simple and clear: Just dig into the details concerning

the political and technical background of the MoU and then, over the next few months,

monitor the development and implementation of the new core-db standard. If things stayed

on course, the research could be completed soon... perhaps by December that same year.

f. Evaluating ICANN’s Errors

Three and a half years later, in November 2000, the dissertation was far from complete

and I was attending yet another in a long series of contentious meetings on domain name

policy, this time at the Marriott Hotel in Marina del Rey, California. It was the second annual

meeting of the Internet Corporation for Assigned Names and Numbers (ICANN), where its

Board of Directors had just selected seven new TLD suffixes for the domain name system.

At the end of the last day, as an audience of about 800 people was clearing the room,

I spotted Bret Fausett sitting in the front row. A young Los Angeles-based attorney, Bret was clearly preoccupied by an intense internal dialogue, with his head in his hands and his elbows on his knees. Occasionally he would look up and slowly shake his head from side to side as if thinking, “No. I can’t believe it. We were so close.” Then he would drop his head into his hands

again. I approached, waiting for his state of mind to shift. He didn’t seem inordinately upset by his inner recounting. But he was not happy. Given what had just transpired, he was taking things rather well.

I had recently met Bret as a result of agreeing to sell my domain name, flywheel.com.

A few weeks earlier, someone representing a Silicon Valley startup had called offering thirty

thousand dollars for it. Several such offers had come over the years of the Internet boom, but

that one seemed far more serious. Prices for “good” names ending with dot-com had grown

quite high, partly because of the “irrational exuberance” fueled by the overheating stock

market, but also because of the drawn-out failure of the Internet’s rule makers to introduce

new suffixes.

Bret and I were both longtime members of an Internet-based community and email

list dedicated to the discussion of domain name policy issues. He had reviewed the contract

for me, and now at last, in Marina del Rey, I was meeting him in person. But this did not

seem like the best time to say thank you.

Bret was at the meeting representing a consortium of companies that were bidding

for the rights to inaugurate a TLD named .iii. His clients had paid a non-refundable $50,000

application fee to ICANN, and had shown they were prepared to make millions of dollars

more in related investments if their bid was accepted. More than forty other applicants had

done the same, seeking their own TLDs. During the course of the day the prospects for dot-iii

looked exceptionally good. At each step of the process it ranked high on the list of leading

candidates, which was projected on a giant screen at the front of the room. But fortunes

changed suddenly, right at the very end. Dot-iii was removed from the list. Seven winners

were chosen, and all the losers – including Bret and his clients – were left with nothing. The

application money was spent, hopes were dashed, and calls for reproach were in the air.

The MoU which had sparked my interest so long before was now old history, crushed

before it was launched. Over the ensuing years I had spent countless hours online, monitoring

and joining in as people fought over how to reform the domain name system and debated

what kind of rules should shape the Internet. I had published one of the first articles in my field to deal with the question of Internet governance,[38] and I had turned flywheel.com into a useful site for people looking for information about that subject.[39]

[Footnote 38: “Internet Governance Goes Global,” in Kubalkova, et al., eds., International Relations in a Constructed World (1997).]

[Footnote 39: Information developed during that period was moved to www.rkey.com/dns/overview.html.]

I had participated in a

fascinating and even inspiring series of meetings that some people called the Constitutional

Convention of the Internet. I witnessed ICANN’s emergence, practically out of the blue, after

a surprise backroom deal that made the efforts of those heralded meetings irrelevant. I had

become an active contributor to an ICANN working group that recommended criteria for

creating new TLDs. I had watched the controversy grow, attracting hordes of journalists,

lawyers, entrepreneurs and diplomats, as well as a former Swedish head of state and a

cacophonous handful of loons. And I had been present as various academics from

International Relations and other fields came on the scene to pursue their own investigations.

My topic had become a crowded territory.

At the time of the Marina del Rey conference, the dissertation was on what I called

“the back burner.” I was still working on it intermittently, amid making a living and having

a life. But I had come to this ICANN meeting because, finally, in November 2000, after years

of delays and setbacks, it looked like new Top Level Domains were really going to be added

to the Internet’s Domain Name System. It would have been a shame to miss it.

Bret’s attention soon returned to his exterior world. As our conversation began I

asked him about the choice of .iii as a suffix. His clients had made such a strong bid from a

technical and financial standpoint; it was surprising that they did not also select a suffix that

sounded more marketable. What was the appeal? Was it really a plausible choice? One board

member had derided “eye-eye-eye” for being “too hard to pronounce.” This struck me as a

fair point.

Bret’s answer was unequivocal. Since .iii didn’t mean anything in particular already,

it could be pronounced several ways, would be easier to brand later, and it could be marketed

differently in different languages. The string was also relatively less susceptible to typos.

Moreover, he felt it wasn’t ICANN’s business to mandate what TLDs should or should not be created. Nothing in the application process specified the name had to be pronounceable.

[Footnote 40: Daniel J. Paré, Internet Governance in Transition: Who is the Master of This Domain (2003: 166).]

Once registry operators had proved their technical and financial bona fides, why not allow

them to proceed? A few of ICANN’s board members had voiced support for portions of

Bret’s logic during the open deliberation period. But they were in the minority when the final

vote was taken.

I had learned a lot over the past few years, and was well aware that Bret knew better

than nearly anyone else on earth how to get a domain name. Yet he couldn’t get the one he

wanted. Ironically, he couldn’t even get a name that no one else wanted.

Despite all the knowledge that had been accumulated about who made the rules for

registering names, in the end it was all too clear that there was something humanly arbitrary

about the process. The choices were political, even personal, but not technical. In rendering

their decisions, ICANN’s staff and board members made a show of giving considerable

attention to published processes. In the end, however, the formalities seemed like lip service.

The final decisions didn’t seem to point to any hard and fast standards. Few among the

remaining applicants considered the process straightforward, neutral, or fair. Luck was

involved. Sentiments weighed heavily. Bargaining was evident. Threats were playing a part

as well. The outcome had more to do with the relative persuasive powers of the selectors than

the qualifications of the contenders.

ICANN’s officers were committing an error that had also been made by the authors

of the 1997 MoU. That same error was repeated throughout the history of the domain name

controversy. In the past it had occurred behind closed doors. This time it was being

performed in full public view.

New rules for the domain name system were being issued by individuals who had

fallen woefully short in their efforts to demonstrate the broadest possible legitimacy of their

authority to do so. History was repeating. It was just another episode in what one scholar

called a pattern of “closure and usurpation.”[40] A venue for “open” discussion would be

created; numerous participants would work furiously to arrive at a consensus; then some authoritative body would step in to impose its own notion of what the rules should be. It was

true that ICANN’s Board had done far more than its predecessors to prove its responsiveness

and to try to avoid looking like a usurper, but expectations had heightened over the years.

Observers were now terribly sensitive to legitimacy issues; they were demanding, skeptical,

and very hard to please.

ICANN’s institutional predecessors in the DNS War had been wrecked by allegations

of illegitimate, secretive behavior. Now ICANN was exposing itself to similar attacks. Its

board claimed to be acting as an open and inclusive body. But ICANN’s growing cadre of

critics believed that was a false pretense. The board had emerged from a closed and exclusive

process, after all, and had only opened up in response to strong public pressure. ICANN’s

critics now felt alienated and even subservient in the face of its actions. They railed at every

hint of a process violation and every whiff of a broken commitment. Despite the widespread

desire for new TLD suffixes, the perceived arbitrariness of the selection process did more to

energize ICANN’s opponents than it did to settle issues that had been plaguing the Internet

community for the previous five years. Would ICANN be the DNS War’s next casualty?

There was a critical difference between ICANN and its predecessors, however. For

now, just as long as they didn’t do anything that seemed too crazy, ICANN’s Board members

could count on the Clinton Administration to support their TLD selections and any related

policy decisions. An official from the Department of Commerce would co-sign its own

Memorandum of Understanding with ICANN and that would be that. Many other national

governments were likely to follow suit, in spirit if not in letter.

But governments are only a part of the Internet community. ICANN was just two

years old. It seemed to be creating enemies inside and outside the community faster than it

was creating allies. For the time being, given continued US endorsement of its activities,

ICANN’s existence was not in jeopardy. But now its supporters faced yet another enervating

round of vehement condemnation from a passionate, well-mobilized opposition. ICANN’s

enemies were intent on wearing it down. Its relative effectiveness depended on how well its

officers and supporters could endure each new wave of attacks and insults.


I relished having a front seat to history all those years, and felt privileged to have one

again that afternoon. But those feelings were offset by my belief that I was witnessing

another monumental waste of effort. November 16, 2000 could be construed as the dawning

of real Internet governance, but an old story was unfolding. It was easy to predict what would

come next. There would be massive public fights over process and philosophy while a small

group of individuals with privileged access to members of the US Government decided how

to proceed.

Many familiar faces were present that week, playing out their traditional roles at the

front line of the domain name wars. A contingent of veteran techies from the IETF’s old

guard was there, of course. Dave Crocker, John Klensin, and Jun Murai were warning against

doing things that might break the Internet, or somehow sabotage their carefully wrought

design for it. Christopher Ambler and Jonathon Frangie were there again, too. They were the

all time losing contenders for a privately operated suffix, and were pushing their claims for

control of dot-web as relentlessly and as unfruitfully as ever. Chuck Gomes was there too...

polished and charming, well-prepared and well-supported. He was fully determined to

protect his employer, Network Solutions, Inc., and its highly lucrative position as the SAIC

subsidiary that was the incumbent operator of the dot-com registry. Journalists and

academics were there in force. Author/pundit Ellen Rony was among them and was, as

always, chiding the people in power to do better.

Perhaps even the gentle ghost of the Internet’s noble hero was there – the

disembodied spirit of the lately departed Jon Postel. Postel was co-creator of the Domain

Name System, and the long-time steward of the Internet’s standards making process. He had

died two years before, but the presence of his influence was undeniable. He had been

honored and recalled in numerous speeches throughout the meeting. It was his handiwork,

after all, that was the subject of all this brouhaha at the Marriott in Marina del Rey. The suite

of offices where he had served as the Internet’s quasi-helmsman and reigned as its grand old

man was just down the street from the hotel.

If Postel’s ghost was indeed present, he was probably not hovering over the room in

the hope of bestowing some ethereal wisdom. It was far too late for that. Instead, he would most likely have been watching and listening from out in the hallway, on the other side of a half-closed door, an apparition of pained dismay.

g. Opening the Inquiry

When I first became aware of the Internet, some spark of curiosity generated a simple,

practical question about the here and now: “Who makes the rules for getting domain names?”

Along the way, the project mushroomed. My investigation of Internet addresses expanded

into a survey of all kinds of identifiers – not just electronic ones. Before long I was pondering

the structure of language, the nature of identity, and the constitution of power. Each detour

gave rise to others. New tangents beckoned, teasing and pulling with unexplored distinctions

and questions. How were identities assigned in the past? How might they be assigned in the

future?

Had I been more diligent about restraining my interests, I might have ended up

writing a fairly mundane and narrow description of reforms in the Internet’s technical

administration during the late 1990s and early 2000s. Instead, I undertook a much deeper

inquiry into the foundational rules of global society. The long, fractious effort to reform the

Internet’s domain name system is still the underlying thread of this story, but it has been

woven into a broader tapestry.

In hindsight, it is no great surprise that the project turned out this way. Despite all its

twists and turns, the central theme remains the same. This is a recounting of how particular

people fought over their particular interests at a particular point in time. The focus of their

energies was the world’s largest distributed database system, a collection of records that

included the dot-com domain name registry, one of the most productive cash cows in the

history of the Internet boom. With so much to gain and so much to lose, the struggle to reform the administration of that system often degraded into an outright brawl.

But the dispute was not just a mean grab for money and power. It also fired visions

of how the future might work. This corner of the Internet’s gold rush, more than any other,

was rich with Utopian idealism. These events define a pivotal stage in the development of a computerized human culture. They tell a story about how people behave when they think

they are making history, as indeed they were.

Many of the participants thought they were doing great things. Many conducted

themselves professionally. Others made great sacrifices for their ideals. Some behaved

selfishly, and therefore rather predictably. Far too many showed themselves as fools. Some

covered all the bases. Quite a few justified themselves according to some virtuous ideology,

and thus claimed to be acting in harmony with the proper order of things. What happened is

emblematic of how human foibles and human aspirations so often bump up against human

circumstances.

If this study offers anything unique at all, it is because my curiosity carried me to an

exceptional vantage point. While pursuing my initial question about who makes the rules,

I happened upon a place from which I could observe power being used and abused in the here

and now. By allowing that question to lead me in so many directions, I gained perspectives

that had perhaps never been considered before. I imagined that I could peer backward and

forward through history, into the sources from which power has always been drawn, and

always will be.


Chronology 1 Selected Events in Internet History

Date Event

October 2002 Massive Distributed Denial of Service (DDoS) attack against the root constellation.

November 16, 2000 ICANN approves new TLDs.

March 2000 SAIC sells NSI to Verisign for $15.3 billion.

October 16, 1998 Death of Jon Postel.

September 18, 1998 Creation of ICANN announced.

June/July 1998 Release of “White Paper.” First meeting of the IFWP.

January 28, 1998 Postel “splits” the root.

September 1997 Contentious Congressional hearings.

July 1997 Network Solutions’ IPO. Kashpureff’s cache poisoning.

May 1, 1997 gTLD-MoU signing ceremony in Geneva.

March 1997 Criticism of Postel and IAHC. IANA sued by IODesign.

November 12, 1996 Launch of the IAHC.

September 1996 “Coordinating the Internet Conference” at Harvard.

July 1996 IANA/IODesign “envelope” event. Collapse of newdom.

September 14, 1995 Fee charging begins in .com. Newdom dialogues begin.

March 1995 RFC 1602 formalizes Internet standards process and ISOC/IETF relationship.

March 10, 1995 SAIC purchases NSI for $6 million.

March 1994 RFC 1591 describes delegation policies for TLDs.

June/July 1993 IPv7 debates. Boston “Tea Party.”

April 1, 1993 Start of the Cooperative Agreement. NSI is running root and .com.

December 1992 Final negotiation of the Cooperative Agreement.

October 1, 1991 Root, DDN-NIC, and ARPA-NIC moved from SRI in California to NSI in Virginia.

June 1991 Formation of ISOC announced in Copenhagen.

August 1990 RFC 1174 published, describing IANA’s authority.

During 1990 ARPANET decommissioned. Tim Berners-Lee creates HTTP and web technology.

July 25, 1989 First meeting of IETF, as umbrella standards group for the Internet.

During 1986 Cerf, Kahn and Uncapher create Corporation for National Research Initiatives.

February 26, 1986 First incrementation of the NIC zone file; DNS is officially in production.

January 17, 1986 First meeting of Internet Engineering (INENG), direct ancestor of IETF.

October 1984 RFC 920 announces TLD selections, including .com.

September 1984 Creation of Internet Advisory Board, precursor of Internet Architecture Board (IAB).

June 24, 1983 First successful test of DNS technology.

January 1, 1983 ARPANET transition from NCP to TCP.

August 1982 RFC 819 proposes tree-like domain hierarchy.

May 1972 Jon Postel proposes himself as numbers “czar” in RFC 349.

During 1971 First use of @ in electronic mail. First use of hosts.txt to list ARPANET sites.

October 29, 1969 First host-to-host message transmitted on the ARPANET.


Prometheus had been foolish to bestow fire on men instead of selling it to them: he would have made money, placated Jove, and avoided all that trouble with the vulture.

Primo Levi, The Periodic Table

2. POSTEL IN POWER

a. The Day the Internet Divided

In early 1998 the Internet briefly split in two. Before the break, a computer known

as “Server A” had served as its unifying apex. To be more precise, Server A was the

technical fulcrum of the Domain Name System (DNS). That system allows names such as

rootsofpower.com to be mapped against the Internet’s underlying numeric addresses.

The split was possible because Server A did not stand alone, but worked in tandem

with a strategically dispersed cluster of “slave” servers. Together, they constituted the “root”

of the DNS. These were not supercomputers or giant mainframes, but nicely-appointed

enterprise-class machines with exceptionally robust, high-speed connections to the Internet.

They were constantly online, running specialized software that could “serve” data to “clients”

anywhere in the world. For most of the 1990s, Server A had been the pole star of the root

system, feeding out information essential for the operation of popular Internet applications

like email and the World Wide Web. Server A was configured as the master of the root

system, while the others provided authoritative copies of Server A’s data.

Server A was based in Herndon, Virginia, close to Washington, D.C. Its twelve

partners in the root constellation were also designated alphabetically... B through M. The US

Government had hired a private contractor to manage Server A several years before, and was

still providing regulatory oversight. Several other servers in the root constellation were also

subsidized or directly managed by the US government, but not all. Some were privately

operated, and some were based overseas.

In the weeks leading up to the split the root had run smoothly, with no sign of

technical trouble. But the root operators had recently gotten caught up in a raging argument

about Internet policy questions. Over the past few months they had gravitated into factions,

presaging a rift.


Finally, if only for a few days, the root constellation tore apart. Server A remained

in operation, but at the other end of the continent, in Marina del Rey, California, a challenger

arose. The rival, DNSROOT, began to provide root name service under the aegis of the

Internet Assigned Numbers Authority (IANA), a long-standing point of contact for the scientists and

engineers who had developed and expanded the Internet.

Outside the community of operators and government overseers, few were aware of

the rupture. The Internet continued to function as before, with no impact on users. Behind

the scenes, however, its fate was in play – the prize in a desperate tug of war.

* * *

Just as it is possible to row a boat across an ocean, it is possible to forego domain names

and still navigate the World Wide Web. Few people have enough skill to do so, however, and

even fewer would ever want to. Locations on the Web are typically linked to underlying

Internet Protocol (IP) addresses, familiar for their “dotted quad” format, which looks like

123.003.008.005. In many cases, using an IP number to access a resource at a particular

address would provide the same result as using its assigned domain name. But maintaining

lists of IP numbers can be daunting. Relying on names tends to make life considerably more

convenient, and offers lots of extra advantages.

The function of the DNS is transparent to most Internet users, but they call on it

often. When a hyperlink is clicked in a browser, or an email is transmitted, the DNS is

effectively being asked to match a name to a number. Server A does not answer the inquiry

directly. For this, there are millions of computers operated by Internet Service Providers

(ISPs) and other enterprises, as well as by a few well-equipped home users. Those machines

keep a list of names that were recently requested and the IP number which should be

provided as an answer.

The genius of the DNS is how the list gets refreshed. Those millions of local servers

don’t need to keep a persistent memory of every name-to-number pairing, just the address

of at least one server in the root zone. In turn, each root server contains a list of Top Level

Domain (TLD) servers. At the beginning of 1998, when the root was split, that list was

only just over a thousand lines long. There were about two hundred and forty TLDs in the


Figure 1 Partial representation of the legacy DNS hierarchy, circa 2000, showing relations between root servers

A through M (collectively known as the dot “.”), approximately 250 Top Level Domains (of which 4 are shown),

and the arrangement of third, fourth and lower level domains.

list at the time, including .com, .net, .org, .edu, .us, and .de. Nearly every country had its own

distinct two-letter TLD, but the generic three-letter TLDs were better known, and .com was

by far the most popular of all, with over five million registered names.

Machines in the root zone provide the addresses for the machines of the TLD zones,

which in turn serve out the names called by people at their computers. If an ISP or an

enterprise’s DNS server fields a request for a domain name like rootsofpower.com and

doesn’t find that name in its current memory cache, it will probably have some memory of

where an authoritative dot-com server is located, and send the request there. Presuming the

dot-com server delivers an answer, the ISP or enterprise DNS server will pass the number

back to the original caller and also save a record of the name-to-number pairing in its own

cache. The ISP’s DNS server would only need to send a request to a member of the root

server constellation if it had just been rebooted, or had received a request for a name with an

unfamiliar suffix.
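The lookup chain just described (local cache first, then the TLD server, with the root consulted only when a suffix is unfamiliar) can be sketched in a few lines of Python. This is a toy model, not the DNS wire protocol; the zone contents and the example address are invented for illustration:

```python
# Toy model of the DNS lookup chain; all zone data here is invented.
ROOT_ZONE = {"com": "a.gtld-servers.example"}          # TLD suffix -> TLD server
TLD_ZONES = {
    "a.gtld-servers.example": {"rootsofpower.com": "192.0.2.17"},
}

class CachingResolver:
    """Stands in for an ISP's or enterprise's DNS server."""

    def __init__(self):
        self.cache = {}        # recently answered name -> IP pairings
        self.root_queries = 0  # count trips to the root constellation

    def resolve(self, name):
        if name in self.cache:             # cached: no root or TLD query needed
            return self.cache[name]
        suffix = name.rsplit(".", 1)[-1]   # e.g. "com"
        self.root_queries += 1             # ask the root where .com lives
        tld_server = ROOT_ZONE[suffix]
        ip = TLD_ZONES[tld_server][name]   # ask the .com server for the name
        self.cache[name] = ip              # remember the pairing for next time
        return ip

resolver = CachingResolver()
resolver.resolve("rootsofpower.com")   # first call walks root, then .com
resolver.resolve("rootsofpower.com")   # second call is served from cache
print(resolver.root_queries)           # -> 1
```

A real resolver also honors time-to-live values and falls back across the thirteen root servers; both are omitted here.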


Figure 2 Root Server Hierarchy of the late 1990s and early 2000s. From a slide distributed by Anthony

Rutkowski.

The arrangement is innately hierarchical, with Server A at the top of the chain. Just

as each of the thirteen root servers is supposed to provide an identical list of all the TLD

zones, every TLD zone could be supported by its own array of (hopefully) identical servers.

Responsibility for adding, deleting, and modifying the most frequently used name and

address match-ups was delegated out to the operators of the TLD zones. Obviously, accurate

replication of the names in a zone was a critical administrative issue.

This was the crux of the debate that preceded the split in the root. Who should be in

charge of adding, deleting and modifying names in the list of TLDs?


41. See Ellen Rony, “Re: What ICANN doesn't want you to know - a well hidden web site,” March 13, 1999, IFWP, http://www.mail-archive.com/[email protected]/msg03219.html. Rony reported similar wording, “We're the company that everyone loves to hate,” based on his speech at ISPCON on August 22, 1997. See http://lists.arin.net/pipermail/naipr/1997-August.txt.

* * *

On January 28, a Saturday, at the command of IANA’s director, Jon Postel, seven

of the twelve subordinate root servers were reconfigured to treat DNSROOT as their new

master. Two of them – based at IANA’s offices in Marina del Rey – were redirected first.

Soon thereafter, five others – in Tokyo, London, Stockholm, Woodside, California, and

College Park, Maryland – also began to copy the list of TLDs from DNSROOT.

A zone’s master is more properly called a primary name server; the slaves are

secondary name servers. The sole purpose of a secondary server was to replicate all the data

in the primary, thereby providing redundancy and distributing the load. Just as .com, .net, and

.org and each of the country code zones provided one or more slaves to support the master,

so did the root. But the root was the top of the chain for the entire Internet. Until then, all the

slaves in the root zone had routinely “pulled” their updates of the brief but absolutely

essential root zone file from Server A in Herndon, Virginia. That site was operated by a US

Government subcontractor named Network Solutions, Incorporated (NSI). The four slaves

which remained loyal to NSI were all based in the United States – one also at NSI, and the

others at government-operated facilities in California and Maryland.
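In operational terms, such a redirection amounts to a small configuration change on each secondary. A BIND-style fragment makes the point (a hypothetical sketch in roughly the BIND 8 syntax of that era; the addresses are illustrative):

```
// Hypothetical named.conf fragment for one of the root secondaries
// (addresses illustrative).
zone "." {
    type slave;                 // secondary: replicate the zone, don't originate it
    file "db.root";             // local copy of the pulled root zone file
    masters { 198.41.0.4; };    // pull from Server A in Herndon
    // masters { 192.0.2.1; };  // or, after a redirection, from DNSROOT
};
```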

As NSI’s Executive Vice President Don Telage once acknowledged, “NSI is the

corporation everyone loves to hate.”[41] NSI was awash in profits by virtue of its US-granted

monopoly control over .com, .net, and .org. The company charged $50 a year for each name it

registered, keeping $35 for itself and sending the rest to the National Science Foundation.

There was little real competition in the name registration business. Veterans and newcomers

alike were extremely sensitive to problems with NSI’s performance and inequities in its

policies. Its monopoly was sorely resented throughout the Internet community, as was its

sweetheart deal with the US Government.

Perhaps Postel didn’t hate NSI, but his decision to split the root was certainly no

demonstration of friendship. Uncle Sam didn’t take kindly to it either.


* * *

Postel was the only person on earth who could issue such an edict and expect his

order to be followed. The servers now within his orbit were given an edited copy of the file

he had obtained from Server A, but there was no substantive difference between the content

of the two roots. Both NSI’s and Postel’s roots pointed to the same lists of TLDs. Given the

background leading up to the split, however, there were good reasons to suspect the two

systems might soon diverge.

Postel, a central figure in the creation of the Internet, was in close and friendly contact

with a global business consortium named the Council of Registrars – CORE. Its members

were planning to introduce seven new Top Level Domain suffixes to the market – .arts,

.firm, .info, .nom, .rec, .shop, and .web. About one hundred registrars around the world were

preparing to sell names into CORE’s new zones, and the launch was widely anticipated. But

the front end of the business, including the retail system through which buyers would make

their purchases, was not ready. Since nearly everyone expected there would be a huge burst

of demand on opening day, CORE’s members were trying to devise a fair way of dealing

with it. Their plan relied on an as yet untried intercontinental round-robin distribution system

designed to ensure that no single registrar could acquire a built-in advantage when submitting

their buyers’ orders for names. It was essentially a global lottery.

As so often happens with software projects of great ambition, the builders had fallen

behind schedule. Though no domain names had yet been sold, lots of money had nevertheless

changed hands. Several of the registrars were already accepting unguaranteed

preregistrations, promising to move bidders ahead in the round robin queue, a somewhat

sleazy practice that exploited the feverish worldwide hunt for “good” names. Given the

pervasiveness of the Internet’s land rush mentality, it turned out that many people were

willing to pay for a chance to move ahead in a line for lottery tickets, even when the date of

the drawing wasn’t known.

In any case, the back end of CORE’s planned system was much farther along,

virtually complete. The primary nameserver for the new TLDs was sitting at an Internet

colocation facility in San Francisco. Secondaries were based in Barcelona, Dusseldorf, and


Melbourne. Another was slated for somewhere in Japan. These were the machines that would

ultimately serve CORE’s domain names out over the Internet. Most large zones were

supported by eight or so nameservers, but even with just four, CORE promised sufficient

redundancy. According to the rules that Postel himself had helped establish years before,

only two were required. CORE’s nameservers had been undergoing testing for weeks, and

were ready to go live. Postel was well aware of it.

From the perspective of CORE’s operators, the ideal way to proceed depended on

NSI’s operators taking steps that would add the names and addresses of CORE’s new private

zones to the root database file contained in Server A. Once amended, that file would

automatically be distributed to the other root servers. In short order, CORE’s zones would

have been available to the entire Internet. NSI’s managers, however, had no interest in

cooperating.

Nearly two years earlier, a small group of private entrepreneurs had confronted the

same problem. Their strategy had been to bypass the existing root constellation altogether

– to “route around” it, as the expression goes. Their approach was to try creating an alternate

root constellation from the ground up. All the existing TLDs in NSI’s root were to be

included, plus a set of new zones. From the consumer’s perspective, it would be as simple

as getting a phone book with some new area codes, or subscribing to a cable

service that offered extra channels. The challenge was to assemble an array of machines that

could do the job, and then to persuade a few ISPs and savvy end users to use the new

constellation instead. Success rested on the hope that the market would tilt in their direction.

It didn’t, despite their best plans. In fact, the whole thing backfired. As the entrepreneurial

bid came to the fore and finally flopped, its proponents earned a reputation as rogues.

Postel was well aware of that history. But Postel was no fly-by-night entrepreneur.

He had established a long and distinguished tenure as a key player in the legacy system. He

had helped direct pivotal administrative and technical changes on the Internet since its very

beginnings, and was still clearly in a position to do so. Despite the spectacular growth of the

Internet over the previous few years, many critical resources remained subject to his

influence. These included most of the root’s slave servers. It was within his power to


42. See, for example, comments by Paul Vixie at Harvard University’s conference, “Internet Names, Numbers, and Beyond,” November 20, 1995. http://ksgww.harvard.edu/iip/GIIconf/vixie.html.

manipulate the configuration of the root server system – or at least a major chunk of it – and

thus dare to displace Server A as its pole star, anointing DNSROOT as its replacement. This

would have made the physical task of adding CORE’s new zones a simple prospect. Postel

could order changes to DNSROOT’s zone file, adding any suffixes he deemed appropriate,

along with the IP addresses of their corresponding nameservers. And, for reasons to be

explained, he was fully in league with CORE’s operators and overseers.

Suppose Postel had in fact updated DNSROOT to include the new TLDs. Any ISP

in the world that happened to query one of the root servers loyal to him would have been able

to resolve names within the new CORE zones. But the solution was far from perfect. The

names in the new zones would be invisible to any ISPs – and thus to any end users – whose

queries went to servers that stayed loyal to Server A in Herndon.

The problem was that the ISPs would not necessarily know which root system they

were using at any given moment. The BIND software then used by nearly every ISP in the

world was configured to query the root servers from A to M, and to stick with the first one

which gave a fast answer. BIND “knew” how to adjust and pick another server if the

response was delayed by network congestion or an outage. Moreover, this adaptive selection

feature worked at each level of the domain name system. Accordingly, operators of both the

root zone and various top level zones could boost responsiveness by distributing their

secondary nameservers at strategic distances. But the design of the DNS was premised on

maintaining identical copies of the data in the respective root and zone servers. The split in

the root raised the worrisome prospect that the answers delivered by BIND could begin to

shift unpredictably between an IANA root which included CORE’s zones and an NSI root

which lacked them.
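That worrisome prospect can be made concrete with a toy simulation in Python. The zone contents are invented, and real BIND preferred the fastest-responding server by measured round-trip time rather than choosing at random, but the effect on a querier is the same: the answer depends on which root happens to reply.

```python
import random

# Two diverging root zones, as feared after the split (contents invented).
NSI_ROOT = {"com", "net", "org"}
IANA_ROOT = NSI_ROOT | {"arts", "firm", "info", "nom", "rec", "shop", "web"}

def suffix_resolves(tld, rng):
    # Stand-in for BIND's server selection: whichever root "answers
    # fastest" wins, which from the outside looks like an arbitrary pick.
    root = rng.choice([NSI_ROOT, IANA_ROOT])
    return tld in root

rng = random.Random(0)
# .com exists in both roots, so it always resolves...
assert all(suffix_resolves("com", rng) for _ in range(100))
# ...but a CORE suffix resolves only when the IANA root is consulted.
outcomes = {suffix_resolves("web", rng) for _ in range(100)}
print(outcomes)   # the same suffix succeeds or fails depending on which root answers
```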

Introduction of the CORE zones through Postel’s machine would have been

technically sound only if DNSROOT ultimately supplanted Server A altogether. This had

long been a popular idea among key members of the engineering community.[42] In a world

where the Internet remained split in two, however, service for CORE’s suffixes would have


been spotty, and the Internet as a whole would have become less reliable. Cracks in the

constellation would have become another memorable affliction of the computer age. For

better or worse, things never got that far.

b. Authority

Postel was highly respected for his judgment about how to manage the root, but he

had never done anything like this. During nearly three decades of involvement with advanced

computer networks and the Internet “community,” he had always been careful to promote

harmony and to respect consensus. Until that moment he was widely regarded as a quiet if

stubborn team player... politically astute, a bastion of prudence and good sense. He was

someone who had rightfully exercised power within the vast collaborative enterprise called

the Internet and who had been trusted to exercise that power in a sound way. Yet here he

was, acting seemingly on his own initiative, making a move that would clearly generate

trouble and controversy.

Splitting the root was radical enough. The potential next step was flabbergasting.

Adding CORE’s seven new TLDs to his half of the root would turn the Internet upside down.

The threat to stability was clear. When Postel decided to play tug of war on the Internet, he

certainly must have expected that his adversaries might try to tug back. He couldn’t have

known how hard.

Postel’s authority to oversee the contents of the root derived from longstanding

institutional relationships, informal conventions, quasi-formal documentation, webs of self-

referential charters, and an unusual personal charisma. It was the kind of charisma that

worked well with computer nerds, much of the computer-trade press, and numerous

government officials. It was not a glibly gregarious, high-spirited charm. He epitomized the

computing world’s image of a slightly unkempt, Buddha-paunched, long-bearded guru...

Jerry Garcia with a pocket protector. At first glance, there was little about his appearance

or his physical mannerisms to indicate his stature or why he was treated with such reverence.

It became clear as soon as he spoke. He was a model of relentless precision. His words were


43. Phone interview with Dave Holtzman conducted May 1, 2000.

delivered quietly and succinctly, occasionally tempered by a bone-dry, way-inside-the-

ballpark sense of humor.

As the Internet’s übergeek, only Postel had enough power to tell the operators of the

world’s slave servers to salute a new master. That power derived from many sources,

including his long relationship with the US Government dating back to the late 1960s. By

1998, however, he was drawing primarily on reputation and tradition. The root operators

around the world trusted and respected him. They had always followed his requests in the

past and he could count on most of them to do as he asked, even with eyebrows raised.

Liability was problematic, however. Antitrust and restraint-of-trade issues were

looming. The legal basis of Postel’s authority to act was extraordinarily ambiguous. NSI’s

attorneys would certainly consider themselves obliged to challenge the addition of CORE’s

new zones. There was no surprise in that, but it was unclear how Postel’s legal bills might

be paid, win or lose.

There were additional fears, not altogether unfounded, that NSI would act in league

with some of the aspiring private operators who had been undercut by CORE’s plan. Despite

their inability to set the market on fire two years earlier, there were still some who were eager

to proceed under the right circumstances. Dave Holtzman, NSI’s Senior Vice President for

engineering, was in regular contact with several of them.[43] Simon Higgs claimed rights to

TLDs such as .news and .coupons. Christopher Ambler of IODesign claimed .web. NSI let

the rumors fly that it might simply add those zones in accordance with the doctrine that “the

enemy of my enemy is my friend.”

Postel’s options were also complicated by the possibility of intervention from the US

Government. At the least, he was vulnerable in that his strings could be pulled. Postel’s

IANA work was largely dependent on government funding, and had been for many years. On

the other hand, it was clear to everyone that Internet privatization was well underway. US

funding for Postel’s shop at USC had briefly dried up a year before, prompting colleagues


44. Kenneth Cukier, “Plan to Protect Net's Key Central Authority Underway,” Communications Week International, June 2, 1997, 1.

45. Conversation related by Magaziner to Gordon Cook and Steve Wolff, recounted by Cook in phone interview conducted November 20, 2002. See also Cook’s article on the Magaziner interview, “Internet Governance Wars,” Cook Report, 6.9, December 1997.

in Europe and Asia to step forward with offers of stopgap support.[44] US Government funding

resumed shortly thereafter, but it was clear that federal largesse would not be permanent. If

the spigot simply cut off now, in response to the root redirection, Postel would at last have

to confront the inevitable. And if finding new sources of support would indeed be the next

course of business, his options looked good. The Internet boom was still building, and IANA

had a world of alternatives to pursue.

More ominously, however, members of the Clinton Administration were at a critical

juncture in their own investigation of the DNS controversy. They had voiced an intention to

take control of the root expansion policy, and had tried to enlist Postel in working out a

compromise between NSI and its adversaries. The administration officials did not want to

assume responsibility for funding IANA in perpetuity, but they also did not want to see

IANA beholden to sources they deemed inimical to US interests. Furthermore, only one

month earlier, Ira Magaziner, the administration’s overseer of the issue, had gotten wind of

the plan to add CORE’s zones to the root, and he had personally warned Postel not to do it.[45]

It was anyone’s guess what Magaziner or others at the White House could or would do if

Postel proceeded on that course.

c. Agency

The story of the root redirection can be interesting, even fascinating, if only for the

portraits of the people involved. But a retelling can also provide deeper insights into the

construction of social capacities such as status, authority, and responsibility. It can help

explain how people become agents of something much larger than themselves.

Until the DNS controversy got underway, Postel was a man who possessed virtually

undisputed high standing in his community, and yet was nearly invisible beyond it. The


historical circumstances – the rise of the Internet – would have conferred celebrity on anyone in the

same position. He had power not only because he had the capacity to make and execute hard

and fast decisions about technical matters, but because power is also a function of

preferences, concepts, and moral standing. Leadership is not just a matter of steering a ship

toward a pole star, but pointing out the pole star as a reliable source of guidance. Such

leadership creates a social memory of what, how, and why to follow. That memory may long

outlive the leader.

For nearly three decades the work of coordinating and regulating the Internet had

been a central part of Postel’s livelihood. People rarely thought hard about the distinction

between those two activities – coordination and regulation – just as they rarely explored the

nuances of Internet management that set governance apart from government. Postel’s action

forced the issue.

The distinctions reflected categorical differences in social behavior. Coordination

conveyed the sense of various parties working out harmonious arrangements on their

collective behalf according to shared principles. Regulation implied that one superior party

would be able to impose an arrangement on the others in accord with some law or

legitimating authority. These were separate functions, reflecting two very different classes

of power... normative influence and official right. Postel appeared to wield them seamlessly

over the years, as if the difference could be ignored.

Postel had banked reservoirs of trust through years of exemplary service to the

research and engineering communities. The US Government’s Department of Defense paid

him to perform those services, but allowed him a very free hand. In fact, he had been given

so much discretion in the performance of his responsibilities that many had come to believe

his capacity to act was entirely independent of US control.

The onset of the Internet boom made Postel’s significance ever more apparent.

Simultaneously, it became clear that funding from the Defense Department would soon come

to an end. As a consequence, many insiders – Postel included, though reluctantly at first –

began talking about institutionalization. How could his official powers be spelled out and

distilled into bureaucratic posts that a professional could fill? What kind of expertise was


needed? And, regardless of technical competency, who could ever acquire a level of trust and

respect equivalent to Postel’s? Those discussions were subject to their own inexorable logic.

To whom was Postel ultimately accountable? Who were his constituents? Who were the

stakeholders whose interests were to be served? Who would pay for those services to

be performed?

Dealing with the question of how the Internet would be ruled – if at all – raised

discussions over what was meant by words like governance, coordination, and regulation.

The struggle to add new TLDs to the root added content and urgency to the matter, turning

what could have been a rather dull process of bureaucratization into a high stakes

confrontation.

When large scale public debates about the reform of the domain name system began

in late 1995, shortly after registration fees were instituted by NSI, the notion of coordination

sounded more agreeable, like a friendly proposal among colleagues. Yet the collegiality

implied by that term left an open question: “What happens when a group can’t settle an issue

among themselves?” And the most obvious alternative to coordination – regulation – implied

an uncomfortably harsh sense of political closure: “Whose rules should have final say?”

Regulation was a sensitive issue for Europeans, most of whom did not want to admit

openly that the Internet would continue to be ruled – directly or indirectly – by American

politicians. Tacit acceptance of ongoing American dominance of the policy making process

was embarrassing enough. A move toward formal regulation would enshrine that dominance,

underscoring the subordinate position of European policymakers.

The trend toward formalizing US government influence was also troubling to self-

proclaimed Netizens. This was a diverse group that included sophisticated computer

specialists with a philosophical bent, as well as professional journalists and armchair pundits

who thought they knew a thing or two about computerization. Veterans and newcomers had

been drawn together by their enthusiastic celebration of the quasi-legal status of the Internet

and the liberating sense of anarchy they believed it afforded them. Until the likelihood of US

intervention became clear, they thought they had found a way to avoid being ruled by any

nation-state at all.


Ira Magaziner recounted an extensive series of late night phone calls between himself, Postel,46

attorneys at USC, and Erskine Bowles, White House Chief of Staff. “The President was at a reception at the

White House, and I notified the Chief of Staff, who was ready to notify the President if we needed to do

something.” The pressure could not be issued as a direct order since the US government’s immediate authority

over Postel and the root was unsettled. Phone interview with Ira Magaziner, August 5, 2000.

47. See Sandra Gittlen, “Taking the wrong root?” Network World, February 4, 1998, http://www.networkworld.com/news/0204postel.html.

The players groped for a solution, sometimes advancing dialectically. The concept

of “self-regulation” was often bandied about, as if it were possible to constitute a new form of

authority that would resolve the difference between soft coordinated and hard regulated

conceptions of power. As the talk went on, however, the DNS crisis began to boil over.

Despite the looming crisis, authority was in place and working. Formal or not, Postel

already possessed the resources of both guide and gatekeeper. These two distinct but

complementary aspects of power were both available to him, to be deployed at his discretion.

He had been quietly accumulating those powers for decades. His redirection of the root

proved that his own powers were formidable, but that they were not arranged coherently

enough to sustain him on the path he had taken. That single act could have marked the

culmination of his career, putting important elements of the Internet’s administrative

structure on course toward a free-standing official status. In the end however, facing threats

made in the name of the US Government, he backed down.46

* * *

Forces had been massing for a confrontation, but Postel’s move was a shock

nonetheless. His critics called it a hijacking.47 Redirecting the root was an

uncharacteristically bold move. Even if there was a strategic logic to it – and a very risky

logic at that – this crossing of the Rubicon and the hasty retreat back drew exceptional

attention to the General heading the march. Now, in addition to his judgment being

questioned, the very basis of his power was open to scrutiny. What were its sources? What

were its limits? His decision would ensure lasting celebrity, but it wasn’t clear whether he

was building a legacy of fame or infamy. Postel had made his own authority an issue. Who

was this man who could post an email and push the Internet to the brink?


56

Figure 3 Creators of the ARPANET

Figure 4 Jon Postel, Steve Crocker and Vint Cerf

Figure 5 Jon Postel in 1997


48. The story of the BBN party is recounted by Hafner and Lyon (1996: 257-65).

49. The apt characterization of Cerf as “ubiquitous” comes from Craig McTaggart (1999), Governance of the Internet’s Infrastructure: Network Policy for the Global Public Network.

50. See, for example, notes of Robert Kahn at http://www.python.org/workshops/1995-12/minutes.s1.html.


3. INTERNETTING

a. ARPANET

In September 1994, twenty-five years after installing the machine that spawned the

Internet, the veterans of the ARPANET project – the Internet’s precursor – regrouped for an

anniversary celebration in Boston.48 Using a huge mural of the world as a backdrop, a

photographer set up to take a shot of the most distinguished engineers present. There was

evidently some jostling for position among the nineteen who made it in, but no question

about who would sit front and center in this prestigious class of middle-aged white males.

That spot went to the ubiquitous Vint Cerf, one of the Internet’s most inventive and celebrated personalities.49 He had directed the ARPANET project from 1976 through 1982 and

was a truly seminal figure in the long series of technical and political developments that led

to the modern Internet.50 His productive career brought him countless honors plus the benefits of financial success. He would become a recurring figure in the DNS War, and

even more prominent in its aftermath.

Cerf was flanked on his right in this picture by Bob Taylor, former director of the

Information Processing Techniques Office at the Advanced Research Projects Agency

(ARPA), under the U.S. Department of Defense. It was Taylor who had first envisioned the

ARPANET as such, and put the project in motion in 1966. No slacker either, Taylor went on

to create Xerox Corporation’s Palo Alto Research Center (Xerox PARC), the laboratory

where computing innovations from Ethernet to the graphical user interface underlying the

Apple Macintosh and Microsoft Windows were nurtured and inspired.

On Cerf’s left was Frank Heart, head of the team that submitted and won the bid for

the ARPANET proposal. Heart’s contribution was undeniably important, but his chances to get

a seat in the front row weren’t hurt by the fact that his employer – Bolt Beranek and Newman

(BBN) – was paying for the party and the photographer.


51. Steve Crocker, “Initiating the ARPANET,” Matrix News, 10.3, March 2000, http://www.mids.org/pay/mn/1003/crocker.html.

Figure 6 Depiction of early ARPA Network. Hosts are shown as rectangles,

Interface Message Processors (IMPs) are shown as circles. Graphic is taken

from a presentation by Bob Braden.

ARPA’s initial award to BBN in 1968 was just over one million dollars. It was

arguably one of the smartest and most effective investments ever made by the U.S.

Government. Taylor had conceived the ARPANET as a project that would enable a few large

computers at universities and research institutions to interoperate with each other.51 First

two, then four, then more. Data signals were converted into an analog format so that they

could be carried across the public telephone network. The ARPANET’s proof of viability was

an encouragement to the development of numerous public and private networks over the

following years.


Many other networks sprouted and flourished in the ensuing years, including

MILNET, CSNET, NSFNET, Bitnet, FIDONET, and eventually Compuserve, Prodigy, and

America Online. But the suite of protocols that emerged from the ARPANET community made

it possible to link those various networks together, joining thousands and eventually millions

of nodes into a network of interconnected networks – an internet. Older systems either

merged in, or expired from lack of use. Even the ARPANET was shut down in 1990. High

speed digital lines interconnected the various networks that survived and prospered. The

network of networks came to be treated as a distinct entity – The Internet.

Presently, most people are familiar with the fact that Internet traffic is carried by

private firms called Internet Service Providers – ISPs. When the ARPANET cornerstone was

put in place, no reference was ever made to anything like an ASP – an ARPANET Service

Provider. But for all practical purposes BBN filled that role. The equipment which

constituted the network’s first node had been delivered and installed at UCLA by a BBN

team in September 1969. The company grew as innovation after innovation fostered

explosive use of networking technology and spawned the global Internet.

BBN’s generosity as host for the anniversary party was not driven entirely by

sentiment for days gone by. By the early 1990s the commercial environment for service

providers had become extremely competitive. Facing pressure in the marketplace, the

company was looking for new ways to advertise itself. It was in the process of launching its

own commercial ISP, called BBN Planet. (That venture was later absorbed by GTE,

which was, in turn, swallowed by Verizon.)

From an advertiser’s perspective, a celebration that brought together some of the most

famous names in modern computing under BBN’s roof looked like a good idea. Reporters

were invited, of course. The company had also commissioned Katie Hafner and Matthew

Lyon, a husband and wife team, to write a book about ARPANET history that would showcase

BBN’s important contributions. This became the critically-acclaimed best seller, Where

Wizards Stay Up Late: The Origins of the Internet (1996). One of its plates included the

group portrait of the ARPANET’s pioneers, featuring Cerf, Taylor, and Heart on the front line,

reproduced here on page 56.


52. See his July 1961 Ph.D. thesis proposal, “Information Flow in Large Communication Nets,” at http://www.lk.cs.ucla.edu/LK/Bib/REPORT/PhD/proposal.html, and the 1964 book, Communication Nets: Stochastic Message Flow and Delay.

53. Email by David P. Reed, “The Internet and nuclear attack,” reprinted by Dave Farber, “IP: Must read (especially the press) The Internet and nuclear attack,” IP, December 28, 2001.

* * *

The second row of the group portrait included Larry Roberts, whom Taylor had hired

in late 1966 to execute the project on behalf of the Department of Defense. Roberts became

the government’s man in the trenches. Highly regarded for his prodigious talent and

discipline, Roberts was the true leader of the project. He had drawn up the ARPANET’s

specifications, supervising activity around the country as work came to fruition. Roberts left

government work in the mid 1970s, going on to create the first private data

telecommunications carrier, Telenet (also absorbed by GTE), and later serving as President

and CEO of DHL. In the picture, Roberts sat next to his longtime friend, Len Kleinrock. The

two had worked together in the early 1960s on advanced aerospace defense projects at MIT’s

Lincoln Laboratory. Kleinrock wrote a seminal paper on packet switching in 1961 and

published the first book on the subject in 1964. After completing his Ph.D. at MIT 1963,52

he joined the faculty at UCLA, doing research in mathematical queuing theory; a subject of

great practical utility to anyone who wanted to measure the performance of a computer

network.

Roberts awarded the contract that put Kleinrock and the students working under him

– including Cerf – in charge of configuring the ARPANET’s first Interface Message Processor

(IMP). Supplied by BBN, the refrigerator-sized machine was hooked up to a hulking Sigma

7 computer at UCLA. It was also connected, via a phone line leased by BBN, to another

IMP/computer combination at the Stanford Research Institute in Menlo Park. As these

IMP/computer pairings mushroomed to include dozens of nodes, the ARPANET became an

increasingly useful network. (The idea that the system was designed to survive a nuclear

attack is a persistent urban legend, but the IMPs were indeed built into “blast hardened”

cabinets to help demonstrate the concept of a “survivable network.”)53


Also in the third row of the picture was Robert Kahn, who had been a key player at

BBN in the first years of the ARPANET contract. Kahn left the company to work for the US

Government in 1972, and set out to develop a way of connecting computers as peers, rather

than as components in a hierarchy. This was a formidable task, given the diversity of

machines and operating systems to be interconnected. Kahn knew he needed help solving the

peering problem. Cerf had finished his Ph.D. by then and was teaching at Stanford

University. Kahn picked him as his collaborator, and Cerf never returned to academia.

Over the course of 1973 and 1974, Cerf and Kahn devoted themselves to developing

a technology they called the Transmission Control Protocol (TCP). TCP was a huge leap

forward that vastly improved the ability to exchange messages between different types of

computer systems. The protocol afforded a software substrate upon which all the various nets

of the world could ultimately interconnect. The development of TCP made the Internet

possible, and thus raised up its developers as legends.

* * *

There were several more rows of ARPANET veterans in the photo. Smiling broadly,

way back in the next to the last row, were two engineers who might have won a ranking closer to the front if they had elbowed for it. But their temperaments were famously

unobtrusive. One, Steve Crocker, had been Cerf’s best friend at Van Nuys High School in

Los Angeles. Next to him was Jon Postel, who had graduated from the same school a few

years later. In addition to wearing a wide happy grin, Postel sported the longest beard of

anyone in the picture.

Cerf, Crocker, and Postel were literally present at the creation of internetworking.

When the first messages were sent between computers at the Stanford Research Institute and

UCLA, they were all working as graduate students under Kleinrock. The first host-to-host

message was sent by another student, Charley Kline, at 10:30 PM on October 29, 1969.

There was no profound message in the transmission. In fact, the initial attempt caused a


54. George Johnson, “From Two Small Nodes, a Mighty Web Has Grown,” New York Times, October 12, 1999, D1.

55. For a photo of the entry, see “The Day the Infant Internet Uttered its First Words,” http://www.lk.cs.ucla.edu/LK/Inet/1stmesg.html.

crash.54 Characteristically, it was Postel who had set up and maintained the IMP log book in which that portentous first sign-on was recorded for history.55

Cerf, Crocker and Postel had been together in other commemorative pictures over the

years. One turned up in a special issue of Time Magazine, celebrating the legacies of the

1960s. They didn’t make the cover. That was a scene from the Woodstock music festival.

But inside, there they were, sitting around a table, mischievously talking into tin cans

connected by sausage links.

The actual topology of the ARPANET was considerably more sophisticated than a few

string and cup telephones. BBN made the connections possible, installing four IMP nodes

by December 1969 (the third and fourth were at UC Santa Barbara and the University of Utah

in Salt Lake City), and hooking them up via dedicated phone lines. In those early days Cerf,

Crocker, Postel, and many other graduate students around the country were working to make

those links practical by designing mechanisms that would allow the IMPs to communicate

with their local “host” computers.

With Kleinrock’s support, Crocker had taken the lead in coordinating this dispersed

group of students and volunteers. His goals were fairly simple: to articulate the basic

standards that would allow different kinds of computers to talk with the IMPs, and to

describe new features of the system as they emerged. The meetings of this group, which began in 1969,

were consecrated as the Network Working Group (NWG). Years later, in 1979, vestiges of

Crocker’s NWG were reunited as the Internet Configuration Control Board (ICCB).

The ICCB was reconstituted in 1983 as the Internet Activities Board (IAB). The IAB, in turn,

spawned the venerable Internet Engineering Task Force (IETF), which remains the primary

venue for development of the Internet protocol suite.

While still a graduate student, Crocker launched a documentation process for the

NWG that eventually became the formal basis for modern Internet standards. That process


56. Several different series of documents, including Internet Engineering Notes, were used by that same community. See “Internet Archaeology: Documents from Early History,” http://www.rfc-editor.org/history.html.

endures to this day. He titled the series “Request For Comments,” a humble name that

understates its impact. Over 3500 RFCs have been published since Crocker put out the first

one in April 1969.56

Postel, up until his death in 1998, authored (or co-authored) more RFCs than any

other individual. In 1978 he and Cerf (along with Danny Cohen) presented a major

improvement to the Transmission Control Protocol that Cerf and Kahn had introduced four

years earlier. The revision, known as the Transmission Control Protocol/Internet Protocol

(TCP/IP), became the elegant platform from which the Internet blossomed beyond all

expectation. Postel helped manage the conversion of the ARPANET and other attached

networks from the predecessor Network Control Protocol (NCP) to TCP/IP between 1981

and 1983. He also authored the specifications for the Simple Mail Transfer Protocol (SMTP)

in 1982. It is the basis for relaying email across the Internet’s diverse networks.
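The envelope exchange Postel specified can be illustrated with a small sketch. The command verbs below are the real SMTP vocabulary from his 1982 specification; the session checker and the hostnames and addresses are invented for illustration, not an actual mail relay.

```python
# Toy validator for the basic SMTP envelope sequence. The command verbs
# are the real protocol; the checker itself is an illustrative sketch.

EXPECTED_ORDER = ["HELO", "MAIL FROM", "RCPT TO", "DATA", "QUIT"]

def valid_session(commands: list[str]) -> bool:
    """Return True if the commands follow the basic envelope order."""
    if len(commands) != len(EXPECTED_ORDER):
        return False
    return all(cmd.upper().startswith(verb)
               for cmd, verb in zip(commands, EXPECTED_ORDER))

# Hypothetical session between two made-up hosts.
session = [
    "HELO relay.example",
    "MAIL FROM:<jon@isi.example>",
    "RCPT TO:<vint@darpa.example>",
    "DATA",
    "QUIT",
]
```

`valid_session(session)` holds for the dialogue above and fails if the envelope commands arrive out of order, which is the whole point of a relaying protocol: every intermediate host can speak the same fixed sequence.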

These are all huge achievements, but Postel is most famous for something else. When

he was a student at UCLA, one of his dissertation advisors, Dave Farber, volunteered him

to begin recording the various numbers, addresses, and technical parameters that were being

assigned and reserved as the ARPANET phenomenon grew. Alex McKenzie at BBN had

been updating host table numbers and noting the changes via the RFC series, but there were

many other types of assignments that needed to be tracked. Until then, the records were

dispersed, and were at risk of becoming haphazard. One repository was a set of index cards

that Bob Kahn carried in his shirt pocket. Postel took on the task of consolidating the

information and ensuring it was kept in order, up to date, and publicly available. It turned out

that he was very good at it.

Postel became the reference point for people who needed information on how those

parameters were being used, or who wanted new parameters allocated for specific uses they

had in mind. He made sure that numbers which needed to be well-known were indeed well-

known. His real passion all along, he told me, was high speed, high performance computing.

For most of his career, the tedious but important drudge work of recording and publishing


57. Diane Krieger, “Heavenly Father of the Net: An Interview with Jon Postel,” The Networker, 7.5, Summer 1997, http://www.usc.edu/isd/publications/networker/96-97/Summer_97/innerview-postel.html.

data assignments initially occupied only a moderate portion of his responsibilities.

Nevertheless, those responsibilities kept accumulating. He had helped Crocker edit RFCs

since the beginning, and took charge in late 1971 when Crocker left UCLA and went to work

at ARPA. Postel ended up publishing not only technical standards in the RFC series, but

records of best current practices, informational statements, and even some April Fool’s Day

pranks. He became an institution.

When Hafner and Lyon’s book, Where Wizards Stay Up Late, was published in 1996, it described Postel as an “unsung hero.” By remaining in a government-funded

think-tank environment rather than moving to the private sector, he had passed up the chance

to become as wealthy as his colleagues. Many had done quite well, making their fortunes at

the leading edge of the Internet boom. Some had gone off to create their own companies,

while others climbed high on the corporate ladder. The young wolves of the IMP era were

now, for the most part, passing through a prosperous middle age together. They were also

becoming celebrities. Cerf and Kahn were receiving numerous public awards, including

perhaps more honorary Ph.D.s between them than any other two humans. They awkwardly

shared the informal but respectful title, “Co-fathers of the Internet.” When Roberts also

claimed to be the Internet’s father (perhaps rightfully), Cerf gracefully declared himself

“midwife.” Kleinrock finessed the paternity question by touting himself as “Inventor.”

Postel, however, was the subject of honorifics that were meant for him alone. He was

best known as the Name and Numbers Czar. He was frequently called the God of the

Internet, and sometimes even Supreme Being.57 As the Internet became an operational

system, Postel became the eminence grise among the community’s greybeards. He was no

unsung hero to people in the know. And that was about to become a much larger group.

* * *

In mid 1997 Postel sat alone for a full page picture that ran inside the October issue

of Internet World. Aged beyond his years, he posed dourly behind a desk piled high with paperwork, his beard grown long enough to be worthy of an unrepentant ’60s radical.


But now the beard was very, very gray. The magazine’s cover was all about him. It showed

a picture of a gold bar, stamped www.$$$$.com. The headline trumpeted “...GOLD RUSH....

EXCLUSIVE INTERVIEW With the Man in the Center of the Domain-Name Battle, Jon

Postel.” (Shown on page 56.)

The wizard behind the curtain of the Internet had ultimately become the most famous

wizard of them all – not just a star, but the man of the hour. Postel was finally in the spotlight

at center stage, fully eclipsing the old hands of the ARPANET, even Cerf. Now the world had

found out who he was, raining down attention and demands ever more relentlessly with each

turn of the Domain Name crisis. The smile was gone.

b. Czar

To understand how Postel earned his stature in the Internet community it helps to

understand his diverse contributions to the creation of “internetting.” Most significant was that, as things first got underway, it was Postel who meticulously tracked and published the

assignment of the ARPANET’s socket numbers. These numbers were loosely analogous to the

local extensions that might be served by a single phone number. But sockets were ordered

the same way on each host computer. It was as if every company with an internal phone

system always used the same distinct set of extensions for every office, from the CEO to the

mail-room. Everyone relied on Postel for the up-to-date directory.

Constructed out of memory segments, sockets were virtual openings into a

computer’s processes. Since they were only simplex (one-way) connections, they often had

to be used in pairs. Odd numbers were typically used on the server side, and even numbers

were used by the caller. Different socket numbers had to be reserved for specific server-side

conversations, applications such as TELNET, FTP, FINGER, Date and Time, Short Text

Messages, and so on.

To play out the analogy, it was as if someone were obliged to dial out from a specific

extension on his or her own computer to ask the time, and the computer that was called

would send the answer from another specifically designated extension. Later on, the


58. Jon Postel, “RFC 349: Proposed Standard Socket Numbers,” May 30, 1972, http://ftp.isi.edu/in-notes/rfc349.txt.

invention of TCP opened the way for full duplex, two way connections across a single

channel, called a port.
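The pairing convention described above can be sketched in a few lines of Python. This is an illustration only: the service names and socket numbers below are placeholders, not the historical ARPANET assignments.

```python
# Sketch of the early ARPANET socket convention: simplex sockets used in
# pairs, odd numbers on the server side, even numbers on the caller side.
# The registry values here are illustrative, not historical assignments.

WELL_KNOWN = {   # hypothetical registry, in the spirit of RFC 349
    "telnet": 1,
    "ftp": 3,
    "time": 5,
}

def caller_socket(server_socket: int) -> int:
    """Return the even caller-side socket paired with an odd server socket."""
    if server_socket % 2 == 0:
        raise ValueError("server-side sockets were odd by convention")
    return server_socket + 1

def connect(service: str) -> tuple[int, int]:
    """Look up a service and return its (server, caller) simplex socket pair."""
    server = WELL_KNOWN[service]
    return server, caller_socket(server)
```

Under these made-up assignments, `connect("telnet")` returns the pair `(1, 2)`: an odd server-side socket and its even caller-side mate, mirroring the two simplex channels a conversation required before TCP’s duplex ports.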

In RFC 349, published in May 1972, Postel moved to formalize his role as both guide

and gatekeeper for numeric assignments on the ARPANET:

I propose that there be a czar (me?) who hands out official socket numbers for use by standard protocols. This czar should also keep track of and publish a list of those socket numbers where host specific services can be obtained.58

Postel was acknowledged, ultimately, as the system’s “numbers czar,” a moniker

which was, in those days at least, a term of endearment. As time passed and the Internet

evolved out of the ARPANET, he remained in charge of the allocation of all “unique parameter

values” used by the Internet engineering community, including IP numbers, port addresses,

and the top of the domain name hierarchy. His control of that last resource made him the

power behind the root. His knowledge of how all the different protocols and assignments

were interwoven and interdependent made him one of the few people on earth who could

simultaneously envision both the big picture and the critical minutiae of how the Internet’s

protocols worked. Cerf, Kahn, and others set the course, but Postel’s hand, perhaps more

than any other, steadied the Internet’s symbolic rudder as it evolved from the ARPANET in the

1960s to an engine of global commerce at the turn of the millennium.

Years later, he told a Congressional committee how his role in the early ARPANET

experiments evolved into a job of such pivotal significance.

Communication of data between computers required the creation of certain rules (“protocols”) to interpret and to format the data. These protocols had multiple fields. Certain conventions were developed which would define the meaning of a particular symbol used in a particular field within a protocol.

Collectively the set of conventions are the “protocol parameters.” In a project like the ARPANET with the developers spread across the country, it was necessary to have coordination in assigning meaning to these protocol


59. See the prepared statement of Jon Postel before the U.S. House of Representatives, Committee on Science, Subcommittee on Basic Research, “Internet Domain Names, Part 1,” September 25, 1997. For hearing transcripts, see http://commdocs.house.gov/committees/science/hsy268140.000/hsy268140_0f.htm.

parameters and keeping track of what they meant. I took on the task of doing that.59

In other words, Postel’s job was to keep people from stepping on each others’ toes.

Internet growth depended upon technical interoperability across potentially enormous arrays

of hardware and software. The rapidly expanding community required an explicit system for

setting and referencing new codes and protocols. Fortunately, most of the numbers and

symbolic values the engineers needed could be denoted in series, and allocated one after the

other. To avoid chaos and conflict, the programming community needed a reliable point of

contact where such numbers could be distributed and immediately recorded as taken. To

sustain such a high pace of invention, that contact had to be responsive and accessible.

Semantics had to be nailed down and published. The Internet needed a memory of what had

already been said and a system for describing what had been meant by it. Only with such

discipline could the curse of Babel be avoided.

Simply assigning a meaning makes one a guide, but as the designated enforcer of that

discipline on behalf of a community – whether he was said to be coordinating, regulating or

controlling the assignments – Postel held an office equivalent to lord high gatekeeper.

Anyone intending to identify a new value knew that its official realization depended on

Postel making the proper pronouncements.

The only penalty for refusing to play the game with the czar’s numbers was the

inability to play with the people already using the czar’s numbers. This was the paradoxical

ambiguity of Postel’s power. Rules constrain, and rules enable. “Coordination in assigning

meaning” was necessary if people intended to play nicely together. Within the game, the

practical day to day act of coordinating meaning was essentially equivalent to regulation;

Postel generally exercised the final word. But playing the game at all was considered a

voluntary act. Outside of the game there was no formal effort by the players to compel

compliance. They counted on markets to do that for them. “The phrase ‘let the market


60. Dave Crocker, “Re: Stopping independent publications (Re: Comments on IESG charter and guidelines drafts),” POISED, March 4, 2003.

61. Scott Bradner, “RFC 2119: Key words for use in RFCs to Indicate Requirement Levels,” March 1997.

62. Harald Alvestrand, “RE: Impending publication: draft-iab-considerations-02.txt,” IETF, September 8, 2002.

63. Joe Touch prepared a curriculum vitae shortly after Postel’s death. See http://www.postel.org/postel-cv-1997.txt.

decide’ used to be the watchword in the IETF,” wrote one well-known insider. “Our job is

to do engineering, not make market decisions.”60

One’s relationship with the engineering community depended on proof of fealty. In

later years, RFC authors started using words like “MUST” and “SHOULD” to express what

it took to conform with an IETF standard. MUST indicated an “absolute requirement.”61 It

was to be obeyed instantly and without question, not unlike a soldier hearing an order from

a superior officer. The approach took firm hold within the IETF culture, though such

directives had far lower standing beyond it. According to Harald Alvestrand, IETF chair in

the early 2000s, “a MUST in an RFC has no enforcement mechanism whatsoever.”

The most that can be said is something akin to “if you do the opposite of what these words say, and claim to be conformant to this RFC, we will laugh at you.”62
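The normative vocabulary that RFC 2119 standardized is mechanical enough to check with a script. The sketch below is a hypothetical helper, not an IETF tool; it counts the capitalized requirement keywords in a draft, matching longer phrases first so that a “MUST NOT” is not also tallied as a “MUST.”

```python
import re

# RFC 2119 requirement keywords. The capitalized forms carry the
# normative meaning; lowercase uses are ordinary English. Longer
# phrases are listed (and removed) first to avoid double-counting.
RFC2119_KEYWORDS = [
    "MUST NOT", "MUST", "SHALL NOT", "SHALL", "SHOULD NOT",
    "SHOULD", "MAY", "REQUIRED", "RECOMMENDED", "OPTIONAL",
]

def keyword_counts(text: str) -> dict[str, int]:
    """Count RFC 2119 keywords in a draft, longest phrases first."""
    counts = {}
    remaining = text
    for kw in RFC2119_KEYWORDS:
        pattern = r"\b" + kw.replace(" ", r"\s+") + r"\b"
        counts[kw] = len(re.findall(pattern, remaining))
        remaining = re.sub(pattern, "", remaining)
    return counts
```

A scan like this finds the vocabulary but, as Alvestrand’s remark makes plain, nothing in the protocol machinery enforces it; the keywords bind only those who claim conformance.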

Within the parameters of the Internet standards game, there was no effective

difference between the meaning of coordination and regulation. For people who loved the

game of using and inventing Internet standards, there was no need to worry about the

distinction. The players at the table needed a dealer, an enabler who would guarantee a fair

turn. Postel made a career of performing that function. His standing was bound to increase

as the Internet became the most important game in town.

c. IANA

Postel moved around a bit after completing his Ph.D. at UCLA in 1974, working for

various defense contractors on ARPANET-related projects.63 The longest stint was at SRI,


64. Hafner and Lyon (1996: 232-3).

65. Postel’s curriculum vitae, http://www.postel.org/postel-cv-1997.txt.

where he worked with Doug Engelbart (standing in the third row of BBN’s commemorative

picture) at the Augmentation Research Center (ARC). If the Internet had a spiritual

birthplace, it was there. The ARC had been founded by another ARPA scientist and BBN

engineer, the late J.C.R. Licklider. “Lick,” as he was known, was perhaps the most far-thinking of them all, writing about intergalactic networks and human-computer symbiosis

when transistors were barely out of the laboratory. Licklider and Engelbart shared a vision

of the computer as a device that would extend human communication and augment the

human intellect. ARC is where the mouse and the concept of windows were invented. It was

also the place where numerous researchers, Postel included, learned to think aggressively

about how the use of computers could revolutionize human civilization.

Since ARPA’s mission was to focus on leading-edge experimental projects, the

ARPANET had to be turned over to a new home once it was up and running. Taylor tried to

convince AT&T that taking it would be a worthwhile commercial investment, but he was

unsuccessful. Instead, responsibility was transferred to the Defense Communications Agency

(DCA), a military operations group which was developing its own IMP network with BBN’s

help.64 Postel was still working at SRI at that time. With the shift, work on the ARPANET

system took on the sensibilities of a routine mission. The new, more practical orientation

undercut the earlier trail-blazing feel.

By this time ARPA had been renamed as the Defense Advanced Research Projects

Agency (DARPA). The addition of the word “defense” didn’t undermine the agency’s

commitment to cutting-edge, avant-garde research, nor its ability to tolerate an intellectually

open environment that had room for relatively nontraditional and nonconformist

personalities. In 1977 Postel began working at a DARPA-funded “think-tank” – the

University of Southern California’s Information Sciences Institute (ISI). He eventually

became ISI’s Associate Director for Networking. ISI was far removed from USC’s campus65

and academic life, leaving Postel largely free of the teaching requirements that can preoccupy

Page 85: Launching the DNS War: Dot-Com Privatization and the Rise of Global Internet Governance

66. Personal interview with Vint Cerf, November 15, 2001.

67. Rony and Rony (1998: 122-3).

68. Cerf interview, November 15, 2001.

69. “Contract Between ICANN and the United States Government for Performance of the IANA Function, US Dept. of Commerce Order Number 40SBNT067020,” http://www.icann.org/general/iana-contract-09feb00.htm. See also Brian Carpenter, Fred Baker, Mike Roberts, “RFC 2860: Memorandum of Understanding Concerning the Technical Work of the Internet Assigned Numbers Authority,” June 2000.

faculty members. People who needed number or parameter assignments knew to contact him

there. Over time the contact point for those resources came to be known as the IANA – the

Internet Assigned Numbers Authority.

DARPA contracts tended to be rather generalized, conglomerating the work that

ultimately constituted the RFC Editor, IANA and other tasks. That precedent was set by Cerf

during the years he was in charge at DARPA. “In all the time that I was writing and managing

the ISI R&D activity,” Cerf recalled, “I don’t believe we ever used the term IANA. But what

we did was to write very broad language about the areas in which we expected research to

be done.” 66

The convention was followed by Cerf’s successors. The term IANA did not even

appear in DARPA budget documentation until 1993, and then only buried within the contract

for the Teranode High Performance Computing Project.

IANA’s emergence as a named institution occurred gradually. Because of this, and

because of its close identification with Postel as an individual, its rise has been the subject

of extended controversy. For insiders, IANA was Jon. For outsiders, it was an official agency

with real standing.67 Even if the confusion wasn’t intentional, little was done to correct it.

“Jon’s position as IANA was always a little ambiguous,” Cerf admitted, “and we always kind

of left it that way.”68 IANA was finally made the subject of a contract between the Department

of Commerce and ICANN in February 2000.69

IANA’s enduring authority stemmed from a web of self-referential and similarly

quasi-official relationships that had operated within the Internet engineering community

since the earliest ARPANET days. Even though US Government funding sustained Postel’s


activities at ISI for many years, his long-run potency as a community leader attested to a

broader social foundation.

Members of the Internet engineering community established standards of merit by

conferring recognizable distinctions on each other. Any anthropologist or sociologist would

recognize the process. In this case, the engineers were building up multifaceted stocks of

culture – a legacy of titles, stories of achievement, citations in RFCs, the binding ties that

result from participation in contracts and agreements, and so on. There was even a self-

referential style of etiquette called “netiquette.” These social inventories served to bootstrap

status relations as the community expanded. Over time, certain patterns of respect took hold

and deepened. Knowledge of those patterns, such as who had the highest status, was essential

for anyone seeking membership in the community. The ability to identify high status insiders

– and, even better, get close to them – was especially useful for those who desired to climb

the ranks.

There is an expression, “Let’s run it up the flagpole and see who salutes.” That was

how authority worked on the Internet. People had, with good reason, gotten into the habit of

saluting flags raised by Cerf, Postel and any others who could regularly demonstrate

expertise in specific domains and convey that expertise with strong communication skills.

That was a pragmatic basis for getting real work done. Newcomers were generally quite

willing to adopt those deferential habits and even promote them. Anyone who made a valid

goal-oriented demonstration of expertise and competency could be elevated by grants of

doctrinal respect. Those displays took on a life of their own. To get along, one had to adopt

the prevailing manner. There was little patience for puffery and lack of acuity. The heralded

openness of the Internet Engineering community implicitly contained an invitation to

snobbery.

Bob Braden, a long-time colleague of Postel’s at ISI, fondly recounted an incident

that makes the case. Braden had challenged the style of endnote references that RFC authors

were required to follow. Postel’s response, recalled Braden, “was equivalent to ‘Get used to

70. Bob Braden, “Re: rfc-ed reference style,” IETF, March 21, 2003.

71. Jon Postel, “RFC 204: Sockets in use,” August 1971.

it!’”70 Braden did, and for years thereafter, urged others to do the same. He had decided that

habits that worked well enough over a long time deserved a standing beyond a fashion of

manners – they were imbued with legitimacy.

This entrenchment of particular styles, manners, and habits of obedience is a

mechanism by which particular societies come to recognize themselves as such. Not all

people are fully aware of how their own individual deeds underlie and reproduce social

conceptions of legitimacy. Background conceptions, after all, are often mistakenly taken as

pre-existing and historically absolute. Nevertheless, Braden’s recollection reflects the way

in which many people do acquire some hazy self-consciousness of their social agency. They

are enlightened enough to affirm their actions as constitutive, insisting that certain behaviors

be upheld as virtuous by their very constitutiveness.

As in any society, the dogma of reputation took on a life of its own. The IANA was

as much a cultural phenomenon as it was a formalistic vestige of the Internet’s routinization.

* * *

In fact, the IANA function performed at ISI was a steady reification of the work

Postel had been doing since he started organizing the entries in the IMP logbook. He was

always around, policing records and keeping things tidy. An early sign of what was to come

institutionally, even before he volunteered to be Czar, is found in RFC 204, “Sockets in use”

which he published in August 1971.

I would like to collect information on the use of socket numbers for "standard" service programs. For example Loggers (telnet servers) Listen on socket 1. What sockets at your host are Listened to by what programs?

Recently Dick Watson suggested assigning socket 5 for use by a mail-box protocol (RFC 196). Does any one object? Are there any suggestions for a method of assigning sockets to standard programs? Should a subset of the socket numbers be reserved for use by future standard protocols?71

72. She described their 15½-year working relationship this way:

My fondest "story" about how the world looked at Postel & Reynolds as IANA and RFC Editor came from one of our Internet friends/colleagues. This person sent an email message to Jon and I one day stating, "Please don't take this as an insult, but you two work so seamlessly together I can't tell who is the IANA and who is the RFC Editor? So, who does what? Which one of you administers IANA? Who works on the RFCs?" Jon and I were sitting side by side as usual, reading this email together. Jon turned and looked at me with a big grin on his face, turned back to the keyboard and started typing a reply. It was one word, "Yes." To this day, I took his response as a wonderful compliment of how he felt about our work together.

See http://www.isoc.org/postel/condolences.shtml.

73. Internet Architecture Board, “RFC 1083: IAB Official Protocol Standards,” Internet Engineering Task Force, December 1, 1988.

Postel asked that comments be sent to him via “The SPADE Group” at UCLA’s

Boelter Hall. The name hinted at the groundbreaking work the engineering students there

believed they had undertaken.

The socket table was updated periodically through the seventies. Other lists such as

“Link Numbers” were added along the way. The title changed over time, but stabilized as

“Assigned Numbers” in RFC 739, which Postel published in November 1977.

Each RFC in the Assigned Numbers series included a politely phrased directive, “If

you are developing a protocol or application that will require the use of a link, socket, etc.

please contact Jon to receive a number assignment.” That state of affairs persisted until RFC

870 was published in October 1983, the same year TCP/IP became the official protocol of

the ARPANET. The RFC Editor and parameter assignment tasks had grown demanding

enough to justify assistance from another ISI staff member, Joyce Reynolds, who had been

working there since 1979. Starting with RFC 870, Reynolds was named as the primary

author and contact. Postel remained on as coauthor and continued to drive the task of

organizing parameter lists and making assignments. Though Postel was the better known

personality in the assigned numbers business, the importance of Reynolds’ contribution and

the depth of their collaboration was not well understood outside the immediate community.72

Five years later, with the publication of RFC 1083 in December 1988, the “Internet

Assigned Numbers Authority” was finally referenced in print.73 The term also appears in a

lower-numbered RFC, 1060, but that RFC shows a publication date out of sequence, in

March 1990. In any case, by the end of the decade the use of the term IANA was coming in

74. Phone interview with Joyce Reynolds.

75. Mueller (2002: 93). A slide from Rutkowski, “History of Supporting Names and Numbers,” includes the text “ISI denominates Postel DARPA role as IANA (1988).” http://wia.org/pub/identifiers/identifier_management.gif. See also Rutkowski’s “US DOD [Internet] Assigned Numbers [Authority], Network Information Centers (NICs), Contractors, and Activities: known detailed history,” http://www.wia.org/pub/iana.html.

to vogue at ISI and across the Internet. The “assignment of meaning” had acquired an

organizational home.

Reynolds couldn’t recall precisely when she first heard the term IANA used as an

acronym. “One day Jon just said, ‘Instead of telling people, ‘Call Jon, or call Joyce for a

number,’ they’ll be told to call the IANA.’”74 Perhaps the renegotiation of the DARPA

contract in 1988 prompted a decision to adopt a more distinctly institutional look for this

aspect of their work at ISI.75 In any case, the acronym turned out to be an apt if inadvertent

play on the suffix “-iana,” which means “little bits of things,” as in Floridiana or Americana. If there

is no pivotal moment of IANA’s inception to point out, it is because the long-running

continuity of the assignment activity was so much more important to the members of the

burgeoning engineering community than commemorating a date of denomination.

d. TCP/IP

Postel’s dual role as guide and gatekeeper for the community was confirmed with the

development and implementation of TCP/IP. He was not only an outspoken and influential

advocate of its design principles; he also managed the initial allocation of shares in the IP address

space, and he exercised authority over allocations of the remaining space for more than a

decade. A key factor in his rise to power was TCP/IP’s great success within commercial and

academic enterprises, first in the United States, and then overseas. Its widespread adoption

reduced the relative influence and authority of the U.S. Department of Defense within the

TCP/IP-using community.

As TCP/IP became the networking protocol of choice, the needs of a diversifying user

base went far beyond the limits of the provisioning charters that the US Government’s

program managers were legally bound to follow. Postel had the latitude and the skill to step

76. Dave Crocker, “The Standards Process That Wasn’t,” in Lynch and Rose (1993).

77. Jon Postel, “Comments on Internet Protocol and TCP,” Internet Engineering Notes, August 15, 1977. http://www.isi.edu/in-notes/ien/ien2.txt.

78. Hafner and Lyon describe the seminal sketch as the result of a hallway conversation (1996: 236).

up, demonstrating both leadership in the statement of the Internet’s principles, and

responsibility over the distribution of its resources. He was certainly not alone in this. Cerf

and others played undeniably important parts in laying out the design of the system and

building it. But the growing prominence of the TCP/IP suite funneled a remarkable mix of rule-

making powers into Postel’s hands.

Though early development of what would become TCP/IP was funded by DARPA,

the work was initially considered to be distinct from the ARPANET effort.76 In 1977, shortly

after Postel arrived at ISI, an RFC-like publication dedicated to TCP-related communications

was started up. The series, called Internet Engineering Notes, often overlapped with the

RFC series and finally merged into it several years later, when the ARPANET’s NCP gave way

to TCP/IP.

Postel published IEN-2 in August 1977, starting with a sharp admonition: “We are

screwing up in our design of internet protocols by violating the principle of layering.”77 He

went on to sketch out what he called a “cleaner,” simpler field layout for a proposed “internet

protocol message format.” The purpose of the layering principle, he reminded his readers,

was to distinguish the “hop to hop” packaging and routing aspect of the protocol from the

“end to end control of the conversation.” TCP was then at version 2. Layering was one of the

metaphors that would carry the community into the future, and he didn’t want anyone to

forget it.

Only months before, Cerf, Postel, and Danny Cohen had sketched out a new version

of the protocol, presenting an architectural vision that would open the door to the “end-to-end”

principle.78 Much of the credit belongs to Cohen, an ISI employee who was working on

packet-based transmission suitable for realtime voice and video applications. Cohen believed

that problems such as error handling and flow control did not need to be managed inside the


Figure 7: Danny Cohen’s proposal for TCP/IP, circa 1977. [ASCII diagram garbled in extraction. Left panel, “The original TCP”: Reliability and Internet Handling are combined in a single layer above the Network, with an “unreliability injection” step bolted on for unreliable communication. Right panel: “The new TCP” provides Reliability for reliable communication only, while “The new IP” below it performs Internet Handling for all traffic, letting unreliable communication bypass the reliability layer.]

network, and that it would be a mistake to lock in the extra costs and excessive delays of

having to provide for reliable delivery. His team had already figured out ways of dealing

with packet losses and out-of-order delivery. As Cohen saw it, incurring the extra overhead

would actually inject unreliability factors into realtime applications. He had spent

months trying to persuade Cerf that allowance for a loss-tolerant approach, which later came

to be called “best effort,” could work for everyone. The trick, illustrated in Figure 7, was to

separate out different features of the protocol so that various applications and network

79. This is recounted in a personal email from Danny Cohen to Vint Cerf dated November 28, 1998, which Cerf forwarded to me on November 29, 1998.

functions could precisely locate the pieces of information they needed from within the bits

of a packet, ignoring the rest.79

Layering became a key metaphor of the new design. One layer of a transmitted packet

would contain the elements that were of interest only to the sender and the receiver. The

other layer would hold the elements needed by the machines responsible for routing packets

across networks.

Some engineers later portrayed the conception as an hourglass, with IP functioning

as a bearer service in the middle, between media and application components of the network.

Once the distinction was understood and reinforced as a rule, a clean design of the fields

within the packet would always reflect that separation.
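The hop-to-hop versus end-to-end split can be sketched as a toy data structure. This is purely illustrative; the class and field names are mine, not the actual 1977 IEN-2 field layout:

```python
# Toy illustration of the layering principle: fields read by every hop
# (routing) are kept apart from fields read only by the endpoints.
# Names are illustrative assumptions, not the historical packet format.
from dataclasses import dataclass

@dataclass
class InternetLayer:           # "hop to hop": what routers need
    source: str
    destination: str

@dataclass
class TransportLayer:          # "end to end": what only the endpoints need
    sequence_number: int
    payload: bytes

@dataclass
class Packet:
    internet: InternetLayer    # routers inspect only this part...
    transport: TransportLayer  # ...and ignore this one entirely

pkt = Packet(InternetLayer("10.0.0.1", "10.0.0.2"),
             TransportLayer(1, b"hello"))
# A router can forward using pkt.internet alone, never touching pkt.transport.
print(pkt.internet.destination)   # 10.0.0.2
```

The design choice the sketch captures is exactly the one Postel insisted on: a router can do its job by reading one layer and ignoring the other, so reliability logic never has to live inside the network.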

By wrapping technical solutions in the language of overarching principles, it was

easier to communicate these design assumptions to other computer scientists. Postel’s

announcement, “We are screwing up,” used a coarse vernacular as an intensifier to assert

what was considered good in opposition to what was considered bad. His words underscored

a normative stance that rose above technical comment. Postel was quiet, but not shy. The

utility of the layering principle was to delegate responsibility for accomplishing distinct

technical tasks in the movement of packets. This was something Postel cared about so much

that he wanted all the collaborating engineers to care about it also.

In 1977, however, adoption of the layering principle could not be mandated or issued

as a command. It was still more than a decade before words such as MUST and SHOULD

would be established as directives within the context of the RFC series, or before the

ubiquity of the Internet protocol suite in markets and infrastructures would give the RFC

imprimatur meaningful gravity.

Postel’s assertions could only mean that anyone who cared to build ties within the

Internet community via the shared ideal of “good” would have to prove loyalty to particular

principles, especially layering and “end to end.” Over time, the advocacy and restatement of

these principles took on a religious flavor. Caring about the success of the Internet went far

80. Examples in which the phrase is given prominence include Christian Huitema (2000), Routing in the Internet; Brian Carpenter, “RFC 1958: Architectural Principles of the Internet,” June 1996; and “All about ISOC - ISOC: Mission Principles,” http://www.isoc.org/isoc/mission/principles/.

81. Abbate (2000: 140-2). See also Bernard Aboba, “How the Internet Came to Be, Part 2,” http://madhaus.utcs.utoronto.ca/local/internaut/internet2.html.

beyond concern for adherence to the format of datagrams. As principles like “Connectivity

is its own reward” came into vogue, the engineering community’s culture would become

unmoored from its roots in the US military’s bureaucracy.80

* * *

Though the initial research on TCP/IP continued to reflect an orientation toward

military interests, development of the Internet was very much unlike that of the ARPANET.

Most importantly, the Internet’s test bed was international from the start. One of the first

three nodes using TCP/IP was based at University College in London. Much of the early

work was linked with the Atlantic Packet Satellite Network (SATNET), a project that

involved several European NATO allies, predominantly Norway and the UK. By 1981,

TCP/IP had matured sufficiently that the ARPANET program managers decided to abandon

the legacy NCP completely. January 1, 1983 was set as the cutover target.

The ARPANET had already evolved into a widely-used operational network.

Consequently, the cutover to TCP/IP was a complicated, protracted, and painful endeavor.

Meeting the deadline became a major preoccupation of 1982. The cutover was ordered by

the military, which still controlled the core of the network, so everyone on the civilian side

had to follow. Much of the logistical effort was coordinated by Dan Lynch, who was then

running the computer center at ISI. The “mad rush” at the end kept dozens of system

managers busy on New Year’s Eve, an experience memorialized by a shirt-button slogan, “I

Survived the TCP Transition.” 81

At that time the US military did not depend on the network for classified

communications. Shortly after the military’s program managers mandated the switch to

TCP/IP, however, they decided to enhance the security of their nodes by isolating them on

a distinct infrastructure. This would require partitioning the network into two sides. The

82. This was approximately $1,000,000 annually in the mid-1980s. Interview with Don Mitchell, February 10, 2003.

83. See “RFC 755: Assigned Numbers” (also cataloged as “IEN 93”), May 3, 1979.

military’s operational portion – designated MILNET – came into use in early 1983, even

before the aftershocks of the TCP/IP cutover had subsided. The legacy research platform was

now called “ARPA-Internet.” ARPA was a military agency, of course, but its network had

a distinctly open character that did not fit with the needs of the armed services. The ARPA-

Internet was set further apart after a civilian agency, the National Science Foundation (NSF),

began to subsidize its connectivity expenses.82

It was in the interest of the Defense Department to shed the cost of maintaining the

non-military components of the research network. But it also had an interest in promoting

the network’s continuing growth. Over the long run, the success of a civilian Internet

promised to advance the development of relatively inexpensive hardware and software. The

ability to purchase equipment cheaply through commercial channels would further drive

down costs. This strategy, referred to as “commercial off the shelf” or COTS, paid off quite

handsomely. In fact, things turned out so well, that the military leadership had every reason

to feel delighted. One can imagine a uniformed General standing in the checkout lane of a

computer superstore, pushing a cart filled to the brim with routers and high speed digital

cables, smiling at the bargains his careful plans had wrought.

* * *

Since the IP portion of TCP/IP used a 32-bit address space, the entire Internet could

theoretically include about 4.3 billion (2 to the 32nd power) unique connections. The practical number of

connections was considerably smaller, however, because of the limited capacities of

administrative systems and routing technology. When testing was well underway in the late

1970s, the IP address space followed a regime that allowed for only 256 network

connections; about two dozen were actually assigned by 1979.83

The allocation policy received a major overhaul in 1981, and the IP address space was

segmented into three new subdivisions. Half the space was put aside for just 128 Class A

networks, each large enough to host 16,777,216 devices. The next twenty-five percent was


allocated to 16,383 Class B networks, each with room for 65,536 hosts. A twelve-and-a-half percent

portion was reserved for 2,097,152 Class C networks, each allowing for only 256 hosts. The

wildly disparate sizes of the classes in the new allocation regime reflected the technical

simplicity of the approach that was taken, keying on the first three bits of the 32-bit IP

address. All addresses starting with binary 0 went into Class A. All prefixed with binary

10 went into Class B, and all leading with binary 110 went into Class C.

Table 1. Reclassification of the IP address space, September 1981 (RFC 790).

Class   Blocks/Hosts        Leading bits of the 32-bit address
A       128/16,777,216      |0| 7-bit network    | 24-bit local address |
B       16,383/65,536       |1 0| 14-bit network  | 16-bit local address |
C       2,097,152/256       |1 1 0| 21-bit network | 8-bit local address |

Instead of spelling out the addresses in the form of 32 binary digits, Postel used a

more readable syntax called the “dotted quad.” The IP string was divided into four 8 bit

segments, and each segment was converted into a decimal equivalent. Since 2 to the 8th

power is 256, and since computer technologists conventionally start counting at 0, the highest

number in each quad could be 255. Consequently, a valid address in the Class A space could

take the form 119.022.133.201.
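By way of illustration (a modern sketch, not period tooling; the function names are mine), the dotted-quad conversion and the class prefixes described above can be expressed in a few lines of Python:

```python
def to_dotted_quad(addr):
    """Render a 32-bit IP address in the dotted-quad notation Postel used."""
    if not 0 <= addr < 2**32:
        raise ValueError("address must fit in 32 bits")
    # Split the 32-bit value into four 8-bit segments, each 0-255.
    return ".".join(str((addr >> shift) & 0xFF) for shift in (24, 16, 8, 0))

def classify(addr):
    """Return the RFC 790 class implied by the leading bits of the address."""
    if addr >> 31 == 0b0:      # leading bit 0   -> Class A
        return "A"
    if addr >> 30 == 0b10:     # leading bits 10  -> Class B
        return "B"
    if addr >> 29 == 0b110:    # leading bits 110 -> Class C
        return "C"
    return "other"             # later RFCs carved Classes D and E out of this space

example = (119 << 24) | (22 << 16) | (133 << 8) | 201
print(to_dotted_quad(example))  # 119.22.133.201
print(classify(example))        # A (first bit is 0, since 119 < 128)
```

Since 119 is below 128, its leading bit is 0, which is why the example address in the text falls in the Class A space.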

The new policy was promulgated in the “Assigned Numbers” list for September 1981

– RFC 790. Forty-one of those Class A blocks were already allocated. The Class B and C

addresses were listed as reserved, with Postel as the designated contact. According to RFC

900, published in June 1984, Postel had recovered nearly two dozen of the Class A

addresses. They were now listed as “unassigned.” Meanwhile, over 50 Class B connections

and hundreds of Class C connections had been allocated, amounting to 902 networks within

the combined ARPA-Internet and MILNET (which Postel had begun calling the DDN-

Internet). But there were also a growing number of allocations for “Independent Uses” being

tracked and registered, many of which were for civilian entities such as universities, though

there were evidently commercial participants as well. These raised the total to 2464.

84. Mary Stahl, S. Romano, “RFC 1020: Network Numbers,” November 1987.

By March 1987 the list of network numbers and associated references was forty two

pages long. The job of maintaining it had become routine and increasingly time consuming.

The main problem, however, was that ISI’s funding depended on DARPA, an agency whose

advanced research mission was specifically intended for inherently non-routine, leading-edge,

experimental projects.

In November Postel agreed to move responsibility for the registry operations to the

Defense Data Network’s Network Information Center (DDN-NIC), an SRI-based project

underwritten by the Defense Communications Agency (DCA).84 The DCA – the official

sponsor of the MILNET – had been the primary funding source of the DDN-NIC since taking

over responsibility for ARPANET management operations in 1975.

The switch of the numbering operations to the DDN-NIC further exposed just how

many connections were being used by hosts not subject to federal oversight. Since the

military was not authorized to fund civilian activities of this sort, its managers had to find

a way to terminate that activity altogether or at least spin off the funding responsibility so

that it could be allowed to run under the same roof. A model for that was already in place.

The National Science Foundation had been subsidizing non-military domain name

registrations at the DDN-NIC for several years already, beginning shortly after the MILNET

separated from the ARPA-Internet.

There was apparently at least one incident in which a DDN-NIC contract manager

tried to impose funding discipline to the detriment of extraterritorial connections. The

occasion for this arose after a renegotiation of the DDN-NIC contract, shortly after the DCA

was renamed the Defense Information Systems Agency (DISA). SRI was ordered to remove

the entries which provided connectivity to various European and Asian-based hosts. The

shutdown was brief but effective. The NSF, which had been chipping in some funds ever

85. “We continued to give a little bit of money to SRI so that the non-military registration stuff wouldn’t fall through the cracks.” Phone interview with Don Mitchell, February 10, 2003. I qualified the story with the word “apparently” because I was not able to find any corroboration of a European cutoff, though one supposes it would have been a rather memorable experience for those involved.

86. Robert Kahn, “The Role of Government in the Evolution of the Internet,” in National Academy of Sciences (1994), Revolution in the U.S. Information Infrastructure, http://www.nap.edu/readingroom/books/newpath/chap2.html.

since ARPANET had been split into the MILNET and the ARPA-Internet, stepped in with yet

more funds to subsidize the ongoing non-military activity.85

* * *

As the Internet grew, and as the interests of its military and civilian portions diverged,

increasing attention was given to coordination issues, not only for the sake of providing

steady funds, but also to clarify the standards creation process. In July 1984 Postel and

Reynolds issued a “policy statement” as RFC 902 (ARPA-Internet Protocol Policy) defining

authority relationships in the two “worlds” – DARPA and DDN – that constituted the

research community and the operational military network.

According to the RFC, “the DARPA World” was “headed up by the DARPA office,”

but relied on the Internet Configuration Control Board (ICCB) to “help manage” the Internet

program. Kahn had created the ICCB in 1979 when he was still running the ARPANET

program. Worried that too much of the decision-making process in the TCP/IP development

work depended on Cerf alone, he wanted to foster the growth of institutional memory by

getting other specialists involved at a high level.86 The ICCB started out as Cerf’s “kitchen

cabinet” and included close colleagues like Postel, Bob Braden, and Dave Clark. It persisted

after Cerf left DARPA for private industry in 1984. Now Clark was the ICCB chair and

Postel was assistant chairman. RFC 902, by the way, granted Clark and Postel titles which

were considerably more illustrious: “Internet Architect” and “Deputy Internet Architect.”

DARPA oversaw the research world; the military had its own. “The DDN World”

relied on the DDN’s Project Management Office and various technical groups, most notably

the Protocol Standard Steering Group (PSSG), which took the lead for that side. A few

individuals sat on both the ICCB and the DDN committees to ensure cooperation between

87. Jon Postel, Joyce Reynolds, “RFC 902: ARPA-Internet Protocol Policy,” July 1984.

the two groups. For example, important protocols which were published as RFCs for the

DARPA community also had to be published within the Military Standard series used by the

DDN.

RFC 902 also presented a remarkably brief scenario describing the “typical chain of

events” people followed when developing protocol standards. Discussions among “random

people” would lead to “someone” writing a draft which would be passed around to

“interested” people, and ultimately to the RFC Editor. The RFC Editor would pass the draft

around to yet “other people who might also be interested in the problem.” Members of that

“selected informal group” might revise the draft “until the process stabilizes,” at which time

“the protocol draft is sent out as an RFC, identified as a draft proposal of a protocol that may

become an official protocol.” That was it. RFC 902 concluded rather abruptly with a section

titled, “The Bottom Line.”

For the ARPA-Internet and the DARPA research community, DARPA is in charge. DARPA delegates the authority for protocol standards to the ICCB. The ICCB delegates the actual administration of the protocol standards to the Deputy Internet Architect.87

Clarifying the chains of authority was a timely move. It would prove quite useful to

the research community as the Internet’s next phase of growth got underway, accelerating

the system’s reach beyond its military origins. Designating Postel as the ICCB’s Deputy

Architect established his authority in ways that made his powers seem considerably more

concrete in the science-oriented DARPA world, yet independent from direct military

oversight.

e. Commercialization and Privatization, Round 1

The expression “Tipping Point” has become a fashionable way to describe the

moment at which a once-minor social trend becomes so popular that its future dominance is

assured. Malcolm Gladwell glamorized the expression by devoting a book to the topic. Many

academics have written about the phenomenon, using terms like strategic interdependence,


strategic interaction or collective action. The point is that ideas can spread like epidemics.

The challenge is to understand how they propagate. The conventional approach is to identify

inflection points at which members of a society rapidly adopt a new skilled practice. The

classic examples of strategic interaction – demographic shifts like white flight or

gentrification – indicate whether people of a certain ethnicity or economic status consider

it desirable or not to live in a particular neighborhood. But tipping points can also reflect

adopted practices such as clothing fashions, political preferences, and, of course, the

deployment of new technology.

The Internet had at least two tipping points, the best-known being the rise of the Web

in the mid-1990s. The shift to TCP/IP that occurred around 1985 was arguably far more

significant, however. The consequence of that tip was the enduring success of TCP/IP as a

widely available, non-proprietary network protocol, and perhaps as the foundation for the

future of global telecommunication.

An essential factor behind the move to TCP/IP was an intentional decision to

cultivate the ground for it and plant the seeds that would promote its growth. In 1983

ARPANET’s managers provided $20 million in funding to support implementation of TCP/IP

on the UNIX platform. That software was quickly ported to a widely-used version known as

the Berkeley Software Distribution – BSD UNIX. Derived from the proprietary version

created by AT&T’s Bell Labs, BSD development was sponsored by the University of

California. Much of the coding was done by Bill Joy, who went on to found SUN

Microsystems. SUN workstations later became an overwhelmingly popular platform for

Internet hosts and DNS servers, including those used in the root constellation.

Another factor in the tip to TCP/IP bears out Gladwell’s hypothesis that respected

Mavens, sociable Connectors, and persuasive Salesmen are critical to the successful transmission of information. Cerf was all three, and he found others to help him. He had been avidly preaching TCP/IP's advantages before groups of computer scientists and

engineers within the US and overseas when he met Larry Landweber, Professor of Computer

Science at the University of Wisconsin. This was 1980, just as Landweber was taking the

initial steps to build the research network eventually known as CSNET.


88. See Malamud (1993), also online at http://museum.media.org/eti/RoundOne09.html.

89. The quote is from Don Mitchell. See also SRI Policy Division, The Role of NSF's Support of Engineering in Enabling Technological Innovation, http://www.sri.com/policy/stp/techin/inter3.html.

Conceived as a parallel system for computer science departments at universities not

connected to the ARPANET, CSNET was the first major network beyond the ARPANET that

was specifically designed to run TCP/IP. There were other protocols to choose from at the

time, such as OSI, X.25, and DECNET, so Landweber’s conversion had far-reaching

implications. For example, building a gateway between the ARPANET and CSNET turned out

to be a relatively simple endeavor. Compatibility was guaranteed. The move clearly

augmented the utility of both networks.

Parenthetically, CSNET ultimately merged with another regional academic network,

the New England-based BITNET, to form a national system under the auspices of the

Corporation for Research and Educational Networking. CREN was headed by Mike Roberts

(distinct from the ARPANET’s Larry Roberts), whose career had focused on building advanced

networks in academic settings, starting at Stanford University. Mike Roberts later became

a major figure in Internet governance history, joining the inner circle around Cerf in the

1990s and ultimately serving as ICANN’s first CEO.

Landweber’s selection of TCP/IP also set the stage for one of the most decisive

moves to come. In the early 1980s, amid widespread concern that US competence in science

and engineering was falling behind Japan and Europe, there was pressure to boost US

competitiveness by giving researchers in academic institutions better access to

supercomputers. Just as plans for the National Science Foundation’s national computer

networking backbone were reaching fruition, Dennis Jennings, the NSF’s Program Director

for Networking who just happened to be a close colleague of Landweber’s, insisted on using

TCP/IP.88 That 1985 decision "broke the camel's back," because it meant thousands of institutions would be involved.89 The choice signaled manufacturers that investment in mass

production of TCP/IP-enabled devices was becoming a safe bet. Over time, the growing

supply of those devices and the rise in familiarity with the technology stimulated promotional

activity among vendors and ever more demand among buyers. When the market finally


90. Barry M. Leiner, "Globalization of the Internet," in Lynch and Rose (1993: 31-2).

91. See Mueller (2002: 99-100). See also Vint Cerf, "RFC 1160: The Internet Activities Board," May 1990.

moved overwhelmingly toward TCP/IP, the outcome may have seemed inevitable, but it was

built on key purchase decisions from those early days.

* * *

These decisions further empowered the people behind them, with direct implications

for the globalization of the Internet. Landweber had begun working in 1982 to set up Internet

gateways between networks in the US, Latin America, Europe, and Asia. The following year

he began putting on workshops that focused on spreading the use of Internet technologies

throughout the developing world. These meetings were initially known as the International

Academic Networking Workshops (IANW). In 1987 a few IANW members meeting in Princeton started a new group, the Necessary Ad Hoc Coordinating Committee (NACC), to focus on the problems of developing an infrastructure to coordinate their far-flung activities.

It was co-chaired by an American, Bill Bostwick from the US Department of Energy, and

James Hutton, Secretary General of RARE, the leading European networking association.

NACC soon transformed itself into the much better known Coordinating Committee for

Intercontinental Research Networks (CCIRN), dedicated to the goal of “global networking.”90

Amid the expansion of the civilian Internet inside and outside the United States,

managers in the Department of Defense were even more keen to transfer the responsibility

for maintaining registration functions over to other agencies or entities. The great challenge

was to do so without causing unnecessary interruption of service. The emergence of CCIRN

helped accelerate movement toward a solution. Prompted by CCIRN’s European members,

Bostwick launched a Federal Research Internet Coordinating Committee (FRICC) within the

US Government, where Division Directors from various agencies including DOE, NSF, and

DoD could coordinate long-range planning and stopgap funding.

In 1990 FRICC evolved into the more enduring Federal Network Council (FNC), a

US Government super-committee that brought together an even larger assortment of Division

Directors (adding NASA and the FCC) in a more formal setting.91 As before, its purpose was


to provide a mechanism by which a growing assortment of government bureaucracies could

coordinate their oversight and funding of internetworking research as their particular interests

waxed and waned. The FNC became a venue through which they could manage their shifting

investments, ensuring no plugs were pulled on running systems before new funding sources

were in line. DoD and NSF remained the biggest players. The FNC also met with an

advisory committee of distinguished individuals – academics and business people – who had

been enduring proponents of internetworking research.

The Defense Department’s involvement in support of internetworking research

decreased rapidly in the mid and late 1980s. Still, DARPA remained the primary sponsor of

Postel’s shop at ISI and continued to fund the various meetings at which Internet standards

were being developed. These were relatively small investments compared to what the NSF

was doing to advance the use of high speed network technology and the civilian Internet

under the aegis of the NSF’s National Research and Education Network (NREN).

NREN’s design was ambitious and far-reaching, with significant long-term

implications. From the beginning, the NSF planners led by Steve Wolff had contemplated

strategies for spinning off significant pieces of the system to the private sector. His ideas

were sensible enough, in light of NREN's goals. But the circumstances of the largest spinoff generated an unhappy controversy, foreshadowing the kinds of battles that would erupt again during the commercialization of the domain name registration industry later in the decade.

* * *

Through the mid and late 1980s the NSF invested millions of dollars to fund

university supercomputer centers and to build the digital backbone which would connect

them. Like the CSNET, the NSFNET used TCP/IP and hooked into the Internet via the

backbone provided by DARPA’s ARPANET. The first generation of the NSFNET backbone

transmitted data at 56 kilobits per second, about the maximum speed supported by the analog

modems used in home computers. The second generation of the backbone was an outgrowth

of the Michigan Educational Research Information Triad (MERIT), a regional university


MERIT, “The NSFNET Backbone Project, 1987 - 1995,” http://www.merit.edu/merit/archive/92

nsfnet/final.report/. See also, Susan R. Harris, Ph.D., and Elise Gerich, “Retiring the NSFNET Backbone

Service: Chronicling the End of an Era,” ConneXions, 10,4, April 1996, http://www.merit.edu/merit/

archive/nsfnet/retire.connexions.html.

See discussion by Wolff in Brian Kahin ed. “RFC 1192: Commercialization of the Internet,”93

November 1990.

“The NSFNet Backbone Services Acceptable Use Policy,” June 1992, http://www.merit.edu/merit/94

archive/nsfnet/acceptable.use.policy.

network that had entered into an aggressive partnership with IBM and MCI.92 By 1989 Merit's part of the backbone reached T1 speed – 1.5 megabits per second (roughly equivalent to a medium-speed DSL modem circa 2005). When the ARPANET was decommissioned in 1990,

the NSFNET backbone was fully prepared to handle the load. Numerous regional educational

and research networks had sprung up by then, and were highly dependent on the availability

of a reliable national carrier. The switch was virtually seamless.

In 1991 the NSF was funding MERIT’s next upgrade of the backbone, this time to

T3 speed – 45 megabits per second. Wolff, Director of NSF’s Networking Division during

most of that period, pumped up the number of connected sites through a remarkably cost-effective program that provided seed money to universities which wanted to tap into the

NSFNET. The Internet as a whole surpassed 5,000 sites that year.

From the beginning, Wolff had taken a farsighted approach. He was thinking about

how to decommission and move beyond the NSFNET even before the ARPANET was shut

down. His strategy included a plan to bring about privatization of the NSFNET's infrastructure as soon as possible.93 This was necessary for opening the Internet to business

activity. Use of the Internet was strictly non-commercial at that time, and would remain so

by law, as long as the NSF was directly in charge. Any regional network that connected to

the Internet via NSF’s backbone was subject to a cumbersome Acceptable Use Policy (AUP)

that virtually precluded the transmission of commercial content.94 The AUP was designed

to prevent for-profit ventures from exploiting government-funded infrastructures. The intent

of that policy was fairness, but it was also beginning to fetter Internet growth. It stifled the

users of emerging private regional networks like UUNET and PSINET, and deterred others


95. Jay P. Kesan and Rajiv C. Shah, "Fool Us Once Shame on You – Fool Us Twice Shame on Us: What We Can Learn From the Privatization of the Internet Backbone Network and the Domain Name System," February 2001, University of Illinois College of Law Working Paper No. 00-18, http://papers.ssrn.com/paper.taf?abstract_id=260834.

who might have been interested in the Internet as a long-haul carrier for straightforward

business purposes.

Stimulating private investment in the Internet, therefore, seemed to depend on freeing

the NSFNET from the AUP's restrictions. To accomplish this, Wolff allowed the

operators of the MERIT Network to take over the NSFNET backbone under the aegis of a

new company formed expressly for that purpose. This not-for-profit entity, called Advanced

Network Services (ANS), could sell services back to the government. ANS became the

NSF’s ISP.

Right away, there was controversy. ANS was also allowed to spawn a for-profit

subsidiary – ANS CO+RE – unencumbered by the AUP. The subsidiary sold bandwidth on

the same lines used by the NSFNET, a practice that a rising crop of competitors considered

highly unfair. ANS leveraged its advantage to attract commercial clients seeking access to

the NSFNET, a service no other commercial provider could offer. Critics charged that the

ANS backbone was a quasi-monopoly built on NSF subsidies, arguing that the ability to

apportion part of the backbone for private purposes unfairly tilted the playing field against

aspiring commercial ISPs who had to build entire infrastructures with private funds.

As an analogy, imagine that you had been paid to build a railroad track and a four-car

train. You built the track as agreed, but built an eight-car train instead, and rented out the

four extra cars for your own profit. Yet no one else was allowed to use the tracks. Adding

insult to injury, officials at the NSF had known far ahead of time what ANS planned to do,

but the NSF never announced its decision to permit ANS CO+RE to be run this way. Critics

denounced the arrangement as a “backroom deal.”95

Some of the aspiring ISPs formed their own consortium, calling it the Commercial

Internet Exchange (CIX). The group was led by Mitchell Kapor, the co-creator of Lotus

1-2-3, the original “killer app” of the personal computer industry. Kapor had gone on to

create the Electronic Frontier Foundation and was involved in numerous projects that sought


to engage the computer industry with social activism. He was also a founding investor of

UUNET, a key member of the CIX consortium. One of the most remarkable and innovative

aspects of the CIX arrangement was its peering policy. Rather than deal with the overhead

of negotiating, calculating and redeeming settlements for the exchange of data, participants

simply carried each other's traffic without charge.

In late 1991 CIX sought to connect to the ANS CO+RE backbone, but ANS refused,

evidently seeking to reap the harvest of its monopoly as long as it could. In due course ANS

was subjected to government investigations, Congressional hearings, and a steady onslaught

of public criticism. In June 1992, the ANS directors finally yielded to pressure, and allowed

CIX to interconnect.

* * *

ANS’s move to take advantage of its position and shut out potential competitors

naturally led to charges of insider dealing and monopolistic abuse. Those criticisms would

echo when DNS privatization got underway three years later. For the time being, however,

the spectacular success of the backbone privatization squelched nagging questions about the

preferences granted to ANS. In the end, no single private or public entity could exercise

monopolistic control over the Internet’s main data lines. This was a headline-grabbing,

universally-lauded result. Operators of the various networks that made up the Internet had

become true peers, establishing a new culture of collaboration. By 1994, even ANS had

joined the CIX peering arrangement.

The task of exercising power over the Internet had moved far beyond the level of controlling its hardware. Now there was no center to its physical nervous system, no single

point of failure, no way for one person to pull the plug. This aspect of the Internet was a

fulfillment of the ARPANET’S fundamental design imperatives – especially survivability and

resource sharing. But the trump card was scalable architecture. That feature had facilitated

astonishingly rapid growth, enabling the system to transcend its test-bed origins and spawn


96. Abbate (2000: 210).

97. See Brad Templeton, "Reaction to the DEC Spam of 1978," http://www.templetons.com/brad/spamreact.html.

98. John Klensin, Einar Stefferud, et al., "RFC 1426: SMTP Service Extension for 8bit-MIME Transport," February 1993.

a global network of networks. By April 1995, over 40 percent of the Internet’s nodes were

outside the United States.96

The undeniable triumph of TCP/IP and the end-to-end principle buttressed the

perception that both the Internet and the people who used it were characterized by a definite

set of progressive norms... openness, autonomy, consensus-driven decision-making, and

freedom from political control by governments. The notion that no one owned the Internet

provided an opening for amplification and articulation. But visionary pronouncements

needed “visionaries” to pronounce them.

One prominent voice in this context was Einar Stefferud, a friend and colleague of

Farber’s and Postel’s since the ARPANET days. Stefferud had made a variety of contributions

to the management of the ARPANET email list, including identification of the first spam – a

1978 unsolicited commercial message sent by a DEC marketing representative.97 He later helped write an IETF standard for cross-platform email attachments.98 By the mid-1990s,

“Stef,” as he called himself, had positioned himself as a management consultant conversant

on “big-picture” issues such as technology convergence. He began to speak up nearly any

time a discussion about what he called the “Internet paradigm” turned philosophical.

As Stefferud saw it, thanks to the backbone privatization, the Internet had become

“edge-controlled...” a society of “bounded chaos...” impossible to regulate because it was

constituted by free agents. Great consequences would follow, he predicted, such as a huge

leap forward in the way people worked together. Up-and-coming Internet-enabled forms of

collaboration would relegate proponents of central authority (not just governments, but

telephone monopolies and other unreformed corporations) to history’s ash heap.

Propelled by this logic, Stefferud came to believe that the IANA had become a

shackle on the Internet’s capacity for innovation. Predicting that the Internet’s naming and


99. See Stefferud's testimony in "Before the U.S. Department of Commerce, National Telecommunications and Information Administration: In the Matter of Registration and Administration of Internet Domain Names," Docket No. 970613137-7137-01, July 1997, http://www.open-rsc.org/essays/stef/noi/.

100. Phone interview with Don Mitchell, February 11, 2003.

numbering authority would suffer the fate of all outmoded relics, he ultimately took a leading role

in a campaign to make that happen as soon as possible. Things got personal later, after the

DNS War was well underway. Stefferud’s rhetoric rose to diatribe when he denounced IANA

as "this last remaining vestigial tail of the ARPANET."99 Postel stopped speaking to him. Their

public split reflected the deepening rift in the community. By the latter half of the 1990s old

friends were breaking off into opposed camps.

* * *

At issue, ultimately, was whether commercialization and privatization was itself the

most fundamental spring of innovation on the Internet, or whether some form of egg-

breaking, omelette-making enlightened stewardship was sometimes appropriate and even

necessary. Wolff’s approach had been to prime the engines of the private sector, injecting

various businesses with a series of market-seeding gambles, supplemented by a few large

market-steering investments. The most critical investment of all – the ANS matter –

inevitably raised hackles for its strategic corner cutting, but things turned out so well so

quickly for so many ISPs that the episode was soon relegated to the annals of tedious old

trivia.

Perhaps Wolff could have made some better choices here and there. Nevertheless,

he had a vision for backbone privatization, he turned it into a plan, and he executed that plan

effectively. If he tilted too far in favor of ANS, at least the tilt was correctable and didn’t

produce a lasting injustice in the form of an entrenched monopoly. As one defender put it,

“How often do you hear the word ANS today? Are they big telecom giants? Do they control

the whole world? No.”100

On the other hand, no one in a leadership role expressed a similar long-term vision

for moving the Internet’s name and number administration out of the government’s hands.


This was largely because the community was caught off guard by the Internet’s spectacular

success, especially the inception of domain name-hungry World Wide Web services.

When the 1980s closed, Wolff had already put things well on the way toward

backbone privatization. He had acted in a farsighted manner, and he had acted quickly

enough to meet the circumstances, but he had also been lucky in that the need for backbone

privatization was long foreseen. With domain names, the story was different. Technical and

policy innovations had outpaced the ability to anticipate the extent to which the sheer

demand for names and numbers would explode. This put Jon Postel in a difficult position.

Since Postel possessed so much prestige and competence, it seemed logical that the initiative would fall to him. Yet the circumstances severely compressed the amount of time available in which

he could formulate a vision, put it into action, and find a way to manage the inevitable

complaints.


Chronology 2 Early Infrastructural Development

Date Event

During 1994 ANS adopts CIX collaborative peering arrangement.

June 1992 ANS yields to CIX demands for right to interconnect.

During 1991 MERIT is allowed to form ANS, becoming NSF’s ISP.

During 1990 Federal Networking Council (FNC) begins meeting.

December 1988 First explicit reference to “Internet Assigned Numbers Authority”, RFC 1083.

During 1987 NSF contracts with MERIT (including IBM and MCI) to develop NSFNET backbone.

November 1987 IP Registry operations moved from ISI to DDN-NIC at SRI.

During 1985 Dennis Jennings of NSF selects TCP/IP for national backbone.

July 1984 RFC 902 describes administrative relations between research and military networks.

October 1983 Joyce Reynolds joins Postel as assignment contact; IP segments organized into class-based structure, Assigned Numbers, RFC 870.

During 1983 ARPANET program managers promote UNIX implementation of TCP/IP.

During 1983 Gateway built linking CSNET and ARPANET.

Early 1983 MILNET splits from ARPANET.

January 1, 1983 Deadline for transition from NCP to TCP/IP.

August 1982 Postel publishes Simple Mail Transfer Protocol (SMTP), RFC 821.

September 1981 Postel publishes Internet Protocol, RFC 791.

During 1980 CSNET incorporates TCP/IP.

During 1979 NWG reunites as Internet Configuration Control Board (ICCB).

May 1979 First allocation of IP address segments, Assigned Numbers, RFC 755.

During 1977 Postel begins work at USC’s Information Sciences Institute (ISI).

May 1974 Cerf and Kahn, A Protocol for Packet Network Intercommunication – basis of TCP.

During 1972 Robert Kahn leaves BBN for ARPA, to focus on designs for peering technology.

May 1972 Jon Postel proposes himself as numbers “czar”, RFC 349.

March 1972 Ray Tomlinson of BBN inaugurates use of @ symbol in email.

August 1971 Postel begins taking responsibility for number assignments, Sockets in Use, RFC 204.

July 13, 1970 Kalin publishes A Simplified NCP Protocol, RFC 60.

December 1969 First four nodes of ARPANET are connected via IMPs.

October 29, 1969 First host to host message, sent by Charley Kline, from UCLA to SRI.

April 7, 1969 Steve Crocker publishes Host Software, RFC 1.

Early 1969 Meetings of the Network Working Group (NWG) begin.

During 1968 BBN awarded contract to build first IMP.

Late 1966 Robert Taylor hires Larry Roberts to plan and launch the ARPANET.

August 1962 Paul Baran publishes On Distributed Communications Networks.

March 1960 J.C.R. Licklider publishes Man-Computer Symbiosis.

July 1945 Vannevar Bush publishes As We May Think.


101. For a thorough listing of leaders in the IETF engineering community over an extended period, see Abdul Latip, "Trivia IETF," http://people.webindonesia.com/dullatip/opini/2000/ietf.txt.


You wanted to be a standards organization. Now live with it.

Stev Knowles, Internet Engineering Steering Group

4. INSTITUTIONALIZING THE IETF

a. Open Culture

In 1977, Postel moved to ISI, where he worked until the end of his life. Most of his

projects were funded by the US Government’s Department of Defense, via DARPA.

Ironically, the main client for the parameter assignments and documents Postel produced was

not the US military, but an exceptional group of avowedly anti-authoritarian civilians, the

membership of the Internet Engineering Task Force – the IETF. It was the institutional legacy

of the Network Working Group, the team that Steve Crocker had organized in the late 1960s

starting with a handful of students at UCLA's Boelter Hall. Though the original NWG had

long ceased to exist as an ongoing entity, Postel saw to it that those three words – Network

Working Group – headed every RFC he published. This was more than just a bow to

sentiment and tradition. By the early 1990s the most prolific source of documents submitted

to the RFC editor was the IETF, which, for all practical purposes, was the NWG’s rightful

successor.

The core of the IETF is a notoriously casual and clubbish group of people whose

sense of community predates their organizational identity. Many members can trace their

lineage back to the ARPANET days and can boast a very few degrees of personal separation

from Crocker and his NWG colleagues.101 Crocker's initiative at UCLA had spawned

multitudes of Internet-focused working groups and technical committees. The first incarnation

of the IETF was simply one among many of those groups. It was officially launched on

January 17, 1986, descended from a working group called Gateway Algorithms and Data

Structures (GADS).


102. Internet Architecture Board, Henning Schulzrinne, ed., A Brief History of the Internet Advisory / Activities / Architecture Board, http://www.iab.org/iab-history.html. See also "Interview with Lixia Zhang," IETF Journal 2.1, Spring 2006, http://www.isoc.org/tools/blogs/ietfjournal/?p=70.

103. See letter from Rahmat M. Samik-Ibrahim recounting a conversation between Corrigan and Hans-Werner Braun, http://www.postel.org/pipermail/internet-history/2001-October/000057.html.

104. Hans-Werner Braun, "Re: IETF as a Pentagon Thing," IH, http://www.postel.org/pipermail/internet-history/2001-October/000064.html.

GADS had first met in 1985 and after only a couple of meetings was split in two to account for emerging distinctions in architecture and engineering.102 Some members were

primarily interested in “blue sky” design, while the others wanted to focus on solving

immediate operational problems. The latter was designated Internet Engineering – INENG.

Twenty-one participants attended the first INENG meeting in San Diego, chaired by Mike

Corrigan, the DDN’s Pentagon-based technical manager. The acronym was soon changed to

IETF.

Though the first gathering included participants as diverse as Lixia Zhang, an MIT

grad student, early meetings generally consisted of individuals credentialed by the Defense Department. The military connection was palpable. One meeting at Moffett Field

was interrupted by the noisy takeoff of a U-2 spy plane. The culture shifted at the end of

1987 when Corrigan was persuaded to open the door to participants from the National

Science Foundation.103 By that time the NSF's sponsorship of an educationally-oriented

national telecommunications backbone had made it a major force in networking. Despite the

radically different missions of the parent bureaucracies, NSF’s experts were kindred spirits

who hoped to improve their own TCP/IP networks. It made sense for the military to grant access. With that, the IETF was no longer a "Pentagon thing."104 The move immediately opened the way for unrestricted entry by non-governmental participants and private

vendors. As more people entered, even more wanted in. Participants felt they were truly on

the “ground floor” of the burgeoning Internet, and didn’t want to miss a thing.

Attendance at IETF meetings grew steadily, breaking 100 in 1987. A sweeping

reform of the Internet standards-making process in 1989 recast the IETF as the overarching

venue for a wide array of working groups. The IETF consequently became the hub of


105. "Past Meetings of the IETF," http://www.ietf.org/meetings/past.meetings.html.

106. Froomkin (2003).

107. "Overview of the IETF," http://www.ietf.org/overview.html.

108. John Klensin, "Re: [ISDF] RE: www.internetforce.org," IETF, January 10, 2004.

Internet-related standards development, and thus the rightful heir to the spirit of the NWG.

Meeting attendance grew rapidly thereafter, reaching 500 in 1992, 1000 in 1995, and 2000

in 1998, with a drop to around 1600 after the end of the Internet boom.105

The IETF’s processes have remained remarkably open since the 1989 reform.106

Much of the work of thrashing out the details of new Internet standards is done through

participation in online email distribution lists. Anyone can join an IETF Working Group –

over one hundred are presently active – simply by subscribing to the appropriate list.107 Most

groups take a particular standard, or a carefully limited set of standards, as their finite

mission. One senior member, John Klensin (formerly of MCI), described the process this

way:

[T]he fact that we generally close WGs that have finished their work, rather than encouraging them to find new things to keep them busy has, I believe, been very important in keeping us from developing many standards that no one wants and that are out of touch with reality.108

The working groups are overseen by a superset of Area Directors who constitute the

Internet Engineering Steering Group (IESG) – the IETF’s de facto oversight body. The IESG

is not considered to be a free-standing group. It is a management function of the IETF. New

Working Groups can be called into existence by the IAB or the IESG, or the membership can

self-organize under a provisional status called “Birds of a Feather” (BOF). The BOF

designation was once very much a part of the IETF culture – cute and cutting edge.

RFCs and related publications are provided free of charge over the Internet, as are the

minutes of all meetings, including Working Group and BOF sessions (that is, when someone

bothers to write them up). A key motive for this open culture was that the IETF’s early

members tended to resent the closed procedures and expensive charges for publications that


109. Hafner and Lyon (1996: 235-5).

110. Andrew L. Russell (2002), “From ‘Kitchen Cabinet’ to ‘Constitutional Crisis’: The Politics of Internet Standards, 1973-1992.” See also his (2001), “Ideological and Policy Origins of the Internet, 1957-1965.”

were de rigueur in most other standards organizations. They wanted their community to stand

as an alternative. Steve Crocker had confirmed that principle early on. While serving as

DARPA program manager he used his leverage to ensure that BBN would release the IMP

source code to anyone who requested it.109 This reinforced the sentiment within the

community that the benefits of work performed for the public should not be appropriated by

private parties or sequestered within a government bureaucracy.

The IETF’s celebrated traditions stem from that and numerous like-minded decisions

to build an open and inclusive community. The graduate students and young professors of

the NWG had come to power in the ARPANET environment, and that cultural experience

inscribed an unmistakable influence in their lives. The ARPANET, like most military projects,

placed little priority on commercial imperatives like recovering the cost of investment. But

it was unusual in that it was unclassified. The system embedded the values of what some

considered to be the “golden age” of American science. Just as the commercial aerospace and

satellite industries benefitted from spinoffs of Cold War arms programs, the Internet’s

founders had been “utilizing the resources of the military to advance the state of the art in

their field.”110 Government funding was generous during that era, its oversight was

unobtrusive, and the recipients generated one useful discovery after another.

Yet Cold War-styled secrecy and skulduggery were anathema in the Internet

engineering culture. The NWG’s legacy of openness to grassroots participation bloomed

again within the IETF’s culture, instilling a fresh mood of frontier idealism. The Internet’s

newcomers – both engineers and users alike – inhaled a rich, heady atmosphere of open

interaction that induced an uncompromising passion for unrestricted access to information.

As a senior IETF member, Harvard-based Scott Bradner, put it, “It is a matter of pride and


111. “Testimony of Scott Bradner,” http://www.ciec.org/transcripts/Mar_21_Bradner.html.

honor in the IETF that all documents are public documents available for free over the net.

We used the paradigm to develop the paradigm.”111

This is one of the clearest and simplest examples of how computer engineers behaved

as social engineers. The spirit of the net’s culture was to share information about the net’s

design. It was a potent alignment of means and ends, values and goals.

But the luxury of ignoring commercial imperatives for the sake of avant garde

innovation would have long-term ramifications. It would draw many of the IETF’s leaders

into direct confrontation with business managers whose livelihood depended on the exclusive

control of information. The growth of the Internet ultimately would produce an epochal

challenge to the business models of many important industries in the United States. It was

particularly disrupting to the news media and recorded entertainment industries.

* * *

Entry to the IETF’s week-long meetings, held three times a year, is open to anyone. Admission fees

were nominal until the late 1990s, when government sponsorship finally phased out. That

funding had peaked in 1995 at about $1,000,000 per year. To offset the lost subsidy, the cost

of admission to the meetings rose steadily, reaching a fairly reasonable $450 in 2002... less

for students. Participants meet in open sessions and work online, using a generously endowed

computing and connectivity facility called the “terminal room,” feasting throughout the week

on free snacks and coffee. (During the Memphis meeting I attended in 1997, the IETF’s

Chair, Fred Baker, bragged that the entire city had run out of cookies because of the

members’ insatiable appetites.) Contributions to the group’s work are considered voluntary,

except for the tasks performed by a small secretariat.

Despite the wide open atmosphere and exceptionally casual dress code, the behavior

of the members reflects a highly disciplined blend of clique and meritocracy. Long-term

participants have a strong cultural commitment to preserving the IETF’s manners and

historical memory. One could even argue that the behavior of Internet engineers reflects a

modern version of the guild mentality that characterized artisan societies several centuries


112. The first person I heard entertain this idea was Milton Mueller.

113. See Jon Postel, “RFC 791: Internet Protocol,” September 1981. Bob Braden highlighted the citation and provided its better-known phrasing in “RFC 1122: Requirements for Internet Hosts -- Communication Layers,” October 1989. See Braden’s comments on this in “Re: Be: liberal...” IETF, November 1, 1999, http://www.netsys.com/ietf/1999/5222.html.

ago.112 The organization welcomes newcomers and their useful work, but insists that anyone

seeking to publish a standard conform to a very carefully delineated style of communication.

This is not necessarily a bad thing.

Postel, as RFC Editor, was not only the final arbiter of the acceptable style, he was

the master and model of it. To read the RFCs that he wrote, or to witness one of his live

presentations, was to experience a demonstration of prolific clarity. His words were

eminently constructive, a palpable example of how, in philosophical terms, speaking is

doing. His style of communication also set a standard for making standards, a constraint that

served as a critical enablement. Working Group meetings tended to be fast and challenging.

Many groups would have foundered if the members had not been presented with such an

exquisite example of how to work well.

Postel had his own pithy take on the ethic of openness, distilled into the aphorism,

“Be liberal in what you accept, and conservative in what you send.” The text from which the

maxim was drawn was utterly pragmatic:

The implementation of a protocol must be robust. Each implementation must expect to interoperate with others created by different individuals. While the goal of this specification is to be explicit about the protocol there is the possibility of differing interpretations. In general, an implementation must be conservative in its sending behavior, and liberal in its receiving behavior. That is, it must be careful to send well-formed datagrams, but must accept any datagram that it can interpret (e.g., not object to technical errors where the meaning is still clear).113

Much could be inferred from that, far beyond the architecture of the Internet Protocol.

The point of openness is not to unleash a flood of data, but to foster the communication of

useful information. The point of a rule is not to insist on strict conformance to the rule, but


to help accomplish the tasks that the rule makers hoped to facilitate. Appreciation for the

wisdom in that statement is what made Postel a guide, and not just a gatekeeper.

b. The Rise of the IAB

Cerf’s career had briefly drawn him away from the commanding heights of the Internet

when he left his position as Network Program Manager at DARPA to

become Vice President of Digital Information Services at MCI. He started there in 1982,

focusing on the development of the first major commercial email system – MCI Mail. It was

not even a TCP/IP-based project. Still, the absence was brief, and the private-sector

experience and contacts it opened were bound to be valuable.

After MCI Mail was well underway, Cerf was free to resume high-level involvement

in the Internet standards process. This time around he would be as much an impresario as a

developer. His new goal was to promote unabated expansion of the system. What was good

for the Internet was good for MCI. The strategy took shape as he moved to consolidate the

Internet’s planning, standards making, research coordination, education and publications

activities under an organizational umbrella beyond the reach of government control.

The pieces started falling into place in 1986 when Cerf joined TCP co-developer

Robert Kahn, and ISI’s founder, Keith W. Uncapher, in creating the Corporation for National

Research Initiatives (CNRI). It was a kind of “sand box” in which advanced technical ideas

related to internetworking could be investigated and developed. Kahn was the primary

mover, serving as Chairman, President, and CEO. One of CNRI’s first projects involved

linking MCI’s mail system with Internet mail. MCI was among CNRI’s key corporate

sponsors, which included IBM, Xerox, Bell Atlantic, Digital Equipment Corporation, and

other large corporations with interests in the data telecommunication industry.

CNRI began hosting the IETF’s new secretariat, a small operation which performed

tasks such as scheduling meetings, accepting registrations, responding to inquiries, and so

on. The secretariat was initially headed by IETF Chair Phil Gross, who was simultaneously

working on other projects at CNRI. When Gross left CNRI in 1991 to become Executive

Director of ANS (the NSFNET’s backbone provider), Steve Coya, who had been at MCI and


114. A list of his publications may be found at “Clark, David D.,” http://www.networksorcery.com/enp/authors/ClarkDavidD.htm.

worked with Cerf on the MCI Mail/Internet linking project, was hired to run the Secretariat

full time.

The progeny of the NWG had not been floating aimlessly after Cerf left DARPA in

1982. His successor as Network Program Manager, Barry Leiner, eliminated the Internet

Configuration Control Board (ICCB) of the ARPANET era, replacing it with the Internet

Activities Board (IAB). Though the IAB was larger, the core membership and the leadership

within the two groups were essentially identical. The transition could just as well have been

considered a renaming. The IAB’s first chair was Dave Clark, a Research Scientist at MIT.

Clark, the reader may recall, would go on to herald the IETF as an organization which

rejected “kings, presidents, and voting” in favor of “rough consensus and running code.”

In addition to serving as IAB Chair, Clark was also known during this period as Chief

Protocol Architect, a title evidently intended to confer a status equivalent to Cerf’s during

his own inventive heyday. Clark’s many contributions to Internet development included a

series of papers that carefully described and rationalized its architectural principles –

particularly the “end-to-end” and “fate-sharing” ideas whose implementation greatly eased

the task of moving signals across its networks.114

The overarching motivation, in keeping with the notions of open architecture Kahn

envisioned for TCP, was to keep the lowest levels of the protocol unencumbered by

requirements that could be handled elsewhere. No information about the content of packets

would be retained by the routers and gateways that moved them around, a condition called

“statelessness.” These ingenious principles assured the TCP/IP protocol suite would remain

open to further extension in many directions. The proof of their virtue was the later

emergence of popular applications like the World Wide Web and realtime chatting.

Clark also wrote the earliest implementation of software that made various Internet

services accessible from DOS-based desktop computers. And he made it his business to call

for a system of distributed “name servers,” thus prompting the development of the modern

domain name system.


115. Leiner, et al., “A Brief History of the Internet,” http://www.isoc.org/internet/history/brief.shtml.

Leiner and Kahn both left DARPA in 1985, just as the agency was beginning to scale

back its involvement in networking research. This left the IAB without a sponsor in the

government. The IAB’s membership remained active, however, and the group was able to

take on a mantle of independent leadership in Internet-related matters.115 As a committee of

technical luminaries whose pedigree traced back directly to the founding fathers, the IAB

began to speak as if it were the official voice of the Internet. The presumption of eminence

was especially useful when circumstances called for gravitas.

Clark remained on the IAB when Cerf became its chair in 1988. In 1989, with CNRI

already in place and ready to sponsor the IETF’s secretariat, Cerf presided over a major

overhaul of the engineering community’s growing constellation of interests. At that point,

Internet Engineering was still just one of many task forces and working groups that had

emerged in the wake of the ARPANET and Crocker’s NWG. Other significant working groups

included Interoperability, Applications Architecture, Security, and End-to-End Services.

Overcrowded meetings had already generated spontaneous attempts at self-

organization into specialized groups, but now everything was consolidated into two task

forces, reconstituted as arms of the IAB. Nearly all the activity was subsumed under the

IETF. The rest went to the much smaller Internet Research Task Force (IRTF) – eventually

headed by Postel. Each task force consisted of an array of working groups organized into

Areas, headed by two Area Directors. Those directors constituted an overarching Steering

Group. The IETF had its Internet Engineering Steering Group (called IESG), and the IRTF

had an Internet Research Steering Group (called IRSG).

* * *

Careful adjustments were being made in the overarching structure of the RFC series

during this period. The complexity and diversity of the contributions necessitated a better

classification of the kinds of work that were being published, and a clearer description of

what it took to get an RFC published.


116. See, for example, “RFC 30: Documentation Conventions,” February 4, 1970.

117. See Bob Braden, “IH Re: Common Questions, RFC-Compliant Architecture,” IH, September 6, 2003.

When Steve Crocker started the RFC series he occasionally inserted RFCs titled

“Documentation Conventions.” These laid out the basic form for the notes in the series.

(Since there was no email yet, each of those very early RFCs also included a distribution list

with the recipients’ ground addresses.)116 In April 1983 Postel began inserting RFCs

identifying “the documents specifying the official protocols used in the Internet.” These

RFCs were like catalogs of the essential RFCs and would serve to address the widespread

interest in seeing a clear statement of what the Internet protocol suite actually required for

a compliant implementation. He initially titled this series “Official Protocols,” later changing

it to “Official ARPA-Internet Protocols.” That title was still in use just before the IETF was

reorganized.

The growth of the Internet in the 1980s brought with it a growing desire among

newcomers to understand the process underlying the creation of a standard. When Postel and

Reynolds published RFC 902 in 1984, describing the DDN and DARPA worlds and the

“Typical Chain of Events” leading to the publication of an RFC, it was an important, if only

nominal, step forward. Their rendition of the process was far too airy and informal to endure

for long.

In 1988 DARPA’s program managers pushed the members of the Internet Activities

Board to insert more rigor into the publication process.117 This was a timely recommendation.

The explosion of interest in the IETF presaged a formidable increase in the workload

shouldered by Reynolds and Postel. IAB members pitched in with the goal of establishing

and refining an official “standards track” that would distinguish formally endorsed standards

from the informational documents that anyone could submit in accordance with the

traditional ethos of openness. Postel renamed the protocol series to “IAB Official Protocol

Standards,” promulgating two independent categories of classification, each with its own

process of maturation.


118. Internet Activities Board, “RFC 1083: IAB Official Protocol Standards,” December 1988.

119. See RFCs 1310, March 1992; 1602, March 1994; and 2026, October 1996.

120. From Lyman Chapin, “RFC 1310: The Internet Standards Process,” March 1992.

The first is the state of standardization which is one of "standard", "draft standard", "proposed", "experimental", or "historic". The second is the status of this protocol which is one of "required", "recommended", "elective", or "not recommended". One could expect a particular protocol to move along the scale of status from elective to required at the same time as it moves along the scale of standardization from proposed to standard.118

After the reorganization of the IETF, IAB and IESG members continued to work on

firming up the RFC publication process. The next pivotal step came in 1992, under the

authorship of Lyman Chapin, who had succeeded Cerf as Chair of the IAB that same year.

Subsequent refinements were overseen by Scott Bradner, whose primary base of activity at

that time was closer to the IETF working groups and the IESG.119 The classification system

began with a set of basic definitions.

[A]n Internet Standard is a specification that is stable and well-understood, is technically competent, has multiple, independent, and interoperable implementations with operational experience, enjoys significant public support, and is recognizably useful in some or all parts of the Internet.120

The document representing an Internet Standard had to advance through levels of

maturation – Proposed Standard, Draft Standard, and Internet Standard. Time would be

allowed for peer review from within the working group as well as comment from IANA and

the appropriate IESG Area Directors. In Chapin’s version, final approval depended on the

IAB at the top the chain. While all types of documents were still included in the RFC series,

an official Internet Standard would now receive a second designation. For example, RFC

1157 – the Simple Network Management Protocol – was also denominated as STD0015.
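The two-axis classification quoted above, and the later parallel STD numbering, can be sketched in a few lines. The enumeration values are taken from the quoted RFCs; the lookup table contains only the SNMP example mentioned in the text, and the helper function is hypothetical:

```python
from enum import Enum

# The two independent classification axes described in RFC 1083.
class Standardization(Enum):
    PROPOSED = "proposed"
    DRAFT = "draft standard"
    STANDARD = "standard"
    EXPERIMENTAL = "experimental"
    HISTORIC = "historic"

class Status(Enum):
    REQUIRED = "required"
    RECOMMENDED = "recommended"
    ELECTIVE = "elective"
    NOT_RECOMMENDED = "not recommended"

# A full Internet Standard keeps its RFC number and gains a second,
# STD designation. Only the example cited in the text is listed here.
std_numbers = {1157: "STD0015"}  # RFC 1157, SNMP

def designations(rfc):
    """Return every label under which a document is known."""
    labels = ["RFC %d" % rfc]
    if rfc in std_numbers:
        labels.append(std_numbers[rfc])
    return labels

assert designations(1157) == ["RFC 1157", "STD0015"]
```

The key property the sketch captures is independence: a document’s maturity level and its requirement status move on separate scales, so the scheme needs two values per protocol rather than one.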

The close of the 1980s was not only an extremely productive period for the Internet

engineering community; people remember it as a wild time – exciting, fun, and full of

surprises. Plus, it was becoming clear that before long a lot of money would be made. As a

group of individuals, they were few enough and well-connected enough that many could still


121. See his interviews with Tony Perkins, Red Herring, “The Weaving of the Web,” January 26, 2000, http://www.redherring.com/insider/2000/0126/news-redeye012600.html. See also David Hochfelder, “Vint Cerf Oral History,” May 17, 1999, http://www.ieee.org/organizations/history_center/oral_histories/transcripts/cerf.html.

get to know each other personally. The investment of time and planning from earlier in the

decade was beginning to bear fruit in the form of a useful, wide-reaching network, fostering

a breathtaking new era of online interaction and collaboration. Moreover, there were growing

signs of DARPA’s retreat from direct sponsorship of the standards process, giving weight

to the idea that the IAB had been left in a position of authority. This was not necessarily true,

but for those who dreamed of new worlds in cyberspace, it was beginning to seem as if the

Internet could fulfill that promise.

c. Discretionary Authority

Just as Wolff had started early, putting ducks in a row to prepare the way for

backbone privatization, so had Cerf, who began to remake himself as a policy innovator. At

that time the conduct of private activity on the Internet was still a novel and touchy issue.

Seeking to “bust” through the anti-business “policy logjam” favored by academics who

harbored a disdain for corporate “money-grubbers,” Cerf took what he would later describe

as “a very deliberate step.” He believed that the Internet couldn’t possibly expand to its full

capacity “without having an economic engine running underneath it,” so he persuaded the

members of the Federal Networking Council to allow MCI Mail, the first commercial email

package, to be tied in.121 Not coincidentally, he had been working for MCI over the previous

few years.

Another carefully prepared step was the August 1990 publication of RFC 1174.

Promulgated as the official position of the IAB, it was framed as a policy recommendation

to the FNC’s Chairman about how to deal with the burgeoning Internet.

Cerf wanted to break another logjam, and had every reason to expect the FNC’s

members would again accept his suggestions. He clearly had an ambitious agenda. His

threefold purpose in writing RFC 1174 was: 1) To promote retention of a centralized IANA;

2) To ratify IANA’s authority to “delegate” or “lodge” various assignment tasks with a third


party Internet Registry, which at that time was still based at SRI in the DISA-NIC; and 3)

To explicitly urge termination of certain requirements by which new networks and service

providers were allowed to connect to the system.

RFC 1174 greatly boosted IANA’s prominence, giving it the semblance of a formal

life. Though references to the “Internet Assigned Numbers Authority” had begun to appear

in 1988, RFC 1174 appears to be the earliest documented invocation of IANA as an

acronym, and the first description of it as an institution, rather than simply the address of the

place where new numbers could be obtained. That description asserted IANA’s centrality in

the management of the Internet’s increasingly complicated number assignments.

Though not explicitly mentioned in the text, a key motivation behind the overhaul

was to expedite the growth of the Internet in Europe. The ratification of IANA’s delegation

authority would formally allow Postel to distribute a large block of IP numbers to a West

European consortium of networks coordinated by Daniel Karrenberg in Amsterdam. Another

block would be delegated to an Asian registry as well. The changes in connection policy

described in RFC 1174 would reduce administrative overhead and radically streamline the

process by which the Europeans and others could hook new networks into the Internet. But

it would also mean that huge chunks of IP space were being delegated outside of American

jurisdiction.

Though taking SRI out of the loop was intended to open up the connection process,

the goal was streamlining, not anarchy. There was every reason to expect that some

responsible party would be called upon to provide oversight and coordination (“sanity

checks”) as the system grew... no longer on a case by case basis for every new network, but

at least for the largest networks and backbone providers. When CCIRN was created, there

was evidently some hope that it might ultimately be able to step up and do that job,

providing the sort of global infrastructural coordination needed to “make the Internet work.”

Paradoxically, however, increasing levels of participation handicapped the group. In the view

of one member, CCIRN had devolved into a “dinner club” unable to offer much more than


122. Personal email from Steve Goldstein, former NSF program director for Interagency and International Networking Coordination, February 18, 2003.

123. See the initial IEPG charter in Barry Leiner, “Globalization of the Internet,” in Lynch and Rose (1993: 33-4).

124. Geoff Huston, “RFC 1690: Introducing the Internet Engineering and Planning Group (IEPG),” August 1994.

a venue for socializing and “prolonged show-and-tells.”122 In 1990 CCIRN’s leaders set up

a smaller and better focused Intercontinental Engineering Planning Group (IEPG) to fulfill

the mission of infrastructural coordination.123

A parallel Federal Engineering Planning Group (FEPG) had been created in the US

by an FNC subcommittee co-chaired by Phil Dykstra of the National Oceanic and

Atmospheric Administration (NOAA) and Dick desJardins of NASA. The FEPG was instrumental

in coordinating the creation of two Federal Interagency Exchanges – FIX East at College

Park, Maryland, and FIX West in California. These became the prototypes for the

Network Access Points (NAPs) that are essential for the movement of Internet traffic today.

The FEPG remained active through the mid-1990s.

The original IEPG was rechartered in 1994 as the Internet Engineering Planning

Group, keeping the same acronym.124 The IEPG’s membership tended to be particularly close

to Postel. The first chair was Phil Gross, a high-profile leader in the standards community.

Another key member was Bill Manning, a long time staffer at ISI, who also participated in

a related project called the Routing Arbiter. The IEPG’s members were in no position to

assume the level of gatekeeping authority that SRI had provided, but neither did they want

to. Their general goal was to liaise with other Internet operators to provide technical

coordination.

Routing issues were not directly under Postel’s purview as Names and Numbers

Czar, but those operations were directly affected by the assignment of IP addresses. Routing

could not take place unless there were numbers to route, so he had to be kept in the loop.

Now that a vast worldwide build-out of digital telecommunications was underway, putting

physical control of the Internet’s infrastructure increasingly beyond the grip of the U.S.


125. Vint Cerf, “RFC 1174: IAB Recommended Policy on Distributing Internet Identifier Assignment and IAB Recommended Policy Change to Internet ‘Connected’ Status,” August 1990.

Government, the importance of getting access to the system by ensuring a presence within

its array of names and number identifiers became that much more obvious. And here, IANA

was still the top of the chain.

* * *

An important assumption that informed the writing of RFC 1174 was the belief that SRI’s

role as an Internet Registry would soon end. As the home of the legendary Augmentation

Research Center, and the second site of the ARPANET, SRI had received considerable

amounts of government funding to manage related projects. It had been functioning as the

internetworking information center for nearly two decades. Even the RFC series was

distributed via SRI’s online services. Many of the lists and tables of records that were

presumed to be under Postel’s stewardship were in fact being managed at SRI on a day-to-day

basis. Among the most important of these were the delegation of IP numbers and second

level domain names.

This was about to change. As a consequence of the NSF’s successes, interest among

academics had reached a self-sustaining critical mass. With the military’s relative investment

in rapid decline, it was clear that the databases would soon be moved to another location and

sustained by new sources of funding. Cerf wanted to be sure that he, Postel, and other

internetworking insiders would continue to have strong influence and perhaps even the final

say in the expected redelegation. IANA would provide the venue for this gatekeeping power.

Though the name and the acronym had only recently come into use, Cerf portrayed

IANA as an organization that was as old as the Internet itself:

Throughout its entire history, the Internet system has employed a central Internet Assigned Numbers Authority (IANA) for the allocation and assignment of various numeric identifiers needed for the operation of the Internet. The IANA function is performed by USC Information Sciences Institute. The IANA has the discretionary authority to delegate portions of this responsibility...125


Asserting the IANA’s “discretionary authority to delegate” stood as a claim that

Postel’s IANA had the final say over the allocation of critical Internet resources. With this

statement, Cerf asserted the right of the IANA to add and change elements in the

constellation of machines that made up the root of the Domain Name System, and to

distribute blocks of the IP address space. He also asserted the right to assign those tasks to

subordinate agents. Postel had been doing such things all along, of course, but now some of

those subordinates would be overseas. No objection came from any government agency. The

new regime was allowed to stand.

Perhaps even more importantly, the wording expressed a crucial distinction which

would have enduring ramifications. Even though Postel’s employer was ISI, whose employer

was, in turn, the Department of Defense, IANA’s employer was described in RFC 1174 as

the Internet system. IANA’s power – its authority to act – was thus posed as stemming from

its institutional subordination to that system, rather than from contractual duties performed

by individuals at the behest of the US government. If what was true for the IANA was true

for Postel, this is an early indicator that he was serving two masters. In 1990 that did not

present a hint of a problem, but by the end of the decade his loyalties would be put to the test.

And who, if anyone, could speak for the Internet system? This would be debated as

well.

d. ISOC Underway

Computerization rapidly accelerated in American society during the 1980s as the

technology became more affordable and more accessible. While Internet access spread

among colleges and universities, non-networked personal computers became pervasive, if

not ubiquitous. When the decade began, the Tandy TRS-80 was approaching the peak of its

popularity. The open architecture, Intel-based IBM PC was introduced in the summer of

1981. At the start of 1983 Time Magazine anointed the PC “Machine of the Year.” Twelve

months later, Apple’s “1984” ad aired during the Super Bowl. The IBM AT reached the

market that same year. By the end of the decade, the Intel 386 series microprocessor was


Bloombecker (1990).126

dominant, IBM faced strong competitors like Compaq, the PC clone market was going

strong, and Microsoft DOS was the ascendant PC operating system.

The first significant DOS-oriented software virus – (c)brain – appeared in 1986. Also

known as the Pakistani Flu, it was written by two brothers in Lahore who were trying to

make a software piracy detection device. Other, more intentionally malevolent viruses

followed, provoking fears of computer-based threats and crimes. The Internet’s126

vulnerability to attack became apparent in November 1988 when Robert Morris, Jr., a 23-year-old doctoral student at Cornell University, in an experiment gone awry, created the first software worm. Whereas a virus was typically transmitted via diskette and activated when an unaware user booted a computer while an infected disk was present in one of its drives, a

worm was considerably more animate. A worm replicated itself by moving across open

network connections, actively looking for new hosts to infect. The US General Accounting

Office estimated that Morris’s Internet Worm caused as much as $100 million in damage.

That event raised awareness of vulnerabilities in the Internet, not just within its code

base, but in its oversight systems. Just as the Internet’s machinery would need better

protection to keep operating, the standards makers felt they needed better protection for

themselves and their organizations. The effort to provide for their own collective security

inspired new thinking about their collective goals. To have a legal identity deserving of a

new security apparatus, they found they also needed a shared mission – a civic personality

with which they could identify. Identification with that mission quickly took on a life of its

own. Several threads of activity combined to bring about this result.

Shortly after the Internet Worm event, Scott Bradner discovered an unrelated but

serious flaw in the password function of the UNIX operating system. When he reported the

bug to the Computer Emergency Response Team (CERT) based at Carnegie Mellon

University, he was dismayed to find out how difficult it was to get CERT to acknowledge

the problem and take action that might fix it. Bradner was soon venting about CERT’s


Personal interview with Scott Bradner, July 11, 2002.127

Ibid.128

shortcomings to some colleagues in the IETF. One of them, IESG-member Noel Chiappa,

offered to hire an attorney to see what could be done.127

Chiappa had become a leader in the TCP/IP community in 1977, while a Research

Associate at MIT. In the mid-1980s he worked with Proteon, Inc. on Boston’s Technology

Drive, the East Coast’s answer to Silicon Valley. Proteon helped launch the commercial

router industry. A thoughtful person who likes to collect aphorisms, he claimed to have

coined one that has since become quite well known: “Fast, Cheap, Good - Pick Any Two.”

Flush with cash from his successes at Proteon, Chiappa may well have been altruistic in

offering to fund the exploration of the IETF’s options, but he had a valid ulterior motive –

getting a clear sense of his own vulnerability as a participant in the standards making process.

Ever more money was at stake in the development of standards, and it was hypothetically

conceivable that the “loser” in a standards battle – perhaps a vendor who had bet on the

wrong horse, so to speak – might be motivated to take legal action against the people who

had picked the winner.

CNRI’s offices were made available to host periodic discussions between a small

circle of IETF members and a pair of attorneys from the law firm Hale and Dorr. The IETF

regulars included Cerf, Bradner, Chiappa, MIT’s Jeff Schiller, and IETF/IESG Chair Phil

Gross. The initial goal, as Bradner recalled, was to “come up with legal constructs that CERT

could use to indicate there is liability for vendors who do not fix security problems.”128

The group met every few months. The lawyers had been asked to investigate wide-

ranging questions about the liability of computer centers and individuals that didn’t update

their software. After failing in their effort to “provide a stick to CERT,” they were also asked

to look for problems in the standards process. Were there vulnerabilities related to

intellectual property rights or anti-trust? This was in early 1990. The disconcerting report

came back that Cerf, Gross, and others in the IETF who were ostensibly identifiable as

“responsible parties” could indeed be held personally liable for things that went wrong in the


Interestingly, though these events happened only ten years ago, various subjects I interviewed had129

very different recollections of the motivations for creating the Internet Society. Bradner said it was about the

insurance and recalls no discussion about funding the Secretariat. Steve Coya, operator of the Secretariat, also

said it had nothing to do with funding the Secretariat, but was done to build the relationship with the ITU. And

Cerf, the founder, stressed that it was about funding the Secretariat. To this day, the Secretariat does not depend

on ISOC for its primary funding. Bradner’s recounting was by far the most detailed. See also Mueller’s history

of ISOC (2002: 94-7).

process. An instrument known as Directors and Officers (D&O) insurance would normally

be used to protect people in their position, but since none of the groups immediately involved

in the Internet standards-making constellation had ever been incorporated, its responsible

parties were ineligible for such coverage.

Despite all the assistance it was providing to the IETF, CNRI could not also provide

the requisite legal standing. It had no organizational involvement with the execution of

decisions in the standards-making process. This was unlikely to change. Kahn wanted CNRI

to stay true to its original mission of developing new ideas. A key role of a standards body,

however, was to anoint ideas with an official status, and he wanted no part of that or the legal

battles such responsibility might bring. Moreover, CNRI’s funding at the time depended on

large corporate sponsors, and there was a strong desire to keep the IETF’s activities relatively

free from fund-raising obligations and possible dependence on a small basket of corporate

vendors.

For the time being, IETF participants could continue doing what they were doing,

but it was clear that some new entity would have to be constructed as the locus of

responsibility for holding the standards. This perceived need for an independently funded,

benevolent legal umbrella prompted the creation of the Internet Society (ISOC). In fact,129

the idea of having an organized membership seemed like a potential solution to a large mix

of concerns – educational outreach, public relations, bureaucratic stability, and a host of other

base-broadening and foundation-fortifying issues.

Even though the liability problems of IETF participants triggered the decision to

move forward with ISOC, the idea of creating a membership organization for the Internet

was something Cerf had been considering for a while. As he recalled,


Personal interview with Vint Cerf, November 15, 2001.130

I set [ISOC] up to help support the IETF because it looked like the government wasn’t going to pay for any of the Secretariat work. This was back around 1990. The government came to me and said, “We’re not going to support this anymore.” And I said, “OK. We’ll find some other way to raise funds to make sure that IETF can continue to function.” By that time it had gotten big enough that you needed full time people... So, my thought was, “Well, gee, there’s more coming on the net, and they’re forming a kind of society of people. Why don’t we call it the Internet Society and people will join and their donations will be sufficient to support the IETF.”130

Cerf’s chief ally in realizing that dream was CSNET founder Larry Landweber,

Professor of Computer Science at the University of Wisconsin, who had devoted much of

his energy in the late 1980s to promoting a series of international networking workshops.

Over time, participation had grown so large that Landweber decided to reorganize the

meetings as the International Networking Conference (INET), a quasi-academic event that

included a mix of technically and socially oriented panels. The inaugural meeting was held

in June 1991 in Copenhagen, in combination with a series of internetworking-focused

workshops sponsored by the United Nations agencies UNESCO and UNDP.

Just before the Copenhagen INET meeting, Landweber convened a number of

important players from CSNET and CREN, along with Cerf and representatives of the

leading European networking association, RARE. Their purpose was to make the final

arrangements for the launch of ISOC. Cerf would be President, Landweber Vice President,

Mike Roberts would be on the Board, and all CSNET’s members would automatically

become members of the new organization. Their vision was inspired by a confluence of

material interests, and a genuine desire to foster the growth of the Internet through outreach

and education. Explicit ties to the engineering community would presumably boost the

reputation of the INET conference, draw readers to its publications, and help generate an

expanding membership around the world. Membership fees, initially set at $70 annually,

would help fund ISOC’s support of the standards process.


Phone interview with Steve Coya, June 28, 2002.131

Cerf announced the formation of ISOC on the opening day of INET ‘91. The Internet

Society, he said, would provide a base for the burgeoning INET conference, assuring that the

INET workshops would be run annually from then on. It would also serve the community by

taking on a leadership role in the Internet standards process. ISOC’s personnel took space

in CNRI’s offices in Virginia. The next INET conference, in Kobe, Japan, was organized

from there. It took another 18 months before ISOC was incorporated, at the close of 1992.

The technologists behind ISOC were so excited about their new club that they half-

expected it would capture the popular imagination the same way the Internet itself had begun

to do. According to well-known Internet lore, when ISOC was finally incorporated Jon Postel

got into a race with Phil Gross to become the first member. Postel won by getting his

paperwork in the fastest.

To regularize the activities of the various standards-making groups and functions that

had arisen in the Internet community over the years, key groups were to be “chartered” by

ISOC. The use of the word “charter” was rather artful, reflecting a body of emails, RFCs, and

announcements that identified the IAB, the IANA, the RFC Editor, and the IESG as agents

within ISOC’s structure.

Ironically, the IETF – the ultimate subject of all this attention – was left out of that

mix. In fact, the IETF had never been chartered or incorporated and that aspect of its status

was considered sacrosanct. Many IETF participants thought of themselves as mavericks, and

most of ISOC’s founders recognized the importance of preserving the style of independence

that had characterized the standards-making community till then. Even the use of the word

“membership” is a stretch when talking about the IETF. The status was pure fiction. There

was no fee to belong, no membership card, no qualification for entry, and nothing to renew.

There was no formal barrier to participating. One could just show up at a working group

meeting in person, or not show up at all and simply make comments online. As Coya pointed

out, “We don’t have... decoder rings or handshakes or anything like that. People can come

and join. It’s a very fluid organization.” 131


The mission statement has evolved over time. The cited version was in use around 1997. See “All132

About the Internet Society,” http://www.isoc.org/isoc/mission/.

Codding and Rutkowski (1983).133

In general terms, ISOC was conceived as a non-profit group to provide support and

advocacy on behalf of the Internet community. Its declared mission was, “To assure the

beneficial, open evolution of the global Internet and its related internetworking technologies

through leadership in standards, issues, and education.” Its Board of Trustees was made132

up of individuals from the information technology industry who shared a strongly outspoken

idealism about the virtues and potentials of Internet-based communication. ISOC’s founders

all had very practical goals, though not always compatible ones. Rifts developed in short order.

e. The Anti-ITU

About the time Cerf, Kahn, Roberts, Landweber and other leaders of the technical

community started talking about a membership organization that could serve their various

needs, Cerf met Tony Rutkowski, an American attorney who was serving at the International

Telecommunications Union (ITU) as Counsellor to the Secretary General. Rutkowski had

a technical background, though not in data communications... he had been a rocket scientist

at NASA. After getting involved in civil rights activism and local government in Florida’s

Space Coast, he began a career in law, ultimately focusing on telecommunications policy.

Along the way, he co-authored a book on the history of ITU, worked for a time at the FCC,

and taught telecommunications policy at New York Law School.133

When Rutkowski wound up working at the ITU, he discovered that its own

telecommunications facilities were terribly outmoded, with an archaic phone system and only

one fax line for the entire operation. There was no email to speak of, so he got a guest

account through the offices of CERN physics laboratory. That happened to be right around

the time Tim Berners Lee was at CERN, developing the base technologies of the World

Wide Web. Those contacts helped expose Rutkowski to the Internet and to IETF discussion

groups. The open and unregulated innovativeness of that community was a welcome contrast


to what he was experiencing in the restrictive and highly regulated European

telecommunications market.

Like many international bureaucracies, the ITU holds lots of conferences. One was

coming up in June 1990 in Fredericksburg, Virginia, and Rutkowski had been selected to

moderate a panel on the “business impact of the merging of global information and

telecommunications technologies.” He asked Cerf to speak about the Internet and the IETF’s

still-undocumented standards-making process.

Cerf, always alert for help in putting out the word on TCP/IP, saw Rutkowski as

someone who could facilitate beneficial international contacts, especially with traditional

standards organizations such as the ITU and the International Organization for Standardization –

ISO (the short form is not a strict acronym; it was chosen to invoke the Greek root isos, meaning “equal”).

While many IETF participants would have been happy to dismiss those groups as

anachronistic bureaucracies, others saw good reason to make alliances. If the ITU and the

ISO could reference Internet standards in their own documents, thus conferring the emblem

of official dignity, participating governments might find it easier to adopt Internet

technologies. But according to ITU and ISO rules, the references could not be made unless

the IETF had a legal identity.

The IETF’s lack of formal existence was once again proving to be a problem. This

provided yet another justification for launching ISOC. Creating a more respectable

imprimatur for the Internet standards process fit into the larger strategy of putting Internet

standards on a par with standards published by well-established national and international

technical bodies. There was no apparent downside, just as long as the IETF could retain its

functional independence and sense of ideological purity.

Around this time Rutkowski made a name for himself in the Internet community by

providing some much needed help to Carl Malamud, who, at the time, was a well-regarded

technical writer. Malamud had embarked on a quest to make the ITU’s standards available

without charge as downloadable files (this preceded the web-browser era). He later did the

same with other reluctant public agencies, including the US Securities and Exchange


See Carl Malamud, “The Importance of Being EDGAR,” http://mappa.mundi.net/cartography/EDGAR/.134

Malamud (1992). Online at http://museum.media.org/eti/.135

Commission, and the US Patent Office. Malamud recounted his dealings with the ITU’s134

bureaucracy in a memoir, Exploring the Internet: A Technical Travelogue (1992). To135

prepare, he raised private funds in the US to pay for scanning and digital conversion, and for

writing the book as well. Rutkowski opened doors for him in Geneva, persuading the ITU’s

Secretary General, Pekka Tarjanne, to allow the project to proceed as a six month

“experiment.”

By dint of massive effort, Malamud succeeded in digitizing nearly all of the

documents. He then published them on the Internet by loading them on his home server,

named Bruno. That choice of name was evidently intended to honor Giordano Bruno, the

16th-century philosopher who was burned at the stake for apostasies such as revealing

various secrets of the Dominican Order, and, most famously, for challenging Catholic dogma

which held that the sun, the planets, and the stars orbited the earth. This modern Bruno was

martyred as well, if only figuratively, when the ITU abruptly ended the project and directed

Malamud to remove the files from his server, an outcome that reinforced the Internet

community’s disdain for the ITU.

In any case, by helping Malamud in his ill-fated campaign to open up ITU standards,

Rutkowski had proven his bona fides as a kindred spirit of Internet culture. Cerf invited

Rutkowski onto ISOC’s Board, where he became Vice President for Publications, editing a

rather homely-looking quarterly, Internet Society News. (In 1995 it was replaced by a much

slicker monthly, On the Internet, under a different editor.)

Almost as soon as planning for ISOC began, however, differences in approach

became evident. Rutkowski pressed for documentation on the organization of the standards

process and IANA’s operational control of the Internet’s name and number space. Nailing

down the provenance of rights and authorities was a reasonable concern of any attorney,

especially one of Rutkowski’s caliber and background. But it turned out the links were often

fuzzy, since many of those procedures had been established on an ad hoc basis, as a need arose.


Recounted in a July 6, 1999 email from Rutkowski.136

This line of inquiry would eventually become a ferocious challenge to Postel’s role in

Internet governance.

Other problems surfaced quickly, starting with disagreements over the name of the

new group. Rutkowski thought the term “Society” would resonate too strongly with

communitarianism in certain languages. He envisioned the new organization as a force for

the commercialization of the Internet. Seeking a title that would underscore that goal, he

lobbied for the name “Internet Industry Association.” Rutkowski also pushed for ISOC to be

incorporated as a 501(c)(6) type of organization, a status appropriate for trade associations.

Cerf and others wanted to incorporate ISOC under 501(c)(3) rules, emulating professional

associations such as the Association for Computing Machinery (ACM) and the Institute of

Electrical and Electronic Engineers (IEEE). Despite losing this battle as well, Rutkowski136

persisted.

After a couple of years Rutkowski gave up his Board seat to become ISOC’s full time

Executive Director. He used that platform in an attempt to turn ISOC’s Advisory Committee

– a group of corporate sponsors who had made large donations to ISOC – into the core of the

organization. Many of the board members and IETF members were put off by this, since it

seemed Rutkowski was neglecting ISOC’s initial non-commercial, membership-oriented

focus. He also pushed for ISOC’s involvement in public policy questions, a consequence of

which was to boost his own visibility in the media. Rising tensions continued to interfere

with relations between Rutkowski and some board members, particularly Landweber.

Differences in personal style didn’t help, but the heart of the conflict between

Rutkowski and the others was ideological. Should ISOC portray itself as acting on behalf of

the public good by performing some form of stewardship for the Internet? If so, what was the

best way to go about it? Rutkowski denigrated this as the “governance” issue. He made little

distinction between the notions of governance and government that others used to finesse the

similarities between stewardship and ownership, coordination and control. ISOC, he felt,

should have nothing to do with either. “On a philosophical level,” he later wrote, “I was


Ibid.137

Rutkowski, “U.S. Department of State Jurisdiction over Unregulated Services,” June 1992.138

http://www.wia.org/pub/1992_pleading.htm.

Rutkowski wrote, “With former ITU Secretary-General Dick Butler, I wrote the provision and139

participated in the conference to gain its acceptance. There was, incidentally, enormous opposition from

delegations who argued that non-public networks like the Internet should not be permitted to exist. It almost

resulted in an aborted conference; but a marvelous Australian diplomat and conference chair, the late Dr. Peter

Wilenski, who subsequently became Australia's UN ambassador, saved the day.” “[ifwp] Re: Re: NEWS

RELEASE: AN ACCOUNTABLE ICANN?” IFWP, November 20, 1998.

Brian R. Woodrow, “Tilting Toward a Trade Regime, the ITU, and the Uruguay Round Services140

Negotiations,” Telecommunications Policy, 15.4, August 1991. See also Edward A. Comor, “Governance and

the Nation-State in a Knowledge-Based Political Economy,” in Martin Hewson and Timothy J. Sinclair, eds.

(1999). Approaches to Global Governance Theory. New York: State University of New York Press.

always the odd man out.” The industrialized world was then firmly in the throes of the137

Reagan-Thatcher era’s legacy of aggressive privatization and deregulation. Rutkowski sought

to champion those goals in the telecommunication arena, defending what he called the

principle of “unfettered opportunities in the provisioning of advanced services.” Call it138

governance or government, fetters were fetters.

Steeped as he was in international telecommunications law, Rutkowski also

understood that a key factor in the rapid growth in the Internet was that it had an exceptional

status – a non-status, perhaps – which kept its population remarkably free from the

interference of international regulators. In fact, Rutkowski had a hand in inscribing that

status. He helped write Article 9 of the International Telecommunication Regulations in

1988, which was directly pertinent to the issue.139

In legal terms, the Internet was not a public network, but an arrangement of distinct

private networks – ISPs, universities, hospitals, government agencies, businesses, and so on.

The United States government, with the assistance of various transnational corporations, had

won this exemption during the ITU’s World Administrative Telegraph and Telephone Conference in

1988, one of several major concessions wrought from the European telephone monopolies

during that period. This forbearance policy was further enshrined in US law under the140

Federal Communications Commission’s Computer III doctrine. To change course and begin


James A. Lewis, “Perils and Prospects for Internet Self-Regulation,” Conference Paper, INET 2002,141

http://inet2002.org/CD-ROM/lu65rw2n/papers/g02-d.pdf. Some credit for this decision may go to Stephen

Lukasik, who was Deputy Director of ARPA when ARPANET was launched and the Internet was emerging. He

also served as Chief Scientist for the FCC between 1974 and 1978. And also see Kevin Werbach, “Digital

Tornado: The Internet and Telecommunications Policy,” OPP Working Paper Series 29, Federal

Communications Commission, http://www.fcc.gov/Bureaus/OPP/working_papers/oppwp29.txt.

Anthony M. Rutkowski, “Considerations relating to codes of conduct and good practice for142

displaying material on Internet,” 1997, http://www.wia.org/pub/UNHCHR-paper.html.

treating the Internet as a public network would have subjected service providers to common

carrier laws – the bane of many free market ideologues.

If any agency was a strong candidate to extend its jurisdiction over Internet facilities

based in the US, it was the FCC. But no one ever authorized that step. The idea had been

discussed within the agency much earlier, but the “epochal decision” was made not to

proceed. Consequently, the Internet regime in the United States and globally could be built141

on a foundation of private contract law rather than inter-state agreements. It was the

libertarian dream come true.

In the new deregulated context – especially for those who were skeptical about the

ability of governments to intervene successfully on behalf of the “public interest” – it seemed

more appropriate to describe the Internet using the vocabulary of New-Age-speak: It was a

metaphysical synergy, an emergent phenomenon, the innovative product of chaos, a

centerless sum of privately contracted interconnections and voluntary peering arrangements.

Chaos was the fashionable metaphor of the day for many phenomena, the subject of many

books, and was a particularly favored mantra among proponents of the Internet. Rutkowski,

as au courant as any, even owned the domain name chaos.com. He played out the metaphor

in an article he wrote for the UN Commission on Human Rights, which included illustrations

that depicted the workings of the Internet as the unfolding of a fractal pattern.142

In short, Rutkowski saw danger in defining the Internet the “wrong” way. His

assessment of the law was probably correct. If the Internet were deemed a public network,

it could be brought under the purview of international conventions granting jurisdiction to


Rutkowski best articulated this position in a series of statements and articles in 1997. See,143

“Statement of Anthony Rutkowski Before the Department of State In Re: June 1997 Meeting of the ITU

Council,” http://www.wia.org/pub/itu-council.html; “ITU Models and the Quest for the Internet,”

http://www.wia.org/pub/itu-mou.html; “Factors Shaping Internet Self-Governance,”

http://www.wia.org/pub/limits.html. See also his June 1992 pleading “U.S. Department of State Jurisdiction

over Unregulated Services” in which he warns against DOS reference to the term “recognized public” in its

effort to organize an advanced Message Handling System (MHS) addressing registration process,

http://www.wia.org/pub/1992_pleading.htm.

See Rutkowski, “Proposal for establishing the IETF as an independent organization,”144

http://www.watersprings.org/pub/id/draft-rutkowski-ietf-poised95-orgs-00.txt.

the ITU’s global regulatory regime. Governments would be in charge. This would be like143

turning back the clock on the advance of technically-empowered freedom. Few observers

took issue with Rutkowski’s concern that such an outcome would be undesirable, but the

Internet technologists who had been living with that ambiguity for well over a decade

weren’t very worried about it. They thought they were carving out a distinctly new and rather

resilient world, perhaps a new jurisdiction unto itself.

Rutkowski, on the other hand, seemed obsessed by fear of ITU involvement, often

portraying his former employer as a predatory institution. Now that its leaders were aware

of the importance of the Internet and the threat it presented to the ITU’s tradition of

avaricious centralization, they would inevitably try to grab control of it. Rutkowski was

outspoken, injecting his concerns wherever the opportunity presented itself. One of his

proposals for IETF reform provided a background supplement couched as a history lesson.

It told a story of how large, slow-moving (and therefore bad) organizations traditionally seek

to identify and capture innovative, fast-moving (and inherently good) organizations. The144

implications were clear. The ITU was a dangerous and ill-willed dinosaur, while the IETF

– though a fleet and virile young creature – was open prey.

So Rutkowski believed it was important not to use the accursed word “public” when

speaking of the Internet. Yet, despite the legal niceties, the technologists on ISOC’s Board

continued to think of themselves as managing the Internet’s resources on behalf of the public

good. Postel especially felt this way, and he enjoyed saying so.

With the creation of ISOC, the ARPANET veterans were reveling in their achievement.

The Internet was “taking off.” It was expanding its domain wherever it had already taken


Vint Cerf, “Editorial,” Internet Society News, 3 Winter 1992.145

Christian Huitema, “RFC 1601: IAB Charter,” March 1994.146

root, and was simultaneously penetrating virgin territory across the globe. Each issue of the

Internet Society News included a table prepared by Landweber which listed every country in

the world and the current extent of connectivity there, like a progress chart on the Internet’s

manifest destiny.

The old timers were making new friends and allies everywhere. The community had

grown so large, it felt like a public. Now the Internet Society was in place to become that

public’s voice. The people who shared the mission of building the Internet had finally

developed an official identity, and were now aggressively proselytizing for converts. Cerf

was particularly enthusiastic about the prospects of what the future could bring.

The formation of the Internet Society is, in some sense, simply a formal recognition that an Internet community already exists. The users of the Internet and its technology share a common experience on an international scale. The common thread transcends national boundaries and, perhaps, presages a time when common interests bind groups of people as strongly as geo-political commonality does today. If you wonder what the 21st century may be like, ask a member of the Internet community who already lives there!145

f. Palace Revolt

One of the first bureaucratic changes in the standards making process that Cerf

initiated upon founding the Internet Society was to put the IAB under ISOC’s hierarchy. The

IAB would now act as a “source of advice and guidance” to ISOC’s Board of Trustees and

simultaneously serve as ISOC’s “liaison” to other organizations concerned with the world-wide Internet.146 It was renamed at that time, replacing the word “Activities” with the more

august-sounding “Architecture.” The deferential self-image of the geeky grad students from California had come a long way since the Request For Comments series was given its name.

staff handled most of the paperwork for the new Internet Architecture Board. There wasn’t

much.


147. Huitema (2000: 218).

148. Lyman Chapin, “IAB Proposals for CIDR and IPv7,” July 1, 1992. Cited in Daniel Karrenberg, “IAB’s Proposals in response to the work of the ROAD group,” July 2, 1992, http://ripe.net/ripe/maillists/archive/ripe-list/1992/msg00001.html.

149. The IETF’s framework for the expansion of the IP address space was called IPNG (IP Next Generation).

Cerf stepped down as IAB Chair in early 1992, but kept a seat on the committee. His

successor, Lyman Chapin, Chief Scientist at BBN, got off to a rocky start. The new IAB

made a move which jeopardized nearly everything Cerf had been working for.

One of the most pressing technical issues of the day concerned the classification and

allocation of IP addresses. The addresses were being lumped together and distributed in such

large chunks that a huge portion of the numbering space was being wasted, with no good way

to recover it. If nothing was done to allocate those address blocks more efficiently or expand

the address space altogether, it was clear that the numbers would run out rather soon. To

head off the problem, an IETF Working Group on Routing and Addressing (ROAD)

painstakingly developed a stopgap measure. It also recommended a new framework for

moving forward. The ROAD recommendations were duly passed on to the IESG and then

the IAB.

Tensions mounted on the heels of an official prediction that, if nothing were done,

all the remaining Class B address space (described earlier, at page 79) would be exhausted by March 1994.147 Concerned about what Chapin called “a clear and present danger to the future

successful growth of the worldwide Internet,” IAB members settled on their own stopgap measure. This occurred at their meeting in Kobe, right after the June 1992 INET conference.148 Their move was a major departure from normal procedure in that they

peremptorily dismissed the ROAD proposal in favor of their own out-of-the-blue revision to the Internet Protocol, called IPv7.149 Their action was not only a slap in the face to a high-profile IETF Working Group, it would have launched a transition away from the core

features of the Internet Protocol toward an alternative standard called the Connection-Less


150. Dave Crocker later commented, “My own sense at the time was that OSI was effectively already dead and that the IAB decision was the final attempt to salvage something from it. (Had there been a real, public process considering alternatives, it is not clear that CLNP should have lost. The preemptive nature of the IAB decision overwhelmed that technical discussion with an IETF identity crisis.)” Dave Crocker, “One Man’s View of Internet History,” IH August 4, 2002.

151. Phone interview with Robert Kahn, July 22, 2002.

152. The phrase was attributed to Ed Krol. See Andrew L. Russell, “‘Rough Consensus and Running Code’ and the Internet-OSI Standards War,” IEEE Annals of the History of Computing, 28.3, Jul-Sept 2006, 48-61. http://www.computer.org/portal/cms_docs_annals/annals/content/promo2.pdf.

Network Protocol (CLNP). The clear implication was an endorsement of TCP/IP’s main

competition, an ISO standard called Open Systems Interconnection (OSI).

The two protocols relied on quite dissimilar models of how to organize the different

layers of services needed to operate a network of networks. The rivalry among their

proponents was highly partisan. A typical comment going around the IETF during those

years was, “OSI is a beautiful dream, and TCP/IP is living it.” Opposition to OSI within the IETF was so deeply ingrained that the IAB’s maneuver created a shock. Three

leaders in the IESG – Phil Gross (also co-chair of the ROAD group), Scott Bradner, and

Alison Mankin – immediately rejected the move on behalf of the IETF. Since the IAB had

been reconstituted under ISOC by this time, the IESG’s move effectively rejected ISOC’s

authority as well. Many others, notably Steve Crocker and his brother Dave, joined the call

for reform. The fray became a groundswell.150

“The IAB was acting like it ran everything,” recalled Robert Kahn. Its members had

become “pontifical,” unwilling to accept that the dynamism of the standards process had

moved to the working groups of the IETF, now seven hundred members strong. And why

not? “What does royalty get when you embark on democracy?”151

The NWG had been established as a venue that would facilitate participation among

those who cared about the development of internetworking. The IETF had carried on with

that “grassroots” character. But the IAB’s intervention, many believed, had violated the

community’s deeply-entrenched norms of openness, unmasking the board as an out-of-touch

“council of elders.”152


153. For more about the creation of a “warm, fuzzy feeling” for newcomers and old timers alike, see Gary Malkin, “RFC 1391: The Tao of the IETF,” January 1993.

154. Anonymous, “The 2nd Boston Tea Party,” Matrix, July 2000, 10.7. http://www2.aus.us.mids.org/mn/1007/tea.html.

Relevant RFCs characterized the ensuing discussions as “spirited.” Explosive might

be a better word. In fact, the IAB’s reputation was badly tarnished for years thereafter, and

its power undercut permanently. The dispute erupted in full public view at the IETF’s July

1992 meeting in Boston, an event that has been sanctified within the Internet community as

the “Second Boston Tea Party,” with the IAB forced to play the role of the tea.

Dave Clark’s declaration, “We believe in rough consensus and running code,” was

made at a plenary session toward the close of that Boston meeting. His sloganeering was a

rhetorical tactic meant to heal the wounds in the IETF. Clark was not then a member of IAB

so he was not tainted by the scandal. Yet was senior enough to serve as a voice for

reconciliation. Everyone present recognized that Clark’s motto could be interpreted as an

elliptical putdown of the secretive, slow-moving, bureaucratic manner that characterized

traditional technical standards organizations like the ITU. By setting the IETF apart, and by

glorifying its open and iconoclastic culture, he hoped to fire up the membership’s esprit de

corps, rekindling the passion of earlier days.

It worked. The invocation of a clear and resonant ideology helped socialize the still-

flocking newcomers into the ways of the IETF, and, over time, would help the veterans

overcome their resentments and suspicions.153 Marshall Rose later produced a T-shirt with the “running code” slogan on the back. The text on the front was simpler and even

prouder – “Internet Staff.”

Cerf helped defuse the tension at the plenary with a more humorous approach. His

speech included a refrain about the IAB having “nothing to hide.” Cerf generally dresses

quite elegantly, and often wears three-piece suits. As he spoke, he took off his clothes, layer

by layer, stripping down to a T-shirt that carried the logo, “IP on Everything.”154

The steps taken to rebuild trust in the organizational hierarchy of the standards

making machinery went far beyond lip service. The IAB retracted IPv7. The enduring


155. Huitema (2002: 201-18); his Chapter 9 is titled “CIDR and the Routing Explosion.” For a straightforward explanation online see also Pacific Bell, “Classless Inter-Domain Routing (CIDR) Overview,” http://public.pacbell.net/-dedicated/cidr.html.

156. Personal interview with Scott Bradner, July 11, 2002.

157. Lyman Chapin, “RFC 1358: Charter of the Internet Architecture Board (IAB),” August 1992. Subsequent revisions include RFC 1601 by Christian Huitema in March 1994, and RFC 2850 by Brian Carpenter in May 2000.

158. See Steve Crocker, “RFC 1396: The Process for Organization of Internet Standards Working Group (POISED),” January 1993, and “POISED Charter,” http://www.ietf.cnri.reston.va.us/proceedings/94mar/charters/poised-charter.html.

solution for the IPv4 scarcity problem – called Classless Inter-Domain Routing (CIDR) – came into effect within a year. The system permits a much finer level of subdivision within the 32-bit IP address string, and is still in use at the time of this writing.155
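The arithmetic behind that shift can be sketched with Python’s standard ipaddress module (an illustrative aside, not part of the original text; the 172.16.0.0 block is an arbitrary example): under the old classful scheme, allocations came only in /8, /16, or /24 sizes, while CIDR lets the network prefix fall on any bit boundary of the 32-bit address.

```python
import ipaddress

# Classful allocation: a site needing ~4,000 hosts had to take a whole
# Class B (/16), tying up 65,536 addresses with no way to recover the rest.
class_b = ipaddress.ip_network("172.16.0.0/16")
print(class_b.num_addresses)   # 65536

# CIDR allows a prefix of any length, so the same site can receive a
# right-sized /20 block instead.
cidr_block = ipaddress.ip_network("172.16.0.0/20")
print(cidr_block.num_addresses)   # 4096

# The old /16 can instead be carved into sixteen /20s for sixteen sites.
subnets = list(class_b.subnets(new_prefix=20))
print(len(subnets))   # 16
```

The finer subdivision is what recovered the address space the ROAD group was worried about: blocks could finally be sized to actual need rather than rounded up to the nearest class boundary.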

The IAB simultaneously undertook far-reaching organizational reforms designed to ensure it was “no longer in the path of standards creation.”156 The size of the group was expanded from a dozen to fifteen, and its charter was published as an RFC.157 Except for

Postel, all the members from the pre-Boston period were replaced within two years. Most

significantly, the IAB was removed from the process of approving standards developed by

IETF working groups. The reformed IAB put much greater emphasis on liaison work with

the IETF, an effort led by Bradner, who became an IAB member and an ex-officio member

of ISOC’s board. A new IETF working group chaired by Steve Crocker – Process for

Organization of Internet Standards (POISED) – was created to formulate policies for

management issues, especially the development of procedures for nominating individuals

who would serve on the IAB, IESG, and as Chair of the IETF.158 The group was later

reconstituted under the name POISSON, the French word for fish, because IETF political

management was considered just as slippery.

Despite the willingness of the IAB to make such far-reaching concessions, there was

some lingering resentment. In 1994 IAB member Brian Carpenter conducted an anonymous

survey of IAB and IESG members to get a sense of “image” and “(mis)perceptions” of the

IETF/IESG/IAB/ISOC world. He collected a number of revealing criticisms of the IETF. It

was called a “gang of hippies,” whose supposedly virtuous openness allowed anyone to


159. IAB, “Minutes for IAB Open Meeting at Toronto IETF, July 27, 1994,” http://www.iab.org/IABmins/IABmins.1994-07-27.html.

160. IAB, “Minutes for September 23, 1994,” http://www.iab.org/documents/iabmins/IABmins.1994-09-23.html.

161. Tony Rutkowski, “Internet [Architecture] [Activities] Board: Known History,” November 4, 1998, http://www.wia.org/pub/iab-history.htm.

162. McTaggart (1999), especially Section III, “Governance of the Internet’s Infrastructure,” http://www.internetstudies.org/members/craimct/thesis/section3.htm.

filibuster and shout down speakers they didn’t like. In such cases, the lack of voting and

other formal procedures discouraged participants who were not willing to shout back. The

“‘IETF gestalt’ (e.g. ignoring economics, insisting on datagrams, knee jerk reaction against

OSI)” was seen as indicative of “closed perspectives.” Some thought that the IETF process

failed to produce well-engineered standards, and that the developers of TCP/IP had “cheated”

by bypassing formal international processes.159 The cynics did not get the last word, however.

That went to the unabashed idealists who, later that year, framed a new “vision” statement

for the IAB: “Connect every computer to the Internet with the TCP/IP suite and make the

world a better place.”160

When it was completed, the IAB charter stipulated that the RFC Editor would be

seated as a voting, ex-officio member. Consequently, by 1996 Postel was the only one left

from the pre-Tea Party group.161 He and Cerf emerged from the episode with their reputations

not only unscathed, but enhanced. Postel disliked controversy, and had steered clear of the

mess. Cerf had not been a ringleader of that particular faux pas. Both were still regarded as

important pillars of the Internet’s standards-making process who could be trusted to keep it

organized, intact, and strongly linked to its historical origins. They emerged from the

housecleaning episode with elevated status. Now, with Clark, they wore the halos of senior

statesmen.

The cultural homogeneity of the IETF’s participants was an important factor in

sustaining their cohesion. “The members may have come from universities, computer

manufacturers, and software companies,” one scholar observed, “but as electrical engineers

or computer scientists they shared a very similar mental discipline and mental values.”162


163. John Galvin, “RFC 2282: IAB and IESG Selection, Confirmation, and Recall Process: Operation of the Nominating and Recall Committees,” February 1998.

Though the ability of the IAB to control the standards making process was

circumscribed as a result of the IPv7 episode, the group retained some pro forma but

nevertheless important oversight functions. These might be described as the ability to

perform “sanity checks” over critical administrative decisions. Current IAB duties include

approving the selection of members of the IESG and the Chair of the IETF. Nominations and

selections are generated by an IETF nominating committee which, in order to guarantee

broad participation, is constituted by names chosen at random from individuals who attended

two of the previous three IETF meetings. The IAB has never rejected a recommendation.163

The IAB also serves as an appellate body empowered to rule on standards-related disputes

among IETF members that can not be resolved by the IESG. The IAB is rarely called upon

to perform this duty, and has ratified the IESG’s decision every time so far.

The IAB continued to issue occasional RFCs, usually in the form of informational

pronouncements that weighed in on various issues. The IAB’s most significant remaining

gatekeeping power was vested in its ability to designate the individual or agency who

performs the IANA function and to specify who will serve as RFC Editor. These were

defined as two distinct roles, each delegated for five year terms. The IAB charter also

stipulates that the occupants of those two positions must work together. This requirement

was an inside joke, since the “collaborators” were one person. Though the IAB had the

capacity to replace Postel in his various jobs, such action was virtually inconceivable while

he was alive.

g. “It Seeks Overall Control”

Though ISOC was formed to support the standards-making process, the standards

makers in the IETF didn’t greet the new organization with united enthusiasm. On the heels

of the IAB crisis, the environment was primed for suspicion of rules imposed from on high.

Working group participants heard that the IETF would now be operating “under the

auspices” of ISOC. No one had a clear sense of what that meant. Some reacted with paranoia,


164. Phone interview with Steve Coya, June 28, 2002.

interpreting ISOC’s arrival on the scene as a takeover attempt. The most outspoken critics

satirized ISOC, transmuting its acronym to “It Seeks Overall Control.” Another bit of T-shirt

wisdom – “ISO silent C” – painted ISOC as a conspiratorial alliance with the bureausclerotic

purveyors of OSI.

Steve Coya, head of the IETF’s Secretariat, accepted the lampoons as a natural

response in that sort of a community. “People assumed the worst, and tried to see what jokes

they could make out of it.”

The Internet Society was going, ‘How can I help you?’ And with the IETF being ninety-eight percent libertarian, for any organization that says ‘How can I help you?’ every red flag in the world is going to go up.164

Coya was famous for hyperbole, but even if the IETF was not “ninety-eight percent

libertarian,” one was more likely to meet a supporter of the Libertarian Party at an IETF

meeting than almost anywhere else in the United States. Most IETF participants tended to

be exceptionally concerned about protecting privacy and free speech rights against

government intrusion... a position that was quite compatible with Libertarian Party priorities.

They shared a political and cultural point of view that was fueled by habits of skepticism.

The distrustful mood extended through the mid 1990s. Much of the trouble was due

to lack of information. ISOC’s Board members were poorly prepared, and were slow at

getting things underway. This undermined their ability to present the organization and its

mission as clearly as was needed. The uncertainty fostered confusion and resentment.

Rumors flew. Old timers even wondered if they would have to join the Internet Society to

continue participating in the IETF.

Rutkowski’s behavior didn’t help. He showed up touting himself as having taken on

a prominent leadership role, and wore out his welcome rather quickly. His outsider status

made him a more obvious target for much of the initial fire hurled at ISOC. But, instead of

settling doubts, his appearances at the IETF meetings and his publicly reported comments


165. Phone interview with Larry Landweber, June 13, 2002.

166. Rutkowski, “ITU Mission Report on INET '91, Executive Summary: Version 3.0,” June 28, 1991, http://www.wia.org/ISOC/itu_mission.htm.

167. Personal interview with Vint Cerf, November 15, 2001.

168. Phone interview with Dave Crocker, July 4, 2002.

outside the meetings did more harm than good. Rutkowski came across, according to

Landweber, as someone who was “on a different wavelength.”165

Word came back that Rutkowski was representing ISOC to the rest of the world as

being in charge of the IETF standards process. He had been saying so in all sorts of ways

from the outset. His first report on Cerf’s announcement of the formation of ISOC recounted

designs far more imperious than what Cerf had uttered. According to Rutkowski’s summary,

“All the existing Internet network planning, standards making, research coordination,

education and publications activities will be transferred to the Society.”166 Even as he learned

the impropriety of such embellishment, he went on making other audacious statements that

pushed the limits of tolerance.

The memory of the revolt against the IAB was still fresh, Cerf recalled, “and Tony

inflamed the tension. Whether he did it on purpose or not, he appeared completely oblivious

to the effect that he had. And so it just caused endless trouble.”

He [Rutkowski] painted the Internet Society larger than it was. I think he never understood that, in the course of imbuing the Internet Society with more responsibility than it had – maybe this made him feel better as the Executive Director, I have no way of knowing – but the side effect of that was to piss off everybody in the IETF. It cost me three years, and many others, three years of work to try to get ISOC back into reasonable rapprochement with the IETF.167

Dave Crocker (Steve’s brother) was an IESG Area Director during that period, and

became one of Rutkowski’s most enduring and outspoken adversaries:

ISOC represented new and sensitive stuff that needed to be viewed and dealt with carefully... And if an ISOC [Director] made noises like an Alexander Haig that says, “We are in charge,” that would not help things much.168


169. Paul Mockapetris, “Overview of the Problem,” IETF March 21, 1995, http://www.netsys.com/ietf/1995/0867.html.

170. Phone interview with Larry Landweber, June 13, 2002.

To reassure the IETF members that ISOC would have a benevolent – or at least

benign – effect, ISOC’s Board worked to keep operations as independent as possible,

moving its offices out of CNRI to a separate location. The IETF meetings and secretariat

were still being funded by the government. Care was taken to avoid mixing funds. When

Rutkowski urged transferring the IETF secretariat from CNRI’s offices to ISOC’s, the Board

opposed him. It was a small issue by itself, but occurred in a context that added fuel to the

fire.

Prominent individuals who expressed outspoken dissatisfaction with ISOC included

Paul Mockapetris, Chair of the IETF from 1994 to 1996. In a public comment on the

evolution of the standards process, he characterized ISOC as an organization which “would

like to own the RFC copyrights, the number space, IETF, et al...so that our standards can

become ‘important...’” He supported the idea of developing formal relations between the

IETF and other organizations, insisting that the only way to “do standards for real” would

be to create a “neutral holding company” capable of making legal agreements on the IETF’s

behalf. “This is what ISOC is supposed to be, but is not,” he concluded. “[W]e probably should reconsider why we are spending time to do the ISOC process.”169 The message was explicit enough: if ISOC did not reform, the IETF should find a new holding company.

As personal and professional differences exacerbated the tensions, it became all too

clear that Rutkowski would have to go. From Landweber’s perspective:

If it was the case that Tony and the IETF could not get along, and if it was the case that integration of the IETF into ISOC was critical, then there were only two alternatives: Tony stays as ISOC Executive Director and the IETF not ... come into ISOC, [or] Tony leave and the integration happen.170

While investigating the particular circumstances under which Rutkowski finally left

ISOC, I came to understand that Board members were encumbered by legal

restrictions (perhaps akin to a “no disparagement” clause found in many termination-of-


171. Internet Society, “Minutes of ISOC Board of Trustees Meeting in Honolulu, Hawaii, June 1995,” Resolution 95-10, Appreciation of Executive Director, http://www.isoc.org/isoc/general/trustees/mtg07.shtml.

172. Mueller (2002: 97).

173. Rutkowski’s presentation of “ontologies” and historical events also conveyed an unmistakably critical tone toward Postel and his role in the allocation of Internet resources.

174. See http://www.cpsr.org/cpsr/nii/cyber-rights/Library/Re-Legislation/ILTF-Purpose, also available in 1996 archives of Discovery.org.

175. Internet Society, “Minutes of the 1995 Annual General Meeting of the Board of Trustees, June 26-27, Honolulu, Hawaii,” §6.18, http://www.isoc.org/isoc/general/trustees/mtg07.shtml.

employment agreements) regarding what they could say in public. The public record gives

no hint of this. In June 1995 ISOC’s Board passed a resolution that declared appreciation for

Rutkowski’s work.171 His contract was extended for six months at that time, and he was gone

by the end of the year. Another historian of these events, Milton Mueller, wrote plainly that

Rutkowski was “forced to leave.”172 There is no reason to argue otherwise, but one might

imagine that at some point Rutkowski was as eager to depart as the others were to send him

on his way.

During his last year as ISOC’s Executive Director, Rutkowski associated himself

with two other entities aimed at influencing the Internet policy process. One was the World

Internet Association (WIA), a one-man show that he apparently created as a shell for the kind

of trade association that he had once envisioned for ISOC. No such trade association ever

came about, but the wia.org website archived many useful and interesting documents.173 The

second entity was the Internet Law Task Force (ILTF), the brainchild of Peter Harter, an

attorney who had positioned himself as herald of the so-called “New Economy.” Harter’s

design for the ILTF, dated April 8, 1995, was formulated as a memo to Rutkowski as

Executive Director of ISOC, probably because Rutkowski had used that post to orchestrate

a policy-oriented “Internet Summit” from which much of the ILTF’s initial membership was

drawn.174 In June Rutkowski took steps to integrate the ILTF within ISOC’s structure, but the Board blocked him.175 The ILTF eventually found a home at the Discovery Institute, a

think tank in Seattle that provided a mouthpiece for ultra-libertarian pundits such as Al


176. See IAB Minutes for May 10, 1995 (written June 7, 1995), ftp://ftp.iab.org/in-notes/IAB/IABmins/IABmins.950510.

177. A.M. Rutkowski, “Has the IETF Outlived its Usefulness?” IP July 13, 1996, http://lists.elistx.com/archives/interesting-people/199607/msg00033.html.

178. Ibid.

Gidari and George Gilder. (The Discovery Institute later became the central bastion of the

anti-Darwinian Intelligent Design movement.)

The ILTF was posed as an IETF-like, self-organized effort to fight laws such as the

ill-fated Communications Decency Act, the common enemy of nearly everyone involved in

the Internet those days. Harter’s initial memo to Rutkowski suggested the ILTF could

become the “legal policy norms forum for the Internet.” Early on, Postel and other members

of the IAB expressed hope that the ILTF might provide some practical help with the

trademark problems that were starting to confound operations of the Domain Name

System.176 Nothing came of this. By the end of the year, however, the core members of the ILTF

community began to transform themselves into a new group, the Internet Law and Policy

Forum (ILPF). Still under the aegis of the Discovery Institute, the ILPF was taking on a

distinctively un-IETF-like way of conducting business. Not only were its meetings closed,

but its Working Groups were to be financed by corporate sponsors whose identities were to

be kept confidential, wrote Rutkowski, “in order to ensure the integrity of the final product.”177

By 1996 critics were charging that the ILPF was seeking to usurp the open standards

processes of the IETF. Rutkowski, now departed from ISOC, defended the ILPF’s donor-

centric approach by denigrating the IETF model as outmoded and oversold.

The [standards process] involves significant industry tradeoffs and business decisions. Even in the IETF, the most important ideas and specs usually came from a handful of people at a beer BOF who really knew what they were doing, and the rest was window dressing... In fact, most of the important standards activities are shifting to a wide variety of specialized consortia of various flavors.178


179. ISOC’s application for a Class A liaison status with ISO’s Joint Technical Committee 1/Subcommittee 6 (JTC1/SC6) was approved in January 1995, http://www.ietf.org/IESG/LIAISON/ISO-ISOC-coopagreement.txt. See also Internet Society, “Minutes of Regular Meeting of Board of Trustees, August 16-17, 1993, San Francisco, California, U.S.A.,” http://www.isoc.org/isoc/general/trustees/mtg03.shtml.

h. Standardizing the Standards Process

Window dressing or not, the refinement of the IETF standards process continued. The

need for insurance and the desire for a relationship with the ITU drove the key players to

meet the expected requirements, however rigorous. The appeals process was spelled out and

procedures for the retention of documents were specified more carefully. With Rutkowski

finally out of the mix, ISOC’s founders were free to proceed with their original plans to

confirm ISOC as the holding company for the IETF. Landweber took the lead in repairing

the relationship with ISOC’s critics in the IETF, meeting with them in small groups and

individually, talking things over until they were satisfied. The time and effort he devoted to

that endeavor paid off, allowing ISOC’s legal umbrella to be erected where it could cast the

protective shadow of legal identity over the responsible persons of the standards process.

Once this was done, the details regarding insurance and recognition by other standards

groups were finally on course to resolution.

In fact, thanks in large part to Rutkowski’s efforts, the basic structure for a

relationship between IETF Working Groups and the ISO had been forged even before the

ISOC/IETF issues were completely settled.179 Building formal ties with the ITU proved to

be a greater challenge, however. That process began when representatives of the ITU

contacted Scott Bradner – who was wearing “two hats” as the liaison for standards from both

ISOC and the IETF – expressing a desire to refer to an IETF standard called the Real Time

Transport Protocol (RTP). The citation would be used in the ITU’s version of IP telephony,

known as H.323. The complication was that the ITU’s own rules required that the legal

relationship between the IETF and its holding company be firmly established before the

reference could be enacted.

Bradner later said he experienced no significant problems working with his ITU

counterparts. They all had to endure the long process of creating what he called the “legal

fiction” that sanctified the relationship between ISOC and the IETF. But those problems were


180. The incident was related to me by Scott Bradner, interviewed July 11, 2002. See also Bradner, R. Brett, G. Parson, “RFC 2436: Collaboration between ISOC/IETF and ITU-T,” October 1998.

181. The listed authors are Internet Architecture Board and Internet Engineering Steering Group, “RFC 1602: The Internet Standards Process -- Revision 2,” March 1994.

overshadowed when officers of the US Department of State representation to the ITU, led

by Gary Farino, stepped in to block the ITU’s ability to reference IETF standards. By this

action, the United States government had effectively become an obstacle to the global use

of Internet standards. Farino’s office was overruled when a private sector Advisory Group

(mostly telecom companies) lobbied the State Department to reverse its policy.180

A second revision of the Internet standards process – RFC 1602 – emerged in March

1994. Authored primarily by IAB Chair Christian Huitema and IESG Chair Phil Gross, it was

the first to specify a relationship between the IETF and ISOC:

The Internet Standards process is an activity of the Internet Society that is organized and managed on behalf of the Internet community by the Internet Architecture Board (IAB) and the Internet Engineering Steering Group.[181]

Further refinements were needed before the formal relationship between ISOC and

the IETF could be fully and finally consummated. Bradner presented the third revision of the

standards process, RFC 2026, in October 1996. It specified, among other things, that each

person submitting a contribution to the standards process would “grant an unlimited

perpetual, non-exclusive, royalty-free, world-wide right and license to the ISOC and the IETF

under any copyrights in the contribution.” That RFC was also entered in the Best Current

Practice Series as BCP 9, adding to its gravity. This was a pivotal step in ISOC’s assumption

of responsibility in the standards process. The general understanding of the arrangement was

summarized by the head of the POISED working group, Erik Huizer, in RFC 2031, The

IETF-ISOC Relationship, published the same month. All subsequent RFCs included a

copyright notice from the Internet Society.

Unlike most such notices, ISOC’s copyright claim is not intended to restrict

dissemination of content – in this case Internet standards – but to promote distribution. It


182. See, for example, Scott Bradner, “IETF Rights in Submissions,” February 2003, http://www.ietf.org/internet-drafts/draft-ietf-ipr-submission-rights-01.txt.

asserts ownership in a way that satisfied legal concerns without violating the principle of

keeping the documents free.[182]

ISOC’s formulation with regard to copyright was one aspect of a growing interest in

the development of innovative legal provisions that would provide rights and protections for

the authors of information – usually computer software – that they wished to see circulated

openly in the public domain with minimal risk of appropriation by other parties. Instruments

employed by the “open source” movement went by names like copyleft and the GNU General Public License. Even the use of the word “Internet” was the subject of a legal contest after Concord

EFS, a vendor of financial processing services and automated teller machines, claimed it as

a trademark. The Internet Society fought and won a long battle to keep the word in the public

domain. But these legal victories were small achievements in light of the larger ambitions

shared by the organization’s leaders.

i. Charters

ISOC emerged from a confluence of interests and desires: The leaders of the

standards community needed to ensure their own legal protection, and they also wanted to

act altruistically by promoting the diffusion of Internet technologies they considered to be

beneficial. Those philanthropic desires took on a life of their own, as if the dream of a

worldwide Internet could justify the whole standards management project, propelling it

forward with a coherent, unifying purpose. Idealism fed a visionary wish. The practical

issue of how to manage the standards process transformed into a discourse on Internet

governance. And that discourse fostered, in many minds, the assumption of a broader

mission to serve the global public good. This assumption was given an institutional shape

when Postel’s IANA was established as its majestic figurehead. The manufacture of an

expressly global organization was underway.

As we have seen, in constructing the “legal fiction” that allowed ISOC to function

as the holding company for the Internet standards process, certain forms of language had to


183. Lyman Chapin, “RFC 1358: Charter of the Internet Architecture Board (IAB),” August 1992.

184. Christian Huitema, “RFC 1601: Charter of the Internet Architecture Board (IAB),” March 1994.

185. Brian Carpenter, “RFC 2850: Charter of the Internet Architecture Board (IAB),” May 2000.

186. Personal interview with Vint Cerf, November 15, 2001.

187. Jon Postel, “IANA and RFC Editor ‘Charters.’” Archived at http://www.wia.org/pub/postel-iana-draft1.htm.

be inserted into the RFCs that defined the standards track. The relationships between the

entities had to be clarified and reinforced as well.

An array of supporting RFCs described the organizations in the process, selection of

leadership, variances in procedure, and so on. The IAB also published a charter – updated

over the years – that declared the organizational ties between itself and ISOC. The first

charter in 1992 described the IAB as responsible for “expert and experienced oversight of

the architecture” of the Internet.[183] This was revised in 1994 to assert responsibility for “long-range planning,” and making sure that important “long-term issues [are] brought to the

attention of the group(s) that are in a position to address them.”[184] A third charter in 2000

retained that formulation, but added a great deal of specificity regarding the IAB’s relations

with ISOC, the IESG, the IETF, and outside organizations.[185] No one ever promulgated a

charter for the IETF. It wasn’t needed. The “responsible persons” of the standards process

were already accounted for, leaving IETF participants free to feel unencumbered by

superfluous obligations.

As Cerf recalled, after planning was underway to arrange the various standards bodies

under ISOC’s auspices, Postel decided he wanted to “hang the IANA off the IAB.”[186]

Anticipating publication of an updated Standards Process document, he emailed the ISOC

list two draft charters – one for IANA and the other for RFC Editor. There was a brief

introduction:

In the current drafty version of the Internet Standards Process document (1602-bis) there is some reference to the IANA and the RFC Editor. There is a need to document these positions and how they get filled.[187]


188. R. Hovey, S. Bradner, “RFC 2028: The Organizations Involved in the IETF Standards Process,” October 1996.

The substance of each “charter” was a couple of short paragraphs, little more than a

succinct description of the work he and Reynolds performed under each function. The draft

for the RFC Editor began by stating the position had been “chartered by the Internet Society.”

The draft for IANA implied more: “The Internet Assigned Numbers Authority (IANA) is

chartered by the Internet Society to act as the clearinghouse to assign and coordinate the use

of numerous Internet protocol parameters.” In the end, however, the final version of RFC

1602 made no mention of IANA’s charter. No such charter was ever issued as an RFC or

published as a distinct document. Interestingly, when the next update of the Assigned

Numbers series, RFC 1700, was published in November 1994, the Federal Networking

Council (FNC) was named as one of IANA’s chartering bodies. Still, no charter was

referenced.

In October 1996, about the time the third revision of the standards process was

published, Bradner and DEC’s Richard Hovey published RFC 2028, describing IANA’s

position within the Internet standards process. It was “responsible for assigning the values”

of the Internet’s protocol parameters. It was also the “top of the pyramid” for DNS and

Internet Address assignment, “establishing policies for these functions.” The citations in the

RFC listed the IANA and RFC Editor charters as “Work in Progress.”[188]

The domain name ietf.org was registered March 11, 1995. This was surprisingly late

in the game, demonstrating perhaps that IETF’s standards-making activity was indeed as

decentralized as advertised. Working group mailing lists were supported by vendors,

universities, or other agencies, and still are. The domain names iana.org and iab.org were

registered June 9, 1995, allowing the launch of associated websites. The primary utility of

IANA’s site was to provide access to RFCs and numerous drafts emerging from the standards

making community. There was a motto at the top of the main entry page, “Dedicated to

preserving the central coordinating functions of the global Internet for the public good.”

Postel’s invocation of the public good must have driven Rutkowski up a wall. IANA’s home

page also repeated language that appeared in RFC 1700:


189. Rony and Rony (1998: 122-3).

190. See Elise Gerich, “RFC 1466: Guidelines for Management of IP Address Space,” May 1993, and IANA, “Internet Protocol v4 Address Space,” updated online at http://www.iana.org/assignments/ipv4-address-space.

The IANA is chartered by the Internet Society (ISOC) and the Federal Network Council (FNC) to act as the clearinghouse to assign and coordinate the use of numerous Internet protocol parameters.

The Ronys’ history of these events portrayed IANA’s phantom charter as “Authority

by Announcement,” underscoring the point by citing Rutkowski’s ridicule of it as one of

those “things that get propagated in the Internet realm that have no nexus to the real

world.”[189]

Neither was the IANA a clearinghouse. That is an elegantly neutral term, but

ultimately misleading. The IANA was not simply providing a space in which buyers and

sellers moved products; it (Postel) was performing the task of distributing resources. He also

kept some resources such as IP blocks in reserve.[190] Even though the time he devoted to the

assignment task was funded by the Department of Defense, there was no text clarifying this

was so. The naive visitor to IANA’s website might have easily inferred that a science-

oriented government agency and a non-profit international organization had created IANA

to serve the interests of the Internet community at large. Postel was creating the impression

he wished to be true.

Regardless of the “real world” existence of the charter as a text with instrumental

force, the presumed chartering agencies allowed the claim to stand. The FNC did nothing to

disavow the relationship, so the imprimatur of the US government’s endorsement – official

or not – bolstered the perception of IANA’s significance and authority. The members of

ISOC’s board, of course, would never have conceived of disavowing IANA. Anything that

added to the perception that IANA was the linchpin between ISOC and the US government

only entrenched ISOC’s claim to status atop the Great Chain of Being in the Internet

standards process.


191. Froomkin (2000).

Despite IANA’s rising status, there was a problem. It turned out that the shadow cast

over the standards process by ISOC’s umbrella would not fully cover Postel. As a member

of the IAB, he was presumably secure in the case of litigation against individuals atop the

IETF hierarchy. However, the act of assigning identifiers – particularly IP addresses and

domain names – was clearly distinct from the act of writing and approving standards. The

IANA work seemed to be in a legal limbo. Though he was an employee of USC performing

functions funded by a US government contract, the USC administrators were not at all eager

to take on the risk of serving as his legal backstop. The IANA function was being directed

by the Internet community (or ISOC, or DARPA, or the FNC, or the Czar himself) after all,

and not USC.

* * *

The U.S. Government had granted tremendous latitude to the technical community

through the years, especially to Postel. Authority over conventional resources – namely, the

power to define fields in globally shared data spaces – was widely considered to reside in his

hands. RFC 1174 exposed a growing ambiguity regarding the nature of his authority. Perhaps

he was an agent of the U.S. Government, perhaps not.

Ownership of the top level domain name space ultimately became a contested issue,

spurring vitriolic debates that preoccupied the IETF community for years thereafter. The

question of authority had to be settled, but this was not easily done. Statutory relationships

seemed to favor the US Government, but its ownership of the resource was not set in stone,

written plainly and prominently for everyone to see. The lines of authority were

“muddled.”[191] Evidence pointing one way or the other would have to be dug up or invented.

Until then, no US officials had ever made any concerted effort to demonstrate the

primacy of governmental authority over the Internet’s name and number space. But in

September 1995, without even attempting to enlist Postel’s consent, a few members of the

U.S. government imposed a decisive change over the policies governing that space,


polarizing the Internet community, oiling the skids of the Internet Boom, and changing the

world forever.

* * *

The next chapter will retrace the history of the Internet once again, going back to its

beginnings in the late 1960s, and marching up the decades. But this next telling will focus

more specifically on the technology of the domain name system, and the developments that

triggered the pivotal events of the late 1990s.


Chronology 3 Early Steps in IAB, IETF, and ISOC Governance

Date Event

October 1998 ISOC establishes formal collaboration with ITU via RFC 2436.

June 1995 Final ISOC contract extension for Rutkowski.

During 1995 Peak of US government subsidy of IETF.

April 6, 1995 Plans initiated for Internet Law Task Force (ILTF).

March 1995 Online debates over Ownership of the IP and DNS Spaces.

January 1995 ISOC establishes liaison status with ISO.

November 1994 IANA described as “chartered” by FNC in RFC 1700.

During 1994 IEPG rechartered.

March 1994 Formal relations between ISOC and IETF in RFC 1602.

January 1993 POISED Charter for IETF oversight in RFC 1396.

August 1992 First IAB charter in RFC 1358.

March 1992 Internet standards process described in RFC 1310.

Early 1992 Lyman Chapin becomes Chair of IAB.

During 1991 Malamud’s Bruno experiment, publishing ITU standards at no cost to readers.

During 1990 Internet Engineering Planning Group (IEPG) created.

August 1990 IANA’s authority described in RFC 1174.

July 25, 1989 First meeting of IETF as the overarching home of Internet standards-making.

During 1988 Cerf becomes Chair of IAB.

November 1987 CCIRN Created.

October 1987 IETF meetings opened to NSF members.

During 1986 Cerf, Kahn, and Uncapher create Corporation for National Research Initiatives (CNRI).

January 17, 1986 First meeting of Internet Engineering (INENG) ancestor of IETF.

During 1983 Internet Activities Board Replaces ICCB.

During 1982 Cerf begins work with MCI.

During 1972 Steve Crocker ensures publication of IMP source code by BBN.

During 1969 Start of Network Working Group (NWG) and Request for Comments Series (RFC).


192. See Ross William Rader, “Alphabet Soup: The History of the DNS,” April 2001, http://www.whmag.com/content/0601/dns/print.asp; Mueller (2002: 75-82); Rony and Rony (1998); Paré (2003).


The Internet was never free. It’s just that the users were divorced from the people who were paying the bills.

Don Mitchell

5. LORD OF HOSTS

a. What it All Means

As with any conflict, to fully appreciate the meaning of the DNS War, one must

explore its causes. Most other students of this topic start out quite sensibly, laying out the technical and administrative history of the domain name system.[192] I have taken a relatively

circuitous route to this point, generally avoiding the problem of domain names in order to

focus attention on certain personalities, specific deeds, key documents, and the institutional

developments that supported governance of the Internet’s standards-making process.

Nevertheless, here we are. The story of the central crisis is ready to unfold.

This version will put some of the details already covered by others into sharper relief,

and perhaps add an insight or two. But there will be no upheaval of the conventional wisdom,

which can be put this way:

The remarkably productive, unselfish, and even happy collaboration that characterized the

Internet’s research and engineering communities in the 1980s and early 1990s began to sour

by the middle of the decade. The formerly happy state of affairs was displaced by predatory

drives such as monopolistic profiteering, bureaucratic turf-building, and political

grandstanding. The freewheeling but insular informality of the early Internet could not endure

mobs of avaricious newcomers. The Boom brought the Fall. The commons opened by

TCP/IP would have to be regulated or else be overrun. Expropriation would be sanctified and

altruism extinguished through the cynical modernity of enforceable rights and duties.

As a result, I will argue, Jon Postel faced a dilemma. His own relevance was rooted

in a culture at risk of losing its relevance. He could take a stand to hold the line, or take the

lead in charting a new course forward. He attempted both, and in so doing achieved neither.


b. Dawn of the DNS

In 1969, at the dawn of internetworking, there was no common method of identifying

resources, and neither was there a common syntax for moving messages across networks.

Everything had to be created from scratch. The designers of the ARPANET and the Internet

hoped to create a universal addressing system that would transparently route traffic between

source and destination. Whether traffic moved on subnets or through gateways, the point was

to do it automatically and behind the scenes. That preference dictated that each entity

connected to the system would be assigned a unique numeric identifier. Confusion could be

avoided by incurring the apparently small administrative cost of maintaining and distributing

a comprehensive index.

It didn’t have to be that way. A different approach was exploited by the Unix to Unix

Copy Protocol (UUCP), a popular internetworking system that emerged in the 1970s, not

long after the ARPANET was underway. In principle, UUCP showed that names would not

have to be unique across the entire system, but only among one’s immediate neighbors. Each

person sending an email would have to specify a “bang path” designating the sequence of

hosts responsible for carrying the message from neighbor to neighbor. A host wouldn’t need

to know everyone’s name . . . just those in the immediate vicinity.

The UUCP bang path was literally the route by which email messages were to hop

from source to destination. The sign for the bang was an exclamation point, so a bang path

might look like “!bigsite!foovax!barbox!me”. This approach shifted the overhead of routing

mail from a central administrator to end users at the nodes. It worked, but the Internet

eventually proved superior for so many reasons that bang paths went the way of buggy whips.
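The mechanics of a bang path are simple enough to sketch in a few lines. The following Python fragment is an illustration only – the function name and return shape are my own reconstruction of the addressing convention, not period UUCP software:

```python
def parse_bang_path(path):
    """Split a UUCP bang path into its relay hosts and the final user.

    A path such as "!bigsite!foovax!barbox!me" names each neighbor-to-neighbor
    hop in order; only the last component is the recipient's login name.
    """
    parts = [p for p in path.split("!") if p]  # drop the empty leading segment
    if len(parts) < 2:
        raise ValueError("a bang path needs at least one host and a user")
    return parts[:-1], parts[-1]  # (relay hosts in hop order, user)

# Each host in `hops` only needs to know how to reach the next one.
hops, user = parse_bang_path("!bigsite!foovax!barbox!me")
```

The sketch makes the key design point concrete: the sender, not a central registry, carries the burden of knowing the route.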

The contemporary world’s familiar syntax for email – using the @ sign to separate

the user’s login name from the host identifier – was created in 1971 by Ray Tomlinson, a

Principal Scientist at BBN. At first, hosts could only be identified by numbers. Adding a

feature that allowed the use of names seemed to be a desirable and sensible next step. Names

would presumably be easier to recall than numbers. Sending mail to person@place rather

than to person@12, for example, would feel more natural and pleasing.


The problem of assigning names to the existing set of numbers appeared to be a

relatively simple proposition. But what would the names be? Who would play Adam in the

Internet’s Garden of Eden? The first proposal came from Peggy Karp, an employee at

MITRE, a not-for-profit corporation that specialized in providing high tech services to the

US government. In late 1971 she published RFC 226, a short reference table titled

Standardization of Host Mneumonics.

RFC 226 was essentially a two column list with twenty ARPANET sites. Karp matched

each IMP/host pair with a corresponding name – 1 was UCLA, 2 was SRI-ARC, 3 was

UCSB, and so on. The numbers hadn’t been given out in strict order; MITRE was 145.

Karp’s list was designed to serve as the basis for a system-wide lookup table called hosts.txt.

Mail sent to person@place, for example, would be referenced in the table and automatically

routed to the machine number matched to “place” in the hosts.txt file. Final delivery to the

addressee would be handled by that host machine. Each host would store an identical copy

of hosts.txt, but would need to get an update of that file as new host names were added or

configurations were changed. This meant that some central registry would have to be made

responsible for maintaining and distributing an authentic, up-to-date list.

Before long, Postel submitted an update. There were now forty-two hosts. His list also

added a third column of short alternate identifiers, not to exceed four characters. His

suggestion for the short name of the host at SRI’s Augmentation Research Center was NIC

– Network Information Center – an acronym that proved to have tremendous staying power.

Over time, fields were added to the list that would also account for an entry’s

machine name, operating system, and available protocol services. Fields and subfields were

delimited by the symbols “:” and “,” respectively. Consequently, a single entry in hosts.txt

might look like this:

HOST : 10.2.0.52 : USC-ISIF,ISIF : DEC-1090T : TOPS20 : TCP/TELNET,TCP/SMTP,TCP/FTP,TCP/FINGER,UDP/TFTP :

The label “HOST:” identified the type of record. The next field contained the physical

address of the host machine (10.2.0.52). This was followed by the short and long name sub-

fields (USC-ISIF and ISIF), the operating system (TOPS20), and finally the services (in this


case, several TCP services and one UDP service). The upkeep of hosts.txt-related operations

was subsidized – like nearly everything else on the ARPANET – by the US Government.
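The colon- and comma-delimited layout described above lends itself to mechanical parsing. The following Python sketch is illustrative only – the field labels are my own, inferred from the description here rather than from the DoD host-table specification:

```python
def parse_host_record(line):
    """Parse one HOST entry from a hosts.txt-style table.

    Fields are delimited by ":" and subfields by ",", as in:
    HOST : 10.2.0.52 : USC-ISIF,ISIF : DEC-1090T : TOPS20 : TCP/TELNET,... :
    """
    # Trim the line, drop the trailing delimiter, then split on ":".
    fields = [f.strip() for f in line.strip().rstrip(":").split(":")]
    record_type, address, names, machine, opsys, services = fields[:6]
    return {
        "type": record_type,          # record label, e.g. "HOST"
        "address": address,           # physical address of the host machine
        "names": names.split(","),    # short and long name subfields
        "machine": machine,           # hardware designation
        "os": opsys,                  # operating system
        "services": services.split(","),  # e.g. TCP/TELNET, UDP/TFTP
    }
```

For instance, feeding the sample entry above through this function yields "10.2.0.52" as the address and the pair of names "USC-ISIF" and "ISIF".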

The authoritative master copy of the file was kept at SRI’s site in Menlo Park,

California. It was maintained by Elizabeth “Jake” Feinler, an engineer who had been

recruited by Douglas Engelbart in 1969. Engelbart’s concept of a network information center

for the ARPANET had moved from drawing board to reality in good time, allowing him to

name Feinler as the principal investigator of the full-fledged NIC in 1972. In 1975, after the

Defense Communication Agency (DCA) stepped in to replace ARPA as the primary sponsor

of the ARPANET, the NIC project was officially known as the Defense Data Network’s

Network Information Center (DDN-NIC). Feinler remained principal investigator until 1991

when SRI finally lost the NIC contract.

Feinler stored the master hosts.txt file on a machine designated SRI-NIC, which was

also the long-time home of the RFC series. Postel happened to be working at SRI in the mid

1970s when the DCA took over, but most of his activity at that time was devoted to leading

the ARPA- and Air Force-sponsored Augmentation Research Center Development Group.

When new hosts were added to the ARPANET – and later to the Internet – system

operators applying for names would generally transmit filled-in templates to the DDN-NIC.

Most of the arrangements were completed and verified over the phone. After 1977, day to

day operations were handled by Mary Stahl, an artist who had learned of the job opening at

SRI through a friend. Stahl communicated with the “liaisons” at the host sites, and

maintained their records in a computerized database, adding or amending records as needed.

That database was used to generate the master zone file, which she painstakingly proofread and then

made available for download. Copies were routinely transferred twice a week, downloaded

by hosts across the country, and ultimately around the world.

The system worked well enough through the 1970s and 1980s, but the host table kept

growing. Long before the cutover from NCP to TCP/IP, the need for improvement became

clear. The number of hosts was effectively doubling annually, surpassing 500 in 1983.

Exponential growth was likely to continue. Keeping up with additions and changes would


193. Jon B. Postel, “Internet Name Server,” October 27, 1978, http://www.isi.edu/in-notes/ien/ien61.txt.

194. David L. Mills, “RFC 799: Internet Domain Names,” September 1981.

195. Zaw-Sing Su and Jon Postel, “RFC 819: The Domain Naming Convention for Internet User Applications,” August 1982.

[Figure 8: “The In-Tree Model for Domain Hierarchy,” from Zaw-Sing Su and Jon Postel, “RFC 819: The Domain Naming Convention for Internet User Applications,” August 1982. The tree descends from the naming universe (U) through intermediate domains (I) to endpoint domains (E).]

be a never-ending task. There were fears that the Internet would eventually become

congested by the transfer of larger and larger hosts.txt files to more and more hosts.
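The arithmetic behind that fear is simple. If the host count doubles each year and every host periodically downloads a table whose size grows with the host count, then total distribution traffic grows with the square of the population. A back-of-the-envelope sketch in Python, using the roughly 500 hosts of 1983 as an illustrative baseline (the constants are for demonstration, not measurements):

```python
def projected_hosts(year, base_year=1983, base_hosts=500):
    """Hosts on the network, assuming the host count doubles annually."""
    return base_hosts * 2 ** (year - base_year)

def relative_table_traffic(year, base_year=1983):
    """Total hosts.txt transfer load relative to the baseline year.

    Each of n hosts downloads a file whose size is proportional to n,
    so aggregate traffic scales as n squared, quadrupling every year.
    """
    growth = projected_hosts(year, base_year) / projected_hosts(base_year, base_year)
    return growth ** 2
```

Three years on from the baseline, eight times as many hosts would mean sixty-four times the distribution traffic, which is why the flat-file scheme could not last.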

Development of a more efficient alternative was in the works for some time. Postel

had proposed an Internet Name Server as early as 1978.[193] A major step came in 1981 when

David Mills, a distinguished routing specialist, proposed a comprehensive solution to the

anticipated hosts.txt problem.[194] The following year, Zaw-Sing Su, also of SRI, co-authored

an RFC with Postel incorporating an approach that was the first to bear a noticeable

resemblance to the current system. It included a long description of why an administrative

name hierarchy was preferable to topological bang paths. Su and Postel represented the

proposed layout with a diagram titled “The In-Tree Model for Domain Hierarchy.”[195]

Once the development process received a full commitment from DARPA, main

responsibility for the work was assigned to Paul Mockapetris, a computer scientist on ISI’s

staff. He received substantial input from Postel and from Craig Partridge, a computer


196. See Joi L. Chevalier, “15 Years of Routing and Addressing,” Matrix.net, 11.01, January 2001, http://www.matrix.net/publications/mn/mn1101_routing_and_addressing.html.

industry powerhouse who was, among other things, Chief Scientist at BBN and a member

of CSNET’s technical staff.[196] (CSNET – the Computer Science Network – was the

consortium of local networks and computer systems that had grown out of Larry

Landweber’s decision to adopt the TCP/IP protocol.)

The long-term goal was to eliminate the cumbersome hosts.txt file, and replace it with

a highly scalable and hierarchically distributed root and branch database structure. To avoid

repeating the trauma of the TCP/IP cutover, the team designed for a transition period during

which both systems could be used simultaneously.

The new standard was premised on creating a widely distributed array of facilities.

There would be a root database containing references to multiple subsidiary databases – what

are now known as Top Level Domains. In common parlance they are called “TLDs.” Each

TLD would contain further sets of subsidiary references known as Second Level Domains

– “SLDs”. These references could conceivably extend on down the line, going three, four,

five or even more levels out, finally pointing to resources at a host. The system was

remarkably flexible, even allowing one name to be assigned to another via an alias if desired.

Ultimately, however, those resources would resolve to an endpoint, usually designated by an

IP address.

Whereas the UUCP bangpaths reflected a neighbor-to-neighbor mapping that laid out

how to get from here to there, the inherent structure of the DNS was strictly hierarchical, like

the representation of a file system on most computers. From the user’s perspective, the

locator name of a host resource would consist of a chain of database references, from the

most subordinate all the way back up to the root. For example, an imaginary fifth level name

might look like myhost.mycity.mycounty.mystate.mycountry. There is an implicit “.” at the

end of the chain of which most modern Internet users are not aware. The “.” symbol –

pronounced “dot” – was used inside the database records to symbolize the location of the

root. Specialists came to use the two words – root and dot – synonymously.
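Reading such a name as a chain of delegations can be sketched mechanically: account for the implicit trailing dot, then walk the labels from the root downward. An illustrative Python fragment (not resolver code; the function name is my own):

```python
def delegation_chain(name):
    """Return the lookup order for a domain name, from the root downward.

    "myhost.mycity.mycounty.mystate.mycountry." ends at the implicit
    root ("."); resolution proceeds root -> mycountry -> ... -> myhost.
    """
    labels = name.rstrip(".").split(".")  # strip the implicit root dot
    # Rebuild each suffix of the name, from shortest (nearest the root)
    # to longest (the fully qualified host name).
    return ["."] + [".".join(labels[i:]) + "." for i in range(len(labels) - 1, -1, -1)]

# chain[0] is the root "."; chain[-1] is the fully qualified host name.
chain = delegation_chain("myhost.mycity.mycounty.mystate.mycountry.")
```

Each element of the chain corresponds to one database in the hierarchy, which is exactly the root-and-branch structure the designers had in mind.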


197. Jon Postel, “Single Character TLDs & some history,” NEWDOM, November 12, 1995.

Despite their careful planning for a full, deeply populated tree, with lots of sub-

branches carrying hosts at the tips, the creators of the DNS never anticipated that the new

system would become as relatively “flat” as it did, drawing so many registrations into one

branch so close to the root... the top level domain, dot-com.

c. Domain Requirements – 1983 - 1985

Mockapetris conducted the first successful test of the DNS at ISI on June 24, 1983.

By early November that same year Postel published RFC 881, announcing a preliminary

schedule for the transition to the new system. The suffix .arpa was announced as the first top

level domain. That domain was already being used by testers when the RFC was published,

but the aggressive schedule required that all existing hosts were to be placed under it in

December. This would serve as a temporary stop. Each network administrator would later

be required to re-register his or her machine using a name directly under another top level

domain, or perhaps as a subordinate to another second (or third) level administrator. But no

other top level domains were yet available.

“The original proposal,” Postel wrote later, “was to have TLDs like ARPA, NSF,

ARMY, and so on. Your computers would be named under the government agency that

‘sponsored’ your connection to the net. The idea of having the 26 single [alphabetic]

characters was discussed [also].”[197] When Postel issued RFC 881, however, only one other

top level domain was scheduled for introduction – .ddn, for military use.

Postel scheduled a key step for May 2, 1984 – “Establish a New Top Level Domains

Only Table.” This was to be the start date for the new domains.txt file, containing the two

TLDs mentioned in RFC 881 and any others that might be added.

Mockapetris simultaneously published two RFCs – 882 and 883 – that laid out the

technical outline for the Domain Name System. The first presented conceptual issues, and

the other went into lengthy detail about specifications and implementation. His first paper


198. The msggroup archive was once stored at http://www.tcm.org/msggroup/. Those archives can be found at www.archive.org.

199. Jon Postel, “MSGGROUP#2193 RFCs 881, 882, and 883 Now Available,” Nov 4, 1983. See http://www.tcm.org/msggroup/msggroup.2101-2200. A partial namedroppers archive can be found at http://www.cs.utexas.edu/users/chris/sigcomm/t1/partridge.namedroppers.material.txt.

presented three top level domains – .arpa, .ddn and .csnet – but only, he wrote, for purposes

of “exposition.”

Postel announced those RFCs on November 4, 1983 via several electronic mail lists,

including – most importantly – the ISI-based Msggroup. It was the ARPANET’s first mailing

list, founded in 1975, and was still going strong as a focal point for the technical

community.198 He invited anyone interested in discussing the “day-to-day technical

development” of the planned system to join a relatively new list called Namedroppers, hosted

by the SRI-NIC.199

Though Namedroppers was created as an arena for technical discussion, policy

questions intruded frequently. There was no bright line to distinguish the place where the

rules for electrons would end and the rules for people would begin. A key problem involved

the rights and responsibilities of domain administrators.

By design, each administrator admitted into the DNS hierarchy would become a

gatekeeper for his or her subdomains. This prompted seemingly endless rehashing about

what rules those gatekeepers should have to follow, especially at the top levels. And the rules

for admission into the hierarchy needed clarification. There was talk of not granting an entity

a listing as a second level name unless the aspiring registrant also brought in at least 50

subordinate hosts at the third level or lower. This did not become a requirement, but its

consideration reflects the extent to which the DNS architects wanted to shape the namespace

as a deep hierarchy rather than the shallow expanse it later became.

Semantics and nomenclature were even more confounding. Some of the people on

Namedroppers wanted top level domains to recapitulate the canonical structures of society.

At first glance, it may have seemed like a good idea, but nothing brings out the anal

retentiveness in humans like the chance to invent a naming scheme for the whole world. The


Jon Postel, “RFC 897: Domain Implementation Schedule,” February 1984.200

list’s armchair megalomaniacs came to believe they would soon be designating the

authoritative base structures of human civilization, under the daring presumption such

structures could be known at all.

Postel was frustrated by that particular line of discussion, which he considered non-

productive. His vision of the DNS focused on the flexibility and ease of management that

would result from a structure in which registration authority was consistently delegated

down through the hierarchy. The whole point was to avoid the administrative overhead of

organizing and enforcing a universal classification scheme that would prescribe how each

host should be named. Instead, in his view, decisions about new host registrations should be

left up to administrators and sub-administrators across the Internet.
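Postel’s delegation model is easy to picture in code. The sketch below is purely illustrative – the class, zone names, and administrator strings are invented for this example – but it captures the principle the text describes: each zone’s “responsible person” alone decides what gets registered beneath that zone, so no central authority has to approve every host name.

```python
# Toy sketch (invented names) of delegated registration authority:
# each zone's administrator controls only the labels directly beneath
# that zone, and can hand that authority down in turn.

class Zone:
    def __init__(self, name, admin):
        self.name = name          # e.g. "edu." (trailing dot = fully qualified)
        self.admin = admin        # the "responsible person" for this zone
        self.children = {}        # label -> Zone, managed only by this admin

    def register(self, requester, label):
        """Only this zone's own administrator may create a subdomain here."""
        if requester != self.admin:
            raise PermissionError(f"{requester} cannot register under {self.name!r}")
        child = Zone(f"{label}.{self.name}", admin=requester)
        self.children[label] = child
        return child

# The root delegates "edu" once; thereafter the edu admin acts alone.
root = Zone("", admin="root-admin")
edu = root.register("root-admin", "edu")
edu.admin = "edu-admin"                   # authority handed down the hierarchy
mit = edu.register("edu-admin", "mit")    # no root involvement needed
print(mit.name)
```

The point of the design shows up in what the code does *not* contain: there is no global table of approved names, only a chain of local decisions.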

In February 1984 Postel provided a progress report with an updated schedule. He

reaffirmed the plans to launch in May, “[at which] point a few new domains may be

established, in particular the DDN domain.”200 When the time came, however, the deadline

was pushed back. There were still many technical issues to resolve, and the question of what

names to add was starting to bog down the discussion. On the last day of April, Postel sent the

Namedroppers list a draft of an RFC titled “Domain Requirements.” It focused on examples

of domain structures as administrative entities, laying out “Prototype Questions” that a

“responsible person” applying for a domain might need to answer. These included contact

information and the hardware and software in use. Postel also wanted to know how many hosts the

administrator expected to include in the domain during its first, second, third, and fifth

years of operation.

Postel’s draft also presented three TLDs as examples – he insisted they were only

examples – of how to organize domains. There was .uc (for University of California), .mit

(for Massachusetts Institute of Technology), and .csnet. The first exemplified entities that

might organize their hosts at the fourth level, under regional campuses and then under local

departments, as in locus.cs.la.uc. The second and third examples were intended to portray

circumstances in which administrators might not want to make such a distinction, and would


Jon Postel,“Draft RFC on Requirements to be a Domain,” namedroppers April 4, 1984.201

Mueller (2002: 79).202

Personal email from Jake Feinler, November 7, 2006.203

An embellished history once posted by SRI (now removed) described a meeting with Postel and Reynolds where Feinler had shouted “Enough is enough!” insisting she would resolve the dilemma and make the choices herself. “The SRI Alumni Association Hall of Fame,” http://www.sri.com/alumassoc/hoflist.html.204

simply register all their hosts at the second level. His point in making an example of CSNET

was to stress that although the consortium was not a true, unified network like MIT, sharing

a single IP block allocation, “it does, in fact, have the key property needed to form a domain;

it has a responsible administration.”201

Postel put out another draft on May 11, proposing six top level domains, .arpa, .ddn,

.gov, .edu, .cor, and .pub.202 The issue of which names to select continued to be, in Postel’s

words, “much discussed” on the Namedroppers list, but little was settled. At Postel’s request,

Feinler submitted a draft RFC to nail down the naming scheme, but he didn’t like it. Postel

wanted network names; she preferred generic types of organizations, such as .mil, .edu, and

.gov, plus .bus for businesses.203 Ken Harrenstein, a software architect at SRI, thought that

.com would be a better choice for commercial enterprises, and implemented his adaptation

of Feinler’s scheme on SRI’s servers. Since the work was approved by the DCA,

Harrenstein’s move was effectively a fait accompli. When Postel traveled to SRI to discuss

Feinler’s draft, he learned what had been done.204

Domain Requirements was ultimately released in October 1984 as RFC 920.

Presented as “an official policy statement of the IAB and DARPA,” it drew liberally from Feinler’s draft, though Postel and Reynolds were the only authors listed. “The purpose,” they

wrote, “is to divide the name management required of a central administration and assign it

to sub-administrations [without] geographical, topological, or technological constraints.”

There were only five top level domains named beyond .arpa, and they were not to be

put into use until the following year, 1985. These were .gov, .mil, .edu, .com, and .org. The

DDN Project Management Office was listed as the administrator of .mil. DARPA was listed

as being in charge of the others. There was no description within the RFC of why those


Robert Hobbes Zakon, “Hobbes’ Internet Timeline,” http://www.zakon.org/robert/internet/timeline/.205

Phone interview with Mary Stahl, April 24, 2003.206

Sean Donelan, “Timeline of events with the Domain Name System,” http://www.donelan.com/dnstimeline.html.207

particular DARPA domains were chosen or how they were to be used. Perhaps Postel

thought their purposes were self-evident. It was more likely that he didn’t want to do

anything more to fuel a new round of quibbling about semantics.

The requirements document also allowed for the inclusion of two letter codes for

country domains and codes for “multiorganizations” – entities that “can not be easily classified

into one of the existing categories and [are] international in scope.” Postel insisted on one

correction to Feinler’s scheme. A seventh TLD – .net (network) for the use of ISPs and other

infrastructural facilities – was added within the year. Throughout the development period

there was little expectation that individuals would want – or would be able to afford – hosts

of their own.

In 1985, authority for central administration of the root became yet another of

Postel’s DARPA-funded duties at ISI.205 This put him directly in charge of screening top

level domain applicants. The considerably more cumbersome task of providing registration

and support services under the top level domains was added to Mary Stahl’s job at the DDN-

NIC, which was still funded by the Defense Communications Agency. From her perspective,

there was no specific moment when a big switch was flicked and the new system was turned

on. If anything stood out, it was the jump in registration activity that began after the TCP/IP

cutover. Things at SRI were often in flux.206 Like many other projects, the DNS was phased

in over a period of time, with lots of tests and experiments being performed long before it

was officially in place. The first registration in .com – symbolics.com – came on March 15,

1985. An even more important milestone – incrementation of NIC zone serial numbers to

publicly flag the availability of authoritative updates – didn’t begin until almost a year later,

February 26, 1986.207
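The serial-number milestone mentioned above is simple to illustrate. In the sketch below the values are hypothetical, and real DNS implementations use the wraparound-safe serial arithmetic later codified in RFC 1982 rather than a plain comparison; the idea is only that a secondary re-copies the zone when the primary’s published serial number has advanced past the one it already holds.

```python
# Hypothetical sketch of zone-serial signaling: secondaries poll the
# primary's record and transfer the zone only when the serial has been
# incremented, flagging that a newer authoritative update is available.
# (Real servers use RFC 1982 serial arithmetic, not a plain ">".)

def needs_transfer(secondary_serial, primary_serial):
    """True when the primary is publishing newer authoritative data."""
    return primary_serial > secondary_serial

# A common later convention encodes the date of the last edit: YYYYMMDDnn.
print(needs_transfer(1986022600, 1986022601))  # newer data: re-fetch
print(needs_transfer(1986022601, 1986022601))  # already current
```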


Stephen C. Dennett, Elizabeth J. Feinler, Francine Perillo, eds., “ARPANET Information Brochure,” December 1985, http://www.hackcanada.com/blackcrawl/telecom/arpa.txt.208

Jon Postel, “politics of names - not on this list -- please !” namedroppers, November 4, 1985.209

Jon Postel, “Naming the NIC,” tcp-ip@sri-nic.ARPA, August 3, 1987, http://www-mice.cs.ucl.ac.uk/multimedia/misc/tcp_ip/8705.mm.www/0222.html.210

The DDN-NIC now served as hostmaster for both the DDN (.mil) and DARPA TLDs

(all the others). Federal policy required that any addition or modification be certified by a

Network Change Directive.208 It was easy to accept the idea that parties applying for second

level names would be screened to ensure that the host machine was qualified to appear within

the requested TLD. Only colleges and universities would be allowed under .edu, for example.

Stahl occasionally worked with her liaisons over the phone to help them pick their names.

Arguments about semantics continued to take place long after the initial TLD suffixes

were selected, both on and off the Namedroppers list. The DNS naming hierarchy was in

some ways the victim of its own success. People had begun using it to guess the location of

resources. It made sense to assume the Massachusetts Institute of Technology was mit.edu.

That guess happened to work. If a resource turned out to have the “wrong” name, however,

complaints might arise, perhaps supplemented with suggestions about how to fix the

structure. For example, in mid 1987 someone looking for the Network Information Center

guessed it would be found at nic.sri.com. It was actually at sri-nic.arpa. This led to questions

regarding what might happen if ARPA removed its support of the Internet. Or what might

happen if SRI changed its corporate name or even went out of business? Why not create .nic

as a top level resource?

Postel had to intervene more than once to keep Namedroppers focused on technical

issues. On November 4, 1985, two years to the day after issuing the first official public

invitation to join Namedroppers, he put his foot down, proclaiming a new policy that sought

to ban any discussion of semantics.209 Thereafter, he had to patrol against any hint of a

violation, and would issue an interdiction when needed:

The namedroppers list is for discussion of the design and operation of the domain name system, not for the discussion of what the names are or what names would be nice.210


Jon Postel, “re: countries only at the top,” msggroup Nov. 10, 1985.211

Though Postel recommended alternative locations for such discussions, the thrust of

his interventions was to interrupt them and shunt them away from the main stage of

discussion. Nevertheless, he did make at least one major concession to popular

demand, in particular, the demands of certain users outside the United States.

By design, there was no inherent need for the DNS to replicate the physical divisions

of networks in the form of IP blocks, and there was no reason to replicate geographical or

geopolitical subdivisions either. Many Americans were already beginning to hope that the

rise of the Internet presented an opportunity to surmount such constraints. But European

participants were wary of being subsumed within US-dominated categories. There were even a few who thought that no TLDs other than country codes should be created. Postel

rejected that idea:

I think that there are many cases of organizations that operate in multiple countries where attaching them to one country or another would cause (political and emotional) problems.211

If country names were to be added at all, it had to be determined what counted as a

country, how it should be listed, and who should administer its registry. There was an easy

enough answer for that. A guide to the names of recognized nation-states and their legal

denotations was available in the form of a United Nations Bulletin called “Country Names.”

It paired the countries of the world with a corresponding “Alpha-2” code. France was

matched with FR, Mexico with MX, and so on. A Berlin-based group known as the ISO 3166

Maintenance Agency (part of the International Organization for Standardization) used

the “Country Names” Bulletin along with another UN-based list of unique statistical

designations to generate a table known as ISO 3166-1. It maps country names with a column

of two letter codes and another column of three letter codes. In RFC 920 Postel designated

the ISO 3166-1 table as the validating authority for any future country code designations that

might be added to the root.
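The pairing described above is easy to picture in code. The excerpt below is a hand-picked illustration, not the full ISO 3166-1 table, showing how an entry from the alpha-2 column becomes a candidate country-code TLD.

```python
# Illustrative excerpt (a few rows only) of the ISO 3166-1 pairing the
# text describes: each country name maps to an alpha-2 and alpha-3 code.
ISO_3166_1 = {
    "France": ("FR", "FRA"),
    "Mexico": ("MX", "MEX"),
    "United Kingdom": ("GB", "GBR"),
}

def cctld_for(country):
    """Derive a candidate country-code TLD from the alpha-2 column."""
    alpha2, _alpha3 = ISO_3166_1[country]
    return "." + alpha2.lower()

print(cctld_for("Mexico"))  # ".mx"
```

Note that applying this rule to the United Kingdom yields “.gb” – which is exactly why the .uk delegation discussed below became a story in itself.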


See his brief online memoir, Peter T. Kirstein, “Early Experiences with the ARPANET and INTERNET in the UK,” http://www.cs.ucl.ac.uk/staff/jon/arpa/internet-history.html.212

Paré (2003: 70-1).213

Once it was decided to add country codes, the next step was to determine who should

administer the zone. Postel decided that a country code registration, like any domain name

registration, required that an application be made by a “responsible party” who could receive

the delegation. The Internet was still small enough in the mid and late 1980s that an applicant

for a country code might be a familiar colleague within the TCP/IP development community

– most likely an academic, perhaps even an old friend from grad school.

The process began slowly, with the addition of .us in February 1985. Two more – .gb and .uk – were added in July. The assignment of two codes for one country turned

out to be a story in itself.

British computer scientists had been participating in the ARPANET project since 1973

under a Governing Committee chaired by Peter Kirstein, a key figure in the creation of the

Internet.212 A long-time faculty member at University College London (UCL), Kirstein

pioneered the first TCP test network with Cerf. He also founded the International Academic

Network Workshop meetings... the annual gatherings that were later transformed into INET

by Larry Landweber, providing a platform from which to launch ISOC. Kirstein went on to

participate in CCIRN, overseeing the growth of Internet connectivity worldwide.

In 1985, Kirstein needed a favor. Although the ISO 3166-1 table assigned GB as the code for his country, Kirstein wanted Postel to accept .uk as the entry for his country’s

networking community. UCL’s network had been using .uk as an ARPANET identifier long

before any country codes were added to the domain name system. Switching over would be

inconvenient. Postel accepted Kirstein’s request to add .uk to the root, but just as a temporary

solution. A record for .gb was added as well, with the expectation that it would soon come

into widespread use. It didn’t work out that way. The change was continually put off,

making its possible execution look ever more painful and disruptive. Postel made several

attempts over the next few years to get the British networking community to convert, but

gave up in 1988. In the end, the use of the .uk suffix in the DNS was allowed to stand.213


Postel may have been famous as a stickler for technical consistency, but the incident showed

that he could also bend on administrative questions.

* * *

An important design feature of the DNS was its capacity for redundancy. Its zone

files could carry more than one IP address for each name referenced in the system. If the first

listed site failed to respond for some reason, perhaps because of network congestion or a

temporary outage, this feature helped ensure that an alternate site publishing the same

resources could be found. The hierarchy was maintained by allowing only one of those

addresses to be flagged as the “Start of Authority” for subordinate data.
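A toy model can make the redundancy mechanism concrete. In the sketch below, which uses invented names and addresses, a single name carries several addresses and a client simply works down the list when the first host fails to answer, while exactly one entry is marked as the authoritative source for updates.

```python
# Illustrative sketch (hypothetical name and addresses): a zone maps one
# name to several hosts publishing the same resources; a client walks
# the list until one answers, and only one server is flagged as the
# "Start of Authority" for the zone's subordinate data.

zone = {
    "example.isi.edu": {
        "addresses": ["10.0.0.1", "10.0.0.2", "10.0.0.3"],  # redundant hosts
        "soa": "10.0.0.1",  # the single authoritative source for updates
    }
}

def resolve(name, is_reachable):
    """Return the first listed address that responds; None if all fail."""
    for addr in zone[name]["addresses"]:
        if is_reachable(addr):
            return addr
    return None

# Simulate the primary being down: the client falls back to the next host.
down = {"10.0.0.1"}
print(resolve("example.isi.edu", lambda a: a not in down))
```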

A distinctive feature of the DNS was that it did not rely on TCP to transmit

information across the network. The User Datagram Protocol (UDP) was used instead. TCP

runs by instantiating “windows” or “frames” that behave as virtual circuits between hosts.

Keeping those windows open requires some extra processing and transmission overhead, but

this overhead is a reasonable penalty for the ability to maintain interactive sessions, transmit

potentially long messages, and support various other services. Since DNS queries are rather

short, it is not necessary to call on TCP just to find out what IP number matches up with a

particular domain name. UDP is fine for this, but imposes certain limits. Since the maximum

length of a DNS message carried over UDP is 512 bytes, there is a fixed ceiling on the number of host addresses

that can be included in a discrete UDP message, and therefore referenced under the same

domain name. That constraint had direct implications for the maximum size of the root zone.

The DNS was initially configured to allow up to eight distributed nameservers at the

root level. Only three were implemented at the time of launch. The primary DNS host was

at SRI and two secondaries were running at ISI. Two more secondaries were eventually

added at military facilities on the east coast. The machines in these arrays were variously

known as masters and slaves. After some clever finagling in the 1990s, engineered under the

direction of Bill Manning, the maximum size of the root was extended to thirteen servers.
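A back-of-envelope calculation suggests how the 512-byte ceiling translates into roughly a dozen root servers. The byte counts below follow the DNS wire format (a 12-byte header, 10 bytes of fixed per-record overhead, 2-byte name-compression pointers), but the totals are illustrative estimates rather than a byte-exact reconstruction of a real priming response; the uniform single-letter naming scheme that makes the compression so effective is the sort of engineering the text alludes to.

```python
# Estimate the size of a DNS "priming" response listing the root servers.
# Wire-format assumptions (per RFC 1035): 12-byte header; each resource
# record carries 10 bytes of fixed overhead (type, class, TTL, rdlength);
# a name already present in the packet compresses to a 2-byte pointer.

HEADER = 12
QUESTION = 1 + 2 + 2           # root name "." + QTYPE + QCLASS
FULL_NAME = 20                 # "a.root-servers.net" spelled out in labels
FIRST_NS = 2 + 10 + FULL_NAME  # owner pointer + overhead + full rdata name
NEXT_NS = 2 + 10 + (2 + 2)     # rdata: one-letter label + pointer back
GLUE_A = 2 + 10 + 4            # owner pointer + overhead + IPv4 address

def priming_response_size(n):
    """Approximate UDP payload for n root servers with glue addresses."""
    ns_records = FIRST_NS + (n - 1) * NEXT_NS
    glue_records = n * GLUE_A
    return HEADER + QUESTION + ns_records + glue_records

size = priming_response_size(13)
print(size, size <= 512)       # 13 servers fit under the 512-byte cap
```

Under these assumptions thirteen servers fit with headroom to spare, which is why compact, uniform server names mattered so much.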


Privatization of the New Communication Channel, http://www.sit.wisc.edu/%7Ejcthomsonjr/j561/netstand-3.html.214

Phone interview with Mary Stahl, April 24, 2003.215

The first generation of Cisco routers was introduced to the market in 1985,

furnishing a compliant platform particularly suitable for DNS traffic.214 The DNS grew

steadily more popular, particularly among users of the UNIX operating system, though both

the original host table system and DNS were in simultaneous use for the rest of the 1980s and

into the early 1990s.

d. From SRI to GSI-NSI – 1991

Contention over the mechanics of DNS registration during the late 1980s revealed

that there was a significant disagreement brewing over what constituted the authentic Internet

and who should pay for it. Program officers in the military were increasingly reluctant to

subsidize the cost of registering names and numbers for hosts that were not clearly within the

federal orbit. Military users perceived the Internet – with a capital I – as the portion subsidized by

official US agencies. The rest was relegated to the less important small ‘i’ internet. Members

of the academic community saw things the other way around. For them, the Internet that

counted was the one with global visibility. Federal sponsorship was incidental.

The name and number registration work conducted at SRI’s site had been growing

in both size and complexity, albeit slowly, for over twenty years. The DDN-NIC was handled

under a sole source contract, meaning that SRI was considered the only vendor capable of

providing the service. Periodic renewals were quiet and routine. In the mid-1980s the NSF

began to chip in funds to DCA on behalf of the wider research community.

With more than a decade on the job, Mary Stahl started to burn out. The tedium of

generating and proofreading hosts.txt and the DNS zone files was finally weighing down on

her. It was a thankless task in many ways. “No one would call to say ‘Great job!’ after you

uploaded a file,” she recalled, “but you’d hear about it if something went wrong.” In the

early 1990s she reduced her commitment to a half-time schedule and started migrating out

of SRI.215


“DISA History,” original location was http://www.disa.mil.pao/fs/his-may02.doc and is on file with the author. A newer, but briefer version is at http://www.disa.mil/pao/fs/history2.html.216

Phone interview with Scott Williamson, March 18, 2003.217

The DCA was redesignated as DISA on June 25, 1991.216 The transition put the DDN-NIC budget under some scrutiny, adding pressure to consolidate the military’s portion of the

infrastructure and break off the others. Why shouldn’t civilian agencies or the private sector

subsidize the portions they needed? The character of the Internet was clearly changing. It

made sense that the DDN-NIC and the NSF-funded ARPA-NIC functions could be

overhauled and completely split apart, moving the research portion of the registry

entirely under civilian control. But it was still far from clear how this transfer of

responsibility would take place. Renegotiating the DISA contract provided grounds to get

that process underway.

This time SRI faced stiff competition. There were two internetworking heavyweights

in the bidding – BBN and MITRE. There was also a lesser-known company, Government

Systems Incorporated (GSI) which had partnered with a Virginia-based firm, Network

Solutions, Inc. (NSI) under the provision that NSI would be the actual operator. All the

contenders understood that there would be an even bigger solicitation in a year or so for a

“NIC of last resort” to be funded directly by the NSF. Winning this one would help earn a

position on the inside track for the next one.

Though NSI was a small company, it was a formidable competitor. NSI had only ten

or so employees, and was a minority owned and operated firm. This combination of size and

status made it eligible for preferential treatment under a Small Business Administration

certification known as the 8(a) Business Development Program. The company’s relative

proximity to the NSF’s main offices in Virginia didn’t hurt either. Moreover, NSI had

recently beat out SRI for another DISA contract, and the leader of the GSI/NSI negotiating

team, NSI’s Scott Williamson, was now its project manager.217

Williamson had been playing with computers since the days when, as he put it,

“hacking was a good term.” He started out by exploring the same Altair 8800 “Kit Computer” model that had inspired Bill Gates and Paul Allen to form Microsoft


Ibid.218

Corporation. Though Williamson had only earned a non-technical associate’s degree before

he started at NSI, he had also received advanced training in satellite data systems in the

Navy, where he proved himself adept at project management. It was Williamson who noticed

DISA’s Request For Proposal for the DDN-NIC, and who worked out the strategy of NSI’s

partnering with GSI, a company which had a considerably stronger track record of winning

government contracts.

DISA’s award of the DDN-NIC job to GSI/NSI created the conditions for a small

earthquake in the Internet community. SRI was a trusted player from the network’s earliest days.

The new operator was an unknown upstart. Moving the root and the IP registry would be a

significant event. There hadn’t been such a drastic operational change since the 1983

cutover from NCP to TCP/IP. Matters were further complicated by an apparent lack of

cooperation from SRI’s side.

First SRI contested the award, complaining that the GSI/NSI bid was so low it did

not seem reasonable. This held up the start of the transition for a short period. If SRI’s

business managers did not seem especially eager to transfer the data and the existing work

product to the new operators, perhaps it was because they resented loss of the contract and

were acting vindictively. Or perhaps it was because they believed they had the right

to withhold proprietary information that would be useful in the next round of competition

for the NSF’s new NIC. In any case, Williamson endured an experience so “ugly,” he said

it gave him nightmares.218

When the transition finally got underway, it turned into an exasperating series of

unexpected problems and pitfalls. SRI’s DDN-NIC used a DEC TOPS-20 minicomputer with

a mix of programming languages (assembler and a not-so-portable version of C), while NSI

would be using a much newer SUN 460 diskless workstation. SRI transferred information

by way of tapes using a format that complicated the logistics for moving material onto NSI’s

platform. And Williamson felt SRI’s managers were not giving his team thorough enough

details on how the root had actually been run for the last decade.


Scott Williamson and Leslie Nobile, “RFC 1261: Transition of NIC Services,” September 1991.219

Williamson complained to DISA’s contract representative about SRI’s “unhelpful”

behavior, but did not get the remedy he wanted. He also visited ISI, primarily to talk about

IP number management, though DNS issues were brought up too. Postel was able to

contribute, at least as far as clarifying standards and requirements, but this was not the same

as disclosing the mechanisms of a working implementation.

Williamson’s team ended up having to create much more of the code base for the new

DNS software than originally anticipated, and they had to do it on a crash basis to meet the

October 1, 1991 deadline. In September 1991 Williamson submitted an RFC with a schedule

for the transition of services. The plan was to suspend registration activity at SRI’s site for

a five-day period prior to the transition date, after which it would be resumed at NSI.219 His

team was working round the clock at the end, and began seeing traffic on the server three

days before the planned cutover, an experience Williamson later described as “scary.” Ready

or not, it seemed like there was always someone out there a little too eager to start banging

on the system.

NSI also took on hosting for Namedroppers and added new lists for discussion of root server management, rs-talk and rs-info. The main responsibility for

hosting RFCs and issuing related publications shifted to CNRI.

When the root was finally moved, there were enough stutters and hiccoughs to make

people notice. Williamson dutifully made himself available to the IETF community,

appearing at the IETF plenary in late November. It was a hard group to please. Many were

caught off guard by the announcement, and were not happy about it. Some were outright

wrathful. “Lots of people were throwing tomatoes,” he recalled. The engineering

community’s first taste of NSI as the root operator was rude awakenings, glitches, and data

losses. Despite being the one who had made the greatest effort to make the transition

succeed, Williamson had to take the heat for the parts that didn’t.

By inheriting the responsibilities for running the DDN-NIC, Williamson inherited the

frustrations that went with it. His long-time predecessor, Mary Stahl, would have understood


Phone interview with Kim Hubbard, February 17, 2003.220

the siege he was about to endure, but she was already on the verge of leaving SRI altogether,

and had little to do with the DDN-NIC at the time of the transition. If Williamson had met

Stahl in some neutral venue, perhaps she could have offered him a message of

commiseration... “Welcome to my world.”

Williamson was not the only NSI employee to feel the wrath of the public. The IP

registry management function that Postel had passed on to SRI in 1987 was now in the hands

of Kim Hubbard, who had been working at NSI for a couple of years selling TCP/IP-based

software. The cutover did not go well for her, either. Upon receiving the list of existing IP

allocations, she experienced her own rude awakening. “It didn’t take long, probably like an

hour, to figure out that they just screwed up really bad.” As she saw it, in the period leading

up to the transition, SRI’s personnel had slipped and allowed mistakes to creep in. The

biggest problem involved double allocations, in which the same block of numbers had been

given out to multiple users. “It took me years to get it straightened out,” she said. “I had to

turn around and tell [recipients], ‘Sorry but someone else has that and they got it a week

before you.’”220

* * *

Though Postel had no contractual authority within the DDN-NIC structure, and

exercised no direct say in the movement of the DDN-NIC from SRI to NSI, the transition

served to bolster his authority. As Williamson recalled, NSI’s owners initially put little stock

in Postel’s importance. The DDN-NIC was subordinate to DISA, while Postel’s shop at ISI

was beholden to DARPA. Still, across the Internet as a whole, more and more people were

coming to rely on Postel as a resource. Yes, NSI’s staff members were using military email

addresses like [email protected], which might seem to identify them with the armed

services, but they were increasingly preoccupied with serving the demands of the livelier

civilian research community. They not only deferred to Postel’s advice, they were impressed

by him personally. From Williamson’s perspective, “Postel was a big help; he was the king.”


Hubbard considered Postel to be “a great man” and modeled her own behavior after her

idealized image of him as a paragon of strict adherence to fundamental technical principles.

As gatekeepers with immediate operational control over the allocation of contended

resources, especially the diminishing blocks of available Class B addresses, Williamson and

Hubbard soon found themselves immersed in conflict. Anyone in a position of such

importance, no matter how saintly, presented a likely target for criticism. The work was

demanding enough on its own, but the bitterness of the environment provoked feelings of

personal sacrifice. By imputing a charismatic purpose to their job, they could justify their

travails as a noble duty that had a broader benefit. That ennobling sense of charisma was

fortified by alignment with Postel... the man presumed to be on top, if not above it all.

Ironically, while Postel’s work continued to be funded by DARPA, an agency of the

US military, his efforts benefitted a community that had an increasingly civilian and

international character. Anyone investigating the administrative workings of the Internet

would inevitably learn his name. Moreover, ISOC was on the rise, providing a platform

through which academic and commercial Internet participants could advocate their own

social interests. Postel’s enthusiastic support was taken as a legitimating factor toward that

end.

* * *

By this time, civilian computing had overwhelmingly tilted toward use of the DNS,

while the military sector remained relatively dependent on the old hosts.txt table. Most

commercial off-the-shelf TCP/IP technology now came enabled with DNS support. New

hosts were being added at a faster rate on the ARPA-Internet side than the DDN side. It was

clear that the DNS-focused civilian Internet was leapfrogging over the hosts.txt technology,

and the civilian sector’s growing influence on the network was undeniable.

Many new hosts in the civilian sector, particularly those outside the US, weren’t even

bothering to register in hosts.txt. Consequently, hosts.txt users would not be able to “see”

those new sites unless the right IP number was known. Postel had pointed out this problem

in 1984 when he launched the .arpa root. The situation could be highly disruptive to the use

of certain extensions of the Internet Protocol suite, particularly those involving mail


Key parts of the exchange are recounted in Lyman Chapin, “RFC 1401: Correspondence between

the IAB and DISA on the use of DNS throughout the Internet,” January 1993.221

exchange, which relied on features enabled by the DNS. In light of the move toward a

dedicated civilian NIC, continuing to support the obsolescing hosts.txt file was becoming a

thankless task. It seemed like an appropriate moment to dispense with the host table

technology altogether.

The IAB Chair, Lyman Chapin, sought to force the issue in a March 1992 letter

addressed to the Chair of the DoD Protocol Standards Steering Group, Colonel Ken Thomas,

and various other officials at DISA and within the FNC. Chapin’s policy recommendation

titled “The Domain Name System is an Internet Necessity” chided the MILNET community

for failing to follow through on a 1987 promise to convert to the DNS. More audaciously,

he insisted that it was time for the US military to get in step with the “world-wide Internet,”

stressing that there were serious risks if the military did not catch up. “The larger community has

evolved so extensively beyond [the host table]...” he warned, that failure to adopt the DNS

would result in a “detrimental impact on overall Internet operations.”

Non-DNS systems on the Internet will eventually be confronted with the need to decide whether they want to continue as a part of the larger Internet community, or remain a rather small, non-conforming subset. Should they choose not to conform to the otherwise accepted Domain Name System, they will have to accept the ramifications of this decision.221

Leaders within the civilian engineering community were becoming more confident

of their status and therefore bolder in asserting their interests. Now they could dare to tell the

military what to do. Moreover, the next round of contract negotiations would be led by the NSF.

This would finally place the ultimate responsibility for administering the civilian side of the

Internet under a civilian agency. And it would further sanctify Postel’s position of authority

at the top of the chain of command.

e. One-Stop Shopping – Summer 1992 – Spring 1993

For much of the 1990s, the National Science Foundation’s chief contract negotiator

for anything related to national computer networks was Don Mitchell. A disabled Vietnam


Phone interview with Don Mitchell, February 10, 2003.222

Ibid.223

veteran who had been working for the NSF since 1972, Mitchell was centrally involved in

arranging the late 1980s deal by which the NSF, IBM, and the University of Michigan

created Merit and thereby launched the NSFNET backbone. His longevity in that position

during that tumultuous period gave him truly senior status in the government, if not formally

as a functionary, then informally as a troubleshooter, mentor, disciplinarian, and deal pusher.

He liked referring to himself as “NSF’s hit man.” This power served a personal agenda.222

Though a consummate bureaucrat, adept at trading favors with friends and getting even with

enemies, he was also an ideologue who sought to champion privatization when possible,

countering what he considered to be the predatory behavior of government regulatory

agencies.

Since Mitchell avidly supported the NSF’s interests in leveraging the Internet as a

research tool, he was sensitive to the fact that the critical registry functions performed by the

DDN-NIC (especially IP number allocations) were beholden to military funding. He

especially feared that someone in the military would soon “come back in and do something

like cutting off Europe again.” The tradition of sharing responsibility for such critical

resources among different agencies was complicating the issue of who possessed ultimate

authority over them. “The whole thing gets very shrouded and bewildering,” he recounted.

“People were still afraid to say [the Internet] was no longer the military’s.”223

Mitchell shrewdly recognized that Postel’s academic temperament and social

grounding made him a willing “co-conspirator” in shifting control from DARPA to civilian

agencies. Postel had already proven himself quite adept at navigating the growing division

between the operational and research imperatives of the military without jeopardizing the

interests of the avant garde engineering community. The reservoir of trust Postel had

accumulated gave him tremendous latitude to assist in what Mitchell called the “streaming


Ibid.224

See, for example, RFC 1177, August 1990, and others in the series, “New Internet User.”225

Solicitation text cited from Simon Higgs, “I like this guy,” bwg, May 18, 2002. “This project

solicitation is issued pursuant to the National Science Foundation Act of 1950, as amended (42 U.S.C. 1861

et seq) and the Federal Cooperative Agreement Act (31 U.S.C. 6305) and is not subject to the Federal

Acquisition Regulations.” ... “The provider of registration services will function in accordance with the

provisions of RFC 1174.”226

of authority from the military.” That streaming could be maintained and even accelerated as

long as “the big bear, the Pentagon, hadn’t woken.”224

Mitchell worked with two NSFNET program managers – first Doug Gale and later

George Strawn – developing an ambitious concept for a full-fledged civilian successor to the

DDN-NIC. It was based on the idea of “one stop shopping” for three distinct kinds of

services. The most challenging would be the creation of an Internet directory service, perhaps

modeled after the telephone system or the X.500 network system then popular in Europe. It

wasn’t clear how this might work on the far flung Internet (in the end, after all, it never did

work), but the need for a directory was taken for granted. Second, they wanted to create a

kind of a “help desk” that would be responsive to the ever-burgeoning user community. The

desire to organize a project of this sort correlated with the rise of a User Services Area in the

IETF, and the introduction of the FYI category in the RFC series, a particular interest of Joyce

Reynolds.225 Finally, the IP registry and the root of the DNS had to be operated in a manner

least likely to impede the continuing growth of the Internet.

In 1992, the NSF issued a solicitation for a National Research and Education

Network Internet Network Information Services Center (NREN NIS). It specified that the

registry portion of the project would be run in accord with RFC 1174, once again

reconfirming IANA’s authority.226 By emphasizing the primacy of DARPA over DISA as

owner of the Internet’s key resources, the “streaming” of authority from the military was

proceeding as planned. And once again, elevating IANA elevated Postel as an individual

whose voice mattered.

One of the idiosyncrasies of the NSF’s proposal evaluation process in those days was

its mechanism for protecting the intellectual property of those who made submissions. All


Phone interview with Don Mitchell, February 10, 2003.227

Email from Rick Adams reprinted in David Farber, “IP: RE: CRADA – you can hear the rip (off),”

IP, December 24, 2001.228

proposals were kept secret, other than those that won. As Mitchell described it, “There was

no proposal that never received an award.”227 But each of the proposals that (never) arrived

for the new NREN NIS had the same problem. None offered the comprehensive “one-stop

shopping” that fulfilled all three of the NSF’s requirements.

As expected, NSI was seeking to continue on with the Registry Services portion of

the planned center. It is noteworthy that the company’s bid was written as if it were being

presented on behalf of the “Network Solutions/ISI Team,” rather than Network Solutions

alone. NSI’s bid also touted its staff’s ongoing experience working in an “integrated”

operational relationship with the IANA. This aspect of incumbency was not as cozy for NSI

as it may seem. Apparently every bid submitted to the NSF referred to a warm and fuzzy

relationship between the applicant and IANA, and implied that there would be some funds

redirected to ISI as part of a subcontracting relationship. NSI did have clear advantages,

however. The strongest may have been the experience accumulated by Williamson and his

team during the first transition of the DNS root service from SRI... experience that was hard

won and impossible to duplicate.

With regard to the Directory Services portion of the contract, however, NSI was

considered out of its league. The telecommunications giant AT&T had made a strong bid for

that part. But AT&T’s size raised complications. There was a concern that if a major ISP

were put in charge of distributing IP numbers to its competitors, the numbers might not get

distributed in a timely or reasonably priced manner.

An interesting but relatively sketchy bid came from UUNET, an important backbone

ISP known for a strong and talented engineering staff. As with AT&T, there were conflict-of-

interest concerns. Also, UUNET’s proposal for the registry portion of the contract included

a fee-based registration plan (at $1 per name), an idea that was still quite unpopular with some

Federal agencies.228


Phone interview with Don Mitchell, February 10, 2003.229

Another company, San Diego-based General Atomics (GA), made a proposal that

Mitchell deemed to be exceptionally strong in the Information Services area. Beyond the

“help desk” activities, its bid included intriguing proposals for building outreach services,

compiling various online and CD-based guides, and developing resource discovery services.

GA’s team was led by Susan Estrada, founder of an important regional ISP, the California

Education and Research Federation (CERFnet). Estrada had become a bit of a celebrity,

portrayed in the press as a model for women in the high-tech business world. As a member

of the FNC’s Advisory Committee she would have been quite familiar with the NSF and the

ways of Washington. She also had deep ties to the inner circles of the IETF community: She

had already served as an Area Director on the IESG and would soon be elected to ISOC’s

Board of Trustees.

Another key GA participant was Susan Calcari. She had a background in data and

voice network operations working with companies like Sprint, but had shown a talent for

public relations during stints at MERIT and the NSFNET. Over time she had developed a

specialization in data distribution for the academic community, an ideal supplement to the

front line help desk activity that the NSF needed.

GA’s proposal fell short with regard to Registry and Directory Services, but Mitchell

and Strawn were smitten by GA’s plan for handling Information Services. It “was so good,”

Mitchell recalled, “that everything went around it.”229 They decided to restructure the entire

project to make sure GA would win a role.

* * *

In the late autumn of 1992 Strawn arranged for a meeting that included the team

leaders for the top three contenders in each category – NSI, AT&T, and GA. The group

assembled at Iowa State University, where Strawn had held a series of faculty and

administrative positions, including Chair of the Computer Science Department. Each

company had two representatives. Williamson was there for NSI, Estrada and Calcari for

GA. As Mitchell tells it, the company reps were put in a room and given a day to develop a


Ibid. and Richard Sexton, “Re: InterNIC Change,” Domain-Policy, August 4, 2000.230

National Science Foundation, “Cooperative Agreement No. NCR-9218742,” Article 3. Statement

of Work, Paragraph G.231

Mueller (2002:102).232

scheme that would preserve the NSF’s desire for a unified interface. The result was the

concept of the InterNIC. The neologism may have been Calcari’s, but Estrada led the pitch

for it when Mitchell and Strawn came back to hear the results.230

Each company was then asked to resubmit a revised “collaborative cooperative

proposal” – an innovation that Mitchell claimed to have invented on the spot – reflecting the

new division of labor. There was a tremendous time pressure to get it all done by the end of

the year, but Mitchell encountered no serious resistance as he pushed the paperwork through

the Washington bureaucracy over the Christmas holiday period. All the contracts, called

“Cooperative Agreements,” were in place as of January 1st, 1993. They included a three-month

ramp-up period prior to an April 1st launch date. GA’s portion of the agreement included a

Key Personnel clause stipulating that Calcari would be designated to perform specific

services related to the planned “Infoscout” project.

NSI received a $5.2 million five-year, three-month award for the Registry Services

portion of the InterNIC. It was executed as a fixed-price contract, but subject to periodic

review and revision, with an option for a six-month ramp-down at the end. ISI was included

as a subcontractor, responsible for the .us top level domain, with Postel and Reynolds named

as managers. As before, domain names were to be distributed without cost. The “imposition

of user based fee structure” was expressly contemplated, however, pending

“direction/approval of the NSF Program Official.”231

The possibility of eventually imposing user fees was also mentioned in a news release

announcing NSI’s award. That part of the announcement included an important stipulation:

“Decisions will be implemented only after they have been announced in advance and an

opportunity given for additional public comment.”232 That promise was disregarded two years


National Science Foundation, “Cooperative Agreement No. NCR-9218742,” Article 3. Statement

of Work, Paragraph C.233

and nine months later, when Strawn presided over the commodification of the .com, .net, and

.org domains.

Though NSI and the NSF were listed as the sole parties to the Cooperative

Agreement, its text made unambiguous reference to IANA, as if IANA had the capacity to

exercise decisive, independent authority over registration policy. The description of the work

to be performed stated that NSI was obliged to “provide registration in accordance with the

provisions of RFC 1174,” the 1992 memo in which Cerf had effectively promulgated a new

lay of the land to the FNC. The text then recapitulated key wording from RFC 1174 which

asserted IANA’s authority. The thrust of the language, remember, was that the Internet

system “has employed a central Internet Assigned Numbers Authority [which] has

discretionary authority to delegate portions of [its] responsibility” for the allocation and

assignment of certain identifiers.233 The clear implication was that the legitimate execution

of the Cooperative Agreement depended, at least in some part, on IANA’s approval.

* * *

Once again Williamson had to engineer the physical movement of the root. During

the transition from SRI everything had been moved in one piece. Now he faced the challenge

of executing a true physical split of the DDN-NIC and the legacy ARPA-NIC. NSI arranged

to open a new facility in Herndon, Virginia, that would be devoted to running the InterNIC.

The other operations would be maintained at the old site in Chantilly, Virginia, where NSI

would continue to run the military’s IP addresses and the .mil domain space as a

subcontractor to GSI and DISA.

One of the most difficult challenges involved dividing the staff between the two

locations. Moving on to build the InterNIC was by far the more popular assignment. The

Internet was now generating significant public interest, and ARPA-NIC was going to be the

center of the action. Williamson would decide who would go where, which meant that he

would have to disappoint someone. It turned out as he expected. Those who remained with

the DDN-NIC “felt like they had been left behind.”


Scott Williamson, “RFC 1400: Transition and Modernization of the Internet Registration Service,”

March 1993.234

Personal interview with Mark Kosters, November 14, 2001.235

This time, Williamson was determined to head off political fallout from the IETF. To

prepare, he decided to take a “cleaner” approach, moving the root “before anyone realized

it.” Some might recognize his approach as the strategy called, “What you don’t know won’t

hurt you.”

Williamson published an RFC announcing the change well after the process had

gotten underway.234 His timetable promised there would be no interruption of service. He was

thus able to present the move to the IETF plenary as a fait accompli, and avoid much

of the personal harassment he had experienced during the first transition.

Don Mitchell wanted a surreptitiously early cutover for other reasons... it would save

money. The funding anniversary of NSF’s portion of the DDN-NIC was coming up a few

weeks before April 1st, so an early move spared the NSF the expense of a contract renewal.

Williamson was happy to accommodate.

* * *

The final cutover to the new root was executed by Mark Kosters, who had been hired

in 1991 as preparations began to transfer the DDN-NIC from SRI. Kosters was particularly

well qualified for this type of work, with an MS in Computer Science, and prior experience

setting up TCP/IP networks and server-based databases for a Navy subcontractor.

Expecting the Washington DC region might be socked in by a snowstorm for the

weekend, he cloistered himself inside the Herndon office beforehand. He would be busy, and

figured it wouldn’t matter if the streets weren’t dug out for a day or two.235

f. Running the Internet’s Boiler Room – 1993 and Beyond

NSI’s operational staff remained small through the DDN-NIC years and the first two

InterNIC years, with very little employee turnover. The staff faced a constant battle to build

systems that would enable them to meet future demand. It was as if they were fabricating

bigger and bigger buckets to bail water from a ship that never stopped springing leaks.


Ken Harrenstein, Vic White, “RFC 812: NICNAME/WHOIS,” March 1, 1982.236

When the DDN-NIC was moved in 1991, the queue for additions and modifications

in the domain name database was not long enough to fill the office whiteboard. Templates

were still being subjected to visual inspection, just as they had been at SRI. To head off the

kinds of problems that would come with growth, Kosters undertook a number of projects that

promised to deliver better security. The strategy was to improve the processes for identifying

and validating the source of requests. He began by developing a new WHOIS server,

replacing the NICNAME/WHOIS system that had been used at SRI since 1982.236 The new

WHOIS included a query/response database search tool that allowed anyone with Internet

access to determine whether a domain was registered, and to see the relevant contact and host

information. It also made it possible to see what domain names were associated with a

particular contact name, based on reference to an assigned lookup “handle.”
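The protocol underlying that tool was simple enough to sketch in a few lines. The following is a minimal WHOIS client in Python, following the conventions of the RFC 812 protocol family (a query line terminated by CRLF sent to TCP port 43, with the response read until the server closes the connection); the server and domain names shown are placeholders, not NSI’s actual hosts.

```python
import socket

def build_whois_query(name):
    """Format a query line as the protocol expects: the name plus CRLF."""
    return name.encode("ascii") + b"\r\n"

def whois(server, name, port=43, timeout=10):
    """Send one query to a WHOIS server and return its full text response."""
    with socket.create_connection((server, port), timeout=timeout) as s:
        s.sendall(build_whois_query(name))
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:  # the server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Hypothetical usage:
#   print(whois("whois.example.net", "example.com"))
```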

Shortly after NSI was awarded the registry portion of the InterNIC contract, Kosters

began work on Guardian, an automated authentication tool intended to protect against

unauthorized modifications of information in the system. Along the way, to keep pace with

the growth of country code entries, he also had to develop better software to generate the top

level zone file. Another critical part of the job involved making sure the machinery was up

to speed. Though well-designed software could – and did – last for years, hardware and

connectivity upgrades became a regular necessity.

Underestimating the demand for the Internet’s resources was a chronic problem of

the pre-Boom period. Nearly everyone in a decision-making position had made that mistake

at one time or another, and the staff at NSI had to shoulder the consequences. When DISA

transferred the DDN-NIC to GSI/NSI, for example, it had only provided enough funding to

support a 56K connection. Fortunately, a NASA program officer, Milo Madine, stepped up

to provide a high speed T1 link before the overburdened connection collapsed altogether.

Similarly, when the registry agreement was negotiated, everyone at NSI – as well as

Mitchell and Strawn at the NSF – assumed that growth in the demand for names would

eventually level off. Most of the expansion from 1991 through early 1993 was in the .edu


Interview with Mark Kosters, November 14, 2001.237

domain, after all, and there were only so many institutions of higher learning in the world.

The commercial side of the Internet boom had yet to show its full force, and no one could

envision the overwhelming demand for .com names that would result. (Years later, Postel

occasionally joked that he had known from the beginning how big the Internet would

become, but that he wasn’t allowed to tell anyone.)

The registry contract stipulated that NSI’s annual award could be renegotiated to

account for changes in direct costs, such as demand for NSI’s registry services. However, the

amount allocated to provide for indirect costs was capped, and therefore not renegotiable,

which led to unexpected difficulties. Having grown large enough as a result of the InterNIC

job to lose its qualification for 8(a) status, NSI lost its advantage in competing for other

government contracts. Before long, the InterNIC was NSI’s main remaining business.

This shrank the financial base the company relied on to cover the cost of corporate overhead,

leaving it with significant cash flow problems.

Demand for registry services had indeed exceeded expectations the first year, so the

NSF increased NSI’s award for the second year by about 10 percent to keep up with revised

estimates. But the company was in such dire straits that its expenditures were eventually

made subject to bank oversight, putting Williamson and Kosters in an even tighter squeeze

when they had to schedule the capital outlays for more bandwidth and new hardware. When

possible, they engineered things to scale, “by adding boxes,” expanding capacity

incrementally until the next overhaul was unavoidable.

With no money to spare for new hires, Williamson and the others kept pace by

putting in overtime. “During that time I was working 18 hour days 6 days a week nonstop,

sometimes seven days a week,” said Kosters. He was once spotted napping in his car after

working through the night. His whole life revolved around his job.

I lived fairly close to the office and I needed some form of exercise, so what I would do is ride my bike to work as much as I could. And I would park it in the hallway or my office... I didn’t have much of a life other than work.237


Elise Gerich, “RFC 1366: Guidelines for Management of IP Address Space,” October 1992. RFC

1466, published May 1993, had the same title.238

Kosters wasn’t alone in his devotion. Many evenings, even after a full day of dealing

with IP allocation hassles, Hubbard would login from home, helping out by processing

domain name registrations that needed visual inspection.

Mitchell became a fan of NSI’s staff, their work ethic, and the company as a whole.

He saw a blue collar virtue in the way NSI’s registry portion of the InterNIC was being

handled. For him it had become “the Internet’s boiler room.”

* * *

The NSI team inherited immediate responsibility for dealing with a serious but well-

understood problem: The designers of the Internet protocol had notoriously underestimated

demand for IP numbers. In the late 1980s there was growing concern that unused blocks of

address space would run out sometime before the middle of the 1990s unless extraordinary

measures were taken. The solution took shape as a three-pronged strategy – short, medium,

and long term. Two of those strategies had direct implications for Hubbard’s day to day

activities at the InterNIC.

When the risk of shortage was initially recognized, the logical reaction was to impose

tougher discipline on the allocation of any remaining IPv4 blocks. The first task, therefore,

was to apply this discipline within the context of the existing class-based system. Around

1990 the operators of the IP registry at SRI’s DDN-NIC moved to slow the pace of Class B

allocations in favor of the smaller Class C blocks. The guidelines of the allocation policy

were spelled out by Elise Gerich, a Merit employee who was a leading member of the IEPG

and who eventually joined the IAB. The final version of her guidelines was published as

RFC 1466 in May 1993, shortly after the InterNIC began operation.238

Gerich stressed the paradox of the legacy allocation regime. The Class A and Class

B spaces were at risk of exhaustion, but the allocations which had been made were clearly

underutilized. On the other hand, increasing use of Class C blocks threatened the

performance of the routing machinery that the Internet’s discrete networks relied upon to

interconnect with each other.


Phone interview with Kim Hubbard, February 17, 2003.239

The routers were able to generate dynamic records of the quickest path forward to

known destinations across the system. The entries were expressed as ranges of IP numbers.

The wider the range, and the more densely the actual destinations were aggregated within a

given range, the more likely a router could stay up to date. Unfortunately, the growing

reliance on Class C blocks led to an explosion of information in the routing tables.

Improvements in technology promised to increase the capacity of the tables over time, but

not fast enough to keep pace with such granular demand for Class C blocks. These

limitations compelled a common interest in the use of densely-populated contiguous blocks.
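The arithmetic of aggregation can be seen with Python’s standard ipaddress module. In this sketch (the address ranges are invented for illustration), eight adjacent Class C-sized blocks aligned on a common boundary collapse into a single routing-table entry, while scattered blocks each cost an entry of their own.

```python
import ipaddress

# Eight adjacent /24 (Class C-sized) blocks, aligned on a /21 boundary.
contiguous = [ipaddress.ip_network(f"192.24.{i}.0/24") for i in range(8)]

# Four /24 blocks scattered across unrelated ranges.
scattered = [ipaddress.ip_network(f"192.{i}.0.0/24") for i in (24, 40, 56, 72)]

# Contiguous, aligned blocks merge into a single route advertisement...
print(list(ipaddress.collapse_addresses(contiguous)))
# -> [IPv4Network('192.24.0.0/21')]

# ...while scattered blocks each remain a separate routing-table entry.
print(len(list(ipaddress.collapse_addresses(scattered))))
# -> 4
```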

When Gerich set out the rules for restricting the allocations of the remaining space,

she tied it to a plan to “forward the implementation” of RFC 1174. In particular, she meant

IANA’s authority to designate an IR (Internet Registry) as the “root registry” for the IP

number space, and to make subdelegations to regional registries. Such registries were already

active in Europe and Asia, but there was no counterpart as yet for the Americas and Africa.

Drawing on this presumed authority, Gerich stated clear criteria for future allocations

that would be made by those registries. Organizations applying for a Class B block had to

have at least 4096 hosts at the time of application; their networks had to include at least 32

existing or planned subnets. Gerich also provided algorithms for allocating Class C blocks,

stressing that multiple allocations to single organizations must be arranged as contiguously

as possible.
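The screening logic implied by Gerich’s criteria can be restated as a short decision procedure. This is our own sketch, using only the thresholds quoted above (4,096 hosts and 32 subnets for a Class B; 256 addresses per Class C block); the function names are invented.

```python
import math

CLASS_B_SIZE = 65536  # 2**16 addresses in a Class B block
CLASS_C_SIZE = 256    # 2**8 addresses in a Class C block

def qualifies_for_class_b(hosts, subnets):
    """Apply the RFC 1466-era thresholds described in the text."""
    return hosts >= 4096 and subnets >= 32

def class_c_blocks_needed(hosts):
    """How many Class C blocks cover the request (to be kept contiguous)."""
    return math.ceil(hosts / CLASS_C_SIZE)

# A 500-host, 4-subnet applicant is steered to two Class C blocks
# rather than tying up a 65,536-address Class B.
print(qualifies_for_class_b(500, 4))   # False
print(class_c_blocks_needed(500))      # 2
```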

Since the InterNIC was designated as the IR, the duty to enforce this newly articulated

allocation discipline in the Americas devolved to Kim Hubbard at NSI. From her perspective,

things had been fairly lax before she took over the reins of the IR from SRI. As she put it,

“Everybody was getting a Class B [a block of 65,536 IP numbers] whether they had five

computers or 5,000.” She was determined to turn the situation around.239

Demand for IP numbers seemed small at the start of Hubbard’s tenure as the end-of-

the-line gatekeeper, coming primarily from the military and an occasional university. But

even while NSI was still running the DDN-NIC, she required that applicants answer


Claudio Topolcic, “RFC 1367: Status of CIDR Deployment in the Internet,” October 1992. RFC

1467, published May 1993, had the same title.240

questions about the size of their planned network, and how those networks would be used.

When academic growth began driving the system, she added questions about how many

students, faculty, and staff would become users. For businesses, she wanted estimates of how

many staff and how many customers would be supported.

Hubbard was behaving as a gatekeeper might be expected to behave, setting out

criteria which could be useful in determining who may or may not be allowed through her

gate. This early process of “hurdle-building” was just a taste of what was to come.

* * *

The medium-term solution to the address-scarcity problem required movement

toward a new system known as Classless Interdomain Routing (CIDR). Whereas the old

system used only the first three bits to designate aggregates, the new approach exploited the

full range of binary digits in the 32 bit long IP address. The result was that it would be much

easier to scale an allocation of IP addresses to meet an organization’s actual needs. This

would permit an even stingier discipline within the allocation regime.
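The difference in granularity is easy to quantify. In this sketch (our own arithmetic, neglecting reserved network and broadcast addresses), an organization needing about 3,000 addresses receives a /20 block of 4,096 addresses under CIDR, where classful rules would have pushed it toward a full 65,536-address Class B.

```python
import math

def cidr_prefix_for(hosts):
    """Shortest-fitting prefix length for a block covering `hosts` addresses."""
    return 32 - math.ceil(math.log2(hosts))

def block_size(prefix):
    """Number of addresses in a block with the given prefix length."""
    return 2 ** (32 - prefix)

print(cidr_prefix_for(3000))   # 20 -> a /20 allocation
print(block_size(20))          # 4096 addresses under CIDR
print(block_size(16))          # 65536 -- the old Class B alternative
```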

Two RFCs describing the new policy were published as companions to the ones

written by Gerich.240 The author was Claudio Topolcic, an MIT graduate who had worked

for an extended period at BBN prior to joining CNRI and serving as an IESG Area Director.

Unlike Gerich, Topolcic did not cite IANA or RFC 1174 as a source of authority. One might

argue that he did so indirectly by referring to her RFCs, but his own wording put a distinct

spin on the question. Though he mentioned that the new approach had been developed by a

“consensus” of the “global Internet community... cooperating closely in such forums as the

IETF and its working groups, the IEPG, the NSF Regional Techs Meetings, INET,

INTEROP, FNC, FEPG, and other assemblies,” he stressed the leadership of the US

government in coordinating the shift.


Claudio Topolcic, “RFC 1467: Status of CIDR Deployment in the Internet,” May 1993. The cited

text was not included in the companion RFC 1367. See ibid.241

Recognizing the need for the mid-term mechanisms and receiving support from the Internet community, the US Federal Agencies proposed procedures to assist the deployment of these mid-term mechanisms.241

No agencies were specified, but this more-than-implicit ratification provided an

assurance to the business community that investments in routing technology compatible with

CIDR equipment would be likely to pay off. The RFC described implementations of

compliant standards (version 4 of the Border Gateway Protocol) naming companies such as

3COM, ANS, BBN, Cisco, and Proteon.

* * *

As an officer of the InterNIC, Hubbard was empowered to act as the duly appointed

enforcer of the new CIDR allocation policy for all of North America. She and her staff would

have to pay attention not only to how much space was being given out and to whom, but also

to the format of the allocation. CIDR’s granularity made it that much more important to

ensure that the blocks weren’t configured in ways that would needlessly overburden the

Internet’s routing tables. On the other hand, someone designing a company’s local or wide

area network (LAN or WAN) might have good reasons to segment it into distinct but

relatively underpopulated blocks, presuming enough addresses were available. This created

a constant tension between companies asking for allocations that would be optimal for their

own purposes, and the need to conserve resources across the Internet as a whole.
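The routing-table concern can be made concrete. In the sketch below (again using Python's ipaddress module, with arbitrary example blocks), contiguous allocations collapse into a single advertised route, while scattered allocations of the same total size each claim their own slot in every backbone router:

```python
import ipaddress

# Four contiguous /24 blocks allocated to one organization...
blocks = [ipaddress.ip_network(f"198.51.{i}.0/24") for i in range(100, 104)]

# ...can be advertised to the rest of the Internet as a single route:
aggregated = list(ipaddress.collapse_addresses(blocks))
print(aggregated)          # one /22 covering all four blocks

# Four scattered /24s cannot be collapsed at all, and would occupy
# four separate entries in every backbone routing table:
scattered = [ipaddress.ip_network(f"198.51.{i}.0/24") for i in (1, 50, 120, 200)]
print(len(list(ipaddress.collapse_addresses(scattered))))   # 4
```

This is why the format of an allocation, and not just its size, mattered to Hubbard's staff.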

As a rule of thumb, an entity requesting a block of addresses had to show that 25

percent of them were to be used immediately and that 50 percent would be in use by the end

of the first year. A policy that seemed reasonable to Hubbard struck others as intrusive and

overly confining. Even worse, the imposition of the new restrictions appeared to lock in the

competitive advantages of those who had received allocations under the earlier, more liberal

regime. As she recalled,

It was pretty rough for a long time because if you said no to certain companies, you could basically cost them a lot of money. They had their plan


Phone interview with Kim Hubbard, February 17, 2003.242

See, for example, Sean Doran, “CIDR Blocks at Internic Going only to the ‘big’ players??,” com-priv,

January 30, 1995, http://www.mit.edu:8008/MENELAUS.MIT.EDU/com-priv/19006.243

Phone interview with Kim Hubbard, February 17, 2003.244

for numbering their network a certain way and they would come to us and say, “We need address space.” And I’d say, “Well I’ll give you half of that.” They’d have to rework everything... We always gave them as much as they needed, you know, but... network operators wanted more than they really needed because it’s always easier to have more.242
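The 25/50 percent rule of thumb described above lends itself to a simple check. The function below is a speculative sketch of the arithmetic a reviewer like Hubbard would have applied, not a reconstruction of any actual InterNIC tool:

```python
def justified_request(requested, in_use_now, in_use_year_end):
    """Apply the rule of thumb: 25 percent of the requested block in
    immediate use, 50 percent in use by the end of the first year."""
    return (in_use_now >= 0.25 * requested
            and in_use_year_end >= 0.50 * requested)

# A company asking for a /22 (1,024 addresses) with 300 hosts today
# and 600 projected for year's end would qualify...
print(justified_request(1024, 300, 600))   # True

# ...but the same plans behind a request for a /21 (2,048 addresses)
# would be told, as Hubbard put it, "I'll give you half of that."
print(justified_request(2048, 300, 600))   # False
```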

New entrants were up in arms, considering themselves to be the victims of an unfair

policy. There were long online discussions with headings like, “CIDR Blocks at Internic

Going only to the "big" players??”243 Hubbard was becoming a human lightning rod. “You

used to get up and give a presentation at IETF or NANOG [the North American Network

Operators Group] and when you’re finished, there is a mile long line of people waiting at the

mikes ready to attack you.” She drove on, despite the resentment. “I took a lot of grief,

believe me. I was considered like the most hated woman on the net..... But I understand it....

That’s just the nature of the beast.” She also took on the arduous task of retrieving unused

address space, going back over the distribution of large blocks and approaching the owners

to see what could be gleaned from them.244

* * *

The policies introduced in 1993 were designed to buy time, husbanding the IPv4

address space until the long-term solution was ready... a new, more amply supplied Internet

Protocol – IPv6. The capacity to promulgate all three strategies depended on common

acceptance of the vague, ambivalent notion that both the Internet community and the US

government exercised authority over the allocation of the Internet’s critical resources. No one

(other than Tony Rutkowski, perhaps) yet insisted on seeing the statutes that made this so.

The conventional wisdom was derived from the vague, ambivalent formulations of RFC

1174, and people had been content to live with that for several years.
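The scale of that long-term fix is worth pausing over. A two-line calculation (illustrative, using only the bit widths of the two protocols) shows why IPv6 promised to dissolve the scarcity the 1993 policies were rationing:

```python
# IPv4 addresses are 32 bits long; IPv6 addresses are 128 bits long.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(ipv4_space)                  # 4294967296 -- about 4.3 billion
print(ipv6_space // ipv4_space)    # 2**96 entire IPv4-sized spaces
# Every single IPv4-sized chunk of the IPv6 space could itself be
# carved into 2**96 pieces -- which is why, under IPv6, conservation
# pressure on allocations was expected to all but disappear.
```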


IAB and IESG, “RFC 1881 IPv6 Address Allocation Management,” December 1995.245

Ibid.246

By the close of 1995 that contentment was gone. The practice of happy

equivocation had worn out and broken down. The atmosphere had grown so contentious that

it seemed as if the whole edifice of trust and collegiality was at risk.

Whereas the short and mid-term solutions delivered in 1993 by Gerich and Topolcic

perpetuated RFC 1174's ambivalence about the source of authority over the address space,

the first announcement of allocation policy for IPv6 drew an unmistakable line in the sand.

This time there would be no doubt. IANA would be in charge at the top of the chain.

Published in December 1995 as “IPv6 Address Allocation Management,” RFC 1881

named IANA as the primary agent of the public good on the Internet, and the authoritative

enforcer of its discipline.

The IPv6 address space must be managed for the good of the Internet community. Good management requires at least a small element of central authority over the delegation and allocation of the address space. The Internet community recognizes the Internet Assigned Numbers Authority (IANA) as the appropriate entity to have the responsibility for the management of the IPv6 address space....245

A terse page and a half in length, RFC 1881 listed only the IAB and the IESG as

authors; no individuals or US government agencies were mentioned. Like RFC 1174, RFC

1881 declared IANA’s power to delegate the registry function. This time, however, that

power was underscored by reserving the right of recision.

If, in the judgement of the IANA, a registry has seriously mishandled the address space delegated to it, the IANA may revoke the delegation, with due notice and with due consideration of the operational implications.246

RFC 1881 was no coy document. It stipulated that IANA would have the power to

take an IPv6 registry delegation from one party and redelegate it to another. By appropriating

the right of recision, or at least by claiming to do so, the technical community’s old guard

was asserting preeminent, exclusive control over the Internet’s future numbering system.

This power was justified on the basis that it would be exercised “for the good of the Internet


community.” It wasn’t hard to read between the lines. Future versions of name and

numbering resources would have to be managed for the good of the community. It had to be

stated this way, strongly, because of the unhappy and growing perception that the current

version had fallen under the control of selfish parochial interests.

There was more. Given that the Cooperative Agreement and Gerich’s RFCs cited

IANA’s delegation authority, wouldn’t IANA also have the power to revoke the DNS and

IPv4 delegations it made to NSI? No explicit claim was made to that effect in any RFC, but

recent events had raised questions about whether IANA possessed that power as well.

* * *

The language of RFC 1881 was stern because the mood of the community was dark.

The end of 1995 was a season of intense political rupture and polarization. With the Internet

boom clearly underway, the demand for names and numbers was exploding. Any newcomer

seeking visibility for a host site depended on accurate representation in the DNS, and droves

of people were more than eager to pay the ticket of admission. Events had proven that NSI

was indeed the Internet’s boiler room, but a deal recently negotiated between NSI and the

NSF had turned the boiler room into a mint. As far as the Internet’s old guard was concerned,

NSI had engineered a power grab. The old guard’s responsibility was to grab back. It was time

to determine once and for all who was the captain of the ship.

g. The Goldrush Begins – 1994

Formal public discussion of whether NSI should begin charging fees for domain

name registrations began in 1994. There were good arguments for changing the policy. The

pace of growth in new registrations, especially in .com, far exceeded expectations, stretching

the small company’s resources. Applications had increased from 300 per month in early 1992

when the Cooperative Agreement came into effect to well over 2000 per month by

September 1994. Historic advances in computer technology – both hardware and software

– presaged an explosion of commercial registrations.

That explosion was triggered by the rise of the World Wide Web – an epochal leap

forward in human culture. Credit goes to Tim Berners-Lee, a British-born physicist who


HTML’s immediate predecessors included the Standard Generalized Markup Language (SGML),247

also known as ISO 8879, and Adobe’s proprietary Portable Document Format (PDF). They provided machine-independent reading and printing capabilities.

The hypertext concept was also advanced by Ted Nelson. Both Nelson and Engelbart drew248

inspiration from Vannevar Bush’s article “As We May Think,” Atlantic Monthly, July 1945,

http://www.theatlantic.com/unbound/flashbks/computer/bushf.htm. For more on Engelbart, see Ken Jordan’s

interview in “The Click Heard Round the World” Wired Magazine. January 2004, 159-61,

http://www.wired.com/wired/archive/12.01/mouse.html. See also Marcia Conner, “Engelbart’s Revolution,”

http://www.learnativity.com/engelbart.html.

created its key elements in 1989 and 1990 while working at the European physics laboratory in Geneva –

CERN. Using a NeXT computer to develop collaborative research tools, Berners-Lee

leveraged several parts of the Internet Protocol suite, the DNS, and introduced several

innovations of his own, including the hypertext markup language (HTML) and the hypertext

transfer protocol (HTTP). Hypertext was one of the most radical dreams brewed at247

Engelbart’s Augmentation Research Center at SRI. Berners-Lee brought it to reality.248

Hypertext refers to platform-independent documents that contain machine-readable

references to other documents – features we now think of as clickable links. Those references

could be executed through the use of an address identifier called a Uniform Resource Locator

(URL). Berners-Lee wanted URLs to be simple enough to write on napkins... a favored

communication medium among engineers. URLs are inherently hierarchical expressions;

domain names are an essential part of the string of characters which constitute the reference.
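The hierarchy built into the URL can be teased apart mechanically. The sketch below uses Python's standard urllib.parse on a URL drawn from the period; the decomposition shows why the domain name sits at the kernel of the expression:

```python
from urllib.parse import urlparse

url = "http://www.tidbits.com/macnews.html"
parts = urlparse(url)

print(parts.scheme)    # http -- the protocol-identifying prefix
print(parts.netloc)    # www.tidbits.com -- server name plus domain
print(parts.path)      # /macnews.html -- the document itself

# Stripping the conventional "www" server name leaves the kernel
# that readers quickly learned to focus on:
kernel = parts.netloc.removeprefix("www.")
print(kernel)          # tidbits.com
```

As the text goes on to describe, it was exactly this kernel that ordinary users came to treat as the address.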

There were Web servers running in Europe by early 1990 and in the United States by

1991. Mosaic, the first graphical web browser, was released in January 1993. After just one

year of free distribution Mosaic attracted nearly twenty million users. Expressions like

http://www.tidbits.com/macnews.html quickly became part of the cultural landscape.

When the concept was new and unfamiliar, people who were exchanging URLs with

each other carefully spelled them out, one character after the other. Long strings were

meticulously pronounced "aych, tee, tee, pee, colon, slash, slash, double-yoo, double-yoo,

double-yoo, dot, em, see, eye, dot, en, ee, tee, " and so on, even in speeches and over the

radio. People soon caught on, however, and learned to disregard the protocol-identifying

prefixes (http://), server names (www), and file-type suffixes (.html) which were nearly


Mueller (2002: 107-14, particularly 109).249

Joshua Quittner, “Billions Registered,” Wired Magazine, October 1994,250

http://www.wired.com/wired/archive/2.10/mcdonalds_pr.html.

always the same. That shift in consciousness allowed people to focus on domain names as

the kernel of addresses, and to navigate using familiar expressions like “mci dot net.”

The “Webification” of domain names, in Mueller’s terms, “was the critical step in

the endowment of the name space with economic value.” As more people decided they249

wanted to have a presence on the web rather than just "surf" it, demand for those valuable

kernels increased.

* * *

NSI added one new employee in mid-1994 to assist with domain name registrations

and another to work on IP addresses. That still wasn’t enough to get ahead of demand. The

staff was often working 10 to 12 hour days, or even more, yet they were steadily falling

behind in processing requests. As the backlog accumulated, the .com domain was the source

and the site of the worst problems. That suffix was proving to be the most desirable of all.

Newcomers were more likely to look for mci.com rather than mci.net, and the principle held

for most other business names.

The Internet was becoming hugely popular, and NSI, operating under a fixed budget

with no other way to raise revenue, was straining as a result. The pressure to keep up with

the number of requests for additions and modifications wasn’t the only problem. Disputes

were beginning to crop up, raising concerns about NSI’s legal liability and the related

financial risks to the company.

Earlier in 1994, a journalist named Josh Quittner prompted a surge of interest in

domain names (and in himself) through a New York Times Op-Ed piece titled "The Great

Domain Name Goldrush." He taunted the owners of the McDonald’s restaurant chain for

neglecting to register mcdonalds.com. Quittner had registered the name to dramatize the250

point, and then publicly invited company representatives to contact him at

[email protected] to talk things over. The episode was resolved more or less politely

when Quittner transferred the name to the corporation in exchange for a $10,000 charitable


Richard Sexton, “Talk given June 20, 1997 at a CAIP/Industry Canada DNS workshop in Ottawa,“251

http://www.open-rsc.org/essays/sexton/caipcon97/.

The policy also specified that registrants could request more than one name, but there would also252

have to be at least one server set up to host any name registered.

See Electronic Frontier Foundation, “‘Legal Cases - Knowledgenet v. NSI (InterNIC), Boone, et al.’253

Archive,” http://www.eff.org/legal/cases/Knowledgenet_v_InterNIC_et_al/.

donation to an elementary school. But the incident served to put businesses on notice that

domain names had value, and that it would be wise to secure that value as soon as possible.

The immediate effect of his article was dramatic: the pace of registrations quadrupled to

8,000 names per month.251

Many of the companies that registered names under .com requested registrations in

.net as well, hoping to cover all the bases. Many attorneys and corporate officers contacted

Williamson directly, requesting legal clarification of NSI’s domain name registration policy.

His regular answer was “first come/first served, no duplicates” – a position accepted as an

honored doctrine in the technical community, but not very satisfying elsewhere. Its252

simplicity offered no comfort to owners of famous coined names like “Xerox” or “Kodak”

who felt that they possessed some prior standing in “meatspace” which should be

automatically honored in cyberspace.

That same summer, Williamson was notified of a dispute over the name

knowledgenet.com. An Illinois-based computer consulting company had acquired a

trademark on the word KnowledgeNet in January. A Delaware-based research consultant

registered the name via a Massachusetts-based ISP in March. The implication was that if253

NSI didn’t transfer rights to the domain to the trademark owner, the company would be

named in a lawsuit. With NSI’s budget already taxed so thin, this was a dreaded prospect.

There was no existing case law or other sort of legal precedent available, and NSI’s attorney

at the time was not a specialist in trademark matters. He based his defense on jurisdictional

issues. It wasn’t immediately clear whether the legal fees would come out of NSI’s budget,

or whether the NSF would subsidize the expense.


The costs of running the InterNIC registry were rising across the board. A great deal

of human processing was needed to ensure that applicants were properly qualified to register

names in the educational, commercial, organizational, and network top level domains. It

might take as little as four or five minutes or so to process a new name, which meant that one

employee might be able to process about 100 names on a good day. But it only took one or

two complicated transactions to clog up the queue. Modifications had to be expedited as

well, and these also took time.
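The arithmetic behind that estimate is straightforward. A short sketch, using the five-minute figure given above:

```python
# At roughly five minutes of human processing per registration, one
# employee working an eight-hour day tops out at just under 100 names:
minutes_per_day = 8 * 60
per_name = 5
print(minutes_per_day // per_name)   # 96

# A single complicated case consuming an hour displaces a dozen
# routine registrations from the day's queue:
print(60 // per_name)                # 12
```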

* * *

1994 had also been a busy year for the addition of new country codes, a rather

involved process on Postel’s end. He would verify the responsiveness of the applicant’s

machinery before directing Kosters to add the proper entries to the root zone. Moreover, the

coordination with overseas network operators could be complicated by everything from

expensive phone service to language barriers and time zone differences. The ISO 3166-1 list

included more than 250 eligible country codes. Nearly half had been entered by 1994... most

of those since 1992. Each addition to the root was like a new Atlantis rising up within the

online sea.

In the early years of the DNS, far more often than not, country codes were added for

relatively developed parts of the world, and the delegations often went to individuals who

were familiar in the engineering community, perhaps because of educational ties or

conference attendance. The later additions tended to be countries where Internet connectivity

was generally unreliable, if not unavailable. In such cases, the country code domains were

configured to use host proxies sited outside the national borders of the corresponding nation.

Moreover, the country code community as a whole had become too large to buttress the

formal delegation process with resort to collegial and informal ties. Growth in the TLD space

exposed a number of problems that needed carefully prescribed solutions: determining who

was a legitimate contact for a country; settling contested claims to the same country code;

creating fair procedures for revoking a badly managed domain delegation; and handling

transfers between successive managers.


John Klensin, “RFC 3071: Reflections on the DNS and RFC 1591,” February 2000. In RFC 1591,254

Postel cited Klensin as a key contributor.

Jon Postel, “RFC 1591: Domain Name System Structure and Delegation,” March 1994.255

DNS issues were heating up, but there hadn’t been a comprehensive effort to update

TLD policies since RFC 920, ten years before. It was an opportune time for Postel to clarify

the rules governing country codes and to flesh out other matters. The instrument for this

restatement was RFC 1591, “Domain Name System Structure and Delegation.” Among the

most famous RFCs of them all, it was ultimately lauded as one of Postel’s “masterpieces.”254

If there was any single guiding doctrine to the memo, it could be called Postel’s

“delegation motif.” Postel generally liked any approach that promised to fix a problem by

delegating responsibility to someone who had the most immediate interest in creating a

solution that worked.

Besides sketching out some very basic technical requirements for operating a

domain, he stressed that TLD managers should treat all requests for names in “non-

discriminatory fashion,” showing no preferences between academic, business or other kinds

of users, and no bias toward different kinds of data networks, mail systems, protocols, or

products. (Nevertheless, like many in the IETF old guard, Postel believed academics and

businesses were more likely to serve the interests of the community than governments or the

government-sanctioned phone monopolies known as PTTs.)

In the text, Postel reasserted IANA’s standing as the “overall authority for the IP

Addresses, the Domain Names, and many other parameters, used in the Internet.” As255

usual, nothing was said about how this power was established. Later on in the document,

however, Postel made a remarkable statement about the duties and responsibilities of TLD

managers (actually, of all domain name registrants, since “the requirements in this memo are

applied recursively”). It provided a telling insight into how he justified his own position as

the Internet’s ultimate gatekeeper.

These designated authorities are trustees for the delegated domain, and have a duty to serve the community. The designated manager is the trustee of the top-level domain for both the nation, in the case of a country code, and the global Internet community. Concerns about "rights" and "ownership" of


Ibid.256

Ibid.257

domains are inappropriate. It is appropriate to be concerned about "responsibilities" and "service" to the community.256

In asserting what was “appropriate,” Postel was making his own loyalties plain.

Primary allegiance belonged to “the global Internet community” rather than a particular

national community. He wanted the country code managers – all of whom he had designated

– to join him in fulfilling that allegiance. Of course, there was no guarantee that they would

do so. A manager’s personal loyalties might be innately local, and nothing could change that.

Once he or she had proved technical and operational competence, no oath of fealty was

required. And even if a manager’s preferences were the same as Postel’s, if forced to choose

between the interests of “the Internet” and the state someday, he or she might bend in the

face of a government’s coercive powers (as would Postel, eventually). But Postel had now

explicitly stated that any gatekeeping powers within the global root – even his – depended

on subordination within a higher frame of legitimacy. If Postel’s ultimate responsibility was

to the Internet community, so was theirs.

How then, as the legitimate superior authority, should he exercise

power when things were not proceeding smoothly? Postel recognized that, by asserting

jurisdiction over the TLD space on behalf of “the community,” he also needed to create a

formal dispute resolution system capable of adjudicating any complaints sent his way. He

named one: “The Internet DNS Names Review Board (IDNB), a committee established by

the IANA, will act as a review panel for cases in which the parties can not reach agreement

among themselves. The IDNB's decisions will be binding.” Yet no such committee was257

ever created. Allowing this loose end to persist was a momentous error.

If Postel ever hoped to establish a degree of formal authority over the Internet’s core

resources beyond the webs of personal fealty he had accumulated during the rise of the DNS,

he needed stronger evidence of his right to say, “The buck stops here.” Being a great role

model was not enough. Creating a review panel like the IDNB would have been a costly


effort up front. It would have required taking on far more responsibility (with its associated

troubles and risks). But doing so then, when Postel and his colleagues still had great freedom

to pursue their own initiatives, would almost certainly have helped their cause later on.

Given the diverse and ultimately vague sources of Postel’s authority at that time

(Mueller called it “amorphous”), if Postel had actually followed through and constituted a

subordinate agent of IANA as the final arbiter in a dispute resolution chain, he would have

augmented IANA’s standing with a new manifestation of formalism. After all, since IANA’s

authority over core resources like the root was not delegated by a superior body (sketchy

“charters” from the IAB notwithstanding), that authority must necessarily have flowed from

the bottom up. Certainly, that is how Postel and his colleagues preferred to see things.

Therefore, any claim that he represented the interests of the global Internet community would

have been more firmly grounded if he could prove he served at the pleasure of that

community. RFC 1591's clarification of root policy provided an opportunity.

The relative standing of a ruler depends on signs of consent by the ruled. In some

communities a superior’s power is ratified by resort to plebiscites, oaths, or the kissing of

rings. On the Internet, up till then, it was essentially a matter of saluting. In practical terms,

this involved pointing one’s DNS resolvers to IANA’s root constellation, accepting simple

commands, following the RFCs, accepting protocol assignments, etc. Creating formal

procedures to handle specific types of disputes would have gone a step further. This would

have been tantamount to establishing a jurisdiction within which parties may come forward

to plead their case and ask for justice... a much more sophisticated demonstration of consent.

Creating a court-like body which gave Postel the chance to play Solomon before flesh and

blood members of the Internet community would have counted for a lot. Social institutions

with good dispute resolution mechanisms scale better over time than those without.

At the time RFC 1591 was published Postel’s standing in the Internet community was

so high it would have been relatively easy for him to create an IDNB review panel by fiat.

That he did not do so can be attributed (speculatively) to three factors. First, he was already

overburdened with work. No one pressed him to create the panel; more immediate demands

got his attention. In other words, he simply dropped the ball.


This wording draws from John Klensin’s reflections in RFC 3071, Section 2.258

Second, even though creating a formal dispute resolution mechanism offered real

political benefits, it seemed to contradict Postel’s traditional preference for solutions favoring

the delegation of authority. According to that principle, which could be called the “delegation

motif,” trustees of the system would presumably do the right thing because they had an

inherent interest in getting the right thing done. The implication of establishing and

delegating a domain, therefore, is that the managers who receive the delegation be able to

function effectively without intervention or oversight. But this model of virtuous self-258

governance was inherently at odds with the notion of a quasi-legal backstop like an IDNB.

Though the technical design of the DNS was a top down hierarchy, its social structure most

certainly was not. Domain managers were granted full autonomy. Collaboration with IANA

was voluntary. That is why Postel felt obliged to exhort the managers to follow his own

model and consider themselves responsible to the served public.

Taking that next step... imposing a layer of gatekeepers on a world where reliance on

guiding principles had always been considered good enough... would have seemed

superfluous to some and dangerous to others. The managers of modern nation states have a

similar problem, except that maintaining a hierarchically organized technical system does not

rank as a top priority. Establishing the rule of law is a tricky business.

Third, and in keeping with the delegation motif, the person whose interests would

have been best served by the creation of an IDNB would have been Postel himself... the

incumbent benevolent dictator/gatekeeper of the DNS. For the time being he just kept

handling dispute resolution matters alone. The IDNB was effectively a committee of one.

The system Postel set up for managing the .us country code reflected his faith in the

delegation motif. The goal was a rich, broadly filled-out namespace matched to

US political geography. It had a two-letter second-level domain for each of the states and the

District of Columbia. Cities were under states at the third level, and registrants at the fourth,

resulting in the possibility of names like ibm.armonk.ny.us. There was also a parallel scheme

of subcategories below the second level for schools, colleges, libraries, and government


RFC 1591, Section 2.259

agencies. The heart of the approach was the belief that responsible agents across the country

would step forward and provide registration services for their subordinate communities, in

harmony with Postel’s taxonomy. In fact, it didn’t work out that way. Local agents were

often hard to find or unreliable. Registration in .com, using NSI’s centralized and better

managed system, was far more convenient.
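Postel's taxonomy can be sketched as a simple name composer. The function below merely illustrates the naming pattern (the full scheme was specified in RFC 1480); it does not model any actual registration machinery, and the example name is the one from the text:

```python
def us_locality_name(registrant, city, state):
    """Compose a fourth-level name under the .us locality scheme:
    registrant at the fourth level, city at the third, state at the
    second, all beneath the .us country code."""
    return ".".join([registrant, city, state, "us"]).lower()

print(us_locality_name("ibm", "armonk", "NY"))   # ibm.armonk.ny.us
```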

* * *

As the opportunity to create a names review board slipped by, so did an early chance

to head off the crisis in .com. Recall that, in RFC 920, Postel did not describe the purposes

of the generic top level domains. In 1591 he did. He was well aware of the growing backlog

at NSI, and of the widespread fears that the .com domain was on the verge of becoming

a morass. The new description of that category provided an insight into his views about how

to deal with its problems.

This domain is intended for commercial entities, that is companies. This domain has grown very large and there is concern about the administrative load and system performance if the current growth pattern is continued. Consideration is being taken to subdivide the COM domain and only allow future commercial registrations in the subdomains.259

In keeping with his delegation motif, Postel apparently thought that dividing .com

into some industry-delimited subcategories might inspire corresponding groups to collaborate

in establishing subdomain registries for their own purposes. It was a reasonable idea, but –

like the IDNB – Postel let it slide. For the short term, the easier approach would be to let the

NSF throw a little more money at the problem so that the workers at NSI could keep their

heads above water.

* * *

During the InterNIC’s second year of operation, the NSF contributed about a million

dollars for NSI’s portion. Covering a third year of NSI’s costs would require at least twice

that amount, even without any litigation headaches. After that, if trends held, massive

infusions of capital would be necessary. The NSF had two obvious options: Either find extra


IEEE-USA, “Meeting Summary Report: The National Science Foundation Workshop on Name260

Registration for the “.COM” Domain,” September 30, 1994. Archived at http://www.dnso.org/dnso/

dnsocomments/comments-gtlds/Arc00/msg00005.html.

funds within the government, or amend the Cooperative Agreement in a way that would

support NSI’s operations by charging fees for domain name registrations. The second was

obviously more attractive. Clearly, at some point in the future NSF would be expected to

drop its support obligations altogether. By now, the lion’s share of subsidized registrations

were for commercial entities that could easily afford to pay a small fee. The Internet’s

backbone was already well on its way to privatization; why not the registry?

As of 1994, other than the soon-to-be-spun-off NSFNET, only a few elements of the

Internet remained dependent on federal sponsorship. Most notable was funding for: 1) the IETF meetings; 2) Postel’s IANA and RFC Editor functions; and 3) the InterNIC. The

momentum toward privatization seemed inevitable across the board, but especially for the

registry, which had resources to trade. A self-supporting registry might still require oversight,

just as CCIRN, the FEPG, and the IEPG were providing oversight for the routing system.

ISOC seemed like a reasonable candidate to fill that role.

h. IEEE Workshop – September 1994

With the summer drawing to a close, Stephen Wolff and Don Mitchell arranged for

a day-long workshop to focus on registrations in the .com domain.260 It was held in the Washington offices of the IEEE on September 30, 1994. IEEE was asked to assist because

the organization had accumulated experience in allocating unique address ranges to

manufacturers of Ethernet-compliant devices (such as those now used in most personal

computers). There were only about a dozen participants, including Wolff, Mitchell,

Williamson, and Rutkowski, who was still Executive Director of the Internet Society. Other

government participants included Robert Aiken from the Department of Energy, and Arthur

Purcell from the Patent and Trademark Office.

The IEEE had two members present. Another attendee was associated with the North

American Numbering Plan... the organization that manages the allocation of area codes for


the telephone system and collects fees for telephone network access charges on behalf of the

U.S. and Canadian governments. Two major personalities from the Internet engineering

community were there. One was Paul Mockapetris, who was still working at ISI and was then

Chair of the IETF. The other was David Sincoskie, an employee of Bellcore who was a

member of the IAB.

Postel did not attend, but sent an email asking that the group “focus on the provision

of responsible and responsive management for the .com domain.” He also asked that the

discussion be broadened to consider the .net and .org domains.

During his introduction Mitchell made a remarkable assertion, “NSF views itself as

supporting the Internet registration activity for the U.S. Research and Education community

rather than carrying out any inherently governmental function.” He stressed that the NSF had

only inherited responsibility for the registry after the DOD withdrew its support. He added

that the authority over the domain registry had belonged all along to Postel, who had delegated it to Williamson, who in turn had passed it on to Kosters.

After a comprehensive presentation by Williamson about the precarious situation at

NSI, there was a fairly wide ranging discussion among the participants about what kind of

reform options might be considered. Some liked the prospect of terminating .com altogether

and starting over fresh with a new TLD. The group was able to settle on a short list of

consensus points. These included recommendations that NSI should enlist “the best legal

help” to re-draft the domain name registration agreement. They also recommended that the

first come/first served registration policy be retained, but many more details of the agreement

between the registry and the registrant needed to be spelled out. Those elements were to

include relevant definitions, expectations, and binding obligations.

Despite Mitchell’s affirmation of Postel’s authority, the group wanted that issue

clarified. In addition, they wanted clarification of whether a registration conveyed a property

right. They also recommended instituting a new policy to limit the number of names that could be assigned to a single registrant over a certain period of time. They concluded that there was “probably no need” for continuing the U.S. government’s subsidy of .com.


261 Ibid.

During the day’s discussion, more than one reference had been made to the huge

revenue-generating potential of registration fees. There was no ignoring the fact that

registrations were increasingly being regarded as valuable property. A registry that charged

fees might be able to earn “up to a billion dollars” over the next ten years. Should the

government simply hand this away? If so, to whom? Control over the revenue, someone

noted, “could keep a non-profit entity afloat or create a bonanza for a for-profit entity.”261

In fact, there had been some thought beforehand that, in due course, a revenue-

earning registry could be used to support ISOC activities. Down the line, a steady income

stream would enable ISOC to take over responsibility for funding the IETF and IANA

activities. One participant who did not wish to be named recounted,

I think we all went in there with the expectation that there would be an amicable agreement. Gravitating this thing over to the ISOC with Network Solutions remaining the operator would be a reasonable outcome.

To achieve such an outcome, the oversight issue would have to be, in that observer’s

words, “gently handled.” Rutkowski, however, wanted to proceed as soon as possible with

his grand plan to create a business-focused Internet industry association – the vaunted anti-

ITU. “Tony came into the meeting so obviously grasping that he pissed off everybody in the

room,” recalled the anonymous participant. Rutkowski created the impression that he wanted

to become “emperor of the world.”

I would not be surprised if his demeanor at that meeting was significant in causing him to lose his job as President [sic] of the Internet Society, because he effectively managed to alienate pretty much everybody in the room with the assumption that he was going to be the heir apparent of registration.

* * *

Mockapetris and Sincoskie reported back to the IAB in October, two weeks after the

IEEE workshop. “NSF is getting tired of paying” for .com registrations, they stressed,

implying unmistakably that big changes were needed. After bouncing around the usual

unresolved questions, such as the problem of trademark law, and the issue of “who owns the


262 IAB, “Minutes for October 13 1994 IAB Meeting,” http://www.iab.org/documents/iabmins/IABmins.1994-10-13.html.

263 Personal interview with Kilnam Chong.

namespace,” the group decided to pass the issue on to Postel (absent from the meeting),

requesting that “IANA study this problem and suggest solutions.”262

i. InterNIC Interim Review – November 1994

The issue of fees was on the table again in November. The NSF convened an

InterNIC Interim Review Committee to prepare a midterm evaluation of the performance

of the three companies in the InterNIC. The panel, which met in Washington, D.C., was

chaired by Michael Schwartz, a computer scientist at the University of Colorado. Rutkowski

was the only non-NSF holdover from the IEEE meeting in September. The rest were from

academia and private industry.

The occasion of the review seemed to present the NSF with its best opportunity yet

to get out of the domain name registration business altogether. What alternative was there?

Given the difficulty of raising new funds, the prospect of increasing the award to NSI was

highly unlikely. The obvious task was to begin thinking about the transition to privatization.

Yet the group was preoccupied by another matter altogether.

The only non-US member of the review committee was Kilnam Chong, a Korean who had been a classmate of Postel’s at UCLA and who was back in the US that year

as a visiting professor at Stanford University. Chong, like many overseas pioneers of the

Internet, had worked with TCP/IP networks while getting educated in the US, and went on

to promote the technology after returning home. He recalled that the group spent most of its

time dealing with the woefully poor quality of work delivered by General Atomics. NSI’s

problems were treated as a “minor issue.” The introduction of fees was on the agenda, of

course, but there was no debate about that because the need to do so was considered

“obvious.”263

The final report of the Review Committee devoted twice as much space to the review

of GA’s performance as to NSI’s. It lauded some GA staff members, especially Sue Calcari, but


criticized the company’s management as bland and uninformed. Over the previous two years

there had been unusually high turnover in the “help desk” leadership position. Estrada had left

the company early on, and little had actually been achieved on the main project since then.

Calcari found a way to avoid the tumult at GA’s corporate offices, however, and had gotten

her InfoScout project up and rolling as a distinct entity. It was a cheerful guide to new and

interesting sites on the Internet.

GA’s failure to create viable outreach and help desk services only worsened NSI’s

problems. Its InterNIC staff was getting barraged with incessant “What is the Internet?”

questions from the public, further undercutting their ability to focus on operational work.

In the brief section devoted to its evaluation of NSI, the panel suggested that the

company begin charging for .com immediately and for the other TLDs later. No prices were

mentioned, but the reviewers expressed concern that announcing the new policy prematurely

could spark a rush on domain names by people seeking to “get them while they are free.” The

panel members also recommended that NSI charge for all IP registration services.

The panelists felt that NSI was performing a critical registry operation which had to

be sustained, and that the firm was doing that part of the job quite well under the

circumstances. There was a rumor going around that several NSI staff members were under

such severe job-related stress that some of them had required psychiatric care. This rumor

is not backed up by any evidence, but there is also no doubt that the members of NSI’s staff

were laboring under more pressure each month to keep things running.

Hubbard has one particularly clear recollection about that period, “We worked our

fingers to the bone! We worked our tails off!” For Williamson, much of the era is now “a

blur,” accompanied by the memory of his ruined marriage, sacrificed on the altar of keeping

the Internet running. Kosters put things stoically, “We had to get the job done, so we did it.

We all felt like it was a mission.” NSI’s staff had to deal with such rapid growth over such

a long period of time that it was as if they were always ramping up, climbing another learning

curve. There was a time when Williamson was sure demand would level off at around a few

thousand names a month. He even predicted it to a journalist, but he had to eat his words.

The day never came. Things never seemed to reach an efficiency plateau that would allow


264 Phone interview with Don Mitchell, February 11, 2003.

265 Michael F. Schwartz, “InterNIC Midterm Evaluation and Recommendations: A Panel Report to the National Science Foundation,” December 1994, 21. Several archival copies are available online, including http://www.domainhandbook.com/9412InterNIC.Review.pdf.

266 Ibid.

them to consolidate their gains and catch their breath. Instead, they had to keep climbing.

Sisyphus never had it so bad.

The department was managing its costs reasonably well under the circumstances, but

that wasn’t enough to help the company as a whole. From a performance standpoint, Mitchell

was impressed. The job was getting done despite punishing conditions. “There was no

problem with NSI, except NSI might go bankrupt.”264

In his presentation to the committee, Williamson stressed that the greatest threat

facing NSI was the financial burden of dealing with expected litigation. Mounting a defense

in the KnowledgeNet suit alone was expected to cost about $100,000. The panelists were

sympathetic. Accordingly, they reported that they were “disturbed to hear that the NSF and/or the U.S. government does not provide a legal umbrella under which NSI can manage the Internet domain name space.”265 In fact, Mitchell – as eager to support NSI as anyone – had

already tried to get the NSF to help out with attorney costs, but was overruled. The panelists

nevertheless urged the NSF to “support NSI on the legal and policy-related issues” stemming from this dilemma, and to help “place the operation of the domain name system into a framework that provides suitable legal protection.”266 No specific recommendations were

given as to how to proceed.

Though the money would be hard to find, the review committee had the authority to

recommend that NSF increase its grant to NSI. This was deemed inappropriate, however,

because the bulk of the registrations processed by NSI had by now shifted decisively from

.edu to .com. The NSF’s proper role as an agency was to sponsor educational projects, not

commercial ventures. By the same logic, there was no compelling reason for the NSF to go

out of its way to expose the government’s deep pockets to the prospect of domain name

litigation.


267 Rony and Rony (1998: 142).

Just as officials in the Department of Defense had begun handing off responsibility to the NSF in 1985, after internetworking research started to thrive in American universities, the new overseers could see that the Internet would ultimately outgrow the nest provided by the NSF. Yet another of the Internet’s enabling harnesses had become an overly constraining fetter.

j. Exit GA – January 1995

Coincidentally, NSI was named as a co-defendant in the KnowledgeNet case only a few days after the Interim Review panel wrapped up its meetings.267 As expected, no help was forthcoming from the NSF. (The matter was later settled out of court, and the domain

name went to the trademark holder.) There had been every reason to expect that there would

be quick movement toward developing a policy for instituting fees, but the matter languished

through the early months of 1995. George Strawn, the program officer, was back at Iowa

State, on an Intergovernmental Personnel Appointment, which meant he retained only

nominal part-time status at the NSF. Steve Wolff, the Division Director, had left for a

position at Cisco. Jane Caviness, the acting Director, was initially concerned with working

out what to do with General Atomics.

GA was given a final opportunity to defend itself at a meeting held shortly before the

Interim Review was finished. The company’s case was made by Kent England, a former

BBN employee who was the most recent individual heading GA’s part of the InterNIC. The

presentation satisfied Mitchell, until England closed by announcing he would be quitting the

company. It was left up to Karsten Blue, the son of GA’s owner, to follow with a creditable

affirmation of England’s roadmap, but he failed.

On January 21st, 1995, Mitchell notified GA of the decision not to renew the contract for its part of the InterNIC. In a huff, GA fired Calcari. Coincidentally, the 39-year-old had just been diagnosed with breast cancer. Calcari called Mitchell, who immediately took two

actions. The first was to arrange for Caviness to terminate the entire GA project effective

February 28th. This move cut off the substantial funds the company would have received if


268 Susan Calcari, “The Scout Report – April 28, 1995,” http://scout.wisc.edu/report/sr/1995/scout-950428.html.

the contract had been allowed to expire naturally on March 31st. His second move was to call

Dave Graves, head of Business Affairs at NSI. Graves agreed to take over funding of the Info

Scout immediately. That move preserved Calcari’s job and her health insurance.

The Info Scout archives from the mid 1990s are still available online. There’s an

interruption in the series from February 1995 to the end of April when publication resumed

with a lighthearted apology.

...A month or two ago the Info Scout took a bad turn on the trail, and was ambushed by outlaws with some bad medicine they were feeding to unsuspecting Scouts. This required the Scout to set up camp and stay put until the bad medicine was overcome by the good medicine of health treatments, rest, and good friends and family. The Scout is back in the saddle now and plans are to return to sending weekly reports from the frontier...268

Calcari took the Info Scout project to the University of Wisconsin (Larry

Landweber’s base) and resumed work there. The project survived her, and continued to

highlight the best of the web. Her contribution was memorialized on a web page bordered

by pink ribbons.

After this experience, Mitchell admitted developing a “certain loyalty” to NSI. “When

the rubber hit the road... a simple phone call, ‘Hey can you give us a hand...’ and it was

done.” It wasn’t long before Mitchell would find himself in position to return the favor.

k. Enter SAIC – February 1995

In early 1995, NSI’s owners were approached by Don Telage, representing the San

Diego-based defense contractor, Science Applications International Corporation (SAIC).

Telage wanted to know if the company was for sale. A $6 million deal began to take shape.

When Williamson found out, he resented not being offered a piece of it. He felt he had

played a key part in building the registry into a valuable asset. And he had pulled it off even

though the company’s officers had left him understaffed. Nor had they provided him with

the high level legal support that he obviously needed. Still, he was considered to be the


269 Curiously, George Strawn’s recounting of events differs significantly from Mitchell’s. Strawn held that SAIC knew nothing of the Internet Registry and was purchasing NSI because of its interest in other assets. Phone interview with George Strawn, June 18, 2002; phone interviews with Don Mitchell, February 11, 2003.

person at NSI who had the best personal relationship with Don Mitchell, so it was left up to

him to introduce Telage to the NSF to get a sense of whether the sale would be allowed to

go through.

Telage was interested in more than just NSI. The NSFNET was finally being

decommissioned that spring, bringing closure to the long effort to establish a privatized

Internet backbone. He was shopping around for appealing high-tech investments with a

promising future, and was curious about any other ventures Mitchell might recommend.

Mitchell made it clear that he envisioned no problem with the sale of NSI to SAIC. “Our only

concern is performance,” he told Telage. There was no reason to block the sale of one private

company to another private company, provided the resulting entity continued to demonstrate “adequate capacity, character, and credit.”

In fact, Mitchell was overjoyed. “To us, SAIC is a Godsend.” Strawn called SAIC the

“hero” whose investments would “keep the Internet from melting down.”269

There was some disappointing news for Mitchell that day. Williamson had decided

to leave NSI for another job. Mitchell valued “producers” and admired Williamson as a

“superstar” who had made a huge personal sacrifice to engineer the two DDN-NIC

transitions. Without Williamson around, it was going to be that much harder to work out the

details of a transition to a fee-based registry before April 1st, the renewal anniversary of the

Cooperative Agreement.

Further complicating matters was the fact that NSI’s original owners were going to be tied up while they arranged the sale of their company. And the new owners were likely to spend

more time imposing various management changes once they took over. Also, the acting

Director, Jane Caviness, thought it would be better to hold off on making such a significant

policy change until a new full time Division Director was appointed. Consequently, Mitchell

arranged to have the Cooperative Agreement renewed for only six months. That decision


270 Jesse Hirsh, “The Internet Complex,” http://www.tao.ca/writing/archives/mms/0016.html. See also Stephen Pizzo, “Domain Name Fees Benefit Defense Contractor,” http://www.dcia.com/pizzo.html.

imposed a September 30, 1995 deadline for finishing the arrangements that would allow

charging to begin.

On March 10th, with no hearings or competitive bidding, and barely a background

check, NSI was purchased for about $6 million in cash and equity. NSI’s primary owner,

Emmit McHenry, was paid $850,000 as part of a non-competition agreement intended to

keep him out of the registry business. He was allowed to use some of his equity to launch a

data services spinoff called Netcom Solutions.

By that time NSI’s cash flow situation was extremely precarious. Infusions from

SAIC helped the company limp along for the next few months, averting collapse and shoring

up morale among the besieged technical staff.

* * *

Rather than prompting celebrations about the likelihood of NSI’s survival, this turn

of events raised the suspicions of conspiracy theorists across the Internet. SAIC’s Board of

Directors included several individuals who had previously held leadership positions in

various U.S. intelligence and military services. Most press reports included a cliché about

SAIC’s board roster being a Who’s Who of the American military-industrial complex.270 Counting retired members, the roster included a former head of the National Security Agency (Bobby Inman), two Directors of the Central Intelligence Agency (Robert Gates and John Deutch), and two Secretaries of Defense (Melvin Laird and William Perry).

The Pentagon was represented by the commander of the Panama invasion (General Max

Thurman). Michael A. Daniels, SAIC’s CEO, had been a White House advisor for the

National Security Council. To cap things off, read backwards, SAIC spelled “CIA’s”.

Anyone with a passing concern about privacy had reason to wonder why a company

run by retired “spooks” wanted control of a contact information database containing every

domain name owner on the Internet. And there were other things to worry about. SAIC

would now be able to exercise physical control over the root, and over the lion’s share of the

IP registry.


271 Robert J. Aiken, “Re: toplevel domain names - inquiring minds want to know - who owns them?????????” IETF list, March 17, 1995, http://www.netsys.com/ietf/1995/0801.html.

l. Top Level Domains: Who Owns Them? – March 1995

About the same time Telage was negotiating SAIC’s purchase of NSI, public scrutiny

of IANA’s uncertain and precarious legal status became much more intense. That shift was

triggered by circulation of an inadvertently provocative comment by Tomaz Kalin, one of

ISOC’s founding trustees. A Slovenian with a Ph.D. in Physics, Kalin had been involved in

international networking since the early 1970s, eventually becoming Secretary General of the

European Association of Research Networks (RARE). He had become concerned about the

plight of A. K. Ahmad, a Jordanian national who was engaged in a dispute with the operators

of the Jordanian country code, .jo.

Ahmad had announced that he was considering legal action against the InterNIC.

(Ahmad probably meant NSI, but like many people, he wasn’t clear about the distinction.)

He was prepared to sue others, if necessary, but was worried about the ramifications. “I am

very cautious towards disrupting my relations with IANA and Mr. Postel over there since on

the long term we have to maintain good relations with him.”

Kalin passed Ahmad’s note on to a colleague with the comment, “This is obviously

one of the problems with the first-come-first-serve management style... I think that ISOC will have to take a good look at this issue.”271 Pieces of the original note and a chain of followup

comments were eventually posted on the main IETF mailing list by Robert J. Aiken, an

official at the US Department of Energy. Aiken provided oversight and support for several

advanced networking projects, including the High Performance Computing and

Communications initiative, the product of 1991 legislation sponsored by then Senator Al

Gore. The “High Performance Computing” label was applied to work performed in numerous

agencies and research sites at that time, including Postel’s division at ISI. Like everyone else

on the IETF list, Aiken spoke as an individual, though it was well understood that he was

high enough in the U.S. government hierarchy to be the equivalent of an 800-pound gorilla.

Aiken kicked off the discussion thread by titling it with a pointed, rhetorical question,

“Re: toplevel domain names - inquiring minds want to know - who owns them?????????”


272 Ibid.

We have danced around the issue that I am about to bring up for many years now with no clear answer - so I would like a straightforward answer from the ISOC.

Is ISOC claiming that it has jurisdiction and overall responsibility for the top level address and name space - as some [such as Kalin] believe it does?

If yes - how did ISOC obtain this "responsibility", - if NO then who does own it?272

The ambiguity that Cerf had inscribed into RFC 1174 was now having a confounding

effect. Was IANA assigning parameter values on behalf of the US government, or on behalf

of the Internet community? If the latter, how had ISOC inherited such powers? By what

rights could ISOC incarnate that community and legitimately represent its interests? Dancing

around the issue had been easy in 1992 and 1993 when there wasn’t much at stake. But the

web had begun to emerge in 1994, and the Internet boom was now underway with no end in

sight. Questions that had been broached without definitive answer during earlier discussions

about instituting fees in .com could not be put off much longer.

Looking up the chain of gatekeepers from .jo, Ahmad could see that the InterNIC

had physical control of the root database, but that the IANA had some sort of authority over

the InterNIC. The reaction of ISOC’s trustee, Kalin, was driven by the assumption that ISOC

stood over IANA in that chain of authority. Aiken’s “Inquiring Minds” response on the IETF

list boiled down to a protest against what he suspected was a move by ISOC to take control

of the Internet’s most important resources.

Cerf responded privately to Aiken, and forwarded copies to Postel and Rutkowski.

His reply laid out a history of “the IANA function” that acknowledged its early dependence

on support from the US government. His main point was an insistence that the circumstances

had changed in the 1990s.

[T]he Internet had outgrown its original scope and become an international phenomenon. The NIC functions were being replicated in Europe and more recently the Pacific rim – costs being borne by means and resources beyond the US Government. Today, RFC 1174


273 Vint Cerf, “IANA Authority,” March 17, 1995, cited at http://dns.vrx.net/news/by_date/old/1995/Mar/cerfdeal.html.

274 This argument is inserted here without attribution to give voice to the strong concerns of Cerf’s later critics. My personal view is not so conspiratorial, following the aphorism, “Never attribute to malice what can be explained by incompetence.” Even if ISOC seemed well positioned to fill a vacuum of leadership for managing the Internet’s resources, there was nothing preventing other agents within the US, the IETF, or the international community from rising to the opportunity.

describes the policy (this was endorsed by the IAB and, if I am not mistaken, the FNC).273

Cerf believed RFC 1174 provided the legitimating vehicle for ISOC’s bid to control

the root. The evidence is not so clear cut. There is no record that the FNC or any US

government agency ever issued an explicit endorsement of RFC 1174. Still, Cerf’s claim had

some weight. Arguably, NSF’s incorporation of RFC 1174 in the Cooperative Agreement

tacitly ratified IANA’s authority to perform such delegations. IANA’s presumed authority

was further ratified by Gerich’s RFCs regarding control of the IP space and Postel’s

declarations in RFC 1591. Even Mitchell’s strong support of IANA’s status at the IEEE

workshop added weight. His pronouncements had been questioned during the meeting, but

left standing.

Given the inherent ambiguity written into RFC 1174, combined with the record of

Cerf’s leadership in creating ISOC, Cerf exposed himself to the criticism that both endeavors

were undertaken as part of a coordinated, long-term plan. By reinforcing IANA as an

operational function with de facto responsibility for making critical allocations, while de jure

responsibility remained vague, ISOC had time to rise and take shape as an entity capable of

taking on legal responsibility at an expressly global level. In other words, critics could argue

that the need for something like a technically competent group of overseers who were

independent of US government control was manufactured by the gaps and ambiguities in

RFC 1174, and that the convenient appearance of ISOC as the Internet’s rescuer was not at

all accidental, but a preconceived outcome.274

At the time, however, no one would make such a harsh attack, and certainly not in

public. Cerf’s project continued to advance. By embedding claims of IANA’s authority


275 See again Richard Sexton’s citation at http://dns.vrx.net/news/by_date/old/1995/Mar/cerfdeal.html.

276 Ibid.

within documents that the community generally regarded as sanctioned instruments, IANA’s

claim to authority inherited a presumption of community sanction. Those instruments were speech acts that served to ratify a particular state of social reality. In simpler terms, lots of

important people were “blessing” IANA, and those blessings counted for something. (For

more about the social implications of speech acts, see Appendix 2).

* * *

Cerf’s answer to Aiken repeatedly stressed the declining share of the US government-

operated systems in the overall consumption of Internet services. Due to the rapid relative

growth of private sector involvement, “the scope of the address and domain assignments has

now changed significantly.” Moreover, he wrote, “about 50% of all the nets of the Internet

are outside of the US.”275

He offered a deal. “Rather than looking for historical precedent as a means of making

a ‘determination’ as to jurisdiction,” it would be better to “make some deliberate agreements

now among the interested parties.” He suggested, “the NICs, the IANA, the various US

Gov’t research agencies and ISOC.” “My bias,” he wrote, “is to try to treat all of this as a

global matter and to settle the responsibility on the Internet Society as a non-governmental

agent serving the community.” (My emphasis, but note that Cerf’s critics point to this

message as evidence of his long-term plans for a deal coming to fruition.)276

An underlying thread in Cerf’s response concerned strategies for reducing the burden

borne by the US government. Perhaps the various registry functions handled by the IANA

and the InterNIC could be delegated to private, not-for-profit agents who could operate

without federal subsidies. Cerf pointed out that the NSF committee overseeing the fulfillment

of the Cooperative Agreement already suggested that “the .COM and .ORG domains ought

to be handled on a self-supporting basis either by the InterNIC or by some party other than

the US Government.”

277. Rick Adams, "Re: toplevel domain names - inquiring minds want to know - who owns," IETF, March 18, 1995, http://www.netsys.com/ietf/1995/0820.html.
278. Ibid.

The responses from Aiken and others like Sean Doran (who coined the derisive “It

Seeks Overall Control” slogan) indicated the degree of open suspicion about how ISOC’s

board members were eyeing the potentially valuable name and number space. But even more

list members expressed suspicions of the US government’s own predatory designs. Aiken

received a sharp public rejoinder from Rick Adams, founder of UUNET, and an ally of

Rutkowski on ISOC’s business-oriented Advisory Committee.

ISOC has the same rational for jurisdiction that InterNic, ARPA, DoE, NASA, FRICC, FNC, CCIRN, etc have --- NONE.

As long as they aren't stupid about it, and are doing an acceptable job, they can think whatever they want - no one really cares (except for those trying to get government funding)

As soon as they become obstructionist (I'm particularly thinking of the InterNic here), they will be ignored by most service providers, who will move on to some other mechanism that gets the job done.

The internet is a cooperative entity, no one has absolute rights over it. Similarly, no one has sufficient power to claim jurisdiction.277

In a followup, Adams charged that the US government was one of the biggest

obstacles to progress on the Internet. He blamed the NSF for doing too little to rectify the

poor performance at the InterNIC. The GA disaster was embarrassing enough, but the

registration backlog was a notorious problem. Adams’ main grievance, however, was that

the US government would dare to assert ownership. He predicted the strategy would backfire.

“[T]he very important point you need to understand is that if the Internet is fissioned, it will

be the governments who are the islands and the rest of the world who are the continents that

the islands [do] not connect to.”278

Rutkowski was still ISOC’s Executive Director at this time and hadn’t yet burned any

bridges. He diplomatically suggested appointing an ombudsman to deal with the Jordanian

279. A. M. Rutkowski, "Re: toplevel domain names - inquiring minds want to know - who owns," IETF, March 17, 1995, http://www.netsys.com/ietf/1995/0800.html.
280. A. M. Rutkowski, quoted in Robert J. Aiken, "More on Domain name spaces and 'inquiring minds ...' - LONG!," IETF, March 19, 1995, http://www.netsys.com/ietf/1995/0825.html.
281. Robert Aiken, "More on Domain name spaces and 'inquiring minds ...' - LONG!," IETF, March 19, 1995, http://www.netsys.com/ietf/1995/0825.html.

issue. He also suggested creating a “stable process” that could handle such disputes, which

“are going to become more common.” In a private response, however, Rutkowski279

expressed a sentiment much closer to Adams’ position.

It's difficult to imagine a truly global communications system where other nations will indefinitely allow a single nation to administer the basic numbering plan. In addition, there are already examples of some partiality by the US in the existing administration.280

Aiken reprinted bits of Cerf’s and Rutkowski’s responses on the public IETF list, but

made it clear he wasn’t satisfied with the answers. He admitted that, yes, the US government

had indeed delegated out control over various address blocks and NIC functions. However,

“This does not mean that the US gov gave up any rights it had to the name or address space

– it merely meant that it did the smart thing and agreed to a delegated and distributed

registration process.”281

If someone in the government had indeed been willing to “abdicate such

responsibility,” wrote Aiken, it had to be ascertained who had done so (to ensure they had

such authority), and to whom it had been abdicated. He insisted that the FNC “quit sitting

on the fence [and] take a stand one way or the other – everyone deserves to know.” Of

course, the NSF’s Networking Division Director post was still empty, so no one at the FNC

was in a position to state its policy.

Aiken questioned whether ISOC’s lawyers fully appreciated the legal implications

of taking responsibility for the address space. “If the ISOC is in charge of names and

addresses then that means the ISOC, its board, etc, are liable for any problems that ensue.”

He also wondered why the “US FEDS keep getting proposals to help fund IETF activities,

282. Ibid.
283. Rutkowski, "More on Domain name spaces and 'inquiring minds ...' - LONG!," IETF, March 20, 1995, http://www.netsys.com/ietf/1995/0830.html.
284. For background, see Daniel Karrenberg, "Internet Administration in the RIPE Area," http://www.ripe.net/home/daniel/euro-notes.html.
285. Karrenberg, "More on Domain name spaces and 'inquiring minds ...' - LONG!," IETF, March 20, 1995, http://www.netsys.com/ietf/1995/0828.html.

[even though] ISOC claims the IETF is part of the ISOC standards process.” The problem

boiled down, once again, to the question of ownership. "I would like HARD answers from

both sides.”282

Meanwhile, Postel was scurrying to get some hard answers regarding his own level

of exposure and vulnerability. He and one of ISOC’s lawyers, Andrea Ireland, began

investigating “a variety of alternative notice and procedural options” to minimize his

liability.283 But there was no useful result from this exploration. The situation remained as

unsettled as before. ISOC could not provide direct coverage for him, and USC remained

unwilling to step up on Postel’s behalf.

Non-Americans who read Aiken's comments on the IETF mailing list reacted with

alarm. A US government official seemed to be arguing that Europeans and Asians were

wrong if they believed they had exclusive permanent control over the blocks of IP addresses

Postel had delegated to them. Without such tenure, the danger was that the IP numbers could

be taken back. A protest came from Daniel Karrenberg, a pioneer of European networking

who co-founded the European registry service situated in Amsterdam. The registry he

managed – RIPE – had received the first Class A block delegated by Postel after the

publication of RFC 1174. Karrenberg also operated the root server based in London.284 "It

would be ‘very interesting,’” he wrote, “to see the US government assert ‘rights’ to the

address space” which had already been allocated to non-US registries or assigned to non-US

users.285

The implication of Karrenberg’s message was that any move by the United States

government to take back the addresses would bring catastrophic results. Such an act would

286. Ibid.

disrupt Internet operations around the world, undermining the existing culture of

international collaboration. Aiken had suggested that a handful of insiders from the US

government and ISOC get together at a workshop to determine ownership. Karrenberg felt

that doing so would be the equivalent of fixing something that wasn’t broken. Such a

conference would create more trouble than it was worth. But if it were held, he wrote, “then

you better include international interest groups as well. Otherwise the fights – eh

deliberations – will have no value.”286

It is noteworthy that there were probably as many Americans as Europeans and

Asians openly complaining about the vestiges of US-centrism in Internet management. It

wasn’t that the Americans identified more with the interests of France or Fiji than with those

of the United States. The appeal of the Internet was its inherent extra-territorialism, and the

prospect that it could be used to shatter the fetters of traditional political control. Reassertion

of US government authority over the Internet’s key resources would mean a reversal of the

trend.

As is typical of many online discussions, the participants scattered off into historical

reflections, contemplations of analogies, ideological pronouncements, and personal potshots.

Noel Chiappa trotted out the age-old aphorism about the Golden Rule... "[T]hem what

supplies the gold gets to lean pretty hard on them what makes the rules.” Chiappa made it

clear that he was wary of over-reliance on the US government or a small pool of large

corporations for funding, and insisted that a broadly funded membership organization was

preferable. In other words, stick with ISOC.

Several postings lauded Rick Adams’ declaration that the Internet is a “cooperative

entity" over which "no one has sufficient power to claim jurisdiction." Dave Crocker liked

that idea so much he suggested the words belonged on a T-shirt. Rutkowski cut in, chiding

Crocker for overlooking the fact that the interests of important stakeholders (big

corporations) needed to be recognized. An exchange of snide remarks between them

foreshadowed the loathing that would characterize the DNS War at its peak, three years later.

287. Dale Worley, "More on Domain name spaces and 'inquiring minds ...' - LONG!," IETF, April 10, 1995, http://www.netsys.com/ietf/1995/1267.html.

Despite the cranky words and public backbiting, the discussants also vented the most

serious themes of the debate to come. Partisans of both sides would claim they wanted the

Internet to be left as free and unregulated as possible, but they offered radically opposed

strategies for doing so. The inner elite of old guard technologists sought to keep the Internet’s

management structure subordinate to their own arcane decision-making processes. The others

wanted Internet administration to be remade in the image of alien but interested pressure

groups.

As the “Inquiring Minds” thread and its offshoots wound down, Dale Worley, a web

page designer who was monitoring the discussion, jumped in. Worley echoed the mantra,

“No one has jurisdiction,” adding a comment that was highly perceptive, yet naive to its own

prescience.

We need to keep reminding people that this is the reality.

A joke is that this is true of *all* institutions. But we have to pretend that "jurisdiction" is real, otherwise everyone would break all the laws when they weren't being watched.

It is the times when new institutions and new jurisdictions are being created that we come face-to-face with the fact that they are just social constructs. The difficulty is to have the self-possession to create new and useful social fictions, rather than just running amok.287

* * *

Just as a jurisdiction is a social construct, so is anarchy – a social fiction by which

people give themselves permission to run amok. Claims of freedom are a convenient

disguise for the facts of power. Though it was popular to idealize the Internet as an open,

cooperative, voluntaristic entity that was inherently beyond the reach of any authority, there

were rules in play nevertheless. Many were hierarchical. Among the most important was that

Postel would have final say over things like parameter assignments and which TLDs went

into the root. Another rule was normative: No one would press too hard with questions about

why Postel’s position atop the name and number hierarchy should be upheld as a rule. It was


enough to bless him and to revel in being blessed back. Those shadowy social fictions had

been quite useful when the Internet community was much smaller, but the exploding

popularity of online communication drew spotlights.

Belief in the fiction that “No one has jurisdiction,” had once provided an excuse for

people to deny the facts of power while Postel, Cerf and a few others took on the task of

steering the good ship Internet forward, at a safe distance from danger. But now it was

increasingly evident that there were obstacles right ahead. A decision had to be made about

who would make decisions, and thus be held accountable for them. Sharp turns had to be

taken, one way or another, and it needed to be made plainly clear who would be in charge

of taking them... Fund the InterNIC, or let it collapse... Collect rent from registrants to pay

for the funding, or make the taxpayers do it... Anoint ISOC as the responsible owner, or

undercut it and find someone else to bear the burden and reap the potential rewards.

The quasi-formal, legally naive, federally subsidized rule-making structures that had

served the Network Working Group, the Internet Configuration Control Board, and the Internet

Activities Board had worked exceptionally well in their time. That structure, despite its

ambiguities, productively harnessed the energies of the mavericks who built the Internet.

Karrenberg and many others still had faith in this structure and its record of success. They

believed there was nothing about Postel’s role which needed to be fixed, since nothing was

broken. Aiken, however, had a clear-eyed view of the bigger picture... He could see the

pattern in the rising fear of lawsuits, the emergence of multiple claims to address space, the

intensification of funding pressure, and broad questions of accountability. All these could be

taken as signs of lengthening cracks in the existing framework.

These new problems reflected a need for overarching structural reform. The sleek

harness of the previous decades was too puny for the new reality, and would have to be

recrafted... a painful process that was just beginning. Legitimacy and the power that came

with it could no longer be based on trust, reputation, and nods to the virtues of serving the

public good. The rights and duties of authority would have to be spelled out and tied to

responsible persons and accountable processes. Creating a new and improved social fiction

would require a higher order of blessings and counter-blessings.

288. Rony and Rony (1998).

m. Under New Ownership – Summer 1995

On July 28, 1995, with its new owners and managers securely installed, NSI initiated

a major shift in the procedure for registering domain names, publishing the first in a series

of Domain Dispute Policy Statements. The policy was designed to protect against lawsuits

from two classes of potential litigants: registrants and trademark holders. From that point on,

a registrant’s right to use a domain name was linked to several new requirements: The

registrant had to agree to make true statements on his or her registration application. The

registrant also had to agree to refrain from infringing on trademarks or other forms of

intellectual property. Most importantly, registrants had to agree they would not hold NSI

liable for any damages resulting from loss of use of the name.

Arguably, the new policy reflected an extensive circumvention of due process. The

rules tilted decisively in favor of trademark holders. Under the new policy, when a mark

holder challenged a registrant's name, NSI would put the name "on hold," removing it from

the zone file until the parties settled the matter among themselves or through litigation. The

rule was bad news for domain name holders who happened to be using the same string of

characters as a trademark holder, despite an utter difference in meaning. In legal terminology,

the removal was the equivalent of a court acting in an excessively prejudicial manner to

provide preliminary injunctive relief to a plaintiff.

Under conventional trademark law, domain holders in a different business class than

the trademark holder would not generally be guilty of infringement and not generally subject

to these kinds of preliminary injunctions. The new policy was of course appreciated by many

trademark holders. Other observers who were well-versed on the subject were outraged by

its perceived unfairness.288 Many denounced it as counterproductive – a "lightning rod for

lawsuits” that would ultimately benefit only the lawyers who were billing NSI for their

services.

One of the first critics was Karl Auerbach, an active IETF member whose technical

grounding was supplemented by a law degree. No defender of NSI, he had been urging that

289. See Auerbach's comments in the com-priv thread, "Re: New InterNIC Domain Dispute Policy," also cross-posted to domain-policy, particularly August 8, 1995.
290. One of the first is described at http://www.eff.org/Legal/Cases/Knowledgenet_v_InterNIC_et_al/. See also Chapter 9, "Caught in the Crossfire: Challenges to NSI," in Rony and Rony (1998: 79-458).
291. Mike Walsh, "Procter & Gamble has diarrhea," com-priv, August 30, 1995.

any move toward the institution of domain name registration fees should be preceded by a

rebid of the entire contract. His response to the Domain Dispute Policy Statement was to

suggest bringing a class action suit against NSI. He backed off when it became clear that

IANA might have to be included as a defendant.289

Denunciation of NSI – and therefore of SAIC – intensified as the ramifications of the

policy became clear. Cases soon began to pop up and work their way through the courts.290

Cyberlaw had already become a fashionable discussion topic among scholars and

journalists. The growing turmoil over rights to domain names added grist to their mill. The

mcdonalds.com episode was continuing to inspire streams of defensive registrations. It was

reasonable to expect that disputes over domain name ownership would grow in importance.

Terms like “surfing the web” were coming into vogue. Well over one million copies of the

Mosaic web browser had been distributed by this time, and alternative browsers were in the

works, clear proof that interest in the Internet was exploding.

NSI’s registration technology improved dramatically during the summer, just in time

to keep up with a surge in demand. Over 8,000 domain names were registered in .com in July

and 15,000 in August. Screening procedures in .edu and .net were still being upheld, but the

situation in .com had loosened substantially, leading to some wild outcomes. Despite

Kosters’ request that a registrant acquire only one name, a few businesses had started

registering numerous variations of their names. There was also a run on generic terms. The

most infamous was the collection amassed by Procter & Gamble, which accumulated dozens

of listings related to body parts and health ailments. This motivated a discussion thread under

the delightful subject line, “Procter & Gamble has diarrhea.”291

292. Netscape's developers were led by Marc Andreessen who, while a student at the National Center for Supercomputing Applications (NCSA) at the University of Illinois, had also led the group which created the first graphical browser, Mosaic.
293. With its release of Windows 95, Microsoft also sought to create an AOL-like, proprietary dial-up service called MSN. It quickly fizzled. Consumers preferred the kind of full-fledged Internet access local ISPs could provide. Microsoft adjusted, repositioning MSN as an Internet portal. The company was considered "late to the Internet," and raced to catch up. Before long it was regularly sending a platoon of employees to participate in IETF meetings. Simultaneously, it began recruiting Internet engineering veterans who had risen high in the IETF's ranks.

There were serious problems. NSI was supposedly trying to weed out Quittner-types

and “nonsense domains” but was rejecting some otherwise valid requests. Why had Procter

& Gamble’s applications gotten through while others were blocked? People were up in arms.

n. Pulling the Trigger – September 1995

By the summer of 1995, there was incessant media coverage about computing and

the Internet. Public awareness of the Web and of domain names was rising, and two events

in August boosted it even higher. On August 12th, Netscape Communications Corporation held its Initial Public Offering of shares, triggering one of the most spectacular stock run-ups in the history of the NASDAQ exchange.292 Netscape's freely distributed web browser and

email applications suite quickly became the Internet access tool of choice. Less than two

weeks later, on August 24, with great fanfare, Microsoft released the Windows 95 operating

system. The launch generated huge sales of both software and hardware, providing PC buyers

with a far superior platform and toolset for getting online.293

Other news with important implications for the Internet didn’t make the headlines.

On August 15th, George Strawn rejoined the National Science Foundation as the new

Director of the Division of Networking and Communications Research and Infrastructure.

Within thirty days, the debates over who would control the domain name system would

become a bare-knuckled brawl. All the conditions for launching the DNS War were in place.

Years later Strawn mused about his role in those events:

The real issue, I still think, remains how the world lumbered forward from a set of totally informal relationships that were quite suitable and appropriate

294. Phone interview with George Strawn, June 18, 2002.
295. Phone interview with Don Mitchell, February 11, 2003.

for a research environment, to a commercial environment where everybody needed everything inscribed in law.294

If the world moved forward under Strawn’s watch, the word “lumbered” would not

describe its pace. One might better say that a massive stampede that was already in motion

surged and pitched ahead. During his first days on the job, Strawn agreed to a decision that

made dot-com a watchword of the new millennium and the basis of vast personal fortunes.

By allowing Network Solutions to begin charging fees for its registration services, thereby

turning domain names into commodities, he widened the gates for the oncoming Internet

landrush.

When Strawn arrived, Mitchell was ready for him. It was time to wrap up the

renegotiation of the Cooperative Agreement with NSI before the end-of-September deadline.

Mitchell told Strawn that the negotiations were urgent... "the hottest thing going on."295 The

NSF was hemorrhaging money, running through its allocation faster than expected. They

were less than half way through the term of the Cooperative Agreement, but nearly all the

money set aside for the Internet Registry was spent.

Mitchell had already begun fleshing out the details of a policy overhaul with NSI’s

Dave Graves, the Business Affairs Officer who had stepped in on a moment’s notice to take

over funding of the Info Scout Project. Mitchell and Graves had been working together in

secret, concerned that if the public found out that the imposition of fees was imminent, the

rush to get in under the wire would be overwhelming. Mitchell had ruled out the idea of

freezing registration activity while soliciting community input. “Our primary concern was...

[to] do nothing that would impede the growth of the Internet. We have got to have an

operational registry.” A lot of major players were out of the loop. The other agencies of the

FNC and even Jon Postel were kept in the dark.

The White House was informed via Tom Kalil, Senior Director for Science and

Technology of the National Economic Council, ostensibly the FNC’s overseer. Mitchell

296. Ibid.
297. Ibid.

justified the policy change as absolutely necessary, since the NSF could simply no longer

afford to subsidize registrations. He sweetened the deal for the government by proposing a

$15 surcharge for what would be called the US Information Infrastructure Fund. It was

intended to supplant the money government had been investing in IANA and the IETF. As

Mitchell recalled, “The White House just said, ‘Well, okay,’ and they left and we never heard

another word about it.”296

There was no question that changing the policy would trigger an explosion.

Resistance to the idea of charging for domain name registrations was almost as deeply

ingrained in the Internet’s culture as revulsion at the idea of paying to read technical

standards. One of the last things Mitchell had asked Scott Williamson to do before he left

NSI was to recommend a fee structure that Mitchell could use to go forward. Williamson

intentionally ignored the request. “It was something that I couldn’t imagine happening,” he

later recounted. “The nature of the Internet [was] cooperative effort. We were really a part

of that. It was just unthinkable that we would make a transition and start charging people for

a name.”

When Kosters succeeded Williamson as principal investigator of the registry, he was

just as unforthcoming. Mitchell understood his reluctance to be the “bad guy” and finally

asked Graves to do it. The two then began working together without putting anything in

writing. Each side would make a presentation to the other, and then take all their materials

home. This approach was based on Mitchell’s reasoning that, “If the government doesn’t

have a document, it’s not subject to request.”

We wanted to get this right. We didn't want it being broken prematurely. It was something that absolutely had to get done. We felt that this was the right way to handle it.297

The secrecy worked for a while, but eventually things had to be put on paper, and the

news did indeed get out ahead of schedule. Plans were taking shape for a September 18

298. That same thread title moved to the IETF list. See http://www.netsys.com/ietf/1995/date.html.
299. Chuck Gomes, "Network Solutions New Pitch," Domain-Policy, March 27, 2000.

announcement, which was a Monday. On the 12th, however, a draft of the press release

showed up on a California Bay Area newsgroup under the subject heading, “The times, they

are a-changin'."298

The Internet is, by design, an excellent medium for communication, and it proved to

be an excellent tool for circulating and discussing the leak. “Interesting People,” one of the

most prominent newslists to carry the story, was operated by Dave Farber, founder of the

venerable Msggroup at ISI. Unlike the operators of most newslists, Farber exercised sole control over what

was posted. He used it to keep his readers informed about his travels, favorite gadgets, and

relevant speaking engagements. He would often repost informative links, press articles, or

just interesting messages that his readers sent him. Anyone who was ever a member of his

list probably noticed that he seemed to be up all hours of the night running it, collecting

information and sending out email.

Farber passed along the NSI leak at 5:30 AM on September 13th, prefaced with a comment

from his source that expressed the general mood...“Who died and made the NSF King?”

When Don Mitchell became aware of the leak he contacted Karen Sandberg, an

officer from NSF Division of Grants and Agreements. They worked late into the night of the

13th in constant phone contact with Dave Graves in Herndon, putting the final touches on the

policy instrument. It was officially implemented at midnight as Amendment 4 to the

Cooperative Agreement, effective September 14th, 1995.

Amendment 4 dropped the “cost plus fixed fee” provision of the Cooperative

Agreement and instead permitted charges of $35 per year for registration of SLDs ending in

.com, .net, and .org. Names registered within .edu remained free of charge, though the

qualification guidelines were tightened slightly. Screening in .com was eliminated altogether.

NSI retained compliance guidelines for .net and .org, but dropped them too in early 1996, after consultation with Postel.299 Since initial registrations were required to cover

a two year period, with the $35 fee for NSI, and the $15 Federal surcharge, the “price” of a

300. One IAB member, Robert Moskowitz, reported that the IAB list was sent a more up-to-date copy of the draft "about the same time" as the leak. "Re: The times, they are a-changin' (long)," Sept 18, 1995, http://www.netsys.com/ietf/1995/2508.html.
301. IAB, "Minutes for September 12, 1995 IAB Teleconference," http://www.iab.org/documents/iabmins/IABmins.1995-09-12.html.

first-time registration was set at $100. All prior registrations would now be subjected to $50

annual maintenance charges (supporting the same $35/$15 split) at the time of their

expiration.
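The Amendment 4 price structure reduces to simple arithmetic. A minimal sketch, using only the dollar figures quoted above (the variable names are illustrative, not part of any official terminology):

```python
# Fee arithmetic under Amendment 4, as described in the text.
NSI_ANNUAL_FEE = 35       # NSI's share per name, per year
FEDERAL_SURCHARGE = 15    # per-year surcharge for the federal infrastructure fund
INITIAL_TERM_YEARS = 2    # first-time registrations had to prepay two years

# "Price" of a first-time registration: two prepaid years of the combined fee.
first_time_price = INITIAL_TERM_YEARS * (NSI_ANNUAL_FEE + FEDERAL_SURCHARGE)

# Annual maintenance charge applied to prior registrations at expiration.
annual_renewal = NSI_ANNUAL_FEE + FEDERAL_SURCHARGE

print(first_time_price)  # 100
print(annual_renewal)    # 50
```

This reproduces the $100 initial price and the $50 annual renewal, each preserving the $35/$15 split.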

* * *

The leadership of the Internet standards-making community was blind-sided by the

new policy. The IAB held a teleconference the same Tuesday that the leak first appeared,300

but before the story was widely disseminated. Betraying their cluelessness about the tempest

to come, their discussion began with an expression of relief that the “storm” on the IETF list

had died down. They expressed lingering unhappiness about NSI’s arbitrary screening policy,

and wondered what they might conceivably do to improve the situation.

Should we suggest to the IANA that .com be taken away from the Internic and given to someone who charges real money for this? More generally, who should worry about this sort of thing, and if it is not us, whose job is it?301

With Postel absent, all they could do was agree to take it up again later.

* * *

Without a paper record of the negotiations between NSI and the NSF, there is no clear

answer as to how Mitchell and Graves arrived at the decision to set a $35 fee. The NSF

subsidy at that time was around $120,000 per month, supporting 20,000 new registrations.

Loose estimates suggested that by the fifth year of the Cooperative Agreement the NSF

would have had to increase NSI’s monthly subsidy to around $1,000,000. Domain

registrations had been increasing by a factor of seven each year for the previous few years.

If registration growth had continued at just half that pace for three more years, there would

have been 1,000,000 names in the system by September 1998. If the expected income were

calculated to reflect an annualized average, revenues would have reached about $3,000,000

per month. Presuming the $1,000,000 subsidy held, this would have meant an immediate

302. NSI funded travel for IAHC critics such as Richard Sexton, Einar Stefferud, and Adam Todd, among others.

return on investment of 200 percent – a bonanza by anyone’s definition. In fact, the number

of paid registrations surpassed 2,000,000 in 1998 and was still climbing.
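The projection above can be checked with a few lines of arithmetic. This is only a back-of-envelope sketch using the loose figures the text cites (one million names, NSI's $35 share of the annual fee, and the $1,000,000 monthly subsidy as a cost baseline); none of it comes from NSI's actual books.

```python
# Back-of-envelope check of the late-1995 projection described above.
# All inputs are the text's loose estimates, not actual NSI figures.
names = 1_000_000         # projected registrations by September 1998
nsi_share = 35            # NSI's portion of the annual fee, in dollars
monthly_revenue = names * nsi_share / 12
monthly_cost = 1_000_000  # the subsidy NSF had expected to be paying by then
roi = (monthly_revenue - monthly_cost) / monthly_cost

print(f"monthly revenue ~ ${monthly_revenue:,.0f}")  # close to the "about $3,000,000" figure
print(f"return on investment ~ {roi:.0%}")           # roughly the "200 percent" the text describes
```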

Therefore, given the loose projections someone might be able to make in late 1995

about the real cost of processing a domain name registration three years later, leaving room

for profit, a $10 or $15 annual fee would have been a safe estimate. Now, with the benefit

of hindsight, we know that the wholesale price of a domain name was finally set at $6 per

year. Consequently, it is clear that the $35 fee set by Mitchell, Strawn and Graves was unduly

generous to NSI.

Of course, their projections also had to leave space for several unknown factors, most

importantly the cost of litigation. The fee had to be large enough to build a fund that would

cover what their press release described as “reasonably foreseeable contingencies.” The

estimates turned out to be overinflated, though NSI did pay handsomely to hire some tough lawyers

over the years. A raft of Washington lobbyists were hired as well. The company even paid

the way of a few agents provocateurs when less scrupulous tactics came in handy. But the302

company never suffered a significant setback as the result of litigation or threatened lawsuits.

After fees were instituted, the imperative for NSI’s officers was to keep the .com registry in

their hands, and their hands alone.

* * *

Another “reasonably foreseeable contingency” stemming from the policy change was

broad discontent with how things had been handled. A charitable take on Strawn’s behavior

might grant that he was acting in the vein of Admiral Grace Hopper’s dictum, “It's easier to

ask forgiveness than it is to get permission.” His manner of seeking redemption was to

offer to throw some money around, providing funds for a series of workshops starting late

in the year, where the issue of Internet names and numbers administration could be given

a full hearing. But the move had many critics, and few were willing to wait that long to

express their indignation.


Phone interview with Don Mitchell, February 11, 2003.303

Phone interview with George Strawn, June 18, 2002.304

Postel was “miffed... not happy,” especially after Mitchell agreed that Postel was “the

authority,” but insisted that as the responsible agency, the NSF could change the funding

mechanism of the DNS registry as it saw fit. That type of decision, Mitchell wrote Postel,

“has nothing to do with your prerogative.” Mitchell’s tortured logic didn’t help matters.303

If he and Strawn could act as final gatekeepers over such an important initiative, and could

proceed without even seeking Postel’s legitimating guidance, Postel’s authority seemed to

have been emptied of any real standing. His ability to oversee the core resources of the

Internet had for a moment been rendered meaningless. If it happened once, it could happen again, unless IANA’s power could somehow be reasserted and reclaimed.

The implicit offer from NSF that some money would be thrown his way didn’t

mitigate Postel’s sense of alienation. There was no guarantee that the $15 portion of the fee being remitted to the US Government would actually be used to sustain the IETF or the

IANA. (As it turned out, those groups never saw any of it.) Postel’s feelings of mistrust

evidently hardened, as they did for many others.

The event marked the beginning of a severe break between the Internet’s “old guard”

engineering community on one side, and the NSF/NSI constellation on the other. The “old guard” began to assert its authority and independence, while NSI’s supporters insisted on its

autonomy and sovereignty. As the two sides moved away from each other, the status of the

US Government as a competent policy leader dropped precipitously. The government had

paid to build the foundations of the system, but was left holding only a few short purse

strings.

The event also opened a rift among the members of the FNC Executive Committee.

Though Strawn was head of the FNC, he hadn’t moved to inform the other participating

agencies of the details until days before the agreement was executed. Even with that, he

recalled, “We didn’t give anybody veto power. We simply said, ‘Here’s what we’re going to

do.’”304


Phone interview with Mike St. Johns, April 30, 2003.305

DARPA’s representative on the committee, Mike St. Johns, was livid about the way

things had been handled. St. Johns had also spent several years at the DCA as the original

contract manager for the SRI NIC. Along the way he had written policies for managing the TLDs .mil, .gov, .edu, and .int, and had dealt with the complexities of inter-agency coordination from a variety of official perspectives. In his view, Mitchell

and Strawn had acted far too hastily to dodge a relatively small budget problem. NSF’s

short-term escape had been achieved at the long-term expense of politicizing control of the

root. It exposed the Internet policy making process to the influence of parties who were

inherently incapable of settling the trademark issues which had been confounding domain

name administration.

Moreover, St. Johns believed that the solution had unwisely provided NSI with the

ability to amass a huge “war chest” that it would exploit in pursuit of its own interests. He

was certain that no good would come of that decision. Looking back years later, he was sure

of it.305


A set of COM-PRIV archives has been kept on a server at the Massachusetts Institute of306

Technology. They can be accessed indirectly, using a URL with parameters that search by message number. For

example: http://www.mit.edu:8008/MENELAUS.MIT.EDU/com-priv/?1-1000 . The last message in the list

is number 33740.


6. A NEW AGE NOW BEGINS

a. Scrambling for Solutions – September 1995

Right after the inauguration of fees the rush for domain names quieted down, but only

briefly. The pace picked up again by the end of the year, showering NSI with hundred dollar

checks. For the Internet community, the immediate consequence of the September deal was

a deluge of a different sort... a flood of talk.

Until then, domain policy and other Internet governance conversations had been

dispersed over a variety of fora. Perhaps the best known of these was com-priv, hosted by

PSInet. Based in Herndon, Virginia, the company was a major early provider of Internet

backbone services and hosted Server C of the root constellation. The com-priv list was306

created in the heyday of the ANS/CIX controversy to focus on the general question of

Internet commercialization and privatization, so DNS issues were certainly in scope. On the

Usenet, related discussions generally collected around the comp.protocols.tcp-ip.domains

newsgroup. The Asian and European IP registries both maintained newslists where the DNS

issues came up, and there were crossovers to another less well-known list called inet-access.

NSI had started a list named domain-policy earlier in 1995, primarily as a venue for

reaction to its dispute resolution policy. The range of topics covered there expanded over the

years, and that list eventually became an important stop for anyone interested in Internet policy

as a whole, especially the DNS debates. There was also the North American Network

Operators Group (NANOG) list, hosted by MERIT. It was the indispensable venue for

discussion of technical issues relevant to ISP operators; broader policy discussions were not

entirely out of bounds, but the DNS controversy was clearly not within its core mission.

The preferred venue for conversation about the policy change quickly migrated to

newdom (New Domains), a fresh list created and sponsored by Matthew Marnell, an Ohio-

based computer consultant. He hosted it under the auspices of his own domain, iiia.org,

which he had created as the setting for his grandiose dream of an “International Internet


Matthew James Marnell, “So what shall we do?,” Newdom-iiia September 15, 1995. The original307

URL was http://www.iiia.org/lists/newdom/1995q3/0001.html. The original Newdom archives are no longer

maintained online, but URLS cited in this text can be located via www.archive.org. A convenient entry point

would be http://web.archive.org/web/20011216054458/iiia.org/lists/newdom/. Also, see Appendix 3, “Comment

on Sources.”

Marnell, “Re: Internet Draft on Top Level Domain Names” Newdom-iiia Sept 15, 1995.308

Industrial Association.” Nothing came of that. Over the ensuing years, newdom discussions

were the only significant activity of his site.

Marnell’s inaugural posting on September 15, 1995 was titled, “So what shall we do?” That phrase had a whiff of V.I. Lenin’s “What is to be done?” A new mood was in the air, as

if a momentous social upheaval was underway... as if the crisis had opened the door to a

revolutionary response that would finish off the old order and bring about a more balanced

distribution of wealth. In this case, however, many of the vanguards were openly promoting

themselves as the rightful beneficiaries of the revolution.

“The way I see it,” Marnell wrote, “is that NSI and NSF did a naughty thing by not

including the general community in their decision.” He listed several options, leading up to

the one he clearly favored.

We could all try to hijack the root domains and point everything at each other instead of at the root servers. Not a good idea. We could all drop our .com, .net, .org domains and [move to geographical] name space. Might force the NIC to rethink it's new policy. We might create some NewNIC for the express purpose of registering [top level] domains for domains that we as Internet users want.

307

The NewNIC, he reasoned, would have to be funded so that it could operate as a

competitive, commercial entity. “Before posting to the list you may want to think about which

domains you'd like to possibly have root level control over.”

In another posting later that day Marnell reiterated the general mood. “If [NSI] had

at least put it before the community, then this may have all come about more slowly. But

they dropped a bomb on us and now we're all scrambling for some solution.”308

Resentment of NSF’s move was coupled with an undisguised eagerness to get in on

the registration game. All over the net, people were running the numbers. Anyone operating

an ISP or a small network could see that with the technology presently available, operating


Ibid.309

Jon Postel, “Re: Assumptions and misc ramblings,” Newdom-iiia September 20, 1995.310

a registry on the side was necessarily an expensive proposition. Now, with a benchmark of

$35 per name per year, it seemed there was a lot of gravy to go around. Marnell laid it out.

Who does the most DNS out there? I'd say that the small to midsized ISPs still have the largest amount of domain name space. Maybe, I'm dreaming, but this chaos isn't so bad for any of us. The IANA and the NSF may have put a lot into the Net in the past, but money is the controlling factor now, and we can still vote with our dollars as well as with our hardware and software. I'm not advocating wrenching the root servers from their moorings and rewriting the Net, but we could be calling for 0 government control of the Net. Get their hands completely out of the honey pot.

309

The earliest posters advocated creating alternate TLDs and registries to compete

against NSI. A few used newdom to announce their plans. The first was Matthew R. Sheahan, owner of a New Jersey-based ISP called Crystal Palace Networking.

* * *

The first day’s posts focused on creating an RFC that would lay out a fair and proper

method for allocating new TLDs. The idea was to get the show on the road so that people

could start making money. But “get rich quick” was not a unanimous objective. A few IETF

stars on the list saw the crisis as an opportunity to explore some novel technical solutions.

By the second day, the IESG’s Scott Bradner was steering the discussion toward a radically

new idea, devising a system by which multiple entities could all sell names in the .com

registry. He preferred a solution that would break NSI’s monopoly without creating new

ones, “[Let’s not] duplicate the known to be problematic current design,” he wrote.

Postel was more conservative, favoring the use of known technology to create a small

number of new NSI-styled registries. He shared his views on newdom and with ISOC’s310

board:

I think this introduction of charging by the Intenic [sic] for domain registrations is sufficient cause to take steps to set up a small number of alternate top level domains managed by other registration centers.


E-mail to [email protected], copied to ISOC trustees. Rutkowski archived this at311

http://www.wia.org/pub/postel-iana-draft13.htm.

Jon Postel “Re: Registering TLD’s” Newdom-iiia November 12, 1995.312

I'd like to see some competition between registration services to encourage good service at low prices.

311

No quick consensus emerged. Whatever Postel’s own initial preferences, he seemed

more than willing to engage in a process of deliberation, presuming it didn’t go on too long.

Several private TLD entrepreneurs insisted that Postel add their zones to the root

immediately. Arguably, there was a precedent. If a request to add a TLD had come from a

prospective country code operator, Postel would have vetted it and passed on the details and

a directive to Kosters. But new country codes were contemplated in RFC 1591, and private

requests were not. So Postel simply filed the request and sat on it. Anyone with further

inquiries was directed to join the newdom discussions. After one flurry of demands Postel

even published instructions: Fill out a template, send it to IANA, and wait. “Be prepared to

wait a long time... until the procedures we sometimes discuss on this list get sorted out.”312

When the designers of the DNS wrote the rules for adding new TLD suffixes, they

had been so focused on the semantics of five or six generic zones and the question of whether

or not to add country codes, they put almost no thought into the notion that hungry

entrepreneurs would show up, hoping to plant entirely new continents in cyberspace. In 1994,

when he published RFC 1591, Postel assumed the matter was settled. The signal was explicit

enough: “It is extremely unlikely that any other TLDs will be created.” Some country codes

were yet to be delegated, of course, but that was an uncontroversial matter. Now, in the new

context presented by the unilaterally-promulgated Amendment 4 and the surprise

introduction of fees, Postel’s attitude about new suffixes had shifted. Still, he was

temperamentally unwilling to respond in kind with a fait accompli of his own. There would be

no more unilateral changes. The gatekeeper’s rulebook would have to be amended in public

before any other big step could be taken.


For the original proposal see John Gilmore, “Proposal for a DNS market infrastructure,” IP313

September 19, 1995, http://www.interesting-people.org/archives/interesting-people/199509/msg00053.html.

Jon Postel, “Sharing without a Central Mechanism,” Newdom-iiia October 20, 1995.314

Members of the National Security Agency wanted to mandate certain encryption standards for US-315

manufactured telecommunications equipment. Opponents objected on many counts. See A. Michael Froomkin,

“It Came From Planet Clipper: The Battle Over Cryptographic Key Escrow,” 1996 U. Chi. L. Forum 15.

* * *

If the brouhaha was cause for consternation, not everyone had succumbed to the

political and economic distraction. As Paul Mockapetris once said of the Internet, “While a

bunch of the engineering is painful, the challenges are extremely interesting.” The idea of

creating a distributed system for placing new entries into a shared registry was – for people

of a certain mind set – a challenge that was inherently fun to think about, especially if

political notions could be smuggled in through technical design. A good, carefully thought-

out example of that approach appeared on Dave Farber’s Interesting People list. Postel313

thought enough of it to repost it on newdom himself.314

The author, John Gilmore, was one of the original employees of SUN Microsystems,

a major vendor of UNIX-based workstations. He had made himself into a bit of an Internet

celebrity, renowned in various circles as a defender of free expression, a proponent of open

source software, and an opponent of data encryption regulations proposed by the US

National Security Agency. During his various battles with the US government, he became315

rather proficient at exercising his right to demand documents under provisions of the

Freedom of Information Act, even teaching journalists how to prepare their own filings.

Along the way he coined the famous aphorism, “The Internet interprets censorship as damage

and routes around it.”

Gilmore could be counted among those who were most deeply suspicious of SAIC’s

involvement in the Internet. As a co-founder of the Electronic Frontier Foundation (EFF),

he had the ear of Internet-activist luminaries like Mitchell Kapor. EFF had already gotten

formally involved in the question of DNS administration by providing legal support for


Electronic Frontier Foundation, “Legal Cases - Knowledgenet v. NSI (InterNIC), Boone, et al.,”316

http://www.eff.org/Legal/Cases/Knowledgenet_v_InterNIC_et_al/KnowlegeNet.complaint.

John Gilmore, “Proposal for a DNS market infrastructure,” IP, http://www.interesting-people.org/archives/interesting-people/199509/msg00053.html.317

one of the earliest challenges to NSI’s dispute resolution policy. A solution to the DNS316

issue which served to undermine the concentration of power in the hands of a central

authority – NSI’s or anyone else’s – was nicely congruent with an EFF supporter’s view of

how the Internet should be run.

Gilmore proposed creating a new constellation of .com registries, chosen from among

applicants “meeting particular criteria.” They would all rely on publicly-maintained, open

source, “copylefted” software which was yet to be created. He offered to fund its

development himself. There would be mandatory updates of the official .com database every

three days, merging in new entries sold by each registry over the last period. Getting the

merging process done right was critical, so he briefly outlined the kinds of algorithms and

checksums that could be applied to manage the system. In case the same name had been sold

by two or more registries over the same period, the earliest sale would be declared the

winner. Gilmore believed that the technical issues would be easy enough to resolve once

social coordination was established. “It just has to be clear that a domain registration isn't

over until the fat lady merges.”317
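Gilmore's earliest-sale-wins merge rule can be sketched in a few lines. This is an illustrative reconstruction, not his actual design: the function name, data layout, and sample registries are all invented, and his proposal's checksums and three-day update cycle are omitted.

```python
# Illustrative sketch (not Gilmore's actual design) of the merge rule he
# described: when two registries sell the same name in the same update
# period, the earliest sale wins. Names and data shapes are invented.
from datetime import datetime

def merge_registries(batches):
    """Merge per-registry sale lists into one table; earliest sale wins a name."""
    merged = {}
    for registry, sales in batches.items():
        for name, sold_at in sales:
            if name not in merged or sold_at < merged[name][1]:
                merged[name] = (registry, sold_at)
    return merged

batches = {
    "registry-a": [("example.com", datetime(1995, 10, 1, 9, 0))],
    "registry-b": [("example.com", datetime(1995, 10, 1, 8, 30)),
                   ("other.com",   datetime(1995, 10, 2, 12, 0))],
}
winners = merge_registries(batches)
print(winners["example.com"][0])  # registry-b recorded the earlier sale
```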

* * *

In the first few weeks after the September 14 announcement, the engineering

community rehashed the “Inquiring Minds” discourse from earlier in the year. Once again,

participants tried to spin the history of the Internet in ways that would establish claims to

ownership of the names and number space. Hans-Werner Braun, a San Diego-based

NSFNET veteran who had served on the IAB and the IEPG, was clear about the importance

of US government support, but believed “the Internet evolved so much over the last ten years

or so into the public realm, that the address and naming spaces have become public


Hans-Werner Braun, “Re: 206.82.160.0/22,” nanog September 22, 1995,318

http://www.merit.edu/mail.archives/nanog/1995-09/msg00076.html.

David Conrad, “Re: Authority over IANA & IP #s was Re: 206.82.160.0/22,” nanog September319

23, 1995, http://www.merit.edu/mail.archives/nanog/1995-09/msg00096.html.

Randy Bush, “draft-isoc-dns-role-00.txt” ietf November 14, 1995, http://www.netsys.com/-320

ietf/1995/3006.html.

See “FNC Organization Chart,” http://www.itrd.gov/fnc/FNC_org_chart.html; “FNC Charter321

9/20/95," http://www.itrd.gov/FNC_charter.html.

property.”318 In the same thread, David Conrad, head of the Asian IP registry APNIC, flatly proclaimed, “the Internet address space is a public good.”319 Others persisted in arguing that the US Government had never relinquished control.

The revival of the debate exposed a pattern that would be repeated many times during

the DNS War: Whenever it seemed like a big decision was in the works, people would trot

out particular versions of history to support their own notions of where authority should

reside. Randy Bush, a DNS specialist with a strong commitment to internationalization, at

least had the sense to offer the government a dare, “Fess up and draw the damned org

chart.”320

In short order the FNC produced not only a graphical depiction of its organizational hierarchy, but also a charter and even a formal definition of the Internet.321 This was not done simply

to satisfy critics who wanted to see evidence of US authority over the root. The FNC members

now had to work out their own internal tensions, and it made sense to begin by describing

the group’s internal hierarchy, its relationship with White House agencies, and the

overarching outline of the FNC’s mission.

To counter sentiments challenging US control, Mike St. Johns sent a lengthy note to

both the IETF and NANOG lists. He took a strong position asserting US authority, arguing

that when DCA funding was phased out and the InterNIC was created, the Defense

Department continued to retain “first call” on the number space. “The deal,” he wrote, “was

to ensure that the DOD didn’t give away the space and then have to start paying for it.”

Presently, “the FNC (as the representative US organization) continues to own the space, but

chooses to limit its control over the space to issues... that it considers critical.” The InterNIC,


Mike St. Johns, “Re: Authority over IANA and IP#s was Re: 206.82.160.0/22" nanog (and IETF)322

September 22, 1995, http://www.merit.edu/mail.archives/nanog/historical/9509/msg00088.html.

he insisted, owned none of the number space or the registration data, but was acting as an

agent of the US government.

“The namespace is another matter,” St. Johns continued. “It’s unclear whether DCA

or ARPA would claim ownership of the root domain, but it definitely vested within the

DOD.” When the TLD space was created, he wrote, all TLDs were considered

“undelegated.” After the DNS was deployed, the delegation process began. The .mil zone

was delegated first, and then various country codes. The NSF eventually received .edu and

the US Government took over .gov. (Parenthetically, RFC 1591 stipulated that .gov was for

use by all governments.)

St. Johns concluded his history with an opaque dig at the NSF. “At this time, COM,

ORG and NET remain undelegated... which basically places policy control for them under

the root authority which remains with the FNC.” The name and number spaces were322

therefore not public property. Fiduciary responsibility for the InterNIC and IANA, he

concluded, remains with “the feds.” But which feds?

St. Johns was still angry that the NSF had kept the rest of the FNC members in the

dark so long about Amendment 4. Years of careful investment by the US that was meant to

foster the creation of a publicly available name and number space had been converted into

a commercial plum and handed to NSI as a fait accompli. The trademark problems of the

previous year had already led St. Johns to believe that the .com zone was a mess and needed

to be shut down or phased out. This latest catastrophe confirmed that for him.

Over the following weeks and months, St. Johns wavered between closing the .com zone altogether and imposing a perpetual freeze on new registrations. In the latter case, he thought the zone could be funded by charging a one-time maintenance fee of $25,

putting all the receipts into an interest-bearing fund, and paying for costs from a percentage

of the annual returns. In either case he wanted to see another 50-200 new top level domains

created. His logic was that the number had to exceed the number of business sectors

recognized by international trademark law – 42. By creating a greater number of TLDs it


Phone interview with Mike St. Johns, April 30, 2003.323

would be possible to overlap those sectors and ensure there would be competition among the

zones.
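The endowment idea described above is easy to put in numbers. The sketch below is largely hypothetical: only the $25 one-time fee comes from the text, while the zone size and rate of return are invented for illustration, since St. Johns never specified them.

```python
# Hypothetical illustration of St. Johns' one-time-fee endowment idea.
# Only the $25 fee is sourced; zone size and interest rate are invented.
fee = 25                  # one-time maintenance fee per name (from the text)
names = 2_000_000         # hypothetical size of a frozen .com zone
rate = 0.05               # hypothetical annual return on the fund

principal = fee * names            # the endowment collected up front
annual_income = principal * rate   # annual returns available for operations

print(f"principal = ${principal:,}")            # $50,000,000
print(f"annual income = ${annual_income:,.0f}") # $2,500,000 per year
```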

St. Johns’ ability to pursue this overhaul strategy required both the clear

establishment of the FNC’s authority over that part of the namespace, and the confirmation of his own authority to speak on behalf of the FNC. But not only was there no

formal statute upholding the FNC’s authority, the lines of authority within the FNC were far

from clear. Yes, Strawn was the chair according to the organization chart, but the FNC had

been created to let executive agencies trade favors as peers on an ad hoc basis – to make

“nodding agreements” as St. Johns called them. The FNC’s members were not there to323

take marching orders from a duly appointed captain. St. Johns didn’t have the power to

impose his own conception on the system.

Ironically, the merits of St. Johns’ design for the TLD space didn’t count for much.

The dispute was becoming religious. Now that the NSF had opened Pandora’s box, any new

plan would have to live or die on its ability to attract followers. Very few were predisposed

to follow the lead of a US government agency.

* * *

ISOC’s Board had been fairly quiet for the last several months, listless in the wake

of its internal and external conflicts. The DNS administration crisis offered a new raison

d’être, revitalizing hopes that ISOC could play a more useful and significant role in the

world. It still seemed like a useful vessel to people like Gilmore, who wanted it to serve as

the credentialing authority for his proposed shared registration system. It wasn’t long after

the September 14 surprise that ISOC’s President, Larry Landweber, began putting together

a comprehensive plan by which the organization would assume responsibility for the DNS

name space. According to that framework, released in November, ISOC would only provide

financial oversight and serve as a legal umbrella. The IAB would be required to define

policies, undertake the selection and licensing of “Internet Name Providers,” resolve


Brian Carpenter, Larry Landweber, Jon Postel, Nick Trio, “Proposal for an ISOC Role in DNS324

Name Space Management,” in Jon Postel, “A,” Newdom-iiia November 10, 1995.

Jon Postel, “Crazy Idea Number 37b,” Newdom-iiia November 10, 1995.325

Scott Bradner, “Re: Crazy Idea Number 37b,” Newdom-iiia November 10, 1995.326

Simon Higgs, Scott Bradner, “Re: Crazy Idea Number 37b,” Newdom-iiia November 10, 1995 and327

days following.

Harvard Information Infrastructure Project, National Science Foundation Division of Networking328

and Communications. “Internet Names, Numbers, and Beyond: Issues in the Coordination, Privatization, and

Internationalization of the Internet: Research Workshop,” Nov. 20, 1995. The conference papers were

accessible via http://ksgwww.harvard.edu/iip/GIIconf/nsfpaper.html.

disputes, and establish consultative and review bodies. The draft said nothing about how any

new TLDs should be added.324

Postel agreed to sign on to Landweber’s plan, as did the IAB Chair, Brian Carpenter,

and Nick Trio, an IBM executive who was also an officer of ISOC’s Advisory Council.

Postel circulated the ISOC plan on the Newdom list about a week before the Harvard

conference. He also circulated a related proposal under the heading, “Crazy Idea Number

37b,” in which he suggested creating 26 new single-character Top Level Domains.325 There

would be one TLD added for each letter in the alphabet, but at an expansion rate of only five

new zones per year. “These letters have no semantic significance,” he wrote. “Any type of

business or organization could register under any letter.” Bradner dissented mildly, saying

he saw no benefit. Stronger objections came from private TLD applicants, particularly326

Simon Higgs, who wanted quick movement on the creation of TLDs with meaningful suffixes. He had already tried to claim TLDs such as .news and .coupons. Higgs became one

of the strongest proponents of using classification schemes and familiar categories that might

reduce consumer confusion.327

b. On the Roadshow – November 1995

The November 20 “Internet Names, Numbers, and Beyond” conference was chaired by Brian Kahin, a Harvard-based academic who had been advising the government on Internet

commercialization since the inception of the NSFNET. The event was organized as a series328


of panels, featuring Mockapetris, Rutkowski, St. Johns, Postel, Mike Roberts, Landweber,

and Paul Vixie, who was the lead author of the DNS resolution software used by nearly every

ISP in the world. Some other participants were less familiar to the Internet community, but

in most cases it made sense for them to be there, especially Bob Frank, co-chair of the

International Trademark Association’s Internet Task Force, and Mike Corbett, the FCC’s

specialist on the management of the toll free 800 and 888 zones. The audience was full of

outspoken luminaries including Vint Cerf, David Johnson, Carl Malamud, and Elise Gerich,

plus the managers of Asia’s and Europe’s IP registries, David Conrad and Daniel Karrenberg.

During the course of the day, Vixie, St. Johns, Roberts, and Mockapetris advocated

various strategies for closing .com. Rutkowski objected. Dave Graves and the others in NSI’s

contingent must have been relieved to hear him speak up. Rutkowski played up the fact that

the ITU favored reversion to country codes. (This tainted anyone else who supported the

same strategy.) Moreover, businesses which saw themselves pursuing a global strategy in

a global market wanted to avoid the expense and extra difficulty of registering names

in every country code. Rutkowski’s arguments apparently influenced Mockapetris, who

switched to a position of “Let the market decide.”

Mockapetris and Vixie were like the bookends of DNS software. Mockapetris had

written the first implementation of a DNS server for the TOPS-20 operating system in 1984,

while still at ISI. He called it “JEEVES,” after the smart butler from the P. G. Wodehouse

tale, “Jeeves and Wooster: Ties that Bind.” Managers at the Digital Equipment Corporation

(DEC) were eager to support development of a UNIX version, believing it would enhance

the market for DEC’s TCP/IP-related hardware. In 1985 the company arranged to have an

employee, Kevin Dunlap, work on the project with a team of graduate students who made

up the technical staff at UC Berkeley’s Computer Systems Research Group, the publishers

of BSD UNIX. The group evidently played on the Wodehouse theme by naming the new

software “Berkeley Internet Name Daemon” – BIND.

Vixie was a relatively young programmer working at DEC’s Palo Alto office in 1988

when he was given responsibility for running the company’s corporate Internet gateway and

about 400 subzones of dec.com. To simplify the task, he undertook a rewrite of the UC


[329] From an interview with Dave Wreski, “Paul Vixie and David Conrad on BINDv9 and Internet Security,” LinuxSecurity, October 3, 2002, http://www.linuxsecurity.com/feature_stories/conrad_vixie-1.html.

[330] Paul Vixie, “External Issues in Scalability,” http://ksgwww.harvard.edu/iip/GIIconf/vixie.html.

Berkeley code (which he once described as “sleazy, icky, rotten” and, worst of all,

unstable[329]). He then made his revision publicly available, including source code and an

installation kit as part of the package. It was instantly popular. Vixie’s revisions were so

highly acclaimed that the managers of BSD UNIX replaced their version of BIND with his.

DEC subsidized his continuing work on BIND even after he went out on his own in 1993,

providing him with state-of-the-art hardware as he set out to build a new star in the root

server constellation (hardware contributions came from NSI as well). In short order, Vixie

also received a significant vote of confidence from Postel, who gave him official

responsibility for Root Server F. In 1994, Vixie and Rick Adams of UUNET organized the

Internet Software Consortium (now called the Internet Systems Consortium) to support

Vixie’s work on major upgrades to BIND. As a result of all this activity, Vixie had become

someone to be reckoned with in both the IETF (though he held no formal offices there) and

the North American Network Operators Group (NANOG).

A running theme in Vixie’s discussion at the Harvard conference was the undesirable

effect of “domain envy,” especially the flattening of the namespace. As he saw it, nearly

everyone who wanted a presence on the Web was trying to register in .com because there was

virtually nowhere else to go. He articulated a theme that resonated for years to come: For

the sake of introducing competition, root zone operations needed to be moved from NSI to

the IANA, “where it should have been all along.” And he also insisted on the Internet’s

exceptionalism. Since the DNS namespace was “universal,” he argued, it was “inappropriate

for any regional government to take any kind of leadership role in designing, maintaining,

or funding it.”[330] He wanted to see IANA in charge, under the aegis of ISOC.

Anticipating that as many as 250,000,000 domain names would eventually be

registered, Vixie believed it was important to start as soon as possible steering new

registrants into a more diverse array of TLDs. He offered different proposals for adding tens

of thousands of TLDs to the root, either by creating short combinations of all the numerals


and all the letters in the English alphabet (excluding known words), or by creating zones such

as .a001, .a002, and so on. One of his metaphors was “create license plates,

but not vanity plates.”
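Vixie’s minting scheme can be sketched in a few lines of code. The sketch below is purely illustrative: the label formats follow the .a001 example and the numerals-and-letters idea described above, but the blocklist of “known words,” the function names, and the exact generation order are assumptions of mine, not details of his proposal.

```python
import itertools

# A stand-in for the "known words" to exclude; Vixie never specified a list.
KNOWN_WORDS = {"com", "net", "org", "biz", "web", "art", "law"}

def serial_tlds(count):
    """Serial-numbered zones in the style of .a001, .a002, ..."""
    labels = []
    for letter in "abcdefghijklmnopqrstuvwxyz":
        for n in range(1, 1000):
            labels.append(f"{letter}{n:03d}")
            if len(labels) == count:
                return labels
    return labels

def combination_tlds(length, count):
    """Short letter/digit combinations, skipping meaningful "vanity" words."""
    alphabet = "abcdefghijklmnopqrstuvwxyz0123456789"
    labels = []
    for combo in itertools.product(alphabet, repeat=length):
        label = "".join(combo)
        if label in KNOWN_WORDS:
            continue  # license plates, not vanity plates
        labels.append(label)
        if len(labels) == count:
            break
    return labels

print(serial_tlds(3))         # ['a001', 'a002', 'a003']
print(len(combination_tlds(3, 40000)))
```

Either generator yields tens of thousands of deliberately meaningless labels, which is the point of the metaphor: no single label is worth fighting over.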

Postel opened his panel on a note of bemused exasperation, “Great – more work for

the IANA!” He didn’t hide his disapproval of what had happened at NSI. “The reason we are

here is that something changed and we don’t like it.” As he saw it, the challenge was to

create a competitive environment that would provide good service to consumers. Postel

raised a problem that later became known as the “lock-in” dilemma: An owner of a domain

name would be unlikely to drop a name in one TLD and switch to a name under another

suffix simply because of price differences between the two zones. Switching costs were

simply too high to justify it. His logic echoed Vixie’s concerns about the way “domain envy”

was creating a flat namespace. Because of the fast expansion of the .com registry, it was

important to attract consumers into using new suffixes as soon as reasonably possible. Postel

also adopted elements of Bradner’s and Gilmore’s arguments about the need to create

“multiple registries for each domain” rather than simply create new .com style monopolies.

When Postel announced his .a through .z TLD concept, it struck a chord with Kahin,

who believed that new TLDs should be sold to the highest bidder through an auction. Kahin

thought this would solve a host of funding problems, but most others at the conference

weren’t so sure. If there was any confluence of opinion among them it was that the problem

was not money, but trust. With so much demand for names, one could imagine many

different ways of creating solvent – even lucrative – allocation regimes. Yet there was no

organization that everyone could regard as an impartial but qualified authority. International

participants and quite a few Americans didn’t trust the US Government. Too many others

harbored lingering doubts about ISOC.

Rutkowski was still nominally ISOC’s Executive Director at this time, but he was

backhandedly critical of Landweber’s proposal for an ISOC/IAB joint authority. Rutkowski

took the position that no existing organization was capable of providing the kind of oversight

that should be expected of a global administrative authority. Moreover, the sort of leadership

that could be provided by traditional governments was “vestigial and unnecessary.” Instead,


he wanted to see the “creation of some new kind of alliance” to do the job. Rutkowski touted

his own World Internet Alliance as an organization that could speak for a broad swath of

stakeholders, particularly ISPs, who were the rising forces of Internet commerce.

Furthermore, he argued, the new organization should be incorporated in a way that would

provide “significant immunities” from litigation, either through a special legislative

enactment, or by siting itself in a jurisdiction such as Switzerland where the legal system was

tilted toward providing such immunities.

The fireworks had begun. Not surprisingly, St. Johns refused to back down from his

position that the US Government owned the number space and the root of the name space.

Karrenberg shot back, complaining about US imperialism. For many of those who resented

US power, it made sense to put ISPs in charge. Whoever owned the root, the ISPs’ servers

made the root the root. It wouldn’t be long before the number of ISPs outside US borders

would exceed the number inside. And that trend would never reverse. With a joint effort, the

world’s ISP operators could anoint a new root altogether. There was no law to stop them

from doing so. ISP operators had real power to change things, presuming they could figure

out a way to coordinate their actions.

But there were problems with this line of reasoning. If ISPs could aspire to lead the

name allocation regime, what about the numbers regime? Putting ISPs in charge of that as

well seemed like a dangerous prospect. It wasn’t hard to imagine that one or two leading ISPs

might end up having the strongest hand regulating the allocation of a resource that all ISPs

had to share. This concern was expressed most strongly by Alison Mankin, who argued that

it would be important to “decouple” the issue of IP numbers from domain names so that each

could be dealt with fairly.

The administrative connection between the names and numbers registries was really

just a vestige of history... a result of Postel’s longstanding technical involvement, as well as

the legacy of funding and operational ties that threaded through the DDN-NIC and InterNIC

years. The whole point of inventing hosts.txt and the DNS was to decouple names from

numbers at the network’s application level. There was no technical reason to keep the two


[331] IETF Role in DNS Evolution BOF (DNSEVOLV), December 5, 1995. No minutes were published, but there was evidently a presentation, since lost, by Brian Carpenter.

[332] Randy Bush, Brian Carpenter, and Jon Postel, “Delegation of International Top Level Domains (iTLDs).” Originally at ftp://rg.net/pub/dnsind/relevant/draft-ymbk-itld-admin-00.txt. Forwarded to the Newdom archives by Matt Marnell as “ITLD Draft,” Newdom-iiia, January 23, 1996.

technologies joined in a bureaucracy. The pragmatic virtue of unraveling them and pursuing

separate solutions was starting to exert an appeal.
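The application-level decoupling described above is easy to illustrate. In the toy sketch below, a plain lookup table stands in for hosts.txt or the DNS; the hostname and addresses are the standard documentation examples, not anything from the historical record.

```python
# Toy model of the name/number split: applications speak names, the
# network speaks numbers, and only the mapping table joins the two.
# (hosts.txt, and later the DNS, played the role of name_table.)
name_table = {"www.example.com": "192.0.2.10"}

def resolve(name):
    """Application-level lookup: a stable name in, a routable number out."""
    return name_table[name]

def connect(name):
    addr = resolve(name)              # the only step that sees numbers
    return f"connecting to {addr}"

print(connect("www.example.com"))     # connecting to 192.0.2.10

# The host is renumbered: the table changes, the application does not.
name_table["www.example.com"] = "198.51.100.7"
print(connect("www.example.com"))     # connecting to 198.51.100.7
```

Because the application never touches the number directly, the two registries that hand out names and numbers need not share anything but this table.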

c. You Must Be Kidding – January 1996

The 34th meeting of the IETF was held in Dallas, Texas, the first week of December

1995. Hosted by MCI, it was only the second IETF meeting where attendance had exceeded

one thousand registered participants. But it was the first to convene since the Netscape IPO

and the release of Windows 95, and since the NASDAQ stock index had closed over 1000.

Now the Dow was about to break 5000, a gain of over 20 percent for the year. The Internet

frenzy was well underway, drawing inevitable attention to the wizards of the IETF, who,

behind the scenes, were finding themselves distracted from leading-edge technical work by

the unsolved, age-old political problem of how to divide up and distribute a resource in a way

that could allow all comers to feel they were getting a fair share.

Brian Carpenter, Chair of the IAB, took the lead, presenting a paper at a BOF on the

IETF Role in DNS Evolution – DNSEVOLV.[331] His paper served as the basis for the first

formal Internet draft on how to proceed, emerging after the holidays, on January 22, 1996,

under the name “Delegation of International Top Level Domains (iTLDs).” Co-authored by

Postel, Carpenter, and Bush (who identified himself as “Unaligned geek”), it was filed under

an unusual name – “draft-ymbk-itld-admin-00.txt.”[332]

At first glance, it seemed to follow the normal style for documents in the RFC queue.

Going from back to front: Geeks tend to start numbering at zero. ADMIN flagged

administrative concern. ITLD was a fresh acronym. The “i” suggested international, but

Postel defined it as a category reserved for “generic top level domains open to general

registration.” This was convenient, since he had just arranged with NSI to eliminate


screening in .net and .org, giving them equivalent administrative status with .com. YMBK

reflected less technical content. It stood for “you must be kidding.”

The draft retained Postel’s goal of introducing five iTLDs annually, and fleshed out

procedures for doing so. Selections were to be made sometime in March or April by a seven

member “ad hoc” working group, whose members would be appointed by the IANA, the

IETF, and the IAB. The ad hoc group would be required to develop the finer details of its

own procedures and policies, “in an open process [that] will be clearly documented.”

The draft also stipulated that “Multiple registries for the COM zone, and registries

for new iTLDs will be created.” This formulation hints at the registry/registrar structure that

finally emerged a year later. But the distinctions between the back-end wholesaler and front-

end retailer were still quite vague. The authors didn’t even have the words they needed to

help them say what they hoped to mean. There wasn’t much descriptive vocabulary yet

beyond “domain” and “registry.” In any case, the point was that NSI’s monopoly on .com

was to be terminated by the implementation of some new form of name registration

technology that was still under development.
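The structure the draft was groping toward can be caricatured in code. This is a hypothetical sketch of the registry/registrar split as it eventually emerged, with invented class and method names; as noted above, the 1996 draft lacked even the vocabulary for it.

```python
# One back-end wholesaler per zone holds the authoritative database,
# while any number of front-end retailers sell names into it.
class Registry:
    """Back-end wholesaler: the single authoritative database for a zone."""
    def __init__(self, zone):
        self.zone = zone
        self.names = {}

    def register(self, label, holder, registrar):
        if label in self.names:
            return False                  # first come, first served
        self.names[label] = (holder, registrar)
        return True

class Registrar:
    """Front-end retailer: competes on price and service, not on the data."""
    def __init__(self, name, registry):
        self.name = name
        self.registry = registry

    def sell(self, label, holder):
        return self.registry.register(label, holder, self.name)

com = Registry("com")
a = Registrar("RegistrarA", com)
b = Registrar("RegistrarB", com)
print(a.sell("example", "Alice"))   # True: the name was free
print(b.sell("example", "Bob"))     # False: both sell from one back end
```

The design point is that competition happens among retailers while the zone itself stays consistent, which is exactly how the .com monopoly was eventually opened without splitting the database.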

This was a clear case of high level IETF members doing social engineering rather

than Internet engineering. “There is a perceived need,” the co-authors wrote, “to open the

market in commercial iTLDs to allow competition, differentiation and change.” Beyond their

freelance anti-trust project, they were frank about wanting to perpetuate IANA and protect

Postel. “As the net becomes larger and more commercial, the IANA needs a formal body to

accept indemnity for the touchy legal issues which arise surrounding DNS policy and its

implementation.” The body named, of course, was ISOC.

IANA was seeking more than ISOC’s legal umbrella. Postel needed to find a long-

term source of funding in order to wean IANA from US Government support. The draft

envisioned a structure in which registries would have to pay ISOC $5,000 annually for their

charters. Some of the money would presumably be passed on to IANA.

Locating new sources of income was a pressing issue. DARPA was already scaling

down its contribution to the IETF. The writing was on the wall for the IANA. It would have

been unwise for Postel to count on getting support from the NSF’s new Information


[333] Karl Denninger, “Top Level Domain Delegation Draft, Rev 0.1.” Circulated by Jon Postel as “Karl Denninger’s Suggestion,” Newdom-iiia, January 29, 1996.

Infrastructure Fund. Mitchell and Strawn had flouted the original promise to solicit

community input before allowing the imposition of fees, and they could just as easily

overlook the pledge that NSF’s $15 portion of the fees would be used to subsidize IANA.

Under these circumstances, Postel could conceivably do what Kahn and Cerf had

done at CNRI – solicit corporate donations. But there was a better option than begging and

becoming beholden to a benefactor. As the de facto overseer of the IP number space and the

root zone, his gatekeeping power ostensibly included the power to collect tolls, and such

power could be used to leverage a steadier source of income.

Postel concluded the ymbk draft with a short appendix that rejected the idea of “A

Thousand Domains Experiment.” Despite its “social appeal,” he had serious concerns. The

first was that dramatic expansion of the TLD space would be an “irreversible decision.” The

second had to do with habits of working that were well known to computer programmers:

“TLDs, like global variables, should be few and created with great care.” Both reasons were

self-consciously moderate, reflecting a deeply ingrained cultural attitude. When moving into

unfamiliar territory, Postel and those close to him preferred to acquire experience

incrementally. When it comes to collective action, technologists tend to be a cautious lot. A

consensus player, Vixie soon withdrew his proposal for thousands of new TLDs and

endorsed Postel’s iterative approach.

Karl Denninger, owner of MCSNet, a major ISP serving northern Illinois and

outlying regions, submitted an alternate proposal just three days later, on January 25. Postel

circulated it on Newdom, where it drew favorable attention. Denninger had started the biz.*

hierarchy on Usenet several years before, and was now seeking .biz as his own TLD. He

had clearly put some thought into the trademark issue (informed, no doubt, by careful

attention to the knowledgenet controversy, since the domain holder had relied on Denninger’s

hosting services). Newdom’s members were intrigued by Denninger’s radical approach.[333]

According to Denninger’s plan, IANA would take responsibility for vetting registries.

It would be empowered to rescind charters, but would otherwise stay out of registry


[334] David Maher, “K. Denninger’s Suggestion,” Newdom-iiia, January 29, 1996.

administration. Most importantly, Denninger insisted that IANA and the IETF should have

no institutional dependence on funds raised by imposing fees on registries, whether through

charters or redemptions from operations. He believed this would solve the indemnification

problem. If the IANA received no money from the registries, there would be no “deep

pockets” to attract litigants. He was also wary of imposing any kind of registry/registrar

distinctions that had been suggested. An inveterate free marketer, Denninger envisioned

leaving any contract details up to the registry and the registrant, thereby fostering

the blossoming (and self-organized weeding out) of diverse business models.

After a short amount of supportive hubbub, words of caution came from David

Maher, the attorney who had represented McDonald’s Restaurants during the negotiations

with Josh Quittner. Opening with the disarming admission that “lawyers on this group are

about as popular as a fart in church,” Maher warned that Denninger’s approach wouldn’t free

IANA or anyone else from legal risks.

Indemnifications are worth the paper they are written on (if they are in writing) plus the net worth of the party giving the indemnity.... The only solution to the problem is to set up a system that is perceived as fair by as many users as possible (including trademark owners) and then provide a due process mechanism for expeditiously solving the inevitable disputes that will arise. (creating dictators is not a solution).[334]

* * *

Maher was a Harvard-trained patent and trademark specialist who liked to dabble in

esoteric subjects like electronic publishing and artificial intelligence. He claims that he was

actually sitting in his office, reading the issue of Wired in which Quittner’s article appeared,

when a McDonald’s representative called seeking help. After bringing the case to an

amicable conclusion in 1994 he became active in the International Trademark Association

(INTA), and was appointed co-chair of its Internet Task Force. The group produced its report

in July 1995, just before NSI issued its trademark-friendly registration policy. INTA’s Board


[335] International Trademark Association, “Request for Action by the INTA Board of Directors,” http://www.inta.org/policy/res_assigndn.shtml.

[336] Ibid.

incorporated the report in a “Request for Action” issued on September 19, 1995 in response to

NSI’s imposition of fees.[335]

The trademark community was pressing two types of demands. The first concerned

the legal treatment of domain names: INTA wanted recognition of the fact that domain

names were capable of functioning as trademarks and that their misuse would harm

trademark owners. INTA’s second demand was intended to provide the trademark

community with technical mechanisms by which they could seek to enforce their rights. In

other words, they wanted surveillance mechanisms built into all DNS registries. The demand

had been addressed directly to ISOC.

[T]he Internet Society, its affiliated organizations and the parties operating under contract with them should make available to the public complete lists of the domain names in a database format that is accessible through existing commercial or private computer search techniques.[336]

Coincidentally, Denninger and Maher were both based in Chicago. But there was no

alliance or meeting of the minds; their strategies diverged ever more widely over time. Maher

was not a technologist by trade, but an outsider pressing a policy likely to cause discomfort

among libertarian-leaning computer enthusiasts. Many such technologists were inherently

suspicious of any attempt to require that Internet core resources – its protocols and essential

infrastructures – be redesigned to serve the purposes of traditional governments and

industries. But Maher showed he was an able politician. He managed to stay engaged while

avoiding antagonistic exchanges. He patiently made his case on newdom, opened

correspondence with many of the players, and even attended IETF meetings (at one, he

sought to endear himself within that surveillance-loathing community by wearing a T-shirt

branded with the logo of a popular encryption application – PGP). He finally won a speaker’s

slot at an upcoming conference on DNS reform, and became a leading player in nearly every

major development which followed.


[337] See the agenda and links to presentations at http://aldea.com/cix/agenda.html.

Denninger, on the other hand, seemed a likely candidate to become a leader in the

reform community. He was simultaneously entrepreneurial and technical, and came forward

just at the time ISPs were becoming the darlings of the Internet boom. He was also an

outspoken advocate of fashionable policy dogmas such as open competition and letting markets

pick winners and losers among innovators. Yet, as we will see, his style was far more

abrasive than Maher’s. Within a few months Denninger would move to the periphery of the

debate and lose any chance of effectiveness.

d. ISPs Ascending – February 1996

The next event on the conference calendar was on February 2, 1996, again in

Washington. This one was co-hosted by ISOC and the Commercial Internet Exchange (CIX),

the industry group that had successfully challenged ANS’s control of the Internet’s high-

speed data backbone. The co-chairs were Susan Estrada, now a member of ISOC’s board,

and Bob Collet of CIX. The conference was titled “Internet Administrative Infrastructure:

What is it? Who should do it? How should it be paid for?” The title reinforced the

assumption that control of the Internet ultimately rested with ISPs. Anyone with a grand new

vision for running the Internet felt it would be necessary to win them over.

Consequently, the conference agenda was organized to give numbers management

as much attention as DNS administrative reform. David Conrad and Daniel Karrenberg were

now deemed forces to be reckoned with. As managers of the Asian and European IP

registries they were intimately involved in mediating relations between the providers of

Internet services in their home regions and relevant parties in the US. At November’s

conference they had been relegated to the microphone queues in the aisles. This time they

would be dignified as featured participants on two of the day’s three panels.[337]

Tony Rutkowski, by this time fully separated from ISOC, led off with a PowerPoint

show that laid out the current “Situational Analysis.” As he saw it, CIX was the Internet’s

“principal international ISP organization” and his Internet Law Task Force was “emerging”


[338] Tony Rutkowski, “Situational Analysis,” http://aldea.com/cix/InternetNames.ppt.

[339] IAB, “Minutes for February 13, 1996 IAB Teleconference,” http://www.iab.org/documents/iabmins/IABmins.1996-02-13.html. See also Gordon Cook’s interview with Dave Farber at http://www.cis.upenn.edu/~farber/cook.htm.

as the “principal legal organization.” Any Internet administrative structure would have to

address concerns such as oversight, responsiveness to jurisdictional legal requirements,

diminution of liability, and assurance of “public interest considerations.”

Rutkowski also stressed the importance of “comity” among the many interested

parties on the Internet, an arrangement that was clearly inimical to any restoration of

overarching US Government control. He then argued that the only effective and stable

institutional solution was likely to be a “federative international organization” which honored

the Internet’s long-standing practice of “deference to Service Providers.”[338] In hallway

conversations during the day Rutkowski suggested that plans were reaching fruition to

establish this type of federation in Geneva, taking advantage of legal structures that would make

it easier for the group to avoid liability challenges.[339]

* * *

There was growing acceptance of the need to move quickly to decouple numbers

management from names. The names debate had gotten bogged down in discussions of basic

principles. It was becoming apparent that resolution of the DNS issues might take a relatively

long time. Moreover, there was already a good model in place for going forward with

numbers.

RFC 1174 had placed great stress on the need to decouple names from routing, and

had paved the way for Postel’s May 1993 delegation of Class A Blocks to the Asian and

European Regional Internet Registries – APNIC and RIPE. The managers of those registries

were effectively free to set their own criteria for suballocation, presuming compliance with

relevant RFCs. The registries appeared to be well run and successful. There were now good

reasons to pursue the same course in the Western Hemisphere.

The European registry, RIPE (Réseaux IP Européens), was created in 1989 as a

“collaborative forum” run by volunteers, mostly university-based network specialists. In


[340] APNG evolved out of Asia’s versions of the Western CCIRN and the IEPG: the Asia Pacific Coordinating Committee for Intercontinental Research Networks (APCCIRN) and the Asia Pacific Engineering Group (APEPG).

1992, RARE, the government-sanctioned networking association, granted RIPE formal legal

status. RIPE began with a three member staff, including Karrenberg, and was funded by

donations from a small number of academic and commercial networking associations. Postel

allocated three Class A blocks to it just after the InterNIC transition was completed. The

European Internet had grown so large by 1994 that RIPE implemented a policy of giving

priority to requests from contributing members. Subsequently, every registry seeking IP

allocations became a RIPE contributor, and received the appellation Local Internet Registry

(LIR).

The formation of the Asian registry APNIC (Asia Pacific Network Information

Center) differed from RIPE in that APNIC’s leadership was far less keen to establish formal

links with traditional government agencies. Doing so would also have been a considerably

more difficult endeavor given the lesser degree of political integration in Asia.

APNIC was created in 1992 as a pilot project of the Asia-Pacific Networking Group

(APNG).[340] Its explicit mission was to “facilitate communication, business and culture using

Internet technologies” and to provide a mechanism which could administer address space

across the region in compliance with Gerich’s RFC 1366. Like RIPE, APNIC depended on

“voluntary” donations from its member organizations, including ISPs (also called Local

Internet Registries), as well as Enterprise and National members. APNIC also became an RIR

after Postel granted it two Class A Blocks in 1993, but unlike RIPE, APNIC did not

immediately take on a formal legal status. Conrad and the other APNIC managers

consequently believed they operated directly under IANA.

RIPE and APNIC had no organizational counterpart in the Western

Hemisphere. NSI was a private, profit-seeking entity. But Karrenberg and Conrad did have

a personal counterpart in Kim Hubbard, who clearly recognized the virtue of following their

lead. Hubbard had begun to worry that if numbers became a speculative commodity, or got


[341] Phone interview with Kim Hubbard, February 17, 2003.

[342] Paul Resnick, “Suggestions for Market-based Allocation of IP Address Blocks,” February 1996, http://aldea.com/cix/ipaddress.html.

caught in the political process that was confounding names management, the Internet itself

would be jeopardized.

I’d go to these meetings and it was all about money. Everybody wants to make money off of names. And I just knew the next step was going to be, “If I can make money selling names, I can make money selling numbers.” And that’s impossible. You just cannot do that.[341]

Whereas the TLD name space could be expanded by at least one or two orders of

magnitude, the size of the IPv4 number space was fixed. The conventional wisdom was that,

if one company were allowed to buy up the remaining available blocks, it could use its

position to choke off the ability of new entrants to get a foothold in the market.

Consequently, that single company would own the future of the net. So much for comity.
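The asymmetry the passage turns on is plain arithmetic; a back-of-envelope illustration (my own, not from the conference record):

```python
# The IPv4 number space is hard-capped by its 32-bit address format,
# while the TLD namespace could be grown more or less at will.
ipv4_addresses = 2 ** 32
print(ipv4_addresses)        # 4294967296 addresses, fixed for IPv4

# Even Vixie's short alphanumeric labels alone would multiply the
# handful of existing generic TLDs by four orders of magnitude.
short_labels = 36 ** 3       # 3-character labels over a-z, 0-9
print(short_labels)          # 46656 possible zones
```

A fixed, exhaustible resource invites hoarding in a way that an expandable one does not, which is why marketizing numbers alarmed people who were relaxed about marketizing names.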

The idea of creating a quasi-public numbers regulatory agency for the Americas had

started to draw support after the November 1995 conference, and was quickly becoming

conventional wisdom, but the principle had to be firmly accepted before the details could be

worked out. A dwindling few still hoped for full-out market solutions such as auctions.

One panelist, Paul Resnick, invoked the Nobel Prize winning economist Ronald H.

Coase as a guiding authority, “As long as exchanges are easily arranged, property will end

up in the hands of those who value it most, regardless of who owns it initially.”[342] But his

arguments gained little traction. There was a credibility problem. Resnick was an employee

of AT&T, the company most likely to buy up the IP space if it were released to the open

market.

Nearly everyone who claimed to have technical savvy insisted on the need to prevent

overloading of the routing tables. It wasn’t clear how a market-based solution could

accomplish the sort of coordination that would guarantee sufficient attention to issues such

as contiguity and concentration in the block allocations.


[343] Phone interview with Don Mitchell, February 11, 2003.

Like many, Hubbard viewed the prospect of full-bore marketization of the IP space

as a disaster waiting to happen. From her perspective, an independent, responsive

administrative organization capable of managing the registry properly had to be created soon.

It was good in some ways that the DNS controversy had drawn attention to the question of

who possessed authority to manage the numbers space in a commercial environment. But

the conduct of the discussions was appalling, and the DNS mess was proving to be a huge

distraction. People were climbing all over each other to design an open market model for

names that, in her view, was fundamentally inapplicable to numbers. With sound

management of IP numbers as her driving concern, she was convinced that its prospects

would be diminished the longer the two discussions remained joined.

Mitchell put things more bluntly.

Look, this names obsession is going to fuck up the Internet. The really important thing is numbers... we need to get them separated, because people don’t seem to be able to deal with names rationally... If you start breaking the numbers, then you start breaking the network.[343]

Though the Hubbard/Mitchell strategy made good sense to most of the insiders, there

was still a gnawing question about whether IANA even had the authority to delegate number

blocks to regional registries. To sanctify IANA’s position atop the names and numbers

hierarchies, it would have to be anointed with some legitimate, official standing. Who would

or could do this, and how?

That problem was dealt with at length during a panel called, “Who Should Set

Internet Infrastructure Policy?” Toward the end, the chair, Susan Estrada, took an informal

poll of the audience, “What I think I’m hearing from this group is we need to institutionalize

the IANA. How many people here agree with that?” Nearly every hand in the room went

up. “And how many of you disagree with that?” In the back of the room, one hand went

up... Jon Postel’s.

A few people didn’t get the joke. They thought he wanted to preserve IANA as his

personal fiefdom. In fact, he clearly understood the need to institutionalize it. He didn’t


bother to state any reason it should not be. But it was clear that he did not enjoy being the

center of so much attention. In 1996 he could still find the humor in the irony of that and

poke fun at himself for it.

Though the emphasis was on numbers, names were not forgotten. Maher, a member

of Estrada’s panel, presented his own take on the domain names crisis and a critique of the

leading proposals before the community. Reiterating the dogma that “the genius of the

Internet is that no one really runs it,” he nevertheless conceded that NSI, by virtue of its

contract with the NSF, “stands astride access” to the assignment of domain names. Even

if (from his perspective) no one ran the Internet, there was no denying that NSI was “the only

game in town” for anyone who wanted to establish a globally visible presence. The picture

of a giant standing astride access to the promised land is as good a metaphor for a gatekeeper

as any.

Maher’s primary concern was that “there is no current basis for jurisdiction under any

settled body of law.” He dismissed the FNC’s claims to ownership as “extraordinary”and

“arrogant,” though he believed several other agencies that had sufficient standing to make

a legitimate claim. He agreed with Rutkowski’s argument that the FCC in particular had

“plausible statutory basis for asserting jurisdiction over the Internet.” He also believed that

various actors at international, national and even local levels had grounds to assert

jurisdiction. Perhaps even state Public Utilities Commissions could make jurisdictional

claims over facilities and communications sited within their boundaries.

Such assertions of jurisdiction by varied arrays of local and national agencies would

not be, in Maher’s view, “a cheerful prospect.” Trademark attorneys did not want to have to

defend their client’s interests across a complex, balkanized legal territory. The trademark

community preferred a solution that would facilitate a uniform approach to dispute

resolution, with recourse to local courts kept available as a last resort.

Maher found fault with the various proposals he had seen emerge from both the

technical community and Rutkowski. All were taking a “head in the sand” approach to

business realities, looking for ways to escape liability for any disputes that might arise.

Postel’s supporters were seeking to control the right of recourse by requiring adherence to


344. David Maher, “Legal Policy,” http://aldea.com/cix/maher.html.

binding arbitration procedures executed within the technologists’ own institutional structures.

Rutkowski’s strategy was to do as much as possible to sidestep challenges altogether. But

disputes were inevitable, Maher insisted. It was necessary to face that reality and develop an

approach that would be responsive to the interests of trademark holders.

Therefore, any reform, concluded Maher, should incorporate due process, ensuring

that “erroneous, illegal, arbitrary, biased, or corrupt actions” of the responsible administrator

could be reviewed by a competent national or international tribunal. The best approach, he

added, would be to promote the “creation of a new international jurisdiction, based on

treaty.”344

That aspect of the proposal had profound implications. Over the previous few years,

the rise of the Internet had inspired ambitious premonitions about the possibility of breaking

the bounds of political sovereignty. The rise of the DNS controversy had sharpened the

thinking about how this might be accomplished, and the ISOC/CIX conference had

juxtaposed the fundamental differences among the different points of view. Whereas the

engineering elite and the global free marketers had put forth various strategies for bypassing

governmental authority, Maher had now introduced a strategy for co-opting it.

Rather than clarifying anything about how to resolve the DNS controversy, Maher’s

proposal only served to add yet another model to the mix. At that point, the political

momentum was still in the hands of the ISPs, and it would take some time to assimilate

Maher’s ideas into the discourse.

* * *

One of the last presentations of the day, by APNIC’s David Randy Conrad, focused

on how to establish a hierarchy of trust within a new regime. “Depending on who you talk to,”

he wrote, there were three models of the present situation. The first was the “historical

model,” in which authority flowed through the following chain: DoD, FNC, IANA, InterNIC,

RIRs, LIRs, End Users. Then there was the “politically correct” model, which he considered

“less controversial”: ISOC, IAB, IANA, RIRs, LIRs, End Users. Finally, there was the


345. David Randy Conrad, “IP Address Allocations,” http://aldea.com/cix/randy.html.

“pragmatic model”: ISPs, IANA, RIRs, LIRs, End Users. This last model recognized that

ISPs had the power to ignore IANA and the RIRs by creating a new “Anti-NIC.”

For Conrad, the historical model was “essentially irrelevant.” The Internet, he wrote,

“has matured beyond the point where the US Department of Defense or the Federal

Networking Council can exert any control.” Any attempt by the US government to reimpose

control and take back address space would introduce “chaos,” because ISPs would be

“unlikely” to “play along.” On the other hand, he considered the politically correct model to

be almost as controversial. The IETF community – which had significant overlaps with the

ISP community – still harbored deep mistrust toward ISOC and the IAB. And the pragmatic

model would remain an unrealized vision until the ISPs built up their own distinct

organization.

Table 2. David Conrad’s Models of Internet Governance

Historical    Politically Correct    Pragmatic    Proposed
DoD           ISOC                   ISPs         ISOC-IAB-ISPs
FNC           IAB                    IANA         IANA
IANA          IANA                   RIRs         RIRs
InterNIC      RIRs                   LIRs         LIRs
RIRs          LIRs                   End Users    End Users
LIRs          End Users
End Users

Conrad’s solution was to propose a new organization in which ISOC, the IAB and

ISPs would serve as peers atop a hierarchy that devolved to IANA, RIRs, LIRs, End Users.

The point was to retain current levels of authority in IANA’s hands, but to constitute a new

oversight body that would serve as an organization competent to review any challenges to

IANA’s decisions.345

One of the biggest drawbacks to Conrad’s proposal was the lack of an institutional

American counterpart to APNIC and RIPE. Strawn detected the emerging consensus. It was

palpably evident that the desire to create an RIR for the Americas was widely popular.

Moreover, the prospect of achieving that goal was much simpler than resolving the DNS


346. Phone interview with Don Mitchell, February 11, 2003.

controversy – a big mess that promised only to get bigger. Hubbard was encouraged to

proceed with her goal of creating a new quasi-public agency, to be called the American

Registry of Internet Numbers (ARIN).

Hubbard’s organizing and lobbying activities were to be paid for by NSI. The

company also agreed to subsidize ARIN’s startup operations until the organization could

fully rely on remittances from IP users. This arrangement was clearly in the interest of NSI’s

owners. Not only might the company reap the public relations advantages that would accrue

to it as a good citizen of the Internet, it also stood to gain greater profits by moving to

guarantee robust growth of connectivity, and consequently, demand for domain names.

There was also a quiet deal in the works... Mitchell and Strawn had begun to consider

a strategy that would completely release formal ownership of the .com, .net, and .org zones

to NSI as soon as the numbers registry could be split off and formally turned over to

ARIN.346

Yet more imperatives motivated the creation of ARIN. Behind the scenes, Hubbard

and several others had begun to harbor questions about Postel’s interventions in the

allocations of the number space. Due to the widespread perception of Postel as a wise and

impartial arbiter of policy, his individual actions were often taken as rule-setting precedents.

Hubbard and Mitchell worried that he was on the verge of making some choices that would

lead to bad long term consequences. There were signs that Postel was slipping. He clearly

needed help. In short, there was a need for greater rationalization in the decision-making

process. A North American registry dedicated to that single task could take the pressure off

Postel and provide just the sort of carefully spelled-out guidelines that were needed.

e. The Rise of ARIN – Flashback

The crowding of the IP space in the 1990s was a vestige of allocation decisions made

during the development phase in the late 1970s and early 1980s, well before the 1983 cutover

to TCP/IP. Formal implementation of the 32 bit regime began in 1981. It was apportioned


347. Jon Postel, “RFC 755: Assigned Numbers” (also published as IEN 93), May 3, 1979.

348. An equivalent version is posted as “Internet Protocol v4 Address Space (last updated 2006-10-03),” http://www.iana.org/assignments/ipv4-address-space. The cited version is on file with the author.

349. Ibid.

into 128 Class A slots, which collectively accounted for half the available addresses. Forty-one organizations, all involved in TCP/IP research, were named as the first recipients of those slots. This expedited the transition from the preceding eight-bit regime.347
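The arithmetic behind that half-the-space figure is quick to verify. A minimal sketch (an illustration, not drawn from the dissertation):

```python
# Classful IPv4 carve-up under the original 32-bit scheme.
# Class A blocks have a leading 0 bit: first octet 0-127, so 128 blocks,
# each containing 2**24 host addresses.
TOTAL_ADDRESSES = 2 ** 32

class_a_blocks = 128            # first octet 000-127
addresses_per_class_a = 2 ** 24

class_a_total = class_a_blocks * addresses_per_class_a
print(class_a_total)                    # 2147483648 addresses
print(class_a_total / TOTAL_ADDRESSES)  # 0.5 -- half the available space
```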

RFC 790, authored by Postel and published in September 1981, carried a list of the

original Class A assignments, setting things out in four columns: 1) The decimal value for

the first octet of the 32 bit address, ranging between 000 and 127; 2) A short name for the

network; 3) A long name for the network, and; 4) A contact reference, expanded in the

endnotes as an email address. Quite a few blocks were listed as “reserved” or “unassigned.”

The reference for those was simply Postel’s initials, JBP. There were regular RFCs listing

the allocations until early 1987 when the list of Class B and C assignments grew so lengthy

that the task was delegated to the DDN-NIC.

An updated list of Class A addresses titled “Internet Protocol Address Space” finally

reappeared at the ISI website in 1998, but not as an RFC.348 It reflected interesting changes.

Now there were only three columns: As before, it started with the Address Block. The middle

column was labeled “Registry - Purpose.” The last column held the allocation date. There

was also an introductory paragraph that offered a brief rendition of history.

Originally, all the IPv4 address space was managed directly by the IANA. Later parts of the address space were allocated to various other registries to manage for particular purposes or regional areas of the world. RFC 1466 documents most of these allocations.349

Of course, there was no IANA as such when the IPv4 address space was created. It

was simply convenient for Postel to equate the institution with himself. The RFC he cited

was Gerich’s May 1993 “Guidelines for Management of IP Space.” Gerich had laid out the

criteria by which an organization might become a regional registry, but said precious little

about what policy governed assignments for “other purposes.” As with so many other things,


350. An Apple employee and long-time IETF participant told me that after he recognized the size of the opportunity at hand, he solicited Postel personally and received the allocation on Apple’s behalf.

the matter had been left up to Postel. “While it is expected that no new assignments of Class

A numbers will take place in the near future,” she wrote, “any organization petitioning the

IR [the DDN-NIC function run at that time by NSI] for a Class A network number will be

expected to provide a detailed technical justification documenting network size and structure.

Class A assignments are at the IANA's discretion.”

Not surprisingly, a handful of American, British, and NATO military agencies had

received Class A blocks in the 1981 list, as did businesses which were actively involved in

TCP/IP development. These included BBN, IBM, AT&T, MERIT, Xerox, Hewlett Packard,

Digital Equipment Corporation (DEC), and Performance Systems International (PSI), a

major provider of backbone services that later became the operator of Root Server C. A few

universities such as Stanford, MIT and UCLA were included, as were overseas research

networks such as Nordunet and University College London. The regional registries had some

slots, of course, and several blocks were set aside as “IANA - Reserved.” The US Postal

Service had received a block, as did a body named Amateur Radio Digital Communications.

Over the following years, there were also some unlikely recipients. They included

private corporations such as Ford Motor Company, Eli Lilly, Merck, Prudential Securities,

Halliburton, and foreign agencies such as the UK Department of Social Security. Apple

Computer Inc. also received a Class A allocation, though the company was not a significant

participant in the development of Internet technology.350

Ironically, when controversies began to erupt in the early 1990s most of the criticism

of the IP allocation process was directed at Hubbard, who exercised day-to-day authority

over considerably smaller blocks of space than Postel. In IETF nomenclature, this was an

“existence proof” of an overarching principle... the tendency of individuals and organizations

to focus disproportionately on the details of smaller expenditures, skimming over the largest ones.

One of the few Class A allocations made by Postel that received any public attention

at all was dated July 1995, listed as “IANA- Cable Block.” Several prominent alumni of the


351. Phone interview with Kim Hubbard, February 17, 2003.

352. Ibid.

Internet engineering community had gone to work for a California-based venture named

@Home, founded in 1994 by NASA’s Milo Medin. Paul Mockapetris was the second

employee. The company later attracted notables such as Elise Gerich, Mike St. Johns, and

Robert F. Schwartz (chair of the 1994 InterNIC Interim Review). @Home’s business model

was based on delivering Internet connectivity via the cable television system.

As the business got underway, Mockapetris requested a Class A block on behalf of

@Home from Hubbard, based on the belief that the company’s client base was going to

mushroom. When she turned him down, Mockapetris went directly to Postel, promising that

@Home would manage the space on behalf of the entire US cable industry, rather than only

for itself. Hubbard recalls that Postel was “dead set against it for a long time.” She and

Postel also shared many doubts about the ability of the Internet’s routing system to

accommodate @Home’s plans.351

Postel finally gave in, despite his previous reservations. As Hubbard recalled, “He

said, ‘Just go ahead and issue it to them for all of cable... It’s not going to be routable

anyway. They’re not going to be able to do anything with it. And once they figure that out,

that will be the end of it.’” She thought she had found a way to finesse the decision without

allocating the entire block to a single vendor. “We put it in the database in such a way that

it wasn’t really @Home’s numbers. It was just reserved [as an industry-wide block]... Any

cable company could come to us and get address space directly from that Class A.” Thus the

listing “IANA - Cable Block.”352

The ultimate solution for the @Home matter was considerably more reasonable than

the remarkably quiet arrangements that permitted monster allocations to be made to the likes

of Apple, Halliburton and Eli Lilly. Nevertheless, the event provided grist for the mill of

those who were unhappy with the ongoing allocation regime. The story could be twisted to

make it sound like Postel caved in to pressure from a friend, granting him an entire Class A


353. See “Letter to US Rep. Tom Bliley from Jim Fleming,” nettime, July 18, 1999, http://amsterdam.nettime.org/Lists-Archives/nettime-l-9907/msg00071.html.

354. Phone interview with Kim Hubbard, February 17, 2003.

355. Ibid.

block while less well-connected petitioners were left scrambling for tiny morsels of the

precious numbering space.353

* * *

The Cable allocation was not the only time Postel had reversed one of Hubbard’s

decisions. Since she was generally the first point of contact for anyone seeking allocations

in the available Class B and Class C spaces after the DDN-NIC was moved to NSI, she often

had to turn applicants down, or give them less than what they wanted, or tell them to

rearrange their existing space more efficiently. There were some companies, she said, “who

complained all the time about everything.”354 The better-connected or litigiously predisposed

of them had few qualms about trying to go over her head and present their troubles to Postel.

“Jon understood what I was trying to do and why it was important. And for the most part,

he was great about it.” But there were exceptions. An early incident involved the US Navy.

As she recalled,

They came [to me] and requested some huge amount and there was no way that they needed it. So they went to Jon and Jon had his discussions with them and came back to me and said basically, “Go ahead and give it to them.” I didn’t like it, but it wasn’t my job at that time to say no to them.355

The Navy could have used its own address space less wastefully, just as others were

being told to do. However, instead of undertaking the task of renumbering its networks, the

Navy insisted on taking fresh Class Bs from unallocated space in the existing research

network. This was manageable at the time, since there were still quite a few Class B slots

remaining. But as the Internet boom accelerated the pressure to conserve space became more

intense, so Hubbard remained attentive to these sorts of problems.

When investment in small startup web design shops and private “mom and pop” ISPs

began to explode in 1994 and 1995, Hubbard was swamped with requests for IP numbers


356. Ibid.

from “everybody and their mother.” Now there were hordes of newcomers who wanted to

get in on the action, but “had no clue what they were doing.” She told of getting phone calls

“constantly” from people who told her they were planning to become an ISP, and then asked

her what TCP/IP was.

ISPs were also getting pressed by their customers who wanted more address space.

She encouraged the ISPs to blame the InterNIC. “Say, ‘You know, I’d love to help you but

these mean old people... say no.’”

I did it and I didn’t have to do it. I could have been very popular by just giving people what they wanted instead of what they actually needed, but I recognized, along with a lot of other people, that would have eventually done damage to the net.356

An incident in 1996 finally convinced Hubbard that Postel had to be taken out of the

allocation loop altogether. Her recollection of the story was that representatives from Bank

of America contacted her to request “eighty Class Bs or something crazy like that.” After

spending “many hours” talking to the bank’s personnel about the configuration of their

network, she concluded they actually only needed three. When she confronted a bank

representative, he admitted the request was excessive, but said that his bosses wanted all of

the blocks anyway. “Well, it can’t hurt to ask Postel,” he pressed her. “Just give it a

shot.”

Hubbard called Postel and said, “I’ve been told to ask you this.” She explained her

position that only three Class B blocks could be justified. He answered, “I’ll get back to

you.” When he did, he directed her, “Go ahead and give it...” After that, she said, “I just

stopped sending things to Jon.” Unhappy applicants tried going over her head anyway, but

the power to impose gatekeeping discipline was now concentrated in her hands. By then the

domain controversy was beginning to make significant demands on Postel’s time, and the

move to constitute ARIN was well underway. He was probably relieved to find that at least

one kind of problem was becoming less prominent among the many on his plate.


* * *

The need to create a North American counterpart to APNIC and RIPE traced back to

Elise Gerich’s 1993 plan to avoid the projected exhaustion of IP address space. The DNS

controversy finally created the political space to get things moving. NSI’s improving

financial position turned out to be another boon. The company committed to fund the American

registry’s first year of operations. Hubbard was given a virtual carte blanche to travel

wherever she needed to go to make her case for the new organization.

Her challenge was to set up a fee structure where none existed before. She not only

had to convince the ISPs to pay for something previously given for free, she needed to

persuade them that the new allocation regime was fair. But these were not insurmountable

problems. Educated stakeholders were well aware of the physical realities – the possibility

of exhausting the address space and the limitations of the routers. Also, the success of

APNIC and RIPE proved there were good models – “existence proofs” in IETF parlance –

for regionally-organized number registries.

The European and Asian approaches were simultaneously idealistic, virtuous, and

practical. Clear ground rules reduced the likelihood that the process could be manipulated

or the participants intimidated. New recruits found themselves in a broad-based organization

that had open, transparent, and clearly-documented processes well in place. Those models

convinced Hubbard of the need to spell out the registry’s policies with as much detail as

possible prior to launch. That way, people would know “exactly what they could expect to

get before going in.” She picked a small group of insiders from NANOG and the IETF to

help her draft the rules.

Perfect fairness was not feasible. There was no way to get around the fact that

ARIN’s “jurisdiction” would not extend to the original Class A recipients. They would be

able to keep their blocks and reap the advantages without having to contribute anything to

keep the system going. Over the years – before and after the creation of ARIN – Hubbard

lobbied them to return space voluntarily, but only a few agreed.

Given the fights she had been through while working for the DDN-NIC and the

InterNIC, the road to creating ARIN was relatively peaceful. The process was not short,


357. Bill Manning and Paul Vixie, “Technical Criteria for Root and TLD Servers,” March 1996, published as draft-manning-dnssvr-criteria-01.txt. Text of the first revision is still available via Manning, “Re: RIPE24 DNS-WG proposed agenda,” April 19, 1996, http://www.ripe.net/ripe/maillists/archives/dns-wg/1996/msg00024.html.

however. A preliminary step came in November 1996 when RFC 1466 was replaced by RFC

2050, “Internet Registry IP Allocation Guidelines.” The next step was to secure US

government approval. This was easier said than done.

The spectacular new prominence of the Internet and the obscure but vexing turmoil

of the DNS controversy were drawing the attention of successive new crops of officials at

higher and higher tiers in the government. The members of each new crop would need time

to get up to speed on the issues. They would also seek to impose their own decision-making

systems within the process and to make their own determinations of how the outcome should

be structured.

f. The Alt-Root Backlash – Spring 1996

Distilling the IP numbers issue from the crucible of the Internet governance debates

did little to improve the level of discourse about names. Fundamental differences came into

sharper relief. Participants became more emotional. Opposing camps became more stubborn

in their commitments. Promising steps forward were frequently undermined by perturbing

steps back.

One refreshing sign of progress was a collaborative effort by Bill Manning and Paul

Vixie to isolate and clarify objective performance criteria for new TLD registries.357 It

addressed mundane concerns such as time synchronization, minimum transaction rates, the

need for uninterruptible power supplies, and staffing levels. Their draft was ostensibly

prepared under the aegis of “alt.scooby-doo_is_the_true_messiah,” indicating that someone

was still trying to maintain a sense of humor. Manning posted it to newdom on March 17.

Later that same day, Eugene Kashpureff made his first appearance on the list, ruining the

mood.

An up-and-coming entrepreneur, Kashpureff styled himself as a crusader for freedom

of choice in the market for Internet domain names. He had distinguished himself as a math


358. David Diamond (1998), “Whose Internet Is It, Anyway?”

and electronics prodigy in 1975 at just ten years old, building a computer from fairly

rudimentary components. He went on to endure a troubled adolescence, but ultimately

prevailed over homelessness and the lack of a high school diploma to build a career for

himself in the information technology business. He began to find success after forming a tow

truck company in Seattle. He computerized its operations and began marketing his software

throughout that industry. Over time he became quite familiar with the operation of popular

dial-up computer bulletin boards and news sharing systems.358

Kashpureff became so excited after seeing the Internet demonstrated in early 1995

that he put aside (“hung up,” in his words) the towing business and invested in a floundering

ISP. In short order he created one of the largest local yellow-pages-like directories then

available on the Web. He also became a professional spammer, sending out massive

quantities of unsolicited email advertising across the Internet. This was augmented by

involvement with a startup company that focused on hosting personalized email addresses,

a business that paid off quite handsomely, he claimed.

In February 1996 Kashpureff was also involved in launching brokeragent.com, which

was perhaps the first online service dedicated to brokering the transfer and resale of domain

names. He simultaneously began looking for ways to sell new names rather than simply

broker existing ones. Since there was no immediate way to get NSI to share the sales of

names ending in .com, .net and .org, the obvious strategy was to offer to sell names using an

appealing new set of suffixes. This was more easily said than done. The problem was getting

NSI to enter new suffixes in Server A’s master database. That’s what the fight on newdom

was all about.

Kashpureff and his wife were expecting a child (their fourth) and this must certainly

have contributed to his money-seeking resolve. He and Diane Bohling, one of his business

partners, decided to create a business called AlterNIC that would let them offer names for

sale in their own TLDs, bypassing NSI. Laughing about it during an interview, he described

how the decision to start this longshot business was made:


359. Phone interview with Eugene Kashpureff, August 21, 2001.

Me and Diane were getting drunk one night. And she kind of posed the question, “Well, what’s to stop us from doing our own root name service? To hell with them!” And I thought about it for about five seconds, and I said, “There’s absolutely nothing technically stopping us from doing our own root name service network. And we could probably do it better than them....” It was literally, we were up drinking one night and AlterNIC was put together at four o’clock in the morning. The domain name was registered, the root zone files were put together. The whole nine yards...359

It sounded crazy: Build an entirely new alternate root server system and get people

to use it. During their alcohol-enhanced brainstorm (perhaps in spite of it) Bohling and

Kashpureff realized that the technical challenges of mounting their own root service were

minimal. The DNS was a long-proven technology and the BIND software was well

understood. The financial challenges did not seem terribly formidable either, especially in

light of the potential payoff. NSI had gotten into the domain name business at the “ground

floor,” and was raking in millions with no end in sight. These were still the “early days” of

the Internet, as far as most people could tell, and the chance to be the second in line for such

a promising market was a tantalizing prospect. By March they were experimenting with

techniques that would provide a way to route around the legacy root constellation.

* * *

When Kashpureff finally found his way onto newdom and other online domain policy

discussion groups, he distinguished himself with a rash, playfully arrogant, sort of bluster.

He was outspoken about his willingness to undertake a go-it-alone approach, showing scant

regard for anyone’s objections. He made repeated professions of faith in letting “the market”

sort things out. Closing his messages with the salute, “At your name service,” he made it

clear that he wanted to offer new names for sale, in spite of whatever the Internet’s old-guard

might have to say about it.

Kashpureff and several other “alt rooters” antagonized the old-timers on newdom by

openly playing around with suggestions about which TLDs they would soon create. Postel


360. Jon Postel, “Plans for new top level domains,” Newdom-iiia, March 17, 1996.

361. Tony Rutkowski, “Re: New Top Level Domain policy may be set be fiat! Read this,” Newdom-iiia, March 18, 1996.

362. Jasmin Sherrif, “Re: I need a new red pen,” Newdom-iiia, March 19, 1996.

stepped in, tacitly appealing for patience, with the announcement that a “slight revision” to

the January ymbk draft was in the works.360

That wasn’t enough to placate Denninger. The next day, March 18, he launched a

sharply provocative thread, claiming that Postel was disregarding the list’s open discussion,

and was making decisions unilaterally. The charge came in a lengthy posting titled, “New

Top Level Domain policy may be set be fiat! Read this NOW!” Rutkowski took the cue and

chimed in with a dig less focused on Postel as a person, but even more pointed. Rutkowski

argued that Denninger’s comments had made the case for “rapid transition to international

bodies of appropriate jurisdiction, authority, and public accountability.”361 A faction was now

organizing around the proposition that Postel’s processes were inappropriate and illegitimate.

As the criticism from Denninger escalated, several Postel supporters decided to take time

from pounding on Kashpureff to counterattack in Postel’s defense.

The first evidence of a DNS War casualty appeared within that series of exchanges,

when a member from Britain spoke up for the first and last time... “Can you please get me

off this list!!!”362

Compared to other newcomers, Kashpureff made far fewer complaints about process

and legitimacy. He was more of a publicity seeker. Recognizing that there was no law

requiring that Internet users or ISPs point to Root Server A, his goal was to entice customers

to give his root server system a try. Metaphorically, his challenge was like attracting new

listeners to a radio station in an environment where all the radio sets were sold pre-tuned to

the NSI broadcast network.

Before that time, no one had bothered to touch the dial and select an alternate

program. Getting them to change the setting was a clever idea, but risky. Given the power

of convention, Kashpureff had a lot of persuading to do. To his advantage, the radio


metaphor isn’t perfect. From a regulatory perspective, the TLD “spectrum” was wide open

to pioneers like Kashpureff, Bohling and the others who soon followed. There was no need

to apply for a license in order to start broadcasting.

The key to an alternate root strategy was to build up a base of cooperative ISPs. Any

name server operator who wanted to use Kashpureff’s root system would only need to make

small, if unconventional, changes in BIND’s configuration files. It would be easy... even fun...

Just a tweak could deliver a whole new world of data. In the jargon of ISP operators, they

would literally “point to” the IP addresses of the AlterNIC’s root servers.
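The “tweak” can be sketched concretely. In the BIND of that era, a name server learned the root servers from a hints file (commonly called root.cache or named.root) referenced from its boot file; pointing at an alternate root meant swapping the legacy entries in that file for the alternate operator’s servers. The server names and addresses below are invented for illustration and are not AlterNIC’s actual hosts:

```
; named.boot (BIND 4.x): tell named which file holds the root hints
; cache  .  root.cache

; root.cache -- a hypothetical alternate-root hints file.
; Every query that cannot be answered locally now begins at
; these servers instead of the legacy root constellation.
.                      3600000  IN  NS  NS0.ALT-ROOT.EXAMPLE.
.                      3600000  IN  NS  NS1.ALT-ROOT.EXAMPLE.
NS0.ALT-ROOT.EXAMPLE.  3600000  IN  A   192.0.2.1
NS1.ALT-ROOT.EXAMPLE.  3600000  IN  A   192.0.2.2
```

Because this was purely local configuration, each ISP could make the change, and reverse it, unilaterally; the alt-rooters needed persuasion rather than permission.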

Instead of relying on the legacy root name service for directions on how to find the

.com database, ISPs which pointed to AlterNIC would get the same information, and then

some. AlterNIC provided everything from the legacy root, which Kashpureff monitored for

updates, plus addresses for his own TLDs, and those of a few friends. The equivalent analogy

would be to dial 311 instead of 411 for directory information, knowing that all the listings

from 411 would be available, as well as listings in some cool-sounding area codes that 411

refused to service. The AlterNIC would become a superset of DNS data. In the honored

Internet tradition of routing around censorship, Kashpureff and his supporters claimed they

were circumventing NSI’s exclusion of alternate TLD name spaces.
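The “superset” relationship behind the 311/411 analogy can be modeled in a few lines. This is a toy sketch, not AlterNIC’s actual software; the zone delegations and server names are invented:

```python
# Toy model of the "superset" idea: an alternate root publishes
# everything in the legacy root zone plus extra TLD delegations.
# All zone data below is invented for illustration.

legacy_root = {
    "com": "ns.legacy.example",   # delegations the legacy root already serves
    "net": "ns.legacy.example",
    "org": "ns.legacy.example",
}

alternate_extras = {
    "xxx": "ns.alt.example",      # TLDs the legacy root refused to list
    "exp": "ns.alt.example",
}

# The alternate root = legacy data (monitored for updates) + extras.
alternate_root = {**legacy_root, **alternate_extras}

def delegate(root, name):
    """Return the name server delegated for a name's TLD, or None."""
    tld = name.rsplit(".", 1)[-1]
    return root.get(tld)

# An ISP pointing at the alternate root loses nothing...
assert delegate(alternate_root, "acme.com") == delegate(legacy_root, "acme.com")
# ...and gains resolution for the new suffixes.
assert delegate(legacy_root, "movies.xxx") is None
assert delegate(alternate_root, "movies.xxx") == "ns.alt.example"
```

The merge order matters: listing the legacy data first means any collision would be won by the alternate operator, which is exactly the trust question the old guard objected to.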

With so little movement forward in the effort to break NSI’s monopoly, and with

Postel increasingly under fire, accused of being an obstacle to progress, times were ripe for

fresh faces in the DNS business.

* * *

AlterNIC was essentially up and running by the end of the month. The formal launch

was, appropriately, April 1st. Kashpureff was already publishing his own TLDs .xxx, .ltd,

.nic, .lnx, .exp, and .med, and had started adding others, including .earth and .usa, on behalf

of John Palmer, an aspiring TLD operator who had earned a reputation on the USENET as

a confrontational kook. The alternate rooters took on a haughty attitude and exalted their

actions on Newdom under the thread, “Bypassing the Politburo.”

Kashpureff wasn’t the first of the alt-rooters, just the most audacious. The true

pioneer of their movement was a small ISP that had begun offering service under the suffix


Paul Vixie, “Re: No Traffic - Whats going on (fwd),” Newdom-iiia April 14, 1996.363

Ibid.364

.dot at the end of 1995. Soon after, in January 1996, Paul Garrin, a New York-based media

artist doing business as name.space, launched a pick-your-own-TLD service offering names

in an ever-widening range of suffixes.

What distinguished Kashpureff, however, was his talent for remaking himself as a

public persona – a true advantage in a culture where most people are so woefully unable to

distinguish fame from infamy. Journalists found him to be an engaging and colorful

interview subject (as did I), and he was clearly the most politically adventurous of all the alt-

rooters. He was willing to meet the old-guard engineering community on several fields – the

newdom list and other online domain-policy fora, and even its home turf, the IETF. The

engineering community was full of iconoclasts, so it wasn’t inconceivable that Kashpureff

would fit in. But members of that community tended to value its processes very highly,

especially in the wake of the IPv7 crisis. Kashpureff was too obviously a loose cannon to

ever be accepted as a peer.

Despite Kashpureff’s odd charm, Vixie loathed him, almost without restraint. To

make an alternate root “visible,” it would be necessary to alter a configuration within BIND.

Vixie never intended for the software to be exploited this way. Shortly after Kashpureff

arrived on newdom, Vixie announced to the list that he felt he had a special responsibility to

“rail against this kind of bullshit in the strongest possible terms.”363

An alternate root hierarchy was, at best, an “unpleasantly bad idea,” Vixie wrote. “It

fails to scale; its average case performance is hideous or awful (take your pick). It does not

solve any real problem, while creating many new ones.” His missive included a passage

which revealed a deeper, philosophically-motivated reasoning. The key section was a parable

about maturity and appreciating the necessity of law.

When I was a child, it seemed to me that adults had unlimited power and I could hardly wait to grow up so that I, too, could live above the laws of man and of physics. Now, as a nominal grown-up, I have found that the world is a fragile place and that there are more, rather than less, laws I must operate within.364


Ibid.365

Bill Frezza, “The ‘Net Governance Cartel Begins to Crumble,” InternetWeek, October 28, 1996.366

http://www.internetweek/102896/635frez.htm.

Juxtaposing the “ignorant savagery” of children with the responsibility of grown-ups,

Vixie insisted that adults must do all they can to give their own kids “the illusion that they

are safe and that nothing can hurt them while they work on growing up.” He cast the alt-

rooters as children who preferred “mob rule.” Unwilling to restrain their infantile urges, they

wanted to create any snide names they could string together, even if it meant turning the DNS

into a polluted cesspool like the USENET, an early showcase of online freedom that had

fallen into decadence. Conceding that Postel’s strict sense of discipline probably seemed

“fascistic” to the alt-rooters, he was nevertheless certain that the Internet was in good hands.

I don't always agree with the IANA's way of doing things, but I recognize that we are better off under the benign dictatorship of someone who (a) has been around a lot longer than I have and (b) genuinely wants the Internet to grow and take over the universe.365

Having the Internet “take over the universe” was a goal that

resonated with everyone participating in the discussion. Any disagreement was about means,

not ends. The alt-rooters wanted to claim they had inherited the Internet’s spirit of predatory

innovation.366 To his credit, Vixie had a visceral understanding of the extent to which rules

are simultaneously constraining and enabling; he combined this with an appreciation of their

value and their delicacy.

Vixie wasn’t alone. Many other old-guard IETF members rejected the idea of

alternate roots. Scott Bradner feared they would give the Internet a bad reputation in parts

of the world where it had not yet established a strong foothold. Another IAB member, Steve

Bellovin, was sure they would foul up the caches in the routing system. Some insiders,

including Randy Bush and Perry Metzger, considered Kashpureff and his ilk an annoying

distraction, but essentially harmless. They portrayed the alt-rooters as laughably marginal

loons who were destined to remain at the periphery of the Internet, presuming they could stay

in business at all.


David Maher, “Draft 42-B,” newdom-iiia April 4, 1996.367

* * *

Though there was a diversity of reactions in the engineering community about the rise

of alternate roots, the prospect was pure nightmare for the trademark community. Trademark

owners were obliged by law to protect their marks when threats were apparent, or risk

undercutting their ability to demand protection in the future. If the creation of grassroots

TLD registries continued, the owners of “famous names” might have to defend their marks

in a rapidly proliferating variety of venues. In practical terms, this would mean registering

their names in several alternate TLD registries, regardless of whether those registries had any

visibility. The strategy would at least ensure that no cybersquatter could get the name first,

holding it hostage if that TLD were ever added to the list of zones in Server A.

It was conceivable that this proactive registration practice might take on a life of its

own, enriching people who would offer up a new TLD simply for the purpose of collecting

tribute from the owners of famous names. If Ty and Tupperware weren’t safe, no one was.

To avoid this blackmail, some trademark lawyers believed the best strategy was to choke off

the alt-rooters by doing whatever possible to restrict the expansion of the TLD space. The

next step would be to build strong mechanisms that would protect trademark holders against

infringement, or perhaps even privilege the access of “famous mark” holders to domain

names.

The strategy of seeking to block the creation of new TLDs was not a universal

position in the trademark community, however. Maher argued that the creation of many new

TLDs was preferable because it would ensure that no trademark holder would be denied

access to the Web; infringement problems could then be dealt with case by case.367

Other trademark attorneys had joined newdom by March 1996, gamely trying to make

themselves heard above the din. The experience was like teaching Trademark Law 101 to an

overtly rebellious class of heathens. Some of the most outspoken engineering mavens refused

even to allow that basic legal concepts such as national jurisdiction might be applied to the


Michael Dillon, Newdom, “Re: A little more trademark law,” March 27, 1996.368

Michael Dillon, “Trademark Law 101,” Newdom-iiia March 28, 1996.369

Perry Metzger, “Draft 42-B,” Newdom-iiia April 1, 1996.370

Internet. Instead, they insisted that domains were immune to trademark law because iTLDs

were “international.” 368

There was an even deeper disconnect between the techies and the attorneys over basic

semantics. The argument hinged on the meaning of the word use. For attorneys, what

counted was that domains were being used in commerce. There was simply no getting around

the fact that domain names could function as trademarks and that trademark holders could

be harmed by their misuse. Recognition of this was, after all, the main demand of Frank’s

and Maher’s Internet Task Force. Trademark attorneys had been trained to treat the word use

as a term of art, and there was little flexibility in their definition.

The techies, however, tended to view domain name space as a distinct new open

space which had nothing to do with trademarks at all. Its purpose was simply to “translate

somewhat mnemonic and relatively stable alphanumeric addresses into the cryptic and

changeable numeric addresses used by the Internet infrastructure” (my emphasis).369

Consumers could presumably be taught to appreciate the difference. And if national

governments didn’t get the point, that was their problem. “[T]he internet doesn't have, or

need, a government... Congress really has no role here other than to stay the hell out of the

way.”370

g. Draft Postel – May 1996

Postel circulated his first revisions to the ymbk draft in late March. The most

significant change was to explicitly emphasize the goal of raising funds to support IANA’s

operations. Toward that end, Postel proposed that registries pay $100,000 for their initial

charter, and $10,000 thereafter for annual renewals. Denninger was outraged. To assuage

him, Postel raised the possibility that instead of going to IANA, the money might go


Jon Postel, “InterNIC Domain Policy,” Newdom-iiia June 21, 1996.371

The text was suggested by Michael Dillon. See Jon Postel, Newdom, “Draft 42-B,” April 2, 1996.372

In a followup to Postel’s response that the statement would be ineffective, Dillon wrote:

I understand the law's point of view on this, it's just that I don't think that IANA or the DNS

system is the party "using" the name if indeed a name is being used in a confusing way. Even the

registry is not "using" the name, they are just recording the domain holder's declaration of "use" in a

public list. Is there no wording that can convey this?

The technical mechanisms underpinning DNS require that domain names be publicly

recorded and be unique. Why should any legal liability accrue to the central authority who records and

publishes this list which is essential to the functioning of the underlying mechanism?

elsewhere, such as ISOC or to the payer’s choice from a list of approved charities. Still,

Denninger could not be placated.371

Postel finally published an official Internet Draft on May 3, 1996. Titled “New

Registries and the Delegation of International Top Level Domains,” it became known as

simply “draft-postel.” His text staunchly proclaimed the conventional wisdom of the anti-

government technologists: “Domain names are intended to be an addressing mechanism and

are not intended to reflect trademarks, copyrights or any other intellectual property rights.”372

Postel listed himself as the sole author, but acknowledged help. “This memo is a

total rip off of a draft by Randy Bush, combined with substantial inclusion of material from

a draft by Karl Denninger.” Indeed, despite Maher’s warnings, Postel incorporated

Denninger’s suggestions about indemnifying IANA, ISOC, and the IETF against any

trademark infringement proceedings resulting from action by new registries or their clients.

Postel also used ideas suggested by IAB Chair Brian Carpenter concerning the organization

of an appeals process.

As before, Postel set out a timetable by which an ad-hoc committee would review

applications and select new registries, suggesting the process could be completed by

November. Though Postel hadn’t compromised on the issue of trademark doctrine, he had

given up his earlier goal of five new registries per year, each with a one-character TLD zone.

The unofficial draft he circulated in March had moved to the idea of creating five new

registries per year, each now having three three-character TLDs (equivalent to NSI’s three...

.com, .net, and .org). The official version published in May stipulated that there might be as

many as fifty new registries chartered the first year. No more than two thirds of them could


Compare section 6.3.3 Business Aspect of Jon Postel, “Draft 42-B,” Newdom-iiia March 30, to the same section of draft-postel-iana-itld-admin-00.txt.373

draft-postel-iana-itld-admin-01.txt.374

be in the same country, and each would carry up to three exclusively-managed three-

character TLDs. The idea of shared registries was put aside. The bottom line meant starting

out with 150 new TLDs. There would be another ten such registries created annually over

the following four years – 30 more TLDs annually. Once again, Postel added text that

explicitly opposed the idea of creating “enormous numbers” of new TLDs.

Charters would be granted for five years, and registries could be rechartered if given

good reviews. Where the preliminary draft in March had stipulated a $100,000 fee for a

registry charter, the new version named no specific amount.373 The only explicit remaining

mention of money was the requirement of a $1,000 non-refundable application fee, payable

to the Internet Society.

Postel’s draft incorporated some of the technical and administrative requirements that

had been accumulating in a series of Manning-Vixie drafts, including a provision that the

registry escrow the databases needed to generate its critical files. If the operator failed for

some reason (or if its charter were rescinded), this would ensure that the zones and Whois

customer information could be taken over and managed by another registry.

On June 12, 1996, Postel published a second official draft of “New Registries and

the Delegation of International Top Level Domains.”374 It primarily tightened up language

regarding ISOC’s role in the oversight process. There were few substantial changes, and it

generated little reaction. There was an apparent lull on newdom as the pace of complaint and

counter-complaint slowed down, but this was simply the calm before the storm.

* * *

ISOC’s Board of Trustees had eliminated the office of Executive Director when

Rutkowski left, creating the positions of President and CEO in its stead. A brief search ended

in April with the hiring of Don Heath. Heath had previously worked for a number of data

services and telecommunications firms, usually as a manager in sales-oriented departments.

His previous employers included Tymnet, XtraSoft, MCI, and Syncordia, a subsidiary of


Internet Society, “Donald M. Heath,” http://www.isoc.org/isoc/general/trustees/heath.shtml.375

Mueller (2002: 143).376

Personal interview with Vint Cerf, November 15, 2001. Personal interview with Don Heath, November 11, 1999.377

IAB, “Minutes for February 13, 1996 IAB Teleconference,” http://www.iab.org/documents/iabmins/IABmins.1996-02-13.html.378

British Telecom. That long resume allowed him to tout himself as a “seasoned executive.”375

Since Heath and Cerf had both been at MCI, many observers inferred there was a direct

connection between the two men. For example, Mueller’s history asserted that Heath was

Cerf’s “protégé.”376 Nevertheless, both denied knowing the other before Heath interviewed

for the ISOC position.377

Temperamentally, Heath was quite unlike the unmanicured know-it-alls who

populated the IETF and ISOC’s board. His stylish, blow-dried haircut flagged him as out of

place. Not only that, he was outgoing, pleasant, and affable. These qualities distinguished

him even further from the community of hardened cynics he could now claim to lead. What

was most surprising about him, however, was that he seemed so much less technically

articulate than the others at the top. Despite an undergraduate degree in Mathematics, he

didn’t project the tough nerdy sharpness so typical within the engineering culture. And, he

had little prior involvement in the Internet’s development from either an administrative or

legal standpoint. In fact, before he was hired, some IAB members felt it necessary to state

for the record that they knew Heath was not an “Internet guy.” He did, however, have378

experience with fund raising (for community groups in San Diego), and his corporate

background suggested he was likely to give higher priority to ISOC’s bean-counting

responsibilities than his policy-driven predecessor.

During the interview process conducted by ISOC’s board, Heath had said he wanted

ISOC to change its “academic engineering research-founded” orientation toward a

“commercial” orientation. He also wanted it to take a “proactive” role addressing “issues that

are very controversial.” In the context of the time, this would have meant taking public


See his comments to the June 28, 1996 APPLe Workshop, http://glocom.ac.jp/resa/APPLe/D_Heath.html.379

“Access Pricing for Information Infrastructure Regulation and the Internet” was jointly sponsored380

by the OECD Directorate for Science Technology and Industry/Division for Information and Communications

Policy, the European Commission (DGXXIII) and Dublin University’s COMTEC Research Center, June 20-21,

1996.

stances on issues related to encryption controls and the Communications Decency Act

(CDA). Moreover, he favored boosting ISOC’s membership tenfold, to about 60,000.379

In his favor, Heath’s point of view seemed philosophically compatible with the “let’s

use the Internet to route around governments” attitude that was endemic in the engineering

community. He simultaneously gave an air of affinity with the “suited and booted” sappiness

that many engineers and academics presumed was dominant in the corporate world, where

it was nevertheless so important to cultivate alliances. Rutkowski had also insisted on the

importance of working with the business community, but Heath didn’t seem to have any

particular political agenda of his own.

By the time Postel’s Internet Draft was published, Heath was ready to undertake

a series of trips to lobby on behalf of ISOC’s vision of DNS reform. His first was to a major

European conference on Internet Regulation held at Dublin’s Trinity College on June 20 and

21.380 Most of the business there concerned the pricing of connection and access services, and

the possible consequences for the regulation of telephone access. “Hot” Internet issues such

as DNS reform and the recently-defeated Communications Decency Act were also on the

agenda. Heath was scheduled to appear on a panel with David Maher, Albert Tramposch,

Senior Legal Counsel for the World Intellectual Property Organization, and Robert Shaw,

the ITU’s Advisor on the Global Information Infrastructure.

Shaw’s background included employment as a Systems Analyst for the ITU. Through

this, he had acquired first-hand experience building office networks. He was promoted to a

Policy Advisor post at the ITU after receiving a Masters in Telecommunications. He had met

Rutkowski along the way, and the two had clashed, ostensibly over cutting off Carl

Malamud’s Bruno publication project. No doubt, it also had something to do with their


Robert Shaw, “Internet Domain Names: Whose Domain is This?,” in Brian Kahin and James H.381

Keller, eds. (1997), Coordinating the Internet.

David Maher, “Reporting to God,” CircleID, March 27, 2006. http://www.circleid.com/posts/reporting_to_god_icann_domain_names_dns/.382

diverging notions of the virtues of government and the ITU. The conflict became personal,

and would sharpen over the years, especially after the DNS War erupted in full force.

Shaw had written a paper for the ITU which investigated the question of who in the

US had authority over the root and the power to initiate fees. Tracing the bureaucratic chain

of relations from the InterNIC to IANA to DARPA, across to the NSF, and on to the FNC,

till they faded off into the White House, his paper was the most detailed explanation of the

“authority” issue available at the time, and was directly pertinent to the Dublin conference.381

During the formal discussion, Shaw made it clear to Heath that he was “totally

opposed” to Postel’s draft. He charged it “would just create lots of mini-monopolies and

wouldn’t solve anything long-term.” It would only replicate the “lock in” problem Postel

himself had worried about, while making a unified dispute resolution policy even harder to

implement. He also suggested that the appropriate US agency to handle DNS management

would be the Office of International Affairs (OIA) of the National Telecommunications and

Information Administration (NTIA), which was under the Department of Commerce. Heath,

of course, opposed any repatriation of US power over the Internet. Their debate continued

during a pub crawl around Dublin that went late into the evening, joined by Maher and

Tramposch. Maher later wrote about it.

There really should be a plaque in one of those pubs that would say: “Sitting at this table, in June [1996], Don Heath, David Maher, Bob Shaw and Albert Tramposch created the basic concepts that now govern the technical coordination of Internet addresses and domain names.”382

* * *

Traffic on newdom began to pick up again. The message feed in late June often

reached around thirty messages per day, and sometimes over fifty. In addition to the

brouhaha over the proposed cost of registry charters, the attacks on Postel and the counter-


Tom Newell, speaking for NSI as “NIC Liaison,” distributed a press release titled “NSI Moves for383

Dismissal of ‘Roadrunner’ Lawsuit as Moot,” posting it to several relevant email lists. See Newell,

“Roadrunner Lawsuit,” June 3, 1996. See also “Stipulated Order,” June 7, 1996, http://www.patents.com/nsistip.sht; and Rony and Rony (1998: 379-457).

Kim Hubbard, “Re: A fourth lawsuit against NSI by a domain name owner to try,” Newdom-iiia384

June 20, 1996.

attacks, the derision of the alt-rooters and those counter-attacks, NSI’s troubles were once

again competing for attention.

There were now at least four cases where NSI had been sued for either declaring its

intention to put a domain name on hold due to complaints made by trademark holders, or for

doing so. In one of the first cases, an Arizona ISP named RoadRunner was challenged by

Warner Brothers for infringing on the mark associated with the cartoon character. NSI had

put the name on hold, prompting a countersuit by the defendant. When Warner Brothers and

the ISP finally settled matters out of court, the suit against NSI was dropped. NSI’s policy

had survived without a real test. A spokesperson claimed that NSI would have prevailed

anyway, and the company’s attorneys tried to spin the outcome as an endorsement of its

policy.383

Despite the fact that NSI’s policies were so friendly to the trademark-holding

community, the trademark attorneys wanted more. On June 21st, CORSEARCH, a trademark

industry lobbying group, filed a Freedom of Information Act (FOIA) request, demanding that

NSI provide a copy of its registrant database. The trademark community complained that the

WHOIS online search features were insufficient, and that access to a full, up-to-date copy

of the source database was necessary to support timely and thorough searches for “name

nappers.” Kim Hubbard responded for NSI, countering that the company had an obligation

to protect the confidentiality of customers.384

Near the end of June there was an announcement from NSI that domain name owners

who had not paid their required maintenance fees would from then on be given a 30-day

notice to make payment. If the fees were not received by the deadline, their names would be

purged from the database. Not surprisingly, NSI experienced a surge in phone calls from

customers who were trying to straighten out billing issues.


Bob Frank, “my trash can is full (of $%&#@),” Newdom-iiia June 24, 1996. The most frequent385

posters were Michael Dillon, Greg Woods, Simon Higgs, Karl Denninger, Eugene Kashpureff, Perry Metzger,

and Karl Auerbach.

The company had now registered nearly half a million names, adding over 50,000

new registrations in May alone. Paying for renewals under these circumstances presented

a severe logistical problem. In fact, getting any customer service at all was a test of luck.

Incoming callers were getting nearly constant busy signals. NSI upped its service contract

with Bell Atlantic to provide for 40 incoming lines, but even this could barely keep up with

demand. And despite the phenomenal growth in sales of names, the company had yet to turn

a profit. Lots of money was being plowed into investment, but also into lawyers and

consultants.

* * *

Starting with a mid-April thread titled “SOCIALIST BUREUCRACY [sic]

RESTRICTS FREEDOM ON THE INTERNET,” the tone on newdom was getting steadily

nastier. On the Internet, using capitalized text in emails is considered the equivalent of

shouting. Spelling errors, while usually ignored, are nevertheless grating. More and more,

people were shouting rudely and not shutting up. The late-June messaging spike brought the

level of civility to a calamitous new low. Bob Frank, Maher’s increasingly frustrated colleague

from the INTA, tried to cope with the noise by developing a list of the most active posters

within a specific period. He then announced that, in order to screen out “garbage” and

preserve a joyful life, he had decided to delete any messages he received from those

individuals without even reading them.385 This was the first of many times that volubility

monitoring was attempted in the course of the DNS War as a way to shame certain

people into better behavior. Such attempts were never effective.

Perhaps Frank’s life improved, but few chose the luxury of filtering out unpleasant

noise. And there was a lot of it. Denninger’s language had become particularly vicious. He

was now denouncing Vixie and Postel outright, insisting they “cut the crap” or risk a lawsuit.

“I have *already* postulated a way to solve this problem,” he wrote on June 21st. “Postel has


Karl Denninger, “Re: InterNIC Domain Policy,” Newdom-iiia June 21, 1996.386

Ibid.387

Jon Postel, “InterNIC Domain Policy,” Newdom-iiia June 21, 1996.388

ignored it, you [Vixie] have ignored it, and the fundamental truth is that both of you are

attempting to protect a monopoly interest and impose a tax where none should exist.”386

In later messages, Denninger insinuated that both Vixie and Postel were unfairly

exploiting their positions for the sake of financial gain. Postel received the brunt of it.

One man, Postel (he IS effectively the IANA) has decided to take what would normally be "due process" and turn it into a funding mechanism for his pet project, adding legal protection for himself and others on the so-called board while he's at it.387

Postel’s grouchy response was striking, if only because he responded at all. It was his

habit to avoid conflict, especially if it descended to mudslinging. This time he defended

himself and took a shot of his own. “I’ve got plenty to do,” he answered. “I get a university

research salary whether I do IANA stuff or work on the next generation local network, or

whatever.”

The point of collecting money from the operators of top-leveldomains is to do the work to support them. Maybe there will beenough money to support other good infrastructure things in theInternet. That would be decided by the board of the InternetSociety. A board elected by the members of the Society (sign upand elect the board you want).

I can imagine a day when someone else will be so lucky as to have the tasks of being the IANA and have the joy of dealing with the Karl's [sic] of the world. They may need a funding source. This is a relatively painless way of setting one up.388

Rutkowski took Postel’s intervention as another cue to promote his Internet Law and

Policy Forum... and to denigrate Postel and ISOC along the way. “Reality needs to enter into

a space that seems oblivious to externalities...” he wrote, insisting that the “processes and

arrangements for a small DARPA networking experiment are no longer applicable.”

The Internet Society per its charter is a nice little US based, primarily R&E technical professionals organization to further research and educational networking; controlled by a board on which Jon sits, with election processes that primarily recurrently elect people from this same community. It publishes a nice monthly magazine about their Board members. Somehow it seems a stretch that it should become benighted to control the Internet for all the enterprises and peoples of the world, and manage a 100 million dollar a month revenue stream in the world's public interest.389

389. Tony Rutkowski, “Re: InterNIC Domain Policy,” Newdom-iiia, June 23, 1996.

390. The June 13, 1996 draft circulated to ILPF members by Rutkowski was later republished on the IETF list. See Gordon Cook, “Re: Has the IETF outlived its Usefulness? (This is what I tried to post yesterday),” IETF July 15, 1996, http://www.netsys.com/ietf/1996/2649.html.

The ILPF members had just drafted a charter in which they declared their group a

“disinterested neutral organization.” They were also preparing to undertake a “Notice of

Policy Making,” a formal proceeding to be modeled after the public notice and comment

process used by the FCC. The timetable called for the ILPF to publish its proposed

recommendation by September 1st, take comments for two months, and deliver its final recommendations by January 1st, 1997.390

Those rumblings from the ILPF increased the pressure on Postel and ISOC’s board

members to move the DNS reform process forward. With further delay, the initiative might

drift to others. This set the stage for the next ISOC board meeting, to be held alongside the

35th meeting of the IETF in Montreal.

* * *

People in close orbit of the IETF community often found it convenient to schedule

meetings of related groups at the same venue around the same time. For example, the IEPG

generally met just prior to the start of IETF week. ISOC might meet at the beginning or the end.

Traditionally, any IETF participant who was a paid-up member of ISOC and wanted to

sit in on the board meeting was allowed to do so. Few ever bothered.

This IETF meeting would be sandwiched between ISOC and an Asian networking

group. ISOC’s Board would convene on the 24 and the 25 , just as most of the IETFth th

participants were arriving.

Postel presented his June draft to ISOC’s Board members, hoping to get them to fund

a committee that would develop guidelines for an organization that would handle the

assignment of iTLDs to new registries. This would be a big step toward constituting the ad


391. Geoff Huston, “Minutes of June 1996 meeting of ISOC Board of Trustees,” June 24-25, 1996, http://www.isoc.org/isoc/general/trustees/mtg09.shtml.

hoc committee he had been advocating since the previous November. The Board members

seemed particularly interested in what his draft had to say about stipulations regarding

indemnification. They agreed to “endorse” Postel’s draft “in principle,” but asked him to

“refine the proposal by providing a business plan” for their review and approval.391

The response was more tentative than what Postel might have preferred. If his goal

was to delegate the work to a responsible committee, the outcome was, yet again, ‘More

work for the IANA.’

h. IRE BOF – June 1996

Nearly 1500 participants registered for the Montreal IETF meeting. About sixty full-

fledged working group sessions were scheduled, and just under twenty Birds of a Feather

sessions. One BOF in the Operations Area was titled “Internet Registries Evolution,” with

the devilishly appropriate acronym “IRE.” It was hosted by, as best anyone can recall, David

Conrad. Unfortunately, no minutes were kept, and there is no written record of it, other than

an entry in the official proceedings marking its occurrence. But it did take place, providing

an inadvertent occasion for many newdom participants to meet in the flesh, share their hopes,

and vent their antagonisms. It was inadvertent because the IRE BOF was intended for IP

registries, not DNS registries. But Kashpureff was there, working to bend events to his

advantage, and doing his best to leverage the IRE BOF for his own purposes.

Kashpureff had come to Montreal believing he could work out a modus vivendi with

various community leaders. It was a multi-pronged strategy intended to demonstrate that

AlterNIC was a Network Information Center on par with any other. To prove his bona fides

he had begun emulating other NICs, mirroring copies of the RFC series and other technical

materials at the AlterNIC site. Mirroring was a useful and time-honored practice within the

far-flung Internet community that distributed the burden of hosting information. Also, he

tried to build relationships with the people he considered to be the elite of the DNS

governance community. Kashpureff desperately wanted to buy Vixie a beer. He wanted to


392. Phone interview with Eugene Kashpureff, August 21, 2001. Reynolds recalled the meeting, but did not recall the apology.

393. Part of this account is derived from the Kashpureff interview. For Vixie’s side, see Paul Vixie, “con men,” gtld-discuss December 16, 1996.

394. See an early recounting in Eugene Kashpureff, “The Truth,” newdom-iiia July 4, 1996. Months later, he put it this way: “I was led to believe at IETF in June, by Jon Postel and Bill Manning, that they were attemting [sic] to garner support for the draft, and for forming a WG...all the way down to Jon asking me to find the folks with dots on their nametags, to lobby the issue.” Eugene Kashpureff, “Re: NEWDOM: Re: bogosity,” Newdom-ar October 29, 1996.

break bread with anyone from IANA. One of the first things he did at the conference was to

approach Joyce Reynolds and apologize for his past sins as a professional spammer.392

Online, Postel had referred to the alternate TLDs as “renegade domains.” In person,

however, he treated Kashpureff graciously. Postel and Manning met Kashpureff for lunch

one afternoon, an act Kashpureff and other alt-rooters such as Simon Higgs took as more-

than-tacit approval of AlterNIC’s activity. The biggest coup of all was persuading Conrad

to let him have a slot at the IRE BOF, immediately following presentations by representatives

of APNIC, InterNIC, and RIPE. This was the kind of public acknowledgment he wanted.

Vixie would have none of it, however. He subjected Kashpureff to a public tongue-lashing. The IRE BOF became a face-to-face confrontation.393 The stewards of the legacy

system were now denouncing alternate roots as “the second coming of the UUCP maps.”

They had labeled Kashpureff a “DNS terrorist.” Vixie insisted that the publication of names

by the alternate roots was unethical. He refused to even discuss any administrative reform

process with Kashpureff until the AlterNIC root was turned off.

Kashpureff could only retort that “experimentation is the Internet way.” He retreated,

having failed to achieve formal legitimacy for his root service. From then on, by

embellishing the story about his exchange of pleasantries with Postel, he could at least claim a moral victory and heightened status.394 More importantly, Kashpureff had won publicity – the true currency of advertising – so his adventure was far from a loss.

The IRE BOF was fairly well attended by DNS mavens. At one point, someone asked

for a show of hands to assess the level of support for Postel’s draft. A majority favored it, but not strongly enough to meet the IETF’s threshold for rough consensus. Moreover,


395. “APPLe Q&A Session,” June 28, 1996, http://www.glocom.ac.jp/resa/APPLe/Glob_QA.html.

396. “Tony Rutkowski’s remarks at APPLe Workshop,” June 28, 1996, http://www.glocom.ac.jp/resa/APPLe/Tony_R.html.

there was no clear sense of how to proceed in order to develop that consensus. The various

positions represented in the room were far too fractious.

* * *

As IETF week drew to a close, yet another DNS-focused meeting began, the Asia

Pacific Policy and Legal Special Interest Group (APPLe). It was the brainchild of Laina

Raveendran Greene, a Singaporean who had met Rutkowski while working at the ITU.

Instrumental in creating APNIC and its public interest counterpart, the Asia Pacific

Networking Group (APNG), Greene was also an early participant in the ILPF discussions,

but was put off by its US-centeredness. She helped initiate APPLe in January 1996, during

a BOF at the APNG called "Introduction to Legal and Regulatory Issues on the Internet.”

The June APPLe meeting was sponsored by CIX, APNIC and GLOCOM, a Japanese research institute. It provided yet another opportunity for Heath, Conrad, Rutkowski, Maher,

Shaw and others to present their positions, publicly, but now in light of ISOC’s formal move

to assume authority over the DNS.

From the perspective of participants at the APPLe conference, ISOC had overreached

by asserting itself as the arbiter of DNS policy. Shaw and Rutkowski and others ganged up

on Heath, challenging the legitimacy of IANA and ISOC to set DNS policy. Rutkowski, who

was now employed by a Silicon Valley startup called General Magic, took things further. He

stressed the need to “lift the veil” that would reveal IANA to be “still an entity of the US

Department of Defense.” Persistence of US government control, he insisted, was

“untenable.”395 On the other hand, his own policy group, the ILPF, represented a “new young breed of Internet attorney... throughout the world” coming together, he wrote, “in a common dialogue” within an “industry driven forum.”396

Many APPLe participants wondered out loud why ISOC, with only six thousand

members, was better suited than many larger industry groups to assume the mantle of policy

leadership for the Internet. In response, Heath invoked the Internet Society’s “altruistic”


397. Ibid.

desire to perpetuate “the principles that built the underlying characteristics of the Internet.”

Namely, keeping it, “unencumbered by regulatory agencies constraining its growth or

constraining its use.” The best he could do to justify ISOC’s legitimacy as a broad-based

organization was to invite anyone who shared those goals to join.397

* * *

Prior to June, Postel had not been a voluble newdom participant. His mid-June

response to Denninger was unusual, but it was also a singular moment. Never before had

Postel been attacked so blatantly. His normal patterns were to post only twice a month or so,

and those tended to be brief citations of text from RFCs, unequivocally germane

announcements, or references provided in some other form. Such posts were delivered flat,

without comment. Occasionally, responding directly and succinctly to the text of someone’s

draft, he might ask a question. He certainly took the first and more important half of his own

aphorism to heart: “Be conservative in what you send, and liberal in what you receive.” He

rarely broke form, except when he occasionally sent ISOC promotional materials to the IETF

list.

Therefore, it was an exceptional event for Postel to initiate a thread on a DNS policy

list. On June 30, 1996, just after the APPLe conference closed, he started two. Both reflected

a man in a satirical and grumpy mood.

In one he sought to answer the question, “Close Out COM?” This was an idea that exerted an enduring appeal among a small number of existing country code operators and prospective TLD operators. Clearly envious of the demand for .com suffixes, they were

looking for ways to generate more demand for their own products. Their rationale for closing

.com often played up jurisdictional concerns or the virtues of forcing a clean break that would

ensure transition to a new modus vivendi. Postel called the notion “a non starter.”

Imagine, just close the COM (and NET and ORG) domain to new registrations and tell all those making registration requests "were [sic] sorry, COM was a mistake, those 200,000 companies already registered got those names by accident".


398. Jon Postel, “Close out COM?” Newdom-iiia June 30, 1996.

399. Ibid.

I think that a few companies might take the view that if their competitor got a COM name and they can't, that something is unfair. The [sic] might even ask a court to review the matter..398

Postel’s other thread – “ISOC vs.” – was posed as a rhetorical question. Its purpose,

he wrote, was to answer those who had expressed “some concern” about whether ISOC was

the “appropriate” organization to administer the funds that would be collected and distributed in accord

with his registry management proposal. He insisted there was no better alternative. ISOC was

a membership organization able to provide “representation.” If the US Government were put

in charge by placing TLD administration under the FCC or the National Institute of Standards and Technology (NIST), new zones would be licensed like radio stations. “How much do you think

you would have to spend to get into the TLD registry business in that case?”

There were prospects even worse than US Government intervention, he warned.

“[M]aybe because the Internet is international we could be so lucky as to have the ITU take

us under their wing.” Consider the traditional criticisms of the ITU – its inattention “to what

individuals think,” its subordination to the European state telephone monopolies, and its

slowness at making decisions. “These are the folks that brought us the wildly successful OSI

protocols,” he sneered. For Postel, the facts spoke plainly enough. “So when you wish for

an alternative, think about the likely choices.”399

The message settled nothing. Denninger accused Postel of “FUD mongering” for

playing on the technical community’s fears, uncertainties and doubts about the ITU’s

slowness, even as the current process meandered. Rutkowski responded with a copious

analysis of organizations in the global telecommunications system, including multiple

pointers to further analysis published at his own WIA web site. He then took another shot

at Postel.

[I]t isn't helpful that the present US government contractor's staff person responsible for the domain name/numbering work has chosen to suggest that one organization among many of the choices and alternatives - i.e., the Internet Society - on which the staff person also sits as a Board member with many colleagues over many years, should somehow unilaterally and permanently assume the global addressing responsibilities for the entire Internet/WWW/whatever universe, and garner a huge revenue stream and unfettered power in the process.400

400. Tony Rutkowski, “ISOC vs.” Newdom-iiia June 30, 1996.

401. Karl Denninger, “ISOC vs.” Newdom-iiia June 30, 1996.

Volleys of attacks and counterattacks ensued. Denninger complained that Postel had “sandbagged” the issue.401 Over the course of several months Denninger had expressed rising

frustration with the IETF’s draft-writing process and its cliquish culture. Now he was in a

catastrophic rage, venting without restraint. He claimed that the proposals he and several others had submitted were simply being ignored as part of a strategy to shut them out intentionally. Even

though documents he and Simon Higgs submitted had been republished as full-fledged

Internet Drafts, the process felt like a “black hole” which allowed them no say over the

creation of the final, all-important RFC. The business climate mandated the need to move

quickly, but the insiders had decided to go slow, Denninger accused, in keeping with their

“‘merge, take and plunder’ mentality.”

Denninger had convinced himself that participating in the construction of an alternate

root was his only remaining option. “I have every right to lead a net.revolution RIGHT NOW

on this matter,” he wrote. “IT IS IN THE BEST INTEREST OF MY CUSTOMER BASE

TO DO THIS, AND THEY ARE THE BOSS IN THIS MATTER. OTHER ISPS ARE IN

THE SAME POSITION.”

Once again, Postel’s allies jumped to his defense. Metzger insisted that Denninger’s

technical approach was utterly unworkable. How could a proliferating number of zones be

guaranteed universal visibility without resort to a single authoritative Network Information

Center? Denninger had an answer. Periodically, ISPs would have to download zone files

from both the InterNIC and the various competing NICs. The ISPs would then amalgamate

those zone files and provide name-to-number lookup services directly, bypassing the lookup

services offered by the DNS. In effect, there would be no roots at all. Publishers of zone files

would have to work as a group.
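The amalgamation scheme Denninger described can be sketched in a few lines. The following is a hypothetical illustration, not anything Denninger published; the record format and file contents are invented for the example.

```python
# Sketch of the zone-amalgamation scheme described above: each ISP merges
# the top-level zone data published by the InterNIC and by competing NICs
# into a single local lookup table, so that no single root is authoritative.

def parse_zone(text):
    """Parse 'name address' lines into a name -> address dictionary."""
    records = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(";"):  # skip blanks and comments
            continue
        name, address = line.split()
        records[name.lower()] = address
    return records

def amalgamate(zones):
    """Merge zone tables in order; earlier publishers win, collisions are flagged."""
    merged = {}
    collisions = set()
    for zone in zones:
        for name, address in zone.items():
            if name in merged and merged[name] != address:
                collisions.add(name)  # two NICs claim the same name
            else:
                merged.setdefault(name, address)
    return merged, collisions

internic = parse_zone("example.com 192.0.2.1\nexample.net 192.0.2.2")
alt_nic = parse_zone("example.biz 203.0.113.7\nexample.com 203.0.113.9")
table, conflicts = amalgamate([internic, alt_nic])
print(table["example.biz"])  # -> 203.0.113.7
print(sorted(conflicts))     # -> ['example.com']
```

The collision set makes Metzger's objection concrete: with no single authority, every ISP must decide conflicting claims for itself, recreating the coordination burden of the old HOSTS.TXT file.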


402. Perry Metzger, “Re: ISOC vs.” Newdom-iiia June 30, 1996.

403. Perry Metzger, “Re: ISOC vs.” Newdom-iiia July 1, 1996.

404. Christopher Ambler, “Next IANA meeting & minutes?,” Newdom-iiia, July 1, 1996.

Metzger dismissed the idea. “[W]e didn't abandon HOSTS.TXT only to end up with something at least as bad to manage.”402 Not only that, he argued, the growing dominance of Microsoft in the marketplace meant the world was moving toward “drool and play” modes

of operation. Just as BIND was pre-configured to use the legacy root, new generations of

servers and desktop systems would be as well. Few users would even bother to investigate

their settings, let alone change them.403

Obstinate and undeterred, Denninger said he was prepared to launch .biz regardless

of formal approval. Having concluded that IANA’s official sanction was unattainable,

Denninger recast it as meaningless. But, despite this shocking apostasy, the promise of

IANA’s blessing still carried great weight in the Internet community. In fact, other aspiring

operators now sought it more lustfully than ever, imagining it to be an anointing that could

deliver wealth of heavenly proportions.

i. Blessed Envelopes – July 1996

The intense partisanship of the newdom discussions presented a strategic dilemma

to Christopher Ambler, a relatively new member of the list. The ship-jumping threats of the

alt-rooters were especially problematic. At the beginning of July Ambler had announced that

his own TLD was “ready to go now,” hinting that he might bet on the alt-root community.

Yet he hedged, saying he preferred “to be a part of the solution” even if it meant waiting for

the official process to take its course. Ambler was temperamentally impatient. One of his

earliest postings to the list demanded, “What’s the bottom line?” He never hid his

motivation. He didn’t want to antagonize the powers-that-be, but he didn’t want to be a

chump either. The alt-rooters seemed to have momentum. For Ambler, the bottom line was

to be a “winner” in the coming “land grab.” If push came to shove there would be no

ambivalence. “I’d hate to lose out because I wanted to play fair.”404


405. Ibid.

To newcomers like Ambler, Denninger’s strategy offered a compelling logic: Stake

a claim now and fight it out later. Put the TLD you want into service immediately, whether

visible across the Internet or not. Down the line there might be lawsuits, but the legal costs

of defending that claim should be considered a normal cost of doing business. If litigation

did occur, being able to demonstrate an application of prior art and, more importantly, first

use in commerce, should help to establish superior standing before the court.

Ambler was willing to give the formal process a little more time, presuming he could

guarantee an inside track for himself. “I need to attend the next IANA meeting and just get

more involved in the discussion and policy-setting directly in whatever capacity is allowed,”

he wrote, naïvely unaware of IANA’s ambiguous standing.405

* * *

Ambler wasn’t the only one looking for a face-to-face meeting as a way to accelerate

things. Some wanted to create a working group under the aegis of the IETF. Rick Wesson,

an ISP operator in Sunnyvale, California doing business as “Alice’s Restaurant,” drafted a

charter for an Integrated Network Information Centers (iNIC) Working Group within the

IETF’s Operations Area. The proposed group would promote an “open market in iTLDs” and

“support diversity in iTLDs through the creation of private NICs.” The concept of

“integrated” NICs was clearly aimed at granting new registries the status of NIC, and

therefore some equivalence to NSI and the InterNIC. Newdom’s host, Matthew Marnell, was

particularly smitten with Wesson’s initiative, and jumped in with friendly revisions. Wesson

and Marnell reflected a contingent of newdom members who clearly aspired to operate their

own TLDs, but who, for the time being, had ruled out the confrontational go-it-alone

approach typified by Denninger and Kashpureff.

Wesson’s draft proposed Bill Manning as chair. Metzger gave him a resounding

endorsement. Manning seemed as good a choice as any. Technically adept, and close to

Postel, in person Manning was by far one of the most pleasant of the bunch. His online comments tended to have a relatively high signal-to-noise ratio. Manning was actively


406. Bill Manning, “Re: Biz Domains,” Newdom-iiia June 30, 1996.

407. Jon Postel, “Re: Working Group ???” Newdom-iiia July 1, 1996.

involved in managing the root server based at ISI and worked closely with other operators

in the root constellation to coordinate performance upgrades. He led the work that ultimately

facilitated expansion of the constellation into new sites in Europe and Asia. He was also

administrator of the .int domain, dedicated to international treaty organizations.

Manning’s public approach often lamented “US centric” solutions.406 Like Rutkowski, he spoke out in favor of involving global (read: non-US) business

communities. But Manning moved comfortably within the IETF’s inner circles, and still

believed that groups like the IAB and ISOC could open the way to a global solution.

Rutkowski, of course, sought to undermine those groups, avidly soliciting participation in

his Internet Law and Policy Forum, though gaining little traction in the anti-Postel camp.

Leading IETF members weighed in on both sides of the working group question.

Some thought the IETF’s processes might help structure the debate and move things forward.

Others thought that sponsoring a working group was a bad idea, and the trouble should

remain someone else’s problem.

Postel was against it, too, but for different reasons. “I am pretty sure that having a working group would add at least a year to the process.”407 Thus far, of course, there had been

no shortage of face-to-face meetings about new TLDs. Postel had just gotten his fill of it at

the ISOC/IETF/IAB/APPLe marathon in Montreal and was in no mood for more. Rather than

create a process that might foster even more bickering among more people over more time,

he wanted to delegate the issue to a group that would be in a position to make something

happen. Unlike a working group, the ad hoc committee he had been asking for might be

small enough and reputable enough to pull it off. But to move forward (and finally get DNS reform off his back), he first had to steer the community’s thinking toward a narrow and defined

agenda... namely, the introduction of new TLDs. He made the case to newdom on several

occasions, reiterating it again on July 3rd.


408. Jon Postel, “priorities,” Newdom-iiia July 3, 1996.

What are the priorities here ? My list is:

1. Introduce competition in the domain name registry business.

2. Everything else.

So lets [sic] focus on how to accomplish the top priority.

General observation: Changing things is hard, introducing separate new things is easier.

The proposal in my draft is to introduce new registries with new iTLDs, and leave every thing that currently exists alone. (We may want to make changes to existing things later (or not), but we don't even have to talk about those possible changes now.)

So issues for discussion later (like in 6 months from now) might include:

a. sharing a TLD among several registries

This is a very interesting idea and i'd like to see it made workable, but i don't think it is essential to get some simple competition off the ground. We can add it later. I agree that we have the technology to do this. I don't understand the business model.

b. transitioning [sic] out of the COM domain and eventually closing it.

This may be difficult in pratice, [sic] and after some competition is in place, may be less interesting.

c. solving the internal contradictions in the world's trademark registration procedures

This is fundamentally impossible.

--jon.408

Not surprisingly, the responses were unfocused and unproductive. Rutkowski

answered first, questioning the utility of new TLDs at all. Ambler insisted on a timetable.

Higgs, Manning and a few others started to rehash their discussions of trademarks and

business categories. But it didn’t take long for the thread to spin off into a fight over the

legitimacy of IANA’s authority, the virtues of the IETF’s RFC process, the viability of

alternate roots, and the perceived lack of character among various participants in the

discussion.


Back in meatspace, DNS had made the news once again. Besides the predictable

trademark disputes and the associated legal battles, careless mistakes at NSI were making

headlines. While purging about 9,000 unpaid names, an operator at NSI had inadvertently

deleted some paid ones as well, such as msnbc.com. That domain was registered under a joint

venture between two corporate giants... Microsoft and the National Broadcasting Company.

The implication of the coverage was clear. If those companies weren’t safe from NSI’s

errors, who was?

* * *

During July’s acrimonious newdom exchanges, a few habitual hotheads attacked each

other, slinging epithets from “idiot” to “net.kook.” Metzger accused Denninger and

Kashpureff of preparing to “go to war.” Several quit the list in disgust. Some called for

civility and tried to set an example of how to continue the discourse in a professional tone.

It was to no avail.

A few members were shamelessly inconsiderate, in a class by themselves. Jim

Fleming, an Illinois-based developer, had been posting frequently since the middle of March,

often several times a day. His posts occasionally included perceptive questions and

comments, but the lion’s share were tangential, highly speculative, intentionally provocative,

and ultimately alienating. Fleming regularly touted his own imaginary upgrade to the IP

protocol (IPv8 StarGate), and lectured the list on history, politics, and business strategies. He

also became an infamous nuisance on important Internet management and operations lists

such as ARIN and NANOG. Fleming never seemed to miss an opportunity to annoy. He

harassed Postel for any hint of a shortcoming, from the alleged assignment of a Class A

address to @Home to the presence of students on IANA’s staff.

The first online reference to the “DNS War” as such appeared on July 21, 1996. It

punctuated a relatively amicable discussion between Stephen R. Harris and Richard Sexton.

In a message thread titled “The Name Game” they compared the addition of TLDs in the

DNS to the addition of newsgroups in the Usenet, an older, once-cherished newsgroup


409. For a highly positive description, see Michael Hauben and Ronda Hauben (1997), Netizens: On the History and Impact of the Net, http://www.columbia.edu/~hauben/netbook/.

410. The meanings of the hierarchies were as follows: net, unmoderated discussion; mod, moderated discussion; fa, from ARPANET; soc, society; talk, talk; alt, alternative.

411. As Sexton tells it, he was showing a colleague how to perform the basic commands to send a message that would create a new newsgroup. In his example he requested the creation of rec.fucking. The example was supposed to be contained by designating “local” distribution, but he mistyped that part of the message as “lical.” The rest is history. Richard Sexton, “The Origin of Alt.sex,” http://www.vrx.net/richard/alt.sex.html. See also his “Internet DNS Governance Model - Draft,” Open-RSC February 18, 1998.

distribution service which had fallen into a maddening controversy.409 Though still large and active in the mid-90s, the Usenet was eventually overshadowed by the World Wide Web.

There were interesting similarities between the Usenet and the DNS, most notably a

hierarchical naming structure that could be wholly or partly visible depending on the choices

made by those who connected to it. For the most part, those choices were made for the

Usenet community by a group that came to be known first as the Backbone Cabal, and later

as the Usenet Cabal. Having started out with a fairly limited top level hierarchy – net., mod.

and fa. – the Usenet experienced several contentious episodes as new ones like soc., talk.,

and alt. were added. (Unlike the DNS, Usenet hierarchies read from left to right.)410
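The ordering difference can be made concrete with a trivial sketch (a hypothetical illustration, not drawn from the sources): reversing a DNS name's labels yields the same general-to-specific reading order that a Usenet group name already has.

```python
# Hypothetical illustration of the ordering difference noted above:
# a DNS name puts its most general label (the TLD) on the right,
# while a Usenet group name puts its top-level hierarchy on the left.

def dns_hierarchy(name):
    """DNS labels reordered from most general to most specific."""
    return list(reversed(name.split(".")))

def usenet_hierarchy(group):
    """Usenet components already read general-to-specific, left to right."""
    return group.split(".")

print(dns_hierarchy("www.example.com"))  # -> ['com', 'example', 'www']
print(usenet_hierarchy("alt.aquaria"))   # -> ['alt', 'aquaria']
```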

Sexton, a web developer and ISP operator living outside Toronto, had been involved

in networking for some time, and was particularly active on Usenet, creating alt.aquaria in

1987. In 1988, while goofing off at the financial services firm where he worked as a

computer technician, he inadvertently inspired a series of events that led to the creation of the

alt.sex newsgroup hierarchy.411 It didn’t take long before the Usenet’s collection of alt.sex

subhierarchies exploded. Around the same time, John Gilmore initiated alt.drugs. Not

surprisingly, alt.rock-n-roll appeared soon after.

The unintended consequences of Sexton’s and Gilmore’s actions had undermined the

power of the Usenet Cabal. Control of the newsgroup system dispersed across the Usenet

community. The whole enterprise became associated with a strong flavor of either sleaze or

full-throated liberty, depending on one’s aversions or preferences.

This had implications for the ongoing DNS debates. An Internet naming cabal had

been broken once before. Why not do it all over again? This time it could be done


412. Stephen Harris, “The Name Game,” Newdom-iiia July 21, 1996.

413. Bill Manning, “Re: with whom to set it up with,” Newdom-iiia July 3, 1996.

intentionally. Given the ubiquitous popularity of pornography, Sexton believed history was

about to repeat itself. Someone would soon disrupt control of the DNS by creating a .xxx or

.sex TLD.

Insiders knew that the ongoing centralized form of DNS management was based more

on convention than technical necessity. The general public had already become so dependent

on the DNS that it was considered indispensable. But in fact, it was possible to consider

alternatives that carried a different set of social and technical benefits. For Sexton, the

attraction of bypassing the legacy institutions run by the engineering community’s “old

guard” was the promise of a freer Internet and the opening of new business opportunities.

Nothing about the legacy system was concrete. By his logic, if alternate roots were feasible,

they were desirable.
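The technical fact behind that logic is worth making explicit: name resolution is a walk down a tree of delegations, starting from whichever root the resolver is configured to use. A toy sketch in Python (all zone data and addresses here are invented) shows why an alternate root is a configuration choice rather than a protocol change.

```python
# Toy model of DNS resolution: a walk down a tree of delegations.
# Nothing in the algorithm privileges any particular root -- which root
# you start from is configuration, not protocol. All data is invented.

LEGACY_ROOT = {"com": {"example": "192.0.2.10"}}
ALTERNATE_ROOT = {
    "com": {"example": "192.0.2.10"},  # mirrors the legacy zones...
    "web": {"start": "203.0.113.5"},   # ...and adds a new TLD
}

def resolve(name, root):
    """Follow labels right to left from the configured root; None if absent."""
    node = root
    for label in reversed(name.split(".")):
        if not isinstance(node, dict) or label not in node:
            return None
        node = node[label]
    return node if isinstance(node, str) else None

print(resolve("example.com", LEGACY_ROOT))    # '192.0.2.10'
print(resolve("start.web", LEGACY_ROOT))      # None -- .web absent here
print(resolve("start.web", ALTERNATE_ROOT))   # '203.0.113.5'
```

The same resolver code answers differently depending only on which root it is handed, which was exactly the alt-rooters’ point.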

But Harris, well aware of the volatile history of Usenet management, responded,

“DNS wars? Agh! No thanks!”412

* * *

The newdom list was now well over nine months old and no new TLDs had been

born. Waiting in an unmoving queue is hard enough. What if it’s the wrong queue? Several

aspiring TLD operators started to get anxious about whether they had submitted their requests

to the right place. Someone suggested that, to cover all the bases, it would be wise to forward

a request template to three different addresses: 1) [email protected]; 2)

[email protected], and 3) [email protected]. Manning responded, “the last two are

wrong.” He offered an alternative set: “[email protected], [email protected], myself.”413

With that, he had unambiguously inserted himself into the chain of command for TLD

creation. As an ISI employee with operational responsibility in the root system, and as the

man with the inside track to become chair of a prospective working group on TLD creation

policy, and with no objection from Postel, Manning’s words carried considerable weight. But


414 “Re: Usurping the current root nameservers,” Newdom-iiia. The thread ran July 16-19, 1996.

he had overreached. It wasn’t long before he had to qualify his status, and make it clear his

words were not the authoritative voice of IANA.

Given that the main thrust of Manning’s amended notification list was to exclude

AlterNIC, Kashpureff took it as a direct snub, and a surprising one at that, after having had

such a nice lunch with him. Newdom immediately, and once again, fell into rancor. Sexton,

in a thread titled “Usurping the Current Nameservers,” threatened an ultimatum, insisting that

“decision time” was at hand. Manning’s response was uncharacteristically satirical:

Fragmentation of the Internet... brought to you by quickbuck, flyby-night ISPs.

ask yourself... 30 years of reliable service or... 6 months by folks who think they know it all.414

Fleming shot back. “Is this an official position of the IANA...or the State of

California...?” It was a smart, tricky question... rhetorical sounding, but aimed right at the

nub of the problem. What was the source of Manning’s authority?

Manning responded only minutes later, “Jim, you are not reading your mail carefully.

IANA postion [sic] is stated by Jon Postel. This is Bill Manning. This is not the State of

California or the United States of America or King of the World.”

That hasty retort would turn out to have legal implications. Fleming wanted

clarification. Hadn’t Postel said Manning was a member of IANA?

This time, Manning responded more politely. “Part of my time is spent working with

the IANA, yes. I, Bill Manning, do not make any announcements as or on behalf of the IANA

without prior permission and then I'll indicate that the statement is an IANA position and

made with the approval of the IANA.”

For Manning, dealing with newdom’s restless wannabe TLD entrepreneurs was

proving to be a surprisingly new kind of leadership experience. The technically-focused

groups he led in the IEPG and the root server community had been successful, because, like

the IETF, IEPG members were deeply immersed in a consensus culture. People could behave


more or less collegially because they usually had an immediate interest in productive

collaboration. The folks on newdom, however, were all potential competitors. In consensus

settings, members generally trusted that the leader wouldn’t exploit his or her gatekeeping

powers to impose faits accomplis. But in newdom, with an unusually high percentage of

members who seemed temperamentally predisposed to paranoia, trust was in short supply.

And whatever trust remained was imperiled by charges that the Internet old guard was

scheming to shut out the newcomers and keep the profits of TLD expansion to itself.

Thus, every statement of Manning’s and Postel’s was inspected for hidden meanings and

ulterior motives.

That penchant for reading tea leaves could also play out in reverse. There had been

a fleeting moment in June when Kashpureff thought he had found favor in the eyes of Postel

and Manning. Now, at the end of July, Christopher Ambler and Simon Higgs began to

believe that the momentum had swung their way.

Ambler was so excited by the prospect of making a fortune he could hardly contain

himself. He was utterly confident about his technological solutions. Securing investors

looked like it would be easy. Best of all, he had thought of a great name for his new TLD...

.web. That suffix promised a category-spanning “coolness” factor that few other prospective

TLDs could match. Instantly marketable, .web was perfectly positioned to challenge .com’s

share of new registrations.

To be sure, “dot-com” was a lovely phoneme, but the term hadn’t yet been identified

with the Zeitgeist of the era. In 1996, to speak of the “web” was much more fashionable. In

those days conventional wisdom was already moving toward the notion that business on the

Internet was all about branding. If true, service and price would be secondary concerns, at

least at first. For the long run, Ambler had some ideas about how to compete in those areas

as well, but his driving imperative was to move forward as soon as possible. He needed to

stake his claim to have it recognized. If any new TLDs ever came to fruition, it was logical

that .web would have to be one of them. The name was that good.

Ambler wasn’t the first claimant to the .web TLD to submit a template to IANA. He

wasn’t even the second. As third in line, he couldn’t just let IANA’s process play out on its


415 AlterNIC allowed three TLDs per applicant. His other two requests were .www and .auto. Christopher Ambler, “TLD Registration with AlterNIC,” Newdom-iiia, July 16, 1996.
416 Christopher Ambler, “Re: Common Registry Procedures,” Newdom-iiia, July 23, 1996.

own. The name might go to someone in front. Under those circumstances, Denninger’s

advice about establishing proof of prior use looked very sensible. One might call it the “fire

up the servers and line up the lawyers” principle. Ambler filled out AlterNIC’s TLD

registration template and sent it to Kashpureff.415 At least he could be first in someone’s

queue. But he knew his chances for success would be far better if he could find a way to push

to the front of IANA’s line as well. To do that, he was ready to buy his way in. Ambler knew

he would have to find someone to open the door and take his money.

Since joining the list, Ambler had frequently announced his readiness to go live with

.web. By mid July the announcements were coupled with a plea to the effect of, “Where do

I send my filing fee?” But Postel and Manning weren’t taking any money. Given the

pressing need to bolster his legal standing and move his claim to .web to the head of IANA’s

line, he had to find a way to get someone there to accept his check. On July 23rd, during a

Fleming-initiated thread about developing common registry procedures, Ambler made a hard

kick at the door.

Each and every day I get less confident that this is ever going to be resolved. Jon? Bill? Is there *anything* being done? I don't see a WG ever being established here in newdom because of the constant off-topic arguing.

How about setting up a real meeting someplace and inviting all interested to attend, lock everyone in a room, and have thai food sent in until something is approved. I'll buy my plane ticket today and pop for the first 24 hours of food.

Come on, people. We've got a company sitting on its thumbs waiting for this to happen, an application and check *literally* sitting on my desk waiting for a procedure to send it in. We're *ready* now.416

Not content to sit on their thumbs either, Denninger and Kashpureff decided to kick

off what Kashpureff called “AlterNIC’s 1996 World Tour.” It would start in Chicago on
August 1st in the facilities of Denninger’s ISP, MCSNet. Kashpureff would travel the

country, installing the servers that would enhance responsiveness and redundancy in the

AlterNIC root constellation. The configuration work would take some time, but each stop


417 AlterNIC’s name servers were stationed in Detroit, Chicago, Seattle, and Bremerton, Washington. George Lawton, “New top-level domains promise descriptive names,” SunWorld Online, September 1996, http://sunsite.nstu.nsk.su/sunworldonline/swol-09-1996/swol-09-domain.html.
418 Bill Manning, “Coffee, Tea or Thai?,” Newdom-iiia, July 25, 1996.

on the tour could also provide the occasion for some kind of meeting – part local, part
conference call – where alt-rooters could discuss policy under the banner eDNS (enhanced
DNS).417 There was open speculation that adding a .xxx TLD to a fully-fledged

AlterNIC/eDNS root would be the final step that would put the alt-rooters on par with the

legacy InterNIC zones.

Now, under even greater pressure to find some way of advancing the official process,

Manning proposed a meeting at IANA’s offices in Marina del Rey. It was also scheduled for

August 1st, ostensibly as an interim meeting of the planned working group.418 Perhaps it

would be possible to bite off a small piece of the problem and try to develop registry

evaluation procedures. Ambler was after bigger game. This was just the opportunity he was

looking for.

The beginning of August was shaping up as a watershed for another reason. Newdom

was moving to a new host. Marnell had operated newdom as an unmoderated list, in which

anyone could post anything. But he had become utterly frustrated with the list’s acrimony and

lack of focus. In late July, seeking to impose discipline, he made private attempts to contact

those he considered to be the greatest offenders. Denninger was chief among them. Rather

than tone down, however, Denninger escalated. Marnell, in turn, warned that he might

terminate Denninger’s posting privileges altogether. Denninger then threatened to sue.

Marnell simply folded. “Bye bye list,” he wrote, announcing he would close it by July 30th.

There was only a brief moment for expressions of sympathy for Marnell, gratitude for his

efforts, and introspection about the atrocious behavior that seemed so endemic on the list.

Another fight broke out almost immediately over who would take over as host. Denninger,

Higgs, Sexton, and Wesson all announced new lists. Wesson’s ultimately captured the

momentum.


419 See Paul Mockapetris and Bill Manning’s response, “Re: One more time (and the history behind,” Newdom-iiia, July 30, 1996.
420 Bill Manning, “naughty pine,” Newdom-iiia, July 29, 1996.
421 Simon Higgs, “Re: name.space,” domain-policy, January 28, 1998.

* * *

Marnell’s newdom remained quite busy till the end, no doubt because such important

events lay ahead. On July 30, the day before the Marina Del Rey meeting, Manning

participated in a goal-defining exchange with Paul Mockapetris. Mockapetris considered

anything that aimed at establishing “consistent policies” to be “a bug, not a feature.” He

believed that a consolidated policy would “artificially restrict the growth of the Internet,”

while diversity would “embrace the largest set of endpoints.” Mockapetris then listed a set

of specific goals, all intended to accelerate the pace of domain name registration. The last

was, “Begin the process by allocating TLDs to 5-10 independent organizations to manage

as they choose. Let users vote with their feet.”419 Manning agreed with all of it, particularly

that final goal. “Some of us have the idea that the focus is on the process to do [that one].”

Manning submitted two documents just under the wire of newdom’s closing. One was

a list of “some items for consideration as selection criteria for iTLD creation.”420 To

Ambler’s great interest, the first item was prior use. The others were domain integrity, audit

compliance, and published registration policies. Manning’s other last minute submission

was a memo that relied on Postel’s last draft to describe how an IANA administrator might

calculate and collect a new iTLD’s 2 percent annual registration fees. The title revealed

some trepidation... “Do I really want to do this?”
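The levy mechanics themselves were simple arithmetic. A hedged sketch: the 2 percent rate is taken from the draft as described above, but the registration counts and prices below are invented purely for illustration.

```python
# Hypothetical illustration of a 2% annual levy on a registry's
# registration revenue. The 2% rate is from Postel's draft as described
# in the text; all counts and prices here are invented.

LEVY_RATE = 0.02

def annual_levy(registrations, fee_per_name):
    """Return the annual levy on a registry's registration revenue."""
    revenue = registrations * fee_per_name
    return LEVY_RATE * revenue

# A registry with 50,000 names at an assumed $50/year each:
print(f"${annual_levy(50_000, 50.00):,.2f}")  # $50,000.00
```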

The Marina Del Rey meeting turned out disastrously. Its everlasting legacy is a hotly

disputed tale of Ambler’s attempt to acquire formal authorization for .web. Only six people

were present: Manning, Higgs, Ambler, Jonathon Frangie (Ambler’s attorney), plus Dan

Busarow and Michael Gersten, aspiring TLD operators. Higgs later insisted that “explicit

permission” was given to go ahead and start up registration services,421 but Bill Manning’s


422 The notes were sent privately and then reprinted on various mailing lists. Some reprints may be found in current archives, some only in cached archives. See Gordon Cook, “the .web lawsuit against IAHC - let the record speak,” February 28, 1997, http://www.dotau.org/archive/1997-03/0000.html.
423 Simon Higgs, “Re: name.space,” domain-policy, January 29, 1998.

notes on the meeting do not bear this out.422 A more sensational outcome was the question

of Christopher Ambler’s “smoking check.” Had IANA accepted Ambler’s $1,000 application

fee, virtually ensuring his priority claim to .web? As Higgs later recounted:

Chris says "Bill, can I give you a check for the application fee?"

Bill says "no".

Chris then says "Bill, can I attach an evelope [sic] to my application?"

Bill says "yes".

Chris says "do you have an envelope?"

Bill says "hang on a tick" and walks into Jon's office and comes back with an envelope and gives it to Chris.

Chris stuffs the check inside the envelope. Bill pretends he is Stevie Wonder. Chris hands envelope to Bill. Bill adds envelope to the mountain of paperwork he has with him.

We finish off the meeting, and I finished off the sweet & sour pork ...

Within several hours, someone leaks this story to the newdom mailing list. IANA check the envelope and say "Bless my soul" and promptly return the check to Chris to prove to the world they weren't taking any advanced application fees.

I think everyone was guilty. Especially the special fried rice.423

It is polite but mistaken to blame an MSG-fueled giddiness for the ensuing confusion.

The real problem was a mix of Ambler’s predatory opportunism, Manning’s overly

accommodating demeanor, and Postel’s inattention. The Marina Del Rey event was

originally supposed to be about fleshing out details of the long-anticipated RFC, not picking

a winner in the race to become the Internet’s first major commercial competitor to Network

Solutions. In any case, twelve hours after the meeting was over, .web was up and running.


Word got out right away, suggesting that some kind of sweetheart deal had occurred, and that

Denninger’s warnings had been justified.

Postel stepped in immediately. The following day, August 2nd, he issued a statement

insisting that no commercial TLD registrations were being accepted. The envelope and the

check were returned to Ambler. But the damage was done. Things looked sloppy. Postel’s

once unassailable reputation as IANA and as steward of the Internet’s core resources was

truly damaged. Past problems such as ambiguous IP address allocations hadn’t reached

public consciousness because they were technically complex and had occurred in relative

obscurity. But now the Internet was booming, and there was swelling interest in TLD

allocation and the surrounding controversy. Money changing hands in exchange for a favor

was a primitive, easily understood idea.

This event became one of the “bloody shirts” of the DNS War. Afterward, whenever

it seemed something important was about to be decided or discussed – perhaps because

Congressional hearings were announced, or because the Administration was preparing to

release a new recommendation, or because a significant conference was about to take place

– the .web episode would be replayed and embellished. The ‘How did we get into this

mess?’ debates came to include the feud about whether IANA sold .web to Ambler that day, and

whether it had the authority to do so.

Still, the DNS debates remained as mired as ever in traditional arguments. Did the

Internet’s name and number resources rightfully belong to some US Government agency or

to the Internet community? If the former, which government agency was it? If the latter,
who represented that community? Nothing had been simplified. Nothing had been advanced.

And in the meantime, Network Solutions alone continued to reap the benefits of the .com

windfall.


424 See graphic in David L. Margulius, “Trouble on the Net,” Infoworld, November 24, 2003, 42.
425 According to Kashpureff, AlterNIC’s penetration reached three percent during the summer of 1996. See Diamond (1998). Critics claimed AlterNIC never surpassed 0.5 percent. See Dave Crocker, “Re: Just do it (or ‘Death of the Net? Film at 11...’),” domain-policy, April 20, 1998.
426 Jon Postel, “New Registries and Delegation of International Top Level Domains,” August 1996. It can be found in various archives under “draft-postel-iana-itld-admin-02.txt.”


[T]echnological innovations inherited from the past... cannot legitimately be presumed to constitute socially optimal solutions....

Paul David

On the information superhighway, national borders aren’t even speed bumps.

Timothy C. May

7. THE TECHNICAL CONSTRUCTION OF GLOBALISM

a. A Chorus of Critics – September 1996

The attacks on IANA had a lasting effect, nicking the armor of Postel’s once-stellar

reputation, and tainting the image of the Internet’s old guard. But those setbacks didn’t boost

the fortunes of alt-rooters like Kashpureff and Denninger. The big retail ISPs were content

to stay with the legacy DNS constellation, keeping alt-roots on the periphery. NSI reported

a ten-fold increase in average daily DNS queries over one year, from about 100 million hits

per day in autumn 1995 to over 1 billion in autumn 1996.424

At best, only three percent of the Internet’s users ever tried AlterNIC for name

resolution.425 Under those circumstances, putting out money for a domain name registration

in one of its TLDs amounted to either a bad bet or a wistful token of encouragement.

Entrepreneurial TLDs never gained enough market share to even hint that a “tipping point”

might be within view. As enthusiasm for alternate roots subsided, NSI continued to enjoy the

ever-mounting windfall of .com. And the “formal” new TLD process sustained its inexorable,

if glacial, movement ahead.

In late August Postel published another draft on TLD registries.426 This time he

suggested launching thirty new ones right away, with a goal of one hundred fifty by the end


427 Kahin and Keller (1997).

of 1996, a far more aggressive schedule than his original suggestion of only five per year. He

still preferred to see any new TLD business operations modeled after NSI’s combined

registry/registrar functions. For parity, new operators would be limited to a maximum of

three TLDs each. To stimulate growth outside the United States, Postel recommended that

no more than half the new startups could be in the same country.

A significant portion of the draft concerned the creation of a “legal and financial

umbrella” for IANA, in explicit coordination with ISOC. There was also a discussion of

dispute resolution processes that included Postel’s first mention of the Internet DNS Names

Review Board (IDNB) since the publication of RFC 1591 two years earlier. The role of this

phantom board would now be “expanded” to handle a new class of technical disputes. As

before, nothing was said about how the IDNB would be constituted.

* * *

The August draft came shortly in advance of yet another grand meeting on Internet

governance... the “Conference on Coordination and Administration of the Internet,” funded

by the NSF, and hosted by the Harvard Science and Technology Program’s Information

Infrastructure Project. Scheduled for September 8-10, 1996, it promised to be the biggest

and most serious one yet. The chair, Brian Kahin, had authored the 1990 RFC that

documented the NSF’s plans for Internet commercialization. At this time he was working

simultaneously as Lecturer at Harvard’s Kennedy School while heading the Working Group

on Intellectual Property, Interoperability, and Standards of the U.S. State Department’s

Advisory Committee on International Communications and Information Policy. A Harvard-

trained attorney, an academician, and a quasi-officer of the U.S. government, Kahin had

established himself as an elite policy wonk who could be counted on to be highly attentive

to the concerns of trademark and copyright interests.

Unlike previous meetings and conferences motivated by DNS reform, papers

presented at this one would be published as an edited compilation. That raised the bar on the

quality of submissions.427 As usual, getting a handle on how the DNS worked and who


428 Earlier in 1997, CDA opponents had coordinated a “black ribbon” campaign, signaling their protest by resetting web pages to display black backgrounds.
429 ACLU v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996), paragraph 11.

controlled the root was a top concern. But this conference targeted broader issues, and at a

relatively high level. Topics included trademark legalities, NSI’s dispute resolution policy,

IP performance metrics, and ISP traffic settlements. Names and numbers arcana were to be

explored in rigorous detail, but this was not the place to negotiate who would get which new

TLDs.

The venue also allowed lots of room for discussion of “big picture” items, including

intriguing meta-questions such as whether the Internet had become an entity immune to the

dictates of national sovereignty. It was a timely discussion because a recent US District Court

ruling had struck down pieces of the Communications Decency Act. Most members of the

Internet community had long believed that the CDA presented an onerous threat to the free

speech rights of adults.428 Even if the Internet had not been proven impervious to

sovereignty, it appeared as if the community’s freewheeling culture had withstood an attack,

and had shown itself as a force that had to be accommodated. Not only was the outcome a

cause for revelry, the Court’s decision included heartening language about the nature of the

Internet itself.

No single entity – academic, corporate, governmental, or non-profit – administers the Internet. ... There is no centralized storage location, control point, or communications channel...429

Such wording bolstered the increasingly popular notion of the Internet as sui

generis... “chaotic,” and therefore ungovernable. There was no shortage of perspectives about

who should rule the Internet, and how, even among those who pronounced it ungovernable.

The NSF conference presented the latest opportunity to give them a hearing.

Mitchell Kapor, a leading EFF activist, and Sharon Eisner Gillett, an MIT Research

Associate and former BBN employee, co-authored a paper disputing the Internet’s “popular

portrayal as total anarchy.” Despite lacking a central information “gatekeeper” or a network

connectivity-mediating “king,” they wrote, the Internet was actually a decentralized


430 Sharon Eisner Gillett and Mitchell Kapor, “The Self-Governing Internet: Coordination by Design,” in Kahin and Keller (1997: 24).
431 Ibid., p. 32.

confederation founded on the principle that “networks will relate as peers.” New norms,

embodied in the Internet Protocol and the community’s “sacrosanct” regard for

“interoperability,” were taken to be profoundly transformative, fostering a culture of business

and commerce called “co-opetition.” They credited unique Internet characteristics for

impeding the emergence of “chiefs” while promoting a style of coordination called

“decentralized interoperation.”

According to Gillett and Kapor, “ninety nine percent” of the Internet followed these

new, decentralizing norms. The remaining “one percent” of the system accounted for routing,

protocol standards, and unique identifiers. That portion was “exceptional” in that it was

critical to the Internet’s function, and thus required management by “Authorities.” Namely,

Postel. But perhaps only for the time being. What worked for a small Internet in the past

might not work for a big Internet in the future. They considered the legitimacy of incumbent

overseers to be increasingly problematic. “IANA’s authority depends on how close Jon

Postel comes to performing [his] role with the wisdom of King Solomon.”430

IANA derived its authority from the Internet community’s shared history and trust in Jon Postel. It would be naive to expect such authority to remain unchallenged as the community grows to encompass large numbers of stakeholders who share neither of those.431

The tone of the Gillett and Kapor analysis was austere, but the thrust was hopeful,

even triumphalist. “It would not surprise us,” they wrote, if those last few of the Internet’s

centralized functions “ended up taking place in a bottom-up, ‘anyone can’ manner,” which

“eventually looked a lot more like coordination of the rest of the Internet.” In other words,

they weren’t advocating outright anarchy, but an outcome which would firmly and finally

establish that no one was in charge of any portion of the Internet – a world where guiding

principles would be fixed and gatekeepers would be gone.


432 A short biography of Johnson can be found in John Brockman (1996), Digerati: Encounters with the Cyber Elite, online at http://www.edge.org/documents/digerati/Johnson.html.
433 “Law and Borders – The Rise of Law in Cyberspace,” 48 Stanford Law Review 1367 (1996).
434 David R. Johnson and David G. Post, “And How Shall the Net Be Governed?: A Meditation on the Relative Virtues of Decentralized Emergent Law,” in Kahin and Keller (1997: 62-92).

A more self-avowedly radical piece, “And How Shall the Net Be Governed?: A

Meditation on the Relative Virtues of Decentralized Emergent Law,” came from two

attorneys, David Johnson and David Post. Their involvement in the creation of Counsel

Connect, a popular online service for lawyers, had secured their places at the forefront of

computerization within their profession.432 But they had also distinguished themselves as

activists on free speech and privacy issues. Now they were taking up the cause of the Internet

as an independent sphere of legal authority. And they were on a roll, having recently

published a major article in the Stanford Law Review on a similar theme.433

Johnson and Post saw the Internet as “a new global space that disregards national

borders and that cannot be readily controlled by any existing sovereign.” Not only did its

decentralized nature allow for a new approach to rule-making (about which the two authors

had much to suggest), anyone engaged in that incipient rule-making process had to recognize

what Johnson and Post deemed to be an essential fact of the new reality: No existing

sovereign territorial government “possesses legitimate authority” to serve as a source of rules

for Internet activity. Any attempt to impose the “traditional model of governance,” they

believed, “represents... an extraterritorial power grab... a form of colonialism long rejected

(and rightly so) in the non-virtual context.”434

Such bold talk was not unusual during the Internet boom, but Johnson and Post were

far more sophisticated than most self-declared citizens of cyberspace. They understood the

utility of making a big splash. Building on the momentum of the Stanford article and the

popular fascination with the Internet itself, they leveraged the media tie-ins of the Harvard

venue for maximum return. They had consciously embarked on a world-making agenda,

thinking through the legal and political implications as skillfully as anyone before them.


435 Don Mitchell, Scott Bradner, and K Claffy, “In Whose Domain? Name Service in Adolescence,” in Kahin and Keller (1997: 267).

* * *

Though Postel did not attend the conference, other leading characters from the DNS

debates were there, including Heath, Shaw, Rutkowski, Mitchell, and Bradner. The latter

two, along with Katherine Claffy, put together a rather brusque, counter-intuitive argument.

For them, DNS technology was ultimately “irrelevant.” The real problem was scale; over the

long run DNS wouldn’t be able to handle the problem of locating resources on the Internet.

A true directory service would be needed for that (the kind of service that AT&T had failed

to deliver to the InterNIC under its share of the Cooperative Agreement). In fact, they argued,

debates over the future of the DNS were a distraction from the more important problem of

Internet governance.

The Internet community prides itself on its anarchic nature. [The expected] withdrawal of U.S. government support and authority for its intellectual infrastructure will render inadequate the boundaries that have isolated and protected that anarchy. The community must create new authorities and/or coalesce around globally credible institutions of governance, and determine and establish mechanisms which are likely to insure the long-term viability of these institutions. And all of this must occur quickly. We fear, and thus caution the community, that there is much truth to the old adage that those who are unable to govern themselves will be governed by others.435

The real challenge, they believed, was “to institutionalize the IANA function as

quickly as possible.” Only that would guarantee survival of the Internet community’s

intellectual infrastructure. For these cynical old hands of the Internet, TLD allocation was

just an irrelevant sideshow.

* * *

There was something so palpably radical about the Internet in those days that pinning

down its meaning and purpose had become a matter of some interest. What was the desired

intent of all this effort? What project was being projected? Descriptions drawn from past and

present society often felt inadequate. Notwithstanding competing semantics, terms like “self-

governance” were too tame. Even “anarchy” fell short. New coinages – oxymoronic


For NSI’s “Domain Dispute Policy Statement – Revision 02,” see Rony and Rony (1998: 157-9).436

S ee C a rl O p p e d ahl , “N SI Flawed D omain N ame P olicy in fo rm a tio n437

page,”http://www.patents.com/nsi.htm. See also, Milton Mueller, “Trademarks and Domain Names: Property

Rights and Institutional Evolution in Cyberspace,”http://istweb.syr.edu/%7Emueller/study.html.

morphologies like “dis-intermediated,” “stupid network,” “co-opetition,” and “decentralized

interoperation” – helped capture the early Internet’s thrilling sense of exceptionalism. But

that very exceptionalism had produced a crisis. It was no longer enough to be a geeky insider

who could intuit a tacit awareness of the Internet’s catalyzing values. It was no longer enough

to rely on a community of insiders to embed those values in every newly-built piece of the

Internet’s burgeoning infrastructure. Mystical faith in taken-for-granted principles could not

sustain those principles against attack, attrition, or entropy. For the Internet’s exceptional

traditions and institutional forms to be preserved, they would have to be articulated,

promoted, defended, and advanced.

* * *

The NSF conference was an intriguing detour, but the DNS street-fight never dropped

entirely from view. NSI published a new revision of its dispute resolution policy that same

week.436 Intended primarily to make the policy clearer and more easily understood, its

appearance did as much to aggravate old wounds as to prevent new ones. The restatement

did nothing to satisfy those who believed the policy was designed to benefit self-aggrandizing corporate opportunists at NSI, and that it tilted too heavily in favor of trademark-holding complainants. And it provided even more grist for the mills of legions of analysts

who were developing expertise in this chic new area of law. NSI’s own terms of service

regarding trademark policies were again attacked for undermining the principles of free

speech and first-come, first-served.437 The company was also criticized for not reporting basic

statistics about domain name disputes, and for leaving many unanswered questions about

how widely and how consistently its policies were applied. NSI wouldn’t even report how

many names had been removed.


Excessive fear of being sued for contributory infringement produced an overreaction.

The company’s attorneys undertook a hyper-vigilant strategy that leaned hard to the benefit

of trademark holders, even when traditional concepts of infringement did not apply.

Some incidents were sensational. At one point, at the request of a Michigan lamp

retailer, Juno Lighting, NSI was on the verge of deleting the name juno.com. Its operator was

a well-established ISP that had nothing to do with the manufacture or sale of lamp fixtures.

More than 500,000 people depended on juno.com’s free email service. The owners of Juno

Lighting demanded the name anyway, and NSI announced it would delete the juno.com entry

until the legal issues were settled. An emergency court order prevented disaster for Juno’s

email account holders. NSI eventually revised its policy so that a contested name could

remain visible on the Internet if the name holder agreed to indemnify NSI. But the basic flaw

was left in place, allowing trademark holders to invoke their presumed “rights” to various

names, despite clear lack of infringement.

Many of the domain name disputes that finally reached court had a kind of David

and Goliath feel. Roadrunner, an Arizona-based ISP, was challenged by Warner Brothers,

which claimed dilution of its cartoon character brand. (A weak flyer, but fast and agile on the

ground, the roadrunner is a distinctive, popular bird often seen in the Southwestern US, and

especially Arizona.) Perhaps the most mismatched of such cases pitted Uzi Nissan against

the Nissan Motor Company. Nissan, who ran a computer business in North Carolina, was

born in Israel where Nissan is both a common family name and the name of a calendar month

in Hebrew. He registered nissan.com in June 1994, shortly after Josh Quittner made an issue

of mcdonalds.com. But no Solomon stepped forward to engineer a quick and happy solution

to this case, and Mr. Nissan was under tough legal pressure from the motor company for

years thereafter.

One of the most celebrated David and Goliath standoffs did not reach litigation. Chris

“Pokey” van Allen, a twelve-year-old Pennsylvanian, received a challenging letter from the

Prema Toy Company demanding he abandon his pokey.org website (a gift from his father,

David van Allen, an ISP operator) and surrender the name. The younger van Allen’s

nickname happened to be the same as a claymation character named Pokey (the horse who


438. Jeri Clausing, “Boy's Web Site Becomes a Domain Name Cause,” New York Times, March 25, 1998, http://partners.nytimes.com/library/tech/98/03/cyber/articles/25pokey.html; and Jeri Clausing, “Gumby Creator Grants Boy Pokey Domain,” New York Times, April 24, 1998, http://partners.nytimes.com/library/tech/98/04/cyber/articles/24pokey.html.

439. See Ronald B. Standler, “Trademarks on the Internet,” http://www.rbs2.com/tm.htm.

440. See Toeppen’s recounting at http://www.toeppen.com.

was Gumby’s sidekick). The father responded to Prema’s demand by hiring an attorney and

undertaking a remarkably effective web-based public relations campaign that generated over

4,500 sympathetic email messages. Prema’s attorneys withdrew their demands after a

friendly intervention by Art Clokey, Gumby’s and Pokey’s creator.438

Then there was the matter of satirical sites. Michael Doughney registered peta.org to

support content on behalf of “People Eating Tasty Animals.” The animal rights group, People for the Ethical Treatment of Animals, was predictably offended. There was also an elaborate parody site called microsnot.com, which announced “the acquisition of England™, a leading country.” The site described new licensing terms for England’s main product.

English™ will no longer be made available on a public domain basis. All users of English™ must register with Microsnot. A trial version of English™ will be made available with a limited vocabulary.439

To be sure, there were truly predatory name-nappers. Dennis Toeppen, an Illinois-based entrepreneur, registered about 200 names in 1995, shortly after fees were established,

including several he knew to be coined words. Such trademarks generally receive the

toughest protections under law because they have no other prior use in natural language. At

first Toeppen’s gamble seemed to work; rather than litigate, a few of his targets chose an

expeditious course and paid his greenmail demands. Others – most notably Intermatic and Panavision – sued and won, establishing what Toeppen later described (in an expression of

ennobled suffering, if not redemptive contrition) as “the foundation of modern Internet

trademark law.”440


441. See Ken Hart, “Net Plan Masks Power Bid,” Communications Week International, September 9, 1996, 170.

442. Bill Frezza, “The ‘Net Governance Cartel Begins to Crumble,” InternetWeek, October 28, 1996, http://www.internetweek/102896/635frez.htm.

* * *

By October 1996, domain names were a recurring media story, particularly when

there was news of the skyrocketing sums being paid for prized names. By now the bounty

had reached as high as $50,000. Coverage of the NSF’s latest Harvard conference fueled

more interest, especially within the segment of the computer industry trade press that focused

on ISPs. Savvy journalists were acquiring a more sophisticated appreciation of how much

hinged on the creation of new TLDs. A few were starting to take sides on the DNS debates,

and most of them favored the alt-rooters.

Postel, consequently, was taking harder shots than ever. Drumming up criticism of

Postel’s latest draft, Communications Week ran a story headlined “Net Plan Masks Power

Bid.” Robert Shaw played the foil, denouncing the draft as “totally without legitimacy.”

The domain name server space is a global resource and neither IANA nor ISOC have the right to appoint themselves as the taxing authority for the root level to the tune of potentially millions of dollars a year.441

There was more. Bill Frezza, a columnist for InternetWeek, described Postel as “a

godfather-like figure within the [Internet’s] ruling cartel.” Frezza believed that AlterNIC’s

“homesteading” efforts had changed the landscape so significantly, the cartel might begin to

“crumble.” Presuming the Internet’s “old insiders” refused to “embrace the renegades,”

certain outcomes could be predicted: 1) An eventual “DNS war probably would settle into

sovereignless anarcho-syndicalism similar to the structure of international drug cartels,” or;

2) “real politicians could use the fracas as an excuse to grab control, kicking out the geeks”

and ultimately subordinate the Internet to the FCC or the ITU.442

A Boardwatch report provided a detailed background on the evolution of Postel’s

new iTLD RFC drafts. The article also recounted Ambler’s $1,000 “application” for .web,

incorrectly implying that Manning had been fired after the incident. The authors were


443. David Hakala and Jack Rickard, “A Domain By Any Other Name,” Boardwatch, October 1996.

444. Original sources have disappeared, but notes indicate the letter was dated September 23, 1996. The letter was later referred to by the FNC Advisory Committee. See “Draft Minutes of the Federal Network Council Advisory Committee (FNCAC) Meeting,” http://www.nitrd.gov/fnc/FNCAC_10_96_minutes.html.

445. Letter dated October 4, 1996, forwarded to me by the author.

relatively gentle with Postel, allowing, “It seems inevitable that an organization which has

never attempted to make money in the past should have a few problems handling cash the

first time it tries.” Nevertheless, they lauded Kashpureff and sought to promote access to

AlterNIC’s root, printing step-by-step instructions for ISPs and home users who might want

to reconfigure their machines or operating systems in order to try it out.443
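What those step-by-step instructions amounted to can be sketched briefly. For a typical Unix host of the era, switching name service wholesale meant nothing more than pointing the system resolver at a name server that loaded AlterNIC’s root zone instead of the InterNIC’s. The address below is a documentation placeholder, not AlterNIC’s actual server:

```text
# /etc/resolv.conf -- the whole of a Unix host's stub resolver configuration.
# Name resolution follows whichever server is listed here; if that server
# uses AlterNIC's root zone, the host sees AlterNIC's extra TLDs.
# (192.0.2.53 is a placeholder from the documentation address range.)
nameserver 192.0.2.53
```

The triviality of the change was precisely the point Boardwatch was making: defection from the legacy root required no new software, only a different address in a one-line file.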

* * *

DNS reform was a strange kind of hot potato. It got hotter the longer one held it.

Postel had hoped to hand it off in June, but very little had been accomplished since ISOC’s

Board postponed sponsorship of his requested “blue ribbon” committee. Technical details

about operational requirements for root servers had been fleshed out, but tougher questions

about liability and authority remained open.

In the wake of the Harvard conference, Kahin might have felt that the power to take the

initiative and move things forward had come within his reach. Evidently hoping to settle the

authority question once and for all, he wrote to the co-chairs of the FNC asking for

clarification about U.S. government claims to ownership over 1) the IP address space; 2) the .com, .net, and .org TLDs; and 3) the root.444 But Kahin was not the only one trying to seize

the initiative.

On the heels of Kahin’s move, Don Mitchell circulated a memo titled “Declare

Victory and Leave.”445 The Cooperative Agreement should be terminated immediately, he

argued, 18 months ahead of schedule. His logic was that NSI’s financial solvency made

contract termination both possible and necessary: the Internet had evolved into a full-blown

commercial entity, and was therefore beyond the scope of the NSF’s educational support

mission.


Where Mitchell was eager to help NSI consolidate its power, others still hoped to

diffuse it. Not that things had gotten any easier. Total registrations in .com and .net had now

reached 600,000, with the pace of registrations growing 15% each month. NSI was not only

solvent, it was flush with cash, and its future was bright. Market watchers speculated that the

company was preparing to “go public” with an Initial Public Offering (IPO). If so, its current

owners were about to become quite wealthy. Given how badly Postel had been burned by the

process so far, and given that agents of the US government were either preoccupied with

analysis or exceedingly friendly to NSI, and given that alternate roots had failed to break

through, only one entity might conceivably challenge NSI’s monopoly and disperse the

windfall – ISOC.

b. Blue Ribbons – November 1996

Like most of the people associated with the technical community, Don Heath lacked

a straightforward, comprehensive, popular goal for DNS reform. But even if he didn’t know

specifically what to do, given the mounting pressure for someone to finally step up and act

like a responsible agent, he had a unique opportunity to move things forward. IANA may

have been down, but it was far from out, and Postel was still touting ISOC as the legitimate

voice of the Internet community. Moreover, despite rough beginnings and persistent

controversy, ISOC’s standing in the technical community had achieved preeminence. As new

users rushed onto the Internet, and as new contributors joined the IETF, ISOC no longer

appeared to be such a usurper. Now it was just part of the landscape.

In early October Heath began floating plans for an International Ad Hoc Committee

(IAHC), the “blue ribbon” that Postel had longed for. There was a new sense of urgency, in

part because Mitchell had become so much bolder in advocating his own ideas. Also,

the FNC Advisory Council was scheduled to meet in Washington on October 21-22,

establishing a deadline of sorts. If ISOC didn’t act decisively by then, control of the initiative

might fall to the NSF, or to Kahin, or just slip away altogether in another round of analysis

and fact-finding.


446. Most comments were not attributed to any particular speaker. Peter Rony, forwarded by Ellen Rony to Michael Dillon, “NEWDOM: FW: FNCAC Meeting, Oct 21-22, 1996; NSF, Arlington, VA (fwd),” newdom_ar 10/24/1996.

447. “Draft Minutes of the Federal Network Council Advisory Committee (FNCAC) Meeting,” http://www.fnc.gov/FNCAC_10_96_minutes.html. See also, “FNCAC Agenda Topic – Governance of the Internet,” http://www.fnc.gov/fnca97_sec5.html.

When the FNCAC meeting convened, the report requested by Kahin was not ready.

(It seems unlikely that it was ever completed. There was no record of it in any archives or at

the FNC’s website.) Nevertheless, DNS problems were discussed at length, and heatedly.

Peter Rony, researching a book on how to “win” in the domain name market, attended the

discussion and transmitted a partial record.446

Mitchell was there, avidly pressing his case for ending the NSF’s responsibility for

InterNIC support. He wanted out almost immediately. Many were reluctant to recommend

such precipitous action, since it would drastically reduce US leverage in shaping future

reforms. There was lingering criticism of NSF for the way the institution of fees had been

managed. There was also some discussion of shutting down .com altogether. And there was

general amazement about how profitable the DNS business had become.

The group passed a resolution to stress “the urgency of transferring responsibility for

supporting U.S. commercial interests in iTLD administration from the NSF to an appropriate entity.”447 Who that “appropriate entity” might be was left vague and unspecified. Over at

ISOC, however, people had very definite ideas.

Though the details of the IAHC weren’t complete, Heath had made a lot of headway.

Primed by Shaw, Maher and Tramposch in Dublin, and helped along behind the scenes more

recently by Larry Landweber, organizational work was racing forward. Throughout the

Internet community, especially on newdom, people were preoccupied by speculation about

the committee’s composition. Several were even jockeying to be appointed. Postel had been

held up by jury duty, so the final membership wasn’t yet known, but there was enough in

place to launch a fait accompli.

Just as the FNCAC meeting got underway, ISOC issued a press release describing the

nine member IAHC. Belying the goal of launching a whole new framework for DNS


448. ISOC Press Release, 1996, “Blue Ribbon International Panel to Examine Enhancements to Internet Domain Name System,” http://www.iahc.org/press/press1.html.

449. FNC Draft Minutes and FNCAC Agenda Topic. See fn. 447.

450. Mueller (2002: 143).

governance, its understated objective was to “resolve controversy . . . resulting from current

international debate over a proposal to establish global registries and additional international

Top Level Domain names (iTLDs).”448 IANA, ISOC, and the IAB would each appoint two

members, and the ITU, WIPO, and the INTA would each appoint one. This was an unlikely

alliance: the globalizing elite of the Internet engineering community and the bureaucratic

agents of the old world telecommunication, patent and trademark regimes.

The FNCAC’s reaction was first to ask for a seat at the table, and then to hedge its

bet. According to its minutes: “While not endorsing the [Postel/ISOC] RFC, FNCAC

members urged NSF and the FNC to seek membership on this advisory committee, in

recognition of the government's historic stewardship role in this sector.”449

Leaving the FNC off the invitation list was just one remarkable oversight of ISOC’s

announcement. Another was the lack of explicit representation for the ISP community.

Asking CIX or RIPE or APNIC to appoint a member would certainly have been considered

appropriate, especially after so much attention over the previous year had been given to that

community’s importance and its legitimating role.

Some thought it was a mistake not to invite NSI.450 Yet, as the entity most likely to

lose from introducing new TLDs, one can see how Postel and Heath might have expected

NSI’s behavior to be obstructive. Still, including NSI in order to negotiate some kind of “buy

in” would have been the more astute move. But no one acts with the benefit of hindsight.

c. IAHC - Late 1996

With the IAHC taking shape, it seemed like the reform process was taking a real step

forward. But appearances had been misleading before. After all the fuss of the past year, little

had been achieved. Expiration of the Cooperative Agreement was now less than 18 months


451. Paul Vixie, “requirements for participation:” October 31, 1996, IETF and newdom-vrx.

away. Concerned that momentum would be squandered on another round of dead-end drafts

and eternal argument, Paul Vixie laid down an ultimatum to the major players. And, to

underscore the point, he made it known on the Alt-root community’s newdom list.

I have told the IANA and I have told InterNIC -- now I'll tell you kind folks.

If IANA's proposal stagnates past January 15, 1997, without obvious progress and actual registries being licensed or in the process of being licensed, I will declare the cause lost. At that point it will be up to a consortium of Internet providers, probably through CIX if I can convince them to take up this cause, to tell me what I ought to put into the "root.cache" file that I ship with BIND.451
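Vixie’s leverage was concrete. Every copy of BIND shipped with a “hints” file (commonly named root.cache or db.cache) from which a resolver bootstraps its view of the root; whoever edits that file effectively chooses the root. A minimal sketch of the format, using the well-known legacy root server A as an illustration:

```text
;; root.cache -- BIND's root "hints" zone, loaded at startup
;; (via a "cache" directive in BIND 4's named.boot, or a
;; "type hint" zone in BIND 8's named.conf). Substituting other
;; NS and A records here points the resolver at a different root.
.                        3600000      NS    A.ROOT-SERVERS.NET.
A.ROOT-SERVERS.NET.      3600000      A     198.41.0.4
```

Because this file was distributed with the software rather than fetched from any authority, Vixie’s threat to change its default contents was a credible claim on the root itself.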

Vixie believed that he could play the initiative game as well as anyone. His ultimatum

included a noteworthy assumption, implicitly equating the IAHC with “IANA’s proposal,”

and therefore Postel’s latest draft. Most observers of the process thought the purpose of the

IAHC was to flesh out an existing set of ideas, pulling together the best parts of Postel’s

drafts and contributions that had been made on public email lists. It was an unquestioned

premise. Few expected the outcome would be something altogether new.

* * *

IAHC members weren’t expected to serve as representatives of the groups that

appointed them. They were to participate as individuals, IETF-style, presumably free to act

as they saw fit, beholden to no one.

Both the IAB’s appointments were from outside the United States. Geoff Huston was

a senior Internet architect at Telstra, the leading ISP in Australia, and a key figure in APNIC.

Hank Nussbacher was an Israeli-based engineer who was deeply involved in designing an

IBM-mainframe-based store-and-forward network called BITNET (the “Because It’s There”

Network) while enrolled at the City University of New York. When he returned to Israel he

was instrumental in building IBM’s presence as an ISP there, and operating Israel’s TLD, .il.

Those two choices seemed to help redress the gap in ISP representation, but another


452. Personal interview with Don Heath.

explanation is more likely... a mix of personal and professional associations. Brian Carpenter,

the IAB Chair, was a distinguished computer engineer at CERN who later joined IBM. He

eventually became Chair of the IETF. Carpenter was also an avid ISOC supporter, an

enthusiasm he held in common with Nussbacher who was active in his home country’s ISOC

movement, and with Huston, who became an ISOC Board member, and would serve for a

time as its Treasurer.

Postel apparently searched for appointees among people who had been active in DNS

and Internet governance debates. Christopher Ambler later said he had declined Postel’s

invitation, expecting there would be a conflict of interest when it came time to grant TLD

licenses. Simon Higgs said he was invited, but had declined on the grounds that the IAHC

process was anti-competitive. Karl Auerbach said Postel had asked him to consider serving

as well. In the end, his choices were Dave Crocker and Perry Metzger.

Crocker had worked with Postel and Dave Farber at USC in the mid 1970s, no doubt

getting his foot in the door with some help from his famous older brother Steve. He had even

shared an office with Postel for part of that time. The younger Crocker went on to distinguish

himself as the sole author of RFC 822, the first widely-accepted standard for Internet-based

email. He worked on the MCI Mail project in the mid 1980s and afterward remained

professionally involved with development of application standards, including electronic data

interchange (EDI) and Internet faxing. He was also quite active in the IETF, contributing to

many RFCs. He served as an IESG Area Director (AD) in the early 1990s, remaining a

voluble member thereafter. (More than a few IETF veterans I spoke to regarded Crocker as unbearably haughty and disputatious, but those quirks weren’t so unusual in that group.)

Before the appointment, his comments pertinent to DNS reform often advocated on behalf

of the interests of people outside the United States. Parenthetically, according to Heath, it

was Crocker who had talked him out of inviting CIX’s President, Barbara Dooley, to serve

on the panel.452


453. Manning and Vixie presented “RFC 2010: Operational Criteria for Root Name Servers” in October 1996 as part of this process.

Postel’s other appointment was the youngest member of the group, Perry Metzger,

a security specialist who had chaired the IETF’s working group on Simple Public Key

Infrastructure (PKI). Metzger had a reputation from the Usenet as an exceptionally energetic

and prolific member of whatever list was his current passion. An effusive advocate of the

Libertarian Party, he was an invariable critic of U.S. government controls restricting the

export of products containing strong encryption algorithms. Not shy in front of a crowd,

Metzger was a presence in the IETF, engaging in polemics wherever it suited him. He had

been particularly outspoken on newdom since its inception, pushing for shared registries,

taunting the alt-rooters, and defending Postel against any insult. Metzger had an especially

strong personal affinity for Postel. He would seem to drop his harsh edge and light up in the

elder’s presence, becoming happily animated and, at the same time, softer.

Heath’s selections for ISOC were David Maher, the rescuer of mcdonalds.com, and

Jun Murai, an eminent Japanese computer scientist. Murai was one of the heroes in

Malamud’s Internet Travelogue, where he was portrayed as a no-nonsense but easygoing

dynamo. Murai played a significant part in building his country’s Internet infrastructures,

leading an advanced research and operations effort known as the Widely Integrated

Distributed Environment (WIDE). WIDE became an essential component of Asia’s Internet

backbone, and a pioneering hub of IPv6 development. WIDE was also a big player in DNS,

slated to host a new root server as soon as the legacy constellation expanded from eight to

thirteen machines.453 That server would be under Murai’s control. He was also an ISOC

Board member.

When Maher began showing up at conferences on DNS issues, Postel and Cerf

were quite put off. Postel, Maher recounted, initially observed him “as if I had crawled out

from under a rock.” Cerf’s reaction to Maher’s first presentation was “close to apoplectic.”

Yet Maher had earned high regard from Heath and others in ISOC over time, and had

become nearly ubiquitous in DNS policy-making discussions. That prominence ultimately

provoked a falling out between Maher and his managers on the International Trademark


454. David Maher, “Reporting to God,” CircleID, March 27, 2006, http://www.circleid.com/posts/reporting_to_god_icann_domain_names_dns/.

455. See her retrospective comments to WIPO’s domain name consultation, September 23, 1998: “As to new gTLDs. Don Heath can tell you that I have always been opposed to new gTLDs.” http://arbiter.wipo.int/processes/process1/consultations/series1/usa-sanfrancisco/transcript5.html.

Association’s Board. They demanded that he stop speaking to the press and restrict his activities to preparation of background reports exclusively for INTA’s use.454 Rather than comply, Maher resigned from his INTA post on September 1st. His co-chair Bob Frank resigned shortly thereafter.

All along, evidently, the plan had been for INTA to get a spot on the IAHC,

presumably to be filled by Maher. Now it went to another member of INTA’s Internet Task

Force, Sally Abel, who was clearly in step with the trademark community’s general opposition to new generic TLDs.455 Heath made sure Maher got appointed anyway. That turn of events

was clearly advantageous to trademark interests. Whatever differences Maher and Abel may

have had over the creation of new TLDs, and whatever other nuances distinguished their

approaches to policy-making, they shared a visceral affinity for the needs and interests of

trademark holders. It was the common cornerstone of their professional careers. The same

was true for another veteran of the Dublin pub crawl, Albert Tramposch, who was appointed

as expected by the World Intellectual Property Organization. A seasoned international

attorney, Tramposch had led WIPO’s trademark treaty negotiating efforts for most of the

1990s.

The International Telecommunications Union selected one of Postel’s best known

antagonists, Robert Shaw. Again, it was no great surprise. Like Maher, Shaw had been

virtually living and breathing DNS issues for the better part of the year. And, like Maher, he

had been cultivating relations with Heath and others in ISOC since the OECD meeting in

June. Mutual aversion to Rutkowski would have quickened their bonding.

What distinguished Shaw was that he seemed to be the engine of the ISOC/ITU, new-

technology/old-bureaucracy alliance. He had been posting official-sounding comments about

IAHC planning to newdom since the first public announcement in October. He had justified


456. Robert Shaw, “Re: NEWDOM: Update,” newdom-ar October 30, 1996.

its membership roster as an “interesting experiment in widening the concept of the traditional

'community.'” He had also raised the prospect that the “outcome” should be “a more

permanent 'commission' [to be] 'bootstrapped' for when it is needed.”

One idea that appeals to me is to create something like a Internet Policy Memorandum of Understanding (MoU) group (e.g., as was used for GSM agreements). This is an interesting model because it allows many stakeholders to join it (including both the private and public sector), is self-regulatory in nature, widely diffuses responsibility for complex policy decisions, can have a legal character, and if formulated loosely, can be quickly adapted to new problems.456

The selections from the technical community were, not surprisingly, people whose

approach to problems tended to narrow things down and focus on the matter at hand. To the

extent engineers engaged in grand, long-term solutions, they preferred strategies that let them

identify and focus on critical pieces of a structure. Doing so, they could pursue designs that

were robust and scalable, and most importantly, build-able with the resources available.

Engineers were comfortable with tangible things. But the IAHC was about to undertake the

framing of a comprehensively new institutional vision for management of a complex, hard-

to-understand problem. The challenge was to establish a jurisdictional link between a domain

name hierarchy of obscure provenance and a patchwork of national intellectual property

systems. If the case at hand was almost entirely novel, the idea of institution building was

not. People who were already familiar with the construction of legal edifices within

international organizations were far better prepared for the task.

The policy specialists on the panel were familiar with the business of making

dispersed institutional things that require long-term care and feeding. They were used to

thinking more broadly and farther ahead at the expense of immediate specificity. The

engineering specialists were temperamentally reactive, preferring situations in which they

could fix things and confidently walk away. That mismatch worked to the benefit of the

lawyers and international bureaucrats on the IAHC, and thus to the great benefit of the

trademark and intellectual property communities.


457. Tony Rutkowski, “Re: Let the games begin,” gtld-discuss November 17, 1996. (Note that iahc-discuss postings have been merged into the gtld-discuss archives.)

458. Paul Vixie, “re: don heath’s comments,” gtld-discuss November 21, 1996.

459. Jon Postel, “Re: don heath’s comments,” gtld-discuss November 22, 1996.

Cultural differences may even explain why Heath was so much more receptive at the

end of the day to the arguments of the policy specialists than the engineers... the lawyers were

better schmoozers.

* * *

With the addition of Strawn, and with Heath as chair, the panel of eleven members

was announced on November 12. Expectations were running high, at least among those who

still had faith in the integrity of ISOC and IANA. NSI was now taking in about $4 million

per month (on over 15,000 registrations per week), with no plateau in sight. Postel’s

supporters were counting on the IAHC to cut to the chase. That meant, in short order,

blessing new TLDs, breaking NSI’s monopoly, and fostering real competition in the domain

name registration business.

A new mailing list, iahc-discuss, was inaugurated on November 16. Crocker kicked

things off with a polite message titled “Let the games begin,” inviting participants to read the

IAHC charter and other web-based materials. Rutkowski was the first to respond, contending

that an unincorporated group couldn’t have a real charter since the word implied “a grant by

sovereign authority.”457 And so the games began. Posting was immediately voluminous, quickly spiking beyond sixty messages per day.

Postel made few contributions to the list. He put up a couple of invitations to join

ISOC and some very brief explanations of country code TLDs and the .int domain. His only

policy-related comment came during a thread initiated by Paul Vixie. Punctuating his earlier

ultimatum, he warned against the IAHC becoming “another think tank or debating society.”

The time to ask “what should we do?” was largely past, Vixie wrote.458 Postel chirped in, “Admitting to a small bias in the matter... Do what the ‘Postel draft’ says to do.”459


460. Jon Postel, “The IANA’s File of iTLD Requests,” gtld-discuss, December 5, 1996. Postel published two versions. The second excluded names that Postel surmised were pranks.

Soon after, anticipating the upcoming work sessions of the IAHC, Postel published

a date-sorted list of the TLD requests that had been submitted to IANA by aspiring operators

over the preceding fourteen months. He made no comment about how the IAHC should

assign priority among them.460

Several newdom members wanted to get together at the upcoming IETF meeting in

San Jose, hoping again to use the Internet Registries Evolution (IRE) Working Group

meeting as a venue. But its chair considered DNS out of scope. The IRE group had evolved

a stronger focus on IP registry issues. If there was any truly technical issue area left in

common between the names and number specialists, it was the problem of identifying and

communicating with registrants. Figuring out how to consolidate a registrant’s “handles”

across different systems was an important concern in both communities. But the IRE group

preferred cutting its ties with the DNS policy zoo. Consequently, Rick Wesson organized a

separate New Top Level Domain BOF.

The IAHC held a “greet the public” session at the same venue. It also reserved a room

for its first private working session. Not only was the session closed to the public; Metzger made a show of sweeping the room for bugs. For the time being, the public would have to

wait for specifics about what the IAHC was up to. There was some vague foreshadowing

from Heath, who announced at the IETF plenary that the IAHC would be working out the

details of the reform process, not Postel.

d. Constituting CORE – December 1996

Despite the NTLD BOF, and despite the creation of a public email list, the IAHC

members conducted their deliberations in a manner far at odds with normal ISOC or IETF

practices. Instead of using their public list for developing drafts, they relied on private

exchanges. When they met in person, the sessions were closed and a staff counsel was

present. There were no minutes or other formal records to chronicle the proceedings. What


461. “Regions as defined by the World Trade Organization (WTO),” http://www.iahc.org/docs/countries.html.

is known of the debates within the IAHC has been reconstructed through conversations and

interviews.

In short, Postel’s ideas were put aside and Shaw’s prevailed. The panel took a “clean

slate” approach, ignoring the much-discussed public drafts aimed at generating a community

of NSI-styled competitors. What emerged instead was a series of entirely new models for

administration, dispute resolution, and customer service. There would be a centralized, non-

profit shared registry fed by a globally dispersed network of commercial registrars. It would

be called the Council of Registrars – CORE. The IAHC also adopted new nomenclature;

where Postel had used iTLD (for international), the new acronym would be gTLD (for

generic, and hinting at global).
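The split between a single authoritative registry and many competing registrars can be pictured in miniature. The following sketch is purely illustrative; the class names, method signatures, and first-come-first-served check are assumptions made for exposition, not the actual CORE specification:

```python
# Illustrative sketch only: class names and the first-come-first-served
# check are assumptions for exposition, not the actual CORE design.

class SharedRegistry:
    """A single non-profit database holding the authoritative name records."""

    def __init__(self):
        self._names = {}  # domain -> (registrant, registrar)

    def register(self, domain, registrant, registrar):
        # The registry, not any one registrar, decides whether a name is free.
        if domain in self._names:
            return False
        self._names[domain] = (registrant, registrar)
        return True


class Registrar:
    """A commercial retail channel; many registrars feed the same registry."""

    def __init__(self, name, registry):
        self.name = name
        self.registry = registry

    def sell(self, domain, registrant):
        return self.registry.register(domain, registrant, self.name)


registry = SharedRegistry()
alpha = Registrar("registrar-alpha", registry)
beta = Registrar("registrar-beta", registry)

assert alpha.sell("example.web", "alice")   # name is free: registration succeeds
assert not beta.sell("example.web", "bob")  # shared database blocks the clash
```

Under such a model, registrars compete on price and service while the database itself stays unified, which is why control of the registry, not the registrar role, was the decisive prize.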

Most of the IAHC’s time was spent preparing a document called the Generic Top

Level Domains Memorandum of Understanding (gTLD-MoU), the instrument that would be

used to constitute the appropriate institutional arrangements. Other business involved coming

up with procedures that would be used to select the registry database operator (DBO) and vet

the CORE members.

The first draft of the IAHC report, issued December 19, described a procedure by which

businesses from around the world would be allowed to apply as prospective CORE members.

Only 28 were to be accepted... four registrars from each of the seven global regions

recognized by the World Trade Organization: North America; Latin America; Western Europe; Central Europe, Eastern Europe and some former Soviet States; Africa; the Middle East; and Asia.461 Applicants would be required to provide evidence of capitalization,

insurance, creditworthiness, and number of employees. These applications were to be vetted

by Arthur Andersen, LLP, which, at the time, was still considered a highly reputable “Big Five” accounting firm. Finally, a lottery would determine which of the qualified

applicants would become CORE registrars in each region.

With one exception, the trademark interests got nearly everything they could wish.

There would be a 60-day waiting period on the registration of new names, allowing


462. The relevant wording of the December 1996 draft was at paragraph 3.2: “The IAHC recommends that all existing gTLDs be shared in the same fashion [as new gTLDs].” The wording was strengthened in the February 1997 Final Report at paragraph 3.2.3: “It is intended that all existing gTLDs eventually be shared.”

designated authorities to check the names for possible trademark violations. Not only that,

the new central registry’s hardware and software would be configured to provide expedited

access to those monitors. This would speed up their ability to run searches for specific strings

of characters. Down the line, in order to bring the legacy gTLDs under the new regime, NSI

would have to give up control of the lucrative .com registry, as well as .net and .org.462 The

company was encouraged to apply to be a CORE registrar. Given the lottery gamble,

however, it had everything to lose. The Internet veterans probably didn’t spend much time

lamenting NSI’s reduced prospects.

Nevertheless, the trademark community did make one big concession. All along, the

conventional wisdom within most of the Internet community had been that competition

would come through the rapid creation of many new TLDs. Trademark hardliners had hoped

to block new TLDs altogether, but the ISOC members on the panel wouldn’t give in. They

believed in the introduction of TLD competition and they had staked their reputations on

following through. Even Postel, the ultra-conservative, had recommended a first wave of

thirty. Plus, the “flatness” of a registration system that channeled so many new names into

.com remained technically worrisome. The compromise was seven. Further additions would

be considered after April 1998, the expected date for the expiration of the Cooperative

Agreement.

As Crocker told the story, it was he who was leading the discussion when the new

names were finally selected, writing prospective suffixes on a whiteboard. One particularly

attractive candidate was .web, but Crocker was aware of Christopher Ambler’s claim.

IODesign’s version of the .web domain was already visible through the AlterNIC

constellation, and the company was billing registrants. According to Crocker, when he asked

the attorneys in the room whether the IAHC should be concerned, he was told that IODesign

had no legal standing.


463. “CIX Comments on IAHC Proposal,” January 17, 1997, http://www.cis.org/iahccomm.html.

Dismissing the alt-rooters as flaky pirates, the group also dismissed the wisdom of

avoiding unnecessary conflict, no matter who the opposition might be. It was an ill-fated

choice, akin to driving a car through an intersection when you are sure you have right of way,

despite seeing another vehicle in your path. But perhaps there was an ulterior motive. If

the ISOC/ITU coalition could act unilaterally to assert itself over IODesign now, it would be in

a stronger position to do the same to NSI later.

So .web was included in the final seven. The other six were .arts, .firm, .info, .nom, .rec, and .store. Crocker later said he was unaware that .arts was already in

AlterNIC’s root. (That TLD was run by the Canadian Jason Hendeles, owner of Skyscape

Communications, Inc. In 2005 Hendeles’ application for .xxx would be accepted by ICANN

and then reversed under pressure from social conservatives in the US Congress.)

e. Blocking the MoUvement – January-April 1997

The first IAHC draft came out just before the winter holidays, allowing a short period

of quiet before critical reactions erupted in full force. In mid-January, members from CIX

and a handful of large North American and European ISP associations published a critique

that amounted to a direct attack. They challenged the panel’s “authority and legitimacy,”

citing the lack of ISP representation and the failure to use an open, consensus-building

process. They predicted that the gTLD-MoU would produce a “chilling effect on the

development of electronic commerce,” in part because the 60-day wait for new name

registrations was onerous, and in part because the creation of new, take-all-comer TLDs that

lacked “specific purpose” or “clear charter for their intended use” (like .com) would

exacerbate problems for trademark holders. In a remarkable shift, given the earlier pressure

to add suffixes, the authors of the CIX document recommended that further action be delayed

until later in 1997, leaving time for the US Patent and Trademark Office and the European

Commission to conduct public hearings on domain name issues.463


464. “PSINet Position Paper on the Global Top Level Domain Name Proposal by the International Ad Hoc Committee (IAHC),” http://www.psi.net/iahcpositionpaper.html.

Another scathing rejection of IAHC came from the corporate offices of PSINet, a

company which had initially been one of its most important proponents. Renouncing their

earlier sponsorship of the IAHC, the authors called for “an open global Internet convention”

to be moderated by “a high profile Internet advocate such as United States Vice President Al Gore.”464 Gore was highly regarded in the Internet community for his initiatives as a Senator

on behalf of digital infrastructure expansion. That reputation was bolstered as the Clinton

Administration showcased its support for developing the “Information Superhighway.”

(Gore, by the way, never actually claimed to have “invented” the Internet, as his opponents

charged during the 2000 Presidential campaign.)

Yet more negative reaction came from the Domain Name Rights Coalition (DNRC),

a group of Internet-savvy DC-area attorneys who had been vigorous critics of NSI’s dispute

resolution policy. The President, Michaela “Mikki” Barry, a skilled software developer with

business ties to PSINet, was strongly motivated by free speech concerns. The Executive

Director, Michael Doughney, was known for the attention-grabbing “People Eating Tasty

Animals” website peta.org. He managed other provocative sites as well, including the

“Biblical America Resistance Front” under barf.org. Other notable DNRC members were

Kathy Kleiman, an activist in highly-regarded professional groups such as the Association

for Computing Machinery, and Harold Feld, a magna cum laude graduate of both Princeton

University and Boston University Law School, who had established himself as an expert in

telecommunications law.

For proponents of the gTLD-MoU, inciting the wrath of the DNRC was an

inauspicious turn of events. Feld, Barry, and the others redirected their considerable passions

and talents away from battling NSI into a sustained effort to block the IAHC’s plan. They

started by publishing a lengthy, carefully-argued attack. Its centerpiece was a vehement

critique of the requirement that domain name registrants submit to a sweeping waiver of their

rights to raise jurisdictional challenges in case of disputes. They denounced the shared


465. The group’s original site was www.dnrc.org. See “Comments of the Domain Name Rights Coalition to the IAHC Draft Specification for Administration and Management of gTLDs,” January 17, 1997, http://www.netpolicy.com/dnrc-comments.html.

466. Joyce Reynolds, Internet Monthly Report, “Trip Report,” http://www.isi.edu/in-notes/imr/imr9701.txt.

registry model and they derided the IAHC’s list of proposed new TLDs, especially for

overlooking the need for categories that would serve personal and political speech.465

First reactions from key movers and shakers in the European Internet community

were no more promising. Joyce Reynolds, representing IANA, got an earful at January’s

RIPE meeting in Amsterdam. “The general consensus,” she reported, “is that this group is

NOT happy with the IAHC as a body and the draft proposal they have out.”466 Nearly everyone there complained about the lack of European representation (Shaw and Tramposch, though Swiss residents, were actually American expatriates). No one was happy about the

narrow time frame available in which to evaluate the document and submit comments.

RIPE’s chair, Rob Blokzijl, went further, describing the IAHC plan as an attempt by

Americans to solve a vaguely-understood problem by exporting it to Europe. To make

matters worse, the meeting was monitored by Christopher Wilkinson, an official of the

European Community’s Telecommunications Directorate, who was on a fact finding mission

in preparation for a formal EC response.

One of the few items in the IAHC draft that received wide support was the

recommendation that the .us domain should add functional second level domains, emulating

the organization of most other country codes. In the UK, for example, commercial

organizations registered under co.uk, limited liability companies under ltd.uk and plc.uk, and

academic institutions under ac.uk. But the .us domain was designed to reflect American

political geography. States were at the second level, cities at the third, and there was a

complex array of intersecting subcategories for businesses, schools, libraries, and local

agencies. IANA was at the top of the chain, receiving a share of the funds that the NSF


467. See Ann Westine Cooper and Jon Postel, “RFC 1386: The US Domain,” December 1992. It was published about the same time the Cooperative Agreement was being finalized. See also the revision, RFC 1480, June 1993. See also Erica Schlesinger Wass, “The United States’ .US: Striving for the American Dream,” in Wass, ed., Addressing the World: National Identity and Internet Country Code Domains. Oxford: Rowman & Littlefield Publishers, 2003.

468. Willy Black, “Nominet UK's submission to IAHC on their proposals,” January 17, 1997. The submission included a comment from James Hutton of the United Kingdom Education and Research Network Association. See http://www.gtld-mou.org/gtld-discuss/mail-archive/03926.html.

provided to NSI, as stipulated by the Cooperative Agreement.467 The arrangement reflected

Postel’s faith in the collective contributions of volunteers, community servants, and small

businesses. This was the delegation motif implicit in RFC 1591. Responsible agents at one

level would provide hosting services on behalf of registrants at the level below.
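The delegation motif can be pictured as a chain of zones, each administered on behalf of the level below. A minimal sketch, using a hypothetical locality-style name in the RFC 1480 pattern (the helper function and sample name are illustrative assumptions, not from the source):

```python
# Illustrative sketch of the delegation chain implied by the locality-style
# naming in RFC 1480 (org.locality.state.us). The sample name below is
# hypothetical, not an actual registration.

def delegation_chain(domain):
    """List the zones from the TLD down to the full name; each zone is
    administered by a delegate acting on behalf of the level below."""
    labels = domain.lower().split(".")
    return [".".join(labels[i:]) for i in range(len(labels) - 1, -1, -1)]

chain = delegation_chain("joes-bakery.miami.fl.us")
assert chain == ["us", "fl.us", "miami.fl.us", "joes-bakery.miami.fl.us"]
```

The same decomposition applies to the British second-level scheme discussed above (e.g., a name under co.uk passes through uk, then co.uk), which is what made the functional-SLD recommendation for .us feel like an obvious emulation.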

In fact, the American country code was a complex, jumbled mess. Anyone wanting

to enter a name within the .us hierarchy had to make arrangements through the proper local

registration authority. Finding the agent wasn’t simple. Dealing with one could often be a

frustrating ordeal, if not a dead end altogether. For these and other reasons, American

consumers looking to establish an Internet presence shunned .us and purchased names under

.com. Many foreign observers believed that long-term mismanagement of .us had led to the

DNS mess. It appeared to them as if the IAHC was doing a poor job of reinventing the

wheel when the Europeans had several of their own that were rolling along quite nicely.

The British had just upgraded their own registration model, creating what appeared

to be the epitome of a shared registration system, called Nominet. It was a non-profit,

spawned by the academic community. Its director, Willy Black, proudly claimed that

Nominet operated in “the true spirit of trusteeship inherent in RFC 1591.”468 Back end prices

at the registry were kept low, but sufficient to keep the registry solvent. Nearly anyone could

be a registrar, and most British ISPs were, setting prices as they saw fit, typically bundling

hosting services with the sale of names. Nominet had a structural advantage in that

registrants had to use completely spelled out names within specific third level categories; this

reduced the likelihood of conflict. In case of rival claims to a name, Nominet encouraged

contenders to avoid “suspension” by “the use of shared web-pages which point to the two


469. Ibid.

470. J. Christopher Clough, “Network Solutions' Response to IAHC Draft,” January 14, 1997. Clough was NSI’s Director of Communications. See http://www.gtld-mou.org/gtld-discuss/mail-archive/03858.html.

471. In a personal conversation, Christopher Wilkinson recounted receiving calls from NSI personnel intended to prompt his criticism of the gTLD-MoU.

parties in dispute and incorporate URLs to enable the user to quickly get to the two home

pages.” The rest was left to “commercial negotiation or Court order.”

The bottom line was a fear that the IAHC was only compounding the problem of the

American-style, wild-west, free for all. Black sent a lengthy comment to the IAHC

explaining the Nominet model, “which has much to offer.” He opposed the introduction of

multiple new generic TLDs, arguing that it would oblige trade mark owners to register in yet

more spaces, and would be seen as a “cynical attempt... to milk money” from them.469

* * *

Compared to those reactions, NSI’s seemed lukewarm: “As a key stakeholder and participant... our consensus is also needed.”470 But the existential threat presented by the

IAHC was well understood. The company’s officers recognized they needed new and

stronger allies to survive. George Strawn had been quite helpful in the past, yet his presence

on the IAHC had failed to yield any explicit protections for NSI. Now the stakes were higher

than ever. A huge upgrade to the back end name registration system was coming online, and

the company was poised for another remarkable stretch of growth in demand. Preparations

for the IPO were ramping up. Moreover, ARIN was beginning to take form, raising public

awareness that new costs would be associated with IP address allocations.

If this was war, a Machiavellian response was in order. Therefore, it made sense for

someone at NSI to warn members of the European Commission’s Directorate for

Telecommunications that American-led groups were bringing about major changes to

Internet administration without having first solicited European input.471 And it made equally

good sense to warn officials in the US executive and legislative branches that the site of

Internet administration was about to be moved to Europe and away from US control.

Moreover, it made good sense for NSI to cultivate a surge of public protest against the IAHC.


To do so, it would be useful to take steps intended to undermine the reputation of the plan’s

most important supporters, namely Postel, ISOC, the IETF old guard, and their new allies at

the ITU. NSI found a collaborator who was exceptionally eager and uniquely qualified to

make the case... Tony Rutkowski.

Rutkowski was once an incessant opponent of government interference in Internet

policymaking, and a leading voice in favor of relocating the seat of Internet administration

to Geneva. But now it was in his interest to cultivate the US government as an ally against

the presumed greater enemy – the ISOC/ITU coalition.

As it turned out, the beginning of 1997 was an opportune moment for lobbyists,

policy advocates, and otherwise interested experts who wanted to catch the ear of someone

in government regarding the future of Internet administration. Government employees

recognized they needed to learn more about the topic, and fairly quickly. It was an interesting

issue, nerdy but newsworthy, and “hot” enough to merit the effort it would take to get up to

speed, especially if you worked for an agency that presumably had a say about Internet

administration, or might want to have more of a say down the line. The same was true for

legislative staff, especially if an important constituent had a strong interest in the issue. There

was a real demand for beltway-savvy resources like Rutkowski and the DNRC clique.

Government personnel needed to get their arms around the issue quickly. This would put

them in a better position to prove their expertise and authority to set the agenda, or just throw

in their two cents.

* * *

The senior Administration official charged with overseeing Internet-related affairs

was not Al Gore, but Ira Magaziner, a business strategist with a penchant for futuristic

thinking. Magaziner had established an enduring friendship and political partnership with

Bill Clinton when they were students at Oxford, and became a prominent figure in the first

days of the Clinton Presidency, serving with Hillary Clinton to lead the new Administration’s

Health Care Task Force. Like the Clintons, Magaziner believed the US health care

system to be so broken that only a comprehensive overhaul would put things right. Their

ambitious initiative was quickly and totally routed, however, leading to a major political


472. See Neil A. Lewis, “Court Clears Clinton Aide in Lying Case,” New York Times, August 25, 1999.

473. Phone interview with Gordon Cook, November 20, 2002.

setback for the Administration and the Democratic Party. The outcome also saddled

Magaziner with an ongoing legal dilemma... perjury charges pressed by a Federal trial judge.

The case involved sworn statements he had given during the course of a politically motivated

civil suit against the Task Force. The heart of the matter concerned the composition and

conduct of the Task Force, and whether there had been a transgression of rules regarding the

secrecy of some deliberations. The charges weren’t cleared until August 1999.472

Magaziner lost his prestigious office inside the White House after the health care

endeavor, but remained in the Administration as Senior White House Advisor for Policy

Development. In 1994 he projected that the growth of Internet-driven commerce was likely

to become a key factor in the US economy during a hoped-for second term. In keeping with

the generally business-friendly “third way” approach that characterized Clintonian economic

policy, mixing laissez-faire commercial and trade policies with selected interventions,

Magaziner suggested a compatible institution-building strategy for the Internet. That became

his bailiwick.

In mid-1996, with legal clouds from the Health Care Task Force still overhead,

Magaziner remade himself as a globetrotting evangelist of Internet neo-liberalism,

campaigning against regulatory fetters that might hamper the Internet’s transformation into an

engine of economic expansion. There was also time for hobnobbing with the leading

impresarios of computerization, members of a self-congratulatory elite called the digerati – the digital literati. One of the best known was Esther Dyson, author, publisher, celebrity

journalist, and future chair of ICANN. Rutkowski was another. Though not a celebrity, he

happened to be based in the DC area where access to Magaziner was more convenient. Some

believe Rutkowski significantly influenced Magaziner’s positions on Internet commerce. One

reporter commented, “It was the Tony and Ira show in ‘96.”473

Mindful of his unhappy litigation experiences stemming from the health care

initiative, Magaziner opted to use the Web as much as possible for any upcoming comment


and rule making processes. A policy statement titled “A Framework For Global Electronic

Commerce” appeared as a first draft in December 1996, with the expectation that a fully-fledged version would be worked out over the first part of 1997. The leading priority in the

draft framework was “Fostering the Internet as a Non-Regulatory, Market-Driven Medium.”

Its most famous sound bite was, “Establishing cyberspace as a duty free zone.” The driving

focus was to prevent the imposition of any new taxes on the Internet, beyond those already

imposed on existing commerce.

Magaziner liked to think in broad historical terms. He sometimes compared the

Internet’s development with different historical phases of cultural evolution, particularly the

second Industrial Revolution and the development of commercial rail systems. The challenge

in such situations, he believed, was for leaders to set up institutional structures that would

facilitate healthy expansion. This was another one of those pivotal moments. On the Internet,

growth was inevitable. The number of users was expected to grow from a few million mostly

in the US to over a billion worldwide by 2005. A well-designed structure would be needed

at the international level to reduce the likelihood of stifling encumbrances such as

jurisdictional disputes, endless litigation, and unreconciled case law.

The visionary, long-term projects that motivated Magaziner were very different from

the nitty gritty challenges of reforming the Internet’s name and number assignment systems.

In fact, surveying the situation at the close of 1996, it seemed like there were good reasons

to postpone any reform processes. Despite all the fuss, the Internet’s “plumbing” seemed to

be working just fine. As Clintonians liked to say, “If it ain’t broke, don’t fix it.”

From Magaziner’s perspective, the most significant weakness in the domain system

was not the lack of new suffixes, monopolistic registries, or the trademark dilemma, but the

nearly spread-eagle vulnerability of the root server constellation. The linchpin resource of

the Internet was essentially in the hands of volunteers, most of whom – however trustworthy –

were unaccountable to the US government. Even worse, few of them had made more than

token investments in physical security. During a visit to one of the sites in the root

constellation, Magaziner was alarmed to discover how easy it was to get access. If he could

wander the hallways unattended, find his way inside the server room and even put his hands


on such an amazingly important machine, who else could? As it turned out, only NSI’s

servers approached the level of protection that satisfied government auditors.

So the need to promote the “stability” of the Internet became a paramount goal.

Pending changes like the gTLD-MoU, the accelerated termination of the Cooperative

Agreement with NSI, and the final phase of ARIN’s startup all looked risky. They seemed

like the kinds of shakeups that, done wrong, could put Internet expansion on a bad track or

derail it altogether. Though officials at DARPA and the NSF wanted out of the Internet-

oversight business as soon as possible, Magaziner preferred they stay involved. He needed

time to articulate his overarching framework for electronic commerce. There would be

opportunities to tinker with plumbing after that. Moreover, the Patent and Trademark Office

was ramping up a fairly large scale public Notice of Inquiry into the domain name issue, and

this would take time to complete.

These were new issues for the White House, and there was no sense of consolidated

expertise (other than among folks at NSF who had become so overtly friendly to NSI).

Therefore it made sense to convene an Inter-Agency Working Group, drawing in highly

qualified people who could provide a fresh and thorough review. Getting the personnel in

place and up to speed would take time as well, implying that any major modifications in

Internet administration would have to be blocked until the changes could be clearly

understood and fully vetted.

“Stability” had other merits. The opposition party still controlled the House of

Representatives and the Senate, and it was no secret that NSI’s parent company, SAIC, was

controlled by elite members of earlier Republican administrations. The health care fight had

taught Democrats that any new battles should be chosen very carefully. Giving unequivocal

backing to the IAHC before all the facts were in was just asking for trouble. A premature

alignment with SAIC-bashers from ISOC and the EFF would bring little benefit to the

Administration, yet would antagonize some of the wealthiest, best-connected members of the

other side. For the time being, it was best to let sleeping dogs lie.

The domestic virtues of stability were complemented by the exigencies of

international politics. Europeans had conceded a lot to Clinton’s predecessors during the


474. International Ad Hoc Committee, “Final Report of the International Ad Hoc Committee: Recommendations for Administration and Management of gTLDs,” February 4, 1997, http://www.iahc.org/draft-iahc-recommend-00.html.

effort to bring about deregulation of their telephone monopolies. It would have been

impolitic for the Clinton Administration to wear out that welcome with a fait accompli

engineered in America. Even though the ISOC/ITU agenda was intended to reduce US

government control of the Internet rather than fortify it, those details weren’t fully understood

within government-level policy-making circles on either side of the Atlantic.

Again, “stability” mandated not rocking any boats. For the time being, smooth sailing was

more important.

***

The IAHC members were apparently oblivious to Magaziner’s go-slow strategic

agenda. They had built up tremendous momentum, utterly convinced they were in full control

of the situation. Their last meeting, hosted by the ITU in Geneva, produced a Final Report,

issued on February 4, 1997, that spelled out more details of the planned new order.474 CORE

registrars would be required to pay a surprisingly steep $20,000 entry fee (only $10,000 in

less developed regions of the world) plus $2,000 per month, plus whatever fee would

ultimately be charged per registration once the system went into operation.
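The fee schedule lends itself to simple arithmetic. As a back-of-envelope sketch (the entry and monthly fees are the Final Report's figures; the still-undecided per-registration charge is omitted):

```python
# Sketch of a prospective CORE registrar's first-year fixed costs under the
# Final Report's schedule: a one-time entry fee ($20,000 standard, $10,000
# in less developed regions) plus $2,000 per month. The eventual
# per-registration charge had not yet been set, so it is left out.

def first_year_cost(entry_fee: int, monthly_fee: int = 2_000) -> int:
    """Entry fee plus twelve months of fixed fees, in US dollars."""
    return entry_fee + 12 * monthly_fee

print(first_year_cost(20_000))  # standard rate: 44000
print(first_year_cost(10_000))  # less developed regions: 34000
```

Even before any registrations were sold, then, a standard-rate registrar was committing roughly $44,000 in its first year.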

CORE was to be incorporated in Geneva as a non-profit organization, governed by

a nine-member Policy Oversight Committee (POC). POC’s composition would be identical

to the initial plan for the IAHC – two appointments from ISOC, IANA, and the IAB, plus one

each from the INTA, WIPO, and the ITU. The FNC seat was gone, Heath was gone, and

nothing was added for the ISP industry. The most elaborate innovation was CORE’s dispute

resolution structure, evidently modeled on mechanisms used in places where trademark

litigation was uncommon, like Sweden. The new system envisaged a permanent role for

WIPO, particularly its Mediation and Arbitration Board. That board would oversee the

creation and operation of a series of Administrative Challenge Panels (ACPs), which would

act like courts, hearing disputes and rendering decisions.


475 Ibid.

To guarantee public oversight, any group or individual who signed the gTLD-MoU

would be eligible to participate in a Public Advisory Board (PAB), whose main purpose was

to monitor the POC. Participating registrars would have to sign a separate CORE-MoU. The

ITU would sit atop the institutional chain, but without power to intervene, serving only as

a depository for both documents. According to the plan, CORE would be ready to start

service by the end of the year. But this depended on a crash effort to create and test the

Shared Registry System (SRS) software on which everything depended.

The trademark community’s influence within the IAHC was unmistakable. Both the

December Draft and the February Final Report included a scrupulously crafted dialectic

under the heading “Whether to Create New gTLDs.” One side – implicitly Sally Abel – held

the opinion that there was “no fundamental need” to add any TLDs; the other side wanted

to create “alternative entry points” that would stimulate competition. The point was to

disclose how seemingly incommensurable positions could be resolved, while providing some

rhetorical cover for the people who had engineered the compromise.

In the final synthesis, continuing on with NSI’s “monopoly-based trading practices”

was deemed a more “significant strategic risk” than “further exacerbating the current issues”

of trademark defense. It was a grudging concession. Overall, the outcome fortified

suspicions that trademark interests had dominated the process, and had successfully

consolidated their powers in the Internet space.475

Some members of the trade press reported the IAHC’s plan in a straightforward

manner, treating the prospect of new TLDs as a fait accompli. But most reported the

objections of the alt-rooters. A few journalists went further, playing up the cartel aspect once

again, questioning the conflicted interests of the Internet old guard.

Amendments and adjustments came in due course. Howls of protest killed the 60-day

wait. Complaints from the European Commission led to the end of the 28 registrar limit, and

entry fees were reduced across the board to $10,000.


The IAHC’s narrow membership had already alienated many observers. The failure

to document its proceedings was even more disconcerting. These shortcomings created a

political dilemma for the IETF membership. Leaders in the Internet’s standards-making

community had based much of their reputation on claims of openness, a mythology that

captivated public consciousness. Yes, it was true that aggressive, autonomous “ad hocing all

over the place” was a big part of the engineering community’s insular, we-know-best culture.

That free-wheeling approach worried process-sticklers like Rutkowski, and had spurred them

to fault the IETF (and especially IANA) for procedural shabbiness. Nevertheless, on its own

terms, the IETF had been undeniably open. List-based decision-making remained one of the

group’s most illustrious hallmarks. It was indeed true that anyone could show up and

participate. And the IETF’s standard-blessing procedures had matured after the IPv7 debacle,

becoming far more consistent and accountable. By violating the Internet’s cherished norms,

acting in a fashion so inimical to the established customs of the IETF, the IAHC had

simultaneously exposed itself to charges of illegitimacy, and the Internet old-guard to charges

of hypocrisy.

If Postel had any objections to the gTLD-MoU, he wasn’t saying so publicly. The

point of all this travail had been to achieve rough consensus. Now it was time to sign the

papers, move on, and make DNS politics someone else’s problem. Despite the shortcomings

(an inevitable part of ad hocracy), he and most members of the IETF were willing to accept

the IAHC’s decisions as a new chapter in the Internet’s rulebook. To them, the plan looked

like a reasonable way to introduce competition into the TLD space, hand off the trademark

problems to a responsible institution, and still guarantee the technical community’s

continuing oversight of the Internet’s plumbing. With the policy problems delegated

elsewhere, their next steps – designing and running the shared database system – would be

in much more hospitable territory.

Overall, the plan seemed full of potential benefits, not the least of which was the

prospect of providing a long-term stream of income for IANA. Also, moving the site of

formal responsibility from the US to Geneva promised better shielding of DNS

administration from prospective liability issues. And it seemed to fulfill the overriding goal


476 Phone interview with George Strawn, June 18, 2002.

of opening up competition, preventing the windfall profits of domain name registrations from

ending up in one company’s till.

Though the shared-registry business model imposed by the IAHC provoked enduring

resentment from the alt-root community, not everyone considered the IAHC a nefarious

cabal. Several of the prospective TLD operators on the newdom list simply repositioned

themselves as prospective registrars. New players showed up, attracted to the same business

opportunity. It all seemed like a brilliant compromise between the interests of the Internet

engineering community and the intellectual property community.

But the ISOC/ITU deal was drawn too narrowly, self-sabotaged by a monumental

underestimation of the fortitude and the will to fight still possessed by the excluded groups.

NSI had too much to lose, the alt-rooters had so much more to gain, and the pundits would

continue to insist on comparing current events with their idealized vision of a cyber-

revolution.

* * *

As the IAHC agenda emerged, the US government was working at cross purposes

with itself, sending mixed signals. For now, Magaziner wasn’t saying much of anything for

or against, shrewdly avoiding any public step that would alienate either the IETF community

or friends of NSI. Strawn’s role was more complicated. Despite having demanded a seat at

the table in the IAHC, he had been a rather disengaged participant. He later dismissed the

IAHC as, “an attempt to do things informally” in spite of increasing pressure to use other

processes. “[T]he worse the heat got,” he said later, “the more it was decided that formal

matters better supplant informal matters and something replace [it].”476

Strawn’s lackluster engagement apparently didn’t trouble the other IAHC members.

For their purposes, the simple fact of his membership was useful enough, signaling

involvement of a significant US Government agency. However nominal the FNC’s

endorsement, however weak its loyalty, the benefit to the IAHC was a welcome infusion of

status and legitimacy (perhaps reminiscent of the FNC’s absent-minded sponsorship of


477 Linda Sundro, “National Science Foundation, Office of the Inspector General: Report, The Administration of Internet Addresses,” February 7, 1997. Cited in Gordon Cook, “The NSF IG’s Plan to Administer and Tax the Internet World Wide,” Cook Report, http://cookreport.com/igreport.html.

IANA’s “charter”). To leverage that perception, the front page of the IAHC’s website was

topped by a prominent link to the FNC’s site, flanked to the left by links for ISOC, IANA,

and the IAB, and on the right by links for the ITU, INTA, and WIPO. The imprimaturs

clearly suggested that all of them formally supported the IAHC. Strawn did nothing to

disabuse anyone of the idea. There was every reason to believe that the FNC stood shoulder

to shoulder with the ISOC/ITU alliance in meatspace, just as it did in cyberspace.

At the start of 1997 the IAHC presented no immediate threat to US Government

interests, primarily because CORE was not yet capable of bringing about any actual changes

on the Internet. The plan for early conclusion of the Cooperative Agreement between the

NSF and NSI was another matter. Mitchell and Strawn thought they had worked out a

bargain with the decision-makers at NSI: Once ARIN was up and running, securely

managing numbers registration services for the Americas, the NSF would happily extricate

itself from the responsibility of overseeing names and number registration services. At the

expense of funding one more year of ARIN’s transition costs, NSI would finally have .com,

.net, and .org free and clear, all to itself. Mitchell had spelled out the rationale in his “Declare

Victory and Leave” memo the previous October, but few people outside the NSF appreciated

that he fully intended to make it a reality.

There were objections inside the NSF, however, especially from the Inspector

General, Linda Sundro, who published her own findings in early February, even as Mitchell

and Strawn huddled with counterparts from SAIC and NSI, working out the final details of

their plan. Sundro argued that registration oversight should continue through the end of the

Cooperative Agreement in 1998, and beyond. Her primary concern was financial, believing

that the $15 Intellectual Infrastructure Fee that had been tacked onto domain name

registrations should be used to support upcoming research and development programs,

“thereby reducing the need for taxpayer subsidies in the future.”477 Sundro expected that

NSF’s portion of the income stream from names alone would reach $60 million per year. She


478 McConnell was OMB’s Chief of Information and Technology Policy. Phone interview with Gordon Cook; and Mueller (2002: 288, fn 32).

also urged that the number assignments be seen as a potential revenue source, at perhaps 10

cents per IP address.
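Sundro's figures imply a registration volume that simple arithmetic makes explicit. A minimal sketch (the $15 fee, the $60 million projection, and the 10-cent rate are hers; the one-million-address volume is a purely hypothetical illustration):

```python
# Back-of-envelope check of Sundro's projections. If the $15 Intellectual
# Infrastructure Fee were to yield $60 million per year, that implies roughly
# 4 million name registrations annually.

fee_per_name = 15               # dollars: the Intellectual Infrastructure Fee
projected_income = 60_000_000   # dollars per year: Sundro's estimate

implied_registrations = projected_income // fee_per_name
print(implied_registrations)    # 4000000 names per year

# Her 10-cents-per-address idea, applied to an assumed (hypothetical)
# one million IP addresses:
print(int(0.10 * 1_000_000))    # 100000 dollars
```

The implied four million registrations a year shows how aggressive the $60 million projection was at the time.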

Despite Sundro’s reservations, Mitchell and Strawn continued their preparations. The

goal was to announce both the formation of ARIN and the April 1st termination of the

Cooperative Agreement simultaneously at a news conference scheduled for March 18. They

never got that far. By the end of February matters escalated to the Office of Management and

Budget, the home agency of Bruce McConnell, an information technology specialist who was

a prospective co-chair of Magaziner’s Inter-Agency Working Group.478 Strawn was called

into OMB’s offices and asked for a status report. The outcome, in early March, was an abrupt

termination of the termination plans. Forward movement on ARIN was suspended as well,

pending reevaluation.

Magaziner and the various candidates for his Inter-Agency Group were still catching

up, trying to build a knowledge base about what was going on, and, for the time being, retain

the power of initiative. Blocking the termination was a relatively easy call. It wasn’t going

to make SAIC’s owners happy. Shedding the encumbrances of the Cooperative Agreement

would have improved IPO prospects. On the other hand, there was no new negative impact

on NSI’s cash flow. The intervention was considerably more problematic for Mitchell and

Strawn, who were now primed for openly antagonistic relations with members of

Magaziner’s group, if not Magaziner himself.

Mitchell and Strawn had lost the ability to act as the gatekeepers of their contracts

with NSI, and yet had failed to extricate themselves from the time-consuming responsibilities

of overseeing the InterNIC. To strike back, they would focus on ARIN, which had

widespread support throughout the technical community. The exceptions were some

relatively small ISPs, some easily peripheralized crazy-sounding email-list gadflies like Jim

Fleming, and some journalists who were notorious for complaining about anything and

everything. From that point on, ARIN’s opponents in the Inter-Agency Group would be


479 The Economist, February 8, 1997.

480 Joyce Reynolds and Nehal Bhau, a graduate student, were also named as defendants. Rumor had it that Reynolds’s plans to purchase a home had to be put off for a year as a result of the IODesign suit. This was related to me by Kent Crispin. Reynolds, however, said this was not the case.

portrayed as incompetent meddling sellouts (or, when deference was expedient, clueless

newbies who still had a lot to learn).

* * *

The engineering community was generally unaware that another potentially

earthshaking deal between the NSF and NSI had just been averted. Most of the news

involved more trouble for IANA. An editorial in the British weekly, The Economist, derided

IANA and the IAHC as “amateurs” and suggested putting professionals in charge, implying,

tacitly, the Telecommunications Directorate of the European Commission.479 There were

lawsuits, sharp criticism of Postel’s handling of TLD and IP assignments, bureaucratic

pressures from ISI’s overseers at the University of Southern California, and finally, after so

many years of steady support, the loss of funding from DARPA.

The alt-root community, profoundly antagonized, was defiant. Revitalized by his

anger, Denninger decided to hold a summit that would raise the flag of the eDNS coalition

and attract new support. Friendly reporters provided fortifying coverage. IODesign’s

Christopher Ambler went even further, filing a civil complaint in San Luis Obispo on

February 27. The complaint sought to block implementation of the gTLD-MoU by means

of a restraining order against the IAHC and IANA. Postel, Heath, Manning, Reynolds, and

a graduate student who maintained IANA’s website were named as defendants. The

complaint stated that IANA had reneged on an oral agreement to add IODesign’s .web

registry to the root. It also stated that the IAHC’s inclusion of .web in its list of seven new

TLDs infringed on IODesign’s service mark.

Postel’s supporters were outraged, believing Ambler had done much worse than

telling a self-serving lie and insulting the dearest, most esteemed member of the Internet

engineering community. By including Reynolds and a graduate student in the suit, he had

acted cruelly, putting uninvolved people in legal jeopardy.480


481 See Gordon Cook, Cook Report, June 1997.

482 Image Online Design, Inc. v. Internet Assigned Number Authority, International Ad-Hoc Committee, et al., No. CV080380 (N.D. Cal. 1997). For the complaint and an accompanying supporting declaration by Einar Stefferud, see http://normative.zusammenhaenge.at/faelle/iod_v_iana.html.

483 See transcript at http://www.brandenburg.com/misc/iodesign-judge.html. The case became moot after it became clear that the United States government had blocked the plan from proceeding.

Postel’s budget contained no reserves for dealing with such litigation. From a strictly

legal standpoint IANA was no more than a name for his sandbox at USC’s Information

Sciences Institute. He thus turned to ISI and ultimately USC for help. The university’s

General Counsel, Robert Lane, then had to determine what level of backing USC was obliged

to provide, if any. After looking into the complex web of relations between ISI,

Postel, and his various funding sources, Lane decided to carve out a position asserting that

IANA, and consequently ISI, and consequently USC, had no contractual authority over

NSI.481 That line ran contrary to the wording in the Cooperative Agreement about NSI’s

obligation “to provide registration in accordance with the provisions of RFC 1174.” But

Lane’s professional obligation was to secure USC against undue liability. He was much less

concerned about protecting IANA’s claims to authority within the Internet’s administrative

hierarchy.

As it turned out, legal expenses never got out of hand. The presiding judge found so

little merit in IODesign’s arguments that he was evidently ready to dismiss the suit at the first

hearing. Ambler’s attorney backed down and withdrew the complaint “without prejudice,”

gamely preserving the right to try again elsewhere.482 The judge nevertheless issued a

statement favorable to IANA.483

As the IODesign suit wrapped up, another legal battle was simmering. Paul Garrin,

media artist and owner of the alternate registry Name.Space, decided to try forcing his

suffixes into the legacy root by targeting NSI rather than IANA. It was a strategic move,

motivated in part by his failure to attract significant numbers of customers to his own root

constellation, and in part by a quixotic temperament. He seemed to relish confrontation with

huge, normally intimidating authorities. Garrin contacted NSI in March 1997, requesting that


484 Cited in Mueller (2002: 152-3).

his TLDs be added. NSI, in turn, deferred the request to IANA. The reply came from USC’s

attorney:

We are aware of no contract or other agreement that gives IANA authority over [Network Solutions’s] operations. The IANA has no authority to establish a generic top-level domain without an Internet community consensus arrived at through committee review and ample opportunity for public input.484

The first sentence of that citation provided the legal cover that USC needed, insisting

the matter was not in IANA’s jurisdiction. The second sentence, however, was a fuzzy hedge,

no more precise than the ambivalent conception of IANA’s authority inherited from RFC

1174. Read without negation, it implied that IANA did have authority to establish generic

TLDs. What did establish mean, if not the ability to give a directive to change the zone file

in Root Server A? Was IANA the gatekeeper of the TLD space, or was it not? What did

consensus mean, if not the capacity to recognize a consensus within one’s community, and

then to reflect that consensus by speaking and acting on its behalf? Did the peers of the

Internet community speak through IANA, or did they not? The tortured construction of that

second sentence allowed Postel to hide from his own responsibility. It said, properly, “The

buck stops here,” but its pragmatic effect was to pass the buck back to NSI.

f. Haiti and iTLDs – March 1997

After nearly a year and a half of argument and debate, not one new generic suffix had

been entered into the legacy root. Yet more than forty new country code TLDs were entered

during the same period. Each insertion had been a fairly routine matter, performed in accord

with RFC 1591 and the standard operating procedures used by Postel and Mark Kosters, who

ran the root servers at NSI. Now, complicating Postel’s life even further, some irregularities

in country code administration were coming under scrutiny.

An entry for Haiti was added to the root on March 6, 1997. The TLD, .ht, went to

Reseau Telematique Haitien pour la Recherche et le Developpement (REHRED), a non-


485 Cited in John S. Quarterman, “Haiti and Internet Governance,” First Monday, July 1997, http://www.firstmonday.org/issues/issue2_6/quarterman/.

profit group linked to the Haitian academic community. A week later, March 13, Postel

informed REHRED that, at the behest of the Haitian government, he was redelegating .ht to

another party, Focus Data. The company was an aspiring Haitian ISP whose owners had ties

to commercial interests in the US (including, allegedly, MCI) and in Jamaica, and

presumably to officials in the Haitian government as well. REHRED objected and asked for

justification.

There was nothing in RFC 1591 stipulating that a government had final disposition

over its country code, but there was precedent for governments taking such action. For

example, the German registry .de was transferred from the University of Karlsruhe to the

Bundespost in the early 1990s. In that case, however, the two parties cooperated to ensure

continued service for existing registrants. This case involved an objection. Postel had to

admit that he had created a guiding policy on the spot.

Hello:

I am sorry if you do not understand that we have explaind [sic] to you that there is a rule we have adopted since RFC 1591 was published:

"Follow the expressed wishes of the government of the country with regard to the domain name manager for the country code corresponding to that country". We have decided that this rule takes priority. We do not believe it is wise to argue with the government of a country about the TLD for that country.

--jon.485

The story was picked up by John Quarterman, a prolific and highly regarded technical

writer who had been around since the ARPANET days, starting at Harvard in 1974 and then

as an employee at BBN. He was alarmed by the .ht redelegation, in part because it was a

setback for traditional allies of the Internet community, and in part because it indicated

profound shortcomings in IANA’s dispute resolution process.

Much of Quarterman’s published work, especially his online journalism, concerned

online conferencing technologies and traffic performance analysis. He also wrote about the


486 In the first quoted phrase of this sentence, Quarterman cites the Supreme Court decision overturning the CDA. See John S. Quarterman, “Haiti Déjà Vu,” SunExpert Magazine, February 1998.

Internet as a social phenomenon, and was fond of using the term “Matrix” (long before the

movie phenomenon) to refer to the “totality of present-day computer networks.” Quarterman

had also become quite partisan during the CDA debates, an experience that influenced his

perspective on the Haitian controversy. Echoing the view that the Internet had become “a

unique and wholly new medium of worldwide human communication,” he urged vigilance

against any further attempts at “handing it over to that antique form of human organization,

the nation-state.”486

Internet technologists had a natural affinity with academics; both groups shared deep

normative commitments to open communication. Academics were considered more

trustworthy allies of the iconoclastic Internet community than the entrenched interests of the

national telephone monopolies or anyone whose primary loyalties rested with a potentially

repressive government. REHRED fit the traditional profile of a country code TLD registry.

Plus, it had received the delegation first. What justified the switch?

Quarterman’s investigation led to publication of an article in the online journal First

Monday. He reported another peremptory switch, involving .do in the Dominican Republic.

He recounted the facts of the IDNB, the imaginary Internet DNS Names Review Board

described in RFC 1591. He lamented the unclear provenance of Postel’s authority and his

ability to act like a “pope” without outside review. (As an old friend of the DNRC’s Mikki

Barry, he had special access to the views of Postel’s most sophisticated critics.) To top

things off, he raised the record of US imperial interference in Latin American and Caribbean

affairs, suggesting that IANA’s move had tilted in favor of “a U. S.-based multinational at

the expense of a local Caribbean company.”

It is rather hard to imagine Jon Postel acting from nefarious motives. However, IANA could not have done a better job of making many think it is an imperialistic tool if it had tried. IANA's actions in this case have been devoid of political common sense... If IANA has now invited governments in to directly dictate major and highly visible aspects of the Internet such as


487 Quarterman, “Haiti and Internet Governance.”

488 “In Re: Establishment of a Memorandum of Understanding on the Generic Top Level Domain Name Space of the Internet Domain Name System. Brief of Anthony M. Rutkowski,” http://www.wia.org/pub/dns-brief.html.

national top level domains, IANA is damaging both its own usefulness and the Internet itself.487

Quarterman also interviewed Dave Crocker, who suggested that the question was out

of scope for him since the IAHC was focused on generic TLDs rather than country codes.

Nevertheless, Crocker offered that some “fixing” of IANA was in store as part of the IAHC’s

effort to establish “more formal and visible” processes. This didn’t satisfy Quarterman, who

wanted precise details about how someone in REHRED’s position might be able to go about

seeking relief.

Quarterman concluded the article by asking whether Postel was “an agent of

imperialism,” even if an unwitting one. The elliptical rhetoric may have been satisfying, but

it came at the expense of a sober analysis of Postel’s complex circumstances. Postel was in

a very weak position. Even if his post hoc rule-making seemed sloppy, it was unfair to expect

him to deny the re-delegation and thereby risk an international incident that could spur even

more controversy. Nevertheless, Quarterman had the facts right. His rendition of IANA’s

procedural shortcomings provided yet more material for critics of Postel as a figurehead of

the ITU/ISOC alliance.

g. Laying the Cornerstone for a Global Commons

Rutkowski, coordinating with NSI, thought it might be possible to convince a US

government agency to declare the IAHC plan unlawful. At the end of February he submitted

a Brief to the State Department and the Federal Communication Commission, spelling out

legal arguments that could be used by a US government agency willing to undertake action

to block the gTLD-MoU.488 No agency brought suit, but the exercise was useful in getting

NSI’s message around Washington.


489 “Secure Internet Administration and Competition in Domain Naming Services, Version 1.2,” April 29, 1997. Formerly hosted at http://netsol.com/policy/internet.html.

490 IAHC opponents who received funding and support from NSI included the Association for Interactive Media (AIM), a small DC-based lobbying shop with a penchant for overstated claims. AIM literature later described its activities this way: “In response to an underlying threat to the Internet, AIM creates the Open Internet Congress (OIC) in '97 to combat international forces attempting to move the Internet operating body from the US overseas. The efforts of the OIC lead to a Presidential initiative to create an operating body to oversee Internet governance, later to become ICANN.” See http://www.imarketing.org/aboHistory.php.

491 Mueller (2002: 150-1).

492 Phone interview with Ira Magaziner, August 5, 2000.

NSI’s formal response to the IAHC came in mid-April. Couched as a wide-ranging

counterproposal, it incorporated arguments from Rutkowski’s Brief, urging the US

government to intervene against the IAHC.489 Around the same time, NSI ramped up its

lobbying efforts across the United States and Europe. The campaign focused on undermining

the MoU and vilifying its supporters.490

Representatives of the European Union were now openly expressing dissatisfaction

with the IAHC and insisting on further debate, this time with direct European participation.491

Decisive opposition came from Martin Bangemann, Vice-President of the European

Commission, whose portfolio included Telecommunications, Information Technology, and

Industrial Policy. Bangemann was also a significant figure in the German government,

having been a member of the Bundestag, a former Minister of Economic Affairs, and the

National Chairman of the Free Democratic Party (FDP), a center-right group whose platform

echoed the deregulatory policies of Reagan and Thatcher. Bangemann had told Magaziner

that the European Community would initiate anti-trust proceedings if the gTLD-MoU plans

reached fruition.492

* * *

Heath and Postel, acting for ISOC and IANA respectively, signed the gTLD-MoU on

March 1, 1997. This signaled their formal endorsement of the IAHC’s work product. Heath

then began working with the ITU to organize a three-day conference in Geneva, to be

punctuated by a grand May 1st signing ceremony. The success of that event would be critical


to the success of the gTLD-MoU. The more varied and independent the support the

organizers could enlist, the more effectively they could defend the legitimacy of their new

edifice, and launch a new era in Internet governance. This was one of the most clever

elements of Shaw’s design, an institutional legal framework that involved running a flag up

a flagpole and waiting to see who salutes. To attract more endorsements, Crocker set off on

a globe-trotting public relations campaign.

There was an IETF meeting that same month, at the Piedmont Hotel in Memphis.

It was the first of several that I attended. The “buzz” was that the MoU was designed to

satisfy the concerns of “big business” interests, especially MCI, DEC, AT&T, IBM, and

UUNET. All those companies supposedly wanted assurances about the stability of DNS

management. If key Internet functions were going to be moved from NSI, it had to be done

right. The operators would have to fit the new Internet mold of disciplined, steady,

accountable professionals, rather than the caffeine-guzzling, pizza-binging Internet culture

of overworked volunteers, neophyte graduate students, and ramshackle entrepreneurs.

Surprisingly few IETF members were skeptical about the concept of a deal between

the IANA and the ITU. It should have been an ontological shock, but most IETF members

deferred to Postel, Vixie, and the other DNS “wonks” backing the IAHC and the MoU.

Technical viability was the primary concern, and the Shared Registry System (SRS) was

now said to be “do-able.” It was left up to the implementors to go off and do their thing. The

IETF norm of “rough consensus” meant that airing old complaints and misgivings after a

standard had been selected would be a waste of effort.

Still, support wasn’t universal. Einar Stefferud, a well-connected ARPANET veteran, was there. He had already alienated Postel with public criticism of IANA. Now he was

condemning the IAHC and even defending Christopher Ambler’s suit. The outside criticism

couldn’t be ignored. The IETF community had gotten itself into a public relations war against

highly skilled, polished adversaries such as Rutkowski. Crocker and Metzger were far

outmatched.

While the IAHC process kept up its momentum, the alt-root movement was faltering. The threat from IODesign was receding and the eDNS summit in Atlanta was poorly attended. Then a falling out divided their community. Kashpureff broke with Denninger, splitting off to announce a new alternate constellation he called uDNS. Ever ambitious, the u stood for universal.[493]

[493] Rony and Rony devote an entire chapter to this episode, “Alterweb – A Parallel Universe” (1998: 513-72).

[494] Gordon Irlam, “Controlling the Internet: Legal and Technical Aspects of the Domain Name System,” July 13, 1997, http://www.base.com/gordoni/thoughts/dns-control/confidential/dns-control-confidential.html.

[495] Mueller (2002: 155-6; 288, fn 29).

* * *

Forces in the US government, motivated largely by the desire to prevent loss of assets through a hasty, premature transfer of .com to NSI, preempted Mitchell’s and Strawn’s late

February efforts to walk away from Internet oversight. Yet the members of Magaziner’s

Inter-Agency group were apparently blind-sided just a few weeks later when officials at

DARPA’s Information Technology Office successfully executed their own extrication. It

took some time to fully appreciate the implications.

IANA had been receiving about $500,000 annually through DARPA’s contract with ISI, set to expire at the end of March. (IANA also received about $60,000 per year from NSI to run the .us domain, as stipulated in the Cooperative Agreement.)[494] DARPA announced in the closing days of March that there would be no renewal. To make up the gap, pending a long-term solution for IANA funding, the foreign IP registries stepped in. APNIC committed $50,000 and RIPE $25,000.[495] Notwithstanding critics who denounced IANA as an agent of US imperialism, the Internet community now had grounds to claim it had weaned itself from dependence on US government support.

On April 23rd the NSF’s acting deputy director, Joseph Bordogna, made a new bid

for extrication. He announced that the NSF would not renew the Cooperative Agreement

when it was due to expire at the end of March 1998, and might choose to terminate sooner.

Bordogna’s announcement included the first public release of Sundro’s report. He tacitly

rejected her conclusions, making little comment about them, while offering a highly positive

evaluation of the IAHC’s plan. (This move could support the view that NSF personnel were more interested in getting out of the Internet management business than in arranging sweetheart deals for NSI.) Magaziner was caught flat-footed. When the press called asking for comment, he was unaware of the details.[496]

[496] CNET News.com Staff, “NSF bows out of domain names,” CNET, April 23, 1997, http://news.com.com/2100-1023-279152.html.

[497] Dave Crocker, “Evolving Internet Name Administration,” April 15, 1997, http://www.iahc.org/contrib/draft-iahc-crocker-summary-00.html.

Events were moving at the pace of Internet time, and the Clinton Administration was falling behind. With the ITU conference less than a week away, it was clear that another intervention would be necessary. This time, however, executive branch policy officers had far less directive power at their disposal. There were no subordinate employees to rein in. IANA had no purse strings left to pull. Some proponents of the gTLD-MoU decided to exploit the situation. Their victory would be portrayed as the victory of the global Internet community over the ossified structures of anachronistic nation-states.

Dave Crocker had made himself the face of the IAHC, conducting himself more like

a demagogue than a diplomat. The circumstances called for a cogent explanation of the

opportunities that would open up to those who joined the new regime. Instead, he gave

patronizing lectures about the inexorable necessity of the IAHC’s plan. He also had an

annoying tendency to argue from authority, as if he could cloak himself with infallible,

unimpeachable truth. A paper titled “Evolving Internet Name Administration” encapsulated his reasoning. The problem was how to “position [DNS management] for long-term, global operation.”[497]

Certain technical requirements “dictated” the need for a “central authority over the

root.” First of all, the “fabric of the Internet” required a “precise and deterministic” mapping

service. An outcome that permitted the creation of multiple, uncoordinated roots, argued

Crocker, “flies in the face of established computer science.” If that occurred, the results of

a domain name lookup would be “probabilistic.” Second, it was necessary to meet the

requirements of scalability and shared TLDs. However, a “lightweight” authority would not be sufficient for those purposes. The organization serving as the root’s central authority would have to be “substantial,” taking the form agreed to by the IAHC.

[498] Ibid.
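Crocker’s “precise and deterministic” versus “probabilistic” distinction lends itself to a compact illustration. The sketch below is a toy model only, with hypothetical names and RFC 5737 test addresses, not a description of any deployed resolver: with one coordinated root a lookup always returns one answer, while with rival, uncoordinated roots the answer depends on which root a given resolver happens to consult.

```python
# Toy sketch (all names and addresses hypothetical) of Crocker's point:
# with a single coordinated root a lookup is deterministic; with multiple
# uncoordinated roots the answer depends on which root a resolver queries.
import random

# Two rival root constellations that both claim ".web" for different holders
ROOT_A = {"example.web": "192.0.2.10"}
ROOT_B = {"example.web": "198.51.100.77"}

def resolve(name, root):
    """Lookup against one coordinated root: 'precise and deterministic'."""
    return root.get(name)

def resolve_uncoordinated(name):
    """Each resolver happens to point at one rival root or the other."""
    return resolve(name, random.choice([ROOT_A, ROOT_B]))

# Same query, same moment -- yet different users can receive different
# answers, which is the "probabilistic" outcome Crocker warned about.
answers = {resolve_uncoordinated("example.web") for _ in range(100)}
print(sorted(answers))
```

With a single root the set of observed answers collapses to one address; the possibility of two is precisely the indeterminacy at issue.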

Even if one agreed with Crocker’s argument about the need to preserve a unique root,

other questions remained. Why should the IAHC be the central authority rather than the US

government? How could the IAHC assume that authority without US government

permission? In response, he invoked the highest authorities available:

From the standpoint of reasonableness and logic, such concerns miss the reality of IANA’s 10+ years of oversight and authority and miss the unavoidable reality that the Internet is now global. There is no benefit in staking the DNS out as a US resource; this way be dragons. There is no benefit in refuting IANA’s authority; no alternative management scheme exists or is proposed or, therefore, has community consensus. Never mind that IANA has done an excellent job. The IAHC plan rests on the well-established authorities of IANA and ISOC and on the voluntary support of signatories. The latter means, quite simply, that the IAHC plan is self-enabling. Hence challenges about prior authority are rendered meaningless.[498]

The implication was clear: Where management of the Internet’s core resources was

concerned, a self-enabling global community could render nation-states meaningless. Even

superpowers were now subject to its dictates.

The essence of Crocker’s paper was an expressly global conception of the public

interest. It was fully realized in that it was founded on an extraterritorial conception of

authority. DNS administration had to be “independent of any particular country,” he wrote,

balancing “individual, commercial initiative” with “the modern Internet’s governance

style...” presumably, rough consensus and running code, with IANA atop the great chain of delegated power.

* * *

The upstart globalists had embarked on a course that promised, they believed, a new beginning for global society. Yet the stewards of sovereign jurisdiction also happened to know a thing or two about self-enablement and the power to dictate. Though the IAHC proponents believed the Internet had been unmoored from nation-states, the US government still had many forms of leverage available, not least of which was a monopoly on coercive authority within American territory. For now it was only necessary to put a nominal kibosh on things through public remonstrations. What really mattered was that the decision had been made to devote the resources necessary to staying alert and involved, and to avoid being caught unprepared ever again.

[499] Margie Wylie, “U.S. concerned by ITU meeting,” April 29, 1997, http://www.news.com/News/Item/0,4,10198,00.html.

[500] Dr. Bruno Lanvin, “Building Foundations for Global Commons,” April 29, 1997, http://www.itu.int/newsarchive/projects/dns-meet/LanvinAddress.html. For a list of other gTLD-MoU documents, see http://www.gtld-mou.org/docs/news.html.

[501] Donald M. Heath, “Beginnings: Internet Self-Governance: A Requirement to Fulfill the Promise,” April 29, 1997, http://www.itu.int/newsarchive/projects/dns-meet/HeathAddress.html.

In the closing days of April, the US State Department released a memo from

Secretary of State Madeleine Albright signaling displeasure with the ITU’s support of the

IAHC. The operative terminology was an expression of “concern.” Albright pointed

specifically to the ITU’s expenditure of funds for the upcoming signing ceremony and its

conclusion of an “international agreement” despite the lack of formal consultation with

member governments.[499] The US then sent a relatively low-grade delegate to the signing

ceremony at the Geneva International Conference Center, signaling displeasure through

protocol gestures.

The conference started with a gushing opening address by Bruno Lanvin, a Coordinator from the UN Conference on Trade and Development (UNCTAD) whose portfolio included electronic commerce. He was also President of ISOC’s Geneva Chapter. Lanvin spoke of “Building Foundations for a Global Commons.”[500] Heath spoke of “Internet self-governance” as a “daring... bold... beginning.”[501] Alan Sullivan, a prospective registrar,

took a more apocalyptic approach: “If we do not back this agreement, the structure of the existing domain name system will be split and chaos on the Internet will occur.” Follow through, he urged, or allow “the Internet community to be drowned in the impending flood.”[502]

[502] International Telecommunication Union, “Information Note to the Press,” May 1, 1997, http://www.itu.int/newsarchive/projects/dns-meet/DNS-PressNote2.html.

[503] Pekka Tarjanne, “Internet Governance: Towards Voluntary Multilateralism,” April 29, 1997, http://www.itu.int/newsarchive/projects/dns-meet/KeynoteAddress.html. The gTLD-MoU was not the first of this type. An earlier MoU concerned the free circulation of terminals for Global Mobile Personal Communications by Satellite (GMPCS). The ITU is a registrar for the Universal International Freephone Number System and is custodian of the international telephone numbering plan.

In his keynote address, the ITU Secretary General, Pekka Tarjanne, laid out a vision

of principles that would guide the “coming together” of the once-contentious

telecommunications and Internet cultures. Costly old processes of “prescriptive standards...

legally binding agreements [and] industrial politics,” which Tarjanne termed “compulsory

multilateralism,” were giving way to faster and more timely processes by which

“communities of interest” would come together to formulate technical solutions, “letting the

market decide whether or not they got it right.” He called the new paradigm “voluntary

multilateralism.” The Memorandum of Understanding would often be the “chosen tool for

this new era,” he believed, as communities formed around a “special purpose” or sets of

“voluntary principles.” The ITU could facilitate the transition to that new era in various

ways: as depository for an MoU, as a registrar in an allocation regime, or as the custodian of

a scarce public resource.[503]

One of the shortcomings of the old system, he noted, was how easily it could be

“hijacked by countries or companies” that had an interest in blocking a process. The new

paradigm would circumvent this by allowing “the parties committed to a particular course

of action” to proceed unhindered. In other words, consensus could be construed as narrowly

as needed in order to focus on problem solving. And in practical terms, therefore, the gTLD-MoU could move ahead without waiting for approval from NSI or the US government.

More high-minded speeches were made, and documents were signed. With the

gTLD-MoU in effect, the IAHC reconstituted itself as the Policy Oversight Committee

(POC), holding its first public meeting at the conference venue. The POC had effectively


declared itself to be the public authority of the gTLD space, and therefore the agency

empowered to guide IANA in its gatekeeping capacity, but the root did not change hands,

and no alterations in the zone file were ordered.

As it turned out, there would be no self-enabling registry of deeds for cyberspace. The

IAHC plan was on its way to failure, as was the Internet community’s bid to constitute the

root of the DNS and the gTLD namespace as an expressly global resource. The US

government would reassert itself institutionally, and keep ultimate control. All the rest was

denouement.


Chronology 4: Selected Events in DNS History

Date Event

May 1, 1997 gTLD-MoU signing ceremony in Geneva.

March 1997 Criticism of Postel and IAHC. IODesign suit. Magaziner’s Inter-Agency Group formed.

February 1997 OMB intervenes to prevent NSF self-extrication from Cooperative Agreement.

November 12, 1996 Launch of the IAHC. Meeting of FNC Advisory Committee.

September 1996 “Coordinating the Internet” conference at Harvard.

July 1996 IANA/IODesign “envelope” event. Collapse of original Newdom group (Newdom-iiia).

June 1996 IRE BOF, 2nd Postel draft, ISOC Board meets, APPle conference, Dublin conference.

April 1, 1996 Launch of AlterNIC, Newdom debates become increasingly contentious.

February 1996 “Internet Administrative Infrastructure” conference in DC co-hosted by ISOC and CIX.

November 1995 “Internet Names, Numbers and Beyond” conference at Harvard.

September 14, 1995 Fee charging begins in .com. Newdom dialogues begin.

July 1995 @Home Class A block allocation incident.

March 1995 Online debates about ownership of top level domains.

March 10, 1995 SAIC purchases NSI for $6 million.

January 1995 General Atomics removed from InterNIC, Info Scout project moved to U. Wisconsin.

November 1994 Meeting of InterNIC Interim Review Committee.

October 1994 Josh Quittner taunts McDonald’s Restaurants with “Domain Name Goldrush” articles.

September 30, 1994 Start of IEEE-sponsored workshop regarding transition of .com from NSF oversight.

March 1994 RFC 1591 describes delegation policies for TLDs.

January 1994 NSI notified of legal dispute over knowledgenet.com.

April 1, 1993 Start of the Cooperative Agreement. NSI is running root and .com.

December 1992 Postel publishes first RFC describing management of .us.

December 1992 Final negotiation of the Cooperative Agreement between NSF, AT&T, NSI, and GA.

March 1992 IAB Chair Lyman Chapin criticizes military for slow adoption of DNS.

October 1, 1991 Root, DDN-NIC, and ARPA-NIC moved from SRI to GSI-NSI.

June 25, 1991 DCA redesignated Defense Information Systems Agency (DISA).

August 1990 RFC 1174 published, describing IANA’s authority.

February 26, 1986 First incrementation of NIC zone file; DNS is officially in production.

During 1985 Cisco routers enter market, furnishing expanded platform for UNIX systems.

February 1985 First country code .us added to root following ISO 3166-1 naming system.

Early 1985 DCA transfers DNS root management responsibilities from SRI to ISI.

October 1984 RFC 920 announces TLD selections, including .com.

November 1983 Mockapetris and Postel publish DNS implementation plans RFCs 881, 882, 883.

June 24, 1983 First successful test of DNS technology.

August 1982 Postel and Zaw Sing Su propose tree-like domain name hierarchy, RFC 819.

During 1977 Mary Stahl begins maintaining data for DDN-NIC.

During 1975 Defense Communications Agency (DCA) replaces ARPA, SRI runs DDN-NIC.

During 1972 Elizabeth “Jake” Feinler becomes principal investigator of NIC at SRI.

September 21, 1971 Peggy Karp publishes Host Mnemonics RFC 226, hosts.txt in use.

[504] Department of Commerce, Docket No. 970613137-7137-01, http://www.ntia.doc.gov/ntiahome/domainname/dn5notic.htm.

[505] John R. Mathiason and Charles C. Kuhlman, New York University, “International Public Regulation of the Internet: Who Will Give You Your Domain Name?,” March 1998, http://www.intlmgt.com/domain.html.

[506] Elizabeth Weise, “New York Company Sues to Open Up Internet Names,” New York Times/CyberTimes, March 22, 1997. See also Mueller (2002: 152-3), and http://name-space.com/law/.

Only the dead have seen the end of war.

Plato

8. AFTER GENEVA

The signing of the gTLD-MoU on May 1, 1997 punctuated the early phase of an

extended struggle. Subsequent events deserve a similarly detailed narration, but are only

summarized here.

Officers of the United States government continued efforts to delay advancement of

the gTLD-MoU, prompting further rounds of comment, debate, negotiation, and infighting.

J. Beckwith “Becky” Burr, already a member of Magaziner’s Interagency Working Group, was moved from a position at the Federal Communications Commission to become a lead negotiator for domain name issues as an Associate Administrator within the Department of Commerce’s National Telecommunications and Information Administration (NTIA) Office of International Affairs. The first major public action under the aegis of the NTIA was the July 1st, 1997 filing of a Notice named “Request for Comments on the Registration and Administration of Internet Domain Names,” organized under the Clinton Administration’s Framework for Global Electronic Commerce policy process managed by Magaziner.[504] The

six-week comment period resulted in 432 responses. One study deemed 282 of those comments to be substantive, finding that explicit support for the gTLD-MoU outweighed opposition by a factor of 10 to 1.[505]

The domain name conflict became particularly heated around this period, which was marked by several newsworthy events, including: 1) a lawsuit brought against NSI and the NSF by Paul Garrin, a media artist and operator of an alternate TLD service called Name.Space;[506] 2) NSI’s IPO filing, which admitted various risk factors, including a high

level of uncollectible receivables due to the “large number” of registrants engaged in speculative activity and likely to default on payment;[507] 3) an operational error at NSI which corrupted the dot-com lookup table, briefly interrupting web and email services in various parts of the world;[508] 4) a spectacular act of piracy by Eugene Kashpureff, in which he took over the page at InterNIC.net to mount a protest against NSI’s effort to convert the InterNIC into a branded product;[509] 5) NSI’s backing of a Washington DC lobbying organization which attempted to organize an “Open Internet Congress” to “Stop the Internet Coup” and “Oppose the gTLD-MoU, the Internet Society and IANA;”[510] 6) a contentious series of Congressional hearings involving Postel, Rutkowski, Heath, and others, focused on the “transition of the domain names system” to the private sector;[511] and 7) a class action suit brought on October 16, 1997 against NSI and the NSF, claiming that the (still undistributed) $15 Internet Infrastructure Fee that NSI remitted to the NSF for each annual registration constituted an illegal tax.[512]

[507] The July 3, 1997 filing is available online at ftp://www.sec.gov/edgar/data/1030341/0000950133-97-002418.txt. See also the July 6, 1997 comments by Anthony Couvering at http://www.gtld-mou.org/gtld-discuss/mail-archive/04448.html.

[508] Peter Wayner, “Internet Glitch Reveals System's Pervasiveness, Vulnerability,” New York Times/CyberTimes, July 18, 1997, http://www.nytimes.com/library/cyber/week/071897dns.html.

[509] Diamond (1998).

[510] The group was the Association for Interactive Media, run by Andy Sernovitz, using the website http://www.interactivehq.org/oic.

[511] The hearings, under the aegis of the U.S. House of Representatives, Committee on Science, Subcommittee on Basic Research, chaired by Charles W. “Chip” Pickering, were held in two parts on September 25 and 30, 1997. Part I, http://commdocs.house.gov/committees/science/hsy268140.000/hsy268140_0.HTM. Part II, http://commdocs.house.gov/committees/science/hsy273140.000/hsy273140_0.HTM.

[512] For the complaint, see http://www.aira.org/legal/filing1.html. Later news reports of the issue include Janet Kornblum, “Judge rules domain fees illegal,” CNET News.com, April 9, 1998, http://news.com.com/2100-1023-210006.html.

Tension between Magaziner’s group and the gTLD-MoU sharpened in the closing

months of 1997, as CORE’s infrastructure began to take shape. Concerned that the US

government might act to block the start of testing and operations anticipated for early 1998,

Dave Crocker began lobbying for a global campaign to “open up control of Internet

infrastructure.”[513] As mentioned in the opening pages of this dissertation, Magaziner made an explicit demand to Postel insisting that CORE’s TLDs not be added to the root.[514] There were several discussions between the members of IAWG, the IETF, and CORE, in conjunction with IETF and CORE meetings held in Washington DC in December 1997, but no accommodation was achieved. At one of those meetings Burr insisted that she held final say, on behalf of the US government, over whether any changes could be made to the root.[515]

On January 22, John Gilmore sent an email to the CORE list discussing strategies to indemnify IANA in case there was a direct confrontation between Jon Postel and the US government over the entry of CORE’s TLDs into the root.[516] Postel ordered the redirection of the root (or its “hijacking,” depending on one’s attitude toward the event) several days later, but reversed course under strong pressure from Magaziner.[517]

Magaziner’s Interagency Working Group released the so-called “Green Paper” (since the policy-making process wasn’t considered “done” yet) through the Department of Commerce on January 30.[518] It signaled a strong break from the gTLD-MoU, proposing the selection of up to five new registries, each with a single TLD, using a shared registry system.

[513] Crocker’s comment was contained within an email to a private CORE list, leaked to the journalist Gordon Cook. See “Anti-monopoly Fervor Distorts Interagency DNS Working,” IP, November 15, 1997, http://www.interesting-people.org/archives/interesting-people/199711/msg00039.html. See also Gordon Cook’s comments in “Will the Clinton Administration Permit the Hijacking Of,” IP, November 13, 1997, http://www.interesting-people.org/archives-ftp/interesting-people/interesting-people.199711.

[514] Similar takes on the incident have been recounted to me in interviews with Gordon Cook and Ira Magaziner. The incident is also cited in Mueller (2002: 161, 289, fn. 44).

[515] I was present at the meeting at the Omni-Shoreham Hotel, coincident with the December 8-12 IETF meeting.

[516] Gilmore’s email was leaked and published by others. See Jay Fenello, “Re: UK Registrars URGENT meeting,” January 31, 1998, http://www.gtld-mou.org/gtld-discuss/mail-archive/07218.html.

[517] See the November 2002 thread, “anyone remember when the root servers were hi-jacked?,” IH, http://mailman.postel.org/pipermail/internet-history/2002-November.txt.gz.

[518] Department of Commerce, National Telecommunications and Information Administration, “A Proposal to Improve Technical Management of Internet Names and Addresses: Discussion draft 1/30/98,” http://www.ntia.doc.gov/ntiahome/domainname/dnsdrft.htm. It was published in the Federal Register, Docket 980212036-8036-01, 63.34, February 20, 1998, as “Improvement of Technical Management of Internet Names and Addresses; Proposed Rule.” See http://www.ntia.doc.gov/ntiahome/domainname/022098fedreg.htm.

[519] The principles are detailed at ibid. The formulation, “guide the evolution,” was employed in the White Paper’s description of the Green Paper. See Department of Commerce, National Telecommunications and Information Administration, “06-05-98 Statement of Policy, Management of Internet Names and Addresses,” Docket Number: 980212036-8146-02, http://www.ntia.doc.gov/ntiahome/domainname/6_5_98dns.htm.

[520] Dave Farber, “Background to announcement of ITAG,” IP, February 22, 1998, http://www.interesting-people.org/archives/interesting-people/199802/msg00041.html.

An unlimited number of competitive registrars could be accredited, subject to technical and management criteria. NSI would be allowed to keep the dot-com, dot-net, and dot-org registries, but would be obliged to operate them within the shared registry/competitive registrar regime. The Green Paper also set out four principles to “guide the evolution of the domain name system: stability, competition, private bottom-up coordination, and representation” (my emphasis), and further proposed creating a body to manage the “Coordinated Functions” of the Internet. This body would go beyond management of root server operations and TLD additions, and would also coordinate assignments of IP blocks to regional number registries, and “development of other technical protocol parameters as needed to maintain universal connectivity on the Internet.”[519]
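The shared registry/competitive registrar split that the Green Paper retained from the SRS concept reduces to a simple division of labor: one authoritative database per TLD, written to on a first-come, first-served basis by many accredited, competing registrars. The sketch below is an illustrative toy only; the class and method names are invented here, and the real system additionally involved accreditation procedures, authentication, billing, and zone file generation.

```python
# Hypothetical sketch of the shared registry / competitive registrar model:
# a single authoritative database per TLD, accepting registrations from
# many accredited, competing registrars.

class SharedRegistry:
    def __init__(self, tld):
        self.tld = tld
        self.records = {}        # domain -> sponsoring registrar
        self.accredited = set()  # registrars meeting technical/mgmt criteria

    def accredit(self, registrar):
        self.accredited.add(registrar)

    def register(self, registrar, domain):
        """Only accredited registrars may write; names are exclusive."""
        if registrar not in self.accredited:
            raise PermissionError(f"{registrar} is not accredited")
        if domain in self.records:
            return False  # taken, no matter which registrar sponsored it
        self.records[domain] = registrar
        return True

registry = SharedRegistry("com")
for r in ("registrar-one", "registrar-two"):
    registry.accredit(r)

registry.register("registrar-one", "example.com")   # succeeds
taken = registry.register("registrar-two", "example.com")
print(taken)  # False: both registrars see the same shared database
```

The design point is that competition happens at the registrar layer while the registry itself remains a single, consistent source of truth for each TLD.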

Shortly after publication of the Green Paper, on February 11, 1998, Postel and Carpenter (still head of the IAB) announced the IANA Transition Advisors Group (ITAG), a small group of “senior advisors” invited to help Postel “draft statutes” for a “new organization” that would move IANA “from a US Government funded activity to an openly governed, privately funded not-for-profit activity.”[520] This would eventually turn out to be the coordinating body contemplated by the Green Paper.

The immediate aftermath of these events undercut ISOC’s reputation, emboldening its opponents. Not only was there nothing to show after so much time and money had been spent to promote the IAHC and the gTLD-MoU, but the ISOC community had also proven itself woefully unable to stand up to United States government intervention against a prestigious engineering

community process. Moreover, just as CORE’s investors were coming to terms with their

losses, its own version of Server A was stolen during a burglary at a co-location facility in

California.

[521] Marc Holitscher, “The Reform of the Domain Name System and the Role of Civil Society,” presentation at the 3rd Workshop of the Transatlantic Information Exchange Service (TIES), Paris, 6-7 April 2000, Unit for Internet Studies, Center for International Studies, Swiss Federal Institute for Technology, http://www.internetstudies.org/members/holitscher/paris.html. See also Mueller (2002: 167-8).

[522] Text in the subsequent White Paper admitted that “development” was an “overstatement” of IANA’s role and changed the operative word to “assignment.” See White Paper at fn. 159, Section 2, Response.

[523] White Paper, at fn. 519.

[524] International Forum on the White Paper, http://www.ifwp.org.

Publication of the Green Paper inaugurated a new round of discussion. The NTIA received over 650 formal comments by the start of June 1998, many of which criticized the Green Paper’s authors for taking a US-centric approach.[521] Members of the Internet standards community were concerned that the coordinating body envisioned by the Green Paper was intended to usurp the IETF’s lead role in developing Internet standards, perhaps by subordinating the IETF under it.[522]

The final “White Paper” that emerged on June 5, 1998, was a much softer document than expected – a “Statement of Policy” rather than a “Rule Making.” Among other things, its authors: 1) re-emphasized the four guiding principles of the Green Paper; 2) clarified the relationship with the protocol community, while 3) continuing to stress the need for a single, overarching coordinating body to oversee names and number assignments; 4) put off the question of new gTLDs as an issue for the new body to decide; 5) called upon WIPO to initiate a new process to find solutions for the trademark and dispute problems; and 6) made it clear that a new not-for-profit corporation should be created by the “private sector” and “managed by a globally and functionally representative Board of Directors.”[523]

Release of the White Paper prompted a race to create the new coordinating body. The

early favorite was the International Forum on the White Paper (IFWP), a series of meetings

held across four continents through the summer of 1998. Initially called the Global

Incorporation Alliance Workshop (GIAW), the process was advertised as an “ad hoc

coalition” of diverse “Internet stakeholder groups” coming together “around the world” to

“discuss the transition to private sector management of the technical administration of

Internet names and numbers” in accordance with the policies set out in the White Paper.[524]

[525] Upon reading about the announcement of the GIAW online, I called the provided number and reached someone who said she was an NSI employee. At the IFWP meeting, we were told that CIX’s Barbara Dooley had provided funds for the venue, but could not attend due to a family emergency. CIX was listed as the administrative contact for the ifwp.org domain name registration.

[526] Computergram International, “The Origins of the Meeting,” July 7, 1998, http://www.findarticles.com/p/articles/mi_m0CGN/is_n134/ai_20874441.

[527] The Electronic Frontier Foundation declined the invitation to attend because of the SAIC/NSI influence.

[528] Email from Mike Roberts to Molly Shafer Van Houweling, August 21, 1998, provided by Roberts, on file with the author.

Logistical support and sponsorship for the first meeting in Reston, Virginia on July 1-2, 1998 was provided by NSI and CIX.[525] The Chair was Tamar Frankel, a Professor of Law at Boston University. Her expertise included corporate governance. Frankel was a former instructor of the DNRC’s Kathy Kleiman, who had passed her name on to Rutkowski and Burr as IFWP planning got underway.[526]

The apparently strong anti-IAHC tilt among the IFWP’s organizers caused some wariness among the ISOC community and gTLD-MoU supporters.[527] Some IFWP participants were persistent in insisting that the IFWP was “the” process, and that the new corporation’s board and bylaws would emerge directly out of it. Others, notably Mike Roberts, one of the most senior ISOC veterans active in the IFWP, and a member of its Steering Committee, believed there had been a “specific agreement... only to hold open meetings to explore issues and ascertain consensus where it existed or emerged.” Roberts concluded that small stakeholders and interest advocates wanted to “legitimize IFWP as the final broker” because it gave their points of view “greater leverage.”[528] Despite great public

fanfare, and despite steady encouragement from Magaziner, the IFWP made no real progress

toward finding a consensus. Analysts at Harvard Law School’s Berkman Center offered a

potential solution. Throughout the summer, they had been publishing detailed comparative

reviews of the various drafts emerging from the IFWP and from a simultaneous IANA

process. With the opportunity to resolve the crisis in jeopardy, The Berkman group offered

to host a facilitated, invitation-only editorial wrap-up meeting September 12-13, 1998, and


529 Email from Mike Roberts to [email protected], September 10, 1998, on file with the author. See also Mueller (2002: 178), who writes that Roberts’ move was prodded by Postel’s attorney, Joe Sims.

530 The site was http://www.iana.org/comments-mail/maillist.html.

531 The ITU silver medal was also awarded to Vint Cerf in 1995. See Dave Farber’s copy of the ITU press release at “Announcement of ITU silver medal award to Jon Postel,” IP, July 22, 1998, http://www.interesting-people.org/archives/interesting-people/199807/msg00058.html.

532 Mueller (2002: 176).

a public ratification event on September 19. Nevertheless, the IFWP process broke down altogether when it was clear that the IANA process had come to fruition. Roberts asked his supporters in the IFWP process (members of an ITU-sponsored email list) to “move on.”529

Postel had announced creation of a mailing list at the IANA site to collect comments on the new corporation shortly after the White Paper’s release. It was a “one way” list that simply displayed the postings it received. The site was soon overwhelmed by noisy and provocative submissions from Postel’s critics, and never became an active discussion venue.530 Postel did not participate in the other public discussions active at the time. He attended just one of the IFWP meetings, but only briefly, while in Geneva concurrent with an ISOC-sponsored INET meeting where he was receiving a special award from the ITU for “contributions to the development of the Global Information Infrastructure.”531

Nevertheless, Postel was a constant, if not material, presence in the IFWP, in that a key point of the exercise was to design a new home for IANA. He was represented at the first meeting in Reston by Joe Sims, Senior Partner at the prominent Washington DC law firm, Jones Day. Recommended to Postel by ITAG, Sims had extensive experience as a deal maker for Fortune 500 companies, specializing in mergers, acquisitions, and anti-trust matters.532 Like the IFWP, Sims and Postel were in a race to create a new structure before the expiration of the Cooperative Agreement between the NSF and NSI. It was already in extension, nearing the end of its six-month “ramp down” period. Sims had started drafting sets of bylaws during the summer, presenting a version for consideration at the IETF meeting in Chicago in late August. He then began working with counterparts at NSI to harmonize his draft with their


533 The proposed articles of incorporation and bylaws were archived at http://www.iana.org/description2.html.

534 For links to the various proposals and other related documents, see Ellen Rony’s “IFWP: International Forum on the White Paper,” at http://www.domainhandbook.com/ifwp.html.

immediate concerns. An NSI negotiator suggested the Internet Corporation for Assigned Names and Numbers (ICANN) as the new entity’s name.

On September 17, 1998, Postel and NSI CEO Gabe Battista jointly announced the result, titled “Proposed Articles of Incorporation - A Cooperative Effort by IANA and NSI.”533 ICANN’s first organizational meeting was held in New York on September 25. Sims and Roger Cochetti (a lobbyist for IBM) finalized the selection of ICANN’s interim personnel. Mike Roberts became interim CEO. Esther Dyson, a well-known computer industry pundit, agreed to serve as Chair of ICANN’s interim board of directors.

Given that the IFWP had been heralded for having such an exceptionally open and inclusive process, presumably modeled after the spirit of the Internet and the IETF, this outcome prompted widespread resentment. Many grassroots participants felt their contributions and interests had been wholly ignored. Believing there was still a chance to circumvent ICANN by presenting a superior corporate model to the NTIA, three distinct alternative proposals emerged. The most notable of these was presented by the Boston Working Group (BWG), a small committee of IFWP participants who had already finalized their travel arrangements for the cancelled ratification meeting and decided to show up anyway. The NTIA arranged a one-week extension of the Cooperative Agreement in order to consider the competing proposals.534 The Pickering committee received testimony on October 7, 1998, during which Becky Burr announced that: 1) NSI had agreed to a timetable for the adoption of a Shared Registry System; 2) the NTIA had opened a comment period for consideration of the competing proposals for the new coordinating body; 3) the WIPO process contemplated in the White Paper was about to convene; 4) the “United States has continued to consult with the international community, including other interested governments, on the evolution and privatization of the domain name system”; and 5) a “high


535 “Testimony of J. Beckwith Burr, Associate Administrator (Acting), National Telecommunications and Information Administration, Before the House Committee on Science, Subcommittee on Basic Research and Subcommittee on Technology,” October 7, 1998, http://www.house.gov/science/burr_10-07.htm.

536 Gordon Cook, “Secret Meeting Shows ICANN - IBM Dependence,” The Cook Report on Internet Protocol Technology, Economics, Policy, January 2000. See both http://www.cookreport.com/08.11.shtml and http://www.cookreport.com/icannoverall.shtml.

537 “Approved Agreements among ICANN, the U.S. Department of Commerce, and Network Solutions, Inc.,” http://www.icann.org/nsi/nsi-agreements.htm.

level” group had been convened to review the root system’s security.535 It was clear that ICANN had the inside track for accreditation. In an atmosphere of mounting criticism directed toward ICANN and the process by which it was formed, Congressman Tom Bliley (R-Virginia), Chair of the House Commerce Committee, announced the opening of an investigation by the Subcommittee on Oversight and Investigations into the administration’s conduct in the matter. Postel died of post-operative complications from heart surgery the following day, October 16, 1998, an unexpected shock for the entire community. After a short pause for memorial, skirmishing resumed its quick pace.

On November 25, 1998, ICANN and the US Department of Commerce issued a Memorandum of Understanding that formally reconsolidated IANA’s powers under ICANN, retaining the four principles enunciated in the Green Paper: Stability; Competition; Private, Bottom-Up Coordination; and Representation. ICANN was criticized for a Board structure deemed insufficiently open to public input and too vulnerable to capture by entrenched interests. With no stable source of income, it had to rely on stopgap loans solicited from MCI and IBM.536 It also benefitted from encouragement originating within the US government, and logistical support from the Berkman Center.

ICANN’s viability was finally secured on November 4, 1999 through a tripartite arrangement involving ICANN, NSI, and the US Department of Commerce.537 A year later, on November 16, 2000, well past the peak of the Internet boom, ICANN’s Board of Directors finally selected seven companies that would begin operating as new TLD registries (each with one), but only after further months of preparation and negotiation. And only two of


538 ICANN Resolution 00.89, http://www.icann.org/minutes/minutes-annual-meeting-16nov00.htm.

539 Melanie Austria Farmer, “VeriSign buys Network Solutions in $21 billion deal,” C/Net News, March 7, 2000, http://news.com.com/VeriSign+buys+Network+Solutions+in+21+billion+deal/2100-1023_3-237656.html.

540 Richard Koman, “Karl Auerbach: ICANN ‘Out of Control’,” O’Reilly Policy DevCenter, December 5, 2002, http://www.oreillynet.com/pub/a/policy/2002/12/05/karl.html.

those TLDs, dot-biz and dot-info, could compete in the generic class with dot-com, dot-net, and dot-org, now called “unsponsored” TLDs.538

NSI’s concessions to ICANN over the years included an agreement to provide funding for ICANN operations, adoption of the split registry/registrar business model endorsed by the IAHC, and the spin-off of the dot-org registry to another firm (a non-profit created by ISOC). Still, NSI continued to maintain the leading position that had been secured for it under the stewardship of SAIC. SAIC’s shareholders were the big winners of the drawn out process, reaping a huge financial windfall. They had purchased NSI for $3 million in early 1995 and sold it to Verisign for $21 billion in March 2000, the same month that the NASDAQ composite index reached its all-time high. The new owner’s stock plummeted shortly thereafter.539

ICANN remains in place as a nominally broad-based organization and, to its critics, a metastasizing bureaucracy.540 It has nevertheless worked to ensure that the chain of authority over the Internet’s assignable core resources, especially the TLD space, remains effectively under US national control. This has been called “trap door authority” because the US government retained effective veto power over ICANN’s decisions without exercising a high profile form of day-to-day control. The veto power was evidenced in 2006 when ICANN reversed a 2005 decision to add the TLD suffix .xxx to the root.

A potential source of strong contention has arisen within a United Nations venue

called the World Summit on the Information Society (WSIS), which first met in December

2003. The WSIS launched a Working Group on Internet Governance (WGIG) whose

members debated related issues for nearly two years without reaching a unified consensus

on how to proceed. Members of the US government were wary of these moves, motivating


541 Beaird’s statement was made at “Regime Change on the Internet? Internet Governance after WGIG,” a symposium hosted by the Internet Governance Project and Syracuse University in Washington DC, July 28, 2005, reported by Robert Guerra, “Regime Change on the Internet: Conference Notes,” August 8, 2005, http://www.circleid.com/posts/regime_change_on_the_internet_conference_notes/.

542 This statement came during the question and answer session of the “Regime Change” symposium. See also Robert Guerra’s full event notes at http://www.privaterra.org/activities/wsis/blog/internet-governance-after-wgig.html.

543 “The Mandate of the IGF,” http://www.intgovforum.org/mandate.htm. See also International Telecommunications Union, “Tunis Agenda for Information Society,” WSIS-05/TUNIS/DOC/6(Rev. 1)-E, November 18, 2005, http://www.itu.int/wsis/docs2/tunis/off/6rev1.html.

544 Becky Burr, Marilyn Cade, “Steps the U.S. Government Can Take To Promote Responsible, Global, Private Sector Management of the DNS,” http://internetgovernance.org/pdf/burr-cade.pdf.

Richard Beaird, Senior Deputy U.S. Coordinator, US State Department, Communication and Information Policy Section, to announce that the US intended to maintain its central role in decisions about Internet governance and its evolving principles.541 Beaird also stated that the US government would not support any new conventions or treaties on Internet governance, and did not consider it fruitful to participate in such discussions.542 Nevertheless, at a meeting held in Tunis in November 2005, the parent WSIS process inaugurated an Internet Governance Forum (IGF) to undertake a “multi-stakeholder policy dialogue” with a mandate to, among other things, “Discuss public policy issues related to key elements of Internet governance in order to foster the sustainability, robustness, security, stability and development of the Internet.”543 In July 2006 Becky Burr and Marilyn Cade proposed steps by which ICANN would replace US final authority to direct Verisign/NSI operators regarding modification of entries in the root database.544

* * *

Despite NSI’s “victory,” the long struggle over the creation of new TLDs transformed

the nature of the discourse about Internet administration. It underscored the demands for

political legitimacy, if not outright democracy, in the settlement of key allocation questions.

Those calls for legitimacy may have quieted with the end of the Internet boom, but the urge

behind them remains unsatisfied, and is likely to affect future debates on this and related

issues.


Where the allocation of global resources is concerned, the demand for widely

inclusive, fair, and open political participation is unlikely to disappear. To the extent that

resources are allocated via globally-organized hierarchies such as the Domain Name System,

people are likely to respond by identifying themselves as part of a global polity. This pattern

of interest formation occurred during the several phases of the DNS War. A pertinent

question is how this ever-repeating phenomenon of demand and expectation might serve to

promote the rise of expressly global social authority.


Chronology 5: DNS History after the gTLD-MoU

Date Event

Oct 30 - Nov 2, 2006 IGF (in Athens) meets and organizes “dynamic coalitions.”

November 16-18, 2005 WSIS (in Tunis) inaugurates Internet Governance Forum (IGF).

March 22, 2005 ICANN approves addition of .eu TLD (fully operational by April 2006).

December 10-12, 2003 UN’s World Summit on the Information Society (WSIS) holds preparatory meeting.

October 2002 Distributed Denial of Service (DDOS) attack against the Root Constellation.

June 28, 2002 ICANN (in Bucharest) adopts “Blueprint for Reform” known as ICANN 2.0.

March 18, 2002 At Large member Karl Auerbach sues ICANN seeking access to financial records.

November 16, 2000 ICANN approves seven new TLDs before the At Large Board members are seated.

mid-late 2000 ICANN At Large Board Members elected in first global election.

March 2000 SAIC sells NSI to Verisign for estimated $21 Billion.

November 4, 1999 Tripartite agreement between ICANN, NSI, and DOC secures ICANN’s position.

May 27, 1999 Contentious ICANN Board meeting in Berlin.

December 7-11, 1998 IETF meets in Orlando.

October 16, 1998 Postel’s death.

October 7, 1998 Pickering Committee hearings concerning DNS transfer to the private sector.

September 30, 1998 Planned Expiration of Cooperative Agreement (after extension).

September 18, 1998 Creation of ICANN announced.

September 4, 1998 Collapse of IFWP process.

Summer 1998 WIPO process begins. IFWP meetings in Reston, Geneva, Buenos Aires.

June 5, 1998 White Paper released, followed by Congressional hearings and launch of IFWP.

April 1998 Congressional Hearings to review progress of DNS privatization efforts.

March 30-April 3, 1998 IETF meets in Los Angeles.

March 31, 1998 Planned Expiration of Cooperative Agreement (Before Extension).

February 14, 1998 Burglary of CORE servers.

February 11, 1998 Creation of IANA Transition Advisors Group (ITAG).

January 30, 1998 Release of Green Paper.

January 28, 1998 Postel “Splits” the Root.

January 22, 1998 Gilmore proposes protective options for Postel.

mid-December, 1997 IAWG meetings with CORE supporters and critics in Washington, DC.

December 1997 Magaziner signals Postel not to add CORE’s TLDs to the Root.

October 17, 1997 Class action suit filed, claiming Internet Infrastructure Fund is an illegal tax.

September 25 & 30, 1997 Pickering Committee Hearings in US House of Representatives.

July 1997 Kashpureff hijacks InterNIC page after NSI branding and IPO campaign.

July 1, 1997 NTIA begins comment period.

June 25, 1997 NSF rejects PGMedia’s request to NSI regarding addition of its TLDs to the Root.

May-June 1997 EU pressures Magaziner to block gTLD-MoU.

May 1, 1997 gTLD-MoU Signing in Geneva.


If I were to take the position that it sounds like you’re taking, it would make me sound like a megalomaniac, and I don’t want people to think I’m a megalomaniac... Even if I am I wouldn’t want people to think that.

Vint Cerf

9. CONCLUSION

a. Pursuing the Gutenberg Analogy

We had been through this before, buying new gadgets, absorbing new media,

adopting new skills, witnessing myriad innovations in the ways of work and war. Like our

parents and our grandparents, we learned that social and political disruption was part of life.

What was modern today could be obsolete tomorrow. Old artifacts might be preserved in

museums, or simply be discarded and forgotten. Technological upheaval was familiar to us.

But the Internet suggested an altogether bigger kind of transformation. Its potential

effects loomed like a 500-year storm... possibly as significant as the introduction of the

printing press. Those of us who knew a little history were well aware that printing

technology had disrupted the underlying order of an earlier era. By empowering agency in

local vernaculars, the new devices helped break Catholic control over Europe and launch the

Age of Reason in the 17th Century. One could ask whether the rise of the Internet might impel

a similar decline of Rome’s successor... the international system of sovereign states. That

political form still dominated the globe at the onset of the new millennium. But perhaps it

would not last much longer.

Still, the printing press metaphor was incomplete. Our contemporary Gutenbergs had

the benefit of hindsight. They could reflect on the consequences of their actions, and perhaps

even on their deepest motivations and animating values.

As my research began, it became clear that I would be meeting the instigators of this

epochal disruption. I recognized an exceptional opportunity – even an obligation – to ask

them what they thought they were doing. What were their intentions? How did they think

things would play out? I called this line of inquiry “The Gutenberg Analogy.”


545 Personal interview with Jon Postel, December 9, 1997.

546 Personal interview with Bill Manning, March 30, 1998.

* * *

Jon Postel seemed to enjoy the question. His answer was not about the Internet

exclusively, but about the power-to-the-people impact of information technology in general.

He had been fascinated, he said, by news reports showing how Chinese protesters relied on

fax machines to organize themselves during the June 1989 Tiananmen protests. He told me

he felt like he was “part of that.” Postel generally didn’t use a lot of words, but from his

expression and his body language, I got the clear impression that he was not only glad to be

part of what made “that” possible, but simultaneously excited and humbled by the magnitude

of the result.545

Bill Manning gave a serious, carefully-reasoned answer. He described various types

of protocols and standards that were deployed within major US infrastructures, from ATM

machines, to telephones and beyond. Each one required support from a cadre of skilled

technicians. Each one thus guaranteed employment for members of those cadres. But upkeep

of so many distinct technologies was inefficient. Now that the Internet Protocol had become

a dominant standard, it would provide a platform for consolidation. Employers were

discovering they could reduce overall costs by leveraging the Internet’s economies of scale.

But this consolidation would also undermine existing employment patterns, forcing people

to acquire new skills in order to get new jobs.

Manning was self-consciously acting as a change agent, an avatar of creative

destruction. He wanted to see what the future had in store. The faster the rise of the Internet,

the faster he would get to see how things turned out.546

Years later, well after the founding of ICANN, I broached the same question to Vint

Cerf. Unlike Postel and Manning, he was wary of the Gutenberg analogy, and especially of

my underlying hypothesis, which he called “overblown.” The word was strong, but his tone

was judicious. His reaction to grandiose analogies, he admitted, was “colored by this worry

that people will try to put more into ICANN’s lap than ICANN has lap for.” He was


547 The quotes in this section are taken from the Cerf interview conducted November 15, 2001.

548 Ibid.

unhappy that so many had sought to use ICANN and its global elections as a “stalking horse

for some kind of global government.” They were undermining ICANN’s effectiveness, in his

view, giving it a “burden” that it was “in no position to assume.” Cerf was wary of the

analogy for another reason: he did not want to be tagged as a “megalomaniac.”547

I’m always very sensitive to the possibility that I might be either misunderstood as arguing that this thing is as big as the printing press, and I had a hand in it, and therefore isn’t that terrific. So I hate like the devil to imagine that that’s what people think. At the same time I’m very excited, since it seems to have gotten a lot of people interested in doing things, and involved, and willing to spend an awful amount of time and money to participate.548

Disclaimers aside, it was clear Cerf had indeed thought deeply about the impact of

the Internet, and in terms very specific to ICANN’s role in DNS administration. Our

conversation turned to the “peculiar characteristics” of domain names and IP addresses, and

the extent to which these “objects” were like property. He was familiar with relevant

concepts inherited from English law, including leasehold, freehold, real property, and

continuing liens. He reflected on the complex array of problems that stemmed from the task

of transferring names between registrars. Establishing provenance of ownership was one.

Properly recording the transfers was another.

As the conversation headed toward a discussion of foreclosure and escheat, I

suggested that these are precisely the kinds of gate-keeping activities that indicate how an

expressly global organization might emulate the work of modern governments. Cerf’s guard

went up again, as if it were necessary to draw the line. “There’s almost no coercive power

available to anyone at ICANN,” he insisted. Still, he had to concede that by selecting TLDs,

accrediting registrars, and so on, ICANN was indeed allocating resources.

Cerf then switched gears and spoke directly to the “basis” of ICANN’s “regulatory

power.” It all began with Postel in the “pre-ICANN” period. “Postel’s intentions were


549 Ibid.

solely,” he believed, “to try and make sure that people who were responsible for pieces of

the Internet were in fact committed to it, and that dumb things didn’t happen.” Those

“intentions” were not only smart, they were selfless:

Jon didn’t make any money out of this at all. Of all the people I can think of in the world, Jon is the least kind of person who had any ego at all. He never sought the limelight. He wasn’t one of those guys trying to do something in secret... [H]e tried to do the best he could to help people resolve disagreements.

Turning that into an institution has been incredibly hard. What it tells you is that there is no substitute for an enlightened despot who is well intentioned and capable. Trying to make an institution that balances everyone’s interest is incredibly difficult.549

Cerf’s comment about there being “no substitute” for Postel’s enlightened despotism

was probably just a rhetorical device. If Postel’s commitment to smart and selfless action was

the basis for his power, others could aspire to the same sort of honorably well-intentioned

legitimacy. He would serve as their model. Thus ICANN could be made “well intentioned”

by incorporating it as a non-profit, and made “capable” by populating its board and staff with

technical experts.

What couldn’t be replicated was Postel’s status as a revered figure. There was no

substitute for the faith people had in him. That profoundly high regard led his associates to

overlook his mistakes, led his fans to see wisdom in every pronouncement, and led the press

to call him “god.” ICANN’s officers could pretend to wear the halo of financially

disinterested, highly skilled professionals, and to share Postel’s commitment to the Internet.

But they would never be as trusted as he had been. To put it more precisely, they would never

be as universally revered as people now preferred to think he had been.

Cerf, on the defensive against ICANN’s critics, was at pains to defend its integrity

(and by implication his own), stressing that ICANN’s “job” was “not to make a profit” for

itself or for anyone in particular. He described its positive duties almost as an afterthought:

“Its job is to enable other people to make a profit if they so choose on the net. You don’t


550 Ibid.

have to make a profit on the net. Some people use the net for reasons that are

noncommercial. That’s OK too.”550

Reviewing the transcript after the interview, I realized that I had exposed the meat

of the Gutenberg analogy, but had missed an opportunity to probe it further. The basis of

Postel’s power, regulatory or otherwise, was much more than an honorable record of keeping

his hands out of the cookie jar. He had power because members of a wider community

identified him as someone who could advance their interests. He was not simply a

trustworthy functionary, but a man who was trusted to lead the way ahead. If I had dug

deeper, I might have reached a clearer view of the direction Postel had chosen, and why.

It would have been appropriate at that point to question assumptions taken for granted

by nearly everyone in the world, Cerf and myself included: Why enable use of the Internet?

What interests would be served, and how? In general terms, the proper question would have

been this: Even if a gatekeeper does not profit personally, what values profit by the

gatekeeper’s actions? Put another way: What guiding principles does the gatekeeper seek to

serve?

Not surprisingly, Cerf had put commerce first in his off-the-cuff comment about

ICANN’s positive purposes. This was not new. Other major figures in the DNS debates had

given the same overarching justification when presented with the Gutenberg analogy. In their

cases, however, it was not an afterthought. Ira Magaziner, for example, responded at length,

describing the benefits of building structures which could support Internet commerce. So did

Paul Twomey, whom I interviewed when he was chair of ICANN’s Government Advisory

Committee.

Given more time to reflect, Cerf might have voiced a different rationale, perhaps

closer in spirit to Postel’s notion of radical empowerment or Manning’s desire for

Schumpeterian progress. But he didn’t. The response from the “father” of the Internet was

congruent with the views of Magaziner and Twomey... the most prominent state agents in

the DNS debates. They had not been promoting the Internet because they intended to change


551 Brian Carpenter, “RFC 1958: Architectural Principles of the Internet,” June 1996.

the world and transform the global order. Their intentions were precisely the opposite... to

bolster the political standing of their governments. Internet-based commerce was being

pursued as a way of spurring new employment, corporate profitability, and economic

prosperity. It was a means to that stable, nation-state enshrining end.

* * *

Cerf might have answered quite differently if someone had asked him about the

Gutenberg analogy in 1996, when Internet megalomania was much more fashionable. He

could have taken cues from RFC 1958, titled “Architectural Principles of the Internet.”

Authored by Brian Carpenter, then Chair of the IAB, the RFC offered “general guidance”

about the “current principles of Internet architecture.” Its “spanning set” of rules were

friendly to commerce, but also revealed just how much proponents of Internet expansion

once evinced interests and loyalties which put them at odds with sovereign governments.

That document was published in June 1996... well before the incorporation of

ICANN, the split of the root, and the IAHC debacle. IETF attendance was booming, ISOC

was getting a fresh start under Don Heath, and no one had yet uttered the words “DNS War.”

The Internet Protocol had all but triumphed over its competitors by then. Internet-related

businesses were taking off, and the ascendant Web was widely recognized as a social

juggernaut. To be sure, there were problems. It appeared as if the world’s most powerful

sovereign state was trying to block the Internet’s progress, and even roll it back. The fight

over the CDA had reached a high pitch, and there was sharp resentment of the Clinton

Administration’s efforts to impede public access to high quality encryption technology. In

that context, with so much at stake, Carpenter no doubt felt it was incumbent on him to assert

the best interests of his community, especially against perceived threats to further expansion.

Carpenter’s list of architectural principles began with a hedge: “[C]onstant change is

perhaps the only principle of the Internet that should survive indefinitely.” That said, he

described the overarching norms in a nutshell: “[T]he goal is connectivity, the tool is the

Internet Protocol, and the intelligence is end to end rather than hidden in the network.”551


Connectivity “flourishes in the increasingly liberal and competitive commercial

telecommunications environment,” he wrote. But commerce was a means to an end, and not

the end itself. Rather, the “exponential growth of the network [showed] that connectivity is

its own reward.” The conception of connectivity was unbordered. “The key to global

connectivity is the inter-networking layer.” That conception was also universal. There might

be upgrades to new versions of the Internet Protocol, but achieving “the ideal situation” of

“uniform and relatively seamless operations” required that “there should be one, and only

one protocol.”

Ideologies claiming an unbordered, universalist scope were not new to the world.

Catholicism and Islam come to mind, as do several failed prototypes for world government

that appeared in the 19th and 20th centuries. Some anarchist and libertarian ideologies may

fit into that category as well. Uniquely, however, the Internet combined an inherent,

libertarian-tinged resistance to intercessors – embodied in the end-to-end principle – with the

immutable constraint of an overarching mediator – the network itself. Cybernetics and

government share a common etymological root for good reasons. The universalist thrust of

the Internet’s doctrine – an almost predatory effort to make itself the world’s underlying

telecommunications protocol – belied the old saw that adoption of IP was strictly voluntary.

Once the market has tilted, the new circumstances take on a force of their own. Recall Cerf’s

insistence that ICANN was not coercive. How overtly coercive do you need to be when

you’re the only game in town?

Yet it was clear that the Internet had ushered in an innovative and critical distinction

from the sovereign-crowned hierarchies of nation-states and religious doctrines. The end-to-

end principle championed the sovereignty of members at the endpoints over the sovereignty

of the overarching mediator. This distinction needs further explanation.

In a key passage intended to articulate fundamental engineering principles, Carpenter

relied on the words “state” and “fate” to put his ideas across, words that evoke highly loaded

concepts in the languages of international and deistic world systems. Deconstruction of their

multiple meanings can help disclose the political implications of Internet architecture. Here

is the passage:

Page 381: Launching the DNS War: Dot-Com Privatization and the Rise of Global Internet Governance

366

RFC 1958, section 2.3.552

Brian Carpenter, Fred Baker, “RFC 1984: The IAB and IESG Statement on Cryptographic553

Technology and the Internet,” August 1996.

An end-to-end protocol design should not rely on the maintenance of state(i.e. information about the state of end-to-end communication) inside thenetwork. Such state should be maintained only in the endpoints, in such away that the state can only be destroyed when the endpoint itself breaks down(known as fate-sharing).552

In other words, a connection’s state as such depended on the intelligence sustained

by its member endpoints. Carpenter allowed that the network would maintain information

to perform services on a connection’s behalf, like routing, but he stipulated that the

responsibility for the integrity and security of a connection’s state belonged expressly at the

fringes. In distinction, a nation-state’s state as such depends in large part on recognition (an

act of sustained intelligence) by other nation-states.
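The fate-sharing rule can be seen in miniature in any ordinary TCP exchange, where the operating system keeps all of a connection's state inside the two endpoint sockets and none of it in the routers between them. The following Python sketch is purely illustrative and is not drawn from the sources discussed here:

```python
import socket

# Two connected endpoints over a local TCP connection. All connection
# state (buffers, sequence tracking) lives in the two endpoint socket
# objects; nothing between them holds any of it.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))     # let the OS pick a free port
server.listen(1)
client = socket.create_connection(server.getsockname())
peer, _ = server.accept()

client.sendall(b"hello")
data = b""
while len(data) < 5:              # read until the full message arrives
    data += peer.recv(5 - len(data))

# Destroying an endpoint destroys its share of the connection's state:
# fate-sharing in miniature.
client.close()
peer.close()
server.close()
```

Closing either socket extinguishes that endpoint's share of the connection's state; nothing elsewhere in the network needs to be notified for the state to cease existing.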

Consequently, the engineering community’s profoundly autonomous vision of a

connection’s state provided ground for tension with the governments of nation-states. The

community’s intensely self-reliant notion of security mandated that future versions of the

Internet Protocol support the strongest possible cryptographic protections. That demand

translated into a giant “Keep Out” sign, blasting the US government and any other that might

seek to corrupt the integrity of end-to-end communication.

Carpenter followed up several weeks later with another RFC that expressed that same

position in openly contentious terms. It stated that the IAB and the IESG were “disturbed to

note that various governments” were adopting a series of cryptographic policies that were

“against the interests of consumers and the business community.” As an alternative, “The

IAB and IESG would... encourage policies that allow ready access to uniform strong

cryptographic technology for all Internet users in all countries.” Postel gave that RFC a

number far out of sequence – the ominous 1984 – in order to dramatize its importance.553

The second term, fate, hints at existential matters. Here again, what made the ideology of the Internet architects unique among authoritatively mediated systems was the rule that the responsibility for a connection’s enduring life resides with its member endpoints. Fate-sharing implied that communication between member endpoints was the ultimate purpose of existence, not communion with a higher sovereign. As with the word “state,” the appropriation of “fate” helped distinguish the Internet as a political form. Connectivity was the new reward, not privilege or paradise.554

554. Though many nation-states enshrine equal rights as a constitutive principle, it is nevertheless the case that rights are defended through state intervention, and depend on political circumstances in which certain groups and individuals tend to acquire more rights than others.

555. RFC 1958, section 2.4.

556. The concept of “charter myth” was created by the anthropologist Bronislaw Malinowski to describe how a group’s story of origin “conveys, expresses and strengthens the fundamental fact of local unity,” and “literally contains the legal charter of the community.” See Malinowski, “Myth in Primitive Psychology” (1954:116), cited by Iain R. Edgar, “Anthropology and the Dream,” http://sapir.ukc.ac.uk/Guests/g-ie/cultdream.political.html.

Evidence for the Internet community’s ideological exceptionalism does not need to

hang on the preceding discussion of fate and state, of course. The cherished slogans of the

engineering community had already signaled a move to set the Internet apart. Carpenter had

recapitulated them in his architectural principles, a strategy that served to ratify the

community’s existing sense of self even as he sought to extend its norms:

Fortunately, nobody owns the Internet, there is no centralized control, and nobody can turn it off. Its evolution depends on rough consensus about technical proposals, and on running code. Engineering feed-back from real implementations is more important than any architectural principles.555

The idea that the Internet was the product of uncontrolled evolution was its great

chartering myth. Brian Reid and Steve Wolff, quoted at the opening of this dissertation,556

were smitten by the idea, and so were countless other Internet veterans. Carpenter echoed the

tradition in his statement of principles, giving it pride of place... the abstract. “The Internet

and its architecture have grown in evolutionary fashion from modest beginnings rather than

from a Grand Plan,” it began. That “process of evolution is one of the main reasons for the

technology’s success.”


Why credit the organizing theory of modern biology for the success of the Internet?

Why deny conscious design? In fact, the Internet’s designers had always been clear about

their purposes. Goals such as connectivity, interoperability, and scalability had guided those

who participated in building the ARPANET and were sustained during the expansion of the

Internet. Principles such as open documentation, end-to-end intelligence, security, and

universality were adopted as community members refined their ideas about how to advance

those core goals. The Internet’s pioneers may have lacked a complete, prefigured blueprint

when they set out on their journey, but there was no lack of guiding vision.

So why did they deny it? A modest wariness of being tagged megalomanic – like the

concern voiced by Vint Cerf – is not a sufficient explanation. The tendency toward vociferous and aggressive sloganeering about Internet evolution, and about no one owning the Internet, evidently peaked around the same time that grandiose pronouncements about the Internet’s future vis-à-vis

the nation-state were most fashionable. Perhaps that’s the clue. The denial may reflect

something deeper about the human condition. Rule making – especially on such a grand scale

– may be more readily countenanced when posited as rule following.

b. Summary of Findings

The first part of this dissertation offered a substantial and sustained narrative history

of Internet governance, with emphasis on the DNS controversy through the signing of the

gTLD-MoU.

A major theme of the discussion was that the gTLD-MoU represented the

culmination of a formal and official bid to declare the Internet’s name space as a global

resource, with the goal of setting up an expressly global system of administration. Its chains

of authority would have been anchored in a new structure distinct from traditional

international institutions. For that reason, questions of its guiding principles and its

gatekeeping powers received close attention in the narrative.

The conception of an expressly global organization is one whose claims

to legitimacy are grounded on an array of guiding principles, constitutive instruments,

effective technical reach, and other jurisdictional arrangements that are all conceived as


having an unambiguously global scope. Its standing would not ultimately depend on

authorization by the traditional treaty instruments of sovereign states. Such authorization would not necessarily undermine its overall social standing, however; indeed, it would be quite likely to bolster it.

The guiding principles of the gTLD-MoU were derived from statements made by

leaders of the Internet engineering community regarding the desire for global end-to-end

connectivity, and statements from representatives of trademark and telecom industries

advocating a new program for voluntary multilateralism. Its constitutive instruments included

a variety of institutional elements that were intended to provide for stakeholder-based

decision-making, tempered by public oversight.

Its effective technical reach was initially intended to cover all tuples (database

records) that combined Internet Protocol addresses with domain names having generic

suffixes (the IAHC’s seven in a first stage, and the legacies including dot-com in a second

stage). This reach involved only a subset of the Internet, because country codes were not

included. Nevertheless, a tightening alliance between IANA, the ITU, and the gTLD-MoU

regime raised expectations that explicit authority over the root would follow. Also, IANA’s

growing independence in that context, and its relationship with the regional IP registries,

suggested a stronger form of expressly global character for other segments of the Internet’s

core resources.

Other jurisdictional arrangements, particularly the WIPO-managed Administrative

Challenge Panels that were intended to serve as the gTLD-MoU’s dispute resolution

mechanisms, represented an innovative move to bypass nationally-bound legal processes.

Several themes woven into the narrative are of some historical interest and have not

previously received such extended attention. These include: 1) the ambiguity which haunted

Jon Postel’s claim to authority, and the degree to which that ambiguity was rooted in RFC

1174; 2) a detailed description of the events which led to Amendment 4 of the Cooperative

Agreement, and the institution of fees by NSI; 3) the interplay of ideologies that emerged in

conference discussions concerning the future disposition of the Internet’s core resources.


Three metaphors – guides, gatekeepers, and peers – were often invoked during the

narrative. Those cues were intended to alert the reader to the significance of certain

personalities, statements, and events. Very little was said about how those metaphors were

derived. They are familiar words that stand for familiar concepts, drawing plain meaning

from the context of the discussion. The intent was to offer a narrative that could stand on its

own, regardless of whether the underlying analytical framework withstands critical challenge.

Interested readers can find a discussion of those metaphors in Appendix 2.


10. APPENDIX I: ACRONYMS

Partial listing of acronyms, names, and special terminology used in this dissertation.

“.” (the dot, also known as the root): Hierarchical apex of the DNS.

ACM (Association for Computing Machinery): The world’s first organized association of computing professionals; it provided a platform upon which several BWG partisans voiced their opposition to ICANN’s leadership.

AlterNIC (brand name implying Alternate Network Information Center): Eugene Kashpureff’s alternate registry service, prominent from March 1996 through July 1997.

ANS (Advanced Network Services): Not-for-profit entity established by MERIT in 1992 to run backbone services for the NSF.

APNG (Asia Pacific Networking Group): Asian correlate of the European and American routing management groups, CCIRN and IEPG.

APNIC (Asia Pacific Network Information Center): Asian correlate of the European and American IP registries, RIPE and ARIN.

APPLe (Asia Pacific Policy and Legal Forum): Policy discussion forum founded by Laina Raveendran Greene.

ARPA-NIC (Advanced Research Projects Agency Network - Network Information Center): The research-oriented counterpart of the DDN-NIC. They operated side by side until the inception of the InterNIC.

ARPANET (Advanced Research Projects Agency Network): The Internet’s precursor, which operated from 1969 to 1982.

ASO (Address Supporting Organization): The segment of ICANN’s structure whose constituencies are focused on the allocation of IP numbers. It is constituted by three regionally based authorities.

AUP (Acceptable Use Policy): A rule preventing commercial use of government property, which inhibited the utility of the NSF backbone (operated by MERIT) as a long-haul Internet service provider.

BBN (Bolt, Beranek and Newman): Primary contractor for the ARPANET.

BIND (Berkeley Internet Name Domain): Dominant DNS-supporting software package used by Internet Service Providers. Written and published by the ISC. Its default configuration reinforces use of the root system endorsed by IANA and the USG, currently hosted by NSI.

BSD (Berkeley Software Distribution): Popular flavor of the UNIX operating system distributed free of charge by the University of California.

BWG (Boston Working Group): Online discussion group whose founders, in late summer 1998, presented alternate bylaws for what became ICANN.

CA (Cooperative Agreement): Formal name of the 1993 arrangement between NSI and the NSF regarding DNS operations. Since its inception it has been amended 19 times. USG oversight of the CA has been transferred to the DOC.

CCIRN (Coordinating Committee for Intercontinental Research Networks): Provided global infrastructural oversight in the late 1980s and early 1990s. Outgrowth of the IANW. Fostered the IEPG. Advocated creation of the FNC.

ccTLD (Country Code Top Level Domain): A TLD using a two-letter suffix matching the ISO-3166 table of country codes, and managed by a designated authority at the pleasure of its corresponding national government.

CDA (Communications Decency Act): Legislation which limited the liability of Internet Service Providers for the retransmission of restricted content. Many activists in the Internet community believed certain provisions threatened free speech rights.

CIX (Commercial Internet Exchange): Consortium of ISPs founded in 1992 to challenge ANS’s market advantage as a backbone provider for the NSF. CIX members employed an innovative peering policy that did not demand financial settlements for exchanges of traffic.

CNRI (Corporation for National Research Initiatives): Washington-based non-profit organization which provided funding for IETF meetings and administration through 1998.

CORE (Council of Registrars): Non-profit structure designated through the gTLD-MoU to coordinate registry and registrar activities.

CPSR (Computer Professionals for Social Responsibility): US-based lobbying group traditionally concerned with peace and arms control issues, which became a vehicle for promoting “Internet Democracy” after the creation of ICANN.

DARPA (Defense Advanced Research Projects Agency): Defense Department agency involved in funding early research on packet-switched networks and high performance computing, including the ARPANET. Later provided funding for ISI.

DCA (Defense Communications Agency): See DISA.

DDN (Defense Data Network): Military-oriented operational networking facility run by SRI from 1972 to 1991, when it was transferred to GSI/NSI.

DDN-NIC (Defense Data Network - Network Information Center): Designation for the facility that maintained the hosts.txt file and managed IP address allocations for the Internet until 1993, when it was physically split from the ARPA-NIC.

DISA (Defense Information Systems Agency): Originally the Defense Communications Agency. Defense Department sub-agency which funded DDN-NIC operations.

DNS (Domain Name System): A hierarchical, distributed database that permits resolution of memorable names like miami.edu into their underlying IP numbers.

DNRC (Domain Name Rights Coalition): A Washington, DC area consortium of attorneys, including Michaela “Mikki” Barry, Harold Feld, and Kathy Kleiman.

DNSO (Domain Name Supporting Organization): The segment of ICANN’s structure whose constituencies are focused on domain name issues. It is constituted by seven constituencies – business, non-commercial, ISPs, registrars, gTLD registries, ccTLD registries, and intellectual property – plus an open General Assembly.

DOC (Department of Commerce): US Government agency which assumed formal control of InterNIC-related oversight after the NSF relinquished that responsibility in 1998.

eDNS (enhanced DNS): Consortium of alternate registries including systems run by Simon Higgs, Karl Denninger, Jay Fenello, and Eugene Kashpureff.

EFF (Electronic Frontier Foundation): Activist group founded by Mitchell Kapor and John Perry Barlow.

FEPG (Federal Engineering Planning Group): Advised on creation of peering points and other issues of interest to major backbone providers within the United States.

FNC (Federal Networking Council): Committee of USG computer networking experts from various agencies. Its Advisory Committee (FNCAC) was involved in DNS issues until the creation of the IAG.

GAC (Governmental Advisory Committee): Informal international organization through which nation-states issue formal recommendations to ICANN’s Board of Directors. Such statements are officially considered non-binding. The GAC has no seats on ICANN’s Board.

GIAW (Global Incorporation Alliance Workshop): Original name of the IFWP.

GIP (Global Internet Project): Consortium of companies including IBM and MCI which provided startup funding for ICANN in 1998 and 1999.

gTLD (Generic Top Level Domain): A TLD which is effectively open to anyone who wishes to purchase an SLD with its corresponding suffix. Current gTLDs include .com, .net, and .org; others are contemplated. Some ccTLDs follow business models equivalent to gTLDs.

gTLD-MoU (Generic Top Level Domains Memorandum of Understanding): Work product of the IAHC which established the POC and CORE.

IAB (Internet Architecture Board): Provides oversight to the IETF and serves as the final appeals body in case an IETF member claims a process violation. Members are nominated and selected through an internal IETF process and approved by ISOC.

IAG (Federal Interagency Working Group on Domain Names): Group within the USG dealing with DNS issues starting in early 1997. First led by Brian Kahin, then by Ira Magaziner.

IAHC (International Ad Hoc Committee): A “blue ribbon” committee commissioned by ISOC in late 1996 to resolve DNS policy matters.

IANA (Internet Assigned Numbers Authority): The body that records and coordinates name, number, and protocol assignments used on the Internet, as well as the root server system used by the DNS. Initially a side project of Jon Postel, who as a graduate student was given the assignment by his dissertation supervisor, Dave Farber. As the Internet grew, the task was given the name IANA. Early funding for this work was channeled through grants from DARPA, and later through ICANN.

ICANN (Internet Corporation for Assigned Names and Numbers): A globally constituted body chartered in California as a non-profit membership organization to manage the administration of technical identifiers used on the Internet. Under the semi-formal aegis of the USG, ICANN coordinates Internet policy and operations by overseeing root operations and credentialling registries, registrars, and dispute resolution providers concerned with the use, sale, and allocation of domain names. ICANN is also charged with providing for the expansion of the registrar system, the admission of UDRP providers, and the possible extension of the TLD space, as well as funding and oversight of the IANA and the RFC editor activities on behalf of the IETF and other concerned parties.

IDNB (Internet DNS Names Review Board): Panel named by Jon Postel in “RFC 1591: Domain Name System Structure and Delegation.” It was intended to provide for reviews and dispute resolution, but was never constituted.

IEPG (Intercontinental Engineering Planning Group, originally; Internet Engineering Planning Group, currently): Advises on creation of peering points and other issues of interest to major backbone providers.

IESG (Internet Engineering Steering Group): The combined Area Directors overseeing the IETF’s Working Groups.

IETF (Internet Engineering Task Force): Standards-making body for the Internet. Membership is open and standards are non-proprietary and freely published. Approximately 2000 people attend its thrice-yearly meetings. A somewhat larger number participates online.

IFWP (International Forum on the White Paper): A series of meetings held in the summer of 1998 to discuss the formation of a new corporation for Internet technical administration.

IMP (Interface Message Processor): Devices built by BBN to link computers on the ARPANET.

INET (Internet Networking Conference): A series of annual, quasi-academic conferences sponsored by ISOC, which emerged from the IAW in 1991.

INTA (International Trademark Association): New York-based organ of the trademark community.

Interop (Internet Interoperations): A major annual tradeshow used to showcase state-of-the-art internetworking technology.

InterNIC (Internet Network Information Center): Domain name registration service managed by Network Solutions, Inc. (NSI) under agreement with the National Science Foundation until 1998, and later with the Department of Commerce.

IP (Internet Protocol): Electronic messaging standard used to send and receive packets of data on the Internet. Operates in conjunction with connection managers such as the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP). Created by Vint Cerf (ICANN Board member and ISOC Chair) and Robert Kahn (CEO of CNRI).

IPTO (Information Processing Techniques Office): The sub-agency of ARPA (later DARPA) which sponsored the creation of the ARPANET, the development of TCP, and other breakthroughs.

ISC (Internet Software Consortium): Institutional home of Paul Vixie, lead author of BIND. Non-profit company subsidized by market leaders like Digital Equipment Corporation.

ISI (Information Sciences Institute): Long-time employer of Jon Postel, Joyce Reynolds, Bill Manning, and other key IANA personnel. Host of IANA’s offices. A think tank sited in Marina del Rey, California, and Virginia, but formally part of the University of Southern California. Funded by DARPA through 1998, briefly by USC, and then by ISOC and ICANN.

ISO (International Organization for Standardization): Geneva-based technical standards organization whose voting members are representatives of national standards bodies. ISO is not an acronym, but the Greek term for “equal.”

ISOC (Internet Society): Non-profit group which provides a legal umbrella for the activities of the IETF. Created and largely constituted by significant members of the engineering community which created key Internet technologies.

ISP (Internet Service Provider): An agency that connects a local network or subnetwork to the public Internet. Typically a dial-up service, though the definition can also include computing centers for public and private institutions, and large-scale, high-speed backbone providers.

ITAG (IANA Transition Advisors Group): Formed shortly after the root “switch” event and the Green Paper announcement to advise Postel on the transition from US Government funding to incorporation as a not-for-profit activity.

ITU (International Telecommunications Union): International organization involved in formulating and sponsoring the gTLD-MoU. Now a participant in ICANN’s PSO.

MAC (Membership Advisory Committee): ICANN committee which made recommendations on the selection of its At Large Directors.

MERIT (Michigan Educational Research Information Triad): University-based consortium that entered into a partnership with IBM and MCI in 1987 to run the NSFNET.

NANOG (North American Network Operators Group): Leading association of ISPs in the United States.

NSF (National Science Foundation): US Government agency oriented toward support of unclassified educational research. Provided a venue that promoted the transition of the Internet from a military-funded project to privatization.

NSFNET (The NSF Network): An NSF-funded project focused on linking university-based supercomputers.

NSI (Network Solutions Incorporated): A Herndon, VA based company which was awarded a contract in 1993 to operate the apex of the root server system, and to manage the .com, .net, .org, and .edu TLDs.

NTIA (National Telecommunications and Information Administration): An agency under the DOC specifically concerned with Internet policy coordination issues.

Open-RSC (Open Root Server Consortium): A discussion group founded by several alternate registry operators and others who were critical of the IAHC and ICANN.

PSO (Protocol Supporting Organization): The segment of ICANN’s structure whose constituencies are focused on technical protocols. It is constituted by the IETF, the ITU, and several other standards-making groups.

RARE (European Association of Research Networks).

Registrar: The operator of the system that is the interface between a domain name registrant and a registry. Akin to a retailer.

Registry: The operator of the database which contains the information linking a domain name and its corresponding IP address (as well as other mappings pertinent to domain names).

RFC (Request for Comments): The name of the numbered series of standards issued by the IETF, for example RFC 1591. Standards are subject to varying degrees of review. “Informational” and “experimental” documents can be published with minimal review. Best Current Practice (BCP) and Standard (STD) documents must be approved by a consensus (dominant support) of the Area Directors of the IESG. BCP documents may also be issued directly by the IAB.

RIPE (Réseaux IP Européens): Europe’s Regional Internet Registry, created in 1989, and given formal legal status in 1992 in cooperation with RARE.

SAIC (Science Applications International Corporation): Parent company of Network Solutions (NSI).

SLD (Second Level Domain): Syntactically evident as the string of characters immediately to the left of a TLD, like the miami in miami.edu. Due to the popularity of .com, it is the type of domain name most commonly purchased and used on the Internet.

SRS (Shared Registry System): A technological design approach by which distinct registrar agents sell names into a common “back-end” domain name database registry.

TLD (Top Level Domain): Syntactically evident as the suffix of a domain name, like the .edu in miami.edu. There are 252 currently made visible in ICANN’s root.

TM (Trademark): The intellectual property interests involved in the DNS controversy drew strong representation from trademark holders and their attorneys.

UDRP (Uniform Domain Name Dispute Resolution Policy): ICANN’s procedure for resolving complaints against domain name holders; it employs a system of “providers” who act as judges for the parties to the dispute.

USG (United States Government).

WIPO (World Intellectual Property Organization): Geneva-based international organization involved in the IAHC’s and ICANN’s dispute resolution structure.

In a Stupid Network, the data on it would be the boss. – David Isenberg

Those who tell the stories also rule. – Plato

11. APPENDIX II: POWER

a. Rule-making on the Internet vs rule-making in general

It was quite fashionable in the 1990s to portray the Internet as an entity that was

owned by no one – a chaotically-arranged, edge-controlled, consensus-driven, self-organized

anarchy. The reality was quite different. The Internet’s designers organized its core resources

in a manner that established a discipline of hierarchical control over an ever-expanding

society of connected agents. The Internet’s legacy of anarchy stemmed from its extraordinary

openness. By enshrining scalable connectivity as a supreme priority and perhaps even as the

Internet’s summum bonum, the designers favored technical approaches which would foster

participation by huge numbers of new users. Where access to core resources was concerned,

however, new entrants were marshaled as “responsible parties” within a system of carefully

ranked delegations and assignments.

At the end of the decade, as Internet access became highly valued, there was a

corresponding rise in political efforts to control the distribution of those core resources.

Lively disputes arose over what kind of norms should channel those distribution decisions,

and who would get to articulate those norms. This dissertation has presented the early history

of that rule-making moment on the Internet, giving particular attention to the ideological

arguments that sustained various alliances at key points in the controversy.

During the preceding narration I often compared the behaviors of key actors to the

behavior of guides, gatekeepers, or peers. Those metaphors were intended to draw attention

to specific ways in which people act as deployers of rules... agents with real power to make

a difference in the world. Now it is appropriate to step back from the narrative. The purpose

of this essay is to substantiate those metaphors by situating them within a formal analytical

framework.

Putting aside the deeper content of this or that dispute, I want to show how adherents

to various ideologies account for the proper source of rules in the world. When people act


– whether as guides, gatekeepers, or peers – they have some self-awareness of their own

urges, needs, rights, obligations, interests, predicaments, and so on. That self-awareness

provides them with knowledge about their capacities and intentions. It roots their sense of

agency and informs their purposes. For example, people who see themselves as “just

following orders” may conceive the proper source of rule to be outside themselves. In cases

where people consider themselves to be free thinkers, open to all choices, or to be individuals

who act solely on the basis of personal conscience, then the proper source is visualized as

stemming from the inside. Moreover, the source of a rule could be an explicit command

given by a named and known agent, or it could be a general principle subject to interpretation

depending on the situation. So, just as there are outside/inside distinctions, there are

being/abstraction distinctions which in combination give rise to an array of possible

conceptions.

To ground this framework, I will invoke some venerable three-part metaphoric

schemes which might be familiar to students of philosophy, political theory, and

international relations. I’m particularly interested in Nicholas Onuf’s triadic approach to the

problem of Rule, presented in World of Our Making (1989), and extended in later works.

Onuf’s early work on rule sought to demonstrate links between his own triadic framework

and Charles S. Peirce’s triadic system of logic. Onuf also sought to conflate his triadic

system with John Searle’s system of speech acts, attempting to make the schemes “fit”

by dismissing two of Searle’s five categories. I will take a different approach, first by

demonstrating how Searle’s five can be reconsolidated into three, and, then showing how

those three seem to link up rather neatly with Onuf’s system.

This attempt to strengthen the conceptual ties between Onuf’s triad and Searle’s

theory of illocutionary force is intended to build a case for a constructivist methodology for

empirical research. Making a full case must depend on future work, extending those

conceptual ties into the kernel of Peirce’s system, especially his phenomenology of signs.

I will begin with a discussion of Constructivist precepts regarding the ontological

primacy of rules in the constitution of political interests. That will be followed by an

examination of the linguistic foundations of rule-making in human social practices. There


557. Giddens (1986).

will be a brief comparison of Onuf’s and Searle’s expositions of speech acts, wrapping up

with a preliminary justification for the metaphors of guides, gatekeepers, and peers. Along

the way, I will offer some new terminology to address the question of how people account

for the proper source of rules in the world.

b. The Skilled Practice of Rules and Categories

i. Structuration

International Relations specialists who call themselves Constructivists tend to borrow

heavily from a sociological precept called “Structuration.”557 The concept stems from the

work of British academician and political advisor Anthony Giddens, who set out to describe

how people can simultaneously produce a culture and yet be produced by it.

In society, humans behave as agents who engage in skilled social performances. At

the same time, habits and institutions emerge from those performances; those habits and

institutions are reflected as discernable social structures. Consequently, agents and structures

cannot exist independently of each other. They are co-constituted recursively through

expressions of skilled social performance. This results in a condition called “Duality of

Structure.” Social structures are conceptualized as the simultaneous “medium and outcome”

of skilled human performance.

One highly-touted virtue of Structuration is that it avoids two intellectual traps that

have ensnared entire communities of professional academicians and armchair theorists. On

one hand there is the structure-enshrining doctrine which holds that unseen but discernable

forces give rise to reality. Those underlying causes are variously conceived as metaphysical

forms, innate structures, supernatural antecedents, pre-ordained commands, fundamental

essences, or natural laws. Thus, structure is taken as ontologically prior, with the implication

that human life plays out in a highly determined manner... perhaps even a pre-destined one.

In this sense, something outside of humans is credited as the ultimate cause of human


behavior. An individual’s conception of his or her own purpose is thus conceived as an

epiphenomenon, subordinate to the purposes compelled by the overarching system.

Structure-enshrining doctrines are often associated with behavioralism, and the notion

that social sciences can best be understood, like natural sciences, through quantitative,

statistically verifiable methods, abstract methodologies such as game theory, and so on.

Economists and psychologists are attracted to such approaches because these methods allow them to

make the world their laboratory. They can accumulate data through surveys, tests, and

historical records, and then number-crunch their way to grand conclusions. Behavioralists in the field of international relations included the World War II era political scientist Harold

Lasswell, and the seminal theorist of contemporary structural realism (also called

neorealism), Kenneth Waltz.

Over time, a critical response to the perceived excesses of structure-enshrining

doctrines produced a countervailing agent-enshrining doctrine. The argument goes this way:

For any given conception of reality which achieves dominance at a certain time and place,

there are privileged groups of elites who reap great benefits at the expense of others. Those

conceptions are manifested as grand narratives – call them Zeitgeists (spirits of an age),

Weltanschauungen (world views), or paradigms. They typically include justifications for the

inequalities in human relationships which might characterize a particular society. Those

justifications thus serve to sustain an elite’s domination. Bringing about justice in some

society (or at least raising the relative position of subordinated individuals), therefore, would

require a radical transformation designed to overturn the dominant conception of reality

within that society.

Ironically, any new paradigms or grand narratives would be as likely as the old ones

to justify some elite group’s domination. Even the concept of justice itself is inevitably

subject to appropriation by exploitive interests. Therefore, goes the argument, it may be

necessary to subvert all grand narratives so that each person is free to write his or her own

individual narrative and thereby create his or her own reality.

This alternative agent-enshrining stance – generally associated with post-modernism

– portrays the world as a pastiche of consumption choices. Post-modernists include the


558. See E.O. Wilson, Richard Rorty, and Paul R. Gross, “Is Everything Relative? A Debate on the Unity of Knowledge,” Wilson Quarterly (Winter 1998): 14-57. Though Wilson has written of the “failure” of logical positivism, his work certainly shares its absolutist faith in objective scientific knowledge. “Can we devise a universal litmus test for scientific statements and with it eventually attain the grail of objective truth? ... The answer could well be yes.” Consilience (1998: 59-65, esp. 60).

French theorists Roland Barthes and Jean-François Lyotard. Barthes published “The

Death of the Author” in 1968, denying the possibility of stable, enduring meaning in any text.

Meaning would instead originate with the reader. Lyotard published The Postmodern

Condition: A Report on Knowledge in 1979 arguing that society had become a multiplicity

of incommensurable “phrase regimes.” In international relations theory, Richard Ashley,

drawing on the work of French philosopher Michel Foucault, stresses the need to contest the

grand narratives of sovereignty and anarchy.

Post-modernists situate conceptions of freedom and justice within the limits of an

individual’s conscience or within the highly localized setting of a self-selected community.

An individual’s purposes, then, are expressly his or her own, implying that structure is

epiphenomenal to human agency.

These two archetypes both serve to justify one’s agency in the world. But they

embody fundamentally antithetical notions of personal sovereignty. This explains the

intensity of debates that have pitted positivist scientists against pragmatic philosophers (for

example, E.O. Wilson versus Richard Rorty558), and religious fundamentalists against

political pluralists (for example, the Family Research Council versus the American Civil

Liberties Union). In IR theory, the dispute is known as “The Third Debate.” Reducing these

worldviews to a battle between absolutism and relativism offers useful insights, but it also

oversimplifies. The Duality of Structure offers an alternative that goes beyond these

seemingly exclusive poles.

* * *

It is possible to recast the foregoing dichotomy as a trichotomy that makes a place for

the Constructivist approach. As a first step to that end, consider the distinction that was

already made between the notions of rules and purposes which originate externally – from

the structure – and those which originate internally – from the agent. Psychologists and


559. Published as Chapter IX in Matthews, Brander, ed., The Oxford Book of American Essays, 1914.

560. Ibid.

economists refer to the former attitude as exogenous and the latter as endogenous.

(Parenthetically, biologists and geologists use the same terms in turn to represent factors that

originate outside or inside the systems they study.) As it happens, the assumption that

causality is exogenous matches up fairly well with structure-driven justifications for

behavior. On the other hand, endogenous conceptions imply far more than self-isolating

relativism. Consider how Ralph Waldo Emerson used the word in his 1850 essay, “Uses of

Great Men.”

Man is that noble endogenous plant which grows, like the palm, from within outward... Man is endogenous and education is his unfolding. The aid we have from others is mechanical, compared with the discoveries of nature in us. What is thus learned is delightful in the doing, and the effect remains. Right ethics are central, and go from the soul outward.559

As the quintessential champion of “self-reliance,” Emerson held that “Nothing is at

last sacred but the integrity of your own mind.” He would thus seem at home in the category

of individualistic relativism. After all, the essay is treated as a celebration of iconoclastic

creativity... “He is great who is what he is from Nature, and who never reminds us of others.”

But his words are also taken as a paean to a community’s receptivity to new insight... “There

needs but one wise man in a company, and all are wise, so rapid is the contagion.”560

So the ingenuity which is characteristic of endogenous action does not necessarily

imply utter relativism. Ingenuity, like a contagion, has real effects on its environment.

Learning does not happen in a vacuum.

The inherent ambiguity of endogenous conceptions reaches back to Protagoras, whose

aphorism, “Man is the measure of all things,” indicates the ancient pedigree of the post-

modern impulse. But Protagoras was also a respected constitutionalist and teacher of virtue

who argued that, “Skill without concern, and concern without skill are equally worthless.”

This urge to ground individual human action within the needs of human society surpassed

simplistic ethical relativism. It is noteworthy, by the way, that Protagoras was a


contemporary of Plato, whose philosophical system of ideal forms became a leading

archetype for exogenous conceptions.

The point here is that the two faces of endogenous action need distinct names: one

for that which is taken to be utterly free and unhinged, and one for that which impinges upon

its surroundings. As a shorthand, I offer some neologisms. Action which presumes the

possibility of utter relativism will be called selvogenous. Action presumed to be creatively

constitutive will be called codogenous. The term exogenous works as it is already generally

understood, but three more neologisms can help advance the discussion. From this point on

I will refer to members of the structure-enshrining group as exogents, members of the agent-enshrining group as selvogents, and members of the co-constituting group as codogents. The

following table presents the arrangement, showing how codogenous and selvogenous actions

divide the generally understood conception of endogenous action.

Table 3. Conceptual Sources of Rule

Structure-originated perspective
Presumed Origin: Exogenous (externally founded)
Expression: Absolutism
Rule Style: Exogenous
Characteristic: Rules having particular contents are seen as overarching and essentially absolute, needing only to be discerned and followed. They are presumed to stem from background conditions – such as Natural Law or a supernatural creator – which persist regardless of the existence of human agents.

Co-constitutive perspective
Presumed Origin: Endogenous (internally founded)
Expression: Structurationism
Rule Style: Codogenous
Characteristic: Rules are seen as ontologically prior, but not independently free-standing, since human agency reflects skilled practices which must necessarily occur within social structures. The particular content of rules may vary widely with local constructs, but always within discernable categories of practice.

Agent-originated perspective
Presumed Origin: Endogenous (internally founded)
Expression: Relativism
Rule Style: Selvogenous
Characteristic: Rules are seen as relative, representing an essentially local construct of human will. Humans are assumed to be the measure of all things, so that the particular contents of rules are simply an epiphenomenal result of human existence.

To be fair, it is necessary to grant that Structurationist rhetoric about absolutists and relativists tends to set up strawmen that any reasonable person would want knocked down.


The absolutists are taken to task for a mathematically cold and dehumanizing objectification

of society (or, in religious cases, insurmountably fatalistic conceptions of human essence and

supernatural destiny) which factors out the free will of agents in world-making activity. The

relativists are criticized for an exasperating self-absorbed subjectivity which factors out the

viability of overarching moral structures, and even ontological ones.

The Structurationists’ conceit is to declare that they have the most penetrating vision of

the world. Absolutists are criticized for overlooking the urgency of agency, relativists for

blinding themselves to the collective discernability of structure. Indeed, it may be possible

to name actual people in either camp who are as lacking in nuance as Structurationists might

portray them. Still, the traps are real. Giddens avoided them in his own field work by

focusing on the tokens (money, gifts, property, etc.) that members of a society exchanged

during their interactions. This made it possible to trace activity within the simplest dyads of

society up through larger and larger aggregates.

Like Structurationists, Constructivists want to avoid overemphasizing structure or

agency at the risk of dismissing the other. The answer is not a matter of simply striking the right balance, however, but of finding a way to build from the most solid

foundation. The most promising starting point so far has been located within the western

tradition of analytic philosophy, and especially the intellectual tradition established by the

Austrian philosopher Ludwig Wittgenstein, the British philosopher J. L. Austin, and the American philosopher John Searle. All focused on the centrality of language in the conduct of

social life. Just as the mind is the lens that mediates a single individual’s subjective apprehension of reality, language is the lens that mediates multiple individuals’

intersubjective apprehension of social reality. There is no way around it.

Searle’s work is especially germane since it focuses on speaking as a rule-governed

form of behavior. The Constructivist research program focuses on the rules that constitute

skilled social performances. Rules are seen as the bearer service, so to speak, by which

agents and structures co-constitute each other.

Focusing on rules as the object of analysis does not reduce the importance of agents

or structures as practical concerns, or as merited topics of study. Both can be investigated


simultaneously. One can be as attentive to the constraining ramifications of structure as any

hyper-positivist, yet as concerned with the free will and dignity of agents as any post-

modernist. The virtue of the focus on rules is that it opens the way to an approach that factors

in both agents and structure. It brings their particular edifices into relief, in light of each

other.

Consider, as a general example, people who wake up in the morning regarding

themselves as individuals who must go out to the world and “play the game.” They may feel

alienated or enthusiastic about doing so, but they do it just the same. Knowing the rules, they

know how to get on in the world, and they establish their habits accordingly. By buying into

the game, they simultaneously produce the game. The game would not exist without its

players, and there would not be players if they were not in the game. This is co-constitution.

To play the game is to play by its rules. By finding out what the rules are, we can

understand how people simultaneously make “the game,” and themselves into its players.

What gives games (or structures like a club, a workplace, the state, or the Internet)

meaningful existence is the willingness of actors to animate their associated rules. Agents are

in the driver’s seat, so to speak, giving life to the rules of the road, and simultaneously

constituting themselves as drivers. And agency inevitably discloses properties. There can be

good drivers, bad drivers, reckless drivers, defensive drivers, drunk drivers, sleepy drivers,

licensed or unlicensed drivers, profiled drivers, or whatever the case may be.

As we will see, the co-constitution of agents and structures can play out in nearly

infinite ways. The process corresponds to both the establishment of material culture and to

the formation of neuronal coalitions within the human brain. It provides for the building of

both the social and mental edifices on which people rely to get on in the world, and by which

the world lets them in. Changes in the world reflect its active shaping. Interventions result

in consequences, the substance of which may be intended, or unintended, but those

consequences occur just the same, providing the grounds for future interventions.

It is important to be clear that “the world,” includes not only the apprehensible

material reality, but the mental reality of agents who are skilled in the practice of rules by

virtue of their own cognitive edifices. Sentience presents a special problem because


observers (so far) lack tools that allow direct observation of another thinking person’s

cognitive edifices. Nevertheless, it is reasonable to say that apprehensible reality and

sovereign cognition are both subject to dynamic processes of intervention and consequence.

Mental edifices can support routines ranging from the simplest gestures, habits and

manners, to the most elaborate notions of ideological conviction, paradigmatic thought, and

aesthetic understanding. These edifices also account for the metanarratives by which people

constitute themselves as exogents, codogents, or selvogents. People have a capacity to

overturn and replace their mental edifices, and therefore must be considered free. But this

capacity is not the kind of unhinged or unhingeable freedom that the selvogents defend.

Humans are endowed with a sensory/mental apparatus which enables them to apprehend and

recognize numerous distinct objects that exist in brute reality. This shared endowment, which

is itself an aspect of brute reality, raises the possibility of building propositions that people

can use (collectively, as agents) to corroborate their mutual apprehension of existing objects

and the relations between them. As propositions accumulate, they may be bundled within

metanarratives.

So, in distinction from the selvogents, I will argue that it is possible to constitute a

globally legitimate metanarrative about existing objects, given sufficient discipline of

corroboration. I will also argue that attempting to do so is necessary. But it is important to

note that moral precepts are not objects. Exogenous conceptions that a globally legitimate

moral metanarrative has been supernaturally revealed, or can be revealed through science,

have no foundation in discernable evidence. Perhaps a moral metanarrative could be

established codogenously, enshrining an ethical foundation for law through political action,

but such a narrative would be recognizable as an intentional product of human culture rather than as a reflection of some free-standing, irresistible force.

To underscore what is at stake, it would be useful to examine the etymology of

“narrative.” The word stems from the Latin gnarus, for knowing. The consonant

combinations of gn and kn in English spelling are clues that relate the root meanings in terms as diverse as ignore, agnostic, ken, and, ultimately, can. Etymologically, knowledge is indeed

power. To narrate is “to tell the particulars of an event or act.” Narration, therefore, is


intervention. When a narrator names particulars and defines the links between them, hearers

experience the consequences, intended or not, in their mental edifices. To engage in narration

is to deploy power.

It is useful to recall that humans are endowed with the capacity to create propositions

about ideas which are poorly coupled to existing reality. This is a basis for freedom. Humans

can conjure sheer abstractions and imagine alternate futures as they choose between courses

of action. But the expression of choice ultimately necessitates engagement with existing

reality. Systematic expression of choice reflects skilled engagement. Since materially existing

reality includes material artifacts of human culture, material reality discloses the

consequences of human choice. Thus, it may be less problematic to construct a narrative that

takes an adequate measure of what existence is than to construct one that presumes to say

how it ought to be reconstructed. This suggests an inevitable distinction between the grounds

of engagement and the goals of engagement. More simply, making a scientific description

is simpler than building a moral or an open society. Both are matters of some importance,

and both suggest the centrality of a fundamental, if facile-sounding question: What are the

rules of engagement with brute reality?

ii. Speech Acts

To study rules we need a method that provides intelligible access to them. To study

the origin and content of rules we need to locate where they entered the world. Fortunately,

doing so can be quite straightforward. The entire point of giving a skilled social performance

is to create intelligible results. Those results – accessible artifacts of those performances –

are known to us as deeds. We apprehend their forms as expressions of action – the sensible

effects of purposive signals transmitted by sentient beings.

The methodology Constructivists use to study these signals depends on a large body

of work in the fields of linguistics, communicative action, and philosophy, particularly the

analysis of “speech acts” originated by Austin and extended by Searle. My long-term plan,

remember, is to augment that methodology by extending the conception of speech acts to

include Peirce’s phenomenology of signs. I do not intend to challenge any prevailing


overarching doctrines, but I will offer some revisions and point out some conflations which

I hope will be considered reasonable. What Peirce adds is an exquisitely thorough

explanation of the rules governing the construction of situations within which human

signaling becomes intelligible. Also, the focus on signs retains a basic compatibility with

Giddens’ token-oriented methodology and Structurationist thinking.

A key element in Austin’s theory of speech acts is the distinction between what a

speaker intends a hearer to understand – an illocution – and what the hearer then construes

as the speaker’s intention – a perlocution. What is claimed and what is construed may not

always turn out to be the same, which causes problems in communication. A pertinent issue

for accomplishing successful illocutions concerns the presumed competencies of speakers

and hearers. Speech, like any signaling, operates through a combination of semantic,

syntactic, and pragmatic relations. Below I will use the term “pragmatic state” to discuss the

relation of a speaker’s competence to perform a deed and a hearer’s recognition of that

speaker’s competence within a given social setting. Marriage rites provide a convenient

example.

Only an official who has been duly appointed within a given social structure can

effectively say, “I now pronounce you husband and wife,” and expect that claim to make a

difference in the world, conferring a new status of marriage between two individuals. In that

equation, the competence of the hearers is as important as that of the speaker. There is real

skill involved in knowing when someone can responsibly make a claim, give a command,

or accept an agreement. There is also skill in knowing when certain utterances may be

semantically and syntactically valid, but pragmatically false with no performative effect. This

falsity could occur under many circumstances. Most commonly, the speaker may not be

formally credentialed to invoke the speech act, or the prospective husband and wife may not

be deemed entities formally entitled to participate.

An account of formality is a special problem to be taken up later. Though the

condition of formality in this example depends on association with prevailing legal

structures, it can also be animated by professionalism within epistemic communities, access

to published references like dictionaries, the sincerity of the speaker, reference to calibrated


561. Onuf (1989: 157).

measurement systems, and so on. For now, suffice it to say that the condition of formality

allows speakers and hearers, with minimal risk of attenuation, to transmit signals embodying

valid pragmatic states. It elevates the presumed trustworthiness of truth claims. Thus,

speakers can honestly and correctly state that two people are married, constituting hearers

who may then reconstitute themselves as speakers, and so on, ad infinitum. The quality of

formality behind the transmission of a truth claim indicates the degree to which people have

opportunities to share similar beliefs – equivalent states of mind about given sets of social

facts.

Pragmatic states can of course be staged theatrically or evoked in play. Children

might act out a marriage ceremony for fun, as might actors in a scene. Doing so, they

recognize that the pragmatic state of their performance is valid for the context of that

performance, and not for all the contexts beyond. Still, the capacity to think hypothetically

and to imagine the world in ways that are not actually intended to contribute to the

establishment of its formal reality is highly consequential. Communicative imagination

reveals the phenomenally creative powers of human agency. It is the source of our ability to

test inferences, plan for change, and gauge illocutionary force. It facilitates the socialization

of agents, allowing them to pre-model real world competencies. And, for speakers who are

so inclined, it is also the enabler of lies and deception.

Ironically, even formal social reality is the product of human inventiveness. There is

nothing new in pointing out that the world is constructed out of useful social fictions. The persistence of property rights and monetary systems offers classic examples of how the world

is built on a long legacy of promises, legal claims, contracts, IOUs, and other tokens. As

these vestiges of our actions persist and pile up, we extend the self-proclaimed authority of

their underlying social fictions. For Onuf, “all figures of speech extend the fiction of their

reality to social construction.”561 The words fact and manufacture share the same

etymological root, after all. Yet it is clear that certain classes of social constructs exert far

more import than others, some because they are upheld by habits of tradition, others because


they hold up under empirical tests. If the proper business of science is to state accurate facts

about reality, it is no wonder that so many philosophers of science are concerned first of all

with stating clear and formal standards for facticity.

It is quite likely that our understanding of how we articulate formal rules will improve

as we refine our understanding of the human capacity to express and evaluate imaginary or

provisional rule-making. It is clear that we can easily distinguish formal and informal

pragmatic states like those in the examples above, but there are much harder cases at play in

the world, and they are often the source of terrible trouble. The problem hinges on

incommensurable notions regarding the proper source of rules.

Again, marriage provides an example. In many jurisdictions around the world, the

phrase “I now pronounce you husband and wife” is being replaced by “I now pronounce you

married,” expanding the scope of legal marriage to include same-sex partnerships. But the

formal practice of same-sex marriage has yet to achieve the level of global legitimacy known

to the practice of hetero-sex marriage. This example demonstrates that formal pragmatic

states can be accepted locally while contested globally. It is important to stress that this

sense of local does not mean a geographic constraint, but a limit on the community of

practitioners. (Neither does the notion of jurisdiction necessarily hinge on geographic limits.)

The point here is to emphasize major distinctions in attitudes toward the proper

source of rules. Consider first the conduct of opponents of same-sex marriage in the

contemporary American context: The large majority behave as exogents, invoking religious

precepts to justify their advocacy of a specific global rule. Specifically, they demand

reconstitution of the social world to reinforce harmony with their conception of otherworldly

rules. An absolutist notion of structure extinguishes the possibility of other choices. Next,

consider the conduct of participants in a secret same-sex relationship: Due to risk of

persecution, they work to hide their choices from the society at large. Those secret practices

carve out a sense of local reality as a safe zone, in the hope that the fact of the relationship remains

meaningless beyond that zone. This is not to equate all closeted people with selvogents, but

to describe what occurs when deeds (rule-based behaviors) are conceived as having an


[562] For a revealing discussion of the process of moral reasoning, see Jonathan Haidt, “The Emotional Dog and its Rational Tail: A Social Intuitionist Approach to Moral Judgment,” Psychological Review 108.4 (October 2001): 814-834.

expressly local character. Severely localized agency of this sort is sterile to the extent that

it forfeits opportunities to impact the wider culture.

Finally, consider the attitudes of people who either advocate or support the

legalization of same-sex marriage, and may also seek to participate in one. Like the exogents,

they demand the constitution of a global rule. But unlike the exogents, their demand for a

change in the rules is justified on its own terms: First, their capacity as agents to make

demands, and second, the expectation that those demands will be heard by other agents who

will agree to reconstitute the social structure as demanded. Using the same morphology,

members of this group can be termed codogents, suggesting the idea of co-constituted

generation of change and growth. (The similarity to “cogent” also evokes the image of

people who act to persuade.) From the codogenous perspective, the proper source of rule is

conceived as originating from the demands of agents who are willing to recognize each

others’ demands, and then work out formally-stated mutual interests to their common

satisfaction. The process is at the core of the republican tradition in western civilization.

The upshot is this. Public disputes over same-sex marriage reveal categorically

incommensurable understandings about what kinds of sentences have valid pragmatic states

in a formal argument. The stridency of the argument suggests the possibility of failure at the

level of illocutionary/perlocutionary action. Perhaps this is because, as psychologist Jonathan

Haidt has shown, discussions that touch on moral precepts are typically driven by post-hoc,

emotional reasoning.[562] Yet hearers on each side know what the other side’s speakers mean

to mean. The failure is not simply a confusion of semantics or a lack of rationality by one or

both sides. What speakers claim and what they want construed in this case is unmistakable.

This reflects differing beliefs about the proper source of rules. The problem is not so

much a failure to communicate as a failure to validate. The supernatural justification for

political commitments makes no sense to one side, while the other side considers rights-

based arguments inapplicable to the issue at hand. Each considers the others’ core premise

Page 410: Launching the DNS War: Dot-Com Privatization and the Rise of Global Internet Governance

395

[563] Ibid., particularly Haidt’s reflection on the emotion of disgust. However, I take issue with Haidt’s argument (with Jesse Graham) in “When Morality Opposes Justice: Conservatives Have Moral Intuitions That Liberals May Not Recognize,” Social Justice Research (forthcoming). Haidt argues that the human species learned a commitment to purity as a result of evolution and the need to avoid contaminated meats. This adaptation reflects the drive toward wholeness. I would argue that it developed further as humans undertook puzzle-solving activities and learned the virtue of harmonizing their various precepts and hypotheses. This adaptive strategy helped them organize their understanding of the world and maintain their status relations as the size of their communities grew. Conservatives retain that commitment in social norms that preach chastity, whereas liberals have presumably abandoned it, seeing no evidence that libertine behaviors are by themselves harmful. Yet both conservatives and many liberals retain core commitments to hygiene (I presume, a priori, that both brush and floss). But they do have a different take on purity. Religious traditionalists seek to uphold pre-modern cultural notions of carnal integrity, while secular progressives increasingly decry superstition as loathsome.

to be irrelevant. Visceral emotional sentiments deter members of each side from the leaps

of imagination that would allow empathy with the content of values most pertinent to the

other’s interest-formation process. They can’t think from each other’s gut.[563]

* * *

Though marriage is a handy example for now, the main concern of my empirical

work has been to study rule making for the Internet. The problem in that domain is whether

formal administrative mechanisms should reflect the play of market forces, the influence of

nation states, the organized participation of stakeholders, the authority of technical experts,

or some other arrangements. As with the same-sex marriage controversy, contention over

the proper source of rules and, consequently, the pragmatic validity of claimed interests has

sharpened the debate, fostering a polarized and demonizing discourse.

Constructivists analyze people’s words – the things they do that make a difference

in the world – because of their interest in understanding how such actions constitute social

reality. Social life is a constitutive endeavor. The accumulation of human choices produces

the societies we live in. Admitting this is problematic, however. Society cannot be studied

from outside society. At the same time, the act of stating facts about society will necessarily

have constitutive effects within society. Giddens uses the term “double hermeneutic” to

explain the condition. Narrative intervention has consequences. This reinforces the

conclusion that scholarship in the vein of Rule-Oriented Constructivism can have practical

effects. Practitioners reject the a priori sureties that exogents claim to offer and accept

engagement in building upon the grand narratives that selvogents seek to escape.


[564] Eric Partridge (1983), Origins: A Short Etymological Dictionary of Modern English.

[565] Onuf (1987: 274).

[566] Onuf (1987: 277).

The challenges for Constructivists are similar to the challenges faced by those in the

business of politics... at least, honest politics. That is, how to promote the advancement of

a community’s interests. To meet the challenge it is necessary to adopt methods of analysis

and discourse that facilitate commensurable formal dialogs about the issues at hand. In other

words, there are good reasons to formulate and embrace sustainable rules for formal

communication. Such rules are enabling in that they allow engaged participants to express

their interests on a given issue. But they are simultaneously constraining in that they establish

standards for pragmatic validity.

The notion of interests is often taken for granted, but the concept is central to later

discussion, so it will be useful to examine in some detail how interests are derived.

iii. Interests

The word interest is built from the roots inter and esse, having the meanings between

and that which is actual, existing, genuine, true, authentic. The root esse is particularly

old, and finds its way into many words, including essence, is, sooth, and even etymology.[564]

Combined, the roots suggest to be between, but modern usage gives us important and of vital

concern.

Onuf’s conception of interests carefully distinguishes them from preferences or

wants. It is necessary to calculate one’s own preferences in order to advance them. Rules are

tools which make this possible, so interests are not merely a function of an agent’s wants, but

also an agent’s circumstances. Consequently, “A want is not an interest unless one can

plausibly act on it.”[565] Identifying one’s interests is intrinsically linked to strategies for

achieving them. Plausible ends require plausible means.

Interests are recognizable to us as the reasons we give for our conduct. Reasons speak to the relation between our taken-for-granted rationality and the wants that we are in a position to satisfy.[566]


[567] Onuf (1987: 275), citing Harold Lasswell and Abraham Kaplan, Power and Society. New Haven: Yale University Press (1950: 23).

[568] Inaugural Address by President George W. Bush, January 20, 2005.

Like any claim issued by speakers to hearers, the evocation of interests introduces the

possibility (but not inevitability) of ratification. The pursuit of one’s interests is intrinsically

a practice of rule-making and power. For that reason, it will be useful to carry forward with

the term interest defined as a “pattern of demands and supporting expectations.”[567]

Conceptualization and calculation of interests is itself a topic of great interest in the

field of International Relations here in the United States, where scholarship is traditionally

dominated by people who sort into classifications such as Realism, neo-Realism, and

Pluralism. Constructivist (sometimes called Reflexivist) scholars make a showing in

academia as well, but members of the other groups are far more solidly entrenched in

government jobs where the act of calculating and advancing the national interest is practiced

with the full resources of the state. Realists and Pluralists generally lean toward

behavioralism, portraying interests as a given result of an agent’s circumstance. But where

realists stress that the controlling circumstance is the essentially anarchic nature of the system

of sovereign states, the Pluralists focus on the utility-maximizing drives of individuals who

deal with each other within and across the bounds of states.

To complicate matters, the current US Administration is led by George W. Bush,

often considered an Idealist, representing a stance which has not recently been a strong force

in academia. His stated goal is to instigate reform of the domestic orders of various states

according to specific notions of desirable practices. The justification given for this is

ambivalent. He has said, "The survival of liberty in our land increasingly depends on the

success of liberty in other lands,"[568] suggesting a realistic, self-interested motivation. But he

has often said also that those desirable practices – and US interests as well – are conceived

and directed by a divine source. “I believe that America is called to lead the cause of freedom


[569] Acceptance speech at the Republican National Convention, September 2, 2004. See New York Times, September 3, 2004, p. 4.

[570] This is a paraphrase of Mark Dery’s concept, “ventriloquizing nature.” See “'Wild Nature' (The Unabomber Meets the Digerati)” regarding appeal to the authority of nature, “forestalling debate by camouflaging the man-made as the god-given.” http://www.levity.com/markdery/ESCAPE/VELOCITY/author/wildnature.html.

in a new century... [F]reedom is not America’s gift to the world, it is the Almighty God’s gift

to every man and woman in the world.”[569]

Drawing finer distinctions between schools of thought in IR scholarship must be put

off for now. The point here is to emphasize that Realists and Idealists share a firm belief that

interests are formed exogenously. From an existential perspective, these are clear cases of

false consciousness, indicative of the strategies that subjects use to shirk their subjective

responsibilities. When a subject’s interests are presumed to come so definitively from

outside the subject, the morally constitutive implications of subjectivity and intersubjectivity

can more comfortably be ignored. First, because this greatly simplifies the task of interest

calculation, so that other agents can be sorted by the poles of friend or foe, angel or evil-doer.

Second, because the exogents believe their choices are excused... caused from the outside.

If one’s position in the international system requires a certain behavior, or if the call of one’s

god requires a certain behavior, one imagines oneself as simply following rules and not

participating in making them. This provides the excuse by which it is possible to “play the

game” like a robot, regardless of the consequences, come what may. But from a

Constructivist perspective – and perhaps an ethical one – it is necessary to recognize that

playing the game is always constitutive, whether the speaker admits it or not. When nature

speaks – or when God speaks – a ventriloquist is doing the talking.[570]

There are reasons it may be incorrect to characterize the administration as Idealist and

therefore exogenously motivated. In a widely-reported statement, a “senior Bush advisor”

derided “the reality-based community” of journalists and experts who "believe that solutions

emerge from... judicious study of discernable reality." The advisor was particularly

dismissive of Enlightenment principles and empiricism:


[571] Ron Suskind, "Without A Doubt: Faith, Certainty and the Presidency of George W. Bush," New York Times Magazine, October 17, 2004. Suskind agreed to withhold the advisor’s name from publication. Many informed observers have guessed that the source was Karl Rove.

That's not the way the world really works anymore. We're an empire now, and when we act, we create our own reality. And while you're studying that reality -- judiciously, as you will -- we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors... and you, all of you, will be left to just study what we do.[571]

To the extent that selvogents distrust the evidence of intersubjectivity, they may

consider the act of calculating and advancing a society’s interests (as Society, writ large) to

be an absurdity that only compounds absurdities. If one’s perception of one’s own interests

is not derived from an engaged discernment of reality, then it emerges full-blown from the

unconstrained wishes and whims of the agent. By rejecting participation in the negotiation

of interests, such agents satisfy themselves with a presumption that they can achieve a free-

standing subjectivity which risks no moral implications at all. The anonymity of the speaker

only serves to fortify that responsibility-shirking stance.

The jumble of inconsistent, disengaged, and contradictory positions expressed by

officers of the Bush Administration might explain this country’s recent setbacks and reduced

prospects in world affairs. But investigation of that proposition is out of scope. The goal here

is to inform a discussion of interests, calling on resonant examples when appropriate. To

borrow a phrase from Peirce, we seek interests which comport with reality so that our

demands can be satisfied, and our expectations confirmed.

In the next section I will briefly review how the underlying rationale for interests –

the sources of human demands and expectations – has been portrayed within various

metanarratives. That discussion will open the way to an examination of how it is that agents

advance their perceived interests by engaging in distinct sorts of speech acts.


c. A Short History of Interests and Motivations

i. Fear

Fear is a skilled practice, and a highly adaptive one. It conditions human organisms

to recognize potentially painful stimuli which threaten the integrity of their bodies. As

animals we are born with intuitive mechanisms that can immediately prompt withdrawal

from or attention to the causes of unexpected physical sensations. As we learn to associate

certain sights, sounds, smells, tastes, or touches with painful feelings, we build up

expectations about what kinds of sensations are actually threatening (or not), and what kinds

of responses are appropriate (or not). From an evolutionary perspective, given the stakes,

organisms which learn hyper-vigilance with regard to predacious agents tend to be more

adaptive than those which inculcate complacency. Fight or flight behaviors can also be

organized at the social level, where the skilled practice of fear has been an abiding interest

since the dawn of history. Thucydides (c.460-c.400 B.C.) considered fear to be the prime

mover of human activity, distinguishing three types. The most important of the three was

phobos, corresponding to an animal’s capacity for physical fear. In humans, that form of fear is said

to motivate a counteracting desire for security and safety (asphaleia). That desire, then, was

the primary driver of political activity. Two other drives corresponded with two forms of fear

not typical of animals. The second drive is expressed as the pursuit of honor, prestige or

glory (doxa). The third accounts for the pursuit of profit and wealth (kerdos, ophelia).

Thucydides’ famous and approving account of Pericles’ Funeral Oration describes how those

latter two drives would also serve to counteract fear.

Early in the Oration, Thucydides – speaking to us through Pericles – exalts Athenian

democracy and freedom as a way of life which allows citizens to advance by their own

capacity and merit, regardless of social standing or personal behaviors. While lauding the

manner in which Athenians eschew surveillance of each other, allowing wide forbearance

in lifestyle and personal conduct, he also wants to show that, “this tolerance in our private

lives does not make us lawless citizens.” His reasoning has become one of the Oration’s best-

known passages.


[572] Without access to the original Greek text at this time, I am presuming that Thucydides used the identical word at both locations.

Fear is our chief safeguard, teaching us to obey the magistrates and the laws – particularly those that protect the injured – whether they are on the statute book or belong to that code of unwritten laws that cannot be broken without disgrace.

Later on, Pericles has more to say about disgrace when he exhorts Athenians to take

responsibility for their economic fortunes: “We place the real disgrace not in the fact of

poverty but in the declining of the struggle against it.”

Disgrace – not just physical dread – is thus put forth as something to be feared. Two

distinct flavors of disgrace can be gleaned from the different contexts in which Pericles utters

the word.[572] There is the shameful sense of dishonor that can be felt most acutely when no

one is looking (precluding the need for direct surveillance), and there is the humiliating guilt

that can be felt more acutely the more people are looking (prompting one to greater industry).

The former may be shorthanded as the disgrace of perversion (however widely or narrowly

circumscribed), and the latter as the disgrace of a lazy citizen, or to give the idea a more

modern-sounding form – the disgrace of slackerhood.

Pericles presents and immediately resolves the “disgrace of perversion” problem in

order to uphold the nobility of the Athenian people as a whole. He wants to defend that

nobility against anyone – inside or outside their community – who might have accused the

Athenians that the relative social liberalism of their society reflected a fundamental moral

failing. His argument is that they are implicitly aware of an unwritten moral bottom line, and

so they respect it by the powers of their own fear-motivated self-discipline.

The “disgrace of slackerhood” is a qualitatively different concern. Not only does the

sentiment have a profoundly modern echo, urging self-help as a social good (since a rising

economic tide presumably lifted all boats as successfully two and a half millennia ago as it

does now), it also frames the reciprocal obligations of citizens and states. Thus it was considered

right and fair for the citizens of Athens to demand explicit effort from each other when there

was clear benefit to be gained. After all, they shared common interests. Since Athenian


[573] The Prince, XIX, “That One Should Avoid Being Despised and Hated.”

fortune depended in great part on the prizes won and defended by the citizen-soldiers of its

Hoplite armies, it is no surprise that the failure to struggle against poverty was deemed the

greatest, most “real” disgrace.

As a result of this innovation, splitting the practice of fear into three parts, the

Athenians clarified the difference between themselves and animals. Humans could fear not

only the whip, but also the gaze of the Gods, and the gaze of one’s fellow citizens.

Parenthetically, inspired by the Hoplite model, the early Romans called their

spearmen Quirites, a term which gave rise to words as diverse as virile and cry. The

connection to virility is clear enough. The other derivation is rooted in the notion that a

Quirite was a citizen who, by right of his contribution to the society, was entitled to make

demands of other citizens. Our modern notion of the verb cry retains that sense of speaking

out and making demands.

* * *

Writing nearly two millennia later, Niccolò Machiavelli (1469-1527) expressed

similar concerns. Like Thucydides, he chronicled the history of particular wars, documented

the successes and failures of political leaders, and fought for his community in its army. But

Machiavelli had a wider range of wars to study, and his project was more brazenly

prescriptive. He was not especially concerned with categorizing the essential motivators of

human action, and there is no plain description of any related three-part system in his

analysis. But there are some hints of a scheme. In The Prince, when laying out the prospects

for creating a Civil Principality, he characterized the “opposing humours” and “contrary

appetites” in the community’s population. The nobles were motivated by “greed and cruelty;”

the people “loved peace.”[573]

Machiavelli considered the habits of peace to be in people’s interest, if not always in

their blood. The challenge, therefore, was getting members of the community to advance those

interests, despite their self-sabotaging impulses. The answer, as for Thucydides, required a

godly gaze. Denounced by a variety of religious leaders for epitomizing the maxim that “the


[574] Machiavelli never stated the maxim as simply as that, but it is ascribed to him due to statements like these: “Therefore it is unnecessary for a prince to have all the good qualities I have enumerated, but it is very necessary to appear to have them. And I shall dare to say this also, that to have them and always to observe them is injurious, and that to appear to have them is useful...” The Prince XVIII, “Concerning the Way Princes Should Keep Faith.”

[575] Discourses, XI, “Of the Religions of the Romans.”

ends justify the means,”[574] Machiavelli was nevertheless morally minded, desiring the good

fortune of the community. But he believed that end would have to be obtained through

trickery, a necessary evil. “Those princes who have done great things have held good faith

of little account, and have known how to circumvent the intellect of men by craft.”

In Discourses Machiavelli reflected on the durable success of the second King of

Rome, Numa Pompilius, who “finding a very ferocious people and wanting to reduce them

to civil obedience by the acts of peace, turned to religion.” Numa’s reign was marked by a

series of so-called (and well-timed) “miracles,” the founding of a new priesthood, a great

deal of public piety, and the establishment of elaborately-costumed ceremonial processions

that combined militaristic and vernal symbolism, kicking off the new year of a reformed solar

calendar. Machiavelli greatly admired the result. Rome’s citizens, he noted, “feared more the

breaking of an oath than the laws.”

And whoever considers well Roman history will see how much Religion served in commanding the armies, in reuniting the plebs, both in keeping men good, and in making the wicked ashamed.[575]

That strategic cultivation of supernatural fear was profoundly effective, Machiavelli

believed, motivating desires to infuse religious norms throughout Roman society. It

prompted the writing of “good ordinances” which in turn facilitated “the happy success of

the enterprises.” Rome apparently fought no wars during Numa’s long reign. In effect,

Machiavelli was a proponent of socially engineered fear – the skilled practice of fear

mongering. Done the right way, he concluded, communities would be more likely to enjoy

peace and prosperity.


ii. Rationality

Two centuries later, Thomas Hobbes (1588-1679) distinguished three “causes of

quarrel,” each providing a reason for war. The urges of competition, diffidence, and glory

could in turn prompt wars of gain, safety, and reputation. Reversing the order of his

categories, there is a strong correspondence with Thucydides (whose work he translated).

The relations are shown in Table 4. The similarities linking Fear/Diffidence and

Safety/Security are self-evident, as are the confluences between Honor/Reputation and

Profit/Gain. Glory and Competition can be aligned with the two Disgraces if they are treated

as negations. Glory suggests a sense of immortality or perpetuity. Competition, a sense of

immediacy or day to day effort. To lose at either (leading to, say, infamy or impoverishment)

can be a kind of disgrace, whereas victory at either would indicate a kind of grace.

Like his predecessors, Hobbes was thinking through the practice of fear. His

approach, however, reflected a qualitatively higher degree of formality. Thucydides and

Machiavelli wrote as historians, and were far less systematic. Though Thucydides sought

to provide “a true picture of the events which have happened, and of like events which may

be expected to happen hereafter in the order of human things,” the best technique available

to him was narrative, forcing him into a chronological order of exposition. Machiavelli’s

immediate motivations had more to do with proving his bona fides and friendly wishes to

potential employers as he bid for jobs as a political advisor. He wrote for a private audience

rather than public consumption or scholarly peers, and his most important works became

available only after his death.

Hobbes, on the other hand, emulated the methodical, inquiring spirit of Galileo,

whom he had met in 1636. Like Galileo, Hobbes presented his results in a form intended to

be accessible to the widest possible audience. His exposition began with a careful discussion

of “Sense” and “Imagination,” addressing basic questions about physical reality, perception

and cognition. His willingness to engage in abstraction and undertake thought problems

carried him to his famous premise about the emergence of human society from a state of

nature, where life was “nasty, brutish and short.” And that provided grounds for a critical

observation. Given the fearsome consequences of endless quarrel, he reasoned, humans must


select sovereigns to whom they will give up certain rights in order to guarantee their own

long-term survival (though the right of self-defense against immediate threats is retained).

Table 4. Triadic Constructions of Motivations and Interests

Thucydides/Pericles | Immediate motivator (to be averted) | Disgrace of perversion | Fear (phobos) | Disgrace of failed citizenship
Thucydides | Positive motivator (to be pursued) | Honor (doxa) | Safety/Security (asphaleia) | Profit (kerdos)
Machiavelli | Groups/Humours | People/Peace | Army/Cruelty | Nobles/Greed
Hobbes | Causes of Quarrel | Glory | Diffidence | Competition
Hobbes | Motives of War | Reputation | Safety | Gain
Morgenthau | Sources of Power | Alliances | Military | Wealth
Osgood | National Self-Interest | Prestige | Survival | Self-sufficiency
Maslow | Needs | Esteem/Growth | Biological/Safety | Love
Haidt | Moral Foundations | Purity/Ingroup | Aversion to Harm | Reciprocity
Lasswell | Ends | Deference | Safety | Income
Habermas | Cognitive Interests | Emancipatory | Technical | Practical
Onuf | Immediate Ends | Standing | Security | Wealth
Onuf | Sources of Conduct | Shame | Dread | Guilt
Onuf | Rule Categories | Instruction-rules | Directive-rules | Commitment-rules
Onuf | [Abbreviated Heading] | Existence | Material Control | Discretionary Endeavor

In other words, for Hobbes, the skilled practice of fear among communities of men

creates states. Because Hobbes was interested in the fear people have of each other,

regardless of their supernatural presuppositions, his reasoning signaled a decisive break from

the prevailing Thomist and classicist theories of war and political order. Its moral

implications reached beyond Christendom to all humanity. Seeking to emulate Galileo’s

inductive logic, Hobbes maintained that the study of nature should stand on its own terms,

independently of religious text. “[W]isdom,” he wrote in Leviathan’s Introduction, “is

acquired, not by reading of books, but of men.” The first two of Leviathan’s four sections


developed a secular justification for the state – an “artificial man” – that is rightfully regarded

as a pivotal step forward in the development of formal political philosophy. Hobbes wasn’t

able to achieve Galileo’s standard of empirical experiment, but his universalizing approach

offered a grand narrative that anyone could incorporate as a mental edifice.

It is a separate question whether Hobbes’ conception of human behavior in the

lawless jungles of prehistory was finally corroborated. So is the issue of whether his

prescriptions are correct. The point is that his interest in the skilled practice of fear was

exceeded by his interest in the skilled practice of reason. Hobbes formulated his ideas in

terms meant to allow the possibility of challenge, corroboration and refinement. Like his

colleagues in the vanguard of the Enlightenment project (and at some risk, despite his well-

known personal timidity), Hobbes transgressed the grand narratives of the prevailing order

for the sake of developing a clear statement of human interests. If we can know what we

might reasonably expect from our sensible interactions with reality, we can better assess the

plausible outcomes of any demands we might make. There is true glory in that, in the sense

that a correct description of reality, delivered gracefully, will be preserved and retold by

continuing generations of speakers and hearers.

* * *

The skilled practice of reason leapt ahead during the Enlightenment, fostering grand

narratives that elevated knowledge and innovation as high virtues, making life safe and even

rewarding for scientists and skeptics. Rates of technological advancement accelerated,

bringing radical changes to material culture, which in turn fed a growing belief in the forward

march of human progress.

For the French sociologist and philosopher of science Auguste Comte (1798-1857),

progress was not only desirable, it was inevitable, driving civilization toward the ultimate

satisfaction of fundamental human wants and needs. Comte proposed a grand narrative of

grand narratives, describing three phases of human civilization – theological, metaphysical,

and positive. According to his sequence, human understanding regarding the source of

natural order started with belief in gods. It then advanced to belief in unseen forces, and

would finally settle on belief in only what can be measured.


Comte considered himself the herald of the new order, offering methods of sociology

and sciences of the human mind that would unleash the full force of positive truth. Nothing

could be more in keeping with the human interest than more knowledge, he believed,

because knowledge was providential. He also believed that the statics and dynamics of

human nature could be studied objectively, and thereby measured and manipulated as readily

as the inanimate objects of nature. Reliance on his positive methodology was key to this, of

course. Once nature’s truths were discovered, civilization could be entirely reorganized,

subject to the efficient directives of science-driven economics, planning, and engineering.

For a time, Comte was a close colleague of another prophet of progress, the French

Utopian Henri de Saint-Simon. In his crazier moments (and he had many), Comte imagined

himself as the head of a new Positive religion. He even took the time to identify a pantheon

of secular saints. Yet, despite his personal foibles, Comte was highly regarded. He left an

enduring influence, especially in that his deterministic sentiments were given a dialectical

spin and recycled into Marx’s historical materialism.

Exogenously-flavored grand narratives like Comte’s proliferated during the industrial

age, and continue to do so today. Such Utopianism offers an inevitably optimistic sense of

destiny. However, by insisting that the source of rule is fully external to an individual’s own

agency, exogenous conceptions of collective rationality stifle the expression of choice.

Freedom, at best, would be the known necessity. The consequence of the exogenous take on

the skilled practice of rationality was to enshrine the validity of world views in which the

primitive urges of agents were subordinated to the grander forces of rationality.

iii. Reflexivity

Given our need to plan and predict, we have a vital interest in knowing whether

reality is comprehensible, and if so, how so. Is it rationally based? Can it be apprehended out

of perfect shapes and ratios? Is it built upon simple atomic integers that all divide evenly into

each other to express the functions we must unavoidably live by? Or, even if those functions

aren’t so nicely rational, are they at least discernable and open to corroboration, so that we

might hope to get a handle on how things fold together?


[576] For a possibly dissenting view, see Alan D. Sokal, “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity,” Social Text #46/47, pp. 217-252 (spring/summer 1996). Please note: If this citation is unfamiliar, also see Sokal’s followup, “A Physicist Experiments With Cultural Studies,” http://www.physics.nyu.edu/~as2/lingua_franca_v4/lingua_franca_v4.html.

These are big, persistent questions, and, fortunately, there have been many successes

in pursuit of the answer. The π of Euclid and the G of Newton, conceived as constant and

universal, are still taken to be so despite their ineluctable historicity.[576] Which is not
surprising. A grand narrative worth its salt would very likely include some grand elements

in its semantics, syntax, and pragmatics. Expressions perceived as constants and universals

remain fixed in our mental and cultural edifices only in part because we uphold them. Such

expressions have had countless opportunities to fail, and have held up nonetheless. What we

come to know reverberates more resonantly with what can be known.

A full account of π and G cannot leave out Euclid and Newton, key participants in

the creation of human culture. Now, it’s fair to say that reality folds in accordance with

perceivable regularities so that, even if the denotations π and G are arbitrary, their

connotations are not. But our most complete understanding of π and G will always carry the

denoters’ fingerprints. Their findings were the result of deliberate effort and serious

reflection, providing as strong an example as any of how our apprehension of brute reality

is changed by the intervention of thinking people. By offering up highly formalized rules to

serve as resources by which all others may comprehend the world’s perceivable regularities,

they made it possible for learning to occur. These discoverers – as speakers – enabled hearers

to advance their skilled practice of rationality.

Because the acquisition of information about the world tends to expand the number

of objects in one’s consciousness, and therefore the number of things about which someone

can have intentions, we can conclude that learning augments identity. Acquiring the skilled

practices of rationality through social learning enhances the skilled practice of being a

subjective knowing agent. Thus, cultural edifices transmitted by speakers influence the

mental edifices of hearers.

The acceptance of a new state of affairs – say, the acquisition of a new fact – builds

up cultural edifices by fostering an identity of interests among hearers... at least insofar as


[577] See Giddens, The Consequences of Modernity and Reading Guide to: Beck, U., Giddens, A., Turner, B., Robertson, R., and Swanson, G. (1992) ‘Review Symposium: Anthony Giddens on Modernity’, in Theory, Culture and Society, 9, 2: 141-75, particularly Turner.

[578] Onuf (1987: 290-4).

they possess intersecting stores of knowledge about particular facts and intention-taking

objects. What hearers may want to do with that new knowledge could differ vastly. From

this perspective, interests are neither exogenous nor selvogenous, but codogenous... the result

of collaborative intentionality. The question of how agents discriminate between which new

rules to inculcate or reject – in other words, how people decide what may be worth learning,

and which interests are worth pursuing – will be picked up again when we return to the

overarching problem of how people account for the proper source of rule.

* * *

I described earlier how “playing the game” by following existing rules is a

reproductive social act in which agents simultaneously reconstitute the social structure and

themselves as competent agents within it. Reflecting about life risks particularly radical

changes if it involves looking for ways to express ideas about constants and universals that

no one has yet expressed. The suggestion of new rules (or the suggestion that existing rules

are flawed) stands as a proposal for a new game.

Believing their concerns to be the most generalized among all the academic

disciplines, sociologists and philosophers credit their efforts as providing a major source of

reflexive change.[577] That would be easy to evaluate. It is an easy bet that far more people in

this world are familiar with pi than positivism. Nevertheless, as a constructivist engaged in

the skilled practice of reflexivity, I propose to add my own findings into the mix of what is

taught and learned. (The effectiveness of my practice is left for others to decide.)

Table 4, introduced earlier, draws liberally from World of Our Making, including an
appended table Onuf labeled “Faculties of Experience.” He also called it the “Synoptic
Table.”[578] The arrangement of columns and rows in my version suggests common threads of

thought running through the works of several important theorists. The perspective will also

serve to show how those threads might extend into disciplines beyond political theory.


[579] Ibid.

Frankly, teasing out the two disgraces from Pericles’ Funeral Oration was meant to resonate

with the metaphors that Onuf chose for the row labeled Sources of Conduct. (The row labels

are my own, based on my interpretation of his text. There were no row labels in his version;
the rows were to be treated as “paradigms of experience.” He said his categories were to be
treated as unnamed sets, but his top row seemed to perform as an overarching header. I
include key phrases from those header cells in the last row of Table 1.)

These schemes are all examples of social construction, and so is the aggregation.

Taken as a whole, across the span of civilized time, they reflect a kind of groping for an

understanding of how we humans can know the world, and how we can make our way in it.

The common question can be restated this way. What discernable endowment do we have,

if any, that is the germinal element of human identity, and can serve as the proper basis for

a successful grand narrative? Some think fear. Others rationality. Hume started with

passions and appetites. Many, including Peirce, would say love. For Sartre, it is freedom.

In justifying his scheme, Onuf argues for the priority of sense experience. He adds

that the faculties of touching, seeing, and hearing (and just those three) “dominate social

practices universally:”

I suggest that our sensory experience of the world and of our bodily selves in that world, reinforced, by our appreciation of possible relations of in and out, wholes and parts, yield a universal set of three categories of reasoning, not to mention many other social practices. I call the three senses so reinforced “faculties of experience.”[579]

Though I agree that our sensory experience yields our social practices, that experience

implicates much more than simply seeing, hearing, and touching. The human capacity for

sensible apprehension of the world – one’s sense of acting and being acted upon in that world

– provides the grounds for the constitution of self-regarding intentional agents. But that

apprehension involves deeply embedded biological capacities, such as feeling, cognition, and

emotion. The capacity for social interaction, hence language, stems from those capacities.


The purpose of the upcoming section is to give a fuller account of the link between these

innate biological capacities and the processes of social construction.

d. Deploying Rules

i. The Scope of Parts and Wholes

In the preceding discussion I sketched out why it is that rules and interests count for

so much in Constructivist analysis. Namely, they reflect the skilled practices at the core of

the co-constitutive production and reproduction of human agency and social structure. In this

section I am primarily concerned with explaining how this framework can serve as a method

of social science. I will argue that the interrogation of human events can be aided by locating

the deployers of rules. Deployers are those whose actions originate the signals that produce

changes in mental and cultural edifices. Furthermore, in keeping with certain themes of the

triadic perspective introduced earlier, I will argue that there are three categories of rules, and

that these correspond with three classes of deployers: guides, gatekeepers and peers.

A key premise of this argument is that the construction of social reality can not occur

without concrete changes to material reality. This is so because the acts of speaking and

hearing, like any acts of skilled human performance, shape and animate the neuronal

coalitions that fire and form inside human brains. Structure, rules, and agency are thus

inextricably linked to physical reality via the embodied minds of living, meaty people whose

innate physical endowments provide the fundamental resources from which skilled social

performances ensue.

Social rules can not have an effect on the world unless they are deployed, and they

can not persist as such unless they are retained. The human organism – a product of natural

biological evolution – provides the capacity for both deployment and retention. Our ability

to conceptualize changes in the world as a relationship between parts and wholes has proven

to be one of the most remarkable adaptive mechanisms of our species, and a linchpin of our

culture.


* * *

People are keenly aware of how their physical bodies change over time, even if they

are not directly aware of how social construction often plays a part in causing those changes.

By age 3, children know that they are getting bigger, that their ages are increasing, and that

they can expect to keep growing. Consciousness of change intensifies after puberty, when

human physiology undergoes perceptibly rapid changes in both form and function.

By adolescence young people are well aware that they are learning. They are aware

that learning has utility, and that further learning may be demanded of them. We now know

that the human brain undergoes remarkable growth in form and function before the age of

five, continuing on in spurts through the late teens. That growth, which augments the

capacity to learn, is as certain as a healthy person’s blooming body and each year’s

passing birthday.

But the content of learning – from the barest element of memory to the most complex

sets of coordinated skills – reflects a different type of physical change. As learning occurs,

underlying arrangements in neuronal connections take shape. Such learning continues

through all the rest of conscious life... even to the hazy end, if only to recall and reproduce

memory, maintain mental triggers, or simply mark the procession of time. These are the

changes motivated by social construction.

Neuroplasticity supports retention of memory and acquired motor skills. This

endowment allows us to become aware of growth and change. It facilitates our

comprehension of temporal distinctions like before and after, as well as time-relative

qualitative distinctions such as younger and older, hungry or content, ignorant and wise. But

that capacity to conceive of transience is accompanied by (and perhaps subordinate to)

deeply-rooted feelings of persistence.

That is, we each have an innate awareness of a singular, self-regarding “I” who

possesses a constant, uninterrupted existence. By the same token, our brains contain

mapping mechanisms that facilitate conscious awareness of sensations that occur at different


[580] The phrase is adopted from Antonio Damasio, The Feeling of What Happens: Body and Emotion in the Making of Consciousness (1999).

[581] The proper translation is “We both step and do not step in the same rivers. We are and are not.”

[582] Onuf (1989: 98).

sites on our bodies. Each specific bodily sensation is perceived as a feeling that happens to

that same singular, self-regarding “I” and to it alone.[580]

These feelings of wholeness stem from genetic endowments that support our inborn

drive for self-preservation. Those endowments include epigenetic reactions to visual

sensations such as color and facial expression, as well as fear, thirst, hunger, disgust,

surprise, and other visceral responses to the environment. Those consequent feelings give

rise to emotional states that underpin our most basic capacities for reasoning.

This juxtaposition of transience and persistence reveals an old conundrum. Heraclitus

(c. 535-c. 475 B.C.) taught that no one can step in the same river twice.[581] The water flowing

past is not the same as before, and neither is the person. Everything changes, we can agree.

Yet we have no problem understanding the sentence, “Joe stepped in the Nile again.” An

intention to convey the idea that a persistently whole person named Joe has returned to a

persistently whole river named the Nile is easily understood. Heraclitus could argue that by

ignoring the non-persistence of things, both the speaker’s illocution and the hearer’s

perlocution reflect a false state of affairs. Our words indicate a wholeness that is far more

elusive than we generally realize. But the sloppiness permitted by language has great utility

for us. Those shared assumptions get us through the day. We have mastered the ability to go

on happily, believing in a neutral objectivity by which the world fills our descriptions. We

see Joe and we think Joe. We see the Nile and we think Nile. We hear the sentence and we

know what the speaker means. Communication works. Or so we would like to believe.

Ironically, as Heraclitus understood, what we actually do with
descriptive language is quite the opposite: We impose words onto lumped-up conceptions
of the world. One imagined state of affairs accounts for another.[582] Joe and the Nile reflect

a long series of associations that link up our ideas of What is a person?, What is a river?,


[583] Parenthetically, that may be why Zeno’s Paradox was such an enduring complaint against particularists such as Heraclitus and Pythagoras.

[584] The phrases are drawn from the opening comments by William B. Hurlbut at the Stanford University Conference, “Becoming Human: Brain, Mind and Emergence,” recorded March 21, 2003, http://www.meta-library.net/events/stanford-frame.html, and by Nancy Murphy at the Seattle Pacific University Conference, “Ethics, Values and Personhood in the 21st Century,” http://www.meta-library.net/evp-mind/consc-stframe.html.

What do I remember?, and What is this? We write our descriptions upon the world, and they

become part of it. It’s a convenient strategy, even if the words are necessarily arbitrary and

the lumpy conceptions are possibly misleading. When does Nile refer to a particular flowing

body of water in Egypt, and not the Greek word for a river valley? The answer relies on our

ability to share pragmatic states.

The tools of vocabulary and illocutionary power provide hooks that allow us to leap

the gap between poetic wholeness and prosaic distinctions, and to circumvent the potentially

bottomless conundrums which philosophers plumb to make their living.[583]

ii. From Neurons to Narratives

A key puzzle for philosophers and scientists is explaining how our physical ability

to sense the world links up with our mental ability to make sense of the world.
Neurophysiologists are learning to map the areas of the brain where specialized functions occur,

and they are learning to assess behaviors of perception and cognition. At the same time,

linguists have developed strong accounts of performative speech and social interaction. But

there is no settled account of the mechanisms of imprinting and consciousness which link

the two... where chemistry becomes consciousness, meat becomes mind, or biology becomes

biography.[584] Our drive to make sense of this is itself indicative of our drive to deal with the

conundrum of parts and wholes.

Animals having simple brain cortexes appear earlier on the evolutionary time scale

than animals with circulatory or respiratory systems. Planarian worms, considered to be the

descendants of the first bilaterally symmetrical cephalic creatures, possess a variety of

receptors including primitive eyes, integrated via a nerve net and cortex. Their nervous


systems provide for local reflexes, for coordinated, whole-body motor responses, and for the

ability to acquire conditioned responses from stimulus. In other words, planarians can learn.

They can be trained to turn left or right in controlled conditions, using food, flashes of light,

or puffs of air. It may even be proper to say that experimenters “speak” to planarian worms

in such circumstances, since the etymology of “condition” includes the root dict, meaning

speech, from which we also derive diction, addict, and predict.

This is not to say that planarian worms have objectifying or self-regarding mental

edifices. Even though they change the world as a result of eating food, digesting it, and

expelling waste, it would be incorrect to say those actions are motivated by intentional

mental states. The point here is that human-originated sign events can produce motor

responses in simple animals. Such sign events can also leverage those animals’ neuroplastic

characteristics, inducing them to store and recall simple information. (Interestingly,

planarians are able to regenerate heads from severed tails; the descendant worms retain the

trained behaviors of the antecedents, suggesting that learning involves the body as well as

the brain.) The ability to induce conditioned reflexes in more advanced animals has been

amply demonstrated, from Ivan Pavlov’s experiments with salivating dogs, to more

sophisticated human experiences with advertising and political propaganda. As the saying

goes, “Neurons that fire together, wire together.”

Conditioned reflexes are a relatively simple problem when compared to questions of

self and society, but that does not diminish the utility of the animal kingdom as a source of

insight and analogy. Though the rise of the Internet has sparked discussions about “hive”

cultures, it is generally considered more productive to focus on the behaviors of highly

convivial primate cousins like chimps, bonobos and baboons. They engage in tool-making,

self-medication, localized culture, in-group competition, empathy, and reciprocity. Those

activities reveal social forms that more than vaguely resemble our own. Chimps can

recognize themselves in mirrors and can even display rudimentary theories of mind, so that

one chimp can know it knows something that another chimp does not.

Robert Sapolsky, a neurobiologist who has also done extensive zoological field work

with African baboons, has written extensively about the similarities and differences he has


[585] Robert M. Sapolsky, “A Natural History of Peace,” Foreign Affairs, January/February 2006. http://www.foreignaffairs.org/20060101faessay85110/robert-m-sapolsky/a-natural-history-of-peace.html.

[586] Ibid. “Optimizing the fission-fusion interactions of hunter-gatherer networks is easy: cooperate within the band; schedule frequent joint hunts with the next band over; have occasional hunts with bands somewhat farther out; have a legend of a single shared hunt with a mythic band at the end of the earth. Optimizing the fission-fusion interactions in contemporary human networks is vastly harder, but the principles are the same.”

observed between animals and humans, arguing that insights gleaned from the experiences

of other species may help us improve our ability to manage stress and may even help us

develop better strategies for getting along in groups.[585] This has tremendous interdisciplinary

value. One of his most interesting findings builds on the accumulating evidence of cultural

plasticity and cultural transmission among some primate groups. The upshot is that a species

is not fated to live out a hard-coded genetic destiny as, say, “killer apes,” and that it may

therefore be possible to optimize interactions between distinct “bands” within a species,

raising the frequency of cooperative interactions and reducing the frequency of violent

ones.[586]

Sapolsky’s hypothesis about “A Natural History of Peace” deserves mention, at

least in passing, because it deals with important practical concerns and has rightfully drawn

a great deal of attention among IR scholars and professionals. His move represents a rather

high-order intervention in human society. Despite a poverty of gatekeeping authority, he has

leveraged a vast stock of propositional content with the aim of influencing behavioral

outcomes on a global scale.

The central concern of this discussion, however, is to better understand the human

capacity for sophisticated social intervention. To recall where we have been so far, that

capacity depends on the ability to form intentions based on a conscious assessment of one’s

interests – plausible demands and expectations consisting of beliefs about how the world’s

parts fit together as wholes, supplemented by knowledge that actions may have unintended

consequences, and that other individuals can be more or less conscious of their own interests.

Humans might have difficulty parsing that last sentence, but it is certain that chimps

and apes would never “get it” at all. Even if less sophisticated primates could be taught


[587] Max Planck Institute press release, “Brain Researchers Discover the Evolutionary Traces of Grammar,” Feb 17, 2006. http://www.mpg.de/english/illustrationsDocumentation/documentation/pressReleases/2006/pressRelease20060216/.

[588] Searle (2001: 124).

[589] Josef Parvizi, Gary W. Van Hoesen, Joseph Buckwalter, and Antonio Damasio, “Neural connections of the posteromedial cortex in the macaque,” Proceedings of the National Academy of Sciences of the United States of America, January 31, 2006, 103(5): 1563-8. http://www.pnas.org/cgi/content/full/103/5/1563.

enough vocabulary to associate hundreds or even thousands of signs with referents, they lack

the endowments necessary to deal with complex grammatical functions like subordinate

clauses and conditional tenses. Numerous studies have determined that such complex

functions are processed in phylogenetically newer parts of the brain that apes lack, including

Broca’s area.[587] These endowments figure in what Searle calls the “great gulf” between humans
and chimps, our exceptional “capacity to create, to recognize, and to act on desire-independent
reasons for action.”[588]

But just as differences are discernable, so are similarities. Humans and primates share

several phylogenetically older structures that have been implicated in language and

consciousness, including the frontal operculum, which humans engage during both simple

and complex grammatical processing, and the posteromedial cortex, which is active during

waking states and is especially active during episodes of reflective self-awareness.[589]

Much of the debate on whether other primates use language turns on whether they

actually create new sentences, or whether the best they can do is recombine acquired terms

in simplistic pairs or threes. Noam Chomsky, whose theory of generative grammar

revolutionized the field of linguistics in the late 1950s, insists that apes can not have true

linguistic competence because such competence depends on a speaker’s innate knowledge

of the rules of a language. Such knowledge enables human children to engage in

exceptionally rapid language acquisition and highly original sentence production by the age

of two. It enables adults to recognize that the sentence “Colorless green ideas sleep

furiously” is correctly formed, but full of nonsensical ontological violations. For Chomsky,

the capacity for linguistic competence is an exclusively human endowment tied to the

presence of a “language organ” in the brain and dependent on the healthy physical


[590] For more on the debate, see “Chimp Talk Debate: Is It Really Language?” New York Times, Science Section, C, p. 1, June 6, 1995. Also online at http://www.santafe.edu/~johnson/articles.chimp.html.

[591] Generativists may not accept use of the word “mind” in this context as a literal term, but it retains figurative utility.

[592] Stanley I. Greenspan and Stuart G. Shanker, The First Idea: How Symbols, Language and Intelligence Evolved From Our Primate Ancestors to Modern Humans. Da Capo Press, 2004: 6.

maturation of developing children. Broca’s area would presumably be part of the language

organ’s neuronal circuitry, but the complete system is yet to be identified.

A strong countervailing view to Chomsky’s generativism comes from a school of

thought called interactionism.[590] Proponents include Jerome Bruner, an eminent cognitive
psychologist, Michael Tomasello, a cognitive psychologist who is co-director of the Max
Planck Institute for Evolutionary Anthropology, Sue Savage-Rumbaugh, a primatologist at
Georgia State University’s Language Research Center, Stuart Shanker, a philosopher with

a deep interest in animal cognition, and Stanley Greenspan, a clinical psychiatrist with a

specialty in child development. Interactionists often argue that Chomsky’s focus on highly

abstract symbol-encoding systems raises the bar on the definition of language too high. They

criticize him for overlooking the Wittgensteinian insights that meaning is embedded in the

use of language, not in words themselves, and that language is embedded in social situations.

If one accepts that communication is enabled by mutual participation of agents in a

common interactive framework – that communication depends on playing “the game” in a

shared environment – many possibilities to conduct relationships and conversations are

opened up. What counts is the intention to make oneself understood in terms of the other

party’s intentions. Both primates and humans interact socially, after all, and both clearly

engage in emotional signaling that brings about changes in one another’s minds.[591] The study

of creative and reflective thinking should begin there, write Shankar and Greenspan:

[T]he cultural diffusion of these processes of emotional interaction throughout our evolutionary history constituted the primary engine for the development of our highest mental capacities.[592]


[593] Transcribed from the online interview, “A Talk on the Wild Side,” at http://paulagordon.com/shows/savage-rumbaugh/.

[594] Dorothy L. Cheney and Robert M. Seyfarth, How Monkeys See the World: Inside the Mind of Another Species. Chicago: University of Chicago Press, 1990.

[595] Andrew Carstairs-McCarthy, The Origins of Complex Language: An Inquiry into the Evolutionary Beginnings of Sentences, Syllables and Truth. Oxford, 1999. The quote is taken from an online abstract at http://psycprints.ecs.soton.ac.uk/archive/00000082/.

The interactionist concern with emotion is consonant with recent advances in the field

of neurophysiology, exemplified by the work of Damasio, LeDoux, Goleman, Sapolsky, and

Ekman, who stress feeling and emotion as the foundation of mind/body unity and intentional

thought. What matters for Savage-Rumbaugh, renowned for training the bonobos Kanzi and

Panbanisha to communicate via “lexigrams,” is recognizing that agents who have limited

vocabularies at their disposal can nevertheless act deliberately. Treating them as purposive

beings, she argues, will inspire an expansion of purposive communication.

When you speak to [bonobos]... as though they understand, they somehow come to understand. [It’s] how mind operates... what common sense is. Language is sort of the home of the mind. It’s how we trade pieces of mind back and forth, and how we learn to coordinate our behavior. We’re processing the whole implication of that context, so when I overlay some words on it, it makes sense.593

* * *

A simple example of proto-linguistic emotional signaling occurs in the alarm calls

of vervet monkeys, native to sub-Saharan Africa. Vervets make a variety of signals through

facial expressions, postures and sounds, including three distinct calls when particular

predators are seen. Vervets who hear the calls respond with distinctly appropriate behaviors.

Hearing the leopard alarm, a loud bark, they climb into trees and scan the ground. Hearing

the eagle alarm, a coughing sound, they hide in bushes and look up. Hearing the python

alarm, a chuttering sound, they stand on their hind legs and approach the snake, keeping it

in sight from a safe distance. The New Zealand-based linguist Andrew Carstairs-McCarthy594

considers the vervet alarms to be an example of nonsyntactical monocategorical

communication, arguing “it makes no sense to ask whether an individual call is nominal or

verbal.” Similarly, in a discussion of animal communication, Martin Nowak, a595


Martin A. Nowak, “Evolutionary Biology of Language,” Philosophical Transactions: Biological596

Sciences, 355(1403): 1615–1622. http://www3.isrl.uiuc.edu/~junwang4/langev/localcopy/pdf/nowak00evolutionaryBiology.pdf.

Terrence Deacon, The Symbolic Species: The Co-evolution of Language and the Brain 1997.597

Norton: New York. pp. 54-8.

mathematical biologist who specializes in evolutionary dynamics, describes nonsyntactic

words as signals that refer to “events” that are “whole situations” rather than distinguishable

denotations for objects and actions. Terrence Deacon, a biological anthropologist, suggests596

similarities between vervet alarms, holophrastic utterances (saying the names of things to

request them, typical during infant language acquisition), and our involuntary susceptibility

to the sound of laughter.597

Though the syntax is non-existent, the semantics are clear. Vervets sound their alarms

in life-threatening situations, impelling their band-mates to safety. It is a collaborative, and

perhaps even an altruistic act, given that hiding quietly and leaving the others exposed might

be a better survival strategy for an individual at a given moment. But somehow, either by

habit or evolution (since the warnings help protect kin), they are endowed with an identity

of interest. More importantly for this discussion, their communication is emotionally

motivated and emotionally effective, demonstrating a primitive-level skill in the practice of

fear.

Humans use nonsyntactical event speech just as effectively. Emotionally primed calls

and responses comply neatly with the rules of illocutionary and perlocutionary action. For

example, under battlefield conditions someone might yell out “Artillery!” in response to a

certain whistling sound overhead, fully expecting that all hearers would immediately take

cover against the incoming attack. Unlike the vervets, however, we can make words do

double duty, using them as alarms, or simply as names of things. The application of syntax

allows us to speak referentially without invoking the same urgent behavior. But, given the

proper circumstances of tone and context, when “Artillery!” functions to alarm, we skip the

step of evaluating the assertion to act immediately in accord with the implication of its truth.


That is, instead of pondering the validity of the speaker’s assertion of incoming artillery fire,

we move on cue to take cover.

From an evolutionary perspective, this capacity for a deep-rooted emotional response

to nonsyntactic communication is a highly adaptive survival mechanism. But it has

drawbacks. It is easily manipulated. False alarms can fool us. Given that awareness, we

consider it wise to ban people from shouting “Fire!” in crowded theaters. By the same

adaptive token of evolution, we are susceptible to false attractions. The animal kingdom has

many examples of predators that use camouflage to mislead their prey. One species of turtle

lures fish right up to its mouth with a tongue that looks like a tasty worm. Likewise, humans

can be gullible, and easily misled by nonsyntactic sexual, patriotic, and religious

iconography. Just as we try to be vigilant about false alarms, we try to be vigilant about false

advertising.

The origin of such vigilance in human evolution is very likely to be associated with

the invention of syntax itself. So, instead of responding to whole events, our hominid

ancestors learned to factor and account for the parts within, creating space in their minds for

talking about predators and attractions. Urgent emotional states were augmented by

descriptive ones. That space allows for intelligible distinctions between name and action,

presence and absence, and cause and effect. With those linguistic tools, it was then possible

to speak intentionally about objects and events, describing relations between things,

imagining futures, making plans for coordinated behaviors.

The foregoing inference about the advent of propositional speech is conjecture of

course, since we have no physical artifacts from which to trace the development of syntactic

language. But it is clear enough that the effort to build a rich vocabulary of description that

distinguishes parts of the world from the whole of the world has been a driving impulse of

human civilization. So has the effort to rearrange those descriptions and impose them back

onto the world to make new wholes, perhaps by changing our minds, perhaps by changing

material culture.

We recognize that effort in the skilled practice of science, a unique accomplishment

of our species. That term traces back, through the Latin scire (to know), to an ancient root for cutting and splitting. Later


That two-pronged approach is called “foundherentism.” See Susan Haack, Defending Science –598

Within Reason: Between Scientism and Cynicism. 2003. Prometheus Books: Amherst, New York.

Much of the following discussion about observed vervet behaviors is informed by an unattributed599

document titled “Social Relationships” found online at http://www.bio.davidson.edu/people/vecase/

behavior/Spring2003/PerezHeydrich/Social%20Relationships.htm.

meanings included decoding. The root is retained in scissor, schizoid, discern, and

consciousness. Contemporary vernacular uses of the word often suggest the notion of a body

of systematized knowledge. That conception closes the circle, making parts into wholes. (The

etymology of system, by the way, evokes things standing together.)

A few state-of-the-art metaphors for science clearly resonate with the notion of parts

and wholes. Philosopher Susan Haack, a specialist in the works of Peirce, describes a

federation of disciplines that combine an experientially-anchored method of inquiry with a

commitment to explanatory integration. She coined the term “foundherentism” to capture598

the idea, a term so jam-packed with meaning that it could only be accessible to a narrow

audience. Alternatively, George Lakoff, a linguist who now works as a political consultant,

invented the term “converging evidence,” a phrase that is much better suited to the

vernacular, yet faithful to the concept.

* * *

Vervets do not have science, but they intuitively know a thing or two about rules.

They engage in behaviors that indicate a keen appreciation of status relationships and a basic

appreciation of cause and effect. They are territorial, typically living in female-dominated

bands of 80 or so, in which higher-status vervets of both sexes are allowed priority in the

search for food and grooming arrangements. Each band defends its territory against others,599

but borders are porous because males leave their natal group after reaching maturity. A

process of testing, learning and social reordering ensues as a young male tries to break into

a rival band and work his way up the pecking order, gauging when the time is ripe for

competitive displays against other males and for bids to approach females.

If a male vervet seeks food, grooming, or sex above his station, or neglects to show

proper obedience at the proper time, he experiences visceral consequences. There could be


a shouting match, or even a bloody fight. He might triumph and then win extra grooming

within a higher-status clique, or he might lose and then be shunned. In either case, the young

male’s initiative demonstrates a vervet-level knowledge of the world and a vervet-level

intention to change that world by establishing a new social order in the group. The results of

a battle are imprinted within the mental edifices of the observers, who may then reorder their

behaviors in accord with their own vervet-level interests. Like many primates, vervets devote

much of their brain capacity to maintaining knowledge of who’s who.

What is it like to be a vervet? Moving up the status chain has innate value in vervet

society, establishing valuable alliances that support patterns of beneficial reciprocity with a

stronger caste of collaborators. This improves reproductive opportunities, making success

its own reward. As in human society, being part of the in-crowd offers protections. In vervet

society, life is safer at the center and harsher at the fringe. Presumably there are times when

weaker vervets might consider bids to advance rank, but can predict the likelihood of failure

and therefore withhold action in order to spare themselves the immediate brutal cost. They

put up with harassment and bullying rather than face utter catastrophe. From this capacity to

evaluate and negotiate status relationships it is possible to infer that vervets engage in a kind

of wishing... for dominance and for reciprocity.

Even the wish for dominance may ultimately be just a wish for the quality of

reciprocity that dominance can guarantee. Reciprocity is not necessarily fair and balanced.

Dominant vervets receive unrequited grooming by subordinates who seek the privilege of

association, just as humans often seek to bask in the halo of celebrities. From an observer’s

perspective, it is clear who the supplicants are.

* * *

Humans have turned wishing into a high art. Growing facility with syntax fostered

the development of narrative, both in one’s mind and in the local community. If one asks the

question, “What was it like to be a prehistoric hominid with an innate capacity for intentional

wishing and a rudimentary culture of syntactic language?” it is possible to make some

informed conjectures about the kinds of narratives that were likely to emerge. Etymology

provides several clues. The word wish stems from an ancient root for desire, which has


also given rise to the English words venerate, win, and Venus, and the Lithuanian “vyti”

meaning “to hunt.” (The origin of desire, by the way, is rather more poetic, building from

the idea of regretting the absence of a star, and suggesting, therefore, the seeking of

something that may not be attainable.)

Given the importance of cultivating beneficial alliances, weaker agents must use

reciprocal strategies to win immediate favors and long-term support from stronger agents.

Attempts to display dominance simply will not work. Thus, subordinates in primitive

hominid societies would show fealty to the dominants through behaviors that reinforced the

elevated status value of the dominants. Those behaviors would include the giving of gifts like

food, and perhaps even one’s children, and also the display of submissive, worshipful

ceremonies (worship stemming from “value shaping”). Displays of loyalty and veneration

tended to become more elaborate as human culture advanced, so that ornaments and other

valued items began to appear in arrays of gifts. One can surmise that such displays were

sometimes insincere or even deceitful, motivated simply by the supplicant’s cold calculation

of costs and benefits, but it seems more likely that the events were deeply emotional

experiences for those involved. Moreover, the ability of a dominant to read and reinforce the

intentional sincerity of a supplicant was probably a selective advantage.

As humans developed a cultural capacity to think about causes in the world,

distinguishing objects and actions from immediate events, they found ways to think beyond

the hostile intentions of predators, and beyond the more or less friendly intentions of

members in their group. There would have been flashes of desire for successful hunts, for

recovery from disease and catastrophe, for release from pain, and for comfort against the

elements. In other words, they must have wished for protection and safety against the ravages

of nature. The wish would be turned back upon nature itself, like a peace offering or a sign

of submission to a dominant member of one’s tribe. Given the primate intuitions of

reciprocity and alertness to agency – both sharpened to a skilled practice by the flowering of

syntactic culture – a strategy of venerating natural forces as animate beings would have been

likely to emerge.


The notion of the unseen see-er is stressed by Pascal Boyer (2001), Religion Explained: The600

Evolutionary Origins of Religious Thought.

For more on reciprocity with gods, cf. Karen Armstrong (2006), The Great Transformation: The601

Beginning of Our Religious Traditions.

Even if prehistoric language and consciousness provided no touchstone for the idea

of the supernatural, humans were well primed for some concept of intentional spirit behind

forces we now consider inanimate. Early humans already imputed intentionality to each

other, to predators, and to other seen living agents. It would only take one more step to

impute intentions to unseen spirits, be they hidden behind the clouds, living in the river, the

sun, or the moon, or hovering ethereally at the graves of deceased ancestors. The emotional

capacity for such belief was already there. But where spirits were concerned, the overt

expressions and internal feelings of veneration could be even more intense. Spirit agents

were potentially much more powerful than humans. They could see while not being seen.600

That combined set of assumptions – the existence of spirits and the virtues of

reciprocity – would have motivated gift-giving (perhaps through burned sacrifice or some

stone-age version of pennies in the fountain) and ceremonial events. Over time, it is601

reasonable to assume that the dominants of a clan – or else their approved delegates – would

have been retained as intercessors during ever more elaborate worship events. This would

have preserved and reinforced status relationships within the clan.

Objects of spiritual veneration are often assigned an iconography, through bone relics,

or through animal or human figurines, or even through drawings and symbols. Doing so

associates conceptions of unseen agency with more easily grounded mental edifices. What

needs to be underscored, however, is the felt sense of a relationship with an unseen agent

deemed capable of causing seen effects. Reciprocity can presumably “work” once one has

the belief that someone is “there” to reciprocate.

The emergence of beliefs in an afterlife (belief in the immortal persistence of one’s

ongoing internal narrative) would have augmented the reasons to value enduring alliances

with spirit powers. It is important to keep in mind, however, that a prime benefit would have


Paleontologist Stephen Jay Gould, in response to a question about potential communication with602

alien species who wouldn’t know our language, said, “We should play the Bach B Minor Mass and say, in as

many languages as we can, ‘This is the best we have ever done, and we would like you to hear it, and we’d like

to hear the best you have ever done.’” http://speakingoffaith.publicradio.org/programs/pelikan/transcript.shtml

been (and still is) to establish feelings of comfort and confidence within the mental edifices

of the believer.

The practice of interaction with imagined spirits takes many forms in modern society,

but is generally characterized by expressions of faith in heavenly-based supernatural agents.

The essential principle of exogenous belief, remember, is the notion that the rules are an

absolute factor of the environment, and that they therefore precede human agency. Religious

believers equate the source of rules with a divine ruler of the world, variously called “God,”

“Jehovah,” “Allah,” and many other less familiar names. By cultivating thoughts about a

supreme being with whom they can establish a reciprocal relationship, such believers have

a very precise sense of who makes the rules in the world. They know the answer to “Who’s

the boss?” To wish for a reciprocal relationship with a divine superbeing is to wish for

something quite grand indeed. That wish has inspired the soaring achievements of religious

art, music, and architecture... achievements that even staunch atheists will admit are the

greatest products of human culture.602

Religion is not the only style of exogenous practice. There are considerably more

modern and systematically non-deistic approaches, primarily associated with radical forms

of Darwinism and historical materialism. According to such views, the principles of natural

order are rationally discernable, unmercifully fixed, and fundamentally mindless. They are

therefore intrinsically immune to pleas for reciprocity (whether for mercy or good fortune),

either directly or through the intercession of a deity. The only option for human

empowerment is to know the rules and to submit totally to their demands, with no hope of help

from a field-tilting game master. In extreme cases, the apparent totality of the wish to know

the rules of natural order and submit to them seems to resonate with emotions akin to

religious veneration and worship. But the telltale difference is the absence of an animate

intentional reciprocating agent in the charter myth and its associated grand narrative.


Both styles of exogenous rule require that individuals play the “game” to win

according to its predefined standard of victory. People are allowed no chance of defining the

grounds of victory for themselves. Both styles advance the pretext that there is a self-

contained and completely integrated answer for why things are the way they are and how one

should act in light of that fact. For both, the natural order gives rise to moral order.

The difference, to reiterate, stems from polar conceptions of the source of rule. One

pole is associated with narratives of a named, intention-bearing ruler; the other, with an

abstract universal principle. This same polarity – animate ruler, inanimate principle – applies

to the conceptions of codogenous and selvogenous rule that I posited earlier. Two more

neologisms will sharpen the distinction. In the case of an animate ruler – presuming an

intention-bearing boss – the style is capkenic. In the case of an inanimate principle –

presuming no intention-bearing boss – the style is acapkenic. The words are meant to build

on roots that in turn suggest “known head” and “no known head.”

Given this formulation, varieties of religious absolutism would be categorized as

capkenic exogeny, and materialistic determinism as acapkenic exogeny. Given the possible

combinations, a matrix of six rule styles can be imagined. A preliminary mapping of familiar

ideologies and grand narratives appears in the following table. I also suggest vernacular

phrases that may help convey the appropriate meaning for each category without resorting

to the neologisms.
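The combinatorial claim here (two head styles crossed with three sources of rule, yielding six categories) can be sketched in a few lines of Python. This is purely illustrative and not part of the argument; the category names are simply the neologisms defined above.

```python
from itertools import product

# Illustrative sketch only: the six rule styles arise as the Cartesian
# product of the two dimensions posited in the text.
sources = ["exogenous", "codogenous", "selvogenous"]  # source of rule
heads = ["capkenic", "acapkenic"]  # intention-bearing boss, or none

matrix = [f"{head} {source}" for source, head in product(sources, heads)]

print(len(matrix))  # 6 distinct rule styles
for style in matrix:
    print(style)
```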


Table 5. Matrix of Rule Styles

Exogenous perspective (Absolutism)

Capkenic style: Supernatural king combines guiding and gatekeeping powers (usually through earthly intercessors). Weber: Routinized Charismatic Institutions.
Capkenic vernacular: Dogmatic Supernaturalism.
Capkenic expressions: Western Monotheism. North Korea. WWII Japan. Ancestor Worship.
Acapkenic style: Charismatic or entrenched leaders treated as faithful interpreters of the discernable laws of nature. They have privileged access as “knowers,” but make no supernatural claims.
Acapkenic vernacular: Deterministic Rationalism.
Acapkenic expressions: Plato’s Forms. Determinism. “Watchmaker” Deism. Utilitarianism. ‘Guiding Hand’ Capitalism. Historical Materialism. Social Darwinism. Hitlerism.

Codogenous perspective (Structurationism)

Capkenic style: Socially selected leaders achieving power through professional skill, charismatic appeal, strategic advantage, material leverage, etc.
Capkenic vernacular: Heroic Unionism.
Capkenic expressions: Caudillism. Nationalistic Fascisms. Primitive Democracy. Primitive Republics.
Acapkenic style: Process-oriented commitment to integration, fairness, and inclusion based on propositional knowledge.
Acapkenic vernacular: Process Constitutionalism.
Acapkenic expressions: Montesquieu-style, checks-and-balances government. Communicative Action. Cenoscopism (Peirce). Self-conscious dialecticism. Some bureaucratisms.

Selvogenous perspective (Relativism)

Capkenic style: Unconstrained Individualism; could claim alienated or hedonistic motivation as well as autogenic world-making power.
Capkenic vernacular: Self-powered Relativism.
Capkenic expressions: Postmodernism. Relativism. Modern nihilisms. Some Existentialisms. New Age Quantumism.
Acapkenic style: Disengagement and isolation may reach extremes through the experience of altered or “transcendent” brain states that mix high sensory acuity with functional disorientation.
Acapkenic vernacular: Selfless Disengagement.
Acapkenic expressions: Meditation. Self-Nihilism.

iii. Direction of Fit

To recap, the overarching problem is to justify a set of metaphors for deployers of

rules – guides, gatekeepers, and peers – that can explain the skilled practice of power within

a constructivist framework that takes rules as ontologically prior. The approach to the

solution motivated a discussion of certain categories of human interests which have been a


persistent concern of political theorists through the ages. This was done because I believe the

deployer metaphors can be linked to those categories at a deep level. Probing further into the

nature of interests, I accepted Onuf’s definition of interests as demands and expectations that

can plausibly be achieved, and I agreed with his related argument that sensory experience

yields social practices. But I disagreed with his assertion that the three universal categories

of reasoning (yet to be described) are each in turn founded in the experiences of seeing,

touching, and hearing.

In this chapter I began to set out a more comprehensive account of the natural

development of social practices, drawing from contemporary literature in neurolinguistics

and evolutionary psychology. An opportunistic aside refreshed an earlier discussion of

absolutist, structurationist, and relativist ideologies. But the core thread remains an inquiry

into the kinds of skills that people apply when they advance their interests.

The current chapter’s discussion has established that an adequate answer must be

attentive to how people interact within social environments framed by resort to emotional

force and syntactical nuance. In other words, the fields of neurophysiology and linguistics

have a lot to offer. I briefly described how generativists are deeply certain that human

cognition is fundamentally different from animal cognition, whereas interactionists spend a

great deal of effort looking for similarities. I won’t take a position in the dispute other than

to remark how interesting it is that professional linguists can have such a hard time talking

to each other.

Modern linguists – and philosophers also – have an abiding concern with what

makes the mind. A comprehensive discussion of interests needs to address that. Searle’s

work is particularly interesting, however, because his careful exposition of intentionality is

so attentive to what the mind can make. His basic conception of illocutionary action was

presented earlier. It will now be taken up in greater detail.

In The Foundations of Illocutionary Logic, Searle (with Daniel Vanderveken)

elaborated his overarching conception of illocutionary force to derive a structure of five

distinct sets of illocutionary points and forces: Assertives, Directives, Commissives,

Declaratives, and Expressives. He distinguished several components and properties,


Note the substitution of mind for world in later work. Searle (2001: 37).603

Searle (1985: 95).604

Onuf (1989: 90).605

Onuf (1989: 89).606

including “direction of fit,” which accounts for how the world brings about changes in the mind

(say, by seeing an object and thinking of a tree) or the mind brings about changes in the world

(by chopping down a tree). His nomenclature was somewhat confusing, however, in that the

direction of fit for changing the mind is called “words-to-world.” By keeping the same noun

order and adding some verbs to clarify the meaning, that might be “Make the words in your

mind match up with the world beyond it.” Conversely, “world-to-word” might be “Make603

the world outside your mind match up with the words inside it.” In an attempt to simplify

things, I will use active phrasing such as “Mind makes world.”

One of the key insights developed by Searle’s intellectual mentor, J. L. Austin, was

to draw an essential distinction between assertive utterances that simply state facts about

things (“Her name is Sue.”) and performative utterances that do things (“Her name shall be

Sue.”). Searle retained the performative concept in his own five-part scheme, but under the

label declaratives. These were described as having a two-way direction of fit, both words-to-

world and world-to-word, he wrote, “because the point is to bring about a change in the

world by representing the world as so changed.” Given the importance of this category in604

Searle’s scheme, it is intriguing that Onuf excludes it from his constructivist rendition of

speech acts, arguing that declarations are not by themselves regulative (rule producing)

because the regulatory effect depends either on the speaker’s authority or the hearer’s

assent. Depending on the case, therefore, Onuf believes the utterances would function as605

either directives or assertives.

Onuf also drops the category Searle labeled expressive because he believes that, like

declaratives, “speech acts in neither category can produce rules of their own.” In Searle’s606

scheme, expressives account for utterances reflecting “some psychological attitude about the

state of affairs represented by the propositional content.” They are presumed to have a null


Searle (1985: 38), (1979:15-16).607

Onuf (1989: 93).608

Onuf (1989: 93). Note the earlier formulation, “My promises confer duties on me that are609

simultaneously rights others claim I have conferred on them” (Onuf 1989: 88).

Searle (1985: 38).610

direction of fit because the truth of the proposition is presupposed. In other words, the607

speaker is not changing anything, but simply revealing an internal feeling of, say, attitude (“I

prefer to call her Suze.”), or gratitude (“Thank you, Sue.”).

Onuf takes issue with Searle for deducing that the same direction of fit applies to

directives, which are akin to orders and requests (“Please get me a beer, Sue”), and

commissives, akin to promises and pledges (“I will be true, Sue”). For Searle both are608

world-to-word; utterances of either kind initiate new states of affairs in the world. Onuf sees

commitments fitting the other way. While agreeing that they prompt action “directed toward

realization of that anticipated state of affairs,” what counts more for Onuf is that their words

are fit to the state of the world only after the commitment has been discharged (So, the act

of promising to be true entails, “You now have the right to demand I be true, Sue”). But609

Searle was straightforward enough. While conceding that having two categories with the

same direction of fit was “inelegant,” he was clear that both utterances concern a future

course of action... for the hearer in the case of directives, and for the speaker in the case of

commissives.610
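As a summary aid, the taxonomy just described can be captured in a small mapping. This is an illustrative sketch, not Searle’s own notation; the direction-of-fit labels follow the usage above, with “null” marking the presupposed truth of expressives.

```python
# Illustrative sketch of the five illocutionary points discussed above,
# keyed to their direction of fit as summarized in the text.
direction_of_fit = {
    "assertive":   "words-to-world",  # make the words match the world
    "directive":   "world-to-word",   # hearer makes the world match the words
    "commissive":  "world-to-word",   # speaker makes the world match the words
    "declarative": "two-way",         # both directions at once
    "expressive":  "null",            # truth of the proposition presupposed
}

# Searle conceded it was "inelegant" that two categories share a fit:
assert direction_of_fit["directive"] == direction_of_fit["commissive"]
```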

These competing ideas about direction of fit seem so torturously complex, I believe,

because their temporal schemes lack a plain link to underlying material reality. To simplify

the problem, I suggest it would be more productive to start from the nuts and bolts of

psychological states and mental edifices, focusing on the events that actually cue particular

material changes in a person’s neuronal correlates of consciousness. The virtue of this more

plainly-linked approach is that it would open the way to a better account of how signaling events in

the cultural sphere relate to mind, world, and performative speech.


This seems necessary first of all, in order to reincorporate expressive utterances into

the mind/world dynamic. Searle assigned them a null fit, but even if the presupposed truth

of the expression is unproblematic, it is nevertheless important to have an account of how

awareness of propositions enters the mind of hearers. After all, such awareness is likely to

have regulative effects on their conduct. A holophrastic alarm like “Artillery!” is more likely

to take syntactic form as the expression “I hear artillery incoming!” than as the directive

“Take cover.” Even if the expression ultimately entails the directive, the transcription occurs

after entry into the mind.

Expressions as simple as greetings make a difference, upholding or advancing the

status of speakers and hearers in a web of social relations. The utterance event is also

meaningful to any witnesses. Similar status effects occur with expressions of gratitude and

congratulations. Thus we can see the importance of an apology (the highest word) as an

expression which can restore a sense of wholeness in hearers who feel affronted. Expressions

of sympathy, wishes for comfort and healing could work the same way. Conversely, spoken

insults may undermine a hearer’s status, even quite egregiously in certain social settings. The

very carnality of the word insult (to leap in) evokes an image of something fitting itself into

the mind, however uncomfortably. In other words, expressions can play a significant role in

causing grievances and motivating efforts to redress them. They are therefore deeply tied up

with the regulation of conflict. Sincerity of expression becomes a factor in its truthfulness.

Since recognition of status is deeply implicated in one’s perception of the whole

reality, I would argue that expressions provide the same class of cues as assertions. Searle

evidently kept them separate to capture what he considered to be an important distinction

between the presupposed truths of expressives, and the contingent, testable flavor of

assertives. Yet both factor in the mind as truth claims. Conflating expressions and assertions

this way permits the constructivist triad to account for the whole variety of truth claims that

can be cued into the minds of hearers, forming their stocks of beliefs. These would include

posture, regard, gaze, background conditions (which Searle sometimes calls “preparatory conditions”),[611] grandstanding, and things not said. (This conflation is also intended to provide grounds for a link to Peirce’s semiotic categories in future work.)

[611] Searle occasionally cites Paul Grice, another Berkeley-based linguist, who wrote extensively about the concept of what he called “implicature” resulting from the background conditions of conversations. Like Searle, Grice was quite concerned with a speaker’s intentionality, but was more expansive, developing a philosophy of meaning that stressed a speaker’s ability to induce a hearer’s recognition of that intentionality.

[612] Onuf (1989: 90).

[613] Parenthetically, the desire to avoid exposure to such risks and obligations may help explain why US Presidents have avoided formal declarations of war for over half a century.

Onuf eliminates Searle’s performative category through conflation, so that “firing

someone is directive... declaring war is assertive, and bidding at auction is commissive.”[612]

It is arguable whether declarations of war deserve categorization as performatives in the first

place, however, since they might accord better with Searle’s notion of expressions. The

primary effect of such speech is the rearrangement of beliefs in the minds of hearers,

supporting emotional states (apparently by encoding neuronal coalitions between the limbic

system and memory stores in the hippocampus and other areas of the neocortex) that can

influence future preference ordering. As a matter of war-making, however, such a declaration

by itself would not motivate any more immediate consequences than a declaration of

“Constructivism Appreciation Day.” A rather different train of directive speech acts is

necessary to set soldiers marching. In that case, what matters would be the commands given

by superiors to subordinates. So would the statement of a formal delegation of powers, as in

US Congressional resolutions that authorize Presidents to use military force.

Declarations of war also have a commissive aspect. Parties who make the declaration

presumably bind themselves and the people under their command to whatever international

laws may govern the conduct of war.[613] Indeed, given the presuppositions that human beings

have free choice, and that all human communication is subject to attenuation, it is possible

to argue that every human speech act begins as a commissive. This is because any utterance

which comes to be taken as an assertion or as a command, at least when not given

anonymously, obliges the speaker to a future course of action... accepting responsibility for

having made the statement. Moreover, there is always some risk that hearers will ignore the


[614] Searle (1985: 12).

[615] Other differences beyond requests, but not discussed here at length, would include warnings.

[616] Searle (1985: 220).

statement, misunderstand it, or that the signal will simply fail to get through at all. Such

possibilities are typically inconsequential enough to be ignored, allowing analysts to focus

on substantive intent.

Attempts to persuade are necessarily risk-laden endeavors. As Searle notes, “There

could not... be a performative expression ‘I hereby persuade you.’ because there is no way

a conventional performance could guarantee that you are persuaded.”[614] Following orders is

quite different. There is no great persuasive skill involved in getting an employee to carry out

an employer’s legitimate command. Directive gatekeeping power provides a guarantee of

material and social results that assertive guiding and commissive peering power cannot.

Knowing the rules of semantics, syntax, and pragmatics appropriate to a social

situation, speakers and hearers are able to “play the game” competently. The social

construction of language as a skilled practice reinforces background conditions that serve to

manage the ever-present risk of attenuation, clarifying the relationships of truth-claim

makers, superiors and subordinates, as well as bidders and petitioners in official settings.

* * *

There is at least one more significant divergence between my edifice-linked approach

and Searle’s classic scheme. Searle considers utterances such as applications, prayers,[615]

questions, and requests to be speech acts that ultimately entail directives.[616] This is reasonable, presuming the hearer is predisposed to assent. At the moment a speaker utters a

competent directive phrased as a request (“Please get me a beer, Sue.”), the hearer’s future

course of action is already presupposed in the speaker’s mind (“Sue is obliged under the

obtaining background conditions to get me a beer, and she will do it.”). That knowledge

enters the hearer’s mind in a consequent instant (“The request was polite and reasonable, and

I’m in a mood to please, so I’ll do it.”). The same sequence would apply in stronger forms

of directives, including commands issued in highly formal settings.


[617] Onuf, in Germain and Kennedy (2005: 59).

[618] Searle (1985: 103-5).

However, though some requests share the same illocutionary force and intent of

directives, this is not true of all cases. A prayer (“Please God, let me find my keys right

now.”) does not necessarily bind the hearer. As with any commissive, the future course of

action for the speaker is subject to the hearer’s action. The example may seem facetious, but

it is clearly a case of wishing, much like petitions, bids at auction, bargaining in a market,

political negotiations, the floating of trial balloons, and all kinds of material tests in which

speakers make requests or proposals for action where the outcomes are highly contingent on

the hearer’s response.

These are not the classic sort of commissives in which speakers make unilateral

pledges by which they agree to be bound by hearers. Nor are they “side by side declarations”

– such as contracts, treaties, and marriages – that often characterize commissives as

voluntary associations among presumed equals.[617] Nevertheless, my apparent divergence

with Searle on the issue of requests has a degree of compatibility with other elements of his

approach. For example, his scheme takes three psychological states into account – beliefs,

intents, and desires. The direction of fit he suggests for each psychological state corresponds

in turn with each of the speech act categories: assertives, directives, and commissives.[618]

* * *

If the argument seems to be taking on a particularly arcane and tedious form, I believe

the effort is justified in order to substantiate the use of guides, gatekeepers, and peers as

appropriate metaphors for the deployment of power. Those metaphors were selected, in part,

because they have accessible meanings and can stand by themselves. Thus, they were able

to sustain some of the narrative threads in the historical part of this dissertation. Still, there

is much more to them than simple resonance in the modern vernacular. In the upcoming

section I will begin to address that deeper aspect of the metaphors more directly. The table

on the following page shows the differences and similarities of the schemes discussed in this

section and the next.


Table 6 Searle’s Five Illocutionary Points and Constructivist Derivations

Category: Assertive | Directive | Commissive | Declarative | Expressive

Direction of Fit: Word to world | World to word | World to word | Word to world & world to word | Null fit

Examples: Suggest, hypothesize, state | Order, insist | Promise, swear, pledge | Fire, appoint, resign, christen, marry, name | Thank, apologize, congratulate, deplore, welcome

Comment: Testable as either true or false | Treats requests as a special case | Admits that having two categories with the same direction of fit is “inelegant” (1979: 15) | Derived from Austin’s performatives; an essential category of speech act theory | No direction of fit because truth is presupposed (1979: 15-16)

Psychological states (1985: 35, 103-4): Belief | Intention | Desire

Onuf: Assertives | Directives | Commissives | (as Assertives or Directives) | (dropped)

Intent: Wholes to wholes | Wholes to parts | Parts to wholes

Form of Rule: Hegemony | Hierarchy | Heteronomy

Interest: Standing | Security | Wealth

Simon: Assertives | Directives | Commissives | (as Directives) | (as Assertives)

Cue: Mind makes mind | Speech makes world makes mind | Mind makes speech

Feeling: Knowing | Doing | Wishing

Examples: Expressions, nouns, identifiers, propositions, descriptions, phrases, guesses, ontologies | Declarations, orders, deductions, promulgations | Proposals, negotiations, tests, questions

Deployers: Guides | Gatekeepers | Peers

Interventions: Claims | Executions | Bids


[619] 1 Kings 3:28.

iv. Guides, Gatekeepers, and Peers

One of the Old Testament’s most celebrated legends tells of two “harlots” who

brought an infant child before King Solomon, each claiming the child was hers. Unable to

determine who was telling the truth, Solomon decided that the living child should be divided

by a sword and each woman given a half. Terrified by this prospect, one of the women

intervened, pleading that the baby be spared and given to the other. Solomon saw the truth,

awarded the child to the rightful mother, and secured his reputation in Western lore. “And all

Israel heard of the judgment which the king had judged; and they feared the king; for they

saw that the wisdom of God was in him, to do judgment.”[619]

The tale propagates the values of wholeness and integrity by recapitulating them. The

woman who acted to save the child is instantly recognizable as honorable. The judge who

wisely preserved wholeness (even through devious pretense) is deemed great and wise for

simultaneously executing justice and for demonstrating how honorable people act in

harmony with honorable principles.

Modern technology offers far better techniques for determining parenthood, which

is all to the good because Solomon’s sword trick could only work once. A more persistent

conundrum is how to split differences and achieve fair distributions. This has given rise to

the cupcake metaphor, which is about how to maximize the happiness of two children

who must share a single treat. Give one the responsibility to cut it, knowing that the other

will get first pick. The division will be performed very carefully.

The second story propagates the value of utility maximization by recapitulating it. We

may laugh, recognizing how our innate, greedy urges can be cleverly turned around. We may

also regret that this promising metaphor has such limited applicability to real-life situations.

Philosophers and economists have offered more sophisticated versions. In a famous thought

experiment the political philosopher John Rawls suggested imagining that, before you were

born, you could establish the principles of social relations at work in the world, knowing that


[620] A more complete rendition describes starting out from an “original position” in which rational actors confront a “veil of ignorance” about the specific consequences of their choices.

you could be born as anyone, anywhere. If so, you would presumably want the world to be

constituted as a place with a very fair distribution of benefits, rights, and opportunities.[620]

Despite the irony that one of modernity’s most celebrated morality tales makes more

of a resort to supernaturalism than the Bible’s, what all these stories have in common is that

their underlying principles are conceived as abstractions pertinent to the entire human

community, past, present, and future. Even more important is that they matter when live

bodies experience the consequences. Does the child live? Does the mother keep her baby?

Who gets enough cupcake, or water, or access to jobs, education, and health care? Values,

naturally conceptualized as wholes, play out as particulars.

As I put it earlier, creeds make deeds. If the creed is not actualized, it does not matter

outside of the mind. Abstract claims about values are rules that people can break. But if the

creed is honored and motivates action, the consequent deed matters very much, especially

for those experiencing the sharp side of the sword. Executed actions are the rules that can

break people.

Agents who speak about whole truths, at least in their claimed area of expertise, are

guides. Agents who execute the consequences, at least within some chain of command, are

gatekeepers. The two agencies often overlap. Solomon was glorified for his wisdom (making

him a guide), but feared for his judgment (making him a gatekeeper). The swordsman

ordered to do the deceptively framed deed of distributive justice was a gatekeeper as well.

The disclosure of future consequences was (seemingly, for a moment) in his hands. But what

if the swordsman had spoken before the mother, shirking his obligation to Solomon and

refusing to slay the child? By upholding the same values of wholeness and integrity that

motivate the tale, his deed of refusal would have marked him as both guide and gatekeeper

of those higher principles. His deed, not Solomon’s, would have recapitulated the creed. He

would have been the hero, not Solomon. But in either case, the value would have been

reinforced. The example and counterexample underscore the moral implications of

structuration. The execution of a rule by a speaker has effects beyond a performative


[621] Onuf (1989: 228-257, 291).

outcome. It also helps reconstitute the value-bearing competencies of the speaker and hearer

within a rule-based context.

The swordsman was not the hero, however. He is portrayed as a robotic follower of

imposed rules, the able slave of spoken orders, blamelessly incapable of using his mind to

decide anything for himself. Many gatekeepers welcome this monolog role, its simplistic

competency, and the isolation from responsibility it seems to promise.

Yet responsibility is inescapable. Humans are all each other’s peers, including kings

and harlots and kids with cupcakes. They try things out and are subject to the consequences

of doing so. Not knowing the truth, Solomon had to hunt for it. Setting his trap, he imposed

a future obligation on himself to act accordingly as soon as the truth was revealed. Two

contentious women had presented themselves to him, pleading their cases, and the true

mother dared to renegotiate the outcome. Two children carve out a distribution of what they

are about to eat. Rawls challenged his readers to imagine they will forever inhabit the world

their choices bring about. Bids to craft the future are the rules people dare to make.

* * *

The task of this section is to consider how power can be deployed via each of the three

categories of constructivist-styled speech acts laid out earlier. To perform an assertive

illocution is to deploy power as a guide. To perform a directive is to deploy power as a

gatekeeper. And to perform a commissive is to deploy power as a peer. This approach is not

entirely novel for constructivism. Onuf proposed a correspondence between various sets of

speech acts, skill groups, and the types of rules that characterize various sorts of social and

political structures.[621] As he saw it, the work of priests, professors, and propagandists

reflected assertive rules. The doings of warriors and diplomats reflected directive rules. The

efforts of physicians, merchants, mothers, hunters, and farmers reflected commissive rules.

To label priests and professors as guides seems straightforward enough, as they

certainly speak to the whole situation within their areas of expertise. Admittedly, there are

various ways in which they act as gatekeepers. Priests perform as intercessors for those who


believe they must undergo certain ritual experiences in order to qualify for entry through the

“pearly gates” of heaven. Many religious figures also believe themselves to be anointed as

supernatural channels used by God in the transfer of healing remedies, beneficial actions,

judgments, and so on. Professors give grades and are thus gatekeepers of a student’s

credentials.

The term propagandist conveys the least diluted sense of the deployment of power

within this category. Guide is intended to have about the same meaning, but without the

pejorative tone. Guides do effect change in material reality to the extent that their words and

symbols cause some reconfiguration to the neural arrangements of hearers, but their power

over another’s body ends there. Guides work to embed thoughts in minds. The consequence

may quite effectively seem to cage and constrain people, but the hearer’s consent is an

essential factor. Religious leaders preach to choirs, and professors speak via the presumed

authority of epistemic communities. Both may occasionally act as gatekeepers, prescribing

immediate actions that a few hearers follow more or less uncritically, but their ability to

instill values across a wide population is the key to their power. As master propagandists,

they engage in linguistic performances that build up a hearer’s creedal conception of what

constitutes the whole. Such conceptions will resonate at particular moments, stimulated by

both conscious and unconscious association with particular circumstances, organizing the

preferences by which belief informs action.

Onuf’s inclusion of warriors and especially diplomats in the category I use for

gatekeepers seems counterintuitive. Both jobs involve high degrees of risk and negotiation.

Many warriors think of themselves as hunters and often also as the hunted. Clue finding is

an essential skill for them, as it is for diplomats. An aphorism attributed to the British

theologian Brooke Foss Westcott is that a diplomat is someone who can bring home the

bacon without spilling the beans.

But Onuf’s grouping seems intended to make a point about states engaged in a

political Realist’s world of zero-sum conflict. In such worlds, members of each side work

to maximize their own resources at the expense of the other side. The essential skill of


warriors and diplomats, in his view, is combat. In that sense, they are only doing the bidding

of their superiors, which is literally to beat the resources out of their enemies.

That perspective is closer to the sense of deployment of directives that I wish to

develop. The gatekeeper of an event would be the specific person named in answer to the

question, “Who did it?” The proper answer to such a question is not “they.” Intentional

beings have names and can only be made accountable by resort to those names. Of course,

there is usually a chain of command in human actions. The competencies involved in many

organizations are plainly deductive, requiring that scheduled obligations be fulfilled as

certain triggers are fired. A line from Shakespeare’s King Lear captures the sheer physicality

of this power.

Thou hast seen a farmer’s dog bark at a beggar? And the creature run from
the cur? There thou mightst behold the great image of authority: a dog’s
obeyed in office.

The chain of command could be portrayed as a train of events, prompting a series of

gatekeepers who receive, process, and pass along commands, one after the other. Being free

beings, however, very much unlike the components inside an inanimate device, they could

break convention, either by getting up and leaving their places, or by introducing new trains

into the process. Usually it just keeps working, however, as people’s instilled norms inspire

them to sustain the internal order of things. The process can be more convoluted than the inside of a social Rube Goldberg machine. Attention to the distinction between the roles of

assertives, directives, and commissives is the key to unraveling the confusion. That is, it

helps to be able to distinguish when people are acting as guides because they believe they are

following their internal sense of wholeness, or as gatekeepers because they are obliged to

speak when they have been spoken to, or as peers because they are humbly inventing their

futures one step at a time.

* * *

For Foucault, deployment was a term of art. Power, alliances, and sexuality were deployed

in relation to each other, he believed, finding a common site in the making of families. Those

deployments expedited the linking up of practices with identities. They promoted the


[622] Parenthetically, the erosion of traditional hegemonically-supported norms about marriage and family in the United States may help explain the cultural backlash which has led some groups to seek stricter re-codification of traditional marriage and family forms in law. Nevertheless, that backlash seems out of step with economic forces which reward innovation, mobility, and atomization. Capitalist drives have proven quite successful at bending inelastic social structures past their limit.

transitions of laws into norms, securing sets of dominant and subordinate relationships

throughout societies by their hegemonic effects.[622] While accepting those insights, my own

resort to the term should not be taken as an endorsement of his overarching world view. The

intent of this work is to point out the possibility of a grand narrative rather than to resist it.

The opportunity to establish a body of scientifically-founded truths about the natural world

strikes me as a persistent option rather than what Foucault might have called a misguided

assumption about disintegrating genealogies.

The possibility of establishing a moral grand narrative is a far more challenging task,

but it would make no sense at all without also developing a way of talking sensibly about

base reality. Doing so is an appropriate goal for people who might hope to assess and

advance their interests.

Deployments emanate from embodied minds. From the standpoint of cognitive

linguistics, psychic phenomena are not distinct from physical phenomena, but emerge from

them. Every psychic occurrence is supported by a firing of neuronal coalitions. The firing of

neuronal coalitions is usually triggered by the immediate sensing of physical phenomena, but

not necessarily every time. People can think and feel without resort to immediate sensory

input. The supporting physical phenomenon of thought and feeling – the firing of neurons

– has objective existence, but can only be understood subjectively. That firing is intrinsic to

a thinker, whether conscious or unconscious.

Guides, gatekeepers and peers are all deployers, unfolding what is in their minds

through the making of signs, and especially through speech acts. Those disclosures impact

material and social reality and consequently lodge new renditions of those realities within

the minds of observers.


[623] http://www.icann.org/minutes/.

[624] http://www.lextext.com/icann/index.html.

[625] See especially “Representation in Cyberspace Study,” http://cyber.law.harvard.edu/rcs/index.html, and “Debate Over Internet Governance,” http://cyber.law.harvard.edu/is99/governance/introduction.html.

[626] “IFWP Working Drafts,” http://cyber.law.harvard.edu/ifwp/Structure.html.

[627] http://www.domainhandbook.com.

[628] http://www.gtld-mou.org/.


12. APPENDIX III: COMMENT ON SOURCES

Sources and Methodology

In the course of monitoring the online discourse and reviewing old, resurrected

archives, I accumulated over six hundred megabytes of text-based email concerning Internet

governance and the reform of the domain name system. This material included over 100,000

distinct messages transmitted via public, semi-public, and closed news lists. I also gathered

a great deal of information firsthand, either by attending formally organized functions, or by

participating in offline exchanges with key players. I also conducted numerous interviews

throughout the period, via email, by phone, and in person.

An extensive documentary history is accessible in hypertext form on the World Wide

Web. Many important announcements and official instruments dating from ICANN’s

incorporation in late 1998 have been made available to the public via ICANN’s own

website.[623] Additional archival information from that period is available at the ICANN Blog (weblog) site managed by Bret Fausett,[624] and at Harvard’s Berkman Center.[625] The Berkman Center also maintains significant pieces of the historical record immediately prior to the ICANN period,[626] as does Ellen Rony at the companion site for the Domain Name Handbook.[627]

Important email and document archives relevant to the IAHC and gTLD-MoU

processes have been generously maintained and hosted by Kent Crispin.[628] I am also grateful

to Richard Sexton, Rick Wesson, and Simon Higgs, who sent me their archives of currently

unhosted email lists.


[629] http://rfc.sunsite.dk/.

[630] http://www.netsys.com/.

A particularly convenient online resource for reviewing RFCs – the documents which define Internet standards – is hosted by the Denmark-based Sunsite.[629] Netsys.com, “The Intelligent Hacker’s Choice,” was an invaluable choice for those who wanted to peruse the archives of the Internet Engineering Task Force’s primary email list.[630] Unfortunately,

netsys.com went dark in late 2004, and no alternate live site seems to provide records prior

to 2000. For the time being, the Archive.org site provides a way to recover those records, and

others that are steadily vanishing from the Internet. Smatterings of email archives can also

be found at mail-archive.com.

In fact, loss of data from the online video, audio and text record has been a distressing

aspect of this investigation. Remarkable materials such as the first live webcast from the U.S.

Congress, in September 1997, were available for several years, but are no longer accessible.

Such is the nature of the web, described long ago in Cass Whittington’s haiku.

You step in the stream,
But the water has moved on.
This page is not here.

Lamentably, censorship is at work with regard to the longest continuing open

discussion of domain policy issues, which began under the aegis of the InterNIC in 1995. The

discourse was abruptly terminated by the InterNIC’s overseer, Network Solutions, Inc., a subsidiary of Verisign Corporation, in May 2001. All records were taken offline at that time,

despite community protest. Pieces can nevertheless be reconstructed through individually

owned archives.

The following tables will serve to provide a record of the primary materials that I

accumulated in the course of this research, as well as a key to the mailing list abbreviations

that are used in the footnotes.


Table 7 Mailing Lists and Abbreviations

Abbreviation | Long Name | Description/Comment

apnic | Apnic-talk, general discussions on APNIC | Asia Pacific Network Information Center; archives at http://www.apnic.net/community/lists/index.html
bwg | Boston Working Group | Restricted-membership discussion group hosted by David Steele since late 1998.
com-priv | Commercialization and Privatization | Formerly hosted by psi.com; archives are now at MIT, http://diswww.mit.edu/MENELAUS.MIT.EDU/com-priv/
domain-policy | Domain Policy discussion | Formerly hosted by NSI, now closed. Some archives on privately-held disk storage.
GIAW | Global Internet Alliance Workshop | See IFWP.
gtld-discuss | gTLD-MoU discussion | Hosted by IAHC/CORE. Now closed. For archives, see gtld-mou.org/docs/maillist.htm.
iahc-discuss | IAHC discussion | Merged with gtld-discuss archives. Now closed.
ietf | IETF general discussion group | No live archive prior to 2000. See http://www.netsys.com/ietf at archive.org.
IFWP | International Forum on the White Paper | Association of Internet Professionals. Now closed.
IH | Internet History | Live at postel.org/internet-history.
IP | Interesting People | Moderated by Dave Farber; archives at http://www.interesting-people.org/archives/interesting-people/
msggroup | Message Group | First ARPANET mailing list. Archive.org holds archives via http://www.tcm.org/msggroup
namedroppers | Technical discussion of DNS issues | Live, hosted by NSI at http://www.ops.ietf.org/lists/namedroppers/
nanog | North American Network Operators Group | Live at http://www.nanog.org/mailinglist.html
nettime | “Mailing list for networked cultures, politics, and tactics.” | See nettime english at http://www.nettime.org/archives.php
newdom-ar | New Domains discussion list | Hosted by Alice’s Restaurant (Rick Wesson). Now closed. Some archives on privately-held disk storage.
newdom-iiia | New Domains discussion list | Hosted by Internet Industrial Association (Matthew Marnell). Now closed. Archives on privately-held disk storage.
newdom-vrx | New Domains discussion list | Hosted by Richard Sexton. Now closed. Archives on privately-held disk storage.
ORSC | Open Root Server Confederation | Richard Sexton. Also, “On Richard’s Street Corner.”
poised | Process for Organization of Internet Standards | IETF oversight group, originally focused on the IETF/ISOC relationship (at lists.tislabs.com).


Table 8 Selected Scheduled Interviews

Respondent | Date (yyyy.mm.dd)

John Klensin 1997.04.07

Jon Postel 1997.12.09

Tony Rutkowski 1997.12.13

Bill Manning 1998.03.30

Mike Roberts 1999.11.04

Christopher Wilkinson 1999.11.04

Don Heath 1999.11.11

Dave Holtzman 2000.05.01

Ira Magaziner 2000.08.05

Marc Hurst 2000.08.07

Jay Fenello 2000.08.07

Bill Manning 2000.08.11

Esther Dyson 2000.08.13

Harald Alvestrand 2000.11.13

Kilnam Chon 2000.11.13

Paul Twomey 2000.11.18

Eugene Kashpureff 2001.08.21

Becky Burr 2001.11.11

Steve Bellovin 2001.11.13

Lars-Johan Liman 2001.11.14

Mark Kosters 2001.11.14

Stefano Trumpy 2001.11.14

David Johnson 2001.11.14

Ken Silva 2001.11.14

Danny Younger 2001.11.14

Jun Murai 2001.11.15

Vint Cerf 2001.11.15

Bob Connelly 2001.11.15

Larry Landweber 2002.06.13

George Strawn 2002.06.18

Steve Coya 2002.06.28

Dave Crocker 2002.07.04

Scott Bradner 2002.07.11

Robert Kahn 2002.07.22

Paul Mockapetris 2002.09.16

Gordon Cook 2002.11.20

Don Mitchell 2003.02.10 (and 11)

Kim Hubbard 2003.02.17

Scott Williamson 2003.02.22

Mary Stahl 2003.04.24

Mike St. Johns 2003.04.30


Table 9 Meetings and Conferences Attended

Date Meeting Location

April 7-11, 1997 38th IETF Memphis, TN

December 8-12, 1997 40th IETF Washington, D.C.

December 10, 1997 IAWG/CORE Washington, D.C.

December 12, 1997 ISOC Washington, D.C.

March 30-April 3, 1998 41st IETF Los Angeles, CA

July 1-2, 1998 IFWP Reston, VA

December 7-11, 1998 43rd IETF Orlando, FL

September 25-26, 1999 CPSR-sponsored ICANN Review Alexandria, VA

November 4, 1999 ICANN Annual Board Marina del Rey

November 17, 2000 ICANN Annual and Special Board Marina del Rey

November 15, 2001 ICANN Annual Board Marina del Rey

July 28, 2005 WGIG Review Washington, D.C.


Research for this paper also included many public and private e-mail exchanges, interviews and discussions. Respondents included: Tony Rutkowski, Einar Stefferud, Richard Sexton, Jay Fenello, Christopher Ambler, Robert Shaw, Patrik Faltstrom, Christian Huitema, Tim O'Reilly, Dave Meyers, Bill Simpson, Bob Moskowitz, Dave Clark, Don Heath, John Gilmore, Jon Postel, David Maher, Rob Austein, Chuck Gomes, Dave Farber, Alan Hanson, Hugh Daniels, Bill Flanigan, Robert Fink, Susan Harris, Donald E. Eastlake, III, Dave Crocker, Perry Metzger, Karen Rose, Greg Chang, Vint Cerf, Paul Twomey, Gordon Cook, Larry Landweber, David Johnson, Ira Magaziner, Harald Alvestrand, Milton Mueller, Willie Black, Jun Murai, Esther Dyson, Christopher Wilkinson, Karl Auerbach, Mikki Barry, Don Mitchell, George Strawn, Kim Hubbard, Steve Coya, Mike Roberts, Bob Kahn, Jake Feinler, and Erik Fair. I have tried to keep my reports of these discussions and observed incidents faithful to my best recollection, and to draw quotes from written materials when possible. Nevertheless, all responsibility for any misrepresentations of their views and comments is my own.


13. BIBLIOGRAPHY AND SOURCES

Abbate, Janet. 2000. Inventing the Internet. Cambridge, Massachusetts: MIT Press.

Albitz, Paul, and Liu, Cricket. 1998. DNS and BIND. 3rd Ed. Sebastopol, California: O'Reilly.

Armstrong, Karen. 2006. The Great Transformation: The Beginning of Our Religious Traditions. New York: Alfred A. Knopf.

Berger, Peter L. 1999. The Desecularization of the World: Resurgent Religion in World Politics. Washington, D.C.: Ethics and Public Policy Center.

Berger, Peter L., and Luckmann, Thomas. 1966. The Social Construction of Reality: A Treatise in the Sociology of Knowledge. New York: Doubleday.

Bloombecker, Buck. 1990. Spectacular Computer Crimes: What They Are and How They Cost American Business Half a Billion Dollars a Year. Homewood, Illinois: Dow Jones-Irwin.

Boyer, Pascal. 2001. Religion Explained: The Evolutionary Origins of Religious Thought. New York: Basic Books.

Boyd, Robert, and Richerson, Peter J. 2005. The Origin and Evolution of Cultures. New York: Oxford University Press.

Braden, Bob. 1998. "The End-to-end Research Group: Internet Philosophers and 'Physicists.'" Presentation to the 41st Plenary of the Internet Engineering Task Force. http://www.ietf.org/proceedings/98mar/slides/plenary-braden/index.html.

Brockman, John. 1996. Digerati: Encounters with the Cyber Elite. Wired Books Incorporated.

Burch, Kurt, and Denemark, Robert A., eds. 1997. Constituting International Political Economy. Boulder, Colorado: Lynne Rienner Publishers.

Castells, Manuel. 1996. The Rise of the Network Society. Cambridge, Massachusetts: Blackwell Publishers.

Chalmers, David J. 1996. The Conscious Mind: In Search of a Fundamental Theory. New York: Oxford University Press.

Codding, George, and Rutkowski, Anthony M. 1983. The ITU in a Changing World. Dedham, Massachusetts: Artech House.


Coontz, Stephanie. 2005. Marriage, a History: From Obedience to Intimacy, or How Love Conquered Marriage. New York: Viking.

Damasio, Antonio. 1999. The Feeling of What Happens: Body and Emotion in the Making of Consciousness. New York: Harcourt Brace and Company.

______. 2003. Looking for Spinoza: Joy, Sorrow and the Feeling Brain. Orlando, Florida: Harcourt Books.

Deacon, Terrence. 1997. The Symbolic Species: The Co-Evolution of Language and the Brain. New York: W. W. Norton & Company.

Dennett, Daniel C. 2003. Freedom Evolves. New York: Penguin.

______. 2006. Breaking the Spell: Religion as a Natural Phenomenon. New York: Viking.

Dertouzos, Michael. 1997. What Will Be: How the New World of Information Will Change Our Lives. New York: HarperEdge.

Dessler, David. 1989. "What's at Stake in the Agent-Structure Debate?" International Organization 43, 3 (Summer 1989): 441-73.

Deutscher, Guy. 2002. The Unfolding of Language: An Evolutionary Tour of Mankind's Greatest Invention. New York: Viking.

Diamond, David. 1998. "Whose Internet Is It, Anyway?" Wired 6.4, April: 172-7, 187, 194-5.

Electronic Privacy Information Center. 2004. The Public Voice WSIS Sourcebook: Perspectives on the World Summit on the Information Society. Electronic Privacy Information Center.

Feinberg, Todd E. 2001. Altered Egos: How the Brain Creates the Self. New York: Oxford University Press.

Franda, Marcus. 2001. Governing the Internet: The Emergence of an International Regime. Boulder, Colorado: Lynne Rienner Publishers.

Friedman, Thomas L. 2005. The World is Flat: A Brief History of the Twenty-First Century. New York: Farrar, Straus and Giroux.

Froomkin, Michael. 2000. "Wrong Turn in Cyberspace: Using ICANN to Route Around the APA and the Constitution." 50 Duke Law Journal 17 (Oct. 2000).


______. 2003. "Habermas@discourse.net: Toward a Critical Theory of Cyberspace." 116 Harvard Law Review 749 (2003).

Giddens, Anthony. 1984. The Constitution of Society: Outline of the Theory of Structuration. Berkeley: University of California Press.

______. 1990. The Consequences of Modernity. Stanford: Stanford University Press.

______. 2000. Runaway World: How Globalization is Shaping Our Lives. New York: Routledge.

Gleick, James. 1998. Chaos: The Making of a New Science. New York: Viking Adult.

Goldsmith, Jack, and Wu, Tim. 2006. Who Controls the Internet? Illusions of a Borderless World. New York: Oxford University Press.

Gole, Rebecca W. 1999. "Playing the Name Game: A Glimpse at the Future of the Internet Domain Name System." Federal Communications Law Journal 51 (April 13, 1999). http://www.law.indiana.edu/fclj/pubs/v51/no2/golemac1.PDF.

Greenspan, Stanley I., and Shanker, Stuart G. 2004. The First Idea: How Symbols, Language and Intelligence Evolved From Our Primate Ancestors to Modern Humans. Cambridge, Massachusetts: Da Capo Press, Perseus Books Group.

Haack, Susan. 1993. Evidence and Inquiry: Towards Reconstruction in Epistemology. Malden, Massachusetts: Blackwell Publishers.

_______. 1998. Manifesto of a Passionate Moderate: Unfashionable Essays. Chicago: University of Chicago Press.

_______. 2003. Defending Science – Within Reason: Between Scientism and Cynicism. Amherst, New York: Prometheus Books.

Hafner, Katie, and Lyon, Matthew. 1996. Where Wizards Stay Up Late: The Origins of the Internet. New York: Touchstone. See the companion website at http://www.olografix.org/gubi/estate/libri/wizards/index.html.

Hamelink, C.J. 1994. The Politics of World Communication. London: Sage.

Hamer, Dean. 2004. The God Gene: How Faith is Hardwired into Our Genes. New York: Anchor Books.


Harvard Information Infrastructure Project, National Science Foundation Division of Networking and Communications. 1995. "Internet Names, Numbers, and Beyond: Issues in the Coordination, Privatization, and Internationalization of the Internet: Research Workshop, Nov. 20, 1995." http://ksgwww.harvard.edu/iip/GIIconf/nsfpaper.html.

Hauben, Michael, and Hauben, Ronda. 1997. Netizens: On the History and Impact of the Net. Los Alamitos, California: IEEE Computer Society Press.

Hewson, Martin, and Sinclair, Timothy J., eds. 1999. Approaches to Global Governance Theory. New York: State University of New York Press.

Holitscher, Marc, ed. 1999. "Debate: Internet Governance." Swiss Political Science Review 5(1): 115-136. http://www.holitscher.org/downloads/InternetGovernanceSPSR1.pdf.

Huitema, Christian. 2000. Routing in the Internet (2nd edition). Englewood Cliffs, New Jersey: PTR Prentice Hall.

Kahin, Brian, and Keller, James H., eds. 1997. Coordinating the Internet. Cambridge, Massachusetts: MIT Press.

Kahin, Brian, and Nesson, Charles, eds. 1997. Borders in Cyberspace: Information Policy and the Global Information Infrastructure. Cambridge, Massachusetts: MIT Press.

Keohane, Robert. 1984. After Hegemony. Princeton: Princeton University Press.

Kesan, Jay P., and Shah, Rajiv C. 2001. "Fool Us Once Shame on You – Fool Us Twice Shame on Us: What We Can Learn From the Privatization of the Internet Backbone Network and the Domain Name System." February 2001, University of Illinois College of Law Working Paper No. 00-18. http://papers.ssrn.com/paper.taf?abstract_id=260834.

Klein, Hans. 2005. "ICANN Reform: Establishing the Rule of Law. A policy analysis prepared for The World Summit on the Information Society (WSIS), Tunis, 16-18 November 2005." Georgia Institute of Technology. http://www.ip3.gatech.edu/images/ICANN-Reform_Establishing-the-Rule-of-Law.pdf.

Kleinrock, Leonard. 1964. Communication Nets: Stochastic Message Flow and Delay. New York: McGraw-Hill Book Company.

Kofman, Eleonore, and Youngs, Gillian, eds. 1996. Globalization: Theory and Practice. London: Pinter.


Krasner, Stephen D., ed. 1983. International Regimes. Ithaca, NY: Cornell University Press.

Kratochwil, Friedrich V. 1989. Rules, Norms, and Decisions: On the Conditions of Practical and Legal Reasoning in International Relations and Domestic Affairs. Cambridge: Cambridge University Press.

Krol, Ed. 1992. The Whole Internet User's Guide and Catalog. Sebastopol, California: O'Reilly.

Kubalkova, Vendulka, Onuf, Nicholas G., and Kowert, Paul, eds. 1998. International Relations in a Constructed World. New York: M.E. Sharpe.

Kurzweil, Ray. 2005. The Singularity is Near: When Humans Transcend Biology. New York: Viking.

Lakoff, George, and Johnson, Mark. 1980. Metaphors We Live By. Chicago: University of Chicago Press.

Lakoff, George. 2006. Whose Freedom? The Battle Over America's Most Important Idea. New York: Farrar, Straus and Giroux.

LeDoux, Joseph. 2002. Synaptic Self: How Our Brains Become Who We Are. New York: Viking.

_______. 1996. The Emotional Brain: The Mysterious Underpinnings of Emotional Life. New York: Touchstone.

LaQuey, Tracy, and Ryer, Jeanne C. 1993. The Internet Companion: A Beginner's Guide to Global Networking. Reading, Massachusetts: Addison Wesley.

Leiner, Barry M., et al. 2003. "A Brief History of the Internet (Version 3.23, revised December 10, 2003)." http://www.isoc.org/internet/history/brief.shtml.

Lessig, Lawrence. 1999. Code: And Other Laws of Cyberspace. New York: Basic Books.

________. 2001. The Future of Ideas: The Fate of the Commons in a Connected World. New York: Random House.

Libet, Benjamin. 2004. Mind Time: The Temporal Factor in Consciousness. Cambridge, Massachusetts: Harvard University Press.


Leib, Volker. 2003. ICANN und der Konflikt um die Internet-Ressourcen: Institutionenbildung im Problemfeld Internet Governance zwischen multinationaler Staatstätigkeit und globaler Selbstregulierung. Dissertation, Universität Konstanz. http://www.ub.uni-konstanz.de/kops/volltexte/2003/970/.

Lewis, James A. "Perils and Prospects for Internet Self-Regulation." Conference Paper, INET 2002. http://inet2002.org/CD-ROM/lu65rw2n/papers/g02-d.pdf.

Loundy, David J. 1997. "A Primer on Trademark Law and Internet Addresses." 15 John Marshall J. of Computer and Info. Law 465. Also at http://www.loundy.com/JMLS-Trademark.html.

Lynch, Daniel C., and Rose, Marshall T., eds. 1993. The Internet Systems Handbook. Reading, Massachusetts: Addison-Wesley Professional.

MacLean, Don, ed. 2004. Internet Governance: A Grand Collaboration. New York: United Nations Information and Communication Technologies Task Force.

Malamud, Carl. 1992. Exploring the Internet: A Technical Travelogue. Englewood Cliffs, New Jersey: PTR Prentice Hall.

Masur, Sandra. 1991. "The North American Free Trade Arrangement: Why It's in the Interest of U.S. Business." Columbia Journal of World Business 26, 2 (Summer 1991): 98-103.

McTaggart, Craig. 1999. Governance of the Internet's Infrastructure: Network Policy for the Global Public Network. Master's Thesis, University of Toronto Faculty of Law.

Mewes, Heather. 1998. "Memorandum of Understanding on the Generic Top-Level Domain Space of the Internet Domain Name System." 13 Berkeley Tech Law Journal 235 (1998).

Mills, David L. 2005. "'A Maze of Twisty, Turney Passages' - Routing in the Internet Swamp (and other adventures)." Web-based presentation, University of Delaware. http://www.eecis.udel.edu/~mills/database/brief/goat/goat.pdf. Apparently an update of a 1999 SIGCOMM Symposium presentation, "Tutorial on the Technical History of the Internet."

Mueller, Milton L. 2002. Ruling the Root: Internet Governance and the Taming of Cyberspace. Cambridge, Massachusetts: MIT Press.

Newberg, Andrew, D'Aquili, Eugene D., and Rause, Vince. 2001. Why God Won't Go Away: Brain Science and the Biology of Belief. New York: Ballantine Books.


National Academy of Sciences. 1994. Revolution in the U.S. Information Infrastructure. http://www.nap.edu/readingroom/books/-newpath/chap2.html.

National Research Council (U.S.), Committee on the Internet in the Evolving Information Infrastructure. 2001. The Internet's Coming of Age. National Academies Press.

Nye, Joseph S. 2004. Power in the Global Information Age: From Realism to Globalization. New York: Routledge.

Onuf, Nicholas G. 1989. World of Our Making: Rules and Rule in Social Theory and International Relations. Columbia: University of South Carolina Press.

_______. 1998. The Republican Legacy in International Thought. Cambridge: Cambridge University Press.

Paré, Daniel J. 2003. Internet Governance in Transition: Who is the Master of This Domain? Lanham, Maryland: Rowman & Littlefield.

Partridge, Eric. 1983. Origins: A Short Etymological Dictionary of Modern English. New York: Greenwich House.

Pool, Ithiel de Sola. 1983. Technologies of Freedom: On Free Speech in an Electronic Age. Cambridge, Massachusetts: Belknap Press.

Quarterman, John S. 1997. "Haiti and Internet Governance." First Monday 2, 6 (July 1997). http://www.firstmonday.org/issues/issue2_6/quarterman/.

Rader, Ross William. 2001. "Alphabet Soup: The History of the DNS." http://www.whmag.com/content/0601/dns/print.asp.

Ratey, John J. 2001. A User's Guide to the Brain: Perception, Attention, and the Four Theaters of the Brain. New York: Pantheon Books.

Richerson, Peter J., and Boyd, Robert. 2005. Not by Genes Alone: How Culture Transformed Human Evolution. Chicago: University of Chicago Press.

Robbins, Jim. 2000. A Symphony in the Brain: The Evolution of the New Brain Wave Biofeedback. New York: Grove Press.

Rony, Ellen, and Rony, Peter. 1998. The Domain Name Handbook: High Stakes and Strategies in Cyberspace. Lawrence, Kansas: R&D Books.


Ruggie, John Gerard. 1998. Constructing the World Polity: Essays on International Institutionalization. New York: Routledge.

Russell, Andrew L. 2001. "Ideological and Policy Origins of the Internet, 1957-1965." Conference Paper, Telecommunications Policy Research Conference (TPRC) - 29th Research Conference on Communication, Information, and Internet Policy. http://www.arxiv.org/ftp/cs/papers/0109/0109056.pdf.

______. 2002. "From 'Kitchen Cabinet' to 'Constitutional Crisis': the Politics of Internet Standards, 1973-1992." Conference Paper, International Committee on the History of Technology (ICOHTEC), Granada, Spain, June 27.

______. 2006. "'Rough Consensus and Running Code' and the Internet-OSI Standards War." IEEE Annals of the History of Computing 28.3 (July-September): 48-61.

Searle, John R. 1969. Speech Acts: An Essay in the Philosophy of Language. Cambridge: Cambridge University Press.

______. 1979a. Expression and Meaning: Studies in the Theory of Speech Acts. Cambridge: Cambridge University Press.

______. 1979b. Mind: A Brief Introduction. Oxford: Oxford University Press.

______. 1984. Minds, Brains and Science. Cambridge, Massachusetts: Harvard University Press.

______. 1997. The Mystery of Consciousness. New York: New York Review.

______. 1998. Mind, Language and Society: Philosophy in the Real World. New York: Basic Books.

______. 2001. Rationality in Action. Cambridge, Massachusetts: MIT Press.

Searle, John R., and Vanderveken, Daniel. 1985. Foundations of Illocutionary Logic. Cambridge: Cambridge University Press.

Sernovitz, Andy. 1997. "The US Govt. Is *not* Supportive of gTLD-MoU." July 27, 1997. http://www.gtld-mou.org/gtld-discuss/mail-archive/0375.html.

Shapiro, Andrew L. 1999. The Control Revolution: How the Internet is Putting Individuals in Charge and Changing the World We Know. New York: The Century Foundation.


Shermer, Michael. 2004. The Science of Good and Evil: Why People Cheat, Gossip, Care, Share, and Follow the Golden Rule. New York: Times Books.

Snow, C. P. 1959. The Two Cultures. Cambridge: Cambridge University Press.

Stark, Thom. 1997. "What's in a Namespace?" Boardwatch Magazine (May 1997): 74-9.

Thierer, Adam, and Crews, Clyde Wayne, Jr., eds. 2003. Who Rules the Net? Internet Governance and Jurisdiction. Washington, D.C.: Cato Institute.

Varela, Francisco J., Thompson, Evan, and Rosch, Eleanor. 1996. The Embodied Mind: Cognitive Science and Human Experience. Cambridge, Massachusetts: MIT Press.

Wass, Erica Schlesinger, ed. 2003. Addressing the World: National Identity and Internet Country Code Domains. Lanham, Maryland: Rowman & Littlefield.

Werbach, Kevin. 1997. "Digital Tornado: The Internet and Telecommunications Policy." OPP Working Paper Series 29, Federal Communications Commission. http://www.fcc.gov/Bureaus/OPP/working_papers/oppwp29.txt.

Wiener, Norbert. 1964. God & Golem, Inc.: A Comment on Certain Points where Cybernetics Impinges on Religion. Cambridge, Massachusetts: MIT Press.

Wilson, Edward O. 1998. Consilience: The Unity of Knowledge. New York: Alfred A. Knopf.

Wriston, Walter B. 1992. The Twilight of Sovereignty: How the Information Revolution is Transforming Our World. Bridgewater, New Jersey: Replica Books.

Zakon, Robert H. 2006. "Hobbes' Internet Timeline v8.1." http://www.zakon.org/robert/internet/timeline.


VITA

Craig Lyle Simon was born in Mineola, New York on August 11, 1956. His parents are Irving Simon (deceased) and Shirley Simon. He started his elementary education at Baylis Elementary School in Syosset, New York and completed his secondary education at Chatsworth Senior High School in Los Angeles in 1974. He graduated with a BA degree in History from the University of California at Santa Cruz in June 1980. Living in Santa Cruz through the summer of 1984, he was employed in a variety of occupations, including supervisor at Threshold Enterprises, a distributor of vitamins and nutritional products.

In August 1984 he was admitted to the Graduate School of International Studies at the University of Miami. He was awarded a University Fellowship for three terms, from Autumn 1985 through Spring 1988. He was granted an M.A. in International Studies in May 1987, after submitting the thesis Technology Transfer and Soviet-American Rivalry: Prospects for the Decline of Bipolar Hegemony. He completed the comprehensive examination required for Ph.D. candidacy in May 1988 and taught Strategic Studies at Florida International University in Summer 1988.

He suspended formal academic work in September 1988 to begin a career as a database application specialist. He taught introductory computer courses at Miami Dade Community College. He returned to academic work in 1996 while maintaining his consulting practice. In 1998 he published "Internet Governance Goes Global" as a chapter in International Relations in a Constructed World. He was awarded a Ph.D. in International Studies in December 2006.