The Darknet and the Future of Content Distribution

Peter Biddle, Paul England, Marcus Peinado, and Bryan Willman
Microsoft Corporation¹

Abstract

We investigate the darknet – a collection of networks and technologies used to share digital content. The darknet is not a separate physical network but an application and protocol layer riding on existing networks. Examples of darknets are peer-to-peer file sharing, CD and DVD copying, and key or password sharing on email and newsgroups. The last few years have seen vast increases in the darknet's aggregate bandwidth, reliability, usability, size of shared library, and availability of search engines. In this paper we categorize and analyze existing and future darknets, from both the technical and legal perspectives. We speculate that there will be short-term impediments to the effectiveness of the darknet as a distribution mechanism, but ultimately the darknet-genie will not be put back into the bottle. In view of this hypothesis, we examine the relevance of content protection and content distribution architectures.

1 Introduction

People have always copied things. In the past, most items of value were physical objects. Patent law and economies of scale meant that small-scale copying of physical objects was usually uneconomic, and large-scale copying (if it infringed) was stoppable using policemen and courts. Today, things of value are increasingly less tangible: often they are just bits and bytes or can be accurately represented as bits and bytes. The widespread deployment of packet-switched networks and the huge advances in computers and codec technologies have made it feasible (and indeed attractive) to deliver such digital works over the Internet. This presents great opportunities and great challenges. The opportunity is low-cost delivery of personalized, desirable, high-quality content. The challenge is that such content can be distributed illegally. Copyright law governs the legality of copying and distribution of such valuable data, but copyright protection is increasingly strained in a world of programmable computers and high-speed networks.

For example, consider the staggering burst of creativity by authors of computer programs that are designed to share audio files. This was first popularized by Napster, but today several popular applications and services offer similar capabilities. CD-writers have become mainstream, and DVD-writers may well follow suit. Hence, even in the absence of network connectivity, the opportunity for low-cost, large-scale file sharing exists.

1.1 The Darknet

Throughout this paper, we will call the shared items (e.g. software programs, songs, movies, books, etc.) objects. The persons who copy objects will be called users of the darknet, and the computers used to share objects will be called hosts.

¹ Statements in this paper represent the opinions of the authors and not necessarily the position of Microsoft Corporation.


The idea of the darknet is based upon three assumptions:

1. Any widely distributed object will be available to a fraction of users in a form that permits copying.
2. Users will copy objects if it is possible and interesting to do so.
3. Users are connected by high-bandwidth channels.

The darknet is the distribution network that emerges from the injection of objects according to assumption 1 and the distribution of those objects according to assumptions 2 and 3.

One implication of the first assumption is that any content protection system will leak popular or interesting content into the darknet, because some fraction of users (possibly experts) will overcome any copy prevention mechanism, or because the object will enter the darknet before copy protection occurs.

The term "widely distributed" is intended to capture the notion of mass-market distribution of objects to thousands or millions of practically anonymous users. This is in contrast to the protection of military, industrial, or personal secrets, which are typically not widely distributed and are not the focus of this paper.

Like other networks, the darknet can be modeled as a directed graph with labeled edges. The graph has one vertex for each user/host. For any pair of vertices (u, v), there is a directed edge from u to v if objects can be copied from u to v. The edge labels can be used to model relevant information about the physical network and may include information such as bandwidth, delay, availability, etc. The vertices are characterized by their object library, object requests made to other vertices, and object requests satisfied.
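To make this model concrete, here is a minimal sketch in Python; the class names `Host` and `Darknet` and all parameter values are hypothetical illustrations, not part of the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Host:
    """A vertex of the darknet graph: a user/host with an object library."""
    name: str
    library: set = field(default_factory=set)          # objects stored locally
    requests_made: list = field(default_factory=list)
    requests_satisfied: list = field(default_factory=list)

class Darknet:
    """Directed graph with labeled edges (bandwidth, delay, availability, ...)."""
    def __init__(self):
        self.hosts = {}
        self.edges = {}                                 # (u, v) -> label dict

    def add_host(self, name):
        self.hosts[name] = Host(name)

    def add_edge(self, u, v, **labels):
        # An edge (u, v) means objects can be copied from u to v.
        self.edges[(u, v)] = labels

    def copy_object(self, u, v, obj):
        """Copy obj from u to v if an edge exists and u holds the object."""
        if (u, v) in self.edges and obj in self.hosts[u].library:
            self.hosts[v].requests_made.append(obj)
            self.hosts[v].library.add(obj)
            self.hosts[u].requests_satisfied.append(obj)
            return True
        return False

# Usage sketch
net = Darknet()
for name in ("alice", "bob"):
    net.add_host(name)
net.add_edge("alice", "bob", bandwidth_kbps=56, availability=0.3)
net.hosts["alice"].library.add("song.mp3")
net.copy_object("alice", "bob", "song.mp3")
```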

To operate effectively, the darknet has a small number of technological and infrastructure requirements, which are similar to those of legal content distribution networks. These infrastructure requirements are:

1. facilities for injecting new objects into the darknet (input)
2. a distribution network that carries copies of objects to users (transmission)
3. ubiquitous rendering devices, which allow users to consume objects (output)
4. a search mechanism to enable users to find objects (database)
5. storage that allows the darknet to retain objects for extended periods of time. Functionally, this is mostly a caching mechanism that reduces the load and exposure of nodes that inject objects.

The dramatic rise in the efficiency of the darknet can be traced back to the general technological improvements in these infrastructure areas. At the same time, most attempts to fight the darknet can be viewed as efforts to deprive it of one or more of the infrastructure items. Legal action has traditionally targeted search engines and, to a lesser extent, the distribution network. As we will describe later in the paper, this has been partially successful. The drive for legislation on mandatory watermarking aims to deprive the darknet of rendering devices. We will argue that watermarking approaches are technically flawed and unlikely to have any material impact on the darknet. Finally, most content protection systems are meant to prevent or delay the injection of new objects into the darknet. Based on our first assumption, no such system constitutes an impenetrable barrier, and we will discuss the merits of some popular systems.

We see no technical impediments to the darknet becoming increasingly efficient (measured by aggregate library size and available bandwidth). However, the darknet, in all its transport-layer embodiments, is under legal attack. In this paper, we speculate on the technical and legal future of the darknet, concentrating particularly, but not exclusively, on peer-to-peer networks.


The rest of this paper is structured as follows. Section 2 analyzes different manifestations of the darknet with respect to their robustness to attacks on the infrastructure requirements described above and speculates on the future development of the darknet. Section 3 describes content protection mechanisms, their probable effect on the darknet, and the impact of the darknet upon them. In sections 4 and 5, we speculate on the scenarios in which the darknet will be effective, and how businesses may need to behave to compete effectively with it.

2 The Evolution of the Darknet

We classify the different manifestations of the darknet that have come into existence in recent years with respect to the five infrastructure requirements described, and analyze weaknesses and points of attack.

As a system, the darknet is subject to a variety of attacks. Legal action continues to be the most powerful challenge to the darknet. However, the darknet is also subject to a variety of other common threats (e.g. viruses, spamming) that, in the past, have led to minor disruptions of the darknet, but could be considerably more damaging.

In this section we consider the potential impact of legal developments on the darknet. Most of our analysis focuses on system robustness, rather than on detailed legal questions. We regard legal questions only with respect to their possible effect: the failure of certain nodes or links (vertices and edges of the graph defined above). In this sense, we are investigating a well-known problem in distributed systems.

2.1 Early Small-Worlds Networks

Prior to the mid 1990s, copying was organized around groups of friends and acquaintances. The copied objects were music on cassette tapes and computer programs. The rendering devices were widely available tape players and the computers of the time – see Fig. 1. Content injection was trivial, since most objects were either not copy protected or, if they were equipped with copy protection mechanisms, the mechanisms were easily defeated. The distribution network was a "sneaker net" of floppy disks and tapes (storage), which were handed in person between members of a group or were sent by postal mail. The bandwidth of this network – albeit small by today's standards – was sufficient for the objects of the time. The main limitation of the sneaker net with its mechanical transport layer was latency. It could take days or weeks to obtain a copy of an object. Another serious limitation of these networks was the lack of a sophisticated search engine.

There were limited attempts to prosecute individuals who were trying to sell copyrighted objects they had obtained from the darknet (commercial piracy). However, the darknet as a whole was never under significant legal threat. Reasons may have included its limited commercial impact and the protection from legal surveillance afforded by sharing amongst friends.

The sizes of object libraries available on such networks are strongly influenced by the interconnections between the networks. For example, schoolchildren may copy content from their "family network" to their "school network" and thereby increase the size of the darknet object library available to each. Such networks have been studied extensively and are classified as "interconnected small-worlds networks" [24]. There are several popular examples of the characteristics of such systems. For example, most people have a social group of a few score of people. Each of these people has a group of friends that partly overlap with their friends' friends, and also introduces more people. It is estimated that, on average, each person is connected to every other person in the world by a chain of about six people, from which arises the term "six degrees of separation".

These findings are remarkably broadly applicable (e.g. [20],[3]). The chains are on average so short because certain super-peers have many links. In our example, some people are gregarious and have lots of friends from different social or geographical circles.
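As a rough illustration of why such chains are short, a minimal sketch (assuming the networkx library is available; the graph sizes and rewiring probability are illustrative, not values from the paper) compares average path lengths in a ring lattice before and after a few random "shortcut" links are added:

```python
import networkx as nx

# A ring lattice of 1000 people, each linked to their 10 nearest neighbors.
regular = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.0, seed=1)

# The same lattice after rewiring 5% of the edges at random: a few
# "gregarious" shortcuts are enough to shrink path lengths dramatically.
small_world = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.05, seed=1)

print(nx.average_shortest_path_length(regular))      # roughly 50 hops
print(nx.average_shortest_path_length(small_world))  # drops to single digits
```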

We suspect that these findings have implications for sharing on darknets, and we will return to this point when we discuss the darknets of the future later in this paper.

The small-worlds darknet continues to exist. However, a number of technological advances have given rise to new forms of the darknet that have superseded the small-worlds for some object types (e.g. audio).

2.2 Central Internet Servers

By 1998, a new form of the darknet began to emerge from technological advances in several areas. The internet had become mainstream, and as such its protocols and infrastructure could now be relied upon by anyone seeking to connect users with a centralized service or with each other. The continuing fall in the price of storage, together with advances in compression technology, had also crossed the threshold at which storing large numbers of audio files was no longer an obstacle to mainstream users. Additionally, the power of computers had crossed the point at which they could be used as rendering devices for multimedia content. Finally, "CD ripping" became a trivial method for content injection.


The first embodiments of this new darknet were central internet servers with large collections of MP3 audio files. A fundamental change that came with these servers was the use of a new distribution network: the internet displaced the sneaker net – at least for audio content. This solved several problems of the old darknet. First, latency was reduced drastically.

Secondly, and more importantly, discovery of objects became much easier because of simple and powerful search mechanisms – most importantly the general-purpose world-wide-web search engine. The local view of the small world was replaced by a global view of the entire collection accessible by all users. The main characteristic of this form of the darknet was centralized storage and search – a simple architecture that mirrored mainstream internet servers.

Centralized or quasi-centralized distribution and service networks make sense for legal online commerce. Bandwidth and infrastructure costs tend to be low, and having customers visit a commerce site means the merchant can display adverts, collect profiles, and bill efficiently. Additionally, management, auditing, and accountability are much easier in a centralized model.

However, centralized schemes work poorly for illegal object distribution because large, central servers are large single points of failure: if the distributor is breaking the law, it is relatively easy to force him to stop. Early MP3 web and FTP sites were commonly "hosted" by universities, corporations, and ISPs. Copyright holders or their representatives sent "cease and desist" letters to these web-site operators and web owners citing copyright infringement, and in a few cases followed up with legal action [15]. The threats of legal action were successful attacks on those centralized networks, and MP3 web and FTP sites disappeared from the mainstream shortly after they appeared.

2.3 Peer-to-Peer Networks

The realization that centralized networks are not robust to attack (be it legal or technical) has spurred much of the innovation in peer-to-peer networking and file sharing technologies. In this section, we examine architectures that have evolved. Early systems were flawed because critical components remained centralized (Napster) or because of inefficiencies and lack of scalability of the protocol (gnutella) [17]. It should be noted that the problem of object location in a massively distributed, rapidly changing, heterogeneous system was new at the time peer-to-peer systems emerged. Efficient and highly scalable protocols have been proposed since then [9],[23].

2.3.1 Napster

Napster was the service that ignited peer-to-peer file sharing in 1999 [14]. There should be little doubt that a major portion of the massive (for the time) traffic on Napster was of copyrighted objects being transferred in a peer-to-peer model in violation of copyright law. Napster succeeded where central servers had failed by relying on the distributed storage of objects not under the control of Napster. This moved the injection, storage, network distribution, and consumption of objects to users.

However, Napster retained a centralized database² with a searchable index on the file name. The centralized database itself became a legal target [15]. Napster was first enjoined to deny certain queries (e.g. "Metallica") and then to police its network for all copyrighted content. As the size of the darknet indexed by Napster shrank, so did the number of users. This illustrates a general characteristic of darknets: there is positive feedback between the size of the object library and aggregate bandwidth and the appeal of the network for its users.

² Napster used a farm of weakly coupled databases with clients attaching to just one of the server hosts.
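As a rough sketch of why a centralized index is both convenient and legally fragile, the following hypothetical Python model (class and variable names are invented for illustration; this is not the actual Napster protocol) shows how a single filter list applied at the index makes content undiscoverable network-wide even though storage stays with the peers:

```python
from collections import defaultdict

class CentralIndex:
    """Only the index is central; files remain on the announcing peers."""
    def __init__(self, blocked_terms=()):
        self.by_keyword = defaultdict(set)        # keyword -> {(peer, filename)}
        self.blocked = {t.lower() for t in blocked_terms}

    def register(self, peer, filename):
        """A peer announces a file it is willing to serve."""
        for word in filename.lower().replace(".", " ").split():
            self.by_keyword[word].add((peer, filename))

    def search(self, query):
        """Return matching (peer, filename) pairs, honoring an injunction list."""
        q = query.lower()
        if q in self.blocked:                     # e.g. court-ordered filtering
            return set()
        return self.by_keyword.get(q, set())

index = CentralIndex(blocked_terms=["metallica"])
index.register("peer42", "Metallica - One.mp3")
index.register("peer07", "some_song.mp3")
print(index.search("metallica"))   # set(): the query is denied by the filter
print(index.search("some_song"))   # {('peer07', 'some_song.mp3')}
```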

2.3.2 Gnutella

The next technology that sparked public interest in peer-to-peer file sharing was Gnutella. In addition to distributed object storage, Gnutella uses a fully distributed database described more fully in [13]. Gnutella does not rely upon any centralized server or service – a peer just needs the IP address of one or a few participating peers to (in principle) reach any host on the Gnutella darknet. Second, Gnutella is not really "run" by anyone: it is an open protocol and anyone can write a Gnutella client application. Finally, Gnutella and its descendants go beyond sharing audio and have substantial non-infringing uses. This changes its legal standing markedly and puts it in a similar category to email. That is, email has substantial non-infringing use, and so email itself is not under legal threat even though it may be used to transfer copyrighted material unlawfully.

2.4 Robustness of Fully Distributed Darknets

Fully distributed peer-to-peer systems do not present the single points of failure that led to the demise of central MP3 servers and Napster. It is natural to ask how robust these systems are and what form potential attacks could take. We observe the following weaknesses in Gnutella-like systems:

- Free riding
- Lack of anonymity

2.4.1 Free Riding

Peer-to-peer systems are often thought of as fully decentralized networks with copies of objects uniformly distributed among the hosts. While this is possible in principle, in practice it is not the case. Recent measurements of libraries shared by gnutella peers indicate that the majority of content is provided by a tiny fraction of the hosts [1]. In effect, although gnutella appears to be a peer-to-peer network of cooperating hosts, in actual fact it has evolved to effectively be another largely centralized system – see Fig. 2. Free riding (i.e. downloading objects without sharing them) by many gnutella users appears to be the main cause of this development. Widespread free riding removes much of the power of network dynamics and may reduce a peer-to-peer network to a simple unidirectional distribution system from a small number of sources to a large number of destinations. Of course, if this is the case, then the vulnerabilities that we observed in centralized systems (e.g. FTP servers) are present again. Free riding and the emergence of super-peers have several causes:

Peer-to-peer file sharing assumes that a significant fraction of users adhere to the somewhat post-capitalist idea of sacrificing their own resources for the "common good" of the network. Most free riders do not seem to adopt this idea. For example, with 56 kbps modems still being the network connection for most users, allowing uploads constitutes a tangible bandwidth sacrifice. One approach is to make collaboration mandatory. For example, Freenet [6] clients are required to contribute some disk space. However, enforcing such requirements without a central infrastructure is difficult.

Existing infrastructure is another reason for the existence of super-peers. There are vast differences in the resources available to different types of hosts. For example, a T3 connection provides the combined bandwidth of about one thousand 56 kbps telephone connections.
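Returning to the skew reported by the measurements cited above, a rough hypothetical illustration (the Zipf exponent and host count below are arbitrary assumptions, not data from [1]) shows how a heavy-tailed contribution model concentrates most shared content on a handful of hosts:

```python
import numpy as np

rng = np.random.default_rng(0)
num_hosts = 10_000

# Assume per-host shared-library sizes follow a heavy-tailed (Zipf-like)
# distribution; many hosts then share nothing or almost nothing.
shared_files = rng.zipf(a=2.0, size=num_hosts) - 1     # shift so the minimum is 0

shared_files = np.sort(shared_files)[::-1]
total = shared_files.sum()
top_1_percent = shared_files[: num_hosts // 100].sum()

print(f"top 1% of hosts provide {top_1_percent / total:.0%} of all shared files")
# Under these assumptions a tiny fraction of hosts supplies most of the content,
# while the majority of peers are effectively pure consumers (free riders).
```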


is potentially quite powerful. We believe it probable that there will be a few more rounds of technical innovations to sidestep existing laws, followed by new laws, or new interpretations of old laws, in the next few years.

2.4.4 Conclusions

All attacks we have identified exploit the lack of endpoint anonymity and are aided by the effects of free riding. We have seen effective legal measures on all peer-to-peer technologies that are used to provide effectively global access to copyrighted material. Centralized web servers were effectively closed down. Napster was effectively closed down. Gnutella and Kazaa are under threat because of free-rider weaknesses and lack of endpoint anonymity.

Lack of endpoint anonymity is a direct result of the globally accessible global object database, and it is the existence of the global database that most distinguishes the newer darknets from the earlier small worlds. At this point, it is hard to judge whether the darknet will be able to retain this global database in the long term, but it seems clear that legal setbacks to global-index peer-to-peer will continue to be severe.

However, should Gnutella-style systems become unviable as darknets, systems such as Freenet or Mnemosyne might take their place. Peer-to-peer networking and file sharing does seem to be entering the mainstream – both for illegal and legal uses. If we couple this with the rapid build-out of consumer broadband, the dropping price of storage, and the fact that personal computers are effectively establishing themselves as centers of home entertainment, we suspect that peer-to-peer functionality will remain popular and become more widespread.


2.5 Small-Worlds Networks Revisited

In this section we try to predict the evolution of the darknet should global peer-to-peer networks be effectively stopped by legal means. The globally accessible global database is the only infrastructure component of the darknet that can be disabled in this way. The other enabling technologies of the darknet (injection, distribution networks, rendering devices, storage) will not only remain available, but rapidly increase in power, based on general technological advances and the possible incorporation of cryptography. We stress that the networks described in this section (in most cases) provide poorer services than the global network, and would only arise in the absence of a global database.

In the absence of a global database, small-worlds networks could again become the prevalent form of the darknet. However, these small-worlds will be more powerful than they were in the past. With the widespread availability of cheap CD and DVD readers and writers as well as large hard disks, the bandwidth of the sneaker net has increased dramatically, the cost of object storage has become negligible, and object injection tools have become ubiquitous. Furthermore, the internet is available as a distribution mechanism that is adequate for audio for most users, and is becoming increasingly adequate for video and computer programs. In light of strong cryptography, it is hard to imagine how sharing could be observed and prosecuted as long as users do not share with strangers.

In concrete terms, students in dorms will establish darknets to share content in their social group. These darknets may be based on simple file sharing, DVD copying, or may use special application programs or servers: for example, a chat or instant-messenger client enhanced to share content with members of your buddy list. Each student will be a member of other darknets: for example, their family, various special interest groups, friends from high school, and colleagues in part-time jobs (Fig. 3). If there are a few active super-peers (users that locate and share objects with zeal) then we can anticipate that content will rapidly diffuse between darknets, and relatively small darknets arranged around social groups will approach the aggregate libraries that are provided by the global darknets of today. Since the legal exposure of such sharing is quite limited, we believe that sharing amongst socially oriented groups will increase unabated.

Small-worlds networks suffer somewhat from the lack of a global database; each user can only see the objects stored by his small-world neighbors. This raises a number of interesting questions about the network structure and object flow:


- What graph structure will the network have? For example, will it be connected? What will be the average distance between two nodes?

- Given a graph structure, how will objects propagate through the graph? In particular, what fraction of objects will be available at a given node? How long does it take for objects to propagate (diffuse) through the network?

Questions of this type have been studied in different contexts in a variety of fields (mathematics, computer science, economics, and physics). A number of empirical studies seek to establish structural properties of different types of small-world networks, such as social networks [20] and the world-wide web [3]. These works conclude that the diameter of the examined networks is small, and observe further structural properties, such as a power law of the degree distribution [5]. A number of authors seek to model these networks by means of random graphs, in order to perform more detailed mathematical analysis on the models [2],[8],[21],[22] and, in particular, study the possibility of efficient search under different random graph distributions [18],[19]. We will present a quantitative study of the structure and dynamics of small-worlds networks in an upcoming paper, but to summarize, small-worlds darknets can be extremely efficient for popular titles: very few peers are needed to satisfy requests for top-20 books, songs, movies or computer programs. If darknets are interconnected, we expect the effective introduction rate to be large. Finally, if darknet clients are enhanced to actively seek out new popular content, as opposed to the user-demand based schemes of today, small-worlds darknets will be very efficient.
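As a sketch of how such diffusion questions could be explored numerically (the graph parameters, seeding, and per-round copy probability below are illustrative assumptions, not results from the upcoming study), one can inject an object at a single node of a small-world graph and watch it spread:

```python
import random
import networkx as nx

random.seed(1)

# Interconnected small worlds, approximated by a Watts-Strogatz graph.
g = nx.watts_strogatz_graph(n=2000, k=8, p=0.05, seed=1)

have_object = {0}            # the object is injected at a single node
p_copy = 0.3                 # chance a neighbor copies it in a given round

for round_number in range(1, 21):
    newly_copied = set()
    for u in have_object:
        for v in g.neighbors(u):
            if v not in have_object and random.random() < p_copy:
                newly_copied.add(v)
    have_object |= newly_copied
    coverage = len(have_object) / g.number_of_nodes()
    print(f"round {round_number}: {coverage:.1%} of nodes hold the object")
```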

3 Introducing Content into the Darknet

Our analysis and intuition have led us to believe that efficient darknets – in global or small-worlds form – will remain a fact of life. In this section we examine rights-management technologies that are being deployed to limit the introduction rate or decrease the rate of diffusion of content into the darknet.

3.1 Conditional Access Systems

A conditional-access system is a simple form of rights-management system in which subscribers are given access to objects based (typically) on a service contract. Digital rights management systems often perform the same function, but typically impose restrictions on the use of objects after unlocking.

Conditional access systems such as cable, satellite TV, and satellite radio offer little or no protection against objects being introduced into the darknet from subscribing hosts. A conditional-access system customer has no access to channels or titles to which they are not entitled, and has essentially free use of channels that they have subscribed or paid for. This means that an investment of ~$100 (at time of writing) in an analog video-capture card is sufficient to obtain and share TV programs and movies. Some CA systems provide post-unlock protections, but they are generally cheap and easy to circumvent.

Thus, conditional access systems provide a widely deployed, high-bandwidth source of video material for the darknet. In practice, the large size and low cost of CA-provided video content will limit the exploitation of the darknet for distributing video in the near term.

The same cannot be said of the use of the darknet to distribute conditional-access system broadcast keys. At some level, each head-end (satellite or cable TV head-end) uses an encryption key that must be made available to each customer (it is a broadcast),


and in the case of a satellite system this could be millions of homes. CA-system providers take measures to limit the usefulness of exploited session keys (for example, they are changed every few seconds), but if darknet latencies are low, or if encrypted broadcast data is cached, then the darknet could threaten CA-system revenues.

We observe that the exposure of the conditional access provider to losses due to piracy is proportional to the number of customers that share a session key. In this regard, cable operators are in a safer position than satellite operators because a cable operator can narrowcast more cheaply.
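A back-of-the-envelope sketch of this race between key rotation and key redistribution (all numbers below are illustrative assumptions, not measurements of any real CA system):

```python
# Hypothetical timing model: a leaked session key is useful to non-subscribers
# only if it can be extracted and redistributed before the provider rotates it.
key_rotation_period_s = 10.0     # provider changes the session key every 10 s
extraction_delay_s = 0.5         # time for a compromised client to read the key
darknet_latency_s = 2.0          # time to push the key to other darknet hosts

time_to_redistribute = extraction_delay_s + darknet_latency_s
useful_window = key_rotation_period_s - time_to_redistribute

if useful_window > 0:
    fraction = useful_window / key_rotation_period_s
    print(f"shared key is usable for {fraction:.0%} of each rotation period")
else:
    print("rotation outpaces the darknet; live key sharing fails "
          "(though caching the encrypted broadcast still works)")
```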

3.2 DRM Systems

A classical DRM system is one in which a client obtains content in protected (typically encrypted) form, with a license that specifies the uses to which the content may be put. Examples of licensing terms that are being explored by the industry are "play on these three hosts," "play once," "use computer program for one hour," etc.

The license and the wrapped content are presented to the DRM system, whose responsibility is to ensure that:

a) The client cannot remove the encryption from the file and send it to a peer,
b) The client cannot "clone" its DRM system to make it run on another host,
c) The client obeys the rules set out in the DRM license, and
d) The client cannot separate the rules from the payload.
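A minimal sketch of a client-side check enforcing rules of this kind (the `License` fields and class names are hypothetical; this illustrates the general idea, not any particular commercial DRM system):

```python
import hashlib
import time
from dataclasses import dataclass

@dataclass(frozen=True)
class License:
    content_id: str
    allowed_hosts: frozenset     # e.g. "play on these three hosts"
    plays_allowed: int           # e.g. "play once"
    expires_at: float            # e.g. "use for one hour"

class DrmClient:
    """Keeps the decryption key internal; never hands decrypted bytes back (rule a)."""

    def __init__(self, host_id: str, content_key: bytes):
        self.host_id = host_id                    # bound to one host (rule b)
        self._content_key = content_key
        self._plays_used = {}

    def render(self, wrapped_content: bytes, lic: License):
        # License rules are evaluated together with the payload (rules c and d).
        if self.host_id not in lic.allowed_hosts:
            raise PermissionError("this host is not licensed")
        if time.time() > lic.expires_at:
            raise PermissionError("license expired")
        used = self._plays_used.get(lic.content_id, 0)
        if used >= lic.plays_allowed:
            raise PermissionError("play count exhausted")
        self._plays_used[lic.content_id] = used + 1

        # Stand-in for decryption: derive a keystream and XOR it with the payload.
        keystream = hashlib.sha256(self._content_key + lic.content_id.encode()).digest()
        cleartext = bytes(b ^ keystream[i % len(keystream)]
                          for i, b in enumerate(wrapped_content))
        self._send_to_renderer(cleartext)         # decrypted data is never returned

    def _send_to_renderer(self, data: bytes):
        print(f"rendering {len(data)} bytes")

# Usage sketch
client = DrmClient(host_id="host-A", content_key=b"secret")
lic = License("song-1", frozenset({"host-A"}), plays_allowed=1,
              expires_at=time.time() + 3600)
client.render(b"...wrapped bytes...", lic)        # plays once; a second call raises
```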

Advanced DRM systems may go further.

Some such technologies have been commercially very successful – the content scrambling system used in DVDs, and (broadly interpreted) the protection schemes used by conditional access system providers fall into this category, as do newer DRM systems that use the internet as a distribution channel and computers as rendering devices. These technologies are appealing because they promote the establishment of new businesses, and can reduce distribution costs. If costs and licensing terms are appealing to producers and consumers, then the vendor thrives. If the licensing terms are unappealing or inconvenient, the costs are too high, or competing systems exist, then the business will fail. The DivX "DVD" rental model failed on most or all of these metrics, but CSS-protected DVDs succeeded beyond the wildest expectations of the industry.

On personal computers, current DRM systems are software-only systems using a variety of tricks to make them hard to subvert. DRM-enabled consumer electronics devices are also beginning to emerge.

In the absence of the darknet, the goal of such systems is to have comparable security to competing distribution systems – notably the CD and DVD – so that programmable computers can play an increasing role in home entertainment. We will speculate whether these strategies will be successful in Sect. 5.

DRM systems strive to be BOBE (break-once, break everywhere)-resistant. That is, suppliers anticipate (and the assumptions of the darknet predict) that individual instances (clients) of all security systems, whether based on hardware or software, will be subverted. If a client of a system is subverted, then all content protected by that DRM client can be unprotected. If the break can be applied to any other DRM client of that class so that all of those users can break their systems, then the DRM scheme is BOBE-weak. If, on the other hand, knowledge gained breaking one client cannot be applied elsewhere, then the DRM system is BOBE-strong.

Most commercial DRM systems have BOBE exploits, and we note that the darknet applies to DRM hacks as well. The CSS system is an exemplary BOBE-weak system. The knowledge and code that comprised the De-CSS exploit spread uncontrolled around


2) Require a license to play: for example, if a watermark is found indicating that content is rights-restricted, then the renderer may demand a license indicating that the user is authorized to play the content.

Such systems were proposed for audio content – for example the secure digital music initiative (SDMI) [16] – and are under consideration for video by the copy-protection technical working group (CPTWG) [12].

There are several reasons why it appears unlikely that such systems will ever become an effective anti-piracy technology. From a commercial point of view, building a watermark detector into a device renders it strictly less useful for consumers than a competing product that does not. This argues that watermarking schemes are unlikely to be widely deployed, unless mandated by legislation. The recently proposed Hollings bill is a step along these lines [11].

We contrast watermark-based policing with classical DRM: if a general-purpose device is equipped with a classical DRM system, it can play all content acquired from the darknet, and have access to new content acquired through the DRM channel. This is in stark distinction to the reduction of functionality inherent in watermark-based policing.

Even if watermarking systems were mandated, this approach is likely to fail due to a variety of technical inadequacies. The first inadequacy concerns the robustness of the embedding layer. We are not aware of systems for which simple data transformations cannot strip the mark or make it unreadable. Marks can be made more robust, but in order to recover marks after adversarial manipulation, the reader must typically search a large phase space, and this quickly becomes untenable. In spite of the proliferation of proposed watermarking schemes, it remains doubtful whether robust embedding layers for the relevant content types can be found.

A second inadequacy lies in unrealistic assumptions about key management. Most watermarking schemes require widely deployed cryptographic keys. Standard watermarking schemes are based on the normal cryptographic principles of a public algorithm and secret keys. Most schemes use a shared key between marker and detector. In practice, this means that all detectors need a private key, and, typically, share a single private key. It would be naïve to assume that these keys will remain secret for long in an adversarial environment. Once the key or keys are compromised, the darknet will propagate them efficiently, and the scheme collapses. There have been proposals for public-key watermarking systems. However, so far, this work does not seem practical, and the corresponding schemes do not even begin to approach the robustness of the cryptographic systems whose name they borrow.
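To make the shared-key model concrete, here is a minimal sketch of additive spread-spectrum marking with correlation detection (a textbook construction used only as an illustration; the key, strength, and threshold values are arbitrary). Both embedder and detector need the same key, so a leaked detector key lets anyone locate and remove the mark:

```python
import numpy as np

def keyed_mark(key: int, length: int) -> np.ndarray:
    """Pseudo-random +/-1 watermark derived from the shared secret key."""
    rng = np.random.default_rng(key)
    return rng.choice([-1.0, 1.0], size=length)

def embed(signal: np.ndarray, key: int, strength: float = 0.02) -> np.ndarray:
    return signal + strength * keyed_mark(key, signal.size)

def detect(signal: np.ndarray, key: int, threshold: float = 0.01) -> bool:
    """Correlate with the keyed mark; anyone holding the key can run this."""
    correlation = np.dot(signal, keyed_mark(key, signal.size)) / signal.size
    return correlation > threshold

rng = np.random.default_rng(0)
content = rng.normal(size=100_000)          # stand-in for audio samples
marked = embed(content, key=1234)

print(detect(marked, key=1234))             # True: detector holds the shared key
print(detect(content, key=1234))            # False: unmarked content
print(detect(marked - 0.02 * keyed_mark(1234, marked.size), key=1234))
# False: once the key leaks, the mark can simply be subtracted out
```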

A final consideration bears on the location of mandatory watermark detectors in client devices. On open computing devices (e.g. personal computers), these detectors could, in principle, be placed in software or in hardware. Placing detectors in software would be largely meaningless, as circumvention of the detector would be as simple as replacing it by a different piece of software. This includes detectors placed in the operating system, all of whose components can be easily replaced, modified and propagated over the darknet.

Alternatively, the detectors could be placed in hardware (e.g. audio and video cards). In the presence of the problems described, this would lead to untenable renewability problems: the hardware would be ineffective within days of deployment. Consumers, on the other hand, expect the hardware to remain in use for many years. Finally, consumers themselves are likely to rebel against "footing the bill" for these ineffective content protection systems. It is virtually certain that the darknet would be filled with a continuous supply of watermark removal tools, based on compromised keys and weaknesses in the embedding layer. Attempts to force the public to "update" their hardware would not only be intrusive, but impractical.


In summary, attempts to mandate content protection systems based on watermark detection at the consumer's machine suffer from commercial drawbacks and severe technical deficiencies. These schemes, which aim to provide content protection beyond DRM by attacking the darknet, are rendered entirely ineffective by the presence of even a moderately functional darknet.

4.2 Fingerprinting

Fingerprint schemes are based on similar technologies and concepts to watermarking schemes. However, whereas watermarking is designed to perform a priori policing, fingerprinting is designed to provide a posteriori forensics.

In the simplest case, fingerprinting is used for individual-sale content (as opposed to super-distribution or broadcast – although it can be applied there with some additional assumptions). When a client purchases an object, the supplier marks it with an individualized mark that identifies the purchaser. The purchaser is free to use the content, but if it appears on a darknet, a policeman can identify the source of the content and the offender can be prosecuted.

Fingerprinting suffers from fewer technical problems than watermarking. The main advantage is that no widespread key distribution is needed – a publisher can use whatever secret or proprietary fingerprinting technology they choose, and is entirely responsible for the management of their own keys.

Fingerprinting has one problem that is not found in watermarking. Since each fingerprinted copy of a piece of media is different, if a user can obtain several different copies, he can launch collusion attacks (e.g. averaging). In general, such attacks are very damaging to the fingerprint payload.
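A rough numerical sketch of such an averaging attack (reusing the illustrative spread-spectrum marks from the watermarking sketch above; the parameters and user count are arbitrary assumptions) shows how combining a few differently fingerprinted copies dilutes each individual mark:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
content = rng.normal(size=n)                 # stand-in for the original media
strength = 0.02

# Each purchaser receives a copy carrying their own pseudo-random fingerprint.
fingerprints = {user: rng.choice([-1.0, 1.0], size=n) for user in range(5)}
copies = {user: content + strength * fp for user, fp in fingerprints.items()}

# Colluders simply average their differently marked copies.
colluded = np.mean(list(copies.values()), axis=0)

for user, fp in fingerprints.items():
    single = np.dot(copies[user] - content, fp) / n     # about `strength`
    averaged = np.dot(colluded - content, fp) / n       # about `strength / 5`
    print(f"user {user}: alone={single:.4f}, after collusion={averaged:.4f}")
```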

It remains to be seen whether fingerprinting will act as a deterrent to theft. There is currently no legal precedent for media fingerprints being evidence of crime, and this case will probably be hard to make – after all, detection is a statistical process with false positives, and plenty of opportunity for deniability. However, we anticipate that there will be uneasiness in sharing a piece of content that may contain a person's identity, and that ultimately leaves that person's control.

Note also that with widely distributed watermarking detectors, it is easy to see whether you have successfully removed a watermark. There is no such assurance for determining whether a fingerprint has been successfully removed from an object, because users are not necessarily knowledgeable about the fingerprint scheme or schemes in use. However, if it turns out that the deterrence of fingerprinting is small (i.e. everyone shares their media regardless of the presence of marks), there is probably no reasonable legal response. Finally, distribution schemes in which objects must be individualized will be expensive.

5 Conclusions

There seem to be no technical impediments to darknet-based peer-to-peer file sharing technologies growing in convenience, aggregate bandwidth and efficiency. The legal future of darknet technologies is less certain, but we believe that, at least for some classes of user, and possibly for the population at large, efficient darknets will exist. The rest of this section will analyze the implications of the darknet from the point of view of individual technologies and of commerce in digital goods.


5.1 Technological Implications

DRM systems are limited to protecting the content they contain. Beyond our first assumption about the darknet, the darknet is not impacted by DRM systems. In light of our first assumption about the darknet, DRM design details, such as properties of the tamper-resistant software, may be strictly less relevant than the question whether the current darknet has a global database. In the presence of an infinitely efficient darknet – which allows instantaneous transmission of objects to all interested users – even sophisticated DRM systems are inherently ineffective. On the other hand, if the darknet is made up of isolated small worlds, even BOBE-weak DRM systems are highly effective. The interesting cases arise between these two extremes – in the presence of a darknet which is connected, but in which factors such as latency, limited bandwidth or the absence of a global database limit the speed with which objects propagate through the darknet. It appears that quantitative studies of the effective "diffusion constant" of different kinds of darknets would be highly useful in elucidating the dynamics of DRM and the darknet.

Proposals for systems involving mandatory watermark detection in rendering devices try to impact the effectiveness of the darknet directly by trying to detect and eliminate objects that originated in the darknet. In addition to severe commercial and social problems, these schemes suffer from several technical deficiencies, which, in the presence of an effective darknet, lead to their complete collapse. We conclude that such schemes are doomed to failure.

5.2 Business in the Face of the Darknet

There is evidence that the darknet will continue to exist and provide low-cost, high-quality service to a large group of consumers. This means that in many markets, the darknet will be a competitor to legal commerce. From the point of view of economic theory, this has profound implications for business strategy: for example, increased security (e.g. stronger DRM systems) may act as a disincentive to legal commerce. Consider an MP3 file sold on a web site: this costs money, but the purchased object is as useful as a version acquired from the darknet. However, a securely DRM-wrapped song is strictly less attractive: although the industry is striving for flexible licensing rules, customers will be restricted in their actions if the system is to provide meaningful security. This means that a vendor will probably make more money by selling unprotected objects than protected objects. In short, if you are competing with the darknet, you must compete on the darknet's own terms: that is, convenience and low cost rather than additional security.

Certain industries have faced this (to a greater or lesser extent) in the past. Dongle-protected computer programs lost sales to unprotected programs, or hacked versions of the program. Users have also refused to upgrade to newer software versions that are copy protected.

There are many factors that influence the threat of the darknet to an industry. We see the darknet having the most direct bearing on mass-market consumer IP goods. Goods sold to corporations are less threatened because corporations mostly try to stay legal, and will police their own intranets for illicit activities. Additionally, the cost per bit and the total size of the objects have a huge bearing on the competitiveness of today's darknets compared with legal trade. For example, today's peer-to-peer technologies provide excellent service quality for audio files, but users must be very determined or price-sensitive to download movies from a darknet, when the legal competition is a rental for a few dollars.


References

[1] E. Adar and B. A. Huberman, Free Riding on Gnutella, http://www.firstmonday.dk/issues/issue5_10/adar/index.html
[2] W. Aiello, F. Chung and L. Lu, Random evolution in massive graphs, In Proceedings of the 42nd Annual IEEE Symposium on Foundations of Computer Science, pages 510-519, 2001.
[3] R. Albert, H. Jeong and A.-L. Barabási, Diameter of the world-wide web, Nature 401, pages 130-131, 1999.
[4] D. Aucsmith, Tamper Resistant Software, An Implementation, Information Hiding 1996, Proceedings: Springer 1998.
[5] A.-L. Barabási and R. Albert, Emergence of scaling in random networks, Science 286, pages 509-512, 1999.
[6] I. Clarke, O. Sandberg, B. Wiley and T. Hong, Freenet: A distributed information storage and retrieval system, International Workshop on Design Issues in Anonymity and Unobservability, 2000.
[7] R. Clarke, A defendant class action lawsuit, http://www.kentlaw.edu/perritt/honorsscholars/clarke.html
[8] C. Cooper and A. Frieze, A general model of web graphs, Proceedings of ESA 2001, pages 500-511, 2001.
[9] F. Dabek, E. Brunskill, M. F. Kaashoek, D. Karger, R. Morris, I. Stoica and H. Balakrishnan, Building peer-to-peer systems with Chord, a distributed lookup service, In Proceedings of the Eighth IEEE Workshop on Hot Topics in Operating Systems (HotOS-VIII), pages 81-86, 2001.
[10] S. Hand and T. Roscoe, Mnemosyne: peer-to-peer steganographic storage, In Proceedings of the First International Workshop on Peer-to-Peer Systems, 2002.
[11] Senator Fritz Hollings, Consumer Broadband and Digital Television Promotion Act.
[12] http://www.cptwg.org
[13] http://www.gnutelladev.com/protocol/gnutella-protocol.html
[14] http://www.napster.com
[15] http://www.riaa.org
[16] http://www.sdmi.org
[17] M. Javanovic, F. Annextein and K. Berman, Scalability Issues in Large Peer-to-Peer Networks - A Case Study of Gnutella, ECECS Department, University of Cincinnati, Cincinnati, OH 45221.
[18] J. Kleinberg, Navigation in a small world, Nature 406, 2000.
[19] J. Kleinberg, Small-world phenomena and the dynamics of information, Advances in Neural Information Processing Systems (NIPS) 14, 2001.
[20] S. Milgram, The small world problem, Psychology Today, vol. 2, pages 60-67, 1967.
[21] M. Newman, Small worlds: the structure of social networks, Santa Fe Institute, Technical Report 99-12-080, 1999.
[22] M. Newman, D. Watts and S. Strogatz, Random graph models of social networks, Proc. Natl. Acad. Sci. USA 99, pages 2566-2572, 2002.
[23] I. Stoica, R. Morris, D. Karger, M. F. Kaashoek and H. Balakrishnan, Chord: A scalable peer-to-peer lookup service for internet applications, In Proceedings of the ACM SIGCOMM 2001 Conference (SIGCOMM-01), pages 149-160, 2001.
[24] D. J. Watts and S. H. Strogatz, Collective dynamics of small-world networks, Nature 393, pages 440-442, June 1998.