
Information Security: Why the Future Belongs to the Quants

Daniel Geer Jr., Kevin Soo Hoo, and Andrew Jaquith, @stake

Although most businesses say information security is a primary concern, few have adequate systems in place because securing information requires a risk-management approach with dependable, quantifiable metrics.


The revolutionary idea that defines the boundary between modern times and the past is the mastery of risk: the notion that the future is more than a whim of the gods and that men and women are not passive before nature. Until human beings discovered a way across that boundary, the future was the mirror of the past or the murky domain of oracles and soothsayers who held a monopoly over knowledge of anticipated events.

— Peter Bernstein, Against the Gods: The Remarkable Story of Risk

Information security is in the first stages of the journey Bernstein describes as distinguishing the modern era.1 Since the dawn of modern computing, security has been left to computer security experts—chiefly technologists whose technical understanding qualified them to shoulder the responsibility of keeping computers and their valuable information safe. The rapid growth of society's dependence on information systems, particularly the Internet, has drawn attention to the glaring vulnerability of this fragile infrastructure.

Most organizations call security a "top management priority," and chief executive officers rank security as 7.5 out of 10 in importance,2 but funding levels don't match the rhetoric. On average, companies spend only 0.047 percent of their revenue on security. An industry-average, composite firm has a US$1.1 million average annual security budget (excluding staff), translating into a grand total of US$196 spent on security per employee per year.3 Furthermore, much of this goes to product, not process, which, as Bruce Schneier points out, is backward: Security is not a product, it's a process.4 Why the disconnect?

Simple questions, readily answered in any other business context, are met by information security experts with embarrassed silence. These questions include:

• Is my security better this year?
• What am I getting for my security dollars?
• How do I compare with my peers?

Answering such questions requires rigorous security metrics and a risk-management framework in which to compare them.

Risk management: The only viable means

Although risk management has detractors, leading companies and entire industry sectors view it as the only viable means for dealing with information security risks. The alternative is a return to the murky domain of oracles and soothsayers who dispense security advice that companies follow, hoping it will keep them safe. In this old-world model, fear of the catastrophic consequences of an information attack, uncertainty about their vulnerability, and doubt as to the sufficiency of current safeguards drive organizations' security decisions.

Formal risk management seeks to move beyond fear, uncertainty, and doubt (FUD) to a framework in which organizations can quantify the likelihood of danger, estimate the extent of possible damage, and weigh the costs of security safeguards against their expected effectiveness. Four important realizations are pushing companies away from FUD and toward risk management:


• Information asset fragility. Companies in most industries realize that efficient operation of their complex enterprises depends on information. Every known instance of critical information corruption, damage, or destruction intensifies their concern over this dependence. Both awareness of information security and the will to address it are spreading.

• Provable security. Because no good, consistent security metrics are available, companies find themselves unable to gauge the suitability or effectiveness of different security options accurately. Consequently, the amount a company can spend on "improving" security has no natural bounds beyond its ability to pay.

• Cost justification. Economic recession and the rising cost of security solutions mean security vendors must compete with other infrastructure projects for information technology dollars. Cost-benefit analyses and return-on-investment calculations are becoming standard prerequisites for any information security sale.

• Accountability. Various industry-specific regulatory bodies, recognizing the growing exposure of their industries to information security risks, are mandating mechanisms for managing those risks. For example, the Basel II Capital Accords will soon require banks to set aside capital reserves explicitly to cover operational risks, including information security risks. Both the Gramm-Leach-Bliley Act (www.senate.gov/~banking/conf) and the Health Insurance Portability and Accountability Act (http://cms.hhs.gov/hipaa) compel companies in the US to be accountable for information security, and the Sarbanes-Oxley Act (www.sarbanes-oxley.com) includes mandatory reporting of material business changes, including serious information security incidents. Finally, when considering the tender ministrations of New York Attorney General Eliot Spitzer, whose apparent goal is to craft an investor-protection regime centered on the avoidance of information sharing, there's no doubt that some form of accountability is here to stay.

Early adopters are already applying risk-management tools to information security. Security measurement and quantification is a stumbling block, however. In spite of nascent efforts to perform security return-on-investment and cost-benefit analyses, reliable, statistically representative information security metrics do not exist.

If information security were a public health problem, individual companies might have some rudimentary understanding of their own information security health, but we have no aggregate basis for public policy because organizations do not share their data.

Roadblocks to data sharing

Anonymized data sharing among companies is an important tool for obtaining aggregate security metrics. In the same way that a doctor might share anonymous patient information with the US Centers for Disease Control, companies could share details of their information security experiences to help each other see the overall security picture. Several practical challenges, legal concerns, and incentive failures have stalled and continue to stall this type of data sharing.

Practical challenges range from the lack of common definitions for terms and metrics to determining how to share information meaningfully. The vocabulary of information security is fraught with imprecision and overlapping meanings. Fundamental concepts and words, such as incident, attack, threat, risk, and vulnerability, mean different things to different people. Basic metrics for counting events are difficult to collect because the terms are unclear. More complicated metrics, such as consequences of security breaches, present even greater challenges because of the time, effort, and uncertainty involved in assessing them and the inherent influence of subjective judgment. Few companies will divert scarce security resources away from security system implementations in favor of collecting and sharing security metrics. Finally, even if companies try to pool data regularly, no underlying infrastructure for compiling data for analysis exists to support them.

Policy makers in Washington, D.C., studied private industry's reservations about sharing information security data in an attempt to discover whether government could facilitate information sharing. From a legal perspective, sharing information among industry competitors could constitute collusion and therefore violate US antitrust laws. In addition, as information about security practices and incidents becomes well known, sharing data might increase the risk of liability lawsuits. If a government agency participates in the sharing, it might be compelled to disclose data to the public under the Freedom of Information Act (FOIA), even if the data are proprietary and confidential.

Fortunately, policy makers have addressed many of these legal concerns, including antitrust concerns and obligations under the FOIA (though, as we write this, the situation is fluid and the preconditions for sharing still vulnerable to election-year demagoguery).

The unwillingness of many companies to share security information points to a general failing in the marketplace to provide appropriate incentives. Concerns over reputation and losing customers to competitors tend to suppress many companies' inclination to report incidents, much less share information about them. Furthermore, the desire to protect confidential and proprietary information that might have been compromised reinforces the silence.

Ultimately, the problem is due to lack of trust. Many companies believe that disclosing sensitive security information, even anonymously, will hurt future business. Without a structure to contradict this, their fears are as rational as the fundamental necessity of sharing.

What to measure

In settings that require balancing the cost of countermeasures against the cost of risk, decision support is precisely the point of any measurement exercise. Getting the right measurements depends on knowing the right questions. In medicine, a doctor asks, what is the patient's malady? In public health, the questions are, how many patients have this malady, and how did they get it?

In security, business leaders ask,

• How secure am I?
• Am I better off than I was this time last year?
• How do I compare with my peers?
• Am I spending the right amount of money?
• What are my risk-transfer options?

Were we talking about some other field, we could look to prior art and industry-specific knowledge—for example, derivatives pricing in finance, health and safety in pharmaceutical manufacture, reliability in power distribution, and warehouse turns in just-in-time logistics. With security, however, we're almost starting from zero.

Data to gather

So, what data should a business gather? For now, the answer is somewhat circular: a business should measure what it can measure. By asking itself what it can measure, the business learns something valuable, such as the quality and consistency of its existing information security process and policy. By using what it already measures, or could easily measure, it begins to draw baselines it can later use to compare both performance over time and performance against its peers. Candidate measurements include

• Application defects with security implications, as identified by phase of development
• Network vulnerabilities, even if found only by scanning
• Ratios of the number of user sessions to suspicious activities, and to bona fide attacks
• Password cracks using automated tools
• Intrusions and attempted intrusions
• Patch coverage and fully burdened costs of installation per patch
• Network probes, both incoming and outgoing
• Costs of insecurity, especially if the opportunity costs can be captured

This list is certainly not exhaustive. Perhaps most importantly, businesses should start measuring things that eventually let them define "normal." No security event is normal, but unless the organization defines or quantifies what is normal, it will be slow to detect what is not.

Commercial intrusion detection systems (IDSs) have a global sense of what is abnormal for perimeter-crossing attacks. The real threats are inside, however, and only an insider monitoring internal measurements can say what is normal inside a company.

In short, the first rule of statistics applies: Get the data; you can always throw them away later.
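To make the baseline idea concrete, here is a minimal Python sketch of defining "normal" from internal measurements and flagging deviations. The metric name, the numbers, and the three-sigma rule are illustrative assumptions, not recommendations from this article.

# Hypothetical sketch: establish a "normal" baseline for one internal metric
# (say, failed logins per day) and flag days that deviate from it.
from statistics import mean, stdev

failed_logins_per_day = [112, 98, 105, 120, 101, 97, 340, 110, 108, 95]

baseline = failed_logins_per_day[:6]          # earlier observations define "normal"
mu, sigma = mean(baseline), stdev(baseline)

for day, count in enumerate(failed_logins_per_day, start=1):
    if abs(count - mu) > 3 * sigma:           # crude three-sigma rule, purely illustrative
        print(f"day {day}: {count} failed logins deviates from baseline "
              f"(mean {mu:.0f}, sd {sigma:.0f})")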

Data models to steal

In security, we are hardly inventing anything revolutionary when we support decision making under uncertainty. Whenever data are scarce, the standard approach is to build models from other fields and insert expert opinion to supplement the data. We can update and calibrate the models as new data become available, perhaps completely replacing expert opinion with objective measurements in the future. It might be an evolutionary process, but that is certainly no reason for us to start from scratch. As Picasso said, "Good artists copy. Great artists steal." So let's do some rational stealing.

Quality assurance literature. The quality control literature is vast, close in spirit to the literature the security world needs to develop, and has been quite successful over the last two decades. The mantra of the 1980s was, "Quality is free," and without a doubt, the average product quality has improved since then, as measured in any number of ways. A classic study of software quality convincingly demonstrated that developers should work on quality early in the process.5 Table 1 shows how the cost to fix defects increases in later development stages.

We can steal much more from the quality control literature, particularly if we treat security flaws as special cases of quality flaws, assuming they are accidental and not the fruits of sabotage.

Table 1. Relative cost to correct security defects, by stage.

Stage            Relative cost
Design             1.0
Implementation     6.5
Testing           15.0
Maintenance      100.0

Public health terminology and reporting structure. Public health tries to answer questions about disease incidence, prevalence, and spread—that is, to get the "big picture," literally. The choice of the term "virus" to describe the electronic variety was fitting and naturally suggests that we consider public health's terminology and reporting structure as models for security.

Consider the epidemiologic concept of herd immunity, which means what it sounds like: for a population to survive, not every member need be immune to a pathogen. Rather, just enough of the population must be immune to prevent an epidemic. In the natural world, species and individual diversity are protective; in the electronic world, standardization on overwhelmingly dominant vendors creates a near monoculture of limited, if any, herd immunity.
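For readers who want the arithmetic behind the analogy, the standard epidemiological relationship (not stated in this article) is that an outbreak stalls once more than 1 - 1/R0 of the population is immune, where R0 is the number of hosts each infected host would otherwise compromise. A small Python sketch with illustrative values:

# Classic herd-immunity threshold from epidemiology: if each infected host would
# otherwise infect R0 others, an epidemic stalls once more than 1 - 1/R0 of the
# population is immune. The R0 values below are illustrative, not measurements.
def herd_immunity_threshold(r0: float) -> float:
    return 1.0 - 1.0 / r0

for r0 in (1.5, 3.0, 10.0):   # a fast-spreading worm behaves like a high-R0 pathogen
    print(f"R0={r0:>4}: more than {herd_immunity_threshold(r0):.0%} of hosts must be immune")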

Consider the Nimda worm. Even had we known the precise susceptibility characteristics for each of Nimda's transmission vectors, we would not have guessed what Nimda could do by merging several vectors into one. In this sense, diseases that mutate quickly or that affect the immune system are like Nimda. Similarly, little difference exists between an asymptomatic disease carrier (a Typhoid Mary) and a distributed-denial-of-service (DDoS) zombie waiting to be triggered. Other parallels are easy to draw.

We already have a model for measuring public health when virulence and mutation of existing disease vectors are the issue: the CDC, which is supported by American taxpayers but plays a worldwide role. Specialists from several allied institutions have called for the creation of a CDC for the Internet, with electronic analogs to the CDC's epidemiologic coverage: mandatory reporting of communicable diseases, statistical identification of excess incidence, and longitudinal trend analysis to calibrate epidemiologic models.6

Portfolio management. Generally, portfolio management strategy aims to balance risk and reward and, when necessary, cap risk. Portfolio theory calls for managers to measure unique and systematic risk factors, calculate industry and company betas, and rigorously apply well-understood financial yardsticks.7 Systems administrators who plan workable modes of diminished operation for computer systems are doing precisely the same thing as the portfolio managers who hedge their investment positions.

Portfolio management tries to tie reward to risk, and to do so as risk and the need for reward change. If the official measure of homeland security risk today moved from yellow to orange, what knobs would you adjust? If you answer, "We don't even have knobs to adjust," or, "All of our knobs are already turned all the way up to distort," you clearly need a better risk model, one hedged and calibrated enough to respond to external risk indicators. If you haven't calibrated the model with measurement, only one thing is certain: You will either overspend or underprotect. Recognizing this fault, analyst firms such as GIGA Group are explicitly touting portfolio management as an essential IT management skill and strategy.8

Accelerated failure testing. Measurement drives reproducibility in manufactured goods, just as reproducibility drives productivity gains, and productivity gains create wealth. Accelerated failure testing is the calibration of manufacturing quality under the constraint of time compression.

To test a toaster, evaluators move the handle up and down 5,000 times, make toast at –40°, observe the effects of exploding bagels, and so forth. Their purpose is to give the toaster a lifetime of abuse all at once—that is, to compress time. In principle, the difference between testing a toaster and performing a serious penetration test of a Web server is very small. Thus, we might learn from the accelerated failure time testing literature.

For the toaster, we ask, "How much abuse before it breaks," not "Can we break the toaster?" In security, the measure of interest is parallel. The level-of-effort to subvert—how hard it is for the attacker to penetrate the Web server, not whether the attacker can—yields the highest return value to the upstream client. With a trustworthy answer, a rational decision maker can decide whether a plausible threat to the business exists. Without a trustworthy answer, the decision maker will fall prey to FUD. If FUD represents the dark ages, then measurement is the means to enlightenment.
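As one hypothetical illustration of how a measured level-of-effort could feed such a decision, the following Python sketch compares the estimated cost of subverting a system against the attacker's expected payoff. The decision rule and all numbers are invented for illustration; they are not from this article.

# Hypothetical decision sketch: compare the measured level-of-effort to subvert a
# system (priced out in dollars) against a rough estimate of the attacker's payoff.
def plausible_threat(attacker_cost_usd: float, attacker_payoff_usd: float) -> bool:
    """Treat a threat as plausible if subverting the system costs less than it pays."""
    return attacker_cost_usd < attacker_payoff_usd

level_of_effort = 40_000    # e.g., weeks of professional-caliber effort, priced out (invented)
expected_payoff = 250_000   # value of the data or fraud the attacker could realize (invented)

print("plausible threat" if plausible_threat(level_of_effort, expected_payoff)
      else "not an economically rational target")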

Measuring the level-of-effort to subvert, however, doesn't mean we assume the device, software, or system will never fail. Rather, we can now judge whether natural product rollover intervals are sufficient or whether we need to build in such strategies as mandatory upgrade, antiretention, or even automatic reinstallation. Some well-known products already use these strategies. The Windows Media Player's end-user license agreement, for example, has had mandatory upgrade and antiretention provisions for more than a year. Research at the University of Maryland frames the issue as determining the interval between flaw discovery and population-based patch installation.9

Insurance. Insurance is about estimating risk better than the counterparty to the transaction and deciding whether to underwrite it or lay it off via reinsurance. In most of the world, steadily accumulating actuarial data and learning from them brings success. In the Internet business world, however, such a strategy is almost impossible.

Actuarial data require some basic stability of the underlying entity being measured. When the underlying measurement entity is subject to change, the best alternative is a moving window (left censoring) rather than a full data capture. For example, pricing life insurance based on the average life expectancy over the last thousand years would make no sense—the change in average human life expectancy has been too great.
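A small Python sketch of the moving-window idea, using the life-expectancy analogy above; the figures are invented purely to show why a recent window beats a full historical average when the underlying entity keeps changing.

# Moving-window (left-censored) estimation: price risk from recent history only,
# because the underlying entity changes too fast for a full historical average.
# Life-expectancy figures below are invented solely to illustrate the contrast.
life_expectancy_by_decade = [35, 37, 40, 45, 48, 55, 60, 66, 70, 74, 77]  # oldest..newest

full_history_avg = sum(life_expectancy_by_decade) / len(life_expectancy_by_decade)
window = life_expectancy_by_decade[-3:]               # keep only the recent window
windowed_avg = sum(window) / len(window)

print(f"all-history average: {full_history_avg:.1f} years (misleading basis for pricing)")
print(f"recent-window average: {windowed_avg:.1f} years (what an actuary would use)")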

In the electronic business world, the technical flux of change is high—so high that actuarial data are practically impossible to obtain. Add configuration complexity and variability to technical flux, and the chance of finding any two sites with substantially the same operating environment is almost zero. For this reason, information risk insurance generally protects against named hazards (if a specific thing happens, the insurance company will pay) rather than an umbrella (no matter what happens, the insurance company will pay).

Insurers must answer the same question those who worry about the critical infrastructure of any major country must answer: What is the risk aggregation inherent in the portfolio? Or, in computerese, how likely is cascade failure (a failure that spreads from one system to another in a domino effect)? All important security events, including the Nimda worm, have been cascade failures. The challenge for insurers is to insure many parties against all sorts of electronic failures without subjecting their portfolios to risk aggregation. They don't want to structure their insurance products thinking they are insuring against house fires (one event equals one claim), only to discover they are insuring against earthquakes (one event equals many claims).

Until we can measure the risk of cascade failures, measuring species (platform) diversity in the computing environment will have to substitute. The data on diversity in this environment are not good, with a handful of companies dominating every aspect of it.10 In fact, species diversity is one of the great security tensions today. From the systems administration viewpoint, nothing is so sweet as having all platforms alike. From the security administration viewpoint, however, nothing is so frightening as having all platforms alike.

Reports to generate

If we had at least some of these data, what reports would we want to see? Some possibilities are

• Time-series trend analysis
• Self and peer trend (cross-sectional) analysis
• Net present value and other financial impact analysis
• Risks transferred, mitigated, and accepted

Most statisticians, epidemiologists, actuaries, and others could generate these reports. The problem is not a lack of expertise, but a lack of data. Yet the data to gather, sift, and convert to measures do exist. They exist in virus management systems, intrusion detection logs, trouble-ticketing applications, configuration management platforms, security reports from third-party consultants, and so on. Those who have such data have a professional duty to reveal what those data say.

The Hoover Project: Pooled data on application security

Companies today must defend against increasingly sophisticated threats targeting specific applications. In the past year, the security community has documented an increasing number of vulnerabilities targeting business applications such as Microsoft SQL Server, Oracle, and the Internet Information Server, as well as personal productivity applications such as Outlook. John Pescatore, a Gartner Group analyst, notes that "the current generation of firewalls focuses on the network level, kind of like the walls of a fort stopping direct attack. However, close to 75 percent of today's attacks are tunneling through applications."11

The @stake Hoover Project aimed to profile the current state of security by analyzing 45 e-business applications. We focused on applications rather than firewalls and related network infrastructure for two reasons. First, application-level attacks can easily traverse most firewalls. Second, paraphrasing infamous bank robber Willie Sutton's remark, that's where the money is. All the data presented here are real. At the same time, the data, by virtue of being from @stake customers, are biased toward companies keenly aware of information security issues and their importance. In this respect, the insights and conclusions are not representative of any general population, but are nevertheless instructive. Other firms with data like ours need to step up now.

Sidebar: Business-adjusted risk

Business-adjusted risk (BAR) is a technique for classifying security defects by their vulnerability type, degree of risk, and potential business impact. For each security defect, we calculated a BAR score as follows:

Business impact (1–5) × risk of exploit (1–5, depending on business context) = business-adjusted risk (1–25)

Risk of exploit indicates how easily an attacker can exploit a given defect. A score of 5 denotes high-risk, well-known defects an attacker can exploit with off-the-shelf tools or canned attack scripts. A score of 3 indicates that exploiting the defect requires intermediate skills and knowledge, such as the ability to write simple scripts. Finally, only a professional-caliber malicious attacker can exploit certain classes of defects; we give these defects a score of 1.

Business impact indicates the damage that would be sustained if the defect were exploited. An impact score of 5 represents a flaw that could cause significant financial impact, negative media exposure, and damage to a firm's reputation. A score of 3 indicates that a successful exploit could cause limited or quantifiable financial impact, and possible negative media exposure. Defects that would have no significant impact (monetary or otherwise) received a score of 1.

BAR is a simple tool for measuring risk: the higher the score, the higher the risk. Because BAR includes relative ratings for both likelihood of occurrence and business impact, it behaves similarly to insurers' annual loss expectancy calculations. A BAR score of 20, for example, denotes an order of magnitude more risk than 2.
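The sidebar's scoring reduces to a one-line calculation. The following Python sketch applies it to a few hypothetical defects and sums the scores into the engagement-level index used later in this article; the defect list itself is invented.

# Minimal sketch of the sidebar's business-adjusted risk (BAR) scoring:
# BAR = business impact (1-5) x risk of exploit (1-5), so scores range 1-25.
# Only the formula comes from the sidebar; the defects below are hypothetical.
defects = [
    {"name": "SQL injection in login form", "impact": 5, "exploit_risk": 5},
    {"name": "verbose error codes",          "impact": 1, "exploit_risk": 3},
    {"name": "weak session tokens",          "impact": 3, "exploit_risk": 3},
]

for d in defects:
    d["bar"] = d["impact"] * d["exploit_risk"]
    print(f'{d["name"]}: BAR = {d["bar"]}')

# The engagement-level index used in the cross-sectional analysis is the sum of BAR scores.
business_risk_index = sum(d["bar"] for d in defects)
print("business risk index:", business_risk_index)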

Cross-sectional analysis

One of @stake's primary business areas is assessing the security of client applications. Using a subset of engagement data gathered between February 2000 and July 2001, we created anonymized security profiles for 45 e-business applications (commercial packages, middleware platforms, and end-user e-commerce applications), including the potential risks they posed to our clients' businesses.

To understand a typical application's security profile, we examined each assessment in our sample, classifying security defects by vulnerability type, degree of risk, and potential business impact (see the "Business-adjusted risk" sidebar for a description of our scoring process).

We classified the application security defects into nine high-level and 56 lower-level categories based in part on the Open Web Application Security Project's (OWASP, www.owasp.org) Application Security Attack Components taxonomy. Seventy percent of the defects were due to security design flaws, as Figure 1 shows. After excluding flaws that had low business impact or were difficult to exploit, nearly half (47 percent) of the remaining serious defects could have been caught—and fixed inexpensively—during the application design stage.

To understand what differentiates one application from another, we used outlier analysis on 23 of the assessments in our survey. For each engagement, we calculated an overall business risk index, based on the sum of the individual business-adjusted risk (BAR) scores (see the sidebar). We ranked engagements by their index scores (highest to lowest) and divided them into quartiles. Engagements with the lowest business risk index formed the first quartile; those with the highest formed the fourth.
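A minimal Python sketch of that ranking-and-quartile step, using invented index scores rather than our engagement data:

# Sketch of the ranking and quartile split described above: order engagements by
# their business risk index and take the top and bottom quarters. Scores are invented.
index_scores = [280.0, 152.5, 96.0, 88.2, 75.0, 62.1, 57.4, 44.3, 39.9, 28.0, 21.0, 14.7]

ranked = sorted(index_scores, reverse=True)           # highest business risk first
quartile_size = len(ranked) // 4
fourth_quartile = ranked[:quartile_size]              # highest risk indexes
first_quartile = ranked[-quartile_size:]              # lowest risk indexes

print("fourth (highest-risk) quartile:", fourth_quartile)
print("first (lowest-risk) quartile:", first_quartile)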

As Figure 2 shows, the most secure applications in our analysis contained, on average, one-quarter of the defects found in the least secure. The top performers' reduced defect rates also translated into much lower risk scores. The least secure applications carried, on a business-adjusted risk basis, nearly six times more risk than the most secure, as Figure 3 illustrates.


Figure 1. Common application security defects: (a) security defects by category and (b) top 10 security defects.

(a) Security defects by category:

Category                        Engagements where observed   Design related   Serious design flaws*
Administrative interfaces       31%                          57%              36%
Authentication/access control   62%                          89%              64%
Configuration management        42%                          41%              16%
Cryptographic algorithms        33%                          93%              61%
Information gathering           47%                          51%              20%
Input validation                71%                          50%              32%
Parameter manipulation          33%                          81%              73%
Sensitive data handling         33%                          70%              41%
Session management              40%                          94%              79%
Total (45 engagements)                                       70%              47%

*Scores of 3 or higher for exploit risk and business impact

(b) Top 10 application security defects, by share of assessments where encountered (20–31 percent): session replay/hijacking, password controls, buffer overflows, file/application enumeration, weak encryption, password sniffing, cookie manipulation, administrative channels, log storage/retrieval issues, and error codes.

Figure 2. Average defects per engagement for first and fourth quartiles by risk category.


Time-series analysis

We used data from the cross-sectional analysis as a baseline for a time-series analysis of application security. We added application assessments from 2002–2003 to this data set and plotted trends in application security for the software and financial services industries. Because our client history is heavily biased toward these industries (perhaps indicating that they are leaders, or at least early adopters, of application security), we tracked them over time to see how the number of vulnerabilities found per application has changed over the years.

Generally, as Figure 4a shows, application security improved across all quartiles. The improvement is not uniform, however. Figure 4b shows a significant gap between the number of defects found in the first and fourth quartiles. In 2002, for example, the average number of defects found per application in the fourth quartile was 5.7 times that found in a typical first-quartile application.

If your company is a certified leader in security, these numbers show that counterparty risk, or the risk incurred when transacting with others, is becoming a more significant part of your total risk. If your company is a security laggard, the time has come for you to take security seriously, and the numbers show it.

Return on security investment analysis

In the development process, application security is typically an afterthought, with remediation occurring only after vulnerabilities are discovered during testing or after the program is released to the public. Generally accepted software engineering principles hold that software flaws are less expensive to fix early in the development process. If we could discover and fix security vulnerabilities early in the development cycle, would the downstream savings justify the costs of such preventative practices?

Using vulnerability data from our 2000–2001 application assessments and cost-multiplier results from the software quality assurance literature, we calculated the return on investment of applying secure software engineering practices at different stages of the software development cycle. The return on investment was 12 percent when analysis was performed during testing, 15 percent when it was performed during implementation, and 21 percent when analysis was performed during design.12
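The detailed model appears in reference 12, but the flavor of the arithmetic can be sketched in a few lines of Python using the Table 1 multipliers. The defect count, unit fix cost, and review cost below are invented, so the resulting percentage is illustrative only and is not the figure reported above.

# Sketch of the cost-avoidance arithmetic behind a "fix it earlier" argument,
# using the relative cost multipliers from Table 1. All dollar figures are invented.
cost_multiplier = {"design": 1.0, "implementation": 6.5, "testing": 15.0, "maintenance": 100.0}

defects_caught_early = 10          # hypothetical defects found during a design review
unit_cost_at_design = 500          # hypothetical dollars to fix one defect at design time
review_cost = 60_000               # hypothetical cost of the design-stage security review

# Assume the same defects would otherwise surface during testing.
cost_if_found_in_testing = defects_caught_early * unit_cost_at_design * cost_multiplier["testing"]
cost_if_fixed_at_design = defects_caught_early * unit_cost_at_design * cost_multiplier["design"]

savings = cost_if_found_in_testing - cost_if_fixed_at_design
roi = (savings - review_cost) / review_cost
print(f"avoided cost: ${savings:,.0f}, ROI on the review: {roi:.0%}")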

Although our efforts with the Hoover Project are not definitive, they illustrate the virtues of benchmarking and measurement, and offer several examples of how we can use time-series and cross-sectional analyses to find security risk factors. Imagine what would happen if industry agreed on measurement techniques, pooled data, and created a sample size several orders of magnitude larger.

The future is here

The future is already here... it's just unevenly distributed.
—William Gibson

Water, water, everywhere,
Nor any drop to drink.
—Samuel Taylor Coleridge

The future is here, all right. Everywhere, somebody wants numbers on security, and nearly everyone says we must start collecting them soon.

But we're already drowning in numbers. We just don't have the will to use them. For economic reasons alone, we would benefit from knowing what numbers we have and using them to generate some insight into the security problem. We can test exotic hypotheses later.

Figure 3. Average business-adjusted risk (BAR) index per engagement by risk category.

Yes, the future is here, and it is unevenly distributed. As a first step to evening it out, we should agree on what we can collect now at low cost. In the electronic world, data are cheap to acquire, effortless to retain, and low-cost to aggregate and to disseminate.

A pure estimator is wonderful, not to mention necessary, in the sciences. If you want a pure estimate of a causal process, you must be wary of your measures. For example, do they have bias? Are they consistent estimators? Is your use of them free from the sorts of hidden correlations that prevent high precision and high accuracy? Are you reproducibly clear on methodological collection issues? In business, however, we don't want a pure estimator. Rather, we want a self-correcting, cost-effective control strategy.

For a control strategy, having some measurement of the underlying process is more important than having a parsimonious, unbiased, consistent, or uncorrelated and statistically independent measurement. You can even use the data you already have. When those data are spotty, as they surely are at this time, build a model of the processes you think are at work, calibrate the model with the available data, and use expert judgment for the rest. Improve the model over time, but start now.

If you already have some data, even if they are of poor quality, you can still use them for trend analysis. Even if the numbers are suspect, the trend of the data is likely accurate. Why? Because without pathological conditions, which we set aside in the name of simplicity, an estimator will have the same error in both its "before" and "after" measurements. Thus, the delta between the two will not carry the same error because the error will cancel out. Please, if you are a statistician, let this statement pass unchallenged; we have to start where people are, not where we want them to be.
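A tiny numerical illustration of why a constant measurement error drops out of the before/after delta; the counts and the bias are invented:

# Numerical illustration of the claim above: a measurement with a constant bias
# still yields a trustworthy trend, because the bias cancels in the delta.
true_before, true_after = 120, 90          # what really happened in the two periods (invented)
bias = -25                                 # sensor systematically undercounts by 25 (invented)

measured_before = true_before + bias
measured_after = true_after + bias

print("measured delta:", measured_after - measured_before)   # -30
print("true delta:    ", true_after - true_before)           # -30, the bias cancelled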

And share your data. Understanding and characterizing many security issues will require data sharing. For example, unless a firm shares intrusion detection histories with its peers, it will never know whether it is a target of choice or of chance. Similarly, if the firm does not know the level-of-effort to subvert its systems or how those systems align with those of its peers, penetration tests won't indicate whether the firm is spending the right amount of money. If you don't know how much to spend, you'll either underspend or overprotect. If your industry has no baseline and no one shares, firms are left to choose mediocrity (downside being reputation risk) or profligacy (downside being financial risk). Neither choice looks good when the baseline data become available, as they certainly will.

The world is becoming a more dangerous place, at least for data. Even if you don't want to measure, even if you don't care whether you are overspending or underprotecting, make no mistake that it's getting harder to transfer risk contractually, so you will pay for the risk whether you measure it or not.

The quality spread between the firms providing the best security and those providing the worst is broadening, so your counterparty risk is growing even if your own risk is shrinking. Regulatory divergence between countries is also increasing, so knowing your security strengths and weaknesses has never been more critical.

If you have data and you consider yourself a security professional, then you have an ethical obligation to make something of them. Aggregate them, analyze them, share them, use them to generate hypotheses—just don't hide them.

Measurement for security is essentially inevitable and inevitably essential: the future belongs to the quants.

Figure 4. Although application security generally improved from 2000 through 2002, the relative gap between the best and worst performers widened: (a) improvement in application security and (b) widening gap in the number of security application defects.

References
1. P.L. Bernstein, Against the Gods: The Remarkable Story of Risk, John Wiley & Sons, 1996, p. 1.
2. M. Gerencser and D. Aguirre, "Security Concerns Prominent on CEO Agenda," strategy + business, 12 Feb. 2002, www.strategy-business.com/press/enews/article/?ptag-ps=&art=254087&pg=0.
3. A. Carey, "Worldwide Information Security Services Forecast, 2001–2006," IDC report no. 26899, Apr. 2002.
4. B. Schneier, Secrets & Lies, John Wiley & Sons, 2000.
5. IBM Systems Sciences Inst., Implementing Software Inspections, monograph, IBM, 1981.
6. S. Staniford, V. Paxson, and N. Weaver, "How to Own the Internet in Your Spare Time," Proc. Usenix Security Symp., Usenix Assoc., 2002, pp. 149–167.
7. A.R. Jaquith, "Learning from Wall Street," Secure Business Quarterly, vol. 2, no. 1, 2002, www.sbq.com/sbq/app_security/sbq_app_wall_street.pdf.
8. C. Gliedman, "Managing IT Risk with Portfolio Management Thinking," CIO (Analyst Corner), www.cio.com/analyst/012502_giga.html.
9. W.A. Arbaugh, W.L. Fithen, and J. McHugh, "Windows of Vulnerability," Computer, vol. 33, no. 12, 2000, pp. 52–59.
10. J.S. Quarterman, "Monoculture Considered Harmful," First Monday, vol. 7, no. 2, www.firstmonday.dk/issues/issue7_2/quarterman.
11. D. Verton, "Airline Web Sites Seen As Riddled With Security Holes," Computerworld, 4 Feb. 2002.
12. K. Soo Hoo, A. Sudbury, and A.R. Jaquith, "Tangible ROI Through Secure Software Engineering," Secure Business Quarterly, vol. 1, no. 2, 2001, pp. 8–10, www.sbq.com/sbq/rosi/sbq_rosi_software_engineering.pdf.

Daniel Geer Jr. is chief technology officer of @stake. He speaks and publishes regularly on a range of issues in digital security. He holds an ScD in biostatistics from Harvard University's School of Public Health and a BS in electrical engineering and computer science from the Massachusetts Institute of Technology. Contact him at [email protected].

Kevin Soo Hoo is a senior security architect for @stake. His research interests include information security risk management, security economics, and the economic incentives of critical infrastructure protection. He has a PhD from Stanford University's Department of Management Science and Engineering. Contact him at [email protected].

Andrew Jaquith is a program director for @stake. His research interests include supply chain management and e-commerce strategy. He has a BA in economics and political science from Yale University. Contact him at [email protected].
