
Computers & Security, 9 (1990) 379-389

Secure Systems Design - An Evolving National Strategy*
Peter Winters, Coopers & Lybrand

Many influential computer security experts in government, business, and academia have recently taken a hard look at the U.S. Government's most ambitious and well-financed computer security initiative, the National Computer Security Center's (NCSC) program for "trusted" computer technology,* and they have concluded that changes are needed in the overall national strategy to increase the security of computer systems which process sensitive information. They feel that a disproportionate amount of attention and resources is being paid to ensuring that computers can maintain the confidentiality of sensitive data, when more effort should be applied to addressing the problem of data integrity. As a result, a rather intensive examination of the government's overall computer security strategy has been taking place for the past several years within government, commercial industry, and academia.

*In 1987, the National Institute of Standards and Technology (NIST) was given a broad charter to exercise new computer security leadership. This article examines NIST's agenda in the area of secure systems design. It provides a historical context for the issues on NIST's agenda and commentary on their direction. The author would like to acknowledge the valuable input provided by several NIST officials who agreed to be interviewed in conjunction with the preparation of this article. They are Mr. Lynn McNulty (Associate Director for Computer Security), Dr. Dennis Branstad (Senior Research Fellow), Dr. Stuart Katzke (Chief, Computer Security Division), and Mr. Gene Troy (Manager, Computer Security Assistance Group).

The majority of those involved in this examination are seeking to build upon the foundation created by the NCSC, in an effort to develop more meaningful design standards for secure computing in the unclassified world. The National Institute of Standards and Technology (NIST, formerly the National Bureau of Standards) is eager to exercise a leadership role on this issue. NIST's constituents in the unclassified computing community often have a different set of security priorities than the military and intelligence communities which are the primary targets of the NCSC's trusted technology strategy.

NIST also views the development of secure systems design standards as an issue of international significance. The Canadians, West Germans, and British have recently published their own standards in this area. Now there are plans being discussed for the development of a single European standard under the forum of the European Economic Community (EEC). In both the national and international debate regarding secure systems design standards, elements of cooperation and competition are pervasive.

This article provides a historical perspective of these issues by summarizing the trusted technology strategy and explaining some of the subsequent efforts to focus more attention on integrity-based systems security issues. It then examines the major initiatives that NIST is pursuing in their struggle to exercise leadership in this area. In doing so, the article describes key motivational factors associated with these issues and potential consequences of the NIST initiatives.

The Trusted Technology Strategy

In 1983, the Department of Defense (DOD) Computer Security Center, which has since become the NCSC, published the Trusted Computer System Evaluation Criteria [1]. This standard is more commonly known as the "Orange Book." The Orange Book specifies a hierarchy of criteria or standards for evaluating the security design quality of a computer system. The criteria are arranged in levels of decreasing stringency from A1 to B3, B2, B1, C2, C1, and D. Under the auspices of the NCSC, the Orange Book became the centerpiece of an ambitious DOD-led strategy to improve the quality of the security design of computers used throughout the government to process classified and otherwise sensitive information.

[1] DOD 5200.28-STD, Department of Defense Trusted Computer System Evaluation Criteria (the "Orange Book"), December 1985.
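To make the ordering of these evaluation classes concrete, the short sketch below ranks the divisions named above and checks whether a product's rating satisfies a required minimum. The class names come from the Orange Book; the helper function and the comparison-by-index approach are purely illustrative and are not part of any NCSC tooling.

```python
# Illustrative only: rank the Orange Book evaluation classes from least to
# most stringent and check whether a product's rating meets a required minimum.
ORANGE_BOOK_CLASSES = ["D", "C1", "C2", "B1", "B2", "B3", "A1"]

def meets_minimum(product_class: str, required_class: str) -> bool:
    """Return True if product_class is at least as stringent as required_class."""
    return (ORANGE_BOOK_CLASSES.index(product_class)
            >= ORANGE_BOOK_CLASSES.index(required_class))

print(meets_minimum("B2", "C2"))  # True: a B2 system satisfies a C2 floor
print(meets_minimum("D", "C2"))   # False: an unrated/D system does not
```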

By this strategy, the government intended to gain leverage in influencing vendors to design computer systems that would provide improved security according to the government's definition of what is required to do so. Another objective was to amortize the cost of achieving security gains in systems design. The intention was to build a broad base of market support for trusted technology (i.e., trusted according to the government's definition of security) so that vendors could recoup the significant costs incurred to develop systems to meet various levels of the Orange Book criteria. This requires that trusted technology products be obtainable on a commercial, off-the-shelf basis.

Upon the foundation of the Orange Book, the NCSC undertook several other major initiatives as part of an overall strategy to encourage the widespread availability of trusted computer systems and to promote their use in accordance with specific guidance from the NCSC. They include the following:

(1) The Trusted Product Evaluation Program: This is a program in which vendors submit their systems for evaluation to obtain an NCSC rating against the Orange Book criteria. The program includes a ratings maintenance program (known as RAMP) which is designed to facilitate and control the maintenance of product ratings as upgrades occur. This program is complemented by the publication of the Evaluated Products List (EPL) [2], the official list of products which have been certified by the NCSC as meeting a particular level of the Orange Book criteria.

(2) The "Yellow Book" [3]: This standard was published in 1985 by the NCSC to provide guidance in applying the Orange Book criteria in specific computing environments. It determines risk levels for computer environments by comparing the security clearances and access authorizations of users to the sensitivity of information resident on the system. This comparison determines the "mode of operation". The risk calculation also factors in the degree of confidence which can be placed in the application developers (including maintainers) and the configuration control process, such that there can be an acceptable presumption that malicious logic has not been introduced into the system. This determines whether a computer environment is considered "open" or "closed" as defined by the Yellow Book. Based upon a system's identified mode of operation and type of environment, the Yellow Book specifies the level of the Orange Book criteria (i.e., C2, B1, etc.) appropriate for the system. The Yellow Book can be characterized as a pre-packaged risk assessment based primarily on the need to protect national security information from unauthorized access and/or modification.

[2] Information Systems Security Products and Services Catalogue, January 1990, prepared by the National Security Agency (updated and published quarterly).

[3] CSC-STD-003-85, Computer Security Requirements - Guidance for Applying the Department of Defense Trusted Computer System Evaluation Criteria in Specific Environments, June 25, 1985.
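As a rough illustration of the Yellow Book's approach (and only an illustration: the numeric scale, the mapping, and the function below are invented for this sketch and are not the Yellow Book's actual tables), one can think of the determination as comparing the least-cleared user against the most sensitive data on the system and mapping the resulting gap to a minimum Orange Book class.

```python
# Illustrative only: a Yellow Book-style determination reduced to a toy
# function. The numeric scale and class mapping are invented; the real
# Yellow Book tables also weigh open vs. closed environments, categories,
# and other factors described above.
SENSITIVITY = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def minimum_class(min_user_clearance: str, max_data_sensitivity: str) -> str:
    gap = SENSITIVITY[max_data_sensitivity] - SENSITIVITY[min_user_clearance]
    if gap <= 0:
        return "C2"   # every user is cleared for all data on the system
    # A wider clearance/sensitivity gap calls for a higher-assurance class.
    return ["B1", "B2", "B3", "A1"][min(gap - 1, 3)]

print(minimum_class("Secret", "Top Secret"))        # B1 (toy mapping)
print(minimum_class("Unclassified", "Top Secret"))  # B3 (toy mapping)
```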

(3) The "C2 by 1992" policy: This policy, issued on July 15, 1987, requires all federal government systems processing classified or unclassified, sensitive information to meet the C2 criteria of the Orange Book by July 15, 1992. Formally named the National Telecommunications and Information Systems Security Policy No. 200, or simply NTISSP No. 200, it was issued by the NTISSC (National Telecommunications and Information Systems Security Committee), a federal government interagency committee chaired by NSA.

(4) The "Rainbow Series": Following the Orange Book and the Yellow Book, the NCSC has issued a series of publications which are collectively known as the "rainbow series" (so called because of the various colors of their covers). Some rainbow series publications are designed to interpret or extend Orange Book concepts for system environments that are not directly addressed by the Orange Book criteria, including networks [4], databases [5], and subsystem security [6] approaches. Others are guidelines that assist system designers in understanding particular Orange Book concepts and describe techniques for meeting the criteria.

By some measures, the NCSC strategy has had remarkable success. Most of the major manufacturers of computer hardware and operating systems are working with the NCSC in some capacity to upgrade or redesign system products to receive a trust rating by the NCSC at a particular level of the Orange Book criteria. The vendors have invested significant resources in their efforts to obtain an NCSC rating. The EPL is being populated with a sufficient number of products to ensure that federal agencies can meet the "C2 by 1992" policy on the basis of a competitive selection process. At the higher levels of trust, some major vendors are completely redesigning their operating systems in order to meet the requirements of the Orange Book.

There are also signs that the strategy is showing increasing acceptance among the end-user organizations who should stand to benefit by it. However, this particular yardstick yields a more tenuous impression of success than the vendor activity to date to meet the standard. Some U.S. government agencies are now writing requirements for the use of trusted products into their own agency policies. The "C2 by 1992" policy has been written into DOD policy (DOD 5200.28, Security Requirements for Automatic Data Processing (ADP) Systems), Intelligence Community policy (Director of Central Intelligence Directive (DCID) 1/16, Security Policy on Intelligence Information in Automated Information Systems (AISs) and Networks), and Treasury Department policy (Directive 8502). Department of Energy policy (DOE Order 5637.1, Classified Computer Security Program) describes requirements in terms of minimum Orange Book levels as well.

[4] NCSC-TG-005, Version 1, Trusted Network Interpretation of the Trusted Computer System Evaluation Criteria, July 31, 1987.

[5] NCSC-TG-021, Version 1, Draft Trusted Database Management System Interpretation of the Trusted Computer System Evaluation Criteria, October 25, 1989.

[6] NCSC-TG-009, Version 1, Computer Security Subsystem Interpretation of the Trusted Computer System Evaluation Criteria, September 16, 1988.

Some government computer and telecommunications procurement specifications are being issued with Orange Book trust requirements. For example, the Department of State's recent agency-wide procurement effort for ADP equipment specifies computer security requirements in terms of minimum Orange Book levels. The Department of State Telecommunications Network (DOSTN), a new, worldwide telecommunications and computer networking project, will incorporate end-to-end network encryption technology that is being designed to meet the B2 level of trust. However, the specification of Orange Book trust levels in the procurement specifications of civilian agencies is still more the exception than the rule.

The Focus on Integrity as a Distinct Issue

The NCSC's strategy has been necessarily high profile. It attempts to motivate large government bureaucracies, major computer manufacturers, and private sector customers of these manufacturers to accept the government's definition of the problem and its strategy for resolution. However, the growing impact of the trusted technology strategy has provided the impetus for a critical review of the Orange Book as the centerpiece of a national strategy for secure systems design.

One of the most noteworthy critiques to emerge was the Clark/Wilson paper [7], presented to the IEEE Technical Committee on Security and Privacy in April, 1987. This paper questioned the applicability of the Orange Book to one of the most important computer security issues facing the private sector: information integrity. In particular, Clark and Wilson focused on the objective of the mandatory control requirements (at the B levels and above) of the Orange Book to enforce confidentiality policies and contrasted them to integrity-based objectives.

[7] Clark, David D., and Wilson, David R., A Comparison of Commercial and Military Computer Security Policies, April, 1987.

The mandatory control objectives of the Orange Book are rooted in the desire to enforce the national security classification and clearance system for access to classified information on computer systems. They are expressed in the Bell and LaPadula model [8], which mathematically defines the relationship between user clearance levels and data classification. The model defines data access permission by specifying the user's ability to read or write data, depending on the results of the comparison between data sensitivity and user clearance level. Mandatory controls are so named because they enforce a security policy which is mandatory (i.e., not at the discretion of an administrator or user).

[8] Bell, D. E. and LaPadula, L. J., Secure Computer Systems: Unified Exposition and Multics Interpretation, MTR-2997 Rev. 1, MITRE Corporation, Bedford, MA, March 1976.
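The read/write rules of the Bell and LaPadula model can be sketched in a few lines. The version below is a simplified rendering that ignores categories and compartments and uses only hierarchical levels; the function names and the numeric encoding are illustrative.

```python
# Minimal sketch of the Bell-LaPadula mandatory access rules, ignoring
# compartments/categories: "no read up" (simple security property) and
# "no write down" (*-property).
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def can_read(subject_clearance: str, object_classification: str) -> bool:
    # A subject may read an object only if its clearance dominates the
    # object's classification (no read up).
    return LEVELS[subject_clearance] >= LEVELS[object_classification]

def can_write(subject_clearance: str, object_classification: str) -> bool:
    # A subject may write an object only if the object's classification
    # dominates the subject's clearance (no write down).
    return LEVELS[object_classification] >= LEVELS[subject_clearance]

print(can_read("Secret", "Confidential"))   # True:  reading down is allowed
print(can_write("Secret", "Confidential"))  # False: writing down is blocked
```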

Integrity-based objectives are more oriented toward preventing fraud, errors, and the general unauthorized modification of data. Clark and Wilson argue that several important integrity objectives cannot be met by the mechanisms specified in the Orange Book. In particular, they assert that integrity-based requirements for ensuring a well-formed transaction and a segregation of duties among system users are not adequately supported by the Orange Book controls. Whereas the Bell and LaPadula model leads to an emphasis on controlling who (or what) can read or write data, the Clark/Wilson model focuses on who (or what) can execute particular programs to manipulate data and on how to ensure that the programs manipulate the data properly.
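The contrast can be made concrete with a short sketch of the Clark/Wilson idea of access triples: users may touch constrained data only by executing certified, well-formed transformation procedures, and the triples can be assigned so that duties are segregated. The user names, procedure names, and data labels below are invented for illustration.

```python
# Sketch of the Clark/Wilson emphasis on who may *execute* which programs
# against which data: access is granted only through certified transformation
# procedures (TPs) listed in explicit (user, TP, data item) triples.
CERTIFIED_TPS = {"post_payment", "approve_payment"}      # well-formed transactions
ACCESS_TRIPLES = {
    ("clerk",      "post_payment",    "accounts_payable"),
    ("supervisor", "approve_payment", "accounts_payable"),
}  # segregation of duties: no single user holds both TPs for the same data

def may_execute(user: str, tp: str, data_item: str) -> bool:
    return tp in CERTIFIED_TPS and (user, tp, data_item) in ACCESS_TRIPLES

print(may_execute("clerk", "post_payment", "accounts_payable"))     # True
print(may_execute("clerk", "approve_payment", "accounts_payable"))  # False
```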

The Clark/Wilson paper stimulated a flurry of activity and analysis on the question of information integrity. One of the most concerted efforts undertaken in the wake of the paper was the organization of two successive data integrity workshops: the Invitational Workshop on Integrity Policy in Computer Information Systems (WIPCIS, October 27-29, 1987) and the Workshop on Data Integrity (January 25-27, 1989). These workshops were sponsored by NIST, the NCSC, the IEEE, and the Special Interest Group on Security, Audit, and Control (SIGSAC) of the Association for Computing Machinery. They drew leading computer security experts from the government, industry, and academia to debate the matter of information integrity and to try to achieve some consensus on approaches that could be taken to address the problem.

The Clark/Wilson paper and the integrity workshops did not break any new ground by identifying information integrity as an important problem associated with computer systems. It has long been recognized as such. For example, NIST's Dr. Dennis Branstad points out that U.S. banks recognized significant integrity exposures associated with the use of electronic funds transfers (EFTs) over computer networks in the 1970s. The resulting demand for protection helped lead to the development and utilization of the popular commercial grade encryption algorithm, DES (Data Encryption Standard).
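The integrity protection the banks were after is essentially message authentication: a keyed checksum that lets the receiving institution detect any modification of a transfer in transit. The sketch below illustrates the idea with HMAC-SHA256 from the Python standard library as a modern stand-in for a DES-based MAC; the key and message are, of course, invented.

```python
# Illustrative only: the integrity idea behind authenticating EFT traffic,
# shown with HMAC-SHA256 as a stand-in for a DES-based MAC.
import hashlib
import hmac

SHARED_KEY = b"bank-to-bank shared secret"   # illustrative key

def authenticate(message: bytes) -> bytes:
    """Compute a MAC that the receiving bank can verify."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, mac: bytes) -> bool:
    """Detect any modification of the transfer in transit."""
    return hmac.compare_digest(authenticate(message), mac)

transfer = b"PAY $1,000,000 TO ACCOUNT 12345"
mac = authenticate(transfer)
print(verify(transfer, mac))                              # True
print(verify(b"PAY $1,000,000 TO ACCOUNT 99999", mac))    # False: tampering detected
```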

However, the Clark/Wilson paper and the integrity workshops have been important because they have generated momentum within government, industry, and academia to address data integrity issues in a concerted, coordinated manner. In particular, the integrity workshops are reminiscent of the approach used by the authors of the Orange Book, who also sought the advice and input of an array of experts in the field before drafting the standard. While no consensus has been achieved on an integrity model which could serve as a counterpart to the Orange Book, the efforts to date have served to sharpen the focus of influential organizations (both in government and in industry) on the perceived deficiencies of the Orange Book and on the current trusted technology strategy being pursued by the government and vendor community.

As a result, just as the trusted technology strategy has begun to show the results of several years' worth of commitment on the part of the government and vendors, some of the key participants in the process (government agencies, vendors, and interested commercial firms) are questioning whether the security enhancements produced by this strategy are really useful for addressing their most prevalent computer security concerns.

Changing Leadership

Congress did not need the Clark/Wilson paper or the integrity workshops to draw its attention to the question of national computer security leadership. Members of Congress have been concerned about the implications of having the NCSC, as part of an intelligence organization under military management, act as the lead authority for all computer security matters in the government. Their concern was heightened by several initiatives during the mid-1980s in which the military and intelligence side of the government presumed to "advise" the civilian side of government and the private sector regarding information security. Of particular note were the National Security Decision Directive (NSDD) #145 and the NTISSP No. 2 (which has since been revoked) policies. NSDD #145 was specifically directed at the protection of unclassified, sensitive information. NTISSP No. 2 was promulgated by the National Security Council under Admiral John Poindexter. It attempted to establish unclassified, but sensitive information as a distinct category or classification of information that would require security protection. The prospect of assigning an umbrella classification to unclassified, but sensitive information looked to many, including Congress, like an attempt by the government to unduly restrict the availability of government information to the public.

In 1987, Congress passed the Computer Security Act (signed by the President as Public Law 100-235). This law, among other things, transferred the responsibility for computer security leadership from the NCSC to NIST for all except classified systems and Warner Amendment systems (a special, limited category of unclassified systems in the military).

The Act was a clear signal from Congress that it did not want NSA dictating the computer security agenda for the civilian sector of government or acting as "advisor" to the private sector. However, given the lack of resources allocated to NIST to carry out its new mandate, it is apparent that Congress does not yet feel a sense of urgency for NIST to establish major new directions and initiatives. The NIST computer security program budget for fiscal year 1990 is $2.5 million. Their professional computer security staff comprises approximately 45 people. As a point of comparison, the NCSC's fiscal year 1990 budget is between $40 million and $45 million, and they have a staff of approximately 200 people. It does not seem to overstate the matter to conclude that both have similarly significant leadership responsibilities for computer security; yet there is a clear disparity regarding their current resources for exercising such leadership.

Nevertheless, NIST is eager to demonstrate their computer security leadership. They are at the center of the current efforts to re-evaluate the government's trusted technology strategy and to develop new standards that better address integrity and some types of confidentiality issues from the perspective of the unclassified computing community in the civilian sector of government and in the private sector. The remainder of this article focuses on three of the initiatives that NIST is pursuing in this context.

(1) The Need for a “Civil” Orange Book and a “Civil” Yellow Book

NIST views the NCSC-led trusted technology strategy as unbalanced; it emphasizes the confidentiality needs of the military and the intelligence community to the exclusion of some important confidentiality and integrity concerns facing many civilian government agencies and the private sector. NIST intends to build upon the enthusiastic interest generated by the Clark/Wilson paper and the subsequent integrity workshops. One idea being pursued is the development of a "civil" Orange Book. Dr. Willis Ware, named 1989's "Computer Security Man of the Year" at the National Computer Security Conference in Baltimore, challenged both NIST and the NCSC to develop a civil Orange Book during his acceptance speech at the conference. He was referring to the need for an alternative standard to the Orange Book which would concentrate more on addressing information integrity concerns and confidentiality issues in a manner more relevant to the concerns of the civilian government and private sector. NIST is very interested in responding to this challenge. It is also interested in the idea of developing a civil Yellow Book. Such a standard would prescribe levels of integrity-based controls and assurance characteristics for particular risk environments common to the private sector and to civil government agencies.

The terms civil Orange Book and civil Yellow Book may convey the idea that a security design standard and a related implementation program are planned to be similar to the Orange Book and the NCSC program. However, NIST's officials envision a substantially different type of standard and implementation program. While plans and ideas for a civil Orange Book and a civil Yellow Book are still being formulated, some common themes are already evident from discussions with NIST officials. Clearly, there is the intention to emphasize the integrity concerns that NIST feels are not addressed by the current trusted technology strategy. In addition, two other themes are discernible.

(1) The Orange Book already provides a very useful framework for addressing confidentiality issues and some types of integrity issues in the systems design process; however, its treatment of the confidentiality problem is not complete, given the variety of confidentiality policies important in many civil government and private sector organizations.

(2) An integrity-based standard will be accompanied by implementation mechanisms significantly different from the NCSC Trusted Product Evaluation Program.

Regarding the first theme, Dr. Branstad points out that Bell and LaPadula, while appropriate for military classification policies and some commercial applications, may not be the most appropriate model for the target audience of a civil Orange Book. The Bell and LaPadula model assumes a hierarchy of labeled categories (e.g., unclassified, confidential, secret, and top secret), and it implies that it is more important to protect categories of information at the top of the hierarchy. Dr. Branstad indicates that many times in the unclassified government world and in the private sector it may not be possible to arrange information protection categories in such a strict hierarchy as the basis for enforcing mandatory read and write controls.
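The distinction can be sketched briefly: hierarchical labels can be compared as a single ordered value, while independent protection categories (for example, personnel, procurement, legal) have no natural ranking and are better treated as sets that must all be held by the accessor. The category names and functions below are illustrative.

```python
# Sketch of the contrast: a strict hierarchy reduces to comparing one number,
# whereas unordered protection categories require set containment.
def hierarchical_ok(user_level: int, data_level: int) -> bool:
    return user_level >= data_level            # total order: one number suffices

def categorical_ok(user_categories: set, data_categories: set) -> bool:
    return data_categories <= user_categories  # no ranking: need every category

print(hierarchical_ok(3, 2))                                  # True
print(categorical_ok({"personnel"}, {"personnel", "legal"}))  # False
```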

Yet, in talking with NIST officials, it seems that their concerns about the applicability of the Orange Book to the confidentiality issues of the non-national security community have more to do with the way in which the NCSC has interpreted which Orange Book criteria levels are appropriate for particular computing environments (via the Yellow Book). They see in the Yellow Book a decision-making criterion that considers national security information as the only important factor in determining risk and specifying increasing levels of trust. This reduces the appeal of the Orange Book to many outside of the national security community because it creates the perception that the Orange Book is not designed to address their problems. In fact, there is sentiment at NIST to proceed with the development of a civil Yellow Book as a higher priority than the civil Orange Book.

Regarding the second theme (the conclusion that an integrity-based standard will be accompanied by a much different implementation mechanism than the NCSC Trusted Product Evaluation Program), there are several supporting explanations.

First, the NCSC evaluation program is sharply criticized by many in government and business for the time it takes to receive an EPL rating and for the lack of specific, stable criteria used to evaluate vendor products. Much of this criticism stems from the Orange Book's assurance requirements, as opposed to its specification of security features. Assurance is the measure of confidence that a system's security features perform as intended and cannot be bypassed to subvert the security policy goals. At the upper levels of the criteria (B2, B3, and A1) the Orange Book concentrates primarily on specifying stringent requirements for achieving greater assurance, as opposed to the addition of new security features.

An EPL rating is dependent on the vendor's ability to demonstrate to the NCSC that they have provided the required security features with the level of assurance that the NCSC requires. NIST's Dr. Stuart Katzke says that vendors are concerned that the NCSC's assurance requirements have proven to be somewhat subjective, dynamic, and time-consuming in their impact on the evaluation process. The process of achieving an EPL rating currently requires at least two to three years. Both vendors and users are concerned about the impact that this extended evaluation process has on their ability to market and utilize current technology.

Another reason the NCSC evaluation approach is not considered feasible for a civil Orange Book/Yellow Book program is that an integrity-based computer security design standard would focus on the application environment to a far greater degree than the Orange Book. The Orange Book approach is primarily oriented towards operating systems. But integrity issues, such as the risk that program modules will process data in an incorrect or unauthorized manner, do not seem to lend themselves to solutions that are implemented solely within the operating system or a "security kernel". The certification of myriad applications (including relevant interfaces to the operating system) would seem to be a much greater challenge than the current NCSC endeavor to evaluate vendor operating systems and subsystem security products.

NIST does not envision an elaborate program to certify vendor products independently against an integrity standard. Based upon their experience with maintaining the DES standard, they foresee the development of concise integrity requirements and evaluation standards which vendors could use for self-evaluation, thereby anticipating with confidence the formal certification results. To be sure, it is unclear how NIST will factor the Orange Book assurance concepts into this type of approach. But they view the current NCSC evaluation process as an unworkable approach for achieving vendor cooperation in designing systems to meet new integrity standards.

In contrast to the enthusiasm of NIST and others to proceed with a coordinated effort to develop a civil Orange Book, some experts are skeptical that such a standard can be developed to address integrity matters before a consensus is reached on the definition of information integrity. Others would like to move forward, developing prototypes of systems that meet specified integrity models such as the Clark/Wilson model or other variations, even though enhancements or changes to these models may be required.

These criticisms do not imply mutually exclusive approaches. Overall, the NIST initiative to lead the development of civil Orange Book and civil Yellow Book standards seems to be a worthy endeavor. Furthermore, NIST appears to be uniquely positioned to lead a national strategy that builds on the foundations of the Orange Book, the Clark/Wilson model, the integrity workshops, and a variety of related work on this subject.

(2) Harmonizing U.S. and International Efforts to Develop Computer Security Design Standards

NIST has tracked the recent efforts in the international arena to develop trusted systems criteria with great interest and with some evident degree of concern. They cite the published standards of Canada, West Germany, and Great Britain as examples of international activity in this area. Of even more significance may be the emerging interest of the EEC to address the development of trusted systems criteria as a unified European concern. The EEC is now planning to host a conference in late September of this year to address the issue of harmonizing the various trusted systems criteria. One potential outcome of this process is the development of a unified EEC standard. NIST is very much attuned to the international economic and trade issues associated with these initiatives. NIST's Lynn McNulty says that these efforts reflect not only a desire on the part of other countries to emulate U.S. leadership in this area, but also a sense among these countries that the development of standards is an issue of economic competition. This is true with respect to imports and exports.

On the import side, McNulty explains that the NCSC's Trusted Product Evaluation Program is perceived in some countries as an attempt to keep their computer products out of the U.S. market on the pretext of U.S. national security concerns. This is because the NCSC does not accept products made by non-U.S. manufacturers in their evaluation program. As an example, a British vendor, Honeywell Bull, requested an evaluation of their operating system, but was turned down by the NCSC because they are a non-U.S. vendor. With respect to the U.S. market, where the NCSC's product evaluation has the potential to become an important criterion for winning certain types of contracts, the inability of foreign vendors to obtain an NCSC rating looks to them very much like a trade barrier. This perception could spark retaliatory actions in markets important to U.S. computer vendors.

U.S. vendors are frustrated over export restrictions on products developed to meet U.S. government standards. Major computer manufacturers, such as IBM and DEC, have expressed concern that overseas markets are their fastest growing markets, and they are reluctant to invest significant resources to meet U.S. standards only to see those products automatically excluded from profitable overseas markets. With regard to trusted systems, McNulty indicated that some vendors are concerned about the implications of NSA's interpretations of the law which would prevent exporting B3 technology.

NIST officials believe that it would be wise for the U.S. to harmonize our trusted systems and integrity-based initiatives with the on-going efforts in other countries. In particular, they believe that the U.S. has a very important stake in the direction that the EEC takes with respect to their current initiative to produce a single standard. There is an opportunity for the U.S. to benefit from both an economic and a security standpoint by exercising a lead role in establishing international design standards for secure computing.

From an economic standpoint, U.S. leadership and influence in an international effort to harmonize standards could enhance U.S. opportunities based on our extensive experience with the trusted systems initiative and our efforts to define and address the integrity problem. From a security standpoint, it seems highly preferable that our own national government, business, and academic institutions continue to have a forum for influencing the security design quality of systems manufactured by major U.S. computer vendors.

Alternatively, we may face a situation in which either our definition of the security design problem is addressed by the vendors or someone else's definition is used, but not both. At least that seems to be the message from some of the major vendors. DEC's Secure Product Manager, Mr. Steve Lipner, said as much when he stated at the March 13-14, 1990 meeting of the National Computer System Security and Privacy Advisory Board that "the Europeans are getting on with it..." (referring to the process of developing integrity standards for secure systems design) and that U.S. vendors "...would not wait forever for a U.S. standard", given the importance of the European market to the U.S. computer manufacturers.

NIST is concerned about the U.S. ability to successfully sustain any national computer security design strategy that is perceived to represent an economic protectionist strategy aimed against foreign computer vendors. This would have the potential to result in a backlash against leading U.S. vendors, whose cooperation is vital to any U.S. program aimed at improving systems security design. With respect to the international implications of this issue, NIST is again in a position to exercise important leadership.

(3) Revising the "C2 by 1992" Requirement

As of April, 1990, NIST is in the final stages of drafting a set of recommendations that will advise federal agencies that they are no longer bound by the "C2 by 1992" requirement with respect to unclassified systems. NIST is taking this action by virtue of their authority under the Computer Security Act of 1987. Instead of the mandatory "C2 by 1992" requirement, NIST will recommend that agencies determine appropriate computer security requirements on the basis of their own risk analyses. They will encourage the use of trusted technology where it is appropriate to address the identified risks. However, NIST believes that not all systems, a priori, require C2 protection.

While NIST's logic is understandable from an academic standpoint, it seems to have the potential to significantly weaken one of the key programmatic features of the NCSC strategy: ensuring that trusted products are available on a commercial off-the-shelf basis. If vendors perceive that the government is not serious about the Orange Book criteria, they may not be as interested in investing their resources to meet the standard. Even prior to NIST's re-evaluation of the C2 requirement, vendors were already becoming skeptical of the government's commitment to its own standard owing to the absence of trusted systems requirements in many government computer procurement specifications. This would be of no concern to NIST's constituents if C2 controls were irrelevant to the security problems facing the wide range of government unclassified systems, but this is not the case.

Even for systems in which integrity is clearly a paramount concern, it is hard to conceive that the majority of the C2 controls would not be desirable and/or necessary as enforcement mechanisms for these systems' security policies. C2 discretionary access, audit, identification and authentication, and assurance controls are clearly useful as part of any solution to the integrity problem. To the extent that many government agencies face similar types of security exposures that are addressed by the C2 criteria, the "C2 by 1992" policy could be viewed as the logical response to a broad-brush risk analysis of the threats and vulnerabilities common to most systems that process unclassified, but sensitive information.
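As a sketch of how two of these C2-style mechanisms fit together (discretionary access control and an audit trail of access decisions), consider the following; the object names, permissions, and log format are invented for illustration, and identification and authentication are assumed to have already occurred.

```python
# Minimal sketch of discretionary access control plus an audit trail.
# The ACL contents and log format are illustrative.
from datetime import datetime, timezone

ACL = {  # object -> {user: permitted actions}, set at the owner's discretion
    "payroll.dat": {"alice": {"read", "write"}, "bob": {"read"}},
}
AUDIT_LOG = []

def access(user: str, obj: str, action: str) -> bool:
    """Check the discretionary permissions and record the decision."""
    allowed = action in ACL.get(obj, {}).get(user, set())
    AUDIT_LOG.append((datetime.now(timezone.utc).isoformat(), user, obj, action,
                      "GRANTED" if allowed else "DENIED"))
    return allowed

access("bob", "payroll.dat", "write")    # denied and recorded
access("alice", "payroll.dat", "read")   # granted and recorded
for entry in AUDIT_LOG:
    print(entry)
```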

In asserting its leadership, NIST should be careful not to substitute weak guidance for firm policies which establish useful, clear, and attainable objectives for improving systems security. There is an inherent tension in most agencies between management's desire to maintain or improve computer operations and management's responsibility to appropriately address computer security exposures. The responsible computer security components in most U.S. civilian agencies find it difficult to gain senior management's support when conflicts between security and operations arise. They have a relatively small amount of organizational power and respect as compared with the information systems components. This makes it difficult to accomplish major computer security initiatives, such as influencing the security quality of vendor operating systems, simply on the basis of an internal agency risk analysis.

It is interesting that NIST stresses only the security exposures which are not addressed by the C2 criteria (i.e., integrity and availability concerns) in its initiative to modify the "C2 by 1992" requirement. NIST does not explicitly judge the C2 criteria to be inappropriate in any specific circumstance for systems that formerly were required to meet the NTISSP No. 200 policy. I argue that the C2 criteria represent a valuable and reasonable minimum standard, appropriate for most systems processing sensitive information. As a mandatory policy, the "C2 by 1992" requirement has generated sufficient momentum among vendors to build C2 systems such that many government agencies could reasonably be expected to comply with the policy by 1992. Furthermore, there is little evidence that the agencies are complaining about the policy.

NIST's position on the "C2 by 1992" requirement reflects its traditional leadership approach in computer security. The establishment of mandatory policies is contrary to NIST's normal procedure; most of its computer security-related standards have been published as guidance. NIST's position on the "C2 by 1992" requirement may also reflect the pressure it has felt, since the passage of the Computer Security Act of 1987, to exert leadership authority on issues that are clearly within its domain.

Whatever the various motivational factors may be, the reality is that guidance, such as NIST's risk analysis advice, often has minimal impact on agency decision-makers. Whatever the general merits of such guidance, NIST has not clearly articulated the problem with the "C2 by 1992" requirement such that it justifies the reconstitution of the policy from a mandatory directive into the envisioned guidance.

Despite its action to change this major NCSC-initiated policy, NIST computer security officials insist that cooperation, not conflict, is their goal with respect to establishing their leadership role and working with the NCSC. They stress that such cooperation is vital if the U.S. is to maintain its leadership role in establishing computer security design standards and is to be capable of building on the record of accomplishment of the NCSC and the trusted systems strategy.

I suggest that instead of striking the requirement, it would be more useful for NIST to focus on initiatives that would help government agencies and others to better utilize C2 controls and other necessary controls not addressed by the C2 criteria. As a first step, NIST should solicit direct input from end-user organizations who have C2 technology to gain insight into how the controls are used. At the same time, they could continue to pursue their initiative to develop more useful integrity-based standards for areas not addressed by the Orange Book criteria.

Conclusions

NIST believes that its constituents have a stake in the outcome of the current national and international debates regarding computer security design standards. Such standards have not in the past, and will not in the future, provide an "off-the-shelf" security solution for protecting sensitive data on computer systems. Systems designed to meet trust criteria may provide no effective security if their features are misused or are not used at all. However, the ability to implement systems security controls depends, in no small part, on the availability of those controls in the first place.

It would be difficult for individual agencies or private sector firms to apply the requisite pressure to persuade vendors to design security into systems according to the unique specifications of each agency or firm. NIST, NCSC, and other organizations involved in the process of defining ways to address secure systems design issues are demonstrating leadership in their initiatives to develop coordinated standards.

While there is a potential for conflict and competition between NIST and NCSC regarding their respective leadership responsibilities, NIST officials were careful to stress that they want to build upon the successes of the NCSC. With the exception of the NIST recommendations regarding the "C2 by 1992" policy, I believe that NIST's constituents should be encouraged by its initiatives in the computer security arena.

Peter J. Winters is an information systems security specialist in the Information Technology Audit Services (ITAS) division of Coopers & Lybrand in Washington, DC. Prior to joining Coopers & Lybrand in January, 1990, Mr. Winters worked for over 5 years as a systems security analyst at the U.S. Department of State. Mr. Winters has a B.A. degree in international affairs from The George Washington University in Washington, DC.

Additional References

DOD Directive 5200.28, Security Requirements for Automatic Data Processing (ADP) Systems, revised April 1978, cancelled and reissued March 21, 1988.

NIST Special Publication 500-160, Report of the Invitational Workshop on Integrity Policy in Computer Information Systems (WIPCIS), Dr. Stuart W. Katzke and Zella G. Ruthberg (eds.), January 1989.

NIST Special Publication 500-168, Report of the Invitational Workshop on Data Integrity, Zella G. Ruthberg and William T. Polk (eds.), September 1989.
