
The complexity of “long haul” data communication security




Computers & Security, 9 (1990) 587-592

The Complexity of “Long Haul” Data Communication Security

Belden Menkus

So long as telecommunication occurred only through wires and was essentially landlocked, the environment in which data communication security had to be addressed remained finite. However, once so-called long haul telecommunication began to occur increasingly in the atmosphere, the vulnerabilities associated with data communication changed significantly. Growth in both the use and the technological complexity of the satellite, microwave, and so-called cellular forms of telecommunication is expected to continue for the foreseeable future. Each of these forms can be described as being over-engineered to avoid signal degradation due to weather or atmospheric disturbances. Each of these telecommunication forms already is being attacked aggressively by both vandals and seekers after various forms of personal and organizational intelligence. (Directories of so-called private satellite and microwave transmission frequencies, for example, long have circulated internationally.)

The weakest segment of satellite data transmission in terms of security is the air-to-ground links at both ends of the process. It is particularly vulnerable to compromise at the destination end of the transmission process. Typically, the satellite’s so-called footprint, that is, the geographic area covered by the signal transmitted back to earth, may cover an area extending anywhere from 600 000 to 800 000 square kilometers. (The satellite signal must travel about 34 000 kilometers to reach the earth. Unfortunately, at that distance the signal aiming process is not precise. The expansion of the signal path from the ideal trajectory allows for any possible error in aiming it.) Anyone located within the area covered by the footprint who possesses a receiver tuned to the proper transmission frequency potentially can intercept the message without the knowledge of the intended recipient.
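The scale of that exposure follows directly from beam geometry. The sketch below treats the footprint as an idealized circular spot and uses the 34 000 kilometer signal distance cited above; the beamwidth values are hypothetical inputs, chosen only to show that a spread of well under two degrees already covers footprints of the size quoted.

```python
# Illustrative footprint geometry only; the beamwidths are hypothetical
# inputs, not measured satellite parameters.
import math

SLANT_RANGE_KM = 34_000  # signal travel distance cited in the text

def footprint_area_km2(beamwidth_deg: float) -> float:
    """Area of an idealized circular ground spot for a conical beam."""
    radius_km = SLANT_RANGE_KM * math.tan(math.radians(beamwidth_deg) / 2)
    return math.pi * radius_km ** 2

for bw in (1.4, 1.7):  # hypothetical beamwidths, in degrees
    print(f"{bw} deg beam -> {footprint_area_km2(bw):,.0f} km^2")
# 1.4 deg -> ~542,000 km^2 and 1.7 deg -> ~799,000 km^2, bracketing
# the 600 000 to 800 000 square kilometer range quoted above.
```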

A comparable vulnerability exists (on a much smaller scale, of course) with microwave transmission. Here, too, the signal path has been expanded from the ideal trajectory to allow for any possible error in aiming it. This means that a microwave signal is susceptible to potential interception up to 20 kilometers on either side of the signal path at any point along the 32 kilometers that normally separates microwave transmission towers.
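The figures just quoted imply a sizable exposure zone for every relay hop. A back-of-the-envelope sketch, treating the interception zone as a simple rectangle (the real spread is conical, so this is only a rough illustration):

```python
# Rough interception corridor for one microwave hop, using only the
# figures quoted above and a rectangular simplification.
HOP_LENGTH_KM = 32  # typical spacing between relay towers
SPREAD_KM = 20      # potential interception reach on either side

corridor_km2 = HOP_LENGTH_KM * (2 * SPREAD_KM)
print(f"exposed area per hop: about {corridor_km2} square kilometers")
# about 1280 square kilometers for every 32 kilometer hop
```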




So-called cellular telephony generally is susceptible to compromise in its current form. It was developed as an alternative to so-called mobile telephony, and its developers appear to have assumed that no particular provision needed to be made for protecting the content of the voice communication that it was expected to be used for. The primary use of cellular telephony was expected to be between various motor vehicles while they were in motion. The cellular telephone already is being used for data communication from so-called laptop microcomputers and similar portable data handling devices. So-called portable wide area networks have been developed that use cellular telephony for data communication. All of the data being communicated in this environment are susceptible to potential interception by anyone who can receive on the frequencies used by the transmitters in each cell to receive and relay signals from cellular telephony users.

The data communication security vulnerabilities associated with satellite, microwave, and cellular telephony data transmission are intensified by the introduction of mechanisms for linking the various forms of so-called local area and wide area networks.

Other Considerations

The explosive growth of data communication use has placed strains on the evolving telecommunications environment and its technological underpinning that appear to exceed the ability of the legislators, regulators, service providers, and technologists to contend with. Dealing with the economic momentum behind this growth appears to have precluded any serious concern by these interests with the need to enhance the security of this environment. And, numerous computing hardware and software products and data communication service offerings have been developed that did not reflect the realities of either the telecommunication environment or the way in which the technology being employed in the new form of data processing is interacting with and altering that environment.

This situation has led to the emergence of significant issues in data communications security that have yet to be attacked successfully. And, both their impact and the need to resolve them can be expected to become more urgent as such things as international securities and commodities trading networks evolve, and such developments as electronic data interchange agreements proliferate.

The first issue is that it is not possible to certify in any conventional sense the working reliability of the software commonly used in maintaining and managing data communications activities. That this software may be less than reasonably reliable undermines the integrity of both the data communication process and the data that security measures might be invoked to protect. The lack of reliability in this software may be seen in the surprisingly common incidence of (a) the misdirection of messages, (b) the failure to deliver messages, and (c) the successful introduction of spurious messages. All of this can occur without the software detecting the failure, warning those responsible for the management of the data communication process, or undertaking any sort of corrective response. In fact, in some situations it will be essentially impossible to determine from whatever activity records are generated by this software that anything amiss has taken place. Unlike the equipment used in data communications, this software remains an unknown quantity to those who must use it, or monitor that use. It seems to be reasonable to expect that minimum performance standards could be developed for this software. Such standards do not exist at present [1].
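A minimal sketch of the kind of end-to-end check whose absence the paragraph above describes: a sequence number, a declared destination, and a keyed digest let a receiver detect all three failure classes independently of the network software. The wire format, the field names, and the use of a modern keyed hash are illustrative assumptions, not a description of any product of the period.

```python
# Sketch of an end-to-end check for the three failure classes named
# above: misdirected, undelivered, and spurious messages. Illustrative
# only; the wire format and the use of HMAC-SHA256 are assumptions.
import hmac, hashlib, json

KEY = b"shared-secret"  # distributed out of band; see note [8]

def seal(seq: int, dest: str, body: str) -> bytes:
    """Wrap a message with its sequence number, destination, and digest."""
    record = json.dumps({"seq": seq, "dest": dest, "body": body}).encode()
    tag = hmac.new(KEY, record, hashlib.sha256).hexdigest().encode()
    return record + b"|" + tag

def check(raw: bytes, me: str, expected_seq: int) -> str:
    record, _, tag = raw.rpartition(b"|")
    good = hmac.new(KEY, record, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, good):
        return "spurious: forged or altered in transit"
    msg = json.loads(record)
    if msg["dest"] != me:
        return "misdirected: meant for " + msg["dest"]
    if msg["seq"] != expected_seq:
        return "gap: an earlier message was not delivered"
    return "ok: " + msg["body"]

wire = seal(seq=7, dest="NYC", body="settle trade 4711")
print(check(wire, me="NYC", expected_seq=7))  # ok
print(check(wire, me="LON", expected_seq=7))  # misdirected
print(check(wire, me="NYC", expected_seq=5))  # gap detected
```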

The second issue is that the impact of database and transaction content and processing error within a networked distributed processing environment has not been clarified. Here too, failure to resolve this issue undermines the integrity of both the data communication process and the data that are being secured. Almost all of the error significance and propagation measures commonly used in data processing management were developed some 25 years ago in the early batch-oriented mainframe computing environment [2]. They were based, for instance, on the assumption that the universe in which a particular error occurred was finite and could be defined in unambiguous terms. However, new forms of computing tend increasingly to merge both structurally distinct processing activities and the content of divergent databases. This makes such an approach to defining and managing errors unrealistic. For instance, it continues to be extremely difficult to distinguish consistently between telecommunication network, transaction processing, and database-maintenance-induced errors when they appear in the data communications environment. And, the questions of responsibility and liability for any failure to mitigate the possible occurrence of such errors still have to be resolved in a consistent fashion.
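The attribution problem can be made concrete with a toy example. Below, three hypothetical fault modes, one per layer, act on a single posted amount; two of them yield the identical bad value, so the damaged record alone does not say which layer failed. All three fault modes are invented for illustration.

```python
# Toy illustration of the attribution problem: different layers can
# produce the same observable symptom. All fault modes are invented.
def network_error(amount):      # a digit corrupted in transit
    return amount + 100
def transaction_error(amount):  # a posting applied twice
    return amount * 2
def database_error(amount):     # an update lost during maintenance
    return 0

posted = 100
for fault in (network_error, transaction_error, database_error):
    print(f"{fault.__name__}: posted {posted}, stored {fault(posted)}")
# The network and transaction faults both store 200; the symptom alone
# does not identify the layer that caused it.
```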

The third issue is the failure of all of the interests involved to give sufficient attention to defining and resolving the structural and operational inadequacies associated with the use of long haul telecommunications facilities for data communication. The various telecommunication service providers generally have failed to supply the users of these facilities with adequate information about their performance. And, the regulatory and legislative bodies that are involved with telecommunication, both nationally and internationally, consistently have failed to demonstrate either sufficient understanding of the data communications environment or a willingness to deal aggressively with assuring the security of that environment.

The fourth issue is the inadequate consideration that has been given by computing executives and data communication system planners to the value of data encryption and the problems associated with its widespread use. One example of this is the apparent unwillingness of all concerned to press for resolution of the persisting questions about the reliability, viability, and suitability of both the so-called data encryption standard [3] and public key encryption [4]. That unwillingness has been complicated by the continued self-interested interference by the military intelligence agencies of various nations in efforts to improve the general quality of the data encryption processes used in and out of the governmental setting.
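For readers without reference [4] at hand, the arithmetic behind public key encryption is brief enough to sketch. The primes below are deliberately tiny, textbook values; production keys use primes hundreds of digits long.

```python
# Textbook-scale illustration of the public key arithmetic in [4].
# The primes are deliberately tiny; real keys are enormously larger.
p, q = 61, 53            # two secret primes
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent, coprime to phi
d = pow(e, -1, phi)      # private exponent: 2753

message = 65
ciphertext = pow(message, e, n)    # anyone with (n, e) can encrypt
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
print(ciphertext, recovered)       # 2790 65
```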

The Only Feasible Solution

Any attempt to deal with the security of data communications should recognize that these four issues persist in this environment. It should recognize as well both […] the telecommunication process. Otherwise, this attempt is almost guaranteed to fail. The only feasible solution to this situation appears to be what might be considered to be data self-protection. It takes the form of the consistent use of encryption for the transmission of sensitive [5], confidential [6], and what might be termed money value [7] data. (The last data category too often is overlooked in existing efforts to improve the security of data communications.)

Uniform use of encryption in large-scale data communication increases the need for developing and using more powerful algorithms in this process. The currently available commercial data encryption products tend to reflect the state of computing technology in the mid to late 1960s. And, the proponents of most of them gloss over the complexities associated with effective encryption key management [8], as well as the increased computing power available to those who would attack the algorithm and the greater sophistication of their approach to such efforts.

Existing approaches to data encryption tend to exhibit two weaknesses. One weakness is the use of pseudorandom functions that masquerade as genuinely random mechanisms. The sheer length of a pseudorandom data string that must be traversed before it repeats itself no longer affords the degree of protection against determined attacks on encrypted data that it might have provided 20 or 30 years ago. Some way to import the results of a genuine randomizing process into a data encryption process should be developed.
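The masquerade is easy to demonstrate: a deterministic generator’s entire output stream can be replayed by anyone who learns its seed, while draws from an operating system entropy pool cannot be. Python’s random and secrets modules stand in here for the two kinds of source.

```python
# A pseudorandom stream is fully determined by its seed, which is the
# masquerade described above; OS entropy offers no such replay.
import random, secrets

seed = 12345                  # imagine an attacker who recovers this
mine = random.Random(seed)
theirs = random.Random(seed)  # the attacker's perfect replica
assert [mine.random() for _ in range(5)] == \
       [theirs.random() for _ in range(5)]

# Consecutive draws from the OS entropy pool are not reproducible
# by re-running the program.
print(secrets.token_hex(16))
print(secrets.token_hex(16))
```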

The other weakness in these approaches to data encryption is that the digital conversion method is based on the use of a rigidly structured algorithm. Most take the form of a mathematical model consisting of a series of linear equations, whose functions thus are inherently predictable. The use of such a rigid mechanism tends to reinforce whatever pattern already exists in the structure of the data being encrypted. (This is not a problem in the context of the mainframe-to-mainframe transfers that most existing commercial data encryption mechanisms were developed to protect. However, it is particularly a concern in such things as the communication of the relatively brief and highly structured money value data transactions mentioned earlier.) The inherent weakness of structure in the encryption process was established by William Friedman [9] some 70 years ago. It clearly poses a greater potential vulnerability to a concerted cryptographic attack than any lack in either the efficiency of the encryption algorithm itself or the effectiveness of the key management process.
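Friedman’s point can be shown with a deliberately weak cipher. Below, a repeating-key XOR, a stand-in for any rigid linear scheme rather than any actual commercial product, encrypts two money value records that share a layout; the shared structure survives into the ciphertext as a visible repeat.

```python
# Structure leaking through a rigid cipher, per [9]. Repeating-key XOR
# is a stand-in for "a series of linear equations", not a real product.
from itertools import cycle

def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, cycle(key)))

key = b"K3Y!"
# Two highly structured "money value" records with a shared layout.
records = b"PAY 100.00 ACCT 001;PAY 100.00 ACCT 002;"
ct = xor_encrypt(records, key)

half = len(ct) // 2
print(ct[:half].hex())
print(ct[half:].hex())
print("record layout still visible:", ct[:12] == ct[half:half + 12])  # True
```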

What is needed is a way to create an algorithm whose structure and functions will not remain fixed over time. Rather, they need to be self-organizing (or, better, reorganizing) over time.

There is every reason to believe that the emerging manifestations of artificial intelligence can be used to create and maintain such an algorithm. (It clearly is only a matter of time before the artifacts of artificial intelligence will be applied to the process of cryptographic attack.) Most likely, any attempt to apply a so-called expert system to the operation of an encryption algorithm is not desirable, due to its inherent structural rigidity. (The so-called rules that fire an expert system normally are imposed upon the process that it is to be used to carry out.) However, the self-organizing functions of a so-called neural networking microcomputer could be applied effectively to this problem. Unfortunately, the ability of such a device to recognize patterns within large bodies or flows of apparently structurally diverse data also makes it potentially a highly effective tool for cryptanalytic attack.

The introduction of artificial intelligence into the data encryption process offers some hope of resolving the inherent insecurities of data communication. There is no reason to believe that either government entities or the various telecommunication service providers will take the lead in developing means for the routine use of artificial intelligence in data encryption. Thus, it appears that the major users of data communication are going to have to develop these means individually or collectively. For there is every indication that the insecurities of data communication are going to continue to grow exponentially and that their cumulative impact upon the general security of the computing process eventually will become catastrophic.

Notes

[1] The performance standards commonly used in planning and measuring data communications networks are deceptive in at least two respects. First, they refer basically to the performance of the switching and ancillary equipment used with the network, and not to the performance of the network itself. Secondly, these standards and the measures from which they were derived do not reflect the current state of telecommunication technology. They reflect basic assumptions of some 65 years ago about a much less active voice network environment. Their use is quite unrealistic in the intensive mixed data and voice telecommunication environment that is rapidly emerging internationally.

[2] Most of these measures are based on naive assumptions that are easy to demonstrate in the classroom and that it may be reasonable to attempt to apply to the essentially straightforward and stable environment of so-called batch processing of data. However, they consistently are unrealistic to use in the complex and essentially unstable distributed database environment that is becoming increasingly common in computing.

[3] More properly known as the U.S. Government Data Encryption Standard, it has become, unfortunately, an international de facto standard for data encryption. It was established in January 1977 as U.S. Federal Information Processing Standard 46 and reflects encryption concepts that were developed in the late 1960s.

[4] It is known more correctly, from its developers, as the Rivest, Shamir, Adleman Public Key Encryption Algorithm. It initially was described in a jointly authored paper in the February 1978 issue of the Communications of the Association for Computing Machinery. This approach to data encryption also reflects concepts that were developed in the late 1960s.

[5] Sensitive data may be defined as encompassing (a) that used to control the functioning of other systems, such as security subsystems, telecommunications network control systems, and program library management systems; (b) that which relates to overall organization and individual product or service planning; and (c) that which records critical operational, proprietary, and employee, customer, or supplier data.

[6] Confidential data is that which would embarrass the organization, its executives, its other employees, its customers, or its suppliers, or whose inadvertent or unauthorized disclosure might leave any member of these groups open to possible regulatory agency or legal action.

[7] Money value data are those which (a) authorize and record funds disbursement, such as employee payroll, accounts payable, dividend payment, and so-called book entry securities processing; (b) record funds due; and (c) control and record the acquisition, transfer, and distribution of convertible resources, such as inventory and physical assets. Money value data are to be found commonly in such things as electronic funds transfer processes; but they exist as well in such things as the transaction flows in electronic data interchange arrangements, social benefits transactions, and insurance and medical expense benefits data flows.

[8] Almost all key management techniques currently used in data encryption appear to be derived in some fashion from the cryptographic key distribution procedures first used more than a century ago, when the armies of various countries began to make significant use of the telegraph in the field. These techniques may have been appropriate to apply 25 years ago, when the use of data encryption was relatively limited and, in many instances, all of those affected by a particular key change might be located within a single building or building complex. They do not appear to be suitable for the rapidly evolving current data communication environment, in which computing power is being distributed widely and in which almost continual key changes may be called for.
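One later-practice answer to the continual-rekeying problem, sketched here purely for illustration rather than as anything the note proposes: derive a fresh key for each session from a master secret that is distributed only once. The derivation function and all names below are assumptions.

```python
# Illustrative per-session key derivation from a once-distributed
# master secret, one way to support near-continual key changes
# without repeating the distribution step each time.
import hashlib, hmac

master = b"master-secret-distributed-once"

def session_key(counter: int) -> bytes:
    """Derive a distinct key for each numbered session."""
    return hmac.new(master, counter.to_bytes(8, "big"),
                    hashlib.sha256).digest()

for session in range(3):
    print(session, session_key(session).hex())
```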

[9] In effect, Friedman said that when cryptanalysis is undertaken it is not necessary to know either the key or the encryption algorithm that has been used. It is only necessary to be able to discern a pattern in the text to be decrypted. It follows, then, that while it is useful to protect both the details of the algorithm and the key that is used in an encryption process, it is equally important to avoid the disclosure of structure in the data that are being encrypted. Most efforts made to improve the use of data encryption tend to overlook this last concern.




Since 1953 Belden Menkus has been helping management improve information handling techniques and operational security practices. From 1953 to 1968 he occupied progressively more responsible staff positions with various organizations. Since 1968 he has been a full-time consultant to management. He has been accredited by the Society of Professional Management Consultants, and he has been certified as an Information Systems Auditor, a Systems Professional, an Office Automation Professional, and a Records Manager.

He writes and lectures extensively on various aspects of business management. He has been for many years Executive Editor of The Journal of Systems Management. He also is Editor in Chief of EDPACS: EDP Audit, Control and Security. In addition, he writes regularly for both Accounting Today and Modern Office Technology.

He is a Fellow of both the British Institute for Administrative Management and the American Association of Criminology. In addition, he has been named an Associate of the British Institute for Reprographic Technology. He has received the Jesse H. Neal Award of the Association of Business Publishers and twice has been honored with the silver medallion of the American Management Associations. In addition, he has received distinguished service citations from the National Micrographics Association, the Association of Records Executives and Administrators, the Association for Systems Management, the Institute of Internal Auditors, and the New York Chamber of Commerce and Industry. And, he has been named a life honorary member of the faculty of the Federal Emergency Management Institute.
