Seizing the Signals
CSCE 727 - Farkas 2
Reading List
This class:
– Carr, Chs. 3, 4
– Introduction to TEMPEST, The Complete and Unofficial TEMPEST Information Place, http://www.eskimo.com/~joelm/tempestintro.html
– NSA, TEMPEST endorsement program, http://www.nsa.gov/ia/industry/tempest.cfm
– Federal Computer Intrusion Laws, http://www.usdoj.gov/criminal/cybercrime/cclaws.html
Signal Intelligence
Deriving intelligence from intercepted electromagnetic waves
Types of intelligence:
– Communication intelligence (COMINT)
– Electronic intelligence (ELINT)
– Imagery intelligence (IMINT)
ELECTRONIC VISUAL SURVEILLANCE AND THE REASONABLE EXPECTATION OF PRIVACY
By Max Guirguis, Journal of Technology Law & Policy, 2004, http://grove.ufl.edu/~techlaw/vol9/issue2/guirguis.html
Positive results: reduced crime, efficient workplace, etc.
Negative results: potential misuse of recording, incorrect results, constitutional rights
Surveillance of public vs. private (or reasonably expected to be private) places
Echelon
Goal:
– Intercept large quantities of communication
– Analyze (semi-automated) gathered data
– Identify and extract messages of interest
What messages are retained?
– Key words, categories
– Human verification
Who has access to them?
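The keyword-based retention described above can be sketched as a watch-list match over intercepted messages, followed by human verification. This is a minimal sketch; the keywords, categories, and function names are illustrative assumptions, not actual selectors:

```python
# Illustrative watch list (hypothetical keywords/categories, not real selectors).
WATCH_LIST = {
    "chemical": "weapons",
    "wire transfer": "finance",
}

def retain(message: str) -> list[str]:
    """Return the categories of all watched keywords found in the message."""
    text = message.lower()
    return [cat for kw, cat in WATCH_LIST.items() if kw in text]

messages = [
    "meeting moved to tuesday",
    "confirm the wire transfer before noon",
]
# Matching messages are kept and queued for human verification.
flagged = [(m, retain(m)) for m in messages if retain(m)]
```

The pipeline shape matches the slide: semi-automated selection first, human review of the retained subset second.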
CSCE 727 - Farkas 6
The Positive Aspects
Increased national security
Preventive measures
Global effects
– Global commerce– Communication infrastructure
CSCE 727 - Farkas 7
Negative Aspects
Global (im)balance
Privacy issues
Misuse
Law
Error of analysis
– Large amount of data
– Sophistication of analysis
– Use of results
Other Surveillance Issues
Eavesdropping
Sender → Recipient
Tools: microphone receivers, tape recorders, phone “bugs”, scanners, radio receivers, satellite receivers, spy satellites, network sniffing, etc.
Computer Communications: TCP/IP Protocol Stack
Application Layer
Transport Layer
Internetwork Layer
Network Access Layer
• Each layer interacts with neighboring layers above and below
• Each layer can be defined independently
• Complexity of the networking is hidden from the application
At what layer should we support security?
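The layering can be illustrated by encapsulation: on send, each layer wraps the data from the layer above with its own header; on receive, headers are stripped in the opposite order. A minimal sketch with made-up textual “headers”:

```python
# Sketch of protocol-stack encapsulation with made-up textual "headers".
# Each layer only adds/removes its own header and never looks inside the
# rest, which is how networking complexity stays hidden from the application.

LAYERS = ["network-access", "internetwork", "transport", "application"]

def send(data: str) -> str:
    """Wrap data in one header per layer: application first, link layer last."""
    packet = data
    for layer in reversed(LAYERS):
        packet = f"[{layer}]{packet}"
    return packet

def receive(packet: str) -> str:
    """Strip headers outermost-first to recover the application data."""
    for layer in LAYERS:
        prefix = f"[{layer}]"
        if not packet.startswith(prefix):
            raise ValueError(f"malformed {layer} header")
        packet = packet[len(prefix):]
    return packet

wire = send("GET /index.html")  # headers nested, network-access outermost
```

Security can be inserted at any of these wrapping steps, which is the trade-off the layer-by-layer comparison that follows walks through.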
Security Needs
Basic services that need to be implemented:
– Key management
– Confidentiality
– Nonrepudiation
– Integrity/authentication
– Authorization
Network Access Layer Security
Dedicated link between hosts/routers; hardware devices for encryption
Advantages:
– Speed
Disadvantages:
– Not scalable
– Works well only on dedicated links
– Two hardware devices need to be physically connected
Internetwork Layer Security
IP Security (IPsec)
Advantages:
– Overhead involved with key negotiation decreases, since multiple protocols can share the same key management infrastructure
– Ability to build VPNs and intranets
Disadvantages:
– Difficult to handle fine-grained security, e.g., nonrepudiation, user-based security
Transport Layer Security
Advantages:
– Does not require enhancement to each application
Disadvantages:
– Difficult to obtain user context
– Implemented on an end system
– Protocol specific: must be implemented for each protocol
Application Layer Security
Advantages:
– Executes in the context of the user --> easy access to user’s credentials
– Complete access to data --> easier to ensure nonrepudiation
– Application can be extended to provide security (does not depend on the operating system)
– Application understands the data --> fine-tuned security
Disadvantages:
– Implemented in end hosts
– Security mechanisms have to be implemented for each application -->
  – expensive
  – greater probability of making mistakes
Surveillance Difficulties
New technologies
– 1994: U.S. Congress: Communications Assistance for Law Enforcement Act (“digital telephony bill”)
Encryption
Data authenticity and integrity
TEMPEST
U.S. government code name: a classified set of standards for limiting electric and magnetic radiation emanations from electronic equipment.
Investigations and studies of compromising emanations.
Compromising Emanations
Unintentional intelligence-bearing signals that, if intercepted and analyzed, can disclose classified information.
Intercepted when transmitted, handled, or processed
TEMPEST equipment: remotely mirrors what is being done on a device, e.g., video monitor, cable wire, processing unit, etc.
Unintentional Emanations
Normal operation of system
Deliberate or accidental exposure to unusual environment
Software induced
Security considerations:
Traditional
– Unauthorized access to the system: requires knowledge about the system, applications, configuration; can be detected; limited time frame; etc.
Upcoming
– Exploitation of compromising signals
TEMPEST History
U.S. government concern about capture and reconstruction of emanations from high-security devices used to process, transmit, and store sensitive data
– 1950s: introduce standards to limit “leakage” – NAG1A
– 1960s: revise NAG1A to FS222 and FS222A
– 1970s: revise standards – National Communications Security Information Memorandum 5100 (NACSIM)
– 1974: revise NACSIM 5100
– 1981: National Communications Security Committee Directive 4 – NACSIM 5100A (classified)
– 1984: National Communications Security Instructions – NACSI 5400 (secret)
– 1984: National Security Directive 145, by NSA
NSA: TEMPEST: A Signal Problem, http://www.nsa.gov/public_info/_files/cryptologic_spectrum/tempest.pdf
NSA: History of US Communications Security, http://www.nsa.gov/public_info/_files/cryptologic_histories/history_comsec.pdf
Military Application
WWII: enemy communications
– The German army eavesdropped on enemy communication while already implementing protection measures against the same attacks on German communications
1960: MI5 tempest attack on cipher machines
Limited publications
Non-military Application
1966: open publication on the risk of tempest attacks
1982–1984: Swedish government publication on the business risk of TEMPEST attacks
1985: van Eck – screen content disclosure
1985: bank ATM – card info and PIN
1990: tamper-resistant hardware – smart card
Electromagnetic Emissions
Simplest form of electromagnetic fields: transmission and distribution lines, wall-socket power: steady 60 Hz (U.S.) sinusoidal wave
Electric devices alter the characteristics of electromagnetic waves (frequency, power level, wave form)
– E.g., wave forms: sinusoidal, sawtooth, spike, square
Capture and interpret: complex waves can be captured, interpreted, and replayed on a similar device to create an exact replica of the original
Field strength
– Decreases with distance from the electric device
– Depends on the emanating device, e.g., type of screen, CPU, etc.
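As a concrete (and idealized) model of the distance effect, free-space path loss grows with the logarithm of distance and frequency: FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44. Real near-field compromising emanations behave less cleanly, so treat this as a sketch only:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Doubling the distance adds ~6 dB of loss (received power drops to 1/4),
# which is why physical separation is one TEMPEST countermeasure.
loss_10m = fspl_db(0.010, 100.0)   # 10 m from a hypothetical 100 MHz emanation
loss_20m = fspl_db(0.020, 100.0)
extra_loss = loss_20m - loss_10m   # ~6.02 dB
```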
COMSEC
Four main parts:
– Physical security
– Emission security
– Transmission security
– Cryptographic security
Red equipment: handles plaintext information with national security value
Black equipment: handles protected (encrypted) information
Unintentional emissions: from Red systems
TEMPEST Attack
Requires:
– High level of expertise and equipment to decode captured waves
– Proximity to the target
– Long collection time
Processing device: $5,000–$250,000
Tempest Protection
Physical separation
– Exclude unauthorized individuals from areas near the source of emanation
Electromagnetic separation
– Shielding, filtering, etc. to remove the leak
Signal level minimization
– Use of the lowest feasible power level
TEMPEST Shielding
NSA specifications
– Ferrites, other frequency-interference products
– Shield equipment, cables, room, building, etc.
– NSA standards, endorsed devices and contractors
– Expensive: a TEMPEST-protected PC costs about double the price
– Shielding and distance together
Threat-Based System
Reduce the cost of TEMPEST efforts
– Evaluation: sensitivity of information, risk of TEMPEST attack, etc.
– Personnel control: physical control, unauthorized access
– Compartmentalization: each sensitivity level is isolated from the others
– Physical control of emanation: shield, power, noise, etc.
Tempest Procedures
Government and organizational restrictions
Products, installation, maintenance
Reporting needs
Certified TEMPEST Technical Authority (CTTA)
Need for TEMPEST
Little public data on TEMPEST cases
Government focus and funding
– National security intelligence
– Economic espionage
Decoding device: hard to obtain
Bandwidth of human intelligence vs. TEMPEST
TEMPEST threat within U.S. – minimal??
CSCE 727
Cyber Attacks
(Brief Overview)
Attack
RFC 2828:
“ An assault on system security that derives from an intelligent threat, i.e., an intelligent act that is a deliberate attempt (especially in the sense of a method or technique) to evade security services and violate the security policy of the system.”
Normal Flow
Information source → Information destination
Interruption
Information source → Information destination (flow blocked)
Asset is destroyed or becomes unavailable – Availability
Example: destruction of hardware, cutting communication line, disabling file management system, etc.
Interception
Information source → Information destination (unauthorized party also reads the flow)
Unauthorized party gains access to the asset – Confidentiality
Example: wiretapping, unauthorized copying of files
Modification
Information source → Information destination (unauthorized party alters the flow)
Unauthorized party tampers with the asset – Integrity
Example: changing values of data, altering programs, modifying the content of a message, etc.
Fabrication
Information source → Information destination (unauthorized party injects a new flow)
Unauthorized party inserts counterfeit objects into the system – Authenticity
Example: insertion of offending messages, addition of records to a file, etc.
Phases of Attack
Improve detection by examining in which “phase” an intruder’s behavior is identified
Attack phases:
– Intelligence gathering: attacker observes the system to determine vulnerabilities
– Planning: attacker decides what resource to attack (usually the least defended component)
– Attack: attacker carries out the plan
– Inside the system:
  - Hiding: attacker covers the tracks of the attack
  - Future attacks: attacker installs backdoors for future entry points
Passive Attack
“Attempts to learn or make use of information from the system but does not affect system resources” (RFC 2828)
Sniffer
Sniffers
How easy is it to sniff on:
– Local wired network
– Wide-area wired network
– Wireless devices
What are the risks of sniffers?
– Message content
– Traffic flow
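The message-content risk exists because unencrypted traffic can be decoded field by field. A sketch of what a sniffer does with a captured IPv4 packet; the packet below is hand-built for illustration, not a real capture:

```python
import struct

def parse_ipv4(packet: bytes) -> dict:
    """Decode the fixed 20-byte IPv4 header into the fields a sniffer sees."""
    (version_ihl, _tos, _total_len, _ident, _flags_frag,
     _ttl, proto, _cksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    header_len = (version_ihl & 0x0F) * 4
    return {
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
        "proto": proto,                  # 6 = TCP, 17 = UDP
        "payload": packet[header_len:],  # plaintext unless encrypted above this layer
    }

# Hand-built packet: 10.0.0.5 -> 10.0.0.9, protocol TCP, plaintext payload.
pkt = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 28, 0, 0, 64, 6, 0,
                  bytes([10, 0, 0, 5]), bytes([10, 0, 0, 9])) + b"password"
info = parse_ipv4(pkt)
```

Note that even if the payload were encrypted, the source, destination, and protocol fields remain visible, which is exactly the traffic-flow risk listed above.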
Passive Attacks
Interception (confidentiality)
– Disclosure of message contents
– Traffic analysis
How can we protect against sniffers?
Protection Against Passive Attacks
Shield confidential data from sniffers: cryptography
Disturb traffic patterns:
– Traffic padding
– Onion routing
Modern switch technology: network traffic is directed to the destination interfaces
Detect and eliminate sniffers
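Traffic padding can be sketched as fixing every message to a constant block size before encryption, so the lengths an eavesdropper observes carry no information. The 256-byte block size and length-prefix framing are illustrative choices:

```python
BLOCK = 256  # illustrative fixed message size; any constant works

def pad(message: bytes) -> bytes:
    """Length-prefix the message, then zero-pad to a whole number of blocks."""
    framed = len(message).to_bytes(4, "big") + message
    padded_len = -(-len(framed) // BLOCK) * BLOCK  # round up to block multiple
    return framed + b"\x00" * (padded_len - len(framed))

def unpad(padded: bytes) -> bytes:
    n = int.from_bytes(padded[:4], "big")
    return padded[4:4 + n]

# Both padded messages are identical in length, so a sniffer watching
# (encrypted) traffic cannot distinguish a short reply from a long one.
short = pad(b"yes")
long_ = pad(b"attack at dawn, bring everyone")
```

Padding hides message sizes; onion routing additionally hides who is talking to whom by relaying through layered-encrypted intermediaries.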
Active Attacks
“Attempts to alter system resources or affect their operation” (RFC 2828)
Active Attacks
– Interruption (availability)
– Modification (integrity)
– Fabrication (integrity)
Give examples of attacks!
Active Attacks
Masquerade
Replay
Modification of messages
Denial of service
Degradation of service
Spoofing attacks
Session hijacking
Degradation of Service
Does not completely block the service, just reduces the quality of service
Intrusion Control
It is better to prevent something than to plan for loss.
Problem: Misuse happens!
Need:
Intrusion Prevention: protect system resources
Intrusion Detection: (second line of defense) identify misuse
Intrusion Recovery and response: cost effective recovery models
Intrusion Prevention
First line of defense
Techniques: cryptography, identification, authentication, authorization, access control, security filters, etc.
Not good enough (prevention, reconstruction)
Intrusion Detection System (IDS)
Looks for specific patterns (attack signatures or abnormal usage) that indicate malicious or suspicious intent
Second line of defense against both internal and external threats
Intrusion Detection Systems
Deter intruders
Catch intruders
Prevent threats from fully occurring (real-time IDS)
Improve prevention techniques
IDS deployment, customisation, and management is generally not trivial
Audit-Based Intrusion Detection
Audit data + profiles, rules, etc. → Intrusion Detection System → Decision
Need:
• Audit data
• Ability to characterize behavior
Audit Data
Format, granularity, and completeness depend on the collecting tool
Examples:
– System tools collect data (login, mail)
– Additional collection at low system level
– “Sniffers” as network probes
– Application auditing
– Honeynets
Needed for:
– Establishing guilt of attackers
– Detecting suspicious user activities
Audit Data Accuracy
Collection method– System architecture and collection point– Software and hardware used for collection
Storage method– Protection of audit data
Sharing and Integration– Transmission protection and correctness– Availability
IDS Categories
1. Time of data analysis: real-time vs. off-line IDS
2. Location where audit data was gathered: host-based vs. network-based vs. hybrid
3. Technique used for analysis: rule-based vs. statistics-based
4. Location of analysis: centralized, distributed, network-based
5. Pattern the IDS looks for: misuse vs. anomaly-based vs. hybrid
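The statistics-based (anomaly) technique can be illustrated with a minimal detector: build a profile of normal behavior from past audit data, then flag large deviations. The login-count feature and the 3-standard-deviation threshold are illustrative assumptions:

```python
import statistics

def build_profile(logins_per_day: list[int]) -> tuple[float, float]:
    """Profile 'normal' behavior from historical audit data."""
    return statistics.mean(logins_per_day), statistics.stdev(logins_per_day)

def is_anomalous(count: int, profile: tuple[float, float], k: float = 3.0) -> bool:
    """Flag counts more than k standard deviations from the profile mean."""
    mean, stdev = profile
    return abs(count - mean) > k * stdev

history = [4, 5, 6, 5, 4, 6, 5, 5]  # a user's normal daily login counts
profile = build_profile(history)
alarm = is_anomalous(50, profile)   # a sudden burst of logins is flagged
```

A rule-based (misuse) detector would instead match events against known attack signatures; anomaly detection can catch novel behavior, at the price of false alarms.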
Incident Response
Incident Response
• Federal Communications Commission: Computer Security Incident Response Guide, 2001, http://csrc.nist.gov/groups/SMA/fasp/documents/incident_response/Incident-Response-Guide.pdf
• Incident Response Team, R. Nellis, http://www.rochissa.org/downloads/presentations/Incidence%20Response%20Teams.ppt
• NIST special publications, http://csrc.nist.gov/publications/nistpubs/index.html
Due Care and Liability
Organizational liability for misuse
– US Federal Sentencing Guidelines: the chief executive officer and top management are responsible for fraud, theft, and antitrust violations committed by insiders or outsiders using the company’s resources.
– Fines and penalties: base fine × culpability score (95%–400%)
– Good-faith efforts: written policies, procedures, security awareness program, disciplinary standards, monitoring and auditing, reporting, and cooperation with investigations
When to plan for incidents?
Roles and Responsibilities
User:
– Vigilant for unusual behavior
– Report incidents
Manager:
– Awareness training
– Policies and procedures
System administrator:
– Install safeguards
– Monitor system
– Respond to incidents, including preservation of evidence
Computer Incident Response Team
Assists in handling security incidents
– Formal
– Informal
Incident reporting and dissemination of incident information
Computer Security Officer
– Coordinates computer security efforts
Others: law enforcement coordinator, investigative support, media relations, etc.
Incident Response Process 1.
Preparation
– Baseline protection
– Planning and guidance
– Roles and responsibilities
– Training
– Incident response team
How to Respond?
Actions to avoid further loss from intrusion
Terminate intrusion and protect against reoccurrence
Law enforcement – prosecute
Enhance defensive security
Reconstructive methods based on:
– Time period of intrusion
– Changes made by legitimate users during the affected period
– Regular backups, audit-trail-based detection of affected components, semantics-based recovery, minimal roll-back for recovery
Incident Response Process 2.
Identification and assessment
– Symptoms
– Nature of incident
Identify perpetrator, origin, and extent of attack
Can be done during the attack or after the attack
– Gather evidence
  - Keystroke monitoring, honeynets, system logs, network traffic, etc.
  - Legislation on monitoring!
– Report on preliminary findings
Incident Response Process 3.
Containment
– Reduce the chance of spread of the incident
– Determine sensitive data
– Terminate suspicious connections, personnel, applications, etc.
– Move critical computing services
– Handle human aspects, e.g., perception management, panic, etc.
Why is the Human Aspect Important?
What can we do to limit damage?
Are cover stories acceptable?
Incident Response Process 4.
Eradication
– Determine and remove the cause of the incident if economically feasible
– Improve defenses: software, hardware, middleware, physical security, etc.
– Increase awareness and training
– Perform vulnerability analysis
Incident Response Process 5.
Recovery
– Determine course of action
– Reestablish system functionality
– Reporting and notifications
– Documentation of incident handling and evidence preservation
Follow-Up Procedures
Incident evaluation:
– Quality of incident handling (preparation, time to respond, tools used, evaluation of response, etc.)
– Cost of incident (monetary cost, disruption, lost data, hardware damage, etc.)
Preparing report
Revise policies and procedures
Recent Concerns
Recovery or Survivability?
What is “Survivability”?
To decide whether a computer system is “survivable”, you must first decide what “survivable” means.
Vulnerable Components
1. Hardware
2. Software
3. Data
4. Communications
5. People
Effect Modeling and Vulnerability Detection
Cascading effects:
– Seriously affected components
– Weakly affected components
– Not affected components
Incorporating Human Aspects?
Traditional issues: password sharing, errors, fraud, insiders, malicious users, social engineering, etc.
New issues: perception management, psychological operations, communication media
– Egypt: role of the Internet. A. Alexander, Internet role in Egypt's protests, http://www.bbc.co.uk/news/world-middle-east-12400319
Legal Aspects
National law
International law
Legal regime to apply
Gray areas of law
Legal response
Evidence preservation
THEMIS: Threat Evaluation Metamodel for Information Systems
Presented at the 2nd Symposium on Intelligence and Security Informatics, 2004
Csilla Farkas, Thomas Wingfield, James B. Michael, Duminda Wijesekera
Themis, Goddess of Justice
Cyber vs. Kinetic Attack
Academic state-of-the-art: effects-based analysis
Problem: Charter paradigm is means-based
The Schmitt Reconciliation
– Distinguishing military from diplomatic and economic coercion
– Seven factors
Use of Force in Cyberspace
Schmitt Factors
Severity
Immediacy
Directness
Invasiveness
Measurability
Presumptive Legitimacy
Responsibility
Severity
Armed attacks threaten physical injury or destruction of property to a much greater extent than other forms of coercion. Physical well-being usually occupies the [lowest, most basic level] of the human hierarchy of need.
How many people were killed?
How large an area was attacked? (Scope)
How much damage was done within this area? (Intensity)
Scale: People Killed; Severe Property Damage → People Injured; Moderate Property Damage → People Unaffected; No Discernible Property Damage
Immediacy
Over how long a period did the action take place? (Duration)
How soon were its effects felt?
How soon until its effects abate?
Scale: Seconds to Minutes → Hours to Days → Weeks to Months
The negative consequences of armed coercion, or threat thereof, usually occur with great immediacy, while those of other forms of coercion develop more slowly.
Directness
Was the action distinctly identifiable from parallel or competing actions?
Was the action the proximate cause of the effects?
Scale: Action Sole Cause of Result → Action Identifiable as One Cause of Result, and to an Indefinite Degree → Action Played No Identifiable Role in Result
The consequences of armed coercion are more directly tied to the actus reus than in other forms of coercion, which often depend on numerous contributory factors to operate.
(Actus reus: the voluntary and wrongful act or omission that constitutes the physical components of a crime. Because a person cannot be punished for bad thoughts alone, there can be no criminal liability without actus reus.)
Invasiveness
Did the action involve physically crossing the target country’s borders?
Was the locus of the action within the target country?
Scale: Border Physically Crossed; Action Has Point Locus → Border Electronically Crossed; Action Occurs Over Diffuse Area → Border Not Crossed; Action Has No Identifiable Locus in Target Country
In armed coercion, the act causing the harm usually crosses into the target state, whereas in economic warfare the acts generally occur beyond the target’s borders. As a result, even though armed and economic acts may have roughly similar consequences, the former represents a greater intrusion on the rights of the target state and, therefore, is more likely to disrupt international stability.
Measurability
Can the effects of the action be quantified?
Are the effects of the action distinct from the results of parallel or competing actions?
What was the level of certainty?
Scale: Effects Can Be Quantified Immediately by Traditional Means (BDA, etc.) with High Degree of Certainty → Effects Can Be Estimated by Rough Order of Magnitude with Moderate Certainty → Effects Cannot Be Separated from Those of Other Actions; Overall Certainty Is Low
While the consequences of armed coercion are usually easy to ascertain (e.g., a certain level of destruction), the actual negative consequences of other forms of coercion are harder to measure. This fact renders the appropriateness of community condemnation, and the degree of vehemence contained therein, less suspect in the case of armed force.
Presumptive Legitimacy
Has this type of action achieved a customary acceptance within the international community?
Is the means qualitatively similar to others presumed legitimate under international law?
Scale: Action Accomplished by Means of Kinetic Attack → Action Accomplished in Cyberspace but Manifested by a “Smoking Hole” in Physical Space → Action Accomplished in Cyberspace and Effects Not Apparent in Physical World
In most cases, whether under domestic or international law, the application of violence is deemed illegitimate absent some specific exception such as self-defense. The cognitive approach is prohibitory. By contrast, most other forms of coercion (again in the domestic and international sphere) are presumptively lawful, absent a prohibition to the contrary. The cognitive approach is permissive.
Responsibility
Is the action directly or indirectly attributable to the acting state?
But for the acting state, would the action have occurred?
Scale: Responsibility for Action Acknowledged by Acting State; Degree of Involvement Large → Target State Government Aware of Acting State’s Responsibility; Public Role Unacknowledged; Degree of Involvement Moderate → Action Unattributable to Acting State; Degree of Involvement Low
Armed coercion is the exclusive province of states; only they may generally engage in uses of force across borders, and in most cases only they have the ability to do so with any meaningful impact. By contrast, non-governmental entities are often capable of engaging in other forms of coercion (propaganda, boycotts, etc.).
Overall Analysis
Have enough of the qualities of a use of force been identified to characterize the information operation as a use of force?
Scale: Use of Force Under Article 2(4) → Arguably Use of Force or Not → Not a Use of Force Under Article 2(4)
Recommended
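As an illustration only, the seven-factor analysis above can be caricatured as a scoring exercise. The 0–1 scale, equal weights, and numeric thresholds below are assumptions made for the sketch; they are not part of the Schmitt framework or of THEMIS, which treat the factors qualitatively:

```python
# Illustrative scoring only: rate each factor from 0.0 (clearly not
# force-like) to 1.0 (clearly force-like), average with equal weights,
# and bucket the result. Scale, weights, and thresholds are assumptions.

FACTORS = ["severity", "immediacy", "directness", "invasiveness",
           "measurability", "presumptive_legitimacy", "responsibility"]

def characterize(scores: dict[str, float]) -> str:
    if set(scores) != set(FACTORS):
        raise ValueError("score every Schmitt factor exactly once")
    avg = sum(scores.values()) / len(FACTORS)
    if avg >= 0.7:
        return "use of force under Article 2(4)"
    if avg >= 0.4:
        return "arguably a use of force"
    return "not a use of force under Article 2(4)"

# A hypothetical operation: severe, immediate, and direct, but invisible,
# hard to measure, and unattributed.
op = dict.fromkeys(FACTORS, 0.2) | {"severity": 0.9, "immediacy": 0.8,
                                    "directness": 0.7}
verdict = characterize(op)  # lands in the "arguably" gray zone
```

The gray-zone middle bucket is the interesting case in practice: it is exactly where the factor-by-factor argument, rather than any single number, has to carry the legal analysis.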