Research Report No. UVACTS-5-14-63
July 2004

DEVELOPMENT OF COUNTER MEASURES TO SECURITY RISKS FROM AIR CARGO TRANSPORT
By:
Carla D. Rountree
Dr. Michael J. Demetsky
A Research Project Report For the Mid-Atlantic Universities Transportation Center (MAUTC) A U.S. DOT University Transportation Center
Dr. Michael J. Demetsky
Department of Civil Engineering
Email: [email protected]

The Center for Transportation Studies at the University of Virginia produces outstanding transportation professionals and innovative research results, and provides important public service. The Center for Transportation Studies is committed to academic excellence, multi-disciplinary research, and developing state-of-the-art facilities. Through a partnership with the Virginia Department of Transportation’s (VDOT) Research Council (VTRC), CTS faculty hold joint appointments, VTRC research scientists teach specialized courses, and graduate student work is supported through a Graduate Research Assistantship Program. CTS receives substantial financial support from two federal University Transportation Center grants: the Mid-Atlantic Universities Transportation Center (MAUTC) and the National ITS Implementation Research Center (ITS Center). Other related research activities of the faculty are funded through FHWA, NSF, the U.S. Department of Transportation, VDOT, other governmental agencies, and private companies.
Disclaimer: The contents of this report reflect the views of the authors, who are responsible for the facts and the accuracy of the information presented herein. This document is disseminated under the sponsorship of the Department of Transportation, University Transportation Centers Program, in the interest of information exchange. The U.S. Government assumes no liability for the contents or use thereof.
Center for Transportation Studies
University of Virginia
351 McCormick Road, P.O. Box 400742
Charlottesville, VA 22904-4742
434.924.6362
CTS Website: http://cts.virginia.edu
ABSTRACT
The terrorist attacks of September 11, 2001 displayed the shortcomings of aviation
security in the United States. Most of the attention on aviation security since that time
has focused on airline passengers, their luggage, and their carry-on items, leaving air
cargo security on the back burner. The lack of screening, and of screening
guidelines, for cargo traveling on both passenger and all-cargo aircraft motivated
this research project: the development of a framework that may be used
by individual airports or airlines to analyze various security setups for screening
outbound air cargo within an on-airport cargo facility. This was accomplished through
airport surveys, a case study at an air cargo facility, and computer simulations testing
various setups of security technologies to screen cargo within a facility.
Data collected from surveys sent to over 100 of the nation’s major airports
revealed the lack of security in the air cargo environment and validated the need for this
research. Information was obtained on security measures utilized for cargo and
personnel, as well as the frequency of cargo screenings and information on the size and
setups of cargo facilities. Also, a results comparison between large and small airports
was conducted. A case study was performed at a cargo facility within a major U.S.
airport in order to gather data pertinent to the simulations used to test the security setups.
Information gathered on truck arrivals, the number of flight destinations, security
measures in place, as well as the general facility setup was used to form the basis of the
simulations. The simulations, conducted in Arena 7.01, tested the effectiveness and
cargo throughput of four security cases. Each case employed a different combination of
security measures proven suitable for an air cargo environment. The security setups were
evaluated based on the security systems’ costs, their overall effectiveness in detecting
high-risk cargo, and the average amount of time taken to process cargo through the facility.
The Arena simulations provide airlines, freight forwarders, and airport authorities
with a tool for evaluating which cargo security screening measures offer the best
security solution for their particular facility or facilities. However,
further research is needed on the effectiveness of many security technologies. With this
information, government and aviation officials will be able to use this framework as a
step toward achieving a well-rounded plan for ensuring the safety and security of our
nation’s air cargo.
TABLE OF CONTENTS
CHAPTER 1: INTRODUCTION 12
1.1: Introduction to the Air Cargo Industry 12
1.2: Emphasis on Transportation Security 13
1.3: Current State of Air Cargo and Future Forecasts 14
1.4: Problem Statement and Purpose 15
1.5: Scope 16
1.6: Report Organization 17
CHAPTER 2: LITERATURE REVIEW 18
2.1: Efforts to Secure Air Cargo 18
2.2: Screening Methods 21
2.3: Efforts to Secure Ports and Ocean-bound Cargo 30
2.4: Electronic Supply Chain Manifest (ESCM) 33
2.5: Hazardous Materials (Hazmat) Regulations and Security Efforts 35
2.6: Simulation Models 37
2.7: Summary 40
CHAPTER 3: METHODOLOGY 41
3.1: Methodology Introduction 41
3.2: Airport Surveys 41
3.3: Evaluations of Screening Methods 44
3.4: Case Study 45
3.5: Arena Simulation of Cargo Flow Through an On-airport Cargo Facility 51
CHAPTER 4: RESULTS 59
4.1: Airport Surveys 59
4.2: Screening Methods 77
4.3: Arena Models 78
CHAPTER 5: CONCLUSIONS AND RECOMMENDATIONS 97
5.1: Conclusions 97
5.2: Recommendations 99
REFERENCES 101
APPENDIX A: SURVEY MATERIALS 105
LIST OF TABLES
Table 2.1: Breakdown of Screening Method Characteristics 26
Table 3.1: Detection Capabilities of Screening Methods Researched 53
Table 4.1: Results of the First Phase of Evaluation of Screening Methods 77
Table 4.2: Cases with Screening Technologies Capable of Screening for Explosives 79
Table 4.3: Cases with Screening Technologies Capable of Screening for Stolen/Mislabeled Goods 79
Table 4.4: Cases with Screening Technologies Capable of Screening for Illegal Drugs 79
Table 4.5: Cases with Screening Technologies Capable of Screening for Radioactive Materials 79
Table 4.6: Cases with Screening Technologies Capable of Screening for Dangerous/Illegal Gases 79
Table 4.7: Number of Screening Technologies In Each Case Capable of Detecting Each Threat Type 80
Table 4.8: Case Costs and Processing Results 80
Table 4.9: Results for Explosives/Explosive Materials Detection 82
Table 4.10: Results for Stolen/Mislabeled Goods Detection 83
Table 4.11: Results for Illegal Drug Detection 84
Table 4.12: Results for Dangerous Gases Detection 85
Table 4.13: Results for Radioactive Materials Detection 86
Table 4.14: Results for Overall High-Risk Cargo Detection 87
Table 4.15: Means and 95% Confidence Intervals for Explosives Detection 88
Table 4.16: Means and 95% Confidence Intervals for Stolen Goods Detection 88
Table 4.17: Means and 95% Confidence Intervals for Illegal Drugs Detection 89
Table 4.18: Means and 95% Confidence Intervals for Radioactive Materials Detection 89
Table 4.19: Means and 95% Confidence Intervals for Dangerous Gases Detection 90
LIST OF FIGURES
Figure 1.1: Breakdown of U.S. Domestic Air Cargo Market for 2001 15
Figure 3.1: Case Study Facility Layout, Outbound Side 46
Figure 3.2: Flow Chart Illustrating Cargo Flow Through An On-airport Cargo Facility 48
Figure 3.3: Histogram of Time Between Truck Arrivals to Case Study Facility 50
Figure 3.4: Best Fit Distribution from Arena’s Input Analyzer, −0.001 + 15.4(0.382)^(−15.4) x^(15.4−1) e^(−(x/0.382)^15.4) 50
Figure 4.1: 95% Confidence Interval Spreads for 10% Screening in All Four Cases and for All Five High-Risk Types for Percent Found from Security 91
Figure 4.2: 95% Confidence Interval Spreads for 25% Screening in All Four Cases and for All Five High-Risk Types for Percent Found from Security 91
Figure 4.3: 95% Confidence Interval Spreads for 50% Screening in All Four Cases and for All Five High-Risk Types for Percent Found from Security 92
Figure 4.4: 95% Confidence Interval Spreads for 10% Screening in All Four Cases and for All Five High-Risk Types for Total Percent Found 93
Figure 4.5: 95% Confidence Interval Spreads for 25% Screening in All Four Cases and for All Five High-Risk Types for Total Percent Found 94
Figure 4.6: 95% Confidence Interval Spreads for 50% Screening in All Four Cases and for All Five High-Risk Types for Total Percent Found 94
GLOSSARY
Belly cargo/freight: cargo or freight that is carried underneath the main cabin of
passenger aircraft.
Cargo unit: a single cargo parcel of any size.
Cutoff time: the time past which an airline or freight forwarder will no longer accept
cargo shipments for same-day flights. Cutoff times differ between domestic and
international cargo, and also between airlines or freight forwarders.
Freight forwarder: a person or company that picks up, transports, and delivers
shipments on behalf of another party. Freight forwarders may also be referred to
as indirect air carriers.
Known shipper program: federal mandate that airlines (both direct and indirect)
shipping cargo aboard passenger aircraft know the origin of that cargo. Airlines
often accomplish this by verifying the customer’s location and type of business
before accepting any shipments. Cargo flying on all-cargo aircraft is not required
to originate from known shippers.
Outbound air cargo: air cargo that is awaiting transport by aircraft to a destination.
Tug/cart system: a popular method of transporting air cargo and passenger luggage
between facilities and aircraft. A small truck (tug) carries a train of about 3 or 4
platforms (carts) on wheels that hold ULDs or luggage.
Unit loading device (ULD): a container designed to fit precisely inside the hull of an
aircraft that can hold one or more cargo shipments. ULDs have different designs
for different types of aircraft and therefore will fit only one specific make of
aircraft.
ACKNOWLEDGEMENTS
I would like to thank my advisor, Professor Michael J. Demetsky, for his
encouragement and support of a project that fit my interests and career goals, and for his
assistance and advice throughout the pursuit of this research. I would also like to thank
Saeed Eslambolchi for his gracious help in developing the surveys, along with Professor
K. Preston White for his assistance in troubleshooting my efforts in the simulations.
Your guidance throughout this project is sincerely appreciated.
I would also like to express my gratitude to the officials from the case study
airport and case study facility. This project would not have been possible without your
kind assistance.
Finally, I would like to thank my family and friends for their continued interest
and support of this work and my career goals.
CHAPTER 1: INTRODUCTION
1.1: Introduction to the Air Cargo Industry
Since its beginnings in the 1920s, the air cargo industry has been known for its
speedy yet expensive service. This sector of the freight transportation market thrives on
the narrow time frame in which it can deliver goods, and customers in need of quick
deliveries are willing to pay the price. Air transport is especially advantageous when
long distances must be overcome. Cargo may traverse an ocean and be delivered within
24 hours by aircraft, whereas the same journey may take five to ten times as long by sea.
As should be expected, the cargo will cost five to ten times as much to ship by air
as by sea.1
The air cargo industry began with a single scheduled cargo shipment that flew just
after the end of World War I, and the first regular all-cargo flight service began in 1926.
After World War II ended, many war veterans began their own non-scheduled air cargo
services that led to high competition, increased flight frequencies, and the eventual
failure of most of the non-scheduled services. The scheduled services that survived grew
steadily through the 1940s and 1950s. The air transport deregulation of 1977 and 1978
resulted in an influx of new domestic airline companies seeking to take advantage of the
new ability to negotiate shipment rates. The latest change in the industry occurred in the
1 William Armbruster, “Breaking News,” Air Cargo World Online, October 8, 2002. http://www.aircargoworld.com/break_news/3.htm.
1980s with the introduction of the “just-in-time” concept, in which deliveries are planned
in advance with the intent of keeping customers’ inventories low.2
Today air cargo can be transported one of two ways: as belly freight on a
commercial passenger flight, or on a dedicated all-cargo aircraft. Commercial airlines
make between five and ten percent of their revenue through the air cargo business.3
Around 50 to 60 percent of domestic air cargo by tons is transported by commercial
passenger aircraft.4
1.2: Emphasis on Transportation Security
The terrorist attacks of September 11, 2001 spurred the recent focus on
transportation security and its apparent shortcomings, leading to the creation of
the Department of Homeland Security and the Transportation Security Administration
(TSA). The vast majority of the TSA’s new federal security regulations and programs
have targeted airline passengers and their luggage, as well as ocean-bound cargo. New
federal security screeners, screening machines such as x-rays and trace detection devices,
luggage and carry-on rules, and stricter screening of passengers themselves have been
introduced in airports around the country. Fear of a “dirty” bomb or a weapon of mass
destruction in a cargo container instigated the focus on ocean-bound cargo security,
which has ranged from new federal security programs that involve international
cooperation to the installation of security devices such as radiation detectors, radio
frequency identification, and electronic cargo unit sensors. Air cargo is one
2 Richard Malkin, “An Air Cargo Century,” Air Cargo World Online, January 2000. http://www.aircargoworld.com/archives/feat1jan00.htm.
3 Karim Nice, “How Air Freight Works,” How Stuff Works, October 2002. http://www.howstuffworks.com/air-freight/htm.
4 “Airport Security Report,” PBI Media LCC, Volume 9, No. 5, February 27, 2002, p. 3.
transportation sector that has been mostly overlooked from a security standpoint. Very
few regulations pertaining to air cargo security have been passed by the federal
government, and most of those that have are vague and non-specific. Major legislation
that would outline specific and strict air cargo security regulations has been introduced
into Congress, but so far nothing has been passed.
1.3: Current State of Air Cargo and Future Forecasts
U.S. domestic air cargo tonnage (revenue-ton kilometers) grew by 41% in the
1990s. In 2000 alone, domestic tonnage went up 3.7%; however, it dropped by 9.2% in
2001 because of the September 11 attacks and a slowing economy. Air cargo carriers’
use of trucks has increased in recent years, even faster than air transport itself. Trucks
have become a major part of the air cargo market as a key link to
cargo facilities, making the air cargo industry increasingly intermodal. From 1995 to
2000 truck-transported freight (tons) increased by 4.5%, while air-transported freight
grew by only 1.9%. In 2001 express deliveries made up the vast majority of the domestic
air freight market, at 60.5%, while another 20.1% was taken by scheduled freight. The
domestic air freight market is illustrated in Figure 1.1.
Figure 1.1: Breakdown of U.S. Domestic Air Cargo Market for 2001
[Pie chart: Express Deliveries 60.5%; Scheduled Freight Deliveries 20.1%; Other 19.4%]
Boeing predicts that U.S. domestic air cargo will grow at about 4.5% annually
through 2011 and at about 4.3% annually through 2021. North American cargo
traffic with Europe, Asia, and Latin America decreased by 10% to 16% in 2001, but is
expected to steadily increase by 6% to 8% through 2021.5
1.4: Problem Statement and Purpose
According to the FBI, cargo terminals, cargo transfer facilities, and consolidation
facilities are hotbeds for cargo theft.6 And while 50% to 60% of all U.S. air cargo travels
as belly cargo, only about 4% is screened for explosives.7 The precise effectiveness of
most screening methods used, along with the effectiveness of screeners themselves, is not
known.8 This, along with the lack of legislation pertaining to air cargo security, has left
a large gap in a major sector of transportation security. A standardized system or set
5 “Boeing World Air Cargo Forecast 2002-2003: North America,” The Boeing Company, 2002. http://www.boeing.com/commercial/cargo/n_america.html.
6 “Aviation Security: Vulnerabilities and Potential Improvements for the Air Cargo System,” U.S. General Accounting Office, December 2002, pp. 8-9.
7 Gordon Dickson and Byron Okada, “Gaps Cited in Air Cargo Security,” Knight-Ridder/Tribune News Service, February 27, 2002, p. 1.
8 “Aviation Security: Efforts to Measure Effectiveness and Address Challenges,” U.S. General Accounting Office, November 2003, p. 1.
procedure for screening cargo within airport facilities is needed. Various combinations
of security measures need to be evaluated in order to develop a system that effectively
screens as much cargo as the limited time element allows and at a reasonable cost to the
industry.
The purpose of this study is to analyze outbound cargo flow through an on-airport
cargo facility and develop a systematic framework for evaluating air cargo screening
alternatives within an air cargo facility without jeopardizing the crucial time element
involved in air cargo transport and with minimal cost. Hazards such as explosives,
explosive materials, cargo theft and smuggling, and chemical attacks will be considered
in the analysis. A computer simulation of an on-airport cargo facility will be used to
evaluate various combinations of security technologies used to counteract such risks.
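The kind of evaluation such a simulation performs can be illustrated with a minimal sketch. The Python model below is not the Arena model developed in this study; it is a simplified Monte Carlo, and every parameter in it (screening fraction, detection probability, handling time, screening time, and high-risk rate) is a hypothetical placeholder chosen only to make the example run.

```python
# Minimal Monte Carlo sketch of screening cargo units in a facility.
# NOT the Arena 7.01 model from this study; all parameter values are
# hypothetical placeholders for illustration only.
import random

random.seed(42)

def simulate(n_units, screen_fraction, detect_prob, screen_time_s,
             base_handling_s=120.0, high_risk_rate=0.01):
    """Return (fraction of high-risk units caught, mean processing time in s)."""
    caught = high_risk = 0
    total_time = 0.0
    for _ in range(n_units):
        is_high_risk = random.random() < high_risk_rate
        high_risk += is_high_risk
        t = base_handling_s                        # ordinary handling time
        if random.random() < screen_fraction:      # unit selected for screening
            t += screen_time_s
            if is_high_risk and random.random() < detect_prob:
                caught += 1
        total_time += t
    return (caught / high_risk if high_risk else 0.0, total_time / n_units)

# Compare two hypothetical setups: screen 10% vs. 50% of units with a
# trace-detection-style device (30-60 s per unit, per the ranges in Chapter 2).
for frac in (0.10, 0.50):
    found, avg_t = simulate(100_000, frac, detect_prob=0.9, screen_time_s=45.0)
    print(f"screen {frac:.0%}: caught {found:.0%} of high-risk cargo, "
          f"avg processing {avg_t:.0f} s")
```

Holding the other parameters fixed, raising the screened fraction raises both the share of high-risk cargo caught and the average processing time, which is precisely the security-versus-throughput trade-off the study's simulation cases are designed to quantify.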
1.5: Scope
For this project, five major tasks will be undertaken in order to develop a best-
case security framework for on-airport cargo facilities. The first task is a comprehensive
literature review of security regulations, screening methods, and security programs across
various transportation modes. For the second task, the results of surveys sent to major
U.S. airports will be analyzed in order to determine the current state of national air cargo
security and cargo operations for both large and small cargo facilities. Third, various
screening methods will be analyzed in order to determine what can feasibly be used in an
air cargo environment. For the fourth task, information obtained from a case study
conducted at a large on-airport air cargo facility at a major U.S. airport will be analyzed
and used to form the basis for the fifth task, which is a computer simulation of outbound
cargo flow through an on-airport facility. The simulation will analyze varying
combinations of security methods from the third task that directly screen cargo. Inbound
cargo and U.S. mail are not included in the analysis. The results of this project will be
useful for gaining insight into the implications of stricter air cargo screening regulations
and provide a general outline that airlines, freight forwarders, airport authorities, and
government organizations can use to analyze operations and security techniques.
1.6: Report Organization
Chapter 2 describes the literature review, including regulatory efforts to secure air
cargo, screening methods available, efforts to secure cargo intermodally and in seaports,
hazardous materials regulations, and a review of various computer simulation models. In
Chapter 3, the methodology behind the surveys, the case study, and the evaluation of the
screening technologies and the simulation models will be discussed. Chapter 4 will show
the results, and conclusions and recommendations will be given in Chapter 5.
CHAPTER 2: LITERATURE REVIEW
2.1: Efforts to Secure Air Cargo
Efforts to ensure nationwide security of air cargo since September 11, 2001 have
come from the newly formed Transportation Security Administration (TSA). Only a few
regulations have been put into law, but many more are currently being debated by TSA
and Congress.
2.1.1: TSA Enacted Regulations Pertaining to Air Cargo Security
On November 19, 2001, President George W. Bush signed into law the Aviation
and Transportation Security Act (ATSA), which established the creation of the TSA
within the U.S. Department of Transportation. This act outlined numerous new
regulations and guidelines for security of the transportation sector as a whole; however,
only one sentence in the act directly pertains to air cargo: “Cargo deadline – A system
must be in operation to screen, inspect, or otherwise ensure the security of all cargo that
is to be transported as soon as practicable after the date of enactment of the Aviation and
Transportation Security Act.”9 Other rules and guidelines directly and indirectly
pertaining to air cargo security have been enacted since the signing of the ATSA:
1. All cargo aboard passenger flights must be screened10
2. All freight forwarders must have a TSA-approved security program in order to
be allowed to ship on passenger flights
9 Aviation and Transportation Security Act, Section 110(f), Public Law 107-071. Passed by 107th Congress November 19, 2001.
10 James R. Carroll, “Legislation Would Tighten Security Rules for Air Freight,” The Courier-Journal (Louisville, Kentucky), April 7, 2003.
3. Airlines must have a TSA-approved security program for cargo
4. TSA requires ID checks of anyone entering restricted cargo areas within
airports
5. TSA requires random screening of people, vehicles, and property at airport
perimeter access points.
6. Criminal history checks are mandatory for cargo employees working in
restricted or secure areas11
7. Physical inspection of air cargo is required, however the exact percentage
required by TSA is classified.12
It should be noted that airlines and freight forwarders are not required to check
and record IDs of known shipper employees delivering cargo to a facility.
2.1.2: Proposed Rules and Regulations Pertaining to Air Cargo Security
In January of 2003, a bill entitled Air Cargo Security Act was proposed by
Senators Kay Bailey Hutchison (R-TX) and Dianne Feinstein (D-CA), and the measure
was approved by the Committee on Commerce, Science, and Transportation in March of
2003. The measure, which has not been passed by Congress as of June 2004, outlines the
following regulations:
1. Air cargo facilities and aircraft must be inspected on a regular basis
2. Cargo handlers must undergo security training
3. Creation of a national known shipper database, listing all companies and cargo
11 “Airport Security,” Title 49 of the Code of Federal Regulations, Subchapter B, Part 1542. http://www.tsa.gov/public/display?theme=79&content=0900051980096ff5.
12 Asa Hutchinson, Under Secretary for Border and Transportation Security, Eno Transportation Foundation 2004 Leadership Development Conference, Session on Security: Transforming How We Manage Transportation. May 26, 2004.
transporting/transported on U.S. and international passenger carriers13
4. Random security checks must be conducted at air cargo facilities and in
aircraft
5. Screening/inspection systems for air cargo must be established
6. Mandatory background checks for all airline employees handling cargo
7. Government ability to stop shippers who have violated regulations from
shipping air cargo.14
Another bill that is currently being debated in Congress would require mandatory
screening and inspection of all air cargo. This bill was introduced by Representative Ed Markey
(D-MA) and has passed in the House of Representatives, but has yet to pass in the
Senate.15
The Transportation Security Administration has outlined an Air Cargo Strategic
Plan, which officials call a “blueprint” for a comprehensive approach to air cargo
security. The plan includes the following proposed rules:
1. Prescreen all cargo to determine its threat level
2. Examine all cargo deemed to be of “elevated risk”16
3. Inspect all aircraft prior to the day’s first flight17
4. Randomly inspect low-risk cargo and cargo bound for passenger aircraft
13 “Committee Approves Air Cargo Security Bill,” U.S. Senate Press Release, March 13, 2003. http://www.senate.gov/~commerce/press/03/2003313730.html.
14 James R. Carroll, “Legislation Would Tighten Security Rules for Air Freight,” The Courier-Journal (Louisville, Kentucky), April 7, 2003.
15 “McGreevey Calls on Fed to Establish Stricter Air Cargo Guidelines,” Press Release, New Jersey Office of the Governor, August 8, 2003. http://www.state.nj.us/cgi-bin/governor/njnewsline/view_article.pl?id=1329.
16 “Air Cargo Strategic Plan,” Transportation Security Administration Press Release, November 17, 2003. http://www.tsa.gov/public/interapp/press_release/press_release_0371.xml.
17 David Harris, “The Air Cargo Security Plan,” The International Air Cargo Association, December 17, 2003. http://www.tiaca.org/articles/2003/12/17/388C7D7AAA454B9D86BB24EF50333B66.asp.
5. Airport operators and operators of all-cargo aircraft with a gross take-off
weight of 12,500 lbs or more will have to conduct criminal record checks on
all employees and use other additional means to screen and identify anyone
with aircraft access
6. Use facility security measures to ensure security of all-cargo aircraft
7. Have a centralized, more comprehensive known shipper program (done
through TSA, not individual airlines or freight forwarders).18
TSA also plans to require all non-U.S. all-cargo carriers with operations within
the U.S. to have TSA-approved security plans outlining screening procedures and
identifying all personnel with access to aircraft.19
2.2: Screening Methods

2.2.1: Direct Cargo Screening
Numerous non-intrusive technologies and methods exist that may be used to
screen cargo units directly and specifically for various types of threats. Eight
technologies and methods will be analyzed for this project: pulsed fast neutron analysis,
vapor detection, trace detection, canines, x-ray machines, gamma ray, thermal neutron
activation, and radiation detection. None of the methods discussed requires cargo
containers to be opened, and all of them have difficulty detecting biological threats.
a. Pulsed Fast Neutron Analysis
18 “Air Cargo Strategic Plan,” Transportation Security Administration Press Release, November 17, 2003. http://www.tsa.gov/public/interapp/press_release/press_release_0371.xml.
19 David Harris, “The Air Cargo Security Plan,” The International Air Cargo Association, December 17, 2003. http://www.tiaca.org/articles/2003/12/17/388C7D7AAA454B9D86BB24EF50333B66.asp.
A pulsed fast neutron analysis machine works by measuring cargo density to
identify the chemical composition of the container’s contents. Pulsed neutrons
are directed at the cargo unit, interact with the cargo’s material, and “create
gamma rays with energies characteristic of its elemental composition” that are
used to display an image of the contents on a screen. This can reveal the presence
of any material with specific elemental concentrations similar to known threat
objects and materials. This machine is classified as an active detection system,
meaning that it stimulates the material so that detectors may analyze the effects of
stimulation. It can require building modifications due to its size. The cost per
machine ranges from $10 million to $25 million, and inspection time takes a
minimum of one hour per cargo unit.
b. Vapor Detection
Vapor detection machines are equipped with a sensor that collects air samples
from around the cargo unit. Spectrographic analysis is performed to determine
the molecular makeup of the material within the unit. Vapor detection machines
are relatively small and light, and they can be battery-operated, computer-
operated, or electrically operated. Vapor detection is a passive detection system,
meaning it does not require the stimulation of materials to determine a threat
presence. The cost per machine ranges from $30,000 to $50,000, and they can
process a cargo unit in about 30 to 60 seconds.
c. Trace detection
Trace detection machines use a swipe to wipe the cargo unit and pick up
particulate matter. Spectrographic analysis is performed on the swipe to
determine the molecular makeup of the material picked up on the unit. Like
vapor detection machines, trace detection machines are relatively small and can
be operated by battery, computer, or electronically. According to TSA, these
machines have “shown few problems” when screening cargo. The cost per unit is
$30,000 to $50,000, and they can process a cargo shipment in about 30 to 60
seconds.20
d. Canines
Drug- and explosives-detecting canines are widely considered by security experts
to be the most effective way to screen cargo since they have the fewest drawbacks
of any method currently available. Dogs have a very sensitive sense of smell, and
they can be trained to passively alert handlers of the presence of explosive
materials or drugs. Properly trained canines very rarely give false positive alerts.
Canines can be trained to detect either explosives or drugs, but should never be
trained to detect both. All canines used at U.S. airports must receive TSA
certification. Canines used for drug detection may work two- to four-hour shifts each
day with periodic rest. Canines trained to detect explosives may work only 30 to
60 minutes before taking a 20-minute rest. Canines can clear 400 to 500 cargo
parcels for both drugs and explosives in about 30 minutes. It is very important for
a canine to receive extensive training, care, and rest for it to perform properly.
Yearly maintenance costs can range from $7,000 to $50,000 per canine unit (a
canine unit consists of 2 to 4 teams with 1 handler and 1 to 2 dogs per team).
20 “Volume 6 – Report on Non-intrusive Detection Technologies,” U.S. Treasury Advisory
Committee on Commercial Operations of The United States Customs Service Subcommittee on U.S. Border Security Technical Advisory Group and Customs Trade Partnership Against Terrorism, June 14, 2002. http://www.cargosecurity.com/ncsc/coac/Non-intrusive.pdf.
However, the start-up costs for a canine unit can be quite high. The first year of
maintenance and training can cost well over $100,000.21
e. X-ray Machines
X-ray machines scan cargo units by directing x-ray beams at the unit so that the
beams interact with the material inside and form an image of the material on a
screen. X-ray energy may be low or high, but higher energies are needed for
denser materials. X-ray machines are classified as active detection systems, and
they generally take 2 to 5 minutes to scan a cargo unit. A drawback of x-ray
machines is that they cannot specifically identify a threat (i.e., differentiate
between materials), except for certain systems used with a high-energy
transmission. Costs range from $2 million to $10 million.
f. Gamma Ray
Gamma ray systems are active detection systems that use a radioactive element to
produce gamma rays, which are directed at the cargo unit. An image is displayed
on a screen as the gamma rays interact with the material in the container. These
machines may be fixed in place, or they may be placed on a vehicle for mobility.
The downsides to gamma ray systems are that they cannot identify specific
threats, and they have difficulty differentiating between materials when scanning
high-density cargo. Costs range from $500,000 to about $3 million per machine,
and they can scan a cargo unit in 2 to 5 minutes.
g. Thermal Neutron Activation
21 TCRP Report 86: Public Transportation Security Volume 2, K9 Units in Public Transportation:
A Guide for Decision Makers, National Academy Press, Washington D.C., 2002.
Thermal neutrons are directed at the cargo unit and absorbed by the material
within. As a result, a gamma ray photon is emitted and its energy signature is
detected by sensors, which can then determine specific element concentrations
that might be a sign of an explosive. Thermal neutron activation systems are
active detection systems that can either be fixed in place or mounted on a vehicle
for mobility. Costs range from $500,000 to $3 million per machine. The system
takes a minimum of one hour to scan a cargo unit.
h. Radiation Detection
All radioactive substances emit radiation (e.g., x-rays, alpha rays, neutrons), which
is detected and measured by a detector in the radiation detection system. High
levels of specific types of radiation may indicate a threat object. These machines
are classified as passive detection systems. The systems are small and are easily
portable, and they can be operated by battery, by computer, or
electronically. Machines typically cost between $10,000 and $50,000 and can
scan a cargo unit in 30 to 60 seconds.
Table 2.1 compares the costs, inspection times, installation requirements, and
identification abilities of the screening methods discussed in this section.
ACTIVE SYSTEMS (mobile or fixed; fixed sites need power, road access,
personnel facilities, and attention to radiation safety; vehicles needed for
mobility):
- X-ray ($1 - 10 million overall: Standard $1 - 5 million, Dual View $10
  million, Backscatter $2 - 5 million): explosives, stolen materials, drugs;
  2 - 5 min per unit; no material discrimination or identification.
- Gamma Ray ($500,000 - $3 million): 2 - 5 min; no material discrimination
  or identification.
- Pulsed Fast Neutron Analysis ($10 - 25 million): explosives, drugs;
  1 hr +; material discrimination and identification.
- Thermal Neutron Activation ($500,000 - $3 million): explosives; 1 hr +;
  material discrimination and identification.
PASSIVE SYSTEMS (portable or desktop equipment operated by battery or wall
plug):
- Vapor Detection ($30,000 - $50,000): prohibited gases; 30 - 60 sec;
  material discrimination and identification.
- Trace Detection ($30,000 - $50,000): explosives, drugs; 30 - 60 sec;
  material discrimination and identification.
- Radiation Detection ($10,000 - $50,000): radiation; 30 - 60 sec; no
  discrimination; identification of radioactive material.
- Canines ($7,000 - $120,000 per unit per year; require care, feeding, and
  shelter): explosives, drugs; 10 - 60 sec; discrimination limited by amount
  of training; identification.
Table 2.1: Breakdown of Screening Method Characteristics. Source: U.S. Treasury Advisory Committee on Commercial Operations of the United States Customs Service22
2.2.2: Screening Methods for Personnel, Visitors, and Truck Drivers
Some of the most popular methods for screening individuals at airports involve
manual and paper-based processes. At most airports around the country, airport
employees carry official airport IDs or badges as identification. Some airports issue IDs
to frequently-visiting truck drivers, as well. These IDs or badges usually clearly identify
22 “Volume 6 – Report on Non-intrusive Detection Technologies,” U.S. Treasury Advisory
Committee on Commercial Operations of The United States Customs Service Subcommittee on U.S. Border Security Technical Advisory Group and Customs Trade Partnership Against Terrorism, June 14, 2002. http://www.cargosecurity.com/ncsc/coac/Non-intrusive.pdf.
the airport, list the individual’s name, and include a picture. Such identification must be
clearly displayed at all times. IDs are often equipped with a magnetic strip that is
programmed to allow an individual entry to certain areas, depending on their job and
rank. Truck drivers delivering or picking up cargo who are issued airport IDs usually
have to check with security personnel upon arriving at the airport, and their arrival is
documented by established FAA procedures, which are paper-based.
Visitors who have official business at airports and need access to secured areas do
not have many options for gaining such access. One of the more popular screening
methods for airport visitors is to simply assign them an authorized escort who has proper
clearance. At some airports visitors can be issued temporary IDs valid only for the day or
days they are scheduled to be there.
Biometrics is a newly emerging security technology that many industry and
government officials see as the future of access security for airport personnel, truck
drivers, and even airline passengers. Biometrics uses biological identification by
matching signatures in fingerprints, thumbprints, hands, voices, faces, or irises. A
person’s signature may be stored either in a central database or on a “smart card,” a
plastic driver’s license-sized card with an embedded computer chip that stores the
individual’s biological signature. A biometric reader scans the part of the body that it is
programmed to read and matches the person’s signature to the signature stored in the
smart card or the central database.23
As promising and secure as the technology sounds, there are problems and issues
that have arisen. Privacy issues have been raised about the fact that some biometric
23 Liza Porteus, “Homeland Security Technologies in the Pipeline,” Fox News, January 14, 2004.
http://foxnews.com/story/0,2933,108289,00.html.
systems store detailed personal information in a centralized database that numerous
people, including some government organizations, could have access to. Smart cards
could be lost or stolen.24 Tests performed on iris scanning systems have found that
individuals with very light or very dark eyes or people whose iris is not connected to the
eye socket can cause identification errors.25 Also, people without fingerprints cannot be
identified by fingerprint-based biometric systems.
2.2.3: Perimeter Security and Surveillance
Three of the most popular methods for perimeter security and surveillance are
canine patrols, guard patrols, and closed circuit television (CCTV). Canines, as
explained under Section 2.2.1, require a large initial investment for equipment, care and
training. However, after the first year, maintenance costs are quite low, and a properly
trained canine is considered to be one of the most effective screening and patrol methods
available today. Canines used for patrol only do not require as much training as those
used for drug and explosives detection. Patrol canines enforce general good behavior in
public areas and can quickly and effectively apprehend a suspect person without causing
harm. Guard patrols can be used virtually anywhere to deter unlawful activity. Guard
patrols have a variety of responsibilities including keeping watch over secured areas to
ensure only proper personnel are present, monitoring airport perimeters to check for any
24 Jennifer Jones, “A Moving Target – Border Control and Transportation Safety Apps Put
Biometrics to the Test,” Federal Computer Week, June 23, 2003. http://www.fcw.com/fcw/articles/2003/0623/cov-report2-06-23-03.asp.
25 Celeste Perry, “Iridian Helps Cut Airport Waits as Security Tightens (Update 1),” Bloomberg L.P., January 8, 2003. http://www.biometricgroup.com/in_the_news/bloomberg.html.
forced entry onto airport property, and watching for suspicious activity of any sort.
Guard and canine patrols can be and often are used together.
CCTV has the capability to monitor and store video of any area in which a video
camera is installed. Within an air cargo facility, properly placed cameras can record
container loading, unloading, and handling so that any improper activity can be seen by
personnel monitoring the videos. CCTV is limited by the fact that it does not provide
any actual protection from cargo tampering. Because CCTV keeps camera-equipped
areas under constant surveillance, however, it can be an advantage over guard and
canine patrols for those areas. Patrols, on the other hand, have the ability to stop
unlawful activity while it is in progress and apprehend the person or persons
involved.26
Canine patrol teams can cost anywhere from $7,000 to over $100,000, depending
on when the canines are first obtained, how many canines make up the team, and many
other factors. Guard patrols can cost anywhere from $30,000 to $60,000 per year per
guard depending on location and experience.27 CCTV cameras cost anywhere from $50
to $1,000 apiece. Additional components such as switching and recording devices vary
greatly in cost.28
26 “Aviation Security: Vulnerabilities and Potential Improvements for the Air Cargo System,” U.S. General Accounting Office, December 2002.
27 TCRP Report 86: Public Transportation Security Volume 2, K9 Units in Public Transportation: A Guide for Decision Makers, National Academy Press, Washington D.C., 2002.
28 “Aviation Security: Vulnerabilities and Potential Improvements for the Air Cargo System,” U.S. General Accounting Office, December 2002.
2.3: Efforts to Secure Ports and Ocean-Bound Cargo
2.3.1: Introduction
The vast majority of cargo security efforts by the U.S. government thus far have
been geared toward ocean-bound cargo container security. The Department of Homeland
Security has initiated a number of new security programs along with private companies
and foreign countries in order to reduce the supply chain’s vulnerability to terrorist
actions. These programs include the Container Security Initiative (CSI), the Customs-
Trade Partnership Against Terrorism (C-TPAT), and the Rapid Information Security
Knowledge (RISK) Alert.
2.3.2: Container Security Initiative (CSI) and the 24-Hour Rule
The Container Security Initiative (CSI) was implemented in January of 2002. It
requires the cooperation of foreign countries to allow U.S. Customs officials into their
major ports shipping cargo to the U.S. in order to screen “high risk” containers that are
U.S.-bound. Customs officials work closely with the port officials using non-intrusive
surveillance equipment to screen any cargo believed to present a significant threat. The
goal of CSI is to discover and clear high-risk cargo before it reaches the U.S. By
inspecting high-risk cargo during “down time” as it waits to board a vessel, the need to
inspect it once it arrives in the U.S. is eliminated, thereby reducing the overall processing
time.
All containers arriving at the port that are bound for the U.S. have their manifest
data electronically evaluated and are assigned a risk level. Data from the host country’s
customs department is also analyzed in the risk assignment process. During the first year
of operation, U.S. Customs enacted the 24-Hour Rule to ensure that manifest data would
be received in time to conduct inspections if needed. The 24-Hour Rule states that U.S.
Customs must receive a cargo container’s electronic manifest data at least 24 hours
before the cargo is to be loaded onto a ship. This rule applies to all U.S.-bound cargo, not
just cargo originating at foreign ports with CSI presence. This regulation allows U.S.
Customs to directly receive manifest data without having to obtain it from foreign
customs agents.
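As an illustration only, the timing requirement of the 24-Hour Rule can be expressed as a simple timestamp comparison. The function name and example times below are hypothetical, not part of any actual Customs system:

```python
from datetime import datetime, timedelta

def meets_24_hour_rule(manifest_received: datetime, loading_time: datetime) -> bool:
    """Check whether manifest data was received at least 24 hours
    before the cargo is to be loaded onto the ship (illustrative)."""
    return loading_time - manifest_received >= timedelta(hours=24)

# A manifest received 30 hours before loading is compliant
print(meets_24_hour_rule(datetime(2003, 6, 1, 8, 0), datetime(2003, 6, 2, 14, 0)))
```

In practice the rule is enforced against electronic manifest submissions, but the underlying test is this kind of 24-hour lead-time check.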
During CSI’s first year, 15 foreign governments agreed to allow U.S. Customs
officials into their major ports (24 ports total), and five of these ports became home to
CSI teams. CSI is now operable in 19 of the 20 largest foreign ports transporting cargo to
the U.S.29
2.3.3: Customs-Trade Partnership Against Terrorism (C-TPAT)
C-TPAT is a program through which participating private companies commit to
increasing their efforts to ensure supply chain security and in return, U.S. Customs
promises a decreased likelihood of inspecting those companies’ shipments for weapons
of mass destruction. Companies currently participating in the program include domestic
manufacturers, truckers, and shipping companies. Program officials plan to allow foreign
manufacturers into the program in the near future. The program requires participants to
document the security procedures that they conduct and report them to U.S. Customs.
29 “Container Security: Expansion of Key Programs Will Require Greater Attention to Critical
Success Factors,” U.S. General Accounting Office, July 2003.
To become a C-TPAT member, a company must enter into an agreement with
Customs to support the C-TPAT security recommendations and work with its supply
chain partners to increase security. Companies are given risk scores that are reduced
when they officially become C-TPAT members. The lower the risk score, the lower the
likelihood that Customs will inspect shipments for WMDs.
C-TPAT officials periodically conduct validations of member’s security measures
that are based on information such as security-related incidents, cargo volumes, and
geographic location. The validations, along with information from Customs on past
relations between a company and U.S. Customs (including trade compliance and any
criminal investigations) determine updated risk score computations. Depending on any
vulnerabilities found, a company can have its benefits suspended until action is taken to
address the problems. If overall security has been enhanced to the satisfaction of C-TPAT
officials, a company’s risk score will be further reduced.30
2.3.4: Rapid Information Security Knowledge (RISK) Alert
RISK Alert is a software-based data collection and tracking system that follows ocean
cargo, ships, and ship crews from origin to destination and can also collect data from
electronic seals and bioterrorism sensors on cargo containers. The system was developed by Transentric, part
of the Union Pacific Corporation, and the Delaware River Maritime Enterprise Council.
If a ship or its cargo veers off the intended course, appropriate law enforcement agencies
will be notified immediately. Participating agencies can choose what types of security
30 “Container Security: Expansion of Key Programs Will Require Greater Attention to Critical
Success Factors,” U.S. General Accounting Office, July 2003.
situations they would like to be notified about and by what method they would like to
receive notification.31
The system benefits commercial shippers and other participants by allowing them
to demonstrate that their security and safety management practices work and strengthen
supply chain security. Participants also benefit by having to submit updated security
information just once, since the information is automatically forwarded to the proper
agencies.32
2.3.5: Summary
In terms of inter-governmental collaboration, efforts and research toward ocean
cargo security are more advanced than those for air cargo security. Programs such as CSI,
C-TPAT, and the RISK Alert could be implemented in the air cargo industry and serve as
another protective measure against cargo security threats. Programs of this nature would
complement the known shipper program and provide a means to secure cargo without
adding time to the processing that goes on within the air cargo facility.
2.4: Electronic Supply Chain Manifest (ESCM)
The Electronic Supply Chain Manifest was a test project that used new screening
technologies to identify the people responsible for cargo along its journey and to track
the cargo itself along and between different transportation modes. The purpose of the project was
to find an alternative to the manual security tracking processes currently used for air
31 Henry J. Holcomb, “Security Tightens by Monitoring Ships and Cargo,” Philadelphia Inquirer,
March 19, 2003. http://www.centredaily.com/mld/centredaily/news/5428395.htm. 32 “R.I.S.K. Alert – Supply Chain Technology Aids Homeland Security,” Transentric Press
Release, March 28, 2003. www.transentric.com/WhatsNew/pressReleases1.asp?intWhatsNewID=65.
cargo that facilitates efficient and secure cargo movements. The testing mainly involved
motor carriers and air carriers. Around 200 people and 40 companies signed up to be
involved in the test project that took place around Chicago’s O’Hare Airport and New
York City’s JFK Airport. Authorized system users can monitor information on a
shipment as it moves along the supply chain from manufacturing and shipping to motor
carrier and aircraft.
The system works by matching cargo with its proper handler, known shipper
information, and its origin and destination. The system makes use of technologies
including encrypted internet transactions, smart cards, and biometric fingerprint readers.
All access points along the chain are equipped with biometric fingerprint readers to
validate the person handling the cargo (usually the truck driver). Smart cards are used to
verify the identity of the cargo handlers and the transfer of cargo from one person to
another, as well as to transfer information on the cargo itself. The smart cards are given
to the handlers for use at pick-up and delivery, and they contain the handler’s thumbprint
(for biometric identification), information on the cargo’s manifest, and a copy of the
handler’s driver’s license. Triple DES 192-bit encryption is used for internet
transactions; it is the current federal government encryption standard because of its low
cost and high security.
The program began in 1999 and was tested for 2.5 years. An analysis of the test
results found that the computer network was very reliable and that the use of electronic
manifests, smart cards, and biometrics proved to be an effective and time-efficient
combination. Nearly 75% of all manifests were processed in less than 1 minute, 15
seconds; 86% were processed in less than 2 minutes; and 92% took less than 3 minutes
to process.
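Cumulative statistics of this kind are straightforward to recompute from raw timing data. The sketch below uses invented sample times, not the actual ESCM test data:

```python
# Illustrative only: computing cumulative processing-time fractions of the
# kind reported in the ESCM evaluation, from hypothetical sample data.
processing_seconds = [40, 55, 70, 72, 80, 95, 110, 130, 150, 200]

def fraction_under(times, threshold_seconds):
    """Fraction of processing times strictly below the threshold."""
    return sum(t < threshold_seconds for t in times) / len(times)

for label, limit in [("1 min 15 s", 75), ("2 min", 120), ("3 min", 180)]:
    print(f"under {label}: {fraction_under(processing_seconds, limit):.0%}")
```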
Participants who consistently used the system were very pleased and remained
interested in keeping the system. Many even used the system longer than the testing plan
called for. Very few technical problems were reported. Participants felt that security was
increased because access to information was limited and less documentation had to be
handled.33
2.5: Hazardous Materials (Hazmat) Regulations and Security Efforts
Among air carriers, more types of dangerous goods (and larger quantities) are
allowed on all-cargo aircraft than on passenger aircraft. About 50% of all regulated
dangerous goods may travel on both all-cargo aircraft and passenger aircraft. Around
30% of regulated hazardous materials are allowed on all-cargo aircraft, but not passenger
aircraft, and the final 20% may not travel by air at all. This last group includes highly
toxic, explosive, oxidizing, self-reactive, and flammable chemicals.
For those hazmat goods allowed on aircraft (cargo, passenger, or both), the
government has 4 types of quantity restrictions:
1. Restrictions by name: certain materials are prohibited by name if they
present a dire hazard or have caused aircraft incidents (e.g., fires or explosions).
2. Restrictions by hazard class or subdivision: hazardous materials are divided
into classes, and certain subdivisions of these classes are highly toxic or
reactive and are either allowed only on all-cargo aircraft or not at all.
33 “Electronic Supply Chain Manifest Freight ITS Operational Test Evaluation: Final Report,”
U.S. Department of Transportation, December 2002. http://www.itsdocs.fhwa.dot.gov//JPODOCS/REPTS_TE//13769.html.
3. Restrictions by quantity listed on packages: if the quantity of a certain
dangerous good exceeds government regulations, it will not be allowed onto
the aircraft.
4. Restrictions by package integrity: the material must be packaged so as to
protect the shipment and prevent leaks or spills.
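The four restriction types can be sketched as an ordered series of checks. The code below is purely illustrative: the prohibited names, hazard classes, and quantity limits are hypothetical placeholders, not actual DOT or FAA regulatory data:

```python
# Hypothetical sketch of the four restriction checks described above.
# All regulatory data here is invented for illustration.
PROHIBITED_NAMES = {"example prohibited chemical"}        # restriction 1
CARGO_ONLY_CLASSES = {"1.1", "2.3"}                       # restriction 2 (illustrative)
QUANTITY_LIMITS_KG = {"flammable liquid": 30.0}           # restriction 3

def allowed_on_aircraft(name, hazard_class, quantity_kg, package_intact,
                        aircraft_type="all-cargo"):
    """Apply the four restriction types in order; return (allowed, reason)."""
    if name in PROHIBITED_NAMES:
        return False, "prohibited by name"
    if hazard_class in CARGO_ONLY_CLASSES and aircraft_type != "all-cargo":
        return False, "hazard class restricted to all-cargo aircraft"
    limit = QUANTITY_LIMITS_KG.get(name)
    if limit is not None and quantity_kg > limit:
        return False, "quantity exceeds regulated limit"
    if not package_intact:                                # restriction 4
        return False, "package integrity requirement not met"
    return True, "allowed"

print(allowed_on_aircraft("flammable liquid", "3", 10.0, True))
```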
In the past few years, concern has arisen over undeclared hazmat shipments and
the risks they pose to air transport. In 2000, U.S. Customs and the Federal Aviation
Administration performed inspections of international cargo, both arriving and departing,
at 19 U.S. airports and found that around 8% of hazmat shipments contained undeclared
dangerous material. Major air carriers usually only discover undeclared dangerous goods
when they are prompted to search a shipment for some reason, and such promptings and
discoveries are rare. Neither the government nor the air transport industry keeps a record
of the discoveries that are made, so trends cannot be identified. Air
carriers’ only options for screening for undeclared goods consist of physical inspection
and limited x-ray use. Air carriers have found that casual shippers are much more likely
to try to ship undeclared hazmat shipments than known shippers, and most major carriers
restrict their customers to known shippers only. However, most carriers and other
industry experts believe that undeclared shipments are not usually intentionally
undeclared, but result from ignorance of regulations governing dangerous goods.34
34 “Aviation Safety: Undeclared Air Shipments of Dangerous Goods and DOT’s Enforcement Approach,” U.S. General Accounting Office, January 2003.
2.6: Simulation Models
Three simulation software packages capable of simulating air cargo flow through
an air cargo facility were researched for potential use in this project. The packages
examined were CargoVIZ, AutoMod, and Arena 7.01.
2.6.1: CargoVIZ
CargoVIZ is a three-dimensional simulation software package that allows the user
to design and visualize the flow of airline passenger baggage through security
checkpoints. The user can virtually construct the interior of an airport security
inspection area and specify what security equipment is to be used, the quantity of
each type of equipment, and the rate of passenger traffic passing through the inspection area.
The user may choose from airport models within the software, or he may construct his
own. By running scenarios, the user can evaluate the effectiveness of the type and
quantity of security equipment used for various passenger flow rates. The three-
dimensional environment lets the user view the scene from any angle, and he may
navigate right, left, up, down, forward, backward, or zoom in and out. The toolbar allows
the user to add objects to the workspace; change, create, or delete baggage flow paths;
change the viewpoint and the lighting; and activate or deactivate the floorspace grid.
CargoVIZ was created by the Applied Visualization Center at the University of
Tennessee, Knoxville in conjunction with the Transportation Security Administration and
the Safe Skies Program of the National Safe Skies Alliance. The National Safe Skies
38
Alliance statistically determined measurements for quantifying the impacts of various
airport and equipment layouts on the flow of baggage through the security checkpoint.35
2.6.2: AutoModTM
AutoModTM, a product of Brooks PRI Automation, is discrete event simulation
software that can be used to model almost any system. The user can create 3-D animation
for any simulation. The system requires programming knowledge, although the software
does come with ready-built model logic for system portions such as conveyors, path-
based vehicle movements, bridge cranes, tanks, and pipes. Modules use a combination of
a graphical user interface and a CAD system to make simulation of material handling
systems simpler and allow for virtually unlimited simulation size and detail. The
software has also incorporated automatic connections and routings for certain systems.
When a simulation is performed, statistical reports and graphs are automatically
generated that give information on equipment utilization, processing time, and other
aspects of the process. The 3-D animation allows the user to view the model from any angle at any
point in time. Various airport authorities, air carriers, and government organizations have
used AutoModTM to simulate airport operations such as baggage systems and passenger
security checkpoints. Brooks PRI Automation offers consulting services to organizations
using their products, including on-site training, video production, and technical support.36
35 “Cargo VIZ User Manual,” Applied Visualization Center, University of Tennessee, Knoxville, July 30, 2003. http://viz.utk.edu/projects/tsa/cargoviz/cargo_viz.pdf.
36 AutoModTM Information Packet, Brooks-PRI Automation, 2003.
2.6.3: Arena 7.01
Arena, like AutoModTM, is discrete event simulation software that can be used to
model almost any system, including manufacturing, logistics, warehousing, and
distribution systems. Arena is a product of Rockwell Software and usually uses 2-D
animation. The software does not require computer programming knowledge and has a
user-friendly interface. It provides various templates with modules used to
simulate basic and advanced processes as well as transfer elements. The software has an
optional component called OptQuest that serves as an optimization tool, as well as a
Factory Analyzer that can be used to improve performance and capacity for
manufacturing systems. A Flow Process template may also be used, which models
discrete-continuous systems. 3-D animation is possible with Arena’s extra 3-D player.
After a simulation has run to completion, the simulation history can be used to create the
3-D animation.37
2.6.4: Simulation Models Summary
Of these three simulation packages, Arena 7.01 was chosen for use in this
project. Arena was readily available at a reasonable cost, does not require
programming knowledge, and comes with various useful statistical analysis programs.
CargoVIZ is intended for use with passenger luggage, not cargo, and AutoMod requires
moderate computer programming knowledge to use. Prior experience with Arena was
also a factor, as it allowed time to be spent calibrating the models rather than learning
new simulation software.
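To illustrate the kind of discrete-event model these packages build, the toy sketch below simulates trucks queuing for a fixed number of screening machines. It is a minimal, hypothetical example in plain Python, not the Arena model developed for this project; all parameter values are invented:

```python
import heapq
import random

def simulate_screening(n_trucks=100, mean_interarrival=4.0,
                       scan_time=3.0, n_machines=2, seed=1):
    """Toy discrete-event model: trucks arrive with exponential
    interarrival times and queue FIFO for the earliest-free screening
    machine. Returns average time in system (minutes). Illustrative only."""
    random.seed(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_trucks):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    machine_free = [0.0] * n_machines   # next idle time of each machine
    heapq.heapify(machine_free)
    total_time = 0.0
    for arrival in arrivals:
        free_at = heapq.heappop(machine_free)
        start = max(arrival, free_at)   # wait if all machines are busy
        finish = start + scan_time
        heapq.heappush(machine_free, finish)
        total_time += finish - arrival
    return total_time / n_trucks

print(round(simulate_screening(), 2))
```

Tools like Arena add statistical reporting, animation, and far richer process logic on top of this basic event-scheduling idea.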
37 W. David Kelton, Simulation with Arena, 3rd Edition, McGraw-Hill: Boston, 2004.
2.7: Summary
A review of the current regulations pertaining to air cargo shows that few
regulations have been put into effect, and many of them are quite vague. A large
number of the proposed regulations are also vague and open to interpretation.
Stating that cargo must be screened could simply mean that it must originate from a
known shipper, or it could be intended to mean that cargo must be screened by a specific
screening method such as the methods discussed in this chapter. Ensuring air cargo
security must involve more than just direct cargo screening – methods to validate
personnel and truck drivers, perimeter security, and programs such as those implemented
at ports for ocean-bound cargo will help ensure a layered security program. Determining
the most efficient and effective way to directly screen cargo as an integral part of a
layered security approach will be discussed subsequently.
CHAPTER 3: METHODOLOGY
3.1: Introduction
The methodology discussed in this chapter includes the airport surveys, the
evaluations of the cargo screening methods, the case study, and the simulation of cargo
flow through an on-airport cargo facility. The literature review, which is also part of the
methodology, was discussed in the last chapter.
3.2: Airport Surveys
3.2.1: Introduction and Purpose
In August of 2003, a survey containing questions about air cargo facilities,
operations, and security measures (titled “Air Cargo Operations and Security Survey”)
was sent out to 118 medium- to large-sized airports around the U.S. A list of 181 major
U.S. airports and their corresponding airport codes was obtained from
www.infoguides.com/airports/code_stn.htm. Airport data and contact information for all
of these airports was obtained from the Federal Aviation Administration’s webpage. The
airport managers, airport addresses, and airport information telephone numbers were
included in the list. A phone call was made to each airport on the list in order to get the
name, mailing address, and email address of an appropriate person to whom the survey
should be sent. The appropriate person was usually not the person spoken with on the
phone. The list was narrowed to 118 airports since some airports did not have on-airport
cargo operations, and contact with some airports could not be established within a
reasonable time frame. Appropriate contacts and mailing addresses were obtained for all
118 airports. However, only 59 of the contacts’ email addresses were obtained. Many of
the airport employees giving out the appropriate contact information either did not know
the contact’s email, could not locate it, or did not wish to disclose it. Surveys were sent
to all 118 airports at the beginning of August of 2003 with a return deadline of September
5, 2003. Hard copies were sent to all airports on the list. Electronic copies were emailed
to contacts for which email addresses had been obtained. A cover letter accompanied
each survey, referring the contacts to a website where the survey could also be taken
online. This provided the person completing the survey multiple options for filling out
and returning the survey: regular mail, email, fax, or online completion.
The survey’s purpose was to gather information on the cargo operations, security
measures used, and the physical layout of the facilities themselves in order to gain an
understanding of the layout of a typical on-airport cargo facility and the process of
operations within it. As an incentive to return the surveys, a copy of the aggregate
results was offered to participants.
The survey questions were divided into two general sections: operations and
security. The survey and an accompanying cover letter stated that no individual
responses would be released, keeping individual airports’ responses confidential. The
cover letter and the survey may be seen in Appendix A.
As of September 5, 2003, only 16 surveys had been returned. It was decided to
extend the deadline to October 8, 2003, and a second email (also included in Appendix
A) was sent to those airport contacts with email addresses that had not returned the
survey asking them to reconsider completing it and returning it before the new deadline.
An additional 3 surveys were returned by the second deadline. Since aviation security
has been a sensitive subject since September 11, 2001, this may have discouraged many
airports from returning the survey.
3.2.2: Operations, Facilities, and Cargo Volumes
The first section of the survey asked questions pertaining to airports’ cargo
operations, the size of their facilities, and the types and volumes of cargo carried. The
first five questions asked for information on the number of companies sorting cargo and
leasing space at the airport, the size of the cargo facilities themselves, and the number of
aircraft parking positions at each facility. The next five questions dealt with the method
of cargo sorting, the number of facility employees, the percentage of cargo handled as
belly freight, the percentage of cargo as international freight, and the method of transport
between the facility and the plane. The final two questions in this section asked for a
listing of common cargo types handled and quarterly cargo volumes.
The intent of this section was to gain an understanding of the scale of the
operations within air cargo facilities, along with information on the nature of the cargo
itself.
3.2.3: Security
The second section of the survey was targeted toward security measures in place
at airports that would have bearing on air cargo. The first four questions asked for the
amount of cargo screened and the methods available to do so, how employees working in
secure cargo areas are screened, and if visitors are allowed in the cargo facilities. The
next four questions asked for the percentage of cargo coming from known shippers,
information on screening trucks and truck drivers, and whether or not international cargo
is handled at the airport. The final four questions asked about the clearance times and
screening methods for international cargo, surveillance of dumpsters, plans to expand on
current security technologies, and whether the airport would be interested in allowing a
case study for this project to be conducted on site.
The intent of this section was to gather specific information on how employees,
trucks, truck drivers, and the cargo itself are screened for security purposes and to what extent.
The objective of the final question was to find an airport willing to host the case study
needed for this project.
3.3: Evaluations of Screening Methods
The direct cargo screening technologies discussed in Section 2.2.1 will be
evaluated in two phases. The first phase consists of evaluation based on the
technologies’ cost, screening time, and applicability in an air cargo environment. The
technologies that meet these criteria will move on to the second phase,
which consists of testing via computer simulation in an operational environment.
For the first phase, any technology that carries a high cost (millions of dollars or
more per unit) or requires more than a few minutes to scan a single unit will be
eliminated. Also, any method that would be difficult to apply in an air cargo facility due
to factors such as size and mobility will also be eliminated from consideration. In the
second phase, each screening method will be evaluated in combination with other
methods to determine which combination or combinations result in the lowest cost and
time required, as well as the greatest detection coverage of various threat materials. A
range of effectiveness will be incorporated into the simulation for each screening method;
however, because very little data is available on the effectiveness of security screening
technologies, effectiveness will not play a pivotal role in the overall evaluations.
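The Phase 1 filter described above can be sketched as a simple rule over per-technology attributes. The cost, scan-time, and mobility values below are placeholders for illustration only, not figures from this study:

```python
# Hypothetical sketch of the Phase 1 filter: a technology is eliminated if its
# unit cost is in the millions, a single scan takes more than a few minutes,
# or it is impractical to install in an air cargo facility.
# All numeric values here are made-up placeholders, not data from the report.

technologies = {
    "X-Ray":     {"cost_usd": 400_000,   "scan_min": 1.0, "fits_facility": True},
    "Gamma Ray": {"cost_usd": 900_000,   "scan_min": 1.5, "fits_facility": True},
    "PFNA":      {"cost_usd": 5_000_000, "scan_min": 60,  "fits_facility": False},
}

def passes_phase_one(t, max_cost=1_000_000, max_scan_min=5.0):
    """A technology advances to Phase 2 (simulation) only if it is affordable,
    fast enough per unit, and practical in a cargo facility."""
    return (t["cost_usd"] < max_cost
            and t["scan_min"] <= max_scan_min
            and t["fits_facility"])

phase_two = [name for name, t in technologies.items() if passes_phase_one(t)]
print(phase_two)  # PFNA is eliminated under these placeholder values
```

Only the survivors of this filter would be combined and tested in the Phase 2 simulations.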
3.4: Case Study
3.4.1: Introduction
A case study was conducted at an on-airport air cargo facility owned by a major
U.S. airline at a major U.S. airport. A total of three visits were made to the facility in
order to gain an understanding of how a cargo facility is laid out and the process of
outbound cargo operations within it, determine what security measures are used, and
gather data on truck arrivals for use in the Arena simulations. Information on the
airline’s known shipper program and truck schedules was also obtained.
3.4.2: Facility Layout and Operations
The air cargo facility is divided into two sections: the right side of the facility is
used to process outbound cargo, and the left side is used to process inbound cargo and
U.S. mail. The outbound cargo side will be considered for this project. Figure 3.1 shows
the layout of the outbound cargo side (the right side) of the facility.
Figure 3.1: Case Study Facility Layout, Outbound Side
The facility consists of 7 landside cargo doors and 10 flight destinations that are labeled
along the walls of the facility. The yellow line traversing the width of the facility is the
dividing point between the delivery area and the processing area. Cargo is not officially
accepted by the airline until it has crossed the yellow line. On the date that data was
collected from the facility, an average of 6 employees were working in the facility (not
including the forklift operators) and 3 forklifts were in use. An employee stated that one
of the forklifts was not in operation on that day.
The basic procedure for processing cargo through the facility is outlined below.
Figure 3.2 illustrates the flow process.
1. Trucks arrive and dock at one of the 7 landside cargo doors.
2. Truck drivers have their paperwork verified and the cargo’s flight status is
checked.
3. Cargo is unloaded. Once it crosses the yellow line, it has been officially accepted
by the airline.
[Figure 3.1 labels: Office; Cold Storage; Temporary Storage Area; Destination Wall Labels; Yellow Cross-line for Acceptance of Cargo; Airside Cargo Doors; Landside Cargo Doors; Side Cargo Door]
4. Cargo is taken to the temporary storage area for measurements, labeling, and
possible physical inspection.
5. Cargo is sorted by destination and is placed on the floor under the appropriate
destination wall label. The cargo is held in this area until its intended flight
departure time nears.
6. Cargo units bound for the same flight are placed in unit loading devices (ULDs)
and loaded onto tugs.
7. Tugs leave through the airside cargo doors to take ULDs to the aircraft.
The airline delivers cargo to both domestic and international destinations. The cutoff
time for domestic cargo is 2 hours, and the cutoff for international cargo is 4 hours.
Trucks delivering cargo may be owned by the airline and operate on a schedule, or they
may be local trucks that do not operate on a schedule with the airline. Cargo units are
usually unloaded and sorted by forklifts, which are equipped with scales for weighing
each unit. Cargo is taken by tugs to the aircraft 20 to 30 minutes before the scheduled
takeoff. All but around 10% of arriving outbound cargo is shipped the same day it
arrives at the facility.
Figure 3.2: Flow Chart Illustrating Cargo Flow Through An On-airport Cargo Facility
3.4.3: Security
The airline relies mainly on the known shipper program, x-ray, and physical
inspection to keep their facility and cargo secure. The airline will only do business with
known shippers, but they make exceptions for shippers needing to transport live animals
and human remains. In order to verify a new potential customer as a known shipper, the
airline will send a cargo manager to the customer’s location to ensure that they have a
physical address, will be shipping legitimate goods, and can pay the shipping fees. Any
company that has been a known shipper in the past but has sent fewer than 12 shipments in
the past two years must be re-verified. The airline has never encountered any terrorist
groups wanting to obtain known shipper status, but they have turned down potential
customers for various reasons. They have also never encountered an unknown shipper
attempting to give them a shipment.
The airline uses x-ray and physical inspection to ensure cargo security once cargo
is in their possession. The x-ray machines they have are not in plain sight, and
information on the number of machines, their exact location, and frequency of use could
not be obtained. The airline mandates physical inspection of 25% of their outbound
cargo, which is an unusually high percentage of physical inspection for any airline or
freight forwarder. However, they cannot open any shipments from the U.S. government,
and they will only inspect shipments that can easily be opened and re-sealed.
3.4.4: Truck Arrivals
Data on the time between truck arrivals was obtained from the airline’s truck
schedule and observations of local truck arrivals to the facility. Data on local truck
arrivals was taken on February 19, 2004. The time between truck arrivals was used in the
Arena simulations to generate truck arrivals and their cargo deliveries. Figure 3.3 shows
that the time between arrivals was usually 20 minutes or less.
Figure 3.3: Histogram of Time Between Truck Arrivals to Case Study Facility
In order to properly represent this pattern of time between arrivals in Arena, the
data was put into a statistical analysis program within Arena called the Input Analyzer in
order to determine the statistical distribution that best fit the data set. The Input Analyzer
found that a Weibull distribution of the form
f(x) = −0.001 + 15.4(0.382)^(−15.4) x^(15.4−1) e^(−(x/0.382)^15.4)
gave the best representation. Figure 3.4 shows the distribution output from the Input Analyzer.
Figure 3.4: Best Fit Distribution from Arena’s Input Analyzer, f(x) = −0.001 + 15.4(0.382)^(−15.4) x^(15.4−1) e^(−(x/0.382)^15.4)
The Weibull distribution equation takes the form
f(x) = a + αβ^(−α) x^(α−1) e^(−(x/β)^α)   if x > 0
where
a = offset from the y-axis,
α = shape parameter of the distribution, and
β = scale parameter of the distribution.38
This equation will be used to generate both trucks and their cargo arriving to the facility
in all the Arena simulations.
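Outside Arena, draws from the fitted shifted Weibull can be reproduced with Python's standard library; `random.weibullvariate(alpha, beta)` takes the scale parameter first and the shape second. Note that with a scale of 0.382 the mean draw is roughly 0.37, which suggests the fitted data were recorded in hours rather than minutes; that unit reading is an inference, not stated in the text.

```python
import random

# Sketch: sampling truck interarrival times from the fitted shifted Weibull,
# offset -0.001, scale 0.382, shape 15.4 (the Input Analyzer's parameters).

OFFSET, SCALE, SHAPE = -0.001, 0.382, 15.4

def interarrival_time():
    """One sampled time between truck arrivals, in the units of the fitted data."""
    return OFFSET + random.weibullvariate(SCALE, SHAPE)

random.seed(42)
samples = [interarrival_time() for _ in range(1000)]
mean = sum(samples) / len(samples)
# With a shape of 15.4 the distribution is tightly concentrated near the scale.
print(round(mean, 3))
```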
3.4.5: Summary
The layout of the case study facility and its operations will serve as the basis for
the Arena simulations. The processes described in this section will be duplicated in
Arena, and the data gathered from the truck arrivals will be used to generate truck and
cargo arrivals. Section 3.5 discusses the Arena simulations in detail.
3.5: Arena Simulation of Cargo Flow Through an On-airport Cargo Facility
3.5.1: Introduction
Arena will be used first to create a base case simulation without any security
technologies, although it will include a small percentage of physical
inspection. The simulations will be based on the layout of the case study facility and will
use the data gathered during the case study. Each simulation will run for a period of 5
38 Averill M. Law and W. David Kelton, Simulation Modeling and Analysis Third Edition, McGraw-Hill, Boston, 2000, p. 303.
years and will undergo 3 replications. Some of the major factors from the case study that
will be incorporated into the Arena models will be the following:
1. Truck arrival times
2. Distribution of cargo units per truck arrival
3. Number of facility employees and forklifts
4. Number of flight destinations
5. Allowable time in which processing must be completed
6. Inclusion of physical inspection
It should be noted that although the case study facility routinely conducts physical
inspections on 25% of their outbound cargo, this is known to be greater than the
classified amount of physical inspection required by TSA. It will be assumed that TSA
requires 10% physical inspection, and this will be the value used in the simulations.
At the beginning of the simulation when the cargo units are created, 0.1% of the
units coming through will be assigned a high-risk status. The physical inspection in the
simulation will be conducted randomly, and statistics will be collected on the number of
high risk units inspected and found, along with the number of units that are not inspected
and work their way through the entire process. It will be assumed that physical
inspection is 100% effective at finding high-risk cargo.
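A minimal sketch of this base case bookkeeping, under the stated assumptions (0.1% of created units are high risk, physical inspection is applied to units at random, and inspection finds high-risk cargo with certainty):

```python
import random

# Sketch of the base case statistics: count high-risk units caught by random
# physical inspection versus those that pass through uninspected.
# The 10% inspection rate is the TSA-requirement assumption stated in the text.

rng = random.Random(5)
N = 100_000
caught = missed = 0
for _ in range(N):
    high_risk = rng.random() < 0.001    # 0.1% of units are high risk
    inspected = rng.random() < 0.10     # 10% random physical inspection
    if high_risk and inspected:
        caught += 1                     # inspection is assumed 100% effective
    elif high_risk:
        missed += 1                     # works its way through the process
print(caught, missed)  # roughly 10% of the ~100 high-risk units are caught
```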
Once the base case simulation is completed and evaluated, the combinations of
security methods will be incorporated for evaluation as discussed in Section 3.3. The rate
of 10% physical inspection will be included in these cases, as well. For these cases, the
cargo units will be assigned a specific high-risk status, meaning they contain a dangerous
or prohibited material. The five specific risks used in this project will include illegal or
dangerous gases, illegal drug shipments, stolen or mislabeled goods, radioactive
materials, and explosives (or explosive materials) not including gases and radioactive
materials. Table 3.1 shows which screening methods listed in Section 2.2 can detect
these five risks.
Screening Method                Risks Detected
X-Ray                           Explosives, stolen/mislabeled goods, illegal drugs
Gamma Ray                       Explosives, stolen/mislabeled goods, illegal drugs
Pulsed Fast Neutron Analysis    Explosives, illegal drugs
Thermal Neutron Activation      Explosives
Vapor Detection                 Dangerous gases
Trace Detection                 Explosives, illegal drugs
Radiation Detection             Radioactive materials
Canines                         Explosives, illegal drugs
Table 3.1: Detection Capabilities of Screening Methods Researched
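Table 3.1 translates naturally into a lookup for computing the combined detection coverage of a candidate combination of methods; the short risk labels below are abbreviations of the five risk names used in the text:

```python
# Table 3.1 as a lookup: which of the five risk types each method can detect.

DETECTS = {
    "X-Ray": {"explosives", "stolen/mislabeled", "drugs"},
    "Gamma Ray": {"explosives", "stolen/mislabeled", "drugs"},
    "Pulsed Fast Neutron Analysis": {"explosives", "drugs"},
    "Thermal Neutron Activation": {"explosives"},
    "Vapor Detection": {"gases"},
    "Trace Detection": {"explosives", "drugs"},
    "Radiation Detection": {"radioactive"},
    "Canines": {"explosives", "drugs"},
}
ALL_RISKS = {"gases", "drugs", "stolen/mislabeled", "radioactive", "explosives"}

def coverage(combination):
    """The set of risk types a combination of screening methods can detect."""
    found = set()
    for method in combination:
        found |= DETECTS[method]
    return found

combo = ["X-Ray", "Vapor Detection", "Radiation Detection"]
print(sorted(coverage(combo)))
print(coverage(combo) == ALL_RISKS)  # this combination covers all five risks
```

A combination's coverage is one of the three evaluation axes (alongside cost and screening time) described in Section 3.3.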
Once a cargo unit is chosen for security screening, it will be inspected by all the
security methods included in the case unless one of the methods detects risk. If this
occurs, the cargo unit will be taken out of the security queue to undergo physical
inspection in order to verify the risk. If the risk is verified, the unit will be immediately
removed from the facility. If the risk is not verified (thus resulting in a false positive on
the part of the method that detected the “risk”), the unit will be routed to the cargo sorting
area. Screening for all five high-risk cargo types occurs simultaneously within the Arena
models.
Each combination case will be simulated three times. The first simulation will
screen a total of 10% of the outbound cargo, the second simulation will screen 25% of the
outbound cargo, and the third will screen 50% of the outbound cargo. Each of these
simulations will go through three replications. These percentages do not include the 10%
physical inspection. The screening time for each percentage for each case will be
evaluated, and comparisons of cost and screening time will be made among the different
combination cases. Statistics will be collected on the number and types of high-risk
cargo units caught, the number and types of high-risk cargo units not caught, the number
of true and false positives, and the total process time. The assumptions made in the
simulations are presented in the next section.
3.5.2: Simulation Setup and Assumptions
Each simulation except for the base case simulation will consist of four parts: a
truck model, a cargo flow model that includes the 10% physical inspection, a security
case model, and animation. The base case simulation will not have a security case model.
In the actual case study facility, cargo does not leave the facility immediately after
processing – it is held in the facility until close to its flight departure time. However, for
accurate evaluation of processing times in the simulation, it is assumed that cargo leaves
the facility as soon as the processing is completed. All cargo is assumed to be domestic
cargo and therefore will have a cutoff time of 2 hours. The base case simulation is set up
so that all cargo may be processed within 1 hour and 40 minutes or less to allow at least
20 minutes to transport cargo from the facility to the plane.
The cargo must “seize,” or take hold of, resources at various instances throughout
the simulation. The resources available for cargo to seize are 7 cargo bay doors, 6 facility
employees, and 4 forklifts. Both trucks and their associated cargo will simultaneously
seize an available cargo bay door upon arrival at the facility. Cargo must seize an
employee when going through paperwork and flight status verification, labeling and
measurements, physical inspection, and each technology in the security model. Cargo
must seize a forklift when being transported to the temporary storage area, being
transported to its city destination station, undergoing placement in ULDs and tugs, and
when being screened by certain technologies. If all resources are busy, the cargo must
wait in a queue to be serviced by the next available resource.
As stated in Section 3.3, very little data is available on the effectiveness of
screening technologies. Billie Vincent, president and CEO of Aerospace Services
International, estimates that explosive-detecting technologies are around 70% effective in
a cargo environment for units up to 79” x 60” x 64”.39 A triangular distribution with a
minimum of 60% effectiveness, a mean of 70% effectiveness, and a maximum of 80%
effectiveness will be assumed for all screening technologies. A uniform false positive
rate of 1% to 3% will be assumed for each technology.
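A single screening trial under these assumed rates can be sketched as follows (note that Python's `random.triangular(low, high, mode)` puts the mode last):

```python
import random

# Sketch of one screening trial under the stated assumptions: effectiveness
# drawn from Triangular(60%, 70%, 80%) and a uniform 1-3% false positive rate.

def screen(unit_is_high_risk):
    """Return True if the method flags the unit (a true or false positive)."""
    effectiveness = random.triangular(0.60, 0.80, 0.70)
    false_positive_rate = random.uniform(0.01, 0.03)
    if unit_is_high_risk:
        return random.random() < effectiveness
    return random.random() < false_positive_rate

random.seed(1)
hits = sum(screen(True) for _ in range(10_000))
false_alarms = sum(screen(False) for _ in range(10_000))
print(hits / 10_000, false_alarms / 10_000)  # roughly 0.70 and 0.02
```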
Other assumptions have also been made in the simulations for truck arrivals,
cargo, and security processes. These assumptions are based on observations, not actual
recorded data, from the case study facility. Assumptions of a stochastic nature have been
assigned either triangular or uniform distributions because of their simplicity. Triangular
distributions take the form
f(x) = [2(x − a)] / [(b − a)(c − a)]   if a ≤ x ≤ c
f(x) = [2(b − x)] / [(b − a)(b − c)]   if c < x ≤ b
where
a < c < b
a = a location parameter; the minimum possible value,
c = the shape parameter; the value that maximizes the density function, and
39 Stephen Parezo, “Containing Security,” Air Cargo World Online,
http://www.aircargoworld.com/archives/features/2_apr02.htm. April 2, 2002.
b – a = a scale parameter; b is the maximum possible value.
Uniform distributions take the form
f(x) = 1/(b-a) if a ≤ x ≤ b
where
a < b
a = a location parameter; the minimum possible value, and
b − a = a scale parameter; b is the maximum possible value.40
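As a check on the triangular form above, the density can be integrated and inverted in closed form, giving a sampler that matches the stated TRIA(a, c, b) assumptions. Python's standard library also offers `random.triangular(low, high, mode)` directly, so this sketch exists only to show the formulas in action:

```python
import math, random

# Integrating the triangular density gives F(x) = (x-a)^2 / [(b-a)(c-a)] on
# [a, c] and F(x) = 1 - (b-x)^2 / [(b-a)(b-c)] on (c, b], which inverts
# in closed form for inverse-CDF sampling.

def tria(a, c, b, u=None):
    """One draw from TRIA(a, c, b): minimum a, mode c, maximum b."""
    if u is None:
        u = random.random()
    if u < (c - a) / (b - a):
        return a + math.sqrt(u * (b - a) * (c - a))
    return b - math.sqrt((1 - u) * (b - a) * (b - c))

random.seed(7)
draws = [tria(1, 7, 20) for _ in range(10_000)]  # the physical inspection delay
print(round(sum(draws) / len(draws), 2))         # mean approaches (1 + 7 + 20) / 3
```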
The assumptions that have been made about truck arrivals are the following:
1. Trucks (and the cargo contained within) will randomly choose an available cargo
bay door.
2. The number of cargo units unloaded from a truck will follow a triangular
distribution of the form TRIA(1, 8, 18) cargo units per truck.
The assumptions that have been made about the cargo process are the following:
1. The amounts of high-risk cargo: it is assumed that only 0.1% of all cargo entering
the facility will be high-risk. All designated high-risk cargo will be divided
evenly among the 5 high-risk types considered in this study, so that all incoming
cargo has a 0.02% chance of being one of the 5 high-risk types.
2. Cargo will be evenly distributed among the ten destinations served, so that all
incoming cargo has a 10% chance of being bound for any one destination.
3. The cargo delay time for paperwork and flight status verification will follow a
triangular distribution of the form TRIA(0.2, 0.75, 2) minutes per cargo unit.
40 Averill M. Law and W. David Kelton, Simulation Modeling and Analysis Third Edition,
McGraw-Hill, Boston, 2000, pp. 299 and 317.
4. Physical inspection of cargo will follow a triangular distribution of the form TRIA
(1,7,20) minutes per cargo unit.
5. Physical inspection is 100% effective at discovering all 5 types of high-risk cargo.
6. Any high-risk cargo that is discovered will be immediately removed from the
facility for further inspection.
7. Cargo sorting will follow a triangular distribution of the form TRIA(0.4, 0.6, 0.8)
minutes per cargo unit.
8. Cargo will be delayed at the city destination stations for a length of time
following a triangular distribution of the form TRIA(0.2, 0.4, 0.75) minutes per
cargo unit for the purpose of batching cargo into ULDs.
9. Cargo units will be batched together in ULDs and follow a triangular distribution
of the form TRIA(1, 2, 3) units per ULD.
10. The time required to place ULDs onto tugs will follow a triangular distribution of
the form TRIA(0.2, 0.3, 0.5) minutes per ULD.
11. ULDs are batched together on tugs following a uniform distribution of the form
UNIF(3, 4) ULDs per tug.
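Assumption 2 from the truck list and assumptions 9-11 above can be sketched together as one truck's load being generated and batched. Rounding the continuous draws to whole units is an added assumption here, since the text does not say how fractional draws are handled:

```python
import random

# Sketch: TRIA(1, 8, 18) cargo units per truck, TRIA(1, 2, 3) units per ULD,
# UNIF(3, 4) ULDs per tug. random.triangular takes (low, high, mode).

random.seed(3)

def truck_load():
    return round(random.triangular(1, 18, 8))    # cargo units on this truck

def batch(n_units):
    """Group n_units into ULDs, then group the ULDs onto tugs."""
    ulds = []
    while n_units > 0:
        per_uld = min(n_units, round(random.triangular(1, 3, 2)))
        ulds.append(per_uld)
        n_units -= per_uld
    tugs = []
    while ulds:
        per_tug = round(random.uniform(3, 4))    # 3 or 4 ULDs per tug
        tugs.append(ulds[:per_tug])
        ulds = ulds[per_tug:]
    return tugs

units = truck_load()
tugs = batch(units)
print(units, tugs)  # a few tugs, each carrying up to 4 ULDs of 1-3 units
```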
The assumptions that have been made about the security technologies and processes are
the following:
1. The time needed to process a cargo unit through x-ray, gamma ray, trace
detection, vapor detection, radiation detection, and canine scanning will follow
uniform distributions using the minimum and maximum time values listed in
Table 2.1 if they are chosen for analysis in the simulations.
2. The time needed to process a cargo unit through pulsed fast neutron analysis and
thermal neutron activation will maintain a constant value of 1 hour if chosen for
analysis in the simulations.
3. For each security case, cargo selected for security screening will proceed through
each security method unless a unit is found to be high risk, in which case it will be
removed from the security queue and physically inspected to determine whether
or not the cargo poses an actual risk.
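Assumption 3 can be sketched as a loop over the methods in a security case. For brevity this sketch ignores the matching of risk types to detection capabilities (Table 3.1) and uses nominal effectiveness and false-positive rates:

```python
import random

# Sketch: a selected unit passes through each method until one flags it; a flag
# triggers physical inspection (assumed 100% reliable) to verify the risk.

def screen_unit(is_high_risk, methods, rng=random):
    """Return 'removed', 'false_positive', or 'passed' for one cargo unit."""
    for effectiveness, fp_rate in methods:          # (detection, false-positive)
        flagged = (rng.random() < effectiveness if is_high_risk
                   else rng.random() < fp_rate)
        if flagged:
            # Physical inspection separates true from false positives.
            return "removed" if is_high_risk else "false_positive"
    return "passed"

rng = random.Random(11)
case = [(0.70, 0.02), (0.70, 0.02)]                 # two methods, nominal rates
outcomes = [screen_unit(True, case, rng) for _ in range(10_000)]
print(outcomes.count("removed") / 10_000)           # about 1 - 0.30^2 = 0.91
```

Chaining methods this way is why combinations can achieve a higher detection rate than any single method alone.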
3.5.3: Simulation Evaluations
The final evaluations of each security case will be based on the average amount of
time it takes a cargo unit to be processed, the percentage of high-risk cargo found based
on the amount of high-risk cargo selected for security screening, and the percentage of
high-risk cargo found based on the total amount of high-risk cargo being processed
through the facility regardless of whether or not it was selected for screening (henceforth
referred to as Percent Found from Security and Percent Found Total, respectively).
Significance testing will be performed on the Percent Found from Security and Percent
Found Total for each case in order to evaluate the results of the simulations.
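The text does not name the significance test to be used; one plausible choice for comparing Percent Found values between two cases is a two-proportion z-test, sketched here with made-up counts:

```python
import math

# Hypothetical sketch: comparing Percent Found between two security cases.
# The report does not specify which test was used; the counts below are
# illustrative, not results from the simulations.

def two_proportion_z(found_a, n_a, found_b, n_b):
    """z statistic for H0: the two cases find high-risk cargo at the same rate."""
    p_a, p_b = found_a / n_a, found_b / n_b
    pooled = (found_a + found_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

z = two_proportion_z(91, 100, 70, 100)  # e.g. 91% vs 70% found, 100 units each
print(round(z, 2))  # |z| > 1.96 rejects equality at the 5% level
```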
CHAPTER 4: RESULTS
4.1: Airport Surveys
4.1.1: Introduction and Classification
The surveys yielded a 16% return rate. Most airports returning the survey did not
fully complete it, especially Part 2, which pertained to security. One initial observation
was a large gap in reported quarterly cargo volumes: responding airports either had
volumes of 6,000 tons or less, or 12,000 tons or more. For the purposes of this project,
the responding airports were divided into two groups based on cargo volumes. Airports
that reported having 6,000 tons per quarter or less were classified as small cargo-handling
airports, and airports that reported processing more than 12,000 tons per quarter were
classified as large cargo-handling airports.
Six airports did not report any cargo volumes, so an alternative classification
method was needed in order to determine their group placement. The answers to
questions 1 through 5 were analyzed and compared with the answers given by airports
reporting quarterly cargo volumes. Airports reporting larger cargo volumes tended to
have more companies leasing and sorting cargo on site, along with larger facilities and
more aircraft parking positions. Airports not reporting cargo volumes with answers
similar to either large or small airports for the first five questions were placed in the
respective groups.
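The two-group classification just described can be sketched as a small rule. The fallback for airports without reported volumes is illustrative, since the report compares the Questions 1-5 answers qualitatively rather than by a formal score:

```python
# Sketch of the small/large classification: quarterly volume splits airports at
# the observed gap (<= 6,000 vs >= 12,000 tons); airports without a reported
# volume fall back on a judgment from their Questions 1-5 answers (companies
# on site, facility size, aircraft parking positions). No responding airport
# reported a volume between 6,000 and 12,000 tons, so the boundary is moot.

def classify(quarterly_tons=None, looks_like_large=None):
    if quarterly_tons is not None:
        return "small" if quarterly_tons <= 6000 else "large"
    return "large" if looks_like_large else "small"

print(classify(3200), classify(15000), classify(looks_like_large=False))
```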
4.1.2: Responses and Reactions
Only 19 of the 118 surveys sent out were returned. One airport called to get
clarification on Questions 1 and 2, making sure the questions pertained to on-airport
cargo facilities only. A TSA official from another airport called to obtain verbal
reassurance that individual responses would not be published. The TSA official felt that
the survey questions were in compliance with TSA regulations, meaning that airports
should not be prohibited by law from answering any of the questions. However, some of
the airports that returned the surveys answered few or none of the questions pertaining to
security, stating that it was either against airport policy, TSA policy, or that the
information was simply unavailable. Two airports sent emails stating that they would not
return the survey because they thought it conflicted with TSA regulations. Out of 19
responses, 5 airports were interested in having a case study on cargo operations
performed at their cargo facilities.
4.1.3: Detailed Findings
The results from each individual question are given. The full set of graphical
results is given in Appendix A and shows the differences between small and large
airports, as well as the overall trends for all responding airports.
Question 1: How many companies/airlines SORT cargo at your airport (either at a
central sorting facility or at individual locations on airport property)?
Total: The majority of respondents (61.1%) have 5 or fewer airlines/companies
(freight forwarders, etc.) sorting cargo at their airports. Only 11.2% of respondents
have more than 15 airlines/companies sorting at their airports. 90% of airports
reporting 5 or fewer sorting companies were small airports.
Small Airports: 90% of small airports have 5 or fewer airlines/companies sorting
cargo at their airports. 40% of responding small airports have no airlines or
companies sorting cargo at their airports.
Large Airports: 75% of large airports have anywhere from 5 to 15
airlines/companies sorting cargo at their airports, and the spread between these
numbers is very even.
Question 2: How many companies/airlines LEASE cargo space (for storage, sorting,
etc.) at your airport?
Total: Half of respondents (50%) have 5 or fewer airlines/companies
(freight forwarders, etc.) leasing cargo space at their airports. Only 11.2% of
respondents have more than 15 airlines/companies leasing cargo space at their
airports.
Small Airports: 80% of small airports have 5 or fewer airlines/companies leasing
space at their airports.
Large Airports: Large airports tend to have more airlines/companies leasing from
them than smaller airports. Almost 40% of large airports have between 6 and 10
airlines/companies leasing from them. Another 25% have between 11 and 15
airlines/companies leasing space from them.
Question 3: What is the total size (e.g., in square feet, acres, etc) of your cargo
facilities?
Total: 47.1% of respondents have a total facility size of 100,000 square feet or
less. Only 11.8% of respondents have more than 250,000 square feet of total
cargo facility space. All airports with more than 200,000 square feet of total
facility space belong to large airports.
Small Airports: 60% of all small airports have 50,000 square feet or less of total
cargo facility space.
Large Airports: 75% of large airports have more than 200,000 square feet of total
cargo facility space.
Question 4: What is the total size (e.g., in square feet, acres, etc.) of each individual
facility?
Note: The results presented here represent size ranges for individual airport cargo
facilities. The results do not reflect whether a respondent had one facility or
multiple facilities.
Total: The largest share of facilities (35.6%) are 20,000 square feet or less. 27.1%
of the facilities are between 20,000 square feet and 40,000 square feet.
Small Airports: Over 60% of small airports have individual facilities that are
20,000 square feet or less. Nearly 24% have between 20,000 and 40,000 square
feet per facility.
Large Airports: 50% of large airports have 40,000 square feet per facility or less.
The large airports tend to have more space per facility than smaller airports.
Question 5: How many aircraft parking positions are there at each cargo facility?
Please list the number of positions for each facility.
Total: Nearly half of respondents (48.5%) reported that their facilities had 2 or
fewer aircraft parking positions. 93.9% of respondents have 10 or fewer aircraft
parking positions.
Small Airports: 78.6% of small airports have 2 or fewer aircraft parking positions
at their cargo facilities.
Large Airports: 31.3% of large airports have 2 or fewer aircraft parking positions,
and another 50% have between 3 and 6 aircraft parking positions per facility.
Question 6: Is sorting conducted centrally for all carriers, or individually by each
carrier?
All respondents reported that sorting is conducted individually by each carrier.
Question 7: How many airport/airline employees work in secure air cargo areas?
Total: 41.7% of respondents reported that 50 or fewer employees work in secure
air cargo areas. 83.3% have 150 or fewer employees working in secure air cargo
areas. Only 16.7% have over 200 employees working in secure air cargo areas.
Small Airports: 55.6% of small airports have 50 or fewer employees working in
secure cargo areas.
Large Airports: 50% of large airports have between 101 and 150 employees
working in secure cargo areas. No large airports reported having fewer than 51
employees in secure cargo areas.
Question 8: What percentage of the air cargo flowing through your airport is
international cargo?
Total: The majority of respondents (60%) reported that only 10% or less of their
cargo is international cargo. 93.3% of respondents reported 40% or less of their
cargo is international cargo.
Small Airports: 66.7% of small airports reported that 0-10% of their cargo is
international. No small airports reported having more than 40% international
cargo.
Large Airports: About 57% of large airports have between 0% and 10%
international cargo. 14.3% of large airports reported having over 90%
international cargo. Large airports have a tendency to have a higher percentage of
international cargo compared to small airports.
Question 9: What percentage of cargo flowing through your airport comes through as
belly freight on passenger planes?
Total: 40% of respondents reported that 10% or less of their cargo is transported
as belly freight. 60% of respondents reported that 20% or less of their cargo is
transported as belly freight. The corresponding histogram shows a significant gap
between responses: respondents tend to have 40% or less belly freight, or more
than 80% of cargo as belly freight.
Small Airports: Nearly 45% of small airports have less than 10% belly freight.
Another 22.2% have more than 90% belly freight.
Large Airports: 33.3% of large airports reported having less than 10% belly
freight, and another 33.3% reported having between 31% and 40% belly freight.
Nearly 17% of large airports have 81-90% belly freight.
Question 10: How is belly cargo from passenger planes transported from the plane to
the cargo facility and vice versa?
Total: Of the answers given, 68.8% of respondents report that tugs and/or carts
are used to transport belly cargo between the facility and the aircraft. Another
12.5% use trucks. Other responses, at 6.3% each, were pod loaders, belt loaders,
and ramps.
Small Airports: Small airports reported using only tugs and/or carts or trucks.
80% of small airports use tugs and/or carts to transport cargo.
Large Airports: Large airports use more methods than small airports. 60% of
large airports reported using tugs and/or carts to transport cargo.
Question 11a: What are the most common types of cargo passenger carriers?
Total: The two most common types are mail and perishable items (50% and
41.7%, respectively). Other reported items were fish, human remains, express
packages, electronics, medicine, live animals, gaming equipment, and caskets.
Small Airports: The most common types of cargo for passenger carriers reported
by small airports are mail and perishable items (reported by 50% and 33.3% of
small airports, respectively). Human remains and caskets were also reported.
Large Airports: The most common types of cargo for passenger carriers reported
by large airports are mail and perishable items (each reported by 50% of large
airports).
Question 11b: What are the most common types of cargo for all-cargo carriers?
Total: Express packages are the most common type at 36.6%, followed closely by
electronics at 36.4%. Also reported were manufactured items, retail merchandise, machine
parts, mail, perishable items, international goods, automobile parts,
pharmaceuticals, furniture, and chemicals.
Small Airports: 80% of small airports reported express packages as a common
cargo type for all-cargo carriers. Also reported were manufactured items, retail
merchandise, international freight, perishable items, and electronics at 20% each.
Large Airports: 50% of large airports report both express packages and
electronics as common cargo types for all-cargo carriers. Mail, perishable items,
and automobile parts were all reported by 33.3% of large airports.
Question 12: What are the latest quarterly volumes of cargo flow by weight for your
airport?
Note: Results reflect quarterly volumes from the past year and are split up by
quarter.
Total: For all 4 quarters, the majority of respondents reported a cargo flow of
3,000 tons or less per quarter (38.5% for Quarters 1 and 2, and 41.7% for Quarters
3 and 4). A significant percentage of respondents reported cargo flow in excess
of 27,000 tons per quarter (15.4% for Quarters 1 and 2, and 16.7% for Quarters 3
and 4).
Small Airports: For the first two quarters, about 80% of small airports reported
volumes less than 3,000 tons per quarter. No small airports reported volumes in
excess of 6,000 tons per quarter. For the third and fourth quarters, all small
airports reported volumes less than 3,000 tons per quarter.
Large Airports: For the first and second quarters, all large airports had more than
12,000 tons per quarter. Cargo volumes were fairly spread out from 12,000 tons
to more than 27,000 tons. For the last two quarters, all large airports had more
than 15,000 tons per quarter. For the third quarter, volumes were fairly spread out
again from 15,000 tons to more than 27,000 tons. In the fourth quarter, nearly
43% of large airports reported cargo volumes between 21,001 tons and 24,000
tons.
Question 13: Of all cargo flowing through your airport, what percentage of it is
subject to any sort of security screening, scanning, or inspection process?
Total: 70% of all respondents reported screening up to 10% of their cargo; the
remaining 30% reported screening 91-100% of their cargo.
Small Airports: 71% of small airports reported screening 10% or less of their
cargo, and 29% reported screening more than 90% of their cargo.
Large Airports: 75% of large airports screen less than 10% of their cargo, and
25% reported screening over 90% of their cargo.
Question 14: What screening technologies are currently available and/or used for
cargo?
Total: Technologies reported were closed circuit television, x-ray technology,
canines, trace detection, and physical inspection. X-ray is the most popular
method, used by 54.5% of responding airports. Canines are the second most
popular method, used by 45.5% of responding airports. Physical inspection was
the least popular method, only used by 9.1% of respondents. Many respondents
reported using multiple screening methods at their airports.
Small Airports: X-ray is the most popular cargo screening method for small
airports, used by 83.3%. Closed circuit television (CCTV) is the second most
popular method at 66.7%. Both trace detection and canines are used at 50% of
responding small airports. Physical inspection is rarely used, at 9.1%.
Large Airports: Large airports do not use as many cargo-screening methods as
small airports do. Only CCTV, x-ray, and canines were reported as being used.
X-ray machines and canines were reported most frequently, each used by 80% of
responding large airports. CCTV is used by 60% of responding large airports.
Question 15: What type(s) of screening for airport/airline workers is currently used to
allow them to enter secure airport areas?
Total: Screening methods reported were background checks, ID badges,
biometrics, pin numbers, fingerprinting, badges, and smart or swipe cards.
Background checks and ID badges were the most popular methods, each used by
66.7% of responding airports. Badges and smart cards/swipe cards are also used
frequently, at 55.6% and 50.0%, respectively. Only 5.6% of respondents reported
using biometrics, pin numbers, or fingerprinting. Most respondents reported
using multiple screening methods at their airports.
Small Airports: 100% of responding small airports use background checks and ID
badges for employees accessing secure cargo areas. 80% use badges (i.e., an
airport patch on a uniform) and smart or swipe cards to control access. A small
percentage use biometrics and fingerprinting.
Large Airports: 87.5% of large airports use both background checks and ID
badges. 75% use both badges and smart or swipe cards. As can be seen from the
graph above, the use of various employee access control methods is similar
among both large and small airports.
Question 16: Are visitors allowed in the cargo facilities?
Total: 62.5% of responding airports allow visitors in the cargo facilities.
Small Airports: Just over 50% of small airports allow visitors in cargo facilities.
Large Airports: More than 70% of responding large airports allow visitors into the
cargo facilities.
Question 16a: If YES, are they subject to any type of screening?
Total: Of those airports that allow visitors, 80% screen them. All responding
airports that screen visitors do so by providing them with an authorized escort.
Small Airports: 80% of small airports provide visitors with an authorized escort.
Large Airports: 80% of large airports provide visitors with an authorized escort.
Question 17: What percentage of cargo coming through the airport is from known
shippers?
Total: The majority of responding airports, at 72.7%, have 91% to 100% of their
cargo coming from known shippers.
Small Airports: 57% of small airports have 91%-100% of their cargo coming
from known shippers. No small airports reported transporting less than 71%
known shipper cargo.
Large Airports: All responding large airports reported shipping 91%-100% known
shipper cargo.
Question 18: Are trucks carrying cargo to or from airport property allowed to enter
secured areas of the airport?
Total: 55.6% of all responding airports allow trucks to enter secure airport areas.
Small Airports: 60% of small airports allow trucks to enter secure airport areas.
Large Airports: 50% of large airports allow trucks to enter secure airport areas.
Question 19a: Truck clearance and security: Must the truck driver obtain clearance
for himself as the correct driver? Please explain.
Total: Of responding airports that allow trucks to enter secure airport areas, 100%
require truck driver verification. Verification methods reported were ID
verification, airport ID check, pin number, smart card or badge, biometrics,
prearranged clearance, and assignment of an authorized escort. ID verification
and assignment of an authorized escort were the most common responses, at
23.1% each. Airport ID checks and smart cards or badges were the second most
popular responses at 15.4% each.
Small Airports: Truck driver clearance methods for small airports are varied.
Identity verification is the most popular method, used by 28.6% of small airports.
Also reported used were airport IDs, biometric readers, authorized escorts, pin
numbers, and smart cards.
Large Airports: Large airports do not use as many methods as small airports do –
airport IDs, authorized escorts, and prearranged clearance were the only clearance
methods reported. Authorized escort was the most common measure used,
reported by 50% of large airports.
Question 19b: Truck clearance and security: How are the trucks themselves cleared?
Total: Of responding airports that allow trucks to enter secure airport areas, 80%
clear trucks manually. 10% clear them electronically, while another 10% clear
trucks by other methods.
Small Airports: 66.7% of small airports use manual clearance to allow trucks to
enter secure airport areas. Computer/electronic clearance and other methods are
each used by 16.7% of small airports.
Large Airports: 100% of responding large airports clear trucks manually.
Question 19c: Truck clearance and security: Are trucks inspected for firearms/stolen
goods, etc.?
Total: 25% of respondents reported inspecting trucks always, 25% sometimes,
and 25% never. 12.5% of responding airports reported frequently inspecting
trucks, and another 12.5% reported rarely inspecting trucks.
Small Airports: There is a significant gap in truck inspections at small airports.
Trucks are inspected frequently/always, or not at all. 40% of small airports
reported always inspecting trucks, while another 40% reported never inspecting
trucks.
Large Airports: Truck inspection practices at large airports are more uniform,
occurring either sometimes or rarely. 66.7% of large airports report sometimes
inspecting trucks.
Question 20: Does international cargo flow through your airport?
Total: 61.1% of responding airports report that international cargo does flow
through them.
Small Airports: Only 40% of small airports handle international cargo. 75% of
small airports handling international cargo also have customs on site.
Large Airports: Almost 88% of large airports handle international cargo.
Question 21a: For international cargo: On average, how long does it take for cargo to
clear customs?
Total: Of airports responding to Question 20, 42.9% report that international
cargo takes 1 to 2 hours to clear customs. 14.3% reported that it takes 4 or more
hours to clear international cargo through customs, while an equal percentage
reported 1 hour or less.
Small Airports: All responding small airports take a minimum of 3 hours to clear
international cargo through customs.
Large Airports: 60% of large airports take between 1 and 2 hours to clear
international cargo through customs. No large airports reported a clearance time
greater than 3 hours.
Question 21b: For international cargo: In general, for cargo that is selected for
screening, is it subject to more stringent screening than domestic cargo?
Total: For this question, 60% of airports reported that they do not screen
international cargo more stringently than domestic cargo.
Small Airports: 100% of small airports reported that international cargo is not
subject to additional screening.
Large Airports: 66.7% of large airports do screen international cargo more
stringently than domestic cargo.
Question 22: Are the dumpsters in secure airport areas subject to any type of
surveillance?
Total: 73.3% of responding airports report that dumpsters are subject to
surveillance.
Small Airports: 66.7% of small airports subject dumpsters to surveillance.
Large Airports: 83.3% of large airports subject dumpsters to surveillance.
Question 22a: If YES, what type (CCTV, guard patrol, etc.)?
Total: Of those responding yes to the previous question, 53.3% use closed circuit
television for surveillance and 46.7% use a guard patrol.
Small Airports: Most small airports (62.5%) use CCTV to keep surveillance on
dumpsters. The rest use guard patrols.
Large Airports: 57.1% of large airports use guard patrols to keep an eye on
dumpsters. The rest reported using CCTV.
The results of the first section of the survey were used to gain an understanding of
how air cargo facilities are laid out, the size and scope of the operations within them, and
the characteristics of the cargo handled within them. The answers from the first section
were also used to classify airports as small cargo-handling airports and large cargo-
handling airports. The second section of the survey, which dealt with security issues, was
used to determine the state of security in and around air cargo facilities, what specific
measures are in use for ensuring security of cargo and personnel, and the frequency of
use. In terms of direct cargo screening, which is the focus of this research, the results
show that only a small amount of cargo is screened.
4.1.3: Summary
As should be expected, larger airports have more companies sorting and leasing
space, larger cargo facilities, more aircraft parking positions, more employees, and more
varied types of cargo flowing through their airports than smaller airports.
Small airports are more inclined to use newer types of screening technologies for
cargo, truck drivers, trucks, and airport personnel. All responding small airports require
employee background checks, which is now also required by TSA. However, a small
percentage of large airports do not conduct background checks. This is a significant
vulnerability in that these airports are unaware of potential criminal behavior that could
easily go unnoticed amidst large, busy cargo operations.
Most airports require visitors to cargo facilities to remain under the supervision of
an authorized escort. However, the minority that does not leaves their cargo operations
vulnerable to theft and tampering.
None of the responding large airports will ship cargo from unknown shippers.
Many small airports ship a small percentage of cargo from unknown shippers. As long as
some type of verification process is in place to determine the legitimacy of the unknown
shipper and its shipments, this small percentage of unknown shipper cargo should not
pose a significant security threat.
Small airports are more likely to allow trucks into secured airport areas than
larger airports are. Those airports allowing trucks into secure areas should ensure that a
security checkpoint is in place to catch any suspicious people, vehicles, or cargo before
they gain access to secured areas. This can reduce susceptibility to theft and tampering.
All airports report requiring truck drivers to properly identify themselves. Identification
procedures vary.
Small airports either consistently inspect trucks or do not inspect them at all.
Large airports were more consistent with their responses – inspections are not frequent,
but they are conducted on some level at all responding large airports. Small airports
devoid of truck inspections leave themselves open to criminal activity, especially if the
lack of inspections is known.
Large airports handle much more international cargo than small airports, and they
can clear it through customs twice as fast. Most large airports reported subjecting
international cargo to more stringent screening than domestic cargo, yet none of the
responding small airports reported doing so.
Three quarters of airports keep their dumpsters under surveillance. Dumpsters
can be an outlet for cargo theft, so this is an important area of security coverage. Small
airports tend to use CCTV to watch dumpsters, while large airports lean more toward the
use of guard patrols.
Almost three quarters of all responding airports screen less than 10% of their air
cargo. This presents a serious security issue, leaving criminals and terrorists with ample
opportunities to transport explosive materials, stolen goods, and other contraband.
4.2: Screening Methods
As discussed in Section 3.3, the first phase of evaluation of the screening methods
consisted of analysis of the screening time, cost, and applicability in an air cargo
environment in order to determine what technologies would be chosen for the second
phase of evaluation (the simulations). Table 4.1 shows the results of the first evaluation
phase.
Screening Method               Simulate?   Reason for Not Simulating
X-Ray                          Yes         --
Gamma Ray                      Yes         --
Pulsed Fast Neutron Analysis   No          Price ($25 million), time (1 hr +)
Thermal Neutron Activation     No          Time (1 hr +)
Vapor Detection                Yes         --
Trace Detection                Yes         --
Radiation Detection            Yes         --
Canines                        No          More information needed

Table 4.1: Results of the First Phase of Evaluation of Screening Methods

All of these methods were found to be applicable in an air cargo environment. Pulsed
fast neutron analysis was cut from consideration due to its high cost and unusually long
required screening time of one hour or more. Thermal neutron activation was
disqualified due to its lengthy screening time, as well. More research must be conducted
before canines can be accurately simulated. Although information on their screening
time is available and their general effectiveness could be assumed, other information
such as the average frequency and duration of air cargo facility visits would be required
in order to effectively simulate canine screening. X-ray, gamma ray, vapor detection,
trace detection, and radiation detection will move to the second phase of evaluation, the
Arena simulations. The results of these will be discussed in the next section.
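The phase-one screen described above amounts to a simple filter on cost, screening time, and data availability. A minimal sketch follows; apart from pulsed fast neutron analysis' $25 million price and 1 hr+ screening time, the numeric values are illustrative assumptions, not figures from this report.

```python
# Phase-one filter sketch: a method advances to simulation only if it is
# affordable, fast enough, and well enough characterized to model.
# All values except PFNA's cost and time are illustrative assumptions.
methods = [
    {"name": "X-Ray", "cost": 1_000_000, "time_hr": 0.1, "enough_data": True},
    {"name": "Pulsed Fast Neutron Analysis", "cost": 25_000_000, "time_hr": 1.5, "enough_data": True},
    {"name": "Thermal Neutron Activation", "cost": 1_500_000, "time_hr": 1.2, "enough_data": True},
    {"name": "Canines", "cost": 50_000, "time_hr": 0.1, "enough_data": False},
]

def passes_phase_one(m, max_cost=20_000_000, max_time_hr=1.0):
    """True if the method is cheap enough, fast enough, and has
    sufficient data to simulate."""
    return m["cost"] <= max_cost and m["time_hr"] < max_time_hr and m["enough_data"]

simulate = [m["name"] for m in methods if passes_phase_one(m)]
print(simulate)
```

With these illustrative values, only X-Ray survives the filter: PFNA fails on cost and time, thermal neutron activation on time, and canines on data availability, mirroring the reasoning in Table 4.1.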
4.3: Arena Models
4.3.1: Simulation Results
The five screening technologies deemed suitable for simulation were arranged
into four different combinations that made up the four security cases simulated in Arena.
The average processing time of these combinations will be compared to the processing
time in the base case, which has no inspection technologies.
Case 1: X-ray
Case 2: X-ray, trace detection, and vapor detection
Case 3: X-ray, trace detection, vapor detection, and radiation detection
Case 4: X-ray, gamma ray, trace detection, vapor detection, and radiation
detection.
For each case, cargo units that were selected to go through the security process passed
through each technology in the order listed above unless one of the technologies detected
a risk. When a risk was detected in a cargo unit, the unit was removed from the security
queue and inspected physically to determine whether or not a risk actually existed.
Tables 4.2 through 4.6 show, for each threat type, which technologies in each case are
capable of scanning for that threat. Table 4.7 combines Tables 4.2 through 4.6, showing
how many technologies in each case can scan for each threat. An “x” denotes one
screening method.
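The routing rule described above — pass each selected unit through the case's technologies in order, diverting to physical inspection on the first alarm — can be sketched as follows. The method names and detection probabilities are illustrative assumptions, not the Arena model's parameters, and false alarms on clean units are omitted for brevity.

```python
import random

def screen_unit(unit_is_high_risk, methods, rng):
    """Route one cargo unit through an ordered list of screening methods.

    methods: list of (name, detection_probability) pairs.
    Returns "physical inspection" if any method raises an alarm,
    otherwise "released".
    """
    for name, p_detect in methods:
        if unit_is_high_risk and rng.random() < p_detect:
            return "physical inspection"  # removed from the queue on first alarm
    return "released"

# Case 2-style lineup with assumed per-method detection probabilities.
case2 = [("x-ray", 0.7), ("trace detection", 0.5), ("vapor detection", 0.4)]
print(screen_unit(True, case2, random.Random(42)))
```

A clean unit always comes back "released" here; a high-risk unit escapes only if every method in the lineup misses it.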
                      Case 1   Case 2   Case 3   Case 4
X-Ray                    x        x        x        x
Gamma Ray                                           x
Vapor Detection
Trace Detection                  x        x        x
Radiation Detection
Table 4.2: Cases with Screening Technologies Capable of Screening for Explosives
                      Case 1   Case 2   Case 3   Case 4
X-Ray                    x        x        x        x
Gamma Ray                                           x
Vapor Detection
Trace Detection
Radiation Detection
Table 4.3: Cases with Screening Technologies Capable of Screening for Stolen/Mislabeled Goods
                      Case 1   Case 2   Case 3   Case 4
X-Ray                    x        x        x        x
Gamma Ray                                           x
Vapor Detection
Trace Detection                  x        x        x
Radiation Detection
Table 4.4: Cases with Screening Technologies Capable of Screening for Illegal Drugs
                      Case 1   Case 2   Case 3   Case 4
X-Ray
Gamma Ray
Vapor Detection
Trace Detection
Radiation Detection                       x        x
Table 4.5: Cases with Screening Technologies Capable of Screening for Radioactive Materials
                      Case 1   Case 2   Case 3   Case 4
X-Ray
Gamma Ray
Vapor Detection                  x        x        x
Trace Detection
Radiation Detection
Table 4.6: Cases with Screening Technologies Capable of Screening for Dangerous/Illegal Gases
                           Case 1   Case 2   Case 3   Case 4
Explosives                    x       xx       xx      xxx
Stolen/Mislabeled Goods       x        x        x       xx
Illegal Drugs                 x       xx       xx      xxx
Illegal/Dangerous Gases                x        x        x
Radioactive Materials                           x        x

Table 4.7: Number of Screening Technologies In Each Case Capable of Detecting Each Threat Type
All the simulations incorporate 10% physical inspection, the percentage assumed to be
required by TSA; however, the analysis of the physical inspection is kept separate from
the analysis of the security cases. Table 4.8 shows the results for the average number of
cargo units processed during the 5-year simulation period and the average length of time
required to process the cargo. The percentages listed underneath each case represent the
total percentage of cargo screened.
Case           Avg. System   Avg. No. Units   Avg. Units Processed   Avg. Process   Avg. Process Time
               Cost          Processed        Per Case               Time (min)     Per Case
Base           -             579,354          579,354                 99             99
Case 1:  10%   $5,000,000    572,883          574,568                102            108
         25%                 572,474                                 106
         50%                 578,348                                 114
Case 2:  10%   $5,580,000    565,080          569,981                104            108
         25%                 569,562                                 107
         50%                 575,300                                 113
Case 3:  10%   $5,610,000    569,433          571,170                103            109
         25%                 579,036                                 106
         50%                 565,041                                 118
Case 4:  10%   $7,360,000    568,377          568,280                105            119
         25%                 554,789                                 117
         50%                 581,675                                 136

Table 4.8: Case Costs and Processing Results
The cost of each case increases as more screening methods are added to the cases. The
amount of time needed to process cargo increases as screening methods are added, as
well. This was expected since additional screening required additional time. Within each
case it can be seen that increasing the percentage of cargo screened results in a significant
increase in the average cargo processing time. The average number of units processed
did not uniformly decrease as expected. This is most likely due to a problem inherent in
Arena, in which units that are “batched” together to form a single unit all take on the
properties and attributes of the first unit in the batch.
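The growth in average processing time with the screening percentage follows from simple expectation arithmetic: if a fraction f of units incurs an extra screening delay s on top of a base handling time b, the average time is b + f·s. With the base case's 99 minutes and an assumed 30-minute screening delay (an illustrative figure, not a model parameter), this simple model happens to track Case 1's reported averages closely:

```python
def avg_process_time(base_min, screen_min, frac_screened):
    """Expected per-unit processing time when a fraction of cargo
    is routed through the security queue."""
    return base_min + frac_screened * screen_min

BASE = 99      # base-case average processing time (min), from Table 4.8
SCREEN = 30    # assumed extra time for a screened unit (illustrative)

for frac in (0.10, 0.25, 0.50):
    print(f"{frac:.0%}: {avg_process_time(BASE, SCREEN, frac):.1f} min")
```

This yields roughly 102, 106.5, and 114 minutes, against Case 1's reported 102, 106, and 114.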
Tables 4.9 through 4.13 show the results from the screenings for each high-risk
material including the number of each high-risk type caught by each applicable screening
method, the number of each high-risk type missed by each applicable screening method,
the number of false positives given by each screening method, the percentage of high-risk
cargo units caught that went through each security case, and the percentage of the total
number of high-risk goods entering the facility that were caught. The values given are
the averages of the three simulations performed for each percentage screened (10%, 25%,
and 50% for each case). The base case is not included in these results since there are no
technologies incorporated in the base case simulation. These results also do not include
any high-risk units caught by the initial physical inspection.
                               Case 1          Case 2          Case 3          Case 4
                             10% 25% 50%     10% 25% 50%     10% 25% 50%     10% 25% 50%
Expl. Found by X-Ray           3   9  22       2   9  21       3   8  15       5   6  18
Expl. Not Found by X-Ray       1   4  11       2   2   7       2   4   9       1   4   9
Expl. Found by GR              -   -   -       -   -   -       -   -   -       1   3   6
Expl. Not Found by GR          -   -   -       -   -   -       -   -   -       0   2   3
Expl. Found by TD              -   -   -       1   2   6       2   2   7       0   1   2
Expl. Not Found by TD          -   -   -       1   1   1       0   2   2       0   1   1
False Positives                0   1   1       0   0   0       0   0   1       0   0   1
Total Found                    3   9  22       3  11  27       5  10  22       6  10  26
Total Not Found                1   4  11       3   3   8       2   6  11       1   7  13
% Found from Screening        69% 69% 66%     75% 92% 97%     93% 87% 90%    100% 93% 97%
% Found from Screening
  (Case Avg.)                     68%             88%             90%             97%
Total % Found                  5% 14% 35%      5% 19% 43%      8% 15% 37%      9% 16% 43%
Total % Found (Case Avg.)         18%             22%             20%             23%

Table 4.9: Results for Explosives/Explosive Materials Detection
As Table 4.9 shows, the likelihood of discovering explosives increases when more than
one screening method is used to scan for them. In Case 1, where x-ray was the only
method used to scan for explosives, only 68% of the explosives passing through the x-ray
machine were found. In Cases 2 and 3 where a trace detector was introduced, this
statistic increased to around 90%. Case 4 used x-ray, trace detection, and gamma ray to
screen for explosives, and nearly 100% of all explosives going through the security
machines were discovered. The overall percentage of explosives found increased, also,
but the increase was small.
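This layering effect is what independent detectors predict: a threat slips through only if every method misses it, so the combined detection probability is one minus the product of the per-method miss rates. A minimal sketch, with assumed per-method rates chosen only to roughly echo the case averages above:

```python
def combined_detection(probs):
    """Probability that at least one of several independent screening
    methods detects a threat, given per-method detection probabilities."""
    p_miss = 1.0
    for p in probs:
        p_miss *= 1.0 - p
    return 1.0 - p_miss

# Assumed (illustrative) per-method rates for explosives:
XRAY, TRACE, GAMMA = 0.68, 0.60, 0.60
print(round(combined_detection([XRAY]), 2))                # x-ray alone
print(round(combined_detection([XRAY, TRACE]), 2))         # add trace detection
print(round(combined_detection([XRAY, TRACE, GAMMA]), 2))  # add gamma ray
```

Under these assumptions the layered probabilities climb from 0.68 to about 0.87 and 0.95, the same qualitative progression the simulations show from Case 1 through Case 4.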
                               Case 1          Case 2          Case 3          Case 4
                             10% 25% 50%     10% 25% 50%     10% 25% 50%     10% 25% 50%
SG Found by X-Ray              4   7  19       4   8  17       3   8  19       4   7  15
SG Not Found by X-Ray          0   5   9       1   3   7       1   3   7       2   2   8
SG Found by GR                 -   -   -       -   -   -       -   -   -       1   1   5
SG Not Found by GR             -   -   -       -   -   -       -   -   -       0   1   3
False Positives                0   0   0       0   0   0       0   0   0       0   0   0
Total Found                    4   7  19       4   8  17       3   8  19       5   8  20
Total Not Found                0   5   9       1   3   7       1   3   7       2   3  11
% Found from Screening       100% 58% 68%     86% 66% 70%     69% 74% 76%     83% 85% 87%
% Found from Screening
  (Case Avg.)                     75%             74%             73%             85%
Total % Found                  8% 11% 32%      7% 14% 32%      5% 14% 37%     11% 16% 37%
Total % Found (Case Avg.)         17%             17%             19%             21%

Table 4.10: Results for Stolen/Mislabeled Goods Detection
Like Table 4.9, Table 4.10 shows an increased likelihood of discovering units
containing stolen/mislabeled goods when more than one technology is used to scan for
them. Cases 1 through 3 used only x-ray to scan for stolen/mislabeled goods, and the
overall percentage discovered was fairly consistent at 17% to 19%. The percentage
found from the security queue varied only slightly within Cases 1 through 3. When the
gamma ray detector was introduced in Case 4, both percentages increased, but the
increase in the total percentage found was slight – about 3%.
                               Case 1          Case 2          Case 3          Case 4
                             10% 25% 50%     10% 25% 50%     10% 25% 50%     10% 25% 50%
Drugs Found by X-Ray           4  10  18       3   5  16       4   9  18       6   8  13
Drugs Not Found by X-Ray       1   5   7       2   4   6       1   3   7       0   4   7
Drugs Found by GR              -   -   -       -   -   -       -   -   -       0   3   5
Drugs Not Found by GR          -   -   -       -   -   -       -   -   -       0   1   2
Drugs Found by TD              -   -   -       2   3   3       1   2   6       0   1   1
Drugs Not Found by TD          -   -   -       0   1   3       1   1   1       0   0   0
False Positives                0   0   0       0   0   1       0   0   0       0   0   0
Total Found                    4  10  18       5   8  19       5  11  24       6  12  19
Total Not Found                1   5   7       2   5   9       2   5  13       0   5   9
% Found from Screening        86% 69% 73%     89% 86% 85%     86% 95% 95%    100% 100% 98%
% Found from Screening
  (Case Avg.)                     76%             87%             92%             99%
Total % Found                  7% 18% 34%      9% 14% 32%      6% 19% 45%     11% 22% 38%
Total % Found (Case Avg.)         19%             18%             23%             24%

Table 4.11: Results for Illegal Drug Detection
As Table 4.11 shows, increasing the number of technologies used to scan for illegal drugs
increased the percentage caught, as was the case for explosives and stolen/mislabeled
goods. Case 1, which used only an x-ray machine, caught just 76% of all illegal drugs
passing through security, while Cases 3 and 4 caught over 90% of the illegal drugs
passing through security. Using a trace detector or a trace detector coupled with a
gamma ray detector as backup methods for the x-ray machine increased the percentage of
illegal drugs caught that passed through the security case. The increase in the overall
percentage found was slight moving from Case 1 through Case 4, totaling about 5%.
                               Case 1          Case 2          Case 3          Case 4
                             10% 25% 50%     10% 25% 50%     10% 25% 50%     10% 25% 50%
Gases Found by VD              -   -   -       5   8  21       3   9  17       5  10  18
Gases Not Found by VD          -   -   -       3   2   9       1   5   6       3   5   8
False Positives                -   -   -       0   1   0       0   0   0       0   0   0
Total Found                    -   -   -       5   8  21       3   9  17       5  10  18
Total Not Found                -   -   -       3   2   9       1   5   6       3   5   8
% Found from Screening         0%  0%  0%     62% 75% 69%     56% 67% 75%     61% 69% 69%
% Found from Screening
  (Case Avg.)                      0%             68%             66%             67%
Total % Found                  0%  0%  0%      8% 15% 35%      6% 14% 31%      8% 19% 34%
Total % Found (Case Avg.)          0%             19%             17%             20%

Table 4.12: Results for Dangerous Gases Detection
Since only one of the technologies was suited for detecting dangerous gases, no real
increase in the percentage of dangerous gases found can be seen. Incorporation of the
vapor detector resulted in about an 18% detection rate of all dangerous gases entering the
facility and a 67% detection rate of dangerous gases passing through the security case.
                               Case 1          Case 2          Case 3          Case 4
                             10% 25% 50%     10% 25% 50%     10% 25% 50%     10% 25% 50%
Rad. Found by RD               -   -   -       -   -   -       2   9  18       3   7  17
Rad. Not Found by RD           -   -   -       -   -   -       0   3  10       1   3   8
False Positives                -   -   -       -   -   -       0   0   1       0   0   0
Total Found                    -   -   -       -   -   -       2   9  18       3   7  17
Total Not Found                -   -   -       -   -   -       0   3  10       1   3   8
% Found from Screening         0%  0%  0%      0%  0%  0%     92% 74% 63%     50% 69% 64%
% Found from Screening
  (Case Avg.)                      0%              0%             76%             61%
Total % Found                  0%  0%  0%      0%  0%  0%      5% 16% 30%      4% 13% 29%
Total % Found (Case Avg.)          0%              0%             17%             15%

Table 4.13: Results for Radioactive Materials Detection
The radiation detector was only incorporated in Cases 3 and 4, and both cases show
around a 16% overall radiation detection rate. For radioactive substances passing
through security, 76% of them were detected in Case 3, and 61% were detected in Case 4.
Table 4.14 shows the percentage of all high-risk cargo found for each case – for
high-risk cargo just going through security and for all high-risk cargo entering the
facility. Again, the base case is not included in the table since there are no technologies
incorporated in the base case simulation, and these results also do not include high-risk
cargo caught by the initial physical inspection.
                                          Case 1   Case 2   Case 3   Case 4
% of All High-Risk Cargo Found
  from Screening                            44%      63%      79%      82%
Total % of All High-Risk Cargo Found        11%      15%      19%      21%

Table 4.14: Results for Overall High-Risk Cargo Detection
The percentage of high-risk cargo found increased with each case. As more technologies
are added to the security lineup, more high-risk cargo is discovered. The percentages of
high-risk cargo found in Case 4 nearly double the percentages found in Case 1, and Case
3 has percentages only marginally smaller than those in Case 4.
4.3.2: Simulation Significance Testing
Three replications were performed for each screening percentage (i.e., 10%, 25%,
and 50% of cargo screened) within each case, and significance testing based on the 95%
confidence intervals of each set of replications was conducted to discover any significant
differences in the results. Tables 4.15 through 4.19 show the means and the 95%
confidence intervals for all four cases and for all five high-risk types.
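With only three replications per setting, each 95% interval comes from the t-distribution with n - 1 = 2 degrees of freedom (two-sided critical value 4.303). The sketch below computes a mean and half-width from a set of replications and tests two intervals for overlap; the replication values are hypothetical, not the study's data.

```python
import math

T95_DF2 = 4.303  # two-sided 95% t critical value for n - 1 = 2 d.f.

def ci95(values):
    """Mean and 95% half-width for a small sample (t-based)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean, T95_DF2 * math.sqrt(var / n)

def overlap(a, b):
    """True if two (mean, half-width) intervals intersect."""
    return abs(a[0] - b[0]) <= a[1] + b[1]

# Hypothetical percent-found values from three replications of two cases:
case_a = ci95([60.0, 70.0, 77.0])
case_b = ci95([85.0, 92.0, 99.0])
print(case_a, case_b, overlap(case_a, case_b))
```

Overlapping intervals, as in the figures that follow, suggest that the difference between two settings is not statistically significant at this sample size.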
                  % Found from Security         % Found Total
                  Mean   95% Conf. Interval     Mean   95% Conf. Interval
Case 1   10%       69%         67                 5%         7
         25%       69%         41                14%         3
         50%       66%         23                35%        16
Case 2   10%       75%         62                 5%         3
         25%       92%         21                19%        22
         50%       97%         14                43%        10
Case 3   10%       93%         29                 8%         8
         25%       87%         28                15%         4
         50%       90%          5                37%        11
Case 4   10%      100%          0                 9%        17
         25%       94%         15                16%        13
         50%       97%          9                42%        14

Table 4.15: Means and 95% Confidence Intervals for Explosives Detection
                  % Found from Security         % Found Total
                  Mean   95% Conf. Interval     Mean   95% Conf. Interval
Case 1   10%      100%          0                 8%        18
         25%       58%         39                11%         8
         50%       68%         31                32%         9
Case 2   10%       86%         32                 7%         5
         25%       66%         51                14%        13
         50%       70%         24                32%        18
Case 3   10%       69%         75                 6%         5
         25%       74%         37                14%         7
         50%       76%         39                36%         8
Case 4   10%       84%         41                10%        19
         25%       85%          9                16%        13
         50%       87%         11                37%        15

Table 4.16: Means and 95% Confidence Intervals for Stolen Goods Detection
                  % Found from Security         % Found Total
                  Mean   95% Conf. Interval     Mean   95% Conf. Interval
Case 1   10%       86%         62                 7%        10
         25%       70%          8                18%         4
         50%       73%         22                34%        14
Case 2   10%       89%         47                 9%         6
         25%       86%         38                14%         2
         50%       85%         10                32%         9
Case 3   10%       86%         42                 6%        14
         25%       95%         11                19%        12
         50%       95%         14                45%         7
Case 4   10%      100%          0                11%         3
         25%      100%          0                22%        13
         50%       98%          7                38%        14

Table 4.17: Means and 95% Confidence Intervals for Illegal Drugs Detection
                  % Found from Security         % Found Total
                  Mean   95% Conf. Interval     Mean   95% Conf. Interval
Case 1   10%        -           -                 -          -
         25%        -           -                 -          -
         50%        -           -                 -          -
Case 2   10%        -           -                 -          -
         25%        -           -                 -          -
         50%        -           -                 -          -
Case 3   10%       92%         36                 5%         4
         25%       74%         14                16%         6
         50%       63%         25                30%         5
Case 4   10%       50%        100                 4%         9
         25%       70%         52                13%        17
         50%       64%          8                29%         6

Table 4.18: Means and 95% Confidence Intervals for Radioactive Materials Detection
                  % Found from Security         % Found Total
                  Mean   95% Conf. Interval     Mean   95% Conf. Interval
Case 1   10%        -           -                 -          -
         25%        -           -                 -          -
         50%        -           -                 -          -
Case 2   10%       62%         27                 7%         9
         25%       75%         53                15%        21
         50%       69%         21                35%        18
Case 3   10%       55%         96                 6%        16
         25%       67%         15                14%        15
         50%       75%          4                31%         5
Case 4   10%       61%         13                 8%         9
         25%       69%         20                19%         8
         50%       69%         50                34%        30

Table 4.19: Means and 95% Confidence Intervals for Dangerous Gases Detection
Figures 4.1 through 4.6 show the spreads of the 95% confidence intervals for the
Percent Found from Security and the Percent Found Total. Figures 4.1 through 4.3 show
the confidence interval spreads for the percentages screened (10%, 25%, and 50%)
among the four cases for the Percent Found from Security, and Figures 4.4 through 4.6
show the same spreads for the Percent Found Total. Spreads that overlap suggest that
there is not a significant difference in the results. Each figure has a dashed line indicating
the average of the means for the intervals presented (the assumed true mean), through
which most of the intervals should pass. Fewer intersections indicate a more significant
difference in the results.
Figure 4.1: 95% Confidence Interval Spreads for 10% Screening in All Four Cases and for All Five
High-Risk Types for Percent Found from Security
Figure 4.2: 95% Confidence Interval Spreads for 25% Screening in All Four Cases and for All Five
High-Risk Types for Percent Found from Security
(Assumed true means: µ = 78.3 for Figure 4.1; µ = 79.8 for Figure 4.2.)
Figure 4.3: 95% Confidence Interval Spreads for 50% Screening in All Four Cases and for All Five
High-Risk Types for Percent Found from Security (assumed true mean µ = 78.9)
As Figures 4.1 through 4.3 show, the interval overlap for 10% screening is very good,
meaning that there are no real significant differences in the results of 10% screening of
all high-risk cargo among the four cases. The interval overlap for 25% screening is fair,
and there is not as much total overlap as in the 10% screening. It should be noted that the
confidence interval spreads are smaller in the 25% screening compared to the 10%
screening. The interval overlap for the 50% screening is poor. Just over half of the
intervals intersect with the assumed true mean, µ. The confidence intervals are much
smaller for the 50% screening than in the 10% or 25% screenings. This suggests that
there are some significant differences in the results for the 50% screening. These
possible significant differences and large confidence interval spreads likely result from
the randomness of stochastic simulations along with the batching problem inherent in
Arena.
Figure 4.4: 95% Confidence Interval Spreads for 10% Screening in All Four Cases and for All Five
High-Risk Types for Total Percent Found (assumed true mean µ = 7.1)
[Chart: All High-Risk, 25% Screening: 95% Confidence Intervals for % Total Found. One interval spread is plotted per case and high-risk type; the assumed true mean, µ = 15.8, is marked.]

Figure 4.5: 95% Confidence Interval Spreads for 25% Screening in All Four Cases and for All Five High-Risk Types for Total Percent Found

[Chart: All High-Risk, 50% Screening: 95% Confidence Intervals for % Total Found. One interval spread is plotted per case and high-risk type; the assumed true mean, µ = 35.4, is marked.]

Figure 4.6: 95% Confidence Interval Spreads for 50% Screening in All Four Cases and for All Five High-Risk Types for Total Percent Found
Looking at Figures 4.4 through 4.6, the interval overlap for all three screening percentages is good, although a slight decrease in the number of overlapping spreads can be seen in the results for 50% screening. The confidence intervals do not narrow moving from 10% screening up to 50% screening as they did in the results for Percent Found from Security. The results presented in Figures 4.4 through 4.6 do not suggest significant differences within the screening percentages. Although these results do not suggest any real significant differences among the screening percentages, the confidence interval overlaps could be improved and the intervals themselves shortened if the batching problem in Arena had not been a factor.
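The overlap reasoning used here can be made mechanical. A minimal sketch follows, using invented interval endpoints rather than the study's values:

```python
def overlaps(a, b):
    """True if two (low, high) confidence intervals intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def covers(interval, mu):
    """True if an interval contains the assumed true mean mu."""
    return interval[0] <= mu <= interval[1]

# Invented endpoints for two cases at the same screening percentage.
case1 = (5.9, 8.4)
case3 = (6.6, 9.1)
print(overlaps(case1, case3))  # overlapping intervals: no evidence of a difference
print(covers(case1, 7.1))      # interval contains the assumed true mean
```

Overlapping intervals are consistent with no real difference between cases, which is the criterion applied throughout this section.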
4.3.3: Summary
The simulation results show that the average cargo processing time increases as
the percentage of all cargo screened moves from 10% up to 25% and 50%. The average
cargo processing time also increases as more screening technologies are added into the
simulations (especially when looking at the results of Case 4), but this increase is not as
dramatic. Looking at the results from the Percent Total Found, a steady but slight
increase can be seen moving from Case 1 to Case 4 for the detection of explosives,
stolen/mislabeled goods, and illegal drugs. This increase is not seen for radioactive
materials or dangerous gases among the cases since only one technology was used to scan
for these threats. In terms of all the high-risk cargo types found, there is a sizeable
increase in the percentage found for both Percent Found Total and Percent Found from
Security – Case 4 nearly doubles the percentages found in Case 1. Significance testing shows that for the Percent Found Total, there is no evidence to suggest that the results are significantly different. However, this is not the case for the Percent Found from
Security, where the confidence intervals in the 25% and 50% screening suggest that there
could be some significant differences. This is noteworthy since the Percent Found Total
is of greater consequence than the Percent Found from Security. The Percent Found
Total indicates how much of all high-risk cargo entering the facility was discovered and
therefore how effective each case truly is. The Percent Found from Security indicates
how effective the screening technologies are only in terms of the high-risk cargo they
catch based on the high-risk cargo chosen for inspection. The implications of these
results and the impacts of the batching errors in Arena will be discussed in the next
chapter.
CHAPTER 5: CONCLUSIONS AND RECOMMENDATIONS
5.1: Conclusions
5.1.1: Surveys
The airport surveys gave an insightful look into the security practices at the
nation’s airports and the differences in security practices between larger and smaller
airports. Small airports are more likely to invest in newer screening technologies for both
people and cargo, including biometrics and trace detection. For cargo screening, larger
airports rely heavily on x-ray machines and canine inspections. Truck driver verification
is a standard practice at U.S. airports; however, inspection of the trucks themselves varies widely. The most alarming statistic gathered from the surveys is that three quarters of the nation’s airports screen less than 10% of their air cargo, thus validating the need for this project.
5.1.2: Arena Models
One conclusion that can be drawn from the results is that simply increasing the amount of cargo screened catches more high-risk cargo than adding more screening technologies to the security lineup. However, increasing the
variety of screening technologies results in a better chance of catching a variety of
threats. In terms of a recommendation for the case study facility, Case 3 would best serve
the purpose. Although Case 4 found more high-risk cargo both for Percent Found from
Security and Percent Found Total, the significantly higher cost and average processing
time cannot be justified. Case 4 found only a slightly higher percentage of high-risk
cargo than Case 3, and the cost and processing time of Case 3 are comparable to Cases 1
and 2.
The significance testing of the confidence intervals revealed that there may be
some significant differences in the Percent Found from Security, which could be
explained by the error caused when Arena batches two or more entities together to form a
single entity that takes on the properties of the first entity in the batch. Evidence of this
error can be seen in Table 4.8, where the average number of units processed per case does
not uniformly decrease as expected. As more screening methods are added in each case,
the time to process cargo should increase, which should in turn result in fewer cargo units
being processed. Another simulation package may be able to give the expected results.
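The batching error described above can be illustrated with a toy calculation. The sketch below is not Arena code; it simply assumes, per the description above, that a formed batch is processed as a single entity that takes on the first member's service time, and shows how that understates total processing time (and so inflates the number of units a run appears to process).

```python
import random

random.seed(1)  # reproducible illustration

def total_processing_time(service_times, batch_size=1, inherit_first=False):
    """Total time to process all entities.

    With inherit_first=True, each batch is handled as one entity whose
    service time is the first member's -- mimicking the described Arena
    behavior where a batch takes on the first entity's properties.
    """
    total = 0.0
    for i in range(0, len(service_times), batch_size):
        batch = service_times[i:i + batch_size]
        total += batch[0] if inherit_first else sum(batch)
    return total

# 1,000 cargo units with exponentially distributed service times (mean 5 min).
times = [random.expovariate(1 / 5.0) for _ in range(1000)]
unbatched = total_processing_time(times)
batched = total_processing_time(times, batch_size=4, inherit_first=True)
print(f"unbatched: {unbatched:.0f} min, batched: {batched:.0f} min")
```

Because a batch is charged only its first member's time, the batched total is always lower, consistent with the unexpectedly high unit counts in Table 4.8.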
The results show that this framework (albeit with a different simulation package)
can be used to determine how well a particular security case will perform in an individual
facility. With a few minor changes to the simulations and the proper sets of data, the
results of this research can be applied to any individual air cargo facility. Government
organizations, airport authorities and organizations, airlines, and freight forwarders could
benefit from the implementation of this framework for their own security testing purposes
because of its adaptability. The government could also use the results of such security
testing to develop better and more specific legislation to regulate air cargo security. The
setup requires only minor changes to the models in order to properly reflect the facility of
concern, and with the appropriate data, any security screening method may be tested
within the models.
5.2: Recommendations
From the results of the Arena models, Case 3 is recommended as the best case for
the case study facility. Ultimately, the recommended case would depend on the
individual needs and specific security concerns of an airport, airline, freight forwarder, or
government organization. Case recommendations aside, a framework such as the one
presented in this project can be of help to government organizations attempting to create
better air cargo security legislation. Evaluations of screening methods with this
framework can give the government detailed information on what methods or
combinations thereof are feasible for application in an air cargo facility and how effective
they will be at discovering threats. Using this information, the federal government can
enact specific security standards that will ensure the security of air cargo parcels, the
aircraft they travel in, and the facilities that process them.
In order to make the framework outlined in this project more feasible, more
detailed data must be gathered on the processes within a cargo facility, thereby
eliminating the need for assumptions in the simulations. The simulations may be
specifically tailored to a facility to determine the best case for that facility. More
research is also needed on the frequency and duration of canine team visits to cargo facilities.
Again, this information can be used to specifically tailor the simulations to a certain
facility. Also, more research is needed on the efficiency of the screening technologies.
This information is essential to properly determine what type or types of security setups
will work in an air cargo setting. Finally, a different simulation package would serve to
eliminate Arena’s batching errors and give results that are not significantly different.
In the end, the best security setup for an air cargo facility will be a layered
approach consisting of direct cargo screening, perimeter access control, personnel
screening, and security programs such as the known shipper program or C-TPAT adapted
for an air cargo environment. A layered setup ensures multiple back-ups in case of a
failure in one area. Direct cargo screening, the portion of air cargo security explored in
this project, alone will not ensure the security of air cargo, but it is an integral part of the
larger, layered picture, and it can be an important first step in the development of this
layered approach.
REFERENCES
1. ACI-NA Air Cargo Facilities and Security Survey. Michael Webber, Airports
Council International-North America. September 2002.
2. Air Cargo Development Manager at case study airport. Personal interview.
December 4, 2003.
3. “Air Cargo Security,” Transportation Security Administration Q & A from Airports
Council International North America. May 8, 2002. www.aci-na.org.
4. “Air Cargo Strategic Plan,” Transportation Security Administration Press Release.
November 17, 2003.
http://www.tsa.gov/public/interapp/press_release/press_release_0371.xml
5. “Airport Data (5010) and Contact Information,” The Federal Aviation
Administration. April 9, 2004.
http://www.faa.gov/arp/safety/5010/index.cfm?nav=safedata.
6. “Airport Security Report,” PBI Media LLC, Volume 6, Number 5. February 27,
2002.
7. “Airport Security,” Title 49 of the Code of Federal Regulations, Subchapter B, Part
1542. http://www.tsa.gov/public/display?theme=79&content=0900051980096ff5.
8. Armbruster, William. “Breaking News,” Air Cargo World Online. October 2002.
www.aircargoworld.com/break_news/3.htm.
9. AutoMod™ Information Packet, Brooks-PRI Automation. 2003.
10. Aviation and Transportation Security Act, Section 110(f), Public Law 107-071.
Passed by 107th Congress, November 19, 2001.
11. “Aviation Safety: Undeclared Air Shipments of Dangerous Goods and DOT’s
Enforcement Approach,” U.S. General Accounting Office, Presented to U.S. House
of Representatives. January 2003.
12. “Aviation Security: Efforts to Measure Effectiveness and Address Challenges,” U.S.
General Accounting Office. November 2003.
13. “Aviation Security: Vulnerabilities and Potential Improvements for the Air Cargo
System,” U.S. General Accounting Office, Presented for the U.S. Senate. December
2002.
14. “Boeing World Air Cargo Forecast 2002-2003: North America,” Boeing.com. April
2002. www.boeing.com/commercial/cargo/n_america.html.
15. Cargo Sales Manager at case study facility. Personal interview. December 4, 2003.
16. “Cargo VIZ User Manual,” Applied Visualization Center, University of Tennessee,
Knoxville. July 30, 2003. http://viz.utk.edu/projects/tsa/cargoviz/cargo_viz.pdf.
17. Carroll, James R. “Legislation Would Tighten Security Rules for Air Freight,” The
Courier Journal. April 7, 2003.
www.courier-journal.com/localnews/2003/04/07/ke040703s393116.htm
18. “Committee Approves Air Cargo Security Bill,” U.S. Senate Press Release. March
13, 2003. www.senate.gov/~commerce/press/03/2003313730.html.
19. “Container Security: Expansion of Key Programs Will Require Greater Attention to
Critical Success Factors,” U.S. General Accounting Office. July 2003.
20. Dickson, Gordon and Byron Okada. “Gaps cited in air cargo security,” Knight-
Ridder/Tribune News Service. February 27, 2002. InfoTrac
Web, Electronic Collection CJ83335612.
21. “Electronic Supply Chain Manifest Freight ITS Operational Test Evaluation: Final
Report,” U.S. Department of Transportation. December 2002.
http://www.itsdocs.fhwa.dot.gov//JPODOCS/REPTS_TE//13769.html.
22. Harris, David. “The Air Cargo Security Plan,” The International Air Cargo Association. December 17, 2003. http://www.tiaca.org/articles/2003/12/17/388C7D7AAA454B9D86BB24EF50333B66.asp.
23. Holcomb, Henry J. “Security Tightens by Monitoring Ships and Cargo,”
Philadelphia Inquirer. March 19, 2003.
www.centredaily.com/mld/centredaily/news/5428395.htm.
24. Hutchinson, Asa. Under Secretary for Border and Transportation Security, Eno
Transportation Foundation 2004 Leadership Development Conference, Session on
Security: Transforming How We Manage Transportation. May 26, 2004.
25. Jones, Jennifer. “A Moving Target – Border Control and Transportation Safety Apps
Put Biometrics to the Test,” Federal Computer Week. June 23, 2003.
www.fcw.com/fcw/articles/2003/0623/cov-report2-06-23-03.asp.
26. Law, Averill M. and W. David Kelton. Simulation with Arena, 3rd Edition. McGraw-Hill: Boston. 2004.
27. Malkin, Richard. “An Air Cargo Century,” Air Cargo World Online. January 2000.
http://www.aircargoworld.com/archives/feat1jan00.htm.
28. “McGreevey Calls on Fed to Establish Stricter Air Cargo Guidelines,” Press Release, New Jersey Office of the Governor. August 8, 2003. http://www.state.nj.us/cgi-bin/governor/njnewsline/view_article.pl?id-1329.
29. Nice, Karim. “How Air Freight Works,” Howstuffworks.com. September 2002.
www.howstuffworks.com/air-freight.htm.
30. Parezo, Stephen. “Containing Security,” Air Cargo World Online. April 2002.
http://www.aircargoworld.com/archives/features/2_apr02.htm.
31. Partnership Against Terrorism. June 14, 2002.
http://www.cargosecurity.com/ncsc/coac/Non-intrusive.pdf.
32. Perry, Celeste. “Iridian Helps Cut Airport Waits as Security Tightens (Update 1),”
Bloomberg L.P. January 8, 2003.
http://www.biometricgroup.com/in_the_news/bloomberg.html.
33. Porteus, Liza. “Homeland Security Technologies in the Pipeline,” Fox News.
January 14, 2004. http://foxnews.com/story/0,2933,108289,00.html.
34. “R.I.S.K. Alert—Supply Chain Technology Aids Homeland Security,” Transentric
Press Release. March 28, 2003.
www.transentric.com/WhatsNew/pressReleases1.asp?intWhatsNewID=65.
35. TCRP Report 86: Public Transportation Security Volume 2, K9 Units in Public
Transportation: A Guide for Decision Makers, National Academy Press, Washington
D.C. 2002.
36. “Volume 6 – Report on Non-intrusive Detection Technologies,” U.S. Treasury
Advisory Committee on Commercial Operations of The United States Customs
Service Subcommittee on U.S. Border Security Technical Advisory Group and
Customs Trade Partnership Against Terrorism. June 14, 2002. http://www.cargosecurity.com/ncsc/coac/Non-intrusive.pdf.
APPENDIX A: SURVEY MATERIALS
A.1: Air Cargo Operations and Security Survey
University of Virginia Center for Transportation Studies
Air Cargo Operations and Security Survey
Airport Name and Location: ________________________________________________
Airport Code: ____________________________________________________________
Respondent’s Name and Title: _______________________________________________
Address: ________________________________________________________________
________________________________________________________________________
Telephone: _______________________ Email: ________________________________

Note: Answers to all questions will be kept confidential. No identifying information about any individual airport will be included in the results summaries.

Would you like to receive a results summary from this survey?  Yes  No
If yes, please make sure that the address information above is completed.
PART 1: GENERAL AIR CARGO AND AIRPORT INFORMATION
1. How many companies/airlines sort cargo at your airport (either at a central sorting facility or at individual locations on airport property)? ______________
2. How many companies/airlines lease cargo space (for storage, sorting, etc.) at your airport? __________________________________________________________
3. What is the total size (e.g., in square feet, acres, etc.) of your cargo facilities? _________________________________________________________________
4. What is the total size (e.g., in square feet, acres, etc.) of each individual facility?
Facility Number    Size
1
2
3
4
5
6
5. How many aircraft parking positions are there at each cargo facility? Please list the number of positions for each facility.
Facility Number    Number of Aircraft Parking Positions
1
2
3
4
5
6
6. Is sorting conducted: Centrally (for all carriers)? Individually by each carrier?
a. If sorting is central, what time of day does sorting usually occur? Check all that apply.
Morning Evening Midday Late Night
7. How many airport/airline employees work in secure air cargo areas?
____________________________________________________________
8. What percentage of the air cargo flowing through your airport is international cargo?
0-10%   11-20%   21-30%   31-40%   41-50%   51-60%   61-70%   71-80%   81-90%   91-100%
9. What percentage of cargo flowing through your airport comes through as belly freight on passenger planes?
0-10%   11-20%   21-30%   31-40%   41-50%   51-60%   61-70%   71-80%   81-90%   91-100%
10. How is belly cargo from passenger planes transported from the plane to the cargo
facility and vice versa? ____________________________________________ _______________________________________________________________
11. What are the most common types of cargo for: a. Passenger carriers? _________________________________________
_________________________________________________________ b. All-cargo carriers? _________________________________________
_________________________________________________________
12. What are the latest quarterly volumes of cargo flow by weight for your airport (for the past year, or the last year on record)?
Year: ________
Quarter 1 Total Weight: ________
Quarter 2 Total Weight: ________
Quarter 3 Total Weight: ________
Quarter 4 Total Weight: ________
PART 2: CARGO SCREENING, SCANNING, AND SECURITY
13. Of all cargo flowing through your airport, what percentage of it is subject to any sort of security screening, scanning, or inspection process? ________________
14. What screening technologies are currently available and/or used for cargo?
Check all that apply.
Pulsed fast neutron analysis
Thermal neutron activation
Bulk explosives protection systems
Radiation detection
Closed circuit TV
Decompression chambers
Canines
Trace detection
Vapor detection
Gamma ray
X-ray
Other (please list) ________________________________________
15. What type(s) of screening for airport/airline workers is currently used to allow them to enter secure airport areas? Check all that apply.
Badges
Background checks (no. of years checked?) _____________________
Smart cards/swipe cards
Biometrics (fingerprint/face recognition/iris scan)
ID badges with employee picture
Other (please list) _________________________________________

16. Are visitors allowed in the cargo facilities?  Yes  No
a. IF YES, are they subject to any type of screening? Please explain. ___________________________________________________________ ___________________________________________________________ ___________________________________________________________
17. What percentage of cargo coming through the airport is from known shippers?
0-10%   11-20%   21-30%   31-40%   41-50%   51-60%   61-70%   71-80%   81-90%   91-100%
18. Are trucks carrying cargo to or from airport property allowed to enter secured areas of the airport? Yes No
IF YES, go to Question 19. IF NO, skip Question 19 and go to Question 20.
19. Truck clearance and security: a. Must the truck driver obtain clearance for himself as the correct driver?
Yes No Please explain: ____________________________________________ _________________________________________________________ _________________________________________________________
b. How are the trucks themselves cleared?
Manually
Computer/electronic clearance
Other (please list/explain) ________________________________
c. Are trucks inspected for firearms/stolen goods, etc.?
Always   Frequently   Sometimes   Rarely   Never
20. Does international cargo flow through your airport? Yes No IF YES, go to Question 21. IF NO, skip Question 21 and go to Question 22.
21. For international cargo: a. On average, how long does it take for cargo to clear customs? _______
_________________________________________________________
b. In general, for cargo that is selected for screening, is it subject to more stringent screening than domestic cargo? Yes No
22. Are the dumpsters in secure airport areas subject to any type of surveillance? Yes No a. IF YES, what type? (CCTV, guard patrol, etc.) ____________________
___________________________________________________________
23. Do you plan to expand your screening technologies/procedures or make any changes to your facilities in the near future that pertain to air cargo security and safety? If yes, please describe.
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________
24. Would you be interested in allowing a case study to be conducted (for university research purposes) on unit operations of cargo flow through your airport?
Yes No If yes, the researchers will contact you for further information.
For any questions, please contact Carla Rountree by phone (334) 663-5885 or by email ([email protected])
Thank you for your time. Please return:
By Mail: fold and seal survey and send postage-paid
By Fax: (434) 982-2951
By Email: attach the survey to your email and send to [email protected]
A.2: Survey Cover Letter

Center for Transportation Studies
University of Virginia
351 McCormick Rd.
P.O. Box 400742
Charlottesville, VA 22904-4742

July 27, 2003

Dear Sir or Madam:

The University of Virginia Center for Transportation Studies is conducting a study concerning air cargo security risks and countermeasures as a part of an ongoing research project. An important part of this study is the enclosed survey, “Air Cargo Operations and Security Survey,” in which we are asking selected airports to participate and provide information pertinent to the study. All information provided will be strictly confidential. Information about individual airports will not be included in any of the results of the survey or the study itself. The Center for Transportation Studies will also provide a summary of responses to any interested parties.

The survey may be completed in 10 to 15 minutes. The survey may be completed using the enclosed hard copy and returned either by fax at 434-982-2951, or by mail (the enclosed copy may be folded and sealed for ease of return, and postage is paid). An electronic copy of the survey will be emailed to you as well. This version may be returned as a hard copy via the fax number or mailing address above, or by email to Carla Rountree at [email protected]. Please return the survey by your preferred method by September 5, 2003.

Your participation in this study is greatly appreciated.

Sincerely,

Carla Rountree
Center for Transportation Studies
Department of Civil Engineering
University of Virginia
A.3: Survey Follow-up Email

Dear Survey Recipient,

In August the Center for Transportation Studies at the University of Virginia sent you a survey entitled “Air Cargo Operations and Security Survey.” This survey is a very important part of an ongoing study being conducted by the Center for Transportation Studies, and your response is needed. We are extending our return deadline to October 8, 2003 in hopes that you will reconsider completing and returning the survey.
An electronic copy of the survey has been attached and may be returned either by fax (434-982-2951) or by email (send to Carla Rountree at [email protected]). There is also a web-based version of the survey that may be found at www.zoomerang.com/survey.zgi?AQYN9D93L3J855DA7VCWQL5Q for completion online. You may also return the hard copy that was sent to you in August either by fax or by mail (postage is paid, the survey may be folded and dropped in the mail). If you do not wish to participate in this study, please send an email to [email protected], and you will not be contacted any further concerning this survey.
Your participation in this survey is greatly appreciated.

Sincerely,

Carla Rountree
Center for Transportation Studies
Department of Civil Engineering
University of Virginia
A.4: Graphical Results
Question 1: How many companies/airlines SORT cargo at your airport (either at a central sorting facility or at individual locations on airport property)?
(Total)
Response    %      Cumulative %
0           22.2   22.2
1           11.1   33.3
2           11.1   44.4
4           5.6    50.0
5           11.1   61.1
8           5.6    66.7
9           11.1   77.8
11          5.6    83.3
15          5.6    88.9
20          5.6    94.4
25          5.6    100.0
n = 18, mean = 6.88, median = 5

(Sm. Airports)
Response    %      Cumulative %
0           40.0   40.0
1           20.0   60.0
2           20.0   80.0
4           10.0   90.0
9           10.0   100.0
n = 10, mean = 1.9, median = 1

(Lg. Airports)
Response    %      Cumulative %
5           25.0   25.0
8           12.5   37.5
9           12.5   50.0
11          12.5   62.5
15          12.5   75.0
20          12.5   87.5
25          12.5   100.0
n = 8, mean = 12.25, median = 10
[Bar charts for Question 1 (All Airports; Small Airports vs. Large Airports). X-axis: Number of Companies/Airlines Sorting Cargo at Airport (0-5, 6-10, 11-15, 16-20, 21-25); Y-axis: Percentage of Responses.]
Question 2: How many companies/airlines LEASE cargo space (for storage, sorting, etc.) at your airport?

(Total)
Response    %      Cumulative %
0           5.6    5.6
1           11.1   16.7
2           16.7   33.3
4           11.1   44.4
5           5.6    50.0
6           5.6    55.6
8           5.6    61.1
9           11.1   72.2
10          5.6    77.8
13          5.6    83.3
15          5.6    88.9
20          5.6    94.4
25          5.6    100.0
n = 18, mean = 7.56, median = 5.5

(Sm. Airports)
Response    %      Cumulative %
0           10.0   10.0
1           20.0   30.0
2           30.0   60.0
4           20.0   80.0
6           10.0   90.0
9           10.0   100.0
n = 10, mean = 3.1, median = 2

(Lg. Airports)
Response    %      Cumulative %
5           12.5   12.5
8           12.5   25.0
9           12.5   37.5
10          12.5   50.0
13          12.5   62.5
15          12.5   75.0
20          12.5   87.5
25          12.5   100.0
n = 8, mean = 13.1, median = 11.5
[Bar charts for Question 2 (All Airports; Small Airports vs. Large Airports). X-axis: Number of Companies/Airlines Leasing Cargo Space at Airport (0-5, 6-10, 11-15, 16-20, 21-25); Y-axis: Percentage of Responses.]
Question 3: What is the total size (e.g., in square feet, acres, etc.) of your cargo facilities?

(Total)
Response, sq. ft.     %      Cumulative %
0-50,000              33.3   33.3
50,001-100,000        11.1   44.4
100,001-150,000       11.1   55.6
150,001-200,000       11.1   66.7
200,001-250,000       16.7   83.3
250,000 +             16.7   100.0
n = 18, mean = 547,676, median = 142,000

(Sm. Airports)
Response, sq. ft.     %      Cumulative %
0-50,000              60.0   60.0
50,001-100,000        20.0   80.0
100,001-150,000       10.0   90.0
150,001-200,000       10.0   100.0
200,001-250,000       0.0    100.0
250,000 +             0.0    100.0
n = 10, mean = 59,542, median = 37,056

(Lg. Airports)
Response, sq. ft.     %      Cumulative %
0-50,000              0.0    0.0
50,001-100,000        0.0    0.0
100,001-150,000       12.5   12.5
150,001-200,000       12.5   25.0
200,001-250,000       37.5   62.5
250,000 +             37.5   100.0
n = 8, mean = 1,048,135, median = 223,750
[Bar charts for Question 3 (All Airports; Small Airports vs. Large Airports). X-axis: Total Size (sq. ft.) of Cargo Facilities (0-50,000 through 250,001 +); Y-axis: Percentage of Responses.]
Question 4: What is the total size (e.g., in square feet, acres, etc.) of each individual facility?
(Total)
Response, sq. ft.     %      Cumulative %
0-20,000              35.6   35.6
20,001-40,000         27.1   62.7
40,001-60,000         10.2   72.9
60,001-80,000         10.2   83.1
80,001-100,000        8.5    91.5
100,001-120,000       5.1    96.6
120,001-140,000       0.0    96.6
140,001-160,000       1.7    98.3
160,001-180,000       0.0    98.3
180,001-200,000       0.0    98.3
200,000 +             1.7    100.0
n = 16, mean = 115,398, median = 27,274

(Sm. Airports)
Response, sq. ft.     %      Cumulative %
0-20,000              61.9   61.9
20,001-40,000         23.8   85.7
40,001-60,000         4.8    90.5
60,001-80,000         0.0    90.5
80,001-100,000        4.8    95.2
100,001-120,000       0.0    95.2
120,001-140,000       0.0    95.2
140,001-160,000       4.8    100.0
160,001-180,000       0.0    100.0
180,001-200,000       0.0    100.0
200,000 +             0.0    100.0
n = 8, mean = 28,639, median = 17,990

(Lg. Airports)
Response, sq. ft.     %      Cumulative %
0-20,000              21.1   21.1
20,001-40,000         28.9   50.0
40,001-60,000         13.2   63.2
60,001-80,000         15.8   78.9
80,001-100,000        10.5   89.5
100,001-120,000       7.9    97.4
120,001-140,000       0.0    97.4
140,001-160,000       0.0    97.4
160,001-180,000       0.0    97.4
180,001-200,000       0.0    97.4
200,000 +             2.6    100.0
n = 38, mean = 163,344, median = 45,000
[Bar charts for Question 4 (All Airports; Small Airports vs. Large Airports). X-axis: Size (1000's of sq. ft.) of Individual Cargo Facilities (0-20 through 200 +); Y-axis: Percentage of Responses.]
Question 5: How many aircraft parking positions are there at each cargo facility?

(Total)
Response    %      Cumulative %
0-2         48.5   48.5
3-4         27.3   75.8
5-6         12.1   87.9
7-8         3.0    90.9
9-10        3.0    93.9
11 +        6.1    100.0
n = 15, mean = 9, median = 3

(Sm. Airports)
Response    %      Cumulative %
0-2         78.6   78.6
3-4         14.3   92.9
5-6         7.1    100.0
7-8         0.0    100.0
9-10        0.0    100.0
11 +        0.0    100.0
n = 9, mean = 1.4, median = 1

(Lg. Airports)
Response    %      Cumulative %
0-2         31.3   31.3
3-4         25.0   56.3
5-6         25.0   81.2
7-8         6.3    87.5
9-10        6.3    93.8
11 +        6.3    100.0
n = 6, mean = 6.2, median = 3
[Bar charts for Question 5 (All Airports; Small Airports vs. Large Airports). X-axis: Number of Aircraft Parking Positions at Individual Cargo Facilities (0-2, 3-4, 5-6, 7-8, 9-10, 11 +); Y-axis: Percentage of Responses.]
Question 6: Is sorting conducted centrally (for all carriers) or individually by each carrier?

(Total)
Response        %       Cumulative %
Individually    100.0   100.0
Centrally       0.0     100.0
n = 15
*Note: Airports are not separated by size for this question because all airports reported individual sorting.

Question 7: How many airport/airline employees work in secure cargo areas?

(Total)
Response    %      Cumulative %
0-50        41.7   41.7
51-100      25.0   66.7
101-150     16.7   83.3
151-200     0.0    83.3
200 +       16.7   100.0
n = 13, mean = 284, median = 59

(Sm. Airports)
Response    %      Cumulative %
0-50        55.6   55.6
51-100      22.2   77.8
101-150     11.1   88.9
151-200     11.1   100.0
200 +       0.0    100.0
n = 9, mean = 133.4, median = 50

(Lg. Airports)
Response    %      Cumulative %
0-50        0.0    0.0
51-100      25.0   25.0
101-150     50.0   75.0
151-200     0.0    75.0
200 +       25.0   100.0
n = 4, mean = 582.5, median = 137.5
[Figure: Question 6, All Airports. X-axis: Method of Cargo Sorting; Y-axis: Percentage of Responses]
[Figure: Question 7, All Airports. X-axis: No. Airport/Airline Employees Working in Secure Cargo Areas; Y-axis: Percentage of Responses]
[Figure: Question 7, Small Airports vs. Large Airports. X-axis: No. Airport/Airline Employees Working in Secure Cargo Areas; Y-axis: Percentage of Responses]
Question 8: What percentage of the air cargo flowing through your airport is international cargo?

(Total)
Response, %   %      Cumulative %
0-10          60.0   60.0
11-20         0.0    60.0
21-30         20.0   80.0
31-40         13.3   93.3
41-50         0.0    93.3
51-60         0.0    93.3
61-70         0.0    93.3
71-80         0.0    93.3
81-90         0.0    93.3
91-100        6.7    100.0
n = 16, mean = 19, median = 5

(Sm. Airports)
Response, %   %      Cumulative %
0-10          66.7   66.7
11-20         0.0    66.7
21-30         22.2   88.9
31-40         11.1   100.0
41-50         0.0    100.0
51-60         0.0    100.0
61-70         0.0    100.0
71-80         0.0    100.0
81-90         0.0    100.0
91-100        0.0    100.0
n = 9, mean = 10.6, median = 5

(Lg. Airports)
Response, %   %      Cumulative %
0-10          57.1   57.1
11-20         0.0    57.1
21-30         14.3   71.4
31-40         14.3   85.7
41-50         0.0    85.7
51-60         0.0    85.7
61-70         0.0    85.7
71-80         0.0    85.7
81-90         0.0    85.7
91-100        14.3   100.0
n = 7, mean = 24.3, median = 5
[Figure: Question 8, All Airports. X-axis: % of Cargo as International Cargo; Y-axis: Percentage of Responses]
[Figure: Question 8, Small Airports vs. Large Airports. X-axis: % of Cargo as International Cargo; Y-axis: Percentage of Responses]
Question 9: What percentage of cargo flowing through your airport comes through as belly freight on passenger planes?

(Total)
Response, %   %      Cumulative %
0-10          40.0   40.0
11-20         20.0   60.0
21-30         6.7    66.7
31-40         13.3   80.0
41-50         0.0    80.0
51-60         0.0    80.0
61-70         0.0    80.0
71-80         0.0    80.0
81-90         6.7    86.7
91-100        13.3   100.0
n = 15, mean = 30, median = 15

(Sm. Airports)
Response, %   %      Cumulative %
0-10          44.4   44.4
11-20         22.2   66.6
21-30         11.1   77.7
31-40         0.0    77.7
41-50         0.0    77.7
51-60         0.0    77.7
61-70         0.0    77.7
71-80         0.0    77.7
81-90         0.0    77.7
91-100        22.2   100.0
n = 9, mean = 29.4, median = 15

(Lg. Airports)
Response, %   %      Cumulative %
0-10          33.3   33.3
11-20         16.7   50.0
21-30         0.0    50.0
31-40         33.3   83.3
41-50         0.0    83.3
51-60         0.0    83.3
61-70         0.0    83.3
71-80         0.0    83.3
81-90         16.7   100.0
91-100        0.0    100.0
n = 6, mean = 30, median = 25
[Figure: Question 9, All Airports. X-axis: % Cargo as Belly Freight; Y-axis: Percentage of Responses]
[Figure: Question 9, Small Airports vs. Large Airports. X-axis: % Cargo as Belly Freight; Y-axis: Percentage of Responses]
Question 10: How is belly cargo from passenger planes transported from the plane to the cargo facility and vice versa?

(Total)
Response        %      Cumulative %
on trucks       12.5   12.5
tug &/or cart   68.8   81.3
belt loaders    6.3    87.5
pod loaders     6.3    93.8
ramp            6.3    100.0
n = 15

(Sm. Airports)
Response        %      Cumulative %
on trucks       20.0   20.0
tug &/or cart   80.0   100.0
belt loaders    0.0    100.0
pod loaders     0.0    100.0
ramp            0.0    100.0
n = 5

(Lg. Airports)
Response        %      Cumulative %
on trucks       10.0   10.0
tug &/or cart   60.0   70.0
belt loaders    10.0   80.0
pod loaders     10.0   90.0
ramp            10.0   100.0
n = 10
[Figure: Question 10, All Airports. X-axis: Belly Cargo Transport Method from Plane to Facility; Y-axis: Percentage of Responses]
[Figure: Question 10, Small Airports vs. Large Airports. X-axis: Belly Cargo Transport Method from Plane to Facility; Y-axis: Percentage of Responses]
Question 11a: What are the most common types of cargo for passenger carriers?
(Multiple responses were allowed, so percentages do not sum to 100 and cumulative percentages are not applicable.)

(Total)
Response           %
mail               50.0
perishable items   41.7
human remains      8.3
exp. packages      8.3
electronics        8.3
live animals       8.3
medicine           8.3
gaming equip.      8.3
caskets            8.3
n = 12

(Sm. Airports)
Response           %
mail               50.0
perishable items   33.3
human remains      16.7
exp. packages      0.0
electronics        0.0
live animals       0.0
medicine           0.0
gaming equip.      0.0
caskets            16.7
n = 6

(Lg. Airports)
Response           %
mail               50.0
perishable items   50.0
human remains      0.0
exp. packages      16.7
electronics        16.7
live animals       16.7
medicine           16.7
gaming equip.      16.7
caskets            0.0
n = 6
[Figure: Question 11a, All Airports. X-axis: Types of Cargo Transported by Passenger Carriers; Y-axis: Percentage of Responses]
[Figure: Question 11a, Small Airports vs. Large Airports. X-axis: Types of Cargo Transported by Passenger Carriers; Y-axis: Percentage of Responses]
Question 11b: What are the most common types of cargo for all-cargo carriers?
(Multiple responses were allowed, so percentages do not sum to 100 and cumulative percentages are not applicable.)

(Total)
Response           %
manufact. items    9.1
retail merch.      18.2
mail               18.2
exp. packages      63.6
machine parts      9.1
int'l freight      9.1
perishable items   27.3
automobile parts   18.2
electronics        36.4
pharmaceuticals    9.1
furniture          9.1
chemicals          9.1
n = 11

(Sm. Airports)
Response           %
manufact. items    20.0
retail merch.      20.0
mail               0.0
exp. packages      80.0
machine parts      0.0
int'l freight      20.0
perishable items   20.0
automobile parts   0.0
electronics        20.0
pharmaceuticals    0.0
furniture          0.0
chemicals          0.0
n = 5

(Lg. Airports)
Response           %
manufact. items    0.0
retail merch.      16.7
mail               33.3
exp. packages      50.0
machine parts      16.7
int'l freight      0.0
perishable items   33.3
automobile parts   33.3
electronics        50.0
pharmaceuticals    16.7
furniture          16.7
chemicals          16.7
n = 6
[Figure: Question 11b, All Airports. X-axis: Types of Cargo Transported by All-Cargo Carriers; Y-axis: Percentage of Responses]
[Figure: Question 11b, Small Airports vs. Large Airports. X-axis: Types of Cargo Transported by All-Cargo Carriers; Y-axis: Percentage of Responses]
Question 12: What are the latest quarterly volumes of cargo flow by weight for your airport (for the past year, or the last year on record)?

(Total)
Response, tons (Quarter 1)   %      Cumulative %
0-3,000                      38.5   38.5
3,001-6,000                  7.7    46.2
6,001-9,000                  0.0    46.2
9,001-12,000                 0.0    46.2
12,001-15,000                7.7    53.8
15,001-18,000                7.7    61.5
18,001-21,000                15.4   76.9
21,001-24,000                7.7    84.6
24,001-27,000                0.0    84.6
27,000 +                     15.4   100.0
n = 13, mean = 44,056, median = 12,882

(Sm. Airports)
Response, tons (Quarter 1)   %      Cumulative %
0-3,000                      83.3   83.3
3,001-6,000                  16.7   100.0
6,001-9,000                  0.0    100.0
9,001-12,000                 0.0    100.0
12,001-15,000                0.0    100.0
15,001-18,000                0.0    100.0
18,001-21,000                0.0    100.0
21,001-24,000                0.0    100.0
24,001-27,000                0.0    100.0
27,000 +                     0.0    100.0
n = 6, mean = 1,271, median = 886

(Lg. Airports)
Response, tons (Quarter 1)   %      Cumulative %
0-3,000                      0.0    0.0
3,001-6,000                  0.0    0.0
6,001-9,000                  0.0    0.0
9,001-12,000                 0.0    0.0
12,001-15,000                14.3   14.3
15,001-18,000                14.3   28.6
18,001-21,000                28.6   57.1
21,001-24,000                0.0    57.1
24,001-27,000                14.3   71.4
27,000 +                     28.6   100.0
n = 7, mean = 80,730, median = 20,939
[Figure: Question 12, All Airports (Quarter 1). X-axis: Quarter 1 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
[Figure: Question 12, Small Airports vs. Large Airports (Quarter 1). X-axis: Quarter 1 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
(Total)
Response, tons (Quarter 2)   %      Cumulative %
0-3,000                      38.5   38.5
3,001-6,000                  7.7    46.2
6,001-9,000                  0.0    46.2
9,001-12,000                 0.0    46.2
12,001-15,000                7.7    53.8
15,001-18,000                7.7    61.5
18,001-21,000                0.0    61.5
21,001-24,000                15.4   76.9
24,001-27,000                7.7    84.6
27,000 +                     15.4   100.0
n = 13, mean = 44,531, median = 12,845

(Sm. Airports)
Response, tons (Quarter 2)   %      Cumulative %
0-3,000                      83.3   83.3
3,001-6,000                  16.7   100.0
6,001-9,000                  0.0    100.0
9,001-12,000                 0.0    100.0
12,001-15,000                0.0    100.0
15,001-18,000                0.0    100.0
18,001-21,000                0.0    100.0
21,001-24,000                0.0    100.0
24,001-27,000                0.0    100.0
27,000 +                     0.0    100.0
n = 6, mean = 1,096, median = 374

(Lg. Airports)
Response, tons (Quarter 2)   %      Cumulative %
0-3,000                      0.0    0.0
3,001-6,000                  0.0    0.0
6,001-9,000                  0.0    0.0
9,001-12,000                 0.0    0.0
12,001-15,000                14.3   14.3
15,001-18,000                14.3   28.6
18,001-21,000                0.0    28.6
21,001-24,000                28.6   57.1
24,001-27,000                14.3   71.4
27,000 +                     28.6   100.0
n = 7, mean = 81,616, median = 23,295
[Figure: Question 12, All Airports (Quarter 2). X-axis: Quarter 2 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
[Figure: Question 12, Small Airports vs. Large Airports (Quarter 2). X-axis: Quarter 2 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
(Total)
Response, tons (Quarter 3)   %      Cumulative %
0-3,000                      41.7   41.7
3,001-6,000                  0.0    41.7
6,001-9,000                  0.0    41.7
9,001-12,000                 0.0    41.7
12,001-15,000                0.0    41.7
15,001-18,000                8.3    50.0
18,001-21,000                16.7   66.7
21,001-24,000                8.3    75.0
24,001-27,000                8.3    83.3
27,000 +                     16.7   100.0
n = 12, mean = 48,336, median = 17,886

(Sm. Airports)
Response, tons (Quarter 3)   %       Cumulative %
0-3,000                      100.0   100.0
3,001-6,000                  0.0     100.0
6,001-9,000                  0.0     100.0
9,001-12,000                 0.0     100.0
12,001-15,000                0.0     100.0
15,001-18,000                0.0     100.0
18,001-21,000                0.0     100.0
21,001-24,000                0.0     100.0
24,001-27,000                0.0     100.0
27,000 +                     0.0     100.0
n = 5, mean = 421, median = 174

(Lg. Airports)
Response, tons (Quarter 3)   %      Cumulative %
0-3,000                      0.0    0.0
3,001-6,000                  0.0    0.0
6,001-9,000                  0.0    0.0
9,001-12,000                 0.0    0.0
12,001-15,000                0.0    0.0
15,001-18,000                14.3   14.3
18,001-21,000                28.6   42.9
21,001-24,000                14.3   57.1
24,001-27,000                14.3   71.4
27,000 +                     28.6   100.0
n = 7, mean = 82,417, median = 23,638
[Figure: Question 12, All Airports (Quarter 3). X-axis: Quarter 3 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
[Figure: Question 12, Small Airports vs. Large Airports (Quarter 3). X-axis: Quarter 3 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
(Total)
Response, tons (Quarter 4)   %      Cumulative %
0-3,000                      41.7   41.7
3,001-6,000                  0.0    41.7
6,001-9,000                  0.0    41.7
9,001-12,000                 0.0    41.7
12,001-15,000                0.0    41.7
15,001-18,000                8.3    50.0
18,001-21,000                0.0    50.0
21,001-24,000                25.0   75.0
24,001-27,000                8.3    83.3
27,000 +                     16.7   100.0
n = 12, mean = 48,279, median = 18,388

(Sm. Airports)
Response, tons (Quarter 4)   %       Cumulative %
0-3,000                      100.0   100.0
3,001-6,000                  0.0     100.0
6,001-9,000                  0.0     100.0
9,001-12,000                 0.0     100.0
12,001-15,000                0.0     100.0
15,001-18,000                0.0     100.0
18,001-21,000                0.0     100.0
21,001-24,000                0.0     100.0
24,001-27,000                0.0     100.0
27,000 +                     0.0     100.0
n = 5, mean = 421, median = 179

(Lg. Airports)
Response, tons (Quarter 4)   %      Cumulative %
0-3,000                      0.0    0.0
3,001-6,000                  0.0    0.0
6,001-9,000                  0.0    0.0
9,001-12,000                 0.0    0.0
12,001-15,000                0.0    0.0
15,001-18,000                14.3   14.3
18,001-21,000                0.0    14.3
21,001-24,000                42.9   57.1
24,001-27,000                14.3   71.4
27,000 +                     28.6   100.0
n = 7, mean = 82,302, median = 21,634
[Figure: Question 12, All Airports (Quarter 4). X-axis: Quarter 4 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
[Figure: Question 12, Small Airports vs. Large Airports (Quarter 4). X-axis: Quarter 4 Volumes of Cargo (1000's of tons); Y-axis: Percentage of Responses]
Question 13: Of all cargo flowing through your airport, what percentage of it is subject to any sort of security screening, scanning, or inspection process?
(Total)
Response, %   %      Cumulative %
0-10          70.0   70.0
11-20         0.0    70.0
21-30         0.0    70.0
31-40         0.0    70.0
41-50         0.0    70.0
51-60         0.0    70.0
61-70         0.0    70.0
71-80         0.0    70.0
81-90         0.0    70.0
91-100        30.0   100.0
n = 10, mean = 35, median = 10

(Sm. Airports)
Response, %   %      Cumulative %
0-10          70.0   70.0
11-20         0.0    70.0
21-30         0.0    70.0
31-40         0.0    70.0
41-50         0.0    70.0
51-60         0.0    70.0
61-70         0.0    70.0
71-80         0.0    70.0
81-90         0.0    70.0
91-100        30.0   100.0
n = 7, mean = 32, median = 10
[Figure: Question 13, All Airports. X-axis: % Cargo Subject to Screening; Y-axis: Percentage of Responses]
[Figure: Question 13, Small Airports vs. Large Airports. X-axis: % Cargo Subject to Screening; Y-axis: Percentage of Responses]
Question 14: What screening technologies are currently available and/or used for cargo?
(Multiple responses were allowed, so percentages do not sum to 100 and cumulative percentages are not applicable.)

(Total)
Response           %
CCTV               36.4
Trace Detection    27.3
X-Ray              54.5
Canines            45.5
Phys. Inspection   9.1
n = 11

(Sm. Airports)
Response           %
CCTV               66.7
Trace Detection    50.0
X-Ray              83.3
Canines            50.0
Phys. Inspection   16.7
n = 6

(Lg. Airports)
Response           %
CCTV               60.0
Trace Detection    0.0
X-Ray              80.0
Canines            80.0
Phys. Inspection   0.0
n = 5
[Figure: Question 14, All Airports. X-axis: Screening Methods Used for Cargo; Y-axis: Percentage of Responses]
[Figure: Question 14, Small Airports vs. Large Airports. X-axis: Screening Methods Used for Cargo; Y-axis: Percentage of Responses]
Question 15: What type(s) of screening for airport/airline workers is currently used to allow them to enter secure airport areas?
(Multiple responses were allowed, so percentages do not sum to 100 and cumulative percentages are not applicable.)

(Total)
Response            %
Badges              55.6
Background Checks   66.7
Smart/Swipe Cards   50.0
Biometrics          5.6
ID Badges           66.7
Fingerprinting      5.6
Pin Number          5.6
n = 18

(Sm. Airports)
Response            %
Badges              80.0
Background Checks   100.0
Smart/Swipe Cards   80.0
Biometrics          20.0
ID Badges           100.0
Fingerprinting      10.0
Pin Number          0.0
n = 10

(Lg. Airports)
Response            %
Badges              75.0
Background Checks   87.5
Smart/Swipe Cards   75.0
Biometrics          12.5
ID Badges           87.5
Fingerprinting      12.5
Pin Number          12.5
n = 8
[Figure: Question 15, Small Airports vs. Large Airports. X-axis: Screening Methods Used for Cargo Employees; Y-axis: Percentage of Responses]
[Figure: Question 15, All Airports. X-axis: Screening Methods Used for Cargo Employees; Y-axis: Percentage of Responses]
Question 16: Are visitors allowed in the cargo facilities?
(Total)
Response   %      Cumulative %
Yes        62.5   62.5
No         37.5   100.0
n = 16

(Sm. Airports)
Response   %      Cumulative %
Yes        55.6   55.6
No         44.4   100.0
n = 9

(Lg. Airports)
Response   %      Cumulative %
Yes        71.4   71.4
No         28.6   100.0
n = 7
[Figure: Question 16, All Airports. X-axis: Are Visitors Allowed in Cargo Facilities?; Y-axis: Percentage of Responses]
[Figure: Question 16, Small Airports vs. Large Airports. X-axis: Are Visitors Allowed in Cargo Facilities?; Y-axis: Percentage of Responses]
Question 16a: If YES, are they subject to any type of screening?
(Total)
Response   %      Cumulative %
Yes        80.0   80.0
No         20.0   100.0
n = 10
*Type of screening: Authorized Escort (100%)

(Sm. Airports)
Response   %      Cumulative %
Yes        80.0   80.0
No         20.0   100.0
n = 5

(Lg. Airports)
Response   %      Cumulative %
Yes        80.0   80.0
No         20.0   100.0
n = 5
[Figure: Question 16a, Small Airports vs. Large Airports. X-axis: Are Visitors Subject to Screening?; Y-axis: Percentage of Responses]
[Figure: Question 16a, All Airports. X-axis: Are Visitors Subject to Screening?; Y-axis: Percentage of Responses]
Question 17: What percentage of cargo coming through the airport is from known shippers?
(Total)
Response, %   %      Cumulative %
0-10          0.0    0.0
11-20         0.0    0.0
21-30         0.0    0.0
31-40         0.0    0.0
41-50         0.0    0.0
51-60         0.0    0.0
61-70         0.0    0.0
71-80         9.1    9.1
81-90         18.2   27.3
91-100        72.7   100.0
n = 11, mean = 91, median = 95

(Sm. Airports)
Response, %   %      Cumulative %
0-10          0.0    0.0
11-20         0.0    0.0
21-30         0.0    0.0
31-40         0.0    0.0
41-50         0.0    0.0
51-60         0.0    0.0
61-70         0.0    0.0
71-80         14.3   14.3
81-90         28.6   42.9
91-100        57.1   100.0
n = 8, mean = 89.3, median = 95

(Lg. Airports)
Response, %   %       Cumulative %
0-10          0.0     0.0
11-20         0.0     0.0
21-30         0.0     0.0
31-40         0.0     0.0
41-50         0.0     0.0
51-60         0.0     0.0
61-70         0.0     0.0
71-80         0.0     0.0
81-90         0.0     0.0
91-100        100.0   100.0
n = 3, mean = 95, median = 95
[Figure: Question 17, Small Airports vs. Large Airports. X-axis: % Cargo from Known Shippers; Y-axis: Percentage of Responses]
[Figure: Question 17, All Airports. X-axis: % Cargo from Known Shippers; Y-axis: Percentage of Responses]
Question 18: Are trucks carrying cargo to or from airport property allowed to enter secured areas of the airport?
(Total)
Response   %      Cumulative %
Yes        55.6   55.6
No         44.4   100.0
n = 18

(Sm. Airports)
Response   %      Cumulative %
Yes        60.0   60.0
No         40.0   100.0
n = 10

(Lg. Airports)
Response   %      Cumulative %
Yes        50.0   50.0
No         50.0   100.0
n = 8
[Figure: Question 18, All Airports. X-axis: Are Trucks Allowed to Enter Secure Airport Areas?; Y-axis: Percentage of Responses]
[Figure: Question 18, Small Airports vs. Large Airports. X-axis: Are Trucks Allowed to Enter Secure Airport Areas?; Y-axis: Percentage of Responses]
Question 19a: Must the truck driver obtain clearance for himself as the correct driver?
(Total)
Response   %       Cumulative %
Yes        100.0   100.0
No         0.0     100.0
n = 9

Clearance Method
(Total)
Response                %      Cumulative %
Airport ID              15.4   15.4
Biometric reader        7.7    23.1
Authorized escort       23.1   46.2
Prearranged             7.7    53.8
Identity verification   23.1   76.9
Pin number              7.7    84.6
Smart card/badge        15.4   100.0
n = 10

(Sm. Airports)
Response                %      Cumulative %
Airport ID              14.3   14.3
Biometric reader        14.3   28.6
Authorized escort       14.3   42.9
Prearranged             0.0    42.9
Identity verification   28.6   71.4
Pin number              14.3   85.7
Smart card/badge        14.3   100.0
n = 6
[Figure: Question 19a, All Airports (Clearance Methods). X-axis: Clearance Methods for Truck Drivers; Y-axis: Percentage of Responses]
[Figure: Question 19a, All Airports. X-axis: Must Truck Driver Obtain Clearance?; Y-axis: Percentage of Responses]
Question 19b: How are the trucks themselves cleared?
(Total)
Response         %      Cumulative %
Manually         80.0   80.0
Computer/Elec.   10.0   90.0
Other            10.0   100.0
n = 10

(Sm. Airports)
Response         %      Cumulative %
Manually         66.7   66.7
Computer/Elec.   16.7   83.3
Other            16.7   100.0
n = 6

(Lg. Airports)
Response         %       Cumulative %
Manually         100.0   100.0
Computer/Elec.   0.0     100.0
Other            0.0     100.0
n = 4
[Figure: Question 19b, Small Airports vs. Large Airports. X-axis: Clearance Methods for Trucks; Y-axis: Percentage of Responses]
[Figure: Question 19b, All Airports. X-axis: Clearance Methods for Trucks; Y-axis: Percentage of Responses]
Question 19c: Are trucks inspected for firearms/stolen goods, etc.?
(Total)
Response     %      Cumulative %
Always       25.0   25.0
Frequently   12.5   37.5
Sometimes    25.0   62.5
Rarely       12.5   75.0
Never        25.0   100.0
n = 8

(Sm. Airports)
Response     %      Cumulative %
Always       40.0   40.0
Frequently   20.0   60.0
Sometimes    0.0    60.0
Rarely       0.0    60.0
Never        40.0   100.0
n = 5

(Lg. Airports)
Response     %      Cumulative %
Always       0.0    0.0
Frequently   0.0    0.0
Sometimes    66.7   66.7
Rarely       33.3   100.0
Never        0.0    100.0
n = 3
[Figure: Question 19c, Small Airports vs. Large Airports. X-axis: Frequency of Truck Inspections; Y-axis: Percentage of Responses]
[Figure: Question 19c, All Airports. X-axis: Frequency of Truck Inspections; Y-axis: Percentage of Responses]
Question 20: Does international cargo flow through your airport?

(Total)
Response   %      Cumulative %
Yes        61.1   61.1
No         38.9   100.0
n = 18

(Sm. Airports)
Response   %      Cumulative %
Yes        40.0   40.0
No         60.0   100.0
n = 10

(Lg. Airports)
Response   %      Cumulative %
Yes        87.5   87.5
No         12.5   100.0
n = 8
[Figure: Question 20, All Airports. X-axis: Does International Cargo Flow Through Your Airport?; Y-axis: Percentage of Responses]
[Figure: Question 20, Small Airports vs. Large Airports. X-axis: Does International Cargo Flow Through Your Airport?; Y-axis: Percentage of Responses]
Question 21a: For international cargo: On average, how long does it take for cargo to clear customs?
(Total)
Response, hrs   %      Cumulative %
0-1             14.3   14.3
1-2             42.9   57.1
2-3             14.3   71.4
3-4             14.3   85.7
4 +             14.3   100.0
n = 7, mean = 9, median = 2

(Sm. Airports)
Response, hrs   %      Cumulative %
0-1             0.0    0.0
1-2             0.0    0.0
2-3             0.0    0.0
3-4             50.0   50.0
4 +             50.0   100.0
n = 2, mean = 26, median = 26

(Lg. Airports)
Response, hrs   %      Cumulative %
0-1             20.0   20.0
1-2             60.0   80.0
2-3             20.0   100.0
3-4             0.0    100.0
4 +             0.0    100.0
n = 5, mean = 2, median = 2
[Figure: Question 21a, All Airports. X-axis: Time (hrs) for International Cargo to Clear Customs; Y-axis: Percentage of Responses]
[Figure: Question 21a, Small Airports vs. Large Airports. X-axis: Time (hrs) for International Cargo to Clear Customs; Y-axis: Percentage of Responses]
Question 21b: In general, for international cargo that is selected for screening, is it subject to more stringent screening than domestic cargo?
(Total)
Response   %      Cumulative %
Yes        40.0   40.0
No         60.0   100.0
n = 5

(Sm. Airports)
Response   %       Cumulative %
Yes        0.0     0.0
No         100.0   100.0
n = 2

(Lg. Airports)
Response   %      Cumulative %
Yes        66.7   66.7
No         33.3   100.0
n = 3
[Figure: Question 21b, All Airports. X-axis: Is International Cargo Subject to More Screening than Domestic?; Y-axis: Percentage of Responses]
[Figure: Question 21b, Small Airports vs. Large Airports. X-axis: Is International Cargo Subject to More Screening than Domestic?; Y-axis: Percentage of Responses]
Question 22: Are the dumpsters in secure airport areas subject to any type of surveillance?
(Total)
Response   %      Cumulative %
Yes        73.3   73.3
No         26.7   100.0
n = 15

(Sm. Airports)
Response   %      Cumulative %
Yes        66.7   66.7
No         33.3   100.0
n = 9

(Lg. Airports)
Response   %      Cumulative %
Yes        83.3   83.3
No         16.7   100.0
n = 6
[Figure: Question 22, All Airports. X-axis: Are Dumpsters Subject to Surveillance?; Y-axis: Percentage of Responses]
[Figure: Question 22, Small Airports vs. Large Airports. X-axis: Are Dumpsters Subject to Surveillance?; Y-axis: Percentage of Responses]
Question 22a: If YES, what type?
(Total)
Response       %      Cumulative %
CCTV           53.3   53.3
Guard Patrol   46.7   100.0
n = 15

(Sm. Airports)
Response       %      Cumulative %
CCTV           62.5   62.5
Guard Patrol   37.5   100.0
n = 8

(Lg. Airports)
Response       %      Cumulative %
CCTV           42.9   42.9
Guard Patrol   57.1   100.0
n = 7
[Figure: Question 22a, All Airports. X-axis: Type of Surveillance Used for Dumpsters; Y-axis: Percentage of Responses]
[Figure: Question 22a, Small Airports vs. Large Airports. X-axis: Type of Surveillance Used for Dumpsters; Y-axis: Percentage of Responses]