
DEFENSE SYSTEMS MANAGEMENT COLLEGE

SIMULATION BASED ACQUISITION:

A NEW APPROACH

Report of the Military Research Fellows

DSMC 1997-1998

Lieutenant Colonel Michael V. R. Johnson, Sr., USA

Lieutenant Colonel Mark F. McKeon, USMC

Lieutenant Colonel Terence R. Szanto, USAF

December 1998

PUBLISHED BY THE DEFENSE SYSTEMS MANAGEMENT COLLEGE PRESS

FORT BELVOIR, VIRGINIA 22060-5565


DISCLAIMER

This book was produced in the Department of Defense (DoD) school environment in the interest of academic freedom and the advancement of national defense-related concepts. The views expressed in this book are those of the authors and do not reflect the official position or policy of the DoD or those of the United States Government.


ACKNOWLEDGMENTS

This Report is the result of an 11-month Military Research Fellowship program sponsored by the Defense Systems Management College.


NOTICE

Copies of this report may be obtained by writing or faxing:

DEFENSE SYS MGMT COLLEGE
ATTN: AS PR
9820 BELVOIR RD, STE G3
FT BELVOIR, VA 22060-5565

Telephone: (703) 805-4366
DSN: 655-4366
Fax: (703) 805-3726


TABLE OF CONTENTS

Preface
  Objective of Study
  Methodology
  Special Thanks

Chapter 1 – Introduction
  Better Performance
  Faster Schedule
  Cheaper Cost
  Guide to this Report

Chapter 2 – Definitions and Terminology
  M&S Definitions
  Hierarchies and Classes of Models and Simulations
  Virtual and Synthetic
  Life Cycle Cost and Total Ownership Cost
  SBA Vision Statement
  SBA Definition

Chapter 3 – Background
  Guidance
  Previous and On-Going Efforts
  Common Operating Digital Environment (CODE)
  Defense Systems Affordability Council (DSAC)
  Joint SBA Task Force
  Assumptions, Trends, and SBA Enablers

Chapter 4 – Essential Aspects of SBA
  Balancing Requirements
  Numerous Design Alternatives
  The Iterative Design Process
  Converging on an Optimal Solution
  Test as an Integral Part of Design
  Decision Risk Analysis


Chapter 5 – Expanding the SBA Envelope
  Factors Influencing the Use of SBA: Domain, Infrastructure, and Need
  SBA Push and SBA Pull
  SBA Push
  SBA Pull

Chapter 6 – A Future State of SBA
  Needs Generation Phase
  Iterative Design and Development Phase
  Production Phase
  Operations and Sustainment Phase
  Net Result

Chapter 7 – Challenges to Implementing SBA
  Cultural Challenges
  Technical Challenges
  Process Challenges

Chapter 8 – Conclusions and Recommendations
  Conclusions
  Recommendations

APPENDICES

Appendix A – Acronyms and Glossary

Appendix B – Interviews and Personal Contacts

Appendix C – About the Authors

LIST OF FIGURES

Figure 2-1. Hierarchies of Models and Simulations
Figure 2-2. Matrix of Classes of Models and Simulations
Figure 2-3. Hierarchies and Classes of Models and Simulations
Figure 2-4. Three Principal Interacting Communities
Figure 2-5. Application of M&S Across Three Dimensions
Figure 3-1. Growth in Computing Power
Figure 4-1. Affordability: Capability vs. Cost
Figure 4-2. SBA Process
Figure 4-3. Simulate-Emulate-Stimulate Process, Top-Down View
Figure 4-4. Simulate-Emulate-Stimulate Process, Side View


PREFACE

This study presents the combined efforts of three military Research Fellows participating in an 11-month Defense Systems Management College Research Fellowship program sponsored by the Under Secretary of Defense for Acquisition and Technology. In keeping with its role as the center for systems management education in the Department of Defense (DoD), the Defense Systems Management College (DSMC) conducts this annual fellowship program to research a subject of vital interest to the U.S. defense acquisition community.

To achieve program objectives, program managers have long applied modeling and simulation (M&S) tools within the various stages of their programs. Recently, however, declining defense budgets have increased the pressure on the acquisition community to find cheaper ways to develop and field systems. Additionally, the rapid pace of changing world events demands that these materiel solutions get into the hands of the warfighter faster. To meet the increased challenge of budget and time constraints, many programs have radically changed the way they conduct business. These programs recognize the powerful increases in productivity and decreases in cost brought by M&S tools. Within these programs, program management looks to weave M&S applications across program phases and seeks to leverage the strengths of external M&S applications within their own efforts. This new way of doing business, coupling rapid advances in simulation technology with process change, is fueling a new approach to how we acquire defense systems. This new approach is termed Simulation Based Acquisition, or SBA.

Objective of Study

The objective of this book is to convince program managers that SBA is a smarter way of doing business. We will do this by defining SBA, explaining its strengths, and describing the forces that will encourage its use. Where possible, we highlight best practices and useful implementation guidance.

Within the DoD, there are a staggering number of variables that an acquisition program office must evaluate and analyze, and an almost infinite number of possible applications of SBA activities within DoD acquisition programs. Where they apply, we present examples of commercial applications of SBA. Most, however, are narrowly focused and of questionable general use to acquisition program offices. This is partly because acquisition programs within the DoD are unique in their complexity compared with many commercial enterprises. Systems that the DoD produces are usually composed of many varied sub-components that push the boundaries of technology. When these complex sub-components are brought together in the aggregate at the system level, the complexity of the program is compounded. We hope that this “round down range” will stimulate discussion and provide the mark from which to “adjust fire.”


Methodology

We conducted our research in four areas. First, we embarked on basic research and information collection while attending the 12-week residential Program for Management Development (PMD) course at the Harvard University Graduate School of Business. Second, we conducted an extensive search of the applicable literature. Third, we conducted interviews and attended briefings and conferences on Simulation Based Acquisition. Finally, we held two in-process reviews with students attending the DSMC Advanced Program Management Course (APMC), and with the DSMC faculty.

Our initial research began while we were at the Harvard Business School. There, we presented and discussed issues concerning SBA with faculty members and our fellow classmates. As our classmates (157 students from 38 countries) represented both U.S. and international companies, we gained truly global insights into several of the topic areas. We were also fortunate to have a few classmates working for U.S. defense contractors, who provided valuable perspectives on government and industry interrelationships and model sharing. Our discussions centered on applicable business practices and answers to numerous questions raised by our research topic. For example, should program managers be provided incentives to design and develop models and simulations that allow for reuse and/or integration into other programs? Are there applicable business practices, or measures of success/metrics, that could evaluate the effective use of a government program’s modeling and simulation efforts? Where did they see technology going in the near future?

Our second area of research was a comprehensive literature review and Internet search covering the topical areas of modeling and simulation and Simulation Based Acquisition. A particularly useful resource was the SBA Special Interest Group on the Defense Modeling and Simulation Office’s World Wide Web home page (www.dmso.mil), which contains a wealth of historical and current information on the subject, as well as up-to-date links to modeling and simulation organizations and groups that are active on the web.

Our third area of research, conducting interviews and attending briefings and conferences, provided most of our information. We broke our areas of emphasis into three categories: government, defense industry, and commercial industry. To gain insight into government programmatic issues, we visited Service Acquisition Offices, acquisition and test organizations, newly formed program offices, and established program offices. We obtained an understanding of the defense industry’s support to government programs through visits to corporation headquarters and contractor facilities. We visited commercial firms that have been making significant investments in simulation technology. In all, we conducted over 85 interviews (see Appendix B). Some interviews were as short as thirty minutes, while others lasted over the course of three days. Most, however, were three or four hours in duration. The level of the interviewees varied greatly, from Senior Acquisition Officials and Program Managers to individuals tasked with constructing physical wooden mock-ups (used in the verification of virtual models). Though the sources varied, there was a great deal of commonality in the views expressed. We also participated in and attended several SBA conferences and workshops. Each site visited provided unique insights into the collage that is the Simulation Based Acquisition picture.

Our final area of data collection was peer and faculty review at DSMC. Through frank discussions conducted in our office spaces and the use of DSMC’s Management Deliberation Center (a Group Decision Support System), many of the APMC students provided us with an excellent sounding board on the direction and progress of our research. These in-process reviews helped ensure we were addressing the issues most important to the acquisition community concerning Simulation Based Acquisition.

Special Thanks

The Research Fellows extend a special note of appreciation to Ms. Joan Sable, DSMC Military Research Fellowship Coordinator. Ms. Sable ensured that our administrative and logistical requirements were met at DSMC and Harvard, and her support enabled us to concentrate our attention and energies on the research and writing of this report.

For all of their guidance throughout our research project, we extend special thanks to Colonel Kenneth “Crash” Konwin, Director, Defense Modeling and Simulation Office; Ms. Robin Frost, Office of the Secretary of Defense (Director of Test, Systems Engineering, and Evaluation); and Mr. Steve Olson and the other members of the National Defense Industrial Association’s SBA Industry Steering Group.

We appreciate the efforts of the DSMC Press staff for their many hours working on this report to ensure its highest quality. Thanks to the Visual Arts and Press staff for their work on the graphs, charts, and cover page, as well as their many hours in the layout of this report. Finally, we extend a special thank you to Air Force Academy Cadets First Class Paul Ferguson and Nathan Atherley for their research assistance during their summer internship at DSMC.

There are others, too numerous to mention individually, who deserve recognition. The three Research Fellows would like to thank all of those interviewed. As a token of our appreciation, we dedicate this effort to you. May our report be as helpful to you as you were to us.


CHAPTER 1 – INTRODUCTION

The Services and DoD are committed to providing superior weapon systems and materiel to the warfighter faster and at less cost. To achieve this, the acquisition community is charged with providing “better, faster, and cheaper” materiel solutions. Our research shows that the practices and processes of Simulation Based Acquisition (SBA) will help overcome many of the hurdles associated with acquiring “better, faster, and cheaper” solutions.

The phrase “better, faster, and cheaper” is deceptively simple, because executing this task across the Services and DoD is a complex undertaking. As many program offices will attest, this rings true for three reasons. First, the current Government acquisition process, with its oversight requirements and the nature of its funding, is not the most streamlined or efficient of processes. Second, many large Government acquisition programs develop complex systems that push the limits of technology. And third, program complexity is further magnified because most of the materiel solutions produced are not stand-alone products. That is, they are “force multipliers” that must interface with and enhance the combat performance of other systems that are either fielded or in development. Unlike many commercial programs, therefore, the current acquisition processes (indeed, the very solutions themselves) are usually complicated.

To be successful, a program office must balance the complex relationships that exist between “better,” “faster,” and “cheaper.” The degree of balance is not usually measured directly, but it can be measured in terms of the risk in meeting objectives. Risk is a measure of the inability to achieve a program’s defined performance, schedule, and cost objectives. It has two components: the probability of failing to achieve particular performance, schedule, or cost objectives, and the consequences of failing to achieve those objectives.1 SBA can address the first component of risk by increasing the likelihood of producing systems that have “better” performance, “faster” schedule, and “cheaper” cost.
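This two-component view of risk lends itself to a simple quantitative sketch. The short Python fragment below is our own illustration, not part of the report’s method; the objective names, failure probabilities, and consequence scores are hypothetical.

```python
# Illustrative sketch of the two-component risk measure described above:
# exposure = probability of failing an objective x consequence of failing it.
# All names and numbers are hypothetical.

objectives = {
    # objective: (probability of failure, consequence score on a 0-10 scale)
    "performance": (0.30, 8),
    "schedule":    (0.20, 5),
    "cost":        (0.25, 6),
}

for name, (p_fail, consequence) in objectives.items():
    exposure = p_fail * consequence
    print(f"{name:12s} exposure = {exposure:.2f}")

# SBA works on the first component: cheap simulated iterations drive the
# probability of failure down, lowering exposure even for fixed consequences.
```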

Better Performance

No one would advocate providing the warfighter with something that is “almost as good as.” The materiel solutions that the U.S. defense acquisition community produces are the tools of victory on future battlefields. Finding and producing “better” solutions starts with the warfighter’s ability to correctly articulate the requirement. This frames all of the acquisition activities that follow.



Concepts that satisfy stated requirements must then be identified and evaluated. The military has a much harder problem to solve than do the commercial automotive and aviation industries, which usually have a good starting point. And while those industries produce some very complicated products, the basic concepts for commercial products do not deviate greatly from one generation to the next. Follow-on systems within the DoD, on the other hand, are usually radical departures from their predecessors. Take, as an example, air superiority fighters: the F-22 can hardly be viewed as a variant of the F-15. The increased performance demanded of follow-on DoD systems brings with it the increased likelihood that these performance objectives will not be met.

Through the use of modeling and simulation, an SBA process offsets this increased performance risk in three ways. First, it enables the warfighter to become a member of the design team and to influence the design much earlier than the current process allows. Simulation provides the tools for the warfighter to visualize and interact with the system to perform operational analyses and assessments. The design team rapidly incorporates necessary changes into the design based upon this expert input, and the result is a better solution. Second, the SBA process provides rapid feedback to the design team by enabling them to perform “what if” analyses, or iterations, on hundreds of point designs. Rather than building a physical prototype to validate a single point design, designers can use a virtual prototype to look at potentially hundreds of design variations. The rapid feedback and learning from these iterations will enable the program office to produce a better system. Third, not only can we conduct more iterations, but we can also converge on more optimal designs, because we can test many of our assumptions to see if they are true. Furthermore, as the tools for looking at the entire life cycle of the system improve, design teams will be able to move from form and fit issues to the more complex issues of function. This, too, will contribute to better solutions.
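To make the second point concrete, the sketch below shows the kind of “what if” sweep the text describes, evaluating hundreds of point designs against a requirement in seconds. The surrogate performance and cost model, the parameter ranges, and the 700-mile radius threshold are all invented for illustration; a real program would substitute validated engineering models.

```python
# Minimal sketch of a "what if" sweep over hundreds of virtual point designs.
# The surrogate model and all numbers are hypothetical.

def evaluate(wing_area, fuel_fraction):
    """Hypothetical surrogate returning (combat_radius, unit_cost)."""
    radius = 200 + 1800 * fuel_fraction - 0.2 * wing_area   # notional miles
    cost = 30.0 + 0.08 * wing_area + 25 * fuel_fraction     # notional $M
    return radius, cost

best = None
for wing_area in range(300, 501, 10):          # 21 candidate wing areas
    for fuel_pct in range(20, 41):             # 21 fuel fractions -> 441 designs
        radius, cost = evaluate(wing_area, fuel_pct / 100)
        if radius >= 700 and (best is None or cost < best[0]):
            best = (cost, wing_area, fuel_pct, radius)

print("cheapest design meeting the radius requirement:", best)
```

A physical prototype validates one point design; a sweep like this examines 441 of them, which is the feedback-and-learning loop the paragraph describes.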

Faster Schedule

The military goes to war with what is fielded, not with what is on the drawing boards or in the acquisition pipeline. As Lieutenant General Paul J. Kern, Director of the Army’s Acquisition Corps, stated, “The current acquisition process was good for producing systems in the Cold War environment, where we had a predictable enemy with known lead times. Now, many of our foreseeable potential enemies are different; they are not constrained by a rigid, inflexible acquisition process. They can purchase weapon systems and/or sub-components in an open-air market environment, like a global off-the-shelf system. Through mixing and matching various weapon systems and subsystems, they can rapidly generate some very lethal systems. We lose if they can purchase and bring together their systems faster than we can develop ours because of long cycle times.”2

By enabling the acquisition process to get inside the adversary’s decision cycle, an SBA process increases the likelihood of developing and producing systems “faster.” If we can model what we need answers to, we can get rapid feedback through the power of simulation. The program gets the required information much sooner in the process than it does under the current system, translating into faster cycle times. Additionally, we can conduct many more processes concurrently, because more people can access the information they need at the same time. Concurrent engineering activities within an SBA process will compress the cycle time of weapons acquisition. Working with virtual prototypes and digital product descriptions allows program managers to exhibit agility in designing and evaluating concepts. For example, using computational fluid dynamics, the test community can help the design team look at weapons release issues much earlier and at a fraction of the cost of running a wind tunnel test. Virtual manufacturing allows the manufacturing community to look at procedures such as reducing the number of parts in the design, as well as rehearsing production processes. Using three-dimensional solid modeling, shop-floor mechanics can assess the maintainability of the design and make design inputs alongside the engineers, much earlier in the process.

Cheaper Cost

“Cheaper” can be broken down into two areas. The first area involves reducing the initial acquisition cost of systems. This point was illustrated by Norm Augustine two decades ago in one of his famous Augustine’s Laws: “In the year 2054, the entire defense budget will purchase just one tactical aircraft. This aircraft will have to be shared between the Air Force and Navy 3 days each per week.”3

Clearly there is a need to reverse and stabilize the trend in rising initial acquisition costs. Second, “cheaper” also includes reducing the annual costs of operating and sustaining systems up to and including disposal. The foreseeable trend in system life span is that DoD will be living with major systems for longer periods of time. As systems get older, sustainment costs usually rise. To address concerns within these two areas, the Defense Systems Affordability Council, chaired by the Under Secretary of Defense for Acquisition and Technology, recently directed the establishment of ambitious top-level goals for reducing total ownership cost.

Regarding the first area, reducing initial acquisition costs, an SBA process increases the likelihood that a program will stay within its cost objectives during the acquisition phase. The synthetic environment reduces the reliance on costly physical prototypes and tests for making programmatic decisions. Physical tests are primarily done to validate models and generate confidence in the use of the models. These validated models can then be reused to perform numerous design simulations, at a fraction of the cost of one physical test. Simulations also help to focus the test effort on the critical evaluation areas, thereby avoiding unnecessary physical tests. As noted by Air Force Major General Leslie Kenne, Program Manager for the Joint Strike Fighter aircraft, “We haven’t begun to tap all the benefits that modeling and simulation has to offer to reduce our testing requirements.”4

Regarding the second area, reducing sustainment costs, an SBA process increases the likelihood that a program can reduce the annual costs of operating and sustaining systems up to and including disposal. Although it is difficult to find a definitive source for the data, it is frequently mentioned that approximately 70 percent of the total costs of a system are determined by the time a new program receives approval to start (referred to as “Milestone I”), and 85 percent are determined by the time a program’s design is selected (referred to as “Milestone II”).5 These early decisions, which directly affect total ownership cost, are currently being made with limited knowledge of system cost, schedule, and performance implications. It is speculative at best to determine how much of a system’s total ownership cost can be influenced by better design decisions. Certainly, however, as the ability to simulate the entire system’s life cycle continues to improve, so too will the ability to make intelligent tradeoffs on how much to spend now on improvements in manufacturability, reliability, maintainability, and supportability in order to save later. A significant value of SBA is that it allows program offices to begin evaluating the long-term cost impacts of design decisions as part of the design process, rather than relying on engineering change proposals and modifications to fix problems after the design is frozen.
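The rule-of-thumb percentages above can be made concrete with a toy calculation; the total ownership cost figure and the share actually expended are hypothetical.

```python
# Toy illustration of cost *determined* versus cost *spent* at early milestones,
# using the rule-of-thumb percentages cited above. All dollar figures are
# hypothetical.

toc = 10_000  # hypothetical total ownership cost, $M

determined_ms1 = 0.70 * toc   # locked in by requirements and concept choices
determined_ms2 = 0.85 * toc   # locked in once the design is selected
spent_by_ms2 = 0.10 * toc     # hypothetical share actually expended by then

print(f"determined by Milestone I:  ${determined_ms1:,.0f}M")
print(f"determined by Milestone II: ${determined_ms2:,.0f}M")
print(f"spent by Milestone II:      ${spent_by_ms2:,.0f}M (hypothetical)")
# The leverage of SBA lies before Milestone II, while most of the total
# ownership cost is still being determined rather than already fixed.
```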

Guide To This Report

Chapter Two, Definitions and Terminology, lays the foundation for the SBA Vision Statement and Definition that follow.

Chapter Three, Background, presents current policy and guidance, previous studies and conclusions, and some assumptions and trends upon which SBA depends.

Chapter Four, Essential Aspects of SBA, introduces the theoretical practices that support an SBA process.

Chapter Five, Expanding the SBA Envelope, identifies the forces that will generate progress toward implementing SBA. These include the forces internal to a program that will push SBA along, as well as the external pull from the warfighting and resource allocation communities.

Chapter Six, A Future State of SBA, develops a model and vision for what SBA can hope to achieve in the future, by identifying notional SBA components and their interactions.

Chapter Seven, Challenges to Implementing SBA, describes some of the challenges that a program may face as it tries to implement an SBA approach.

Chapter Eight, Conclusions and Recommendations, summarizes our conclusions about the prospects for implementing SBA, and provides our recommendations for making the transition to this new way of acquiring defense systems.

What we present throughout this report we have heard many times from many people. It is our hope that we have presented our research in such a way that their message comes across loud and clear, and that you are convinced that SBA is a better way of doing business.


ENDNOTES

1. Defense Systems Management College, Acquisition Strategy Guide, Third Edition (Fort Belvoir, VA: Defense Systems Management College Press, January 1998).

2. LTG Paul J. Kern, Military Deputy to Assistant Secretary, Army RD&A, interview with authors, 7 May 1998.

3. Norman R. Augustine, “Augustine’s Laws and Major System Development Programs,” Defense Systems Management Review, Volume 2, Number 2, p. 64.

4. Major General Leslie J. Kenne, Program Manager, Joint Strike Fighter Program, remarks at the National Defense Industrial Association SBA Conference, 17 March 1998.

5. John Krouse, Computer Aided Engineering, Onward Press, June 1993, p. 7.


CHAPTER 2 – DEFINITIONS AND TERMINOLOGY

Many within the DoD, the Armed Forces, and industry realize the enormous potential of a Simulation Based Acquisition process, and SBA has recently been the topic of several conferences and workshops. There has been some disagreement and confusion, however, in getting to a common understanding of exactly what SBA is. Certainly SBA is a new way of doing business for acquiring DoD weapon systems, which implies a need to change our processes. And it has been described by the policy and cultural changes necessary to bring about this changed process, the favorable environment necessary to speed it along, as well as the technical impediments to its swift enactment. But what exactly is SBA?

M&S Definitions

First we need to clarify some terminology used throughout this book. (We highly recommend referring to DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, for current definitions in this fast-changing area. It is available on-line on the DMSO World Wide Web home page, mentioned previously in the Preface.) According to DoD 5000.59-M, a model is “a physical, mathematical, or otherwise logical representation of a system, entity, phenomenon, or process.”1 (An entity is “a distinguishable person, place, unit, thing, event, or concept about which information is kept.”2) It is important to note that a model can exist without a single piece of software.3 It can be a hardware mockup or a simple equation on a piece of paper. A simulation is “a method for implementing a model over time.”4 Thus, to tie the two together, simulations are the software that implements models over time, within the context of a scenario.5

A great deal of confusion often results from the common practice of using the terms “modeling” and “simulation” interchangeably, as well as that of using the term “M&S” to stand for both “models and simulations” and “modeling and simulation.”6 In this report we use the term “M&S” to stand for modeling and simulation, which is an analytical problem-solving approach.7 Modeling and simulation, as defined in the DoD M&S Glossary, is “the use of models, including emulators, prototypes, simulators, and stimulators, either statically or over time, to develop data as a basis for making managerial or technical decisions.”8
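The model/simulation distinction is easy to show in a few lines of Python. The falling-body equation below is our own example, not one drawn from the glossary: the model is a single equation, and the simulation implements that model over time within a (trivial) scenario.

```python
# A model is a representation; a simulation implements the model over time.

def velocity_model(t, g=9.8):
    """Model: v = g * t for a body falling from rest (a simple equation)."""
    return g * t

def run_simulation(duration, dt):
    """Simulation: step the model through time within a scenario."""
    t = 0.0
    while t <= duration:
        print(f"t = {t:4.1f} s   v = {velocity_model(t):6.2f} m/s")
        t += dt

run_simulation(duration=3.0, dt=0.5)
```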


Hierarchies and Classes of Models and Simulations

As shown in Figure 2-1, models and simulations have been classified hierarchically according to their level of aggregation. Aggregation is the ability to group entities while preserving the effects of entity behavior and interaction when grouped.9 The four general classifications are Engineering, Engagement, Mission/Battle, and Theater/Campaign.10

Engineering models and simulations are at the lowest level of aggregation, whereas Theater/Campaign models and simulations are at the highest. Within these classifications, models and simulations can vary in their level of detail or representation.

• Engineering level models and simulations provide measures of performance (MOP) concerning such issues as design, cost, manufacturing, and supportability. They can include aerodynamics, fluid flow, acoustics, and fatigue, as well as physics-based models of components, subsystems, and systems.11

• Engagement level models and simulations are used for evaluating the effectiveness of an individual system against another system in one-on-one, few-on-few, and many-on-many scenarios. They provide measures of effectiveness (MOE) at the system-on-system level.12

• Mission/Battle level models and simulations are used for evaluating the effectiveness of a force package, or multiple platforms performing a specific mission. They provide MOE at the force-on-force level.13

• Theater/Campaign level models and simulations are used to evaluate the outcomes of joint and combined forces in a theater or campaign level conflict. They provide measures of value (sometimes referred to as measures of outcome, or MOO) at the highest levels of conflict.14

Figure 2-1. Hierarchies of Models and Simulations
[Pyramid of the four levels, from lowest to highest aggregation: Engineering, Engagement, Mission/Battle, Theater/Campaign]

This classification of models and simulations is significant to acquisition programs, because the type of information required determines the level of aggregation to be used. Solutions to broad issues, such as mission need statements, can be explored with highly aggregated models and simulations. More focused issues, such as the maturing of the design, dictate that the models and simulations move toward and into the engineering category. Programs must, therefore, move up and down this ladder of abstraction to tailor the models and simulations to their needs. For example, highly aggregated simulations can explore the battlefield effects of increasing an aircraft’s combat radius. If the results are promising, the design team could then use engineering level models and simulations to address the associated detailed design issues, such as increasing the aircraft’s internal fuel capacity. There are also additional considerations for tailoring the level of aggregation, such as unnecessary computational burden and the complexity of information management.
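The ladder of abstraction can be sketched as a simple lookup from the breadth of the question to the hierarchy level and its associated measures. The mapping restates the text above; the example questions in the comments are hypothetical.

```python
# Sketch of the "ladder of abstraction": pick the hierarchy level from the
# breadth of the issue. Level data restates the text; examples are hypothetical.

LADDER = [  # ordered from most to least aggregated
    ("Theater/Campaign", "measures of value (MOO)"),
    ("Mission/Battle",   "MOE, force-on-force"),
    ("Engagement",       "MOE, system-on-system"),
    ("Engineering",      "measures of performance (MOP)"),
]

def level_for(issue_breadth):
    """issue_breadth: 0 = broad mission need ... 3 = detailed design issue."""
    name, measures = LADDER[issue_breadth]
    return f"{name} level ({measures})"

print(level_for(0))  # e.g., battlefield effect of a greater combat radius
print(level_for(3))  # e.g., increasing the aircraft's internal fuel capacity
```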

Models and simulations have long been classified into the three classes of Live, Virtual, and Constructive, which attempt to delineate the degrees of human and equipment realism, as shown in the matrix in Figure 2-2.15 Live simulation denotes real people operating real systems; virtual simulation denotes real people operating simulated systems; and constructive models and simulations denote simulated people operating simulated systems. Smart simulation denotes simulated people operating real equipment. The live, virtual, and smart classes are applied to simulations, but not to models, since live models would be humans and actual equipment. The term virtual model would be misleading, since virtual simulations inject humans-in-the-loop to exercise motor control, decision making, or communications skills, and the human element of a virtual simulation is not modeled (the simulated systems in virtual simulations would be made up of constructive models). On the other hand, in constructive simulations, real people stimulate (make inputs to) the constructive models, but the people are not involved in determining the outcomes.17 Hence it is appropriate to have both constructive models and constructive simulations.

Figure 2-2. Matrix of Classes of Models and Simulations

                             Equipment
                        Real           Simulated
  People   Real         Live           Virtual
           Simulated    Smart          Constructive
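Because the four classes follow mechanically from whether the people and the equipment are real or simulated, the matrix can be encoded directly; this small function is simply a restatement of Figure 2-2.

```python
# Direct encoding of the Figure 2-2 matrix: a simulation's class follows from
# whether its people and its equipment are real or simulated.

def classify(people_real: bool, equipment_real: bool) -> str:
    if people_real and equipment_real:
        return "live"
    if people_real:
        return "virtual"        # real people, simulated equipment
    if equipment_real:
        return "smart"          # simulated people, real equipment (named here)
    return "constructive"       # simulated people, simulated equipment

assert classify(True, True) == "live"
assert classify(True, False) == "virtual"
assert classify(False, True) == "smart"
assert classify(False, False) == "constructive"
```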

As noted in the DoD M&S Glossary, the classification of live, virtual, and constructive can be somewhat problematic for two reasons. First, there is no clear delineation between the categories, because the degrees of human involvement and equipment realism are infinitely variable. Second, the bottom right quadrant of Figure 2-2 has not previously been named to indicate the class of simulated people operating real equipment.18 We have named this class of simulations smart simulations.

There is value, however, in marrying the hierarchies and classes of models and simulations together, as shown in Figure 2-3, to show the range of possibilities when discussing M&S support to acquisition. Constructive models and simulations can range from highly aggregated, theater level models and simulations to physics-based, engineering level models and simulations. This range of aggregation can be applied to virtual simulations, and we can envision the same for the obverse of virtual simulations (the “smart” classification). The possible combinations are wide-ranging, including such hybrid combinations as hardware/software in the loop (HW/SWIL), human in the loop (HITL), and wargaming. It is not necessary to package every model or simulation into a particular category; the utility is in exploring the numerous possible hybrid combinations available and tailoring them to the requirement.

Figure 2-3. Hierarchies and Classes of Models and Simulations
[The four hierarchy levels (Engineering, Engagement, Mission/Battle, Theater/Campaign) crossed with the Virtual, Constructive, and Smart classes, with Live events shown in a validation role]

While the classification of live simulation was useful for the training community, the concept of real people and real equipment takes on a different purpose for the acquisition environment. In the training vernacular of “all but war is simulation,” real people and real equipment are still simulations of the real event of war. For the acquisition of systems, there are two roles for the classification of real people and real equipment. The first role is testing that which cannot be adequately depicted in the synthetic environment, because of insufficient knowledge or technology. The data obtained from these tests could be used for future model development. The second role is to validate the models and resultant simulations. The “live” events in these two roles can range from component level tests to large-scale physical prototypes, but in each case their purpose is to validate. The first role validates either a concept or design, whereas the second role validates a model or simulation.

Virtual and Synthetic

When discussing M&S and SBA, it is important to have a common understanding of the word “virtual.” In the most general sense, virtual “refers to the essence or effect of something, not the fact.”19 In practice, the word virtual is usually paired with another word, as in virtual battlespace, virtual reality, and virtual prototype. Virtual battlespace is defined as “the illusion resulting from simulating the actual battlespace.”20 Virtual reality is “the effect created by generating an environment that does not exist in the real world. Usually, [virtual reality is] a stereoscopic display and computer-generated three-dimensional environment which has the effect of immersing the user in that environment. This is called the immersion effect. The environment is interactive, allowing the participant to look and navigate about the environment, enhancing the immersion effect. Virtual environment and virtual world are synonyms for virtual reality.”21 Synthetic environments, on the other hand, are defined as “... simulations that represent activities at a high level of realism, from simulations of theaters of war to factories and manufacturing processes. These environments may be created within a single computer or a vast distributed network connected by local and wide area networks and augmented by super-realistic special effects and accurate behavioral models. They allow visualization of and immersion into the environment being simulated.”22 Finally, a virtual prototype is “a model or simulation of a system placed in a synthetic environment, and used to investigate and evaluate requirements, concepts, system design, testing, production, and sustainment of the system throughout its life cycle.”23

Life Cycle Cost and Total Ownership Cost

The concepts of life cycle cost (LCC) and total ownership cost (TOC) also figure prominently when discussing SBA. The term TOC often appears to be replacing the term LCC. TOC is the totality of costs associated with the Department of Defense, including the costs related to weapon systems, whereas LCC is the totality of costs over time related to developing, acquiring, operating, supporting, and disposing of weapon systems.24 The Secretary of Defense, William S. Cohen, states in his Annual Report to the President and the Congress that “Total ownership cost is the sum of all financial resources necessary to organize, equip, and sustain military forces sufficient to meet national goals in compliance with all laws; all policies applicable to DoD; all standards in effect for readiness, safety, and quality of life; and all other official measures of performance for DoD and its components.”25 When viewed this way, LCC inherently refers to that subset of TOC that has to do with weapon systems.

In practice, however, TOC is often used in a weapon system context, in which case TOC and LCC appear to be synonymous terms, as they both cover all costs to research, develop, acquire, own, operate, and dispose of a weapon system for a specific (or assumed) number of years. However, TOC as applied to weapon systems seems to imply a greater effort to capture more of the costs covered by both definitions than has heretofore been practiced routinely in DoD under the banner of LCC.26

For example, if a new system or concept will require additional security forces as part of its operation, then the life cycle cost of this support would most likely be factored into the TOC of the system or concept, but would probably not have been included under LCC. There are many ongoing debates on this subject, which we do not intend to resolve here. In this book we use the more encompassing term, TOC.
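The relationship between the two terms reduces to simple addition, as in the sketch below; every cost element and dollar figure is hypothetical, including the security-forces example carried over from the paragraph above.

```python
# Sketch of the LCC/TOC distinction. All cost elements and numbers are
# hypothetical, in $M.

lcc_elements = {   # weapon-system costs over its life (LCC)
    "develop": 1_200, "acquire": 4_800, "operate_support": 6_500, "dispose": 300,
}
toc_only_elements = {   # broader ownership costs captured by TOC but
    "additional_security_forces": 400,   # traditionally missed under LCC
}

lcc = sum(lcc_elements.values())
toc = lcc + sum(toc_only_elements.values())
print(f"LCC = ${lcc:,}M   TOC = ${toc:,}M")  # TOC is the more encompassing term
```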

SBA Vision Statement

For many years, the training community has leveraged the strengths of M&S to augment live training. As noted by General Richard Hawley, Commander of Air Combat Command, “…[M&S] has been a key part of our training for many years, but…we’ve never fully exploited the contributions that modeling and simulation can make to our readiness programs.”27 Similarly, although the acquisition community has also made M&S a key part of the systems acquisition process for many years, it too is beginning to realize that the benefits of M&S haven’t been fully exploited. In his 16 March 1998 memorandum, the Honorable Jacques S. Gansler, Under Secretary of Defense (Acquisition and Technology), stated, “As we undergo changes in our defense acquisition infrastructure, let me take this opportunity to firmly state my commitment to the use of M&S in the acquisition of our weapons systems…. Therefore, it is essential that we plan for the use of M&S in our acquisition strategies. I expect programs to make the upfront investment in M&S application and technology and will be looking for evidence of that investment in program planning and execution.”28 Dr. Gansler goes on to note additional steps he is endorsing for the SBA initiative to capitalize on the current efforts in M&S.

The DoD, in collaboration with industry, developed a vision statement for SBA that envisions an acquisition process in which DoD and industry are enabled by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. The goals of SBA are to:

1. Substantially reduce the time, resources, and risk associated with the entire acquisition process;

2. Increase the quality, military worth, and supportability of fielded systems, while reducing total ownership costs throughout the total life cycle; and

3. Enable Integrated Product and Process Development (IPPD) across the entire acquisition life cycle.29


SBA Definition

This vision statement promotes discussions about SBA, and most people basically agree with its intent. However, an expanded definition of SBA is instructive:

Simulation Based Acquisition is an iterative, integrated product and process approach to acquisition, using modeling and simulation, that enables the warfighting, resource allocation, and acquisition communities to fulfill the warfighter’s materiel needs, while maintaining Cost As an Independent Variable (CAIV) over the system’s entire life cycle and within the DoD’s system of systems.

A discussion of each of the definition’s critical elements follows.

“Simulation Based Acquisition is an iterative, integrated product and process approach to acquisition”: SBA enables Integrated Product and Process Development (IPPD) teams to converge on optimal solutions by balancing requirements through an iterative design process. DoD and contractor organizations work internally and with each other as an integrated team effort.30 IPPD used to mean that we put a logistician on the design team to translate his maintenance and support knowledge into meaningful design guidelines. Now IPPD means arming the hands-on experts with the tools to rapidly see the results of their design inputs, thereby making them a part of the design team. For example, using three-dimensional “solid” models, the F-22 aircraft program enabled two mechanics to inject maintainability changes into the design tradeoffs, because they were able to visualize the system much earlier in the process.

“…using modeling and simulation”: M&S activities make SBA possible. Stepping into the synthetic environment enables exercising the power of simulation. Within the same time frame, many more analytical excursions can be made with virtual designs than would be possible with physical prototypes. The increased level of user involvement, coupled with compressed time and space feedback, leads to better learning and problem solving. This is far superior to the traditional approach, which is driven by real experiences using physical prototypes.31 M&S facilitates the team and increases communication, making team members more effective.

“…the warfighting, resource allocation, and acquisition communities”: SBA is more than just acquisition. SBA helps to link the DoD’s three principal decision support systems: the Requirements Generation System; the Planning, Programming and Budgeting System; and the Acquisition Management System.32 Figure 2-4 shows the relationship of these three communities. The name of the community representing the Requirements Generation System is changed to the more expansive term of warfighting community, which includes requirements personnel, operators, maintainers/sustainers, and trainers. The resource allocation community is also an integral part of the acquisition process, as it allocates the program budgets (subject to approval by OSD and Congress) and has the burden of balancing the Services’ and DoD’s budgets. The acquisition community, as used here, includes both government and industry agencies involved in developing and fielding military systems.

“…to fulfill the warfighter’s materiel needs while maintaining Cost As an Independent Variable (CAIV)”: indicates the use of a strategy that balances mission needs with projected out-year resources. We are now saying that we will trade performance in order to achieve our cost objective. By linking improved cost models with our computer-aided engineering tools, we will be able to better predict the costs of different alternatives, so as to make better informed tradeoff analyses.
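Treating cost as the independent variable inverts the usual selection logic: fix the cost objective first, then trade performance among the alternatives that fit under it. The sketch below illustrates this; the alternative designs, costs, and performance scores are hypothetical.

```python
# Sketch of a CAIV-style selection: hold the cost objective fixed and trade
# performance among alternatives that meet it. All data are hypothetical.

alternatives = [
    # (name, predicted unit cost in $M, predicted performance score 0-1)
    ("design A", 48, 0.92),
    ("design B", 41, 0.88),
    ("design C", 35, 0.71),
]

cost_objective = 45  # $M, set up front and held fixed

affordable = [a for a in alternatives if a[1] <= cost_objective]
selected = max(affordable, key=lambda a: a[2])
print("selected under CAIV:", selected)  # design B: best performance within cost
```

Design A would win a pure performance competition, but under CAIV it is excluded at the outset because it exceeds the cost objective.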

“…over the system’s entire life cycle” means to look both within and across all phases of the program, as early as possible during the acquisition of the system. It encompasses the collaborative use of simulation beyond the traditional performance issues to address the system’s entire life cycle cost issues during the design, to include manufacturability, supportability, lethality, sustainability, mobility, survivability, flexibility, interoperability, reliability, and affordability.

“…and within the DoD’s system of systems”: signifies fully exploring the system’s interaction within, and impact upon, the DoD’s system of systems, to capture the desire for effective total systems integration, as well as the collaborative use of M&S across programs. As the United States participates in greater numbers of combined operations, this aspect of the definition will have to be expanded to include a “system of systems” look across allied systems and programs as well.

Figure 2-4. Three Principal Interacting Communities
[Triangle linking the Warfighting, Resource Allocation, and Acquisition communities]

Figure 2-5 graphically illustrates the three dimensions that SBA attempts to integrate. First, the vertical dimension is where M&S has traditionally been applied: within each program phase, without much regard to reuse later in the program. The second dimension SBA attempts to integrate is the horizontal application of M&S across the phases of the program. More than just sticking with and growing models over the life of a program, this means addressing the entire life cycle’s issues as early as possible during the design. The acquisition community must stay focused on the real goal, which is to produce superior weapon systems, not superior virtual prototypes. The key to success is to get the design right before building the system, after which, in effect, the design becomes frozen. This horizontal application across phases of the program indicates an attempt to explore the cost drivers across the entire life cycle of the system and, when it makes sense, to address those issues in the design. For example, more reliable systems can translate into better combat effectiveness, as well as potential manpower decreases for maintenance, a decreased mobility footprint, and reduced sparing levels. Acquiring good cost figures for these types of operations and support costs across the expected life of the system will help determine how much should be spent on reliability during system design. After production, as we continue to refine our system models with field data and operator input, the simulations will help us decide whether we need to modify or build a new system, as well as consider non-materiel solutions. Looking at the furthest point in the life cycle, there may even be disposal costs that could be mitigated by changes in the design. While the importance of looking at disposal costs may seem somewhat far-fetched, it is interesting to note that the Army spends about $100 million a year demilitarizing ammunition, of which $13 million alone is for stocks from World War I.33

If we don’t start looking at disposal costs during the design, we may find that in the future we won’t be able to afford to buy the next-generation system, because we couldn’t afford to dispose of the current one.
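Both the reliability and the disposal arguments above reduce to the same present-value comparison: spend more in design now if the discounted downstream savings exceed the upfront cost. The sketch below shows that comparison; every input (investment, annual savings, service life, discount rate) is hypothetical.

```python
# Sketch of the design-phase tradeoff described above: invest in reliability
# (or design-for-disposal) if discounted life cycle savings exceed the cost.
# All inputs are hypothetical.

upfront_investment = 120.0   # $M added to the design phase
annual_saving = 15.0         # $M/yr in reduced O&S (or avoided disposal) cost
service_life_years = 25
discount_rate = 0.04

pv_savings = sum(annual_saving / (1 + discount_rate) ** year
                 for year in range(1, service_life_years + 1))

print(f"present value of downstream savings: ${pv_savings:.0f}M")
print("invest in the design change" if pv_savings > upfront_investment
      else "do not invest")
```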

Figure 2-5. Application of M&S Across Three Dimensions
[Axes: within phase; within program and across phases (Milestone 0 through Milestone III); across programs]


Finally, the third dimension SBA attempts to integrate is across programs. If we continue to maximize each system alone, the overall system of systems will be less than optimal. Programs need to recognize the value of the interaction of their system within the overall system of systems. Individual programs cannot afford to build everything themselves; they need to rely on each other for their similar needs. There are few design tradeoff analyses being conducted within the Services between their own programs, much less between different Services’ programs. Currently, the Joint Requirements Oversight Council (JROC) reviews all Mission Need Statements (MNS) for joint applicability, and follows up on major programs before milestone reviews. To do these tradeoffs effectively, however, they have to be made at a level much lower than the JROC. The capability to do these tradeoff analyses will begin to extend beyond the traditional interface control procedure method, which allows each individual system to be optimized, but which results in the overall system of systems’ performance being suboptimized. Failure to look at the big picture can also result in over-designing systems as well as unnecessary duplication of effort, both of which waste resources.

When we begin to look outside a single system, we see even fewer interactions of these issues in the context between systems, such as the maintenance and logistics considerations between the next generation amphibious assault vehicle and the ships that will transport it. We see this happening even today within large, integrated weapon systems such as submarines and aircraft, where the major sub-components (sonars, torpedoes, missiles, munitions, etc.) are designed and developed independently and integrated later, after each has been built. The design interface is controlled by an interface control document (ICD), with little to no concurrent design tradeoffs possible across the ICD.


ENDNOTES

1. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

2. Ibid.

3. Defense Modeling and Simulation Office (DMSO), Verification, Validation & Accreditation (VV&A) Recommended Practices Guide (November 1996), Chapter 1 Overview, pp. 1-4.

4. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

5. DMSO, VV&A Recommended Practices Guide.

6. Ibid.

7. Ibid.

8. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

9. Aggregation definition from DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

10. L.K. Piplani, J.G. Mercer, and R.O. Roop, Systems Acquisition Manager’s Guide for the Use of Models and Simulations, Report of the Defense Systems Management College 1993-1994 Military Research Fellows, DSMC Press, September 1994.

11. Piplani et al., ibid.

12. Ibid.

13. Ibid.

14. Ibid.

15. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

16. Ibid.

17. Ibid.

18. Ibid.

19. Ibid.

20. Ibid.

21. Ibid.

22. Ibid. The authors have removed the word “internetted” from the beginning of this definition, as it is not necessary to be connected to the Internet in order to have a synthetic environment.

23. Ibid.

24. Dan Fink; Deputy, Acquisition Logistics Integration; Deputy Chief of Naval Operations (Logistics), OPNAV N4; interview by authors, 8 July 1998.

25. William S. Cohen, Secretary of Defense, Annual Report to the President and the Congress, 1998.

26. Fink, ibid.

27. General Richard E. Hawley, Commander, Air Combat Command, keynote address at the 19th Interservice/Industry Training, Simulation and Education Conference, 2 December 1997, Orlando, FL.

28. Dr. Jacques S. Gansler, Under Secretary of Defense (Acquisition & Technology), “M&S in Defense Acquisition,” memorandum, 16 March 1998.

29. DoD Acquisition Council of the Executive Council on Modeling and Simulation, 5 December 1997.

30. DoD Guide to Integrated Product and Process Development, Office of the Under Secretary of Defense (Acquisition and Technology), February 5, 1996.

31. Stefan Thomke, “Simulation, Learning and R&D Performance: Evidence from Automotive Development,” Research Policy 27 (1) (May 1998), 55-74.

32. CJCSI 3170.01, Requirements Generation System (formerly MOP 77), 13 June 1997.

33. Kern, ibid.


CHAPTER 3 – BACKGROUND

There are numerous studies, papers, and program examples that highlight the benefits M&S can provide in technical risk reduction, time savings, direct cost savings and/or cost avoidance, and management risk reduction. Since many of these documents are continually being refined (particularly in the dynamic growth area of M&S), it is important to view them first-hand for possible changes.1 For example, the Army recently coined the phrase Simulation and Modeling for Acquisition, Requirements and Training, or SMART, to describe its initiative for SBA.2

This chapter presents current policy and guidance, previous studies and conclusions, and some assumptions and trends upon which SBA depends. Thus far, the only reference to Simulation Based Acquisition that appears in the Defense Acquisition Deskbook is Dr. Gansler’s 16 March 1998 memorandum, previously mentioned in Chapter Two. We expect this will soon change, however, as the acquisition community continues to implement this “smarter” way of doing business. The following synopses are a quick walk-through of recent information relevant to SBA.

Guidance

DoD 5000.1, “Defense Acquisition”

This directive encourages the use of M&S by stating that “models and simulations shall be used to reduce the time, resources, and risks of the acquisition process and to increase the quality of the systems being acquired. Representations of proposed systems (virtual prototypes) shall be embedded in realistic, synthetic environments to support the various phases of the acquisition process, from requirements determination and initial concept exploration to the manufacturing and testing of new systems, and related training.”3

DoD 5000.2-R, “Mandatory Procedures for MDAPs and MAIS Acquisition Programs”

This directive mandates that “accredited modeling and simulation shall be applied, as appropriate, throughout the system life cycle in support of the various acquisition activities: requirements definition; program management; design and engineering; efficient test planning; result prediction; and to supplement actual test and evaluation; manufacturing; and logistics support. [Program Managers] shall integrate the use of modeling and simulation within program planning activities, plan for life cycle application, support, and reuse of models and simulations, and integrate modeling and simulation across the functional disciplines.”4

Regarding test and evaluation, it further states that “modeling and simulation shall be an integral part of test and evaluation planning…Test and evaluation programs shall be structured to integrate all developmental test and evaluation (DT&E), operational test and evaluation (OT&E), live-fire test and evaluation (LFT&E), and modeling and simulation activities conducted by different agencies as an efficient continuum. All such activities shall be part of a strategy to provide information regarding risk and risk mitigation, to provide empirical data to validate models and simulations, to permit an assessment of the attainment of technical performance specifications and system maturity, and to determine whether systems are operationally effective, suitable, and survivable for intended use.”

For operational test and evaluation, it states that “The use of modeling and simulation shall be considered during test planning. Whenever possible, an operational assessment shall draw upon test results with the actual system, or subsystem, or key components thereof, or with operationally meaningful surrogates. When actual testing is not possible to support an operational assessment, such assessments may rely upon computer modeling, simulations (preferably with real operators in the loop), or an analysis of information contained in key program documents. However, as a condition for proceeding beyond LRIP, initial operational test and evaluation shall not comprise an operational assessment based exclusively on computer modeling; simulation; or, an analysis of system requirements, engineering proposals, design specifications, or any other information contained in program documents (10 USC 2399). The extent of modeling and simulation usage in conjunction with operational test and evaluation shall be explained in the Test and Evaluation Master Plan….”7

For Live Fire Test and Evaluation on other than ACAT IA programs, it states: “…Alternatively, in the case of a covered system (or covered product improvement program for a covered system), the USD(A&T) or the CAE [Component Acquisition Executive] may waive the application of the required survivability and lethality tests and instead allow testing of a system or program by firing munitions likely to be encountered in combat at components, subsystems, and subassemblies, together with performing design analyses, modeling and simulation, and analysis of combat data in lieu of testing the complete system configured for combat.”8

Regarding the four key systems engineering tasks of requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control, it states that “The verification of the design shall include a cost-effective combination of design analysis, design modeling and simulation, and demonstration and testing.…”9

DoD Directive 5000.59, “DoD Modeling and Simulation (M&S) Management”

The purpose of this directive is threefold. First, it establishes DoD policy, assigns responsibilities, and prescribes procedures for the management of M&S. Second, it establishes the DoD Executive Council for Modeling and Simulation (EXCIMS). Third, it establishes the Defense Modeling and Simulation Office (DMSO).


It directs that it is DoD policy that “Investments shall promote the enhancements of DoD M&S technologies in support of operational needs and the acquisition process; develop common tools, methodologies, and databases; and establish standards and protocols promoting the internetting, data exchange, open system architecture, and software reusability of M&S applications. Those standards shall be consistent with current national, Federal, DoD-wide and, where practicable, international standards for open systems… The DoD Components shall establish verification, validation, and accreditation (VV&A) policies and procedures for M&S applications managed by the DoD Component. The ‘DoD M&S Executive Agent’ shall establish VV&A procedures for that application…. M&S applications used to support the major DoD decision making organizations and processes (such as the Defense Planning and Resources Board; the Defense Acquisition Board; the Joint Requirements Oversight Council; and the DoD Planning, Programming, and Budgeting System)… shall be accredited for that use by the DoD Component for its own forces and capabilities. Each DoD Component shall be the final authority for validating and accrediting representations of its own forces and capabilities in joint and common use M&S. Each Component shall be responsive to the other Components to ensure that its forces and capabilities are appropriately represented in the development of joint and common use M&S.”10

DoD 5000.59-P, “Modeling and Simulation (M&S) Master Plan”

This plan establishes numerous M&S activities. First, it establishes the DoD vision for DoD M&S and outlines a strategy for achieving future DoD M&S-based capabilities. Second, it assigns M&S implementation responsibilities and provides guidelines for the development, cooperation, and coordination of DoD M&S efforts. Third, it establishes DoD M&S objectives, identifies actions, and, where possible, assigns responsibilities for accomplishing them. Fourth, it provides a basis for developing supporting plans and programs, including the DoD Modeling and Simulation Investment Plan (MSIP) and the DoD Components’ M&S master and investment plans. Fifth, it provides justification for resource allocations to M&S within DoD Component programming and budgeting processes and fosters the integration of the defense and civilian M&S bases into a unified national and international base using common standards, processes, and methods.11

DoD Manual 5000.59-M, “DoD Modeling and Simulation (M&S) Glossary”12

This manual prescribes a uniform glossary of M&S terminology for use throughout the Department of Defense. In addition to the main glossary of terms, it includes a list of M&S-related abbreviations, acronyms, and initials commonly used within the DoD.

DoD Instruction 5000.61, “DoD Modeling and Simulation (M&S) Verification, Validation and Accreditation (VV&A)”

This Instruction implements policy, assigns responsibilities, and prescribes procedures for the VV&A of DoD M&S. It designates the Defense Modeling and Simulation Office (DMSO) as the DoD VV&A focal point, to be the central source of DoD VV&A data and information and to assist DoD and non-DoD organizations, as requested, in resolving VV&A issues or obtaining information on DoD VV&A practices. It specifies that information and data on VV&A activities will be readily available through the DoD M&S Resource Repository (MSRR) system, including, as a minimum, DoD Component VV&A policies and procedures, V&V results, and accreditation documentation. This instruction also specifies minimum documentation requirements for verification and validation information and accreditation results.13

“Department of Defense Verification, Validation and Accreditation (VV&A) Recommended Practices Guide”

This guide provides background and information on recommended principles, processes, and techniques for use in DoD VV&A efforts that support the analysis, acquisition, and training communities.14

“Simulation, Test, and Evaluation Process (STEP) Guidelines”

This document provides a set of guidelines for the program manager to refer to in implementing the DoD’s October 3, 1995 direction to make the Simulation, Test and Evaluation Process an integral part of Test and Evaluation Master Plans. STEP proposes a Model-Simulate-Fix-Test-Iterate approach, and is defined as “…an iterative process that integrates simulation and test for the purpose of interactively evaluating and improving the design, performance, joint military worth, survivability, suitability, and effectiveness of systems to be acquired and improving how those systems are used.” STEP is described as “a test and evaluation answer to the DoD challenges of implementing IPPD and SBA.”15

Previous and On-Going Efforts

“Study on the Effectiveness of Modeling and Simulation in the Weapon System Acquisition Process,” Science Applications International Corporation, October 1996

The purpose of this report was to cite documented contributions to the total acquisition process. It concluded that “There is consistent evidence of M&S being used effectively in the acquisition process but not in an integrated manner across programs or functions within the acquisition process. Substantial evidence has been collected from individual success stories, though the benefits are not readily quantifiable into a general standard. The key is in focusing on the integration of M&S applications, across acquisition programs and throughout the process, not in exploring the applications themselves….”16

In attempting to quantify metrics, it noted “cost savings are especially difficult to quantify” and are often “more correctly classified as ‘cost avoidance’ and are measures of significant additional work or results that were obtained using M&S tools which would have cost the reported ‘savings’ if they had been obtained by more traditional methods.” This study grouped the challenges that preclude the seamless use of M&S in the acquisition process into technical, cultural, and managerial challenges, as noted below:

Technical Challenges:
• Interoperability of M&S Tools;
• Availability of Data Descriptions;
• Security/Sensitivity of Data;
• Physics-based M&S;
• Hardware and Software Limitations;
• Variable Resolution.


Cultural Challenges:
• Acquisition Processes;
• Incentives for M&S Use;
• M&S Workforce;
• Acceptance of M&S.

Managerial Challenges:
• OSD and Service Guidance;
• Ownership of Data;
• VV&A Requirements;
• Funding Process;
• Use of System Models.

This report defined SBA as a “term to characterize the general approach of significantly increased use of M&S tools and the new processes which they enable in a new, more integrated approach to program development” (italics emphasis added). It further characterized SBA in terms of a systems engineering process by saying “The positive results of simulation efforts in systems engineering have become evident in what we refer to as Simulation Based Acquisition.”

Finally, Appendix C of the October 1996 study contains a useful summary of previous studies and recommendations dating from 1989 to 1995.

Common Operating Digital Environment (CODE)

CODE, a DoD initiative related to SBA, is being co-sponsored by the Assistant Deputy Undersecretary of Defense for Logistics Reinvention and Modernization (ADUSD/LR&M) and the National Defense Industrial Association (NDIA). A joint industry and government working group is developing a framework for a “large-grained interoperability” technical environment to enable digital weapon system life cycle information management by 2002. The vision is for government and industry partners to create, exchange, and sustain information solely in a digital environment. The CODE will provide access to digital data across the virtual extended enterprise, and the functionality to support business processes and decision making. The initial report is cognizant of SBA and notes that the Joint Technical Architecture and its subsystem High Level Architecture need to be reviewed with other stovepipe architectures to ensure an optimum life cycle environment.17

Defense Systems Affordability Council (DSAC)

In December 1997, the DSAC identified three fundamental areas contributing to the affordability of acquisition programs: 1) reduction of total ownership costs; 2) 50 percent reduction of acquisition program cycle time for new systems; and 3) realistic programming and program stability (zero percent program cost growth) enabled by a broad range of potential improvements in requirements setting, funding management, and acquisition practices.18 As noted in the previous chapter, Simulation Based Acquisition contributes to all three of these objectives.

Joint SBA Task Force

The Acquisition Council of the Executive Council on Modeling and Simulation commissioned a six-month effort to develop a road map in which DoD and industry are enabled by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. The Terms of Reference for this effort called for six major items to be included in the road map:

1. near- and long-term DoD actions needed to accelerate the SBA concept;


2. industry actions needed to accelerate SBA;

3. notional representations of systems architectures and a conceptual framework to identify key requirements for seamless data transfer and interaction;

4. opportunities for reuse and commonality across programs;

5. primary ownership of each module in the systems architecture; and

6. estimated government and industry investments needed to implement the proposed road map, and methods to determine the return on investment.19

The SBA Task Force’s work is ongoing as this research effort goes to publication; its results will be presented at a conference in November 1998.

Assumptions, Trends, and SBA Enablers

Brigadier General Robert Armbruster, Deputy for Systems Acquisition at the U.S. Army Aviation and Missile Command, notes that three change factors now make SBA possible: technology, advocacy, and Life Cycle Management.20

Technology has advanced to the point where it is often less expensive to simulate processes, components, and systems than it is to physically create or run them.

Figure 3-1. Growth in Computing Power [chart: peak speed in millions of operations per second versus year, 1955-2000, tracking “Leading Edge” and “Functional/Affordable” machines from the IBM 704 and DEC PDP8 through the Cray series to the Pentium Pro PC]. Reprinted from Research Policy, Volume 27, Stefan Thomke, “Simulation, Learning and R&D Performance: Evidence from Automotive Development,” pp. 55-74, 1998, with permission from Elsevier Science.

Computing resources that were worth $30K in 1960 only cost about 10 cents today.21 In 1965, Gordon Moore (who co-founded Intel Corporation in 1968) made the observation that computing power was doubling every 18-24 months. Now known as Moore’s Law, his theory has proved to be remarkably accurate: in 26 years, the number of transistors on a chip has increased more than 3,200 times, from 2,300 on the 4004 processor in 1971 to 7.5 million on the Pentium II processor.22 This exponential growth in computing power is shown graphically in Figure 3-1.23
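Those figures are easy to check. The short Python sketch below is purely illustrative; it uses only the transistor counts and dates cited in the paragraph above to compute the implied growth factor and average doubling period.

```python
import math

# Transistor counts cited above: Intel 4004 (1971) and Pentium II (1997).
t_4004, year_4004 = 2_300, 1971
t_pentium2, year_p2 = 7_500_000, 1997

growth = t_pentium2 / t_4004     # total growth factor (~3,261x)
doublings = math.log2(growth)    # implied number of doublings (~11.7)
years = year_p2 - year_4004
months_per_doubling = 12 * years / doublings

print(f"growth factor: {growth:,.0f}x")
print(f"doublings: {doublings:.1f}")
print(f"average doubling period: {months_per_doubling:.0f} months")  # ~27 months
```

With these numbers the average doubling period works out to roughly 27 months, close to the upper end of Moore’s 18-24 month observation.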

In 1960, many companies began using simulation tools, but they were often inadequate.24

Recent increases in computing power and simulation tools, however, have greatly increased what M&S efforts can contribute to a program. While physical models and prototypes do not represent reality completely, programs have grown accustomed to using them and understand their limitations. Now, programs are finding that in many instances the computer models are more correct than the physical models. The following example highlights this point. An experiment, run in both the physical and the virtual environments, tested the ability of a ship’s compartment to withstand an explosion. During the validation testing of the ship’s model, the test engineers could not get the computer simulation results to agree with the actual physical test results. The test design for the compartment called for a circumference weld, which had been correctly included in the finite element computer model. However, as the physical prototype had been incorrectly spot welded, the results of the physical test did not match the results from the simulation. When the error was realized, it was easier (and more cost-effective) to change the computer model to a spot weld and re-run the simulation, at which time the results matched.25 Another example is from the automotive industry. A simulation analysis showed that the vehicle skin would rip below a certain thickness. Not believing the simulation, the engineers overrode its analysis and made the vehicle skin thinner. During testing the skin ripped as the simulation had predicted. It was at this point that the engineers became believers in the results of simulations.

This leads to the second change factor, “advocacy.” Leadership is essential to effective change in an organization, and today change within an organization must be led, not just managed.26 Within OSD and the Services, senior leadership in the acquisition community is taking an active role in implementing SBA. Advocacy for SBA, however, is not solely limited to senior leadership but can be found at all levels of the acquisition community. At the Electronic Proving Ground at Fort Huachuca, Arizona, testers are finding that simulations can be better than physical tests. Security issues and civilian communication interference concerns limit what can be physically tested on some systems. Testing in the virtual environment removes such barriers; thus some simulations can be more realistic than what can actually be tested in the physical environment.27 Success stories such as this and the examples in the preceding paragraphs are beginning to make believers out of the most hardened skeptics and are helping to promote this technology.

The third change factor making SBA possible is the desire to be able to perform Life Cycle Management—simulation enables us to do LCM.28 Simulation makes it possible to really get cost on the table and on equal footing with the performance requirements. This forces the requirements community to make the hard tradeoffs as part of the design process. As noted by BG Bergantz, Program Manager for the Army’s Comanche helicopter program, SBA enables us to “put the intellectual before the physical.”29


ENDNOTES

1. The DoD Acquisition Deskbook has a good synopsis of M&S guidance in Information Structure, Topics, Section 2.9.1 M&S in Acquisition: The Process, General Information, and Background (31 March 1998).

2. Kern and Purdy, “Simulation Based Acquisition: A Good Thing, But How Do We Get There?” Army RD&A Magazine, July-August 1998, p. 4.

3. DoD 5000.1, Defense Acquisition, 15 March 1996, (Section D Policy, Paragraph 2.f. Modeling and Simulation) 23 March 1998.

4. DoD 5000.2-R, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs, 23 March 1998 (Paragraph 3.4.4, Modeling and Simulation).

5. DoD 5000.2-R, Mandatory Procedures for MDAPs and MAIS Acquisition Programs, 23 March 1998 (Paragraph 3.4.1, Test and Evaluation Strategy).

6. DoD 5000.2-R, Mandatory Procedures for MDAPs and MAIS Acquisition Programs, 23 March 1998 (Paragraph 3.4, Test and Evaluation).

7. DoD 5000.2-R, Mandatory Procedures for MDAPs and MAIS Acquisition Programs, 23 March 1998 (Paragraph 3.4.5, Operational Test and Evaluation).

8. DoD 5000.2-R, Mandatory Procedures for MDAPs and MAIS Acquisition Programs, 23 March 1998 (Paragraph 3.4.9, Live Fire Test and Evaluation) (Not applicable to ACAT IA programs).

9. DoD 5000.2-R, Mandatory Procedures for MDAPs and MAIS Acquisition Programs, 23 March 1998 (Paragraph 4.3, Systems Engineering).

10. DoD Directive 5000.59, Modeling and Simulation (M&S) Management, Section D, Policy, 4 January 1994.

11. DoD 5000.59-P, DoD M&S Master Plan (MSMP), October 1995.

12. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

13. DoDI 5000.61, DoD Modeling and Simulation (M&S) Verification, Validation and Accreditation (VV&A), 29 April 1996.

14. DMSO, VV&A Recommended Practices Guide (November 1996), Chapter 1 Overview.

15. Dr. Philip Coyle and Dr. Patricia Sanders, Simulation, Test, and Evaluation Process (STEP) Guidelines, 4 December 1997.

16. Anne Patenaude, et al., Science Applications International Corporation, Study on the Effectiveness of Modeling and Simulation in the Weapon System Acquisition Process, Final Report, October 1996.

17. Common Operating Digital Environment (CODE) Top Level Policy Guidance Document, http://www.acq.osd.mil/cals/, accessed on 1 July 1998.

18. Joint Simulation Based Acquisition Task Force Terms of Reference, April 1998.

19. Ibid.

20. Brigadier General Robert E. Armbruster, Deputy for Systems Acquisition, U.S. Army Aviation and Missile Command, remarks during presentation at Army SBA Conference, Orlando, FL, 21-22 January 1998.

21. David Chang; Director, Vehicle Synthesis, Analysis & Simulation Process Center; General Motors; interview by authors, 31 March 1998.

22. www.intel.com/intel/museum/25anniv/Hof/moore.htm, accessed on 5 May 1998.


23. Stefan Thomke, “Simulation, Learning and R&D Performance: Evidence from Automotive Development,” Research Policy 27 (1) (May 1998), 55-74.

24. Michael Carty, Gene Coffman, and Hwa-Sung Na, Ford Motor Company, interview by authors, 27 February 1998.

25. Myles Hurwitz, Naval Surface Warfare Center, Carderock Division, interview by authors, 18 February 1998.

26. Lieutenant Colonel Charles L. Beck, Jr., USAF, et al., A Model for Leading Change: Making Acquisition Reform Work, Report of the 1996-1997 DSMC Military Research Fellows.

27. John Haug, Directorate for Technical Mission, U.S. Army Test and Evaluation Command, interview by authors, 11 February 1998.

28. Armbruster, ibid.

29. Brigadier General Joseph L. Bergantz, Program Manager, RAH-66 Comanche helicopter program, remarks during presentation at Army SBA Conference, Orlando, FL, 21-22 January 1998.


4 ESSENTIAL ASPECTS OF SBA

This chapter presents five essential features of a Simulation Based Acquisition process which, when implemented, will present the acquisition community with capabilities not embedded in the current process. First, SBA will, through early and continuous involvement, allow the user to define, refine, and balance requirements. This is the critical first step in producing better, faster, and cheaper materiel solutions. Second, an SBA process supported by the synthetic environment allows design teams to concurrently explore greater numbers of possible materiel solutions than is possible within the current acquisition process. Third, the iterative nature of the SBA design process will enable IPPD teams to converge systematically on optimal solutions more efficiently than is currently possible. Fourth, SBA will support changing the role of testing into that of an integral part of the design process. Finally, SBA supports informed tradeoff analyses through a Decision Risk Analysis process. This chapter will explore each aspect separately. Though presented individually, when the five essential aspects of SBA are combined, their synergistic effect in assisting the acquisition process is greater than the sum of the individual parts.

Balancing Requirements

Early user involvement is essential in defining, refining, and balancing requirements. The following quotations emphasize this point:

• “The best way to define a program is through M&S—plunk the user down into the virtual world, have him play with the concept, and work out the tactics, training, and procedures—take the user out of the abstract and into the virtual combat world.”1

• “SBA and M&S enable you to play optometrist with the user—do you like it this way, or that?”2

• “We need more user input during development of the system. Simulation brings the warfighters and engineers together so that each has a better appreciation of the future system and its interaction in the future battlefield.”3

• “Spiral development concentrates on soldier feedback. It involves the soldier who uses the equipment directly in the development process, brings all the parties together—from the user to the developer to industry.”4

• “The warfighter only has a general notion of his requirements up front, and is unable to give a detailed description early on. M&S enables the user and the developer to walk up the spiral development ladder together.”5

The traditional acquisition process assumes the user completely understands the requirements and the impact each has on the program. To be more specific, it has been assumed that the user completely understands the overall cost associated with each requirement. According to Lieutenant General George Muellner, Air Force Principal Deputy for Acquisition, former Director of Requirements for Air Combat Command, and former Director of the Joint Advanced Strike Technology Program: “The old way of doing business was for the user to throw his requirement over the fence to the acquisition community, who would then spend five years working the margins, when 90% of the performance was already locked up. Now we’re getting where we can keep the trade space open, don’t lock up the requirements, and force the requirements people to deal early on with these issues.”6

Balancing requirements in the defense acquisition business means bringing into balance what the user wants with what can be affordably accomplished, or as stated by Lt. Gen. Muellner, converting the warfighter’s wants into affordable needs.7 According to Col Cuff, U.S. Army Training and Doctrine Command (TRADOC) Systems Manager (TSM) for the Crusader field artillery program: “With simulation, the user has a better understanding of the cause and effect relationship between requirements, cost, and schedule. The TSM gets to see the operational impact of design and requirements tradeoffs. The user always wants everything, but previously did not have an appreciation for the cost and schedule impacts that requirements had on program execution.”8 An SBA process supports the user through visual representation of the various design alternatives early in the design phase. The user can then compare alternatives and provide input to the design team on the value of competing designs in relation to his requirements. Visualization also makes identification and resolution of many issues easier and faster. The entire IPPD team can see where the issues are and then focus on how to solve the problem.

System requirements are really a combination of many things: the user’s operational needs, logistics concerns, lessons learned, environmental issues, etc. These system requirements must then be balanced in terms of cost and force effectiveness. The user plays a key role in balancing requirements, and the applied use of M&S across the phases of a program can significantly affect the outcome of the final system.

Figure 4-1 shows the JSF program’s depiction of this balancing act between cost and capability in determining the most affordable solution. The JSF program sets cost objectives to balance mission needs with projected out-year resources, taking into account anticipated process improvements in both DoD and the defense industries.9

Numerous Design Alternatives

Concurrent systems engineering can be viewed as a process of balancing requirements, or making compromises between competing requirements. Its primary function is to ensure the product meets the customer’s cost, schedule, and performance needs encompassing the total product life cycle.10 This systems engineering approach is not new; what is new is the ability to conduct many parts of this process in the synthetic environment using M&S. The benefit of doing this process in the synthetic environment is that designers and developers can try things without fear of failure. A recent report on virtual prototyping noted that during the design cycle, a hesitancy for revisions exists once a marginally working system has been achieved, as a result of the increased risk associated with making changes.11

This increased risk is a result of the time lost and the associated cost of building alternative hardware, even though the alternative may be a superior design.

Because making design changes in a synthetic environment can be faster and cheaper (and therefore less risky) than making changes to a physical prototype, an IPPD team is able to explore more design options. Harvard University’s Stefan Thomke, in his research with the automobile industry, cites a powerful example. An automotive company found that an assumption regarding a component involved in side-impact crashes was in fact false. This assumption had never been tested because it seemed obvious that making a component in a car stronger would improve crashworthiness. Physical prototypes and crashes were reserved for high payoff issues. Engineers were not willing to use costly physical testing to test assumptions they were confident about. Because it was quick and inexpensive to check out the assumption using simulation, one engineer insisted on it. Surprisingly, the team discovered that strengthening the component actually decreased crashworthiness, by unexpectedly propagating the load to another part of the vehicle. The solution was to reduce the stiffness of the component, which went against the conventional wisdom. This discovery led to a reevaluation of other reinforced areas and has improved the crashworthiness of all cars under development. These and other design changes improved crashworthiness by 30 percent—a significant increase.

Figure 4-1. Affordability: Capability vs. Cost [diagram: capability (lethality, survivability, supportability) balanced against cost (R&D, production, O&S) to find the affordable solution].

Interestingly, the time and cost to build physical vehicle prototypes for the two verification experiments at the completion of the project exceeded the time and cost of the entire advanced development M&S project.12

Once a physical prototype exists, it is difficult for people to envision a new concept that is substantially different. It is very compelling for the design team to get to a point design, and then consider possible excursions or engineering changes from that point design, rather than examining the entire design space. Designing in the synthetic environment, however, decreases the cost and time of looking at alternative designs.

Thomke notes that the advantages of substituting real physical objects with virtual experimentation can be very substantial—once set up, virtual tests can be run at very little additional cost per run.13 (A rough break-even sketch follows the list below.) In addition, he notes several other benefits of virtual experimentation. These include:

• Better depth and quality of the analyses, because it is possible to slow the test events down and zoom in on minute areas.

• Information from problem-solving cycles becomes available to other development tasks earlier in the development process; therefore other tasks can proceed more quickly, and higher degrees of design concurrence are feasible.

• With faster problem-solving cycles, designers will be able to push (or “front-load”) problem discovery to earlier development phases and thus make problem-related engineering changes earlier and at a lower cost.

Figure 4-2. SBA Process [diagram: the Delphi Process and QFD analysis feed constructive simulation at the campaign, mission, engagement, and engineering levels, progressing through interactive digital simulation and the virtual environment to physical test, all within a distributed interactive environment].

• Reduced cost and time to get experimental feedback encourages the design team to conduct more diverse “what if” experiments which, in turn, may lead to the discovery of more effective (and unanticipated) design changes.14
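The “little additional cost per run” point can be made concrete with a minimal break-even sketch. All cost figures below are invented for illustration; they are not drawn from Thomke’s study.

```python
# Hypothetical costs (illustrative only).
SIM_SETUP_COST = 500_000        # one-time cost to build and validate the model
SIM_COST_PER_RUN = 1_000        # marginal cost of each virtual experiment
PHYSICAL_COST_PER_RUN = 75_000  # cost of each physical prototype test

def simulation_total(runs: int) -> int:
    """Total cost of a virtual test campaign: fixed setup plus marginal cost per run."""
    return SIM_SETUP_COST + runs * SIM_COST_PER_RUN

# Find the smallest campaign size at which simulation becomes cheaper.
runs = 1
while simulation_total(runs) >= runs * PHYSICAL_COST_PER_RUN:
    runs += 1
print(f"Simulation breaks even after {runs} experiments")  # 7 with these figures
```

Past the break-even point, every additional “what if” experiment is nearly free, which is exactly the dynamic the bullets above describe.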

The Iterative Design Process

The Joint Strike Fighter (JSF) program portrays the process of iteratively designing a system using the SBA process as shown in Figure 4-2.15 The JSF process is a building-block approach, with each step building upon the previous steps as well as iterating within and between steps. It begins with the Delphi Process, a method devised by the RAND Corporation to obtain expert judgment from multiple experts without many of the biases associated with interpersonal pressures.16 These results are then fed into a Quality Function Deployment (QFD) analysis, which refines the requirements into ways to accomplish those requirements. QFD provides a way of tracking and tracing tradeoff analyses through various levels, from requirements through design decisions to production and support processes, while preserving traceability to user needs.17 These results are then used to build the appropriate level of constructive simulation. At first, these simulations will be at an aggregated campaign level, but they will continue to be iterated and refined back through the previous two steps and down the constructive simulation hierarchy to the mission/battle, engagement, and finally the engineering level. As more is learned, the simulation progresses to interactive digital simulation and interaction within a virtual environment, which introduces the human-in-the-loop. Again, there can be much iteration within each step, as well as feedback to earlier steps to capture the knowledge gained. As the program matures into flight test, these results are also fed back into the program to capture the knowledge, and to help challenge and possibly revise the inputs to the process.

Once a program reaches the constructive simulation step, it has the option to begin interacting with other systems in a distributed interactive environment. This can be a powerful tool for the IPPD team because it enables them to receive feedback on their system well before the design is locked in. Virtual systems can participate in exercises and experiments to determine their effect on the battlefield, and those results and warfighter feedback are captured early enough in the program to help the design team conduct more informed tradeoff analyses. In effect, the warfighter becomes an integral part of the design team.

Converging on an Optimal Design

Another view of this systems engineering and simulation process was developed by United Defense Limited Partnership and is being used on the Crusader field artillery system. Their Simulate, Emulate, Stimulate process is shown in Figure 4-3 as a top-down view, and in Figure 4-4 from a side view.18

The Simulate, Emulate, Stimulate process starts with a low fidelity model and moves through the four steps of analyze, design, evaluate, and revise, successively converging until reaching a high fidelity model of the final system. Early models of the system are used in a series of simulations to evaluate concepts of potentially high-risk areas to identify and resolve deficiencies. As the system design matures, some subsystem simulations are replaced with emulations, which are models that accept the same inputs and produce the same outputs as a given system.19

Figure 4-3. Simulate-Emulate-Stimulate Process, Top-Down View [diagram: concentric progression from low-fidelity Simulate through Emulate to high-fidelity Stimulate].

Figure 4-4. Simulate-Emulate-Stimulate Process, Side View [diagram: the same low-fidelity-to-high-fidelity progression viewed from the side].

These emulations have a higher level of fidelity and translate to a more detailed evaluation of the system performance. Further design maturation allows the model to stimulate actual system hardware, leading to the final stage of bench-top integration and then system test.20
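The defining property of an emulation, that it accepts the same inputs and produces the same outputs as the system it stands in for, can be illustrated with a small sketch. The class names and numbers below are hypothetical, not taken from the Crusader program.

```python
from abc import ABC, abstractmethod

class FireControlUnit(ABC):
    """Common interface: anything that maps a target range (m) to a gun elevation (deg)."""
    @abstractmethod
    def elevation_for(self, target_range_m: float) -> float: ...

class LowFidelitySimulation(FireControlUnit):
    """Early, low-fidelity approximation used to explore concepts."""
    def elevation_for(self, target_range_m: float) -> float:
        return 0.004 * target_range_m  # crude linear stand-in

class Emulation(FireControlUnit):
    """Higher-fidelity model that reproduces the real unit's input/output behavior,
    so it can replace the simulation (or later stimulate real hardware) unchanged."""
    def elevation_for(self, target_range_m: float) -> float:
        return 0.004 * target_range_m + 1.2e-7 * target_range_m ** 2  # adds a drag term

def run_mission(unit: FireControlUnit) -> None:
    # Client code is identical regardless of which fidelity level is plugged in.
    for rng in (5_000, 10_000, 20_000):
        print(f"{type(unit).__name__}: {rng} m -> {unit.elevation_for(rng):.2f} deg")

run_mission(LowFidelitySimulation())
run_mission(Emulation())
```

Because both models honor the same interface, swapping one for the other (or for instrumented hardware) requires no change to the surrounding test harness, which is what lets the process converge on fidelity step by step.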

SBA is about using simulation to explore the design space, to validate designs, and to verify that the proposed design will meet the end-user’s expectations and is manufacturable, supportable, and affordable. Simulation enables the team to see the results of their design decisions not only in terms of performance, but also to project the cost of those decisions across the system’s life cycle. The rapid feedback from the design iterations helps the team converge on an optimal solution—otherwise they’ll continue to work on the problem until they run out of time. It’s impossible to know if a single design is the best possible solution; however, it is possible to evaluate one design against another. The confidence in reaching an optimal design should increase as the number of designs explored increases, particularly if there is significant feedback and learning taking place between the design iterations.

Data and personnel interfaces need to be seamless and integrated to facilitate rapid design iterations. This requires an integrated data environment to work effectively. A Product Information Management (PIM) tool is extremely important for managing across the virtual enterprise and can facilitate collaboration among widely separated teams. It controls and provides easy access to the large quantities of engineering and product-related data that are generated during concurrent engineering, while tracking the numerous rapid changes from different sources in the organization that often occur at the same time.21 Concurrent engineering demands a great deal of work-in-process information. The PIM links the engineering tools with the other tools necessary to manage a fully integrated environment, such as office automation, systems engineering, business management, configuration management, reference library, and particularly web and Internet connectivity for cost-effective subcontractor access.

The integrated data environment is changing the entire design process. For example, on the Crusader program, critical information no longer exists in specifications and drawings. It’s now in very large databases that need to be managed and linked. Requirements for the Crusader now exist as a highly structured, hierarchical, linked data set of over 7,000 requirements. Using their integrated data environment management system, the Crusader design team can quickly identify the collateral impact of a requirements change.22 The team can trace the top-level warfighting tasks all the way down to the detailed design and back up to the final product. During design iterations, if the final product falls short of expectations in some areas (for example in initial acquisition cost, interoperability, or maintainability), this traceability from requirements to design will be key to enabling the program office, in conjunction with the warfighter, to make smart tradeoffs.
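Finding the collateral impact of a requirements change is, at bottom, a graph traversal over the trace links. The sketch below uses a toy traceability graph with invented identifiers; it is not Crusader data.

```python
from collections import deque

# Toy traceability graph: each node lists the items derived from it.
TRACES = {
    "WARFIGHTING-TASK-01": ["REQ-100", "REQ-101"],
    "REQ-100": ["REQ-100.1", "REQ-100.2"],
    "REQ-100.1": ["DESIGN-ARMOR-07"],
    "REQ-100.2": ["DESIGN-HULL-03", "DESIGN-ARMOR-07"],
    "REQ-101": ["DESIGN-HULL-03"],
}

def collateral_impact(changed: str) -> set:
    """Breadth-first walk of the trace links to find every downstream item
    touched by a requirements change."""
    impacted, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for child in TRACES.get(node, []):
            if child not in impacted:
                impacted.add(child)
                queue.append(child)
    return impacted

print(sorted(collateral_impact("REQ-100")))
# ['DESIGN-ARMOR-07', 'DESIGN-HULL-03', 'REQ-100.1', 'REQ-100.2']
```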

Test as an Integral Part of Design

The traditional “build – test – fix” or “test – fix – test” process is giving way to a new process in which test is becoming an integral part of design. As mentioned in Chapter Three, the Simulation, Test, and Evaluation Process (STEP) Guidelines call it a “model – simulate – fix – test – iterate” approach, wherein many iterative loops are possible in the process.24

Dr. Henry Dubin, Technical Director of the Army’s Operational Test and Evaluation Command (OPTEC), emphasizes this changing process by saying that we no longer “conduct test and evaluation for the purpose of determining pass or fail of critical threshold parameters…[we now] focus testing on what the acquisition team needs to learn—test to learn, and evaluate to understand the [operational] impact of what we have learned on mission accomplishment.”25

When viewing the test process, it’s important to keep in mind that both physical tests and virtual tests have limitations. Dr. John Foulkes, Director of the Army’s Test and Evaluation Management Agency, notes that the “test and evaluation community should view anything short of actual war as being simulation, including live testing. Program Managers should ensure the right mix of constructive (digital) simulation, virtual simulation, and live simulation (testing), and continually re-evaluate why and how we test. Constructive and virtual simulation should reduce the burden on the amount of live simulation required, thereby reducing the time and cost of testing.”26

Models, by definition, do not represent reality completely.27 This applies to both physical and synthetic models. Models are, therefore, an attempt at replicating reality. Many reasons may prevent a model from accurately representing reality or being a high fidelity model. Inaccuracy of the model can be the result of the inability to capture all the attributes of the real situation, possibly because of human error, ignorance, time limits, or economics.28

Many times we use M&S to overcome physical limitations of live testing, including safety concerns, interoperability issues, interference with civilian activities, non-availability of the threat, and affordability.29 In part, M&S is done to reduce the cost of representing aspects of the real world that are irrelevant to the experiment, as well as to control out some aspects of the real world to simplify the analysis of the results.30

Dr. Dubin proposes a test strategy of using M&S to design and develop the system, to mature the models with the system, and to use the test data for calibrating and maturing the system model. By truly integrating testing early on, we should gain synergies to greatly enhance the overall value of the program. He emphasized his point by stating the converse: “Building a simulation solely for the purpose of reducing your Milestone III testing requirements is almost always a waste of money.”31 Dr. Phil Coyle, Director of Operational Test and Evaluation, Office of the Secretary of Defense, noted the same importance of early test involvement: “M&S and testing are intertwined; when they are not, neither is effective.”32

The benefits of M&S to the test and evaluation community are numerous, among them the ability to:

• conduct full and continuous evaluation of the system;

• evaluate system performance where live simulation is neither feasible nor practical;

• identify and concentrate the live simulation resources in the high-risk areas;

• stress the system at less than system level;

• conduct excursions for the development of test and evaluation plans and to identify live simulation scenarios.33


Involving the tester early in a program may also allow the tester to influence the design to minimize what must be tested and to facilitate the testing that must be conducted. The knowledge gained through M&S of a system will enable the test community to design and conduct more effective and efficient testing.

The Crusader program noted that the biggest challenge is to combine testing and M&S to leverage savings in time and resources, while satisfying the decisionmakers that the requirements are met.34 The test community needs to be on board early as part of the design solution. This way, testers gain credibility in the models as the models are used early in the program, and build confidence in them through incremental testing.

The intent of modeling and testing is also beginning to change. For example, the role of the Navy’s David Taylor Model Basin has become more one of validating models than of designing them. The purpose of testing is evolving into validating the model so that the analytical calculations derived from the model are believable.

A mix of analytical and physical testing is often the right answer, depending upon the credibility of and confidence in the tools. New program starts may require physical prototypes to prove out the models, since there may not be known models from which to build. But the goal remains the same—to conduct a combination of M&S and testing in order to get a better product. As one tester stated: “The old paradigm was to test a company of tanks. The new paradigm will be to test a platoon, but to simulate a battalion.”35

Decision Risk Analysis

Though modeling and simulation costs are coming down, M&S can still be very expensive to apply. Programs probably cannot afford to use M&S on every issue and need the ability to prioritize their efforts. One way to prioritize efforts is through a methodology called Decision Risk Analysis (DRA), which quantifies and ties together cost, schedule, performance, producibility, risk, and quality to permit informed tradeoff analyses.36 A decision risk analysis tool quantitatively provides the program manager with a means of assessing a program’s probability of achieving success while employing a CAIV strategy. If the assessed risk is low and well known, combinations of many program activities may be acceptable. If assessed risk is substantial and/or unknown, a detailed breakout of activities, accompanied by detailed time, cost, and performance estimates for completing those activities derived from assessment in a Delphi-type environment, may be required.

One such tool is the Venture Evaluation and Review Technique (VERT), a government tool designed to do DRA.37 It is a Monte Carlo simulation-based tool that aids program managers in measuring risk in an unbiased manner. VERT uses the same approach as a Gantt chart, adding probability to make it a risk measurement tool. To build a more accurate Gantt chart, you interview IPT members to get a more accurate assessment of how long activities will take. You end up with a range of time an event could take, along with a corresponding probability. You can then run a series of “what ifs” to determine various outcomes as parameters are changed. The Gantt chart will allow you to identify the program’s critical path for cost, schedule, and performance. You can then focus simulation in those high-risk areas to reduce programmatic risk. If the risk is low, no modeling may be required, or a low fidelity, kinematics-based CAD/CAM model may be all that is required for proof of principle. On the other hand, if the risk is high, a higher fidelity simulation or physical test may be required for proof of concept.

Implementing a program like VERT can take a lot of manpower to collect and maintain the data; however, it can pay big dividends by helping determine the high-risk areas. A DRA approach helps the manager determine where to employ modeling and simulation efforts and to what level of fidelity.
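A minimal sketch of the Monte Carlo idea behind a VERT-style analysis follows. The task names and three-point duration estimates are invented for illustration; they are not taken from VERT.

```python
import random

# Invented sequential schedule: (optimistic, most likely, pessimistic) weeks,
# the kind of range elicited from IPT members as described above.
TASKS = {
    "design":    (10, 14, 24),
    "prototype": (6, 8, 16),
    "test":      (4, 6, 12),
}

def sample_schedule() -> float:
    """One Monte Carlo draw: sum a triangular sample for each sequential task."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in TASKS.values())

TRIALS = 10_000
durations = sorted(sample_schedule() for _ in range(TRIALS))

DEADLINE = 32  # weeks available
on_time = sum(d <= DEADLINE for d in durations) / TRIALS
print(f"P(finish within {DEADLINE} weeks) ~ {on_time:.0%}")
print(f"80th-percentile schedule: {durations[int(0.8 * TRIALS)]:.1f} weeks")
```

Re-running with different task estimates is the “what if” exercise described above; the tasks whose uncertainty dominates the tail of the resulting distribution are the high-risk areas where M&S effort should be focused.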


ENDNOTES

1. Will Brooks, Chief, Simulation Branch, Army Materiel Systems Analysis Activity, interview by authors, 11 February 1998.

2. Colonel Lynn Carroll, USAF, Warfighter Training Research Division Chief, Air Force Research Laboratory, briefing to the Joint Strike Fighter Mission Level Strategy session, 30 March 1998.

3. Rob Beadling, John J. McMullen Associates, Inc., interview by authors, 6 March 1998.

4. LTG Kern, USA, “SBA: Fielding a Capability Based Force,” presentation and remarks at Army SBA Conference, Orlando, FL, 21-22 January 1998.

5. Colonel Picanso, USAF, Electronic Systems Center, comments during presentation at Joint Strike Fighter Mission Level Strategy session, 30 March 1998.

6. LTG George Muellner, USAF, Air Force Principal Deputy for Acquisition, interview by authors, 30 March 1998.

7. Colonel Kenneth Konwin, USAF, Director of Defense Modeling and Simulation Office, interview by authors, 31 July 1998.

8. Colonel Michael Cuff, U.S. Army, TRADOC Systems Manager for Cannon Systems, interview by authors, 2 April 1998.

9. Major General Leslie J. Kenne, Program Manager, Joint Strike Fighter Program, “Joint Strike Fighter—The Affordable Solution,” 30 April 1998 briefing, on-line reference www.jast.mil/html/progbriefs.htm on 22 April 1998.

10. Defense Acquisition Deskbook, Information Structure 2.6.1 Systems Engineering, Frontline Wisdom and Advise, AFMC – The Engineering Process From Industry Point of View, Submitted by Mr. Kammer, HQ AFMC/ENPP (9 June 1998).

11. Tim Thornton, Coleman Research Company (CRC), Collaborative Virtual Prototyping System (CVPS), 9 January 1997.

12. Stefan Thomke, “Simulation, Learning and R&D Performance: Evidence from Automotive Development,” Research Policy 27 (1) (May 1998), 55-74.

13. Ibid.

14. Ibid.

15. Colonel Phil Faye, USAF, Joint Strike Fighter Program briefing, 30 March 1998.

16. For more information on Delphi, see Defense Acquisition Deskbook, Information Structure 1.2.4 Establish Risk Management Strategy, AFMC—Identify the Ranking of Risk, Submitted by Major Paul Loughnane, USAF, AFMC/ENPI (December 1997).

17. For more information on QFD, see the DoD Guide to Integrated Product and Process Development, Office of the Undersecretary of Defense (Acquisition and Technology), 5 February 1996.

18. Bruce Van Wyk, United Defense Limited Partnership, Crusader Program briefing, 2 April 1998.

19. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

20. Bruce Van Wyk, United Defense Limited Partnership, Crusader Program briefing, 2 April 1998.

21. John Krouse, “Manager’s Guide to Computer Aided Engineering,” pp. 156-157.

22. Bruce Van Wyk, United Defense Limited Partnership, Crusader Program briefing, 2 April 1998.


23. Dr. John B. Foulkes; Director, U.S. Army Test and Evaluation Management Agency; presentation and remarks at Army SBA Conference, 21-22 January 1998.

24. Dr. Philip E. Coyle and Dr. Patricia Sanders, Simulation, Test, and Evaluation Process: STEP Guidelines, 4 December 1997, pp. 1-3.

25. Dr. Hank Dubin; Technical Director, U.S. Army Operational Test and Evaluation Command; presentation and remarks at Army SBA Conference, 21-22 January 1998.

26. Foulkes, ibid.

27. Thomke, ibid.

28. Ibid.

29. Dubin, ibid.

30. Thomke, ibid.

31. Dubin, ibid.

32. Dr. Philip E. Coyle; Director, Operational Test & Evaluation, Office of the Secretary of Defense; presentation to 1997 Defense Modeling and Simulation Office Industry Days, 22 May 1997.

33. Foulkes, ibid.

34. Colonel William Sheaves, U.S. Army; Project Manager, Crusader field artillery program; presentation and remarks at Army SBA Conference, 21-22 January 1998.

35. John Haug; Director, Technical Mission, Test and Evaluation Command, U.S. Army; interview by authors, 11 February 1998.

36. Dr. Stuart W. Olson, STRICOM, AMSTI-E, briefing to MG John E. Longhouser, Commander, U.S. Army Test and Evaluation Command, and BG John P. Geis, Commander, U.S. Army Simulation, Training and Instrumentation Command, 7 February 1997.

37. The U.S. Army Logistics Management College, Fort Lee, VA, teaches courses on VERT. For more information, see on-line http://www.almc.army.mil/


5 EXPANDING THE SBA ENVELOPE

In this chapter we describe how Simulation Based Acquisition practices within the Department of Defense may be expanded. To set the stage, we present the factors influencing the extent to which a program can implement SBA. Next, we present the areas that programs can “push” while implementing SBA activities within their program. Finally, we discuss the “pull” programs will receive from the warfighting and programming communities once they and other programs begin implementing SBA.

Factors Influencing the Use of SBA – Domain, Infrastructure, and Need

Many programs are doing smart things with M&S, and many of these programs say they are already implementing SBA. Others counter that those programs are only beginning to scratch the surface of what is possible using a simulation-based approach to acquisition; they are just gaining benefits from having a common description of the product, so they cannot claim to be implementing SBA. SBA is a new approach that has a sliding scale—programs can do a little, or they can do a lot. As someone said at a recent conference, SBA is a journey, not a destination. How much SBA can be done in a program largely depends on three factors:

1. the domain the program is in;

2. how well developed the infrastructure is; and

3. how compelling a need there is to change the current way of doing business.

Regarding the first factor, how much SBA can be done will partly depend on the maturity of the program’s domain. The DoD M&S Glossary defines a domain as “The physical or abstract space in which entities and processes operate. A domain can be land, sea, air, space, undersea, a combination of any of the above, or an abstract domain, such as an n-dimensional mathematics space, or economic or psychological domains.”1 If the domain is fairly mature with well-developed models, then a program has a much better chance of leveraging others’ investments to their benefit. On the other hand, a program’s knowledgeable industry partners may still be using blueprints and fax machines. These programs will probably have to set their sights a little lower in how far they will be able to reap the benefits of moving to an acquisition strategy based upon the heavy use of M&S.

The automotive and aerospace industries are two of the most advanced domains because of their lengthy involvement with M&S, and they have had good success in developing their enterprises by integrating with their suppliers. (An enterprise is defined as “an arbitrarily-defined functional and administrative entity that exists to perform a specific, integrated set of missions and achieve associated goals and objectives, encompassing all of the primary functions necessary to perform those missions.”2) Conversely, the U.S. shipbuilding industry has many domestic suppliers that do not have any expertise in the synthetic environment and are unable, therefore, to provide CAD drawings (much less digital models) of their products. This impedes the benefits the prime contractors in the shipbuilding industry can accrue by using M&S, since they have the additional expense of reverse engineering the models of their suppliers’ products, with all of the attendant worries about the validity of the models.

It’s important to note that in spite of this limitation, the shipbuilding industry has gained significant benefits from using M&S. For example, General Dynamics Electric Boat Corporation was able to reduce the number of pipe hangers on the New Attack Submarine (NSSN) from 40,000 to about 18,000. Since hangers are a major cost item on a submarine, this equates to a major cost savings for the program. General Dynamics more than recouped their implementation costs because the cost reduction on the first boat alone exceeded the entire cost of modeling it, and they are building three more.3

The second factor influencing the transition to SBA will be how much of the infrastructure is in place to facilitate a program’s entry into a simulation-based acquisition process. The Defense Modeling and Simulation Office (DMSO) is building many parts of this infrastructure. For example, DMSO has created the Modeling and Simulation Resource Repository (MSRR), which is a collection of models, simulations, object models, Conceptual Models of the Mission Space (CMMS), algorithms, instance databases, data sets, data standardization and administration products, documents, tools, and utilities. The MSRR is a collection of resources hosted on a distributed system of resource servers. These servers are interconnected through the worldwide web, using the internet for the unclassified MSRR and the Secret Internet Protocol Routing Network (SIPRNET) for the classified MSRR. The MSRR provides a layer of services that includes the registration of resources and users, description and quality information of resources, and specialized search capabilities.4 DMSO has also created the Modeling and Simulation Operational Support Activity (MSOSA), which assists DoD activities in meeting their M&S needs by providing operational advice and facilitating access to M&S information and assets. The MSOSA is a contractor-staffed activity operating under the direction of the DMSO Director of Operations.5 Programs needing M&S help or assistance should contact the MSOSA. Contact information is available through the DMSO website (mentioned previously in the preface).

DMSO is also leading a DoD-wide effort to establish a common technical framework to facilitate the interoperability of all types of models and simulations, among themselves and with command, control, communications, computers and intelligence (C4I) systems, as well as to facilitate the reuse of M&S components. This Common Technical Framework (CTF) consists of three pieces: the CMMS, Data Standards (DS), and the High Level Architecture (HLA).6

The mission of the CMMS is to develop a conceptual model of the mission space for each DoD mission area to provide a common basis for development of consistent and authoritative M&S representations. Its purpose is to provide designers with an evolvable and accessible framework of tools and resources for conceptual analysis. The mission space structure, tools, and resources will provide both an overarching framework and access to the necessary data and detail to permit development of consistent, interoperable, and authoritative representations of the environment, systems, and human behavior in DoD simulation. Using this framework, designers will be able to develop a clear picture of what they wish to represent in order to produce a workable model or simulation for any application. This picture will be multi-dimensional and must include a depiction of the entities, actions, and interactions that must be represented. There will be several CMMS corresponding to broad mission areas (such as conventional combat operations, other military operations, training, acquisition, and analysis).7

The second piece of the CTF is the M&S Data Standards Program. The mission of the M&S DS Program is to enable data suppliers to provide the M&S community with cost-effective, timely, and certified data to promote reuse and sharing of data; interoperability of models and simulations among themselves and with the warfighter’s C4I systems; and improved credibility of M&S results. The strategic objective is to establish, promulgate, and oversee policies, procedures, and methodologies for M&S data requirements; data standards; data verification, validation, and certification; authoritative data sources (ADS); and data security to provide quality data as common representations of the natural environment, systems, and human behavior. The Data Program is organized into four areas: Data Engineering (including Data Interface Format (DIF) and tool development), Authoritative Data Sources, Data Quality, and Data Security.8

Within the Data Engineering area, the Synthetic Environment Data Representation and Interchange Specification (SEDRIS) effort is significant for facilitating SBA. Currently, there is no uniform and effective standard mechanism for interchanging synthetic environments between M&S applications. SEDRIS will support the unambiguous interchange of data between database generation systems by using a standard application programmer’s interface that will allow the data to be captured from the producer’s native format with accompanying explanatory metadata. Although SEDRIS by itself will not solve all interoperability problems, it will provide the technology to enable solutions.9
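As a rough illustration of the interchange idea, the Python sketch below shows native data being captured into a neutral package that carries explanatory metadata, which any consumer can then import without a producer-specific translator. The function names, fields, and data are our assumptions for illustration, not the SEDRIS API.

    # Hedged sketch of SEDRIS-style interchange: wrap producer data in a
    # neutral structure with explanatory metadata so consumers need no
    # knowledge of the producer's native format.
    def export_to_neutral(native_terrain, producer):
        """Capture native data into a neutral package with metadata."""
        return {
            "format": "neutral-interchange-v1",
            "metadata": {
                "producer": producer,
                "units": "meters",   # explanatory metadata travels with the data
                "datum": "WGS-84",
            },
            "payload": [{"type": "elevation_post", "x": x, "y": y, "z": z}
                        for (x, y, z) in native_terrain["posts"]],
        }

    def import_from_neutral(package):
        """Any consumer reads the same neutral structure, guided by metadata."""
        assert package["metadata"]["datum"] == "WGS-84"
        return [(p["x"], p["y"], p["z"]) for p in package["payload"]]

    posts = import_from_neutral(
        export_to_neutral({"posts": [(0, 0, 120.5), (30, 0, 121.0)]}, "TerrainGen-X"))
    print(len(posts), "elevation posts imported")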

The third piece of the CTF is the High Level Architecture (HLA) program. The HLA is a composable approach to constructing simulations that recognizes that no single, monolithic simulation can satisfy the needs of all users; that all uses of simulations and useful ways of combining them cannot be anticipated in advance; and that future technological capabilities and a variety of operating configurations must be accommodated.10 The HLA provides a common framework within which specific system architectures, or federations, are defined by three components. These are:

1. Ten rules that define the relationships among the federation components;

2. An object model template that specifies the form in which simulation elements are described;

3. An interface specification that describes the way simulations interact during operation.11

A federate is a member of an HLA federation, which may include federation managers; data collectors; real world (“live”) systems such as instrumented ranges, sensors, or C4I systems; simulations; passive viewers; and other utilities.12
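To make the three-part structure concrete, here is a deliberately simplified Python sketch. It is not the real HLA runtime interface, whose services are defined by the interface specification itself; the class and method names are our illustrative assumptions. It shows an object-model-template-style declaration of what may be exchanged, a federation object standing in for the runtime interface, and a rule-like constraint enforced as a check.

    # Toy illustration of the HLA's three components; not the RTI API.
    class ObjectModelTemplate:
        """Specifies, in a standard form, the objects a federation exchanges."""
        def __init__(self, object_class, attributes):
            self.object_class = object_class
            self.attributes = attributes

    class Federation:
        """Stands in for the interface specification: all exchange flows here."""
        def __init__(self, omt):
            self.omt = omt
            self.federates = []

        def join(self, federate):
            self.federates.append(federate)

        def update_attributes(self, sender, values):
            # Rule-like check: only attributes declared in the OMT may flow.
            assert set(values) <= set(self.omt.attributes)
            for fed in self.federates:
                if fed is not sender:
                    fed.reflect(values)   # federates never talk directly

    class Federate:
        """A member of the federation: a simulation, viewer, or live system."""
        def __init__(self, name):
            self.name = name
        def reflect(self, values):
            print(self.name, "received", values)

    omt = ObjectModelTemplate("Vehicle", ["position", "velocity"])
    federation = Federation(omt)
    sim, viewer = Federate("vehicle-sim"), Federate("passive-viewer")
    federation.join(sim); federation.join(viewer)
    federation.update_attributes(sim, {"position": (10.0, 2.0), "velocity": 4.5})

The point of the separation is that federates never exchange data directly; everything flows through the common interface against a declared object model, which is what lets dissimilar members (simulations, viewers, live systems) be combined.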

The MSRR, MSOSA, and CTF are all pieces of the SBA infrastructure that facilitate the entry of programs into SBA. (We’ll have more to say about standards and interoperability in Chapter 7, Challenges to Implementing SBA.) At this stage, many of the tools needed to do enterprise-wide simulation are at an immature, “micro” level, and are not ready to handle “macro” jobs.

The big three automakers have been solving their enterprise data interoperability needs by dictating that their first-tier suppliers must use their computer-aided design packages for development and submission of their products. (Chrysler uses Dassault Systemes’ CATIA, Ford uses SDRC’s I-DEAS, and General Motors uses Unigraphics Solutions, Inc.’s package.) The commercial side of The Boeing Company has also standardized on CATIA. Others are choosing to solve their enterprise needs by using standards for the universal exchange of product information. For example, the Standard for the Exchange of Product Data (STEP) is an emerging international standard for representing data about products that provides a set of standard definitions for the data throughout the product life cycle.13 Using STEP, The Boeing Company (military side) is able to successfully exchange design information between its St. Louis facility (which uses Unigraphics to design the Joint Strike Fighter’s fore body) and its Seattle facility (which uses CATIA in its role as systems integrator).14 General Dynamics Land Systems (GDLS) uses both Pro-E and Computervision for their design software, while their vendors mostly use AutoCAD, Unigraphics, and Pro-E. GDLS maintains interoperability by focusing their efforts on ensuring the accuracy of the data.15 Likewise, United Defense Limited Partnership is using various commercially available tools (e.g., Pro-E for solid modeling and Coryphaeus for visualization) and linking them together to get the job done.16 The effort a program must devote to ensuring seamless interoperability between the products of M&S tools will depend on factors such as the number of different M&S tools used, their degree of compatibility, and the level at which they interface. Translation efforts in the examples cited above range from being a nuisance to being major problems that require full-time support.
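A standard argument for neutral formats such as STEP (our illustration, not one the report makes explicitly) shows why this matters as the number of tools grows: with n native formats, pairwise exchange requires up to n(n − 1) directed translators, while a neutral standard needs only 2n, one importer and one exporter per format. For the five design packages named above, that is 5 × 4 = 20 translators versus 2 × 5 = 10, and the gap widens quickly as n grows.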

Regardless of the method used, programs need a way to communicate freely in their integrated data environments. OSD has mandated HLA compliance for simulations by 30 September 2000, and development and modification of non-compliant simulations must cease by 30 September 1998; however, “HLA is not an interoperability ‘magic wand.’ That is, HLA will not automatically make every simulation suitable for federating with every other simulation, nor guarantee a valid, meaningful exchange of information across the federation … but the HLA does provide the critical technical foundation for the interoperability of simulations among themselves, and with live systems.”17 Said another way, HLA will provide the common “plug” within a federation, but it will not guarantee that models from different federations will “play” together in simulation. When examining the possibilities, the real answer may often be that applications need to be “compatible but not necessarily common.”18

A final factor influencing the transition to SBA will be how compelling a need there is to change the current way of doing business. Crisis is often the reason for change, and it has driven many programs to turn to increasing and more innovative uses of M&S. Certainly this is the case for the Army’s Comanche helicopter program. The program office used M&S extensively to conduct a competitive fly-off in which two competing teams used man-in-the-loop simulation instead of building costly prototype aircraft. In addition, M&S activities allowed program management to work within budget constraints and maintain a viable program throughout developmental test and evaluation with only one flying prototype.19 In the automotive industry this was most apparent with Chrysler, which was on the verge of bankruptcy in the early 1990s, with many employees working half days. In response to this crisis, Chrysler built a new technology and design center and made a significant switch-over to using computer-aided engineering. A workforce that saw and supported the need for change made this possible. Similarly, within the DoD there is a compelling need to find a better and more affordable way of acquiring systems. It is unwise to continue business as usual in light of the rapidly evolving M&S technology that will support SBA as a better alternative.

SBA Push and SBA Pull

The domain of the program, the maturity of the infrastructure, and the degree of need will all influence how much SBA can be implemented in a program. But the key point is that any program can start moving down the path to implementing SBA. This “SBA Push” is what programs are doing within their enterprise to implement SBA, given the existing state of the domain, infrastructure, and need.

Likewise, the “SBA Pull” is the expectation and demand for SBA from outside agencies (for example, the warfighting and resource allocation communities). Dollar for dollar, programs that are implementing SBA will be perceived as better programs, because of the increased visibility, superior insight, and subsequent “better, faster, and cheaper” products.

SBA Push

We’ll start by looking at how programs can expand the envelope by moving down the path to implementing SBA. Regardless of where a program is in its life cycle, it can benefit by initiating SBA practices. It is usually not cost-effective for a program to wait until everything is completely available and in place before starting to use SBA. It’s more important to get started, achieve some successes, learn, and continue to build. A good analogy is that of building a housing subdivision. Once the overall plan is in place, there’s a tradeoff between the cost-efficiencies of scale and volume and the investment required. Usually this means building a few houses, selling them to generate cash flow to finance additional houses and infrastructure, and continuing to expand throughout the subdivision. An added benefit to this approach is the learning that occurs as each house is built. The last house should be significantly better, take less time, and cost less to build than the first.

Defense acquisition programs are normally inventing or reinventing a new product, and inventions do not start out with proven reliability; reliability is achieved through history, use, and time. It’s important, therefore, to get started in SBA in order to continually and incrementally improve the processes, rather than waiting until everything is in place. The movement toward SBA will be a progression. It has been stated that “the concept is revolutionary, but the implementation will be evolutionary.”20

An important consideration for the program manager is to not oversell SBA. As noted by one industry manager, the rate of change toward SBA is a concern. The community must accurately portray the direction, the speed of advance, and the capabilities of the final end state of SBA so that expectations are in line with reality. There will be incremental changes and payoffs. Creating unrealistic expectations can negate any potential benefits. In particular, there may be many things that cannot be modeled well enough to get the information needed.

Does it require an up-front investment for programs to adopt SBA? Certainly there are initial investments required in terms of computer and software purchases, as well as workforce training.21 There are other costs, including application software changes requiring the updating and revalidation of simulations.

Much has been said about The Boeing Company’s failure to recoup its approximately $1.5B to $2B investment in the full CAD/CAM system used on the 777 airplane. However, their automated, simulation-based design and build process enabled them to deliver a completely reliable, operational aircraft on day one, a first of its kind in the commercial aircraft industry. In the past it usually took a year after delivery to work out all the software bugs, maintenance issues, and parts distribution for a new aircraft. Boeing’s instant success with the first aircraft has stimulated more sales, and the benefits extend across improved operations and procedures for the entire Boeing commercial aircraft workforce. The company remains well positioned to outstrip its competition with faster aircraft upgrades.22 With better customer satisfaction, lower operating costs, and lower maintenance costs, Boeing expects to sell more 777 aircraft. It is also able to reuse this technology on other aircraft at no additional development cost. Finally, if Boeing had not gotten the aircraft to market as quickly, Airbus might have taken some of the sales. Benefits like these are difficult to factor into the investment decision with hard data because there is no way to predict the outcome had traditional processes been used. This “cost avoidance” or opportunity cost issue makes it difficult to determine the return on investment for switching to the SBA approach. Programs can show cost avoidance figures resulting from building fewer prototypes, conducting more efficient tests, or needing only one flying testbed. But it is very difficult, if not impossible, to put dollar values on risk reduction or a greatly improved product.
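One hedged way to frame the difficulty (our notation; the report offers no formula) is

    ROI = [(C_traditional − C_SBA) − I] / I

where I is the up-front investment and C_traditional, the cost the program would have incurred under traditional processes, is a counterfactual that can be estimated but never observed. The dominant terms in the numerator are therefore cost-avoidance estimates rather than hard data, which is precisely why the investment decision is so hard to make on numbers alone.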

In any event, most programs probably will not receive extra funding to change over to this new SBA approach. The reality is a matter of reallocating the existing budget to obtain the best value, and determining when that value will be received. Again, this will require a tradeoff analysis to balance specific needs with the available resources. The Army dictates the use of a Simulation Support Plan (SSP) to effectively manage and integrate the use of M&S within acquisition programs.23 The Air Force handles this tradeoff analysis by requiring programs to reflect their M&S strategy and requirements in the appropriate acquisition documentation, such as Operational Requirements Documents, Test and Evaluation Master Plans, and Single Acquisition Management Plans.24 The objective of these requirements is the same: to determine the high-leverage areas that simulation can enhance, while at the same time planning to build from these successes.

High-leverage areas can vary greatly from one program to another. Factors that can influence a program’s high-leverage areas include the program’s developmental time line and the program’s strategies and goals. For example, rather than baselining a system first and then creating a trainer, programs are creating early-on test-beds that provide the opportunity to iterate and immerse the user, and which can ultimately grow into a trainer.25

Through the use of the virtual environment, the Crusader program, with a simulation budget of $9M, was able to reduce the program’s requirement for physical prototypes by six. At a price tag of $50M each, the program realized a total cost avoidance of $300M.26 Strategies and goals also help identify programmatic high-leverage areas. Chrysler has reduced its cost of doing business by $800M by taking eight months out of the cycle time through the use of the virtual environment during design development. Their prototypes are now as good as the first production cars were when they were using the traditional production method.27 General Motors does 80 percent of its manufacturing in-house (compared with Chrysler’s 20 percent and Ford’s 50 percent). GM’s costs, therefore, are associated with tooling, and they must concentrate on increasing their confidence level before committing to tooling.28 This goal of increasing the confidence level in the design is supported through the use of SBA practices. Ford, on the other hand, concentrates on a global, distributed engineering capability, which emphasizes the importance of product information management.29 Designing in the virtual environment supports this strategy of distributed engineering and information management.
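To make the Crusader arithmetic explicit, using only the figures cited above: the gross avoidance was 6 prototypes × $50M = $300M, so the net benefit after the $9M simulation budget was roughly $300M − $9M = $291M, a return of better than thirty to one on the simulation investment.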

Many of today’s programs focus primarily on the design development portion rather than the full life cycle of a system; even so, this focus is an important first step toward achieving the full benefits of SBA. Within systems, we’re just beginning to simulate beyond the traditional performance issues to address the entire life cycle’s cost issues during design. These include manufacturability, supportability, lethality, sustainability, mobility, survivability, flexibility, interoperability, reliability, and affordability. One of the nearest-term capabilities that will benefit many programs is “virtual manufacturing,” the ability to look at manufacturing issues during design.

The F-22 program is looking at supportability issues by using a 3-D model that has enabled two mechanics to influence the design by making critical inputs long before any hands-on prototypes were available.30 The JSF program is developing the Joint Interim Mission Model (JIMM) to merge two legacy models, enabling test and training inputs to their early modeling efforts until JMASS is available.31 The Crusader program, together with the combat development community, has jointly conducted a series of simulated warfighting experiments called Concept Evaluation Programs (CEPs), which provided early warfighter insights into the changing tactics, techniques, and procedures associated with the new weapon system.32 During the redesign of the Boeing 737 aircraft, Boeing found that digitizing older systems, where the design existed only on blueprints, can still pay big dividends. They did a full Digital Product Assembly and Digital Product Definition for only those parts that were new (the wing, empennage, and main landing gear). Although the digitization of the old parts was tedious, Boeing found it to be worthwhile.33 The JSF created cost curves for each alternative during its Analysis of Alternatives, which allowed operational tradeoffs using CAIV.34

Under the CAIV concept, it is critical for a program manager to know how design decisions will affect overall program costs. Some programs have developed cost models based on the historical costs of similar systems. A good example is the CVN-77 aircraft carrier program, which looked at previous ships to determine what the cost drivers were throughout the life cycle of a ship.35 This identified where the CVN-77 program should target attention to achieve the most cost reduction.

The Crusader program has high confidence in its cost models and has established an Ownership Cost Working Group, consisting of representatives from the contractor, user community, and program office, to identify life cycle cost drivers and methods to eliminate or minimize them. During the early phase of the Crusader’s development, an independent study group was commissioned to analyze the program’s requirements and design in order to provide recommendations balancing weight, cost, and performance. To facilitate this process, a model was developed, based on interviews with design engineers and subject matter experts. The model showed the overall change in force effectiveness based on incremental increases or decreases in weight, cost, and performance.36

The program manager and user representative are able to use this information collaboratively in making tradeoff decisions.

An SBA approach enables early identification of the key drivers (such as cost, schedule, and performance) throughout the system’s life cycle, by simulating systems and environments external to the system.37 The Longbow Hellfire missile program, for example, eliminated its fly-to-buy criteria through close coupling with its contractor, using the Simulation/Test Acceptance Facility (STAF) to finalize the acceptance test procedures long before the production phase.38

The High Mobility Artillery Rocket System (HIMARS) program is an example of how valuable M&S development and reuse is becoming to program managers. The HIMARS is a multiple launch rocket system that uses a five-ton truck as its mobility platform. The HIMARS program manager decided that using M&S was the best approach for development of the HIMARS rocket launcher. No model of a five-ton truck existed, but a model was needed for integration efforts if M&S was to be used to develop the rocket launcher. The HIMARS program spent $500K of a $300M budget to build a five-ton truck model from the Family of Medium Tactical Vehicles (FMTV) program, and is willingly providing this model to other programs that ask for it. Other programs in the Tank and Automotive Command can now build off of this initial model to develop models of the other 24 variants of 2.5-ton and five-ton vehicles that have 85 percent commonality,39 at much less expense than if each program were to create a new model. Later on, the Military Traffic Management Command (MTMC) used the five-ton truck model to conduct rail impact analysis during transportability testing. MTMC was also able to use the model to conduct air and ship impact analysis.40 Users outside the Tank and Automotive community will also be able to benefit when this model is added to DMSO’s MSRR and becomes searchable and readily available. What started out as a requirement to model a single vehicle for one program has grown into the potential for influencing many programs and aiding the requirements of other organizations.

Many companies and programs are using a visualization room to bring team members together for the free exchange of ideas between disciplines. Although much of the work in an integrated data environment can be distributed and decentralized, a visualization room visually presents the digital data in forms that a team can use as a group. Concepts and work-in-progress can be displayed, analyzed, and debated by integrated product teams, even if they are geographically dispersed. Work can be imported electronically to conduct design reviews or to immerse a customer for feedback. The Marine Corps’ Advanced Amphibious Assault Vehicle (AAAV) program has an excellent visualization room. It was built as an integral part of the entire AAAV facility, which includes government and contractor personnel. The visualization room provides the IPPD team a location to discuss issues while viewing 3-D representations of the vehicle or any component. If the team needs further clarification of an issue, they can move to the adjacent high-bay area to look at a mockup or the physical prototype. This capability is invaluable for quickly and effectively resolving program issues.

In building the 777 commercial airliner, Boeing used an application called FlyThru that provides a three-dimensional view of the airplane as it is being designed. This allows errors to be detected and fixed prior to committing to expensive manufacturing processes. The fit of parts and systems on the 777 airplane was 20 times better than what had normally been achieved in the past. According to Boeing’s manager of Visualization Tools, “Being able to digitally build the entire plane and see the parts of the plane before building it was the biggest money saver for the 777.”41

SBA Pull

SBA Pull (as we described at the beginning of the chapter) is the expectation and demand from others once they see the advantages of SBA.

From program inception, the warfighter community will become an active participant in program design and development. Long before any physical prototypes are available, virtual prototypes of system concepts will be able to participate in exercises that demonstrate the battlefield effects. Logisticians will be able to quantify the cost effectiveness of making the system more supportable and work with the design team to find the most cost-effective mix of performance and support parameters. Using visualization tools such as three-dimensional solid modeling, people with real field experience can also provide real-time user feedback on design iterations, because they will be able to “see” and interact with the design as it matures.

This user feedback will begin to extend beyond the traditional service boundaries of the sponsoring organization. Programs will begin to look at how their system will interact with other systems on the battlefield by interfacing those system models on a virtual battlefield. The user will be able to begin using the new system in the expected environment and assessing any potential incompatibilities or unforeseen circumstances that could be averted in the design. Programs will be able to use each other’s models to get the design information they need. The combatant commands can also be immersed at various intervals as the design matures and provide their assessment and feedback on how well the system is meeting their expectations. There will be better dialogue with the resource allocation community, since the program will have much improved tools to keep the cost of the system affordable. In exploring the cost drivers over the entire system’s life, programs will also have much more accurate projections of the system’s cost. Regardless of who is interfacing with the program, these outside agencies will have a much enhanced ability to understand the program. Indeed, many programs noted that a significant side benefit was the great marketing tool this approach provides for visitors and oversight personnel, because program outsiders could instantly grasp the import of what they saw.

The Crusader field artillery program started a force effectiveness analysis even before the request for proposal was issued. The contractor conducted trade studies, using multiple scenarios, to optimize the system within the larger system of systems. They looked at the problems encountered (e.g., thermal analysis, ammunition capacity, reliability, maintainability, availability) in terms of cost per unit of force effectiveness. They discovered that the Crusader could do things the current system (with upgrades) cannot. For example, the current Paladin system cannot keep up with the Bradley armored personnel carrier or the M1 tank. In addition, the Crusader frees up the Multiple Launch Rocket System to hit deeper targets. In short, the Crusader increased the operational tempo of the battlefield. Some of these insights were intuitive, and some were not. The value of much of this information was difficult to explain, and the contractor was caught between satisfying the requirements stated in the request for proposal and making tradeoffs whose value to the user he could only guess at. For example, if he could decrease the size of the crew from five people down to three, what was the change in total ownership cost? Were the cost savings significant enough to justify spending money now to decrease the crew size? What were the operational tempo impacts on the Bradley, and would it be able to get the necessary information quickly enough to the Crusader, which is 40 kilometers away? These are the types of issues that the requirements community needs to deal with early. The earlier in the design phase we find the answers to these questions, the better. In the past we have often not discovered these issues until operational testing, or worse, not until the system was deployed to the first unit. By then we have already produced and fielded the system, when the cost to fix a problem discovered earlier would have been considerably less.

While SBA provides the capability to integrate back into the requirements generation process, it can also reach forward into the resource allocation process (the Planning, Programming, and Budgeting System) by providing total ownership cost information for the out-year budgets. For example, there is an obvious extra cost associated with putting simulators on board an aircraft carrier, but it may be worth that cost to counteract deployed training atrophy.44 SBA provides a way to explore those costs.


Even low-cost demonstrations are finding it cost-effective to use a simulation-based approach. The Predator unmanned aerial vehicle advanced concept technology demonstration creatively used M&S to predict operational effectiveness and assess alternative force structure options, and thereby determined an optimum system configuration.45

Many programs are expanding the envelope on what’s possible in SBA, and they are doing it cost-effectively. And although the tools are still limited, these programs are beginning to address the bigger implications of system of systems issues, where in the past we did not have the ability to address them at all. Larry Winslow, Director of Technology for Boeing’s Phantom Works, summed it up by saying, “Now we’re doing Program SBA. In the future we’ll be doing Systems of Systems SBA.”46


ENDNOTES

1. “A Taxonomy for Warfare Simulation (SIMTAX),” Military Operations Research Society (MORS) Report, 27 October 1989, on-line http://www.dmso.mil/docslib/mspolicy/glossary/ accessed 24 June 1998.

2. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

3. Jim Boudreaux, Electric Boat Corporation, interview by authors, 24 February 1998.

4. MSRR home page, on-line http://www.msrr.dmso.mil/ accessed on 25 June 1998.

5. MSOSA home page, on-line http://www.msosa.mil.inter.net/Default.htm accessed on 25 June 1998.

6. Judith Dahmann, “High Level Architecture,” briefing (1998). See http://www.dmso.mil/ for a more in-depth discussion of the CTF and its three components: CMMS, Data Standards, and HLA.

7. DMSO CMMS home page, on-line http://www.dmso.mil/projects/cmms/ accessed on 26 June 1998.

8. DMSO Data Standardization home page, on-line http://www.dmso.mil/projects/ds/ accessed on 26 June 1998.

9. http://www.sedris.org/faq_trpl.htm

10. HLA home page, on-line http://hla.dmso.mil/ accessed on 25 June 1998.

11. Ibid.

12. DoD 5000.59-M, DoD Modeling and Simulation (M&S) Glossary, December 1997.

13. PDES, Inc. home page, on-line http://pdesinc.scra.org/ accessed on 5 May 1998.

14. Larry Winslow, Director of Technology, Boeing Company Phantom Works, interview by authors, 29 April 1998.

15. Robert Lentz, General Dynamics Land Systems, interview by authors, 24 February 1998.

16. Richard Staiert, United Defense Limited Partnership, interview by authors.

17. High-Level Architecture (HLA) Transition Report, March 1998, p. 2, on-line http://www.dmso.mil/hla/policy/rept611.html accessed on 29 May 1998.

18. Dr. Stuart Starr, The MITRE Corporation, interview by authors, 4 March 1998.

19. Bergantz, ibid.

20. Dr. David Chang, General Motors, interview by authors, 31 March 1998.

21. For more information on implementing computer-aided engineering and concurrent engineering within a company, see John Krouse, “Manager’s Guide to Computer Aided Engineering,” OnWord Press, 1993.

22. John Roundhill, Vice President, The Boeing Company, remarks at INCOSE Annual Symposium in Vancouver, B.C., on 30 July 1998.

23. Army Regulation AR 5-11, “Management of Army Models and Simulations,” 10 July 1997.

24. Air Force Acquisition Policy 97A-004, “Modeling and Simulation in Support of the Air Force Acquisition Process,” 28 November 1997.

25. Colonel Lynn Carroll, USAF, Warfighter Training Research Division Chief, Air Force Research Laboratory, briefing to the Joint Strike Fighter Mission Level Strategy session, 30 March 1998.


26. Armbruster, ibid.

27. Greg Avesian, Manager, Solutions Marketing, Chrysler Corporation, interview by authors, 26 February 1998.

28. Chang, ibid.

29. Richard Riff, Manager, C3P Project Office, Ford Motor Company, interview by authors, 23 April 1998.

30. Larry Winslow, Director of Technology, Boeing Company Phantom Works, interview by authors, 29 April 1998.

31. Major General Leslie J. Kenne, Program Manager, Joint Strike Fighter Program, remarks at the National Defense Industrial Association SBA Conference, 17 March 1998.

32. Sheaves, ibid.

33. Michael Mecham, “Aerospace Chases The Software Boom,” Aviation Week & Space Technology, 6 October 1997, 46-49.

34. Kenne, ibid.

35. Dan Selfridge, Newport News Shipbuilding, “Lifecycle Cost and Manning Cost Analysis,” briefing given to NAVSEA PMS 392 Submarine TOC IPT Meeting on 2 April 1998.

36. Jim Shields, Chief of Systems Engineering, Crusader Program Office, interview by authors, 1 July 1998.

37. Armbruster, ibid.

38. Ibid.

39. http://www.army-technology.com/contractors/transport/stewart_stevenson/index.html accessed 11 August 1998.

40. Brock Birdsong, Missile Research, Development and Engineering Center, US Army Aviation and Missile Command, phone interview with author, 29 July 1998.

41. Marilyn Morgan, “Boeing Leads Way in Virtual Mock-up,” Boeing News, 3 April 1998.

42. Peter Cherry, Vice President, Vector Research, Inc., interview by authors, 27 April 1998.

43. CJCSI 3170.01, Requirements Generation System (Formerly MOP 77), 13 June 1997.

44. Colonel Lynn Carroll, USAF, Warfighter Training Research Division Chief, Air Force Research Laboratory, briefing to the Joint Strike Fighter Mission Level Strategy session, 30 March 1998.

45. MG Kenneth R. Israel, Director, Defense Airborne Reconnaissance Office, “Modeling and Simulation Employed in the Predator Unmanned Aerial Vehicle Program,” on-line http://www.acq.osd.mil/daro/homepage/predms.htm accessed 11 December 1997.

46. Larry Winslow, Director of Technology, Boeing Company Phantom Works, interview by authors, 29 April 1998.


6

A FUTURE STATE OF SBA

“Future commanders must be able to visualize and create the ‘best fit’ of available forces needed to produce the immediate effects and achieve the desired results.”

Joint Vision 2010

As the Joint Vision 2010 quotation above states, commanders must be able to project and select the best fit of systems available to accomplish their mission. This applies not only to the existing systems available today, but also to looking ahead at the deficiencies and opportunities in accomplishing future missions. SBA will provide this capability to warfighting commanders by giving them a synthetic view of future battlefields, where conceptual systems can be integrated and their operational effectiveness assessed.

This chapter portrays the implications of an SBA approach for future defense systems acquisition by projecting SBA capabilities and trends. While it is difficult to predict exactly how change will be implemented or to describe the resultant organizational structure, it is possible to envision what an SBA approach holds for the future. Within this acquisition process of the future, systems will be thoroughly modeled and simulated prior to “bending metal.” Beginning with initial user needs and requirement definition, through design, testing, manufacturing, logistics, training, operational usage, and disposal, the synthetic environment and integrated teams will play a major role. Ideally, the acquisition process becomes a continuum without a definable beginning or end, continually assessing projected threats and needs against the ability to meet those needs. One senior acquisition official put it best when he said, “If the models are done correctly, the simulations will tell us when we need to start a new program.”1

In describing this future process, a key assumption is that the systems acquisition process will continue to be centered on programs. The purpose of the acquisition process will be, as it is now, to produce superior systems in response to mission needs. An SBA approach will not change this product focus, but instead will enhance DoD’s ability to acquire the “best fit” of new systems to meet the warfighter’s needs. Some argue that in order to achieve the full potential of SBA it is necessary to move from a “product-centric” focus of systems acquisition to a “mission-centric” view.2 The argument is that during the earliest stages of turning a mission need into a materiel solution, it is necessary to have the broadest possible view in order to optimize the tradeoffs across service boundaries. This broad view could only be achieved by merging similar mission areas across the services, and divesting the services of control of the funds for these mission areas. The managers of these mission areas would have budget and decision authority to determine where best to invest funding to meet user requirements.

But just because the technology will enable the possibility of doing business this way does not necessarily make it desirable or a necessary precondition for implementing SBA. In fact, it would be a major barrier to implementing SBA, as the services have no intention of giving up their ability to determine the requirements for the materiel solutions to their mission needs. This cuts to the very foundation of how the services interpret their roles and missions. There is nothing inherent in this new way of doing business that requires the services to give up their right to determine their future, so we reject the argument that implementing SBA requires a “purple,” or multi-service, acquisition corps. Instead, our assumption is that acquisition will remain the province of each service and be program-centric. There are good reasons for maintaining separate services, and the services are continuing to learn how to interoperate and function as a joint team. So too can the acquisition system.

Another assumption is that a primary objective of SBA is to get the design right before building a system, at which point the design in effect becomes frozen. Any changes required after the start of production result in costly engineering changes and/or system modifications. If the design is kept mostly in the synthetic environment prior to production, the cost and time required to make a change is significantly reduced. In an SBA process, the defining point is when the program moves out of the synthetic environment and begins production.

The new ability to bring systems together while they are still in design will be key to ensuring that systems will work together when they are fielded, and that they are not over- or under-designed. Competing new systems can be evaluated side-by-side on equal terms to determine the best alternative. Duplication of capability within and across Services will be more readily apparent, resulting in significant cost avoidance. According to Dr. Peter Cherry, Vice President of Vector Research, Inc., “The tools we’re using are getting better, but now we need a richer context in which to make decisions, otherwise we’ll continue to only make acquisition decisions on the margin. We make marginal decisions now—the challenge is to provide a system of systems context so we can do better than decisions on the margin. Previously we only made incremental changes program by program. Today, systems of systems require the total integration of systems. We need to be able to walk the system’s impact on the battlefield all the way from the platoon, to the company, to the battalion level. We have a lack of the full understanding of system supportability issues. We still talk about systems from the cockpit perspective—we need to elevate up to a higher level, and talk to Congress about battlefield effectiveness, not how much faster or how stealthy a system is. For example, we need to be able to show the value of the JSTARS [Joint Surveillance and Target Attack Radar System aircraft] to the artillery, not how many targets it can track and its mean time between failure.”42

System of systems issues are a primary concern of the Service Chiefs and the Joint Requirements Oversight Council (JROC); however, these concerns are now primarily handled at a very high level through the use of a Capstone Requirements Document (CRD). A CRD is used to identify overarching requirements for a system, or for several programs that form a system of systems. It contains performance-based requirements to facilitate development of individual program requirements.43

Programs implementing SBA will be able to give the users the necessary insight to balance these needs and deficiencies in time to influence the design, rather than waiting for the Services and JROC to validate ever-increasing point design solutions during Milestone reviews. At that high level it is too late to impact the design cost-effectively, as all of the assumptions and tradeoff decisions have already been made in getting ready for the review. What are presented are the results of the program optimized against the static requirements dictated in the Operational Requirements Document (ORD), which assumed the user fully understood the impact of those requirements.

With this in mind, we will start by looking at the beginning of a program. This chapter discusses a future state of SBA by looking at four major phases over the life cycle of a program, beginning with:

1. generating a mission need;

2. iterative design and development activities;

3. production; and

4. operations and sustainment.

Needs Generation Phase

SBA will encompass more than what we today call acquisition. SBA will provide a common synthetic environment that the warfighting community can use to experiment with non-materiel solutions to meet deficiencies, and to help them determine affordable requirements. As stated in the Chairman of the Joint Chiefs of Staff instruction for the Requirements Generation System, the warfighter determines the need for a materiel or a non-materiel solution by conducting a mission area analysis of the current and projected capabilities to accomplish assigned missions. The Services first try to satisfy mission needs through non-materiel solutions, such as changes in doctrine or tactics. The Services will be able to use updated models and simulations provided by the SBA community as part of their mission area analysis. If a non-materiel solution is deemed not feasible, the need is translated into a Mission Need Statement (MNS), which is expressed in broad operational terms.3 As the MNS is documented, validated, and approved, the SBA process will bring the warfighter and acquisition communities together from the very beginning of program definition. Members of the acquisition community will have a better understanding of the warfighter’s need because they participated in the MNS development. Information gained from non-materiel experiments can provide the acquisition community insight into the warfighter’s needs, and any models developed may be of use in searching for a materiel solution.


The generation of an MNS marks the end of the needs generation phase in the SBA process. The warfighting and acquisition communities will determine the members of the IPT, who will collaborate to generate and define the operational requirements during the next phase of iterative design and development.

Iterative Design and Development Phase

The second major phase of a program in this future state of SBA is iterative design and development. As discussed in Chapter One, the early decisions in a program are critical because to a large extent they will determine system effectiveness and what the system will cost throughout its life cycle. The iterative process of determining requirements sets the stage for the TOC of the program. SBA enables the acquisition community to support the warfighters in determining affordable requirements, and provides a vehicle to get industry involved during early concept development. Rather than locking in requirements before understanding them fully, the requirements group in a program office develops simulations to conduct engineering trade studies that quantify capability versus cost. Flexibility in design is maintained as long as possible by keeping the requirements fluid until the cost impacts of each are well understood. The warfighters, industry, and program office personnel form an integrated team that makes informed decisions regarding capability versus cost issues across the entire life cycle of the system.

During this phase, models and simulations representing new and revolutionary concepts and systems are evaluated in a series of iterative “what if” tradeoff analyses. New concepts are evaluated in a virtual environment to determine their operational impact and system effectiveness. Cost models provide the TOC impacts of competing concepts and designs.

New conceptual models are evaluated to determine if they provide significant operational benefit, and promising concepts are developed further. As confidence in the models grows, many other issues can be addressed concurrently. Initial cost models of a design provide a rough order of magnitude (ROM) estimate of the TOC. As the design matures, these ROMs are refined and become more accurate. Using an integrated visual representation of the system, functional experts on the IPPD team make their requirements and concerns known to all IPPD members, and work directly with each other in determining the optimum design. Training requirements are addressed and decisions made whether to build a stand-alone trainer or to have training embedded in the system. Logisticians and testers address supportability and testability issues and compare results between competing conceptual systems. The transportation community evaluates transportability of the system design. Manufacturing issues are addressed by designing a virtual factory. Iterations continue until a high level of confidence is reached in the design, at which time the system is submitted for a production decision. If approved, all models, simulations, and data are passed on for use in producing the system.
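The iterate-evaluate-refine cycle just described can be sketched as a loop. The Python below is our illustrative rendering only: the scoring function, ROM cost estimator, and fidelity parameters are stand-ins for real virtual-environment runs and cost models, not anything prescribed by the SBA process.

    # Hedged sketch of the iterative design loop: evaluate competing
    # concepts, keep the best value per dollar, and refine model fidelity
    # until confidence supports a production decision.
    import random

    def effectiveness(design):
        """Stand-in for a virtual-environment evaluation run."""
        return design["performance"] * random.uniform(0.95, 1.05)

    def rom_cost(design, fidelity):
        """ROM estimate of TOC; the error band narrows as fidelity grows."""
        return design["base_cost"] * (1 + 0.5 * (1 - fidelity) * random.uniform(-1, 1))

    def iterate_design(concepts, confidence_goal=0.9):
        fidelity = 0.1
        while fidelity < confidence_goal and len(concepts) > 1:
            ranked = sorted(concepts,
                            key=lambda d: effectiveness(d) / rom_cost(d, fidelity),
                            reverse=True)
            concepts = ranked[:max(1, len(ranked) // 2)]  # keep best value per dollar
            fidelity = min(1.0, fidelity + 0.2)           # models gain detail each pass
        return concepts[0]                                # candidate for production decision

    winner = iterate_design([
        {"name": "concept-A", "performance": 8.0, "base_cost": 120.0},
        {"name": "concept-B", "performance": 7.0, "base_cost": 80.0},
    ])
    print(winner["name"])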

There should be little chance of a system being disapproved during a production decision, because all stakeholders have frequent opportunities for visibility and feedback during system design. Periodic reviews of the system’s progress will be possible by immersing the audience using tools such as visualization rooms. Problem areas can be resolved quickly by bringing in all stakeholders to buy off on critical affordability decisions as part of the design tradeoffs.

Oversight activities are necessary to ensure programs are producing the best possible weapons systems. Although a program will be considering system of systems issues as part of its iterative development, there will still be macro-level issues that need to be resolved at levels higher than any one program. This oversight function will be able to consider both joint and combined systems of systems issues, for those systems that will operate with other services and allied nations.

The primary activity of the oversight function is assessing and prioritizing efforts from a system of systems perspective. It assesses the progress of programs’ iterative developments and the expected production time to ensure that the system will be available when required. Programs will stay in iterative development and continue to refine the design in a synthetic environment, continually assessing new technologies as well as input from the field. A decision to begin production is made in sufficient time to allow for production, fielding, and training to occur prior to the need date. Oversight reviews will be periodic instead of activity-based, but can also be unscheduled in response to unforeseen changes in the need. These reviews will replace the current milestone reviews: because most of the design is conducted in the synthetic environment, the dollar commitment will be significantly reduced, and there will be minimal need for successive and increasing levels of commitment as the design matures. The ability to conduct most of the design in the synthetic environment will have the effect of fusing the activities leading up to today’s Milestone III production decision.

Production Phase

The third major phase of a program in this future state of SBA is production, which begins after a production decision is made at the end of the design and development phase. All models and simulations previously developed for a program are available for production. These models and simulations include tooling and factory design. Existing models and simulations are developed to a greater level of detail as necessary for full-scale production. As a result of previous rigorous development and validation of the models and simulations, there should be:

• Minimal Engineering Change Proposals, because the design coming into the Production Phase was determined acceptable to all stakeholders;

• Reduced tooling time, because tooling requirements were already identified;

• Fewer decisions to be made, because there should be few surprises and therefore fewer tradeoff decisions required (all known tradeoffs were evaluated and decided on);

• Better quality of the end item, because reliability, maintainability, supportability, manufacturability, producibility, and other support issues were all considered and incorporated into the design; and

• An affordable design, because evaluation of cost models with Cost As an Independent Variable (CAIV) forced early tradeoff decisions between performance and affordability.


Operations and Sustainment Phase

The fourth and final major phase of a program in this future state of SBA is operations and sustainment, which ends upon system disposal. Actual operations and support (O&S) data are used to update the system’s models and simulations. There is continuous analysis of this O&S feedback to determine future deficiencies. A deficiency may only require a system upgrade, or it may require development of a totally new concept. Proposed changes to the existing models and simulations, or new models and simulations to meet the deficiency, are again evaluated in a synthetic environment, and the warfighters can use this data to assess non-materiel alternatives, based upon the acquisition community’s updated models. Operational units and commands could use these models and simulations to better predict their budget requirements. Cost models that reflect actual costs incurred for operations and support of the system could be used to accurately project quarterly and annual training and operational requirements.

Net Result

This proactive approach to assessing program status is driven by mission need and the date a system is required, rather than by the excessive length of the acquisition cycle. Programs can be much more flexible in responding to changing mission needs, because there is not as much emphasis on locking down the requirements. The entire process is more agile because programs are able to stay in iterative development until production lead times coincide with need dates.

The ultimate effect of this new SBA process will be the ability to conduct “what if” scenarios across the entire spectrum of the DoD. Program teams will be charged with evaluating the interactions of their system(s) with other systems on the battlefield. The oversight function will be armed with the knowledge to make better-informed decisions on which systems will best meet the military’s needs. Program teams will have a much closer link to the battlefield upon which the systems operate, because they will be able to constantly see the battlefield effects and implications of their design decisions. All of this will help keep the team focused on the real goal of producing superior systems.


ENDNOTES

1. LTG George Muellner, Air Force Principal Deputy for Acquisition, interview by authors, 30 March 1998.

2. For a more complete description of this alternative view, see “Functional Description Document – 980209,” developed by the NDIA’s Industry Steering Group, available through the SBA Special Interest Area on-line http://www.msosa.mil.inter.net/sia-sba/sba_sia_documents.asp?process=view_archive

3. CJCSI 3170.01, Requirements Generation System (Formerly MOP 77), 13 June 1997.


7

CHALLENGES TO IMPLEMENTING SBA

Many programs are making great progress in implementing SBA; however, even the best programs are a long way from implementing the ideal process outlined in the previous chapter. This chapter outlines the cultural, technical, and procedural challenges to implementing this future state of SBA. Some of these challenges will be overcome more quickly than others, depending to a large extent on how much emphasis they receive. Cultural challenges result from people’s views, beliefs, and management actions; technical challenges result from M&S limitations in technology; and process challenges are caused by the way the DoD is organized and operates.

Cultural Challenges

Many people maintain that cultural barriers are the biggest challenge to overcome in moving to an SBA process. These challenges include acceptance of M&S, not believing everything seen, working as teams in a distributed environment, and becoming proactive in determining an affordable design.

Acceptance of the results of M&S is probably the single biggest challenge, as it involves a significant change in the way DoD does business. There is a natural resistance to changing the old ways if they appear to be working. There is significant resistance to believing in virtual prototypes, as people are more apt to believe one physical test versus thousands of virtual tests. For example, the technology for simulating the crashworthiness of a vehicle far outpaces the acceptance of the results, to the point where their accuracy is better than the variability between different physical crashes.1 However, despite this capability, all automobile manufacturers continue to conduct expensive crash tests (albeit far fewer than in the past). A culture change is required to view computer modeling as analogous to physical modeling. Even so, acceptance of M&S is occurring at an increasing rate as M&S is successfully implemented. Engineers frequently resist using M&S at first, but soon discover that it can be a very valuable design tool, and that they often do not need hardware to find problems.

An important corollary to this acceptance problem is to “not believe everything you see.” Simulations create compelling visual arguments that can drive complacency in questioning the underlying models. The mere fact that something can be modeled and simulated does not mean that it can be built. A rigorous process of verifying, validating, and accrediting (VV&A) models and simulations for their intended purpose can counteract this problem, and can increase the level of confidence that the models and simulations are representative of what was intended and accurately represent reality.2

Another cultural challenge is learning how to work as teams in distributed environments. Western culture and training place an emphasis on individual performance, but systems are designed and built by teams. An SBA process can accentuate this long-standing problem because even more emphasis is placed on working together as a team, and the teams may change frequently and not be co-located. Many companies are experimenting with ways to create just-in-time teams and reward team performance.

A final cultural challenge is learning to become proactive in designing an affordable system. This involves continually refining the requirements with the user to find the optimum mix of performance and affordability, and looking at the costs across the entire life of the system. The range of options to explore can be intimidating, and the tendency will be to move on with a design that appears good enough, rather than continuing to explore design options to reduce the TOC of the system.

Technical Challenges

Technical barriers are perhaps the easiest to envision. Technical challenges include interoperability concerns and limitations in what can be effectively and affordably modeled.

Interoperability challenges result from the desired ability to seamlessly share models and simulations between programs and across services to analyze tradeoffs in a system of systems synthetic environment. As discussed previously, the DMSO is pursuing various initiatives in the Common Technical Framework to facilitate interoperability and reuse. Certainly more technical improvements will continue emerging, such as computer-aided tools, which will facilitate the development and deployment of reuse standards and policies such as the High Level Architecture.

Programs can experience interoperability challenges within their own program as they attempt to move information up and down the hierarchy of models and simulations (for example, from campaign to mission to engagement to engineering, and then back up). There are interoperability challenges in making different tools communicate with each other, particularly those that were developed for use in a different context from the one for which they are now intended. For example, a logistics model that helps determine the optimum stockage level for spare parts may not be compatible with another tool that predicts reliability, or even worse, may give conflicting results.

The MSRR will give programs increasing access to information about what models and simulations are available for their system of systems analyses, which will facilitate maximum reuse. This is particularly evident with items that have been designed for reuse, such as threat and environment data. However, designing for reuse costs approximately three times as much as designing for a specific program,3 and is only feasible when all of the programs intending to use the models and simulations are known in advance. HLA begins to solve this problem by helping groups that choose to interact with each other define specific system architectures called federations. This creates a problem when a program wants to use another program’s model or simulation that was developed as part of another federation. An extreme but probable example would be attempting to use models and simulations from several other federations to get a true picture of the battlefield.
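The three-to-one figure implies a simple break-even test (our arithmetic, using only the cost ratio cited above): if a program-specific model costs C and a reusable version costs about 3C, designing for reuse pays off only when at least three programs would otherwise each build their own, since 3C ≤ nC requires n ≥ 3, and that is before counting the integration and configuration-management costs of actually reusing it.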

In an ideal SBA process, models and simulations can be reused as easily as using a library or repository. However, historically this type of ad hoc reuse has only resulted in about 20 percent savings in software code reuse.4 A lot of time is often spent analyzing the code to determine what is applicable for reuse, and often the most valuable results are the insights gained into how to functionally design the new code, rather than the reuse of the code itself. An SBA process may not actually reuse models and simulations in this traditional software sense, but the experience does point out the considerable difficulty encountered in a similar process that is much simpler than what SBA envisions.

Some of the easiest areas of reuse to envision are common areas that many programs will need, such as models of the environment in which systems will operate and of the threats these systems will face. Efforts are already ongoing in some of these areas, such as the SEDRIS effort mentioned earlier and the Virtual Proving Ground under development at the Army’s Test and Evaluation Command. The DoD has a vested interest in reducing duplication of efforts in these areas, and also in having a common depiction of them across all systems. Programs will be able to use the DoD’s environment to evaluate many aspects of their system. If a program has a requirement that is not satisfied by the common environment, the environment can be expanded to meet the need, and the expansion can subsequently be brought back into the aggregate common environment for reuse by other programs. The common environment gets expanded as a result. Having a common environment will also provide credibility and help to minimize issues that could arise if each contractor developed his own environment and threat models, especially if these systems were in a competitive selection process. Other items the DoD should consider providing are the analytical tools that will be used to evaluate contractors’ models. For example, if the DoD planned to use a constructive model such as the Army’s CASTFOREM (Combined Arms and Support Task Force Evaluation Model) to evaluate a contractor’s proposed model, then the program office should consider providing CASTFOREM to the contractor early in the program.

There are also technical limitations in what can be effectively and affordably modeled, including reliable cost figures, failure and reliability prediction, and human behavior. Cost models that accurately project TOC need to be available as early as possible to enable informed tradeoff analyses that all communities and decision makers can believe. Discrepancies will otherwise persist between cost estimates prepared by different agencies; for example, the General Accounting Office might predict higher cost estimates than the program's own. These discrepancies could effectively cripple the program's ability to design a more affordable system, because the credibility of the total ownership costs being used to make design tradeoffs is called into question. Cost curves for each alternative considered allow quantification of alternatives and operational tradeoffs with cost as an independent variable. The objective is to use modeling and simulation to help determine the "bend in the knee" when looking at performance versus cost. This bend in the knee refers to the point where the cost of providing greater performance begins to exceed the benefit it provides, or its point of marginal return. By showing the user where the bend in the knee starts to occur on a given performance characteristic, we can ask whether the incremental capability is worth the extra cost. The earlier we are able to show the user the cost implications of his requirements, the sooner we can agree to the tradeoffs necessary to reach an affordable design. This forces the user to make tough calls early in the system's development and avoids pursuing unaffordable solutions.
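
To make the idea concrete, here is a minimal sketch, entirely hypothetical and not drawn from the report, of locating a bend in the knee numerically: given sampled cost-versus-performance alternatives, flag the first step at which the marginal cost per unit of performance exceeds what the user is willing to pay. The curve, threshold, and units are invented.

```python
# Hypothetical sketch: locating the "bend in the knee" on a cost-versus-
# performance curve. All figures below are invented for illustration; a
# real program would substitute its own CAIV estimates.

def marginal_costs(perf, cost):
    """Incremental dollars per unit of added performance between samples."""
    return [(cost[i + 1] - cost[i]) / (perf[i + 1] - perf[i])
            for i in range(len(perf) - 1)]

performance = [60, 70, 80, 85, 90, 93, 95]         # notional performance index
cost_musd   = [100, 110, 125, 150, 200, 290, 420]  # estimated cost, $M

THRESHOLD = 10.0  # $M per performance point the user will accept

for i, mc in enumerate(marginal_costs(performance, cost_musd)):
    if mc > THRESHOLD:
        print(f"Knee near performance {performance[i]}: the next increment "
              f"costs {mc:.1f} $M per point, exceeding the {THRESHOLD} $M limit")
        break
```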

We need to be able to treat CAIV at the earliest stages by incorporating cost considerations into the computer-aided design and engineering tools; otherwise it is impossible to get cost on the table as a trade mechanism. We are still using cost models that take dollars per pound as the basis of estimating, as opposed to conducting detailed cost tradeoffs. Just as computer-numerically-controlled (CNC) machines are able to manufacture items directly from the digital design data, cost models should be able to use the digital data directly to project manufacturing costs. Even more sophisticated cost models would be able to address costs across the life cycle of the system: operations costs, upgrade costs, maintenance costs, and disposal costs, for example.

Today's state of the art in modeling and simulation does not provide the ability to fully and accurately predict failures to the point where we can replace physical reliability testing with reliability testing using modeling and simulation. Durability algorithms are in short supply because of the complexity of durability models. The ability to predict failure varies from one product area to another. Physics of failure is farthest along in electronics.5 For example, transistor manufacturers are able to predict with high confidence when transistors will fail; in chips, we know exactly how they will function and behave. The same does not hold for the mechanical world: currently, we do not have good models for predicting failure and determining reliability of mechanical systems or their components. The ability to predict the failure of a new system requires an understanding of the internal physics of the materials within the system. The ability to model this level of fidelity within components and within a system is in its infancy, but some fields of study are making strides by applying recent increases in computing power. Meteorological experts are beginning to model weather systems at increasing levels of fidelity to achieve a better understanding of how thunderstorms, hurricanes, and tornadoes behave. Pharmaceutical companies are beginning to model new drugs at the atomic level and then immerse themselves into the model to help them create the drug. We can expect similar advances in other fields as the technology matures. We need to develop physics of failure and cost models to help predict reliability through simulation. If, in simulation, we are able to test a system to the point of failure, then we can improve overall system reliability by improving the area that failed. We can also use these failure data to predict the required maintenance, manpower, spares, and life cycle cost for a system given the number of operating hours.
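
As a toy illustration of the goal described above, and not something from the report: if a physics-of-failure model could supply a credible time-to-failure distribution for a component, simulated missions could be sampled to estimate failure probability and spares demand. The Weibull parameters and mission length here are invented.

```python
# Toy illustration of simulation-based reliability estimation. The Weibull
# parameters and mission hours are invented; in practice they would come
# from a validated physics-of-failure model.
import random

SCALE_H, SHAPE = 1500.0, 2.0   # hypothetical Weibull time-to-failure (hours)
MISSION_HOURS = 500.0
TRIALS = 100_000

random.seed(1)
failures = sum(
    1 for _ in range(TRIALS)
    if random.weibullvariate(SCALE_H, SHAPE) < MISSION_HOURS
)

p_fail = failures / TRIALS
print(f"Estimated probability of failure within {MISSION_HOURS:.0f} h: {p_fail:.3f}")
print(f"Expected spares demand per 100 fielded systems: {100 * p_fail:.1f}")
```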


The limitations on our ability to build high fidelity physics of failure models result from such things as the limited understanding of how materials fail, variations in the manufacturing processes of materials, and variations in the properties of materials. One of the most significant limitations is the cost and time required to capture all the stress loads that a system or component will experience during operations. Each test conducted on a system captures data for only one set of operational conditions. Program schedule and cost constraints limit the number of samples that can be taken, so assumptions are necessary to extrapolate from this limited sample size to conclusions about all conditions that may be experienced. If we could accurately model the physics of the system, we could conduct many more tests in the synthetic environment than are cost-effective with today's approach.

A program manager must demonstrate that the system he is developing will meet the reliability required by the user. Durability testing is the process that provides the evidence of whether a system is reliable enough to proceed toward production. Durability testing is very demanding of program resources, requiring the PM to schedule and pay for hundreds, if not thousands, of hours of system operations and testing. Confidence in the reliability of the systems that we field requires an acceptable level of understanding about why and how often a system will fail, and gaining that understanding can be a significant cost driver of a program.6 The challenge is to reduce the number of reliability testing hours required. If we can reduce the number of reliability testing hours required of an end item, we can reduce both the schedule and the budget necessary for a program during this phase. As an example, the M1A2 tank program dedicated four physical prototype vehicles just to support durability testing.7
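
A hedged aside, not from the report, on why demonstrated reliability consumes so many test hours: the classical one-sided lower confidence bound on MTBF after a time-terminated test grows only linearly with accumulated hours. The figures below are invented.

```python
# Illustrative only: classical lower confidence bound on MTBF from a
# time-terminated reliability test, showing how demonstrated reliability
# is bought with test hours. All numbers are invented.
from scipy.stats import chi2

def mtbf_lower_bound(test_hours: float, failures: int, confidence: float) -> float:
    """One-sided lower bound on MTBF for a time-terminated test."""
    return 2.0 * test_hours / chi2.ppf(confidence, 2 * failures + 2)

for hours in (1_000, 5_000, 20_000):
    lb = mtbf_lower_bound(hours, failures=2, confidence=0.80)
    print(f"{hours:>6} test hours, 2 failures -> 80% lower-bound MTBF ~ {lb:,.0f} h")
```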

Today, when we develop a new system, it has no known reliability.8 Only through use and time do we come to know its reliability, and the only true way to gain this knowledge is to run the physical system. When we do, we gain confidence in the reliability of the one system we tested, and we then infer that reliability for any system built to the same specifications. The two critical factors, therefore, are first, that the prototype we test be built in accordance with the system's specifications, and second, that we be able to accurately repeat builds of subsequent systems. The weakness of this approach is that the physical prototype is likely to have flaws and inconsistencies compared with the final production version of the system, because of human error in building the prototype and because of configuration changes as we gain more knowledge about the system and as technology continues to develop.

Modeling and simulation actually enables us to improve confidence in our ability to accurately build and test a prototype that is truly representative of a production item. Using M&S, we are able to keep the design open to changes longer, because we have greater confidence that the final design will meet the required need and that the production run will be successful. This means less time needs to be allocated for last-minute changes, which gives us more time to learn about the system as it matures, to incorporate lessons learned back into the system, and to integrate the latest technology if desired. Changes made using M&S early in the design can be much less expensive than changes made to a physical prototype, which would not be possible until much later in the program.


A final technical limitation is human behavior modeling. Conceptually, SBA could include modeling and simulating the acquisition process itself to help produce better acquisition strategies. Human behavior is difficult to model, however, because our assumptions cannot be correct for all individuals. Factors that influence how an individual will react in a given situation include experience, education, values, self-confidence, and fatigue. The variability in reactions is so great that accurately predicting behavior is not possible, and any process, such as acquisition, that depends heavily on human interfaces and decisions is difficult to predict and model. Probably the best we will be able to achieve in this area will be through the use of wargaming techniques. Wargaming allows for shortfalls in models and data, and provides a way for program managers to simulate interactions with entities that cannot be directly controlled. The insights gained by understanding the implications of decisions should enable managers to develop more effective strategies.9

Process Challenges

Process challenges result from the way DoD operates and is organized, and include the change in the test process, the security of classified and proprietary data, and metrics.

A major concern of program managers is the cost and time required to test a system. A significant portion of a program's schedule and budget is usually allocated to testing, and an SBA approach has the potential to reduce both. As mentioned earlier, the role of testing is no longer to validate point designs; rather, it is to validate models so that there is high confidence in the simulations that result from those models.

Test personnel should be members of the IPT during the earliest stages, and should work with the developer to consider and resolve test issues. A good example of addressing test issues early is the Follow On To TOW (Tube-Launched, Optically Tracked, Wire Guided) hand-held anti-tank missile, or FOTT. In the FOTT program, the specifications for the Virtual Proving Ground were included in the Request for Proposal, and test personnel were co-located with the program developers.10

These test personnel grew up with the models and matured them with tests as required to gain confidence in their use. They also helped develop the acceptance test procedures (ATP), so that many iterations and excursions could occur quickly and inexpensively, with the results checked against the ATP after each iteration. Involving developmental test and operational test personnel early in the program, and using M&S to gain higher confidence in the first missile, allowed the FOTT PM to reduce missile requirements for engineering and manufacturing development (EMD). The Javelin program (a similar fire-and-forget missile) required 190 missiles for EMD, while FOTT required only 40 missiles.11

Within the test community there needs to be a shift in focus from testing hardware to testing simulations of hardware. By testing simulations of future hardware systems, IPT members can resolve many problems before expensive prototypes are built. Test simulations help to identify the problem areas where efforts should be focused to increase the probability that live testing is successful. The knowledge gained from conducting these simulations will lead to better quality testing, as well as more efficient and less extensive testing of the physical hardware. As an example, during the Longbow helicopter Hellfire missile development there were more than 500 simulated firings and only 20 live shots. This resulted in the Longbow Simulation/Test Acceptance Facility (STAF) paying for itself in one year, with several years left in the program.12 In addition, other programs will be able to "reuse" the facility, further amplifying the cost avoidance for DoD.

The purpose of conducting live testing is changing from validation of the physical system to validation of the model. General Motors uses tests to validate model design assumptions, not to find fixes; the goal is to get the design right as early in the life cycle as possible.13 As confidence in the results of models goes up, the number of live tests required will come down. Any questions that simulations cannot answer may require a physical prototype of the subsystem in question to reduce risk to an acceptable level. Information gained from building and testing any physical prototype of the system should be used to update the system's model.

A big challenge is to have an authoritative DoD source for a common synthetic test environment. If we intend to compare competing system designs, we need to ensure we evaluate those systems within a common synthetic environment that is also available to the system developers. This common environment is required to provide a level playing field for evaluating conceptual designs of a system. These relationships are critical in ensuring that the run-time databases, derived from the synthetic environment database, will be correlated so that all "views" of the environment are the same. An important "view" is that of computer-generated forces, which do not "see" the battlefield but must use the data representations to correctly interpret the environmental conditions.14 This means that the computer controlling the actions of military forces in a simulation needs information about the terrain and objects in the environment represented in a way it can interpret. For example, the visual representation of an object that a soldier-in-the-loop needs during a simulation is no good to the computer. The computer needs information about the same terrain and objects, but with the essential information, such as position, height, and weight, represented numerically. Unfortunately, synthetic environment data interchange today is cumbersome, expensive, and often unreliable.
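
As a schematic illustration of this dual representation, consider the sketch below; the field names are invented and do not reflect SEDRIS or any DoD data standard.

```python
# Schematic only: one environment object carrying both a human-viewable
# representation and the numeric attributes a computer-generated force
# needs. Field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class EnvironmentObject:
    name: str
    position_m: tuple[float, float, float]  # x, y, elevation in meters
    height_m: float
    mass_kg: float
    passable_by_vehicle: bool
    visual_model: str   # path to the 3-D model a human participant sees

    def blocks_line_of_sight(self, observer_height_m: float) -> bool:
        # A computer-generated force reasons from numbers, not pixels:
        # here, a deliberately trivial rule based on object height.
        return self.height_m > observer_height_m

bridge = EnvironmentObject("bridge", (1200.0, 340.0, 12.0),
                           height_m=8.0, mass_kg=2.0e6,
                           passable_by_vehicle=True,
                           visual_model="models/bridge.flt")
print(bridge.blocks_line_of_sight(observer_height_m=2.0))  # True
```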

A second major process concern among government and industry is ensuring the security of classified and proprietary data. If we expect agencies and companies to provide data through repositories, we must be able to provide a high level of confidence in the security of those data. We want data to be readily available and shared with those who need the information; at the same time, we must keep data from those who do not have a need to know. Today's technology allows us to share data easily, and there are many ways that data can get into the wrong hands. As an example, an automobile manufacturer inadvertently left data about a "next generation" vehicle on a data tape that was reused and given to one of its suppliers. The supplier inadvertently passed the data on to a competitor. This breach of corporate knowledge allowed the competition to make up a year and a half in development time.15

Program offices are also very careful about who uses the models and simulations of their program. They want to know exactly how and for what purpose the data will be used. This is understandable, because a competing program could use the detailed information provided by another program's models and simulations to show how its own system is superior or where the other program is deficient. The information contained in a model could also be a valuable source for determining vulnerabilities of the new system.

A final process challenge will be developing metrics, which are needed for an overall assessment of progress in the execution of a program. Metrics are measures of success that serve as a powerful management tool for evaluating effectiveness in accomplishing project objectives and in achieving and improving customer satisfaction.16 They allow program managers to manage on the basis of facts and data. Metrics focused solely on individual process results do not give a picture of overall success in implementation; metrics, therefore, should also be structured to identify the overall effects of SBA implementation. Measures that could be used to gauge success include schedule, responsiveness and timeliness, and communications. The measurement of variances between planned and actual schedules, consumption of resources, productivity, customer satisfaction, cycle time, and completion of tasks could be particularly useful in determining whether a program is capturing the expected benefits and cost avoidance. In a collaborative, concurrent, and distributed engineering environment, communications takes on an increasingly important role. Organizations must be able to move large amounts of data easily and quickly. They must track how well information is moving through the organization, such as whether the Product Information Manager is facilitating concurrent engineering, or whether the engineers are experiencing unacceptable delay times.
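
A hypothetical sketch of the planned-versus-actual variance measurement described above; the task names, durations, and threshold are invented.

```python
# Hypothetical sketch of a planned-vs-actual schedule variance metric of
# the kind described above. Task names and figures are invented.
tasks = [
    # (task, planned_days, actual_days)
    ("requirements tradeoff study", 30, 34),
    ("virtual prototype build",     60, 52),
    ("simulated durability runs",   45, 49),
]

for name, planned, actual in tasks:
    variance_pct = 100.0 * (actual - planned) / planned
    flag = "LATE" if variance_pct > 10 else "ok"
    print(f"{name:<30} planned {planned:>3} d, actual {actual:>3} d, "
          f"variance {variance_pct:+5.1f}% [{flag}]")
```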

There are difficult cultural, technical, and procedural challenges to implementing the ideal SBA process of the future. But we don't have to wait until all the challenges are solved before we start implementing those parts where we can extract value. The process is beginning to change, and those who adapt will find the biggest rewards.


ENDNOTES

1. Stefan Thomke, "Simulation, Learning and R&D Performance: Evidence from Automotive Development," Research Policy 27 (1) (May 1998).

2. See "DoD VV&A Recommended Practices Guide," Defense Modeling and Simulation Office, November 1996, for detailed information on VV&A.

3. James Shiflett, Vice President, SAIC, interview by authors, 20 March 1998.

4. LTC Gene Glasser, U.S. Army, Director, Army Software Reuse Center, interview by authors, 7 May 1998.

5. Jerry Chapin, U.S. Army Tank and Automotive Command, interview by authors, 25 February 1998.

6. Art Adlam, U.S. Army Tank and Automotive Command, interview by authors, 25 February 1998.

7. Robert Lentz, General Dynamics Land Systems, interview by authors, 24 February 1998.

8. David Chang and Prakash Shrivastaka, General Motors, interview by authors, 31 March 1998.

9. For an innovative approach to using wargaming in acquisition, see on-line http://www.npt.nuwc.navy.mil/divnewport/Business-War-Games/main.html.

10. John Haug, Director, Technical Mission, TECOM, interview by authors, 11 February 1998.

11. John Bier, Technical Management, Close Combat Anti-Armor Weapon System program office, phone interview, 8 September 1998.

12. Brigadier General Robert E. Armbruster, Deputy for Systems Acquisition, U.S. Army Aviation and Missile Command, remarks during presentation at Army SBA Conference, Orlando, FL, 21-22 January 1998.

13. David Chang and Prakash Shrivastaka, General Motors, interview by authors, 31 March 1998.

14. http://www.sedris.org/doc_trpl.htm

15. Unnamed source due to confidentiality.

16. Many of the ideas expressed here were inspired by the "DoD Guide to Integrated Product and Process Development," Office of the Under Secretary of Defense (Acquisition and Technology), 5 February 1996.



CHAPTER 8

CONCLUSIONS AND RECOMMENDATIONS


This chapter provides our conclusions regarding SBA, along with some recommendations on where to proceed. Our conclusions are that SBA is a smarter way of doing business, but that there are significant challenges ahead to realizing its full potential. Our recommendations are:

• educate the workforce on this new concept;

• encourage a single SBA process for DoD;

• identify SBA efforts within program documents;

• continue developing the tools and infrastructure to make it happen;

• fund SBA as well as M&S development; and

• conduct a follow-on study that results in an SBA Recommended Practices Guide.

Conclusions

SBA is a smarter way of doing business. It creates an environment where complex issues can be integrated and evaluated to support decision makers throughout the life cycle of a program. Risks can be identified, minimized, mitigated, or even eliminated to increase the probability of program success. Products developed when implementing SBA can serve as excellent communication and marketing tools. Being able to visualize a system as it evolves from a low fidelity concept into a high fidelity system is very useful in resolving conflicts throughout development. Simulation and visualization are powerful tools for IPTs and decision makers, allowing them to see the results of each decision and evaluate the ramifications before having to make a costly final decision.

Using SBA, programs are making decisions based not only on more information, but also on better information, which is leading to a better product. These programs are being "pushed" into using SBA for a variety of reasons, but once there, they are finding many benefits. Both industry and government programs are turning to SBA as a means of remaining competitive and making the most of limited funds. Companies that have implemented SBA on one program have started to apply SBA to other programs and are looking for ways to leverage efforts across programs.


Many programs are discovering that using M&S is the only way to accomplish certain tasks, because of cost, safety, time, or scale considerations. As decision makers and users begin to see programs implementing SBA, there will be a "pull," or an expectation that SBA will be used in other programs. As warfighters are shown the cost impacts of their requirements, they are beginning to see the value of delaying the finalization of those requirements. Programs using SBA will be perceived as better, and in fact will be better, than programs not using SBA. SBA will enable programs to quickly determine what the issues are and to focus on them. Not only will the program benefit from implementing an SBA approach; outside organizations will benefit also.

The roles of industry and DoD in implementing SBA are unclear, however. There is uncertainty about who should make the first move toward institutionalizing SBA. In the spirit of the Single Process Initiative to follow best commercial practices, DoD is reluctant to dictate to industry how to implement SBA, believing that it will occur naturally in the competitive commercial world. Industry's attitude, on the other hand, is that if DoD wants a standard system, it will have to put some money behind development.1 Industry is particularly concerned that each service will come up with different standards and ways to implement SBA.

As discussed in the last chapter, there are technical, process, and cultural challenges to implementing SBA. The technical challenges are impeding the full potential of SBA, though the underlying technologies continue to improve at various paces across the domains. SBA will continue to pick up speed and move toward a critical mass as these barriers continue to fall. The process challenges require further analysis and study.

SBA provides a significantly different approach to identifying and meeting military requirements from the current process. The implications of a collaborative, concurrent development approach, versus the present sequential, increasing buy-in approach, are not well understood, and may require us to re-evaluate our current process. Activities within the process may need to occur more often or at a different time in the development of a system. The cultural challenges require education and experience: education to spread the word on this new perspective on using M&S, and experience that comes from getting started and learning with a new process. Many programs are doing smart things, and there is a great need for educating all SBA stakeholders about what SBA is and how it differs from merely using M&S in a program. Acceptance of SBA will continue to grow as more people learn its concept, capabilities, and benefits. Gaining experience with SBA is one of the best ways of overcoming cultural barriers.

Recommendations

Our first recommendation is to educate the workforce on the new SBA concept by incorporating into the curriculum at the Defense Systems Management College classes that define what SBA is, what tools are available, what SBA efforts are ongoing, and what the vision of SBA is. These classes should also include lessons learned from programs that successfully implemented SBA as well as from programs that failed in implementing SBA. As noted earlier, there is only a single reference to SBA in the Defense Acquisition Deskbook. Some of the areas that are now devoted to M&S should be reviewed with the goal of incorporating the broader viewpoint of using an SBA approach in acquisition, rather than just the use of M&S. Once this happens, the field should be encouraged to provide input on best practices, wisdom, and advice. To get the dialogue started, we recommend that this book be included in the Deskbook as a reference source for SBA.

Our second recommendation is to encourage all stakeholders to define and develop a single DoD SBA process. The Services need to work together to develop an agreed-upon single process for SBA. A single process will reduce the burden on contractors when dealing with multiple Services; the result should be a reduction in the overall cost to DoD across programs. Certainly we do not want a situation that replicates the one in some industries that dictate specific software packages for their suppliers.

Our third recommendation is to identify SBA efforts within existing program documents, such as the Simulation Support Plan, the Test and Evaluation Master Plan, and the Single Acquisition Management Plan. The intent is to move beyond merely specifying how M&S will be used in a program, to outlining how M&S will be applied to change the process to an SBA approach.

Our fourth recommendation is to continue developing the tools and infrastructure that enable programs to conduct systems of systems analyses. The DoD should continue to look for investments in common SBA tools that will support analysis of programs within a system of systems synthetic environment. DoD needs to find the resources to develop common portions of the SBA infrastructure; this issue is too big and too important to leave to a piecemeal approach by individual programs, which cannot afford to fund large-scale M&S efforts that benefit all programs. Programs do only what is required to successfully field a system. A program may develop a capability from which all may benefit, but this is secondary and not the focus of the program. Common tools developed by a program or by DoD need to be available to other programs.

Our fifth recommendation is to fund SBA development in addition to M&S development activities. Efforts need to focus on achieving the holistic view of SBA and not on individual and unrelated M&S activities. The bigger picture of SBA must be emphasized, and efforts coordinated to optimize their contribution to achieving SBA. When screening priorities for funding, use an SBA filter as well as an M&S filter.

Our sixth and final recommendation is to conduct a follow-on study to analyze the impacts of SBA on cost, schedule, and performance, and from that to develop an SBA Recommended Practices Guide. While we concentrated our efforts on the success stories and programs that touted prowess in implementing SBA, there are probably failures and lessons learned the hard way. Some of these may be the result of technology limitations, while others are errors in implementation, but they are all valuable lessons. This effort should focus on the hard evidence to show why SBA is indeed a better, faster, and cheaper way of doing business, and thereby ferret out the key criteria programs should use in determining how to implement SBA in their programs.


ENDNOTES

1. Bill Gregory, "Simulation's Latest Promise," Armed Forces Journal International, June 1998, Volume 135, Number 11, p. 34.


APPENDIX A
ACRONYMS AND GLOSSARY

ACRONYMS

Acronym Term

AAAV Advanced Amphibious Assault Vehicle

ACAT Acquisition Category

ACTD Advanced Concept Technology Demonstration

ADS Authoritative Data Sources

AI Artificial Intelligence

APMC Advanced Program Managers Course

C4ISR Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance

CAD Computer Aided Design

CAE #1 Component Acquisition Executive

CAE #2 Computer Aided Engineering

CAIV Cost as an Independent Variable

CAM Computer Aided Manufacturing

CASTFOREM Combined Arms and Support Task Force Evaluation Model

CEP Concept Evaluation Program

CFD Computational Fluid Dynamics

C4I Command, Control, Communications, Computers, and Intelligence

CMMS Conceptual Model of the Mission Space

CNC Computer-Numerically-Controlled

CODE Common Operating Digital Environment

CRD Capstone Requirements Document

CTF Common Technical Framework

DIF Data Interface Format

DMSO Defense Modeling and Simulation Office

DoD Department of Defense

DRA Decision Risk Analysis

DS Data Standards

DSAC Defense Systems Affordability Council


DSMC Defense Systems Management College

DT&E Developmental Test and Evaluation

EMD Engineering and Manufacturing Development

EXCIMS Executive Council for Modeling and Simulation

FMTV Family of Medium Tactical Vehicles

FOTT Follow-On To TOW

HIMARS High Mobility Artillery Rocket System

HITL Human-in-the-Loop

HLA High Level Architecture

HWIL Hardware in the Loop

ICD Interface Control Document

IPPD Integrated Product and Process Development

IPT Integrated Product Team

JIMM Joint Interim Mission Model

JMASS Joint Modeling and Simulation System

JROC Joint Requirements Oversight Council

JSF Joint Strike Fighter

JSTARS Joint Surveillance and Target Attack Radar System

LCC Life Cycle Cost

LCM Life Cycle Management

LFT&E Live Fire Test and Evaluation

LRIP Low Rate Initial Production

MAIS Major Automated Information System

MDAP Major Defense Acquisition Program

MNS Mission Need Statement

MOE Measures of Effectiveness

MOO Measures of Outcome

MOP Measures of Performance

M&S Modeling and Simulation

MSIP Modeling and Simulation Investment Plan

MSOSA Modeling and Simulation Operational Support Activity

MSRR Modeling & Simulation Resource Repository

MTMC Military Traffic Management Command

NDIA National Defense Industrial Association


O&S Operations and Support

ORD Operational Requirements Document

OSD Office of the Secretary of Defense

OT&E Operational Test and Evaluation

PIM Product Information Management

PM Program Manager

PPBS Planning, Programming, and Budgeting System

QFD Quality Function Deployment

ROM Rough Order of Magnitude

SBA Simulation Based Acquisition

SEDRIS Synthetic Environment Data Representation and Interchange Specification

SES Simulate, Emulate, Stimulate

SIPRNET Secret Internet Protocol Routing Network

SISO Simulation Interoperability Standards Organization

SMART Simulation and Modeling for Acquisition, Requirements and Training

STAF Simulation Test Acceptance Facility

STEP #1 Simulation, Test, and Evaluation Process

STEP #2 Standard for the Exchange of Product Data

SWIL Software in the Loop

TCO Total Cost of Ownership

TOC Total Ownership Cost

TOW Tube-Launched, Optically-Tracked, Wire-Guided

VERT Venture Evaluation Review Technique

VV&A Verification, Validation, and Accreditation


GLOSSARY

Accreditation
The official certification that a model or simulation is acceptable for use for a specific purpose. (reference DoD M&S Glossary)

Advanced Concept Technology Demonstration
Technology demonstrations that are tightly focused on specific military concepts and that provide the incorporation of technology that is still at an informal stage into a warfighting system. The ACTDs have three objectives: a) to have the user gain an understanding of and to evaluate the military utility of concepts before committing to acquisition; b) to develop corresponding concepts of operation and doctrine that make best use of the new capability; and c) to provide the residual operational capability to the forces. ACTDs are of militarily significant scope and of a size sufficient to establish utility. (reference DoD M&S Glossary)

Aggregation
The ability to group entities while preserving the effects of entity behavior and interaction while grouped. (reference DoD M&S Glossary)

Algorithm
A prescribed set of well defined, unambiguous rules or processes for the solution of a problem in a finite number of steps. (reference DoD M&S Glossary)

Artificial Intelligence
The effort to automate those human skills that illustrate our intelligence, e.g., understanding visual images, understanding speech and written text, problem solving, and medical diagnosis. (reference DoD M&S Glossary)

Battlespace
Refers both to the physical environment in which the simulated warfare will take place and the forces that will conduct the simulated warfare. All elements that support the front line forces (e.g., logistics, intelligence) are included in this definition of battlespace. (reference DoD M&S Glossary)

Benchmark
The activity of comparing the results of a model or simulation with an accepted representation of the process being modeled. (reference DoD M&S Glossary)

'bend in the knee'
Refers to the point where incremental increases in performance are obtained at ever increasing costs, or the point of diminishing returns.

Common-Use M&S
M&S applications, services, or materials provided by a DoD Component to two or more DoD Components. (reference DoD M&S Glossary)

Component
(See "DoD Component")

Computational Fluid Dynamics
The accurate numerical solution of the equations describing fluid and gas motion and the related use of digital computers in fluid dynamics research. (reference DoD Mission Success from High Performance Computing, DoD HPCMO Report, Office of the Secretary of Defense, DDRE, March 1995)

Computer Simulation
A dynamic representation of a model, often involving some combination of executing code, control/display interface hardware, and interfaces to real-world equipment. (reference DoD M&S Glossary)

Conceptual Model of the Mission Space
First abstractions of the real world that serve as a frame of reference for simulation development by capturing the basic information about important entities involved in any mission and their key actions and interactions. They are simulation-neutral views of those entities, actions, and interactions occurring in the real world. (reference DoD M&S Glossary)

Constructive Model or Simulation
Models and simulations that involve simulated people operating simulated systems. Real people stimulate (make inputs to) such simulations, but are not involved in determining the outcomes. Constructive simulations are often referred to as war games. (Note: Also see additional information under "Live, Virtual, and Constructive Simulation.") (reference DoD M&S Glossary)

Cost as an Independent Variable
Methodologies used to acquire and operate affordable DoD systems by setting aggressive, achievable lifecycle cost objectives, and managing achievement of these objectives by trading off performance and schedule, as necessary. Cost objectives balance mission needs with projected out-year resources, taking into account anticipated process improvements in both DoD and industry. CAIV has brought attention to the government's responsibilities for setting and adjusting lifecycle cost objectives and for evaluating requirements in terms of overall cost consequences. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Decision Risk Analysis
A methodology that quantifies and ties together cost, schedule, performance, producibility, risk, and quality to permit tradeoffs.

DoD Component
One of the four services within the Department of Defense, i.e., Army, Navy, Air Force, or Marine Corps.

Domain
The physical or abstract space in which the entities and processes operate. The domain can be land, sea, air, space, undersea, a combination of any of the above, or an abstract domain, such as an n-dimensional mathematics space, or economic or psychological domains. (reference DoD M&S Glossary)

Emulate
To represent a system by a model that accepts the same inputs and produces the same outputs as the system represented. For example, to emulate an 8-bit computer with a 32-bit computer. (reference DoD M&S Glossary)

Emulation
The imitation of a computer system, performed by a combination of hardware and software, that allows programs to run on systems that would normally be incompatible. (reference http://www.wcom.com/cgi-bin/dictQuery.cgi?key=emulation)

Entity
A distinguishable person, place, unit, thing, event, or concept about which information is kept. (reference DoD M&S Glossary)

Enterprise
An arbitrarily-defined functional and administrative entity that exists to perform a specific, integrated set of missions and achieve associated goals and objectives, encompassing all of the primary functions necessary to perform those missions. An intranet, for example, is a good example of an enterprise computing system. (reference http://www.pcwebopaedia.com/TERM/e/enterprise.html)

Environment
The texture or detail of the natural domain, that is, terrain relief, weather, day, night, terrain cultural features (such as cities or farmland), sea states, etc.; and the external objects, conditions, and processes that influence the behavior of a system (such as terrain relief, weather, day/night, terrain cultural features, etc.). (reference DoD M&S Glossary)

Executive Council for Modeling and Simulation
An organization established by the USD(A&T) and responsible for providing advice and assistance on DoD M&S issues. Membership is determined by the USD(A&T) and is at the Senior Executive Service, flag, and general officer level. (reference DoD M&S Glossary)

Federate
A member of a High Level Architecture Federation. All applications participating in a Federation are called Federates. This may include federation managers, data collectors, real world ("live") systems (e.g., C4I systems, instrumented ranges, sensors), simulations, passive viewers, and other utilities. (reference DoD M&S Glossary)

Federation
A named set of interacting federates, a common federation object model, and supporting Runtime Infrastructure, that are used as a whole to achieve some specific objective. (reference DoD M&S Glossary)

Fidelity
The accuracy of the representation when compared to the real world. (reference DoD M&S Glossary)

Hierarchy
A ranking or ordering of abstractions. (reference DoD M&S Glossary)

High Level Architecture
Major functional elements, interfaces, and design rules, pertaining as feasible to all DoD simulation applications, and providing a common framework within which specific system architectures can be defined. (reference DoD M&S Glossary)

Human Factors
The discipline or science of studying man-machine relationships and interactions. The term covers all biomedical and psychological considerations; it includes, but is not limited to, principles and applications in the areas of human engineering, personnel selection, training, life support, job performance aids, and human performance evaluation. (reference DoD M&S Glossary)

Human-in-the-Loop
A model that requires human interaction. (reference DoD M&S Glossary)

Infrastructure
An underlying base or foundation; the basic facilities, equipment, and installations needed for the functioning of a system. (reference DoD M&S Glossary)

Integrated Product and Process Development
An approach to systems acquisition that brings together all of the functional disciplines required to develop, design, test, produce, and field a system. This is essentially the same as Concurrent Engineering. (reference DoD M&S Glossary)

Integrated Product Team
Integrated Product Teams are a means to achieve concurrent engineering or Integrated Product and Process Development. They are multi-disciplinary teams consisting of representatives from all disciplines involved in the system acquisition process, from requirements development through disposal. Having the participation of all the appropriate disciplines, Integrated Product Teams are often empowered to make decisions to achieve successful development of their particular product. (reference DoD M&S Glossary)

Interoperability
See: M&S Interoperability.

Life Cycle Cost
The total cost to the government of acquisition and ownership of that system over its useful life. It includes the cost of development, acquisition, operations and support (to include manpower), and, where applicable, disposal. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Live Simulation
A simulation involving real people operating real systems. (Note: Also see additional information under "Live, Virtual, and Constructive Simulation.") (reference DoD M&S Glossary)

Live, Virtual, and Constructive Simulation
The categorization of simulation into live, virtual, and constructive is problematic, because there is no clear division between these categories. The degree of human participation in the simulation is infinitely variable, as is the degree of equipment realism. This categorization of simulations also suffers by excluding a category for simulated people working real equipment (e.g., smart vehicles). (Note: also see each term separately, e.g., live simulation) (reference DoD M&S Glossary)

M&S Infrastructure
M&S systems and applications, communications, networks, architectures, standards and protocols, and information resource repositories. (reference DoD M&S Glossary)

M&S Interoperability
The ability of a model or simulation to provide services to and accept services from other models and simulations, and to use the services so exchanged to enable them to operate effectively together. (reference DoD M&S Glossary)

Maintainability
The ability of an item to be retained in, or restored to, a specified skill level, using prescribed procedures and resources, at each prescribed level of maintenance and repair. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Mathematical Model
A symbolic model whose properties are expressed in mathematical symbols and relationships; for example, a model of a nation's economy expressed as a set of equations. Contrast with: graphical model; narrative model; software model; tabular model. (reference DoD M&S Glossary)

Measures of Effectiveness (MOE)
A measure of operational success that must be closely related to the objective of the mission or operation being evaluated. A meaningful MOE must be quantifiable and measure to what degree the real objective is achieved. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Measures of Outcome (MOO)
Metrics that define how operational requirements contribute to end results at higher levels, such as campaign or national strategic outcomes. (reference M&S Report, DSMC 1994 Research Fellows Report)

Measures of Performance (MOP)
Measures of the lowest level of performance representing subsets of measures of effectiveness (MOEs). Examples are speed, payload, range, time on station, frequency, or other distinctly quantifiable performance features. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Metadata
Information describing the characteristics of data; data or information about data; descriptive information about an organization's data, data activities, systems, and holdings. (reference DoD M&S Glossary)

Metric
A measure of the extent or degree to which a product possesses and exhibits a certain quality, property, or attribute. (reference DoD M&S Glossary)

Metric(s)
A process or algorithm that may involve statistical sampling, mathematical computations, and rule-based inferencing. Metrics provide the capability to detect and report defects within a sample. (reference DoD M&S Glossary)

Mission Space
The environment of entities, actions, and interactions comprising the set of interrelated processes used by individuals and/or organizations to accomplish assigned tasks. (reference DoD M&S Glossary)

Model
A physical, mathematical, or otherwise logical representation of a system, entity, phenomenon, or process. (reference DoD M&S Glossary)

Modeling
Application of a standard, rigorous, structured methodology to create and validate a physical, mathematical, or otherwise logical representation of a system, entity, phenomenon, or process. (reference DoD M&S Glossary)

Modeling & Simulation Resource Repository
The MSRR is a collection of Modeling and Simulation (M&S) resources. MSRR resources include models, simulations, object models, Conceptual Models of the Mission Space (CMMS), algorithms, instance databases, data sets, data standardization and administration products, documents, tools, and utilities. (reference www.msrr.dmso.mil/)

Modeling and Simulation
The use of models, including emulators, prototypes, simulators, and stimulators, either statically or over time, to develop data as a basis for making managerial or technical decisions. The terms "modeling" and "simulation" are often used interchangeably. (reference DoD M&S Glossary)

Modeling and Simulation Accreditation
See: Accreditation.

Model-Test-Model
An integrated approach to using models and simulations in support of pre-test analysis and planning; conducting the actual test and collecting data; and post-test analysis of test results along with further validation of the models using the test data. (reference DoD M&S Glossary)

Physical Prototype
A model whose physical characteristics resemble the physical characteristics of the system being modeled; for example, a plastic or wooden replica of an airplane. A mock-up. (reference DoD M&S Glossary)

Planning, Programming, and Budgeting System (PPBS)
The primary resource allocation process of DoD. One of three major decision making support systems for defense acquisition (the other two are the Requirements Generation System and the Acquisition Management System). It is a formal, systematic structure for making decisions on policy, strategy, and the development of forces and capabilities to accomplish anticipated missions. PPBS is a cyclic process containing three distinct, but interrelated phases: planning, which produces Defense Planning Guidance (DPG); programming, which produces approved program objectives memoranda (POM) for the military departments and defense agencies; and budgeting, which produces the DoD portion of the President's national budget. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Producibility
The relative ease of manufacturing an item or system. This relative ease is governed by the characteristics and features of a design that enable economical fabrication, assembly, inspection, and testing using available manufacturing techniques. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Reliability
The ability of a system and its parts to perform its mission without failure, degradation, or demand on the support system. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Reliability Model
A model used to estimate, measure, or predict the reliability of a system; for example, a model of a computer system, used to estimate the total down time that will be experienced. (reference DoD M&S Glossary)

Risk
Risk is a measure of the inability to achieve a program's defined performance, schedule, and cost objectives, and has two components: 1) the probability of failing to achieve particular performance, schedule, or cost objectives; and 2) the consequence of failing to achieve those objectives. (reference DSMC Acquisition Strategy Guide, Third Edition)

SBA Pull
Those forces (attitudes, expectations, incentives, directives, policies, etc.), external to a program office, that influence a program office to adopt an SBA approach.

SBA Push
Those forces (attitudes, expectations, incentives, directives, policies, etc.), internal to a program office, that influence a program office to adopt an SBA approach.

Simulation Based Acquisition
An iterative, integrated product and process approach to acquisition, using modeling and simulation, that enables the warfighting, resource allocation, and acquisition communities to fulfill the warfighter's materiel needs, while maintaining Cost As an Independent Variable (CAIV) over the system's entire lifecycle and within the DoD's system of systems.

Solid Modeling
A digital representation of the surface characteristics of an object.

Stimulate
To provide input to a system in order to observe or evaluate the system's response. (reference DoD M&S Glossary)

Stimulation
The use of simulations to provide an external stimulus to a system or subsystem. An example is the use of a simulation representing the radar return from a target to drive (stimulate) the radar of a missile system within a hardware/software-in-the-loop simulation. (reference DoD M&S Glossary)

Supportability
The degree of ease to which system design characteristics and planned logistics for resources, including the logistic support elements, allow for the meeting of system availability and wartime utilization requirements. (reference DSMC Glossary of Defense Acquisition Acronyms and Terms, 8th Edition)

Synthetic Environment
Simulations that represent activities at a high level of realism, from simulations of theaters of war to factories and manufacturing processes. These environments may be created within a single computer or a vast distributed network connected by local and wide area networks and augmented by super-realistic special effects and accurate behavioral models. They allow visualization of and immersion into the environment being simulated. (reference DoD M&S Glossary)

System of Systems
Seamless connectivity of Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) across the sea-land-space interface in a joint warfighting environment, and assimilation…into a coherent tactical picture…to develop a multi-dimensional netted architecture…while providing rapid sensor-to-shooter connectivity. (reference Admiral Jay Johnson, Chief of Naval Operations, during a visit to the Joint Strike Fighter program, 1995). The authors use the term System of Systems in a context broader than C4ISR, to encompass interactions between sub-systems of the same system, multiple system integration, user tradeoff analysis and optimization, and warfighting scenario optimization. One of the first places for the term System of Systems to show up in print was in the U.S. Naval Institute Proceedings May 1995 article "The Emerging System of Systems," by Admiral William A. Owens, US Navy. To the authors' knowledge, there is no commonly accepted definition of this term as yet.

Total Cost of Ownership
See: Total Ownership Cost (TOC).

Total Ownership Cost
The sum of all financial resources necessary to organize, equip, and sustain military forces sufficient to meet national goals in compliance with all laws; all policies applicable to DoD; all standards in effect for readiness, safety, and quality of life; and all other official measures of performance for DoD and its components. (reference 1998 Annual Report to the President and the Congress, Office of the Secretary of Defense)

Validation
The process of determining the degree to which a model or simulation is an accurate representation of the real world from the perspective of the intended uses of the model or simulation. (reference DoD M&S Glossary)

Verification
The process of determining that a model or simulation implementation accurately represents the developer's conceptual description and specification. Verification also evaluates the extent to which the model or simulation has been developed using sound and established software-engineering techniques. (reference DoD M&S Glossary)

Verification, Validation, and Accreditation
See each separate term.

Virtual
The essence or effect of something, not the fact. (reference DoD M&S Glossary)

Virtual Battlespace
The illusion resulting from simulating the actual battlespace. (reference DoD M&S Glossary)

Virtual Manufacturing
A model or simulation of the processes required for making an item.

Virtual Prototype
A model or simulation of a system placed in a synthetic environment, and used to investigate and evaluate requirements, concepts, system design, testing, production, and sustainment of the system throughout its lifecycle. (reference DoD M&S Glossary)

Virtual Reality
The effect created by generating an environment that does not exist in the real world. Usually, virtual reality is a stereoscopic display and computer-generated three-dimensional environment which has the effect of immersing the user in that environment. This is called the immersion effect. The environment is interactive, allowing the participant to look and navigate about the environment, enhancing the immersion effect. (reference DoD M&S Glossary)

Virtual Simulation
A simulation involving real people operating simulated systems. Virtual simulations inject human-in-the-loop in a central role by exercising motor control skills (e.g., flying an airplane), decision skills (e.g., committing fire control resources to action), or communication skills (e.g., as members of a C4I team). (Note: Also see additional information under "Live, Virtual, and Constructive Simulation.") (reference DoD M&S Glossary)

Virtual World
See synthetic environment.

Visualization
The formation of an artificial image that cannot be seen otherwise. Typically, abstract data that would normally appear as text and numbers is graphically displayed as an image. The image can be animated to display time varying data. (reference DoD M&S Glossary)

War Game
A simulation game in which participants seek to achieve a specified military objective given pre-established resources and constraints; for example, a simulation in which participants make battlefield decisions and a computer determines the results of those decisions. Also used synonymously with constructive simulation. (reference DoD M&S Glossary)

Wargaming
The act of conducting a war game.

What-if Analysis
Analysis conducted to determine the impact of changing a variable during the design of a weapon system, as in "What if we change the stiffness of the beam; what will be the impact on the cost, schedule, and performance of the system?"


APPENDIX B
INTERVIEWS AND PERSONAL CONTACTS

* SBA Industry Steering Group Member 1

** Joint SBA Task Force Member 2

ADVANCED AMPHIBIOUS ASSAULT VEHICLE (AAAV) PROGRAM OFFICE
AAAV Technology Center, 991 Annapolis Way, Woodbridge, VA 22191-1215
    Richard Bayard, Assistant Program Manager, [email protected], 703-492-3344
    Mark Routson, General Engineer, [email protected]

AERONAUTICAL SYSTEMS CENTER (ASC)
Distributed Mission Training Office, Wright Patterson Air Force Base, OH
    Dr. Jerry Arnett, [email protected]

AIR FORCE MATERIEL COMMAND
Modeling and Simulation Integration Office, Operating Location, Research Parkway, Orlando, FL 32826
    Lt Col Joseph M. Terry**, AFMC/MSIO-OL Liaison Officer, [email protected], 407-208-5762

AIR FORCE RESEARCH LABORATORY
6001 S. Power Road, Bldg. 558, Mesa, AZ 85206-0904
    Col Lynn Carroll, Chief, Warfighter Training Research Division

ANSER
1215 Jefferson Davis Highway, Arlington, VA 22202
    Dr. Aron Pinker, [email protected], 703-416-3439

APPLIED SYSTEMS INTELLIGENCE INC.
10882 Crabapple Road, Suite 2, Roswell, Georgia 30075
    Norm Geddes, President, [email protected]

ARMY SOFTWARE REUSE CENTER
ASQB-IRC (Stop C-2), 6000 6th St, Suite S122A, Fort Belvoir, VA 22060-5576
    LTC Gene Glasser, Director, ASRC, [email protected], 806-3860/4300

ASSISTANT SECRETARY OF THE ARMY (RESEARCH, DEVELOPMENT AND ACQUISITION), ASA(RDA)
Attn: SARD-DO, 103 Army Pentagon, Washington, D.C. 20310
    Lieutenant General Paul J. Kern, Military Deputy to the ASA(RDA), 703-697-0356
    Ellen Purdy**, Senior Operations Research Analyst, [email protected]
    COL Michael [email protected]
    Dr. Herbert Fallin, Director for Assessment and Evaluation, [email protected]
    Mike Truelove**, Acquisition Analyst, Science Applications International Corporation (SAIC), [email protected]
    Paul Amos, Acquisition Analyst, Science Applications International Corporation (SAIC), [email protected]

ATACMS-BAT PROJECT OFFICE
Army TACMS-BAT, Attn: SFAE-MSL-AB, Redstone Arsenal, AL 35898-5650
    COL John Holly, [email protected], 205-876-1141

BALL AEROSPACE Larry Jobson8381 Old Courthouse Road [email protected], VA 22182 703-917-9125

BARRONCAST, INC.
215 W. Oakwood Rd., P.O. Box 138, Oxford, Michigan 48371
Bruce Barron, Vice President & General Manager, [email protected], 248-628-4300

Lee [email protected]

Merlin Warner, Sales [email protected]

BOEING COMPANY
Seattle, WA 98124
Larry Winslow, Director, Technology for Phantom Works

Michael W. Johnson*, Defense & Space Group, Vice-President Modeling & [email protected]

CACI
1600 Wilson Blvd, Arlington, VA 22209-2515
Dr. Klaus Dannenberg, Senior Vice President, [email protected], 703-558-0255

CARDEROCK NAVAL SURFACE WARFARE CENTER
9500 MacArthur Blvd, West Bethesda, MD 20817-5700
Myles Hurwitz, Head, Computer M&S Department, NSWC/Carderock, [email protected], (301) 227-1927

CENTER FOR NAVAL ANALYSIS
4400 Ford Avenue, Alexandria, VA 22302
Dr. Henry Eskew, 703-824-2254


CHRYSLER CORPORATION
800 Chrysler Drive, Auburn Hills, MI 48326-2757
Arthur Anderson, Manager, Vehicle Engineering & Dimension Control, Large Car Platform

Greg Avesian, Manager, Solutions [email protected]

CLOSE COMBAT ANTI-TANK WEAPON SYSTEM (CCAWS) PROJECT OFFICE
SFAE-MSL-CC-TM, Redstone Arsenal, AL 35898
John Bier, Technical Manager, Follow on to TOW Program, [email protected], 256-842-0494

Allen Zumbach, 256-876-3200

COLEMAN RESEARCH CORPORATION
6820 Moquin Drive, Huntsville, AL 35806
Tim Thornton, [email protected], 205-922-6000

COMANCHE PROGRAM OFFICE
Redstone Arsenal, AL 35898
Dan Hollenbaugh, Program Test Engineer, 256-313-4464

COMPUTER SCIENCES CORPORATION
China Lake, CA
Dennis Laack, Joint Accreditation Support Activity (JASA), [email protected], x130


CRUSADER PROGRAM OFFICE
SFAE-GCSS-CRE, Building 171, Picatinny Arsenal, NJ 07806-5000
Col Bill Sheaves, Program Manager, [email protected], 973-724-4588

Kevin Fahey, Deputy Program [email protected]

Jim Shields, Chief of Systems [email protected]

DEFENSE ADVANCED RESEARCH PROJECTS AGENCY (DARPA)
5201 Leesburg Pike, Suite 503, Falls Church, VA 22041
Gary Jones, Director, Advanced Technology Integration, [email protected], 703-681-1484

DEFENSE INTELLIGENCE AGENCY
Bolling Air Force Base, Washington, D.C. 20340-5100
Richard Bernstein**, Intelligence Officer, [email protected], 703-820-1623


DEFENSE MODELING AND SIMULATION OFFICE
1901 N. Beauregard St, Alexandria, VA 22311
Bruce Bailey, DoD M&S Working Group, [email protected], 703-824-3420

Col Kenneth “Crash” [email protected]

Dr. Judith [email protected]

Jona [email protected]

John Gray, Senior M&S [email protected]

Heikki Joonsar**, [email protected]

DEFENSE SYSTEMS MANAGEMENT COLLEGE
9820 Belvoir Rd, Fort Belvoir, VA 22060
Shane Donahue, [email protected], 703-805-5411

Randy Zittel, Professor, Systems [email protected]

DEPUTY CHIEF OF NAVAL OPERATIONS (LOGISTICS), OPNAV N4
Navy Pentagon 2000, Washington, DC 20350
Dan Fink, Deputy, Acquisition Logistics Integration, [email protected], 703-601-1679


DEPUTY UNDERSECRETARY FOR DEFENSE (INDUSTRIAL AFFAIRS & INSTALLATIONS), INDUSTRIAL CAPABILITIES AND ASSESSMENT
5203 Leesburg Pike, Suite 1403, Falls Church, VA 22041-3466
John D. Binford**, General Engineer, [email protected], 703-681-5472

DRAPER LABORATORY, INC.
The Charles Stark Draper Laboratory, 555 Technology Square, Cambridge, MA 02139-3563
Dr. James Negro, [email protected], 617-258-3860

Moshe [email protected]

ELECTRIC BOAT CORP
75 Eastern Point Road, Groton, CT 06340-4989
Gregory Angelini*, Software Technology Team Leader, Computer Systems Technology

John Porter, CVX Principal [email protected]

Jim [email protected]

ELECTRONIC SYSTEMS COMMAND
5 Eglin St, Hanscom AFB, MA 01731-2100
COL Hoot Gibson, Director, Modeling, Simulation and Training PAD, [email protected]

EVANS AND SUTHERLAND
12443 Research Parkway, Suite 303, Orlando, FL 32826
Paul Hinote, [email protected], 407-658-1802


FOCUS: HOPE
1400 Oakman Blvd, Detroit, MI 48238
Joe Petrosky, General Manager, Center for Advanced Technologies, [email protected]

FORD MOTOR COMPANY
24500 Glendale Ave, Detroit, MI 48239
Gene Coffman, Staff Technical Specialist, [email protected], 313-592-2079

Hwa-Sung Na, Principal Engineering [email protected]

FORD MOTOR COMPANY
2000 Rotunda Drive, Dearborn, MI 48121-2053
Michael Carty, Product Development Center, [email protected], 313-317-9213

Richard Riff, Advanced Engineering Center Manager, C3P Project [email protected]

GENERAL MOTORS
30200 Mound Road, Warren, MI 48090-9010
David Chang, Director, National Automotive Organization, Vehicle Synthesis, Analysis & Simulation Process [email protected]

Prakash [email protected]


HARVARD BUSINESS SCHOOL
Soldiers Field Road, Morgan Hall, Boston, MA 02163
Professor Dorothy Leonard, 617-495-6637

Stefan Thomke, Assistant [email protected]

Ron Fox, Professor [email protected]

HEADQUARTERS DEFENSE INTELLIGENCE AGENCY
Bolling AFB, MD
Richard Bernstein, [email protected], 202-231-8934

HIGH MOBILITY ARTILLERY ROCKET SYSTEM (HIMARS)
1411 Westview Drive, Coralville, IA 52241
Mr. Brock Birdsong, Mechanical Engineer, Missile RDEC, [email protected], 319-339-8755

IDA
1801 North Beauregard Street, Alexandria, VA 22311-1772
Hans Mair, Operational Evaluation Division, [email protected], 703-845-2034

INSTITUTE FOR SIMULATION TECHNOLOGY
3280 Progress Drive, Orlando, FL 32826
Dr. Ronald Hofer, Ph.D., Associate Director, [email protected], 407-658-5576

Ernie Smart, Director, Governmental & Industrial Partnerships, [email protected]


INTEGRATION & TECHNOLOGY BATTLE LAB
CDR, USATRADOC, Attn: ATCD-B, Ft Monroe, VA 23651-5000
Bob Dodd, Military Analyst, [email protected], 757-727-5715

J.T. PARKER ASSOCIATES
3061 Mimon Road, Annapolis, MD 21403-1317
Ted Parker*, President; ADM (Ret), [email protected], 301-261-1216

JOHN J. MCMULLEN ASSOCIATES, INC
2341 Jefferson Davis Hwy, Arlington, VA 22202
Rob Beadling*, Program Manager, CVX Advanced Simulation, [email protected]

JOHNS HOPKINS UNIVERSITY
801 University Boulevard, SE, Albuquerque, NM 87106
Dennis Lester, [email protected], 505-853-1975

JOHNS HOPKINS UNIVERSITY
Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723-6099
Dr. Mark Moulding, Physicist, [email protected], 240-228-4979

Glenn [email protected], 240-228-6855

Jim Coolahan**, SBA Task Force Technical [email protected]

Bob Lutz**, SBA Task Force Technical [email protected]


JOINT SIMULATION SYSTEM (JSIMS)
Orlando, FL
Dave Walker, Analysis and Integration Manager, [email protected], ext. 738

JOINT STRIKE FIGHTER PROGRAM OFFICE
1745 Jeff Davis Hwy, Suite 307, Arlington, VA 22202
COL Phil Faye, Director, Requirements (RQ), [email protected], 703-602-7390 x6649

KROUSE ASSOCIATES
7310 Holly Park Dr., Mentor, OH 44060
John Krouse, [email protected], 440-354-5334

LOCKHEED MARTIN
Washington Operations, 1725 Jefferson Davis Highway, Arlington, VA 22202-4127
A.J. (Beau) Beauregard, P.E.*, Director, Advanced Programs, Aeronautics Sector, [email protected], 703-413-5711

LOCKHEED MARTIN
PO Box 748, Fort Worth, TX 76101
Linda Poole, Program Manager, Virtual Product Development Initiative (VPDI), 817-763-2096

LOCKHEED MARTIN AERONAUTICAL SYSTEMS
South Cobb Drive, Marietta, GA 30063
Margaret Herald*, [email protected], 770-494-9833

LORAL/LOCKHEED MARTIN FEDERAL SYSTEMS
546 Lob Lolly Dr, NC
LTG (Ret) Dan Schroeder, Consultant, [email protected], 910-245-3136

MICROSOFT
One Microsoft Corporation, Redmond, WA 98052-6399
Mark Walker, [email protected], 425-703-3181


MITRE
11493 Sunset Hills Rd, Reston, VA 20190
Marvin Hammond, [email protected], 703-883-5867

Stuart [email protected]

MULTIGEN, INC.
6701 Democracy Blvd, Bethesda, MD 20817
Bill MacKrell, N.E. Regional Sales Representative, [email protected], 301-571-2466

MULTIGEN, INC.
550 S. Winchester Blvd, San Jose, CA 95128
Ray Homan, Exec. V. President and Gen. Manager, [email protected], 408-261-4100

NAVAL COASTAL SYSTEMS STATION
Panama City, FL
Roger Leete, [email protected], 850-234-4265

NAVAL RESEARCH LABORATORY
4555 Overlook Avenue, SW, Washington, D.C. 20375
Bill Sandberg, Deputy Director, Laboratory for Computational Physics and Fluid Dynamics, [email protected]

NAVAL SEA SYSTEMS COMMAND (NAVSEA)
2541 Jefferson Davis Hwy, Alexandria, VA 22242-5610
Mike Rice, O3KQ, [email protected], 703-602-0887, ext. 327

Stuart Marcus**, [email protected], ext. 107


NAVAL SURFACE WARFARE CENTER
Port Hueneme Division, Building 1380, 4363 Missile Way, Port Hueneme, CA 93043-4307
Joe Stelling**, SBA Task Force Leader, 805-228-8719

NAVAL UNDERSEA WARFARE CENTER
1176 Howell St, Newport, RI 02841
Jeff Feaster, Engineering, T&E Dept, [email protected], 401-841-6892 x22314

Debra Jones, Electronics Engineer, Capital Purchase Program Manager, 401-841-2012 x23351

Tom [email protected]

NAVY M&S MANAGEMENT OFFICE
Crystal City Presidential Towers, Arlington, VA 22202
CAPT Jay Kistler, [email protected], 703-601-1482

NAVY MODELING AND SIMULATION MANAGEMENT OFFICE
N6M1
George Phillips, Navy M&S Point of Contact, [email protected], 703-685-6995

NEWPORT NEWS SHIPBUILDING
4101 Washington Ave, Newport News, VA 23607-2770
Wayne Evans, Manager, Lifecycle Engineering, [email protected], 757-380-4876

Dan Selfridge, Lifecycle [email protected]


NORTHROP GRUMMAN
8900 East Washington Blvd, Pico Rivera, CA 90660
Rahim Munshi, Systems Engineer, [email protected], 562-948-8824

NSSC, SC-21 PROGRAM
2531 Jefferson Davis Hwy, Arlington, VA 22242
Janet Jaensch, Director, SC-21 Modeling & Simulation, [email protected], 703-602-6453 x105

Thien Ngo, Asst. Director, Modeling and [email protected], x122

OCI, INCORPORATED
1235 Jefferson Davis Highway, Arlington, VA 22202
Donald Hirsch, [email protected], 703-416-0002

OPTIMETRICS, INC. (OMI)
1 Newport Drive, Forest Hill, MD 21050
Frank Wysoki*, [email protected], 703-791-2286

ORIGINAL SIM
5524 St. Patrick, Montreal, Canada H4E 1A8
Christian Rochefort, Regional Sales Manager, [email protected], 514-766-8868

PLYNETICS EXPRESS
1067 Centre Road, Auburn Hills, MI 48326
Rick Hannan, Project Manager, [email protected], 248-373-4400

Steven Willis, Program [email protected]


PONTIAC-GENERAL MOTORS CORP.
100 Renaissance Center, Detroit, MI 48243-7301
Ronald Strayhorn, Vice President, Sales and Service, [email protected], 313-667-4132

PROGRAM EXECUTIVE OFFICER THEATER AIR DEFENSE (TAD)
PEO TAD, NAVSEA, Crystal City, Arlington, VA
Lorraine Shea, Director M&S, PEO TAD, NAVSEA, [email protected]

RAYTHEON ELECTRONIC SYSTEMS
1001 Boston Post Rd, Marlboro, MA 01752
William Stewich, Mgr. Business Dev. C4I Programs, [email protected], 508-490-3228

RAYTHEON SYSTEMS COMPANY
Plano, TX 75086
Randy Boys*, [email protected], 972-575-6128

RAYTHEON SYSTEMS COMPANY
7700 Arlington Blvd, Falls Church, VA 22204
Steve Olson*, [email protected], 703-876-1942

RL ENGWALL & ASSOCIATES
560 Choptank Cove Court, Annapolis, MD 21401
Dick Engwall*, Management Consultant, [email protected], 410-571-8623

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION (SAIC)
Greensboro Dr., Suite 290, McLean, VA 22102
Annie Patenaude, [email protected], 703-749-5109


SCIENCE APPLICATIONS INTERNATIONAL CORPORATION (SAIC)
Jefferson Davis Highway, Arlington, VA 22202-3911
Tom Brown, [email protected]
Dave Thomen, [email protected]

SCIENCE APPLICATIONS INTERNATIONAL CORPORATION (SAIC)
3045 Technology Parkway, Orlando, FL 32826-3299
Col (Ret) James Shiflett, Vice President, 407-207-2725

W. San Horton, Senior Technical Manager, 407-282-6700

SECRETARY OF THE AIR FORCE
1060 Air Force Pentagon, Washington, DC 20301
Lieutenant General George Muellner, Principal Deputy (Acquisition), 703-695-7311

MAJ Gary Hopper, SAF/[email protected]

MAJ Gerald S. Smither, Jr.**, SAF/[email protected]

Lt. Col Paul W. Coutee**, SAF/[email protected]


SIMULATION TRAINING AND INSTRUMENTATION COMMAND (STRICOM)
12350 Research Parkway, Orlando, FL 32826
Stan Goodman, [email protected]
Donald Jones, Deputy Product Manager, Air & Combat Training [email protected]

Les [email protected]

Dr. Stuart [email protected]

Admiral [email protected]

Phillip Sprinkle, Deputy PM Training [email protected]

Ed [email protected]

Bill [email protected]

SPACE & WARFARE SYSTEMS COMMAND (SPAWAR)
4301 Pacific Hwy, San Diego, CA 92110-3127
Phillip Hornick, Deputy Program Manager M&S, C60 Topside [email protected], 619-537-0145


SPACE AND MISSILE SYSTEMS CENTER
180 Skynet Way, Los Angeles Air Force Base, CA 90245
Mark Fagan, Chief, Modeling & Simulation Division, Directorate of Developmental Planning, [email protected]

STANFORD RESEARCH INSTITUTE (SRI) INTERNATIONAL
333 Ravenswood Avenue, Menlo Park, CA 94025
Ralph Toms, Senior Technical Advisor, [email protected], 650-859-2852

STANFORD RESEARCH INSTITUTE (SRI) INTERNATIONAL
North Kent Street, Arlington, VA 22153
Greg Wilcox*, [email protected], 703-247-8467

TECHNICAL AUTOMATION SERVICES CORPORATION (TASC)
55 Walkers Brook Dr, Reading, MA 01867
Murray Cantor, Program Management, [email protected], 617-942-2000

Hal Jones*, Technical Automation Services Corporation (TASC), 55 Walkers Brook Drive, Reading, MA 01867, [email protected], x2207

THE UNIVERSITY OF MICHIGAN
College of Engineering, 2236 G.G. Brown Building, Ann Arbor, MI 48109-2125
Panos Papalambros, Professor and Chair, Mechanical Eng. & Applied Mechanics, http://www-personal.engin.umich.edu/~pyp/, 734-764-8464


TRAINING AND DOCTRINE COMMAND
Systems Manager for Cannon Systems, Commandant, U.S. Army Field Artillery School (USAFAS), Ft Sill, OK 73503-5600
COL Michael Cuff, TRADOC Sys. Mgr for Cannon Systems, [email protected], 580-442-6902

TRIDENT SYSTEMS INC.
10201 Lee Highway, Suite 300, Fairfax, VA 22030
Nick Karangelen*, President, [email protected], 703-691-7765

U.S. ARMY DIRECTOR OF INFORMATION SYSTEMS FOR COMMAND, CONTROL, COMMUNICATIONS & COMPUTERS (DISC4)
Pentagon, Room 3E458, Washington, DC 20301
COL James Deal, Executive Officer, [email protected], 703-697-5503

U.S. ARMY LOGISTICS MANAGEMENT COLLEGE
Attn: ATSZ SAM PQMD, Bldg. 12500, 2401 Quarters Rd, Fort Lee, VA 23801-1705
Dr. George Huntley, [email protected], 804-765-4265

U.S. ARMY MATERIAL SYSTEMS ANALYSIS ACTIVITY
AMXSY-CA, 392 Hopkins Road, Aberdeen Proving Grounds, MD 21005-5071
Patrick O'Neill, Chief, C4I Branch
Will Brooks, A2ATD Program Office, Chief, Simulation [email protected], 4946

U.S. ARMY MATERIAL SYSTEMS ANALYSIS ACTIVITY (AMSAA)
Attn: AMXSY-A, Rock Island, IL 61299-7260
Jerry Moeller, [email protected], 309-782-7823


U.S. ARMY MODELING AND SIMULATION OFFICE
1111 Jefferson Davis Hwy, Crystal Gateway North, Arlington, VA 22202
Linda Stone, [email protected], 703-601-0010 x33
William Dunn, [email protected], 703-601-0011 x25

U.S. ARMY TANK AND AUTOMOTIVE COMMAND
USATACOM TARDEC, Warren, MI 48397-5000
Arthur Adlam, Attn: AMSTA-TR-VP, [email protected], 810-574-8882

Tim Bailey, Attn: AMSTA-TR-N/[email protected]

Jerry [email protected]

David Holm, Operations Research [email protected]

Paul Skalny, National Automotive [email protected]


U.S. ARMY TEST & EVALUATION COMMAND
Aberdeen Proving Ground, MD 21005
Dave Brown, [email protected], 410-278-1473

John [email protected]

LTC Skip [email protected]

U.S. ATLANTIC COMMAND
Joint Training, Analysis and Simulation Center (JTASC), 116 Lakeview Pkwy, Suffolk, VA 23435-2699
Lt. Col Robert Strini, Chief, Synthetic Theater of War (STOW) Branch, [email protected], 757-686-7525

UNDER SECRETARY OF DEFENSE ACQUISITION TECHNOLOGY, USD(A&T)
Pentagon, Washington, DC 20301
Honorable Jacques S. Gansler, 703-695-2381

Dave Oliver, Principal [email protected]

UNITED DEFENSE LIMITED PARTNERSHIP (CRUSADER PROGRAM)
4800 East River Road, Minneapolis, MN 55421-1498
Richard Staiert, Project Manager, Advanced Systems Development, [email protected], 612-572-4815

Bob Stratton, Business Development [email protected]

Steven Untz, Crusader Information Mgmt. [email protected]


UNITED DEFENSE LIMITED PARTNERSHIP (CRUSADER PROGRAM) (continued)
4800 East River Road, Minneapolis, MN 55421-1498
David Wallestad, Crusader Program Director, [email protected], 612-572-4799

UNITED DEFENSE LIMITED PARTNERSHIP
12443 Research Pkwy, Orlando, FL
Warren Richeson, Director, 407-380-5500

Bob Hatton, Manager, Orlando Ops, Training Systems [email protected]

UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN
Urbana, IL 61801
Wayne J. Davis, Professor, General/Mechanical & Industrial Eng., [email protected]

UNIVERSITY OF IOWA
The University of Iowa, Dept of Mechanical Engineering, Iowa City, IA 52241
Dr. Edward Haug, [email protected], 319-335-5726

VECTOR RESEARCH, INC
P.O. Box 1506, Ann Arbor, MI 48106
Peter Cherry, [email protected], 734-973-9210


ENDNOTES

1. The SBA Industry Steering Group (ISG) is industry's primary interface with DoD regarding SBA. The ISG is endorsed by the National Defense Industrial Association (NDIA), the International Council on Systems Engineering (INCOSE), the National Training Systems Association, the Aerospace Industries Association (AIA), the Institute of Electrical and Electronics Engineers (IEEE), and the Electronic Industries Association (EIA). Members are identified in the list above with a *.

2. The Joint SBA Task Force is discussed in Chapter 3, Background. Members are identified in the list above with a **.


APPENDIX C

ABOUT THE AUTHORS


LTC Michael V.R. Johnson, Sr., US Army, has over 20 years of operational and acquisition experience. His most recent assignment was with Headquarters, Army Special Operations Command as Acquisition Branch Chief. Other acquisition assignments include: Maintenance Engineer with Communications and Electronics Command; Training with Industry at Martin Marietta; Project Director at Simulation Training and Instrumentation Command (STRICOM); and the first Project Director for the Distributed Interactive Simulation Office. LTC Johnson's operational assignments include aviation assignments in Fulda, Germany, with the 11th Armored Cavalry Regiment; at Fort Carson, Colorado, with the 4th Infantry Division as a Company Commander in the Aviation Brigade, Brigade Plans Officer for a Mechanized Infantry Brigade, and the Operational Aviation Officer for the 4th Division; and at Fort Knox as a Cavalry Platoon Leader with the 2/6 Cavalry Brigade. LTC Johnson has a Masters Degree in Contract Acquisition Management and is a graduate of the U.S. Military Academy, the Harvard Graduate School of Business's Program for Management Development, the Army's Command and General Staff College, and the Defense Systems Management College's Advanced Program Manager's Course. He is rated in the OH-58, AH-1, and UH-1 helicopters. He is also Airborne and Ranger qualified.

LtCol Mac McKeon, US Marine Corps, has 20 years of aviation service, flying the F-4 Phantom, A-6 Intruder, and the F/A-18 Hornet. His most recent assignment was as the Operations Officer, Marine Aircraft Group-31, at MCAS Beaufort, SC. LtCol McKeon began his acquisition career as the Program Manager for the USMC Land and Training Area Requirement System at the Defense Training and Performance Data Center in Orlando, FL. Subsequent acquisition assignments include Project Manager for USMC/USN KC-130 Flight Simulators and Program Manager for USMC and DoD Virtual Reality R&D programs at the Naval Air Warfare Center Training Systems Division, Orlando, FL. LtCol McKeon graduated from the U.S. Naval Academy in 1978 with a B.S. in Political Science. He received his M.S. in Computer Systems Management/Information Technology in 1991 from the Naval Postgraduate School, Monterey, CA. He is a graduate of the Harvard Graduate School of Business's Program for Management Development, the USMC Amphibious Warfare School, and the Marine Command and Staff Course.

Lt Col Terence R. Szanto, US Air Force, has 13 years of acquisition experience at the program office, Air Staff, joint, and international levels. He was most recently assigned to Headquarters Air Force, providing recommendations for $24 billion of the $450 billion Air Force Future Years Defense Plan. In a joint acquisition assignment in the Republic of Korea, he coordinated military technology cooperation programs. At the Electronic Systems Center, Hanscom AFB, he upgraded the B-1 and B-52 aircraft mission planning systems for smart and conventional weapons capabilities, and maintained 238 tactical aircraft mission planning systems deployed at 150 locations worldwide. His first acquisition assignment was at Space Division, Los Angeles AFS, CA. Lt Col Szanto spent a career-broadening assignment in F-4E aircraft maintenance at George AFB, CA. He graduated from the U.S. Air Force Academy in 1981 with a B.S. in Engineering Mechanics, and received an MBA from Pepperdine University. Lt Col Szanto is a graduate of the Harvard Graduate School of Business's Program for Management Development, is a distinguished graduate of Squadron Officers School and Air Command and Staff College, and will attend the Industrial College of the Armed Forces this fall.