SSRR 2017 November 8, 2017

Missile Defense Agency (MDA) Research and Course Development:
Verification, Validation, and Accreditation and Monte Carlo Simulation

Sponsor: Missile Defense Agency
Mikel D. Petty, Ph.D.
University of Alabama in Huntsville

9th Annual SERC Sponsor Research Review
November 8, 2017

FHI 360 Conference Center
1825 Connecticut Avenue NW, 8th Floor
Washington, DC 20009
www.sercuarc.org
Presenter
• Mikel D. Petty, Ph.D.
  ▪ Associate Professor, Computer Science
  ▪ Senior Scientist for Modeling and Simulation, ITSC
• Education
  ▪ Ph.D. Computer Science, UCF 1997
  ▪ M.S. Computer Science, UCF 1988
  ▪ B.S. Computer Science, CSUS 1980
• Career summary
  ▪ Information technology: CSUS, UTEP, GM, UCF; 1980-1990
  ▪ Research: UCF, ODU, UAH; 1990-present
• Research
  ▪ Modeling and simulation: interoperability and composability, VV&A methods, human behavior modeling, autonomy and AI
  ▪ > 200 publications
  ▪ > $16.5 million total research funding awarded
Presentation outline
• Overview of courses
  ▪ Courses and development status
  ▪ Courses' structure and features
  ▪ Monte Carlo Simulation: outline and examples
  ▪ Verification, Validation, and Accreditation: outline and case studies
  ▪ Past and future offerings
  ▪ Other uses of the content
• Sample content
  ▪ Monte Carlo Simulation
  ▪ Verification, Validation, and Accreditation
Overview of courses
Courses and development status

[Diagram: five courses with their topic areas and development status. Status key: (1) development complete, currently being offered; (2) under development; (3) planned.
• M&S 101: terms and definitions; mathematical preliminaries; M&S categorizations; modeling methods (Monte Carlo simulation, discrete event simulation, physics-based modeling); VV&A; distributed simulation
• M&S Fundamentals: terms and definitions; mathematical preliminaries; M&S categorizations; modeling methods; discrete event simulation; distributed simulation; special topics; CMSP exam
• Monte Carlo Simulation: concepts and definitions; mathematics of randomness; methods; quadrature; Monte Carlo 1 examples; Monte Carlo 2 examples; design of experiments; input and output
• VV&A: concepts and definitions; processes; case studies
• Physics-based Modeling: integration of ODEs; Fourier methods for PDEs; physics in models; absence of continuity; applications; additional modeling methods; implementation]
Courses' structure and features
• Structure
  ▪ 2 days, 0800-1700 each day
  ▪ Lecture 0800-1100, exercises 1100-1200
  ▪ Lecture 1300-1600, exercises 1600-1700
• Features
  ▪ Target audience: engineers, analysts, first-level supervisors
  ▪ Students encouraged to question and contribute during lectures
  ▪ Many examples to illustrate concepts and processes
  ▪ Both MDA-oriented and general-interest examples
  ▪ Examples implemented in Excel and R
  ▪ Current slide counts: MCS 273, VV&A 337
  ▪ Sources cited in slides
  ▪ Slides (hardcopy and digital) provided to students
  ▪ Hands-on, instructor-supervised exercises based on examples
  ▪ Course content enhanced after each offering based on feedback
Monte Carlo Simulation: Outline
• Part 1: Concepts, Randomness, and Examples
  ▪ Course introduction ●
  ▪ Concepts and definitions ●
  ▪ Mathematics of randomness ●
  ▪ Monte Carlo 1 examples ●●
  ▪ Design of experiments and MCS ●
• Part 2: Input, Output, and Examples
  ▪ Input modeling and output analysis ●
  ▪ Monte Carlo 2 examples ●●
  ▪ Setting confidence interval width ●
Monte Carlo Simulation: Examples
• Monte Carlo 1 examples
  ▪ EVA duration
  ▪ Product earnings
  ▪ Heat transfer
  ▪ Lionfish control
  ▪ Slope runoff
  ▪ Electromagnetic launcher
  ▪ Minefield transit
  ▪ Missile impacts*
  ▪ Fragment trajectories*
• Monte Carlo 2 examples
  ▪ Monty Hall
  ▪ Risk battle
  ▪ Bearing replacement
  ▪ Attrition combat
  ▪ Direct fire*
  ▪ Bombing accuracy*
  ▪ Screening clinic
  ▪ Epidemic progression
  ▪ Forest fire
  ▪ Layered defense*
  ▪ Interceptor deployment*
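One of the Monte Carlo 2 examples listed above, Monty Hall, is small enough to sketch here. This is an illustrative Python sketch, not course material (the course implements its examples in Excel and R): fixed initial conditions (three doors, one prize), a stochastic model, and results analyzed over many trials.

```python
import random

def monty_hall_trial(switch, rng):
    """One game: prize behind a random door; host opens a losing, unpicked door."""
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a door that is neither the contestant's pick nor the prize
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        # Switch to the one remaining closed door
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

rng = random.Random(1)
trials = 20000
p_switch = sum(monty_hall_trial(True, rng) for _ in range(trials)) / trials
p_stay = sum(monty_hall_trial(False, rng) for _ in range(trials)) / trials
# Switching wins about 2/3 of the time; staying wins about 1/3
```

The estimated probabilities converge on the known analytic answer (2/3 for switching), which is exactly the kind of sanity check the course's exercises emphasize.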
Verification, Validation, and Accreditation: Outline
• Part 1: Concepts
  ▪ Course introduction ●
  ▪ Definitions and concepts ●
  ▪ Software engineering VV&A vs. modeling and simulation VV&A ●
  ▪ VV&A and model credibility ●
  ▪ Validation risk and the Columbia disaster ●
• Part 2: Methods
  ▪ Introduction ●
  ▪ Survey of verification and validation methods ●
  ▪ Validation using confidence intervals ●
• Part 3: Case studies ●●
• Part 4: Processes
  ▪ V&V of integrated model federations ●
  ▪ VV&A of legacy models ●
  ▪ Developing and validating virtual targets ●
Verification, Validation, and Accreditation: Case studies
Validating a model of X using Y
• Spacecraft propulsion system; regression
• Waiting line; hypothesis testing
• Ground combat; comparison testing
• Foam decomposition; confidence interval*
• Missile impacts; hypothesis test*
• Commander decision making; hypothesis test
• Wastewater treatment; multiple methods
Past and future offerings
• M&S 101
  ▪ Colorado Springs CO, June 9-12, 2015
  ▪ Huntsville AL, August 3-6, 2015
  ▪ Huntsville AL, March 28-31, 2016
  ▪ Huntsville AL, August 9-12, 2016
  ▪ Colorado Springs CO, May 15-18, 2017
• Monte Carlo Simulation
  ▪ Huntsville AL, August 7-8, 2017
  ▪ Colorado Springs CO, September 6-7, 2017
  ▪ Huntsville AL, December 13-14, 2017
  ▪ Colorado Springs CO, April 11-12, 2018
• Verification, Validation, and Accreditation
  ▪ Huntsville AL, August 9-10, 2017
  ▪ Colorado Springs CO, September 26-27, 2017
  ▪ Huntsville AL, January 23-25, 2018
  ▪ Colorado Springs CO, May 1-3, 2018
Sample content: Monte Carlo Simulation
Two forms of Monte Carlo simulation

MC1: Stochastically varying initial conditions → deterministic model → stochastically varying results
  ▪ Probability distributions used to model variability in initial conditions
  ▪ e.g., physics-based model
  ▪ Multiple runs with run-to-run variability in results; analyzed statistically
  ▪ MC1 example: Missile impacts [Zhang, 2008]

MC2: Fixed initial conditions → stochastic model → stochastically varying results
  ▪ Specific known or given initial conditions
  ▪ e.g., probability-based model
  ▪ Multiple runs with run-to-run variability in results; analyzed statistically
  ▪ MC2 example: Bombing accuracy [BanksJ, 2010]
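The two forms can be sketched side by side. This is an illustrative Python sketch, not course material; both toy models and all of their parameters are invented stand-ins for the physics-based and probability-based models the course uses.

```python
import random
import statistics

rng = random.Random(42)
trials = 2000

# MC1: stochastic initial conditions feed a deterministic model
def travel_time(distance, speed):          # deterministic model
    return distance / speed

mc1 = [travel_time(rng.gauss(100.0, 5.0), rng.gauss(20.0, 1.0))
       for _ in range(trials)]

# MC2: fixed initial conditions feed a stochastic model
def hits(shots, pk, rng):                  # stochastic (probability-based) model
    return sum(rng.random() <= pk for _ in range(shots))

mc2 = [hits(10, 0.70, rng) for _ in range(trials)]

# Either way, the run-to-run variability is summarized statistically
mc1_mean, mc1_sd = statistics.mean(mc1), statistics.stdev(mc1)
mc2_mean, mc2_sd = statistics.mean(mc2), statistics.stdev(mc2)
```

In MC1 the randomness lives entirely in the inputs (the model itself always gives the same output for the same inputs); in MC2 the randomness lives in the model itself. Both produce a distribution of results to analyze.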
Layered defense
• Hostile aircraft attacking friendly facility
• Two layers of defense, each with a different SAM
• Type A SAM: random target selection, Pk = 0.70
• Type B SAM: specified target selection, Pk = 0.90
• Type B SAMs much more expensive than Type A

Type A SAM, e.g., Stinger (notional); Type B SAM, e.g., Patriot (notional)
Layered defense model [Przemieniecki, 2000] [Albert, 2012]

Initial raid → Type A engagement → reduced raid → Type B engagement → reduced raid → friendly facility

• Type A SAMs engage first, Type B SAMs engage second
• Each type A SAM selects a random target; duplicates possible
• Each type B SAM is given a specific target; no duplicates
• All available missiles of each type fired in each layer
Layered defense model
• Variables
  ▪ Number of attacking aircraft: Hn
  ▪ Number of type A SAMs: An
  ▪ Number of type B SAMs: Bn
  ▪ Type A SAM probability of kill: Ak
  ▪ Type B SAM probability of kill: Bk
  ▪ Type A SAM cost: Acost
  ▪ Type B SAM cost: Bcost
• Assumptions and abstractions
  ▪ Number of attacking aircraft fixed at 8
  ▪ Attacking aircraft identical
  ▪ Attacking aircraft speed, altitude, and defenses included in Ak, Bk
Scenario and initial conditions
• Analysis of probability of raid annihilation
  ▪ How many SAMs of type A should be deployed?
  ▪ How many SAMs of type B should be deployed?
  ▪ What is the optimum mix of SAMs to achieve PRA ≥ 0.95?
• Initial conditions
  ▪ Number of attacking aircraft Hn = 8
  ▪ Number of type A SAMs An ∈ {6, 8, 10, … 24}
  ▪ Number of type B SAMs Bn ∈ {0, 1, 2, … 6}
  ▪ Probability of kill Ak = 0.70, Bk = 0.90
  ▪ Cost per SAM Acost = 1, Bcost = 50

How does the mix of type A and type B SAMs affect the probability of raid annihilation?
Model implementation

Raid <- function(Hn, An, Ak, Bn, Bk) {
  # Initial raid consists of Hn attacking aircraft, numbered 1 to Hn
  raid0 <- seq(1:Hn)
  # Defense layer 1; result raid1 is vector of attackers remaining.
  targets <- sample(raid0, size=An, replace=TRUE)  # Randomly select targets
  shots <- runif(An, min=0, max=1)       # Outcome of each shot
  kills <- targets[which(shots <= Ak)]   # Determine which shots killed target
  raid1 <- setdiff(raid0, kills)         # Remove killed targets from raid
  # Defense layer 2; result raid2 is number of attackers remaining.
  shots <- runif(Bn, min=0, max=1)       # Outcome of each shot
  kills <- sum(shots <= Bk)              # Determine how many targets were killed
  raid2 <- max(0, length(raid1)-kills)   # Deduct killed targets from raid
  return(raid2)
}

trials <- 1000   # Number of trials for each An, Bn combination
Hn <- 8          # Number of attacking aircraft
An <- c(6, 8, 10, 12, 14, 16, 18, 20, 22, 24)  # Number of type A SAMs
Bn <- c(0, 1, 2, 3, 4, 5, 6)                   # Number of type B SAMs
Ak <- 0.70       # Probability type A SAM destroys its target
Bk <- 0.90       # Probability type B SAM destroys its target
Acost <- 1       # Relative cost of type A SAM (w.r.t. type B)
Bcost <- 50      # Relative cost of type B SAM (w.r.t. type A)

results <- matrix(nrow=length(An), ncol=length(Bn))
costs <- matrix(nrow=length(An), ncol=length(Bn))
for (i in 1:length(An)) {
  for (j in 1:length(Bn)) {
    results[i,j] <- sum(replicate(trials, Raid(Hn, An[i], Ak, Bn[j], Bk)) == 0)/trials
    costs[i,j] <- (An[i]*Acost)+(Bn[j]*Bcost)
  }
}
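A quick analytic cross-check of the layer-1 logic (a hypothetical verification step, not part of the course material): because each type A SAM picks its target uniformly at random among the Hn attackers and kills it with probability Ak, each aircraft survives any one shot with probability 1 − Ak/Hn, so the expected number of survivors after An independent shots is Hn(1 − Ak/Hn)^An. A small Python sketch of layer 1 should approximate that closed form.

```python
import random

Hn, An, Ak = 8, 12, 0.70   # attackers, type A SAMs, type A Pk
rng = random.Random(7)

def survivors_layer1(rng):
    """One trial of defense layer 1: each type A SAM picks a random
    target (duplicates allowed) and kills it with probability Ak."""
    killed = set()
    for _ in range(An):
        target = rng.randrange(Hn)
        if rng.random() <= Ak:
            killed.add(target)
    return Hn - len(killed)

trials = 20000
sim = sum(survivors_layer1(rng) for _ in range(trials)) / trials

# Closed form: each aircraft independently survives each of the An shots
# with probability 1 - Ak/Hn, so E[survivors] = Hn * (1 - Ak/Hn)^An
analytic = Hn * (1 - Ak / Hn) ** An
```

Agreement between the simulated and analytic expectations is a useful verification check on the random-targeting logic before trusting the full two-layer results.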
Results of 1 trial

> Hn <- 8
> An <- 12
> Ak <- 0.70
> Bn <- 2
> Bk <- 0.90
> raid0 <- seq(1:Hn)
> raid0
[1] 1 2 3 4 5 6 7 8
>
> targets <- sample(raid0, size=An, replace=TRUE)  # Randomly select targets
> shots <- runif(An, min=0, max=1)       # Outcome of each shot
> kills <- targets[which(shots <= Ak)]   # Determine which shots killed target
> raid1 <- setdiff(raid0, kills)         # Remove killed targets from raid
> targets
 [1] 8 1 8 5 8 5 8 1 3 7 2 4
> shots
 [1] 0.84513263 0.89358349 0.09825950 0.50993895 0.61837621 0.03134841 0.51336181 0.86036322 0.69179964
     0.75470591 0.51931788 0.53344185
> kills
[1] 8 5 8 5 8 3 2 4
> raid1
[1] 1 6 7
>
> shots <- runif(Bn, min=0, max=1)       # Outcome of each shot
> kills <- sum(shots <= Bk)              # Determine how many targets were killed
> raid2 <- max(0, length(raid1)-kills)   # Deduct killed targets from raid
> shots
[1] 0.07976728 0.82967724
> kills
[1] 2
> raid2
[1] 1
Results of 70,000 trials

[Results table for each An × Bn combination]
Response variables:
  ▪ Probability of raid annihilation (upper value in cells)
  ▪ Cost of SAMs (lower value in cells)
Sample content: Verification, Validation, and Accreditation
V&V errors and statistical errors

• Model valid, model used: correct decision
• Model valid, model not used: Type I error (non-use of a valid model; insufficient V&V; model builder's risk; less serious error)
• Model not valid, model used: Type II error (use of an invalid model; incorrect V&V; model user's risk; more serious error)
• Model not valid, model not used: correct decision

Type I error: reject H0 when H0 is true, i.e., reject a valid model
  P(Reject H0 | H0 true) = P(Type I error) = α
Type II error: fail to reject H0 when H1 is true, i.e., fail to reject an invalid model
  P(Fail to reject H0 | H1 true) = P(Type II error) = β
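The quantity α = P(Type I error) can be demonstrated empirically. The following is an illustrative Python sketch under assumed conditions (a two-sided z-test on samples whose null hypothesis really is true, i.e., the "model" output genuinely matches the simuland): a valid model should be wrongly rejected in about α of the validation experiments.

```python
import math
import random

rng = random.Random(3)
n, experiments = 30, 4000
rejections = 0
for _ in range(experiments):
    # H0 is true by construction: the output really does come from N(0, 1)
    sample = [rng.gauss(0, 1) for _ in range(n)]
    # z statistic for the sample mean against the known mean 0, sd 1
    z = (sum(sample) / n) / (1 / math.sqrt(n))
    if abs(z) > 1.96:        # two-sided test at alpha = 0.05
        rejections += 1
alpha_empirical = rejections / experiments
# A valid model is rejected (Type I error) in roughly 5% of experiments
```

The empirical rejection rate hovers near the nominal α = 0.05, illustrating the model builder's risk: even a perfectly valid model will occasionally fail a statistical validation test.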
Missile impacts [Zhang, 2008]
• Scud: single-stage, liquid-fuel tactical ballistic missile
• Range 180 km (Scud-A) to 700 km (Scud-D)
• Deployed and used widely in the Middle East and Asia
• Where does a missile impact w.r.t. the aiming point?

Scuds being launched [ZeroHedge, 2015]; Scud on transporter [Batiz, 1997]
Model [Zhang, 2008]
• Deterministic physics-based 6DOF MATLAB Simulink model
• Calculates missile trajectory and impact point from initial conditions
• Organized into modules: velocity, rotation, atmospheric conditions, aerodynamics, thrust
Velocity module equations:
  m dV/dt = P cosα cosβ − X − mg sinθ
  mV dθ/dt = P (sinα cosγ_V + cosα sinβ sinγ_V) + Y cosγ_V − Z sinγ_V − mg cosθ
  −mV cosθ dφ_V/dt = P (sinα sinγ_V − cosα sinβ cosγ_V) + Y sinγ_V + Z cosγ_V

[Velocity module block diagram]
Monte Carlo analysis
• Application
  ▪ Validate model's range (x) and deflection (y) errors (miss distances) w.r.t. the aiming point
  ▪ Compare model error variances to live test data
  ▪ Two ranges: 60 km and 100 km
  ▪ 6 live tests and 800 Monte Carlo model trials at each range
• Procedure
  ▪ Generate initial conditions stochastically
  ▪ Calculate impact point using the deterministic model
  ▪ Repeat for 800 trials
  ▪ Compare model and live test x and y variances using statistical hypothesis tests
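The four-step procedure can be sketched with a toy stand-in for the 6DOF model. This is an illustrative Python sketch only: the vacuum-trajectory model and the initial-condition distributions below are invented for illustration, not taken from [Zhang, 2008].

```python
import math
import random
import statistics

rng = random.Random(11)

def impact_x(v, theta_deg):
    """Deterministic stand-in for the 6DOF model: vacuum, flat-earth
    downrange impact distance in meters for launch speed v (m/s)."""
    theta = math.radians(theta_deg)
    return v * v * math.sin(2 * theta) / 9.81

nominal_v, nominal_theta = 1400.0, 42.0   # notional initial conditions
trials = 800
x_err = []
for _ in range(trials):
    # Step 1: generate initial conditions stochastically
    v = rng.gauss(nominal_v, 10.0)
    theta = rng.gauss(nominal_theta, 0.3)
    # Step 2: calculate the impact point with the deterministic model
    x_err.append(impact_x(v, theta) - impact_x(nominal_v, nominal_theta))
# Step 3: repeated for 800 trials above.
# Step 4: the dispersion statistic to compare against live-test data
x_sd = statistics.stdev(x_err)
```

This is the MC1 pattern from the earlier slide: all the randomness is in the initial conditions, the model itself is deterministic, and the output of interest is the dispersion of the impact errors.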
Validation comparison
• Dispersion of impact points is important in missile systems and often validated
• Compare model variance σM² (from runs) to simuland variance σS² (from observations)
• Comparison is not simply informally comparing σM² and σS² values and concluding "TLAR"
• Statistical hypothesis test compares σM² to σS²
• F-test used
• x and y variances compared separately
Results of 1600 trials

60 km
  Source   x err s   y err s   n
  Model    535.08    85.75     800
  Live     558.52    90.35     6

100 km
  Source   x err s   y err s   n
  Model    921.39    111.25    800
  Live     980.52    120.68    6
F-test: Concepts
• Statistics [Brase, 2015]
  ▪ Hypothesis test for equality of population variances
  ▪ Tests whether two populations have equal variances
  ▪ Assumes populations to be normally distributed; very sensitive to non-normality
• Validation
  ▪ Tests simuland variance and model variance for equality
  ▪ Use when multiple simuland observations and multiple model runs are available
  ▪ If the test finds the variances to be equal, then the model is considered valid for variance
F-test: Procedure
• Obtain independent samples from both populations
• Calculate sample statistics: means x̄1, x̄2; standard deviations s1, s2; sample sizes n1, n2
• Set notation so that s1² ≥ s2²
• Select level of significance α; often α = 0.05
• Formulate hypotheses:
  H0: σ1² = σ2² (model valid)
  H1: σ1² > σ2² (model not valid)
• Calculate test statistic F = s1² / s2²
• Determine P-value for F (table or software)
• If P-value < α, then reject H0; otherwise do not reject H0

(For critical region test procedure, see [Bhattacharyya, 1977].)
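The test-statistic step can be checked against the slide data for this case study. This is an illustrative Python sketch; the helper name is hypothetical, and it encodes the convention of putting the larger sample variance in the numerator.

```python
def f_statistic(s1, s2):
    """F statistic for equality of variances from two sample standard
    deviations, with the larger variance in the numerator."""
    big, small = max(s1, s2), min(s1, s2)
    return (big * big) / (small * small)

# Sample standard deviations from the missile-impact case study:
# live tests (simuland, n = 6) vs. model runs (n = 800)
f_x60 = f_statistic(558.52, 535.08)    # x error, 60 km
f_y60 = f_statistic(90.35, 85.75)      # y error, 60 km
f_x100 = f_statistic(980.52, 921.39)   # x error, 100 km
f_y100 = f_statistic(120.68, 111.25)   # y error, 100 km
# P-values then come from the F distribution with (5, 799) degrees of freedom
```

Since both standard deviations are squared, the F statistic is the ratio of sample variances, and the numerator/denominator degrees of freedom are the corresponding sample sizes minus one (5 for the live tests, 799 for the model runs).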
F-test: Calculations

Range    Error   s1 (simuland)   s2 (model)   F        d.f.N   d.f.D   P-value
60 km    x       558.52          535.08       1.0895   5       799     0.3646
60 km    y       90.35           85.75        1.1102   5       799     0.3534
100 km   x       980.52          921.39       1.1325   5       799     0.3415
100 km   y       120.68          111.25       1.1767   5       799     0.3188

F distribution P-value in Excel: =FDIST(F, d.f.N, d.f.D)
F-test: Interpretation
• Rejection criteria for one-tailed F-test
  ▪ If P-value < α, then reject H0
  ▪ Otherwise, do not reject H0
• Result
  ▪ All four P-values > α
  ▪ Do not reject H0 for either direction or range
• Interpretation
  ▪ Model is valid w.r.t. variance of x and y errors at both 60 km and 100 km
More information
• Mikel D. Petty, Ph.D.
• University of Alabama in Huntsville
  ▪ Information Technology and Systems Center
  ▪ Computer Science Department
• Contact information
  ▪ Telephone: 256-824-6140
  ▪ Email: [email protected]