
2019 LVC and Agile Workshop

"Accelerating Test and Evaluation with Live-Virtual-

Constructive and Agile"

Program Guide

September 17-19, 2019

DoubleTree by Hilton Hotel Orlando East-UCF Area, FL


THANK YOU TO OUR SPONSORS!

Platinum Level

Gold Level

Bronze Level Sponsor

ITEA is a 501(c)(3) professional education association dedicated to the education and advancement of the test and evaluation profession. Registration fees, membership dues, and sponsorships are tax deductible.

Sponsorship dollars defer the cost of the workshop and support the ITEA scholarship fund, which assists deserving students in their pursuit of academic disciplines related to the test and evaluation profession.


Hosted by the ITEA Central Florida Chapter

Program Committee

Workshop Chair: Steve "Flash" Gordon, PhD - Georgia Tech Research Institute
Deputy Workshop Chair: C. David Brown, PhD, CTEP - The MITRE Corporation
Technical Program Chair: Mr. Mark Phillips - Raytheon
Chief of Staff for Registrations and Support: Mr. and Mrs. Henry Merhoff

WORKSHOP DESCRIPTION

Fielding effective, secure systems to warfighters at the speed of need is essential, but this goal is difficult to achieve given that industrial-age acquisition and systems engineering processes, including Test and Evaluation (T&E), do not mesh well with development and use of modern software-intensive systems. Agile software processes that combine acquisition events with developmental and operational testing show promise in decreasing historic timelines. Combining built-in information technology security and assured hardware platforms (Sec) with software development (Dev) and information technology operations (Ops) throughout the SecDevOps software build is also streamlining the delivery of secure software-intensive systems. Finally, increasing the focus on what the warfighter needs now and what is necessary for potential conflicts will provide more usable and effective systems. Other key ideas for improving effectiveness and accelerating this process include early prototyping via modeling, simulation, and gaming; evaluating hardware prototypes; combining test events; the use of Artificial Intelligence to improve data gathering and reporting; and evolutionary program development. Simulation and gaming environments can be used to allow warfighters and testers to evaluate the advantages of system variants and alternative tactics before the hardware and software are finalized.

CONTINUING EDUCATION UNITS (CEUs)

Each 4-hour Pre-Workshop Tutorial provides 4 contact hours of instruction (4 CEUs) that are directly applicable to your professional development program, including the Certified Test and Evaluation Professional (CTEP) credential.

In addition to the Pre-Workshop Tutorials, the Workshop itself provides 4 contact hours of instruction (4 CEUs) for each half day, 8 contact hours (8 CEUs) for each full day, or 20 contact hours (20 CEUs) for attending the full Workshop, all directly applicable to your professional development program, including the CTEP credential.

Please send your request for a Certificate of Attendance to [email protected]

INDEX
Pre-Workshop Tutorials - Page 1
Wednesday Opening Plenary Session - Page 3
Wednesday Afternoon Technical Track Sessions - Page 4
Thursday Plenary Session - Page 5
Thursday Afternoon Technical Track Sessions - Page 6
ITEA Chapters - Page 7
Technical Track Session Abstracts - Page 8
ITEA Life Members - Page 14
Biographies - Page 15
ITEA Corporate Members and Chapters - Page 25
Certified Test and Evaluation Professional (CTEP) Credential - Page 28
Notes Page - Page 30
Hotel Map - Page 31


Tuesday, September 17th

8:00 a.m. – Noon Morning Pre-Workshop Tutorials (Separate fee required)

Accelerating Cybersecurity Test and Evaluation

Instructor: Mr. Alex Hoover - Department of Homeland Security (DHS)

Cybersecurity raises a number of challenges for the test and evaluation professional. As an emerging performance domain, it lacks the standard institutional processes and methods for specifying and understanding capability that exist in more familiar performance domains such as Reliability, Availability, and Maintainability. This lack of standards extends to the tools (manual and automated) available for cybersecurity test and evaluation. This tutorial will provide a point of view for cybersecurity performance assessment and align it with the fundamentals of Design of Experiments to provide a method for handling this domain using classical test and evaluation processes. Some days we hear "testing cyber is just testing"; other days, "cyber is different from everything else." Both viewpoints are valid within their limited contexts, and both will be discussed in this tutorial.

Integrating Systems Engineering, Agile, DevSecOps, and Test and Evaluation

Instructors: Dave Brown, PhD, CTEP, and Dave Bell, PhD, CTEP - The MITRE Corporation

Most of the capability of complex systems is now delivered by software, and in the process, systems engineering and T&E are getting a bad rap. We often hear that systems engineering processes are cumbersome, overly rigid and inflexible, resource excessive, and time consuming; that T&E is overbearing, redundant, and resource intensive; and that both are yesterday's processes, replaced by Agile, especially for software development but for hardware development as well. However, Agile does not make systems engineering and T&E obsolete. In fact, systems engineering provides a top-level structure and process for integrating Agile into large-scale developments, especially of integrated hardware/software systems, and many elements of systems engineering are part of the processes every Agile development team uses during each sprint. Many practitioners also state that developments incorporating Agile needn't accomplish any T&E; again, T&E is very much a part of, and well integrated into, Agile. Therefore, a program's T&E must be planned by practitioners who understand Agile and systems engineering. This tutorial will give T&E practitioners the overview and top-level understanding of Agile and systems engineering that they need to plan and oversee effective and efficient T&E for a program using an Agile development methodology.

Using Design of Experiments to Accelerate the Knowledge Gain from Test Data

Instructor: Steve "Flash" Gordon, PhD - Georgia Tech Research Institute (GTRI)

The use of Design of Experiments (DOE) to evaluate test data can provide more information in less time, using fewer resources, than previously used data analysis methods. This tutorial will briefly discuss DOE methods and processes, demonstrate standard procedures for reducing noise in an experiment or test, conduct a DOE application in class, and assist participants in hitting various targets consistently. Traditional testing methods will be compared with DOE methods to illustrate the time and funding avoided by using DOE and the additional information gained from the DOE analysis. Attendees will also receive a complimentary DOE e-book.
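The efficiency argument can be made concrete with a minimal sketch (the factor names and response values below are hypothetical, not material from the tutorial): a two-level, three-factor full factorial design covers every factor combination in only eight runs, and each factor's main effect falls out of simple averaging.

```python
from itertools import product

# Hypothetical two-level factors for a test campaign (illustrative only).
factors = ["altitude", "speed", "sensor_mode"]

# Full 2^3 factorial design: every combination of low (-1) and high (+1).
design = list(product([-1, 1], repeat=len(factors)))

# Notional measured responses, one per run (e.g., detection range in km).
responses = [54.0, 61.0, 57.0, 66.0, 53.0, 60.0, 58.0, 69.0]

def main_effect(idx):
    """Main effect: mean response at the high level minus mean at the low level."""
    high = [y for run, y in zip(design, responses) if run[idx] == 1]
    low = [y for run, y in zip(design, responses) if run[idx] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

for i, name in enumerate(factors):
    print(f"{name}: {main_effect(i):+.2f}")
```

With these notional numbers, every run contributes to every effect estimate, which is where the resource savings over one-factor-at-a-time testing come from.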

Verification and Validation – Accelerating T&E through the Application of Credible Models and Simulations

Instructor: Ms. Simone Youngblood - The Johns Hopkins University Applied Physics Laboratory

M&S is a key tool in system acquisition, used to reduce the time to field a system, reduce the resources needed to develop and evaluate that system, inform decision risk, and enhance training. Verification, Validation, and Accreditation (VV&A) are processes that focus on ensuring that M&S used in support of T&E provides credible representations of the system under test, its performance space, and its effectiveness. As we seek to accelerate T&E and expand available training systems, we need to evaluate ways that VV&A can support this acceleration. VV&A goals emphasize quality, identifying M&S use risk, formalizing the conceptual model description, and establishing recommended practices for VV&A. V&V topics of particular import for the T&E community include leveraging test data to develop validation referents, selecting V&V methods that align with available referent data, and ensuring that V&V evidence meets the rigor and requirements of the oversight test organizations. This tutorial will address these topics and discuss VV&A methods that will help accelerate M&S used in T&E and training.


1:00 PM – 5:00 PM Afternoon Pre-Workshop Tutorials (Separate fee required)

Accelerating Cybersecurity in Acquisition

Instructor: Mr. Pete Christensen, CTEP - The MITRE Corporation

Now more than ever, Program Managers (PMs) and their teams must ensure that cybersecurity is given careful consideration throughout the system acquisition lifecycle, from program inception through operations and sustainment. Cybersecurity requirements must be identified up front and early. Test infrastructure and tools must be provisioned to build cybersecurity capabilities into the acquisition from the very first lines of code. Program teams must exploit every opportunity to integrate functional testing and cybersecurity testing. Results of early cybersecurity testing can be used to influence design and development efforts and to posture programs for success from Developmental Test (DT) and Operational Test (OT) into deployment and operations. There are many opportunities to accelerate cybersecurity in acquisition, enabled by new technologies (cloud), new software development approaches (DevSecOps), and robust cybersecurity testing venues such as the National Cyber Range Complex (NCRC). The concepts and ideas presented here are relevant across the whole of government. This presentation will provide an overview of cybersecurity in DoD acquisition and of how program teams might leverage new technology, development approaches, and test infrastructure to accelerate cybersecurity in acquisition.

How to Plan Test Strategy for Agile Development in a Government Framework

Instructor: Mr. Hans Miller - The MITRE Corporation

This tutorial provides a framework and guidance for programs transitioning to an agile construct or for new programs established with one. The intended audience includes requirements managers, program managers, and test managers executing DoD programs; however, the overall principles could apply to multiple agencies. This tutorial is not a singular solution for agile testing; it acknowledges the different approaches needed for different programs and is intended to give students an understanding of concepts that can be tailored to their specific program. It will walk through characteristics of the agile process, and where it does and does not apply, to help inform expectations. It will cover US Code, OSD, and service policy as they apply to agile testing, along with planned policy updates designed to allow greater flexibility. The core of the tutorial covers upfront planning and strategy considerations for successful testing: requirements, contracting, infrastructure investments, automation, and test execution. It concludes with how to translate that strategy into concise, timely, and relevant documentation, from the TEMP through the test plan and test reporting.

Introduction to Agile Test and Evaluation

Instructor: Ms. Jennifer "Jenny" Rekas - The MITRE Corporation

Agile software engineering process models, such as Scrum, Kanban, and XP, have been popular for several years. Originally, Agile testing practice focused on individual software projects and on how automated testing could be accomplished by small teams. As Agile has become a more accepted process model, organizations are looking to scale it to larger, more complex systems that are not purely software-based, and to identify how to perform test and evaluation in an Agile context using DevSecOps technologies. This tutorial introduces several Agile and DevSecOps process concepts, with a focus on test and evaluation. Topics for this lecture-based tutorial include a review of Agile processes at the individual project level and scaled process models for larger systems; examples of Agile testing practices; an introduction to DevOps, particularly how test and evaluation fits into that paradigm; and a case study of how Agile test and evaluation was implemented on a large system-of-systems effort.

Using TENA/JMETC to Improve the Test and Training Process

Instructor: Mr. Gene Hudgins - KBR

TENA provides real-time software system interoperability, as well as interfaces to existing range assets, C4ISR systems, and simulations. TENA has also been selected for use in JMETC events and is well suited to its role in prototyping demonstrations and distributed testing. JMETC is a distributed LVC testing capability developed to support the acquisition community during program development, developmental testing, operational testing, and interoperability certification, and to demonstrate Net-Ready KPP requirements in a customer-specific JME. JMETC uses a hybrid network architecture: the JMETC Secret Network (JSN), based on the SDREN, is the T&E enterprise network solution for secret testing, and the JMETC MILS Network (JMN) is the T&E enterprise network solution for all classifications and for cyber testing. JMETC provides readily available connectivity to the Services' distributed test capabilities and simulations, as well as to industry test resources. JMETC is also aligned with the JNTC integration solutions to foster test, training, and experimentation collaboration. This tutorial will inform the audience of the current impact of TENA and JMETC on the test and training community and of their expected future benefits to the range community and the warfighter.


Wednesday, September 18th - Workshop Opening Plenary Session

8:00 a.m. Opening Ceremony - COL Bill Keegan, USA (Ret.), ITEA President, Equator Corporation
    Presentation of the Colors - US Army ROTC, University of Central Florida
    The National Anthem - Ms. Jennifer Johnson, Disney Vacations
8:10 a.m. Welcome - Steve "Flash" Gordon, PhD, Workshop Chair, Georgia Tech Research Institute
8:20 a.m. Opening Remarks - C. David Brown, PhD, CTEP, Deputy Workshop Chair, The MITRE Corporation
8:30 a.m. Opening Keynote - Mr. Alan Shaffer, Deputy Undersecretary of Defense for Acquisition and Sustainment, Department of Defense (DoD)
9:00 a.m. Guest Speaker - Mr. Bob Potter, Division Chief for Mission Command, U.S. Army Evaluation Center C4ISR Directorate, and Army Test and Evaluation Center Lead Representative to the Synthetic Training Environment Cross Functional Team
9:30 a.m. Featured Speaker - Mr. Terry Carpenter, Director and Program Executive Officer of NBIS, Defense Information Systems Agency (DISA)
10:00 a.m. Break
10:30 a.m. DoD Programs Using Agile Panel
    Moderator: C. David Brown, PhD, CTEP - The MITRE Corporation
    Panelists:
    Amy Henninger, PhD - Senior Advisor for Software and Cybersecurity to the Director, Operational Test and Evaluation (DOT&E)
    Mr. Geoffrey Hart - Agile Lead, PMO National Background Investigation Services
    Mr. Chris DeLuca - Director, Space, Cyber and Information Systems Division, Office of the Under Secretary of Defense for Research and Engineering (USD(R&E))/Director, Developmental Test & Evaluation (DT&E)
    Mr. Thomas Hartzell - Chief Development Tester (CDT) for the Air Force Maintenance, Repair, and Overhaul Initiative (MROi), U.S. Air Force
    Ms. Jenny Rekas - MITRE T&E Support to the National Background Investigation Services, The MITRE Corporation
11:45 a.m. Technical Program Overview - Mr. Mark Phillips, Technical Program Chair, Raytheon
Noon Lunch
1:15 p.m. Afternoon Keynote - Mr. Dave Duma, Principal Deputy Director, Operational Test and Evaluation (DOT&E)
1:45 p.m. Application of Agile Test and Evaluation Methods to Align with Accelerated System Development - Mr. Tim Bishop, SES, Deputy Program Executive Officer, Simulation, Training, and Instrumentation, United States Army
2:15 p.m. Featured Speaker - Mr. Robert "Rob" Tamburello, Deputy Director, National Cyber Range Complex, Test Resource Management Center (TRMC)
2:45 p.m. Break


3:30 p.m. Technical Sessions

Track: Capabilities Based T&E, Cyber, and Flight Test
1st Chair: Mr. Craig Hatcher - Edwards AFB, USAF; 2nd Chair: Mr. Kevin McGowan - 47th Cybersecurity Test Squadron, Eglin AFB, USAF
3:30 PM - NAVAIR's Implementation of Capabilities-Based Test and Evaluation - Mr. Kenneth Senechal and Ms. Lori Jameson, Naval Air Systems Command (NAVAIR)
4:00 PM - Cyber Testing in a Rapid DOD Acquisition Environment - Mr. Kevin McGowan, COLSA Corporation
4:30 PM - A Conceptual Framework for Flight Test Management and Execution Utilizing Agile Development and Project Management Concepts - Mr. Craig A. Hatcher, U.S. Air Force

Track: Reliability, Automated Testing, and Flight Test
1st Chair: Mr. Doug Stewart - 321 Gang; 2nd Chair: Mr. Robert Lutz - Johns Hopkins University Applied Physics Laboratory
3:30 PM - Ensuring Reliability for Middle Tier Acquisition Programs - Ms. Melanie Goldman, CCDC Data & Analysis Center
4:00 PM - Automated Testing in Aircraft ASD - Mr. Bryan Kelly, U.S. Air Force
4:30 PM - Flight Test Data Analysis Process Improvement - Mr. Austin Ruth, Georgia Tech Research Institute (GTRI)

Track: Agility, Ranges, and Lessons of Past Programs
1st Chair: Steve Gordon, PhD - Georgia Tech Research Institute; 2nd Chair: Mr. Karl King - TENA SBA
3:30 PM - Not So Technical, But Yet Key: Attaining T&E Agility by Implementing Right-Sized Processes - Ms. Rosa R. Heckle, The MITRE Corporation
4:00 PM - Lessons from Past Rapid Acquisition Programs - Lindsey Davis, PhD, Institute for Defense Analyses
4:30 PM - TBA

Track: AI, Cyber, and Big Data for Nuclear
1st Chair: Himanshu Upadhyay, PhD - Florida International University; 2nd Chair: Mr. Roger Boza
3:30 PM - Introduction to Artificial Intelligence - Tushar Bhardwaj, PhD, Florida International University
4:00 PM - Artificial Intelligence Platform of Cyber Threat Automation and Monitoring (CTAM) System - Himanshu Upadhyay, PhD, Florida International University
4:30 PM - Deep Learning with Big Data Analytics for Nuclear Decommissioning Applications - Mr. Roger Boza, Florida International University

5:00 p.m. Networking Reception


Thursday, September 19th - Workshop Plenary Session

8:00 a.m. Day 2 Welcome - Mr. Mark Phillips, Workshop Technical Program Chair, Raytheon T&E Fellow
8:20 a.m. Day 2 Overview - Steve Gordon, PhD, Workshop Chair, Georgia Tech Research Institute
8:30 a.m. Morning Keynote - Steve Hutchison, PhD, Director of Test and Evaluation, Department of Homeland Security
9:00 a.m. Agile Supporting T&E Panel
    Moderator: Robin Poston, PhD - Professor and Dean of the Graduate School, System Testing Excellence Program, University of Memphis
    Panelists:
    Mr. Robert Aguilera - Senior Vice President, Garud Technology
    Mr. Wayne Dumais - Deputy Director, Immigration and Border Security Programs, Office of Test & Evaluation, Science & Technology Directorate, Department of Homeland Security
    Jignya Patel, PhD - Assistant Professor of Information Systems, College of Business, Florida Institute of Technology
    Dean Ptaszynski, PhD - Manager, Enterprise Document Management and Services Development, FedEx
    Mr. Tim Sienrukos - Network Communications Branch, ANG-E65, Enterprise Services Test & Evaluation Division, ANG-E6, Federal Aviation Administration (FAA)
10:00 a.m. Break
10:30 a.m. Integrating Government Independent T&E with an Agile Process Panel
    Moderator: Mr. Hans Miller - Principal T&E SME, OSD Programs, The MITRE Corporation
    Panelists:
    Mr. Jeffrey Bobrow - Director, C4I and Space, Commander, Operational Test & Evaluation Force (COMOPTEVFOR)
    Mr. John Hartford - NAVWAR Chief Tester, Naval Information Warfare Systems Command
    Lt Col Karlos Tungol, USAF - Commander, 45th Test Squadron, USAF Test Center, AFMC, United States Air Force
    Lt Col Barbara Ziska, USAF - Commander, 605th Test and Evaluation Squadron, USAF Air Warfare Squadron, ACC, United States Air Force
11:30 a.m. Guest Speaker - Major General Matthew "Zap" Molloy, USAF (Ret.), Director of Business Development, DellEMC
Noon Lunch


1:15 p.m. Technical Sessions

Track: Agile, Autonomous, Mudbucket, and T&E Innovation
1st Chair: Mr. Jon Skarphol - Collins Aerospace; 2nd Chair: Mr. Mike Naes - Georgia Tech Research Institute
1:15 PM - CRIIS & Agile SW Development for T&E - Mr. Brian Hulet, Collins Aerospace
1:45 PM - LVC-Enabled Testing Environment for Autonomous Aircraft Systems - Mr. Jon Skarphol, Collins Aerospace
2:15 PM - Mudbucket: The LVC Appliance - Mr. Matt Henderstrom, CTSi
2:45 PM - Self Service Infrastructure Environment for Next Generation High Performance Test and Evaluation - Mr. Charles "Chuck" Reynolds, Technical Systems Integrators

Track: Web-Based Tool and LVC2
1st Chair: Mr. Robert Lutz - The Johns Hopkins University Applied Physics Lab (JHU/APL); 2nd Chair: Mr. Doug Messer - International Test and Evaluation Events Chair
1:15 PM - Cost of T&E: Implementation of TEMP Part Four Web-Based Tool - Mr. Michael Said, US Navy
1:45 PM - Using LVC in Test Design - Mr. Huat Ng, KBR
2:15 PM - The LVC Continuum for Test & Evaluation - Mr. Robert Lutz, The Johns Hopkins University Applied Physics Lab (JHU/APL)
2:45 PM - TBA

Track: TENA-LVC, Agile for FAA, and Navy M&S Reuse
1st Chair: Mr. Karl King - TENA SBA; 2nd Chair: Mr. Xi Guo - Georgia Tech Research Institute
1:15 PM - TENA LVC Model Collection - Mr. Jim Bak, GBL
1:45 PM - Use of Agile in the FAA - Mr. Timothy Sienrukos, Federal Aviation Administration (FAA)
2:15-3:15 PM - M&S Reuse in the Navy - Ms. Rachael Orzechowski, SimVentions

SPECIAL STUDENT SESSION - Careers in Test & Evaluation Panel
Moderator: Mr. Mark Phillips - Workshop Technical Chair, Test and Evaluation Fellow, Raytheon
1:30 PM Panelists:
    Dave Brown, PhD, CTEP - Workshop Deputy Chair, The MITRE Corporation
    Lindsey Davis, PhD - Supporting the Director, Operational Test and Evaluation, Institute for Defense Analyses (IDA)
    Robin Poston, PhD - Director, System Testing Excellence Program, University of Memphis
    Mr. Austin Ruth - Research Engineer I, Electronic Systems Lab, Georgia Tech Research Institute (GTRI)
    Mr. Tom Treakle - Client Solutions Director for DoD Programs, DellEMC
    Major General Matthew "Zap" Molloy, USAF (Ret.) - Director of Business Development, DellEMC


3:15 p.m. Break
3:45 p.m. Featured Speaker - Mr. Alex Hoover, Deputy Director of Cybersecurity Engineering, Department of Homeland Security (DHS)
4:15 p.m. Technical Program Review - Mr. Mark Phillips, Workshop Technical Program Chair, T&E Fellow, Raytheon
4:30 p.m. Workshop Summary - Steve Hutchison, PhD, Director of Test and Evaluation, Department of Homeland Security (DHS)
5:00 p.m. Workshop Conclusion - Dave Brown, PhD, CTEP, Deputy Chair, The MITRE Corporation, and Steve "Flash" Gordon, PhD, ITEA Central Florida Chapter President and Chair, Georgia Tech Research Institute (GTRI)


ABSTRACTS

A Conceptual Framework for Flight Test Management and Execution Utilizing Agile Development and Project Management Concepts
Mr. Craig A. Hatcher, 812 TSS/ENTI

Tracking schedules for flight test is difficult in a traditional network-based scheduling paradigm. Traditional network-based scheduling relies on being able to lay out a plan and follow it during project execution without many changes, and this paradigm begins to struggle when the plan changes often. Flight test is a very dynamic endeavor: scope changes frequently based on what is learned in test, and data is collected at different times than planned due to the realities of test execution. As a result, schedules are quickly out of date from both a time and a resource-usage standpoint. To compensate, network schedules are raised to a very high level to capture blocks of work, and insight into true progress is lost in the process. The software industry has moved away from network scheduling techniques toward Agile techniques and processes to manage projects. The reason for this paradigm shift is the volatile nature of software projects: user needs and scope change often as users refine what they truly need, so comprehensive network schedules become obsolete quickly. Agile methods were invented to minimize upfront planning and to embrace scope changes as normal and necessary during the project lifecycle. Flight test and software development projects share similar characteristics: both are volatile and require constant management of changes. This presentation will outline a conceptual framework that describes how Agile techniques, concepts, and processes can be used to monitor and execute flight test. In addition, this presentation will show how Agile techniques enable a throughput metric to be constructed that can provide the basis for understanding the capacity of an organization to do work.
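The closing idea of a throughput metric as a measure of capacity can be sketched as follows (the sprint counts and backlog size below are hypothetical illustrations, not figures from the presentation, which may compute throughput differently):

```python
import math

# Hypothetical completed flight-test points per sprint (illustrative data).
completed_per_sprint = [12, 9, 14, 11, 10]

# Throughput: average completed test points per sprint, an empirical
# measure of how much work the organization can absorb per iteration.
throughput = sum(completed_per_sprint) / len(completed_per_sprint)

# Forecast: sprints needed to clear a notional remaining backlog,
# rounded up since a partial sprint still occupies a full iteration.
remaining_points = 45
sprints_needed = math.ceil(remaining_points / throughput)

print(f"throughput = {throughput:.1f} points/sprint")
print(f"forecast: {sprints_needed} sprints for {remaining_points} points")
```

Because throughput is measured from actual execution rather than an upfront plan, the forecast self-corrects as scope and test realities change.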
Applications of Artificial Intelligence: An Overview
Tushar Bhardwaj, PhD, Florida International University

In computer science, artificial intelligence (AI) refers to the capability of a program to autonomously act, react, and adapt to its working environment. AI enables machines to behave like humans and to perform cognitive functions such as learning and problem solving. AI systems have gradually moved from traditional approaches (algorithms and expert systems) toward more efficient and advanced technologies (machine learning and deep learning). AI is the study of the algorithms and statistical models that computers use to perform specific tasks without explicit instructions.

Artificial Intelligence Platform of Cyber Threat Automation and Monitoring (CTAM) System
Himanshu Upadhyay, PhD, Florida International University

Artificial Intelligence (AI) is a branch of computer science dealing with the simulation of intelligent behavior in computers (Merriam-Webster). AI is the field of study that allows computers to execute tasks at a human intelligence level. AI is used in many fields, such as cybersecurity, language translation, speech recognition, image processing, self-driving cars, and recommendation systems, and in places where it is difficult for humans to execute the work. The techniques that allow computers to match human intelligence include machine learning and deep learning. Machine learning is the study and construction of programs that are not explicitly programmed but learn patterns as they are exposed to more data over time. Deep learning is the subset of machine learning in which multilayer neural networks learn from large amounts of data (Intel).
Florida International University’s Cyber Attack Orchestration Test Bed for Automation and Threat Monitoring in Virtual Environment, developed for Department of Defense’s Test Resource Management Center has proved its efficacy in advanced cyber threat monitoring and response using state of the art virtualization technology and malware behavioral analysis using sophisticated machine learning algorithms. Focus of this S&T research is to understand the impact of various test vector on the defined mission using the virtual test bed. CTAM system consists of four major components Virtualization, Advanced Cyber Analytics based on AI, Test Control Center, and Test Vector Repository. Focus of this presentation will be on the techniques of Artificial Intelligence applied to cybersecurity. Automated Testing in Aircraft ASD Mr. Bryan Kelly, USAF Test Pilot School (as of July 2019) The successful employment of a weapons system hinges more than ever on embedded software, particularly in the context of Air Force flight operations. As the quantity and complexity of aircraft software grows, so does the risk that defects missed in development will reach the operational fleet downstream. Recent guidance establishing agile software development (ASD) as the new standard for rapid acquisition activities challenges system test groups to release software early and often, without compromising on either quality or safety. Traditional, manual software regression testing is too slow, labor-intensive, and error-prone, and is not well-suited to support rapid acquisition initiatives. This paper examines the return on investment (ROI) of automated regression testing via empirical case study with actual F-16 Center Display Unit (CDU) software developed in an agile environment and using a commercial graphical user interface (GUI) test tool. Automated test scripts are written using the GUI tester while referencing checklist procedures that are currently accomplished manually in CDU test activities. 
The results suggest that automated regression testing has a positive ROI in terms of cost and time savings when used for aircraft ASD, but implementation involves technical risk. While not explicitly evaluated in this study, automated testing could also produce gains in software quality that outweigh other forms of return.
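The cost-and-time ROI question the paper examines can be framed with a simple first-order model (a sketch only: the formula is the standard savings-versus-investment ratio, and all figures and parameter names below are illustrative assumptions, not data from the study):

```python
# First-order ROI model for test automation (illustrative numbers only).

def automation_roi(setup_cost, manual_cost_per_run, automated_cost_per_run, runs):
    """Return ROI after `runs` regression cycles.

    ROI = (savings - investment) / investment, where savings accrue each
    cycle as the gap between manual and automated execution cost.
    """
    savings = (manual_cost_per_run - automated_cost_per_run) * runs
    return (savings - setup_cost) / setup_cost

# Hypothetical example: scripting costs 120 hours up front, each manual
# regression pass takes 40 hours, each automated pass takes 4 hours.
for runs in (1, 5, 10):
    print(runs, round(automation_roi(120, 40, 4, runs), 2))
```

The model makes the paper's point visible: ROI is negative for the first cycle (the scripting investment dominates) and turns positive as regression cycles accumulate, which is why automation pays off precisely in the early-and-often release cadence that ASD demands.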


Cost of T&E: Implementation of TEMP Part Four Web-Based Tool Michael Said, Department of the Navy T&E Office, Deputy Assistant Secretary of the Navy (DASN), RDT&E In response to the need to understand the actual cost of T&E, and to significantly improve the tracking of testing costs in Department of the Navy (DON) acquisition programs, a CAC-enabled, web-based resource tool has been developed to capture T&E resource requirements in support of program testing. The T&E Master Plan (TEMP) Part Four (TP4) tool is a significant enabler for searching programs and for coordinating and identifying T&E resource requirements. The tool integrates the T&E resource demands across all programs for the DON. This will enable DON T&E to better forecast and allocate T&E resources, and use its limited resources wisely. DON programs required to develop a TEMP per DoDI 5000.02 and SECNAVINST 5000.2E are now mandated to develop resource summary inputs using TP4. Programs are expected to have all of their TEMP Part IV data populated into TP4. Programs are also required to update TP4 at the beginning of the Fiscal Year (FY). Additionally, TP4 shall be updated six months into the FY should significant changes to the T&E resource plan occur. This presentation will provide the background and outline how the tool works, captures inputs, and displays the overall cost of T&E for acquisition programs. CRIIS & Agile SW Development for T&E Mr. Brian Hulet, Collins Aerospace Joint Secure Air-combat-training System (JSAS) is Collins’ family of test and training systems. DoD Test Ranges’ Common Range Integrated Instrumentation System (CRIIS) and the US Navy’s Tactical Combat Training System Increment II (TCTS II) are programs of record in the JSAS portfolio. JSAS provides and collects data across user-defined areas via secure connected-communications and secure platform interface(s). 
Recent SW development for CRIIS transitioned to Agile SW processes, encompassing OS migration, simplified HMI/GUI interfaces, and other feature sets. This briefing will cover lessons learned and highlight the HMI/GUI enhancements, advanced system capabilities, and roadmap details for the Common Range Integrated Instrumentation System (CRIIS) as envisioned by Collins Aerospace to meet the continued needs for Live, Virtual, and Constructive (LVC) enabled systems for test and training, including standards-based open architectures, certified RMF-hardened MultiLevel Security, and distributed mission operations. Cyber Testing in a Rapid DoD Acquisition Environment Mr. Kevin McGowan, COLSA Corporation The need to deliver functional capability to the warfighter at the speed of need is driving an increasing number of acquisition programs toward the use of Agile and rapid prototyping acquisition strategies. Hand in hand with the need for rapid capability delivery, however, is the need for these systems to be cyber survivable in an increasingly complex cyber-contested battlespace. This includes ensuring that the system is both cyber secure and cyber resilient: the system must have appropriate protections in place and be capable of successfully and effectively executing its mission functions during and following a cyber-attack. This presentation addresses some challenges that Agile and rapid programs present to the DoD cyber test community, as well as some strategies that may be employed to help ensure the delivery of cyber-survivable systems to the warfighter. These include, but are not limited to, shaping the development and test environments, integrating test with system security engineering, adopting SecDevOps and Test-Based Development strategies, incremental cybersecurity and cyber resiliency testing, the use of both automated and hands-on cyber testing, and vulnerability identification and remediation. Data Analysis Mr. 
Austin Ruth, Georgia Tech Research Institute Data analysis is currently a manual process, often taking long periods of time to complete a single measurement. The inefficient processing of data leads to a lack of valuable information to present to the test engineers. The presented solution implements an Artificial Intelligence agent with the ability to complete tasks in parallel while incorporating advanced techniques such as machine learning, natural language processing, computer vision, and more to process data at a higher fidelity than is possible for a human data scientist. This relieves the bottleneck of human processing time, replacing it with the processing speed of the available processing unit. These techniques are incorporated using a data pipeline, allowing the agent to request the appropriate processing techniques for the data. Because of the pipeline, the agent itself is completely modular, allowing for analysis of anything from aircraft carrier data to F-16 flight test data. We currently use machine learning models for flight test data analysis to more efficiently categorize radar parameters. Finally, due to the improved data analysis efficiency, reports are generated more quickly, and in-situ analysis at test events can be performed.
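The pipeline-driven, modular agent described above can be sketched as a registry of processing techniques keyed by data type (a minimal illustration: the handler names, data types, and thresholds are hypothetical, not GTRI's actual pipeline):

```python
# Minimal sketch of a modular data-analysis pipeline: the "agent" looks up
# the processing techniques registered for a data type, so new domains
# (carrier data, F-16 flight test data, ...) only require new handlers.

PIPELINE = {}  # data type -> ordered list of processing functions

def register(data_type):
    def wrap(fn):
        PIPELINE.setdefault(data_type, []).append(fn)
        return fn
    return wrap

@register("radar")
def normalize(record):
    record["power_db"] = round(record["power_db"], 1)
    return record

@register("radar")
def categorize(record):
    # Stand-in for the ML model that categorizes radar parameters.
    record["category"] = "high" if record["power_db"] > 10 else "low"
    return record

def run_agent(data_type, record):
    for step in PIPELINE[data_type]:
        record = step(record)
    return record

print(run_agent("radar", {"power_db": 12.34}))
```

Because the agent only consults the registry, swapping in a new domain or a heavier technique (an NLP or computer-vision step, say) is a matter of registering another handler, which is the modularity the abstract describes.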


Deep Learning with Big Data Analytics for Nuclear Decommissioning Application Mr. Roger Boza, Applied Research Center, Florida International University The nuclear industry is experiencing a steady increase in maintenance costs even though plants are maintained under high levels of safety, capability, and reliability. Nuclear power plants are always expected to run every unit at maximum capacity, efficiently utilizing assets with minimal downtime. Surveillance and maintenance of nuclear-decommissioning infrastructure present many challenges with respect to the maintenance or decommissioning of the buildings. As these facilities await decommissioning, there is a need to understand their structural health. Many of these facilities were built over 50 years ago, and in some cases they have gone beyond their operational life expectancy. In other cases, the facilities have been placed in a state of “cold and dark” and are sitting unused, awaiting decommissioning. In any of these scenarios, the structural integrity of these facilities may be compromised, so it is imperative that adequate inspections and data collection/analysis be performed on a continuous and ongoing basis. A pilot-scale infrastructure was developed to implement structural health monitoring using scanning technologies, machine learning/deep learning, and big data technologies. The focus of the structural health monitoring was the walls of the mock-up infrastructure. A plan was developed to collect various formats of data, both structured and unstructured, from the sensors deployed in the mock-up infrastructure. The main data sources considered were video and images from various imaging devices. During the data collection process, a total of 28,000 RGB images were taken with a regular camera and stored in the Big Data Platform using the Hadoop Distributed File System (HDFS). The images contain variations in light exposure, angles, and aspect ratios. 
The entire dataset was evenly separated into two categories, “Baseline” and “Degraded”. A duplicate dataset was formed by scaling down all images, using antialiasing, to a resolution manageable for the neural network model. This data distribution formed the basis for the machine learning approach. A deep Convolutional Neural Network (CNN) was implemented in Python using the Keras library and the TensorFlow architecture. The goal of the CNN was to classify each image into one of the two categories. The CNN was constructed as a Sequential model in Keras, which is a linear stack of neuron layers. A total of 10 layers were stacked, combining convolutions, max pooling, and a dense layer. The model was verified with a 70/30 cross-validation technique and achieved 97.1% accuracy during the training phase. The high accuracy of the CNN model demonstrates that deep learning, as a component of structural health monitoring, can provide valuable information about the condition of a nuclear facility. Ensuring Reliability for Middle Tier Acquisition Programs Mr. Martin Wayne and Ms. Melanie Goldman, CCDC Data and Analysis Center Section 804 of the 2016 National Defense Authorization Act establishes guidance for Middle Tier Acquisition (MTA) programs. The process incorporates reduced timelines as well as rapid prototyping of new technologies and rapid fielding of proven technologies. The shortened timelines for MTA can present difficulties for traditional reliability growth testing. However, the Army currently has the capability to supplement traditional testing with more innovative and effective methods to accelerate fielding while ensuring that necessary reliability is achieved. An alternative approach to traditional reliability testing involves identifying potential failure modes and mechanisms in subsystems/components of concern and then characterizing the highest-priority failure modes using the most efficient means available. 
A number of reliability engineering tools and techniques can be utilized such as modeling and simulation (M&S), component testing, or lab/simulator testing on full systems. This presentation discusses successful applications to weapon systems. The use of M&S, labs, and simulators can provide valuable insights while significantly reducing the amount of system-level testing that is required, ultimately yielding large savings in cost and schedule. There is a great opportunity to apply this approach not only to MTA programs within the Army but throughout the entire DoD as well. Higher-Fidelity Training for the Flight Crew Mr. Matt Hederstrom, CTSi A valuable way to improve mission readiness is to provide higher-fidelity training for the flight crew. The U.S. military expends large amounts of funding per year flying training missions. An Embedded Training (ET) capability would improve the fidelity and effectiveness of in-flight training hours. However, ET requires the ability to manipulate data on the aircraft. This is not straightforward for at least these reasons: 1) The Operational Flight Program (OFP) may not be able to manipulate every system from inside the OFP, which may preclude a centralized software-only solution; and, 2) Retrofitting existing systems with distributed software for ET may be cost prohibitive or not even possible. The patented Mudbucket® provides a cost-effective means of manipulating data on the aircraft. The Mudbucket was originally developed to inject flight simulation and other scenario data into an aircraft on the ground to simulate in-flight conditions for testing. The system manipulates data on existing aircraft interfaces such as MIL-STD-1553, ARINC-429, and IEEE-1394 so that multiple components on the aircraft remain coordinated when scenario data is injected. Mudbucket allows hardware to be kept in the loop to increase test and training fidelity. This capability can be used to inject simulated threats into an aircraft in flight. 
Mudbucket complements the aircraft mission systems software by providing the on-aircraft hardware-in-the-loop capability, software, and interfaces needed to minimize the cost of ET. Mudbucket provides a greater ability to merge LVC entities both within training simulators and onboard aircraft in flight. This “appliance”, based on Mudbucket, can easily and quickly be installed on the airframe by maintenance personnel, even if in the field or onboard ship. The Mudbucket box can interface to the aircraft’s MIL STD-1553 data bus via a simple T-cable, and the device will be roll-on/roll-off, installed in minutes. Mission planning can be coordinated prior to the training flight, and easily uploaded to the Mudbucket appliance. Pre-defined scenarios can be resident in the device’s library, but trainers will also be afforded the flexibility to build custom scenarios on an ad-hoc basis. During flight operations, the pilot will engage the device through existing displays and menus, providing an intuitive, simple interface with the system. The training interface will not require changes to the aircraft OFP. Alternatively, an electronic kneeboard-style device can serve as the interface, obviating any need to change current menu structure on the platform. The Mudbucket appliance will also provide flight data recording and playback for training de-brief purposes. Reconstruction of the training mission will be intuitive, informative, comprehensive, and afford invaluable feedback to the flight crew to dramatically improve readiness.


Lessons from Past Rapid Acquisition Programs Lindsey A. Davis, PhD, FFRDC Support to DOT&E, Institute for Defense Analyses The United States has been actively engaged in combat for the past two decades. In response to warfighters’ needs, the Department of Defense has rapidly fielded many systems during this time, including a wide range of countermeasures, weapons, and vehicle and aircraft armor upgrades. A variety of programs across all service branches were reviewed to glean lessons from past rapid acquisition programs; many of these lessons can also be applied to future programs as recommended best practices. Six lessons emerged, representing themes common across multiple programs. The first lesson shows that the Department of Defense has the ability to field systems rapidly when needed. Lessons two through five pertain to specific techniques for success in rapid acquisition programs. The last lesson highlights that rapid acquisition strategies are not successful for every program. Case studies are used to illustrate these lessons. The LVC Continuum for Test and Evaluation Mr. Robert Lutz, Johns Hopkins Applied Physics Laboratory Research programs in autonomous unmanned systems have made great progress in recent years, and the supporting technologies are quickly reaching a level of maturity that will allow formal DoD Programs of Record to be established in the near future. However, as programs begin to initiate developmental test activities, it is clear that the infrastructure used on live ranges will need to be updated to address the special challenges associated with autonomous systems. For instance, the inherent unpredictability of autonomous systems may introduce hazardous conditions when operating in close proximity to other manned assets. This suggests a stronger reliance on simulation methods and tools to identify such hazards and develop/verify effective remediation techniques. 
However, the tools used to study autonomous system behaviors and performance in a fast-time constructive simulation environment are frequently different from the simulation tools used to support live test events. This leads to modeling inconsistencies, incompatible results, credibility issues, longer simulation development timelines, and higher costs. This paper focuses on a continuum of activities to develop an integrated Live-Virtual-Constructive (LVC) simulation environment that supports the entire end-to-end T&E lifecycle for autonomous system test. In particular, the paper explores the progression of T&E simulation capabilities from standalone constructive simulation tools to hardware-in-the-loop stimulation capabilities to integration with live range assets. Throughout this developmental continuum, the evolving LVC environment leverages modern agile techniques and Service-Oriented Architecture (SOA) to facilitate compositional development from reusable components throughout each stage of its controlled expansion. Finally, the paper discusses exemplars of this approach to highlight the efficiencies and other benefits to sponsored programs. LVC-Enabled Testing Environment for Autonomous Aircraft Systems Angus (Thom) McLean, PhD, and Mr. Jon Skarphol, Collins Aerospace Autonomous aircraft systems have testing requirements that are distinct from traditional aircraft test and evaluation. In traditional aircraft systems, airworthiness is predicated on the ability of an aircrew (test pilot) to safely control the vehicle through the test envelope. Autonomous aircraft are designed to obviate the need for human-in-the-loop controls, so traditional airworthiness paradigms are not directly applicable. Collins has developed an LVC-enabled testing environment that bridges the gap between traditional test methods and the highly innovative development environments used to produce autonomous vehicle technology. 
Incorporating model-based systems engineering and architecture, along with design-time and run-time assurance structures, enables a holistic, iterative development and test environment blending the best of LVC and autonomy technologies. In test and evaluation of autonomous aircraft, an iterative development and testing cycle must be fully embraced. Even operational testing should not be viewed as an isolated testing event, but rather as the culmination of an iterative development and integration cycle. A testbed for autonomy serves as an integration, experimentation, and test & evaluation context. The flexibility of a test environment can be assessed by determining the range of activities that it can host. The environment we propose is a single integrated set of reconfigurable LVC-enabled resources that hosts the entire spectrum of testing, from development and integration tests through human-factors evaluations and final operational flight tests. This provides a single integration context, reducing and simplifying the integration activities required to support testing. An effective test environment provides smooth transitions for an article or capability through two critical phases: 1) moving from the development environment to the testing environment, and 2) moving from the testing environment to the final operational or demonstration platform. In R&D programs, engineering artifacts such as software or hardware components are often not simultaneously ready for integration or testing. To address this, the proposed environment includes networked computational and simulation devices that support pair-wise or ad hoc integration testing. This allows components to begin testing as they become available. This support of compositional testing insulates the testing process from development schedule risk. 
A similar problem frequently arises in highly ambitious autonomy research programs, where insights that can have significant impact on the final outcome often occur after components have moved into the testing phase. Our LVC-enabled test environment reduces the overhead of moving between testing phases by maintaining identical software installations in the test aircraft and simulators. This allows a smooth progression of increasingly complex tests to more efficiently accommodate late-stage innovation. The LVC-enabled test environment we present provides four essential properties: 1) a single integration context, 2) a smooth transition from development to final test, 3) support for compositional test strategies to mitigate the impact of delays in development, and 4) the opportunity to integrate late-stage innovation so that the most advanced technologies make it to the final tests. These features will be critical for successful testing of advanced autonomy paradigms.
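The pair-wise, compositional integration idea above can be sketched as components that share a single interface, so a constructive stand-in substitutes for an asset that is not yet ready (an illustrative pattern only: the class names, signal fields, and thresholds are hypothetical, not Collins' implementation):

```python
# Sketch of compositional LVC integration: every component, live or
# simulated, exposes the same step() interface, so tests can be composed
# pair-wise from whatever mix of assets is currently available.

class Component:
    name = "abstract"
    def step(self, state):
        raise NotImplementedError

class ConstructiveSensor(Component):
    """Fast-time stand-in for a sensor still in development."""
    name = "constructive-sensor"
    def step(self, state):
        state["track"] = {"range_km": state["truth_km"]}  # perfect detection
        return state

class AutonomyStub(Component):
    """Simplified autonomy logic used until the real module is ready."""
    name = "autonomy-stub"
    def step(self, state):
        state["command"] = "avoid" if state["track"]["range_km"] < 5 else "hold"
        return state

def run_pairwise_test(components, state):
    # Compose whichever components are available into one test run.
    for c in components:
        state = c.step(state)
    return state

result = run_pairwise_test([ConstructiveSensor(), AutonomyStub()], {"truth_km": 3})
print(result["command"])
```

When the real sensor or autonomy module becomes available, it replaces its stand-in behind the same interface, which is what lets testing begin before all components are simultaneously ready.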


M&S Reuse in the Navy Ms. Tammie McClellan, University of Central Florida, and Ms. Rachael Orzechowski, SimVentions Inc. Modeling and Simulation (M&S) is critical to the success of the Test and Evaluation (T&E) process. It is utilized both as a predictive tool and as part of an iterative design process. Everyone can agree that M&S capabilities should be planned and resourced early, and that characterizing the capabilities and limitations of the M&S in relation to its specific intended use is essential. The problem many practitioners face is finding M&S to support their requirements, compounded by a lack of incentives to encourage information sharing. The Defense M&S Catalog was established by the Defense Modeling & Simulation Coordination Office to support the visibility component of the DoD data strategy and to provide an avenue for M&S organizations to make resources available for reuse. The Catalog is a collection point for enterprise discovery and actively seeks contributions of resources from M&S organizations. The EMBR tool complements the Catalog and was developed to offer organizations local control and management of their M&S assets, which can then be published to the Defense M&S Catalog. The EMBR tool has been customized by other organizations to meet their specific data requirements, while maintaining the ability to exchange information with the DoD enterprise. This presentation will provide an overview of the capabilities of the Defense M&S Catalog and a discussion of how the EMBR tool facilitates M&S information exchange between the DoD enterprise, the services, and individual Program Executive Offices (PEOs). Mudbucket: The LVC Appliance Mr. Matt Hederstrom – CTSi A valuable way to improve mission readiness is to provide higher-fidelity training for the flight crew. The U.S. military expends large amounts of funding per year flying training missions. 
An Embedded Training (ET) capability would improve the fidelity and effectiveness of in-flight training hours. However, ET requires the ability to manipulate data on the aircraft. This is not straightforward for at least these reasons: (1) the Operational Flight Program (OFP) may not be able to manipulate every system from inside the OFP, which may preclude a centralized software-only solution; and (2) retrofitting existing systems with distributed software for ET may be cost prohibitive or not even possible. The patented Mudbucket® provides a cost-effective means of manipulating data on the aircraft. The Mudbucket was originally developed to inject flight simulation and other scenario data into an aircraft on the ground to simulate in-flight conditions for testing. The system manipulates data on existing aircraft interfaces such as MIL-STD-1553, ARINC-429, and IEEE-1394 so that multiple components on the aircraft remain coordinated when scenario data is injected. Mudbucket allows hardware to be kept in the loop to increase test and training fidelity. This capability can be used to inject simulated threats into an aircraft in flight. Mudbucket complements the aircraft mission systems software by providing the on-aircraft hardware-in-the-loop capability, software, and interfaces needed to minimize the cost of ET. Mudbucket provides a greater ability to merge LVC entities both within training simulators and onboard aircraft in flight. This “appliance”, based on Mudbucket, can easily and quickly be installed on the airframe by maintenance personnel, even if in the field or onboard ship. The Mudbucket box can interface to the aircraft’s MIL STD-1553 data bus via a simple T-cable, and the device will be roll-on/roll-off, installed in minutes. Mission planning can be coordinated prior to the training flight, and easily uploaded to the Mudbucket appliance. 
Pre-defined scenarios can be resident in the device’s library, but trainers will also be afforded the flexibility to build custom scenarios on an ad-hoc basis. During flight operations, the pilot will engage the device through existing displays and menus, providing an intuitive, simple interface with the system. The training interface will not require changes to the aircraft OFP. Alternatively, an electronic kneeboard-style device can serve as the interface, obviating any need to change the current menu structure on the platform. The Mudbucket appliance will also provide flight data recording and playback for training de-brief purposes. Reconstruction of the training mission will be intuitive, informative, comprehensive, and afford invaluable feedback to the flight crew to dramatically improve readiness. NAVAIR’s Implementation of Capabilities Based Test and Evaluation (CBTE) Mr. Kenneth Senechal and Ms. Lori Jameson – NAVAIR Capabilities Based Test and Evaluation (CBTE) evolves Developmental Test (DT) to increase the focus on mission-level requirements throughout the developmental cycle. CBTE integrates warfare analysis, future system concepts of employment, and current concepts of operations to develop test strategies to “Test like we fight!” By focusing testing on how the capabilities will be used and the mission context surrounding their use, a higher-quality system or system-of-systems can be delivered to the warfighter with significantly reduced likelihood of system deficiency “discovery” by the end user. CBTE relies heavily on Live, Virtual, Constructive (LVC) tools to execute testing much earlier in the acquisition life cycle, while providing a more robust and mission-representative test environment throughout the test program. Using LVC allows for the testing of capabilities that simply cannot be tested solely in a live environment due to capability complexity or security concerns. 
Implementing CBTE requires an early review of potential LVC tools to be included in the overall test strategy. At the core of NAVAIR’s implementation of CBTE is the teaming with the Commander Operational Test and Evaluation Force (COTF). NAVAIR is integrating COTF’s Mission Based Test Design (MBTD) process while designing, planning and executing test with COTF in a true Integrated Test Team (ITT). Ultimately, the collaborative NAVAIR and COTF team will be working from one common test matrix and method of test to gather data jointly and report independently. The complete implementation of CBTE will increase the speed at which new capability gets to the warfighter by shifting testing left to identify and fix deficiencies early and by reducing the amount of dedicated Operational Test (OT) required. Additionally, CBTE will verify the execution of kill chains across multiple platforms/systems to ensure there are no interoperability or other issues. Finally, CBTE will close the loop with warfare analysis by bringing the “as delivered” capability back to be compared with the original analysis and used for future studies.


Not So Technical, But Yet Key – Attaining T&E Agility by Implementing Right-Sized Processes Ms. Rosa R. Heckle, MITRE Independent test and evaluation (T&E) that is focused on both technical and operational implementation is critical to objectively ensuring a technology’s performance within its intended use cases. Purposefully defined agile T&E processes that are flexible, collaborative, and iterative ensure that scientific rigor is maintained and reproducibility of the evaluation is supported, while also facilitating the rapid transition of the technology to operational use. Reproducibility is key in efforts to improve the speed and efficiency of T&E; it is the ability of an entire T&E process for a particular product to be duplicated. Value from the T&E process is derived not only from the final T&E results, but also from the information gleaned through each step of the process, particularly from the evaluator’s objective ideas, thinking process, and conclusions, which include explicit and implicit as well as valuable tacit knowledge. Capturing and storing the needed information, including subject matter expert tacit knowledge, allows other testers and developers to maintain momentum and leverage prior work and insights. To accomplish this, it is important to devise processes and use tools that facilitate the capture and retrieval of data, information, and ultimately knowledge in testing and evaluating similar systems. While always at the bottom of the ‘to do’ list, documentation and processes are necessary and key to ensuring that capabilities and systems transcend any individual. Communicating and understanding the challenges, as well as being able to leverage lessons learned, will give the T&E team the ability to maintain forward momentum (not having to start all over or make the same mistakes). 
The processes and documentation required should be sufficient to capture the information needed without burdening the evaluators with paperwork or rigidity. This presentation will present a case study of a data science research T&E team’s establishment of processes and of an agile “solution intent” [SAFe] to facilitate testing and evaluation. We will illustrate that, at the start, documentation was largely non-existent; where it did exist, it did not provide the information needed to support transitions, continuity, or reproducibility. What did exist was stored in a developer’s or researcher’s files and was not immediately usable by the team. There was a clear need to create our own processes and documented solution intent. As SAFe suggests, we first identified what should be included in the solution intent, and then found easy ways to capture and share it. We used various tools and methods to facilitate the process as well as communication. This deliberate engineering rigor added to testing resulted in repeatable execution and increased predictability by yielding more precise and accurate testing results. The process is flexible, collaborative, and iterative, while allowing for repeatability. We will present the processes implemented and how they were tailored specifically for the environment. We will describe what worked, what didn’t work, the issues that emerged, and lessons learned. Self-Service Infrastructure Environment for Next Generation High Performance Test and Evaluation Mr. Chuck Reynolds - Technical Systems Integrators Test and Evaluation High Performance Compute Centers are expanding throughout major industries as the need for computational power across larger data sets and workflows grows exponentially. Lowering barriers to entry and improving center performance are key to relieving the bottleneck that throttles the adoption of workflows in these Compute Centers. 
With the advent of breakthroughs in interconnect backplane technology based on the industry-standard PCIe (Peripheral Component Interconnect Express) interface, which addresses latency and bandwidth limits, together with Lab-as-a-Service automation frameworks, Test and Evaluation Compute Centers can solve both of these issues. Test Centers offering self-service portals to schedule, reserve, and configure new, faster, and greener network infrastructure will allow more users to adopt and consume resources and deliver solutions to high-compute and big-data test and evaluation problems at a new pace. TENA LVC Object Model Collection Mr. Jim Bak, GBL Systems Corporation The TENA Software Development Activity (SDA) has developed a common architecture to support effective integration and reuse of testing, training, and simulation capabilities that require real-time collaboration between distributed computer systems operating within diverse testing and training environments. Through the establishment of the Test and Training Enabling Architecture (TENA), the interoperability and reuse of range assets are tremendously improved, thereby reducing the development, operation, and maintenance costs of range systems. TENA Object Models (OMs) are used to define the interfaces for exchanging information and services between distributed, and often independently developed, applications. The TENA User Community has been using a subset of the current TENA Standard OMs in building TENA-enabled applications/simulations/interfaces in support of LVC-based distributed testing. This subset of TENA Standard OMs has been in existence for 15+ years. Today, through lessons learned from distributed LVC testing, the TENA SDA is conducting a refinement of the current TENA Standard OMs to improve integration and data interoperability for LVC simulations and applications. Use of Agile in the FAA Mr. 
Timothy Sienrukos, Federal Aviation Administration (FAA) The presentation will describe how we in the FAA are leveraging different technologies (Ansible, Docker, Kubernetes, etc) and techniques (DevOps, MVP, etc) to meet the faster development/test tempo that Agile projects demand. The presentation will offer a reflection on what worked, what didn’t, and various lessons learned throughout the process.


Using LVC in Test Design
Mr. Huat Ng and Mr. Wayne Devereux, KBR

As part of system development, the use of simulations is critical to understanding complex system performance. The Department of Defense (DoD) recognizes Live, Virtual, and Constructive (LVC) simulations as a means to support cost-effective decisions on technology evaluation. Methods are discussed for using LVC to augment and improve operational test design for DoD applications. The nature of the LVC environment, in which the interdependencies of other weapon systems, threats, networks, command and control (C2), and global environments are realized, allows for the experimentation and creation of more advanced test designs. Advancing live test methods are based on well-established standards and provide new data sets to support the validation of simulation methods across a broad range of technical domains. These domains range from complex network and datalink inter-relationships between systems, such as the new IFF Mode 5 systems and Link 16 networks, to the effects of extreme environmental conditions, such as icing, on aircraft performance. Current information technology (IT) trends, e.g., virtualization, cloud computing, micro-services, and containerization, help to manage the orchestration of complex test scenarios, resulting in improved definition of test designs, and help to discriminate the most impactful scenarios for operational testing. Together, these technology attributes and trends applied through LVC simulations can help show how these interdependencies may affect critical mission performance and potentially impact the desired outcomes within complex battlespaces.


SPEAKER BIOGRAPHIES

Mr. Robert Aguilera is a Senior Vice President for Garud Technology Services, Inc. (GTS). GTS is a Maryland-based, Economically Disadvantaged Woman-Owned Small Business (EDWOSB) and a highly specialized provider of technical and professional services. Rob is a senior acquisition professional with more than 30 years of support to DoD and DHS Research, Development, Test and Evaluation (RDT&E) initiatives, including basic and applied research, technology demonstrations/experimentation, and prototype efforts. He is a retired Naval Officer of 25 years who completed two tours at the US Navy's Operational Test Authority. In his last 11 years in private industry, Rob has supported T&E efforts in both DoD and DHS. In DHS, Rob oversees a portfolio of activities that includes Operational Test Agent (OTA) support to several DHS oversight programs in CBP and FEMA. Most recently, GTS completed FOT&E of FEMA's Logistics Supply Chain Management (LSCMS) and CBP's TECS Modernization programs. Both were Level 2 (ACAT 2 equivalent) oversight programs, and both test events involved cybersecurity/resilience activities. GTS is also the designated OTA for two agile programs in FEMA's IT acquisition portfolio: Grants Outcome (FEMA GO) and the National Flood Insurance Program (NFIP) Pivot.

Mr. Jim Bak graduated from UCLA in 1992 with a Bachelor of Science in Physics. His software development career began in the early 1990s and has covered an extensive array of software development and systems integration work, including real-time embedded solutions for airborne avionics and armament systems; GPS orbit modeling and clock corrections for the FAA's Wide Area Augmentation System (WAAS); and net-centric application development for distributed Live-Virtual-Constructive T&E events. Mr. Bak is currently the Vice President – Engineering for GBL Systems Corporation, where he has worked since 2001. He also supports the Test Resource Management Center (TRMC) in a number of software and T&E capacities, including Object Model development for the Test & Training Enabling Architecture (TENA) Software Development Activity (SDA).

William D. (Dave) Bell, PhD, CTEP, has more than forty-nine years of professional engineering experience with extensive knowledge in the areas of basic research, systems development, systems engineering, systems integration, testing and acceptance, resource acquisition, and program administration. He retired from Federal Government service in 2005 and since 2007 has worked for The MITRE Corporation as a Principal Multidiscipline Systems Engineer. Recently he served as the Technical Director for Dr. C. David Brown, who was both the DASD(DT&E) and Director, TRMC. Prior to this assignment, he managed several of MITRE's Test and Evaluation (T&E) projects supporting the Office of the Secretary of Defense (OSD). Since January 2004, he has taught a graduate course in Systems Integration and Test for Southern Methodist University (SMU). He has guest lectured for Johns Hopkins University (JHU) systems engineering classes multiple times and co-taught T&E for MITRE's Master's students at JHU. Dave received his BS in Physics from The Ohio State University, his MBA from National University, and his Doctor of Engineering from Southern Methodist University.

Mr. Timothy F. Bishop, SES, currently serves as the Deputy Program Executive Officer for the Program Executive Office, Simulation, Training and Instrumentation. Mr. Bishop manages a portfolio of fielded systems with a current inventory value of over $2.65B in 125 locations; over 335,000 training devices in 19 foreign countries; and 480 sites worldwide. He evaluates proposed program plans of subordinate managers in light of current technology; Army research, development, and engineering needs; and program thrusts.

C. David Brown, PhD, is a consulting engineer supporting DOD T&E through the MITRE Corporation and the Institute for Defense Analyses. He also teaches graduate courses in program management and systems engineering for Johns Hopkins University. He is the former Deputy Assistant Secretary of Defense for Developmental Test and Evaluation and Director of the DOD Test Resource Management Center. Throughout his 44-year career in T&E, he has held a variety of positions, including range instrumentation development, range test director, technology and range director, and test lead for a major Army acquisition program. Dr. Brown has a PhD in electrical engineering from the University of Delaware and an MS in National Security Policy from the Industrial College of the Armed Forces. He is a registered Professional Engineer, was a member of the Army Acquisition Corps, and is a retired Army Reserve Colonel. He holds three patents and is a Certified Test and Evaluation Professional and a certified Scaled Agilist. He has been a member of ITEA almost since its founding and has served in numerous leadership and support positions. He was the recipient of the prestigious ITEA Matthews Award in 2016.


Mr. Terry L. Carpenter, Jr., is the Program Executive Officer (PEO) for the National Background Investigation Service (NBIS). As the PEO for NBIS, he is responsible for leading the effort to design, build, test, field, operate, maintain, and secure the federal-wide information technology (IT) service used to conduct suitability, security, and credentialing investigations for all federal civilians, military members, and government contractors. He was appointed to the Senior Executive Service in 2017. Until his appointment as the PEO for NBIS, he served as the Director of DISA's Services Development Directorate (SD), where he was responsible for the acquisition of information services for the DoD. He provided acquisition oversight of multiple portfolios of joint programs that deliver integrated enterprise services and data systems for DISA business operations, DoD collaboration, global cloud computing services, and warfighting command and control services. Prior to his role as the Director of SD, he served as the Chief of the Requirements and Analysis Office, where he was responsible for the analysis of new requirements, maintaining the requirements baseline, and developing business cases for investments in new capabilities aligned with DISA and department strategic guidance. Prior assignments at DISA include serving as the Technical Director (TD) for the Component Acquisition Executive. There, he led development of acquisition strategy and technical implementation planning, which included four major portfolios covering enterprise services, mission assurance and information assurance, and communications. He led the development of a Unified Capabilities strategy and tactical plan that drove greater innovation and efficiencies while decreasing new service delivery times.
Before he came to DISA, he served as the TD and Assistant Program Manager for Production for the Navy Enterprise Resource Planning (ERP) Program, where he earned the Navy Meritorious Civilian Service Award. As the TD, he was responsible for all technical matters related to Navy ERP, an integrated business management system that updates and standardizes Navy business operations and provides financial transparency and total asset visibility across the enterprise. Prior to joining the government in 2009, he was the President and CEO of NextGenData, Inc., which designed, developed, and operated internet telecommunications and data systems for customers globally.

Mr. Peter H. Christensen, CTEP, is employed by the MITRE Corporation. Pete is currently assigned as the Cyber Lead for MITRE's work in support of the Deputy Assistant Secretary of Defense (DASD) for Information and Integration, Portfolio Management, Cyber Directorate. In other MITRE roles, he supported the Office of the Secretary of Defense (OSD) Program Division as Test and Evaluation Portfolio Manager, Naval Sea Systems Department Head, and Lexington Park Site Lead. From 2014 through 2017, Pete served on an Intergovernmental Personnel Assignment (IPA) with the Test Resource Management Center (TRMC) as the Director, National Cyber Range (NCR). In that role, Pete was responsible for customer outreach and for the planning and execution of cybersecurity Test & Evaluation (T&E) and training events conducted at the NCR, including program management of the NCR contract and the NCR Complex expansion. Under Pete's leadership, NCR utilization dramatically increased, and the Government/FFRDC/contractor team successfully executed over 200 events. As MITRE's Test and Evaluation Portfolio Manager, Pete was responsible for coordinating test and evaluation activities for several Acquisition, Technology and Logistics and OSD sponsors. Pete supported the DASD for Developmental Test and Evaluation, the TRMC, and the OSD Director of Operational Test and Evaluation (OT&E). From 2001 through 2006, Pete served as an IPA in scientific advisory roles with the Marine Corps Operational Test and Evaluation Activity, where he led two Operational Test Agency initiatives to address OT&E of information assurance and interoperability testing. Pete concurrently provided oversight and direction for several programs undergoing OT&E, including the M777 Lightweight 155 Howitzer and the Expeditionary Fighting Vehicle. Pete is an Adjunct Professor at Capitol Technology University in Laurel, MD. Since 2006, he has taught courses in the Cybersecurity Master's Program on network systems security concepts and malicious software. Pete is a retired U.S. Navy Commander. Pete is an active member of the International Test and Evaluation Association (ITEA), is a Certified Test and Evaluation Professional (CTEP), has served on the ITEA Board of Directors, and has authored many articles in The ITEA Journal of Test and Evaluation.

Lindsey A. Davis, PhD, obtained a Bachelor of Science in Chemical Engineering from Virginia Tech in 2012 and a Doctor of Philosophy in Biomedical Engineering from the University of South Carolina in 2016. She has worked at the Institute for Defense Analyses since 2017, supporting the Director, Operational Test and Evaluation (DOT&E) live fire group.

Mr. Ralph C. (Chris) DeLuca is currently serving as the Director, Space, Cyber and Information Systems in the Office of the Deputy Director for Developmental Test and Evaluation, which he joined in June 2015. Prior to this, he was the Program Support Team Lead for Land Warfare and Tactical Communications programs within the Office of the Deputy Assistant Secretary of Defense for Systems Engineering's Major Program Support Division, beginning in January 2011. Preceding that assignment, he served as the Deputy Program Manager for the Defense Agencies Initiative program at the Department of Defense Business Transformation Agency, joining the Department of Defense there as an NH-IV Government Civilian in August 2009. Prior to civilian service, Mr. DeLuca gained broad defense and acquisition knowledge and expertise spanning the acquisition lifecycle during a 25-year career in the U.S. Army Armor and Acquisition Corps. He retired as a Colonel after serving as the Project Manager for Future Combat Systems Spin Outs. Earlier in his military career, he served as the Product Manager for the Army Airborne Command and Control System and the Mounted Battle Command on the Move programs. As a Field Grade Officer, he also served as the Army G-3 Staff Officer for equipment, as an evaluator at the Army Test and Evaluation Command Close Combat Directorate, as an Integration Officer with the Army Research Laboratory Sensors and Electron Devices Directorate, and as an Assistant Project Manager for Tank Training Devices for the Abrams Program. He commanded two Cavalry Troops at the Company level and served as an Airborne Armor Platoon Leader, among other Company-grade line assignments. As a junior officer, he also served as a Comptroller at the Army Operational Test and Evaluation Command.


Mr. David W. Duma is the Principal Deputy Director, Operational Test and Evaluation. He assumed this position in January 2002. Prior to returning to government service, he worked in private industry managing a variety of projects involving test and evaluation; requirements generation; command, control, communications, intelligence, surveillance, and reconnaissance; modeling and simulation; and software development. He served as Acting Director, Operational Test and Evaluation from February 2005 to July 2006, June 2009 to September 2009, and January 2017 to December 2017. Mr. Duma completed 30 years of Naval service, which included serving as the Acting Deputy Director for Conventional Systems in the office of the Director, Operational Test and Evaluation and as Director, Test and Evaluation Warfare Systems for the Chief of Naval Operations. His Naval career also included service as the Deputy Commander, Submarine Squadron 10 and Commanding Officer of the nuclear-powered submarine USS Scamp (SSN 588). Mr. Duma holds Master of Science degrees in National Security and Strategic Studies and in Management, and a Bachelor of Science degree in Nuclear Engineering. He received the U.S. Presidential Executive Rank Award on two occasions: the Meritorious Executive Award in 2008 and the Distinguished Executive Rank Award in 2015. He has also received two lifetime achievement awards for his work in defense test and evaluation: the first in 2017 from the International Test and Evaluation Association, and the second in 2018 from the National Defense Industrial Association.

Mr. Wayne Dumais is the Deputy Director, Immigration and Border Security Programs, Office of Test & Evaluation, Science & Technology Directorate, Department of Homeland Security. Wayne covers programs under the border security and immigration portfolio within the Office of Test and Evaluation at DHS. Before DHS, Wayne came out of industry with 15 years of hands-on T&E experience spanning developmental test (DT), operational test (OT), and a significant amount of time managing Live Fire (LFT&E) programs for General Dynamics. Wayne was a tank master gunner and platoon sergeant in the Army and is a graduate of the Naval Postgraduate School.

The Honorable Kevin M. Fahey currently serves as the Assistant Secretary of Defense for Acquisition (ASD(A)). In this position, he advises the Under Secretary of Defense for Acquisition and Sustainment (USD(A&S)), the Deputy Secretary of Defense, and the Secretary of Defense on matters relating to the Department of Defense Acquisition System, acquisition program management, and the development of strategic, space, intelligence, tactical warfare, command and control, and business systems. Before assuming his position as ASD(A), Mr. Fahey was employed by Cypress International, Inc. in Alexandria, Virginia as Vice President, Combat Vehicles and Armaments, following a 34-year civil service career culminating in his retirement from the Senior Executive Service on December 1, 2015. Mr. Fahey was selected for the Senior Executive Service in February 2000. Effective June 1, 2014, Mr. Fahey assumed duties as the Executive Director, System of Systems Engineering and Integration Directorate, Office of the Assistant Secretary of the Army (Acquisition, Logistics, and Technology). Mr. Fahey previously served as the Program Executive Officer for Combat Support and Combat Service Support and as the Program Executive Officer Ground Combat Systems at Warren, MI, as well as the Deputy Program Executive Officer Ammunition and Senior Technical Executive for Close Combat Armament Systems, Armament Research, Development and Engineering Center, at Picatinny Arsenal, NJ. Mr. Fahey also served as the Deputy Project Manager, Crusader and Chief of the Systems Engineering and International Division for the Crusader Program; the Future Armored Resupply Vehicle (FARV) Program Development Project Officer and Chief of Systems Engineering; the U.S. delegate to the international 155mm Joint Ballistic Working Group; and the M119 Development Project Officer. Mr. Fahey, a native of Massachusetts, entered civil service in 1981 following graduation from the University of Massachusetts with a Bachelor of Science degree in Industrial Engineering/Operations Research. Upon graduation from college, Mr. Fahey attended the Quality and Reliability intern program at the DARCOM Intern Training Center, Red River Army Depot, Texarkana, Texas. Mr. Fahey has been the recipient of multiple awards and honors, including the Presidential Distinguished Rank Award, the Exceptional Civilian Service Award (2nd award), the Meritorious Civilian Service Award, and the Superior Civilian Service Award.

Steven “Flash” Gordon, PhD, is the Orlando Field Office Manager and a Principal Research Engineer for Georgia Tech Research Institute. He served 26 years in the United States Air Force with tours as an F-111 Weapons Systems Officer, Instructor, and Wing Electronic Warfare Officer; Air Staff Division Chief; 13th Air Force Director of Operations and Air Operations Center Director; and Air Force Academy Department of Mathematics Professor and Head. He also served as the first Technical Director for the Air Force Agency for Modeling and Simulation. Dr. Gordon has a Bachelor’s Degree in Mathematics (Marymount); Master’s Degrees in Education (Peabody/Vanderbilt), Industrial Engineering/Operations Research (Purdue), and in Business (Florida); and a PhD in Aero and Astro Engineering (Purdue). His research interests include return on investment for simulation-based training, trade space tools for training systems, statistical techniques for test and evaluation, and decision support tools for military operations.


Mr. Geoffrey Hart is the Solution Train Engineer for the National Background Investigation Services (NBIS) program at the Defense Information Systems Agency (DISA). For over 22 years, Mr. Hart has been serving and supporting information technology acquisition efforts throughout the Federal Government. He is a certified DAWIA Level III Acquisition Professional in Information Technology and a Project Management Professional (PMP), with experience planning, leading, executing, and controlling a variety of information technology initiatives, including business process improvements, technology development, and implementation. Currently, Mr. Hart is an IT Analyst for Policy & Planning within the NBIS Program Management Office (PMO), where he serves as the Large Solution Train Engineer. In that capacity, Mr. Hart provides subject matter expertise to the Program Director, the Program Manager (PM), the Deputy PM, the Solution Manager, the Chief Architect, three Product Managers, and 17 Product Owners to achieve program objectives within all of the program's functional areas. Further, he guides the nearly 300 professionals allocated across the program's three Agile Release Trains (ARTs) and 19 development teams. A servant leader and coach, Mr. Hart is responsible for designing, facilitating, enabling, and coordinating the program's large solution efforts, ensuring consistency across the various components of the program to deliver value to the end user. In this role, Mr. Hart received the 2018 award for Outstanding Achievement as a Contracting Officer's Representative (COR). He was specifically cited for his ability to capitalize on the talents of the entire team and effectively draw from the strengths of each team member, and for developing alternative courses of action and keeping management informed on resolution, frequently tackling "red tape" proactively to clear obstacles and ensure continued progress toward established program objectives. Before coming to the NBIS PMO, Mr. Hart supported DISA on efforts such as the 2015 Dianne McCoy Acquisition Award Clean Sheeting Team, which reviewed and identified potential cost savings and avoidances for multiple IT specialties, including communication, engineering, SATCOM, cybersecurity, applications, and capacity services.

Mr. John Hartford joined the Naval Information Warfare Systems Command (NAVWAR) headquarters staff as Chief Tester on March 31. Prior to NAVWAR, John served as the Navy Systems Test Branch Head at the Naval Information Warfare Center (NIWC) Pacific and was embedded in PEO C4I & Space Systems as the Director of T&E. As a Naval officer, John served in a number of T&E billets during his career, including APM-T&E in the Navy's GPS Program Office, Operations Analyst at the Naval Air Systems Command, and Operational Test Director for the S-3A/B Viking Anti-Submarine Warfare aircraft at Air Test and Evaluation Squadron One (VX-1).

Mr. Tom Hartzell joined the MROi Integrated Program Office (IPO) Team on January 20th as Chief Developmental Tester (CDT). He brings 23 years of AF Cyber/IT experience and more than 11 years of acquisition experience, including active duty, reserve, contract, and civil service. Tom graduated from the University of South Carolina with a B.S. in Biology in 1993 and joined the SC Dept. of Health and Environmental Control as a HAZMAT emergency responder. He joined the AF in 1995, commissioned through OTS. Assignments included Cannon AFB, NM; Mountain Home AFB, ID; Offutt AFB, NE; and Wright-Patterson AFB, OH. He earned an M.S. in Information Resource Management from AFIT in 1998, deployed in support of OPERATION ENDURING FREEDOM in 2001–2002, and retired from the AF Reserves in 2016.
Contract and civil service positions included DISA, DLA, 554 ELSW, and AFLCMC. Tom is married to Becky, his spouse of 24 years. They have a daughter, Samantha (21), and son, Gavin (19), who both currently attend Miami University in Oxford, OH. He enjoys spending free time with his family and their two opinionated Corgis. Tom is also active in martial arts, and enjoys water gardening and Islay Scotch.

Mr. Craig Hatcher is currently an acquisition program manager at the 412th TW, Edwards AFB, CA. He has over 25 years of program management experience managing various test infrastructure upgrades, with experience in Critical Path, Theory of Constraints, and Agile program management techniques. Craig has been instrumental in implementing several different program management tools and techniques during his career in both the test and acquisition program management arenas, and has taught various classes on scheduling and risk management techniques. He is currently managing the development of the next-generation control rooms for the 412th TW. Craig has a Bachelor of Science degree in Applied Mathematics from Abilene Christian University, a Master of Science degree in Applied Mathematics from Texas A&M University, and a Master of Science degree in Project Management from City University. Craig is also a certified Project Management Professional (PMP) through the Project Management Institute.

Ms. Rosa R. Heckle has been in the systems engineering field for over 25 years. She currently works for MITRE, a federally funded research and development center. For the past six years she has been supporting testing and evaluation for research teams working in human language technologies and computer vision, and she is currently a member of a working group examining testing and evaluation for artificial intelligence/machine learning enabled systems. She is a graduate of the Johns Hopkins Carey Business School, where she received a Master's in Organizational Development, and of the University of Maryland, where she received her PhD in Information Systems.

Amy E. Henninger, PhD, has developed expertise at the critical intersection of information science and technology, advanced analytics, and the process of building strategic decisions at the enterprise level. Currently serving as a Highly Qualified Expert (HQE) Senior Advisor for Software and Cybersecurity, Dr. Henninger formulates recommendations and strategic plans in software and cybersecurity for the Director, Operational Test and Evaluation (DOT&E). Prior to her current position, Dr. Henninger served in a variety of federally funded research and development center (FFRDC) and special-term government leadership positions, traversing the seams between the two as they have intertwined over time. At the Institute for Defense Analyses (IDA), Dr. Henninger's tenure included a two-year intergovernmental personnel appointment detailed under the Office of the Assistant Secretary of Defense for Research and Engineering. She went on to serve as an HQE with emphasis in modeling and simulation and Live-Virtual-Constructive environments as the Technical Advisor to the Director of the Center for Army Analysis and the Deputy Chief of Staff, Army G-8, where she provided vision and leadership for the Army's $10B M&S/LVC portfolio.
She served at the Defense Intelligence Agency (DIA) as a Defense Intelligence Senior Leader (DISL) Senior IT Advisor, where she co-led the development of the Agency's Digital Future Strategy; most recently, she served as the Science and Technology Lead for the Test and Evaluation Reform Management Group in the Chief Management Office.


Mr. Alex Hoover became the Associate Director for Cybersecurity within the Capability Development Support Group (CDS) Office of Systems Engineering (OSE) on January 9, 2017. Previously, Mr. Hoover served as a Test Area Manager for the Cybersecurity and Homeland Security Enterprise Systems Deputy Directorate within the Office of Test and Evaluation (OTE) in the Science and Technology Directorate (S&T) of the Department of Homeland Security (DHS). Prior to coming to DHS, Mr. Hoover was a civil servant in the Navy as a Systems Engineer for the Combat Direction Systems Activity, Dam Neck, Virginia, where he supported fleet cybersecurity analyses and served as an Operational Test Director. Mr. Hoover was commissioned as a Surface Warfare Officer in the United States Navy in 1991 and served in the Combat Logistics and Cruiser-Destroyer fleets.

Mr. Gene Hudgins works for KBRWyle as Director of Test and Training Environments and supports the Test Resource Management Center's (TRMC) Test and Training Enabling Architecture (TENA) Software Development Activity (SDA) and Joint Mission Environment Test Capability (JMETC) as the lead for the TENA and JMETC User Support Team. Since October 1998, the Central Test and Evaluation Investment Program (CTEIP) has overseen the development of TENA, which drastically improves range interoperability and resource reuse among DoD range systems, facilities, and simulations. As a key member of the TENA SDA and JMETC Program Office, Gene is responsible for distributed event coordination, design, and integration. Gene also manages TENA training and Range Commanders Council coordination. Gene is an active member of the International Test and Evaluation Association (ITEA) and currently serves as Vice President on the Executive Committee of the ITEA National Board of Directors (BOD). Prior to his work for the TRMC, Gene worked at Eglin AFB as an Instrumentation Engineer and Department Head.
Gene has a Bachelor’s Degree in Electrical Engineering from Auburn University (War Eagle!), a Master’s Degree in Electrical Engineering from the University of Florida (Go Gators!), and an MBA from the University of West Florida.

Steven J. Hutchison, PhD, is the Director of the Capability Development Support Group (CDS) in the Science and Technology Directorate of the Department of Homeland Security (DHS), and continues to serve as the Director, Test and Evaluation. Prior to coming to DHS, Dr. Hutchison served in the Office of the Secretary of Defense as Principal Deputy, Developmental Test and Evaluation. Dr. Hutchison previously served in the Defense Information Systems Agency (DISA) as Test and Evaluation Executive, and in the office of the DoD Director, Operational Test and Evaluation (DOT&E) as a net-centric warfare system analyst. Dr. Hutchison retired from the US Army in 2002. His military career included assignments in the 82nd Airborne and 3rd Infantry divisions, as an Assistant Professor in the Department of Mathematics at the United States Military Academy, and as Assistant Technical Director and system evaluator in the Army Test and Evaluation Command (ATEC). Dr. Hutchison earned a Bachelor of Science degree from the United States Military Academy, a Master of Science in Operations Research at the US Naval Postgraduate School, and his PhD in Industrial Engineering from Purdue University in 1998.

Ms. Lori Jameson is a senior civilian responsible for implementing LVC solutions for Capabilities Based Test & Evaluation across the NAVAIR enterprise. She has over twenty-nine years of experience in program management, warfare analysis, test and evaluation (T&E), training systems, and modeling and simulation. Ms. Jameson also supports PMA-205 with the 2025 Fleet Training Wholeness Strategy, providing expertise to support training solutions including LVC connectivity with open-air ranges, virtual labs, and constructive models. Ms. Jameson earned a Bachelor of Science degree in Information Systems Management from the University of Maryland.

Ms. Jennifer Johnson, World Elegance Pageant Ms. Plus Florida 2019, has been performing and singing since she was 8 years old and has over 150 shows under her belt. She has performed in many different places from North Carolina to Florida, including Voce, GI Jukebox, American Sway, Brahms to Broadway, and Disney's Candlelight Processional, where she has sung with some of today's brightest stars, including Warwick Davis, Edward James Olmos, Cal Ripken, Jr., and Whoopi Goldberg. She is the step-granddaughter of the late retired Air Force Lt. Col. David Hatcher. Jennifer gives back to her community while empowering girls and women of all ages to believe that nothing is impossible.

Major Bryan “Rico” Kelly is an assistant director of operations and F-16 instructor pilot at the U.S. Air Force Test Pilot School at Edwards AFB, CA. He recently completed a fellowship at the Lawrence Livermore National Laboratory in Livermore, CA. Major Kelly was commissioned in 2005 from the U.S. Air Force Academy. He is a U.S. Air Force Test Pilot School graduate and senior pilot with more than 2,400 flying hours and 400 instructor hours in 36 different makes and models of aircraft.

Page 24: Program Guide - ITEA...2019 LVC and Agile Workshop "Accelerating Test and Evaluation with Live-Virtual-Constructive and Agile" Program Guide September 17-19, 2019 DoubleTree by Hilton

Welcome to the 2019 LVC and Agile Workshop

20 | P a g e Hosted by the ITEA Central Florida Chapter

Ms. Tammie McClellan is the Program Director for Information Systems Technology at UCF's Institute for Simulation and Training. She leads the development effort for the Defense M&S Catalog’s discovery features to assist users in locating M&S resources available for reuse. Her team is also responsible for hosting and support of the servers that house the Defense M&S Catalog, EMBR Portal, and other R&D efforts on behalf of the Defense M&S Coordination Office (DMSCO).

Col Hans Miller, USAF (ret), is a Principal Test and Evaluation Subject Matter Expert at the MITRE Corporation. He retired with over 25 years of experience in combat operations, experimental flight test, international partnering, command and control, policy, and strategic planning of defense weapon systems. His last assignment was as Division Chief of the Policy, Programs and Resources Division, Headquarters Air Force Test and Evaluation Directorate at the Pentagon. He led a team responsible for Test and Evaluation policy throughout the Air Force, coordination with OSD and Joint Service counterparts, and staff oversight across the spectrum of all Air Force acquisition programs. Prior to that assignment, he was the Commander of the 96th Test Group, Holloman AFB, NM. Hans Miller was commissioned as a graduate of the USAF Academy. He has served as an operational and experimental flight test pilot in the B-1B and as an F-16 chase pilot. He flew combat missions in the B-1B in Operation Allied Force and Operation Enduring Freedom. He served as an Exercise Planning Officer at the NATO Joint Warfare Center, Stavanger, Norway. Col (ret) Miller was the Squadron Commander of the Global Power Bomber Combined Test Force coordinating ground and flight test activities on the B-1, B-2 and B-52. He served as the Director, Comparative Technology Office, within the Office of the Secretary of Defense. He managed the Department’s Foreign Comparative Testing, and Rapid Innovation Fund programs. Hans Miller is a Command Pilot with over 2100 hours in 35 different aircraft types. He is a Department of Defense Acquisition Corps member and holds Level 3 certification in Test and Evaluation. He is a graduate of the USAF Weapons School, USAF Test Pilot School, Air Command and Staff College and Air War College. He holds a bachelor’s degree in Aeronautical Engineering and a master’s degree in Aeronautical and Astronautical engineering from Stanford University.

Maj. Gen. Matthew H. Molloy, USAF (Ret.) is the former Commander, Air Force Operational Test and Evaluation Center, Kirtland Air Force Base, N.M. As Commander, General Molloy reported directly to the Air Force Chief of Staff regarding the test and evaluation of more than 76 major programs, valued at more than $650 billion, assessed at 12 different locations. He directed the activities of more than 750 military, civilian, and contractor personnel. As a member of the test and evaluation community, General Molloy coordinated directly with the offices of the Secretary of Defense and Headquarters U.S. Air Force while executing realistic, objective and impartial operational testing and evaluation of Air Force, coalition and joint warfighting capabilities. General Molloy was commissioned in 1987 through the Reserve Officer Training Corps program at the University of Colorado, Boulder, where he received a Bachelor of Science degree in aerospace engineering. He completed Euro-NATO Joint Jet Pilot Training in 1989 and is a command pilot with more than 3,200 flying hours in the F-15 and F-22. He has commanded at the flight, squadron, group and wing levels.

Ms. Rachael Orzechowski, Program Manager with SimVentions Inc., has over ten years’ experience with software development and modeling and simulation supporting various US Government programs. She currently leads the effort to develop the Enterprise Metacard Builder Resource (EMBR) Portal and its customized versions and enjoys research into enhancing organizational effectiveness through the application of Agile Methodologies.

Ms. Jignya Patel joined the Florida Institute of Technology in 2015 as an assistant professor of Information Systems. She holds an MBA from the University of North Alabama and a Ph.D. from the University of Memphis in the area of Management Information Systems. Her research interests are in the areas of IT project management, software testing, and unintended or negative effects of IT use on individuals. Her publications have appeared in journals such as Information Technology & People and the Journal of Information Technology Management. She has also presented her research at prestigious conferences such as the International Conference on Information Systems, the Americas Conference on Information Systems, and the Southern Association for Information Systems.

Robin Poston, PhD, is the Director of the System Testing Excellence Program for the FedEx Institute of Technology at The University of Memphis, and she is a Professor and Dean of the Graduate School at The University of Memphis, which has 4,200 graduate students studying in 126 graduate programs. Dr. Poston is a recipient of the Memphis Alumni Association Distinguished Teaching Award and she leads the annual International Research Workshop on Advances and Innovations in Software Testing attended by hundreds of academic and industry professionals. Dr. Poston’s current research focuses on understanding the alignment within the IT unit among developers and testers, client managers’ responsibilities and governance especially in mitigating vendor silence and managing vendors in outsourcing of software testing projects, and online security threats. Dr. Poston has over 18 years of experience in the information systems field working for KPMG Consulting, Educational Computer Corporation, Meta Group Research, and Convergys, as well as consulting with several Fortune 500 companies and government agencies. Today, she works with organizations, such as the Department of Homeland Security, Defense Information Systems Agency of the Department of Defense, FedEx Corporation, First Tennessee Bank, St. Jude/ALSAC, and others to conduct projects and educational programs. Dr. Poston received her bachelor’s degree in Computer Science from The University of Pennsylvania (1987), master’s degree in Accounting from The University of Central Florida (1992), and Ph.D. in Management Information Systems from Michigan State University (2003).

Mr. Bob Potter is the Division Chief for Mission Command within the US Army Evaluation Center's C4ISR Directorate and is ATEC's Lead Representative to the Synthetic Training Environment Cross Functional Team. Before joining ATEC in 2007, Mr. Potter served as a Software and Systems Engineer for various US Navy Command and Control Systems, including the USS Ronald Reagan Ship Self Defense System and numerous AEGIS-class cruisers and destroyers. Mr. Potter's T&E portfolio includes the Joint Battle Command - Platform, Nett Warrior, and the Army's emerging Command Post Computing Environment. Mr. Potter holds a bachelor's degree in Computer Science from Old Dominion University.

Mr. Dean Ptaszynski retired from the US Army as a Lieutenant Colonel after serving 20 years in leadership positions in Tactical Communications and Computer Systems Acquisition. After the Army, Dean was an IT Manager with FedEx Services for over 19 years. While at FedEx, Dean managed an employee/vendor team responsible for testing all Corporate Loads for Revenue Systems. After Testing, Dean stood up an Agile development team responsible for Enterprise Document Management, Enterprise Notifications, Enterprise Currency, and Enterprise Shipment. During this time, Dean became a certified Scaled Agile Framework (SAFe) Program Consultant (SPC). As an SPC, Dean taught SAFe courses to over 500 FedEx employees.

Ms. Jennifer Rekas is a Senior Agile Systems Engineer with The MITRE Corporation. She has led numerous Agile/DevOps transformations across a wide variety of DoD, Intelligence Community, and Federal Civilian agencies and regularly provides DevOps training and agile coaching for MITRE sponsors looking to adopt agile practices to deliver better and faster. Jennifer also leads a multidisciplinary team of systems and software engineers to mature DevOps capabilities and enable more effective application in the government domain. Her current research interests include how to utilize DevOps metrics to enable rapid decision making by senior leadership and how to tailor recognized commercial agile practices for large-scale, multi-vendor, mission-critical DoD systems of systems. Before joining MITRE, Jennifer was a software systems engineer with commercial firms, government contractors, and other FFRDC organizations.

Mr. Austin Ruth is a Research Engineer for the Georgia Tech Research Institute (GTRI), tasked with improving flight test data analysis processes using Artificial Intelligence. He holds a B.S. in Electrical Engineering from Kennesaw State University and is pursuing an M.S. in Computer Science with an Interactive Intelligence focus through Georgia Tech's Online Master of Science in Computer Science (OMSCS) program. After a year of work in flight test support, Mr. Ruth began working in the field of data analysis, where he successfully showed the viability of machine learning in the space. Throughout his career he has also worked as a systems engineer, leading multiple system test events with GTRI.

The Honorable Alan R. Shaffer currently serves as the Deputy Under Secretary of Defense for Acquisition and Sustainment (A&S). Senate confirmed in January 2019, he is responsible to the Under Secretary of Defense for all matters pertaining to acquisition; contract administration; logistics and material readiness; installations and environment; operational energy; chemical, biological, and nuclear weapons; the acquisition workforce; and the defense industrial base. From 2015 to 2018, Mr. Shaffer served as the Director, NATO Collaboration Support Office in Neuilly-sur-Seine, France. In this role, he was responsible for coordinating and synchronizing Science and Technology (S&T) collaboration between NATO member and partner nations, comprising a network of about 5,000 scientists. Prior to his role at NATO, Mr. Shaffer served as the Principal Deputy Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) from 2007 to 2015. In this position, Mr. Shaffer was responsible for formulating, planning, and reviewing the DoD Research, Development, Test, and Evaluation (RDT&E) programs, plans, strategy, priorities, and execution of the DoD RDT&E budget, which totals roughly $25 billion per year. He also served twice as the Acting Assistant Secretary of Defense for Research and Engineering, from 2007 to 2009 and 2012 to 2015. Additionally, in 2009, he was appointed as the first Director, Operational Energy, Plans and Programs (Acting). Mr. Shaffer has also served as the Executive Director for several senior DoD task forces, including the review of all research, acquisition, and test activities during the 2005 Base Realignment and Closure. In 2007, he was the Executive Director for the DoD Energy Security Task Force and, from 2007 to 2012, he served as the Executive Director of the Mine-Resistant Ambush Protected (MRAP) Task Force, where he was responsible for oversight and fielding of 27,000 MRAPs. Before entering federal government service, Mr. Shaffer served a 24-year United States Air Force career in command, weather, intelligence, and acquisition oversight, with assignments in Utah, California, Ohio, Honduras, Germany, Virginia, and Nebraska. His career included deployment to Honduras in the mid-1980s and direct support of the United States Army 3rd Armored Division in Hanau, Germany. During Operation DESERT STORM, he was responsible for deployment of the 500-person theater weather force.

Mr. Kenneth Senechal serves as the Director, Capabilities Based Test & Evaluation (CBTE), and is the senior civilian responsible for implementing Capabilities Based Test & Evaluation across the NAVAIR enterprise. He has over 20 years of experience in test & evaluation (T&E), program management, and system engineering. Mr. Senechal is a Naval Air Systems Command Associate Fellow, a graduate of the U.S. Naval Test Pilot School, Class 123, and has a Bachelor of Science degree in Aerospace Engineering from the Missouri University of Science and Technology.

Mr. Tim Sienrukos serves as the Test Director in the Network Communications Branch, part of the Enterprise Services Test & Evaluation Division of the FAA William J. Hughes Technical Center. He leads the Test & Evaluation efforts for the National Airspace System (NAS) Common Reference (NCR) service, one of the FAA's first end-to-end Agile projects to reach the NAS. Tim's undergraduate work at Seton Hall University and Stockton University culminated in a Bachelor of Science degree in Computer Science. His subsequent education includes graduate-level work at the Stevens Institute of Technology.

Mr. Jonathon C. Skarphol is a Mission Systems Architect and the Test and Training Instrumentation (TTI) Discipline Chief at Collins Aerospace. He is a regular presenter at ITEA events, former chief engineer for the Common Range Integrated Instrumentation System (CRIIS), and currently manages the technical strategy and roadmaps for the TTI and Tactical Cyber product lines, focused on providing innovative cybersecure solutions to modernize air combat training and T&E. Jon holds a B.S. in Electrical Engineering from North Dakota State University, an MBA from the University of Iowa, and a Systems Engineering and Architecting graduate certificate from the Stevens Institute of Technology. Mr. Skarphol has 20 years of experience in technology, innovation, and engineering across the defense and high-performance computing industries. He has held various engineering and leadership positions related to test and training solutions, cybersecurity products, and the development of systems of systems, hardware, and high-assurance security designs for encryption devices, cross-domain guards, datalinks, GPS, cryptographic engines, FPGAs/ASICs, and miniaturization of electronics including advanced packaging techniques.

Robert N. Tamburello, PhD, currently serves as the Deputy Director of the National Cyber Range Complex (NCRC) within the DoD Test Resource Management Center (TRMC). In this role, Dr. Tamburello works with an interdisciplinary team to design and conduct events that address the full spectrum of DoD requirements in the cyber test and training domains. In addition, as the Program Manager for the NCRC Expansion, Dr. Tamburello is leading the effort, in partnership with the Services, to increase the DoD's capacity to support cyber test and training requirements through the establishment of new NCRC facilities. Before joining TRMC, Dr. Tamburello served as the Division Chief for Mounted Systems within the Integrated Suitability and Methodology Evaluation Directorate (ISMED) of the U.S. Army Evaluation Center (AEC) at Aberdeen Proving Ground, MD. In this capacity, he led a diverse group of civilian and military personnel, including Australian Army exchange officers, to perform the operational suitability evaluation of a portfolio of more than 250 ground-based systems including Abrams, AMPV, Bradley, JLTV, MRAP, and Stryker. Immediately prior to serving in this position, he was the Division Chief of the Army Center for Reliability Growth at AEC. Previously, he has served in such positions as the Army Test and Evaluation Command (ATEC) lead for the Army Expeditionary Warrior Experiment (AEWE), the Tactical Vehicles Team Leader in the Reliability and Maintainability Directorate of AEC, and the Wide Area Surveillance Team Leader at the U.S. Army Materiel Systems Analysis Activity (AMSAA). Over his career, he has led data-driven studies, developed methodologies, presented research findings, published journal articles, and crafted policy. He has also advised numerous program management offices and supervised the development of test plans and evaluations that directly supported milestone decisions associated with the production and fielding of Major Defense Acquisition Programs across all commodity areas. Serving as an instructor for the Army Center for Reliability Growth's Short Course, Dr. Tamburello promoted best practices for system reliability design, test, and evaluation, while concurrently developing new tools and techniques through his research contributions.

Mr. Thomas Treakle is a Client Solutions Director with Dell Technologies Federal Consulting, responsible for the Test Range and Mission Systems business area. He holds a B.S. and M.S. in Ocean Engineering from Virginia Tech and is currently actively involved in the requirements alignment, architectural development, and productization of the JSF Knowledge Management and RAPIDS programs sponsored by the Test Resource Management Center (TRMC). Previously, as a Chief Engineer with Leidos, Inc., Mr. Treakle worked in distributed test technology development for the Test and Training Enabling Architecture (TENA) as Principal Investigator for the Technologies for Net-Centric Test Interoperability (TNTI) and TENA in Resource Constrained Environments (TRCE) projects sponsored by the TRMC Test & Evaluation / Science & Technology (T&E/S&T) Program.

Lt Col Karlos G.L. Tungol is the Commander, 45th Test Squadron, Eglin Air Force Base, Florida. He is responsible for commanding test operations for a 250+ member organization with personnel at 7 major geographic locations, executing a $42.5M annual test budget. The 45th Test Squadron performs full-spectrum developmental testing for 443 Air Force and Joint air, space, strategic command and control, air and ground mission planning, strategic intelligence information dissemination, and defense business systems acquisition programs. Prior to his current assignment, he was the Integration Branch Chief of the Airspace Mission Planning Division, PEO Digital, Hanscom Air Force Base, Massachusetts. There he led the $1.2 billion Joint Mission Planning System Enterprise, which develops a suite of PC-based mission planning systems for 50 Air Force aircraft and weapons systems and provides mission planning services to the Navy, Marines, Army, several government agencies, and 30 allied nations. He served on the Air Staff as a Payloads Program Manager, Joint Systems Program Manager, and Joint Systems Program Element Monitor, and then as the Chief of Integration, Advanced Combat Systems, under the Director of the Air Force Rapid Capabilities Office, Office of the Assistant Secretary of the Air Force (Acquisition). While at the Air Force Rapid Capabilities Office, he deployed as the Senior Air Advisor and Chief of Programs at the 438th Air Expeditionary Wing, Train, Advise, Assist Command-Air, Forward Operating Base Oqab, Kabul, Afghanistan. He led the combat and training requirements for the Afghan Air Force and served as the senior air advisor to the Afghan Air Force Headquarters. He led $3.4 billion in Title 10 foreign military sales program requirements, directly resulting in the successful sustainment of Mi-17 helicopters and C-208B light lift aircraft, and initial operational capability for the C-130H, A-29, and MD-530F aircraft fleets for the Afghan Air Force.

Ms. Simone Youngblood is a member of the Johns Hopkins Applied Physics Laboratory's Principal Professional Staff. Leveraging an extensive background in simulation development and credibility assessment, Ms. Youngblood has served as the DoD VV&A focal point for the past 26 years. She was the editor of the DoD VV&A Recommended Practices Guide and chaired the development of several VV&A-related standards, including IEEE Standard 1278.4, IEEE Standard 1516.4, and MIL-STD-3022. Ms. Youngblood has served as the V&V and/or Accreditation agent for numerous M&S efforts that span a broad organizational spectrum, including DTRA, DNDO, NAVAIR, and PEO IWS 1. She has a B.A. in mathematics as well as B.S. and M.S. degrees in computer science.

Lt Col Barbara Ziska is the commander of the 605th Test and Evaluation Squadron at Hurlburt Field, Florida. The 605th TES is Air Combat Command's sole operational test squadron for C2ISR (Command and Control, Intelligence, Surveillance, and Reconnaissance) systems, including the AOC (Air Operations Center), AWACS (Airborne Warning and Control System), JSTARS (Joint Surveillance Target Attack Radar System), CRC (Control and Reporting Center), TACP (Tactical Air Control Party), DCGS (Distributed Common Ground System), Nuclear C3, and CMCC (Common Mission Control Center). 605th test teams have a long history of executing early involvement and integrated testing with their developmental test partners in the 96th CTG at Eglin AFB, Florida, and are currently actively involved with Kessel Run teams. Lt Col Ziska is a senior Air Battle Manager with experience in the AOC, CRC, and E-3 AWACS.

Professional Certification

Elevating the Test and Evaluation Profession with a Globally Recognized Credential ITEA administers, manages, and awards the Certified Test and Evaluation Professional (CTEP) credential which provides significant benefits to T&E professionals, organizations, and their customers. Over 500 T&E subject-matter experts (SMEs) have been involved in the development of this credential. These SMEs—T&E executives, managers, supervisors, individual contributors, and technicians—have come from a diverse cross-section of the T&E profession, representing industry, government, academia, laboratories, ranges, weapon systems, information technology, transportation, electronic communications, consumer electronics, and more.

PURPOSE OF THE CTEP CREDENTIAL

• Recognize individuals who demonstrate:

o KNOWLEDGE, SKILLS, AND ABILITIES: They meet the minimum level of competency in the requisite KSAs that have been identified by T&E subject-matter experts (SMEs).

o COMMITMENT to maintaining currency in the field.

o DEDICATION to advancing the profession.

• Develop and promote common standards, principles, procedures, processes, and terms for the T&E profession.

• Support professional development and education to enhance the KSAs of T&E professionals.

PROFESSIONAL CERTIFICATION VERSUS "CERTIFICATE" PROGRAMS

Please note that a “professional certification credential” is quite different from the “certificate” programs that are currently available to test professionals. “Certificate” programs award a certificate of completion or achievement to individuals after they successfully complete a course of study or meet some minimum requirements. In contrast, a professional certification credential:

• Is a time-limited recognition requiring periodic submission for re-certification to demonstrate continued currency in the profession, including demonstration of full-time employment in the field and continuing education.

• Is awarded based on the candidate's passing a competency exam, which may be written and/or observational, and is not tied to the completion of any specific course or curriculum.

• Bestows upon an individual the right to use the credential's designation in conjunction with their name (e.g., CSE, CPA, or CPM) after an assessment and verification that they have met predetermined and standardized criteria.

• Confers occupational identity and provides a method for maintaining quality standards of knowledge and performance and for stimulating continued self-improvement.

• Provides differentiation among test professionals, using standards developed through a consensus-driven process and based on existing legal and psychometric requirements.

• Requires adherence to a Professional Code of Ethics.

THANK YOU TO OUR SPONSORS!

Platinum Level

Gold Level Sponsor

Bronze Level Sponsor

ITEA is a 501(c)(3) professional education association dedicated to the education and advancement of the test and evaluation profession. Registration fees, membership dues, and sponsorships are tax deductible.

Sponsorship dollars defray the cost of the workshop and support the ITEA scholarship fund, which assists deserving students in their pursuit of academic disciplines related to the test and evaluation profession.