Jim Woolsey
Deputy Director for Performance Assessments
OSD PARCA
[email protected]
Assessing Program Execution
Performance Assessments and Root Cause Analyses
Performance Assessments and Root Cause Analyses (PARCA)
PARCA was created by the Weapon Systems Acquisition Reform Act of 2009 (WSARA)
Stood up in January 2010
Director: Mr. Gary R. Bliss
Deputy Director for Acquisition Policy Analysis Cell: Dr. Philip S. Anton
Deputy Director for Performance Assessments: Mr. James P. Woolsey
Senior Advisor for Root Cause Analysis: vacant
Deputy Director for Earned Value Management: Mr. Gordon M. Kranz
Military Deputy: vacant
www.acq.osd.mil/parca
PARCA Performance Assessments – WSARA’s Assignments
1. Carry out performance assessments of MDAPs
2. Issue policies, procedures, and guidance on the conduct of performance assessments
3. Evaluate the utility of performance metrics used to measure cost, schedule, and performance
4. Advise acquisition officials on the performance of programs that have been certified after a Nunn-McCurdy breach, are entering full-rate production, or are requesting multi-year procurement
Improve visibility into the execution status of MDAPs
Event-Driven Assessments
Performance assessments following Nunn-McCurdy: Apache Block 3 (x2), ATIRCM-CMWS (x2), DDG1000 (x3), Excalibur, F-35 (x3), RMS (x2), WGS, Global Hawk
Advice on multiyear, full-rate, and other decisions: SSN-774, C-5 RERP, C-27J, UH-60M, AMRAAM, CH-47, V-22, DDG-51, F/A-18E/F/G
Assessments:
– Track progress on root causes
– Establish and follow performance metrics
– Comment on overall program prospects
Also participated in JSF quick look report
Continuous Performance Assessments
Assessments are performed through the DAES process:
– Surveillance: information gathered from PMs and OSD offices
– Executive insight: information presented to decision-makers
PARCA:
– Integrates assessments from other offices
– Recommends programs for DAES briefings
– Identifies important issues for discussion
PARCA also does independent analyses, such as:
– Identification of a failing EV system
– Early identification of significant cost growth
– Illustration of LRIP cost implications
– Description of reliability status
PARCA Vision for Assessing Program Execution
Sharpen assessment tools and invent new ones
– Data-driven analyses of current programs
– Clear and concise communication to leadership
Improve how we gather what we know
– The DAES process
Change the way we think about programs
– Framing assumptions
Using Earned Value to Show Implications of LRIP Costs
[Chart: Estimated Average URF Price by Lot, notional data for a sample missile program. Y-axis: $ millions (0.00 to 3.00); x-axis: LRIP 1, LRIP 2, and Lots 1 through 8. Series: approximate unit-cost contract values, projected estimates, current EAC projection, and previous EAC projection. EAC projections are based on the cumulative current-period CPR.]
Contract type by lot: CPAF, CPAF, FPIF, FPIF, then FFP thereafter. Quantities: 2, 12, 17, 32, 42, 53, 104, 157, 207. Contract percent complete: 100%, 100%, 60%, 20%.
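The EAC projections in the chart come from earned value data. As a rough illustration (not PARCA's actual model), the standard cumulative-CPI projection the chart note describes can be computed as below; the function name and sample numbers are invented for this sketch.

```python
# Minimal sketch of an EV-based EAC projection using cumulative CPI, per
# the chart note ("EAC projection based on CUM Current Period CPR").
# Sample values are invented, not from the briefing.

def eac_from_cum_cpi(bac: float, bcwp_cum: float, acwp_cum: float) -> float:
    """EAC = ACWP + (BAC - BCWP) / CPI_cum, with CPI_cum = BCWP / ACWP."""
    cpi_cum = bcwp_cum / acwp_cum            # demonstrated cost efficiency
    return acwp_cum + (bac - bcwp_cum) / cpi_cum

# A lot with a $100M budget, 60% of work earned, $62.5M actually spent:
# CPI_cum = 0.96, so the projected EAC is about $104.2M.
print(eac_from_cum_cpi(bac=100.0, bcwp_cum=60.0, acwp_cum=62.5))
```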
A Failing EV System
[Chart: Earned Value vs. Estimated Costs, September 2005 to August 2010. Y-axis: estimated price, $800M to $1,050M (TY$). Series: PM EAC and EV EAC (with fee). Cumulative CPI = 0.96.]
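One way to surface the divergence this chart shows is to regenerate an EAC from the contract's earned value data and compare it with the PM's reported EAC. The sketch below is a hedged illustration: the fee treatment, threshold, and names are assumptions, not PARCA's method.

```python
# Illustrative comparison of a PM-reported EAC against an EV-generated EAC
# (with fee). Fee handling and the divergence threshold are assumptions.

def ev_eac_with_fee(bac: float, bcwp_cum: float, acwp_cum: float,
                    fee_rate: float = 0.10) -> float:
    """EV-based EAC on cost, marked up by an assumed fixed fee rate."""
    cpi_cum = bcwp_cum / acwp_cum
    eac_cost = acwp_cum + (bac - bcwp_cum) / cpi_cum
    return eac_cost * (1.0 + fee_rate)

def pm_eac_looks_optimistic(pm_eac: float, ev_eac: float,
                            tolerance: float = 0.05) -> bool:
    """Flag a PM EAC sitting well below the EV-generated EAC, one
    symptom of a failing EV system."""
    return pm_eac < ev_eac * (1.0 - tolerance)

# CPI_cum = 0.96 drives the EV EAC to ~$1,031M; a $920M PM EAC is flagged.
ev_eac = ev_eac_with_fee(bac=900.0, bcwp_cum=540.0, acwp_cum=562.5)
print(ev_eac, pm_eac_looks_optimistic(pm_eac=920.0, ev_eac=ev_eac))
```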
Earned Value and Cost Realism
[Chart: cumulative CPI (CPI_cum) plotted against the to-complete performance index for the EAC (TCPI_EAC) with a +10% band; y-axis runs from roughly 0.68 to 1.18.]
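The chart compares demonstrated efficiency (CPI_cum) with the efficiency required to finish at the stated EAC (TCPI_EAC); when the gap exceeds roughly 10%, the EAC is usually suspect. Below is a minimal sketch of that comparison, with the 10% band taken from the chart and everything else assumed.

```python
# Sketch of the cost-realism test behind a CPI_cum vs. TCPI_EAC chart.
# The +10% band mirrors the chart; function names are invented.

def cpi_cum(bcwp_cum: float, acwp_cum: float) -> float:
    """Demonstrated cost efficiency to date."""
    return bcwp_cum / acwp_cum

def tcpi_eac(bac: float, bcwp_cum: float, acwp_cum: float, eac: float) -> float:
    """Efficiency required on remaining work to finish at the stated EAC."""
    return (bac - bcwp_cum) / (eac - acwp_cum)

def eac_is_credible(bac, bcwp_cum, acwp_cum, eac, band=0.10) -> bool:
    """An EAC that needs more than ~10% better efficiency than has been
    demonstrated is usually not credible."""
    required = tcpi_eac(bac, bcwp_cum, acwp_cum, eac)
    return required <= cpi_cum(bcwp_cum, acwp_cum) * (1.0 + band)

# CPI_cum = 0.96 but the stated EAC implies TCPI = 1.20: returns False.
print(eac_is_credible(bac=100.0, bcwp_cum=60.0, acwp_cum=62.5, eac=95.8))
```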
Assessing Reliability and Availability
Problem:
– KPP is usually availability (Ao)
– We measure reliability (MTBF)
– The connection between the two is not always clear (see the sketch below)
Another problem:
– Reliability is complicated
And it's important:
– Reliability and Ao drive support costs and CONOPS
PARCA has had some success clarifying these issues on several programs
– More to follow
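As a minimal sketch of the reliability-to-availability link the slide calls "not always clear": the steady-state textbook relation ties Ao to mean time between failures (MTBF) and mean downtime per failure (MDT). Real Ao definitions vary by program; this is the textbook form, not a DoD standard, and the numbers are invented.

```python
# Textbook steady-state operational availability from MTBF and MDT.
# Program-specific Ao definitions differ; this is only an illustration.

def operational_availability(mtbf_hours: float, mdt_hours: float) -> float:
    """Ao = uptime / (uptime + downtime) = MTBF / (MTBF + MDT)."""
    return mtbf_hours / (mtbf_hours + mdt_hours)

# A 100-hour MTBF with 8 hours of downtime per failure gives Ao ~ 0.926.
# A 50-hour MTBF with 4-hour MDT gives the SAME Ao, which is why an Ao KPP
# alone does not pin down the reliability requirement.
print(operational_availability(100.0, 8.0))
print(operational_availability(50.0, 4.0))
```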
PARCA Vision for Assessing Program Execution
Sharpen our tools and invent new ones
– Data-driven analyses of current programs
– Clear and concise communication to leadership
Improve how we gather what we already know
– The DAES process
Change the way we think about programs
– Framing assumptions
Gathering What We Know – The DAES Process
As the senior acquisition executive, USD(AT&L) must maintain situational awareness on all MDAPs
DAES is a unique mechanism for doing so:
– It is continuous (not event-driven)
– It is broad-based
– It includes independent viewpoints
– It is data-driven (or should be)
Two Parts of DAES Improvement

1. DAES Assessments
– Lead: PARCA and ARA
– Description: Improve DAES assessments
• Refine assessment categories
• Define assessment content
• Clarify roles and responsibilities
– Product: Consistent, rigorous, and efficient program assessments

2. Executive Insight
– Lead: ASD(A)
– Description: Improve executive insight into programs
• Determine priorities and preferences
• Streamline the process from data through meetings
• Execute improved processes
– Product: Efficient and appropriate insight

(The two parts exchange priorities and requirements in one direction, and structure, data, and information in the other.)
Improving DAES Assessments
PARCA is one of several players improving the DAES process:
– Mr. Kendall's interest and direction have been critical
– Dr. Spruill has implemented and reinforced Mr. Kendall's direction
– Mrs. McFarland is improving the process for executive insight
PARCA roles:
– Update assessment guidance (with ARA)
• Will include analysis concepts and best practices
• Input from OIPTs, SAEs, and functional offices
• Will incorporate Better Buying Power initiatives
Assessment Categories
Current
– Cost
– Schedule
– Performance
– Contracts
– Management
– Funding
– Test
– Sustainment
– Interoperability
– Production
Proposed
– Program Cost*
– Program Schedule*
– Performance
– Contract Performance*
– Management*
– Funding
– Test
– Sustainment
– Interoperability
– Production
– International Program Aspects (IPA)**
* New or re-structured ** Added before PARCA/ARA work
Contract Performance

Overview (what is being assessed?): an assessment of a program's execution of major individual contracts. How are the contracts performing in cost and schedule, and what effect do they have on the overall program?

Core assessment areas (what should I consider?): scope and context; contract cost and schedule analysis/metrics; impact on program success. The framework spans scope/planning, to-date and projected performance/execution, and impact/risk.

Sample topics (what tools could I use?):
– Scope and Context: programmatics and baseline documents; size, purpose, and structure; contract schedule integration and critical path; duration and % complete
– Cost Analysis: cost and lower-level trends; contract budget analysis; cost drivers
– Schedule Analysis: critical path; task completion; milestones (contract); schedule drivers
– Performance Trends: variability; CV/SV history
– Cost, Schedule, and Funding: effort remaining; % complete; % spent; % scheduled; work and budget remaining
– EAC Analysis: VAC trends; differences in EACs; realism
– Risk and Mitigation: qualitative factors; MR burn-down; government liability

A sketch of the standard EVM quantities behind these topics follows below.
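Most of the sample topics above are standard earned-value quantities. As a hedged sketch (textbook EVM definitions; the CprSnapshot class and field names are invented for illustration), they can all be computed from a single CPR snapshot:

```python
# Textbook EVM quantities behind the sample topics (CV/SV history,
# % complete, % spent, VAC). Class and field names are illustrative.

from dataclasses import dataclass

@dataclass
class CprSnapshot:
    bac: float        # Budget at Completion
    bcws_cum: float   # Budgeted Cost of Work Scheduled (planned value)
    bcwp_cum: float   # Budgeted Cost of Work Performed (earned value)
    acwp_cum: float   # Actual Cost of Work Performed
    eac: float        # Estimate at Completion

    @property
    def cost_variance(self) -> float:       # CV > 0 is favorable
        return self.bcwp_cum - self.acwp_cum

    @property
    def schedule_variance(self) -> float:   # SV > 0 is favorable
        return self.bcwp_cum - self.bcws_cum

    @property
    def pct_complete(self) -> float:        # work earned vs. total budget
        return self.bcwp_cum / self.bac

    @property
    def pct_spent(self) -> float:           # money spent vs. total budget
        return self.acwp_cum / self.bac

    @property
    def vac(self) -> float:                 # Variance at Completion
        return self.bac - self.eac

snap = CprSnapshot(bac=100.0, bcws_cum=65.0, bcwp_cum=60.0,
                   acwp_cum=62.5, eac=104.2)
print(snap.cost_variance, snap.schedule_variance, snap.vac)
```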
Metrics for Schedule Performance
– Block diagrams: by April 6
– Draft guidance: by May 4
– Guidance coordination: May 11
– Approval: by May 25
PARCA Vision for Assessing Program Execution
Sharpen our tools and invent new ones
– Data-driven analyses of current programs
– Clear and concise communication to leadership
Improve how we gather what we already know
– The DAES process
Change the way we think about programs
– Framing assumptions
Estimating Assumptions Flow from Framing Assumptions
Flow: framing assumptions lead to consequences, which become estimating assumptions, which drive the cost and schedule estimates.
Responsible communities: requirements, technical, and program management own the framing assumptions; cost estimators own the estimating assumptions.

– Framing assumption: Design is mature (prototype design is close to production-ready), so production and development can be concurrent.
Estimating assumption: Schedule will be more compact than historical experience.
– Framing assumption: Weight (critical for vertical lift) is known.
Estimating assumption: Weight will not grow as usual for tactical aircraft.
– Framing assumption: Design can now be refined for affordability.
Estimating assumption: Affordability initiatives will reduce production cost.
Correlation When Framing Assumption is Invalid
(Repeats the diagram above. When the framing assumption is invalid, every estimating assumption that flows from it fails at the same time, so the resulting cost and schedule errors are correlated; an illustrative simulation follows below.)
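As a hedged illustration of that correlation point (everything here is invented: the probabilities, distributions, and growth factors are not from the briefing), a small simulation shows how a shared framing-assumption failure makes cost and schedule errors co-move:

```python
# Illustrative Monte Carlo: when one framing assumption fails, the
# estimating assumptions that flow from it fail together, so cost and
# schedule growth are correlated rather than independent.
# All parameters are invented. Requires Python 3.10+ for
# statistics.correlation.

import random
import statistics

def simulate_program(p_invalid: float = 0.3) -> tuple[float, float]:
    """Return (cost growth factor, schedule growth factor) for one program."""
    cost = random.gauss(1.0, 0.05)        # independent estimating noise
    sched = random.gauss(1.0, 0.05)
    if random.random() < p_invalid:       # the framing assumption fails
        shock = random.uniform(1.2, 1.6)  # hits cost AND schedule together
        cost *= shock
        sched *= shock
    return cost, sched

runs = [simulate_program() for _ in range(10_000)]
costs, scheds = zip(*runs)
# Strongly positive: the shared shock, not the independent noise, dominates.
print(statistics.correlation(costs, scheds))
```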
Illustrative Framing Assumptions
Pre-MS B activities: The design is very similar to the ACTD.
Technical base: Modular construction will result in significant cost savings.
Policy implementation: The conditions are met for a firm, fixed price contract.
Organizational: Arbitrating multi-Service requirements will be straightforward.
Program dependencies: FCS will facilitate solution of size, weight, power, and interoperability issues.
Threat or operational needs: The need for precision strike of urban targets will not decline.
Industrial base/market: The satellite bus will have a substantial commercial market for the duration of the program.
(The chart groups these as aspects of the program now, the program's future, and the program environment.)
Framing Assumptions and Decision-Making
Intent is to raise the key issues for the program irrespective of whether they are controversial:
– First step: Identify the right issues and know how they contribute to program success.
– Second step: Establish what metrics are relevant to the issue's contribution to program success.
– Third step: Present the data to date for and against, including relevant historical programs that are capable of discriminating outcomes.
– Fourth step: Generate baseline forecasts of how the data will evolve if the thesis is correct, and vice versa. Track data and report.
Concept will be piloted this year
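One way to picture the four steps as a record to be tracked over time: the sketch below is purely illustrative (the class and field names are invented; the briefing does not describe PARCA's pilot format).

```python
# Hypothetical record for tracking a framing assumption through the four
# steps above. Field names are invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class FramingAssumption:
    statement: str                    # e.g. "Prototype design is production-ready"
    link_to_success: str              # step 1: how it contributes to program success
    metrics: list[str] = field(default_factory=list)          # step 2
    evidence_for: list[str] = field(default_factory=list)     # step 3
    evidence_against: list[str] = field(default_factory=list)
    forecast_if_true: str = ""        # step 4: how the data should evolve
    forecast_if_false: str = ""

fa = FramingAssumption(
    statement="Weight (critical for vertical lift) is known",
    link_to_success="Weight growth would break performance and cost targets",
    metrics=["measured weight vs. spec by build"],
    forecast_if_true="Weight stays flat across builds",
    forecast_if_false="Weight grows as in historical tactical aircraft",
)
```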
Summary
Sharpen tools and invent new ones
– Ongoing and never-ending
Improve how we gather what we already know
– New DAES assessment process this summer
Change the way we think about programs
– Framing assumptions piloted this year