No CAM Left Behind: APL's Incremental Approach to EVMS Implementation
Steve Shinn and Howard Hunter
2009 NASA Project Management Challenge, February 24, 2009
Agenda
• Brief APL overview
• In the beginning
• Factors that precipitated EVMS development
• APL Space Department approach to EVMS
  - Management support
  - Graduated approach
• Implementation details
  - Architecture
  - "Practice" during mission phases A/B
  - Training - "No CAM left behind"
  - Reporting
• EV system highlights
• Where are we now?
• Closing
• Questions
APL, a Division of The Johns Hopkins University
• School of Arts and Sciences
• Whiting School of Engineering
• Carey Business School
• Bloomberg School of Public Health
• School of Medicine
• School of Nursing
• Applied Physics Laboratory
• Nitze School of Advanced International Studies
• The Peabody Institute
APL Profile
• Not-for-profit DoD-chartered “University Affiliated Research Center” (UARC)
• Staffing: 4,500+ employees (70% scientists and engineers)
• $960M annual portfolio
• Business areas:
  - Air and Missile Defense
  - Biomedicine
  - Civilian Space
  - Homeland Protection
  - Infocentric Operations
  - National Security Space
  - Precision Engagement
  - Science and Technology
  - Strategic Systems
  - Undersea Warfare
  - Warfare Analysis
[Map: Main Campus, Montpelier Research Park, South Campus]
• Main campus in Maryland: 400 acres, 50+ buildings
• Thirty locations across the United States
Civilian Space: World-Class Science and Innovative Engineering
Critical Challenge: Answer fundamental space and earth science questions for our sponsors, our nation, and humanity
Key Programs:
• TIMED - Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics
• MESSENGER - MErcury Surface, Space ENvironment, GEochemistry, and Ranging
• CRISM - Compact Reconnaissance Imaging Spectrometer for Mars
• New Horizons - Mission to Pluto/Charon
• STEREO - Solar Terrestrial Relations Observatory
• RBSP - Radiation Belt Storm Probes
APL Spacecraft: 1996–2007
[Timeline chart, 1993-2007: MSX, NEAR, ACE, TIMED, CONTOUR, MESSENGER, New Horizons, and STEREO, each shown from PDR to I&T and from arrival at the I&T facility through launch]
In the Beginning: Real Quotes Prior to EVM Implementation
“Earned Value doesn’t work, and I have the empirical evidence to prove it.”
“If [EVMS] takes up too much of my time, I’m going to leave. I really don’t have time for this. I came here to do science!”
“EV is a waste of time – I have my own system that I developed.”
“EVMS money should be spent on science!”
“Our engineers and scientists are sensitive to new processes. We can’t have a single engineer quit because of your EVM system.”
“I don’t control the resources. How am I supposed to be responsible for this?”
“I don’t need this earned value stuff. All I need is a functional WBS and a good system engineer.”
Factors That Precipitated EVM Implementation
From 1979 to 1996, APL was singularly successful at delivering programs within a few percentage points of the planned cost at completion.
• Eight spacecraft bus programs finished within ±8% of the initial cost estimate made at the start of Phases C/D.
• More recent missions and instruments, however, experienced cost growth.
To combat future cost and schedule growth, APL implemented several new processes and systems:
• EVM was introduced as a project management tool.
• State-of-the-practice capabilities for cost/schedule analysis and estimating were needed to complement EVM and attack the cost-growth challenge.
5-year vision: Create an organic cost/schedule estimating and analysis function within the Space Department (SD) that incorporates state-of-the-practice methods for resource estimation and analysis.
EVMS Part of Mission Delivery
The APL Quality Management System states that:
The Laboratory’s policy is to make critical contributions to critical national challenges through the application of science and technology.
All products for our sponsors meet requirements for intended use as well as schedule and cost. We are committed to continual improvement.
Cost and Schedule Development and Management Standards are intended to define standard cost and schedule planning and management practices for all SD projects and programs. This includes:
Cost planning
Cost management
Schedule planning
Schedule management
EVMS
This charter formed the basis for our move to EVMS.
Financial Management Support to the Project Lifecycle
APL Vision for EVMS
Provide project management with a consistent, standard framework for assessing project performance on major projects.
Proposal/Initial Project Concept: Provide up-front planning and a clearer definition of work using a standardized Work Breakdown Structure (WBS).
For project initiation through delivery:
• Integrate cost and schedule for each phase of the project; utilize the Responsibility Assignment Matrix (RAM) and the Work Statement Authorization (WSA).
• Utilize management best practices.
• Monitor and control the project.
• Provide an early warning system.
• Make use of regular, rigorous estimates at completion (EACs) to ensure better understanding of future work.
• Think of EVMS as one part of the CAM project management tool suite.
Exploit EVMS, where practical, for integrated, efficient collection and reporting of 533, contract performance reporting (CPR), risk management, and CADRe data.
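The EAC in the vision above follows from the standard earned-value relationships; a minimal sketch in Python, using hypothetical figures rather than data from any APL project:

```python
def eac(bac: float, bcwp: float, acwp: float) -> float:
    """CPI-based estimate at completion: actual cost to date plus
    the remaining budgeted work scaled by cost efficiency."""
    cpi = bcwp / acwp          # cost performance index (earned / actual)
    etc = (bac - bcwp) / cpi   # estimate to complete the remaining work
    return acwp + etc

# Hypothetical control account: $2.0M budget at completion,
# $0.8M earned so far at an actual cost of $1.0M.
print(round(eac(bac=2_000_000, bcwp=800_000, acwp=1_000_000)))  # 2500000
```

Because the CPI here is 0.8, the remaining $1.2M of budgeted work is projected to cost $1.5M, giving a $2.5M EAC: the "early warning" the slide describes.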
Making the Vision a Reality: What Worked
• Management support - critical to success
• "What's in it for me?" - show value to users
• Keep it simple
• Incremental approach
  - Don't expect to have an ANSI 748A-compliant system on day 1
  - Always show utility
• "One-System Approach" - collaborate closely with sponsors and subcontractors
• Open communication
APL Management Support
With a foundation in place, management took a series of steps to support the EV implementation.
• Promoted a cultural shift, from senior department managers through program managers to lead engineers (CAMs), toward the use of an EVMS to manage a project (part of the program management discipline)
• Implemented an EVMS prior to external requirements
  - Use of EVMS prior to key decision point (KDP) C
  - SD utilization of EVM on projects >$15M
  - EV steering committee established
"We use EVM not because we have to but because it's the right thing to do." As people witness senior management "buy in," the process change gets easier.
Our Approach: One CAM at a Time
New system users (or processes/policies) can be organized into three groups:
The Advocates – These are the people who recognize a good idea and embrace new systems that may help them do their jobs.
Get as many of these people on your side as you can. They become the "town criers" who will ultimately help get the system embraced.
The Uninformed/Indifferent – This group may be either unaware of the concepts or unsure if the system will ultimately help them.
You need to show the value of the system to this group ASAP. If they like what they hear, they will become advocates. If you are unable to show value, this group will join the group below.
The Naysayers – This group often resists any change. Perhaps they have been successful for a long time and see no reason for change. This group is often sarcastic and/or skeptical of any communications.
Minimize this group’s effectiveness. You can’t completely ignore them (think termites in wood), but you can’t devote too much time to them. Limit their damage while keeping others from joining.
Graduated Approach
Time of Flight board ($3M, 2005): EV utilized
• Managed by exception
• EV deemed beneficial to management of the project
Secondary project ($4M, 2006)
• Leveraged this project to demonstrate the discrete measurement of performance for subcontractor effort and material
• Aided in developing the change control process (CBB, UB, BCR, and MR)
First mission: RBSP (2007)
• IMP/IMS and related schedule reports
• Cost and schedule integration: WBS/dictionary, WSAs, RAM, CPRs
• Data reviewed each month with CAMs in a workshop format
• Supported monthly sponsor meetings; reviewed schedule and CPR data
• EVMS was used through Phase B of the RBSP project; Phase C/D cost and schedule baselines are being established.
Small successes built positive “press”.
Graduated Approach (cont.)
Project Management Control System (PMCS) Description
• The description explains the processes, procedures, and methodologies used by the SD for the planning and control of projects through the use of the EVMS.
Training
• The initial training course has been completed; over 175 individuals have been trained.
• Hands-on workshops are held every month on EVMS-active programs during data reviews.
  - Ensures just-in-time training
  - Lower training cost than intensive 3/5/7-day EVMS courses
Graduated Approach for ANSI Compliance
First: Implement the most meaningful pieces of the system.
Second: Ensure that each portion of your system is ANSI-748A compliant from the start.
• You may not implement all elements, but make sure that the ones you do implement are reasonably sound.
Third: Perform a gap analysis to determine holes.
Fourth: Determine a plan/path to complete compliance.
Fifth: Plan remaining activities and stick to your schedule!
Start small! A simple spreadsheet summary can get you started.
Earned Value Report - Project XXXXX
Month End 10/31/2005, Cumulative to Date

WBS    Description                       %Com   BCWS      BCWP      ACWP      Sched Var   Cost Var   Budget
1.110  Management & Administration        16%   63,791    42,618    44,264    -21,173     -1,646     269,959
1.120  Reliability & Quality Assurance    12%   24,755    31,098    15,711    6,344       15,388     251,862
1.210  Engineering                        17%   242,632   156,780   72,923    -85,853     83,857     915,936
1.220  Integration & Testing               0%   0         0         0         0           0          35,488
1.240  Instrument Calibration              0%   0         0         0         0           0          21,154
1.310  Lo Science                          0%   0         0         0         0           0          109,057
1.320  Hi Science                          0%   0         0         0         0           0          81,573
1.810  Phase E                             0%   0         0         0         0           0          263,604
1.820  Phase E                             0%   0         0         0         0           0          374,848
1.830  Phase E                             0%   0         0         0         0           0          404,338
1.RMA  Rocket & Mission Analysis          46%   16,427    16,944    22,019    518         -5,075     37,000
Total                                           361,978   261,813   168,691   -100,165    93,122     2,779,194
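The same summary can be produced in a few lines of code; a sketch that derives the schedule and cost variances the spreadsheet reports, using two of the WBS rows above as sample inputs:

```python
# Minimal earned value summary, mirroring the spreadsheet report.
# Each row: (WBS element, BCWS, BCWP, ACWP) in dollars.
rows = [
    ("1.110 Management & Administration", 63_791, 42_618, 44_264),
    ("1.210 Engineering",                242_632, 156_780, 72_923),
]

for wbs, bcws, bcwp, acwp in rows:
    sv = bcwp - bcws   # schedule variance: earned minus planned
    cv = bcwp - acwp   # cost variance: earned minus actual cost
    print(f"{wbs}: SV={sv:+,} CV={cv:+,}")
```

Note that the signs carry the meaning: a negative SV means work is behind plan, and a negative CV means the work performed cost more than budgeted.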
Collaboration and Cooperation
Too many systems are “private” and/or stove-piped.
Involve stakeholders in the system.
Minimize the “us versus them” mentality.
APL PMs have worked with NASA counterparts to utilize EV data.
Collaborative environment
System is based on NASA policies (7120.5D) to ensure that projects don’t “reinvent the wheel.”
Work with subcontractors and instrument providers early and often.
Early engagement is critical to receiving timely information.
Make EVMS part of a larger, integrated project management suite.
Integrate risk management, scheduling, funding, project milestones, and system engineering tools
Our Solution to Common EVMS Pitfalls
• To keep EVMS from being an administrative burden, focus on value first and then on administrative elements.
• To minimize a "gotcha" audit mentality, maintain a collaborative environment.
• To lower EVM system implementation costs, leverage existing project management processes.
Bottom line: The data are only as good as the people reviewing them. Many of the elements of EVMS are simply good project management.
APL EVM Long-Term Goals
Continue to foster organizational change with demonstrated EV capability
Integrated approach to the reporting of the 533, CPR, and CADRe
Enterprise system in place
EVMS compliant with ANSI/EIA-748A guidelines
Provide a meaningful management tool to the APL project management community
Implementation Details
EVMS Architecture
[Architecture diagram. Principal components and data flows:]
• MS Project (.MPP) - resource-loaded schedules
• Project Server - schedule data for multiple projects; access and control, alerts through the web, status updates
• ForProject - schedule data; Budgeted Cost of Work Performed (BCWP) calculation
• Cobra - EV data analysis, reporting and baseline control, and rate and efficiency variances; consumes Actual Cost of Work Performed (ACWP) and Budgeted Cost of Work Scheduled (BCWS)
• wInsight - EV trend analysis and reporting
• SharePoint Toolkit - central online repository for project performance data
• Data Warehouse and RMIS accounting interface - Actuals Translation Database and Accruals Estimator System; feeds resources into the Operating Plan
“Practice, Practice, Practice” (Phases A/B)
RBSP project
Phase A:
• IMP/IMS and related schedule reports
• Cost and schedule integration
• WBS, WBS dictionary, WSAs, RAM, and CPRs
• Schedule management plan and PMCS draft
• CAM training
Phase B:
• Data reviewed each month with CAMs (workshop format)
• SharePoint data repository established for SD projects
• IMS developed
• Operating Plan/EAC demonstrated in the CPR
• Supported monthly sponsor meetings; reviewed schedule and CPR data
• Monthly EV review with PM and deputy PMs (management by exception)
• Baseline status review completed successfully
Baby Steps
Focus on value:
• Agreement between stakeholders (functional supervisors and CAMs)
• Processes developed
• Not a time burden
• Not a punitive system
• Management by exception
• Ability to drill down into the data
• Information readily available
“No CAM Left Behind”
Work closely with the CAM:
• Schedule development
• Understanding the data from the EVMS
• Frequent interactive meetings
• Mentoring relationship
• Part of the team
Work with the PM:
• He/she is our most important customer
• Value added to the PM team (internal and sponsor)
• Provide key insight and analysis of the data
The focus of the EV team should be mission success.
Training
Generic EVMS training:
• Basic EV
• Control Account Manager (CAM) duties/description
• EV types/accounting considerations
• Analysis and reporting
Emphasis placed on advanced skills through CAM/PM workshops:
• Variance analysis
• MR
• Baseline change control
• CPR
• EAC/estimate to complete (ETC)/Operating Plan updates
• Integrated baseline review
Reports Available Through EVMS

Section S - Schedule Reports
 #    Description                       Frequency   Format
 S1   Schedule Health Check             Monthly     XLS
 S2   CAM Status Update Request         Monthly     XLS
 S3   CAM Schedule Update               Monthly     XLS
 S4   Resource Orphan Report            Monthly     PDF
 S5   Resource Mismatch Report          Monthly     PDF
 S6   3-Month Lookahead                 Monthly     PDF
 S7   Critical Path                     Monthly     PDF
 S8   Schedule Update Narrative         Monthly     DOC
 S9   Baseline Schedule Variance        Monthly     PDF
 S10  Detailed Schedule                 Monthly     MPP
 S11  Deliverables: 3-Month Lookahead   Monthly     XLS
 S12  Phase B Milestones                Monthly     PDF
 S13  Phase B Deliverables              Monthly     PDF

Section E - Earned Value Reports
 E1   Contract Performance Report, Formats 1, 2, 3, 4   Monthly   XLS
 E2   Contract Performance Report, Format 5             Monthly   XLS
 E3   Cost Variance Chart (wInsight)                    Monthly   PDF
 E4   Schedule Variance Chart (wInsight)                Monthly   PDF

Section F - Financial Reports
 533 Monthly Financial Management Report     Monthly     XLS
 533 Quarterly Financial Management Report   Quarterly   XLS

Section T - Trending
 T1   Total Milestones          Monthly   XLS
 T2   IMP/Contract Milestones   Monthly   XLS
 T3   Total Slack               Monthly   XLS
 T4   Estimate at Completion    Monthly   XLS
 T5   Management Reserve        Monthly   XLS
CPR – Sponsor
[Sample Contract Performance Report, Format 1 (Work Breakdown Structure), OMB No. 0704-0188: contractor The Johns Hopkins University Applied Physics Laboratory (Johns Hopkins Road, Laurel, MD 20723); contract NAS5-012345, type CPFF; program G-MIG, Phase B; report period 2008/07/01 to 2008/07/31; negotiated cost and contract budget base $75,500,000; authorized representative Shinn, Steve O., Program Manager. The performance data section lists, for each NASA WBS element, current-period and cumulative BCWS, BCWP, ACWP, schedule and cost variances, reprogramming adjustments, and budgeted and estimated at-completion values.]
wInsight Charts
[wInsight cost/schedule performance charts, April-July 2008 (dollars in thousands): cumulative BCWS 23,169; BCWP 18,515; ACWP 15,999; CBB/TAB 77,320; contractor's estimate 56,173. The accompanying drill-down table lists, per WBS element, BCWS, BCWP, ACWP, SV, CV, BAC, LRE, VAC, and CPI/SPI with trend indicators; a companion view flags schedule, cost, and at-completion variances for management-by-exception review.]
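The indices and flags in drill-down views like these come from a handful of formulas; a minimal sketch of deriving CPI, SPI, and VAC and flagging a control account for management-by-exception review. The 10% trigger and the input figures are hypothetical, not APL's or wInsight's actual rules:

```python
def flag_account(bcws, bcwp, acwp, bac, lre, threshold=0.10):
    """Return EV indices and whether the account needs CAM attention.

    threshold is a hypothetical 10% efficiency-variance trigger for
    management-by-exception review.
    """
    cpi = bcwp / acwp if acwp else 0.0   # cost performance index
    spi = bcwp / bcws if bcws else 0.0   # schedule performance index
    vac = bac - lre                      # variance at completion
    needs_review = abs(1 - cpi) > threshold or abs(1 - spi) > threshold
    return cpi, spi, vac, needs_review

# Hypothetical control account, dollars in thousands.
cpi, spi, vac, review = flag_account(
    bcws=1_200, bcwp=1_000, acwp=1_250, bac=3_000, lre=3_400)
print(f"CPI={cpi:.2f} SPI={spi:.2f} VAC={vac} review={review}")
```

Surfacing only the accounts where `needs_review` is true is what lets the PM and CAM manage by exception instead of re-reading every line of the CPR each month.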
IMS Schedule Reports
S11: 3-Month Lookahead
S7: Critical Path
SharePoint
SharePoint Toolkit – Central online repository for project performance data (EVMS Portal)
• Includes all projects using EV
Data available through SharePoint:
• WBS, WBS dictionary
• IMS
• Work authorizations
• EV and scheduling reports by month (internal and external)
• Baseline versus Operating Plan versus Actuals Report
• EV report: indices, BCWS, BCWP, ACWP, SV, CV, etc.
An electronic, interactive, real-time CAM notebook!
SharePoint – CAM Notebook
Categories of monthly reports
Monthly Earned Value Reports
Where Are We Now?
Program Manager and SD Management Involvement in Our Current EVMS
The EVMS:
• Provides the PM with a consistent, standard framework for assessing project performance
• Provides the PM with a sound program management tool that generates data in a format that enables management by exception
• Provides the PM with early detection of problems, enabling faster response and corrective action
The PM:
• Is the single point of integrated responsibility for project technical, schedule, and cost performance
• Signs off on final monthly EV data
The CAM:
• Establishes and maintains the schedule and budget
• Reports performance and assesses earned value
• Monitors costs charged to the control account
• Generates an estimate at completion
• Reviews variance analyses and develops and implements corrective action, as required
APL senior management:
• Is provided with an EV analysis via the weekly executive management meeting
EVMS: Comments from Our PMs
• Helped to pinpoint areas where development was behind schedule and there was potential risk for cost growth
• Weighted milestone tracking has helped me understand how far behind/ahead the team is regarding performance
• The EVMS will be even more beneficial as we move through the design, build, and test phases of the project
• Allowed insight into trending across the project and enhanced the management decision process
• The benefit of oversight far outweighs the burden of implementation
Closing: Making EVMS Work
• Don't implement EVMS for the sake of implementation.
• Show value at every step of the process.
• Integrate EVM with all of your project management tools.
  - EVM is NOT an island!
• Avoid an audit-like mentality.
  - EVM is a management tool, not an accounting tool.
• Use GAO, OMB, and NASA documents as a guide to ensure that you're on solid footing.
• EVMS really is a good thing!
  - Reap the benefits of the only management system in the world that is both analytical and predictive!
• Rest assured: You won't be leaving your CAMs behind.
Questions?
It must be remembered that there is nothing more difficult to plan, more doubtful of success, more dangerous to manage, than the creation of a new system.
For the initiator has the enmity of all who would profit by the preservation of the old institutions and merely lukewarm defenders in those who would gain by the new ones.
Niccolò Machiavelli
(First attempt to implement an EVMS for a defense system, Florence, Italy, circa 1509)