© 2010 Carnegie Mellon University
® CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
V&V Principles
Verification and Validation Summit 2010
Mike Phillips, Software Engineering Institute, Carnegie Mellon University
V&V Principles, Phillips, Oct 14, 2010
SEI Background
Funded by the U.S. government as a federally funded research and development center (FFRDC)
Created in 1984 and administered by Carnegie Mellon University
Headquartered in Pittsburgh, Pennsylvania; offices and support worldwide
My Background…
• Performed developmental testing for USAF
• Planned and then managed testing for B-2
• Managed CMMI since 2000
• Led development of CMMI for Acquisition
• Currently on F-22 Team for “sustaining modernization”
SEI Mission and Strategy
Mission: The SEI advances software engineering and related disciplines to ensure systems with predictable and improved quality, cost, and schedule.
How Do You Want to Work?
Without process:
• Random motion – lots of energy, not much progress
• No teamwork – individual effort
• Frequent conflict
• You never know where you’ll end up
With process:
• Directed motion – every step brings you closer to the goal
• Coordinated efforts
• Cooperation
• Predictable results
Processes can make the difference!
CMMI®: Continuous Representation

Capability Level 0: Incomplete
Capability Level 1: Performed
• GP 1.1 Perform Base Practices
Capability Level 2: Managed (+ base practices of Level 1)
• GP 2.1 Establish an Organizational Policy
• GP 2.2 Plan the Process
• GP 2.3 Provide Resources
• GP 2.4 Assign Responsibilities
• GP 2.5 Train People
• GP 2.6 Manage Configurations
• GP 2.7 Identify and Involve Relevant Stakeholders
• GP 2.8 Monitor and Control the Process
• GP 2.9 Objectively Evaluate Adherence
• GP 2.10 Review Status with Higher Level Management
Capability Level 3: Defined (+ generic practices of Level 2)
• GP 3.1 Establish a Defined Process
• GP 3.2 Collect Improvement Information
Error Correction Costs By Phase

[Chart: the relative cost to correct an error rises steeply over time, from detailed design through implementation, integration, and validation to operation, illustrating the value of fixing defects early.]
Late Discovery of System-Level Problems

[Chart: across the phases requirements engineering, system design, software architectural design, component software design, code development, unit test, integration test, system test, and acceptance test, the chart shows where faults are introduced, where faults are found, and the estimated nominal cost of fault removal, which escalates from 1x at requirements through 5x, 16x, and 40x to roughly 110x in the latest phases.]

Sources: NIST Planning Report 02-3, The Economic Impacts of Inadequate Infrastructure for Software Testing, May 2002; D. Galin, Software Quality Assurance: From Theory to Implementation, Pearson/Addison-Wesley (2004); B.W. Boehm, Software Engineering Economics, Prentice Hall (1981).
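The cost-escalation figures cited above lend themselves to a quick back-of-the-envelope calculation. A minimal Python sketch, using the nominal 1x/5x/16x/40x/110x removal-cost multipliers from the chart; the mapping of each multiplier to a named phase is an illustrative assumption, not taken from the slide:

```python
# Nominal fault-removal cost multipliers (NIST 02-3 / Boehm); the
# phase-to-multiplier mapping here is an illustrative assumption.
COST_MULTIPLIER = {
    "requirements": 1,
    "design": 5,
    "coding": 16,
    "testing": 40,
    "operation": 110,
}

def relative_rework_cost(faults_removed_by_phase):
    """Total rework cost, in 'requirements-phase fix' units."""
    return sum(COST_MULTIPLIER[phase] * count
               for phase, count in faults_removed_by_phase.items())

# The same 100 faults, removed early versus late:
early = relative_rework_cost({"requirements": 80, "design": 20})  # 180
late = relative_rework_cost({"testing": 80, "operation": 20})     # 5400
```

In this model, removing the same faults late costs roughly 30 times more, which is the quantitative argument for early verification and validation.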
Using the Three CMMI Constellations in V&V

• CMMI-DEV provides guidance for measuring, monitoring, and managing product development (spec compliance, user satisfaction)
• CMMI-ACQ provides V&V guidance to acquisition leadership on technical progress
• CMMI-SVC provides guidance for those providing testing services

16 core process areas are common to all three constellations.
Capability Profiles with Multiple Constellations

[Bar chart: capability levels (0 to 3) profiled across the process areas VER, VAL, AVER, ATM, CAM, and PPQA.]
Visibility into the Team’s Performance

[Diagram: from operational need through transition, the acquirer (CMMI for Acquisition) proceeds through acquisition planning, RFP preparation, solicitation, source selection, program leadership insight/oversight, and system acceptance; the developer (CMMI for Development) plans, designs, develops, integrates and tests, and delivers; the service provider (CMMI for Services) supports the plan, design, test, maintain, and decommission lifecycle.]
CMMI-SVC content

16 core process areas and 1 shared PA (SAM), plus these service-specific PA additions:
• Capacity and Availability Management
• Service Continuity
• Service System Development
• Strategic Service Management
• Incident Resolution and Prevention
• Service Delivery
• Service System Transition
Many Approaches are Synergistic
Improving interfaces is of interest to both government and industry; one synergistic approach is the Test Maturity Model integration (TMMi).
Analysis – Test Metrics

[Bar chart, 2005 through 2008: percentage of defects (0% to 50%) by category: Coverage, Legacy/External, Mistake, Communication, Process, Training.]

Coverage and Legacy continue to be the two largest categories. Mistakes decreased, a good thing!

Source: ITEA Annual Symposium 2009, presented by the 46 Test Wing from Eglin Air Force Base, FL
Analysis – Test Metrics

[Bar chart: number of minor and major defect saves (0 to 350) by phase: Initiation, Requirements, Design, Implementation, Integration Testing, System Testing, Acceptance Testing, Completion.]

Amount of defect saves (major + minor) = 1147778 last year

Source: ITEA Annual Symposium 2009, presented by the 46 Test Wing from Eglin Air Force Base, FL
Analysis – Test Metrics

Major Defect Save Containment Matrix (rows: phase in which detected; columns: phase in which injected)

              Init  Req't  Proj Plan  Design  Imp  Int Test  Sys Test  Acc Test
Initiation      0
Req't           0     9
Project Plan    0     0      0
Design          0     2      0         3
Imp             0     4      0         1     29
Int Test        0     0      0         0     11      5
Sys Test        0     1      0         0      3      0         4
Acc Test        0     0      0         0      2      0         0         1
Completion      0     0      0         0      0      0         0         0

The diagonal shows good containment. Checklists have been updated to address/catch these issues in the future. This project did not accomplish design.

Source: ITEA Annual Symposium 2009, presented by the 46 Test Wing from Eglin Air Force Base, FL
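A within-phase containment rate can be computed directly from such a matrix. A minimal Python sketch, with the nonzero cell values transcribed from the matrix on this slide (zero-only rows and columns omitted):

```python
# matrix[detected_phase][injected_phase] = major defect count,
# transcribed from the containment matrix on this slide.
MATRIX = {
    "Req't":    {"Req't": 9},
    "Design":   {"Req't": 2, "Design": 3},
    "Imp":      {"Req't": 4, "Design": 1, "Imp": 29},
    "Int Test": {"Imp": 11, "Int Test": 5},
    "Sys Test": {"Req't": 1, "Imp": 3, "Sys Test": 4},
    "Acc Test": {"Imp": 2, "Acc Test": 1},
}

def containment_rate(matrix):
    """Fraction of defects detected in the same phase they were injected."""
    total = sum(n for row in matrix.values() for n in row.values())
    same_phase = sum(row.get(phase, 0) for phase, row in matrix.items())
    return same_phase / total

print(f"{containment_rate(MATRIX):.0%}")  # prints "68%"
```

The diagonal (same-phase) entries account for 51 of the 75 defects, so roughly two-thirds of the major defects were caught in the phase that injected them.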
Selected Results: Quality Performance (Organization 3)
Quantitative Measures: Quality Performance Results Summary
Measure used by the organization, with performance result:
• Defect density by severity, ML5 compared to ML3 (Organization 1): 62.5% fewer high-severity defects with ML5 projects
• Defect density in circuit board design (Organization 2a): 65% improvement
• Defect containment by phase (Organization 3): the fix of defects within the phase they were injected increased by 240%
• Defect containment, ML5 compared to ML3, by phase per KLOC (thousands of lines of code) (Organization 2b): defect containment improved 13%
• User acceptance test defects per KLOC (Organization 7): less than 0.15 defects per KLOC
• Percentage of defects removed prior to system test (Organization 7): greater than 85%
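The per-KLOC measures in this summary reduce to simple ratios. A minimal Python sketch with hypothetical sample counts; only the 0.15 defects/KLOC and 85% thresholds come from Organization 7's reported results above:

```python
# Sketch of the per-KLOC quality measures; the sample counts below
# (6 defects, 50,000 lines, 90 of 100 removed early) are hypothetical.
def defects_per_kloc(defects, lines_of_code):
    """Defect density: defects per thousand lines of code."""
    return defects / (lines_of_code / 1000)

def fraction_removed_before_system_test(removed_early, total_defects):
    """Share of all defects removed prior to system test."""
    return removed_early / total_defects

uat_density = defects_per_kloc(defects=6, lines_of_code=50_000)      # 0.12
early_share = fraction_removed_before_system_test(90, 100)           # 0.90

# Organization 7's reported thresholds:
assert uat_density < 0.15
assert early_share > 0.85
```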
What Have We Missed?
Now let’s chat….
CMMI Backup Charts
CMMI-ACQ v1.2: Acquisition Category Process Areas

• Acquisition Requirements Development
• Solicitation & Supplier Agreement Development
• Agreement Management
• Acquisition Technical Management
• Acquisition Validation
• Acquisition Verification

Plus the 16 core process areas.
Acquisition Technical Management - Goals
SG 1: Evaluate Technical Solutions
Supplier technical solutions are evaluated to confirm that contractual requirements continue to be met.
SG 2: Perform Interface Management
Selected interfaces are managed.
Acquisition Verification - Specific Goals
SG 1: Prepare for Verification
Preparation for verification is conducted.
SG 2: Perform Peer Reviews
Peer reviews are performed on selected work products.
SG 3: Verify Selected Work Products
Selected work products are verified against their specified requirements.
Acquisition Verification - Summary
Verification includes
• Selecting work products for verification
• Establishing a verification environment
• Establishing criteria and procedures
• Preparing for and conducting peer reviews
• Analyzing peer review data
• Performing verification
• Analyzing verification results and identifying corrective actions
Acquisition Validation - Goals
SG 1: Prepare for Validation
Preparation for validation is conducted.
SG 2: Validate Selected Products and Product Components
Selected products and product components are validated to ensure that they are suitable for use in their intended operating environment.
Acknowledgements
Slides noted from the ITEA Annual Symposium 2009 were presented by Ms. Kathy Reid of the 46 Test Wing from Eglin Air Force Base, FL.
This work was created in the performance of Federal Government Contract Number FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 252.227-7013.
This Presentation may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at [email protected].
NO WARRANTY
THIS MATERIAL OF CARNEGIE MELLON UNIVERSITY AND ITS SOFTWARE ENGINEERING INSTITUTE IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.