
© 1996 (Permissive), INCOSE INCOSE-TP-1996-002-01 (Originally CAWG-1996-01-1.50)

Systems Engineering Capability Assessment Model

The Model to Assess Systems Engineering Capability for Integrated Systems and Integrated

Product and Process Development (IPPD)

Document Number: INCOSE-TP-1996-002-01

Version 1.50a

June 1996

This document was prepared by the Capability Assessment Working Group of the International Council on Systems Engineering (INCOSE). It has received the unanimous approval of the INCOSE Technical Board.


Copyright © 1996 by INCOSE: This work is a collaboration effort of the members of the INTERNATIONAL COUNCIL ON SYSTEMS ENGINEERING (INCOSE). Permission to reproduce this product and to prepare derivative works from this product is granted royalty-free provided this copyright notice is included with all reproductions and derivative works.


Preface

This document was prepared by the Capability Assessment Working Group of the International Council on Systems Engineering (INCOSE). It has received the approval of the INCOSE Technical Board.

INCOSE technical products are developed within the working groups of INCOSE. Members of the working groups serve voluntarily and without compensation. They are not necessarily members of INCOSE. The reports developed within INCOSE represent a consensus of the broad expertise on the subject within INCOSE, as well as of those outside INCOSE who have expressed an interest in participating in the development and improvement of this report.


Table of Contents

1 INTRODUCTION
   1.1 GENERAL
   1.2 ENDORSEMENT OF THE SECAM
   1.3 SECAM AND ITS ASSESSMENT METHOD
   1.4 WHY THE SECAM WAS DEVELOPED
   1.5 ACKNOWLEDGEMENTS
   1.6 ADDITIONAL COPIES / GENERAL INFORMATION ON INCOSE
   1.7 INFORMATION ON THE INCOSE SECAM

2 DEVELOPMENT OF THE SECAM
   2.1 APPROACH
   2.2 INCREMENTAL DEVELOPMENT
   2.3 PLANNED FUTURE EXTENSIONS

3 INCOSE SECAM, VERSION 1.50
   3.1 GENERAL INFORMATION
      3.1.1 Process Maturity
      3.1.2 Systems Engineering Capability
      3.1.3 Some Limitations
      3.1.4 Applicability
         3.1.4.1 Use
         3.1.4.2 Product Diversity
         3.1.4.3 Product Life Cycle
         3.1.4.4 Size of Systems Engineering Organizations
      3.1.5 Use of the INCOSE SECAM
   3.2 STRUCTURE OF THE INCOSE SECAM
      3.2.1 Process Categories
      3.2.2 Key Focus Areas (KFAs)
         3.2.2.1 Distribution of KFAs
         3.2.2.2 Introductory Text
         3.2.2.3 General Characteristics
         3.2.2.4 Questions
      3.2.3 Relationship to Systems Engineering
      3.2.4 SECAM Capability Levels
      3.2.5 Interpretation of Questions
   3.3 TRACEABILITY MATRICES
   3.4 GLOSSARY
   3.5 RELATIONSHIP TO OTHER STANDARDS

4 DETAILED REVISION HISTORY & PARTICIPANTS
   4.1 VERSION 1.00
   4.2 VERSION 1.10
   4.3 VERSION 1.20
   4.4 VERSION 1.30/1.31
   4.5 VERSION 1.40
   4.6 VERSION 1.41 (PRELIMINARY)
   4.7 VERSION 1.50
   4.8 VERSION 1.50A

5 INCOSE SYSTEMS ENGINEERING CAPABILITY ASSESSMENT MODEL
   5.1 CATEGORY 1 MANAGEMENT PROCESS CATEGORY
      5.1.1 KFA 1.1 Planning
      5.1.2 KFA 1.2 Tracking and Oversight
      5.1.3 KFA 1.3 Subcontract Management
      5.1.4 KFA 1.4 Inter-group Coordination
      5.1.5 KFA 1.5 Configuration Management
      5.1.6 KFA 1.6 Quality Management
      5.1.7 KFA 1.7 Risk Management
      5.1.8 KFA 1.8 Data Management
   5.2 CATEGORY 2 ORGANIZATION PROCESS CATEGORY
      5.2.1 KFA 2.1 Process Management and Improvement
      5.2.2 KFA 2.2 Competency Development
      5.2.3 KFA 2.3 Technology Management
      5.2.4 KFA 2.4 Environment and Tool Support
   5.3 CATEGORY 3 SYSTEMS ENGINEERING PROCESS CATEGORY
      5.3.1 KFA 3.1 System Concept Definition
      5.3.2 KFA 3.2 Requirements & Functional Analysis
      5.3.3 KFA 3.3 System Design
      5.3.4 KFA 3.4 Integrated Engineering Analysis
      5.3.5 KFA 3.5 System Integration
      5.3.6 KFA 3.6 System Verification
      5.3.7 KFA 3.7 System Validation

6 GLOSSARY

APPENDIX A DETAILED REVISION HISTORY
   A.1 DRAFT VERSION 1.00 FEBRUARY 1994
   A.2 DRAFT VERSION 1.01 JUNE 1994
   A.3 DRAFT VERSION 1.10 JULY 1994
   A.4 VERSION 1.20 NOVEMBER 1994
   A.5 VERSION 1.30 APRIL 1995
   A.6 VERSION 1.31 APRIL 1995
   A.7 VERSION 1.40 MAY 1995
   A.8 VERSION 1.41 (PRELIMINARY) OCTOBER 1995 - JANUARY 1996 (NEVER FORMALLY RELEASED)
   A.9 VERSION 1.50 (PRELIMINARY) APRIL 1996

APPENDIX B TRACEABILITY MATRICES
   6.1.1.1 V1.4

APPENDIX C RELATIONSHIP TO OTHER STANDARDS

APPENDIX D SYSTEMS ENGINEERING CAPABILITY ASSESSMENT MODEL QUESTIONNAIRE

List of Figures

FIGURE 1.3-1. SUB-DOCUMENTS COMPRISING THE SECAM ASSESSMENT METHOD
FIGURE 2.1-1. INCOSE SECAM DEVELOPMENT AND APPLICATION HISTORY
FIGURE 2.1-2. BUILD-TEST-ANALYZE-BUILD CONCEPT FOR SECAM DEVELOPMENT
FIGURE 3.1-1. SCOPING THE ASSESSMENT OBJECTIVE
FIGURE 3.1-2. SEPAS CONDUCTED TO DATE USING THE INCOSE SECAM
FIGURE 3.1-3. SEPA LIFE CYCLE COVERAGE
FIGURE 3.1-4. INCOSE SECAM SCORING PROFILE EXAMPLE
FIGURE 3.2-1. STRUCTURE OF THE INCOSE SECAM
FIGURE 3.2-2. KEY FOCUS AREAS BY PROCESS CATEGORY
FIGURE 3.2-3. EXAMPLE ATTRIBUTES OF SYSTEMS ENGINEERING CAPABILITY
FIGURE 3.2-4. CLASSES OF CAPABILITY ATTRIBUTES WITHIN INCOSE SECAM
FIGURE 3.2-5. EXAMPLE GENERIC ATTRIBUTES
FIGURE 3.2-6. EXAMPLE VERTICAL THEME ATTRIBUTES
FIGURE 3.2-7. ORGANIZATION OF INCOSE SECAM QUESTIONS WITH RESPECT TO CAPABILITY LEVELS
FIGURE 4-1. SECAM IMPROVEMENT SUMMARY


1 INTRODUCTION

1.1 GENERAL

The INCOSE Systems Engineering Capability Assessment Model (SECAM) Model Description, hereinafter referred to simply as the INCOSE SECAM, is a product of the Capability Assessment Working Group (CAWG) of the International Council On Systems Engineering (INCOSE). The SECAM is to be used in conjunction with the SECAM Assessment Method to evaluate systems engineering capability. When this is done properly, INCOSE believes that an organization practicing systems engineering in an integrated systems or integrated product and process development (IPPD) environment can evaluate its capability to perform systems engineering and determine areas for potential improvement.

1.2 ENDORSEMENT OF THE SECAM

The CAWG is one of four working groups that comprise the Measurement Technical Committee of the Technical Board of INCOSE. At the INCOSE Winter Workshop in January 1996, the INCOSE SECAM and the SECAM Assessment Method received:

• The unanimous endorsement of the INCOSE Technical Board for publication as an INCOSE Technical Paper.

[Note: A Technical Paper is a designation given to all approved products of the INCOSE Technical Board, including handbooks, models, reports, etc.]

This was the first product of an INCOSE Working Group to achieve the INCOSE Technical Board's approval.

Previously, at the Fifth Annual International Symposium of NCOSE (now INCOSE) held in July 1995, several events of significance occurred with respect to the INCOSE/CAWG SECAM and the SECAM Assessment Method:

• The INCOSE/CAWG SECAM and supporting documents received unanimous endorsement of the Measurement Technical Committee.

• The INCOSE/CAWG SECAM and supporting documents received unanimous approval by the Technical Board to be released as an "interim" Technical Paper.

[Note: The word "interim" is set off for emphasis; this designation was a precursor to the complete approval that was obtained in January 1996.]

• The Board of Directors unanimously resolved: "INCOSE will publicize and disseminate the CAWG developed Systems Engineering Capability Assessment Model (SECAM) and supporting documents as the INCOSE model and method to assess systems engineering process maturity, using the funds already allocated to the Technical Board budget line item."


1.3 SECAM AND ITS ASSESSMENT METHOD

Both the SECAM and the SECAM Assessment Method (CAWG-1996-02-1.50) are required to conduct an effective assessment of systems engineering capability. The application of the SECAM and its assessment method to an organization is referred to as a Systems Engineering Process Assessment (SEPA). A SEPA is an organized activity conducted to:

• Assess, or measure, an organization's current state of systems engineering capability.

• Identify problem areas.

• Provide a vector for growth in capability.

The INCOSE SECAM may be viewed as the tool used to measure, or assess, the organization during the SEPA. It contains information on: (1) the development of the SECAM; (2) the application of the SECAM to organizations producing products and/or providing services in various product domains and phases of the product life cycle; (3) significant features of the structure and organization of the SECAM; and (4) the capability model itself.

The SECAM Assessment Method provides the means to assist in and standardize the application of the SECAM, thereby permitting repeatability of the results. It consists of a number of sub-documents as shown in Figure 1.3-1.

The SECAM Questionnaire is a part of the SECAM Assessment Method and is one of the primary means of obtaining the information that leads to the development of a scoring profile and findings (weaknesses and strengths) resulting from a SEPA. The Questionnaire is available as a stand-alone document (CAWG-1996-03-1.50) in order to ensure that ancillary information (e.g., instructions for answering the Questionnaire, the Glossary, etc.) is not inadvertently removed when providing the Questionnaire to SEPA participants. Separation of the Questionnaire from the SECAM Assessment Method is a practical consideration based upon experience; in actuality, it is part of the SECAM Assessment Method.

SECAM Assessment Method Document:

• SECAM Assessment Method Overview

• SECAM Assessment Planning Tools
  - Assessment Process Flow
  - Activity Descriptions
  - On-Site Phase Work Breakdown Structure
  - On-Site Phase Planning Schedule

• SECAM Data Gathering Tools
  - SECAM Questionnaire
  - Exploratory Questions

• SECAM Scoring Tools
  - Numeric Scoring Method
  - Heuristic Scoring Method

• SECAM Presentation Templates
  - Assessment Team Briefing
  - Findings Briefing
  - Action Plan

Figure 1.3-1. Sub-Documents Comprising the SECAM Assessment Method
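To make the idea of a scoring profile concrete, the following minimal Python sketch shows one plausible way questionnaire answers could be aggregated per Key Focus Area (KFA) and capability level. The KFA names, the 0-to-1 answer scores, and the simple averaging rule are illustrative assumptions only; the actual Numeric and Heuristic Scoring Methods are defined in the SECAM Assessment Method, not here.

    # Illustrative sketch only: the real scoring rules are defined in the
    # SECAM Assessment Method (Numeric and Heuristic Scoring Methods).
    # The KFA names, levels, and averaging rule below are assumptions.
    from collections import defaultdict

    # Each questionnaire answer is tagged with its Key Focus Area (KFA) and
    # the capability level the question probes; the score is 0.0 to 1.0.
    answers = [
        {"kfa": "1.1 Planning",     "level": 1, "score": 1.0},
        {"kfa": "1.1 Planning",     "level": 2, "score": 0.5},
        {"kfa": "3.2 Requirements", "level": 1, "score": 1.0},
        {"kfa": "3.2 Requirements", "level": 2, "score": 0.0},
    ]

    def scoring_profile(answers):
        """Average the answers for each (KFA, capability level) pair."""
        grouped = defaultdict(list)
        for a in answers:
            grouped[(a["kfa"], a["level"])].append(a["score"])
        return {key: sum(s) / len(s) for key, s in grouped.items()}

    for (kfa, level), score in sorted(scoring_profile(answers).items()):
        print(f"{kfa}, level {level}: {score:.2f}")

A per-KFA profile of this general kind is the sort of output suggested by the scoring profile example referenced in the List of Figures (Figure 3.1-4).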

1.4 WHY THE SECAM WAS DEVELOPED

The CAWG was formed by INCOSE in October 1992. The charter of the CAWG was formally defined by the consensus of its members who attended the NCOSE (now INCOSE) Business Meeting in January 1993. The charter is:

"To lead a broad-based NCOSE (now INCOSE) initiative to develop a method for assessing and improving the efficiency and effectiveness of systems engineering."

To achieve its charter, the CAWG adopted two goals:

• Develop a Capability Assessment Model for Systems Engineering, i.e., the SECAM.

• Gain industry (customers and standards organizations) and Government acknowledgment and acceptance of this Model.

By the conclusion of its May 1993 meeting, the CAWG had informally surveyed many of the existing models that could be used, or easily extended, to assess systems engineering capability. Based upon this survey, the CAWG decided to actively pursue development of the SECAM in order to permit the capability assessment of systems engineering. This decision was made for several reasons:

• There was a pressing, near term need to perform capability assessments of systems engineering within industry and Government.

• A significant portion of the CAWG membership believed a model did not currently exist that could adequately assess systems engineering capability as defined by the requirements generated by the CAWG for such a model.

• It was felt that any initiatives to develop a model outside the efforts of the NCOSE/CAWG would not produce an adequate model for at least a couple of years.

Since the SECAM's initial release, the CAWG has continued to improve it. Improvements have been made in coordination with other elements of INCOSE. For instance, the taxonomy of the INCOSE Metrics Guidebook for Integrated Systems and Product Development, developed by the Metrics Working Group of the INCOSE Measurement Technical Committee, was influenced by the SECAM. Conversely, the treatment of metrics within the SECAM was influenced by the Metrics Guidebook and by review of the SECAM by members of the Metrics Working Group.


The intent in developing and improving the SECAM and its Assessment Method is to ensure that INCOSE members and others are given every opportunity to learn about and use them, as well as other INCOSE measurement products, to measure and improve their systems engineering activities, thereby enhancing the business position of the company or government entity being assessed. These products are available to become an integral part of the strategic planning activities of both industry and government.

1.5 ACKNOWLEDGEMENTS

The INCOSE SECAM and its Assessment Method represent the work of many individuals from industry, government, and academia. Within the systems engineering community, attempts have been made during the development of the INCOSE SECAM to gain a broad cross section of both authorship and review. Those individuals who have provided input as an author, reviewer, or both are listed in Section 4.0 under the particular version to which they contributed. In addition to the names shown, many other individuals supported the work of some of those listed or provided verbal feedback during a systems engineering process assessment that resulted in improvement. The INCOSE effort could not have been successful without the participation of all concerned. Their contributions are greatly appreciated for having helped generate these pioneering products to measure systems engineering capability.

1.6 ADDITIONAL COPIES / GENERAL INFORMATION ON INCOSE

Copies of the INCOSE SECAM, SECAM Assessment Method, and SECAM Questionnaire may be obtained from the INCOSE Central Office. General information on INCOSE, membership information, and copies of other INCOSE products (e.g., the INCOSE Metrics Guidebook for Integrated Systems and Product Development) may be obtained from the INCOSE Central Office. Communications with the INCOSE Central Office may be made via:

International Council on Systems Engineering
2033 Sixth Avenue, Suite 804
Seattle, WA 98121
E-mail: [email protected]
Telephone: (800) 366-1164 (in Seattle, use 206-441-1164)
Facsimile: (206) 441-8262

1.7 INFORMATION ON THE INCOSE SECAM

General information on the INCOSE SECAM and other INCOSE products, information on INCOSE technical policies, goals, and strategic planning, as well as technical activities of the INCOSE Technical Board, can be obtained from:

Dr. Brian M. McCay
Chair, INCOSE Technical Board
Mitretek Systems
25 Burlington Mall Road
Bedford, MA 01803-4141
E-mail: [email protected]
Telephone: (617) 229-5329
Facsimile: (617) 229-5301

Information on the INCOSE SECAM and the other products of the INCOSE Measurement Technical Committee (e.g., the Metrics Guidebook), strategic planning and coordination of the development of measurement products, as well as general information on the activities of the four working groups within the Measurement Technical Committee, can be obtained from:

Mr. E. Richard Widmann
Chair, INCOSE Measurement Technical Committee and Member of the INCOSE Technical Board
(Chair of the Capability Assessment Working Group, 1992-1995)
Hughes Aircraft Company
Electro-Optical Systems
PO Box 902
El Segundo, CA 90245
E-mail: [email protected]
        [email protected]
Telephone: (310) 616-7685
Facsimile: (310) 616-1432

Specific information on the INCOSE SECAM and its Assessment Method, development plans for these products, and the activities of the INCOSE CAWG, as well as requests for CAWG-facilitated systems engineering process assessments (SEPAs), can be obtained from:

Mr. Blake A. Andrews
Chair, INCOSE Capability Assessment Working Group and Member of the INCOSE Measurement Technical Committee
(Co-Chair, Capability Assessment Working Group, 1993-1995)
Rockwell, Collins Air Transport Division
400 Collins Road, NE
Cedar Rapids, IA 50498
E-mail: [email protected]
Telephone: (319) 395-4922
Facsimile: (319) 395-6042

or,

Mr. John Worl
Co-Chair, INCOSE Capability Assessment Working Group
Battelle
4000 NE 41st Street
Seattle, WA 98105-5428
E-mail: [email protected]
Telephone: (206) 528-3219
Facsimile: (206) 528-3552


2 DEVELOPMENT OF THE SECAM

The measurement of systems engineering capability is a relatively new field. A synopsis of the approach taken to develop the INCOSE SECAM and its Assessment Method, their incremental development, and application history is provided in this section.

2.1 APPROACH

The CAWG was formed by INCOSE in October 1992. At the first meeting, the members of the CAWG decided to follow a classical systems engineering approach in attempting to achieve the charter and goals adopted by the working group (refer to section 1.4). The approach taken was to:

• Define the requirements for a systems engineering capability assessment model (SECAM).
• Generate a top-level plan for the development of the SECAM.
• Seek out information on currently available models that offered the potential for relatively easy modification to a SECAM (this survey was conducted in order to avoid prematurely adopting a point solution).
• Adopt a design approach for the SECAM.
• Develop the SECAM in an incremental manner.
• Seek feedback on the utility of the SECAM in order to continually improve and extend the model.
• Develop a methodology to assess or evaluate the SECAM against the requirements.

An inherent part of this systems engineering approach is the concept of auditing or reviewing CAWG products at critical points in time, with the intent of assessing progress and determining if a change in direction is warranted. For instance, the requirements developed for the SECAM have been re-examined and updated several times. As experience with model building for systems engineering continues, the requirements will continue to be re-examined and updated.

In the same manner, re-examination of the INCOSE SECAM has resulted in incremental improvements. Five major version updates, and a number of minor version updates, have been generated since the initial version was completed in February 1994. The SECAM Assessment Method, initially completed in March 1994, has also undergone a number of improvements. A historical perspective of the development of the INCOSE SECAM and its Assessment Method is indicated in Figure 2.1-1, which also includes the applications of the SECAM in various systems engineering process assessments (SEPAs). It is anticipated that future version updates to the INCOSE SECAM and its Assessment Method will be made to further refine these products, since the measurement of systems engineering capability is a recently developed, and not extensively charted, field.

The development of the INCOSE SECAM was accomplished by gaining a broad cross section of both authorship and review within the systems engineering community. The content and structure of the SECAM has developed to the point where it can be described as "a model developed by systems engineers to assess (measure) systems engineering capability". Those individuals who have provided input as authors, reviewers, or both are listed in Section 4 under the particular version to which they provided a contribution.


[Figure 2.1-1 is a timeline graphic, not reproducible in this text version, spanning August 1992 through June 1996. It charts: the formation of the Capability Assessment Working Group (Oct 1992); the pre-CAWG proto-models and SEPAs A-D (Loral LCCS, Grumman, Loral LSRS, Hughes); SECAM Versions 1.00, 1.10, 1.20, 1.30/1.31, 1.40, 1.41 (Preliminary), 1.50 (Preliminary), and 1.50; releases of the supporting documents (SECAM Assessment Method); SEPAs 1-16 (Rockwell, CSC, WHC, Raytheon, Hughes, USDA, BMI, TRW, Honeywell, DOE/RL, Boeing, AT&T, Lucent); INCOSE "interim" approval of the SECAM and supporting documents; full INCOSE approval; and the invitation of the INCOSE SECAM as a base document for the US SC7 TAG and ISO.]

Figure 2.1-1. INCOSE SECAM Development and Application History


• “Built" initial Version 1.00 o Merged proto-models o Based upon favorable assessment results

• “Tested" initial Version 1.00 in two SEPAs. • “Built" next two version updates (1.10, 1.20) based upon:

o Incorporation of lessons learned from SEPAs. o Analyses performed upon the SECAM. o Comments by individual reviewers.

• “Tested" Version 1.20 in a SEPA. • “Built" Version 1.30 based upon:

o Minor observations from the SEPA. o Analysis performed on the SECAM.

• “Built" Version 1.40 based upon: o Incorporation of lessons learned from the SEPA. o Analysis performed and a major re-examination of the SECAM. o Wide range of review and comment from within INCOSE.

• “Tested" Version 1.40 in a SEPA. • “Built" Version 1.41 (Preliminary - Nov 1995) based upon:

o Minor observations from the SEPA. • “Tested" Version 1.41 (Preliminary - Nov 1995) in two SEPAs. • “Built" Version 1.41 (Preliminary - Jan 1996)

o Feedback and minor observations obtained from SEPAs • “Tested" Version 1.41 (Preliminary - Jan 1996) & KFA 1.8 in a SEPA • “Built" Version 1.50 (Preliminary - Apr, May, June, 1996) based upon:

o Incorporation of lessons learned and major observations from three SEPAs.

o Analysis performed and re-examination of the SECAM. o Wide range of review and comment from within INCOSE.

• “Tested" Version 1.50 (Preliminary - May, June 1996) in nine SEPAs. • “Built" Version 1.50 based upon:

o Continued review and analysis. o Lessons learned from SEPAs.

Figure 2.1-3. Build-Test-Analyze-Build Concept for SECAM Development

2.2 INCREMENTAL DEVELOPMENT

The systems engineering approach used to develop the INCOSE SECAM utilizes the concept of "build-test-analyze-build". Using this concept, the then-current version of the INCOSE SECAM is applied using the SECAM Assessment Method during the conduct of a systems engineering process assessment (SEPA) at one or more corporate or government entities. The INCOSE SECAM and its Assessment Method are then examined based upon the experiential feedback received, lessons learned from the SEPA, analysis of the SECAM, and review comments. An improved, updated version of the SECAM and its Assessment Method is then generated. Application of the "build-test-analyze-build" concept for the INCOSE SECAM is summarized in Figure 2.1-2.
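Stated schematically, the cycle is a simple feedback loop. The short Python sketch below merely paraphrases the concept described above; the function names and data are hypothetical and appear in no INCOSE document.

    # Hypothetical paraphrase of the build-test-analyze-build cycle.
    def build(previous_version, feedback):
        """Fold lessons learned, analyses, and review comments into a new version."""
        return {"based_on": previous_version, "incorporates": feedback}

    def test_in_sepas(version, organizations):
        """'Test' the current model by applying it in SEPAs at one or more entities."""
        return [f"experience from assessing {org}" for org in organizations]

    def analyze(results):
        """Distill experiential feedback and lessons learned from the SEPAs."""
        return f"lessons from {len(results)} SEPA(s)"

    version, feedback = None, "initial CAWG requirements"
    for sepa_round in (["Entity 1", "Entity 2"], ["Entity 3"]):
        version = build(version, feedback)             # build
        results = test_in_sepas(version, sepa_round)   # test
        feedback = analyze(results)                    # analyze
    version = build(version, feedback)                 # ...and build again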


Using this development concept, users of the current version of the INCOSE SECAM and its Assessment Method receive the benefit of the results, experience, and feedback obtained from the previous version. This quick-turnaround, or "quick time to market", philosophy offers a distinct advantage to successive users in this relatively new field of measuring systems engineering capability.

The INCOSE CAWG has maintained configuration management of the SECAM and its Assessment Method. The configuration control permits traceability of changes between different versions. This permits comparison of results obtained from SEPAs conducted using different versions of the SECAM.

2.3 PLANNED FUTURE EXTENSIONS

At present, the INCOSE SECAM is considered to only partially fulfill the requirements developed by the CAWG for a Model for the Capability Assessment of Systems Engineering. The requirements for a complete model must address at least three major types of capability attributes: process, people, and technology. The INCOSE SECAM focuses to a large extent on process attributes. However, non-process attributes are also included in this version of the SECAM, as well as in Version 1.40, albeit to a lesser extent.

The INCOSE SECAM can be viewed as filling a void, since the CAWG believes that a Capability Assessment Model for Systems Engineering that satisfies all requirements defined by the CAWG does not currently exist. All other sophisticated models (known to the CAWG) for the capability assessment of systems engineering are currently process-based only. This is viewed as a weakness by both the CAWG and the INCOSE Technical Board. The concentration on process-only attributes is one of the reasons that the term "Interim" was included in the name of prior versions of the INCOSE SECAM, since it was realized that process constituted only a subset of the requirements necessary to assess true systems engineering capability. This term has been dropped from Version 1.50 of the INCOSE SECAM, since that version does address some aspects of systems engineering capability other than process.

At the October 1994 meeting of the CAWG, three conceptual frameworks for extension of the INCOSE SECAM were presented. Each conceptual framework examined includes process, people, and technology. These conceptual frameworks were proposed by Dr. Bill Mackey (Computer Sciences Corporation) and are based upon the work accomplished by his team prior to the meeting. The CAWG will consider these concepts, as well as others, as the basis for development of a complete model structure that meets its requirements.


3 INCOSE SECAM, VERSION 1.50

3.1 GENERAL INFORMATION

3.1.1 PROCESS MATURITY

The concept of process maturity was initially developed in the software engineering domain approximately a decade ago. In the last few years, interest in this concept has grown in intensity and has spread to other engineering domains, including systems engineering. There is a growing consensus that technical disciplines can benefit by transitioning from a low level of process maturity to increasingly higher levels of process maturity.

A number of process maturity models have been developed that address a variety of traditional and specialty engineering disciplines. The definition of exactly what constitutes each level of process maturity varies somewhat according to the particular process maturity model being considered. However, a general trend can be observed across different models: with increasing process maturity comes increased structure and formalization of the particular process under consideration. Organizations in disciplines such as systems engineering that are at the lowest level of process maturity are generally characterized as having ad hoc, or perhaps chaotic, systems engineering processes. Above that, but still within the lower levels of process maturity, organizations are characterized as having no standard processes, executing their processes instead in an informal or intuitive manner. Such organizations are often considered reactive in that much of their activity focuses on solving daily crises, often referred to as "fire fighting". In achieving the requisite technical performance for their products, schedule and budget constraints are often exceeded. Product quality from these organizations is often difficult to predict and usually not repeatable.

This is not to say that an organization with low process maturity cannot produce a high-quality product in the face of adversity. Experienced systems engineers and systems engineering managers, when confronted with significant problems, can intuitively tailor or disregard their formal systems engineering processes and overcome the situation. However, the knowledge of how to do this usually resides within key individuals, the so-called "heroes", who, when they leave the effort, take with them the knowledge and experience gained. More junior and less sophisticated members of the team benefit only slightly from the situation in terms of experience gained from overcoming these problem areas, since their knowledge of the formal process is usually weak and no attempt is made to capture lessons learned in a historical data base from which they could learn.

Conversely, organizations in disciplines, such as systems engineering, that have a high level of process maturity are generally characterized as following a disciplined process; i.e., the process used to produce the product has been institutionalized within the organization. Such organizations are considered proactive in that much of their activity focuses on anticipating problems before they occur and taking corrective action to minimize the impact on the effort. These organizations attempt to control their environment, rather than allowing the environment to control the organization. In achieving the requisite technical performance for their products, schedule and budget constraints are usually satisfied. These organizations usually maintain a historical data base of past performance and lessons learned upon which they base their approach to each new effort. Product quality from these organizations is much more repeatable and predictable.


3.1.2 SYSTEMS ENGINEERING CAPABILITY

The concept of systems engineering capability is what distinguishes the SECAM from other complex models that are process-based only. Prior to the development of the SECAM, all sophisticated models that sought to measure an organization used process maturity as the basis for their assessment. The Capability Assessment Working Group has explored the concept of capability maturity versus process maturity and decided to pursue a broader approach toward the assessment of an organization, i.e., one that includes both process and non-process indicators of capability. It is believed that this will provide a more comprehensive measurement of true systems engineering capability.

Figure 3.1-1 presents a concept of how performance, capability, and process maturity are related. This figure is not necessarily complete and some of its content is subject to continuing debate. However, it illustrates the problems faced by an organization desiring to understand and improve its abilities.

[Figure 3.1-1 is a diagram, not reproduced here, showing Performance decomposed into Capability (Process, People, Technology) and Capacity (Resources, Control, Agility).]

Figure 3.1-1. Scoping the Assessment Objective

Ideally, an organization desires to predict and manage its performance. In this model, performance is comprised of two elements: capability and capacity. The capability of an organization is characterized by its processes, the skills of the people it employs, and the technology that can be brought to bear upon its problem domain. However, capability alone is an incomplete indication of performance. Organizations may have the capability to accomplish technical feats but, unless they also have the capacity needed to accomplish their objectives in a reasonable time-frame, the capability is diminished. Capacity in this model is characterized by having a sufficient quantity of the right resources, having the necessary infrastructure to manage those resources effectively, and having the agility within the organization to quickly respond to changes in political, economic, and business arenas.
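For readers who prefer a structural statement of Figure 3.1-1, the decomposition can be written down directly. The Python sketch below simply restates the figure's groupings; the 0-to-1 ratings are purely hypothetical and imply no particular scoring scheme.

    # A restatement of Figure 3.1-1 as a data structure. The attribute
    # groupings come from the figure; the numeric ratings are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Capability:   # what the organization is able to do
        process: float
        people: float
        technology: float

    @dataclass
    class Capacity:     # whether it can perform at the needed scale and pace
        resources: float
        control: float
        agility: float

    @dataclass
    class Performance:  # performance comprises capability and capacity
        capability: Capability
        capacity: Capacity

    example = Performance(
        capability=Capability(process=0.7, people=0.8, technology=0.6),
        capacity=Capacity(resources=0.5, control=0.6, agility=0.4),
    )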

The INCOSE SECAM attempts to assess all aspects of systems engineering capability. In the early stages of its development, the SECAM was essentially a process-based assessment tool. During this phase of its development, the SECAM was sometimes referred to as an "Interim Model". Beginning with Version 1.40, non-process attributes of systems engineering capability were included within the model. The inclusion of non-process attributes was increased in Version 1.50 of the SECAM, and the term "Interim Model" has been dropped. INCOSE intends to continue incrementally improving and extending the SECAM to explore the concept of assessing capability, and ultimately predicting performance, in this context.

3.1.3 SOME LIMITATIONS

The current version of the INCOSE SECAM, designated as Version 1.50, is the fifth major update of this model. The SECAM is still, to a large extent, a process based model. As mentioned in section 2.0, INCOSE views process attributes as representing only one element, or dimension, of true systems engineering capability. Process maturity indicators were developed first because they were deemed easiest to develop and had received the most attention in other efforts, e.g., International Standards Organization (ISO) initiatives, and other disciplines, e.g., software. However, Version 1.50 of the SECAM, and to a lesser degree Versions 1.40 and 1.41 (Preliminary), have been extended to include non-process indicators of systems engineering capability. INCOSE believes that these non-process indicators represent some of the high leverage characteristics of systems engineering capability. The INCOSE CAWG intends in the future to continue extending the SECAM to include additional process and (especially) non-process indicators of systems engineering capability.

Another limitation of the SECAM is its validation. The assessment of systems engineering in terms of capability, or even process-based maturity, is a relatively new and not extensively explored field. As of June 1996, twenty (20) systems engineering process assessments (SEPAs) have been performed on systems engineering organizations using the SECAM and its predecessor proto-models, which were merged to create Version 1.00. Though experience with using the INCOSE SECAM has grown and each SEPA was deemed of value to the organizations that were assessed, this still does not provide enough of an experience base to completely ascertain the utility of the model or to validate that it assesses (measures) "true" systems engineering capability.

The validation of the INCOSE SECAM, as with any other model of this type, will only be gained through extensive testing and analysis over time. An inherent component of this validation will be to demonstrate quantitatively that improvement in systems engineering capability, as measured by assessment results, can lead to a more effective organization. The SECAM is not validated, but it does represent the ideas of its developers, most of whom are systems engineers, with regard to what is needed to assess systems engineering capability. To date, INCOSE is unaware of any model developed to assess systems engineering capability, or process maturity, that has been validated quantitatively.

With respect to software, various claims have been made regarding improvements in productivity and efficiency resulting from growth in software process maturity. It is premature, however, to make such claims with regard to systems engineering. Only after numerous systems engineering organizations have been assessed (baselined), developed and implemented "get well" plans to correct identified deficiencies, and been reassessed (re-measured) can sufficient data be obtained to determine with some confidence the value of pursuing this avenue of growth in capability assessments. Implicit in making these claims will be measures of how systems engineering's impact on product development has affected "the bottom line", such as quicker time to market, lower product cost, reduction in cycle time, etc.

The intent here is not to paint a picture of pessimism, total uncertainty, or negativity regarding the use of the SECAM. Rather, it is to offer a word of caution given the limited (though growing) amount of experiential data presently available regarding its use.


3.1.4 APPLICABILITY

Pertinent information regarding systems engineering process assessments (SEPAs) conducted to date using the INCOSE SECAM and its Assessment Method is provided here, and elsewhere, in order to indicate the applicability of these products and the extent to which they have been used. Similarly, a synopsis of the development history of these products has been provided in Section 2.0.

3.1.4.1 USE

As illustrated in Figure 2.1-1, the INCOSE SECAM and its predecessor proto-models have been applied in twenty (20) SEPAs at sixteen (16) different corporate or government entities since August 1992. The three proto-models were merged to form the initial version of the SECAM, designated Version 1.00. Figure 3.1-2 provides a list of SEPAs conducted to date. The participating corporate and government entities reported their experience using the SECAM or its proto-models to be of value to them in identifying deficiencies and problem areas for improvement. One of these companies is using the SECAM as a structure upon which to improve its systems engineering processes and has used the model to conduct another SEPA to re-assess progress made against its incremental improvement plan. This same company intends to continue using the SECAM to incrementally measure its process improvements.

It is important to note that the CAWG has used the lessons learned and feedback received from these SEPAs as a source for improving the SECAM. The CAWG intends to improve and attempt to validate the SECAM through successive application of the model to various systems engineering activities.

3.1.4.2 PRODUCT DIVERSITY

Until recently, INCOSE has been viewed as an organization composed principally of contractors whose primary customer is the Department of Defense (DOD). The membership of most of the working groups of INCOSE, including the CAWG, was similarly composed.

The first four SEPAs were conducted using the INCOSE SECAM predecessor proto-models on four corporate entities that developed products or provided services primarily in the DOD product domain. Since systems engineers from primarily DOD contractors were predominant (in terms of numbers) in the development of the SECAM, it was not surprising that these systems engineers viewed the favorable results and experiences from the first four SEPAs as an indication of the utility of the three proto-models to measure systems engineering capability. The merging of these three proto-models into the initial version of the SECAM seemed a reasonable choice.

Initial use of the INCOSE SECAM itself was by non-DOD contractors. Referring to Figure 2.1-1, in 1994 the INCOSE SECAM was applied in a CAWG-facilitated SEPA at a commercial avionics developer (Rockwell Collins Air Transport Division) and in a self-administered SEPA at a National Aeronautics and Space Administration (NASA) contractor (Computer Sciences Corporation).


SEPA #  Corporate or Government Entity                        Date          SECAM Version
A       Loral Command & Control Systems                       Aug 1992      Proto-Model "L"
B       Grumman                                               Oct 1992      Proto-Model "G"
C       Loral Space & Range Systems                           Jan 1993      Proto-Model "L"
D       Hughes Aircraft Company, Electro-Optical Systems      May-Jun 1993  Proto-Model "H"
1       Rockwell, Collins Air Transport Division              Mar 1994      1.00
2       Computer Sciences Corporation                         Mar-Apr 1994  1.00
3       Westinghouse Hanford Company (WHC)                    Feb 1995      1.20
4       Westinghouse Hanford Company (WHC)                    Sep 1995      1.40
5       Raytheon Missile Systems Division                     Nov-Dec 1995  1.41 Prelim (Nov 1995)
6       Hughes Aircraft Company, Electro-Optical Systems      Dec 1995      1.41 Prelim (Nov 1995)
7       US Department of Agriculture, North Central Soil      Mar 1996      1.41 Prelim (Jan 1996,
        Conservation Research Laboratory                                    & KFA 1.8)
8       Battelle Memorial Institute (BMI)                     May-Jun 1996  1.50 Prelim (May 1996)
9       Westinghouse Hanford Company (WHC)                    May-Jun 1996  1.50 Prelim (May 1996)
10      TRW System Integration - Hanford, WA                  May-Jun 1996  1.50 Prelim (May 1996)
11      Honeywell Industrial Automation & Controls Division   May-Jun 1996  1.50 Prelim (May 1996)
12      US Department of Energy, Richland Operations          May-Jun 1996  1.50 Prelim (May 1996)
13      Boeing Defense and Space Group (BDSG)                 Jun 1996      1.50 Prelim (May 1996)
14      US Department of Agriculture, North Central Soil      Jun 1996      1.50 Prelim (May 1996)
        Conservation Research Laboratory
15      AT&T Corporation                                      Jun-Jul 1996  1.50 Prelim (Jun 1996)
16      Lucent Technologies                                   Jun-Jul 1996  1.50 Prelim (Jun 1996)

Figure 3.1-2. SEPAs Conducted to Date Using the INCOSE SECAM

In February 1995, the INCOSE SECAM was applied in a CAWG-facilitated SEPA at a Department of Energy (DOE) contractor (Westinghouse). In September 1995, Westinghouse undertook a second CAWG-facilitated SEPA, making it the first company to use the INCOSE SECAM to measure improvement in its systems engineering capability against a baseline established in a previous SEPA. A DOE observer participated in both SEPAs. In May-June 1996, Westinghouse undertook a third CAWG-facilitated SEPA, making it the first company to use the SECAM in three consecutive SEPAs.

Beginning in November 1995, two DOD contractors, Raytheon and Hughes, used the INCOSE SECAM to perform self-administered SEPAs. Prior to this, Hughes had used one of the predecessor proto-models to perform a self-assessment.


Members of the USDA North Central Soil Conservation Research Laboratory reviewed the INCOSE SECAM and conducted a self-administered SEPA in March 1996. The same USDA organization conducted a second SEPA in June 1996.

Battelle Memorial Institute used the INCOSE SECAM to conduct a self-administered SEPA across three of its corporate entities (Battelle Seattle Research Center, Battelle Pacific Northwest National Laboratories, and Battelle Columbus) in May-June 1996. Battelle provides a wide range of services and products to many organizations, including DOD, DOE, and others.

TRW Systems Integration Group in Richland, Washington, supports several ongoing large engineering activities at the DOE's Hanford site in eastern Washington. TRW systems engineers from two of these projects, the Tank Waste Remediation System and the Spent Nuclear Fuels Project, participated in the most recent assessment.

Honeywell Industrial Automation and Controls Division completed a CAWG-facilitated SEPA in June 1996. This division of Honeywell is a world leader in the development, deployment, and life-cycle maintenance of industrial control systems.

Boeing Defense and Space Group (BDSG) applied the INCOSE SECAM to conduct a self-administered assessment. This was BDSG's first use of the INCOSE SECAM.

AT&T and Lucent Technologies are conducting self-administered self assessments during June and July 1996. Both SEPAs are being coordinated by an INCOSE member.

Each of these organizations provides varied services and products and applies systems engineering in different technological contexts. The SECAM has been used successfully by a wide range of organizations producing products representative of the following product domains:

• Department of Defense (DOD)
• National Aeronautics and Space Administration (NASA)
• Commercial Aerospace
• Department of Energy (DOE)
• Department of Agriculture
• Industrial Automation & Control
• Communications Equipment & Services
• Public Research

3.1.4.3 PRODUCT LIFE CYCLE

The INCOSE SECAM has been successfully applied to organizations that develop products or provide services representative of all phases of the product life cycle, as indicated in Figure 3.1-3. The entries in the left column of this figure indicate the particular SEPAs that have been conducted to date, as listed in Figure 3.1-2 and illustrated in Figure 2.0-1. For instance, "A" in Figure 3.1-3 represents SEPA-A in Figures 2.0-1 and 3.1-2, conducted using one of the SECAM predecessor proto-models at Loral Command and Control Systems in August 1992. Similarly, "7" in Figure 3.1-3 represents SEPA-7 in Figures 2.0-1 and 3.1-2, conducted using the SECAM at the US Department of Agriculture, North Central Soil Conservation Research Laboratory, in March 1996.

SEPAs "A" through "D" in Figure 3.1-3 were conducted using the pre-SECAM proto-models, which were merged to form the initial version of the SECAM. The organizations participating in these SEPAs tended to be developing products or providing services that were in the middle phases of the product life cycle, i.e., development, production, and post-production support.

SEPAs "1" through "16" in Figure 3.1-3 involved organizations that cumulatively were developing products or providing services spanning the entire product life cycle, i.e., concept exploration through disposal.

[Figure: chart mapping SEPAs A-D and 1-16 against the life cycle phases Concept Exploration, Demonstration-Validation, Development, Production, Post-Production, and Disposal]

Figure 3.1-3. SEPA Life Cycle Coverage

3.1.4.4 SIZE OF SYSTEMS ENGINEERING ORGANIZATIONS

The size of the systems engineering organizations assessed using the INCOSE SECAM ranges from relatively small (<75 employees), to medium (approximately 350 employees), to large (>1500 employees). The systems engineering organizations assessed were a part of corporate or government entities ranging in size from relatively small (< 300 employees), to medium (approximately 1000 employees), to large (> 3500 employees).

3.1.5 USE OF THE INCOSE SECAM

The application of the INCOSE SECAM in a systems engineering process assessment is described in the SECAM Assessment Method (see paragraph 1.3 and Figure 1.3-1). The intent here is to highlight some salient features of SECAM usage; consult the SECAM Assessment Method for details on applying the SECAM.


The INCOSE SECAM, applied properly using the SECAM Assessment Method, provides a means of baselining (measuring) the current state of a government or corporate entity's systems engineering capability, identifying problem areas, and establishing a vector for capability improvement. This baseline is established from the answers to the questions contained within the questionnaire and from discussions held among systems engineering management, project leaders, and practitioners during the SEPA. The established baseline comprises the following (an illustrative sketch follows the list below):

• A scoring profile based upon a determination of a set of capability levels made separately for each Key Focus Area (KFA) within the model, as indicated in Figure 3.1-4.

• A set of findings (strengths and weaknesses) regarding systems engineering in the organization being assessed.
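For readers who think in data structures, such a baseline might be pictured as a per-KFA record pairing a capability level with findings. The following Python sketch is purely illustrative -- it is not part of the SECAM, and all names in it are hypothetical:

```python
# Illustrative sketch only -- not part of the SECAM itself.
# A SEPA baseline pairs a capability level (0-5) with findings per KFA.
from dataclasses import dataclass, field

LEVELS = ["Initial", "Performed", "Managed", "Defined", "Measured", "Optimizing"]

@dataclass
class KFAResult:
    kfa_id: str                      # e.g., "1.1"
    name: str                        # e.g., "Planning"
    level: int                       # 0 (Initial) through 5 (Optimizing)
    strengths: list = field(default_factory=list)
    weaknesses: list = field(default_factory=list)

# A scoring profile is simply the collection of per-KFA results;
# deliberately, no composite score across KFAs is computed.
profile = [
    KFAResult("1.1", "Planning", 2, strengths=["work breakdown structure in place"]),
    KFAResult("3.2", "Requirements and Functional Analysis", 1,
              weaknesses=["requirements reviews are informal"]),
]
for r in profile:
    print(f"KFA {r.kfa_id} {r.name}: Level {r.level} ({LEVELS[r.level]})")
```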

The determination of an overall score across all KFAs (such as a composite or average score) is discouraged when conducting a SEPA, since the emphasis then becomes one of obtaining a "score" rather than identifying problem areas. INCOSE is interested in self-improvement of systems engineering activities, rather than in external evaluations for the purposes of source selection, which are often characterized by a single "score".

[Figure: example bar-chart scoring profile showing a capability level determined separately for each KFA; the horizontal axis lists the KFAs, including 1.1 Planning, 1.2 Tracking & Oversight, 1.3 Subcontract Mgt, 1.4 Intergroup Coordination, 1.5 Configuration Mgt, 1.6 Quality Mgt, 1.7 Risk Mgt, 1.8 Data Mgt, 2.2 Competency Development, 2.3 Technology Mgt, 2.4 Env & Tool Spt, 3.1 System Concept Definition, 3.2 Reqts & Funct Analysis, 3.3 System Design, 3.4 Integrated Eng Analysis, 3.5 System Integration, 3.6 System Verification, and 3.7 System Validation]

Figure 3.1-4. INCOSE SECAM Scoring Profile Example

The prime focus in conducting the SEPA should not be upon the score obtained in each KFA, but upon a determination by the systems engineering organization of the KFAs that need improvement. Since the emphasis is upon self-improvement (and not obtaining a score for source selection purposes), it is up to the systems engineering organization being assessed to determine whether a low "score" in a particular KFA is unacceptable and needs to be improved, given the business situation being supported.


A SEPA conducted for self-improvement should be done in a collaborative spirit -- the intent is to surface problems in the systems engineering process, not to blame individuals. The most important aspect is taking ownership of problem areas and generating a consensus that improvement needs to be undertaken. Since the emphasis is on "how can we improve", the adversarial relationships often characteristic of external evaluations (for the purposes of source selection) can be avoided -- rather than denying that problems exist, there is an acceptance that they do exist and that there is room for improvement.

For those KFAs that the systems engineering organization has determined are weak and need improvement, an essential element of any properly conducted SEPA is the generation of a plan for improvement -- a "get well" plan. Through the collaborative spirit of conducting a SEPA for self-improvement, the enthusiasm generated during the conduct of the SEPA, total ownership of identified problem areas, and implementation of the "get well" plan, the systems engineering activity should improve its capability and effectiveness. Such improvements in the systems engineering process will not occur instantly; they typically take a considerable period of time -- a minimum of several months. Meaningful improvement usually requires a change in the culture of those involved with systems engineering.

It is important to emphasize that application of the INCOSE SECAM using the Questionnaire and Assessment Method in a SEPA only baselines the systems engineering organization and helps identify problem areas -- by itself, it does not lead to improvement. It is up to the organization to improve its systems engineering activities.

At some reasonable point in time after the improvement plan has been implemented, perhaps six months or more, it would be reasonable to re-apply the INCOSE SECAM in another SEPA to measure progress made against the goals established in the plan. In this sense, the SECAM can be used as a yardstick to measure improvement. Based upon the results of this re-assessment, the systems engineering organization may decide to modify its improvement plan to further optimize its path to achieving a higher level of capability.

A key feature in conducting a SEPA efficiently is the ability to tailor the SECAM and its Assessment Method to the constraints, e.g., cost, political necessities, etc., of the organization being assessed. The SECAM Assessment Method permits a "complete" assessment to be conducted in 3.5 - 4 days. Tailoring of the Assessment Method is also permitted to eliminate materials anticipated to be non-value added, thus reducing the cost of a SEPA. One tailoring approach used in a number of the SEPAs conducted has been to tailor the application of the SECAM Questionnaire to the perceived level of maturity of the organization being assessed. After reviewing the SECAM Questionnaire, the assessment team can make a reasonable judgement as to the anticipated capability level. The participants in the SEPA can then be asked to complete the SECAM Questionnaire for three capability levels only, i.e., the anticipated capability level, the level below, and the level above. This three-capability-level "window" has been used in several SEPAs to tailor the SECAM Questionnaire.
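A minimal sketch of the three-level "window" tailoring described above follows; the function names and the question representation are hypothetical, not part of the Assessment Method:

```python
# Hypothetical sketch of the three-capability-level "window" tailoring.
# Given an anticipated capability level, keep only the questions at that
# level, one level below, and one level above (clamped to levels 1-5,
# since the Initial level 0 contains no questions).

def window_levels(anticipated: int) -> range:
    low = max(1, anticipated - 1)
    high = min(5, anticipated + 1)
    return range(low, high + 1)

def tailor_questionnaire(questions_by_level: dict, anticipated: int) -> dict:
    """questions_by_level maps level (1-5) to a list of question texts."""
    keep = window_levels(anticipated)
    return {lvl: qs for lvl, qs in questions_by_level.items() if lvl in keep}

# Example: an organization anticipated at level 2 answers levels 1-3 only.
print(list(window_levels(2)))  # [1, 2, 3]
```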

3.2 STRUCTURE OF THE INCOSE SECAM

The SECAM is documented in Section 0. Figure 3.2-1 depicts the structure of the INCOSE SECAM and the relationship of this structure to elements of systems engineering in the “real world". These “real world" elements are depicted as ovals. Elements of the SECAM are depicted as boxes.

An organization's systems engineering process should be based upon a documented, widely recognized standard (e.g., EIA 632, IEEE-1220-1994, or SAE ARP 4754). An organization achieves its systems engineering capability by applying such standards to develop systems engineering processes, technology, and people with appropriate skills specific to its needs and product domain.


The INCOSE SECAM is structured into Process Categories, containing Key Focus Areas (KFAs), which, in turn, are comprised of questions used to ascertain a particular capability level for each KFA. Questions within each KFA are ordered in ascending levels of capability.

Each KFA contains six ascending levels of systems engineering capability:

• Initial (Level 0)
• Performed (Level 1)
• Managed (Level 2)
• Defined (Level 3)
• Measured (Level 4)
• Optimizing (Level 5)

A set of questions within each KFA samples the implementation of the related systems engineering attributes in a program/organization (“real world") as identified by the KFA. The questions are arranged in five levels starting with Performed and ending with Optimizing. The Initial level is the lowest and default level if a capability does not exist; it does not contain questions. Questions are used to ascertain at which capability level the activities required by the KFA are being performed.
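To make this containment hierarchy concrete, a minimal illustrative sketch follows. This data model is not part of the SECAM; the single sample question is taken from Figure 3.2-5, and all structure beyond that is assumed for illustration:

```python
# Illustrative data model of the SECAM structure described above:
# Process Categories contain KFAs; each KFA holds its questions grouped
# by capability level 1 (Performed) through 5 (Optimizing). Level 0
# (Initial) is the default level and deliberately has no questions.

secam = {
    "1.0 Management": {
        "1.1 Planning": {
            1: ["Is planning being accomplished in at least an informal manner?"],
            2: [], 3: [], 4: [], 5: [],   # remaining questions omitted here
        },
    },
    "2.0 Organization": {},
    "3.0 Systems Engineering": {},
}
```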

[Figure: diagram relating elements of the SECAM (boxes: Process Categories, Key Focus Areas, Questions (Attributes), Capability Levels) to "real world" elements (ovals: EIA 632 or IEEE 1220 Activities, Organization, Actual SE Activities, SE Capability), with relationships labeled Map into, Contain, Achieved By, Sample, Organized By, Indicate, Implemented By, and Uses]

Figure 3.2-1. Structure of the INCOSE SECAM


The INCOSE SECAM definition of capability, while aligned with ISO SPICE, allows for the existence of:

• Generic Process Attributes
• KFA-Specific Process Attributes
• Generic Non-Process Attributes
• KFA-Specific Non-Process Attributes

within each level of capability for each KFA.

3.2.1 PROCESS CATEGORIES

The aspects of the systems engineering domain contained within the INCOSE SECAM are divided into three broad categories of activities, called Process Categories. Each process category represents a broad class of activities essential to systems engineering. The process categories are:

1. Management - This category focuses on management activities associated with systems engineering. It is program oriented and primarily covers program planning, monitoring, and control functions.

2. Organization - This category focuses on organization-wide activities essential to systems engineering. It is organization oriented and covers business level issues such as process definition and improvement, competency development, technology management, and environment and tool support.

3. Systems Engineering - This category focuses on the technical activities of the systems engineering domain. It is program oriented and covers specific elements found in the systems engineering discipline.

Each Process Category is comprised of a set of related Key Focus Areas (KFAs) which represent the essential aspects of the Process Category. While the overall Process Category structure of the INCOSE SECAM reflects that of a process based model, the content of the KFAs includes both process and non-process attributes of systems engineering capability.

3.2.2 KEY FOCUS AREAS (KFAS)

Each of the three Process Categories is comprised of a set of Key Focus Areas (KFAs). A KFA identifies a set of related attributes that, when accomplished, satisfy a required aspect of the Process Category; a KFA is an essential element of systems engineering as implemented by the organization/program.

3.2.2.1 DISTRIBUTION OF KFAS

Figure 3.2-2 identifies how KFAs are distributed within the INCOSE SECAM by Process Category. The numbering of each KFA within the INCOSE SECAM is hierarchical and reflects the parent Process Category of the KFA, e.g., the Planning KFA is numbered KFA 1.1 since it is the first KFA in the Management Process Category (Process Category number "1"). Numbering of KFAs within Process Categories does not reflect a prioritization of the KFAs; nor is Process Category numbering indicative of prioritization.
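The hierarchical numbering can be illustrated with a small, hypothetical helper (not part of the model):

```python
# Hypothetical helper illustrating the hierarchical KFA numbering:
# "1.1" means the first KFA of Process Category 1 (Management).

CATEGORIES = {1: "Management", 2: "Organization", 3: "Systems Engineering"}

def parse_kfa_number(kfa_id: str) -> tuple:
    category, index = (int(part) for part in kfa_id.split("."))
    return CATEGORIES[category], index

print(parse_kfa_number("1.1"))  # ('Management', 1)
print(parse_kfa_number("3.7"))  # ('Systems Engineering', 7)
```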

1.0 Management | 2.0 Organization | 3.0 Systems Engineering
1.1 Planning | 2.1 Process Management and Improvement | 3.1 System Concept Definition
1.2 Tracking and Oversight | 2.2 Competency Development | 3.2 Requirements and Functional Analysis
1.3 Subcontract Management | 2.3 Technology Management | 3.3 System Design
1.4 Inter-group Coordination | 2.4 Environment and Tool Support | 3.4 Integrated Engineering Analysis
1.5 Configuration Management | | 3.5 System Integration
1.6 Quality Management | | 3.6 System Verification
1.7 Risk Management | | 3.7 System Validation
1.8 Data Management | |

Figure 3.2-2. Key Focus Areas by Process Category

3.2.2.2 INTRODUCTORY TEXT

Each KFA contains introductory text that provides a brief description of the scope and the significant aspects that are representative of the KFA. The introductory text is not necessarily meant to be all-inclusive of the essential elements of performing the activities associated with the KFA.

3.2.2.3 GENERAL CHARACTERISTICS

A set of no more than five general characteristics follows the introductory text, focusing the reader's attention on the primary purpose(s) of each KFA. The general characteristics have served in the development of the model and have demonstrated utility in preparing for its application. The number of general characteristics per KFA has been intentionally limited to the five most significant aspects of each KFA; they are not necessarily all-inclusive of the essential elements of performing the activities required by the KFA.

3.2.2.4 QUESTIONS

Each KFA contains questions which identify specific attributes associated with the KFA. As mentioned in the introduction to this section, the questions are arranged in five sets of ascending levels of capability. A sixth level, "Initial", is an implied default level containing no questions. This level of capability is implied if an insufficient number of attributes exist at the "Performed" level, i.e., the lowest level containing attributes (questions).

3.2.3 RELATIONSHIP TO SYSTEMS ENGINEERING

The INCOSE SECAM is a model to assess (measure) systems engineering capability, as described in Paragraphs 3.1.2 and 3.2.5. It is important to note that the INCOSE SECAM is not a model of systems engineering. The SECAM references recognized systems engineering standards, such as the EIA 632 and IEEE-1220-1994, as the appropriate models of systems engineering to be used by the organization being assessed. SECAM coverage of the EIA 632 and IEEE-1220-1994 standards is provided in Appendix C.


An organization's systems engineering process, if based upon these recognized standards, describes “what" is to be accomplished, and not “how" it is to be accomplished. The “how" should be provided via detailed procedures and methods that will most likely be unique to a business environment, and even to a particular corporate or government entity within a given business environment.

The INCOSE SECAM assesses the “what" aspects to be accomplished by systems engineering and provides an indication of those areas that can be improved. The INCOSE SECAM can be used to assess, or measure, systems engineering as implemented by an organization in an integrated systems or integrated product and process development (IPPD) environment.

3.2.4 SECAM CAPABILITY LEVELS

The SECAM capability levels are used to describe the level of capability with which a corporate or government entity is accomplishing its systems engineering activities. The intent of each capability level is characterized by the example attributes indicated in Figure 3.2-3 (note: this figure is not intended to provide a complete set of attributes for each level of capability for each KFA). There are six SECAM capability levels that range from zero (lowest, default level) to five (highest). Each Key Focus Area (KFA) within the SECAM contains these six levels in ascending order of capability.

Capability Level 5 - Optimizing

Process Attributes:
• program process effectiveness goals are established based upon business goals
• continuous process improvement of program processes
• continuous process improvement of standards

Non-Process Attributes:
• activities driven by systems eng. benefit
• fully scalable complexity management
• SE focus is product life cycle & strategic applications
• activities are optimally effective
• work products are of optimal utility

Capability Level 4 - Measured

Process Attributes:
• metrics derived from process data
• quantitative understanding of program processes
• ability to predict performance
• program process induced defects identified
• program processes improved

Non-Process Attributes:
• all information fully integrated in a program database
• activities driven by systems eng. benefit
• SE focus on all phases of product life cycle
• activities are measurably effective
• work products are of measurably significant utility

Capability Level 3 - Defined

Process Attributes:
• processes are defined by organizational standards
• standards are tailored & used
• tailoring is reviewed & approved
• program process data is collected
• customer feedback is obtained

Non-Process Attributes:
• consistent program success
• all information is managed electronically
• key information is integrated in a program database
• activities driven by benefit to program
• SE focus is requirements through operation
• activities are significantly effective
• work products are of significant utility

Capability Level 2 - Managed

Process Attributes:
• policies define need for activities
• activities are planned, tracked & verified
• work products reviewed for adequacy
• corrective actions are taken
• work products are controlled

Non-Process Attributes:
• key information managed electronically
• activities driven by benefit to customer
• SE focus is requirements through design
• activities are adequately effective
• work products are of adequate utility

Capability Level 1 - Performed

Process Attributes:
• activities done informally
• non-rigorous plans & tracking
• dependency on "heroes"
• work products are in evidence
• general recognition of need for activity

Non-Process Attributes:
• information is paper-based
• activities driven only by contract
• SE focus limited to requirements
• activities are marginally effective
• work products are of marginal utility

Capability Level 0 - Initial

Process Attributes:
• general failure to perform activities
• no easily identifiable work products
• no proof something was accomplished

Non-Process Attributes:
• no assurance of success
• information is difficult to identify
• driving force for activities is indeterminate
• no assurance of complexity management
• no systems engineering focus
• activities and products of little effect or value

Figure 3.2-3. Example Attributes of Systems Engineering Capability

The capability aspects of the SECAM contain both process and non-process classes of attributes as can be seen in Figure 3.2-3. Both the process and non-process classes of attributes can contain generic and KFA-specific classes of attributes. Attributes as used in this context are analogous to practices contained in some process based models. The classes of attributes that comprise the capability aspects of the SECAM are depicted pictorially in Figure 3.2-4.

[Figure: tree diagram -- Aspects of Capability divide into Process Attributes and Non-Process Attributes, each of which comprises Generic Attributes and KFA-Specific Attributes]

Figure 3.2-4. Classes of Capability Attributes within INCOSE SECAM

The specific attributes of capability, i.e., process, non-process, generic, and KFA-specific, for each KFA are represented in the form of questions. While some models of maturity/capability have used declarative statements to represent practices or attributes, the SECAM has used questions in order to facilitate ease of configuration management between the SECAM and its Assessment Method during development. This facilitated the rapid development and "quick time to market" discussed in Section 2.0, without apparently presenting a problem to users of the SECAM. Future versions of the SECAM may convert the questions in the SECAM to declarative statements.

The questions have been organized into five groups that correspond to ascending levels of capability, from "Performed" (Level 1) to "Optimizing" (Level 5), as indicated in Figure 3.2-3. The "Initial" level (Level 0) is a default level that contains no questions. Each capability level for a KFA can potentially have attributes representative of all four classes of attributes, i.e., process, non-process, generic, and KFA-specific. As a minimum, all capability levels have generic process and generic non-process attributes.

Process attributes tend to describe, with increasing levels of capability, how well the process for a particular KFA is defined, institutionalized, and followed. The non-process attributes tend to describe, with increasing levels of capability, the appropriateness of a process, e.g., how effective a process is and how valuable its products are. Non-process attributes are intended to provide a "sanity check" on the level of capability indicated by the process attributes (e.g., it would make no sense for the model to indicate a high level of capability for an organization that was producing mediocre products).

The generic attributes tend to describe, with increasing levels of capability, the common features of capability for all KFAs. The KFA-specific attributes tend to describe, with increasing levels of capability, the unique features of capability for a given KFA. This is based upon the view that growth in capability in different KFAs, such as Planning and Requirements and Functional Analysis, has both common elements (generic attributes) and unique elements (KFA-specific attributes). Stated another way, growth in the capability to perform Planning has some elements that are common to, and some that are different from, those elements that contribute to growth in the capability to perform Requirements and Functional Analysis. Figure 3.2-5 presents some example generic attributes. Within the SECAM, each generic attribute is identified with a "/G" suffix appended to the attribute number. KFA-specific attribute numbers omit this suffix notation.
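A minimal, hypothetical parser for attribute numbers of the form shown in Figure 3.2-5 follows; the format assumed here (KFA, level, sequence, optional "/G" suffix) is inferred from the examples in that figure:

```python
import re

# Hypothetical parser for attribute numbers of the form shown in
# Figure 3.2-5, e.g. "1.1-5.6/G30": KFA 1.1, level 5, question 6,
# generic attribute G30. KFA-specific attributes omit the "/G" suffix.
ATTR_RE = re.compile(
    r"^(?P<kfa>\d+\.\d+)-(?P<level>\d)\.(?P<seq>\d+)(?:/G(?P<generic>\d+))?$")

def parse_attribute(attr_id: str) -> dict:
    m = ATTR_RE.match(attr_id)
    if m is None:
        raise ValueError(f"unrecognized attribute number: {attr_id}")
    d = m.groupdict()
    d["is_generic"] = d["generic"] is not None
    return d

print(parse_attribute("1.1-5.6/G30"))  # generic attribute
print(parse_attribute("1.1-5.4"))      # KFA-specific attribute
```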

Level 5:
  1.1 Planning: 1.1-5.6/G30 Are the metrics collected on the effectiveness of planning activities used to monitor and improve the systems engineering process?
  3.2 Requirements & Functional Analysis: 3.2-5.8/G30 Are the metrics collected on the effectiveness of requirements and functional analysis activities used to monitor and improve the systems engineering process?

Level 4:
  1.1 Planning: 1.1-4.1/G22 Are metrics used to determine the effectiveness of planning activities?
  3.2 Requirements & Functional Analysis: 3.2-4.2/G22 Are metrics used to determine the status and effectiveness of requirements and functional analysis activities?

Level 3:
  1.1 Planning: 1.1-3.30/G18 Are planning processes standardized across the organization?
  3.2 Requirements & Functional Analysis: 3.2-3.41/G18 Is the process for requirements and functional analysis standardized across the organization?

Level 2:
  1.1 Planning: 1.1-2.4/G7 Has responsibility been assigned for program planning?
  3.2 Requirements & Functional Analysis: 3.2-2.3/G7 Is responsibility designated for the management of requirements and functional analysis on the program?

Level 1:
  1.1 Planning: 1.1-1.1/G1 Is planning being accomplished in at least an informal manner?
  3.2 Requirements & Functional Analysis: 3.2-1.1/G1 Is requirements analysis being accomplished by the program in at least an informal manner?

Figure 3.2-5. Example Generic Attributes

Page 31: Systems Engineering Capability Assessment Model - V1.5a - June 1996

Systems Engineering Capability Assessment Method (SECAM)

25

© 1996 (Permissive), INCOSE INCOSE-TP-1996-002-01 (Originally CAWG-1996-01-1.50)

Within each KFA, there are a number of stand-alone KFA-specific attributes as well as sets of KFA-specific attributes. These sets of KFA-specific attributes form vertical "themes" within a given KFA. That is, the core or kernel attribute shows a different manifestation across two or more capability levels. Figure 3.2-6 presents some examples of vertical themes of KFA-specific attributes. Within the SECAM, vertical themes are identified by a "T" tag appended to the text of the attribute (question). These tags indicate the KFA, the particular theme, and the level of the attribute. For example, referring to the figure, theme "A" appears in levels 1-3 and 5. There are multiple elements of the same theme appearing in levels 2 and 3 as well. Multiple attributes of a common theme are noted with a lower-case alpha character appended to the level identifier. Stand-alone KFA-specific attributes are noted by the absence of both "/G" and "T" tags.
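The theme-tag convention just described can likewise be illustrated with a small, hypothetical parser; the tag format assumed here is inferred from the examples in Figure 3.2-6:

```python
import re

# Hypothetical parser for the vertical-theme tags described above,
# e.g. "T1.1-A-L3a": KFA 1.1, theme A, capability level 3, element "a"
# (the lower-case suffix distinguishes multiple attributes of the same
# theme at one level, and is absent when there is only one).
THEME_RE = re.compile(
    r"^T(?P<kfa>\d+\.\d+)-(?P<theme>[A-Z])-L(?P<level>\d)(?P<element>[a-z]?)$")

def parse_theme_tag(tag: str) -> dict:
    m = THEME_RE.match(tag)
    if m is None:
        raise ValueError(f"unrecognized theme tag: {tag}")
    return m.groupdict()

print(parse_theme_tag("T1.1-A-L3a"))
print(parse_theme_tag("T3.2-G-L5"))
```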

Level 5:
  1.1 Planning: 1.1-5.4 Is the process for developing the work breakdown structure reviewed by appropriate, experienced personnel and corrective actions taken as necessary? T1.1-A-L5
  3.2 Requirements & Functional Analysis: 3.2-5.4 Are errors found in the requirements analyzed to determine causality with respect to the standard process? T3.2-G-L5

Level 4:
  1.1 Planning: (no theme attribute at this level)
  3.2 Requirements & Functional Analysis: 3.2-4.3 Are errors found in the requirements analyzed to determine causality with respect to the program's processes? T3.2-G-L4

Level 3:
  1.1 Planning: 1.1-3.22 Is the work breakdown structure reviewed to assure that it is complete, consistent and correct? T1.1-A-L3a
  1.1 Planning: 1.1-3.23 Is the work breakdown structure reviewed at appropriate program milestones and revised as necessary? T1.1-A-L3b
  3.2 Requirements & Functional Analysis: 3.2-3.30 Are formal reviews used to inspect requirements for errors? T3.2-G-L3

Level 2:
  1.1 Planning: 1.1-2.8 Have the systems engineering work products and activities been defined in a traceable and accountable manner? T1.1-A-L2a
  1.1 Planning: 1.1-2.16 Does the work breakdown structure cover all the tasks and products necessary to the program? T1.1-A-L2b
  3.2 Requirements & Functional Analysis: 3.2-2.8 Are requirements inspected for errors? T3.2-G-L2

Level 1:
  1.1 Planning: 1.1-1.3 Is there a work breakdown structure for the program that defines logical units of work to be managed at the program level? T1.1-A-L1
  3.2 Requirements & Functional Analysis: 3.2-1.10 Are requirements reviewed for errors? T3.2-G-L1

Figure 3.2-6. Example Vertical Theme Attributes


The capability levels of the INCOSE SECAM are aligned with the International Standards Organization (ISO) Software Process Improvement and Capability dEtermination (SPICE) Base Practices Guide (BPG). The ISO SPICE BPG is a process-based model that contains a process capability architecture comprised of six capability levels. Five levels are comprised of a set of common features, each of which consists of a set of generic practices; the sixth and lowest level is a default level. Strictly speaking, only the generic process attributes of the INCOSE SECAM are aligned with the ISO SPICE BPG generic practices.

[Figure: KFAs (e.g., 1.1 Planning through 3.7 System Verification) depicted as vertical bars crossed by horizontal bars of generic attributes at each capability level (Levels 0-5), with the gaps between the horizontal bars representing KFA-specific attributes. Generic attribute summaries by level -- Performed: work products provide evidence that work is being accomplished; Managed: performance of activities is planned, disciplined, tracked, and verified; Defined: standard processes are defined, programs tailor activities as needed, process & product data is gathered; Measured: measurable quality goals are established, needed process capability is determined, performance is predicted; Optimizing: quantitative effectiveness goals are established, processes are continuously improved]

Figure 3.2-7. Organization of INCOSE SECAM Questions with Respect to Capability Levels

Figure 3.2-7 illustrates the relationship between generic attributes and KFA-specific attributes within each KFA. Generic attributes (both process and non-process) are indicated by the horizontal bars in the figure. The vertical bars represent KFAs within the model (the process category structure of the model has been omitted for clarity). Where each horizontal bar (a set of generic attributes) intersects a vertical bar (a KFA), the set of both process and non-process generic attributes is applied. The set of generic process attributes is aligned with the SPICE process-only attributes (common features) at each level of capability.

This figure also depicts “gaps" between each horizontal bar (generic attributes). These “gaps" in the figure represent KFA-specific attributes (both process and non-process).

Thus, SECAM capability levels reflect aspects of capability beyond process capability as defined by the SPICE BPG (e.g., non-process attributes). Further, unlike the SPICE convention of base practices, which appear only in Level 1 (the Performed level), the SECAM allows KFA-specific attributes (both process and non-process) to be spread across multiple capability levels, either as stand-alone attributes or as part of a theme. These features discriminate between models which seek only to assess levels of process instantiation and the SECAM, which seeks to measure all aspects of systems engineering capability.


3.2.5 INTERPRETATION OF QUESTIONS

The questions within each KFA have been designed to elicit whether required systems engineering activities are achieved and attributes are manifest, without implying a particular method or procedure. This may be accomplished by probing with a question for the existence of an activity or attribute. The questions have been arranged in ascending groups under headings named after, and defined in alignment with, the ISO SPICE convention: Performed, Managed, Defined, Measured, and Optimizing (with Initial as the default and lowest level). The arrangement of question groupings is in ascending levels of capability, i.e., Performed being the lowest level where identifiable systems engineering activity is taking place and Optimizing being the highest level of systems engineering capability. Within the Optimizing level for each KFA, the systems engineering process representative of a particular KFA optimizes itself based upon metrics collected regarding the effectiveness of the process; attributes of this optimization are also sampled. There is a default sixth level, called Initial, in which systems engineering activity is either not being performed or is so scattered, disorganized, and chaotic that it is unrecognizable. There are no questions for the Initial capability level; a "No" response to all or most of the questions in the Performed level would place a systems engineering organization or program into the Initial capability level for a particular KFA.
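The level-determination behavior described in this paragraph can be illustrated as follows. This is a highly simplified sketch only -- the actual scoring rules are defined in the SECAM Assessment Method, and the majority threshold used here is purely illustrative:

```python
# Highly simplified sketch of capability-level determination. The actual
# scoring rules are defined in the SECAM Assessment Method; the 50%
# threshold below is purely illustrative, not the method's rule.

def kfa_level(answers_by_level: dict, threshold: float = 0.5) -> int:
    """answers_by_level maps level (1-5) to a list of booleans (yes/no).
    Returns the highest consecutive level satisfied; 0 means Initial."""
    level = 0
    for lvl in range(1, 6):
        answers = answers_by_level.get(lvl, [])
        if answers and sum(answers) / len(answers) > threshold:
            level = lvl
        else:
            break  # capability levels must be achieved in ascending order
    return level

# Mostly "No" at the Performed level -> the KFA defaults to Initial (0).
print(kfa_level({1: [False, False, True], 2: [True, True]}))       # 0
print(kfa_level({1: [True, True, True], 2: [True, False, True]}))  # 2
```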

Many process based models have used declarative statements to represent practices within the model and questions to sample these practices when applying the model in an assessment. The analogy to a practice within the INCOSE SECAM is either a process or non-process attribute, currently represented in the form of a question. During its development, questions have been used in the SECAM for ease of configuration management between the model and the questionnaire (which is used during the conduct of an assessment). In a subsequent version of the INCOSE SECAM, the attributes may be converted to declarative statements.

A clarification should be made regarding the issue of documentation that is addressed by some of the questions contained in the INCOSE SECAM. There is no intent to burden systems engineering with excessive paperwork or to equate a steadily growing bureaucratic paper mill with increased systems engineering capability. Systems engineering should remain agile and flexible enough to meet technical and business needs with a minimum of documentation. To this end, in every KFA under the capability level labeled "Managed", there is a question regarding a plan for that KFA. For example, under the System Concept Definition KFA, the question would be "Is there an approved plan (may be part of a larger technical management plan) for the program to perform system concept definition?" To answer the question with a "Yes" response, it is not necessary to have a separate plan for each KFA; this is the reason for the parenthetical statement included in the question. A well written technical management plan (e.g., a Systems Engineering Management Plan or Integrated Master Plan) should address system concept definition and, if it does, the answer to the question would be "Yes". The intent in asking the question is to ensure that the technical management plan or, in the absence of this plan, some other plan does actually address system concept definition. Similarly, there are one or more questions pertaining to policy under each KFA; again, it is not necessary to have a large number of separate policies -- one policy of proper scope could address policy issues for several KFAs.

3.3 TRACEABILITY MATRICES

Each KFA has, for convenience, an associated traceability matrix (see Appendix B) that relates the questions of a KFA to its general characteristics. These matrices also provide information on how each question has evolved during each update of the model. This information may be of value to organizations that wish to compare assessment results derived from this version of the model with assessment results obtained from earlier versions of the model. When a national standard for systems engineering is established, these matrices will also relate the questions within each KFA to the requirements for systems engineering as specified in the standard.


3.4 GLOSSARY

An attempt has been made, in the choice of words used in the INCOSE SECAM, to use terms that are recognizable and have the same meaning to all practitioners of systems engineering. However, English is a very rich language, both in its number of words and its synonyms; there is no uniformly established lexicon used across different products and applications for many professional disciplines, including systems engineering. Various industries employing systems engineering often use the same words with slightly different meanings. To assist in overcoming the semantic problems that arise through the choice of a given word for a particular concept, a Glossary has been provided at the end (Section 6) of the INCOSE SECAM. Words contained within the Glossary have been italicized in the text of the KFAs.

3.5 RELATIONSHIP TO OTHER STANDARDS

The INCOSE SECAM was developed to measure an organization's systems engineering capability. The SECAM references recognized systems engineering standards, such as the EIA 632 and IEEE-1220-1994, as the appropriate models of systems engineering to be used by the organization being assessed. SECAM coverage of the EIA 632 or IEEE-1220-1994 standards is provided in Appendix C. A mapping to ISO 9001 is also provided.


4 DETAILED REVISION HISTORY & PARTICIPANTS

The INCOSE SECAM is in its sixth major release, denoted Version 1.50. This issue, Version 1.50a, includes minor administrative corrections to Version 1.50 (Preliminary), which represented an improvement over its predecessor, Version 1.40. Significant aspects regarding the development and application of the INCOSE SECAM are summarized in Figure 4.0-1. A detailed list of specific revisions is provided in Appendix A.

INCOSE views the SECAM as a “living document" (refer to Figure 2.1-1) that will be updated periodically based upon feedback and experiential data received from individuals and organizations that review and/or use the SECAM. The most significant feedback on the SECAM's utility and suggestions for its improvement are expected to originate from actual assessments performed on an organization's systems engineering activity using the model.

Pre-Version 1.00 - 1992-1993
• Loral, Hughes, & Grumman proto-models
• Each proto-model solely process focused

Version 1.00 - February 1994
• Merging of Loral/Hughes/Grumman proto-models
• Addition of KFAs unique to systems engineering
• Total of 15 KFAs
• Solely a process-focused merged model

Version 1.10 - July 1994
• Further systems engineering characterization
• Standardized use of terms
• Addition of Glossary
• No longer considered a merged model

Version 1.20 - November 1994
• All KFAs strengthened to "satisfactory"
• Addition of Concept Definition KFA
• Increased to 16 KFAs

Version 1.30/1.31 - April 1995
• Addition of goals-to-questions traceability
• Addition of question & goals revision history

Version 1.40 - July 1995
• Improved technical richness
• Comprehensive review by INCOSE experts
• Integration & Verification separated into two KFAs: Integration and Verification
• Addition of Validation KFA
• Non-process capability indicators added
• Increased to 18 KFAs

Version 1.41 (Preliminary) - Oct/Nov 1995, Jan 1996
• Experiential feedback from three SEPAs
• Feedback from an analysis by MITRE

Version 1.50 (Preliminary) - April 1996
• Further enhancement of non-process attributes
• Addition of Data Management KFA
• Identification of generic attributes
• Identification of KFA themes
• Increased to 19 KFAs

Figure 4.0-1. SECAM Improvement Summary

Additionally, the INCOSE SECAM has been distributed to related working groups within the INCOSE Technical Board in order to receive their feedback and suggested improvements as to its content. The intent is to ensure that the model has the complete concurrence of INCOSE and to coordinate the development of products within INCOSE.

The INCOSE SECAM is currently not a validated model. Based upon a series of CAWG facilitated systems engineering process assessments (SEPAs) and self-assessments, the CAWG hopes to use the resulting experiential data and lessons learned to continually improve and update the INCOSE SECAM through successive numbered revisions. This process will potentially offer, over a period of time, a degree of validation for the SECAM; additional avenues to facilitate validation of the SECAM are being explored by INCOSE.

In order to place the current version of the INCOSE SECAM in the proper context, it is necessary to digress briefly and discuss its previous versions. This discussion also provides some insight into future developments that are planned for the SECAM.

4.1 VERSION 1.00

The first version of the INCOSE SECAM, denoted Version 1.00, was completed in February 1994. This version of the INCOSE SECAM was also referred to as the “Merged Model", since its primary focus was the merging of the efforts of three separate attempts to develop a model to assess the capability of an organization's systems engineering activity.

The consolidation (or merging) effort began during the November 1993 meeting of the CAWG. Significant progress was also made to complete the effort at the January 1994 meeting of the CAWG. Subsequent incorporation of comments by Hal Pierson (Software Productivity Consortium) resulted in a finished Version 1.00 INCOSE SECAM (or “Merged Model") in February 1994.

Version 1.00 of the INCOSE SECAM was primarily the result of the joint efforts of the following people:

• Don Crocker, Grumman
• Doug Low, Hughes Aircraft Company, Electro-Optical Systems
• Bill Mackey, Computer Sciences Corporation
• Bill Money, Loral Command and Control Systems
• Hal Pierson, Software Productivity Consortium
• Al Reichner, Loral Space and Range Systems

Version 1.00 of the INCOSE SECAM, along with its supporting documents, was distributed in April 1994 to the members of the CAWG, INCOSE management, and other interested parties. The supporting documents, which are described in the cover letter that accompanied the mailing of the INCOSE SECAM, are considered necessary for the proper usage and application of the INCOSE SECAM. The initial versions of these supporting documents were generated in the February through April 1994 time frame.

In March 1994, the INCOSE SECAM and supporting documents served as the basis for the CAWG's first facilitated systems engineering process assessment (SEPA) at Rockwell Collins Air Transport Division. The two CAWG facilitators were Hal Pierson and Al Reichner; the Rockwell team leader was Blake Andrews. Computer Sciences Corporation used the INCOSE SECAM in a major SEPA, orchestrated by Bill Mackey, in the March-April 1994 time frame.

4.2 VERSION 1.10

Based upon the feedback received from the two aforementioned SEPAs and comments received from other individuals, the CAWG began a major improvement effort on the INCOSE SECAM at its meeting in June 1994. The participants at this CAWG meeting reviewed the lessons learned presented by Blake Andrews (Rockwell) and Bill Mackey (Computer Sciences Corporation) and then formed into smaller groups, each tasked with improving its assigned portion of the INCOSE SECAM.

Following this meeting, Blake Andrews and Bill Mackey spent a considerable amount of time incorporating the suggested improvements and further expanding the INCOSE SECAM to include a Glossary of Terms, provided by Bill Money (Loral Command and Control Systems), and a new Introduction to the INCOSE SECAM, provided by Rich Widmann (Hughes Aircraft Company).

The INCOSE SECAM, Version 1.10, was considered by the CAWG to represent a significant improvement over its predecessor, Version 1.00:

• The terms "key focus area" and "key process area" were used interchangeably in Version 1.00. The use of "key process area" was deemed inappropriate as this caused considerable confusion between the INCOSE SECAM and the Software Engineering Institute's Capability Maturity Model for Software Engineering. Version 1.10 makes reference only to "key focus areas".

• Several of the key focus areas (KFAs) in Version 1.00 were considered weak (i.e., only partially satisfactory or unsatisfactory). These KFAs were strengthened in Version 1.10 of the INCOSE SECAM through the incorporation of additional suitable goals and/or additional appropriate questions. Consequently, the resulting Version 1.10 INCOSE SECAM provides a more "balanced" perspective across all KFAs than its predecessor.

• It was decided that each question within a KFA should be appropriately reworded to reflect the commitment to perform, ability to perform, activities performed, measurement and analysis, and verifying implementation areas within each KFA. In many cases, Version 1.00 lacked this continuity. Version 1.10 corrects this deficiency.

• Redundant questions existed within several KFAs of Version 1.00. These redundancies were a vestige of the "merging" process used to create that version of the INCOSE SECAM. Redundant questions were consolidated in Version 1.10.

• During the development of Version 1.10, the appropriateness of each goal within a KFA was reviewed. Questions were reviewed for consistency with the refined goals, and changes were made as appropriate.

• The wording of many questions in Version 1.00 was ambiguous or consisted of more than one implied question. These deficiencies have been corrected in Version 1.10 by providing single-entity questions with more precise, less ambiguous meaning.

• A glossary has been added to Version 1.10 in an attempt to clarify certain key terms. As such terms appear in the questions, italics have been used to identify them as having an entry in the glossary.

• Version 1.10 has been edited to remove internal inconsistencies found in Version 1.00. Such inconsistencies include wordings which might be interpreted as implying a specific organizational structure or a specific systems engineering process implementation.

As mentioned above, in developing Version 1.00 of the INCOSE SECAM, most of the effort was devoted to merging the efforts of three previously separate attempts to develop a model to assess the capability of an organization's systems engineering activity. Version 1.10 is considered to have progressed far beyond the “merging" process of Version 1.00, and consequently the words “Merged Model" were deleted from the title and content of the INCOSE SECAM, Version 1.10.

Version 1.10 of the INCOSE SECAM was primarily the result of the joint efforts of the following people:

• Blake Andrews, Rockwell Collins Air Transport Division
• Roger Bate, Software Engineering Institute
• Jerry Burleson, Loral Command and Control Systems
• Don Crocker, Grumman
• Kevin Forsberg, Center for Systems Management
• Suzanne Garcia, Software Engineering Institute
• Bob Jone, Loral Federal Systems
• Dorothy Kuhn, Texas Instruments
• Bill Mackey, Computer Sciences Corporation
• Don Marquet, MMC Astronautics
• Bill Money, Loral Command and Control Systems
• Richard Pariseau, Naval Air Warfare Center, Aircraft Div
• Tom Paruer, AAI
• Hal Pierson, Software Productivity Consortium
• Dave Prekla, GTE
• Art Pyster, Software Productivity Consortium
• Al Reichner, Loral Space and Range Systems
• Jim Roger, Hewlett-Packard
• Mary Simpson, Battelle - Pacific NW Labs
• George Stern, TASC
• Don Wick, Hewlett-Packard
• Rich Widmann, Hughes Aircraft Company, Electro-Optical Systems

4.3 VERSION 1.20

At the October 1994 meeting of the CAWG, a further improvement of the INCOSE SECAM was initiated. The CAWG participants in the model refinement effort reviewed: (1) the remaining suggested improvements from the aforementioned SEPAs that were not included in Version 1.10, (2) the results of an extensive analysis of Version 1.10 conducted at Computer Sciences Corporation under the leadership of Dr. Bill Mackey, and (3) additional comments from other individuals on Version 1.10.

The INCOSE SECAM, Version 1.20, was considered by the CAWG to represent a significant improvement over its predecessor, Version 1.10:


• Version 1.10 contained several key focus areas (KFAs) that were considered only partially satisfactory. These KFAs were strengthened in Version 1.20 through the incorporation of additional suitable goals and/or additional questions. Version 1.20 no longer contains any KFAs which are considered by the CAWG to be only partially satisfactory. Consequently, Version 1.20 provides a more "balanced" perspective across each of its KFAs.

• A new KFA entitled "System Concept Definition" was incorporated into Version 1.20, along with appropriate goals and questions. This increased the total number of KFAs to sixteen. The new KFA was incorporated in order to more completely reflect the scope of the systems engineering process.

• During the review of Version 1.10, a small number of questions were identified which were inappropriate to the KFA or to their level within a KFA. These questions were moved or deleted, as appropriate, in Version 1.20.

• The glossary of Version 1.20 was improved over that of Version 1.10.

Version 1.20 of the INCOSE SECAM was primarily due to the joint efforts of the following people:

• Blake Andrews, Rockwell Collins Air Transport Division
• LeRoy Botten, Computer Sciences Corporation
• Vic Church, Computer Sciences Corporation
• Dennis Crehan, Loral Aerosys
• Raymond Granata, NASA, Goddard
• Joe Ludford, Computer Sciences Corporation
• Bill Mackey, Computer Sciences Corporation
• Stephan Mayer, Allied Signal
• Bill Oran, Allied Signal
• Steven Senz, GTE
• Bruce Shelton, Systems Management Development Corp.
• Jesse Silver, Computer Sciences Corporation
• Antonio Vallone, Computer Sciences Corporation
• George Van Nostrand, Computer Sciences Corporation
• Roger Werking, Computer Sciences Corporation
• Rich Widmann, Hughes Aircraft Company, Electro-Optical Systems

4.4 VERSION 1.30/1.31

The third update of the INCOSE SECAM, designated Version 1.30/1.31, was based upon (1) the incorporation of a series of traceability matrices first presented at the October 1994 meeting of the CAWG and (2) some minor observations from the CAWG facilitated SEPA at the Westinghouse Hanford Facility.

During the week of 6 February 1995, the CAWG facilitated a SEPA at the Westinghouse Hanford Facility in Richland, WA. The CAWG team was led by Mr. Blake Andrews (Rockwell International) and assisted by Doug Low and Rich Widmann (both of Hughes Aircraft Company). The Westinghouse SEPA Team Leader was Mr. John Blyler. The SEPA was conducted using Version 1.20 of the INCOSE SECAM, with supporting documents updated to Version 1.2X (X=0, 1, or 2). During the SEPA, a number of observations were made regarding Version 1.20 of the INCOSE SECAM. Some of these minor observations were incorporated into Version 1.30/1.31 of the INCOSE SECAM, dated April 1995.


The INCOSE SECAM, Version 1.30/1.31, represents an improvement over its predecessor version, 1.20, as follows:

• The incorporation of a series of traceability matrices, one for each KFA. Each traceability matrix provides the relationship between the questions and the goals contained within a KFA. The traceability matrices were created by LeRoy Botten and verified by Bill Mackey (both of Computer Sciences Corporation).

• A strengthened introduction to the Environment and Tool Support KFA, provided by Doug Low.

• The addition of two questions to the Training KFA under 9.2, entitled "Ability to Perform", by Blake Andrews.

• A revised introduction, provided by Rich Widmann.

• Glossary definitions provided by Richard Pariseau, Naval Air Warfare Center, Aircraft Division.

4.5 VERSION 1.40

The fifth release (fourth update) of the INCOSE SECAM, designated Version 1.40, represented the most significant improvement of the model to date. Version 1.40 was also more widely contributed to and reviewed, prior to publication, than any of its predecessor versions. Within the systems engineering community, attempts were made to gain as broad a cross section as possible of both contribution (authorship) and review of the model.

Every key focus area (KFA) within the INCOSE SECAM was improved. For nearly every KFA, most of the introductory descriptive text was rewritten; many goals were reworded, and where applicable, some additions/deletions to goals were made; and the number of questions was increased and the wording of individual questions changed to be more reflective of the nature of the KFA. In addition, the phrasing of the content of each KFA was changed to place it into “one voice" to overcome the previous “many voices" that existed in previous versions of the INCOSE SECAM.

The KFA structure of the INCOSE SECAM was expanded. The System Integration and Verification KFA was split into two separate KFAs and a System Validation KFA was added; thereby increasing the total number of KFAs from sixteen to eighteen. The increase in the number of KFAs, which occurred in the Systems Engineering Process Category, provided a more “balanced" overall structure to the INCOSE SECAM -- there are now seven KFAs in both the Management Process Category and Systems Engineering Process Category.

The numbering structure of the KFAs was changed from the former "flat" structure to a hierarchical structure representative of the relationship between the individual KFAs and their respective Process Categories. The new numbering structure has the added benefit of facilitating the addition or deletion of KFAs within the INCOSE SECAM. Within each KFA, the grouping of questions under common headings representing increasing levels of capability has been maintained; however, the names of the common headings have been changed to reflect a better alignment with the ISO SPICE maturity levels.

Process improvement of the particular process representative of each KFA was refined and strengthened. Additionally, the Process Management and Improvement KFA (renamed from the former Process Management KFA to better reflect its content) was strengthened significantly to further reinforce both the management of systems engineering process and improvements to the process.


The INCOSE SECAM (Version 1.40) resulting from the above changes is balanced in terms of content across KFAs, consistent in terms of content within the ascending maturity levels of each KFA, and has a vastly increased depth and breadth of systems engineering technical richness. The significant improvements to, and intense review of, this version of the INCOSE SECAM have strengthened its content and structure to the point where it can truly be described as "a model developed by systems engineers to assess (measure) systems engineering process".

The improvements that resulted in this updated version of the INCOSE SECAM were based primarily upon: (1) recommendations arising out of the CAWG facilitated SEPA at the Westinghouse Hanford Facility, (2) a number of systemic improvements proposed by Ken Crowder (Boeing Information and Support Services Corporation), (3) a significant re-write of selected KFAs, based upon the above, at the 20 April 1995 meeting of the CAWG, (4) a significant re-write of selected KFAs subsequent to the April 1995 CAWG meeting based upon the procedures adopted and followed at that meeting, and (5) comments provided by reviewers of the improved KFAs.

Version 1.40 of the INCOSE SECAM was considered to represent a significant improvement over its predecessor Version 1.30/1.31 with regard to the following:

• The nomenclature used to identify capability levels within each KFA was changed to reflect alignment of the model with ISO SPICE conventions.

• A general strengthening of all KFAs was performed based upon a general systemic approach proposed by Ken Crowder. Using this proposed approach (with slight modifications by Rich Widmann and Blake Andrews), many of the KFAs were updated during the 20 April 1995 meeting of the CAWG; those KFAs updated subsequent to the meeting were changed in a manner consistent with the approach taken during the meeting.

• A re-phrasing of many of the questions within each KFA was performed by Blake Andrews and Rich Widmann to place the INCOSE SECAM into “one voice", thereby eliminating the “many voices" of the previous versions of the model.

• The Planning, Tracking and Oversight, Subcontract Management, Inter-group Coordination, and Configuration Management KFAs were strengthened by Ken Crowder. The Planning KFA was further strengthened by Bob Lightsey. The Tracking and Oversight KFA was further strengthened by Mike Gross. The Inter-group Coordination KFA was further strengthened by Steve Tavan and Tom Bachand.

• The Tracking and Oversight KFA was further updated to reflect work done by the Metrics Working Group of INCOSE. Specifically, this KFA now addresses metrics classified as technical performance measures, planning and control metrics, and systems engineering process metrics.

• A major re-write and strengthening of the Risk Management KFA was performed by Rich Widmann. George Friedman, Bob Lightsey, and Bob Olson further strengthened this KFA.

• The Process Management and Improvement KFA was strengthened by George Friedman, Mike Townsend, Steve Tavan, and Tom Bachand.

• A major re-write and strengthening of the Training KFA was performed by Art Stone. John Blyler further strengthened this KFA.

• A major re-write and strengthening of the Environment and Tool Support KFA was performed by Doug Low. Brian McCay further strengthened this KFA.

• The System Concept Definition KFA was strengthened by Dave Olson and Mike Gross.

• The System Requirements KFA was strengthened by Dave Olson, Mike Gross, and Ken Crowder.

• The System Design KFA was strengthened by Dave Olson and Ken Crowder.

• The System Integration and Verification KFA was divided into two separate KFAs. The new System Integration KFA and System Verification KFA both underwent a major re-write and strengthening by Rich Widmann. The System Integration KFA was further strengthened by Steve Tavan, Tom Bachand, and Charles Baird.

• A new System Validation KFA was created by Blake Andrews.

• New traceability matrices were created for the new KFAs. Existing traceability matrices were updated to reflect changes made to the goals and questions within each KFA. This work was done by Bill Mackey, Leroy Botten, and Blake Andrews.

• A general technical enriching of the systems engineering content of all the KFAs was accomplished based upon the above improvements and the incorporation of numerous suggestions provided by reviewers. Every KFA underwent some form of review; those that were dramatically changed underwent numerous reviews by individuals who were not authors of the KFA.

• A revised introduction (Section 1) and a rewritten/expanded Section 2 were provided by Rich Widmann.

Additionally, all of the above improvements for the INCOSE SECAM were incorporated and final editing was performed by Blake Andrews.

Version 1.40 of the INCOSE SECAM was primarily due to the joint efforts, either as author, reviewer, or both, of the following people:

• Blake Andrews (Rockwell Collins Air Transport Division)
• Tom Bachand (Mitre)
• Charles Baird (Systems Integration Software, Inc.)
• John Blyler (Tri-Cities Local INCOSE Chapter)
• LeRoy Botten (Computer Sciences Corporation)
• Lillian Brantley (Hughes Aircraft Company, Electro-Optical Systems)
• Gary Comb (Hughes Aircraft Company, Electro-Optical Systems)
• Richard Connett (Hughes Aircraft Company, Electro-Optical Systems)
• Dick Cramond (TRW)
• Ken Crowder (Boeing Information and Support Services)
• M. M. (Mac) Dillsi (MACTEC/Vectra GSI)
• Richard Garrison (Westinghouse Hanford)
• Gary Garside (Hughes Aircraft Company, Electro-Optical Systems)
• Paul Gartz (Boeing Commercial Airplane Company)
• Jerry Fisher (Loral Federal Systems)
• Norm Gei (Hughes Aircraft Company, Defense Systems)
• Charles Griner (Westinghouse Hanford)
• Mike Gross (Hughes Aircraft Company, Defense Systems)
• Dan Franci (Department of Energy, Hanford Facility)
• George Friedman (Past President, INCOSE)
• Mike Liewald (Hughes Aircraft Company, Electro-Optical Systems)
• Bob Lightsey (Defense Systems Management College)
• Doug Low (Hughes Aircraft Company, Electro-Optical Systems)
• Bill Mackey (Computer Sciences Corporation)
• Thomas McCann (Hughes Aircraft Company, Defense Systems)
• Brian McCay (Mitre)
• John Nyland (United Defense L.P.)
• Bob Olson (Naval Air Warfare Center - Weapons)
• Dave Olson (Hughes Aircraft Company, Electro-Optical Systems)
• Ron Olson (GTE Government Systems)
• Larry Pohlman (Boeing Information Systems)
• Frank Pohlner (Hughes Aircraft Company, Electro-Optical Systems)
• Dave Prekla (GTE Government Systems)
• Donna Rhodes (Loral Federal Systems)
• Terry Ros (Battelle Pacific Northwest Laboratory)
• Cho Shimizu (Boeing Commercial Airplane)
• Joseph Simpson (Tri-Cities Local INCOSE Chapter)
• Mary Simpson (Battelle Pacific Northwest Laboratory)
• George Stern (TASC)
• Sue Stetak (Hewlett Packard)
• Art Stone (Rockwell)
• Steve Tavan (Mitre)
• Michael Townsend (Naval Air Warfare Center - Indianapolis)
• Rich Widmann (Hughes Aircraft Company, Electro-Optical Systems)
• Ann Wilbur (Loral Western Development Laboratories)

4.6 VERSION 1.41 (PRELIMINARY)

The fifth update of the INCOSE SECAM, designated Version 1.41 (Preliminary), was not formally released. It was used as a vehicle to capture minor observations and lessons learned from three SEPAs conducted in the last quarter of 1995. Three revisions of this preliminary document were informally distributed.

The first revision, dated October 1995, incorporated changes resulting from minor observations during the second SEPA conducted at the Westinghouse Hanford Company in September 1995 using Version 1.40. It was generated for use at the October 1995 CAWG meeting to discuss the Westinghouse SEPA.

The second revision, dated November 1995, incorporated additional minor observations into the SECAM. Additionally, lessons learned from the Westinghouse SEPA were incorporated into the supporting documents (now called the SECAM Assessment Method). This version was used by Raytheon in a SEPA beginning in November 1995. It was also used by Hughes Aircraft Company Electro-Optical Systems in a SEPA in December 1995.

The third revision, dated January 1996, incorporated changes resulting from the Raytheon and Hughes SEPAs and an expansion of the descriptive portion of the SECAM. This version was informally distributed at the January 1996 INCOSE Winter Workshop for review.

Since Version 1.41 (Preliminary) was not formally released, changes incorporated into this version are discussed as part of Version 1.50.

4.7 VERSION 1.50

The fifth major update of the INCOSE SECAM, designated Version 1.50, represents the most significant improvement of the model to date. Prior to publication, Version 1.50 was also more widely contributed to and reviewed than any of its predecessors. Within the systems engineering community, attempts were made to gain as broad a cross section as possible of both contribution (authorship) and review of the model.

The descriptive portion of the SECAM has been re-written to improve understanding of the design philosophy of the SECAM. Terminology used within the model and to describe the model has been reviewed and updated for consistency.

Every key focus area (KFA) within the SECAM has been reviewed, and additional non-process attributes have been included. Questions dealing with policies and formal planning have been moved from level 1 (performed) into level 2 (managed) within each KFA. Every KFA has been unified with a set of common attributes, reflected as a consistent set of 32 generic questions used within each KFA.

The KFA structure of the INCOSE SECAM has been expanded, increasing the total number of KFAs from eighteen to nineteen. A new Data Management KFA has been added to distinguish the existing Configuration Management activities from Data Management activities. This KFA was incorporated in order to more completely reflect the scope of the systems engineering process and was recommended as a result of the SEPA conducted at Hughes Aircraft Company in December 1995.

The Training KFA has been renamed Competency Development to more accurately reflect its intent. Configuration and change management matrices have been moved into an appendix. Traceability from the SECAM to EIA 632, IEEE 1220-1994, and ISO 9001 has been mapped and included in a separate appendix.

The improvements that resulted in this updated version of the INCOSE SECAM are based primarily upon: (1) recommendations arising out of the second CAWG facilitated SEPA at the Westinghouse Hanford Facility in September 1995, (2) recommendations arising out of a self-administered SEPA conducted by Raytheon in November - December 1995, (3) recommendations arising out of a self-administered SEPA conducted by Hughes Aircraft Company in December 1995, (4) recommendations from a self-administered SEPA conducted at the United States Department of Agriculture in March 1996, (5) review at the October 1995 and January 1996 CAWG meetings, (6) a reanalysis of the model conducted during the June 1996 CAWG meeting to formally identify themes within each KFA, (7) the feedback from nine assessments using Version 1.50 (Preliminary), and (8) comments provided by reviewers of the SECAM.

Version 1.50 of the INCOSE SECAM is considered to represent a significant improvement over its predecessor Version 1.40/1.41, reflecting the following significant contributions:

• Each KFA has received multiple reviews.

• A new KFA entitled “Data Management" has been incorporated into Version 1.50 (Preliminary) along with appropriate goals and questions. This KFA was created by Rhonda Coen and George Richman and further strengthened by Rich Widmann, John Worl, Richard Pariseau, Sam Alessi, Bill Mackey, Don Barber, and Blake Andrews. This increases the total number of KFAs to nineteen.

• Generic questions were formalized and documented in all KFAs based upon work done in the Data Management KFA. A tag was appended to the number of each generic question to identify its relationship with corresponding questions throughout each KFA.

• Two Non-Process Generic Questions were incorporated into each KFA to assess the effectiveness of the products and value of the activities based upon the suggestions of Rich Widmann and Doug Low.

• The Inter-group Coordination KFA was strengthened by John Worl, Sam Alessi, and Michelle Sibernagel. Material on IPPD was provided by John Nylund.

• The System Concept Definition KFA was strengthened by Jack Ring, Rich Widmann, and Blake Andrews.

• The Training KFA was strengthened by Jack Ring. As a result of this work the KFA has been re-titled “Competency Development" to more accurately reflect its intent.

• The Environment and Tools Support KFA was strengthened by Nancy Rundlet, Bill Miller, Jock Rader, and Mark Sampson.

• The Quality Assurance KFA was strengthened by Sam Alessi and Blake Andrews. As a result of this work the KFA has been re-titled “Quality Management" to more accurately reflect its intent.

• The Risk Management KFA was strengthened by George Friedman.

• The System Requirements KFA was strengthened by Bill Miller, Richard Stevens, Rich Widmann, and Blake Andrews. As a result of this work the KFA has been re-titled “Requirements and Functional Analysis" to more accurately reflect its intent.

• The Integrated Engineering Analysis KFA was strengthened by Dave Olson and Rich Widmann.

• Mapping from the SECAM to EIA 632, IEEE 1220, and ISO 9001 was performed by George Richman and Blake Andrews.

• Themes within each KFA were documented by Blake Andrews, Don Barber, Allen Ray, David Brenchley, John Worl, and Rich Widmann, using a tag scheme proposed by Blake Andrews at the June 1996 meeting of the Capability Assessment Working Group.

• The introductory sections (paragraphs 1 through 3 and subs) were restructured and rewritten by Rich Widmann and Blake Andrews to clarify concepts used to develop the INCOSE SECAM.

Additionally, all of the above improvements for the SECAM were incorporated and final editing was performed by Blake Andrews.

This issue of Version 1.50 of the SECAM was primarily due to the joint efforts, either as author, reviewer, or both, of the following people:

• Sam Alessi (US Department of Agriculture, North Central Soil Conservation Research Laboratory)
• Blake Andrews (Rockwell Collins Air Transport Division)
• Eugene Antonier (Hughes Aircraft Company, Electro-Optical Systems)
• Bruce Ammerman (Boeing Defense and Space Group)
• Eileen Arnold (Rockwell Collins General Aviation Division)
• Don Barber (Honeywell Air Transport System Division)
• Wayne Brandt (Rockwell Collins Air Transport Division)
• David Brenchley (Battelle, Pacific Northwest National Laboratory)
• Ross Cairn (Rockwell Collins General Aviation Division)
• Rhonda Coen (Hughes Aircraft Company, Electro-Optical Systems)
• Ken Crowder (Boeing Defense and Space Group)
• Steve Cuspard (Boeing Defense and Space Group)
• Jim Eklund (US Department of Agriculture, North Central Soil Conservation Research Laboratory)
• Dan Franci (Department of Energy, Richland Operations)
• George Friedman (Past President, INCOSE)
• George Hudak (AT&T Corporation)
• Curt Johansen (Physio-Control Corporation)
• Doug Low (Hughes Aircraft Company, Electro-Optical Systems)
• William Mackey (Computer Sciences Corporation)
• Brian Mar (University of Washington)
• John May (Boeing Defense and Space Group)
• Bill Miller (AT&T Corporation)
• Gar Norman (Westinghouse Hanford Company)
• John Nyland (United Defense L.P., Armament Systems Division)
• Dave Olson (Hughes Aircraft Company, Electro-Optical Systems)
• Frank Orsag (Westinghouse Hanford Company)
• Roger Orth (Boeing Defense and Space Group)
• Richard Pariseau (Naval Air Warfare Center)
• Rich Pavlik (Honeywell Commercial Avionics Systems Division)
• Ian Philpott (Rockwell Collins Air Transport Division)
• Larry Pohlman (Boeing Information Systems)
• Frank Pohlner (Hughes Aircraft Company, Electro-Optical Systems)
• Jock Rader (Hughes Aircraft Company, Radar and Communication Systems)
• R. Allen Ray (United Defense L.P., Armament Systems Division)
• George Richman (Hughes Aircraft Company, Electro-Optical Systems)
• Jack Ring (Innovation Management)
• Barney Robert (Futron Corporation)
• Nancy Rundlet (QSS Inc.)
• Vince Saladin (Department of Energy, Richland Operations)
• Mark Sampson (TD Technologies, Inc.)
• Salley Schneider (US Department of Agriculture, IAREC, Systems Research Group)
• Michelle Sibernagel (Battelle Seattle Research Center)
• Joseph Simpson (Tri-Cities Local INCOSE Chapter)
• Mary Simpson (Battelle, Pacific Northwest National Laboratory)
• Doug Stemm (Hughes Aircraft Company, Electro-Optical Systems)
• Richard Stevens (QSS Ltd (UK))
• Steve Tavan (Mitre)
• Jesse Teal (TRW Systems Integration)
• Michael Townsend (Naval Air Warfare Center, Indianapolis)
• Leng Vang (US Department of Agriculture, North Central Soil Conservation Research Laboratory)
• Ward Voorhee (US Department of Agriculture, North Central Soil Conservation Research Laboratory)
• Guy Wagner (Honeywell Defense Avionics Systems Division)
• Rich Widmann (Hughes Aircraft Company, Electro-Optical Systems)
• John Worl (Battelle Seattle Research Center)

4.8 VERSION 1.50A

Version 1.50a denotes an administrative update, which converted the existing version 1.50 hard copy to this electronic copy for archival purposes, and for posting to the INCOSE web site.

5 INCOSE SYSTEMS ENGINEERING CAPABILITY ASSESSMENT MODEL

This section defines each of the nineteen key focus areas (KFAs), grouped into three process categories, which comprise the INCOSE SECAM. Each KFA section consists of a general description of the KFA, general characteristics associated with the KFA, and questions which may be used to assess whether the specific attributes of the KFA are met. The questions within each KFA are organized into five levels of systems engineering capability; a sixth, and lowest, level of capability is a default level containing no questions.
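
For illustration only, the short sketch below (Python; not part of the SECAM) shows one way an assessment team might tally yes/no answers into a capability level. The rule that a level is attained only when it and every lower level are answered entirely “yes" is an assumption chosen for simplicity; the SECAM's own rating procedure is defined in the SECAM Assessment Method.

    # Illustrative sketch only -- the all-"yes" satisfaction rule is an
    # assumption for this example, not the SECAM's defined scoring procedure.
    LEVELS = ["Performed", "Managed", "Defined", "Measured", "Optimizing"]

    def capability_level(answers_by_level):
        """Map each level name to a list of booleans (yes = True) and return
        the highest level whose questions, and all lower levels' questions,
        were all answered yes. Level 0 is the default level (no questions)."""
        achieved = 0
        for index, level in enumerate(LEVELS, start=1):
            answers = answers_by_level.get(level, [])
            if answers and all(answers):
                achieved = index
            else:
                break  # a gap at any level caps the rating
        return achieved

    # Example: all Performed questions yes, one Managed question no -> level 1.
    print(capability_level({"Performed": [True] * 7,
                            "Managed": [True] * 23 + [False]}))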

Numbering of each KFA is indicative of the process category hierarchy, as follows:

1.0 - Management Process Category
     1.1 - Planning KFA
     1.2 - Tracking and Oversight KFA
     1.3 - Subcontract Management KFA
     1.4 - Inter-group Coordination KFA
     1.5 - Configuration Management KFA
     1.6 - Quality Management KFA
     1.7 - Risk Management KFA
     1.8 - Data Management KFA
2.0 - Organization Process Category
     2.1 - Process Management and Improvement KFA
     2.2 - Competency Development KFA
     2.3 - Technology Management KFA
     2.4 - Environment and Tool Support KFA
3.0 - Systems Engineering Process Category
     3.1 - System Concept Definition KFA
     3.2 - Requirements and Functional Analysis KFA
     3.3 - System Design KFA
     3.4 - Integrated Engineering Analysis KFA
     3.5 - System Integration KFA
     3.6 - System Verification KFA
     3.7 - System Validation KFA
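
As a minimal sketch (Python; a hypothetical representation, not part of the SECAM), the hierarchy above can be held as a nested mapping, which makes the stated benefit of the hierarchical numbering concrete: adding or deleting a KFA within its process category is a one-line change.

    # Hypothetical representation of the KFA hierarchy as a nested mapping.
    KFAS = {
        "1.0 Management Process Category": {
            "1.1": "Planning", "1.2": "Tracking and Oversight",
            "1.3": "Subcontract Management", "1.4": "Inter-group Coordination",
            "1.5": "Configuration Management", "1.6": "Quality Management",
            "1.7": "Risk Management", "1.8": "Data Management",
        },
        "2.0 Organization Process Category": {
            "2.1": "Process Management and Improvement",
            "2.2": "Competency Development", "2.3": "Technology Management",
            "2.4": "Environment and Tool Support",
        },
        "3.0 Systems Engineering Process Category": {
            "3.1": "System Concept Definition",
            "3.2": "Requirements and Functional Analysis",
            "3.3": "System Design", "3.4": "Integrated Engineering Analysis",
            "3.5": "System Integration", "3.6": "System Verification",
            "3.7": "System Validation",
        },
    }
    assert sum(len(v) for v in KFAS.values()) == 19  # the nineteen KFAs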

5.1 CATEGORY 1 MANAGEMENT PROCESS CATEGORY

The Management Process Category consists of the following key focus areas:

1.1 Planning
1.2 Tracking and Oversight
1.3 Subcontract Management
1.4 Inter-group Coordination
1.5 Configuration Management
1.6 Quality Management
1.7 Risk Management
1.8 Data Management

5.1.1 KFA 1.1 PLANNING

Systems planning involves the identification of needs and constraints at the program level. The results of planning may be classified in terms of technical requirements and program requirements. These requirements define the technical and program structure required to bring a system into being. Planning includes: program requirements definition; identification, integration and scheduling of all engineering functions and tasks; work breakdown structure development; organizational structure definition (as related to the program); and descriptions of or references to key policies and procedures. System planning is documented in a technical management plan which sometimes references other planning documents. In both an integrated systems environment and integrated product and process development (IPPD) environment, this plan is sometimes referred to as a systems engineering management plan (SEMP).

The technical management plan relates the technical requirements to program requirements, providing the structure to guide and control the integration of engineering activities needed to achieve the systems engineering objectives consistent with a top-level management plan for the program. The technical management plan addresses planning for three basic areas of technical management:

• technical program planning & control
• systems engineering process
• engineering specialty integration

The technical management plan provides the management planning structure necessary to transform the technical and management requirements for the system into an operational system.

An event-driven plan is generated for the program that lays out the core technical portion of the program, process descriptions, and significant events. The event-driven plan documents the significant accomplishments necessary to complete the program's efforts and ties each accomplishment to a key program event. This event-driven plan is included (sometimes by reference) as part of the technical management plan. Each significant event may be thought of as a function with defined tasks to be accomplished. Entrance criteria are defined to start the event (function) and accomplishment (exit) criteria are established to determine the completion of the event (function). In an integrated systems environment, this event-driven plan is sometimes referred to as a Systems Engineering Master Schedule (SEMS). In an IPPD environment, this event-driven plan would be the technical portion of the integrated master plan (IMP) -- the SEMS, if developed separate from the IMP, is usually incorporated as a part of the IMP.
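
A minimal sketch (Python; the class, criteria strings, and example event are hypothetical, not drawn from the SECAM) of the structure just described: each significant event is a function with entrance criteria that must be met to start it and accomplishment (exit) criteria that determine its completion.

    from dataclasses import dataclass, field

    @dataclass
    class SignificantEvent:
        """Hypothetical representation of one entry in an event-driven plan."""
        name: str
        tasks: list = field(default_factory=list)
        entrance_criteria: list = field(default_factory=list)
        accomplishment_criteria: list = field(default_factory=list)

        def may_start(self, satisfied):
            # The event (function) may begin once all entrance criteria are met.
            return all(c in satisfied for c in self.entrance_criteria)

        def is_complete(self, satisfied):
            # Accomplishment (exit) criteria determine completion of the event.
            return all(c in satisfied for c in self.accomplishment_criteria)

    pdr = SignificantEvent(
        name="Preliminary Design Review",           # hypothetical key event
        tasks=["present allocated baseline"],
        entrance_criteria=["system specification approved"],
        accomplishment_criteria=["allocated baseline approved"],
    )
    print(pdr.may_start({"system specification approved"}))  # True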

A calendar-based plan is generated for significant events and activities within the program and included (sometimes by reference) as part of the technical management plan. In non-complex programs, this calendar-based plan may be a Gantt chart only. In complex programs, the calendar-based plan may be both a Gantt chart and a network chart that relates dependencies among tasks and events and permits the determination of a critical path. In an integrated systems environment, this calendar-based plan is sometimes referred to as a Systems Engineering Detailed Schedule (SEDS). In an IPPD environment, this calendar-based plan would be the technical portion of the integrated master schedule (IMS).
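
For the network-chart case, a minimal sketch (Python; the tasks and durations are hypothetical) of how a critical path can be determined from task dependencies: a forward pass computes each task's earliest finish, and the longest chain through the network is the critical path.

    # Hypothetical tasks: duration in weeks and the tasks each depends on.
    tasks = {
        "define requirements": (4, []),
        "design":              (6, ["define requirements"]),
        "build prototype":     (5, ["design"]),
        "write test plan":     (2, ["define requirements"]),
        "verify":              (3, ["build prototype", "write test plan"]),
    }

    finish, through = {}, {}  # earliest finish; predecessor on longest path
    for name, (duration, deps) in tasks.items():  # order respects dependencies
        start = max((finish[d] for d in deps), default=0)
        finish[name] = start + duration
        through[name] = max(deps, key=lambda d: finish[d]) if deps else None

    end = max(finish, key=finish.get)  # program end; walk back the longest chain
    path, total = [], finish[end]
    while end is not None:
        path.append(end)
        end = through[end]
    print(" -> ".join(reversed(path)), f"({total} weeks)")
    # define requirements -> design -> build prototype -> verify (18 weeks)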

General Characteristics

Characteristic 1: Technical activities, work products, and scheduling are documented in a technical management plan.
Characteristic 2: A work breakdown structure is established that identifies logical units of work to be managed at the program level.
Characteristic 3: The technical program elements are integrated in a top-down, life-cycle, low-risk approach which is effective and efficient.
Characteristic 4: The management approach supports technical objectives, and technical objectives are specified in a realistic manner, consistent with organization, cost, and scheduling constraints.

Questions (yes / no)

1.1-1 Performed
1.1-1.1 Is planning being accomplished in at least an informal manner?
1.1-1.2 Is time allocated for planning activities on the program?
1.1-1.3 Is there a work breakdown structure for the program that defines logical units of work to be managed at the program level? T1.1-A-L1
1.1-1.4 Is scheduling conducted as a part of planning activities? T1.1-B-L1
1.1-1.5 Does the systems engineering team leader report directly to the program manager for technical/program direction? T1.1-C-L1
1.1-1.6 Are the products / results of the planning activities at least of marginal value to the program?
1.1-1.7 Are planning activities at least of marginal effectiveness?

1.1-2 Managed
1.1-2.1 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing planning activities?
1.1-2.2 Is there an approved technical management plan (may be part of a larger program plan) for the program (e.g., a SEMP)?
1.1-2.3 Are task dependencies addressed as a part of scheduling? T1.1-B-L2
1.1-2.4 Has responsibility been assigned for program planning?
1.1-2.5 Do systems engineering personnel participate on proposal teams? T1.1-D-L3
1.1-2.6 Do documented and approved statements of work exist for systems engineering activities? T1.1-E-L2
1.1-2.7 Is the level of technical work required for the program reconciled to the available level of funding or projected market potential? T1.1-F-L2
1.1-2.8 Have the systems engineering work products and activities been defined in a traceable and accountable manner? T1.1-A-L2a
1.1-2.9 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing planning activities assessed?
1.1-2.10 When planning skills are inadequate, are individuals involved in planning activities trained in estimating and planning procedures applicable to their areas of responsibility?
1.1-2.11 Are appropriate tools available to conduct planning activities?
1.1-2.12 Has responsibility for developing the technical management plan been assigned? T1.1-G-L2
1.1-2.13 Is the technical management plan reviewed, updated and made available to members of the program? T1.1-H-L2
1.1-2.14 Does the technical management plan provide form and context for the planned technical activities and identify products? T1.1-I-L2
1.1-2.15 Are deviations from the technical management plan documented and rationale recorded so that decisions can be traced through the program life cycle? T1.1-J-L2
1.1-2.16 Does the work breakdown structure cover all the tasks and products necessary to the program? T1.1-A-L2b
1.1-2.17 Are the program roles, responsibilities, and objectives for each organization or functional discipline documented?
1.1-2.18 Is there a designated systems engineering first-line manager or team leader who is responsible for negotiating technical commitments? T1.1-C-L2
1.1-2.19 Are systems engineering commitments made to other groups within the program reviewed? T1.1-K-L2
1.1-2.20 Are planning activities performed in a structured manner?
1.1-2.21 Are data collected for monitoring planning activities?
1.1-2.22 Are corrective actions initiated when program activities deviate significantly from the plan?
1.1-2.23 Are the products / results of planning activities at least of adequate value to the program?
1.1-2.24 Are planning activities at least of adequate effectiveness?

1.1-3 Defined
1.1-3.1 Is the technical management plan developed and approved according to a formal procedure?
1.1-3.2 Do systems engineering personnel participate in the technical decision making process on the program proposal team? T1.1-C-L3
1.1-3.3 Do systems engineering personnel participate with other affected groups in program planning throughout the life cycle? T1.1-K-L3a
1.1-3.4 Is the continuity of systems engineering personnel managed throughout all phases of the development life cycle? T1.1-D-L3
1.1-3.5 Has a system life cycle, with predefined stages of manageable size, been identified or defined? T1.1-E-L3
1.1-3.6 Is there a mechanism to ensure compliance of systems engineering commitments made to other groups? T1.1-K-L3b
1.1-3.7 Are the systems engineering activities and work products that are needed to establish and maintain control of the program well defined?
1.1-3.8 Are estimates for the size of the systems engineering work products (or changes in the size of the work products) derived based upon historical data? T1.1-F-L3a
1.1-3.9 Are estimates for the systems engineering effort and cost derived based upon historical data?
1.1-3.10 Do estimates take into consideration whether a task is precedented or unprecedented?
1.1-3.11 Is an event-driven plan developed for technical aspects of the program? T1.1-G-L3
1.1-3.12 Is the event-driven plan traceable to the scheduling? T1.1-B-L3
1.1-3.13 Are the risks associated with the cost, resource, schedule, and technical aspects of the system identified, assessed, and mitigated?
1.1-3.14 Is the basis for systems engineering planning captured?
1.1-3.15 Are budget and resource estimates, schedules, and personnel assignments periodically reviewed and updated?
1.1-3.16 Do systems engineering personnel review and agree to systems engineering estimates, schedules, and personnel allocations? T1.1-F-L3b
1.1-3.17 Are estimates, schedules, and personnel allocations of groups that support systems engineering reviewed?
1.1-3.18 Is the technical management plan revised to reflect on-going activities and projections for the program? T1.1-H-L3a
1.1-3.19 Does the technical management plan identify responsibilities and objectives for each technical discipline participating in systems engineering activities? T1.1-I-L3
1.1-3.20 Are measurement systems (e.g. earned value) identified and used?
1.1-3.21 Are formal reviews of the technical management plan conducted to assess its completeness and correctness (i.e. that all phases of the program are addressed in the context of a systems engineering life cycle)? T1.1-H-L3b
1.1-3.22 Is the work breakdown structure reviewed to assure that it is complete, consistent and correct? T1.1-A-L3a
1.1-3.23 Is the work breakdown structure reviewed at appropriate program milestones and revised as necessary? T1.1-A-L3b
1.1-3.24 Are formal reviews of the technical management plan conducted to assess its consistency with the top-level program management plan and with lower-level plans? T1.1-H-L3c
1.1-3.25 Are formal reviews of the roles, responsibilities, and objectives, defined by the technical management plan for each organization, conducted to assure that they are complete, consistent, and correct?
1.1-3.26 Are formal reviews of the technical management plan conducted at appropriate program milestones and the plan revised as necessary? T1.1-J-L3
1.1-3.27 Is a summary report from each meeting prepared and distributed to affected groups and individuals?
1.1-3.28 Are metrics collected for assessing the effectiveness of planning activities?
1.1-3.29 Are peer/defect reviews conducted to assess and improve planning activities and products?
1.1-3.30 Are planning processes standardized across the organization?
1.1-3.31 Are guidelines provided to allow the program to tailor the standard planning process for its specific needs?
1.1-3.32 Are the products / results of planning activities at least of significant value to the program?
1.1-3.33 Are planning activities at least of significant effectiveness?

1.1-4 Measured
1.1-4.1 Are metrics used to determine the effectiveness of planning activities?
1.1-4.2 Are analyses performed on the metrics associated with planning activities to identify corrective actions for the program?
1.1-4.3 Are the identified corrective actions implemented as necessary for the program?
1.1-4.4 Are the products / results of the planning activities at least of measurably significant value to the program?
1.1-4.5 Is the effectiveness of planning activities at least measurably significant?

1.1-5 Optimizing
1.1-5.1 Is the effectiveness of the planning process and its implementation activities reviewed on both an event-driven and periodic basis?
1.1-5.2 Upon review, are actions taken to correct identified deficiencies in the planning process and its implementation?
1.1-5.3 Is the process for developing the technical management plan reviewed by appropriate, experienced personnel and corrective actions taken as necessary? T1.1-H-L5
1.1-5.4 Is the process for developing the work breakdown structure reviewed by appropriate, experienced personnel and corrective actions taken as necessary? T1.1-A-L5
1.1-5.5 Are quality management reviews and/or audits conducted of the planning activities and program data and the results used to improve the process?
1.1-5.6 Are the metrics collected on the effectiveness of planning activities used to monitor and improve the systems engineering process?
1.1-5.7 Are the products / results of the planning activities of optimal value to the program?
1.1-5.8 Are planning activities of optimal effectiveness?
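
The question identifiers above follow a regular pattern: “1.1-3.22" reads as KFA 1.1, level 3, question 22; a “/Gnn" suffix marks one of the 32 generic questions; and a trailing tag such as “T1.1-A-L3a" ties the question to theme A of KFA 1.1 at level 3, with a lower-case suffix distinguishing multiple questions on one theme at one level. This reading is inferred from the pattern in the text rather than stated normatively; the sketch below (Python) simply makes it mechanical.

    import re

    # Assumed formats, inferred from the question and theme tags in the text.
    QUESTION = re.compile(
        r"(?P<kfa>\d+\.\d+)-(?P<level>\d)\.(?P<seq>\d+)"  # e.g. 1.1-3.22
        r"(?:/G(?P<generic>\d+))?"                        # e.g. /G16
    )
    THEME = re.compile(
        r"T(?P<kfa>\d+\.\d+)-(?P<theme>[A-Z])-L(?P<level>\d)(?P<sub>[a-z]?)"
    )

    line = "1.1-3.22 Is the work breakdown structure reviewed ...? T1.1-A-L3a"
    q = QUESTION.match(line)
    t = THEME.search(line)
    print(q.group("kfa"), "level", q.group("level"))              # 1.1 level 3
    print("theme", t.group("theme"), "sub", t.group("sub") or "-")  # theme A sub a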

5.1.2 KFA 1.2 TRACKING AND OVERSIGHT

Tracking and oversight involves performing the systems engineering management activities to guide, monitor, evaluate, and adjust the technical effort relative to program objectives, goals, and plans.

Tracking consists of the measurement of program functions and tasks identified in systems planning. Technical characteristics, program progress, and systems engineering process activities should be tracked. Oversight consists of development of meaningful metrics from measurement data, assessment of these tracking metrics, and the initiation of corrective action as required. Oversight actions resolve or mitigate issues which might threaten the effective achievement of technical program objectives.

Generally, a measurement system is used to provide the information necessary to synthesize technical performance metrics (sometimes referred to as technical performance measures), planning and control metrics, and systems engineering process metrics. These metrics enable the measurement and control of management, technical, and process factors which support tracking and oversight.

Technical performance metrics are used to track key technical parameters throughout a development program. Planning and control metrics provide a periodic assessment of the health and status of the program throughout the life cycle. Systems engineering process metrics provide an indication of the quality and productivity of the systems engineering process as applied to a specific program.
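
As an illustrative sketch (Python; the parameter, values, and acceptable range are hypothetical), the corrective-action loop implied by the questions below (e.g. 1.2-4.8) can be reduced to comparing each technical performance measurement against its specified acceptable range and flagging a corrective action on any breach.

    def check_tpm(name, measurements, low, high):
        """Return the measurements that breach the acceptable range [low, high]."""
        breaches = [(i, m) for i, m in enumerate(measurements)
                    if not low <= m <= high]
        for i, m in breaches:
            print(f"{name}: reading {i} = {m} outside [{low}, {high}]"
                  " -> initiate corrective action")
        return breaches

    # Hypothetical mass TPM tracked across four program reviews (kg).
    check_tpm("vehicle mass", [118.0, 119.5, 121.2, 120.4], low=0.0, high=120.0)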

General Characteristics

Characteristic 1: Metrics are established in accordance with the planning objectives.
Characteristic 2: Management, technical, and process functions and tasks are monitored (i.e. tracked, assessed and corrected) against the planning objectives.
Characteristic 3: Work products and performance are evaluated against the system engineering plan(s).
Characteristic 4: Corrective actions consistent with the plans are taken when management, technical, and process problems and issues are identified.

Questions (yes / no)

1.2-1 Performed
1.2-1.1/G1 Is tracking and oversight being accomplished in at least an informal manner?
1.2-1.2 Are technical performance measures collected and used in at least an informal manner? T1.2-A-L1
1.2-1.3 Are planning and control metrics collected and used in at least an informal manner? T1.2-B-L1
1.2-1.4 Are systems engineering metrics collected and used in at least an informal manner? T1.2-C-L1
1.2-1.5/G2 Are tracking and oversight activities planned for the program?
1.2-1.6/G3 Are the products / results of the tracking and oversight activities at least of marginal value to the program?
1.2-1.7/G4 Are tracking and oversight activities at least of marginal effectiveness?

1.2-2 Managed
1.2-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing tracking and oversight?
1.2-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to identify, collect and evaluate metrics?
1.2-2.3/G7 Has the responsibility for establishing and managing tracking and oversight activities been assigned?
1.2-2.4/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing tracking and oversight activities assessed?
1.2-2.5/G9 Are personnel trained in the objectives, procedures, and methods for performing tracking and oversight activities?
1.2-2.6 Have systems engineering team leaders received orientation in the technical aspects of the program?
1.2-2.7 Are responsibilities designated for collecting, reviewing and evaluating technical performance measures for the program? T1.2-A-L2
1.2-2.8 Are responsibilities designated for collecting, reviewing and evaluating planning and control metrics for the program? T1.2-B-L2
1.2-2.9 Are responsibilities designated for collecting, reviewing and evaluating systems engineering metrics for the program? T1.2-C-L2
1.2-2.10 Are information, data, and lessons learned obtained from tracking and oversight metrics collected and archived for use by future programs? T1.2-D-L2
1.2-2.11/G10 Is tracking and oversight performed in a structured manner?
1.2-2.12/G11 Are data collected for monitoring tracking and oversight activities?
1.2-2.13/G12 Are corrective actions initiated when tracking and oversight activities deviate significantly from the plan?
1.2-2.14/G13 Are the products / results of the tracking and oversight activities at least of adequate value to the program?
1.2-2.15/G14 Are tracking and oversight activities at least of adequate effectiveness?

1.2-3 Defined
1.2-3.1/G15 Are tracking and oversight activities planned, approved, and established according to a formal procedure?
1.2-3.2 Are systems engineering commitments and changes to commitments made to individuals and groups external to the organization reviewed?
1.2-3.3 Are approved changes to commitments that affect the program communicated to systems engineering and other related personnel? T1.2-E-L3
1.2-3.4 Are action items documented and assigned, reviewed, and tracked to closure?
1.2-3.5 Are the risks associated with cost, resource, schedule, and technical aspects of the program tracked?
1.2-3.6 Are planning and control metrics for the systems engineering aspects of the program recorded and distributed to appropriate parties? T1.2-B-L3a
1.2-3.7 Do systems engineering personnel conduct periodic informal reviews to track technical progress, plans, performance, and issues against the plan?
1.2-3.8 Are formal reviews to address the accomplishments and results of the systems engineering aspects of the program conducted at selected milestones?
1.2-3.9 Is a mechanism used to assure that the system's functional baseline is established, allocated, and monitored during design and implementation?
1.2-3.10 Are metrics (technical performance measures, process and control, and systems engineering) defined according to a formal procedure?
1.2-3.11 Are program technical performance measures identified, documented, and distributed to appropriate parties?
1.2-3.12 Are technical performance measures mapped to measures of effectiveness?
1.2-3.13 Are technical performance measure profiles re-assessed and re-allocated as necessary during the program life cycle?
1.2-3.14 Is modeling and analysis an element of technical performance measure tracking and oversight?
1.2-3.15 Are technical performance measures collected, evaluated, and reported according to a formal procedure? T1.2-A-L3
1.2-3.16 Are technical resource profiles maintained for each critical system configuration item in the system architecture?
1.2-3.17 Are profiles maintained of completed systems engineering activities against planned activities?
1.2-3.18 Are all systems engineering problem reports prioritized and tracked to closure?
1.2-3.19 Are quantitative systems engineering goals established and tracked for the program?
1.2-3.20 Are formal records maintained of subsystem and configuration item specification progress?
1.2-3.21 Do systems engineering team leaders regularly review technical progress and issues with their engineers?
1.2-3.22 Are data on work product and/or specification errors collected during development and verification? T1.2-D-L3
1.2-3.23 Are systems engineering process metrics identified, documented, and distributed to appropriate parties? T1.2-C-L3a
1.2-3.24 Are systems engineering process metrics collected, evaluated, and reported? T1.2-C-L3b
1.2-3.25 Are planning and control metrics collected, evaluated, and reported? T1.2-B-L3b
1.2-3.26/G16 Are metrics developed to assess the effectiveness of tracking and oversight activities?
1.2-3.27/G17 Are peer/defect reviews conducted to assess and improve tracking and oversight activities and products?
1.2-3.28/G18 Are tracking and oversight processes standardized across the organization?
1.2-3.29/G19 Are guidelines provided to allow the program to tailor the standard tracking and oversight process for its specific needs?
1.2-3.30/G20 Are the products / results of the tracking and oversight activities at least of significant value to the program?
1.2-3.31/G21 Are tracking and oversight activities at least of significant effectiveness?

1.2-4 Measured
1.2-4.1/G22 Are metrics used to determine the status and effectiveness of tracking and oversight activities?
1.2-4.2 Are identified measurement systems (e.g. earned value) and tools used to accomplish tracking and oversight measurements?
1.2-4.3 Are the impacts of changes to systems engineering work products tracked, and necessary corrective actions taken? T1.2-E-L4
1.2-4.4 Are the systems engineering efforts and costs tracked, and necessary corrective actions taken?
1.2-4.5 Are critical resources identified and tracked, and necessary corrective actions taken?
1.2-4.6 Are systems engineering schedules tracked, and necessary corrective actions taken?
1.2-4.7 Are systems engineering technical activities tracked, and necessary corrective actions taken?
1.2-4.8 Are corrective actions initiated when technical performance measures exceed their specified acceptable range?
1.2-4.9/G23 Are analyses performed on the metrics associated with tracking and oversight to identify corrective actions for the program?
1.2-4.10/G24 Are the identified corrective actions implemented as necessary for the program?
1.2-4.11/G25 Are the products / results of the tracking and oversight activities at least of measurably significant value to the program?
1.2-4.12/G26 Is the effectiveness of tracking and oversight activities at least measurably significant?

1.2-5 Optimizing
1.2-5.1/G27 Is the effectiveness of the tracking and oversight process and its implementation activities reviewed on both an event-driven and periodic basis?
1.2-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the tracking and oversight process and its implementation?
1.2-5.3/G29 Are quality assurance reviews and/or audits conducted of the tracking and oversight activities and the results used to improve the process?
1.2-5.4/G30 Are the metrics collected on the effectiveness of tracking and oversight activities used to monitor and improve the systems engineering process?
1.2-5.5/G31 Are the products / results of the tracking and oversight activities of optimal value to the program?
1.2-5.6/G32 Are tracking and oversight activities of optimal effectiveness?

5.1.3 KFA 1.3 SUBCONTRACT MANAGEMENT

The purpose of subcontract management is to select, manage, and control subcontractors who will meet the defined needs of the subcontracted effort. Subcontract management involves the functions of subcontract planning, requirements definition, subcontractor selection, technical quality control, and cost control. It deals with the effective and efficient use of subcontractors to accomplish the technical program objectives, within the framework of established and documented plans, procedures and requirements which describe the products and processes being subcontracted.

General Characteristics

Characteristic 1: Subcontract management activities are planned.
Characteristic 2: The definition and flow down of subcontractor requirements are consistent with the technical and program requirements.
Characteristic 3: Technically qualified subcontractors are selected.
Characteristic 4: The subcontractor's work products and performance are monitored and evaluated against commitments and program objectives as defined in the subcontract.

Questions (yes / no)

1.3-1 Performed
1.3-1.1/G1 Is subcontract management being accomplished in at least an informal manner?
1.3-1.2/G2 Are subcontract management activities planned for the program?
1.3-1.3 When subcontractors are used on the program, are requirements for the work documented as part of a contract? T1.3-A-L1
1.3-1.4 When requirements change, are the changes renegotiated with the subcontractor and the changes documented as part of a contract? T1.3-B-L1
1.3-1.5 Are costs tracked and controlled during subcontract management?
1.3-1.6/G3 Are the products / results of the subcontract management activities at least of marginal value to the program?
1.3-1.7/G4 Are subcontract management activities at least of marginal effectiveness?

1.3-2 Managed
1.3-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for selecting, managing, and controlling the subcontractor?
1.3-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform subcontract management?
1.3-2.3/G7 Has the responsibility for establishing and managing the subcontract been assigned?
1.3-2.4/G8 Is the adequacy of resources (e.g. funding, staff, tools, etc.) provided for selecting the subcontractor and managing the subcontract assessed?
1.3-2.5/G9 Are team leaders and other individuals who are involved in establishing and managing the subcontract trained to perform these activities?
1.3-2.6 Does the program manager approve the subcontract based upon input from the systems engineering team leader?
1.3-2.7 Do those who are involved in managing the subcontract receive orientation in the technical aspects of the subcontract?
1.3-2.8 Does the program subcontract for work that is outside its core competencies?
1.3-2.9 Does the subcontractor selection process include consideration of the alignment of business plans, product upgrades, and technology focus of the prime contractor and the subcontractors? T1.3-E-L2
1.3-2.10 Are the subcontractor's quality management activities monitored? T1.3-C-L2
1.3-2.11 Are the subcontractor's configuration management activities monitored? T1.3-B-L2
1.3-2.12 Is acceptance testing conducted as part of the delivery of the subcontractor's products? T1.3-D-L2
1.3-2.13/G10 Is subcontract management performed in a structured manner?
1.3-2.14/G11 Are data collected for monitoring subcontract management activities?
1.3-2.15/G12 Are corrective actions initiated when subcontract management activities deviate significantly from the plan?
1.3-2.16/G13 Are the products / results of the subcontract management activities at least of adequate value to the program?
1.3-2.17/G14 Are subcontract management activities at least of adequate effectiveness?

1.3-3 Defined
1.3-3.1/G15 Is the work to be subcontracted defined and planned according to a formal procedure?
1.3-3.2 Are subcontractors selected based on an evaluation of the subcontract bidders' ability to perform the work? T1.3-E-L3a
1.3-3.3 Is the contractual agreement between the prime contractor and the subcontractor used as the basis for managing the subcontract? T1.3-A-L3a
1.3-3.4 Is there a clearly documented subcontract that is complete, accurate, and unambiguous, for each subcontractor? T1.3-A-L3b
1.3-3.5 Does the subcontract contain a statement of work, specification, terms and conditions, list of deliverables, schedule, cost, and a defined acceptance process? T1.3-A-L3c
1.3-3.6 Is a documented and approved subcontractor's system engineering management plan used for tracking the systems engineering activities, schedule, and cost, and communicating status?
1.3-3.7 Are changes to the subcontractor's statement of work, specification, terms and conditions, and other commitments resolved? T1.3-B-L3
1.3-3.8 Does the prime contractor's management conduct periodic status and/or coordination reviews with the subcontractor's management?
1.3-3.9 Are periodic informal reviews, technical reviews and interchanges held with the subcontractor? T1.3-E-L3b
1.3-3.10 Are subcontractors regularly reviewed to ensure that the approved process is being followed?
1.3-3.11 Are formal reviews to address the subcontractor's systems engineering accomplishments and results conducted at selected milestones?
1.3-3.12 Are the subcontractor's quality management activities used to improve the subcontractor's products? T1.3-C-L3
1.3-3.13 (deleted)
1.3-3.14 Are discrepancies discovered during acceptance testing used to improve the subcontractor's products? T1.3-D-L3
1.3-3.15 Does each subcontractor have a system engineering management plan, integration plan, and verification plan?
1.3-3.16 Do systems engineering personnel approve the process and product standards used by subcontractors?
1.3-3.17 If the subcontractor is unable to meet all product and process standards, are additional deliverables provided to augment the deficiency?
1.3-3.18 Is there a mechanism for assuring that all subcontractors follow a defined engineering process?
1.3-3.19/G16 Are metrics developed to assess the effectiveness of subcontract management activities?
1.3-3.20/G17 Are peer/defect reviews conducted to assess and improve subcontract management activities and products?
1.3-3.21/G18 Are subcontract management processes standardized across the organization?
1.3-3.22/G19 Are guidelines provided to allow the program to tailor the standard subcontract management process for its specific needs?
1.3-3.23/G20 Are the products / results of the subcontract management activities at least of significant value to the program?
1.3-3.24/G21 Are subcontract management activities at least of significant effectiveness?

1.3-4 Measured
1.3-4.1/G22 Are metrics used to determine the status and effectiveness of subcontract management activities?
1.3-4.2 Are methods established and used for tracking and reviewing the performance of subcontractors? T1.3-E-L4a
1.3-4.3 Is the subcontractor's performance evaluated on a periodic basis, the evaluation reviewed with the subcontractor, and corrective actions taken as necessary? T1.3-E-L4b
1.3-4.4/G23 Are analyses performed on the metrics associated with subcontract management to identify corrective actions for the program?
1.3-4.5/G24 Are the identified corrective actions implemented as necessary for the program?
1.3-4.6/G25 Are the products / results of the subcontract management activities at least of measurably significant value to the program?
1.3-4.7/G26 Is the effectiveness of subcontract management activities at least measurably significant?

1.3-5 Optimizing
1.3-5.1/G27 Is the effectiveness of the subcontract management process and its implementation activities reviewed on both an event-driven and periodic basis?
1.3-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the subcontract management process and its implementation?
1.3-5.3/G29 Are quality management reviews and/or audits conducted of the subcontract management activities and work products and the results used to improve the process?
1.3-5.4/G30 Are the metrics collected on the effectiveness of subcontract management activities used to monitor and improve the systems engineering process?
1.3-5.5/G31 Are the products / results of the subcontract management activities of optimal value to the program?
1.3-5.6/G32 Are subcontract management activities of optimal effectiveness?

5.1.4 KFA 1.4 INTER-GROUP COORDINATION

Inter-group coordination facilitates the effective communication and the resolution of issues among diverse engineering groups and others involved in system development. Such groups include customers, suppliers, producers, engineers of various disciplines and specialties and other stakeholders. Inter-group coordination involves the management of relationships, interfaces, communications, exchanges and reviews of technical information that are needed to effectively and efficiently accomplish technical / program objectives. Inter-group coordination provides the means for the interaction and integration of technical disciplines and fosters active participation among them in an effective and efficient manner.

The program should assign teams to specific work packages as identified by a work breakdown structure. Each team is responsible for planning, developing, and satisfying the requirements associated with its work package(s). Within the Department of Defense, the use of multi-disciplinary teams, designated Integrated Product Teams (IPTs), has been mandated as a core feature of the integrated product and process development (IPPD) approach to reduce costs and to field products sooner.

In addition to multi-disciplinary teams, a concurrent engineering environment must be established. Concurrent engineering is a systematic approach to the integrated, concurrent design of products and their related processes. Concurrent engineering integrates product and process requirements, organizes a program for efficiency and effectiveness, balances the program's communication infrastructure, and integrates the systems documentation infrastructure. Marketing, engineering, manufacturing, and field support must work together to consider all elements of the product life-cycle from conception through disposal to define acceptable system solutions.

Effective inter-group coordination must involve meetings and information exchanges to review technical progress and to identify and address important issues. Inter-group interfaces that communicate technical details must be planned and managed relative to the quality and integrity of the overall system, for both the product and the program. These pre-planned inter-group coordination interfaces are generally incorporated into the overall systems engineering process.

This precise delineation of inter-group activities ensures the compatibility of the interfaces and facilitates timely performance of horizontal and vertical communications during the conduct of systems engineering activities.
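As an illustration only (the SECAM prescribes no implementation, and the names below, such as GroupInterface and register_issue, are hypothetical), a minimal sketch of such a managed interface registry might record each pre-planned line of communication and track the issues raised and resolved across it:

    from dataclasses import dataclass, field

    @dataclass
    class GroupInterface:
        """A pre-planned line of communication between two groups."""
        from_group: str
        to_group: str
        purpose: str                             # e.g., "requirements allocation review"
        open_issues: list = field(default_factory=list)

        def register_issue(self, description: str) -> None:
            # Technical / project issues are identified among affected groups.
            self.open_issues.append(description)

        def resolve_issue(self, description: str) -> None:
            # Issues remain visible on the interface until resolved.
            self.open_issues.remove(description)

    # Usage: define interfaces up front, then raise and close issues on them.
    iface = GroupInterface("systems engineering", "software", "requirements allocation review")
    iface.register_issue("timing budget for CI-042 not yet agreed")
    iface.resolve_issue("timing budget for CI-042 not yet agreed")
    assert not iface.open_issues                 # all issues on this interface closed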

General Characteristics
Characteristic 1 Inter-group coordination activities are planned.
Characteristic 2 Interfaces (i.e. lines of communication) between groups are defined and managed.
Characteristic 3 Technical and project issues are identified and resolved among affected groups.
Characteristic 4 All stakeholders have the project and technical information they need, when they need it.

Questions

yes no

1.4-1 Performed
1.4-1.1/G1 Is inter-group coordination being accomplished in at least an informal manner?
1.4-1.2 Is the importance of inter-group coordination understood by program participants? T1.4-A-L1
1.4-1.3 Do program stakeholders regularly exchange technical information? T1.4-B-L1a
1.4-1.4 Are there regular technical interchanges with the customer? T1.4-B-L1b
1.4-1.5/G2 Are inter-group coordination activities planned for the program?
1.4-1.6 Do affected personnel review and/or approve plans within groups which may affect other groups? T1.4-C-L1
1.4-1.7 Are both traditional and speciality engineering disciplines involved in product development as needed? T1.4-D-L1
1.4-1.8/G3 Are the products / results of inter-group coordination activities at least of marginal value to the program?
1.4-1.9/G4 Are inter-group coordination activities at least of marginal effectiveness?


1.4-2 Managed
1.4-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing inter-group coordination?
1.4-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform inter-group coordination?
1.4-2.3/G7 Has the responsibility for identifying, coordinating, and resolving issues among different groups been assigned?
1.4-2.4 Are processes for assembling the right mix of technical disciplines established? T1.4-D-L2
1.4-2.5 Do systems engineering activities have representation from appropriate engineering skills? T1.4-A-L2a
1.4-2.6 Do inter-group coordination activities include adequate customer involvement? T1.4-B-L2a
1.4-2.7 Are systems engineering issues called to the attention of the program manager? T1.4-B-L2b
1.4-2.8 Does the program have a process for resolving inter-group issues? T1.4-E-L2a
1.4-2.9 Is there an established process for escalation and arbitration of technical differences, leading to resolution? T1.4-E-L2b
1.4-2.10 Is a mechanism used to communicate issue resolutions to all affected organizations, including subcontractors and associate contractors? T1.4-B-L2c
1.4-2.11 Are team meeting facilities available?
1.4-2.12/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing inter-group coordination activities assessed?
1.4-2.13/G9 Do personnel receive training in communication skills, group problem solving, and active listening?
1.4-2.14 Are inter-group coordination activities and the results of those activities documented?
1.4-2.15 Does management espouse, and model in their own behavior, appropriate communication skills? T1.4-A-L2b
1.4-2.16/G10 Is inter-group coordination being managed in a structured manner?
1.4-2.17/G11 Are data collected for monitoring inter-group coordination activities?
1.4-2.18/G12 Are corrective actions initiated when inter-group coordination activities deviate significantly from the plan?
1.4-2.19/G13 Are the products / results of the inter-group coordination activities at least of adequate value to the program?
1.4-2.20/G14 Are inter-group coordination activities at least of adequate effectiveness?

1.4-3 Defined
1.4-3.1/G15 Are inter-group coordination activities planned, approved, and established according to a formal procedure?
1.4-3.2 Are standards and guidelines used to coordinate integrated engineering analysis activities? T1.4-D-L3a
1.4-3.3 Are formal procedures used to identify multi-disciplinary team members? T1.4-D-L3b
1.4-3.4 Do systems engineering personnel organize, conduct and close formal reviews of the system requirements? T1.4-C-L3

1.4-3.5 Do systems engineering personnel review and agree to designs produced by other engineering design teams? T1.4-E-L3a

1.4-3.6 Are all appropriate engineering disciplines represented on the systems design team? T1.4-A-L3

1.4-3.7 Do the developers of configuration items review and agree to the system requirements allocated to the configuration item? T1.4-E-L3b

1.4-3.8 Do systems engineering personnel review and agree to the products produced by other engineering disciplines? T1.4-E-L3c

1.4-3.9 Do the other engineering disciplines review and agree to systems engineering products? T1.4-E-L3d

1.4-3.10 Do systems engineering personnel and other engineering personnel periodically engage in technical meetings to review, identify, and resolve problem areas? T1.4-B-L3a

1.4-3.11 Do systems engineering personnel review and approve plans prepared by other groups that may have an impact on the system requirements? T1.4-B-L3b

1.4-3.12 Do systems engineering personnel review and approve all plans that affect multiple engineering organizations/disciplines/functions? T1.4-B-L3c

1.4-3.13/G16 Are metrics developed to assess inter-group coordination effectiveness?
1.4-3.14/G17 Are peer/defect reviews conducted to assess and improve inter-group coordination activities and products?
1.4-3.15/G18 Are inter-group coordination processes standardized across the organization?
1.4-3.16/G19 Are guidelines provided to allow the program to tailor the standard inter-group coordination process for its specific needs?
1.4-3.17/G20 Are the products / results of the inter-group coordination activities at least of significant value to the program?
1.4-3.18/G21 Are inter-group coordination activities at least of significant effectiveness?

1.4-4 Measured
1.4-4.1/G22 Are metrics used to determine the status and effectiveness of inter-group coordination activities?
1.4-4.2/G23 Are inter-group coordination metrics analyzed to identify trends and initiate corrective actions to improve the effectiveness of program activities?
1.4-4.3/G24 Are the identified corrective actions implemented as necessary for the program?
1.4-4.4 Have the organization's inter-group coordination efforts and use of multi-disciplinary teams received external recognition (e.g., recognized industry leadership, receipt of professional society awards)?
1.4-4.5/G25 Are the products / results of the inter-group coordination activities at least of measurably significant value to the program?
1.4-4.6/G26 Is the effectiveness of inter-group coordination activities at least measurably significant?

1.4-5 Optimizing
1.4-5.1/G27 Is the effectiveness of the inter-group coordination process and its implementation activities reviewed on both an event-driven and periodic basis?
1.4-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the inter-group coordination process and its implementation?

1.4-5.3/G29 Are quality management reviews and/or audits conducted of the inter-group coordination activities and work products and the results used to improve the process?

1.4-5.4/G30 Are the metrics collected on the effectiveness of inter-group coordination activities used to monitor and improve the systems engineering process?

1.4-5.5/G31 Are the products / results of the inter-group coordination activities of optimal value to the program?

1.4-5.6/G32 Are inter-group coordination activities of optimal effectiveness?

5.1.5 KFA 1.5 CONFIGURATION MANAGEMENT

Configuration management involves the planning, configuration identification, change control, status accounting, and auditing of the product elements which consist of requirements, interfaces, and design representations of the products being provided to meet the stated program objectives.

Configuration management planning is the process of developing, coordinating, and documenting the configuration management products and activities for identification, configuration control, status accounting, and audits of configuration items.

Configuration identification involves the selection of documents which identify and describe the baseline configuration characteristics of an item during its life cycle.

Configuration control is important for preserving form, fit, and function, controlling interfaces, controlling design characteristics, assessing change impacts, and maintaining records of changes. It provides for the control of baseline configuration item changes which are required to correct problems which have been identified within the context of meeting the technical and program requirements.

Configuration status accounting provides the recording and reporting of change information for the baseline configuration items. It is the management information system which provides the traceability of configuration identification and changes thereto, and facilitates the effective implementation of approved changes.

Configuration auditing involves the checking of an item for compliance with its configuration identification. Configuration audits validate that the developed item fulfills its technical requirements, and that the product configuration is identified by comparing the configuration item with its technical documentation.
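As a minimal sketch of these change control and status accounting ideas (illustrative only; the ConfigurationItem record and its fields are hypothetical, not SECAM requirements), a baselined item might carry a status-accounting log alongside its approved baseline:

    from dataclasses import dataclass, field

    @dataclass
    class ConfigurationItem:
        """A baselined item under configuration control."""
        name: str
        baseline: str                                  # current approved baseline
        history: list = field(default_factory=list)    # status accounting log

        def propose_change(self, change_id: str, impact: str) -> None:
            # Change requests are recorded before any board decision.
            self.history.append(("proposed", change_id, impact))

        def approve_change(self, change_id: str, new_baseline: str) -> None:
            # Only approved changes move the baseline; the log preserves
            # traceability from the old baseline to the new one for audits.
            self.history.append(("approved", change_id, self.baseline))
            self.baseline = new_baseline

    ci = ConfigurationItem("flight software ICD", baseline="rev B")
    ci.propose_change("CR-101", "adds telemetry word; affects ground segment")
    ci.approve_change("CR-101", new_baseline="rev C")
    print(ci.baseline)    # "rev C"
    print(ci.history)     # full status-accounting trail, available for auditing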

General Characteristics
Characteristic 1 Configuration management products and activities are planned.
Characteristic 2 Configuration work products are identified, controlled, and available.
Characteristic 3 Configuration changes are controlled and evaluated consistent with the technical and program requirements.
Characteristic 4 The status and contents of the configuration work products are communicated.


Questions

yes no

1.5-1 Performed
1.5-1.1/G1 Is configuration management being accomplished in at least an informal manner?
1.5-1.2 Is there a means to control the configuration items, baselines, and all changes thereto? T1.5-A-L1
1.5-1.3/G2 Are configuration management activities planned for the program?
1.5-1.4/G3 Are the products / results of the configuration management activities at least of marginal value to the program?
1.5-1.5/G4 Are configuration management activities at least of marginal effectiveness?

1.5-2 Managed
1.5-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing configuration management?
1.5-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform configuration management?
1.5-2.3 Are the work products to be placed under configuration management identified? (moved from level 3) T1.5-B-L2
1.5-2.4 Has responsibility been established for managing configuration items (e.g. configuration control board)? T1.5-C-L2
1.5-2.5/G7 Has responsibility been assigned for coordinating and implementing configuration management?
1.5-2.6/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing configuration management activities assessed?
1.5-2.7 Are configuration management personnel trained in the objectives, procedures, and methods for performing their configuration management activities?
1.5-2.8/G9 Are systems engineering personnel and other engineering personnel trained to perform their configuration management activities?
1.5-2.9/G10 Is configuration management being performed in a structured manner?
1.5-2.10/G11 Are data collected for monitoring configuration management activities?
1.5-2.11/G12 Are corrective actions initiated when configuration management activities deviate significantly from the plan?
1.5-2.12/G13 Are the products / results of the configuration management activities at least of adequate value to the program?
1.5-2.13/G14 Are configuration management activities at least of adequate effectiveness?

1.5-3 Defined
1.5-3.1/G15 Are configuration management activities planned, approved, and established according to a formal procedure?
1.5-3.2 Is a configuration management library system established as a repository for baselines? T1.5-A-L3a

1.5-3.3 Are change requests and problem reports for all configuration items initiated, recorded, reviewed, approved, and tracked? T1.5-A-L3b

1.5-3.4 Are changes to baselines controlled by a change review board? T1.5-C-L3
1.5-3.5 Are products from the baseline library created and their release formally controlled? T1.5-A-L3c
1.5-3.6 Is the status of configuration items recorded? T1.5-B-L3
1.5-3.7 Are configuration status reports documenting the configuration control activities made available?
1.5-3.8 Are baseline audits conducted and results recorded?
1.5-3.9 Is a mechanism used for controlling changes to system/subsystem/configuration item requirements?
1.5-3.10 Is a mechanism used for controlling changes to configuration item product specifications?
1.5-3.11 Is a formal procedure used for controlling changes to the products which have completed baseline verification?
1.5-3.12 Is a mechanism used for controlling changes to the system and component configuration item design?
1.5-3.13 Is there a configuration control function for all formal documentation produced by systems engineering personnel?
1.5-3.14 Are tests and/or reviews performed to ensure that changes to configuration items have not caused unintentional effects?
1.5-3.15 At any given point in time, can the status of engineering products under configuration management be identified?
1.5-3.16 Do system configuration management audits verify the completeness and correctness of the contents of specifications?
1.5-3.17/G16 Are metrics collected to assess the effectiveness of configuration management activities?
1.5-3.18/G17 Are peer/defect reviews conducted to assess and improve configuration management activities and products?
1.5-3.19/G18 Are configuration management processes standardized across the organization?
1.5-3.20/G19 Are guidelines provided to allow the program to tailor the standard configuration management process for its specific needs?
1.5-3.21/G20 Are the products / results of the configuration management activities at least of significant value to the program?
1.5-3.22/G21 Are configuration management activities at least of significant effectiveness?

1.5-4 Measured
1.5-4.1/G22 Are metrics used to determine the status and effectiveness of configuration management activities?
1.5-4.2/G23 Are analyses performed on the metrics associated with configuration management to identify corrective actions for the program?
1.5-4.3/G24 Are the identified corrective actions implemented as necessary for the program?
1.5-4.4/G25 Are the products / results of the configuration management activities at least of measurably significant value to the program?

1.5-4.5/G26 Is the effectiveness of configuration management activities at least measurably significant?

1.5-5 Optimizing
1.5-5.1/G27 Is the effectiveness of the configuration management process and its implementation activities reviewed on both an event-driven and periodic basis?
1.5-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the configuration management process and its implementation?
1.5-5.3 Are configuration management activities periodically audited to confirm that the resulting baselines conform to the documentation that defines them?
1.5-5.4/G29 Are quality management reviews and/or audits conducted of the configuration management activities and work products and the results used to improve the process?
1.5-5.5/G30 Are the metrics collected on the effectiveness of configuration management activities used to monitor and improve the systems engineering process?
1.5-5.6/G31 Are the products / results of the configuration management activities of optimal value to the program?
1.5-5.7/G32 Are configuration management activities of optimal effectiveness?

5.1.6 KFA 1.6 QUALITY MANAGEMENT

Quality management is the unifying set of activities that links human capabilities with engineering, production, and support activities and processes. Its objective is to ensure that all the stated needs are met. To accomplish this objective, quality management utilizes a top-down, life-cycle perspective toward overall system and product quality. This perspective is embodied in the concept of Total Quality Management (TQM).

Quality management provides for independent evaluation and assessment of the products and processes used to meet the program objectives. The results of evaluations and assessments are reported to all those impacted. Reducing variability and enhancing design robustness are key elements of quality management. These elements support the systems engineering objective of maximizing the effectiveness and efficiency of both products and the processes used to create them.

General Characteristics
Characteristic 1 Quality management products and activities are planned.
Characteristic 2 Technical efforts adhere to defined standards, procedures, and requirements.
Characteristic 3 Noncompliance issues are addressed and tracked to closure.

Questions

yes no

1.6-1 Performed
1.6-1.1/G1 Is quality management being accomplished in at least an informal manner?
1.6-1.2/G2 Are quality management activities planned for the program?
1.6-1.3 Do all organizations involved with the program perform quality management? T1.6-A-L1
1.6-1.4/G3 Are the products / results of the quality management activities at least of marginal value to the program?
1.6-1.5/G4 Are quality management activities at least of marginal effectiveness?

1.6-2 Managed
1.6-2.1/G5 Is there an organizational policy (may be part of a broad-based policy) that requires that quality management be an integral part of all engineering activities?
1.6-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform quality management?
1.6-2.3/G7 Is an independent individual or group responsible for performing quality assessment of systems engineering activities for programs?
1.6-2.4 Do quality management personnel have a management reporting channel separate from program management? T1.6-A-L2
1.6-2.5/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing quality management activities assessed?
1.6-2.6/G9 Are systems engineering personnel and other engineering personnel trained to perform their respective quality management activities?
1.6-2.7/G10 Are quality management activities performed in a structured manner?
1.6-2.8/G11 Are data collected for monitoring quality management activities?
1.6-2.9/G12 Are corrective actions initiated when quality management activities deviate significantly from the plan?
1.6-2.10/G13 Are the products / results of the quality management activities at least of adequate value to the program?
1.6-2.11/G14 Are quality management activities at least of adequate effectiveness?

1.6-3 Defined
1.6-3.1/G15 Are quality management activities planned, approved, and established according to a formal procedure?
1.6-3.2 Do documented and approved quality assurance standards exist? T1.6-B-L3
1.6-3.3 Are inspections of systems engineering products applied throughout the program life cycle? T1.6-A-L3
1.6-3.4 Is the systems engineering process applied consistently throughout the program life cycle?
1.6-3.5 Are there quality checkpoints to evaluate the conduct of each step of the systems engineering process throughout the system life cycle? T1.6-C-L3a
1.6-3.6 Are systems engineering activities reviewed and audited to ensure compliance with plans and standards?
1.6-3.7 Are representative samples of systems engineering planning and technical products reviewed to ensure compliance with standards? T1.6-C-L3b
1.6-3.8 Are the results of quality assurance reviews and audits reported to systems engineering personnel on a regular basis?


1.6-3.9 Are non-compliance issues escalated when the issues cannot be resolved within the program?

1.6-3.10/G16 Are metrics collected for assessing the effectiveness of quality management activities?

1.6-3.11/G17 Are peer/defect reviews conducted to assess and improve quality management activities and products?

1.6-3.12/G18 Are quality management processes standardized across the organization?
1.6-3.13/G19 Are guidelines provided to allow the program to tailor the standard quality management process for its specific needs?
1.6-3.14/G20 Are the products / results of the quality management activities at least of significant value to the program?
1.6-3.15/G21 Are quality management activities at least of significant effectiveness?

1.6-4 Measured
1.6-4.1/G22 Are metrics used to determine the status and effectiveness of quality management activities?
1.6-4.2/G23 Are analyses performed on the metrics associated with quality management to identify corrective actions for the program?
1.6-4.3/G24 Are the identified corrective actions implemented as necessary for the program?
1.6-4.4/G25 Are the products / results of the quality management activities at least of measurably significant value to the program?
1.6-4.5/G26 Is the effectiveness of quality management activities at least measurably significant?

1.6-5 Optimizing
1.6-5.1/G27 Is the effectiveness of the quality management process and its implementation activities reviewed on both an event-driven and periodic basis?
1.6-5.2 Is a mechanism used to verify that samples examined by quality management activities are truly representative of the work performed? T1.6-C-L5
1.6-5.3/G28 Upon review, are actions taken to correct identified deficiencies in the quality management process and its implementation?
1.6-5.4/G29 Is a mechanism used to assure that the quality management activities defined by the systems engineering process are being consistently applied?
1.6-5.5 Is a mechanism used to verify that product standards are adequate? T1.6-B-L5
1.6-5.6 Are quality assurance processes modified when deemed necessary using a documented process?
1.6-5.7/G30 Are the metrics collected on the effectiveness of quality management activities used to monitor and improve the systems engineering process?
1.6-5.8/G31 Are the products / results of the quality management activities of optimal value to the program?
1.6-5.9/G32 Are quality management activities of optimal effectiveness?


5.1.7 KFA 1.7 RISK MANAGEMENT

Risk management is both a program and technical management process that considers risks affecting the technical, cost, and schedule aspects of a program in an uncertain environment. Risk management involves three related activities: risk identification, risk analysis, and risk mitigation (sometimes called risk handling). Risk identification is the continuous process of identifying all areas of potential risk on a program. The thoroughness with which risk identification is done is a key factor in the effectiveness of risk management; if risks are not identified, they cannot be analyzed and appropriate corrective action cannot be taken. Risk analysis is the process of quantifying each specific risk: determining the probability of its occurrence and the impact on the program should it occur, and developing and analyzing alternative options, backed by specific recommendations for action. Risk mitigation is the process of avoiding, reducing and controlling, or deliberately accepting risk on the program.
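The SECAM does not prescribe a quantification scheme, but a common formulation treats risk exposure as the product of probability and impact. The sketch below is illustrative only; the risk entries and the mitigation threshold are hypothetical values, not SECAM content:

    # Rank identified risks by exposure = probability x impact and flag
    # those exceeding a program-defined threshold for mitigation planning.
    risks = [
        # (risk, probability of occurrence, impact in cost units)
        ("late subcontractor delivery", 0.30, 500_000),
        ("requirements growth",         0.50, 200_000),
        ("test facility unavailable",   0.10, 800_000),
    ]

    MITIGATION_THRESHOLD = 90_000    # hypothetical program-defined risk level

    for name, probability, impact in sorted(
            risks, key=lambda r: r[1] * r[2], reverse=True):
        exposure = probability * impact
        action = "mitigate" if exposure >= MITIGATION_THRESHOLD else "accept / monitor"
        print(f"{name}: exposure = {exposure:,.0f} -> {action}")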

Risk management provides a means of addressing issues which have the potential of preventing a specified technical, schedule, or cost requirement on the program from being satisfied. Risk management should not be viewed as a separate program office function, but rather as an integral part of sound systems engineering management of the technical effort.

The effectiveness of the program risk management effort will be based upon the breadth of experience, depth of system expertise, interface and integration experience, and creativity of the assigned risk manager, program team, and the participants from the line organizations. This will be further influenced by the scope of the charter provided by the program manager.

General Characteristics
Characteristic 1 Risk management is planned.
Characteristic 2 Risks are identified and analyzed.
Characteristic 3 Actions are taken to deliberately mitigate technical, cost and schedule risks.
Characteristic 4 Risks and mitigation actions are monitored.
Characteristic 5 Risk status and risk mitigation efforts are communicated and coordinated across affected groups.

Questions

yes no

1.7-1 Performed
1.7-1.1/G1 Are risks being identified, analyzed, and mitigated in at least an informal manner on the program?
1.7-1.2/G2 Are risk management activities planned for the program?
1.7-1.3 Is the program manager made aware of significant risks? T1.7-A-L1
1.7-1.4/G3 Are the products / results of the risk management activities at least of marginal value to the program?
1.7-1.5/G4 Are risk management activities at least of marginal effectiveness?


1.7-2 Managed
1.7-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing risk management?
1.7-2.2/G6 Is there an approved risk management plan (may be part of a larger technical plan) for the program to identify, analyze, and mitigate risk?
1.7-2.3/G7 Has responsibility been assigned for program risk management activities?
1.7-2.4 Are systems engineering personnel assigned to perform risk management?
1.7-2.5 Does the risk management plan define the risk levels (sometimes referred to as a risk model) to be used by the program?
1.7-2.6 Does the risk management plan define what management response is required for each risk level?
1.7-2.7 Are risks categorized into those that can be avoided, controlled, or accepted? T1.7-B-L2
1.7-2.8 Has a communication path been established between the risk management team and the program management team? T1.7-A-L2
1.7-2.9/G8 Is the adequacy of resources (e.g. funding, staff, tools, etc.) provided for risk management activities assessed?
1.7-2.10 Are appropriate tools available to conduct risk analyses?
1.7-2.11/G9 Are systems engineering personnel and other engineering personnel trained to perform their respective risk management activities?
1.7-2.12 Does risk management involve a multi-functional group that spans both technical and business specialities? T1.7-C-L2a
1.7-2.13 Is risk management integrated both vertically and horizontally across the program? T1.7-C-L2b
1.7-2.14/G10 Is risk management being performed in a structured manner?
1.7-2.15 Are risk management (identification, analysis, and mitigation) activities re-evaluated over the duration of the program? T1.7-D-L2
1.7-2.16/G11 Are data collected for monitoring risk management activities?
1.7-2.17/G12 Are corrective actions initiated when risk management activities deviate significantly from the plan?
1.7-2.18/G13 Are the products / results of the risk management activities at least of adequate value to the program?
1.7-2.19/G14 Are risk management activities at least of adequate effectiveness?

1.7-3 Defined
1.7-3.1/G15 Are risk management activities planned, approved, and established according to a formal procedure?
1.7-3.2 When a set of risks is identified for the program, are the risks reviewed to determine that they form a complete set?
1.7-3.3 Are all elements of the work breakdown structure examined as part of the risk identification process in order to help ensure that all program aspects have been considered?
1.7-3.4 Is there a well-defined method for evaluating risk associated with key processes within the program (e.g. design, test, manufacturing, etc.)?
1.7-3.5 Is the analysis of risks reviewed for adequacy and completeness?
1.7-3.6 For each risk, are cause and effect relationships established?


1.7-3.7 Is each risk analyzed for potential coupling to all other identified risks?
1.7-3.8 For each risk, are alternative courses of action, work-arounds, and/or fall-back positions developed with a recommended course of action?
1.7-3.9 Are risk thresholds established which, when exceeded, institute more aggressive mitigation efforts?
1.7-3.10 For each identified risk, is the potential impact to the program assessed should the risk occur?
1.7-3.11 Does the risk management team provide the program management team clear recommendations and justification for risks requiring mitigation? T1.7-A-L3a
1.7-3.12 Are risk mitigation alternatives provided to management along with the anticipated impact of each alternative on risk, cost, and schedule? T1.7-A-L3b
1.7-3.13 Are risk analysis results and mitigation plans documented? T1.7-C-L3a
1.7-3.14 Is risk mitigation (handling) reviewed for adequacy and completeness?
1.7-3.15 Are risk reduction profiles documented and reviewed for appropriateness? T1.7-B-L3
1.7-3.16 Are risks monitored at appropriate milestones and re-evaluated? T1.7-D-L3
1.7-3.17 Are the results of risk monitoring activities provided to affected personnel/disciplines? T1.7-C-L3b
1.7-3.18 Is there a mechanism for monitoring corrective actions taken and tracking open risk items to closure?
1.7-3.19 Are the metrics used in Tracking and Oversight activities an integral part of the risk monitoring activities?
1.7-3.20/G16 Are metrics collected to assess the effectiveness of risk management activities?
1.7-3.21 Is risk management a part of program formal reviews?
1.7-3.22/G17 Are peer/defect reviews conducted to assess and improve risk management activities and products?
1.7-3.23/G18 Is the risk management process standardized across the organization?
1.7-3.24/G19 Are guidelines provided to allow the program to tailor the standard risk management process for its specific needs?
1.7-3.25/G20 Are the products / results of the risk management activities at least of significant value to the program?
1.7-3.26/G21 Are risk management activities at least of significant effectiveness?

1.7-4 Measured
1.7-4.1 Are metrics regarding identified risks collected and corrective actions taken according to the risk reduction profiles?
1.7-4.2 Are metrics regarding identified risks collected and examined in light of previous risk analyses, and when established thresholds are exceeded, corrective action initiated?
1.7-4.3 During risk monitoring, are new risks identified, analyzed, and corrective action taken?
1.7-4.4/G22 Are metrics used to determine the status and effectiveness of risk management activities?
1.7-4.5/G23 Are analyses performed on the metrics associated with risk management to identify corrective actions for the program?
1.7-4.6/G24 Are the identified corrective actions implemented as necessary for the program?


1.7-4.7/G25 Are the products / results of the risk management activities at least of measurably significant value to the program?

1.7-4.8/G26 Is the effectiveness of risk management activities at least measurably significant?

1.7-5 Optimizing
1.7-5.1/G27 Is the effectiveness of the risk management process and its implementation activities reviewed on both an event-driven and periodic basis?
1.7-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the risk management process and its implementation?
1.7-5.3/G29 Are quality management reviews and/or audits conducted of the risk management activities and program data and the results used to improve the process?
1.7-5.4/G30 Are the metrics collected on the effectiveness of the risk management activities on the program used to monitor and improve the risk management process?
1.7-5.5 Are lessons learned and the effectiveness of risk mitigation actions on the program placed in a readily accessible database for use on future programs?
1.7-5.6/G31 Are the products / results of the risk management activities of optimal value to the program?
1.7-5.7/G32 Are risk management activities of optimal effectiveness?

5.1.8 KFA 1.8 DATA MANAGEMENT

Data management is the administrative control of program data, both deliverable and non-deliverable. Administrative control involves such activities as identification, interpretation of data requirements, planning, scheduling, control, archiving, and retrieval of program data.

Data are the various forms of documentation required to support a program in all of its areas (e.g., administration, engineering, configuration, financial, logistics, quality, safety, manufacturing, and procurement). The data may take any form (e.g., reports, manuals, notebooks, charts, drawings, specifications, files, or correspondence). The data may exist in any medium (e.g., printed or drawn on various materials, photographs, or electronic media). Data may be deliverable (e.g., items identified by a program contract data requirements list (CDRL)). Data may be non-deliverable (e.g., informal data, trade studies and analyses, internal meeting minutes, internal design review documentation, and action items).

Identification involves the definition of data management tasks, requirements, and responsibilities. Planning consists of the analysis and validation of program deliverable and non-deliverable data requirements. The identification and planning processes begin at the initial program stage, prior to any proposal activities, and continue throughout the life of the program.

Interpretation of data requirements involves participation in decisions that potentially affect the archiving, retrieval, distribution, and delivery of data. This includes recommended changes to the data requirements, schedule adjustments, and problems that affect data development milestones.

All data products should be received, logged, archived, recovered, transmitted, and distributed per a program data management plan, which may be part of a larger program plan. Deliverable data should be controlled and scheduled in order to track contractual data requirements, schedules, and status information. The data item delivery schedule is monitored as part of the management checks and balances system. All major elements should be tracked, including those necessary to (1) identify all data item requirements, their individual delivery schedules, and the performance history against those schedules, (2) identify the originators of each data product increment, and (3) transmit documentation, as well as (4) other pertinent information (e.g., customer response dates, disposition, file page counts, security classification, etc.).
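As a minimal sketch of this kind of tracking (illustrative only; the DataItem record and its fields are hypothetical, not SECAM requirements), a tracked entry might carry its identifier, originator, schedule, and disposition, and be checked against the delivery schedule:

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class DataItem:
        """One tracked program data product (deliverable or non-deliverable)."""
        identifier: str                  # e.g., a CDRL item number
        originator: str                  # who produces each increment
        due: date                        # contractual or internal delivery date
        delivered: Optional[date] = None
        disposition: str = ""            # e.g., customer response, page count, classification

        def is_late(self, today: date) -> bool:
            # The delivery schedule is monitored as part of the program's
            # checks-and-balances system.
            return self.delivered is None and today > self.due

    # Usage: flag overdue items for follow-up with their originators.
    item = DataItem("CDRL A012", originator="systems engineering", due=date(1996, 7, 1))
    print(item.is_late(date(1996, 7, 15)))   # True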

General Characteristics
Characteristic 1 Data management products and activities are planned.
Characteristic 2 Program data are identified, controlled, inspected, archived, and retrieved, as appropriate.
Characteristic 3 Data item changes are controlled and evaluated consistent with the technical, program, and configuration management requirements.
Characteristic 4 The status and contents of the program data are communicated to those that need them.

Questions

yes no

1.8-1 Performed
1.8-1.1/G1 Is data management being accomplished in at least an informal manner?
1.8-1.2 Is there a means of archiving and retrieving program data? T1.8-A-L1
1.8-1.3 Are relationships between requirements and interface documents used within various levels of system development documented (e.g., by using a specification tree)? T1.8-B-L1a
1.8-1.4 Are relationships between drawings for each physical element of the system documented (e.g., using drawing trees)? T1.8-B-L1b
1.8-1.5 Are relationships between software elements and interfaces documented (e.g., using software structure trees)? T1.8-B-L1c
1.8-1.6/G2 Are data management activities planned for the program?
1.8-1.7/G3 Are the products / results of the data management activities at least of marginal value to the program?
1.8-1.8/G4 Are data management activities at least of marginal effectiveness?

1.8-2 Managed
1.8-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing data management?
1.8-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform data management?
1.8-2.3/G7 Has the responsibility for establishing and managing program data been assigned?
1.8-2.4/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing data management activities assessed?


1.8-2.5 Are data management personnel trained in the objectives, procedures, and methods for performing data management activities?

1.8-2.6/G9 Are systems engineering personnel and other engineering personnel trained to perform their respective data management activities?

1.8-2.7 Are planning, scheduling, and control accomplished for program data? T1.8-A-L2
1.8-2.8 Are program data requirements established? T1.8-B-L2
1.8-2.9 Are program data inspected prior to delivery or archiving? T1.8-C-L2
1.8-2.10/G10 Is data management being performed in a structured manner?
1.8-2.11/G11 Are data collected for monitoring data management activities?
1.8-2.12/G12 Are corrective actions initiated when data management activities deviate significantly from the plan?
1.8-2.13/G13 Are the products / results of the data management activities at least of adequate value to the program?
1.8-2.14/G14 Are data management activities at least of adequate effectiveness?

1.8-3 Defined
1.8-3.1/G15 Are data management activities planned, approved, and established according to a formal procedure?
1.8-3.2 Is a common data management archival and retrieval system used throughout the organization? T1.8-A-L3a
1.8-3.3 Does the archival and retrieval system provide capture techniques for program data that are appropriate to the degree of formality of the data? T1.8-B-L3a
1.8-3.4 Can desired program data be efficiently located based upon common characteristics (e.g., key words, topics, contract number, etc.)? T1.8-A-L3b
1.8-3.5 Can desired program data be quickly retrieved? T1.8-A-L3c
1.8-3.6 Are program data requirements established based upon a common or standard set of data requirements? T1.8-B-L3b
1.8-3.7 Is a standard process used to control changes to data requirements standards?
1.8-3.8 Are program data inspected for compliance to data requirements prior to delivery or archiving? T1.8-C-L3
1.8-3.9 Are program data formally archived and controlled? T1.8-A-L3d
1.8-3.10 Is the status of program data recorded?
1.8-3.11 Are status reports documenting data management activities communicated to appropriate groups and/or individuals?
1.8-3.12 At any given point in time, can the status of program data under data management be identified?
1.8-3.13 Are individuals having responsibility for the generation of program data alerted to upcoming milestones and/or delivery dates?
1.8-3.14/G16 Are metrics collected for assessing the effectiveness of data management activities?
1.8-3.15/G17 Are peer/defect reviews conducted to assess and improve data management activities and products?
1.8-3.16/G18 Are data management processes standardized across the organization?
1.8-3.17/G19 Are guidelines provided to allow the program to tailor the standard data management process for its specific needs?


1.8-3.18/G20 Are the products / results of the data management activities at least of significant value to the program?

1.8-3.19/G21 Are data management activities at least of significant effectiveness?

1.8-4 Measured
1.8-4.1/G22 Are metrics used to determine the status and effectiveness of data management activities?
1.8-4.2/G23 Are analyses performed on the metrics associated with data management to identify corrective actions for the program?
1.8-4.3/G24 Are the identified corrective actions implemented as necessary for the program?
1.8-4.4/G25 Are the products / results of the data management activities at least of measurably significant value to the program?
1.8-4.5/G26 Is the effectiveness of data management activities at least measurably significant?

1.8-5 Optimizing
1.8-5.1/G27 Is the effectiveness of the data management process and its implementation activities reviewed on both an event-driven and periodic basis?
1.8-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the data management process and its implementation?
1.8-5.3 Are data management activities periodically reviewed to confirm that the requirements for program data are still valid? T1.8-C-L5
1.8-5.4/G29 Are quality management reviews and/or audits conducted of the data management activities and program data and the results used to improve the process?
1.8-5.5/G30 Are the metrics collected on the effectiveness of data management activities used to monitor and improve the systems engineering process?
1.8-5.6/G31 Are the products / results of the data management activities of optimal value to the program?
1.8-5.7/G32 Are data management activities of optimal effectiveness?


5.2 CATEGORY 2 ORGANIZATION PROCESS CATEGORY

The Organization Process Category consists of the following key focus areas:

2.1 Process Management and Improvement
2.2 Competency Development
2.3 Technology Management
2.4 Environment and Tool Support

5.2.1 KFA 2.1 PROCESS MANAGEMENT AND IMPROVEMENT

Process management involves those activities needed to establish and maintain the processes required to accomplish systems engineering. Process management and improvement involves establishing process-related criteria and improvement criteria for systems engineering activities independent of any particular program. These criteria take the form of company policies and standards for the performance of systems engineering processes. Process management establishes engineering and quality standards and guidelines for systems engineering and for the tracking and improvement of the systems engineering process.
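To illustrate tailoring (a sketch only, under the assumption of a simple step-list process; the steps and the tailor function are hypothetical, not part of the SECAM), a program-specific process is always derived from the organization's standard process by documented waivers and additions:

    # A minimal sketch of tailoring an organization-wide standard process.
    STANDARD_PROCESS = [
        "requirements analysis",
        "functional analysis and allocation",
        "synthesis",
        "trade studies",
        "formal subcontractor reviews",
    ]

    def tailor(standard, drop, add):
        """Apply documented tailoring guidelines: steps may be waived or
        added, but the result is always derived from the standard process."""
        return [step for step in standard if step not in drop] + list(add)

    # A small in-house program with no subcontracts might tailor as follows:
    program_process = tailor(
        STANDARD_PROCESS,
        drop={"formal subcontractor reviews"},
        add=["prototype evaluation"],
    )
    print(program_process)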

General Characteristics
Characteristic 1 Process management and improvement activities are planned.
Characteristic 2 An organization-wide family of standard systems engineering processes is established.
Characteristic 3 Guidelines are established for tailoring the organization's standard systems engineering processes into a program-specific systems engineering process.
Characteristic 4 An organization-wide repository for systems engineering process related information is established and maintained.
Characteristic 5 Information from project and organizational sources is used to continuously improve the organization's standard systems engineering processes and to integrate processes which interact with one another.

Questions

yes no

2.1-1 Performed
2.1-1.1/G1 Is systems engineering process management being accomplished in at least an informal manner?
2.1-1.2 Are systems engineering processes being identified? T2.1-A-L1
2.1-1.3/G2 Are systems engineering process management activities planned for the organization?


2.1-1.4/G3 Are the products / results of the systems engineering process management activities at least of marginal value to the organization?

2.1-1.5/G4 Are systems engineering process management activities at least of marginal effectiveness?

2.1-2 Managed
2.1-2.1/G5 Does the organization follow a written organizational policy (may be part of a broad-based policy) for implementing and maintaining its systems engineering process?
2.1-2.2/G6 Is there an approved systems engineering process improvement plan (stand-alone or part of the systems engineering process)?
2.1-2.3 Is improvement of the systems engineering process being accomplished in at least an informal manner? T2.1-B-L2
2.1-2.4/G7 Is there a designated systems engineering manager or team leader responsible for process management?
2.1-2.5 Are there designated systems engineers responsible for process management and improvement? T2.1-C-L2a
2.1-2.6 Is systems engineering process improvement part of the responsibility of those responsible for process management? T2.1-C-L2b
2.1-2.7 Is the organizational policy clearly and completely described and presented to all engineering and program personnel? T2.1-D-L2
2.1-2.8 Has a managed and controlled process database been established for process data in the organization? T2.1-A-L2a
2.1-2.9 Are best practices within the organization identified and communicated to other programs? T2.1-A-L2b
2.1-2.10/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided to perform systems engineering process management and improvement activities assessed?
2.1-2.11/G9 Is training provided to systems engineering personnel involved in systems engineering process management?
2.1-2.12 Is training provided on how to improve the systems engineering process? T2.1-E-L2
2.1-2.13/G10 Is systems engineering process management and improvement performed in a structured manner?
2.1-2.14/G11 Are data collected for monitoring systems engineering process management and improvement activities?
2.1-2.15/G12 Are corrective actions initiated when process management and improvement activities deviate significantly from the plan?
2.1-2.16/G13 Are the products / results of the process management and improvement activities at least of adequate value to the program?
2.1-2.17/G14 Are process management and improvement activities at least of adequate effectiveness?

2.1-3 Defined
2.1-3.1/G15 Are process management and improvement activities planned, approved, and established according to a formal procedure?


2.1-3.2 Has a standard systems engineering process been documented by the organization? T2.1-A-L3a

2.1-3.3 Are the inputs and outputs of systems engineering process activities clearly defined? T2.1-A-L3b

2.1-3.4 Are initiation and completion criteria defined for each major activity in the systems engineering process? T2.1-A-L3c

2.1-3.5/G16 Are metrics collected for assessing the effectiveness of process management and improvement?

2.1-3.6/G17 Are peer/defect reviews conducted to assess and improve process management and improvement activities and products?

2.1-3.7 Are error causes reviewed to determine whether changes to the systems engineering process are required to prevent future occurrences of such errors? T2.1-B-L3a

2.1-3.8 Is a mechanism used for periodically assessing the systems engineering process and implementing indicated improvements? T2.1-B-L3b

2.1-3.9 Does the organization seek to benchmark its systems engineering process against processes used by other organizations? T2.1-B-L3c

2.1-3.10/G18 Does the organization have a standard systems engineering process?
2.1-3.11/G19 Does the standard systems engineering process permit tailoring to meet specific program needs?
2.1-3.12 Is there an established set of tailoring guidelines? T2.1-E-L3
2.1-3.13 Does the program use the organization's defined systems engineering process? T2.1-F-L3a
2.1-3.14 Are formal methods for systems engineering used on programs? T2.1-F-L3b
2.1-3.15 Is a single process management and improvement process standardized across the organization? T2.1-B-L3c
2.1-3.16/G20 Are the products / results of the systems engineering process management and improvement activities at least of significant value to the organization?
2.1-3.17/G21 Are systems engineering process management and improvement activities at least of significant effectiveness?

2.1-4 Measured
2.1-4.1 Is the effectiveness of formal reviews and the quality of the systems engineering products analyzed on each program? T2.1-F-L4
2.1-4.2 Is systems engineering productivity measured and analyzed for each major process activity within the systems engineering process?
2.1-4.3 Is the effectiveness of informal reviews and the quality of their products analyzed on each program?
2.1-4.4 Are the data gathered during inspections used to improve the systems engineering process?
2.1-4.5 Are uniform systems engineering process metrics used across programs?
2.1-4.6 Is a mechanism used to evaluate the utility of process metrics collected across all programs?
2.1-4.7 Has the organization been asked to benchmark its practices against those of other organizations? T2.1-B-L4
2.1-4.8 Has the organization's systems engineering process received external recognition (e.g., recognized industry leadership, receipt of professional society awards)?


2.1-4.9/G22 Are process metrics used to determine the effectiveness of each major activity in the systems engineering process?

2.1-4.10/G23 Is a formal process established to apply the results of the evaluation of the process metrics to systems engineering process improvement?

2.1-4.11/G24 Are the identified corrective actions implemented as necessary within the organization?

2.1-4.12/G25 Are the products / results of the systems engineering process management and improvement activities at least of measurably significant value to the organization?

2.1-4.13/G26 Is the effectiveness of systems engineering process management and improvement activities at least measurably significant?

2.1-5 Optimizing
2.1-5.1/G27 Is the effectiveness of the process management and improvement activities reviewed on both an event-driven and periodic basis?
2.1-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the process management and improvement process?
2.1-5.3/G29 Are quality management reviews and/or audits conducted of the process management and improvement process and the results used to improve the process?
2.1-5.4/G30 Are the metrics collected on the effectiveness of the systems engineering process used to monitor and improve the systems engineering process?
2.1-5.5 Is a mechanism used for periodically assessing the systems engineering process and implementing indicated improvements? T2.1-B-L5
2.1-5.6/G31 Are the products / results of the systems engineering process management and improvement activities of optimal value to the program?
2.1-5.7/G32 Are systems engineering process management and improvement activities of optimal effectiveness?
2.1-5.8 Is a formal procedure used to assure periodic management review of each program and institute changes to the systems engineering process as needed? (Note: From KFA 3.1) T2.1-D-L5

5.2.2 KFA 2.2 COMPETENCY DEVELOPMENT

An effective competency development program requires planning, procedures, training media (e.g., workbooks, computer software, etc.), and a database of competency development process data. As an organizational process, the main components of competency development include a managed competency development program, documented plans, personnel with appropriate mastery of systems engineering, and mechanisms for measuring the effectiveness of the competency development program.

The purpose of competency development is twofold: 1) to increase organizational and program competency (as an alternative to hiring the needed talent) to perform the style, scope, and intensity of systems engineering required over time, and 2) to provide a learning environment for individuals who want to increase their knowledge, skills, wisdom, competency or mastery of systems engineering.


Competency development occurs on several levels: 1) university education resulting in a degree, and non-degree short courses, 2) company-sponsored training, and 3) on-the-job experiences. Each approach should be recognized and nurtured by the organization's competency development program.

Effective competency development requires courseware, trainers, and a learning environment. Courseware ranges from textbooks to domain-specific examples to current program tasks. Trainers range from professors to professional systems engineering educators / mentors to (co-learning) peers and managers. The learning environment ranges from lecture to laboratory to studio.

Competency development advances both the organization's and the individual's capability and opportunity. Development objectives are determined by the organization's near term needs and long term strategies. Since both near term needs and long term strategies change, the competency development program must be adaptable. Individual career planning is a convenient mechanism for integrating the individual's near and long term educational needs with the organization's needs.

The competency development process consists of a managed program (not limited to "classroom" events only), needs forecast and analysis, development plans, courseware, proficiency tests and standards, and sponsor/alumni feedback mechanisms to measure the effectiveness of the program. These elements must be coordinated to continuously improve both the skills of the individual and the efficiency of the organization and the program, to meet the near term needs and long term strategies of both in an optimal fashion.

Special considerations with respect to systems engineering competency development include:

• spanning the variety of topics consistent with serving a broad constituency of candidates, such as: transitional (into systems engineering), lateral (domain breadth), vertical (functional or life cycle expertise), and management (including modeling and measuring systems engineering effectiveness).

• accommodating the spectrum of learning styles of adult professionals.

• courseware that represents the broad problem space of systems engineering so that systems engineering-specific skills such as trade-off analysis, risk analysis, etc., are meaningfully presented.

• covering collaborative or team aspects of systems engineering as well as the “technical" tasks.

• accomplishing competency development in the minimum time and with minimum interference with ongoing programs.

General Characteristics
Characteristic 1 Competency development activities are planned and managed.
Characteristic 2 Competency development activities meet the needs of the organization.
Characteristic 3 Competency development meets the needs of the program.
Characteristic 4 Competency development meets the needs of the individual.
Characteristic 5 Post-competency development achievement levels are measured to determine effectiveness and to provide feedback for process improvement.


Questions

yes no

2.2-1 Performed

2.2-1.1/G1 Is competency development being accomplished in at least an informal manner?

2.2-1.2/G2 Are competency development activities planned for the organization?

2.2-1.3 Are alumni capable of performing the style of systems engineering that the business needs? T2.2-A-L2

2.2-1.4/G3 Are the products / results of the competency development activities at least of marginal value to the organization?

2.2-1.5 Are the products / results of the competency development activities at least of marginal value to the program?

2.2-1.6 Are the products / results of the competency development activities at least of marginal value to the individual?

2.2-1.7/G4 Are competency development activities at least of marginal effectiveness?

2.2-2 Managed

2.2-2.1/G5 Does the organization follow a written organizational policy (may be part of a broad-based policy) for implementing competency development?

2.2-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the organization to perform competency development?

2.2-2.3/G7 Is there a designated systems engineering manager or team leader responsible for the management of systems engineering competency development?

2.2-2.4 Have specific resources been identified to implement the competency development plan?

2.2-2.5 Are long term competency development needs based upon the organization's strategic plan? T2.2-B-L2

2.2-2.6 Are near term competency development needs based upon immediate program needs? T2.2-C-L2a

2.2-2.7 Are specific skills identified in the organization's competency development plan? T2.2-C-L2b

2.2-2.8 Does the organization recognize the need for three different types of competency development approaches (i.e. formal education, in-house, and on-the-job)? T2.2-D-L2

2.2-2.9 Is competency development provided for key functional areas (e.g. analysis techniques specific to the organization's problem domains)? T2.2-E-L2

2.2-2.10 Are personnel knowledgeable in systems engineering assigned to perform training? T2.2-G-L2a

2.2-2.11 Are trainers required to demonstrate proficiency in the topics for which they intend to train others? T2.2-H-L2a

2.2-2.12 Does the training team consist of personnel with appropriate skills? T2.2-G-L2b

2.2-2.13 Are management personnel involved in competency development activities, either as a recipient or as a participant? T2.2-F-L2


2.2-2.14/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing competency development activities assessed?

2.2-2.15/G9 When skills are inadequate, is training provided to members of the competency development team?

2.2-2.16 Are alumni capable of performing the style and scope of systems engineering that the business needs? T2.2-A-L2

2.2-2.17 Do competency development personnel receive recognition for their efforts? T2.2-H-L2b

2.2-2.18/G10 Is competency development being performed in a structured manner?

2.2-2.19/G11 Are data collected for monitoring competency development activities?

2.2-2.20/G12 Are corrective actions initiated when competency development activities deviate significantly from the plan?

2.2-2.21/G13 Are the products / results of the competency development activities at least of adequate value to the organization?

2.2-2.22 Are the products / results of the competency development activities at least of adequate value to the program?

2.2-2.23 Are the products / results of the competency development activities at least of adequate value to the individual?

2.2-2.24/G14 Are competency development activities at least of adequate effectiveness?

2.2-3 Defined

2.2-3.1/G15 Are competency development activities planned, approved, and established according to a formal procedure?

2.2-3.2 Does the organization recognize and integrate three basic types of training (i.e. formal education, in-house training, and on-the-job training) into its competency development plan? T2.2-D-L3

2.2-3.3 Is there a mechanism for identifying competency development needs within the organization? (moved from level 2) T2.2-C-L3

2.2-3.4 Are competency development opportunities and the relationship between competency development and career opportunity clearly stated and communicated to all personnel within the organization? T2.2-J-L3a

2.2-3.5 Is there a tools, methods, and procedures competency development program for systems engineers within the organization? T2.2-E-L3a

2.2-3.6 Is there a required systems engineering project management competency development program for all engineering managers? T2.2-E-L3b

2.2-3.7 Are first-line managers, team leaders, and engineers trained on the organization's standard systems engineering process? T2.2-F-L3a

2.2-3.8 Is there a mechanism to develop individual competency development goals consistent with both the individual's career objectives and the organization's needs? T2.2-B-L3

2.2-3.9 Is there a process for the collection and interpretation of systems engineering competency development metrics? T2.2-I-L3

2.2-3.10 Are quality management and configuration management personnel trained in basic principles of systems engineering? T2.2-F-L3b

2.2-3.11 Have systems engineering and program managers received training in technical management? T2.2-E-L3c


2.2-3.12 Are completion criteria for each training course documented in standards or course descriptions? T2.2-J-L3b

2.2-3.13 Is there a mechanism to evaluate students to verify their comprehension of training materials prior to recognition? T2.2-K-L3a

2.2-3.14 Are alumni capable of performing the style, scope and intensity of systems engineering that the business needs? T2.2-A-L3

2.2-3.15 Is there a mechanism to formally recognize competency development achievements? T2.2-K-L3b

2.2-3.16 Do competency development achievements contribute toward individual recognition in terms of opportunity and career advancement? T2.2-K-L3c

2.2-3.17/G16 Are metrics collected for assessing the effectiveness of competency development?

2.2-3.18/G17 Are peer/defect reviews conducted to assess and improve competency development activities and products?

2.2-3.19/G18 Are competency development processes standardized across the organization?

2.2-3.20/G19 Are guidelines provided to allow the program to implement competency development for its specific needs?

2.2-3.21/G20 Are the products / results of the competency development activities at least of significant value to the participants (organization, program, & individual)?

2.2-3.22/G21 Are competency development activities at least of significant effectiveness?

2.2-4 Measured

2.2-4.1 Is each systems engineering staff member's professional experience, expertise, and career path tracked? T2.2-K-L4

2.2-4.2 Is there a mechanism for assessing the effectiveness of each systems engineering training course with respect to set objectives? T2.2-H-L4

2.2-4.3 Do students evaluate how well competency development activities meet their needs? T2.2-I-L4

2.2-4.4 Are competency development process effectiveness metrics collected and analyzed with respect to the organization's strategic plan, and corrective actions taken as necessary? T2.2-B-L4

2.2-4.5/G22 Are metrics used to determine the status and effectiveness of competency development activities?

2.2-4.6/G23 Are analyses performed on the metrics associated with competency development to identify corrective actions for the organization?

2.2-4.7/G24 Are the identified corrective actions implemented as necessary for the program?

2.2-4.8 Have the organization's competency development efforts received external recognition (e.g., recognized industry leadership, receipt of professional society awards)?

2.2-4.9/G25 Are the products / results of competency development activities at least of measurably significant value to the participants (organization, program, & individual)?

2.2-4.10/G26 Is the effectiveness of competency development activities at least measurably significant?


2.2-5 Optimizing

2.2-5.1/G27 Is the effectiveness of the competency development process and its implementation activities reviewed on both an event-driven and periodic basis?

2.2-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the competency development process and its implementation?

2.2-5.3/G29 Are quality management reviews and/or audits conducted of the competency development activities and the results used to improve the process?

2.2-5.4/G30 Are the metrics collected on the effectiveness of the competency development process used to monitor and improve the systems engineering process?

2.2-5.5 Is a mechanism used to tailor the competency development plan to satisfy future systems engineering competency development needs? T2.2-I-L5

2.2-5.6/G31 Are the products / results of the competency development activities of optimal value to the participants (organization, program, & individual)?

2.2-5.7/G32 Are competency development activities of optimal effectiveness?

5.2.3 KFA 2.3 TECHNOLOGY MANAGEMENT

Technology management involves identifying, selecting, evaluating, and investing in new technologies, and incorporating the appropriate technologies into the organization's products and processes. By maintaining an awareness of product/process technology innovations throughout the world and systematically evaluating and experimenting with them, the organization selects appropriate technologies to improve its competitiveness and increase both productivity and product quality. Pilot efforts are performed to assess new and unproven technologies before they are introduced across the organization and, where required, investments are made to increase the maturity of the technology. With appropriate sponsorship of the organization's management, the selected technologies are incorporated into the organization's products and standard process.
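As one concrete reading of the investment aspect, the sketch below estimates a simple return on investment for a candidate technology before organization-wide introduction. It is a minimal example under stated assumptions: the inputs, the payback-style formula, and the figures are invented for illustration and are not prescribed by the SECAM.

```python
# Illustrative sketch only: a simple net-benefit / cost ratio for a
# candidate technology over a fixed horizon. Inputs are assumptions.
def technology_roi(adoption_cost: float,
                   annual_benefit: float,
                   annual_support_cost: float,
                   horizon_years: int) -> float:
    """Return net benefit over the horizon divided by the adoption cost."""
    net_annual = annual_benefit - annual_support_cost
    return (net_annual * horizon_years - adoption_cost) / adoption_cost

# Example: a pilot tool costing 120k with 60k/yr benefit and 15k/yr
# support cost, evaluated over a 4-year horizon.
roi = technology_roi(120_000, 60_000, 15_000, 4)
print(f"ROI over horizon: {roi:.2f}")  # 0.50 -> 50% net return
```

A real cost/benefit analysis (see questions 2.3-3.3 and 2.3-3.9 below) would also weigh risk, maturity, and schedule impact, not a single ratio.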

General Characteristics
Characteristic 1 The organization has a process for maintaining technology awareness.
Characteristic 2 Selection of product and process technologies is planned according to the organization's needs.
Characteristic 3 Investments are made to increase technology maturity.
Characteristic 4 Technology innovations directly improve the organization's standard products and services.
Characteristic 5 Technology innovations directly improve the organization's standard process.


Questions

yes no

2.3-1 Performed

2.3-1.1/G1 Is technology management being accomplished in at least an informal manner?

2.3-1.2/G2 Are technology management activities planned for the organization?

2.3-1.3 Does the organization foster awareness of the state-of-the-art technology? T2.3-A-L1

2.3-1.4/G3 Are the products / results of technology management activities at least of marginal value to the organization?

2.3-1.5/G4 Are technology management activities at least of marginal effectiveness?

2.3-2 Managed

2.3-2.1/G5 Does the organization follow a written organizational policy (may be part of a broad-based policy) for technology management?

2.3-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the organization to perform technology management?

2.3-2.3 Does the organization encourage innovation? T2.3-A-L2

2.3-2.4 Does the organization require appropriate analysis before new technology insertion is allowed? T2.3-B-L2

2.3-2.5 Does the organization support participation in technical consortia, societies, and collaborations?

2.3-2.6/G7 Is responsibility designated for technology management?

2.3-2.7 Are systems engineering personnel assigned to perform technology management for the organization?

2.3-2.8 Are technology evolving groups established?

2.3-2.9 Is participation in new technology identification, assessment, and incorporation a part of the annual budget?

2.3-2.10/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for technology management activities assessed?

2.3-2.11 Are formal criteria established for the commercial-off-the-shelf (COTS) / internal development decision process?

2.3-2.12 Does the technology management team consist of personnel with appropriate skills?

2.3-2.13/G9 When skills are inadequate, is technology management training provided?

2.3-2.14/G10 Is technology management being performed in a structured manner?

2.3-2.15/G11 Are data collected for monitoring technology management activities?

2.3-2.16/G12 Are corrective actions initiated when technology management activities deviate significantly from the plan?

2.3-2.17/G13 Are the products / results of the technology management activities at least of adequate value to the program?

2.3-2.18/G14 Are technology management activities at least of adequate effectiveness?


2.3-3 Defined

2.3-3.1/G15 Are technology management activities planned, approved, and established according to a formal procedure?

2.3-3.2 Is a mechanism used to identify personnel to participate in technology exploration?

2.3-3.3 Are technology improvement results reviewed for return on investment?

2.3-3.4 Is a mechanism established to disseminate technology improvement results throughout the program/organization?

2.3-3.5 Is a mechanism established to facilitate introduction of new products or transfer of technology across the program/organization?

2.3-3.6 Is a mechanism used for maintaining awareness and disseminating knowledge of the state-of-the-art technology? T2.3-A-L3

2.3-3.7 Are measures taken to insert new technology into the systems engineering process?

2.3-3.8 Are mechanisms used for managing and supporting the introduction of new technologies into products?

2.3-3.9 Is a cost/benefit analysis performed prior to the adoption of new technologies? T2.3-B-L3a

2.3-3.10 Is the effectiveness of newly introduced technologies reviewed to verify the analysis used to justify their introduction? T2.3-B-L3b

2.3-3.11 Is a mechanism used for evaluating technologies used by the organization versus those externally available? T2.3-B-L3c

2.3-3.12 Is a mechanism used for identifying and replacing obsolete technologies?

2.3-3.13 Is a mechanism used for assessing existing designs and specifications for reuse in new applications?

2.3-3.14 Are technology improvement activities formally documented?

2.3-3.15/G16 Are metrics collected for assessing the effectiveness of technology management?

2.3-3.16/G17 Are peer/defect reviews conducted to assess and improve technology management activities and products?

2.3-3.17/G18 Are technology management processes standardized across the organization?

2.3-3.18/G19 Are guidelines provided to allow the program to incorporate technology management as appropriate for its specific needs?

2.3-3.19/G20 Are the products / results of technology management activities at least of significant value to the organization?

2.3-3.20/G21 Are technology management activities at least of significant effectiveness?

2.3-4 Measured

2.3-4.1/G22 Are metrics of technology management activities analyzed to identify trends and improve effectiveness?

2.3-4.2/G23 Are analyses performed on the metrics associated with technology management to identify corrective actions for the organization?

2.3-4.3/G24 Are the identified corrective actions implemented as necessary for the organization?

2.3-4.4/G25 Are the products / results of technology management activities at least of measurably significant value to the organization?


2.3-4.5/G26 Is the effectiveness of technology management activities at least measurably significant?

2.3-5 Optimizing

2.3-5.1/G27 Is the effectiveness of the technology management process and its implementation activities reviewed on both an event-driven and periodic basis?

2.3-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the technology management process and its implementation?

2.3-5.3/G29 Are quality management reviews and/or audits conducted of the technology management activities and the results used to improve the process?

2.3-5.4/G30 Are the metrics collected on the effectiveness of the technology management process used to monitor and improve the technology management process?

2.3-5.5/G31 Are the products / results of technology management activities of optimal value to the organization?

2.3-5.6/G32 Are technology management activities of optimal effectiveness?

5.2.4 KFA 2.4 ENVIRONMENT AND TOOL SUPPORT

The environment and tool support key focus area addresses the technology support for the execution of a common systems engineering process. This activity includes managing the efficiency and effectiveness of the existing environment and tools, forecasting, planning and acquiring new tools, and tailoring the existing environment and tools for each program's needs. Responsibility for supporting the environment and its tools must be defined. Adequate resources must also be made available to support these activities. The environment and tools should be changed or upgraded according to a documented plan based upon the organization's goals and program requirements.

The environment in this context is the underlying computing and communications support for the tool set as well as the means of integrating individual tools to provide interoperability. Systems engineering tools span all KFAs. Tools may be categorized into the domains of 1) requirements management, 2) requirements analysis, 3) modeling and simulation, 4) change management, and 5) verification. Requirements management tools are used to establish and maintain the relationships between technical aspects of the program (i.e. requirements, functions, architecture, components, verification methods, etc.) and the management aspects of the program (work tasks, resources, organization structure, facilities, etc.). Technology is an enabling driver for the environment and tools. As environment and tool technologies improve, more sophisticated processes and methods can be realized, thereby improving the capability to perform work more efficiently.
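The relationship-keeping role described above can be pictured as a small bidirectional link store. The sketch below is a minimal illustration with invented item identifiers; it is not a description of any particular requirements management tool.

```python
# Illustrative sketch only: a minimal bidirectional link store relating
# technical items (requirements, functions, components) to management
# items (work tasks). Identifiers are invented for the example.
from collections import defaultdict

class TraceabilityStore:
    """Keeps symmetric links between program items."""
    def __init__(self):
        self.links = defaultdict(set)

    def link(self, item_a: str, item_b: str) -> None:
        # Record the relationship in both directions.
        self.links[item_a].add(item_b)
        self.links[item_b].add(item_a)

    def related(self, item: str) -> set:
        return self.links[item]

store = TraceabilityStore()
store.link("REQ-101 max weight 950 kg", "FUNC-12 structural support")
store.link("FUNC-12 structural support", "COMP-7 chassis")
store.link("REQ-101 max weight 950 kg", "TASK-44 weight budget review")
print(store.related("REQ-101 max weight 950 kg"))
```

A store like this is what lets tools answer impact questions (which tasks and components a requirement change touches) across the disciplines on a program.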

General Characteristics
Characteristic 1 Environment and tool support activities are planned.
Characteristic 2 Engineers are provided with an environment and a set of tools which support an established systems engineering process.
Characteristic 3 The environment and tools can be tailored to a program's needs.
Characteristic 4 Improvements to the environment and tools are implemented in a controlled manner.


Questions

yes no

2.4-1 Performed

2.4-1.1/G1 Is environment and tool support for systems engineering activities being accomplished in at least an informal manner?

2.4-1.2 Is there a recognized systems engineering environment and tool set? T2.4-A-L1

2.4-1.3 Is the selection of systems engineering tools driven by the needs of the enterprise? T2.4-B-L1

2.4-1.4/G2 Are environment and tool support activities planned for the organization?

2.4-1.5/G3 Are the systems engineering environment, tools, and support at least of marginal value to the organization?

2.4-1.6/G4 Are the systems engineering environment and tool development and support activities at least of marginal effectiveness?

2.4-2 Managed

2.4-2.1/G5 Is there a policy (may be part of a broad-based policy) endorsing the use of a common environment and standard tools for systems engineering activities?

2.4-2.2 Are the needs of the enterprise captured as a documented set of requirements for the systems engineering environment and tools? T2.4-C-L2

2.4-2.3 Has an environment and a set of tools been deployed? T2.4-A-L2

2.4-2.4/G6 Is there a plan to provide resources to maintain and upgrade facilities and tools in support of the systems engineering environment?

2.4-2.5/G7 Have responsibilities been assigned to provide environment and tool support for systems engineering activities?

2.4-2.6/G8 Is the adequacy of resources (e.g. funding, staff, tools, etc.) provided to perform systems engineering environment and tool support assessed?

2.4-2.7/G9 Are technical support personnel trained to perform environment and tool support for the systems engineering activities?

2.4-2.8 Are the responsibilities and duties of the technical support personnel established? T2.4-D-L2

2.4-2.9 Has an environment and set of tools been established for application to systems engineering tasks? T2.4-B-L2

2.4-2.10 Is training provided to systems engineering personnel who use the environment and tools? T2.4-E-L2

2.4-2.11 Are the environment and tools adaptable for specific program requirements? T2.4-F-L2

2.4-2.12 Does the environment and its information infrastructure facilitate the sharing of key information across all disciplines participating on a program? T2.4-G-L2

2.4-2.13 Is use of the environment and tools promoted by management? T2.4-H-L2

2.4-2.14 Is the design generally driven by the tool? T2.4-J-L2

2.4-2.15/G10 Is environment and tool support for systems engineering activities provided in a structured manner?

2.4-2.16/G11 Are data collected for monitoring environment and tool support activities?


2.4-2.17/G12 Are corrective actions initiated when environment and tool support activities deviate significantly from the plan?

2.4-2.18/G13 Are the products / results of the environment and tool support activities at least of adequate value to the organization?

2.4-2.19/G14 Are environment and tool support activities at least of adequate effectiveness?

2.4-3 Defined

2.4-3.1/G15 Are environment and tool support activities planned, approved, and established according to a formal procedure?

2.4-3.2 Are the needs of each program included as part of the documented set of requirements for the systems engineering environment and tools? T2.4-C-L3a

2.4-3.3 Is a mechanism used to identify and replace obsolete systems engineering tools and methods? T2.4-B-L3a

2.4-3.4 Are measures taken to leverage prior investments in obsolete environment and tool components when migrating to new ones? T2.4-B-L3b

2.4-3.5 Is there a mechanism for assessing new candidate environment and tool components? T2.4-B-L3c

2.4-3.6 Is a trade study performed to determine the costs/benefits of commercial off the shelf (COTS) environment and tool components versus in-house custom environment and tool components? T2.4-B-L3d

2.4-3.7 Is assistance provided to accomplish tailoring (i.e. toolsmiths)? T2.4-E-L3

2.4-3.8 Is the configuration of the environment and tool components managed? T2.4-A-L3a

2.4-3.9 Is there a mechanism for reusing work products?

2.4-3.10 Is a standard set of process and product data recorded within the systems engineering environment? T2.4-I-L3

2.4-3.11 Do tools support the standard systems engineering process? T2.4-C-L3

2.4-3.12 Is the adequacy of environment and tool support assessed? T2.4-D-L3

2.4-3.13/G16 Are metrics collected for assessing the effectiveness of environment and tool support?

2.4-3.14/G17 Are peer/defect reviews conducted to assess and improve environment and tool support activities and products?

2.4-3.15/G18 Are environment and tool support processes standardized across the organization?

2.4-3.16 Has a standard environment and set of tools been established for application to systems engineering tasks? T2.4-A-L3b

2.4-3.17 Does the environment and its information infrastructure facilitate the sharing of all information across all disciplines participating on a program? T2.4-G-L3a

2.4-3.18 Are all tools in the environment suitably integrated? T2.4-G-L3b

2.4-3.19 Are the environment and tools tailorable for specific program requirements? T2.4-F-L3

2.4-3.20/G19 Are guidelines provided to allow the program to tailor the standard environment and tools for its specific needs?

2.4-3.21 Is use of the standard environment and tools mandated by management? T2.4-H-L3

2.4-3.22 Is the information provided by the tools used in decision making processes? T2.4-J-L3


2.4-3.23/G20 Are the products / results of the environment and tool support activities at least of significant value to the organization?

2.4-3.24/G21 Are environment and tool support activities at least of significant effectiveness?

2.4-4 Measured

2.4-4.1 Is the adequacy of the environment and its tools evaluated? T2.4-I-L4a

2.4-4.2 Is the adequacy of the technical support for the environment and its tools evaluated? T2.4-I-L4b

2.4-4.3 Is the effectiveness of environment and tool support being measured? T2.4-I-L4c

2.4-4.4/G22 Are metrics of environment and tool support activities analyzed to identify trends and improve program effectiveness?

2.4-4.5 Does the environment and its information infrastructure integrate all information across all disciplines participating on a program? T2.4-G-L4

2.4-4.6 Is risk management applied to environment and tool support? T2.4-D-L4

2.4-4.7 Are end users of the tools and their environment periodically asked to assess the value of the environment and tools with respect to the development of good products? T2.4-I-L4d

2.4-4.8 Are proposed changes to the documented set of requirements for the systems engineering environment and tools analyzed for their potential impact? T2.4-C-L4a

2.4-4.9/G23 Are analyses performed on the metrics associated with environment and tool support to identify corrective actions?

2.4-4.10/G24 Are the identified corrective actions implemented as necessary?

2.4-4.11 Are lessons learned captured and used to improve the environment and tool support? T2.4-C-L4b

2.4-4.12 Are external trends that might affect the environment and its tools regularly reviewed and assessed for potential impact? T2.4-B-L4

2.4-4.13 Has the organization's use of tools and systems engineering support environment received external recognition (e.g., recognized industry leadership, receipt of professional society awards)?

2.4-4.14 Are tools provided by the organization perceived as improving the program's ability to cope with “crunch time" situations? T2.4-J-L4

2.4-4.15/G25 Are the products / results of the environment and tool support activities at least of measurably significant value to the organization?

2.4-4.16/G26 Is the effectiveness of environment and tool support activities at least measurably significant?

2.4-5 Optimizing

2.4-5.1 Is the effectiveness of environment and tool support being correlated to the performance of each program? T2.4-I-L5a

2.4-5.2/G27 Is the effectiveness of the environment and tool support process and its implementation activities reviewed on both an event-driven and periodic basis?

2.4-5.3/G28 Upon review, are actions taken to correct identified deficiencies in the environment and tool support process and its implementation?

2.4-5.4 Is there a mechanism for ensuring compliance with environment and tool standards? T2.4-C-L5a

2.4-5.5/G29 Are quality management reviews and/or audits conducted of the environment and tool support activities and program data and the results used to improve the process?


2.4-5.6/G30 Are the metrics collected on the effectiveness of environment and tool support activities used to monitor and improve the systems engineering process?

2.4-5.7 Is there a mechanism to ensure that lessons learned from the application of tools on one program are transferred to subsequent programs? T2.4-J-L5

2.4-5.8 Are end users of the tools and their environment periodically asked to suggest improvements? T2.4-I-L5b

2.4-5.9 Are the requirements for the standard systems engineering environment and tools updated as warranted to improve the standard systems engineering process? T2.4-C-L5b

2.4-5.10/G31 Are the products / results of the environment and tool support activities of optimal value to the organization?

2.4-5.11/G32 Are environment and tool support activities of optimal effectiveness?


5.3 CATEGORY 3 SYSTEMS ENGINEERING PROCESS CATEGORY

The Systems Engineering Process Category consists of the following key focus areas:

3.1 System Concept Definition
3.2 Requirements & Functional Analysis
3.3 System Design
3.4 Integrated Engineering Analysis
3.5 System Integration
3.6 System Verification
3.7 System Validation

5.3.1 KFA 3.1 SYSTEM CONCEPT DEFINITION

System concept definition sets the stage for the system life-cycle development activities. System concept definition results in a top-level description of the system based upon identification of the customer and/or user expectations and operational needs, technological limitations, cost-drivers, risks, and justification. Customer needs, objectives and requirements are analyzed in relation to customer mission/operations, operational environments, and desired system characteristics. System concept definition includes the following activities:

• determining customer/user needs and transforming those needs into a customer/user requirements baseline.

• transforming the customer/user requirements baseline into system requirements which, as a set, will satisfy the user needs within the bounds of an agreed upon budget or overall life cycle cost.

• performing operations (or mission) analysis on the system requirements to understand the required behavior of the system.

• deriving alternative system concepts which unify system feature, function, performance, and price (or life cycle cost of ownership).

• articulating the alternative concepts sufficiently for selection of a preferred concept that is verified through formal concept review.

• establishing a system concept baseline.

At each step during system concept definition, alternative requirements, functions, and top-level physical/software architectural solutions are proposed. The system requirements are defined, analyzed, and synthesized only to a level of understanding that allows logical selection of the best of the alternatives which will yield an optimal conceptual system that can be completely developed in successive stages.
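Selection among alternatives is typically supported by a trade study. The sketch below shows one common form, a weighted-sum ranking; the criteria, weights, and scores are invented for the example, and a real trade study would also document rationale and perform sensitivity analysis (see question 3.1-3.13 below).

```python
# Illustrative sketch only: rank alternative system concepts by a
# weighted sum of criterion scores. All figures are invented.
def trade_study(alternatives: dict, weights: dict) -> list:
    """Return alternatives ordered from most to least preferred."""
    def total(scores: dict) -> float:
        return sum(weights[c] * s for c, s in scores.items())
    return sorted(alternatives, key=lambda a: total(alternatives[a]),
                  reverse=True)

weights = {"performance": 0.40, "life_cycle_cost": 0.35, "risk": 0.25}
alternatives = {
    "Concept A": {"performance": 8, "life_cycle_cost": 5, "risk": 6},
    "Concept B": {"performance": 6, "life_cycle_cost": 8, "risk": 7},
}
print(trade_study(alternatives, weights))  # ['Concept B', 'Concept A']
```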

General Characteristics
Characteristic 1 System concept definition activities are planned.


Characteristic 2 Customer/user needs are negotiated and captured in the form of a user requirements baseline.

Characteristic 3 The technical, schedule, and economic feasibility of the system concept is established and the rationale is documented.

Characteristic 4 A preliminary system level requirements and functional baseline are established and traceable to customer/user requirements.

Characteristic 5 A conceptual top-level physical/software architecture is established and traceable to both the concept baseline and the customer/user requirements baseline and the rationale is documented.

Questions

yes no

3.1-1 Performed

3.1-1.1/G1 Is system concept definition being performed in at least an informal manner?

3.1-1.2/G2 Are system concept definition activities planned for the program?

3.1-1.3 Are customer/user needs and constraints documented and understood? T3.1-A-L1

3.1-1.4 Are stakeholder needs and constraints documented and understood? T3.1-B-L1

3.1-1.5 Are technological limitations documented and understood? T3.1-C-L1a

3.1-1.6 Is more than one alternative system concept considered? T3.1-D-L1

3.1-1.7 Are cost drivers associated with each system concept considered? T3.1-E-L1

3.1-1.8 Are risk items associated with each system concept considered? T3.1-F-L1

3.1-1.9 Is the complexity of each system concept considered? T3.1-G-L1

3.1-1.10 Are system expansion and growth concepts considered as part of system concept definition activities? T3.1-C-L1b

3.1-1.11/G3 Are the products / results of system concept definition activities at least of marginal value to the program?

3.1-1.12/G4 Are system concept definition activities at least of marginal effectiveness?

3.1-2 Managed

3.1-2.1/G5 Does the program follow a written organizational policy (may be part of a broader policy) for implementing system concept definition activities?

3.1-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform system concept definition?

3.1-2.3 Are the key elements of the system concepts documented?

3.1-2.4 Is the level of detail and rigor for each system concept alternative developed commensurate with the anticipated size and complexity of the system? T3.1-G-L2a

3.1-2.5/G7 Is there a designated systems engineering manager or team leader responsible for system concept definition on the program?

3.1-2.6 Does the concept definition team have a broad technology background in the primary engineering disciplines required of the system?

3.1-2.7 Does the concept definition team have domain experience in the area(s) of customer/user needs and operational goals?


3.1-2.8 Are operational scenarios developed to scope the anticipated uses of the system? T3.1-H-L2a

3.1-2.9 Are customer/user needs captured in the form of a user requirements baseline?

3.1-2.10 Do documented stakeholder constraints drive the selection and evaluation of alternative solutions? T3.1-B-L2

3.1-2.11 Are customers/users involved in trade studies, especially where key requirements cannot all be met? T3.1-A-L2

3.1-2.12 Are potential technological limitations considered during trade studies of alternative system concepts? T3.1-C-L2

3.1-2.13 Are cost drivers considered during trade studies of alternative system concepts? T3.1-E-L2

3.1-2.14 Are risk items considered during trade studies of alternative system concepts? T3.1-F-L2

3.1-2.15 Is the complexity of the system considered during trade studies of alternative system concepts? T3.1-G-L2b

3.1-2.16 Is functional analysis performed to establish a top-level functional baseline completely traceable to the user requirements baseline?

3.1-2.17 Are trade studies performed to select between alternative concepts? T3.1-D-L2

3.1-2.18 Are formal or informal reviews conducted on the system concept and its allocated baseline? T3.1-H-L2b

3.1-2.19/G8 Is the adequacy of resources (e.g. funding, staff, tools, etc.) provided for performing necessary engineering studies, including prototype development (as required), assessed?

3.1-2.20 Does the concept definition team have the necessary communication skills to interact with the customer and/or users to elicit statements of needs and goals?

3.1-2.21 Does the concept definition team possess the skills needed to develop and assess alternative conceptual approaches?

3.1-2.22/G9 When skills are inadequate, is training provided in system concept definition?

3.1-2.23 Does training include system concept definition techniques, assessment methods, and concept documentation?

3.1-2.24/G10 Is system concept definition and requirements flow down performed in a structured manner on the program?

3.1-2.25/G11 Are data collected for monitoring system concept definition activities?

3.1-2.26/G12 Are corrective actions initiated when system concept definition activities deviate significantly from the plan?

3.1-2.27/G13 Are the products / results of the system concept definition activities at least of adequate value to the program?

3.1-2.28/G14 Are system concept definition activities at least of adequate effectiveness?

3.1-3 Defined

3.1-3.1/G15 Are system concept definition activities planned, approved, and established according to a formal procedure?

3.1-3.2 Are the customers/users of the system directly involved in the definition of the system concept alternatives? T3.1-A-L3

3.1-3.3 Are system capabilities identified which meet the user's needs and customer's expectations?


3.1-3.4 When selecting a preferred system concept, are a sufficient number of alternative system concepts evaluated? T3.1-D-L3a

3.1-3.5 Are constraints reviewed to establish their validity? T3.1-H-L3a

3.1-3.6 Are conflicting stakeholder constraints balanced or resolved? T3.1-B-L3

3.1-3.7 Is growth in technological capability considered when evaluating alternative system concepts? T3.1-C-L3

3.1-3.8 Are life cycle implications of cost drivers considered as part of the trade studies of alternative system concepts? T3.1-E-L3

3.1-3.9 Are estimates of system complexity and the degree of coupling between system elements a factor in trade studies? T3.1-G-L3 (from 3.4-L3)

3.1-3.10 Are models, simulations, or prototypes used in the analytical process of selecting between alternative system concepts? T3.1-D-L3b

3.1-3.11 Is system performance predicted and confirmed across the dynamic range of system operation as required in the operational scenarios? T3.1-H-L3b

3.1-3.12 Is the fidelity of performance predictions validated? T3.1-H-L3c

3.1-3.13 Is a sensitivity analysis performed as part of trade studies? T3.1-D-L3c

3.1-3.14 Are user-system interactions analyzed? T3.1-H-L3d

3.1-3.15 Is there a mechanism to assess concept feasibility? T3.1-H-L3e

3.1-3.16 Are scenarios describing system operation developed for each alternative system concept?

3.1-3.17 Are scenarios describing life cycle support developed for each alternative system concept?

3.1-3.18 Are system and operational concepts documented to a level useful for deriving and validating system requirements? T3.1-H-L3e

3.1-3.19/G16 Are metrics collected for assessing the effectiveness of system and operation concepts?

3.1-3.20/G17 Are peer/defect reviews conducted to assess and improve system concept definition activities and products?

3.1-3.21/G18 Are system concept definition processes standardized across the organization?

3.1-3.22/G19 Are guidelines provided to allow the program to tailor the standard system concept definition process for its specific needs?

3.1-3.23/G20 Are the products / results of the system concept definition activities at least of significant value to the program?

3.1-3.24/G21 Are system concept definition activities at least of significant effectiveness?

3.1-4 Measured

3.1-4.1 Are customer and user reactions to concepts (e.g. operation, requirements, design, testing) used as a basis for evaluating program performance? T3.2-A-L4

3.1-4.2 Are stakeholders involved in the development and tailoring of the program's system concept definition process? T3.1-B-L4

3.1-4.3 Are modeling, simulation, and prototyping results measured and used to improve the program's system concept definition process? T3.1-D-L4

3.1-4.4/G22 Are metrics used to determine the status and effectiveness of system concept definition activities?

3.1-4.5 Is the selected system concept robust with respect to technological limitations? T3.1-2-L4

3.1-4.6 Is the selected system concept robust with respect to cost drivers? T3.1-E-L4


3.1-4.7 Is the selected system concept robust with respect to risks? T3.1-F-L4

3.1-4.8 Is the complexity of the selected system concept controlled based upon the analysis of measures of system complexity (e.g. number of interfaces)? T3.1-5-L4

3.1-4.9/G23 Are analyses performed on the metrics associated with system concept definition to identify corrective actions for the program?

3.1-4.10/G24 Are the identified corrective actions implemented as necessary for the program?

3.1-4.11 Are metrics taken for evaluating the system concept definition process?

3.1-4.12/G25 Are the products / results of the system concept definition activities at least of measurably significant value to the program?

3.1-4.13/G26 Is the effectiveness of system concept definition activities at least measurably significant?

3.1-5 Optimizing

3.1-5.1/G27 Is the effectiveness of the concept definition process and its implementation activities reviewed on both an event-driven and periodic basis?

3.1-5.2 Do the stakeholders review the system concept definition products and recommend improvements to the standard systems engineering process? T3.2-B-L5

3.1-5.3/G28 Upon review, are actions taken to correct identified deficiencies in the concept definition process and its implementation?

3.1-5.4/G29 Are quality management reviews and/or audits conducted of the concept definition process and its data products and the results used to improve the process?

3.1-5.5/G30 Are the metrics collected on the effectiveness of the system concept definition process used to monitor and improve the systems engineering process?

3.1-5.6/G31 Are the products / results of the system concept definition activities of optimal value to the program?

3.1-5.7/G32 Are system concept definition activities of optimal effectiveness?

5.3.2 KFA 3.2 REQUIREMENTS & FUNCTIONAL ANALYSIS

Requirements and functional analysis are the life-cycle development activities associated with the iterative identification and refinement to successively lower levels of the top-level requirements and functional baselines established for the preferred system concept selected in System Concept Definition. These activities establish a layered approach that defines the necessary and sufficient attributes of the lower-level system components required for its successful development, production, deployment, operation, and disposal. Requirements and functional analysis re-examines the outputs of System Concept Definition to develop a complete set of system requirements (explicit, derived, and implicit) and a functional architecture. During these activities, the potential constraints imposed by follow-on synthesis (System Design) are also considered.

During System Concept Definition, the customer/user requirements baseline served as the criteria for the selection of a preferred system concept from among competing alternative system concepts, using trade studies as a means of selection. As the preferred system concept is refined during Requirements and Functional Analysis, the customer/user requirements baseline is re-examined with each successive, lower-level development of requirements and functional architecture. This is done to ensure that customer/user expectations are realized and to resolve any customer/user requirements that cannot be met.

The focus in Requirements and Functional Analysis is on completely defining the technical problem for the preferred system concept. Where alternative requirements are possible, trade studies are used to select the preferred set of requirements with respect to cost, schedule, and performance. Similarly, trade studies are used to select the preferred functional architecture from competing alternatives to satisfy successively lower levels of system requirements; considerations of System Design aid in selecting the feasible solution from among functional architecture alternatives. In this manner, a complete requirements baseline and functional architecture are established that cover an operational and functional view of requirements. This can be related to a physical view of requirements as addressed in System Design. Additionally, a complete assessment of project risk with mitigation options is developed. This is addressed by the Risk Management KFA.

The major tasks of Requirements and Functional Analysis are:

• Requirements analysis to successively lower levels. A structured or organized method to determine functional and performance requirements to successively lower levels based upon the customer/user requirements baseline. Prior requirements analyses are reviewed and updated. Functional requirements identified are used as top-level functions for functional analysis. It is conducted iteratively with functional analysis.

• Functional analysis to successively lower levels. A structured or organized method to define and integrate a functional architecture to successively lower levels based upon the developing lower level requirements baseline. All specified usage modes for the system are considered and a time line analysis is generated for time critical sequencing of functions. Prior requirements analyses are reviewed and updated. Input, output, and functional interfaces are defined. It is conducted iteratively with requirements analysis to ensure requirements are satisfied. It is also conducted iteratively with system design (synthesis) to define and refine feasible solution alternatives.

• Requirements management. The activity of maintaining traceability from the customer/user requirements baseline to the successively developed, lower level requirements that represent the system's capabilities.

• Requirements flowdown. The activity of decomposing and allocating system level requirements to successively lower level elements of the system (a minimal traceability sketch follows this list).

• Functional flowdown. The activity of decomposing and allocating system level functions to successively lower level functions of the system to satisfy the requirements flowdown.

• Re-examination of mission and operational analysis. The activity of re-analyzing and documenting the customer/user's need, as the system develops through requirements and functional analysis activities, in terms of: (1) accomplishing the system's mission or purpose and (2) describing the system's intended operational characteristics.
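The traceability bookkeeping implied by requirements management and flowdown can be illustrated with a small parent-link structure. This is a sketch of the idea with invented identifiers, not a prescribed implementation: every lower-level requirement records the parent it was allocated from, so traceability back to the baseline can be checked mechanically.

```python
# Illustrative sketch only: parent links from lower-level requirements
# up to the top-level (baseline) requirement. Identifiers are invented.
parents = {
    "SYS-1": None,            # system-level requirement (baseline)
    "SUB-1.1": "SYS-1",       # allocated to a subsystem
    "SUB-1.2": "SYS-1",
    "COMP-1.1.1": "SUB-1.1",  # allocated to a component
}

def trace_to_baseline(req: str) -> list:
    """Walk parent links up to the top-level requirement."""
    chain = [req]
    while parents[chain[-1]] is not None:
        chain.append(parents[chain[-1]])
    return chain

def orphans(reqs: dict) -> list:
    """Lower-level requirements whose parent is missing from the set."""
    return [r for r, p in reqs.items() if p is not None and p not in reqs]

print(trace_to_baseline("COMP-1.1.1"))  # ['COMP-1.1.1', 'SUB-1.1', 'SYS-1']
print(orphans(parents))                 # [] -> flowdown is fully traceable
```

Checks like `orphans` correspond to the traceability questions asked at the Defined level below (e.g., 3.2-3.24 and 3.2-3.25).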


General Characteristics
Characteristic 1 Development of sub-system requirements and functional architecture is planned.
Characteristic 2 The user/customer requirements baseline is examined to ensure that expectations are realized and to resolve requirements that cannot be met.
Characteristic 3 A complete system requirements baseline (including subsystems) is established and traceable to the customer/user requirements baseline.
Characteristic 4 A complete functional architecture is established and traceable to the system requirements baseline.
Characteristic 5 Changes to requirements are controlled and communicated to affected groups.

Questions

yes no

3.2-1 Performed

3.2-1.1/G1 Is requirements analysis being accomplished by the program in at least an informal manner?

3.2-1.2 Is functional analysis being accomplished by the program in at least an informal manner?

3.2-1.3/G2 Are requirements and functional analysis activities planned for the program?

3.2-1.4 Are stakeholder constraints included in the user requirements baseline? T3.2-A-L1

3.2-1.5 Are subsystem requirements documented? T3.2-B-L1

3.2-1.6 Is the functional architecture developed, related to, and consistent with, the system and subsystem requirements? T3.2-C-L1

3.2-1.7 Is sequencing of time-critical functions considered? T3.2-D-L1

3.2-1.8 Are trade studies conducted during requirements and functional analysis to select between competing alternatives? T3.2-E-L1

3.2-1.9 Are stakeholders provided an opportunity to review requirements? T3.2-F-L1

3.2-1.10 Are requirements reviewed for errors? T3.2-G-L1

3.2-1.11 Are requirements reviewed for completeness? T3.2-H-L1

3.2-1.12 Is the complexity of requirements considered? T3.2-I-L1

3.2-1.13 Are changes to requirements documented? T3.2-J-L1

3.2-1.14/G3 Are the products / results of the requirements and functional analysis activities at least of marginal value to the program?

3.2-1.15/G4 Are requirements and functional analysis activities at least of marginal effectiveness?

3.2-2 Managed

3.2-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) that requires the management and flow down of system requirements and the development of a corresponding functional architecture?


3.2-2.2/G6 Is there an approved program plan (may be part of a larger technical management plan) describing how requirements and functional analysis will be defined and managed?

3.2-2.3/G7 Is responsibility designated for the management of requirements and functional analysis on the program?

3.2-2.4 Do stakeholders participate in the requirements approval process? T3.2-F-L2

3.2-2.5 Does the program perform a planned sequence of tasks to transform customer/user expectations and constraints into a system requirements baseline (including subsystem requirements)? T3.2-A-L2

3.2-2.6 Are system-level requirements allocated and flowed-down to subsystems? T3.2-C-L2a

3.2-2.7 Are trade studies performed to select between alternative lower-level requirements? T3.2-E-L2a

3.2-2.8 Are requirements inspected for errors? T3.2-G-L2

3.2-2.9 Are requirements inspected for completeness? T3.2-H-L2

3.2-2.10 Are requirements inspected to establish complexity? T3.2-I-L2

3.2-2.11 Are changes to requirements managed? T3.2-J-L2

3.2-2.12 Is the impact of system requirement changes evaluated? T3.2-L-L2

3.2-2.13 Is the system-level functional architecture reduced to lower-level functions? T3.2-K-L2

3.2-2.14 Is a time line analysis generated for sequencing of time-critical functions? T3.2-D-L2

3.2-2.15 Are trade studies performed to select between alternative lower-level functional architectures? T3.2-E-L2b

3.2-2.16 Is system design considered as a potential constraint in determining feasible solutions for functional architecture trade studies? T3.2-E-L2c

3.2-2.17 Do the trade studies conducted consider cost, schedule, and performance? T3.2-E-L2d

3.2-2.18 Is requirements analysis conducted iteratively in conjunction with functional analysis to develop lower level subsystem requirements? T3.2-B-L2

3.2-2.19 Are subsystem requirements traceable to the customer/user requirements baseline? T3.2-C-L2b


3.2-2.20 Are discrepancies between lower-level requirements and the customer/user baseline identified and resolved? T3.3-M-L2

3.2-2.21/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for managing requirements and functional analysis assessed?

3.2-2.22 Does the requirements capture and functional architecture team consist of personnel with appropriate skills?

3.2-2.23/G9 When skills are inadequate, is training provided in requirements and functional analysis?

3.2-2.24 Are tools and models used where appropriate to conduct requirements and functional analyses?

3.2-2.25/G10 Are subsystem requirements being developed in a structured manner on the program?

3.2-2.26 Is the functional architecture being developed in a structured manner?

3.2-2.27/G11 Is data collected for monitoring requirements and functional analysis activities?

3.2-2.28/G12 Are corrective actions initiated when requirements and functional analysis activities deviate significantly from the plan?

3.2-2.29/G13 Are the products / results of the requirements and functional analysis activities at least of adequate value to the program?

3.2-2.30/G14 Are requirements and functional analysis activities at least of adequate effectiveness?

3.2-3 Defined

3.2-3.1/G15 Are requirements and functional analysis activities planned, approved, and established according to a formal procedure?

3.2-3.2 Are requirements and functional analysis managed throughout the program's life cycle?

3.2-3.3 Are standards applied to the generation and documentation of subsystem requirements and the corresponding subsystem functional architecture? T3.2-B-L3

3.2-3.4 Are requirements reviewed for completeness, feasibility (risk), verifiability, clarity, and consistency before they are incorporated into and committed by the program? T3.2-J-L3a

3.2-3.5 Is stakeholder participation in the requirements approval process required? T3.2-F-L3

3.2-3.6 Is active customer/user participation sought to identify and resolve discrepancies between lower-level requirements and the customer/user baseline? T3.3-M-L3a

3.2-3.7 Are the allocated requirements used as the basis for plans, work products, and activities? T3.2-D-L3a

3.2-3.8 Are models, simulations, or prototyping mechanisms used during the analytical process? (From 3.4-L4)


3.2-3.9 Are models, simulations, or prototyping mechanisms used during the analytical process validated?

3.2-3.10 Are allocated requirements and lower-level functional architectures described and disseminated in documentation, models, etc.?

3.2-3.11 Are verification criteria documented in conjunction with the allocated requirements and corresponding functional architecture?

3.2-3.12 Are changes to the allocated requirements reviewed for impact on feasibility, cost, schedule, risk, etc. before being incorporated into the program? T3.2-J-L3b

3.2-3.13 Are subsystem requirements traceable to the definition of the system functional architecture? T3.2-C-L3a

3.2-3.14 Are allocated requirements used to guide the definition of the functional architecture of each subsystem component?

3.2-3.15 Is a time line analysis generated for sequencing of time-critical functions for critical use modes of the system and subsystems? T3.2-D-L3b

3.2-3.16 Are derived requirements traceable to analysis and design rationale? T3.2-C-L3b

3.2-3.17 Is system design margin managed and controlled across the program to mitigate requirement shortfalls within individual configuration items?

3.2-3.18 Are relationships between requirements identified and tracked in order to manage the impact of requirement changes during the development process? T3.2-J-L3c

3.2-3.19 Are non-technical requirements (e.g. certification, safety, producibility, etc.) managed and weighted with respect to operational and functional requirements?

3.2-3.20 Are existing designs and products assessed against requirements, functional architecture and verification plans?

3.2-3.21 Is a formal discussion captured reflecting constraints and criteria used to evaluate alternatives? T3.2-E-L3a (From 3.4-L3)

3.2-3.22 Are the applicable interfaces addressed when selecting viable alternatives? T3.2-E-L3b (From 3.4-L3)

3.2-3.23 Are deliberations on each alternative conducted by applying a standard set of "rules of evidence"? T3.2-E-L3c (From 3.4-L3)

3.2-3.24 Are all lower level requirements traceable to higher level requirements? T3.2-C-L3c

3.2-3.25 Are all lower level functions traceable to higher level functions? T3.2-C-L3d

3.2-3.26 Is there a mechanism for periodic review of the requirements and their relationship to the functional architecture?

3.2-3.27 Are standards for requirements traceability applied to the program? T3.2-C-L3e

3.2-3.28 Are standards for reducing requirements and functions applied to the program? T3.2-K-L3

3.2-3.29 Is there a formal process to establish documented criteria for correctness and completeness of requirements? T3.2-H-L3a

3.2-3.30 Are formal reviews used to inspect requirements for errors? T3.2-G-L3

3.2-3.31 Are formal reviews used to inspect requirements for completeness? T3.2-H-L3b

3.2-3.32 Are formal reviews used to establish requirement complexity? T3.2-I-L3

3.2-3.33 Are the action items resulting from formal reviews of the requirements controlled and tracked to closure? T3.2-M-L3b


3.2-3.34 Is a mechanism used for controlling changes to the system requirements? T3.2-J-L3f

3.2-3.35 Are system and system component requirements formally reviewed?

3.2-3.36 Is a mechanism used for controlling changes to the system, subsystem and configuration item specifications? T3.2-J-L3h

3.2-3.37 Is the impact of system requirement changes evaluated regressively (i.e. for secondary effects within the system)? T3.3-L-L3

3.2-3.38 Does program management review each specification prior to making contractual estimates, schedules and personnel assignments?

3.2-3.39/G16 Are metrics collected for assessing the effectiveness of requirements and functional analysis activities?

3.2-3.40/G17 Are peer/defect reviews conducted to assess and improve requirements and functional analysis activities and products?

3.2-3.41/G18 Is the process for requirements and functional analysis standardized across the organization?

3.2-3.42/G19 Are guidelines provided to allow the program to tailor the standard requirements and functional analysis process for its specific needs?

3.2-3.43/G20 Are the products / results of the requirements and functional analysis activities at least of significant value to the program?

3.2-3.44/G21 Are requirements and functional analysis activities at least of significant effectiveness?

3.2-4 Measured

3.2-4.1 Is estimated system complexity used to establish the anticipated number of requirements necessary to define the system? T3.2-I-L4

3.2-4.2/G22 Are metrics used to determine the status and effectiveness of requirements and functional analysis activities?

3.2-4.3 Are errors found in the requirements analyzed to determine causality with respect to the program's processes? T3.2-G-L4

3.2-4.4 Is the stability/volatility of requirements measured and analyzed to establish and track the maturation of the requirements and functional analysis? T3.2-L-L4

3.2-4.5/G23 Are analyses performed on the metrics associated with requirements and functional analysis to identify corrective actions for the program?

3.2-4.6/G24 Are the identified corrective actions implemented as necessary for the program?

3.2-4.7 Are metrics collected for evaluating the requirements and functional analysis process?

3.2-4.8/G25 Are the products / results of the requirements and functional analysis activities at least of measurably significant value to the program?

3.2-4.9/G26 Is the effectiveness of requirements and functional analysis activities at least measurably significant?

3.2-5 Optimizing

3.2-5.1/G27 Is the effectiveness of the requirements and functional analysis process and its implementation activities reviewed on both an event-driven and periodic basis?


3.2-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the requirements and functional analysis process and its implementation?

3.2-5.3/G29 Are quality management reviews and/or audits conducted of the requirements and functional analysis activities and program data and the results reported?

3.2-5.4 Are errors found in the requirements analyzed to determine causality with respect to the standard process? T3.2-G-L5

3.2-5.5 Is the requirements and functional analysis process revised as necessary to optimize the systems engineering process?

3.2-5.6 Does the optimization of the requirements and functional analysis process include, or is it based upon, an analysis of the metrics collected regarding the system requirements process?

3.2-5.7 As programs involving the development of systems of differing complexity are undertaken, are guidelines for tailoring the standard systems engineering process developed and improved? T3.2-I-L5

3.2-5.8/G30 Are the metrics collected on the effectiveness of requirements and functional analysis activities used to monitor and improve the systems engineering process?

3.2-5.9/G31 Are the products / results of the requirements and functional analysis activities of optimal value to the program?

3.2-5.10/G32 Are requirements and functional analysis activities of optimal effectiveness?

5.3.3 KFA 3.3 SYSTEM DESIGN

System design is the process of transforming system requirements into a functional baseline and synthesizing a system-level design solution (initial product baseline) for the original user-defined problem. The system design process uses the products of the functional analysis and synthesis tasks of system concept definition, but during this iteration the focus is on completeness. The functional baseline completely defines the system functions to the point that they are understood and can be allocated to subsystems. The initial product baseline completely accounts for all system requirements. All system requirements and all functions are allocated and traceable to one or more subsystems. Complete requirements for all subsystems are defined. System design activities performed within the systems engineering process include activities associated with the initial and detailed design stages of system life cycle development.

Initial design includes performing system analysis (using, for example, a functional or object-oriented methodology), requirements allocation, trade studies, optimization, system synthesis, and configuration definition in the form of preliminary design specifications.

Detailed design activities begin with the concept and configuration derived during preliminary system design. Once the overall system design configuration has been established, further definition activities define the system and its associated specifications in detail.
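The allocation and traceability obligations described above lend themselves to simple mechanical checks. The following Python sketch is illustrative only; the SECAM prescribes no data model, and every name in it (Requirement, unallocated, untraceable) is invented for this example. It flags system requirements not yet allocated to any subsystem, and derived requirements that cannot be traced back to a higher-level requirement.

    # Illustrative only: minimal requirements-allocation bookkeeping.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Requirement:
        req_id: str
        text: str
        parent_id: Optional[str] = None                     # higher-level requirement it derives from
        allocated_to: list = field(default_factory=list)    # names of subsystems

    def unallocated(reqs):
        """System requirements not yet allocated to any subsystem."""
        return [r.req_id for r in reqs if not r.allocated_to]

    def untraceable(reqs):
        """Derived requirements whose parent is missing from the baseline."""
        known = {r.req_id for r in reqs}
        return [r.req_id for r in reqs
                if r.parent_id is not None and r.parent_id not in known]

Questions 3.3-3.9 and 3.3-3.10 below ask, in effect, whether the program runs checks equivalent to these and justifies or resolves whatever they flag.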

General Characteristics

Characteristic 1 System design activities are planned.
Characteristic 2 A system design baseline is established that satisfies the system requirements.


Characteristic 3 The current system requirements are traceable to user requirements.
Characteristic 4 System design rationale and other information is captured and made available within the organization.
Characteristic 5 The system design is controlled and communicated to affected groups.

Questions

yes no

3.3-1 Performed

3.3-1.1/G1 Is system design being performed in at least an informal manner?

3.3-1.2/G2 Are system design activities planned for the program?

3.3-1.3 Are system requirements assigned to hardware, software, and other system components? From 3.2-L2 T3.3-A-L1

3.3-1.4/G3 Are the products / results of the system design activities at least of marginal value to the program?

3.3-1.5/G4 Are system design activities at least of marginal effectiveness?

3.3-2 Managed

3.3-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing the design process for system and subsystem development?

3.3-2.2/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform system design?

3.3-2.3/G7 Is there a designated systems engineering manager or team leader responsible for the management of system design on the program?

3.3-2.4 Are systems engineering personnel assigned to perform system design, prepare documentation, and initiate design?

3.3-2.5 Is there a mechanism to assure compliance with the architecture guidelines and rules during design and implementation?

3.3-2.6 Is a planned sequence of tasks performed to transform system requirements into a functional baseline?

3.3-2.7 Are system requirements and functions allocated to hardware, software, and other system components? From 3.2-L2 T3.3-A-L2

3.3-2.8 Are key system components traceable to the requirements baseline?

3.3-2.9 Is a planned sequence of tasks performed to synthesize a physical product?

3.3-2.10 Are trade studies performed to select between candidate design alternatives when required? T3.3-B-L2

3.3-2.11/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided to accomplish system design activities assessed?

3.3-2.12 Does the system design team consist of personnel with appropriate engineering skills?

3.3-2.13/G9 When skills are inadequate, is training provided to systems engineering personnel involved with system design?

3.3-2.14 Does the system design team identify and resolve design issues? T3.3-C-L2


3.3-2.15 Does the system design team design and control all system interfaces? If not, does the system design team coordinate system design with other system integration teams that are responsible for interfaces not under control of the system design team?

3.3-2.16 Is there a designated individual or team responsible for establishing, maintaining, and monitoring the system architecture?

3.3-2.17 Is there a designated individual or team responsible for enforcing the baseline system architecture during design and development?

3.3-2.18 Are appropriate tools available to conduct system design activities?

3.3-2.19 Are design issues addressed in formal and informal reviews? T3.3-D-L2

3.3-2.20/G10 Is system design accomplished on the program in a structured manner?

3.3-2.21/G11 Is data collected for monitoring system design activities?

3.3-2.22/G12 Are corrective actions initiated when system design activities deviate significantly from the plan?

3.3-2.23/G13 Are the products / results of the system design activities at least of adequate value to the program?

3.3-2.24/G14 Are system design activities at least of adequate effectiveness?

3.3-3 Defined

3.3-3.1/G15 Are system design activities planned, approved, and established according to a formal procedure?

3.3-3.2 Are exit criteria for system design established at the beginning of the program?

3.3-3.3 Is there a mechanism for modifying the system design plan if conditions warrant?

3.3-3.4 Are system component design inspections conducted?

3.3-3.5 Are system component implementation inspections conducted?

3.3-3.6 Are internal design review standards applied? T3.3-D-L3a

3.3-3.7 Are the design process errors analyzed during product design and test to determine the distribution and characteristics of the errors found?

3.3-3.8 Are system design reviews conducted internally? T3.3-D-L3b

3.3-3.9 Are all elements (i.e. configuration items) of the system design traceable to the requirements baseline? T3.3-A-L3a

3.3-3.10 Are non-traceable elements of the system design justified or resolved? T3.3-A-L3b

3.3-3.11 Is there a mechanism to design system components for reuse by other programs?

3.3-3.12 Is a process used for determining if the prototyping of system functions is an appropriate part of the design process?

3.3-3.13 Is there a mechanism to identify opportunities for trade studies throughout system development? T3.3-B-L3

3.3-3.14 Is the analysis and resolution of system design issues reviewed for adequacy and completeness? T3.3-C-L3

3.3-3.15/G16 Are metrics for assessing the effectiveness of system design activities developed according to a formal procedure?

3.3-3.16/G17 Are peer/defect reviews conducted to assess and improve system design activities and products?

3.3-3.17/G18 Are system design processes standardized across the organization?


3.3-3.18/G19 Are guidelines provided to allow the program to tailor the standard system design process for its specific needs?

3.3-3.19/G20 Are the products / results of the system design activities at least of significant value to the program?

3.3-3.20/G21 Are system design activities at least of significant effectiveness?

3.3-4 Measured

3.3-4.1/G22 Are metrics used to determine the status and effectiveness of system design activities?

3.3-4.2/G23 Are analyses performed on the metrics associated with system design to identify corrective actions for the program?

3.3-4.3/G24 Are the identified corrective actions implemented as necessary for the program?

3.3-4.4/G25 Are the products / results of the system design activities at least of measurably significant value to the program?

3.3-4.5/G26 Is the effectiveness of system design activities at least measurably significant?

3.3-5 Optimizing

3.3-5.1/G27 Is the effectiveness of the system design process and its implementation activities reviewed on both an event-driven and periodic basis?

3.3-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the system design process and its implementation?

3.3-5.3/G29 Are quality management reviews and/or audits conducted of the system design activities and program data and the results used to improve the process?

3.3-5.4/G30 Are the metrics collected on the effectiveness of system design activities used to monitor and improve the systems engineering process?

3.3-5.5/G31 Are the products / results of the system design activities of optimal value to the program?

3.3-5.6/G32 Are system design activities of optimal effectiveness?

5.3.4 KFA 3.4 INTEGRATED ENGINEERING ANALYSIS

The purpose of integrated engineering analysis is to (1) identify issues which require the application of decision theory techniques in order to accomplish timely technical decision making, (2) select a decision making technique appropriate to each technical issue, and (3) involve the right mix of technical disciplines in the decision making process during the development of a system.

Technical issues requiring a decision making process may surface during any phase of a program. The objective of a program should be to surface as many impending technical issues as possible early in the development life cycle so as to maximize the time available to deal with each issue. Many candidate technical issues for Integrated Engineering Analysis are discovered using Risk Management activities. Additionally, technical alternatives from which there is no single preferred choice are also candidates for Integrated Engineering Analysis.


Many methods exist to perform comparative studies of technical alternatives. These methods include classical trade-off analysis, the analytic hierarchy process, and quality function deployment, as well as many others. The selection of the appropriate decision making method should match the type and scope of the technical issue being analyzed. The application of a specific decision making method is sometimes referred to as a trade study.

Inputs to Integrated Engineering Analysis include the defined alternatives that are an interim output of the Concept Definition, Requirements and Functional Analysis, or System Design (Synthesis) KFAs. The trade study activity must be planned to clearly identify the objective and requirements of the analysis, the alternatives to be traded, the selected decision making method, and the preferred selection criteria. The development activity determines the measures to be considered for optimization and the relative weighting of those measures. Performance of the trade study provides a recommendation as to which alternative should be selected. The alternative selected will provide the best balance of technical, cost, schedule, and risk factors, together with any other factors considered important by the developer. This recommended solution is then provided back to the activity which surfaced the alternatives.
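As a concrete illustration of the weighting step just described, the classical weighted-sum form of trade-off analysis can be sketched in a few lines of Python. The criteria, weights, and scores below are invented, not taken from the SECAM:

    # Hypothetical trade study: score each alternative against weighted
    # criteria and recommend the highest total. Weights sum to 1.0.
    weights = {"performance": 0.5, "cost": 0.3, "schedule": 0.2}

    alternatives = {                     # scores on a common 0-10 scale
        "design_A": {"performance": 8, "cost": 5, "schedule": 6},
        "design_B": {"performance": 6, "cost": 8, "schedule": 7},
    }

    def weighted_score(scores):
        return sum(weights[c] * scores[c] for c in weights)

    recommended = max(alternatives, key=lambda a: weighted_score(alternatives[a]))
    # design_A: 0.5*8 + 0.3*5 + 0.2*6 = 6.7; design_B: 6.8 -> recommend design_B

The sensitivity analysis called for later in this KFA (question 3.4-3.15) amounts to re-running such a calculation with perturbed weights, to confirm that the recommendation does not hinge on a single weighting choice.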

Requirement, functional architecture, and physical architecture alternatives can be narrowed to a preferred alternative through the application of analysis. Integrated Engineering Analysis also provides a means of resolving requirement (or functional or physical solution) conflicts in which no single candidate alternative satisfies all expectations and constraints, yet a preferred alternative must nevertheless be selected.

A primary role of systems engineering is to ensure that the diverse engineering disciplines needed to develop complex systems are combined in a timely and appropriate manner to satisfy program objectives. During requirements analysis, functional analysis, and synthesis activities, alternative technical solutions are often identified which may appear to satisfy the defined system expectations and constraints equally well. To select the one alternative that best satisfies system requirements, it is necessary to involve the appropriate technical disciplines in the decision making process.

Engineering disciplines include both traditional and specialty engineering, as well as test, manufacturing, and field support engineering. Traditional engineering disciplines include (but are not limited to) electrical/electronic, mechanical, software, and aerodynamic engineering. Specialty engineering disciplines include (but are not limited to) safety, reliability, maintainability, and human factors engineering.

Integrated Engineering Analysis is closely related to the degree of success of System Concept Definition, Requirements and Functional Analysis, and System Design: the extent to which engineering disciplines are integrated in a timely and appropriate fashion determines the overall effectiveness of these activities.

General Characteristics

Characteristic 1 Integrated engineering analysis activities are planned (i.e. alternatives defined, methodology established, selection criteria established) before they are accomplished.
Characteristic 2 An environment exists that promotes the integrated development of products and their related processes.
Characteristic 3 Appropriate decision making techniques are used to resolve technical issues.
Characteristic 4 Relevant engineering/technical disciplines are combined as appropriate to participate in technical decision making processes.


Questions

yes no

3.4-1 Performed

3.4-1.1/G1 Is the integration of engineering analysis being accomplished in at least an informal manner?

3.4-1.2/G2 Are integrated engineering analysis activities planned for the program?

3.4-1.3 Are both traditional and specialty engineering disciplines involved in analysis activities as needed? T3.4-A-L1

3.4-1.4 Do trade studies clearly produce a recommended alternative? T3.4-I-L1

3.4-1.5/G3 Are the products / results of the integrated engineering analysis activities at least of marginal value to the program?

3.4-1.6/G4 Are integrated engineering analysis activities at least of marginal effectiveness?

3.4-2 Managed

3.4-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) requiring integrated engineering analysis activities?

3.4-2.2/G6 Is there an approved plan (may be part of a larger technical plan) for the program that identifies integrated engineering analysis activities?

3.4-2.3/G7 Has the responsibility for coordinating integrated engineering analysis activities been assigned?

3.4-2.4 Are processes for assembling the right mix of technical disciplines established? T3.4-A-L2

3.4-2.5 Are sufficient personnel in each discipline available to vary the composition of multi-discipline teams? T3.4-B-L2

3.4-2.6/G8 Is the adequacy of resources (e.g. funding, staff, tools, etc.) provided for integrated engineering analysis activities assessed?

3.4-2.7 Do the integrated engineering analysis teams consist of personnel with the relevant engineering/technical skills?

3.4-2.8/G9 When analytical skills are inadequate, is training provided?

3.4-2.9 Are integrated engineering analysis teams assembled regularly to address design and development questions? T3.4-C-L2

3.4-2.10 Are requirements and objectives for trade studies established?

3.4-2.11 Are alternatives for consideration clearly established? T3.4-D-L2

3.4-2.12 Are objective selection criteria formulated? T3.4-E-L2

3.4-2.13 Are weightings established for selection criteria?

3.4-2.14 Are significant alternative requirements, functions, and design solutions selected by means of trade studies on the program?

3.4-2.15 In addition to technical matters, do trade studies consider cost and schedule? T3.4-F-L2

3.4-2.16 Are results of trade studies documented? T3.4-G-L2

3.4-2.17 Are tools and metrics defined to support activities requiring integrated engineering analysis? T3.4-H-L2


3.4-2.18/G10 Is integrated engineering analysis performed on the program in a structured manner?

3.4-2.19/G11 Is data collected for monitoring integrated engineering analysis activities?

3.4-2.20/G12 Are corrective actions initiated when integrated engineering analysis activities deviate significantly from the plan?

3.4-2.21/G13 Are the products / results of the integrated engineering analysis activities at least of adequate value to the program?

3.4-2.22/G14 Are integrated engineering analysis activities at least of adequate effectiveness?

3.4-3 Defined

3.4-3.1/G15 Are integrated engineering analysis activities planned, approved, and established according to a formal procedure?

3.4-3.2 Is an environment established that promotes the integrated development of products and their related processes?

3.4-3.3 Is a mechanism used for ensuring traceability between the system requirements and lower level specification requirements?

3.4-3.4 Is a mechanism used for ensuring traceability between the system requirements and the top level designs?

3.4-3.5 Is traceability maintained between the trade studies conducted and the impacted requirements, functions, and design features?

3.4-3.6 Are groups responsible for integrated engineering analysis regularly assembled to surface and resolve design and development problems? T3.4-C-L3

3.4-3.7 Are the participants in integrated engineering analysis activities varied with respect to where the program is in the development life cycle? T3.4-B-L3

3.4-3.8 Are the efforts of engineering analysis teams coordinated to identify and communicate problems to be solved?

3.4-3.9 Are the efforts of multi-discipline engineering/technical teams coordinated to identify viable alternative solutions? T3.4-A-L3

3.4-3.10 Does the formulation of selection criteria include both objective and subjective criteria? T3.4-E-L3

3.4-3.11 Is the number of alternative candidates for consideration expanded, as necessary, to reflect the widest possible range of distinctly different solutions in order to achieve the overall goal of optimized system design? T3.4-D-L3

3.4-3.12 Are alternative candidates screened for their ability to solve the stated problem in order to ensure that analysis effort is not wasted on non-productive solutions?

3.4-3.13 Is a mechanism established to identify risks for each candidate solution? T3.4-F-L3

3.4-3.14 Is more than one decision method considered for the evaluation of critical program problems?

3.4-3.15 Is a sensitivity analysis performed to provide confidence in the selection of an alternative? T3.4-I-L3a

3.4-3.16 Is the selected alternative verified to meet all engineering and technical requirements? T3.4-I-L3b

3.4-3.17 Are the results of integrated engineering analysis and associated decision rationale documented and communicated to all affected groups? T3.4-G-L3a


3.4-3.18 Does trade study documentation include objectives, candidates considered/rejected, method of evaluation, selection criteria, success criteria, sensitivity analysis, selected alternative, and degree of confidence? T3.4-G-L3b

3.4-3.19 Are exit criteria used to determine when requirements analysis, design, integration, and verification are complete for each system component?

3.4-3.20 Are a common set of metrics available for integrated engineering analysis? T3.4-H-L3

3.4-3.21/G16 Are metrics collected for assessing the effectiveness of integrated engineering analysis activities?

3.4-3.22/G17 Are peer/defect reviews conducted to assess and improve integrated engineering analysis activities and products?

3.4-3.23/G18 Are integrated engineering analysis processes standardized across the organization?

3.4-3.24/G19 Are guidelines provided to allow the program to tailor the standard integrated engineering analysis process for its specific needs?

3.4-3.25/G20 Are the products / results of the integrated engineering analysis activities at least of significant value to the program?

3.4-3.26/G21 Are integrated engineering analysis activities at least of significant effectiveness?

3.4-4 Measured

3.4-4.1/G22 Are metrics used to determine the status and effectiveness of integrated engineering analysis activities?

3.4-4.2/G23 Are analyses performed on the metrics associated with integrated engineering analysis to identify corrective actions for the program?

3.4-4.3/G24 Are the identified corrective actions implemented as necessary for the program?

3.4-4.4/G25 Are the products / results of the integrated engineering analysis activities at least of measurably significant value to the program?

3.4-4.5/G26 Is the effectiveness of integrated engineering analysis activities at least measurably significant?

3.4-5 Optimizing

3.4-5.1/G27 Is the effectiveness of the integrated engineering analysis process and its implementation activities reviewed on both an event-driven and periodic basis?

3.4-5.2 Is customer satisfaction information used to improve the integrated engineering analysis process?

3.4-5.3/G28 Upon review, are actions taken to correct identified deficiencies in the integrated engineering analysis process and its implementation?

3.4-5.4/G29 Are quality management reviews and/or audits conducted of the integrated engineering analysis activities and program data and the results used to improve the process?

3.4-5.5/G30 Are the metrics collected on the effectiveness of the integrated engineering analysis used to monitor and improve the systems engineering process?


3.4-5.6 When multiple companies are involved in integrated engineering analysis teams, do they continually assess the quality of their mutual interaction to improve the integrated engineering analysis process?

3.4-5.7 Are metrics regarding corrective actions compiled into a lessons learned archive for use on future programs? T3.4-D-L5

3.4-5.8/G31 Are the products / results of integrated engineering analysis of optimal value to the program?

3.4-5.9/G32 Are integrated engineering analysis activities of optimal effectiveness?

5.3.5 KFA 3.5 SYSTEM INTEGRATION

As systems become more complex and distributed, the system integration task becomes one of the key elements of systems engineering activity. System integration ensures that the subsystems come together and perform as a complete system that satisfies the system level requirements within the defined operating environment. Coupling between subsystems is indicative of their interdependence; connectivity between subsystems interfaces some or all of the internal elements of one subsystem to selected internal elements of another subsystem. Loose coupling is important for modularity, so that a subsystem can be redesigned and/or replaced, if needed, without affecting the other subsystems.

System integration is managed by interface control documents, which ensure compatibility of all interfaces. The content and structure of interface control documents vary greatly depending upon the type of interface, e.g. mechanical interface, electrical interface, or radio frequency interface.

System integration is more than a one-time assembly of the system elements at the conclusion of system design and fabrication. System integration should be approached in an incremental manner. Incremental system integration is the iterative process of "build-test-build" that occurs throughout the development process, beginning with simulations and steadily progressing through increasingly realistic builds of prototypes until the final system is achieved. In each successive build, prototypes are constructed, tested, improved, and reconstructed based upon knowledge gained in the testing process. An end product achieved in this manner has a high likelihood of passing system verification and validation.
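As a toy illustration of the interface-control bookkeeping described above (not drawn from the SECAM; the record fields and values are invented), an incremental integration step can verify that both ends of an interface agree on the signal, its units, and the governing ICD revision before the subsystems are mated:

    # Illustrative interface-control record; all names are invented.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class InterfaceEnd:
        subsystem: str
        signal: str
        units: str
        icd_revision: str

    def compatible(a, b):
        """Both ends must agree on signal name, units, and ICD revision."""
        return (a.signal == b.signal and a.units == b.units
                and a.icd_revision == b.icd_revision)

    tx = InterfaceEnd("nav",      "position", "metres", "ICD-12 rev C")
    rx = InterfaceEnd("guidance", "position", "feet",   "ICD-12 rev C")
    assert not compatible(tx, rx)   # unit mismatch, which ICD control exists to catch

A build-test-build increment would run such checks, together with the corresponding tests, each time a new pair of subsystems is brought together.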

General Characteristics

Characteristic 1 System integration is planned.
Characteristic 2 Interface design enables subsystems to satisfy assigned system requirements.
Characteristic 3 Interface design and design rationale/information is captured and made available within the program.
Characteristic 4 Interface design is controlled, communicated to affected groups, and used to support program decisions.


Questions

yes no

3.5-1 Performed

3.5-1.1/G1 Is system integration being accomplished in at least an informal manner?

3.5-1.2 Is incremental system integration being accomplished in at least an informal manner? T3.5-A-L1

3.5-1.3/G2 Are system integration activities planned for the program?

3.5-1.4/G3 Are the products / results of the system integration activities at least of marginal value to the program?

3.5-1.5/G4 Are system integration activities at least of marginal effectiveness?

3.5-2 Managed

3.5-2.1/G5 Does the program follow a written organizational policy (may be part of a broad-based policy) for implementing system integration?

3.5-2.2 Does the policy require planning of system integration and interface control? T3.5-A-L2a

3.5-2.3/G6 Is there an approved plan (may be part of a larger technical management plan) for the program to perform system integration?

3.5-2.4 Does the approved system integration plan identify interfaces between affected subsystems and require management of their design, documentation, and change control? T3.5-B-L2

3.5-2.5/G7 Is there a designated systems engineering manager or team leader responsible for interface management on the program?

3.5-2.6 Are systems engineering personnel assigned to perform interface design, prepare documentation, and initiate design modifications?

3.5-2.7 Are systems integration personnel who have experience in incremental integration and rapid prototyping assigned to the program?

3.5-2.8/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided to conduct system integration activities assessed (independently from the adequacy of resources provided for system design activities)?

3.5-2.9 Are appropriate tools available to conduct interface definition and design?

3.5-2.10 Are interface control documents placed under inter-organizational configuration control? T3.5-D-L2

3.5-2.11 Are interface issues addressed in design reviews? T3.5-C-L2

3.5-2.12 Are personnel skills adequate to perform system integration?

3.5-2.13/G9 When skills are inadequate, is training provided to systems engineering personnel involved in system integration?

3.5-2.14 When system integration involves multiple internal and external organizations, do the assigned systems engineers have the necessary people skills to be effective in working together in a diverse group to achieve an interface design that is optimal for the program?

3.5-2.15 When multiple companies are involved, are clear roles and responsibilities established for interface management? T3.5-D-L2

3.5-2.16/G10 Is system integration accomplished on the program in a structured manner?


3.5-2.17 Is incremental system integration accomplished on the program in a structured manner? T3.5-A-L2b

3.5-2.18/G11 Is data collected for monitoring system integration activities?

3.5-2.19/G12 Are corrective actions initiated when system integration activities deviate significantly from the plan?

3.5-2.20/G13 Are the products / results of the system integration activities at least of adequate value to the program?

3.5-2.21/G14 Are system integration activities at least of adequate effectiveness?

3.5-3 Defined

3.5-3.1/G15 Is the system integration plan developed and approved according to a formal procedure?

3.5-3.2 Does the system integration plan employ an incremental integration approach? T3.5-A-L3

3.5-3.3 Are all interfaces identified and managed? T3.5-B-L3a

3.5-3.4 Are interface designs documented in a common interface control document format or database?

3.5-3.5 Is interface design rationale captured in a formal report or database?

3.5-3.6 Is inter-organizational configuration management of the interface control document(s) accomplished? T3.5-D-L3

3.5-3.7 Are system integration issues included as an integral part of all formal, system level design reviews? T3.5-C-L3a

3.5-3.8 Are system integration issues assessed for their impact on the program? T3.5-C-L3b

3.5-3.9 Are system trouble reports resulting from integration activities tracked to closure? T3.5-C-L3c

3.5-3.10 Is an interface control working group established that includes personnel from affected subsystems? T3.5-B-L3b

3.5-3.11 Are decisions of the interface control working group documented and communicated to all interested parties?

3.5-3.12/G16 Are metrics collected for assessing the effectiveness of system integration activities?

3.5-3.13 When multiple companies are involved in system integration, do they follow a formal procedure for conducting their integration activities? T3.5-D-L3

3.5-3.14/G17 Are peer/defect reviews conducted to assess and improve system integration activities and products?

3.5-3.15/G18 Are system integration processes standardized across the organization?

3.5-3.16/G19 Are guidelines provided to allow the program to tailor the standard system integration process for its specific needs?

3.5-3.17/G20 Are the products / results of the system integration activities at least of significant value to the program?

3.5-3.18/G21 Are system integration activities at least of significant effectiveness?

3.5-4 Measured

3.5-4.1/G22 Are metrics used to determine the status and effectiveness of system integration activities?


3.5-4.2/G23 Are analyses performed on the metrics associated with system integration to identify corrective actions for the program?

3.5-4.3 Are interface control documents reviewed periodically to assess their adequacy as the system design matures? T3.5-B-L4

3.5-4.4 When multiple companies are involved in system integration, do they continually assess the quality of their mutual interaction to improve the program level system integration effort? T3.5-D-L4

3.5-4.5/G24 Are the identified corrective actions implemented as necessary for the program?

3.5-4.6/G25 Are the products / results of the system integration activities at least of measurably significant value to the program?

3.5-4.7/G26 Is the effectiveness of system integration activities at least measurably significant?

3.5-5 Optimizing

3.5-5.1/G27 Is the effectiveness of the system integration process and its implementation activities reviewed on both an event-driven and periodic basis?

3.5-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the system integration process and its implementation?

3.5-5.3 Is customer satisfaction information used to improve the system integration process?

3.5-5.4 Are problems discovered during system integration used to evaluate the system integration process for needed improvement? T3.5-C-L5

3.5-5.5/G29 Are quality management reviews and/or audits conducted of the system integration activities and the results used to improve the process?

3.5-5.6/G30 Are the metrics collected on the effectiveness of the system integration activities on the program used to monitor and improve the systems engineering process?

3.5-5.7 When multiple companies are involved in system integration, do they continually assess the quality of their mutual interaction to improve the system integration process? T3.5-D-L5

3.5-5.8/G31 Are the products / results of the system integration activities of optimal value to the program?

3.5-5.9/G32 Are system integration activities of optimal effectiveness?

5.3.6 KFA 3.6 SYSTEM VERIFICATION

System verification is a stepwise approach to ensure that each element of the system satisfies its requirements. When accomplished at each level of the system hierarchy, assurance is provided that the complete, integrated system satisfies the system level requirements. Just as requirements are successively developed for each configuration item within the system (i.e. system, subsystem, unit, component, etc.), the system verification process is successively applied to determine, for a given level of development activity, that the implementation satisfies the requirements specified at that level and that interfaces are as defined in the interface control documents. Verification is accomplished via any combination of the following methods: test, analysis, simulation, demonstration, and/or inspection.


System verification should not be viewed as a single event consisting of a single test or series of tests conducted at the conclusion of development. Instead, the verification process should be started in the early stages of the life cycle. As requirements are developed, a method of verifying each requirement is also specified. Developmental verification activities during design evolution enable tracking and control of critical technical performance parameters that are key to satisfying specification requirements. Developmental verification activities detect design deficiencies and interface issues, permitting timely resolution before such issues can impact schedule and cost.

System verification usually has two components: functional system verification and physical system verification. Functional system verification usually comprises system performance testing, which verifies actual performance with respect to specified performance, and qualification testing, which verifies system performance within its specified operational environment (e.g., temperature, vibration, electromagnetic interference). Physical system verification determines that the as-built configuration is physically in compliance with the specification and referenced drawings.
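One common way to administer the per-requirement verification methods described above is a verification cross-reference matrix. The sketch below is illustrative only; the SECAM does not mandate this structure, and the identifiers are invented:

    # Hypothetical verification cross-reference matrix: requirement -> (level, methods).
    METHODS = {"test", "analysis", "simulation", "demonstration", "inspection"}

    vcrm = {
        "SYS-001": ("system",    {"test"}),
        "SUB-014": ("subsystem", {"analysis", "test"}),
        "UNT-203": ("unit",      set()),      # no method assigned yet
    }

    def unverified(matrix):
        """Requirements that still lack a planned verification method."""
        return [rid for rid, (_level, methods) in matrix.items() if not methods]

    assert all(methods <= METHODS for _level, methods in vcrm.values())
    print(unverified(vcrm))                   # -> ['UNT-203']

Coverage analysis of the kind asked about in question 3.6-4.5 is essentially a report over such a matrix: which requirements, at which levels, still lack a planned method or a completed result.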


General Characteristics

Characteristic 1 System verification is planned.
Characteristic 2 Actual performance of the system satisfies specified performance requirements.
Characteristic 3 The system is able to perform in the specified environment.
Characteristic 4 The as-built configuration of the system is as specified.
Characteristic 5 Verification activities within the program are performed in a consistent and coordinated manner.

Questions

yes no

3.6-1 Performed

3.6-1.1/G1 Is system verification accomplished in at least an informal manner?

3.6-1.2 Is component and subsystem verification conducted in some fashion on the program to reduce the risk of failing system verification? T3.6-A-L1

3.6-1.3/G2 Are system verification activities planned for the program?

3.6-1.4/G3 Are the products / results of the system verification activities at least of marginal value to the program?

3.6-1.5/G4 Are system verification activities at least of marginal effectiveness?

3.6-2 Managed

3.6-2.1/G5 Is there a written organizational policy (may be part of a broad-based policy) requiring management of system verification activities?

3.6-2.2 Does the organizational policy require activities in the early stages of the life cycle to mitigate the potential risk of not successfully passing system verification?

3.6-2.3 Does the policy require the development of a system verification plan? T3.6-B-L2

3.6-2.4/G6 Is there an approved system verification plan (may be part of a larger technical management plan) for the program?

3.6-2.5/G7 Is there a designated systems engineering manager or team leader responsible for coordinating verification activities on the program?

3.6-2.6 Are systems engineering personnel assigned to perform verification activities per the verification plan?

3.6-2.7 Is funding provided to conduct system verification activities independent of the design and integration activities?

3.6-2.8/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing system verification activities assessed?

3.6-2.9 Is the composition or focus of the verification team sufficiently different from the composition of the design and analysis team?

3.6-2.10 Are new and/or unproven designs (i.e. highest risk) tested at the lowest assembly level to verify their compliance with established requirements early in the development life cycle? T3.6-A-L2


3.6-2.11 Are verification results documented? T3.6-C-L2a

3.6-2.12 Are appropriate tools available to support system verification activities?

3.6-2.13 Are verification issues (e.g., design deficiencies, interface problems) addressed in formal and/or informal design reviews? T3.6-D-L2

3.6-2.14 Are technical skills adequate to perform the required system verification activities?

3.6-2.15/G9 When skills are inadequate, is training provided to systems engineering personnel involved in system verification?

3.6-2.16 Are developmental verification activities conducted in a structured manner to reduce the risk of not passing system verification?

3.6-2.17 Are the results of developmental verification used in a structured manner to support tracking and oversight of technical performance parameters? T3.6-C-L2b

3.6-2.18/G10 Is system verification accomplished on the program in a structured manner?

3.6-2.19/G11 Is data collected for monitoring system verification activities?

3.6-2.20/G12 Are corrective actions initiated when system verification activities deviate significantly from the plan?

3.6-2.21/G13 Are the products / results of the system verification activities at least of adequate value to the program?

3.6-2.22/G14 Are system verification activities at least of adequate effectiveness?

3.6-3 Defined

3.6-3.1/G15 Are system verification activities planned, approved and established according to a formal procedure?

3.6-3.2 Is the system verification plan under configuration control? T3.6-B-L3

3.6-3.3 Are verification activities included in the early design stages of the standard systems engineering process to reduce risk of failing system verification? T3.6-A-L3

3.6-3.4 Are standards applied to the preparation of verification documentation? T3.6-C-L3a

3.6-3.5 Are developmental verification results used to support tracking and oversight of technical performance parameters? T3.6-C-L3b

3.6-3.6 As requirement changes occur during system development, is there a mechanism to assure that regression analysis and re-verification are performed? T3.6-D-L3a

3.6-3.7 Are system verification issues (e.g., design deficiencies, interface problems) included as an integral part of all formal, system level design reviews? T3.6-D-L3b

3.6-3.8 Is there a mechanism to ensure that system verification issues (e.g., design deficiencies, interface problems) are assessed for their impact on the program? T3.6-D-L3c

3.6-3.9 Are system verification issues (e.g., design deficiencies, interface problems) documented and communicated to all interested parties?

3.6-3.10 Are system verification issues factored into risk analysis?

3.6-3.11/G16 Are metrics collected for assessing the effectiveness of system verification activities?

3.6-3.12 Are metrics collected for assessing the effectiveness of developmental verification activities?

3.6-3.13/G17 Are peer/defect reviews conducted to assess and improve system verification activities and products?

3.6-3.14/G18 Are system verification processes standardized across the organization?

3.6-3.15/G19 Are guidelines provided to allow the program to tailor the standard system verification process for its specific needs?


3.6-3.16/G20 Are the products / results of the system verification activities at least of significant value to the program?

3.6-3.17/G21 Are system verification activities at least of significant effectiveness?

3.6-4 Measured

3.6-4.1/G22 Are metrics used to determine the status and effectiveness of system verification activities?

3.6-4.2/G23 Are analyses performed on the metrics associated with system verification to identify corrective actions for the program?

3.6-4.3/G24 Are the identified corrective actions implemented as necessary for the program?

3.6-4.4 Are the results of developmental verification reviewed periodically to assess the adequacy of the system design as it matures, with corrective action taken as necessary? T3.6-C-L4

3.6-4.5 Is coverage analysis performed for each phase of verification? T3.6-B-L4

3.6-4.6 Is the effectiveness of regression analysis and re-verification assessed? T3.6-D-L4

3.6-4.7/G25 Are the products / results of the system verification activities at least of measurably significant value to the program?

3.6-4.8/G26 Is the effectiveness of system verification activities at least measurably significant?

3.6-5 Optimizing

3.6-5.1/G27 Is the effectiveness of the system verification process and its implementation activities reviewed on both an event-driven and periodic basis?

3.6-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the system verification process and its implementation?

3.6-5.3/G29 Are quality management reviews and/or audits conducted of the system verification activities and program data and the results used to improve the process?

3.6-5.4/G30 Are the metrics collected on the effectiveness of the developmental testing and system verification activities on the program used to monitor and improve the systems engineering process?

3.6-5.5 Is customer satisfaction information regarding the conduct of system verification used to improve the system verification process?

3.6-5.6 Are metrics regarding corrective actions compiled into a lessons learned archive for use on future programs?

3.6-5.7/G31 Are the products / results of the system verification activities of optimal value to the program?

3.6-5.8/G32 Are system verification activities of optimal effectiveness?

5.3.7 KFA 3.7 SYSTEM VALIDATION

System validation is an "end-to-end" approach to ensure that the completed, integrated system will operate as needed in the environment for which it is intended. True system validation can only be accomplished using the actual system in its intended environment. However, validation issues can be discovered early in the development life cycle through the use of early validation activities. Validation issues typically consist of unanticipated or unintended functions or behavior.


Early validation activities use approaches similar to verification (i.e. test, analysis, simulation, etc.) to determine how the system as a whole is likely to perform under anticipated operational and environmental conditions. Additional prototyping techniques (e.g., rapid or sluggish prototyping) may also be used.

Rapid prototypes are models of the system that may be developed quickly to provide information on a particular aspect of the system under development. Such models may be physical models or mockups used to determine how a man-machine interface might be optimized, or simulations which provide insight into the operation or behavioral characteristics of a system. Rapid prototypes are typically used until the necessary information is obtained and are then discarded. Sluggish prototypes are higher fidelity representations of a system or system element. Sluggish prototypes are developed more slowly and are continuously updated and maintained to accurately reflect the characteristics of the actual system or system element under development.
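Purely as an illustration (nothing below comes from the SECAM), the rapid/sluggish distinction can be reduced to two settings of fidelity and maintenance policy:

    # Toy encoding of the two prototype kinds; all names are invented.
    from dataclasses import dataclass

    @dataclass
    class Prototype:
        name: str
        fidelity: str     # "low" for rapid prototypes, "high" for sluggish
        maintained: bool  # sluggish prototypes track the evolving design

    hmi_mockup = Prototype("hmi_mockup", fidelity="low",  maintained=False)  # discarded after use
    flight_sim = Prototype("flight_sim", fidelity="high", maintained=True)   # updated continuously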

General Characteristics

Characteristic 1 System validation is planned.
Characteristic 2 Actual performance of the system satisfies the operational needs.
Characteristic 3 The system is able to perform in its intended environment.
Characteristic 4 System requirements as specified will result in a system that satisfies the intended need.
Characteristic 5 Validation activities (including early validation) within the program are performed in a consistent and coordinated manner.

Questions yes no

3.7-1 Performed

3.7-1.1/G1 Is system validation accomplished in at least an informal manner?

3.7-1.2 Is early validation conducted in some fashion on the program to reduce the risk of failing system validation? T3.7-A-L1

3.7-1.3/G2 Are system validation activities planned for the program?

3.7-1.4/G3 Are the products / results of the system validation activities at least of marginal value to the program?

3.7-1.5/G4 Are system validation activities at least of marginal effectiveness?

3.7-2 Managed

3.7-2.1/G5 Is there a written organizational policy (may be part of a broad-based policy) requiring management of system validation activities?

3.7-2.2 Does the organizational policy require activities in the early stages of the life cycle to identify aspects of the system which pose a validation risk and to initiate appropriate early validation activities to mitigate the potential risk?

3.7-2.3 Does the policy require the development of a system validation plan? T3.7-B-L1

3.7-2.4/G6 Is there an approved system validation plan (may be part of a larger technical management plan) on the program to evaluate related subsystems for their contribution to intended system operation?

3.7-2.5/G7 Is there a designated systems engineering manager or team leader responsible for coordinating validation activities on the program?


3.7-2.6 Are systems engineering personnel assigned to perform validation activities per the validation plan?

3.7-2.7 Is funding provided to conduct system validation activities independent of the design, integration, and verification activities?

3.7-2.8 Is funding provided to conduct early validation activities necessary to assure that the requirements, if implemented, will result in a system that meets the intended need?

3.7-2.9/G8 Is the adequacy of resources (e.g., funding, staff, tools, etc.) provided for performing system validation activities assessed?

3.7-2.10 Are appropriate tools available to support system validation activities (e.g. rapid prototyping, simulation, decision making, etc.)?

3.7-2.11 Is the resolution of validation issues coordinated among affected projects within the program? T3.7-C-L2

3.7-2.12 Are technical skills adequate to perform the required system validation activities?

3.7-2.13/G9 When skills are inadequate, is training provided to systems engineering personnel involved in system validation?

3.7-2.14 Are early validation activities conducted in a structured manner to reduce the risk of not specifying valid requirements? T3.7-A-L2

3.7-2.15 Are the results of early validation used in a structured manner to support tracking and oversight of technical performance parameters? T3.7-D-L2

3.7-2.16/G10 Is system validation accomplished on the program in a structured manner?
3.7-2.17/G11 Is data collected for monitoring system validation activities?
3.7-2.18/G12 Are corrective actions initiated when system validation activities deviate significantly from the plan?
3.7-2.19/G13 Are the products / results of the system validation activities at least of adequate value to the program?
3.7-2.20/G14 Are system validation activities at least of adequate effectiveness?

3.7-3 Defined
3.7-3.1/G15 Are system validation activities planned, approved and established according to a formal procedure?
3.7-3.2 Is the system validation plan under configuration control? T3.7-B-L2
3.7-3.3 Are early validation activities included as part of concept definition to reduce risk of specifying invalid requirements? T3.7-A-L3
3.7-3.4 Are early validation results used in accordance with a formal procedure to support tracking and oversight of technical performance parameters? T3.7-D-L3
3.7-3.5 Are system validation issues (e.g., unanticipated or unintended functions or behavior) included as an integral part of all formal, system level design reviews? T3.7-C-L3
3.7-3.6 Are system validation issues (e.g., unanticipated or unintended functions or behavior) assessed for their impact on the program?
3.7-3.7 Are system validation issues (e.g., unanticipated or unintended functions or behavior) documented and communicated to all interested parties?
3.7-3.8 Are system validation issues factored into risk analysis?
3.7-3.9/G16 Are metrics collected for assessing the effectiveness of system validation activities?
3.7-3.10 Are metrics collected for assessing the effectiveness of early validation activities?
3.7-3.11/G17 Are peer/defect reviews conducted to assess and improve system validation activities and products?

3.7-3.12/G18 Are system validation processes standardized across the organization?
3.7-3.13/G19 Are guidelines provided to allow the program to tailor the standard system validation process for its specific needs?
3.7-3.14/G20 Are the products / results of the system validation activities at least of significant value to the program?
3.7-3.15/G21 Are system validation activities at least of significant effectiveness?

3.7-4 Measured
3.7-4.1/G22 Are metrics used to determine the status and effectiveness of system validation activities?
3.7-4.2/G23 Are analyses performed on the metrics associated with system validation to identify corrective actions for the program?
3.7-4.3/G24 Are the identified corrective actions implemented as necessary for the program?
3.7-4.4 Are the results of early validation reviewed periodically to assess the adequacy of the system design as it matures with corrective action taken as necessary? T3.7-D-L4
3.7-4.5/G25 Are the products / results of the system validation activities at least of measurably significant value to the program?
3.7-4.6/G26 Is the effectiveness of system validation activities at least measurably significant?

3.7-5 Optimizing
3.7-5.1/G27 Is the effectiveness of the system validation process and its implementation activities reviewed on both an event-driven and periodic basis?
3.7-5.2/G28 Upon review, are actions taken to correct identified deficiencies in the system validation process and its implementation?
3.7-5.3/G29 Are quality management reviews and/or audits conducted of the system validation activities and program data and the results used to improve the process?
3.7-5.4/G30 Are the metrics collected on the effectiveness of the developmental testing and system validation activities on the program used to monitor and improve the system engineering process?
3.7-5.5 Is customer satisfaction information regarding the conduct of system validation used to improve the system validation process?
3.7-5.6/G31 Are the products / results of the system validation activities of optimal value to the program?
3.7-5.7/G32 Are system validation activities of optimal effectiveness?

6 GLOSSARY

This glossary contains terms, phrases and acronyms used within the text of the INCOSE SECAM and the SECAM Assessment Method. For each term, a brief prose description is provided, along with the associated acronym, if any. Definitions which can be attributed to another work or standard are identified by the following references:

[INCOSE] Officially endorsed INCOSE Definition
[CAWG] INCOSE Capability Assessment Working Group
[MWG] INCOSE Metrics Working Group
[EIA/IS 632] EIA Interim Standard 632
[IEEE/ANSI Std 729-1983] Standard Glossary of Software Engineering Terminology
[IEEE Std 1220-1994] Trial-Use Standard for Application and Management of the Systems Engineering Process

acquisition agency: Any agency tasked to acquire, on behalf of a third party, a (software and/or hardware intensive) system. [CAWG]

action plan: A plan that addresses how to implement the recommendations in a findings report, including the detailed actions and changes necessary. The following components are considered to be essential: (1) action-oriented description of task (for each action item), (2) responsibility, (3) resources, and (4) schedule of checkpoints and milestones. See action task. [CAWG]

action plan team: The collection of people who are tasked to formulate an action plan following the on-site phase of a systems engineering assessment. Typically, the team is composed of self-assessment team members, site SEPG members, and other key site personnel. [CAWG]

action task: The part of an action plan that addresses how a specific recommendation or recommendations in the final report will be implemented. [CAWG]

allocated baseline : The initially approved documentation describing a subsystem's functional, performance, interoperability, and interface requirements that are allocated from those of the system or a higher level subsystem; interface requirements with interfacing subsystems; design constraints; derived requirements (functional and performance); and verification requirements and methods to demonstrate the achievement of those requirements and constraints. Generally there is an allocated baseline for each subsystem to be developed. [EIA/IS 632]

architecture : See system architecture.

assessment: See systems engineering process assessment.

assessment facilitators : Experts in systems engineering process assessment responsible for guiding systems engineering process assessment team members through a systems engineering process assessment activity. [CAWG]

assessment follow-up phase: The final phase of the assessment method. The focus of this phase is the execution of the action plan, implementing the recommendations made to improve systems engineering capability. [CAWG]

assessment method: The set of steps or procedure for conducting a systems engineering assessment. The assessment method consists of five phases: commitment, preparation, on-site, post-assessment, and assessment follow-up. [CAWG]

assessment participants: A term used to collectively refer to the managers, project leaders, and the systems engineering practitioners who participated in the assessment. [CAWG]

assessment questionnaire : See questionnaire.

assessment sponsor: The senior site executive who has committed to or requested the systems engineering assessment. This person typically has control of financial and other resources for the systems engineering organization. See senior site manager. [CAWG]

assessment team (AT): A team of experienced engineering professionals that are trained in the SECAM assessment method to perform assessment(s). [CAWG]

assessment team briefing: A formal briefing for assessment participants intended to familiarize them with the impending assessment, clarify their roles, review the on-site phase schedule of activities, and set appropriate expectations. This briefing is typically conducted by one or more of the site assessment team members 1-2 weeks prior to the beginning of the on-site phase. [CAWG]

assessment team coordinator: An assessment team member who has the additional role of being the primary focal point for detailed site planning, facilities and logistics. Assists the assessment team leader by acting as back-up as needed. [CAWG]

assessment team leader (ATL): An assessment team member who has the additional role of leading the assessment team. Typically, the following responsibilities are assigned to this person: (1) overall facilitator, (2) spokesperson for the team, (3) successful conduct of the assessment. The ATL usually is the most experienced ATM (typically 15 years or more of engineering experience). [CAWG]

assessment team member: A member of the assessment team. These members are respected senior-level systems engineering professionals (practitioners) who are opinion leaders and advocates of improvement in the systems engineering capabilities of the organization. An important selection consideration is the degree to which their presence on the assessment team might affect the free flow of information during discussions with project leaders and systems engineering practitioners. Individuals who might affect the free flow of information (e.g., managers, technical directors, team leaders, etc.) should not be selected as assessment team members. [CAWG]

assessment team training: A training program for assessment team members. The principal objectives of the training are to: (1) ensure a common knowledge base (systems engineering process management, change management, systems engineering assessment process), (2) discuss and practice selected assessment skills, and (3) engage in team-building activities. [CAWG]

baseline : See configuration baseline.

capability level: The extent to which the organization can potentially accomplish the essential elements of systems engineering as defined in the context of SECAM key focus areas. Capability involves the attributes of people, technology, and process. SECAM capability levels are an ascending scale progressing from Initial (Level 0), to Performing, to Managed, to Defined, to Measured, through Optimizing (Level 5). See also systems engineering capability. [CAWG]
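
As an illustration only (not part of the model), the ordered scale described above can be restated as a small enumeration; the class name below is invented:

    # A minimal sketch of the SECAM capability scale as an ordered type.
    from enum import IntEnum

    class CapabilityLevel(IntEnum):
        INITIAL = 0       # Level 0
        PERFORMING = 1
        MANAGED = 2
        DEFINED = 3
        MEASURED = 4
        OPTIMIZING = 5    # Level 5

    # The levels are ordered, so capability comparisons are meaningful.
    assert CapabilityLevel.DEFINED > CapabilityLevel.MANAGED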

champion: See change advocate.

change advocate : An individual or group who wants to achieve a change but lacks sufficient sponsorship. Contrast with change agent. [CAWG]

change agent: An individual or group that has sponsorship and is responsible for implementing or facilitating change(s). An example of a change agent is the systems engineering process group. Contrast with change advocate. [CAWG]

change management: The process of actively and explicitly taking appropriate measures to increase the likelihood of effective and efficient introduction of change(s) in an organization. Also, the body of knowledge pertaining thereto. Also known as managing technological change. [CAWG]

change sponsor: An individual or group who legitimizes, authorizes, and supports the change. In an assessment, this person is usually the senior site manager. [CAWG]

chief engineer: A senior level engineer whose responsibilities include the definition and development of key elements of the system, coordinating the activities of the technical team, reviewing all of the work of the technical team, and having overall responsibility for the technical content of the design. [CAWG]

commitment phase: In this phase of the assessment process, an organization makes a commitment to conduct an assessment. This commitment is typically elicited from the change agent by change advocates and champions. [CAWG]

composite results : Results which are non-specific with regard to particular individuals or projects. Typically used to refer to the findings which are presented to the senior management team during the post-assessment phase of an assessment. [CAWG]

confidentiality ground rules: The free flow of relevant information is essential to the success of an assessment. There are two potential concerns which may inhibit this free flow of information - (1) the assessed organization may be concerned about sensitive information concerning its systems engineering capabilities becoming known to its competitors or customers and (2) assessment participants may be concerned about information being attributed to them by project or name. Accordingly, the following rules are abided by and supported by an assessment agreement:

1. Only composite results are given to the management team.
2. The assessment team and assessment participants agree to keep all information disclosed during the course of an assessment confidential.
3. The assessing agent will not release or otherwise identify the results of any organization's assessment.
4. The assessing agent is free to use assessment data and conclusions derived therefrom for statistical, analytical, or reporting purposes provided that the confidentiality requirement can be honored and that the information can be used without attribution to its source either directly or by inference.
5. The assessing agent will not publish collective data externally unless such data is based upon information from not less than 10 different organizations.

6. Project-specific data is retained by the assessed organization. [CAWG]

confidentiality requirement: See confidentiality ground rules.

configuration baseline : The configuration information formally designated at a specific time during a system's or subsystem's life cycle. Configuration baselines, plus approved changes from those baselines, constitute the current configuration information. There are four formally designated configuration baselines, namely the user requirements, functional, allocated, and product baselines. [adapted from EIA/IS 632]

configuration item: A collection of hardware and/or software elements treated as a unit for the purpose of configuration management. [adapted IEEE/ANSI Std 729-1983]

configuration management: The process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the system life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items. [adapted IEEE/ANSI Std 729-1983]

customer: An individual or organization that commissions the engineering of a system or is a prospective purchaser of an end product. Customers are one source of requirements. See also stakeholder. [adapted from EIA/IS 632]

data: The various forms of documentation required to support a program in all of its areas. Data may take any form (e.g., printed or drawn on various materials, electronic media, or photographs). Data may be deliverable to a customer or non-deliverable for internal use only. [CAWG]

data management: Administrative control of program data, both deliverable and non-deliverable. Administrative control involves such items as identification, interpretation of requirements, planning, scheduling, control, archiving and retrieval of program data. [CAWG]

derived requirement: Characteristics needed to complete the requirements set for configuration item design that are dependent on the nature of the configuration item solution for their initial identification. These are typically identified during synthesis of preliminary product and process solutions, related trade off studies and verifications. [adapted from EIA/IS 632]

documented criteria: A formally controlled set of rules or standards on which an assessment can be made. [CAWG]

drawing tree: Similar to a specification tree. A drawing tree identifies the hierarchy of hardware component documentation and their interfaces for each system configuration which are to be controlled. [CAWG]

effectiveness: A measure of the performance of an activity. SECAM characterizes effectiveness as marginal, adequate, significant, measurably significant and optimal. These are defined as follows:

marginal effectiveness: effort is being expended but it is not clear that the value received for the effort invested is worth the cost of the effort. The effort could be removed without causing significant impact to the program.

adequate effectiveness: effort is being expended and the activities provide reasonable benefit to the program.

significant effectiveness: effort being expended is obviously beneficial to the program.

measurably significant effectiveness: effort being expended and the benefit are measured and found to be of significant value to the program.

optimal effectiveness: the return for effort expended is demonstrably optimal (i.e. of maximum value for the effort expended) [CAWG]

environment: The aggregate of technological, process, social and cultural conditions that influence the development of a system. [CAWG]

error prevention analysis : A process that is typically conducted by a working group of the engineering professionals who developed the documentation/product in question. It is an objective assessment of each error, its potential cause, and the steps to be taken to prevent it. While placing blame is to be avoided, such issues as mistakes, adequacy of education and training, proper tool capability, and support effectiveness are appropriate areas for analysis. [CAWG]

exit criteria: Specific accomplishments or conditions that must be satisfactorily demonstrated before an effort can progress further in the current life cycle phase or transition to the next phase. [EIA/IS 632]

exploratory question: A question formulated with the intent of furthering the assessment team's understanding of a SECAM Questionnaire response. A template of exploratory questions is provided as part of the SECAM Assessment Method. These questions should be tailored by the assessment team during the on-site period prior to conducting interviews of those that completed the questionnaire. [CAWG]

facilitator: See assessment facilitator.

findings: The assessment team's view of the most important systems engineering process issues or problems currently facing an organization. These are always presented as composite results to honor the confidentiality requirement of an assessment. [CAWG]

findings report: A formal written report (sometimes referred to as the Findings and Recommendations Report) containing the findings and recommendations, along with the details of the assessment and its conduct. This report is sometimes included in the action plan. [CAWG]

first-line manager: The first level of management responsible for reviewing performance and salary; the manager directly responsible for supervising the working level staff. First-line managers are responsible for managing technical supervisors and professionals. [CAWG]

formal method: A technique for expressing requirements in a manner that allows the requirements to be studied mathematically. Formal methods allow sets of requirements to be examined for completeness, consistency, and equivalency to another requirement set. Formal methods result in formal specifications. [CAWG]

formal procedure : A documented series of steps with guidelines for use. [CAWG]

formal review : A review that is conducted in accordance with approved, established standards. [CAWG]

function: A task, action, or activity expressed as a noun-verb combination (e.g., Brake Function: stop vehicle) to achieve a defined outcome. [IEEE Std 1220-1994]

functional analysis : Examination of a defined function to identify all the subfunctions necessary to the accomplishment of that function; identification of functional relationships and interfaces (internal and external) and capturing these in a functional architecture; and flow down of upper-level performance requirements and assignment of these requirements to lower-level subfunctions. [EIA/IS 632]

functional architecture : The hierarchical arrangement of functions, their internal and external (external to the aggregation itself) functional interfaces and external physical interfaces, their respective functional and performance requirements, and design constraints. [EIA/IS 632]

functional baseline : The initially approved documentation describing a system's or configuration item's functional, performance, interoperability, and interface requirements and the verification required to demonstrate the achievement of those specified requirements. [adapted EIA/IS 632]

guidelines: A documented approach which, if followed, assures a specific outcome or result. [CAWG]

hardware design team: The engineering staff responsible for the development of a particular hardware configuration item. [CAWG]

informal review: A review that is conducted in an ad-hoc manner. Sometimes referred to as a peer review. [CAWG]

inspection(s): The examination (review) of a product and/or its associated documentation to determine whether or not it conforms to requirements. [CAWG]

integration: The merger or combining of two or more elements (e.g., components, parts, or configuration items) into a functioning and higher level element with the functional and physical interfaces satisfied. [IEEE Std 1220-1994, modified by CAWG]

key focus area (KFA): An element of a Process Category within the INCOSE Systems Engineering Capability Assessment Model. Key focus areas are associated with a set of questions used to assess a particular aspect of systems engineering capability. [CAWG]

life cycle : See system life cycle.

management team: Typically the senior site executive and his direct reports. May be enlarged to include additional levels of management at the discretion of the senior site executive. [CAWG]

measurement: Data collected on a process, task, or activity. Measurements are used to synthesize metrics. [CAWG]

mechanism: A means or technique whereby the performance of a task, procedure, or process is assured. The mechanism may involve several organizational elements, and its documentation may include some combination of function statements, operating plans, position descriptions, and/or formal procedures. The documentation defines what should be performed, how it should be performed, and who is accountable for the results. [CAWG]

method: A specific set of rules, techniques, and guidelines for carrying out a process and its activities. Thus a method serves to organize and discipline the overall process of preparing and evolving systems. [CAWG]

metrics : A synthesis of multiple measurements for the purpose of defining a process characteristic. See also planning and control metric, systems engineering metric, and technical performance measures. [CAWG]

need: A user-related capability shortfall (such as those documented in a need statement, field deficiency report, or engineering change order), or an opportunity to satisfy a new market or capability requirement because of a new technology application or breakthrough, or to reduce costs. Needs may also relate to providing a desired service (e.g. system disposal). [adapted EIA/IS 632]

on-site phase: The phase of the assessment process in which the assessment team conducts discussions with project leaders and functional area representatives at the organization's site and reports its findings to the management team and assessment participants. This phase typically lasts 3-5 days. [CAWG]

opening meeting : A formal one-hour meeting conducted as the initial activity of the on-site phase of an assessment. The meeting is attended by the sponsoring executive, his management team, assessment participants, and the assessment team. [CAWG]

opinion leader: An engineering professional highly respected by his/her peers and whose opinion can significantly influence his/her peers. [CAWG]

organization: Usually the portion of an enterprise bounded by the span of authority of the senior site executive. [CAWG]

organizational policy: See policy.

peer review: See informal review.

performance requirement: A quantitative measure characterizing a physical or functional attribute relating to the execution of a mission/operation function. Performance attributes include quantity (how many or how much), quality (how well), coverage (how much area, how far), timeliness (how responsive, how frequent), and readiness (availability, mission/operational readiness). Performance is an attribute for all system people, products, and processes including those for development, production, verification, deployment, operations, support, training, and disposal. Thus, supportability parameters, manufacturing process variability, reliability, and so forth, are all performance parameters. [EIA/IS 632]

plan: A documented series of tasks required to meet an objective. [CAWG]

planning and control metric: A combination of measurements used to provide periodic assessment of the health and status of a program throughout its life cycle. These metrics are used to detect the presence of adverse trends early enough so that corrective actions may be taken. Examples of planning and control metrics include: schedule performance vs. plan; cost performance vs. plan; staffing level actuals vs. planned. [MWG]
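
For illustration only, one of the example metrics above (staffing level actuals vs. planned) might be computed as in the following sketch; all names and numbers are invented:

    # Hypothetical monthly staffing data; the variance flags adverse trends.
    planned_staff = {"Jan": 10, "Feb": 12, "Mar": 14}
    actual_staff = {"Jan": 10, "Feb": 11, "Mar": 12}

    for month in planned_staff:
        variance = actual_staff[month] - planned_staff[month]
        print(f"{month}: planned {planned_staff[month]}, "
              f"actual {actual_staff[month]}, variance {variance:+d}")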

policy: A specific set of principles endorsed by senior management selected to bound present or future actions. [CAWG]

post-assessment phase: The assessment phase that consists of finalizing the action plan based upon the assessment results (i.e. findings and recommendations) and the priorities of senior management. [CAWG]

practitioner: Anyone who participates in the systems engineering process. Practitioners may be individuals responsible for accomplishing systems engineering tasks, individuals that use the products of systems engineering, or individuals that provide resources to enable systems engineering process implementation. For the purpose of a systems engineering process assessment, practitioners should be peers within the organization having the aforementioned responsibilities. Some of the more important selection considerations are the following: (1) should be respected by his/her peers and considered an opinion leader, (2) considered an expert in one or more of the key focus areas, (3) is actually doing systems engineering work on one or more active projects (not a staff professional). In earlier versions of the SECAM practitioners were referred to as functional area representatives (FARs). [CAWG]

pre-assessment phase: See preparation phase.

preparation phase: This phase of the assessment process is devoted to preparing for the on-site phase. Major activities conducted during this phase are obtaining senior management sponsorship, selecting and training the assessment team, and administering the questionnaire. [CAWG]

process: (1) A set of actions, tasks, and procedures that, when performed or executed, obtains a specific goal or objective. (2) The logical organization of people, machines, tools, methods, and procedures into work activities designed to produce a specified end result (work product). We use the term systems engineering process to refer to processes which are intrinsic to developing and evolving systems. [CAWG]

process data: The data that is gathered about a process. It typically includes review, test, and resource data by process phase and change activity. To be most meaningful, this data should be associated with the process documentation, the tools and methods used, and the characteristics of the product being produced. [CAWG]

process database: A repository into which all process data is entered. It is a centralized resource managed by the process group. Centralized control of this database ensures that the process data from all projects are permanently retained and protected. [CAWG]

process group: A process group is composed of specialists concerned with the process used by the organization for system development. Its typical functions include defining and documenting the process, establishing and defining metrics, gathering data, assisting projects in analyzing data, and advising management on areas requiring further attention. The process group typically conducts quarterly management reviews on process status and may provide review leaders. [CAWG]

process maturity: The extent to which the processes used by an organization are explicitly defined, managed, measured and controlled. [CAWG]

process maturity matrix templates: An assessment aid developed to facilitate the scoring of initial process maturity levels based on responses to the maturity questionnaire and using a set of heuristic characteristics which are derived from ISO SPICE generic practices. The process maturity matrix is provided as part of the SECAM Assessment Method. This is not the preferred method for deriving a capability assessment score because this approach does not consider the KFA-specific attributes as part of the scoring process. [CAWG]

process metrics : Metrics used for the purpose of assessing the effectiveness and identifying corrective actions to be taken with respect to processes used by an organization. [CAWG]

product baseline : The initially approved documentation describing all of the necessary functional, performance, and physical requirements of the subsystem; the functional and physical requirements designated for production acceptance testing; and tests necessary for deployment, support, training, and disposal of the subsystem. In addition to the documentation, the product baseline of a subsystem may consist of the actual equipment and software. There is a product baseline for each subsystem, component, and part. [EIA/IS 632]

profile : A representation (e.g., chart, spreadsheet, etc.) of the actual versus planned status of an item, action or task. [CAWG]

program: A set of tasks that are oriented towards meeting specific, defined objectives and accomplished by a group of individuals. The set of tasks are generally complex in nature and are performed within a definable time span (time between start and completion can often span numerous years) according to a planned schedule that has intermediate milestones. [CAWG]

program engineer: A systems engineer responsible for all the engineering technical, cost and schedule activities (e.g., Systems, Software, Hardware, Specialty) for a program. The Program Engineer reports to the Program Manager and is the primary interface between engineering and the Program Manager. [CAWG]

program manager: The individual ultimately responsible for accomplishing the tasking and meeting the objectives, schedule and fiscal constraints of the program. Everyone working on the program is tasked by the program manager, either directly or by delegated authority through other subordinate managers. [CAWG]

project: A planned undertaking of something to be accomplished, produced, or constructed, having a specified start and end date. The terms project and program are sometimes used interchangeably. Where the word project appears the word program may be substituted, for clarification, if needed. Generally, larger projects are referred to as programs. [CAWG]

project leader: A designated systems engineering manager or team leader with responsibility and authority over a project. Depending upon the organization's structure, this term may or may not be synonymous with the term “first-line manager”. [CAWG]

project manager: A person responsible for exercising executive, administrative, and supervisory direction of a project. [CAWG]

project representatives: The engineering professionals representing a project to be assessed - typically the project leader and any optional support from one or two of the project technical professionals. [CAWG]

quality goals: Specific objectives, which if met, provide a level of confidence that the quality of a product is satisfactory. [CAWG]

questionnaire : A template used, typically in the context of an assessment, to collect relevant information about the questionnaire respondent and the program used as the context for responses. [CAWG]

recommendations : The description of the actions which, if taken, correct or improve the situations identified in the findings and, thereby, their associated consequences. [CAWG]

regression testing: The testing required to determine that a change to a system component has not adversely affected functionality, reliability or performance. [CAWG]

requirements : Characteristics that identify the accomplishment levels needed to achieve specific objectives for a given set of conditions. [EIA/IS 632]

requirements analysis : The determination of system specific performance and functional characteristics based on analyses of customer needs, requirements, and objectives; mission/operations; projected utilization environments for people, products, and processes; constraints; and measures of effectiveness. The bridge between customer requirements and system specific requirements from which solutions can be generated for the primary system functions. [EIA/IS 632]

requirements traceability: The evidence of an association between a derived requirement and its parent requirement or between a requirement and its implementation. [CAWG]
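
As a hypothetical sketch (the SECAM does not prescribe a representation), the parent/child association described above could be recorded and traced as follows; the requirement identifiers are invented:

    # Map each requirement to its parent; None marks a root requirement.
    parent_of = {
        "SYS-12": None,        # a top-level customer requirement
        "SYS-12.1": "SYS-12",  # derived requirements trace to their parent
        "SYS-12.2": "SYS-12",
    }

    def trace_to_root(req_id):
        """Return the traceability chain from a requirement up to its root."""
        chain = [req_id]
        while parent_of[chain[-1]] is not None:
            chain.append(parent_of[chain[-1]])
        return chain

    print(trace_to_root("SYS-12.1"))  # -> ['SYS-12.1', 'SYS-12']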

response analysis matrices: A format which aids the assessment team in structuring, understanding and correlating questionnaire responses across multiple projects within an organization during the on-site phase of an assessment. A response analysis matrix may be constructed by hand, or may be constructed using tools such as an electronic spreadsheet. Response analysis matrices should be prepared from questionnaire responses at the end of the pre-assessment phase and prior to the start of the on-site assessment phase. [CAWG]
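
For illustration only (the Assessment Method leaves the tool open), a response analysis matrix might be assembled as in this sketch, with questions as rows and projects as columns; the project names and responses are invented:

    # Questionnaire responses collected per project.
    responses = {
        "Project A": {"3.7-1.1/G1": "yes", "3.7-2.3": "no"},
        "Project B": {"3.7-1.1/G1": "yes", "3.7-2.3": "yes"},
    }

    # One row per question, so responses can be correlated across projects.
    questions = sorted({q for answers in responses.values() for q in answers})
    print("question".ljust(12), *responses)
    for q in questions:
        print(q.ljust(12), *(responses[p].get(q, "-") for p in responses))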

response forms : The templates used to record responses to the questions in the questionnaire. The response form may be the questionnaire provided in the SECAM Assessment Method, or may be an electronic system capable of consolidating the responses into a response analysis matrix. [CAWG]

review: See formal review or informal review.

review coverage : The degree to which all requirements and design of the system have been reviewed. It is typically stated as a percentage and measures the percentage of the requirements and/or design evaluated by the review process. [CAWG]

review data: The data that is gathered from requirements or design reviews. This data is of two types. The first, concerning the review process, typically includes preparation time, errors identified during preparation (by category), hours per error found in preparation, review time, number of requirements or design statements reviewed, number of requirements or design statements reviewed per hour, and errors found per review man-hour (by category). The second type, product data from the review, typically includes errors found per requirement or design statement, action items identified from each review, action items closed for each review, items needing re-review, re-reviews conducted. [CAWG]

review efficiency: The percentage of errors found through the review process. It is typically stated as a percentage and is calculated by dividing the total errors found during review by the total errors found by both review and test through the completion of system integration test. [CAWG]
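
The calculation described above reduces to a simple ratio; a minimal sketch, with invented error counts:

    def review_efficiency(errors_in_review, errors_in_test):
        """Percentage of all errors (counted through completion of system
        integration test) that were found by review rather than test."""
        return 100.0 * errors_in_review / (errors_in_review + errors_in_test)

    print(review_efficiency(40, 10))  # -> 80.0 (percent)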

review leader: Typically a member of the process or assurance group who is thoroughly trained in the review process. The review leader's role is to ensure that the participants are properly prepared and that the review is efficiently and thoroughly conducted. The review leader is responsible for recording review data, making sure that the actions resulting from the review are completed, and for conducting re-reviews where appropriate. [CAWG]

risk: A measure of the uncertainty of attaining a goal, objective, or requirement pertaining to technical performance, cost, and schedule. Risk level is categorized by the probability of occurrence and the consequences of occurrence. Risk is assessed for program, product, and process aspects of the system. This includes the adverse consequences of process variability. The sources of risk include technical (e.g. feasibility, operability, producibility, testability, and system effectiveness); cost (e.g. estimates, goals); schedule (e.g. technology/material availability, technical achievements, milestones); and programmatic (e.g. resources, contractual). [EIA/IS 632]

risk management: An organized process to identify what can go wrong, to quantify and assess associated risks, and to implement/control the appropriate approach for preventing or handling each risk identified. [EIA/IS 632]

risk management plan: A description of the risk management program, detailing the approach and activities for risk management. The technical risk management plan is an essential part of the System Engineering Management Plan (SEMP). [EIA/IS 632]

scoring profile templates: An assessment aid developed to facilitate the presentation of capability assessment scores based upon assessment participant responses to the questionnaire, exploratory questions, and practitioner discussion groups. The scoring results template presents each KFA score as a separate vertical bar on the template. The scoring “signature” can be evaluated using this template. [CAWG]

self-assessment: A systems engineering assessment for which (1) all or most of the assessment team is from the assessed organization (not necessarily the assessed site) and (2) the assessed organization has primary responsibility for facilitation and planning. [CAWG]

senior management: Management responsible for establishing policy and authorizing resources for specific programs. [CAWG]

senior site executive : The senior executive manager who sets the operational priorities for the organization. [CAWG]

senior site manager: Same as senior site executive.

site assessment team members : The assessment team members who are employed by the site being assessed. [CAWG]

site team training: A one or two day training program conducted for the benefit of the site self-assessment team members. [CAWG]

software design team: The engineering staff responsible for the development of a particular computer program configuration item. [CAWG]

specification: A description of the essential technical requirements for items, materials, and services that include the verification criteria for determining whether these requirements are met. A specification supports the development and life cycle management of the item, material, and service described. [EIA/IS 632]

specification element: A component of the system defined by a specification. [adapted from IEEE-1220-1994]

specification tree: A hierarchy of specification elements and their interface specifications that identifies the elements and the specifications related to physical elements of the system configuration which are to be controlled. [IEEE-1220-1994]

sponsor: See change sponsor.

stakeholder: An individual or organization interested in the success of a product or system. Examples of stakeholders include customers, developers, engineering, management, manufacturing, users, etc. [CAWG]

standard(s): An approved, documented, and available set of criteria used to determine the adequacy of an action or object. [CAWG]

statement of work: A description of all work required to complete a program or project. A contractual document which defines the work effort required from contractors or customer support activities. [CAWG]

structure tree: Similar to a specification tree. A structure tree identifies the hierarchy of software components and their interfaces for each system configuration which are to be controlled. [CAWG]

structured manner: A term used in the managed level of capability maturity that is intended to denote that a process (e.g., system concept definition) is performed in a logical manner by a group of individuals according to a procedure that may not be completely documented, or documented at all. The group of individuals is performing the process at a level of sophistication greater than that of ad hoc methods; however, the process has not been captured in a formal, completely documented procedure that is required to be followed. [CAWG]

supporting materials: Typically documentation which is requested by the assessment team during the on-site phase of the assessment to either substantiate a questionnaire response or clarify an issue. [CAWG]

subsystem: A grouping of configuration items satisfying a logical group of functions within a particular system. [EIA/IS 632]

synthesis : The translation of input requirements (including performance, function, and interface) into possible solutions (resources and techniques) satisfying those inputs. Defines a physical architecture of people, product, and process solutions for logical grouping of requirements (performance, function, and interface) and then designs those solutions. [EIA/IS 632] Note: synthesis is addressed by the System Design KFA.

system: A set of interrelated components working together to accomplish a common purpose. [CAWG] An interacting combination of elements viewed in relation to function. [INCOSE]

system architecture : A logical, physical structure that specifies interfaces and services provided by the system components necessary to accomplish system functionality. [CAWG]

system breakdown structure (SBS): IEEE-1220-1994 terminology for a Work Breakdown Structure. [CAWG]

system component: A basic part of a system. System components may be personnel, hardware, software, facilities, data, materials, services, and/or techniques which satisfy one or more requirements in the lowest levels of the functional architecture. System components may be subsystems and/or configuration items. [CAWG]

system design team: The engineering staff responsible for development of the system. This team may consist of engineers from many different specialty areas. [CAWG]

system engineering management plan (SEMP): The basic plan governing the systems engineering effort. The SEMP describes the technical program plan and control, the systems engineering process used by the program, and how the activities of each engineering specialty are to be integrated during the program. [CAWG]

system life cycle : The period extending from inception of development activities, based on an identified need or objective, through decommissioning and disposal of the system. [EIA/IS 632]

system requirements : Characteristics of a system that identify the accomplishment levels needed to achieve specific objectives for a given set of conditions. [CAWG / EIA/IS 632] See also system and requirements.

systems engineering : An inter-disciplinary approach and means to enable the realization of successful systems. [INCOSE]

systems engineering capability: A term used to refer to the ability of a systems engineering organization to perform successfully (in terms of cost, schedule and quality) on systems commitments. There are many different dimensions of systems engineering capability. Some of the more dominant ones are: (1) people (including training, experience, domain expertise and motivation), (2) the process maturity of the organization, and (3) the technology which can be brought to bear. See also capability level. [CAWG]

systems engineering metric: Metrics which provide a quantitative means of assessing the systems engineering effort on a program. [MWG]

systems engineering process: The total set of systems engineering activities needed to transform a user's requirements into a system or product. The process of applying engineering principles to produce a system or product. [CAWG]

systems engineering process assessment (SEPA): An appraisal or review done by a trained team of engineering professionals to determine the state of an organization's current systems engineering process maturity level or systems engineering capability level, to determine the high-priority systems engineering process issues that face an organization, and to start the actions needed for systems engineering process improvement. An assessment is conducted on a confidential basis for the benefit of the assessed organization. [CAWG]

systems engineering process group: A group focused on improving the systems engineering process used by an organization. The SEPG defines and documents the systems engineering process, establishes and defines process metrics, supports project data gathering, assists projects in analyzing data, and advises management of areas requiring further attention. [CAWG]

systems engineering process improvement: The changes implemented to a systems engineering process that bring about improvements. [CAWG]

systems engineering process management: The use of process engineering concepts, techniques, and practices to explicitly monitor, control, and improve the systems engineering process. The objective of systems engineering process management is to enable an organization to produce system/segment products according to plan while simultaneously improving its ability to produce better products. [CAWG]

systems engineering professional : One whose profession is intrinsically tied to the development or evolution of systems. [CAWG]

systems engineering team: See system design team.

tailoring: The process by which individual task statements (sections, paragraphs, or sentences) of specifications, standards, guidelines, and related documents are evaluated to determine the extent to which they are most suitable for a specific system and equipment development, and the modification of these requirements to ensure that each achieves an optimal balance between operational needs and cost. [EIA/IS 632 (modified)]

team leader: See project leader.

technical effort: A technical effort is any activity that influences system performance by defining, designing, or executing a task, requirement or procedure. All the activities required to implement and execute the systems engineering process are technical efforts. [CAWG]

technical objectives: Technical objectives or goals guide the development effort by providing “target” values for item characteristics. These can include cost, schedule, and performance attributes deemed important. Technical objectives are not specification requirements. [CAWG]

technical performance measurement (TPM): The continuing analysis, test and demonstration of the degree of anticipated and actual achievement of selected technical measures (technical performance parameters). Technical Performance Measurement is a method to assess compliance to requirements and the level of technical risk in the development program. Technical Performance Measurement includes analysis of the differences among the achievement to date, the current estimate, and the required goal or target value identified for the technical performance parameter. A numerical sketch follows the definitions below.

a. achievement to date - The value of the technical parameter estimated or measured in a particular test, demonstration, analysis or simulation.

b. current estimate - The value of a technical parameter predicted to be achieved with existing resources by the end of the contract.
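
As a hypothetical illustration only (not part of the SECAM), the comparison among the three TPM values defined above might look as follows for an invented vehicle-weight parameter, where lower values are better:

    # TPM values for a hypothetical weight parameter (kg); all numbers invented.
    achievement_to_date = 1250.0  # measured in the latest test
    current_estimate = 1180.0     # predicted value at end of contract
    required_goal = 1100.0        # the target value

    # Negative margins indicate the parameter currently exceeds its target.
    print("margin to date:", required_goal - achievement_to_date)  # -> -150.0
    print("predicted margin:", required_goal - current_estimate)   # -> -80.0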

technical performance parameter (TPP): Technical Performance Parameters are a selected subset of the system's performance parameters used as the technical measures tracked in TPM. Technical Performance Parameters can be:

a. Specification Requirements.
b. Performance parameters such as measures of effectiveness and other key decision metrics used to guide and control progressive development.
c. Design to cost requirements or goals. [CAWG]

technical resource profiles: Profiles of one or more configuration item's technical resources. Examples of aspects of the profiles are: (1) list critical technical resources (e.g., memory, CPU speed, throughput, size, weight, power consumption/requirements etc.); (2) define resource profiles from operations concept (e.g., memory utilization versus CSCI versus time, etc.); (3) establish a technical resource margin or reserve requirement (e.g., 20%, 25%, 50%, etc.); (4) determine the maturity of resource utilization measurement (e.g., estimated, analyzed, existing design, existing item, etc.); (5) update/maintain resource utilization profiles/margins over life-cycle of systems development (time). [CAWG]
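
For illustration only, aspect (3) above (establishing a technical resource margin or reserve requirement) might be checked as in the following sketch; the function name and numbers are invented:

    def reserve_met(capacity, utilization, reserve_fraction):
        """True if utilization leaves at least the required reserve."""
        return utilization <= capacity * (1.0 - reserve_fraction)

    # e.g., 64 MB of memory, 44 MB estimated use, 25% reserve required.
    print(reserve_met(64.0, 44.0, 0.25))  # -> True (44 <= 48)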

technique : The manner in which a specific set of technical details are treated to accomplish a specific goal. [CAWG]

tool: Typically, a computer program used to help automate tasks associated with the synthesis, test, analysis, or maintenance of models, designs, and/or documentation associated with systems components. [CAWG]

traceable : See requirements traceability.

traceability: See requirements traceability.

trade-off analysis: One of a number of decision making methods. See trade study. [CAWG]

trade study: An objective evaluation of alternative requirements, architectures, design approaches, or solutions using identical ground rules and criteria. [EIA/IS 632] Trade studies may be accomplished using a variety of decision making techniques (e.g., the Trade-off Analysis Method, Quality Function Deployment, the Analytic Hierarchy Process). [CAWG]

training : Instruction and applied exercises for the attainment and retention of skills, knowledge, and attitudes required to accomplish necessary tasks. [MIL-STD-1379D (modified)].

user: An individual or organization that uses end products of a system. See also stakeholder. [adapted from EIA/IS 632]

user requirements baseline : A configuration baseline of user requirements. This baseline is necessary in systems engineering domains in which there does not exist a single, clearly discernible user or customer. [CAWG]

validation: The process of establishing that a system design will meet its intended objectives in the environment for which it is intended. [CAWG]

value : A measure of the desirability of the products of an activity. SECAM characterizes value as marginal, adequate, significant, measurably significant and optimal. These are defined as follows:

marginal value : products are generated by the activity, but it is not clear that the products are of use to those persons that they are intended for. The products could be removed without causing significant impact to the program or organization.

adequate value : products generated by the activity provide reasonable benefit to those that use them. Products providing adequate value are generally used by those for which they are intended.

significant value : products generated by the activity are obviously beneficial to those that use them. Products of significant value are avidly sought out and used by those for which they are intended.

measurably significant value : the benefits of each product generated by the activity are measured and found to be of significant value to the program or organization.

optimal value : the value of each product generated by the activity is demonstrably optimal for the program or organization (i.e. of maximum utility) [CAWG]

verification: The determination of compliance to a set of requirements. [CAWG]

work breakdown structure (WBS): A product-oriented family tree composed of hardware, software, services, data, and facilities which result from systems engineering efforts and which completely defines the program. Displays and defines the product(s) to be developed or produced, and relates the elements of work to be accomplished to each other and to the end product. [EIA/IS 632] A product-oriented family tree composed of the hardware, services, and data to produce the end product. The WBS is structured in accordance with the way in which the work will be performed, and reflects the way in which the costs will be summarized and eventually reported. [DOE]
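
As an illustration only, a product-oriented family tree can be represented as a simple nested structure; the element names below are invented:

    # A minimal WBS sketch: each element maps to its child elements.
    wbs = {
        "Air Vehicle": {
            "Airframe": {},
            "Propulsion": {},
            "Avionics": {"Navigation Software": {}, "Radar Hardware": {}},
        },
    }

    def print_tree(tree, depth=0):
        """Print the family tree with indentation showing the hierarchy."""
        for element, children in tree.items():
            print("  " * depth + element)
            print_tree(children, depth + 1)

    print_tree(wbs)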

APPENDIX A DETAILED REVISION HISTORY

The revision numbering convention consists of a three digit number in the form of "X.YZ". Changes in the basic model structure or philosophy are considered a major revision to the document and will increment "X" and reset "YZ". Changes in content within the scope of the existing model structure will increment "Y" and reset "Z". Changes to the document from a publishing tool perspective which do not alter the content of the model will increment "Z". Revision bars are used to identify changes made since the previous release of the document.
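
For illustration only, the "X.YZ" convention could be modeled in code as in the sketch below; the function name is invented, and the sketch ignores letter suffixes (such as the "a" in 1.50a) and digit rollover:

    def next_revision(version, change):
        """Apply the SECAM revision convention to an 'X.YZ' version string.

        change is 'major' (structure/philosophy), 'content', or 'publishing'.
        """
        x, yz = version.split(".")
        y, z = int(yz[0]), int(yz[1])
        if change == "major":      # increments X, resets YZ
            return f"{int(x) + 1}.00"
        if change == "content":    # increments Y, resets Z
            return f"{x}.{y + 1}0"
        return f"{x}.{y}{z + 1}"   # publishing-tool change increments Z

    print(next_revision("1.30", "publishing"))  # -> 1.31
    print(next_revision("1.31", "content"))     # -> 1.40

The two example calls mirror transitions in the history below.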

A.1 DRAFT VERSION 1.00 FEBRUARY 1994
1. Original Release of Interim Model Document
2. Original Release of Interim Model Questionnaire

A.2 DRAFT VERSION 1.01 JUNE 1994
1. Combined the Interim Model Document and Questionnaire into a single Interleaf document. Effectivity control is used to generate both reports from a single document. This is necessary to minimize configuration management on the questions that appear in both reports.

2. Changed all references to "Key Process Areas" or "KPAs" in the Introduction to "Key Focus Areas" or "KFAs".

3. Content remains the same as version 1.00.

A.3 DRAFT VERSION 1.10 JULY 1994
1. Questions have been updated based upon lessons learned from pilot assessments conducted by Blake Andrews (Collins Air Transport Division, Rockwell) and Bill Mackey (CSC) from 3-94 through 5-94.

2. Key Focus Area 7, Risk Management, has been rewritten per comments from Bob Jones & George Stem.

3. Key Focus Area 10, Technology Management, has been rewritten per comments from Bob Jones & George Stem.

4. Key Focus Area 15, Integrated Engineering Analysis, has been rewritten per comments from Bob Jones & George Stem.

5. Added a glossary of terms provided by Jerry Burleson, Loral Command and Control Systems.
6. Italicized terms used in the questions to indicate that the term is defined in the glossary.
7. Introduction and Goals were updated based upon suggestions provided by Bill Mackey, Suzi Garcia, and Blake Andrews.
8. Version 1.10 introduction was added per Rich Widmann.

A.4 VERSION 1.20 NOVEMBER 1994
1. Key Focus Area 3, Subcontract Management, questions have been revised based upon CAWG review of CSC recommended changes.
2. Key Focus Area 4, Intergroup Coordination, questions have been revised based upon CAWG review of CSC recommended changes.
3. Key Focus Area 13, Systems Design, questions have been revised based upon CAWG review of CSC recommended changes.
4. Key Focus Area 14, Systems Integration & Verification, questions have been revised based upon CAWG review of CSC recommended changes.


5. Key Focus Area 16, System Concept Definition, has been added based upon CAWG review of CSC recommendations.

A.5 VERSION 1.30 APRIL 1995
1. Key Focus Area 9, Training: Added questions 9.2.1 and 9.2.2.
2. Key Focus Area 11, Environment & Tools Support: Description has been rewritten.
3. Introduction Updated:
   New Section 1.2.4, Interim Model Version 1.30
   New Section 2.5, Traceability Matrices
   New Section 3.0, Key Focus Area Goals-to-Questions Mapping
4. Interim Model Descriptions have been incorporated into a separate Section 4.

A.6 VERSION 1.31 APRIL 1995
1. Revision bar notations are cumulative from Version 1.30.
2. TOC entries for each KFA were added.
3. Revision History for Version 1.20 was incorrectly dated December 1994. Date was revised to November 1994.
4. Section 1.2 was updated to correctly reflect the current version.
5. Terminology used in Section 4.0 was corrected.
6. Glossary was incorporated into a separate Section 5.0.
7. Miscellaneous formatting changes were made.
8. Repaired broken autoreferences in KFA 11 (indicated by "NO TAG").
9. Replaced TBD entries in glossary with definitions.
10. Alphabetized glossary entries.

A.7 VERSION 1.40 MAY 1995
1. Section 1.2 was updated to correctly reflect the current version.
2. Section 1.2.5 added.
3. Section 2.0 was rewritten.
4. Process Categories were renamed as follows:
   System Management renamed Management
   Organizational renamed Organization
   Engineering Process renamed System Engineering
5. Numbering of Key Focus Areas has been improved to reflect the hierarchy of Process Categories, as follows:

   Version 1.3                              Version 1.4
   1.0  Planning                            1.1
   2.0  Tracking and Oversight              1.2
   3.0  Subcontract Management              1.3
   4.0  Intergroup Coordination             1.4
   5.0  Configuration Management            1.5
   6.0  Quality Assurance                   1.6
   7.0  Risk Management                     1.7
   8.0  Process Management and Improvement  2.1
   9.0  Training                            2.2
   10.0 Technology Management               2.3
   11.0 Environment and Tool Support        2.4
   12.0 System Requirements                 3.2


   13.0 System Design                       3.3
   14.0 System Integration                  3.5
   15.0 Integrated Engineering Analysis     3.4
   16.0 System Concept Definition           3.1

   (Revision Bars are not used to denote these changes.)
6. The System Integration and Verification KFA has been separated into the following KFAs: 3.5 System Integration and 3.6 System Verification.
7. A new KFA has been added: 3.7 System Validation.

8. Traceability Tables in Section 3.0 have been updated to map numbering used prior to version 1.4 to numbering used in version 1.4 (Column denoted "Question" was split into "V 1.3" and "V 1.4" - Revision Bars are not used to denote these changes).

9. Traceability Tables in Section 3.0 have been updated to include new tables for KFAs 3.6 and 3.7.

10. Within each KFA Description, place markers for deleted and moved questions have been removed (Revision Bars are not used to denote these changes - refer to the traceability table in Section 3.0 for this information).

11. Questions in each KFA have been renumbered and reordered as appropriate to improve readability (Revision Bars are not used to denote these changes - refer to traceability table in Section 3.0 for this information).

12. Questions in each KFA have been modified as indicated by Revision Bars per discussions of lessons learned from the Westinghouse SEPA and per recommendations made by Ken Crowder (Boeing Company).

13. Questions in each KFA have been modified as indicated by Revision Bars for consistency.
14. KFA 1.2, Tracking and Oversight, was updated to reflect work done by the Metrics Working Group (e.g., metrics are classified as TPMs, Planning/Control, and Systems Engineering Process).

15. KFA 2.1 was renamed "Process Management and Improvement".
16. The following changes have been made to goals within each KFA:
   1.1 Planning:
      - Goal 3 is new
   1.2 Tracking and Oversight:
      - Goals 1, 2, and 4 are modified
   1.3 Subcontract Management:
      - Goals 1-3 become Goals 2-4
      - New Goal 1 (planning)
   1.4 Intergroup Coordination:
      - Goals 1-3 become Goals 2-4
      - New Goal 1 (planning)
      - Goals 2 and 3 are modified
   1.5 Configuration Management:
      - No changes
   1.6 Quality Assurance:
      - No changes
   1.7 Risk Management:
      - Goals 2 and 3 are modified
   2.1 Process Management and Improvement:
      - Goals 1-4 become Goals 2-5
      - New Goal 1 (planning)


   2.2 Training:
      - Goal 1 is modified
      - Goal 2 is replaced (needs of the org)
      - New Goal 3 (needs of the program)
      - New Goal 4 (needs of the engineer)
      - New Goal 5 (training process improvement)
   2.3 Technology Management:
      - No changes
   2.4 Environment and Tool Support:
      - Goals 1 and 2 become Goals 2 and 3
      - New Goal 1 (planning)
      - New Goal 4 (implementation of env & tools)
   3.1 System Concept Definition:
      - Goals 1-3 become Goals 2-4
      - New Goal 1 (planning)
      - Goal 2 is modified
      - Goal 3 is modified
      - Goal 4 is deleted & new Goal 4 (baseline)
      - New Goal 5 (conceptual physical architecture)
   3.2 System Requirements:
      - Goal 2 is deleted
      - Goal 1 becomes Goal 2
      - New Goal 1 (planning)
      - Goal 3 becomes Goal 5
      - New Goal 3 (complete & traceable)
      - New Goal 4 (documented rationale)
   3.3 System Design:
      - Goal 3 becomes Goal 5
      - Goal 2 becomes Goal 4
      - Goal 1 becomes Goal 2
      - New Goal 2 (baseline)
      - New Goal 3 (traceable)
   3.4 Integrated Engineering Analysis:
      - Goal 1 becomes Goal 2
      - New Goal 1 (planning)
   3.5 System Integration (formerly System Integration & Verification):
      - Goal 2 replaced (interface design)
      - Goal 3 becomes Goal 4
      - New Goal 3 (rationale captured)

   These changes are noted by Revision Bars in the KFA description. Traceability Tables have also been updated per these changes (Revision Bars were not used in the traceability tables).
17. TOC updated per items 2, 3, 5, 6, and 7.
18. Nomenclature used for expressing capability levels has been changed to reflect model alignment with ISO SPICE conventions.
19. KFA Descriptions have been improved as noted by Revision Bars.

A.8 VERSION 1.41 (PRELIMINARY) OCTOBER 1995 - JANUARY 1996 (NEVER FORMALLY RELEASED)

1. Minor editorial changes throughout.


2. Added two questions to each KFA to further align the Interim Model with the ISO SPICE definition of capability.

3. Changes to glossary:
   - added "champion" as synonym for "change advocate"
   - added definition for "practitioner"
   - updated definition for "policy"
   - updated definition for "functional areas"
   - added definition for "system requirements"
   - added definition for "user requirements baseline"
   - updated definition for "configuration baseline" to acknowledge "user requirements baseline"

4. Changes to KFA 3.1, Concept Definition:
   - Additional emphasis on cost issues in description
   - modified Goal 4 for consistency with user requirements theme

5. Changes to KFA 3.2, System Requirements:
   - amended Goal 2 to include "negotiated"
6. All instances of "NCOSE" have been changed to "INCOSE" to reflect the new name "International Council On Systems Engineering" adopted by NCOSE during the Fifth Annual Symposium.

7. Changed title of Section 1.0.
8. Inserted new Section 1.2 entitled "Endorsement of the SECAM".
9. Section 1.2, "Evolution", has been moved to Section 2.0 (all section numbers have been updated accordingly).
10. Document "preface" material pages (including revision history and table of contents) have been renumbered using lower case roman numbering.
11. Added the following figures:

   - Figure 2.0-1, Historical Perspective of INCOSE SECAM Development
   - Figure 3.2-2, Organization of Interim Model Questions With Respect to Capability Levels
12. Updated Figure 3.2-1.
13. The following sections have been rewritten:

   - Section 3.2, Organization of the Interim Model
   - Section 3.2.3, Goals
   - Section 3.3, Traceability Matrices
14. Updated Section 3.1.3.1.

A.9 VERSION 1.50 (PRELIMINARY) APRIL 1996
1. Incorporated all Version 1.41 changes (Version 1.41 revision bars are enforced in addition to Version 1.50).
2. All present tense references to "Interim Model" changed to "INCOSE SECAM".
3. Minor editorial changes throughout.
4. Rewrite & restructure of model description sections:
   - Revision History Moved to Appendix A
   - Section 2.0 Retitled "Development of SECAM"
   - Section 4.0 Moved to Appendix B
   - Sections 2.1 through 2.6 Moved to Section 4.0 and subsections
   - Section 2.6 Moved to Section 2.3 and rewritten
   - Section 3.1.1 Retitled "Process Maturity"
   - Sections 3.1.2 through 3.1.3 Moved to Sections 3.1.3 through 3.1.4
   - Section 3.1.4 Moved to Section 3.1.4.1 & retitled "Use"
   - Sections 3.1.3.1 through 3.1.3.3 Moved to Sections 3.1.4.2 through 3.1.4.4


   - Section 3.1.4.4 Retitled "Product Life Cycle"
   - Section 3.2 Retitled "Structure of the INCOSE SECAM"
   - Section 3.2.2 Content divided into sections 3.2.2.1
   - Section 3.2.3 Moved to 3.2.2.3, "General Characteristics"
   - Section 3.2.4 Moved to Section 3.2.5 & retitled
   - Section 5.0 Retitled "INCOSE Systems Engineering..."

5. Added the following new descriptive sections:
   - Section 1.3 SECAM and its Assessment Method
   - Section 1.4 Why the SECAM was Developed
   - Section 1.5 Acknowledgements
   - Section 1.6 Additional Copies / General Info...
   - Section 1.7 Information on the INCOSE SECAM
   - Section 2.1 Approach
   - Section 2.2 Incremental Development
   - Section 3.1.2 Systems Engineering Capability
   - Section 3.1.4.1 Use
   - Section 3.2.2.4 Questions
   - Section 3.2.3 Relationship to Systems Engineering
   - Section 3.2.4 SECAM Capability Levels
   - Section 3.5 Relationship to Other Standards
   - Section 4.6 Version 1.41 (Preliminary)
   - Section 4.7 Version 1.50

6. Revised the following figures:
   - Figure 2.0-1 Updated & Moved to Figure 2.1-1
   - Figure 3.2-1 Updated
   - Figure 3.2-2 Updated & moved to Figure 3.2-7
   - Table 3.2.2-1 Updated & moved to Figure 3.2-2

7. Added the following new figures:
   - Figure 2.1-2 SECAM build-test-analyze-build Concept
   - Figure 3.1-1 Scoping the assessment objective
   - Figure 3.1-2 SEPAs conducted to date
   - Figure 3.1-3 Life cycle coverage
   - Figure 3.1-4 Scoring profile example
   - Figure 3.2-3 Example Attributes of SE Capability
   - Figure 3.2-4 Classes of Capability Attributes
   - Figure 3.2-5 Example Generic Attributes
   - Figure 3.2-6 Example Vertical Theme Attributes
   - Figure 4.0-1 SECAM Improvement Summary

8. General KFA Changes:
   - New common questions to complete alignment with ISO SPICE
   - Used common wording of similar questions among KFAs
   - Process & Policy questions moved from Performed to Managed Level
   - Goals are now referred to as General Characteristics
9. KFA 1.2 - Tracking & Oversight changes:
   - Updated introductory text
   - Goal 4: "management plan" changed to "plan(s)"

10. KFA 1.3 - Subcontract Management changes:
   - Minor update to introductory text


11. KFA 1.4 - Intergroup Coordination changes:
   - Updated introductory text
12. KFA 1.6 - Quality Assurance changes:
   - KFA renamed to Quality Management (terms updated throughout)
   - Updated introductory text
13. KFA 1.7 - Risk Management changes:
   - Minor update to introductory text
14. KFA 1.8 - Data Management (new in this version)
15. KFA 2.1 - Process Management & Improvement changes:
   - Minor update to introductory text
16. KFA 2.2 - Training changes:
   - KFA renamed to Competency Development (terms updated throughout)
   - Updated introductory text
   - Characteristic 5 was reworded
17. KFA 2.3 - Technology Management changes:
   - Minor update to introductory text
   - Characteristic 3 is new (old 3 & 4 become 4 & 5)
18. KFA 3.1 - System Concept Definition changes:
   - Updated introductory text
   - Minor rewording of Characteristics 2, 4, & 5

19. Changes to Glossary:
   - added explanation of "reference tags"
   - updated all reference tags
   - changed "assessment process" to "assessment method"
   - added definition for "capability level"
   - added definition for "data"
   - added definition for "data management"
   - added definition for "drawing tree"
   - added definition for "effectiveness"
   - changed "probe question" to "exploratory questions"
   - added definition for "function"
   - deleted definition for "functional areas"
   - deleted definition for "functional area representatives"
   - changed "interim model questionnaire" to "questionnaire"
   - added definition for "measurement"
   - added definition for "plan"
   - updated the definition for "practitioner"
   - added definition for "process maturity matrix template"
   - deleted "raw response forms"
   - added definition for "requirements traceability"
   - deleted "scoring templates"
   - added definition for "scoring profile template"
   - added definition for "specification element"
   - added definition for "specification tree"
   - added definition for "structure tree"
   - added definition for "system breakdown structure"
   - added definition for "systems engineering"
   - added definition for "trade-off analysis"


- changed "trade-off study" to "trade study" - added definition for "user" - added definition for "value"

20. Added Appendix C to map to EIA 632, IEEE 1220-1994, and ISO 9001.
21. Notations were added to questions to identify generic questions applied to each KFA.
22. Notations were added to identify "themes" within each KFA.

APPENDIX B TRACEABILITY MATRICES

Note: At the time of publication of Version 1.50 of the SECAM, updated versions of these tables were not available. Updated copies of these tables will be provided in a future minor revision of this document (e.g., Version 1.51). If updated copies of these tables are needed prior to this release, please use the contact information provided in Section 1 of this document. The tables included provide revision history through Version 1.41 (Preliminary) of the INCOSE SECAM.

The following tables provide a mapping between the questions and general characteristics identified for each key focus area. Each table provides a mapping for a key focus area. This work was originally contributed by Leroy Botten of Computer Sciences Corporation.

Each table consists of the following fields:
• V1.v  First two fields. Each identifies the numbering used for the question being considered (where v = 3 or 4). If a question was deleted and removed from the model in Version 1.4, the V1.4 field is left blank.
• Char n  Five columns which correlate to general characteristics in each KFA. Note that not all columns are used for each KFA. Also note that from V1.3 to V1.4 some characteristics were moved (refer to the V1.4 revision history for further information).

• Comments Information regarding changes that have been made.

In general, comments are of the form "#.##: [change]", where #.## denotes the version number in which the change occurred and [change] is one of the following phrases:

"New Question" - the question was added in this version.
"Moved to x.x.x" - the question was moved from this location to a new section and assigned number x.x.x.
"Moved from y.y.y" - the question was moved to this location from section y.y.y.
"Modified" - the wording of the question has been improved; the intent of the question was not altered sufficiently to warrant a new number.

Other information may also appear in the comment field, as appropriate.

Shading is used within columns to denote goal fields which are not applicable to a particular KFA. Shading is also used within rows to denote obsolete entries which have either been deleted from the KFA or moved to a more appropriate section.
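For readers who want to process these matrices mechanically, the comment grammar described above can be captured in a short sketch (illustrative only; the regular expression and function name are assumptions, and rows are presumed to have been extracted as plain strings):

    import re

    # Matches comment entries of the form "#.##: [change]", e.g.
    # "1.40: Moved to 1.1-2.9" or "1.41: New Question".
    COMMENT = re.compile(
        r"(?P<version>\d\.\d\d):\s*"
        r"(?P<change>New Question|Moved to [\d./\-]+|Moved from [\d./\-]+|"
        r"Modified|Deleted.*?(?=\d\.\d\d:|$))")

    def parse_comments(field):
        """Return (version, change) pairs found in one comment field."""
        return [(m.group("version"), m.group("change").strip())
                for m in COMMENT.finditer(field)]

    print(parse_comments("1.40: Modified 1.41: Deleted (same as 1.1-3.1)"))
    # [('1.40', 'Modified'), ('1.41', 'Deleted (same as 1.1-3.1)')]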


KFA 1.1 - Planning
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

1.1-1.1 N N N N/A N/A 1.40: New Question
1.1.1 N/A N/A N/A N/A N/A 1.40: Moved to 1.1-2.9
1.1.2 1.1-1.2 N Y Y N/A N/A 1.40: Modified
1.1.3 1.1-1.3 Y N N N/A N/A 1.40: Modified
1.1-1.4 Y Y Y N/A N/A 1.40: New Question
1.1-1.5 Y Y Y N/A N/A 1.40: New Question
1.1.4 1.1-1.6 N Y Y N/A N/A 1.40: Modified
1.1.5 N/A N/A N/A N/A N/A 1.40: Moved to 1.1-2.10
1.1-2.1 N N Y N/A N/A 1.40: New Question
1.2.1 1.1-2.2 Y Y Y N/A N/A
1.2.2 1.1-2.3 Y Y Y N/A N/A
1.2.3 1.1-2.4 N Y Y N/A N/A 1.40: Modified
1.2.4 1.1-2.5 N Y Y N/A N/A 1.40: Modified
1.1-2.6 N Y N N/A N/A 1.40: New Question
1.2.5 1.1-2.7 Y Y Y N/A N/A
1.1-2.8 Y Y Y N/A N/A 1.40: Moved from 1.3.7
1.1-2.9 Y Y Y N/A N/A 1.40: New Question
1.1-2.10 N N Y N/A N/A 1.40: New Question
1.1-2.11 Y N N N/A N/A 1.40: New Question
1.1-2.12 N N Y N/A N/A 1.40: New Question
1.1.1 1.1-2.13 N N Y N/A N/A 1.40: Moved from 1.1.1
1.1.5 1.1-2.14 Y N Y N/A N/A 1.40: Moved from 1.1.5
1.1-2.15 N Y N N/A N/A 1.40: New Question
1.1-2.16 N N Y N/A N/A 1.41: New Question
1.1-3.1 Y Y N N/A N/A 1.40: New Question
1.3.1 1.1-3.2 N Y Y N/A N/A 1.40: Modified
1.3.2 N/A N/A N/A N/A N/A 1.40: Deleted
1.3.3 1.1-3.3 N Y Y N/A N/A 1.40: Modified
1.3.4 1.1-3.4 N Y Y N/A N/A 1.40: Modified
1.3.5 1.1-3.5 N Y Y N/A N/A 1.40: Modified
1.3.6 1.1-3.6 N/A N/A N/A N/A N/A 1.40: Modified; 1.41: Deleted (same as 1.1-3.1)
1.3.7 N/A N/A N/A N/A N/A 1.40: Moved to 1.1-2.6
1.1-3.7 Y Y Y N/A N/A 1.40: New Question
1.3.8 1.1-3.8 N Y Y N/A N/A
1.3.9 1.1-3.9 N N Y N/A N/A 1.40: Modified
1.3.10 1.1-3.10 N Y Y N/A N/A 1.40: Modified
1.3.11 1.1-3.11 N Y Y N/A N/A 1.40: Modified
1.3.12 1.1-3.12 N Y Y N/A N/A 1.40: Modified
1.3.13 1.1-3.13 N Y Y N/A N/A
1.3.14 1.1-3.14 Y Y Y N/A N/A 1.40: Modified
1.3.15 1.1-3.15 Y Y Y N/A N/A



1.3.16 1.1-3.16 N Y Y N/A N/A 1.40: Modified
1.3.17 1.1-3.17 N Y Y N/A N/A 1.40: Modified
1.3.18 1.1-3.18 N N Y N/A N/A 1.40: Modified
1.3.19 1.1-3.19 N Y Y N/A N/A
1.3.20 1.1-3.20 Y N N N/A N/A 1.40: Modified
1.3.21 1.1-3.21 N Y Y N/A N/A
1.1-3.22 Y N N N/A N/A 1.40: New Question
1.1-3.23 Y N N N/A N/A 1.40: New Question
1.1-3.24 N N Y N/A N/A 1.40: New Question
1.1-3.25 Y Y Y N/A N/A 1.41: New Question
1.4.1 1.1-4.1 N Y Y N/A N/A
1.4.2 1.1-4.2 N Y Y N/A N/A
1.1-4.3 Y N Y N/A N/A 1.40: New Question
1.1-4.4 Y Y Y N/A N/A 1.40: New Question
1.1-4.5 N Y Y N/A N/A 1.40: New Question
1.1-4.6 N Y Y N/A N/A 1.40: New Question
1.1-4.7 Y N Y N/A N/A 1.40: New Question
1.1-4.8 Y N Y N/A N/A 1.40: New Question
1.5.5 1.1-4.9 N Y N N/A N/A
1.1-4.10 Y N Y N/A N/A 1.40: New Question
1.5.1 1.1-5.1 N Y Y N/A N/A 1.40: Modified
1.5.2 1.1-5.2 N Y Y N/A N/A 1.40: Modified
1.5.3 N/A N/A N/A N/A N/A 1.40: Deleted
1.5.4 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-3.9
1.5.5 N/A N/A N/A N/A N/A 1.40: Moved to 1.1-3.7
1.5.6 1.1-5.3 N Y Y N/A N/A 1.40: Modified
1.5.7 N/A N/A N/A N/A N/A 1.40: Deleted (Same as 1.1-5.2)
1.5.8 N/A N/A N/A N/A N/A 1.40: Deleted
1.5.9 N/A N/A N/A N/A N/A 1.40: Deleted (Same as 1.5.4)
1.5.10 N/A N/A N/A N/A N/A 1.40: Deleted (Same as 1.5.5)
1.5.11 N/A N/A N/A N/A N/A 1.40: Deleted (Same as 1.6-3.7)
1.1-5.4 Y Y Y N/A N/A 1.40: New Question
1.1-5.5 N Y Y N/A N/A 1.40: New Question
1.1-5.6 N N Y N/A N/A 1.40: New Question
1.1-5.7 N Y Y N/A N/A 1.40: New Question

KFA 1.2 - Tracking and Oversight


V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments
2.1.1 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-2.1
1.2-1.1 N N N N N/A 1.40: New Question
2.1.2 1.2-1.2 Y Y Y Y N/A 1.40: Modified
1.2-1.3 Y Y N N N/A 1.40: New Question
1.2-1.4 Y Y N Y N/A 1.40: New Question
1.2-1.5 Y Y N Y N/A 1.40: New Question
1.2-1.6 N Y Y N N/A 1.40: New Question
1.2-1.7 N N N Y N/A 1.40: New Question
1.2-2.1 Y Y Y Y N/A 1.40: Moved from 2.1.1
2.2.1 N/A N/A N/A N/A N/A
2.2.2 1.2-2.2 Y Y Y Y N/A 1.40: Modified
2.2.3 1.2-2.3 N Y Y Y N/A 1.40: Modified
2.2.4 1.2-2.4 Y Y Y Y N/A 1.40: Modified
2.2.5 1.2-2.5 Y Y Y Y N/A 1.40: Modified
2.2.6 N/A N/A N/A N/A N/A 1.40: Deleted
1.2-2.6 N Y Y Y N/A 1.40: New Question
1.2-2.7 N Y Y Y N/A 1.40: New Question
1.2-2.8 N Y Y N N/A 1.40: New Question
1.2-2.9 Y Y N Y N/A 1.40: New Question
1.2-2.10 Y Y Y N N/A 1.40: New Question
2.3.1 1.2-3.1 Y Y Y Y N/A 1.40: Modified
2.3.2 1.2-3.2 Y Y Y Y N/A 1.40: Modified
2.3.3 1.2-3.3 Y Y Y Y N/A 1.40: Modified
2.3.4 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-4.4
2.3.5 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-4.5
2.3.6 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-4.6
2.3.7 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-4.7
2.3.8 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-4.8
1.5.4/.9 1.2-3.4 Y Y N Y N/A 1.40: Moved from 1.5.4/.9; Modified
2.3.9 1.2-3.5 Y Y Y N N/A 1.40: Modified
2.3.10 1.2-3.6 N N Y Y N/A 1.40: Modified
2.3.11 1.2-3.7 Y Y Y Y N/A 1.40: Modified
2.3.12 1.2-3.8 Y Y Y Y N/A
2.3.13 1.2-3.9 Y N Y Y N/A 1.40: Modified
2.3.14 N/A N/A N/A N/A N/A 1.40: Deleted
1.2-3.10 Y Y Y N N/A 1.40: New Question
1.2-3.11 Y Y Y N N/A 1.40: New Question
1.2-3.12 Y Y Y N N/A 1.40: New Question
1.2-3.13 Y Y Y Y N/A 1.40: New Question
1.2-3.14 N Y N Y N/A 1.40: New Question
1.2-3.15 Y Y Y N N/A 1.40: New Question
2.3.15 1.2-3.16 Y Y Y Y N/A
2.3.16 1.2-3.17 Y Y Y Y N/A
2.3.17 1.2-3.18 Y N Y Y N/A 1.40: Modified


2.3.18 N/A N/A N/A N/A N/A 1.40: Moved to 1.2-4.9
2.3.19 1.2-3.19 Y Y Y Y N/A 1.40: Modified
2.3.20 1.2-3.20 N Y Y Y N/A
2.3.21 1.2-3.21 Y Y Y Y N/A 1.40: Modified
2.3.22 N/A N/A N/A N/A N/A 1.40: Deleted (Same as 2.5.1)
2.3.23 1.2-3.22 N N Y Y N/A 1.40: Modified
1.2-3.23 N Y N N N/A 1.40: New Question
1.2-3.24 Y Y N N N/A 1.40: New Question
1.2-3.25 Y Y N N N/A 1.40: New Question
1.2-3.26 Y N N N N/A 1.41: New Question
2.4.1 1.2-4.1 Y Y Y Y N/A
2.4.2 1.2-4.2 Y Y Y Y N/A
2.4.3 N/A N/A N/A N/A N/A 1.40: Deleted
2.4.4 1.2-4.3 Y Y Y Y N/A 1.40: Moved from 2.3.4
2.4.5 1.2-4.4 Y Y Y Y N/A 1.40: Moved from 2.3.5
2.4.6 1.2-4.5 Y Y Y Y N/A 1.40: Moved from 2.3.6
2.4.7 1.2-4.6 Y Y Y Y N/A 1.40: Moved from 2.3.7
2.4.8 1.2-4.7 Y Y Y Y N/A 1.40: Moved from 2.3.8
1.2-4.8 Y Y N Y N/A 1.40: New Question
2.3.18 1.2-4.9 Y Y Y Y N/A 1.40: Moved from 2.3.18
2.5.1 1.2-5.1 Y Y Y Y N/A 1.40: Modified
1.2-5.2 N Y Y Y N/A 1.40: New Question
2.5.2 1.2-5.3 Y Y Y Y N/A 1.40: Modified
2.5.3 1.2-5.4 Y Y Y Y N/A 1.40: Modified
2.5.4 1.2-5.5 Y Y Y Y N/A 1.40: Modified

KFA 1.3 - Subcontract Management
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

1.3-1.1 N N N Y N/A 1.40: New Question
3.1.1 1.3-1.2 N Y Y Y N/A 1.40: Modified
3.1.2 N/A N/A N/A N/A N/A 1.40: Moved to 1.3-2.1
1.3-1.3 Y Y Y N N/A 1.40: New Question
3.1.2 1.3-2.1 N Y Y Y N/A 1.40: Modified
3.2.1 1.3-2.2 N N Y Y N/A 1.40: Modified
3.2.2 1.3-2.3 Y N Y N N/A 1.40: Modified
3.2.3 1.3-2.4 N Y Y N N/A 1.40: Modified
1.3-2.5 Y Y Y Y N/A 1.40: New Question
1.3-2.6 N Y Y Y N/A 1.41: New Question
3.3.1 1.3-3.1 Y Y N N N/A 1.40: Modified
3.3.2 1.3-3.2 N N Y N N/A 1.40: Modified
3.3.3 1.3-3.3 Y N N Y N/A
3.3.14 1.3-3.4 Y N N Y N/A 1.40: Modified
1.3-3.5 Y N N Y N/A 1.40: New Question
3.3.4 N/A N/A N/A N/A N/A 1.40: Deleted
3.3.5 1.3-3.6 Y Y N Y N/A



3.3.6 1.3-3.7 Y Y N Y N/A 1.40: Modified
3.3.7 1.3-3.8 N N Y Y N/A
3.3.8 1.3-3.9 N N N Y N/A 1.40: Modified
3.3.9 1.3-3.10 Y N Y Y N/A 1.40: Modified
3.3.10 1.3-3.11 N N N Y N/A 1.40: Modified
3.3.11 1.3-3.12 N N N Y N/A 1.40: Modified
3.3.12 1.3-3.13 N N N Y N/A 1.40: Modified
3.3.13 N/A N/A N/A N/A N/A 1.40: Moved to 3.4.3
3.3.15 1.3-3.14 Y Y Y Y N/A 1.40: Modified
3.3.16 N/A N/A N/A N/A N/A 1.40: Moved to 3.4.2
3.3.17 1.3-3.15 N N Y Y N/A 1.40: Modified
3.3.18 1.3-3.16 Y N Y Y N/A
3.3.19 N/A N/A N/A N/A N/A 1.40: Deleted (Same as 3.3.10)
3.3.20 1.3-3.17 N N N Y N/A 1.40: Modified
1.3-3.18 Y N N Y N/A 1.40: New Question
1.3-3.19 Y N N N N/A 1.41: New Question
3.3.21 N/A N/A N/A N/A N/A Moved to 1.3-3.14
3.4.1 1.3-4.1 N N N Y N/A 1.40: Modified
3.4.2 1.3-4.2 N N N Y N/A 1.20: Modified
3.4.3 1.3-4.3 Y N N Y N/A 1.20: Modified; 1.40: Modified
3.5.1 1.3-5.1 N N N Y N/A 1.40: Modified
3.5.2 1.3-5.2 Y N N Y N/A 1.40: Modified
3.5.3 1.3-5.3 N N N Y N/A 1.40: Modified
1.3-5.4 Y Y Y Y N/A 1.40: New Question

KFA 1.4 - Intergroup Coordination
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

1.4-1.1 N N Y N N/A 1.40: New Question
4.1.1 N/A N/A N/A N/A N/A 1.20: Deleted
4.1.2 N/A N/A N/A N/A N/A 1.20: Moved to 4.2.2
4.1.3 N/A N/A N/A N/A N/A 1.20: Moved to 4.2.4
4.1.6 N/A N/A N/A N/A N/A 1.20: New Questions; 1.40: Deleted
4.1.7 1.4-1.2 N Y Y Y N/A 1.20: New Question
1.4-1.3 Y Y Y Y N/A 1.40: New Question
4.1.4 1.4-1.4 Y Y Y Y N/A 1.20: Modified; 1.40: Modified
4.1.5 1.4-1.5 Y N N N N/A 1.20: New Questions; 1.40: Modified
1.4-1.6 N N Y Y N/A 1.40: New Question
4.2.1 1.4-2.1 N Y Y Y N/A 1.40: Modified



4.2.2 1.4-2.2 N Y Y Y N/A 1.20: Moved from 4.1.2; 1.40: Modified
4.2.3 1.4-2.3 N Y Y Y N/A 1.20: New Question
1.4-2.4 N Y Y N N/A 1.40: New Question
4.2.4 1.4-2.5 N Y Y Y N/A 1.20: Moved from 4.1.3
4.2.5 1.4-2.6 Y Y Y Y N/A 1.20: New Question
1.4-2.7 N N Y N N/A 1.40: New Question
1.4-2.8 N N N Y N/A 1.40: New Question
1.4-2.9 N N N Y N/A 1.40: New Question
1.4-2.10 N N Y Y N/A 1.40: New Question
1.4-2.11 N Y Y Y N/A 1.41: New Question
1.4-3.1 Y N N N N/A 1.40: New Question
4.3.1 1.4-3.2 Y Y N N N/A 1.40: Modified
4.3.2 1.4-3.3 N Y Y Y N/A
4.3.3 1.4-3.4 Y Y Y Y N/A 1.40: Modified
4.3.4 1.4-3.5 N Y Y Y N/A
4.3.5 1.4-3.6 N Y Y Y N/A
4.3.6 1.4-3.7 N Y Y Y N/A
4.3.7 1.4-3.8 N Y Y Y N/A 1.40: Modified
1.4-3.9 Y N Y Y N/A 1.40: New Question
1.4-3.10 Y Y Y Y N/A 1.40: New Question
4.4.1 1.4-3.11 Y Y Y N N/A 1.20: New Question; 1.40: Moved from 4.4.1, Modified
1.4-3.12 N Y N Y N/A 1.41: New Question
4.4.1 N/A N/A N/A N/A N/A 1.20: New Questions; 1.40: Moved to 1.4-3.13, Modified
4.4.2 1.4-4.1 N Y Y N N/A 1.20: New Question; 1.40: Modified
4.4.3 1.4-4.2 Y Y Y N N/A 1.20: New Question; 1.40: Modified
4.5.1 1.4-5.1 Y Y Y Y N/A 1.20: New Question; 1.40: Modified
4.5.2 1.4-5.2 N Y Y Y N/A 1.20: New Question
4.5.3 1.4-5.3 Y Y Y Y N/A 1.20: New Question
1.4-5.4 Y Y Y Y N/A 1.40: New Question

KFA 1.5 - Configuration Management
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

1.5-1.1 N Y N N N/A 1.40: New Question
5.1.1 1.5-1.2 Y Y Y Y N/A
5.1.2 1.5-1.3 N Y Y Y N/A 1.40: Modified


1.5-1.4 Y N N N N/A 1.40: New Question
5.2.1 1.5-2.1 Y Y Y N N/A 1.40: Modified
5.2.2 1.5-2.2 N Y Y Y N/A 1.40: Modified
5.2.3 1.5-2.3 Y Y Y Y N/A
5.2.4 1.5-2.4 Y Y Y N N/A 1.40: Modified
5.2.5 1.5-2.5 Y Y Y N N/A 1.40: Modified
1.5-2.6 N Y N Y N/A 1.40: New Question
1.5-2.7 N N Y Y N/A 1.41: New Question
5.3.1 1.5-3.1 Y Y Y N N/A 1.40: Modified; 1.41: Modified
5.3.2 1.5-3.2 N Y Y Y N/A
5.3.3 1.5-3.3 Y Y Y Y N/A
5.3.4 1.5-3.4 Y Y Y N N/A
5.3.5 1.5-3.5 Y Y Y Y N/A 1.40: Modified
5.3.6 1.5-3.6 Y Y Y N N/A 1.40: Modified
5.3.7 1.5-3.7 Y Y Y N N/A 1.40: Modified
5.3.8 1.5-3.8 Y Y Y Y N/A 1.40: Modified
5.3.9 1.5-3.9 N Y Y Y N/A
5.3.10 1.5-3.10 Y Y Y Y N/A 1.40: Modified
5.3.11 1.5-3.11 Y Y Y Y N/A 1.40: Modified
5.3.12 1.5-3.12 Y Y Y Y N/A
5.3.13 1.5-3.13 Y Y Y Y N/A 1.40: Modified
5.3.14 1.5-3.14 Y Y Y Y N/A
5.3.15 1.5-3.15 Y Y Y N N/A 1.40: Modified
5.3.16 1.5-3.16 Y Y Y Y N/A 1.40: Modified
5.3.17 1.5-3.17 Y Y Y Y N/A 1.40: Modified; 1.41: Modified
5.3.18 1.5-3.18 N Y Y Y N/A
1.5-3.19 N N Y Y N/A 1.40: New Question
1.5-3.20 Y Y Y Y N/A 1.41: New Question
5.4.1 1.5-4.1 N Y Y N N/A 1.40: Modified
1.5-4.2 Y Y Y N N/A 1.40: New Question
5.5.1 1.5-5.1 Y Y Y Y N/A 1.40: Modified
5.5.2 1.5-5.2 Y Y Y Y N/A 1.40: Modified
5.5.3 1.5-5.3 N Y Y Y N/A 1.40: Modified
5.5.4 1.5-5.4 N Y Y Y N/A 1.40: Modified
1.5-5.5 N Y N Y N/A 1.40: New Question
1.5-5.6 Y N Y N N/A 1.40: New Question

KFA 1.6 - Quality Assurance
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

1.6-1.1 N N Y N/A N/A 1.40: New Question
6.1.1 1.6-1.2 Y Y Y N/A N/A 1.40: Modified
6.1.2 1.6-1.3 Y Y Y N/A N/A 1.40: Modified
6.1.3 1.6-1.4 Y Y Y N/A N/A 1.40: New Question
6.1.4 1.6-1.5 Y Y Y N/A N/A



1.6-1.6 Y N N N/A N/A
6.2.1 1.6-2.1 N Y Y N/A N/A 1.40: Modified
6.2.2 1.6-2.2 N Y Y N/A N/A 1.40: Modified
1.6-2.3 N Y Y N/A N/A 1.40: New Question
1.6-2.4 N Y Y N/A N/A 1.41: New Question
1.6-3.1 Y N N N/A N/A 1.40: New Question
6.3.1 1.6-3.2 N Y N N/A N/A 1.40: Modified
6.3.2 N/A N/A N/A N/A N/A 1.10: Deleted
1.6-3.3 Y N N N/A N/A 1.40: New Question
1.6-3.4 N Y N N/A N/A 1.40: New Question
6.3.3 N/A N/A N/A N/A N/A 1.40: Moved to 1.6-5.2
1.6-3.5 N Y N N/A N/A 1.40: New Question
6.3.4 1.6-3.6 N Y Y N/A N/A 1.40: Modified
6.3.5 1.6-3.7 N Y Y N/A N/A 1.40: Modified
6.3.6 1.6-3.8 N Y Y N/A N/A 1.40: Modified
6.3.7 1.6-3.9 N Y Y N/A N/A 1.40: Modified
1.6-3.10 N Y N N/A N/A 1.40: New Question
1.6-3.11 Y Y Y N/A N/A 1.41: New Question
6.4.1 1.6-4.1 Y Y Y N/A N/A 1.40: Modified
6.4.2 1.6-4.2 Y Y Y N/A N/A 1.40: Modified
1.6-4.3 N N Y N/A N/A 1.40: New Question
1.6-4.4 Y N Y N/A N/A 1.40: New Question
1.6-5.1 N Y N N/A N/A 1.40: New Question
6.5.1 1.6-5.2 Y Y Y N/A N/A 1.40: Modified
1.6-5.3 N Y N N/A N/A 1.40: New Question
1.6-5.4 N Y Y N/A N/A 1.40: New Question
1.6-5.5 N N Y N/A N/A 1.40: New Question
1.6-5.6 Y Y Y N/A N/A 1.40: New Question
1.6-5.7 N Y N N/A N/A 1.40: New Question

KFA 1.7 - Risk Management
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

1.7-1.1 N Y N N N 1.40: New Question
1.7-1.2 Y N N N N 1.40: New Question
7.1.1 1.7-1.3 Y Y Y Y Y 1.40: Modified
7.1.2 N/A N/A N/A N/A N/A 1.10: Deleted
7.1.3 N/A N/A N/A N/A N/A 1.10: Deleted
7.1.4 N/A N/A N/A N/A N/A 1.10: Deleted
1.7-1.4 Y N N N N 1.40: New Question
7.2.1 N/A N/A N/A N/A N/A 1.10: Moved to 1.7-2.5
1.7-2.1 Y N N Y N 1.40: New Question
7.2.2 1.7-2.2 Y Y Y N N 1.40: Modified
7.2.3 N/A N/A N/A N/A N/A 1.10: Deleted



1.7-2.3 Y Y N N N 1.40: New Question
1.7-2.4 Y Y N N N 1.40: New Question
1.7-2.5 Y Y Y N N 1.40: New Question
1.7-2.6 N Y Y Y Y 1.40: New Question
7.2.4 1.7-2.7 Y Y Y N N 1.40: Modified
7.2.1 1.7-2.8 Y Y Y N N 1.40: Modified
1.7-2.9 N N N Y Y 1.40: New Question
1.7-2.10 N N N N Y 1.40: New Question
1.7-2.11 N Y Y N N 1.40: New Question
1.7-2.12 N N Y Y N 1.40: New Question
1.7-2.13 N Y N Y N 1.41: New Question
1.7-3.1 Y N N N N 1.40: New Question
7.3.1 1.7-3.2 Y Y Y Y N 1.40: Modified
1.7-3.3 Y Y N N N 1.40: New Question
1.7-3.4 N Y N N N 1.40: New Question
1.7-3.5 N Y N N N 1.40: New Question
1.7-3.6 N Y Y N N 1.40: New Question
1.7-3.7 N Y Y N N 1.40: New Question
1.7-3.8 N Y Y Y N 1.40: New Question
1.7-3.9 N Y Y Y Y 1.40: New Question
1.7-3.10 N Y N N N 1.40: New Question
1.7-3.11 N Y Y Y Y 1.40: New Question
7.3.2 N/A N/A N/A N/A N/A 1.40: Deleted
7.3.3 N/A N/A N/A N/A N/A 1.40: Deleted
1.7-3.12 Y Y Y N Y 1.40: New Question
7.3.5 1.7-3.13 Y Y Y Y Y 1.40: Modified
1.7-3.14 N N Y Y N 1.40: New Question
1.7-3.15 N Y N N N 1.40: New Question
1.7-3.16 Y Y Y N N 1.40: New Question
1.7-3.17 N N N Y Y 1.40: New Question
1.7-3.18 Y Y Y N N 1.40: New Question
7.3.4 1.7-3.19 N N Y Y N 1.40: Modified
1.7-3.20 N N N N Y 1.40: New Question
1.7-3.21 N N Y Y N 1.40: New Question
1.7-3.22 N Y Y Y Y 1.40: New Question
7.4.1 1.7-3.23 Y Y Y Y N 1.40: Moved to 1.7-3.15
1.7-3.24 N N N Y N 1.40: New Question
1.7-3.25 N N N N Y 1.41: New Question
7.4.2 N/A N/A N/A N/A N/A 1.40: Deleted
7.4.3 N/A N/A N/A N/A N/A 1.40: Deleted
7.4.4 N/A N/A N/A N/A N/A 1.40: Deleted
1.7-4.1 N N Y Y Y 1.40: New Question
1.7-4.2 Y Y Y N N 1.40: New Question
1.7-4.3 N Y Y Y N 1.40: New Question
1.7-4.4 N N Y N N 1.40: New Question



7.5.1 N/A N/A N/A N/A N/A 1.40: Deleted
7.5.2 N/A N/A N/A N/A N/A 1.40: Deleted
7.5.3 N/A N/A N/A N/A N/A 1.40: Deleted
7.5.4 N/A N/A N/A N/A N/A 1.40: Deleted
1.7-5.1 N N N Y N 1.40: New Question
1.7-5.2 N N N Y N 1.40: New Question
1.7-5.3 Y N N Y N 1.40: New Question
1.7-5.4 N N N Y N 1.40: New Question
1.7-5.5 N Y Y Y Y 1.40: New Question

KFA 2.1 - Process Management and Improvement
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

2.1-1.1 N N Y N N 1.40: New Question
2.1-1.2 N N N N Y 1.40: New Question
8.1.1 2.1-1.3 N Y Y Y Y
8.1.2 2.1-1.4 N Y Y Y Y
8.1.3 2.1-1.5 N Y Y Y Y 1.40: Modified
8.1.4 2.1-1.6 N Y Y Y Y
2.1-1.7 Y Y Y N N 1.40: New Question; 1.40: Modified
2.1-1.8 Y N N N Y 1.40: New Question
8.2.1 N/A N/A N/A N/A N/A 1.40: Deleted
2.1-2.1 N Y Y Y Y 1.40: New Question
2.1-2.2 N Y Y Y Y 1.40: New Question
2.1-2.3 N Y Y Y Y 1.40: New Question
2.1-2.4 N Y N N N 1.40: New Question
2.1-2.5 N Y Y Y Y
8.2.2 N/A N/A N/A N/A N/A 1.40: Deleted
8.2.3 N/A N/A N/A N/A N/A 1.40: Deleted
8.2.4 N/A N/A N/A N/A N/A 1.40: Deleted
2.1-2.6 1.40: New Question
2.1-2.7 1.40: New Question
2.1-2.8 1.40: New Question
2.1-2.9 1.40: New Question
2.1-2.10 1.40: New Question
2.1-2.11 1.40: New Question
2.1-2.12 1.40: New Question
8.3.1 2.1-3.1 1.40: New Question
8.3.1 2.1-3.2 1.40: New Question
8.3.2 N/A N/A N/A N/A N/A 1.40: Moved to 2.1-4.4
8.3.3 N/A N/A N/A N/A N/A 1.40: Moved to 2.1-4.6
8.3.4 N/A N/A N/A N/A N/A 1.40: Moved to 2.1-4.5
8.3.5 2.1-3.3 N Y N N N 1.40: Modified



8.3.6 N/A N/A N/A N/A N/A 1.40: Deleted (Same as 8.4.2)
8.3.7 N/A N/A N/A N/A N/A 1.40: Deleted
8.3.8 2.1-3.4 N Y N Y Y 1.40: Modified
8.3.9 2.1-3.5 N Y Y Y Y
8.3.10 2.1-3.6 N Y Y N N 1.40: Modified
2.1-3.7 N Y Y N N 1.40: New Question
8.3.11 2.1-3.8 N Y Y N Y 1.40: Modified
2.1-3.9 N Y N N N 1.40: New Question
8.3.12 N/A N/A N/A N/A N/A 1.40: Deleted
8.4.1 2.1-4.1 N Y Y N Y 1.40: Modified
8.4.2 2.1-4.2 N Y Y Y Y 1.40: Modified
8.4.3 2.1-4.3 N Y Y N Y 1.40: Modified
2.1-4.4 N Y Y Y Y 1.40: Moved from 8.3.2
2.1-4.5 N N Y Y Y 1.40: Moved from 8.3.4
2.1-4.6 N Y N Y Y 1.40: Moved from 8.3.2
2.1-4.7 N Y Y N N 1.40: New Question
2.1-4.8 N N N N Y 1.40: New Question
2.1-5.1 N N N N Y 1.40: New Question
2.1-5.2 N N Y N Y 1.40: New Question
2.1-5.3 N N N N Y 1.40: New Question
2.1-5.4 N N N N Y 1.40: New Question
8.5.1 N/A N/A N/A N/A N/A 1.40: Deleted
8.5.2 2.1-5.5 N Y Y Y Y

KFA 2.2 - Training
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

2.2-1.1 N Y N N N 1.40: New Question
9.1.1 2.2-1.2 Y N N N N
2.2-1.3 Y N N N N 1.40: New Question
2.2-2.1 Y N N N N 1.40: New Question
2.2-2.2 Y Y N N N 1.40: New Question
2.2-2.3 Y N Y N N 1.40: New Question
2.2-2.4 Y Y N N N 1.40: New Question
2.2-2.5 Y Y Y Y N 1.40: New Question
2.2-2.6 Y Y Y Y N 1.40: New Question
2.2-2.7 Y Y N N N 1.40: New Question
2.2-2.8 Y Y Y Y N 1.40: New Question
2.2-2.9 Y Y Y Y N 1.40: New Question
2.2-2.10 Y Y N N N 1.40: New Question
9.2.1 2.2-2.11 Y Y N N Y 1.30: New Question; 1.40: Modified
9.2.2 2.2-2.12 Y Y Y Y N 1.40: New Question



2.2-2.13 Y N N N N 1.40: New Question
2.2-2.14 N N N Y N 1.40: New Question
2.2-2.15 N Y Y Y N 1.40: New Question
2.2-2.16 Y Y Y Y Y 1.40: New Question
2.2-3.1 Y N N N N 1.40: New Question
2.2-3.2 Y Y Y Y N 1.40: New Question
2.2-3.3 N N N Y N 1.40: New Question
9.3.1 2.2-3.4 N N Y N N 1.40: Modified
9.3.2 2.2-3.5 Y Y Y N N
9.3.3 2.2-3.6 N Y Y N N 1.40: Modified
2.2-3.7 Y Y Y Y N 1.40: New Question
9.3.4 N/A N/A N/A N/A N/A 1.40: Deleted
9.3.5 2.2-3.8 N Y N N N
9.3.6 N/A N/A N/A N/A N/A 1.10: Deleted
9.3.7 2.2-3.9 N Y N N N
9.3.8 2.2-3.10 N Y Y Y N 1.40: Modified
2.2-3.11 Y N N N N 1.40: New Question
2.2-3.12 Y Y N N N 1.40: New Question
2.2-3.13 N N N Y N 1.40: New Question
2.2-3.14 N N N Y N 1.40: New Question
2.2-3.15 N N N N Y 1.40: New Question
2.2-3.16 N Y N N N 1.41: New Question
9.4.1 2.2-4.1 Y N N N N 1.40: Modified
9.4.2 2.2-4.2 N N N N Y
2.2-4.3 N N N Y Y 1.40: New Question
2.2-4.4 N N Y N Y 1.40: New Question
2.2-4.5 N N N N Y 1.40: New Question
9.5.1 N/A N/A N/A N/A N/A 1.10: Deleted
2.2-5.1 Y N N N Y 1.40: New Question
2.2-5.2 Y N Y N Y 1.40: New Question
2.2-5.3 Y N N N Y 1.40: New Question
2.2-5.4 N N N N Y 1.40: New Question
9.5.2 2.2-5.5 Y N N N Y
9.5.3 2.2-5.6 Y Y Y N Y

KFA 2.3 - Technology Management
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

2.3-1.1 N Y N N N/A 1.40: New Question
10.1.1 2.3-1.2 N Y N N N/A
10.1.2 2.3-1.3 N Y N N N/A
2.3-2.1 N Y N N N/A 1.40: New Question
9.5.1 N/A N/A N/A N/A N/A 1.10: Deleted
2.3-2.2 N Y N N N/A 1.40: New Question



10.2.2 2.3-2.3 N Y N N N/A 1.40: Modified
10.2.3 2.3-2.4 N Y N N N/A
2.3-2.5 N Y N N N/A 1.40: New Question
10.2.4 2.3-2.6 N Y Y N N/A
2.3-2.7 Y Y N N N/A 1.40: New Question
2.3-2.8 N N Y N N/A 1.40: New Question
2.3-2.9 N Y Y N N/A 1.40: New Question
2.3-2.10 N Y Y Y N/A 1.41: New Question
2.3-3.1 Y Y N N N/A 1.40: New Question; 1.41: Deleted (Same as 2.3-2.1)
10.2.1 2.3-3.2 Y Y N N N/A 1.40: Moved from 10.2.1
10.4.1 2.3-3.3 Y Y Y N N/A 1.40: Moved from 10.4.1
10.4.2 2.3-3.4 Y Y N N N/A 1.40: Moved from 10.4.2
10.4.3 2.3-3.5 Y Y N N N/A 1.40: Moved from 10.4.3
10.3.1 2.3-3.6 Y Y N Y N/A
10.3.2 2.3-3.7 Y Y Y Y N/A
10.3.3 2.3-3.8 Y Y Y N N/A
10.3.4 2.3-3.9 N Y Y N N/A
10.3.5 2.3-3.10 Y Y Y Y N/A
10.3.6 2.3-3.11 Y Y N Y N/A
10.3.7 2.3-3.12 Y Y Y Y N/A
10.3.8 2.3-3.13 Y Y Y Y N/A
10.3.9 2.3-3.14 Y Y N N N/A
2.3-3.15 N Y N N/A 1.40: New Question
2.3-3.16 Y N N Y N/A 1.41: New Question
2.3-4.1 N N Y Y N/A 1.40: New Question
2.3-4.2 N Y Y Y N/A 1.40: New Question
10.4.1 N/A N/A N/A N/A N/A 1.40: Moved to 2.3-3.3
10.4.2 N/A N/A N/A N/A N/A 1.40: Moved to 2.3-3.4
10.4.3 N/A N/A N/A N/A N/A 1.40: Moved to 2.3-3.5
10.5.1 N/A N/A N/A N/A N/A 1.40: Deleted
2.3-5.1 Y N Y Y N/A 1.40: New Question
2.3-5.2 Y Y Y Y N/A 1.40: New Question
2.3-5.3 Y Y N N N/A 1.40: New Question
2.3-5.4 Y Y N Y N/A 1.40: New Question
2.3-5.5 Y N N Y N/A 1.40: New Question

KFA 2.4 - Environment and Tool Support
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

2.4-1.1 Y N N N N/A 1.40: New Question
11.1.1 2.4-1.2 N Y Y N N/A 1.20: Both items reanalyzed
11.1.2 2.4-1.3 Y Y Y N N/A 1.20: New Question



11.1.3 2.4-1.4 Y Y Y Y N/A 1.20: New Question
11.2.1 N/A N/A N/A N/A N/A 1.20: Moved to 11.3.19 (2.4-3.19)
11.2.2 N/A N/A N/A N/A N/A
2.4-1.5 Y N N N N/A 1.40: New Question
11.2.3 2.4-2.1 Y Y N N N/A 1.20: New Question
11.2.4 2.4-2.2 Y Y Y N N/A 1.20: New Question; 1.40: Modified
11.2.5 2.4-2.3 N Y Y N N/A 1.20: New Question
11.2.6 N/A N/A N/A N/A N/A 1.20: New Question; 1.40: Deleted
2.4-2.4 Y Y N N N/A 1.40: New Question
2.4-2.5 Y Y N N N/A 1.40: New Question
2.4-2.6 N Y N N N/A 1.40: New Question
2.4-2.7 N Y N N N/A 1.40: New Question
2.4-2.8 N Y Y N N/A 1.40: New Question
2.4-2.9 N Y N N N/A 1.40: New Question
2.4-2.10 Y Y N N N/A 1.40: New Question
2.4-2.11 N Y N N N/A 1.41: New Question
11.3.1 N/A N/A N/A N/A N/A 1.20: Moved to 11.4.1 (2.4-4.1)
2.4-3.1 Y Y N N N/A 1.40: New Question
11.4.1 2.4-3.2 Y Y Y Y N/A 1.20: Moved from 11.3.1; 1.40: Moved from 11.4.1
2.4-3.3 Y Y Y Y N/A 1.40: New Question
11.3.2 2.4-3.4 Y Y Y Y N/A 1.20: Both items reanalyzed; 1.40: Modified
11.3.3 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.4 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.5 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.6 N/A N/A N/A N/A N/A 1.20: Deleted
2.4-3.5 Y Y N Y N/A 1.40: New Question
2.4-3.6 Y Y N Y N/A 1.40: New Question
2.4-3.7 Y Y N Y N/A 1.40: New Question
11.3.7 2.4-3.8 N Y Y N N/A 1.20: Both items reanalyzed
11.3.8 N/A N/A N/A N/A N/A 1.20: Goal 1 change implies side bar; 1.40: Deleted
11.3.9 2.4-3.9 Y Y Y Y N/A 1.20: Goal 1 change implies side bar
11.3.10 2.4-3.10 N Y N N N/A 1.20: Goal 1 change implies side bar
11.3.11 2.4-3.11 Y Y Y Y N/A 1.20: Goal 1 change implies side bar
11.3.12 2.4-3.12 N Y N N N/A 1.20: Both items reanalyzed



11.3.8 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.8 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.8 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.8 N/A N/A N/A N/A N/A 1.20: Goal 1 change implies side bar; 1.40: Deleted
11.3.8 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.8 N/A N/A N/A N/A N/A 1.20: Deleted
11.3.8 N/A N/A N/A N/A N/A 1.20: Both items reanalyzed; 1.40: Deleted
2.4-3.13 N Y N N N/A 1.40: New Question
2.4-3.14 Y N N N N/A 1.40: New Question
2.4-3.15 N Y N N N/A 1.41: New Question
11.4.1 N/A N/A N/A N/A N/A 1.20: Moved from 11.3.1; 1.40: Moved to 2.4-4.1
2.4-4.1 Y N N N N/A 1.40: New Question
2.4-4.2 Y Y N N N/A 1.40: New Question
2.4-4.3 Y N N N N/A 1.40: New Question
2.4-4.4 Y N N Y N/A 1.40: New Question
2.4-4.5 N N N Y N/A 1.40: New Question
2.4-4.6 Y N N Y N/A 1.40: New Question
2.4-5.1 Y N N Y N/A 1.40: New Question
2.4-5.2 Y Y N Y N/A 1.40: New Question
11.5.1 2.4-5.3 Y Y Y Y N/A 1.40: New Question
11.5.2 2.4-5.4 Y Y Y Y N/A 1.40: New Question
11.5.3 2.4-5.5 Y Y Y Y N/A 1.20: New Question; 1.40: Modified

KFA 3.1 - System Concept Definition
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

3.1-1.1 N Y N N N 1.40: New Question
16.1.1 3.1-1.2 N Y N N N 1.20: New Question
16.1.2 3.1-1.3 N Y Y Y N 1.20: New Question
16.1.3 3.1-1.4 Y Y Y Y N 1.20: New Question
3.1-1.5 Y N N N N 1.40: New Question
3.1-2.1 Y N N N N 1.40: New Question
3.1-2.2 Y N N N N 1.40: New Question
16.2.1 3.1-2.3 N Y Y Y N 1.20: New Question
16.2.2 3.1-2.4 N Y Y Y N 1.20: New Question
3.1-2.5 N N Y N N 1.40: New Question
3.1-2.6 N Y N Y N 1.40: New Question
3.1-2.7 N N N Y Y 1.40: New Question
3.1-2.8 N Y Y N N 1.40: New Question



3.1-2.9 N N N Y N 1.40: New Question
16.2.3 3.1-2.10 N N N N Y 1.20: New Question
3.1-2.11 N Y Y Y Y 1.40: New Question
16.2.5 3.1-2.12 N Y Y Y N 1.20: New Question; 1.40: Modified
16.2.4 3.1-2.13 Y Y Y Y Y 1.20: New Question; 1.40: Modified
16.2.6 16.2.7 3.1-2.14 N Y N N N 1.40: New Question
3.1-2.15 N Y Y Y N 1.40: New Question
3.1-2.16 N N Y N Y 1.40: New Question
3.1-3.1 Y N N N N 1.40: New Question
16.3.1 3.1-3.2 N Y N N N 1.40: New Question
16.3.2 3.1-3.3 N Y Y Y N 1.40: New Question
16.3.3 3.1-3.4 N Y Y N N 1.40: New Question
16.3.4 3.1-3.5 N N N Y Y 1.40: New Question
3.1-3.6 N N Y N N 1.40: New Question
3.1-3.7 N N Y N N 1.40: New Question
16.3.5 3.1-3.8 N Y Y Y Y 1.20: New Question
16.3.6 3.1-3.9 N Y Y Y Y 1.20: New Question
16.3.7 3.1-3.10 N N N Y Y 1.20: New Question
16.3.8 3.1-3.11 Y Y Y N N 1.20: New Question
16.3.9 3.1-3.12 N N Y Y Y 1.20: New Question
16.3.10 3.1-3.13 N N Y Y Y 1.20: New Question
16.3.11 3.1-3.14 N Y Y Y Y 1.20: New Question
3.1-3.15 Y N N N N 1.40: New Question
3.1-3.16 Y Y Y Y Y 1.41: New Question
16.4.1 3.1-4.1 N Y Y N N 1.20: New Question; 1.40: Modified
16.4.2 3.1-4.2 N Y Y Y Y 1.20: New Question
3.1-4.3 Y Y Y Y Y 1.40: New Question
3.1-4.4 Y Y Y Y Y 1.40: New Question
3.1-4.5 Y Y Y Y Y 1.40: New Question
3.1-5.1 Y N N N N 1.40: New Question
3.1-5.2 Y Y Y Y Y 1.40: New Question
16.5.1 3.1-5.3 N Y Y Y Y 1.20: New Question
16.5.2 3.1-5.4 N Y Y Y Y 1.20: New Question
3.1-5.5 Y Y Y Y Y 1.40: New Question
3.1-5.6 Y Y Y Y Y 1.40: New Question
3.1-5.7 Y Y Y Y Y 1.40: New Question

KFA 3.2 - Systems Requirements
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments



3.2-1.1 N Y N N N 1.40: New Question
12.1.1 3.2-1.2 Y Y Y N N 1.40: Modified
3.2-1.3 Y N N N N 1.40: New Question
12.2.1 3.2-2.1 Y N N N N 1.40: Modified
12.2.2 N/A N/A N/A N/A N/A 1.40: Deleted
3.2-2.2 Y N N N N 1.40: New Question
3.2-2.3 N Y N Y N 1.40: New Question
3.2-2.4 N Y Y N N 1.40: New Question
12.2.3 3.2-2.5 Y N N N N 1.40: Modified
3.2-2.6 N Y Y Y N 1.40: New Question
12.2.4 3.2-2.7 Y Y Y Y N 1.40: Modified
3.2-2.8 N Y Y Y N 1.40: New Question
12.2.5 N/A N/A N/A N/A N/A 1.40: Moved to 3.2-3.2
12.2.6 N/A N/A N/A N/A N/A 1.40: Moved to 3.2-3.2
3.2-2.9 N Y N Y N 1.40: New Question
3.2-2.10 N N N Y Y 1.41: New Question
3.2-3.1 Y N N N N 1.40: New Question
12.2.5 3.2-3.2 Y Y Y Y Y 1.40: Moved from 12.2.5; Modified
12.2.6 3.2-3.3 N Y Y Y N 1.40: Moved from 12.2.6; Modified
12.3.1 3.2-3.4 N Y N N Y 1.40: Modified
12.3.2 3.2-3.5 Y Y Y Y Y
12.3.3 N/A N/A N/A N/A N/A 1.40: Deleted
3.2-3.6 N Y N N Y 1.40: New Question
3.2-3.7 N N Y Y Y 1.40: New Question
12.3.4 3.2-3.8 N N Y Y Y 1.40: Modified
3.2-3.9 N N N Y N 1.40: New Question
3.2-3.10 N N N Y N 1.40: New Question
3.2-3.11 N N Y N N 1.40: New Question
3.2-3.12 N Y N N Y 1.40: New Question
3.2-3.13 N Y N N Y 1.40: New Question
3.2-3.14 N Y Y Y N 1.40: New Question
12.3.5 3.2-3.15 N Y Y Y Y 1.40: Modified
12.3.6 3.2-3.16 N Y Y N N 1.40: Modified
12.3.7 3.2-3.17 N Y Y N N
12.3.8 3.2-3.18 N N Y N N 1.40: Modified
12.3.9 3.2-3.19 N Y Y Y N 1.40: Modified
12.3.10 3.2-3.20 N Y N Y Y 1.40: Modified
12.3.11 3.2-3.21 N Y Y Y Y
12.3.12 3.2-3.22 N Y Y Y N 1.40: Modified
12.3.13 3.2-3.23 N Y N Y Y
12.3.14 3.2-3.24 N Y N Y Y 1.40: Modified
12.3.15 3.2-3.25 Y Y Y Y Y 1.40: Modified
12.4.1 3.2-3.26 Y Y Y Y Y



3.2-3.27 Y N N N N 1.41: New Question
12.3.16 N/A N/A N/A N/A N/A 1.40: Deleted
12.4.1 N/A N/A N/A N/A N/A 1.40: Moved to 3.2-3.17
12.4.2 3.2-4.1 Y Y Y Y Y 1.40: Modified
3.2-4.2 Y Y Y Y Y 1.40: New Question
3.2-4.3 Y N N N Y 1.40: New Question
12.5.1 3.2-5.1 Y N N N Y 1.40: Modified
12.5.2 3.2-5.2 Y Y N N Y 1.40: Modified
12.5.3 3.2-5.3 Y Y Y Y Y
3.2-5.4 Y Y Y Y Y 1.40: New Question
3.2-5.5 Y Y Y Y Y 1.40: New Question

KFA 3.3 - Systems Design
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

13.1.1 N/A N/A N/A N/A N/A 1.40: Moved to 13.3.14
3.3-1.1 N Y N N N 1.40: New Question
13.1.2 3.3-1.2 Y Y N Y Y 1.20: Modified
3.3-1.3 Y N N N Y 1.40: New Question
13.1.3 N/A N/A N/A N/A N/A 1.20: Modified; 1.40: Deleted
13.2.1 N/A N/A N/A N/A N/A 1.40: Moved to 13.3.15
3.3-2.1 Y N N N N 1.40: New Question
3.3-2.2 N Y N N N 1.40: New Question
13.2.2 3.3-2.3 N Y N Y Y 1.20: Moved from 13.3.4
3.3-2.4 Y N N N N 1.40: New Question
3.3-2.5 N Y N N N 1.40: New Question
3.3-2.6 N Y N N N 1.40: New Question
13.2.3 3.3-2.7 N Y N Y Y 1.40: New Question
3.3-2.8 N Y N N N 1.40: New Question
3.3-2.9 Y Y N N N 1.40: New Question
13.2.4 3.3-2.10 Y Y N Y Y 1.40: New Question
13.2.5 N/A N/A N/A N/A N/A 1.20: New Question; 1.40: Deleted
3.3-2.11 N Y N Y Y 1.40: New Question
13.2.6 3.3-2.12 N Y N Y Y 1.20: New Question
13.2.7 3.3-2.13 N Y N Y Y 1.20: New Question
3.3-2.14 N Y N Y Y 1.40: New Question
3.3-2.15 N Y N Y Y 1.40: New Question
3.3-2.16 N Y N Y Y 1.40: New Question
3.3-2.17 N Y N N Y 1.41: New Question
13.3.1 N/A N/A N/A N/A N/A 1.20: Moved to 13.2.5
13.3.2 N/A N/A N/A N/A N/A 1.20: Moved to 13.2.6
13.3.3 N/A N/A N/A N/A N/A 1.20: Moved to 13.2.7



13.3.4 N/A N/A N/A N/A N/A 1.20: Moved to 13.2.2
3.3-3.1 N Y N Y Y 1.40: New Question
3.3-3.2 N Y N Y Y 1.40: New Question
3.3-3.3 Y N N N Y 1.40: New Question
13.3.5 3.3-3.4 N Y N Y Y 1.20: Modified
13.3.6 3.3-3.5 N Y N Y Y 1.20: Modified
13.3.7 3.3-3.6 Y Y N Y Y 1.20: Modified
13.3.8 N/A N/A N/A N/A N/A 1.20: Deleted
13.3.9 N/A N/A N/A N/A N/A 1.20: Moved to 13.4.3
13.3.10 3.3-3.7 N Y N N Y 1.20: Modified
13.3.11 3.3-3.8 N Y N Y Y 1.20: Modified
13.3.12 3.3-3.9 N Y Y Y Y 1.20: Modified
13.3.13 3.3-3.10 N Y Y Y Y 1.20: Modified
13.3.14 3.3-3.11 N Y N Y Y 1.20: Moved from 13.1.1
13.3.15 3.3-3.12 Y Y N Y Y 1.20: Moved from 13.2.1
13.3.16 3.3-3.13 N Y N Y Y 1.20: New Question
13.3.17 3.3-3.14 N Y N Y Y 1.20: New Question
13.4.4 3.3-3.15 Y Y N Y Y 1.20: New Question; 1.40: Moved from 13.4.4
3.3-3.16 N N N Y N 1.41: New Question
13.4.1 N/A N/A N/A N/A N/A 1.20: Deleted
13.4.2 3.3-4.1 N Y N Y Y 1.20: New Question; 1.40: Modified
13.4.3 3.3-4.2 N Y N Y Y 1.20: New Question
13.4.4 N/A N/A N/A N/A N/A 1.20: New Question; 1.40: Moved to 3.3-3.13
13.4.5 3.3-4.4 Y Y N Y Y 1.20: New Question
13.4.6 3.3-4.5 Y Y N Y Y 1.20: New Question
13.5.1 3.3-5.1 Y Y N Y Y 1.20: New Question; 1.40: Modified
3.3-5.2 Y Y N Y Y 1.40: New Question
3.3-5.3 Y Y N Y Y 1.20: New Question
3.3-5.4 Y Y N Y Y 1.20: New Question; 1.40: Modified

KFA 3.4 - Integrated Engineering Analysis
V1.3 V1.4 Goal 1 Goal 2 Goal 3 Goal 4 Goal 5 Comments

3.4-1.1 N Y N/A N/A N/A 1.40: New Question
3.4-1.2 Y Y N/A N/A N/A 1.40: New Question
3.4-1.3 Y N N/A N/A N/A 1.40: New Question
3.4-1.4 Y N N/A N/A N/A 1.40: New Question
15.1.1 N/A N/A N/A N/A N/A 1.40: Moved to 3.4-2.9
15.1.2 N/A N/A N/A N/A N/A 1.40: Moved to 1.4-3.3



15.1.3 N/A N/A N/A N/A N/A 1.40: Moved to NO TAG
15.1.4 N/A N/A N/A N/A N/A 1.40: Moved to 3.4-3.20
3.4-2.1 Y N N/A N/A N/A 1.40: New Question
15.2.1 3.4-2.2 N Y N/A N/A N/A 1.40: Modified
15.2.2 N/A N/A N/A N/A N/A 1.40: Deleted
15.2.3 3.4-2.3 Y Y N/A N/A N/A
15.2.4 3.4-2.4 N Y N/A N/A N/A
3.4-2.5 Y N N/A N/A N/A 1.40: New Question
3.4-2.6 N Y N/A N/A N/A 1.40: New Question
15.2.5 3.4-2.7 N Y N/A N/A N/A 1.40: Modified
3.4-2.8 Y Y N/A N/A N/A 1.40: Moved from 15.1.1
15.2.6 3.4-2.9 Y N N/A N/A N/A
15.2.7 3.4-2.10 N Y N/A N/A N/A
3.4-2.11 Y Y N/A N/A N/A 1.40: New Question
3.4-2.12 N Y N/A N/A N/A 1.41: New Question
3.4-3.1 Y N N/A N/A N/A 1.40: New Question
15.3.1 3.4-3.2 N Y N/A N/A N/A
15.3.2 3.4-3.3 N Y N/A N/A N/A
15.3.3 3.4-3.4 N Y N/A N/A N/A
15.3.4 3.4-3.5 Y Y N/A N/A N/A
15.3.5 3.4-3.6 Y Y N/A N/A N/A
15.3.6 3.4-3.7 Y Y N/A N/A N/A 1.40: Modified
3.4-3.8 Y Y N/A N/A N/A 1.40: Moved from 15.1.2
3.4-3.9 Y Y N/A N/A N/A 1.40: Moved from 15.1.3
15.3.7 3.4-3.10 Y Y N/A N/A N/A
15.3.8 3.4-3.11 N Y N/A N/A N/A 1.40: Modified
15.3.9 N/A N/A N/A N/A N/A 1.40: Deleted
15.3.10 N/A N/A N/A N/A N/A 1.40: Deleted
15.3.11 3.4-3.12 Y Y N/A N/A N/A
15.3.12 3.4-3.13 Y Y N/A N/A N/A
15.3.13 3.4-3.14 N Y N/A N/A N/A
15.3.14 3.4-3.15 N Y N/A N/A N/A
15.3.15 3.4-3.16 Y Y N/A N/A N/A
3.4-3.17 N Y N/A N/A N/A 1.40: Moved from 15.5.2
3.4-3.18 N Y N/A N/A N/A 1.40: Moved from 15.4.8
3.4-3.19 Y Y N/A N/A N/A 1.40: Moved from 15.5.1
3.4-3.20 Y Y N/A N/A N/A 1.40: Moved from 15.1.5
3.4-3.21 Y Y N/A N/A N/A 1.40: New Question
3.4-3.22 Y Y N/A N/A N/A 1.41: New Question
15.4.1 N/A N/A N/A N/A N/A 1.10: Deleted
15.4.2 N/A N/A N/A N/A N/A 1.10: Deleted
15.4.3 N/A N/A N/A N/A N/A 1.40: Deleted
15.4.4 3.4-4.1 Y Y N/A N/A N/A 1.40: Modified
15.4.5 3.4-4.2 Y Y N/A N/A N/A

15.4.6 3.4-4.3 Y Y N/A N/A N/A
15.4.7 3.4-4.4 N Y N/A N/A N/A
15.4.8 N/A N/A N/A N/A N/A  1.40: Moved to 3.4-3.17
15.5.1 N/A N/A N/A N/A N/A  1.40: Moved to 3.4-3.19
15.5.2 N/A N/A N/A N/A N/A  1.40: Moved to 3.4-3.16
15.5.3 N/A N/A N/A N/A N/A  1.40: Moved to 3.6-2.9
15.5.4 N/A N/A N/A N/A N/A  1.40: Deleted
15.5.5 N/A N/A N/A N/A N/A  1.40: Deleted
3.4-5.1 Y Y N/A N/A N/A  1.40: New Question
3.4-5.2 Y Y N/A N/A N/A  1.40: New Question
3.4-5.3 Y Y N/A N/A N/A  1.40: New Question
3.4-5.4 Y Y N/A N/A N/A  1.40: New Question
3.4-5.5 Y Y N/A N/A N/A  1.40: New Question
3.4-5.6 Y Y N/A N/A N/A  1.40: New Question
3.4-5.7 Y Y N/A N/A N/A  1.40: New Question

KFA 3.5 -- System Integration
V1.3  V1.4  Goal 1  Goal 2  Goal 3  Goal 4  Goal 5  Comments

3.5-1.1 N Y N N N/A  1.40: New Question
3.5-1.2 N N N Y N/A  1.40: New Question
14.1.1 3.5-1.3 N N N Y N/A  1.20: New Question; 1.40: Modified
14.1.2 3.5-1.4 Y N N N N/A  1.20: New Question; 1.40: Modified
3.5-1.5 N N N Y N/A  1.40: New Question
14.2.1 3.5-2.1 Y N N N N/A  1.40: Modified
3.5-2.2 Y N N N N/A  1.40: New Question
14.2.2 N/A N/A N/A N/A N/A  1.40: Moved to 3.6-2.7
14.2.3 N/A N/A N/A N/A N/A  1.40: Moved to 3.6-2.8
3.5-2.3 Y N N N N/A  1.40: New Question
14.2.4 3.5-2.4 Y N N N N/A  1.20: New Question; 1.40: Modified
3.5-2.5 N Y Y N N/A  1.40: New Question
3.5-2.6 N N Y Y N/A  1.40: New Question
3.5-2.7 N Y Y Y N/A  1.40: New Question
14.2.5 3.5-2.8 N Y N Y N/A  1.20: New Question; 1.40: Modified
3.5-2.9 N Y Y Y N/A  1.40: New Question
3.5-2.10 N Y N Y N/A  1.40: New Question
3.5-2.11 Y N N Y N/A  1.40: New Question
3.5-2.12 N Y Y Y N/A  1.40: New Question
3.5-2.13 N Y Y Y N/A  1.40: New Question
3.5-2.14 N N N Y N/A  1.41: New Question

14.3.1 N/A N/A N/A N/A N/A  1.20: Modified; 1.40: Moved to 3.6-2.x
14.3.2 N/A N/A N/A N/A N/A  1.40: Moved to 3.6-3.x
14.3.3 N/A N/A N/A N/A N/A  1.20: Modified; 1.40: Moved to 3.6-3.x
14.3.4 N/A N/A N/A N/A N/A  1.20: Moved to 13.4.2
14.3.5 N/A N/A N/A N/A N/A  1.40: Moved to 14.4.3
3.5-3.1 N N Y Y N/A  1.40: New Question
3.5-3.2 Y N N N N/A  1.40: New Question
3.5-3.3 N Y N N N/A  1.40: New Question
3.5-3.4 N N Y N N/A  1.40: New Question
3.5-3.5 N N Y N N/A  1.40: New Question
3.5-3.6 N N N Y N/A  1.40: New Question
3.5-3.7 N N Y Y N/A  1.40: New Question
3.5-3.8 N N Y Y N/A  1.40: New Question
14.3.6 3.5-3.9 Y N N Y N/A  1.20: Modified; 1.40: Modified
3.5-3.10 N Y Y N N/A  1.40: New Question
3.5-3.11 N N Y Y N/A  1.40: New Question
3.5-3.12 Y Y N N N/A  1.40: New Question
3.5-3.13 Y Y Y Y N/A  1.40: New Question
3.5-3.14 N N N Y N/A  1.40: New Question
14.3.7 N/A N/A N/A N/A N/A  1.20: Modified; 1.40: Moved to 3.6-3.x
14.3.8 N/A N/A N/A N/A N/A  1.20: Modified; 1.40: Moved to 3.6-3.x
14.3.9 N/A N/A N/A N/A N/A  1.20: Modified; 1.40: Moved to 3.6-3.x
14.4.1 N/A N/A N/A N/A N/A  1.20: Modified; 1.40: Moved to 3.6-3.x
14.4.2 N/A N/A N/A N/A N/A  1.20: New Question; 1.40: Moved to 3.6-3.x
14.4.3 N/A N/A N/A N/A N/A  1.20: Moved from 14.3.5; 1.40: Moved to 3.6-3.x
14.4.4 N/A N/A N/A N/A N/A  1.20: New Question; 1.40: Moved to 3.6-3.x
14.4.5 N/A N/A N/A N/A N/A  1.20: New Question; 1.40: Moved to 3.6-3.x
3.5-4.1 Y N N Y N/A  1.40: New Question
3.5-4.2 N Y Y Y N/A  1.40: New Question
3.5-4.3 N Y Y Y N/A  1.40: New Question
14.5.1 3.5-5.1 Y N N Y N/A  1.20: New Question; 1.40: Modified
3.5-5.2 Y N N Y N/A  1.40: New Question

14.5.2 3.5-5.3 N N N Y N/A  1.20: New Question; 1.40: Modified
3.5-5.4 N N N Y N/A  1.40: New Question
14.5.3 3.5-5.5 N N N Y N/A  1.20: New Question; 1.40: Modified
3.5-5.6 N N Y Y N/A  1.40: New Question
14.5.4 N/A N/A N/A N/A N/A  1.20: New Question; 1.40: Deleted

KFA 3.6 -- System Verification
V1.3  V1.4  Goal 1  Goal 2  Goal 3  Goal 4  Goal 5  Comments

3.6-1.1 N Y Y N N  1.40: New Question
3.6-1.2 N Y Y Y N  1.40: New Question
14.1.1 3.6-1.3 Y N N N N  1.40: Moved from 14.1.1
3.6-1.4 Y Y Y Y N  1.40: New Question
14.1.2 3.6-1.5 Y N N N N  1.40: Moved from 14.1.2
3.6-1.6 Y N N N N  1.40: New Question
14.2.1 3.6-2.1 Y N N N Y  1.40: Split from 14.2.1
3.6-2.2 Y N N N N  1.40: New Question
14.2.4 3.6-2.3 N Y Y Y N  1.40: Moved from 14.2.4
3.6-2.4 Y Y Y Y N  1.40: New Question
3.6-2.5 Y Y Y Y N  1.40: Moved from 15.5.3
3.6-2.6 N Y Y Y Y  1.40: New Question
14.2.2 3.6-2.7 N Y Y Y Y  1.40: Moved from 14.2.2
14.2.3 3.6-2.8 N N N N Y  1.40: Moved from 14.2.3
3.6-2.9 N Y Y Y Y  1.40: New Question
3.6-2.10 N N N N Y  1.40: New Question
14.2.5 3.6-2.11 N Y Y Y N  1.40: Split from 14.2.5
3.6-2.12 N Y Y Y N  1.40: New Question
3.6-2.13 N Y Y Y Y  1.40: New Question
3.6-2.14 N Y Y Y Y  1.40: New Question
3.6-2.15 N Y Y Y Y  1.40: New Question
3.6-2.16 N Y N N Y  1.41: New Question
3.6-3.1 Y N N N N  1.40: New Question
3.6-3.2 Y N N N Y  1.40: New Question
3.6-3.3 Y N N N Y  1.40: New Question
3.6-3.4 N Y Y Y Y  1.40: New Question
3.6-3.5 Y N N N Y  1.40: New Question
3.6-3.6 Y N N N Y  1.40: New Question
3.6-3.7 Y N N N Y  1.40: New Question
3.6-3.8 N Y Y Y Y  1.40: New Question
3.6-3.9 N Y Y Y Y  1.40: New Question
3.6-3.10 N N N N Y  1.41: New Question

3.6-4.1 Y N N N Y  1.40: New Question
3.6-4.2 Y Y Y Y Y  1.40: New Question
3.6-4.3 Y Y Y Y Y  1.40: New Question
3.6-4.4 N Y Y Y Y  1.40: From 14.4.2
3.6-4.5 N Y Y Y Y  1.40: From 14.4.4
14.5.1 3.6-5.1 Y Y Y Y Y  1.40: Split From 14.5.1
3.6-5.2 Y Y Y Y Y  1.40: New Question
14.5.3 3.6-5.3 N Y Y Y Y  1.40: Split From 14.5.3
14.5.2 3.6-5.4 N Y Y Y Y  1.40: Split From 14.5.2
3.6-5.5 Y Y Y Y Y  1.40: New Question

KFA 3.7 -- System Validation
V1.3  V1.4  Goal 1  Goal 2  Goal 3  Goal 4  Goal 5  Comments

3.7-1.1 N N Y Y N  1.40: New Question
3.7-1.2 Y N Y Y N  1.40: New Question
3.7-1.3 Y N N N N  1.40: New Question
3.7-1.4 Y N N N N  1.40: New Question
3.7-1.5 Y N N N N  1.40: New Question
3.7-1.6 Y Y N N Y  1.40: New Question
3.7-2.1 Y N N N N  1.40: New Question
3.7-2.2 Y N N N N  1.40: New Question
3.7-2.3 Y N N N N  1.40: New Question
3.7-2.4 Y N N Y N  1.40: New Question
3.7-2.5 N Y Y Y Y  1.40: New Question
3.7-2.6 N N N N Y  1.40: New Question
3.7-2.7 N Y Y Y N  1.40: New Question
3.7-2.8 N Y Y Y N  1.40: New Question
3.7-2.9 Y N N N N  1.40: New Question
3.7-2.10 Y N N N Y  1.40: New Question
3.7-2.11 N N N N Y  1.40: New Question
3.7-2.12 N Y N N Y  1.41: New Question
3.7-3.1 Y N N N N  1.40: New Question
3.7-3.2 Y N N N Y  1.40: New Question
3.7-3.3 Y N N Y N  1.40: New Question
3.7-3.4 Y N N N Y  1.40: New Question
3.7-3.5 N N N N Y  1.40: New Question
3.7-3.6 N N N N Y  1.40: New Question
3.7-3.7 N N N N Y  1.40: New Question
3.7-3.8 Y Y Y Y N  1.40: New Question
3.7-3.9 Y Y Y Y N  1.40: New Question
3.7-3.10 N N N N Y  1.41: New Question
3.7-4.1 Y Y Y Y Y  1.40: New Question
3.7-4.2 N Y Y Y Y  1.40: New Question

3.7-4.3 Y Y Y Y Y  1.40: New Question
3.7-5.1 Y N N N Y  1.40: New Question
3.7-5.2 Y N N N Y  1.40: New Question
3.7-5.3 Y N N N Y  1.40: New Question
3.7-5.4 N Y Y Y Y  1.40: New Question

APPENDIX C RELATIONSHIP TO OTHER STANDARDS

This appendix provides a structural mapping of the INCOSE SECAM to the EIA/IS 632, IEEE-1220-1994, and ISO 9001:1994 standards.

SECAM | EIA 632 | IEEE-1220-1994 | ISO 9001 | Comments

Management:

1.0 - All KFAs in Level 2 |  | 4.2 - Policies | 4.1.1 - Quality Policy |
1.0 - All KFAs in Level 3 |  | 4.2 - Procedures | 4.4.1 - Design Control; 4.2.2 - Qual Sys Proc |
1.0 - All KFAs in Level 2 | 4.2.1 - Planning & Org. | 4.3 - Planning Tech Effort | 4.4.2 - Design Dev Plan |
1.0 - Mgmt Proc Category | 4.2 - Mgmt Element | 6.8 - Control | 4.4.1 - Design Control |
1.1 - Planning | 4.2.1 - Planning & Org. | 6.8.10 - Project Plans | 4.4.2 - Design Dev Plan |
1.1 - Planning | 4.2.1 - Planning & Org. | 6.8.11 - Technical Plans | 4.2.3 - Quality Planning |
1.2 - Tracking & Oversight | 4.2.2 - Controlling | 6.8.2.5 - Perf Based Status | 4.4.1 - Design Control |
1.2 - Tracking & Oversight | 4.2.2 - Controlling | 6.8.5 - Prog per Proj Plan | 4.4.2 - Design Dev Plan |
1.2 - Tracking & Oversight | 4.2.2 - Controlling | 6.8.6 - Prog per Tech Plan | 4.4.2 - Design Dev Plan |
1.2 - Tracking & Oversight | 4.2.2 - Controlling | 6.8.7 - Prod / Proc Metrics | 4.4.2 - Design Dev Plan |
1.3 - Subcontract Mgmt |  |  | 4.6.2 - Eval of Subcon |
1.4 - Intergroup Coord | 4.2.2 - Controlling | 6.8.2.3 - Interface Mgmt | 4.4.3 - Org & Tech Interfaces | Also KFAs 3.3 and 3.5 in SECAM
1.4 - Intergroup Coord |  | 4.12 - Technical Reviews | 4.4.6 - Design Reviews |
1.5 - Config Management | 4.2.2 - Controlling | 6.8.2.2 - Config Mgmt | 4.5 - Doc & Data Ctrl |
1.6 - Quality Management |  | 4.13 - Quality Management | 4.1 - Qual Mgmt Resp |
1.7 - Risk Management | 4.2.2 - Controlling | 6.8.2.4 - Risk Mgmt |  |
1.8 - Data Management | 4.2.2 - Controlling | 6.8.2.1 - Data Management | 4.5 - Doc & Data Ctrl |

SECAM | EIA 632 | IEEE-1220-1994 | ISO 9001 | Comments

Organization:

2.0 - All KFAs in Level 2 |  | 4.2 - Policies | 4.1.1 - Quality Policy |
2.0 - All KFAs in Level 3 |  | 4.2 - Procedures | 4.4.1 - Design Control; 4.2.2 - Qual Sys Proc |
2.0 - All KFAs in Level 2 | 4.2.1 - Planning & Org | 4.3 - Planning Tech Effort | 4.4.2 - Design Dev Plan |
2.1 - Process Mgmt & Imp |  | 4.14 - Cont Process Imprv | 4.14 - Cor & Prev Action |
2.2 - Competency Dev. |  |  | 4.18 - Training |
2.3 - Technology Mgmt |  | 4.14 - Cont Product Imprv |  |
2.4 - Env & Tool Support | 4.3.2 - Synth of Solution | 4.5 - Modeling & Proto | 4.20 - Statistical Techniq |
2.4 - Env & Tool Support | 4.2.2 - Controlling | 4.6 - Integrated Database |  | 1220 redundant with Section 6.8.12
2.4 - Env & Tool Support | 4.2.2 - Controlling | 6.8.12 - Integrated Database |  | 1220 redundant with Section 4.6

SECAM | EIA 632 | IEEE-1220-1994 | ISO 9001 | Comments

SE Process:

3.0 - All KFAs in Level 2 |  | 4.2 - Policies | 4.1.1 - Quality Policy |
3.0 - All KFAs in Level 3 |  | 4.2 - Procedures | 4.4.1 - Design Control |
3.0 - All KFAs in Level 2 | 4.2.1 - Planning & Org | 4.3 - Planning Tech Effort | 4.4.2 - Design Dev Plan |
3.0 - SE Process Category | 4.3 - Technical Element | 6.0 - SE Process | 4.4.1 - Design Control |
3.1 - Sys Concept Def | 4.3.1.1 - Anal of Rqmts | 6.1 - Reqmts Analysis | 4.4.4 - Design Input |
3.2 - Rqmts & Funct Anal | 4.3.1.2 - Anal of Functs | 6.3 - Functional Analysis | 4.4.5 - Design Output |
3.2 - Rqmts & Funct Anal | 4.2.2 - Controlling | 6.8.4 - Reqmt & Des Chgs | 4.4.9 - Design Changes |
3.2 - Rqmts & Funct Anal | 4.2.2 - Controlling | 6.8.8 - Baselines |  |
3.3 - System Design | 4.3.2 - Synth of Solution | 6.5 - Synthesis | 4.4.5 - Design Output |
3.3 - System Design | 4.3.2 - Synth of Solution | 6.8.9 - Reqmts & Arch |  |
3.4 - Integr Eng Analysis |  | 4.11 - Integ of SE Effort |  |
3.4 - Integr Eng Analysis | 4.3.4 - Assess & Select | 6.7 - Systems Analysis |  | 1220 redundant with Section 6.8.3
3.4 - Integr Eng Analysis | 4.3.4 - Assess & Select | 6.8.3 - Sys Analy & Test |  | 1220 redundant with Section 6.7
3.5 - System Integration | 4.3.3 - Verif of Solution | 6.6 - Physical Verification | 4.4.7 - Design Verification |

3.6 - System Verification | 4.3.1.2 - Anal of Functs | 6.4 - Functional Verif |  |
3.7 - System Validation | 4.3.1.1 - Anal of Rqmts | 6.2 - Reqmts Validation | 4.4.8 - Design Validation |

SECAM | EIA 632 | IEEE-1220-1994 | ISO 9001 | Comments

Life Cycle:

Selected KFAs | 5.0 - Life Cycle Reqmts |  | 4.4.4 - Design Input |
Selected KFAs | 5.1 - System Def Stage |  |  |
Selected KFAs | 5.2 - Prelim Design Stage |  |  |
Selected KFAs | 5.3 - Detail Design Stage |  | 4.4.5 - Design Output |
Selected KFAs | 5.4 - FAIT Stage |  |  |
Selected KFAs | 5.5 - Production Stage |  | 4.9 - Process Control |
Selected KFAs | 5.5 - Cust Support Stage |  | 4.19 - Servicing |
Selected KFAs | 5.6 - Sim Eng of Services |  |  |

Note: A specific system development life cycle model is not defined in EIA 632 or the SECAM. The INCOSE SECAM does verify appropriate awareness of life cycle issues within its KFAs.

APPENDIX D SYSTEMS ENGINEERING CAPABILITY ASSESSMENT MODEL QUESTIONNAIRE

Thank you for participating in this systems engineering capability self-assessment. A questionnaire is one of several methods that can be used to gather information about your organization. Please respond to each question from your own perspective and with respect to your immediate department, organization, and/or program. Boxes are provided in the questionnaire for yes/no responses. Responses of “not applicable” or “don't know” are also permitted; when appropriate, indicate these by writing “n/a” or “d/k”, respectively, in the margin.

A glossary is provided at the back of the questionnaire to clarify the meaning of key terms. Within the questionnaire, italics are used to denote terms that are included in the glossary.

Certain questions posed in the questionnaire will require you to make value judgments. Please be aware that answers will be evaluated in the context of many other responses and will not be attributable to a single individual. Wherever possible, terms used in association with the questions have been defined to assist you in making an appropriate response.

If you have any questions about the questionnaire or the meaning of terminology and/or phrases, please ask for assistance. Again, thank you for your time.