
NDIA SE Architecture Committee, 25 June 2012

Agenda:

• General Info – Nothing new

• Task # 4 Status (S. Dam)

• Task # 8 Status (R. Carson)

Task # 4:

• Is the use of authoritative DoDAF-like architectures critical for a successful systems integration effort? (Navy). Reusable architectures/environments, system hierarchies. (AF)

• SED: define “architecture” context and task plan. Establish architecture committee.

• Situation – Observations from industry and others are that:
  1. Use of DoDAF is often disjointed from program technical work and/or performed in duplication with a program’s other systems engineering efforts, which can result in limited utility of the DoDAF data and artifacts;
  2. Appropriate application of DoDAF in conjunction with bigger-picture SE efforts is often misunderstood; and
  3. There is a lack of documented, measurable benefits of a unified architecture framework standard and its applicability to future DoD program and mission success.
  – Increased industry and DoD awareness and understanding of how architecture is an integral part of the systems engineering process to enable model-based design, and a defined path forward to address current DoDAF limitations

• Proposal – Identify recommendations for:
  • Should the DoDAF be mandatory?
  • Where should the DoDAF be used? Where should it not be used?
  • Adjustments needed to the current DoDAF path, including topics of standards, tools, etc.
  • How DoDAF should be integrated into larger SE efforts, including what DoD guidance is needed in this area
  • Suggested alternatives to DoDAF
  • Areas for future continued investigation


• Small working group of industry participants
• Use 2009 Architecture Working Group report as reference
• Identify/document case studies showing what has worked and what the issues have been with the current DoDAF approach (industry participants query their organizations)
• Address 6 proposal topics
• Deliverable: Report, including an executive summary


• Define deliverable report
  – Goals
  – Objectives
  – Outline
• Solicit volunteers to write portions of deliverable
• Develop deliverable schedule
  – Draft prior to SE Conference in October


• Connect architecture products to deliverables for key milestones

• Do they have to work together? Interoperability?


• Presentation Title: Use of DoDAF for Systems Integration: An NDIA SE Division Architecture Subcommittee Update (Submitted Ref. #14830)

• Abstract: In April 2012 the NDIA SE Division formed a committee to address the question: “Is the use of authoritative DoDAF-like architectures critical for a successful systems integration effort?” A number of issues related to DoDAF were discussed, including: 1) industry and others have observed that use of DoDAF is often disjointed from program technical work and/or performed in duplication with a program’s other systems engineering efforts, which can result in limited utility of the DoDAF data and artifacts; 2) appropriate application of DoDAF in conjunction with bigger-picture SE efforts is often misunderstood; and 3) there is a lack of documented, measurable benefits of a unified architecture framework standard and its applicability to future DoD program and mission success. To address this question, it was determined that we need to increase industry and DoD awareness and understanding of how architecture is an integral part of the systems engineering process to enable model-based design, and define a path forward to address possible current DoDAF limitations. A proposal was developed to: a) identify recommendations for adjustments needed to the current DoDAF path, including topics of standards, languages, tools, etc.; b) describe how DoDAF should be integrated into larger SE efforts, including what DoD guidance is needed in this area; and c) recommend priorities for ongoing architecture framework standards (e.g., IDEAS Group, OMG UPDM) and innovations (e.g., ontology working groups, INCOSE MBSE activities). This presentation presents preliminary results of this investigation.


1. Executive Summary

2. Should the DoDAF be mandatory?

3. Where should the DoDAF be used?

4. Where should DoDAF not be used?

5. What adjustments are needed to current DoDAF path?

6. How should DoDAF be integrated into larger SE efforts?

7. What DoD guidance is needed in this area?

8. What are the alternatives to DoDAF?

9. What are the areas for future continued investigation?


• Develop deliverable outline
• Obtain volunteers for writing assignments
• Write sections independently
• Collect expanded outline/draft sections, review and send out for comments
• Develop presentation based on deliverable
• Submit presentation to NDIA
• Make copies of draft deliverable available at conference


July 2012 milestones:

• Wed 4 – Independence Day
• Fri 6 – Draft deliverable outline finalized
• Mon 9 – Bi-weekly telecon
• Tue 10 – Draft deliverable writing assignments
• Mon 23 – Bi-weekly telecon
• Fri 27 – Expanded outlines of each section due


August 2012 milestones:

• Mon 6 – Bi-weekly telecon
• Fri 17 – First draft of deliverable sections due
• Mon 20 – Bi-weekly telecon


September 2012 milestones:

• Mon 3 – Labor Day
• Thu 6 – Outline for presentation due
• Fri 7 – First integrated draft due
• Mon 10 – Send draft and outline out for comments
• Mon 17 – Bi-weekly telecon
• Tue 18 – Receive comments
• Fri 21 – Second integrated draft due
• Mon 24 – Send integrated draft and presentation out for comments


October 2012 milestones:

• Tue 2 – Receive comments
• Fri 5 – Final inputs on NDIA presentation due
• Mon 8 – Columbus Day
• Tue 9 – NDIA presentations due?
• Fri 12 – Final inputs on draft deliverable due
• Wed 17 – Print copies of draft deliverable
• Mon 22 – NDIA SE Division Conference


Task # 8

• Architecture Metrics

• Provide Inputs to Ron Carson, Paul Kohl, Garry Roedler

Industry Architecture Measurement WG

• Members (Company / representing):
  • L-M – Paul Kohl (PSM, leader), Garry Roedler (PSM, INCOSE), Jamie Kanyok (INCOSE)
  • Liveware (Argentina) – Alejandro Bianchi (PSM, Co-Leader)
  • Boeing – Ron Carson (INCOSE, NDIA)

• Activities:
  • Abstract submitted for NDIA SE Conference: “New Opportunities for Architecture Measurement”
  • Preparing for PSM (Practical Software and Systems Measurement) workshop on architecture measurement, July 31, Portsmouth, VA

• Workshop Objectives/Agenda:
  • Identify the key attributes of architecture to be measured
  • Define a set of architecture measures that provide insight into the architecture (proposed set on next page)
  • Multi-vote on proposed metrics for each attribute – benefit vs. difficulty (PICK chart idea)
  • Recommend means/methods for obtaining the measures selected (modeling tools, requirements tools, outputs from related processes)
  • Fill in the PSM template for the measures (benefit/decisions, frequency, base and derived measures, source data, means of capture, etc.)
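The multi-vote step above can be sketched in code. This is a minimal illustration, not the workshop's actual procedure: the vote scores and threshold are invented placeholders, and the quadrant names follow the common PICK convention (Possible / Implement / Challenge / Kill) on benefit vs. difficulty.

```python
# Hedged sketch of a PICK-chart multi-vote tally. Metric names come from the
# proposed set on the next page; all scores below are illustrative, not real
# workshop results.

def pick_quadrant(benefit: float, difficulty: float, threshold: float = 3.0) -> str:
    """Classify a metric by average benefit and difficulty scores (1-5 scale)."""
    if benefit >= threshold:
        return "Implement" if difficulty < threshold else "Challenge"
    return "Possible" if difficulty < threshold else "Kill"

# Illustrative votes: {metric: list of (benefit, difficulty) scores from participants}
votes = {
    "Number of elements": [(4, 1), (5, 2), (4, 2)],
    "Degree of coupling": [(5, 4), (4, 5)],
    "Number of artifacts produced": [(2, 1), (3, 2)],
}

for metric, scores in votes.items():
    avg_benefit = sum(b for b, _ in scores) / len(scores)
    avg_difficulty = sum(d for _, d in scores) / len(scores)
    print(f"{metric}: {pick_quadrant(avg_benefit, avg_difficulty)}")
```

High-benefit, low-difficulty metrics land in "Implement"; high-benefit, high-difficulty ones become "Challenge" candidates for further work.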

Proposed Attributes and Metrics

• Size
  – Number of elements
  – Number of relationships (external)
  – Number of requirements
  – Number of mission / system scenarios / use cases
  – Number of artifacts produced
  – Number of data points
  – Number of function points
  – Number of use case points
• Complexity
  – Number of relationships (internal & external)
  – Number of interactions
  – Number of functions/methods
  – Number of states
• Completeness
  – Requirements satisfied
  – Artifacts produced
• Quality of solution
  – Number of defects
  – Degree of requirements satisfaction
  – Degree of coupling
  – Degree of “pick an ‘ility”
• Quality of representation
  – Number of defects
  – Degree of consistency of representation
  – Degree of standards compliance
• Cost of architecture development
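Several of the proposed size and complexity counts can be read directly off an architecture model once it is represented as elements and relationships. The sketch below is only an illustration of that idea: the model classes, field names, and toy data are assumptions, not the schema of any actual DoDAF or modeling tool.

```python
# Hedged sketch: counting proposed size/complexity metrics from a toy
# architecture model. Element/relationship structure is assumed for
# illustration only.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    functions: list = field(default_factory=list)  # functions/methods allocated to this element

@dataclass
class ArchitectureModel:
    elements: list
    relationships: list  # (source, target, kind) tuples; kind is "internal" or "external"

    def size_metrics(self) -> dict:
        return {
            "num_elements": len(self.elements),
            "num_external_relationships": sum(
                1 for _, _, kind in self.relationships if kind == "external"),
        }

    def complexity_metrics(self) -> dict:
        return {
            "num_relationships": len(self.relationships),  # internal & external
            "num_functions": sum(len(e.functions) for e in self.elements),
        }

# Toy model: two system elements plus one external interface
model = ArchitectureModel(
    elements=[Element("Sensor", ["detect"]),
              Element("Processor", ["filter", "track"])],
    relationships=[("Sensor", "Processor", "internal"),
                   ("Processor", "GroundStation", "external")],
)
print(model.size_metrics())        # element and external-relationship counts
print(model.complexity_metrics())  # total relationships and allocated functions
```

In practice these counts would come from a modeling tool's export or API rather than hand-built objects, which is exactly the opportunity the working group attributes to current architecture modeling environments.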

Submitted Abstract

New Opportunities for System Architecture Measurement

The United States Government Accountability Office, the United States Department of Defense (Carter 2010; Thompson 2010), and industry (NDIA 2010) have all made the case for better measurement in support of program success. Last year’s NDIA System Development Performance Measurement working group report (NDIA 2011) attempted to define a broad set of leading indicators building on the Practical Software and Systems Measurement (PSM) and INCOSE Systems Engineering Leading Indicators (SELI) Guide (INCOSE 2010).

The workshop conducted as part of the NDIA study identified System Architecture as a high-priority area for a leading indicator but was unable to identify suitable candidates. Objectives for architecture measurement were described as “Evaluates the architecture from the perspectives of quality, flexibility, and robustness, stability, [and] adequacy of design rules.” The report chartered further activity as a need “to identify a consistent, common measure of architecture quality in the current community of practice to include as a consensus recommendation.” To address this residual need, an architecture measurement working group has been formed with participation from companies representing INCOSE, NDIA, and PSM.

The introduction of architecture modeling tools and their evolution has created opportunities for defining meaningful measures. This paper reviews previous measurement literature and process standards to identify measurement concepts applicable to “architecture”. The process activities and outputs of the System Architecture processes defined in ISO 15288 and the INCOSE Handbook were reviewed, along with the measurement concepts defined in Carson (2011) and Olson (2008). Classes of measurement include quality and completeness of architecture, and work progress measures vs. planned work. The emphasis is on identifying measures that can serve as leading indicators and predictors of program impacts. Measures derived from Rechtin and Maier (1997) heuristics are also examined in this context, as are measures related to US Department of Defense Architecture Framework (DoDAF) viewpoints.

The results were compiled and a set of potential measures defined. The measures are documented using the PSM methodology. General techniques are proposed that take advantage of the opportunities afforded by the current architecture modeling environment to provide a basic measurement plan for architecture including leveraging existing measurement concepts found within the SELI. The result is a comprehensive tailorable suite of measures that can provide decision-making data to program managers and program leaders.