
Process Modeling and Simulation Experiments for Software Engineering

Nancy S. Eickelmann, PhD
Motorola Labs
1303 E. Algonquin Rd. Annex-2
Schaumburg, IL 60196
Phone: (847) 310-0785
Fax: (847) 576-3280
Nancy.Eickelmann@motorola.com

USC-CSE, October 23-26, 2001


Overview

• Process Modeling and Simulation
• Who Uses It - CMM High Maturity Organizations
• How to Use It for Defect Prevention
• Simulation Experiments for Software Engineering
• Internal Validity
• External Validity
• Design of Experiments


Process Modeling and Simulation for High Maturity Organizations

[Figure: the software lifecycle process - Func Spec (FS Insp), High-Level Design (HLD Insp), Low-Level Design (LLD Insp), Code Dev (Code Insp), Unit Test, Functional Test, System Test, and Field Support and Maintenance - with milestones (Project is Approved, Development Complete, Unit Test Complete, Release to Customers) and unit-test planning steps (Create UT Plan, Insp UT Plan, Follow UT Plan). Process and product project data feed informed, data-driven decisions about proposed process changes; process performance is assessed on cost, quality, and schedule.]


State of the Practice: Increasing Process Maturity

Source: SEI Web Site SEMA Report for March 2000


Level 5 KPAs: Optimizing

Defect Prevention
• Goal 1 - Defect prevention activities are planned
• Goal 2 - Common causes of defects are identified
• Goal 3 - Common causes of defects are prioritized and eliminated

Technology Change Management
• Goal 1 - Incorporation of technology changes is planned
• Goal 2 - New technologies are evaluated to determine their effect on quality and productivity
• Goal 3 - Appropriate new technologies are transferred into practice

Process Change Management
• Goal 1 - Continuous process improvement (CPI) is planned
• Goal 2 - Process improvement is organization-wide
• Goal 3 - Standard processes are improved continuously


Defect Prevention

“Defect prevention is defined as an activity of continuous institutionalized learning during which common causes of errors in work products are systematically identified and process changes eliminating those causes are made.”


What is Required for Defect Prevention?

• A measurement program that provides full lifecycle in-process visibility

• Knowledge of how and when defects, by type, severity, and impact, are introduced into the product

• Methods to improve the process that will result in defect prevention


From a Risk Management Perspective…

Defect prevention through risk management means engaging in a set of planning, controlling, and measuring activities that result in obviating, mitigating, or ameliorating defect-causing conditions.


Process Simulation Models

• Experimental Simulation - qualitative and quantitative results based on a non-deterministic or hybrid simulation model
• mirrors a segment of the real world
• control of variables is high
• supports testing of causal hypotheses
• results can be replicated
• high internal validity
• high external validity (generalizability)


Key Issues for Empirical Studies

• First, software engineering has a large number of key variables whose significance varies with the process lifecycle, organizational maturity, degree of process automation, level of domain expertise, computational constraints on the product, and required properties of the product.

• Second, the individual key variables required to mirror the real-world context can show extreme variance in their known values, within a single context or across multiple contexts. For instance, programmer productivity, a key variable in most empirical studies, has documented variances of 10:1 and even 25:1 within the same context.

• Third, software engineering domain variables, in combination, may create a critical mass or contextual threshold not present when they are studied in isolation. Statistical methods are applied to the data sets to identify variables that co-vary and have interdependent relationships.

[Basili, Selby, and Hutchens, IEEE TSE, 1986]
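As a minimal sketch of the statistical check the third point describes, the Python snippet below flags candidate co-varying variables with a correlation matrix. The variable names, synthetic data, and the 0.7 threshold are illustrative assumptions, not values from the cited study.

    # Sketch: flagging co-varying process variables with a correlation matrix.
    # Variable names, data, and the 0.7 threshold are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 40
    experience = rng.normal(5, 2, n)                      # years in the domain
    productivity = 3 * experience + rng.normal(0, 2, n)   # co-varies with experience
    defect_rate = rng.normal(10, 3, n)                    # independent in this sketch

    data = np.vstack([experience, productivity, defect_rate])
    names = ["experience", "productivity", "defect_rate"]
    corr = np.corrcoef(data)

    # Report variable pairs whose correlation exceeds the (assumed) threshold.
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if abs(corr[i, j]) > 0.7:
                print(f"{names[i]} and {names[j]} co-vary: r = {corr[i, j]:.2f}")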


Empirical Research Summary

• Experimental Simulation - qualitative and quantitative results based on a non-deterministic or hybrid simulation model

• Math Modeling - quantitative results based on a deterministic model

• Mirrors a segment of the real world; control of variables is high; supports testing of causal hypotheses; results can be replicated; high internal validity and generalizability

• Captures a real-world context in which to isolate and control variables

• Researcher bias can be introduced through the selection of variables, parameters, and assumptions concerning the model; modeling requires a high degree of analytical skill and interdisciplinary knowledge

• Results are not typically generalizable to other populations or environmental contexts; researcher bias is common


Factors Jeopardizing Research Internal Validity

• History - events occurring between the 1st and 2nd measurement of the experimental variables

• Maturation - processes impacting study results through the passage of time, e.g., growing tired, growing hungry, growing older, or undocumented reliability growth or decay

• Testing - the effects of taking a test on the scores of a 2nd test

• Instrumentation - changes in the measuring instrument, or changes in the observers' or record keepers' perceptions

• Statistical regression - group selection based on extreme scores

• Bias - differential selection of comparison groups

• Experimental mortality - loss of respondents

• Selection/maturation interaction - a confounding variable mistaken for the dependent variable


Factors Jeopardizing Research External Validity (Generalizability)

• Testing interaction or reactive effects - altered respondent sensitivity due to pre-test measurement

• Interaction effects - confounding effects from selection bias and experimental variable

• Reactive effects of experimental arrangements - preclude the applicability of results to persons or contexts not exposed to the experimental setting

• Multiple treatment interference - occurs when the respondent pool is reused repeatedly


How We Assure Internal Validity

Solomon Four Group Design
(R = random assignment; M = measurement or observation; X = exposure to the experimental variable or event)

Group 1: R   M   X   M
Group 2: R   M       M
Group 3: R       X   M
Group 4: R           M
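To illustrate why the four groups matter, here is a minimal Python sketch of the comparisons the design supports: estimating the treatment effect with and without a pretest, and isolating the testing (pretest sensitization) effect. The posttest scores are hypothetical values chosen only for illustration.

    # Sketch: separating treatment and testing effects in a Solomon four-group
    # design. The posttest scores are hypothetical illustrative values.
    post = {
        "g1_pre_treat": 78.0,   # R M X M
        "g2_pre_only": 70.0,    # R M   M
        "g3_treat_only": 76.0,  # R   X M
        "g4_control": 69.0,     # R     M
    }

    # Treatment effect, estimated with and without a pretest:
    effect_with_pretest = post["g1_pre_treat"] - post["g2_pre_only"]
    effect_without_pretest = post["g3_treat_only"] - post["g4_control"]

    # Testing (pretest sensitization) effect among untreated groups:
    testing_effect = post["g2_pre_only"] - post["g4_control"]

    print(f"treatment effect (pretested groups):   {effect_with_pretest:.1f}")
    print(f"treatment effect (unpretested groups): {effect_without_pretest:.1f}")
    print(f"pretest/testing effect:                {testing_effect:.1f}")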


How We Assure Internal Validity

Simulation Experiment Design
(Random assignment maps to data assignment; measurement/observation to the model, M1-M4; exposure to the experimental variable to the simulation inputs; and the final measurement or observation to the model output, M)

Group 1: R   M1   {X,Y,Z}   M
Group 2: R   M2   {X,Y}     M
Group 3: R   M3   {X,Z}     M
Group 4: R   M4   {X}       M
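A sketch of how this design might be driven in code: four model configurations, each exposed to a different subset of the inputs {X, Y, Z}, run under a common seed so the random data assignment is comparable across groups. The run_model function, its parameters, and the effect sizes are hypothetical stand-ins, not the actual simulation model.

    # Sketch of the simulation experiment design above. run_model is a
    # hypothetical stand-in for the real process simulation model.
    import random

    def run_model(inputs, seed):
        """Toy stochastic model: defects found depends on which inputs are active."""
        rng = random.Random(seed)                      # 'random data assignment' analog of R
        base = 100 * inputs.get("X", 0)                # initial defects {X}
        detect = inputs.get("Y", 0.5)                  # detection effectiveness {Y}
        size_factor = 1.0 - 0.2 * inputs.get("Z", 0)   # inspection size {Z}
        noise = rng.gauss(0, 3)
        return base * detect * size_factor + noise     # model output (M)

    designs = {
        "M1 {X,Y,Z}": {"X": 1, "Y": 0.7, "Z": 1},
        "M2 {X,Y}":   {"X": 1, "Y": 0.7},
        "M3 {X,Z}":   {"X": 1, "Z": 1},
        "M4 {X}":     {"X": 1},
    }

    for name, inputs in designs.items():
        out = run_model(inputs, seed=42)  # same seed = comparable groups
        print(f"{name}: defects found = {out:.1f}")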


Initialization Sub-Module

• Set the initial parameters for the model (a code sketch follows below)
• Inputs
  – Initial Defects {X}
  – Detection Effectiveness {Y}
  – Correction Effectiveness
  – Number of Inspections
  – Inspection Size {Z}
  – Delta Size
  – Resources (Moderator, Author, Librarian, Recorder, Inspector, Reader, Other)
• Output
  – Item Out
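A minimal sketch of what this sub-module's parameter set could look like in code; the field names mirror the inputs listed above, while the default values and the resources encoding are illustrative assumptions, not calibrated data.

    # Sketch: initialization parameters for the inspection model. Field names
    # follow the slide; defaults are illustrative, not calibrated values.
    from dataclasses import dataclass, field

    @dataclass
    class InitParams:
        initial_defects: int = 100              # {X}
        detection_effectiveness: float = 0.6    # {Y}
        correction_effectiveness: float = 0.9
        num_inspections: int = 4
        inspection_size: int = 250              # {Z}, e.g., lines per inspection
        delta_size: int = 0
        resources: dict = field(default_factory=lambda: {
            "Moderator": 1, "Author": 1, "Librarian": 1,
            "Recorder": 1, "Inspector": 3, "Reader": 1, "Other": 0,
        })

    params = InitParams()
    print(params.detection_effectiveness)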


Fagan Inspection Sub-Module

• Calculate duration and number of defects found and removed (a code sketch follows below)
• Inputs
  – InspectedItem
  – OverviewIn, ThirdHourIn
• Outputs
  – FaganInspectionDurationOut
  – OverviewDurationOut
  – PlanningDurationOut
  – PreparationDurationOut
  – InspectionDurationOut
  – ThirdHourDurationOut
  – ReworkDurationOut
  – FollowUpDurationOut
  – MinorDefectFoundOut
  – MajorDefectFoundOut
  – DefectRemovalOut
  – ItemOut
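A minimal sketch of the kind of calculation this sub-module performs, turning the initialization parameters into defect counts and a total duration. The per-phase efforts, the major/minor split, and the rework cost per defect are illustrative assumptions, not Fagan's published figures.

    # Sketch: one Fagan inspection pass. Phase efforts (hours) and the
    # major/minor split are illustrative assumptions, not calibrated data.
    def fagan_inspection(defects_in, detection_eff, correction_eff):
        found = round(defects_in * detection_eff)
        major = round(found * 0.3)           # assumed 30% major defects
        minor = found - major
        removed = round(found * correction_eff)

        phases = {                           # assumed per-phase effort, hours
            "planning": 2, "overview": 1, "preparation": 4,
            "inspection": 2, "third_hour": 1,
            "rework": 0.5 * found, "follow_up": 1,
        }
        duration = sum(phases.values())
        return {"major_found": major, "minor_found": minor,
                "removed": removed, "duration_hours": duration}

    print(fagan_inspection(defects_in=100, detection_eff=0.6, correction_eff=0.9))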


Preliminary Results

• Captures Numeric & Graphical Simulation Results
• Inputs
  – Selected Intermediate and Final Module Values
• Outputs
  – Duration for Each Activity
  – Number of Major Defects Found
  – Number of Minor Defects Found
  – Number of Defects Removed
  – Minimum & Maximum Number of Days Expended
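A sketch, under assumed values, of how the minimum and maximum days expended could be collected across stochastic replications of the model; the toy duration formula and the replication count are illustrative only.

    # Sketch: aggregating simulation replications into the summary outputs
    # listed above. The duration model and replication count are assumptions.
    import random, statistics

    def replicate_duration(rng):
        """Toy per-run duration (days) with stochastic rework."""
        return 10 + rng.gauss(0, 1.5) + rng.expovariate(1 / 2.0)

    rng = random.Random(7)
    runs = [replicate_duration(rng) for _ in range(500)]

    print(f"min days expended: {min(runs):.1f}")
    print(f"max days expended: {max(runs):.1f}")
    print(f"mean days:         {statistics.mean(runs):.1f}")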


What We Need for Empirical Studies in the Software Engineering Domain

Process simulation experiments:
• capture and replicate the variables of the real-world environment
• variable variances are isolated and documented
• variables are studied in isolation or in combination to isolate and document "critical mass" effects
• the cost to replicate multiple real-world environments and evaluate across projects and organizations is much less than for field studies, longitudinal case studies, or controlled experiments
• we can replicate other empirical studies and evaluate the applicability and generalizability of their results


Thank You!

Nancy S. Eickelmann, PhD
Motorola Labs
1303 E. Algonquin Rd. Annex-2
Schaumburg, IL 60196
Phone: (847) 310-0785
Fax: (847) 576-3280
Nancy.Eickelmann@motorola.com