
Report on: Workshop on Process Modeling Technologies

Lee Osterweil

University of Massachusetts

Sue Koolmanojwong

USC-CSSE

Problem: How to Sort Through a Profusion of Approaches to Software Process?

Monday’s Workshop

• Presentation of four approaches

• Discussion of how to sort through the alternatives

Phase 1: Presentation of Four Different Technologies

• Eclipse Process Framework Composer (EPFC) / Rational Method Composer

• System Dynamics

• Object Petri-Nets

• Little-JIL

Modeling Software Engineering Processes using Eclipse Process Framework Composer (EPFC) / Rational Method Composer (RMC)

Molly Phongpaibul, Sue Koolmanojwong

March 17, 2008

Who Uses EPFC/RMC?

• Professional desires: simplicity, templates, examples, guidance

• Process Author produces: base methods, plug-ins

• Management requires: realistic consistency, viable governance, improved ROI

• Service Provider provides: training, consulting, mentoring, adoption services

• Tool Provider wants to: build tools, sell tools, sell services

• Academia needs: teachable material, to teach process development, use in student projects, bringing research to the mainstream

• Process Coach performs: tailoring, publishing, support, training

Source: www.eclipse.org/epf

Process Representation

[Screenshots not reproduced: process elements representation; form-based editor]

One key advantage: Scalability

• The method content repository contains approximately:
– 100s of work products
– 30-50 roles
– 1,000+ tasks
– Around 100 delivery processes

[Diagram: layered method ecosystem, from open-source practices through RUP and commercial extensions to company proprietary extensions and IBM Global Services content]

Other Advantages

• Reusability

• Compatibility

• Universality

Modeling System and Software Engineering Processes with System Dynamics

Ray Madachy

USC Center for Systems and Software Engineering

madachy@usc.edu

Annual Research Review, March 17, 2008

System Dynamics Notation

• A system is represented by x'(t) = f(x, p)

• x: vector of levels (state variables); p: set of parameters

• Legend: level, rate, auxiliary variable, source/sink, information link

• Example system: [Diagram: a defect flow model in which the defect generation rate fills a level of defects; the defect detection rate, governed by an auxiliary defect detection efficiency, drains it into detected defects, and the defect escape rate drains it into undetected defects] (a minimal simulation sketch follows)
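To make the level/rate notation concrete, here is a minimal simulation sketch of the example system above, using Euler integration of x'(t) = f(x, p). The parameter values (generation rate, detection efficiency, escape fraction) are illustrative assumptions, not numbers from the talk.

```python
# Minimal sketch of the example defect-flow system; parameter values are
# illustrative, not from the talk. Levels are integrated with Euler's
# method: x(t + dt) = x(t) + dt * f(x, p).

DT = 0.5           # time step (months)
HORIZON = 24.0     # simulation horizon (months)

# Levels (state variables)
defects = 0.0      # defects currently in the system
detected = 0.0     # cumulative detected defects
undetected = 0.0   # cumulative escaped (undetected) defects

# Parameters (assumed values)
generation_rate = 10.0       # defects generated per month
detection_efficiency = 0.7   # auxiliary variable: fraction found per month
escape_fraction = 0.1        # fraction escaping per month

t = 0.0
while t < HORIZON:
    # Rates (flows) computed from levels and parameters
    detection_rate = detection_efficiency * defects
    escape_rate = escape_fraction * defects
    # Euler integration of the levels
    defects += DT * (generation_rate - detection_rate - escape_rate)
    detected += DT * detection_rate
    undetected += DT * escape_rate
    t += DT

print(f"detected={detected:.1f}, undetected={undetected:.1f}, in system={defects:.1f}")
```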

Dynamic ODC COQUALMO Portion

• Portion for Requirements Completeness defects only:

[Model diagram: parallel flow structures for requirements timing, interface, correctness, completeness, consistency, and ambiguity/testability defects, plus code timing defects. Each defect type has a generation rate shaped by a generation elaboration function and buildup parameter, and a detection rate shaped by a detection elaboration function, detection buildup parameter, and detection start time. Detection is driven by a composite detection efficiency that combines the Automated Analysis, Peer Reviews, and Execution Testing and Tools profiles; flows also depend on SLOC, requirements defect multipliers (driven by Analyst Capability), defect fractions, and schedule and effort at completion.]
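One common way COQUALMO-style models combine the three defect-finding profiles into a composite efficiency is multiplicatively: each profile removes a fraction of whatever the others miss. A minimal sketch with assumed, illustrative efficiency values (not the model run shown in the talk):

```python
# Hypothetical per-profile completeness-defect detection efficiencies
# (illustrative values only).
automated_analysis = 0.30
peer_reviews = 0.55
execution_testing_and_tools = 0.60

# Composite efficiency: each profile catches a fraction of what the
# previous ones missed, so the miss fractions multiply.
composite = 1.0 - ((1.0 - automated_analysis)
                   * (1.0 - peer_reviews)
                   * (1.0 - execution_testing_and_tools))
print(f"composite detection efficiency = {composite:.3f}")  # 0.874
```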

Dynamic ODC COQUALMO Sample Outputs

• Example of applying increased V&V for Execution Testing and Tools at 18 months [output graphs not reproduced]

Some Advantages

• Rests on established, respected work (Jay Forrester, 1950s)

• Is a macro-level approach

• Can address the highest level issues

• Yields nice analytic answers

System Diagram

[diagram not reproduced]

Modeling Value-Based Process with Object Petri-Nets

LiGuo Huang (lghuang@engr.smu.edu)

Department of Computer Science & Engineering

Southern Methodist University

March 17, 2008

VBSQA-OPN System Net – VBSQA Process Framework

[System net diagram: starting from mission objectives and stages and project SCS classes and business cases, SCS define acceptable and desired values for Q-attributes; risk analysis and architecture/technology evaluation follow, along with identifying conflicting Q-attributes and performing tradeoff analysis, leading to the deliverables: system top-level design and FRD. If some architecture/technology combination CAN satisfy all Q-attribute requirements, the project proceeds through cost/benefit analysis to launch; if it CANNOT, SCS adjust acceptable values for Q-attributes or the project is terminated or redefined. The LCO Review (exit criteria: provide at least one feasible architecture) either passes or triggers LCO phase rework or extra work.]

Acronyms: SCS = Success-Critical Stakeholders; LCO = Life Cycle Objective; FRD = Feasibility Rationale Description; Q- = Quality.

Legend (System Net): synchronous transitions, status transitions
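For readers unfamiliar with Petri-net execution semantics, the following minimal sketch shows the basic token game underlying nets like this one: a transition is enabled when every input place holds a token, and firing consumes input tokens and produces output tokens. The place and transition names are hypothetical; this is not the VBSQA-OPN model itself.

```python
# Minimal Petri-net sketch (illustrative; not the actual VBSQA-OPN model).
# Marking: tokens per place. A transition fires when all inputs are marked.

marking = {"design_ready": 1, "lco_review_done": 0, "lco_passed": 0}

transitions = {
    # name: (input places, output places)
    "hold_lco_review": (["design_ready"], ["lco_review_done"]),
    "pass_lco": (["lco_review_done"], ["lco_passed"]),
}

def enabled(name):
    inputs, _ = transitions[name]
    return all(marking[p] > 0 for p in inputs)

def fire(name):
    inputs, outputs = transitions[name]
    assert enabled(name), f"{name} is not enabled"
    for p in inputs:
        marking[p] -= 1
    for p in outputs:
        marking[p] += 1

fire("hold_lco_review")
fire("pass_lco")
print(marking)  # {'design_ready': 0, 'lco_review_done': 0, 'lco_passed': 1}
```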


VBSQA-OPN Object Nets – Stakeholders' Process Instances

[Two object-net diagrams: the Developers' object net and the System Acquirer's object net. The Developers' net runs from mission objectives and stages and project SCS classes and business cases through requirement elicitation meetings, external prototype evaluation, identification of conflicting Q-attributes with tradeoff analysis, and peer review of the system top-level design and other documents (deliverables: system top-level design and FRD); it then acquires system upgrade requirements and estimates the upgrade schedule/cost with a DMR results chain, returning to stakeholder renegotiation if the schedule/cost is not accepted. The System Acquirer's net issues the project bidding, verifies the system upgrade schedule/cost and DMR results chain, and launches the project, or terminates/redefines it after stakeholder renegotiation. Both nets pass through the LCO Review (exit criteria: provide at least one feasible architecture); an LCO failure triggers LCO phase rework or extra work.]

Legend (Object Net): object-autonomous transitions, synchronous transitions, status transitions


VBSQA Process Generator – Based on VBSQA-OPN Model

• VBSQA Process Creator

• VBSQA Process Checker

• VBSQA Process Simulator

• A mapping between ERP software development activities and the VBSQA process framework

• Simulation results


VBSQA Process Simulator – ROI of Synchronous Stakeholder Interaction Activities

DIMS Top-Priority Q-attributes: Performance, Evolvability

Some Advantages

• Petri nets have interesting, well-defined properties

• Coordination of different views
– Separation of concerns

• Graphical notation

• Particularly useful for concurrency

The Little-JIL Process Definition Language

Leon J. Osterweil (ljo@cs.umass.edu)

Laboratory for Advanced Software Engineering Research

University of Massachusetts

USC Center for Systems and Software Engineering

17 March 2008

The “Step” is the central Little-JIL abstraction

[Step diagram: a step (TheStepName) with an interface badge (parameters, resources, agent), a prerequisite badge and a postrequisite badge, substep sequencing, exception handlers (each with an exception type and a continuation), and artifact flows.]
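A rough textual rendering of the step abstraction, as a hypothetical data structure: Little-JIL itself is a visual language, so these names and fields simply mirror the badges described above, not any official serialization.

```python
# Hypothetical textual rendering of a Little-JIL step; the real language is
# visual, so these names and fields only mirror the structure described above.
from dataclasses import dataclass, field

@dataclass
class Handler:
    exception_type: str    # which exception this handler catches
    continuation: str      # how the step proceeds afterwards, e.g. "restart"
    handler_step: str      # name of the step that does the handling

@dataclass
class Step:
    name: str
    # Interface badge: parameters, resources, and the agent that performs it
    parameters: list = field(default_factory=list)
    resources: list = field(default_factory=list)
    agent: str = ""
    prerequisite: str = ""     # prerequisite badge (checked before the step)
    postrequisite: str = ""    # postrequisite badge (checked after the step)
    sequencing: str = "sequential"   # substep sequencing badge
    substeps: list = field(default_factory=list)
    handlers: list = field(default_factory=list)

# A hypothetical elaboration of a design step
design = Step(
    name="ElaborateDesign",
    parameters=["requirements_spec"],
    resources=["design_tool"],
    agent="designer",
    postrequisite="design_is_consistent",
    substeps=[Step(name="DraftDesign"), Step(name="ReviewDesign")],
    handlers=[Handler("ReviewFailed", "restart", "ReviseDesign")],
)
print(design.name, [s.name for s in design.substeps])
```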

Trivial Example: Elaboration of a Design Step

[example diagram not reproduced]

Little-JIL Environment Architecture

[Architecture diagram: a process programmer uses various editors to produce the process definition, resources definition, and artifacts definition. The execution engine (Juliette) maintains a coordination structure and, via an agenda manager, the agendas through which agents perform work; it draws on a resource repository and an artifact repository, with a user interface manager in front. Analyzers include the FLAVERS property analyzer, a simulator, and a fault tree analyzer.]

Some Advantages

• Broad semantics

• Precise semantics

• Analysis

• Growing toolset

Phase 2: What to Make of All of This?

• Which is good for what?

• What are we missing?

• What needs are not covered?

• Can we compare and contrast?

• Can we combine best features?

"A Classification and Comparison Framework for Software Architecture Description Languages" by Medvidovic and Taylor as a model?

• Comparison of Software Architecture technologies

• Technologies are rows

• Features are columns

• Lots of work to fill in the entries


Can we do something like that for process modeling technologies?

A Possible Approach

• What should we be doing?
– What goals do stakeholders have?
– Columns of a matrix (??)

• What are we currently doing?
– What do we say our goals are?

• What are we really doing?
– What do our technologies address?
– Rows of a matrix (??) (see the sketch after this list)
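A lightweight way to prototype such a matrix is sketched below. The rows are the four technologies from the workshop, but the capability columns and the two filled-in entries are placeholders for illustration, not workshop findings.

```python
# Sketch of a technology-by-capability matrix; capability columns and the
# filled-in entries are placeholders, not workshop conclusions.
technologies = ["EPFC/RMC", "System Dynamics", "Object Petri-Nets", "Little-JIL"]
capabilities = ["Analysis", "Automation", "Scalability", "Teachability"]

# matrix[technology][capability] -> a judgment still to be filled in
matrix = {t: {c: "?" for c in capabilities} for t in technologies}
matrix["System Dynamics"]["Analysis"] = "quantitative, macro-level"  # placeholder
matrix["Little-JIL"]["Analysis"] = "precise, static"                 # placeholder

for tech in technologies:
    print(f"{tech:18}", matrix[tech])
```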

Process Stakeholders

• Process performer
• Process engineer
• Manager
• Customer
• End user
• Educator/trainer
• Tool provider
• Researcher
• Union representative
• Regulator
• Standardizer (e.g. OMG)
• Domain-specific stakeholder
• …… MORE?

Stakeholder Goals for Process Technology

• Ambiguity tolerance
• Analysis
• Automation
• Compliance
• Composability
• Cost effectiveness / save money
• Coverage
• Efficiency
• Evolvability
• Implementability / doable
• Interchangeability
• Learnability
• Maintainability
• Manager's satisfaction
• Marketability
• Minimum cost of the product
• No job loss
• Non-interference (with other standards)
• Optimal time of the product / speed
• Precision
• Prepare negotiation stance
• Process analysis
• Process management
• Profit
• Purpose fulfillment
• Quality
• Reasoning
• Reinvention
• Repeatability
• Reusability
• Risk mitigation
• Satisfiability
• Satisfy high-value stakeholders
• Scalability
• Tailorability
• Teachability
• Understandability
• Usability
• Verifiability / conformance
• Work rule

MORE

Goals the Technologies Seem to Be Addressing

• Comprehension
• Coordination
• Automation
• Education and training
• Continuous improvement
• Deep understanding
• Planning and control
• Reinvention
• Strategic management
• Communication
• Standardization
• Analysis
• Risk mitigation
• Agility

These don’t match the previous list very well.

First Attempt to Structure and Organize the Goals

• Survey some example stakeholder communities

• Top level goals

• Some decomposition into subgoals

Goals for Researchers, Educators

• Understanding, comprehension
– Education
– Training
– Dissemination
– Radical reinvention

• Improvement
– Of workforce
• More, better workers
• Better management
• Better resource allocation
– Of process itself
• Faster, better, cheaper
– Of product
• More, better "ilities" in the product

Possible Research Roadmap

• Refine and organize list of goals
• Turn it into a list of desired capabilities
– The columns of a matrix
• Identify a list of technologies
– The four presented here are only a start
– Some come from other disciplines
• E.g. business process, workflow, service architecture
• Study which technologies do what well
• Identify gaps in coverage
• Suggest syntheses

Something for CSSE to Lead?
