
Performance Measurement for Software Organizations


Page 1: Performance Measurement for Software Organizations


Carnegie Mellon University, Software Engineering Institute

DFAS Software Symposium, August 25, 1999

Dave Zubrow, Software Engineering Institute

Sponsored by the US Department of Defense

Performance Measurement for Software Organizations

Page 2: Performance Measurement for Software Organizations


Bring Me A Rock!!!!!!!

“We need a measurement program. Get one started.”

“We don’t have time to define our goals. We have products to deliver.”

“We collect a lot of data. We just never seem to use it.”

Page 3: Performance Measurement for Software Organizations


Outline

Some questions about performance measurement:

• What is performance measurement?
• Who is the audience/consumer of performance information?
• How are the data for performance measurement produced?
• What do the results mean?

A process for defining performance measures

Page 4: Performance Measurement for Software Organizations


What is IT Performance Measurement?

Quantitative characterization of an organization’s accomplishment of some aspect of its goals, with a focus on the contribution of IT.

• quantitative - need something more discriminating than success/failure, yes/no

• organization - focus is on the organization or enterprise view, not a specific project or program

• aspect - performance is multidimensional, what to measure is not obvious

• goals - for measurement to be meaningful, we need a reference point for comparison and judgement

• contribution of IT - attribution of organizational performance to IT performance

Page 5: Performance Measurement for Software Organizations


Performance Management Framework

Source: National Academy of Public Administration, “Information Management Performance Measures,” January, 1996.

Page 6: Performance Measurement for Software Organizations


Relating IT Performance to the Business Strategy

[Figure: business strategy elements map to the software unit’s potential contributions; the high-priority contributions are then tied to metrics:]

• User interfaces: usability rating, user training availability
• Reliability: mean time to failure, system availability, defect density
• Predictability: on-time delivery, cost variance
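The figure’s mapping can be kept as a simple lookup from each high-priority contribution to the metrics that track it. A minimal sketch in Python; the dictionary and helper function are illustrative assumptions, with names taken from the figure:

```python
# Map each software-unit contribution to the metrics that track it
# (names from the figure; the structure itself is an assumption).
CONTRIBUTION_METRICS = {
    "user interfaces": ["usability rating", "user training availability"],
    "reliability": ["mean time to failure", "system availability",
                    "defect density"],
    "predictability": ["on-time delivery", "cost variance"],
}

def metrics_for(high_priority_contributions):
    """Select the metrics to collect for the prioritized contributions."""
    return {c: CONTRIBUTION_METRICS[c] for c in high_priority_contributions}

print(metrics_for(["reliability", "predictability"]))
```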

Page 7: Performance Measurement for Software Organizations


Who is the audience?

We all want the business to succeed, but do we all speak … the same language?

[Figure: different audiences use different vocabularies: revenue, market share, and customer satisfaction on the business side; mean time to failure, defect density, inspections, and key process areas on the technical side.]

Page 8: Performance Measurement for Software Organizations


Business Manager’s Attention

[Figure: performance pyramid, from top to bottom:]

• Vision
• Business Units: market, financial
• Business Operating Systems: customer satisfaction, flexibility, productivity
• Departments and Work Centers: quality, delivery, cycle time, waste

Adapted from Cross & Lynch, “Measure Up: Yardsticks for Continuous Improvement.”

Page 9: Performance Measurement for Software Organizations


Technical View

[Figure: the same performance pyramid; the technical view concentrates on the bottom tier - the quality, delivery, cycle time, and waste measures of departments and work centers - beneath the business units’ market and financial measures and the operating systems’ customer satisfaction, flexibility, and productivity measures.]

Page 10: Performance Measurement for Software Organizations


The Audiences and Their InterestsSenior management - for strategic decisions

• business managers• IT managers

Improvement team - to implement improvements and know how well they are doing

• IPTs• IT process action teams

Customers - to evaluate suppliers and understand their capability

Page 11: Performance Measurement for Software Organizations

11

Carnegie Mellon UniversitySoftware Engineering Institute

How are the data produced?

[Figure: the performance pyramid once more. Data are generated at the bottom, by work processes and transactions; as they move up the pyramid they become information used to assess performance and guide improvement.]

Page 12: Performance Measurement for Software Organizations


What do the results mean?

Possible interpretations:

• Accomplished a goal - has the goal been met?
• Progress towards a goal - are trends moving in the right direction according to schedule?
• Impending threat - can signal risk of not meeting a future goal
• Guidance for improvement - what should we look at as an opportunity for improvement?
• Competitive position - ranking or performance relative to competitors

It depends on the goal and strategy.

Page 13: Performance Measurement for Software Organizations


A Process for Measuring the Performance of Information Technology

• Follow an IT results chain
• Use a balanced scorecard
• Target measures at decision-making tiers
• Build a measurement and analysis infrastructure
• Strengthen IT processes to improve mission performance

Source: “Executive Guide: Measuring Performance and Demonstrating Results of Information Technology Investments,” US General Accounting Office, March 1998.

Page 14: Performance Measurement for Software Organizations


Business goals define the need. The process provides the opportunity. Alignment is the key.

[Figure: goal-driven measurement chain:]

• Business Goals - “What do I want to achieve?”
• Subgoals - “To do this, I will need to…”
• Mental Model - <The Process> receives, holds, and produces entities; each entity has attributes
• Measurement Goals - “What do I want to know?”
• Questions (Q1, Q2, Q3) refine the measurement goals (G1, G2)
• Indicators (I1, I2, I3, I4) answer the questions
• Measures (M1, M2, M3) supply the data behind the indicators
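This chain can be captured as a small data model so that every raw measure remains traceable to a business goal. A minimal sketch, assuming Python dataclasses; the class names and example entries are illustrative, not from the slide:

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str                          # a raw data item, e.g., a defect count

@dataclass
class Indicator:
    name: str
    measures: list[Measure] = field(default_factory=list)

@dataclass
class Question:
    text: str
    indicators: list[Indicator] = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    questions: list[Question] = field(default_factory=list)

# G1 -> Q1 -> I1 -> {M1, M2}, mirroring the tree in the figure
g1 = Goal("Improve quality by a factor of 5 over 5 years", [
    Question("Is delivered quality improving?", [
        Indicator("defect density at UAT",
                  [Measure("defects found in UAT"), Measure("size of release")]),
    ]),
])

# Walking the chain top-down shows which raw measures a goal depends on.
for q in g1.questions:
    for i in q.indicators:
        print(q.text, "->", i.name, "->", [m.name for m in i.measures])
```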

Page 15: Performance Measurement for Software Organizations


A Balanced Perspective on Performance

• Customer - How do customers see us?
• Financial - How do we look to shareholders?
• Innovation and Learning - Can we continue to improve and create value?
• Internal Business - What must we excel at?

Watch out for masked trade-offs and unintended consequences: can improvement in one area be made without sacrificing another?

See Kaplan and Norton, “The Balanced Scorecard - Measures that Drive Performance,” Harvard Business Review, Jan/Feb 1992.

Page 16: Performance Measurement for Software Organizations


A Balanced Scorecard Example

Financial Perspective
• Return on Capital Employed
• Cash Flow
• Project Profitability
• Profit Forecast Reliability
• Sales Backlog

Customer Perspective
• Pricing Index (Tier II customers)
• Customer Ranking Survey
• Customer Satisfaction Index
• Market Share (business segment, Tier I customers, key accounts)

Internal Business Perspective
• Hours with Customers on New Work
• Tender Success Rate
• Rework
• Safety Incident Index
• Project Performance Index
• Project Closeout Cycle

Innovation and Learning Perspective
• % Revenue from New Services
• Rate of Improvement Index
• Staff Attitude Survey
• # of Employee Suggestions
• Revenue per Employee

Source: Kaplan and Norton, “Putting the Balanced Scorecard to Work,” Harvard Business Review, Sept-Oct 1993.
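A sketch of how the example scorecard might be kept as data, so a report can confirm that every perspective is populated (the “develop as a set” criterion later in the talk). The measure names are copied from the slide; the structure itself is an illustrative assumption:

```python
# Scorecard perspectives and their measures, from the example above.
SCORECARD = {
    "financial": ["return on capital employed", "cash flow",
                  "project profitability", "profit forecast reliability",
                  "sales backlog"],
    "customer": ["pricing index", "customer ranking survey",
                 "customer satisfaction index", "market share"],
    "internal business": ["hours with customers on new work",
                          "tender success rate", "rework",
                          "safety incident index",
                          "project performance index",
                          "project closeout cycle"],
    "innovation and learning": ["% revenue from new services",
                                "rate of improvement index",
                                "staff attitude survey",
                                "# of employee suggestions",
                                "revenue per employee"],
}

# A scorecard is balanced only if no perspective is left empty.
assert all(SCORECARD.values()), "every perspective needs at least one measure"
for perspective, measures in SCORECARD.items():
    print(f"{perspective}: {len(measures)} measures")
```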

Page 17: Performance Measurement for Software Organizations


Goal-Driven Process Steps

Measurement workshop:

Step 1: Identify your business goals.
Step 2: Identify what you want to learn or know about the goals.
Step 3: Identify your subgoals.
Step 4: Identify questions, entities, and attributes.
Step 5: Formalize measurement goals.
Step 6: Define indicators.
Step 7: Identify data elements (e.g., size, defects), recording each element’s availability (+, 0, -) and source (e.g., QA, CM).
Step 8: Define the data elements (definition checklists).
Step 9: Complete the planning task matrix, marking with Y/N which data elements (1-5) each planning task (task 1 … task n) requires (see the sketch below).
Step 10: Implementation (post-workshop): analysis and diagnosis, action plans, verify tasks.
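Steps 7 and 9 together answer a practical question: which planning tasks are blocked because a required data element is not yet available? A minimal sketch, with hypothetical task and element names:

```python
# Step 7 output: is each data element available today?
AVAILABILITY = {"size": True, "defects": False}

# Step 9 output: which data elements each planning task requires.
MATRIX = {
    "report defect density": ["size", "defects"],
    "report cycle time": ["size"],
}

for task, elements in MATRIX.items():
    missing = [e for e in elements if not AVAILABILITY.get(e, False)]
    status = "ready" if not missing else "blocked on: " + ", ".join(missing)
    print(f"{task}: {status}")
```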

Page 18: Performance Measurement for Software Organizations


Operational Definitions: Project Phases

[Figure: project phase timeline with key dates:]

• Phases: Initiation, Definition, Design, Build, Verification, Implementation
• Activities along the timeline: feasibility study, alternative analysis, functional specification, design, code & unit test, integration test, UAT, deployment
• Key dates: start and end times for each phase; the project start date and end date (ship date)
• Project estimation: effort & schedule estimate
• Each item carries a definition checklist so that every project records it the same way
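With start and end dates operationally defined, phase durations can be computed consistently across projects. A small sketch, assuming Python’s datetime; the phase names come from the slide, the dates are made up:

```python
from datetime import date

# Operationally defined (start, end) dates for each phase; illustrative only.
phases = {
    "Initiation": (date(1999, 1, 4), date(1999, 1, 29)),
    "Definition": (date(1999, 2, 1), date(1999, 3, 12)),
}

for name, (start, end) in phases.items():
    print(f"{name}: {(end - start).days} calendar days")
```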

Page 19: Performance Measurement for Software Organizations


Characteristics of the Measures

Mutually exclusive
• measure a different dimension with each measure

Exhaustive
• outcomes, outputs, inputs, process
• balanced scorecard

Valid
• the measures logically relate to their corresponding indicator or use

Reliable
• the same performance would result in the same measurement

Interval scale
• need variability to distinguish performance levels

Page 20: Performance Measurement for Software Organizations


Defining Performance Measures

Document the why, what, who, when, where, and how.

[Figure: an indicator template is filled in for each measure. Template fields:]

• Objective
• Questions
• Visual Display (the example shows planned vs. actual trouble reports over weeks 1-10, with a “now” marker)
• Interpretation
• Evolution
• Assumptions
• X-reference
• Probing Questions
• Input(s)
• Data Elements
• Responsibility for Reporting
• Form(s)
• Algorithm

Measures: Defects, Cost of Quality, Schedule Predictability, Effort Predictability, Cycle Time, Maintenance Effort, Project Mix, Customer Satisfaction
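The template fields map directly onto a record type, so every measure’s definition is documented the same way. A minimal sketch, assuming a Python dataclass; the field names follow the template, the class itself is an illustrative assumption:

```python
# One IndicatorTemplate instance would be completed for each of the
# measures listed above (defects, cost of quality, and so on).
from dataclasses import dataclass

@dataclass
class IndicatorTemplate:
    objective: str                     # why the indicator exists
    questions: list[str]               # what it should answer
    visual_display: str                # e.g., planned vs. actual by week
    interpretation: str                # how to read the display
    evolution: str                     # how the indicator may change over time
    assumptions: str
    x_reference: str                   # related indicators
    probing_questions: list[str]
    inputs: list[str]
    data_elements: list[str]
    responsibility_for_reporting: str  # who reports it
    forms: list[str]                   # where the data are recorded
    algorithm: str                     # how the value is computed
```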

Page 21: Performance Measurement for Software Organizations


Criteria for Evaluating Performance Measures - 1

Are we measuring the right thing?
• improvement in performance of mission
• improvement in performance of goals and objectives
• value added by IT organization
• ROI, costs, savings

Based on strategy and objectives
• not what’s convenient and “lying around”
• relevant and important

Source: National Academy of Public Administration, “Information Management Performance Measures,” January, 1996.

Page 22: Performance Measurement for Software Organizations


Criteria for Evaluating Performance Measures - 2

Do we have the right measures?
• measures of results rather than inputs or outputs
• linked to specific and critical processes
• understood by their audience and users; effective in prompting action
• credible and possible to communicate effectively
• accurate, reliable, valid, verifiable, cost-effective, timely

Develop as a set
• don’t rely on a single indicator
• will trade-offs in performance be detected?

Source: National Academy of Public Administration, “Information Management Performance Measures,” January, 1996.

Page 23: Performance Measurement for Software Organizations


Criteria for Evaluating Performance Measures - 3

Are the measures used in the right way?
• strategic planning
• guide prioritization of program initiatives
• resource allocation decisions
• day-to-day management
• communicate results to stakeholders

Source: National Academy of Public Administration, “Information Management Performance Measures,” January, 1996.

Page 24: Performance Measurement for Software Organizations


Example: Process Improvement Goals

Internal Processes
• increase productivity by a factor of 2 over 5 years
• reduce development time by 40% over 5 years
• improve quality by a factor of 5 over 5 years
• reduce maintenance effort by 40% over 5 years

Customer Satisfaction
• improve predictability to within 10% over 5 years
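Worth noting what these multi-year targets imply per year: if the improvement compounds annually (an assumption; the slide does not say how the gains accrue), the required annual factor is the fifth root of the five-year factor. A quick check:

```python
def annual_rate(total_factor: float, years: int) -> float:
    """Compound annual improvement factor needed to reach total_factor."""
    return total_factor ** (1 / years)

print(f"{annual_rate(2, 5):.3f}")  # productivity x2 over 5 years -> ~1.149 per year
print(f"{annual_rate(5, 5):.3f}")  # quality x5 over 5 years -> ~1.380 per year
```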

Page 25: Performance Measurement for Software Organizations


Enterprise Metrics

[Figure: a dashboard of enterprise metrics, each panel plotted by quarter for 1996-1997 and, where relevant, broken out by project size (large, medium, small):]

• Schedule Predictability - percent deviation (0%-100%)
• Effort Predictability - percent deviation (0%-100%)
• Cycle Time - calendar days per size unit
• Quality - defect density at UAT; number of high-priority field defects
• Customer Satisfaction - satisfaction index, for the implemented solution and for the working relationship
• Cost of Quality - effort hours per size unit, split into rework, appraisal, prevention, and performance, with separate charts for large, medium, and small projects
• Maintenance Effort - percent of technical staff-hours spent on non-discretionary maintenance, enhancement, and defect correction; open trouble reports
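To make one panel concrete: the cost-of-quality charts report effort hours per size unit by category. A minimal sketch, treating rework, appraisal, and prevention as the cost of quality (an assumption consistent with the usual definition; the numbers are made up):

```python
# Effort hours by category for one quarter; illustrative values only.
effort_hours = {"rework": 18, "appraisal": 12, "prevention": 6, "performance": 52}
size_units = 4.0

coq_categories = ("rework", "appraisal", "prevention")
coq = sum(effort_hours[c] for c in coq_categories) / size_units
total = sum(effort_hours.values()) / size_units

print(f"cost of quality: {coq:.1f} effort hours per size unit")
print(f"share of total effort: {coq / total:.0%}")
```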

Page 26: Performance Measurement for Software Organizations


Example Output: Schedule Predictability

The objective is to understand schedule predictability: the date by which user acceptance test (UAT) was planned to be complete is compared with the actual date the UAT was completed, relative to the start date of coding. The percentage deviation in schedule for each project size category is calculated as follows:

Percent Deviation = |Actual Ship Date - Planned Ship Date| / (Planned Ship Date - Start Date of Coding) * 100

Monitored over a period of time, a downward trend in this metric indicates improving predictability; an upward trend indicates that the organization is losing its ability to predict schedules for the completion of projects.
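The formula can be computed directly from the three dates. A minimal sketch; the example dates are made up:

```python
from datetime import date

def percent_deviation(code_start: date, planned_ship: date,
                      actual_ship: date) -> float:
    """|actual - planned| slip as a percentage of the planned duration."""
    slip_days = abs((actual_ship - planned_ship).days)
    planned_days = (planned_ship - code_start).days
    return 100.0 * slip_days / planned_days

# A project planned to ship May 31 that actually shipped July 12:
print(f"{percent_deviation(date(1996, 1, 8), date(1996, 5, 31), date(1996, 7, 12)):.1f}%")  # 29.2%
```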

Data for illustrative purposes only.

Month    Large      Medium     Small
Jun-96              79.54%     82.00%
Jul-96   654.57%    196.96%    45.00%
Aug-96   243.55%    125.79%    14.00%
Sep-96   595.39%    149.08%    34.00%
Oct-96   117.24%    225.00%    36.00%
Nov-96   11.11%     43.69%     20.00%
Dec-96 through May-97: no data shown

Page 27: Performance Measurement for Software Organizations


Software Measurement Process: ISO 15939 (draft)

[Figure: the core measurement process consists of four activities - Establish Capability, Plan, Perform, and Evaluate - supported by an experience base. Information needs from the organization’s technical and management processes drive the measurement approach and the measurement plan; performing the process yields analysis results and performance measures, which feed improvement actions and user feedback back to those processes. The scope of the standard covers this core measurement process.]

Page 28: Performance Measurement for Software Organizations


Summary

IT cannot do this alone
• requires business goals
• requires a customer life-cycle perspective
• business and IT managers must agree on the priority areas to which IT contributes

Alignment of measures is key

Action must result from the information

Real improvement can only be gauged by multiple measures

Page 29: Performance Measurement for Software Organizations


For more information

SEI and SEMA
• http://www.sei.cmu.edu
• http://www.sei.cmu.edu/sema

Performance Measurement
• http://www.itpolicy.gsa.gov/mkm/pathways/pp03link.htm
• http://www.dtic.mil/c3i/c3ia/itprmhome.html