Software Metrics – Overview
by Sirisha N
Objectives
• Understand metrics
• Applying Metrics to drive Testing Projects
• Data sources and means of capturing metrics
Agenda
• Operational definitions
• Introduction
– Why Measure
– Purpose
– Metrics in ISO, CMM & CMMI
– Basic Definitions
– Institutionalize Metrics Program
– What to Measure
– Data Collection Strategy
• Metrics based Project Management
Purpose
• To plan projects (estimation based on past data)
• To control the project's process
– Taking corrective and preventive action in a timely manner
– Monitoring the project goals set by the client/organization
• To provide feedback on performance of projects
• To improve the organization process/tools/methods
• To identify the training needs based on project performance
Metrics in ISO, CMM & CMMI
ISO 9001:2000
• Section 8 – Measurement, Analysis & Improvement
CMM
• Level 4 KPAs – Quantitative Process Management
– Software Quality Management
CMMI
• Level 2 PA – Measurement and Analysis
• Level 4 PAs – Quantitative Project Management
– Organizational Process Performance
Basic Definitions
Metric
"A quantitative measure of the degree to which a system, component, or process possesses a given attribute" – IEEE
e.g., no. of defects per KLOC/FP, no. of TCs authored per hour, defects per person-hour
Measurement
"The act of determining a measure of something"
Measure
"A single quantitative attribute of an entity – the basic building block for a measurement"
e.g., 100 LOC: "100" is the measure, "LOC" is the unit of measure
Institutionalize Metrics Program
CMMI Level 2 – Measurement and Analysis (PA 5)
SG 1 Align Measurement and Analysis Activities
– SP 1.1 Establish Measurement Objectives
– SP 1.2 Specify Measures
– SP 1.3 Specify Data Collection and Storage Procedures
– SP 1.4 Specify Analysis Procedures
SG 2 Provide Measurement Results
– SP 2.1 Collect Measurement Data
– SP 2.2 Analyze Measurement Data
– SP 2.3 Store Data and Results
– SP 2.4 Communicate Results
[Flowchart: the measurement-driven improvement loop]
1. Clarify business goals
2. Prioritize issues
3. Select & define measures
4. Collect, verify & store data
5. Analyze process behavior
6. Stable? If not, remove assignable causes and re-analyze
7. Capable? If not, change the process and re-analyze
8. If stable and capable, continually improve
9. On new issues, new measures, or new goals, loop back to the corresponding step
What to Measure?
Process Metrics: metrics used to control the processes in a software system
(e.g., Productivity, Efficiency)
Product Metrics: metrics that quantify attributes of the product itself
(e.g., size, defect density; not within the scope of EQA)
Project Metrics: metrics used to control the project life cycle
(e.g., Effort Variation, Schedule Variation)
Quality Metrics: metrics used to control quality in a product or service
(e.g., CSI, % TCs modified)
…What to Measure?
Software Test Metrics
Product Metrics:
1. Size Variation  2. Defect Density  3. Code Coverage  4. MTBF
Process Metrics:
1. TCA Productivity  2. TCR Productivity  3. TCE Productivity  4. Test Case Challenged %
Project Metrics:
1. Effort Variation  2. Schedule Variation  3. Schedule Compliance  4. Staff Utilization
Quality Metrics:
1. Adhoc Bug %  2. Challenged Bug %  3. Rejected Bug %  4. Customer Satisfaction Index
Data Collection Strategy
Source artifacts and the data they capture (consolidated into PROJECT DATA COLLECTION EQA 2.0 D1.XLS and PROJECT WBS EQA 1.0 D1.XLS):
• WBS (.xls) / Time Sheet (.xls): Resource Name, Project Name, Build Name, Planned Tasks, Unplanned Tasks, Time Spent; Planned Effort, Actual Effort, Planned/Interim/Actual Start and Finish Dates
• Estimation Sheet (.xlt) / Estimation Methodology: Estimated Size, Estimated Effort, Estimated Resource Count
• DTS (.xls) – testing defects and customer-identified defects (CID): Defect ID, Description, Source/Location, Identified Date/By, Defect Type/Class, Detected-in Phase, Injected-in Phase, Defect Severity, Defect Status
• RL (.xls) – review errors & defects: Error/Defect ID, Description, Source/Location, Identified Date/By, Error/Defect Status
• Test Report (.xls): Test Case ID, Executed By, Execution Date, Test Procedure, Expected Results, Actual Results, Execution Status, Defect Description
• CID Report (.xls): Defect Description, Identified By, Identified Date, Defect Type, Defect Priority
• Guidelines, Templates
Derived metrics: 1. Effort Variation 2. Schedule Variation 3. TCA Productivity 4. TCR Productivity 5. TCE Productivity 6. Challenged TC % 7. Adhoc Bug % 8. Challenged Bug % 9. Rejected Bug %
Operational Definitions
For each metric, define:
– Objective
– Definition
– Formula Used
– Unit of Measure
– Key Note
– Data Input
– Data Source
– Responsibility
– Frequency
A. Metrics Operational Definitions:
Metric | UOM | Attribute/Entity | Definition | Life Cycle | Base Measure | Measurement Method
B. Decision Criteria:
Data Pattern | Reporting Format | Data Extraction Cycle | Reporting Cycle | Distribution | Availability
C. Data Collection Procedure:
Data Item | Data Type | Database Record | Data Elements/Fields | Who Collects the Data | Data Collection Rules & Procedures
Effort Variation
Objective
– To improve estimation and productivity
Definition
– Effort variation is the % deviation of the actual effort spent on a project/phase/activity from the estimated effort
Formula used
– Effort Variation = ((Actual Effort – Estimated Effort) / Estimated Effort) x 100%
Unit of Measure
– %
Effort Variation
Key Note
– If the figure is negative, less effort was spent than estimated
– If the figure is positive, more effort was spent than estimated
– Can be used as a multiplication factor on the estimated effort to arrive at a more realistic figure
Data Input
– Activity code/Activity name
– Actual Effort (PH)
– Estimated Effort (PH)
– Estimation methodology
Other Inputs
– Effort type (Requirement Analysis / Test Design / Test Execution)
– Product/Build/Module/Phase
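The effort-variation formula above can be sketched as a small helper; the function name and the sample figures are illustrative, not part of the metrics program itself.

```python
def effort_variation(actual_effort, estimated_effort):
    """Effort Variation (%) = (Actual Effort - Estimated Effort) / Estimated Effort x 100.

    Effort figures are in person-hours (PH); estimated effort must be positive.
    """
    if estimated_effort <= 0:
        raise ValueError("estimated effort must be positive")
    return (actual_effort - estimated_effort) * 100.0 / estimated_effort

# 120 actual PH against 100 estimated PH: positive result, so more effort was spent
print(effort_variation(120, 100))  # 20.0
```

Per the key note, the result can feed back into estimation: multiplying a new estimate by (1 + variation/100) adjusts it toward historically observed effort.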
Effort Variation
Effort can be derived from size, if the productivity factor is known:
Effort (PH) = Size (# TC) / Productivity (TCA, TCR, or TCE per hour)
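A minimal sketch of the size-to-effort conversion, assuming productivity is expressed directly in test cases per hour so effort is simply size divided by productivity (a scale factor such as x1000 would apply only if size were counted in thousands of units, e.g. KLOC); the function name is illustrative.

```python
def effort_from_size(size_tc, productivity_tc_per_hr):
    """Effort (PH) = Size (# TC) / Productivity (TC per hour)."""
    if productivity_tc_per_hr <= 0:
        raise ValueError("productivity must be positive")
    return size_tc / productivity_tc_per_hr

# 300 test cases at an execution productivity of 6 TC/hr -> 50 person-hours
print(effort_from_size(300, 6.0))  # 50.0
```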
Data Collection Sheet for Effort Variation – Phase Wise
Columns: Activity Code | Product | Build | Modules | Phase | Activity | Planned Effort (person-hrs) | Actual Effort (person-hrs) | % Variation
Artifact: Project WBS EQA 1.0 D1.xls
Effort Variation
Responsibility
– PL/TL (EQA) will provide estimated effort data
– Actual effort data per activity will be provided by each staff member via the time sheet
Frequency
– Estimated effort will be sent to SEPG at the time of estimation and effort distribution (includes re-estimates)
– Actual effort data will be sent to SEPG on a weekly basis (time sheet)
Data Source
– Estimated Effort: Proposal.doc, Contract.doc, Test Effort Estimation Sheet.doc
– Actual Effort: Project Data Collection.xls, Project WBS EQA.xls, Time Sheet.xls
Schedule Variation
Objective
– To identify the current status of the project and check whether the project can meet the schedule deadline
Definition
– Schedule variation is the % deviation of the actual duration from the planned duration of a project, phase, or activity
Formula used
– Schedule Variation = ((Actual Finish – Planned Finish) / ((Planned Finish – Planned Start) + 1)) x 100%
Unit of Measure
– %
Schedule Variation
Key Note
– If the figure is positive, the schedule has slipped past the target duration
– If the figure is negative, the project is ahead of the target duration
– Can be used as a multiplication factor to estimate a realistic schedule for each milestone
Data Input
– Activity code/Activity name
– Plan start & finish dates
– Planned Duration (PD)
– Actual start & finish dates
– Actual Duration (AD)
– % complete
Other Inputs
– Task type (Technical/Project, Planned/Unplanned)
– Product/Build/Module/Phase
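The schedule-variation formula lends itself to a small date-based helper; the function name and sample dates are illustrative. The +1 in the denominator makes the planned duration an inclusive count of calendar days.

```python
from datetime import date

def schedule_variation(planned_start, planned_finish, actual_finish):
    """Schedule Variation (%) = (Actual Finish - Planned Finish)
    / ((Planned Finish - Planned Start) + 1) x 100, in calendar days."""
    planned_duration = (planned_finish - planned_start).days + 1  # inclusive days
    slip = (actual_finish - planned_finish).days  # positive means late
    return slip * 100.0 / planned_duration

# Planned 1-10 March (10 calendar days), actually finished 12 March:
# a 2-day slip over a 10-day plan gives +20% (schedule slipped)
print(schedule_variation(date(2024, 3, 1), date(2024, 3, 10), date(2024, 3, 12)))
```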
Schedule Variation
Data Collection Sheet for Schedule Variation – Phase Wise
Columns: Activity Code | Product | Build | Modules | Phase | Activity | Plan Duration (cal. days) | Plan Start Date | Plan Finish Date | Actual Duration (cal. days) | Actual Start Date | Actual Finish Date | % Variation | % Complete
Artifact: Project WBS EQA 1.0 D1.xls
Schedule Variation
Responsibility
– PL/TL (EQA) will provide the planned schedule data
Frequency
– Schedule data will be reported to SEPG at WBS completion and whenever re-scheduled during the PLC phases
– Final schedule data will be sent at project closure
Data Source
– Planned Duration: WBS Sheet.mpp, Test Plan.doc
– Actual Duration: Project Data Collection.xls, Project WBS EQA.xls
Productivity
Objective
– To find out the productivity of a project
Definition
– The size of the task completed per hour of effort
Formula used
– (TCA/TCR/TCE) Productivity = Actual Size (# TC) / Effort (PH)
Unit of Measure
– # TCA/PH, # TCR/PH, # TCE/PH
Productivity
Key Note
– No. of test steps shall also be taken into account
– In the case of test scripts, lines of script (LOS) shall be considered
– In the case of back-end scripting, LOS generated by the tool shall be counted
Data Input
– # TC Authored
– # TC Reviewed
– # TC Executed
– TCA Effort spent
– TCR Effort spent
– TCE Effort spent
Other Inputs
– Date
– Resource ID
– Product/Build/Module
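The productivity formula is a straightforward ratio; this sketch (illustrative names and figures) applies equally to authoring (TCA), review (TCR), and execution (TCE).

```python
def productivity(test_cases, effort_hours):
    """(TCA/TCR/TCE) Productivity = # test cases / person-hours of effort."""
    if effort_hours <= 0:
        raise ValueError("effort must be positive")
    return test_cases / effort_hours

# 48 test cases authored in 16 person-hours -> 3.0 TCA/PH
print(productivity(48, 16))  # 3.0
```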
Productivity
Data Collection Sheet for Test Case Productivity
Columns: Date | Resource ID | Product | Build | Module | # TC Authored | TC Authoring Effort | # TC Reviewed | TC Reviewing Effort | # TC Executed | TC Execution Effort
Artifact: Project Data Collection EQA 2.0 D1.xls
Productivity
Responsibility
– PL/TL will provide the size and effort data (design/documentation/manual testing)
– The size-capturing tool will provide size data for coding/automated testing
Frequency
– Productivity data will be sent to SEPG on a weekly basis (every Friday)
– Final size and effort details are reported at every phase milestone
Data Source
– Effort data: Project Data Collection EQA 2.0 D1.xls
– Size data: Project Data Collection EQA 2.0 D1.xls, size-capturing tools
Adhoc/Challenged/Rejected Bug %
Objective
– To effectively identify and report bugs early in the product
Definition
– The % of adhoc/challenged/rejected bugs relative to the total number of bugs identified in the product
Formula used
– Adhoc/Challenged/Rejected Bug % = (# of Adhoc/Challenged/Rejected bugs / Total # of bugs found) x 100%
Unit of Measure
– %
Key Note
– Challenged bugs are bugs challenged by the client and later accepted
– Adhoc bugs are identified during adhoc/exploratory testing
Data Input
– Bugs by testing type
– Total bugs posted
– # Enhancement bugs
– # Challenged bugs
– # Redundant bugs
– # Invalid bugs
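The bug-percentage formula can be sketched as follows; the function name, the zero-total convention, and the sample counts are illustrative assumptions.

```python
def bug_percentage(category_bugs, total_bugs):
    """Adhoc/Challenged/Rejected Bug % = category bugs / total bugs x 100."""
    if total_bugs == 0:
        return 0.0  # no bugs found, so no percentage to report
    return category_bugs * 100.0 / total_bugs

# 5 challenged bugs out of 40 total posted -> 12.5%
print(bug_percentage(5, 40))  # 12.5
```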
Adhoc/Challenged/Rejected Bug %
Other Inputs
– Date
– Resource ID
– Product/Build/Module
– Severity (Cr/H/M/L)
Data Collection Sheet for Bug Details
Columns: Date | Resource ID | Product | Build | Module | Bugs by Testing Type | # Bugs Posted | # Enhancements | # Challenged Bugs | # Redundant Bugs | # Invalid Bugs
Artifact: Project Data Collection EQA 2.0 D1.xls
Responsibility
– PL/TL will provide the bug data
Frequency
– Bug data will be sent to SEPG on a weekly basis (every Friday)
– Final defect details are reported at project closure
Data Source
– Bug data: Project Data Collection EQA 2.0 D1.xls
Process Capability Baseline
Is the Process Stable/Capable?
– Variation brings inconsistency into a process
– Variation arises from either chance causes or assignable causes
– 80% of the variation is caused by 20% of the causes
– Removing assignable causes brings a stable process
– However, a stable process may not be capable!
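One common way to operationalize the stable-vs-capable distinction is an individuals (XmR) control chart: points outside the control limits signal assignable causes. A minimal sketch, where 2.66 is the standard XmR constant for individuals charts and the data and function name are illustrative:

```python
def xmr_stability(values, k=2.66):
    """Individuals (XmR) control-chart check.

    Limits are mean +/- k x average moving range (k = 2.66 for individuals
    charts). Points outside the limits indicate assignable causes; a process
    with none is stable, but stability alone does not prove capability.
    """
    if len(values) < 2:
        raise ValueError("need at least two observations")
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    lcl, ucl = mean - k * mr_bar, mean + k * mr_bar
    outliers = [v for v in values if v < lcl or v > ucl]
    return len(outliers) == 0, (lcl, ucl), outliers

# Weekly defect-density samples: a lone spike falls outside the limits,
# so the process is not yet stable and the assignable cause must be removed
stable, limits, outliers = xmr_stability([10, 11, 9, 10, 30])
```

Capability is judged separately, by comparing the control limits of a stable process against the client's specification limits.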
Metrics based Project Mgmt.
Q & A
Thank You