Harold van Heeringen,
ISBSG president
Overview
Benchmarking
Software Project Industry
Functional Size Measurement
Software Metrics
Historical Data: ISBSG
Project Benchmark Example
Organization Benchmark
Other uses for Benchmarking
Country report – Italy
Data submissions Italy
Benchmarking (wikipedia)
Benchmarking is the process of comparing one's business processes
and performance metrics to industry bests or best practices from
other industries.
Benchmarking is used to measure performance using a
specific indicator (cost per unit of measure, productivity per unit
of measure, cycle time of x per unit of measure, or defects per
unit of measure), resulting in a metric of performance that is then
compared to others.
This then allows organizations to develop plans on how to make
improvements or adapt specific best practices, usually with the
aim of increasing some aspect of performance. Benchmarking
may be a one-off event, but is often treated as a continuous
process in which organizations continually seek to improve their
practices.
Where are we now?
“Even the most detailed navigation map of an area is useless if you don’t know where you are”
Informed decisions
Senior management of IT departments/organizations
need to make decisions
based on ‘where they are’ and ‘where they want to go’.
Benchmarking is about determining ‘where you are’
compared to relevant peers, in order to make
informed decisions.
But how to measure and determine where you are?
Software project industry
Low ‘performance metrics’ maturity
Few performance measurement processes implemented
Few benchmarking processes implemented
Most organizations don’t know how good or how bad
they are at delivering or maintaining software.
These organizations are not able to assess their
competitive position, nor able to make informed
strategic decisions to improve their competitive
position.
But…
Best in Class organizations deliver software up to 30
times more productively than Worst in Class organizations
High productivity, high quality
More functionality for the users against lower costs – value
Shorter time to market – competitive advantage!
Worst in Class organizations will find themselves in
trouble in an increasingly competitive market
Outperformed by the competition
Internal IT departments get outsourced
Commercial software houses fail to win new contracts
Important to know where you stand!
Benchmark is essential!
Difficulty – low industry maturity
How to measure metrics like productivity, quality,
time-to-market in such a way that a meaningful
comparison is possible?
Comparing apples to apples
Software is not easy to compare
Functional Size Measurement
Function Point Analysis (NESMA, IFPUG or COSMIC)
Measure the functional user requirements – size in function points;
ISO standards – objective, independent, verifiable, repeatable;
Strong relation between functional size and project effort needed.
What to do with the results?
Project effort/duration/cost estimation
Benchmarking/performance measurement
Use in Request for Proposal management (answer price/FP questions)
What about historical data?
Company data (preferably for estimation)
Industry data (necessary for external benchmarking)
Unit of Measure (UoM)
Why are Function Points the best UoM to use in benchmarking?
Functionality is of value for the client/business. More functionality means
more value. More lines of code (technical size) is not necessarily of value.
Function Points are measured independently of technical requirements:
500 FP of functionality implemented in a Java SOA architecture
= 500 FP of functionality implemented in Cobol on a mainframe
Function Points are measured independently of implementation method:
500 FP delivered in an agile development project
= 500 FP delivered in a COTS package implementation
Software metrics – some examples
Productivity
Productivity Rate: # function points per staff month
PDR (Project Delivery Rate): # effort hours per function point
Quality
Defect Density: # defects delivered per 1000 function points
Time to Market
Speed: # function points delivered per calendar month
Performance measurement
Measure the size of completed projects
Project size in Function Points
Product size in Function Points
Collect and analyze the data
Effort hours, duration, defects
Normalize the data when necessary
Store the data in the corporate database
Benchmark the project, internally and externally
Report metrics and trends
Different reports for different stakeholders
Depending on goals of the stakeholder
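The record that ends up in the corporate database can be sketched as a small Python structure (field names here are illustrative assumptions, not an ISBSG or NESMA standard):

```python
# A minimal sketch of a project record in the corporate measurement database.
# Field names are illustrative, not a standardized schema.
from dataclasses import dataclass

@dataclass
class ProjectMeasurement:
    name: str
    size_fp: int            # functional size in function points
    effort_hours: float
    duration_months: float
    defects: int            # defects found after delivery

    @property
    def pdr(self) -> float:
        """Project Delivery Rate: effort hours per function point."""
        return self.effort_hours / self.size_fp

# The 'corporate database' reduced to a list for illustration
repository = [ProjectMeasurement("Project X", 411, 5730, 11, 23)]
print(f"{repository[0].pdr:.1f} h/FP")  # prints 13.9
```

From such records, the internal and external benchmarks and the stakeholder reports can all be derived.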
External benchmark
How to benchmark your performance externally?
Gartner / McKinsey / other?
Very expensive!
No insight into the data used!
Do-it-yourself benchmarking:
Low cost, more feeling for the data, decide yourself which
peer groups are most relevant, decide yourself which
data is relevant!
Historical data of completed projects!!
International Software Benchmarking Standards Group (ISBSG)
Independent and not-for-profit
Full Members are non-profit organizations, like DASMA, IFPUG, FiSMA, QESP and NESMA. GUFPI-ISMA is now an associate member.
Historical data: ISBSG repositories
Grows and exploits two repositories of software data:
New development projects and enhancements (> 6000 projects)
Maintenance and support (> 1000 applications)
Everybody can submit project data
Data Collection Questionnaire (DCQ) on the site
Completely anonymous
Free benchmark report in return
Mission: “To improve the management of IT resources by both business and government, through the provision and exploitation of public repositories of software engineering knowledge that are standardized, verified, recent and representative of current technologies”.
ISBSG
All ISBSG data is:
validated and rated in accordance with its quality guidelines
current
representative of the industry
independent and trusted
captured from a range of organization sizes and industries
Website and portal
Example project benchmark
Project X was completed, the following data was collected:
Primary programming language: Java
Effort hours spent: 5730
Duration: 11 months
Defects found after delivery: 23
The functional size of the project was measured: 411 FP
Software metrics:
Project Delivery Rate: 5730/411 = 13,9 h/FP
Project Speed: 411/11 = 37,4 FP per calendar month
Defect Density: (23/411) *1000 = 56,0 defects/1000 FP
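The three metrics for Project X can be recomputed directly; this small Python sketch uses the numbers from the slide (the slide writes decimals with commas, the code with points):

```python
# Recomputing the example project metrics from the slide's data
effort_hours = 5730
size_fp = 411
duration_months = 11
defects = 23

pdr = effort_hours / size_fp               # Project Delivery Rate, h/FP
speed = size_fp / duration_months          # FP per calendar month
defect_density = defects / size_fp * 1000  # defects per 1000 FP

print(f"PDR: {pdr:.1f} h/FP")                               # 13.9
print(f"Speed: {speed:.1f} FP/month")                       # 37.4
print(f"Defect density: {defect_density:.1f} per 1000 FP")  # 56.0
```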
Example benchmark
ISBSG ‘New Developments & Enhancements’
Select the right ‘peer group’:
Data Quality A or B
Count approach: IFPUG 4.x or NESMA
Primary Programming Language = ‘Java’
300 FP < Project Size < 500 FP
Results project benchmark

                 PDR (h/FP)   Speed (FP/month)   Defect Density (/1000 FP)
N                488          428                154
Minimum          0,1          9,4                0,0
Percentile 10    2,5          23,1               0,0
Percentile 25    4,7          32,5               0,0
Median           9,8          53,8               3,7
Percentile 75    18,4         95,4               17,9
Percentile 90    28,9         130,2              40,1
Maximum          621,3        476,0              366,5
Average          15,2         70,9               18,6

Project Delivery Rate: 5730/411 = 13,9 h/FP
Project Speed: 411/11 = 37,4 FP per calendar month
Defect Density: (23/411) * 1000 = 56,0 defects/1000 FP
This project was carried out less productively and more slowly
than the market average, and its quality is worse than average.
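Placing a project against the peer-group percentiles can be sketched as follows; the `band` helper is illustrative, not an ISBSG tool, and the thresholds are the PDR column of the results table (written with decimal points):

```python
# Find which percentile band a project metric falls into,
# given ascending (label, threshold) pairs from a benchmark table.
import bisect

def band(value, percentiles):
    labels = [p[0] for p in percentiles]
    thresholds = [p[1] for p in percentiles]
    i = bisect.bisect_left(thresholds, value)
    return labels[i] if i < len(labels) else "above P90"

# PDR percentiles for the Java peer group (from the table above)
pdr_bands = [("<=P10", 2.5), ("<=P25", 4.7), ("<=median", 9.8),
             ("<=P75", 18.4), ("<=P90", 28.9)]

# Project X: 13.9 h/FP lands between the median and P75
# (for PDR, higher means less productive)
print(band(13.9, pdr_bands))  # prints "<=P75"
```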
Organization benchmark
[Three trend charts for Organization Y, <2009 through 2012: Productivity Index, Speed Index and Quality Index, each plotted against a target line (baseline +50%) and an industry lower bound (baseline -40% for productivity and speed, -50% for quality).]
Analysis:
Until 2010, the organization was improving
After 2010/2011, the trends go the wrong way
Recommendation: find the cause and draw up an improvement plan
Other uses for ISBSG data
Vendor selection, based on productivity, speed or
quality metrics, compared to the industry.
Definition of SLAs (or other KPIs) based on
industry-average performance.
Establish a baseline from which to measure future
improvement.
Explain to the client/business that a project was
carried out in a ‘better-than-average’ way, while the
client may perceive otherwise.
Analysis of the data
Analyze the difference in productivity or quality
between two (or more) types of projects:
Traditional vs. Agile
Outsourced vs. In-house
Government vs. Non-government
One site vs. multi-site
Reuse vs. no reuse
Etcetera.
ISBSG Special Analysis reports
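Such a comparison boils down to grouping projects by type and comparing a metric per group; a minimal stdlib sketch (the field names and sample values are illustrative, not real ISBSG data):

```python
# Compare median PDR between project types in a benchmark data set.
# Field names and values are made up for illustration.
from statistics import median

projects = [
    {"approach": "Traditional", "pdr": 14.2},
    {"approach": "Agile", "pdr": 8.5},
    {"approach": "Agile", "pdr": 10.1},
    {"approach": "Traditional", "pdr": 12.0},
]

by_type = {}
for p in projects:
    by_type.setdefault(p["approach"], []).append(p["pdr"])

for approach, pdrs in sorted(by_type.items()):
    # Lower PDR (hours per function point) means higher productivity
    print(f"{approach}: median PDR {median(pdrs):.1f} h/FP")
```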
Traditional vs. Agile
Agile productivity: 10 – 20% gain after 1 year of adoption
Agile cost: 20 – 40% lower after 1 year of adoption
Agile time-to-market: 10 – 60% less
Agile quality (post-production defects): 2 – 8% higher
10 Take Aways from the Reifer Agile Report (www.isbsg.com)
Special reports
Impact of Software Size on Productivity
Government and Non-Government Software Project Performance
ISBSG Software Industry Performance report
The Performance of Business Application, Real-Time and Component Software Projects
Estimates – How accurate are they?
Planning Projects – Role Percentages
Team size impact on productivity
Manage your M&S environment – what to expect?
Many more
Country report
Italy (IFPUG) – latest project: 2005
Italy (COSMIC) – latest project: 2010
Government vs. Non-government
Role percentages
Everybody wants to use data
But nobody wants to submit data… Why not?
Is it hard?
Is there a risk?
Is the reward not big enough?
Are there any factors preventing you?
We need data!
Why not try it? You’ll get a free benchmark report and 100 portal credits in return!!
� WWW.ISBSG.ORG
GUFPI-ISMA Event offer
GUFPI-ISMA members always get a 10% discount, using the code provided earlier this year.
Slide 33 – van Heeringen, 15-11-2013
Thank you!
Netherlands Software Metrics Users Association
To celebrate our anniversary, NESMA will organise the IWSM Mensura conference
October 6-8, 2014
2014.iwsm-mensura.org
Welcome to Rotterdam
Historical landmarks
Modern landmarks
European Cultural Capital 2001
The city of modern architecture
Largest soccer stadium in the Netherlands
Busiest port in Europe
2014.iwsm-mensura.org
I hope to see you next year
October 6-8, 2014