Software complexity estimation by Adam Bondarowicz


cocomo

"COnstructive COst MOdel"

COCOMO is a model designed by Barry Boehm to give an estimate of the number of man-months it will take to develop a software product.

cocomo

COCOMO consists of a hierarchy of three increasingly detailed and accurate forms.

Basic COCOMO - a static, single-valued model that computes software development effort (and cost) as a function of program size expressed in estimated lines of code.

Intermediate COCOMO - computes software development effort as a function of program size and a set of "cost drivers" that include subjective assessment of product, hardware, personnel and project attributes.

Detailed COCOMO - incorporates all characteristics of the intermediate version with an assessment of the cost driver's impact on each step (analysis, design, etc.) of the software engineering process.

basic cocomo

Used for:

Organic projects - relatively small, simple software projects in which small teams with good application experience work to a set of less than rigid requirements.

Semi-detached projects - intermediate (in size and complexity) software projects in which teams with mixed experience levels must meet a mix of rigid and less than rigid requirements.

Embedded projects - software projects that must be developed within a set of tight hardware, software, and operational constraints.

basic COCOMO equations

E = a_b * (KLOC)^b_b

D = c_b * (E)^d_b

P = E / D

E is the effort applied in person-months
D is the development time in chronological months
KLOC is the estimated number of delivered lines of code for the project (expressed in thousands)

cocomo coefficients a_b, b_b, c_b and d_b

Software project    a_b   b_b    c_b   d_b
Organic             2.4   1.05   2.5   0.38
Semi-detached       3.0   1.12   2.5   0.35
Embedded            3.6   1.20   2.5   0.32
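A minimal Python sketch of the basic model, assuming nothing beyond the equations and coefficient table above (the function names are ours; the 33.2 KLOC input matches the worked example later in the deck):

COEFFICIENTS = {
    # project type: (a_b, b_b, c_b, d_b) from the table above
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, project_type="organic"):
    """Return (effort E in person-months, duration D in months, staffing P)."""
    a, b, c, d = COEFFICIENTS[project_type]
    effort = a * kloc ** b        # E = a_b * (KLOC)^b_b
    duration = c * effort ** d    # D = c_b * (E)^d_b
    people = effort / duration    # P = E / D
    return effort, duration, people

# Example: a 33.2 KLOC organic project gives roughly
# E = 95 person-months, D = 14 months, P = 7 people.
print(basic_cocomo(33.2, "organic"))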

Basic cocomo summary

Basic COCOMO is good for quick, early, rough order of magnitude estimates of software costs, but its accuracy is limited because of its lack of factors to account for differences in hardware constraints, personnel quality and experience, use of modern tools and techniques, and other project attributes known to have a significant influence on software costs.

Extended cocomo

The basic model is extended to consider a set of "cost driver attributes" that can be grouped into four major categories:

1. Product attributes
a. required software reliability
b. size of application data base
c. complexity of the product

2. Hardware attributes
a. run-time performance constraints
b. memory constraints
c. volatility of the virtual machine environment
d. required turnaround time

3. Personnel attributes
a. analyst capability
b. software engineer capability
c. applications experience
d. virtual machine experience
e. programming language experience

4. Project attributes
a. use of software tools
b. application of software engineering methods
c. required development schedule

Each of the 15 attributes is rated on a 6-point scale that ranges from "very low" to "extra high" (in importance or value).

Based on the rating, an effort multiplier is determined from tables published by Boehm [BOE81], and the product of all effort multipliers is the effort adjustment factor (EAF).

Typical values for EAF range from 0.9 to 1.4.

intermediate COCOMO equation

E = a_i * (KLOC)^b_i * EAF

E is the effort applied in person-months
KLOC is the estimated number of delivered lines of code for the project (expressed in thousands)

Intermediate cocomo coefficients

Software project    a_i   b_i
Organic             3.2   1.05
Semi-detached       3.0   1.12
Embedded            2.8   1.20
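A sketch of the intermediate equation above in the same style. The coefficients come from this table; the EAF value in the example is a made-up illustration (in practice it is the product of the 15 effort multipliers):

COEFFICIENTS_I = {
    # project type: (a_i, b_i) from the table above
    "organic":       (3.2, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (2.8, 1.20),
}

def intermediate_cocomo(kloc, eaf, project_type="organic"):
    """Effort in person-months: E = a_i * (KLOC)^b_i * EAF."""
    a, b = COEFFICIENTS_I[project_type]
    return a * kloc ** b * eaf

# Example with an assumed EAF of 1.10 (slightly harder than a nominal project):
print(intermediate_cocomo(33.2, 1.10, "organic"))   # about 139 person-months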

Example

Using the LOC estimate and the coefficients noted in the table, we use the basic model to get:

E = 2.4 * (KLOC)^1.05 = 2.4 * (33.2)^1.05 = 95 person-months

Cocomo II

COCOMO II is a model that allows one to estimate the cost, effort, and schedule when planning a new software development activity. It consists of three submodels, each one offering increased fidelity the further along one is in the project planning and design process.

Compared to COCOMO I

COCOMO II is tuned to modern software life cycles.

The original COCOMO model has been very successful, but it doesn't apply to newer software development practices as well as it does to traditional practices. COCOMO II targets the software projects of the 1990s and 2000s, and will continue to evolve over the next few years.

COCOMO II is really three different models:

The Application Composition Model
Suitable for projects built with modern GUI-builder tools. Based on new Object Points.

The Early Design Model
You can use this model to get rough estimates of a project's cost and duration before you've determined its entire architecture. It uses a small set of new Cost Drivers, and new estimating equations. Based on Unadjusted Function Points or KSLOC.

The Post-Architecture Model
This is the most detailed COCOMO II model. You'll use it after you've developed your project's overall architecture. It has new cost drivers, new line counting rules, and new equations.

PM = A * (KSLOC)^B * Π(i=1..17) EM_i

B = 1.01 + Σ(j=1..5) SF_j

- A is a constant
- KSLOC is thousands of source lines of code
- EM are effort multipliers, parameters that affect effort by the same amount regardless of project size
- SF are scale factors, parameters that have a large influence on big projects and a small influence on small projects

COCOMO II Parameters

EM example: Application Experience

Criteria   < 2 months   6 months   1 year    3 years   6 years
Rating     Very Low     Low        Nominal   High      Very High
Value      1.22         1.10       1.00      0.88      0.81

SF example: Process Maturity

Criteria   CMM 1 Lower   CMM 1 Upper   CMM 2     CMM 3   CMM 4       CMM 5
Rating     Very Low      Low           Nominal   High    Very High   Extra High
Value      0.78          0.62          0.47      0.31    0.16        0.00
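A sketch of the effort equation two slides back. The constant A is not given in the deck, so the value below is only a placeholder; one effort multiplier (application experience rated High, 0.88) is taken from the table above, while the remaining multipliers and the scale-factor values are made-up illustrations:

from math import prod

def cocomo_ii_effort(ksloc, effort_multipliers, scale_factors, a=2.5):
    """PM = A * (KSLOC)^B * product(EM_i), with B = 1.01 + sum(SF_j).
    `a` is a placeholder calibration constant, not the published one."""
    b = 1.01 + sum(scale_factors)
    return a * ksloc ** b * prod(effort_multipliers)

# 16 nominal multipliers (1.0) plus High application experience (0.88),
# and five assumed scale-factor contributions summing to 0.10:
ems = [1.0] * 16 + [0.88]
sfs = [0.03, 0.02, 0.02, 0.02, 0.01]
print(cocomo_ii_effort(33.2, ems, sfs))   # about 107 person-months with these made-up inputs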

8 cocomo II uses

- software development approach
- budget decisions
- production trade-offs
- IT capital planning
- investment options
- management decisions
- prioritizing projects
- SPI strategy

6 cocomo II Model Objectives

- accuracy
- customization
- model ease of use
- usefulness
- resource manager
- modifiability

Use Case Points Method

Use Case Points Method

The Use Case Points Method (UCPM) is an effort estimation algorithm proposed by Gustav Karner that employs Use Cases as a representation of system complexity based on system functionality.

Method summary

• Identify, classify and weight actors
• Identify, classify and weight use cases
• Identify and weight Technical Factors
• Identify and weight Environmental Factors
• Calculate Adjusted Use Case Points
• Convert Points into Time

Identify, classify and weight actors

Actors are classified as either people or other systems. Each identified actor is given a weighting from 1-3 that corresponds to simple, average, and complex. Human actors are always classified as complex and receive a weighting of 3. Systems to which the new system will interface (legacy systems) are either simple or average depending on the mechanism by which they are addressed.

E.g.:
2 simple * 1 = 2
2 average * 2 = 4
3 complex * 3 = 9
Total actor weight = 2 + 4 + 9 = 15

Actor type   Definition                                  Factor
Simple       Program interface                           1
Average      Interactive, or protocol-driven interface   2
Complex      Graphical interface (Human)                 3

Identify, classify and weight use cases

E.g.:
5 simple * 5 = 25
4 average * 10 = 40
0 complex * 15 = 0
Total use case weight = 25 + 40 + 0 = 65

The Total actor weight and the Total use case weight are then summed to produce the Unadjusted Use Case Points (UUCP) score.

15 + 65 = 80
UUCP = 80

Use case type   Definition                                           Factor
Simple          3 or fewer transactions, or < 5 analysis classes     5
Average         4 to 7 transactions, or 5-10 analysis classes        10
Complex         More than 7 transactions, or > 10 analysis classes   15
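A sketch of the weighting steps so far, using the actor and use-case weights from the two tables above and the counts from the running example (function names are ours):

ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}

def unadjusted_ucp(actor_counts, use_case_counts):
    """UUCP = total actor weight + total use case weight."""
    actor_total = sum(ACTOR_WEIGHTS[k] * n for k, n in actor_counts.items())
    use_case_total = sum(USE_CASE_WEIGHTS[k] * n for k, n in use_case_counts.items())
    return actor_total + use_case_total

# Example from the slides: actors weigh 15, use cases weigh 65, so UUCP = 80.
print(unadjusted_ucp({"simple": 2, "average": 2, "complex": 3},
                     {"simple": 5, "average": 4, "complex": 0}))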

Identify and weight Technical Factors

E.g.:
TFactor = sum of the Weight * Value column
TFactor = 30
Technical Complexity Factor (TCF) = 0.6 + (0.01 * TFactor)
TCF = 0.9

Technical Factor   Description                                     Weight   Value   Weight * Value
T1                 System will be distributed (released)           2        0       0
T2                 Performance objectives                          1        3       3
T3                 End-user efficiency                             1        5       5
T4                 Complex internal processing                     1        1       1
T5                 Code must be reused                             1        0       0
T6                 Easy to install                                 0.5      5       2.5
T7                 Easy to use                                     0.5      5       2.5
T8                 Portable                                        2        0       0
T9                 Easy to change                                  1        3       3
T10                Concurrent                                      1        5       5
T11                Includes special security features              1        3       3
T12                Provides direct access for third parties        1        5       5
T13                Special user training facilities are required   1        0       0

Identify and weight Environmental Factors

E.g.:
EF-Factor = sum of the Weight * Value column
EF-Factor = 16.5
Environmental Complexity Factor (ECF) = 1.4 + (-0.03 * EF-Factor)
ECF = 0.905

Environmental Factor   Description                      Weight   Value   Weight * Value
EF1                    Familiar with RUP                1.5      1       1.5
EF2                    Application experience           0.5      1       0.5
EF3                    Object-oriented experience       1        1       1
EF4                    Lead analyst capability          0.5      5       2.5
EF5                    Motivation                       1        5       5
EF6                    Stable requirements              2        5       10
EF7                    Part-time workers                -1       0       0
EF8                    Difficult programming language   -2       2       -4

Calculate Adjusted Use Case Points

Finally, Use Case Points are calculated using this formula:

UCP = UUCP * TCF * ECF

E.g.:
UCP = 80 * 0.9 * 0.905
UCP = 65.16 (65)

Converting Points into Time

It is recommended to convert each UCP to 20-28 hours.
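A sketch pulling together the remaining steps from the preceding slides: the TCF and ECF formulas, the UCP formula, and the hours-per-point rule of thumb. The function names are ours; the inputs reproduce the running example:

def tcf(tfactor):
    """Technical Complexity Factor: TCF = 0.6 + 0.01 * TFactor."""
    return 0.6 + 0.01 * tfactor

def ecf(ef_factor):
    """Environmental Complexity Factor: ECF = 1.4 + (-0.03 * EF-Factor)."""
    return 1.4 + (-0.03 * ef_factor)

def use_case_points(uucp, tfactor, ef_factor):
    """UCP = UUCP * TCF * ECF."""
    return uucp * tcf(tfactor) * ecf(ef_factor)

def effort_hours(ucp, hours_per_ucp=20.0):
    """Convert UCP to effort using the recommended 20-28 hours per point."""
    return ucp * hours_per_ucp

# Running example: UUCP = 80, TFactor = 30, EF-Factor = 16.5
points = use_case_points(80, 30, 16.5)   # 80 * 0.9 * 0.905 = 65.16
print(points, effort_hours(points))      # about 65 UCP, about 1300 hours at 20 h/UCP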

DELPHI

The Delphi technique is a method for obtaining forecasts from a panel of independent experts over two or more rounds. Experts are asked to predict quantities. After each round, an administrator provides an anonymous summary of the experts' forecasts and their reasons for them. When experts' forecasts have changed little between rounds, the process is stopped and the final round forecasts are combined by averaging.

Role of the facilitator

The person coordinating the Delphi method is known as a facilitator, and facilitates the responses of a panel of experts who are selected because they hold relevant knowledge or informed opinions. The facilitator sends out questionnaires, surveys, etc., and if the panel of experts accept, they follow instructions and present their views.

The Delphi method and forecasting

The Delphi method is a systematic interactive forecasting method based on independent inputs of selected experts. It uses a panel of carefully selected experts who answer a series of questionnaires. Questions are usually formulated as hypotheses, and experts state the time when they think these hypotheses will be fulfilled. Each round of questioning is followed with feedback on the preceding round of replies, usually presented anonymously. Thus the experts are encouraged to revise their earlier answers in light of the replies of other members of the group.

Key characteristics of the Delphi method

1. Structuring of information flow
2. Regular feedback
3. Anonymity of the participants

Structuring of information flow

The initial contributions from the experts are collected in the form of answers to questionnaires and their comments to these answers. The panel director controls the interactions among the participants by processing the information and filtering out irrelevant content. This avoids the negative effects of face-to-face panel discussions and solves the usual problems of group dynamics.

Regular feedback

Participants comment on their own forecasts, the responses of others, and the progress of the panel as a whole. At any moment they can revise their earlier statements. While in regular group meetings participants tend to stick to previously stated opinions and often conform too much to the group leader, the Delphi method prevents this.

Anonymity of the participants

Usually all participants maintain anonymity. Their identity is not revealed even after the completion of the final report. This stops them from dominating others in the process using their authority or personality, frees them to some extent from their personal biases, allows them to freely express their opinions, and encourages open critique and admitting errors by revising earlier judgments.

Applications

First applications of the Delphi method were in the field of science. Later the Delphi method was applied in other areas, especially those related to public policy issues, such as economic trends, health and education. It was also applied successfully and with high accuracy in business forecasting. For example, in one case reported by Basu and Schroeder (1977), the Delphi method predicted the sales of a new product during the first two years with inaccuracy of 3-4% compared with actual sales. Quantitative methods produced errors of 10-15%, and traditional unstructured forecast methods had errors of about 20%.

Function Point Analysis

Function points are a unit of measure for software, much like an hour is to measuring time, miles are to measuring distance or Celsius is to measuring temperature. Function Points are an ordinal measure much like other measures such as kilometers, Fahrenheit, hours, and so forth.

Objectives of Function Point Analysis

Since Function Points measure systems from a functional perspective, they are independent of technology. Regardless of language, development method, or hardware platform used, the number of function points for a system will remain constant. The only variable is the amount of effort needed to deliver a given set of function points; therefore, Function Point Analysis can be used to determine whether a tool, an environment, or a language is more productive compared with others within an organization or among organizations. This is a critical point and one of the greatest values of Function Point Analysis.

The Five Major Components

External Inputs (EI)
External Outputs (EO)
External Inquiries (EQ)
Internal Logical Files (ILFs)
External Interface Files (EIFs)

External Inputs (EI)

An elementary process in which data crosses the boundary from outside to inside. This data may come from a data input screen or another application. The data may be used to maintain one or more internal logical files. The data can be either control information or business information. If the data is control information it does not have to update an internal logical file.

External Outputs (EO)

An elementary process in which derived data passes across the boundary from inside to outside. Additionally, an EO may update an ILF. The data creates reports or output files sent to other applications. These reports and files are created from one or more internal logical files and external interface files.

External Inquiry (EQ)

An elementary process with both input and output components that results in data retrieval from one or more internal logical files and external interface files. The input process does not update any Internal Logical Files, and the output side does not contain derived data.

Internal Logical Files (ILFs):
A user identifiable group of logically related data that resides entirely within the application's boundary and is maintained through external inputs.

External Interface Files (EIFs):
A user identifiable group of logically related data that is used for reference purposes only. The data resides entirely outside the application and is maintained by another application. The external interface file is an internal logical file for another application.

Functional Complexity

The first adjustment factor considers the Functional Complexity for each unique function. Functional Complexity is determined based on the combination of data groupings and data elements of a particular function. The number of data elements and unique groupings are counted and compared to a complexity matrix that will rate the function as low, average or high complexity. Each of the five functional components (ILF, EIF, EI, EO and EQ) has its own unique complexity matrix. The following is the complexity matrix for External Outputs.

              1-5 DETs   6-19 DETs   20+ DETs
0 or 1 FTRs   L          L           A
2 or 3 FTRs   L          A           H
4+ FTRs       A          H           H

Complexity    UFP
L (Low)       4
A (Average)   5
H (High)      7
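A sketch of the External Output complexity lookup from the matrix above. The DET/FTR band boundaries and the 4/5/7 unadjusted-FP values are the ones shown in the slide; the other four component types would use their own matrices, and the function names are ours:

def eo_complexity(dets, ftrs):
    """Rate an External Output as Low / Average / High from its DETs and FTRs."""
    det_band = 0 if dets <= 5 else (1 if dets <= 19 else 2)
    ftr_band = 0 if ftrs <= 1 else (1 if ftrs <= 3 else 2)
    matrix = [["L", "L", "A"],
              ["L", "A", "H"],
              ["A", "H", "H"]]
    return matrix[ftr_band][det_band]

EO_UFP = {"L": 4, "A": 5, "H": 7}

# Example: an output with 12 data element types referencing 2 files
# rates Average and contributes 5 unadjusted function points.
rating = eo_complexity(dets=12, ftrs=2)
print(rating, EO_UFP[rating])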

Value Adjustment Factor

The Unadjusted Function Point count is multiplied by the second adjustment factor, called the Value Adjustment Factor. This factor considers the system's technical and operational characteristics and is calculated by answering 14 questions. The factors are:

1. Data Communications
The data and control information used in the application are sent or received over communication facilities.

2. Distributed Data Processing
Distributed data or processing functions are a characteristic of the application within the application boundary.

3. Performance
Application performance objectives, stated or approved by the user, in either response or throughput, influence (or will influence) the design, development, installation and support of the application.

4. Heavily Used Configuration
A heavily used operational configuration, requiring special design considerations, is a characteristic of the application.

5. Transaction Rate
The transaction rate is high and influences the design, development, installation and support.

6. On-line Data Entry
On-line data entry and control information functions are provided in the application.

7. End-User Efficiency
The on-line functions provided emphasize a design for end-user efficiency.

8. On-line Update
The application provides on-line update for the internal logical files.

9. Complex Processing
Complex processing is a characteristic of the application.

10. Reusability
The application and the code in the application have been specifically designed, developed and supported to be usable in other applications.

11. Installation Ease
Conversion and installation ease are characteristics of the application. A conversion and installation plan and/or conversion tools were provided and tested during the system test phase.

12. Operational Ease
Operational ease is a characteristic of the application. Effective start-up, backup and recovery procedures were provided and tested during the system test phase.

13. Multiple Sites
The application has been specifically designed, developed and supported to be installed at multiple sites for multiple organizations.

14. Facilitate Change
The application has been specifically designed, developed and supported to facilitate change.
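A sketch of how the 14 ratings above would be folded into an adjusted count. The deck lists the characteristics but not the formula, so the usual IFPUG form (VAF = 0.65 + 0.01 * TDI, where TDI is the sum of the 14 ratings, each 0-5, and adjusted FP = UFP * VAF) is assumed here; the function names and example ratings are ours:

def value_adjustment_factor(ratings):
    """ratings: the 14 characteristic scores, each 0 (no influence) to 5 (strong).
    Assumes the standard IFPUG formula VAF = 0.65 + 0.01 * TDI."""
    assert len(ratings) == 14
    tdi = sum(ratings)              # Total Degree of Influence
    return 0.65 + 0.01 * tdi

def adjusted_function_points(ufp, ratings):
    """Adjusted FP = Unadjusted FP * VAF."""
    return ufp * value_adjustment_factor(ratings)

# Example with made-up ratings summing to 42: VAF = 1.07, so 100 UFP -> 107 FP.
ratings = [3, 2, 4, 3, 3, 4, 3, 2, 3, 1, 2, 3, 4, 5]
print(adjusted_function_points(100, ratings))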