The Effort Distribution Pattern Analysis of Three Types of Software Quality Assurance Activities and
Process Implication: an Empirical Study
Qi Li, University of Southern California-CSSE
Ye Yang and Fengdi Shu, Institute of Software, Chinese Academy of Sciences
COCOMO Forum, MIT, Nov. 4th 2009
Background
• Three main Quality Assurance activities:
– Review
– Process Audit
– Testing
• They are complementary and cannot replace each other:
– Early review saves time in later defect fixing
– Review alone says little about a system's reliability and performance
– Testing provides hands-on experience of the actual, operational system
– Process Audit: although it does not directly influence the software product, the right process leads to the right product
Related Work
• COQUALMO:
– The defect removal efficiencies of review, testing, and automated analysis were compared through a two-round Delphi
– Peer review is the most efficient at removing requirement and design defects
– Testing is the most efficient at removing code defects
• Capers Jones ("Applied Software Measurement"):
– Lists the Defect Removal Efficiency of 16 combinations of 4 defect removal methods: design inspections, code inspections, quality assurance (corresponding to process audit in our study), and testing
– No single defect removal method is adequate
– Removal efficiency, from best to worst: design inspections, code inspections, testing, and process audit
Related Work (Cont.)
• Both are based on Delphi estimation:
– The results come from expert Delphi estimation and lack quantitative verification
• Both rest on the assumption that all of these defect removal strategies have been applied sufficiently:
– In practice, they might be applied insufficiently for many reasons
– From an economic perspective, the ROI might be negative if all of them were applied to full sufficiency
– They give no guidance on effort allocation for QA activities
– They do not address how much reviewing, process auditing, and testing is enough from a value-based perspective
Motivation
• How do we know which type of QA activity is insufficient while others might be overdone?
– A "Diagnostic" Model
• How do we balance the effort allocation among them?
– To improve process and resource allocation efficiency
– To maximize the ROI of the entire quality assurance effort
• This is ongoing work; comments, discussion, and future cooperation are welcome
Case Study Background
• Organization:
– An R&D organization at ISCAS (Institute of Software, Chinese Academy of Sciences)
– Appraised and rated at CMMI maturity level 4 in 2005
– A research group focused on software process improvement
• Project:
– SoftPM: a software project management tool that has been deployed to many software organizations in China
– Iterative development method and CMMI process management
• Every version shares the same quality goal:
– 100% test case coverage
– All planned test cases executed and passed
– No non-trivial defects detected during at least one day of continuous testing
– The quality goal of 0.2 defects/KLOC satisfied at release
Empirical Data Analysis Procedure
• Data Collection:
– Extract raw data from related plans, records, and reports
– Consolidate the original information
– Resolve inconsistencies
• Data Analysis: the Goal/Question/Metric (GQM) method
• Goal: improve the ROI of QA activities
– Question 1 (Q1): How can we identify which QA activities might be insufficient while others might be overdone?
– Metric 1 (M1): Regression Diagnostic Model
– Question 2 (Q2): How can we improve reviewing?
– Metric 2 (M2): Filter Effectiveness
– Question 3 (Q3): How can we improve process audit?
– Metric 3 (M3): NCs Distribution
– Question 4 (Q4): How can we improve testing?
– Metric 4 (M4): Value-based Software Testing: Feature Prioritization
Effort Distribution of Three QA Activities
Hrs (%)    E_Re            E_PA            E_Test           E_Total
V2.5       401.7 (4.90%)   146 (1.78%)     2587 (31.58%)    8192
V2.6       41 (4.74%)      21.7 (2.51%)    187 (21.61%)     865.5
V3.0       120.5 (0.38%)   472.65 (1.49%)  15466 (48.81%)   31684.5
V3.1       606 (3.74%)     162 (1.00%)     7787.6 (48.03%)  16215.1
V3.1.1     127 (7.76%)     NA              365.5 (22.33%)   1636.5

Effort and its percentage of total effort for each QA activity

E_Re: Review effort
E_PA: Process Audit effort
E_Test: Testing effort
E_Total: Total effort for the project

           E_Re%   E_PA%   E_Test%
E_Re%       1.00
E_PA%       0.40    1.00
E_Test%    -0.73   -0.92    1.00

Correlation between the effort percentages of the three QA activities
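The correlation matrix can be reproduced from the effort percentages above. Below is a minimal Python sketch, assuming (as the -0.73 entry suggests) that the correlations were computed over the four versions with complete data, excluding V3.1.1 because its E_PA is NA:

```python
import numpy as np

# Effort percentages from the table above, for V2.5, V2.6, V3.0, V3.1.
# V3.1.1 is excluded because its process audit effort (E_PA) is NA.
e_re = [4.90, 4.74, 0.38, 3.74]        # E_Re%
e_pa = [1.78, 2.51, 1.49, 1.00]        # E_PA%
e_test = [31.58, 21.61, 48.81, 48.03]  # E_Test%

# Pearson correlation matrix; rows/columns are E_Re%, E_PA%, E_Test%.
corr = np.corrcoef([e_re, e_pa, e_test])
print(corr.round(2))
# [[ 1.    0.4  -0.73]
#  [ 0.4   1.   -0.92]
#  [-0.73 -0.92  1.  ]]
```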
Effort Distribution of Three QA Activities
[Bar chart: E_Re%, E_PA%, E_Re%+E_PA%, and E_Test% by version (V3.0, V3.1, V2.6, V2.5); vertical axis from 0% to 60%]
Regression Analysis:
M1: E_Test% = 0.73 - 2.69*E_Re% - 15.73*E_PA%
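M1 can be reproduced from the same four data points. Below is a minimal sketch assuming M1 is an ordinary least-squares fit of E_Test% on E_Re% and E_PA% (expressed as fractions) over the four versions with complete effort data:

```python
import numpy as np

# Effort fractions for V2.5, V2.6, V3.0, V3.1 (V3.1.1 lacks E_PA).
e_re = np.array([0.0490, 0.0474, 0.0038, 0.0374])
e_pa = np.array([0.0178, 0.0251, 0.0149, 0.0100])
e_test = np.array([0.3158, 0.2161, 0.4881, 0.4803])

# Design matrix with an intercept column:
# E_Test% = b0 + b1*E_Re% + b2*E_PA%
X = np.column_stack([np.ones_like(e_re), e_re, e_pa])
b, *_ = np.linalg.lstsq(X, e_test, rcond=None)
print(b)  # ~[0.73, -2.68, -15.71]; matches M1 up to the table's rounding
```

Setting E_Re% = E_PA% = 0 in the fitted model recovers the intercept, i.e. the predicted 73% testing share when no review or process audit is performed at all.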
Effort Distribution of Three QA Activities
M1: E_Test% = 0.73 - 2.69*E_Re% - 15.73*E_PA%
• How should the regression model be interpreted?
– Coefficients: -2.69 and -15.73
– Increasing review effort by 1% of total effort saves about 2.69% of total effort in testing
– Increasing process audit effort by 1% of total effort saves about 15.73% of total effort in testing
– Constant: 0.73
– If E_Re% and E_PA% are both 0, i.e. no reviewing and no process audit activities, testing would take up as much as 73% of the total effort
Effort Distribution of Three QA Activities
M1: E_Test% = 0.73 - 2.69*E_Re% - 15.73*E_PA%
• What does the regression model tell us?
– Review and process audit might not have been done sufficiently
– ROI of putting more effort into review: (2.69% - 1%) / 1% = 1.69
– ROI of putting more effort into process audit: (15.73% - 1%) / 1% = 14.73
– In other words, each extra 1% of total effort invested returns a net saving of 1.69% (review) or 14.73% (process audit) of total effort
• Process implication:
– Putting more effort into review, and especially into process audit, would be more cost-effective
• Diagnostic Model (M1):
– This regression model can serve as a diagnostic model to identify which QA activities have been done adequately while others might still be insufficient (Q1)
Q2: How to Improve Reviewing?
           FE_Re_Req   FE_Re_Des   FE_Re_Cod
V2.5       0.76        0.41        NA
V2.6       0.32        0.27        NA
V3.0       0.44        0.09        0.01
V3.1       0.97        0.50        0.19
V3.1.1     0.98        0.84        0.10
Mean       0.69        0.42        0.10
Median     0.76        0.41        0.10
Stdev      0.30        0.28        0.09
• M2: Filter Effectiveness = defects found by review in a phase / (defects found by review in that phase + defects introduced in that phase and found by testing); see the sketch below
• The table above shows the Filter Effectiveness of each phase by reviewing
• Process implication:
– Filter Effectiveness decreases from phase to phase
– It is especially low in the coding phase: as low as 0.01 in V3.0
– This implies that the development team relies heavily on the testing group
– More review effort should be put into code walkthroughs and unit testing
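The Filter Effectiveness table can be reproduced from the defect counts in the "Data Source" slides at the end of the deck. A minimal sketch (variable names are mine):

```python
# Filter Effectiveness (M2) per phase, from the Data Source defect counts:
# FE = found_in_phase / (found_in_phase + introduced_in_phase_found_by_testing)
found_by_review = {  # Def_Re_*: defects found by reviewing, per phase
    "V2.5":   {"req": 277, "des": 68, "cod": None},
    "V2.6":   {"req": 18,  "des": 12, "cod": None},
    "V3.0":   {"req": 73,  "des": 29, "cod": 18},
    "V3.1":   {"req": 253, "des": 6,  "cod": 121},
    "V3.1.1": {"req": 60,  "des": 36, "cod": 8},
}
escaped_to_testing = {  # Def_Intro_*: introduced in phase, found by testing
    "V2.5":   {"req": 88, "des": 97,  "cod": 478},
    "V2.6":   {"req": 38, "des": 32,  "cod": 114},
    "V3.0":   {"req": 93, "des": 296, "cod": 2095},
    "V3.1":   {"req": 8,  "des": 6,   "cod": 530},
    "V3.1.1": {"req": 1,  "des": 7,   "cod": 73},
}

for version, found in found_by_review.items():
    fe = {
        phase: round(n / (n + escaped_to_testing[version][phase]), 2)
        for phase, n in found.items() if n is not None  # NA -> skipped
    }
    print(version, fe)
# e.g. V2.5 {'req': 0.76, 'des': 0.41}
#      V3.0 {'req': 0.44, 'des': 0.09, 'cod': 0.01}
```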
Q3: How to Improve Process Audit?
• M3: NCs (non-conformances) Distribution
• Process implication:
– PMC (Project Monitoring and Control) and PP (Project Planning) account for most of the NCs
– The main problem in PMC is that developers often neglect to submit their periodic reports on time
– The most common problem in PP is a consistently large gap between planned and actual effort
– Put more auditing effort on PMC and PP
Testing Complements Review
• Functional defects account for the largest share => insufficient effort on early review, especially insufficient code walkthroughs
• Defects related to user interface and performance are hard to detect by reviewing artifacts but can be detected more easily by testing
• Testing therefore serves to complement review
• Q4: How to improve testing?
• M4: Value-based Software Testing: Feature Prioritization
• Qi Li et al., "Bridge the Gap between Software Test Process and Business Value: A Case Study", ICSP 2009: 212-223
Other Findings: Cost of Finding and Removing Defects in Each Phase
(Hours/defect)  Hrs/Def_Req   Hrs/Def_Des   Hrs/Def_Test
V2.5            0.97          1.94          3.90
V2.6            1.33          1.42          1.02
V3.0            0.57          2.72          6.23
V3.1            1.81          2.17          14.32
V3.1.1          0.58          2.06          4.51
Mean            1.05          2.06          5.99
Median          0.97          2.06          4.51
Stdev           0.53          0.47          5.02
Hrs/Def_Req (Hrs/Def_Des): hours spent finding and removing one defect in the requirement (design) phase by reviewing
Hrs/Def_Test: hours spent finding and removing one defect by testing
• Process implication:
– Putting more effort into early review and defect finding is more cost-effective
– This further empirically validates Dr. Boehm's summary of surveys at TRW, IBM, and GTE
Discussion of Validity and Future Work
• The linear regression diagnostic model might not truly reflect the utility function of QA activities => apply indifference curves from microeconomics to build a model of QA effort allocation
• More data are needed to build and validate the model => the utility function of QA activities
Conclusion
• This GQM-based analysis procedure helps quality managers (within a specified organizational context):
– identify which type of QA activity is insufficient while others might be overdone
– identify and improve the weakest parts of the QA activities
– balance effort allocation and planning for future projects
• The data analysis process is simple and practical
• It provides process improvement implications and suggestions, and ultimately improves the effectiveness of the whole process within the specified organizational context
Thank you
Questions or Comments?
Data Source (Cont.)
           Def_Re_Req   Def_Re_Des   Def_Re_Cod   Def_Re_Total
V2.5       277          68           NA           345
V2.6       18           12           NA           30
V3.0       73           29           18           120
V3.1       253          6            121          380
V3.1.1     60           36           8            104

Defects found in review activities

Def_Re_Req: defects found in the requirement phase by reviewing
Def_Re_Des: defects found in the design phase by reviewing
Def_Re_Cod: defects found in the coding phase by reviewing
Def_Re_Total: total defects found by reviewing
Data Source (Cont.)
           Def_Intro_Req   Def_Intro_Des   Def_Intro_Cod   Def_Test_Total
V2.5       88              97              478             663
V2.6       38              32              114             184
V3.0       93              296             2095            2484
V3.1       8               6               530             544
V3.1.1     1               7               73              81

Defects found in testing activities

Def_Intro_Req: defects introduced in the requirement phase and found by testing
Def_Intro_Des: defects introduced in the design phase and found by testing
Def_Intro_Cod: defects introduced in the coding phase and found by testing
Def_Test_Total: total defects found by testing