
Page 1

©2015 InfoStretch Corporation. All rights reserved.

Liana Gevorgyan | May 6, 2015

Measuring Quality

QA Metrics and Trends in Practice
US - UK - India

Page 2

SECTION 1: TECHNOLOGY IN LIFE
Bugs Are Costly

Page 3

1999

Mars Climate Orbiter Crash
Instead of using the specified metric system for navigation, the contractor carried out measurements in imperial units, and the spacecraft crashed into Mars.

COST

$135 Million

Page 4

1996

ARIANE Failure
The Ariane 5 rocket exploded 36.7 seconds after takeoff. The rocket was much faster than previous models, but it carried a software bug that went unnoticed.

COST

>$370 Million

Page 5

2003

EDS Fails Child Support
EDS created an IT system for the UK's Child Support Agency that had many software incompatibility errors.

COST

$1.1 Billion

Page 6

2013

NASDAQ Trading Shutdown
On August 22, 2013, the NASDAQ stock market shut down trading for three hours because of a computer error.

COST

$2 Billion

Page 7

1985-1987

Therac-25 Medical Accelerator
A software failure caused incorrect X-ray dosages. These dosages were hundreds or thousands of times greater than normal, resulting in death or serious injury.

COST

5 Human Lives

Page 8

Technology In Our Daily Life

Average usage of electronic systems in developed countries:
- One PC or desktop in each home
- 80% of people use mobile phones
- 40% of people drive cars with various electronic systems
- People travel by train or plane on average once a year
- Dozens of other embedded systems in our homes
- Dozens of software programs in our workplaces and service systems

The quality of all these systems is the quality of life!

Page 9

SECTION 2: DEFINING THE “WHAT”
Known QA Metrics & Trends

Page 10

Defining the “What”: Metrics and Trends

Measure to Understand

Understand to Control

Control to Improve

Page 11

Several Known QA Metrics and Trends
- Manual & automation time ratio during the regression cycle
- Script maintenance time during the delivery iteration
- Daily manual test case execution
- Automation effectiveness for issue identification
- Issues found per area during regression
- Areas impacted after new feature integration
- Issue identification behavior after major refactoring
- Software process timetable metrics
- Delivery process productivity metrics
- Software system availability metrics
- Test case coverage
- Automation coverage
- Issues defined based on gap analysis
- Ambiguities per requirement
- Identified issues by criticality
- Identified issues by area separation
- Issue resolution turnaround time
- Backlog growth speed
- Release patching tendency and costs
- Customer escalations by Blocker/Critical/Major issues per release
- QA engineer performance
- Continuous integration efficiency

Page 12

Metrics Classification

PRODUCT METRICS

PROCESS METRICS

QA METRICS

Page 13

Metrics Examples by Classification

Delivery process productivity metrics Continuous integration efficiency Release patching tendency and costs Backlog growth speed QA engineer performance Software process timetable metrics

Software system stability metrics Identified issues by criticality Identified issues by area separation Customer escalations by

Blocker/Critical/Major issues per release

Ambiguities per requirement Backlog growth speed

PRODUCT METRICS PROCESS METRICS

Page 14

Sample Metrics Visual

AUTOMATION COVERAGE (pie chart): Automated UI and BE 55%, Automated UI 20%, In Progress 15%, Pending Automation 7%, Not Feasible 3%

BUGS BY SEVERITY (pie chart): Blocker 3%, Critical 5%, High 12%, Medium 34%, Low 45%
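Visuals like these can be regenerated straight from tracker exports; a minimal plotting sketch, assuming Python with matplotlib (the dictionary names are illustrative; the percentages are taken from the sample above):

```python
import matplotlib.pyplot as plt

# Sample data from the slide above.
coverage = {"Automated UI and BE": 55, "Automated UI": 20,
            "In Progress": 15, "Pending Automation": 7, "Not Feasible": 3}
severity = {"Blocker": 3, "Critical": 5, "High": 12, "Medium": 34, "Low": 45}

# Two pie charts side by side, labeled with whole-number percentages.
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.pie(coverage.values(), labels=coverage.keys(), autopct="%1.0f%%")
ax1.set_title("Automation Coverage")
ax2.pie(severity.values(), labels=severity.keys(), autopct="%1.0f%%")
ax2.set_title("Bugs by Severity")
plt.tight_layout()
plt.show()
```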

Page 15

Visual Depiction Of Sample Trends

ISSUE ESCALATIONS BY CRITICALITY - MONTHLY TREND (line chart, Jan-April: issue counts by Blocker/Critical/High/Medium/Low)

REJECTED BUGS % PER WEEK (bar chart, weeks 1-6, 0-5%)

Page 16

Expectations
- Smooth releases
- Predefined risks with mitigation plans
- Nice feedback and appreciation
- Top-notch and innovative products

Page 17

Real Life

Delivery is not always ideal:
- We are all familiar with patching a release
- Process-tracking data for analysis is often lacking
- Experimental delivery models are not exactly best-practice models

Page 18

SECTION 3: DELIVERY PROCESSES & METRICS

Waterfall & Agile

Page 19

Waterfall Process

Each phase is paired with a corresponding test activity:
- Requirements - Validation
- Architecture - Verification
- Module Design - Verification
- Implementation - System Test
- Operations and Maintenance - Revalidation

Page 20

Agile Process

- Product Backlog: client-prioritized product features
- Sprint Backlog: features assigned to the sprint, estimated by the team, committed to by the team
- Time-boxed develop/test iterations, with testing/validation/verification throughout
- Scrum meetings every 24 hours
- Working code ready for deployment

Page 21

Agile Process Metrics: Scrum Team Sprint Metrics

- Scrum team's understanding of sprint scope and goal
- Scrum team's adherence to Scrum rules & engineering practices
- Scrum team's communication
- Retrospective process improvement
- Team enthusiasm
- Quality delivered to customer
- Team velocity
- Technical debt management
- Actual stories completed vs. planned

Page 22

Processes Are Not Always Best Practices
- A unique way of doing Agile
- Transition from Waterfall to Agile
- Transition from Agile to Kanban

Page 23

Metrics Set Definition for Your Project
- Process
- Technology
- Iterations
- Project/team size
- Goal

Page 24

SECTION 4: WEIGHT BASED ANALYSIS FOR QA METRICS & MEASUREMENTS

Mapping With Graph Theory

Page 25

Metrics and Trends for Your Project

- You are watching a set of metrics and trends; are they the right ones?
- The trends are in an acceptable range, but the product's quality is not improving?
- You try to improve one metric, and another goes down?
- How do you analyze and fix it?

Page 26

Mapping QA Metrics Into Graph Theory

- Process metrics: A = Metric 1, B = Metric 2, …
- Product metrics: C = Metric 3, D = Metric 4, …
- Actions/data sets that affect the metrics: A1, A2, …; edges capture each metric's dependency on specific actions

Page 27

Preconditions & Definitions for the Metrics & Actions Mapped Model

- A node's initial weight is predefined, with a value from 1 to 10
- An edge's weight is predefined, with a value from 1 to 10
- Connections between nodes are defined by the dependencies of metrics on each other and on actions
- All action nodes have a fixed weight of 1
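One lightweight way to realize this model in code is a symmetric adjacency map; a minimal sketch, assuming Python (the node names and weights here are illustrative; the next slide instantiates a concrete set):

```python
# Node weights: metric nodes get a predefined 1-10 weight,
# action nodes are fixed at 1.
node_weight = {"M1": 5, "M2": 4, "A1": 1}

# Undirected weighted edges, stored in both directions so any node
# can enumerate its incident edges when its priority is calculated.
edges: dict[str, dict[str, int]] = {}

def add_edge(u: str, v: str, w: int) -> None:
    assert 1 <= w <= 10, "edge weights are predefined in the 1-10 range"
    edges.setdefault(u, {})[v] = w
    edges.setdefault(v, {})[u] = w

add_edge("M1", "M2", 2)   # metric-to-metric dependency
add_edge("M1", "A1", 1)   # metric-to-action dependency
```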

Page 28

Initial Metrics Model & Dependencies

Assume the current metric set is:
- 2 process metrics: M1, M2
- 2 product metrics: M3, M4

where M1 depends on M3, M1 depends on M4, and M2 depends on M3.

There are 3 actions or data sets that affect some of the metrics: A1, A2, A3,
where M2 depends on A1 and A2, and M4 depends on A3.

Initial priority, based on best practices:
W(M1) = 5, W(M2) = 4, W(M3) = 3, W(M4) = 2

Page 29

Metrics Visualization via Graph

[Graph diagram: metric nodes M1 (5), M2 (4), M3 (3), M4 (2) and action nodes A1, A2, A3, connected according to the dependencies above]

Page 30

Weight Assignment On Undirected Graph

[The same graph with weights assigned to the edges (values 1-6); these edge weights feed the calculations on the following slides]

Page 31

Calculation Formula for a Metric's New Priority

The priority of a node is calculated as follows:

    P(A) = W(A) × Σ w(e), summed over all edges e incident to A

where
- W(A) is the initial priority of the node, i.e. the node weight assigned by the user
- Σ w(e) is the cumulative weight of the node's edges

Page 32

New Priority Calculation For One Metric

Example: M2 with W(M2) = 4 and edges M2-A1 (weight 1), M2-A2 (weight 1), M2-M3 (weight 2).

Initial priority:
M2 = W(M2) = 4

New priority:
M2 = W(M2) × (w(M2-A1) + w(M2-A2) + w(M2-M3)) = 4 × (1 + 1 + 2) = 16
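The formula is a one-liner in code; a minimal sketch, assuming Python, checked against the M2 example above:

```python
def new_priority(node_weight: int, edge_weights: list[int]) -> int:
    """New priority = node weight times the cumulative weight of incident edges."""
    return node_weight * sum(edge_weights)

# M2 from the worked example: W(M2) = 4, incident edges A1 (1), A2 (1), M3 (2).
assert new_priority(4, [1, 1, 2]) == 16
```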

Page 33

New Priority Calculations For the Graph

CALCULATIONS

Node   Weight   Incident edge weights   New priority
M1     5        3, 5                    5 × (3 + 5)     = 40
M2     4        2, 1, 1                 4 × (2 + 1 + 1) = 16
M3     3        3, 2, 6                 3 × (3 + 2 + 6) = 33
M4     2        5                       2 × 5           = 10

Metrics by initial priority: M1, M2, M3, M4
Metrics by new priority:     M1, M3, M2, M4
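The whole table can be reproduced with the same formula; a sketch, assuming Python, that reads each node's incident edge weights straight off the calculation rows above:

```python
node_weight = {"M1": 5, "M2": 4, "M3": 3, "M4": 2}

# Incident edge weights per metric, as listed in the calculation table.
incident_edges = {"M1": [3, 5], "M2": [2, 1, 1], "M3": [3, 2, 6], "M4": [5]}

new_priority = {m: w * sum(incident_edges[m]) for m, w in node_weight.items()}
print(new_priority)   # {'M1': 40, 'M2': 16, 'M3': 33, 'M4': 10}

# Re-rank the metrics by calculated priority: M1, M3, M2, M4.
print(sorted(new_priority, key=new_priority.get, reverse=True))
```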

Page 34

Metrics Priorities: Current vs. Calculated

INITIAL PRIORITY (based on best practices): M1, M2, M3, M4

NEW PRIORITY (project-dependent, calculated): M1, M3, M2, M4

Page 35

SECTION 5: METRICS WEIGHT BASED ANALYSIS IN PRACTICE

Defining “How”

Page 36

Metrics Definition For a Test Project
- Process: Agile with area ownership
- Technology: SaaS-based enterprise web & mobile app
- Iteration: 2 weeks
- Project size: 5 Scrum teams
- Goal: customer satisfaction; no Blocker or Critical issues escalated by the customer

Page 37

Key Metrics and Dependencies

Metrics:
- M1 - Customer escalations per defect severity (product metric)
- M2 - Opened valid defects per area (product metric)
- M3 - Rejected defects (process metric)
- M4 - Test case coverage (process metric)
- M5 - Automation coverage (process metric)
- M6 - Defect fixes per criticality (product metric)

Actions and data sets:
- A1 - Customer types per investment and escalations per severity
- A2 - Most buggy areas

Page 38

Metrics Initial Priority
Weight assignment and dependency analysis

Metric                                           Predefined node weight
M1 - Customer escalations per defect severity    8
M2 - Opened valid defects per area               5
M3 - Rejected defects                            4
M4 - Test case coverage                          3
M5 - Automation coverage                         2
M6 - Defect fixes per criticality per team       6

Metrics by initial priority: M1, M6, M2, M3, M4, M5

Node weights and incident edge weights (edges run among M1-M6, A1, A2):
M1 = 8: edges 2, 6, 4, 5, 2
M2 = 5: edges 2, 1, 3
M3 = 4: edges 1, 3
M4 = 3: edges 6, 3, 3, 2
M5 = 2: edges 2
M6 = 6: edges 4, 2

Page 39

Graph Creation

[Graph diagram: metric nodes M1-M6 with weights 8, 5, 4, 3, 2, 6 and action nodes A1, A2, connected with the weighted edges listed in the table above]

Page 40

Calculations and Metrics Prioritization

Node     Incident edge weights   Calculated priority
M1 = 8   2, 6, 4, 5, 2           8 × 19 = 152
M2 = 5   2, 1, 3                 5 × 6  = 30
M3 = 4   1, 3                    4 × 4  = 16
M4 = 3   6, 3, 3, 2              3 × 14 = 42
M5 = 2   2                       2 × 2  = 4
M6 = 6   4, 2                    6 × 6  = 36

Initial priority:    M1, M6, M2, M3, M4, M5
Calculated priority: M1, M3, M6, M2, M4, M5
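The same computation scales to the project's six metrics; a sketch, assuming Python, using the node weights and edge readings from the table above:

```python
node_weight = {"M1": 8, "M2": 5, "M3": 4, "M4": 3, "M5": 2, "M6": 6}

# Incident edge weights per metric, as read from the calculation table.
incident_edges = {
    "M1": [2, 6, 4, 5, 2],
    "M2": [2, 1, 3],
    "M3": [1, 3],
    "M4": [6, 3, 3, 2],
    "M5": [2],
    "M6": [4, 2],
}

for m, w in node_weight.items():
    print(m, w * sum(incident_edges[m]))
# M1 152, M2 30, M3 16, M4 42, M5 4, M6 36 -- the Calculated Priority column.
```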

Page 41

Key Metric Changes & Improvement Plans

Metrics by calculated priority:
- M1 - Customer escalations per defect severity
- M3 - Rejected defects
- M6 - Defect fixes per criticality per team
- M2 - Opened valid defects per area
- M4 - Test case coverage
- M5 - Automation coverage

Improvement plans:
- Group defects by severity and by customer investment to understand the real picture; 1,000 minor issues can cost more than one high-severity issue.
- Conduct training to lower the defect rejection rate, so developers do not spend extra time analyzing invalid issues.
- Make sure defect fixes proceed in parallel with new feature development in each sprint.
- Continuously update test cases after each new issue to maintain good coverage.
- Automate as much as possible to cut costs and increase coverage.

Page 42

Monitoring of Trend-Based Priority Metrics, Based on Process Changes

[Monthly trend chart, Jan-April: M1 = 60, 70, 68, 75; M3 = 20, 25, 29, 34; M6 = 70, 80, 82, 78]

Page 43

Let the Challenge Begin… & Have FUN

Page 44

Thank You

Global Footprint

About Us
A leading provider of next-gen mobile application lifecycle services, ranging from design and development to testing and sustenance.

Locations
Corporate HQ: Silicon Valley
Offices: Conshohocken (PA), Ahmedabad (India), Pune (India), London (UK)

InfoStretch Corporation

Page 45

References
- Narsingh Deo, Graph Theory with Applications to Engineering and Computer Science, Prentice Hall, 1974.
- A.A. Shariff K, M.A. Hussain, and S. Kumar, "Leveraging unstructured data into intelligent information - analysis and evaluation," Int. Conf. Information and Network Technology, IPCSIT, vol. 4, IACSIT Press, Singapore, pp. 153-157, 2011.
- http://en.wikipedia.org/wiki/List_of_software_bugs
- http://www.starshipmodeler.com/real/vh_ari52.htm
- http://news.nationalgeographic.com/news/2011/11/pictures/111123-mars-nasa-rover-curiosity-russia-phobos-lost-curse-space-pictures/
- http://www.bloomberg.com/news/articles/2013-08-22/nasdaq-shuts-trading-for-three-hours-in-latest-computer-error

Page 46

©2015 InfoStretch Corporation. All rights reserved.

Q & A
Liana Gevorgyan

Sr. QA Manager
InfoStretch Corporation Inc.

[email protected]/in/lianag/en

[email protected]