EuroStar webinar, 30 Jan
[email protected]
© Dorothy Graham 2017
www.DorothyGraham.co.uk
www.TestAutomationPatterns.org
Are your tests well-travelled? Thoughts about coverage
Prepared and presented by Dorothy Graham
email: [email protected]
www.DorothyGraham.co.uk
EuroStar Webinar
Contents
• Analogy with travelling
• What is coverage?
• Should testing be thorough?
• What coverage is not (often mistaken for)
• The four caveats of coverage
• The question you should ask
Twitter: @DorothyGraham
Where on earth have you been?
• have you seen a lot of the world?
  – are you “well-travelled”?
• what does it mean if you say “yes”?
  – what does it mean if you say “85%”? or “100%”?
Scratch map 1 (unscratched)
scratch off places I have been – cities?
can you spot the 74 cities I’ve visited?
hint: look on the East side of the country
Where is Grand Rapids?
Cities aren’t that impressive on the map – how about States?
Only 4 US States I haven’t been to
(plus Hawaii)
country coverage
Some travel metrics
• Have you been to
  – every street in the place where you live?
  – every city/town?
  – every state/province?
  – every country in the world?
  – every continent?
• where haven’t you been?
Contents
• Analogy with travelling
• What is coverage?
• Should testing be thorough?
• What coverage is not (often mistaken for)
• The four caveats of coverage
• The question you should ask
Twitter: @DorothyGraham
Travelling
• have you seen a lot of the world?
  – are you “well-travelled”?
• what does it mean if you say “yes”?
  – what does it mean if you say “85%”? or “100%”?

Test coverage
• have your tests seen a lot of the system?
  – do you have “good coverage”?
• what does it mean if they say “yes”? “85%”? “100%”?
Some travel metrics … coverage
• Have you been to
  – every street in the place where you live?
  – every city/town?
  – every state/province?
  – every country in the world?
  – every continent?
• where haven’t you been?

• Have your tests been to
  – every statement / decision / branch?
  – every data combination?
  – every error message?
  – every menu option?
  – every program / function?
  – every user story option?
• where haven’t your tests been?
What is coverage?
[diagram: a set of tests overlapping part of the system]
What is coverage?
[diagram: the tests overlap part of the system]
• this part of the system has been covered by these tests
• the rest has not been covered by these tests
What is coverage?
[diagram: a larger set of tests over the system]
• these tests give more coverage than the previous set of tests
Tests giving 100% coverage
[diagram: even more tests, covering all of the system]
Great – we’ve tested all of the system! – or have we?
Tested everything?
[diagram: even more tests over many stacked “systems”]
100%? – of what?
• modules, statements, branches, states, data, menu options, functions, business rules, user stories
Statement vs decision coverage

    Statement numbers
    1   read(a)
    2   IF a > 6 THEN
    3     b = a * 2
    4   ENDIF
    5   print b

    Test Case   Input   Expected Output
    A           7       14

    Test Case   Path Taken      Decision Outcome   Decision Coverage   Statement Coverage
    A           1, 2, 3, 4, 5   True               50%                 100%

Great – 100% tested, right?
Statement vs decision coverage

    Statement numbers
    1   read(a)
    2   IF a > 6 THEN
    3     b = a * 2
    4   ENDIF
    5   print b

    Test Case   Input   Expected Output
    A           7       14
    B           3       3

    Test Case   Path Taken      Decision Outcome   Decision Coverage   Statement Coverage
    A           1, 2, 3, 4, 5   True               50%                 100%
    B           1, 2, 4, 5      False              50%                 80%
    Both                                           100%                100%
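The coverage figures in the table can be reproduced with a minimal sketch (not a real coverage tool, and not from the slides): the pseudocode translated to Python and hand-instrumented so that each run records which statement numbers and which decision outcomes it exercised. The function and trace names here are illustrative.

```python
# Hand-instrumented translation of the slide's pseudocode (illustrative names).
def instrumented_program(a, trace):
    trace["statements"].add(1)        # 1: read(a)
    trace["statements"].add(2)        # 2: IF a > 6 THEN
    if a > 6:
        trace["decisions"].add(True)  # decision outcome: True
        trace["statements"].add(3)    # 3: b = a * 2
        b = a * 2
    else:
        trace["decisions"].add(False) # decision outcome: False
        b = None                      # in the pseudocode, b is left unassigned here
    trace["statements"].add(4)        # 4: ENDIF
    trace["statements"].add(5)        # 5: print b
    return b

trace = {"statements": set(), "decisions": set()}
instrumented_program(7, trace)                # test case A only
print(len(trace["statements"]) / 5 * 100)     # 100.0 (statement coverage)
print(len(trace["decisions"]) / 2 * 100)      # 50.0  (decision coverage)
instrumented_program(3, trace)                # add test case B
print(len(trace["decisions"]) / 2 * 100)      # 100.0 (both outcomes now taken)
```

Test A alone exercises all five statements but only the True outcome of the one decision, which is exactly the 100% / 50% split in the table; adding test B brings decision coverage to 100%.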
ISTQB definitions
• coverage
  – the degree, expressed as a percentage, to which a specified coverage item has been exercised by a test suite
• coverage item
  – an entity or property used as a basis for test coverage, e.g. equivalence partitions or code statements
(ISTQB Glossary v1.3)
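The definition is just a ratio; a tiny sketch of it in Python (the function name is mine, not ISTQB's):

```python
# Coverage per the ISTQB definition: the percentage of specified
# coverage items that the test suite has exercised.
def coverage_percent(exercised, all_items):
    return 100.0 * len(set(exercised) & set(all_items)) / len(all_items)

# e.g. coverage items = the five statements of the earlier example
print(coverage_percent({1, 2, 3, 4, 5}, {1, 2, 3, 4, 5}))  # 100.0
print(coverage_percent({1, 2, 4, 5}, {1, 2, 3, 4, 5}))     # 80.0
```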
What is coverage?
• coverage is a relationship
  – between a set of tests
  – and some countable part of the system
• 100% coverage is not 100% tested
  – 100% of some countable things, among several types of countable things
• an objective measurement of some aspect of thoroughness
  – if thoroughness is what you want!
Contents
• Analogy with travelling
• What is coverage?
• Should testing be thorough?
• What coverage is not (often mistaken for)
• The four caveats of coverage
• The question you should ask
Twitter: @DorothyGraham
Thoroughness of testing
• should testing be thorough? why?
• is testing like butter or like strawberry jam?
Testing is like butter?
• spread it evenly over the bread
• same thickness throughout
• no part not covered
• every part of the software is tested to the same extent
Testing is like strawberry jam?
• thicker in some places than others
• big lumps
• some parts not covered
• should some parts of the software be tested more than others?
Breadth or depth? the coverage illusion
[diagram: major bugs and minor bugs scattered across the system]
• Breadth / width is coverage
Breadth or depth? the coverage illusion
• Breadth / width is coverage
• Depth / lumpy testing is selective
• “I’ve covered / tested everything – haven’t missed anything!”
  – an illusion, a trap
• What is better testing?
What’s the goal for testing?
• Width
  – every part has been tested once
  – may be required by regulatory bodies
  – wide view, no area untouched
  – may miss something
• Selected depth
  – not all parts of the system are equally important or equally risky
  – focus on where testing brings greatest value
  – deep view, concentrate on critical parts
  – may miss something
Contents
• Analogy with travelling
• What is coverage?
• Should testing be thorough?
• What coverage is not (often mistaken for)
• The four caveats of coverage
• The question you should ask
Twitter: @DorothyGraham
Coverage is NOT
[diagram: the tests compared with the tests themselves, not with the system]
• “we’ve run all of the tests” [that we have thought of]
• this is test completion! don’t call it “coverage”!
Contents
• Analogy with travelling
• What is coverage?
• Should testing be thorough?
• What coverage is not (often mistaken for)
• The four caveats of coverage
• The question you should ask
Twitter: @DorothyGraham
1 A single measure is only one level of coverage out of many
• what is the “right” level of coverage?
  – e.g. city, state, country?
  – statement, menu options, user stories?
  – 100%? 80%? See what you have tested / missed?
[diagram: the same system at different coverage levels]
2 Only need one test to cover
• to cover Australia or Brazil
  – only visit one city per country
• to cover a “coverage element” (statement, decision outcome, menu option)
  – only one test per element
• might be hundreds of ways that element is used
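This caveat can be sketched with a hypothetical one-statement function (not from the slides): a single test "ticks the box" for 100% statement coverage, yet other ways of exercising the very same statement still fail.

```python
# A hypothetical function with one statement, used for illustration.
def reciprocal_scaled(x):
    return 10 / x             # the only statement

print(reciprocal_scaled(2))   # 5.0 -> the statement is covered: 100% coverage
# reciprocal_scaled(0)        # ...but this use of the same statement raises
#                             # ZeroDivisionError, which that one test never sees
```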
3 Not related to quality of the software

    Statement numbers
    1   read(a)
    2   IF a > 6 THEN
    3     b = a * 2
    4   ENDIF
    5   print b

    Test Case   Input   Expected Output
    A           7       14
    B           3       3

    Test Case   Path Taken      Decision Outcome   Decision Coverage   Statement Coverage
    A           1, 2, 3, 4, 5   True               50%                 100%
    B           1, 2, 4, 5      False              50%                 80%

what gets printed?
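Translated literally to Python (a sketch, not the speaker's code), b is assigned only inside the IF, so with a = 3 the program reaches "print b" with b never set: decision coverage is achieved, yet the expected output 3 never appears.

```python
# Literal translation of the pseudocode: b exists only on the True path.
def program(a):
    if a > 6:
        b = a * 2
    return b              # UnboundLocalError when the branch was not taken

assert program(7) == 14   # test case A passes
try:
    program(3)            # test case B: expected 3, but...
except UnboundLocalError:
    print("bug: b was never assigned")
```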
3 Not related to quality of the software
• we still have decision coverage, even though the test fails (a bug)
• statement coverage only wouldn’t find the bug
• client story
  – poor quality software from a 3rd party
  – required decision coverage (80%?)
  – 3rd party got a tool, demonstrated coverage
  – the software was still rubbish
    • tests didn’t pass, lots of bugs
    • but the tests exercised the required decisions
4 Coverage of what exists, not what should exist
[diagram: the system as built vs the system as needed; coverage (White Box) measures only the system as built, so the needed-but-missing parts are not tested, even with 100% coverage]
4 Coverage of what exists, not what should exist
• the map didn’t show some cities I have visited
  – are they important?
• coverage does not show missing functions or features
  – are they important?
• what about requirements coverage?
  – a good idea, but you still only test the requirements you listed
Coverage traps
• the four coverage caveats
  – only one level / aspect of thoroughness
  – only needs one test to “tick the box”
  – not related to how good the tests or software are
  – only what is there, not what’s missing
• 100% coverage is NOT 100% tested!
• “coverage” feels comfy, reassuring
  – insurance: you’re covered
  – you think you haven’t missed anything
Contents
• Analogy with travelling
• What is coverage?
• Should testing be thorough?
• What coverage is not (often mistaken for)
• The four caveats of coverage
• The question you should ask
Twitter: @DorothyGraham
Have you heard:
• “We need to increase our coverage”
• “Make sure you cover 100%!”
• “What coverage are we getting?”
• “If we automate, we will get better coverage”
• “We need as much coverage as possible”
Next time you hear “we need coverage”, ask: “of what?”
• exactly what countable things need to be covered by tests?
• why is it important to test them [all]?
• how “deeply” should we cover things?
  – would testing be more effective if lumpy, not smooth?
• what can we safely not test (this time)?
  – we always miss something
  – better to miss it on purpose than fool yourself into thinking you haven’t missed anything
Summary
• coverage is a relationship
  – between tests and what-is-tested
• coverage is not:
  – test completion (my tests tested what they tested)
  – 100% tested: only in one dimension
• beware the coverage traps
• when you hear “coverage”, ask “of what?”
www.DorothyGraham.co.uk
email: [email protected]