Exploratory and Experience Based Testing
21.11.2011
Juha Itkonen
Aalto University School of Science, Department of Computer Science and Engineering



Page 1:

Exploratory and Experience Based Testing 21.11.2011

Juha Itkonen Aalto University School of Science Department of Computer Science and Engineering

Page 2:

Contents
  Intelligent Manual Testing
    – Experience-based testing
    – Exploratory testing
  Ways of Exploring
    – Session-Based Test Management
    – Touring testing
  Intelligent Manual Testing Practices
    – Examples of empirically identified testing practices
  Benefits of Experience-Based Testing

Juha Itkonen, 2011 SoberIT

2

Page 3:

Manual Testing
  Testing that is performed by human testers
  Stereotype of manual testing
    – Executing detailed pre-designed test cases
    – Mechanically following the step-by-step instructions
    – Treated as work that anybody can do


Research has shown:
1. Individual differences in testing are high
2. Test case design techniques alone do not explain the results

In practice, it's clear that some testers are better than others in manual testing and more effective at revealing defects...

Image: Salvatore Vuono

Page 4:

Traditional emphasis on test documentation
  Test case design and documentation are overemphasized
    – Both in textbooks and research
  Test cases make test designs tangible, reviewable, and easy to plan and track – i.e., manage and control
  In many contexts test cases and other test design documentation are needed
  The level and type of test documentation should vary based on context

Page 5:

Experience is invaluable in software testing
  Domain experience
    – Knowledge and skills gained in the application domain area
    – How the system is used in practice, and by whom
    – What are the goals of the users
    – How the system is related to the customer's (business) processes
  Technical system experience
    – How the system was built
    – What are the typical problems and defects
    – How the system is used and how all the details work
    – How things work together and interact
  Testing experience
    – Knowledge of testing methods and techniques
    – Testing skills grown in practice

Page 6:

Software testing…
  is creative and exploratory work
  requires skills and knowledge
    – application domain
    – users' processes and objectives
    – some level of technical details and history of the application under test
  requires a certain kind of attitude

Page 7:

Tester's Attitude
  People tend to see what they want or expect to see
    – If you want to show that the software works correctly, you will miss defects
  Tester's goal is to "break the software"
    – Reveal all relevant defects
    – Find out any problems real users would experience in practice
  Testing is all about exceptions, special cases, invalid inputs, error situations, and complicated unexpected combinations


Photo by Arvind Balaraman

Page 8:

Tester's Goal
  Explore, investigate, and measure
  Provide quality-related information for other stakeholders in a useful form
  The tester's attitude is destructive towards the software under test, but highly constructive towards people


Photo by Graur Codrin

Page 9:

My viewpoint: Experience Based – Intelligent – Manual Testing
  Manual testing that builds on the tester's experience
    – knowledge and skills
  Some aspects of testing rely on the tester's skills during testing
    – e.g., input values, expected results, or interactions
  Testers are assumed to know what they are doing
    – Testing does not mean executing detailed scripts
  Focus on the actual testing work in practice
    – What happens during testing activities?
    – How are defects actually found?
    – Experience-based and exploratory aspects of software testing

Page 10:

Exploratory Testing is creative testing without predefined test cases, based on the knowledge and skills of the tester
1. Tests are not defined in advance
   – Exploring with a general mission, without specific step-by-step instructions on how to accomplish it
2. Testing is guided by the results of previously performed tests and the knowledge gained from them
   – Testers can apply deductive reasoning to the test outcomes
3. The focus is on finding defects by exploration
   – Instead of demonstrating systematic coverage
4. Parallel learning of the system under test, test design, and test execution
5. Experience and skills of the individual tester strongly affect effectiveness and results

Page 11:

Document-driven vs. exploratory testing

[Figure: two diagrams contrasting how tests map onto features A, B, and C under the document-driven and the exploratory approach]

Page 12:

Exploratory Testing is an approach
  Most testing techniques can be used in an exploratory way
  Exploratory testing and (automated) scripted testing are the ends of a continuum


The continuum, from fully exploratory to fully scripted:
  Freestyle exploratory "bug hunting"
  Chartered exploratory testing
  High-level test cases
  Manual scripts
  Pure scripted (automated) testing

Page 13:

Lateral thinking
  Allowed to be distracted
  Find side paths and explore interesting areas
  Periodically check your status against your mission

Page 14:

Scripted vs. Exploratory Tests – the minefield analogy


[Figure: minefield diagram, legend: bugs, fixes]

Page 15:


Two views of agile testing

eXtreme Testing
  Automated unit testing
    – Developers write tests
    – Test-first development
    – Daily builds with unit tests always 100% pass
  Functional (acceptance) testing
    – Customer-owned
    – Comprehensive
    – Repeatable
    – Automatic
    – Timely
    – Public
Focus on automated verification – enabling agile software development

Exploratory Testing
  Utilizes professional testers' skills and experience
  Optimized to find bugs
  Minimizing time spent on documentation
  Continually adjusting plans, re-focusing on the most promising risk areas
  Following hunches
  Freedom, flexibility and fun for testers
Focus on manual validation – making testing activities agile

Page 16:

Contents
  Intelligent Manual Testing
    – Experience-based testing
    – Exploratory testing
  Ways of Exploring
    – Session-Based Test Management
    – Touring testing
  Intelligent Manual Testing Practices
    – Examples of empirically identified testing practices
  Benefits of Experience-Based Testing

Page 17:

Some ways of exploring in practice
  Freestyle exploratory testing
    – Unmanaged ET
  Functional testing of individual features
  Exploring high-level test cases
  Exploratory regression testing
    – by verifying fixes or changes
  Session-based exploratory testing
  Exploring like a tourist
  Outsourced exploratory testing
    – Advanced users, strong domain knowledge
    – Beta testing

Page 18:

Session Based Test Management (SBTM)
  Charter
  Time Box
  Reviewable Result
  Debriefing

Bach, J. "Session-Based Test Management", STQE, vol. 2, no. 6, 2000. http://www.satisfice.com/articles/sbtm.pdf
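Bach's four elements map naturally onto a small record. A minimal sketch in Python; the class and field names are illustrative, not taken from Bach's article:

```python
from dataclasses import dataclass, field

@dataclass
class TestSession:
    """One time-boxed SBTM session (field names are illustrative)."""
    charter: str                                # mission: what to explore
    time_box_minutes: int = 90                  # sessions are typically 60-120 min
    notes: list = field(default_factory=list)   # raw notes, the reviewable result
    bugs: list = field(default_factory=list)    # defects found in the session

    def debrief(self) -> str:
        """One-line summary a test lead could go through in the debriefing."""
        return (f"Charter: {self.charter} | {self.time_box_minutes} min | "
                f"{len(self.bugs)} bug(s), {len(self.notes)} note(s)")

session = TestSession(charter="Explore saving and reloading documents")
session.notes.append("Save dialog forgets the last-used folder")
session.bugs.append("Crash when saving to a read-only location")
print(session.debrief())
```

The point of the structure is that planning and tracking happen at the level of charters and sessions, not individual test steps.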

Page 19:

Session-Based Testing – a way to manage ET
  Enables planning and tracking of exploratory testing
    – Without detailed test (case) designs
    – Dividing testing work into small chunks
    – Tracking testing work in time-boxed sessions
  Efficient – no unnecessary documentation
  Agile – it's easy to focus testing on the most important areas based on test results and other information
    – Changes in requirements, increasing understanding, revealed problems, identified risks, …
  Explicit, scheduled sessions can help get testing done
    – when resources are scarce
    – when testers are not full-time testers...

Page 20:

Exploring like a tourist – a way to guide ET sessions
  Touring tests use the tourist metaphor to guide testers' actions
  Focus on intent rather than separate features
    – This intent is communicated as tours in different districts of the software


James A. Whittaker. Exploratory Software Testing, Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley, 2009.

Page 21:

Districts and Tours
  Business district
    – Guidebook tour
    – Money tour
    – Landmark tour
    – Intellectual tour
    – FedEx tour
    – After-hours tour
    – Garbage collector's tour
  Historical district
    – Bad-neighborhood tour
    – Museum tour
    – Prior version tour
  Entertainment district
    – Supporting actor tour
    – Back alley tour
    – All-nighter tour
  Tourist district
    – Collector's tour
    – Lonely businessman tour
    – Supermodel tour
    – TOGOF tour
    – Scottish pub tour
  Hotel district
    – Rained-out tour
    – Couch potato tour
  Seedy district
    – Saboteur tour
    – Antisocial tour
    – Obsessive-compulsive tour


James A. Whittaker. Exploratory Software Testing, Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley, 2009.

Page 22:

Examples of exploratory testing tours

The Guidebook Tour
  Use the user manual or other documentation as a guide
  Test rigorously by the guide
  Tests the details of important features
  Tests also the guide itself
  Variations
    – Blogger's tour: use third-party advice as a guide
    – Pundit's tour: use product reviews as a guide
    – Competitor's tour

The Garbage Collector's Tour
  Choosing a goal and then visiting each item by the shortest path
  Screen-by-screen, dialog-by-dialog, feature-by-feature, …
  Test every corner of the software, but not very deeply in the details

The All-Nighter Tour
  Never close the app; use the features continuously
    – keep software running
    – keep files open
    – connect and don't disconnect
    – don't save
    – move data around, add and remove
    – sleep and hibernation modes ...

Page 23:

Contents
  Intelligent Manual Testing
    – Experience-based testing
    – Exploratory testing
  Ways of Exploring
    – Session-Based Test Management
    – Touring testing
  Intelligent Manual Testing Practices
    – Examples of empirically identified testing practices
  Benefits of Experience-Based Testing

Page 24:

  Empirically observed practices from industry
  Testing, not test case pre-design
  Practices work on different levels of abstraction
    – Many practices are similar to traditional test case design techniques
    – Many practices are similar to more general testing strategies, heuristics, or rules of thumb

Page 25:

Overall strategies
  – Structuring testing work
  – Guiding a tester through features

Detailed techniques
  – Low-level test design
  – Defect hypotheses
  – Checking the test results

Page 26:

Overall strategies

Exploratory:
  – Exploring weak areas
  – Aspect-oriented testing
  – User interface exploring
  – Top-down functional exploring
  – Simulating a real usage scenario
  – Smoke testing by intuition and experience

Documentation based:
  – Data as test cases
  – Exploring high-level test cases
  – Checking new and changed features

Page 27:

Detailed techniques

Exploratory:
  – Testing alternative ways
  – Exploring against old functionality
  – Simulating abnormal and extreme situations
  – Persistence testing
  – Feature interaction testing
  – Defect-based exploring

Input:
  – Testing input alternatives
  – Testing boundaries and restrictions
  – Covering input combinations
  – Testing to-and-from the feature

Comparison:
  – Comparing with another application or version
  – Comparing within the software
  – Checking all the effects
  – End-to-end data check

Page 28:

Basic Objectives in Testing Activities
  Exploring: Guiding the tester through the functionality
  Coverage: Selecting what gets tested – and what not
  Oracle: Deciding if the results are correct
  Risks: Detecting specific types of defects
  Prioritization: Selecting what to test first

Page 29:

<exploratory strategy> Exploring weak areas
  Description: Exploring areas of the software that are weak or risky, based on the experience and knowledge of the tester.
  Goal: Reveal defects in areas that are somehow known to be risky; focus testing on risky areas.
    – complicated
    – coded in a hurry
    – lots of changes
    – coders' opinion
    – testers' opinion
    – based on who implemented it
    – a hunch...

Page 30:

<exploratory strategy> Top-down functional exploring
  Description: Proceeding in testing by first going through typical cases and simple checks, then gradually going deeper into the details of the tested functionality and applying more complicated tests.
  Goal: To get first a high-level understanding of the function, and then deeper confidence in its quality step-by-step.
    – Is this function implemented?
    – Does it do the right thing?
    – Is there missing functionality?
    – Does it handle the exceptions and special cases?
    – Does it work together with the rest of the system?
    – How about error handling and recovery?
    – …

Page 31:

<documentation based strategy> Using data as test cases
  Description: A pre-defined test data set includes all relevant cases and combinations of different data and situations. Covering all cases in the pre-defined test data set provides the required coverage.
    – Testing is exploratory, but the pre-defined data set is used to achieve systematic coverage.
    – Suitable for situations where the data is complex but the operations are simple, or when creating the data requires much effort.
  Goal: To manage exploratory testing based on pre-defined test data. Achieve and measure coverage in exploratory testing.
  Example: Different types of customers in a CRM system.
    – User privileges
    – Situation, services, relationships
    – History, data
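The strategy can be sketched as a simple coverage checklist over a fixed data set: testers explore freely, but each visited data case is ticked off until the whole set is covered. The customer records below are invented placeholders, not the slide's actual CRM data:

```python
# Minimal sketch: tracking exploratory coverage over a pre-defined data set.
# The customer records are invented placeholders for illustration.
test_data = {
    "basic_customer":  {"privileges": "read"},
    "admin_customer":  {"privileges": "admin"},
    "legacy_customer": {"privileges": "read", "history": "migrated"},
}

covered = set()

def explore(case_id):
    """Mark a data case as exercised during an exploratory session."""
    if case_id not in test_data:
        raise KeyError(f"unknown case: {case_id}")
    covered.add(case_id)

explore("basic_customer")
explore("admin_customer")

coverage = len(covered) / len(test_data)
print(f"data-set coverage: {coverage:.0%}")
```

The data set, not a script, is the coverage model: testing stops when every case has been visited, however the tester chose to exercise it.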

Page 32:

<comparison technique> Comparing within the software
  Description: Comparing similar features in different places of the same system and testing their consistency.
  Goal: Investigating and revealing problems in the consistency of functionality inside the software; helps decide whether a feature works correctly or not.
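As a sketch, the technique amounts to running the same inputs through two features that should agree and flagging any divergence. Both parser functions here are hypothetical stand-ins for similar features living in different parts of one system:

```python
# Hypothetical stand-ins for two date parsers that should behave consistently,
# e.g. one behind a search form and one behind a report filter.
def parse_date_search(s):
    return s.replace("/", "-")

def parse_date_report(s):
    return s.replace("/", "-").strip()

def compare_within(inputs, feature_a, feature_b):
    """Return the inputs on which the two similar features disagree."""
    return [s for s in inputs if feature_a(s) != feature_b(s)]

inconsistent = compare_within(
    ["2011/11/21", " 2011/11/21"],   # second input has a leading space
    parse_date_search, parse_date_report,
)
print(inconsistent)   # the padded input exposes the inconsistency
```

Note that the comparison needs no external oracle: the system's own consistency is the oracle.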

Page 33:

<input technique> Testing to-and-from the feature
  Description:
    – Test all things that affect the feature
    – Test all things that are affected by the feature
  Goal: Systematically cover the feature's interactions. Reveal defects caused by a not-the-most-obvious relationship between the tested feature and other features or the environment.
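The two directions can be made concrete as a checklist generator: list what feeds into the feature and what the feature feeds, then enumerate both directions as test items. The feature names are invented for illustration:

```python
# Invented example: interactions of a hypothetical "save_document" feature.
affects_feature = ["autosave_timer", "user_preferences", "open_file_handle"]
affected_by_feature = ["recent_files_list", "window_title", "backup_copy"]

def to_and_from_checklist(feature, inbound, outbound):
    """Enumerate both interaction directions as concrete test items."""
    checks = [f"{src} -> {feature}" for src in inbound]      # things that affect it
    checks += [f"{feature} -> {dst}" for dst in outbound]    # things it affects
    return checks

for item in to_and_from_checklist("save_document",
                                  affects_feature, affected_by_feature):
    print(item)
```

Writing the two lists down is most of the work; the non-obvious interactions the technique targets are exactly the ones that tend to be missing from the first draft.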

Page 34:

Ways of utilizing IMT Practices
  – Training testers
  – Guiding test execution
  – Test documentation and tracking
  – Test patterns for different situations

Page 35:

Contents
  Intelligent Manual Testing
    – Experience-based testing
    – Exploratory testing
  Ways of Exploring
    – Session-Based Test Management
    – Touring testing
  Intelligent Manual Testing Practices
    – Examples of empirically identified testing practices
  Benefits of Experience-Based Testing

Page 36:

Strengths of experience based testing – Testers' skills
  Utilizing the skills and experience of the tester
    – Testers know how the software is used and for what purpose
    – Testers know what functionality and features are critical
    – Testers know what problems are relevant
    – Testers know how the software was built
  Risks, tacit knowledge
  Enables creative exploring
  Enables fast learning and improving testing
    – Investigating, searching, finding, combining, reasoning, deducting, ...
  Testing intangible properties
    – "Look and feel" and other user perceptions

Page 37:

Strengths of experience based testing – Process
  Agility and flexibility
    – Easy and fast to focus on critical areas
    – Fast reaction to changes
    – Ability to work with missing or weak documentation
  Effectiveness
    – Reveals a large number of relevant defects
  Efficiency
    – Low documentation overhead
    – Fast feedback

Page 38:

Challenges of experience based testing
  Planning and tracking
    – How much testing is needed? How long does it take?
    – What is the status of testing?
    – How to share testing work between testers?
  Managing test coverage
    – What has been tested?
    – When are we done?
  Logging and reporting
    – Visibility outside the testing team, or outside individual testing sessions
  Quality of testing
    – How to assure the quality of the tester's work
    – Detailed test cases can at least be reviewed

Page 39:

Reasons for documenting test cases
  Optimizing
    – Selecting an optimal test set
    – Avoiding redundancy
  Organization
    – Organized so that tests can be reviewed and used effectively
    – Selecting and prioritizing
  Repeatability
    – Know what test cases were run and how, so that you can repeat the same tests
  Tracking
    – What requirements, features, or components are tested
    – What is the coverage of testing
    – How is testing proceeding? Are we going to make the deadline?
  Proof of testing
    – Evaluating the level of confidence
    – How do we know what has been tested?

Page 40:

Detail level of test cases
  Experienced testers need less detailed test cases
    – More experienced as testers
    – More familiar with the software and application domain
  Input conditions
    – Depend on the testing technique and the goals of testing
    – E.g., if the goal is to cover all pairs of certain input conditions, the test cases have to be more detailed than in scenario testing
  Expected results
    – More detail is required if the result is not obvious, requires complicated comparison, etc.
    – An inexperienced tester needs more guidance on what to pay attention to
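The all-pairs remark can be made concrete: enumerating every value pair across every pair of parameters shows why such test cases must be spelled out in more detail than free-form scenarios. The parameters below are invented for illustration:

```python
from itertools import combinations, product

# Invented input parameters for illustration.
params = {
    "browser":  ["Firefox", "Chrome"],
    "language": ["fi", "en"],
    "role":     ["user", "admin"],
}

# For pairwise coverage, every pair of parameters, and every value combination
# within each pair, must be hit by at least one concrete test case.
required_pairs = set()
for (p1, v1s), (p2, v2s) in combinations(params.items(), 2):
    for v1, v2 in product(v1s, v2s):
        required_pairs.add(((p1, v1), (p2, v2)))

print(len(required_pairs))   # 3 parameter pairs x 2x2 values = 12 pairs
```

A scenario test can leave these values implicit; a pairwise test case cannot, which is exactly the slide's point about detail level following the testing goal.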

Page 41:

Why should we document the expected outcome?
  The expected values explicitly define what the correct result is
    – Important if the correct result is not obvious
  If expected values are not defined
    – The tester wants and expects to see the correct behavior
    – The tester has less work if the software works correctly
    – The tester does not know the correct behavior
    – The tester assumes the correct behavior
    – Many defects are not found

"Looks OK to me!"

On the other hand... if provided with detailed expected results, the tester just looks for the exact details pointed out and ignores everything else – many unexpected defects are missed.

Page 42:

Experimental Comparison of ET and Test Case Based Testing (TCBT)

Itkonen, J., Mäntylä, M. V. & Lassenius, C., "Defect Detection Efficiency: Test Case Based vs. Exploratory Testing", in Proceedings of the International Symposium on Empirical Software Engineering and Measurement, pp. 61-70, 2007.

  Effectiveness in terms of revealed defects; test execution time was fixed
  No difference in effectiveness
    – ET revealed more defects, but with no statistically significant difference
  TCBT required much more effort
    – Test case design before the test execution
  TCBT produced twice as many false defect reports as ET

Page 43:

Who tested my software?

Mäntylä, M. V., Itkonen, J. & Iivonen, J., "Who Tested My Software? Testing as an Organizationally Cross-Cutting Activity", Software Quality Journal, 2011.

  Testing is not an activity performed solely by specialists. In all our cases, people in roles varying from sales to software development found a substantial number of defects.
  Validation from the viewpoint of end users was found more valuable than verification aiming at zero-defect software.

Page 44:

Who tested my software?
  Developers' defects had the highest fix rate and specialized testers' defects had the lowest fix rate.
  People with a personal stake in the product (e.g., sales and consulting personnel) tend to place more importance on their defects, but it does not seem to improve their fix ratios.

Page 45:

The role of knowledge in failure detection

Itkonen, J., Mäntylä, M. V. & Lassenius, C., "The Role of Knowledge in Failure Detection During Exploratory Software Testing", in review for IEEE Transactions on Software Engineering, 2011.

  Detailed analysis of 91 defect detection incidents from video-recorded exploratory testing sessions
  Analysed what type of knowledge is required for detecting failures
  Analysed failure detection difficulty

Page 46:

The role of knowledge in failure detection – findings
  Knowledge utilization
    – Testers are able to utilize their personal knowledge of the application domain, the users' needs, and the tested system for defect detection.
  Side-effect bugs
    – In exploratory testing, testers frequently recognize relevant failures in a wider set of features than the actual target features of the testing activity.
  Obvious failures
    – A large number of the failures in software applications and systems can be detected without detailed test design or descriptions.
  Application domain related failures are simple to reveal
    – The majority of failures detected by domain knowledge are straightforward to recognize.

Page 47:

The role of knowledge in failure detection – conclusions
  The ET approach can be effective even when less experienced testers are used.
  Not all testing needs to be scripted or rigorously (pre)designed.
    – A lot of benefits can be achieved by efficient exploring
  The ET approach is an effective way of involving the knowledge of domain experts in testing activities
    – even those who do not have testing expertise

Page 48:

Questions and more discussion

Contact information:
Juha Itkonen
[email protected]
+358 50 577 1688
http://www.soberit.hut.fi/jitkonen

Page 49:

References (primary)

Bach, J., 2000. Session-Based Test Management. Software Testing and Quality Engineering, 2(6). Available at: http://www.satisfice.com/articles/sbtm.pdf.

Bach, J., 2004. Exploratory Testing. In E. van Veenendaal, ed. The Testing Practitioner. Den Bosch: UTN Publishers, pp. 253-265. http://www.satisfice.com/articles/et-article.pdf.

Itkonen, J. & Rautiainen, K., 2005. Exploratory testing: a multiple case study. In Proceedings of International Symposium on Empirical Software Engineering. International Symposium on Empirical Software Engineering. pp. 84-93.

Itkonen, J., Mäntylä, M.V. & Lassenius, C., 2007. Defect Detection Efficiency: Test Case Based vs. Exploratory Testing. In Proceedings of International Symposium on Empirical Software Engineering and Measurement. International Symposium on Empirical Software Engineering and Measurement. pp. 61-70.

Itkonen, J., Mantyla, M. & Lassenius, C., 2009. How do testers do it? An exploratory study on manual testing practices. In Empirical Software Engineering and Measurement, 2009. ESEM 2009. 3rd International Symposium on. Empirical Software Engineering and Measurement, 2009. ESEM 2009. 3rd International Symposium on. pp. 494-497.

Itkonen, J., 2011. Empirical Studies on Exploratory Software Testing. Doctoral dissertation, Aalto University School of Science. http://lib.tkk.fi/Diss/2011/isbn9789526043395/

Lyndsay, J. & van Eeden, N., 2003. Adventures in Session-Based Testing. Available at: http://www.workroom-productions.com/papers/AiSBTv1.2.pdf.

Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of International Conference on Software Engineering. International Conference on Software Engineering. pp. 602-611.

Whittaker, J.A., 2009. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design, Addison-Wesley Professional.

Page 50:

References (secondary)

Agruss, C. & Johnson, B., 2005. Ad Hoc Software Testing.

Ammad Naseer & Marium Zulfiqar, 2010. Investigating Exploratory Testing in Industrial Practice. Master's Thesis. Rönneby, Sweden: Blekinge Institute of Technology. Available at: http://www.bth.se/fou/cuppsats.nsf/all/8147b5e26911adb2c125778f003d6320/$file/MSE-2010-15.pdf.

Armour, P.G., 2005. The unconscious art of software testing. Communications of the ACM, 48(1), 15-18.

Beer, A. & Ramler, R., 2008. The Role of Experience in Software Testing Practice. In Proceedings of Euromicro Conference on Software Engineering and Advanced Applications. Euromicro Conference on Software Engineering and Advanced Applications. pp. 258-265.

Houdek, F., Schwinn, T. & Ernst, D., 2002a. Defect Detection for Executable Specifications — An Experiment. International Journal of Software Engineering & Knowledge Engineering, 12(6), 637.

Kaner, C., Bach, J. & Pettichord, B., 2002. Lessons Learned in Software Testing, New York: John Wiley & Sons, Inc.

Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of International Conference on Software Engineering. International Conference on Software Engineering. pp. 602-611.

Tinkham, A. & Kaner, C., 2003a. Learning Styles and Exploratory Testing. In Pacific Northwest Software Quality Conference (PNSQC). Pacific Northwest Software Quality Conference (PNSQC).

Wood, B. & James, D., 2003. Applying Session-Based Testing to Medical Software. Medical Device & Diagnostic Industry, 90.

Våga, J. & Amland, S., 2002. Managing High-Speed Web Testing. In D. Meyerhoff et al., eds. Software Quality and Software Testing in Internet Times. Berlin: Springer-Verlag, pp. 23-30.
