MTAT.03.159: Software Testing
Lecture 02: Black-Box Testing and Usability Testing (Textbook Ch. 4 & 12.8)
Dietmar Pfahl, email: [email protected], Spring 2015
MTAT.03.159 / Lecture 02 / © Dietmar Pfahl 2015


Page 1:

MTAT.03.159: Software Testing

Lecture 02:

Black-Box Testing and

Usability Testing

(Textbook Ch. 4 & 12.8) Dietmar Pfahl

email: [email protected] Spring 2015

Page 2:

Lecture

• Chapter 4: Test case design

– Black-box testing techniques ( Lab 2)

• Chapter 12: Evaluating software quality

– Usability testing

Page 3:

Exam Dates

Exam 1: Thursday, 21 May, 10:15-12:00

Students graduating in June have preference!

Exam 2: Thursday, 28 May, 10:15-12:00

You must have at least 20 marks from the homework assignments (labs 1-6) to qualify for the exam.

You must receive at least 10 marks in the exam to not fail the course.

In total, you need at least 50 marks to not fail the course.

Page 4:

Message Board

• Please sign up to the message board (in the course wiki)

• Use it for questions on lectures and exam

Page 5:

Structure of Lecture 2

• Black-Box vs. White-Box Testing

• Black-Box Testing Methods

• Usability Testing

• Lab 2

Page 6:

Why Test Case Design Techniques?

Exhaustive testing (use of all possible inputs and conditions) is impractical

• Must use a subset of all possible test cases

• Must have high probability of detecting faults

Need processes that help us select test cases

• Different testers should have an equal probability of detecting faults

Effective testing – detect more faults

• Focus attention on specific types of faults

• Know you’re testing the right thing

Efficient testing – detect faults with less effort

• Avoid duplication

• Systematic techniques are measurable and repeatable

Page 7:

Basic Testing Strategies

Page 8:

Black-Box vs. White-Box

External/user view:

Check conformance with specification -> function coverage

Abstraction from details:

Source code not needed

Scales up:

Different techniques at different levels of granularity

USE

Internal/developer view:

Allows tester to be confident about code coverage

Based on control or data flow:

Easier debugging

Does not scale up:

Most useful at unit & integration testing levels, as well as regression testing

BOTH!

Page 9:

Black-Box vs. White-Box

External/user view:

Check conformance with specification -> function coverage

Abstraction from details:

Source code not needed

Scales up:

Different techniques at different levels of granularity

USE

Internal/developer view:

Allows tester to be confident about code coverage

Based on control or data flow:

Easier debugging

Does not scale up:

Most useful at unit & integration testing levels, as well as regression testing

BOTH!

Gray-Box Testing: Combines black-box and white-box testing; typically, the focus is on input/output testing (black-box view), informed by structural information about the code (white-box view). Example: the tester knows that certain constraints on the input are checked by the unit under test. Application, e.g., in regression testing: apply (or update) black-box test cases only where the code has changed.

Page 10:

Black-Box vs. White-Box

Implementation Specification

Unexpected functionality: Cannot be (directly) revealed by black-box techniques

Missing functionality: Cannot be (directly) revealed by white-box techniques

System

Page 11:

Types of Testing

unit

integration

system

efficiency

maintainability

functionality

white box black box

Level of detail

Accessibility

Characteristics

usability

reliability

(module)

portability

Page 12:

Structure of Lecture 2

• Black-Box vs. White-Box Testing

• Black-Box Testing Methods

• Usability Testing

• Lab 2

Page 13:

Black-Box Testing Methods

• Equivalence class partitioning (ECP)

• Boundary value analysis (BVA)

• Cause and effect graphing

• Combinatorial testing

• State transition testing

• Error guessing / Exploratory testing

• Usability testing

Page 14:

Equivalence Class Partitioning

• Split input/output into classes which the software handles equivalently

• Select test cases to cover each class


Page 15:

Example – Insurance System

Specification Statement:

• System shall reject over-age insurance applicants

Specification Item:

• Reject male insurance applicants, if over the age of 80 years on day of application

• Reject female insurance applicants, if over the age of 85 years on day of application

Page 16:

Example (cont.) Input: Gender & Age

Output: accept/reject

Classes

C1: InputGender: Male

C2: InputGender: Female

C3: InputAge: >85

C4: InputAge: (80, 85]

C5: InputAge: <=80

C6: Output: accept

C7: Output: reject

Test Cases

TC1: male, 83, reject

TC2: male, 56, accept

TC3: female, 101, reject

TC4: female, 83, accept

TC5: female, -3, ??

TC6: fmale, 14, ??

minimal, cover all classes
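The equivalence classes and test cases above can be exercised against code. The sketch below is a hypothetical implementation of the insurance rule (the function name, the upper age bound of 130, and the string results are illustrative assumptions, not from the lecture):

```python
# Hypothetical implementation of the specification: reject males over 80,
# reject females over 85; invalid gender or age yields an error.
def evaluate_application(gender: str, age: int) -> str:
    if gender not in ("male", "female") or not (0 <= age <= 130):  # 130 is an assumed sanity bound
        return "error"
    if gender == "male" and age > 80:
        return "reject"
    if gender == "female" and age > 85:
        return "reject"
    return "accept"

# One representative per equivalence class (minimal set covering all classes):
cases = [
    ("male", 83, "reject"),     # male, age in (80, 85]
    ("male", 56, "accept"),     # age <= 80
    ("female", 101, "reject"),  # female, age > 85
    ("female", 83, "accept"),   # female, age in (80, 85]
    ("female", -3, "error"),    # invalid age
    ("fmale", 14, "error"),     # invalid gender
]
for gender, age, expected in cases:
    assert evaluate_application(gender, age) == expected
```

Note that the two "??" outcomes on this slide (negative age, misspelled gender) only get a defined expected result once the invalid classes of the next slide are added.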

Page 17:

Example (cont.) Input: Gender & Age

Output: accept/reject

Classes

C1: InputGender: Male

C2: InputGender: Female

C3: InputGender: invalid

C4: InputAge: >85

C5: InputAge: (80, 85]

C6: InputAge: [18, 80]

C7: InputAge: [0, 18)

C8: InputAge: invalid

C9: Output: accept

C10: Output: reject

C11: Output: error

Test Cases

TC1: male, 83, reject

TC2: male, 56, accept

TC3: female, 101, reject

TC4: female, 83, accept

TC5: female, -3, error

TC6: fmale, 14, error

...

Page 18:

ECP Guidelines [Myers 79/04]

If the input condition:

• is a range, e.g., x = [0, 9], or an ordered list of values, e.g., owner = <1, 2, 3, 4>: one in-range/list and two out-of-range/list classes are defined

• is a set, e.g., vehicle = [car, motorcycle, truck], or a "must be" condition (boolean): one in-set and one out-of-set class are defined

• is anything else (e.g., invalid): partition further

Have enough test cases to cover all valid input classes.
Have one test case for each invalid input class.
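These guidelines can be mechanized. The helper functions below are an illustrative sketch (the function and key names are assumptions, not Myers' notation):

```python
# Sketch: derive equivalence-class representatives from an input condition,
# following the guidelines above (one in-range and two out-of-range classes
# for a range; one in-set and one out-of-set class for a set).
def classes_for_range(lo: int, hi: int) -> dict:
    """Representatives for a range condition x = [lo, hi]."""
    return {"in_range": (lo + hi) // 2,   # any value inside the range
            "below_range": lo - 1,        # out-of-range class 1
            "above_range": hi + 1}        # out-of-range class 2

def classes_for_set(valid: list, invalid_example) -> dict:
    """Representatives for a set condition, e.g. vehicle = [car, motorcycle, truck]."""
    return {"in_set": valid[0], "out_of_set": invalid_example}

print(classes_for_range(0, 9))
print(classes_for_set(["car", "motorcycle", "truck"], "boat"))
```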

Page 19:

Boundary Value Analysis

• Adds to the equivalence partitioning method

• Select test cases to represent each side of the class boundaries


Page 20:

Insurance Example Input: Gender & Age

Output: accept/reject

Classes

C1: InputGender: Male

C2: InputGender: Female

C3: InputGender: invalid

C4: InputAge: >85

C5: InputAge: (80, 85]

C6: InputAge: [0, 80]

C7: InputAge: invalid

C8: Output: accept

C9: Output: reject

C10: Output: error

...

Test Cases

TC1: male, 83, reject

TC2: male, 56, accept

TC3: female, 101, reject

TC4: female, 83, accept

TC5: female, -1, error

TC6: female, 0, accept

TC7: fmale, 0, error

...

Page 21:

Boundary Value Analysis Guidelines

• Range a..b: a, b, just below a, just above b

• List of values: max, min, just below min, just above max

• Boundaries of externally visible data structures shall be checked (e.g. ordered sets, arrays)

• Output bounds should be checked
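The range guideline can be written down directly; this is a minimal sketch assuming integer-valued inputs (the function name is illustrative):

```python
# Boundary values for a range a..b: the two boundaries themselves,
# plus the first value outside the range on each side.
def boundary_values(a: int, b: int) -> list:
    return [a - 1, a, b, b + 1]

# Applied to the insurance example's age class [18, 80]:
print(boundary_values(18, 80))  # [17, 18, 80, 81]
```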

Page 22:

Combinatorial Designs

• ECP and BVA define test cases per class

• Combinations of equivalence classes need to be handled

• Combinatorial explosion needs to be handled


Page 23:

Cause-Effect Graphing

‘3’ occurs if both of ‘1’ and ‘2’ are present (and), or if at least one of them is present (or)

‘2’ occurs if ‘1’ occurs

‘2’ occurs if ‘1’ does not occur

Page 24:

Cause-Effect Graphing

Causes (inputs): male; female; Age <= 80; 80 < Age <= 85; Age > 85
Effects (outputs): accept; reject

reject: (male AND 80 < Age <= 85) OR (Age > 85)
accept: (Age <= 80) OR (female AND 80 < Age <= 85)

One advantage of this method is that development of the rules and the graph from the specification allows for a thorough inspection (logically, semantically) of the specification.

Page 25:

Cause-Effect Graphing

1. The tester must decompose the specification into lower level units

2. For each specification unit, the tester needs to identify causes and effects.

• A cause is a distinct input condition (or equivalence class of input conditions)

• An effect is an output condition (or equivalence class of output conditions) or a system transformation

• A table of causes and effects helps the tester to record the necessary details.

• Then relationships between the causes and effects should be determined, ideally in the form of a set of rules.

3. From the set of rules, the boolean cause-effect graph is constructed, which in turn is converted into a decision table.

4. Test cases are then generated from the equivalent decision table.

Read in Textbook!

Page 26:

Decision Table Format

Case  Female  Male  <=80  >80 & <=85  >85  Accept  Reject
TC1   0       1     0     1           0    0       1
TC2   0       1     1     0           0    1       0
TC3   1       0     0     0           1    0       1
TC4   1       0     0     1           0    1       0

A test input would then be generated from this table, e.g.:
TC1: male, 83, reject
TC2: male, 75, accept (could also be female)
TC3: female, 88, reject (could also be male)
TC4: female, 83, accept
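The decision table can be represented as data and used as a test oracle. This sketch covers only the four cases shown on the slide (names such as `decide` are illustrative assumptions):

```python
# Decision table as data: each row maps condition entries to an action.
TABLE = [
    # (female, male, le80, gt80_le85, gt85) -> action
    ((0, 1, 0, 1, 0), "reject"),  # TC1
    ((0, 1, 1, 0, 0), "accept"),  # TC2
    ((1, 0, 0, 0, 1), "reject"),  # TC3
    ((1, 0, 0, 1, 0), "accept"),  # TC4
]

def conditions(gender: str, age: int):
    """Evaluate the five condition columns for a concrete input."""
    return (int(gender == "female"), int(gender == "male"),
            int(age <= 80), int(80 < age <= 85), int(age > 85))

def decide(gender: str, age: int) -> str:
    for row, action in TABLE:
        if conditions(gender, age) == row:
            return action
    raise ValueError("no matching rule in the (partial) table")

# The generated test inputs hit each table column:
assert decide("male", 83) == "reject"
assert decide("male", 75) == "accept"
assert decide("female", 88) == "reject"
assert decide("female", 83) == "accept"
```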

Page 27:

Combinatorial Testing – Example 1

Plan: flt, flt+hotel, flt+hotel+car
From: CONUS, HI, AK, Europe, Asia, ...
To: CONUS, HI, AK, Europe, Asia, ...
Compare: yes, no
Date-type: exact, 1to3, flex
Depart: today, tomorrow, 1yr, Sun, Mon, ...
Return: today, tomorrow, 1yr, Sun, Mon, ...
Adults: 1, 2, 3, 4, 5, 6
Minors: 0, 1, 2, 3, 4, 5
Seniors: 0, 1, 2, 3, 4, 5

Many variables Many values per variable Need to abstract values (equivalence classes, boundary values)

Page 28:

How Do We Test This? 34 switches -> 2^34 = 1.7 x 10^10 possible inputs = 1.7 x 10^10 tests

If only 3-way interactions, need only 33 tests

For 4-way interactions, need only 85 tests

Page 29:

Combinatorial Testing

What is it?

• Methods for systematically testing t-way interaction effects of input (or configuration parameter) values.

Why do it?

• The interaction of specific combinations of input values may trigger failures that won’t be triggered if testing input values (or configurations) only in isolation.

Page 30:

Two Scopes of Combinatorial Testing

Pizza Ordering: System under test (SUT) with test inputs

Test Configurations:

Test case  OS       CPU    Protocol
1          Windows  Intel  IPv4
2          Windows  AMD    IPv6
3          Linux    Intel  IPv6
4          Linux    AMD    IPv4

Test Inputs:

Size  Topp    Addr  Phone
Sm    Custom  Val   Val
Sm    Preset  Inv   Inv
Med   Custom  Inv   Val
Med   Preset  Val   Inv
Lg    Custom  Val   Inv
Lg    Preset  Inv   Val

Page 31:

Combinatorial Testing – Example 2

Platform configuration parameters:

• OS: Windows XP, Apple OS X, Red Hat Linux

• Browser: Internet Explorer, Firefox

• Protocol: IPv4, IPv6

• CPU: Intel, AMD

• DBMS: MySQL, Sybase, Oracle

Total number of combinations: 3*2*2*2*3 = 72

Do we need 72 test cases?
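The count of 72 is just the product of the parameter domain sizes, which can be checked directly:

```python
from itertools import product

# The five platform configuration parameters from the slide:
os_list = ["Windows XP", "Apple OS X", "Red Hat Linux"]
browsers = ["Internet Explorer", "Firefox"]
protocols = ["IPv4", "IPv6"]
cpus = ["Intel", "AMD"]
dbms = ["MySQL", "Sybase", "Oracle"]

# Exhaustive set of configurations: 3 * 2 * 2 * 2 * 3 = 72
all_configs = list(product(os_list, browsers, protocols, cpus, dbms))
assert len(all_configs) == 72
```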

Page 32:

Pair-Wise Testing (2-Way Interaction)

# of 2-way interactions, N = 57:

a:b 3 x 2 = 6
a:c 3 x 2 = 6
a:d 3 x 2 = 6
a:e 3 x 3 = 9
b:c 2 x 2 = 4
b:d 2 x 2 = 4
b:e 2 x 3 = 6
c:d 2 x 2 = 4
c:e 2 x 3 = 6
d:e 2 x 3 = 6
------
57
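The sum above is the number of value pairs over all parameter pairs, which a few lines of Python reproduce:

```python
from itertools import combinations

# Domain sizes of the five parameters (a=OS, b=Browser, c=Protocol, d=CPU, e=DBMS):
sizes = {"a": 3, "b": 2, "c": 2, "d": 2, "e": 3}

# For each pair of parameters, multiply their domain sizes, then sum:
n = sum(sizes[p] * sizes[q] for p, q in combinations(sizes, 2))
assert n == 57
```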

Test  OS (a)  Browser (b)  Protocol (c)  CPU (d)  DBMS (e)

1 XP IE IPv4 Intel MySQL

2 XP Firefox IPv6 AMD Sybase

3 XP IE IPv6 Intel Oracle

4 OS X Firefox IPv4 AMD MySQL

5 OS X IE IPv4 Intel Sybase

6 OS X Firefox IPv4 Intel Oracle

7 RHL IE IPv6 AMD MySQL

8 RHL Firefox IPv4 Intel Sybase

9 RHL Firefox IPv4 AMD Oracle

10 OS X Firefox IPv6 AMD Oracle

Number of parameter pairs: C(5,2) = 5! / (2! (5-2)!) = 10

Page 33:

Pair-Wise Testing (2-Way Interaction)

Only 10 tests needed, if we want to test all interactions of one parameter with one other parameter (pair-wise interaction)

Pair 1:

3 x 2 = 6 Combinations

Test OS Browser Protocol CPU DBMS

1 XP IE IPv4 Intel MySQL

2 XP Firefox IPv6 AMD Sybase

3 XP IE IPv6 Intel Oracle

4 OS X Firefox IPv4 AMD MySQL

5 OS X IE IPv4 Intel Sybase

6 OS X Firefox IPv4 Intel Oracle

7 RHL IE IPv6 AMD MySQL

8 RHL Firefox IPv4 Intel Sybase

9 RHL Firefox IPv4 AMD Oracle

10 OS X Firefox IPv6 AMD Oracle

Page 34:

Pair-Wise Testing (2-Way Interaction)

Only 10 tests needed, if we want to test all interactions of one parameter with one other parameter (pair-wise interaction)

Pair 2:

2 x 2 = 4 Combinations

Test OS Browser Protocol CPU DBMS

1 XP IE IPv4 Intel MySQL

2 XP Firefox IPv6 AMD Sybase

3 XP IE IPv6 Intel Oracle

4 OS X Firefox IPv4 AMD MySQL

5 OS X IE IPv4 Intel Sybase

6 OS X Firefox IPv4 Intel Oracle

7 RHL IE IPv6 AMD MySQL

8 RHL Firefox IPv4 Intel Sybase

9 RHL Firefox IPv4 AMD Oracle

10 OS X Firefox IPv6 AMD Oracle

Page 35:

Pair-Wise Testing (2-Way Interaction)

Only 10 tests needed, if we want to test all interactions of one parameter with one other parameter (pair-wise interaction)

Pair 3:

3 x 3 = 9 Combinations

Test OS Browser Protocol CPU DBMS

1 XP IE IPv4 Intel MySQL

2 XP Firefox IPv6 AMD Sybase

3 XP IE IPv6 Intel Oracle

4 OS X Firefox IPv4 AMD MySQL

5 OS X IE IPv4 Intel Sybase

6 OS X Firefox IPv4 Intel Oracle

7 RHL IE IPv6 AMD MySQL

8 RHL Firefox IPv4 Intel Sybase

9 RHL Firefox IPv4 AMD Oracle

10 OS X Firefox IPv6 AMD Oracle
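The claim that these 10 tests cover all 57 pairwise interactions can be verified mechanically; the checker below is a sketch using only the table's data:

```python
from itertools import combinations, product

# The 10 tests from the table above:
tests = [
    ("XP", "IE", "IPv4", "Intel", "MySQL"),
    ("XP", "Firefox", "IPv6", "AMD", "Sybase"),
    ("XP", "IE", "IPv6", "Intel", "Oracle"),
    ("OS X", "Firefox", "IPv4", "AMD", "MySQL"),
    ("OS X", "IE", "IPv4", "Intel", "Sybase"),
    ("OS X", "Firefox", "IPv4", "Intel", "Oracle"),
    ("RHL", "IE", "IPv6", "AMD", "MySQL"),
    ("RHL", "Firefox", "IPv4", "Intel", "Sybase"),
    ("RHL", "Firefox", "IPv4", "AMD", "Oracle"),
    ("OS X", "Firefox", "IPv6", "AMD", "Oracle"),
]

# Value domains per column, inferred from the tests themselves:
domains = [sorted({t[i] for t in tests}) for i in range(5)]

# Every pair of values for every pair of columns must appear in some test:
required = {(i, j, vi, vj)
            for i, j in combinations(range(5), 2)
            for vi, vj in product(domains[i], domains[j])}
covered = {(i, j, t[i], t[j])
           for t in tests for i, j in combinations(range(5), 2)}

assert len(required) == 57   # matches the count computed earlier
assert required <= covered   # all 57 pairwise interactions are exercised
```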

Page 36:

Is Testing 2-Way Interactions Enough?

• Analyses of failure-triggering conditions showed this:

• Medical device (dark blue)

• NASA distrib. DB (light blue)

• Browser (green)

• Web server (magenta)

• Network security (orange)

• TCAS* module (purple)

* Traffic Collision Avoiding System

Page 37:

Combinatorial Testing – Example 3

• How Many Tests Would It Take?

• There are 10 effects, each can be on or off

• All combinations: 2^10 = 1,024 tests

• too many to check …

• Let’s look at all 3-way interactions …

Page 38:

How Many Tests Would it Take?

• There are C(10,3) = 120 triples of parameters.

• Naively, 120 x 2^3 = 960 tests.

• Since we can pack 3 triples into each test, we need no more than 320 tests.

• Each test exercises many triples:

0 0 0 1 1 1 0 1 0 1

We should be able to pack many in one test, so what’s the smallest number we need?
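The counting argument above is easy to check:

```python
import math

# 10 binary parameters: how many 3-way interactions are there?
triples = math.comb(10, 3)      # number of parameter triples
assert triples == 120

# Each triple has 2^3 value settings, so naively:
assert triples * 2 ** 3 == 960  # 960 (triple, setting) combinations

# Packing 3 disjoint triples into each 10-parameter test bounds the
# naive test count at 960 / 3 = 320; a covering array does far better.
assert 960 // 3 == 320
```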


Page 39:

Covering Arrays

Each row is a test:

Each column is a parameter:

All 3-way interactions in only 13 tests

Page 40:

How Many Tests Would it Take?

0 = effect off 1 = effect on

13 tests for all 3-way interactions

2^10 = 1,024 tests for all 10-way interactions

Page 41:

What about 2-Way Interactions?

0 = effect off 1 = effect on

13 tests for all 3-way interactions

2^10 = 1,024 tests for all 10-way interactions

?

Page 42:

What about 2-Way Interactions?

0 = effect off 1 = effect on

13 tests for all 3-way interactions

2^10 = 1,024 tests for all 10-way interactions

...

9 pairs (*4)

8 pairs (*4)

7 pairs (*4)

1 pair (*4)

Page 43:

What about 2-Way Interactions?

0 = effect off 1 = effect on

7 (6?) tests for all 2-way interactions

2^10 = 1,024 tests for all 10-way interactions

0 0 1 0 0 1 1 0 1 0

0 1 0 1 0 1 0 1 0 0

1 0 0 1 1 0 0 0 1 1

1 1 0 0 1 1 1 0 0 0

1 1 1 1 0 0 1 0 1 1

0 0 1 0 1 0 0 1 0 1

x x x x x 0 1 1 x 0   (x = don't care)

Page 44:

ACTS Tool (NIST & UT Arlington)

Page 45:

New Algorithms – Comparison

ACTS (prev. FireEye) tool with IPOG algorithm (IPOG = In Parameter Order General)

Traffic Collision Avoidance System (TCAS) module: 2^7 x 3^2 x 4^1 x 10^2 configurations; times in seconds

T-Way  IPOG Size/Time  ITCH Size/Time  Jenny Size/Time  TConfig Size/Time  TVG Size/Time
2      100 / 0.8       120 / 0.73      108 / 0.001      108 / >1 hour      101 / 2.75
3      400 / 0.36      2388 / 1020     413 / 0.71       472 / >12 hour     9158 / 3.07
4      1363 / 3.05     1484 / 5400     1536 / 3.54      1476 / >21 hour    64696 / 127
5      4226 / 18       NA / >1 day     4580 / 43.54     NA / >1 day        313056 / 1549
6      10941 / 65.03   NA / >1 day     11625 / 470      NA / >1 day        1070048 / 12600

(ITCH: IBM; Jenny: Open Source; TConfig: U. of Ottawa; TVG: Open Source)

Page 46:

Combinatorial Testing Links

http://cse.unl.edu/~citportal/

http://csrc.nist.gov/groups/SNS/acts/index.html

Page 47:

How to (Automatically) Generate Test Oracles?

Creating test (input) data is the (relatively) easy part!

How do we check that the code worked correctly on the test input?

• Using existing test sets (with known oracles) – easy if test sets exist

• Crash testing code to ensure it does not crash for randomly generated test input (‘fuzz testing’) – easy but of limited value

• Embedded assertions – incorporate assertions in code to check critical states at different points in the code – will they cover all possible incorrect states?

• Model-checking using a mathematical model of the system under test and a model checker to generate expected results for each input – expensive but tractable
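The second option, crash ('fuzz') testing, can be sketched in a few lines. The function under test here is a stand-in (an assumption, not code from the lecture); the only oracle is "does not raise, returns something sane":

```python
import random

# Stand-in unit under test (illustrative, not the lecture's code):
def evaluate_application(gender, age):
    if gender not in ("male", "female") or not isinstance(age, int) or age < 0:
        return "error"
    if (gender == "male" and age > 80) or (gender == "female" and age > 85):
        return "reject"
    return "accept"

# Fuzzing: random (often invalid) inputs; we only check for crashes and
# that the result stays within the declared output set.
random.seed(0)  # reproducible run
for _ in range(1000):
    gender = random.choice(["male", "female", "fmale", "", None])
    age = random.randint(-1000, 1000)
    result = evaluate_application(gender, age)  # must not raise
    assert result in ("accept", "reject", "error")
```

As the slide notes, this is easy but of limited value: it can only reveal crashes and gross output violations, not wrong-but-plausible results.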

Page 48:

Using Model-Checking to Produce Tests

The system can never get in this state!

Yes it can, and here’s

how …

Model-checker test production:

if assertion is not true, then a counter-example is generated.

This can be converted into a test case.

(cf. Kuhn et al., 2008)

Example: System handling access control to files (grant/deny read, write access, etc.)

Page 49:

Model-Checking: Example

Input variables: User, File, Action

Question: GRANT or DENY access? Covering array (2-way interaction):

(Each row of the covering array is assigned an expected outcome: GRANT or DENY.)

Check assertion: AG ((u_l = 0 & f_l = 0) & act = rd) -> AX !(access = GRANT)

If the assertion is false -> counterexample

Page 50:

Model-Checking: Example (cont’d)

Model Spec

Input Vars

Output Var

Rules /* no read-up */

/* no write-down */

Page 51:

Model-Checking: Example (cont’d)

Rules:
R1: u_l >= f_l & act = rd   /* no read-up */
R2: f_l >= u_l & act = wr   /* no write-down */

Start

Grant

Deny

R1 || R2

!(R1 || R2) = !R1 & !R2

Note: In case of state = ’GRANT’, the type of access granted is specified by the variable ’act’ (action).

Page 52:

Model-Checking: Example (cont’d)

Rules:
R1: u_l >= f_l & act = rd   /* no read-up */
R2: f_l >= u_l & act = wr   /* no write-down */

Start

Grant

Deny

Transitions from each state: R1 || R2 -> Grant; !(R1 || R2) = !R1 & !R2 -> Deny
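The two rules can be sketched as executable Python, which then serves as a reference oracle for the covering-array rows (the function name `access` is an illustrative assumption; u_l and f_l are the user and file security levels from the slides):

```python
# No read-up / no write-down access rules:
def access(u_l: int, f_l: int, act: str) -> str:
    r1 = u_l >= f_l and act == "rd"   # R1: read allowed if user level >= file level
    r2 = f_l >= u_l and act == "wr"   # R2: write allowed if file level >= user level
    return "GRANT" if (r1 or r2) else "DENY"

assert access(1, 0, "rd") == "GRANT"  # read-down: allowed
assert access(0, 1, "rd") == "DENY"   # read-up: denied
assert access(0, 1, "wr") == "GRANT"  # write-up: allowed
assert access(1, 0, "wr") == "DENY"   # write-down: denied
```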

Page 53:

Model-Checking: Example (cont’d)

For each of the nine configurations in the covering array:

Create SPEC claim of form: SPEC AG(covering array rules) -> AX !(access=result)

Repeat this for each result (GRANT or DENY)

Page 54:

Model-Checking: Example (cont’d)

For each of the nine configurations in the covering array:

Create SPEC claim of form: SPEC AG(covering array rules) -> AX !(access=result)

Repeat this for each result (GRANT or DENY)

Expected:

GRANT GRANT DENY DENY GRANT GRANT GRANT DENY GRANT

Page 55:

Model-Checking: Example (cont’d)

Model Spec

CTL: Computation Tree Logic

Statement is ’false’ according to the rules. Note: u_l >= f_l

Counter-example

Page 56:

Using Model-Checking to Produce Tests

The system can never get in this state!

Yes it can, and here’s

how …

Model-checker test production:

if assertion is not true, then a counter-example is generated.

This can be converted into a test case.

(cf. Kuhn et al., 2008)

Example: System handling access control to files (grant/deny read, write access, etc.)

Page 57:

State-Transition Testing

Create a set of test cases that triggers each state-transition at least once

Input (Event)

Action State-Transition Graph

Page 58

State Table

Input (Event)  | Wait for Card (S1) | Wait for PIN (S2) | Next (S3)
---------------|--------------------|-------------------|----------
Card inserted  | S2 (Ask for PIN)   | -                 | -
Invalid PIN    | -                  | S2 (Beep)         | -
Valid PIN      | -                  | S3 (Ask amount)   | -
Cancel         | -                  | S1 (Return card)  | -
...            | ...                | ...               | ...

Page 59

State Table

Input (Event)  | Wait for Card (S1) | Wait for PIN (S2) | Next (S3)
---------------|--------------------|-------------------|----------
Card inserted  | S2 (Ask for PIN)   | -                 | -
Invalid PIN    | -                  | S2 (Beep)         | -
Valid PIN      | -                  | S3 (Ask amount)   | -
Cancel         | -                  | S1 (Return card)  | -
...            | ...                | ...               | ...

4 Test Cases:
- Input ’Card inserted’ in S1
- Input ’Invalid PIN’ in S2
- Input ’Valid PIN’ in S2
- Input ’Cancel’ in S2
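The transition-coverage idea can be sketched directly from the state table; the state, event, and action names follow the slide, while the code structure itself is an assumption:

```python
# Sketch of transition coverage for the ATM-style state table:
# derive one test case per defined state-transition.
state_table = {
    # (state, event): (next_state, action)
    ("S1", "Card inserted"): ("S2", "Ask for PIN"),
    ("S2", "Invalid PIN"):   ("S2", "Beep"),
    ("S2", "Valid PIN"):     ("S3", "Ask amount"),
    ("S2", "Cancel"):        ("S1", "Return card"),
}

def transition_coverage_tests(table):
    # One test case per transition entry in the table.
    return [
        {"start": state, "input": event,
         "expected_state": nxt, "expected_action": action}
        for (state, event), (nxt, action) in table.items()
    ]

tests = transition_coverage_tests(state_table)
assert len(tests) == 4  # one test case per transition, as on the slide
```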

Page 60

Error Guessing

• Exploratory testing, happy testing, ...

• Always worth including

• Can trigger failures that systematic techniques miss

• Consider

• Past failures

• Intuition

• Experience

• Brain storming

• “What is the craziest thing we can do?”

• Lists in literature

Page 61

Exploratory Testing

• Inventors:

• Cem Kaner, James Bach (1990s)

• Definition:

• “Exploratory testing is simultaneous learning, test design, and test execution.”

• Elements / Variants

• Charter: defines mission (and sometimes tactics to use)

• Example: “Check UI against Windows interface standards”

• Session-based test management: Defects + Notes + Interviews of the testers

Page 62

Structure of Lecture 2

• Black-Box vs. White-Box Testing

• Black-Box Testing Methods

• Usability Testing

• Lab 2

Page 63

Quality Attributes – ISO 9126 (ISO 25000)

Page 64

Testing Usability Requirements

Page 65

Testing Usability Requirements

How to test:
- Define several (typical) usage scenarios involving tasks Q and R
- Select test users and classify them as ’novice’ and ’experienced’
- Let 5 (or better 10, 15) novices perform the scenarios
- Observe what problems they encounter
- Classify and count observed problems

Page 66

Usability Test Types + Environment

Rubin’s Types of Usability Tests (Rubin, 1994, p. 31-46)

Exploratory test – early product development

Assessment test – most typical, either early or midway in the product development

Validation test – confirmation of product’s usability

Comparison test – compare two or more designs; can be used with other three types of tests

Think aloud

Observe

Audio/video recording

Page 67

Usability Testing – What? How?

• Test Focus

• Understandability

• Easy to understand?

• Ease of learning

• Easy to learn?

• Operability

• Matches purpose & environment of operation?

• Ergonomics: color, font, sound,...

• Communicativeness

• In accordance with psychological characteristics of user?

• Test Environments

• Free form tasks

• Procedure scripts

• Paper screens

• Mock-ups

• Field trial

Page 68

Inspection Methods Usability Testing

Cognitive Walkthrough

Heuristic Evaluation

Guidelines Review

Field Study

Laboratory Experiment

Example: Evaluating UI Designs

Page 69

Inspection Methods Usability Testing

Cognitive Walkthrough

Heuristic Evaluation

Guidelines Review

Field Study

Laboratory Experiment

Example: Evaluating UI Designs

• evaluates design on how well it supports user in learning task

• usually performed by expert in cognitive psychology

• expert ’walks through’ the design to identify potential problems using psychological principles

• Scenarios may be used to guide analysis

Page 70

Inspection Methods Usability Testing

Cognitive Walkthrough

Heuristic Evaluation

Guidelines Review

Field Study

Laboratory Experiment

Example: Evaluating UI Designs

• usability criteria (heuristics) are identified

• design examined by experts to see if these are violated

Page 71

Heuristic Evaluation by Inspection

List of 10 Heuristics according to (Nielsen, 2005):

List violations of heuristics and rank each by severity (0...4):
0: positive (or neutral) aspect of system
...
4: major, catastrophic aspect of system

(See example report on course wiki!)

Page 72

Inspection Methods Usability Testing

Cognitive Walkthrough

Heuristic Evaluation

Guidelines Review

Field Study

Laboratory Experiment

Example: Evaluating UI Designs

Written guidelines recommended for larger projects:

• Screen layout

• Appearance of objects

• Terminology

• Wording of prompts and error messages

• Menus

• Direct manipulation actions and feedback

• On-line help and other documentation

Page 73

Inspection Methods Usability Testing

Cognitive Walkthrough

Heuristic Evaluation

Guidelines Review

Field Study

Laboratory Experiment

Example: Evaluating UI Designs

Usability testing in a controlled environment

• There is a test set of users

• They perform pre-specified tasks

• Data is collected (quantitative and qualitative)

• Take mean and/or median value of measured attributes

• Compare to goal or another system
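A minimal sketch of the quantitative comparison step, with illustrative timing data and an assumed 60-second goal:

```python
# Sketch: summarise measured task-completion times from a lab session and
# compare against a usability goal. The data and the 60-second goal are
# illustrative assumptions.
from statistics import mean, median

times_seconds = [42, 55, 48, 95, 51]   # illustrative measurements
goal_seconds = 60

summary = {"mean": mean(times_seconds), "median": median(times_seconds)}
# Median is less sensitive to one slow outlier (the 95 s user) than the mean.
meets_goal = summary["median"] <= goal_seconds
```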

Page 74

Inspection Methods Usability Testing

Cognitive Walkthrough

Heuristic Evaluation

Guidelines Review

Field Study

Laboratory Experiment

Example: Evaluating UI Designs

1. Direct observation in actual use

discover new uses

take notes, don’t help, chat later

2. Logging actual use

objective, not intrusive

great for identifying errors

which features are/are not used

privacy concerns

Page 75

Inspection Methods Usability Testing

Cognitive Walkthrough

Heuristic Evaluation

Guidelines Review

Field Study

Laboratory Experiment

Example: Evaluating UI Designs

3. Questionnaires and interviews with real users

ask users to recall critical incidents

questionnaires must be short and easy to return

4. Focus groups

6-9 users

skilled moderator with pre-planned script

computer conferencing??

5. On-line direct feedback mechanisms

initiated by users

may signal change in user needs

trust but verify

6. Bulletin boards and user groups

Page 76

UT – Example Student Project

• University of Michigan, USA

• Full slide set available on course wiki

Content:

• Recruiting of participants

• Test Cases / Tasks

• Test Logging / Recording

• Moderation & Debriefing

• Location & Setup

Page 77

A/B Testing

Two GUI Versions A & B

Page 78

A/B Testing (cont’d)

Two GUI Versions A & B

Tool support

Page 79

A/B Testing (cont’d)

Two GUI Versions A & B ?

Page 80

A/B Testing (cont’d)

What to vary ...

• Call-To-Actions – Placement, wording, size

• Copywriting – Value propositions, product descriptions

• Forms – Their length, field types, text on the forms.

• Layout – Homepage, content pages, landing pages

• Product pricing – Try testing for revenue by testing your prices

• Images – Their placement, content and size

• Amount of content on the page (short vs. long)

Link: http://conversionxl.com/how-to-build-a-strong-ab-testing-plan-that-gets-results/
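A minimal sketch of the mechanics behind an A/B experiment: deterministic variant assignment plus a naive conversion-rate comparison. A real tool would add a statistical significance test, and all numbers below are illustrative assumptions:

```python
# Sketch: hash-based A/B assignment keeps each user in the same variant
# across visits, without storing any per-user state.
import hashlib

def assign_variant(user_id):
    digest = hashlib.sha256(user_id.encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

def conversion_rate(conversions, visitors):
    return conversions / visitors if visitors else 0.0

# Illustrative results for the two GUI versions:
rate_a = conversion_rate(120, 2400)   # 5.00 %
rate_b = conversion_rate(150, 2400)   # 6.25 %
winner = "B" if rate_b > rate_a else "A"
```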

Page 81

Structure of Lecture 2

• Black-Box vs. White-Box Testing

• Black-Box Testing Methods

• Usability Testing

• Lab 2

Page 82

Lab 2 – Black-Box Testing

Lab 2 (week 27: Mar 02 - Mar 05) - Black-Box Testing (10%)
Lab 2 Instructions / Lab 2 Documentation / Lab 2 Application

Submission Deadlines:
• Monday Lab: Sunday, 08 Mar, 23:59
• Tuesday Labs: Monday, 09 Mar, 23:59
• Wednesday Labs: Tuesday, 10 Mar, 23:59
• Thursday Labs: Wednesday, 11 Mar, 23:59

Penalties apply for late delivery: 50% penalty if submitted up to 24 hours late; 100% penalty if submitted more than 24 hours late.

Instructions

Documentation

Program

Page 83

Lab 2 – Black-Box Testing (cont’d)

Documentation

Program

Test Cases (Input & Expected Output):
1 3 4 -> triangle type x
2 5 5 -> triangle type y
0 1 2 -> no triangle
...

Test Report:
TC1 pass, TC2 pass, TC3 pass, TC4 fail (defect), ...

Strategies:
- Equivalence Class Partitioning
- Boundary Value Analysis
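The two lab strategies can be sketched against a hypothetical triangle classifier; the classifier below is an assumption for illustration, not the actual lab program:

```python
# Hypothetical implementation under test (the lab program may differ).
def classify_triangle(a, b, c):
    if a + b <= c or a + c <= b or b + c <= a:
        return "no triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Equivalence class partitioning: one representative per output class;
# boundary value analysis: the degenerate case a + b == c.
test_cases = [
    ((3, 4, 5), "scalene"),
    ((2, 5, 5), "isosceles"),
    ((4, 4, 4), "equilateral"),
    ((0, 1, 2), "no triangle"),
    ((1, 2, 3), "no triangle"),   # boundary: a + b == c
]
results = [(args, classify_triangle(*args) == expected)
           for args, expected in test_cases]
```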

Page 84

Recommended Textbook Exercises

Chapter 4

1, 2, 3, 4, 5

8, 11, 12

Page 85

Next 2 Weeks

• Lab 2:

– Black-Box Testing

• Lecture 3:

– White-Box Testing Techniques

• In addition to do:

– Read textbook chapters 4, 12.8 (available via OIS)